diff --git a/DeToX/Base.py b/DeToX/Base.py
index 88f29f7..abefc60 100644
--- a/DeToX/Base.py
+++ b/DeToX/Base.py
@@ -471,9 +471,9 @@ def calibrate(self,
- 9: Comprehensive 9-point pattern (3×3 grid).
- list: Custom points in normalized coordinates [-1, 1].
Example: [(-0.4, 0.4), (0.4, 0.4), (0.0, 0.0)]
- infant_stims : list of str or None, optional
+ infant_stims : list of str or True, optional
Paths to engaging image files for calibration targets (e.g., colorful
- characters, animated objects). If None (default), uses built-in stimuli
+ characters, animated objects). If True (default), uses built-in stimuli
from the package. If fewer stimuli than calibration points are provided,
stimuli are automatically repeated in sequence to cover all points
(e.g., 3 stimuli for 7 points becomes [s1, s2, s3, s1, s2, s3, s1]).
@@ -569,7 +569,7 @@ def calibrate(self,
num_points = len(norm_points)
# --- Stimuli Loading ---
- if infant_stims is None:
+ if infant_stims is True:
# Load default stimuli from package
import glob
diff --git a/DeToX/Calibration.py b/DeToX/Calibration.py
index 6404255..30ca762 100644
--- a/DeToX/Calibration.py
+++ b/DeToX/Calibration.py
@@ -805,7 +805,7 @@ def __init__(
# --- Tobii-Specific Setup ---
self.calibration = calibration_api
-
+ self.verbose = verbose
def run(self, calibration_points):
"""
@@ -1078,7 +1078,7 @@ def __init__(
# --- Mouse-Specific Setup ---
self.mouse = mouse
self.calibration_data = {} # point_idx -> list of (target_pos, sample_pos, timestamp)
-
+ self.verbose = verbose
def run(self, calibration_points):
"""
diff --git a/Documentation/Vignettes/Calibration.qmd b/Documentation/Vignettes/Calibration.qmd
new file mode 100644
index 0000000..5745c3c
--- /dev/null
+++ b/Documentation/Vignettes/Calibration.qmd
@@ -0,0 +1,264 @@
+---
+title: "Calibration"
+description-meta: "Learn how to use DeToX to perform calibration to ensure the best data quality"
+author: Tommaso Ghilardi
+keywords: [DeToX, Eye Tracking, Calibration, Setup, Tobii, PsychoPy, Infant calibration, infant friendly, eye tracker setup, python]
+execute:
+ enabled: false
+---
+
+Good eye tracking data starts long before you present your first stimulus—it begins with proper setup and calibration. Even the most carefully designed experiment will produce noisy, unusable data if the eye tracker isn't configured correctly. This is particularly critical when testing infants and children, where data quality can be challenging even under ideal conditions. In this tutorial, we'll walk through how to use DeToX to ensure the best possible data quality from the start.
+
+We'll focus on two essential steps:
+
+1. positioning your participant in the eye tracker's optimal tracking zone
+
+2. running the calibration procedure
+
+Get these right, and everything else falls into place.
+
+## Part 1: Positioning Your Participant
+
+### Understanding the Track Box
+
+Every eye tracker has a track box—an invisible 3D zone where it can accurately detect eyes. Step outside this zone, and tracking quality drops fast! While we try to seat everyone consistently, even small differences in posture or chair height can matter. The good news? DeToX makes checking position as easy as calling a single method.
+
+::: callout-warning
+This tutorial assumes you've completed [Getting started](GettingStarted.qmd). If not, start there first!
+:::
+
+### Setup Code
+
+Let's begin with our standard setup:
+
+```{python}
+#| label: Libraries
+#| eval: false
+from psychopy import visual, core
+from DeToX import ETracker
+
+
+## Create the window
+win = visual.Window(
+ size=[1920, 1080], # Window dimensions in pixels
+ fullscr=True, # Expand to fill the entire screen
+ units='pix' # Use pixels as the measurement unit
+)
+
+
+## Connect to the eyetracker
+ET_controller = ETracker(win)
+```
+
+We're following the same steps from the previous tutorial. First, we import the libraries we need—`psychopy.visual` for creating our display window and `DeToX.ETracker` for controlling the eye tracker.
+
+Next, we create a PsychoPy window where all our stimuli will appear.
+
+Finally, we connect to the Tobii eye tracker by creating an `ETracker` object. This automatically searches for connected devices, establishes communication, and links the tracker to our window. You'll see connection info printed to confirm everything's working. Now the `ET_controller` object is ready to control all eye-tracking operations!
+
+### Checking Position in Real-Time
+
+Now for the magic—visualizing where your participant's eyes are in the track box:
+
+```{python}
+#| eval: false
+ET_controller.show_status()
+```
+
+### What You'll See
+
+When you call `show_status()`, an animated video of animals appears to keep infants engaged. Overlaid on the video are two colored circles—🔵 **light blue** and 🔴 **pink** —showing where the tracker detects your participant's eyes in real-time. A black rectangle marks the track box boundaries where tracking works best.
+
+Below the rectangle, a green bar with a moving black line indicates distance from the screen. The line should be centered on the bar (about 65 cm distance).
+
+{fig-align="center" width="960"}
+
+::: callout-important
+### Simulation Mode
+
+If you're running with `simulate=True`, the positioning interface uses your mouse instead of real eye tracker data. Move the mouse around to see the eye circles follow, and use the **scroll wheel** to simulate moving closer to or further from the screen.
+:::
+
+Center both colored circles within the black rectangle. If they're drifting toward the edges, ask the participant to lean in the needed direction, or adjust the eye tracker mount if possible.
+
+For distance, watch the black line on the green bar. Too far left means too close (move back), too far right means too far (move forward). Aim for the center.
+
+Press **SPACE** when positioning looks good—you're ready for calibration!
+
+::: Advanced
+### Customizing the Positioning Display
+
+By default, `show_status()` displays an animated video of animals to keep infants engaged while you adjust their position. However, you can customize this behavior using the `video_help` parameter:
+
+**`video_help=True`**: Uses the built-in instructional video included with DeToX (default). This is the easiest option and works well for most studies.
+
+**`video_help=False`**: Disables the video entirely, showing only the eye position circles and track box. Useful if you prefer a minimal display or if video playback causes performance issues.
+
+**`video_help=visual.MovieStim(...)`**: Uses your own custom video. You'll need to pre-load and configure the MovieStim object yourself, including setting the appropriate size and position for your display layout.
+:::
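+If you go the custom-video route, create and configure the `MovieStim` yourself before calling `show_status()`. Here is a minimal sketch; the file name and size are placeholders for your own clip, not files shipped with DeToX:
+
+```{python}
+#| eval: false
+# Hypothetical custom video -- path and size are placeholders
+my_video = visual.MovieStim(
+    win,                      # the PsychoPy window created above
+    filename='my_video.mp4',  # your own engaging clip
+    size=(800, 450),          # scale to fit your display layout
+    pos=(0, 0),
+    loop=True                 # keep playing while you adjust position
+)
+
+ET_controller.show_status(video_help=my_video)
+```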
+
+## Part 2: Calibration
+
+With your participant positioned correctly, you're ready for calibration. But what exactly is calibration, and why do we need it?
+
+### Understanding Calibration
+
+Calibration teaches the eye tracker how to interpret each participant's unique eye characteristics. Everyone's eyes are slightly different—different sizes, shapes, reflection patterns—so the tracker needs a personalized model to accurately estimate where someone is looking.
+
+The process is straightforward: you present targets at known locations on the screen, the participant looks at each one, and the tracker records the relationship between eye features and screen coordinates. This creates a custom mapping for that individual.
+
+
+
+Sounds complex, right? It is—but DeToX handles the complexity for you. Here's how simple it becomes:
+
+### Launching Calibration with DeToX
+
+```{python}
+ET_controller.calibrate(
+ calibration_points = 5,
+ shuffle=True,
+ audio=True,
+ anim_type='zoom',
+ visualization_style='circles'
+)
+```
+
+Let's break down what's happening here:
+
+**`calibration_points=5`**: Uses a standard 5-point calibration pattern (corners plus center). This is the default and works well for most studies. You can also choose 9 points for higher precision, or pass a custom list of coordinates in height units for specialized configurations.
+
+**`shuffle=True`**: Randomizes which image appears at each calibration point. This prevents habituation and keeps participants engaged throughout the procedure.
+
+**`audio=True`**: Plays an attention-getting sound along with the visual stimulus to help capture and maintain the participant's focus.
+
+**`anim_type='zoom'`**: Makes the stimuli gently pulse in size to attract attention. You can also use `'trill'` for a rotating animation.
+
+**`visualization_style='circles'`**: Displays the calibration results using dots at each point. You can also choose `'lines'` to show lines connecting the target to where the gaze landed.
+
+::: Advanced
+### Customizing Calibration Parameters
+
+Beyond the basic configuration, you can customize several aspects of the calibration procedure to match your experimental needs:
+
+**`infant_stims`**: By default (`infant_stims=True`), DeToX uses a set of engaging animal cartoons included with the package. These work well for most infant studies, but you can provide a list of file paths to your own images if you prefer custom stimuli that better match your study's theme or participant age group.
+
+**`audio`**: By default (`audio=True`), an attention-getting sound included with DeToX plays automatically when stimuli appear. Set this to `None` to disable audio entirely, or pass your own pre-loaded `psychopy.sound.Sound` object to use custom sounds that fit your experimental context.
+
+**`calibration_points`**: By default uses a 5-point pattern (corners plus center). You can specify `9` for higher precision, or pass a custom list of coordinates in height units for specialized configurations tailored to your stimulus regions.
+:::
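+Putting these options together, a fully customized call might look like the sketch below. The stimulus and sound paths are hypothetical placeholders for your own files:
+
+```{python}
+#| eval: false
+from psychopy import sound
+
+ET_controller.calibrate(
+    # custom 3-point layout in normalized height units
+    calibration_points=[(-0.4, 0.4), (0.4, 0.4), (0.0, 0.0)],
+    infant_stims=['stims/cat.png', 'stims/dog.png'],  # cycled to cover all points
+    audio=sound.Sound('stims/chime.wav'),
+    anim_type='trill'
+)
+```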
+
+### The Calibration Process
+
+#### Step 1: Instructions
+
+First, you'll see instructions explaining the controls:
+
+``` markdown
+┌──────────────────── Calibration Setup ─────────────────────┐
+│Mouse-Based Calibration Setup: │
+│ │
+│ - Press number keys (1-5) to select calibration points │
+│ - Move your mouse to the animated stimulus │
+│ - Press SPACE to collect samples at the selected point │
+│ - Press ENTER to finish collecting and see results │
+│ - Press ESCAPE to exit calibration │
+│ │
+│ Any key will start calibration immediately! │
+└────────────────────────────────────────────────────────────┘
+```
+
+These instructions explain how to control the calibration from your keyboard.
+
+Press any key when you're ready to begin.
+
+#### Step 2: Data Collection
+
+The calibration screen appears with a thin red border indicating you're in calibration mode. Press a number key (1-5) to display an animated stimulus at the corresponding calibration point. The participant should look at the stimulus while it animates. When you're confident they're fixating on the target, press **SPACE** to collect gaze samples. The system waits briefly (0.25 seconds) to ensure stable fixation, then records the eye tracking data for that point. Repeat this process for all calibration points. You don't need to go in order.
+
+#### Step 3: Review Results
+
+Once you've collected data for all points (or whenever you're satisfied), press **ENTER** to compute and visualize the calibration results. You'll see a display showing the calibration targets and dots (or lines, if you selected them via `visualization_style`) indicating where the gaze samples actually landed.
+
+At this stage, you have several options:
+
+``` markdown
+┌────────────── Calibration Results ───────────────┐
+│Review calibration results above. │
+│ │
+│ - Press ENTER to accept calibration │
+│ - Press Numbers → SPACE to retry some points │
+│ - Press ESCAPE to restart calibration │
+│ │
+└──────────────────────────────────────────────────┘
+```
+
+**Accept the calibration**: If the results look good across all points, press **ENTER** to accept and move forward with your experiment.
+
+**Retry specific points**: Notice one or two points with poor accuracy? Press the number keys corresponding to those points—they'll highlight in yellow to confirm your selection. You can select multiple points if needed. Once you've marked all points for retry, press **SPACE** again to recollect data for just those points. This targeted approach is especially valuable with infants and children, where you might get excellent data at most points but struggle with one or two locations.
+
+**Start over**: If the overall quality is poor or you want a fresh start, press **ESCAPE** to discard all data and restart the entire calibration procedure from the beginning.
+
+::: callout-important
+### Simulation Mode
+
+When running with `simulate=True`, the calibration procedure uses mouse position instead of real eye tracker data. Press number keys to display stimuli at calibration points, move your mouse to each target location, and press SPACE to "collect" samples. This allows you to test your entire experiment workflow without needing physical eye tracking hardware.
+:::
+
+### Calibration Complete!
+
+Congratulations! You've successfully positioned your participant and completed the calibration procedure using DeToX. With accurate calibration in place, you're now ready to present your experimental stimuli and collect high-quality eye tracking data.
+
+Here's a video demonstrating the entire calibration workflow from start to finish:
+
+```{=html}
+```
+## Save and Load Calibration
+
+While we recommend performing calibration at the start of each session to ensure optimal accuracy, DeToX also allows you to save and load calibration data for convenience. In our opinion, this should only be used in special circumstances where there is a headrest and little to no chance of movement between sessions. However, if you need to save and reuse calibration data, here's how:
+
+### Saving Calibration
+
+After completing a successful calibration, you can save the calibration data to a file:
+
+```{python}
+# Save with custom filename
+ET_controller.save_calibration(filename="S01_calibration.dat")
+```
+
+``` markdown
+┌────── Calibration Saved ─────┐
+│Calibration data saved to:    │
+│S01_calibration.dat           │
+└──────────────────────────────┘
+```
+
+The calibration data is saved as a binary file (`.dat` format) that can be reloaded in future sessions. If you don't specify a filename, DeToX automatically generates a timestamped name like `2024-01-15_14-30-00_calibration.dat`.
+
+You can also choose to use a GUI file dialog to select the save location:
+
+```{python}
+# Save with GUI file dialog
+ET_controller.save_calibration(use_gui=True)
+```
+
+### Loading Calibration
+
+To reuse a previously saved calibration in a new session:
+
+```{python}
+# Load from specific file
+ET_controller.load_calibration(filename="S01_calibration.dat")
+```
---
+title: "Calibration"
+description-meta:: "Learn how to use DeToX to perform calibration to ensure the best data quality"
+author: Tommaso Ghilardi
+keywords: [DeToX, Eye Tracking, Calibration, Setup, Tobii, PsychoPy, Infatn calibration, infant friendly, eye tracker setup, python]
+execute:
+ enabled: false
+---
+
+Good eye tracking data starts long before you present your first stimulus—it begins with proper setup and calibration. Even the most carefully designed experiment will produce noisy, unusable data if the eye tracker isn't configured correctly. This is particularly critical when testing infants and children, where data quality can be challenging even under ideal conditions. In this tutorial, we'll walk through how to use DeToX to ensure the best possible data quality from the start.
+
+We'll focus on two essential steps:
+
+1. positioning your participant in the eye tracker's optimal tracking zone
+
+2. running the calibration procedure
+
+Get these right, and everything else falls into place.
+
+## Part 1: Positioning Your Participant
+
+### Understanding the Track Box
+
+Every eye tracker has a track box—an invisible 3D zone where it can accurately detect eyes. Step outside this zone, and tracking quality drops fast! While we try to seat everyone consistently, even small differences in posture or chair height can matter. The good news? DeToX makes checking position as easy as calling a single method.
+
+::: callout-warning
+This tutorial assumes you've completed [Getting started](GettingStarted.qmd). If not, start there first!!!
+:::
+
+### Setup Code
+
+Let's begin with our standard setup:
+
+```{python}
+#| label: Libraries
+#| eval: false
+from psychopy import visual, core
+from DeToX import ETracker
+
+
+## Creat the window
+win = visual.Window(
+ size=[1920, 1080], # Window dimensions in pixels
+ fullscr=True, # Expand to fill the entire screen
+ units='pix'# Use pixels as the measurement unit
+)
+
+
+## Connect to the eyetracker
+ET_controller = ETracker(win)
+```
+
+We're following the same steps from the previous tutorial. First, we import the libraries we need—`psychopy.visual` for creating our display window and `DeToX.ETracker` for controlling the eye tracker.
+
+Next, we create a PsychoPy window where all our stimuli will appear.
+
+Finally, we connect to the Tobii eye tracker by creating an `ETracker` object. This automatically searches for connected devices, establishes communication, and links the tracker to our window. You'll see connection info printed to confirm everything's working. Now the `ET_controller` object is ready to control all eye-tracking operations!
+
+### Checking Position in Real-Time
+
+Now for the magic—visualizing where your participant's eyes are in the track box:
+
+```{python}
+Et_controller.show_status()
+```
+
+### What You'll See
+
+When you call `show_status()`, an animated video of animals appears to keep infants engaged. Overlaid on the video are two colored circles—🔵 **light blue** and 🔴 **pink** —showing where the tracker detects your participant's eyes in real-time. A black rectangle marks the track box boundaries where tracking works best.
+
+Below the rectangle, a green bar with a moving black line indicates distance from the screen. The line should be centered on the bar (about 65 cm distance).
+
+{fig-align="center" width="960"}
+
+::: callout-important
+### Simulation Mode
+
+If you're running with `simulate=True`, the positioning interface uses your mouse instead of real eye tracker data. Move the mouse around to see the eye circles follow, and use the **scroll wheel** to simulate moving closer to or further from the screen.
+:::
+
+Center both colored circles within the black rectangle. If they're drifting toward the edges, ask the participant to lean in the needed direction, or adjust the eye tracker mount if possible.
+
+For distance, watch the black line on the green bar. Too far left means too close (move back), too far right means too far (move forward). Aim for the center.
+
+Press **SPACE** when positioning looks good—you're ready for calibration!
+
+::: Advanced
+### Customizing the Positioning Display
+
+By default, `show_status()` displays an animated video of animals to keep infants engaged while you adjust their position. However, you can customize this behavior using the `video_help` parameter:
+
+**`video_help=True`**: Uses the built-in instructional video included with DeToX (default). This is the easiest option and works well for most studies.
+
+**`video_help=False`**: Disables the video entirely, showing only the eye position circles and track box. Useful if you prefer a minimal display or if video playback causes performance issues.
+
+**`video_help=visual.MovieStim(...)`**: Uses your own custom video. You'll need to pre-load and configure the MovieStim object yourself, including setting the appropriate size and position for your display layout.
+:::
+
+## Part 2: Calibration
+
+With your participant positioned correctly, you're ready for calibration. But what exactly is calibration, and why do we need it?
+
+### Understanding Calibration
+
+Calibration teaches the eye tracker how to interpret each participant's unique eye characteristics. Everyone's eyes are slightly different—different sizes, shapes, reflection patterns—so the tracker needs a personalized model to accurately estimate where someone is looking.
+
+The process is straightforward: you present targets at known locations on the screen, the participant looks at each one, and the tracker records the relationship between eye features and screen coordinates. This creates a custom mapping for that individual.
+
+](https://help.tobii.com/hc/article_attachments/360036648293/How_DoesEyetrackingWork_ScreenBased.jpg){fig-align="center" width="712"}
+
+Sounds complex, right? It is—but DeToX handles the complexity for you. Here's how simple it becomes:
+
+### Launching Calibration with DeToX
+
+```{python}
+ET_controller.calibrate(
+ calibration_points =5,
+ shuffle=True,
+ audio=True,
+ anim_type='zoom',
+ visualization_style='circles'
+)
+```
+
+Let's break down what's happening here:
+
+**`calibration_points=5`**: Uses a standard 5-point calibration pattern (corners plus center). This is the default and works well for most studies. You can also choose 9 points for higher precision, or pass a custom list of coordinates in height units for specialized configurations.
+
+**`shuffle=True`**: Randomizes which image appears at each calibration point. This prevents habituation and keeps participants engaged throughout the procedure.
+
+**`audio=True`**: Plays an attention-getting sound along with the visual stimulus to help capture and maintain the participant's focus. **`anim_type='zoom'`**: Makes the stimuli gently pulse in size to attract attention. You can also use `'trill'` for a rotating animation.
+
+**`visualization_style='circles'`**: Displays the calibration results using dots at each point. You can also choose `'lines'` to show lines connecting the target to where the gaze landed.
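If you pass a custom list of points instead, it can help to sanity-check the coordinates before launching. A minimal sketch, assuming the normalized `[-1, 1]` convention from the API docstring (the point layout and variable names here are purely illustrative):

```python
# Hypothetical custom layout: four inner corners plus center, in normalized coordinates
custom_points = [(-0.4, 0.4), (0.4, 0.4), (0.0, 0.0), (-0.4, -0.4), (0.4, -0.4)]

# Sanity-check that every coordinate is inside the documented [-1, 1] range
assert all(-1 <= x <= 1 and -1 <= y <= 1 for x, y in custom_points)

# The list could then be passed as:
# ET_controller.calibrate(calibration_points=custom_points)
```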
+
+::: callout-tip
+### Customizing Calibration Parameters
+
+Beyond the basic configuration, you can customize several aspects of the calibration procedure to match your experimental needs:
+
+**`infant_stims`**: By default (`infant_stims=True`), DeToX uses a set of engaging animal cartoons included with the package. These work well for most infant studies, but you can provide a list of file paths to your own images if you prefer custom stimuli that better match your study's theme or participant age group.
+
+**`audio`**: By default (`audio=True`), an attention-getting sound included with DeToX plays automatically when stimuli appear. Set this to `None` to disable audio entirely, or pass your own pre-loaded `psychopy.sound.Sound` object to use custom sounds that fit your experimental context.
+
+**`calibration_points`**: By default uses a 5-point pattern (corners plus center). You can specify `9` for higher precision, or pass a custom list of points in normalized coordinates ([-1, 1]) tailored to your stimulus regions.
+:::
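A note on how stimuli map to points: if you supply fewer images than calibration points, DeToX repeats them in sequence until every point has one (e.g., 3 stimuli for 7 points becomes s1, s2, s3, s1, s2, s3, s1). A standalone sketch of that cycling behavior (illustrative only, not DeToX's internal code; the file names are made up):

```python
from itertools import cycle, islice

stims = ["cat.png", "dog.png", "duck.png"]  # hypothetical image paths
num_points = 7

# Repeat the stimuli in sequence until every calibration point has one assigned
assigned = list(islice(cycle(stims), num_points))
print(assigned)
# ['cat.png', 'dog.png', 'duck.png', 'cat.png', 'dog.png', 'duck.png', 'cat.png']
```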
+
+### The Calibration Process
+
+#### Step 1: Instructions
+
+First, you'll see instructions explaining the controls:
+
+``` markdown
+┌──────────────────── Calibration Setup ─────────────────────┐
+│Mouse-Based Calibration Setup: │
+│ │
+│ - Press number keys (1-5) to select calibration points │
+│ - Move your mouse to the animated stimulus │
+│ - Press SPACE to collect samples at the selected point │
+│ - Press ENTER to finish collecting and see results │
+│ - Press ESCAPE to exit calibration │
+│ │
+│ Any key will start calibration immediately! │
+└────────────────────────────────────────────────────────────┘
+```
+
+These instructions explain how to control the calibration from your keyboard.
+
+Press any key when you're ready to begin.
+
+#### Step 2: Data Collection
+
+The calibration screen appears with a thin red border indicating you're in calibration mode. Press a number key (1-5) to display an animated stimulus at the corresponding calibration point. The participant should look at the stimulus while it animates. When you're confident they're fixating on the target, press **SPACE** to collect gaze samples. The system waits briefly (0.25 seconds) to ensure stable fixation, then records the eye tracking data for that point. Repeat this process for all calibration points. You don't need to go in order.
+
+#### Step 3: Review Results
+
+Once you've collected data for all points (or whenever you're satisfied), press **ENTER** to compute and visualize the calibration results. You'll see a display showing the calibration targets, along with dots (or lines, depending on your `visualization_style`) marking where the gaze samples actually landed.
+
+At this stage, you have several options:
+
+``` markdown
+┌────────────── Calibration Results ───────────────┐
+│Review calibration results above. │
+│ │
+│ - Press ENTER to accept calibration │
+│ - Press Numbers → SPACE to retry some points │
+│ - Press ESCAPE to restart calibration │
+│ │
+└──────────────────────────────────────────────────┘
+```
+
+**Accept the calibration**: If the results look good across all points, press **ENTER** to accept and move forward with your experiment.
+
+**Retry specific points**: Notice one or two points with poor accuracy? Press the number keys corresponding to those points—they'll highlight in yellow to confirm your selection. You can select multiple points if needed. Once you've marked all points for retry, press **SPACE** again to recollect data for just those points. This targeted approach is especially valuable with infants and children, where you might get excellent data at most points but struggle with one or two locations.
+
+**Start over**: If the overall quality is poor or you want a fresh start, press **ESCAPE** to discard all data and restart the entire calibration procedure from the beginning.
+
+::: callout-important
+### Simulation Mode
+
+When running with `simulate=True`, the calibration procedure uses mouse position instead of real eye tracker data. Press number keys to display stimuli at calibration points, move your mouse to each target location, and press SPACE to "collect" samples. This allows you to test your entire experiment workflow without needing physical eye tracking hardware.
+:::
+
+### Calibration Complete!
+
+Congratulations! You've successfully positioned your participant and completed the calibration procedure using DeToX. With accurate calibration in place, you're now ready to present your experimental stimuli and collect high-quality eye tracking data.
+
+Here's a video demonstrating the entire calibration workflow from start to finish:
+
+```{=html}
+<div style="padding:56.25% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/1139719474?badge=0&autopause=0&player_id=0&app_id=58479" frameborder="0" allow="autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share" referrerpolicy="strict-origin-when-cross-origin" style="position:absolute;top:0;left:0;width:100%;height:100%;" title="Calibration procedure using DeToX"></iframe></div><script src="https://player.vimeo.com/api/player.js"></script>
+```
+
+## Save and Load calibration
+
+While we recommend performing calibration at the start of each session to ensure optimal accuracy, DeToX also allows you to save and load calibration data for convenience. In our opinion, this should only be used in special circumstances where there is a headrest and little to no chance of movement between sessions. However, if you need to save and reuse calibration data, here's how:
+
+### Saving Calibration
+
+After completing a successful calibration, you can save the calibration data to a file:
+
+``` python
+# Save with custom filename
+ET_controller.save_calibration(filename="S01_calibration.dat")
+```
+
+``` markdown
+┌────── Calibration Saved ─────┐
+│Calibration data saved to: │
+│S01_calibration.dat │
+└──────────────────────────────┘
+```
+
+The calibration data is saved as a binary file (`.dat` format) that can be reloaded in future sessions. If you don't specify a filename, DeToX automatically generates a timestamped name like `2024-01-15_14-30-00_calibration.dat`.
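The default timestamped name can be reproduced with a `strftime` pattern like the one below (a sketch of the naming scheme described above; the exact format string DeToX uses internally is an assumption):

```python
from datetime import datetime

# Produces a name like "2024-01-15_14-30-00_calibration.dat"
filename = datetime.now().strftime("%Y-%m-%d_%H-%M-%S") + "_calibration.dat"
print(filename)
```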
+
+You can also choose to use a GUI file dialog to select the save location:
+
+``` python
+# Save with GUI file dialog
+ET_controller.save_calibration(use_gui=True)
+```
+
+### Loading Calibration
+
+To reuse a previously saved calibration in a new session:
+
+``` python
+# Load from specific file
+ET_controller.load_calibration(filename="S01_calibration.dat")
+```
+
+``` markdown
+┌───── Calibration Loaded ──────┐
+│Calibration data loaded from: │
+│S01_calibration.dat │
+└───────────────────────────────┘
+```
+
+or again, use a GUI file dialog to select the file:
+
+``` python
+# Load with GUI file dialog
+ET_controller.load_calibration(use_gui=True)
+```
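Putting save and load together, a common session pattern is to reuse a saved calibration when its file exists and otherwise calibrate and save a fresh one. A sketch under that assumption (`ensure_calibration` is a hypothetical helper, not part of DeToX; only `calibrate`, `save_calibration`, and `load_calibration` come from the examples above):

```python
import os

def ensure_calibration(controller, path="S01_calibration.dat"):
    """Load a saved calibration if the file exists; otherwise run and save a new one."""
    if os.path.exists(path):
        controller.load_calibration(filename=path)
        return "loaded"
    controller.calibrate(calibration_points=5)
    controller.save_calibration(filename=path)
    return "calibrated"

# Usage with a real controller:
# ensure_calibration(ET_controller, path="S01_calibration.dat")
```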
+
\ No newline at end of file
diff --git a/docs/Vignettes/GettingStarted.html b/docs/Vignettes/GettingStarted.html
index d5a4a0f..3bffc12 100644
--- a/docs/Vignettes/GettingStarted.html
+++ b/docs/Vignettes/GettingStarted.html
@@ -123,6 +123,18 @@
});
+
@@ -159,6 +171,10 @@
So you’re interested in using DeToX? Awesome! Let’s get you set up quickly.
DeToX is designed as a lightweight wrapper around PsychoPy and tobii_research. Here’s the good news: tobii_research usually comes bundled with PsychoPy, which means the only real hurdle is installing PsychoPy itself. And yes, PsychoPy can be a bit tricky to install due to its many dependencies—but don’t worry, we’ll walk you through it. Once PsychoPy is up and running, adding DeToX is a breeze.
+
+
+
+
+
+
+Eye Tracker Drivers Required
+
+
+
+
Before DeToX can communicate with your Tobii eye tracker, you need to ensure the correct drivers are installed on your computer. The easiest way to do this is by downloading and installing the Tobii Pro Eye Tracker Manager, a free tool from Tobii that installs the necessary drivers for your device.
+
As a bonus, the Eye Tracker Manager also includes a built-in calibration tool (designed for adult self-paced calibration), which can be useful for testing that your hardware is working properly before running DeToX.
+
+
Installing PsychoPy
Since PsychoPy is the main challenge, let’s tackle that first. You have two main options:
@@ -278,16 +305,16 @@
Installing PsychoPyThis method is ideal if you prefer working in an IDE (like Positron, VS Code, PyCharm, or Spyder) and want full control over your Python environment.
Step 1: Create a Virtual Environment
-
We like to use miniforge to handle our environments and Python installations. Any other method would work as well, but for simplicity we’ll show you how we prefer to do it.
+
We like to use miniconda to handle our environments and Python installations. Any other method would work as well, but for simplicity we’ll show you how we prefer to do it.
We recommend using Python 3.10 for the best compatibility:
-
mamba create -n detox_env python=3.10
+
conda create -n detox_env python=3.10
This will create an environment called detox_env with python 3.10. Exactly what we need!
You will probably need to confirm by pressing y, and after a few seconds you’ll have your environment with Python 3.10! Great!
Step 2: Activate Environment and Install PsychoPy
Now let’s activate this environment (making sure we’re using it) and then install PsychoPy:
-
mamba activate detox_env
+
conda activate detox_env
pip install psychopy
This will take some time, but once it finishes you'll have PsychoPy in your environment.
Again, confirm if needed and you’re done! Amazing!
@@ -311,29 +338,14 @@
Step 1:
Installing DeToX
Once PsychoPy is installed, we can look at DeToX. Let's get our hands dirty! The installation command is the same for both the Package and Standalone PsychoPy installations, but the steps to run it differ.
-
-
-
-
-
-
-DeToX is Still in Development
-
-
-
-
DeToX isn’t yet available on PyPI, so you’ll need to install it directly from our GitHub repository. Don’t worry—it’s straightforward, and we’ll guide you through it!
-
One requirement: You need Git installed on your system.
-
📥 Don’t have Git? Download it from git-scm.com—installation takes just a minute.
Wait a few seconds, confirm if needed, and you are done!
@@ -344,12 +356,26 @@
Installing DeToX
Select “Plugins/package manager…”
Click on “Packages” in the top tabs
Click the “Open PIP terminal” button
-
Type the following command:pip install git+https://github.com/DevStart-Hub/DeToX.git
+
+
Type the following command: pip install devst-detox
That’s it! You now have both PsychoPy and DeToX installed and ready to use.
+
+
+
+
+
+
+Install from git
+
+
+
+
If you want to install the latest development version of DeToX directly from the GitHub repository, you can do so by replacing the installation command with the following:
---
-title: "Installation"
-author: Tommaso Ghilardi
----
-
-So you're interested in using DeToX? Awesome! Let's get you set up quickly.
-
-DeToX is designed as a lightweight wrapper around **PsychoPy** and **tobii_research**. Here's the good news: `tobii_research` usually comes bundled with PsychoPy, which means the only real hurdle is installing PsychoPy itself. And yes, PsychoPy *can* be a bit tricky to install due to its many dependencies—but don't worry, we'll walk you through it. Once PsychoPy is up and running, adding DeToX is a breeze.
-
-## Installing PsychoPy
-
-Since PsychoPy is the main challenge, let's tackle that first. You have **two main options**:
-
-- **Package Installation**
-
- Install PsychoPy like any other Python package using `pip`. This approach is flexible and ideal if you prefer working in an IDE (like **Positron**, **VS Code**, **PyCharm**, or **Spyder**) where you have full control over your Python environment.
-
-- **Standalone Installation**
-
- Use the PsychoPy standalone installer, which bundles PsychoPy and all its dependencies into a single, ready-to-use application. This is often the **easiest way to get started**, especially if you're not familiar with managing Python environments or just want to hit the ground running.
-
-We like installing psychopy as a package but you do you!
-
-::: panel-tabset
-### Package
-
-This method is ideal if you prefer working in an IDE (like Positron, VS Code, PyCharm, or Spyder) and want full control over your Python environment.
-
-#### Step 1: Create a Virtual Environment
-
-We like to use miniforge to handle our environments and Python installations. Any other method would work as well, but for simplicity we'll show you how we prefer to do it.
-
-*We recommend using Python 3.10 for the best compatibility:*
-
-``` bash
-mamba create -n detox_env python=3.10
-```
-
-This will create an environment called detox_env with python 3.10. Exactly what we need!
-
-You will probably need to confirm by pressing `y`, and after a few seconds you'll have your environment with Python 3.10! Great!
-
-#### Step 2: Activate Environment and Install PsychoPy
-
-Now let's activate this environment (making sure we're using it) and then install PsychoPy:
-
-``` bash
-mamba activate detox_env
-pip install psychopy
-```
-
-this will take some time but if you are lucky you will have psychopy in your enviroment
-
-Again, confirm if needed and you're done! Amazing!
-
-### Standalone
-
-PsychoPy is a large package with many dependencies, and sometimes (depending on your operating system) installing it can be quite tricky! For this reason, the PsychoPy website suggests using the standalone installation method. This is like installing regular software on your computer - it will install PsychoPy and all its dependencies in one go.
-
-#### Step 1: Install PsychoPy Standalone
-
-1. Go to the [PsychoPy download page](https://www.psychopy.org/download.html)
-
-2. Download the standalone installer for your operating system
-
-3. Run the installer and follow the setup instructions
-
-You are done!!! Great!
-:::
-
-## Installing DeToX
-
-Once PsychoPy is installed, we can look at DeToX. Let's gets our hand dirty! The installation is the same for both the Package and Standalone PsychoPy installations but some steps differ.
-
-::: callout-warning
-## DeToX is Still in Development
-
-DeToX isn't yet available on PyPI, so you'll need to install it directly from our **GitHub repository**. Don't worry—it's straightforward, and we'll guide you through it!
-
-**One requirement:** You need **Git** installed on your system.
-
-📥 **Don't have Git?** Download it from [git-scm.com](https://git-scm.com/)—installation takes just a minute.
-:::
-
-::: panel-tabset
-### Package
-
-Again make sure to be in the correct environment if you installed PsychoPy as a package. with the following command:
-
-``` bash
-mamba activate detox_env
-```
-
-Then simply run:
-
-``` bash
-pip install git+https://github.com/DevStart-Hub/DeToX.git
-```
-
-Wait a few seconds, confirm if needed, and you are done!
-
-### Standalone
-
-1. **Open PsychoPy**
-
-2. **Go to Coder View** (the interface with the code editor)
-
-3. **Open the Tools menu**
-
-4. **Select "Plugins/package manager..."**
-
-5. **Click on "Packages"** in the top tabs
-
-6. **Click the "Open PIP terminal" button**
-
-7. **Type the following command:** `pip install git+https://github.com/DevStart-Hub/DeToX.git`
-
-That's it! You now have both PsychoPy and DeToX installed and ready to use.
-:::
-
-::: callout-warning
-## Important: DeToX requires coding
-
-DeToX is a code-based library that works with PsychoPy's Coder interface. If you typically use PsychoPy's Builder (the drag-and-drop visual interface), you'll need to switch to the Coder interface to use DeToX. Don't worry - we provide plenty of code examples to get you started!
-:::
+
---
+title: "Installation"
+description-meta: Learn how to install DeToX. DeToX is a lightweight wrapper around PsychoPy and tobii_research, so you'll need to install PsychoPy first. This guide walks you through the installation process step by step.
+keywords: [DeToX, PsychoPy, tobii_research, installation, eye-tracking, Tobii, Python, miniconda, virtual environment]
+author: Tommaso Ghilardi
+---
+
+So you're interested in using DeToX? Awesome! Let's get you set up quickly.
+
+DeToX is designed as a lightweight wrapper around **PsychoPy** and **tobii_research**. Here's the good news: `tobii_research` usually comes bundled with PsychoPy, which means the only real hurdle is installing PsychoPy itself. And yes, PsychoPy *can* be a bit tricky to install due to its many dependencies—but don't worry, we'll walk you through it. Once PsychoPy is up and running, adding DeToX is a breeze.
+
+::: callout-note
+## Eye Tracker Drivers Required
+
+Before DeToX can communicate with your Tobii eye tracker, you need to ensure the correct drivers are installed on your computer. The easiest way to do this is by downloading and installing the [Tobii Pro Eye Tracker Manager](https://www.tobii.com/products/software/applications-and-developer-kits/tobii-pro-eye-tracker-manager), a free tool from Tobii that installs the necessary drivers for your device.
+
+As a bonus, the Eye Tracker Manager also includes a built-in calibration tool (designed for adult self-paced calibration), which can be useful for testing that your hardware is working properly before running DeToX.
+:::
+
+## Installing PsychoPy
+
+Since PsychoPy is the main challenge, let's tackle that first. You have **two main options**:
+
+- **Package Installation**
+
+ Install PsychoPy like any other Python package using `pip`. This approach is flexible and ideal if you prefer working in an IDE (like **Positron**, **VS Code**, **PyCharm**, or **Spyder**) where you have full control over your Python environment.
+
+- **Standalone Installation**
+
+ Use the PsychoPy standalone installer, which bundles PsychoPy and all its dependencies into a single, ready-to-use application. This is often the **easiest way to get started**, especially if you're not familiar with managing Python environments or just want to hit the ground running.
+
+We like installing PsychoPy as a package, but you do you!
+
+::: panel-tabset
+### Package
+
+This method is ideal if you prefer working in an IDE (like Positron, VS Code, PyCharm, or Spyder) and want full control over your Python environment.
+
+#### Step 1: Create a Virtual Environment
+
+We like to use miniconda to handle our environments and Python installations. Any other method would work as well, but for simplicity we'll show you how we prefer to do it.
+
+*We recommend using Python 3.10 for the best compatibility:*
+
+``` bash
+conda create -n detox_env python=3.10
+```
+
+This will create an environment called `detox_env` with Python 3.10. Exactly what we need!
+
+You will probably need to confirm by pressing `y`, and after a few seconds you'll have your environment with Python 3.10! Great!
+
+#### Step 2: Activate Environment and Install PsychoPy
+
+Now let's activate this environment (making sure we're using it) and then install PsychoPy:
+
+``` bash
+conda activate detox_env
+pip install psychopy
+```
+
+This will take some time, but once it finishes you'll have PsychoPy in your environment.
+
+Again, confirm if needed and you're done! Amazing!
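Before moving on, you can confirm the environment is active and that it resolves to the Python you just created (a simple shell check, nothing DeToX-specific; the expected version assumes you followed the steps above):

```shell
# Should report the Python from detox_env (3.10.x if you followed along)
python --version
```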
+
+### Standalone
+
+PsychoPy is a large package with many dependencies, and sometimes (depending on your operating system) installing it can be quite tricky! For this reason, the PsychoPy website suggests using the standalone installation method. This is like installing regular software on your computer - it will install PsychoPy and all its dependencies in one go.
+
+#### Step 1: Install PsychoPy Standalone
+
+1. Go to the [PsychoPy download page](https://www.psychopy.org/download.html)
+
+2. Download the standalone installer for your operating system
+
+3. Run the installer and follow the setup instructions
+
+You are done!!! Great!
+:::
+
+## Installing DeToX
+
+Once PsychoPy is installed, we can look at DeToX. Let's get our hands dirty! The installation command is the same for both the Package and Standalone PsychoPy installations, but the steps to run it differ.
+
+::: panel-tabset
+### Package
+
+If you installed PsychoPy as a package, first make sure you're in the correct environment:
+
+``` bash
+conda activate detox_env
+```
+
+Then simply run:
+
+``` bash
+pip install devst-detox
+```
+
+Wait a few seconds, confirm if needed, and you are done!
+
+### Standalone
+
+1. **Open PsychoPy**
+
+2. **Go to Coder View** (the interface with the code editor)
+
+3. **Open the Tools menu**
+
+4. **Select "Plugins/package manager..."**
+
+5. **Click on "Packages"** in the top tabs
+
+6. **Click the "Open PIP terminal" button**
+
+7. **Type the following command:** `pip install devst-detox`
+
+That's it! You now have both PsychoPy and DeToX installed and ready to use.
+:::
+
+::: callout-tip
+## Install from git
+
+If you want to install the latest development version of DeToX directly from the GitHub repository, you can do so by replacing the installation command with the following:
+
+``` bash
+pip install git+https://github.com/DevStart-Hub/DeToX.git
+```
+:::
+
+::: callout-warning
+## Important: DeToX requires coding
+
+DeToX is a code-based library that works with PsychoPy's Coder interface. If you typically use PsychoPy's Builder (the drag-and-drop visual interface), you'll need to switch to the Coder interface to use DeToX. Don't worry - we provide plenty of code examples to get you started!
+:::
Base class with common functionality for both calibration types.
This abstract base class provides shared calibration functionality for both Tobii hardware-based and mouse-based simulation calibration sessions. It handles visual presentation, user interaction, animation, and result visualization while delegating hardware-specific data collection to subclasses.
Display a message on screen and in console, then wait for keypress.
-
Shows formatted message both in the PsychoPy window and console output, then pauses execution until any key is pressed. Useful for instructions and status messages during calibration.
+
Shows formatted message both in the PsychoPy window and console output, enforces a minimum display time for readability and system stabilization, then pauses execution until any key is pressed.
Parameters
-
-
-
-
+
+
+
+
@@ -602,42 +607,23 @@
body
str
-
The main message text to display. Will be formatted with box-drawing characters via NicePrint.
+
The main message text to display.
required
title
str
-
Title for the message box. Appears at the top of the formatted box. Default empty string.
+
Title for the message box. Default empty string.
''
pos
tuple
-
Position of the message box center on screen in window units. Default (0, -0.15) places message slightly below center.
+
Position of the message box center on screen. Default (0, -0.15).
Display a message on screen and in console, then wait for keypress. Shows formatted message both in the PsychoPy window and console output,
-then pauses execution until any key is pressed. Useful for instructions
-and status messages during calibration.
+enforces a minimum display time for readability and system stabilization,
+then pauses execution until any key is pressed.

#### Parameters {.doc-section .doc-section-parameters}
-| Name | Type | Description | Default |
-|--------|--------|------------------------------------------------------------------------------------------------------------------------|--------------|
-| body | str | The main message text to display. Will be formatted with box-drawing characters via NicePrint. | _required_ |
-| title | str | Title for the message box. Appears at the top of the formatted box. Default empty string. | `''` |
-| pos | tuple | Position of the message box center on screen in window units. Default (0, -0.15) places message slightly below center. | `(0, -0.15)` |
-
-#### Returns {.doc-section .doc-section-returns}
-
-| Name | Type | Description |
-|--------|--------|---------------|
-| | None | |
+| Name | Type | Description | Default |
+|--------|--------|-------------------------------------------------------------------|--------------|
+| body | str | The main message text to display. | _required_ |
+| title | str | Title for the message box. Default empty string. | `''` |
+| pos | tuple | Position of the message box center on screen. Default (0, -0.15). | `(0, -0.15)` |