Sensing#

Brain Sensing#

Reset Timer#

The Reset Timer block is used to reset the EXP Brain’s timer.

Reset Timer block for EXP Brain, resets timer to 0 seconds after 2 seconds delay, displaying current time.

The Brain’s timer starts counting at the beginning of each project. The Reset Timer block resets the timer back to 0 seconds.

In this example, the Brain will print the current time after waiting 2 seconds before resetting its timer.
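The timer’s behavior can be modeled in plain Python (an illustrative sketch using `time.monotonic`, not the actual VEX API):

```python
import time

class BrainTimer:
    """Minimal model of the Brain's timer: it starts at 0 when the
    project begins, and reset() returns it to 0 seconds."""
    def __init__(self):
        self._start = time.monotonic()

    def reset(self):
        # Resetting simply records a new starting point in time.
        self._start = time.monotonic()

    def value(self):
        # Reports elapsed seconds as a decimal value.
        return time.monotonic() - self._start

timer = BrainTimer()
time.sleep(0.2)          # stand-in for the 2-second wait in the example
before = timer.value()   # nonzero elapsed time
timer.reset()
after = timer.value()    # back near 0 seconds
```

The same pattern applies to the Timer Value block below: reading the timer reports the seconds elapsed since the project started or since the last reset.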

Illustration of the Reset Timer block for the EXP Brain, demonstrating timer reset functionality.

Timer Value#

The Timer Value block is used to report the value of the EXP Brain’s timer in seconds.

The timer starts at 0 seconds when the program starts, and reports the timer’s value as a decimal value.

In this example, the Brain will wait 2 seconds and then print the timer’s current value on its screen.

Brain Button Pressed#

The Brain Button Pressed block is used to report if a button on the VEX EXP Brain is pressed.

Image of the Brain Button Pressed block, showing its function in reporting button press status on the VEX EXP Brain.

The Brain Button Pressed block reports True when the selected Brain button is pressed.

The Brain Button Pressed block reports False when the selected Brain button is not pressed.

Choose which Brain button to use on the EXP Brain.

Image of the VEX EXP Brain button pressed indicator showing the current button status on the display screen.

In this example, the Brain will print a message on its screen the first time the Right Brain button is pressed.
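Reacting only to the first press of a button means detecting the transition from not-pressed to pressed, rather than the pressed state itself. A plain-Python sketch of that edge-detection pattern (not the VEX API):

```python
def first_press_events(samples):
    """Given a sequence of pressed/not-pressed readings, return the
    indices where a new press begins (False -> True transitions).
    Models 'do something the first time the button is pressed'."""
    events = []
    was_pressed = False
    for i, pressed in enumerate(samples):
        if pressed and not was_pressed:
            events.append(i)
        was_pressed = pressed
    return events

# A held button only triggers once per press:
readings = [False, True, True, False, True]
print(first_press_events(readings))  # → [1, 4]
```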

Image showing the VEX EXP Brain displaying the timer reset and button pressed status during a programming example.

Cursor Column#

The Cursor Column block is used to report the column number of the EXP Brain’s screen cursor location.

Image showing the cursor column block for the VEX EXP Brain, indicating cursor position on the screen.

The Cursor Column block will report a value from 1-80 and will start on column 1 at the start of a project.

In this example, the Brain will move the cursor to (3, 7) and then print the current column (7) on the Brain’s screen.

Image showing a cursor column example on the EXP Brain's screen with timer and button press information.

Cursor Row#

The Cursor Row block is used to report the row number of the EXP Brain’s screen cursor location.

The Cursor Row block will report a value from 1-9 and will start on row 1 at the start of a project.

In this example, the Brain will move the cursor to (3, 7) and then print the current row (3) on the Brain’s screen.
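The two cursor examples above both set the cursor to (3, 7): the first value is the row and the second is the column. A small plain-Python model of the cursor (the clamping to the documented 1-9 and 1-80 ranges is an illustrative assumption, not documented block behavior):

```python
class ScreenCursor:
    """Model of the Brain screen cursor: rows 1-9, columns 1-80,
    starting at row 1, column 1. Positions are given as (row, column)."""
    def __init__(self):
        self.row, self.column = 1, 1

    def set_cursor(self, row, column):
        # Clamp to the valid ranges reported by the blocks (assumption).
        self.row = max(1, min(9, row))
        self.column = max(1, min(80, column))

cursor = ScreenCursor()
cursor.set_cursor(3, 7)
print(cursor.row, cursor.column)  # → 3 7
```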

Image showing an example of cursor row and column positions on a VEX EXP Brain display.

Battery Voltage#

The Battery Voltage block is used to report the voltage of the EXP Brain’s battery.

Battery voltage block diagram showing voltage readings for the EXP Brain's battery in a robotics context.

The Battery Voltage block reports a range from 6 volts to 9 volts.

In this example, the Brain will print its current battery voltage on the Brain’s screen.

Battery voltage example showing the EXP Brain's voltage reading displayed on its screen.

Battery Current#

The Battery Current block is used to report the current of the EXP Brain’s battery.

Diagram illustrating battery current monitoring for the EXP Brain's functionality in robotics.

The Battery Current block reports a range from 0.0 amps to 15.0 amps.

In this example, the Brain will print its current battery current on the Brain’s screen.

Graph showing battery voltage, current, and capacity readings from the EXP Brain during a project.

Battery Capacity#

The Battery Capacity block is used to report the charge level of the EXP Brain’s battery.

Diagram illustrating battery capacity reporting for the EXP Brain, showing charge levels from 0% to 100%.

The Battery Capacity block reports a range from 0% to 100%.

In this example, the Brain will print its current battery charge on the Brain’s screen.

Diagram illustrating battery capacity monitoring for the EXP Brain, showing charge levels from 0% to 100%.
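The three battery blocks above report voltage (6-9 V), current (0.0-15.0 A), and capacity (0-100%). A hedged sketch of how a project might combine these readings; the `battery_status` helper and its 20% warning threshold are illustrative choices, not part of the VEX API:

```python
def battery_status(voltage, current, capacity):
    """Summarize battery readings against the documented ranges:
    voltage 6-9 V, current 0.0-15.0 A, capacity 0-100 %.
    The 20 % warning threshold is an arbitrary example choice."""
    in_range = (6.0 <= voltage <= 9.0
                and 0.0 <= current <= 15.0
                and 0 <= capacity <= 100)
    if not in_range:
        return "reading out of range"
    return "low battery" if capacity < 20 else "ok"

print(battery_status(8.4, 1.2, 85))  # → ok
```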

Controller Sensing#

Controller Pressed#

The Controller Pressed block is used to report if a button on the EXP Controller is pressed.

Image showing the Controller Pressed block indicating if a button on the EXP Controller is pressed or not.

The Controller Pressed block reports True when the selected Controller button is pressed.

The Controller Pressed block reports False when the selected Controller button is not pressed.

Choose which Controller button to use.

Image of a VEX EXP Controller button being pressed, indicating input for sensing actions in robotics.

In this example, the Brain will print a message on its screen the first time the A button on the controller is pressed.

Image showing a VEX EXP Brain with the Controller Pressed block highlighted, indicating button press status.

Position of Controller#

The Position of Controller block is used to report the position of a joystick on the EXP Controller along an axis.

Image of a controller position block displaying joystick axis position and button press status for the EXP Brain.

The Position of Controller block reports a range from -100 to 100.

The Position of Controller block reports 0 when the joystick axis is centered.
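Because the block reports -100 to 100 with 0 at center, driver-control code often ignores small off-center readings. A plain-Python sketch of that common deadband pattern (the deadband and its threshold are an illustrative convention, not a feature of the block):

```python
def apply_deadband(position, threshold=5):
    """Treat small joystick readings as centered. The block reports
    -100 to 100, with 0 when the axis is centered; a deadband
    (the threshold value here is an arbitrary example choice)
    ignores tiny off-center readings caused by stick drift."""
    if abs(position) < threshold:
        return 0
    return position

print(apply_deadband(3))    # → 0
print(apply_deadband(-42))  # → -42
```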

Choose the joystick’s axis.

Image showing the controller position axis for joystick input on the EXP Controller in a robotics project.

In this example, the Brain will print the position of the 3rd axis of the EXP Controller’s joysticks.

Example of controller position sensing with joystick axes and button press indicators on the EXP Brain's display.

Controller Enable/Disable#

The Controller Enable/Disable block is used to enable or disable the Controller actions configured in the Devices menu.

Image of a controller block used for sensing and reporting various controller actions and states in a robotics project.

Choose to either enable or disable the configured Controller actions. By default, the Controller is Enabled in every project.

In this example, the Controller will be disabled at the start of the project and be re-enabled after the drivetrain has moved forward for 6 inches.

Motor Sensing#

Motor is Done?#

The Motor is Done? block is used to report if the selected EXP Smart Motor or Motor Group has completed its movement.

Image showing a motor status block indicating if the motor has completed its movement in a robotics project.

The Motor is Done? block reports True when the selected Motor or Motor Group has completed its movement.

The Motor is Done? block reports False when the selected Motor or Motor Group is still moving.

Choose which Motor or Motor Group to use.

Image showing the "Motor is Done?" block indicating if the motor has completed its movement in the VEX EXP Brain interface.

Motor is Spinning?#

The Motor is Spinning? block is used to report if the selected EXP Smart Motor or Motor Group is currently moving.

Image of a motor spinning, indicating its current operational status in a robotics context.

The Motor is Spinning? block reports True when the selected Motor or Motor Group is moving.

The Motor is Spinning? block reports False when the selected Motor or Motor Group is not moving.

Choose which Motor or Motor Group to use.

Image showing a motor status indicator with "Motor is Spinning?" text and related graphical elements.

Position of Motor#

The Position of Motor block is used to report the distance an EXP Smart Motor or the first motor of a Motor Group has traveled.

Diagram of motor position sensing blocks in VEX EXP Brain programming interface.

Choose which Motor or Motor Group to use.

Diagram illustrating motor position and various sensing parameters for robotic control systems.

Choose the units to report in, degrees or turns.

Diagram illustrating motor position units and their measurements in robotics.

In this example, the Motor will spin forward for 1 second before its current position is printed on the Brain’s screen.

Diagram illustrating motor position sensing parameters and values in a robotic control system.

Velocity of Motor#

The Velocity of Motor block is used to report the current velocity of an EXP Smart Motor or the first motor of a Motor Group.

Motor velocity block diagram illustrating motor performance parameters like speed, power, and torque in a robotics context.

The Velocity of Motor block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose which Motor or Motor Group to use.

Diagram illustrating motor velocity parameters including current, power, torque, and efficiency for robotics applications.

Choose the units to report in, percent (%) or rpm.

Diagram illustrating motor velocity units for the EXP Brain, showing various motor performance metrics and measurements.

In this example, the Motor will spin forward for 1 second before its current velocity is printed on the Brain’s screen.

Graph illustrating motor velocity example with various parameters and measurements related to motor performance.

Current of Motor#

The Current of Motor block is used to report the amount of current an EXP Smart Motor or Motor Group is drawing in amperes (amps).

Diagram of motor current sensing block showing parameters like current, power, torque, and efficiency for motor control.

Choose which Motor or Motor Group to use.

Diagram illustrating motor current parameters including motor power, torque, and efficiency in robotics systems.

In this example, the Motor will spin forward for 1 second before its current is printed on the Brain’s screen.

Diagram illustrating motor current monitoring and control parameters for robotic systems.

Power of Motor#

The Power of Motor block is used to report the amount of power output an EXP Smart Motor or the first motor of a Motor Group is currently generating.

Image of a motor power block displaying motor power output data for a robotic control system.

Choose which Motor or Motor Group to use.

Diagram illustrating motor power parameters including torque, efficiency, and temperature for robotic systems.

In this example, the Motor will spin forward for 1 second before its current power is printed on the Brain’s screen.

Diagram illustrating motor power sensing parameters including current, voltage, torque, and efficiency metrics.

Torque of Motor#

The Torque of Motor block is used to report the amount of torque (rotational force) an EXP Smart Motor or the first motor of a Motor Group is currently using.

Image of a motor torque block displaying motor torque values and settings in a robotics context.

The Torque of Motor block reports a range from 0.0 to 18.6 inch-pounds (InLB) or 0.0 to 2.1 Newton-meters (Nm).
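The two range limits describe the same torque in different units. A quick conversion check in plain Python (1 inch-pound ≈ 0.1129848 newton-meters):

```python
NM_PER_INLB = 0.1129848  # 1 inch-pound ≈ 0.1129848 newton-meters

def inlb_to_nm(inlb):
    """Convert inch-pounds to newton-meters."""
    return inlb * NM_PER_INLB

def nm_to_inlb(nm):
    """Convert newton-meters to inch-pounds."""
    return nm / NM_PER_INLB

# The two documented range maximums agree with each other:
print(round(nm_to_inlb(2.1), 1))  # → 18.6
```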

Choose which Motor or Motor Group to use.

Diagram illustrating motor torque parameters including power, efficiency, and temperature for motor control systems.

Choose the units to report in, Nm or InLb.

Diagram illustrating motor torque units and their measurement scales for robotics applications.

In this example, the Motor will spin forward for 1 second before its current torque is printed on the Brain’s screen.

Diagram illustrating motor torque measurement parameters and examples in a robotics context.

Efficiency of Motor#

The Efficiency of Motor block is used to report the efficiency of an EXP Smart Motor or the first motor of a Motor Group.

Diagram illustrating motor efficiency metrics including power, torque, and temperature for robotic systems.

The Efficiency of Motor block reports a range from 0% to 100%, determined by comparing the power (in watts) the motor draws (input) against the power (in watts) the motor delivers (output).

An EXP Smart Motor or Motor Group typically reaches a maximum efficiency of 65% under normal use cases.
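The efficiency calculation can be written out directly. A minimal sketch of the output-over-input ratio the block reports (the function and sample wattages are illustrative, not the VEX API):

```python
def efficiency_percent(input_watts, output_watts):
    """Efficiency as the block reports it: output power divided by
    input power, expressed on a 0-100 % scale."""
    if input_watts <= 0:
        return 0.0
    return min(100.0, 100.0 * output_watts / input_watts)

# Example values: a motor drawing 11 W and delivering 7.15 W is at
# the ~65 % typical maximum mentioned above.
print(round(efficiency_percent(11.0, 7.15), 1))  # → 65.0
```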

Choose which Motor or Motor Group to use.

Diagram illustrating motor efficiency metrics including power, torque, and temperature for motor performance analysis.

In this example, the Motor will spin forward for 1 second before its current efficiency is printed on the Brain’s screen.

Diagram illustrating motor efficiency metrics including power, torque, and temperature for motor performance analysis.

Temperature of Motor#

The Temperature of Motor block is used to report the temperature of an EXP Smart Motor or the first motor of a Motor Group.

Motor temperature block displaying data related to the temperature of the motor in a robotic system.

The Temperature of Motor block reports a range from 0% to 100%.

Choose which Motor or Motor Group to use.

Image showing the temperature readings of a motor in a robotic system, indicating operational status and efficiency.

In this example, the Motor will spin forward for 1 second before its current temperature is printed on the Brain’s screen.

Motor temperature monitoring example displaying the temperature readings of an EXP Smart Motor.

Drivetrain Sensing#

Drive is Done?#

The Drive is Done? block is used to report if the Drivetrain has completed its movement.

Image showing a block labeled "Drive is Done?" used to report if the Drivetrain has completed its movement.

The Drive is Done? block reports True when the Drivetrain’s motors have completed their movement.

The Drive is Done? block reports False when the Drivetrain’s motors are still moving.

Drive is Moving?#

The Drive is Moving? block is used to report if the Drivetrain is currently moving.

Diagram illustrating the 'Drive is Moving?' sensing block for robotics control systems.

The Drive is Moving? block reports True when the Drivetrain’s motors are moving.

The Drive is Moving? block reports False when the Drivetrain’s motors are not moving.

Drive Heading#

The Drive Heading block is used to report the direction that the Drivetrain is facing by using the Inertial sensor’s current angular position.

The Drive Heading block reports a range from 0.0 to 359.99 degrees.
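Headings wrap around: turning past 359.99 degrees returns to 0, and turning backwards past 0 wraps to just under 360. A plain-Python sketch of that wrap-around, using the modulo operator:

```python
def normalize_heading(degrees):
    """Wrap any angle into the 0.0-359.99... degree range that the
    heading blocks report. Python's % always returns a value with
    the sign of the divisor, so negative angles wrap correctly."""
    return degrees % 360.0

print(normalize_heading(450.0))  # → 90.0
print(normalize_heading(-90.0))  # → 270.0
```

This is also why heading differs from rotation (described below): heading wraps, while rotation keeps accumulating.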

In this example, the Drivetrain will turn to the right for 1 second before its current heading is printed on the Brain’s screen.

Drive Rotation#

The Drive Rotation block is used to report the Drivetrain’s angle of rotation.

A clockwise direction is reported as a positive value, and a counterclockwise direction is reported as a negative value.

In this example, the Drivetrain will turn to the left for 1 second before its current rotation is printed on the Brain’s screen.

Illustration of drive rotation example showing drivetrain movement and sensor data monitoring.

Drive Velocity#

The Drive Velocity block is used to report the current velocity of the Drivetrain.

Image showing the Drive Velocity block used to report the current velocity of the Drivetrain in a robotics project.

The Drive Velocity block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose the units to report in, percent (%) or rpm.

Image showing drive velocity units for a robotic system, including parameters like percent and rpm.

In this example, the Drivetrain will drive forward for 1 second before its current velocity is printed on the Brain’s screen.

Graph illustrating drive velocity example with various parameters related to motor and drivetrain performance.

Drive Current#

The Drive Current block is used to report the amount of current (in amps) that the Drivetrain is currently drawing.

In this example, the Drivetrain will drive forward for 1 second before its current is printed on the Brain’s screen.

Drive Power#

The Drive Power block is used to report the amount of power output the Drivetrain is currently generating.

Image of the EXP Brain's Drive Power Block, illustrating power output reporting for the drivetrain.

In this example, the Drivetrain will drive forward for 1 second before its current power is printed on the Brain’s screen.

Drive Torque#

The Drive Torque block is used to report the amount of torque (rotational force) the Drivetrain is currently using.

Image showing the drive torque block used for monitoring motor torque in a robotic control system.

The Drive Torque block reports a range from 0.0 to 18.6 inch-pounds (InLB) or 0.0 to 2.1 Newton-meters (Nm).

Choose the units to report in, Nm or InLb.

Diagram illustrating various drive torque units and their measurements in a robotics context.

In this example, the Drivetrain will drive forward for 1 second before its current torque is printed on the Brain’s screen.

Diagram illustrating drive torque parameters and motor performance metrics in robotic systems.

Drive Efficiency#

The Drive Efficiency block is used to report the efficiency of the Drivetrain.

The Drive Efficiency block reports a range from 0% to 100%, determined by comparing the power (in watts) the Drivetrain’s motors draw (input) against the power (in watts) they deliver (output).

An EXP Drivetrain typically reaches a maximum efficiency of 65% under normal use cases.

In this example, the Drivetrain will drive forward for 1 second before its current efficiency is printed on the Brain’s screen.

Illustration of a drive efficiency example showing motor and drivetrain performance metrics and sensor readings.

Drive Temperature#

The Drive Temperature block is used to report the temperature of the EXP Smart Motors powering the Drivetrain.

Image showing the drive temperature block for monitoring motor temperature in a robotic system.

The Drive Temperature block reports a range from 0% to 100%.

In this example, the Drivetrain will drive forward for 1 second before its current temperature is printed on the Brain’s screen.

Image showing a temperature reading example for drive motors in a robotic system.

Bumper Sensing#

Bumper Pressed#

The Bumper Pressed block is used to report if the Bumper Switch is pressed.

Image of a pressed bumper switch block indicating its status in a robotics programming environment.

The Bumper Pressed block reports True when the selected Bumper Switch is pressed.

The Bumper Pressed block reports False when the selected Bumper Switch is not pressed.

Choose which Bumper Switch to use.

Bumper pressed indicator on the EXP Brain, showing the status of the bumper switch in a robotics context.

In this example, the Brain will print a message on its screen the first time the Bumper Switch is pressed.

Image of the VEX EXP Brain display showing a pressed bumper switch with timer reset and button press status.

Limit Sensing#

Limit Pressed#

The Limit Pressed block is used to report if the Limit Switch is pressed.

Limit Pressed block diagram illustrating the functionality of the Limit Switch in a VEX EXP Brain project.

The Limit Pressed block reports True when the selected Limit Switch is pressed.

The Limit Pressed block reports False when the selected Limit Switch is not pressed.

Choose which Limit Switch to use.

Limit switch pressed indicator on the VEX EXP Brain, showing true or false status for the Limit Sensing block.

In this example, the Brain will print a message on its screen the first time the Limit Switch is pressed.

Image showing the Limit Pressed block in a coding environment for reporting Limit Switch status in robotics.

Gyro Sensing#

Calibrate#

The Calibrate block is used to calibrate the Gyro/Inertial Sensor to reduce the amount of drift. It is recommended to use this block at the start of a project.

Image of the Calibrate block for Gyro/Inertial Sensor, used to reduce drift during initialization.

The Brain must remain still for the calibration process to succeed, which takes approximately 2 seconds.

Choose which Gyro/Inertial Sensor to use.

Image showing a user interface for calibrating a device's gyro sensor with settings for heading and rotation.

In this example, the Brain’s Inertial Sensor will calibrate for 2 seconds before printing the current orientation of the Inertial Sensor.

Flowchart illustrating the calibration process for a Gyro/Inertial Sensor, showing steps and outputs.

Set Heading#

The Set Heading block is used to set the Gyro/Inertial sensor’s current heading position to a set value.

The Set Heading block accepts a range of 0.0 to 359.99 degrees.

Choose which Gyro/Inertial Sensor to use.

In this example, the Brain’s Inertial sensor will print its starting heading, set its heading to 90 degrees, and then print the new heading.

Set Rotation#

The Set Rotation block is used to set the Gyro/Inertial sensor’s current rotation position to a set value.

The Set Rotation block accepts any positive or negative decimal or integer number.

Choose which Gyro/Inertial Sensor to use.

Image of a rotation device used in robotics for sensing and controlling motor positions and movements.

In this example, the Brain’s Inertial sensor will print its starting rotation, set its rotation to -100 degrees, and then print the new rotation.

Example of setting rotation in a VEX EXP Brain project, showing the rotation sensor's position and angle.

Angle of Heading#

The Angle of Heading block is used to report the 3-Wire Gyro Sensor or EXP Inertial Sensor’s current heading in degrees.

The Angle of Heading block reports a range from 0.0 to 359.99 degrees.

Choose which Gyro/Inertial Sensor to use.

In this example, the Brain’s Inertial sensor will print its starting heading, set its heading to 90 degrees, and then print the new heading.

Angle of Rotation#

The Angle of Rotation block is used to report the 3-Wire Gyro Sensor or EXP Inertial Sensor’s current rotation in degrees.

A clockwise direction is reported as a positive value, and a counterclockwise direction is reported as a negative value.

Choose which Gyro/Inertial Sensor to use.

Image of a rotation device used in robotics for sensing and controlling motor positions and movements.

In this example, the Brain’s Inertial sensor will print its starting rotation, set its rotation to -100 degrees, and then print the new rotation.

Example of setting rotation in a VEX EXP Brain project, showing the rotation sensor's position and angle.

Inertial Sensing#

Acceleration of#

The Acceleration of block is used to report the acceleration value from one of the axes (x, y, or z) on the Inertial Sensor.

Image of an acceleration block used in brain sensing for reporting acceleration values in robotics applications.

The Acceleration of block reports a range from -4.0 to 4.0 Gs.

Choose which Gyro/Inertial Sensor to use.

Image of an acceleration device used for monitoring and reporting motion and orientation data in robotics applications.

Choose which axis to use:

  • x - The X-axis reports acceleration when the Inertial Sensor moves forward to backward.

  • y - The Y-axis reports acceleration when the Inertial Sensor moves side to side.

  • z - The Z-axis reports acceleration when the Inertial Sensor moves up to down.

Diagram illustrating the acceleration axis for various sensing parameters in robotics, including motor and drivetrain data.

In this example, the Drivetrain will move forward and print its current X-axis acceleration while moving.

Gyro Rate of#

The Gyro Rate of block is used to report the rate of rotation from one of the axes (x, y, or z) on the Inertial Sensor.

Image of a gyro rate block used for reporting rotation rates in degrees per second for an inertial sensor.

The Gyro Rate of block reports a range from -1000.0 to 1000.0 in dps (degrees per second).

Choose which Gyro/Inertial Sensor to use.

Image of a gyro rate device used for measuring angular rotation in robotics applications.

Choose which axis to use:

  • x - The X-axis reports rate of rotation when the Inertial Sensor rotates on the X-Axis (based on the orientation of the sensor).

  • y - The Y-axis reports rate of rotation when the Inertial Sensor rotates on the Y-Axis (based on the orientation of the sensor).

  • z - The Z-axis reports rate of rotation when the Inertial Sensor rotates in the Z-Axis (based on the orientation of the sensor).
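Rate of rotation relates to angle the way velocity relates to distance: summing rate samples over time approximates total rotation. A plain-Python sketch of that integration (the fixed time step is an illustrative simplification):

```python
def integrate_gyro_rate(rates_dps, dt):
    """Approximate total rotation in degrees by summing rate-of-rotation
    samples (degrees per second) taken at fixed intervals of dt seconds.
    This is a simple rectangle-rule integration sketch."""
    return sum(rate * dt for rate in rates_dps)

# Ten samples at a steady 90 dps, 0.1 s apart ≈ 90 degrees of rotation:
print(round(integrate_gyro_rate([90.0] * 10, 0.1), 3))  # → 90.0
```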

Diagram illustrating gyro rate sensing with axes labeled for orientation and rotation measurements.

In this example, the Drivetrain will turn to the right and print its current X-axis gyro rate while turning.

Graph showing gyro rate example with axes labeled for rate of rotation in degrees per second.

Orientation of#

The Orientation of block is used to report the orientation angle of the inertial sensor.

Choose which Gyro/Inertial Sensor to use.

Image of an orientation device used for various sensing applications in robotics and automation.

Choose which orientation to use:

  • roll - The Y-axis represents roll, which reports a value between -180 to +180 degrees.

  • pitch - The X-axis represents pitch, which reports a value between -90 to +90 degrees.

  • yaw - The Z-axis represents yaw, which reports a value between -180 to +180 degrees.

In this example, the Drivetrain will turn to the right and print its current roll as it turns.

Encoder Sensing#

Set Shaft Encoder Position#

The Set Shaft Encoder Position block is used to set the Shaft Encoder’s position to the given value.

Shaft encoder block diagram illustrating position and velocity sensing for robotic applications.

Choose which Shaft Encoder to use.

Image of a shaft encoder device used for measuring rotational position and velocity in robotics.

In this example, the Shaft Encoder will print its starting position, set its position to 90 degrees, and then print the new position.

Shaft encoder example showing position and velocity readings in a robotics context with visual indicators.

Shaft Encoder Position#

The Shaft Encoder Position block is used to report the distance the Shaft Encoder has rotated.

Shaft encoder position block displaying current position and velocity of the shaft encoder in a robotics context.

Choose which Shaft Encoder to use.

Image of a shaft encoder position device used for measuring rotational position in robotics applications.

Choose which unit to report in: degrees or turns.

Shaft encoder position units diagram illustrating how to set and read encoder position values.

In this example, the Shaft Encoder will print its starting position, set its position to 90 degrees, and then print the new position.

Shaft encoder example showing position and velocity readings in a robotics context with visual indicators.

Shaft Encoder Velocity#

The Shaft Encoder Velocity block is used to report the current velocity of a Shaft Encoder.

Shaft encoder velocity block diagram showing the relationship between encoder position and velocity.

Choose which Shaft Encoder to use.

Shaft encoder velocity device displaying current velocity readings in degrees per second or rotations per minute.

Choose which unit to report in: degrees per second (dps) or rotations per minute (rpm).

Shaft encoder velocity units illustration showing degrees per second (dps) and rotations per minute (rpm) measurements.

Distance Sensing#

Object Distance#

The Object Distance block is used to report the distance of the nearest object from the Distance Sensor.

Image showing an object distance sensor block used to report the distance to the nearest object in millimeters or inches.

The Object Distance block reports a range from 20mm to 2000mm.

Choose which Distance Sensor to use.

Image of a distance sensing device displaying object distance and related metrics in a robotics context.

Choose what units to report in: millimeters (mm) or inches.
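The same reading can be expressed in either unit (1 inch = 25.4 mm exactly). A quick conversion of the documented range in plain Python:

```python
MM_PER_INCH = 25.4  # exact by definition

def mm_to_inches(mm):
    """Convert a distance reading from millimeters to inches."""
    return mm / MM_PER_INCH

# The documented 20 mm - 2000 mm range, expressed in inches:
print(round(mm_to_inches(20), 2), round(mm_to_inches(2000), 2))  # → 0.79 78.74
```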

Image illustrating various object distance sensing units and measurements for robotics applications.

In this example, the Distance Sensor will report the current distance between it and the closest object.

Diagram illustrating the distance sensing capabilities of a sensor detecting object distance and size.

Object Velocity#

The Object Velocity block is used to report the current velocity of an object in meters per second (m/s).

Image showing the Object Velocity block used to report the current velocity of an object in meters per second.

Choose which Distance Sensor to use.

Image of an object velocity device used for measuring the speed of objects in motion.

In this example, the Distance Sensor will report the current velocity of an object moving in front of it.

Graph illustrating object velocity measurement with distance sensor, showing velocity vs. time data points.

Object Size Is#

The Object Size Is block is used to report if the Distance Sensor detects the specified object size.

Diagram illustrating the object size detection capabilities of a distance sensor, showing various size categories.

The Distance Sensor determines the size of the object detected (none, small, medium, large) based on the amount of light reflected and returned to the sensor.
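The idea of grading size from reflected light can be sketched as a simple threshold classifier. The 0-100 scale and the thresholds below are hypothetical, chosen only to illustrate the approach; they are not the sensor's actual calibration:

```python
def classify_object_size(reflected_light):
    """Illustrative classifier: more returned light suggests a larger
    (or closer) object. The 0-100 scale and these cutoffs are
    hypothetical examples, not the Distance Sensor's real values."""
    if reflected_light <= 5:
        return "none"
    if reflected_light <= 35:
        return "small"
    if reflected_light <= 70:
        return "medium"
    return "large"

print(classify_object_size(50))  # → medium
```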

The Object Size Is block reports True when the Distance Sensor detects the specified size.

The Object Size Is block reports False when the Distance Sensor doesn’t detect the specified size.

Choose which Distance Sensor to use.

Diagram illustrating object size detection by a distance sensor, showing small, medium, and large object indicators.

Choose which object size you want the Distance Sensor to check for:

  • small

  • medium

  • large

Diagram illustrating the relationship between object size and distance detected by a sensor.

In this example, if the Distance Sensor detects a small object, the Drivetrain will drive forward until the object is detected as large.

Image showing a graphical representation of object size detected by a distance sensor in a robotics context.

Distance Sensor Found Object#

The Distance Sensor Found Object block is used to report if the Distance Sensor sees an object within its field of view.

Diagram illustrating distance sensing capabilities of an object detection system, including various sensor outputs.

The Distance Sensor Found Object block reports True when the Distance Sensor sees an object or surface within its field of view.

The Distance Sensor Found Object block reports False when the Distance Sensor does not detect an object or surface.

Choose which Distance Sensor to use.

Diagram illustrating distance detection capabilities of an object detection sensor device.

In this example, when the Distance Sensor detects an object, it will print a message to the Brain.

Diagram illustrating distance sensing with an object detection example using sensors and a brain controller.

Optical Sensing#

Set Optical Mode#

The Set Optical Mode block is used to set an Optical Sensor to either detect colors or gestures.

Diagram illustrating the Set Optical Mode block for configuring optical sensor detection settings.

By default, an Optical Sensor is set to always detect colors. Before using any Optical Sensor gesture blocks, the Optical Sensor must be set to detect gestures.

Choose which Optical Sensor to use.

Image showing a diagram for setting the optical mode of a sensor in a robotics context.

Choose whether you want to set the mode of the Optical Sensor to either detect colors or gestures.

Diagram illustrating the process of setting the optical mode for a sensor to detect colors or gestures.

In this example, the Optical Sensor is set to detect gestures before waiting until a left gesture is detected to print a message.

Diagram illustrating the use of the Set Optical Mode block for configuring an Optical Sensor to detect colors or gestures.

Set Optical Light#

The Set Optical Light block is used to turn the light on the Optical Sensor on or off. The light helps the Optical Sensor see an object when it is looking at that object in a dark area.

Diagram illustrating the optical light block settings for the EXP Brain's optical sensor system.

Choose which Optical Sensor to use.

Illustration of an optical light device used in sensing applications, showcasing its features and settings.

Choose whether to turn the light on or off.

Image showing the setting of optical light mode for an optical sensor in a robotics context.

In this example, the Optical Sensor will turn its light on for two seconds before turning it off.

Diagram illustrating the settings and functions of the optical light sensor in a robotics control system.

Set Optical Light Power#

The Set Optical Light Power block is used to set the light power of the Optical Sensor.

Illustration of an optical light power block for setting light power in optical sensing applications.

The Set Optical Light Power block accepts a range of 0% to 100%. This changes the brightness of the light on the Optical Sensor. If the light is off, this block will turn the light on.
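Since the block only accepts values from 0% to 100%, out-of-range requests are effectively clamped to that range. A quick Python sketch of that clamping (hypothetical helper, not the VEXcode API):

```python
def set_light_power(percent):
    """Clamp a requested light power to the block's accepted 0-100% range."""
    return max(0, min(100, percent))
```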

Choose which Optical Sensor to use.

Optical light power device interface for setting optical light power levels in sensing applications.

In this example, the Optical Sensor’s power light is set to 75% before it waits to detect an object to print a message.

Diagram illustrating the use of optical light power settings for a sensor in a robotics context.

Optical Sensor Found Object#

The Optical Sensor Found Object block is used to report if the Optical Sensor detects an object close to it.

Diagram illustrating optical sensor functions, including object detection, color detection, and gesture recognition.

The Optical Sensor Found Object block reports True when the Optical Sensor detects an object close to it.

The Optical Sensor Found Object block reports False when an object is not within range of the Optical Sensor.

Choose which Optical Sensor to use.

Diagram of an optical sensor device detecting objects and colors in a robotic sensing context.

In this example, the Optical Sensor’s power light is set to 75% before it waits to detect an object to print a message.

Diagram illustrating the use of optical light power settings for a sensor in a robotics context.

Optical Sensor Detects Color#

The Optical Sensor Detects Color block is used to report if the Optical Sensor detects the specified color.

Optical sensor detecting color with various color blocks and brightness levels displayed on a screen.

The Optical Sensor Detects Color block reports True when the Optical Sensor detects the specified color.

The Optical Sensor Detects Color block reports False when the Optical Sensor doesn’t detect the specified color.

Choose which Optical Sensor to use.

Image of an optical sensor detecting color, with indicators for brightness, hue, and gesture detection.

Choose which color the Optical Sensor will check for.

Optical sensor detecting color with various hues and brightness levels in a robotics context.

In this example, the Optical Sensor will wait until it detects a blue object before printing a message.

Optical sensor detecting color with various hues and brightness levels displayed on a screen.

Optical Brightness#

The Optical Brightness block is used to report the amount of light detected by the Optical Sensor.

Diagram illustrating the Optical Brightness block for reporting light levels detected by an Optical Sensor.

The Optical Brightness block reports a number value from 0% to 100%.

A large amount of light detected will report a high brightness value.

A small amount of light detected will report a low brightness value.

Choose which Optical Sensor to use.

Optical brightness device displaying light intensity measured by the optical sensor in a robotics context.

In this example, the Optical Sensor will print the current brightness value to the Brain’s screen.

Graph illustrating optical brightness values detected by a sensor, showing a range from low to high brightness levels.

Optical Hue#

The Optical Hue block is used to report the hue of the color of an object.

Optical hue block displaying color hue values from an optical sensor, ranging from 0 to 359 degrees.

The Optical Hue block reports a number value that is the hue of the color of an object. It returns a number between 0 and 359.

The value can be thought of as the location of the color on a color wheel in degrees.
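Because the hue is a position on the color wheel, it can be mapped to an approximate color name. A minimal Python sketch (the band boundaries here are illustrative, not the sensor's exact thresholds):

```python
def hue_to_color(hue):
    """Map a hue (0-359 degrees on the color wheel) to an approximate color name."""
    hue %= 360
    if hue < 30 or hue >= 330:
        return "red"
    if hue < 90:
        return "yellow"
    if hue < 150:
        return "green"
    if hue < 210:
        return "cyan"
    if hue < 270:
        return "blue"
    return "magenta"
```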

Choose which Optical Sensor to use.

Optical hue device displaying color detection data and sensor readings in a robotics context.

In this example, the Optical Sensor will print the currently seen hue to the Brain’s screen.

Optical sensor displaying the hue value of an object, represented on a color wheel from 0 to 359 degrees.

Optical Sensor Detects Gesture#

The Optical Sensor Detects Gesture block is used to report whether an Optical Sensor has detected the specified gesture.

Illustration of an optical sensor detecting gestures for robotic applications, highlighting gesture recognition features.

Important: The Optical Sensor must first be set to detect gestures using the Set Optical Mode block, otherwise it will not detect any gestures.

The Optical Sensor Detects Gesture block reports True when the Optical Sensor detects the specified gesture.

The Optical Sensor Detects Gesture block reports False when the Optical Sensor doesn’t detect the specified gesture.

Choose which Optical Sensor to use.

Optical gesture detection device displaying various gesture recognition features and settings for sensor configuration.

Choose which gesture the Optical Sensor will check for.

Optical sensor detecting a gesture, used for gesture recognition in robotics applications.

In this example, the Optical Sensor is set to detect gestures before waiting until a left gesture is detected to print a message.

Diagram illustrating the use of the Set Optical Mode block for configuring an Optical Sensor to detect colors or gestures.

Rotation Sensing#

Set Rotation Sensor Position#

The Set Rotation Sensor Position block is used to set a Rotation Sensor’s current position to a defined value.

Rotation sensor position block diagram illustrating the setup and functionality of a rotation sensor in robotics.

The Set Rotation Sensor Position block accepts any positive or negative decimal or integer number.

Choose which Rotation Sensor to use.

Rotation sensor position device displaying angle and position metrics for rotational sensing applications.

In this example, the Rotation Sensor will print its starting position, set its position to -100 degrees, and then print the new position.

Illustration showing the positioning of a rotation sensor with angle and velocity settings for robotic applications.

Rotation Sensor Angle#

The Rotation Sensor Angle block is used to report the Rotation Sensor’s current angle of rotation in degrees.

Rotation sensor angle block displaying current angle of rotation in degrees for robotic applications.

The Rotation Sensor Angle block reports values in the range of 0.00 to 359.99.
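Unlike the unbounded position, the angle wraps around every full rotation. A short Python sketch of that wrapping (hypothetical helper, not the VEXcode API):

```python
def position_to_angle(position_deg):
    """Wrap an unbounded rotational position into the 0 to 359.99 angle range."""
    return position_deg % 360

# e.g. a position of 725 degrees corresponds to an angle of 5 degrees
```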

Choose which Rotation Sensor to use.

Rotation sensor device displaying angle and position data for sensing applications in robotics.

In this example, the Rotation Sensor will print its starting rotation.

Image showing an example of a rotation sensor angle measurement in a robotics context.

Rotation Sensor Position#

The Rotation Sensor Position block is used to report the current rotational position of the selected Rotation Sensor.

Rotation sensor position block diagram illustrating the setup and functionality of a rotation sensor in robotics.

Choose which Rotation Sensor to use.

Rotation sensor position device displaying angle and position metrics for rotational sensing applications.

Choose what units the position will be reported in: degrees or turns.

Rotation sensor position units diagram illustrating angle and velocity measurements for rotation sensing.
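Degrees and turns relate by a factor of 360. A quick sketch of the conversion (hypothetical helper, not the VEXcode API):

```python
def position_in_units(position_deg, units):
    """Convert a position in degrees to the requested reporting unit."""
    if units == "turns":
        return position_deg / 360.0
    return position_deg

# 540 degrees is 1.5 turns
```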

In this example, the Rotation Sensor will print its starting position, set its position to -100 degrees, and then print the new position.

Illustration showing the positioning of a rotation sensor with angle and velocity settings for robotic applications.

Rotation Sensor Velocity#

The Rotation Sensor Velocity block is used to report the current velocity of a Rotation Sensor.

Image of a rotation sensor velocity block used in robotics for measuring rotational speed and position.

Choose which Rotation Sensor to use.

Rotation sensor and velocity device for monitoring motor and drivetrain performance in robotics applications.

Choose what units the velocity will be reported in: revolutions per minute (rpm) or degrees per second (dps).

Image showing rotation sensor with velocity units for reporting angle and position in degrees or rpm.
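The two units relate by a factor of 6, since one revolution is 360 degrees and one minute is 60 seconds. A quick sketch of the conversion (hypothetical helpers, not the VEXcode API):

```python
def rpm_to_dps(rpm):
    """Convert revolutions per minute to degrees per second (1 rpm = 6 dps)."""
    return rpm * 6.0

def dps_to_rpm(dps):
    """Convert degrees per second to revolutions per minute."""
    return dps / 6.0
```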

In this example, the Drivetrain will turn to the right for 1 second before its current rotational velocity is printed on the Brain’s screen.

Illustration of a rotation sensor displaying velocity measurements and sensor data in a robotics context.

Vision Sensing#

Take Vision Sensor Snapshot#

The Take Vision Sensor Snapshot block is used to take a snapshot from the Vision Sensor.

Image of a vision sensor snapshot block used for capturing and analyzing images in robotics applications.

The Take Vision Sensor Snapshot block will capture the current image from the Vision Sensor to be processed and analyzed for color signatures and codes.

A snapshot is required first before using any other Vision Sensor blocks.

Choose which Vision Sensor to use.

Vision sensor snapshot device interface displaying various sensing parameters and configurations for robotics applications.

Select which vision signature to use. Vision signatures are configured from the Devices window.

Vision sensor snapshot showing detected objects and their properties for analysis in robotics applications.

Set Vision Sensor Object Item#

The Set Vision Sensor Object Item block is used to select which detected object, out of the total number of objects detected, to report information about.

Image of a vision sensor object block used in robotics for detecting and processing visual data.

Choose which Vision Sensor to use.

Diagram illustrating various sensing capabilities and parameters for robotic control systems.

Vision Sensor Object Count#

The Vision Sensor Object Count block is used to report how many objects the Vision Sensor detects.

Vision sensor object count block displaying the count of detected objects in a robotics programming interface.

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Count block reports a number of objects.

The Vision Sensor Object Count block will only detect the number of objects from the last snapshot signature.

Choose which Vision Sensor to use.

Vision sensor displaying object count and detection status in a robotics context.

Vision Sensor Object Exists?#

The Vision Sensor Object Exists? block is used to report if the Vision Sensor detects a configured object.

Vision sensor block displaying object detection status and configuration options in a robotics programming interface.

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Exists? block can detect any configured objects.

The Vision Sensor Object Exists? block reports True when the Vision Sensor detects a configured object.

The Vision Sensor Object Exists? block reports False when the Vision Sensor does not detect a configured object.

Choose which Vision Sensor to use.

Vision sensor displaying object detection status with graphical interface elements and numerical values.

Vision Sensor Object#

The Vision Sensor Object block is used to report information about a detected object from the Vision Sensor.

Image of a vision sensor object block used in robotics for detecting and processing visual data.

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object block can report information about detected objects.

Choose which Vision Sensor to use.

Diagram illustrating various sensing capabilities and parameters for robotic control systems.

Choose which property to report from the Vision Sensor:

  • width - How wide the object is in pixels, from 2 - 316 pixels.

  • height - How tall the object is in pixels, from 2 - 212 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 315 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 211 pixels.

  • angle - The angle of the detected object, from 0 - 180 degrees.

Diagram illustrating the properties and functions of a vision sensor in robotics sensing systems.
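These properties are often used to steer toward a detected object. A minimal Python sketch that compares centerX against the image's horizontal midpoint (the helper name and logic are illustrative, not part of the VEXcode API):

```python
def steer_direction(center_x, image_width=316):
    """Decide which way to turn toward an object from its centerX value.

    The Vision Sensor image is roughly 316 pixels wide, so the
    horizontal midpoint is at about 158.
    """
    midpoint = image_width / 2
    if center_x < midpoint:
        return "left"
    if center_x > midpoint:
        return "right"
    return "straight"
```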

AI Vision Sensing#

Take AI Vision Snapshot#

The Take AI Vision Snapshot block is used to capture the current image from the AI Vision Sensor to be processed and analyzed for Visual Signatures.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

Flowchart illustrating the functions of various sensing and control blocks in the EXP Brain system.

A snapshot is required first before using any other AI Vision Sensor blocks.

Choose which AI Vision Sensor to use.

Image of a device interface displaying various sensor and motor status metrics for robotics control.

Select what Visual Signature the AI Vision Sensor should take a snapshot of.

  • AprilTags.

  • AI Classifications.

  • A configured Color Signature or Color Code.

AI Vision snapshot showing detected objects and their classifications with visual signatures and coordinates.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before it will print the CenterX coordinate of the largest object (indexed at 1) to the Brain’s screen.

AI Vision sensor snapshot example showing detected objects and their properties.

AI Classification Is#

The AI Classification Is block is used to report if the specified AI Classification has been detected.

Diagram illustrating the AI Vision Classification process with various sensing blocks and outputs.

The Take AI Vision Snapshot block is required first for AI Classifications before using the AI Classification Is block.

The AI Classification Is block reports True when the AI Vision Sensor has detected the specified AI Classification.

The AI Classification Is block reports False when the AI Vision Sensor has not detected the specified AI Classification.

Choose which AI Vision Sensor to use.

AI Vision Sensor classification result display showing detected object types and counts.

Choose which AI Classification to detect.

  • BlueBall

  • GreenBall

  • RedBall

  • BlueRing

  • GreenRing

  • RedRing

  • BlueCube

  • GreenCube

  • RedCube

AI Vision classification output displaying detected object types and their identifiers in a robotics context.

In this example, the AI Vision Sensor will take a snapshot of all AI Classifications before checking if a Blue Ball was detected or not. If a Blue Ball was detected, it will print a message to the Print Console.

AI Vision classification example showing detected object with labels for identification and analysis.
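The logic of that example can be sketched in Python, treating a snapshot as a list of detected classification names (the names and helper are hypothetical stand-ins, not the VEXcode API):

```python
def classification_detected(detections, target):
    """Report whether the target AI Classification appears in a snapshot's detections."""
    return target in detections

# e.g. a snapshot that detected a RedBall, a BlueRing, and a GreenCube:
snapshot = ["RedBall", "BlueRing", "GreenCube"]
```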

Detected AprilTag Is#

The Detected AprilTag Is block is used to report if the specified AprilTag is detected.

Diagram illustrating the functionality of AprilTag detection in AI vision sensing systems.

The Take AI Vision Snapshot block is required first for AprilTags before using the Detected AprilTag Is block.

The Detected AprilTag Is block reports True when the AI Vision Sensor has detected the specified AprilTag.

The Detected AprilTag Is block reports False when the AI Vision Sensor has not detected the specified AprilTag.

Choose which AI Vision Sensor to use.

AI Vision Sensor displaying an AprilTag detection interface with various sensor readings and status indicators.

In this example, the AI Vision Sensor will take a snapshot of all AprilTags before checking if the AprilTag with the ID “3” was detected. If that specific AprilTag was detected, it will print a message to the Print Console.

Image of an AI Vision sensor displaying an AprilTag detection example in a robotics context.

Set AI Vision Sensor Object Item#

The Set AI Vision Sensor Object Item block is used to select which detected object, out of the objects detected, to report information about. By default, the Object Item is set to 1 at the start of a project.

Diagram of the EXP Brain's reset timer block, illustrating how to reset the timer back to zero seconds.

The Take AI Vision Snapshot block is required first before the Set AI Vision Sensor Object Item block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities and controls for a robotic system, including motor and vision sensors.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then, if an object was detected, it will print the CenterX coordinate of the largest object (indexed at 1) to the Brain’s screen.

AI Vision sensor snapshot example showing detected objects and their properties.

AI Vision Sensor Object Count#

The AI Vision Sensor Object Count block is used to report how many objects the AI Vision Sensor detects that match the specified Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

AI Vision sensor block diagram illustrating object counting and detection features in a robotics context.

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Count block can be used.

Choose which AI Vision Sensor to use.

AI Vision Sensor object count display interface showing detected objects and their details in a graphical format.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before printing how many objects were detected.

AI Vision Sensor object count example displaying detected objects and their properties on the Brain's screen.

AI Vision Sensor Object Exists?#

The AI Vision Sensor Object Exists? block is used to report if the AI Vision Sensor detects a Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

AI Vision Sensing block diagram illustrating various sensor functionalities and data reporting for object detection.

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Exists? block can be used.

The AI Vision Sensor Object Exists? block reports True when the AI Vision Sensor has detected an object.

The AI Vision Sensor Object Exists? block reports False when the AI Vision Sensor has not detected an object.

Choose which AI Vision Sensor to use.

AI Vision Sensor interface showing object detection status and parameters including object count and existence.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before printing how many objects were detected.

AI Vision Sensor object count example displaying detected objects and their properties on the Brain's screen.

AI Vision Sensor Object#

The AI Vision Sensor Object block is used to report information about a specified Visual Signature from the AI Vision Sensor.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

Diagram illustrating the EXP Brain's reset timer block and its functionality in reporting timer values.

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities of a robotic system, including motor and controller feedback.

Choose which property to report from the AI Vision Sensor:

  • width - How wide the object is in pixels, from 0 - 320 pixels.

  • height - How tall the object is in pixels, from 0 - 240 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 320 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 240 pixels.

  • originX - The X coordinate of the object’s top-left corner, from 0 - 320 pixels.

  • originY - The Y coordinate of the object’s top-left corner, from 0 - 240 pixels.

  • angle - The angle of the detected Color Code only, from 0 - 360 degrees.

  • tagID - The detected AprilTag’s identification number.

  • score - The confidence score (up to 100%) for AI Classifications. This score indicates how confident the model is in the detected AI Classification. A higher score indicates greater confidence in the accuracy of the AI Classification.

Diagram illustrating various sensing blocks and properties for robotic control, including motor and controller functions.
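The centerX/centerY and originX/originY properties are related through the object's width and height. A short Python sketch of that relationship (hypothetical helper, not the VEXcode API):

```python
def center_from_origin(origin_x, origin_y, width, height):
    """Derive an object's center coordinates from its top-left origin and size."""
    return (origin_x + width / 2, origin_y + height / 2)

# An object with origin (100, 50) that is 40 wide and 20 tall
# has its center at (120, 60).
```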

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before it will print the CenterX coordinate of the largest object (indexed at 1) to the Brain’s screen.

AI Vision sensor snapshot example showing detected objects and their properties.

Arm Sensing#

Can 6-Axis Arm Move to Position#

The Can 6-Axis Arm Move to Position block is used to report if the 6-Axis Robotic Arm is able to reach the specified position.

Image illustrating the arm's reach to block an object in a robotics context, showcasing sensor data interactions.

The Can 6-Axis Arm Move to Position block reports True when the 6-Axis Arm can reach that position.

The Can 6-Axis Arm Move to Position block reports False when the 6-Axis Arm can not reach that position.

Choose which 6-Axis Arm to use.

Diagram illustrating the functionality of the EXP Brain's timer reset and value reporting features.

Select which unit to use: millimeters (mm) or inches.

Diagram illustrating the arm's reach to various units in a robotic control system.

In this example, the 6-Axis Arm will check if it can move to (0, 0, 0) and print that it can not reach the position.

Image showing the Brain Sensing Reset Timer block in a robotics programming context, illustrating timer reset functionality.

Can 6-Axis Arm Increment Move to Position#

The Can 6-Axis Arm Increment Move to Position block is used to report if the 6-Axis Robotic Arm is able to incrementally move for that distance.

Image of the EXP Brain Reset Timer block used to reset the timer back to 0 seconds in programming context.

The Can 6-Axis Arm Increment Move to Position block reports True when the 6-Axis Arm can incrementally move for that distance.

The Can 6-Axis Arm Increment Move to Position block reports False when the 6-Axis Arm can not incrementally move for that distance.

Choose which 6-Axis Arm to use.

Diagram illustrating the VEX EXP Brain's timer reset functionality and button press detection.

Select which unit to use: millimeters (mm) or inches.

Diagram of various sensing units and their functionalities for robotics control and monitoring systems.

In this example, the 6-Axis Arm will check if it can increment move for 500 millimeters on the Y axis and print that it can’t move for that distance.

Diagram illustrating various sensing blocks for a robotics controller, including timer, battery, and motor status.

Can 6-Axis Arm End Effector Move to Orientation#

The Can 6-Axis Arm End Effector Move to Orientation block is used to report if the 6-Axis Arm’s End Effector can rotate about an axis to a specific orientation.

Diagram illustrating the connection between the 6-axis arm and various sensing and control blocks in a robotics system.

The Can 6-Axis Arm End Effector Move to Orientation block reports True when the 6-Axis Arm can rotate about an axis to a specific orientation.

The Can 6-Axis Arm End Effector Move to Orientation block reports False when the 6-Axis Arm can not rotate about an axis to a specific orientation.

Choose which 6-Axis Arm to use.

Diagram illustrating the connections and functions of the 6-axis robotic arm and its sensors.

Select which axis to use:

  • pitch - Movement around the Y-axis.

  • roll - Movement around the X-axis.

  • yaw - Movement around the Z-axis.

Diagram showing the connections between various sensing components and the EXP Brain in a robotic system.

In this example, the 6-Axis Arm will check if the End Effector can point toward the 40 degree position on the X axis and print whether or not it can.

Flowchart illustrating the functions and blocks of the EXP Brain for various sensing capabilities and motor controls.

Can 6-Axis Arm End Effector Incrementally Move to Orientation#

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block is used to report if the 6-Axis Arm’s End Effector can incrementally rotate its orientation about an axis by a specific amount of degrees.

Diagram of the VEX EXP Brain's Reset Timer block, illustrating its function to reset the timer to zero seconds.

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block reports True when the 6-Axis Arm can incrementally rotate about an axis for a specific amount of degrees.

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block reports False when the 6-Axis Arm can not incrementally rotate about an axis for a specific amount of degrees.

Choose which 6-Axis Arm to use.

Diagram illustrating the functionalities of a 6-axis robotic arm with various sensing capabilities and motor controls.

Select which axis to use:

  • pitch - Rotation around the Y-axis.

  • roll - Rotation around the X-axis.

  • yaw - Rotation around the Z-axis.

Diagram of a 6-axis robotic arm with labeled components and axes for movement and orientation.

In this example, the 6-Axis Arm will check if the End Effector can increment move for 20 degrees on the Z axis and print whether or not it can.

Flowchart illustrating the VEX EXP Brain's timer reset process and button press detection.

6-Axis Arm is Done?#

The 6-Axis Arm is Done? block is used to report if the 6-Axis Arm has completed moving.

Image showing the status of a 6-axis robotic arm indicating that the arm has completed its movement.

The 6-Axis Arm is Done? block reports True when the 6-Axis Arm is not moving.

The 6-Axis Arm is Done? block reports False when the 6-Axis Arm is moving.

Choose which 6-Axis Arm to use.

Image of a 6-axis robotic arm indicating that it has completed its movement, showing its position and status.

In this example, the Arm will move to the position (-100, 200, 100) and print its Y coordinate in mm every .25 seconds as it moves, until it is done moving.

Image of a 6-axis robotic arm indicating its movement status with a "motor is done" message displayed on the screen.
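The polling pattern in that example can be sketched in Python, treating successive 6-Axis Arm is Done? readings as a list (a stand-in simulation, not the VEXcode API):

```python
def wait_until_done(is_done_readings):
    """Count how many poll cycles pass before the arm reports it is done.

    is_done_readings is a hypothetical stand-in for successive
    '6-Axis Arm is Done?' readings taken every 0.25 seconds.
    """
    waits = 0
    for done in is_done_readings:
        if done:
            break
        waits += 1  # stand-in for printing the Y coordinate and waiting
    return waits
```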

6-Axis Arm Position#

The 6-Axis Arm Position block is used to report the current position of the 6-Axis Arm in the specified axis.

Illustration of the arm position block for controlling a 6-axis robotic arm in a programming context.

Choose which 6-Axis Arm to use.

Diagram illustrating the position of a 6-axis robotic arm with labeled axes and joint angles.

Choose which axis to report.

Diagram illustrating the position and axis of a 6-axis robotic arm in a sensing context.

Choose which unit to report with: millimeters (mm) or inches.

Diagram illustrating various arm position units for robotic control and sensing applications.

In this example, the 6-Axis Arm will print its current Z axis position in millimeters to the Print Console.

Illustration of arm position for a 6-axis robotic arm in a sensing context.

6-Axis Arm End Effector Orientation#

The 6-Axis Arm End Effector Orientation block is used to report the current orientation of the 6-Axis Arm’s End Effector.

Diagram illustrating various sensing blocks for the VEX EXP Brain, including timer, battery, controller, and motor sensing.

Choose which 6-Axis Arm to use.

Diagram illustrating arm orientation for a 6-axis robotic arm in a sensing context.

Choose which axis to report:

  • pitch - Rotation around the Y-axis.

  • roll - Rotation around the X-axis.

  • yaw - Rotation around the Z-axis.

Diagram illustrating arm orientation axes for robotic control and movement analysis.

In this example, the 6-Axis Arm will print the End Effector’s current Y axis orientation in degrees to the Print Console.

Image demonstrating arm orientation for robotic control in a VEX EXP Brain project.

Line Tracking Sensing#

Line Tracker Reflectivity#

The Line Tracker Reflectivity block is used to report the amount of light reflected using the Line Tracker Sensor.

Line tracker sensor displaying reflectivity measurement on a screen, used for detecting light levels.

Choose which Line Tracker Sensor to use.

Line tracker reflectivity device displaying light reflectivity levels for sensor data analysis.

In this example, the Line Tracker Sensor will print the current detected reflectivity to the Brain’s Screen.

Line tracker sensor output displaying reflectivity values in a robotics context.

Light Sensing#

Light Sensor Brightness#

The Light Sensor Brightness block is used to report the amount of light detected by the Light Sensor.

Light sensor brightness block displaying the amount of light detected by the sensor on a digital interface.

Choose which Light Sensor to use.

Light sensor device displaying brightness measurement in a robotics context.

In this example, the Light Sensor will print the current detected brightness to the Brain’s Screen.

Light sensor example showing brightness detection with graphical representation of light levels.

Potentiometer Sensing#

Potentiometer Angle#

The Potentiometer Angle block is used to report the angular position of the Potentiometer.

Image of a potentiometer angle block used for reporting angular position in degrees or percent.

Choose which Potentiometer to use.

Image of a potentiometer angle device used for measuring angular position in robotics and electronics applications.

Choose which unit to report in: percent (%) or degrees.

Image showing a potentiometer with angle measurement units for sensing applications.

In this example, the Potentiometer will print its current angular position to the Brain’s Screen.

Illustration of a potentiometer showing angle measurement for sensing applications in robotics.
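Both units report the same position on different scales, so converting between them is a simple proportion. The sketch below assumes a full mechanical sweep of 250 degrees purely for illustration; check your Potentiometer’s actual range. The helper names are not part of the VEX API:

```python
FULL_SWEEP_DEGREES = 250.0  # assumed sweep; varies by potentiometer model

def percent_to_degrees(percent: float) -> float:
    """Convert a 0-100 % position reading to degrees of rotation."""
    return percent / 100.0 * FULL_SWEEP_DEGREES

def degrees_to_percent(degrees: float) -> float:
    """Convert degrees of rotation back to a 0-100 % position."""
    return degrees / FULL_SWEEP_DEGREES * 100.0

print(percent_to_degrees(50))   # 125.0
print(degrees_to_percent(125))  # 50.0
```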

Accelerometer Sensing#

Accelerometer Acceleration#

The Accelerometer Acceleration block is used to report the acceleration value from one axis on the Analog Accelerometer.

Diagram of an accelerometer acceleration block used for sensing in robotics applications.

The Accelerometer Acceleration block reports a range from -2.0 G to 2.0 G or -6.0 G to 6.0 G depending on the jumper setting on the Analog Accelerometer.

Choose which Accelerometer to use.

Image of an accelerometer displaying acceleration values and settings for various axes and configurations.

In this example, the Drivetrain will drive forward for 1 second before the Accelerometer’s current acceleration is printed on the Brain’s screen.

Illustration of accelerometer acceleration example showing axis and measurement values in a robotics context.
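The jumper selects which of the two ranges the analog reading spans. As an illustration only (not the VEX API), scaling a normalized analog reading in [0, 1] to either range looks like this, with a mid-scale reading of 0.5 corresponding to 0 G:

```python
def to_g(reading: float, max_g: float = 2.0) -> float:
    """Map a normalized analog reading in [0, 1] to -max_g..+max_g G.

    max_g is 2.0 or 6.0 depending on the Accelerometer's jumper setting.
    A mid-scale reading of 0.5 corresponds to 0 G.
    """
    return (reading - 0.5) * 2.0 * max_g

print(to_g(0.5))             # 0.0 G
print(to_g(1.0, max_g=6.0))  # 6.0 G
print(to_g(0.0))             # -2.0 G
```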

Range Finder Sensing#

Range Finder Found Object?#

The Range Finder Found Object? block is used to report if the Ultrasonic Range Finder Sensor sees an object within its field of view.

Diagram illustrating the Range Finder's object detection capabilities and distance measurement functionality.

The Range Finder Found Object? block reports True when the Ultrasonic Range Finder Sensor sees an object or surface within its field of view.

The Range Finder Found Object? block reports False when the Ultrasonic Range Finder Sensor does not detect an object or surface.

Choose which Ultrasonic Range Finder Sensor to use.

Image of an ultrasonic range finder sensor detecting an object within its field of view.

In this example, every 0.25 seconds the Range Finder will check if it detects an object, and if so, will print the distance to the object to the Brain’s Screen.

Image showing a range finder detecting an object with distance measurement displayed on a screen.
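The polling pattern in the example above can be sketched in plain Python with a stand-in sensor. The MockRangeFinder class and its readings are purely illustrative, not the VEX API:

```python
import time

class MockRangeFinder:
    """Stand-in sensor for illustration; None means no object in view."""
    def __init__(self, readings):
        self._readings = list(readings)

    def read(self):
        """Return the distance in mm to the nearest object, or None."""
        return self._readings.pop(0) if self._readings else None

sensor = MockRangeFinder([420, None, 305])
detections = []

# Check for an object every 0.25 seconds; record the distance when found.
for _ in range(3):
    distance = sensor.read()
    if distance is not None:
        detections.append(distance)
        print(f"Object at {distance} mm")
    time.sleep(0.25)
```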

Range Finder Distance#

The Range Finder Distance block is used to report the distance of the nearest object from the Ultrasonic Range Finder Sensor.

Diagram illustrating the Range Finder Distance block used to measure object distance with an Ultrasonic sensor.

Choose which Ultrasonic Range Finder Sensor to use.

Image of a range finder distance device used for measuring object distance in robotics applications.

Choose which unit to report in: millimeters (mm) or inches.

Image showing distance measurement units for a range finder sensor, illustrating various distance reporting formats.

In this example, every 0.25 seconds the Range Finder will check if it detects an object, and if so, will print the distance to the object to the Brain’s Screen.

Image showing a range finder detecting an object with distance measurement displayed on a screen.
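Whichever unit the block reports in, converting between the two is a fixed ratio: 25.4 millimeters per inch. A quick helper in plain Python (the function names are illustrative, not the VEX API):

```python
MM_PER_INCH = 25.4

def mm_to_inches(mm: float) -> float:
    """Convert a distance in millimeters to inches."""
    return mm / MM_PER_INCH

def inches_to_mm(inches: float) -> float:
    """Convert a distance in inches to millimeters."""
    return inches * MM_PER_INCH

print(mm_to_inches(254))  # 10.0
print(inches_to_mm(10))   # 254.0
```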

Digital In Sensing#

Digital In#

The Digital In block is used to report if the Digital In signal is high.

Image of the Digital In block, used to report whether the digital input signal is high.

The 3-Wire ports function at a 5V logic signal voltage level.

The Digital In block reports True when the digital input signal is high.

The Digital In block reports False when the digital input signal is low.

Choose which Digital In device to use.

Diagram illustrating digital input and output functions of a robotic brain, including timer, button, and sensor readings.
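Conceptually, a digital input is a voltage compared against the port’s logic thresholds. As an illustration only (the exact switching points of the 3-Wire ports are not specified here), a 5V port might interpret the signal like this:

```python
HIGH_THRESHOLD_V = 2.0  # illustrative TTL-style threshold, not a VEX spec

def digital_read(voltage: float) -> bool:
    """Report True (high) when the input voltage is at or above threshold."""
    return voltage >= HIGH_THRESHOLD_V

print(digital_read(5.0))  # True  - signal is high
print(digital_read(0.0))  # False - signal is low
```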

Digital Out Sensing#

Digital Out#

The Digital Out block is used to set the logic level of a Digital Out 3-Wire port.

Image of the Digital Out block, used to set the logic level of a 3-Wire port on the EXP Brain.

The 3-Wire ports function at a 5V logic signal voltage level.

Choose which Digital Out port to use.

Illustration of digital output device used in robotics for controlling signals and device interactions.

Choose what to output: a low or high digital logic signal.

Diagram illustrating the Digital Out signal block in the VEX EXP Brain programming environment.
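Putting the two choices together, a Digital Out port behaves like a single writable bit: you pick the port, then drive it high or low. A minimal stand-in class in plain Python (illustrative only, not the VEX API):

```python
class MockDigitalOut:
    """Stand-in for a Digital Out 3-Wire port; stores one logic level."""
    def __init__(self):
        self.level = False  # the port starts low in this sketch

    def set(self, high: bool):
        """Drive the output high (True) or low (False)."""
        self.level = high

port = MockDigitalOut()
port.set(True)     # output a high logic signal
print(port.level)  # True
port.set(False)    # output a low logic signal
print(port.level)  # False
```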