Sensing#

Brain Sensing#

Reset Timer#

The Reset Timer block is used to reset the EXP Brain’s timer.

VEXcode blocks stack of code containing a reset timer block.#
  reset timer

The Brain’s timer begins counting at the start of each project. The Reset Timer block is used to reset the timer back to 0 seconds.

In this example, the Brain will wait 2 seconds, print the timer’s value, reset the timer, and then print the timer’s new value.

VEXcode blocks stack of code containing a when started block, a wait 2 seconds block, a print timer in seconds on Brain and set cursor to next row block, a reset timer block, and a second print timer in seconds on Brain and set cursor to next row block.#
  when started :: hat events
  wait (2) seconds
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row
  reset timer
  print (timer in seconds) on [Brain v]  ◀ and set cursor to next row

Timer Value#

The Timer Value block is used to report the value of the EXP Brain’s timer in seconds.

VEXcode blocks stack of code containing a timer in seconds block.#
  timer in seconds

The timer starts at 0 seconds when the project starts, and the block reports the timer’s value as a decimal.

In this example, the Brain will wait 2 seconds, print the timer’s value, reset the timer, and then print the timer’s new value.

VEXcode blocks stack of code containing a when started block, a wait 2 seconds block, a print timer in seconds on Brain and set cursor to next row block, a reset timer block, and a second print timer in seconds on Brain and set cursor to next row block.#
  when started :: hat events
  wait (2) seconds
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row
  reset timer
  print (timer in seconds) on [Brain v]  ◀ and set cursor to next row

Brain Button Pressed#

The Brain Button Pressed block is used to report if a button on the VEX EXP Brain is pressed.

    <Brain [Left v] button pressed?>

The Brain Button Pressed block reports True when the selected Brain button is pressed.

The Brain Button Pressed block reports False when the selected Brain button is not pressed.

Choose which Brain button to use on the EXP Brain.

Image of the VEX EXP Brain button pressed indicator showing the current button status on the display screen.

In this example, the Brain will print a message on its screen the first time the Right Brain Button is pressed.

    when started :: hat events
    [Don't print the message until the Right Brain Button is pressed.]
    wait until <Brain [Right v] button pressed?>
    print [Right Brain button was pressed.] on [Brain v]  ◀ and set cursor to next row

Cursor Column#

The Cursor Column block is used to report the column number of the EXP Brain’s screen cursor location.

VEXcode blocks stack of code containing a cursor column block.#
  cursor column

The Cursor Column block will report a value from 1-80 and will start on column 1 at the start of a project.

In this example, the Brain will move the cursor to (3, 7) and then print the current column (7) on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a set cursor to row 3 column 7 on Brain block, and a print cursor column on Brain and set cursor to next row block.#
  when started :: hat events
  set cursor to row (3) column (7) on Brain
  print (cursor column) on [Brain v] ◀ and set cursor to next row

Cursor Row#

The Cursor Row block is used to report the row number of the EXP Brain’s screen cursor location.

VEXcode blocks stack of code containing a cursor row block.#
  cursor row

The Cursor Row block will report a value from 1-9 and will start on row 1 at the start of a project.

In this example, the Brain will move the cursor to (3, 7) and then print the current row (3) on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a set cursor to row 3 column 7 on Brain block, and a print cursor row on Brain and set cursor to next row block.#
  when started :: hat events
  set cursor to row (3) column (7) on Brain
  print (cursor row) on [Brain v] ◀ and set cursor to next row

Battery Voltage#

The Battery Voltage block is used to report the voltage of the EXP Brain’s battery.

VEXcode blocks stack of code containing a battery voltage in volts block.#
  (battery voltage in volts)

The Battery Voltage block reports a range from 6 volts to 9 volts.

In this example, the Brain will print its current battery voltage on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, and a print battery voltage in volts on Brain and set cursor to next row block.#
  when started :: hat events
  print (battery voltage in volts) on [Brain v] ◀ and set cursor to next row

Battery Current#

The Battery Current block is used to report the current of the EXP Brain’s battery.

VEXcode blocks stack of code containing a battery current in amps block.#
  (battery current in amps)

The Battery Current block reports a range from 0.0 amps to 15.0 amps.

In this example, the Brain will print its current battery current on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, and a print battery current in amps on Brain and set cursor to next row block.#
  when started :: hat events
  print (battery current in amps) on [Brain v] ◀ and set cursor to next row

Battery Capacity#

The Battery Capacity block is used to report the charge level of the EXP Brain’s battery.

VEXcode blocks stack of code containing a battery capacity in % block.#
  (battery capacity in %)

The Battery Capacity block reports a range from 0% to 100%.

In this example, the Brain will print its current battery charge on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, and a print battery capacity in % on Brain and set cursor to next row block.#
  when started :: hat events
  print (battery capacity in %) on [Brain v] ◀ and set cursor to next row

Controller Sensing#

Controller Pressed#

The Controller Pressed block is used to report if a button on the EXP Controller is pressed.

VEXcode blocks stack of code containing a Controller Up pressed? block.#
<[Controller1 v] [Up v] pressed?>

The Controller Pressed block reports True when the selected Controller button is pressed.

The Controller Pressed block reports False when the selected Controller button is not pressed.

Choose which Controller button to use.

Image of a VEX EXP Controller button being pressed, indicating input for sensing actions in robotics.

In this example, the Brain will print a message on its screen the first time the X button on the Controller is pressed.

VEXcode blocks stack of code containing a when started block, a wait until Controller1 x pressed? block, and a print The X button was pressed. on Brain and set cursor to next row block.#
  when started :: hat events
  [Don't do anything until the X button is pressed.]
  wait until <[Controller1 v] [X v] pressed? :: #5cb0d6>
  print [The X button was pressed.] on [Brain v] ◀ and set cursor to next row

Position of Controller#

The Position of Controller block is used to report the position of a joystick on the EXP Controller along an axis.

VEXcode blocks stack of code containing a Controller 1 position block.#
  (Controller [1 v] position)

The Position of Controller block reports a range from -100 to 100.

The Position of Controller block reports 0 when the joystick axis is centered.

Choose the joystick’s axis.

Image showing the controller position axis for joystick input on the EXP Controller in a robotics project.

In this example, the Brain will print the position of the 3rd axis of the EXP Controller’s joysticks.

VEXcode blocks stack of code containing a when started block, and a print Controller 3 position on Brain and set cursor to next row block.#
  when started :: hat events
  print (Controller [3 v] position) on [Brain v] ◀ and set cursor to next row

Controller Enable/Disable#

The Controller Enable/Disable block is used to enable or disable the Controller actions configured in the Devices menu.

VEXcode blocks stack of code containing a Controller Disable block.#
  Controller [Disable v]

Choose to either enable or disable the configured Controller actions. By default, the Controller is Enabled in every project.

Image showing various blocks related to brain sensing, including timer reset, battery status, and controller inputs.

In this example, the Controller will be disabled at the start of the project and be re-enabled after the drivetrain has moved forward for 6 inches.

VEXcode blocks stack of code containing a when started block, a Controller Disable block, a drive forward for 6 inches block, and a Controller Enable block.#
  when started :: hat events
  Controller [Disable v]
  drive [forward v] for (6) [inches v] ▶
  Controller [Enable v]

Motor Sensing#

Motor is Done?#

The Motor is Done? block is used to report if the selected EXP Smart Motor or Motor Group has completed its movement.

VEXcode blocks stack of code containing a ClawMotor is done? block.#
  <[ClawMotor v] is done?>

The Motor is Done? block reports True when the selected Motor or Motor Group has completed its movement.

The Motor is Done? block reports False when the selected Motor or Motor Group is still moving.

Choose which Motor or Motor Group to use.

Image showing the "Motor is Done?" block indicating if the motor has completed its movement in the VEX EXP Brain interface.
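
For example (a sketch; the motor name ClawMotor and the expanded "and don't wait" option are assumptions), the Brain could print a message once a non-waiting movement completes:

  when started :: hat events
  [Start a movement without waiting for it to finish.]
  spin [ClawMotor v] [forward v] for (360) [degrees v] ◀ and don't wait
  [Don't print the message until the movement completes.]
  wait until <[ClawMotor v] is done?>
  print [ClawMotor movement is done.] on [Brain v] ◀ and set cursor to next row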

Motor is Spinning?#

The Motor is Spinning? block is used to report if the selected EXP Smart Motor or Motor Group is currently moving.

VEXcode blocks stack of code containing a ClawMotor is spinning? block.#
  <[ClawMotor v] is spinning?>

The Motor is Spinning? block reports True when the selected Motor or Motor Group is moving.

The Motor is Spinning? block reports False when the selected Motor or Motor Group is not moving.

Choose which Motor or Motor Group to use.

Image showing a motor status indicator with "Motor is Spinning?" text and related graphical elements.
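
For example (a sketch; the motor name ClawMotor is an assumption), the Brain could print a message only while the ClawMotor is spinning:

  when started :: hat events
  [Spin the ClawMotor forward.]
  spin [ClawMotor v] [forward v]
  [Print a message only if the ClawMotor is spinning.]
  if <[ClawMotor v] is spinning?> then
  print [ClawMotor is spinning.] on [Brain v] ◀ and set cursor to next row
  end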

Position of Motor#

The Position of Motor block is used to report the distance an EXP Smart Motor or the first motor of a Motor Group has traveled.

VEXcode blocks stack of code containing a Motor3 position in degrees block.#
  ([Motor3 v] position in [degrees v])

Choose which Motor or Motor Group to use.

Diagram illustrating motor position and various sensing parameters for robotic control systems.

Choose the units to report in, degrees or turns.

Diagram illustrating motor position units and their measurements in robotics.

In this example, the Motor will spin forward for 1 second before its current position is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 position in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current position after 1 second.]
  print ([Motor3 v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row

Velocity of Motor#

The Velocity of Motor block is used to report the current velocity of an EXP Smart Motor or the first motor of a Motor Group.

VEXcode blocks stack of code containing a Motor3 velocity in % block.#
  ([Motor3 v] velocity in [% v])

The Velocity of Motor block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose which Motor or Motor Group to use.

Diagram illustrating motor velocity parameters including current, power, torque, and efficiency for robotics applications.

Choose the units to report in, percent (%) or rpm.

Diagram illustrating motor velocity units for the EXP Brain, showing various motor performance metrics and measurements.

In this example, the Motor will spin forward for 1 second before its current velocity is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 velocity in % on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current velocity after 1 second.]
  print ([Motor3 v] velocity in [% v]) on [Brain v] ◀ and set cursor to next row

Current of Motor#

The Current of Motor block is used to report the amount of current an EXP Smart Motor or Motor Group is drawing in amperes (amps).

VEXcode blocks stack of code containing a Motor3 current in amps block.#
  ([Motor3 v] current in amps)

Choose which Motor or Motor Group to use.

Diagram illustrating motor current parameters including motor power, torque, and efficiency in robotics systems.

In this example, the Motor will spin forward for 1 second before its current is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 current in amps on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current after 1 second.]
  print ([Motor3 v] current in amps) on [Brain v] ◀ and set cursor to next row 

Power of Motor#

The Power of Motor block is used to report the amount of power output an EXP Smart Motor or the first motor of a Motor Group is currently generating.

VEXcode blocks stack of code containing a Motor3 power in watts block.#
  ([Motor3 v] power in watts)  

Choose which Motor or Motor Group to use.

Diagram illustrating motor power parameters including torque, efficiency, and temperature for robotic systems.

In this example, the Motor will spin forward for 1 second before its current power is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 power in watts on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current power after 1 second.]
  print ([Motor3 v] power in watts) on [Brain v] ◀ and set cursor to next row

Torque of Motor#

The Torque of Motor block is used to report the amount of torque (rotational force) an EXP Smart Motor or the first motor of a Motor Group is currently using.

VEXcode blocks stack of code containing a Motor3 torque in Nm block.#
  ([Motor3 v] torque in [Nm v])

The Torque of Motor block reports a range from 0.0 to 18.6 inch-pounds (InLB) or 0.0 to 2.1 Newton-meters (Nm).

Choose which Motor or Motor Group to use.

Diagram illustrating motor torque parameters including power, efficiency, and temperature for motor control systems.

Choose the units to report in, Nm or InLb.

Diagram illustrating motor torque units and their measurement scales for robotics applications.

In this example, the Motor will spin forward for 1 second before its current torque is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 torque in Nm on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current torque after 1 second.]
  print ([Motor3 v] torque in [Nm v]) on [Brain v] ◀ and set cursor to next row 

Efficiency of Motor#

The Efficiency of Motor block is used to report the efficiency of an EXP Smart Motor or the first motor of a Motor Group.

VEXcode blocks stack of code containing a Motor3 efficiency in % block.#
  ([Motor3 v] efficiency in %)

The Efficiency of Motor block reports a range from 0% to 100%, determined by the ratio of the power (in watts) the motor is providing as output to the power (in watts) the motor is drawing as input.

An EXP Smart Motor or Motor Group typically reaches a maximum efficiency of 65% under normal use cases.
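
For example, based on this definition, a motor drawing 8 watts of electrical power while providing 4 watts of output power would report an efficiency of 4 ÷ 8 × 100 = 50%.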

Choose which Motor or Motor Group to use.

Diagram illustrating motor efficiency metrics including power, torque, and temperature for motor performance analysis.

In this example, the Motor will spin forward for 1 second before its current efficiency is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 efficiency in % on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current efficiency after 1 second.]
  print ([Motor3 v] efficiency in %) on [Brain v] ◀ and set cursor to next row

Temperature of Motor#

The Temperature of Motor block is used to report the temperature of an EXP Smart Motor or the first motor of a Motor Group.

VEXcode blocks stack of code containing a Motor3 temperature in % block.#
  ([Motor3 v] temperature in % :: #5cb0d6)

The Temperature of Motor block reports a range from 0% to 100%.

Choose which Motor or Motor Group to use.

Image showing the temperature readings of a motor in a robotic system, indicating operational status and efficiency.

In this example, the Motor will spin forward for 1 second before its current temperature is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a spin Motor3 forward block, a wait 1 second block, and a print Motor3 temperature in % on Brain and set cursor to next row block.#
  when started :: hat events
  [Spin Motor3 forward for 1 second.]
  spin [Motor3 v] [forward v]
  wait (1) seconds
  [Print Motor3's current temperature after 1 second.]
  print ([Motor3 v] temperature in % :: #5cb0d6) on [Brain v] ◀ and set cursor to next row

Drivetrain Sensing#

Drive is Done?#

The Drive is Done? block is used to report if the Drivetrain has completed its movement.

VEXcode blocks stack of code containing a drive is done ? block.#
  <drive is done?>

The Drive is Done? block reports True when the Drivetrain’s motors have completed their movement.

The Drive is Done? block reports False when the Drivetrain’s motors are still moving.
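
For example (a sketch; a configured Drivetrain and the expanded "and don't wait" option are assumptions), the Brain could print a message once a non-waiting Drivetrain movement completes:

  when started :: hat events
  [Start a movement without waiting for it to finish.]
  drive [forward v] for (6) [inches v] ◀ and don't wait
  [Don't print the message until the movement completes.]
  wait until <drive is done?>
  print [Drivetrain movement is done.] on [Brain v] ◀ and set cursor to next row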

Drive is Moving?#

The Drive is Moving? block is used to report if the Drivetrain is currently moving.

VEXcode blocks stack of code containing a drive is moving ? block.#
  <drive is moving?>

The Drive is Moving? block reports True when the Drivetrain’s motors are moving.

The Drive is Moving? block reports False when the Drivetrain’s motors are not moving.
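
For example (a sketch assuming a configured Drivetrain), the Brain could print a message only while the Drivetrain is moving:

  when started :: hat events
  [Drive forward.]
  drive [forward v]
  [Print a message only if the Drivetrain is moving.]
  if <drive is moving?> then
  print [Drivetrain is moving.] on [Brain v] ◀ and set cursor to next row
  end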

Drive Heading#

The Drive Heading block is used to report the direction that the Drivetrain is facing by using the Inertial sensor’s current angular position.

VEXcode blocks stack of code containing a drive heading in degrees block.#
  (drive heading in degrees)

The Drive Heading block reports a range from 0.0 to 359.99 degrees.

In this example, the Drivetrain will turn to the right for 1 second before its current heading is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a turn right block, a wait 1 second block, and a print drive heading in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print Drivetrain's current heading after 1 second.]
  print (drive heading in degrees) on [Brain v] ◀ and set cursor to next row 

Drive Rotation#

The Drive Rotation block is used to report the Drivetrain’s angle of rotation.

VEXcode blocks stack of code containing a drive rotation in degrees block.#
  (drive rotation in degrees)

A clockwise direction is reported as a positive value, and a counterclockwise direction is reported as a negative value.

In this example, the Drivetrain will turn to the left for 1 second before its current rotation is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a turn left block, a wait 1 second block, and a print drive rotation in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  [Turn towards the left for 1 second.]
  turn [left v]
  wait (1) seconds
  [Print Drivetrain's current rotation after 1 second.]
  print (drive rotation in degrees) on [Brain v] ◀ and set cursor to next row 

Drive Velocity#

The Drive Velocity block is used to report the current velocity of the Drivetrain.

VEXcode blocks stack of code containing a drive velocity in % block.#
  drive velocity in [% v]

The Drive Velocity block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose the units to report in, percent (%) or rpm.

Image showing drive velocity units for a robotic system, including parameters like percent and rpm.

In this example, the Drivetrain will drive forward for 1 second before its current velocity is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive velocity in % on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current velocity after 1 second.]
  print (drive velocity in [% v]) on [Brain v] ◀ and set cursor to next row

Drive Current#

The Drive Current block is used to report the amount of current (in amps) that the Drivetrain is currently using.

VEXcode blocks stack of code containing a drive current amps block.#
  (drive current amps)

In this example, the Drivetrain will drive forward for 1 second before its current is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive current amps on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current after 1 second.]
  print (drive current amps) on [Brain v] ◀ and set cursor to next row

Drive Power#

The Drive Power block is used to report the amount of power output the Drivetrain is currently generating.

VEXcode blocks stack of code containing a drive power in watts block.#
(drive power in watts)

In this example, the Drivetrain will drive forward for 1 second before its current power is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive power in watts on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current power after 1 second.]
  print (drive power in watts) on [Brain v] ◀ and set cursor to next row 

Drive Torque#

The Drive Torque block is used to report the amount of torque (rotational force) the Drivetrain is currently using.

VEXcode blocks stack of code containing a drive torque in Nm block.#
  (drive torque in [Nm v])

The Drive Torque block reports a range from 0.0 to 18.6 inch-pounds (InLB) or 0.0 to 2.1 Newton-meters (Nm).

Choose the units to report in, Nm or InLb.

Diagram illustrating various drive torque units and their measurements in a robotics context.

In this example, the Drivetrain will drive forward for 1 second before its current torque is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive torque in Nm on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current torque after 1 second.]
  print (drive torque in [Nm v]) on [Brain v] ◀ and set cursor to next row 

Drive Efficiency#

The Drive Efficiency block is used to report the efficiency of the Drivetrain.

VEXcode blocks stack of code containing a drive efficiency in % block.#
  (drive efficiency in %) 

The Drive Efficiency block reports a range from 0% to 100%, determined by the ratio of the power (in watts) the Drivetrain’s motors are providing as output to the power (in watts) they are drawing as input.

An EXP Drivetrain typically reaches a maximum efficiency of 65% under normal use cases.

In this example, the Drivetrain will drive forward for 1 second before its current efficiency is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive efficiency in % on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current efficiency after 1 second.]
  print (drive efficiency in %) on [Brain v] ◀ and set cursor to next row

Drive Temperature#

The Drive Temperature block is used to report the temperature of the EXP Smart Motors powering the Drivetrain.

VEXcode blocks stack of code containing a drive temperature in % block.#
  (drive temperature in %)

The Drive Temperature block reports a range from 0% to 100%.

In this example, the Drivetrain will drive forward for 1 second before its current temperature is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, and a print drive temperature in % on Brain and set cursor to next row.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current temperature after 1 second.]
  print (drive temperature in %) on [Brain v] ◀ and set cursor to next row

Bumper Sensing#

Bumper Pressed#

The Bumper Pressed block is used to report if the Bumper Switch is pressed.

VEXcode blocks stack of code containing a BumperB pressed? block.#
  <[BumperB v] pressed?>

The Bumper Pressed block reports True when the selected Bumper Switch is pressed.

The Bumper Pressed block reports False when the selected Bumper Switch is not pressed.

Choose which Bumper Switch to use.

Bumper pressed indicator on the EXP Brain, showing the status of the bumper switch in a robotics context.

In this example, the Brain will print a message on its screen the first time the Bumper Switch is pressed.

VEXcode blocks stack of code containing a when started block, a wait until BumperB pressed? block, and a print Bumper Switch was pressed. on Brain and set cursor to next row block.#
  when started :: hat events
  [Don't print the message until the Bumper Switch is pressed.]
  wait until <[BumperB v] pressed?>
  print [Bumper Switch was pressed.] on [Brain v] ◀ and set cursor to next row

Limit Sensing#

Limit Pressed#

The Limit Pressed block is used to report if the Limit Switch is pressed.

  <[LimitSwitchA v] pressed?>

The Limit Pressed block reports True when the selected Limit Switch is pressed.

The Limit Pressed block reports False when the selected Limit Switch is not pressed.

Choose which Limit Switch to use.

Limit switch pressed indicator on the VEX EXP Brain, showing true or false status for the Limit Sensing block.

In this example, the Brain will print a message on its screen the first time the Limit Switch is pressed.

  when started :: hat events
  [Don't print the message until the Limit Switch is pressed.]
  wait until <[LimitSwitchA v] pressed?>
  print [Limit Switch was pressed.] on [Brain v] ◀ and set cursor to next row

Gyro Sensing#

Calibrate#

The Calibrate block is used to calibrate the Gyro or Inertial Sensor to reduce the amount of drift. It is recommended that this block be used at the start of a project.

The Gyro or Inertial Sensor will automatically adjust its values depending on the orientation of the EXP Brain during calibration, so its readings remain consistent across all possible EXP Brain orientations.

Images of the EXP Brain in its right, left, and top orientations.

The Brain must remain still for the calibration process to succeed, which takes approximately 2 seconds.

VEXcode blocks stack of code containing a calibrate GyroH block.#
  calibrate [GyroH v]

Choose which Gyro/Inertial Sensor to use.

Image showing a user interface for calibrating a device's gyro sensor with settings for heading and rotation.

In this example, the Brain’s Inertial Sensor will calibrate for 2 seconds before printing the current orientation of the Inertial Sensor.

VEXcode blocks stack of code containing a when started block, a calibrate BrainInertial block, a wait 2 seconds block, and a print BrainInertial orientation of roll on Brain and set cursor to next row block.#
  when started :: hat events
  calibrate [BrainInertial v]
  wait (2) seconds
  print ([BrainInertial v] orientation of [roll v]) on [Brain v] ◀ and set cursor to next row 

Set Heading#

The Set Heading block is used to set the Gyro/Inertial sensor’s current heading to a given value.

VEXcode blocks stack of code containing a set GyroA heading to 0 degrees block.#
  set [GyroA v] heading to (0) degrees

The Set Heading block accepts a range of 0.0 to 359.99 degrees.

Choose which Gyro/Inertial Sensor to use.

Diagram illustrating various sensing blocks for the VEX EXP Brain, including timer, motor, and controller functions.

In this example, the Brain’s Inertial sensor will print its starting heading, set its heading to 90 degrees, and then print the new heading.

VEXcode blocks stack of code containing a when started block, a print BrainInertial heading in degrees on Brain and set cursor to next row block, a set BrainInertial heading to 90 degrees block, and a print BrainInertial heading in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([BrainInertial v] heading in degrees) on [Brain v] ◀ and set cursor to next row 
  set [BrainInertial v] heading to (90) degrees
  print ([BrainInertial v] heading in degrees) on [Brain v] ◀ and set cursor to next row 

Set Rotation#

The Set Rotation block is used to set the Gyro/Inertial sensor’s current rotation to a given value.

VEXcode blocks stack of code containing a set Inertial1 rotation to 0 degrees.#
  set [Inertial1 v] rotation to (0) degrees 

The Set Rotation block accepts any positive or negative decimal or integer number.

Choose which Gyro/Inertial Sensor to use.

Image of a rotation device used in robotics for sensing and controlling motor positions and movements.

In this example, the Brain’s Inertial sensor will print its starting rotation, set its rotation to -100 degrees, and then print the new rotation.

VEXcode blocks stack of code containing a when started block, a print BrainInertial rotation in degrees on Brain and set cursor to next row block, a set BrainInertial rotation to -100 degrees block, a print BrainInertial rotation in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([BrainInertial v] rotation in degrees) on [Brain v] ◀ and set cursor to next row
  set [BrainInertial v] rotation to (-100) degrees
  print ([BrainInertial v] rotation in degrees) on [Brain v] ◀ and set cursor to next row

Angle of Heading#

The Angle of Heading block is used to report the 3-Wire Gyro Sensor or EXP Inertial Sensor’s current heading in degrees.

VEXcode blocks stack of code containing a BrainInertial heading in degrees block.#
  ([BrainInertial v] heading in degrees)

The Angle of Heading block reports a range from 0.0 to 359.99 degrees.

Choose which Gyro/Inertial Sensor to use.

Diagram illustrating various sensing blocks for the VEX EXP Brain, including timer, motor, and controller functions.

In this example, the Brain’s Inertial sensor will print its starting heading, set its heading to 90 degrees, and then print the new heading.

VEXcode blocks stack of code containing a when started block, a print BrainInertial heading in degrees on Brain and set cursor to next row block, a set BrainInertial heading to 90 degrees block, and a print BrainInertial heading in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([BrainInertial v] heading in degrees) on [Brain v] ◀ and set cursor to next row
  set [BrainInertial v] heading to (90) degrees
  print ([BrainInertial v] heading in degrees) on [Brain v] ◀ and set cursor to next row

Angle of Rotation#

The Angle of Rotation block is used to report the 3-Wire Gyro Sensor or EXP Inertial Sensor’s current rotation in degrees.

VEXcode blocks stack of code containing a BrainInertial rotation in degrees block.#
  ([BrainInertial v] rotation in degrees)

A clockwise direction is reported as a positive value, and a counterclockwise value is reported as a negative value.

Choose which Gyro/Inertial Sensor to use.

Image of a rotation device used in robotics for sensing and controlling motor positions and movements.

In this example, the Brain’s Inertial sensor will print its starting rotation, set its rotation to -100 degrees, and then print the new rotation.

VEXcode blocks stack of code containing a when started block, a print BrainInertial rotation in degrees on Brain and set cursor to next row block, a set BrainInertial rotation to -100 degrees block, a print BrainInertial rotation in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([BrainInertial v] rotation in degrees) on [Brain v] ◀ and set cursor to next row
  set [BrainInertial v] rotation to (-100) degrees
  print ([BrainInertial v] rotation in degrees) on [Brain v] ◀ and set cursor to next row

Inertial Sensing#

Acceleration of#

The Acceleration of block is used to report the acceleration value from one of the axes (x, y, or z) on the Inertial Sensor.

  ([Inertial20 v] acceleration of [X v] axis :: #5cb0d6)

The Acceleration of block reports a range from -4.0 to 4.0 Gs.

Choose which Gyro/Inertial Sensor to use.

Image of an acceleration device used for monitoring and reporting motion and orientation data in robotics applications.

Choose which axis to use:

  • x - The X-axis reports acceleration when the Inertial Sensor moves forward and backward.

  • y - The Y-axis reports acceleration when the Inertial Sensor moves side to side.

  • z - The Z-axis reports acceleration when the Inertial Sensor moves up and down.

Diagram illustrating the acceleration axis for various sensing parameters in robotics, including motor and drivetrain data.

In this example, the Drivetrain will move forward and print its current X-axis acceleration while moving.

  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print the X-axis acceleration while the Drivetrain is moving.]
  print ([BrainInertial v] acceleration of [X v] axis in g :: #5cb0d6) on [Brain v] ◀ and set cursor to next row

Gyro Rate of#

The Gyro Rate of block is used to report the rate of rotation from one of the axes (x, y, or z) on the Inertial Sensor.

  ([Inertial20 v] gyro rate of [X v] axis :: #5cb0d6)

The Gyro Rate of block reports a range from -1000.0 to 1000.0 in dps (degrees per second).

Choose which Gyro/Inertial Sensor to use.

Image of a gyro rate device used for measuring angular rotation in robotics applications.

Choose which axis to use:

  • x - The X-axis reports rate of rotation when the Inertial Sensor rotates on the X-Axis (based on the orientation of the sensor).

  • y - The Y-axis reports rate of rotation when the Inertial Sensor rotates on the Y-Axis (based on the orientation of the sensor).

  • z - The Z-axis reports rate of rotation when the Inertial Sensor rotates in the Z-Axis (based on the orientation of the sensor).

Diagram illustrating gyro rate sensing with axes labeled for orientation and rotation measurements.

In this example, the Drivetrain will turn to the right and print its current X-axis gyro rate while turning.

  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print the X-axis gyro rate while the Drivetrain is turning.]
  print ([BrainInertial v] gyro rate of [X v] axis in dps :: #5cb0d6) on [Brain v] ◀ and set cursor to next row

Orientation of#

The Orientation of block is used to report the orientation angle of the inertial sensor.

  ([Inertial20 v] orientation of [roll v])

Choose which Gyro/Inertial Sensor to use.

Image of an orientation device used for various sensing applications in robotics and automation.

Choose which orientation to use:

  • roll - The Y-axis represents roll, which reports a value between -180 to +180 degrees.

  • pitch - The X-axis represents pitch, which reports a value between -90 to +90 degrees.

  • yaw - The Z-axis represents yaw, which reports a value between -180 to +180 degrees.

Diagram illustrating various sensing blocks and their functions in the VEX EXP Brain system.

In this example, the Drivetrain will turn to the right and print its current yaw as it turns.

  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print the yaw while the Drivetrain is turning.]
  print ([BrainInertial v] orientation of [yaw v]) on [Brain v] ◀ and set cursor to next row

Encoder Sensing#

Set Shaft Encoder Position#

The Set Shaft Encoder Position block is used to set the Shaft Encoder’s position to the given value.

VEXcode blocks stack of code containing a set EncoderC position to 0 degrees block.#
  set [EncoderC v] position to (0) degrees

Choose which Shaft Encoder to use.

Image of a shaft encoder device used for measuring rotational position and velocity in robotics.

In this example, the Shaft Encoder will print its starting position, set its position to 90 degrees, and then print the new position.

VEXcode blocks stack of code containing a when started block, a print EncoderC position in degrees on brain and set cursor to next row block, a set EncoderC position to 90 degrees block, and a print EncoderC position in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([EncoderC v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row
  set [EncoderC v] position to (90) degrees
  print ([EncoderC v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row

Shaft Encoder Position#

The Shaft Encoder Position block is used to report the distance the Shaft Encoder has rotated.

VEXcode blocks stack of code containing an EncoderC position in degrees block.#
  ([EncoderC v] position in [degrees v])

Choose which Shaft Encoder to use.

Image of a shaft encoder position device used for measuring rotational position in robotics applications.

Choose which unit to report in: degrees or turns.

Shaft encoder position units diagram illustrating how to set and read encoder position values.

In this example, the Shaft Encoder will print its starting position, set its position to 90 degrees, and then print the new position.

VEXcode blocks stack of code containing a when started block, a print EncoderC position in degrees on Brain and set cursor to next row block, a set EncoderC position to 90 degrees block, and a print EncoderC position in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([EncoderC v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row
  set [EncoderC v] position to (90) degrees
  print ([EncoderC v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row

Shaft Encoder Velocity#

The Shaft Encoder Velocity block is used to report the current velocity of a Shaft Encoder.

VEXcode blocks stack of code containing an EncoderC velocity in dps block.#
  ([EncoderC v] velocity in [dps v])

Choose which Shaft Encoder to use.

Shaft encoder velocity device displaying current velocity readings in degrees per second or rotations per minute.

Choose which unit to report in: degrees per second (dps) or rotations per minute (rpm).

Shaft encoder velocity units illustration showing degrees per second (dps) and rotations per minute (rpm) measurements.
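
For example (a sketch; the Shaft Encoder name EncoderC is taken from the blocks above), the Brain could print the Shaft Encoder's current velocity on its screen:

  when started :: hat events
  print ([EncoderC v] velocity in [dps v]) on [Brain v] ◀ and set cursor to next row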

Distance Sensing#

Object Distance#

The Object Distance block is used to report the distance of the nearest object from the Distance Sensor.

VEXcode blocks stack of code containing a Distance10 object distance in mm block.#
  ([Distance10 v] object distance in [mm v])

The Object Distance block reports a range from 20mm to 2000mm.

Choose which Distance Sensor to use.

Image of a distance sensing device displaying object distance and related metrics in a robotics context.

Choose what units to report in: millimeters (mm) or inches.

Image illustrating various object distance sensing units and measurements for robotics applications.

In this example, the Distance Sensor will report the current distance between it and the closest object.

VEXcode blocks stack of code containing a when started block, and a Distance10 object distance in mm on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Distance10 v] object distance in [mm v]) on [Brain v] ◀ and set cursor to next row

Object Velocity#

The Object Velocity block is used to report the current velocity of an object in meters per second (m/s).

VEXcode blocks stack of code containing a Distance10 object velocity in m/s block.#
  ([Distance10 v] object velocity in m/s)

Choose which Distance Sensor to use.

Image of an object velocity device used for measuring the speed of objects in motion.

In this example, the Distance Sensor will report the current velocity of an object moving in front of it.

VEXcode blocks stack of code containing a when started block, and a print Distance10 object velocity in m/s on Brain and set cursor to next row.#
  when started :: hat events
  print ([Distance10 v] object velocity in m/s) on [Brain v] ◀ and set cursor to next row

Object Size Is#

The Object Size Is block is used to report if the Distance Sensor detects the specified object size.

VEXcode blocks stack of code containing a Distance10 object size is small ? block.#
  <[Distance10 v] object size is [small v] ?>

The Distance Sensor determines the size of the object detected (none, small, medium, large) based on the amount of light reflected and returned to the sensor.

The Object Size Is block reports True when the Distance Sensor detects the specified size.

The Object Size Is block reports False when the Distance Sensor doesn’t detect the specified size.

Choose which Distance Sensor to use.

Diagram illustrating object size detection by a distance sensor, showing small, medium, and large object indicators.

Choose which size of object you want the Distance Sensor to check for.

  • small

  • medium

  • large

Diagram illustrating the relationship between object size and distance detected by a sensor.

In this example, if the Distance Sensor detects a small object, the Drivetrain will drive forward until the object is detected as large.

VEXcode blocks stack of code containing a when started block, an if Distance10 object size is small ? then block, a drive forward block, a wait until Distance10 object size is large ? block, a stop driving block, and an end block.#
  when started :: hat events
  [Check if the Distance Sensor sees a small object.]
  if <[Distance10 v] object size is [small v] ?> then
  [If a small object is detected, drive forward.]
  drive [forward v]
  [Wait until the small detected object is large.]
  wait until <[Distance10 v] object size is [large v] ?>
  [When the object size is large, stop driving.]
  stop driving
  end

Distance Sensor Found Object#

The Distance Sensor Found Object block is used to report if the Distance Sensor sees an object within its field of view.

VEXcode blocks stack of code containing a Distance10 found an object? block.#
  <[Distance10 v] found an object?>

The Distance Sensor Found Object block reports True when the Distance Sensor sees an object or surface within its field of view.

The Distance Sensor Found Object block reports False when the Distance Sensor does not detect an object or surface.

Choose which Distance Sensor to use.

Diagram illustrating distance detection capabilities of an object detection sensor device.

In this example, when the Distance Sensor detects an object, it will print a message to the Brain.

VEXcode blocks stack of code containing a when started block, a wait until Distance10 found an object? block, and a print Distance Sensor has detected an object. on Brain and set cursor to next row.#
  when started :: hat events
  [Don't print the message until the Distance Sensor finds an object.]
  wait until <[Distance10 v] found an object?>
  print [Distance Sensor has detected an object.] on [Brain v] ◀ and set cursor to next row

Optical Sensing#

Set Optical Mode#

The Set Optical Mode block is used to set an Optical Sensor to either detect colors or gestures.

VEXcode blocks stack of code containing a set Optical7 to color mode block.#
  set [Optical7 v] to [color v] mode 

By default, an Optical Sensor is set to always detect colors. Before using any Optical Sensor gesture blocks, the Optical Sensor must be set to detect gestures.

Choose which Optical Sensor to use.

Image showing a diagram for setting the optical mode of a sensor in a robotics context.

Choose whether you want to set the mode of the Optical Sensor to either detect colors or gestures.

Diagram illustrating the process of setting the optical mode for a sensor to detect colors or gestures.

In this example, the Optical Sensor is set to detect gestures before waiting until a left gesture is detected to print a message.

VEXcode blocks stack of code containing a when started block, a set Optical7 to gesture mode block, a wait until Optical7 gesture left detected? block, and a print Left gesture detected. on Brain and set cursor to next row block.#
  when started :: hat events
  [Set Optical sensor to detect gestures.]
  set [Optical7 v] to [gesture v] mode
  [Don't print the message until a left gesture is detected.]
  wait until <[Optical7 v] gesture [left v] detected?>
  print [Left gesture detected.] on [Brain v] ◀ and set cursor to next row

Set Optical Light#

The Set Optical Light block is used to turn the light on the Optical Sensor on or off. The light helps the Optical Sensor see objects in dark areas.

VEXcode blocks stack of code containing a set Optical7 light on block.#
  set [Optical7 v] light [on v]

Choose which Optical Sensor to use.

Illustration of an optical light device used in sensing applications, showcasing its features and settings.

Choose whether to turn the light on or off.

Image showing the setting of optical light mode for an optical sensor in a robotics context.

In this example, the Optical Sensor will turn its light on for two seconds before turning it off.

VEXcode blocks stack of code containing a when started block, a set Optical7 light on block, a wait 2 seconds block, and a set Optical7 light off block.#
  when started :: hat events
  set [Optical7 v] light [on v]
  wait (2) seconds
  set [Optical7 v] light [off v]

Set Optical Light Power#

The Set Optical Light Power block is used to set the power of the Optical Sensor’s light.

VEXcode blocks stack of code containing a set Optical7 light power to 50 % block.#
  set [Optical7 v] light power to (50) %

The Set Optical Light Power block accepts a range of 0% to 100%. This will change the brightness of the light on the Optical Sensor. If the light is off, this block will turn the light on.

Choose which Optical Sensor to use.

Optical light power device interface for setting optical light power levels in sensing applications.

In this example, the Optical Sensor’s light power is set to 75% before waiting until an object is detected to print a message.

VEXcode blocks stack of code containing a when started block, a set Optical7 light power to 75 % block, a wait until Optical7 found an object? block, and a print Object detected. on Brain and set cursor to next row block.#
  when started :: hat events
  [Set the Optical Sensor's light power to 75%]
  set [Optical7 v] light power to (75) %
  [Don't print the message until an object is detected.]
  wait until <[Optical7 v] found an object?>
  print [Object detected.] on [Brain v] ◀ and set cursor to next row 

Optical Sensor Found Object#

The Optical Sensor Found Object block is used to report if the Optical Sensor detects an object close to it.

VEXcode blocks stack of code containing an Optical7 found an object? block.#
  <[Optical7 v] found an object?>

The Optical Sensor Found Object block reports True when the Optical Sensor detects an object close to it.

The Optical Sensor Found Object block reports False when an object is not within range of the Optical Sensor.

Choose which Optical Sensor to use.

Diagram of an optical sensor device detecting objects and colors in a robotic sensing context.

In this example, the Optical Sensor’s light power is set to 75% before waiting until an object is detected to print a message.

VEXcode blocks stack of code containing a when started block, a set Optical7 light power to 75 % block, a wait until Optical7 found an object ? block, and a print Object detected. on Brain and set cursor to next row block.#
  when started :: hat events
  [Set the Optical Sensor's light power to 75%.]
  set [Optical7 v] light power to (75) %
  [Don't print the message until an object is detected.]
  wait until <[Optical7 v] found an object?>
  print [Object detected.] on [Brain v] ◀ and set cursor to next row

Optical Sensor Detects Color#

The Optical Sensor Detects Color block is used to report if the Optical Sensor detects the specified color.

VEXcode blocks stack of code containing an Optical7 detects red ? block.#
  <[Optical7 v] detects [red v] ?>

The Optical Sensor Detects Color block reports True when the Optical Sensor detects the specified color.

The Optical Sensor Detects Color block reports False when the Optical Sensor doesn’t detect the specified color.

Choose which Optical Sensor to use.

Image of an optical sensor detecting color, with indicators for brightness, hue, and gesture detection.

Choose which color the Optical Sensor will check for.

Optical sensor detecting color with various hues and brightness levels in a robotics context.

In this example, the Optical Sensor will wait until it detects a blue object before printing a message.

VEXcode blocks stack of code containing a when started block, a wait until Optical7 detects blue ? block, and a print Color blue detected. on Brain and set cursor to next row block.#
  when started :: hat events
  [Don't print the message until the color blue is detected.]
  wait until <[Optical7 v] detects [blue v] ?>
  print [Color blue detected.] on [Brain v] ◀ and set cursor to next row

Optical Brightness#

The Optical Brightness block is used to report the amount of light detected by the Optical Sensor.

VEXcode blocks stack of code containing an Optical7 brightness in % block.#
  ([Optical7 v] brightness in %)

The Optical Brightness block reports a number value from 0% to 100%.

A large amount of light detected will report a high brightness value.

A small amount of light detected will report a low brightness value.

Choose which Optical Sensor to use.

Optical brightness device displaying light intensity measured by the optical sensor in a robotics context.

In this example, the Optical Sensor will print the current brightness value to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, and a print Optical7 brightness in % on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Optical7 v] brightness in %) on [Brain v] ◀ and set cursor to next row

Optical Hue#

The Optical Hue block is used to report the hue of the color of an object.

VEXcode blocks stack of code containing an Optical7 hue in degrees block.#
  ([Optical7 v] hue in degrees)

The Optical Hue block reports the hue of the color of an object as a number between 0 and 359.

The value can be thought of as the location of the color on a color wheel in degrees.

Choose which Optical Sensor to use.

Optical hue device displaying color detection data and sensor readings in a robotics context.

In this example, the Optical Sensor will print the currently seen hue to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, and a print Optical7 hue in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Optical7 v] hue in degrees) on [Brain v] ◀ and set cursor to next row

Optical Sensor Detects Gesture#

The Optical Sensor Detects Gesture block is used to report whether an Optical Sensor has detected the specified gesture.

VEXcode blocks stack of code containing an Optical7 gesture up detected? block.#
  <[Optical7 v] gesture [up v] detected?>

Important: The Optical Sensor must first be set to detect gestures using the Set Optical Mode block, otherwise it will not detect any gestures.

The Optical Sensor Detects Gesture block reports True when the Optical Sensor detects the specified gesture.

The Optical Sensor Detects Gesture block reports False when the Optical Sensor doesn’t detect the specified gesture.

Choose which Optical Sensor to use.

Optical gesture detection device displaying various gesture recognition features and settings for sensor configuration.

Choose which gesture the Optical Sensor will check for.

Optical sensor detecting a gesture, used for gesture recognition in robotics applications.

In this example, the Optical Sensor is set to detect gestures before waiting until a left gesture is detected to print a message.

VEXcode blocks stack of code containing a when started block, a set Optical7 to gesture mode block, a wait until Optical7 gesture left detected? block, and a print Left gesture detected. on Brain and set cursor to next row block.#
  when started :: hat events
  [Set Optical Sensor to detect gestures.]
  set [Optical7 v] to [gesture v] mode
  [Don't print the message until a left gesture is detected.]
  wait until <[Optical7 v] gesture [left v] detected?>
  print [Left gesture detected.] on [Brain v] ◀ and set cursor to next row

Rotation Sensing#

Set Rotation Sensor Position#

The Set Rotation Sensor Position block is used to set a Rotation Sensor’s current position to a defined value.

VEXcode blocks stack of code containing a set Rotation9 position to 0 degrees block.#
  set [Rotation9 v] position to (0) degrees

The Set Rotation Sensor Position block accepts any positive or negative decimal or integer number.

Choose which Rotation Sensor to use.

Rotation sensor position device displaying angle and position metrics for rotational sensing applications.

In this example, the Rotation Sensor will print its starting position, set its position to -100 degrees, and then print the new position.

VEXcode blocks stack of code containing a when started block, a print Rotation9 position in degrees on Brain and set cursor to next row block, a set Rotation9 position to -100 degrees block, and a print Rotation9 position in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Rotation9 v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row
  set [Rotation9 v] position to (-100) degrees :: #5cb0d6
  print ([Rotation9 v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row

Rotation Sensor Angle#

The Rotation Sensor Angle block is used to report the Rotation Sensor’s current angle of rotation in degrees.

VEXcode blocks stack of code containing a Rotation9 angle in degrees block.#
  ([Rotation9 v] angle in degrees)

The Rotation Sensor Angle block reports values in the range of 0.00 to 359.99.

Choose which Rotation Sensor to use.

Rotation sensor device displaying angle and position data for sensing applications in robotics.

In this example, the Rotation Sensor will print its current angle.

VEXcode blocks stack of code containing a when started block, and a print Rotation9 angle in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Rotation9 v] angle in degrees) on [Brain v] ◀ and set cursor to next row

Rotation Sensor Position#

The Rotation Sensor Position block is used to report the current rotational position of the selected Rotation Sensor.

VEXcode blocks stack of code containing a Rotation9 position in degrees block.#
  ([Rotation9 v] position in [degrees v])

Choose which Rotation Sensor to use.

Rotation sensor position device displaying angle and position metrics for rotational sensing applications.

Choose what units the position will be reported in: degrees or turns.

Rotation sensor position units diagram illustrating angle and velocity measurements for rotation sensing.

In this example, the Rotation Sensor will print its starting position, set its position to -100 degrees, and then print the new position.

VEXcode blocks stack of code containing a when started block, a print Rotation9 position in degrees on Brain and set cursor to next row block, a set Rotation9 position to -100 degrees block, and a print Rotation9 position in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Rotation9 v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row
  set [Rotation9 v] position to (-100) degrees :: #5cb0d6
  print ([Rotation9 v] position in [degrees v]) on [Brain v] ◀ and set cursor to next row

Rotation Sensor Velocity#

The Rotation Sensor Velocity block is used to report the current velocity of a Rotation Sensor.

VEXcode blocks stack of code containing a Rotation9 velocity in rpm block.#
  ([Rotation9 v] velocity in [rpm v])

Choose which Rotation Sensor to use.

Rotation sensor and velocity device for monitoring motor and drivetrain performance in robotics applications.

Choose what units the velocity will be reported in: revolutions per minute (rpm) or degrees per second (dps).

Image showing rotation sensor with velocity units for reporting angle and position in degrees or rpm.

In this example, the Drivetrain will turn to the right for 1 second before the Rotation Sensor’s current velocity is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a turn right block, a wait 1 second block, and a print Rotation9 velocity in rpm on Brain and set cursor to next row block.#
  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print the current rotational velocity after 1 second.]
  print ([Rotation9 v] velocity in [rpm v]) on [Brain v] ◀ and set cursor to next row

Vision Sensing#

Take Vision Sensor Snapshot#

The Take Vision Sensor Snapshot block is used to take a snapshot from the Vision Sensor.

VEXcode blocks stack of code containing a take a Vision1 snapshot of SELECT_A_SIG block.#
  take a [Vision1 v] snapshot of [SELECT_A_SIG v]

The Take Vision Sensor Snapshot block will capture the current image from the Vision Sensor to be processed and analyzed for color signatures and codes.

A snapshot is required first before using any other Vision Sensor blocks.

Choose which Vision Sensor to use.

Vision sensor snapshot device interface displaying various sensing parameters and configurations for robotics applications.

Select which vision signature to use. Vision signatures are configured from the Devices window.

Vision sensor snapshot showing detected objects and their properties for analysis in robotics applications.
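
For reference, a minimal sketch (assuming a Vision Sensor named Vision1 and a hypothetical Color Signature called RED_OBJECT configured in the Devices window) could take a snapshot and then report how many matching objects were found:

  when started :: hat events
  [RED_OBJECT is an assumed Color Signature configured in the Devices window.]
  take a [Vision1 v] snapshot of [RED_OBJECT v]
  print ([Vision1 v] object count) on [Brain v] ◀ and set cursor to next row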

Set Vision Sensor Object Item#

The Set Vision Sensor Object Item block is used to select which of the detected objects (the object item) the other Vision Sensor blocks will report information about.

VEXcode blocks stack of code containing a set Vision1 object item to 1 block.#
  set [Vision1 v] object item to (1)

Choose which Vision Sensor to use.

Diagram illustrating various sensing capabilities and parameters for robotic control systems.
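
As a short illustration of selecting an object item (again assuming Vision1 and a hypothetical RED_OBJECT Color Signature, plus a standard greater-than Operator block), the sketch below reports the width of the second largest detected object when at least two objects are found:

  when started :: hat events
  take a [Vision1 v] snapshot of [RED_OBJECT v]
  [If at least two objects were detected, report on the second largest one.]
  if <([Vision1 v] object count) > (1)> then
  set [Vision1 v] object item to (2)
  print ([Vision1 v] object [width v]) on [Brain v] ◀ and set cursor to next row
  end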

Vision Sensor Object Count#

The Vision Sensor Object Count block is used to report how many objects the Vision Sensor detects.

VEXcode blocks stack of code containing a Vision1 object count block.#
  ([Vision1 v] object count)

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Count block reports a number of objects.

The Vision Sensor Object Count block only counts objects that match the signature used in the most recent snapshot.

Choose which Vision Sensor to use.

Vision sensor displaying object count and detection status in a robotics context.
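
A possible sketch (assuming Vision1 and a hypothetical RED_OBJECT Color Signature) refreshes the count every .25 seconds so the printed value stays current:

  when started :: hat events
  forever
  [Take a new snapshot so the count is not based on outdated data.]
  take a [Vision1 v] snapshot of [RED_OBJECT v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  print ([Vision1 v] object count) on [Brain v] ◀ and set cursor to next row
  wait (0.25) seconds
  end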

Vision Sensor Object Exists?#

The Vision Sensor Object Exists? block is used to report if the Vision Sensor detects a configured object.

VEXcode blocks stack of code containing a Vision1 object exists? block.#
  <[Vision1 v] object exists?>

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Exists? block can detect any configured objects.

The Vision Sensor Object Exists? block reports True when the Vision Sensor detects a configured object.

The Vision Sensor Object Exists? block reports False when the Vision Sensor does not detect a configured object.

Choose which Vision Sensor to use.

Vision sensor displaying object detection status with graphical interface elements and numerical values.
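
One way to use this block (still assuming Vision1 and a hypothetical RED_OBJECT Color Signature) is to keep taking snapshots until a matching object appears:

  when started :: hat events
  [Keep taking snapshots until an object matching RED_OBJECT is detected.]
  repeat until <[Vision1 v] object exists?>
  take a [Vision1 v] snapshot of [RED_OBJECT v]
  end
  print [Object detected.] on [Brain v] ◀ and set cursor to next row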

Vision Sensor Object#

The Vision Sensor Object block is used to report information about a detected object from the Vision Sensor.

VEXcode blocks stack of code containing a Vision1 object width block.#
  ([Vision1 v] object [width v] :: #5cb0d6)

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object block can report information about a detected object.

Choose which Vision Sensor to use.

Diagram illustrating various sensing capabilities and parameters for robotic control systems.

Choose which property to report from the Vision Sensor:

  • width - How wide the object is in pixels, from 2 - 316 pixels.

  • height - How tall the object is in pixels, from 2 - 212 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 315 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 211 pixels.

  • angle - The angle of the detected object, from 0 - 180 degrees.

Diagram illustrating the properties and functions of a vision sensor in robotics sensing systems.
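
Putting these properties together, a small sketch (assuming Vision1 and a hypothetical RED_OBJECT Color Signature) could report the centerX of the largest detected object, which a project could later compare against the middle of the image to decide which way to turn:

  when started :: hat events
  take a [Vision1 v] snapshot of [RED_OBJECT v]
  if <[Vision1 v] object exists?> then
  [Report the horizontal position of the largest detected object.]
  set [Vision1 v] object item to (1)
  print ([Vision1 v] object [centerX v]) on [Brain v] ◀ and set cursor to next row
  end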

AI Vision Sensing#

Take AI Vision Snapshot#

The Take Snapshot block is used to capture the current image from the AI Vision Sensor to be processed and analyzed for Visual Signatures.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing a take a AIVision5 snapshot of AprilTags block.#
  take a [AIVision5 v] snapshot of [AprilTags v]

A snapshot is required first before using any other AI Vision Sensor blocks.

Choose which AI Vision Sensor to use.

Image of a device interface displaying various sensor and motor status metrics for robotics control.

Select what Visual Signature the AI Vision Sensor should take a snapshot of.

  • AprilTags.

  • AI Classifications.

  • A configured Color Signature or Color Code.

AI Vision snapshot showing detected objects and their classifications with visual signatures and coordinates.

When a snapshot is taken with the AI Vision Sensor, it creates an array with all of the detected objects and their properties stored inside.

It’s also important to take a new snapshot every time you want to use data from the AI Vision Sensor, so your robot isn’t using outdated data from an old snapshot.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature. This makes sure that the data the robot is using is getting constantly updated.

Before any data is pulled from the snapshot, the AI Vision Sensor Object Exists? block is used to ensure that at least one object was detected in the snapshot. This makes sure that the robot isn’t trying to pull data from an empty array.

If the AI Vision Sensor has detected at least one object, it will print the CenterX coordinate of the largest detected object to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object originX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain Screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [originX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds]
  wait (0.25) seconds
  end

AI Classification Is#

The AI Classification Is block is used to report if the specified AI Classification has been detected.

VEXcode blocks stack of code containing an AIVision1 AI classification is BlueBall ? block.#
  <[AIVision1 v] AI classification is [BlueBall v] ? :: #5cb0d6>

The Take AI Vision Snapshot block is required first for AI Classifications before using the AI Classification Is block.

The AI Classification Is block reports True when the AI Vision Sensor has detected the specified AI Classification.

The AI Classification Is block reports False when the AI Vision Sensor has not detected the specified AI Classification.

Choose which AI Vision Sensor to use.

AI Vision Sensor classification result display showing detected object types and counts.

Choose which AI Classification to detect. This can change depending on what detection model you are using.

For more information on what AI Classifications are available and how to enable their detection, go here.

AI Vision classification output displaying detected object types and their identifiers in a robotics context.

In this example, the AI Vision Sensor will take a snapshot of all AI Classifications before checking if a Blue Ball was detected or not. If a Blue Ball was detected, it will print a message on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a take a AIVision1 snapshot of AI Classifications block, an if AIVision1 AI classification is BlueBall ? then block, a print Blue Ball detected! on Brain block, and an end block.#
  when started :: hat events
  take a [AIVision1 v] snapshot of [AI Classifications v]
  if <[AIVision1 v] AI classification is [BlueBall v] ? :: #5cb0d6> then
  print [Blue Ball detected!] on [Brain v] ▶
  end

Detected AprilTag Is#

The Detected AprilTag Is block is used to report if the specified AprilTag is detected. For more information on what AprilTags are and how to enable their detection, go here.

VEXcode blocks stack of code containing a AIVision1 detected AprilTag is 1 ? block.#
  <[AIVision1 v] detected AprilTag is (1) ? :: #5cb0d6>

The Take AI Vision Snapshot block is required first for AprilTags before using the Detected AprilTag Is block.

The Detected AprilTag Is block reports True when the AI Vision Sensor has detected the specified AprilTag.

The Detected AprilTag Is block reports False when the AI Vision Sensor has not detected the specified AprilTag.

Choose which AI Vision Sensor to use.

AI Vision Sensor displaying an AprilTag detection interface with various sensor readings and status indicators.

In this example, the AI Vision Sensor will take a snapshot of all AprilTags before checking if the AprilTag with the ID “3” was detected. If that specific AprilTag was detected, it will print a message on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a take a AIVision1 snapshot of AprilTags block, an if AIVision1 detected AprilTag is 3 ? then block, a print AprilTag 3 detected! on Brain block, and an end block.#
  when started :: hat events
  take a [AIVision1 v] snapshot of [AprilTags v]
  if <[AIVision1 v] detected AprilTag is (3) ? :: #5cb0d6> then
  print [AprilTag 3 detected!] on [Brain v] ▶
  end

Set AI Vision Sensor Object Item#

The Set AI Vision Sensor Object Item block is used to set the object item (of the object you want to learn more information about) from the objects detected. By default, the Object item is set to 1 at the start of a project.

When multiple objects are detected, they will be stored from largest to smallest, with object item 1 being the largest.

Note: AprilTags are not sorted by their size, but by their unique IDs in ascending order. For example, if AprilTags 1, 15, and 3 are detected:

  • AprilTag 1 will have index 0.

  • AprilTag 3 will have index 1.

  • AprilTag 15 will have index 2.

VEXcode blocks stack of code containing a set AIVision5 object item to 1 block.#
  set [AIVision5 v] object item to (1)

The Take AI Vision Snapshot block is required first before the Set AI Vision Sensor Object Item block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities and controls for a robotic system, including motor and vision sensors.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then, if an object was detected, it will print the CenterX coordinate of the largest object (object item 1) to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object originX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain Screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [originX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds]
  wait (0.25) seconds
  end

AI Vision Sensor Object Count#

The AI Vision Sensor Object Count block is used to report how many objects the AI Vision Sensor detects that match the specified Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing a AIVision5 object count block.#
  ([AIVision5 v] object count)

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Count block can be used.

Choose which AI Vision Sensor to use.

AI Vision Sensor object count display interface showing detected objects and their details in a graphical format.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before printing how many objects were detected.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a print # of Objects Detected: on Brain block, a print AIVision5 object count on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, print how many were detected.]
  print [# of Objects Detected:] on [Brain v] ▶
  print ([AIVision5 v] object count) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds]
  wait (0.25) seconds
  end

AI Vision Sensor Object Exists?#

The AI Vision Sensor Object Exists? block is used to report if the AI Vision Sensor detects a Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing a AIVision5 object exists? block.#
  <[AIVision5 v] object exists?>

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Exists? block can be used.

The AI Vision Sensor Object Exists? block reports True when the AI Vision Sensor has detected an object.

The AI Vision Sensor Object Exists? block reports False when the AI Vision Sensor has not detected an object.

Choose which AI Vision Sensor to use.

AI Vision Sensor interface showing object detection status and parameters including object count and existence.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then it will check if an object was detected before printing how many objects were detected.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a print # of Objects Detected: on Brain block, a print AIVision5 object count on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, print how many were detected.]
  print [# of Objects Detected:] on [Brain v] ▶
  print ([AIVision5 v] object count) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds]
  wait (0.25) seconds
  end

AI Vision Sensor Object#

The AI Vision Sensor Object block is used to report information about a specified Visual Signature from the AI Vision Sensor.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing an AIVision5 object width block.#
  ([AIVision5 v] object [width v])

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities of a robotic system, including motor and controller feedback.

Choose which property to report from the AI Vision Sensor:

  • width - How wide the object is in pixels, from 0 - 320 pixels.

  • height - How tall the object is in pixels, from 0 - 240 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 320 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 240 pixels.

  • originX - The X coordinate of the object’s top-left corner, from 0 - 320 pixels.

  • originY - The Y coordinate of the object’s top-left corner, from 0 - 240 pixels.

  • angle - The angle of the detected Color Code or AprilTag, from 0 - 360 degrees.

  • tagID - The detected AprilTag’s identification number.

  • score - The confidence score (up to 100%) for AI Classifications. This score indicates how confident the model is in the detected AI Classification. A higher score indicates greater confidence in the accuracy of the AI Classification.

For more examples of using object properties, go here.

Diagram illustrating various sensing blocks and properties for robotic control, including motor and controller functions.

In this example, every .25 seconds, the AI Vision Sensor will take a snapshot of the RedBox color signature.

Then, if an object was detected, it will print the CenterX coordinate of the largest object (object item 1) to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object originX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain Screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [originX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds]
  wait (0.25) seconds
  end

Arm Sensing#

Can 6-Axis Arm Move to Position#

The Can 6-Axis Arm Move to Position block is used to report if the 6-Axis Robotic Arm is able to reach the specified position.

VEXcode blocks stack of code containing a can 6-axis arm move to position block.#
  <[arm v] move to position x: (0) y: (0) z: (0) [mm v] ?>

The Can 6-Axis Arm Move to Position block reports True when the 6-Axis Arm can reach that position.

The Can 6-Axis Arm Move to Position block reports False when the 6-Axis Arm can not reach that position.

Choose which 6-Axis Arm to use.

Image of the 6-Axis Arm selection dropdown for this block.

Select which unit to use: millimeters (mm) or inches.

Diagram illustrating the arm's reach to various units in a robotic control system.

In this example, the 6-Axis Arm will check if it can move to (0, 0, 0) and print that it can not reach the position.

when started :: hat events
if <not <[Arm2 v] move to position x: (0) y: (0) z: (0) [mm v] ?>> then
    print [The Arm can't move to this position.]
end

Can 6-Axis Arm Increment Move to Position#

The Can 6-Axis Arm Increment Move to Position block is used to report if the 6-Axis Robotic Arm is able to incrementally move for that distance.

  <[Arm2 v] increment position by x: (0) y: (500) z: (0) [mm v] ?>

The Can 6-Axis Arm Increment Move to Position block reports True when the 6-Axis Arm can incrementally move for that distance.

The Can 6-Axis Arm Increment Move to Position block reports False when the 6-Axis Arm can not incrementally move for that distance.

Choose which 6-Axis Arm to use.

Image of the 6-Axis Arm selection dropdown for this block.

Select which unit to use: millimeters (mm) or inches.

Diagram of various sensing units and their functionalities for robotics control and monitoring systems.

In this example, the 6-Axis Arm will check if it can incrementally move 500 millimeters on the Y axis and print that it can’t move that distance.

when started :: hat events
if <not <[Arm2 v] increment position by x: (0) y: (500) z: (0) [mm v] ?>> then
    print [The Arm can't incrementally move this distance.]
end

Can 6-Axis Arm End Effector Move to Orientation#

The Can 6-Axis Arm End Effector Move to Orientation block is used to report if the 6-Axis Arm’s End Effector can rotate about an axis to a specific orientation.

<[arm v] move end effector to [pitch v] (0) degrees?>

The Can 6-Axis Arm End Effector Move to Orientation block reports True when the 6-Axis Arm can rotate about an axis to a specific orientation.

The Can 6-Axis Arm End Effector Move to Orientation block reports False when the 6-Axis Arm can not rotate about an axis to a specific orientation.

Choose which 6-Axis Arm to use.

Diagram illustrating the connections and functions of the 6-axis robotic arm and its sensors.

Select which axis to use:

  • pitch - Movement around the Y-axis.

  • roll - Movement around the X-axis.

  • yaw - Movement around the Z-axis.

Diagram showing the connections between various sensing components and the EXP Brain in a robotic system.

In this example, the 6-Axis Arm will check if the End Effector can roll to a 40-degree orientation around the X axis and print whether or not it can.

when started :: hat events
if <[arm v] move end effector to [roll v] (40) degrees?> then
    print [The End Effector can move to this orientation]
else
    print [The End Effector can not move to this orientation]
end

Can 6-Axis Arm End Effector Incrementally Move to Orientation#

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block is used to report if the 6-Axis Arm’s End Effector can incrementally rotate its orientation about an axis by a specific number of degrees.

Image of the Can 6-Axis Arm End Effector Incrementally Move to Orientation block.

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block reports True when the 6-Axis Arm can incrementally rotate about an axis by the specified number of degrees.

The Can 6-Axis Arm End Effector Incrementally Move to Orientation block reports False when the 6-Axis Arm can not incrementally rotate about an axis by the specified number of degrees.

Choose which 6-Axis Arm to use.

Diagram illustrating the functionalities of a 6-axis robotic arm with various sensing capabilities and motor controls.

Select which axis to use:

  • pitch - Rotation around the Y-axis.

  • roll - Rotation around the X-axis.

  • yaw - Rotation around the Z-axis.

Diagram of a 6-axis robotic arm with labeled components and axes for movement and orientation.

In this example, the 6-Axis Arm will check if the End Effector can incrementally move 20 degrees on the Z axis and print whether or not it can.

Image of a VEXcode blocks project that checks whether the End Effector can incrementally move 20 degrees on the Z axis and prints the result.

6-Axis Arm is Done?#

The 6-Axis Arm is Done? block is used to report if the 6-Axis Arm has completed moving.

Image showing the status of a 6-axis robotic arm indicating that the arm has completed its movement.

The 6-Axis Arm is Done? block reports True when the 6-Axis Arm is not moving.

The 6-Axis Arm is Done? block reports False when the 6-Axis Arm is moving.

Choose which 6-Axis Arm to use.

Image of a 6-axis robotic arm indicating that it has completed its movement, showing its position and status.

In this example, the Arm will move to the position (-100, 200, 100) and print its Y coordinate in mm every .25 seconds as it moves, until it is done moving.

Image of a 6-axis robotic arm indicating its movement status with a “motor is done” message displayed on the screen.

6-Axis Arm Position#

The 6-Axis Arm Position block is used to report the current position of the 6-Axis Arm in the specified axis.

Illustration of the arm position block for controlling a 6-axis robotic arm in a programming context.

Choose which 6-Axis Arm to use.

Diagram illustrating the position of a 6-axis robotic arm with labeled axes and joint angles.

Choose which axis to report.

Diagram illustrating the position and axis of a 6-axis robotic arm in a sensing context.

Choose which unit to report with: millimeters (mm) or inches.

Diagram illustrating various arm position units for robotic control and sensing applications.

In this example, the 6-Axis Arm will print its current Z axis position in millimeters to the Print Console.

Illustration of arm position for a 6-axis robotic arm in a sensing context.

6-Axis Arm End Effector Orientation#

The 6-Axis Arm End Effector Orientation block is used to report the current orientation of the 6-Axis Arm’s End Effector.

Image of the 6-Axis Arm End Effector Orientation block.

Choose which 6-Axis Arm to use.

Diagram illustrating arm orientation for a 6-axis robotic arm in a sensing context.

Choose which axis to report:

  • pitch - Rotation around the Y-axis.

  • roll - Rotation around the X-axis.

  • yaw - Rotation around the Z-axis.

Diagram illustrating arm orientation axes for robotic control and movement analysis.

In this example, the 6-Axis Arm will print the End Effector’s current Y axis orientation in degrees to the Print Console.

Image demonstrating arm orientation for robotic control in a VEX EXP Brain project.

Line Tracking Sensing#

Line Tracker Reflectivity#

The Line Tracker Reflectivity block is used to report the amount of light reflected using the Line Tracker Sensor.

  ([LineTrackerG v] reflectivity in % :: #5cb0d6)

Choose which Line Tracker Sensor to use.

Line tracker reflectivity device displaying light reflectivity levels for sensor data analysis.

In this example, the Line Tracker Sensor will print the current detected reflectivity to the Brain’s Screen.

  when started :: hat events
  print ([LineTrackerG v] reflectivity in % :: #5cb0d6) on [Brain v] ◀ and set cursor to next row

Light Sensing#

Light Sensor Brightness#

The Light Sensor Brightness block is used to report the amount of light detected by the Light Sensor.

  ([Light v] brightness in %)

Choose which Light Sensor to use.

Light sensor device displaying brightness measurement in a robotics context.

In this example, the Light Sensor will print the current detected brightness to the Brain’s Screen.

  when started :: hat events
  print ([LightH v] brightness in %) on [Brain v] ◀ and set cursor to next row

Potentiometer Sensing#

Potentiometer Angle#

The Potentiometer Angle block is used to report the angular position of the Potentiometer.

VEXcode blocks stack of code containing a Potentiometer3A angle in % block.#
  ([Potentiometer3A v] angle in [% v] :: #5cb0d6)

Choose which Potentiometer to use.

Image of a potentiometer angle device used for measuring angular position in robotics and electronics applications.

Choose which unit to report in: percent (%) or degrees.

Image showing a potentiometer with angle measurement units for sensing applications.

In this example, the Potentiometer will print its current angular position to the Brain’s Screen.

VEXcode blocks stack of code containing a when started block, and a print Potentiometer3A angle in degrees on Brain and set cursor to next row block.#
  when started :: hat events
  print ([Potentiometer3A v] angle in [degrees v] :: #5cb0d6) on [Brain v] ◀ and set cursor to next row

Accelerometer Sensing#

Accelerometer Acceleration#

The Accelerometer Acceleration block is used to report the acceleration value from one axis on the Analog Accelerometer.

VEXcode blocks stack of code containing a Accel2g3g acceleration in g block.#
  ([Accel2G3G v] acceleration in g)

The Accelerometer Acceleration block reports a range from -2.0 G to 2.0 G or -6.0 G to 6.0 G depending upon the jumper setting on the Analog Accelerometer.

Choose which Accelerometer to use.

Image of an accelerometer displaying acceleration values and settings for various axes and configurations.

In this example, the Drivetrain will drive forward for 1 second before its current acceleration is printed on the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a drive forward block, a wait 1 second block, a print Accel2g3g acceleration in g on Brain and set cursor to next row block.#
  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print the current acceleration after 1 second.]
  print ([Accel2G3G v] acceleration in g) on [Brain v] ◀ and set cursor to next row

Range Finder Sensing#

Range Finder Found Object?#

The Range Finder Found Object? block is used to report if the Ultrasonic Range Finder Sensor sees an object within its field of view.

VEXcode blocks stack of code containing a RangeFinderE found an object? block.#
  <[RangeFinderE v] found an object?>

The Range Finder Found Object? block reports True when the Ultrasonic Range Finder Sensor sees an object or surface within its field of view.

The Range Finder Found Object? block reports False when the Ultrasonic Range Finder Sensor does not detect an object or surface.

Choose which Ultrasonic Range Finder Sensor to use.

Image of an ultrasonic range finder sensor detecting an object within its field of view.

In this example, every .25 seconds the Range Finder will check if it detects an object, and if so, will print the distance between it and the object to the Brain’s Screen.

VEXcode blocks stack of code containing a when started block, a forever block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if RangeFinderE found an object? then block, a print RangeFinderE distance in mm on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  clear all rows on [Brain v] 
  set cursor to row (1) column (1) on Brain
  [Check if the Range Finder found an object.]
  if <[RangeFinderE v] found an object?> then 
  [Print the distance to the object.]
  print ([RangeFinderE v] distance in [mm v]) on [Brain v]  ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds.]
  wait (0.25) seconds
  end

Range Finder Distance#

The Range Finder Distance block is used to report the distance of the nearest object from the Ultrasonic Range Finder Sensor.

Diagram illustrating the Range Finder Distance block used to measure object distance with an Ultrasonic sensor.

Choose which Ultrasonic Range Finder Sensor to use.

Image of a range finder distance device used for measuring object distance in robotics applications.

Choose which unit to report in: millimeters (mm) or inches.

Image showing distance measurement units for a range finder sensor, illustrating various distance reporting formats.

In this example, every .25 seconds the Range Finder will check if it detects an object, and if so, will print the distance between it and the object to the Brain’s Screen.

  when started :: hat events
  forever
  clear all rows on [Brain v] 
  set cursor to row (1) column (1) on Brain
  [Check if the Range Finder found an object.]
  if <[RangeFinderE v] found an object?> then 
  [Print the distance to the object.]
  print ([RangeFinderE v] distance in [mm v]) on [Brain v]  ◀ and set cursor to next row
  end
  [Repeat the process every .25 seconds.]
  wait (0.25) seconds
  end

Digital In Sensing#

Digital In#

The Digital In block is used to report if the Digital In signal is high.

VEXcode blocks stack of code containing a DigitalInA signal high? block.#
  <[DigitalInA v] signal high?>

The 3-Wire ports function at a 5V logic signal voltage level.

The Digital In block reports True when the digital input signal is high.

The Digital In block reports False when the digital input signal is low.

Choose which Digital In device to use.

Diagram illustrating digital input and output functions of a robotic brain, including timer, button, and sensor readings.
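
As a minimal sketch, assuming a Digital In device named DigitalInA is configured on a 3-Wire port, a project could wait for the input signal to go high before printing a message:

  when started :: hat events
  [Don't print the message until the digital input signal is high.]
  wait until <[DigitalInA v] signal high?>
  print [DigitalInA signal is high.] on [Brain v] ◀ and set cursor to next row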

Digital Out Sensing#

Digital Out#

The Digital Out block is used to set the logic level of a digital out 3-Wire port.

VEXcode blocks stack of code containing a DigitalOutA Low block.#
  set [DigitalOutA v] [low v] :: custom-sensing

The 3-Wire ports function at a 5V logic signal voltage level.

Choose which Digital Out port to use.

Illustration of digital output device used in robotics for controlling signals and device interactions.

Choose what to output: a low or high digital logic signal.

Diagram illustrating the Digital Out signal block in the VEX EXP Brain programming environment.
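
As a minimal sketch, assuming a Digital Out device named DigitalOutA is configured on a 3-Wire port, a project could pulse the output high for one second and then set it back low:

  when started :: hat events
  [Drive the digital output high for 1 second, then set it low.]
  set [DigitalOutA v] [high v] :: custom-sensing
  wait (1) seconds
  set [DigitalOutA v] [low v] :: custom-sensing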