Sensing#

Brain Sensing#

Reset Timer#

The Reset Timer block is used to reset the IQ Brain’s timer.

  reset timer

The Brain’s timer starts counting at the beginning of each project. The Reset Timer block sets the timer back to 0 seconds.

In this example, the Brain will wait 2 seconds, print the timer’s value, reset the timer, and then print the timer’s new value.

  when started :: hat events
  wait (2) seconds
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row
  reset timer
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row

Timer Value#

The Timer Value block is used to report the value of the IQ Brain’s timer in seconds.

  (timer in seconds)

The timer starts at 0 seconds when the program starts, and the block reports its value as a decimal number.

In this example, the Brain will wait 2 seconds, print the timer’s value, reset the timer, and then print the timer’s new value.

  when started :: hat events
  wait (2) seconds
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row
  reset timer
  print (timer in seconds) on [Brain v] ◀ and set cursor to next row 

Cursor Column#

The Cursor Column block is used to report the column number of the IQ Brain’s screen cursor location.

  (cursor column)

The Cursor Column block will report a value from 1-80 and will start on column 1 at the start of a project.

In this example, the Brain will print the number of the column the cursor is currently on.

  when started :: hat events
  print (cursor column) on [Brain v] ◀ and set cursor to next row

Cursor Row#

The Cursor Row block is used to report the row number of the IQ Brain’s screen cursor location.

  (cursor row)

The Cursor Row block will report a value from 1-9 and will start on row 1 at the start of a project.

In this example, the Brain will print the number of the row the cursor is currently on.

  when started :: hat events
  print (cursor row) on [Brain v] ◀ and set cursor to next row

Brain Button Pressed#

The Brain Button Pressed block is used to report if a button on the VEX IQ Brain is pressed.

  <Brain [Up v] button pressed?>

The Brain Button Pressed block reports True when the selected Brain button is pressed.

The Brain Button Pressed block reports False when the selected Brain button is not pressed.

Choose which Brain button to use on the IQ Brain.

The dropdown menu on the Brain Button Pressed block shows the options Up, Down, and Check, with Up selected.

In this example, the Brain will print a message on its screen the first time the Down Brain button is pressed.

  when started :: hat events
  [Don't print the message until the Down Brain button is pressed.]
  wait until <Brain [Down v] button pressed?>
  print [Down Brain button pressed!] on [Brain v] ▶

Battery Capacity#

The Battery Capacity block is used to report the charge level of the IQ Brain’s battery.

  (battery capacity in %)

The Battery Capacity block reports a range from 0% to 100%.

In this example, the Brain will print its current battery charge on the Brain’s screen.

  when started :: hat events
  print (battery capacity in %) on [Brain v] ◀ and set cursor to next row

Controller Sensing#

Controller Pressed#

The Controller Pressed block is used to report if a button on the IQ Controller is pressed.

  <Controller [E Up v] pressed?>

The Controller Pressed block reports True when the selected Controller button is pressed.

The Controller Pressed block reports False when the selected Controller button is not pressed.

Choose which Controller button to use.

The dropdown menu on the Controller Pressed block lists the Controller buttons, including E Up, E Down, F Up, F Down, and L Up, with E Up selected.

In this example, the Brain will print a message on its screen the first time the R Up button on the controller is pressed.

  when started :: hat events
  [Don't print the message until the R Up button is pressed.]
  wait until <Controller [R Up v] pressed?>
  print [R Up Button pressed.] on [Brain v] ◀ and set cursor to next row

Position of Controller#

The Position of Controller block is used to report the position of a joystick on the IQ Controller along an axis.

  (Controller [A v] position)

The Position of Controller block reports a range from -100 to 100.

The Position of Controller block reports 0 when the joystick axis is centered.

Choose the joystick’s axis.

The dropdown menu on the Position of Controller block shows the joystick axes A, B, C, and D, with A selected.

In this example, the Brain will print the C axis of the IQ Controller’s joysticks.

  when started :: hat events
  print (Controller [C v] position) on [Brain v] ◀ and set cursor to next row

Controller Enable/Disable#

The Controller Enable/Disable block is used to enable or disable the Controller actions configured in the Devices menu.

  controller [Disable v]

Choose to either enable or disable the configured Controller actions. By default, the Controller is Enabled in every project.

The dropdown menu on the Controller Enable/Disable block shows the options Disable and Enable, with Disable selected.

In this example, the Controller will be disabled at the start of the project and re-enabled after the Drivetrain has driven forward 6 inches.

  when started :: hat events
  controller [Disable v]
  drive [forward v] for (6) [inches v] ▶
  controller [Enable v]

Motor Sensing#

Motor is Done?#

The Motor is Done? block is used to report if the selected IQ Smart Motor or Motor Group has completed its movement.

  <[Motor7 v] is done?>

The Motor is Done? block reports True when the selected Motor or Motor Group has completed its movement.

The Motor is Done? block reports False when the selected Motor or Motor Group is still moving.

Choose which Motor or Motor Group to use.

The dropdown menu shows the available Motors and Motor Groups, with Motor7 selected.
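As a sketch in the same block style (the Motor7 name and the 720-degree movement are assumptions for illustration), this block is commonly paired with a Wait Until block after starting a movement without waiting:

  when started :: hat events
  [Start a movement without waiting for it to finish.]
  spin [Motor7 v] [forward v] for (720) [degrees v] ◀ and don't wait
  [Pause the stack until the movement completes.]
  wait until <[Motor7 v] is done?>
  print [Motor7 finished its movement.] on [Brain v] ▶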

Motor is Spinning?#

The Motor is Spinning? block is used to report if the selected IQ Smart Motor or Motor Group is currently moving.

  <[Motor7 v] is spinning?>

The Motor is Spinning? block reports True when the selected Motor or Motor Group is moving.

The Motor is Spinning? block reports False when the selected Motor or Motor Group is not moving.

Choose which Motor or Motor Group to use.

The dropdown menu shows the available Motors and Motor Groups, with Motor7 selected.
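As a sketch in the same block style (the Motor7 name and the 360-degree movement are assumptions for illustration), this block can be used to wait until a motor stops moving:

  when started :: hat events
  [Start a movement without waiting for it to finish.]
  spin [Motor7 v] [forward v] for (360) [degrees v] ◀ and don't wait
  [Pause the stack until the motor is no longer moving.]
  wait until <not <[Motor7 v] is spinning?>>
  print [Motor7 stopped spinning.] on [Brain v] ▶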

Position of Motor#

The Position of Motor block is used to report the distance an IQ Smart Motor or the first motor of a Motor Group has traveled.

  ([Motor7 v] position in [degrees v])

Choose which Motor or Motor Group to use.

The first dropdown menu shows the available Motors and Motor Groups, with Motor7 selected.

Choose the units to report in, degrees or turns.

The second dropdown menu shows the units degrees and turns, with degrees selected.

In this example, the Motor will spin forward for 1 second before its current position is printed on the Brain’s screen.

  when started :: hat events
  [Spin Motor7 forward for 1 second.]
  spin [Motor7 v] [forward v]
  wait (1) seconds
  [Print Motor7's current position in degrees.]
  print ([Motor7 v] position in [degrees v]) on [Brain v] ▶

Velocity of Motor#

The Velocity of Motor block is used to report the current velocity of an IQ Smart Motor or the first motor of a Motor Group.

  ([Motor7 v] velocity in [% v])

The Velocity of Motor block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose which Motor or Motor Group to use.

The first dropdown menu shows the available Motors and Motor Groups, with Motor7 selected.

Choose the units to report in, percent (%) or rpm.

The second dropdown menu shows the units % and rpm, with % selected.

In this example, the Motor will spin forward for 1 second before its current velocity is printed on the Brain’s screen.

  when started :: hat events
  [Spin Motor 7 forward for 1 second.]
  spin [Motor7 v] [forward v]
  wait (1) seconds
  [Print Motor7's current velocity in rpm.]
  print ([Motor7 v] velocity in [rpm v]) on [Brain v] ▶

Current of Motor#

The Current of Motor block is used to report the amount of current an IQ Smart Motor or Motor Group is drawing in amperes (amps).

  ([Motor7 v] current in [amps v])

Choose which Motor or Motor Group to use.

The dropdown menu shows the available Motors and Motor Groups, with Motor7 selected.

In this example, the Motor will spin forward for 1 second before its current is printed on the Brain’s screen.

  when started :: hat events
  [Spin Motor7 forward for 1 second.]
  spin [Motor7 v] [forward v]
  wait (1) seconds
  [Print Motor7's current draw in amps.]
  print ([Motor7 v] current in [amps v]) on [Brain v] ▶

Drivetrain Sensing#

Drive is Done?#

The Drive is Done? block is used to report if the Drivetrain has completed its movement.

  <drive is done?>

The Drive is Done? block reports True when the Drivetrain’s motors have completed their movement.

The Drive is Done? block reports False when the Drivetrain’s motors are still moving.
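This section has no worked example; a sketch in the same block style (the 6-inch movement is an assumption for illustration) shows the common pattern of starting a movement without waiting, then pausing until it completes:

  when started :: hat events
  [Start a drive movement without waiting for it to finish.]
  drive [forward v] for (6) [inches v] ◀ and don't wait
  [Pause the stack until the Drivetrain completes the movement.]
  wait until <drive is done?>
  print [Drivetrain movement complete.] on [Brain v] ▶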

Drive is Moving?#

The Drive is Moving? block is used to report if the Drivetrain is currently moving.

  <drive is moving?>

The Drive is Moving? block reports True when the Drivetrain’s motors are moving.

The Drive is Moving? block reports False when the Drivetrain’s motors are not moving.
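As a sketch in the same block style (the 6-inch movement is an assumption for illustration), this block can be used to wait until the Drivetrain stops moving:

  when started :: hat events
  [Start a drive movement without waiting for it to finish.]
  drive [forward v] for (6) [inches v] ◀ and don't wait
  [Pause the stack until the Drivetrain stops moving.]
  wait until <not <drive is moving?>>
  print [Drivetrain stopped moving.] on [Brain v] ▶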

Drive Heading#

The Drive Heading block is used to report the direction that the Drivetrain is facing by using the Inertial sensor’s current angular position.

  (drive heading in degrees)

The Drive Heading block reports a range from 0.0 to 359.99 degrees.

In this example, the Drivetrain will turn to the right for 1 second before its current heading is printed on the Brain’s screen.

  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print Drivetrain's current heading after 1 second.]
  print (drive heading in degrees) on [Brain v] ◀ and set cursor to next row

Drive Rotation#

The Drive Rotation block is used to report the Drivetrain’s angle of rotation.

  (drive rotation in degrees)

A clockwise direction is reported as a positive value, and a counterclockwise direction is reported as a negative value.

In this example, the Drivetrain will turn to the right for 1 second before its current rotation is printed on the Brain’s screen.

  when started :: hat events
  [Turn towards the right for 1 second.]
  turn [right v]
  wait (1) seconds
  [Print Drivetrain's current rotation after 1 second.]
  print (drive rotation in degrees) on [Brain v] ◀ and set cursor to next row

Drive Velocity#

The Drive Velocity block is used to report the current velocity of the Drivetrain.

  (drive velocity in [% v])

The Drive Velocity block reports a range from -100% to 100% or -600rpm to 600rpm.

Choose the units to report in, percent (%) or rpm.

The dropdown menu shows the units % and rpm, with % selected.

In this example, the Drivetrain will drive forward for 1 second before its current velocity is printed on the Brain’s screen.

  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current velocity after 1 second.]
  print (drive velocity in [% v]) on [Brain v] ◀ and set cursor to next row

Drive Current#

The Drive Current block is used to report the amount of electrical current that the Drivetrain’s motors are drawing, in amperes (amps).

  (drive current in amps)

In this example, the Drivetrain will drive forward for 1 second before its current is printed on the Brain’s screen.

  when started :: hat events
  [Drive forward for 1 second.]
  drive [forward v]
  wait (1) seconds
  [Print Drivetrain's current after 1 second.]
  print (drive current in amps) on [Brain v] ◀ and set cursor to next row

Bumper Sensing#

Bumper Pressed#

The Bumper Pressed block is used to report if the Bumper Switch is pressed.

  <[Bumper8 v] pressed?>

The Bumper Pressed block reports True when the selected Bumper Switch is pressed.

The Bumper Pressed block reports False when the selected Bumper Switch is not pressed.

Choose which Bumper Switch to use.

The dropdown menu shows the available Bumper Switches, with Bumper8 selected.

In this example, the Brain will print a message on its screen the first time the Bumper Switch is pressed.

  when started :: hat events
  [Don't print the message until the bumper is pressed.]
  wait until <[Bumper8 v] pressed?>
  print [Bumper was pressed.] on [Brain v] ▶

Touch LED Sensing#

Touch LED Pressed#

The Touch LED Pressed block is used to report if the Touch LED is pressed.

  <[TouchLED11 v] pressed?>

The Touch LED Pressed block reports True when the selected Touch LED is pressed.

The Touch LED Pressed block reports False when the selected Touch LED is not pressed.

Choose which Touch LED to use.

The dropdown menu shows the available Touch LEDs, with TouchLED11 selected.

In this example, the Brain will print a message on its screen the first time the Touch LED is pressed.

  when started :: hat events
  [Don't print the message until the Touch LED Sensor is pressed.]
  wait until <[TouchLED11 v] pressed?>
  print [Touch LED was pressed.] on [Brain v] ◀ and set cursor to next row

Gyro Sensing#

Calibrate Gyro#

The Calibrate Gyro block is used to calibrate the VEX IQ Gyro Sensor, reducing the amount of drift the Gyro Sensor generates.

The Gyro Sensor must remain still during the calibration process.

  calibrate [Gyro5 v] for [2 v] seconds

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

Choose how long to calibrate the Gyro Sensor for:

  • 2 seconds

  • 4 seconds

  • 8 seconds

The duration dropdown menu shows the options 2, 4, and 8 seconds, with 2 selected.

In this example, the Gyro Sensor will calibrate for 2 seconds before its current heading is printed.

  when started :: hat events
  calibrate [Gyro5 v] for [2 v] seconds
  print ([Gyro5 v] heading in degrees) on [Brain v] ◀ and set cursor to next row

Set Heading#

The Set Heading block is used to set the Gyro Sensor’s current heading to the specified value.

  set [Gyro5 v] heading to (0) degrees

The Set Heading block accepts a range of 0.0 to 359.99 degrees.

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

In this example, the program will print the Gyro Sensor’s starting heading, set its heading to 90 degrees, and then print the new heading.

  when started :: hat events
  print ([Gyro5 v] heading in degrees) on [Brain v] ◀ and set cursor to next row
  set [Gyro5 v] heading to (90) degrees
  print ([Gyro5 v] heading in degrees) on [Brain v] ◀ and set cursor to next row

Set Rotation#

The Set Rotation block is used to set the Gyro Sensor’s current rotation to the specified value.

  set [Gyro5 v] rotation to (0) degrees

The Set Rotation block accepts any positive or negative decimal or integer number.

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

In this example, the program will print the Gyro Sensor’s starting rotation, set its rotation to -100 degrees, and then print the new rotation.

  when started :: hat events
  print ([Gyro5 v] rotation in degrees) on [Brain v] ◀ and set cursor to next row
  set [Gyro5 v] rotation to (-100) degrees
  print ([Gyro5 v] rotation in degrees) on [Brain v] ◀ and set cursor to next row

Angle of Heading#

The Angle of Heading block is used to report the VEX IQ Gyro Sensor’s current heading in degrees.

  ([Gyro5 v] heading in degrees)

The Angle of Heading block reports a range from 0.0 to 359.99 degrees.

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

In this example, the program will print the Gyro Sensor’s starting heading, set its heading to 90 degrees, and then print the new heading.

  when started :: hat events
  print ([Gyro5 v] heading in degrees) on [Brain v] ◀ and set cursor to next row
  set [Gyro5 v] heading to (90) degrees
  print ([Gyro5 v] heading in degrees) on [Brain v] ◀ and set cursor to next row

Angle of Rotation#

The Angle of Rotation block is used to report the VEX IQ Gyro Sensor’s current rotation in degrees.

  ([Gyro5 v] rotation in degrees)

A clockwise direction is reported as a positive value, and a counterclockwise direction is reported as a negative value.

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

In this example, the program will print the Gyro Sensor’s starting rotation, set its rotation to -100 degrees, and then print the new rotation.

  when started :: hat events
  print ([Gyro5 v] rotation in degrees) on [Brain v] ◀ and set cursor to next row
  set [Gyro5 v] rotation to (-100) degrees
  print ([Gyro5 v] rotation in degrees) on [Brain v] ◀ and set cursor to next row

Rate of Gyro#

The Rate of Gyro block is used to report the VEX IQ Gyro Sensor’s rate of angular velocity.

  ([Gyro5 v] rate in dps)

The Rate of Gyro block reports a range from 0 to 249.99 dps.

Choose which Gyro Sensor to use.

The dropdown menu shows the available Gyro Sensors, with Gyro5 selected.

In this example, the current Gyro Sensor’s rate will be printed to the Brain’s screen.

  when started :: hat events
  print ([Gyro5 v] rate in dps) on [Brain v] ◀ and set cursor to next row

Color Sensing#

Color Sensor Found an Object#

The Color Sensor Found an Object block is used to report if the VEX IQ Color Sensor detects an object.

  <[Color12 v] found an object?>

The Color Sensor Found an Object block reports True when the Color Sensor detects an object or surface close to the front of the sensor.

The Color Sensor Found an Object block reports False when the Color Sensor does not detect an object or surface close to the front of the sensor.

Choose which Color Sensor to use.

The dropdown menu shows the available Color Sensors, with Color12 selected.

In this example, when the Color Sensor detects an object, it will print a message to the Brain.

  when started :: hat events
  [Don't print the message until the Color Sensor detects an object.]
  wait until <[Color12 v] found an object?>
  print [Color Sensor detected an object.] on [Brain v] ◀ and set cursor to next row

Color Sensor Detects Color#

The Color Sensor Detects Color block is used to report if the VEX IQ Color Sensor detects a specific color.

  <[Color12 v] detects [red v] ?>

The Color Sensor Detects Color block reports True when the Color Sensor detects the selected color.

The Color Sensor Detects Color block reports False when the Color Sensor detects a different color than the one selected.

Choose which Color Sensor to use.

The sensor dropdown menu shows the available Color Sensors, with Color12 selected.

Choose which color to detect.

The color dropdown menu lists the detectable colors, including purple, red violet, violet, blue violet, and blue green.

In this example, when the Color Sensor detects the color green, it will print the detected color to the Brain.

  when started :: hat events
  [Don't print the message until the Color Sensor detects the color green.]
  wait until <[Color12 v] detects [green v] ?>
  print ([Color12 v] color name) on [Brain v] ◀ and set cursor to next row

Color Sensor Color Name#

The Color Sensor Color Name block is used to report the name of the color detected by the VEX IQ Color Sensor.

  ([Color12 v] color name)

Choose which Color Sensor to use.

The dropdown menu shows the available Color Sensors, with Color12 selected.

In this example, when the Color Sensor detects the color green, it will print the detected color to the Brain.

  when started :: hat events
  [Don't print the message until the Color Sensor detects the color green.]
  wait until <[Color12 v] detects [green v] ?>
  print ([Color12 v] color name) on [Brain v] ◀ and set cursor to next row

Color Sensor Brightness#

The Color Sensor Brightness block is used to report the amount of light detected by the VEX IQ Color Sensor.

  ([Color12 v] brightness in %)

Choose which Color Sensor to use.

The dropdown menu shows the available Color Sensors, with Color12 selected.

In this example, the Color Sensor will print the current brightness to the Brain.

  when started :: hat events
  print ([Color12 v] brightness in %) on [Brain v] ◀ and set cursor to next row

Color Sensor Hue#

The Color Sensor Hue block is used to report the hue of the color detected by the VEX IQ Color Sensor.

The Color Sensor Hue block reports a range from 0 to 360.

  ([Color12 v] hue in degrees)

Choose which Color Sensor to use.

The dropdown menu shows the available Color Sensors, with Color12 selected.

In this example, the Color Sensor will print the current hue to the Brain.

  when started :: hat events
  print ([Color12 v] hue in degrees) on [Brain v] ◀ and set cursor to next row

Distance Sensing#

IQ (1st gen)#

To use the IQ (1st gen) Distance Sensing blocks, you must be using an IQ (1st gen) Distance Sensor.

Distance Sensor Found Object#

The Distance Sensor Found Object block is used to report if the Distance Sensor sees an object within its field of view.

  <[Distance9 v] found an object?>

The Distance Sensor Found Object block reports True when the Distance Sensor sees an object or surface within its field of view.

The Distance Sensor Found Object block reports False when the Distance Sensor does not detect an object or surface.

Choose which Distance Sensor to use.

The dropdown menu shows the available Distance Sensors, with Distance9 selected.

In this example, when the Distance Sensor detects an object, it will print the distance between it and the detected object.

  when started :: hat events
  [Don't print the message until the Distance Sensor detects an object.]
  wait until <[Distance9 v] found an object?>
  print ([Distance9 v] distance in [mm v]) on [Brain v] ◀ and set cursor to next row

Object Distance#

The Object Distance block is used to report the distance of the nearest object from the Distance Sensor.

  ([Distance9 v] distance in [mm v])

The Object Distance block reports a range from 24mm to 1000mm or 1 inch to 40 inches.

Choose which Distance Sensor to use.

The dropdown menu shows the available Distance Sensors, with Distance9 selected.

Choose what units to report in: millimeters (mm) or inches.

The image shows two connected elements from a block-based coding environment. The top element is a light blue pill-shaped block containing three segments: "Distance9" with a downward-facing arrow, "distance in" as static text, and "mm" with another downward-facing arrow. Below this, connected by a small arrow to the "mm" segment, is a rectangular light blue block with rounded corners. This lower block displays a dropdown menu with two options: "mm" (millimeters) with a checkmark, indicating it's currently selected, and "inches" below it. This arrangement demonstrates how the unit of measurement for the Distance9 sensor can be selected, offering a choice between millimeters and inches.

In this example, when the Distance Sensor detects an object, it will print the distance between it and the detected object.

  when started :: hat events
  [Don't print the message until the Distance Sensor detects an object.]
  wait until <[Distance9 v] found an object?>
  print ([Distance9 v] distance in [mm v]) on [Brain v] ◀ and set cursor to next row

IQ (2nd gen)#

To use the IQ (2nd gen) Distance Sensing blocks, you must be using an IQ (2nd gen) Distance Sensor.

Object Distance#

The Object Distance block is used to report the distance of the nearest object from the Distance Sensor.

  ([Distance10 v] object distance in [mm v])

The Object Distance block reports a range from 20mm to 2000mm.

Choose which Distance Sensor to use.

The image shows two connected elements from a block-based coding environment. The top element is a light blue pill-shaped block containing three segments: "Distance10" with a downward-facing arrow, "object distance in" as static text, and "mm" with another downward-facing arrow. Below this, connected by a small arrow, is a rectangular light blue block with rounded corners. This lower block contains the text "Distance10" preceded by a checkmark, suggesting it's a selected option from the dropdown menu in the block above. The arrangement demonstrates how the Distance10 sensor is selected for measuring object distance, likely in millimeters, in this coding interface.

Choose what units to report in: millimeters (mm) or inches.

The image shows two connected elements from a block-based coding environment. The top element is a light blue pill-shaped block containing three segments: "Distance10" with a downward-facing arrow, "object distance in" as static text, and "mm" with another downward-facing arrow. Below this, connected by a small arrow to the "mm" segment, is a rectangular light blue block with rounded corners. This lower block displays a dropdown menu with two options: "mm" (millimeters) with a checkmark, indicating it's currently selected, and "inches" below it. This arrangement demonstrates how the unit of measurement for the Distance10 sensor's object distance can be selected, offering a choice between millimeters and inches.

In this example, when the Distance Sensor detects an object, it will print the distance between it and the detected object.

  when started :: hat events
  [Don't print the message until the Distance Sensor detects an object.]
  wait until <[Distance10 v] found an object?>
  print ([Distance10 v] object distance in [mm v]) on [Brain v] ◀ and set cursor to next row

Object Velocity#

The Object Velocity block is used to report the current velocity of an object in meters per second (m/s).

  ([Distance10 v] object velocity in m/s)

Choose which Distance Sensor to use.

The image shows a rounded light blue block from a block-based coding environment with the text "Distance10 object velocity in m/s." Below this block is a drop-down menu in light blue, with a checkmark next to "Distance10," indicating that this option is currently selected. The block is used to obtain or specify the velocity of an object detected by the Distance10 sensor, measured in meters per second (m/s).

In this example, the Distance Sensor will report the current velocity of an object moving in front of it.

  when started :: hat events
  print ([Distance10 v] object velocity in m/s) on [Brain v] ◀ and set cursor to next row
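
The sensor computes this velocity internally, but the underlying idea is the change in measured distance over time. A hedged Python sketch of that calculation (illustrative only; the sensor's firmware does this for you):

```python
def estimate_velocity_mps(prev_mm, curr_mm, dt_s):
    """Estimate object velocity in m/s from two distance samples
    taken dt_s seconds apart.

    Positive values mean the object is moving away from the sensor;
    negative values mean it is approaching.
    """
    return ((curr_mm - prev_mm) / 1000.0) / dt_s
```

For instance, an object that moves from 500mm to 400mm away over 0.1 seconds is approaching at 1 m/s.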

Object Size Is#

The Object Size Is block is used to report if the Distance Sensor detects the specified object size.

  <[Distance10 v] object size is [small v] ?>

The Distance Sensor determines the size of the object detected (none, small, medium, large) based on the amount of light reflected and returned to the sensor.

The Object Size Is block reports True when the Distance Sensor detects the specified size.

The Object Size Is block reports False when the Distance Sensor doesn’t detect the specified size.
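
The classification described above can be thought of as a simple thresholding step on the amount of reflected light. The cutoffs below are invented for illustration; the real sensor's thresholds are internal to its firmware:

```python
def classify_object_size(reflection_pct):
    """Classify an object's size from the percentage of emitted
    light reflected back to the sensor.

    The 33%/66% thresholds are made-up illustrative values, not
    VEX's actual cutoffs.
    """
    if reflection_pct <= 0:
        return "none"
    if reflection_pct < 33:
        return "small"
    if reflection_pct < 66:
        return "medium"
    return "large"
```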

Choose which Distance Sensor to use.

The image shows a hexagonal light blue block from a block-based coding environment with the text "Distance10 object size is small?" Below this block is a drop-down menu in light blue, with a checkmark next to "Distance10," indicating that this option is currently selected. The block is used to check if the object detected by the Distance10 sensor matches the specified size, which in this case is set to "small."

Choose which size of object you want the Distance Sensor to check for.

  • small

  • medium

  • large

The image shows a hexagonal light blue block from a block-based coding environment with the text "Distance10 object size is small?" Below this block is a drop-down menu in light blue, showing options for "small," "medium," and "large," with "small" currently selected. The block is used to check if the object detected by the Distance10 sensor matches the specified size, which can be selected from the available options.

In this example, if the Distance Sensor detects a small object, it will drive forward until the object is large.

  when started :: hat events
  [Check if the Distance Sensor sees a small object.]
  if <[Distance10 v] object size is [small v] ?> then
  [If a small object is detected, drive forward.]
  drive [forward v]
  [Wait until the small detected object is large.]
  wait until <[Distance10 v] object size is [large v] ?>
  [When the object size is large, stop driving.]
  stop driving
  end

Distance Sensor Found Object#

The Distance Sensor Found Object block is used to report if the Distance Sensor sees an object within its field of view.

  <[Distance10 v] found an object?>

The Distance Sensor Found Object block reports True when the Distance Sensor sees an object or surface within its field of view.

The Distance Sensor Found Object block reports False when the Distance Sensor does not detect an object or surface.

Choose which Distance Sensor to use.

The image displays a hexagonal block labeled "Distance10 found an object?" with a dropdown menu open below it showing "Distance10" as a selected option. This block is likely used in a coding environment to check whether the distance sensor, named "Distance10," has detected an object.

In this example, when the Distance Sensor detects an object, it will print the distance between it and the detected object.

  when started :: hat events
  [Don't print the message until the Distance Sensor detects an object.]
  wait until <[Distance10 v] found an object?>
  print ([Distance10 v] object distance in [mm v]) on [Brain v] ◀ and set cursor to next row

Optical Sensing#

Set Optical Light#

The Set Optical Light block is used to turn the light on the Optical Sensor on or off. The light helps the Optical Sensor detect objects in dark areas.

  set [Optical7 v] light [on v]

Choose which Optical Sensor to use.

The image shows a coding block used to control the light of an Optical Sensor. The block is rectangular with a slight notch on the left side, allowing it to connect with other blocks. The block's text reads "set Optical7 light on," where "Optical7" is selected from a dropdown menu that lists available optical sensors, and "on" is another dropdown menu option that allows the light to be turned "on" or "off." The dropdown menu for "Optical7" is currently open, showing the selected sensor.

Choose whether to turn the light on or off.

The image displays a rectangular coding block with a slight notch on the left, which is used to control the light of an Optical Sensor. The text within the block reads "set Optical7 light on." "Optical7" is selected from a dropdown menu, and the word "on" is also chosen from another dropdown menu that controls the state of the light. In the image, the dropdown menu for the light state is open, showing "on" as the selected option with "off" as an alternative choice.

In this example, the Optical Sensor will turn its light on for two seconds before turning it off.

  when started :: hat events
  set [Optical7 v] light [on v]
  wait (2) seconds
  set [Optical7 v] light [off v]

Set Optical Light Power#

The Set Optical Light Power block is used to set the brightness of the Optical Sensor's light.

  set [Optical7 v] light power to (50) %

The Set Optical Light Power block accepts a range of 0% to 100%. This changes the brightness of the light on the Optical Sensor. If the light is off, this block will turn the light on.
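
Since the block only accepts values from 0% to 100%, a program that computes a light power dynamically should keep the result inside that range. A small Python sketch of the clamping idea (illustrative; the block itself handles out-of-range input):

```python
def clamp_light_power(pct):
    """Clamp a requested light power to the 0-100% range the
    Set Optical Light Power block accepts."""
    return max(0, min(100, pct))
```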

Choose which Optical Sensor to use.

The image shows a blue rectangular block with slight notches on the left side. The block contains the command "set Optical7 light power to 50%," with a dropdown menu open for selecting the device (Optical7 in this case). This block is used to adjust the light power of the Optical7 sensor to 50 percent.

In this example, the Optical Sensor's light power is set to 75% before the program waits for an object to be detected and then prints a message.

  when started :: hat events
  set [Optical7 v] light power to (75) %
  [Wait until the Optical Sensor detects an object.]
  wait until <[Optical7 v] found an object?>
  print [Object detected.] on [Brain v] ◀ and set cursor to next row

Optical Sensor Found Object#

The Optical Sensor Found Object block is used to report if the Optical Sensor detects an object close to it.

  <[Optical7 v] found an object?>

The Optical Sensor Found Object block reports True when the Optical Sensor detects an object close to it.

The Optical Sensor Found Object block reports False when an object is not within range of the Optical Sensor.

Choose which Optical Sensor to use.

The image shows a hexagonal block with a dropdown menu. The block reads "Optical7 found an object?" with the "Optical7" dropdown currently selected. This block is used in programming to check if the Optical Sensor labeled "Optical7" has detected an object.

In this example, the Optical Sensor's light power is set to 75% before the program waits for an object to be detected and then prints a message.

  when started :: hat events
  set [Optical7 v] light power to (75) %
  [Wait until the Optical Sensor detects an object.]
  wait until <[Optical7 v] found an object?>
  print [Object detected.] on [Brain v] ◀ and set cursor to next row

Optical Sensor Detects Color#

The Optical Sensor Detects Color block is used to report if the Optical Sensor detects the specified color.

  <[Optical7 v] detects [red v] ?>

The Optical Sensor Detects Color block reports True when the Optical Sensor detects the specified color.

The Optical Sensor Detects Color block reports False when the Optical Sensor doesn’t detect the specified color.

Choose which Optical Sensor to use.

The image shows a hexagonal block labeled "Optical7 detects red?" with a dropdown menu. The dropdown menu is currently open, showing "Optical7" as the selected sensor. This block is used to check if the specified Optical Sensor (Optical7) detects the color red, with the option to change the color and the sensor via dropdown menus.

Choose which color the Optical Sensor will check for.

The image shows a hexagonal block labeled "Optical7 detects red?" with a dropdown menu. The dropdown menu is currently open, displaying color options "red," "green," and "blue," with "red" selected. This block is used to check if the Optical Sensor detects a specific color, with the ability to change the detected color via the dropdown menu.

In this example, the Optical Sensor will wait until it detects a red color before printing the color on the Brain’s screen.

  when started :: hat events
  [Wait until the Optical Sensor detects the color red.]
  wait until <[Optical7 v] detects [red v] ?>
  print [Color red detected.] on [Brain v] ◀ and set cursor to next row

Optical Sensor Color Name#

The Optical Sensor Color Name block is used to report the name of the color detected by the VEX IQ Optical Sensor.

  ([Optical4 v] color name)

The Color Name block reports one of the following colors:

  • red

  • green

  • blue

  • yellow

  • orange

  • purple

  • cyan

Choose which Optical Sensor to use.

The image shows a rounded blue block labeled "Optical4 color name," with a dropdown menu that is currently set to "Optical4." The dropdown menu is highlighted, indicating that it is currently selected and allows the user to choose the specific sensor device for retrieving the color name detected by the Optical Sensor.

In this example, the Optical Sensor will wait until it detects the color red before printing the name of the detected color on the Brain's screen.

  when started :: hat events
  [Wait until the Optical Sensor detects the color red.]
  wait until <[Optical4 v] detects [red v] ?>
  print ([Optical4 v] color name) on [Brain v] ◀ and set cursor to next row

Optical Brightness#

The Optical Brightness block is used to report the amount of light detected by the Optical Sensor.

  ([Optical7 v] brightness in %)

The Optical Brightness block reports a number value from 0% to 100%.

A large amount of light detected will report a high brightness value.

A small amount of light detected will report a low brightness value.

Choose which Optical Sensor to use.

The image shows a rounded blue block labeled "Optical7 brightness in %." This block appears to be part of a visual coding interface, where "Optical7" is selected from a dropdown menu, and the block is likely used to get or set the brightness level of an optical sensor named "Optical7," with the brightness value expressed as a percentage.

In this example, the Optical Sensor will print the current brightness value to the Brain’s screen.

  when started :: hat events
  print ([Optical7 v] brightness in %) on [Brain v] ◀ and set cursor to next row

Optical Hue#

The Optical Hue block is used to report the hue of the color of an object.

  ([Optical7 v] hue in degrees)

The Optical Hue block reports the hue of the color of a detected object as a number from 0 to 359.

The value can be thought of as the location of the color on a color wheel in degrees.
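
Because hue is a position on a color wheel, nearby hue values correspond to similar colors. A Python sketch of mapping a hue reading to a rough color name (the band boundaries below are illustrative guesses, not VEX's exact cutoffs):

```python
def hue_to_color_name(hue_deg):
    """Map a 0-359 hue (degrees on a color wheel) to a rough
    color name. Band boundaries are illustrative only."""
    hue = hue_deg % 360
    if hue < 15 or hue >= 345:
        return "red"
    if hue < 45:
        return "orange"
    if hue < 75:
        return "yellow"
    if hue < 165:
        return "green"
    if hue < 195:
        return "cyan"
    if hue < 270:
        return "blue"
    return "purple"
```

For example, a hue reading near 0 degrees indicates red, near 120 degrees indicates green, and near 240 degrees indicates blue.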

Choose which Optical Sensor to use.

The image shows a blue rounded block with the text "Optical7 hue in degrees" and a dropdown menu that allows you to select "Optical7" as the sensor. This block is used in a visual coding interface to measure or display the hue detected by the "Optical7" sensor in degrees. The dropdown menu is expanded, showing the "Optical7" option selected.

In this example, the Optical Sensor will print the currently seen hue to the Brain’s screen.

  when started :: hat events
  print ([Optical7 v] hue in degrees) on [Brain v] ◀ and set cursor to next row

Vision Sensing#

Take Vision Sensor Snapshot#

The Take Vision Sensor Snapshot block is used to take a snapshot from the Vision Sensor.

  take a [Vision1 v] snapshot of [SELECT_A_SIG v]

The Take Vision Sensor Snapshot block will capture the current image from the Vision Sensor to be processed and analyzed for color signatures and codes.

A snapshot is required first before using any other Vision Sensor blocks.

Choose which Vision Sensor to use.

The image shows a block from a visual coding environment. The block is rectangular with slightly rounded edges and is used to take a snapshot using a vision sensor labeled as "Vision1." Below the block, there is a dropdown menu that allows the user to select "Vision1" as the active sensor for taking the snapshot. The block is also set to take a snapshot of a selected signature, which is indicated by the placeholder text "SELECT_A_SIG." This block is typically used in a program to capture visual data from a specific sensor.

Select which vision signature to use. Vision signatures are configured from the Devices window.

The image shows a block from a visual coding environment with a dropdown menu expanded. The block is rectangular with slightly rounded edges and is used to take a snapshot using a vision sensor labeled as "Vision1." The dropdown menu below "snapshot of SELECT_A_SIG" is expanded, revealing two options: "REDBLOCK" and "GREENBLOCK." This block allows the user to select a specific signature (such as "REDBLOCK" or "GREENBLOCK") for the vision sensor to capture in the snapshot.

Set Vision Sensor Object Item#

The Set Vision Sensor Object Item block is used to select which detected object, out of the total number of objects detected, subsequent Vision Sensor blocks will report information about.

  set [Vision1 v] object item to (1)

Choose which Vision Sensor to use.

The image displays a block from a visual coding environment, with slightly rounded edges, used to set a specific object item for a vision sensor labeled "Vision1." The dropdown menu is expanded, showing that "Vision1" is selected. The block is configured to set the object item to "1." This block allows the user to specify which object item the vision sensor should track or reference during the program's execution.

Vision Sensor Object Count#

The Vision Sensor Object Count block is used to report how many objects the Vision Sensor detects.

  ([Vision1 v] object count)

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Count block reports a number of objects.

The Vision Sensor Object Count block only counts objects matching the signature used in the most recent snapshot.
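
The snapshot-then-read workflow can be illustrated with a small, hypothetical Python stand-in for the sensor (this stub is not the VEXcode API; it only models the caching behavior described above):

```python
class VisionSensorStub:
    """Hypothetical stand-in for a Vision Sensor, illustrating that
    readings always come from the most recent snapshot, never from
    a live feed."""

    def __init__(self):
        self._objects = []

    def take_snapshot(self, detected_objects):
        # Capture one frame and cache its detections until the
        # next snapshot replaces them.
        self._objects = list(detected_objects)

    def object_count(self):
        # Reports only what the last snapshot captured.
        return len(self._objects)
```

Note that before any snapshot is taken, the count is zero, which mirrors why a Take Vision Sensor Snapshot block must run before the count block reports anything useful.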

Choose which Vision Sensor to use.

The image shows a rounded block from a visual coding environment. This block is set up to use a vision sensor labeled "Vision1" to retrieve the "object count." The dropdown menu is open, showing the selection of the sensor. The block is used to count the number of objects detected by the specified vision sensor during the program's execution.

Vision Sensor Object Exists?#

The Vision Sensor Object Exists? block is used to report if the Vision Sensor detects a configured object.

  <[Vision1 v] object exists?>

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object Exists? block can detect any configured objects.

The Vision Sensor Object Exists? block reports True when the Vision Sensor detects a configured object.

The Vision Sensor Object Exists? block reports False when the Vision Sensor does not detect a configured object.

Choose which Vision Sensor to use.

The image shows a hexagonal block in a visual coding environment. The block is configured to check if an object exists, as detected by a vision sensor labeled "Vision1." The dropdown menu is open, confirming "Vision1" is selected. This block likely returns a boolean value indicating whether the vision sensor has identified an object within its field of view.

Vision Sensor Object#

The Vision Sensor Object block is used to report information about a detected object from the Vision Sensor.

  ([Vision1 v] object [width v])

The Take Vision Sensor Snapshot block will need to be used before the Vision Sensor Object block can report information about any detected objects.

Choose which Vision Sensor to use.

The image shows a rounded block in a visual coding environment. The block is configured to retrieve the width of an object detected by a vision sensor labeled "Vision1." A dropdown menu is expanded below the block, showing that "Vision1" is selected from the available sensors. This block will return a numeric value representing the width of the detected object as measured by the "Vision1" sensor.

Choose which property to report from the Vision Sensor:

  • width - How wide the object is in pixels, from 2 - 316 pixels.

  • height - How tall the object is in pixels, from 2 - 212 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 315 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 211 pixels.

  • angle - The angle of the detected object, from 0 - 180 degrees.
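
One common use of the centerX property is steering a robot toward a detected object by measuring how far the object sits from the middle of the frame. A Python sketch of that calculation, using the 316-pixel frame width from the ranges above (the helper name is illustrative, not a VEXcode block):

```python
FRAME_WIDTH_PX = 316  # Vision Sensor image width, per the centerX range

def center_offset(center_x):
    """Signed horizontal offset of an object's centerX from the
    frame midline, scaled from -1.0 (far left) to +1.0 (far right).
    A value near 0 means the object is roughly centered."""
    midline = FRAME_WIDTH_PX / 2
    return (center_x - midline) / midline
```

A steering routine could turn right when the offset is positive and left when it is negative, stopping when the offset is close to zero.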

The image displays a rounded block in a visual coding environment, configured to retrieve a specific property of an object detected by the "Vision1" sensor. The block is set to "width" by default, and a dropdown menu is expanded, revealing additional options including "height," "centerX," "centerY," and "angle." These options allow the user to select different attributes of the detected object for the block to return, depending on the needs of the code.