AI Vision Sensing#

The AI Vision Sensor must be connected to your V5 Brain and configured in VEXcode V5 before it can be used. Go here for information about Getting Started with the AI Vision Sensor with VEX V5.

Refer to these articles for more information about using the AI Vision Sensor.

For more detailed information about using the AI Vision Sensor with Blocks in VEXcode V5, read Coding with the AI Vision Sensor in VEXcode V5 Blocks.

Take AI Vision Snapshot#

The Take Snapshot block is used to capture the current image from the AI Vision Sensor to be processed and analyzed for Visual Signatures.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing a take a AIVision5 snapshot of AprilTags block.#
  take a [AIVision5 v] snapshot of [AprilTags v]

A snapshot is required first before using any other AI Vision Sensor blocks.

Choose which AI Vision Sensor to use.

Image of a device interface displaying various sensor and motor status metrics for robotics control.

Select what Visual Signature the AI Vision Sensor should take a snapshot of.

  • AprilTags.

  • AI Classifications.

  • A configured Color Signature or Color Code.

AI Vision snapshot showing detected objects and their classifications with visual signatures and coordinates.

When a snapshot is taken with the AI Vision Sensor, it creates an array with all of the detected objects and their properties stored inside.

It’s also important to take a new snapshot every time you want to use data from the AI Vision Sensor, so your robot isn’t using outdated data from an old snapshot in the array.

In this example, every 0.25 seconds, the AI Vision Sensor will take a snapshot of the RedBox Color Signature. This ensures that the data the robot is using is constantly updated.

Before any data is pulled from the snapshot, the AI Vision Sensor Object Exists? block is used to ensure that at least one object was detected in the snapshot. This makes sure that the robot isn’t trying to pull data from an empty array.

If the AI Vision Sensor has detected at least one object, it will print the CenterX coordinate of the largest detected object to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object centerX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain :: #9A67FF
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [centerX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every 0.25 seconds.]
  wait (0.25) seconds
  end
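The snapshot-then-guard logic above can also be sketched as a hardware-free simulation in plain Python. The `DetectedObject` class and the sample coordinates below are invented for illustration; this is not the VEX Python API, only a model of the pattern:

```python
from dataclasses import dataclass

# Hypothetical stand-in for one detected object in a snapshot's array.
@dataclass
class DetectedObject:
    centerX: int
    width: int
    height: int

def largest_center_x(snapshot):
    """Return the CenterX of the largest object, or None if the
    snapshot's array is empty (the 'Object Exists?' guard)."""
    if not snapshot:   # nothing detected -> don't read from the array
        return None
    # Objects are stored largest to smallest, so object item 1 is index 0 here.
    return snapshot[0].centerX

# One snapshot with two detections, already ordered largest first.
snapshot = [DetectedObject(centerX=160, width=120, height=90),
            DetectedObject(centerX=40, width=30, height=20)]
print(largest_center_x(snapshot))   # -> 160
print(largest_center_x([]))         # -> None
```

The guard mirrors the Object Exists? block: without it, reading item 1 from an empty snapshot would be an error.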

AI Classification Is#

The AI Classification Is block is used to report if the specified AI Classification has been detected.

VEXcode blocks stack of code containing an AIVision1 AI classification is BlueBall ? block.#
  <[AIVision1 v] AI classification is [BlueBall v] ? :: #5cb0d6>

The Take AI Vision Snapshot block must first take a snapshot of AI Classifications before the AI Classification Is block can be used.

The AI Classification Is block reports True when the AI Vision Sensor has detected the specified AI Classification.

The AI Classification Is block reports False when the AI Vision Sensor has not detected the specified AI Classification.

Choose which AI Vision Sensor to use.

AI Vision Sensor classification result display showing detected object types and counts.

Choose which AI Classification to detect. This can change depending on what detection model you are using.

For more information on what AI Classifications are available and how to enable their detection, go here.

AI Vision classification output displaying detected object types and their identifiers in a robotics context.

In this example, the AI Vision Sensor will take a snapshot of all AI Classifications before checking if a Blue Ball was detected or not. If a Blue Ball was detected, it will print a message to the Print Console.

VEXcode blocks stack of code containing a when started block, a take a AIVision1 snapshot of AI Classifications block, an if AIVision1 AI classification is BlueBall ? then block, a print Blue Ball detected! on Brain block, and an end block.#
  when started :: hat events
  take a [AIVision1 v] snapshot of [AI Classifications v]
  if <[AIVision1 v] AI classification is [BlueBall v] ? :: #5cb0d6> then
  print [Blue Ball detected!] on [Brain v] ▶
  end
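The same check can be modeled with ordinary Python membership testing. This is only a sketch; `snapshot_labels` is invented sample data, not the sensor's real output format:

```python
# Hypothetical labels returned by one AI Classification snapshot.
snapshot_labels = ["BlueBall", "RedRing"]

def classification_is(labels, name):
    """Mirror of the 'AI Classification Is' check: True only if the
    named classification appears in the most recent snapshot."""
    return name in labels

if classification_is(snapshot_labels, "BlueBall"):
    print("Blue Ball detected!")   # runs, since BlueBall is in the list
```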

Detected AprilTag Is#

The Detected AprilTag Is block is used to report if the specified AprilTag is detected. For more information on what AprilTags are and how to enable their detection, go here.

VEXcode blocks stack of code containing an AIVision1 detected AprilTag is 1 ? block.#
  <[AIVision1 v] detected AprilTag is (1) ? :: #5cb0d6>

The Take AI Vision Snapshot block must first take a snapshot of AprilTags before the Detected AprilTag Is block can be used.

The Detected AprilTag Is block reports True when the AI Vision Sensor has detected the specified AprilTag.

The Detected AprilTag Is block reports False when the AI Vision Sensor has not detected the specified AprilTag.

Choose which AI Vision Sensor to use.

AI Vision Sensor displaying an AprilTag detection interface with various sensor readings and status indicators.

In this example, the AI Vision Sensor will take a snapshot of all AprilTags before checking if the AprilTag with the ID “3” was detected. If that specific AprilTag was detected, it will print a message to the Print Console.

VEXcode blocks stack of code containing a when started block, a take a AIVision1 snapshot of AprilTags block, an if AIVision1 detected AprilTag is 3 ? then block, a print AprilTag 3 detected! on Brain block, and an end block.#
  when started :: hat events
  take a [AIVision1 v] snapshot of [AprilTags v]
  if <[AIVision1 v] detected AprilTag is (3) ? :: #5cb0d6> then
  print [AprilTag 3 detected!] on [Brain v] ▶
  end

Set AI Vision Sensor Object Item#

The Set AI Vision Sensor Object Item block is used to choose which of the detected objects to report information about. By default, the object item is set to 1 at the start of a project.

When multiple objects are detected, they will be stored from largest to smallest, with object item 1 being the largest.

Note: AprilTags are not sorted by their size, but by their unique IDs in ascending order. For example, if AprilTags 1, 15, and 3 are detected:

  • AprilTag 1 will have index 0.

  • AprilTag 3 will have index 1.

  • AprilTag 15 will have index 2.
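Both ordering rules can be reproduced with Python's built-in `sorted`. This sketch uses invented object areas alongside the tag IDs from the example above:

```python
# Color Signature / Color Code objects: largest area first.
color_objects = [{"area": 200}, {"area": 950}, {"area": 480}]
by_size = sorted(color_objects, key=lambda o: o["area"], reverse=True)
print([o["area"] for o in by_size])   # -> [950, 480, 200]

# AprilTags: ascending tag ID, regardless of size.
tag_ids = [1, 15, 3]
by_id = sorted(tag_ids)
print(by_id)              # -> [1, 3, 15]
print(by_id.index(15))    # AprilTag 15 ends up at index 2
```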

VEXcode blocks stack of code containing a set AIVision5 object item to 1 block.#
  set [AIVision5 v] object item to (1)

The Take AI Vision Snapshot block is required first before the Set AI Vision Sensor Object Item block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities and controls for a robotic system, including motor and vision sensors.

In this example, every 0.25 seconds, the AI Vision Sensor will take a snapshot of the RedBox Color Signature.

Then, if an object was detected, it will print the CenterX coordinate of the largest object (object item 1) to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object centerX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain :: #9A67FF
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [centerX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every 0.25 seconds.]
  wait (0.25) seconds
  end

AI Vision Sensor Object Count#

The AI Vision Sensor Object Count block is used to report how many objects the AI Vision Sensor detects that match the specified Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing an AIVision5 object count block.#
  ([AIVision5 v] object count )

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Count block can be used.

Choose which AI Vision Sensor to use.

AI Vision Sensor object count display interface showing detected objects and their details in a graphical format.

In this example, every 0.25 seconds, the AI Vision Sensor will take a snapshot of the RedBox Color Signature.

Then it will check if an object was detected before printing how many objects were detected.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a print # of Objects Detected: on Brain block, a print AIVision5 object count on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain :: #9A67FF
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, print how many were detected.]
  print [# of Objects Detected:] on [Brain v] ▶
  print ([AIVision5 v] object count) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every 0.25 seconds.]
  wait (0.25) seconds
  end

AI Vision Sensor Object Exists?#

The AI Vision Sensor Object Exists? block is used to report if the AI Vision Sensor detects a Visual Signature.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing a AIVision5 object exists? block.#
  <[AIVision5 v] object exists?>

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object Exists? block can be used.

The AI Vision Sensor Object Exists? block reports True when the AI Vision Sensor has detected an object.

The AI Vision Sensor Object Exists? block reports False when the AI Vision Sensor has not detected an object.

Choose which AI Vision Sensor to use.

AI Vision Sensor interface showing object detection status and parameters including object count and existence.

In this example, every 0.25 seconds, the AI Vision Sensor will take a snapshot of the RedBox Color Signature.

Then it will check if an object was detected before printing how many objects were detected.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a print # of Objects Detected: on Brain block, a print AIVision5 object count on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain :: #9A67FF
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, print how many were detected.]
  print [# of Objects Detected:] on [Brain v] ▶
  print ([AIVision5 v] object count) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every 0.25 seconds.]
  wait (0.25) seconds
  end

AI Vision Sensor Object#

The AI Vision Sensor Object block is used to report information about a specified Visual Signature from the AI Vision Sensor.

A Visual Signature can be a Color Signature, Color Code, AprilTag, or AI Classification.

VEXcode blocks stack of code containing an AIVision5 object width block.#
  ([AIVision5 v] object [width v])

The Take AI Vision Snapshot block is required first before the AI Vision Sensor Object block can be used.

Choose which AI Vision Sensor to use.

Diagram illustrating various sensing capabilities of a robotic system, including motor and controller feedback.

Choose which property to report from the AI Vision Sensor:

  • width - How wide the object is in pixels, from 0 - 320 pixels.

  • height - How tall the object is in pixels, from 0 - 240 pixels.

  • centerX - The center X coordinate of the detected object, from 0 - 320 pixels.

  • centerY - The center Y coordinate of the detected object, from 0 - 240 pixels.

  • originX - The X coordinate of the top-left corner of the object, from 0 - 320 pixels.

  • originY - The Y coordinate of the top-left corner of the object, from 0 - 240 pixels.

  • angle - The angle of the detected Color Code or AprilTag, from 0 - 360 degrees.

  • tagID - The detected AprilTag’s identification number.

  • score - The confidence score (up to 100%) for AI Classifications. This score indicates how confident the model is in the detected AI Classification. A higher score indicates greater confidence in the accuracy of the AI Classification.
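Assuming the origin is the top-left corner of the object's bounding box, the origin, size, and center properties are related by centerX = originX + width / 2 (and likewise for Y). A minimal Python sketch of that relationship, with invented values; the `VisionObject` class is not the VEX API, its field names simply mirror the properties listed above:

```python
from dataclasses import dataclass

# Hypothetical detected object; fields mirror the block's property names.
@dataclass
class VisionObject:
    originX: int   # top-left corner X, 0 - 320
    originY: int   # top-left corner Y, 0 - 240
    width: int     # 0 - 320
    height: int    # 0 - 240

    @property
    def centerX(self):
        # Center is half the width to the right of the origin.
        return self.originX + self.width // 2

    @property
    def centerY(self):
        # Center is half the height below the origin.
        return self.originY + self.height // 2

obj = VisionObject(originX=100, originY=60, width=80, height=40)
print(obj.centerX, obj.centerY)   # -> 140 80
```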

For more examples of using object properties, go here.

Diagram illustrating various sensing blocks and properties for robotic control, including motor and controller functions.

In this example, every 0.25 seconds, the AI Vision Sensor will take a snapshot of the RedBox Color Signature.

Then, if an object was detected, it will print the CenterX coordinate of the largest object (object item 1) to the Brain’s screen.

VEXcode blocks stack of code containing a when started block, a forever block, a take a AIVision5 snapshot of RedBox block, a clear all rows on Brain block, a set cursor to row 1 column 1 on Brain block, an if AIVision5 object exists? then block, a set AIVision5 object item to 1 block, a print CenterX on Brain block, a print AIVision5 object centerX on Brain and set cursor to next row block, an end block, a wait 0.25 seconds block, and an end block.#
  when started :: hat events
  forever
  [Take a snapshot looking for the RedBox Color Signature.]
  take a [AIVision5 v] snapshot of [RedBox v]
  clear all rows on [Brain v]
  set cursor to row (1) column (1) on Brain :: #9A67FF
  [Check if any objects matching the RedBox Color Signature were detected.]
  if <[AIVision5 v] object exists?> then
  [If any objects were detected, set the object item to 1.]
  set [AIVision5 v] object item to (1)
  [Print object 1's CenterX coordinate on the Brain screen.]
  print [CenterX:] on [Brain v] ▶
  print ([AIVision5 v] object [centerX v]) on [Brain v] ◀ and set cursor to next row
  end
  [Repeat the process every 0.25 seconds.]
  wait (0.25) seconds
  end