AI Vision Sensor#

Introduction#

The AI Vision Sensor can detect and track objects, colors, and AprilTags. This allows the robot to analyze its surroundings, follow objects, and react based on detected visual data.

The AI Vision examples are designed for use with a Basebot equipped with an AI Vision Sensor. In these examples, the sensor is configured with the name AIVision1, which appears in the blocks.

Below is a list of available blocks:

Actions – Capture data from the AI Vision Sensor for a selected signature.

  • get object data – Captures data for a specific object type, such as colors, pre-trained objects, or AprilTags.

Settings – Choose which detected object in the dataset to interact with.

  • set AI Vision object item – Sets which item in the dataset to use.

Values – Access and use the captured data.

  • AI Vision object exists? – Returns whether any object was detected.
  • AI Vision object count – Returns the number of detected objects.
  • AI Vision object property – Returns a property of the selected object, such as its width or centerX.
  • AI Vision object is? – Returns whether the selected object matches a specific AI Classification.
  • AI Vision object is AprilTag ID? – Returns whether a detected AprilTag matches a specific ID.

Actions#

get object data#

The get object data block filters data from the AI Vision Sensor frame. The AI Vision Sensor can detect signatures that include pre-trained objects, AprilTags, or configured colors and color codes.

Colors and color codes must be configured first in the AI Vision Utility before they can be used with this block.

The dataset stores objects ordered from largest to smallest by width, starting at index 0. Each object’s properties can be accessed using the AI Vision object property block. An empty dataset is returned if no matching objects are detected.

The Get object data stack block.#
  get [AIVision1 v] data from [SELECT_A_SIG v]

Parameter

Description

signature

Filters the dataset to only include data of the given signature. Available signatures are:

  • AprilTags
  • AI Classifications – includes:
    • BlueBall
    • GreenBall
    • RedBall
    • BlueRing
    • GreenRing
    • RedRing
    • BlueCube
    • GreenCube
    • RedCube
  • NAME – A color or color code, where NAME is the name configured in the AI Vision Utility

Note: For AprilTag or AI Classification options to appear, their detection must be enabled in the AI Vision Utility.

Example

  when started :: hat events
  [Drive forward if an AprilTag is detected.]
  forever
  get [AprilTags v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  drive [forward v] for [10] [mm v] ▶
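
Every object in the dataset can also be inspected in turn by stepping through it with the set AI Vision object item block described below. The following is a rough sketch of that pattern; it assumes items are numbered starting at 1, as that block’s default value suggests, and uses a variable named index together with the standard repeat and change by blocks.

  when started :: hat events
  [Print the width of each detected object, largest first.]
  [Assumes a variable named index and the standard repeat and change by blocks.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  clear screen
  set [index v] to [1]
  repeat ([AIVision1 v] object count)
  set cursor to row (index) column [1] on screen
  set [AIVision1 v] object item to (index)
  print ([AIVision1 v] object [width v]) on screen ▶
  change [index v] by [1]
  end
  wait [0.5] seconds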

Color Signatures#

A Color Signature is a unique color that the AI Vision Sensor can recognize. These signatures allow the sensor to detect and track objects based on their color. Once a Color Signature is configured, the sensor can identify objects with that specific color in its field of view.

Color Signatures are used in the get object data block to process and detect colored objects in real time. Up to 7 Color Signatures can be configured at a time.

The AI Vision Utility showing a connected vision sensor detecting two colored objects. The left side displays a live camera feed with a blue box on the left and a red box on the right, each outlined with white bounding boxes. Black labels display their respective names, coordinates, and dimensions. The right side contains color signature settings, with sliders for hue and saturation range for both the red and blue boxes. Buttons for adding colors, freezing video, copying, and saving the image are at the bottom, along with a close button in the lower right corner.

Example

  when started :: hat events
  [Display if any object matching the RED_BOX signature is detected.]
  forever
  set cursor to row [1] column [1] on screen
  clear row [1] on screen
  [Change the signature to any configured Color Signature.]
  get [RED_BOX v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  print [Color detected!] on screen ▶
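
Because each get object data block filters the dataset to a single signature, checking more than one configured color requires a separate get object data block per signature. The sketch below illustrates this, assuming a second Color Signature named BLUE_BOX has also been configured in the AI Vision Utility.

  when started :: hat events
  [Report which configured Color Signatures are currently visible.]
  [Assumes a second Color Signature named BLUE_BOX is configured.]
  forever
  clear screen
  set cursor to row [1] column [1] on screen
  get [RED_BOX v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  print [Red box detected!] on screen ▶
  end
  set cursor to row [2] column [1] on screen
  get [BLUE_BOX v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  print [Blue box detected!] on screen ▶
  end
  wait [0.5] seconds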

Color Codes#

A Color Code is a structured pattern made up of 2 to 4 Color Signatures arranged in a specific order. These codes allow the AI Vision Sensor to recognize predefined patterns of colors.

Color Codes are particularly useful for identifying complex objects, aligning with game elements, or creating unique markers for autonomous navigation. Up to 8 Color Codes can be configured at a time.

The AI Vision Utility interface shows a connected vision sensor detecting two adjacent objects, a blue box on the left and a red box on the right, grouped together in a single white bounding box labeled BlueRed. Detection information includes angle (A:11°), coordinates (X:143, Y:103), width (W:233), and height (H:108). On the right panel, three color signatures are listed: Red_Box, Blue_Box, and BlueRed, with adjustable hue and saturation ranges. The BlueRed signature combines the Blue_Box and Red_Box. Below the video feed are buttons labeled Freeze Video, Copy Image, Save Image, and Close.

Example

  when started :: hat events
  [Display if any object matching the RED_BLUE code is detected.]
  forever
  set cursor to row [1] column [1] on screen
  clear row [1] on screen
  [Change the signature to any configured Color Code.]
  get [RED_BLUE v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  print [Code detected!] on screen ▶
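
A detected Color Code reports the same position properties as other objects (see AI Vision object property below), so a code can also be used to line the robot up with a marker. The sketch below is one way to do that with the RED_BLUE code from the example above; the 140 and 180 thresholds simply bracket the horizontal center of the 320-pixel-wide view.

  when started :: hat events
  [Turn to keep the RED_BLUE Color Code near the center of the view.]
  set turn velocity to [30] %
  forever
  get [RED_BLUE v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [centerX v]) [math_less_than v] [140]> then
  turn [left v]
  else if <[180] [math_less_than v] ([AIVision1 v] object [centerX v])> then
  turn [right v]
  else
  stop driving
  end
  else
  stop driving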

Settings#

set AI Vision object item#

The set AI Vision object item block sets which item in the dataset to use.

The Set AI Vision object item stack block.#
  set [AIVision1 v] object item to (1)

Parameters

Description

item

The number of the item in the dataset to use.

Example

A stack of blocks that begins with a when started block, followed by a comment block reading Display the largest detected AprilTag ID. Inside a forever loop, a get object data block captures AprilTag data from the AI Vision Sensor. An if block checks if an AI Vision object exists. If true, a set AI Vision object item block selects the item given by the object count, and a print block displays the tag ID of the selected object on the screen.#
  when started :: hat events
  [Display the largest detected AprilTag ID.]
  forever
  get [AprilTags v] data from [AIVision1 v]
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  if <[AIVision1 v] object exists?> then
  set [AIVision1 v] object item to ([AIVision1 v] object count)
  print ([AIVision1 v] object [tagID v]) on screen ▶

Values#

AI Vision object exists?#

The AI Vision object exists block returns a Boolean indicating whether any object is detected in the dataset.

  • True – The dataset includes a detected object.

  • False – The dataset does not include any detected objects.

The AI Vision object exists Boolean block.#
  <[AIVision1 v] object exists?>

Parameters

Description

This block has no parameters.

Example

  when started :: hat events
  [Drive forward if an object is detected.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  drive [forward v] for [10] [mm v] ▶

AI Vision object count#

The AI Vision object count block returns the number of detected objects in the dataset as an integer.

The AI Vision object count reporter block.#
  ([AIVision1 v] object count)

Parameters

Description

This block has no parameters.

Example

  when started :: hat events
  [Display the total number of detected cubes, rings, and balls.]
  forever
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  print ([AIVision1 v] object count) on screen ▶
  end
  wait [0.5] seconds
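
The count can also be used directly in a condition rather than only displayed. The sketch below reports only when two or more matching objects are in view; any other threshold could be compared the same way.

  when started :: hat events
  [Report when two or more objects are detected at once.]
  forever
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  get [AI Classifications v] data from [AIVision1 v]
  if <[1] [math_less_than v] ([AIVision1 v] object count)> then
  print [Multiple objects!] on screen ▶
  end
  wait [0.5] seconds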

AI Vision object property#

Each object captured by the get object data block includes the properties shown below.

The AI Vision object property reporter block.#
  ([AIVision1 v] object [width v])

Some property values are based on the detected object’s position in the AI Vision Sensor’s view at the time the get object data block was used. The AI Vision Sensor has a resolution of 320 by 240 pixels, so an object centered in the view reports a centerX near 160 and a centerY near 120.

Parameter

Description

property

Which property of the detected object to use:

width#

width returns the width of the detected object in pixels as an integer from 1 to 320.

The AI Vision object property reporter block with its parameter set to width.#
  ([AIVision1 v] object [width v])

Example

  when started :: hat events
  [Drive towards an object until its width is larger than 100 pixels.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [width v]) [math_less_than v] [100]> then
  drive [forward v]
  end
  else
  stop driving

height#

height returns the height of the detected object in pixels as an integer from 1 to 240.

The AI Vision object property reporter block with its parameter set to height.#
  ([AIVision1 v] object [height v])

Example

  when started :: hat events
  [Drive towards an object until its height is larger than 100 pixels.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [height v]) [math_less_than v] [100]> then
  drive [forward v]
  end
  else
  stop driving

centerX#

centerX returns the x-coordinate of the center of the detected object in pixels as an integer from 0 to 320.

The AI Vision object property reporter block with its parameter set to centerX.#
  ([AIVision1 v] object [centerX v])

Example

  when started :: hat events
  [Turn slowly until an object is centered in front of the robot.]
  set turn velocity to [30] %
  turn [right v]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <[140] [math_less_than v] ([AIVision1 v] object [centerX v]) [math_less_than v] [180]> then
  stop driving

centerY#

centerY returns the y-coordinate of the center of the detected object in pixels as an integer from 0 to 240.

The AI Vision object property reporter block with its parameter set to centerY.#
  ([AIVision1 v] object [centerY v])

Example

  when started :: hat events
  [Drive towards an object until its center y-coordinate is more than 140 pixels.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [centerY v]) [math_less_than v] [140]> then
  drive [forward v]
  end
  else
  stop driving

angle#

angle returns the orientation of the detected Color Code or AprilTag as an integer in degrees from 0 to 359.

The AI Vision object property reporter block with its parameter set to angle.#
  ([AIVision1 v] object [angle v])

Example

  when started :: hat events
  [Slide left or right depending on how the Color Code is rotated.]
  forever
  get [RED_BLUE v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <[50] [math_less_than v] ([AIVision1 v] object [angle v]) [math_less_than v] [100]> then
  drive [right v]
  else if <[270] [math_less_than v] ([AIVision1 v] object [angle v]) [math_less_than v] [330]> then
  drive [left v]
  else
  stop driving
  end
  else
  stop driving

originX#

originX returns the x-coordinate of the top-left corner of the detected object’s bounding box in pixels as an integer from 0 to 320.

The AI Vision object property reporter block with its parameter set to originX.#
  ([AIVision1 v] object [originX v])

Example

  when started :: hat events
  [Display if an object is to the left or the right.]
  forever
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [originX v]) [math_less_than v] [160]> then
  print [To the left!] on screen ▶
  else
  print [To the right!] on screen ▶
  end
  wait [0.5] seconds

originY#

originY returns the y-coordinate of the top-left corner of the detected object’s bounding box in pixels as an integer from 0 to 240.

The AI Vision object property reporter block with its parameter set to originY.#
  ([AIVision1 v] object [originY v])

Example

  when started :: hat events
  [Display if an object is close or far from the robot.]
  forever
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  get [AI Classifications v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [originY v]) [math_less_than v] [80]> then
  print [Far!] on screen ▶
  else
  print [Close!] on screen ▶
  end
  wait [0.5] seconds

tagID#

tagID returns the identification number of the detected AprilTag as an integer.

The AI Vision object property reporter block with its parameter set to tagID.#
  ([AIVision1 v] object [tagID v])

Example

  when started :: hat events
  [Drive forward when AprilTag ID 0 is detected.]
  forever
  get [AprilTags v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <([AIVision1 v] object [tagID v]) [math_equal v] [0]> then
  drive [forward v]
  else
  stop driving
  end
  wait [0.5] seconds

AI Vision object is?#

The AI Vision object is? block returns a Boolean indicating whether a detected object matches a specific classification.

  • True – The item in the dataset matches the selected classification.

  • False – The item in the dataset does not match the selected classification.

The AI Vision object is? Boolean block.#
  <[AIVision1 v] object is [BlueBall v] ?>

Parameter

Description

object

Which object to compare the item to:

  • BlueBall
  • GreenBall
  • RedBall
  • BlueRing
  • GreenRing
  • RedRing
  • BlueCube
  • GreenCube
  • RedCube

Example

  when started :: hat events
  [Display if a Blue Cube is detected.]
  forever
  get [AI Classifications v] data from [AIVision1 v]
  clear row [1] on screen
  set cursor to row [1] column [1] on screen
  if <[AIVision1 v] object exists?> then
  if <[AIVision1 v] object is [BlueCube v] ?> then
  print [Cube detected!] on screen ▶
  wait [0.5] seconds

AI Vision object is AprilTag ID?#

The AI Vision object is AprilTag ID? block returns a Boolean indicating whether a detected AprilTag matches a specific ID.

  • True – The detected AprilTag’s ID matches the given number.

  • False – The detected AprilTag’s ID does not match the given number.

The AI Vision detected AprilTag is Boolean block.#
  <[AIVision1 v] object is AprilTag [1] ?>

Parameters

Description

AprilTag number

The number to compare against the detected AprilTag’s ID number.

Example

  when started :: hat events
  [Report if AprilTag ID 3 is detected.]
  forever
  clear screen
  set cursor to row [1] column [1] on screen
  get [AprilTags v] data from [AIVision1 v]
  if <[AIVision1 v] object exists?> then
  if <[AIVision1 v] object is AprilTag [3] ?> then
  print [That is 3!] on screen ▶
  else
  print [That isn't 3!] on screen ▶
  end
  end
  wait [0.1] seconds