AI Vision Sensor#
Introduction#
The AI Vision Sensor can detect and track objects, colors, and AprilTags. This allows the robot to analyze its surroundings, follow objects, and react based on detected visual data.
The AI Vision examples are designed for use with a Basebot equipped with an AI Vision Sensor. In these examples, the sensor is configured with the name AIVision1, which appears in the blocks.
Below is a list of available blocks:

Actions – Capture data from the AI Vision Sensor for a selected signature.
- get object data – Captures data for a specific object type, such as colors, pre-trained objects, or AprilTags.

Settings – Choose which object to interact with.
- set AI Vision object item – Selects a specific object from the detected object list.

Values – Access and use the captured data.
- AI Vision object exists? – Returns whether an object is detected.
- AI Vision object count – Returns the number of detected objects.
- AI Vision object property – Returns a property of the detected object, such as its width, height, or center coordinates.
- AI Vision object is? – Checks if a detected object matches a specific classification.
- AI Vision object is AprilTag ID? – Checks if a detected AprilTag matches a specific ID.
Actions#
get object data#
The get object data block filters data from the AI Vision Sensor frame. The AI Vision Sensor can detect signatures that include pre-trained objects, AprilTags, or configured colors and color codes.
Colors and color codes must be configured first in the AI Vision Utility before they can be used with this block.
The dataset stores objects ordered from largest to smallest by width, beginning with object item 1. Each object’s properties can be accessed using the AI Vision object property block. An empty dataset is returned if no matching objects are detected.
get [SELECT_A_SIG v] data from [AIVision1 v]
| Parameter | Description |
| --- | --- |
| signature | Filters the dataset to only include data of the given signature. Available signatures are configured Color Signatures, configured Color Codes, AprilTags, and AI Classifications. |
Note: For AprilTag or AI Classification options to appear, their detection must be enabled in the AI Vision Utility.
Example
when started :: hat events
[Drive forward if an AprilTag is detected.]
forever
get [AprilTags v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
drive [forward v] for [10] [mm v] ▶
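For illustration, here is a minimal sketch, not part of the original examples, that combines this block with the AI Vision object count and AI Vision object property blocks described later on this page. Because the dataset is ordered largest to smallest, item 1 (the largest AprilTag) is used by default when a property is read:

when started :: hat events
[Report how many AprilTags are detected and the width of the largest one.]
forever
get [AprilTags v] data from [AIVision1 v]
clear screen
set cursor to row [1] column [1] on screen
if <[AIVision1 v] object exists?> then
print ([AIVision1 v] object count) on screen ▶
set cursor to row [2] column [1] on screen
print ([AIVision1 v] object [width v]) on screen ▶
end
wait [0.5] seconds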
Color Signatures#
A Color Signature is a unique color that the AI Vision Sensor can recognize. These signatures allow the sensor to detect and track objects based on their color. Once a Color Signature is configured, the sensor can identify objects with that specific color in its field of view.
Color Signatures are used in the Get object data block to process and detect colored objects in real-time. Up to 7 Color Signatures can be configured at a time.
Example
when started :: hat events
[Display if any object matching the RED_BOX signature is detected.]
forever
set cursor to row [1] column [1] on screen
clear row [1] on screen
[Change the signature to any configured Color Signature.]
get [RED_BOX v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
print [Color detected!] on screen ▶
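As a further illustrative sketch, assuming the same RED_BOX Color Signature and a configured drivetrain, the robot can track a colored object by turning until the object's center is near the middle of the 320-pixel-wide view (the centerX property is described later on this page):

when started :: hat events
set turn velocity to [30] %
forever
get [RED_BOX v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [centerX v]) [math_less_than v] [140]> then
turn [left v]
else if <[180] [math_less_than v] ([AIVision1 v] object [centerX v])> then
turn [right v]
else
stop driving
end
else
stop driving
end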
Color Codes#
A Color Code is a structured pattern made up of 2 to 4 Color Signatures arranged in a specific order. These codes allow the AI Vision Sensor to recognize predefined patterns of colors.
Color Codes are particularly useful for identifying complex objects, aligning with game elements, or creating unique markers for autonomous navigation. Up to 8 Color Codes can be configured at a time.
Example
when started :: hat events
[Display if any object matching the RED_BLUE code is detected.]
forever
set cursor to row [1] column [1] on screen
clear row [1] on screen
[Change the signature to any configured Color Code.]
get [RED_BLUE v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
print [Code detected!] on screen ▶
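As an illustrative sketch, assuming the same RED_BLUE Color Code and a configured drivetrain, the robot could approach the code until it appears at least 100 pixels wide, which is one way to align with a marker:

when started :: hat events
forever
get [RED_BLUE v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [width v]) [math_less_than v] [100]> then
drive [forward v]
else
stop driving
end
else
stop driving
end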
Settings#
set AI Vision object item#
The set AI Vision object item block sets which item in the dataset to use.
set [AIVision1 v] object item to (1)
| Parameters | Description |
| --- | --- |
| item | The number of the item in the dataset to use. |
Example
when started :: hat events
[Display the ID of the smallest detected AprilTag.]
forever
get [AprilTags v] data from [AIVision1 v]
clear row [1] on screen
set cursor to row [1] column [1] on screen
if <[AIVision1 v] object exists?> then
set [AIVision1 v] object item to ([AIVision1 v] object count)
print ([AIVision1 v] object [tagID v]) on screen ▶
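For illustration, the block can also be combined with a project variable and a repeat loop to step through every item in the dataset. This sketch assumes a variable named index has been created in the project; it prints each detected AprilTag's ID, largest tag first:

when started :: hat events
forever
get [AprilTags v] data from [AIVision1 v]
clear screen
set [index v] to [1]
repeat ([AIVision1 v] object count)
set [AIVision1 v] object item to (index)
set cursor to row (index) column [1] on screen
print ([AIVision1 v] object [tagID v]) on screen ▶
change [index v] by [1]
end
wait [0.5] seconds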
Values#
AI Vision object exists?#
The AI Vision object exists? block returns a Boolean indicating whether any object is detected in the dataset.
True – The dataset includes a detected object.
False – The dataset does not include any detected objects.
<[AIVision1 v] object exists?>
This block has no parameters.
Example
when started :: hat events
[Drive forward if an object is detected.]
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
drive [forward v] for [10] [mm v] ▶
AI Vision object count#
The AI Vision object count block returns the number of detected objects in the dataset as an integer.
([AIVision1 v] object count)
This block has no parameters.
Example
when started :: hat events
[Display the total number of detected cubes, rings, and balls.]
forever
clear row [1] on screen
set cursor to row [1] column [1] on screen
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
print ([AIVision1 v] object count) on screen ▶
end
wait [0.5] seconds
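The count can also be used directly in a condition. As an illustrative sketch, assuming AI Classification detection is enabled in the AI Vision Utility and a drivetrain is configured, the robot drives forward only while at least three classified objects are in view:

when started :: hat events
forever
get [AI Classifications v] data from [AIVision1 v]
if <[2] [math_less_than v] ([AIVision1 v] object count)> then
drive [forward v]
else
stop driving
end
wait [0.5] seconds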
AI Vision object property#
Each object stored after the get object data block is used includes nine properties, which can be accessed with this block.
([AIVision1 v] object [width v])
Some property values are based on the detected object’s position in the AI Vision Sensor’s view at the time the get object data block was used. The AI Vision Sensor has a resolution of 320 by 240 pixels, so the center of its view is at x = 160, y = 120.
| Parameter | Description |
| --- | --- |
| property | Which property of the detected object to use, such as width, height, centerX, centerY, angle, originX, originY, or tagID (described below). |
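As an illustrative sketch, several properties of the largest detected object (item 1 by default) can be printed together to inspect what the sensor reports; the individual properties are described in the sections below:

when started :: hat events
forever
get [AI Classifications v] data from [AIVision1 v]
clear screen
if <[AIVision1 v] object exists?> then
set cursor to row [1] column [1] on screen
print ([AIVision1 v] object [width v]) on screen ▶
set cursor to row [2] column [1] on screen
print ([AIVision1 v] object [height v]) on screen ▶
set cursor to row [3] column [1] on screen
print ([AIVision1 v] object [centerX v]) on screen ▶
set cursor to row [4] column [1] on screen
print ([AIVision1 v] object [centerY v]) on screen ▶
end
wait [0.5] seconds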
width#
width returns the width of the detected object in pixels as an integer from 1 to 320.
([AIVision1 v] object [width v])
Example
when started :: hat events
[Drive towards an object until its width is larger than 100 pixels.]
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [width v]) [math_less_than v] [100]> then
drive [forward v]
end
else
stop driving
height#
height returns the height of the detected object in pixels as an integer from 1 to 240.
([AIVision1 v] object [height v])
Example
when started :: hat events
[Drive towards an object until its height is larger than 100 pixels.]
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [height v]) [math_less_than v] [100]> then
drive [forward v]
end
else
stop driving
centerX#
centerX returns the x-coordinate of the center of the detected object in pixels as an integer from 0 to 320.
([AIVision1 v] object [centerX v])
Example
when started :: hat events
[Turn slowly until an object is centered in front of the robot.]
set turn velocity to [30] %
turn [right v]
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <[140] [math_less_than v] ([AIVision1 v] object [centerX v]) [math_less_than v] [180]> then
stop driving
centerY#
centerY returns the y-coordinate of the center of the detected object in pixels as an integer from 0 to 240.
([AIVision1 v] object [centerY v])
Example
when started :: hat events
[Drive towards an object until its center y-coordinate is more than 140 pixels.]
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [centerY v]) [math_less_than v] [140]> then
drive [forward v]
end
else
stop driving
angle#
angle returns the orientation of the detected Color Code or AprilTag as an integer in degrees from 0 to 359.
([AIVision1 v] object [angle v])
Example
when started :: hat events
[Slide left or right depending on how the Color Code is rotated.]
forever
get [RED_BLUE v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <[50] [math_less_than v] ([AIVision1 v] object [angle v]) [math_less_than v] [100]> then
drive [right v]
else if <[270] [math_less_than v] ([AIVision1 v] object [angle v]) [math_less_than v] [330]> then
drive [left v]
else
stop driving
end
else
stop driving
originX#
originX returns the x-coordinate of the top-left corner of the detected object’s bounding box in pixels as an integer from 0 to 320.
([AIVision1 v] object [originX v])
Example
when started :: hat events
[Display if an object is to the left or the right.]
forever
clear row [1] on screen
set cursor to row [1] column [1] on screen
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [originX v]) [math_less_than v] [160]> then
print [To the left!] on screen ▶
else
print [To the right!] on screen ▶
end
wait [0.5] seconds
originY#
originY returns the y-coordinate of the top-left corner of the detected object’s bounding box in pixels as an integer from 0 to 240.
([AIVision1 v] object [originY v])
Example
when started :: hat events
[Display if an object is close or far from the robot.]
forever
clear row [1] on screen
set cursor to row [1] column [1] on screen
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [originY v]) [math_less_than v] [80]> then
print [Far!] on screen ▶
else
print [Close!] on screen ▶
end
wait [0.5] seconds
tagID#
tagID returns the identification number of the detected AprilTag as an integer.
([AIVision1 v] object [tagID v])
Example
when started :: hat events
[Drive forward when AprilTag ID 0 is detected.]
forever
get [AprilTags v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <([AIVision1 v] object [tagID v]) [math_equal v] [0]> then
drive [forward v]
else
stop driving
end
wait [0.5] seconds
AI Vision object is?#
The AI Vision object is? block returns a Boolean indicating whether a detected object matches a specific classification.
True – The item in the dataset is the specific object.
False – The item in the dataset is not the specific object.
<[AIVision1 v] object is [BlueBall v] ?>
| Parameter | Description |
| --- | --- |
| object | Which object classification to compare the item against, for example BlueBall or BlueCube. |
Example
when started :: hat events
[Display if a Blue Cube is detected.]
forever
get [AI Classifications v] data from [AIVision1 v]
clear row [1] on screen
set cursor to row [1] column [1] on screen
if <[AIVision1 v] object exists?> then
if <[AIVision1 v] object is [BlueCube v] ?> then
print [Cube detected!] on screen ▶
wait [0.5] seconds
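As an illustrative sketch, this block can also gate movement so the robot, assuming a configured drivetrain, only drives while the largest detected object is a Blue Cube:

when started :: hat events
forever
get [AI Classifications v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <[AIVision1 v] object is [BlueCube v] ?> then
drive [forward v]
else
stop driving
end
else
stop driving
end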
AI Vision object is AprilTag ID?#
The AI Vision object is AprilTag ID? block returns a Boolean indicating whether a detected AprilTag matches a specific ID.
True – The detected AprilTag’s ID matches the number.
False – The detected AprilTag’s ID does not match the number.
<[AIVision1 v] object is AprilTag [1] ?>
| Parameters | Description |
| --- | --- |
| AprilTag number | The number to compare against the detected AprilTag’s ID. |
Example
when started :: hat events
[Report if AprilTag ID 3 is detected.]
forever
clear screen
set cursor to row [1] column [1] on screen
get [AprilTags v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <[AIVision1 v] object is AprilTag [3] ?> then
print [That is 3!] on screen ▶
else
print [That isn't 3!] on screen ▶
end
end
wait [0.1] seconds
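As a final illustrative sketch, different AprilTag IDs can trigger different actions, for example turning one way for tag ID 1 and the other way for tag ID 2 (the IDs used here are arbitrary and a configured drivetrain is assumed):

when started :: hat events
forever
get [AprilTags v] data from [AIVision1 v]
if <[AIVision1 v] object exists?> then
if <[AIVision1 v] object is AprilTag [1] ?> then
turn [left v] for [90] degrees ▶
else if <[AIVision1 v] object is AprilTag [2] ?> then
turn [right v] for [90] degrees ▶
end
end
wait [0.1] seconds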