A [Limelight](https://andymark-weblinc.netdna-ssl.com/product_images/limelight-2-plus/5e15fe1480289d6162f285cd/zoom.jpg?c=1578499604) is a small, on-board computer and camera system that performs computer vision. Many FRC teams, including us, use a Limelight to assist the driver with tasks such as auto-aiming and auto-movement. Limelights are expensive and should be handled carefully.

Take a look at the instructions [here](https://docs.limelightvision.io/en/latest/getting_started.html) for setting up the Limelight. You should also read how to build and configure a vision pipeline [here](https://docs.limelightvision.io/en/latest/vision_pipeline_tuning.html).

Generally, the Limelight draws a bounding box around the target and then obtains values from this rectangle.

The two main pipelines ("modes") we use are AprilTags and Neural Networks.

AprilTags are essentially QR codes. They are placed throughout the field, such as on scoring targets. Since they have fixed locations on the field, we can use them to determine what we can do from our current position (for example, if we can see that we are in front of a goal, then we can shoot).

On the other hand, we use the Neural Network pipeline to detect custom objects. Usually, we train a neural network to find these objects, and then we can get measurement values from it. We'll go more in-depth later.

There are two important interfaces for the limelight: http://limelight.local:5801 and http://limelight.local:5800 (you need to be connected to the limelight via the radio to use these links). The first one is for configuring the Limelight pipeline and the second one is for displaying the camera feed.

## Limelight NetworkTables values
[`NetworkTables`](https://first.wpi.edu/FRC/roborio/release/docs/java/edu/wpi/first/networktables/NetworkTable.html) might seem like a new concept, but you have already been working with `NetworkTables` because it includes `SmartDashboard` as one of its tables. Whenever you put or read values using `SmartDashboard`, you were working with `NetworkTables`. You can think of every instance of `SmartDashboard` as `NetworkTableInstance.getDefault().getTable("SmartDashboard")`. You can find a full description of `NetworkTables` [here](https://docs.wpilib.org/en/stable/docs/software/networktables/index.html?highlight=networktables).

Most important for this section is the fact that the Limelight puts many useful values into `NetworkTables`.

Important keys that you will use often include, but are not limited to: `tv`, `tx`, `ty`, and `ta`. You may want to store the Limelight `NetworkTable` in a variable depending on how often you plan on accessing its entries. You can find a full list of keys [here](https://docs.limelightvision.io/docs/docs-limelight/apis/complete-networktables-api).
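
For example, reading a couple of those entries looks something like this (a sketch, not robot-ready code; `"limelight"` is the default table name, and the WPILib `NetworkTables` classes are assumed to be on the classpath as in any robot project):

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightReader {
    // Store the table once instead of looking it up on every read
    private final NetworkTable table =
            NetworkTableInstance.getDefault().getTable("limelight");

    /** tv is 1.0 when the Limelight sees a valid target. */
    public boolean hasTarget() {
        return table.getEntry("tv").getDouble(0.0) == 1.0;
    }

    /** tx: horizontal offset from the crosshair to the target, in degrees. */
    public double horizontalOffsetDegrees() {
        return table.getEntry("tx").getDouble(0.0);
    }
}
```
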

!!! tip
    **Instead of using this long mess every time, consider copy-pasting the [LimelightHelpers](https://github.com/LimelightVision/limelightlib-wpijava/blob/main/LimelightHelpers.java) class into the subsystem file and using its methods.**

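
With `LimelightHelpers` copied in, the long `NetworkTableInstance` chain collapses to one-liners (a sketch; the method names below come from that class):

```java
// Assumes LimelightHelpers.java has been copied into the project.
double tx = LimelightHelpers.getTX("limelight");    // horizontal offset, degrees
double ty = LimelightHelpers.getTY("limelight");    // vertical offset, degrees
boolean seen = LimelightHelpers.getTV("limelight"); // is a target visible?
```
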
## Distance Estimation and Angle Alignment
Here's some example code. There are more examples in the [Limelight documentation](https://docs.limelightvision.io/en/latest).

First, finding the ground distance from the limelight to a target:

Let's predefine some constants:
- `LIMELIGHT_MOUNT_ANGLE` - the angle at which the limelight is mounted (its upward tilt from the horizontal)
- `TARGET_HEIGHT` - the height of the target
- `LIMELIGHT_HEIGHT` - the height of the limelight from the ground

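
The original snippet isn't reproduced here, but the math can be sketched like this (a minimal example with hypothetical constant values; `ty` is the vertical offset, in degrees, read from the Limelight's NetworkTable):

```java
public class LimelightDistance {
    // Hypothetical example values -- measure these on your own robot
    static final double LIMELIGHT_MOUNT_ANGLE = 25.0; // degrees of upward tilt from the horizontal
    static final double TARGET_HEIGHT = 1.45;         // meters, floor to target
    static final double LIMELIGHT_HEIGHT = 0.50;      // meters, floor to lens

    /** Ground ("floor") distance from the limelight to the target, in meters. */
    public static double groundDistance(double tyDegrees) {
        double angleToTarget = Math.toRadians(LIMELIGHT_MOUNT_ANGLE + tyDegrees);
        return (TARGET_HEIGHT - LIMELIGHT_HEIGHT) / Math.tan(angleToTarget);
    }
}
```

In real robot code, `tyDegrees` would come from the `ty` NetworkTables entry.
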
This code uses basic trigonometry to calculate the floor distance. You can then use this distance to drive to a particular location.

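
The original angle-alignment snippet also isn't reproduced here, but it can be sketched like this (a hypothetical method, assuming we know the target's robot-relative forward and strafe offsets in meters):

```java
public class LimelightAiming {
    /**
     * Angle, in radians, that the drivetrain must turn to face the target,
     * given the target's robot-relative forward and strafe offsets.
     */
    public static double turnToTargetRadians(double strafeMeters, double forwardMeters) {
        // atan2 handles all four quadrants, including zero forward distance
        return Math.atan2(strafeMeters, forwardMeters);
    }
}
```
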
This returns the angle (in radians!) that the drivetrain must turn to align to its target. It also uses basic trigonometry.

!!! tip
    **Code with Limelight often involves a lot of math, especially trig. If that's not your cup of tea, try the tea again :P**

Now take a look at [Limelight.java](https://github.com/DeepBlueRobotics/RobotCode2020/blob/unifiedcode/src/main/java/org/team199/lib/Limelight.java) from RobotCode2020. In particular, take a look at [`distanceAssist()`](https://github.com/DeepBlueRobotics/RobotCode2020/blob/unifiedcode/src/main/java/org/team199/lib/Limelight.java#L111), [`steeringAssist()`](https://github.com/DeepBlueRobotics/RobotCode2020/blob/unifiedcode/src/main/java/org/team199/lib/Limelight.java#L127), and [`autoTarget()`](https://github.com/DeepBlueRobotics/RobotCode2020/blob/unifiedcode/src/main/java/org/team199/lib/Limelight.java#L172). Some things I will point out about the code, especially `steeringAssist()`, are:

The programming team had some fun with this code during the 2019-2020 Pre-Season.
In 2024, we used the Limelight to auto-align to and then intake the game piece. Check out [this code](https://github.com/DeepBlueRobotics/RobotCode2024/blob/master/src/main/java/org/carlmontrobotics/commands/AutoMATICALLYGetNote.java) to see how we did it!

If it's too long to read, basically what it does is:

1. Changes the drivetrain to robot-oriented and turns on the intake
2. Retrieves three offsets: forward, strafe, and angle
3. Ends when there is a game piece inside the robot
4. Returns the drivetrain to field-oriented, so the driver can drive it again, and stops the intake

Now go back and read it again :)
## What if I want to detect other things?
Well, now you're at the right section. Using Limelight, we can detect a bunch of different objects, like game pieces and people!

There are two (main) types of vision models: classifier and detector.

A classifier is used to categorize an entire image into a predefined label. For example, if you want to distinguish between a red ball and a blue ball, you would use a classifier.

A detector is used to find specific objects within an image. This is the type we use more often. For example, if you want to find the bounding box and exact location of that ball, you would use a detector.

For more details on training a model, check [this](https://docs.limelightvision.io/docs/docs-limelight/pipeline-neural/getting-started-with-neural-networks) out. There are also several pretrained models online you can steal >:)

Once you have your files, change the pipeline option to "Neural Network" and upload the files to the Limelight interface. Then watch the magic happen 😼😼

Some general notes to keep in mind when coding:

- Make sure you stay consistent with units
- Remember to store the name of your limelight(s) in constants
- In commands, don't make the Limelight a requirement, so that it can be used by multiple commands at the same time