
VIDEO: Light Tracking Thingamajig

Updated: Feb 17

Late one fateful Tuesday eve, while contemplating deer safety, I identified one of the most pertinent threats that modern deer face: becoming roadkill at night, specifically because of their inability to locate lights. This follows the logic of: “if they knew where the lights were, why would they stand still?”.

(Figure 1: me as a boy. I never knew my father)


Enlightened and highly motivated, I started developing a light tracker to improve the lives of upper-middle class rich deer worldwide.


My first version worked... but with some minor kinks.


Using the cv2.minMaxLoc() function from OpenCV's cv2 module, I was mostly able to separate a bright light from the surrounding pixels, but sometimes it would get confused. cv2.minMaxLoc() returns four values (minVal, maxVal, minLoc, maxLoc), but we are only interested in the fourth one: the location of the largest value (the brightest pixel) in the image.


To use this function, you first need to convert the frame to grayscale, so that each pixel is a single brightness value rather than three colour channels. The occasional confusion seemed to stem from noise in the camera’s sensor, where a pixel would briefly become very bright (perhaps due to radiation hitting it); while this is not noticeable to humans, the program was very susceptible to it. The fix is quite simple and just requires a cv2.GaussianBlur(), which averages the brightness of the pixels in a certain area.



Now that it's working, it can draw a square around the bright area (the LED I’m holding).


Sadly, I was informed that deer don’t actually have Python installed and thus are unable to run my revolutionary code. While I waited for Elon to roll out a deer version of Neuralink I twiddled my thumbs and pondered.


While pondering, I realised that I could apply this technology to Pi Wars by giving our robot eyes, allowing it to see itself from a bird’s eye view by attaching a light to the top of STEV3 and mounting the camera above. This does however add the need for either a wireless webcam (surprisingly hard to come by) or a second Pi/microcontroller to send STEV3 the data.


To summarise: Elon needs to hurry, deer suck and Python is cool.


-----


Note from the future: While this technology is really slick, we didn't have time to apply it to our robot this year, which is why our robot is bumbling around the arena like the lights are off. Maybe 2023.
