Now that we have a fully functioning remote-controlled robot with a live image from its embedded camera, we can start thinking about the next step of the project: allowing the robot to automatically recognize colored balls, grab them, and bring them to a zone corresponding to their color.
Fortunately, I found a color recognition algorithm written in Java on the website uk-dave.com [source]. After analyzing this code, I noticed that one of the classes, called “ImageProcessor”, takes a BufferedImage as input and outputs an image containing the isolated colored ball along with its coordinates.
It was very easy to integrate into my system: I only had to convert it to a process, add two outputs to it, and remove the unused functions and attributes.
Here is how uk-dave’s system works:
- First, the user sets intervals of hue, saturation and intensity corresponding to the color they want to match.
- Then, the function converts the input image to the HSI (hue, saturation, intensity) color space.
- It then applies the user-defined hue, saturation and intensity intervals to the image, keeping only the pixels from the original image that fall within all three intervals.
- Finally, an algorithm called “blob coloring” locates the largest group of connected pixels in the filtered image.
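The original ImageProcessor code is not reproduced here, but the pipeline above can be sketched as follows. This is my own minimal illustration, not uk-dave’s actual class: the method names and the 4-connected flood-fill variant of blob coloring are assumptions, and the RGB-to-HSI conversion is the standard textbook formula.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class HsiBlobSketch {

    /** Convert an RGB pixel (0-255 each) to {hue in degrees, saturation, intensity}. */
    static double[] rgbToHsi(int r, int g, int b) {
        double rn = r / 255.0, gn = g / 255.0, bn = b / 255.0;
        double intensity = (rn + gn + bn) / 3.0;
        double min = Math.min(rn, Math.min(gn, bn));
        double saturation = intensity == 0 ? 0 : 1.0 - min / intensity;
        double hue = 0;
        double num = 0.5 * ((rn - gn) + (rn - bn));
        double den = Math.sqrt((rn - gn) * (rn - gn) + (rn - bn) * (gn - bn));
        if (den != 0) {
            hue = Math.toDegrees(Math.acos(Math.max(-1, Math.min(1, num / den))));
            if (bn > gn) hue = 360 - hue;  // hue lives on a 0-360 degree circle
        }
        return new double[]{hue, saturation, intensity};
    }

    /** Step 2 and 3: keep a pixel only if it falls inside all three user-set intervals. */
    static boolean matches(int r, int g, int b,
                           double hMin, double hMax, double sMin, double sMax,
                           double iMin, double iMax) {
        double[] hsi = rgbToHsi(r, g, b);
        return hsi[0] >= hMin && hsi[0] <= hMax
            && hsi[1] >= sMin && hsi[1] <= sMax
            && hsi[2] >= iMin && hsi[2] <= iMax;
    }

    /** Step 4, blob coloring: return {centroidX, centroidY, size} of the largest
     *  4-connected group of kept pixels, or null if the mask is empty. */
    static int[] largestBlob(boolean[][] mask) {
        int h = mask.length, w = mask[0].length;
        boolean[][] seen = new boolean[h][w];
        int[] best = null;
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                if (!mask[y][x] || seen[y][x]) continue;
                long sx = 0, sy = 0;
                int size = 0;
                Deque<int[]> stack = new ArrayDeque<>();  // iterative flood fill
                stack.push(new int[]{x, y});
                seen[y][x] = true;
                while (!stack.isEmpty()) {
                    int[] p = stack.pop();
                    sx += p[0]; sy += p[1]; size++;
                    int[][] nbrs = {{p[0]+1,p[1]}, {p[0]-1,p[1]}, {p[0],p[1]+1}, {p[0],p[1]-1}};
                    for (int[] n : nbrs)
                        if (n[0] >= 0 && n[0] < w && n[1] >= 0 && n[1] < h
                                && mask[n[1]][n[0]] && !seen[n[1]][n[0]]) {
                            seen[n[1]][n[0]] = true;
                            stack.push(n);
                        }
                }
                if (best == null || size > best[2])
                    best = new int[]{(int)(sx / size), (int)(sy / size), size};
            }
        return best;
    }
}
```

In the real system the mask would be built by calling something like `matches` on every pixel of the BufferedImage, then `largestBlob` gives the ball’s approximate center.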
After a lot of tweaking of the three channels (hue, saturation and intensity), I finally managed to isolate the red ball:
So it is quite effective, but I have found one problem: every time the luminosity of the room changes, the HSI intervals have to be adjusted slightly. One thing I noticed is that the hue interval is almost independent of the room’s luminosity. After some research, this makes sense: hue represents the color as an angle in degrees (0° is red, 120° is green, 240° is blue, and 360° wraps back around to red). With the images from the phone’s camera, I found that the hue interval for the red ball is 7° to 20°, and for the blue ball 183° to 236°.
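With those measured intervals, the hue test on its own is just a range check. One subtlety worth noting (my addition, not part of uk-dave’s description): since hue is circular, an interval for a deeper red could cross 0°, so a sketch of the check should handle wraparound:

```java
public class HueMatch {
    /** True if hue (degrees, 0-360) falls in the interval [min, max].
     *  When min > max the interval is taken to wrap past 360 back to 0,
     *  which matters for reds sitting near the 0/360 boundary. */
    static boolean hueInInterval(double hue, double min, double max) {
        return min <= max ? (hue >= min && hue <= max)
                          : (hue >= min || hue <= max);
    }
}
```

For the intervals above, a pixel at 12° matches the red ball (7°–20°) and a pixel at 200° matches the blue ball (183°–236°).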
Because the S and I channels depend on the luminosity, it could perhaps be interesting to use the Lego light sensors (the RGB or light sensor) to set those values automatically based on the room’s luminosity.
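As a rough illustration of that idea: given a sensor reading already normalized to 0 (dark) – 1 (bright), the intensity interval could be shifted accordingly. Everything here is hypothetical, the linear mapping and constants are uncalibrated guesses, and no actual Lego sensor API is used.

```java
public class AdaptiveThresholds {
    /** Hypothetical: derive an intensity interval from a normalized
     *  light-sensor reading (0 = dark room, 1 = bright room).
     *  The linear mapping and the constants are illustrative guesses,
     *  not calibrated values for any real sensor. */
    static double[] intensityInterval(double lightLevel) {
        double center = 0.2 + 0.6 * lightLevel;  // brighter room -> brighter ball pixels
        double halfWidth = 0.15;                 // fixed tolerance around the center
        return new double[]{Math.max(0, center - halfWidth),
                            Math.min(1, center + halfWidth)};
    }
}
```

The hue interval would stay fixed, and only the S and I bounds would be recomputed whenever the sensor reading changes.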
The next step is to write the algorithm that will make the robot locate and seek out the balls.
Well done; just what I expected to see given last week’s discussions.
The fact that you have also determined its limitations is very good.