Continuing the long-running attempts at hacking the quadcopter remote communication, a few methods were tried to find out exactly why the communication wouldn't work. These methods and the resulting outcomes are discussed in this week's post.
Non-functional NRF24L01+ modules:
The 10 NRF modules acquired earlier do not appear to work even with the basic communication code used at the start of the project. This is a huge headache, as neither the code nor the libraries have changed since then. The modules were tested both with and without decoupling capacitors attached, to no effect, which strengthens the suspicion that these cheaper modules were non-functional from the start.
Considering the possibility that the libraries themselves were not compiling properly, a fresh stack of register-level code replicating the NRF communication library was written as a plain Arduino sketch and uploaded to the Arduino. The resulting code exhibited the same issues. To narrow down where the problem lies, a couple of higher-end NRF24L01 modules were ordered over the internet. These arrived too late to be tested with the Arduinos this week, so hopefully they will yield some positive results by the next entry.
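As a sanity check, the register-level approach can be sketched without any hardware by constructing the SPI command bytes the NRF24L01+ expects (command and register values are from the datasheet; the CONFIG value shown is just an illustrative power-up setting, not the one sniffed from the remote):

```python
# NRF24L01+ SPI command words (per the datasheet)
R_REGISTER   = 0x00  # 000A AAAA: read register AAAAA
W_REGISTER   = 0x20  # 001A AAAA: write register AAAAA
W_TX_PAYLOAD = 0xA0  # write a payload to be transmitted
NOP          = 0xFF  # no operation; clocks out the STATUS register

# Register addresses used below
CONFIG = 0x00
STATUS = 0x07

def read_register_cmd(addr):
    """First SPI byte for reading a register; the value arrives on the following clocks."""
    return bytes([R_REGISTER | (addr & 0x1F)])

def write_register_cmd(addr, value):
    """SPI bytes for writing a single-byte register."""
    return bytes([W_REGISTER | (addr & 0x1F), value])

# Example: power up in TX mode with 2-byte CRC (PWR_UP=1, EN_CRC=1, CRCO=1)
print(write_register_cmd(CONFIG, 0x0E).hex())  # → "200e"
```

Comparing bytes built like this against what the logic analyzer captures on the SPI bus is a quick way to confirm the sketch is at least issuing well-formed commands, independent of whether the module replies.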
Hacking using the Transmitter from the Quadcopter Remote
During the wait for the delivery of the new NRF modules, the communication module inside the quadcopter remote was taken out and jumper cables were glued onto its SPI pins. Coincidentally, the communication of one of the older bots purchased during the project was also sniffed, and it was found to use the exact same protocol as the newer bot. This is great news, as all of these bots appear to work under the same control protocol from the remote. For the earlier bot, the only difference in the 16-byte payload was the suspected ID bytes, which sit in the same locations as on the other bot. This verifies the methodology currently being considered for quadcopter control: it should be possible to control multiple bots by simply changing this ID for each bot.
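The retargeting idea above can be sketched as a small helper. The 16-byte payload length is from the sniffed protocol, but the ID offset and length used here are placeholders for illustration, not the actual byte positions identified in the captures:

```python
def retarget(payload: bytes, new_id: bytes, id_offset: int = 0) -> bytes:
    """Return a copy of a sniffed 16-byte control payload with the
    suspected ID bytes replaced, so the same frame addresses a different bot.

    id_offset is a placeholder: the real offset is whatever the sniffed
    captures show for the ID field.
    """
    assert len(payload) == 16, "control payloads are 16 bytes in this protocol"
    out = bytearray(payload)
    out[id_offset:id_offset + len(new_id)] = new_id
    return bytes(out)

# Example: take an all-zero frame and stamp a two-byte ID into it
frame = retarget(bytes(16), b'\xab\xcd')
print(frame.hex())  # → "abcd" followed by 28 zeros
```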
Connecting the quadcopter's RF module to the Arduino was simple enough; however, the library's configuration registers were altered to match the exact set-up bytes observed on the remote control's SPI bus, so as to replicate the power-up sequence exactly. After doing so, the module was first probed with the logic analyzer to see whether it was responding to the Arduino at all (the module may have been damaged while being removed from the remote control).
The outcome was encouraging: the module was still responding, and after some tweaking the status register seemed to indicate that the payload was being sent over the radio. However, the quadcopter still did not respond to the hacking attempts, which is desperately disappointing. The code and the module are still being worked on and will be tested alongside the newly arrived NRF modules in the next few days, so watch this space for updates.
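For reference, the status register bits being checked here can be decoded as follows (bit positions are per the NRF24L01+ datasheet; the 0x2E value is just an example reading, not one captured from the module):

```python
def decode_status(status: int) -> dict:
    """Decode the NRF24L01+ STATUS register, which the chip shifts out
    on every SPI transaction (or in response to a NOP)."""
    return {
        "RX_DR":   bool(status & 0x40),  # bit 6: payload received
        "TX_DS":   bool(status & 0x20),  # bit 5: payload sent (ACK received if auto-ack is on)
        "MAX_RT":  bool(status & 0x10),  # bit 4: max retransmits reached, TX stalled
        "TX_FULL": bool(status & 0x01),  # bit 0: TX FIFO full
    }

# Example reading: TX_DS set means the radio believes the payload went out
print(decode_status(0x2E)["TX_DS"])  # → True
```

A set TX_DS with MAX_RT clear is consistent with "the payload was sent over the air", which matches the observation above that the radio side works even though the quadcopter itself does not react.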
Implementing the Image Segmentation as a video feed:
Whilst the hacking went on, some time was devoted to the much-improved image segmentation code developed in MATLAB. In the interest of making some much-needed progress on the overall system, the code was wrapped in an appropriate loop to work as a live video feed, allowing MATLAB to capture the location of the bot in a 2D environment. A tracking algorithm was also implemented to count the objects detected and output their coordinates as a 5×2 matrix (the maximum number of objects recorded was set at 5). The result was much more satisfactory than expected: the image segmentation (with added filtering) was able to detect objects both near and far, and changing the background did not seem to make much difference. The reference image for the background is captured before the video loop starts, so with the camera fixed in a still position, small changes in the background can be filtered out quite easily; the only remaining constraint on performance seems to be the lighting.
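Although the actual implementation is in MATLAB, the per-frame logic of the feed (background differencing, connected-component labelling, and a padded 5×2 output) can be sketched in Python roughly as follows; the threshold value and the list-of-lists frame format are assumptions for the sketch:

```python
from collections import deque

def segment_centroids(frame, background, thresh=30, max_objects=5):
    """Difference a grayscale frame against a fixed background image,
    label the connected foreground regions, and return up to max_objects
    centroids as a max_objects x 2 list of [x, y], padded with [-1, -1]."""
    h, w = len(frame), len(frame[0])
    fg = [[abs(frame[y][x] - background[y][x]) > thresh for x in range(w)]
          for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if not fg[y][x] or seen[y][x]:
                continue
            # flood-fill one connected component (4-connectivity)
            queue, pixels = deque([(y, x)]), []
            seen[y][x] = True
            while queue:
                py, px = queue.popleft()
                pixels.append((py, px))
                for ny, nx in ((py - 1, px), (py + 1, px), (py, px - 1), (py, px + 1)):
                    if 0 <= ny < h and 0 <= nx < w and fg[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            mean_x = sum(px for _, px in pixels) / len(pixels)
            mean_y = sum(py for py, _ in pixels) / len(pixels)
            centroids.append([mean_x, mean_y])
    # cap at max_objects and pad, mirroring the fixed 5x2 output matrix
    centroids = centroids[:max_objects]
    while len(centroids) < max_objects:
        centroids.append([-1, -1])
    return centroids
```

In the MATLAB version this labelling and centroid extraction is what `regionprops` provides directly; the sketch just makes the underlying steps explicit.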
One more encouraging aspect was the discovery of region properties for the detected objects. Several properties can be extracted from a segmented image (such as the centroid, which is used when tracking the object itself), and a particularly interesting one is the area covered by the object in pixels. Given that the objects sit in the same horizontal frame as the camera, the area occupied by an object could be used to estimate exactly how far it is from the camera. This is something of a long shot, since the bots' movement can make exact distance calculation difficult, but it could be highly useful for simplifying the system further.
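Under a pinhole-camera assumption the apparent area falls off with the square of distance, so a single calibration measurement would be enough for a rough estimate. The calibration numbers below are hypothetical, purely to show the relationship:

```python
import math

def estimate_distance(area_px, ref_area_px, ref_distance):
    """Pinhole-camera approximation: apparent area scales with 1/d^2,
    so d ≈ d_ref * sqrt(A_ref / A).

    ref_area_px / ref_distance are calibration values (the bot's measured
    pixel area at a known distance); the figures used in the example
    below are made up.
    """
    return ref_distance * math.sqrt(ref_area_px / area_px)

# Hypothetical calibration: the bot covers 400 px at 1.0 m.
# A reading of 100 px would then put it at roughly 2.0 m.
print(estimate_distance(100, ref_area_px=400, ref_distance=1.0))  # → 2.0
```

This is exactly the long shot described above: segmentation noise and the bot's changing pose would shift the measured area, but even a coarse depth estimate could simplify the system.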
Implementation of the Swarm Algorithms
Some time was also dedicated to finding a suitable way to incorporate the swarm algorithm into the system. The solution settled on was a simple function that takes in the quadcopter positions and runs the calculations needed to determine where each bot should move. This can be done internally in the function, and the only output required is the specific state each bot should be in (i.e. Move Left, Move Right, etc.). This was implemented and tested with just the X-coordinates considered: given the resolution of the location sensing and the nature of the uniform distribution, the horizontal movement is more important than the vertical. The results make sense, and functions such as these could be the best way to achieve the project's objectives, as the functions themselves can be further improved to make the system more dynamic. Furthermore, this implementation leaves room for newer swarm algorithms to be applied without much change to the overall code.
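A toy version of such a function, spreading the bots uniformly along X only, might look like the following. The even-spacing rule and the deadband value are illustrative stand-ins for the project's actual swarm algorithm, which this interface is designed to let us swap in later:

```python
def swarm_states(xs, width, deadband=10):
    """Map bot X-coordinates to discrete movement states.

    Toy rule: give each bot an evenly spaced target slot across the frame
    (assigned by left-to-right rank) and command it toward that slot.
    Both the spacing rule and the deadband are assumptions for this sketch.
    """
    n = len(xs)
    targets = [width * (i + 1) / (n + 1) for i in range(n)]
    order = sorted(range(n), key=lambda i: xs[i])  # leftmost bot gets leftmost slot
    states = [None] * n
    for rank, i in enumerate(order):
        delta = targets[rank] - xs[i]
        if delta > deadband:
            states[i] = "Move Right"
        elif delta < -deadband:
            states[i] = "Move Left"
        else:
            states[i] = "Hold"
    return states

# Three bots in a 640-px-wide frame: targets are 160, 320, 480
print(swarm_states([0, 320, 640], width=640))
# → ['Move Right', 'Hold', 'Move Left']
```

Because only the state strings leave the function, a different algorithm (or the Y-axis) can be dropped in behind the same interface without touching the rest of the code, which is the point made above.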
The project is entering its final run-in, and it is clearly evident that a lot of work remains. The most important piece is the quadcopters themselves: unless that is sorted out and bots are flying around, the project will be a failure. Time is short and there is a need for urgency; however, success is still not beyond the realm of possibility, so there needs to be more focus and determination from now until the final presentation.