It has been a long time since my last blog post. My progress has been slow over the past four months, but the important thing is that I'm back in Malaysia and have gotten back to work on the project at hand. Since my last post, a lot of aspects of the project have changed, and I'll be explaining these changes in this post and giving an overview of how the whole process will work from this point on.
Program Flow
After a discussion with my supervisor, some aspects of the project were changed. Firstly, it was pointed out that it was unwise to try to build my own quadcopters when the project demands multiple bots. It would add complexity that I can do without, so the obvious way forward was to concentrate my efforts on hacking the toy bots. This means I do not have to come up with the flight algorithms or implement complicated electronics, like a gyroscope, on hand-made bots.
Another aspect that changed was the bots communicating with each other. Once again, this was deemed unnecessary complexity that would make the project harder to complete in time and harder to troubleshoot and keep operational by the end. Instead, it was decided to use a computer running Matlab as the main brain for the project. That way, the Swarm Algorithm simulations already developed can be plugged directly into the code, and a new medium to translate the Swarm Algorithms is not needed, vastly simplifying the work left in the project.
Finally, the location sensing problem was discussed, and the best solution for the scale of the project is to use a camera (either a webcam or better) to pick out the bots in the environment. This avoids the cost of buying and implementing GPS and also improves accuracy in enclosed spaces. Since this project is a proof of concept of using swarms for area search, using this method for location sensing does not hamper the project objectives.
The above diagram shows a simplified program flow where the Arduino is just a conduit for communicating with the toy bots and nothing more. The PC with Matlab executes all the necessary calculations and does the location sensing, simplifying the overall operation of the project a great deal.
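To make that flow concrete, here is a minimal sketch of what the Matlab side of the loop could look like. The COM port, baud rate and five-byte command layout are placeholders I have made up, not the toy bots' actual protocol, and the commented-out locateBots/swarmStep calls stand in for the parts still to be written.

```matlab
% Minimal sketch of the sense-think-act loop on the PC side.
% The Arduino on COM3 (placeholder) simply forwards whatever bytes it
% receives over the NRF24L01+ radio; the five-byte command layout
% [botID throttle yaw pitch roll] is a made-up placeholder format.

ard = serial('COM3', 'BaudRate', 115200);   % Arduino acting as a radio conduit
fopen(ard);

for k = 1:200
    % positions = locateBots();             % location sensing via the camera (to do)
    % cmd = swarmStep(positions);           % swarm algorithm decides the next move (to do)
    cmd = uint8([1 50 0 0 0]);              % placeholder command for bot #1
    fwrite(ard, cmd);                       % PC -> Arduino -> toy bot
    pause(0.1);                             % roughly 10 Hz control loop
end

fclose(ard);
delete(ard); clear ard
```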
Fixing hardware issues
Previously, the wireless communication modules (NRF24L01+) were found to be not operating properly. Unfortunately, I lost the modules I had been using during my travels, so I had to wait until I came back to order more. In the meantime, I did some research and found that the reason these modules seem to stop working intermittently is that, especially on clone Arduinos, the host's +3.3V supply cannot deliver enough current during transmission bursts, so the supply dips and the module drops out. The common fix is to add a decoupling capacitor directly across the VCC and GND pins of the module to smooth out the supply.
Armed with this information, I wanted to try it out for myself. Unfortunately, I only had 47uF capacitors available at home, so I could only try it out with those. I soldered them onto two of the newer modules and ran some of my old code that had been failing sporadically, and the fix worked: the old test code now runs reliably. However, 47uF may be a bit high, as the usual recommendation is somewhere between 0.33uF and 10uF, so I will get some more suitable capacitors later and fit them to the other modules I have available.
Working on Location Sensing via image processing in Matlab
Matlab's Image Processing Toolbox makes it easy to pick up image processing quickly. There are a few techniques I have in mind for locating the quadcopters with a webcam, but I thought I'd try picking out objects based on colour first, given the vast amount of material available online on this aspect of the toolbox.
After spending some time learning how the Image Processing Toolbox works, I was able to come up with a script in Matlab that takes snapshots in a loop, extracts the red component of each image, median-filters it to remove noise, and then converts what is left to a binary image. After working out how to recognise red objects, I went ahead and modified the script to count how many objects are detected, work out their x-y coordinates, and display them in real time overlaid on the camera's view.
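The script is along these lines; this is a rough sketch rather than my exact code, assuming the usual red-channel-minus-greyscale approach, with the threshold (0.18) and minimum blob size (300 px) as placeholder values to tune.

```matlab
% Rough sketch of the colour-based detection described above:
% subtract the greyscale image from the red channel, median-filter the
% result, threshold to binary, then label the blobs and read off their
% centroids. Threshold and minimum blob size are placeholder values.

cam = webcam;                                      % MATLAB webcam support package

while true
    rgb = snapshot(cam);
    red = imsubtract(rgb(:,:,1), rgb2gray(rgb));   % isolate the red component
    red = medfilt2(red, [3 3]);                    % median filter to remove noise
    bw  = im2bw(red, 0.18);                        % convert the leftover to binary
    bw  = bwareaopen(bw, 300);                     % drop specks below 300 px

    stats = regionprops(bwlabel(bw, 8), 'Centroid');

    imshow(rgb); hold on;
    for i = 1:numel(stats)
        c = stats(i).Centroid;                     % x-y coordinates of each object
        plot(c(1), c(2), 'g+', 'MarkerSize', 12, 'LineWidth', 2);
        text(c(1)+10, c(2), sprintf('(%.0f, %.0f)', c(1), c(2)), 'Color', 'g');
    end
    hold off; drawnow;
end
```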
Initially, it looked to be working pretty well, but the code was too sensitive: it was picking up the redness of my skin as well as any red object in front of the camera. Tuning down the sensitivity made it harder to pick out objects more than 20 cm away, which is a big problem. I then tried detecting objects fitted with red LEDs, hoping the shining red light would stand out even with the sensitivity kept low. That added about a foot and a half to the detection range, so I can now detect objects with red LEDs up to about 3 feet away, but it fails to recognise anything beyond that.
This coming week I will try another technique, segmentation against a reference image, to see if that works better for the purposes of this project. I hope it will be possible to get that working, but I will also need to run some field tests at the venue of my final presentation at my university, because the lighting there on the day might be a factor. I will also try out some store-bought webcams (currently I am using the one built into my laptop, which I must say is quite impressive in terms of definition).
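The rough idea, sketched here as simple frame differencing against a stored reference (the threshold and minimum blob size are made-up values that would need tuning under the venue's lighting), would be something like this:

```matlab
% Sketch of the reference-image idea: capture a frame of the empty
% scene first, then flag anything that differs from it by more than a
% threshold. Threshold (25) and minimum blob size (300 px) are
% placeholder values to tune for the actual lighting conditions.

cam = webcam;
reference = rgb2gray(snapshot(cam));        % empty scene, captured once

while true
    frame = rgb2gray(snapshot(cam));
    delta = imabsdiff(frame, reference);    % pixels that changed since the reference
    bw    = delta > 25;                     % simple fixed threshold
    bw    = bwareaopen(bw, 300);            % ignore small changes (noise, shadows)

    stats = regionprops(bw, 'BoundingBox');
    imshow(frame); hold on;
    for i = 1:numel(stats)
        rectangle('Position', stats(i).BoundingBox, 'EdgeColor', 'y');
    end
    hold off; drawnow;
end
```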
It's been a slow period for the project recently, but I am rejuvenated in my vision for it now that I have a clear idea of what I want to achieve, and it seems feasible even in the small amount of time I have left. Of course, the main part of the project hinges on my being able to hack the toy quadcopters, and I am hoping to have more results on that front soon as I take a fresh look at how they work and how it might be possible to hack one. Let's hope the new year brings more success on that front! Here's wishing you all a very belated but nonetheless necessary Happy New Year. Let's look out for new horizons in the year ahead.
1 Comment
William Young · 2015-01-20 at 12:02
Great job buddy. Keep it up!