I started off the project by writing some test cases and carrying out some automated testing. I didn’t understand everything that was going on at that point, but I hoped that testing would shed some light on the behaviour of the back end. The first thing to do when writing a test is to form an assumption or expectation about some behaviour of the application. The Wiki page of the project’s Git repository documents what each REST API endpoint should do, so those descriptions formed the basis of my test cases. If one or more test cases failed, it was up to me to find out what was causing the failure and correct it so that the tests could pass. This meant looking at the part of the code my test was based on and reading through the messy code to get a grip on what was going on. It was very tempting to refactor everything, but doing so would have made me lose focus on what was actually important first. Now that I have a suite of test cases, if there is ever a need to do some refactoring, I will not be afraid of accidentally breaking the functionality, as I have the tests to back me up.
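To make that concrete, here is roughly what such a test looks like; the endpoint path and expected body are made up for illustration, since the real ones live on the Wiki:

# Hypothetical endpoint and expected response, just to show the shape of a test
expected='{"status":"ok"}'
actual=$(curl -s http://localhost:8080/api/status)
if [ "$actual" = "$expected" ]; then echo PASS; else echo "FAIL: got $actual"; fi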
Moving up to the cloud
I then decided to work on integrating with cloud storage services using the more standard protocols that I mentioned in my previous post. For now, I have created a base class that is responsible for interfacing with the remote file system, and its subclasses implement the specific protocols used to do that. I was hoping that there would be a one-size-fits-all solution, but I haven’t been able to find one yet. I have learnt the importance of continuously testing your software even as you add new features to it, so since my tests mainly involve using the cURL command to send requests to the REST API endpoints, I also created another REST API endpoint just to exercise the integration with remote file systems, and I will be adding new test cases for it. Normally you have the basic CRUD operations when it comes to RESTful Web resources. CREATE in this case could mean attempting to access a remote file system and mount it as a local drive, READ could be downloading files from it, UPDATE could cover adding, modifying or deleting files, and finally DELETE would unmount the remote file system. That is how I decided to test it.
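To give an idea of the shape of this design (the names here are illustrative, not the actual ones in the codebase), the base class declares the operations on the remote file system and each protocol supplies its own implementation:

#include <string>

// Abstract interface to a remote file system; each protocol subclasses it.
class RemoteFileSystem {
public:
    virtual ~RemoteFileSystem() {}
    // Mount the remote resource at the given local mount point.
    virtual bool mount(const std::string& remote, const std::string& mountPoint) = 0;
    // Unmount a previously mounted file system.
    virtual bool unmount(const std::string& mountPoint) = 0;
};

// One subclass per protocol.
class SshfsFileSystem : public RemoteFileSystem { /* ... */ };
class WebDavFileSystem : public RemoteFileSystem { /* ... */ };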
I was not sure how to actually test mounting a remote file system, until Dr Shawn showed me that you can use SSHFS to access some directory on your local machine, setting the mount point to some other directory, just to start off with. But, of course, you need an SSH server already running on your machine, or the SSH client would not be able to connect to localhost. You can install the SSH server on your own machine by typing this in your terminal:
sudo apt-get install openssh-server
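With the server running, a local-only mount for testing looks something like this (the directory names are just examples):

mkdir -p ~/remote ~/mnt
sshfs username@localhost:/home/username/remote ~/mnt
# and to unmount afterwards:
fusermount -u ~/mnt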
With that, I tested SSHFS first. I used the Poco::Process API to execute the SSHFS driver as a child process, which is alright for now, although spawning a child process does incur some performance cost. Knowing that we might do this some other way in the future, I encapsulated the use of Poco::Process in a separate class.
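A minimal sketch of that encapsulation might look like the following; the class name is hypothetical, but Poco::Process::launch and ProcessHandle::wait are the actual Poco calls involved:

#include <Poco/Process.h>
#include <string>

// Hides the child-process details behind one method, so the rest of
// the code never touches Poco::Process directly and the mechanism
// can be swapped out later.
class DriverRunner {
public:
    // Runs e.g. sshfs with its arguments and returns the exit code.
    int run(const std::string& command, const Poco::Process::Args& args) {
        Poco::ProcessHandle handle = Poco::Process::launch(command, args);
        // sshfs backgrounds itself once the mount succeeds, so wait()
        // returns promptly with the driver's exit status.
        return handle.wait();
    }
};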
Fortunately, the CREATE test (i.e. accessing some directory using SSHFS) passed. Next week I will need to implement and test the READ, UPDATE and DELETE parts. I then went on to try the same thing with the WebDAV protocol, but this time using the command line for a quick test. For this, I used the fusedav driver to access my own files in the cloud storage service provided by AESTE. I was able to access my files on the cloud server, but there were problems. First, the fusedav process seems to keep running even after the mount takes place, and if you end the process it immediately unmounts the remote file system on its own. This would be bad for our application if it were to wait for the process to finish; there must be some option to make it run in the background instead. Also, I was unable to add any new files, as I kept getting a Permission denied sort of error, which is strange because I had already provided my credentials to authorize access. I must be doing something terribly wrong, and I will need to get to the bottom of it sooner or later.
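For reference, the quick command-line test amounted to pointing fusedav at the WebDAV URL and a local mount point, along these lines (the URL is a placeholder, and fusedav prompts for credentials when the server requires them):

mkdir -p ~/dav
fusedav https://webdav.example.com/files ~/dav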
FUSE
If I have not clearly explained how those protocols work, let me briefly do so now. Filesystem in Userspace (FUSE) is basically a software interface that allows users to implement their own filesystems in user space. Usually, only the root user has the privileges to mount another filesystem, but FUSE enables normal non-privileged users to do the same thing. That is why I was able to mount a remote filesystem as a local drive on my machine: the protocols make use of FUSE to let you do just that. With FUSE, you can carry out the same file operations on your mounted filesystem as you would on your local filesystem, using commands such as ls, cp, mv and many more.
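In other words, once SSHFS or fusedav has done its job, the mount point behaves like any other directory (the paths are again just examples):

ls ~/mnt                          # list the remote files
cp report.txt ~/mnt/              # upload by copying a file in
mv ~/mnt/old.txt ~/mnt/new.txt    # rename a file on the remote side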
There were many mistakes that I made during my internship here at AESTE, and I am trying not to make the same mistakes again. I am doing things little by little; no more shortcuts or ambitious feats that would probably get me into trouble as I continue working on the project later on. I am still inexperienced in the world of software development, so this should be the way to go for now. Unlike in my previous project, I am doing my testing alongside my implementation and being more cautious about the design choices I make. I want every single component of the application to be testable, and that involves reducing the coupling between the classes. Classes should be self-contained and have a single responsibility, which makes them easier to test. I also want the codebase to be maintainable, so that future interns will not have to go through the same pain as I did to understand the inner workings of the software, and will not be afraid to make the necessary changes.
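To make the decoupling point concrete: code written against the RemoteFileSystem interface sketched earlier can be tested with a fake that never touches the network. This fake is purely illustrative:

// Records calls instead of mounting anything, so unit tests stay fast
// and need no SSH or WebDAV server.
class FakeFileSystem : public RemoteFileSystem {
public:
    bool mounted = false;
    bool mount(const std::string&, const std::string&) override {
        mounted = true;
        return true;
    }
    bool unmount(const std::string&) override {
        mounted = false;
        return true;
    }
};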
As you can see, software development is a very disciplined field. Anything you do affects not just you, but also the people currently working on the same project and those who will work on it in the future. Rushing software development does no good either; you still have to go through an iterative process in any software project. You can only get better by realizing what slows you down and finding ways to be more efficient. In the end, you still have to take things one at a time, step by step, little by little.