As the end of my internship approached, I was asked to move from feature extraction to working on the machine learning models. For the past few weeks I had been working with multiple libraries to retrieve information from musical tracks, and it was time to test what I had so far on actual models.
The library chosen for implementing those ML models was mlpack, a C++ machine learning library designed with a focus on speed and performance. The week started with me reading the library's documentation to figure out which machine learning algorithms were available and how to turn them into working code. The design of the library was clear and direct, which made it a pleasure to use.
The first algorithm I decided to use was Naive Bayes, a popular statistical machine learning algorithm that classifies by applying probability rules to per-class feature distributions. Next came a decision tree, a popular predictive model that treats features as branches and classes as leaves. A random forest was also deployed, an algorithm that combines the predictions of a multitude of decision trees. I also used several other models, such as softmax regression, a linear support vector machine, and a basic three-layer neural network.
After the code for the models was finally written, results were gathered with accuracy as the evaluation metric. However, the results did not look promising at all, with accuracies of 40% or less. This could be an indication that the correlation between the predicted classes and the features used to train the models is weak, which would call for further feature extraction.
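For reference, the accuracy metric itself is simple: the fraction of predictions that match the ground-truth labels. A minimal from-scratch helper (a hypothetical function, not part of mlpack or the internship code) might look like:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Accuracy: fraction of predicted labels that equal the actual labels.
// (Hypothetical illustration helper, not mlpack's API.)
double Accuracy(const std::vector<int>& predicted,
                const std::vector<int>& actual) {
  assert(predicted.size() == actual.size() && !actual.empty());
  std::size_t correct = 0;
  for (std::size_t i = 0; i < predicted.size(); ++i)
    if (predicted[i] == actual[i]) ++correct;
  return static_cast<double>(correct) / actual.size();
}
```

For example, `Accuracy({0, 1, 1, 0, 2}, {0, 1, 2, 0, 2})` yields 0.8, since four of the five predictions match.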
Looking at the results discussed above, it seemed it was time to extract more features to enhance the performance of the models. Our plan for the upcoming week is to use MFCCs (mel-frequency cepstral coefficients) to pick up on frequency and cepstrum patterns as a descriptor of timbre (something I already started working on near the end of the week).
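As a preview of where MFCCs come from: after mapping the power spectrum onto the mel scale and taking logarithms, the cepstral coefficients are the DCT-II of the log filterbank energies, and the low-order coefficients summarize the spectral envelope, which is what makes them a decent timbre descriptor. A from-scratch sketch of that final step, not any particular audio library's API:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Final MFCC step: DCT-II of the log mel-filterbank energies.
// mfcc[n] = sum_k logMel[k] * cos(pi * n * (k + 0.5) / m)
// (Unscaled DCT-II; real libraries usually apply an orthonormal factor.)
std::vector<double> MelToCepstrum(const std::vector<double>& logMelEnergies,
                                  std::size_t numCoeffs) {
  const double pi = std::acos(-1.0);
  const std::size_t m = logMelEnergies.size();
  std::vector<double> mfcc(numCoeffs, 0.0);
  for (std::size_t n = 0; n < numCoeffs; ++n)
    for (std::size_t k = 0; k < m; ++k)
      mfcc[n] += logMelEnergies[k] * std::cos(pi * n * (k + 0.5) / m);
  return mfcc;
}
```

A sanity check on the math: a flat log-mel spectrum has no envelope structure, so everything lands in coefficient 0 and the higher coefficients come out (numerically) zero.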