Let us proudly present our new public demo. For a long time, people have been asking for a way to quickly test our AI technology.
The Niland core technology consists of two functions: automatic tagging and suggestions of acoustically similar tracks.
This technology is meant to help build better recommendation engines for music apps. The rapid growth of digital media delivery in recent years has increased the demand for tools and techniques to manage huge music catalogues. B2B (distributors, publishers and libraries) and B2C (music streaming services) actors face the same issue: music content is exploding, which makes searching for specific tracks challenging.
Until now, media file search queries have been performed textually, which is effective to an extent. But what if you want to find music similar to a track you already have? Or alternative tracks based on voice, mood, or instrumentation? This is where Niland's audio search enters the picture.
This demo is based on deep learning algorithms applied to machine listening. Machine listening is a branch of AI that lets machines analyze and understand music by processing the audio signal itself, rather than relying on metadata (keywords and descriptions that depend not only on human effort but also on human subjectivity).
Try the Music Classification and Similarity Technology:
Paste any URL from SoundCloud…
… and you get an estimate of how strongly the track belongs to different tags (genre, instrument, mood, vocals…).
You also get a list of the tracks in the catalogue that are most acoustically similar to your track.
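To give an idea of what "acoustically similar" means in practice, here is a minimal sketch in Python. It assumes (purely for illustration; the vectors, track names, and scoring are made up and are not Niland's actual model or API) that each track has already been reduced to a numeric feature vector by a machine-listening model; similar tracks are then simply those whose vectors are closest, here ranked by cosine similarity.

```python
import math

# Hypothetical feature vectors, standing in for what a machine-listening
# model might extract from the raw audio signal (values are invented).
library = {
    "track_a": [0.9, 0.1, 0.3],
    "track_b": [0.8, 0.2, 0.4],
    "track_c": [0.1, 0.9, 0.7],
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def most_similar(query, catalogue, top_n=2):
    # Rank catalogue tracks by similarity to the query vector, best first.
    ranked = sorted(
        catalogue.items(),
        key=lambda item: cosine_similarity(query, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_n]]

print(most_similar([0.85, 0.15, 0.35], library))
```

A real system works on far higher-dimensional vectors and indexes millions of tracks, but the principle is the same: the search compares what the music sounds like, not what its metadata says.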