AI Linux is a Linux distribution that comes complete with artificial intelligence libraries, tools and languages. A proof-of-concept alpha version is now available, suitable for test-driving in a virtual environment such as VirtualBox.
The cloud still monopolizes the space where neural networks and their algorithms breed. We have already explored such a case in Haven OnDemand Offers Machine Learning As A Service.
Things seem to be shifting, though, with those elaborate algorithms moving to run locally on mobile devices. That includes their training too: the pictures, notes, data and metadata that reside on the device will not only be worked upon, but will also serve to train the network and aid its learning activities, such as recognizing, ranking and classifying objects. The difference is that all of this now happens locally.
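To see that local training is nothing exotic, here is a minimal sketch, using only numpy, of a single-layer perceptron learning the logical OR function. The data, the training loop and the inference all run on the local machine; nothing touches a cloud service. The dataset and update rule are textbook illustrations, not anything specific to the products discussed here.

```python
# Minimal on-device training: a perceptron learning logical OR with numpy.
# Both training and inference happen locally -- no data leaves the machine.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # input patterns
y = np.array([0, 1, 1, 1])                      # OR targets

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # random initial weights
b = 0.0                 # bias

# Classic perceptron rule: nudge weights toward each misclassified point
for _ in range(20):
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)
        error = target - pred
        w += error * xi
        b += error

predictions = [int(w @ xi + b > 0) for xi in X]
print(predictions)  # [0, 1, 1, 1]
```

OR is linearly separable, so the perceptron convergence theorem guarantees the loop above settles on correct weights; for the messier real-world tasks mentioned here (photos, speech, gestures), deeper networks are needed, but the principle is the same.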
Qualcomm’s Snapdragon 820 processors and their accompanying Snapdragon Neural Processing Engine SDK are behind such a move, which would allow manufacturers to run their own neural network models on Snapdragon-powered devices, such as smartphones, security cameras, automobiles and drones, all without a connection to the cloud. Common deep learning user experiences that could be realized with the SDK include scene detection, text recognition, object tracking and avoidance, gesture recognition, face recognition and natural language processing.
Take The Roll, for example, an iOS app that helps organize photos on the user’s phone, utilizing an algorithm that combines artistic photography principles with deep learning technology. It sorts photos by topic, location and event, and can also pick out the best shots based on a ranking system it employs.
The introduction of the new Snapdragon processor could enable apps like The Roll to shift their processing from the cloud onto the device, since working offline has distinct advantages over its online counterpart. Online processing requires either a WiFi or mobile connection, which can be sluggish, and it raises a host of privacy concerns. Looking at it from a purely practical perspective, multiple concurrent requests from thousands of client devices can easily overload a cloud-based service, leaving the client machine prone to long delays in getting a response, or even to full-scale denial of service.
But in order to get going with developing AI-powered applications locally, we don’t have to wait for Qualcomm’s SDK to be released; we are already in possession of a powerful machine, our GPU-aided personal computer! The reasoning is simple. We are used to Linux distributions for just about anything, from dedicated servers to security and penetration testing, to education, games and science. So why leave AI out of the game?
Quick to follow the trend, AI Linux is just such a distribution, described as:
a minimal Ubuntu Gnome respin stocked with examples and programming languages and libraries for Artificial Intelligence.
Its announcement says:
This distro is aimed at STEM people: Students, Teachers, and Enquiring Minds. But Scientists, Technologists, Engineers, and Mathematicians might also be interested!
It consists of the following:
- C, C++, Objective-C, Fortran, Java, Ada, and Go
- numpy, scipy, matplotlib, pandas
- scikit-learn (python3-sklearn)
- FANN (python3-fann2)
- Natural Language Toolkit
plus some AI-powered games like Backgammon, Checkers, Chess and Go, and applications like CLIPS and Weka.
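As a quick taste of the preinstalled stack, here is a short sketch combining numpy and scikit-learn (two of the libraries listed above): a 1-nearest-neighbour classifier sorting points into two toy clusters, entirely offline. The dataset is made up for illustration.

```python
# Using the bundled stack offline: numpy for data, scikit-learn for the model.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Two hypothetical clusters around (0, 0) and (5, 5)
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [4.9, 5.0]])
y = np.array([0, 0, 1, 1])

# 1-NN simply assigns each query point the label of its closest neighbour
clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
pred = clf.predict(np.array([[0.1, 0.1], [5.2, 4.8]]))
print(pred)  # [0 1]
```

Nothing here needs a network connection, which is exactly the point of having the libraries on the distro itself.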
Of course, Google’s TensorFlow stands out from the rest – AI Linux makes it easy to combine TensorFlow’s offline availability with the processing power of a modest personal computer. For even more flexibility, it can also be hosted as a virtual machine.
Couple that with the myriad TensorFlow tutorials and courses available online, such as the one on Kadenze, and get hacking right now on your own AI-powered applications!
As a side note, in case you are more of a .NET and Windows person, you can still catch the machine learning train through a framework like Accord, which is written in C#.