AI has seen a steady increase in popularity over the last few years, but actually putting these advances to use has always been a challenge. While the algorithms exist, the hardware tends to be expensive, underpowered, or frankly not available. Well, Nvidia has heard the pleas of developers and has announced the Jetson Nano.
What exactly is the Jetson Nano?
The Jetson Nano is the latest in Nvidia's line of embedded computing boards under the “Jetson” brand. These devices are all low-power systems specifically designed for accelerating machine learning applications.
So essentially, plug a Jetson Nano into your AI-powered device and the Nano will act as its brain, handling all the computational and processing work, including tasks such as object recognition and autonomous navigation.
Why is the Jetson Nano a viable choice?
Well, for starters, the Jetson Nano is perfect for Edge computing. In case you don’t know what that is, Edge computing allows data generated by IoT devices to be processed closer to where it was originally created. So rather than sending the data to the cloud to be analyzed, processed and sent back, Edge computing lets developers and organizations analyze data in near real time.
Say you’re processing data from cameras and microphones. That data can be intercepted on its way to the cloud and deleted or modified. But because the Jetson Nano enables Edge computing, the data never has to leave the device.
Further, processing is also faster, as there is no need to rely on cloud computing and an internet connection. This is especially handy if your device does a lot of real-time identification or translation.
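To make the Edge computing idea concrete, here is a minimal conceptual sketch in Python. The detector and data are stand-ins (not anything shipped with the Jetson software), but the pattern is the point: the raw data is analyzed on the device itself, and only the results would ever need to be sent anywhere.

```python
# Conceptual Edge computing sketch: raw sensor data is analyzed
# locally, so it never has to travel to the cloud.
def process_locally(frames):
    """Stand-in detector: classify frames by average brightness."""
    results = []
    for frame in frames:
        # In a real application this would be a neural-network
        # inference step running on the device's GPU.
        brightness = sum(frame) / len(frame)
        results.append("bright" if brightness > 0.5 else "dark")
    return results

# Simulated camera frames as lists of pixel intensities (0.0-1.0).
frames = [[0.9, 0.8, 0.7], [0.1, 0.2, 0.3]]
labels = process_locally(frames)
print(labels)  # ['bright', 'dark']
```

Only `labels` would leave the device; the raw frames stay local, which is where the privacy and latency benefits described above come from.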
Who can use the Jetson Nano?
The developer kit of the Nano is aimed at embedded designers, researchers and DIY makers, according to Nvidia. The developer kit will be priced at $99, and production-ready modules will be priced at $129. While the developer kit is available to anyone, the production modules will only be sold to commercial companies, with a minimum order quantity of 1,000 modules.
Making things even more interesting, Nvidia also unveiled the JetBot. This is an autonomous robotics kit that includes a Jetson Nano, a robot chassis, a battery pack and various motors. The kit allows those who are versed in robotics and the Jetson to essentially create their own self-driving robot. It’s not cheap, though: the open-source kit is priced at $250.
What does one get with the Jetson Nano?
The Jetson Nano kit includes a quad-core ARM Cortex-A57 processor and a 128-core Nvidia Maxwell GPU, together rated at 472 gigaflops of processing power, backed by 4GB of RAM.
You also get a MicroSD card slot, HDMI 2.0 and eDP 1.4 ports, four USB 3.0 ports, and Gigabit Ethernet. In terms of framework support, you’re looking at TensorFlow, PyTorch, Caffe, Keras and MXNet, plus a full Linux desktop environment.
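Since PyTorch is among the supported frameworks, here is a hedged sketch of what running inference on a board like the Nano looks like. The tiny model below is a placeholder (on real hardware you would load a pretrained network instead), but the device-selection pattern, which picks the GPU via CUDA when present and falls back to the CPU otherwise, is the standard PyTorch idiom.

```python
# Minimal PyTorch inference sketch. The network is a toy stand-in
# for a pretrained model; the structure is what matters.
import torch
import torch.nn as nn

# Use the GPU when available (e.g. the Nano's Maxwell GPU via CUDA),
# otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),  # hypothetical 10-class classifier head
).to(device).eval()

# A dummy 224x224 RGB frame standing in for a camera capture.
frame = torch.rand(1, 3, 224, 224, device=device)

# Inference only: disable gradient tracking to save memory and time.
with torch.no_grad():
    logits = model(frame)
    prediction = logits.argmax(dim=1).item()

print(tuple(logits.shape))  # (1, 10)
```

The same script runs unchanged on a desktop and on the board, which is much of the appeal of having full framework support on a $99 device.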
How good will the Jetson Nano be?
While Nvidia’s competitively priced dev kit will certainly benefit developers, the company also faces stiff competition. For example, Intel’s Neural Compute Stick retails for $79, and Google’s Coral brand offers a $150 dev kit and a USB accelerator priced at $75.
That being said, Nvidia is entering quite an opportune market. I guess we’ll just have to wait and see what AI-powered devices are brought to life with the Jetson Nano kit. You can check out the full specs of the Jetson Nano here and even preorder one.
Are you interested in getting a Jetson Nano kit? What are your thoughts on it? We would love to hear from you.