MEETUP: NVIDIA – BUILDING SMARTER CITIES AND MACHINES WITH JETSON

London’s meetup group Data Science for Internet of Things wrapped up 2016 with an event last week where the audience had a chance to hear the latest news about NVIDIA’s Jetson platform and to see a live demo with sensors and a Raspberry Pi.

NVIDIA – Building Smarter Cities and Machines with Jetson

In the first part of the evening, Pierre-Antoine Beaudoin from NVIDIA gave an insight into Jetson – an embedded platform and development board used in applications which require ultra-fast processing with a low energy footprint.

At the heart of Jetson is a GPU (Graphics Processing Unit), which consists of thousands of mini cores optimized for parallel processing. This parallelism is what gives the GPU its large processing power and distinguishes it from conventional CPUs. The Jetson board can be used wherever low-power, ultra-fast data processing is required: mobile medical imaging and diagnostics, wearables, computer vision (real-time video analytics, moving object detection and recognition), navigation, robotics, autonomous cars…

NVIDIA Jetson TX1

It was interesting to hear how machine learning apps actually work on the Jetson board. NVIDIA provides JetPack (Jetson Development Pack), a set of tools, SDKs and libraries, some of which are dedicated to writing high-performance machine learning applications (such as real-time video content analysis). Jetson runs Linux with these applications installed alongside NVIDIA’s TensorRT, a high-performance neural network inference engine. Peripheral devices (sensors, cameras…) are attached to the board and collect data (signals of interest such as images or sound…). The data feeds an artificial neural network, gets processed, and the output – e.g. an inferred object or piece of knowledge – is sent to the controlling app.
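
That sensor-to-inference-to-controller flow can be sketched in a few lines. This is a hedged, illustrative sketch only: the function names (`read_frame`, `infer`, `run_pipeline`) are made up for this example, and a plain lookup table stands in for a real TensorRT engine.

```python
# Illustrative sketch of the sensor -> inference -> controlling-app loop.
# All names here are hypothetical; none come from an NVIDIA SDK.

def read_frame(sensor):
    """Pretend to grab one sample (e.g. a camera frame) from a peripheral."""
    return sensor.pop(0)

def infer(model, frame):
    """Run "inference" on one input; a dict stands in for a trained network."""
    return model.get(frame, "unknown")

def run_pipeline(sensor, model, controller):
    results = []
    while sensor:                      # data keeps arriving from peripherals
        frame = read_frame(sensor)     # 1. acquire the signal of interest
        label = infer(model, frame)    # 2. feed the network, get an inference
        controller(label, results)     # 3. hand the result to the app
    return results

# Toy usage: two "frames", a lookup-table "model", a logging controller.
frames = ["cat_img", "dog_img"]
model = {"cat_img": "cat", "dog_img": "dog"}
run = run_pipeline(frames, model, lambda lbl, out: out.append(lbl))
print(run)  # ['cat', 'dog']
```

On real hardware the `infer` step is where TensorRT earns its keep, running the network on the GPU cores rather than the CPU.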

Pierre mentioned a couple of examples where companies are using Jetson and machine learning in their products:

The Dutch startup birds.ai uses artificial intelligence in asset management. One of their projects targets a relatively new discipline called Precision Agriculture. With a Jetson TX1 on board, drones fly over fields or plantations and take aerial photos. Computer vision software running on Jetson is trained to detect specific objects (e.g. plants or trees) as well as their state, which is used to determine which area or crop has to be watered, which can be harvested, etc.

Precision Agriculture example: Birds.ai aerial imagery with plant identification

Another example of utilizing Jetson’s power for computer vision analytics on the go comes from Estonia: Starship‘s futuristic six-wheeled robot, which works as a personal courier. It can bring you groceries from your local shop, navigating itself while avoiding pedestrians and obstacles.

Starship – a self-driving delivery robot

Pierre pointed out that embedded AI solutions usually come from startups and small companies. He emphasized that NVIDIA is committed to providing tools for the Jetson community, which keeps coming up with innovations and solutions. It provides JetPack for free to all members of the NVIDIA Embedded Developer Program.


Stress detection system using Raspberry Pi

The second speaker at this event was Nam Tran, a machine learning researcher, who demonstrated how a Raspberry Pi can detect a person’s mental state – whether they are stressed or relaxed.

Mental state is reflected in heart function: in periods of stress, heart rate increases and heart rate variability (HRV) decreases. There are commercial products on the market which have two hardware parts: a pulse sensor, usually embedded in a watch-like device, and a mobile device which runs heart-signal data processing software (an app). The watch and the mobile device usually talk via Bluetooth.

HRV data
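
To make the HRV idea concrete, here is a hedged sketch of one common HRV measure, RMSSD (root mean square of successive differences between beat-to-beat intervals). The interval values are invented for illustration and the talk did not specify which HRV metric Nam used.

```python
# Sketch: RMSSD, a standard time-domain HRV measure.
# RR intervals are the times between consecutive systolic peaks, in ms.
import math

def rmssd(rr_intervals_ms):
    """RMSSD over a list of beat-to-beat (RR) intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Relaxed: slower, more variable beat timing -> high HRV.
relaxed = [812, 790, 835, 805, 842, 798]
# Stressed: faster, more regular beats -> low HRV.
stressed = [655, 660, 652, 658, 654, 659]

print(rmssd(relaxed) > rmssd(stressed))  # True
```

A higher RMSSD for the relaxed trace matches the relationship described above: stress pushes HRV down.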

Nam ported this solution to the Raspberry Pi: he connected a wearable sensor to it and ran software on the Pi which recorded the pulse, calculated HRV, ran the learning engine and produced the result – the diagnosis.

It was possible to see all steps of Machine Learning in action:

  • data acquisition (Nam collected 1-minute samples of raw data; preprocessing to extract systolic peaks, filtering, feature extraction)
  • training & testing set generation
  • defining the predictive model (the model architecture – a decision tree in this case – plus parameters and knowledge input)
  • model training
  • deploying the model
  • making predictions from the model (the output is the probability that a sample instance matches the “stress” or “relaxation” state)
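
The steps above can be sketched end to end. This is a hedged toy version: the HRV feature values are synthetic, and a one-feature "decision stump" stands in for the actual decision tree Nam used, whose details were not given.

```python
# Toy end-to-end sketch of the workflow above. All data and names are
# illustrative; a threshold "stump" substitutes for the real decision tree.

# 1. Acquired feature per 1-minute sample: here, a single HRV value + label.
samples = [(58.0, "relaxed"), (61.5, "relaxed"), (12.3, "stressed"),
           (15.8, "stressed"), (55.2, "relaxed"), (10.9, "stressed")]

# 2. Training & testing set generation.
train, test = samples[:4], samples[4:]

# 3-4. Define and "train" the model: threshold at the midpoint of class means.
def fit_stump(data):
    stress = [x for x, y in data if y == "stressed"]
    relax = [x for x, y in data if y == "relaxed"]
    return (sum(stress) / len(stress) + sum(relax) / len(relax)) / 2

# 5-6. Deploy and predict: a crude 0/1 "probability" from the threshold.
def predict_proba(threshold, hrv):
    p_stressed = 1.0 if hrv < threshold else 0.0
    return {"stressed": p_stressed, "relaxation": 1.0 - p_stressed}

threshold = fit_stump(train)
for hrv, label in test:
    print(hrv, predict_proba(threshold, hrv), "true:", label)
```

A real deployment would output graded probabilities from the trained tree rather than a hard 0/1 split, but the shape of the pipeline – acquire, split, define, train, deploy, predict – is the same.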

Mental State Machine Learning model deployment


Data Science for Internet of Things meetups will resume in January 2017.
