The main goal of the mbed project (according to its promotional material) is “to meet the needs of a new professional developer audience. It delivers free tools and fundamental open-source hardware and software building blocks for the rapid development of innovative ARM-based devices. The project also enables the easy integration of connectivity, sensor, and cloud-service software components and the tools and support for a dynamic, collaborative developer and partner ecosystem.”
Simon Ford, ARM's director of IoT platform development, said: “The mbed project is bringing together leading technology companies to create a step change in productivity for embedded device development. We have learnt from the web and smartphone revolutions that by building an open-source software platform with reusable software components and free development and collaboration tools, we can enable the creation of IoT and smart devices on a previously unimagined scale.” (Source: ST.com Press Release.)
ARM hosts a dedicated website where the community of system and software engineers can collaborate and share their experiences in producing embedded systems, for example in implementing complex functions such as image processing and gesture recognition.
Image processing is the specialty of one of the community's partners, the company eyeSight. I found the solutions section of the company's website very interesting: “Using eyeSightEmbedded our machine vision algorithms can be implemented on DSP and GPU level, allowing the technology to be distributed between different processing units, offering a very powerful and efficient gesture recognition system without impacting the main application processor.”
The technology used in eyeSight's Touch Free system “utilizes advanced real-time image processing and machine vision algorithms to track the user's hand gestures and convert them into commands. [See Figure 1.] These commands are then used to control functions and applications within the device, creating a Natural User Interaction. The technology is completely software based and is independent of the underlying processor and camera hardware. The technology from eyeSight produces robust gesture recognition using only a standard 2D camera. It is compatible with 3D stereoscopic sensors, and IR illumination. The Touch Free technology can be easily integrated into various levels of the digital device: on the chipset level, operating system, as part of the camera module, or simply integrated in application level.”
The technology is highly optimized for CPU power consumption, which makes it ideal for integrated SoC systems. Because it requires only a standard VGA camera, it is also suitable for handheld devices such as smartphones and tablets. The user can control many devices (smartphones, tablets, PCs, and set-top boxes) in a touch-free mode. The technology can also be used for movement detection, because eyeSight's visual recognition operates across a wide range of lighting conditions, even very dynamic, fast-changing ones. (See Figure 2.)
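The eyeSight engine itself is proprietary, but the “track the user's hand gestures and convert them into commands” idea quoted above can be sketched in a few lines. The following is a minimal, hypothetical illustration, assuming the vision front end already provides a track of hand-centroid positions in image coordinates; it simply classifies the net displacement as a swipe command.

```cpp
#include <cassert>
#include <cmath>
#include <string>
#include <vector>

// Hypothetical sketch only: turn a tracked hand trajectory into a swipe
// command. It illustrates the gesture-to-command step, not eyeSight's
// actual (proprietary) algorithms.
struct Point { double x, y; };

// Classify the net displacement of a centroid track as a swipe direction.
// Returns "none" when the motion is below a noise threshold (in pixels).
// Note: in image coordinates y grows downward, so positive dy is "down".
std::string classifySwipe(const std::vector<Point>& track,
                          double threshold = 40.0) {
    if (track.size() < 2) return "none";
    double dx = track.back().x - track.front().x;
    double dy = track.back().y - track.front().y;
    if (std::hypot(dx, dy) < threshold) return "none";
    if (std::fabs(dx) >= std::fabs(dy))
        return dx > 0 ? "swipe_right" : "swipe_left";
    return dy > 0 ? "swipe_down" : "swipe_up";
}
```

A real recognizer would also consider timing, hand pose, and per-frame confidence, but the mapping from a tracked motion to a discrete device command is the core of any such Natural User Interaction layer.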
The features of this technology make it well suited to realizing a complex system based on an Internet of Things (IoT) approach. Following this philosophy, the ARM community presented the Robodog solution in a video that shows the potential of this approach. The Robodog ERIC (Embedded Robotic Interactive Canine) incorporates concepts from:
- Electronics engineering
- Mechanical engineering
- Software engineering
- Computer engineering
- Inverse kinematics
- Speech recognition
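The inverse-kinematics item in the list above is the textbook problem behind any legged robot's motion: given a desired foot or gripper position, solve for the joint angles that reach it. As an illustrative sketch only (the link lengths and the planar two-link model are my assumptions, not details of the actual ERIC design), here is the standard law-of-cosines solution for a planar two-link limb, with forward kinematics used to verify the result.

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: analytic inverse kinematics for a planar two-link
// limb (the "elbow-down" solution). Not the actual ERIC implementation.
struct JointAngles { double shoulder, elbow; bool reachable; };

// Solve for joint angles placing the end effector at (x, y),
// given link lengths l1 and l2.
JointAngles solveIK(double x, double y, double l1, double l2) {
    double d2 = x * x + y * y;
    double c = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2); // cos(elbow)
    if (c < -1.0 || c > 1.0) return {0.0, 0.0, false};     // out of reach
    double elbow = std::acos(c);
    double shoulder = std::atan2(y, x) -
        std::atan2(l2 * std::sin(elbow), l1 + l2 * std::cos(elbow));
    return {shoulder, elbow, true};
}

// Forward kinematics, used here only to check the IK solution.
void forwardK(const JointAngles& q, double l1, double l2,
              double& x, double& y) {
    x = l1 * std::cos(q.shoulder) + l2 * std::cos(q.shoulder + q.elbow);
    y = l1 * std::sin(q.shoulder) + l2 * std::sin(q.shoulder + q.elbow);
}
```

A walking robot like ERIC repeats a computation of this kind for every leg on every control cycle, which is one reason a microcontroller is dedicated to movement control.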
The ERIC system can recognize a specific object through visual identification, implemented with a touch-free solution (as in the eyeSight system). The Robodog can then move to the selected object and grab it from among a scene of other objects. (See Figure 3.)
The central core of the Robodog ERIC is implemented with two microcontrollers: the first is dedicated to movement control and the second to image processing. (See Figure 4.)
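Splitting the work across two microcontrollers means the vision core must hand results to the motion core over some link (UART, SPI, and shared memory are all common; the post does not say which ERIC uses). As a purely hypothetical sketch, the packet layout below shows one simple way such a handoff could be framed, with a start byte and an XOR checksum so the motion core can reject corrupted frames.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical sketch: a "move toward target" command sent from the
// image-processing microcontroller to the motion-control microcontroller.
// The real ERIC link and packet layout are not documented in the post.
struct MoveCommand { int16_t headingDeg; uint8_t speed; };

// Encode: start byte 0xAA, 3-byte payload, XOR checksum of the payload.
std::vector<uint8_t> encode(const MoveCommand& cmd) {
    std::vector<uint8_t> f{0xAA,
        static_cast<uint8_t>(cmd.headingDeg >> 8),
        static_cast<uint8_t>(cmd.headingDeg & 0xFF),
        cmd.speed};
    uint8_t sum = 0;
    for (std::size_t i = 1; i < f.size(); ++i) sum ^= f[i];
    f.push_back(sum);
    return f;
}

// Decode on the motion core; returns false on a framing or checksum error.
bool decode(const std::vector<uint8_t>& f, MoveCommand& out) {
    if (f.size() != 5 || f[0] != 0xAA) return false;
    uint8_t sum = 0;
    for (std::size_t i = 1; i < 4; ++i) sum ^= f[i];
    if (sum != f[4]) return false;
    out.headingDeg = static_cast<int16_t>((f[1] << 8) | f[2]);
    out.speed = f[3];
    return true;
}
```

The design point is that each core keeps its own real-time loop and exchanges only small, checkable messages, so a vision stall cannot destabilize the gait controller.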
This dual-microcontroller solution can be adapted to the requirements of any company that joins the ARM mbed project and is interested in producing smart integrated systems, with microcontrollers acting as CPUs that handle image processing, actuator procedures, and motor control by means of an IoT approach. The Robodog can also be used to guard an object, reacting when the selected object is taken away, and it recognizes verbal commands.
Have you ever worked with an embedded SoC similar to the ERIC Robodog presented in this blog, integrating image processing and motor control? What do you think about the potential uses of this system? Have you ever used a dual-microcontroller approach to realize an IoT embedded system? What do you think of eyeSight's Touch Free technology?