The next step is a robot able to interact with humans. An amazing example of electronics technology applied to robotics is the Pepper Robot, the “World's First Personal Robot That Reads Emotions,” made by SoftBank Robotics of the SoftBank Group Corporation (see Figure 1):
Figure 1: The Pepper Robot (Source: SoftBank Group)
This robot opens the way to new scenarios, especially after the announcement of the collaboration between SoftBank Robotics and Microsoft:
“SoftBank Robotics Corp. (headquarters: Minato-ku, Tokyo; President: Fumihide Tomizawa) and Microsoft Corporation (headquarters: Redmond, Washington, USA; CEO: Satya Nadella) have announced a strategic collaboration in the field of cloud robotics.
As a first step, Microsoft and SoftBank Robotics will work together to create a next-generation cloud-enabled robot using “Pepper” — SoftBank Robotics' humanoid robot — and Microsoft's cloud-based Azure IoT Suite. They will then work together to build a next-generation in-store solution for the retail industry, using Microsoft's Surface Hub large-screen collaboration device and Surface 2-in-1 devices together with these new cloud-enabled robots to serve customers in person”. (Source: SoftBank Robotics Press Releases 2016)
One of the many applications of this type of robot is interacting with people from different countries who speak different languages, by means of a voice recognition system and a multi-language translator integrated into the robot.
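As a toy sketch of that idea (this is not Pepper's actual software; every phrase, intent, and reply below is invented for illustration), a recognized phrase can be mapped to the speaker's language and an intent, and the reply localized back to that language:

```python
# Minimal sketch of a multi-language voice-command dispatcher.
# The recognizer front end (speech -> text) is assumed to exist;
# here we only map recognized text to a language and an intent.

# Toy phrase book: recognized phrase -> (language, intent)
PHRASES = {
    "hello": ("en", "greet"),
    "bonjour": ("fr", "greet"),
    "hola": ("es", "greet"),
    "goodbye": ("en", "farewell"),
    "au revoir": ("fr", "farewell"),
    "adios": ("es", "farewell"),
}

# Replies per intent, localized to the speaker's language
REPLIES = {
    ("greet", "en"): "Hello, how can I help you?",
    ("greet", "fr"): "Bonjour, comment puis-je vous aider ?",
    ("greet", "es"): "Hola, ¿en qué puedo ayudarle?",
    ("farewell", "en"): "Goodbye!",
    ("farewell", "fr"): "Au revoir !",
    ("farewell", "es"): "¡Adiós!",
}

def respond(recognized_text: str) -> str:
    """Map recognized speech to a reply in the speaker's language."""
    key = recognized_text.strip().lower()
    if key not in PHRASES:
        return "Sorry, I did not understand."
    lang, intent = PHRASES[key]
    return REPLIES[(intent, lang)]

print(respond("Bonjour"))  # replies in French
print(respond("hola"))     # replies in Spanish
```

A real robot would of course use a statistical recognizer and a translation service instead of a lookup table, but the flow (recognize, identify language and intent, answer in kind) is the same.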
Electronics technology contributes in many ways to the realization of this incredible robot, as shown in Figure 2:
Figure 2: The contents of the Pepper Robot (Source: Nikkei.com)
The Pepper Robot contains many electronic components, such as microphones and touch sensors; the data coming from these peripherals is processed by a computer acting as the brain of the robot. To achieve this task, the Intel Atom microprocessor was utilized. The voices of people near the robot are received through its microphones, which act as the robot's ears.
Moreover, electronics technology offers the possibility of receiving and processing the human voice by means of dedicated voice recognition hardware, such as the EasyVR Shield 3.0 module for Arduino (see Figure 3):
Figure 3: The SparkFun EasyVR Shield Demo. (Source: YouTube)
Robotics is thus strongly enhanced by electronics technology in realizing an innovative robot like Pepper, with interesting potential to be powered by Internet of Things technology and a voice recognition feature that helps it read human emotions. That is a very interesting aspect of the i-Robot approach: do you think this type of robot will be widely used in the near future? What uses would you suggest for it?
Re: To connect or not to connect? That is the question.
It would not be surprising if, in the future, every household owned a robot to perform daily chores. However, I personally would still be a bit paranoid about the situation. Perhaps I have watched too many science-fiction movies, but I still think that robots might take over the world some day if we fail to keep them under control.
A few things about this robot caught my attention. One of them jumped out of the text just 15 microseconds after I read that "To achieve this task, the Intel Atom microprocessor was utilized". That would probably be my last choice. One of the reasons is pointed out in the article referenced in the post:
"Pepper, you could say, is a hothead. The makers seem to have wrestled with ways to dissipate heat from the computer, as well as the 3-D cameras installed in the eye sockets. (...) Two fans inside the head draw air in and expel heat. Metal parts, also used to keep heat in check, consist of two separate pieces, suggesting improvised construction."
From another angle: "As a first step, Microsoft and SoftBank Robotics will work together to create a next-generation cloud-enabled robot using "Pepper" — SoftBank Robotics' humanoid robot — and Microsoft's cloud-based Azure IoT Suite." A personal robot MUST/SHOULD be a tool/assistant/helper/whatever-you-like-to-call-it you can trust; strictly speaking, trustworthy in every sense, and safe and reliable even when taken to the extreme (http://www.planetanalog.com/author.asp?section_id=3319&doc_id=564248).
Taking into account that it has about 20 motors, plus the Atom and two 3-D cameras, it makes sense to expect low autonomy (about 12 h) and a large battery pack (4.7 kg) in an already heavy toy (29 kg).
Most of the things noted as "opening new scenarios" are not new. For example, continuous speech recognition used to be very complex to achieve, but it has been around for more than a decade; take a look at the open-source project Julius (http://julius.osdn.jp/en_index.php). Voice synthesis, face and gesture recognition, and behavioral modeling are now well-established technologies, and there are several open-source projects to look at just for fun.
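For a flavor of how a grammar-driven recognizer like Julius is set up: it pairs a `.grammar` file (sentence rules) with a `.voca` file (the words in each category and their phoneme sequences). The fragment below is only a hypothetical sketch; the category names are mine and the phoneme symbols depend on the acoustic model you use:

```
# greeting.grammar -- a sentence is silence, one command, silence
S       : NS_B COMMAND NS_E
COMMAND : HELLO
COMMAND : GOODBYE

# greeting.voca -- words per category (phonemes depend on the model)
% NS_B
<s>        silB
% NS_E
</s>       silE
% HELLO
hello      hh ah l ow
% GOODBYE
goodbye    g uh d b ay
```

These two files are compiled into a finite-state grammar that Julius uses to constrain recognition to the listed commands.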
Probably the only thing that surprises me is the price per unit: $1,650 seems very reasonable.
ASIMO did surprise me at the time. I know, it has less autonomy, it is not for the masses, it is heavier, and so on, but I liked it.
Autonomous driving represents one of the most interesting fields of research in the electronics and automotive sectors. This interest is growing thanks to projects like “Roobopoli”, which includes the design and testing of prototype autonomous cars with integrated microcontrollers.
Autonomous driving is progressively becoming an important part of the future development of e-cars, making them more intelligent and safe. The aim is to increase the comfort of e-car users and to protect the driver, alerting him or her in case of physiological issues while driving.
The e-car is progressively evolving toward autonomous driving systems that can assist the driver. In many cases this may help in dangerous situations, such as drowsiness or distress, by taking control of the vehicle and handling the situation.
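As a rough illustration of how such a system might detect drowsiness (a toy sketch, not a real ADAS algorithm; the window size and threshold are invented for the example), one common idea is to alert when the driver's eyes are closed for more than a given fraction of the recent camera frames:

```python
# Toy drowsiness monitor: raise an alert when the driver's eyes
# have been closed for more than a threshold fraction of a sliding
# window of frames (a PERCLOS-like measure). Frame-by-frame eye
# state is assumed to come from a camera and vision pipeline,
# modeled here as booleans.

from collections import deque

class DrowsinessMonitor:
    def __init__(self, window_size=30, threshold=0.4):
        self.window = deque(maxlen=window_size)  # recent eye states
        self.threshold = threshold               # closed-eye fraction

    def update(self, eyes_closed: bool) -> bool:
        """Feed one frame; return True if the driver seems drowsy."""
        self.window.append(eyes_closed)
        closed_fraction = sum(self.window) / len(self.window)
        # Require a full window before alerting, to avoid false
        # alarms during the first few frames.
        return (len(self.window) == self.window.maxlen
                and closed_fraction >= self.threshold)

monitor = DrowsinessMonitor(window_size=10, threshold=0.5)
alert = False
# Simulate 4 open-eye frames followed by 6 closed-eye frames
for closed in [False] * 4 + [True] * 6:
    alert = monitor.update(closed)
print(alert)  # True: 6 of the last 10 frames closed, above 0.5
```

A production system would add many more signals (steering behavior, lane position, head pose) before deciding to warn the driver or take over, but the sliding-window idea is the same.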