
Caroline Hayes meets robots that do more than heavy industrial work – the companions and service robots that interact with humans

Robots will operate beyond the factory floor to interact closely with humans in future. For that to happen, it is essential that robots are able to perceive people, react to their actions, and do not endanger them in any way, says EBV Elektronik.

The distributor claims to supply all the electronic components necessary for the development of autonomous robots that will be capable of interacting closely with people.

Bernard Vicens, director of smart consumer and building at EBV, believes it is an exciting period for the industry.

“In the scope of our business, we are still at the very early phase, but applications are across all market segments,” he argues. “Requirements are huge in terms of computing power, power management, security, sensors, human machine interfaces [HMI] and communications so – soon – this market will become strategic.”

A key trend is direct collaboration between humans and machines. They already help in disaster relief, the home and in surgical operations. In doing so, they are increasingly moving out of protected, encapsulated work spaces and interacting directly with humans. That demands very high standards of safety and functional reliability.

There are already numerous standards designed to ensure that industrial robots do not pose a danger to humans or to their environment, even in the event of a malfunction. For example, the safety standard EN ISO 10218 sets out the rules for collaborative robots (colloquially known as cobots), and can also be applied outside the industrial sphere.

Making sense of surroundings

To interact safely with humans or with other robots, service robots use sensors to collect data about the world around them. EBV believes that, in future, cognitive skills will additionally enable them to predict and interpret human actions in order to derive their own helpful, safe responses. Multiple sensor systems, such as cameras, ultrasonic sensors and pressure-sensitive sensors, can be used by robots to assess their environment precisely and detect any motion within the space.

Merging data from an array of different sensors means robots can track dynamic obstacles and estimate their position and speed. As a result, robots are able to compute how an obstacle – such as a person – will move, and whether a collision might ensue.

As the robot moves, its distance from the obstacle is continuously monitored. If it detects an unexpected obstacle, it can slow down or change direction.
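The prediction step described above can be sketched as a simple constant-velocity check: project both the robot and the obstacle forward in time and flag the first moment their separation falls below a safety margin. This is a minimal illustration only; the function name, parameters and the constant-velocity assumption are hypothetical, not EBV's implementation.

```python
import math

def predict_collision(robot_pos, robot_vel, obstacle_pos, obstacle_vel,
                      safety_radius=0.5, horizon=3.0, dt=0.1):
    """Step both constant-velocity trajectories forward over the horizon
    and return the first time (in seconds) the separation drops below
    the safety radius, or None if no collision is predicted."""
    t = 0.0
    while t <= horizon:
        rx = robot_pos[0] + robot_vel[0] * t
        ry = robot_pos[1] + robot_vel[1] * t
        ox = obstacle_pos[0] + obstacle_vel[0] * t
        oy = obstacle_pos[1] + obstacle_vel[1] * t
        if math.hypot(rx - ox, ry - oy) < safety_radius:
            return t  # collision predicted: slow down or change direction
        t += dt
    return None

# Robot moving right at 1 m/s; a person 4 m away walking towards it at 1 m/s.
t_hit = predict_collision((0, 0), (1, 0), (4, 0), (-1, 0))
```

In a real system the obstacle's position and velocity would come from the fused sensor data, and the check would be re-run on every control cycle as the robot moves.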

Speaking my language

For genuine interaction, there has to be efficient communication between humans and machines. EBV believes that reliable voice recognition in as many languages as possible, together with complex semantic processing, incorporating context (which may be time and place, or information from apps and databases, for example), and natural-sounding speech are key requirements.

The accuracy of word recognition has improved dramatically in the past five years, driven by developments such as IBM’s Watson and Apple’s Siri voice-controlled personal assistant.

For Vicens, the home assistant robot market is the next big thing, following voice-controlled assistants such as Amazon Echo and Google Home. “We will soon see a similar approach, with mobile robots addressing home applications like childcare, entertainment, security and comfort,” he says.
“The improvement that voice recognition has recently made is greatly facilitating interaction between humans and robots,” he believes. “For example, it is possible to embed a voice trigger in your application – you just have to utter a ‘key word’ to wake up a system. The cloud provides unlimited capacities to manage sophisticated voice recognition – there is virtually no limit to the performance of voice recognition.”
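The voice-trigger idea Vicens describes can be sketched in a few lines: the system stays dormant until a key word appears in the transcribed audio stream, then treats the next utterance as a command. This is a minimal sketch over already-transcribed text; the wake phrase, function name and stream format are all hypothetical.

```python
WAKE_WORD = "hello robot"  # hypothetical key phrase

def detect_wake_word(transcript_stream, wake_word=WAKE_WORD):
    """Scan a stream of transcribed utterances; once the wake word is
    heard, return the next utterance as the command (None if absent)."""
    awake = False
    for utterance in transcript_stream:
        text = utterance.lower().strip()
        if awake:
            return text          # first utterance after the trigger
        if wake_word in text:
            awake = True         # wake word heard: start listening
    return None

cmd = detect_wake_word(["some chatter", "Hello robot", "turn on the lights"])
# cmd == "turn on the lights"
```

Real wake-word engines match acoustic patterns on-device rather than transcribed text, precisely so that the heavier cloud-based recognition only runs after the trigger fires.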

Home help

Figure 2: The iPal humanoid robot by AvatarMind helps children develop language and social interaction skills

EBV reports that people take in about 80% of all information visually, so it makes sense to communicate with robots visually too. New 3D sensor technologies, together with fast data processing and interpretation methods, are enabling machines to sense and understand gestures and commands. For example, in future a person will merely have to point to an object and a robot will bring it, says EBV.

“We have solutions for power management, security, HMI and communications,” says Vicens, adding: “Our portfolio is expanding, particularly with new sensor technologies. Human-machine interfaces are really improving. Again, voice recognition will definitely simplify our interaction with robots, but the fact that robots are interacting ever more closely with humans – for example as care robots – means that very strict safety processes must be implemented.

“Which reminds me of the three laws of Isaac Asimov …,” he adds.

He also points out that while there are concerns about robotic or autonomous systems – especially vehicles – robots can avoid accidents, as they can react in one thousandth of the time a human takes to do so.

Many robotic projects originate in the target market, such as agriculture or security, from a base that has no experience or expertise in electronics, sensors or artificial intelligence (AI). Vicens says this is a familiar scenario. “There is a similar situation in the internet of things (IoT) market. Our goal is to turn our customers’ ideas into reality.”

As well as the appropriate electronic components, EBV provides “the complete ecosystem, including partners that can offer hardware, software design support, manufacturing and so on”, he says. “First, we need to differentiate outdoor applications where a GPS signal can be used. Then some technologies, such as radar, offer longer-range detection.

“In some cases, infrared (IR) time-of-flight sensors are adequate. In other cases, motion detection and magnetometer MEMS can help to determine position and orientation,” says Vicens.
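The principle behind the time-of-flight sensors Vicens mentions reduces to one formula: the sensor times a light pulse's round trip to the target and back, so the distance is the speed of light multiplied by the measured time, divided by two. A minimal sketch (the function name is illustrative, not a vendor API):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds):
    """Distance from sensor to target: the pulse covers the path twice,
    so halve the round-trip distance."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
d = tof_distance(10e-9)
```

The tiny times involved are why practical IR time-of-flight parts measure phase shift or use picosecond-class timing circuits rather than a naive stopwatch.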

Customers will often combine technologies to gain the best solution for their system, he adds, citing autonomous cars, which he believes will – in the near future – combine at least three different technologies to determine their position.

