What are AI-enabled HMIs?

AI-enabled human-machine interfaces (HMIs) are systems in which the technical system of a machine and a human operator interact, in a given context and across various channels:
– the machine is itself a hierarchy of interconnected systems capable of real-time data acquisition, data processing, artificial intelligence, and modeling;
– the operator has intra-individual and inter-individual specificities and performances, including the cognitive and trained abilities they can mobilize for a given task;
– the channels include, but are not limited to: vision and imaging, voice and sound, touch and haptics, 2D-3D immersion, and possibly, in the future, direct brain-to-machine interaction. These channels carry exchanges in both directions, machine-to-human and human-to-machine. They serve to provide information, support decisions, carry out actions, and check the consequences of those actions so that both the human and the machine can recover as well as possible.


The impact and maturity level of AI-enabled warehousing HMIs can be mapped and ranked

High impact: AI-enabled HMIs are revolutionizing warehouse systems

AI is relatively easy to apply to warehouse systems because warehouse tasks are clearly specified and simple, and the environment is highly structured. Order picking accounts for around 40% of operational costs in most warehouses, and labor accounts for up to 70% of a warehouse's total budget (source: Presans). The number of companies applying AI technology to warehouses is growing fast, with fully automated product processing as the target. Most of these companies focus on how quickly or how autonomously AI-enabled robots can execute their assigned tasks.

In a fully automated warehouse, human safety would of course be less of a concern. However, AI-enabled warehouse robots still need human intervention to resolve unexpected problems. Humans will consequently keep working near AI-enabled robots, and the robots will therefore need sophisticated sensors, motors, and actuators to recognize the collocated human workers and avoid endangering them.


Strategies for securing human safety in warehouses via AI-enabled HMIs can be mapped

Depending on how tasks are allocated between human workers and AI-enabled robots, each company has its own strategy for handling human safety in a warehouse.

For example, a robotic arm called the Orb, from Kindred, performs sorting tasks, and human workers step in only when needed, operating the Orb manually to perform tasks that are difficult for the robotic arm. Most of the time, human workers are thus separated from the robotic arm, and they sometimes operate it remotely through VR systems.

In the case of Fetch and Amazon, human workers and AI-enabled robots are collocated and collaborate on the same task. Fetch adopts a human-supervision strategy: a human worker loads products onto a robot, the robot finds the most efficient route through the warehouse and delivers the loaded products, and after delivering a full load it returns to the human worker for the next run. To secure human safety, the robot always runs at a moderate speed of about 3.5 miles per hour (roughly 1.5 meters per second).

Kiva, from Amazon, takes a closer collaborative approach: a human worker is responsible for product packaging while a robot measures the size of each package and dispenses the right amount of tape. Human workers in Amazon warehouses accordingly wear a high-tech vest, carry pouches full of sensors and radio transmitters on their belt, and hold a tablet in their hand.
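One way such a safety strategy can be implemented is speed-and-separation monitoring: the robot caps its speed as a function of the distance to the nearest detected human. The following is a minimal illustrative sketch, not any vendor's actual control code; the zone thresholds are assumed values, and only the 1.5 m/s cap echoes the figure cited for Fetch.

```python
# Illustrative speed governor: scale the robot's speed cap down as
# detected humans get closer. Thresholds are assumed for illustration.

def safe_speed(human_distances_m, max_speed_mps=1.5,
               slow_zone_m=3.0, stop_zone_m=1.0):
    """Return a speed cap (m/s) given distances (m) to detected humans."""
    if not human_distances_m:          # no humans detected: full speed
        return max_speed_mps
    nearest = min(human_distances_m)
    if nearest <= stop_zone_m:         # a human is too close: stop
        return 0.0
    if nearest <= slow_zone_m:         # scale linearly inside the slow zone
        return max_speed_mps * (nearest - stop_zone_m) / (slow_zone_m - stop_zone_m)
    return max_speed_mps               # humans are far away: full speed
```

In this sketch the cap degrades gracefully rather than switching abruptly, which avoids jerky stop-and-go behavior when a worker lingers at the edge of the slow zone.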

[Figure: Strategies for securing human safety in warehouses via AI-enabled HMIs. Source: Presans]

The optimization of AI-enabled machines in a warehouse requires the capability to learn from human workers as well as from the data the system detects and gathers. This implies new human roles: training, explaining, and maintaining the machines.


Warehouse HMIs will be heavily voice-based

The interface receiving the most attention for training AI is the voice interface, based on natural language processing. Voice interfaces shorten the time human workers must spend learning new skills for training, and they are already widely adopted in the real world through mass-market products such as Amazon’s Alexa, Google Assistant, Apple’s Siri, and Microsoft’s Cortana. To secure the accuracy of language interpreted through natural language processing, human trainers should train the AI system to make fewer errors when transcribing and interpreting a human voice.
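The training loop described above can be sketched very simply: the system predicts an intent for a transcribed phrase, and a human trainer corrects it when it is wrong, so the error rate drops over time. This is a minimal illustrative sketch assuming a plain phrase-to-intent lookup; the `VoiceCommandModel` class, the phrases, and the intent labels are all hypothetical, not a real warehouse vocabulary or product API.

```python
# Minimal human-in-the-loop correction sketch for voice commands.
# A trainer supplies the right intent whenever the model is wrong.

class VoiceCommandModel:
    def __init__(self):
        self.intents = {}              # learned phrase -> intent mapping

    def predict(self, phrase):
        """Return the learned intent for a transcribed phrase, if any."""
        return self.intents.get(phrase.lower(), "unknown")

    def correct(self, phrase, intent):
        """Human trainer labels a phrase the model got wrong."""
        self.intents[phrase.lower()] = intent

model = VoiceCommandModel()
model.correct("pick bin seven", "PICK:bin7")   # trainer fixes an error
model.correct("go to packing", "GOTO:packing")
```

A production system would replace the lookup table with a statistical language model, but the human role is the same: supply corrections until the error rate is acceptable.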

Beyond training natural language processing, there are many opportunities to incorporate other sensing technologies, such as computer vision, gesture-control devices, embedded eye-tracking platforms, bioacoustic sensing that allows the skin to be used as a finger-input surface, emotion detection and recognition, and muscle-computer interfaces. Humans can also train an AI, through AR and VR technologies, to find, handle, and sort products in a warehouse, and eventually enable it to make the right decision within the supply chain in seconds. For example, the Orb from Kindred learns how to move to a desired object, lower its gripper, and adjust its two clamps to make a firm grip.


Mapping and ranking AI-enabled warehouse HMI technologies can help guide breakthrough innovation roadmaps

Presans can help industrial innovators quickly assess the impact and maturity level of AI-enabled HMIs in their sector.