By Ian Ferguson
There is increasing excitement surrounding the seamless integration of robots in automation and Industry 4.0 strategies. However, we are a long way from writing human operatives out of the equation. The best automation strategies within Industry 4.0 rest on the recognition that the strengths of robots are best realized when they work alongside rather than in the absence of humans.
There are certainly roles that robots can perform more effectively than we can. These include highly repetitive, monotonous tasks (e.g., welding), jobs that involve handling heavy loads, and tasks in environments that are dangerous for humans. That said, robots can only do what they are told – they can’t improvise. The incredible power of humans is the ability to think and make decisions in real time as situations arise, and AI has some distance to go before it comes close to replicating this.
Today’s robots are predominantly costly, static, and capable of performing one task very effectively. I believe that costs must come down and systems must become more agile in terms of movement and re-programmability. Indeed, with these capabilities, factories can better compete with imports. This does not mean the end of human operators. Humans have an extraordinary brain and can deploy it in partnership with robots. The skill sets of workers may well change, but I am optimistic about the prospects for humans in factory settings.
One way in which robots are becoming more closely aligned with humans is through the future integration of sensors into industrial robots. We have five senses; today’s robots mostly use just one: vision. The capabilities of these systems will continue to improve, and I can envision machines beginning to understand gestures, for example, in settings with a lot of noise. But I also expect to see robots adding the other senses. Future robots will be able to listen to voice commands and identify noises that indicate something in the manufacturing plant isn’t behaving normally. They may be able to use touch to confirm that a particular product is smooth enough, and perhaps they could taste-test the mixtures of specific compounds. The sense that may be most limited in robotics is smell, but some companies are working on sensors for this function, with fascinating future applications. For example, the smell of urine has been found to be a leading indicator for certain cancers, and it has been proven that bees can smell explosives. Imagine an agricultural setting where fruit can be selected based on ripeness.
Expecting the Unexpected
An important factor in the cooperation between humans and robots is that the cooperation is often sporadic. Developers need to put two key considerations at the center of their thinking. First, when the robot is needed, it must be immediately responsive to the human. Second, these systems are being deployed in settings that are far from ideal and often unpredictable. The designer needs to contemplate nonideal circumstances and ensure that the system keeps humans safe at all times.
One experiment I saw might illustrate my thinking. A company was exploring drones to perform inventory checking in large warehouses. This sounds like something that can be completed quickly, accurately, and safely – until the deployer saw bar codes that were torn and placed on shelves in a non-uniform way. There was also a huge amount of dust lying around, which the drones pushed into the air. The lighting in the warehouse varied widely between the start of the day and the evening hours, and the drone had to navigate ladders, forklift trucks, boxes, and other obstacles in the aisles. All warehouses are different, which challenged the business model for the drone providers, in addition to a range of technology challenges (recharging, etc.). If the robot sees something that it doesn’t recognize, the system must default to a safe state, from which human intervention can restart the service.
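The fail-safe behavior described above can be sketched as a simple state machine. The class, method, and observation names below are hypothetical illustrations, not from any real robot SDK: anything the system does not recognize sends it to a safe stop, and only an explicit human action returns it to service.

```python
from enum import Enum, auto

class RobotState(Enum):
    OPERATING = auto()
    SAFE_STOP = auto()   # halted, awaiting human intervention

class InventoryDrone:
    """Hypothetical sketch: default to a safe state on anything unrecognized."""

    def __init__(self):
        self.state = RobotState.OPERATING

    def process_observation(self, observation: str) -> RobotState:
        # Only known, whitelisted observations keep the drone running;
        # anything unexpected (a torn bar code, an unknown obstacle)
        # triggers a safe stop.
        recognized = {"barcode_ok", "aisle_clear"}
        if self.state is RobotState.OPERATING and observation not in recognized:
            self.state = RobotState.SAFE_STOP
        return self.state

    def human_restart(self) -> RobotState:
        # Only an explicit human action restarts the service.
        self.state = RobotState.OPERATING
        return self.state
```

Note that once stopped, the drone stays stopped even if subsequent observations look normal again; the design choice is that recovery is a human decision, not an automatic one.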
At present, robots still have to be monitored by humans, but I think artificial intelligence is going to help. We will see a robot’s capabilities improve as it learns more about the factory setting in which it is deployed and the scenarios it encounters. Many people are excited by the prospects of AI, as am I. However, I urge conservatism: rollouts should be delayed until all plausible scenarios have been thought through, rather than rushed in order to be first.
For instance, if collaborative robots are not connected to the outside world and use only local computing resources, security improves on the one hand, but the AI’s self-learning processes become more difficult on the other. I think the right balance will be local learning combined with the power of scale from aggregated learnings accumulated in the cloud. That said, I would sacrifice the extra learning if the connectivity cannot be guaranteed to be secure.
Efficiency and accuracy are arguably the greatest strengths of AI and robotics, but critical thinking and creativity are still lacking. In a factory, there are all sorts of corner cases that aren’t yet fully understood. We’ve seen this situation in the automotive industry, where regular cars are validated over hundreds of miles of testing, and some of the new autonomous functionality over millions of miles driven. It is improving, but problems are still found. In a factory environment, a line-down situation is a significant hit on business effectiveness, and more importantly, an injury to a human is a significant problem. It will take a while before those autonomous systems are fully trusted.
To help with this, we must be prepared for failures in hardware and false conclusions in software. There must be safeguards in place to ensure that neither a malicious attack nor life-threatening misbehavior becomes possible. The system architecture must lock everything down. “Lock all the doors, not just the front one,” as Microsoft put it during its Azure Sphere initiative a few years ago – an analogy that has stuck with me.
When we leave our homes, we lock the front door. In the world of IoT, we need to lock every door – those inside the house and those that connect outside. From a network perspective, if there’s a breach, the intruder gains access to only a subset of the valuable assets. Software and hardware have to partition systems to isolate functions, and for this to happen, they must recognize immediately when they have been compromised and send a real-time alert.
This is one way AI can play a role in industrial IoT applications: identifying out-of-the-norm behavior for a given system and alerting a user, who then decides the correct course of action. Options include disconnecting the system from the network, blocking a specific IP address, and disabling certain system functions. Quite simply, developers must plan on being hacked. There are no 100 percent foolproof systems. IoT systems need to continue to raise the bar over time in terms of immunity from attack, but equally, a system must quickly recover to a known, safe state if it becomes compromised.
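A minimal sketch of that alert-and-decide pattern follows. The function names and the three-sigma threshold are illustrative assumptions; a simple statistical baseline stands in for the trained AI model, and the human chooses the mitigation.

```python
import statistics

def detect_anomaly(baseline: list, reading: float, threshold: float = 3.0) -> bool:
    """Flag a reading that deviates from the learned baseline.

    A real deployment would use a trained model; here a z-score
    against historical readings stands in for "out-of-the-norm".
    """
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    if stdev == 0:
        return reading != mean
    return abs(reading - mean) / stdev > threshold

def respond(anomalous: bool) -> str:
    # The system only raises an alert; a human operator then picks the
    # mitigation (disconnect from network, block an IP, disable functions).
    return "alert_operator" if anomalous else "normal"
```

The key design point mirrors the text: the automation detects and alerts, but the decision to disconnect or disable stays with a person.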
The automotive industry has so far been considered a pioneer for smart factory solutions. I think there are a wide variety of opportunities for autonomous mobile robots. As mentioned previously, the scale of automotive plants is such that it can justify complex machines that perform one task both efficiently and effectively. Car manufacturing economics (and indeed certain consumer areas like smartphone assembly lines) make the costs justifiable. However, I see robots being able to improve the effectiveness of a wide set of applications when the cost, mobility, power, and reliability issues can be addressed.
New markets such as the food and beverage industry have also discovered robots for manufacturing. Some argue robots have the power to render the traditional production line obsolete. Though robots can take over repetitive tasks in the food and beverage industry, I believe that for maximum effectiveness, the production line must be adjusted to harness the skills of both robots and humans. I do not believe we will see a major change on the factory floor in the next couple of years. In the wake of the pandemic, many companies are thinking about adding technology to adjust process flows in their existing facilities, whereas the more significant opportunity for robots lies in brand-new facilities.
About the author
Ian Ferguson is the Vice President of Marketing and Strategic Alliances at Lynx Software Technologies