Mechanical servants have long captured people’s imaginations, but it was only in the 1960s that viable robots first appeared in factories. With continued improvements in the artificial intelligence behind such devices, it has recently become possible to employ robots in customer service, a field that once seemed immune to automation.
Sights and Sounds
The main drivers behind the recent phenomenon of customer service robots are software advances, stronger batteries, and cheaper, more sophisticated sensing equipment. These robots often provide a touchscreen option, but natural language comprehension is a key element. The first attempt at programming a computer to understand natural human language was the STUDENT program, created by Daniel Bobrow in 1964. Since then, steady improvements have culminated in software like Siri for the iPhone and Cortana for Windows. What may come as a surprise is that different language comprehension programs are built on competing linguistic theories.

Image recognition, by contrast, has largely settled on a single approach: convolutional neural networks, based on research into how animal vision systems operate that stretches back to the 1950s. Combined with digital cameras, these networks let robots add a personal touch to interactions by recognizing the faces of returning customers.
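To make the idea behind a convolutional network concrete, the sketch below implements its core operation, convolution, in plain Python. The edge-detecting kernel and the tiny five-by-five "image" are illustrative toys, not values from any actual robot's vision system; real networks learn thousands of such kernels from training data and stack many layers of them.

```python
# A minimal sketch of the convolution step at the heart of a convolutional
# neural network. A grayscale image is a grid of brightness values; sliding
# a small kernel across it produces a "feature map" that responds wherever
# the pattern the kernel encodes (here, a vertical edge) appears.

def convolve2d(image, kernel):
    """Slide kernel over image (no padding) and return the feature map."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            total = sum(
                image[r + i][c + j] * kernel[i][j]
                for i in range(kh)
                for j in range(kw)
            )
            row.append(total)
        out.append(row)
    return out

# Hand-written vertical-edge kernel: positive weights on the left, negative
# on the right, so it responds wherever brightness changes across columns.
edge_kernel = [
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
]

# Toy 5x5 image: dark left half (0), bright right half (1).
image = [[0, 0, 1, 1, 1] for _ in range(5)]

feature_map = convolve2d(image, edge_kernel)
# The strongest responses in feature_map line up with the dark-to-bright
# boundary between the image's second and third columns.
```

A face-recognition system works the same way in principle, but with many stacked layers whose kernels are learned rather than hand-written, building up from edges to textures to whole facial features.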
The New Employee
Genuinely autonomous service robots only began appearing in stores around 2014. Among the early arrivals was the OSHbot, created through a partnership between Lowe’s Innovation Labs and Fellow Robots. This five-foot-tall rectangular automaton features front and rear screens along with machine vision that lets it identify objects presented by inquiring customers. It can understand questions in several languages and escort customers to the desired merchandise. Similar robots began work in Japan in 2015 at the Henn na Hotel in Nagasaki, checking in guests, showing them to their rooms, and carrying luggage.

The expansion of customer service robots is also becoming easier thanks to companies like France’s Aldebaran Robotics, which manufactures the Nao, Pepper, and Romeo robots primarily for these roles. The Nao, only 23 inches tall, made its debut at the Bank of Tokyo-Mitsubishi UFJ in April 2015, where its job was to give customers directions. Its larger sibling, the four-foot Pepper, one of the most recent innovation initiatives at Westfield malls, relies on wheels for mobility, whereas the Nao is capable of bipedal walking. Beyond understanding language, Pepper can also recognize human facial expressions to enhance its interactions.