AI robots want office jobs – IBM


Lenovo’s prototype is a desktop assistant with a small screen that serves as a face, cameras mounted on an articulating arm and a built-in projector that displays slides or documents on nearby surfaces. The company says the system could connect to office calendars, messages and files, allowing workers to ask questions about projects without opening another laptop window.

A broader movement away from traditional screen-based computing interfaces may be driving interest in systems like Lenovo’s AI Workmate, according to Wigdor.

“When a device like Lenovo’s AI Workmate can see your desk, hear what’s happening in the room and project information directly into your physical space, the interface stops being something you visit and starts being something you’re inside of,” Wigdor said.

Wigdor and other researchers use the term “embodied AI” to describe systems that interact with the physical world through sensors, microphones, cameras, robotics or other hardware. Wigdor said those systems could change not only how people use AI, but also how they relate to it psychologically.

After several years of investment in generative AI workplace tools for writing, coding and productivity tasks, companies including Lenovo and Hugging Face are beginning to test AI systems designed for physical environments.

The push also extends beyond hardware. Murati’s Thinking Machines Lab introduced research on what it calls “interaction models”: AI systems designed to process audio, video and text simultaneously and respond in real time. The company said current chatbot interfaces still rely on turn-based exchanges in which users type or speak, wait for a response and then repeat the process. By contrast, the company envisions AI systems that can continuously interpret what users are seeing, saying and doing without requiring a stop-and-start chatbot exchange.

Hugging Face, the open-source AI platform best known for hosting machine learning models, recently expanded into robotics with a desktop companion called Reachy Mini. Designed for AI experimentation and conversational interaction, the device includes cameras, microphones, speakers and a motorized head that can respond to users in real time.

Even as companies pitch AI devices for workplaces, office environments may prove especially difficult terrain for robotic systems, according to Gabe Goodhart, Chief Architect for AI Open Innovation at IBM.

“The biggest challenge with embodied AI will be designing a platform that has a real reason to be embodied,” Goodhart told IBM Think in an interview. “Successful embodied AI will need to figure out how to manifest this in the real world.”

Rather than changing what work gets done, Wigdor suggested embodied AI could change how people interact with information inside physical spaces.

“The reason this matters is that people are actually very good at working in physical space,” he said. “We read context from our environment, and we coordinate with objects and other people through gesture and proximity.”
