OK...this thing with offboard intelligence processing etc. is almost enough to creep me out
MIT page
MIT technical video (WATCH THIS ONE!!) (MOV format)
Save/As and watching it in Quicktime at 2x size is recommended.
YouTube video (shows expressiveness etc.)
Daily Tech story....
MIT Develops Advanced Humanlike Robot
Like something straight out of the movies, MIT's NEXI robot has human-like expressions and speech, which is either really cool or really creepy
Scientists continue to push the boundaries of artificial intelligence, deploying robots and computer AIs into increasingly complex and varied situations.
Many observers of robotics and artificial intelligence, including Apple co-founder Steve Wozniak, remain skeptical that robots will ever be able to perform humanlike tasks and interact with humans on a social basis.
However, seeing is believing, and if MIT's startling new video is any indication, it appears that researchers at the MIT Media Lab are much closer to overcoming the latter obstacle than previously thought. The product of the Lab's team, directed by Dr. Cynthia Breazeal, is a human-like robot named Nexi that speaks and features complex hand movements and facial gestures.
Nexi is a Mobile Dexterous Social robot, or MDS. The robot is mobile in that it navigates via wheels: it features a self-balancing mobile base, akin to a mini-Segway, and can travel at human walking speed.
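The article doesn't describe how the mini-Segway-style base actually balances. A standard approach for such platforms is PD feedback on an inverted pendulum; the sketch below is purely illustrative (the gains, pendulum length, and toy dynamics are assumptions, not Nexi's real parameters):

```python
import math

def balance_torque(tilt_rad, tilt_rate, kp=40.0, kd=5.0):
    """PD feedback: command a wheel torque opposing the measured lean."""
    return -(kp * tilt_rad + kd * tilt_rate)

def simulate(steps=2000, dt=0.005, tilt0=0.1):
    """Toy inverted-pendulum-on-wheels simulation (Euler integration)."""
    g, length = 9.81, 0.5           # gravity; effective pendulum length (assumed)
    tilt, rate = tilt0, 0.0         # start leaning 0.1 rad
    for _ in range(steps):
        torque = balance_torque(tilt, rate)
        accel = (g / length) * math.sin(tilt) + torque  # simplified dynamics
        rate += accel * dt
        tilt += rate * dt
    return tilt                     # should settle back near upright
```

Running the simulation shows the lean angle damping out to roughly zero, which is the essential behavior of a self-balancing base.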
The robot is dexterous in that it has two highly agile arms. The arms have four degrees of freedom (DOF), are elastic, and are based on the DOMO/WAM style arm design. They support position and force control via force sensors. Together, the fully extended arms can pick up a 10-pound object, and several of the robots can "team up" to lift heavier objects. The shoulder chassis of the robot is mounted on a torso pivot, giving it full freedom of motion.
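The article mentions force control via force sensors but not how it works. One common scheme is an admittance-style loop that nudges the commanded position until the sensed contact force matches a target; this toy sketch invents all the numbers (gain, target force, elastic contact model) for illustration:

```python
def force_control_step(desired_force, measured_force, position, gain=0.001):
    """Admittance-style step: move the commanded position in the
    direction that drives the measured force toward the target."""
    error = desired_force - measured_force
    return position + gain * error

def simulate_contact(desired_force=5.0, stiffness=500.0, steps=50):
    """Toy test: pressing against an elastic surface (force = k * x)."""
    pos = 0.0
    for _ in range(steps):
        measured = stiffness * pos          # simulated force-sensor reading
        pos = force_control_step(desired_force, measured, pos)
    return stiffness * pos                  # final contact force
```

After a few dozen iterations the simulated contact force converges on the 5 N target, which is the basic contract a force-controlled gripper or arm provides.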
A DSP and FPGA control the motors, while balancing and force control are handled by an embedded PC running Linux mounted near the base. The Linux PC features wireless communication. A laser rangefinder is used to avoid obstacles.
The hands are one of the robot's unique features. They have five degrees of freedom, and the forearm can roll and flex at the wrist akin to a human forearm. Each hand features three fingers and an opposable thumb, with the index finger and thumb independently controlled and the other two fingers coupled together. The robot can grip objects and make hand gestures to convey emotions. The arms were developed by Meka, Inc. with help from MIT, and also feature protection against collisions and slips.
The most interesting and perhaps most disturbing part of Nexi is its expressive face. The face, designed by Xitome Design with MIT, features complex expressions. The four-degree-of-freedom neck can bend low at the base, and the head supports pan-tilt-yaw motion, allowing for human-like movements. It can nod, shake its head, or move its head as if orienting itself with its surroundings.
The face has 15 DOF and features expressive eyebrows, gaze, eyelids, and mandible. Each eye has a color CCD camera and the head also features an active indoor IR camera. Four separate microphones allow it to localize sounds and another microphone is used to detect speech. It has a speaker to allow it to synthesize speech.
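The article says the four microphones let Nexi localize sounds but doesn't explain how. A common technique is time-difference-of-arrival (TDOA): a sound reaches one microphone of a pair slightly before the other, and that delay yields a bearing. This sketch assumes a far-field source and invents the microphone spacing; it is not Nexi's actual implementation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def bearing_from_tdoa(delay_s, mic_spacing_m=0.1):
    """Estimate a far-field source bearing (radians from broadside)
    from the arrival-time difference between two microphones."""
    x = SPEED_OF_SOUND * delay_s / mic_spacing_m
    x = max(-1.0, min(1.0, x))   # clamp numerical overshoot into asin's domain
    return math.asin(x)
```

A zero delay means the source is straight ahead (broadside); the maximum possible delay, spacing divided by the speed of sound, puts the source directly in line with the microphone pair. With two such pairs at right angles, a robot can resolve direction in the full plane.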
For the robot's human-like behavior and interaction, MIT is focusing on a human-robot interaction approach, which seeks to identify what average citizens want in a robot. MIT will be deploying a team of four robots during a two-week pilot program at the Boston Museum of Science in the summer of 2009.
The robot will interact with visitors within a "robot playroom." It will engage listeners in conversation and express emotions. During these interactions the robot will try to learn conversation and new behaviors. At points, MIT operators can elect to tele-operate the robot, Wizard of Oz style, to give it more complex behavior or keep conversations from getting too boring. The robot supports many emotions, including sadness, anger, confusion, excitement, and boredom.
In the video, Nexi independently demonstrates its basic conversational skills, greeting the viewer and saying, "But I hope you can see that I am very happy that I met you. Thank you for visiting me and I hope to see you again soon!"
While the MIT researchers admit that human-level learning and more complex conversational skills remain unsolved challenges, Nexi certainly represents an amalgamation of exciting and exotic advances in robotics. With robots like Nexi that can learn and interact, the world may soon become a very different place.
The MIT team's research is sponsored by an ONR DURIP Award "Mobile, Dexterous, Social Robots to Support Complex Human-Robot Teamwork in Uncertain Environments" and by a Microsoft grant.