Thursday, June 28, 2012

Interface Presentation Gostai Jazz - Telepresence Robot

Gostai Jazz presentation - Telepresence Robot

Humanoid Surveillance Robot - Jazz Research Development Platform


Jazz Research Development Platform

• Robust and fast moving robot development platform
• Includes 8 ultrasonic sensors and 4 IR sensors
• Embedded Computer and WiFi included
• Easily programmed through open-source Urbi middleware

Jazz Research Development Platform
State-of-the-art technology
Jazz was designed for service and research. It combines state-of-the-art technology: an articulated head, a docking station, an engaging design for rich interaction with humans, and the open-source Urbi middleware for fast, flexible programming.

• Fast moving & robust
• 8 ultrasonic sensors, 4 IR sensors
• Automatic docking on a charging station (5h of autonomy)
• WiFi 802.11g connectivity
• USB2 port and removable tablet support for extra accessories (cameras, modules…)
• Atom N270 1.6 GHz or D510 1.66 GHz processor running Linux (Ubuntu 10.04 Server)
• 2 camera options: 1600 x 1200 at 30 FPS with a 90° field of view, or 720 x 576 at 25 FPS with a 170° field of view


Gostai Jazz Connect

Rich interactions with humans
Beyond a simple pack of technologies, Jazz is a unique robot for researchers interested in rich human/robot interactions, thanks to its engaging design and articulated head with colored eyes that can express basic emotions.

Fast and easy programming with Urbi
• Jazz runs Linux and integrates the open-source Urbi middleware
• Urbi is the most innovative robotics middleware available today: parallel and event-based execution, integrated components, and a client/server approach for flexibility
• C++ UObject distributed component architecture
• Urbi is open-source and has a large community: www.urbiforge.org
• Urbi is compatible with ROS and is used by hundreds of research labs around the world
• Re-use your code on many other Urbi compatible robots (including Nao, Spykee, Bioloid, Pioneer, Segway RMP)

Gostai Urbi 
Jazz also has a full audio/video system for two-way video conferencing and a user-friendly, web-based interface. You can reprogram it, record interactions, and test user reactions and emotions without engineering and technical issues interfering with your work. With its unique look and feel, Jazz immediately conveys a friendly, positive image, letting you focus on dialogue and interaction. Jazz is, for example, well suited to interactions with children, the elderly, people with disabilities, and hospital patients.

Jazz Research Development Platform

What makes Jazz unique compared to many other mobile platforms?
• Engaging design for interactions
• Docking Station
• Integrated directional screen
• Articulated head with colored eyes to express basic emotions
• Urbi software: Advanced, open-source, solid SDK interfaced with standards
• 1m high

Wednesday, June 27, 2012

Robot Lawn Mower: Lawnbott Action & Features

LawnBott LB1200 Spyder

Robotic Lawn Mowers


LawnBott Robot Mowers

 KA Lawnbott LB1200 Robot Lawn Mower

LawnBott Robot Mowers are battery-powered autonomous lawn mowing robots. LawnBott offers highly efficient and "green" technology at an affordable cost. You will no longer need to mow your lawn and can enjoy life or spend more time with your family. We also offer LawnBott Accessories to keep your automatic lawn mower in peak operating condition and to increase its mowing time and effectiveness on your specific property. RobotShop is also an authorized dealer and service center for the LawnBott Robot Mower product line via the RobotShop Robot Hospital.

Powered by one lithium battery
• Designed for smaller yards, up to 5,500 sq. ft.
• Easy, just set it down, turn it on, and walk away!
• Rated slope usage of 27 degrees
• Lightweight at only 18 lbs.
• 30 Day Money Back Guarantee (minus original shipping fees)



The KA LawnBott LB1200 Spyder Robot Lawn Mower is the world's first robot mower that doesn't use a perimeter wire to operate. Patented sensors actually 'sense' when the LawnBott Spyder is over grass to cut, reversing direction when over walkways, curbs, patios and mulched areas.
Features:
• Cuts for up to 5 hours on a single charge using a lithium battery
• Let it cut often, so NO bagging, NO clippings, NO mess.
• 4WD, manages slopes up to 27°.
• Virtually silent operation. 

Tuesday, June 26, 2012

Nurse Robot

RIBA robotic bear nurse

Riba Robot Nurse: The Future of Healthcare


There might come a time when human caregivers will be replaced by robots. It may seem an impossible vision, but do not be surprised if human caregivers are assisted by robots in the near future. Much has been achieved in the field of robotics, and the RIBA robot nurse is another milestone in healthcare.


RIBA, which stands for Robot for Interactive Body Assistance, is a nurse robot developed through a collaboration between researchers at Japan's Institute of Physical and Chemical Research (RIKEN) and Tokai Rubber Industries Ltd. (TRI). It is engineered to aid nurses in lifting patients from bed to wheelchair and back. RIBA can also assist patients with mobility problems in moving to and from the toilet.


RIBA is the second generation of the nursing assistant robot developed under the RIKEN-TRI partnership, following the first model, RI-MAN, which had a low carrying capacity and limited functionality. The latest model combines strong, human-like arms with novel tactile sensors to move patients safely. It weighs about 180 kg and can carry patients of up to 61 kg.


Designed to look like a cross between a polar bear and a snowman, this robot nurse has a cute, friendly face intended to put patients at ease.
Meanwhile, RIBA's license for commercial distribution is still under consideration for approval. Once approved, this nursing assistant could be a worthy acquisition for hospitals and other healthcare facilities caring for hundreds of patients who need to be lifted and moved frequently.

Monday, June 25, 2012

Maytronics Dolphin Supreme Automatic Pool Cleaner

DOLPHIN - THE ROBOT POOL GUY









Maytronics is committed to green, eco-friendly technology. Our goal at Maytronics is to let you enjoy a clean, pure, and hygienic pool.

Using the Dolphin in your pool will:

Save water and energy.

Help prevent the growth of algae and bacteria.

Reduce the number of required backwashes.

The family of Dolphin robotic pool cleaners offers unique, patented advantages.



The new generation in robotic pool cleaners
Recommended for pools up to 10 m (36 ft) in length.

Cleans pool floor and pool corners.
- Brushes, scrubs, vacuums, and filters the entire pool including the floor, walls, and waterline. The new, advanced technology enables optimal scanning and pool coverage in a shorter cycle time.
- An add-on brushing system doubles brushing efficiency. The rigorous brushing and scrubbing action reinforces the elimination of algae and bacteria.
- Top-opening filtration compartments enabling easy and convenient maintenance of the filtration system
- Three types of filtration options (ultrafine, disposable ultrafine, and coarse filter bags) cover all cleaning demands and requirements, from spring cleaning to ongoing pool maintenance
- One-way water valves prevent the escape of debris and ensure rapid water drainage
- Adjustable floats allow efficient scanning in different pool sizes
- Low voltage motor provides minimal energy consumption
- DIY – easy maintenance, at user and local dealer level
- Three year warranty (all parts included)




Dolphin Automatic Pool Cleaner

Saturday, June 23, 2012

NAO ROBOT - ALDEBARAN ROBOTICS




Hardware Platform

NAO is a programmable, 57-cm tall humanoid robot with the following key components:
- Body with 25 degrees of freedom (DOF) whose key elements are electric motors and actuators
- Sensor network, including 2 cameras, 4 microphones, sonar rangefinder, 2 IR emitters and receivers, 1 inertial board, 9 tactile sensors, and 8 pressure sensors
- Various communication devices, including voice synthesizer, LED lights, and 2 high-fidelity speakers
- Intel Atom 1.6 GHz CPU (located in the head) that runs a Linux kernel and supports Aldebaran's proprietary middleware (NAOqi)
- Second CPU (located in the torso)
- 27.6-watt-hour battery that provides NAO with 1.5 or more hours of autonomy, depending on usage



Motion

Omnidirectional walking:
NAO's walking uses a simple dynamic model (linear inverted pendulum) and quadratic programming, stabilized using feedback from joint sensors. This makes walking robust and resistant to small disturbances, and torso oscillations in the frontal and lateral planes are absorbed. NAO can walk on a variety of floor surfaces, such as carpeted, tiled, and wooden floors, and can transition between these surfaces while walking.
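The linear inverted pendulum dynamics mentioned above can be sketched in a few lines. This is a generic illustration of the model, not Aldebaran's actual controller, and the constants (CoM height, time step) are assumed values:

```python
# Minimal linear inverted pendulum model (LIPM) sketch: the CoM accelerates
# away from the stance foot in proportion to its horizontal offset.
G = 9.81  # gravity, m/s^2

def lipm_step(x, v, foot, z_c, dt):
    """Advance the CoM of a linear inverted pendulum one Euler time step.
    x, v: CoM horizontal position/velocity (m, m/s); foot: stance-foot
    (pivot) position; z_c: constant CoM height; dt: time step."""
    a = (G / z_c) * (x - foot)  # LIPM: acceleration ~ CoM offset from pivot
    return x + v * dt, v + a * dt

# Simulate half a second: a CoM starting slightly ahead of the foot drifts
# further away, which is why the controller must keep re-placing the feet.
x, v = 0.01, 0.0
for _ in range(100):
    x, v = lipm_step(x, v, foot=0.0, z_c=0.26, dt=0.005)
print(round(x, 4))
```

A walk controller built on this model chooses footstep placements (via quadratic programming, as the text notes) so that the CoM trajectory stays bounded.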

Whole body motion:
NAO's motion module is based on generalized inverse kinematics, which handles Cartesian coordinates, joint control, balance, redundancy, and task priority. This means that when NAO is asked to extend its arm, it bends at the hips because its arm and leg joints are taken into account. NAO will stop its movement to maintain balance.

Fall Manager:
The Fall Manager protects NAO when it falls. Its main function is to detect when NAO's center of mass (CoM) shifts outside the support polygon, which is determined by the position of the foot or feet in contact with the ground. When a fall is detected, all motion tasks are killed and, depending on the direction of the fall, NAO's arms assume a protective position, the CoM is lowered, and joint stiffness is reduced to zero.
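The CoM-versus-support-polygon check at the heart of the Fall Manager can be sketched with a standard ray-casting point-in-polygon test. The polygon coordinates and function names below are illustrative assumptions, not NAO's implementation:

```python
# Fall detection sketch: is the ground projection of the CoM inside the
# support polygon? If not, a fall reflex should be triggered.
def point_in_polygon(px, py, polygon):
    """Ray-casting test: True if (px, py) lies inside the polygon given
    as a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge crosses the horizontal ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Assumed rectangular support polygon under one foot (metres)
support = [(-0.05, -0.04), (0.10, -0.04), (0.10, 0.04), (-0.05, 0.04)]
print(point_in_polygon(0.02, 0.0, support))  # True: CoM over the foot
print(point_in_polygon(0.20, 0.0, support))  # False: CoM outside, fall reflex
```

With both feet on the ground, the support polygon would be the convex hull of both foot outlines rather than a single rectangle.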



Vision

NAO has two cameras and can track, learn, and recognize images and faces.
NAO sees using two 960p cameras, which can capture up to 30 images per second.
The first camera, located on NAO’s forehead, scans the horizon, while the second located at mouth level scans the immediate surroundings.
The software lets you recover photos and video streams of what NAO sees. But eyes are only useful if you can interpret what you see.
That’s why NAO contains a set of algorithms for detecting and recognizing faces and shapes. NAO can recognize who is talking to it or find a ball or, eventually, more complex objects.
These algorithms have been specially developed, with constant attention to using a minimum of processor resources.
Furthermore, NAO’s SDK lets you develop your own modules to interface with OpenCV (the Open Source Computer Vision library originally developed by Intel).
Since you can execute modules on NAO or transfer them to a PC connected to NAO, you can easily use the OpenCV display functions to develop and test your algorithms with image feedback.



Tactile Sensors:
Besides cameras and microphones, NAO is fitted with capacitive sensors positioned on top of its head in three sections and on its hands.
You can therefore give NAO information through touch: pressing once to tell it to shut down, for example, or using the sensors as a series of buttons to trigger an associated action.
The system comes with LED lights that indicate the type of contact. You can also program complex sequences.

Sonar Rangefinders:
NAO is equipped with two sonar channels: two transmitters and two receivers.
They allow NAO to estimate the distances to obstacles in its environment. The detection range is 0–70 cm.
Below 15 cm there is no distance information: NAO only knows that an object is present.
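The sonar behaviour described above could be modelled by a small interpretation function. The function name and return convention are assumptions for illustration; only the 15 cm and 70 cm thresholds come from the text:

```python
# Sketch of interpreting a single sonar reading per the ranges quoted above.
def interpret_sonar(reading_m):
    """Map a raw sonar reading (metres) to an obstacle estimate:
    a (status, distance) pair where distance is None if unknown."""
    if reading_m > 0.70:
        return ("clear", None)      # beyond the 0-70 cm detection range
    if reading_m < 0.15:
        return ("obstacle", None)   # object present, distance not estimable
    return ("obstacle", reading_m)  # usable distance estimate

print(interpret_sonar(0.90))  # ('clear', None)
print(interpret_sonar(0.10))  # ('obstacle', None)
print(interpret_sonar(0.40))  # ('obstacle', 0.4)
```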



Aldebaran's New Nao Robot Demo

Nao Robot

Monday, June 11, 2012

Chapit, your new Domestic Robot


Today the Raytron company presents a small new robot named Chapit, an "intelligent" companion that helps with basic tasks such as turning on lights or switching on electric and electronic devices (television, air conditioning, and so on). One of Chapit's biggest advantages is its ability to recognize a man, woman, or child without any programming. The base model comes with a vocabulary of only about 100 words, but it can be taught up to 10,000. It also features an internet connection allowing remote control. Chapit is, of course, far from an ASIMO, but it appears to be better than a Nabaztag, for example.






Friday, June 8, 2012

ROBOFOOT - DFKI ROBOTIC CENTER

ROBOFOOT

Smart robotics for high added value footwear industry:

Scientific Leader:
Prof. Dr. Frank Kirchner
Short description:

The ROBOFOOT project addresses the urgent need for intelligent solutions to automate complex and still mostly manual processes in industrial production. Specifically, ROBOFOOT aims at introducing robotics into the manufacturing of footwear, which is still mainly handcrafted. The project develops robotic solutions to optimize and automate the production process, enabling higher-quality products at competitive prices.
The target area is the production of fashion and other high added value shoes where Europe still maintains its leadership. ROBOFOOT brings together a consortium composed of 10 institutions from 3 European countries (Italy, Spain and Germany).
Duration:
Start: 01/09/2010
End: 28/02/2013
Sponsors:
Funded by the European Commission; Grant agreement No. 260159 under the Call FP7-2010-NMP-ICT-FoF
Partner:
TEKNIKER (Coordinator) (SP), CNR (IT), INESCOP (SP), COMAU (IT), ROBOTNIK (SP), QDESIGN (IT), AYCN (SP), PIKOLINOS (SP), ROTTA (IT)
Project leader:
Dr.-Ing. José de Gea Fernández
Contact person:
Dr.-Ing. José de Gea Fernández
Website:
Downloads:



Project details:
The footwear industry in Europe is one of the most important sectors in terms of the number of people it employs.

A large part of the production of such shoes is still handcrafted. One of the main difficulties in automating this area is the high number of product variants due to the large number of models, sizes, and colors. Additionally, footwear manufacturing requires complex manufacturing and assembly processes as well as extensive labour during quality-control and packaging operations.

Although there have been past attempts to incorporate robotic solutions in this sector, they did not succeed (except in specific operations, for instance those related to the injection process), probably because the available technology and its cost were not yet adequate for the footwear industry. ROBOFOOT will demonstrate that it is feasible today.

Within the project, DFKI-RIC will mainly address the problem of manipulating shoes, which, as deformable objects, require high levels of dexterity. More specifically, DFKI-RIC will evaluate the use of multi-fingered hands and dual-arm robots in these scenarios. To provide the robot with flexibility, a learning component will allow it to learn new manipulation skills and task sequencing.

AILA - Robofoot HD

AILA - Teleoperation using Kinect (Demo)

Thursday, June 7, 2012

ApriAlpha: Toshiba's New Personal Robot

Toshiba has announced a new personal robot called the ApriAlpha. The 9.5 kg spherical robot has 3 wheels and sports a CCD camera complete with face recognition. It also has speech synthesis, speech recognition, and speech localization; that last feature allows it to come to you when you call. To stay in touch, it can use 802.11b or Bluetooth to link with your LAN. An IR receiver lets you command it with a conventional remote control, and it can also send photos taken with its camera to cell phones. It uses ultrasonic sensors for obstacle avoidance. The robot is powered by a conventional battery but can optionally use Toshiba's recently introduced methanol fuel cell.

(Front) ApriAlpha _v3, Apri sharp ear (Back)ApriAttenda


ApriAlpha (Toshiba household robot )

Homemade Robot

Tuesday, June 5, 2012

ROBOT AT WAR - AlphaDog Proto

ROBOT AT WAR - DARPA/Boston Dynamics LS3 AlphaDog

ROBOTS GO TO WAR 2

Dragonflies, fleas and dogs
Military robots come in an astonishing range of shapes and sizes. DelFly, a dragonfly-shaped surveillance drone built at the Delft University of Technology in the Netherlands, weighs less than a gold wedding ring, camera included. At the other end of the scale is America’s biggest and fastest drone, the $15m Avenger, the first of which recently began testing in Afghanistan. It uses a jet engine to carry up to 2.7 tonnes of bombs, sensors and other types of payload at more than 740kph (460mph).
On the ground, robots range from truck-sized to tiny. TerraMax, a robotics kit made by Oshkosh Defense, based in Wisconsin, turns military lorries or armoured vehicles into remotely controlled or autonomous machines. And smaller robotic beasties are hopping, crawling and running into action, as three models built by Boston Dynamics, a spin-out from the Massachusetts Institute of Technology (MIT), illustrate.
By jabbing the ground with a gas-powered piston, the Sand Flea can leap through a window, or onto a roof nine metres up. Gyro-stabilisers provide smooth in-air filming and landings. The 5kg robot then rolls along on wheels until another hop is needed—to jump up some stairs, perhaps, or to a rooftop across the street. Another robot, RiSE, resembles a giant cockroach and uses six legs, tipped with short, Velcro-like spikes, to climb coarse walls. Biggest of all is the LS3 (pictured above), a four-legged dog-like robot that uses computer vision to trot behind a human over rough terrain carrying more than 180kg of supplies. The firm says it could be deployed within three years.
Demand for land robots, also known as unmanned ground vehicles (UGVs), began to pick up a decade ago after American-led forces knocked the Taliban from power in Afghanistan. Soldiers hunting Osama bin Laden and his al-Qaeda fighters in the Hindu Kush were keen to send robot scouts into caves first. Remote-controlled ground robots then proved enormously helpful in the discovery and removal of makeshift roadside bombs in Afghanistan, Iraq, and elsewhere. Visiongain, a research firm, reckons a total of $689m will be spent on ground robots this year. The ten biggest buyers in descending order are America, followed by Israel, a distant second, and Britain, Germany, China, South Korea, Singapore, Australia, France and Canada.
Robots’ capabilities have steadily improved. Upload a mugshot into an SUGV, a briefcase-sized robot that runs on caterpillar tracks, and it can identify a man walking in a crowd and follow him. Its maker, iRobot, another MIT spin-out, is best known for its robot vacuum cleaners. Its latest military robot, FirstLook, is a smaller device that also runs on tracks. Equipped with four cameras, it is designed to be thrown through windows or over walls.
Another throwable reconnaissance robot, the Scout XT Throwbot made by Recon Robotics, based in Edina, Minnesota, was one of the stars of the Ground Robotics Capabilities conference held in San Diego in March. Shaped like a two-headed hammer with wheels on each head, the Scout XT has the heft of a grenade and can be thrown through glass windows. Wheel spikes provide traction on steep or rocky surfaces. In February the US Army ordered 1,100 Scout XTs for $13.9m. Another version, being developed with the US Navy, can be taken to a ship inside a small aquatic robot, and will use magnetic wheels to climb up the hull and onto the deck, says Alan Bignall, Recon’s boss.
Even more exotic designs are in development. DARPA, the research arm of America’s Department of Defence, is funding the development of small, soft robots that move like jerky slithering blobs. EATR, another DARPA project, is a foraging robot that gathers leaves and wood for fuel and then burns it to generate electricity. Researchers at Italy’s Sant’Anna School of Advanced Studies, in Pisa, have designed a snakelike aquatic robot. And a small helicopter drone called the Pelican, designed by German and American companies, could remain aloft for weeks, powered by energy from a ground-based laser.
Robots on the front line: iRobot’s SUGV; the Recon Robotics Scout XT Throwbot; and Rafael’s Samson Remote Weapon Station
All this technology may not always provide a meaningful advantage. This year the US Marine Corps will start testing Boston Dynamics’s four-legged beast of burden, the LS3. Its elaborate design keeps it upright even on rocky ground, and it is very difficult to knock over. But its petrol engine makes it as loud as a lawnmower. The Taliban have a much stealthier system, notes a former French army lieutenant. Their mules quietly eat grass.
A slippery slope to war?
A larger worry is that countries with high-performance military robots may be more inclined to launch attacks. Robots protect soldiers and improve their odds of success. Using drones sidesteps the tricky politics of putting boots on foreign soil. In the past eight years drone strikes by America’s Central Intelligence Agency (CIA) have killed more than 2,400 people in Pakistan, including 479 civilians, according to the Bureau for Investigative Journalism in London. Technological progress appears to have contributed to an increase in the frequency of strikes. In 2005 CIA drones struck targets in Pakistan three times; last year there were 76 strikes there. Do armed robots make killing too easy?
Not necessarily. When Mary Cummings, a former US Navy pilot, stopped flying F-18 fighter jets in 1997, there were no video links between cockpits and command centres, and even radio contact was patchy at times. As a result, pilots often made their own calls on whether or not to strike. Today’s drones, blimps, unmanned boats and reconnaissance robots collect and transmit so much data, she says, that Western countries now practise “warfare by committee”. Government lawyers and others in operations rooms monitor video feeds from robots to call off strikes that are illegal or would “look bad on CNN”, says Ms Cummings, who is now a robotics researcher at MIT. And unlike pilots at the scene, these remote observers are unaffected by the physical toll of flying a jet or the adrenalin rush of combat.
In March Britain’s Royal Artillery began buying robotic missiles designed by MBDA, a French company. The Fire Shadow is a “loitering munition” capable of travelling 100km, more than twice the maximum range of a traditional artillery shell. It can circle in the sky for hours, using sensors to track even a moving target. A human operator, viewing a video feed, then issues an instruction to attack, fly elsewhere to find a better target, or abort the mission by destroying itself. But bypassing the human operator to automate attacks would be, technologically, in the “realm of feasibility”, an MBDA spokesman says.
Could the “man in the loop” be removed from robotic weapons? The Israel Defence Forces have installed “combat proven” robot machineguns along the country’s borders. When sensors detect an intruder, the barrel pivots to follow him. A human soldier, watching the scene remotely via a fibre-optic link, decides whether or not to issue a warning (through a loudspeaker) or press the fire button. The robot sentry, the Samson Remote Weapon Station, could function without human intervention, says David Ishai of Rafael, its Israeli manufacturer, based in Haifa. But, he says, switching to automatic mode would be a bad idea—and illegal to boot.
Traditional rules of engagement stipulate that a human must decide if a weapon is to be fired. But this restriction is starting to come under pressure. Already, defence planners are considering whether a drone aircraft should be able to fire a weapon based on its own analysis. In 2009 the authors of a US Air Force report suggested that humans will increasingly operate not “in the loop” but “on the loop”, monitoring armed robots rather than fully controlling them. Better artificial intelligence will eventually allow robots to “make lethal combat decisions”, they wrote, provided legal and ethical issues can be resolved.
A report on the matter issued by Britain’s Ministry of Defence last year argued that if a drone’s control system takes appropriate account of the law on armed conflicts (basically military necessity, humanity, proportionality and the ability to distinguish between military targets and civilians), then an autonomous strike could meet legal norms. Testing and certifying such a system would be difficult. But the authors concluded that “as technology matures…policymakers will need to be aware of the potential legal issues and take advice at a very early stage of any new system’s procurement cycle.”
Pressure will grow for armies to automate their robots if only so machines can shoot before being shot, says Jürgen Altmann of the Technical University of Dortmund, in Germany, and a founder of the International Committee for Robot Arms Control, an advocacy group. Some robot weapons already operate without human operators to save precious seconds. An incoming anti-ship missile detected even a dozen miles away can be safely shot down only by a robot, says Frank Biemans, head of sensing technologies for the Goalkeeper automatic ship-defence cannons made by Thales Nederland.
Admittedly, that involves a machine destroying another machine. But as human operators struggle to assimilate the information collected by robotic sensors, decision-making by robots seems likely to increase. This might be a good thing, says Ronald Arkin, a roboticist at the Georgia Institute of Technology, who is developing “ethics software” for armed robots. By crunching data from drone sensors and military databases, it might be possible to predict, for example, that a strike from a missile could damage a nearby religious building. Clever software might be used to call off attacks as well as initiate them.
In the air, on land and at sea, military robots are proliferating. But the revolution in military robotics does have an Achilles heel, notes Emmanuel Goffi of the French air-force academy in Salon-de-Provence. As robots become more autonomous, identifying a human to hold accountable for a bloody blunder will become very difficult, he says. Should it be the robot’s programmer, designer, manufacturer, human overseer or his superiors? It is hard to say. The backlash from a deadly and well-publicised mistake may be the only thing that can halt the rapid march of the robots.

ROBOTS GO TO WAR

Robots go to war

March of the robots

Robotics: From reconnaissance to bomb-defusal to launching attacks, military robots are on the march, raising knotty ethical quandaries

Jun 2nd 2012 | from the print edition
IN THE early afternoon of August 18th 2008, a reconnaissance unit of about 100 French paratroopers, accompanied by a small number of Afghan and American soldiers, was ambushed by a similarly sized Taliban force in the Uzbin Valley, not far from Kabul. Ten French soldiers were killed in fighting that continued into the night—France’s biggest loss since it sent soldiers to Afghanistan in 2002. But it might have been avoided had the unit had a single aerial-robot scout, says Gérard de Boisboissel, a specialist on military robots at the French army’s Saint-Cyr military academy. That assessment, shared by many, led to a retooling of France’s armed forces. Today drones, also called unmanned aerial vehicles (UAVs), routinely accompany even small French units.
More broadly, fighting forces and intelligence services worldwide are equipping themselves with all manner of robots that operate on land and sea, and in the air. The conduct of war is being transformed—and largely, it seems, to the West’s advantage. But knotty ethical quandaries are cropping up as the mechanical guts, electronic sensors and digital brains of robots continue to improve. Some fear that robots, which are ingeniously mobile and can collect and process huge quantities of data, make it too easy to launch attacks. Others worry whether robots can be trusted to make their own decisions while in combat.