Friday, 23 October 2009

Robotics News!!


Scientists Create Robot Surrogate For Blind Persons In Testing Visual Prostheses

(Oct. 20, 2009) — Scientists at the California Institute of Technology (Caltech) have created a remote-controlled robot that is able to simulate the "visual" experience of a blind person who has been implanted with a visual prosthesis, such as an artificial retina. An artificial retina consists of a silicon chip studded with a varying number of electrodes that directly stimulate retinal nerve cells. It is hoped that this approach may one day give blind persons the freedom of independent mobility.


The CYCLOPS mobile robotic platform is designed to be used as a surrogate for blind persons in the testing of visual prostheses. (Credit: Caltech/Wolfgang Fink, Mark Tarbell)

The robot—or, rather, the mobile robotic platform, or rover—is called CYCLOPS. It is the first such device to emulate what the blind can see with an implant, says Wolfgang Fink, a visiting associate in physics at Caltech and the Edward and Maria Keonjian Distinguished Professor in Microelectronics at the University of Arizona. Its development and potential uses are described in a paper recently published online in the journal Computer Methods and Programs in Biomedicine.

An artificial retina, also known as a retinal prosthesis, may use either an internal or external miniature camera to capture images. The captured images then are processed and passed along to the implanted silicon chip's electrode array. (Ongoing work at Caltech's Visual and Autonomous Exploration Systems Research Laboratory by Fink and Caltech visiting scientist Mark Tarbell has focused on the creation and refinement of these image-processing algorithms.) The chip directly stimulates the eye's functional retinal ganglion cells, which carry the image information to the vision centers in the brain.
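The heart of such image processing is reducing a full camera frame to the handful of stimulation levels an electrode array can actually deliver. A minimal sketch of that pixelization step, assuming simple block averaging (the function name and the scheme are illustrative; Caltech's actual algorithms are far more sophisticated):

```python
# Downsample a grayscale camera frame to an electrode grid by block averaging.
# Illustrative only: real retinal-prosthesis pipelines apply much richer
# filtering and enhancement than plain averaging.

def pixelize(frame, grid_rows, grid_cols):
    """Reduce a 2-D list of pixel intensities (0-255) to a grid_rows x
    grid_cols grid of stimulation levels, one per electrode."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // grid_rows, w // grid_cols  # block size per electrode
    grid = []
    for r in range(grid_rows):
        row = []
        for c in range(grid_cols):
            block = [frame[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(sum(block) // len(block))
        grid.append(row)
    return grid

# A 4x4 frame reduced to a 2x2 "electrode array":
frame = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [255, 255, 0, 0],
         [255, 255, 0, 0]]
print(pixelize(frame, 2, 2))  # [[0, 255], [255, 0]]
```

Feeding CYCLOPS images degraded this way is what lets researchers compare, say, a 16-electrode grid against a 50-electrode grid under identical conditions.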

CYCLOPS fills a void in the process of testing visual prostheses, explains Fink. "How do you approximate what the blind can see with the implant so you can figure out how to make it better?" he asks.

One way is to test potential enhancements on a blind person who has been given an artificial retina. And, indeed, the retinal implant research team does this often, and extensively. But few people worldwide have been implanted with retinal prostheses, and there is only so much testing they can be asked to endure.

Another way is to give sighted people devices that downgrade their vision to what might be expected using artificial vision prostheses. And this, too, is often done. But it's a less-than-ideal solution since the brain of a sighted person is adept at taking poor-quality images and processing them in various ways, adding detail as needed. This processing is what allows most people to see in dim light, for example, or through smoke or fog.

"A sighted person's objectivity is impaired," Fink says. "They may not be able to get to the level of what a blind person truly experiences."

Enter one more possible solution: CYCLOPS. "We can use CYCLOPS in lieu of a blind person," Fink explains. "We can equip it with a camera just like what a blind person would have with a retinal prosthesis, and that puts us in the unique position of being able to dictate what the robot receives as visual input."

Now, if scientists want to see how much better the resolution is when a retinal prosthesis has an array of 50 pixels as opposed to 16 pixels, they can try both out on CYCLOPS. They might do this by asking the robot to follow a black line down a white-tiled hallway, or seeing if it can find—and enter—a darkened doorway.

"We're not quite at that stage yet," Fink cautions, referring to such independent maneuvering.

CYCLOPS's camera is gimballed, which means it can emulate left-to-right and up-and-down head movements. The input from the camera runs through the onboard computing platform, which does real-time image processing. For now, however, the platform itself is moved around remotely, via a joystick. "The platform can be operated from anywhere in the world, through its wireless Internet connection," says Tarbell.

"We have the image-processing algorithms running locally on the robot's platform—but we have to get it to the point where it has complete control of its own responses," Fink says.

Once that's done, he adds, "we can run many, many tests without bothering the blind prosthesis carriers."

Among the things they hope to learn from such testing is how to enhance a workplace or living environment to make it more accessible to a blind person with a particular vision implant. If CYCLOPS can use computer-enhanced images from a 50-pixel array to make its way safely through a room with a chair in one corner, a sofa along the wall, and a coffee table in the middle, then there is a good chance that a blind person with a 50-pixel retinal prosthesis would be able to do the same.

The results of tests on the CYCLOPS robot should also help researchers determine whether a particular version of a prosthesis, say, or its onboard image-processing software, is even worth testing in blind persons. "We'll be coming in with a much more educated initial starting point, after which we'll be able to see how blind people work with these implants," Fink notes.

And the implants need to work well. After all, Fink points out, "Blind people using a cane or a canine unit can move around impressively well. For an implant to be useful, it has to have the implicit promise that it will surpass these tools. The ultimate promise—the hope—is that we instill in them such useful vision that they can attain independent mobility, can recognize people, and can go about their daily lives."

The work done in the paper by Fink and Tarbell, "CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses," was supported by a grant from the National Science Foundation. Fink and Tarbell have filed a provisional patent on the technology on behalf of Caltech.


Adapted from materials provided by California Institute of Technology, via EurekAlert!, a service of AAAS.

Swimming Robot Makes Waves At Bath

(Sep. 25, 2009) — Researchers at the University of Bath have used nature for inspiration in designing a new type of swimming robot which could bring a breakthrough in submersible technology.


Postgraduate researchers Keri Collins and Ryan Ladd developed the Gymnobot. It is powered by a fin that runs the length of the underside of its rigid body; this undulates to make a wave in the water which propels the robot forwards. (Credit: Image courtesy of University of Bath)

Conventional submarine robots are powered by propellers that are heavy, inefficient and can get tangled in weeds.

In contrast, 'Gymnobot', created by researchers from the Ocean Technologies Lab in the University's Department of Mechanical Engineering, is powered by a fin that runs the length of the underside of its rigid body; this undulates to make a wave in the water which propels the robot forwards.

The design, inspired by the Amazonian knifefish, is thought to be more energy efficient than conventional propellers and allows the robot to navigate shallow water near the sea shore.

Gymnobot could be used to film and study the diverse marine life near the seashore, where conventional submersible robots would have difficulty manoeuvring due to the shallow water with its complex rocky environment and plants that can tangle a propeller.

Dr William Megill, Lecturer in Biomimetics at the University of Bath, explained: "The knifefish has a ventral fin that runs the length of its body and makes a wave in the water that enables it to easily swim backwards or forwards in the water.

"Gymnobot mimics this fin and creates a wave in the water that drives it forwards. This form of propulsion is potentially much more efficient than a conventional propeller and is easier to control in shallow water near the shore."

Keri Collins, a postgraduate student who developed the Gymnobot as part of her PhD, added: "We hope to observe how the water flows around the fin in later stages of the project. In particular we want to look at the creation and development of vortices around the fin.

"Some fish create vortices when flicking their tails one way but then destroy them when their tails flick back the other way. By destroying the vortex they are effectively re-using the energy in that swirling bit of water. The less energy left in the wake when the fish has passed, the less energy is wasted.

"It will be particularly interesting to see how thrust is affected by changing the wave of the fin from a constant amplitude to one that is tapered at one end."
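The fin's motion can be described as a traveling wave along the body. A short sketch of the kinematics, comparing a constant amplitude with one tapered toward the head (all parameter values here are illustrative assumptions, not Gymnobot's actual specification):

```python
import math

def fin_displacement(x, t, length, amplitude, wavelength, frequency, taper=False):
    """Lateral displacement of a fin point at position x (0..length, metres)
    and time t (seconds) for a traveling wave y = A(x) * sin(kx - wt).
    With taper=True the amplitude grows linearly from zero at the head
    (x = 0) to its full value at the tail (x = length)."""
    a = amplitude * (x / length) if taper else amplitude
    k = 2 * math.pi / wavelength   # wavenumber
    w = 2 * math.pi * frequency    # angular frequency
    return a * math.sin(k * x - w * t)

# The tapered fin is motionless at the head but reaches full swing at the tail:
print(fin_displacement(0.0, 0.0, 1.0, 0.05, 0.5, 2.0, taper=True))  # 0.0
```

Sweeping such parameters numerically, before testing them on hardware, is one way to explore how a tapered amplitude changes the thrust the fin produces.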

The lab was recently awarded a grant to work with six other European institutions to create a similar robot that reacts to water flow and is able to swim against currents.

In addition to studying biodiversity near the shore and in fast-flowing rivers, robots like Gymnobot could also be used for detecting pollution in the environment or for inspecting structures such as oil rigs.

The project was funded by BMT Defence Services and the Engineering & Physical Sciences Research Council.


Adapted from materials provided by University of Bath, via AlphaGalileo.

Research Teams Successfully Operate Multiple Biomedical Robots From Numerous Locations

ScienceDaily (Sep. 18, 2009) — Using a new software protocol called the Interoperable Telesurgical Protocol, nine research teams from universities and research institutes around the world recently collaborated on the first successful demonstration of multiple biomedical robots operated from different locations in the U.S., Europe, and Asia. SRI International operated its M7 surgical robot for this demonstration.


SRI M7 system. (Credit: Image courtesy of SRI International)

In a 24-hour period, each participating group connected over the Internet and controlled robots at different locations. The tests performed demonstrated how a wide variety of robot and controller designs can seamlessly interoperate, allowing researchers to work together easily and more efficiently. In addition, the demonstration evaluated the feasibility of robotic manipulation from multiple sites, and was conducted to measure time and performance for evaluating laparoscopic surgical skills.

New Interoperable Telesurgical Protocol

The new protocol was cooperatively developed by the University of Washington and SRI International to standardize the way remotely operated robots are managed over the Internet.

"Although many telemanipulation systems have common features, there is currently no accepted protocol for connecting these systems," said SRI's Tom Low. "We hope this new protocol serves as a starting point for the discussion and development of a robust and practical Internet-type standard that supports the interoperability of future robotic systems."

The protocol will allow engineers and designers who usually develop technologies independently to work collaboratively, determine which designs work best, encourage widespread adoption of the new communications protocol, and help robotics research evolve more rapidly. Early international adoption of the protocol will encourage robotic systems to be developed with interoperability in mind, avoiding future incompatibilities.
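The article does not spell out the protocol's wire format, but interoperability of this kind typically rests on a small, agreed set of command messages that every controller can produce and every robot can consume. A hypothetical sketch of such a message (the field names and JSON encoding are assumptions for illustration, not the actual Interoperable Telesurgical Protocol):

```python
import json

def encode_command(seq, position_mm, orientation_quat, gripper):
    """Serialize one telemanipulation command as JSON.
    Hypothetical format: sequence number, Cartesian tool-tip target in mm,
    orientation as a unit quaternion (w, x, y, z), gripper opening 0..1."""
    return json.dumps({
        "seq": seq,
        "pos": position_mm,
        "quat": orientation_quat,
        "grip": gripper,
    })

def decode_command(payload):
    """Parse a command regardless of which lab's controller produced it;
    sharing one schema is what lets heterogeneous robots interoperate."""
    msg = json.loads(payload)
    return msg["seq"], msg["pos"], msg["quat"], msg["grip"]

wire = encode_command(1, [120.0, 45.5, -10.0], [1.0, 0.0, 0.0, 0.0], 0.5)
print(decode_command(wire))
```

Any operator station that emits this schema could drive any robot that parses it, which is the essence of the interoperability the nine teams demonstrated.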

"We're very pleased with the success of the event, in which almost all of the possible connections between operator stations and remote robots were successful. We were particularly excited that novel elements such as a simulated robot and an exoskeleton controller worked smoothly with the other remote manipulation systems," said Professor Blake Hannaford of the University of Washington.

The demonstration included the following organizations:

  • SRI International, Menlo Park, Calif., USA
  • University of Washington Biorobotics Lab (BRL), Seattle, Washington, USA
  • University of California at Santa Cruz (UCSC), Bionics Lab, Santa Cruz, Calif., USA
  • iMedSim, Interactive Medical Simulation Laboratory, Rensselaer Polytechnic Institute, Troy, New York, USA
  • Korea University of Technology (KUT) BioRobotics Lab, Cheonan, South Chungcheong, South Korea
  • Imperial College London (ICL), London, England
  • Johns Hopkins University (JHU), Baltimore, Maryland, USA
  • Technische Universität München (TUM), Munich, Germany
  • Tokyo Institute of Technology (TOK), Tokyo, Japan

For more information regarding availability of the Interoperable Telesurgical Protocol, please visit: http://brl.ee.washington.edu/Research_Active/Interoperability/index.php/Main_Page


Adapted from materials provided by SRI International.




Saturday, 10 October 2009





We're the champions!!!!
We won 1st place in the sumo robot contest at STIKOM and 3rd place in the line-tracer robot contest.
Thank you, friends, for all your prayers and support.
All of this is not only the result of our own hard work,
but because God opened the best path for all of us, and has the most beautiful plan for us all.
Stay humble, humble, and humble.
Amen.

Photo: the 1st-place sumo robot and 3rd-place line-tracer robot winners