U.S. Robotics Roadmap calls for white papers for revision

The U.S. National Robotics Roadmap was first published in 2009 and then revised in 2013 and 2016. Since then, government agencies, universities, and companies have used it as a reference for where robotics is going. The objective is to publish the fourth version of the roadmap by summer 2020.

The team developing the U.S. National Robotics Roadmap has put out a call to engage about 150 to 200 people from academia and industry to ensure that it is representative of the robotics community’s view of the future. The roadmap will cover manufacturing, service, medical, first-responder, and space robotics.

The revised roadmap will also include considerations related to ethics and workforce. It will cover emerging applications, the key challenges to progress, and what research and development is needed.

Join community workshops

Three one-and-a-half-day workshops will be organized for community input to the roadmap. The workshops will take place as follows:

  • Sept. 11-12 in Chicago (organized by Nancy Amato, co-director of the Parasol Lab at Texas A&M University and head of the Department of Computer Science at the University of Illinois at Urbana-Champaign)
  • Oct. 17-18 in Los Angeles (organized by Maja Mataric, Chan Soon-Shiong distinguished professor of computer science, neuroscience, and pediatrics at the University of Southern California)
  • Nov. 15-16 in Lowell, Mass. (organized by Holly Yanco, director of the NERVE Center at the University of Massachusetts Lowell)

Participation in these workshops will be by invitation only. To participate, please submit a white paper/position statement of no more than 1.5 pages. What are the key use cases for robotics over a five-to-10-year horizon? What are the key limitations? What R&D is needed in that time frame? A white paper can address all three questions or focus on just one of them. The white paper must include the following information:

  • Name, affiliation, and e-mail address
  • A position statement (1.5 pages max)

Please submit the white paper as regular text or as a PDF file. Statements that are too long will be ignored. Position papers that only focus on current research are not appropriate. A white paper should present a future vision and not merely discuss state of the art.

White papers should be submitted by end of the day Aug. 15, 2019, to roadmapping@robotics-vo.org. Late submissions may not be considered. We will evaluate submitted white papers by Aug. 18 and select people for the workshops by Aug. 19.

Roadmap revision timeline

The workshop reports will be used as the basis for a synthesis of a new roadmap. The nominal timeline is:

  • August 2019: Call for white papers
  • September – November 2019: Workshops
  • December 2019: Workshop reports finalized
  • January 2020: Synthesis meeting at UC San Diego
  • February 2020: Publish draft roadmap for community feedback
  • April 2020: Revision of roadmap based on community feedback
  • May 2020: Finalize roadmap with graphics design
  • July 2020: Publish roadmap

If you have any questions about the process, the scope, etc., please send e-mail to Henrik I Christensen at hichristensen@eng.ucsd.edu.

Henrik I Christensen spoke at the Robotics Summit & Expo in Boston.

Editor’s note: Christensen, Qualcomm Chancellor’s Chair of Robot Systems at the University of California San Diego and co-founder of Robust AI, delivered a keynote address at last month’s Robotics Summit & Expo, produced by The Robot Report.

The post U.S. Robotics Roadmap calls for white papers for revision appeared first on The Robot Report.

Soft robots controlled by magnets, light in new research

Researchers from North Carolina State University and Elon University have developed a technique that allows them to remotely control the movement of soft robots, lock them into position for as long as needed, and later reconfigure the robots into new shapes. The technique relies on light and magnetic fields.

“We’re particularly excited about the reconfigurability,” said Joe Tracy, a professor of materials science and engineering at NC State and corresponding author of a paper on the work. “By engineering the properties of the material, we can control the soft robot’s movement remotely; we can get it to hold a given shape; we can then return the robot to its original shape or further modify its movement; and we can do this repeatedly. All of those things are valuable, in terms of this technology’s utility in biomedical or aerospace applications.”

LEDs make soft robots pliable

For this work, the researchers used soft robots made of a polymer embedded with magnetic iron microparticles. Under normal conditions, the material is relatively stiff and holds its shape.

However, researchers can heat up the material using light from a light-emitting diode (LED), which makes the polymer pliable. Once pliable, researchers demonstrated that they could control the shape of the robot remotely by applying a magnetic field. After forming the desired shape, researchers could remove the LED light, allowing the robot to resume its original stiffness — effectively locking the shape in place.

By applying the light a second time and removing the magnetic field, the researchers could get the soft robots to return to their original shapes. Or they could apply the light again and manipulate the magnetic field to move the robots or get them to assume new shapes.
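
The cycle described above (heat to soften, shape with the magnetic field, cool to lock, repeat) can be sketched as a tiny state machine. This is purely illustrative: the class, method names, and discrete states are invented, and the real material responds continuously.

```python
# Illustrative state machine for the reported control cycle. All names and
# the discrete on/off behavior are assumptions, not the paper's model.

class SoftRobot:
    def __init__(self):
        self.pliable = False      # stiff at room temperature
        self.shape = "original"   # currently locked-in shape

    def led_on(self):
        self.pliable = True       # photothermal heating softens the polymer

    def led_off(self):
        self.pliable = False      # cooling locks in the current shape

    def apply_field(self, target_shape):
        # Embedded iron microparticles follow the field only while soft.
        if self.pliable:
            self.shape = target_shape
        return self.shape

robot = SoftRobot()
robot.apply_field("grabber closed")   # no effect while the material is stiff
robot.led_on()
robot.apply_field("grabber closed")   # reshapes while pliable
robot.led_off()                       # shape now holds with no power applied
```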

In experimental testing, the researchers demonstrated that the soft robots could be used to form “grabbers” for lifting and transporting objects. The soft robots could also be used as cantilevers, or folded into “flowers” with petals that bend in different directions.

“We are not limited to binary configurations, such as a grabber being either open or closed,” said Jessica Liu, first author of the paper and a Ph.D. student at NC State. “We can control the light to ensure that a robot will hold its shape at any point.”

Iron microparticles can be used to make soft robots move. Source: North Carolina State University

Streamlining robot design

In addition, the researchers developed a computational model that can be used to streamline the soft robot design process. The model allows them to fine-tune a robot’s shape, polymer thickness, the abundance of iron microparticles in the polymer, and the size and direction of the required magnetic field before constructing a prototype to accomplish a specific task.

“Next steps include optimizing the polymer for different applications,” Tracy said. “For example, engineering polymers that respond at different temperatures in order to meet the needs of specific applications.”

Authors and support

The paper, “Photothermally and Magnetically Controlled Reconfiguration of Polymer Composites for Soft Robotics,” appears in the journal Science Advances. In addition to Liu as first author, the paper was co-authored by Jonathan Gillen, a former undergraduate at NC State; Sumeet Mishra, a former Ph.D. student at NC State; and Benjamin Evans, an associate professor of physics at Elon University.

The work was done with support from the National Science Foundation (NSF) under grants CMMI-1663416 and CMMI-1662641. The work was also supported by the Research Triangle MRSEC, which is funded by NSF under grant DMR-1121107; and by NC State’s Analytical Instrumentation Facility and the Duke University Shared Materials Instrumentation Facility, which are supported by the State of North Carolina and NSF grant ECCS-1542015.

Wearable device could improve communication between humans, robots

An international team of scientists has developed an ultra-thin, wearable electronic device that facilitates smooth communication between humans and machines. The researchers said the new device is easy to manufacture and imperceptible when worn. It could be applied to human skin to capture various types of physical data for better health monitoring and early disease detection, or it could enable robots to perform specific tasks in response to physical cues from humans.

Wearable human-machine interfaces have had challenges — some are made from rigid electronic chips and sensors that are uncomfortable and restrict the body’s motion, while others consist of softer, more wearable elastic materials but suffer from slow response times.

While researchers have developed thin inorganic materials that wrinkle and bend, the challenge remains to develop wearable devices with multiple functions that enable smooth communication between humans and machines.

The team that wrote the paper included Kyoseung Sim, Zhoulyu Rao, Faheem Ershad, Jianming Lei, Anish Thukral, Jie Chen, and Cunjiang Yu at the University of Houston. It also included Zhanan Zou and Jianling Xiao at the University of Colorado, Boulder, and Qing-An Huang at Southeast University in Nanjing, China.

Wearable nanomembrane reads human muscle signals

Kyoseung Sim and colleagues designed a nanomembrane made from indium zinc oxide, using a chemical processing approach that allows them to tune the material’s texture and surface properties. The resulting devices were only 3 to 4 micrometers thick and snake-shaped, properties that allow them to stretch and remain unnoticed by the wearer.

When worn by humans, the devices could collect signals from muscle and use them to directly guide a robot, enabling the user to feel what the robot hand experienced. The devices maintain their function when human skin is stretched or compressed.

Soft, unnoticeable, multifunctional, electronics-based, wearable human-machine interface devices. Credit: Cunjiang Yu

The researchers also found that sensors made from this nanomembrane material could be designed to monitor UV exposure (to mitigate skin disease risk) or to detect skin temperature (to provide early medical warnings), while still functioning well under strain.

Editor’s note: This month’s print issue of The Robot Report, which is distributed with Design World, focuses on exoskeletons. It will be available soon.

Roach-inspired robot shares insect’s speed, toughness

If the sight of a skittering bug makes you squirm, you may want to look away — a new insect-sized robot created by researchers at the University of California, Berkeley, can scurry across the floor at nearly the speed of a darting cockroach. And it’s nearly as hardy as a roach is. Try to squash this robot under your foot, and more than likely, it will just keep going.

“Most of the robots at this particular small scale are very fragile. If you step on them, you pretty much destroy the robot,” said Liwei Lin, a professor of mechanical engineering at UC Berkeley and senior author of a new study that describes the robot. “We found that if we put weight on our robot, it still more or less functions.”

Small-scale robots like these could be advantageous in search-and-rescue missions, squeezing and squishing into places where dogs or humans can’t fit, or where it may be too dangerous for them to go, said Yichuan Wu, first author of the paper, who completed the work as a graduate student in mechanical engineering at UC Berkeley through the Tsinghua-Berkeley Shenzhen Institute partnership.

“For example, if an earthquake happens, it’s very hard for the big machines, or the big dogs, to find life underneath debris, so that’s why we need a small-sized robot that is agile and robust,” said Wu, who is now an assistant professor at the University of Electronic Science and Technology of China.

The study appears this week in the journal Science Robotics.

PVDF provides roach-like characteristics

The robot, which is about the size of a large postage stamp, is made of a thin sheet of a piezoelectric material called polyvinylidene fluoride, or PVDF. Piezoelectric materials are unique, in that applying electric voltage to them causes the materials to expand or contract.

The robot is built of a layered material that bends and straightens when AC voltage is applied, causing it to spring forward in a “leapfrogging” motion. Credit: UC Berkeley video and photo by Stephen McNally

The researchers coated the PVDF in a layer of an elastic polymer, which causes the entire sheet to bend, instead of to expand or contract. They then added a front leg so that, as the material bends and straightens under an electric field, the oscillations propel the device forward in a “leapfrogging” motion.

The resulting robot may be simple to look at, but it has some remarkable abilities. It can sail along the ground at a speed of 20 body lengths per second, a rate comparable to that of a roach and reported to be the fastest pace among insect-scale robots. It can zip through tubes, climb small slopes, and carry small loads, such as a peanut.

Perhaps most impressively, the robot, which weighs less than one-tenth of a gram, can withstand a weight of around 60 kg (132 lb.) — about the weight of an average human — which is approximately 1 million times the weight of the robot.
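
A quick sanity check on those figures, taking the robot's mass at its stated upper bound (using exactly 0.1 g is an assumption drawn from "less than one tenth of a gram"):

```python
# Order-of-magnitude check of the load figures quoted above.
robot_mass_kg = 0.1e-3    # "less than one tenth of a gram", upper bound
load_kg = 60.0            # about the mass of an average human

ratio = load_kg / robot_mass_kg
print(f"load is {ratio:,.0f}x the robot's mass")  # 600,000x at the 0.1 g bound
# A robot mass nearer 0.06 g would give the article's "approximately 1 million times."
```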

“People may have experienced that, if you step on the cockroach, you may have to grind it up a little bit, otherwise the cockroach may still survive and run away,” Lin said. “Somebody stepping on our robot is applying an extraordinarily large weight, but [the robot] still works, it still functions. So, in that particular sense, it’s very similar to a cockroach.”

The robot is currently “tethered” to a thin wire that carries an electric voltage that drives the oscillations. The team is experimenting with adding a battery so the roach robot can roam independently. They are also working to add gas sensors and are improving the design of the robot so it can be steered around obstacles.

Co-authors of the paper include Justin K. Yim, Zhichun Shao, Mingjing Qi, Junwen Zhong, Zihao Luo, Ronald S. Fearing and Robert J. Full of UC Berkeley, Xiaojun Yan of Beihang University and Jiaming Liang, Min Zhang and Xiaohao Wang of Tsinghua University.

This work is supported in part by the Berkeley Sensor and Actuator Center, an Industry-University Cooperation Research Center.

Editor’s note: This article republished from the University of California, Berkeley.

LUKE prosthetic arm has sense of touch, can move in response to thoughts

Keven Walgamott had a good “feeling” about picking up the egg without crushing it. What seems simple for nearly everyone else can be more of a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing out the prototype of LUKE, a high-tech prosthetic arm with fingers that not only can move, they can move with his thoughts. And thanks to a biomedical engineering team at the University of Utah, he “felt” the egg well enough so his brain could tell the prosthetic hand not to squeeze too hard.

That’s because the team, led by University of Utah biomedical engineering associate professor Gregory Clark, has developed a way for the “LUKE Arm” (named after the robotic hand that Luke Skywalker got in The Empire Strikes Back) to mimic the way a human hand feels objects by sending the appropriate signals to the brain.

Their findings were published in a new paper co-authored by University of Utah biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark, and other colleagues in the latest edition of the journal Science Robotics.

Sending the right messages

“We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits,” George says. “We’re making more biologically realistic signals.”

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up, and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

“It almost put me to tears,” Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. “It was really amazing. I never thought I would be able to feel in that hand again.”

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the University of Utah, was able to pluck grapes without crushing them, pick up an egg without cracking it, and hold his wife’s hand with a sensation in the fingers similar to that of an able-bodied person.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

Those feats are accomplished through a complex series of mathematical calculations and modeling.

Keven Walgamott wears the LUKE prosthetic arm. Credit: University of Utah Center for Neural Interfaces

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made of mostly metal motors and parts with a clear silicon “skin” over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the University of Utah team has been developing a system that allows the prosthetic arm to tap into the wearer’s nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by University of Utah biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array.

The Array is a bundle of 100 microelectrodes and wires that are implanted into the amputee’s nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.
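
The article does not spell out the decoding math. As a hedged sketch, one common approach in the neuroprosthetics literature is a linear decoder that maps per-channel firing rates to a motor command; the channel count, weights, and single "grip velocity" output below are all invented for illustration.

```python
# Hypothetical linear decode from electrode firing rates to a motor command.
# Real systems use on the order of 100 channels and trained weights; these
# three channels and their weights are made up.

def decode_grip_velocity(firing_rates_hz, weights, bias=0.0):
    """Weighted sum of per-channel firing rates -> signed grip velocity."""
    return sum(w * r for w, r in zip(weights, firing_rates_hz)) + bias

weights = [0.02, 0.015, -0.03]   # made-up calibration for three channels
rates = [50.0, 40.0, 10.0]       # spikes per second on each channel

velocity = decode_grip_velocity(rates, weights)
print(f"grip velocity command: {velocity:+.2f}")   # positive -> close the hand
```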

But it also works the other way. Performing tasks such as picking up objects requires more than the brain telling the hand to move. The prosthetic hand must also learn how to “feel” the object in order to know how much pressure to exert, because you can’t figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the Array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how your brain deals with transitions in information when it first touches something. Upon first contact of an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.
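
That burst-then-taper profile can be modeled, very roughly, as an exponentially adapting rate. The peak, sustained rate, and time constant below are illustrative assumptions, not the team's fitted values, which were derived from recorded nerve data.

```python
import math

# Rough sketch of the "burst then taper" contact signal described above,
# modeled as an exponentially adapting stimulation rate. All parameters
# are invented for illustration.

def contact_rate(t_s, peak_hz=200.0, sustained_hz=40.0, tau_s=0.05):
    """Stimulation rate t_s seconds after first contact with an object."""
    return sustained_hz + (peak_hz - sustained_hz) * math.exp(-t_s / tau_s)

print(f"at contact:   {contact_rate(0.0):5.1f} Hz")   # strong onset burst
print(f"after 250 ms: {contact_rate(0.25):5.1f} Hz")  # tapered toward steady rate
```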

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” says Clark.

To achieve that, Clark’s team used mathematical calculations along with recorded impulses from a primate’s arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.

Future research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the University of Utah’s Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago’s Department of Organismal Biology and Anatomy, the Cleveland Clinic’s Department of Biomedical Engineering, and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

“This is an incredible interdisciplinary effort,” says Clark. “We could not have done this without the substantial efforts of everybody on that team.”

Editor’s note: Reposted from the University of Utah.

Sea Machines Robotics to demonstrate autonomous spill response

Source: Sea Machines Robotics

BOSTON — Sea Machines Robotics Inc. said this week that it has entered into a cooperative agreement with the U.S. Department of Transportation’s Maritime Administration to demonstrate how its autonomous technology can increase the safety, response time, and productivity of marine oil-spill response operations.

Sea Machines was founded in 2015 and claimed to be “the leader in pioneering autonomous control and advanced perception systems for the marine industries.” The company builds software and systems to increase the safety, efficiency, and performance of ships, workboats, and commercial vessels worldwide.

The U.S. Maritime Administration (MARAD) is an agency of the U.S. Department of Transportation that promotes waterborne transportation and its integration with other segments of the transportation system.

Preparing for oil-spill exercise

To make the on-water exercises possible, Sea Machines will install its SM300 autonomous-command system aboard a MARCO skimming vessel owned by Marine Spill Response Corp. (MSRC), a not-for-profit, U.S. Coast Guard-classified oil spill removal organization (OSRO). MSRC was formed with the Marine Preservation Association to offer oil-spill response services in accordance with the Oil Pollution Act of 1990.

Sea Machines plans to train MSRC personnel to operate its system. Then, on Aug. 21, Sea Machines and MSRC will execute simulated oil-spill recovery exercises in the harbor of Portland, Maine, before an audience of government, naval, international, environmental, and industry partners.

The response skimming vessel is manufactured by Seattle-based Kvichak Marine Industries and is equipped with a MARCO filter belt skimmer to recover oil from the surface of the water. This vessel typically operates in coastal or near-shore areas. Once installed, the SM300 will give the MSRC vessel the following new capabilities:

  • Remote autonomous control from an onshore location or secondary vessel,
  • ENC-based mission planning,
  • Autonomous waypoint tracking,
  • Autonomous grid line tracking,
  • Collaborative autonomy for multi-vessel operations,
  • Wireless remote payload control to deploy onboard boom and other response equipment, and
  • Obstacle detection and collision avoidance.
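
Of these, "autonomous grid line tracking" usually means sweeping back-and-forth survey lanes over an area. Below is a minimal sketch of such a lane plan in a local coordinate frame; the SM300's actual planner and interfaces are not public, so the function, units, and numbers are all assumptions.

```python
# Sketch of grid-line tracking as a boustrophedon (lawnmower) waypoint
# pattern over a rectangular spill area. Coordinates are local meters;
# this is an illustration, not Sea Machines' planner.

def grid_waypoints(width_m, height_m, lane_spacing_m):
    """Back-and-forth survey lanes covering a width x height rectangle."""
    waypoints = []
    y = 0.0
    lane = 0
    while y <= height_m:
        # Alternate lane direction so the vessel sweeps continuously.
        xs = (0.0, width_m) if lane % 2 == 0 else (width_m, 0.0)
        waypoints.append((xs[0], y))
        waypoints.append((xs[1], y))
        y += lane_spacing_m
        lane += 1
    return waypoints

route = grid_waypoints(width_m=200.0, height_m=100.0, lane_spacing_m=25.0)
print(f"{len(route)} waypoints, first lane: {route[0]} -> {route[1]}")
```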

Round-the-clock response

In addition, Sea Machines said its technology enables minimally manned and unmanned autonomous maritime operations. Such configurations allow operators to respond to spill events around the clock, depending on recovery conditions, even when crews are unavailable or restricted, the company said. These configurations also reduce or eliminate crewmembers’ exposure to toxic fumes and other safety hazards.

“Autonomous technology has the power to not only help prevent vessel accidents that can lead to spills, but can also facilitate better preparedness and aid in safer, more efficient, and effective cleanup,” said Michael G. Johnson, CEO of Sea Machines. “We look forward to working closely with MARAD and MSRC in these industry-modernizing exercises.”

“Our No. 1 priority is the safety of our personnel at MSRC,” said John Swift, vice president at MSRC. “The ability to use autonomous technology — allowing response operations to continue in an environment where their safety may be at risk — furthers our mission of response preparedness.”

Sea Machines promises rapid ROI for multiple vessels

Sea Machines’ SM Series of products, which includes the SM300 and SM200, provides marine operators with task-driven, computer-guided vessel control, bringing advanced autonomy within reach for small- and large-scale operations. SM products can be installed aboard existing or new-build commercial vessels, with a return on investment typically seen within a year.

In addition, Sea Machines has received funding from Toyota AI Ventures.

Sea Machines is also a leading developer of advanced perception and navigation assistance technology for a range of vessel types, including container ships. The company is currently testing its perception and situational awareness technology aboard one of A.P. Moller-Maersk’s new-build ice-class container ships.

Microrobots activated by laser pulses could deliver medicine to tumors

Targeting medical treatment to an ailing body part is a practice as old as medicine itself. Drops go into itchy eyes. A broken arm goes into a cast. But often what ails us is inside the body and is not so easy to reach. In such cases, a treatment like surgery or chemotherapy might be called for. A pair of researchers in Caltech’s Division of Engineering and Applied Science are working on an entirely new form of treatment — microrobots that can deliver drugs to specific spots inside the body while being monitored and controlled from outside the body.

“The microrobot concept is really cool because you can get micromachinery right to where you need it,” said Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering at the California Institute of Technology. “It could be drug delivery, or a predesigned microsurgery.”

The microrobots are a joint research project of Wang and Wei Gao, assistant professor of medical engineering, and are intended for treating tumors in the digestive tract.

Developing jet-powered microrobots

The microrobots consist of microscopic spheres of magnesium metal coated with thin layers of gold and parylene, a polymer that resists digestion. The layers leave a circular portion of the sphere uncovered, kind of like a porthole. The uncovered portion of the magnesium reacts with the fluids in the digestive tract, generating small bubbles. The stream of bubbles acts like a jet and propels the sphere forward until it collides with nearby tissue.
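
The article says only that the exposed magnesium reacts with digestive fluids to generate bubbles. The standard chemistry behind that (background knowledge, not stated in the article) is magnesium oxidation releasing hydrogen gas, which forms the propulsive bubbles:

```latex
% Magnesium-water and magnesium-acid reactions (background, not from the
% article); the evolved hydrogen gas forms the thrust bubbles.
\begin{align*}
\mathrm{Mg} + 2\,\mathrm{H_2O} &\longrightarrow \mathrm{Mg(OH)_2} + \mathrm{H_2}\uparrow\\
\mathrm{Mg} + 2\,\mathrm{HCl} &\longrightarrow \mathrm{MgCl_2} + \mathrm{H_2}\uparrow
\end{align*}
```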

On their own, magnesium spherical microrobots that can zoom around might be interesting, but they are not especially useful. To turn them from a novelty into a vehicle for delivering medication, Wang and Gao made some modifications to them.

First, a layer of medication is sandwiched between an individual microsphere and its parylene coat. Then, to protect the microrobots from the harsh environment of the stomach, they are enveloped in microcapsules made of paraffin wax.

Laser-guided delivery

At this stage, the spheres are capable of carrying drugs, but still lack the crucial ability to deliver them to a desired location. For that, Wang and Gao use photoacoustic computed tomography (PACT), a technique developed by Wang that uses pulses of infrared laser light.

The infrared laser light diffuses through tissues and is absorbed by oxygen-carrying hemoglobin molecules in red blood cells, causing the molecules to vibrate ultrasonically. Those ultrasonic vibrations are picked up by sensors pressed against the skin. The data from those sensors is used to create images of the internal structures of the body.
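
A common way to turn such time-of-flight data into an image is delay-and-sum beamforming: read each sensor's trace at the acoustic travel time from a candidate pixel and sum, so a real absorber produces a coherent peak. The sketch below is the generic textbook method with toy numbers, not necessarily the reconstruction used in PACT.

```python
import math

# Generic delay-and-sum sketch for photoacoustic-style imaging with toy
# data: an impulsive absorber 2 cm deep and five sensors on the skin line.

SPEED_OF_SOUND = 1540.0  # m/s, typical for soft tissue

def time_of_flight(pixel, sensor):
    """Seconds for ultrasound to travel from pixel to sensor."""
    return math.hypot(pixel[0] - sensor[0], pixel[1] - sensor[1]) / SPEED_OF_SOUND

def delay_and_sum(pixel, sensors, traces, dt):
    """Sum each trace at the sample index matching the pixel's delay."""
    total = 0.0
    for sensor, trace in zip(sensors, traces):
        i = round(time_of_flight(pixel, sensor) / dt)
        if 0 <= i < len(trace):
            total += trace[i]
    return total

dt = 1e-7                                            # 10 MHz sampling
sensors = [(x / 100.0, 0.0) for x in range(-2, 3)]   # x = -2 cm ... +2 cm
source = (0.0, 0.02)                                 # absorber at 2 cm depth
traces = []
for s in sensors:
    trace = [0.0] * 400
    trace[round(time_of_flight(source, s) / dt)] = 1.0   # recorded impulse
    traces.append(trace)

peak = delay_and_sum(source, sensors, traces, dt)        # coherent sum at the source
off = delay_and_sum((0.01, 0.01), sensors, traces, dt)   # incoherent elsewhere
print(f"focus at source: {peak:.1f}, off-target: {off:.1f}")
```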

Previously, Wang has shown that variations of PACT can be used to identify breast tumors, or even individual cancer cells. With respect to the microrobots, the technique has two jobs. The first is imaging. By using PACT, the researchers can find tumors in the digestive tract and also track the location of the microrobots, which show up strongly in the PACT images.

Microrobots activated by lasers and powered by magnesium jets could deliver medicine within the human body. Source: Caltech

Once the microrobots arrive in the vicinity of the tumor, a high-power continuous-wave near-infrared laser beam is used to activate them. Because the microrobots absorb the infrared light so strongly, they briefly heat up, melting the wax capsule surrounding them, and exposing them to digestive fluids.

At that point, the microrobots’ bubble jets activate, and the microrobots begin swarming. The jets are not steerable, so the technique is sort of a shotgun approach — the microrobots will not all hit the targeted area, but many will. When they do, they stick to the surface and begin releasing their medication payload.

“These micromotors can penetrate the mucus of the digestive tract and stay there for a long time. This improves medicine delivery,” Gao says. “But because they’re made of magnesium, they’re biocompatible and biodegradable.”

Pushing the concept

Tests in animal models show that the microrobots perform as intended, but Gao and Wang say they are planning to continue pushing the research forward.

“We demonstrated the concept that you can reach the diseased area and activate the microrobots,” Gao says. “The next step is evaluating the therapeutic effect of them.”

Gao also says he would like to develop variations of the microrobots that can operate in other parts of the body, and with different types of propulsion systems.

Wang says his goal is to improve how his PACT system interacts with the microrobots. The infrared laser light it uses has some difficulty reaching into deeper parts of the body, but he says it should be possible to develop a system that can penetrate further.

The paper describing the microrobot research, titled “A microrobotic system guided by photoacoustic tomography for targeted navigation in intestines in vivo,” appears in the July 24 issue of Science Robotics. Other co-authors include Zhiguang Wu, Lei Li, Yiran Yang (MS ’18), Yang Li, and So-Yoon Yang of Caltech; and Peng Hu of Washington University in St. Louis. Funding for the research was provided by the National Institutes of Health and Caltech’s Donna and Benjamin M. Rosen Bioengineering Center.

Editor’s note: This article republished from the California Institute of Technology.

Smart manufacturing trends analyzed in GP Bullhound report

Smart manufacturing investments. Source: GP Bullhound

Continuing improvements in software and hardware are leading to trends such as Manufacturing-as-a-Service, hyper-personalization of products on demand, and a reinvention of the capital-goods economy, according to a new study. Last month, GP Bullhound issued a report titled “Smart Manufacturing: The Rise of the Machines.”

The report provided a global, in-depth look at how smart manufacturing gained momentum between 2013 and 2018. It also drew conclusions about the potential future for manufacturing in terms of growth, investment, and the value of data. With robotics still largely serving manufacturing, engineers can get a glimpse of trends for which to prepare.

GP Bullhound reviewed the value of smart manufacturing transactions. China and Japan have led in smart manufacturing, with a market value of $28 billion, according to the technology advisory and investment firm. Europe followed with $24 billion, and the U.S. lagged at $20 billion.

The report found 1,300 venture capital transactions worldwide, worth a total of $17.4 billion. The U.S. led in investments, with American startups receiving $11.4 billion, compared with $3.9 billion in Asia and $2.1 billion in Europe. GP Bullhound also found $37.7 billion in mergers and acquisitions during the five-year period.

Venture funding in smart manufacturing by region

Sources: Pitchbook, Capital IQ, company websites, press releases, GP Bullhound

In addition, the report noted that data is growing in value, despite debates over how and whether production should be automated.

Dr. Nikolas Westphal, director at GP Bullhound, answered several questions from The Robot Report about the study’s findings:

Whether we call it “smart manufacturing,” “Industry 4.0,” or something else, the combination of machine learning, big data, the Internet of Things (IoT), and robotics is arriving, according to your report. But how ready are most companies — especially those outside the electronics and automotive verticals — for it?

Westphal: Smart manufacturing readiness is something that we discussed with several of our interview partners, including people from leading European software houses and IoT platforms.

The current state seems to be that most OEMs are substantially increasing the density of IoT devices within their equipment in order to make it “smart” and are also working on the required digital platforms. As “smart” equipment proliferates, manufacturing operators of all sizes will increasingly adopt smart manufacturing methodologies.

Annual data creation in smart manufacturing

Source: GP Bullhound

When it comes to digitization by industry, our research indeed indicates that electronics and automotive are furthest along on the journey to end-to-end digitization. In general, I would say that today, the industries with the highest scale effects are also the most automated. With the emergence of smaller, more flexible robotic equipment — such as collaborative robots, additive manufacturing, and data-driven factory design — we believe that smaller players, too, will be able to reap the rewards of smart automation.

Some of the companies featured in our report actually address this challenge for companies of all sizes. One example of this is Oden Technologies, which is featured in Section 2 of our report.

Investments in robotics and startups have slowed in the past quarter, but do you think that’s temporary and why?

Westphal: Quarterly VC funding data is notoriously hard to interpret, as it follows transaction cycles. Applying our search criteria for smart manufacturing startups, global VC funding in smart manufacturing in Q1 2019 stood at €1.02 billion ($1.14 billion U.S.) across 73 deals, versus €1.07 billion ($1.2 billion U.S.) in Q4 2018 [Source: Pitchbook]. As there is somewhat of a reporting lag, I expect the Q1 2019 figure to be gradually adjusted upward throughout the year.

Global smart manufacturing trends

Source: GP Bullhound

How might a cyclical economic recession affect spending on industrial automation and smart manufacturing?

Westphal: I believe that a recession may not necessarily have a long-term impact on investments in industrial automation specifically. While replacement cycles may slow somewhat, efficiency will become increasingly important in a recession.

The section on productivity gains from smart manufacturing cites Volvo as an example. How is Volvo’s use of robots part of a technology cluster?

Westphal: The tables and the case studies were supplied by our feature partner Accenture. On the left-hand side of both Figures 1 and 2, you can see the different relevant technologies; on the right-hand side, the different industry verticals. The percentages indicate the incremental cost savings per employee in Figure 1, as well as the projected implied additional gains in market capitalization in Figure 2.

For example, in automotive, autonomous robots and AI seem to have the biggest impact, in addition to 3D printing, blockchain, and big data. Overall, Accenture believes that the combinatory effect of these technologies will add up to incremental cost savings per employee of 13.9% for automotive.

How much is simulation software being applied to the design and implementation of robotics? How far are we from “lights-out” manufacturing? 

Westphal: This question is addressed to some extent by the feature on Brian Mathews of Bright Machines. Once the computer vision and control challenges have been addressed, lights-out manufacturing should become a reality.

Design and simulation in smart manufacturing

Source: GP Bullhound

Several robotics vendors have told us that they want to “keep humans in the loop,” so what sorts of processes are better for collaboration vs. full autonomy with “software-defined” manufacturing?

Westphal: From our interviews on the topic, it seems to me that high-volume, repetitive, but complex processes that require a high degree of accuracy are well-suited for full autonomy, while processes that require a high degree of versatility are better suited for collaboration.

In noteworthy mergers and acquisitions, why was Teradyne’s acquisition of Universal Robots included, but not the creation of OnRobot or Honeywell‘s purchase of Intelligrated? Was there a reason for the omissions?

Westphal: The Teradyne-Universal Robots deal is featured on p. 33. Honeywell/Intelligrated is part of our database but not featured in the selected landmark transactions. We selected those not only by size, but also by other criteria, such as sector fit and visibility.

The creation of OnRobot is not shown in Section 3, as we weren’t able to find publicly available data on the funding amount. OnRobot itself is featured as a notable company on p. 63 of the report.

Will trade tensions between the West and China slow the trend to cross-border consolidation?

Westphal: It seems that Chinese outbound investment is really geared towards utilizing technologies in China’s huge manufacturing sector. Especially as Europe does not seem to engage in restrictive trade policies with China (yet), I would expect this trend to continue.

Cross-border deals in smart manufacturing

Cross-border deals. Source: GP Bullhound

Since GP Bullhound is watching investments in hardware and the software stack around smart manufacturing, has it identified any strategic leaders?

Westphal: We don’t provide investment advice. A selection of companies that we find interesting can be found on p. 62 and 63 of the report.

The post Smart manufacturing trends analyzed in GP Bullhound report appeared first on The Robot Report.

Electronic skin could give robots an exceptional sense of touch


electronic skin

The National University of Singapore developed the Asynchronous Coded Electronic Skin, an artificial nervous system that could give robots an exceptional sense of touch. | Credit: National University of Singapore.

Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by researchers at the National University of Singapore (NUS).

The new electronic skin system has ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.

The innovation, achieved by Assistant Professor Benjamin Tee and his team from NUS Materials Science and Engineering, was first reported in the scientific journal Science Robotics on July 18, 2019.

Faster than the human sensory nervous system

“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or making a handshake. Without it, we will even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hopes of giving robots and prosthetic devices a better sense of touch.

Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. Like the human sensory nervous system, the ACES electronic nervous system detects signals, but unlike the nerve bundles in the human skin, it is made up of a network of sensors connected via a single electrical conductor. It also differs from existing electronic skins, which have interlinked wiring systems that can make them sensitive to damage and difficult to scale up.

Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in the NUS Electrical and Computer Engineering, NUS Institute for Health Innovation & Technology, N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”

Related: Challenges of building haptic feedback for surgical robots

ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds – the fastest ever achieved for an electronic skin technology – even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.

The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins, because they come into frequent physical contact with the environment. Unlike the current system used to interconnect sensors in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between the sensor and the conductor, making them less vulnerable to damage.
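The shared-conductor idea can be pictured with a toy model: give each sensor a unique pulse signature, let firing sensors superimpose their signatures on the single line, and have a receiver correlate the line against every known signature to work out which sensors fired. The sketch below is a simplified illustration of that decoding principle, using pseudorandom ±1 codes as hypothetical stand-ins for the ACES pulse signatures; it is not the actual NUS protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SENSORS, SIG_LEN = 8, 256

# Hypothetical stand-in for ACES pulse signatures: each sensor gets a
# unique pseudorandom +/-1 code.
signatures = rng.choice([-1.0, 1.0], size=(N_SENSORS, SIG_LEN))

def transmit(active_sensors):
    """Superimpose the signatures of all firing sensors on one shared line."""
    line = np.zeros(SIG_LEN)
    for i in active_sensors:
        line += signatures[i]
    return line

def decode(line, threshold=0.5):
    """Correlate the line against every known signature; a normalized score
    near 1 means that sensor fired, near 0 means it did not."""
    scores = signatures @ line / SIG_LEN
    return sorted(int(i) for i in np.flatnonzero(scores > threshold))

print(decode(transmit([2, 5])))  # recovers the firing sensors: [2, 5]
```

Because the pseudorandom codes are nearly orthogonal, each sensor's score stays close to 1 when it fired and close to 0 otherwise, even when several sensors transmit at once — which is why a single damaged sensor or wire does not take down the rest of the skin in this scheme.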

The ACES developed by Asst Prof Tee (left) and his team responds 1,000 times faster than the human sensory nervous system. | Credit: National University of Singapore

Smart electronic skins for robots and prosthetics

ACES has a simple wiring system and remarkable responsiveness even with increasing numbers of sensors. These key characteristics will facilitate the scale-up of intelligent electronic skins for artificial intelligence (AI) applications in robots, prosthetic devices, and other human-machine interfaces.

Related: UT Austin Patent Gives Robots Ultra-Sensitive Skin

“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.

For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team, creates an electronic skin that can self-repair, like the human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.

Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing items in warehouses. The NUS team is therefore looking to further apply the ACES platform to advanced robots and prosthetic devices in the next phase of its research.

Editor’s Note: This article was republished from the National University of Singapore.

The post Electronic skin could give robots an exceptional sense of touch appeared first on The Robot Report.

Artificial muscles based on MIT fibers could make robots more responsive

Artificial muscles from MIT achieve powerful pulling force

Artificial muscles based on powerful fiber contractions could advance robotics and prosthetics. Credit: Felice Frankel

CAMBRIDGE, Mass. — As a cucumber plant grows, it sprouts tightly coiled tendrils that seek out supports in order to pull the plant upward. This ensures the plant receives as much sunlight exposure as possible. Now, researchers at the Massachusetts Institute of Technology have found a way to imitate this coiling-and-pulling mechanism to produce contracting fibers that could be used as artificial muscles for robots, prosthetic limbs, or other mechanical and biomedical applications.

While many different approaches have been used for creating artificial muscles, including hydraulic systems, servo motors, shape-memory metals, and polymers that respond to stimuli, they all have limitations, including high weight or slow response times. The new fiber-based system, by contrast, is extremely lightweight and can respond very quickly, the researchers say. The findings are being reported today in the journal Science.

The new fibers were developed by MIT postdoc Mehmet Kanik and graduate student Sirma Örgüç, working with professors Polina Anikeeva, Yoel Fink, Anantha Chandrakasan, and C. Cem Taşan. The team also included MIT graduate student Georgios Varnavides, postdoc Jinwoo Kim, and undergraduate students Thomas Benavides, Dani Gonzalez, and Timothy Akintilo. They have used a fiber-drawing technique to combine two dissimilar polymers into a single strand of fiber.

artificial muscle fiber at MIT

Credit: Courtesy of the researchers, MIT

The key to the process is mating together two materials that have very different thermal expansion coefficients — meaning they expand at different rates when heated. This is the same principle used in many thermostats, for example, which use a bimetallic strip to measure temperature. As the joined material heats up, the side that wants to expand faster is held back by the other material. As a result, the bonded material curls up, bending toward the side that is expanding more slowly.
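The bending of such a two-material strip can be estimated with the classic Timoshenko bimetal formula, which relates the curvature to the expansion mismatch, the temperature change, and the layer geometry. The sketch below uses made-up numbers for illustration; it does not reflect the actual properties of the MIT polymers.

```python
def bimorph_curvature(d_alpha, d_temp, h, m=1.0, n=1.0):
    """Timoshenko curvature (1/m) of a bonded two-layer strip.

    d_alpha: difference in thermal expansion coefficients (1/K)
    d_temp:  temperature change (K)
    h:       total strip thickness (m)
    m:       thickness ratio t1/t2 of the two layers
    n:       elastic modulus ratio E1/E2 of the two layers
    """
    num = 6.0 * d_alpha * d_temp * (1.0 + m) ** 2
    den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    return num / den

# Illustrative (made-up) numbers: a 100-micron strip with a large expansion
# mismatch of 1e-4 per kelvin, heated by just 1 K.
kappa = bimorph_curvature(d_alpha=1e-4, d_temp=1.0, h=100e-6)
print(kappa)  # curvature in 1/m
```

With equal layer thicknesses and moduli (m = n = 1), the formula reduces to κ = 3·Δα·ΔT / (2h): a thinner fiber or a larger expansion mismatch curls more for the same temperature change, which is consistent with the team's observation that a 1-degree change can be enough to contract the fiber.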

Using two different polymers bonded together, a very stretchable cyclic copolymer elastomer and a much stiffer thermoplastic polyethylene, Kanik, Örgüç and colleagues produced a fiber that, when stretched out to several times its original length, naturally forms itself into a tight coil, very similar to the tendrils that cucumbers produce.

Artificial muscles surprise

But what happened next actually came as a surprise when the researchers first experienced it. “There was a lot of serendipity in this,” Anikeeva recalled.

As soon as Kanik picked up the coiled fiber for the first time, the warmth of his hand alone caused the fiber to curl up more tightly. Following up on that observation, he found that even a small increase in temperature could make the coil tighten up, producing a surprisingly strong pulling force. Then, as soon as the temperature went back down, the fiber returned to its original length.

In later testing, the team showed that this process of contracting and expanding could be repeated 10,000 times “and it was still going strong,” Anikeeva said.

One of the reasons for that longevity, she said, is that “everything is operating under very moderate conditions,” including low activation temperatures. Just a 1-degree Celsius increase can be enough to start the fiber contraction.

The fibers can span a wide range of sizes, from a few micrometers (millionths of a meter) to a few millimeters (thousandths of a meter) in width, and can easily be manufactured in batches up to hundreds of meters long. Tests have shown that a single fiber is capable of lifting loads of up to 650 times its own weight. For these experiments on individual fibers, Örgüç and Kanik have developed dedicated, miniaturized testing setups.

artificial muscle fiber test

Credit: Courtesy of the researchers, MIT

The degree of tightening that occurs when the fiber is heated can be “programmed” by determining how much of an initial stretch to give the fiber. This allows the material to be tuned to exactly the amount of force needed and the amount of temperature change needed to trigger that force.

The fibers are made using a fiber-drawing system, which makes it possible to incorporate other components into the fiber itself. Fiber drawing is done by creating an oversized version of the material, called a preform, which is then heated to a specific temperature at which the material becomes viscous. It can then be pulled, much like pulling taffy, to create a fiber that retains its internal structure but is a small fraction of the width of the preform.
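The geometry of fiber drawing follows from volume conservation: as the preform's cross-section shrinks, its length stretches by the square of the draw-down ratio. A quick sketch with hypothetical dimensions (not the MIT team's actual preform):

```python
# Volume conservation during fiber drawing: cross-sectional area times length
# stays constant, so length scales with the square of the width reduction.

def drawn_length(preform_length_m, preform_width_m, fiber_width_m):
    ratio = preform_width_m / fiber_width_m
    return preform_length_m * ratio ** 2

# A hypothetical 0.2 m preform, 25 mm wide, drawn down to a 250-micron fiber:
print(drawn_length(0.2, 25e-3, 250e-6))  # 2000.0 meters of fiber
```

This squaring effect is why a modest preform can plausibly yield the hundreds of meters of fiber per batch mentioned above.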

For testing purposes, the researchers coated the fibers with meshes of conductive nanowires. These meshes can be used as sensors to reveal the exact tension experienced or exerted by the fiber. In the future, these fibers could also include heating elements such as optical fibers or electrodes, providing a way of heating it internally without having to rely on any outside heat source to activate the contraction of the “muscle.”

Potential applications

Such artificial muscle fibers could find uses as actuators in robotic arms, legs, or grippers, and in prosthetic limbs, where their light weight and fast response times could provide a significant advantage.

Some prosthetic limbs today can weigh as much as 30 pounds, with much of the weight coming from actuators, which are often pneumatic or hydraulic; lighter-weight actuators could thus make life much easier for those who use prosthetics.

Credit: Courtesy of the researchers, MIT

“Such fibers might also find uses in tiny biomedical devices, such as a medical robot that works by going into an artery and then being activated,” Anikeeva said. “We have activation times on the order of tens of milliseconds to seconds,” depending on the dimensions.

To provide greater strength for lifting heavier loads, the fibers can be bundled together, much as muscle fibers are bundled in the body. The team successfully tested bundles of 100 fibers.

Through the fiber-drawing process, sensors could also be incorporated in the fibers to provide feedback on conditions they encounter, such as in a prosthetic limb. Örgüç said bundled muscle fibers with a closed-loop feedback mechanism could find applications in robotic systems where automated and precise control are required.

Kanik said that the possibilities for materials of this type are virtually limitless, because almost any combination of two materials with different thermal expansion rates could work, leaving a vast realm of possible combinations to explore. He added that this new finding was like opening a new window, only to see “a bunch of other windows” waiting to be opened.

“The strength of this work is coming from its simplicity,” he said.

The work was supported by the National Institute of Neurological Disorders and Stroke and the National Science Foundation.

Editor’s note: This article was republished with permission from MIT News.

The post Artificial muscles based on MIT fibers could make robots more responsive appeared first on The Robot Report.