U.S. Robotics Roadmap calls for white papers for revision

The U.S. National Robotics Roadmap was first published in 2009 and then revised in 2013 and 2016. Since then, government agencies, universities, and companies have used it as a reference for where robotics is going. The objective is to publish the fourth version of the roadmap by summer 2020.

The team developing the U.S. National Robotics Roadmap has put out a call to engage about 150 to 200 people from academia and industry to ensure that it is representative of the robotics community’s view of the future. The roadmap will cover manufacturing, service, medical, first-responder, and space robotics.

The revised roadmap will also include considerations related to ethics and workforce. It will cover emerging applications, the key challenges to progress, and what research and development is needed.

Join community workshops

Three one-and-a-half-day workshops will be organized for community input to the roadmap. The workshops will take place as follows:

  • Sept. 11-12 in Chicago (organized by Nancy Amato, co-director of the Parasol Lab at Texas A&M University and head of the Department of Computer Science at the University of Illinois at Urbana-Champaign)
  • Oct. 17-18 in Los Angeles (organized by Maja Mataric, Chan Soon-Shiong distinguished professor of computer science, neuroscience, and pediatrics at the University of Southern California)
  • Nov. 15-16 in Lowell, Mass. (organized by Holly Yanco, director of the NERVE Center at the University of Massachusetts Lowell)

Participation in these workshops will be by invitation only. To participate, please submit a white paper/position statement of at most 1.5 pages addressing three questions: What are the key use cases for robotics over a five-to-10-year horizon? What are the key limitations? And what R&D is needed in that time frame? The white paper can address all three aspects or focus on one of them, and it must include the following information:

  • Name, affiliation, and e-mail address
  • A position statement (1.5 pages max)

Please submit the white paper as regular text or as a PDF file. Statements that are too long will be ignored. Position papers that only focus on current research are not appropriate. A white paper should present a future vision and not merely discuss state of the art.

White papers should be submitted by end of the day Aug. 15, 2019, to roadmapping@robotics-vo.org. Late submissions may not be considered. We will evaluate submitted white papers by Aug. 18 and select people for the workshops by Aug. 19.

Roadmap revision timeline

The workshop reports will be used as the basis for a synthesis of a new roadmap. The nominal timeline is:

  • August 2019: Call for white papers
  • September – November 2019: Workshops
  • December 2019: Workshop reports finalized
  • January 2020: Synthesis meeting at UC San Diego
  • February 2020: Publish draft roadmap for community feedback
  • April 2020: Revision of roadmap based on community feedback
  • May 2020: Finalize roadmap with graphics design
  • July 2020: Publish roadmap

If you have any questions about the process, the scope, etc., please send e-mail to Henrik I Christensen at hichristensen@eng.ucsd.edu.

Henrik I Christensen spoke at the Robotics Summit & Expo in Boston.

Editor’s note: Christensen, Qualcomm Chancellor’s Chair of Robot Systems at the University of California San Diego and co-founder of Robust AI, delivered a keynote address at last month’s Robotics Summit & Expo, produced by The Robot Report.

The post U.S. Robotics Roadmap calls for white papers for revision appeared first on The Robot Report.

PHD adds gripper options, transition plate to product line

PneuConnect with GRT gripper on a UR cobot. Source: PHD

PHD Inc. this month added three products to its line of grippers and accessories for industrial automation. They are intended to help robots grip large objects, make positioning and programming easy for maximum efficiency, and facilitate machine tending. PHD’s products are designed to work with collaborative robot arms, or cobots, from Universal Robots A/S.

Fort Wayne, Ind.-based PHD said it sells grippers, linear slides, and the widest range of long-life, robust actuators in the industry. It also offers engineering software and Internet-based tools to save design time, support from factory-trained application and industry specialists, and rapid product delivery.

PHD adds jaw-travel option to GRR line

The company has added a 300 mm (11.81 in.) jaw-travel model of its Series GRR high-capacity pneumatic grippers. These parallel grippers are designed to provide high grip force, a choice of five long jaw travels, and high load capacity.

Because the Guardian grippers can withstand high impact and shock loads, they are suitable for applications such as small engine block manufacturing, automotive wheel-rim manufacturing, and foundry applications, said PHD.

Also available are the Series EGRR high-capacity electric parallel grippers, which offer many of the same benefits as the pneumatic design.

Pneu-Connect X2 with dual grippers available

PHD also announced the release of Pneu-Connect X2 kits with dual grippers. The kits can be mounted on UR cobots for more efficient automation.

The Pneu-Connect X2 includes PHD’s Freedrive feature, which interfaces with UR cobots for easy positioning and programming. The kits come in several standard gripper combinations; contact PHD for other combinations.

The Pneu-Connect® X2 includes the following features, said PHD:

  • Five popular PHD pneumatic gripper options for a wide variety of applications
  • Two grippers for maximum automation efficiency
  • Analog sensors on Series GRH grippers, providing jaw-position feedback throughout jaw travel
  • The Freedrive feature that interfaces with the UR for easy positioning and programming
  • Seamless, cost-effective, end-effector integration
  • Incorporated MAC valves and control board
  • Common jaw mounting for application specific tooling
  • Updated URCap software included for intuitive, easy setup
  • Ease of use

Download the Pneu-Connect catalog for more information.

Transition plates connect UR directly to linear actuator

PHD’s transition plate allows a Universal Robot arm to be directly attached to the new PHD Series ESU electric belt-driven linear actuator. The company said it offers a transition plate for each size of UR arm, “taking machine tending to a whole new level.”

This transition plate provides a seventh axis for UR arms with the ESU linear actuator. Source: PHD

With a cataloged stroke of up to 5500 mm (216.53 in.), users can increase the working area of a UR10 arm by as much as 10 times, said PHD.
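The multiplier depends on how "working area" is defined. As a rough planar sketch, the floor-projection gain can be estimated from the stroke and the arm's reach alone; the UR10 reach value below is its published nominal figure, and mounting offsets are ignored:

```python
import math

UR10_REACH_M = 1.3   # UR10 nominal reach (1300 mm); assumption for this sketch
STROKE_M = 5.5       # ESU cataloged stroke (5500 mm)

# Fixed base: the arm's floor projection is a disc of radius = reach.
fixed_area = math.pi * UR10_REACH_M ** 2

# On the seventh axis, that disc sweeps along the stroke:
# a (stroke x diameter) rectangle plus the two end half-discs.
swept_area = STROKE_M * (2 * UR10_REACH_M) + fixed_area

print(f"fixed-base area: {fixed_area:.1f} m^2")
print(f"swept area:      {swept_area:.1f} m^2")
print(f"gain:            {swept_area / fixed_area:.1f}x")
```

By this simple planar measure the gain is nearly 4x; larger multipliers follow from other definitions, such as comparing only the usable (rather than nominal) portion of the fixed workspace.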


Soft robots controlled by magnets, light in new research

Researchers from North Carolina State University and Elon University have developed a technique that allows them to remotely control the movement of soft robots, lock them into position for as long as needed, and later reconfigure the robots into new shapes. The technique relies on light and magnetic fields.

“We’re particularly excited about the reconfigurability,” said Joe Tracy, a professor of materials science and engineering at NC State and corresponding author of a paper on the work. “By engineering the properties of the material, we can control the soft robot’s movement remotely; we can get it to hold a given shape; we can then return the robot to its original shape or further modify its movement; and we can do this repeatedly. All of those things are valuable, in terms of this technology’s utility in biomedical or aerospace applications.”

LEDs make soft robots pliable

For this work, the researchers used soft robots made of a polymer embedded with magnetic iron microparticles. Under normal conditions, the material is relatively stiff and holds its shape.

However, researchers can heat up the material using light from a light-emitting diode (LED), which makes the polymer pliable. Once pliable, researchers demonstrated that they could control the shape of the robot remotely by applying a magnetic field. After forming the desired shape, researchers could remove the LED light, allowing the robot to resume its original stiffness — effectively locking the shape in place.

By applying the light a second time and removing the magnetic field, the researchers could get the soft robots to return to their original shapes. Or they could apply the light again and manipulate the magnetic field to move the robots or get them to assume new shapes.
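The heat-shape-cool cycle described above can be sketched as a tiny state machine. Everything here (the class, method names, and shape labels) is purely illustrative, not from the paper:

```python
class SoftRobotSim:
    """Toy model of the photothermal/magnetic reconfiguration cycle:
    the polymer reshapes only while LED heating keeps it pliable."""

    def __init__(self):
        self.pliable = False
        self.shape = "flat"

    def led(self, on):
        self.pliable = on          # LED heat softens the polymer; cooling stiffens it

    def apply_field(self, shape):
        if self.pliable:           # a stiff robot ignores the magnetic field
            self.shape = shape


def reconfigure(robot, shape):
    robot.led(True)                # soften under LED light
    robot.apply_field(shape)       # magnetically form the new shape
    robot.led(False)               # cool: the shape locks in place
    return robot.shape             # holds with no field and no light


bot = SoftRobotSim()
print(reconfigure(bot, "grabber-closed"))   # shape locked: grabber-closed
print(reconfigure(bot, "flat"))             # back to the original shape
```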

In experimental testing, the researchers demonstrated that the soft robots could be used to form “grabbers” for lifting and transporting objects. The soft robots could also be used as cantilevers, or folded into “flowers” with petals that bend in different directions.

“We are not limited to binary configurations, such as a grabber being either open or closed,” said Jessica Liu, first author of the paper and a Ph.D. student at NC State. “We can control the light to ensure that a robot will hold its shape at any point.”

Iron microparticles can be used to make soft robots move. Source: North Carolina State University

Streamlining robot design

In addition, the researchers developed a computational model that can be used to streamline the soft robot design process. The model allows them to fine-tune a robot’s shape, polymer thickness, the abundance of iron microparticles in the polymer, and the size and direction of the required magnetic field before constructing a prototype to accomplish a specific task.

“Next steps include optimizing the polymer for different applications,” Tracy said. “For example, engineering polymers that respond at different temperatures in order to meet the needs of specific applications.”

Authors and support

The paper, “Photothermally and Magnetically Controlled Reconfiguration of Polymer Composites for Soft Robotics,” appears in the journal Science Advances. In addition to Liu as first author, the paper was co-authored by Jonathan Gillen, a former undergraduate at NC State; Sumeet Mishra, a former Ph.D. student at NC State; and Benjamin Evans, an associate professor of physics at Elon University.

The work was done with support from the National Science Foundation (NSF) under grants CMMI-1663416 and CMMI-1662641. The work was also supported by the Research Triangle MRSEC, which is funded by NSF under grant DMR-1121107; and by NC State’s Analytical Instrumentation Facility and the Duke University Shared Materials Instrumentation Facility, which are supported by the State of North Carolina and NSF grant ECCS-1542015.


Wearable device could improve communication between humans, robots

An international team of scientists has developed an ultra-thin, wearable electronic device that facilitates smooth communication between humans and machines. The researchers said the new device is easy to manufacture and imperceptible when worn. It could be applied to human skin to capture various types of physical data for better health monitoring and early disease detection, or it could enable robots to perform specific tasks in response to physical cues from humans.

Wearable human-machine interfaces have had challenges — some are made from rigid electronic chips and sensors that are uncomfortable and restrict the body’s motion, while others consist of softer, more wearable elastic materials but suffer from slow response times.

While researchers have developed thin inorganic materials that wrinkle and bend, the challenge remains to develop wearable devices with multiple functions that enable smooth communication between humans and machines.

The team that wrote the paper included Kyoseung Sim, Zhoulyu Rao, Faheem Ershad, Jianming Lei, Anish Thukral, Jie Chen, and Cunjiang Yu at the University of Houston; Zhanan Zou and Jianling Xiao at the University of Colorado, Boulder; and Qing-An Huang at Southeast University in Nanjing, China.

Wearable nanomembrane reads human muscle signals

Sim and colleagues designed a nanomembrane made from indium zinc oxide, using a chemical processing approach that let them tune the material’s texture and surface properties. The resulting devices were only 3 to 4 micrometers thick and snake-shaped, properties that allow them to stretch and go unnoticed by the wearer.

When worn by humans, the devices could collect signals from muscle and use them to directly guide a robot, enabling the user to feel what the robot hand experienced. The devices maintain their function when human skin is stretched or compressed.

Soft, unnoticeable, multifunctional, electronics-based, wearable human-machine interface devices. Credit: Cunjiang Yu

The researchers also found that sensors made from this nanomembrane material could be designed to monitor UV exposure (to mitigate skin disease risk) or to detect skin temperature (to provide early medical warnings), while still functioning well under strain.

Editor’s note: This month’s print issue of The Robot Report, which is distributed with Design World, focuses on exoskeletons. It will be available soon.


Roach-inspired robot shares insect’s speed, toughness

If the sight of a skittering bug makes you squirm, you may want to look away — a new insect-sized robot created by researchers at the University of California, Berkeley, can scurry across the floor at nearly the speed of a darting cockroach. And it’s nearly as hardy as a roach is. Try to squash this robot under your foot, and more than likely, it will just keep going.

“Most of the robots at this particular small scale are very fragile. If you step on them, you pretty much destroy the robot,” said Liwei Lin, a professor of mechanical engineering at UC Berkeley and senior author of a new study that describes the robot. “We found that if we put weight on our robot, it still more or less functions.”

Small-scale robots like these could be advantageous in search-and-rescue missions, squeezing and squishing into places where dogs or humans can’t fit, or where it may be too dangerous for them to go, said Yichuan Wu, first author of the paper, who completed the work as a graduate student in mechanical engineering at UC Berkeley through the Tsinghua-Berkeley Shenzhen Institute partnership.

“For example, if an earthquake happens, it’s very hard for the big machines, or the big dogs, to find life underneath debris, so that’s why we need a small-sized robot that is agile and robust,” said Wu, who is now an assistant professor at the University of Electronic Science and Technology of China.

The study appears this week in the journal Science Robotics.

PVDF provides roach-like characteristics

The robot, which is about the size of a large postage stamp, is made of a thin sheet of a piezoelectric material called polyvinylidene fluoride, or PVDF. Piezoelectric materials are unique, in that applying electric voltage to them causes the materials to expand or contract.

The robot is built of a layered material that bends and straightens when AC voltage is applied, causing it to spring forward in a “leapfrogging” motion. Credit: UC Berkeley video and photo by Stephen McNally

The researchers coated the PVDF in a layer of an elastic polymer, which causes the entire sheet to bend, instead of to expand or contract. They then added a front leg so that, as the material bends and straightens under an electric field, the oscillations propel the device forward in a “leapfrogging” motion.

The resulting robot may be simple to look at, but it has some remarkable abilities. It can sail along the ground at a speed of 20 body lengths per second, a rate comparable to that of a roach and reported to be the fastest pace among insect-scale robots. It can zip through tubes, climb small slopes, and carry small loads, such as a peanut.

Perhaps most impressively, the robot, which weighs less than one-tenth of a gram, can withstand a weight of around 60 kg (132 lb.) — about the weight of an average human — which is approximately 1 million times the weight of the robot.
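The reported figures are easy to sanity-check. The robot mass and body length below are assumptions consistent with "less than one tenth of a gram" and "the size of a large postage stamp," not values from the study:

```python
robot_mass_g = 0.06      # assumed mass, within "less than one tenth of a gram"
load_kg = 60             # weight the robot reportedly withstands

# Load-to-weight ratio: 60 kg load over a 0.06 g robot
ratio = load_kg * 1000 / robot_mass_g
print(f"load-to-weight ratio: {ratio:,.0f}x")    # about 1,000,000x

body_len_cm = 3.0                # assumed body length (large postage stamp)
speed_cm_s = 20 * body_len_cm    # 20 body lengths per second
print(f"speed: {speed_cm_s:.0f} cm/s")
```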

“People may have experienced that, if you step on the cockroach, you may have to grind it up a little bit, otherwise the cockroach may still survive and run away,” Lin said. “Somebody stepping on our robot is applying an extraordinarily large weight, but [the robot] still works, it still functions. So, in that particular sense, it’s very similar to a cockroach.”

The robot is currently “tethered” to a thin wire that carries an electric voltage that drives the oscillations. The team is experimenting with adding a battery so the roach robot can roam independently. They are also working to add gas sensors and are improving the design of the robot so it can be steered around obstacles.

Co-authors of the paper include Justin K. Yim, Zhichun Shao, Mingjing Qi, Junwen Zhong, Zihao Luo, Ronald S. Fearing, and Robert J. Full of UC Berkeley; Xiaojun Yan of Beihang University; and Jiaming Liang, Min Zhang, and Xiaohao Wang of Tsinghua University.

This work is supported in part by the Berkeley Sensor and Actuator Center, an Industry-University Cooperation Research Center.

Editor’s note: This article republished from the University of California, Berkeley.


Acutronic Robotics fails to find funding for H-ROS for robot hardware

Acutronic Robotics today announced on its blog that it is shutting down on July 31. The company, which has offices in Switzerland and Spain, offered communication tools based on the Robot Operating System for modular robot design.

The company, which was founded in 2016 after Acutronic Link Robotics AG’s acquisition of Erle Robotics, said it had been waiting on financing. Acutronic Robotics was developing the Hardware Robot Operating System or H-ROS, a communication bus to enable robot hardware to interoperate smoothly, securely, and safely.

Components of Acutronic’s technology included the H-ROS System on Module (SoM) device for the bus, ROS2 as the “universal robot language” and application programming interface, and the Hardware Robot Information Model (HRIM) as a common ROS dialect.

Acutronic was involved in the development of the open-source ROS2 and was recently named a “Top 10 ROS-based robotics company” for 2019. The company built MARA, the first robot natively running on ROS2.

In January, Acutronic Robotics said that it had made grippers from Robotiq “seamlessly interoperable with all other ROS2-speaking robotic components, regardless of their original manufacturer.”

H-ROS was intended to make robot hardware work together more easily. Source: Acutronic Robotics

Funding challenges

HRIM was funded through the EU’s ROS-Industrial (ROSIN) project, and the U.S. Defense Advanced Research Projects Agency (DARPA) had invested in H-ROS.

In September 2017, Acutronic raised an unspecified amount of Series A funding led by the Sony Innovation Fund. More recently, however, the company had difficulty finding venture capital.

“We continue to believe that our robot modularity technology and vision are strategically relevant, both product- and positioning-wise,” stated Victor Mayoral, CEO of Acutronic Robotics. “However, we probably hit the market too early and fell short of resources.”

According to Acutronic’s blog post, the company received acquisition proposals but was unable to agree to any of them.

The global robot operating system market will experience a compound annual growth rate of 8.8% between 2018 and 2026, predicts Transparency Market Research. However, that forecast includes proprietary industrial software and customized robots.
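For context, an 8.8% CAGR compounds to roughly a doubling over the 2018-2026 window. A quick check, using only the figures quoted above:

```python
cagr = 0.088                 # compound annual growth rate from the forecast
years = 2026 - 2018          # 8 years of compounding

# Total market multiple over the forecast period
multiple = (1 + cagr) ** years
print(f"{multiple:.2f}x market size over {years} years")
```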

Other ROS-related news today included Freedom Robotics’ seed funding and Fetch Robotics’ Series C. As The Robot Report previously reported, AWS RoboMaker works with ROS Industrial, and Microsoft recently announced support for ROS in Windows 10.

Uncertain future for Acutronic team

Mayoral didn’t specify what would happen to Acutronic Robotics’ approximately 30 staffers or its intellectual property, but he tried to end on an optimistic note.

“We are absolutely convinced that ROS is a key blueprint for the future of robotics,” Mayoral said. “The ROS robotics community has been a constant inspiration for all of us over these past years, and I’m sure that with the new ROS 2, many more companies will be inspired in the same way. Our team members are excited about their next professional steps, and I’m sure many of us will stay very close to the ROS community.”

The Acutronic Robotics team. Source: Acutronic


Velodyne Lidar acquires Mapper.ai for advanced driver assistance systems

SAN JOSE, Calif. — Velodyne Lidar Inc. today announced that it has acquired Mapper.ai’s mapping and localization software, as well as its intellectual property assets. Velodyne said that Mapper’s technology will enable it to accelerate development of Vella, the software for its directional-view Velarray lidar sensor.

The Velarray is the first solid-state Velodyne lidar sensor that is embeddable and fits behind a windshield, said Velodyne, which described it as “an integral component for superior, more effective advanced driver assistance systems” (ADAS).

The company provides lidar sensors for autonomous vehicles and driver assistance. David Hall, Velodyne’s founder and CEO, invented real-time surround-view lidar systems in 2005 as part of Velodyne Acoustics. His invention revolutionized perception and autonomy for automotive, new mobility, mapping, robotics, and security applications.

Velodyne said its high-performance product line includes a broad range of sensors, including the cost-effective Puck, the versatile Ultra Puck, and the autonomy-advancing Alpha Puck.

Mapper.ai staffers to join Velodyne

Mapper’s entire leadership and engineering teams will join Velodyne, bolstering the company’s large and growing software-development group. The talent from Mapper.ai will augment the current team of engineers working on Vella software, accelerating Velodyne’s production of ADAS solutions.

Velodyne claimed its technology will allow customers to unlock advanced capabilities for ADAS features, including pedestrian and bicycle avoidance, Lane Keep Assistance (LKA), Automatic Emergency Braking (AEB), Adaptive Cruise Control (ACC), and Traffic Jam Assist (TJA).

“By adding Vella software to our broad portfolio of lidar technology, Velodyne is poised to revolutionize ADAS performance and safety,” stated Anand Gopalan, chief technology officer at Velodyne. “Expanding our team to develop Vella is a giant step towards achieving our goal of mass-producing an ADAS solution that dramatically improves roadway safety.”

“Mapper technology gives us access to some key algorithmic elements and accelerates our development timeline,” Gopalan added. “Together, our sensors and software will allow powerful lidar-based safety solutions to be available on every vehicle.”

Mapper.ai to contribute to Velodyne software

Mapper.ai developers will work on the Vella software for the Velarray sensor. Source: Velodyne Lidar

“Velodyne has both created the market for high-fidelity automotive lidar and established itself as the leader. We have been Velodyne customers for years and have already integrated their lidar sensors into easily deployable solutions for scalable high-definition mapping,” said Dr. Nikhil Naikal, founder and CEO of Mapper, who is joining Velodyne. “We are excited to use our technology to speed up Velodyne’s lidar-centric software approach to ADAS.”

In addition to ADAS, Velodyne said it will incorporate Mapper technology into lidar-centric solutions for other emerging applications, including autonomous vehicles, last-mile delivery services, security, smart cities, smart agriculture, robotics, and unmanned aerial vehicles.


LUKE prosthetic arm has sense of touch, can move in response to thoughts

Keven Walgamott had a good “feeling” about picking up the egg without crushing it. What seems simple for nearly everyone else can be a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing the prototype of LUKE, a high-tech prosthetic arm with fingers that not only move, but move in response to his thoughts. And thanks to a biomedical engineering team at the University of Utah, he “felt” the egg well enough that his brain could tell the prosthetic hand not to squeeze too hard.

That’s because the team, led by University of Utah biomedical engineering associate professor Gregory Clark, has developed a way for the “LUKE Arm” (named after the robotic hand that Luke Skywalker got in The Empire Strikes Back) to mimic the way a human hand feels objects by sending the appropriate signals to the brain.

Their findings were published in a new paper co-authored by University of Utah biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark, and other colleagues in the latest edition of the journal Science Robotics.

Sending the right messages

“We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits,” George says. “We’re making more biologically realistic signals.”

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up, and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

“It almost put me to tears,” Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. “It was really amazing. I never thought I would be able to feel in that hand again.”

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the University of Utah, was able to pluck grapes without crushing them, pick up an egg without cracking it, and hold his wife’s hand with a sensation in the fingers similar to that of an able-bodied person.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

How those things are accomplished is through a complex series of mathematical calculations and modeling.

Keven Walgamott wears the LUKE prosthetic arm. Credit: University of Utah Center for Neural Interfaces

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made mostly of metal motors and parts with a clear silicone “skin” over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the University of Utah team has been developing a system that allows the prosthetic arm to tap into the wearer’s nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by University of Utah biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array.

The Array is a bundle of 100 microelectrodes and wires that are implanted into the amputee’s nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.

But it also works the other way. To perform tasks such as picking up objects requires more than just the brain telling the hand to move. The prosthetic hand must also learn how to “feel” the object in order to know how much pressure to exert because you can’t figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the Array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how your brain deals with transitions in information when it first touches something. Upon first contact of an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.
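That onset-burst-then-taper pattern can be sketched as a simple exponentially adapting firing rate. The function and all parameter values here are illustrative stand-ins, not the model from the paper:

```python
import math

def contact_rate(t, peak=300.0, steady=60.0, tau=0.05):
    """Firing rate (Hz) at time t (seconds) after first contact: a burst
    of impulses that decays toward a steady level, mimicking sensory
    adaptation. Parameter values are illustrative only."""
    return steady + (peak - steady) * math.exp(-t / tau)

# Rate is highest at contact, then settles toward the steady level.
for t in (0.0, 0.05, 0.20):
    print(f"t = {t:.2f} s -> {contact_rate(t):5.1f} Hz")
```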

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” says Clark.

To achieve that, Clark’s team used mathematical calculations along with recorded impulses from a primate’s arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.

Future research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the University of Utah’s Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago’s Department of Organismal Biology and Anatomy, the Cleveland Clinic’s Department of Biomedical Engineering, and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

“This is an incredible interdisciplinary effort,” says Clark. “We could not have done this without the substantial efforts of everybody on that team.”

Editor’s note: Reposted from the University of Utah.


Neural Analytics partners with NGK Spark Plug to scale up medical robots

The Lucid Robotic System has received FDA clearance. Source: Neural Analytics

LOS ANGELES — Neural Analytics Inc., a medical robotics company developing and commercializing technologies to measure and track brain health, has announced a strategic partnership with NGK Spark Plug Co., a Japan-based company that specializes in comprehensive ceramics processing. Neural Analytics said the partnership will allow it to expand its manufacturing capabilities and global footprint.

Neural Analytics’ Lucid Robotic System (LRS) includes the Lucid M1 Transcranial Doppler Ultrasound System and NeuralBot system. The resulting autonomous robotic transcranial doppler (rTCD) platform is designed to non-invasively search, measure, and display objective brain blood-flow information in real time.

The Los Angeles-based company’s technology integrates ultrasound and robotics to empower clinicians with critical information about brain health to make clinical decisions. Through its algorithm, analytics, and autonomous robotics, Neural Analytics provides valuable information that can identify pathologies such as Patent Foramen Ovale (PFO), a form of right-to-left shunt.

Nagoya, Japan-based NGK Spark Plug claims to be the world’s leading manufacturer of spark plugs and automotive sensors, as well as a broad lineup of packaging, cutting tools, bio ceramics, and industrial ceramics. The company has more than 15,000 employees and develops products related to the environment, energy, next-generation vehicles, and the medical device and diagnostic industries.

Neural Analytics and NGK to provide high-quality parts, global access

“This strategic partnership between Neural Analytics and NGK Spark Plug is built on a shared vision for the future of global healthcare and a foundation of common values,” said Leo Petrossian, Ph.D., co-founder and CEO of Neural Analytics. “We are honored with this opportunity and look forward to learning from our new partners how they have built a great global enterprise.”

NGK Spark Plug has vast manufacturing expertise in ultra-high precision ceramics. With this partnership, both companies said they are committed to working together to build high-quality products at a reasonable cost to allow greater access to technologies like the Lucid Robotic System.

“I am very pleased with this strategic partnership with Neural Analytics,” said Toru Matsui, executive vice president of NGK Spark Plug. “This, combined with a shared vision, is an exciting opportunity for both companies. This alliance enables the acceleration of their great technology to the greater market.”

This follows Neural Analytics’ May announcement of its Series C round close, led by Alpha Edison. In total, the company has raised approximately $70 million in funding to date.

Neural Analytics said it remains “committed to advancing brain healthcare through transformative technology to empower clinicians with the critical information needed to make clinical decisions and improve patient outcomes.”

The post Neural Analytics partners with NGK Spark Plug to scale up medical robots appeared first on The Robot Report.

Sea Machines Robotics to demonstrate autonomous spill response

Sea Machines Robotics to demonstrate autonomous spill response

Source: Sea Machines Robotics

BOSTON — Sea Machines Robotics Inc. this week said it has entered into a cooperative agreement with the U.S. Department of Transportation’s Maritime Administration to demonstrate the ability of its autonomous technology in increasing the safety, response time and productivity of marine oil-spill response operations.

Sea Machines was founded in 2015 and claims to be “the leader in pioneering autonomous control and advanced perception systems for the marine industries.” The company builds software and systems to increase the safety, efficiency, and performance of ships, workboats, and commercial vessels worldwide.

The U.S. Maritime Administration (MARAD) is an agency of the U.S. Department of Transportation that promotes waterborne transportation and its integration with other segments of the transportation system.

Preparing for oil-spill exercise

To make the on-water exercises possible, Sea Machines will install its SM300 autonomous-command system aboard a MARCO skimming vessel owned by Marine Spill Response Corp. (MSRC), a not-for-profit, U.S. Coast Guard-classified oil spill removal organization (OSRO). MSRC was formed with the Marine Preservation Association to offer oil-spill response services in accordance with the Oil Pollution Act of 1990.

Sea Machines plans to train MSRC personnel to operate its system. Then, on Aug. 21, Sea Machines and MSRC will execute simulated oil-spill recovery exercises in the harbor of Portland, Maine, before an audience of government, naval, international, environmental, and industry partners.

The response skimming vessel is manufactured by Seattle-based Kvichak Marine Industries and is equipped with a MARCO filter belt skimmer to recover oil from the surface of the water. This vessel typically operates in coastal or near-shore areas. Once installed, the SM300 will give the MSRC vessel the following new capabilities:

  • Remote autonomous control from an onshore location or secondary vessel,
  • ENC-based mission planning,
  • Autonomous waypoint tracking,
  • Autonomous grid line tracking,
  • Collaborative autonomy for multi-vessel operations,
  • Wireless remote payload control to deploy onboard boom and other response equipment, and
  • Obstacle detection and collision avoidance.

Round-the-clock response

In addition, Sea Machines said the SM300 enables minimally manned and unmanned autonomous maritime operations. Such configurations allow operators to respond to spill events around the clock, even when crews are unavailable or restricted, depending on recovery conditions, the company said. They also reduce or eliminate crewmembers’ exposure to toxic fumes and other safety hazards.

“Autonomous technology has the power to not only help prevent vessel accidents that can lead to spills, but also to facilitate better preparedness and aid in safer, more efficient, and effective cleanup,” said Michael G. Johnson, CEO of Sea Machines. “We look forward to working closely with MARAD and MSRC in these industry-modernizing exercises.”

“Our No. 1 priority is the safety of our personnel at MSRC,” said John Swift, vice president at MSRC. “The ability to use autonomous technology — allowing response operations to continue in an environment where their safety may be at risk — furthers our mission of response preparedness.”

Sea Machines promises rapid ROI for multiple vessels

Sea Machines’ SM Series of products, which includes the SM300 and SM200, provides marine operators with task-driven, computer-guided vessel control, bringing advanced autonomy within reach for both small- and large-scale operations. SM products can be installed aboard existing or new-build commercial vessels, with return on investment typically seen within a year.

In addition, Sea Machines has received funding from Toyota AI Ventures.

Sea Machines is also a leading developer of advanced perception and navigation assistance technology for a range of vessel types, including container ships. The company is currently testing its perception and situational awareness technology aboard one of A.P. Moller-Maersk’s new-build ice-class container ships.

The post Sea Machines Robotics to demonstrate autonomous spill response appeared first on The Robot Report.