IDS NXT malibu marks a new class of intelligent industrial cameras that act as edge devices and generate AI overlays in live video streams. For the new camera series, IDS Imaging Development Systems has collaborated with Ambarella, a leading developer of visual AI products, making consumer technology available for demanding industrial-grade applications. The camera features Ambarella's CVflow® AI vision system on chip (SoC) and takes full advantage of the SoC's advanced image processing and on-camera AI capabilities. Consequently, image analysis can be performed at high speed (>25 fps) and displayed as live overlays in compressed video streams delivered to end devices via the RTSP protocol.
Thanks to the SoC’s integrated image signal processor (ISP), the information captured by the light-sensitive onsemi AR0521 image sensor is processed directly on the camera and accelerated by its integrated hardware. The camera also offers helpful automatic features, such as brightness, noise and colour correction, which significantly improve image quality.
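To give a feel for how an end device might consume such a stream, here is a minimal Python sketch that pulls and displays an RTSP feed using OpenCV. The stream URL is a placeholder – the actual address, port and credentials depend on the camera's configuration and are not taken from IDS documentation.

```python
# Minimal sketch: pulling and displaying an RTSP stream (with its baked-in AI
# overlay) from an edge camera using OpenCV. The URL is a placeholder; the
# actual address, port and credentials depend on the camera's configuration.
import cv2

RTSP_URL = "rtsp://<camera-ip>/stream"  # hypothetical stream address

cap = cv2.VideoCapture(RTSP_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open RTSP stream")

while True:
    ok, frame = cap.read()   # each frame already carries the AI overlay
    if not ok:
        break                # stream ended or connection dropped
    cv2.imshow("IDS NXT malibu stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```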
"With IDS NXT malibu, we have developed an industrial camera that can analyse images in real time and incorporate results directly into video streams," explained Kai Hartmann, Product Innovation Manager at IDS. "The combination of on-camera AI with compression and streaming is a novelty in the industrial setting, opening up new application scenarios for intelligent image processing."
These on-camera capabilities were made possible through close collaboration between IDS and Ambarella, leveraging the companies' strengths in industrial camera and consumer technology. "We are proud to work with IDS, a leading company in industrial image processing," said Jerome Gigot, senior director of marketing at Ambarella. "The IDS NXT malibu represents a new class of industrial-grade edge AI cameras, achieving fast inference times and high image quality via our CVflow AI vision SoC."
IDS NXT malibu has entered series production. The camera is part of the IDS NXT all-in-one AI system, whose optimally coordinated components – from the camera to the AI vision studio – accompany the entire workflow, from image acquisition and labelling through to training a neural network and executing it on the IDS NXT camera series.
I am passionate about technology and robotics, and here on my own blog I am always taking on new tasks. But I have hardly ever worked with image processing. However, a colleague's LEGO® MINDSTORMS® robot, which uses several different sensors to recognize the rock, paper or scissors gestures of a hand, gave me an idea: "The robot should be able to 'see'." Until now, the respective gesture had to be made at a very specific point in front of the robot in order to be reliably recognized. Several sensors were needed for this, which made the system inflexible and dampened the joy of playing. Can image processing solve this task more "elegantly"?
Rock-Paper-Scissors with Robot Inventor by the Seshan Brothers – the robot that inspired this project.
From the idea to implementation
In my search for a suitable camera, I came across IDS NXT – a complete system for the use of intelligent image processing. It fulfilled all my requirements and, thanks to artificial intelligence, offered much more besides pure gesture recognition. My interest was piqued, especially because the evaluation of the images and the communication of the results take place directly on, or through, the camera – without an additional PC! In addition, the IDS NXT Experience Kit came with all the components needed to get started on the application immediately – without any prior knowledge of AI.
I took the idea further and began to develop a robot that would eventually play "Rock, Paper, Scissors" following the classic procedure: the (human) player is asked to perform one of the familiar gestures (rock, paper or scissors) in front of the camera. By this point, the virtual opponent has already chosen its gesture at random. The move is evaluated in real time and the winner is displayed.
The first step: Gesture recognition by means of image processing
But some intermediate steps were necessary to get there. I began by implementing gesture recognition using image processing – new territory for me as a robotics fan. However, with the help of IDS Lighthouse – a cloud-based AI vision studio – this was easier to realize than expected. Here, ideas evolve into complete applications: neural networks are trained with application images containing the necessary product knowledge – in this case the individual gestures from different perspectives – and packaged into a suitable application workflow.
The training process was super easy: after taking several hundred pictures of my hands making rock, paper or scissors gestures from different angles and against different backgrounds, I simply used IDS Lighthouse's step-by-step wizard. The first trained AI was able to reliably recognize the gestures right away. It works for both left- and right-handers with a recognition rate of approx. 95%, returning probabilities for the labels "Rock", "Paper", "Scissor" and "Nothing". A satisfactory result. But what happens with the data obtained?
Further processing
Further processing of the recognized gestures is handled by a specially created vision app. For this, the captured image of the respective gesture – after evaluation by the AI – is passed on to the app. The app "knows" the rules of the game and can thus decide which gesture beats which, and then determines the winner. In the first stage of development, the app will also simulate the opponent. All of this is currently in the making and will be implemented in the next step on the way to a "Rock, Paper, Scissors"-playing robot.
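To illustrate what such decision logic could look like, here is a minimal Python sketch that consumes the kind of label probabilities described above and plays a single round against a simulated opponent. The function names, confidence threshold and random opponent are illustrative assumptions, not the actual IDS NXT vision-app API.

```python
# Minimal sketch of the game logic such a vision app could implement.
# The probability dictionary mirrors the labels mentioned above; the function
# names, threshold and simulated opponent are illustrative assumptions.
import random

BEATS = {"Rock": "Scissor", "Scissor": "Paper", "Paper": "Rock"}

def pick_gesture(probabilities, threshold=0.8):
    """Return the most likely label, or None if the AI is unsure."""
    label, confidence = max(probabilities.items(), key=lambda kv: kv[1])
    if label == "Nothing" or confidence < threshold:
        return None
    return label

def play_round(probabilities):
    player = pick_gesture(probabilities)
    if player is None:
        return "No valid gesture detected"
    opponent = random.choice(list(BEATS))        # simulated opponent
    if player == opponent:
        return f"Draw: both played {player}"
    winner = "Player" if BEATS[player] == opponent else "Opponent"
    return f"Player: {player}, Opponent: {opponent} -> {winner} wins"

# Example with the kind of result described above:
print(play_round({"Rock": 0.95, "Paper": 0.03, "Scissor": 0.01, "Nothing": 0.01}))
```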
From play to everyday use
At first, the project is more of a gimmick. But what could come out of it? A gambling machine? Or maybe even an AI-based sign language translator?
Qviro, one of the leading robotics platforms, introduces a groundbreaking marketplace, offering unparalleled transparency and choice. Users can effortlessly compare the full robotics market and access a vast selection of 211 cobots.
The platform ensures transparent pricing, allowing buyers access to all cobot prices on Qviro. For added assistance, it provides an average cobot price of €27,158. Additionally, Qviro includes 400+ user reviews for informed decisions.
In the cobot category, Universal Robots leads with a 4.6 rating from over 41 user reviews. Their products excel in ease of use and integration and are favored by engineers and enthusiasts.
For budget-conscious buyers, Elephant Robotics and Wlkata offer educational robots starting at $599. They provide cost-effective solutions for educational and hobbyist projects. Find Elephant Robotics' products at Elephant Robotics Products and Wlkata's at Wlkata Products.
Sven De Donder, Co-CEO of Qviro, said, "Our user base in Europe and North America is growing exponentially due to unmatched transparency."
Qviro transforms the robotics buying experience, offering an all-in-one solution for enthusiasts and professionals. With diverse options, transparent pricing, and a supportive user community, Qviro meets all your robotics needs.
About Qviro:
Qviro is a Belgium-based startup that is revolutionising the procurement process of industrial technology such as robots and machines through digitization. The company’s review platform, Qviro.com, provides factories and engineers with valuable insights and customer feedback to make confident purchasing decisions. At the same time, it offers vendors market intelligence and data to help them better understand their potential customers. As a SaaS platform, Qviro is dedicated to providing exceptional customer experiences and innovative solutions that drive growth and progress in the industry. To learn more about Qviro, visit www.Qviro.com.
After a series of successful Kickstarter campaigns, Geek Club and CircuitMess launch their most ambitious project yet – a NASA-approved, AI-powered scale-model replica of the Perseverance Space Rover
Zagreb, Croatia – October 31st, 2023. – Today, Geek Club and CircuitMess announced their Kickstarter space exploration campaign designed to teach children eleven and up about engineering, AI, and coding by assembling the iconic NASA Perseverance Space Rover, as well as a series of other NASA-inspired space vehicles.
This new space-themed line of DIY educational products was born out of both companies' shared vision to aim for the stars and to take their fans with them. The Kickstarter campaign starts today, October 31st, and will last for 35 days.
The collaboration was a logical union of the two companies. Both companies create educational STEM DIY kits that are targeted towards kids and adults. Both share the same mission: To make learning STEM skills easy and fun.
“For decades, the team and I have been crafting gadgets for geeks always inspired by space exploration,” says Nicolas Deladerrière, co-founder of Geek Club. “Inspired by Mars exploration, we’ve studied thousands of official documents and blueprints to craft an authentic Mars exploration experience. The product comes alive thanks to microchips, electromotors, and artificial intelligence. Imagine simulating your own Mars mission right from your desk!”
Geek Club is an American company that specializes in designing and producing DIY robotics kits that educate their users on soldering and electronics. They focus primarily on space exploration and robotics, all to make learning engineering skills easy and fun for kids, adults, and everyone in between.
“We have successfully delivered seven Kickstarter campaigns, raised more than 2.5 million dollars, and made hundreds of thousands of geeks all around the world extremely happy,” says Albert Gajšak, CEO of CircuitMess. “In a universe where space and technology are constantly growing, we’re here to ensure you’re never left behind.”
The new product line consists of five unique space-themed products:
1. The Perseverance Space Rover Kit
This kit is designed to be an educational journey into programming, electronics, robotics, and AI. The model comes with four electromotors, six wheels, a control system with a dual-core Espressif ESP32 processor, Wi-Fi and Bluetooth connectivity, a sample collection arm based on the real thing with two servo motors, a Wi-Fi-connected remote controller, and support for programming in Python or via a Scratch-inspired drag-and-drop visual coding environment (a short illustrative code sketch follows at the end of this section).
Alongside the Perseverance Space Rover, you’ll be able to get more iconic space vehicles:
2. The Voyager: A DIY kit made as a tribute to NASA’s longest-lasting mission, which has been beaming back data for an incredible 45 years and counting.
3. Juno: A solar-powered DIY kit celebrating the mission that gave us the most detailed and breathtaking images of Jupiter.
4. Discovery: A DIY kit honoring the legendary space shuttle with 39 successful orbital flights under its belt.
5. The Artemis Watch: A sleek, space-themed wrist gadget inspired by NASA’s upcoming Artemis space suit design. The watch is a programmable device equipped with an LCD display, Bluetooth, and a gyroscope.
The Perseverance Educational Space Rover Kit is available for pre-order now on Kickstarter, starting at $149.
No previous experience or knowledge is needed for assembling your very own space rover. The kit is designed for anyone aged 11+ and comes with detailed video instructions.
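For readers curious what programming an ESP32-based kit like this in Python might look like in practice, here is a hypothetical MicroPython-style sketch that spins a single drive motor. The pin assignments and the simple PWM-plus-direction-pin motor driver are assumptions for illustration only, not the kit's documented API.

```python
# Hypothetical MicroPython-style sketch for an ESP32-based rover kit: spin one
# drive motor forward for two seconds. Pin numbers and the PWM-plus-direction
# driver layout are illustrative assumptions, not the kit's documented API.
from machine import Pin, PWM
import time

direction = Pin(12, Pin.OUT)        # assumed direction pin for one motor
speed = PWM(Pin(14), freq=1000)     # assumed PWM pin controlling motor speed

def drive_forward(duty_percent, seconds):
    """Run the motor forward at the given duty cycle, then stop."""
    direction.value(1)                               # 1 = forward (assumed)
    speed.duty_u16(int(duty_percent / 100 * 65535))  # 16-bit duty cycle
    time.sleep(seconds)
    speed.duty_u16(0)                                # stop the motor

drive_forward(50, 2)  # half speed for two seconds
```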
Bochum, October 16, 2023 – United Robotics Group (URG) and the Fraunhofer Institute for Manufacturing Engineering and Automation (IPA) have signed a licensing agreement as part of their newly concluded technology partnership. The agreement covers the distribution and further development of the KEVIN® laboratory robot, which will be manufactured and distributed by URG in the future. With this strategic step, URG is expanding its presence in the life science sector. In addition, both partners benefit from valuable synergies for the future-oriented automation of laboratories – the research and innovation expertise of Fraunhofer IPA optimally complements the robotics expertise of URG.

The laboratory robot KEVIN® was developed by the Department of Laboratory Automation and Bioproduction Technology at Fraunhofer IPA and brought to the prototype stage with initial test customers worldwide. Under the umbrella of URG, KEVIN® will now go into series production. For this purpose, the company is acquiring a corresponding licence for the use and further development of the robot's hardware and software.
From left to right: Andreas Traube, Head of the Department of Laboratory Automation and Bioproduction Technology; Prof. Thomas Bauernhansl, Director of Fraunhofer IPA; Thomas Linkenheil, Co-CEO of URG; Sarah Ostertag, UX & Industrial Design Lead and Product Management; and Tobias Brode, Head of Business Development Lab Automation. Source: Fraunhofer IPA / Photo: Rainer Bez
"We are pleased to be able to create new, promising perspectives for laboratory automation with our robotics solutions as part of the collaboration. This strengthens our presence in the life science sector – and enables us to effectively address societal challenges such as the shortage of skilled labour and demographic change," explains Thomas Linkenheil, Co-CEO of URG.
Mobile laboratory robot
KEVIN® is an autonomous, mobile laboratory robot. It automates processes and can be flexibly and intuitively integrated into laboratory infrastructures. The robot takes over repetitive routine tasks. For example, it transports microtitre plates and racks in SBS format, such as samples or consumables for refilling. It can also supply systems with pipette tips. Using KEVIN® around the clock increases efficiency in the laboratory. Given the shortage of skilled workers, it is particularly important to reduce the workload of staff, allowing them to focus on value-added activities.
"With the new agreement, we have given a decisive boost to the further development and commercialisation of KEVIN®. Automation plays an important role in the life science sector. It enables laboratories to respond flexibly to different requirements. This makes it all the more important to jointly develop suitable solutions for this sector," says Thomas Bauernhansl, Director of Fraunhofer IPA.
Increase in personnel for URG
In the course of the technology partnership between Fraunhofer IPA and URG and the transfer of the licensing rights to KEVIN®, there will also be personnel changes. Sarah Ostertag and Tobias Brode will join the United Robotics Group: Sarah Ostertag as UX & Industrial Design Lead and Product Management, and Tobias Brode as Head of Business Development Lab Automation. Both will accompany KEVIN® from the initial idea to the market-ready series product and thus strengthen the company's expertise in the long term.
Soft robotics represents a groundbreaking advancement in the field, standing apart from the rigid structures people usually associate with traditional robotic systems. Learn more about recent advances in this field and its many benefits.
The Era of Soft Robots
Nature and biology heavily influence soft robots, giving them the flexibility and ability to adapt to their surroundings. For example, some commercially available soft robotic designs mimic fish, octopuses and worms.
Innovative materials such as shape-memory alloys, dielectric elastomers and liquid crystal elastomers are critical to soft robotics. These materials change their properties in response to various stimuli. Grippers on soft robots, made of high-tech elastomers, mold to the target object’s shape. This flexibility ensures a gentler and more adaptable grip than rigid robots, making them ideal for tasks like fruit picking.
Soft robots also use self-healing materials made from shape-memory alloys. These alloys allow the robots to repair themselves after damage, increasing their operational life span and reducing maintenance needs.
As technology progresses, scientists outfit soft robots with sensory systems, enhancing their ability to understand their surroundings. For example, soft pressure sensors can tell a robot if it’s gripping too hard. Some researchers are even developing soft robots capable of working in swarms, emulating the behavior of fish, bees and birds.
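As a sketch of the kind of feedback such sensing enables, the following Python example shows a simple grip-pressure control loop that eases off when the robot grips too hard. The sensor and actuator functions are hypothetical placeholders, since the real interfaces depend entirely on the hardware in use.

```python
# Sketch of a grip-pressure feedback loop enabled by a soft pressure sensor:
# ease off when gripping too hard, squeeze a little more when too loose.
# read_pressure() and set_actuator_pressure() are hypothetical placeholders
# for whatever sensor and pneumatic interface a real soft gripper exposes.
import time

TARGET_KPA = 12.0     # assumed comfortable contact pressure
TOLERANCE_KPA = 1.0   # acceptable band around the target
STEP_KPA = 0.5        # adjustment per control cycle

def read_pressure():
    """Placeholder: return the measured contact pressure in kPa."""
    raise NotImplementedError("replace with the real sensor driver")

def set_actuator_pressure(kpa):
    """Placeholder: command the pneumatic actuator to the given pressure."""
    raise NotImplementedError("replace with the real actuator driver")

def grip_loop(initial_kpa=5.0):
    command = initial_kpa
    while True:
        measured = read_pressure()
        if measured > TARGET_KPA + TOLERANCE_KPA:    # gripping too hard
            command -= STEP_KPA
        elif measured < TARGET_KPA - TOLERANCE_KPA:  # grip too loose
            command += STEP_KPA
        set_actuator_pressure(max(command, 0.0))
        time.sleep(0.05)                             # ~20 Hz control loop
```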
3D printing, a form of advanced manufacturing, has revolutionized how scientists design and produce intricate soft robotic parts, driving innovation and accessibility in this sector. Some robots incorporate the strengths of both rigid and soft systems, resulting in hybrids that offer improved strength, precision and flexibility. Instead of traditional motors, there is also a growing trend towards fluidic actuation, in which robots use liquids or air for movement, making their motion more natural.
Soft Robotics in Medicine
Robotics is revolutionizing various aspects of modern medicine. In rehabilitation and physiotherapy, soft robotic exosuits or exoskeletons support patients recovering from strokes, spinal cord injuries or surgeries. These devices gently guide and assist patients, helping them regain motor functions, relearn movements and restore strength.
In assistive medical devices, soft wearable robots are emerging to help those with mobility issues. The Wyss Institute at Harvard University developed a soft, wearable robotic glove that assists individuals with hand disabilities in performing day-to-day activities. This glove, made from soft elastomers, can assist in gripping objects, potentially improving rehabilitation outcomes.
Scientists at the City University of Hong Kong developed a soft robot capable of maneuvering inside the stomach and intestine. The robot can change shape and size, facilitating better imaging and allowing localized drug delivery or biopsies.
A collaboration between Boston Children’s Hospital and Harvard University resulted in a soft robotic sleeve that surgeons can place around the heart. This device helps the heart pump more efficiently in patients with heart failure, providing a potential alternative to organ transplants.
In diagnostics, soft robots simplify procedures like endoscopy, making them less invasive and more patient-friendly. Patients can now swallow endoscopy capsules equipped with a camera and a tissue collection mechanism to get the same results traditionally obtained by putting patients under general anesthesia.
Research teams at institutes like the Sant’Anna School of Advanced Studies in Italy have been working on developing soft robotic arms that can assist surgeons. Due to their soft and pliant design, these arms can navigate the body with minimal risk of damaging tissues or organs.
Soft Robotics in Marine Conservation
Equipped with sensors, soft robots can monitor water quality, track marine species and evaluate the health of habitats over prolonged periods. Their non-intrusive nature and versatility enable them to probe areas inaccessible to traditional robots. MIT's Computer Science and Artificial Intelligence Laboratory developed a soft robotic fish named "SoFi" that can swim naturally in the ocean, recording close-up videos of marine life and providing insights without alarming or disturbing the aquatic life.
Soft robots also offer potential for marine clean-up efforts, such as removing pollutants like microplastics and oil spills. The WasteShark, developed by RanMarine Technology, is an autonomous surface vessel (ASV) designed to "eat", or collect, trash in harbors and other waters close to the shore. This drone skims the water's surface, collecting waste in its path and thereby aiding in marine clean-up.
The Ocean Exploration Trust's E/V Nautilus expeditions have been using remotely operated vehicles (ROVs) to explore and map uncharted coral reefs, helping scientists understand their structures, the species they harbor and their overall health. Similar soft robots can be deployed to plant sea grass and maintain coral reefs.
ROVs like the Hercules, also from the E/V Nautilus expedition, have robotic arms that can collect geological and biological samples from the deep sea that can help scientists study ecosystems in abyssal regions, leading to new species discoveries and insights into deep-sea conservation needs.
The Challenges Ahead
Soft robotics faces challenges, but its vast potential is undeniable. A primary focus lies in developing innovative materials that combine durability, flexibility and responsiveness. While traditional actuators, like motors, aren’t suitable for soft robots, alternatives like pneumatic and hydraulic systems are on the rise, promising unparalleled autonomy.
Manufacturing these robots at scale is now more feasible due to advanced construction techniques and materials. Even as these robots retain flexibility, integrating crucial rigid components, like batteries, is becoming smoother. The scientific community aims to enhance the response times of soft actuation mechanisms to match or exceed traditional systems.
Safety remains a top priority in soft robotics, especially in applications involving humans or medical scenarios. And although researchers in the field recognize the higher initial research and production costs, they believe ongoing advancements will reduce expenses.
Guest article by Ellie Gabel. Ellie is a writer living in Raleigh, NC. She's passionate about keeping up with the latest innovations in tech and science. She also works as an associate editor for Revolutionized.
Adelaide, Australia – September 2023 – The new Ortomi Generation 4 has been released on Kickstarter: simple, comforting and interactive little friends designed to make people smile. They are a creation of Ortomi, a small Australian start-up on a mission to capture the joy and comforting presence of real pets.
An Ortomi has a simple face that changes with different expressions and moods – both randomly and interactively. They respond to gestures such as being petted, prodded and picked up, and respond differently based on their mood – such as when they are happy, sad, asleep, angry or bored. As well as now making cute beeps and boops (with a silent mode), the new Generation 4 has a larger screen, more expressions, moods and interactions, and a smooth injection-moulded shell – benefitting both aesthetics and manufacturability.
The Kickstarter launched on August 29th and met its goal of AU$23,000 in under 10 hours. It currently sits at AU$46,000 as people continue to rally behind the project and help fund subsequent stretch goals:
1. AU$50,000 Custom Expression Creator:
Users will be able to draw, share and browse custom expressions that they can teach their Ortomi.
2. AU$70,000 Silicone Cases:
Squishy, silicone, key-ring cases will be developed and produced, improving the portability and personalisation options of Ortomi, as well as adding a satisfying tactile feel.
3. AU$100,000 Interactions between Ortomi:
Wireless capability will be developed for Ortomi to interact with each other, enabling everything from cute reactions to complex relationships.
As well as being completely portable with a 20-hour battery life, Ortomi are set apart from other robots on the market by their personalisation. Like dogs or goldfish, Ortomi are a whole species, and each one is meant to be unique. They come in many different colours with different accessories, have different personalities, and are usually even given their own name by their owner.
The Ortomi 4 is set to ship in November 2023 and marks a new chapter for the young Australian company in terms of scale and impact.