NSK is working to assist society by developing new service robot technology, including robotic devices for moving patients in medical settings. In October 2021, the company joined a Japanese government initiative to implement robotic technology in hospitals and help prevent the spread of Covid-19. NSK is now working to develop its robotic technologies further through dialogue with frontline medical staff.
New robots are currently in development around the world to assist humans and help solve societal issues. As part of this effort, NSK wants to create robots for use in settings where many people are moving around, including medical facilities and hospitals. The company’s smooth movement and low noise technologies are ideal for robotic applications in this field.
Among the candidates for development at the initial planning stage was an autonomous mobile robot. However, after observing the inner workings of a hospital, with its narrow corridors and high footfall rates, NSK concluded that a motorised assistance robot which could help staff during patient transfer would be a more useful contribution to workplace efficiency.
The company knew that its proposed robot could reduce the physical burden on medical staff and help facilitate work-style reform in the healthcare sector. Based on this approach, the company built a robotic prototype that helps staff to move heavy objects such as stretchers and trolleys in hospitals. As part of the Japanese government initiative, NSK is currently demonstrating the use of its motorised assistance robot at a major hospital. The end goal is eventual adoption by the healthcare sector for daily use.
NSK is focusing on essential user issues when developing the assistance robot, verifying ideas in short cycles. For example, rather than spending three years developing the robot in its entirety, NSK seeks feedback from customers every three months and implements improvements incrementally throughout the development process.
The robot developed by NSK uses a motor drive that facilitates smooth starting and acceleration, as well as deceleration and tight turns. NSK ultimately wants to create a usable robot that fits user requirements, leveraging its know-how to aid people working in frontline healthcare. Innovative projects of this type support the company’s ethos of better meeting the needs of society, while simultaneously creating opportunities for new business growth.
Guest Article by Ellie Poverly - Ellie is an online journalist specializing in robotics and science research. She is also the Managing Editor at Revolutionized Magazine.
As of 2022, roughly 80% of the ocean remains a mystery. In fact, much of it hasn’t even been seen by humans yet. More photographs have been taken of the Moon’s surface than of the ocean floor.
Exploring the deepest parts of the ocean is an incredible challenge, complicated by intense pressures and complete darkness. Here’s a look at the robots that could finally reveal the secrets of Earth’s oceans.
The Limits of Human Exploration
Over half of the tallest mountain on Earth is underwater. The volcano Mauna Kea in Hawai’i is estimated to be around a mile taller than Mt. Everest from base to peak, although most of it is submerged. This massive volcano is just one of countless secrets hiding beneath the surface of Earth’s oceans.
Other discoveries waiting underwater range from the wrecks of lost aircraft to the remains of sunken ships and even sunken cities, as well as life that looks like something from an alien world. Exploring the ocean floor could reveal groundbreaking archaeological, geological and biological discoveries. So, why haven’t humans started SCUBA diving down to the seabed?
Unfortunately, reaching the ocean floor is not easy, especially for humans. The deepest part of the ocean is the Mariana Trench, which extends 7 miles — or about 36,200 feet — deep. To put that into perspective, the world record for the deepest SCUBA dive is only 1,090 feet, set by diver Ahmed Gabr in 2014, and roughly 1,000 feet is generally considered the practical limit for human divers.
Luckily, humans have invented some amazing robots to help explore the ocean floor remotely.
The Robotic SCUBA Divers Exploring the Depths
A growing number of robotic SCUBA divers travel down to the deepest reaches of the ocean. Some look like mini submarines, while others are eerily human-like. These robots help scientists study a wide range of topics and may become even more crucial in the years ahead.
For instance, climate change increases the risk of toxins entering the food chain at the surface. Could the same be happening underwater? Additionally, the strange biology of deep-sea life could help researchers learn more about how life evolved on Earth and how it might exist on other worlds.
The Woods Hole Oceanographic Institution has explored depths of 36,000 feet below sea level in the Mariana Trench using a robotic submarine called Orpheus. The robot is helping to create 3D images of the ocean floor and capture video footage of deep-sea life. Its navigation system may one day be used in robots that explore the dark oceans of the moons of Jupiter and Saturn.
Taking a completely different approach is OceanOne, a robotic SCUBA diver developed by researchers at Stanford University. OceanOne is designed to be as human-like as possible, acting as a robotic avatar for human divers. On its maiden voyage in 2016, OceanOne retrieved the first treasures ever recovered from the flagship of King Louis XIV, which was wrecked in 1664. No human had ever touched the wreck before.
NASA is also developing a deep-sea robot — the Aquanaut — which features a humanoid design. The Aquanaut has front-mounted cameras and sensors, as well as robotic arms to which engineers can attach various tools. Unlike other deep-sea robots, the Aquanaut is geared more toward underwater work than exploration. However, NASA still plans to use robots to explore the oceans of Europa and Enceladus, frozen moons of Jupiter and Saturn, respectively.
What the Robot Divers Are Discovering
These robotic SCUBA divers are making incredible discoveries at the bottom of the ocean. For starters, robots are helping scientists map the ocean floor, which is a monumental undertaking. Robots that dive down to the seabed need to withstand immense pressures that would kill a human diver, and all of their instrumentation, sensors, cameras and navigation electronics must survive those conditions as well.
In addition to mapping the ocean floor, robotic SCUBA divers are helping humans find new species of aquatic life. The seabed seems like an unlikely place for life — it is entirely devoid of sunlight and freezing cold. However, robotic deep-sea exploration has changed how scientists think about the necessary ingredients for life.
Dozens of new species have been discovered living in the darkness of the deep oceans. They range from colossal squids to strange life forms that look like something straight out of science fiction. Many deep-sea animals do not have eyes since there is no light on the ocean floor. Others have evolved to be far larger than their shallow-water cousins, such as enormous jellyfish and crabs.
The Future of Robotic SCUBA Divers
Robots are taking exploration to new horizons that would otherwise be unattainable for humans. Earth’s oceans remain some of the least explored regions in the solar system. With the help of robotic SCUBA divers, scientists are discovering new species, uncovering ancient shipwrecks and revolutionizing knowledge of the sea.
Due to the worldwide chip shortage, anyone who needs quickly available industrial cameras for image processing projects faces no easy task. IDS Imaging Development Systems GmbH has therefore spent recent months developing alternative USB3 hardware generations based on available, advanced semiconductor technology, and has consistently secured components for this purpose. Series production of new industrial cameras with a USB3 interface and Vision Standard compatibility has recently started. In the CP and LE camera series of the uEye+ product line, customers can choose the right model for their applications from a total of six housing variants and numerous CMOS sensors.
The models of the uEye CP family are particularly suitable for space-critical applications thanks to their distinctive, compact magnesium housing with dimensions of only 29 x 29 x 29 millimetres and a weight of around 50 grams. Customers can choose from global and rolling shutter sensors from 0.5 to 20 MP in this product line. Those who prefer a board-level camera instead should take a look at the versatile uEye LE series. These cameras are available with coated plastic housings and C-/CS-mount lens flanges as well as board versions with or without C-/CS-mount or S-mount lens connections. They are therefore particularly suitable for projects in small device construction and integration in embedded vision systems. IDS initially offers the global shutter Sony sensors IMX273 (1.6 MP) and IMX265 (3.2 MP) as well as the rolling shutter sensors IMX290 (2.1 MP) and IMX178 (6.4 MP). Other sensors will follow.
The USB3 cameras are perfectly suited for use with IDS peak thanks to the vision standard transport protocol USB3 Vision®. The Software Development Kit includes programming interfaces in C, C++, C# with .NET and Python as well as tools that simplify the programming and operation of IDS cameras while optimising factors such as compatibility, reproducible behaviour and stable data transmission. Special convenience features reduce application code and provide an intuitive programming experience, enabling quick and easy commissioning of the cameras.
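For developers who want a feel for the SDK, the minimal Python sketch below shows roughly how a connected uEye+ camera can be enumerated and opened through the ids_peak bindings. It is only a sketch: the package, class and node names are based on the publicly available IDS peak samples and should be verified against the documentation shipped with your SDK version.

from ids_peak import ids_peak

# Initialise the GenICam-based IDS peak library, list connected cameras,
# open the first one and read its model name from the remote node map.
# (Names assumed from IDS peak sample code; check your SDK version.)
ids_peak.Library.Initialize()
try:
    device_manager = ids_peak.DeviceManager.Instance()
    device_manager.Update()  # scan interfaces for connected cameras

    if device_manager.Devices().empty():
        raise RuntimeError("No USB3 Vision camera found")

    device = device_manager.Devices()[0].OpenDevice(
        ids_peak.DeviceAccessType_Control)
    nodemap = device.RemoteDevice().NodeMaps()[0]
    print("Opened:", nodemap.FindNode("DeviceModelName").Value())
finally:
    ids_peak.Library.Close()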
Kivnon will be presenting its most advanced and safest AGV/AMR Forklift at the event
21 September 2022, Barcelona: Kivnon, an international group specializing in automation and mobile robotics, is attending Logistics & Automation in Spain and will be showcasing its safe and versatile K55 AGV/AMR Forklift Pallet Stacker. Putting the emphasis on forklift safety, Kivnon has equipped the K55 with advanced safety features to guarantee safe operations as it collaborates, moves, and reacts in a facility.
The Kivnon K55 is designed to move and stack palletized loads at low heights and performs cyclic or conditioned routes while interacting with other AGVs/AMRs, machines, systems, and people, making it a highly effective and safe solution. The model incorporates safety scanners that give the vehicle 360-degree safety coverage and allow it to operate seamlessly in shared spaces. Fork sensors help verify that pallets are loaded and unloaded correctly, keeping the transported goods safe.
Thierry Delmas, Managing Director at Kivnon, says, “AGVs/AMRs are revolutionizing internal logistics. The rising forklift safety challenge is of deep concern, and with the K55 we have taken a step forward to address the global issue. The Kivnon range is designed to ensure safe and reliable operations and to optimize operational efficiency.”
During the event, which runs from 26 – 27 October at IFEMA, Madrid, Kivnon will demonstrate the capabilities of the K55 Pallet Stacker. The vehicle can autonomously transport palletized loads of up to 1,000 kg and lift them to heights of up to 1 meter. It is capable of performing cyclical or conditional circuits and interacting with other AGVs/AMRs, machines, and systems. Highly adaptable, the K55 is perfect for any open-bottom or euro-pallet storage application, receipt and dispatch of goods, and internal material transport. Its use will optimize safety, storage space, and process efficiency.
A robust industrial product, the K55 provides the reliability required to ensure continuity of the production process and the flexibility to adapt to specific application needs, with an online battery charging system that allows 24/7 operation through opportunity charging.
Delmas continues, “The Logistics and Automation show is an important networking event where customers can learn about the latest technologies and innovations. We pride ourselves on innovation and are excited to have this opportunity to showcase the capabilities of our products. In addition to the K55, our robust portfolio includes twister units, car and heavy load tractors, low-height vehicles, and cart pullers, meeting multiple application needs.”
The efficiency and precision of Kivnon AGVs/AMRs will be on display and Kivnon robotics experts will be available throughout the show to answer questions and arrange consultations at booth #3F43.
Kivnon offers a wide range of autonomous vehicles (AGVs/AMRs) and accessories for transporting goods, using magnetic navigation or mapping technologies that adapt to any environment and industry. The company provides an integrated solution, with mobile robots automating different applications within the automotive, food and beverage, logistics and warehousing, manufacturing, and aeronautics industries.
Kivnon products are characterized by their robustness, safety, precision, and high quality. A user-friendly design philosophy makes them pleasant to use, simple to install, and intuitive to operate.
Learn more about Kivnon mobile robots (AGVs/AMRs) here.
Apptronik, an Austin-based company specializing in the development of versatile, mobile robotic systems, is announcing a partnership with NASA to accelerate commercialization of its new humanoid robot. The robot, called Apollo, will be one of the first humanoids available to the commercial markets.
At Apptronik’s headquarters in Austin, Texas, the first prototype of Apollo is now complete, with the expectation of broader commercial availability in 2023. Unlike special-purpose robots that are only capable of a single, repetitive task, Apollo is designed as a general-purpose robot capable of doing a wide range of tasks in dynamic environments. Apollo will benefit workers in industries ranging from logistics and retail to hospitality, aerospace and beyond.
NASA is known across the globe for its contributions to the advancement of robotics technology. NASA first partnered with Apptronik in 2013 during the DARPA Robotics Challenge (DRC), where the company’s founders were selected to work on NASA’s Valkyrie robot. The government agency has now selected Apptronik as a commercial partner to launch a new generation of general-purpose robots, starting with Apollo.
“Continued investment from NASA validates the work we are doing at Apptronik and the inflection point we have reached in robotics. The robots we’ve all dreamed about are now here and ready to get out into the world,” said Jeff Cardenas, CEO and co-founder of Apptronik. “These robots will first become tools for us here on Earth, and will ultimately help us move beyond and explore the stars.”
In addition to its work with NASA, Apptronik has partnered with leading automotive OEMs, major transportation and logistics companies, and government agencies. With notable names including Dr. Nicholas Paine, Co-founder and Chief Technology Officer, and Dr. Luis Sentis, Co-founder and Scientific Advisor, its team is regarded as among the best in the world. Based in Austin, a growing hub for robotics, the company continues to recruit top talent looking to bring their innovations to market.
Apptronik is recognized for its emphasis on human-centered design, building beautifully designed and user-friendly robotic systems. As part of this commitment, it selected premier design firm argodesign as its partner in designing Apollo with the goal of creating robots capable of working alongside humans in our most critical industries. The team’s focus now is to scale Apollo so that it can be customer-ready in 2023.
About Apptronik: Apptronik is a robotics company that has built a platform to deliver a variety of general-purpose robots. The company was founded in 2016 out of the Human Centered Robotics Lab at the University of Texas at Austin, with a mission to leverage innovative technology for the betterment of society. Its goal is to introduce the next generation of robots that will change the way people live and work, while tackling some of our world’s largest challenges. To learn more about careers at Apptronik, visit https://apptronik.com/careers/.
If you are an experienced Maker, or even just a passing Tinkerer, the RoboHeart™ Hercules development board may be just what you are looking for.
RoboHeart is an exceptional development board that proudly boasts: “One board to rule them all!” This circuit board is just what it sounds like: the heart of your robotics Maker project.
At Augmented Robotics, we have gone the extra mile by combining the magic of Augmented Reality with embedded mobile systems, so you can play with and manipulate brave new worlds using only your smartphone. This allows you to control any RC device or creative Maker project with connectivity over WiFi, BLE and even 5G when paired with our RoboHeart Vela board. On Tuesday, August 23, 2022, RoboHeart was launched on Kickstarter, and it reached its funding goal in less than three hours! In only two short days, RoboHeart was selected as the coveted “Project We Love” by Kickstarter – an award given to projects that really stand out with creative innovation.
By replacing the circuit board inside your RC car with RoboHeart, you can drive the car with only your smartphone – so we have decided to add an original AR game on top of that. Drive around and collect candies to get the most points… but be careful not to run out of fuel! As we reach more stretch goals, the game will get increasingly complex. We have already reached our first stretch goal – Gamepad compatibility – but there are many more to come.
The Maker community is the most creative community out there, so RC cars are definitely not the only thing your RoboHeart board can be used for. Put the board into any electronics project and watch it come to life!
Get RoboHeart and dive right into building projects with the open-source RoboHeart Arduino library on GitHub! Our repository has several cool examples of varying complexity, from reading the IMU sensor data to controlling things with your smartphone. For example, we have built our own Balancing Bot using only one board.
The RoboHeart Hercules has all the features one needs to start building:
ESP32-WROOM32 with dual-core and WiFi+Bluetooth capabilities
Arduino compatibility for easy programming
An integrated Inertial Measurement Unit (IMU) for motion data
Three DC motor outputs
USB-C connector with auto-download feature: flashing is a piece of cake!
LiPo battery input with auto-charging feature: where do you need power most, at USB or LiPo? RoboHeart will automatically distribute power where it is most needed
Convenient peripherals for Makers: GROVE, JST and JTAG
… and with the addition of the RoboHeart Vela extension board, Makers can unlock the power of 5G, and no longer be dependent on the BLE range of 50m.
So the only question that remains is: What will YOU do with RoboHeart?
Today, cameras are often more than just suppliers of images – they can recognise objects, generate results or trigger follow-up processes. Visitors to VISION Stuttgart, Germany, can find out about the possibilities offered by state-of-the-art camera technology at IDS booth 8C60. There, they will discover the next level of the all-in-one AI system IDS NXT. The company is not only expanding the machine learning methods to include anomaly detection, but is also developing a significantly faster hardware platform. IDS is also unveiling the next stage of development for its new uEye Warp10 cameras. By combining a fast 10GigE interface and TFL mount, large-format sensors with up to 45 MP can be integrated, opening up completely new applications. The trade fair innovations also include prototypes of the smallest IDS board-level camera and a new 3D camera model in the Ensenso product line.
IDS NXT: More than artificial intelligence
IDS NXT is a holistic system with a variety of workflows and tools for realising custom AI vision applications. The intelligent IDS NXT cameras can process tasks "on device" and deliver image processing results themselves. They can also trigger subsequent processes directly. The range of tasks is determined by apps that run on the cameras. Their functionality can therefore be changed at any time. This is supported by a cloud-based AI Vision Studio, with which users can not only train neural networks, but now also create vision apps. The system offers both beginners and professionals enormous scope for designing AI vision apps. At VISION, the company shows how artificial intelligence is redefining the performance spectrum of industrial cameras and gives an outlook on further developments in the hardware and software sector.
uEye Warp10: High speed for applications
With 10 times the transmission bandwidth of 1GigE cameras and about twice the speed of cameras with USB 3.0 interfaces, the recently launched uEye Warp10 camera family with 10GigE interface is the fastest in the IDS range. At VISION, the company is demonstrating that these models will not only set standards in terms of speed, but also resolution. Thanks to the TFL mount, it becomes possible to integrate much higher resolution sensors than before. This means that even detailed inspections with high clock rates and large amounts of data will be feasible over long cable distances. The industrial lens mount allows the cameras to fully utilise the potential of large format (larger than 1.1″) and high resolution sensors (up to 45 MP).
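To put those bandwidth figures into perspective, here is a rough back-of-envelope calculation in Python. It assumes nominal link rates and 8 bits per pixel and ignores protocol overhead, so real-world frame rates will be somewhat lower; it is meant only to illustrate the scale of the difference between the interfaces.

# Rough throughput comparison: nominal link rates, 8-bit pixels,
# protocol overhead ignored. Illustrative only.
LINK_RATES_GBPS = {"1GigE": 1.0, "USB 3.0": 5.0, "10GigE": 10.0}
MEGAPIXELS = 45          # largest sensor mentioned for the TFL mount
BITS_PER_PIXEL = 8       # assumed 8-bit monochrome output

frame_bits = MEGAPIXELS * 1e6 * BITS_PER_PIXEL
for interface, gbit_per_s in LINK_RATES_GBPS.items():
    fps = gbit_per_s * 1e9 / frame_bits
    print(f"{interface:>8}: ~{fps:4.1f} frames/s at {MEGAPIXELS} MP")
# Prints roughly 2.8, 13.9 and 27.8 frames/s respectively, i.e. 10GigE
# carries about 10x the 1GigE rate and about 2x the USB 3.0 rate.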
uEye XLS: Smallest board-level camera with cost-optimised design
IDS is presenting prototypes of an additional member of its low-cost portfolio at the fair. The name uEye XLS indicates that it is a small variant of the popular uEye XLE series. The models will be the smallest IDS board-level cameras in the range. They are aimed at users who require particularly low-cost, extremely compact cameras, with or without lens holders, in large quantities, for example for embedded applications. They can look forward to Vision Standard-compliant project cameras with various global shutter sensors and trigger options.
Ensenso C: Powerful 3D camera for large-volume applications
3D camera technology is an indispensable component in many automation projects. Ensenso C is a new variant in the Ensenso 3D product line that scores with a long baseline and high resolution, while at the same time offering a cost-optimised design. Customers receive a fully integrated, pre-configured 3D camera system for large-volume applications that is quickly ready for use and provides even better 3D data thanks to RGB colour information. A prototype will be available at the fair.
If you are a hobbyist or beginner in CNC machining, or someone who uses CNC routers only intermittently, you might be looking for easy-to-use CAM software.
SourceRabbit, a Greek CNC machine tool manufacturer, has released a new CAM software package called RabbitCAM. The new software is cross-platform, meaning it works under Windows, macOS and Linux; it features a user-friendly interface and lets anyone generate toolpaths from 2D DXF files.
We managed to obtain a statement from Nikos Siatras, CEO of SourceRabbit, who told us “Today’s CAM software is expensive and difficult to use for most users. Our primary goal with RabbitCAM is to create software that is easy to use, fast, affordable and able to work under any modern operating system.”
RabbitCAM is a cross-platform software for rapid programming of 3-Axis CNC machine tools from 2D DXF files.
We built RabbitCAM to provide our customers with the fastest and easiest solution for turning their designs into parts and final products through simple material removal strategies.
The multithreaded core of RabbitCAM takes advantage of all system processors during toolpath calculations in order to reduce the user’s waiting time, while its user-friendly interface displays the toolpaths on screen almost in real time.
RabbitCAM was created with the Java programming language and works on all operating systems (Windows, macOS, Linux, Haiku and more).
Igus igumania game. Build your own Mars-rover-assembling automation factory and improve it with @igusgmbh products. Enjoy this new game soon in your web browser and on other platforms. Get to know Rusty the robot and Dave the igus employee while learning about igus smart plastics products and low-cost automation. I really enjoyed playing this game, mainly for one reason: robots 😉