Kickstarter Campaign Announced for RoboDriveWheel: The fully integrated wheel for mobile robots

NAPLES, ITALY — S4E Robotics is pleased to announce the launch of a Kickstarter campaign introducing RoboDriveWheel: the fully integrated wheel for mobile robots. RoboDriveWheel is a new fully integrated motorized wheel designed specifically for the development of a new generation of safe and versatile service mobile robots.

“We believe RoboDriveWheel can help not only robotics designers but also enthusiasts and hobbyists create powerful and smart mobile robots with little effort while reducing cost and time,” says Andrea Fontanelli, inventor of RoboDriveWheel. “This Kickstarter campaign, to raise €43,359, will help us optimize the production process, manufacture our units and finalize our packaging components.”

RoboDriveWheel integrates, inside a Continental rubber tyre with strong adhesion, a powerful brushless motor, a high-efficiency planetary gearbox and a control board implementing state-of-the-art algorithms for torque and velocity control. RoboDriveWheel can also detect impacts and collisions by combining the torque, acceleration and inclination measurements obtained from the sensors integrated into the control board.
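The release does not describe the detection algorithm itself; as a rough illustration of how torque, acceleration and inclination readings could be fused, consider the following sketch (all thresholds and the fusion logic are hypothetical, not S4E's actual method):

```python
# Illustrative sketch only: a threshold-based collision detector fusing
# torque, acceleration, and inclination readings. All thresholds and the
# fusion logic are hypothetical, not S4E Robotics' actual algorithm.

def detect_collision(torque_nm, accel_ms2, incline_deg, expected_torque_nm,
                     torque_margin=0.5, accel_limit=3.0, incline_limit=15.0):
    """Flag a collision when measured torque deviates from the expected
    (commanded) torque while a sudden acceleration or tilt is observed."""
    torque_anomaly = abs(torque_nm - expected_torque_nm) > torque_margin
    sudden_accel = abs(accel_ms2) > accel_limit
    tilted = abs(incline_deg) > incline_limit
    return torque_anomaly and (sudden_accel or tilted)
```

In practice, such detectors compare the measured torque against the torque expected from the current command, so a blocked wheel or a bump registers as an anomaly rather than normal driving load.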

“RoboDriveWheel is the ideal solution for anyone who wants to build a mobile robot with little effort but which integrates the most modern technologies,” says Roberto Iorio, CFO of S4E Robotics. RoboDriveWheel is being produced by S4E Robotics, a company specialising in industrial automation and robotics that has designed and produced mobile robots such as ENDRIU, the compact and modular mobile robot for sanitization. The process of designing mobile service robots is highly time-consuming. A service robot drive wheel must be compact, have good traction, and be both powerful and fast enough to move the robot at a speed comparable to human walking speed.

Moreover, a robotic wheel must integrate all the required functionalities, sensors, and electronics. Finally, a service robot must work close to humans, so it should be able to detect interactions with its environment efficiently. Usually, all these capabilities require different components: the wheel, the shaft, one or more bearings with their housing, a traction system (pulley and belt), a motor with a reducer and encoder, an IMU, and the electronics with several cables for all the sensors.

RoboDriveWheel integrates all these functions inside the wheel, and it is easy to connect and control through a single cable carrying both the power supply and CAN-bus communication, a protocol used for decades in the automotive sector.
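RoboDriveWheel's CAN protocol is not published; purely as an illustration of single-cable CAN control, here is a sketch of packing a velocity setpoint into a classic 8-byte CAN data field (the arbitration ID, scaling, and byte layout are invented for the example):

```python
import struct

# Illustrative sketch only: packing a velocity setpoint into a classic
# 8-byte CAN data field. The arbitration ID, scaling, and byte layout
# are invented for this example; RoboDriveWheel's real protocol differs.

VELOCITY_CMD_ID = 0x101  # hypothetical arbitration ID

def pack_velocity_command(wheel_rpm):
    """Encode the setpoint as a signed little-endian milli-RPM integer,
    padded to the 8-byte payload of a classic CAN frame."""
    milli_rpm = int(round(wheel_rpm * 1000))
    return struct.pack("<i4x", milli_rpm)

def unpack_velocity_command(data):
    """Decode a payload produced by pack_velocity_command()."""
    (milli_rpm,) = struct.unpack("<i4x", data)
    return milli_rpm / 1000.0
```

On a real bus, the packed payload would be handed to a CAN interface together with the arbitration ID; the single cable then carries both this traffic and the supply voltage.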
To learn more about the Kickstarter campaign and support RoboDriveWheel’s €43,359 goal, please visit
https://www.kickstarter.com/projects/andreafontanelli/robodrivewheel

See RoboDriveWheel in action in our promotional video on YouTube: https://www.youtube.com/watch?v=t4SOgimqY6U

uEye XC closes the market gap between industrial camera and webcam

New 13 MP autofocus camera from IDS

Unlike webcams from the consumer sector, uEye XC is designed specifically for industrial requirements. Its components offer long-term availability – a must for industrial applications. It is used, for example, in kiosk systems, logistics, the medical sector and robotics applications.

Thanks to the integrated autofocus module, the USB3 Vision standard-compliant camera can easily manage changing object distances. In terms of appearance, it is distinguished by its elegant and lightweight magnesium housing. With dimensions of only 32 x 61 x 19 mm (W x H x D) and a screwable USB Micro B connection, the camera can be easily integrated into image processing systems. Its 13 MP onsemi sensor delivers 20 fps at full resolution and ensures high image quality even in changing light conditions thanks to BSI (“Back Side Illumination”) pixel technology. Useful functions such as 24x digital zoom, auto white balance and colour correction ensure that all details can be captured perfectly. With the IDS peak software development kit, users can configure the camera specifically for their application if required.

From process and quality assurance to kiosk systems, logistics automation and medical technology: the autofocus camera can be used in a variety of ways in both industrial and non-industrial areas. With the quick-change macro attachment lens, users can easily shorten the minimum object distance of the camera. This also makes it suitable for close-up applications, such as diagnostics.

Learn more: https://en.ids-imaging.com/ueye-xc-autofocus-camera.html

Robots as a Service (RaaS) – everything you need to know

Robots as a Service (RaaS) has become popular in the last couple of years, mainly because it is much cheaper for a business to hire a robot for a specific project than to buy one that it may not fully utilize in the long run. For efficiency and effectiveness, most robot arms are designed to do a particular task, such as moving items from one position to another.

Besides costs, there are several other advantages of RaaS over buying an in-house robot. In this article, we will discuss everything you need to know about RaaS, including its benefits and the different types of robots that you can hire.

Photo by ThisisEngineering RAEng on Unsplash

What is Robots as a Service?

Robots as a Service involves renting robots to organizations that would like to use robots but do not have the expertise or in-house resources to procure and maintain them. There are several types of robots that organizations can hire depending on the task at hand. Some of the popular options include robot arms, cobots, and universal robots.

Robot arms are designed with almost the same functionality as a human arm. Their core task is to handle and move things from one position to another. A cobot, or collaborative robot, is designed to work hand in hand with humans. Universal robots are more versatile robot arms that can perform a wider range of tasks than standard robot arms.

Use cases of RaaS

  1. Temporary warehouse operations: If you have a temporary project that involves operating a warehouse, it is much wiser to hire robots than to buy them. Robots are more efficient and less costly to maintain than human workers. Alternatively, you could use cobots that work alongside humans to handle different warehouse operations.
  2. Security for structures under construction: You can use robots to perform 360° surveillance of facilities under construction. Robots capture useful real-time information that can be sent to AI algorithms for further interpretation.
Photo by Testalize.me on Unsplash

Benefits of RaaS

  • Lower cost of entry: Small businesses that don’t have the money to buy robots can take advantage of RaaS by hiring robots for specific projects at which humans may not be effective or efficient.
  • Increased scalability: Businesses can quickly scale up their automation operations using some of the cutting-edge robotic technologies without incurring high upfront costs and the risk of asset deterioration or obsolescence.
  • Businesses no longer need robotics experts: With RaaS, small businesses can hire robots without having to recruit robotics professionals. Most of the configuration is done in the cloud by the RaaS provider. This further lowers the barrier to entry for getting into the automation space.

Final thoughts

If your business operations involve tasks that can be done by robot arms, cobots, or universal robots, using RaaS is the easiest way to get started. The good news is that you can hire cobots from Bots.co.uk for as low as £65 a day. Considering how much work these robots can do in a day, £65 is a worthy investment if your business has enough operations that need to be automated.

Block-based Editor: Create Vision Apps without programming knowledge

New IDS NXT software release themed “App your camera!”

The current software release 2.6 for the AI vision system IDS NXT focuses primarily on simplifying app creation. The initial phase in development is often one of the greatest challenges in the realisation of a project. With the help of the new Application Assistant in IDS NXT lighthouse, users configure a complete vision app under guidance in just a few steps, which they can then run directly on an IDS NXT camera. With the Block-based Editor, which is also new, users can configure their own program sequences with AI image processing functions, such as object recognition or classification, without any programming knowledge. Users create simple sequences in a few minutes with this visual code editor without having to know the syntax of a specific programming language.

With the Use Case Assistant, IDS supports users in creating Vision App projects. They simply select the use case that fits their project. With queries and tips, the assistant guides them through the process of creating the Vision App project and creates the code, just like in an interview. It links existing training projects with the vision app project or creates new training projects and data sets in IDS NXT lighthouse if required.

With the combinable blocks and the intuitive user interface of the Block-based Editor, anyone can realise their own projects using AI-based image processing (such as object detection or classification) as an individual vision app without having to know the syntax of a specific programming language. Using the predefined blocks of the code editor, users build their vision app graphically, including processes such as loops and conditional statements. How this works is demonstrated, for example, in the IDS Vision Channel (www.ids-vision-channel.tech). The session “Build AI vision apps without coding – xciting new easyness” is available for viewing as a recording.

IDS NXT is a comprehensive system with a wide range of workflows and tools for realising your own AI vision applications. The intelligent IDS NXT cameras can process tasks “OnDevice” and deliver image processing results themselves. The tasks of the cameras are determined by apps that are uploaded to the cameras and executed there. Their functionality can thus be changed at any time. This is supported by software such as IDS NXT lighthouse, with which users can not only train neural networks, but now also create their own vision apps. The system offers both beginners and professionals enormous scope for designing AI vision apps.
Learn more: www.ids-nxt.com

Festo and MassRobotics are Leading the Global Nurturing of Healthcare Robotics Innovation

The Festo-MassRobotics Healthcare Robotics Startup Catalyst program celebrates the milestones achieved by the program’s four selected global startups at the Healthcare Robotics Engineering Forum. Key life sciences and robotics speakers to lead the event.

The successful Healthcare Robotics Startup Catalyst program came to an end on April 7th, 2022. The concluding ceremony will be held at the Healthcare Robotics Engineering Forum, Boston Convention and Exhibition Center, on May 11, 2022. The event includes an impressive line-up of speakers: Fady Saad, Co-founder & Vice President of Strategic Partnerships at MassRobotics; Alfons Riek, Vice President of Technology and Innovation at Festo; Kendalle Burlin O’Connell, President & Chief Operating Officer at MassBio; Kenn Turner, President and CEO at Mass Life Sciences Center; and Brian Johnson, President at MassMedic. All four selected startup companies, Kinarm (Canada), Assistive Technology Development Inc. (United States), Eureka Robotics (Singapore), and Bionomous (Switzerland), will, in turn, promote their companies, along with their products and service offerings. They will also be demonstrating their technologies on the event’s expo floor.

In October 2021, MassRobotics, Festo, and other key players in healthcare robotics launched a Startup Catalyst Program to advance healthcare robotics companies around the world by providing the networking opportunities, guidance, and resources they need to grow and succeed. The aim of the program was to connect healthcare robotics startups with customers, investors, suppliers, marketing, and overall support. The program focused on startups in the areas of clinical care, public safety, laboratory, supply chain automation, out-of-hospital care, quality of life, as well as continuity of work and education, and training and support for healthcare professionals.

More than 30 companies applied from all over the world, and the selection committee invited four to join the program. The participating startups completed impressive milestones, as detailed below:

  • Eureka Robotics develops and commercializes cutting-edge robotics and artificial intelligence (AI) technologies to automate high-accuracy, high-agility tasks. Eureka is currently completing fundraising rounds in Japan through connections provided by program mentors. Eureka was introduced to MassRobotics partner Mitsubishi Electric and signed a global partnership with Mitsubishi as a platinum partner. The program helped the company’s leadership explore attractive applications in surgical lens manufacturing technology, an extension of its focus on traditional manufacturing.
  • Bionomous provides laboratory equipment to automate the screening, sorting, and pipetting of miniature biological entities for more ethical and faster research in life science. CEO Frank Bonnet reports that with the aid of the Catalyst Program, Bionomous was able to run a pilot program in the US, leading to the company’s first sales outside Europe. This convinced Bionomous to expand into the US market and set up offices in the MassRobotics space in Boston. Bonnet emphasized the importance of the program’s mentors, who connected them to key industry leaders to open possibilities for future partnerships.
  • Assistive Technology Development Inc. is an American startup dedicated to at-home physical therapy solutions that are operable at a low cost and always accessible to rural patients and those who need closer monitoring for recovery. The company came into the program with three goals: 1) begin its first pilot study in a clinical setting; 2) downsize the actuation unit to a wearable form, and 3) raise capital. CEO Todd Roberts reports that with help from the program, the company has completed the first two milestones and is making progress on the third. It will begin phase I of a pilot study with UCHealth, a not-for-profit health care system, headquartered in Aurora, Colorado, on April 25th, allowing the company to present preliminary results at the keynote event at the Healthcare Conference. The study will assess the early clinical efficacy and collect patient and clinician feedback. Assistive’s actuation unit has been downsized by 70%, from a large, wall-powered, benchtop system to a wearable, battery-powered system that will enable the company to complete the pilot. Finally, Assistive is in the process of raising capital and has begun diligence with two firms.
  • Kinarm uses robotic arms to provide an objective assessment method to identify, measure, and track cognitive, motor, or sensory impairments resulting from injury or disease. Kinarm worked with assigned mentors from the robotics ecosystem who provided introductions to industry leaders who responded with “jaw-dropping, you-can-do-that?” exclamations, reports Anne Vivian-Scott, CEO. Vivian-Scott was also introduced to experienced healthcare robotics leaders who will collaboratively aid Kinarm as the company scales its solutions. Vivian-Scott adds, “What we gained was not specific knowledge that can be encoded into our product, but direction. Quite frankly, most other programs are not ‘sufficiently vested’ in the participant’s business/opportunity to be able to offer such feedback.”

“I am grateful for Festo’s pioneering work in supporting our efforts to find globally disruptive applications and startups in such a human-care field as healthcare, including life science, biotech, and medical devices,” said Fady Saad of MassRobotics.

“I am impressed with the quality of applications we received, and the unique structure of the program that allowed us to select such innovative companies and match them with world-class advisors,” said Festo’s Alfons Riek. “Certainly, we are excited about the networking opportunities opened to these companies and to presenting them to the world as great examples of the power of utilizing robotics in healthcare.”

MassRobotics, Festo, and additional corporations plan to launch the second edition of the program by July 2022 to build on the program’s amazing momentum and impact.

ABOUT MassRobotics: MassRobotics is the result of the collective work of a group of engineers, rocket scientists, and entrepreneurs with a shared vision to create an innovation hub and startup cluster focused on the needs of the robotics and IoT community. MassRobotics’ mission is to help create and scale the next generation of successful robotics and connected devices companies by providing entrepreneurs and innovative robotics/automation startups with the workspace and resources they need to develop, prototype, test, and commercialize their products and solutions. See www.massrobotics.org for details.

About Festo: Festo is a global player and an independent family-owned company with headquarters in Esslingen am Neckar, Germany. Festo has set standards in industrial automation technology and technical education ever since its establishment, thereby making a contribution to the sustainable development of the environment, the economy, and society. The company supplies pneumatic and electrical automation technology to 300,000 customers in factory and process automation across over 35 industries. The LifeTech sector, with medical technology and laboratory automation, is becoming increasingly important. The products and services are available in 176 countries. With about 20,000 employees in over 250 branch offices in 61 countries worldwide, Festo achieved a turnover of around €2.84 billion in 2020. Each year around 8% of this turnover is invested in research and development. In this learning company, 1.5% of turnover is invested in basic and further training. Festo Didactic SE is a leading provider of technical education and training and offers its customers worldwide comprehensive digital and physical learning solutions in the industrial environment.

Large laundry

Intelligent robotics for laundries closes automation gap

The textile and garment industry is facing major challenges with current supply chain and energy issues. The future recovery is also threatened by factors that hinder production, such as labour and equipment shortages, which put companies under additional pressure. The competitiveness of the industry, especially in a global context, depends on how affected companies respond to these framework conditions. One solution is to move the production of clothing back to Europe in an economically viable way. Shorter transport routes and the associated significant savings in transport costs and greenhouse gases speak in favour of this. On the other hand, the higher wage costs and the prevailing shortage of skilled workers in Europe must be compensated for. The latter requires further automation of textile processing.

The German deep-tech start-up sewts GmbH from Munich has focused on the great potential that lies in this task. It develops solutions with the help of which robots – similar to humans – anticipate how a textile will behave and adapt their movement accordingly. In the first step, sewts has set its sights on an application for large industrial laundries. With a system that uses both 2D and 3D cameras from IDS Imaging Development Systems GmbH, the young entrepreneurs are automating one of the last remaining manual steps in large-scale industrial laundries, the unfolding process. Although 90% of the process steps in industrial washing are already automated, the remaining manual operations account for 30% of labour costs. The potential savings through automation are therefore enormous at this point.

Application

It is true that industrial laundries already operate in a highly automated environment to handle the large volumes of laundry. Among other things, the folding of laundry is done by machines. However, each of these machines usually requires an employee to manually spread out the laundry and feed it in without creases. This monotonous and strenuous loading of the folding machines has a disproportionate effect on personnel costs. In addition, qualified workers are difficult to find, which often has an impact on the capacity utilisation and thus the profitability of industrial laundries. The seasonal nature of the business also requires a high degree of flexibility. sewts uses IDS cameras as the image processing components of a new type of intelligent system whose technology can now be used to automate individual steps, such as sorting dirty textiles or feeding laundry into folding machines.

“The particular challenge here is the malleability of the textiles,” explains Tim Doerks, co-founder and CTO. While the automated processing of solid materials, such as metals, is comparatively unproblematic with the help of robotics and AI solutions, available software solutions and conventional image processing often still reach their limits when it comes to easily deformable materials. Accordingly, commercially available robots and gripping systems have so far performed even such simple operations as gripping a towel or piece of clothing inadequately. But the sewts system…

Read more

Teledyne FLIR Introduces Hadron 640R Dual Thermal-Visible Camera for Unmanned Systems

GOLETA, Calif. and ORLANDO, Fla. ― Teledyne FLIR, part of Teledyne Technologies Incorporated, today announced the release of its high-performance Hadron 640R combined radiometric thermal and visible dual camera module. The Hadron 640R design is optimized for integration into unmanned aircraft systems (UAS), unmanned ground vehicles (UGV), robotic platforms, and emerging AI-ready applications where battery life and run time are mission critical.

The 640 x 512 resolution Boson longwave infrared (LWIR) thermal camera inside the Hadron 640R can see in total darkness and through smoke, most fog, and glare, and provides temperature measurements for every pixel in the scene. The addition of the high-definition 64 MP visible camera enables the Hadron 640R to provide both thermal and visible imagery compatible with today’s on-device processors for AI and machine-learning applications at the edge.
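The release does not specify the radiometric output format; as an illustration, one convention commonly used by radiometric thermal cameras ("TLinear" high-gain mode, where each 16-bit count represents 0.01 K) converts raw counts to temperature like this sketch (consult the Boson documentation for the actual scaling):

```python
# Illustrative sketch only: converting radiometric 16-bit counts to
# degrees Celsius under the common "TLinear" convention, where each
# count represents 0.01 K (high-gain mode). The actual scaling depends
# on the camera's configuration; consult the Boson documentation.

def counts_to_celsius(raw_count, scale_k_per_count=0.01):
    kelvin = raw_count * scale_k_per_count
    return kelvin - 273.15
```

Per-pixel temperature then falls out of applying this conversion to every value in the raw frame.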

“The Hadron 640R provides integrators the opportunity to deploy a high-performance dual-camera module into a variety of unmanned form factors from UAS to UGV thanks to its incredibly small size, weight, and power requirement,” said Michael Walters, vice president product management, Teledyne FLIR. “It is designed to maximize efficiency and its IP-54 rating protects the module from intrusion of dust and water from the outside environment.”

The Hadron 640R reduces development costs and time-to-market for integrators and original equipment manufacturer (OEM) product developers by offering a complete system through a single supplier, Teledyne FLIR. This includes offering drivers for market-leading processors from NVIDIA, Qualcomm, and more, plus industry-leading integration support and service from a support team of experts. It also offers flexible 60 Hz video output via USB or MIPI compatibility. Hadron 640R is a dual use product and is classified under US Department of Commerce jurisdiction.

The Teledyne FLIR Hadron 640R is available for purchase globally from Teledyne FLIR and its authorized dealers. To learn more or to purchase, visit www.flir.com/hadron640r.

For an exclusive in-person first look at the Hadron 640R, please visit booth #2107 at AUVSI Xponential, April 26-28, 2022, in Orlando, Florida.

About Teledyne FLIR
Teledyne FLIR, a Teledyne Technologies company, is a world leader in intelligent sensing solutions for defense and industrial applications with approximately 4,000 employees worldwide. Founded in 1978, the company creates advanced technologies to help professionals make better, faster decisions that save lives and livelihoods. For more information, please visit www.teledyneflir.com or follow @flir.

About Teledyne Technologies
Teledyne Technologies is a leading provider of sophisticated digital imaging products and software, instrumentation, aerospace and defense electronics, and engineered systems. Teledyne’s operations are primarily located in the United States, the United Kingdom, Canada, and Western and Northern Europe. For more information, visit Teledyne’s website at www.teledyne.com.

An innovative tracked mobile robot for research and agricultural applications

Guest Article by Robo-dyne

Thanks to recent innovations in electronics and mechatronics, mobile robots and manipulators are becoming popular, mostly in smart factories where robotic platforms are used to move heavy loads and store goods.

One of the questions that arises when developing a new mobile robot is: continuous tracks or wheels? When implementing a new industrial robotics application, it can be very difficult to choose correctly between wheels and tracks, because each locomotion system offers different performance. The most important features to consider when designing a locomotion system are traction, the driving system (for example, Ackermann or skid-steering), the maximum ground pressure and the suspension system.
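As a concrete example of the skid-steering mentioned above, standard differential-drive kinematics convert a body velocity command into individual track speeds (the default track width below is an arbitrary example value, not a Robo-dyne specification):

```python
# Illustrative sketch: differential (skid-steer) kinematics converting a
# body velocity command (v, omega) into left/right track speeds. The
# default track width is an arbitrary example value.

def skid_steer_speeds(v_ms, omega_rad_s, track_width_m=0.5):
    """Return (left, right) track speeds in m/s for a skid-steer base."""
    left = v_ms - omega_rad_s * track_width_m / 2.0
    right = v_ms + omega_rad_s * track_width_m / 2.0
    return left, right
```

Driving straight gives equal track speeds; turning in place (v = 0) drives the tracks in opposite directions, which is exactly the manoeuvre that causes the track scrubbing skid-steer platforms are known for.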

If it is necessary to minimize ground pressure, a track-based solution is the right choice, since tracks are well suited to soft surfaces. On the other hand, wheels have a significant steering advantage over tracks, which translates into better manoeuvrability. Robo-dyne is a robotics company that provides tracked and wheeled robots that can be customized to the final requirements; this is particularly useful for companies that need to develop high-level applications on top of a mobile robotic platform and do not want to invest money and time in building the mobile base themselves, because they aim to focus on the final application.

Robo-dyne designed MaXXII-S, the first electric mobile robot with skid-steering and rubber tracks powered by AGM batteries. It uses an innovative undercarriage configuration comprising a large rear wheel and a pair of smaller front idler wheels pointing forward. This configuration allows the robot to keep its balance when additional tools, e.g., lawn mower units, weed cutters or rotary hoes, are mounted on its rear side, and it can also overcome obstacles taller than ⅔ of the rear wheel’s diameter, since the upper front wheel leans on the obstacle, providing an area of extra grip. MaXXII-S is open source and runs on ROS and C++ libraries, supporting researchers all over the world in developing their industrial applications.

The center idler wheels provide a better pressure distribution on the ground. A passive suspension system minimizes the vibrations transmitted to the main frame by track-terrain interactions. In particular, MaXXII-S adopts a set of three shock absorbers with preload adjustment, plus the possibility of changing the elastic coefficient, so the system configuration can be adapted to the terrain. This electric skid-steering tracked mobile robot and its undercarriage are already available on the market, finding use in a vast range of agricultural applications, such as plant monitoring or weed cutting, and also in industrial settings to move goods in warehouses with uneven ground, as reported on the manufacturer’s official website robo-dyne.com.

In Celebration of National Robotics Week, iRobot® Launches the Create® 3 Educational Robot

Robot’s Smartest Developer Platform, Now with ROS 2 and Python Support

BEDFORD, Mass., April 5, 2022 /PRNewswire/ — iRobot Corp. (NASDAQ: IRBT), a leader in consumer robots, today is expanding its educational product lineup with the launch of the Create® 3 educational robot – the company’s most capable developer platform to date. Based on the Roomba® i3 Series robot vacuum platform, Create 3 provides educators and advanced makers with a reliable, out-of-the-box alternative to costly and labor-intensive robotics kits that require assembly and testing. Instead of cleaning people’s homes, the robot is designed to promote higher-level exploration for those seeking to advance their education or career in robotics.

In Celebration of National Robotics Week, iRobot launched the Create® 3 Educational Robot – the company’s most capable developer platform to date. Now with ROS 2 and Python Support, Create 3 provides educators and advanced makers with a reliable, out of the box alternative to costly and labor-intensive robotics kits that require assembly and testing. Create 3 is designed to promote higher-level exploration for those seeking to advance their education or career in robotics.

The launch of Create 3 coincides with National Robotics Week, which began April 2 and runs through April 10, 2022. National Robotics Week, founded and organized by iRobot, is a time to inspire students about robotics and STEM-related fields, and to share the excitement of robotics with audiences of all ages through a range of in-person and virtual events.

“iRobot is committed to delivering STEM tools to all levels of the educational community, empowering the next generation of engineers, scientists and enthusiasts to do more,” said Colin Angle, chairman and CEO of iRobot. “The advanced capabilities we’ve made available on Create 3 enable higher-level students, educators and developers to be in the driver’s seat of robotics exploration, allowing them to one day discover new ways for robots to benefit society.”

With ROS 2 support, forget about building the platform, and focus on your application: 
The next generation of iRobot’s affordable and trusted all-in-one mobile robot development platform, Create 3 brings a variety of new functionalities to users, including compatibility with ROS 2, industry-standard software for roboticists worldwide. Robots require many different components, such as actuators, sensors and control systems, to communicate with each other in order to work. ROS 2 enables this communication, allowing students to speed up the development of their projects by focusing on their core application rather than the platform itself. Learning ROS 2 also gives students valuable experience that many companies seek in robotics developers.
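As a conceptual illustration of the publish/subscribe model that ROS 2 provides (this is plain Python, not actual ROS 2 code), components exchange messages on named topics without referencing each other directly:

```python
# Conceptual sketch only (plain Python, NOT actual ROS 2 code): ROS 2
# provides a publish/subscribe system in which nodes exchange messages
# on named topics without referencing each other directly. This toy
# TopicBus mimics that idea in a few lines.

class TopicBus:
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        # Register a callback to run whenever a message hits this topic.
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self._subscribers.get(topic, []):
            callback(message)

# A wheel-odometry "node" publishes; a controller "node" subscribes.
bus = TopicBus()
received = []
bus.subscribe("/odom", received.append)
bus.publish("/odom", {"x": 1.0, "y": 0.0})
```

In real ROS 2, nodes, typed messages, and discovery replace this toy bus, but the decoupling shown here is the reason students can swap sensors or controllers without rewriting the rest of the robot.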

Expand your coding skills even further with Python support:
iRobot also released a Python Web Playground for its iRobot Root® and Create 3 educational robots, providing a bridge for beginners to learn more advanced programming skills outside of the iRobot Coding App. Python, a commonly used coding language, enables users to broaden the complexity of projects that they work on. The iRobot Education Python Web Playground allows advanced learners and educators to program the iRobot Root and Create 3 educational robots with a common library written in Python. This provides users with a pathway to learn a new coding language, opening the door to further innovation and career development.

With more smarts, Create 3 lets you do more:
As a connected robot, Create 3 comes equipped with Wi-Fi, Ethernet-over-USB host, and Bluetooth. Create 3 is also equipped with a suite of intelligent technology, including an inertial measurement unit (IMU), optical floor tracking sensor, wheel encoders, and infrared sensors for autonomous localization, navigation, and telepresence applications. Additionally, the robot includes cliff, bump and slip detection, along with LED lights and a speaker.

A 3D simulation of Create 3 is also available using Ignition Gazebo for increased access to robotics education and research.

Create 3 Pricing and Availability
Create 3 is available immediately in the US and Canada for $299 USD and $399 CAD. It will be available in EMEA through authorized distributors in the coming months. Additional details can be found at https://edu.irobot.com/what-we-offer/create3.

iRobot Education Python Web Playground Availability
The iRobot Education Python Web Playground can be accessed in-browser at python.irobot.com.

Robots as helpers in the lettuce harvest

Robot solution for automating the lettuce harvest

Lettuce is a valuable crop in Europe and the USA. But labor shortages make this field vegetable difficult to harvest, as sourcing sufficient seasonal labor to meet harvesting commitments is one of the sector’s biggest challenges. Moreover, with wage inflation rising faster than producer prices, margins are very tight. In England, agricultural technology and machinery experts are working with IDS Imaging Development Systems GmbH (Obersulm, Germany) to develop a robotic solution to automate lettuce harvesting.

The team is working on a project funded by Innovate UK and includes experts from the Grimme agricultural machinery factory, the Agri-EPI Centre (Edinburgh UK), Harper Adams University (Newport UK), the Centre for Machine Vision at the University of the West of England (Bristol) and two of the UK’s largest salad producers, G’s Fresh and PDM Produce.

Within the project, existing leek harvesting machinery is being adapted to lift the lettuce clear from the ground and grip it between pinch belts. The lettuce’s outer, or ‘wrapper’, leaves will be mechanically removed to expose the stem. Machine vision and artificial intelligence are then used to identify a precise cut point on the stem to neatly separate the head of lettuce.

“The cutting process of an iceberg lettuce is the most technically complicated step to automate, according to the project partners from G’s subsidiary Salad Harvesting Services Ltd.,” explains IDS Product Sales Specialist Rob Webb. The prototype harvesting robot being built incorporates a GigE Vision camera from the uEye FA family, which is considered particularly robust and is therefore well suited to demanding environments. “As this is an outdoor application, a housing with IP65/67 protection is required here,” Rob Webb points out.

GV-5280FA

The choice fell on the GV-5280FA-C-HQ model with the compact 2/3″ global shutter CMOS sensor IMX264 from Sony. “The sensor was chosen mainly because of its versatility. We don’t need full resolution for AI processing, so sensitivity can be increased by binning. The larger sensor format means that wide-angle optics are not needed either,” Rob Webb summarized the requirements. In the application, the CMOS sensor delivers excellent image quality, high light sensitivity and exceptionally high dynamic range, producing almost noise-free, high-contrast 5 MP images in 5:4 format at 22 fps – even under fluctuating lighting conditions. The extensive range of accessories, such as lens tubes and trailing cables, is just as rugged as the camera housing and the screwable connectors (8-pin M12 connector with X-coding and 8-pin Binder connector). Another advantage: camera-internal functions such as pixel pre-processing, LUT and gamma correction reduce the required computing power to a minimum.
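The trade-off Webb describes, giving up resolution to gain sensitivity through binning, can be sketched in a few lines: summing each 2×2 block of pixels halves the width and height while collecting roughly four times the signal per output pixel. This is a generic illustration of the principle, not the IMX264’s on-sensor implementation:

```python
def bin_2x2(image):
    """Sum each 2x2 block of pixel values: output has half the
    width/height but ~4x the collected signal per pixel."""
    h, w = len(image), len(image[0])
    return [
        [image[r][c] + image[r][c + 1] + image[r + 1][c] + image[r + 1][c + 1]
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

# A uniform 4x4 frame of value 10 becomes a 2x2 frame of value 40.
frame = [[10] * 4 for _ in range(4)]
binned = bin_2x2(frame)
```

The quadrupled per-pixel signal is what makes binned images usable in lower light, at the cost of spatial detail the AI processing here does not need.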

The prototype of the harvesting robot will be used for field trials in England towards the end of the 2021 season.

“We are delighted to be involved in the project and look forward to seeing the results. We are convinced of its potential to automate and increase the efficiency of the lettuce harvest, not only in terms of compensating for the lack of seasonal workers”, affirms Jan Hartmann, Managing Director of IDS Imaging Development Systems GmbH.

Prototype lettuce harvesting robot of the Agri-EPI Centre (UK)

The challenges facing the agricultural sector are indeed complex. According to a forecast by the United Nations Food and Agriculture Organization (FAO), agricultural productivity will have to increase by almost 50 percent by 2050 compared to 2012 due to the dramatic increase in population. Such a yield expectation poses an enormous challenge to the agricultural industry, which is still in its infancy in terms of digitalization compared to other sectors and is already under high pressure to innovate in view of climatic changes and labor shortages. The agriculture of the future will be based on networked devices and automation. Cameras are an important building block, and artificial intelligence is a central technology here. Smart applications such as harvesting robots can make a significant contribution.