Robots as helpers in the lettuce harvest

Robot solution for automating the lettuce harvest

Lettuce is a valuable crop in Europe and the USA. But labor shortages make this field vegetable difficult to harvest: sourcing sufficient seasonal labor to meet harvesting commitments is one of the sector’s biggest challenges. Moreover, with wage inflation rising faster than producer prices, margins are very tight. In England, agricultural technology and machinery experts are working with IDS Imaging Development Systems GmbH (Obersulm, Germany) to develop a robotic solution that automates lettuce harvesting.


The team is working on a project funded by Innovate UK and includes experts from the Grimme agricultural machinery factory, the Agri-EPI Centre (Edinburgh UK), Harper Adams University (Newport UK), the Centre for Machine Vision at the University of the West of England (Bristol) and two of the UK’s largest salad producers, G’s Fresh and PDM Produce.

Within the project, existing leek-harvesting machinery is being adapted to lift the lettuce clear of the ground and grip it between pinch belts. The lettuce’s outer, or ‘wrapper’, leaves are then mechanically removed to expose the stem. Machine vision and artificial intelligence are used to identify a precise cut point on the stem to neatly separate the head of lettuce.

“The cutting process of an iceberg is the most technically complicated step to automate, according to our colleagues at G’s subsidiary Salad Harvesting Services Ltd.,” explains IDS Product Sales Specialist Rob Webb. “The prototype harvesting robot being built incorporates a GigE Vision camera from the uEye FA family. It is considered particularly robust and is therefore ideally suited to demanding environments. As this is an outdoor application, a housing with IP65/67 protection is required here,” Rob Webb points out.

GV-5280FA

The choice fell on the GV-5280FA-C-HQ model with Sony’s compact 2/3″ global shutter CMOS sensor IMX264. “The sensor was chosen mainly because of its versatility. We don’t need full resolution for AI processing, so sensitivity can be increased by binning. The larger sensor format also means that wide-angle optics are not needed,” Rob Webb summarizes the requirements. In this application, the CMOS sensor impresses with excellent image quality, light sensitivity and an exceptionally high dynamic range, delivering almost noise-free, very high-contrast 5 MP images in 5:4 format at 22 fps, even under fluctuating lighting conditions. The extensive range of accessories, such as lens tubes and trailing cables, is just as rugged as the camera housing and the screwable connectors (8-pin M12 connector with X-coding and 8-pin Binder connector). Another advantage: in-camera functions such as pixel pre-processing, LUT or gamma correction reduce the required computing power to a minimum.
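Binning, which Rob Webb mentions as the way to trade resolution for sensitivity, is easy to illustrate: adjacent pixels are combined so that each output pixel collects the signal of several sensor pixels. The sketch below is a generic NumPy example of 2×2 summed binning, not IDS’s in-camera implementation, and the frame size is only an approximation of a 5 MP sensor:

```python
import numpy as np

def bin2x2(img):
    """Sum each 2x2 pixel block: quarter the resolution, ~4x signal per pixel."""
    h, w = img.shape
    h2, w2 = h - h % 2, w - w % 2                    # crop to even dimensions
    blocks = img[:h2, :w2].reshape(h2 // 2, 2, w2 // 2, 2)
    return blocks.sum(axis=(1, 3))

frame = np.ones((2056, 2464), dtype=np.uint16)       # roughly a 5 MP sensor frame
print(bin2x2(frame).shape)                           # (1028, 1232)
```

Whether a camera sums or averages the binned pixels matters: summing boosts signal, averaging suppresses noise; cameras differ in which they implement.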

The prototype of the harvesting robot will be used for field trials in England towards the end of the 2021 season.

“We are delighted to be involved in the project and look forward to seeing the results. We are convinced of its potential to automate and increase the efficiency of the lettuce harvest, not only in terms of compensating for the lack of seasonal workers”, affirms Jan Hartmann, Managing Director of IDS Imaging Development Systems GmbH.

Prototype lettuce harvesting robot of the Agri-EPI Centre (UK)

The challenges facing the agricultural sector are indeed complex. According to a forecast by the United Nations Food and Agriculture Organization (FAO), agricultural productivity will have to increase by almost 50 percent by 2050, compared with 2012, due to dramatic population growth. Such a yield expectation poses an enormous challenge for an agricultural industry that is still in its infancy in terms of digitalization compared with other sectors, and that is already under high pressure to innovate in view of climate change and labor shortages. The agriculture of the future will be built on networked devices and automation. Cameras are an important building block, and artificial intelligence is a central technology here. Smart applications such as harvesting robots can make a significant contribution.

iRobot Releases Genius 4.0 Home Intelligence: Doubles the Intelligence for Roomba® i3 and i3+ Robot Vacuums and More

Updates Include Imprint® Smart Mapping for Roomba i3 Series, Siri Commands, Clothing & Towel Detection for Roomba j7 Series

BEDFORD, Mass., March 17, 2022 /PRNewswire/ — iRobot Corp. (NASDAQ: IRBT), a leader in consumer robots, today announced that it has started rolling out its iRobot Genius 4.0 Home Intelligence software update to Wi-Fi connected Roomba® robot vacuum and Braava jet® robot mop customers. The company is on a mission to design superior cleaning experiences that go beyond just smart. iRobot Genius 4.0 is the next step in transforming the smart home into a thoughtful home.

With the iRobot Genius 4.0 update and Imprint Smart Mapping, customers can now create customizable Smart Maps for their Roomba i3 and i3+ robot vacuums, enabling them to send their robot to clean specific rooms via the iRobot Home app or through their preferred voice assistant.

iRobot Genius 4.0 includes updates that provide powerful new functionality, like Imprint® Smart Mapping for Roomba i3 and i3+ customers, as well as customization and convenience features like Room-Specific Cleaning Preferences, Siri Shortcut Integration, Child & Pet Lock, and Do Not Disturb. The update also expands the list of objects the Roomba j Series can recognize and avoid to include clothing and towels.

The company also announced that the Roomba i3 and i3+ will be sold as the Roomba i3 EVO and Roomba i3+ EVO robot vacuums in the Americas moving forward. These Roomba i3 models automatically include the Imprint Smart Mapping update and are being sold at a new, lower retail price, starting at $349 USD for the Roomba i3 EVO and $549 USD for the Roomba i3+ EVO.

“The beauty of iRobot Genius is that our robots get smarter over time and continuously provide customers with new ways to clean where, when and how they want,” said Keith Hartsfield, chief product officer at iRobot. “As iRobot develops new features and experiences, the updates are pushed out to customers’ robots at no cost. From the day a customer welcomes a Roomba robot vacuum or Braava jet robot mop into their home, they know that they’ll always benefit from new features and functionality. They are also getting a robot that works harder for them, so they don’t have to. With more than 60 million personalized recommendations provided to customers to date, our robots are proven to learn, respect and work around individual schedules and needs.”

The iRobot Genius 4.0 update delivers these thoughtful mapping, voice and app features across iRobot’s Wi-Fi connected robot lineup:

  • Tell Roomba i3 and i3+ to Clean the Rooms You Want: With double the intelligence provided by Imprint Smart Mapping, customers can now create customizable Smart Maps for their Roomba i3 and i3+ robot vacuums, enabling them to send their robot to clean specific rooms via the iRobot Home app or through their preferred voice assistant. They can also now receive estimated cleaning times and create cleaning routines based on their preferred schedules, rooms and automations. This update is available to Roomba i3 and i3+ customers in the Americas and APAC now and is expected to be available in EMEA by the end of Q3 2022.
  • Clean Each Room the Way You Want: Everyone’s home is unique, with individual rooms varying in size, flooring, furniture and traffic level. That’s why users who own Imprint Smart Mapping-capable robots will have more control of how their robot cleans with Room-Specific Cleaning Preferences. Need your Roomba to take an extra pass in the entryway where shoes are kept but quickly clean other rooms? No problem. Looking for your Braava jet m6 to dispense more cleaning solution when tackling the kitchen, but not in the hallway? You’ll be able to do that too.
  • Use Siri to Clean – Everywhere: With approximately 600 supported Alexa® and Google Assistant® commands, robots that can be scheduled to clean specific rooms with your voice, and the only robots that can be told to clean specific areas like around the kitchen counter, iRobot is expanding its market-leading voice capabilities to include Siri. After all, you should have choices when it comes to verbally communicating with your robot, and with Siri Shortcut Integration, you’ll get just that. Like existing Alexa® or Google Assistant® shortcuts in the iRobot Home App, owners of Wi-Fi connected Roomba robot vacuums and Braava jet robot mops who use an iOS device will have the option to connect their robot to Siri in the iRobot Home App. Want to vacuum your whole home? Set up your own custom phrase, and just say “Hey Siri, ask Roomba to clean everywhere.”
  • No More Accidental Starts: Has Fido ever accidentally started your robot? Or maybe a curious child unintentionally found the robot’s “Clean” button? Problem solved with Child & Pet Lock via the iRobot Home App, an option that temporarily disables the physical “Clean” button on Wi-Fi connected Roomba and Braava jet robots. Once activated, the robots can only be controlled through the iRobot Home App, eliminating the need to quickly end an accidental start.
  • Never Be Distracted or Woken Up: Introducing iRobot’s NAP Commitment (Never Awake People or Pets). With the Do Not Disturb feature, customers can use the iRobot Home App to define windows of time in which the robot should not run, whether that be when someone is asleep or in a meeting. Do Not Disturb provides peace of mind that your robot will respect life’s quiet times. This feature has been rolled out to existing customers globally.
  • Less Clean-Up Before You Clean Up: Having already avoided 3 million objects in people’s homes since being introduced last year, the Roomba j7 and j7+ will also be able to detect and avoid clothing and towels left on the floor, letting customers feel confident the job will get done without needing to pick them up beforehand. These objects expand the visual vocabulary of the Roomba j7 and j7+, which already recognizes and avoids shoes, socks, cords, headphones, and pet waste. iRobot will continue enabling the Roomba j7 Series to identify and avoid even more objects that might prevent mission completion over time.

Availability:

All iRobot Genius 4.0 software updates will be rolling out globally to Wi-Fi connected Roomba and Braava jet customers through the end of June 2022 with the exception of Do Not Disturb, which is now available globally, and Imprint Smart Mapping for Roomba i3 and i3+, which is now available for customers in the Americas and APAC regions – and expected to be available for customers in EMEA by the end of Q3 2022.

The Roomba i3+ EVO robot vacuum with Clean Base® Automatic Dirt Disposal is available for purchase immediately in the U.S. and Canada starting at $549 USD on www.irobot.com and at select retailers. The Roomba i3 EVO robot vacuum can also be purchased without the Clean Base starting at $349 on www.irobot.com and at select retailers.

For more information or questions on robot software updates, please visit: https://homesupport.irobot.com/s/article/550.

The ReBeL of automation: a smart igus cobot for 4,970 euros

With the world’s first plastic cobot gearbox and a digital ecosystem, igus is accelerating low-cost automation – already handling 20 projects per week today

At its heart, a gearbox made of plastic: the ReBeL cobot is now available for 4,970 euros, along with a digital universe for low-cost integration within a few days. (Source: igus GmbH)

Cologne, March 16, 2022 – igus is now shipping the ReBeL service robot, including a smart version. At a price of 4,970 euros for the plug-and-play variant, and weighing only around 8 kilograms, customers receive one of the lightest cobots on the market. Digital services such as the RBTXpert and new online offerings enable customers to build complete automation solutions within a few days and at low cost.


With the ReBeL, igus draws fully on its motion plastics expertise: the use of plastic makes the robot, at 8.2 kilograms, the lightest service robot with cobot functionality in its class. All of the ReBeL’s mechanical components are developed and manufactured entirely by igus. Its payload is 2 kilograms, and it has a reach of 664 millimeters. Repeatability is +/- 1 millimeter at 7 picks per minute. At its heart is the world’s first industrial-grade cobot gearbox made of plastic. “Behind these figures are 1,041 tests in our in-house laboratory since 2019, in which we carried out tribological and thermodynamic tests on 15 material pairings and tolerance chains. A particularly big challenge was the heat generated in the fully integrated strain wave gears, which are thermally affected by the motor. During development, we therefore also focused on larger motors and better efficiency in order to significantly reduce heat build-up,” says Alexander Mühlens, head of the Low-Cost Automation business unit at igus. “This allowed us to achieve continuous improvements and, in the end, even quintuple the cycle count to two million. That corresponds to a typical service life of two years.”

Smart plastics – full operational transparency for preventive maintenance
igus has also brought its motion plastics know-how into the power electronics and, for the first time, developed an encoder using conductive plastic tracks. This allows rotation and cycle counts, runs, temperature and current to be measured precisely. Thanks to a cloud connection with webcam, a dashboard presents all generated data clearly and live. Customers thus gain full transparency over their ReBeL in operation, via key figures such as wear, cycle time and part counts.

Affordable complete solution, quickly integrated
The smart ReBeL is available in two variants: as an open-source version without robot controller, power supply and software for 3,900 euros (quantity 1), or as a plug-and-play variant with robot, control software and power supply for 4,970 euros (quantity 1). In line with the igus “build or buy” approach, the individual ReBeL strain wave gears are also available alongside the complete system, in diameters of 80 and 105 millimeters. Torque is 3 Nm (80) or 25 Nm (105) at 6 rpm, with a 50:1 gear ratio. The ReBeL is available on the RBTX online marketplace, where users can find individual components, integration support, and hardware and software from now more than 40 partners – all guaranteed 100 percent compatible with one another – including a wide range of robot kinematics, cameras, software, grippers, power electronics, motors, sensors and controllers. For integration via online consultation with a fixed-price guarantee, the RBTXpert is on hand: in a 400-square-meter customer testing area, experts advise customers daily via live video and send solution proposals within hours. Typical hardware costs start at 8,500 euros without integration, and complete solutions at 12,500 euros. “We can feel that we are making automation even more accessible: with our RBTXpert service we advise on more than 20 customer projects per week in Germany alone. That is why we are adding ten more online consultants to the service by the end of March. Internationally, the offering is already available in seven countries, with 14 more in preparation,” says Alexander Mühlens. “Out of these positive experiences, the many completed projects and the numerous customer conversations, an exciting ecosystem of further services is currently developing.”
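The quoted gear figures (25 Nm output at 6 rpm from the 105 mm gear, with a 50:1 reduction) imply a fast, low-torque motor side, which is worth a quick sanity check. In the sketch below, the gear efficiency is an assumed placeholder, not an igus specification:

```python
def motor_side(output_torque_nm, output_rpm, ratio=50, efficiency=0.7):
    """For a strain wave gear with reduction `ratio`, the motor turns `ratio`
    times faster than the output; the required motor torque is the output
    torque divided by the ratio and the (assumed) gear efficiency."""
    return output_rpm * ratio, output_torque_nm / (ratio * efficiency)

rpm, torque = motor_side(25, 6)     # the 105 mm ReBeL gear: 25 Nm at 6 rpm
print(rpm, round(torque, 3))        # 300 0.714
```

So the motor spins at roughly 300 rpm and only needs to supply well under 1 Nm, which is exactly the trade a high-reduction gear is there to make.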

A universe for low-cost automation
This low-cost automation universe revolves entirely around the customer’s individual application. The goal is to further simplify integration with new offerings and business models. “We will provide an app store in which low-cost automation vendors and independent software developers can contribute their software ideas. By building on existing software, users can realize their automation even faster. This makes it possible to connect the robots to digital services such as IFTTT or smart assistants such as Alexa or Siri. A visitor to a coffee bar, for example, could then order their favorite coffee by voice and have the robot pour it. This opens up entirely new business models such as pay-per-pick, in which users pay not for the robot but only for its task. These new possibilities will lastingly change the robotics market as well as everyday life,” says Mühlens. “We want to give them a home with the low-cost automation universe.”

https://www.igus.de/info/build-or-buy-serviceroboter-rebel

Draper Teaches Robots to Build Trust with Humans – new research

New study shows methods robots can use to self-assess their own performance

CAMBRIDGE, MASS. (PRWEB) MARCH 08, 2022

Establishing human-robot trust isn’t always easy. Beyond the fear of automation going rogue, robots simply don’t communicate how they are doing. When this happens, establishing a basis for humans to trust robots can be difficult.

Now, research is shedding light on how autonomous systems can foster human confidence in robots. Largely, the research suggests that humans have an easier time trusting a robot that offers some kind of self-assessment as it goes about its tasks, according to Aastha Acharya, a Draper Scholar and Ph.D. candidate at the University of Colorado Boulder.

Acharya said we need to start considering what communications are useful, particularly if we want to have humans trust and rely on their automated co-workers. “We can take cues from any effective workplace relationship, where the key to establishing trust is understanding co-workers’ capabilities and limitations,” she said. A gap in understanding can lead to improper tasking of the robot, and subsequent misuse, abuse or disuse of its autonomy.

To understand the problem, Acharya joined researchers from Draper and the University of Colorado Boulder to study how autonomous robots that use learned probabilistic world models can compute and express self-assessed competencies in the form of machine self-confidence. Probabilistic world models take into account the impact of uncertainties in events or actions in predicting the potential occurrence of future outcomes.

In the study, the world models were designed to enable the robots to forecast their behavior and report their own perspective about their tasking prior to task execution. With this information, a human can better judge whether a robot is sufficiently capable of completing a task, and adjust expectations to suit the situation.
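The article does not spell out how such a self-assessment is computed, but the core idea of forecasting behavior with a probabilistic world model can be sketched as a Monte Carlo rollout: simulate the mission many times and report the fraction of simulated runs that meet the success criterion. Everything below (the toy UAV model, its probabilities, the reward threshold) is invented for illustration and is not the paper’s deep-reinforcement-learning method:

```python
import random

class ToyUAVModel:
    """Toy stochastic world model: each step, a data collection succeeds with
    probability p_collect and an adversary detection occurs with p_detect."""
    horizon = 20

    def __init__(self, p_collect=0.3, p_detect=0.02, seed=0):
        self.p_collect, self.p_detect = p_collect, p_detect
        self.rng = random.Random(seed)

    def reset(self):
        return {"collected": 0, "detected": False}

    def step(self, state, action):
        reward = 0.0
        if self.rng.random() < self.p_collect:   # successful data collection
            state["collected"] += 1
            reward += 1.0
        if self.rng.random() < self.p_detect:    # detected by the adversary
            state["detected"] = True
            reward -= 10.0
        return state, reward

def self_confidence(model, policy, n_rollouts=2000, threshold=3.0):
    """Fraction of simulated rollouts whose return meets the task threshold,
    reported to the operator before execution as machine self-confidence."""
    successes = 0
    for _ in range(n_rollouts):
        state, ret = model.reset(), 0.0
        for _ in range(model.horizon):
            state, reward = model.step(state, policy(state))
            ret += reward
        successes += ret >= threshold
    return successes / n_rollouts

conf = self_confidence(ToyUAVModel(), policy=lambda s: "orbit_tower")
print(f"self-assessed competency: {conf:.2f}")
```

A human operator could compare this pre-mission number against the stakes of the task and decide whether to retask the vehicle, which is exactly the kind of expectation-setting the study describes.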

To demonstrate their method, researchers developed and tested a probabilistic world model on a simulated intelligence, surveillance and reconnaissance mission for an autonomous uncrewed aerial vehicle (UAV). The UAV flew over a field populated by a radio tower, an airstrip and mountains. The mission was designed to collect data from the tower while avoiding detection by an adversary. The UAV was asked to consider factors such as detections, collections, battery life and environmental conditions to understand its task competency.

Findings were reported in the article “Generalizing Competency Self-Assessment for Autonomous Vehicles Using Deep Reinforcement Learning,” where the team addressed several important questions. How do we encourage appropriate human trust in an autonomous system? How do we know that self-assessed capabilities of the autonomous system are accurate?

Human-machine collaboration lies at the core of a wide spectrum of algorithmic strategies for generating soft assurances, which are collectively aimed at trust management, according to the paper. “Humans must be able to establish a basis for correctly using and relying on robotic autonomy for success,” the authors said. The team behind the paper includes Acharya’s advisors Rebecca Russell, Ph.D., from Draper and Nisar Ahmed, Ph.D., from the University of Colorado Boulder.

The research into autonomous self-assessment is based upon work supported by DARPA’s Competency-Aware Machine Learning (CAML) program.

In addition, funds for this study were provided by the Draper Scholar Program. The program gives graduate students the opportunity to conduct their thesis research under the supervision of both a faculty adviser and a member of Draper’s technical staff, in an area of mutual interest. Draper Scholars’ graduate degree tuition and stipends are funded by Draper.

Since 1973, the Draper Scholar Program, formerly known as the Draper Fellow Program, has supported more than 1,000 graduate students pursuing advanced degrees in engineering and the sciences. Draper Scholars are from both civilian and military backgrounds, and Draper Scholar alumni excel worldwide in the technical, corporate, government, academic, and entrepreneurship sectors.

Draper

At Draper, we believe exciting things happen when new capabilities are imagined and created. Whether formulating a concept and developing each component to achieve a field-ready prototype, or combining existing technologies in new ways, Draper engineers apply multidisciplinary approaches that deliver new capabilities to customers. As a nonprofit engineering innovation company, Draper focuses on the design, development and deployment of advanced technological solutions for the world’s most challenging and important problems. We provide engineering solutions directly to government, industry and academia; work on teams as prime contractor or subcontractor; and participate as a collaborator in consortia. We provide unbiased assessments of technology or systems designed or recommended by other organizations—custom designed, as well as commercial-off-the-shelf. Visit Draper at http://www.draper.com.

Maicat, the Cybernetic Companion Cat

Macroact, a personal-robotics development lab operating out of South Korea, has released its first AI-based companion pet. Designed for education and entertainment, Maicat is now live on Kickstarter after years of design and testing.

CAPABLE – Ready to use straight out of the box, Maicat is an autonomous robot pet. Using its sensors, Maicat is capable of detecting obstacles and walking around the house on its own. With its laser range finder and gyroscope, it is able to adjust for thick carpets and door frames.

CARING – Maicat has facial, voice-pattern and emotional recognition software. When paired with its AI learning algorithm, Maicat is able to identify its owners and react to their moods.

CONNECTED – Integrated IoT connectivity allows you to add Maicat’s sensors and capabilities to your existing home network. The Maicat SDK will allow the creation of apps that let Maicat talk to most modern IoT devices.

CREATIVE – Maicat is an excellent platform for getting students interested in STEM topics. With an app and the Maicat SDK, students can study AI, programming, robotics, facial recognition… the list goes on and on.

CELEBRATED – Maicat was a CES 2022 Innovation Award nominee for its IoT integration and support. That’s more than you can say for most other pets.

CUDDLY – Maicat is small and light enough to pick up and pet. Sensors within its body let Maicat know it’s being petted, and Maicat will respond lovingly.

To learn more about the Maicat project, check out the promotional links below.

Meet Maicat 

Maicat Kickstarter 

About Macroact Inc. 

Macroact is an AI and robotics startup that develops machine learning solutions for adaptive robots. The company focuses on applying artificial intelligence throughout the whole robot development process, to reduce the time and cost of robot development and to enhance robots’ ability to learn. Its core technology is Maidynamics, an autonomous robot control solution. Maicat is its first adaptive robot.

obode Announces Launch of the P8: Next-Generation Robot Vacuum for Multi-Surface Floor Cleaning

Robotics experts obode have just announced the launch of a next-generation smart robot vacuum/mop for automated floor cleaning throughout the entire home. Featuring voice control, a customizable app, and LDS intelligent navigation, this innovative new cleaning robot adds powerful cleaning and modern convenience to any home. The obode P8 is available now: https://www.kickstarter.com/projects/obodep8/obode-p8-2-in-1-smart-self-cleaning-cleaner

The obode P8 is the ultimate all-in-one floor-cleaning solution, with three modes for sweeping, vacuuming, mopping, and combination cleaning. Equipped with double-spin mops and a heavy-duty vacuum motor with 2,000 Pa of suction power, the system easily picks up debris and hair from both hard floors and carpets. The P8 uses an advanced ultrasonic sensor to determine surface type and apply the proper cleaning, sweeping or mopping, as needed. It moves seamlessly across the room and intelligently switches between sweeping and mopping for safe, effective cleaning of any floor type.

“Many people have made the move to robotic vacuum cleaners for home convenience. However, over the past few years, technologies such as robotic navigation and surface sensors have greatly advanced. For the P8, we applied these next-gen technologies to create the ultimate robot vacuum cleaner with mopping functions. The result is the most advanced multi-surface floor-cleaning device, with superior mapping, intelligent surface identification and multi-mode cleaning for all household floors. The P8 cleans intelligently and features programmable functions that take the hassle out of household chores. It efficiently and thoroughly keeps your floors clean so that you don’t have to. It’s the perfect addition to the modern home,” said the founder of obode.

The P8 uses the latest in advanced LDS navigation and multi-layer mapping with an intelligent algorithm to avoid obstacles and barriers as it determines the most efficient and effective cleaning route through the home. With a built-in 6200 mAh battery, the P8 is capable of up to 2.5 hours of continuous cleaning, enough to do the entire house, before it automatically returns to base for recharging. For superior cleaning, the P8 has a backwashing mopping cloth to prevent secondary smudging, and after each cleaning session it automatically returns to the base, where the mop is washed and hot-dried to disinfect it.

Convenient control of the P8 is achieved with voice commands via Alexa or Google Home, and the system has an intelligent app for scheduling, customized cleaning, setting no-go zones, and ‘do not disturb’ modes. The app updates automatically over the air and provides total control at the touch of a button.

The obode P8 combines the latest intelligent cleaning features with next-generation robotic tech for the modern home. P8 is available now for pre-sale with special deals and pricing for early adopters: https://www.kickstarter.com/projects/obodep8/obode-p8-2-in-1-smart-self-cleaning-cleaner

Creality’s new budget 3D scanner, the CR-Scan Lizard, is about to hit Kickstarter

Creality has lifted the veil on its latest 3D scanner. In an effort to further diversify its 3D cosmos, Creality, the well-known manufacturer of 3D-printing-community favorites such as the Ender 3, has announced its new and improved 3D scanner: the CR-Scan Lizard.

This entry-level consumer 3D scanner follows the company’s CR-Scan 01, which was released only a short time ago as an affordable option for digitizing objects. The new Lizard is smaller for better portability and handling, but promises improved features such as accuracy up to 0.05 mm and better handling of bright environments and dark objects. All that for less money than its predecessor, even.

With the Lizard, you can scan small or large objects with ease. The CR Studio software does the heavy lifting of optimizing models and even sends those files via the Creality Cloud directly to your 3D printer. The applications seem almost endless.

With some early-bird specials, the CR-Scan Lizard made its debut on Kickstarter in February 2022 and, unsurprisingly, smashed its campaign goal in next to no time.

We have gathered all the information revealed so far about this new consumer-grade 3D scanner to give you an overview of what the Lizard has in store. Creality has also already sent us a scanner to try for ourselves, so keep an eye out for our upcoming hands-on experience.


Features

HIGH ACCURACY

With the CR-Scan Lizard, Creality wants to bring professional-grade accuracy to the budget market. According to its spec sheet, the scanner has an accuracy of up to 0.05 mm, allowing it to capture small parts and intricate details with high precision. Thanks to the scanner’s binocular cameras and improved precision calibration, Creality says it can pick up rich detail from objects as small as 15 x 15 x 15 mm, or as large as car doors, engines, rear bumpers, and so on.

SCAN MODES

The CR-Scan Lizard comes with three different scanning modes. You can either use it in turntable mode, handheld mode, or a mixture of the two to scan an object.

Turntable mode is suitable for objects of 15 – 300 mm and scans automatically. For larger objects up to 500 mm, you can use the combination mode, where you put the object on the rotary table but hold the scanner in your hand to scan. Lastly, handheld mode is suitable for scanning large objects up to 2 meters in size, such as the car parts mentioned above.
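Those three size ranges map naturally onto a small selection helper. `pick_scan_mode` below is a hypothetical illustration that simply encodes the ranges above; it is not part of Creality’s CR Studio software:

```python
def pick_scan_mode(longest_dimension_mm):
    """Map an object's longest dimension to one of the Lizard's three modes,
    using the size ranges stated above (hypothetical helper, not Creality code)."""
    if longest_dimension_mm < 15:
        raise ValueError("below the stated 15 mm minimum object size")
    if longest_dimension_mm <= 300:
        return "turntable"      # object rotates on the table, scan is automatic
    if longest_dimension_mm <= 500:
        return "combination"    # object on the rotary table, scanner held by hand
    if longest_dimension_mm <= 2000:
        return "handheld"       # walk around large objects such as car parts
    raise ValueError("above the stated 2 m handheld maximum")

print(pick_scan_mode(250))      # turntable
print(pick_scan_mode(1800))     # handheld
```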

Plus, thanks to its visual tracking, the Lizard doesn’t need markers to work. You can scan objects without having to pin a bunch of stickers to them first — its software’s tracking algorithm will take care of that for you.

LIGHT OR DARK

Besides its scan modes, the Lizard also offers some improved scanning functions that should make it easier for users to achieve good results with minimal effort.

For one, Creality states the Lizard can scan accurately in sunlight. 3D scanners typically struggle with too much direct light, forcing users to scan in a darkened room for best results. However, Creality claims the Lizard, thanks to its multi-spectral optical technology, maintains excellent performance even in bright sunlight, which would vastly widen its field of application. The scanner can also be powered by a portable charger, so, in theory, you could go out there and scan the woods to your heart’s content.

What’s more, the CR-Scan Lizard promises better material adaptability when scanning black and dark objects. Sounds like it’s got it all.

COLOR MAPPING

Creality has stated that it is planning to release a fully automated color-mapping texture suite in March 2022 that promises true color fidelity for your scanned objects, but it is currently still in development. Once released, you can make use of the mapping process, where high-definition color pictures of the model, taken with a phone or DSLR camera, are automatically mapped onto the 3D model, allowing you to create high-quality, vivid color scans.

CR STUDIO

The Lizard’s accompanying software, CR Studio, promises many features that should help to achieve clean scans. For example, the software features one-click model optimization, multi-positional auto alignment, auto noise removal, topology simplification, texture mapping, and much more.

You can also upload and share models via the Creality Cloud, allowing you to slice your scanned objects and even send them to a 3D printer — all with the click of a button.

Release Date & Availability

Creality has set up a limited pre-order via Kickstarter. The scanner has been available for backing since February 10, 2022, alongside some early bird batch sales. According to the Kickstarter campaign, shipping will take place in April.

Over the past days and weeks, Creality has already released a couple of videos on its YouTube channel showing off the scanner’s features in greater detail. Be sure to check those out if the Lizard tickles your fancy.

Creality has also already sent All3DP a CR-Scan Lizard to try out, so we are looking forward to giving it a spin in the next few days. Stay tuned for a full review of our hands-on experience.

At the time of writing, the CR-Scan Lizard is available via Kickstarter with super early bird pledges, priced from $300 for the most basic Lizard package and reaching $400 for the luxury version, which comes with a color kit.

According to the campaign, the off-the-shelf price for the Lizard will be $599 for its base version. So, there are potentially some bucks to be saved if you get in early. However, it wouldn’t be the first time that quoted crowdfunding prices eventually changed.

Here are the technical specifications for the Creality CR-Scan Lizard 3D scanner:

GENERAL SPECIFICATIONS

  • Precision: 0.05 mm
  • Resolution ratio: 0.1 – 0.2 mm
  • Single capture range: 200 x 100 mm
  • Operating Distance: 150 – 400 mm
  • Scanning Speed: 10 fps
  • Tracking mode: Visual tracking
  • Light: LED+NIR (Near-infrared mode)
  • Splicing Mode: Fully automatic geometry and visual tracking (without marker)

OUTPUT

  • Output Format: STL, OBJ, PLY
  • Compatible System: Windows 10 64-bit (macOS version to be released in March 2022)

COMMON SPECIFICATIONS

  • Machine Size: 155 x 84 x 46 mm
  • Machine Weight: 370 g

https://www.kickstarter.com/projects/3dprintmill/creality-cr-scan-lizard-capturing-fine-details-of-view

Starship Launches Grocery Delivery Service in Bay Area

On-demand robot delivery now available in Pleasanton, CA at Lucky California flagship store

SAN FRANCISCO (February 2022) – Starship Technologies, the world’s leading provider of autonomous delivery services, is now delivering groceries in the San Francisco Bay Area. Starship is expanding its partnership with The Save Mart Companies for the exclusive launch of an on-demand grocery delivery service at its Lucky California flagship store in Pleasanton, CA. Lucky is the first grocery store in the San Francisco Bay Area to partner with Starship.

Starship and The Save Mart Companies first partnered in September 2020, when the Save Mart flagship store in Modesto became the first grocery store in the U.S. to offer Starship robot delivery service. Since its launch, that store has expanded its delivery area to serve over 55,000 households. In Pleasanton, the service is launching to thousands of residents, with the delivery area expected to grow rapidly in the coming months, similar to Modesto. 

“We are very pleased to bring the benefits of autonomous delivery to Pleasanton, in partnership with Lucky California,” said Ryan Tuohy, SVP of Sales and Business Development at Starship Technologies. “Since launching our service in Modesto in 2020, we’ve been excited to see the extremely positive reaction to the robots and how they were embraced as part of the local community. We think the residents of Pleasanton will appreciate the convenience and positive environmental impact of autonomous delivery and we fully expect the service area to quickly expand to more households.”

The robots, each of which can carry up to 20 pounds of groceries – the equivalent of about three shopping bags – provide a convenient, energy-efficient, and low-cost delivery alternative to driving to the Lucky California store, allowing shoppers to browse thousands of items via the secure Starship app for on-demand delivery straight to their home.

The robots travel autonomously – crossing streets, climbing curbs and traversing sidewalks – to provide on-demand delivery to shoppers. They often become local celebrities as community members share their robot selfies and “love notes” on social media. 

“Since the debut of our contactless delivery service at the Save Mart flagship store, feedback from the Modesto community has been incredibly positive,” said Barbara Walker, senior vice president and chief marketing officer for The Save Mart Companies. “We are thrilled to expand this service to Lucky California in Pleasanton and offer a safe and efficient grocery delivery solution, along with some joyful entertainment, especially as the service area progressively expands over time.”

The Starship Food Delivery app is available for download on iOS and Android. To get started, customers choose from a range of their favorite groceries and drop a pin where they want their delivery to be sent. When an order is submitted, Lucky California team members gather the delivery items and carefully place them in a clean robot. Every robot’s interior and exterior is sanitized before each order. The customer can then watch as the robot makes its journey to them, via an interactive map. Once the robot arrives, the customer receives an alert, and can then meet the robot and unlock it through the app.
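The ordering flow described above is strictly sequential, so it can be sketched as a simple state machine. The state names and transition helper below are an illustration of that flow, not Starship's actual API.

```python
from enum import Enum, auto

class DeliveryState(Enum):
    ORDER_SUBMITTED = auto()
    ROBOT_LOADED = auto()    # items placed in a sanitized robot
    EN_ROUTE = auto()        # customer watches on the interactive map
    ARRIVED = auto()         # customer receives an alert
    UNLOCKED = auto()        # customer unlocks the robot via the app

# Transitions are strictly sequential: each step must follow the previous one.
ORDER_FLOW = list(DeliveryState)

def next_state(current: DeliveryState) -> DeliveryState:
    i = ORDER_FLOW.index(current)
    if i == len(ORDER_FLOW) - 1:
        raise ValueError("delivery already complete")
    return ORDER_FLOW[i + 1]

state = DeliveryState.ORDER_SUBMITTED
while state is not DeliveryState.UNLOCKED:
    state = next_state(state)
print(state.name)  # UNLOCKED
```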

Starship already offers its services in many parts of the EU, UK and US, in cities and on university and industrial campuses, with further expansion planned for the near future. Starship can make L4 deliveries everywhere it operates, across entire cities and campuses, and its robots have been operating at L4 since 2018. On a daily basis, Starship robots complete numerous deliveries in a row 100% autonomously, including road crossings. As a result, the cost of a Starship delivery is now lower than the human equivalent, which is believed to be a world first for any robot delivery company, while most others are still majority human-controlled and in pilot mode.

Starship Technologies operates commercially on a daily basis around the world. Its zero-emission robots make more than 100,000 road crossings every day and have completed more than 2.5 million commercial deliveries and travelled more than 3 million miles (5 million+ kms) globally, more than any other autonomous delivery provider.

Further development of IDS NXT ocean: focus on user-friendliness and AI transparency

All-in-one embedded vision platform with new tools and functions

(PresseBox) (Obersulm) At IDS, image processing with artificial intelligence means more than AI running directly on cameras, with users also gaining enormous design options through vision apps. With the IDS NXT ocean embedded vision platform, customers receive all the necessary, coordinated tools and workflows to realise their own AI vision applications without prior knowledge and to run them directly on the IDS NXT industrial cameras. The next free software update for the AI package is now available. Besides user-friendliness, it focuses on making the artificial intelligence clear and comprehensible for the user.

An all-in-one system such as IDS NXT ocean, which has integrated computing power and artificial intelligence thanks to the “deep ocean core” developed by IDS, is ideally suited for entry into AI vision. It requires no prior knowledge of deep learning or camera programming. The current software update makes setting up, deploying and controlling the intelligent cameras in the IDS NXT cockpit even easier. Among other things, it integrates an ROI editor with which users can freely draw the image areas to be evaluated and configure, save and reuse them as custom grids with many parameters. In addition, the new Attention Maps and Confusion Matrix tools illustrate how the AI in the cameras works and what decisions it makes. This makes the process more transparent and enables users to evaluate the quality of a trained neural network and to improve it through targeted retraining. Data security also plays an important role in the industrial use of artificial intelligence. As of the current update, communication between IDS NXT cameras and system components can therefore be encrypted via HTTPS.
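A confusion matrix, as now exposed in the IDS NXT cockpit, cross-tabulates true versus predicted classes so that misclassifications stand out at a glance. This minimal sketch in plain Python, with hypothetical inspection labels, shows what such a table contains; it is not IDS code.

```python
from collections import Counter

# Hypothetical inspection results as (true label, predicted label) pairs.
results = [
    ("ok", "ok"), ("ok", "ok"), ("ok", "defect"),
    ("defect", "defect"), ("defect", "ok"), ("defect", "defect"),
]

labels = ["ok", "defect"]
counts = Counter(results)  # counts[(true, predicted)]

# Rows = true class, columns = predicted class.
print("true\\pred " + "  ".join(f"{l:>7}" for l in labels))
for t in labels:
    row = "  ".join(f"{counts[(t, p)]:>7}" for p in labels)
    print(f"{t:>9} {row}")
```

Off-diagonal cells (here, one "ok" predicted as "defect" and one "defect" predicted as "ok") are exactly the cases targeted retraining should focus on.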

Just get started with the IDS NXT ocean Creative Kit

Anyone who wants to test the industrial-grade embedded vision platform IDS NXT ocean and evaluate its potential for their own applications should take a look at the IDS NXT ocean Creative Kit. It provides customers with all the components they need to create, train and run a neural network. In addition to an IDS NXT industrial camera with 1.6 MP Sony sensor, lens, cable and tripod adapter, the package includes six months’ access to the AI training software IDS NXT lighthouse. Currently, IDS is offering the set in a special promotion at particularly favourable conditions. Promotion page: https://en.ids-imaging.com/ids-nxt-ocean-creative-kit.html.

Learn more: www.ids-nxt.com
