Web-based VEXcode EXP

VEXcode EXP is now available in a web-based version for Chrome browsers. The web-based version can be reached by navigating to codeexp.vex.com and contains all of the features and functionality of VEXcode EXP, but without the need to download or install anything! The new web-based version of VEXcode makes it easier for teachers and students to access projects from anywhere, at any time, on any device – including Chromebooks!

In addition to the built-in Help and Tutorials, the STEM Library contains additional resources and support for using web-based VEXcode EXP. Within the STEM Library you can find device-specific articles for connecting to web-based VEXcode EXP, loading and saving projects, updating firmware, and more. View the VEXcode EXP section of the STEM Library to learn more.

Web-based versions of VEXcode IQ and VEXcode V5 are in the works and will be available soon.

The ReBeL of automation: smart igus cobot for 4,970 euros

With the world’s first plastic cobot gearbox and a digital ecosystem, igus is accelerating low-cost automation – already handling 20 projects per week today

At its heart, a gearbox made of plastic: the ReBeL cobot is now available for 4,970 euros, together with a digital universe for low-cost integration within a few days. (Source: igus GmbH)

Cologne, March 16, 2022 – igus is now shipping the ReBeL service robot, including a smart version. At a price of 4,970 euros for the plug-and-play variant and a weight of only around 8 kilograms, customers get one of the lightest cobots on the market. Digital services such as the RBTXpert and new online offerings enable customers to put together complete automation solutions within a few days and at low cost.


With the ReBeL, igus relies fully on its motion plastics expertise: the use of plastic makes the robot, at 8.2 kilograms, the lightest service robot with cobot functionality in its class. All mechanical components that make up the ReBeL are developed and manufactured entirely by igus. Its payload is 2 kilograms and it has a reach of 664 millimetres. The repeatability is +/- 1 millimetre at 7 picks per minute. At its heart is the world’s first industrial-grade plastic cobot gearbox. “Behind these figures are 1,041 tests in our in-house laboratory since 2019, in which we carried out tribological and thermodynamic tests on 15 material pairings and tolerance chains. A particularly big challenge was the heat generated in the fully integrated strain wave gears, which are thermally affected by the motor. During development, we therefore additionally focused on larger motors and better efficiency in order to significantly reduce heat generation,” says Alexander Mühlens, head of the Low-Cost Automation business unit at igus. “This allowed us to achieve continuous improvements and, in the end, even quintuple the cycle count to two million. That corresponds to a typical service life of two years.”
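As a rough plausibility check, the quoted figures can be related to each other: at the stated 7 picks per minute, two million cycles do correspond to roughly two years of use. The single-shift duty cycle assumed below is our own illustrative assumption, not an igus figure.

```python
# Relating the quoted 2,000,000 gearbox cycles to the stated ~2-year service life.
# The single-shift duty cycle is an assumption for illustration, not an igus figure.

PICKS_PER_MINUTE = 7        # quoted repeatability test rate
HOURS_PER_DAY = 8           # assumed single-shift operation
DAYS_PER_YEAR = 250         # assumed working days per year
RATED_CYCLES = 2_000_000    # quoted gearbox cycle count

cycles_per_year = PICKS_PER_MINUTE * 60 * HOURS_PER_DAY * DAYS_PER_YEAR
print(f"{cycles_per_year:,} cycles/year -> {RATED_CYCLES / cycles_per_year:.1f} years")
# 840,000 cycles/year -> 2.4 years, in line with the quoted service life of two years
```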

Smart plastics – full operational transparency for preventive maintenance
igus has also brought its motion plastics know-how into the power electronics and, for the first time, developed an encoder based on conductive plastic tracks. This allows rotation and cycle counts, runs, temperature and current to be measured precisely. Thanks to a cloud connection with a webcam, a dashboard displays all generated data clearly and live. Customers thus gain full transparency on their ReBeL in operation, via key figures such as wear, cycle time and piece counts.

An affordable complete solution, quickly integrated
The smart ReBeL is available in two variants: as an open-source version without robot controller, power supply and software for 3,900 euros at quantity 1, or as a plug-and-play variant with robot, control software and power supply for 4,970 euros at quantity 1. In line with the igus “build or buy” approach, customers can also buy the individual ReBeL strain wave gears, in diameters of 80 and 105 millimetres, in addition to the complete system. The torque is 3 Nm (80) and 25 Nm (105) respectively at 6 rpm, with a gear ratio of 50:1. The ReBeL is available on the RBTX online marketplace, where users will find individual components, integration support, and hardware and software from now more than 40 partners – with the assurance that everything is 100 percent compatible; this includes a wide range of robot kinematics, cameras, software, grippers, power electronics, motors, sensors and controllers. For integration via online consultation with a fixed-price guarantee, the RBTXpert is available to customers: on a 400-square-metre customer testing area, experts advise customers daily via live video and send solution offers within hours. Typical hardware costs without integration start at 8,500 euros, and complete solutions at 12,500 euros. “We can tell that we are making automation even more accessible, as our RBTXpert service advises on more than 20 customer projects per week in Germany alone. That is why we are expanding the service by ten additional online consultants by the end of March. Internationally, the offering is already available in seven countries, with a further 14 in preparation,” says Alexander Mühlens. “Out of these positive experiences, the many completed projects and the numerous customer conversations, an exciting ecosystem of further services is currently taking shape.”

A universe for low-cost automation
In this low-cost automation universe, everything revolves around the individual customer application. The goal is to simplify integration even further with new offerings and business models. “We will provide an app store in which low-cost automation vendors and independent software developers can contribute their software ideas. By building on existing software, users can implement their automation even faster. This makes it possible, for example, to connect the robots to digital services such as IFTTT or to smart assistants such as Alexa or Siri. A visitor to a coffee bar could then order their favourite coffee by voice and have the robot pour it. Entirely new business models emerge from this, such as pay-per-pick, in which users pay not for the robot but only for its task. These new possibilities will lastingly change the robotics market as well as everyday life,” says Mühlens. “We want to give them a home with the low-cost automation universe.”

https://www.igus.de/info/build-or-buy-serviceroboter-rebel

Draper Teaches Robots to Build Trust with Humans – new research

New study shows methods robots can use to self-assess their own performance

CAMBRIDGE, MASS. (PRWEB) MARCH 08, 2022

Establishing human-robot trust isn’t always easy. Beyond the fear of automation going rogue, robots simply don’t communicate how they are doing, and without that feedback it can be difficult for humans to establish a basis for trusting them.

Now, research is shedding light on how autonomous systems can foster human confidence in robots. Largely, the research suggests that humans have an easier time trusting a robot that offers some kind of self-assessment as it goes about its tasks, according to Aastha Acharya, a Draper Scholar and Ph.D. candidate at the University of Colorado Boulder.

Acharya said we need to start considering what communications are useful, particularly if we want to have humans trust and rely on their automated co-workers. “We can take cues from any effective workplace relationship, where the key to establishing trust is understanding co-workers’ capabilities and limitations,” she said. A gap in understanding can lead to improper tasking of the robot, and subsequent misuse, abuse or disuse of its autonomy.

To understand the problem, Acharya joined researchers from Draper and the University of Colorado Boulder to study how autonomous robots that use learned probabilistic world models can compute and express self-assessed competencies in the form of machine self-confidence. Probabilistic world models take into account the impact of uncertainties in events or actions in predicting the potential occurrence of future outcomes.

In the study, the world models were designed to enable the robots to forecast their behavior and report their own perspective about their tasking prior to task execution. With this information, a human can better judge whether a robot is sufficiently capable of completing a task, and adjust expectations to suit the situation.
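The paper’s exact formulation is not reproduced here, but the core idea – forecasting outcomes with a learned probabilistic world model and reporting the result as a confidence score before execution – can be sketched roughly as follows. The `sample_rollout` placeholder, the success threshold and the rollout count are illustrative assumptions, not the authors’ implementation.

```python
import random

def sample_rollout(world_model, policy, task):
    """Simulate one possible mission outcome under the probabilistic world model
    and return its cumulative reward. Placeholder: a real model would roll the
    policy forward through sampled transitions (detections, battery, weather)."""
    return world_model(policy, task)

def self_assessed_competency(world_model, policy, task,
                             success_threshold=0.0, n_rollouts=1000):
    """Forecast behaviour prior to execution: the fraction of sampled futures that
    clear the threshold is reported as a self-confidence score the human operator
    can use to judge whether the robot is sufficiently capable of the task."""
    successes = sum(
        sample_rollout(world_model, policy, task) >= success_threshold
        for _ in range(n_rollouts)
    )
    return successes / n_rollouts

# Toy usage with a dummy world model whose outcomes are noisy around a mean reward.
toy_model = lambda policy, task: random.gauss(0.5, 1.0)
print(f"Self-assessed competency: {self_assessed_competency(toy_model, None, 'ISR'):.2f}")
```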

To demonstrate their method, researchers developed and tested a probabilistic world model on a simulated intelligence, surveillance and reconnaissance mission for an autonomous uncrewed aerial vehicle (UAV). The UAV flew over a field populated by a radio tower, an airstrip and mountains. The mission was designed to collect data from the tower while avoiding detection by an adversary. The UAV was asked to consider factors such as detections, collections, battery life and environmental conditions to understand its task competency.

Findings were reported in the article “Generalizing Competency Self-Assessment for Autonomous Vehicles Using Deep Reinforcement Learning,” where the team addressed several important questions. How do we encourage appropriate human trust in an autonomous system? How do we know that self-assessed capabilities of the autonomous system are accurate?

Human-machine collaboration lies at the core of a wide spectrum of algorithmic strategies for generating soft assurances, which are collectively aimed at trust management, according to the paper. “Humans must be able to establish a basis for correctly using and relying on robotic autonomy for success,” the authors said. The team behind the paper includes Acharya’s advisors Rebecca Russell, Ph.D., from Draper and Nisar Ahmed, Ph.D., from the University of Colorado Boulder.

The research into autonomous self-assessment is based upon work supported by DARPA’s Competency-Aware Machine Learning (CAML) program.

In addition, funds for this study were provided by the Draper Scholar Program. The program gives graduate students the opportunity to conduct their thesis research under the supervision of both a faculty adviser and a member of Draper’s technical staff, in an area of mutual interest. Draper Scholars’ graduate degree tuition and stipends are funded by Draper.

Since 1973, the Draper Scholar Program, formerly known as the Draper Fellow Program, has supported more than 1,000 graduate students pursuing advanced degrees in engineering and the sciences. Draper Scholars are from both civilian and military backgrounds, and Draper Scholar alumni excel worldwide in the technical, corporate, government, academic, and entrepreneurship sectors.

Draper

At Draper, we believe exciting things happen when new capabilities are imagined and created. Whether formulating a concept and developing each component to achieve a field-ready prototype, or combining existing technologies in new ways, Draper engineers apply multidisciplinary approaches that deliver new capabilities to customers. As a nonprofit engineering innovation company, Draper focuses on the design, development and deployment of advanced technological solutions for the world’s most challenging and important problems. We provide engineering solutions directly to government, industry and academia; work on teams as prime contractor or subcontractor; and participate as a collaborator in consortia. We provide unbiased assessments of technology or systems designed or recommended by other organizations—custom designed, as well as commercial-off-the-shelf. Visit Draper at http://www.draper.com.

Maicat, the Cybernetic Companion Cat

Macroact, the personal robotics development lab operating out of South Korea, has released its first AI-based companion pet. Designed for education and entertainment, Maicat is now live on Kickstarter after years of design and testing.

CAPABLE – Ready to use straight out of the box, Maicat is an autonomous robot pet. Using its sensors, Maicat is capable of detecting obstacles and walking around the house on its own. With its laser range finder and gyroscope, it is able to adjust for thick carpets and door frames.

CARING – Maicat has facial, voice pattern and emotional recognition software. When paired with the AI learning algorithm, Maicat is able to identify its owners and react to their moods.

CONNECTED – Integrated IoT connectivity allows you to add Maicat’s sensors and capabilities to your existing home network. The Maicat SDK will allow the creation of apps that let Maicat talk to most modern IoT devices.

CREATIVE – Maicat is an excellent platform for getting students interested in STEM topics. With an app and the Maicat SDK, students can study AI, programming, robotics, facial recognition…the list goes on and on.

CELEBRATED – Maicat was a CES 2022 Innovation Award nominee for its IoT integration and support. That’s more than you can say for most other pets.

CUDDLY – Maicat is small and light enough to pick up and pet. Sensors within its body let Maicat know it’s being petted, and Maicat will respond lovingly.

To learn more about the Maicat project, check out the promotional links below.

Meet Maicat 

Maicat Kickstarter 

About Macroact Inc. 

Macroact is an AI and robotics startup that develops machine learning solutions for adaptive robots. The company focuses on applying artificial intelligence throughout the entire robot development process to reduce the time and cost of robot development and to enhance robots’ ability to learn. Its core technology is Maidynamics, an autonomous robot control solution. Maicat is its first adaptive robot.

Robothon® – The Grand Challenge 2022 // Call for Teams

Dear Robothon® Community!

We, the Munich Institute of Robotics and Machine Intelligence (MIRMI) of the Technical University of Munich (TUM), in collaboration with Messe München and automatica, successfully launched a new high-tech platform called munich_i in 2021, an event bringing together the world’s leading thought leaders and personalities from AI and robotics.

munich_i will take place again at the next automatica, June 21-24, 2022 in Munich. Robothon®, the international competition for developing robot manipulation skills, will therefore also go into its second round!

Robothon® – The Grand Challenge Series focuses on pressing and unsolved challenges of our time. In 2021 it was held digitally in the run-up to the automatica sprint, with 9 international teams and a renowned Grand Challenge Jury. As a highlight, it ended with the Award Ceremony on June 22, 2021, with 4 winning teams, total prize money of €22,500, great recognition, and an expansion of our community.

Are you a motivated robotics enthusiast looking for new challenges?

CALL FOR TEAMS is open until March 31, 2022!!

Apply HERE!

KEY FACTS:

  • Robothon® will once again be held digitally from April 29 to June 1, 2022
  • Special highlight: the Award Ceremony will take place on-site on June 21, 2022, during automatica at the Messe München!

HOW IT WORKS: 

  • Robothon® will again focus on single-arm robot manipulation
  • The Grand Challenge 2022: disassembly and sorting of e-waste
  • The competition is free of charge 
  • Up to 20 selected teams can participate (2-4 members) 
  • All roboticists (academic and young professionals) are encouraged to apply
  • Teams will need to provide their own robot to complete the challenge remotely
  • Each team will receive an internet connected competition task board by mail
  • The processing period of 1 month starts from receipt of the competition scorecard 
  • Team performances will be evaluated by the Grand Challenge Jury 
  • Prize money awaits the finalists!

HAVEN’T SIGNED UP YET? Apply as a team by March 31, 2022, and visit our website www.robothon-grand-challenge.com to learn more.

Know someone who should participate? Please help spread the word!

Feel free to email us with any questions at [email protected].

With kind regards,

The Robothon® Team

Barbara Schilling & Peter So (Technical Leader)

obode Announces Launch of the P8: Next-Generation Robot Vacuum for Multi-Surface Floor Cleaning

Robotics experts obode just announced the launch of a next-generation smart robot vacuum/mop for automated floor cleaning throughout the entire home. Featuring voice control, a customizable app, and LDS intelligent navigation, this innovative new cleaning robot adds powerful cleaning and modern convenience to any home. The obode P8 is available now: https://www.kickstarter.com/projects/obodep8/obode-p8-2-in-1-smart-self-cleaning-cleaner

The obode P8 is the ultimate all-in-one floor cleaning solution, with three modes for sweeping, vacuuming, mopping, and combination cleaning. Equipped with double-spin mops and a heavy-duty vacuum motor with 2,000 Pa of suction power, the system easily picks up debris and hair from both hard floors and carpets. P8 uses an advanced ultrasonic sensor to determine surface types and apply the proper cleaning, sweeping or mopping as needed. It moves seamlessly across the room and intelligently switches between sweeping and mopping for safe, effective cleaning of any floor type.

“Many people have made the move to robotic vacuum cleaners for home convenience. However, over the past few years, technologies such as robotic navigation and surface sensors have greatly advanced. For P8, we applied these next-gen technologies to create the ultimate robot vacuum cleaner with mopping functions. The result is the most advanced multi-surface floor cleaning device, with superior mapping, intelligent surface identification and multi-mode cleaning for all household floors. P8 cleans intelligently and features programmable functions that take the hassle out of household chores. It keeps your floors clean efficiently and thoroughly so that you don’t have to. It’s the perfect addition to the modern home,” said the founder of obode.

P8 uses the latest in advanced LDS navigation and multi-layer mapping with an intelligent algorithm to avoid obstacles and barriers as it determines the most efficient and effective cleaning route through the home. With a built-in 6200 mAh battery, P8 is capable of up to 2.5 hours of continuous cleaning, enough to cover the entire house, before it automatically returns to base for recharging. For superior cleaning, P8 backwashes its mopping cloth to prevent secondary smudging and automatically returns to the base to clean and hot-dry the mop, disinfecting it after each cleaning session.

Convenient control of the P8 is achieved with voice commands via Alexa or Google Home and the system has an intelligent app for scheduling, customized cleaning, setting no-go zones, and ‘Do-not-disturb’ modes. The app updates automatically OTA and provides total control at the touch of a button. 

The obode P8 combines the latest intelligent cleaning features with next-generation robotic tech for the modern home. P8 is available now for pre-sale with special deals and pricing for early adopters: https://www.kickstarter.com/projects/obodep8/obode-p8-2-in-1-smart-self-cleaning-cleaner

Creality’s new budget 3D scanner, the CR-Scan Lizard, is about to hit Kickstarter

Creality has lifted the veil over its latest 3D scanner. In an effort to further diversify its 3D cosmos, Creality, the well-known manufacturer of 3D printing-community-favorites such as the Ender 3, has announced its new and improved 3D scanner: the CR-Scan Lizard.

This entry-level 3D scanner for consumers follows the company’s CR-Scan 01 — which was released a fairly short time ago as an affordable option for users to digitalize objects. The new Lizard is smaller in size for better portability and feel but promises improved features such as accuracy up to 0.05 mm, and better handling of bright environments and dark objects. All that for less money than its predecessor, even.

With the Lizard, you can scan small or large objects with ease. The CR Studio software does the heavy lifting of optimizing models and even sends those files via the Creality Cloud directly to your 3D printer. The applications seem almost endless.

With some early bird specials on offer, the CR-Scan Lizard made its debut on Kickstarter in February 2022 and, unsurprisingly, smashed its campaign goal in next to no time.

We have gathered all the information revealed so far about this new consumer-grade 3D scanner to give you an overview of what the Lizard has in store. Creality has also already sent us a scanner to try for ourselves, so keep an eye out for our upcoming hands-on experience.


Features

HIGH ACCURACY

With the CR-Scan Lizard, Creality wants to bring professional-grade accuracy to the budget market. According to its spec sheet, the scanner has an accuracy of up to 0.05 mm, allowing it to capture small parts and intricate details with high precision. Thanks to the scanner’s binocular camera design and improved precision calibration, Creality says it can pick up rich detail from objects as small as 15 x 15 x 15 mm, or as large as car doors, engines, rear bumpers, and so on.

SCAN MODES

The CR-Scan Lizard comes with three different scanning modes. You can either use it in turntable mode, handheld mode, or a mixture of the two to scan an object.

Turntable mode is suitable for 15 – 300 mm objects and will scan automatically. You can use the combination mode for larger objects up to 500 mm, where you put the object on the rotary table but hold the scanner in hand to scan. Lastly, its handheld mode is suitable for scanning large objects up to 2 meters in size, such as the car parts mentioned above.
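The size-based recommendations above can be summarised in a few lines; the thresholds follow the numbers quoted here, while the function itself is only an illustrative sketch and not part of Creality’s software.

```python
def recommended_scan_mode(longest_dimension_mm: float) -> str:
    """Map an object's longest dimension to the scan mode suggested above."""
    if 15 <= longest_dimension_mm <= 300:
        return "turntable"      # small parts, scanned fully automatically
    if longest_dimension_mm <= 500:
        return "combination"    # object on the rotary table, scanner held in hand
    if longest_dimension_mm <= 2000:
        return "handheld"       # large objects such as car doors or bumpers
    return "outside the documented range"

for size_mm in (50, 400, 1500):
    print(f"{size_mm} mm -> {recommended_scan_mode(size_mm)}")
```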

Plus, thanks to its visual tracking, the Lizard doesn’t need markers to work. You can scan objects without having to pin a bunch of stickers to them first — its software’s tracking algorithm will take care of that for you.

LIGHT OR DARK

Besides its scan modes, the Lizard also offers some improved scanning functions that should make it easier for users to achieve good results with minimal effort.

For one, Creality states the Lizard can scan accurately in sunlight. 3D scanners typically struggle with too much direct light, forcing users to scan in a darkened room for best results. However, Creality claims the Lizard, thanks to its multi-spectral optical technology, maintains excellent performance even in bright sunlight, which would vastly improve its field of application. The scanner can also be powered by a portable charger, so, in theory, you could go out there and scan the woods to your heart’s content.

What’s more, the CR-Scan Lizard promises better material adaptability when scanning black and dark objects. Sounds like it’s got it all.

COLOR MAPPING

Creality has stated that it is planning to release a fully automated color mapping texture suite in March 2022 that promises true color fidelity for your scanned objects, but it’s currently still in development. Once released, you can make use of the mapping process, in which high-definition color pictures of the model taken with a phone or DSLR camera are automatically mapped onto the 3D model, allowing you to create high-quality, vivid color scans.

CR STUDIO

The Lizard’s accompanying software, CR Studio, promises many features that should help achieve clean scans. For example, the software offers one-click model optimization, multi-positional auto alignment, auto noise removal, topology simplification, texture mapping, and much more.

You can also upload and share models via the Creality Cloud, allowing you to slice your scanned objects and even send them to a 3D printer — all with the click of a button.

Release Date & Availability

Creality has set up a limited pre-order via Kickstarter. The scanner has been available for backing since February 10, 2022, alongside some early bird batch sales. According to the Kickstarter campaign, shipping will take place in April.

Over the past days and weeks, Creality has already released a couple of videos on its YouTube channel showing off the scanner’s features in greater detail. Be sure to check those out if the Lizard tickles your fancy.

Creality has also already sent All3DP a CR-Scan Lizard to try out, so we are looking forward to giving it a spin in the next few days. Stay tuned for a full review of our hands-on experience.

At the time of writing, the CR-Scan Lizard is available via Kickstarter with super early bird pledges, priced from $300 for the most basic Lizard package and reaching $400 for the luxury version that already comes with a color kit.

According to the campaign, the off-the-shelf price for the Lizard will be $599 for its base version. So, there are potentially some bucks to be saved if you get in early. However, it wouldn’t be the first time that listed prices eventually changed.

Here are the technical specifications for the Creality CR-Scan Lizard 3D scanner:

GENERAL SPECIFICATIONS

  • Precision: 0.05 mm
  • Resolution ratio: 0.1 – 0.2 mm
  • Single capture range: 200 x 100 mm
  • Operating Distance: 150 – 400 mm
  • Scanning Speed: 10 fps
  • Tracking mode: Visual tracking
  • Light: LED+NIR (Near-infrared mode)
  • Splicing Mode: Fully automatic geometry and visual tracking (without marker)

OUTPUT

  • Output Format: STL, OBJ, PLY
  • Compatible System: Windows 10 64-bit (macOS version to be released in March 2022)

COMMON SPECIFICATIONS

  • Machine Size: 155 x 84 x 46 mm
  • Machine Weight: 370 g

https://www.kickstarter.com/projects/3dprintmill/creality-cr-scan-lizard-capturing-fine-details-of-view

Starship Launches Grocery Delivery Service in Bay Area

On-demand robot delivery now available in Pleasanton, CA at Lucky California flagship store

SAN FRANCISCO (February 2022) – Starship Technologies, the world’s leading provider of autonomous delivery services, is now delivering groceries in the San Francisco Bay Area. Starship is expanding its partnership with The Save Mart Companies for the exclusive launch of an on-demand grocery delivery service at its Lucky California flagship store in Pleasanton, CA. Lucky is the first grocery store in the San Francisco Bay Area to partner with Starship.

Starship and The Save Mart Companies first partnered in September 2020, when the Save Mart flagship store in Modesto became the first grocery store in the U.S. to offer Starship robot delivery service. Since its launch, that store has expanded its delivery area to serve over 55,000 households. In Pleasanton, the service is launching to thousands of residents, with the delivery area expected to grow rapidly in the coming months, similar to Modesto. 

“We are very pleased to bring the benefits of autonomous delivery to Pleasanton, in partnership with Lucky California,” said Ryan Tuohy, SVP of Sales and Business Development at Starship Technologies. “Since launching our service in Modesto in 2020, we’ve been excited to see the extremely positive reaction to the robots and how they were embraced as part of the local community. We think the residents of Pleasanton will appreciate the convenience and positive environmental impact of autonomous delivery and we fully expect the service area to quickly expand to more households.”

The robots, each of which can carry up to 20 pounds of groceries – the equivalent of about three shopping bags – provide a convenient, energy-efficient, and low-cost delivery alternative to driving to the Lucky California store, allowing shoppers to browse thousands of items via the secure Starship app for on-demand delivery straight to their home.

The robots travel autonomously – crossing streets, climbing curbs and traversing sidewalks – to provide on-demand delivery to shoppers. They often become local celebrities as community members share their robot selfies and “love notes” on social media. 

“Since the debut of our contactless delivery service at the Save Mart flagship store, feedback from the Modesto community has been incredibly positive,” said Barbara Walker, senior vice president and chief marketing officer for The Save Mart Companies. “We are thrilled to expand this service to Lucky California in Pleasanton and offer a safe and efficient grocery delivery solution, along with some joyful entertainment, especially as the service area progressively expands over time.”

The Starship Food Delivery app is available for download on iOS and Android. To get started, customers choose from a range of their favorite groceries and drop a pin where they want their delivery to be sent. When an order is submitted, Lucky California team members gather the delivery items and carefully place them in a clean robot. Every robot’s interior and exterior is sanitized before each order. The customer can then watch as the robot makes its journey to them, via an interactive map. Once the robot arrives, the customer receives an alert, and can then meet the robot and unlock it through the app.
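The ordering flow described above amounts to a simple sequence of delivery states. The sketch below is purely illustrative of the customer experience as described and is not Starship’s actual software.

```python
from enum import Enum, auto

class DeliveryState(Enum):
    ORDER_PLACED = auto()   # customer picks groceries and drops a pin in the app
    PACKING = auto()        # store team loads the items into a sanitised robot
    EN_ROUTE = auto()       # robot travels autonomously, trackable on the live map
    ARRIVED = auto()        # customer receives an alert that the robot is waiting
    UNLOCKED = auto()       # customer unlocks the robot through the app

# The happy path simply walks through the states in order.
for state in DeliveryState:
    print(state.name)
```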

Starship already offers its services in many parts of the EU, UK and US, in cities as well as on university and industrial campuses, with further expansion planned in the near future. Starship is able to do L4 deliveries everywhere it operates – entire cities and campuses – and its robots have been operating at L4 since 2018. On a daily basis, Starship robots complete numerous deliveries in a row 100% autonomously, including road crossings. As a result, the cost of a Starship delivery is now lower than the human equivalent, which is believed to be a world first for any robot delivery company; most others are still majority human-controlled and in pilot mode.

Starship Technologies operates commercially on a daily basis around the world. Its zero-emission robots make more than 100,000 road crossings every day and have completed more than 2.5 million commercial deliveries and travelled more than 3 million miles (5 million+ kms) globally, more than any other autonomous delivery provider.

Further development of IDS NXT ocean: focus on user-friendliness and AI transparency

All-in-one embedded vision platform with new tools and functions

(PresseBox) (Obersulm) At IDS, image processing with artificial intelligence does not just mean that AI runs directly on the cameras and that users have enormous design options through vision apps. Rather, with the IDS NXT ocean embedded vision platform, customers receive all the necessary, coordinated tools and workflows to realise their own AI vision applications without prior knowledge and to run them directly on the IDS NXT industrial cameras. The next free software update for the AI package now follows. In addition to user-friendliness, the focus is on making the artificial intelligence clear and comprehensible for the user.

An all-in-one system such as IDS NXT ocean, which has integrated computing power and artificial intelligence thanks to the “deep ocean core” developed by IDS, is ideally suited as an entry point into AI vision. It requires no prior knowledge of deep learning or camera programming. The current software update makes setting up, deploying and controlling the intelligent cameras in the IDS NXT cockpit even easier. Among other things, it integrates an ROI editor with which users can freely draw the image areas to be evaluated and then configure, save and reuse them as custom grids with many parameters. In addition, the new Attention Maps and Confusion Matrix tools illustrate how the AI in the cameras works and what decisions it makes. This clarifies the process and enables users to evaluate the quality of a trained neural network and to improve it through targeted retraining. Data security also plays an important role in the industrial use of artificial intelligence. As of the current update, communication between IDS NXT cameras and system components can therefore be encrypted via HTTPS.
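To illustrate why a confusion matrix helps evaluate a trained network (a generic sketch, not the IDS NXT cockpit implementation), consider tallying predictions against ground-truth labels for a small, hypothetical inspection task:

```python
from collections import Counter

def confusion_matrix(y_true, y_pred, labels):
    """Count how often each true class (rows) is predicted as each class (columns);
    large off-diagonal entries reveal which classes the network confuses."""
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(t, p)] for p in labels] for t in labels]

labels = ["OK", "scratch", "dent"]   # hypothetical inspection classes
y_true = ["OK", "OK", "scratch", "dent", "scratch", "OK"]
y_pred = ["OK", "scratch", "scratch", "dent", "OK", "OK"]

for label, row in zip(labels, confusion_matrix(y_true, y_pred, labels)):
    print(f"{label:>8}: {row}")
# If many 'scratch' samples end up in the 'OK' column, targeted retraining with
# additional scratch examples would be the obvious next step.
```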

Just get started with the IDS NXT ocean Creative Kit

Anyone who wants to test the industrial-grade embedded vision platform IDS NXT ocean and evaluate its potential for their own applications should take a look at the IDS NXT ocean Creative Kit. It provides customers with all the components they need to create, train and run a neural network. In addition to an IDS NXT industrial camera with a 1.6 MP Sony sensor, lens, cable and tripod adapter, the package includes six months’ access to the AI training software IDS NXT lighthouse. Currently, IDS is offering the set in a special promotion at particularly favourable conditions. Promotion page: https://en.ids-imaging.com/ids-nxt-ocean-creative-kit.html.

Learn more: www.ids-nxt.com