QUADRUPED A1 – Four-legged robot combines artificial intelligence and sophisticated motion sequences

The newly founded company QUADRUPED Robotics is the first and currently the only German company to bring fully modifiable multi-legged robots to the European market. This form of robot is a novelty: the four-legged machines combine artificial intelligence with new motion sequences and individually customizable equipment.
The A1 robot in the QUADRUPED line is based on the Robot Operating System (ROS.org) and can thus be adapted to its environment and requirements. However, even the basic equipment enables a wide range of applications.

By means of an AI-controlled, depth-sensing smart camera, HD recordings can be transmitted to a terminal device in real time. At the same time, the integrated multi-eye camera offers real-time tracking of objects in sight, gesture recognition, and target tracking triggered by specific movement patterns.

The basis for building an environment map is visual SLAM: QUADRUPED A1 computes paths, obstacles, routes, and navigation points, enabling vision-based autonomous obstacle avoidance. The A1 also recognizes obstacle shapes and adjusts its body position accordingly. If an impact or fall does occur, an advanced dynamic balancing algorithm quickly restores balance. Additional measurement data, and more dynamic behavior, can be achieved by integrating further sensor technology such as a 3D LiDAR or additional camera modules.
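The path computation described above can be illustrated with a toy grid planner. This is a generic A* sketch on an occupancy grid, not QUADRUPED's actual SLAM or navigation stack; the grid, costs, and heuristic are invented for illustration.

```python
from heapq import heappush, heappop

def plan_path(grid, start, goal):
    """A* on a 2D occupancy grid: grid[r][c] == 1 marks an obstacle.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heappop(open_set)
        if cell in came_from:          # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:               # walk parents back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None

# A wall across the middle row forces a detour through the gap at column 3.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
```

A real robot would replan continuously as the SLAM map updates; the structure of the search stays the same.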

The QUADRUPED A1 incorporates a unique, patented sensitive foot contact. Each of the four feet can be controlled individually, and the smart actuators provide precise footing as well as different gaits. The system is based on a low-level controller developed by QUADRUPED Robotics that can read out position, torque, and current consumption at any time. The foot tip is waterproof and dustproof and can easily be replaced when worn.
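The per-actuator readout described above (position, torque, current for each leg) might be modeled as follows. This is a hypothetical sketch: the class, method, and field names are invented and do not reflect QUADRUPED Robotics' real low-level API.

```python
from dataclasses import dataclass

@dataclass
class JointState:
    # One record per smart actuator; field names are illustrative only.
    position_rad: float   # joint angle
    torque_nm: float      # measured output torque
    current_a: float      # motor current draw

class LowLevelControl:
    """Hypothetical sketch of a low-level interface like the one described:
    each of the four feet is addressed individually ('FR', 'FL', 'RR', 'RL')."""
    def __init__(self):
        self._states = {leg: JointState(0.0, 0.0, 0.0)
                        for leg in ("FR", "FL", "RR", "RL")}

    def read_state(self, leg: str) -> JointState:
        return self._states[leg]

    def overloaded(self, leg: str, max_current_a: float = 8.0) -> bool:
        # Current draw is a cheap proxy for a stuck or overloaded joint.
        return self.read_state(leg).current_a > max_current_a

ctrl = LowLevelControl()
state = ctrl.read_state("FR")
```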
The A1 impresses with its most recently measured top speed of 11.8 km/h (3.3 m/s), unique for a robot of this type, and can carry payloads of up to 5 kg.

For simplified maintenance, the robot was designed with a stable, lightweight body structure. The A1 has an external 24 V power input and 5 V / 12 V / 19 V supply rails, which enable the use of additional external devices. Further external interfaces include four USB, two HDMI, and two Ethernet ports.

It is equipped with a powerful redundant control system: a low-level controller for CAN communication with the smart actuators, and an NVIDIA Xavier for computation and evaluation of measurement data. The current runtime of approx. 1.5 hours varies with the application.
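CAN communication with actuators typically means packing commands into fixed 8-byte frames. The sketch below shows the idea with Python's `struct`; the field layout (joint id, target position, feed-forward torque, sequence counter) is invented for illustration and is not QUADRUPED's actual wire format.

```python
import struct

# Hypothetical 8-byte CAN payload for one actuator command:
# uint16 joint id, int16 target position (millirad),
# int16 feed-forward torque (mNm), uint16 sequence counter.
# Little-endian, as "<" indicates. Layout is illustrative only.
FMT = "<HhhH"

def encode_command(joint_id, position_mrad, torque_mnm, seq):
    """Pack one actuator command into a CAN-sized payload."""
    return struct.pack(FMT, joint_id, position_mrad, torque_mnm, seq)

def decode_command(payload):
    """Unpack a payload back into its four fields."""
    return struct.unpack(FMT, payload)

frame = encode_command(3, -1570, 250, 42)
```

On the robot, such a payload would be handed to the CAN driver at a fixed control rate; here only the (de)serialization is shown.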

Additional equipment is available from QUADRUPED Robotics and can be delivered with software packages already implemented on request. Thanks to in-house research and development, the end customer receives a finished, tested product. A further service is the complete documentation provided at www.docs.quadruped.de, where complete simulation environments based on Webots & Gazebo are also available for download and can be used for application testing.

QUADRUPED Robotics is a spin-off of MYBOTSHOP uG, an established sales and development partner in the fields of robotics, sensor technology, and automation technology. Company founder Daniel Kottlarz sees in four-legged autonomous robots the opportunity to relieve humans in particularly dangerous areas of operation and to defuse hazardous situations.

flatcat, the creepiest robot of all time, is on Kickstarter for only seven more days

Either you already have one, or you soon will. Robot pets are conquering consumer markets worldwide in the form of baby seals, puppies, or a tail-wagging cushion. Now they are getting company from a run-over cat.

(lifePR) (Berlin, May 14, 2021) flatcat was dubbed by Gizmodo ((https://gizmodo.com/…)) “the creepiest robot you’ve ever seen”, and for some that may indeed be true. For many others, it is an admittedly strange but cute robot pet they want to hug and play with.

The first few flatcats are available now, and for only seven more days, on Kickstarter ((https://www.kickstarter.com/…)), the most popular crowdfunding site. The campaign is close to being fully funded, but it still needs a few decisive pledges from robot enthusiasts near and far who want to make a difference.

The robot, developed and manufactured by Jetpack Cognition Lab, a Berlin-based company with roots in Graz, is a robot of a new kind, completely unlike any comparable product on the market. What makes it unique is its sensorimotor competence: it senses, and reacts to, the forces of its own movement and the forces applied from outside, by people or simply by gravity.

The ability to sense forces directly in its joints allows flatcat to be curious and to explore its own body and the world in the safest possible way. The technology comes from the research field of developmental robotics, which translates aspects of animal and human development into software and algorithms.
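A tiny compliance rule conveys the idea of joint-level force sensing: move toward a target posture, but yield when an external force pushes back. This is a toy sketch in the spirit of what is described, not flatcat's actual control code; all constants are invented.

```python
def compliant_step(target, position, external_torque,
                   stiffness=0.5, yield_threshold=0.3):
    """One update of a toy compliant joint controller.
    Constants and scaling are illustrative only."""
    if abs(external_torque) > yield_threshold:
        # Give way in the direction of the applied force instead of fighting it.
        return position + 0.1 * external_torque
    # Otherwise ease toward the target posture.
    return position + stiffness * (target - position)

# An unloaded joint drifts toward its target posture...
free = compliant_step(target=1.0, position=0.0, external_torque=0.0)
# ...but a firm push makes it yield instead.
pushed = compliant_step(target=1.0, position=0.0, external_torque=-1.0)
```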

Possible uses for flatcat include: as a pet in the living room, simply to play and explore the world of sensorimotor experience and movement together; as a therapeutic robot, to gently stimulate simple movements and provide company and comfort; or as a desktop research robot for scientists and hackers alike, since in addition to its state-of-the-art sensorimotor sensitivity it is open source, extensible, and modifiable.

Jetpack Cognition Lab, Inc

Since its beginnings in 2019, Jetpack Cognition Lab has been bringing radical innovations from scientific research to the consumer market. The lab's founders are Dr. Oswald Berthold and Matthias Kubisch. They met during their studies at Humboldt-Universität zu Berlin and joined forces to build the weirdest and most fun robots in the world.

Berthold is an Austrian artist and technologist, born in Graz, who made music history with the collective farmersmanual by introducing novel styles and innovative approaches to digital music production and release in the internet age. Since completing his doctorate in robotics in the Adaptive Systems Group at HU Berlin in 2018, he has been busy turning fundamental research into customer value.

Kubisch is a German computer scientist, creative, and activist. He was a key member of the team that developed the modular humanoid robot Myon in the ALEAR project led by Dr. Manfred Hild. He has also seen industry from the inside, developing control algorithms for electrical power plants. He is not only an expert in adaptive real-time algorithms and machine learning, but also a brilliant electronics designer and product visionary.

Artificial Intelligence Platform Ludo Revolutionizes Games Creation

Ludo AI, available now in open beta, gives developers access to the world’s first AI platform for games concept creation – accelerating and democratizing games creation



Seattle, USA. AI (artificial intelligence) games creativity platform Ludo has announced its open beta, following a highly successful closed beta that attracted participation from independent studios across the globe. Games creators tasked with delivering the next hit game, emulating the success of the likes of Call of Duty, Among Us, Fortnite, and Fall Guys, now have an answer in Ludo – the world’s first AI games ideation tool.

Ludo, Latin for ‘I play’, uses machine learning and natural language processing to develop game concepts 24 hours a day, and the platform is constantly learning and evolving. Built on a database of close to a million games, Ludo is agile and supremely intelligent: when asked for a new game idea via intuitive keyword searches, it returns almost immediately with multiple written game concepts, artwork, and images that developers can rapidly take to the next stage (concept presentation, MVP, or accelerated soft launch).
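A keyword-driven ideation query of the kind described could look something like the sketch below. Ludo's real API is not documented in this text, so the endpoint-free payload builder, field names, and defaults here are all hypothetical.

```python
import json

def build_concept_request(keywords, genre=None, max_results=5):
    """Assemble a keyword-search payload for a game-ideation service.
    Field names ("keywords", "return", etc.) are invented for illustration;
    the press release only says Ludo works from intuitive keyword searches
    and returns written concepts plus artwork."""
    payload = {
        # Normalize the user's keywords: trim, lowercase, deduplicate.
        "keywords": sorted(set(k.strip().lower() for k in keywords)),
        "max_results": max_results,
        "return": ["concept_text", "artwork", "similar_games"],
    }
    if genre:
        payload["genre"] = genre
    return json.dumps(payload)

req = build_concept_request(["battle royale", "Cooking "], genre="casual")
```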

AI has never before been used at the start of the games creation process. In a 159.3 billion* dollar industry, the pressure to release new hit games is relentless, and coming up with exciting, sticky games is the Holy Grail. Ludo is set to revolutionize game creation by arming developers with unique game concepts within minutes of their request being processed. And because Ludo’s powerful capabilities are within the reach of a studio of any size, the creation process has been democratized.

Games publishers and developers must deliver hit new games at pace. The industry landscape is changing as it grows in value: large, acquisitive publishers are constantly on the lookout for growing independents with great new games and creative ideas to absorb, as they in turn need to deliver value to their stakeholders.

“Creativity is the new currency in the games industry,” said Tom Pigott, CEO of JetPlay, Ludo’s creator. “The next hit game could be worth millions and you never know where it will spring up from. With Ludo anyone can come up with a great new game idea without having to waste hours on the process and then invest even more time in researching what is already out there and how successful any similar games have been. Ludo does it all for you: Ludo brings the playfulness back into the game creation process, increases the probability of coming up with a great new game, and saves time and money.”

Since the global pandemic, the games industry has seen exponential growth and is estimated to be worth $200 billion by 2023. Every developer is under pressure to create a viable pipeline, and with so many ways of testing games quickly (a large percentage being rejected before they get through the gates), the appetite for new game ideas and concepts is at an all-time high.

Ludo, the brainchild of seasoned entrepreneur Tom Pigott, CEO of JetPlay, has been created by a small, outstanding global team of AI Ph.D.s. The new open beta follows a highly successful closed program that saw a select group of studios harness the creative power of AI. Now, with an open beta, games developers can try the platform free of charge for a trial period.

“We’ve been extremely pleased by the feedback and the usage of our platform by the game makers that were part of the closed beta,” said Pigott. “AI, when used as part of the creative process, delivers great results. It is easy to use, working intuitively with keyword searches, and those involved in our closed beta have already proved that amazing things can be done, and all without detracting from their development or marketing time. Very soon Ludo will become an integral part of every studio’s games ideation process.”

The Ludo open beta program offers an opportunity to enjoy all the benefits of early adoption, giving a head start on a mobile game creation approach that works. Due to the tremendous interest there is a waitlist: those interested in joining the Ludo open beta can apply or find out more here.

Boston Dynamics expands Spot® product line

NEW SELF-CHARGING ENTERPRISE ROBOT, REMOTE OPERATION SOFTWARE, AND ROBOT ARM ENHANCE SPOT’S CAPABILITIES FOR AUTONOMOUS SITE MONITORING


Waltham, MA – February 2, 2021 – Boston Dynamics, the global leader in mobile robotics, today announced an expanded product line for its agile mobile robot Spot. The new products include the self-charging Spot Enterprise, the web-based remote operations software Scout, and the Spot Arm. These additions extend Spot’s ability to perform autonomous, remote inspections and data collection, and enable the robot to perform manual tasks.

With more than 400 Spots out in the world, the robot has successfully undertaken hazardous tasks in a variety of inhospitable environments such as nuclear plants, offshore oil fields, construction sites, and mines. Customers have leveraged Spot’s advanced mobility, autonomy, control, and customizability to improve operational efficiency, enhance worker safety, and gather critical data. Spot’s new products are designed to enable customers to fully operationalize continuous, autonomous data collection on remote or hazardous worksites of any size, from anywhere they have access to their network.

Autonomy is critical to enhancing Spot’s value. In order to support long, remote deployments, Boston Dynamics is introducing Spot Enterprise, a new version of Spot that comes equipped with self-charging capabilities and a dock, allowing it to perform longer inspection tasks and data collection missions with little to no human interaction. In addition to the basic capabilities that the base Spot robot offers, Spot Enterprise leverages upgraded hardware for improved safety, communications, and behavior in remote environments. These upgrades expand the range that autonomous missions can cover, extend WiFi support, add flexibility to Spot’s payload ports, and enable users to quickly offload large data sets collected during the robot’s mission.
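The self-charging pattern described (run long missions, return to the dock with little to no human interaction) can be sketched as a simple scheduling loop. This is a toy illustration of the behavior, not Boston Dynamics' software; the battery numbers and reserve policy are invented.

```python
def run_mission(waypoints, battery_pct, drain_per_leg=4.0, reserve=20.0):
    """Toy sketch of a self-charging inspection mission: walk the route,
    but head for the dock before the battery falls below a reserve.
    All numbers are invented for illustration."""
    visited, log = [], []
    for wp in waypoints:
        if battery_pct - drain_per_leg < reserve:
            log.append("return-to-dock")   # recharge, resume the mission later
            break
        battery_pct -= drain_per_leg
        visited.append(wp)
        log.append(f"inspected {wp}")
    else:
        log.append("mission-complete")
    return visited, battery_pct, log

# Starting at 30% battery, the robot inspects two assets, then docks.
visited, remaining, log = run_mission(
    ["pump-1", "gauge-7", "valve-3"], battery_pct=30.0)
```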

Pivotal to refining Spot’s value at scale is remote operation. Scout is Boston Dynamics’ web-based software that enables operators to control their fleet of Spots from a virtual control room. Operators can use Scout to take Spot anywhere a person could go on-site, allowing them to inspect critical equipment or hazardous areas from afar. The software is designed with a simple user interface to run pre-programmed autonomous missions or manually control the robot, to perform various tasks such as walking or posing the robot to capture images and thermal data of obscured gauges or pipes using the Spot CAM+IR thermal imaging payload.

Combined, the Spot Enterprise robot equipped with a Spot CAM+IR thermal imaging payload, Scout software, and Boston Dynamics’ premium support now create an out-of-the-box solution for asset-intensive environments. Operators can deploy this solution on site to proactively maintain and manage assets while maximizing worker uptime and improving worker safety.

In addition to launching products designed to make remote inspection safer and easier, Boston Dynamics is also releasing the Spot Arm, which enables users to act on data insights and perform physical work in human-centric environments. The arm is equipped to operate through both semi-autonomous actions and telemanipulation. It can manually or semi-autonomously grasp, lift, carry, place, and drag a wide variety of objects. It is also capable of manipulating objects with constrained movement and can open and close valves, pull levers and turn handles and knobs in coordination with its body to open standard push and pull doors.

“Since first launching Spot, we have worked closely with our customers to identify how the robot could best support their mission critical applications,” said Robert Playter, CEO of Boston Dynamics. “Our customers want reliable data collection in remote, hazardous, and dynamic worksites. We developed the new Spot products with these needs in mind, and with the goal of making it easy to regularly and remotely perform critical inspections, improving safety and operations.”

Interested parties can purchase Spot Enterprise, Scout, and the Spot Arm via Boston Dynamics’ sales team. For more information on these new offerings, please visit: www.bostondynamics.com.



About Boston Dynamics

Boston Dynamics is the global leader in developing and deploying highly mobile robots capable of tackling the toughest robotics challenges. Our core mission is to lead the creation and delivery of robots with advanced mobility, dexterity and intelligence that add value in unstructured or hard-to-traverse spaces and positively impact society. We create high-performance robots equipped with perception, navigation and intelligence by combining the principles of dynamic control and balance with sophisticated mechanical designs, cutting-edge electronics and next-generation software. We have three mobile robots in our portfolio – Spot®, Handle™ and Atlas® – as well as Pick™, a computer vision-based robotics solution for logistics. Founded in 1992, Boston Dynamics spun out of the MIT Leg Lab and is one of Inc. Magazine’s Best Workplaces of 2020. For more information on our company and its technologies, please visit www.bostondynamics.com.

Tech Vision: How Boston Dynamics Built The Most Advanced Robot

The following video was uploaded to YouTube by Tech Vision. I think it has all the information you need about Boston Dynamics’ robots and is therefore a must-see.

Blaize Delivers First Open and Code-free AI Software Platform Spanning the Entire Edge AI Application Lifecycle


EL DORADO HILLS, CA — December 2020 — Blaize today fully unveiled Blaize AI Studio, the industry’s first open and code-free software platform to span the complete edge AI operational workflow from idea to development, deployment, and management. AI Studio dramatically reduces edge AI application deployment complexity, time, and cost by breaking down the barriers within existing application development and machine learning operations (MLOps) infrastructure that hinder edge AI deployments. By eliminating the complexity of integrating disparate tools and workflows, and by introducing multiple ease-of-use and intelligence features, AI Studio cuts the time required to go from models to deployed production applications from months to days.



“While AI applications are migrating to the Edge with growth projected to outpace that of the Data Center, Edge AI deployments today are complicated by a lack of tools for application development and MLOps,” says Dinakar Munagala, Co-founder and CEO, Blaize. “AI Studio was born of the insights to this problem gained in our earliest POC edge AI hardware customer engagements, as we recognized the need and opportunity for a new class of AI software platform to address the complete end-to-end edge AI operational workflow.”



“AI Studio is open and highly optimized for the AI development landscape that exists across heterogeneous ecosystems at the edge,” says Dmitry Zakharchenko, VP Research & Development, Blaize. “With the AI automation benefits of a truly modern user experience interface, AI Studio serves the unique needs in customers’ edge use cases for ease of application development, deployment, and management, as well as broad usability by both developers and domain expert non-developers.”



The combination of AI Studio’s innovations in user interface, collaborative Marketplaces, end-to-end application development, and operational management bridges the operational chasm hindering edge AI ROI. Deployed with Blaize AI edge computing hardware, which addresses unserved edge hardware needs, AI Studio makes AI more practical and economical for edge use cases where unmet application development and MLOps needs delay the pace of production deployment.



“In our work for clients, which may include developing models for quality inspection within manufacturing, identifying stress markers to improve drug trials or even predicting high resolution depth for autonomous vehicles, it is vital that businesses can build unique AI applications that prove their ideas quickly,” says Tim Ensor, Director of AI, Cambridge Consultants. “AI Studio offers innovators the means to achieve this confidence in rapid timeframes, which is a really exciting prospect.” Cambridge Consultants, part of Capgemini Group, helps the world’s biggest brands and most ambitious businesses innovate in AI, including those within the Blaize ecosystem.

Code-free assistive UI for more users, more productivity
The AI Studio code-free visual interface is intuitive for a broad range of skill levels beyond AI data scientists, who are a scarce and costly resource for many organizations. “Hey Blaize” summons a contextually intelligent assistant with an expert, knowledge-driven recommendation system to guide users through the workflow. This ease of use opens edge AI app development to wider teams, from AI developers to system builders to business domain subject matter experts.

Open standards for user flexibility, broader adoption
With AI Studio, users can deploy models with one click to plug into any workflow across multiple open standards including ONNX, OpenVX, containers, Python, or GStreamer. No other solution offers this degree of open standard deployment support, as most are proprietary solutions that lock in users with limited options. Support for these open standards allows AI Studio to deploy to any hardware that fully supports the standards.



Marketplaces collaboration
Marketplace support allows users to discover models, data and complete applications from anywhere – public or private – and collaborate continuously to build and deploy high-quality AI applications.

AI Studio supports open public models, data marketplaces, and repositories, and provides the connectivity and infrastructure to host private marketplaces. Users can continually scale proven edge AI models and vertical AI solutions for reuse across enterprises, choosing from hundreds of models with drag-and-drop ease to speed application development.



Easy-to-use application development workflow
The AI Studio model development workflow allows users to easily train and optimize models for specific datasets and use cases, and to deploy quickly into multiple formats and packages. With the click of a button, AI Studio’s unique Transfer Learning feature retrains imported models for the user’s data and use case. Blaize’s edge-aware optimization tool, NetDeploy, automatically optimizes models to the user’s specific accuracy and performance needs. With AI Studio, users can also build and customize complete application flows beyond neural networks, such as image signal processing, tracking, or sensor fusion functions.



Ground-breaking edge MLOps/DevOps features
As a complete end-to-end platform, AI Studio helps users deploy, manage, monitor and continuously improve their edge AI applications. Built on a cloud-native infrastructure based on microservices, containers and Kubernetes, AI Studio is highly scalable and reliable in production.



Blaize AI Studio early adopter customer results
In smart retail, smart city, and Industry 4.0 markets, Blaize customers are realizing new levels of efficiency in AI application development and deployment using AI Studio. Examples include:

– Complete end-to-end AI development cycle reduced from months to days
– Training compute reduced by as much as 90%
– Edge-aware optimization and compression of models with a < 3% accuracy drop
– Revolutionary new contextual conversational interfaces that eclipse visual UI



Availability
AI Studio is available now to qualified early adopter customers, with general availability in Q1 2021. The AI Studio product offering includes licenses for individual seats, enterprise, and on-premise subscriptions, with product features and services suited to the needs of each license type.



About Blaize


Blaize leads new-generation computing unleashing the potential of AI to enable leaps in the value technology delivers to improve the way we all work and live. Blaize offers transformative computing solutions for AI data collection and processing at the edge of network, with focus on smart vision applications including automobility, retail, security, industrial and metro. Blaize has secured US$87M in equity funding to date from strategic and venture investors DENSO, Daimler, SPARX Group, Magna, Samsung Catalyst Fund, Temasek, GGV Capital, Wavemaker and SGInnovate. With headquarters in El Dorado Hills (CA), Blaize has teams in Campbell (CA), Cary (NC), and subsidiaries in Hyderabad (India), Manila (Philippines), and Leeds and Kings Langley (UK), with 300+ employees worldwide.

GrubTech and Wobot.ai join forces in the automation and digitization of restaurant & cloud kitchen operations

9th November 2020, Dubai, United Arab Emirates: GrubTech, the UAE-based tech start-up taking the foodservice industry by storm with the world’s most technologically advanced digital commerce tool for restaurant and cloud kitchen owners, and India-based AI-powered video analytics powerhouse Wobot.ai are proud to announce a global strategic collaboration. The partnership brings together GrubTech’s native technology expertise in the foodservice landscape and Wobot’s state-of-the-art platform to curate an optimal experience for restaurateurs and cloud kitchen operators globally.

In this age of the technological revolution, rapidly evolving technology is expected to provide much-needed tailwinds to the foodservice business, as tech enablement is no longer a luxury, but a necessity to survive and succeed. The alliance between GrubTech and Wobot.ai establishes a comprehensive solution for restaurants and cloud kitchens, encompassing the digitization of everything from order capture and operations to compliance management and marketing.

GrubTech’s integration with food aggregators, point-of-sale systems, and third-party logistics providers eliminates the manual, error-prone, and often cumbersome entry of orders into siloed solutions. Instead, it digitizes the order lifecycle, providing comprehensive visibility over sales and operations, reducing costs, increasing efficiency, and shortening food preparation and delivery times – from click to doorbell in far less time.
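A digitized order lifecycle is, at its core, a state machine that rejects the illegal jumps manual entry tends to produce. The states and transitions below are invented for illustration and are not GrubTech's actual data model.

```python
# Allowed transitions of a toy order lifecycle (illustrative only).
TRANSITIONS = {
    "received":   {"accepted", "rejected"},
    "accepted":   {"preparing"},
    "preparing":  {"ready"},
    "ready":      {"dispatched"},
    "dispatched": {"delivered"},
}

class Order:
    def __init__(self, order_id, channel):
        self.order_id = order_id
        self.channel = channel        # e.g. which food aggregator it came from
        self.state = "received"
        self.history = ["received"]   # full audit trail for visibility

    def advance(self, new_state):
        # Reject illegal jumps, e.g. "received" straight to "delivered".
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)

order = Order("A-1001", channel="aggregator-x")
for step in ("accepted", "preparing", "ready", "dispatched", "delivered"):
    order.advance(step)
```

The `history` list is what gives operators the end-to-end visibility the text describes: every order carries its own timeline.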

“GrubTech provides restaurants, cloud kitchens and virtual brands with the first end-to-end management system, automating manual processes in order to drive operational efficiencies and improve the customer experience. Wobot’s AI-powered insights and business intelligence tools create a perfect synergy with our platform, setting us on course to completely revolutionise the global foodservice industry. We look forward to helping drive future-fit and profitable operations for our customers as they strive to win in this ever-changing landscape,” said Mohamed Al Fayed, Co-Founder and CEO of GrubTech.

Wobot.ai today powers 10,000+ units globally, helping them reduce their risk of non-compliance and their cost of monitoring, and increasing their customer NPS with its computer vision technology.

Adit Chhabra, CEO of Wobot, added: “Our vision with the Wobot-GrubTech alliance is to create a seamless workplace optimized to deliver operational excellence in the hospitality industry with our combined technology platforms. Our service offerings, tailored specifically for restaurants and cloud kitchens, offer an unmatched capability to deliver massive value for these businesses. Wobot’s platform monitors health, safety, and operational checklists and helps you ascertain whether you meet global foodservice industry standards.”

GrubTech’s solution is highly scalable and can easily be deployed remotely; the company is in advanced discussions to deploy across a number of large enterprises and SMEs in the MENA region and beyond, into Southeast Asia and Europe. The agreement with Wobot.ai will significantly enhance the offering: with heightened food safety requirements and increased restrictions resulting from the COVID-19 pandemic, countries are frequently updating their compliance legislation, creating an urgent need for an effective, multi-purpose operations platform.

Mayflower Autonomous Ship Launches

PLYMOUTH, England, Sept. 15, 2020 /PRNewswire/ — Ocean research non-profit ProMare and IBM (NYSE:IBM) have announced the completion and launch of the Mayflower Autonomous Ship (MAS) – an AI and solar powered marine research vessel which will traverse oceans gathering vital environmental data.

https://www.youtube.com/watch?v=n8wPUCPX6ss&feature=youtu.be

Following two years of design, construction and training of its AI models, the new fully-autonomous trimaran was today lifted into the waters off the coast of Plymouth, England ahead of its official launch tomorrow.

Designed to provide a safe, flexible and cost-effective way of gathering data about the ocean, the new-generation Mayflower promises to transform oceanography by working in tandem with scientists and other autonomous vessels to help understand critical issues such as global warming, micro-plastic pollution and marine mammal conservation. ProMare is co-ordinating the scientific studies working with IBM Research and a number of leading scientific organizations.

MAS features an AI Captain built by ProMare and IBM developers which gives MAS the ability to sense, think and make decisions at sea with no human captain or onboard crew. The new class of marine AI is underpinned by IBM’s latest advanced edge computing systems, automation software, computer vision technology and Red Hat Open Source software.

"Able to scan the horizon for possible hazards, make informed decisions and change its course based on a fusion of live data, the Mayflower Autonomous Ship has more in common with a modern bank than its 17th century namesake," said Andy Stanford-Clark, Chief Technology Officer, IBM UK & Ireland. "With its ability to keep running in the face of the most challenging conditions, this small ship is a microcosm for every aspiring 21st century business."

To enable followers around the world to stay updated with MAS as it undertakes its various missions, IBM and ProMare have today launched a new interactive web portal. Built by IBM iX, the business design arm of IBM Services, the MAS400 portal is designed to provide real-time updates about the ship's location, environmental conditions and data from its various research projects. Live weather data will be streamed from The Weather Company, with MAS receiving forecast data and insights from the new IBM Weather Operations Center.

The portal even features a seven-armed, stowaway octopus chatbot called Artie, who claims to be hitching a ride on the ship. Powered by IBM Watson Assistant technology and created in partnership with European start-up Chatbotbay, Artie has been trained to provide information about MAS and its adventures in a lively and accessible format.

"MAS400.com is one of the most advanced ocean mission web portals ever built," says Fredrik Soreide, Scientific Director of the Mayflower Autonomous Ship project and Board Member of ProMare. "Protecting the ocean depends on our ability to engage the public in important matters affecting its health. This MAS400 portal is designed to do exactly that and tell people where the ship is, what speed it's travelling at, what conditions it's operating in and what science we are conducting. Users can even help Artie the Octopus fish out surgical masks, cigarette butts and other increasingly common forms of ocean litter from a virtual ocean of facts and data."

MAS will spend the next six months in sea trials and undertake various research missions and voyages before attempting to cross the Atlantic in spring 2021. MAS's transatlantic voyage will follow a similar route, and be guided by the same pioneering spirit, as the 1620 Mayflower, which made the same crossing 400 years ago.

MAS Facts:

Name: Mayflower Autonomous Ship (MAS)
Organizations and companies behind it: ProMare, IBM and a global consortium of partners
Mission: MAS and other autonomous ships and drones working in tandem with human scientists to collect vital oceanographic data
Humans on board: 0
Autonomy level: 5 (can operate independently with no human intervention)
Sensors on board: 30+
AI cameras on board: 6
Octopuses on board: 1
Science projects: marine mammals, micro-plastics, sea level height and wave patterns, oceanographic and environmental data collection
Length: 15 m
Width: 6.2 m
Max speed: 10 knots
Weight: 5 tons / 4,535 kg
Equipment capacity: 0.7 tons / 700 kg
Hull design: trimaran (central hull with two outrigger wings)
Power: solar-driven hybrid electric motor
Software: IBM Visual Insights computer vision technology, IBM edge systems, IBM Operational Decision Manager automation software, IBM Maximo asset management software, data from The Weather Company
Hardware: IBM Power Systems AC922, 6 Jetson AGX Xavier, 2 Jetson Xavier NX, 4+ Intel-based computers, 4+ custom microprocessor systems
Navigation equipment: precision GNSS (Global Navigation Satellite System), IMU (Inertial Measurement Unit), radar, weather station, SATCOM, AIS
Live mission portal: https://mas400.com
More information: https://newsroom.ibm.com/then-and-now
B-roll: https://newsroom.ibm.com/mayflower-b-roll
Images: https://newsroom.ibm.com/mayflower-images