Picked up and put off

Guest post by IDS Corporate Communications

Autonomously driving robotic assistance system for the automated placement of coil creels

With Industry 4.0, digitalisation, automation and the networking of systems and facilities are becoming the dominant topics in production and thus also in logistics. Industry 4.0 pursues the ongoing optimisation of processes and workflows in favour of productivity and flexibility, and thus savings in time and costs. Robotic systems have become the driving force for automating processes. Through the Internet of Things (IoT), robots are becoming more sensitive, autonomous, mobile and easier to operate, and increasingly an everyday helper in factories and warehouses. Intelligent imaging techniques are playing an ever more important role in this.

To meet the growing demands of scaling and changing production environments on the way towards fully automated and intelligently networked production, ONTEC Automation GmbH from Naila in Bavaria has developed an autonomously driving robotic assistance system. The "Smart Robot Assistant" uses the synergies of mobility and automation: it consists of a powerful and efficient intralogistics platform, a flexible robot arm and a robust 3D stereo camera system from the Ensenso N series by IDS Imaging Development Systems GmbH.

The solution is versatile and takes over monotonous, heavy set-up and placement tasks, for example. The autonomous transport system can lift Euro pallets up to container or industrial format, as well as mesh pallets of various sizes, directly from the floor, with a maximum load of up to 1,200 kilograms. For a customer in the textile industry, the AGV (Automated Guided Vehicle) is used for the automated loading of coil creels. For this purpose, it picks up pallets of yarn spools, transports them to the designated creel and loads it for further processing. Using a specially developed gripper system, up to 1,000 yarn packages per 8-hour shift are picked up and pushed onto a mandrel of the creel. The sizing scheme and the position of the coils are captured by an Ensenso 3D camera (N45 series) installed on the gripper arm.

Application

Pallets loaded with industrial yarn spools are picked up from the floor of a predefined storage place and transported to the creel location. There, the gripper positions itself vertically above the pallet. An image trigger is sent to the Ensenso 3D camera from the N45 series by the in-house software ONTEC SPSComm, which networks with the vehicle's PLC and can thus read out and pass on data. In the application, SPSComm controls the communication between the software parts of the vehicle, gripper and camera. This way, the camera knows when the vehicle and the gripper are in position to take a picture. The camera then captures an image and passes a point cloud to an ONTEC software solution based on the standard HALCON software, which reports the coordinates of the coils on the pallet to the robot. The robot can then accurately pick up the coils and process them further. As soon as the gripper has cleared a layer of yarn spools, the Ensenso camera takes a picture of the packaging material lying between the layers of spools and provides point clouds of this as well. These point clouds are processed in the same way to give the robot the information it needs to remove the intermediate layers with a needle gripper. "This approach means that the number of layers and finishing patterns of the pallets do not have to be defined in advance, and even incomplete pallets can be processed without any problems," explains Tim Böckel, software developer at ONTEC. "The gripper does not have to be converted for the use of the needle gripper. For this application, it has a normal gripping component for the coils and a needle gripping component for the intermediate layers."
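To make the described cycle easier to follow, here is a minimal sketch of the control flow. ONTEC's software stack (SPSComm, the HALCON-based locator) is proprietary, so every function and class name below is a hypothetical stand-in mirroring the steps above, not the actual interfaces.

```python
# Illustrative sketch only: all names are hypothetical stand-ins for the
# trigger -> capture -> locate -> pick cycle described in the article.

from dataclasses import dataclass


@dataclass
class Coil:
    x: float  # coil centre in pallet coordinates, millimetres
    y: float
    z: float


def plc_in_position() -> bool:
    """Stand-in for the SPSComm check that vehicle and gripper are in place."""
    return True


def capture_point_cloud() -> list:
    """Stand-in for the Ensenso N45 image trigger returning a point cloud."""
    return [(0.0, 0.0, 0.0)]  # placeholder data


def locate_coils(point_cloud) -> list:
    """Stand-in for the HALCON-based detection of coil centre coordinates."""
    return [Coil(120.0, 85.0, 410.0)]


class DemoRobot:
    """Toy robot interface used only to make the example run end to end."""

    def pick(self, coil: Coil) -> None:
        print(f"picking coil at ({coil.x}, {coil.y}, {coil.z}) mm")

    def place_on_mandrel(self) -> None:
        print("pushing coil onto the creel mandrel")


def process_layer(robot: DemoRobot) -> None:
    if not plc_in_position():
        return
    cloud = capture_point_cloud()
    for coil in locate_coils(cloud):
        robot.pick(coil)
        robot.place_on_mandrel()
    # once a layer is cleared, the same cycle runs again so the needle gripper
    # can remove the intermediate packaging layer


process_layer(DemoRobot())
```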

For this task, the mobile 3D acquisition of moving and static objects on the robot arm, the Ensenso 3D camera is well suited thanks to its compact design. The Ensenso N45's 3D stereo electronics are completely decoupled from the housing, allowing a lightweight plastic composite to be used as the housing material. The low weight facilitates use on robot arms such as the Smart Robot Assistant. The camera also copes with demanding environmental conditions. "Challenges with this application lie primarily in the different lighting conditions found in different rooms of the hall and at different times of day," Tim Böckel describes the situation. Even in difficult lighting conditions, the integrated projector casts a high-contrast texture onto the object to be imaged by means of a pattern mask with a random dot pattern, supplementing the structures on featureless, homogeneous surfaces. The integrated camera therefore meets the requirements exactly. "By pre-configuring within NxView, the task was solved well." This sample programme with source code demonstrates the main functions of the NxLib library, which can be used to open one or more stereo and colour cameras and visualise their image and depth data. Parameters such as exposure time, binning, AOI and depth measuring range can, as in this case, be adjusted live for the matching method used.
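As a rough illustration of what such a pre-configuration covers, the sketch below collects the capture parameters named above in one place. It is not the actual Ensenso NxLib API; the tree paths and example values are assumptions loosely modelled on the NxLib parameter tree, not settings taken from the ONTEC application.

```python
# Minimal sketch, assuming NxLib-style parameter paths (illustrative only).

capture_parameters = {
    "Parameters/Capture/Exposure": 8.0,            # exposure time in ms (example)
    "Parameters/Capture/Binning": 1,               # no pixel binning
    "Parameters/Capture/AOI": (0, 0, 1280, 1024),  # full-frame area of interest
    "Parameters/DepthRange": (400, 1200),          # depth measuring range in mm (example)
}


class DemoParameterTree:
    """Toy stand-in for a camera parameter tree."""

    def __init__(self):
        self.values = {}

    def set(self, path: str, value) -> None:
        self.values[path] = value


def apply_parameters(tree: DemoParameterTree, params: dict) -> None:
    """Write each parameter into the (stand-in) camera tree."""
    for path, value in params.items():
        tree.set(path, value)


tree = DemoParameterTree()
apply_parameters(tree, capture_parameters)
print(tree.values)
```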

The matching process enables the Ensenso 3D camera to identify a very large number of pixels, including their change in position, by means of the auxiliary structures projected onto the surface, and to create complete, homogeneous depth information of the scene from this. This in turn ensures the precision with which the Smart Robot Assistant operates. Other selection criteria for the camera included the standard Gigabit Ethernet vision interface and the global shutter 1.3 MP sensor. "The camera only takes one image pair of the entire pallet in favour of a faster throughput time, but it has to provide the coordinates from a relatively large distance with an accuracy in the millimetre range to enable the robot arm to grip precisely," explains Matthias Hofmann, IT specialist for application development at ONTEC. "We therefore need the high resolution of the camera to be able to reliably capture the edges of the coils with the 3D camera." Localising the edges is important in order to pass the position of the centre of each spool to the gripper as accurately as possible.

Furthermore, the camera is specially designed for use in harsh environmental conditions. It has a screwable GPIO connector for trigger and flash and is IP65/67 protected against dirt, dust, splash water or cleaning agents.

Software

The Ensenso SDK enables hand-eye calibration of the camera to the robot arm, allowing coordinates to be easily translated or shifted using the robot pose. In addition, using the internal camera settings, a "FileCam" of the current situation is recorded at each pass, i.e. at each image trigger. This makes it possible to reproduce and fine-tune edge cases later on, in this application for example unexpected lighting conditions, obstacles in the image or an unexpected positioning of the coils in the image. The Ensenso SDK also allows the internal camera LOG files to be stored and archived for possible evaluation.
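A short worked example of what hand-eye calibration provides: once the camera-to-flange transform is known, a coil position measured in the camera frame can be expressed in the robot base frame via the current robot pose. The transforms and numbers below are illustrative assumptions, not values from the Ensenso SDK or the ONTEC application.

```python
# Minimal sketch: chaining a hand-eye calibration result with the robot pose.

import numpy as np


def homogeneous(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


# Hand-eye calibration result: camera pose relative to the robot flange (assumed values, mm).
T_flange_cam = homogeneous(np.eye(3), np.array([0.0, 50.0, 120.0]))

# Current robot pose: flange relative to the robot base (assumed values, mm).
T_base_flange = homogeneous(np.eye(3), np.array([300.0, 0.0, 800.0]))

# A coil centre measured by the camera, in the camera frame (homogeneous coordinates).
p_cam = np.array([10.0, -5.0, 450.0, 1.0])

# Chain the transforms to get the point in the robot base frame.
p_base = T_base_flange @ T_flange_cam @ p_cam
print(p_base[:3])  # -> [ 310.   45. 1370.]
```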

ONTEC also uses these "FileCams" to run automated test cases and thus ensure that everything still works correctly when adjustments are made to the vision software. In addition, the control system specially developed by ONTEC allows various vehicles to be coordinated and logistical bottlenecks to be minimised; different assistants can navigate and act simultaneously in a very confined space. Using the industrial interface tool ONTEC SPSComm, even standard industrial robots can be safely integrated into the overall application and data can be exchanged between the different systems.

Outlook

Further development of the system is planned, among other things, in terms of navigation of the autonomous vehicle. "With regard to vehicle navigation for our AGV, the use of IDS cameras is very interesting. We are currently evaluating the use of the new Ensenso S series to enable the vehicle to react even more flexibly to obstacles, for example to classify them and possibly even drive around them," says Tim Böckel, software developer at ONTEC, outlining the next development step.

ONTEC’s own interface configuration already enables the system to be integrated into a wide variety of Industry 4.0 applications, while the modular structure of the autonomously moving robot solution leaves room for adaptation to a wide variety of tasks. In this way, it not only serves to increase efficiency and flexibility in production and logistics, but in many places also literally contributes to relieving the workload of employees.

More at: https://en.ids-imaging.com/casestudies-detail/picked-up-and-put-off-ensenso.html

Kosmos ReBotz – Rusty the Crawling-Bot

Building your own robot can be this easy! The small robots from the new ReBotz series are assembled in just a few steps. Powered by batteries, they amaze children aged eight and up with their curious ways of getting around and make them smile with their distinctive characters. It is especially fun to mix up the body parts of the different ReBotz and invent entirely new methods of locomotion. In this way, the passion for collecting and the young researchers' appetite for experimentation are awakened.

Kosmos ReBotz – Duke the Skating-Bot

Building your own robot can be this easy! The small robots from the new ReBotz series are assembled in just a few steps. Powered by batteries, they amaze children aged eight and up with their curious ways of getting around and make them smile with their distinctive characters. It is especially fun to mix up the body parts of the different ReBotz and invent entirely new methods of locomotion. In this way, the passion for collecting and the young researchers' appetite for experimentation are awakened.

Kosmos ReBotz – Pitti the Walking-Bot

Kosmos ReBotz – Pitti the Walking-Bot.

Building your own robot can be this easy! The small robots from the new ReBotz series are assembled in just a few steps. Powered by batteries, they amaze children aged eight and up with their curious ways of getting around and make them smile with their distinctive characters. It is especially fun to mix up the body parts of the different ReBotz and invent entirely new methods of locomotion. In this way, the passion for collecting and the young researchers' appetite for experimentation are awakened.



Kosmos ReBotz – Buxy the Jumping-Bot

Kosmos ReBotz – Buxy the Jumping-Bot.

Building your own robot can be this easy! The small robots from the new ReBotz series are assembled in just a few steps. Powered by batteries, they amaze children aged eight and up with their curious ways of getting around and make them smile with their distinctive characters. It is especially fun to mix up the body parts of the different ReBotz and invent entirely new methods of locomotion. In this way, the passion for collecting and the young researchers' appetite for experimentation are awakened.



Low-cost automation in XXL: large DIY palletiser from igus at a small price

The drylin XXL gantry robot is up to 60 percent cheaper than comparable solutions and particularly easy to commission

Cologne, 8 February 2022 – igus is expanding its broad low-cost automation range with a new drylin XXL gantry robot. The gantry has a working envelope of 2,000 x 2,000 x 1,500 millimetres and is particularly suited to palletising applications up to 10 kilograms. The robot is available from 7,000 euros including the controller and can be assembled and programmed on a do-it-yourself basis, without the help of a system integrator.

The lubrication-free and maintenance-free drylin XXL gantry robot from igus lifts up to 10 kilograms and costs up to 60 percent less than comparable solutions. (Source: igus GmbH)

Too expensive to buy, too laborious to program, too complicated to maintain: many small and medium-sized companies shy away from getting started with automation, and in the long run put their competitiveness at risk. Yet getting started can be quite easy, as the drylin XXL gantry robot from igus proves. The DIY kit gives companies the ability to quickly and easily commission a pick-and-place linear robot for tasks around palletising, sorting, labelling and quality inspection. "Palletising robots built together with external service providers quickly cost between 85,000 and 120,000 euros. That blows the budget of many small businesses," says Alexander Mühlens, head of the low-cost automation business unit at igus. "We have therefore developed a solution that is many times cheaper thanks to the use of high-performance plastics and lightweight materials such as aluminium. Depending on the configuration, the drylin XXL gantry robot costs between 7,000 and 10,000 euros. It is a low-risk investment that usually pays for itself within a few weeks."

DIY kit can be assembled quickly and without prior knowledge

The buyer receives the gantry as a DIY kit. It consists of two toothed-belt axes and one rack-and-pinion cantilever axis with stepper motors and a working envelope of 2,000 x 2,000 x 1,500 millimetres; at maximum length, up to 6,000 x 6,000 x 1,500 millimetres is also possible. The package also includes a control cabinet, cables and energy chains as well as the free igus Robot Control (iRC) software. Users can assemble the components into a ready-to-operate linear robot in a few hours, without external help, prior knowledge or a long familiarisation period. And if additional components such as camera systems or grippers are needed, users will quickly find them on the RBTX robotics marketplace.

Automation relieves employees

The Cartesian robot is used, for example, on conveyor belts that carry products away from injection moulding machines. Here the robot picks items weighing up to 10 kilograms off the belt, moves them at speeds of up to 500 mm/s and places them on a pallet with a repeatability of 0.8 millimetres. "Thanks to this automation, companies can relieve their employees of physically strenuous and time-consuming palletising tasks and free up resources for more important work." The system itself requires no maintenance: the linear axes are made of corrosion-free aluminium, and the carriages run on plain bearings made of high-performance plastic whose integrated solid lubricants enable low-friction dry running without external lubricants for many years, even in dusty and dirty environments.

Digital 3D twin of the robot makes programming child's play

But it is not only assembly that poses no barrier to entry; the programming of motion sequences does not either. "For many companies without their own IT specialists, programming robots is often fraught with problems," says Mühlens. "With iRC we have therefore developed free software that visually resembles commonly used office software and enables intuitive programming of movements. The special thing: the software is free of charge, and the resulting low-code program can then be used 1:1 on the real robot." The heart of the software is a digital twin of the gantry, with which movements can be defined in a few clicks, even in advance, before the robot is in operation. "Prospective customers can use the 3D model to check before buying whether the desired movements are actually feasible. In addition, we invite everyone interested to try out our robots free of charge, live or over the internet. We support them during commissioning and show what is possible with low-cost robots. This makes the investment almost risk-free."
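As a plain-text analogue of what "defining movements on the digital twin" amounts to, here is a minimal sketch, assuming a simple waypoint list checked against the working envelope stated above. The iRC software is a graphical tool; the waypoints and helper names below are invented for illustration.

```python
# Illustrative sketch only: checking a palletising sequence against the
# drylin XXL working envelope before running it on the real gantry.

ENVELOPE_MM = (2000, 2000, 1500)  # working envelope (x, y, z) from the press release

# Hypothetical palletising sequence: pick from the belt, lift, place on the pallet.
waypoints = [
    ("pick",  (1800, 200, 100)),
    ("lift",  (1800, 200, 600)),
    ("place", (300, 1500, 150)),
]


def within_envelope(point, envelope=ENVELOPE_MM) -> bool:
    """Check that a target point lies inside the gantry's working envelope."""
    return all(0 <= coordinate <= limit for coordinate, limit in zip(point, envelope))


for name, point in waypoints:
    status = "ok" if within_envelope(point) else "out of reach"
    print(f"{name}: {point} -> {status}")
```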

QUBS – The toymaker merging traditional designs and screen-free technology in early years learning

QUBS (www.qubs.toys) is a Swiss company producing traditionally-designed wooden toys with hidden high-tech magic: liberating children to explore their imagination, safely learn future skills and engage in educational, screen-free fun.

Inspired by the Montessori method, QUBS STEM toys educate as well as entertain. Through play, QUBS toys help children develop skills in science, technology, engineering, and mathematics.

Loved by parents, teachers and, most importantly, young users (3 to 12 years), QUBS' intuitive, gender-neutral toys – made from responsibly sourced and long-lasting beechwood – contain patented technology which brings them to life. Unlike other tech-enabled STEM children's toys, QUBS' toys have an eternal shelf life, do not require updates or access to the internet, and are completely screen-less, empowering children to become creators rather than passive users of laptop or smartphone screens.

Each block and toy component contains a QUBS-developed and patented version of RFID (Radio Frequency Identification) technology (the innovation most commonly used in contactless payments and key fobs). RFID technology is 100% safe and secure for children and grown-ups, allowing the individual tiles and blocks to interact within their own secure universe.

Cody Block

QUBS' first product, Cody Block – to be showcased at the Nuremberg Toy Fair, Spielwarenmesse Digital (where it has been shortlisted for the prestigious annual 'Toy Award') – features an independently moving car (Cody), whose journey changes in response to a child's placement and arrangement of wooden blocks within its environment. Encouraging creativity and teamwork, Cody Block introduces children to computer programming concepts, robotics, and the Internet of Things through fun and accessible play.

Learning computational skills in the early years is essential. Cody the car, and the wooden toy blocks which shape his journey, teach kids to think like a programmer, introducing them to the principles of debugging (the process of identifying a problem and correcting it) and sequencing (the specific order in which instructions are performed in an algorithm) through physical play.

The task is to plan a path that leads Cody through the city and back home, his movements changing in response to the child’s arrangement and rearrangement of the wooden blocks (each containing RFID tech). Each block denotes a different directional command (e.g. ‘turn left’, ‘turn right’, ‘u-turn’ etc.), creating a sequence of instructions. This allows children to improve their motor skills, critical thinking, creativity and spatial awareness.
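To make the sequencing idea concrete, here is a minimal sketch of how a chain of directional blocks can be simulated in code. The command names, grid and movement rules are invented for this example; they are not QUBS internals.

```python
# Illustrative sketch of block-based sequencing: each command is one wooden block,
# and the sequence steers the car one grid cell at a time.

HEADINGS = ["N", "E", "S", "W"]  # clockwise order
MOVES = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}


def run_sequence(commands, start=(0, 0), heading="N"):
    """Drive one grid cell per command and return the visited positions."""
    x, y = start
    path = [(x, y)]
    for command in commands:
        if command == "turn left":
            heading = HEADINGS[(HEADINGS.index(heading) - 1) % 4]
        elif command == "turn right":
            heading = HEADINGS[(HEADINGS.index(heading) + 1) % 4]
        elif command == "u-turn":
            heading = HEADINGS[(HEADINGS.index(heading) + 2) % 4]
        # after (possibly) turning, the car rolls forward one cell
        dx, dy = MOVES[heading]
        x, y = x + dx, y + dy
        path.append((x, y))
    return path


# "Debugging" in the Cody Block sense: if the path misses home, rearrange the blocks.
print(run_sequence(["turn right", "forward", "turn left"]))
# -> [(0, 0), (1, 0), (2, 0), (2, 1)]
```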

Cody Block is designed for kids aged 3-12 and will be available to ship in Q2 2022.

Matty Block

QUBS' second product, Matty Block, is designed for ages 3-9. It helps children develop self-confidence in mathematics by introducing the concepts of addition, subtraction and multiplication.

Children place Matty the farmer on a board above a sum of their own creation, formed by numbered tiles (representing seeds). With a nod or shake of his head, Matty guides young users to the right answer. Matty Block features voice feedback in six languages (English, German, French, Spanish, Italian and Mandarin), making it the perfect tool for children to play and learn autonomously. Its story setting provides a fun and comprehensive introduction to numbers and equations, while exploring the delicate and ever-changing world of nature.

Matty Block will be available in 2023.

About QUBS

Based in Zurich, Paris and London, QUBS Toys was founded in 2019 by Hayri Bulman, a Swiss entrepreneur with over 30 years of IT expertise working for GE (General Electric) and Xerox. Hayri's own fatherhood, passion for wooden toys and firm grasp of technology motivated him to create QUBS to better equip future generations for the digital world. Inspired by the toy company TEGU in 2015, Hayri set out to merge classic wooden toys with modern technology and soon started working on concepts that combined RFID technology with wooden blocks. Since then, QUBS has grown into a broad team of designers, engineers and creatives from across Europe.

In April 2020, at the very beginning of the global pandemic, QUBS raised CHF 88,887 (~£70,000) from 503 backers in a Kickstarter campaign.

QUBS Toys will be available for purchase online from www.qubs.toys, as well as from major stockists.

Maker Faires 2022: dates, participation and tickets
The maker scene meets again

Hanover, 3 February 2022 – This year, fresh input from the maker community is back live and in colour: four Maker Faires are planned for 2022. Germany's largest Maker Faire takes place on 10 and 11 September in Hanover. For the first time, the format for innovation and maker culture is also coming to southern Germany: makers, enthusiasts, creatives and inventors will meet in summer at Maker Faire Baden-Württemberg. Makers will also present their exciting ideas in Dortmund and Chemnitz. The ticket shops are open and the calls for makers are running.

It starts in the west: at the 5th Maker Faire Ruhr in Dortmund on 26 and 27 March, unusual experiments and decidedly quirky projects from IT to design will once again be presented at the DASA, Germany's largest working-world exhibition.

Then comes the premiere in southern Germany: the first Maker Faire Baden-Württemberg starts on 25 and 26 June on the grounds of RTunlimited in Reutlingen. In cooperation with the Innoport in the Stuttgart metropolitan region, makers and technology partners will present their forward-looking ideas and STEM projects. "We are leaving the Berlin location and, in addition to Maker Faire Hannover in the north, want to establish Maker Faire Baden-Württemberg as the second major flagship event for the maker community in Germany," explains Anna Ludwig, visitor manager for Maker Faire.

Everyone curious should mark 9 and 10 July for Maker Faire Sachsen: Chemnitz is not only European Capital of Culture 2025, it is also where inspiring makers demonstrate anew every year how creatively science and technology can be used.

The highlight is Maker Faire Hannover: on 10 and 11 September, the Hannover Congress Centrum with its idyllic outdoor grounds will turn into a showcase of creative ideas for the eighth time. "This year it will be more international and bilingual!" says Daniel Rohlfing, head of events and product management. "We are inviting makers from all over the world to demonstrate their ingenuity with us." Maker Faire Hannover has established itself as a must-see event in the Lower Saxon state capital and has attracted around 20,000 visitors in the past.

Tickets can now be booked for the two flagship events, Maker Faire Baden-Württemberg and Maker Faire Hannover. The calls for makers are also in full swing: interested makers can register for a booth, a workshop or a talk, and companies still have time to decide on exhibiting.

The Evolution of Robo-Dogs

Sophie writes on behalf of Panda Security covering cybersecurity and online safety best practices for consumers and families. Specifically, she is interested in removing the barriers of complicated cybersecurity topics and teaching data security in a way that is accessible to all. Her most recent piece is on the evolution of robotic dogs and where they're headed next.

Robots have been a point of fascination and study for centuries as researchers and inventors have sought to explore the potential for automated technology. While there’s a long history of the development and creation of autonomous machines, mobile, quadrupedal robots — or four-legged robotic dogs — have seen a significant boom in the last few decades. 

The development of quadrupedal robots stems from the need for mobile robots that can explore dangerous or unstructured terrain. Compared to other mobile robots (such as wheeled or bipedal/two-legged robots), quadrupedal robots offer a superior locomotion system in terms of stability, control and speed.

The capabilities of quadrupedal robots are being explored in a variety of fields, from construction and entertainment to space exploration and military operations. Today, modern robotic dogs can be purchased by businesses and developers to complete tasks and explore environments deemed too dangerous for humans. Read on for the evolution of robotic dogs and where they might be headed in the future. 

1966: Phony Pony

Although it technically mirrored the form of a horse, the Phony Pony was the first autonomous quadrupedal robot to emerge in the U.S. and set the precedent for the robotic dogs of the future. Equipped with electrical motors, the Phony Pony had two degrees of freedom, or joints, in each leg (the hip and the knee) and one adaptive joint in the frontal plane. The hip and knee joints were identical, allowing for both forward and backward walking movements.

The Phony Pony was capable of crawling, walking and trotting, albeit at a very slow speed. Thanks to its spring-restrained “pelvic” structure, it was able to maintain static vertical stability during movement. Since the Phony Pony was developed before the advent of microprocessors, it could only be controlled through cables connected to a remote computer in an adjacent building.  

Developer: Frank and McGhee

Use: Initial research and development of autonomous quadrupeds 

1999: AIBO

In the late 1990s, Sony’s AIBO  — one of the most iconic and advanced entertainment robotic dogs — hit the market. While the AIBO (Artificial Intelligence RoBOt) was constructed for entertainment purposes, its machinery is still highly complex. 

Developed with touch, hearing, sight and balancing capabilities, it can respond to voice commands, shake hands, walk and chase a ball. It can also express six “emotions”: happiness, sadness, fear, anger, dislike and surprise. Its emotional state is expressed through tail wagging, eye color changes and body movements, as well as through a series of sounds including barks, whines and growls. Today, the AIBO has been used across many research groups for the purpose of testing artificial intelligence and sensory integration techniques.

Developer: Sony

Use: Toys and entertainment

2005: BigDog

Boston Dynamics has become a leader in the world of robotics, specifically in their development of canine-inspired quadrupeds. Their first robotic dog, coined BigDog, arrived in 2005. Measuring three by two feet and weighing in at 240 pounds, BigDog was designed to support soldiers in the military. It can carry 340 pounds, climb up and down 35-degree inclines and successfully hike over rough terrains. 

Each of BigDog's legs has passive linear pneumatic compliance (a system that controls contact forces between the robot and a rigid environment) and three active joints in the knees and hips. The robot is powered by a one-cylinder go-kart engine, and its dynamic regulating system allows it to maintain balance. Its movement sensors cover joint position, joint force, ground contact and ground load, complemented by a stereo vision system.

In 2012, developers were still working to refine BigDog’s capabilities before plans to officially deploy it to military squads. However, the project was discontinued in 2015 after concluding its gas-powered engine was too noisy to be used in combat. 

Developer: Boston Dynamics

Use: Assist soldiers in unsafe terrains 

2009: LittleDog 

Four years after BigDog came LittleDog, Boston Dynamics’ smallest quadrupedal robot to date. LittleDog was developed specifically for research purposes to be used by third parties investigating quadrupedal locomotion. 

Each of LittleDog's legs is powered by three electric motors fuelled by lithium polymer batteries, giving a maximum operating time of thirty minutes. LittleDog has a large range of motion and is capable of climbing, crawling and walking across rocky terrain. A PC-level computer mounted on top of LittleDog handles its movement sensors, controls and communications. It can be controlled remotely and includes data-logging support for data analysis purposes.

Developer: Boston Dynamics

Use: Research on locomotion in quadrupeds 

2011: AlphaDog Proto

Continuing their efforts to develop military-grade robots, Boston Dynamics released AlphaDog Proto in 2011. Powered by a hydraulic actuation system, AlphaDog Proto is designed to support soldiers in carrying heavy gear across rocky terrains. It’s capable of carrying up to 400 pounds for as far as 20 miles, all within the span of 24 hours, without needing to refuel. 

AlphaDog Proto is equipped with a GPS navigation and computer vision system that allows it to follow soldiers while carrying their gear. Thanks to an internal combustion engine, AlphaDog Proto proved to be quieter than its predecessor BigDog, making it more suitable for field missions. 

Developer: Boston Dynamics

Use: Assist soldiers in carrying heavy gear over unsafe terrains

2012: Legged Squad Support System (LS3)

Boston Dynamics' development of the Legged Squad Support System (LS3) came soon after the creation of BigDog, as part of its effort to keep refining its quadrupedal robots for soldiers and Marines. LS3 was capable of operating in hot, cold, wet and otherwise unfavourable conditions. It contained a stereo vision system with a pair of stereo cameras mounted inside the robot's head. This operated in conjunction with a light detection and ranging (LiDAR) unit that allowed it to follow a soldier's lead and record feedback obtained from the camera.

Compared to BigDog, LS3 was around 10 times quieter at certain times and had an increased walking speed of one to three miles per hour, increased jogging speed of five miles per hour and the ability to run across flat surfaces at seven miles per hour. It was also capable of responding to ten voice commands, which was a more efficient function for soldiers who would be too preoccupied with a mission to use manual controls. 

Five years into development, LS3 had successfully been refined enough to be able to operate with Marines in a realistic combat exercise and was used to resupply combat squads in locations that were difficult for squad vehicles to reach. By 2015, however, the LS3 was shelved due to noise and repair limitations. While the Marines were ultimately unable to use the LS3 in service, it provided valuable research insights in the field of autonomous technology. 

Developer: Boston Dynamics

Use: Assist soldiers in carrying heavy gear over unsafe terrains

2016: Spot 

Spot is the next creation in Boston Dynamics' line of quadrupedal robots, designed to move away from strictly military quadrupeds and into more commercial use. Spot is significantly smaller than the previous models, weighing just 160 pounds. It is capable of exploring rocky terrain, avoiding objects in its path during travel and climbing stairs and hills.

Spot's hardware is equipped with powerful control boards and five sensor units on all sides of its body that allow it to navigate an area autonomously from any angle. Twelve custom motors power Spot's legs, reaching speeds of up to five feet per second and operating for up to 90 minutes. Its sensors can capture spherical images and also allow for mobile manipulation for tasks such as opening doors and grasping objects. Spot's control methods are far more advanced than those of Boston Dynamics' earlier robots, allowing for autonomous control in a wider variety of situations.

Developer: Boston Dynamics

Use: Documenting construction process and monitoring remote high-risk environments 

2016: ANYmal

While Boston Dynamics had been the main leader in quadrupedal robots since the early 2000s, Swiss robotics company ANYbotics came out with its own iteration of the robotic dog in 2016. Positioned as an end-to-end robotic inspection solution, ANYmal was developed for industrial use, specifically the inspection of unsafe environments like energy and industrial plants. 

ANYmal is mounted with a variety of laser inspection sensors to provide visual, thermal and acoustic readings. Equipped with an on-board camera, it’s capable of remote panning and tilting settings to adjust views of the inspection site. ANYmal is capable of autonomously perceiving its environment, planning its navigation path and selecting proper footholds during travel. It can even walk up stairs and fit into difficult-to-reach areas that traditional wheeled robots can’t.

ANYmal has undergone a handful of development iterations since 2016 and is available for purchase as of 2021. ANYbotics is currently working on an upgraded version of the robot suitable for potentially explosive environments. 

Developer: ETH Zurich and ANYbotics

Use: Remote inspection of unsafe environments

2021: Vision 60 

One of the latest developments in quadrupedal robots is Ghost Robotics’ Vision 60 robotic dog, which has recently been tested at the U.S. Air Force’s Scott Air Force Base in Illinois as part of its one-year pilot testing program. Built to mitigate risks faced by Air Force pilots, Vision 60 features a rifle mounted on its back contained in a gun pod and is equipped with sensors that allow it to operate in a wide variety of unstable terrains. It’s also capable of thermal imaging, infrared configuration and high-definition video streaming. 

Vision 60 can carry a maximum of 31 pounds and can travel at up to 5.24 feet per second. It’s considered a semi-autonomous robot due to its accompanying rifle; while it can accurately line up with a target on its own, it can’t open fire without a human operator (in accordance with the U.S. military’s autonomous systems policy prohibiting automatic target engagement).

Developer: Ghost Robotics

Use: Military and Homeland Security operations

2021: CyberDog

With more companies embracing the development of quadrupeds, Xiaomi Global followed suit and released its own version, named CyberDog. CyberDog is an experimental, open-source robot promoted as both a human-friendly companion and an asset for law enforcement and the military. CyberDog is sleeker and smaller than its robotic dog predecessors, carrying a payload of just 6.6 pounds and running at over 10 feet per second.

CyberDog is equipped with multiple cameras and image sensors located across its body, including touch sensors and an ultra-wide fisheye lens. CyberDog can hold 128 gigabytes of storage and is powered by Nvidia’s Jetson Xavier AI platform to perform real-time analyses of its surroundings, create navigation paths, plot its destination and avoid obstacles. CyberDog can also perform backflips and respond to voice commands thanks to its six microphones. 

By making CyberDog an open-source project, Xiaomi hopes to expand its reach into the future of robot development and innovation. Its open-source nature is meant to encourage robotics enthusiasts to try their hand at writing code for CyberDog, giving the project more exposure and bolstering Xiaomi’s reputation in the robotics community. 

Developer: Xiaomi Global

Use: An open-source platform for developers to build upon 

While the market for quadrupedal robots is still in its early stages, interest is steadily growing in a wide range of industries. As for fears of robots pushing out traditionally human-led jobs, these machines are intended to support humans in their work rather than replace them outright.

On the other hand, privacy concerns associated with robots aren't to be ignored. As with any tech-enabled device, hacking is always possible, especially for open-source robotic models that can put users' personal information at risk. This applies not only to the quadrupeds discussed above, but also to more common commercial robotic systems like baby monitors, security systems and other WiFi-connected devices. It's important to ensure your home network is as strong and secure as possible, for example with a home antivirus platform.

JetMax: The AI Vision Robotic Arm for Endless Creativity

The true AI vision robotic arm powered by Jetson Nano is affordable and open-source, turning your AI creativity into reality.

In recent years, more and more makers, students, enthusiasts and engineers have been learning artificial intelligence technology, and many interesting AI projects are being developed as well. Hiwonder brings the power of AI to robots with JetMax, a true AI robotic arm built to enhance the AI and robotics learning experience for everyone.

JetMax features deep learning and computer vision abilities. It is equipped with a Jetson Nano and an HD wide-angle camera, which enable it to interact with the perceived environment efficiently and empower you to turn your AI creativity into reality.

Being an AI vision robotic arm, JetMax not only features AI vision but has a clever brain as well, supporting you in learning to code, researching AI robotics applications and bringing your AI ideas to life. It can be your helping hand in a lab, university or workshop.

  • Powered by NVIDIA Jetson Nano

The open-source JetMax robot arm is powered by the Jetson Nano, featuring deep learning, computer vision and more. The Jetson Nano has the performance needed to run modern AI workloads and gives the JetMax robot arm advanced AI capabilities.

  • Supports multiple types of EoAT (End-of-Arm Tooling)

Supporting multiple types of end-of-arm tooling such as grippers, a suction cup, a pen holder and an electromagnet, JetMax offers many possibilities for creative design applications.

  • Open-Source

JetMax is an open-platform hardware product. We contribute numerous project sources and AI tutorials. Additionally, the API is completely open for customisation and supports languages such as Python, C++ and Java.
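To give a feel for the kind of scripting an open Python interface makes possible, here is a minimal sketch of an open-loop pick-and-place with a suction cup. The JetMaxArm class and its method names are invented for illustration; they are not the actual Hiwonder SDK, whose documentation should be consulted for the real calls.

```python
# Hedged sketch, assuming a hypothetical Python wrapper around the arm.

import time


class JetMaxArm:
    """Toy stand-in for an open-source arm API with a suction-cup end effector."""

    def move_to(self, x: float, y: float, z: float) -> None:
        print(f"moving end effector to ({x}, {y}, {z}) mm")

    def suction(self, on: bool) -> None:
        print("suction on" if on else "suction off")


def pick_and_place(arm: JetMaxArm, source, target) -> None:
    """Open-loop pick-and-place: descend, grab, lift, move, release."""
    arm.move_to(*source)
    arm.suction(True)
    time.sleep(0.5)  # give the pump time to build vacuum
    arm.move_to(source[0], source[1], source[2] + 50)  # lift 50 mm
    arm.move_to(*target)
    arm.suction(False)


pick_and_place(JetMaxArm(), source=(150, 0, 20), target=(0, 150, 20))
```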