Unidice: The Future of Gaming, Gamification and Smart Home Integration

The gaming world is on the cusp of a groundbreaking revolution with the introduction of the unidice, a new console that promises to redefine our understanding of games and gamification. This innovative device, blending cutting-edge technology with traditional gaming fun, offers a rich gaming experience and opens the door to a wide range of gamification applications. Furthermore, the unidice’s integration of generative AI in some of its apps, particularly the Tale Spinner application, along with a machine learning core that powers its Inertial Measurement Unit (IMU), signals a leap toward more interactive and intelligent gaming and beyond. An especially notable feature is its compatibility with the Matter protocol, made possible by its new and improved processor, which makes it an ideal tool for smart home management.

The Unidice: A Nexus of Hybrid Gaming 

At its heart, the unidice is an electronic dice featuring six touch displays, a processor, a gyroscope, and dual storage systems, designed to facilitate seamless communication via Bluetooth with smartphones and tablets. This capability allows for a hybrid gaming experience that combines the physical pleasure of rolling dice with the limitless possibilities of digital innovation. Beyond enhancing traditional gaming experiences, this feature significantly broadens the horizon for gamification in various domains, including education and sports. Moreover, with its compatibility with the Matter protocol, the unidice also steps into the realm of smart home technology, offering users a unique and interactive way to control and automate their home environments.

Unidice and Robotics: Pioneering Smart Device Interaction

In a notable showcase of its expansive capabilities, the unidice demonstrated control over a car provided by fischertechnik, highlighting its potential in robotics and smart device interaction. This feature is a testament to the versatility of the unidice, illustrating its capacity to extend beyond gaming and home automation into the realm of educational robotics and interactive learning. By integrating with physical devices, the unidice opens up new possibilities for gamification and educational applications, offering a hands-on approach to technology and engineering concepts. To see this feature in action, watch the convention video in which the unidice demonstrates its new features, games and, notably, the car control.

Generative AI and Tale Spinner: A New Chapter in Storytelling 

The unidice revolutionizes storytelling through its Tale Spinner application, which uses generative AI to create personalized stories for children. By rolling the dice and selecting main topics, kids can generate new story segments narrated by an AI voice, powered by GPT-4 Turbo and Whisper. This innovative use of generative AI transforms the unidice from a mere gaming console into a dynamic storytelling companion, encouraging creativity and a love for reading in young users.
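The Tale Spinner’s actual implementation is not public, but the flow it describes – roll a topic, ask a GPT-4 Turbo model for the next story segment – can be sketched in a few lines with the OpenAI Python SDK. The prompt wording, the model choice as used here, and the example topic are illustrative assumptions, not unidice code:

```python
# Minimal sketch (not the actual Tale Spinner code): turn a rolled topic
# into a short story segment using the OpenAI API.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def next_story_segment(topic: str, story_so_far: str = "") -> str:
    """Ask the model for the next short segment of a children's story."""
    response = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[
            {"role": "system",
             "content": "You write gentle, child-friendly stories in short segments."},
            {"role": "user",
             "content": f"Story so far: {story_so_far or '(start of story)'}\n"
                        f"The dice roll chose the topic: {topic}.\n"
                        "Continue the story with one short paragraph."},
        ],
    )
    return response.choices[0].message.content

print(next_story_segment("a friendly dragon"))
```

The narrated AI voice and the Whisper-based voice input mentioned above would sit around this text-generation step; they are not shown here.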

Machine Learning and the IMU: Enhancing Interaction

The unidice incorporates a machine learning core within its IMU, enabling it to “learn” new input methods through firmware updates and to improve its movement and orientation detection. This advancement makes interactions more intuitive and responsive, paving the way for more immersive gaming and gamification applications.
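How the IMU’s machine learning core is configured is not disclosed, but the kind of motion classification it enables can be illustrated with a toy example: deriving a gesture label from a short window of accelerometer samples. The features and thresholds below are assumptions chosen for readability, not unidice firmware values:

```python
# Minimal sketch of motion-event classification from accelerometer samples.
# Features and thresholds are illustrative assumptions only.
import math

def classify_motion(samples):
    """samples: list of (ax, ay, az) in g, taken over a short window."""
    mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in samples]
    peak = max(mags)
    spread = max(mags) - min(mags)
    if peak > 2.5 and spread > 2.0:
        return "roll"   # large, sustained acceleration changes
    if peak > 1.8:
        return "tap"    # short, sharp spike
    if spread < 0.2:
        return "rest"   # nearly constant 1 g: the die is lying still
    return "tilt"       # slow orientation change

window = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.0, 0.1, 0.98)]
print(classify_motion(window))  # -> "rest"
```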

Smart Home Integration: The Matter Protocol Advantage 

The unidice’s ability to use the Matter protocol, enabled by its advanced processor, marks a significant development in smart home integration. This feature allows the unidice to act as a smart home controller, providing a novel and interactive method for managing and automating home environments. Through simple gestures, taps, tilts and rolls, users can control lighting, temperature, and other smart devices, making the unidice a central hub for both entertainment and home management.
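Neither the unidice’s event API nor its Matter integration details are documented publicly, so the following is only a conceptual sketch of how dice events could be mapped to smart home actions; the event names, device names and the send_command() stand-in are hypothetical:

```python
# Conceptual sketch of mapping dice events to smart home actions.
# Event names, device names and send_command() are hypothetical.
ACTIONS = {
    ("face_up", 1):     ("living_room_light", "toggle"),
    ("face_up", 2):     ("thermostat", "target_21c"),
    ("shake", None):    ("all_lights", "off"),
    ("double_tap", None): ("media_player", "play_pause"),
}

def send_command(device: str, command: str) -> None:
    # Stand-in for whatever Matter controller or bridge API is actually used.
    print(f"-> {device}: {command}")

def on_dice_event(kind: str, value=None) -> None:
    action = ACTIONS.get((kind, value)) or ACTIONS.get((kind, None))
    if action:
        send_command(*action)

on_dice_event("face_up", 1)   # living_room_light: toggle
on_dice_event("shake")        # all_lights: off
```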

The unidice exemplifies the convergence of gaming, artificial intelligence, robotics, and smart home technology, heralding a new era of interactive entertainment and practical utility. Its fusion of physical gaming with digital innovation, powered by generative AI and machine learning, positions it as a pioneering console not only in the gaming industry but also in the broader context of smart living. As we move forward, the unidice stands as a testament to the potential of blending traditional gaming elements with the latest technological advancements to create experiences that are entertaining, educational, and enriching, while also seamlessly integrating into our daily lives through smart home technology.

Discover more about unidice and secure your own by visiting their website at www.unidice.world and preordering at https://unidice.stimulus-games.com/.

Experience the Future of Companionship with Doly, Launching Soon on Kickstarter from Limitbit

Limitbit, a pioneer in AI-powered companion technology, has announced the launch of its groundbreaking product, Doly, on Kickstarter. As of today, Doly has already captured the imagination of tech enthusiasts and educators, generating significant interest ahead of its official Kickstarter launch scheduled for February 13, 2024.

Doly, priced at $289 as a launch-day special, is an autonomous, AI-powered companion robot that seamlessly integrates robotics, AI, and coding education into one dynamic device. It is the first of its kind to offer open hardware and an open design, powered by Raspberry Pi, allowing customization and continual evolution of its capabilities.

“Doly represents a fusion of companionship, education, and technological innovation,” says Levent Erenler, founder of Limitbit. “It’s designed to grow and adapt, offering an engaging experience for all ages and skill levels. Our open-source approach places Doly at the forefront of personal robotic innovation.”

Product highlights of Doly include: 

Self-acting Personality: A unique character that develops and evolves through interaction, offering a personalized experience. 

Edge AI Processing: Ensuring maximum privacy, Doly’s AI algorithms operate locally, without relying on cloud processing, safeguarding user data.

STEM Education Enabler: Doly serves as an engaging tool for learning coding and robotics, catering to both beginners and advanced users. 

Open-Source Platform: Users can innovate and customize Doly, thanks to its open hardware and open design, fostering a community-driven approach to technological advancement. 

Extensive Add-On Support: Featuring a range of I/O ports, Doly offers extensive opportunities for expansion and customization, perfect for developers and hobbyists. 

3D Printable Design: Emphasizing its customizable nature, Doly can be personalized with 3D printed parts, allowing users to tailor its appearance and functions. 

Targeted towards a wide audience that includes robot lovers, parents, children, software and hardware developers, and open-source enthusiasts, Doly is positioned as the ultimate educational and interactive robot companion. 

“Doly is not just a product; it’s a step towards a future where technology enhances every aspect of learning and daily living,” added Levent Erenler. “Its ability to engage users in coding, robotics, and AI, while also serving as a companion robot, sets a new benchmark in the field.”

About Limitbit: 

Based in Markham, Ontario, Limitbit is dedicated to revolutionizing AI-powered companion robots. Their mission is to blend cutting-edge technology with practical, educational applications, making advanced robotics accessible to everyone.

For more information about Doly and to participate in the Kickstarter campaign, click here. 

fischertechnik Design Studio allows you to build models on the web

The groundbreaking fischertechnik Design Studio now offers the opportunity to create models online with fischertechnik building blocks. All components from the current range are available in the new ft Design Studio in 3D format. The result is impressive virtual buildings that can be viewed from all sides, just like in a real design office.

The fischertechnik Design Studio brings fun and games into the children’s room, trains spatial thinking and stimulates creativity. Children are immersed in a digital world where they can combine building blocks and create fascinating 3D models. The structures they create can be effortlessly rotated and viewed from different angles. Photo modes can be used to capture the constructions and compare them with other builds. Once logged in, users can exchange ideas with other creative members of the fischertechnik design community.

After the virtual design on the screen, the models can be rebuilt with physical fischertechnik building blocks. If needed, the required parts can be obtained via the platform www.santjohanser.com to ensure seamless implementation of the ideas.

Young designers can use it not only to design their creations virtually, but also to take them in their hands and assemble them in the physical world. “Our goal at fischertechnik is to combine virtual functional principles with haptics in a modern and cool way. With the new fischertechnik Design Studio, we have achieved a milestone in this regard,” says Marc Schrag, fischertechnik Sales Manager for Germany, Austria and Switzerland. The Design Studio can be downloaded free of charge from the fischertechnik website. Learning tutorials are available to help users get to know the platform.

Further information:

ft Design Studio (fischertechnik-cloud.com)

Moley Robotics Unveils World’s First Luxury Robot Kitchen Showroom in London

London, 15th December 2024 – Moley Robotics, a pioneer in culinary automation, is proud to announce the grand opening of the world’s first luxury robot kitchen showroom in the heart of London. This revolutionary space, located at 16 Wigmore Street, marks a significant milestone in the fusion of technology and gastronomy, offering visitors a first-hand experience of the future of automated cooking that is set to revolutionise the culinary landscape.

An Immersive Culinary Journey

The showroom is a testament to Moley Robotics’ commitment to transforming the way we think about and engage with cooking. Stepping into this cutting-edge showroom is like entering a realm where culinary dreams meet technological prowess. The showroom has been meticulously designed to provide an immersive and interactive experience, showcasing the advanced cooking capabilities of the Moley Robotic kitchens and distinctive kitchen designs, crafted from premium materials including Glacier White Corian, Patagonian marble and high gloss Eucalyptus wood panels.

Visitors will be captivated by the graceful human-like movements of the robotic arms as they seamlessly prepare gourmet meals in the state-of-the-art kitchen, equipped exclusively with premium appliances from globally renowned brands such as Siemens, Gaggenau, and Miele. The latest advancements in robotics and artificial intelligence demonstrate the unparalleled precision and versatility of the Moley system. The showroom aims to transport visitors into the future of home cooking, where efficiency, elegance, and innovation converge.

Unveiling the Future of Home Cooking

The centrepieces of the showroom are, of course, the Moley Robotic kitchens: Chef’s Table, X-Air and A-Air. Visitors will have the opportunity to witness live demonstrations of the robotic arms in action, showcasing the system’s ability to faithfully replicate recipes from an extensive library curated by world-renowned chefs, including three-Michelin-starred Andreas Caminada, MasterChef winner Tim Anderson, award-winning Andrew Clarke and sushi grandmaster Kiichi Okabe. From delicate stirring to precise seasoning, the robotic arms perform each task with a level of skill and dexterity previously reserved for the most accomplished chefs.

“We are thrilled to open the doors to the world’s first luxury robot kitchen showroom in London,” said Mark Oleynik, CEO of Moley Robotics. “This space is not just a showcase of our technology; it’s an invitation for people to experience first-hand the future of home cooking. The Moley Robotic Kitchen is a game-changer, and this showroom is the perfect platform to share our vision with the world.”

A Gourmet Experience for All

The luxury showroom isn’t just about awe-inspiring technology; it’s about making gourmet experiences accessible to everyone. The Moley Robotic Kitchen is designed to cater to a wide range of culinary preferences and dietary needs. Visitors can explore the user-friendly interface, customise recipes, and witness the system adapt to individual preferences, showcasing the versatility that makes Moley Robotics the leaders in the world of culinary robotics.

Additionally, the showroom will host live cooking events, allowing guests to taste the delicious creations prepared by the Moley Robotic Kitchen. This hands-on experience aims to bridge the gap between futuristic technology and the joy of savouring exquisite meals, reinforcing the idea that automation can enhance, rather than entirely replace, the human experience in the kitchen.

Innovating with Elegance

Beyond its technological marvels, the luxury showroom reflects Moley Robotics’ commitment to design and aesthetics, which has been forged by a fruitful five-year collaboration with the renowned Italian design house, Minotti Collezioni. The Moley Robotic Kitchen seamlessly integrates into modern kitchen spaces, and the showroom itself is a testament to the marriage of innovation and elegance. The sleek, contemporary design of the kitchen setup and the overall ambiance of the space create an environment that is both inviting and forward-thinking.

“As we open the world’s first luxury robot kitchen showroom, we’re not just unveiling a product; we’re introducing a lifestyle – a future where technology elevates our culinary experiences,” added Mark Oleynik. “Our goal is to inspire and empower individuals to reimagine their relationship with cooking.”

Visit Us Today

The Moley Robotics luxury robot kitchen showroom is located at 16 Wigmore Street, London, W1U 2RF, and is open to the public by appointment starting 15th December. Visitors are invited to make an appointment on the Moley website and explore the future of home cooking, witness live demonstrations, and immerse themselves in a culinary experience like no other. For more information, visit moley.com.

About Moley Robotics:

Moley Robotics is a leading innovator in the field of culinary automation, dedicated to redefining the way we approach cooking at home. With a focus on precision, convenience, and elegance, Moley Robotics is at the forefront of the integration of robotics and artificial intelligence in the kitchen.


Moley Robotics Kitchen Demo Studio

16 Wigmore Street, London, W1U 2RF

HQ NextPCB Introduces New PCB Gerber Viewer: HQDFM Online Lite Edition

HQ NextPCB is proud to announce the release of HQDFM Online Gerber Viewer and DFM Analysis Tool, free for everyone. With HQDFM, NextPCB hopes to empower designers with the DFM tools and knowledge to perfect their designs as early as possible, where problems are the least costly.

The PCB manufacturing industry has relied, and still relies, on the Ucamco Gerber standard to communicate PCB production data to manufacturers, which are often located overseas. While the 2D image format overcomes language barriers, the language of PCB Design for Manufacture is still ambiguous to many inexperienced designers, and the division between PCB design and PCB manufacture remains an expensive obstacle leading to missed deadlines, hit-and-miss runs and quick-to-fail products.

“There are still many designers that send off their production files without any knowledge of what the files contain. Resolving issues with the fab house becomes a challenge for both sides without the appropriate tools and knowledge to debug the design, which is something we hope to change for everyone’s benefit,” said HQ NextPCB CEO, Alex Chen.

HQDFM Gerber Viewer is a completely free, online tool for displaying and reviewing PCB Gerber and ODB++ files. It incorporates HQDFM’s Design for Manufacture (DFM) algorithms, which are based on NextPCB’s 15 years of high-reliability manufacturing expertise and current industry standards.

Unlike EDA software’s built-in DRC checks, HQDFM allows designers to navigate and analyze the production files and check for design issues that may impact manufacturing or cause long-term reliability issues. With HQDFM, designers are given a manufacturer’s perspective and have valuable insights into how they can improve their designs.

Based on the free desktop version, the cross-platform, online HQDFM makes the easy-to-use Gerber Viewer function and DFM analysis available to everyone, including Mac and Linux users with no download or installation and no sign-up. Anyone can freely upload PCB production files to the online interface and view their design in seconds.

HQDFM supports Gerber X2 and RS-274X formats, Excellon drill files and ODB++ files, and works with the Google Chrome, Safari, Microsoft Edge, Mozilla Firefox and Opera browsers.

Features:

– Check for over 20 Design for Manufacture issues
– Navigate your PCB layer-by-layer
– Free downloadable DFM Report for more detailed analysis
– Easy, one-click ordering with HQ NextPCB
– Absolutely free, no sign-up or obligation

What does HQDFM check?

Design for Manufacturability – whether PCB elements are within the capabilities of most manufacturers in the industry, from low-cost quick-turn services to advanced services (a minimal illustration of one such check is sketched after these lists):

– Trace width/spacing
– Drill hole/slot sizes
– Clearances to copper and board outline elements
– Special drill holes
– Solder mask openings and more

Design for Cost – detect production difficulties that may cause costs or lead times to increase:

– Drill hole fees
– Testing fees
– Surface finish fees
– Small trace widths/spacings
– Special drill hole fees

Design for Reliability – detect problems that may not affect production but impact the normal operation or lifetime of the boards:

– Shorts/opens
– Missing solder mask dams or openings
– Missing/extra pads, drill holes, annular rings
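HQDFM’s own rule engine is proprietary, but the flavour of a single manufacturability check can be illustrated in a few lines of Python. The 0.15 mm limit and the Trace structure below are assumptions made for the example, not HQ NextPCB’s actual rules:

```python
# Illustrative sketch of one DFM-style check: flag traces narrower than a
# fab's minimum width. The limit here is a typical low-cost capability,
# not an HQ NextPCB figure.
from dataclasses import dataclass

@dataclass
class Trace:
    net: str
    layer: str
    width_mm: float

MIN_TRACE_WIDTH_MM = 0.15  # assumed example limit (~6 mil)

def check_trace_widths(traces):
    """Return a list of human-readable DFM findings."""
    findings = []
    for t in traces:
        if t.width_mm < MIN_TRACE_WIDTH_MM:
            findings.append(
                f"{t.net} on {t.layer}: width {t.width_mm:.3f} mm "
                f"below minimum {MIN_TRACE_WIDTH_MM:.2f} mm")
    return findings

example_traces = [Trace("VCC", "Top", 0.30), Trace("CLK", "Top", 0.10)]
for finding in check_trace_widths(example_traces):
    print(finding)
```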

About NextPCB

HQ NextPCB and HQ Online are the overseas trading brands of Shenzhen Huaqiu (HQ) Electronics Co. Ltd., a reliable multi-layer PCB manufacturer and assembly house. Their capabilities include up to 32-layer boards, blind and buried vias up to HDI 3, custom stack-ups, full turnkey assembly and more, via a smart quotation platform with dedicated one-to-one support. NextPCB also offers free, reliable PCB assembly for everyone.

Founded in 2009, NextPCB’s passion for fast, reliable and affordable full-featured electronics manufacturing has driven NextPCB to innovate and modernize the electronics manufacturing industry. NextPCB developed and maintains the software HQDFM, a groundbreaking tool for designers to analyze PCB Gerber files and detect design issues. Over 300,000 users around the world already choose NextPCB. Try them for your next design at HQ NextPCB.com and see how they can accelerate your workflow.

Block coding for all modern LEGO® hubs

Endless creativity and fun with smart LEGO® bricks using Pybricks

November, 2023 – Pybricks Headquarters: Today, the Pybricks team presents the first beta release of block coding for all modern LEGO® hubs. For the first time, fans of all LEGO themes can bring their smart bricks together in a single app for endless possibilities and creativity.

Whether you want to make smart train layouts, autonomous Technic machines, interactive BOOST creatures, or super-precise SPIKE and MINDSTORMS robots, you can do it with Pybricks.

Pybricks is beginner-friendly and easy to use. There’s no need to install complicated apps or libraries either. Just go to https://beta.pybricks.com, update the firmware, and start coding.

And now for the first time, no prior Python coding experience is required. You can code with familiar but powerful blocks, and gradually switch to Python when you’re ready. The live preview makes it easy to see how your blocks translate to Python code.
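As a taste of what the block-to-Python preview produces, here is a short program written against the public Pybricks API; the SPIKE Prime hub and a motor on Port A are simply example hardware:

```python
# Example Pybricks program of the kind the block preview generates.
from pybricks.hubs import PrimeHub
from pybricks.pupdevices import Motor
from pybricks.parameters import Color, Port
from pybricks.tools import wait

hub = PrimeHub()
motor = Motor(Port.A)

hub.light.on(Color.GREEN)   # show that the program is running
motor.run_angle(360, 720)   # turn 720 degrees at 360 deg/s
wait(1000)                  # pause one second
hub.light.on(Color.RED)
```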

Meanwhile, more seasoned builders and robotics teams will enjoy advanced features such as color sensor calibration or built-in gyro control for drive bases.

The new block coding experience is exclusively available to our supporters on Patreon. You can sign up for a monthly subscription or make a one-time pledge in our shop for lifetime access.

Python coding remains entirely free and open source, and continues to be supported by a community of developers and LEGO enthusiasts around the world. Improvements are made almost every day, with the lead developers actively engaging with the community for ideas, bug fixes, and brand new features.

So grab your LEGO sets and start coding!

ABB is the first manufacturer to provide intuitive, block-based no-code programming for all cobots and six-axis industrial robots

  • First-time users can program their collaborative robots and industrial robots for free within minutes
  • System integrators and experienced users can develop, share, and customize sophisticated programs for application-specific features

ABB Robotics has expanded the scope of its free Wizard Easy Programming software for collaborative robots to include all six-axis industrial robots running on an ABB OmniCore™ controller. This makes ABB the first robot manufacturer to offer an easy-to-use no-code programming tool for cobots and six-axis industrial robots. This lowers the barriers to automation for early adopters and provides ecosystem partners and integrators with an efficient tool to support their customers.

“If we want to promote and advance the use of robotic automation on a global scale, we need to address the challenges and opportunities of the industry,” says Marc Segura, head of the robotics division at ABB. “By adding our six-axis industrial robots to Wizard Easy Programming, ABB Robotics is responding to the skills shortage and increasing demand from manufacturing companies for simple and easy-to-use programming software for their robot fleets.”

Create robot applications without prior training

Wizard Easy Programming uses a graphic, drag-and-drop, no-code programming approach designed to simplify the development of robotic applications. The software allows both first-time and experienced robot users to create applications in minutes – a task that typically requires a week of training and another week of development work. Since its launch in 2020, Wizard Easy Programming has been used in a wide range of applications in conjunction with ABB’s YuMi, SWIFTI™ and GoFa™ collaborative robots.

Wizard Easy Programming, previously available for ABB’s collaborative robots, is now available for all of the company’s six-axis industrial robots. (Image: ABB)

The software offers users the opportunity to create complete programs for applications such as arc welding or machine tending without prior training. An intuitive graphical user interface allows users to customize existing programs and pre-programmed blocks to control various actions – from robot movements to signal instructions and force control – for added flexibility.

Efficiently generate specific codes for specific applications

Wizard Easy Programming also includes Skill Creator, a tool that helps system integrators and experts create custom, application-specific wizard blocks for their customers. Skill Creator simplifies the creation of new blocks for highly specific tasks such as machine tending and welding, but also for difficult applications such as medical tests. Ecosystem partners who develop accessories such as grippers, feeding systems and cameras will have access to a digital tool that allows them to share product-specific functionalities regardless of the type of robot to be used.

Wizard Easy Programming is pre-installed on all cobots and new six-axis industrial robots running ABB’s OmniCore controller. The leading robot controllers of the OmniCore family are characterized by an energy saving potential of 20 percent on average and a high degree of future-proofing – thanks to integrated digital connectivity and over 1,000 scalable functions.

More information about Wizard Easy Programming is available here.

Igus expands its portfolio around the ReBeL with a humanoid hand made of lubrication-free high-performance plastics

Cologne, November 14, 2023 – Robots have become an integral part of industry and are increasingly finding their way into small and medium-sized companies in the form of cobots such as the ReBeL. They sort, pick and move with the help of cameras, vacuum grippers and gripping systems. In order to be able to take on humanoid tasks, igus has now developed a finger gripper for the ReBeL cobot. It is made entirely of lubrication-free plastics, which makes it very cost-effective and easy to integrate.

With the ReBeL, igus has launched a compact and lightweight cobot on the market that makes it possible to enter robotics at low cost. This makes it particularly suitable for assembly tasks, quality inspections and work in the service sector. For the robot to really get to work, grippers and suction cups are necessary, and igus offers a wide selection of suitable end effectors from various manufacturers on the marketplace RBTX.com. “Since the ReBeL is very light and affordable, with a weight of around 8 kilograms and a price starting at 3,970 euros, it is widely used in humanoid applications. For this reason, we received a number of customer inquiries for a robotic hand that can be easily connected to the ReBeL via plug-and-play,” explains Alexander Mühlens, Head of the Low Cost Automation business unit at igus GmbH.

igus has therefore developed a particularly cost-effective ReBeL finger gripper, which is available for as little as 1,840 euros. The humanoid hand is compatible with all ReBeL models. It is controlled via DIO at the Tool Center Point, which allows for easy integration and flexibility in various applications. The special feature of the finger gripper is that it can imitate human hand movements. “With the new low-cost hand, the ReBeL can take on a wide range of simple humanoid tasks and applications. We are thinking of research and development at universities, but tasks in gastronomy or the entertainment industry are also conceivable,” says Alexander Mühlens.

High-performance plastics ensure precise movements
All components, including the flange, cables and control, come directly from igus in Cologne, so the customer receives a 100 percent compatible solution. The low price is made possible by lubrication-free high-performance plastics: the plain bearings in the joints, made of iglidur polymers, are not only cost-effective and lubrication-free but also enable smooth and precise movements of the individual fingers. Extensive testing in the in-house 3,800 square meter laboratory guarantees the longevity of the humanoid hand. The hand is extremely flexible and can be controlled via various interfaces, including USB, TTL (5 V) serial and internal scripting. In addition to the finger gripper, igus offers other products for the ReBeL environment, including fire protection hoods, a 7th axis, gripper sets, adapter plate sets, energy supply systems, a complete workstation and connection cables.
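igus does not publish the command set for those interfaces here, so the following is only a rough illustration of what driving the hand over the TTL serial port could look like with pyserial; the port name, baud rate and the "F<finger> <position>" text command are invented for the example:

```python
# Hypothetical sketch of driving the finger gripper over a TTL serial port.
# The command format below is NOT the real igus protocol.
import serial

def set_finger(port: serial.Serial, finger: int, position: int) -> None:
    """Move one finger to a position between 0 (open) and 100 (closed)."""
    port.write(f"F{finger} {position}\n".encode("ascii"))

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    for finger in range(1, 6):        # close fingers 1-5 into a simple grasp
        set_finger(port, finger, 80)
    set_finger(port, 1, 0)            # open finger 1 again
```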

IDS NXT malibu: Camera combines advanced consumer image processing and AI technology from Ambarella and industrial quality from IDS

New class of edge AI industrial cameras allows AI overlays in live video streams
 

IDS NXT malibu marks a new class of intelligent industrial cameras that act as edge devices and generate AI overlays in live video streams. For the new camera series, IDS Imaging Development Systems has collaborated with Ambarella, a leading developer of visual AI products, making consumer technology available in industrial quality for demanding applications. The camera features Ambarella’s CVflow® AI vision system on chip and takes full advantage of the SoC’s advanced image processing and on-camera AI capabilities. Consequently, image analysis can be performed at high speed (>25 fps) and displayed as live overlays in compressed video streams delivered to end devices via the RTSP protocol.
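Because the overlays are rendered on the camera and delivered as a standard RTSP stream, an end device needs no extra AI tooling to display them. As a minimal sketch (the stream URL is a placeholder; the real address comes from the IDS NXT configuration), OpenCV is enough to show the live feed:

```python
# Minimal sketch of displaying the camera's RTSP stream (overlays already
# burned in) on an end device. The URL is a placeholder.
import cv2

stream = cv2.VideoCapture("rtsp://camera-ip/stream")  # placeholder URL
while stream.isOpened():
    ok, frame = stream.read()
    if not ok:
        break
    cv2.imshow("IDS NXT malibu", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

stream.release()
cv2.destroyAllWindows()
```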

Thanks to the SoC’s integrated image signal processor (ISP), the information captured by the light-sensitive onsemi AR0521 image sensor is processed directly on the camera and accelerated by its integrated hardware. The camera also offers helpful automatic features, such as brightness, noise and colour correction, which significantly improve image quality.

“With IDS NXT malibu, we have developed an industrial camera that can analyse images in real time and incorporate results directly into video streams,” explained Kai Hartmann, Product Innovation Manager at IDS. “The combination of on-camera AI with compression and streaming is a novelty in the industrial setting, opening up new application scenarios for intelligent image processing.”

These on-camera capabilities were made possible through close collaboration between IDS and Ambarella, leveraging the companies’ strengths in industrial camera and consumer technology. “We are proud to work with IDS, a leading company in industrial image processing,” said Jerome Gigot, senior director of marketing at Ambarella. “The IDS NXT malibu represents a new class of industrial-grade edge AI cameras, achieving fast inference times and high image quality via our CVflow AI vision SoC.”

IDS NXT malibu has entered series production. The camera is part of the IDS NXT all-in-one AI system, whose optimally coordinated components – from the camera to the AI vision studio – accompany the entire workflow, from the acquisition and labelling of images through to the training of a neural network and its execution on the IDS NXT cameras.

Robot plays „Rock, Paper, Scissors“ – Part 1/3

Gesture recognition with intelligent camera

I am passionate about technology and robotics, and here in my own blog I am always taking on new tasks. But I have hardly ever worked with image processing. However, a colleague’s LEGO® MINDSTORMS® robot, which can recognize the rock, paper or scissors gestures of a hand with several different sensors, gave me an idea: “The robot should be able to ‘see’.” Until now, the respective gesture had to be made at a very specific point in front of the robot in order to be reliably recognized. Several sensors were needed for this, which made the system inflexible and dampened the joy of playing. Can image processing solve this task more “elegantly”?

Rock-Paper-Scissors with Robot Inventor by the Seshan Brothers – the robot that inspired this project.

From the idea to implementation

In my search for a suitable camera, I came across IDS NXT – a complete system for the use of intelligent image processing. It fulfilled all my requirements and, thanks to artificial intelligence, offered much more besides pure gesture recognition. My interest was piqued, especially because the evaluation of the images and the communication of the results take place directly on or through the camera – without an additional PC! In addition, the IDS NXT Experience Kit came with all the components needed to get the application up and running immediately – without any prior knowledge of AI.

I took the idea further and began to develop a robot that would one day play “Rock, Paper, Scissors” following a process much like the classic game: the (human) player is asked to perform one of the familiar gestures (scissors, rock, paper) in front of the camera. The virtual opponent has already randomly determined its gesture at this point. The move is evaluated in real time and the winner is displayed.

The first step: Gesture recognition by means of image processing

But until then, some intermediate steps were necessary. I began by implementing gesture recognition using image processing – new territory for me as a robotics fan. However, with the help of IDS Lighthouse – a cloud-based AI vision studio – this was easier to realize than expected. There, ideas evolve into complete applications: neural networks are trained with application images that carry the necessary domain knowledge – in this case the individual gestures from different perspectives – and are then packaged into a suitable application workflow.

The training process was super easy: I just used IDS Lighthouse’s step-by-step wizard after taking several hundred pictures of my hands showing rock, scissors or paper gestures from different angles and against different backgrounds. The first trained AI was already able to recognize the gestures reliably. This works for both left- and right-handers with a recognition rate of approx. 95%. Probabilities are returned for the labels “Rock”, “Paper”, “Scissor”, and “Nothing”. A satisfactory result. But what happens now with the data obtained?

Further processing

The further processing of the recognized gestures can be done by means of a specially created vision app. For this, the captured image of the respective gesture – after evaluation by the AI – must be passed on to the app. The app “knows” the rules of the game and can thus decide which gesture beats another; it then determines the winner. In the first stage of development, the app will also simulate the opponent. All of this is currently in the making and will be implemented in the next step on the way to a “Rock, Paper, Scissors”-playing robot.
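The rules the app has to encode are small enough to sketch in a few lines. The snippet below is only my illustration of that logic – label names as returned by the AI, plus a confidence threshold I chose arbitrarily – and not the vision app itself:

```python
# Illustrative sketch of the game logic, not the actual vision app:
# reduce the AI's per-label probabilities to one gesture, let the virtual
# opponent pick a random gesture, and decide the winner.
import random

BEATS = {"Rock": "Scissor", "Scissor": "Paper", "Paper": "Rock"}

def decide(probabilities, threshold=0.6):
    """Pick the most likely label, or 'Nothing' if the AI is unsure."""
    label, confidence = max(probabilities.items(), key=lambda item: item[1])
    return label if confidence >= threshold else "Nothing"

def play_round(probabilities):
    player = decide(probabilities)
    if player == "Nothing":
        return "No clear gesture recognized - please try again."
    robot = random.choice(list(BEATS))
    if player == robot:
        return f"Draw: both chose {robot}."
    winner = "Player" if BEATS[player] == robot else "Robot"
    return f"Player: {player}, Robot: {robot} -> {winner} wins."

print(play_round({"Rock": 0.91, "Paper": 0.05, "Scissor": 0.03, "Nothing": 0.01}))
```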

From play to everyday use

At first, the project is more of a gimmick. But what could come out of it? A gaming machine? Or maybe even an AI-based sign language translator?

To be continued…