LEGO Mindstorms Robot Inventor/SPIKE Prime (51515/45678): Adapter Board for the Ultrasonic Sensor

Guest post by brickobotik:

The SPIKE™ Prime from LEGO Education has now been on the market for over a year. We presented it to you in detail in our big review. The Inventor 51515, the home version of the SPIKE™ Prime, is now available as well. For both, the software has now reached an adequate level. We have also published our e-book on the SPIKE™ Prime Classroom software, which is definitely worth a look for anyone who still has questions about programming the robot.

For us at brickobotik, however, the work with the SPIKE™ Prime continues. Partly, of course, in the workshops and training courses we run on this robot. But the electronics of the SPIKE™ also keep us busy. So in this article we give you a small glimpse into our "brickobotik tinkering workshop" and present a project we are currently working on.


Many of you will have noticed that, unlike the other sensors, the ultrasonic sensor of the SPIKE™ Prime and Mindstorms Inventor has two Torx screws on its back. If you unscrew them, you can remove the white sensor unit of the ultrasonic sensor and are left holding only the black shell. Inside it, the cable from the LEGO Powered Up connector arrives and is routed to a female header.

This female header (an 8-pin female header) has a very small pitch of 1.27 mm, so using it with standard Arduino jumper wires can be quite fiddly. That is why we developed a matching adapter board that translates the small header to the usual 2.54 mm pitch familiar from Arduinos, breadboards, perfboards, etc.

Technical details of the board

The Power Functions 2.0 connection carries six contacts:

1× 3.3 V power supply
1× GND
2 digital inputs/outputs (GPIO), which can also be used for UART (115200 baud, 8N1)
Caution! The GPIOs do not supply enough current to drive LEDs directly! A transistor circuit is needed to power an LED from the 3.3 V supply.

2× PWM for motors

Caution! The voltage of these signals comes straight from the SPIKE™ Prime's battery! According to our measurements, it ranges between 6.3 V and 8.4 V.

For the GPIO contacts, the board provides one resistor each, which offers minimal protection against incorrect GPIO configurations. They can also simply be bridged.

The same contacts are broken out again to the left and right. On one side of the board are the GPIO contacts with the power supply, and on the other side the PWM contacts with the power supply – both in a 2.54 mm pitch and in a 2.00 mm pitch for the Grove connector system. For the contacts on the left and right, the 3.3 V supply is interrupted by an open solder bridge so that, for example, when a Calliope mini is used, the independent power supplies of the two devices do not compete destructively. The open solder bridge can be closed with a little solder if needed.

New possibilities thanks to the board

With the board, it is much easier to connect additional sensors or motors and use them with the SPIKE™ Prime. A connection to microcontrollers such as the Calliope mini is also possible. But there is one important caveat: such projects are better suited to advanced users. Both the wiring and the programming require experience with electronics and the relevant sensor protocols.

Technical details on controlling the contacts

Direct control of the contacts works via the SPIKE™ Prime app, but only in Python projects and entirely at your own risk. There is no LEGO-provided "UltrasonicBreakout" Python module or the like. However, descriptions and guides for the relevant MicroPython classes and methods circulate on the internet. Anyone with experience with other MicroPython devices, especially with using the MicroPython REPL, can get up to speed quickly here.
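To give an idea of what this looks like, here is a minimal MicroPython sketch for switching a hub port into raw UART mode and echoing incoming bytes. It is based on the community write-ups mentioned above, not on an official LEGO interface: the hub.port module, the MODE_FULL_DUPLEX constant and the baud() call are assumptions that may differ between firmware versions.

# Minimal sketch (assumption, based on community write-ups, not an official LEGO API):
# switch port A of the SPIKE Prime hub to full-duplex UART and echo incoming bytes.
import hub
import utime

port = hub.port.A                      # port the adapter board is plugged into
port.mode(hub.port.MODE_FULL_DUPLEX)   # hand the two GPIO/UART lines over to user code
utime.sleep_ms(500)                    # give the port a moment to switch modes
port.baud(115200)                      # 115200 baud, 8N1, as listed above

port.write(b"hello from SPIKE\r\n")    # send a test string to the attached device
while True:
    data = port.read(32)               # poll for up to 32 bytes from the device
    if data:
        port.write(data)               # echo everything back
    utime.sleep_ms(10)

Such a script can be pasted into a Python project in the SPIKE™ Prime app or sent line by line over the MicroPython REPL.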

Order your own adapter board!

We at brickobotik will keep working with the board to test connections to various sensors. We would also like to give all makers who now feel like experimenting with connections to the SPIKE™ Prime the chance to use our adapter board for it. So if you are interested in the board described here and would like to purchase it through us, send us an e-mail at [email protected]. We collect the inquiries, and once enough interested people have come together, we will let you know by e-mail as soon as the board can be pre-ordered. You don't want to tinker yourself but are interested in a particular sensor that could be connected to the SPIKE™ Prime? Then visit us at www.brickobotik.de and leave a comment or a message with your wishes. We will try to take them into account for future projects.

Wandelbots – No-Code Robotics – Short Interview

Sebastian from Robots-Blog was able to do a short interview with Annelie Harz from Wandelbots. Learn in the interview what Wandelbots is and why programming might soon become obsolete.

Robots-Blog: Who are you and what is your job at Wandelbots?

Annelie: My name is Annelie and I work as a marketing manager at Wandelbots.

Robots Blog: Which robot from science, movies or TV is your favorite?

Annelie: Wall-E, actually. A little robot that does good things and is just adorable.

Robots Blog: What is Wandelbots and where does the name come from?

Annelie: The name describes the CHANGE (German: "Wandel") of RoBOTics. Because that is exactly what we do. We enable everyone to handle robots, something that today is reserved for a small circle of experts. Our long-term company vision is: "Every robot in every company and every home runs on Wandelbots". And that promises big change on a wide variety of levels – starting for us with industry.

Robots Blog: Who is your product aimed at and what do I need for it?

Annelie: Our product is currently aimed at customers from industry. Here, our software – Wandelbots Teaching – can help with programming various applications such as welding or gluing without having to write a line of code. It is designed to be so simple and intuitive that really anyone can work with it to teach a robot a desired result. This works through the interaction of an app and an input device, the TracePen. This takes the form of a large pen with which users can draw a desired path for the robot on the component. But we also work together with educational institutions. They are the ones who train the next generation of robot experts. And in the long term, we are convinced – and this is already part of our vision – that robots will also find their way into private life as little helpers.

Robots-Blog: What feature is particularly worth mentioning?/What can’t anyone else do?

Annelie: Our product works independently of the robot manufacturer. In robotics, each manufacturer has developed its own proprietary programming language over the years. This makes communication between humans and machines very difficult. We, on the other hand, want to create a tool that allows any human to work with any robot – completely independent of programming language and manufacturer. Robotics should be fun for the users of our product. Thanks to the high usability and the operation of our app via iPad, this is already possible today. And over time, application-specific editions will be added to our platform – currently, for example, we are working on an app version for robot welding.

Robots Blog: Do I still need to learn programming at all?

Annelie: No. As I just explained, with this so-called no-code technology, you don’t need to learn programming anymore. It is simple, intuitive and user-friendly, even for laymen. Of course, you always need to have some basic understanding of robotics, especially for safety reasons. You should never underestimate the dangers posed by robots, which is why our product always works according to the respective manufacturer-specific safety specifications.

Robots Blog: What robots are supported? I have a Rotrics DexArm and an igus Robolink DP-5; can I use those as well?

Annelie: Naturally, shortly after entering the market, we first want to make robotics in industry, for example the automotive sector, more flexible and easier. To do this, we are gradually integrating the largest robot brands into our platform. We will certainly also integrate smaller robot brands that cover one or more niches. Or – even better – thanks to our Robot Integration Software Development Kit, robot manufacturers will soon be able to do it themselves.

Robots Blog: How much does your product cost?

Annelie: Our product is offered via a licensing model as a subscription, as is common in the Software as a Service business, or also classically for purchase. The current prices for the different editions can be found on our website (and you will certainly find more exciting content there).

Certification as a professional in image processing by Eye Vision Technology


Image processing is a complex and very extensive topic. In order to make optimal use of the multitude of different application possibilities and functions, EVT has been offering training courses on various image processing topics for several years. Participants learn how to use the innovative EyeVision software correctly, along with its numerous functions and possible applications.

In addition to free knowledge sharing, EVT now also offers its first free certification program, which webinar participants can take part in and benefit from. After successfully completing a test that is independent of time and location, participants receive a certificate and may bear the title "certified Eye Vision Technology professional in image processing". The certification comes with numerous advantages, such as a 10 percent discount on every order placed with EVT, prioritized support via an exclusive contact point, and a listing as a certified professional in image processing on the highly frequented Eye Vision homepage.

Certification not only benefits companies but also customers, because the certificate creates transparency about the knowledge of the person responsible for image processing and for the use of image processing software.

You can find out more about the criteria and registration for the free certification program at www.evt-web.com.

Boston Dynamics expands Spot® product line

NEW SELF-CHARGING ENTERPRISE ROBOT, REMOTE OPERATION SOFTWARE, AND ROBOT ARM ENHANCE SPOT’S CAPABILITIES FOR AUTONOMOUS SITE MONITORING


Waltham, MA – February 2, 2021 – Boston Dynamics, the global leader in mobile robotics, today announced an expanded product line for its agile mobile robot Spot. The new products include a self-charging Enterprise Spot, web-based remote operations software, Scout, and the Spot Arm. These additions extend Spot’s ability to perform autonomous, remote inspections and data collection, and enable the robot to perform manual tasks.

With more than 400 Spots out in the world, the robot has successfully undertaken hazardous tasks in a variety of inhospitable environments such as nuclear plants, offshore oil fields, construction sites, and mines. Customers have leveraged Spot’s advanced mobility, autonomy, control, and customizability to improve operational efficiency, enhance worker safety, and gather critical data. Spot’s new products are designed to enable customers to fully operationalize continuous, autonomous data collection on remote or hazardous worksites of any size, from anywhere they have access to their network.

Autonomy is critical to enhancing Spot’s value. In order to support long, remote deployments, Boston Dynamics is introducing Spot Enterprise, a new version of Spot that comes equipped with self-charging capabilities and a dock, allowing it to perform longer inspection tasks and data collection missions with little to no human interaction. In addition to the basic capabilities that the base Spot robot offers, Spot Enterprise leverages upgraded hardware for improved safety, communications, and behavior in remote environments. These upgrades expand the range that autonomous missions can cover, extend WiFi support, add flexibility to Spot’s payload ports, and enable users to quickly offload large data sets collected during the robot’s mission.

Pivotal to refining Spot’s value at scale is remote operation. Scout is Boston Dynamics’ web-based software that enables operators to control their fleet of Spots from a virtual control room. Operators can use Scout to take Spot anywhere a person could go on-site, allowing them to inspect critical equipment or hazardous areas from afar. The software is designed with a simple user interface to run pre-programmed autonomous missions or manually control the robot, to perform various tasks such as walking or posing the robot to capture images and thermal data of obscured gauges or pipes using the Spot CAM+IR thermal imaging payload.

Combined, the Spot Enterprise robot equipped with a Spot CAM+IR thermal imaging payload, Scout software, and Boston Dynamics’ premium support now create an out-of-the-box solution for asset-intensive environments. Operators can deploy this solution on site to proactively maintain and manage assets while maximizing worker uptime and improving worker safety.

In addition to launching products designed to make remote inspection safer and easier, Boston Dynamics is also releasing the Spot Arm, which enables users to act on data insights and perform physical work in human-centric environments. The arm is equipped to operate through both semi-autonomous actions and telemanipulation. It can manually or semi-autonomously grasp, lift, carry, place, and drag a wide variety of objects. It is also capable of manipulating objects with constrained movement and can open and close valves, pull levers and turn handles and knobs in coordination with its body to open standard push and pull doors.

“Since first launching Spot, we have worked closely with our customers to identify how the robot could best support their mission critical applications,” said Robert Playter, CEO of Boston Dynamics. “Our customers want reliable data collection in remote, hazardous, and dynamic worksites. We developed the new Spot products with these needs in mind, and with the goal of making it easy to regularly and remotely perform critical inspections, improving safety and operations.”

Interested parties can purchase Spot Enterprise, Scout, and the Spot Arm via Boston Dynamics’ sales team. For more information on these new offerings, please visit: www.bostondynamics.com.



About Boston Dynamics

Boston Dynamics is the global leader in developing and deploying highly mobile robots capable of tackling the toughest robotics challenges. Our core mission is to lead the creation and delivery of robots with advanced mobility, dexterity and intelligence that add value in unstructured or hard-to-traverse spaces and positively impact society. We create high-performance robots equipped with perception, navigation and intelligence by combining the principles of dynamic control and balance with sophisticated mechanical designs, cutting-edge electronics and next-generation software. We have three mobile robots in our portfolio – Spot®, Handle™ and Atlas® – as well as Pick™, a computer vision-based robotics solution for logistics. Founded in 1992, Boston Dynamics spun out of the MIT Leg Lab and is one of Inc. Magazine's Best Workplaces of 2020. For more information on our company and its technologies, please visit www.bostondynamics.com.

Blaize Delivers First Open and Code-free AI Software Platform Spanning the Entire Edge AI Application Lifecycle


EL DORADO HILLS, CA — December 2020 — Blaize today fully unveiled the Blaize AI Studio offering, the industry's first open and code-free software platform to span the complete edge AI operational workflow from idea to development, deployment and management. AI Studio dramatically reduces edge AI application deployment complexity, time, and cost by breaking the barriers within existing application development and machine learning operations (MLOps) infrastructure that hinder edge AI deployments. By eliminating the complexities of integrating disparate tools and workflows and introducing multiple ease-of-use and intelligence features, AI Studio reduces the time required to go from models to deployed production applications from months to days.



“While AI applications are migrating to the Edge with growth projected to outpace that of the Data Center, Edge AI deployments today are complicated by a lack of tools for application development and MLOps,” says Dinakar Munagala, Co-founder and CEO, Blaize. “AI Studio was born of the insights to this problem gained in our earliest POC edge AI hardware customer engagements, as we recognized the need and opportunity for a new class of AI software platform to address the complete end-to-end edge AI operational workflow.”



“AI Studio is open and highly optimized for the AI development landscape that exists across heterogeneous ecosystems at the edge,” says Dmitry Zakharchenko, VP Research & Development, Blaize. “With the AI automation benefits of a truly modern user experience interface, AI Studio serves the unique needs in customers’ edge use cases for ease of application development, deployment, and management, as well as broad usability by both developers and domain expert non-developers.”



The combination of AI Studio innovations in user interface, use of collaborative Marketplaces, end-to-end application development, and operational management collectively bridges the operational chasm hindering edge AI ROI. Deployed with the Blaize AI edge computing hardware offerings that address unserved edge hardware needs, AI Studio makes AI more practical and economical for edge use cases where unmet application development and MLOps needs delay the pace of production deployment.



“In our work for clients, which may include developing models for quality inspection within manufacturing, identifying stress markers to improve drug trials or even predicting high resolution depth for autonomous vehicles, it is vital that businesses can build unique AI applications that prove their ideas quickly,” says Tim Ensor, Director of AI, Cambridge Consultants. “AI Studio offers innovators the means to achieve this confidence in rapid timeframes, which is a really exciting prospect.” Cambridge Consultants, part of Capgemini Group, helps the world’s biggest brands and most ambitious businesses innovate in AI, including those within the Blaize ecosystem.

Code-free assistive UI for more users, more productivity
The AI Studio code-free visual interface is intuitive for a broad range of skill levels beyond just AI data scientists, a scarce and costly resource for many organizations. "Hey Blaize" summons a contextually intelligent assistant with an expert knowledge-driven recommendation system to guide users through the workflow. This ease of use enables edge AI app development for wider teams, from AI developers to system builders to business domain subject matter experts.

Open standards for user flexibility, broader adoption
With AI Studio, users can deploy models with one click to plug into any workflow across multiple open standards including ONNX, OpenVX, containers, Python, or GStreamer. No other solution offers this degree of open standard deployment support, as most are proprietary solutions that lock in users with limited options. Support for these open standards allows AI Studio to deploy to any hardware that fully supports the standards.



Marketplaces collaboration
Marketplace support allows users to discover models, data and complete applications from anywhere – public or private – and collaborate continuously to build and deploy high-quality AI applications.

AI Studio supports open public models, data marketplaces and repositories, and provides connectivity and infrastructure to host private marketplaces. Users can continually scale proven AI edge models and vertical AI solutions to effectively reuse across enterprises, choosing from hundreds of models with drag-and-drop ease to speed application development.



Easy-to-Use application development workflow:
The AI Studio model development workflow allows users to easily train and optimize models for specific datasets and use cases, and deploy quickly into multiple formats and packages. With the click of a button, AI Studio's unique Transfer Learning feature quickly retrains imported models for the user's data and use case. Blaize's edge-aware optimization tool, NetDeploy, automatically optimizes the models to the user's specific accuracy and performance needs. With AI Studio, users can easily build and customize complete application flows that go beyond neural networks, such as image signal processing, tracking or sensor fusion functions.



Ground-breaking edge MLOps/DevOps features
As a complete end-to-end platform, AI Studio helps users deploy, manage, monitor and continuously improve their edge AI applications. Built on a cloud-native infrastructure based on microservices, containers and Kubernetes, AI Studio is highly scalable and reliable in production.



Blaize AI Studio Early Adopter Customers Results
In smart retail, smart city and industry 4.0 markets, Blaize customers are realizing new levels of efficiency in AI application development and deployment using AI Studio. Examples include:

– Complete end-to-end AI development cycle reduction from months to days
– Reduction in training compute by as much as 90%

– Edge-aware efficient optimizations and compression of models with a < 3% accuracy drop

– New revolutionary contextual conversational interfaces that eclipse visual UI



Availability
AI Studio is available now to qualified early adopter customers, with general availability in Q1 2021. The AI Studio product offering includes licenses for individual seats, enterprise, and on-premise subscriptions, with product features and services suited to the needs of each license type.



About Blaize


Blaize leads new-generation computing unleashing the potential of AI to enable leaps in the value technology delivers to improve the way we all work and live. Blaize offers transformative computing solutions for AI data collection and processing at the edge of network, with focus on smart vision applications including automobility, retail, security, industrial and metro. Blaize has secured US$87M in equity funding to date from strategic and venture investors DENSO, Daimler, SPARX Group, Magna, Samsung Catalyst Fund, Temasek, GGV Capital, Wavemaker and SGInnovate. With headquarters in El Dorado Hills (CA), Blaize has teams in Campbell (CA), Cary (NC), and subsidiaries in Hyderabad (India), Manila (Philippines), and Leeds and Kings Langley (UK), with 300+ employees worldwide.

Igus Robolink Programming Session #1

My workplace for today was kind of different. Thanks @igusgmbh (https://www.igus.de/robolink/roboter) for making this possible. I am learning a lot about robot programming today. I wish I could have such a powerful robot next to my desk any given day.

LEGO® Star Wars™ BOOST Droid Commander set takes the Force to a new level, introducing the droids you have been looking for…

New LEGO® Star Wars™ BOOST Droid Commander set lets fans build, code and play with three iconic Star Wars droids – whether they’re a young Padawan or Jedi Master

May 4, 2019: Today, the LEGO Group unveils the latest addition to its much-loved Star Wars™ range – and it’s something even Yoda’s Force sense didn’t see coming. The new LEGO Star Wars BOOST Droid Commander set offers all the creativity and coding fun of LEGO® BOOST alongside the chance to build three of the film franchise’s most iconic droids: R2-D2; the Gonk Droid; and the Mouse Droid.

It's the first time the intuitive drag-and-drop LEGO BOOST coding technology has been used in a LEGO licensed product. With the technology overhauled to match the LEGO Star Wars galaxy, the result is a whole new play experience in which kids and parents can team up to build, code and play with the droids, then create their own Star Wars stories and battlegrounds with inspiration from 40+ interactive missions. What's more, every time they play with this LEGO Star Wars brick galaxy of lovable droids, they will also be honing their STEAM (Science, Technology, Engineering, Arts and Maths) skills, which are ever more important for children in today's digital world.

Julia Goldin, Chief Marketing Officer, LEGO Group, said: “We’ve been fuelling the imagination of young Padawans and Jedi Masters for twenty years and wanted to take the Force to a new level. By introducing LEGO BOOST and creative coding into the LEGO Star Wars galaxy, kids now have the chance to develop essential 21st century skills while immersing themselves in the amazing world of Droid Commanders. Our children are the problem solvers of tomorrow and STEAM skills will be essential to help them conquer the challenges of the future.”

The LEGO Star Wars BOOST Droid Commander set is the latest example of how the LEGO Group is using product innovation to help boys and girls gain vital STEAM skills like creativity, critical-thinking, problem-solving and communication – all while enjoying the thrill of playing with their favourite LEGO Star Wars characters.

Launching globally September 1, 2019 just in time to mark the upcoming release of Star Wars: The Rise of Skywalker, all three droids (R2-D2, Gonk Droid and Mouse Droid) included in the set are great fun to build, code and play with, and completely customisable for every child.

“These are the droids you’re looking for.”

LEGO® Star Wars™ BOOST Droid Commander product facts:

  • The set includes a color & distance sensor, interactive motor, Bluetooth Move Hub and 1,177 pieces – enough to build all three lovable droids – R2-D2, the Gonk Droid and the Mouse Droid – each coming with its own personality, skills, and authentic Star Wars sounds and music. It will be age graded 8+ and the recommended retail price will be $199.99/€199.99.
  • R2-D2 measures over 7” (20cm) high and 5” (14cm) wide. LEGO® Gonk Droid measures over 7” (18cm) high, 3” (9cm) wide and 6” (16cm) long. LEGO Mouse Droid measures over 5” (14cm) high, 3” (9cm) wide and 6” (17cm) long.
  • Free LEGO® BOOST Star Wars™ app is available for selected iOS, Android and Fire smart devices. Using the app, young commanders can build the droids, insert the Bluetooth-controlled Move Hub into the droid they want to see solve each of the 40+ missions, and bring it to life using the intuitive drag-and-drop coding environment.
  • Mission examples include:
    • R2-D2:
      • Plot a course
      • Receive and decode an incoming message
      • Party infiltration
      • Assisting flying an X-wing
    • GONK Droid:
      • Arena Training
      • Work as a load lifter
      • Ready for the fighting pit
      • Power droids
    • Mouse Droid:
      • Trash sweep
      • Trash dump
      • Message delivery
      • Locate Rebels

And many more…

For more information, visit: www.lego.com/starwars-droidcommander.

STAR WARS and related properties are trademarks and/or copyrights, in the United States and other countries, of Lucasfilm Ltd. and/or its affiliates. © & TM Lucasfilm Ltd.


About the LEGO Group:
The LEGO Group’s mission is to inspire and develop the builders of tomorrow through the power of play. The LEGO System in Play, with its foundation in LEGO bricks, allows children and fans to build and rebuild anything they can imagine.

The LEGO Group was founded in Billund, Denmark in 1932 by Ole Kirk Kristiansen, its name derived from the two Danish words LEg GOdt, which mean “Play Well”.

Today, the LEGO Group remains a family-owned company headquartered in Billund. However, its products are now sold in more than 140 countries worldwide. For more information: www.LEGO.com 

MIMIC educational robots introduces robots you can code

Cincinnati-based entrepreneur launches Kickstarter campaign for 'mimicArm', your own programmable A.I. robot arm

Cincinnati, OH (April 14, 2018) – mimicEducationalRobots (a division of Robomotive Laboratories LLC) is changing the way coding is taught with mimicArm. The Cincinnati-based small family business launched a new Kickstarter campaign (https://www.kickstarter.com/projects/713401305/662798422?ref=455790&token=e4d4249c) on April 24th to help advance the development and production of the new technology.

mimicArm is a desktop-sized robot arm that represents a new approach to teaching programming. mimicArm is a collaborative robot, or "cobot", designed to interact with human users. Unlike other educational robots, mimicArm teaches children to program robots to work in tandem with humans. Using the mimicArm controller, children as young as 5 are immediately able to interact with mimicArm. When they're ready, users can begin coding using mimicBlock, a graphical coding interface. mimicArm is also programmable using actual C code and includes simple pre-written functions to allow the quick creation of complicated programs. The provided programming experiments start simple but build to complex interactive artificial intelligence programs. Paired with the inputBox and other sensors, the user can create a truly interactive artificial intelligence robot arm.

Brett Pipitone, the founder of mimicEducationalRobots, is no stranger to Kickstarter. "After a successfully funded Doorbell Phone campaign on Kickstarter, I began to indulge in my love of robotics and introduced the Cortex:Robot Arm controller," said Pipitone. "We continued to develop the technology by adding joints, cameras and motion and soon realized that we had developed something truly unique: the mimic immersion robot was born. While mimic's Kickstarter campaign didn't reach its goal, the technology developed allowed us to build mimicArm, which we think could really make a difference in the world."

The key to bringing mimicArm to life is the mimicArm controller with patent-pending Posi-Feel™ grip controllers. The user grips a simple scissor control and moves his or her arm and hand in natural ways. mimicArm will "mimic" these motions. A series of joints, pivots and sensors built into the mimicArm controller makes this possible. When the user is ready, easy-to-use programming software and robust examples walk them through the process of learning to program their own robot.

This assemblage of new technologies allows the user to see immediate frustration-free results without the risk of outgrowing the robot in a short time.  The infinite expandability and endless programming possibilities will keep even expert programmers captivated.

mimicEducationalRobots realized early that a single package would not fit all users' needs, so backers have a choice of three packages, each with a unique user in mind.

  • The mimicArm kit version includes the robot and manual controller. This version is great for those who want to ease into robotics coding, or those that already have sensors that they’re ready to integrate with the robot.  Perfect for beginners and experts, this package is the most affordable option.
  • The mimicArm Super Fun Kit is centered around manual mode, with a set of accessories to maximize the fun factor. Users can stack the stacking blocks (included), or program the robot to do it for them.  The Great Big Button is also included, and offers additional capabilities for those honing their coding skills (or for those with younger siblings that really want to touch something). “mimicArm Super Fun Kit is targeted towards younger users, but is also a great way for beginners to make coding interesting as well,” says Pipitone.
  • The mimicArm Deluxe Kit is the most complete kit offered. Including everything from the mimicArm Super Fun Kit, the mimicArm Deluxe Kit adds the input box and IR Distance Sensor for maximum interactive possibilities.  The inputBox incorporates buttons, a microphone and other sensors, and a microSD card.  “Programmable with both mimicBlock and Arduino, the Deluxe kit really expands the possibilities. With this kit the user can truly program their own interactive robot,” says Pipitone.

MimicArm is a great educational tool, and users can be a part of it now by backing mimic on Kickstarter. According to Pipitone, "We're teaching those who will deliver the personal robot of the future. We're still working on flying cars and jet packs."

For more information, please visit: http://www.mimicrobots.com/.

CoderZ: Bringing robotics to every student in the world

CoderZ is an online learning environment where kids learn how to program virtual and real robots within the STEM pathways. Problem-solving, critical thinking, computational thinking, teamwork, self-paced learning, formative assessment, robotics, classroom engagement: CoderZ includes all of these concepts and more.

Discovering new ways to engage the next generations with robotics and with STEM-related fields becomes a bigger challenge every day. That is why tools like CoderZ are being developed to give teachers, educators, and robotics experts the possibility to take a deep breath.

CoderZ's new version, now compatible with the LEGO Mindstorms EV3 (through leJOS), enables students to program their own virtual robot and acquire 21st-century skills. Delivered with the "Coding Robots" curriculum, co-developed by Intelitek and Gary Garber, CoderZ becomes a scalable and effective way for students of different levels to experience the robotics world in class.

Its several gamified missions motivate kids to accomplish them in order to move on to a harder level. CoderZ also has a class management tool for teachers to track each student's progress and activity.

Starting with a friendly drag-and-drop Blockly visual editor, kids progress to coding their virtual robot using Java.

Recently, the CoderZ team added the EV3-compatible version mentioned above to their earlier FTC (FIRST Tech Challenge) version. Right now, the CoderZ team is offering a 14-day free trial which you can sign up for here.

CoderZ even gives you the option of driving and programming your virtual robot on the moon, taking friction and gravity into consideration and, of course, increasing kids' engagement with the robotics world. For now, kids won't be able to try their robot on the moon after they download the program, but who knows what Elon Musk will create in the next few years.

Pay some atención! CoderZ’s STEM learning environment is available both in English and in Español… Si señor!

Learn more about CoderZ at http://GoCoderZ.com.

Request your free trial here.

uArm Swift: Multi-use Desktop Robotic Arm for Everyone

In 2014 UFACTORY built the first open-source desktop robotic arm, ushering in a new era of affordable desktop robotic arms for consumers. uArm is an Arduino-powered 4-axis parallel-mechanism desktop robot that is easy to use, has multiple accessories and is open source. In January 2014, uArm went on Kickstarter and became famous overnight, leading to an exclusive interview with WIRED. A standout quote from the piece: "Thirty years ago Bill Gates promised to put a computer on every desk in America, an ambitious sentiment echoed by Wang and company … The most innovative aspect of the entire project is probably the concept of putting a robot arm on your desk."

As robot arms move into various industries, the user base has expanded from a relatively small group of geeks to robot enthusiasts in general, and the demand for a better and easier user experience keeps growing.

On January 23, 2017, UFACTORY returned, announcing two robotic arms in the uArm Swift series. The name Swift is meant to highlight the elegant finish of the body, the lightweight and portable form and the flexible movement, just like a swift. The series comprises the uArm Swift and the uArm Swift Pro.

  • Hardware Upgrades

 

Performance improvements

  1. Precision improvement

uArm Swift enhanced the control algorithm and increased the accuracy by 50%, from 1 cm to 5 mm.

uArm Swift Pro uses a self-designed reducer. Combined with a high-precision stepper motor, it minimizes gear backlash, improves joint accuracy and is more compact. The built-in 12-bit magnetic encoder provides instant position feedback to the motor for closed-loop control, improving the accuracy to an unprecedented 0.2 mm, good enough for 3D printing and laser engraving.

  2. Working scope improvement

The uArm Swift series improves the mechanical structure of the arm and increases the working range by 20%, covering the area of an entire A4 sheet of paper.

  3. CPU upgrade

The uArm Swift series upgrades the main board to the Arduino MEGA 2560, which offers nearly 10 times the storage space of the previous UNO-based edition.

  4. Accessory enhancements

The uArm Swift series has 4 axes; whether equipped with a gripper or a suction head, the end effector can steer freely, and swapping accessories takes less than 30 seconds.

uArm Swift/uArm Swift Pro have a built-in socket for selected Seeed Grove modules.

The uArm Swift series can be equipped with a smart car, the uCar, a mobile open-source car with infrared obstacle avoidance and trajectory-planning functions.

The uArm Swift series is CNC-machined as a single piece, and the whole body is matte black. The uArm Swift has a more minimalistic design; the aluminum body is light and stable, enhancing the overall rigidity.

Compared with the previous version, the uArm Swift series has a redesigned base that houses the main board and adds a power button, a function-switching button, a play button and a menu button. The new indicator light shows the current operation mode and status of the uArm Swift.

 

 

  • Software Upgrades

The uArm Swift series supports both PC and mobile control.

Software upgrade – uArm Studio

uArm Studio is brand-new cross-platform robotic arm control software. It integrates offline learning, graphical programming and instant control functions to guide the robotic arm through complex tasks.

  1. Teach & Play Offline Learning Mode

Guide the uArm Swift by hand to teach it moving, gripping and dropping, and save the motions with a single click to replay them in Blockly mode. The uArm Swift can also sync offline learning data once connected.

  2. Blockly-based graphical programming

Blockly is a web-based visual programming tool that lets users program without writing code. The software is designed to be so simple that even preschool children can create a program easily. Detailed tutorials will be provided to get you started quickly and to support further development.

  3. Instant Control

uArm Studio combines keyboard and mouse control. Developers can use keyboard hotkeys and the mouse simultaneously to control the arm's movement, gripping and dropping, and the hotkeys can be customized.

After connecting a Leap Motion controller, uArm Swift users can use their own hand gestures to control movement, gripping, dropping and so on.

 

Software upgrade – uArm Play app

The robotic arm has a built-in Bluetooth module; simply connect your smartphone with uArm Play to remote-control your uArm Swift or uArm Swift Pro.

Your smartphone can also work as an external controller, downloading and running a program from Blockly.

 

UCS

UCS, also known as uArm Creator Studio, is an open-source developer tool from UFACTORY. UCS integrates graphical programming and coding to enable rapid development, visualization and easy sharing.

 

  1. Rapid Development

UCS is a rapid development tool: with its numerous built-in commands, developers don't need to set up a programming environment. Every interface of the system supports Python scripting, and all variables can be shared between visual programming and code, which means you don't need to copy settings each time.
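As an illustration of what driving the arm from a plain Python script can look like, here is a minimal pick-and-place sketch. It uses UFACTORY's separate uArm-Python-SDK and its SwiftAPI class rather than UCS itself; the serial port, coordinates and exact method signatures are assumptions for illustration and may differ depending on your setup and SDK version.

# Minimal pick-and-place sketch (assumption: UFACTORY's uArm-Python-SDK, not part of UCS)
from uarm.wrapper import SwiftAPI

swift = SwiftAPI(port="/dev/ttyACM0")   # serial port is an assumption; adjust for your system
swift.waiting_ready()                   # block until the arm reports it is ready

swift.set_position(x=200, y=0, z=100, speed=10000, wait=True)   # move above the pick point
swift.set_position(z=30, wait=True)                             # lower the end effector
swift.set_pump(on=True)                                         # grab an object with the suction head
swift.set_position(x=150, y=100, z=100, wait=True)              # carry it to the drop point
swift.set_pump(on=False)                                        # release it
swift.disconnect()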

 

  2. Built-in Robotic Vision

UCS integrates complex robotic vision functions; just connect a camera and the uArm can "see" and adapt to different environments.

The camera can instantly locate, memorize, recognize and track the 3D position of objects.

  3. Easy Sharing

Every creator can save their work in the .task file format through UCS, which supports one-click sharing to the official UFACTORY website or the Reddit community, and a scenario can be copied to other robotic arms in just one click.

The uArm Swift Pro follows an open-source design concept with more freedom, simplicity and functions. It is a whole new open platform that came from developers and gives back to developers, and it is still that open-source robotic arm. We just can't wait for it to become your comprehensive desktop assistant!