In July, the Taiwan-based company MATRIX Robotics System marked a significant milestone in educational robotics with the release of its latest product: the MATRIX R4 Robo Set. Developed in partnership with and certified by Arduino Education, this innovative robotics set is built on the UNO R4 WiFi platform, giving users a sophisticated 12-in-1 robot model and a versatile tool to explore and excel in the field of robotics.
The MATRIX R4 Robo Set has been meticulously designed to support versatile projects for all ages. It offers a comprehensive solution that leverages Arduino’s cutting-edge technology, providing endless possibilities for those eager to enhance their robotics skills, whether they are students, hobbyists, or seasoned competitors.
The MATRIX R4 Robo Set includes a versatile controller that supports applications beyond robots, such as factory simulations. By integrating components like the Mvision camera and IoT functionality, the controller enables projects and scenarios such as smart factories. The Smart Factory simulation models a factory inspection process in which products are assessed after production. This flexibility lets users explore and understand a wide range of industrial and technological processes.
The MATRIX R4 Robo Set is more than just a tool; it is an educational journey. It teaches programming with MATRIXblock and the Arduino IDE and introduces foundational computing concepts, making it an ideal starting point for anyone interested in computer science and robotics. Furthermore, with its quick and easy assembly system, users gain hands-on experience in mechanism design, a crucial skill in robotics.
The R4 set not only lays the foundation for hardware and software integration skills but also encourages users to think critically and creatively when tackling real-world robotics challenges, offering practical applications for problem-solving. While controlling a motor with an Arduino over WiFi can be challenging, especially when managing both the motor and sensors, the R4 set simplifies the process with an easy plug-and-play solution.
One of the standout features of the MATRIX R4 Robo Set is its compatibility with various expansion kits, allowing users to customize and build up to 12 different robotic creations. For example, the MX300 expansion kit empowers users to build robust, fundamental robots using the MATRIX basic set. The MX300 Expansion Kit is a versatile tool that bridges the gap between theory and practice in small and medium-sized Autonomous Mobile Robots (AMRs). Tailored for students, educators, and enthusiasts, it helps users understand AMR principles and serves as a demonstration robot for the MARC (Master AI Robot Cup) competition, providing participants with a platform to practice and hone their skills in a competitive setting.
Additionally, the MJ2 Wireless Joystick, a key part of the MATRIX ecosystem, uses 2.4 GHz wireless technology and can connect more than 20 devices simultaneously. Its strong anti-interference and stable signal make it ideal for precision-demanding competitive scenarios.
As the field of robotics continues to evolve, MATRIX Robotics System remains at the forefront, providing cutting-edge tools and resources to inspire the next generation of roboticists. The MATRIX R4 Robo Set represents a gateway to a world of innovation, creativity, and competition, equipping users with the skills they need to succeed in the rapidly changing landscape of robotics.
Seattle, WA – Crafted for portability, KT2 promotes physical activity through intuitive prompts and interactive games, making it a perfect companion for home and professional settings.
Designed by KameRobotics, the in-house robot innovation team of Seattle-based tech company Wair Living, the KT2 is a mini-robot that enhances productivity at home, acts as a thoughtful companion, and serves as a diligent STEM teacher.
“Our KT2 robot prioritizes user privacy and minimalism in its design, featuring no integrated cameras or screens,” said Qian Li, Founder of Wair and KameRobotics. “We want to ensure a secure interaction environment and more intuitive play via robot actions and lighting interaction.”
KT2 Kungfu Turtle is not just a toy but a multifunctional companion for gaming, productivity, and learning. It features acrobatics, physical sparring, and autonomous return functions, supported by a powerful operating system and programmable chipset.
Beyond entertainment, KT2 enhances productivity in home and office environments. It includes a Pomodoro timer for focused work intervals and movement prompts to combat sedentary lifestyles. The modular design allows customization with accessories, transforming KT2 into a unique desk ornament and interactive companion.
At the last Robotics Meetup Nuremberg, significant updates about the humanoid robot pib were announced: a new version is set to be released in August.
Changes from Last Year’s Version:
• Stylish Design: The upper body has been redesigned to be more stylish, with hidden screws and motors, and no loose cables.
• Easier Assembly: The new version offers simpler assembly.
• Enhanced Motors: The shoulders are now equipped with stronger motors.
• Improved Imitation Feature: Previously, pib could only follow horizontal movements. The updated version can also follow vertical movements.
• Upgraded Electronics: The electronics have been upgraded to a Raspberry Pi 5.
• Dockerization: pib is now dockerized, allowing installation on various software systems. This was a community effort, especially during the last hackathon.
• Digital Twin: A digital twin has been developed, enabling simulations and machine learning.
Continuity and Compatibility
Not everything has changed. The camera and display remain the same, and efforts have been made to maintain compatibility with the previous version. Updated tutorials are being prepared to ensure easy assembly.
The meetup also featured a guest speaker from EduArt Robotik, who introduced EduArt, the robot. After the presentations and discussions, everyone enjoyed pizza and drinks while networking with fellow robotics enthusiasts.
The Avishkaar ABot Advanced Kit is a comprehensive DIY kit for STEM robotics and coding, designed specifically for children aged 8 and up. It contains over 60 parts, including metal parts, motors, sensors, wheels, USB cables, screws, nuts, an Allen wrench, and a wrench. With these parts, children can build 10 different robots, from simple vehicles to more complex constructions.

The set reminded me of the mBot when I set it up, as it is also based on a sturdy metal construction. The included stickers and the tool are nice. The instructions were easy to understand, and I didn’t find any errors or have any problems during assembly. The app for remote control and programming must be activated with the product code, and the user must be registered before using it for the first time. When deploying the kit, e.g. in the classroom, you should do this in advance.

By the way, the 9V block battery visible in the video is not included when purchased from a dealer; a full-fledged battery pack is. I only used the 9V battery because of delivery problems. Still, it is interesting that the robot works with it at all. I’m playing with the idea of connecting a solar cell and operating the robot with solar energy… like a real Mars rover…
Here is more detailed information:
Easy to build programmable robots: With this kit, kids can create 10 different robots with over 60 pieces. This includes metal parts, an easily programmable brain, motors, sensors (2x light sensor, 1x touch sensor/button), wheels, USB cables, screws, nuts, an Allen key, a wrench, cables, and instructions.
Control via mobile app: The robots can be controlled via a remote control app. They can also be programmed using a visual block-based programming environment (similar to Scratch/Blockly).
Learning Objectives: With the ABot Advanced Kit, children learn robotics, programming, construction, mechanical design and problem solving.
Compatibility: The mobile app is compatible with iOS 11 or later and Android 10 or later.
Inexpensive Kit: The ABot Advanced Kit offers a sturdy metal frame to which motors and sensors can be attached. For the equivalent of about 60€, the set offers good value for money. Maybe the set will soon be available at a German retailer.
You can find a comprehensive assembly video of one of the 10 robot models here:
Montreal/Berlin, 5 June 2024. The technology company Vention has published a study analyzing how companies automate their production independently. Small businesses are trendsetters in do-it-yourself (DIY) automation, but large companies are catching up. The study is based on anonymized data from over 1,400 corporate users of Vention’s Manufacturing Automation Platform (MAP) worldwide.
Vention has now published its annual study “The State of DIY Industrial Automation” for the second time. The focus is on do-it-yourself (DIY) automation, which enables manufacturers of different sizes to automate their production independently using state-of-the-art technologies.
For the study, Vention evaluated the user behavior of its corporate customers on the Vention cloud platform MAP from January to December 2023. The aim was to capture the current state of DIY automation in companies and to explain how they use the DIY approach for the design, integration and operation of automation components, such as robot cells or cobot palletizers.
“The trend towards DIY automation continues this year,” says Etienne Lacroix, CEO of Vention. “One driver is the shortage of skilled workers, which is becoming increasingly noticeable. The question of how production can be automated quickly and cost-effectively is currently occupying many companies. We see that small companies in particular are automating independently. But compared to last year, the number of large companies using DIY automation is increasing significantly.”
The most important findings of the study:
1. Small businesses (“Small”, < 200 employees) and medium-sized enterprises (“Medium”, < 2,000 employees) were the leading users of automation systems on MAP in 2023, with shares of 48% and 17%, respectively. However, small businesses faced more challenging economic conditions in 2023. As a result, this segment declined by 12% compared to the previous year (see study, p. 8).
2. Large companies (“Large”, < 10,000 employees) as well as the academic and government research sector (“Academia & Gov Research”) made significant gains in the use of the DIY approach on MAP (+10% and +4%, respectively). Platform technology has made significant progress over the past year, providing more opportunities for high-throughput projects traditionally associated with larger manufacturers or users (see study, p. 8).
3. In 2023, very large companies (“Enterprise”, > 10,000 employees) used the DIY approach on their factory floors more often than any other segment. Accordingly, the number of projects implemented with MAP in this segment rose from an average of 4.1 in 2022 to 4.9 in 2023 (see study, p. 11).
4. Projects with machine operation applications were implemented fastest on MAP in 2023. This is likely because ongoing labor shortages make it difficult for companies to recruit staff. As recent innovations have made CNC integration more accessible, manufacturers are more eager than ever to quickly adopt automated machine operation applications (see study, p. 24).
5. After two years of record sales (2021 and 2022), the Association for Advancing Automation (A3) reported a significant 30% decline in robot sales in North America in 2023. In contrast, robot deployments on MAP saw a notable increase in both 2022 and 2023. In 2023, robot deployments on MAP grew by about 40% (see study, p. 26).
At VivaTech 2024, the United Robotics Group (URG) will be showing the new product in action – together with other robots from the URG fleet, which are fully adapted to the needs of retailers.
uLink is a highly flexible, versatile platform for rapid adaptation and support in logistics and automation.
uLink is the first URG solution with an open API for seamless integration with operating systems and greater operational flexibility.
Paris/Bochum, May 22, 2024 – At VivaTech in Paris, the United Robotics Group will be presenting its new service robots from the uLink series for the first time, which are characterized by easy integration, flexible customization and individual extensions. They are suitable for use in retail, warehouse logistics and manufacturing. As the European market leader for service robotics solutions, URG is expanding its CobiotX portfolio worldwide with the latest Cobiot for integrated workflows in the aforementioned segments. The modular platform fits seamlessly into the respective workflows and can be expanded with industry-standard accessories. uLink is designed to simplify operations and increase safety and efficiency in various environments. The unique combination of features sets new standards in the industry – from modular design and open API to 3D LiDAR-based navigation and real-time operational data visualization.
uLink is equipped with an IDE, SDK and fleet management tools and allows the seamless integration and control of various components such as LiDARs, motors or sensors. Thanks to powerful software functions for configuring and managing robot applications, sensors and accessories, the robot can easily handle the various logistics challenges in retail and warehousing, which vary depending on the industry, company size and automation requirements. The uLink accessory interface is also modular, and the usable platform area can carry a payload of up to 60 kg. The SEER navigation control allows deliveries in a predefined area of up to 400,000 m².
With the help of the plug-and-play mechanism, other accessories such as a locker for Click & Collect or confidential deliveries, a pegboard for the delivery of tools and spare parts, and trays for transporting stock can be integrated into the platform. In addition, partner integrators can develop new accessories to meet specific requirements.
“The retail and logistics sector has been undergoing a profound transformation for years, driven by the growth of e-commerce, automation and the shortage of skilled workers in the value chain. Whether it’s shelf replenishers or water spiders, i.e. those responsible for inventory in warehouses or production: it is important to support these players and offer solutions that meet their specific needs,” explains Thomas Linkenheil, Co-CEO of the United Robotics Group. “In a highly competitive sector, consumers want fast and personalized service. Our new logistics solution enables service providers to offer up-to-date customer service without long searches in the warehouse or tedious processes that can cost time and affect customer business.”
First Cobiot with an open API for connectivity and flexibility
Like all CobiotX solutions from the United Robotics Group, uLink is equipped with a particularly user-friendly interface. It is based on the no-code principle and enables users without robotics knowledge to integrate the robot quickly into daily operations. In addition, uLink has an open API platform. This allows the solution to communicate with existing operations and other connected systems such as automatic doors or elevators, and to work alongside other robots and automated guided vehicles such as AMRs and AGVs.
With the launch of uLink, United Robotics Group is expanding its range of robotics solutions for logistics, warehouse management and industrial manufacturing. The robot is intended to be used in retail and logistics environments such as department stores, supermarkets, warehouses and fulfillment centers, but also factories and production facilities as well as airports and healthcare facilities.
The robot is equipped with 3D LiDAR and PL LiDAR systems for maximum precision in mobility. The platform immediately registers changes in its environment, such as movements of people or machines, and reacts accordingly. It is connected to an online dashboard that facilitates both workflow management and quick decisions between front- and back-of-house teams.
uLink has a long battery life of up to 14 hours on a single charge. In addition, the solution has an intelligent, wireless charging function that was developed with a well-known German battery manufacturer, allowing it to automatically return to the charging station between individual operations. The robot complies with the highest safety and privacy standards, including Performance Level D under the EU Machinery Directive and the GDPR.
uLink, along with United Robotics Group’s logistics and warehouse management fleet, including RBWatcher and MobilePalletizer, will be on display at the company’s VivaTech booth (Hall 1, Booth G18) in Paris from May 22-25.
uLink can be rented via the RaaS (Robot as a Service) model of the United Robotics Group from 699 euros / month or purchased for 19,900 euros.
Two years ago, the open-source robotics project pib was launched. The goal of pib, the printable intelligent bot anyone can build themselves, is to lower the barriers and make robotics and AI accessible to anyone who is interested. Over the past two years, pib has built an active and dedicated community that keeps the project moving forward. A lot has happened since the project launch – time to look back on how far pib has come.
Milestones, Challenges and What Comes Next
It’s not every day that a robot turns two years old, so the team celebrated with a big party. The all-new pib documentary was streamed to kick off the event, followed by different stations where guests could experience pib’s newest features hands-on.
pib started out as an idea that slowly took shape in the form of a master’s thesis and a robotic arm. From there, a humanoid robot was created that can easily be 3D printed using the free print files on the website and then built with the help of the online building manuals. pib offers many ways to experiment with AI training, such as voice assistant technology, object detection, imitation, and more.
For starters, the pib team and the community have jointly optimized pib’s mobility. The result is impressive: in its newest version, pib can move its arms at virtually any angle. Another rapidly progressing topic is pib’s digital twin, which received a birthday present from the community members who took on this project: the camera now works in the virtual environment, so the camera stream can be transmitted to the outside world, analyzed there, and then serve as the basis for control processes.
Talk To Me, pib!
Aside from that, there has been significant progress in human-machine interaction, particularly in enabling voice-based communication with pib through advanced voice assistant technology. Exploring the potential of natural speech interaction has become a major focus of the team’s current efforts, and the project is committed to advancing pib’s capabilities in this direction.
One of the newest features revealed at the pib party is communication in a multimodal world. The robot captures an image, analyzes it, and then answers questions about the image. For example, when asked “where are we right now?”, pib interprets the room and its setting and will answer something like “we are in an office space”.
With this new feature, pib was also able to play its first round of Tic Tac Toe. The team drew the gameboard on a whiteboard so that pib was able to analyze the current state of the game and determine the next move with commands such as “place the next X in the top right corner”.
Join The Community
The pib community is rapidly growing and consists of 3D printing, robotics and AI enthusiasts. Whether you’re a rookie or an expert, anyone is invited to join, share their ideas and work on exciting projects together.
This article presents a practical application of the LIMO Cobot by Elephant Robotics in a simulated scenario. You may have seen previous posts about LIMO Cobot’s technical cases, A[LINK], B[LINK]. The reason for writing another related article is that the original testing environment, while sufficient to demonstrate basic functionality, appeared overly idealized and simplified compared with real-world applications. Therefore, we aim to use the robot in an environment closer to real operations and share some of the issues that arose.
2. Comparing the Old and New Scenarios:
First, let’s look at what the old and new scenarios are like.
Old Scenario: A simple setup with a few obstacles, relatively regular objects, and a field enclosed by barriers, approximately 1.5 m × 2 m in size.
New Scenario: The new scenario contains a wider variety of obstacles of different shapes, including a hollowed-out object in the middle, simulating a real environment with road guidance markers, parking spaces, and more. The size of the field is 3 m × 3 m.
The change in environment is significant for testing and demonstrating the comprehensiveness and applicability of our product.
3. Analysis of Practical Cases:
Next, let’s briefly introduce the overall process.
The process is mainly divided into three modules: one is the functionality of LIMO PRO, the second is machine vision processing, and the third is the functionality of the robotic arm. (For a more detailed introduction, please see the previous article [link].)
LIMO PRO is mainly responsible for SLAM mapping, using the gmapping algorithm to map the terrain, navigate, and ultimately perform fixed-point patrols.
myCobot 280 M5 is primarily responsible for the task of grasping objects. A camera and a suction pump actuator are installed at the end of the robotic arm. The camera captures the real scene, and the image is processed by the OpenCV algorithm to find the coordinates of the target object and perform the grasping operation.
Then, start the gmapping mapping algorithm by opening another new terminal and entering the command:
roslaunch limo_bringup limo_gmapping.launch
After successful startup, the rviz visualization tool will open, and you will see the interface as shown in the figure.
At this point, you can switch the controller to remote control mode to control the LIMO for mapping.
After constructing the map, you need to run the following commands to save the map to a specified directory:
1. Switch to the directory where you want to save the map. Here, save the map to `~/agilex_ws/src/limo_ros/limo_bringup/maps/`. Enter the command in the terminal:
cd ~/agilex_ws/src/limo_ros/limo_bringup/maps/
2. After switching to the maps directory, continue by entering the following command in the terminal:
rosrun map_server map_saver -f map1
This process went very smoothly. Let’s continue by testing the navigation function from point A to point B.
Navigation:
1. First, start the radar by entering the following command in the terminal:
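The command itself is not shown here. Based on the limo_bringup package used for mapping above, it is presumably the LIMO startup and navigation launch files, along these lines (the launch file names are an assumption; verify them against your LIMO documentation):

roslaunch limo_bringup limo_start.launch pub_odom_tf:=false

roslaunch limo_bringup limo_navigation_diff.launch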
Upon success, this interface will open, displaying the map we just created.
Click on “2D Pose Estimate”, then click on the location where LIMO is on the map. After starting navigation, you may find that the shape scanned by the laser does not overlap with the map. Correct this manually: adjust the chassis’s actual position in the scene on the map displayed in rviz, use the rviz tools to publish an approximate position for LIMO, then use the controller to rotate LIMO so it can auto-correct. When the laser scan overlaps with the shapes in the map’s scene, the correction is complete, as shown in the figure where the scanned shape and the map overlap.
Click on “2D Nav Goal” and select the destination on the map for navigation.
The navigation test also proceeded smoothly.
Next, we will move on to the part about the static robotic arm’s grasping function.
Identifying and Acquiring the Pose of ArUco Markers
To precisely identify objects and obtain the position of the target object, we process ArUco markers. Before starting, ensure the specific parameters of the camera are set.
Initialize the camera parameters based on the camera being used.
Then, identify the object and estimate its pose to obtain the 3D position of the object and output the position information.
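The post does not include the setup snippet itself; here is a minimal sketch of what camera initialization and marker detection could look like with OpenCV’s cv2.aruco module. The intrinsics below are placeholders, and the dictionary choice is an assumption:

```python
import cv2
import numpy as np

# Placeholder intrinsics -- replace with your camera's calibrated values.
mtx = np.array([[600.0,   0.0, 320.0],
                [  0.0, 600.0, 240.0],
                [  0.0,   0.0,   1.0]], dtype=np.float32)
dist = np.zeros((5, 1), dtype=np.float32)  # assuming negligible lens distortion

# Detector for the ArUco dictionary printed on the markers (4x4_50 assumed).
# cv2.aruco.ArucoDetector requires OpenCV >= 4.7; older versions call
# cv2.aruco.detectMarkers(image, aruco_dict) directly.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # camera mounted at the end of the robotic arm
ok, image = cap.read()
if ok:
    corners, ids, rejected = detector.detectMarkers(image)
```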
```python
def estimatePoseSingleMarkers(self, corners):
    """
    Estimate the rvec and tvec for each of the marker corners detected by:
        corners, ids, rejectedImgPoints = detector.detectMarkers(image)
    corners     - array of detected corners for each detected marker in the image
    marker_size - size of the detected markers
    mtx         - camera matrix
    distortion  - camera distortion matrix
    RETURN: list of rvecs, tvecs, and trash (so that it corresponds to the old
    estimatePoseSingleMarkers())
    """
    # 3D coordinates of the four corners of a square marker centered at the origin.
    marker_points = np.array([[-self.marker_size / 2,  self.marker_size / 2, 0],
                              [ self.marker_size / 2,  self.marker_size / 2, 0],
                              [ self.marker_size / 2, -self.marker_size / 2, 0],
                              [-self.marker_size / 2, -self.marker_size / 2, 0]],
                             dtype=np.float32)
    rvecs = []
    tvecs = []
    for corner in corners:
        # Solve the Perspective-n-Point problem for each detected marker.
        retval, rvec, tvec = cv2.solvePnP(marker_points, corner,
                                          self.mtx, self.dist,
                                          flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if retval:
            rvecs.append(rvec)
            tvecs.append(tvec)
    return rvecs, tvecs, []  # empty list stands in for the old API's "trash"
```
The steps above complete the identification and acquisition of the object’s information, and finally, the object’s coordinates are returned to the robotic arm to execute the grasping.
Robotic Arm Movement and Grasping Operation
Based on the position of the Aruco marker, calculate the target coordinates the robotic arm needs to move to and convert the position into a coordinate system suitable for the robotic arm.
```python
def homo_transform_matrix(x, y, z, rx, ry, rz, order="ZYX"):
    # Build a 4x4 homogeneous transform from a rotation and a translation.
    rot_mat = rotation_matrix(rx, ry, rz, order=order)  # 3x3 rotation
    trans_vec = np.array([[x, y, z, 1]]).T              # 4x1 column: translation + 1
    mat = np.vstack([rot_mat, np.zeros((1, 3))])        # append bottom row of zeros
    mat = np.hstack([mat, trans_vec])                   # append translation column
    return mat
```
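The helper rotation_matrix is not shown in the post. Assuming it builds a 3x3 rotation matrix from Euler angles in radians, one possible stand-in uses SciPy:

```python
from scipy.spatial.transform import Rotation

def rotation_matrix(rx, ry, rz, order="ZYX"):
    # Hypothetical stand-in for the undisclosed helper: with order="ZYX",
    # the rotation about Z (rz) is applied first, then Y (ry), then X (rx).
    return Rotation.from_euler(order, [rz, ry, rx]).as_matrix()
```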
If the Z-axis position is detected as too high, it will be corrected:
```python
if end_effector_z_height is not None:
    p_base[2] = end_effector_z_height
```
After the coordinate correction is completed, the robotic arm will move to the target position.
```python
# Concatenate x, y, z, and the current posture into a new array
new_coords = np.concatenate([p_base, curr_rotation[3:]])
xy_coords = new_coords.copy()
```
Then, control the end effector’s API to suction the object.
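The suction call itself is not shown. On a myCobot 280 M5, the pump is commonly switched through the controller’s basic I/O pins using the pymycobot library; a hedged sketch, where the serial port, baud rate, and pin wiring are assumptions that may differ on your setup:

```python
import time
from pymycobot.mycobot import MyCobot

mc = MyCobot("/dev/ttyUSB0", 115200)  # adjust port and baud rate to your setup

mc.set_basic_output(5, 0)  # assumed pump pin: low level switches suction on
time.sleep(2)              # hold the object while the arm moves
mc.set_basic_output(5, 1)  # high level releases the object
```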
The above completes the respective functions of the two robots. Next, they will be integrated into the ROS environment.
```python
# Initialize the coordinates of points A and B
goal_1 = [(2.060220241546631, -2.2297520637512207, 0.009794792000444471, 0.9999520298742676)]  # B
goal_2 = [(1.1215190887451172, -0.002757132053375244, -0.7129997613218174, 0.7011642748707548)]  # A

# Start navigation and link the robotic arm
map_navigation = MapNavigation()
arm = VisualGrasping("10.42.0.203", 9000)
print("connect successful")

arm.perform_visual_grasp(1, -89)

# Navigate to location A and perform the task
for goal in goal_1:
    x_goal, y_goal, orientation_z, orientation_w = goal
    flag_feed_goalReached = map_navigation.moveToGoal(
        x_goal, y_goal, orientation_z, orientation_w)
    if flag_feed_goalReached:
        time.sleep(1)
        # executing 1 grab and setting the end effector's Z-axis height to -93
        arm.unload()
        print("command completed")
    else:
        print("failed")
```
4. Problems Encountered
Mapping Situation:
When we initially tried mapping without enclosing the field, frequent errors occurred during navigation and localization, and it failed to meet our requirements for a simulated scenario.
Navigation Situation:
In the new scenario, one of the obstacles has a hollow structure.
During navigation from point A to point B, LIMO may fail to detect this obstacle and assume it can pass through, damaging the original obstacle. This issue arises because LIMO’s radar is positioned low, scanning only the empty space. Possible solutions include adjusting the radar’s scanning range, which requires extensive testing for fine-tuning, or adjusting the radar’s height to ensure the obstacle is recognized as impassable.
Robotic Arm Grasping Situation:
In the video, it’s evident that our target object is placed on a flat surface. The grasping did not take obstacle avoidance around the object into account. In the future, when defining special grasping positions, this situation needs to be considered.
5. Conclusion
Overall, LIMO Cobot performed excellently in this scenario, successfully meeting the requirements. The entire simulated scenario covered multiple core areas of robotics, including motion control of the robotic arm, path planning, machine vision recognition and grasping, and radar mapping navigation and fixed-point cruising functions of the mobile chassis. By integrating these functional modules in ROS, we built an efficient automated process, showcasing LIMO Cobot’s broad adaptability and advanced capabilities in complex environments.
Explore myArm M&C series robots for versatile, high-performing solutions in robotics, offering precise control and diverse applications.
SHENZHEN, GUANGDONG, CHINA, May 10, 2024 /EINPresswire.com/ — Embodied intelligence research, as a critical branch of artificial intelligence, is striving to endow robots with new capabilities in precise motion control, high-level autonomous decision-making, and seamless human-machine interaction.
Against this backdrop, Elephant Robotics has recently unveiled the myArm M&C series robots. These powerful and cost-effective lightweight robots empower researchers and developers in both data collection and execution, driving forward advancements in embodied intelligence technology and its practical applications.
The myArm M&C series robots are meticulously designed to meet the diverse needs of users, prioritizing flexibility and adaptability. They play a pivotal role in various research and application scenarios, making them the ideal robotics solution for education and research purposes.
myArm C650
The myArm C650 is a universal 6 DOF robot motion information collection device designed to meet the diverse needs of education, research, and industry in robot motion data collection and analysis. Weighing only 1.8 kg, the myArm C650 boasts a horizontal working radius of 650 mm, minimizing inertial forces during operation for enhanced response speed and precision.
Equipped with high-precision digital servo motors and 4096-step encoders on all 6 joints, the myArm C650 mimics human arm motion with remarkable accuracy, enabling a wide range of tasks. Its intuitive control method, featuring dual-finger remote control and dual customizable buttons, supports recording functions for precise command execution and immediate feedback on robot behavior. This flexibility makes the myArm C650 an ideal choice for precise motion tracking and data collection in various experimental and educational settings. With an information acquisition speed of up to 50 Hz, it has become indispensable for robot algorithm development and higher education institutions, offering real-time data support for complex control systems.
In remote control applications, the myArm C650 excels, delivering outstanding performance regardless of the robot’s configuration complexity. Moreover, its compatibility with Python and ROS, coupled with open-source remote control demonstration files, expands its application scope, enabling seamless integration with advanced robot platforms like the myArm M750, myCobot Pro 630, and Mercury B1.
The myArm C650 sets a new standard for versatility and performance in robot motion data collection, empowering users to explore the full potential of advanced robotics across diverse fields.
myArm M750
The myArm M750 is a universal intelligent 6 DOF robotic arm. It not only meets the demand for high-precision robot motion control but is particularly suitable for entry-level robot motion algorithm verification and practical teaching scenarios. Its standardized mechanical arm structure provides an ideal learning platform for students and beginners to grasp the basic principles and applications of robot kinematics.
Dedicated to achieving precise motion control and verification, the myArm M750 excels in applications requiring strict operational accuracy, such as precision assembly, fine manipulation, and quality monitoring. Equipped with industrial-grade high-precision digital servo motors and advanced control algorithms, the myArm M750 delivers exceptional torque control and positional accuracy, supporting a rated load capacity of 500g and a peak load of up to 1kg.
The myArm M750’s versatility extends to its end effector design, featuring a standard parallel gripper and vision module that empower users with basic grasping and recognition capabilities. Furthermore, the myArm M750 offers compatibility with a range of optional accessories, significantly expanding its application scenarios and adaptability to diverse tasks.
myArm M&C Teleoperation Robotic Arm Kit
The Teleoperation Robotic Arm Kit represents a leap forward in robotics innovation, offering an advanced solution tailored for remote control and real-time interaction through cutting-edge teleoperation technology. By seamlessly integrating the versatility of the myArm C650 with the precise control capabilities of the myArm M750, this kit forms a dynamic and adaptable platform suitable for a myriad of research, educational, and commercial applications.
Engineered to mimic human behavior, the kit enables researchers and developers to validate and test remote control systems and robot motion planning models akin to the ALOHA robot. Empowered by millisecond-level data acquisition and control capability, real-time drag control functionality, and multi-robot collaborative operation capabilities, the myArm M&C Kit facilitates the execution of complex tasks, including advanced simulations of human behavior. This technology not only showcases the precision and efficiency of robots in mimicking human actions but also propels research and development in robot technology for simulating human behavior and performing everyday tasks.
Moreover, integrated AI technology equips robots with learning and adaptability, enabling autonomous navigation, object recognition, and complex decision-making capabilities, thereby unlocking vast application potential across diverse research fields.
myArm M&C Embodied Humanoid Robot Compound Kit
Stanford University’s Mobile ALOHA project has garnered global attention for its groundbreaking advancements in robotics technology. It has developed an advanced system that allows users to execute complex dual-arm tasks through human demonstrations, enhancing the efficiency of imitation learning algorithms through data accumulation and collaborative training. The Mobile ALOHA system showcases its versatility by seamlessly executing various real-world tasks, from cleaning spilled drinks to cooking shrimp and washing frying pans. This innovation not only marks a significant milestone in robotics but also paves the way for a future where humans and robots coexist harmoniously.
Drawing inspiration from Stanford’s Mobile ALOHA project, this kit adopts the same Tracer mobile chassis. With an open-source philosophy, minimalist design, modular construction, and robust local community support, this kit serves as a cost-effective solution for real-time robot teleoperation and control, mirroring the capabilities of Mobile ALOHA with a more accessible price.
Designed to cater to the needs of small and medium-sized enterprises as well as educational and research institutions, this kit combines user-friendly features and easy access to cutting-edge robot technology at an accessible price.
The myArm M&C series robots are a versatile robotics solution catering to diverse needs from fundamental research to intricate task execution. In combination with optional kits, they seamlessly adapt to various application scenarios, from precision manufacturing to medical assistance, education, training, and household support. The myArm M&C series robots stand out as dependable and high-performing solutions, promising reliability and excellence. The inclusion of the Embodied Humanoid Robot Compound Kit and Quadruped Bionic Robot Compound Kit further expands the possibilities in robotics, encouraging interdisciplinary exploration and fostering innovation.