Picked up and put off

Guest post by IDS Corporate Communications

Autonomously driving robotic assistance system for the automated placement of coil creels

With Industry 4.0, digitalisation, automation and the networking of systems and facilities have become dominant topics in production and, by extension, in logistics. Industry 4.0 pursues the continuous optimisation of processes and workflows in favour of productivity and flexibility, and thus savings in time and cost. Robotic systems have become the driving force for automating processes. Through the Internet of Things (IoT), robots are becoming more perceptive, autonomous, mobile and easier to operate, and are increasingly an everyday helper in factories and warehouses. Intelligent imaging techniques play an ever more important role in this.

To meet the growing demands in scaling and changing production environments towards fully automated and intelligently networked production, the company ONTEC Automation GmbH from Naila in Bavaria has developed an autonomously driving robotic assistance system. The “Smart Robot Assistant” uses the synergies of mobility and automation: it consists of a powerful and efficient intralogistics platform, a flexible robot arm and a robust 3D stereo camera system from the Ensenso N series by IDS Imaging Development Systems GmbH.

The solution is versatile and takes over monotonous, physically demanding set-up and placement tasks. The autonomous transport system is suitable for floor-level lifting of Euro pallets up to container or industrial format, as well as mesh pallets of various sizes, with a maximum load of up to 1,200 kilograms. For a customer in the textile industry, the AGV (Automated Guided Vehicle) is used for the automated loading of coil creels: it picks up pallets loaded with yarn spools, transports them to the designated creel and loads the creel for further processing. Using a specially developed gripper system, up to 1,000 yarn packages per eight-hour shift are picked up and pushed onto a mandrel of the creel. The sizing scheme and the position of the coils are captured by an Ensenso 3D camera (N45 series) installed on the gripper arm.

Application

Pallets loaded with industrial yarn spools are picked up from the floor of a predefined storage place and transported to the creel location. There, the gripper positions itself vertically above the pallet. An image trigger is sent to the Ensenso 3D camera from the N45 series by the in-house software ONTEC SPSComm, which networks with the vehicle’s PLC and can thus read out and pass on data. In the application, SPSComm controls the communication between the software parts of the vehicle, gripper and camera, so the camera knows when the vehicle and the gripper are in position to take a picture. The camera then captures an image and passes a point cloud to an ONTEC software solution based on the standard HALCON software, which reports the coordinates of the coils on the pallet to the robot. The robot can then accurately pick up the coils and process them further. As soon as the gripper has cleared a layer of yarn spools, the Ensenso camera takes a picture of the packaging material lying between the layers and provides point clouds of this as well. These are processed in the same way to give the robot the information with which a needle gripper removes the intermediate layers. “This approach means that the number of layers and finishing patterns of the pallets do not have to be defined in advance, and even incomplete pallets can be processed without any problems,” explains Tim Böckel, software developer at ONTEC. “The gripper does not have to be converted for the use of the needle gripper. For this application, it has a normal gripping component for the coils and a needle gripping component for the intermediate layers.”
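
The step from point cloud to pick coordinates can be illustrated with a minimal sketch. This is not ONTEC's HALCON-based pipeline; the function name, tolerance values and greedy clustering are illustrative assumptions showing how top-layer coil centres could be extracted from a point cloud:

```python
import numpy as np

def coil_pick_points(cloud, layer_tol=0.02, cluster_radius=0.08):
    """Estimate pick coordinates for the coils of the topmost pallet layer.

    cloud: (N, 3) array of 3D points in metres, z axis pointing up.
    layer_tol / cluster_radius are illustrative values, not ONTEC's.
    """
    # Keep only points belonging to the topmost layer.
    top_z = cloud[:, 2].max()
    layer = cloud[np.abs(cloud[:, 2] - top_z) < layer_tol]

    # Greedy XY clustering: each cluster of points is one coil; its
    # centroid is handed to the robot as the pick coordinate.
    centers = []
    while len(layer):
        seed = layer[0, :2]
        near = np.linalg.norm(layer[:, :2] - seed, axis=1) < cluster_radius
        centers.append(layer[near].mean(axis=0))
        layer = layer[~near]
    return np.array(centers)
```

Because only the topmost layer is segmented on each pass, incomplete pallets and varying layer counts need no prior definition, which mirrors the behaviour Böckel describes.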

Thanks to its compact design, the Ensenso 3D camera is well suited to this task: mobile use on the robot arm for 3D acquisition of moving and static objects. The Ensenso N45’s 3D stereo electronics are completely decoupled from the housing, allowing the use of a lightweight plastic composite as the housing material. The low weight facilitates use on robot arms such as the Smart Robot Assistant. The camera also copes with demanding environmental conditions. “Challenges with this application lie primarily in the different lighting conditions found in different rooms of the hall and at different times of the day,” Tim Böckel describes the situation. Even in difficult lighting, the integrated projector casts a high-contrast texture onto the object to be imaged by means of a pattern mask with a random dot pattern, thus supplementing the structures on featureless, homogeneous surfaces. The integrated camera thus meets the requirements exactly. “By pre-configuring within NxView, the task was solved well.” This sample programme with source code demonstrates the main functions of the NxLib library, which can be used to open one or more stereo and colour cameras and visualise their image and depth data. Parameters such as exposure time, binning, AOI and depth measuring range can, as in this case, be adjusted live for the matching method used.

The matching process enables the Ensenso 3D camera to match a very high number of pixels, including their change in position, by means of the auxiliary structures projected onto the surface, and to create complete, homogeneous depth information of the scene from them. This in turn ensures the precision with which the Smart Robot Assistant proceeds. Other selection criteria for the camera included the standard Gigabit Ethernet vision interface and the global-shutter 1.3 MP sensor. “In favour of a faster throughput time, the camera takes only one image pair of the entire pallet, but it has to provide the coordinates from a relatively large distance with millimetre accuracy so that the robot arm can grip precisely,” explains Matthias Hofmann, IT specialist for application development at ONTEC. “We therefore need the high resolution of the camera to reliably capture the edges of the coils with the 3D camera.” Localising the edges is important in order to pass the position of the centre of the spool to the gripper as accurately as possible.
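
Deriving a spool centre from detected rim edges is essentially a circle fit. As a hedged sketch of that final step (not ONTEC's actual implementation), an algebraic Kåsa least-squares fit recovers centre and radius from edge points:

```python
import numpy as np

def fit_circle(xy):
    """Algebraic (Kasa) least-squares circle fit.

    xy: (N, 2) edge points of a spool rim, e.g. extracted from the
    point cloud. Returns (cx, cy, r). Illustrative sketch only.
    """
    # Linearise x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    b = (xy ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r
```

Averaging over many edge pixels is what lets a millimetre-accurate centre be derived even from a single image pair taken at a relatively large distance.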

Furthermore, the camera is specially designed for use in harsh environmental conditions. It has a screwable GPIO connector for trigger and flash and is IP65/67 protected against dirt, dust, splash water or cleaning agents.

Software

The Ensenso SDK enables hand-eye calibration of the camera to the robot arm, allowing easy translation or displacement of coordinates using the robot pose. In addition, by using the internal camera settings, a “FileCam” of the current situation is recorded at each pass, i.e. at each image trigger. This makes it possible to easily adjust any edge cases later on, in this application for example unexpected lighting conditions, obstacles in the image or also an unexpected positioning of the coils in the image. The Ensenso SDK also allows the internal camera LOG files to be stored and archived for possible evaluation.
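
The coordinate translation that hand-eye calibration enables can be sketched with two homogeneous transforms. The function and frame names below are illustrative assumptions, not the Ensenso SDK API: the hand-eye result gives the camera pose relative to the tool flange, and the current robot pose chains it to the base frame:

```python
import numpy as np

def camera_point_to_base(p_cam, T_base_tcp, T_tcp_cam):
    """Map a point from the camera frame into the robot base frame.

    T_tcp_cam : 4x4 hand-eye calibration result (camera pose relative
                to the tool flange).
    T_base_tcp: 4x4 current robot pose (flange relative to the base).
    Names and frames are illustrative, not the Ensenso SDK API.
    """
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous
    return (T_base_tcp @ T_tcp_cam @ p_h)[:3]
```

Chaining the transforms this way is what allows coil coordinates measured in the camera frame to be handed directly to the robot, regardless of where the arm happens to be when the image is triggered.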

ONTEC also uses these “FileCams” to automatically check test cases and thus ensure the correct functioning of all arrangements when making adjustments to the vision software. In addition, various vehicles can be coordinated and logistical bottlenecks minimised on the basis of the control system specially developed by ONTEC. Different assistants can be navigated and act simultaneously in a very confined space. By using the industrial interface tool ONTEC SPSComm, even standard industrial robots can be safely integrated into the overall application and data can be exchanged between the different systems.

Outlook

Further development of the system is planned, among other things, in terms of navigation of the autonomous vehicle. “With regard to vehicle navigation for our AGV, the use of IDS cameras is very interesting. We are currently evaluating the use of the new Ensenso S series to enable the vehicle to react even more flexibly to obstacles, for example, classify them and possibly even drive around them,” says Tim Böckel, software developer at ONTEC, outlining the next development step.

ONTEC’s own interface configuration already enables the system to be integrated into a wide variety of Industry 4.0 applications, while the modular structure of the autonomously moving robot solution leaves room for adaptation to a wide variety of tasks. In this way, it not only serves to increase efficiency and flexibility in production and logistics, but in many places also literally contributes to relieving the workload of employees.

More at: https://en.ids-imaging.com/casestudies-detail/picked-up-and-put-off-ensenso.html

BlueBotics’ ANT® Navigation Drives Autonomous Vehicles Over 10 Million Kilometers

St-Sulpice, Switzerland, February 18, 2021 – BlueBotics, the global leader in natural feature navigation, has announced its Autonomous Navigation Technology (ANT®) is now estimated to have driven automated guided vehicles (AGVs) and autonomous mobile robots (AMRs) over 10 million kilometers, in applications ranging from warehousing and manufacturing to commercial cleaning services, UVC disinfection and more.

Dr. Nicola Tomatis, CEO of BlueBotics, said, “The timing of this milestone is perfect as it comes the same month that we celebrate the company’s 20-year anniversary. It is amazing to think that our customers’ ANT® driven vehicles have driven over 10 million kilometers, almost 250 times the circumference of the globe. This achievement really speaks to the robustness of our industry-proven ANT® technology.”



ANT® navigation is popular with manufacturers and end users of AGVs, automated forklifts and AMRs, since it simplifies and shortens vehicle installation times as well as providing flexible, accurate and user-friendly operation. The technology suits a myriad of vehicle types and kinematics, from small 100 kg AMRs to 30-ton heavy load transporter AGVs. In addition, with BlueBotics’ accompanying ANT® server software, users whose vehicles are driven by BlueBotics’ ANT® lite+ product can create and operate a synchronized fleet of ANT® driven vehicles, no matter what the type or even brand. All of these vehicles are able to interact seamlessly with on-site equipment and machinery, including an organization’s existing WMS/MES/ERP software, using ANT® server’s simple API.

Dr. Tomatis continued, “In arriving at our 10-million-kilometer milestone, we conservatively estimate that AGVs driven by ANT® technology have worked more than one million days – over 2,500 years – of commercial operation.”

“It is exciting to see the impact ANT® technology is having, both on the AGV market and – most importantly – on the efficiency of those companies that operate ANT® driven vehicles. With our continuing strong growth, it shouldn’t be long before ANT® driven products will have circumnavigated the globe 500 times!”

About BlueBotics

BlueBotics is the reference in natural feature navigation and has the mission to help companies meet the challenge of vehicle automation. With its 20 years of industry experience, the company provides the autonomous navigation technology (ANT®) and expert support customers need to bring their AGV, automated forklift or mobile robot successfully to market. Today, there are more than 2,000 ANT® driven vehicles in operation worldwide. https://www.BlueBotics.com

AGVs keep the PPE moving during the COVID-19 pandemic

St-Sulpice, Switzerland, October 07, 2020 – The COVID-19 crisis has focussed public attention on the role that hospitals and healthcare professionals play in treating those infected by the virus. Working within strict social distancing guidelines and with limited staff has strained support functions, including internal logistics.

This is why an increasing number of hospitals around the world are reaping the benefits of investment in automated guided vehicles (AGVs). 

A modern hospital or clinic handles a huge amount of internal transportation daily. A 200-bed hospital transports an average of six tons of materials per day over a total distance of about 60 km, while an 800-bed hospital can handle up to 27 tons of materials, covering a distance of about 800 km. By utilising an AGV logistics system these movements can be handled more efficiently, freeing up valuable resources for medical activities.

One hospital in Garbagnate Milanese, Italy, for example, employs AGVs to automate these processes. The 57,000 m2 facility has over 500 beds and the backend logistics are handled by 12 AGVs that transport goods to 147 reception stations throughout the hospital.

The quality of transport in healthcare is essential in order to safeguard the integrity of the materials carried, ensure a high level of hygiene and protect the health of patients. One company that has been helping healthcare facilities automate their logistics since 2012 is Italy-based Oppent, with its EvoCart series of mobile robots, developed specifically for hospitals and medical centres. These vehicles can handle food, laundry, waste, sterilisation, pharmacy and general supplies, including ensuring that vital personal protective equipment (PPE) is in the right place at the right time during the current pandemic. Oppent has managed handling in more than 20 healthcare facilities.

Oppent’s bi-directional mobile robots have a programmable speed of 0.10 m/s to 2.0 m/s and comply with the ISO 3691-4 safety standard. Their movements are controlled by Autonomous Navigation Technology (ANT®) from BlueBotics, which uses natural structures in the environment, such as walls or furniture, as references, so that each vehicle knows exactly where it is. This approach means an AGV installation does not require expensive infrastructure changes, such as inductive wires laid in the floor or triangulation reflectors on the walls, in order to navigate effectively.

The AGVs are quickly installed using the ANT® lab tool suite, and modifications to routes are even simpler. As a result, installations are simple and economical to set up and maintain, whether for a single automated guided vehicle or a large fleet. A built-in safety system using certified laser scanners identifies obstacles along the path and adjusts the vehicle’s movement, with the AGVs handling obstacles autonomously either by adapting their speed to avoid emergency situations (path following) or by moving around them (obstacle avoidance).

======================

About BlueBotics

BlueBotics aims to become the reference in autonomous navigation, with the mission of enabling the mobility of automated vehicles in the professional-use market.

The company is now active in two segments:

  • Industrial automation – BlueBotics proposes ANT®, its innovative navigation solution.
  • Service robotics – The company proposes engineering services based on its expertise in mobile robotics with standard platforms, feasibility studies, custom designs, and dedicated developments to enable new customer applications.

Roboteq Launches Enhanced Sensor for Guiding Robotic Vehicles along Invisible Magnetic Tracks

Scottsdale, AZ, January 4, 2017 – Roboteq, Inc (www.roboteq.com) introduces a new magnetic guide sensor capable of detecting and reporting the position of a magnetic field along its horizontal axis. The sensor is intended for line following robotic applications, using adhesive magnetic tape to form a track guide on the floor.

Measuring only 165 x 35 x 35mm, the MGSW1600 is built into a rugged, watertight, all-metal enclosure. It uses an 8-pin waterproof M12-type connector for its power supply and IO signals.

The sensor uses advanced signal processing to accurately measure its lateral distance from the center of the track, with millimeter resolution, resulting in nearly 160 points end to end. Tape position information can be output in numerical format on the sensor’s RS232, CANbus or USB ports. The position is also reported as a 0 to 3V analog voltage output and as a variable PWM output. Additionally, the sensor supports a dedicated MultiPWM mode allowing seamless communication with all Roboteq motor controllers using only one wire.
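
Reading the analog output back into a lateral offset is a simple linear mapping. The 160 mm span below is an assumption inferred from "millimetre resolution ... nearly 160 points end to end"; the actual scaling should be taken from the MGSW1600 datasheet before relying on this sketch:

```python
def track_offset_mm(voltage, span_mm=160.0, v_max=3.0):
    """Convert the sensor's 0-3 V analog output to a lateral offset in mm.

    Assumes mid-scale (1.5 V) is track centre and the full 0-3 V range
    maps linearly to the sensing span. span_mm=160 is an inference from
    the press release, not a datasheet value.
    """
    return (voltage / v_max - 0.5) * span_mm
```

A steering loop on the AGV would feed this offset into its controller, driving the error toward zero to keep the vehicle centred over the tape.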

The sensor is primarily used to steer Automatic Guided Vehicles (AGVs) moving material on factory floors. However, its unique sensitivity and accuracy open a world of new application opportunities, such as automatic shelf replenishing in supermarkets, patient transport in hospitals, stage theater props, or rail-less tramways.

Compared to other guiding techniques, magnetic guides are totally passive and therefore easy to lay and modify. The tape creates an invisible field that is immune to dirt and unaffected by lighting conditions. The magnetic track can be totally hidden under any non-ferrous flooring material, such as linoleum, tiles, or carpet.

Roboteq provides drawings, How-To videos and software free of charge for building Magnetic Track Guided mobile robots. Roboteq’s RoboAGVSim is a software package that lets the user develop and simulate such robots.

The sensor will detect and manage up to 2-way forks and can be instructed to follow the left or right track using commands issued via the serial, CAN or USB ports. All of the sensor’s operating parameters and commands are also accessible via its CAN bus interface.

In addition to detecting a track to follow, the sensor will detect and report the presence of magnetic markers. Markers are pieces of tape of opposite magnetic polarity that may be positioned on the left or right side of the track. The sensor is equipped with four LEDs for easy monitoring and diagnostics.

The sensor incorporates a high-speed, Basic-like scripting language that allows users to add customized functionality to the sensor. A PC utility is provided for configuring the sensor, capturing and plotting the sensor data on a strip chart recorder, and visualizing in real time the magnetic field as it is “seen” by the sensor.

The sensor firmware can be updated in the field to take advantage of new features as they become available.

The MGSW1600 is available now to customers worldwide at $595 in single quantities. Product information and software can be downloaded from the company’s web site at https://www.roboteq.com/index.php/roboteq-products-and-services/magnetic-guide-sensors

Roboteq sells adhesive magnetic tape of 25mm and 50mm width, in 50 meter rolls. These are available from our online store.

A two-minute video presentation of the MGSW1600 can be viewed at: http://www.youtu.be/Hg3ToqLGqPY

A demonstration video of the RoboAGVSimulator can be viewed at: https://www.youtube.com/watch?v=3r_vB-9433Q&t=14s

Roboteq’s Magnetic Guide Technology is used by the world’s finest AGV and mobile robot manufacturers. Below are some customer videos showing Roboteq’s sensor in action:

Aristeril – Spain: https://www.youtube.com/watch?v=k_R7gnSA5qI

ASI Technologies – USA: https://www.youtube.com/watch?v=Vjcbox8z9A0

Divel – Canada: https://www.youtube.com/watch?v=Da_FIXaPiRw

DTA – Spain: https://www.youtube.com/watch?v=eEnIHanKGmg

Ideasparq – Malaysia: https://www.youtube.com/watch?v=l6tr4okyCLc

Tekn0 – USA: https://www.youtube.com/watch?v=TM2t-ddph8Q