RIEM News

Articles tagged with "AI-vision"

  • Robotic exoskeleton gives YouTuber 63% aim boost, 17ms latency

    YouTuber Nick Zetta, known as Basically Homeless, developed a robotic exoskeleton designed to improve his aiming performance in the Aimlabs training program. Combining Nvidia Jetson hardware with a YOLO-powered AI vision system, motors, and 3D-printed components, the device physically guides his wrist and fingers to improve target acquisition. Initial tests showed a 20% accuracy drop as Zetta adapted to the system, but after hardware optimizations, such as reducing latency from 50 ms to 17 ms and increasing motor strength, he achieved a 63% boost in his Aimlabs score, propelling him to second place on the global leaderboard. The exoskeleton attaches to the forearm using 3D-printed hinges, with Kevlar lines and gimbal motors controlling wrist movements and solenoids managing finger clicks. A high-speed camera feeds real-time target data to the AI, which directs the motors to adjust hand positioning, effectively acting as a physical aimbot; a rough sketch of this kind of detection-to-actuation loop appears after this item.

    robotics, robotic-exoskeleton, AI-vision, computer-vision, Nvidia-Jetson, 3D-printing, assistive-technology
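
    The video itself does not publish any code, but the loop described above (high-speed camera, YOLO detection, wrist-motor correction) can be sketched roughly as follows. This is a minimal illustration only: it assumes the ultralytics YOLO API and OpenCV, and send_wrist_correction is a hypothetical placeholder for the unpublished motor interface.

        # Rough sketch of a camera -> YOLO -> actuator loop; not Zetta's actual code.
        import cv2
        from ultralytics import YOLO

        model = YOLO("yolov8n.pt")      # generic detection weights; the real model is unspecified
        cap = cv2.VideoCapture(0)       # high-speed camera watching the game screen

        def send_wrist_correction(dx_px, dy_px):
            """Hypothetical interface to the gimbal motors pulling the Kevlar lines."""
            pass

        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            h, w = frame.shape[:2]
            cx, cy = w / 2, h / 2                      # assume the crosshair sits at frame centre
            result = model(frame, verbose=False)[0]
            if len(result.boxes) == 0:
                continue
            # Steer toward the detected target closest to the crosshair.
            x1, y1, x2, y2 = min(
                (b.xyxy[0].tolist() for b in result.boxes),
                key=lambda box: ((box[0] + box[2]) / 2 - cx) ** 2
                              + ((box[1] + box[3]) / 2 - cy) ** 2,
            )
            send_wrist_correction((x1 + x2) / 2 - cx, (y1 + y2) / 2 - cy)
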
  • OmniCore EyeMotion lets robots adapt to complex environments in real time, says ABB - The Robot Report

    ABB Robotics has launched OmniCore EyeMotion, a new software solution that enables OmniCore-powered robots to recognize and adapt to their surroundings in real time using any third-party camera or sensor. This advancement allows robots to perform complex 2D and 3D vision-based tasks without requiring specialized camera hardware. Designed for ease of use with a simple drag-and-drop web interface, EyeMotion integrates fully with ABB’s RobotStudio programming tool, significantly reducing commissioning time by up to 90%. The system supports a wide range of applications across industries such as manufacturing, logistics, packaging, and food and beverage, handling tasks like item sorting and quality inspection. In more complex scenarios, OmniCore EyeMotion can be combined with ABB’s Automatic Path Planning Online to enable collision-free navigation around obstacles, potentially reducing cycle times by up to 50%. This innovation is part of ABB’s broader strategy to advance “autonomous versatile robotics” (AVR), aiming for robots that autonomously plan and execute diverse tasks in real time

    robotics, industrial-robots, AI-vision, autonomous-robots, OmniCore-EyeMotion, ABB-Robotics, machine-automation
  • Robot with AI vision and 4,000-Newton grip targets marine pollution

    German researchers at the Technical University of Munich (TUM) have developed an AI-powered autonomous diving robot designed to combat marine pollution by collecting underwater debris. Tested in the port of Marseille, the robot integrates AI vision, ultrasound, and cameras to detect and identify various types of ocean litter, ranging from heavy objects like lost fishing nets and tires to fragile items such as glass bottles. The robot's four-fingered hand can exert a gripping force of up to 4,000 Newtons, enabling it to lift objects weighing as much as 551 lbs (250 kilograms) with precision, thanks to sensors that regulate grip strength to avoid damaging delicate waste (a rough sketch of this kind of force-limited grasp appears after this item). The system is part of the EU project SEACLEAR and operates as a coordinated network including an unmanned service boat, a small underwater search robot, and an autonomous dinghy that serves as a floating waste container. The service boat supplies power and data via cable and uses ultrasonic waves to map the seabed. The 20-inch search robot

    robotics, AI-vision, underwater-robot, marine-pollution, autonomous-systems, robotic-gripper, environmental-technology
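
    The article gives no implementation details, but the grip-strength regulation it describes (closing only until fingertip sensors report enough contact force, with 4,000 Newtons as the mechanical ceiling) can be sketched as below. ForceSensor and GripperMotor are hypothetical placeholders, not SEACLEAR interfaces.

        # Rough illustration of force-limited grasping; not SEACLEAR project code.
        import time

        MAX_FORCE_N = 4000.0            # mechanical limit quoted for the TUM hand

        class ForceSensor:
            """Placeholder for the fingertip force sensing described in the article."""
            def read(self) -> float:
                return 0.0              # replace with a real sensor readout

        class GripperMotor:
            """Placeholder for the finger actuators."""
            def set_force(self, newtons: float) -> None:
                pass                    # replace with a real motor command

        def grasp(target_force_n, sensor, motor, step_n=10.0, period_s=0.01):
            """Close gradually, stopping once contact force reaches the target
            (a few newtons for a glass bottle, far more for a tire), and never
            command more than the 4,000 N limit."""
            target = min(target_force_n, MAX_FORCE_N)
            commanded = 0.0
            while commanded < target and sensor.read() < target:
                commanded = min(commanded + step_n, target)
                motor.set_force(commanded)
                time.sleep(period_s)

        grasp(target_force_n=15.0, sensor=ForceSensor(), motor=GripperMotor())
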
  • Unitree launches A2 quadruped equipped with front and rear lidar - The Robot Report

    Unitree Robotics has launched its latest quadruped robot, the Unitree A2, designed for industrial applications such as inspection, logistics, and research. The A2 features significant upgrades in perception, including dual industrial lidar sensors positioned at the front and rear, an HD camera, and a front light to improve environmental detection and eliminate blind spots. Weighing 37 kg unloaded, the A2 can carry a 25 kg payload while walking continuously for three hours or about 12.5 km, supported by hot-swappable dual batteries for extended missions. This model balances endurance, strength, speed, and perception, marking it as one of Unitree’s most advanced quadrupeds to date. Key specifications of the A2 include a top speed of 5 m/s, an unloaded range of 20 km, a maximum standing load of 100 kg, and the ability to climb steps up to 1 meter high. Compared to Unitree’s previous B2 model, the A2 is

    robot, quadruped-robot, lidar, autonomous-robots, robotics, AI-vision, battery-technology
  • Apera AI closes Series A financing, updates vision software, names executives - The Robot Report

    Apera AI, a Vancouver-based developer of 4D Vision technology for industrial automation, has closed an oversubscribed Series A financing round. The company plans to use the new funding to expand its team, improve processes, and drive product innovation. Apera AI’s patented 4D Vision system integrates advanced 3D vision with artificial intelligence—the “fourth dimension”—to enable robots to perform complex tasks such as bin picking, de-racking, and assembly with high speed and precision. This technology addresses common manufacturing challenges like shifting bins, changing lighting, and worn grippers, which traditional 3D vision systems struggle to handle. In conjunction with the funding announcement, Apera AI released Apera Vue 9.50, an updated version of its controller software featuring vision-guided TCP calibration, recording and playback capabilities, and an accuracy insight tool. These enhancements help manufacturers maintain precision despite real-world variations on the factory floor. The company also emphasizes no-code setup tools, AI-powered calibration,

    robot, industrial-automation, AI-vision, 4D-vision-technology, manufacturing-robotics, robotic-calibration, automation-software
  • Samsung plans to make eyes for growing humanoid robot market

    Samsung Electro-Mechanics is positioning itself to become a key supplier in the growing humanoid robot market by leveraging its advanced camera module technology and AI vision capabilities. Building on its expertise in image processing, AI-driven image recognition, and object detection—technologies already showcased in Samsung Galaxy smartphones—Samsung aims to develop sophisticated "eyes" for humanoid robots. This move aligns with the company's recent robotics ventures, including the upcoming Ballie home assistant robot and the Samsung Bot Handy, an AI-powered robot capable of object recognition and manipulation. Given the saturation of the smartphone camera market, robotics presents a significant new growth opportunity for Samsung. Rather than manufacturing its own line of humanoid robots, Samsung may choose to collaborate with other robotics companies by supplying core AI vision technology, similar to its existing business model of providing components like displays and memory chips. Meanwhile, competitor LG Innotek is already advancing in this space through negotiations with prominent robotics firms such as Boston Dynamics and Figure AI, which plans to mass-produce

    robotics, humanoid-robots, AI-vision, Samsung, camera-technology, artificial-intelligence, robotics-market
  • Orbbec, Connect Tech to provide support for Gemini stereo depth camera - The Robot Report

    robot, IoT, autonomous-machines, industrial-automation, stereo-vision, AI-vision, depth-camera
  • ABB upgrades Flexley Mover AMR with visual SLAM capabilities

    robot, AMR, automation, vSLAM, AI-vision, logistics, industrial-robotics