On May 16th, one of the highlights of the inVISION Day Metrology 2024 – Digital Conference for Metrology – will be the keynote 'Intelligent measurement technology for the production of the future' by Dr. Benjamin Montavon (Werkzeugmaschinenlabor WZL der RWTH Aachen). His topic: resilient measurement technology is essential for the intelligent production of the future, including for the evaluation of components in a circular economy or flexible automation in dismantling and reassembly. The enabler and challenge is trustworthy digitalization, especially when using AI. The online conference offers twelve presentations about 3D scanners, inline metrology, surface inspection and CT/X-ray. Event partners of the inVISION Day Metrology are Control Messe and EMVA. Full program with all presentations and free registration at www.invdays.com/metrology #metrology #3dscanning #qualityassurance #inspection #ndt #automotiveindustry #nondestructivetesting #industrialengineering #xray #robotics #patternrecognition #ai #deeplearning
inVISION News’ Post
On May 16th, one of the highlights of the inVISION Day Metrology 2024 – Digital Conference for Metrology – will be the discussion 'Using AI in Dimensional Metrology'. Experts from Hexagon Manufacturing Intelligence (Johannes Mann), LMI Technologies (Chris Aden), VisiConsult - X-ray Systems & Solutions GmbH (Lennart Schulenburg) and ZEISS Industrial Quality Solutions (Dr. Christian Wojek) will discuss whether AI is already being used in metrology applications and, if so, what advantages it offers and what the challenges are. The online conference offers twelve presentations about 3D scanners, inline metrology, surface inspection and CT/X-ray, as well as a keynote by the Werkzeugmaschinenlabor WZL der RWTH Aachen. Event partners of the inVISION Day Metrology are Control Messe and the European Machine Vision Association (EMVA). Full program with all presentations and free registration at www.invdays.com/metrology #metrology #3dscanning #qualityassurance #inspection #ndt #automotiveindustry #nondestructivetesting #industrialengineering #xray #robotics #patternrecognition #ai #deeplearning
Tactile feedback is essential for understanding the dynamics of both rigid and deformable objects in many manipulation tasks, such as non-prehensile manipulation and dense packing. Researchers have introduced 𝐑𝐨𝐛𝐨𝐏𝐚𝐜𝐤, a framework combining visual and tactile sensing through a neural, tactile-informed dynamics model to enhance robotic manipulation. RoboPack employs a recurrent graph neural network to estimate object states, including particles and object-level latent physics information, from historical visuo-tactile observations and make future state predictions. This model, learned from real-world data, enables robots to solve downstream tasks using model-predictive control. The approach was demonstrated on a real robot equipped with a compliant Soft-Bubble tactile sensor. The robot tackles non-prehensile manipulation and dense packing tasks and infers objects' physics properties from direct and indirect interactions. Remarkably, the model was trained on only 30 minutes of real-world interaction data per task, yet it can perform online adaptation and make touch-informed predictions. RoboPack is showing promise over previous learning-based and physics-based simulation systems through extensive evaluations in long-horizon dynamics prediction and real-world manipulation. Congrats to the team, and best of luck with the future work. 📝 Research Paper: https://lnkd.in/e2Ng3VDA 📊 Project Page: https://lnkd.in/e-V3buQv #robotics #research
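The control scheme described above (a learned dynamics model queried inside model-predictive control) can be sketched in a few lines. This is not the RoboPack code: the neural visuo-tactile model is replaced by a trivial hand-written dynamics function, the planner is plain random-shooting MPC, and all names and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics(state, action):
    # Stand-in for the learned visuo-tactile dynamics model:
    # here simply a point pushed by half the commanded action.
    return state + 0.5 * action

def rollout_cost(state, actions, goal):
    # Accumulated distance to the goal along the model-predicted trajectory.
    total = 0.0
    for a in actions:
        state = dynamics(state, a)
        total += np.linalg.norm(state - goal)
    return total

def plan(state, goal, horizon=5, n_samples=64):
    # Random-shooting MPC: sample action sequences, score them with the
    # model, and execute only the first action of the best sequence.
    seqs = rng.uniform(-1.0, 1.0, size=(n_samples, horizon, state.shape[0]))
    costs = [rollout_cost(state, seq, goal) for seq in seqs]
    return seqs[int(np.argmin(costs))][0]

state, goal = np.zeros(2), np.array([1.0, 1.0])
for _ in range(10):                      # replan after every executed action
    state = dynamics(state, plan(state, goal))
print(state)
```

Replanning at every step is what lets such a controller absorb model error and, as in RoboPack, fold newly observed tactile information back into the state estimate.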
🤖 Exciting breakthrough in robotic manipulation 🤖 Revolutionizing robotic grasping and manipulation: a novel approach leveraging reinforcement learning and tactile sensing has just been unveiled, and the implications are profound! • Researchers have developed a tactile-based reinforcement learning framework that enables robots to learn complex grasping and manipulation tasks with unprecedented success rates. • The framework utilizes a novel tactile sensor design and a reinforcement learning algorithm that adapts to changing object properties and environmental conditions. • Experimental results demonstrate significant improvements in task success rates and adaptability compared to existing state-of-the-art methods. This research has the potential to significantly improve the dexterity and versatility of robots in real-world applications - but what are the key challenges that need to be addressed to bring this technology to scale? Read the full paper to dive deeper into the methodology and results: https://lnkd.in/gwQcY65c. Share your thoughts on the potential impact of this research and let's continue the discussion #ReinforcementLearning #RoboticManipulation
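As a rough illustration of the idea behind tactile-based reinforcement learning (not the paper's method, sensor, or environment), here is a toy tabular Q-learning agent whose only observation is a discretized tactile signal; the force bands, rewards, and actions are all invented for the example.

```python
import random

random.seed(0)

# Toy grasp environment: the "tactile" observation is the discretized
# grip regime. A stable grasp lies in a narrow force band (a hypothetical
# stand-in for a real tactile sensor and object).
def tactile(force):
    if force <= 1:
        return "slipping"
    if force <= 3:
        return "stable"
    return "crushing"

def step(force, action):  # actions: 0=loosen, 1=hold, 2=tighten
    force = max(0, min(5, force + action - 1))
    obs = tactile(force)
    reward = {"slipping": -1, "stable": +1, "crushing": -2}[obs]
    return force, obs, reward

OBS = ("slipping", "stable", "crushing")
Q = {(o, a): 0.0 for o in OBS for a in range(3)}
alpha, gamma, eps = 0.5, 0.9, 0.2

for episode in range(200):
    force = random.randint(0, 5)         # object/grip state is hidden
    obs = tactile(force)
    for _ in range(20):
        if random.random() < eps:        # epsilon-greedy exploration
            a = random.randrange(3)
        else:
            a = max(range(3), key=lambda x: Q[(obs, x)])
        force, obs2, r = step(force, a)
        Q[(obs, a)] += alpha * (r + gamma * max(Q[(obs2, x)] for x in range(3)) - Q[(obs, a)])
        obs = obs2

policy = {o: max(range(3), key=lambda a: Q[(o, a)]) for o in OBS}
print(policy)
```

Even this toy agent learns the intuitive policy: tighten when slipping, hold when stable, loosen when crushing. Real frameworks replace the table with a neural policy and the three-valued signal with high-dimensional tactile images.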
Our latest research by Federico Bernabei, Matteo Lo Preti and Lucia Beccai, published in Micromachines (MDPI), introduces the T-Blep: a soft optical sensor that can measure both stiffness and contact force. 🪶 𝐖𝐡𝐚𝐭 𝐦𝐚𝐤𝐞𝐬 𝐢𝐭 𝐬𝐩𝐞𝐜𝐢𝐚𝐥? The T-Blep uses light transduction, making it highly sensitive, adaptable, and immune to electromagnetic interference. Its softness allows it to interact with delicate objects without causing damage. 🔑 𝐊𝐞𝐲 𝐟𝐢𝐧𝐝𝐢𝐧𝐠𝐬? • The T-Blep successfully distinguishes between extra-soft, soft, and rigid materials. • By adjusting its internal pressure, the sensor can switch between measuring stiffness and force, offering versatility in various applications. • Raising the internal pressure yields a sixfold increase in the detectable force range, making it suitable for a wide range of tasks. 🤖 𝐖𝐡𝐲 𝐝𝐨𝐞𝐬 𝐢𝐭 𝐦𝐚𝐭𝐭𝐞𝐫? The T-Blep has the potential to empower fields like: • Soft robotics: equipping robots with a sense of touch for safer and more delicate interactions. • Medical diagnostics: enabling more precise and comfortable assessment of tissue stiffness for disease detection. 🩺 • Agricultural robotics: a gentle squeeze of a fruit offers vital tactile feedback for assessing ripeness. 💬 𝐑𝐞𝐚𝐝𝐲 𝐭𝐨 𝐥𝐞𝐚𝐫𝐧 𝐦𝐨𝐫𝐞? Read the full paper here: https://lnkd.in/dx8YTbnZ Feel free to ask questions or reach out to the authors for collaboration opportunities. #SoftRobotics #TactileSensors #Bioengineering #MDPI #OpenScience
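As a back-of-the-envelope illustration of how stiffness can be inferred from contact measurements (this is not the T-Blep's optical transduction or calibration), the sketch below fits the slope of a force-indentation curve and buckets it into the three material classes mentioned above; the thresholds and sample data are hypothetical.

```python
import numpy as np

def estimate_stiffness(indentation_mm, force_n):
    """Least-squares slope of force vs. indentation, in N/mm."""
    slope, _intercept = np.polyfit(indentation_mm, force_n, 1)
    return slope

def classify(stiffness_n_per_mm, thresholds=(0.1, 1.0)):
    # Hypothetical class boundaries in N/mm, chosen only for this demo.
    if stiffness_n_per_mm < thresholds[0]:
        return "extra-soft"
    if stiffness_n_per_mm < thresholds[1]:
        return "soft"
    return "rigid"

d = np.linspace(0.0, 2.0, 20)            # indentation sweep in mm
print(classify(estimate_stiffness(d, 0.05 * d)))  # → extra-soft
print(classify(estimate_stiffness(d, 5.0 * d)))   # → rigid
```

In the actual sensor the force signal comes from light-intensity changes rather than a direct force channel, and the pressure-dependent mode switch changes which of the two quantities the fitted curve reflects.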
Empowering Professionals for Success in AI, Marketing & Digital Transformation | Leader in Data Science Education & Industry Innovation
𝐌𝐈𝐓 𝐢𝐬 𝐑𝐞𝐯𝐨𝐥𝐮𝐭𝐢𝐨𝐧𝐢𝐳𝐢𝐧𝐠 𝐋𝐚𝐧𝐠𝐮𝐚𝐠𝐞 𝐅𝐨𝐫 𝐑𝐨𝐛𝐨𝐭𝐬🧠 MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) unveils F3RM, an innovative system enhancing robots' understanding of the world through language and 3D scenes. This technology enables #robots to interpret open-ended text prompts, making them adaptable to real-world environments like warehouses and households, even when facing thousands of objects 🤯 Combining 2D images and foundation model features, #F3RM creates a 3D representation, offering the ability to grasp objects accurately. Imagine a robot picking up a "tall mug" or a specific item you request. This is the future of robotic flexibility, with applications in logistics, urban environments, and beyond. Do you think F3RM will help the democratization of robots in workspaces? Share your thoughts 👇 Image Credit: Massachusetts Institute of Technology #AIForRobotics #YobiAI
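The core query step (matching a language embedding against per-point features of a 3D scene) can be illustrated with a toy example. This is not the F3RM implementation: the real system distills CLIP-scale foundation-model features into a neural field, whereas here the points, features, and "text embedding" are tiny hand-made arrays.

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def query_feature_field(points, features, text_embedding):
    """Return the 3D point whose feature vector best matches the prompt."""
    scores = [cosine(f, text_embedding) for f in features]
    return points[int(np.argmax(scores))]

# Toy "field": three 3D points with hypothetical 4-D features.
points = np.array([[0.0, 0.0, 0.0], [0.1, 0.2, 0.3], [0.5, 0.5, 0.9]])
features = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]], dtype=float)
text = np.array([0.1, 0.9, 0.1, 0.0])   # stand-in for an embedded prompt like "tall mug"

target = query_feature_field(points, features, text)
print(target)                            # → [0.1 0.2 0.3]
```

The returned 3D location would then seed a grasp proposal; the open-ended flexibility comes entirely from the shared embedding space between images and text.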
🤖📚 Robotics in Assistive Technology! 🌐✨ This robot has been trained to read braille at high speed! Researchers in Cambridge have created a robotic sensor that reads braille at an incredible 315 words per minute - TWICE as fast as humans! 🚀👁️🗨️ Using advanced machine learning, the sensor, with a camera in its 'fingertip', achieves 87% accuracy while reading about twice as fast as human braille readers. The breakthrough has broad applications, including potential use in human-like tactile robot hands. 📈🤖 Provided by University of Cambridge Source: https://lnkd.in/emM_g-92 #SmartTech --------------------------------------------------------------------- 💡 Enjoyed this post? 🔄 Let's stay connected! 👣 Follow me for more insights about smart technology & business 🔔 Activate the notification bell so you don't miss new content
Introducing the latest marvel in book digitization: the automatic book scanner! Crafted for seamless book-to-digital conversion, this innovative contraption boasts overhead scanners equipped with cameras strategically poised to capture pages from above, all nestled on a V-shaped cradle that gracefully turns each page with precision. But wait, there's more! Enter the robotic scanner, complete with a dexterous robotic arm delicately flipping through pages while a mounted camera effortlessly immortalizes each word. But the magic doesn't end there. These scanners come loaded with an array of features to streamline the process: automatic page turning, deskewing for text alignment, cropping to eliminate background noise, and the pièce de résistance: optical character recognition, transforming scanned text into searchable digital formats. Say goodbye to manual labor and hello to the future of book preservation and accessibility! Join us here for more like this https://lnkd.in/gdcE93ta follow our Instagram https://lnkd.in/ejZKsdRG #AI #Robotics #Innovation #FutureTech #Technology #Automation #ArtificialIntelligence #TechNews #IndustryTrends #Business #Entrepreneurship #MarketTrends #LinkedIn #ShareTheFuture #qqmworld
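One of the listed features, deskewing, is easy to sketch. The classic projection-profile method below estimates page skew by searching for the shear angle that makes the horizontal text-line profile sharpest; the synthetic "page" and every parameter here are invented for the demo, and a real scanner would first binarize the camera image.

```python
import numpy as np

def skew_angle(ink_xy, angles_deg=np.linspace(-10, 10, 81)):
    """Projection-profile skew estimation: try shear angles and keep the
    one whose row histogram of ink pixels has maximal variance (i.e. the
    text lines collapse into the fewest rows)."""
    x, y = ink_xy[:, 0], ink_xy[:, 1]
    def profile_variance(a):
        rows = np.round(y - x * np.tan(np.radians(a))).astype(int) + 200
        return np.var(np.bincount(rows))
    return max(angles_deg, key=profile_variance)

# Synthetic page: ink pixels on three text lines skewed by 3 degrees.
rng = np.random.default_rng(1)
x = rng.integers(0, 100, size=600)
lines = rng.choice([0, 40, 80], size=600)
y = lines + x * np.tan(np.radians(3.0)) + rng.normal(0, 0.1, size=600)

angle = skew_angle(np.column_stack([x, y]))
print(float(angle))
```

Rotating the scanned image by the negative of the estimated angle yields the aligned page that the OCR stage expects.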
🚀 Diving Deeper into the Tech Behind My Robotic Arm with Stereo Vision! 🤖 In my recent project, I combined robotics, computer vision, and AI to create a robotic arm capable of identifying and manipulating objects. Today, I want to highlight the technologies that made this project possible. Technical Insights: 🔹 Computer Vision: OpenCV: Utilized for image processing and computer vision tasks. TensorFlow Lite SSDLite V1: Employed for object detection and recognition. Disparity Map: Implemented for stereo vision to estimate distance between the robot and objects. 🔹 Robotic Arm: Equipped with 4 servo motors (one for each link) and an additional servo motor for the claw mechanism. A stepper motor is used on the base for precise rotation control. 🔹 Mobility: The robotic arm is mounted on a mobile base with two motors for forward movement, controlled via PWM (Pulse Width Modulation). 🔹 Control System: Given the hardware limitations of the Raspberry Pi 3B+, all components are controlled using ZeroMQ. This allows for efficient communication using a master-slave pattern instead of the more commonly used ROS. Key Achievements: Stereo Vision Implementation: Enabled the robot to perform object recognition and distance measurement using two webcams. Functional Design: The arm identifies empty bottles, calculates their distance, and disposes of them in a trash bin. Automation and Efficiency: Demonstrates the potential of integrating machine learning algorithms with robotics for practical solutions. Looking Ahead: I'm excited to further explore the convergence of robotics and AI, aiming to develop innovative solutions that can make a meaningful impact. #Robotics #MachineLearning #Automation #Engineering #AI #ComputerVision #Technology
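The distance-estimation step described above rests on the standard rectified-stereo relation Z = f * B / d, where f is the focal length in pixels, B the baseline between the two webcams, and d the disparity from the disparity map. A minimal sketch (with a hypothetical focal length and baseline, not the project's actual calibration):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Rectified-stereo depth: Z = f * B / d; zero disparity maps to infinity."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return depth

# Hypothetical webcam rig: 700 px focal length, 6 cm baseline.
depths = depth_from_disparity([70, 35, 0], focal_length_px=700, baseline_m=0.06)
print(depths)   # → [0.6 1.2 inf]
```

In the actual pipeline the disparity values would come from OpenCV's stereo matching on the two webcam images, and the calibration constants from a checkerboard calibration of the rig.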
Editor-in-chief, inVISION (machine vision and optical metrology)
We are looking forward to an interesting keynote on the future of metrology.