Sentient robots have been a staple of science fiction for decades, raising tantalizing ethical questions and shining a light on the technical barriers to creating artificial consciousness. Much of what the technology world has achieved in artificial intelligence (AI) today is thanks to recent progress in deep learning, which allows machines to learn automatically during training.
This breakthrough eliminates the need for careful, manual feature engineering – a key reason deep learning stands out as a transformative force in AI and technological innovation.
Building on this momentum, Meta – the owner of Facebook, WhatsApp and Instagram – is diving into bold new territory with advanced "AI touch" technologies. The company recently introduced three new AI-powered tools – Sparsh, Digit 360 and Digit Plexus – designed to give robots a form of touch sensitivity that closely mimics human perception.
The objective? Creating robots that do not merely perform tasks, but actively engage with their surroundings, much as people interact with the world.
Sparsh, aptly named after the Sanskrit word for "touch", is a general-purpose AI model that allows robots to interpret and react to sensory signals in real time. The Digit 360 sensor, meanwhile, is an artificial fingertip for robot hands that can perceive touch and physical sensations as minute as a pinprick or a subtle change in pressure. Digit Plexus acts as a bridge, providing a standardized framework for integrating touch sensors across different robotic designs and making it easier to capture and analyze touch data. Meta believes these AI-powered tools will allow robots to handle intricate tasks that require a "human" touch, particularly in fields such as healthcare, where sensitivity and precision are paramount.
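Meta has not published a public API for these tools in the text above, so purely as an illustration, here is a minimal sketch of what a standardized tactile pipeline of the kind Digit Plexus describes might look like. The class name, field names, and force thresholds below are all hypothetical, not Meta's actual interface.

```python
from dataclasses import dataclass

# Hypothetical reading from a fingertip tactile sensor (names and
# units are illustrative, not Meta's actual API).
@dataclass
class TouchReading:
    normal_force_mn: float   # force pressing into the fingertip, in millinewtons
    shear_force_mn: float    # lateral (sliding) force, in millinewtons

def classify_contact(reading: TouchReading) -> str:
    """Map a raw tactile reading to a coarse contact label."""
    if reading.normal_force_mn < 1.0:
        return "no-contact"
    if reading.normal_force_mn < 50.0 and reading.shear_force_mn < 5.0:
        return "light-touch"   # e.g. a pinprick-scale contact
    if reading.shear_force_mn >= reading.normal_force_mn * 0.5:
        return "slip"          # object sliding across the fingertip
    return "firm-grip"

print(classify_contact(TouchReading(normal_force_mn=0.5, shear_force_mn=0.0)))    # no-contact
print(classify_contact(TouchReading(normal_force_mn=120.0, shear_force_mn=80.0))) # slip
```

The point of the sketch is the design idea the article attributes to Digit Plexus: once readings from heterogeneous sensors are normalized into one schema, downstream logic (grip control, slip recovery) can be written once and reused across robot hands.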
However, the introduction of sensory robots raises larger questions: could this technology unlock new levels of collaboration, or might it arrive in a society that is not equipped for it?
"As robots unlock new senses and gain a higher degree of intelligence and autonomy, we will have to start considering their role in society," Ali Ahmed, co-founder and CEO of Robomart, told me. "Meta's efforts are a major first step toward endowing them with human-like senses. As people become extremely intimate with robots, they will treat them as life partners, companions, and even go so far as to build a life with them."
A framework for human-robot harmony?
Alongside its tactile AI advances, Meta also introduced the PARTNR benchmark, a standardized framework for evaluating human-robot collaboration at scale. Designed to test interactions that require planning, reasoning and cooperation, PARTNR will allow robots to navigate both structured and unstructured environments alongside people. By integrating large language models (LLMs) to guide these interactions, PARTNR can assess performance on critical elements such as coordination and task tracking, elevating robots from mere "agents" to genuine "partners" capable of working smoothly with human counterparts.
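To make "coordination and task tracking" concrete, here is a toy scoring loop of the kind a collaboration benchmark might use. The task names, step format, and metric are invented for this sketch; they are not the benchmark's actual schema.

```python
# Illustrative scoring for a human-robot collaboration episode:
# what fraction of the planned steps did the robot complete, in order?
# (A real benchmark would track many richer signals than this.)

def collaboration_score(plan: list[str], executed: list[str]) -> float:
    """Fraction of planned steps completed in order (naive task tracking)."""
    completed = 0
    for step in executed:
        if completed < len(plan) and step == plan[completed]:
            completed += 1
    return completed / len(plan) if plan else 1.0

# Hypothetical hand-over task: extra improvised steps don't hurt the score,
# but skipping a planned step does.
plan = ["locate-mug", "grasp-mug", "hand-over", "confirm-receipt"]
executed = ["locate-mug", "grasp-mug", "adjust-grip", "hand-over", "confirm-receipt"]
print(collaboration_score(plan, executed))  # 1.0
```

Even this naive metric illustrates why LLM-guided evaluation is attractive: string-matching step names is brittle, whereas a language model can judge whether an improvised action still serves the shared plan.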
"The current paper is very limited to benchmark testing, and even in natural language processing (NLP) it took a significant amount of time to mature for the real world. Generalizing from a limited laboratory environment to a population of 8.2 billion will be a huge exercise," Ram Palaniappan, CTO of TEKsystems, told me. "Scaling this research into a viable real-world rollout will require a much larger dedicated effort."
To bring these tactile advances to market, Meta has joined forces with GelSight Inc. and Wonik Robotics. GelSight will manufacture the Digit 360 sensor, which is slated for release next year and will give the research community access to advanced tactile capabilities. Wonik Robotics, meanwhile, will handle production of the next-generation Allegro Hand, which integrates Digit Plexus to enable robots to perform complex, touch-driven tasks with a new level of precision. Not everyone, however, is convinced that this progress is a step in the right direction.
"While I still think that adding sensing capabilities can be important for understanding the environment, I believe the current use cases are more about mass-consumer robots and improving how they interact," Agustin Huerta, SVP of digital innovation for North America at Globant, told me. "I don't believe we are close to giving them human-level sensations, nor that it is actually needed. Instead, touch will act more as an additional data point in the decision-making process."
Meta's tactile AI development reflects a broader trend in Europe, where countries such as Germany, France and the UK are pushing the boundaries of robotic sensing and awareness. The EU's Horizon 2020 program, for example, supports a range of projects aimed at advancing robotics, from touch sensing and environmental awareness to decision-making. In addition, the Karlsruhe Institute of Technology in Germany recently introduced ARMAR-6, a humanoid robot designed for industrial environments. ARMAR-6 can wield tools such as drills and hammers, and incorporates AI capabilities that allow it to learn to grasp objects and assist its human colleagues.
But Dr. Peter Gorm Larsen, vice-head of section at the Department of Electrical and Computer Engineering at Aarhus University in Denmark and coordinator of the EU-funded RoboSAPIENS project, warns that Meta may be overlooking a key challenge: the gap between virtual perception and the physical reality in which autonomous robots operate, especially where environmental and human safety are concerned.
"Robots are not intelligent in the same way that living creatures are," he told me. "Technology companies have a moral obligation to ensure that their products respect ethical boundaries. Personally, I am most worried about the potential convergence of such advanced touch feedback with 3D glasses as compact as ordinary eyewear."
Are we ready for robots that "feel"?
Dr. Larsen believes the real challenge is not the AI touch sensors themselves, but their deployment in autonomous settings. "In the EU, the Machinery Directive currently limits the use of AI-based controls in robots. In my view it is too stringent, and we hope to demonstrate that in the RoboSAPIENS project I am currently coordinating."
Of course, robots already work alongside people in various industries around the world. Kiwibot, for example, has helped logistics companies deal with labor shortages in warehouses, and the Swiss company ANYbotics recently raised $60 million to bring more industrial robots to the US, according to TechCrunch. We should expect AI to continue penetrating industries, as "AI accelerates performance in repetitive tasks such as code refactoring, resolving technical debt and technical testing, and transforms how global teams collaborate and innovate," said Vikas Basra, global head of the intelligent engineering practice at Ness Digital Engineering.
At the same time, the safety of these robots – now, and in a potentially "feeling" future – remains a central concern as the industry develops.
Matan Libis, vice president of product at SQream, an advanced data-processing company, told the Observer: "The next important mission for companies will be to establish AI's place in society – its roles and responsibilities … We must clearly define its boundaries and where it really helps. If we don't identify the limits of AI, we will face growing fears about its integration into everyday life."
As AI evolves to incorporate a sense of touch, it raises the question of whether society is ready for robots that "feel". Experts say that software-based superintelligence may hit a ceiling; for AI to achieve a real, advanced understanding, it must sense, perceive and act in our physical environments, combining modalities to gain a deeper understanding of the world – something robots are uniquely suited to achieve. Superintelligence alone, however, does not amount to sentience. "We cannot anthropomorphize a tool to the point of regarding it as a sentient creature if it has not proven that it is capable of sentience," Ahmed explained. "However, if a robot were to pass a sentience test, it should be considered a living, sentient being, and then we would have a moral and fundamental responsibility to grant it certain freedoms and rights as a sentient being."
The implications of Meta's touch AI are significant, but whether these technologies will lead to revolutionary change or cross ethical lines remains uncertain. For now, society is left to ponder a future in which AI not only sees and hears but also touches, transforming our relationship with machines in ways we are only beginning to imagine.
"I don't think that increasing AI's ability to sense crosses an ethical line. It is more about how that sensing is later used to make decisions, or in the decision-making of others," said Huerta. "The robot revolution will be no different from the industrial revolution. It will affect our lives and leave us in a state that, in my opinion, can push humanity to evolve. For that, we must start educating ourselves and the coming generations on how to foster a healthy relationship between humans and robots."