Human Machine Synergy Bridging The Physical And Digital Worlds

Bonisiwe Shabane

EY helps clients create long-term value for all stakeholders. Enabled by data and technology, our services and solutions provide trust through assurance and help clients transform, grow and operate. Discover how EY insights and services are helping to reframe the future of your industry, including how digital twin technology powers the future at Xcel Energy, how Bristol Myers Squibb overhauled working capital to fund its future, and how St James’s Hospital's journey to cloud transformed cancer care.

Work in the future will be a partnership between people, agents, and robots—all powered by artificial intelligence. While much of the current public debate revolves around whether AI will lead to sweeping job losses, our focus is on how it will change the very building blocks of work—the skills that underpin... Our research suggests that although people may be shifted out of some work activities, many of their skills will remain essential. They will also be central in guiding and collaborating with AI, a change that is already redefining many roles across the economy. In this research, we use “agents” and “robots” as broad, practical terms to describe all machines that can automate nonphysical and physical work, respectively. Many different technologies perform these functions, some based on AI and others not, with the boundaries between them fluid and changing.

Using the terms in this expansive way lets us analyze how automation reshapes work overall.1 Our analysis considers a broader range of automation technologies than the narrow definition of agents commonly used in the AI... For more on how we define the term, see the Glossary. This report builds on McKinsey’s long-running research on automation and the future of work. Earlier studies examined individual activities, while this analysis also looks at how AI will transform entire workflows and what this means for skills. New forms of collaboration are emerging, creating skill partnerships between people and AI that raise demand for complementary human capabilities. Although the analysis focuses on the United States, many of the patterns it reveals—and their implications for employers, workers, and leaders—apply broadly to other advanced economies.

We find that currently demonstrated technologies could, in theory, automate activities accounting for about 57 percent of US work hours today.2 Our analysis focuses exclusively on paid productive hours in the US workforce, encompassing full-time... We assess only the share of time awake that is spent on work-related activities, totaling roughly 45 percent of waking hours. Our analysis excludes time spent on unpaid tasks and leisure, but agents and robots could be used in related activities to support productivity and personal well-being. This estimate reflects the technical potential for change in what people do, not a forecast of job losses. As these technologies take on more complex sequences of tasks, people will remain vital to make them work effectively and do what machines cannot. Our assessment reflects today’s capabilities, which will continue to evolve, and adoption may take decades.

University of Maryland Assistant Professor of Computer Science Huaishu Peng focuses his research on bridging the gap between digital and physical interactions. His work in human-computer interaction (HCI) explores how devices such as small robots, wearable technologies and haptic interfaces can make computing more tangible and accessible. In this Q&A, Peng discusses his path into computer science, his research directions and how his work contributes to sustainability and new ways of connecting humans and machines. Was there a defining moment that shaped your career path into computer science? A few things led me to computer science. I have a bachelor’s degree in the field, but my interest really began earlier, in middle and high school, when I spent a lot of time playing computer games.

I thought it would be exciting to learn how to create those games myself, and that’s what first drew me toward programming. As I progressed, my focus evolved. What I do now isn’t the traditional idea of computer science, like writing algorithms or building systems entirely within the digital world. My research extends beyond that into what we call human-computer interaction and human-robot interaction. It’s about exploring where the digital and physical worlds meet and how people can engage with computational systems in new ways. In our previous post, Transforming the physical world with AI: the next frontier in intelligent automation, we explored how the field of physical AI is redefining a wide range of industries including construction, manufacturing,...

Now, we turn our attention to the complete development lifecycle behind this technology – the process of creating intelligent systems that don’t just follow instructions, but truly partner with humans by collaborating, anticipating requirements,... To illustrate this workflow in action, we’ll explore how Diligent Robotics applies physical AI principles to develop mobile robots that assist clinical teams in hospital settings. We’ll also share key considerations for business leaders looking to implement physical AI solutions that can improve both their operations and customer experiences. The relationship between humans and machines is undergoing a profound transformation. What began as simple tools under direct human control has evolved into sophisticated partnerships where intelligent machines can understand context, interpret intentions, and make autonomous decisions. The term physical AI describes a system that is interactive and iterative.

Physical AI is a process in which elements work together in various patterns to understand, reason, learn, and interact with the physical world. At each step of the autonomy flywheel, elements are continuously learning and improving to feed the next step in the journey. The process begins with understanding: models and algorithms are integrated with sensors and with real-world and simulated data, and these datasets form the basis for reasoning. Next, a reasoning model predicts actions that will be realized in the physical world in real time. But the process for these intelligent systems doesn’t stop there – they must continuously learn through iterative feedback loops to improve the system’s overall performance.
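The understand-reason-act-learn cycle described above can be sketched in miniature. The following is a toy illustration only, with every component a hypothetical stand-in: "understanding" is reduced to a bias correction over sensor readings, "reasoning" to a proportional policy, and "learning" to recording feedback errors for the next cycle.

```python
from dataclasses import dataclass, field

@dataclass
class AutonomyFlywheel:
    """Toy sketch of the understand -> reason -> act -> learn cycle.
    All names and logic here are illustrative, not a real physical-AI stack."""
    experience: list = field(default_factory=list)

    def understand(self, sensor_reading):
        # Fuse the raw reading with what past feedback has taught us.
        bias = sum(self.experience) / len(self.experience) if self.experience else 0.0
        return {"estimate": sensor_reading - bias}

    def reason(self, state):
        # Predict an action from the current state estimate.
        return -0.5 * state["estimate"]

    def act_and_learn(self, action, outcome_error):
        # Feedback loop: record the error so the next cycle improves.
        self.experience.append(outcome_error)

# One turn of the flywheel: understand -> reason -> act -> learn.
loop = AutonomyFlywheel()
state = loop.understand(sensor_reading=2.0)
action = loop.reason(state)
loop.act_and_learn(action, outcome_error=0.1)
```

The point of the structure, not the arithmetic, is what matters: each pass through the loop leaves behind data (`experience`) that changes how the next pass perceives the world.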

Correspondence: maros.krupas@tuke.sk (M.K.); c.liu16@aston.ac.uk (C.L.) Received 16 February 2024; revised 14 March 2024; accepted 28 March 2024; collection date April 2024. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). With the intent to further increase production efficiency while making humans the centre of its processes, human-centric manufacturing focuses on concepts such as digital twins and human–machine collaboration. This paper presents enabling technologies and methods to facilitate the creation of human-centric applications powered by digital twins, also from the perspective of Industry 5.0.

It analyses and reviews the state of relevant information resources about digital twins for human–machine applications, with an emphasis on the human perspective but also on their collaborative relationship and the possibilities of their... Finally, it presents the results of the review and expected future directions of research in this area.

Keywords: human–machine collaboration, digital twin, human-centric, enabling technologies and methods, Industry 5.0

Sustainable manufacturing remains a central objective of Industry 5.0. By successfully implementing harmonious human-robot teams in intelligent industrial systems, the efficiency and well-being of human workers can be increased.

Achieving this requires a gradual progression from caged robots to advanced, seamless collaboration between humans and robots. Initially, that means transitioning to human-robot interaction (HRI), where commands are exchanged between the human and the robot. Further advances in safety, including collision avoidance through advanced machine vision, enable the shared workspace that defines human-robot collaboration (HRC). The next stage is physical HRC (pHRC), which requires a safe and controlled exchange of forces through impedance and admittance control. Finally, this paper describes human-robot teaming (HRT), defined by the exchange of solutions between teammates. This is enabled by combining cutting-edge technologies such as digital twins (DT), advanced vision sensors, machine learning (ML) algorithms and mixed reality (MR) human–machine interfaces for operators.
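To make the force-exchange stage of pHRC concrete, here is a minimal sketch of a one-degree-of-freedom admittance controller: the robot measures an external force and moves as if it were a virtual mass-damper-spring system, so a human push produces compliant motion rather than a rigid position hold. The parameter values (M, B, K, dt) are illustrative choices, not drawn from the paper.

```python
def admittance_step(f_ext, x, v, M=2.0, B=20.0, K=50.0, dt=0.01):
    """One integration step of a 1-DOF admittance controller.

    The controller renders the virtual dynamics
        M*a + B*v + K*x = f_ext,
    solving for acceleration and integrating to get the commanded motion.
    M, B, K are illustrative virtual mass, damping, and stiffness values."""
    a = (f_ext - B * v - K * x) / M   # solve the virtual dynamics for acceleration
    v = v + a * dt                    # semi-implicit Euler integration
    x = x + v * dt
    return x, v

# A constant 10 N push: the virtual spring yields until K*x balances the
# force, so the position settles near f_ext / K = 0.2 m.
x, v = 0.0, 0.0
for _ in range(2000):  # 20 s of simulated time
    x, v = admittance_step(10.0, x, v)
```

Tuning B relative to M and K sets how "heavy" the robot feels to the operator; the values above give a critically damped response, which avoids oscillation when the human releases the robot.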

A key contribution of this work is its review of the integration of HRT with DT and ML, highlighting how these technologies enable seamless perception, prediction, and decision-making in human-centric industrial systems. By reviewing these technologies, the paper highlights current challenges, limitations and research gaps within the field of HRT and suggests potential future possibilities for HRT, such as advanced disassembly of used goods for a... Throughout their evolution, humans have sought ways to ease tedious and hard work. This began with the development of tools, before animals became a central part of our lives; their contribution to agriculture was crucial for our growth as a species.

Horses were used for ploughing land and for transportation. Later on, windmills were used to harness wind energy for heavy tasks such as grinding grain, pumping water and sawing wood. The steam engine initiated the first industrial revolution, before the electric motor and the combustion engine enabled mass production and assembly lines, hence the transition to Industry 2.0 (Mokyr & Strotz, 1998). During the 20th century there was rapid growth of factories around the globe with large and powerful machinery. The enabling technologies of Industry 3.0 (information technology and electronics) meant that these machines could be automated, thereby increasing efficiency (Taalbi, 2019). In the 21st century, technological concepts such as the internet of things (IoT), big data, artificial intelligence (AI), and digital twins (DT) emerged, enabling the transition to Industry 4.0 (Xu et al., 2021).

However, what is common to all industrial revolutions so far is the separation between humans and machines, due to the power, determination and lack of cognitive awareness present in most machines today. But just as we tamed animals early in human evolution, Industry 5.0 aims to bring humans and machines closer together by building safer robots and intelligent sensory systems. Furthermore, Industry 5.0 places a strong emphasis on sustainable manufacturing, aiming to minimise waste, conserve resources, and reduce the environmental footprint of production. This shift towards sustainability is not just an ethical imperative but also a strategic advantage for businesses, as it can lead to cost savings, enhanced brand reputation, and increased resilience in the face of... Humanities and Social Sciences Communications, volume 12, Article number: 691 (2025). This study explores whether experience with AI tools and the intensity of their use influence individuals’ adoption of ChatGPT in the Czech Republic.

Using data from 1232 respondents (aged 15+), collected via a quota-based online survey from April 8 to April 26, 2024, logistic regression analyses investigated two key questions: (1) Does increased use of virtual assistants... and (2) Does frequent ChatGPT usage predict more intensive engagement with other AI tools? Findings confirm that people who use voice assistants or chatbots more often are significantly more likely to try ChatGPT, and vice versa. A preference for text-based assistants also correlates positively with ChatGPT adoption. Unexpectedly, a generally positive outlook on AI across sectors (banking, healthcare, customer service) does not always translate into ChatGPT usage, implying that trust or scepticism can be context-specific. Another notable insight is that ethical concerns and a strong preference for human contact consistently dampen ChatGPT uptake, suggesting that perceived privacy risks remain a critical barrier.
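The study's actual model specification and survey items are not reproduced here, but the shape of such a logistic regression analysis can be sketched on simulated data. Everything below is hypothetical: the predictor names, the planted coefficients, and the outcome are invented purely to illustrate how adoption might be regressed on usage and attitude variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1232  # matches the survey's sample size; all values below are simulated

# Hypothetical predictors on 0-4 scales (not the study's real variables):
assistant_use = rng.integers(0, 5, n).astype(float)    # frequency of voice/chatbot use
ethical_concern = rng.integers(0, 5, n).astype(float)  # strength of ethical concerns

# Simulated outcome: adoption rises with assistant use, falls with concern.
true_logit = -1.0 + 0.8 * assistant_use - 0.6 * ethical_concern
adopted = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit a logistic regression by plain gradient descent on the log-loss.
X = np.column_stack([np.ones(n), assistant_use, ethical_concern])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted adoption probability
    w -= 0.01 * X.T @ (p - adopted) / n    # average gradient step

# w[1] comes out positive and w[2] negative, mirroring the reported pattern
# that assistant use predicts adoption while ethical concerns dampen it.
```

In practice one would use a statistics package that also reports standard errors and odds ratios, since the substantive claims in the study rest on significance tests, not just coefficient signs.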

These results highlight the importance of digital synergy in AI adoption. Policymakers and industry stakeholders can use these insights to develop targeted strategies for fostering inclusive, ethical, and sustainable digital transformation. Technological development advances at a pace that outstrips human understanding, and the use of technology is self-evident in all aspects of human life. Technological innovations shape human values and vice versa (Friedman and Ormiston, 2022). Human-centred technology design has shifted towards value-centred design. We currently do not know what impact AI technologies will have on people’s quality of life, now or in the future (Ayling and Chapman, 2022; Makridakis, 2017).

Nowadays, it is very difficult to estimate where the boundaries of AI development in society will be laid, what ethical questions it will bring, the expected (feared) creative destruction, shifts... 2024). Supporting the development of AI across the whole population as part of the digital transformation process will not be possible without the systematic involvement of the institutional sphere. Differences in initial AI literacy related to age and education will create a knowledge gap, influenced by people’s perceived relationship to technology. Hence, quite different mechanisms for supporting AI literacy may be active in different age groups (for instance, as part of educational processes, or the inclusion of AI tools in the work... The accessibility of digital technologies and AI tools creates many experiential effects that can influence the adoption rate of AI tools in people’s daily lives (Cheng and Jiang, 2020).
