What is VR and what is AR? We introduce these key technologies, identified as industrial digitalisation technologies (IDTs) that could play a significant role in reviving flatlining productivity levels and enabling faster growth for UK manufacturing.

The 2017 Made Smarter Review, launched by Juergen Maier at the AMRC’s Factory 2050, identified Virtual Reality (VR) and Augmented Reality (AR) as specific industrial digitalisation technologies (IDTs) that could play a significant role in reviving flatlining productivity levels and enabling faster growth for UK manufacturing.

But what is VR and what is AR? How can these technologies be used in manufacturing, and who is currently using them? In this article we aim to introduce these key technologies, give examples of where and how they have been used in manufacturing, and show how they could have the impact that Made Smarter anticipated.

What is AR / VR / MR?

Virtual Reality (VR) and Augmented Reality (AR) are often spoken about in the same breath, but they are quite different things. Both offer the user a virtualised, computer-generated experience. However, where VR immerses the user in an entirely virtual environment, AR provides information overlaid on, or contextualised within, the real world.

Typically, VR uses a completely enclosed headset, such as the Oculus Rift, with images projected to both eyes. In contrast, AR offers an augmented view of the real world, generally achieved through a graphical overlay placed on the world around the user. An early example is the head-up display in fighter jets, a technology now widely available in many models of car, where it is used to project vehicle speed and other information into the driver’s field of view.

These virtual experiences have been extended in recent years through the capture and integration of additional sensory information from the user, or by embedding more of the real world within the experience. This is often achieved through supplementary sensory feedback, such as audio or haptics, which enhances the experience and often increases the level of user immersion.


How can they be used?

VR is a well-established technology, having seen extensive development over the last three decades. Initially VR was the domain of large corporations, but it has more recently spread into gaming and entertainment. Whereas in the 1990s the cost of entry ran to hundreds of thousands of pounds, today VR equipment is readily available at every price point, ranging from simple smartphone-based holders costing about £5, such as Google Cardboard, to fully immersive, high-fidelity room-scale VR costing hundreds of thousands of pounds.

AR and its sister technology Mixed Reality (MR) are newer than VR and have thus seen less extensive development. However, AR can be delivered without dedicated hardware by utilising existing technologies: smartphones and tablets contain much of the sensing, processing and display hardware required for rudimentary AR experiences. The ready availability, low cost and familiarity of these devices make them ideal for deploying AR-based solutions. The ‘Pokémon Go’ smartphone game is probably the best-known application of AR. In this game Pokémon characters are overlaid onto the real world at specific locations, allowing the player some basic real-world interaction. A more recent mainstream application of AR is the delivery of navigation information on smartphones, such as through Google Maps.

However, AR is not limited to smaller mobile devices. It can also go big. AR can be projected using high-powered projectors, an approach already used in the arts where images can be mapped and projected onto large structures such as buildings to produce visual effects for outdoor shows. 

Dedicated AR and MR head-mounted displays (HMDs) generally offer a more complete AR experience than projectors, tablets or smartphones. HMDs are often better able to track the position of the user, offer hands-free operation and interaction, and can be completely mobile. This is achieved through specialist sensing and display hardware that accurately tracks the user’s position, records user behaviour and displays virtual items over much of the user’s field of view (FOV). AR headsets are available as either minimalist monocular devices or more comprehensive binocular devices, and are thus suited to different tasks and environments. Lightweight devices have little impact on the user’s normal view, generally offering a small display FOV, and can even be attached to a user’s spectacles or safety glasses. More comprehensive devices offer greater immersion, with richer delivery of information, sensing and interaction, but obscure much more of the FOV. These devices tend to be heavier, overlay images over more of the user’s FOV, and include a range of sensors and more powerful on-board processing hardware.

The latest developments in Mixed Reality (MR) deployment are commonly centred on improving the spatial tracking of the device and the user, which will allow for semi-automated inspection processes and the tagging of parts and assets. For example, using the on-device camera and depth-sensing feeds available on some MR HMDs, the locations of items on a work piece can be detected using Artificial Intelligence (AI) and compared with matching CAD information, with incorrectly placed or missing items visually highlighted to the user.
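To make the comparison step concrete, the sketch below shows one simple way it could work, purely as an illustration and not a description of any specific product: detected item positions from a vision pipeline are checked against nominal CAD positions, and each item is classified as correct, misplaced or missing. All names, coordinates and the tolerance are invented for the example.

```python
import math

# Nominal item positions from CAD (mm, in the work piece frame) -- illustrative values
cad_items = {"bolt_a": (10.0, 25.0), "bolt_b": (60.0, 25.0), "bracket": (35.0, 80.0)}

# Positions reported by the vision system -- 'bracket' missing, 'bolt_b' out of place
detected = {"bolt_a": (10.4, 24.8), "bolt_b": (72.0, 25.1)}

TOLERANCE_MM = 2.0  # assumed acceptance tolerance

def check_assembly(cad, seen, tol):
    """Classify each CAD item as ok, misplaced or missing."""
    report = {}
    for name, (x, y) in cad.items():
        if name not in seen:
            report[name] = "missing"
        else:
            dx, dy = seen[name][0] - x, seen[name][1] - y
            report[name] = "ok" if math.hypot(dx, dy) <= tol else "misplaced"
    return report

print(check_assembly(cad_items, detected, TOLERANCE_MM))
# → {'bolt_a': 'ok', 'bolt_b': 'misplaced', 'bracket': 'missing'}
```

In a real system the detected positions would come from the HMD’s camera and depth feeds, and the flagged items would be highlighted in the user’s view rather than printed.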


VR and AR in manufacturing: an introduction


VR in manufacturing

VR has three key areas of direct application in manufacturing. The first and most established is to allow visualisation of what does not yet exist, and it is in this area that many readers may have come across it. The creation of virtual prototypes really took off in the early 2000s, when computers became powerful enough to render scenes containing products or environments with a convincing level of realism. Since then, the technology has moved on considerably, with near photo-realistic scene generation commonly used within the architectural and design industries to assess factory designs and layouts before construction and hardware procurement begin.

An example of the use of VR in virtual prototyping comes from the Ford Motor Company. Ford has developed the Ford Immersive Vehicle Environment (FIVE), a VR-based tool that allows design engineers to inspect new car models before they are built. A physical room containing just a seat and steering wheel is transformed, via a VR headset and gloves, to display highly detailed, life-size 3D vehicles generated from CAD. The virtual cars can be test driven, walked around and fully inspected. The virtual prototypes are much cheaper to generate and update than a physical model, and the vehicles can be viewed by engineers from anywhere in the world with VR hardware (https://youtu.be/eJx7T-H00SE).

The second area is the use of virtual reality to de-risk change. Visualisations of reconfigured production lines, or of new layouts for existing premises, where stakeholder feedback may be needed, are increasingly common. It is now routine to engage a workforce in a virtual scenario to gather first-hand input from the people who will actually be on that shop floor.

The third area in which VR is seeing extensive uptake within manufacturing is training, assessment and certification. Training in a virtual environment achieves many things: it is less risky than using real-world processes, it is highly repeatable, it has the potential to reduce the use of materials during the training phase and, most importantly, it can be gamified. Gamification means setting a scoring metric against a repeatable scenario, allowing virtual environments to be used to introduce, teach and then evaluate worker progression. It also enables peer-group competition to drive self-learning, thus accelerating the uptake of new skills or the understanding and refinement of new scenarios.
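A scoring metric of this kind can be very simple. The sketch below is purely illustrative; the weights, penalties and attempt values are assumptions for the example, not taken from any real training system.

```python
def training_score(time_taken_s, target_time_s, errors, safety_violations):
    """Combine speed, accuracy and safety into one repeatable 0-100 score.
    Weights and penalty values are illustrative assumptions."""
    speed = max(0.0, min(1.0, target_time_s / time_taken_s))  # 1.0 = on or under target
    score = 100.0 * speed - 10.0 * errors - 25.0 * safety_violations
    return max(0.0, round(score, 1))

# Repeated attempts at the same scenario can then be ranked against peers
attempts = [
    training_score(300, 240, errors=2, safety_violations=0),  # slow run with two errors
    training_score(230, 240, errors=0, safety_violations=0),  # clean run, under target
]
print(attempts)  # → [60.0, 100.0]
```

Because the scenario is identical every time, scores like these are directly comparable between trainees and across repeat attempts, which is what makes the peer competition meaningful.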

Multinational corporation 3M offers a package of VR safety training, designed to prevent trips and falls at work and to reinforce safe behaviour. The VR system allows supervisors to walk around a virtual site to identify hazards and safety violations and to inspect workforce PPE. Example tasks are also included, such as installing a steel beam or corrugated flooring. The benefit of this virtual training is that the student is not exposed to any real hazards, and the system allows multiple training courses and scenarios to be completed without travelling between sites (https://youtu.be/JcElnVCaEzk, https://www.3m.com/3M/en_US/worker-health-safety-us/3m-ppe-training/virtual-reality/).

General Electric is using VR to train new staff to operate the cranes that install high-voltage grid components. Training times have been reduced from days to just 20 minutes, removing the need for physical training hardware and associated support, although the final stages of training are still completed on-site with physical hardware. This shows how VR can move training into safer environments: students can be trained inside an office or even at home, away from industrial hazards such as high voltages (https://www.ge.com/news/reports/virtual-reality-heavy-lifting-grid-technicians-training).


AR in manufacturing

The role of AR in manufacturing is, as yet, less well refined. Nonetheless, the range of use cases for the technology is expanding every day. The most common include remote support, data overlays, hands-free digital work instructions and assisted inspection. Remote support is probably the simplest, and yet likely the most effective, use case of AR. Assuming a minimum level of network connectivity, a remote viewer can ‘look through the eyes’ of the person needing assistance and annotate their view with instructions or comments. This makes it easier to convey complex instructions and, if the operator is wearing an HMD, the guidance can be implemented immediately. It also enables a one-to-many relationship between experts and workers in the field, mitigating some current skills shortages. Given the ongoing pandemic, with the need to socially distance and minimise travel, remote support is likely to be especially important at the current time. The experience can be further enhanced by the remote expert using VR, giving them a view and experience closer to that of the factory operator.

A notable example of AR being used to upskill workers was seen during the recent VentilatorChallengeUK, a nationwide response to the pandemic-induced need to manufacture ventilators at a previously unimaginable scale and speed. The Microsoft HoloLens 2 was deployed across a number of assembly sites tasked with manufacturing thousands of ventilators. The technology allowed aerospace and automotive operators to be reskilled to assemble ventilators, receiving step-by-step instructions about the devices they were assembling, in the context of the assembly station they were working at, all while working hands-free. These devices also enabled experts to remotely ‘dial in’ to provide individualised one-to-one support in real time.

The delivery of contextualised data to operators on the factory floor is another prime use case for AR. A viewer can look at a piece of manufacturing equipment and see real-time data feeds delivered in the spatial context of the equipment itself, providing direct monitoring feedback. This data could be low-level information from the machine, such as spindle speeds or temperatures fed from connected sensors, or could be processed to give higher-level information, such as current production or failure rates. The AR system can be extended to offer a virtual control panel, allowing the user to interact with higher-level systems, such as control systems, or to update a manufacturing execution system (MES).
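In software terms, such an overlay is essentially a mapping from machine telemetry to display elements anchored at the equipment’s position in space. The minimal sketch below illustrates the idea only; the machine name, field names, anchor coordinates and warning threshold are all assumptions for the example.

```python
def build_overlay(machine_id, telemetry, anchor_xyz):
    """Turn raw machine telemetry into AR overlay items anchored in space.
    Field names and the warning threshold are illustrative assumptions."""
    items = [
        {"anchor": anchor_xyz, "text": f"{machine_id}: {telemetry['spindle_rpm']} rpm"},
        {"anchor": anchor_xyz, "text": f"Temp: {telemetry['temp_c']} C"},
    ]
    if telemetry["temp_c"] > 70:  # assumed over-temperature threshold
        items.append({"anchor": anchor_xyz, "text": "WARNING: over-temperature"})
    return items

# Anchor position (metres, in the tracked room frame) would come from spatial tracking
overlay = build_overlay("CNC-01", {"spindle_rpm": 12000, "temp_c": 74.5}, (1.2, 0.8, 2.0))
```

An AR runtime would then render each item at its anchor so the data appears attached to the machine; the same structure could carry processed, higher-level figures such as failure rates.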

A specific application of contextualised data delivery via AR is digital work instructions. This is especially important for flexible manufacturing, a cornerstone of Made Smarter. An AR HMD can be used to deliver the latest version-controlled work instructions, tailored to each operator's competencies. These instructions can be interactive, offering the user more or less information on specific tasks. The AR device can also be used to request inspections, report issues and report real-time work progress, allowing for dynamic process planning. Many of the more comprehensive mixed reality HMDs are able to display 3D CAD models spatially anchored onto the current work piece, which can simplify complex multi-component assembly or be used to guide inspection. Boeing has developed and assessed an AR-based tool that achieved a 90 per cent improvement in first-time quality on a wiring loom assembly task when using an interactive AR model, compared with using 2D information.
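Tailoring instruction detail to operator competency can be modelled very simply: each step carries content at more than one level of detail, and the delivery system filters by the operator’s recorded skill level. The sketch below is an illustration only; the step text, version number and competency levels are invented for the example.

```python
# Each step offers a terse and a detailed variant; novices receive the detailed one.
WORK_INSTRUCTIONS = {
    "version": "1.4.2",  # illustrative version-controlled release
    "steps": [
        {"id": 1, "terse": "Fit loom to bracket A",
         "detailed": "Route the wiring loom along channel 3 and clip to bracket A"},
        {"id": 2, "terse": "Torque M8 bolts to 25 Nm",
         "detailed": "Torque the four M8 bolts to 25 Nm in a cross pattern"},
    ],
}

def instructions_for(operator_level, doc):
    """Select terse text for experienced operators, detailed text otherwise."""
    key = "terse" if operator_level == "experienced" else "detailed"
    return [(step["id"], step[key]) for step in doc["steps"]]

steps = instructions_for("novice", WORK_INSTRUCTIONS)
```

Because every operator pulls from the same version-controlled document, updating the release once updates what every headset displays, which is part of the appeal over paper instructions.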


AR is also seeing earlier integration into manufacturing equipment, even at the equipment installation stage. For example, Stahlwille and a German start-up, oculavis, developed an AR solution which not only took the user through the set-up instructions for a torque wrench and the bolt torquing sequence, but also gave a heads-up display of the torque values before automatically writing them to the production planning and control system. Over the next few years, this type of integrated system will become more commonplace on the shop floor and will even be embedded in the audit trail, as it captures not just the bolt torque values but also the identity of the operator. What is more, through the use of integrated systems, the operator could even pull up a virtual version of a calibration certificate to check that the equipment is still within calibration. Having said that, the augmented experience is simply displaying values captured from embedded systems within the tool, connected to the headset through Bluetooth or other communication protocols.


In 2017 Thyssenkrupp, a manufacturer of home mobility solutions, announced plans to equip its sales people with Microsoft HoloLens AR HMDs to measure staircase dimensions during site visits. The HMD would then automatically share the staircase dimensions with manufacturing teams to reduce data-entry time, transcription errors and decision-making time. Many AR devices, and even some mobile phones, now offer the capability to measure in augmented space. However, this technology is still quite immature, and measurements made in an ‘app’ suffer in many ways, from problems with positioning the point to be measured to the accuracy of the sensing technology itself, with different phone types producing widely differing values. This technology is expected to improve quite rapidly, especially with the increasing availability of more accurate sensing technologies such as miniaturised solid-state Light Detection and Ranging (LiDAR) and multi-camera systems. However, some underlying challenges in the interfaces remain.

Augmented Electrical Cabinet: The AR electrical cabinet demonstrator featured a Microsoft HoloLens HMD, used to deliver work instructions to an operator performing wiring commissioning checks on an electrical cabinet. The project highlighted the use of digital work instructions for improved productivity and quality. The system integrated a voice-activated multimeter, allowing operatives to view meter readings on the HMD, record readings and maintain complete focus on the task. Test results could be taken directly from the device, rather than having an operator write them down. Time savings in excess of 30 per cent were achieved compared with an installer using paper-based instructions.

AR doesn’t always require customised hardware such as HMDs, which can come with a hefty initial investment. Older, readily available technologies such as optical or laser projectors are an option too. These can be used to project diagrams, work instructions and other information directly onto a work piece. This approach keeps the workforce hands-free and requires little training and up-skilling.

Projected AR for construction: This use case was conducted with a major player in the construction sector and featured the use of optical projection systems to assist operatives during the ‘setting out’ process, whereby work instructions are projected directly onto the fabric of a building. The AMRC worked with the company to ensure that data created during the design phase of a building could be exploited on-site with minimal processing steps. The system enabled operators to work with unprecedented speed (~50 per cent time savings compared with using paper-based drawings) and ensured a strong link between the design intent of the architects and the construction teams. The output of the project is a system that is intended to be rolled out across existing and future construction projects.

Video: https://www.youtube.com/watch?v=6EXu1t8KpEw&t=191s 

Case Study: https://www.amrc.co.uk/files/document/292/1557330103_SmartSet_case_study.pdf


So where next?

While VR and AR technologies have existed in various forms for decades, it is only now that the enabling technologies are evolving to a state where there are no longer major barriers to their adoption and deployment. The advent of 5G will enable greater numbers of fully mobile devices to be connected, with lower latencies and higher bandwidth. The evolution of artificial intelligence (AI)-driven software platforms for VR and AR content authoring will enable companies to adopt the technologies without the previously eye-watering budgets for content creation. The development of industry standards will allow greater interoperability, so that hardware is no longer the bottleneck. And on that hardware, greater processing power, improved power consumption and next-generation sensors at consumer prices will open the potential for adoption of these technologies at scales of which the pioneers could only dream.

The future is bright, the future is digital…


Dr Luke Boorman is a lead engineer in Factory 2050 at the University of Sheffield’s Advanced Manufacturing Research Centre (AMRC). He obtained his PhD in neuroscience and an MEng degree in systems control engineering from the University of Sheffield. His research experience covers a range of disciplines, from medical imaging to industrial robotics, where he has specialised in the development, teaching and application of software solutions for research problems. His publication record reflects his research interests in neuroscience, artificial intelligence and robotics.


Mike Lewis is Theme Lead for Digital in Factory 2050 at the University of Sheffield’s Advanced Manufacturing Research Centre. Mike joined the AMRC after graduating from Sheffield Hallam University with a master’s degree in mechanical engineering in 2015. He then spent three years as a software developer, designing and developing digital work instruction and computer vision solutions for assembly, training and quality control. The portfolio of projects he is currently responsible for spans topics such as augmented and virtual reality for the delivery of assembly and maintenance instructions; virtual reality for human process optimisation, layout planning and training; and the application of the industrial internet of things.