Sanjay Jhawar, Co-Founder and Chief Strategy Officer, RealWear
The RealWear Navigator™ 500 solution offers a frontline connected worker platform for integrating multiple assisted reality and augmented reality experiences, including SLAM-based spatial mapping, into a high-performance industrial solution
RealWear Navigator™ 500 solution is our all-new head-mounted device product platform specifically designed to engage, empower and elevate the frontline worker for the next several years. Building on the accumulated experience of the last four years, working with 5000 enterprise customers in 60 countries with solutions based on our HMT-1™ and HMT-1Z1™ platforms, this new product brings targeted innovation in all the key areas that matter most to achieving solid results at scale.
RealWear is known for establishing major customer deployments of frontline worker solutions based on “assisted reality”. The core concept of assisted reality is that it makes a different tradeoff than mixed reality, one better suited to the majority of industrial use cases where user safety is paramount. The goal of assisted reality is to keep the user’s attention in the real world, with a direct line of sight that is for the most part unoccluded by digital objects or “holograms” that require extra cognitive focus to process.
Situational awareness of moving machinery, approaching forklifts or other vehicles, steam escape valves, slip and trip hazards, and electrical and chemical hazards is key for RealWear’s customers. These are the same working environments that mandate specific personal protective equipment, from safety glasses and goggles to hard hats, hearing protection, heavy gloves and even respirators. Users in these situations mostly require both hands to be available for the use of tools and equipment, or to hold on to railings, ropework, etc. In turn, the user interface for assisted reality cannot rely on the availability of hands to operate handheld controllers or to draw gestures in the air. RealWear’s assisted reality solutions rely on voice recognition that is field-proven in very high-noise environments, plus the minimal use of head-motion detection. The platform uses a single articulated micro-display, easily adjusted to sit below the dominant eye, that does not obstruct direct vision and provides the user a view similar to a 7-inch tablet screen at arm’s length.
A core concept of mixed reality has been the placement of virtual 3D digital objects overlaid on the physical world – such as 3D models or animations. This requires two stereoscopic see-through displays that are brought to a point of focus that typically is not in the same plane as the real-world object. The resulting vergence-accommodation conflict – where the greater convergence of the eyes when looking at near objects is in conflict with the focal distance, or accommodation of the eye’s lens, needed to bring the digital image into focus – is a source of eyestrain, discomfort and, in some cases, headaches after extended use. In addition, in bright conditions, especially outdoors, mixed reality displays struggle to provide sufficient contrast with the real world, so they must either cut a significant amount of light from the real world using darkened glass or generate such a bright display that battery life is very short unless the device is tethered by a cord to a separate battery pack. Both situations contribute to eyestrain with extended use.
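The mismatch at the heart of the vergence-accommodation conflict can be made concrete with a little arithmetic. The sketch below uses illustrative numbers (not measurements of any particular headset): a virtual object rendered to appear 0.5 m away on optics with a fixed 2.0 m focal plane.

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle (degrees) between the two eyes' lines of sight when fixating
    a point at distance_m. ipd_m is an assumed ~63 mm interpupillary distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

def accommodation_diopters(distance_m):
    """Accommodative (focus) demand in diopters, i.e. 1 / distance in metres."""
    return 1.0 / distance_m

# Eyes converge as if the object were 0.5 m away...
vergence_demand = vergence_angle_deg(0.5)          # ~7.2 degrees
# ...but the eye's lens must focus at the display's fixed 2.0 m focal plane.
conflict_d = accommodation_diopters(0.5) - accommodation_diopters(2.0)
# conflict_d = 2.0 - 0.5 = 1.5 diopters of vergence-accommodation mismatch
```

Mismatches of this magnitude, sustained over a shift, are what the eyestrain literature associates with discomfort in stereoscopic displays.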
However, mixed reality applications do allow information to be overlaid directly on the real-world asset, which in some use cases provides an additional boost in productivity by identifying the item to be worked on.
So how could this tradeoff be solved? Is it possible to tag or overlay information on the real 3D world while also maintaining safety, situational awareness, low eyestrain, hands-free use and full-shift battery life?
We’ve long believed that the answer lies in amping up the amount of “assistance” in assisted reality rather than solely focusing on the amount of reality, with power-hungry, wide-field-of-view, super-bright, stereoscopic, transparent, ultra-high-resolution displays. With advanced camera capabilities and computer-vision processing, key information about real-world assets can be placed on the camera view shown in the single, monocular, non-see-through (opaque) display.
With the launch of the RealWear Navigator 500 solution, RealWear’s second generation assisted reality platform, we are pleased to showcase the future happening today. Our long-time partner, Transition Technologies PSC has, in collaboration with us, enhanced its SkillWorx connected worker application for the new device.
Transition Technologies PSC has combined several capabilities from their computer vision and application software platform with newly enhanced capabilities in RealWear Navigator 500 solution.
RealWear Navigator 500 solution has a 48-megapixel camera system that delivers excellent low-light video capture by combining four sensor pixels into one (4-to-1 pixel binning), plus the ability to zoom on the sensor itself to magnify the image while still retaining high-quality output for still-image capture. In addition, image stabilization has been further enhanced, leveraging additional accelerometers in the device to better sense its orientation and movement during video capture.
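On a real sensor, pixel binning happens in hardware, but the idea is easy to sketch in software: each output pixel averages a 2×2 block of input pixels, trading resolution for light gathered per pixel (and thus lower noise). The function below is purely illustrative and not RealWear's implementation.

```python
def bin_2x2(pixels):
    """Average each non-overlapping 2x2 block of a grayscale frame,
    halving resolution in each dimension (4-to-1 binning)."""
    h, w = len(pixels), len(pixels[0])
    assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
    return [
        [
            (pixels[r][c] + pixels[r][c + 1]
             + pixels[r + 1][c] + pixels[r + 1][c + 1]) / 4.0
            for c in range(0, w, 2)
        ]
        for r in range(0, h, 2)
    ]

frame = [
    [10, 12, 200, 202],
    [14, 16, 198, 204],
    [ 8, 10,  50,  52],
    [12, 14,  54,  48],
]
binned = bin_2x2(frame)
# binned == [[13.0, 201.0], [11.0, 51.0]] -- each value averages four pixels,
# so random sensor noise is reduced while the scene's structure is kept.
```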
SkillWorx brings computer vision to the edge, combining video application software running on RealWear Navigator 500 solution with powerful GPU servers running either on-premises or in the cloud. In real time, points in the video feed are recognized as belonging to a real-world object and tracked from frame to frame, building a local 3D map – a point cloud – from 2D video. Once the system begins tracking, the positions of real objects are recognized and identified even as the field of view and position of the head-mounted camera change with the user’s movements. Altogether, this is an implementation of simultaneous localization and mapping, or SLAM.
“At the very beginning of our partnership with RealWear, we decided to focus on Simultaneous Localization And Mapping (SLAM) to serve knowledge as a service and amplify human senses in a non-intrusive manner,” said Adam Gąsiorek, CTO and Co-owner of Transition Technologies PSC. “We introduced spatial intelligence to RealWear HMT-1 to contextually show what, where and how it should be done, and how to document what was done by the frontline worker. With the evolution of the RealWear platform and the launch of RealWear Navigator 500 solution, we can support time-intensive manual labor even under more challenging conditions. For many industrial situations, time is more valuable than currency itself – it’s the one resource we can’t replace or replenish no matter how wealthy we are. With SkillWorx and RealWear Navigator 500 solution, we take tasks that are time consuming and make them much less so without sacrificing worker safety or comfort.”
The video shows a user in the pipe room of a building dynamically tagging objects such as valves and pipes with voice commands. These tags are persistent: they are recognized even if the objects leave the frame and return later as the user moves. Once tagged, objects can be recognized by a different user with a second device at a later time. The scenario in the video combines remote assistance with live annotation, spatially labelled work instructions, pipe pressure readings from IoT sensors, and access to technical documentation for a gas valve. Thanks to the camera capabilities of RealWear Navigator 500 solution, the device performs well even in low light and in hard-to-access positions, as RealWear’s telephoto mode has been integrated into SkillWorx. The application is operated 100% hands-free with noise-robust voice commands that work well even against the loud noise of air pressure escaping the nearby valve.
It is easy to see from the above video how such a system enhances the accuracy and speed of frontline worker inspection and maintenance tasks. Yet this productivity now comes with maximum situational awareness and enhanced safety: the system works with a monocular display that, in contrast to many mixed reality solutions, does not block direct or peripheral vision, and it can support an entire shift on a single battery.
So, are holograms the holy grail after all? Or is this contextually aware, live-tracked 3D SLAM solution on a 2D display – perhaps we might call it 2.5D assisted reality – the true sweet spot for the frontline worker doing constant field work safely? RealWear Navigator 500 wearable solution and Transition Technologies PSC SkillWorx together show the power of these next-generation platforms in action.