Seeing AR through the eyes of a shipmaster

Nov. 15, 2023
Research reveals the capabilities high-end systems must have to meet users’ needs within the complex marine environment.

A pillar of the world’s economy, the global merchant fleet of 118,928 ships represents an intriguing market for developing augmented reality (AR) navigation technologies. According to the U.S. Navy, AR technology is “a necessity in the ever more complicated domain of Surface Warfare Operations.”1

The task of AR is to present information to users in a more assimilable way: additional visual details, derived from data collected, reduced, and processed by a computer, are superimposed on the real environment with which they are associated.

The “virtual” elements superimposed in the operator’s field of view (FOV) represent actual characteristics of existing targets. In virtual reality (VR), the object itself is absent; everything seen is synthetic. In AR, aspects of an existing object of interest become available, including those that cannot ordinarily be perceived, that would otherwise require looking away from the target, or that would not have been available at all.

A vital function of AR is to present in a visual fashion information from sensors operating at wavelengths not accessible to humans, such as in the infrared (IR) or radar bands.

Augmented reality comes with crucial benefits for the officer of the watch (OOW). AR increases safety by presenting shipping-lane information in shallow waters or under adverse visibility conditions. It improves situational awareness (SA), defined as the knowledge of the relevant entities in a ship’s environment, the understanding of their meaning, and their evolution in the immediate future, by providing, in one system, a comprehensive package of information on the officer’s own vessel and all vessels of interest. The more the officer’s mental picture coincides with reality, the better the SA. By concentrating information on a single, handy user interface, AR reduces cognitive and physical stress and the possibility of human error, which causes an estimated 75% to 96% of marine accidents, primarily collisions and groundings.

Two of the more advanced AR navigation systems currently on the market, Furuno’s Envision and Groke’s As-Pro, are essentially monoscopic. A wide-field CCTV camera monitors the sea surface from an advantageous position over the ship and delivers an image to a dashboard on the bridge, where information collected via other sensors is processed, associated, and made visible to the officer. Both systems feature an additional IR camera for night vision.

Peculiarity of ship navigation

The peculiarity of ship navigation lies in the fact that the master creates a mental picture of the vessel’s situation by combining continuous head-up (HUP) and head-down (HD) movements while walking about the bridge (see Fig. 1).2 This routine dates to the 5th century BC, when the Phoenicians made the first sea charts to facilitate the Persian invasion of Greece. Watch time is split roughly 50/50 between HUP and HD, with ECDIS (Electronic Chart Display and Information System), the modern descendant of the Phoenician charts, being the most consulted interface on the bridge.

By concentrating information on one screen, the monoscopic systems above represent a significant advance in reducing cognitive load by obviating the consultation of several screens, as happens on most bridges today. The OOW, however, must still bend over a screen to acquire details that must be reconciled with what is seen on the sea surface.

Head-mounted display AR

The bridge is where access to sensor information and orders is centralized to ensure the safe conduct of a vessel at sea. According to the Navy, these data can be “centralized” via AR in a head-mounted display (HMD),3 which confers the following advantages:

  • The HMD does away with the HUP and HD movements that are a source of physical fatigue. Hands no longer must serve dashboards and are left free for maneuvering; e.g., during pilot operations or when working on a sailboat.
  • As part of a smartly designed system, the HMD immediately associates data with what is perceived by looking out to sea, including targets that cannot be interpreted well because of obstruction, unfavorable weather, or other conditions that obscure or mask their presence. An HMD can also deliver stereoscopic vision in critical areas, in port waters for instance, when needed to better evaluate the distance, dimensions, and precise nature of a target.
  • Team situation awareness (TSA) increases when all officers on the bridge wear a headset that delivers the same level of information to respective positions, thus preventing misunderstandings. In a military context, the new routine also improves mental model sharing.

For safe navigation, the system must smoothly display the complete set of information usually accessible on the multivendor bridge interfaces for the given ship type. This information can be divided into primary and secondary.

Primary information is always shown and includes the own ship’s track and speed and whether the ship is following the ECDIS-fed voyage plan; when the vessel leaves ocean passage, it further reports buoys with their respective top marks and shipping lanes. In 2021, Laera and colleagues at the University of Bari in Italy examined 11 existing AR systems: course, compass, vessel speed, and geographic coordinates remain the fundamentals delivered.

Secondary information is available at the officer’s request or when the program deems it urgent under the circumstances. It includes details on other vessels in the area acquired via AIS (Automatic Identification System) and the radar plotter, plus information on points of interest (POI), shallow waters, wrecks, debris, and ice and its consistency.
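To place an annotation on a vessel reported via AIS, the system must convert the target’s reported latitude and longitude into a range and true bearing from the own ship, which fixes where the label appears in the officer’s field of view. A minimal sketch of that conversion (standard haversine and initial-bearing formulas; the function name and units are illustrative, not from any cited system):

```python
import math

def range_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle range (nautical miles) and true bearing (degrees)
    from own ship (lat1, lon1) to an AIS target (lat2, lon2),
    with all inputs in decimal degrees."""
    R_NM = 3440.065  # mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine great-circle distance
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    rng = 2 * R_NM * math.asin(math.sqrt(a))
    # Initial great-circle bearing, normalized to 0-360 degrees true
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    brg = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return rng, brg
```

For a target one degree of longitude due east on the equator, this returns roughly 60 nautical miles at bearing 090°, matching the rule of thumb that one minute of arc along a great circle is one nautical mile.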

Overarching both types of information is a warning system for potential non-compliance with the IMO Collision Avoidance Regulations (COLREGS), which shows the closest point of approach (CPA) and time to closest approach (TCPA) of another ship, with respective details.
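The CPA/TCPA computation behind such a warning is standard relative-motion vector math: project the relative position onto the relative velocity to find when the range is smallest. A minimal sketch in a local flat-earth frame (function name and units are illustrative):

```python
import math

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach between two ships in a local planar frame.
    Positions in nautical miles (east, north); velocities in knots (east, north).
    Returns (cpa_nm, tcpa_hours); a negative TCPA means the closest
    approach already occurred."""
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]  # relative position
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]  # relative velocity
    v2 = vx * vx + vy * vy
    if v2 < 1e-9:  # same course and speed: the range never changes
        return math.hypot(rx, ry), 0.0
    # Time at which the relative position is closest to the origin
    tcpa = -(rx * vx + ry * vy) / v2
    cpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return cpa, tcpa
```

A warning would then fire when the computed CPA falls below a set minimum range while TCPA is positive and within the look-ahead window; those thresholds are operator-configurable on real bridge systems.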

The HMD is expected to provide color, contrast, and resolution of a quality that allows a sharp appreciation of the displayed details. Subjective brightness derived from correct luminance, together with high legibility, forms part of a user experience that affects reaction times. The headset should direct the officer’s attention by using color changes for sea areas (e.g., shallow waters) or, when situations of interest are registered for targets, by using object circling, tags, virtual lines, and attention funnels. Changes in luminosity, blinking, or transparency may also be used to guide the user’s attention softly and with greater efficacy.

Standardizing the display

Researchers of the Ocean Industries Concept Lab (OICL) at the Oslo School of Architecture in Norway proposed a standard set of five display objects to mediate information in AR spaces without clutter and overlap (see Figs. 2 and 3):

1. The app display shows an entire application, like ECDIS or local ship traffic.

2. The widget display shows just a fraction of an application; e.g., the compass reading or wind speed.

3. The annotation provides information on a target such as a ship or other POI.

4. Ocean overlays visualize information on the matching area or point of reference, like the planned route or avoidance zones.

5. Augmented reality maps are placed on top of the sea field view to report details; e.g., position, heading, and speed of vessels nearby or the surrounding sea ice conditions mapped by sensors.
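One practical consequence of this five-object taxonomy is that a rendering engine can treat the types uniformly while still distinguishing view-fixed elements (app and widget displays) from world-registered ones (annotations, overlays, AR maps). A minimal sketch of such a taxonomy as a data structure; all names are illustrative, not from the OICL standard itself:

```python
from dataclasses import dataclass
from enum import Enum, auto

class ARObjectKind(Enum):
    """The five OICL-proposed display object types (identifiers are illustrative)."""
    APP_DISPLAY = auto()    # a full application, e.g., ECDIS
    WIDGET = auto()         # a fragment of an application, e.g., compass reading
    ANNOTATION = auto()     # a label attached to a target such as a ship
    OCEAN_OVERLAY = auto()  # geo-registered area or point, e.g., planned route
    AR_MAP = auto()         # a map layered over the sea field of view

@dataclass
class ARObject:
    kind: ARObjectKind
    label: str

def is_world_anchored(obj: ARObject) -> bool:
    """Annotations, ocean overlays, and AR maps track real-world positions;
    app and widget displays stay fixed in the wearer's view."""
    return obj.kind in {ARObjectKind.ANNOTATION,
                        ARObjectKind.OCEAN_OVERLAY,
                        ARObjectKind.AR_MAP}
```

A renderer would re-project the world-anchored objects every frame from the ship’s heading, attitude, and position, while the view-fixed ones are simply composited at constant screen coordinates.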

The HMD must further adapt to frequent luminosity changes on the sea surface and within the bridge, while also mastering the problem of glare from reflections off the sea. As regulated by the program, the headset must handle luminosity efficiently in night darkness and in poor visibility, such as fog, rain, or snow.
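One simple way a program could regulate display luminosity against such ambient swings, from night darkness to sunlit sea, is a logarithmic mapping from an ambient-light-sensor reading to a normalized brightness level, since perceived brightness is roughly logarithmic in luminance. A sketch with entirely illustrative thresholds (no cited system specifies these values):

```python
import math

def display_brightness(ambient_lux, lo_lux=10.0, hi_lux=50000.0,
                       lo_out=0.05, hi_out=1.0):
    """Map ambient illuminance (lux) to a normalized HMD brightness in
    [lo_out, hi_out], using a log scale between the clamping thresholds
    lo_lux (dark bridge at night) and hi_lux (sunlit sea surface)."""
    lux = min(max(ambient_lux, lo_lux), hi_lux)  # clamp to the working range
    t = (math.log10(lux) - math.log10(lo_lux)) / \
        (math.log10(hi_lux) - math.log10(lo_lux))
    return lo_out + t * (hi_out - lo_out)
```

In practice the output would also be low-pass filtered over a few seconds so brightness does not flicker as waves and clouds modulate the sensor reading.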

The research now focuses on existing headsets and the improvements required to reach TRL 9 (technology readiness level 9) for safe marine applications.

HMD in the laboratory and at sea

At the University of Bergen in Norway, postgraduate researcher Abel van Beek designed Sjor (Sea), an AR navigation program he tested with the help of sailors at different development stages using a HoloLens 2. The ships involved in the study between 2020 and 2021 were a platform supply vessel (The Energy Duchess; see Fig. 4), a ferry, and a 12 m sailing boat. Van Beek summarized his experience, stating, “The use of HoloLens 2 on a ship bridge—being exposed to the environment or enclosed on a ship bridge—is challenging in its current state.”

The main issues found by van Beek included a lack of waterproofing, limited stability, poor visibility in bright environments, misalignment of the graphics caused by vessel tilt, inaccurate GPS, and a limited FOV. He concludes that “general AR technology (including HoloLens 2) will transition into the domain-specific AR devices that do not suffer from the same limitations.”

Within the project Safe Maritime Operations under Extreme Conditions – The Arctic Case (SEDNA), financed by the European Union, researchers of the Ocean Industries Concept Lab (OICL) in Norway used the Microsoft HoloLens 2, Meta 2, and HTC Vive Cosmos to test an AR concept for icebreaker assistance in Arctic operations.3 With the crew’s cooperation, tests were performed between 2018 and 2020 on a simulator at Chalmers University of Technology in Sweden, at sea during a 14-day voyage onboard a Norwegian Coast Guard ship in East Greenland, and on a Swedish icebreaker in the Baltic.

A main issue in the program was the light reflected from Arctic ice, which caused the graphics in the headset to fade almost completely, forcing the team to fit a tinted film filter to carry out the research successfully with HoloLens 2 (see Fig. 5).

OICL currently carries on further projects, OpenBridge and Open AR. Professor Kjetil Nordby, who leads the laboratory, says of headsets: “There are issues with compensation for waves and for reflections in the arching windows of the bridge. HoloLens has a moving platform mode. We are not entirely convinced it solves the issues yet.”

Nordby explains: “We believe screens are going to be most important for a while, with goggles coming in for supporting roles, such as lookout.” The laboratory uses Magic Leap 2 for current work. About the existing HMD technology for maritime use, he adds, “We are curious about the upcoming passthrough goggles, like Apple Vision. As it stands now, we would suggest maritime versions of the headsets or at least some modifications.”

The production of HMD specifically designed for a maritime environment would likely accelerate the implementation of AR navigation systems in the merchant navy, improving the safety of goods, crews, and passengers.

REFERENCES

1. See https://apps.dtic.mil/sti/pdfs/AD1147715.pdf.

2. F. Guo et al., “Towards an ergonomic interface in ship bridges: Identification of the design criteria,” International Conference on Applied Human Factors and Ergonomics (AHFE 2022), New York, USA, July 24–28, 2022; https://doi.org/10.54941/ahfe1001609.

3. S. Frydenberg, K. Aylward, K. Nordby, and J. O. H. Eikenes, J. Mar. Sci. Eng., 9, 996 (2021); https://doi.org/10.3390/jmse9090996.

About the Author

Vittorio Lippay

Vittorio Lippay has been a member of the Institute of Physics since 2018 and a member of the Institute of Chartered Shipbrokers (London branch) since 2012. 
