If you’ve ever had a lengthy internet-browsing session and wondered just how deep the web can go, the answer, at least physically, is the University of Hawaii’s ALOHA Cabled Observatory (ACO). Situated on the floor of the Pacific Ocean, three miles below the surface and about 60 miles off the beautiful north coast of Oahu, the ACO receives power and Ethernet connectivity through a retired AT&T undersea cable that was donated to the oceanographic project.

The information that travels through this cable provides a wealth of observational data, measuring in real time everything from water pressure to salinity, temperature, currents, and oxygen levels. This data makes a valuable contribution to oceanographers’ understanding of far-reaching issues like climate change. The ACO also features live video equipment and a hydrophone, performing real-time audio and visual observations from the ocean floor that include footage of newly discovered sea life and the soft crooning of humpback whales.
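For a sense of what that data might look like in practice, here is a minimal sketch of a single real-time observation record. The field names, units, and values are illustrative assumptions, not the ACO’s actual data format:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SeafloorReading:
    """One hypothetical real-time observation from the seafloor package."""
    timestamp: datetime
    pressure_dbar: float       # water pressure in decibars (~1 dbar per meter of depth)
    salinity_psu: float        # practical salinity units
    temperature_c: float       # degrees Celsius
    current_speed_cm_s: float  # horizontal current speed
    oxygen_umol_kg: float      # dissolved oxygen concentration

# Illustrative values only -- roughly what deep Pacific water might report
reading = SeafloorReading(
    timestamp=datetime.now(timezone.utc),
    pressure_dbar=4800.0,
    salinity_psu=34.7,
    temperature_c=1.5,
    current_speed_cm_s=4.2,
    oxygen_umol_kg=150.0,
)
```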

As you might assume, implementing network connectivity three miles beneath the ocean comes with a unique set of IT challenges not for the faint of heart. Physically, the ACO is the size of a VW Beetle, with plugs on its ends featuring special connectors designed to carry fiber optics and power simultaneously in an environment with 500 atmospheres of pressure. Not exactly your usual data center setup. The ACO is designed to interface with different observational equipment on a modular basis, providing multi-tenant networking with internet and power connections ready for each piece of hardware introduced. Different universities and research groups can then have their projects attached to and removed from the ACO via remotely operated underwater vehicles. Not the easiest place to do a truck roll.
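To illustrate the multi-tenant, modular idea, here is a minimal sketch of a registry that tracks which research project occupies each accessory plug. The class and method names are hypothetical, invented for this example rather than taken from the ACO’s software:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class AccessoryPlug:
    """One hypothetical wet-mate connector on the observatory frame."""
    plug_id: str
    powered: bool = False
    link_type: str = "ethernet"   # "ethernet" or "serial"
    tenant: Optional[str] = None  # research group currently attached

class PlugRegistry:
    """Tracks which experiment occupies each plug and in what mode."""

    def __init__(self, plug_ids):
        self.plugs: Dict[str, AccessoryPlug] = {
            p: AccessoryPlug(p) for p in plug_ids
        }

    def attach(self, plug_id: str, tenant: str, link_type: str = "ethernet"):
        plug = self.plugs[plug_id]
        if plug.tenant is not None:
            raise ValueError(f"{plug_id} is already occupied by {plug.tenant}")
        plug.tenant, plug.link_type, plug.powered = tenant, link_type, True

    def detach(self, plug_id: str):
        plug = self.plugs[plug_id]
        plug.tenant, plug.powered = None, False
```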

The undersea cable itself, graciously donated by AT&T, was the first fiber-optic cable between the continental U.S. and Hawaii, and it still uses the original AT&T control systems alongside our modern equipment, all of which resides at a cable landing station where the cable meets the north shore of Oahu. This system includes optical switches used as repeaters, so that if a strand of the cable is broken (say, by an anchor dragged across it), connectivity and power can be rerouted around that strand.

Networking the ACO requires converting the original proprietary AT&T signaling at work within the undersea cable into industry-standard 100Base-FX network communication. This is done using a customized board designed and built for the project by former AT&T engineers. Data is converted to 100Base-FX within two 10-foot-long, 24-inch-diameter titanium tubes on the ACO, which keep the networking equipment dry and at normal atmospheric pressure. The converted data is then handled by ruggedized industrial networking gear, including a combination of Cisco, Belden, and Sixnet switches. A custom-made, computer-controllable power supply at the ACO also makes it possible to control power to each accessory plug, and a switching device allows each copper connection to be configured as either Ethernet or serial for industrial control.
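As a rough sketch of what that per-plug control could look like from the shore side, the snippet below wraps a serial command channel to the power supply. It assumes the pyserial library and an invented ASCII command syntax (`PLUG`, `MODE`); the project’s real command set is certainly different:

```python
import serial  # pyserial; the command strings below are purely illustrative

class ShorePowerController:
    """Hypothetical wrapper around the custom power supply's command channel."""

    def __init__(self, device: str = "/dev/ttyUSB0", baud: int = 9600):
        self.link = serial.Serial(device, baudrate=baud, timeout=2)

    def _send(self, command: str) -> str:
        # Write one command line and read back the supply's reply
        self.link.write((command + "\r\n").encode("ascii"))
        return self.link.readline().decode("ascii").strip()

    def set_plug_power(self, plug: int, on: bool) -> str:
        # e.g. "PLUG 3 ON" -- invented syntax for this sketch
        return self._send(f"PLUG {plug} {'ON' if on else 'OFF'}")

    def set_copper_mode(self, plug: int, mode: str) -> str:
        # Switch a plug's copper pairs between Ethernet and serial service
        if mode not in ("ETH", "SERIAL"):
            raise ValueError("mode must be 'ETH' or 'SERIAL'")
        return self._send(f"MODE {plug} {mode}")
```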

Resiliency is also a key component of making this unique networking scenario work. For this piece of the puzzle we’ve used Opengear networking equipment, in place both at the cable landing station and at the University of Hawaii data center. This helps maintain a high-availability failover pair of connections, ensures redundant data storage, and provides secure access for oceanographers around the world via an IPsec VPN tunnel. The equipment further delivers on our system’s essential need for resiliency by managing the ACO’s power supply and automatically addressing common issues that may arise deep under the sea.
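The shape of that automation is simple to sketch: watch a device, and if it stops answering, power-cycle its plug. The code below is a generic illustration of the pattern, assuming a Unix `ping` binary and the hypothetical `ShorePowerController` from the previous sketch; it is not Opengear’s actual implementation:

```python
import subprocess
import time

def reachable(host: str) -> bool:
    """Send one ICMP echo request; True if the host answers within 2 s."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def watchdog(host: str, power_cycle, max_misses: int = 3, interval: float = 30.0):
    """Power-cycle a device after `max_misses` consecutive failed pings."""
    misses = 0
    while True:
        misses = 0 if reachable(host) else misses + 1
        if misses >= max_misses:
            power_cycle()  # e.g. toggle the plug off and back on
            misses = 0
        time.sleep(interval)
```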

To more easily gather troubleshooting information, smart devices have been added that can poll our undersea ACO hardware via SNMP management and monitoring. An Opengear console server placed in one of the ACO’s camera domes also offers a unique ability: power can be fed into it, and it can in turn supply Power over Ethernet downstream, a technique we use to power cameras in the dome. The console server also includes RS-485 industrial-control serial connectivity (which we use to control lights at the dome) and regular Ethernet connections to sensors measuring mission-critical interior environmental factors like humidity and barometric pressure.
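For readers curious what that SNMP polling might look like, here is a minimal sketch using the pysnmp library. The management address is a placeholder from the documentation range, and the whole setup is an assumption rather than the ACO’s actual configuration:

```python
from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity, getCmd,
)

def snmp_get(host: str, oid: str, community: str = "public"):
    """Fetch one OID from a device via SNMP v2c; raise on any error."""
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(community, mpModel=1),                    # v2c
        UdpTransportTarget((host, 161), timeout=2, retries=1),
        ContextData(),
        ObjectType(ObjectIdentity(oid)),
    ))
    if error_indication:
        raise RuntimeError(str(error_indication))
    if error_status:
        raise RuntimeError(error_status.prettyPrint())
    return var_binds[0][1]

# sysUpTime from a placeholder management address for an undersea switch
print(snmp_get("192.0.2.10", "1.3.6.1.2.1.1.3.0"))
```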

The result of this network infrastructure is that scientists can securely control their equipment at the station, and terabytes of data can be streamed and made available in real time while also being stored at multiple locations to guard against data loss. Looking forward, this infrastructure supports the ACO’s plans for future expansion and the introduction of new capabilities, all of which will further our understanding of the environment at the internet’s deepest frontier, far under the sea.