Originally published in SSA Waves.
A common perception nowadays is that working in the maritime sector means that a fleet needs to go digital—that a company needs to have all of its data streams connected to be successful, or to even keep its business running smoothly. The reality is that many in the industry are not there yet.
The majority of shipping data is transmitted onshore via traditional formats like email, texts and basic Excel sheets. A good portion of that data is entered manually. How, then, does a shipping company work within this reality yet stay competitive in an increasingly challenging and data-vulnerable industry?
Before we get started, let’s look at the different types of data typically encountered in the shipping industry.
When collecting data for managing fleet performance, there are typically two types of data: event data, which covers a specific period of time, such as speed or distance averages or fuel oil consumption (FOC); and snapshot data, which, as the name implies, is recorded at a specific point in time, such as measurements showing load, RPM, pressure or power.
Event data must be entered manually, while snapshot data can be measured directly through sensors on a ship, though it requires autologging systems to gather data from each sensor. Each data source comes with its own advantages and disadvantages.
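To make the distinction concrete, the two data types described above could be modeled as simple records. This is an illustrative sketch only; the field names and units are assumptions, not a real reporting schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EventReport:
    """Manually entered data covering a period, e.g. a noon-to-noon report."""
    start: datetime
    end: datetime
    avg_speed_kn: float   # average speed over the period, knots
    distance_nm: float    # distance covered, nautical miles
    foc_mt: float         # fuel oil consumption, metric tonnes

@dataclass
class Snapshot:
    """Autologged sensor reading taken at a single point in time."""
    timestamp: datetime
    rpm: float
    power_kw: float
    pressure_bar: float
```

An event report spans an interval and aggregates the crew's figures; a snapshot carries raw instrument values for one instant, which is why it needs context from neighboring readings to be interpreted.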
Event data connects readily to a shipping company's existing ship-to-shore processes. It typically comes from aggregated crew knowledge and doesn't require the installation of costly sensors. Key voyage parameters like beginning of sea passage (BOSP), arrival, bunkering, departure, cargo, and end of sea passage (EOSP) are almost always manual data.
As this data is recorded by hand, human error will eventually find its way into the data set. Over time, data quality issues become inevitable.
Snapshot data is measured daily or more often, a frequency that matters for a number of KPIs such as engine power and hull and propeller performance. The resulting data-driven analysis can be fed back to every vessel, which is especially valuable for high-investment vessels.
However, it can be difficult to understand the context of a snapshot data point without comparing it to similar data points over a longer period. The quality of the data is also dependent on the quality of the sensors. Working with faulty sensors can be tricky without other data points to compare it to.
Fleets of various shipping companies are typically quite heterogeneous. Vessels are chartered for different lengths of time, which means cost-benefit decisions may leave different parts of a fleet equipped with different data-reporting technology.
A portion of a fleet can contain vessels in short-term operation mode, such as a spot charter vessel, which would be in operation for just a few months. Equipping those vessels with high-investment sensors and technology would be costly and likely not generate the return to make the cost worthwhile. It can be expected that most short-term vessels will depend on low-investment manual reporting.
However, time chartered vessels may be in operation anywhere from six months to three years or longer. It may make a lot more sense to invest in voyage reporting software and other basic equipment like ship sensors for these vessels, though in a time when every cent counts, the vessels may still prefer to utilize a manual reporting system.
Then consider vessels owned by a shipping company directly which can be on very long charters, some for as long as ten years or more. These ships would benefit greatly from investing in sophisticated equipment which can autolog data at a reasonably high frequency.
No two shipping companies are the same, with factors like fleet size, age, and segments in operation affecting the distribution of different types of vessels. Not all vessels will be equipped with sensors and autologging systems. It is very likely, however, that most vessels in a fleet will have a manual reporting system. Both sets of data could be enriched greatly by the addition of third-party sources. So how would a fleet collect multiple data streams into a common format?
To merge the data, there are generally two main options. Option A is to gather all autologged data, then project the event data onto it. This makes more sense for fleets where every vessel uses autologging systems, so that the remaining data comes from other sources as needed.
However, as we know, most fleets are not homogeneous, and most operators do not choose to equip their entire fleet with the sensors necessary for autologged data. A better solution, then, is to build a base not on autologged data, but on event data instead.
Option B is to aggregate reports by gathering all the manual data – such as noon reports in port, departure events, beginning of sea passage, noon at sea, end of sea passage, arrivals, anchorings, and more – then overlay all other data into the existing sequence of the manual reports. This solution is much more viable for the majority of companies who own heterogeneous fleets.
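The overlay step in Option B can be sketched as attaching each autologged reading to the manual report period that contains it. The records and field names below are hypothetical examples, not an actual reporting format.

```python
from datetime import datetime

# Hypothetical manual event reports (periods) and autologged snapshots (instants).
events = [
    {"type": "departure",   "start": datetime(2024, 5, 1, 8),  "end": datetime(2024, 5, 1, 12)},
    {"type": "noon_at_sea", "start": datetime(2024, 5, 1, 12), "end": datetime(2024, 5, 2, 12)},
]
snapshots = [
    {"timestamp": datetime(2024, 5, 1, 10), "rpm": 70.0},
    {"timestamp": datetime(2024, 5, 1, 18), "rpm": 82.5},
    {"timestamp": datetime(2024, 5, 2, 6),  "rpm": 81.0},
]

def overlay(events, snapshots):
    """Attach each snapshot to the event-report period that contains it."""
    for ev in events:
        ev["snapshots"] = [s for s in snapshots
                           if ev["start"] <= s["timestamp"] < ev["end"]]
    return events

merged = overlay(events, snapshots)
```

Because the manual reports form the base sequence, every vessel in a heterogeneous fleet fits into the same structure; vessels with sensors simply carry richer snapshot lists than those without.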
By compiling the two sources, a fleet operator is able to check the plausibility of the data. Merging data can also improve overall insights. If, for example, a fleet operator is missing specific fuel oil consumption (SFOC) information, they can use both manual and autologged sources to calculate the SFOC, instead of calculating it on board. This is especially useful for vessels which do not have a way to transfer onboard-recorded SFOC measurements onshore.
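As a minimal sketch of the SFOC example above: SFOC is commonly expressed in grams of fuel per kilowatt-hour, so the reported fuel consumption (manual) can be combined with the mean engine power (autologged) over the same period. The figures used here are assumed for illustration.

```python
def sfoc_g_per_kwh(foc_mt, avg_power_kw, hours):
    """Specific fuel oil consumption in g/kWh.

    foc_mt: fuel consumed over the period (from the manual report), metric tonnes
    avg_power_kw: mean engine power over the same period (autologged), kW
    hours: length of the period
    """
    fuel_g = foc_mt * 1_000_000        # tonnes -> grams
    energy_kwh = avg_power_kw * hours  # kW x h -> kWh
    return fuel_g / energy_kwh

# Assumed example: 24 t of fuel over a 24 h noon-to-noon period
# at a mean autologged power of 5,500 kW.
sfoc = sfoc_g_per_kwh(24.0, 5500.0, 24.0)  # about 181.8 g/kWh
```

The calculation happens ashore from two independently collected inputs, which is what makes it workable for vessels that cannot transfer onboard-recorded SFOC measurements onshore.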
After merging the sources, additional plausibility checks can be implemented, but only with the right software. Manual reporting not designed for shipping, such as an Excel sheet, is typically sent by email straight from ship to shore. With event reporting software such as Navigator Insight, quality checks activate through the ECO Insight portal between the time a noon report is sent and the time it is received. In this example, StormGeo's Fleet Performance Center would then further validate the data through the Daily Alert service, ensuring a company is equipped with better-quality insights.
Everything done ashore is based on the quality of the data received. If that data is of questionable quality, a lot of resources would have to be spent resolving discrepancies to bring it back to a reasonable standard.
The current world environment does not leave a lot of room for excess resource allocation, which leads to a higher likelihood that faulty data will lead to costly errors in decision making. Quality checks integrated into the solution can reduce the risk of this scenario.
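One simple form such an integrated check can take is cross-validating a report against itself: the stated average speed should agree with the stated distance and period length. This is an illustrative sketch; the tolerance and field choices are assumptions, not the logic of any particular product.

```python
def check_speed_plausibility(distance_nm, hours, reported_avg_speed_kn, tol_kn=0.5):
    """Flag a report whose stated average speed disagrees with distance/time."""
    implied_speed = distance_nm / hours
    return abs(implied_speed - reported_avg_speed_kn) <= tol_kn

# A 24 h noon report claiming 300 nm at an average of 12.5 kn is consistent
# (300 / 24 = 12.5), while a claimed 14.0 kn average would be flagged.
ok = check_speed_plausibility(300.0, 24.0, 12.5)
flagged = not check_speed_plausibility(300.0, 24.0, 14.0)
```

Catching such a discrepancy between sending and receiving a report is far cheaper than discovering it weeks later in a performance analysis.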
Whatever the size of a fleet, and whatever its composition, investing in a software solution that enables data plausibility checks can go a long way in ensuring a company stays around through turbulent times and beyond.