Melvin Mathews is an MBA graduate and ex-Master Mariner with over 20 years of experience in the maritime industry, previously serving as a Captain on vessels ranging from coastal ships to VLCCs. He is an Associate Fellow of the Nautical Institute and a Fellow of IMarEST.
Melvin’s extensive travel and work in several countries have broadened his expertise and understanding of operating in multicultural and multilingual environments. He has extensive business and consultancy experience and has led risk-assessment and risk-management initiatives at a senior level.
Melvin is a certified nautical lecturer and has been involved in Maritime & Competency training.
In the past, data was collected only when it was required, and most likely by whoever needed it, so the amount of data being collected was relatively small. Gradually the value of data became apparent once it was understood what it could reveal: analysed by an expert set of eyes, it could give insights and surface trends that could not easily be picked up otherwise. This made people wonder what the data they were not yet collecting might reveal, and thus began the race to collect all sorts of data. The mantra was that whether the collected data was useful or relevant would be decided later.
However, the practice of ‘whoever needed the data collected the data’ continued, and in many cases different people collected and used almost the same data. People who routinely worked with the data naturally developed skills and competency in using it. With the electronic age, the volume of data collection grew exponentially and rolled out the red carpet for big data. The various sensors, instruments and equipment were manufactured to a range of precisions by different manufacturers, so although data collection had never been uniform, its consistency now deteriorated further. This created not just several different variations in the data but a plethora of different signals and protocols for transmitting it.
Ingenious companies developed their own platforms to collect the data they required from the various sources, converting the different signals into a convenient protocol of their choice to run their systems. However, this leaves several companies vying for the same data and collecting it independently on their own platforms. This is typical of any industry, and shipping is no different.
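To see why this conversion step is unavoidable, consider NMEA 0183, one of the common serial protocols used by bridge instruments: each reading arrives as a plain-text “sentence” that must be checksum-verified and parsed before any platform can use it. The sketch below is illustrative only; the sample sentence is constructed for the example rather than taken from real equipment.

```python
def nmea_checksum(body: str) -> str:
    """NMEA 0183 checksum: XOR of all characters between '$' and '*',
    rendered as two uppercase hex digits."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return f"{cs:02X}"

def validate_sentence(sentence: str) -> bool:
    """Check that an NMEA 0183 sentence carries a matching checksum."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    body, _, checksum = sentence[1:].partition("*")
    return nmea_checksum(body) == checksum.strip().upper()

# Construct a water-speed-and-heading style sentence (fields made up
# for the example) and verify it round-trips through validation.
body = "VWVHW,245.0,T,240.0,M,12.5,N,23.1,K"
sentence = f"${body}*{nmea_checksum(body)}"
print(validate_sentence(sentence))
```

Every other on-board protocol (Modbus registers, proprietary serial formats, OPC tags) needs its own equivalent of this parsing and validation step, which is exactly the work each company’s platform duplicates.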
It is now increasingly common for solution providers to link into systems on board vessels to gather data from sensors, instrumentation and automation. The data gathered is used in many ways, including to develop tools, solutions and applications that promote efficiency, enhance performance, reduce fuel consumption and so on. Analysis of the data yields useful information and insight that can be used for decision support at management level, and it gives equipment manufacturers the capability to monitor performance, track ageing, carry out troubleshooting and make improvements.
When this data is transferred ashore using the vessel’s communication system, it allows the teams ashore to support the crew and systems on board much more quickly than before. When the data is transferred in real time, reaction times shrink drastically: teams ashore can see changes to equipment on board almost as quickly as the crew, or perhaps even before the crew picks them up.
Almost all the companies that offer such solutions and applications carry out their own integration on board. They take this on themselves to ensure they get the data they need to drive their systems, at their specified quality and frequency. One of the most serious issues companies face while linking to on-board systems is the sheer variety of protocols used by the plethora of instruments and devices on board. To add to that, every ship is different; even sister vessels end up with different models and versions of equipment from the same manufacturer. This makes each ship unique, so a customised approach to data gathering is required for each one. As a result, each solution provider has developed its own data gathering platform on board, closed to everybody else. In some instances one vessel has ended up with several data gathering platforms installed by several solution providers, each closed but doing almost the same job.
Since the end of last year there has, however, been a slight shift in this mindset. A few companies have come forward with a unique idea: a common data gathering platform on board that can support any solution or application provider. The platform links into all the equipment on board and collects all data like a black box. It is open source, which means anybody can link to it and take the data required to run their system. Although the signals coming into the open platform from the different sources will use differing protocols, the output from the platform will follow a set of standard agreed protocols. The advantages seem to be many:
- Total integration to all systems required only once per vessel
- Solution providers only need to be linked to the open platform
- All signals are converted to standard agreed protocols
- Single integration cost and manpower needs
- Reduced time to customer benefit for each solution or application
- Limited requirement to send installation engineers as solutions can be shipped off the shelf for a simple plug-in installation by crew.
- Reduction in communications costs since data has to be relayed ashore only once. It can then be distributed ashore for use by all providers for their needs.
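The core of such a platform is a set of per-protocol adapters that map every raw input onto one agreed output format, so providers integrate against that format once instead of against each instrument. The sketch below is a minimal illustration of that idea; the record fields, adapter names and raw message shapes are all assumptions made for the example, not any vendor’s actual schema.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

# A hypothetical standard record the open platform emits, regardless
# of which on-board protocol a reading originally arrived in.
@dataclass
class StandardReading:
    source: str    # equipment identifier
    quantity: str  # e.g. "fuel_flow", "power"
    value: float
    unit: str

# One adapter per protocol: solution providers only ever see
# StandardReading, never the raw formats.
def from_modbus(raw: Dict[str, Any]) -> StandardReading:
    # Assumed raw shape: {"reg": 40001, "val": 1234, "scale": 0.1}
    return StandardReading("flow_meter_1", "fuel_flow",
                           raw["val"] * raw["scale"], "kg/h")

def from_csv_logger(line: str) -> StandardReading:
    # Assumed raw shape: "genset_2;power;812.5;kW"
    src, qty, val, unit = line.split(";")
    return StandardReading(src, qty, float(val), unit)

ADAPTERS: Dict[str, Callable[..., StandardReading]] = {
    "modbus": from_modbus,
    "csv": from_csv_logger,
}

readings = [
    ADAPTERS["modbus"]({"reg": 40001, "val": 1234, "scale": 0.1}),
    ADAPTERS["csv"]("genset_2;power;812.5;kW"),
]
for r in readings:
    print(r)
```

Adding support for a new instrument then means writing one new adapter on the platform side; none of the downstream solutions need to change, which is where the single-integration saving in the list above comes from.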
Considering these points, having an open platform on every vessel certainly seems attractive. A significant number of ships are said to have such open platforms installed already. At the moment, the large reduction in the cost and effort of retrofit solutions during the life of a vessel appears to be the greatest motivator, and among ship owners the early adopters are said to have a head start.
The open platform is a big enabler for standardising big data in shipping. Applications built on the same data become far easier to compare, something ship owners find quite difficult now, and identifying and cherry-picking the applications that best suit their needs becomes much easier. Why, then, are all shipping companies not rushing to install the open platform? One reason could be the initial cost of installing one without knowing how many solution providers support and can work with the standardised data.
The ultimate success of the open platform will depend mainly on three parameters:
- Solution providers agreeing to give up their own in-house platforms in favour of the open platform,
- The number of ship owners who adopt the open platform, and
- Insistence by ship owners that the installed open platform will be the only source for anybody who seeks any data.
Data by itself is just a basic building block; it is perhaps similar to a pile of bricks, which are of no value on their own. The strength of data can only be appreciated when it provides something of value or an advantage. That requires advanced analytical capability, a highly specialised area that needs skill and competence working in tandem with subject matter experts. The possibility of gaining an edge from data has rapidly increased its value as a commodity, and good-quality data is eagerly sought after these days. Going forward in the data-driven world, however, it is the value-added services that increase the worth of the data that will rule the future.