Pre-print version: Paper presented at the 16th International Conference on Computer Applications and
Information Technology in the Maritime Industries, May 2017, Cardiff, UK.
Preparing for a Digital Future - Experiences and Implications from a
Maritime Domain Perspective
André Keane, Ulstein International AS, Ulsteinvik/Norway, andre.keane@ulstein.com
Per Olaf Brett, Ulstein International AS, Ulsteinvik/Norway and Norwegian University of Science
and Technology, Trondheim/Norway per.olaf.brett@ulstein.com
Ali Ebrahimi, Ulstein International AS, Ulsteinvik/Norway, ali.ebrahimi@ulstein.com
Henrique M. Gaspar, Norwegian University of Science and Technology, Aalesund/Norway,
henrique.gaspar@ntnu.no
Jose Jorge Garcia Agis, Ulstein International AS, Ulsteinvik/Norway and Norwegian University of
Science and Technology, Trondheim/Norway, jose.agis@ntnu.no
Abstract
The re-emergence, and subsequent increased credibility, of big data analytics to support and enhance
effective decision-making in recent years comes largely from significantly increased available
computing power, which has consequently spurred the generation of new applications
encroaching upon the normalcy of business processes and models. Maritime industries are often
attributed an elaborate complexity, which accentuates the difficulty of adapting and implementing
state-of-the-art technologies that have not been developed natively within the industry. As such, there
is a clear need for a modern, systemic, and methodological approach so that more traditionally inclined
industries not only can utilize such digital enhancements, but embrace them to further contribute
towards previously incomprehensible applications and provide enhanced sustainability, including but
not limited to higher productivity and new business opportunities. This paper examines
the empirical and implicational facets of embracing digital technologies in the maritime domain from
a Big Data Analytics handling perspective. It presents an introduction to the emergence of digital
technology, discusses the challenge of integrating such innovations into the traditional maritime
business, and presents a few examples from Ulstein's digital experiences.
1. Extended Systems-of-Systems Boundaries and their Effect on the Traditional Maritime Business
Domain - Part I
With over 90% of world trade carried by ships, the maritime industry is, more than ever, an
integral part of the process of globalization, which makes it strongly dependent on the behaviour of the
world economy. As such, the industry is influenced by factors such as economy, trade, production,
consumption, politics, financing, and technology that drive the demand and supply of manufactured
goods, raw materials, and shipping services (Stopford, 2009).
Companies operating in the maritime business are challenged by much more extended and somewhat
fluid system-of-systems boundaries, a VUCA world. Volatility, uncertainty, complexity, and ambiguity
(VUCA) coexist in many industries (Bennet & Lemoine, 2014), maritime being one of them. VUCA
aspects characterize the technical and managerial development of firms, limiting innovation, and
challenging the implementation of new technologies (Corsi & Akhunov, 2000). Fuel prices, market
supply or demand are volatile variables. They are influenced by a diverse list of factors that makes them
unpredictable. Although we know they are continuously changing, it is very difficult or impossible to
predict their exact behaviour. Volatility covers the expected but unpredictable. Fuel price is one of the
volatile factors with the strongest influence on the maritime industries. Uncertainty, contrary to volatility,
refers to changes that are not expected. New regulations, new market entrants (products, services, or
business models) or natural disasters are typical examples of uncertainties affecting the maritime
industry. Both volatility and uncertainty can be assessed by simulation of future scenarios
(Schoemaker & van der Heijden, 1992), which improves the agility of the company in reacting towards
unpredictable or unexpected changes. Data availability, storage capacity and increasing capability are
key drivers in mitigating future uncertainty, and the emergence of Big Data Analytics and Artificial
Intelligence (AI) brings with it capabilities that will undoubtedly further our understanding as the
technology matures.
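Scenario simulation of a volatile variable such as the fuel price can be sketched with a simple Monte Carlo model. The geometric Brownian motion form, drift, and volatility figures below are illustrative assumptions, not calibrated market parameters:

```python
import random
import statistics

def simulate_fuel_price_paths(p0=450.0, drift=0.01, volatility=0.25,
                              years=5, n_paths=1000, seed=42):
    """Monte Carlo scenario simulation: geometric-Brownian-motion-style
    annual paths for an illustrative bunker fuel price in USD/tonne."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        price = p0
        for _ in range(years):
            shock = rng.gauss(0.0, 1.0)          # one random annual shock
            price = max(price * (1.0 + drift + volatility * shock), 0.0)
        finals.append(price)
    return finals

paths = simulate_fuel_price_paths()
ordered = sorted(paths)
print(f"median 5-year price: {statistics.median(paths):.0f} USD/t")
print(f"5th-95th percentile band: {ordered[50]:.0f}-{ordered[950]:.0f} USD/t")
```

Comparing such percentile bands across candidate strategies is one simple way scenario ensembles can improve a company's agility in reacting to unpredictable changes.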
The complexity of the maritime industry and its corresponding decision-making processes is a
consequence of the large variety of operations, operational environments, and multitude of stakeholders.
This complexity at the industrial level directly influences the degree of complexity of the systems
operating in it, such as the fleet, the ship, and its subsystems (Gaspar, et al., 2012). If we define
complexity as the amount of relevant information needed to properly define a system, how can we then
precisely establish a boundary for what is proper enough? Such reflections in the industry foster
ambiguity. Typically, the latter
relates to the lack of data, comprehension, and a clear idea about cause-effect relationship. How will
the market react to a new disruptive product? Will these extra functionalities or capacities add value to
the customer? How much more information is necessary to increase precision and reduce uncertainty?
Experimentation, simulation, and data analysis can solve this problem. One example is the use of
statistics and multivariate data analysis to estimate vessel performances (Ebrahimi, et al., 2015). Virtual
prototyping brings the possibility of simulating changes during design or operational phases before their
real implementation. It gives the opportunity of testing before experiencing, which reduces the
probability of errors and furthermore improves efficiency, safety, and environmental friendliness. It
is important to approach design in the conceptual phase from different perspectives: This is to
differentiate among different solutions, to have better understanding of consequence for input change,
to measure Goodness of Fit between initial product expectations and final as built performance yields,
to have more meaningful benchmarking with market competitors, to make better and more robust
decision-making in vessel design, and to support the development of effective sales arguments.
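As a minimal sketch of the statistical estimation idea (with invented numbers, not the multivariate models of Ebrahimi, et al. (2015)), an ordinary least-squares fit of installed power against ship length could look like:

```python
# Hypothetical samples: ship length [m] versus installed power [kW].
lengths = [60.0, 70.0, 80.0, 90.0, 100.0]
powers = [2100.0, 2900.0, 3800.0, 4900.0, 6100.0]

# Ordinary least squares for a single predictor: slope = cov(x, y) / var(x).
n = len(lengths)
mean_x = sum(lengths) / n
mean_y = sum(powers) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(lengths, powers))
         / sum((x - mean_x) ** 2 for x in lengths))
intercept = mean_y - slope * mean_x

# Estimate for a hypothetical 85 m vessel:
predicted = intercept + slope * 85.0
print(f"power(85 m) ~ {predicted:.0f} kW")
```

A production model would of course use many predictors and proper validation; this only illustrates how quantifiable relationships between design parameters and performance can be extracted from data.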
In contrast to other comparable industries, such as aerospace or automotive, the maritime industry
constitutes a cost-driven business model (McCartan, et al., 2014). The implementation of innovation
and new technologies will be challenged by cost as the only benchmark, since maritime companies in
general act capitalistically (Borch & Solesvik, 2016). This view acts in direct contradiction to
innovations and new technologies, labelling many shipping companies as “deeply conservative” (Glave,
et al., 2014). Considering these premises and the difficulties of connecting technological developments
to growth in revenue, technical development in shipping has historically been relegated to second-division
play (Dijkshoorn, 1977). Other factors, such as human, regulatory, and financial ones, challenge the
introduction of new technologies and innovative features in the maritime industry as well.
Considering the potential benefits and the challenges for its implementation, Norden, et al. (2013)
suggest changes in the business concept as a way towards the digital future: a switch from a product-
based business concept, typically focused on costs, towards a more service-oriented approach, which
focuses on recurring revenue and overall system performance over its lifetime. Such a change would
open the door to innovations and new technology that may increase the revenue making capability of
the design solution at a required but justifiable higher initial investment, and potentially reduce lifecycle
costs and increase overall performance yields. Leading companies such as GE, IBM, Rolls-Royce, or
Siemens are all exploring how such a servitization concept should be applied in their respective
industries (Ahamed, et al., 2013).
Figure 1: Ulstein Value Creation Process in Vessel Design
Throughout the different steps of the Ulstein value creation process, digitalization and automation of
processes are continually being applied to reduce time and required resources, while simultaneously
enhancing the robustness of the decisions and actions being made. Figure 1 illustrates the process within a
product lifecycle context for ships. This is a new product development process where different business
intelligence approaches, tools and techniques are applied to improve overall system effectiveness.
This paper reviews the exploration of these phenomena and recent related developments in the
maritime industry. The elaborations are divided into two main sections: Part I addresses and
elaborates on the most prominent features of big data analytics from a generic knowledge standpoint,
and Part II presents some practical applications based on Ulstein business and work procedures.
2. The Integration Challenge - Part I
2.1. PLM as an Umbrella
Products within the maritime domain not only must comply with a vast set of requirements governed
by national and international politics, rules and regulations, and vested stakeholder interests, but a high
performing asset also demands exceptional capabilities within aspects such as the commercial, technical,
and operational (Ulstein & Brett, 2009). Such multi-faceted criteria as input to product development
requires a robust product development suite to supplement the human capability and to stay competitive
as a product supplier. Product lifecycle management (PLM) was introduced in the 1990s to better
manage information and expand the scope of computer aided tools throughout all phases of a product’s
lifecycle, which from a manufacturer’s standpoint is comprised of imagination, definition, realisation,
support, and retirement (Lee, et al., 2008). Such an approach enhances the overall effectiveness of vessel
design (Ulstein & Brett, 2015).
PLM is the business activity of managing a product from first idea to final retirement, cradle to grave
(Stark, 2015), which furthermore can be viewed as an approach to integrate information, people,
processes, and business systems (Lee, et al., 2008). Realising such an approach in practice, not only
encompasses an immense amount and variety of data in terms of volume, veracity, and compatibility,
but also requires interfaces to enable an efficient two-way communication channel for information
exchange among relevant stakeholders. Adding to this the inherent complexity within ship design,
manufacturing, and operation, the challenge of integration becomes an intractable one.
Lee, et al. (2008) argue that PLM originates from two main domains: The first being enterprise
management, which further consists of subdomains such as enterprise resource planning (ERP),
customer relationship management (CRM), and supply chain management (SCM); and the second
sourced from product information management. The latter largely pertains to computer aided design
and manufacturing, and product data management tools that have become imperative in the process of
product development (Sharma, et al., 2012).
Taking a ship as a product-example from the maritime domain, the lifecycle phases pertaining to
product definition and generation in a PLM context serve to provide a complete product definition in
terms of both performance evaluation and input documentation for subsequent production activity. Each
module in the core product data model is composed of multidimensional sources of information in terms
of formatting, frequency, and compatibility to name a few, which, in turn, clearly exemplifies the extent
of the integration challenge to systematically manage the product’s lifecycle. From a shipbuilding
perspective, Morais, et al. (2011) further elaborate on some of the main challenges of technology
adoption within the industry.
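The PLM integration burden described above can be made concrete with a toy data model; the phase names follow Lee, et al. (2008), while the module fields, hull number, and example entries are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum

class LifecyclePhase(Enum):
    """The five manufacturer-side lifecycle phases named by Lee, et al. (2008)."""
    IMAGINATION = 1
    DEFINITION = 2
    REALISATION = 3
    SUPPORT = 4
    RETIREMENT = 5

@dataclass
class ProductDataModule:
    """Hypothetical core-product-data module: each module carries its own
    source format and update frequency, hinting at the integration burden."""
    name: str
    phase: LifecyclePhase
    source_format: str      # e.g. "STEP", "CSV", "sensor stream"
    update_frequency: str   # e.g. "per revision", "hourly"

@dataclass
class ShipProductModel:
    hull_number: str
    modules: list = field(default_factory=list)

    def modules_in_phase(self, phase):
        return [m for m in self.modules if m.phase == phase]

ship = ShipProductModel("NB-0312")  # hypothetical newbuilding number
ship.modules.append(ProductDataModule(
    "hull geometry", LifecyclePhase.DEFINITION, "STEP", "per revision"))
ship.modules.append(ProductDataModule(
    "engine telemetry", LifecyclePhase.SUPPORT, "sensor stream", "hourly"))
```

Even this toy model shows how modules differ in format and frequency; a real PLM backbone must reconcile many such heterogeneous sources across the whole lifecycle.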
2.2. PLM as Basis for Emergent Technology
As competition among vendors thrives, increasing demand drives complexity in modern
engineering design, which furthermore accentuates the nuanced and fragile trade-off between technical
and operational performance, and commercial viability (Sharma, et al., 2012). For a product in development
to meet or exceed customer expectations and simultaneously adhere to regulation, coordinate cross-
functional workloads, and deliver on budget, it would follow that through an increase in design
complexity, the resulting information produced to manage and govern the product’s total lifecycle
would increase proportionally. Mitigating potential model inconsistencies in the flow and management
of product information over the lifespan is stated as one of the major concerns and keys to a successful
PLM implementation (Thimm, et al., 2006). Minimising the probability of inconsistency in such an
information model can be achieved through meticulous data decomposition, tracking, and gathering,
ensuring a high degree of fidelity, level of detail, and perspective.
Kessler & Bierly (2002) claim that a fast-paced innovation strategy is increasingly successful if context
is predictable. It is well documented that a well-executed PLM implementation leads to availability of
quantifiable product information for purposes of knowledge decomposition and analytical insight,
increased quality assurance and compliance, and improved collaboration and communication, all of
which contribute to increased contextual transparency, which in turn facilitates more intensive agile
product development (Stark, 2015; Lee, et al., 2008).
In recent years, the term Industry 4.0 has emerged as a central topic of discussion, aiming to accentuate
the current paradigmatic transition in technology towards increased use of cyber-physical systems, as
compared to the previous three industrial revolutions, which in short consist of water- and
steam powered mechanical manufacturing, electrical mass production and labour division, and the
digital revolution from the 1960s to the 1990s, as Industry 1.0, Industry 2.0, and Industry 3.0,
respectively (Devezas, et al., 2017). Albeit with a slight manufacturing focus, Industry 4.0 does convey
a central point, namely that digital and physical processes, technology (Schmidt, et al., 2015), and
products continue to further intertwine, and as such will fundamentally change the inner workings of
supply chains, business models, and processes alike (Berger, 2014).
2.3. Can Diverse Data Have Compatible Taxonomy?
As previously introduced, vessel design, engineering, and fabrication follow traditional approaches,
and the application of conventional marine systems design theory, methods, and analytical tools can,
comparatively, be characterized the same way. Current ship design, engineering, and fabrication
approaches are fragmented, discontinuous, time-consuming, and laborious in the way they are normally
carried out. Rationalization of business and work processes (e.g., modularization, parameterization, and
other design automation techniques) has so far been tested and implemented only to a small extent. The
main cause of this distance from more modern approaches is the lack of standards and common
practices, and the diversity of taxonomies a ship can have.
The ship design value chain is divided into multiple phases and gates, and many actors are involved
(Figure 2). Traditions and approaches vary from designer to designer, from country to country, and
from yard to yard. Only to a limited extent are novel and state-of-the-art knowledge and technology
(ICT tools) in use to streamline and improve the efficiency of such work processes and actor interfaces.
Moreover, highly detailed 3D concepts are presented to the client prior to a signed contract, despite
much of the engineering later being remodelled in 2D, with few attempts to operate in 3D during the
assembly phase. This excessive re-modelling blocks innovation and consumes unnecessary time. The
cause can mainly be attributed to the difference in boundary definition and placement for the parts and
systems of the ship between phases.
Hierarchization was once considered the solution for such lack of standards: it handles the complexity
on different layers and systems, such as the SFI group system in the 1970s (Manchinu & McConnel,
1977). Unfortunately, it also constrains the understanding of the system to a specific set of rules, which
may not work in a different context. A mapping between such a stiff hierarchy and other, more pragmatic
taxonomies is then necessary. The cumbersome task of handling multiple taxonomies and diverse tags
attributed to each part/system is now promised to be handled by modern PLM systems.
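Such a mapping between a stiff hierarchy and freer taxonomies can be sketched as a simple translation table; the SFI-style codes, part IDs, and tags below are invented for illustration, not an authoritative SFI extract:

```python
# Illustrative mapping from a hierarchy code (SFI-style main group,
# simplified labels in the comments) to freer, discipline-specific tags.
sfi_to_tags = {
    "2": {"structure", "weight", "steel"},       # hull
    "4": {"deck", "outfitting", "operations"},   # ship equipment
    "6": {"machinery", "power", "maintenance"},  # machinery main components
}

def parts_matching(parts, required_tag):
    """Return part IDs whose main-group code maps to the required tag.
    parts: dict of part ID -> main-group code (both hypothetical here)."""
    return [pid for pid, group in parts.items()
            if required_tag in sfi_to_tags.get(group, set())]

parts = {"601.021": "6", "241.003": "2", "431.010": "4"}  # hypothetical part IDs
print(parts_matching(parts, "maintenance"))  # parts relevant to a maintenance view
```

A PLM system performs essentially this translation at scale, letting each discipline query the same parts through its own taxonomy.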
Figure 2: Relevant information related to activities in the ship design value-chain with example of
current commercial tools (Gaspar, 2017).
3. Emergent Technologies in the Maritime Industry
3.1. From PLM to Everything Else
PLM systems promise to keep control of a product's digital data structure, using dedicated (and
expensive) software to improve the management of, and collaboration within, the team throughout the
product development process. If modern PLM systems deliver on the promise of handling the
aforementioned challenges of multi-taxonomy and multi-disciplinarity issues, it is reasonable to assume that we
can start to use PLM as a foundation towards handling and storing diverse data, and furthermore develop
other emergent technologies.
The drawbacks of a decision to go completely via PLM are, however, well known. It is vital to
understand how complicated and time consuming the implementation of a PLM project might become
depending on the company requirements. Maritime companies often consider PLM systems too time-
and resource-consuming before they bring benefits, and tend to avoid or postpone their implementation.
Another drawback is the failures of previous implementation attempts (Levisauskaite, 2016). PLM
systems have been promising these implementation miracles for years. Many ship design companies,
including Ulstein, faced difficulties in the past when managing large volumes of data using 10-20-year-
old PLM systems, and such poor experiences are clearly reflected when they are introduced to more
modern and agile software.
3.2. Cloud Computing and the Internet of Things
To circumvent sizeable capital expenditures and other typical challenges of on-premise computing
solutions, such as required space and maintenance for hardware, or need for competency in conjunction
with advanced configuration, Cloud-computing has emerged as a viable solution. Aimed at providing
much of the same solution scope, and at greater convenience, such systems have become highly scalable
and personalised to the extent that many companies have chosen to outsource large, or all, parts of in-
house IT infrastructure in favour of cloud services (Willcocks & Lacity, 2016). Complementarily, there
has been a similar development in the adoption of Internet of Things (IoT), defined as “a set of
interconnected things (humans, tags, sensors, and so on) over the Internet, which have the ability to
measure, communicate and act all over the world.” (Díaz, et al., 2016). Whereas Cloud computing has
virtually unlimited capabilities in terms of storage and processing power, IoT infrastructure
comparatively lacks these features. As such, the inherent complementarity that allows IoT to be
abstracted from its limitations, and Cloud computing to be brought to fruition, has led to the Cloud of Things (Aazam,
et al., 2014). Leveraging the combined strengths of these technologies has become a priority for many
solution providers (Díaz, et al., 2016; Microsoft, 2016), and has certainly also lowered the barrier to
entry for parties still considering whether to pursue such technology in their organisation.
Practical IoT use has been demonstrated in a vast set of use-cases, such as the
structural health of buildings, waste management systems, traffic congestion monitoring, energy
consumption monitoring, smart lighting, building automation, nursing home patient monitoring, eating
disorder mitigation, and indoor navigation for the blind, to name some (Zanella, et al., 2014), (Al-
Fuqaha, et al., 2015). In the maritime domain, there have also been several publicized initiatives spurred
on by larger industry actors, e.g. Hyundai Heavy Industries (MarEx, 2016), Wärtsilä (Wärtsilä, 2016),
and DNV-GL (Späth, 2017).
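A common pattern in such Cloud-of-Things set-ups is to aggregate raw sensor readings at the edge before uploading them; the sketch below is a generic illustration, with field names and values that do not follow any particular vendor's schema:

```python
import json
import statistics
from datetime import datetime, timezone

def build_cloud_payload(vessel_id, sensor, readings):
    """Aggregate raw edge readings into one compact message for a cloud
    endpoint. All field names here are illustrative assumptions."""
    return json.dumps({
        "vessel": vessel_id,
        "sensor": sensor,
        "ts": datetime.now(timezone.utc).isoformat(),
        "n": len(readings),                          # samples in this batch
        "mean": round(statistics.fmean(readings), 2),
        "max": max(readings),
    })

# Hypothetical engine-speed samples gathered on board between uploads:
payload = build_cloud_payload("NB-0312", "main_engine_rpm", [748, 750, 752, 751])
```

Batching like this trades latency for bandwidth, which matters on metered satellite links at sea; the cloud side can then store, analyse, and redistribute the aggregated stream.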
3.3. Simulation and Virtual Prototyping
The rapid development of computational power and the need for robust, quantitative metrics have
progressively incentivized the implementation of simulation and virtual prototyping in the maritime
industries. It started in academic environments as a way to generate knowledge (Haddara & Xu, 1998),
and has lately expanded to rule development (Glen, 2001), training purposes (Pawlowski, 1996),
conceptual design (Chaves & Gaspar, 2016; Erikstad, et al., 2015), construction (Karpowicz & Simone,
1987), and operational management (Ludvigsen, et al., 2016).
In general, the ship design environment has been the one using the most simulation and virtual
prototyping within the maritime industry. The lack of information at early concept design stages,
together with the influence on final performance of decisions taken at this stage (up to 80% of the costs
are fixed in the concept design (Erikstad, 2007)), spurred the need to understand the consequences
and implications of decisions on technical, operational, and commercial performance (Ulstein & Brett,
2015). Specific, single-attribute simulation tools could not solve this complexity problem; therefore,
holistic, multi-attribute simulation tools have been the core focus of recent research. Concept design
workbenches, developed both by universities and industry, pursue accelerating the concept design
development process and increasing the number of alternatives that can be evaluated while
considering changing contextual factors. These workbenches approach concept design
development from an alternative perspective. Rather than focusing purely on design parameters, their
approach embraces the selection of functional requirements, and of the mission the vessel is intended
for, as a premise to design a better vessel: it is only when having the correct set of requirements that
we can decide upon the correct vessel (Gaspar, et al., 2016).
VISTA (Virtual sea trial by simulating complex marine operations) is a simulation-based workbench to
assess operability performance of a design (Erikstad, et al., 2015). Its goal is to shorten the time spent
in concept development and make it more efficient, by creating a template to configure alternative
designs and measure their performance. Notwithstanding, as a state-of-the-art simulation tool for ship
design, VISTA does lack a commercial perspective. Chaves & Gaspar (2016) present a similar
approach based on open-source, web-based applications. As an alternative application, Li, et al. (2016)
assess the value of implementing virtual prototyping to support the planning phase of offshore
operations through simulation of a vessel's manoeuvrability.
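The operability-assessment idea can be illustrated with a toy single-attribute simulation; the Weibull wave-height distribution and vessel limits below are invented for illustration and are unrelated to VISTA or any real sea-state data:

```python
import random

def operability(limit_hs, scale=1.8, shape=1.5, n=100_000, seed=7):
    """Crude operability estimate: the fraction of sampled sea states
    (significant wave height Hs, drawn from a Weibull distribution with
    assumed illustrative parameters) in which the vessel can work,
    i.e. Hs at or below its operational limit."""
    rng = random.Random(seed)
    workable = sum(rng.weibullvariate(scale, shape) <= limit_hs
                   for _ in range(n))
    return workable / n

# Compare two hypothetical design alternatives differing only in Hs limit:
for limit in (2.5, 3.5):
    print(f"Hs limit {limit} m -> operability {operability(limit):.1%}")
```

Even this crude single-attribute view lets two alternatives be ranked; a real workbench would couple many such attributes (motions, power, cost) per candidate design.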
3.4. Big Data
Following the emergence of Industry 4.0, the prevalence of digitization, and the ensuing deluge of
information and knowledge that has surfaced because of it, the topic of Big Data has been at the centre
of many discussions in terms of defining what it is, and how it can be done. In that respect, De Mauro,
et al. (2015), after a thorough review, suggest a characterization entailing volume, velocity, and
variety in terms of the what, and “… specific Technology and Analytical Methods for its transformation
into Value” as to the how. The definition leaves little to interpretation in terms of compositional criteria,
but it does leave the innate content of said criteria still as a topic under development and one to be
further explored. More specifically, the technology used to gather, process, store, and distribute various
sources of data, as well as the methodology with which said data is transformed into insight and
subsequent value creation, are the components in need of attention and development to achieve the end
goal of spurring economic value. It is generally agreed that as the volume of data keeps growing,
opportunities for new discoveries increase thereafter (Manovich, 2011). It is also anticipated to facilitate
and create substantial shifts in a range of different disciplines, bringing disruptive innovations into play
and potentially revamping how research is conducted (Kitchin, 2014). Nevertheless, organizations are
still facing significant challenges in understanding the guiding principles and value proposition during
early stage adoption (Wamba, et al., 2015). A main cited contributor is the multidisciplinary knowledge
required across topics such as statistics, programming, and other domain (industry) specific fields, to
effectively understand the business challenge at hand, and envisage the necessary solution scope that
will provide an economic surplus upon completion (Dumbill, et al., 2013).
Considering prevalent Big Data technology, Hadoop was one of the first commonly available
frameworks developed for distributed storage and processing of Big Data. Since its inception in 2003,
it has arguably democratized the utilization of big data analytics (BDA), as it can be run on computer
clusters built from cheap commodity hardware. Over time, many competing, as well as complementary,
systems have been developed, many of which have been adopted by the Apache foundation to ensure
operational maintenance and further development, e.g. Apache Pig, Apache Hive, Apache Spark. The
latter has for several applications become a contender to be reckoned with, even though the two are not
mutually exclusive and can work together in some fashions (Gu & Li, 2013). There are an untold
number of other technological solutions not mentioned herein, as a complete overview is beyond the
purpose and scope of this paper.
When considering methods and techniques for approaching, implementing, and governing Big Data
initiatives, there are many options to consider. Prominent techniques used in the context of analysis and
prediction include data mining, clustering, regression, classification, association analysis, anomaly
detection, neural networks, genetic algorithms, multivariate statistical analysis, optimization, and
heuristic search (Chen, et al., 2012).
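One of the techniques listed above, clustering, can be sketched in a few lines; the (speed, fuel rate) samples below are invented to mimic two distinct vessel operating modes:

```python
import random

def kmeans(points, k=2, iters=10):
    """Plain k-means with deterministic farthest-point initialization,
    sketching the 'clustering' technique from the list above."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    centroids = [points[0]]
    while len(centroids) < k:   # seed each new centroid far from the others
        centroids.append(max(points,
                             key=lambda p: min(d2(p, c) for c in centroids)))
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:        # assign each point to its nearest centroid
            i = min(range(len(centroids)), key=lambda i: d2(p, centroids[i]))
            clusters[i].append(p)
        centroids = [(sum(p[0] for p in c) / len(c),    # recompute means
                      sum(p[1] for p in c) / len(c))
                     for c in clusters if c]
    return centroids

# Hypothetical (speed [kn], fuel rate [t/day]) logs for two operating modes:
rng = random.Random(1)
transit = [(14 + rng.random(), 20 + rng.random()) for _ in range(50)]
standby = [(2 + rng.random(), 4 + rng.random()) for _ in range(50)]
modes = sorted(kmeans(transit + standby))
print(modes)  # one centroid near each operating mode
```

Recovering operating modes from raw logs like this is a typical first step towards condition-based monitoring and performance benchmarking.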
As has been briefly touched upon in this paper, there are many options to choose from when
evaluating technologies and methodologies for Big Data analytics (BDA). Notwithstanding, the most
impactful challenge lies in the implementation and embracement of BDA as a core part of the firm’s
business model, as a quote from Henke, et al. (2016) states: “the real power of analytics-enabled
insights comes when they become so fully embedded in the culture that their predictions and
prescriptions drive a company’s strategy and operations and reshape how the organization delivers on
them.” These initiatives are often easy to deprioritize, as leadership tends to focus on performance and
the bottom line based on individual projects or cases, whereas pivoting towards a more holistic
perspective, and aggregating each incremental gain, might depict a much larger benefit. Uncovering
this value in the early stages of BDA initiatives is paramount to gain traction and acceptance.
BDA in practice emanates from many fields and sectors, ranging from economic and business activities
to public administration, and from national security to scientific research (Chen & Zhang, 2014). State-
of-the-art applications and use-cases are copiously being developed, including behaviour prediction,
healthcare analysis, content recommendation, and traffic forecasting (Lv, et al., 2017). In maritime,
much of the focus has currently resided with geospatial analysis (Adland & Jia, 2016), logistics
optimization (Xu, et al., 2015), and condition based monitoring (Wang, et al., 2015). Additionally, the
Japan Ship Technology Research Association has been reported to invest a substantial amount of
funding into a vast BDA system designed to analyse yard workers’ behavioural patterns by gathering
and analysing imagery, radio-frequency identification tags, and physical force gathered from
smartphone accelerometers (Wainright, 2016).
3.5. Artificial Intelligence
The first reported work referencing artificial intelligence (AI) emerged in 1943 and drew upon
knowledge of the basic physiology and function of neurons in the brain, the formal analysis of
propositional logic, and Turing’s theory of computation (Russell & Norvig, 1995). Both applications and
research have come a long way since then, and to portray how AI can currently be utilized, some
explanations are in order. The most prominent interrelationship between fields within AI research is
normally depicted as a
hierarchical structure showing AI at the top, Machine Learning (ML) in the middle, and Deep Learning
(DL) as the newest addition (Algorithmia, 2016). Whereas the term AI is commonly accepted as a
machines capability of mimicking human intelligence, ML refers to algorithms that allow computers to
learn behaviours by generalizing from data, and DL a representation-learning method with multiple
levels of representation based on artificial neural networks (ANN) (Bravo, et al., 2014).
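As a minimal illustration of “learning behaviours by generalizing from data”, the sketch below trains a single artificial neuron (a logistic unit) on a toy dataset using gradient descent. It is a didactic example only; the dataset and hyperparameters are invented and unrelated to any production system discussed in this paper.

```python
import math

def train_neuron(samples, epochs=5000, lr=0.5):
    """Fit one artificial neuron (logistic unit) to labelled samples
    by gradient descent -- the 'generalizing from data' step of ML."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            z = w[0] * x1 + w[1] * x2 + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            err = p - y                       # gradient of log-loss w.r.t. z
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x1, x2):
    z = w[0] * x1 + w[1] * x2 + b
    return 1 if z > 0 else 0

# Toy training set: the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # prints [0, 0, 0, 1]
```

The features here are hand-chosen; the point of DL, as noted above, is that deeper networks learn their own intermediate representations instead.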
Using ANNs to emulate how the brain functions, in conjunction with higher levels of representation,
enables the machine to identify features that have not been designed by human engineers; they are
learned (LeCun, et al., 2015). Combined with the increased availability of high-performance parallel
computing via the cloud, this methodology has become increasingly present in consumer products, such
as language translation, image context recognition, or purchasing recommendations. Recently, Google
created a novel approach using DL to replace large parts of their existing language translation service
(Johnson, et al., 2016). Whereas most translation systems only work on a single pair of languages, this
method could handle multiple pairs despite not having been directly trained to do so. As such, based on
discovered patterns, the machine essentially created a new language that could translate via an
intermediary.
Within the maritime domain, the use of DL yields few results in terms of academic research literature.
Identified examples have shown application regarding image processing for ship detection (Tang, et al.,
2015), and the response or load prediction of offshore floating structures (Mazaheri, 2006; Uddin, et
al., 2012; Maslin, 2017).
4. Preparing for the Future - Ulstein Experiences Part II
4.1. Decision Making in Ulstein
Having introduced, in Part I of this paper, the intricacies of the maritime industry as an ingrained part
of the VUCA world, expounded on challenges related to the integration and centralization of data, and
presented a short synopsis of emerging technologies, Part II of the paper presents facets of a
practical implementation of big data analytics from Ulstein’s perspective.
It is important to approach vessel design in the conceptual phase from different perspectives: i) to
differentiate among different solutions; ii) to better understand the consequences of any small input
change; iii) to measure the goodness of fit between the final product and the requirements; iv) to
benchmark more meaningfully against market competitors; v) to make better and more robust decisions
in vessel design; and vi) to support the development of effective sales arguments for what constitutes a
better vessel. Furthermore, Ulstein appreciates that proper and effective decision making should
recognize that ship design is a multi-variable-based decision-making process, and that big data is
oriented towards securing a proper balancing of new vessel designs, with appropriate trade-offs among
requirements, to resolve the inherent complexity of ship design. It is essential that the decision-making
model can demonstrate, separate, and distinguish among the effects of design parameters (main
dimensions, power, mission attributes, machinery, etc.) on the final vessel design performance yield
(Ebrahimi, et al., 2015). The Ulstein approach typically integrates multi-criteria evolutionary problems
with multiple-objective optimization problems to arrive at a better solution. Ulstein applies
complementary methods for benchmarking ship designs: i) ranking-based vessel design benchmarking,
including indices developed from vessel missions, scoring by indices, and ranking by statistics; and
ii) hierarchical multivariate-based vessel design benchmarking according to smarter, safer, and greener
performance perspectives, through hierarchical factor categorization, metric attribution of design
factor causal map matrices, and hierarchical comparative-based ranking.
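The ranking-based benchmarking of designs by indices and statistics can be illustrated with a minimal sketch. The design names, performance indices, and weights below are hypothetical, not Ulstein's actual metrics:

```python
# Hypothetical performance indices (0-100) for three candidate designs,
# scored against mission-derived criteria.
designs = {
    "Design A": {"fuel_economy": 72, "cargo_capacity": 85, "seakeeping": 64},
    "Design B": {"fuel_economy": 88, "cargo_capacity": 70, "seakeeping": 75},
    "Design C": {"fuel_economy": 65, "cargo_capacity": 90, "seakeeping": 60},
}
weights = {"fuel_economy": 0.4, "cargo_capacity": 0.35, "seakeeping": 0.25}

def weighted_score(indices, weights):
    """Aggregate per-criterion indices into a single ranking score."""
    return sum(weights[c] * v for c, v in indices.items())

# Rank candidates from best to worst aggregate score.
ranking = sorted(designs, key=lambda d: weighted_score(designs[d], weights),
                 reverse=True)
for d in ranking:
    print(f"{d}: {weighted_score(designs[d], weights):.1f}")
```

In practice the weights themselves would be derived from the vessel mission and validated statistically, which is where the multivariate methods mentioned above come in.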
4.2. Data Integration in Practice
As has been highlighted previously, the challenge of creating a centralized, organized, and maintained
single version of the truth is a formidable yet increasingly surmountable task. Facilitated by new
storage and processing technologies such as Apache Hadoop and Microsoft Azure, NoSQL, or T-SQL,
and combined with modern processes of transformation, augmentation, and automation using languages
such as R, M (Power Query), Python, and PowerShell, Ulstein has initiated the creation of an integrated
ecosystem of information: the Ulstein Big Data Repository. Currently, the scope has largely covered
domains such as finance, market intelligence, and resource management, much because of the
transaction-based nature of the residing information. Moving forward, the product and all aspects
pertaining to the lifecycle of vessels and ship designs will increasingly become a focal point, seeing as,
in an ideal state, the product is at the heart of all business activities. This drives the further development,
integration, and extension of PLM.
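The consolidation of heterogeneous sources into one keyed repository can be sketched minimally as follows. The source systems, field names, and records are invented for illustration and do not represent the actual Ulstein Big Data Repository schema:

```python
# Sketch: consolidating vessel records from two hypothetical source
# systems (an ERP export and a market database) into one keyed store.
erp_export = [
    {"imo": 9501234, "name": "VESSEL ONE", "loa_m": 89.6},
    {"imo": 9507777, "name": "VESSEL TWO", "loa_m": 93.4},
]
market_db = [
    {"IMO Number": "9501234", "Vessel Name": "Vessel One", "Built": 2014},
    {"IMO Number": "9509999", "Vessel Name": "Vessel Three", "Built": 2016},
]

def integrate(erp_rows, market_rows):
    """Normalize field names and merge on the IMO number, the shared key."""
    repo = {}
    for r in erp_rows:
        repo[int(r["imo"])] = {"name": r["name"].title(), "loa_m": r["loa_m"]}
    for r in market_rows:
        rec = repo.setdefault(int(r["IMO Number"]), {})
        rec.setdefault("name", r["Vessel Name"])   # keep existing name if set
        rec["built"] = r["Built"]
    return repo

repo = integrate(erp_export, market_db)
print(sorted(repo))  # one record per vessel across both sources
```

The essential point is the shared key and the field normalization; at scale the same pattern runs inside the storage and transformation technologies listed above rather than in an in-memory dictionary.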
4.3. PLM as an Implementation
Implementation of modern PLM is a key success factor for the modernization of Ulstein’s processes and
methods in novel ship design. Ulstein’s experience is based on key implementations and testing of modern,
established commercial software tools (e.g. Autodesk, Siemens) in conjunction with internal
proprietary knowledge. As an example, testing and implementation of the non-conventional 4GD
framework in ship design has been performed as a comparison to the conventional structuring approach
(Levisauskaite, 2016). Several design and change cases were evaluated and analysed, emphasizing
challenges in ship design such as exchange, remodelling, alternatives, and reuse across vessels. The
method, described in detail in (Levisauskaite, et al., 2017), proved to be a powerful tool in this research
for verifying whether 4GD improves exchange and facilitates 3D re-use.
The mentioned case study showed that the 4GD approach requires different thinking about assemblies
and the designing process, as its components and features are distinct from those of a traditional
assembly approach. Due to the absence of assembly constraints, together with the flat assembly structure
in 4GD, the positioning of parts becomes straightforward and changes are accomplished smoothly. These
features make the exchange of parts non-restrictive and fluent in comparison to the traditional
assembly approach. Additionally, 4GD proved to be an efficient solution for
alternative vessels or various ship configurations across a vessel family. It supports designers
in avoiding remodelling and instead reusing 3D models of previous products. In ship design, it is
a powerful tool that is innovative, cost effective, and time saving (Levisauskaite, 2016).
However, this implementation only scratched the surface of the 4GD framework from an entry-level
point of view. 4GD is a highly advanced approach to working with and organising design data, which
requires advanced competence in programming, configuration, and working with Teamcenter and Siemens
NX to gain sufficient benefit. Moreover, it requires well-established needs and requirements to
efficiently employ and integrate 4GD into business processes. The installation and configuration must
be well set up and customized. To verify whether 4GD is a beneficial approach for continuous
improvement in shipbuilding, it must be implemented and tested in maritime business and products.
Ulstein plans to proceed along this experimental line of implementation initiatives.
4.4. Accelerated Business Development
Ulstein has over the years introduced and implemented an Accelerated Business Development (ABD)
methodology to enhance and strengthen its capability to effectively solicit relevant
stakeholders´ expectations and desires when it comes to the realization of ship designs and newbuilding
projects (Ulstein & Brett, 2009).
The core elements of the ABD approach aim to better guide ship designers, yards, cargo and
ship owners in realizing a business opportunity within intermodal transport or offshore field
development work, whereby ship design is utilized to achieve a competitive advantage. The approach
advocates that a new or improved solution system, in which the ship plays a significant role, shall fulfil
the needs and expectations of all the involved stakeholders in the best possible way through the multi-
attribute decision-making ABD approach. This approach makes it possible to follow the complex and
normally fragmented processes of business development related to maritime transport, offshore oil &
gas fields, and the pertinent ship design in a systemic and explicit way.
Traditionally, the big-data-oriented, logistics-based requirements of a transport or offshore field system
have often been included in ship conceptualization and design in a non-structured and non-scientific
way. Knowledge about logistics is spread over the many actors and subjects involved in realizing the
transport system at hand, where the ship design solutions are integral parts of the operation of such
systems but seldom an integral part of the business development process. The actors in such processes
are ship operators, brokers, investors, designers, consultants, and companies managing transport chains.
None of these parties in isolation has the full picture and specific knowledge needed to assess a ship's
technical and operational performance in a broader business context. Traditions and specialization over
many years among actors in the overall realization value chain are to blame. Historically, separate
documents such as outline, contract, and/or building specifications and drawings have constituted the
communicational instrument among the players in the overall decision-making process. Owners'
specifications are typically formulated based mainly on their experience in ship operations. Expanding
on the experiences of the past is more typical than asking what is really needed. Yards
or designers, on the other hand, typically optimize a vessel with respect to preferred engineering criteria,
such as installed engine power, speed, or lane meters, and frequently their own production facilities. If
more specific and complementary technical, operational, and commercial project information is
necessary, ad-hoc inquiry sessions are typically held with different information sources. More often
than is admitted, solutions developed along these lines are presented as best practice and state-of-the-
art, without really meeting preferred requirements as to applying a sound set of rationales and scientific
reasoning. The ABD approach counteracts these discrepancies and inefficiencies and secures a holistic
management of complex data such as metrics, film/video, sensor signals, and the like.
Ulstein has carried out more than 25 such ABD processes on its own development projects and with
customers. Comprehensive data analytics processes have been carried out as complementary fact-finding
following such ABD approaches.
4.4.1. Business Intelligence and Market Analysis
The “Ulstein Business Intelligence Methodology” is intended to create, map and organize the necessary
resources and tasks that streamline the process from an initial BI need to final BI product.
Figure 3: Methodology Sketch (left) and Market Information Data Model (right)
The preconceived idea of an Ulstein methodology was originally sketched as shown in Figure 3 above.
The ensuing process of refining this model was performed together with experienced domain experts
from external companies. The methodology is purposefully designed to be iterated upon, such that,
during or after each case, specific feedback or experiences are continuously integrated, adjusted,
and updated. In Ulstein, marketing research and analytics support executive, marketing, sales, and
product development managers with more effective direction setting and with complex everyday and
strategic decision-making.
Ulstein International is today the main proprietor of market information and, as such, actively subscribes
to various sources consisting of news subscriptions, PDF reports, and databases, to name a few. The
databases constitute a driving portion of the performed market analysis, and traditionally would have
to be individually cleaned, modified, and processed before any analysis could occur - a task that could
easily extend to several hours. The task of the project was to automate the manual
processing and integrate the best attributes from each database into one central version of the truth that
is always up to date. An overview of the model is illustrated in Figure 3 above.
4.4.2. Field Studies and Big Data Analytics
As a slight divergence from the most common interpretation of Big Data, field studies are a well-known
method for the acquisition of operational data and the facilitation of detailed, holistic, and accurate
information that typically is contained as tacit knowledge. They play an important role in acquiring
contextual, systems-oriented, and human-centered knowledge from on-site operations and, during
execution, generate an extensive amount of data. Sources of information include video, audio, pictures,
interviews, physiological monitoring, notes, diagrams, and models, in addition to the plethora of both
on-board systems and provided third-party sources such as the integrated automation system (IAS),
automation systems for winches and cranes, the dynamic positioning system, accelerometers, the cargo
load calculator, the route planner, weather forecasts, radar imagery, task plans, operation logs, and so on.
Aggregated, these sources can generate upward of a terabyte of data per day in countless formats, and
as such contribute significantly to the data integration and big data analytics challenge presented
previously in the paper. In cooperation with various research institutions, Ulstein has developed a
methodology to handle the various challenges such an approach entails and is in the process of
creating a corresponding infrastructure to convert the data into knowledge.
As one of the initial use-cases, an analysis of AIS (Automatic Identification System) information
containing data for nearly 6,000 vessels was performed. Gathering such data for integration and analysis
improved existing knowledge of design performance, which traditionally had been extracted from
theoretical models and simulations. Additionally, it provided an opportunity to compare aspects of
operational features, such as service speed and operating draft, with design speed and design draft,
respectively. Covering 24 days of on-site operation, the vessel data was processed, stored in the
repository, and then analysed. Despite the relatively short period of evaluation, it showed that offshore
vessels typically operate 30% below designed service speed and 20% below design draft during most
commercial operations. This significant discrepancy between utilized and available capacities indicates
the need for further investigation, development, and eventual design target resetting. This type of
analysis provides new opportunities for verifying actual performance, including variations in motions
and fuel consumption under the respective weather conditions.
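The comparison of operational observations against design values can be sketched as follows. The vessel figures are invented, chosen only to mirror the order of magnitude of the reported findings; they are not the actual AIS dataset:

```python
# Sketch: comparing AIS observations against design values for one vessel.
design = {"service_speed_kn": 14.0, "design_draft_m": 6.5}
ais_observations = [                      # (speed over ground kn, draft m)
    (9.4, 5.1), (10.2, 5.3), (9.8, 5.2), (10.0, 5.4),
]

def mean(xs):
    return sum(xs) / len(xs)

# Share of the design value actually utilized in operation.
speed_util = mean([s for s, _ in ais_observations]) / design["service_speed_kn"]
draft_util = mean([d for _, d in ais_observations]) / design["design_draft_m"]
print(f"speed utilization: {speed_util:.0%}")   # prints 70%
print(f"draft utilization: {draft_util:.0%}")   # prints 81%
```

Run fleet-wide over millions of position reports, the same ratio is what surfaces the systematic gap between designed and utilized capacity described above.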
4.4.3. Fast Track Vessel Concept Design Analysis (FTCDA)
Companies operating in the design of maritime units are challenged by the need to incorporate
flexibility, innovation, speed, and agility into their business model (Ulstein & Brett, 2009). The
conventional concept design development process, based on work processes relating to the traditional
design spiral for vessels, has proven to be ineffective when it comes to ensuring very short customer
response times and robustness of the results. It is too time consuming and resource demanding, and
drastically limits the number of alternatives that can potentially be evaluated for goodness of fit.
In response to this, Ulstein has developed a Fast-Track Concept Design Analysis tool (FTCDA). This
simulation tool combines multivariate statistics, network resources and design knowledge/expertise to
accelerate effective decision making in vessel concept design. The FTCDA is an integration tool which
gathers different modules of the conceptual design process in a unified digital platform. A holistic
approach, combining technical, commercial, and operational perspectives, ensures a more balanced and
robust design solution. The overall concept design development is benchmarked against peer vessel
alternatives, including existing vessels. Hence, the concept design is validated, and potential points of
improvement can be identified and rectified to improve the overall performance of the proposed vessel
design solutions.
This comprehensive approach requires a multi-disciplinary design platform combining the different
aspects of maritime systems. Technical analyses cover stability, structural strength, and propulsion
resistance in calm water and in waves. Hydrodynamic aspects such as seakeeping and operability,
combined with the evaluation of capacities and capabilities, give the operational perspective. The
feasibility of the configured solution is assessed simultaneously in the tool, including the commercial
perspective: the newbuilding price and operational expenses are contrasted with the potential revenue
capability and costs of the design solution. This fast-track evaluation of design performance enables
designers and decision makers to better perceive the implications and consequences of individual design
changes such as: main dimensions, mission equipment, operational environment, crew nationality,
material, or build country.
Figure 4. Collage of example results from Ulstein’s FTCDA tool (left) and 3D configurator (right)
The implementation of FTCDA in early design phases has demonstrated principal advantages:
more robust decisions and higher quality of vessel design solutions, due to the availability of additional
information at an early stage of the concept design process for the problem at hand. Other achievements
include a significant reduction in response time and committed resources, and the capability of
evaluating (visually and analytically) multiple design solutions. In addition, it brings the possibility of
performing sensitivity analyses of cost, capacities, and capabilities towards specific design parameters.
Parallel use of the FTCDA, ABD, and other Big Data Analytics tools allows Ulstein to validate and
verify promising solutions very quickly. This, in turn, has dramatically reduced response times towards
customers.
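The sensitivity analyses mentioned above can be sketched as a one-at-a-time parameter study: perturb each design parameter slightly and record the relative change in a cost model. The cost formula and parameter values below are purely illustrative and not the FTCDA implementation:

```python
def newbuild_cost(params):
    """Toy cost model: cost grows with hull size and installed power.
    Coefficients and exponents are invented for illustration."""
    return (1.2 * params["loa_m"] * params["beam_m"]
            + 0.8 * params["power_kw"] ** 0.9)

def sensitivities(model, params, step=0.01):
    """Relative cost change per +1% change in each parameter
    (a finite-difference elasticity)."""
    base = model(params)
    out = {}
    for name in params:
        bumped = dict(params, **{name: params[name] * (1 + step)})
        out[name] = (model(bumped) - base) / base / step
    return out

design = {"loa_m": 90.0, "beam_m": 18.0, "power_kw": 6000.0}
for name, s in sensitivities(newbuild_cost, design).items():
    print(f"{name}: {s:.2f}")   # elasticity of cost w.r.t. each parameter
```

The same pattern generalizes to capacities and capabilities: any evaluable model of the design can be probed parameter by parameter to show decision makers where a solution is most cost- or performance-sensitive.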
4.4.4. Virtual Configuration and Prototyping
FTCDA can also be combined with modern virtual prototyping technology to deliver fast, real-time
3D parametric concepts. A simplistic version of the Ulstein 3D configurator is presented in Figure 4.
It consists of pre-defined modules that are combined in a web-based environment using JavaScript and
WebGL. It provides real-time 3D models based on pre-defined choices such as modules, main
dimensions, and style preferences. It can also be combined with the FTCDA to deliver a full 2D/3D
analytical package during the early and exploratory stages of design.
5. Conclusion
The Ulstein experience of approaching and using Big Data Analytics at a larger scale has already proven
its worth. Higher productivity in ship design and more robust, higher-performance ship design
solutions are evident. Close to a hundred recent case studies have exemplified lead-time gains and
performance yields of the vessel design solutions being developed. Expertise-based design of ships relies
on the domain knowledge and experience of naval architects. Their capability and capacity to also
utilize and maximise the use of big data analytics and artificial intelligence have demonstrated value
and appropriateness in handling the maritime industry’s challenges, represented by a continual influx of
volatility, uncertainty, complexity, and ambiguity, as described in Chapter 1. Such approaches, having
been reviewed, discussed, and critically scrutinised in this paper, open up three potential strengths for
companies operating in the maritime industry:
Flexibility: The usability of resources outside the portfolio of services and products provided by
a company is limited. Typically, entering a new segment, even within the same discipline,
requires training and in many cases the acquisition of new or additional expertise. The same happens
when going beyond conventional solutions, where traditional design principles may not apply.
Agility: The need to acquire new knowledge or expertise is a time-consuming activity that may
limit the response time of companies to new market demands, making them less competitive.
Robustness: Decision-making under a high degree of uncertainty exists because of a lack of data and/or
its misinterpretation. More and better-qualified (verified and validated) data reduces uncertainty
and leads to a more effective decision-making process.
Ulstein has undertaken a substantial digitalisation effort in recent years and has developed and
implemented a set of new, internally developed Big Data Analytics based tools and knowledge
repositories to enhance its ship design activity. It is envisaged that this effort is only a good start in
exploring the benefits and potential of Big Data Analytics and the digitalisation of business and work
processes. The continuation of improvement activities will encompass, but not be restricted to,
expansions such as online web-based vessel configuration, algorithmic-intelligence-supported naval
architecture and vessel engineering work, and expansions of artificial intelligence based tools. Continual
improvement work along these strategic avenues and the steady inclusion of new internally and
externally developed analytics tools are bound to happen, thus accelerating the effectiveness of
Ulstein’s operations.
References
Aazam, M., Khan, I., Alsaffar, A. A. & Huh, E.-N., 2014. Cloud of things: integrating internet of things
and cloud computing and the issues involved.. Anchorage, Alaska, USA, IEEE, pp. 414-419.
Adland, R. O. & Jia, H., 2016. Vessel speed analytics using satellite-based ship position data. s.l., IEEE,
pp. 1299-1303.
Ahamed, Z., Inohara, T. & Kamoshida, A., 2013. The Servitization of Manufacturing: An Empirical
Case Study of IBM. International Journal of Business Administration, 4(2), pp. 18-26.
Al-Fuqaha, A. et al., 2015. Internet of things: A survey on enabling technologies, protocols, and
applications.. IEEE Communications Surveys & Tutorials, pp. 2347-2376.
Algorithmia, 2016. [Online]
Available at: http://blog.algorithmia.com/ai-why-deep-learning-matters/
Bennet, N. & Lemoine, G. J., 2014. What a difference a word makes: Understanding threats to
performance in a VUCA world. Business Horizons, Issue 57, pp. 311-317.
Berger, R., 2014. Industry 4.0. The new industrial revolution. How Europe will succeed. Roland Berger
Strategy Consultants, March.
Borch, O. J. & Solesvik, M. Z., 2016. Partner selection versus partner attraction in R&D strategic
alliances: the case of the Norwegian shipping industry. International Journal of Technology Marketing,
11(4).
Bravo, C. E. et al., 2014. State of the Art of Artificial Intelligence and Predictive Analytics in the E&P
Industry: A Technology Survey. SPE Journal, 19(4), pp. 547-563.
Chaves, O. & Gaspar, H., 2016. A Web Based Real-Time 3D Simulator for Ship Design Virtual
Prototype and Motion Prediction. Lecce, Italy, s.n., pp. 410-419.
Chen, C. P. & Zhang, C. Y., 2014. Data-intensive applications, challenges, techniques and technologies:
A survey on Big Data. Information Sciences, Volume 275, pp. 314-347.
Chen, H., Chiang, R. H. & Storey, V. C., 2012. Business intelligence and analytics: From big data to
big impact.. MIS quarterly, 36(4), pp. 1165-1188.
Corsi, C. & Akhunov, A., 2000. Innovation and Market Globalization: The Position of SME's. Burke,
Virginia, USA: IOS Press.
De Mauro, A. et al., 2015. What is big data? A consensual definition and a review of key research
topics.. s.l., AIP, pp. 97-104.
Devezas, T., Leitão, J. & Sarygulov, A., 2017. Industry 4.0: Entrepreneurship and Structural Change
in the New Digital Landscape. s.l.:Springer.
Díaz, M., Martín, C. & Rubio, B., 2016. State-of-the-art, challenges, and open issues in the integration
of Internet of things and cloud computing. Journal of Network and Applications, pp. 99-117.
Dijkshoorn, N., 1977. Interaction of costs and technological developments in the shipping industry.
Rotterdam, Netherlands, s.n.
Dumbill, E. et al., 2013. Educating the next generation of data scientists. Big Data, 1(1), pp. 21-27.
Ebrahimi, A. et al., 2015. Better decision making to improve robustness of OCV designs.. s.l., IMDC.
Ebrahimi, A. et al., 2015. Parametric OSV Design Studies - precision and quality assurance via updated
statistics. Tokyo, Japan, s.n.
Erikstad, S. O., 2007. Efficient Exploitation of Existing Corporate Knowledge in Conceptual Ship
Design. Ship Technology Research, 54(4), pp. 184-193.
Erikstad, S. O., Grimstad, A., Johnsen, T. & Borgen, H., 2015. VISTA (Virtual sea trial by simulating
complex marine operations): Assessing operability at the design stage. Tokyo, Japan, s.n.
Gaspar, H., 2017. EMIS Project: Final Report to the Research Council of Norway (RCN), Ålesund:
NTNU: s.n.
Gaspar, H. M., Hagen, A. & Erikstad, S. O., 2016. On designing a ship for complex value robustness.
Ship Technology Research, 63(1), pp. 14-25.
Gaspar, H. M., Ross, A. M., Rhodes, D. H. & Erikstad, S. O., 2012. Handling Complexity Aspects in
Conceptual Ship Design. Glasgow, United Kingdom, s.n.
Glave, T., Joerss, M. & Saxon, S., 2014. The hidden opportunity in container shipping. McKinsey &
Company, November.
Glen, I., 2001. Ship Evacuation Simulation: Challenges and Solutions. SNAME Transactions, Volume
109, pp. 121-139.
Gu, L. & Li, H., 2013. Memory or time: Performance evaluation for iterative operation on hadoop and
spark.. s.l., IEEE, pp. 721-727.
Haddara, M. & Xu, J., 1998. On the identification of ship coupled heave-pitch motions using neural
networks. Ocean Engineering, 26(5), pp. 381-400.
Henke, N., Libarikian, A. & Wiseman, B., 2016. Straight talk about big data. McKinsey Quarterly, 10.
Johnson, M. et al., 2016. Google's Multilingual Neural Machine Translation System: Enabling Zero-
Shot Translation. arXiv preprint arXiv:1611.04558.
Kahneman, D. & Klein, G., 2009. Conditions for Intuitive Expertise: A Failure to Disagree. American
Psychologist, 64(6), pp. 515-526.
Karpowicz, A. S. & Simone, V., 1987. An application of computer simulation methods in ship
production process. Computers in Industry, 9(1), pp. 37-51.
Kessler, E. H. & Bierly, P. E., 2002. Is faster really better? An empirical test of the implications of
innovation speed.. IEEE Transactions on Engineering Management, 49(1), pp. 2-12.
Kitchin, R., 2014. Big Data, new epistemologies and paradigm shifts. Big Data & Society, 1(1), p.
2053951714528481.
Kudyba, S., 2014. Big data, mining, and analytics: components of strategic decision making. s.l.:CRC
Press.
LeCun, Y., Bengio, Y. & Hinton, G., 2015. Deep learning. Nature, 521(7553), pp. 436-444.
Lee, S. G., Ma, Y. S., Thimm, G. L. & Verstraeten, J., 2008. Product lifecycle management in aviation
maintenance, repair and overhaul.. Computers in industry, 59(2), pp. 296-303.
Levisauskaite, G., 2016. Implementation of 4GD framework in Ship Design for improving exchange
and 3D reuse, Aalesund: MSc Thesis NTNU.
Levisauskaite, G., Gaspar, H. M. & Ulster, B., 2017. 4GD framework in Ship Design. Cardiff, COMPIT
2017.
Li, G. et al., 2016. Towards a Virtual Prototyping Framework for Ship Maneuvering in Offshore
Operations. Shanghai, China, s.n.
Ludvigsen, K. B., Jamt, L. K., Husteli, N. & Smogeli, Ø., 2016. Digital Twins for Design, Testing and
Verification Throughout a Vessel's Life Cycle. Lecce, Italy, s.n., pp. 448-456.
Lv, Z. et al., 2017. Next-generation big data analytics: State of the art, challenges, and future research
topics. IEEE Transactions on Industrial Informatics.
Manchinu, A. & McConnel, F., 1977. The SFI Coding and Classification System for Ship Information..
s.l., Proceedings of the REAPS Technical Symposium.
Manovich, L., 2011. Trending: The promises and the challenges of big social data. Debates in the digital
humanities, Volume 2, pp. 460-475.
MarEx, 2016. Shipbuilder Looks to Internet of Things for Future Business. [Online]
Available at: http://www.maritime-executive.com/article/shipbuilder-looks-to-internet-of-things-for-
future-business
Maslin, E., 2017. Neural networking by design. Offshore Engineer, 1 3, pp. 26-27.
Mazaheri, S., 2006. The Usage of Artificial Neural Networks in Hydrodynamic Analysis of Floating
Offshore Platforms.. International Journal of Maritime Technology, 3(4), pp. 48-60.
McCartan, S. et al., 2014. European boat design innovation group: The marine design manifesto.
Coventry, UK, Marine Design.
Microsoft, 2016. News Center. [Online]
Available at: https://news.microsoft.com/2016/07/11/ge-and-microsoft-partner-to-bring-predix-to-
azure-accelerating-digital-transformation-for-industrial-customers/
Morais, D., Waldie, M. & Larkins, D., 2011. Driving the Adoption of Cutting Edge Technology in
Shipbuilding. Berlin, s.n., pp. 490-502.
Ludvigs, C. et al., 2013. New Approaches to Through-life Asset Management in the Maritime Industry.
Cranfield, United Kingdom, s.n., pp. 219-224.
Pawlowski, J. S., 1996. Hydrodynamic modelling for ship manoeuvring simulation. Rotterdam,
Netherlands, s.n.
Russell, S. & Norvig, P., 1995. Artificial Intelligence: A modern approach. Englewood Cliffs: Citeseer.
Schmidt, R. et al., 2015. Industry 4.0-potentials for creating smart products: empirical research results.
s.l., Springer, pp. 16-27.
Schoemaker, P. J. & van der Heijden, C. A., 1992. Integrating Scenarios into Strategic Planning at
Royal Dutch/Shell. Strategy & Leadership, 3(20), pp. 41-46.
Sharma, R. et al., 2012. Challenges in computer applications for ship and floating structure design and
analysis.. Computer-Aided Design, 44(3), pp. 166-185.
Späth, N., 2017. DNV GL’s new Veracity industry platform unlocks the potential of big data. [Online]
Available at: https://www.dnvgl.com/news/dnv-gl-s-new-veracity-industry-platform-unlocks-the-
potential-of-big-data-85547
Stark, J., 2015. Product lifecycle management. s.l.:Springer International Publishing.
Stopford, M., 2009. Maritime Economics. Abingdon, United Kingdom: Taylor & Francis.
Tang, J., Deng, C., Huang, G. B. & Zhao, B., 2015. Compressed-domain ship detection on spaceborne
optical image using deep neural network and extreme learning machine.. IEEE Transactions on
Geoscience and Remote Sensing, 53(3), pp. 1174-1185.
Thimm, G., Lee, S. G. & Ma, Y. S., 2006. Towards unified modelling of product life-cycles.. Computers
in Industry, 57(4), pp. 331-341.
Tomar, G. S. et al., 2016. The Human Element of Big Data: Issues, Analytics, and Performance.
s.l.:CRC Press.
Uddin, M., Jameel, M., Razak, H. A. & Islam, A. B. M., 2012. Response prediction of offshore floating
structure using artificial neural network. Advanced Science Letters, 14(1), pp. 186-189.
Ulstein, T. & Brett, P. O., 2009. Seeing what's next in design solutions: Developing the capability to
develop a commercial growth engine in marine design. Trondheim, Norway, s.n.
Ulstein, T. & Brett, P. O., 2015. What is a better ship? - It all depends.... Tokyo, Japan, s.n.
Wainwright, D., 2016. Ship Sales. [Online]
Available at: http://www.tradewindsnews.com/shipsales/774217/japans-shipbuilders-look-to-
technology-to-gain-edge
Wamba, S. F. et al., 2015. How ‘big data’ can make big impact: Findings from a systematic review and
a longitudinal case study. International Journal of Production Economics, Volume 165, pp. 234-246.
Wang, H. et al., 2015. Big data and industrial internet of things for the maritime industry in
northwestern Norway. s.l., IEEE, pp. 1-5.
Willcocks, L. P. & Lacity, M. C., 2016. The new IT outsourcing landscape: from innovation to cloud
services. s.l.:Springer.
Wärtsilä, 2016. Optimising ship lifecycle efficiency. [Online]
Available at: http://cdn.wartsila.com/docs/default-source/services-documents/white-
papers/w%c3%a4rtsil%c3%a4-bwp---optimising-ship-lifecycle-efficiency.pdf?sfvrsn=2
Xu, J., Huang, E., Chen, C.-H. & Lee, L. H., 2015. Simulation optimization: a review and exploration
in the new era of cloud computing and big data. Asia-Pacific Journal of Operational Research, 32(3),
p. 1550019.
Zanella, A. et al., 2014. Internet of things for smart cities. IEEE Internet of Things Journal, 1(1), pp. 22-32.