Proceedings of the IE 2019 International Conference
www.conferenceie.ase.ro
PROGRESSIVE MANAGEMENT METHODOLOGY FOR
REAL-TIME BUSINESS INTELLIGENCE DECISION SYSTEMS
Cristian PĂUNA 1
Economic Informatics Doctoral School
Bucharest Academy of Economic Studies
cristian.pauna@ie.ase.ro
Abstract. Business intelligence systems represent a significant trend today. Choosing the
right project management methodology is an essential step for a successful business
intelligence implementation. New aspects and perspectives are included in this process
nowadays due to new requirements imposed by the real-time activities. The automated
decision-making systems used in different activity domains and the low-latency responses
required by different processes determine new specifications for the entire system. The
response delay of each time chain component has become a design factor. Also, using
automated decision-making systems, the human factor is excluded from an important part of
the decision process. To manage the decision tree appropriately, the human and automated
decision units must also be included in the business intelligence system design. It was found
that the results obtained after the implementation of a real-time decision system will lead
to new requirements for the business intelligence system itself and will produce new
resources for a better and improved solution. This progressive implementation needs a
suitable management methodology in order to permit evolutive adaptability for the entire
system. This paper will present the Progressive Management Methodology especially
designed for a successful Real-Time Business Intelligence Decision System implementation.
The model permits the analysis, design, implementation, and improvement for the real-time
components considering the time-delay as a design factor. Besides, the human and automated
decision units will be included and analyzed in the decision tree together with the suitable
control links between the two parts. The real-time components also contribute to the
development of specific design methods in order to assure the fluent functionality of the
entire system. Examples from a real-time business intelligence decision system of a capital
investment company will be presented in order to reveal different aspects related to the time-
chain, the decision tree and the particularities of progressive modern implementation.
Keywords: business intelligence system, automated decision-making system, real-time
system, low-latency response, decision tree, progressive management methodology.
JEL classification: M15, O16, G23
1. Introduction
Design and implementation of business intelligence systems (BIS) is an essential activity
today. In order to adapt and improve business decisions, more and more companies are
developing complex BIS. The global business intelligence and analytics software market was
forecast to reach $18.3 billion in 2017, an increase of 7.3 percent from 2016, and to grow to
more than $22.8 billion by the end of 2020, according to the latest forecast from Gartner,
Inc. [1] This fast-growing market still faces challenges, especially due
1 This paper was co-financed by Algorithm Invest SRL (https://algoinvest.biz)
to the economic climate changes and also as a result of continuous modifications in the
system requirements. It is well known that a significant share of BIS fail in their
implementation attempts due to inappropriate project management, as a result of an
inadequate analysis process, or simply because of the misallocation of resources. From the
known predictions, through 2017 about 60 percent of big data projects would fail to go
beyond piloting and experimentation and would be abandoned [2]. This paper aims to improve
this factor by analyzing different failure reasons and improving the project methodology in
order to increase the success rate. The original Progressive Management Methodology
(ProgressM) will be revealed in this article, together with practical examples and facts from
a Real-Time Business Intelligence Decision System implemented with this method.
It is well known that BIS analyze the past in order to improve the future using
information technology. BIS have been described as a broad category of applications,
technologies, and processes for gathering, storing, accessing and analyzing data to help
business users make better decisions [3]. Even though they are technological systems, the
human factor is involved in this process to design, implement and improve the system. The
failure or success of a BIS implementation can be quantified only after the decisions taken
produce economic effects in the enterprise. The adopted project management methodology is
essential in order to build a successful system. It was found that resource planning is a
critical activity, and it is mandatory to include it in BIS development as a standalone
process. Besides, the decision tree must also be included in the BIS design. Moreover, once
the system uses automated decision processes, the decision tree becomes a significant part of
the system. An important part of the requirement changes will come from the real-time
decisions and from the links between the human and automated decision levels.
The business needs will impose the BIS project requirements. These factors are in constant
change today, and this reality will dictate significant modifications in the design,
implementation and management process of the BIS projects as we will see in this paper. It
was found that a critical failure factor for the BIS is the delay between the data analysis
results and the decision support level needs. Informatics systems providing aggregated data
and reports once per day for a decision activity that needs real-time information for quick
adjustments of the main activity are not a successful solution. Even an update made once
per hour for the aggregated data will not assure stable functionality of a real-time decision
process. For these cases, real-time business intelligence systems (RTBIS) are required. These
are systems using applications, technologies, and processes to collect, access, store and
analyze real-time data streams in order to aggregate real-time decisions. Due to the fast speed
implied, RTBIS usually include automated decision-making components to assure the fast
response required. These are called in this paper Real-Time Business Intelligence Decision
Systems (BIDS); they are BIS generating and executing real-time automated decision streams
in order to produce enterprise results.
A particular domain where BIDS are required is the domain of capital markets. Due to the
large price volatility in the financial markets, the trading and investment decisions must be
made and executed in a very short time in order to catch the best price levels. Profit is the
reason to participate in the capital markets; it is made from the difference between the buy
and the sell prices. A fast-evolving price market requires real-time price data analysis and a
real-time decision process in order to buy or sell at the desired price. The
delayed orders received by the brokerage system will be ignored if the price moves too far
from the current real-time quote. A performing, stable system today has a delay under 10-100
milliseconds. Different brokerage systems can accept delays of up to 300-500
milliseconds, but in these cases they will execute the orders at the market price instead of the
price ordered by the investor. All of these conditions impose a key design factor for BIDS
built for modern financial investment companies: the low-latency response of the entire system.
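The delay thresholds above can be illustrated with a short sketch. This is not code from the paper's system; the names (`Order`, `route_order`) and the exact cut-off values are assumptions, used only to show how a broker-side gate might treat an order depending on its measured delay.

```python
from dataclasses import dataclass

STABLE_LIMIT_MS = 100       # a performing, stable system stays under 10-100 ms
MARKET_ONLY_LIMIT_MS = 500  # some brokers accept up to 300-500 ms, at market price

@dataclass
class Order:
    side: str               # "buy" or "sell"
    requested_price: float
    delay_ms: float         # measured latency between decision and reception

def route_order(order: Order, market_price: float) -> tuple[str, float]:
    """Return (execution mode, execution price) depending on the delay."""
    if order.delay_ms <= STABLE_LIMIT_MS:
        return ("limit", order.requested_price)   # fast enough: price honored
    if order.delay_ms <= MARKET_ONLY_LIMIT_MS:
        return ("market", market_price)           # delayed: market price instead
    return ("rejected", 0.0)                      # too far behind the quote

print(route_order(Order("buy", 101.2, 40.0), 101.5))   # ('limit', 101.2)
print(route_order(Order("buy", 101.2, 350.0), 101.5))  # ('market', 101.5)
print(route_order(Order("buy", 101.2, 800.0), 101.5))  # ('rejected', 0.0)
```

The point of the sketch is that the total response delay, not only the decision quality, decides whether the intended price is obtained at all.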
This paper will present a modern methodology to design, build, implement, maintain, manage
and evolve BIDS: the Progressive Management Methodology (ProgressM). The model
includes the resource-planning methodology and interactive processes in order to design
and implement the real-time-chain components. It was found that the latency response of each
component is a design criterion, and special methods are involved in order to manage these
systems. The low-latency functionality of all components also leads to a new architecture,
specific design processes, and a specific methodology to manage the entire real-time system.
The ProgressM methodology includes the decision tree in the design and
implementation process in order to manage real-time automated decision-making systems.
2. Business Intelligence Systems
A BIS is a system analyzing past data in order to build decisions that gain a specific result
for the future [3]. The first ideas date from the 1950s, in early decision support
studies [4] of small groups, about how the resulting data can be analyzed and summarized in
order to retrieve useful information. Later, in the 1960s, the idea evolved into the
Management Information Systems [5]. Starting with the 1970s, different business journals
published papers about Decision Support Systems [6]. These concepts became in the 1980s the
well-known Executive Information Systems, owing to the increasingly complicated and volatile
environment of decision-making applications [7]. Online Analytical Processing, an approach
to answering multi-dimensional analytical queries [8], evolved continuously during the 1990s.
The BIS term is used today in order to cover all these systems under a unitary project.
BIS can be defined as the concepts, tools, methods, and technologies that, once
connected [9], form the strategies, technologies and decision factors used by companies in
order to analyze the available data, to provide historical, current and predictive analysis of
different business indicators, and to facilitate the business decisions in order to obtain a
specified result in the future activity.
future activity. The usual components of BIS are online analytical processing, data analysis,
data mining, text mining, process mining, event processing analysis, business reporting,
benchmarking analysis, performance analysis, predictive analytics, and prescriptive analytics.
BIS can handle large amounts of structured or unstructured data and use specific data and
reporting processes to allow a facile interpretation of the significant data, in order to
generate decisions in accordance with the current state of the business and to determine
better performance for the future. The massive data collections available today are due to
increased-performance acquisition processes, supported by more substantial storage capacity
and improved computing speed. Due to all of these aspects, BIS performance has grown in the
last decades. Using modern data sources such as smartphones, tablets, the internet of things,
generalized computer systems and social media records, the business intelligence systems
results have gained new dimensions, and a significant number of enterprises include the BIS
in their plans.
A BIS is designed to manage past data in order to report the state of the business and to
support a more sustained decision. An RTBIS analyzes real-time data streams, and the reports
are built in real time, including the present data. The fast-growing development of real-time
data systems pushes the streaming data analysis to new limits. Real-time analysis has become
an essential factor in BIS today, especially for the finance industry, but also for other
domains where streaming data analysis provides useful information for the sales and marketing
department, for the executive management level and also for the research and development
departments. This fact is confirmed by the BIS survey published by Forbes in [10].
However, analyzing the data streams is not enough for different businesses nowadays. The
speed of data modification in different domains implies real-time decisions that must be
built and executed by informatics systems, also in real time or with very low latency. In
this case, the human factor is excluded, the automated decision-making systems being designed
especially for this purpose. In consequence, modern BIS include automated decision-making
software as part of the decision chain. This paper will present a project management
methodology especially adapted to manage the design, implementation, maintenance, and
progress of BIDS, in order to improve on the main failure factors and to increase the success rate.
3. Project Management Methodologies
As we have seen, a BIS is a complex system integrating multiple data sources, distributed
informatics systems, and multiple data and reporting processes in order to offer sustained
information to the decision level. The project management methodology applied to design
and implement BIS may vary depending on the business specificity. Over the years many
authors have established several types of methods or factors that can be applied for the
informatics system particularly or generalized for complex BIS. Some authors use different
critical factors to break the design process into sequential phases depending on the specificity
of the project. Methodologies including increased management and control steps are
developed and presented in [11], [12] or [13]. Other authors [14] indicate the communication
of all successful practices used as a key factor, and build additional support for
inexperienced or unexceptional developers or phases [15]. The project management methodology must
build a perspective to solve all challenges during the project implementation. Starting with
the first analysis phases, the methodology must respond to all steps including the project and
requirements modifications due to economic, functional or environmental fluctuations.
There has been sustained research over time on formal and strict methodologies for
information systems development in particular. More research is presented in [16] and [17].
In practice, the methodologies are adapted to the specificity of each complex project by
omitting, changing or adding substantial steps, as can be discovered in [18], [19] or [20].
It was found that BIS involve significant differences during project implementation compared
with simple informatics systems. The business processes related to the functionality of the
entire BIS imply specific flexibility for all the steps during the project management and
implementation. In BIS we have a data-functionality implementation which demands cross-
organizational activities to be developed and planned. In addition, aspects such as the
low-latency response get special attention.
Once the BIS target is the information required by the decision level, enterprise-wide data
integration is needed in order to link and analyze all processes for a complete and sustained
prediction. An interesting approach to designing the development steps is presented in
[21]. Being a data-driven process with more flexibility needed in the management process,
the BIS project can be managed with an agile methodology, but not Scrum, which is a code-
driven methodology [22]. A significant degree of planning and coordination is needed,
together with enough flexibility to integrate all the changes during the implementation
project. No universal methodology can be applied blindly to any BIS design. More sustained
arguments can be found in [23].
The specificity of each business must be considered in order to build a suitable situation-
based methodology for each BIS and to avoid the risk of using an inadequate model. More
considerations about this idea can be found in [24], together with the sustained idea that the
decision regarding the methodology used to manage the BIS project is based, in the majority
of cases, on the individual preferences of the designers and project managers rather than on
rational aspects, when there are no specific requirements from the company ownership. In [25]
we can find that a sustained investigative approach will improve the BIS methodology to fit
the type of the project better than a classic or prescriptive approach. Under all of these
considerations, this paper will reveal a methodology to manage BIDS.
The project management methodologies can be classified by the flexibility criterion, as we
can see in figure 1. In the low-flexibility range, we have the fixed-plan methodologies, such
as the Waterfall method [26], the Cleanroom method [27] or the Capability Maturity
Model [28]. With a little higher flexibility, we have the iterative models, such as the
Spiral model [29]. With more flexibility, there are the incremental models, such as the
Rational Unified Process [30]. The agile models are those with the highest flexibility level,
such as Scrum [31] or the eXtreme Programming model [32].
Figure 1. Different model development types depending on the flexibility degree.
Looking at the specificity of each model type, we can summarize that the agile methods
permit errors to be discovered in the early stages and to be fixed in the next iteration,
determining the highest-priority set of requirements to be designed and implemented
first [23]. The iterative and incremental methodologies can be adapted to include the changes
during the design progress [32] more easily than the plan-based models, which are known
as the most uncomfortable when it is about business or requirement changes during the
project. Several more considerations can be found in [24] and [33]. For real-time BIS, an
adaptable model is needed, but control is also required in order to measure and manage
different characteristics of the processes included in the real-time chain.
4. Progressive Management Methodology
In this section, an original project management methodology will be presented, which
combines the advantages of all the known models mentioned in the last chapter with improved
factors, in order to respond to all the requirements of BIDS projects.
To add value from the scientific point of view, the methodology presented in this paper must
answer a specific list of arguments. First, the methodology must be applicable to a
representative class of problems; these are the BIDS including automated decision-making
processes. Second, the method presented must include new findings and new knowledge.
These are related to the aspects of adaptable resource management, the real-time data
processes, the model to evaluate and distribute the time delay in the time-chain in order
to obtain a low-latency response, and how the components of the automated decision can be
integrated into the decision tree in order to manage the functionality of the entire system.
Third, the methodology presented must be reproducible and applicable to a new
case under the same hypothesis. These factors are assured by the generality of the presented
model, which can be applied to any BIDS regardless of the company's activity. From the fourth
scientific point of view, the method must generate value for a class of users or companies.
This is a fact; the methodology was already implemented in a BIDS for a capital investment
company and can be applied in the future to any other similar enterprise. Besides, the
methodology presented can be applied to any other real-time activity from any other domain.
The fifth scientific value criterion asks for the methodology to be feasible and for the costs and
application effort to be justified by the obtained benefits. The methodology presented in this
paper also satisfies this criterion. Being exclusively an organizational model, its costs are
similar to those of the other existing models, and the advantages will be revealed along the
way by transparent management of the time-chain and by the interactive processes to design
and manage the decision tree. The sixth point of view is to adopt a value-based approach in
order to develop and classify the metrics of the presented methodology. This last requirement
is also considered in this article, the measure of the model being the time-delay produced
by each component in the time-chain, summarized into the total delay of the system.
4.1. ProgressM principles
In the era of agile methodologies for software and project management, the lack of a general
plan is a failure factor. One of the key roles of the BIS project manager, BIS architect, BIS
master, or whatever name the role responsible for the BIS success may have, is to manage all
the activities and resources in order to obtain a functional BIS, and to adapt the whole
process to all discovered changes. There are three major factors considered as ProgressM principles:
- Any action must be accompanied by the necessary and sufficient resources;
- The progress in any activity comes from small, multiple, complete and consecutive steps;
- Unplanned interactivity between components must be followed by control keys and points.
These factors come from experience in different agile project activities. The ProgressM
principles were developed in order to avoid the failure of the implemented project. In
practice, many cases were observed in which the necessary resources were unavailable for a
specific requirement or task. This is probably one of the significant failure reasons. A
simple example is that a real-time BIS cannot work without real-time data streams.
Alternatively, a real-time BIS cannot work properly with only one real-time data supplier.
Any interruption in the data-stream connection will put the enterprise in a nonfunctional
stage. Factors like these are decided in the business analysis and business modeling, and
here comes the experience of a good project manager in asking for suitable resources from the
beginning. An inexperienced manager will drive down the wrong way from the beginning. Another
example of inadequate resource allocation appears in a more advanced progress stage, when the
managers and owners understand that they need a specific state of facts, for example
real-time reports, and they allocate no additional resources for this new improvement. The
"we can handle it with what we have" sentence does not work every time, and the result can be
the failure of the whole BIS.
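The "more than one real-time data supplier" requirement can be sketched as a simple failover reader: the system tries the suppliers in priority order and switches to a backup when the primary stream is interrupted. All names below are illustrative assumptions, not part of the paper's system.

```python
from typing import Callable, Optional, Sequence

def read_quote(suppliers: Sequence[Callable[[], Optional[float]]]) -> float:
    """Try each data supplier in priority order; fail only if all are down."""
    for get_quote in suppliers:
        quote = get_quote()
        if quote is not None:       # None models an interrupted stream
            return quote
    raise ConnectionError("all real-time data suppliers are down")

primary = lambda: None              # primary feed interrupted
backup = lambda: 101.37             # backup feed still delivering

print(read_quote([primary, backup]))  # 101.37
```

With a single supplier, the same interruption would raise the error and put the whole real-time chain in a nonfunctional stage.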
The second principle integrated into the ProgressM methodology comes from the reality
that usual complex activities or processes do not have an explicit algorithm or solution. The
majority of new businesses and enterprises are trying to be unique in order to gain a
competitive advantage. A unique activity supposes a unique or a new idea, an unknown
algorithm and an unknown technical way to find a unique technical solution. The
ProgressM methodology is designed especially for this case, when unknown solutions must be
found in order to solve the BIS problems. In these cases, the progress and success of any
activity come from small, multiple, complete and consecutive steps. Any component of the BIS
will be designed and implemented in small steps. To accomplish the objective, multiple steps
are necessary. Usually, the next step can be initiated only after the last objective has been
achieved. In an agile environment, multiple steps can be initiated consecutively. In a
key-driven methodology, the steps will be confirmed as finished consecutively, once the key
factor or the desired efficiency level has been achieved. The third principle included in the
ProgressM methodology is that we cannot have a general plan for all actions; more flexibility
is needed, especially for those unknown steps which make the BIS unique. A high degree of
interactivity is necessary between all components of the system, including the resource
components, but for each interactivity a critical factor must be planned in order to quantify
or measure the interactivity and to measure progress. For example, for the real-time
components of a BIS, the time-delay will be the key factor. A new progress step for a
real-time component must achieve a smaller time-delay, a measure easy to follow.
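The key-driven confirmation rule above can be reduced to a one-line check: a progress step for a real-time component is confirmed only if its measured key factor, here the time-delay, improved on the current value. The function name and figures are assumptions used for illustration.

```python
def accept_progress_step(current_delay_ms: float, new_delay_ms: float) -> bool:
    """Confirm the step only when the key factor (time-delay) got smaller."""
    return new_delay_ms < current_delay_ms

print(accept_progress_step(85.0, 60.0))  # True: delay shrank, step confirmed
print(accept_progress_step(85.0, 90.0))  # False: step must be repeated or redesigned
```

Other key factors named later in the paper (a cost index, a resource limit) would slot into the same check in place of the delay.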
4.2. ProgressM steps
The ProgressM methodology presented in this paper was applied for the first time to build,
implement, maintain and improve the BIDS of a financial trading company. We will use this
particular example to present the methodology and how it can be applied. The model was
designed starting from the initial criteria imposed by the specificity of the mentioned activity:
a). The initial business requirements are known, but because the company activity is new and
implies new technologies and techniques with fast time evolution, the business requirements
will change from time to time. Also, some of the technical solutions are unknown in the
initial business analysis process. Starting from this idea, the model cannot use a plan-driven
methodology; a higher degree of flexibility must be involved in order to develop new or
unknown business components. The new technologies used can produce new requirements in
the decision-making chain, which presume new requirements for the entire BIS. Starting from
this fact, more interactive progress steps are included, together with a global iterative
process, in order to evolve the solution for the entire system. Once the new requirements are
produced and delivered by the BIS and can be sustained by the new resources also produced by
the BIS itself, we will call this the Progressive Management Methodology, in short ProgressM.
The logic of this methodology is presented in figure 2.
Figure 2. Progressive Management Methodology for
Real-Time Business Intelligence Decision Systems.
b). Because it is about a BIDS, the design key factor will be the time delay of each BIS
component in the time-chain. In a system with real-time streams as inputs, the extract,
transform and load processes must be done with low latency in order to deliver fast data for
the online analytical processing and services. The time-chain is the sum of all real-time
components in the BIS. The real-time automated decisions are made at the end of the time-
chain functionality, and the total delay is also a design factor for the entire BIDS. For this
purpose, the model will use an iterative cycle for each time-chain component and a sequential
design to link all the time-chain components with the other BIS components. This ProgressM
method can be applied in any BIDS design when a specific measure is used as a design
factor. For other systems or activities, different design factors can be used: for example, a
cost index, a limit on some resources or expenses, a requested communication chain between
the BIS components, or a time-driven process which defines, for example, the production chain
or a related service. In all these cases the sequential model is the right solution in the
design and implementation phases, even if the rest of the BIS is solved with other agile
models accordingly adapted.
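The time-chain view can be sketched as a delay budget: the total delay is the sum of the per-component delays, and both the components and the total are checked against the target. The component names and the figures below are illustrative assumptions, not measurements from the paper.

```python
time_chain_ms = {           # per-component measured delay (illustrative)
    "data stream intake": 12.0,
    "extract/transform/load": 25.0,
    "online analytical processing": 30.0,
    "automated decision": 8.0,
    "order execution": 15.0,
}

TOTAL_BUDGET_MS = 100.0     # target for a stable low-latency system

total = sum(time_chain_ms.values())
print(f"total time-chain delay: {total} ms")          # 90.0 ms
print("within budget" if total <= TOTAL_BUDGET_MS else "redesign needed")
```

Distributing the budget across the components in this way is what makes the delay of each component, and not only the total, a design factor.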
c). The methodology must permit a link between the automated decision and the human
decision factors in order to integrate all decisions into a decision tree. The human
decision level will control and manage the automated decision processes. They all contribute
together to the results of the enterprise. Once the automated decisions are executed by the
previously designed automated processes, the implementation of the BIDS will
produce results, but will equally produce new requirements. This fact will call again the
progressive global cycle, and the methodology must answer the management needs in
order to quantify the new requirements, to measure the new resources available and to plan a
new progress step, which will be an incremental global process for the whole system.
d). The links between the BIS processes and services and the real-time decision chain actions
can also evolve as direct results of the BIS operation. Learning from the past must be a
process included in the design methodology. This step becomes a requirement for project
management methodologies in the era of fast technical changes. For this reason, a more
flexible methodology is needed for each component, inside or outside the time-chain.
e). The human and automated decisions must also be included in the BIS design and
implementation process, and a management model for the decision tree has to be inserted in
the methodology. Starting from the business analysis, and based on the business models, a
finite list of decisions can be built and linked with the functionality of the whole business
system. The automated decision processes can be configured by the human decision factors
and can also be started or stopped at different moments, depending on specific functionality
requirements. New requirements on the BIS activity will ask for new automated decision
processes, which will be a sustained design factor for the next progress step.
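The decision tree described in points c) and e) can be sketched as a structure mixing human and automated decision units, with the control link being the human level starting and stopping the automated processes. All names here are assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DecisionUnit:
    name: str
    automated: bool
    enabled: bool = True
    children: list["DecisionUnit"] = field(default_factory=list)

    def set_enabled(self, on: bool) -> None:
        """Control link: a human decision starts or stops an automated unit."""
        self.enabled = on

def active_automated_units(node: DecisionUnit) -> list[str]:
    """Collect the automated units currently allowed to decide."""
    found = [node.name] if node.automated and node.enabled else []
    for child in node.children:
        found += active_automated_units(child)
    return found

root = DecisionUnit("investment committee", automated=False, children=[
    DecisionUnit("signal generator", automated=True),
    DecisionUnit("order executor", automated=True),
])
root.children[1].set_enabled(False)      # human stops one automated process
print(active_automated_units(root))      # ['signal generator']
```

Including such a tree in the design makes the split between human and automated responsibility explicit and manageable at every progress step.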
Based on the three main principles exposed in the last section, and with these requirements
for the BIDS, the developed Progressive Management Methodology has the logic included in
figure 2. The steps to apply the methodology to a BIDS are presented in table 1.
Table 1. Steps to apply the Progressive Management Methodology.
1. Business Analysis
2. Business Modeling
3. BIDS Requirements
4. BIDS Resources
5. Technical Analysis
6. Data Analysis
7. System Architecture
8. Real-Time Chain Analysis
9. Real-Time Data Streams
10. Real-Time Data-Mining
11. Real-Time Processes
12. Real-Time Events
13. Real-Time Services
14. Real-Time Reports
15. Real-Time Automated Decisions
16. Real-Time Automated Execution
17. Connections with Real-Time Chain
18. Data Design and Implementation
19. Models Design and Implementation
20. Processes Design and Implementation
21. Systems Design and Implementation
22. Reports Design and Implementation
23. Decision Tree Design and Implementation
24. BIDS Components Tests and Results
25. BIDS Components Deployment
26. BIDS Maintenance
27. BIDS Reevaluation
28. BIDS New Requirements
29. BIDS New Resources
30. BIDS Progressive Step
Proceedings of the IE 2019 International Conference
www.conferenceie.ase.ro
4.3. ProgressM design and implementation
As can be seen, in the presented methodology the design and implementation steps are
aggregated together. With this aggregation, ProgressM excludes one of the major failure
factors in BIS management. When the designers are different from, or have different
knowledge than, the IT coding team implementing the informatics system or software
components, major discrepancy malfunctions can appear. Many cases were observed in
which the main BIS had a huge implementation latency just because the designer was no
longer in the team and the coding department had no idea what a significant part of the
project was about. In an iterative development mode, the design and implementation are
made together, and the responsibilities are also distributed. This fact is more important in
those cases when the algorithm is not explicit, or for those components with a high degree of
novelty or fast technical development. How the iterative design and implementation process
works is outlined in figure 3.
Figure 3. Aggregated design and implementation time-chain components in ProgressM methodology.
All three mentioned principles of ProgressM are met in the logic presented in the figure
above. The design and implementation of each component will be made in small steps in an
iterative internal process. Here comes the flexibility of any agile method adopted for each
software component. Besides, the process will ask for suitable resources anytime the
progress step requires more technical capabilities. The process is managed according to a
key factor which is measured and tested at the end of each iterative step.
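The iterative loop with a measured key factor can be sketched as a simple control structure; the delay values, the improvement routine, and the target below are illustrative assumptions, not part of the methodology's formal definition.

```python
# One ProgressM design-and-implementation cycle for a single time-chain
# component, using the component's measured time delay as the key factor.

def run_iterative_cycle(measure_delay_ms, improve, target_ms, max_steps=10):
    """Repeat small design/implementation steps until the key factor
    (the measured delay) meets the target, or the steps are exhausted."""
    delay = measure_delay_ms()
    for step in range(1, max_steps + 1):
        improve()                   # one small design + implementation step
        delay = measure_delay_ms()  # measure the key factor after the step
        if delay <= target_ms:
            return step, delay      # progress step achieved
    return None, delay              # target not met: redesign or more resources

# Toy example: each step removes 20 ms from an initial 150 ms delay.
state = {"delay_ms": 150.0}
def improve(): state["delay_ms"] -= 20.0
steps, final = run_iterative_cycle(lambda: state["delay_ms"], improve, target_ms=100.0)
print(steps, final)  # -> 3 90.0
```

When the loop exhausts its steps without reaching the target, the design or the allocated resources must be revisited, which is exactly the situation discussed below.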
This management procedure fits very well with any time-driven process; many times, the
technical or logical difficulties encountered in the implementation steps will change the
design principles and practices, and consequently will change the implementation process,
which through the iterative loop will change the design and technical solutions. This is
progress based on the BIDS needs and requirements and on the results obtained in the
process of implementing new ideas. The success of each time-related component is obtained
once suitable design measures solve the technical limits found in the implementation
process, spending suitably allocated resources. When the technical improvement is not
enough, or when the requested resources are not available, the progressive step cannot be
achieved.
The progressive methodology implies a critical analysis of the resources related to the
business requirements. After a complete phase of the BIS implementation, a reevaluation
process will define new requirements for the whole BIS, which will ask for new resources in
order to consider a new progress step. The nature of BIS means that user satisfaction alone is
not a sufficient measure of success, and it is also necessary to consider technical factors [34].
The success factor will be evaluated at the end of each phase, both technically and from the
users' points of view. A progress step means obtaining better efficiency for the new system
component as a result of the experience accumulated in the past. New and better
requirements will ask for more technical or financial resources. This is a real fact of any
evolutionary technical process, one sometimes forgotten by the managers or by the business
owners. A new progress step can be implemented only when the new resources available
cover the costs of design, implementation, and maintenance of all new requirements of the
BIS.
4.4. ProgressM is more than Agile
ProgressM is a methodology for developing large-scale projects. ProgressM can include an
agile methodology for any software component, but the design and implementation process
is directed, organized, and controlled according to the three fundamental ProgressM
principles: any action needs resources, progress comes from small steps, and control key
factors must follow all activities. The ProgressM methodology is more than agile: it tries to
control the failure factors. The ProgressM methodology can also be applied to small
projects, even to single software developments, if all three fundamental principles are met.
According to the Manifesto for Agile Software Development [35], there are twelve agile
principles. In short, they relate to: the priority of satisfying the customer; accepting
changing requirements; delivering working software frequently; business people and
developers working together; building projects around motivated individuals; initiating and
sustaining face-to-face conversation; working software as the primary measure of progress;
promoting sustainable development at a constant pace; attention to technical excellence;
simplicity as essential; the best requirements emerging from self-organizing teams; and
adjusting the team behavior to become more productive.
It was found that by applying the agile principles to large-scale projects, many failure
factors can be introduced by an unrealistic approach or by inappropriate project
management. This is what ProgressM tries to correct. It was found that the three
fundamental principles of the ProgressM methodology cover a lot of missing points and
reduce the failure factor significantly. The first principle assures that any action, new
procedure, event, module, process, service, or software or non-software component
integrated into a project must be accompanied by suitable resources. The principle comes
from the fact that an action without resources has a substantial probability of failing.
Working often in agile environments, the author of this paper met the situation when a new
software module came into the plan after an iterative analysis. The component was not in
the plan from the beginning, so no resources were allocated for it. The project manager
asked for the new component to be delivered under the idea: we work in an agile
environment, it was not in the initial plan, and we need it, so we have to do it. Is that so?
Can it be done without any additional resources? At this moment the manager is increasing
the failure probability of the project, especially when it is about new software components
which need additional hardware, many working hours or, even worse, more actions from
other teams who also do not have the necessary resources. The best example was a BIDS for
which the owners asked for real-time reports with no additional resources, in a situation
where a real-time concept was not in the initial plan. The project failed in about one month.
In conclusion, the first ProgressM principle assures that each component has the necessary
resources in order to make its realization possible. When new components are needed along
the way, new resources are needed for sure. This principle makes the manager aware of the
missing points in order to avoid failure.
The second ProgressM principle also comes from practice. It was found that essential failure
factors, especially in large-scale projects, come from activities that are not suitably treated
in the project. Because of the lack of knowledge needed for a specific part of the project,
that activity does not get enough attention. For example, a new original idea or technique
must be implemented. For a new idea, the way to do it is unknown. The manager asks for it
to be done and pushes the development team: it is a new idea, there is no plan, just do it. Is
that so? Can it be done without the proper knowledge? The developing team tries to
understand and to build a solution, but they are only developers. They are not
mathematicians, physicists, or economists able to split the process into significant parts.
After considerable effort, the developing team will fail. In ProgressM, the project manager
must assure that all activities are divided into small components with a theoretical solution
which will be the base for the developing team. The second ProgressM principle starts from
the idea that any activity can be divided into small, multiple steps that must be completed in
a consecutive sequence in order to assure the progress. With this approach, the project gets
a significant success probability. Also, the deciding factor in the project must assure all the
small steps needed for the entire project. If the manager does not have enough knowledge,
he will ask for help from different external resources. With this principle, no ambiguity will
remain, and no team participant will have to do something not fitted to his knowledge. The
idea of consecutive steps regards each complex component. The flexibility in project
development is still there.
The second principle is also applied to the global progressive steps of a large project. It
means small targets must be planned for the whole project progress. If a large target is
adopted, it must be divided into small parts in order to fix and reach the progress in multiple
small steps. What counts as a large or a small progress step is, of course, a relative criterion.
For example, a BIDS with no real-time stream analysis cannot be transformed overnight
into a real-time BIDS. The progress must be divided into several parts and implemented step
by step, especially when it is about a new or unknown activity. First, the real-time stream
ETL processes will be inserted, the database architecture will get new features, the server
architecture will be modified because of the new computational needs, the data warehouse
will be reconfigured, new real-time data transformation processes will be implemented, the
processes and services of the real-time events will be created, real-time automated decisions
can be implemented, and in the end the real-time reports can be built and delivered. A
manager who asks for real-time reports directly in the next progress step of a non-real-time
BIDS will fail for sure. This is only an example of how the second ProgressM principle can
be applied to the global progress stage of a large-scale project.
The third principle asks for a key factor in order to quantify the evolution of the project
activities over time. Each component will have a defined key factor. This measure will
indicate if the activity goes the right way, if the state of the project is improving, or if
additional actions must be taken in order to help the development of a specific part of the
project. It was found, especially in agile software development, that because of a significant
flexibility level adopted in the team, the project is not evaluated for an essential time period
just because there is no key factor measured. Everyone works outside of a realistic plan,
and after an important time has elapsed they meet again and find there has been no progress
in the last days or weeks. This is why the ProgressM methodology asks for a key factor to
be measured in any activity, in order to have a measurable point of view for each activity.
The idea is simple to implement; the key factor can be a number or even a Boolean variable
for small processes. For more aggregated processes the key factor can be the time-delay
inserted into a time-chain, an expenditure cost which can be optimized, and so on.
The ProgressM methodology organizes the project management in order to optimize the
activity and the resources, to divide the whole process into small parts, and to measure the
efficiency of each activity in order to progress. The methodology can include any agile,
iterative, incremental, or fixed-plan model for each project component once the three
principles are met. The ProgressM methodology can be applied to small or large-scale
projects, in small or large teams, and will cover the main failure points in order to increase
the success factor.
5. Intelligence in Real-Time Decision Systems
In this chapter, we will present particular considerations regarding the application of the
ProgressM methodology to a BIDS, in order to reveal the main particularities regarding the
real-time chain, the decision tree, and the progressive steps for the entire system. As an
example, a BIDS built for a financial capital investment company will be used. The
summarized business analysis for the company taken as example defines two significant
activities in the enterprise: the main activity is providing financial services to invest in the
capital markets, and the second is the management of the clients who need the services
provided by the enterprise. The first is a real-time activity which implies real-time data
streams from different sources. The system will manage multiple capital accounts in real
time and will run different data-mining procedures in order to find opportunities to buy and
sell on the capital markets. The system will also manage the capital exposure for each
account, an activity which implies different real-time services and processes and real-time
reporting processes for the clients, managers, and ownership. The trading activity is made
using several automated trading algorithms and software components in order to enter and
exit the capital markets. The automated trading decisions are sustained by automated
execution software provided by different brokerage companies as service suppliers. The
second activity is a simple client management process which implies different contracts,
invoices, and reports, with no requirements in the real-time chain. These will be managed as
usual in any BIS. However, because the system design has a real-time approach, the
reporting for the second activity is requested to be made in real time as well. From the
business modeling activity, the value proposition for the company comes from the services
provided to the clients. The company will deliver financial investment services to different
investors who want to increase their capital. The added value comes from the automated
trading software included in the BIS, from the capability to generate automated real-time
trading decisions, and from delivering this service to a significant number of clients. The
high return factor and the managed capital exposure will be the reasons for a new client to
join the enterprise. The business will respond to the need to invest capital in a managed-risk
environment with a positive return. As we can see from the beginning, the results of the
informatics systems included in the BIS are the base of the response services offered to the
clients. Because of this, better results will generate better profits, which will be the base for
additional technical resources, as a key factor for a global progress step in the BIDS
management project.
The special requirements for a BIDS which operates with real-time processes are related to
the following directives: the company must be able to trade automatically in several
financial markets, with low-latency procedures, in multiple capital accounts. Besides, the
enterprise must not be dependent on the historical data provided by the stock exchanges or
by the brokerage informatics systems. Related to the client support, the company must be
able to report in real time the capital transactions and the current balance using a secured
online interface. Starting from these requirements, the real-time data streams will be
recorded in the warehouse in order to build the historical records for each stock exchange
used. Also, the low-latency response needed for the automated trading system will be the
design factor for the entire real-time chain. Due to the real-time character of the data
streams, the extract, transformation, and load processes will have special characteristics and
management requirements. The online analytical processing will include different design
requirements regarding low latency.
5.1. Data Warehouse
The BIDS resource analysis must describe the available technical, human, and financial
resources for the system design, implementation, maintenance, and evolution. In this paper,
we will discuss only technical aspects related to real-time data processing. Due to the large
number of data streams processed, a specific solution must be found for the server
architecture and data warehouse. In addition, large financial data streams and unstructured
data processing are in continuous development. It was found that a single server cannot
offer enough computational speed and power for all the data processing of a BIDS. It was
also found in practice that a functional server architecture is the DERW (Data Events
Reports Warehouse) configuration, presented in figure 4.
Figure 4. Data Events Reports Warehouse (DERW) server architecture.
The first server (D) is dedicated to real-time data stream acquisition, data cleaning,
extraction, transformation, and loading. Since we have to manage real-time data streams
from multiple distributed sources, a dedicated server is absolutely necessary for this
purpose. The second server (E) is dedicated to real-time events, real-time data mining,
real-time online analytical processing, and real-time services. The low latency requested by
the automated decision processing can be assured only by low-latency data processing. The
service time depends on the low-latency events, an essential part of the real-time BIDS. For
all of these components, a dedicated server is needed. The third server in the presented
architecture (R) is dedicated to the real-time reports, data visualization, and dashboards.
The NoSQL data processing and the non-real-time analytical processing are also included
in this server. The fourth machine (W) is designed to store and manage the data warehouse.
Each server manages its data marts in SQL databases. The warehouse is a NoSQL database.
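As a minimal sketch, the DERW responsibility split can be expressed as a routing table; the task names below are shorthand for the responsibilities described above, not identifiers from the actual system.

```python
# Sketch of the DERW responsibility split as a routing table.
# The task names are illustrative; the four server roles follow the text.

DERW_ROLES = {
    "D": {"stream acquisition", "data cleaning", "extract", "transform", "load"},
    "E": {"real-time events", "real-time data mining", "real-time OLAP",
          "real-time services"},
    "R": {"real-time reports", "data visualization", "dashboards",
          "NoSQL processing"},
    "W": {"warehouse storage", "warehouse management"},
}

def route(task):
    """Return the DERW server responsible for a given task."""
    for server, tasks in DERW_ROLES.items():
        if task in tasks:
            return server
    raise KeyError(f"unassigned task: {task}")

print(route("real-time data mining"))  # -> E
print(route("dashboards"))             # -> R
```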
The DERW distributed server architecture is the result of the experience accumulated over
several global progress steps in the real-time BIDS taken as an example. The DERW
architecture can be adopted regardless of the software stack used for data processing and
storage. Different licensed software solutions for BIDS can be used with this architecture.
Due to the particular specificity of the project in our example, an open-source package was
used. The primary consideration for this solution was related to the insufficient development
of the known BIDS software for the real-time needs in the capital markets. In our example, a
UAMP stack was used. The servers run Unix operating systems, the web server is an
Apache server, the economic data are stored in MySQL relational databases, and the data
processing for the relational databases is made using Perl/PHP/Python environments. For
the unstructured data, a NoSQL solution is used under the MongoDB database. For real-time
data processing and services, Node.js is used to manage the RESTful API data connections
and streams. For the real-time data reporting and interactive reports, Angular for JavaScript
is used. For some reports and dashboards, the Angular 2 technology under Node.js can also
be included. All named packages are open-source resources, this being one of the main
system requirements for the BIDS project in our example. Another idea sustained by this
paper is that in any project it is possible to build a complex and functional real-time BIDS
using only open-source resources.
5.2. Real-Time Source Data
Data used by a BIS are usually structured, semi-structured, and unstructured. Given the
complexity of the system, the data are distributed between different components of the
system and are inventoried and consolidated in the data warehouse in order to be accessed
and analyzed any time later. The particularity of real-time systems is the fact that the data
are received as a stream. For each time interval, the data are received by the system together
with the time signature of the moment when the data were generated at the source. For
example, for the time price series of a stock exchange, the price quotes can be delivered by
the brokerage informatics system in the form:
$$\{S_k, P_a, P_b, time\} \qquad (1)$$

where $S_k$ is the symbol defining the current capital market, $P_a$ is the ask price, $P_b$ is the
bid price, and $time$ is the exact moment of time when the quotes were generated by the source.
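One possible in-memory representation of the stream record in relation (1) is sketched below; the field names and the sample values are illustrative assumptions.

```python
# A possible in-memory form of the stream record {S_k, P_a, P_b, time}:
# a symbol, the ask and bid prices, and the source timestamp.
from collections import namedtuple

Quote = namedtuple("Quote", ["symbol", "ask", "bid", "time"])

q = Quote(symbol="EURUSD", ask=1.1002, bid=1.1000, time=1_555_000_000.125)
spread = q.ask - q.bid   # the spread s used later by the aggregation step
print(q.symbol, round(spread, 4))  # -> EURUSD 0.0002
```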
The data from relation (1) will be received by the real-time system at uneven time intervals.
The stream can provide repetitive data in a very short time interval, in the order of
milliseconds, or, in a low-volatility market, the data can be provided at intervals of seconds
or minutes. Relation (1) is a simplified example; some sources can provide more information
about the current stock exchange, such as the volume traded in the last time interval,
commissions, etc. The real-time data sources are also distributed; the system can receive
streams from a large number of data providers in order to perform the trading decisions in
multiple capital markets. For each data source, a special data ETL process must be
implemented.
A particularity of the financial markets is that the data are structured and the ETL process is
made in relational databases. The rest of the data sources used by the BIDS can be
unstructured data provided from different sources, such as client-service surveys, internet
data collections, social media, smartphone and tablet applications, etc. In this paper, we will
consider only the real-time streams, to reveal different aspects related to the specificity of
real-time data extraction, real-time data mining, and real-time decision-making applications.
5.3. Real-Time Extract Transformation and Loading Processing
Having access to the real-time data does not mean the data are ready and useful for the
BIDS processing. The multiple data sources will provide data in different formats. Even
though all the streams are structured, each provider uses a particular data structure for the
real-time stream. This means a real-time ETL process must be implemented for each data
source in the BIDS. This step is one of the most challenging technical steps. The complexity
comes from the technical solutions adopted by each data provider. Some providers use
RESTful API data connections in order to deliver the data streams, others use their own
informatics software, and others use third-party informatics solutions. Each provider will
have its own data structure and its own policy for updating the stream values. Consequently,
for each data source a separate data acquisition process will be implemented. For each data
source, an extraction and transformation procedure will also be organized in the (D) server
in order to load the data in the data format used by the current BIDS.
$$\begin{aligned}
TransfExtr_1\ \{S, P_a, P_b, time\}_1 &\rightarrow \{S_k, P_a, P_b, time\}\\
TransfExtr_2\ \{S, P_a, time\}_2,\ (S, P_b, time)_2 &\rightarrow \{S_k, P_a, P_b, time\}\\
TransfExtr_3\ \{S, time\}_3,\ \{P_a, time\}_3,\ \{P_b, time\}_3 &\rightarrow \{S_k, P_a, P_b, time\}\\
Load\ \{S_k, P_a, P_b, time\} \qquad\qquad &
\end{aligned} \qquad (2)$$
Relation (2) shows several extraction and transformation processes adapted for different
data sources. Each process will receive the real-time stream in a different data structure and
will transform the data into the structure used in the local system. The loading process will
store the data in the data marts or directly in the data warehouse. Because the data are
presented in a repetitive stream, the processes in relation (2) will be repeated at different
time intervals. Usually, the data source does not notify the data receiver about a new value
for the data. The local process must be adapted with small repetitive intervals in order to
poll the data sources. A short time interval used for the ETL process will generate a low
delay in the time-chain of the intelligence system. More considerations about integrating
different real-time data sources can be found in [36].
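The per-source extraction and transformation of relation (2) can be sketched as follows; the source record layouts below are hypothetical, since each real provider defines its own.

```python
# Sketch of relation (2): each data source delivers quotes in its own
# structure, and a per-source transform maps them to the common format
# {symbol, ask, bid, time}. The source formats here are hypothetical.

def transf_extr_1(rec):
    # Source 1 delivers one record per quote, with different key names.
    return {"symbol": rec["S"], "ask": rec["Pa"], "bid": rec["Pb"],
            "time": rec["t"]}

def transf_extr_2(ask_rec, bid_rec):
    # Source 2 delivers the ask and the bid in two separate records.
    assert ask_rec["S"] == bid_rec["S"]
    return {"symbol": ask_rec["S"], "ask": ask_rec["P"],
            "bid": bid_rec["P"], "time": max(ask_rec["t"], bid_rec["t"])}

def load(store, quote):
    # Loading step: append the normalized quote to the data mart.
    store.setdefault(quote["symbol"], []).append(quote)

mart = {}
load(mart, transf_extr_1({"S": "DAX", "Pa": 12001.5, "Pb": 12000.5, "t": 100.0}))
load(mart, transf_extr_2({"S": "DAX", "P": 12002.0, "t": 101.0},
                         {"S": "DAX", "P": 12001.0, "t": 101.2}))
print(len(mart["DAX"]))  # -> 2
```

In the described system, these transforms would run repetitively on the (D) server, one acquisition process per data source.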
5.4. Real-Time Data Processing
After the data streams are transformed and loaded into the system, an additional processing
step is required in order to prepare the real-time data to be analyzed. The data-mining
processes, real-time events, and automated decision procedures use more variables than the
data presented in relation (1). A real-time data analysis run directly on the streams loaded in
the form (1) would require huge computational resources and would produce a considerable
time delay. To solve this problem, a real-time data processing step is inserted in the
time-chain. In this process, each stream (1) will be summarized in a new form given by:
$$\{O, C, H, L, s, timeframe, time\} \qquad (3)$$

where $O$ is the open price, meaning the bid price at the moment $time$; $C$ is the close price,
meaning the bid price at the moment $time+timeframe$; $H$ is the highest price recorded in the
interval between $time$ and $time+timeframe$; $L$ is the lowest price recorded in the same
interval; $s$ is the average spread in the current time interval, meaning the difference between
the bid and ask prices; $time$ is the moment when the interval starts; and $timeframe$ is the
length of the time interval used. The aggregated data (3) are computed for different
timeframes, varying from seconds and minutes to hours, days, weeks, and even years.
This data processing must be done with very small latency in order to use the results in the
real-time analysis. Some data providers offer these results for different capital markets and
timeframes. However, it was found that the accuracy of the aggregated data provided by
different data sources is variable. In addition, historical data provided by suppliers are
trimmed or incomplete from time to time. A professional BIS will manage and store all the
aggregated data in the warehouse, to be available anytime for future analysis.
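A minimal sketch of the summarization step producing the form (3), assuming a list of (bid, ask) quotes collected in one timeframe; for simplicity, the close is taken here from the last quote received in the interval.

```python
# Sketch of relation (3): the quotes of one timeframe are reduced to the
# open, close, high, and low bid prices plus the average spread s.

def aggregate(quotes, time, timeframe):
    """quotes: list of (bid, ask) tuples received in [time, time+timeframe)."""
    bids = [b for b, _ in quotes]
    spread = sum(a - b for b, a in quotes) / len(quotes)   # average spread s
    return {"O": bids[0], "C": bids[-1], "H": max(bids), "L": min(bids),
            "s": spread, "timeframe": timeframe, "time": time}

bar = aggregate([(1.10, 1.1002), (1.12, 1.1222), (1.09, 1.0902)],
                time=0.0, timeframe=60.0)
print(bar["O"], bar["C"], bar["H"], bar["L"])  # -> 1.1 1.09 1.12 1.09
```

Aggregating locally in this way also removes the dependency on (possibly trimmed or incomplete) historical data from the suppliers, as required by the directives above.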
5.4.1. Real-Time Indicators Cubes
The second aspect of the real-time data processing is related to the different financial
indicators used by the real-time data-mining procedures. These indicators are computed
with the aggregated quotes given by (3). Some indicators use the values of other economic
indicators. The recurrences and the multiple usages of the same indicator in several
data-mining procedures would imply a considerable computational effort if each value were
calculated whenever needed. To avoid this inconvenience, the real-time data processing will
compute and store the values of each used indicator in the warehouse.
Figure 5. Real-Time Indicators Cube.
The indicator time series values depend only on the historical quote data; each of them can
be calculated once the last quote level is available. To be clearer, the common indicators
used by the system, such as the Relative Strength Index [37], the Exponential Moving
Average [38], Bollinger Bands [39], or the Price Cyclicality Function [40], will be
computed and saved in the data warehouse for different timeframes for each capital market.
When an indicator for a specified market, in a defined timeframe, at a precise time interval
is needed by the real-time data-mining algorithms, the value will be retrieved directly from
the database. The image of the indicator cubes is presented in figure 5. The indicator cubes
are updated in real time. The layer (0) corresponds to the last values of each indicator,
computed with the last value of the price quotes. For indices greater than or equal to (1) in
the time dimension, the values of the indicators will remain unchanged, since each indicator
depends only on the historical values. Consequently, these values need not be recalculated
or updated in the future; the data-mining processes will only use them, without any update
operations. It is essential to distinguish between the OLAP cubes used in the front end of the
BIDS in order to report and present data to the users, and the real-time indicator cubes
presented here. The latter are data structures computed in real time and used to decrease the
computational effort and delay in the stream data-mining process. More aspects of
automated trading systems integration into BIDS can be found in [41].
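The compute-once idea behind the indicator cubes can be sketched with a standard exponential moving average; the storage layout and the class interface below are assumptions, since the paper does not specify them.

```python
# Sketch of one slice of an indicator cube: historical values (index >= 1)
# are computed once and never touched again; only the current layer (0)
# is appended when a new aggregated bar arrives. EMA is a standard formula.

class IndicatorCube:
    """Caches indicator values so data mining reads them without recomputation."""
    def __init__(self, period):
        self.period = period
        self.values = []          # values[-1] is layer (0), the current value

    def on_new_bar(self, close):
        # EMA recurrence: EMA_t = k*close + (1-k)*EMA_{t-1}; older values
        # are immutable, so each new bar costs one multiply-add, not a rescan.
        k = 2.0 / (self.period + 1)
        prev = self.values[-1] if self.values else close
        self.values.append(close * k + prev * (1 - k))

    def layer(self, index=0):
        # index 0 = current value; index >= 1 = immutable historical values
        return self.values[-1 - index]

cube = IndicatorCube(period=3)
for c in [10.0, 11.0, 12.0]:
    cube.on_new_bar(c)
print(round(cube.layer(0), 3), round(cube.layer(2), 3))  # -> 11.25 10.0
```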
5.4.2. Real-Time Data Mining and Events
The core of the real-time chain is the real-time stream data analysis. The purpose of the
real-time chain processing is to find opportunities in the capital market in order to build and
send market orders with low latency, to speculate on the price movements. The stream
data-mining processing will use the data provided by the real-time indicator cubes, will
analyze the price movements, and will generate real-time events.
$$\left.\begin{aligned}
&\{S_k, P_a, P_b, time\}\\
&Market\{O, C, H, L, timeframe, time\}\\
&Market(indicator, timeframe, time)
\end{aligned}\right\}
\xrightarrow[\text{Processing}]{\text{Real-Time Data Mining}}
\begin{aligned}
&BuySignal\{Market, P_a, time\}\\
&SellSignal\{Market, P_b, time\}
\end{aligned} \qquad (4)$$
The real-time data streams and all the information provided by the indicator cubes are
analyzed with fast data-mining procedures in order to find patterns and to generate buy and
sell events. Automated decision-making systems are computer-based information systems
that use expert knowledge to attain high-level decision performance. In our example, the
generated events are Boolean variables named trading signals. The true value of a buy
signal indicates a buying opportunity on a specified market, with an accurate price, at a
specified moment of time. The buy and sell signals are built similarly, as indicated in
relation (4). The trading signals are not trading orders; they are only real-time events found
by the system and recorded in the event data mart. Depending on the specificity of the
business activity, real-time events can differ. In a production process, the events can be
associated with different production phases; in a transportation process, the events can be,
for example, the arrivals and departures from a specified station. In the case included in our
example, the events are buying and selling opportunities. Each open trade must be closed;
using different data-mining procedures, the closing events are generated in the same way.
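A hedged sketch of relation (4): a data-mining rule reads a cached indicator value and emits Boolean buy/sell events. The RSI-style thresholds are a conventional illustration only; the system's actual algorithms are referenced, not disclosed, in the paper.

```python
# Sketch of relation (4): a rule over an indicator value produces Boolean
# trading signals, which are events, not orders. The RSI thresholds 30/70
# are a conventional illustration, not the paper's actual algorithm.

def make_signals(market, rsi_value, ask, bid, time):
    buy_signal = rsi_value < 30.0      # oversold  -> buying opportunity
    sell_signal = rsi_value > 70.0     # overbought -> selling opportunity
    events = []
    if buy_signal:
        events.append(("BuySignal", market, ask, time))
    if sell_signal:
        events.append(("SellSignal", market, bid, time))
    return events                      # recorded in the event data mart

print(make_signals("DAX", 25.0, 12001.5, 12000.5, 100.0))
# -> [('BuySignal', 'DAX', 12001.5, 100.0)]
```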
The real-time data-mining processing for the capital markets is made using different
algorithms and mathematical models. These are not a subject of this paper. Sustained
approaches for real-time data-mining procedures especially designed for automated trading
systems can be found in [40], [43], [44], and [45].
5.4.3. Real-Time Services
The services of the real-time chain are those components which transform the events into
decisions or directives. An event associated with a buy signal is not an order. The event
indicates only the opportunity on the market, found by data-mining procedures based on the
current or historical price behavior. Services will transform the buy event into orders sent to
each managed trading account using a risk and capital management procedure. This is the
service provided by the enterprise to the clients.
For this step, additional information is needed for each capital account. A data stream
including the capital and liquidity information from each trading account is received,
transformed and loaded into the real-time data processing. This real-time information is
available in the data marts at any moment of time. A design factor is to obtain a lower delay
for the capital and liquidity stream processing than for the data-mining processing. The data
stream with the available liquidity is provided by each brokerage informational system and
is subject to the same ETL methodology presented in paragraph 5.3.
A reliable model for the risk and capital management specially designed for real-time
processing and automated trading services is presented in [46]. The real-time service
processing aggregates this data and transforms it into trading orders:
Real-Time Service Processing: {BuySignal(Market, Pa, time), AvailableCapital(Client, Account), AvailableExposure(Client, RiskLevel)} → BuyOrder(Market, Volume)   (5)
Depending on the available capital in each client account and the risk level granted by
contract to each client, the capital exposure methodology decides the trading volume. The
service assembles the trading orders, which are automatically sent to the brokerage
informatics system in order to be executed.
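The transformation described by relation (5) can be illustrated with a short Python sketch. The sizing rule used here (a fixed risk fraction of the available capital divided by the signal price) is an illustrative assumption only, not the capital and risk management model of [46]; all names and values are hypothetical.

```python
def build_order(signal, available_capital, risk_level):
    """Sketch of relation (5): transform a buy event, the available capital of a
    client account and the contractual risk level into a trading order.
    The sizing rule below is an illustrative assumption, not the model of [46]."""
    exposure = available_capital * risk_level   # capital the client allows at risk
    volume = exposure / signal["price"]         # units affordable at the signal price
    return {"market": signal["market"], "side": "buy", "volume": round(volume, 2)}

# Hypothetical buy event and account data:
signal = {"market": "DAX30", "price": 15000.0}
order = build_order(signal, available_capital=100000.0, risk_level=0.03)
```

The point of the sketch is the separation of concerns: the data-mining layer produces only the signal, while the service layer combines it with per-account capital and risk data to produce an executable order.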
Each component of the time-chain is designed and implemented separately, using the time
delay produced by each component as a key factor. The sum of all delays in the time-chain is
also a design requirement for the real-time system; the brokerage system usually imposes
this measure. The typical value for the total delay is under 100 milliseconds. Some
brokerage companies accept delays under 300-500 milliseconds, or will execute more delayed
orders with large differences between the ordered price and the executed price.
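The delay-budget requirement can be expressed as a simple check. The component names and the individual delay values below are hypothetical; only the 100 ms total comes from the text.

```python
def time_chain_total(stage_delays_ms, budget_ms=100.0):
    """Sum the per-component delays of the real-time chain and check them
    against the total delay budget imposed by the brokerage system
    (100 ms is the typical value mentioned in the text)."""
    total = sum(stage_delays_ms.values())
    return total, total <= budget_ms

# Hypothetical delay budget for the time-chain components:
stages = {"etl": 20.0, "data_mining": 45.0, "service": 15.0, "transmission": 10.0}
total_ms, within_budget = time_chain_total(stages)
```

Treating each component's delay as a separately budgeted design figure is what allows the components to be designed and implemented independently while the end-to-end requirement still holds.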
5.5. Real-Time Reports, Data Visualizations and Dashboards
Being one of the main goals of the BIDS, data visualization also gets several particular
aspects in the real-time systems. In the example of the capital markets, the real-time price
evolution is an essential element. The software requirements ask for price graphs to be
displayed in real time, together with different indicators used in the technical analysis.
The real-time software aggregates these graphs using the quotes and data indicators from the
data marts. Sometimes these graphs can also be provided to the clients as value-added
services. In some BIDS cases, requirements regarding real-time data delivery services are
met; the company adds value by selling informatics services to different partners. These
kinds of requirements can change the design of the project significantly in the next
progressive step.
An additional request type for the real-time management system is to provide reports about
the capital evolution with no delay. The first implementations offered these reports with a
significant delay, a fact considered obsolete for a modern business system. In the next
progress steps, the capital balance and the capital evolution are presented in real-time
reports for the customers, managers, and ownership, implemented using the Angular 2
JavaScript framework. This quality step significantly improved the trust of the clients and
managers in the BIDS. Once the real-time cubes with the capital and liquidity information
are built with low latency, the real-time reports are easier to implement.
Different tools can be used in order to build user-friendly reports and dashboards. The
trend is to build interactive and dynamic content. Personalized reports, real-time
representations, and screens are also new requirements for modern systems. A customized
interface allows each user to manage his or her own presentation and dashboard style.
5.6. Real-Time Decision Tree
The main objective of the real-time chain is to collect, analyze and transform the data
streams into real-time events and automated decisions. In the example presented in this
paper, the real-time chain generates buy and sell orders for the capital markets. Due to
the small delay requested by the brokerage systems, the orders are built automatically
using different data-mining procedures and are sent automatically by specialized services
in order to be executed by the brokerage informatics system. In this process, the human
factor is excluded.
Once the real-time trading system is started, the automated software produces effects on
the enterprise results without human intervention. The human decision factor can start and
stop the automated decision-making software or some of its components. The human decision
can allow or deny different data-mining procedures and can also configure the system in
order to manage the risk and capital exposure level for each managed account. The decision
tree for the example case is presented in figure 6.
Figure 6. Real-Time Business Intelligence Decision Tree.
For other BIDS, the decision tree can have different characteristics depending on the
specificity of the automated decision processes. The functionality of the decision tree must
be designed together with the BIDS requirements. The correlation links between the human
decision levels and the automated decision levels generate a specific efficiency for the
entire system. The new requirements and resources for the future progress steps of the BIDS
will come, in this case, exactly from the functionality of the decision tree.
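The gating of automated decisions by human-level decisions can be sketched as a small predicate. The flag names and the procedure names are illustrative, loosely inspired by the data-mining models cited in [40] and [44]; the actual decision tree of figure 6 is richer than this.

```python
def automated_decision_allowed(human_flags, procedure, system_running=True):
    """Sketch of the two-level decision tree: an automated decision is produced
    only if the human level has started the system and has allowed the specific
    data-mining procedure. Flag and procedure names are illustrative."""
    return bool(system_running and human_flags.get(procedure, False))

# Hypothetical human-level configuration:
flags = {"price_cyclicality": True, "fisher_transform": False}
allowed = automated_decision_allowed(flags, "price_cyclicality")
denied = automated_decision_allowed(flags, "fisher_transform")
stopped = automated_decision_allowed(flags, "price_cyclicality", system_running=False)
```

The sketch captures the essential property of the decision tree: the human factor never builds individual orders, but every automated decision passes through conditions the human level controls.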
6. Conclusions
The new economic demands require the use of real-time informatics systems. Large volumes of
data arrive in real-time streams and are managed in order to find patterns and to build
events and decisions based on the current data values. The low-latency decisions are
automatically assembled and executed in order to obtain a small delay. The automated
decision processing is a significant part of the decision tree in the modern BIDS.
The complexity of the real-time BIDS asks for an appropriate methodology in order to manage
the BIDS project: to design, implement, maintain and improve the system. Specially designed
for real-time projects, the ProgressM methodology offers multiple advantages. It combines
the flexibility of the agile methods with the iterative steps needed in order to achieve a
demanded performance level. In addition, the real-time chain is managed separately in a
sequential model in order to obtain the sustainability given by the low-latency requirements
of each component. The decision chain includes automated and human decision levels in the
BIDS design in order to implement the complete functionality.
The connections between the human factor and the automated decision level are essential
parts of the BIDS functionality. The Progressive Management Methodology permits progress
steps to be made in order to improve the entire BIDS. After the first phase of
implementation, new requirements are born from the missing points of the current version, in
order to improve the functionality and the results obtained. Once new resources are
available, a new progressive step can be initiated; it will be implemented over the
functionality of the previous system in order to assure the fluency of the activity. The
ProgressM model integrates the concept of the necessary and sufficient resource and the
concept of small, multiple, complete and consecutive steps, together with the concept of
key-factor control steps. All of these contribute to a stable, organized and, at the same
time, flexible project management methodology for any real-time BIDS. The ProgressM
methodology has no scale limitation; it permits large-scale projects to be implemented with
controllable costs and resources.
References
[1] Gartner, Inc. Gartner Says Worldwide Business Intelligence and Analytics Market to
Reach $18.3 Billion in 2017. Sydney, Australia, 2017. Available at: https://www.gartner.com
/en/newsroom/press-releases/2017-02-17-gartner-says-worldwide-business-intelligence-and-
analytics-market-to-reach-18-billion-in-2017
[2] Gartner, Inc. Gartner Says Business Intelligence and Analytics Leaders Must Focus on
Mindsets and Culture to Kick Start Advanced Analytics. Egham, UK, 2015. Available at:
https://www.gartner.com/newsroom/id/3130017
[3] H. J. Watson. Tutorial: Business Intelligence - Past, Present and Future. US:
Communications of the Association for Information Systems, Volume 25, Article 39, 2009.
Available at: http://aisel.aisnet.org/cais/vol25/issl/39
[4] R. F. Bales. Interaction process analysis: A method for the study of small groups. Reprint
at Chicago, University of Chicago Press, 1950 republished in 1976. ISBN: 9780226036182
[5] D. J. Power. A Brief History of Decision Support Systems. DSSResources.com, World
Wide Web, 2003. Available at: http://DSSResources.COM/history/dsshistory.html
[6] R. H. Jr. Sprague, H. J. Watson. Bit by Bit: Toward Decision Support Systems, US:
California Management Review, vol. XXII, no. 1, 1979. DOI:
https://doi.org/10.2307/41164850
[7] S. Y. Hung. Expert versus novice use of the executive support systems: an empirical
study. Maui, Hawaii, US: Proceedings of the 34th Annual Hawaii International Conference
on System, 2001. DOI: 10.1109/HICSS.2001.927187
[8] Wikipedia Online Encyclopedia. Business Intelligence. 2018. Available at:
https://en.wikipedia.org/wiki/Business_intelligence
[9] Y. Riahi. Business Intelligence: A Strategy for Business Development. SSRG International
Journal of Economics and Management Studies. Volume 4, Issue 9, 2017. DOI:
10.14445/23939125/IJEMS-V4I9P101
[10] L. Columbus. The state of business Intelligence. Forbes online, 2018. Available at:
https://www.forbes.com/sites/louiscolumbus/2018/06/08/the-state-of-business-intelligence-
2018/ #7df597137828
[11] D. E. Avison, G. Fitzgerald. Information systems development, in Rethinking
Management Information Systems: an Interdisciplinary Perspective, Oxford, UK: Oxford
University Press. W. Currie and B. Galliers edition, pp. 136-155, 1999. ISBN:
9780198775324
[12] J. L. Wynekoop, N. L. Russo. Studying System Development Methodologies: An
Examination of Research Methods. Information Systems Journal, (7), 1997, pp. 47-65. ISSN:
1365-2575
[13] N. L. Russo, E. Stolterman. Exploring the assumptions underlying information systems
methodologies, Information Technology & People, 13:4, 313-327, 2000. ISSN: 0959-3845
[14] E. Stolterman. How system designers think about design and methods: some reflections
based on an interview study, Scandinavian Journal of Information Systems, 4, 137-150, 1992.
ISSN 1901-0990
[15] M. Shaw. Prospects for an engineering discipline of software, IEEE Software, 7, 15-24,
1990. DOI: 10.1109/52.60586
[16] C. R. Necco, C. L. Gordon, N. W. Tsai. Systems Analysis and Design: Current
Practices, Management Information Systems Research Center, University of Minnesota,
Quarterly, Volume 11. No. 4, pp. 461-476, 1987. DOI: 10.2307/248975
[17] S. M. Dekleva. The influence of the information systems development approach.
Management Information Systems Research Center. University of Minnesota. Quarterly,
Volume 16, No. 3, pp. 355-372, 1992. DOI: 10.2307/249533
[18] J. P. Bansler, K. Bodker. A reappraisal of Structured Analysis: design in an
organizational context, ACM Transactions on Information Systems, 11:2, 165-193, 1993.
DOI: 10.1145/130226.148055
[19] B. Fitzgerald. The use of systems development methodologies in practice: a field study,
Information Systems Journal, 7, 201-212, 1997. DOI: https://doi.org/10.1046/j.1365-
2575.1997.d01-18.x
[20] C. J. Hardy, J. B. Thompson, H. M. Edwards. The use, limitations and customization of
structured systems development methods in the United Kingdom, Information and Software
Technology, 37:9, 467-477, 1995. DOI: 10.1016/0950-5849(95)97291-F
[21] L. T. Moss, S. Atre. Business Intelligence Roadmap: The Complete Project Lifecycle for
Decision-Support Applications, Boston, USA: Addison-Wesley Professional, 2003. ISBN:
978-0201784206
[22] L. T. Moss. Beware of Scrum Fanatics On DW/BI Projects. Enterprise Information
Management Institute Magazine, Volume 3, Issue 3, 2009. Available at:
http://www.eiminstitute.org/library/eimi-archives/volume-3-issue-3-march-2009-edition/
beware-of-scrum-fanatics-on-dw-bi-projects
[23] B. W. Boehm, R. Turner. Balancing Agility and Discipline: A Guide for the Perplexed,
US: Addison-Wesley Longman Publishing Co., Inc., 2003. ISBN 0-321-18612-5
[24] J. Charvat. Project Management Methodologies: Selecting, Implementing, and
Supporting Methodologies and Processes for Projects, Hoboken, NJ, US: John Wiley & Sons
Inc., 2003. ISBN: 0-471-221-78-3
[25] H. Wells. How Effective Are Project Management Methodologies? An Explorative
Evaluation of Their Benefits in Practice. Project Management Journal, Vol. 6 Issue 43, pp.
43-58, 2012. DOI: https://doi.org/10.1002/pmj.21302
[26] H. D. Benington. Production of Large Computer Programs. IEEE Annals of the History
of Computing. IEEE Educational Activities Department. 5 (4): 350-361, 1983. DOI:
10.1109/MAHC.1983.10102.
[27] H. Mills, M. Dyer, R. Linger. Cleanroom Software Engineering. IEEE Software. 4 (5):
19-25, 1987. DOI: 10.1109/MS.1987.231413.
[28] W. S. Humphrey. Characterizing the software process: A maturity framework. IEEE
Software. 5 (2): 73-79, 1988. DOI: 10.1109/52.2014.
[29] B. Boehm. A Spiral Model of Software Development and Enhancement, ACM SIGSOFT
Software Engineering Notes, ACM, Volume 11, Issue 4, 1986. DOI: 10.1145/12944.12948
[30] P. Kruchten. The Rational Unified Process: An Introduction. US: Addison-Wesley. 1998
republished in 2004. ISBN: 0-321-19770-4
Proceedings of the IE 2019 International Conference
www.conferenceie.ase.ro
[31] K. Schwaber. Agile Project Management with Scrum. Microsoft Press, 2004. ISBN 978-
0-7356-1993-7
[32] K. Beck. Extreme Programming Explained: Embrace Change. Addison-Wesley, 1999.
ISBN 978-0-321-27865-4
[33] P. Abrahamsson, K. Conboy, X. Wang. 'Lots Done, More To Do': the Current State of
Agile Systems Development Research, European Journal of Information Systems Volume 18
Issue 4, 2009. DOI: https://doi.org/10.1057/ejis.2009.27
[34] N. Dedić, C. Stanier. Measuring the success of changes to Business Intelligence
solutions to improve Business Intelligence reporting, Journal of Management Analytics, 4:2,
130-144, 2017. DOI: 10.1080/23270012.2017.1299048
[35] K. Beck, M. Beedle, A. van Bennekum, A. Cockburn, W. Cunningham, M. Fowler, J.
Grenning, J. Highsmith, A. Hunt, R. Jeffries, J. Kern, B. Marick, R. C. Martin, S. Mellor,
K. Schwaber, J. Sutherland, D. Thomas. Manifesto for Agile Software Development, 2001.
Available at: http://agilemanifesto.org/principles.html
[36] C. Păuna. Arbitrage Trading Systems for Cryptocurrencies. Design Principles and
Server Architecture, Bucharest, Romania: Informatica Economica Journal vol. 22, no. 2,
2018. ISSN: 1842-8088. Available at: http://revistaie.ase.ro/content/86/04%20-%20pauna.pdf
[37] J. W. Wilder Jr. New Concepts in Technical Trading Systems. Greensboro, NC: Trend
Research, 1978. ISBN 978-0894590276
[38] D. R. Cox. Prediction by Exponentially Weighted Moving Averages and Related
Methods, Journal of the Royal Statistical Society, Series B, Vol. 23, No. 2, pp. 414-422, 1961
[39] J. Bollinger. Bollinger on Bollinger Bands. The Seminar. The essentials. US: Bollinger
Capital Management. DVD, 2002. ISBN: 978-0-9726111-0-7.
[40] C. Păuna, I. Lungu. Price Cyclicality Model for Financial Markets. Reliable Limit
Conditions for Algorithmic Trading, Economic Computation and Economic Cybernetics Studies
and Research, Vol. 52, Issue 4, 2018. ISSN: 0585-7511. DOI:
10.24818/18423264/52.4.18.10
[41] C. Păuna. Automated Trading Software - Design and Integration in Business Intelligence
Systems, Bucharest, Romania: Database Systems Journal vol. IX, 2018. ISSN: 2069-3230
[42] R. Sharda, D. Delen, E. Turban. Business Intelligence and Analytics Systems for
Decision Support. Tenth edition. US: Pearson, 2014. ISBN: 978-1-292-00920-9
[43] C. Păuna. Smoothed Heikin-Ashi Algorithms Optimized for Automated Trading Systems.
Graz, Austria: Second International Scientific Conference on IT, Tourism, Economics,
Management and Agriculture ITEMA, 2018. Available at: https://pauna.biz/ideas
[44] C. Păuna. Reliable Signals Based on Fisher Transform for Algorithmic Trading,
Timisoara Journal of Economics and Business, 2018. ISSN: 2286-0991. DOI:
10.2478/tjeb-2018-0006
[45] C. Păuna. Reliable Signals and Limit Conditions for Automated Trading Systems,
Romania: Review of Economic and Business Studies, Volume 11, Issue 2, 2018. ISSN: 2068-
7249. DOI: 10.1515/rebs-2018-0070
[46] C. Păuna. Capital and Risk Management for Automated Trading Systems, Proceedings of
the 17th International Conference of Informatics in Economy, pp. 183-189,
2018. Available at: https://pauna.biz/ideas
[47] M. Alnoukari. Business Intelligence and Agile Methodologies for Knowledge-Based
Organizations: Cross-Disciplinary Applications. Upgrade: The European Journal for the
Informatics Professional. Volume XII, no. 3, 2011. DOI: 10.4018/978-1-61350-050-7.ch007