Manuel A. Serrano
University of Castilla-La Mancha · Department of Information Technologies and Systems

PhD in Computer Science

About

82 Publications
33,084 Reads
1,565 Citations
Introduction
Manuel A. Serrano currently works at the Department of Information Technologies and Systems, University of Castilla-La Mancha. Manuel does research in Software Engineering, Information Systems (Business Informatics) and Programming Languages. His current project is 'FCD-Master Data Exchange based on ISO 8000-1x0'.
Additional affiliations
May 2014 - present
University of Castilla-La Mancha
Position
  • Vice-Head of Department

Publications (82)
Article
Full-text available
The information society depends increasingly on risk assessment and management systems as a means of adequately protecting its key information assets. The availability of these systems is now vital for the protection and evolution of companies. However, several factors have led to an increasing need for more accurate risk analysis approaches. These are:...
Article
Quantum computing is the turning point that represents a revolution in software development, one that will make it possible to solve problems that are unsolvable with classical computing. Just as in other milestones in the history of software development, such as the adoption of object-oriented systems, where new software development processes and new life...
Article
Full-text available
In recent years, autonomous electric vehicles (A-EVs) have attracted the attention of academia and industry. In urban mobility, this topology requires consensus to control behaviours under swarm robotics. Although several model-based solutions have successfully enhanced accuracy and overcome some limitations, specific technological, methodological,...
Article
Full-text available
The Information Security Management Systems (ISMS) are global and risk-driven processes that allow companies to develop their cybersecurity strategy by defining security policies, valuable assets, controls, and technologies for protecting their systems and information from threats and vulnerabilities. Despite the implementation of such management i...
Article
Full-text available
Integrating embedded systems into next-generation vehicles is proliferating as they increase safety, efficiency, and driving comfort. These functionalities are provided by hundreds of electronic control units (ECUs) that communicate with each other using various protocols that, if not properly designed, may be vulnerable to local or remote attacks....
Article
Cyber-physical systems (CPSs) are smart systems that include engineered interacting networks of physical and computational components. CPSs have an increasing presence in critical infrastructures and an impact on almost every aspect of our daily life, including transportation, healthcare, electric power, and advanced manufacturing. However, CPSs...
Chapter
This chapter provides an overview of state-of-the-art quantum software technologies: quantum programming languages, quantum software simulators and design environments, quantum tools and libraries, quantum annealing environments, and quantum software development and run platforms.
Chapter
This chapter proposes the development of a Quantum Information Technology Governance System using COBIT 2019 as its Information Technology Governance Framework. It also analyzes some limitations and suggestions for future research in this field.
Book
Edited by members of aQuantum, the book contains the contributions of European researchers on Quantum Software Engineering and Programming from nine universities in five different countries and the collaboration of two quantum software companies from two countries. This book introduces Software Engineering techniques and tools to improve the produ...
Article
Quantum computing is the latest revolution in computing and will probably come to be seen as an advance as important as the steam engine or the information society. In the last few decades, our understanding of quantum computers has expanded and multiple efforts have been made to create languages, libraries, tools, and environments to facilitate their...
Article
Full-text available
Cyber-physical systems (CPS) are the next generation of engineered systems into which computing, communication, and control technologies are now being closely integrated. They play an increasingly important role in critical infrastructures, governments and everyday life. Security is crucial in CPS, but they were not, unfortunately, initially concei...
Preprint
Full-text available
Technology has changed both our way of life and the way in which we learn. Students now attend lectures with laptops and mobile phones, and this situation is accentuated in the case of students on Computer Science degrees, since they require their computers in order to participate in both theoretical and practical lessons. Problems, however, arise...
Article
Full-text available
Nowadays, we are at the dawn of a new age, the quantum era. Quantum computing is no longer a dream; it is a reality that needs to be adopted. But this new technology is taking its first steps, so we still do not have models, standards, or methods to help us in the creation of new systems and the migration of current ones. Given the current state of...
Article
Quantum Computing is becoming an increasingly mature area, with a simultaneous escalation of investment in many sectors. Quantum technology will revolutionize all the engineering fields. For example, companies will need to add quantum computing progressively to some or all of their daily operations. It is clear that all existing classical informati...
Article
Data is one of the most important assets for all types of companies, and both its quantity and the ways of exploiting it have undoubtedly grown. Big Data appears in this context as a set of technologies that manage data to obtain information that supports decision-making. These systems were not conceived to be secure, resulting in significant risk...
Conference Paper
Full-text available
This paper presents the Talavera Manifesto for quantum software engineering and programming. This manifesto collects some principles and commitments about the quantum software engineering and programming field, as well as some calls for action. This is the result of the discussion and different viewpoints of academia and industry practitioners who...
Article
Big Data environments are typically very complex ecosystems; this means that implementing them is complicated. One possible technique with which to address this complexity is the use of abstraction. Reference architecture (RA) can be useful for an improved understanding of the main components of Big Data. Herein, we propose a security RA that inclu...
Research Proposal
Full-text available
This paper presents the Talavera Manifesto for quantum software engineering and programming. This manifesto collects some principles and commitments about the quantum software engineering and programming field, as well as some calls for action. This is the result of the discussion and different viewpoints of academia and industry practitioners who...
Article
Full-text available
Big data ecosystems are increasingly important for the daily activities of any type of company. They are decisive elements in the organization, so any malfunction of this environment can have a great impact on the normal functioning of the company; security is therefore a crucial aspect of this type of ecosystem. When approaching security in big da...
Article
Full-text available
A Big Data environment is a powerful and complex ecosystem that helps companies extract important information from data to make the best business and strategic decisions. In this context, due to the quantity, variety and sensitivity of the data managed by these systems, as well as the heterogeneity of the technologies involved, privacy and security...
Conference Paper
Big Data is changing the perspective on how to obtain valuable information from data stored by organizations of all kinds. By using these insights, companies can make better decisions and thus achieve their business goals. However, each new technology can create new security problems, and Big Data is no exception. One of the major security issues i...
Conference Paper
Full-text available
Numerous authorization models have been proposed for relational databases. On the other hand, several NoSQL databases used in Big Data applications use a new model appropriate to their requirements for structure, speed, and large amounts of data. This model protects each individual cell in key-value databases by labeling them with authorization righ...
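
As an illustration of the cell-level model this abstract refers to, the following is a minimal sketch, assuming a clearance-ordered set of labels and a simple dominance check; the class, label names, and data are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of cell-level authorization in a key-value store:
# each cell carries a label, and reads are filtered by the caller's clearance.
from dataclasses import dataclass

@dataclass
class Cell:
    value: str
    label: str  # e.g. "public", "confidential", "secret"

# Clearance order assumed for illustration only.
CLEARANCE = {"public": 0, "confidential": 1, "secret": 2}

class LabeledKVStore:
    def __init__(self):
        self._cells: dict[str, Cell] = {}

    def put(self, key: str, value: str, label: str) -> None:
        self._cells[key] = Cell(value, label)

    def get(self, key: str, user_clearance: str) -> str | None:
        cell = self._cells.get(key)
        if cell is None:
            return None
        # A read succeeds only if the user's clearance dominates the cell label.
        if CLEARANCE[user_clearance] >= CLEARANCE[cell.label]:
            return cell.value
        return None

store = LabeledKVStore()
store.put("salary:alice", "52000", "confidential")
print(store.get("salary:alice", "public"))        # None: insufficient clearance
print(store.get("salary:alice", "confidential"))  # "52000"
```
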
Conference Paper
Full-text available
Companies are aware of the importance of Big Data, as data are essential to their daily activities, but new problems arise with new technologies, as is the case with Big Data; these problems are related not only to the 3Vs of Big Data, but also to privacy and security. Security is crucial in Big Data systems, but unfortunately, security problems o...
Chapter
It is a reality: we live in the world of Big Data. However, the use of Big Data creates new issues related not only to the volume or variety of the data it processes, but also to data security and privacy. In this chapter we describe the problem in full perspective. Furthermore, we explain the main international proposals that add...
Article
Full-text available
Programs implemented in the MapReduce processing model focus on the analysis of large volumes of data in a distributed and parallel architecture. This architecture is automatically managed by the framework, so the developer can focus on the program's functionality regardless of infrastructure failures or resource allocation. However,...
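
For readers unfamiliar with the programming model the abstract builds on, here is a minimal, single-machine sketch of MapReduce-style word counting; in a real deployment the framework, not the developer, handles distribution, failures and resource allocation. The function names and data are illustrative only, not from the paper.

```python
# Minimal illustration of the MapReduce programming model: the developer
# writes only the map and reduce logic; grouping by key (the "shuffle")
# is done here in-process to stand in for what the framework would do.
from collections import defaultdict

def map_phase(document: str):
    # Emit (word, 1) pairs for every word in the input split.
    for word in document.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group intermediate values by key (handled by the framework in practice).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # Aggregate all counts emitted for the same word.
    return key, sum(values)

documents = ["big data needs big infrastructure", "map and reduce"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts["big"])  # 2
```
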
Article
During the execution of business processes involving various organizations, Master Data is usually shared and exchanged. It is necessary to keep appropriate levels of quality in these Master Data in order to prevent problems in the business processes. Organizations can benefit from having information about the level of quality of master data...
Article
Full-text available
Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues...
Article
Context: Global Software Development (GSD) allows companies to take advantage of talent spread across the world. Most research has been focused on the development aspect. However, little if any attention has been paid to the management of GSD projects. Studies report a lack of adequate support for management's decisions made during software develop...
Article
Beyond the hype of Big Data, something within business intelligence projects is indeed changing. This is mainly because Big Data is not only about data, but also about a complete conceptual and technological stack including raw and processed data, storage, ways of managing data, processing and analytics. A challenge that becomes even trickier is th...
Conference Paper
During the execution of business processes involving various organizations, Master Data is usually shared and exchanged. It is necessary to keep appropriate levels of quality in these Master Data, in order to prevent defects and failures in the business processes. A way to support the decision about the usage of data in business processes is to inc...
Article
Full-text available
Global software development (GSD) involves new challenges that need to be addressed when project managers have to make significant decisions such as task allocation, resource assignments, etc. Visualisation techniques can be useful as regards helping managers to process complex information and interpret the data shown. The main goal of this study i...
Article
Full-text available
Data warehouses are powerful tools for making better and faster decisions in organizations where information is an asset of primary importance. Due to the complexity of data warehouses, metrics and procedures are required to continuously assure their quality. This article describes an empirical study and a replication aimed at investigating the use...
Article
Full-text available
Data warehouses are large data repositories integrating data from several sources that support decision making. Although, traditionally, data warehouses have been designed using the 'well-known' star schema, some design methodologies have come into existence in recent times. These new methodologies have not only focused on logical design: they also...
Article
Full-text available
Empirical studies in software engineering are essential for the validation of different methods, techniques, tools, etc. Students play a fundamental role in carrying these studies out successfully and, as a consequence, most experiments connected with software engineering are conducted in academia. Benefits which are concerned exclusively with aspe...
Chapter
Data warehouses are large repositories that integrate data from several sources for analysis and decision support. Data warehouse quality is crucial, because a bad data warehouse design may lead to the rejection of the decision support system or may result in non-productive decisions. In the last years, we have been working on the definition and va...
Article
Due to the principal role of Data warehouses (DW) in making strategy decisions, data warehouse quality is crucial for organizations. Therefore, we should use methods, models, techniques and tools to help us in designing and maintaining high quality DWs. In the last years, there have been several approaches to design DWs from the conceptual, logical...
Article
The evaluation of software processes is nowadays a very important issue due to the growing interest of software companies in the improvement of the productivity and quality of delivered products. Software measurement plays a fundamental role here. Given the great diversity of entities which are candidates for measurement in the software process...
Conference Paper
OLAP (On-Line Analytical Processing) operations, such as roll-up or drill-down, depend on data warehouse dimension hierarchies in order to aggregate information at different levels of detail and support the decision-making process required by final users. This is why it is crucial to capture adequate hierarchies in the requirement analysis stage. H...
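
To make the roll-up operation mentioned above concrete, a minimal sketch follows, assuming a simple day → month → year time hierarchy encoded in ISO date strings; the fact data and function are hypothetical, not taken from the paper.

```python
# Illustrative roll-up: aggregating sales facts up a day -> month -> year
# time hierarchy, as OLAP roll-up operations do over dimension hierarchies.
from collections import defaultdict

# (date "YYYY-MM-DD", amount) facts at the finest level of the hierarchy.
sales = [("2024-01-03", 120.0), ("2024-01-17", 80.0), ("2024-02-05", 200.0)]

def roll_up(facts, level):
    """level: 'day', 'month' or 'year' of the assumed time hierarchy."""
    cut = {"day": 10, "month": 7, "year": 4}[level]
    totals = defaultdict(float)
    for date, amount in facts:
        totals[date[:cut]] += amount  # truncate the key to the chosen level
    return dict(totals)

print(roll_up(sales, "month"))  # {'2024-01': 200.0, '2024-02': 200.0}
print(roll_up(sales, "year"))   # {'2024': 400.0}
```
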
Article
Full-text available
It is currently important to have a set of measures with which to assess the improvements introduced by software process improvement efforts, yet these improvements are often measured through informal and subjective processes based on the perception of employees and/or auditors. This work presents a set of measures to measure...
Conference Paper
Full-text available
Different modeling approaches have been proposed to overcome every design pitfall of the development of the different parts of a data warehouse (DW) system. However, they are all partial solutions which deal with isolated aspects of the DW and do not provide designers with an integrated and standard method for designing the whole DW (ETL processes,...
Article
Data warehouses are large repositories that integrate data from several sources for analysis and decision support. Data warehouse quality is crucial, because a bad data warehouse design may lead to the rejection of the decision support system or may result in non-productive decisions. In the last years, we have been working on the definition and va...
Conference Paper
The quality of Data Warehouses is highly relevant for organizations in the decision-making process. The sooner we can deal with quality metrics (i.e. at conceptual modelling), the more likely we are to achieve a data warehouse (DW) of high quality. From our point of view, there is a lack of objective indicators (metrics) to guide the des...
Conference Paper
Full-text available
Data warehouses (DW), based on multidimensional modeling, provide companies with a huge amount of historical information for the decision-making process. As these DWs are crucial for companies in making decisions, their quality is absolutely critical. One of the main issues that influences their quality lies in the models (conceptual, logical and physical...
Conference Paper
Full-text available
Nowadays most organizations have incorporated data warehouses as one of their principal assets for the efficient management of information. It is vital to be able to guarantee the quality of the information that is stored in data warehouses, given that they have become the principal tool for strategic decision making. The quality of the informatio...
Article
Full-text available
Multidimensional data models are playing an increasingly prominent role in support of day-to-day business decisions. Due to their significance in taking strategic decisions, it is fundamental to assure their quality. Although there are some useful guideline proposals for designing multidimensional data models, objective indicators (metrics) are neede...
Article
This chapter proposes a set of metrics to assess data warehouse quality. A set of data warehouse metrics is presented, along with the formal and empirical validations that have been carried out on them. As we consider that information is the main organizational asset, one of our primary duties should be assuring its quality. Although some interesting guideline...
Article
Organisations are adopting data warehouses to manage information efficiently as the main organisational asset. The success of data warehouses (DW) can be explained by the fact that a data warehouse is a set of data and technologies aimed at enabling executives, managers and analysts to make better and faster decisions. Due to the principal role of data...
Conference Paper
In recent years various initiatives for solving the problem of security in databases have arisen. However, all of them have been only partial solutions which resolved isolated problems. Consequently, a global solution to the problem has not been reached yet. We believe that a methodological approach in which security is taken into consideration fro...
Article
Full-text available
Organizations are adopting data warehouses to manage information efficiently as "the" main organizational asset. It is essential that we can assure the information quality of the data warehouse, as it has become the main tool for strategic decisions. Information quality depends on presentation quality and data warehouse quality. This last includes t...
Conference Paper
Due to the growing complexity of information systems, continuous attention to and assessment of the quality of databases, which are the essential core of information systems, is necessary in order to produce quality information systems. In a typical database design, a conceptual schema which specifies the requirements of the database is first built. Even...
Article
Software measurement is an effective means to manage software development and maintenance projects. In the past decades a huge number of software metrics have been proposed, primarily focused on programs. Metrics for databases have been neglected, mainly because databases have played a secondary role in Information Systems (IS) infrastructure...
Conference Paper
Full-text available
To succeed in their tasks, users need to manage data with the most adequate quality levels possible according to specific data quality models. Typically, data quality assessment consists of calculating a synthesized value by means of a weighted average of the values and weights associated with each data quality dimension of the data quality model. We...
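
The weighted-average assessment described in this abstract can be sketched as follows; the dimension names and weights are illustrative assumptions, not values from the paper.

```python
# Sketch of the conventional assessment the abstract describes: a synthesized
# data quality score as the weighted average of per-dimension scores.

def weighted_quality_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    # Each dimension contributes its score scaled by its weight,
    # normalized by the total weight.
    total_weight = sum(weights.values())
    return sum(scores[d] * weights[d] for d in weights) / total_weight

scores  = {"accuracy": 0.92, "completeness": 0.80, "consistency": 0.75}
weights = {"accuracy": 0.5,  "completeness": 0.3,  "consistency": 0.2}
print(round(weighted_quality_score(scores, weights), 3))  # 0.85
```
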
