Article

IS practitioners' views on core concepts of information integrity

Abstract

Based on a review of the literature on data quality and information integrity, a framework was created that is broader than that provided in the widely recognized international control guideline COBIT [ISACA (Information Systems Audit and Control Association) COBIT (Control Objectives for Information Technology) 3rd edition. Rolling Meadows, IL: ISACA, 2000], but narrower than the concept of information quality discussed in the literature. Experienced IS practitioners' views on the following issues were gathered through a questionnaire administered during two workshops on information integrity held in Toronto and Chicago: definition of information integrity, core attributes and enablers of information integrity and their relative importance, relationship between information integrity attributes and enablers, practitioners' experience with impairments of information integrity for selected industries and data streams and their association with stages of information processing, major phases of the system acquisition/development life cycle, and key system components. One of the policy recommendations arising from the findings of this study is that the COBIT definition of information integrity should be reconsidered. Also, a two-layer framework of core attributes and enablers (identified in this study) should be considered.

... The term is often used interchangeably with data quality, data integrity, and information quality. However, Boritz (2005) draws the distinction between "data integrity" and "information integrity," arguing that "data integrity" refers to a narrower concept as data is the "raw material" used to create a "finished information product." Scholars initially took a wide-ranging approach to measuring information quality, with a survey by Delone and McLean (1992) reporting the use of 23 different measures. ...
... Scholars initially took a wide-ranging approach to measuring information quality, with a survey by Delone and McLean (1992) reporting the use of 23 different measures. Yet, most agreed on the core criteria for assessing "information integrity" (Mandke 1996; Boritz 2005): the accuracy, reliability, and consistency of information stored and accessed within a specific system. ...
... In fact, just six of the thirty documents offered further clarification on what was meant by each attribute in a definition of "information integrity", with all but one of those coming from academia, specifically information security (Bovee et al. 2003; Boritz 2005; Nayar 2007; Khan et al. 2013; Boritz & Datardina 2019; Adam et al. 2023). Drawing from those definitions, only 'accuracy' was more or less consistently defined across the six concepts, roughly meaning that the information is free from errors or too much deviation from a standard starting point. ...
Preprint
Full-text available
How should "information integrity" be understood in the context of the information environment? The term has seen increased use by researchers and policymakers, yet its relative newness means that there is a lack of consistency in its definition and scope. This article provides an analysis of how "information integrity" has been defined across multilateral efforts and the existing literature. Using John Gerring's framework for conceptual goodness, it evaluates how existing conceptualizations of the term fall short and provides guidance on how information integrity can achieve conceptual goodness. Only by doing so can information integrity fulfill its potential in supporting multi-stakeholder coordination to improve the global information environment.
... The buyer has publicly expressed the commitment to purchase goods at a particular price; the seller has expressed the commitment to satisfy this order. Making a commitment has legal and economic consequences, which must be faithfully represented in the (accounting) information systems of buyer and seller, known as the representational faithfulness view [8]. Therefore, strong consistency is warranted in order to address non-repudiation issues. ...
... Data integrity in itself is defined as "the state that exists when data are unchanged from its source and has not been accidentally or maliciously modified, altered or destroyed" [29]. This view is consistent with the model proposed by Boritz in [8] in which data integrity is subsumed in the notion of information integrity. Boritz defines information integrity as the representational faithfulness of information to the true state of the object that the information represents. ...
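A common way to operationalize the "unchanged from its source" part of the definition quoted above is to compare a cryptographic fingerprint of the data against one taken at the source. The sketch below is only a minimal illustration of that general idea, not the mechanism used in the cited papers; the record fields are hypothetical.

```python
import hashlib
import json

def digest(record: dict) -> str:
    """Fingerprint a record so later copies can be compared with its source state."""
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# Hypothetical order record captured at its source:
source_record = {"order_id": 42, "quantity": 3, "price": 19.99}
source_fingerprint = digest(source_record)

# Later, a stored or transmitted copy is checked against the source fingerprint:
stored_copy = dict(source_record)
print("unchanged from source:", digest(stored_copy) == source_fingerprint)
```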
... In this dialogue stakeholders need diagnostic control systems to monitor the actual performance of the business system behavior. This information needs to be useful and relevant [8,29]. ...
Article
Full-text available
Governments like municipalities and cities may be regarded as the ultimate stakeholder-society organizations. Their key challenge is to balance the welfare of many interest groups as natural stakeholders. Stakeholders need reliable information to assess the effectiveness of the policies organizations implement to obtain specific objectives. These objectives relate to one or more capitals measuring economic, social and environmental sustainability affecting societal well-being. This makes reporting sustainability information addressed to a large variety of stakeholders, coined as savers (i.e., investors) and users (coined as civil society actors), a challenging task to fulfill due to the multidimensional construct of well-being in the perspective of CSRD-2022 and similar reporting frameworks. The European Commission asserted that there is significant evidence that many undertakings, such as businesses, do not disclose material information on all major sustainability topics, including climate-related information such as GHG emissions and factors that affect biodiversity. In this research we propose a method built upon the logic of double-entry bookkeeping, in a rigorous way extending the value cycle concept buttressing any value chain, to design accounting information systems fulfilling the need for complete and reliable sustainability data for decision-making and evaluation purposes.
... [Quotient] x / y := x · y⁻¹ (14). Next there are basic properties of order on the rationals. Following Tao they are [7]: Proposition 1.9 (Basic properties of order on the rationals). Let x, y and z be rationals; then the following properties hold. Laws 1.10: Order trichotomy. ...
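For readability, the quotient rule garbled in the excerpt and the trichotomy law it names can be written out. The quotient definition is reconstructed from the snippet itself; the trichotomy statement is the standard one for the rationals (following Tao), not a claim about the cited preprint's exact wording or numbering.

```latex
% Quotient of rationals as reconstructed from the excerpt (equation 14):
\[
  x / y := x \cdot y^{-1}, \qquad y \neq 0 .
\]
% Order trichotomy (standard statement): for rationals $x$ and $y$,
% exactly one of the following holds:
\[
  x < y, \qquad x = y, \qquad x > y .
\]
```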
... Data integrity in itself is defined as "the state that exists when data are unchanged from its source and has not been accidentally or maliciously modified, altered or destroyed" [16]. This view is consistent with the model proposed by Boritz in [14,15], in which data integrity is subsumed in the notion of information integrity. Boritz defines information integrity as the representational faithfulness of information to the true state of the object that the information represents. ...
... repeat
      write value of attributes of rows to database (value, variable type);
    until last row;
    if attribute syntax of data file = reference attribute syntax then
      write T in database;
    else
      write F in database;
    end
    return done
(Algorithm 3: Syntax data quality attributes of data files) ...
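Read as a whole, the excerpted listing appears to describe a routine that records each row's attribute values and then writes T or F to a database depending on whether the file's attribute syntax matches a reference definition. The Python sketch below only illustrates that reading; the column names, the reference schema, and the CSV input are hypothetical stand-ins rather than the cited algorithm's actual structures.

```python
# Minimal sketch of a syntax-level data-quality check, assuming a hypothetical
# reference schema mapping each attribute name to an expected Python type.
import csv

reference_schema = {"id": int, "amount": float, "date": str}  # hypothetical reference

def matches_syntax(value: str, expected_type: type) -> bool:
    """Return True if the raw string value can be parsed as the expected type."""
    try:
        expected_type(value)
        return True
    except ValueError:
        return False

def check_file(path: str) -> dict:
    """Record, per attribute, whether every row conforms to the reference syntax (T/F)."""
    results = {name: "T" for name in reference_schema}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):              # repeat ... until last row
            for name, expected in reference_schema.items():
                if not matches_syntax(row.get(name, ""), expected):
                    results[name] = "F"            # syntax mismatch: write F
    return results

# print(check_file("orders.csv"))  # hypothetical data file
```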
Preprint
Full-text available
The use of modern techniques like IOT, AI and machine learning has revolutionized the idea of quality and quality control. Auditors face a tidal wave of data. One of the key challenges is how to determine the quality of the data that systems and processes produce. We propose a computational model to learn the inherent uncertainty in data integrity subsumed in the claims actually made by stakeholders within and outside the organization. The decision procedure combines two strong forms of obtaining audit evidence: external confirmation and re-performance. The procedure fits the modern computational idea of data-driven assurance, which is consistent with quality 4.0 concepts in quality control and quality audit practices.
... The main factor in safeguarding the system is to allow authorised access to the firm's system according to necessity and to deny unauthorised access in all other cases [14]. Boritz (2005) conducted a study to determine the significant attributes of information integrity and the related issues [39]. Boritz (2005) considered information security as one of the main attributes of information integrity. ...
... Integrity is associated with the accuracy and completeness of accounting information and its validity regarding the customers' value or expectation [32]. ...
Article
Full-text available
The aim of this research is to propose a conceptual framework that links the Accounting Information System components with Firm Performance. The framework contains Availability, Security and Integrity, Confidentiality and Privacy, and System Quality as independent variables, with firm financial and non-financial performance among Jordanian firms as the dependent variable. The researcher followed a quantitative research methodology, testing the measurement model of the conceptual framework by checking the convergent and discriminant validity of the framework. A survey questionnaire was used as the research instrument: the researcher developed a 31-item questionnaire, distributed 350 copies, and received 263 fully answered questionnaires. The findings of this study revealed that the scores of the factor loadings and AVE did not reach the recommended levels of 0.4 and 0.5, respectively, which required a modification of the research model in a second run, in which the researcher achieved satisfactory levels of factor loadings, composite reliability, Cronbach's alpha, and AVE. The scores of the Fornell–Larcker criterion and HTMT confirmed discriminant validity. This study was limited to the measurement model analysis only; an empirical study with both the measurement and structural models would be a valuable addition for future studies.
... Integrity is the maintenance of accuracy of data through its life cycle (Boritz, 2005). ...
... This process requires a set of rules to be thoroughly and consistently applied to all the data entering the database (Boritz, 2005). In the case of phylogenetic matrices, data validation needs to be a constant procedure to ensure the new incoming information is compatible with the previous one. ...
... In data science, every change to a database is defined as a transaction (Beynon-Davies, 2000;Boritz, 2005;Connolly and Begg, 2015). In the case of phylogenetic matrices, each addition of new taxa or characters constitutes a transaction (Figure 3.6). ...
Thesis
Full-text available
Non-sauropod sauropodomorphs, also known as 'basal sauropodomorphs' or 'prosauropods', have been thoroughly studied in recent years. Several hypotheses on the interrelationships within this group have been proposed, ranging from a complete paraphyly, where the group represents a grade from basal saurischians to Sauropoda, to a group on its own. The grade-like hypothesis is the most accepted; however, the relationships between the different taxa are not consistent amongst the proposed scenarios. These inconsistencies have been attributed to missing data and unstable (i.e., poorly preserved) taxa; nevertheless, an extensive comparative cladistic analysis has found that these inconsistencies instead come from the character coding and character selection, plus the strategies on merging data sets. Furthermore, a detailed character analysis using information theory and mathematical topology as an approach for character delineation is explored here to operationalise characters and reduce the potential impact of missing data. This analysis also produced the largest and most comprehensive matrix after the reassessment and operationalisation of every character applied to this group so far. Additionally, partition analyses performed on this data set have found consistencies in the interrelationships within non-sauropod Sauropodomorpha and have found strong support for smaller clades such as Plateosauridae, Riojasauridae, Anchisauridae, Massospondylinae and Lufengosarinae. The results of these analyses also highlight a different scenario on how quadrupedality evolved, independently originating twice within the group, and provide a better framework to understand the palaeo-biogeography and diversification rate of the first herbivore radiation of dinosaurs.
... A good starting point for the attempt to define integrity of data sources is the vocabulary used in the field of computer science, particularly information systems. In the discussion on defining information integrity concepts presented in Boritz (2005), integrity is defined as an unimpaired or unmarred condition, hence providing the entire correspondence of a representation with an original condition. ...
... When it comes to information integrity, it is a measure of the representational faithfulness of the information to the condition or subject that is being represented by the information (Boritz 2005). The core attributes of information integrity are identified as accuracy, completeness, timeliness and validity in Boritz (2005), Boritz (2004), and CobiT (2002), and Boritz (2005) also lists the enablers of information integrity as security, availability, understandability, consistency, predictability, verifiability and credibility. While core attributes of information integrity refer to the minimum criteria that must be satisfied when judging the representational faithfulness of information, enablers are the properties or factors of information that help realize those core attributes. ...
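Taken together, the snippet above amounts to a compact two-layer vocabulary that can be written down directly. The structure below merely restates the attribute and enabler lists quoted in it; it is a notational convenience, not an implementation of the cited framework.

```python
# Two-layer view of information integrity as summarized in the snippet above:
# core attributes are the minimum criteria for judging representational faithfulness,
# enablers are properties that help realize those attributes.
INFORMATION_INTEGRITY = {
    "core_attributes": ["accuracy", "completeness", "timeliness", "validity"],
    "enablers": [
        "security", "availability", "understandability", "consistency",
        "predictability", "verifiability", "credibility",
    ],
}
```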
Thesis
Full-text available
Intelligent vehicles are a key component in humanity's vision for safer, more efficient, and accessible transportation systems across the world. Due to the multitude of data sources and processes associated with intelligent vehicles, the reliability of the total system is greatly dependent on the possibility of errors or poor performance observed in its components. In our work, we focus on the critical task of localization of intelligent vehicles and address the challenges in monitoring the integrity of data sources used in localization. The primary contribution of our research is the proposition of a novel protocol for integrity, combining integrity concepts from information systems with the existing integrity concepts in the field of Intelligent Transport Systems (ITS). An integrity monitoring framework based on the theorized integrity protocol that can handle multimodal localization problems is formalized. As the first step, a proof of concept for this framework is developed based on cross-consistency estimation of data sources using polynomial models. Based on the observations from the first step, a 'Feature Grid' data representation is proposed in the second step and a generalized prototype for the framework is implemented. The framework is tested on highways as well as in complex urban scenarios to demonstrate that the proposed framework is capable of providing continuous integrity estimates of multimodal data sources used in intelligent vehicle localization.
... In this context, accuracy and currency are vital for the reliability of the information source. Accuracy pertains to the correctness of information provided by ChatGPT to tourists, while currency relates to the information's contemporaneity (Boritz, 2005; Fallis and Frické, 2002; Prentice et al., 2020b). Regarding the channel aspect, the promptness of ChatGPT's responses is crucial for travelers (Elsharnouby et al., 2023), as delays can have adverse effects (Abou-Shouk and Khalifa, 2017). ...
... The study's data showed multivariate nonnormality, indicated by a Mardia kurtosis value of 15, above the threshold of 5 (Byrne, 2016). For such data, the robust maximum likelihood estimation method was used, yielding stable estimates and fit indices based on the Satorra–Bentler χ² statistic (Bentler and Wu, 1995; Byrne, 2013). The study used Anderson and Gerbing's (1988) two-stage procedure to evaluate the measurement model, assess the overall goodness of fit of the model and confirm validity and reliability of constructs. ...
Article
Purpose This study, rooted in affordance-actualization theory and communication theory, aims to critically examine how ChatGPT influences users’ transition from new adopters to loyal advocates within the context of travel decision-making. It incorporates constructs including communication quality, personalization, anthropomorphism, cognitive and emotional trust (ET), loyalty and intention to adopt into a comprehensive model. Design/methodology/approach This study used quantitative methods to analyze data from 477 respondents, collected online through a self-administered questionnaire by Embrain, a leading market research company in South Korea. Lavaan package within R studio was used for evaluating the measurement model through confirmatory factor analysis and using structural equation modeling to examine the proposed hypotheses. Findings The findings reveal a pivotal need for enhancing ChatGPT’s communication quality, particularly in terms of accuracy, currency and understandability. Personalization emerges as a key driver for cognitive trust, while anthropomorphism significantly impacts ET. Interestingly, the study unveils that in the context of travel recommendations, users’ trust in ChatGPT predominantly operates at the cognitive level, significantly impacting loyalty and subsequent adoption intentions. Practical implications The findings of this research provide valuable insights for improving Generative AI (GenAI) technology and management practices in travel recommendations. Originality/value As one of the few empirical research papers in the burgeoning field of GenAI, this study proposes a highly explanatory model for the process from affordance to actualization in the context of using ChatGPT for travel recommendations.
... ISACA (Information Systems Audit and Control Association) has developed COBIT (Control Objectives for Information Technology) as a framework to "help companies implement healthy governance factors," providing a way for organizations to align their business strategy with IT objectives [1, 2]. Over time, the COBIT framework has evolved (several versions are known), and IT specialists have combined this framework with other methods to address specific IT problems. ...
... Based on a literature review on data quality and information integrity, a framework has been created that is considered broader than that provided by COBIT on information integrity [10]. ...
Article
Full-text available
This paper describes the process of designing a customized governance solution for an enterprise information system using COBIT framework guidelines and an axiomatic design approach. COBIT (Control Objectives for Information and related Technology) is a generally accepted framework created by ISACA (Information Systems Audit and Control Association) for governing and managing enterprise information and technology (IT). The COBIT framework can be applied to any organization in any industry and was designed to help deliver value while better managing the risks associated with IT processes. On the other hand, the Axiomatic Design (AD) theory involves a continuous interplay between the design objectives (the needs/what we want to achieve) and the means capable of reaching those objectives (how we want to achieve them) to determine the best configuration capable of satisfying the design intent. The AD theory requires a description of the design's objectives in terms of specific requirements, called Functional Requirements (FRs). The development of a complete solution to a given problem starts by mapping the FRs to Design Parameters (DPs) in the solution domain. The ISACA Design Guide proposes ten (10) Design Factors and forty (40) Governance Objectives; each objective (a set of Functional Requirements) can be achieved through several combinations of Design Parameters. From the Axiomatic Design theory perspective, this determines a coupled matrix. To decouple the matrix, the profile of each design factor is drawn, and each sub-factor is taken only once, within the factor where it has the highest weight. A case study is presented.
... Ensuring the confidentiality of information means ensuring that it is not available to, and cannot be discovered by, individuals, entities or processes that are not authorized to access it [Beckers et al. 2015]. Maintaining data integrity means guaranteeing that the data are correct and complete throughout their entire life cycle, thereby ensuring that the data have not been modified in an unauthorized way [Boritz 2005]. Finally, availability is the property that guarantees that the information can be accessed at the moment it is needed, i.e., that the systems used for its storage and processing are functioning correctly, as are the security measures used to protect it and the communication channels needed to access it. ...
... It is known that the functioning of the PM is described by such possible states as serviceable, faulty, diagnosed, and restored [13], [14]. In ISS, risk is considered the possibility of the occurrence of some unfavorable event associated with the unreliability characteristics of the PM, entailing various types of losses [1], [15], [16]. However, approaches associated with risk arising from the reliability characteristics of the PM are not considered in this work, i.e. it is assumed that all PMs are reliable. ...
Article
Full-text available
The analysis shows that the insufficient level of information security in service networks is the main cause of huge losses for enterprises. Despite the appearance of a number of works addressing this problem, there is currently no unified system for assessing information security. This shows that the problem has not yet been sufficiently studied and remains relevant. This work is one of the steps towards creating a system for assessing information security in service networks. The purpose of the work is to develop an algorithm and simulation model, and to analyze the simulation results to determine the main characteristics of the information security system (ISS), providing the ability to completely close all possible channels of threats by controlling all unauthorized access (UA) requests through the protection mechanism (PM). To solve the problem, a simulation method was applied using the principles of queuing systems (QS). This method makes it possible to obtain the main characteristics of the ISS against UA with an unlimited amount of buffer memory (BM). Models, an algorithm and a methodology are proposed for the development of an ISS against UA, which is considered as a single-phase multi-channel QS with an unlimited volume of BM. The process of obtaining simulation results was implemented in the GPSS World modeling system, and comparative analyses of the main characteristics of the ISS were carried out for various distribution laws of the output parameters. In the model, UA requests form the simplest (Poisson) flows, and the service time is subject to exponential, constant and Erlang distribution laws. The experiments conducted on the basis of the proposed models and algorithm for analyzing the characteristics of the ISS against UA as a single-phase multi-channel QS with unlimited waiting time for requests in the queue confirmed the expected results. The results obtained can be used to build new or modify existing ISSs in corporate networks for servicing objects of various purposes. This work is one of the approaches to generalizing the problems under consideration for systems with an unlimited volume of BM. Prospects for further research include the development of principles for the hardware and software implementation of ISSs in service networks.
... For Boritz (2005) there are three main concepts regarding the quality of an information system: (1) the integrity of the information, (2) the integrity of the processing and (3) the reliability of the system. However, given that, due to the internet, people have access to virtually unlimited information, technology plays a very important role in helping users navigate, read, acquire, analyze and apply such information in a constructive way, so that they can make the most of it (Ho et al., 2012). ...
Article
Full-text available
Processes of innovation in the supply chain are frequently examined in depth within large firms across various economic sectors operating in developed economies. However, research applied to small companies operating in emerging markets is still incipient, especially due to the few resources they have, which in turn affects their capacity for innovation. Taking into account the social and economic importance of small enterprises in the economic growth of developing countries, this study focused on the analysis of such enterprises, through a sample of 413 entrepreneurs who own small restaurants in Colombia. The data, processed through PLS-SEM, analyze the perception of this type of company regarding the use of new technologies in the supply chain, through an indirect process of technological innovation, in which they had to embrace the use of various platforms, such as apps or websites, to procure the supplies necessary for their restaurants. The results of this research show that Perceived Usability and Perceived Usefulness are antecedents of Website quality, and that e-satisfaction is considered a result through which the success of the adoption of these technological tools can be indicated, allowing a better daily operation for such small businesses.
... However, in a digital world, people tend to use KYC systems to identify themselves. Customer identification, authentication, and verification form the foundation of KYC procedures, which are essential for many businesses, including banking, telecommunications, and shared services like car rentals [9]. The KYC process is designed to prevent identity theft, financial fraud, money laundering and terrorist financing. ...
... Such restrictions protect company resources such as data, software and sensitive information from theft or misuse. Security represents the foundation of system reliability, as it empowers the other principles of the trust service framework and improves the integrity of accounting information [1,22]. According to Beard [23], the lack of AIS security creates significant business and professional risks which could increase the possibility of manipulation of accounting information. ...
Conference Paper
The main objective of this study is to construct a comprehensive conceptual framework that links artificial intelligence (AI), the trust service framework of system reliability, and the reliability of the accounting information system (AIS). To achieve this objective, the current related literature was synthesized and analyzed using meta-analysis as a methodological approach. The findings of this study suggest that AI techniques can enhance AIS reliability by supporting security, confidentiality, privacy, processing integrity, and availability as the five principles of the trust service framework for system reliability. The proposed conceptual framework provides an explanation for these findings. The implications of this study will benefit a wide range of stakeholders, including users, management, and auditors. Some suggestions for future studies are also provided.
... The security challenges that we aim to solve include the integrity and privacy of IoT multimodal data. Data integrity refers to the assurance of data accuracy, completeness, and consistency [5]. Once data integrity is protected, the source data should remain accurate and reliable during data storage or data access. ...
Article
With the wide application of Internet of Things (IoT) technology, large volumes of multimodal data are collected and analyzed for various diagnoses, analyses, and predictions to help decision-making and management. However, the research on protecting data integrity and privacy is quite limited, while the lack of proper protection for sensitive data may have significant impacts on the benefits and gains of data owners. In this research, we propose a protection solution for data integrity and privacy. Specifically, our system protects data integrity through distributed systems and blockchain technology. Meanwhile, our system guarantees data privacy using differential privacy and Machine Learning (ML) techniques. Our system aims to maintain the usability of the data for further data analytical tasks of data users, while encrypting the data according to the requirements of data owners. We implement our solution with smart contracts, distributed file systems, and ML models. The experimental results show that our proposed solution can effectively encrypt source IoT data according to the requirements of data users while data integrity can be protected under the blockchain.
... Enterprise performance can be significantly predicted through AIS reliability, such as security and integrity (Al-Dmour, A. H., & Al-Dmour, R. H., 2018). Information security is one of the main attributes of information integrity (Boritz, 2005). The accuracy and completeness of accounting information determine the integrity of the information. ...
Article
Full-text available
Purpose: The article analyzes the impact of the accounting information system on the performance of Vietnamese construction enterprises, providing more empirical evidence on this impact. Theoretical framework: This paper uses organizational information processing theory, situation theory, and system theory. Design/Methodology/Approach: The research method uses a questionnaire survey of accountants, chief accountants, and business managers. Firm Performance, AIS Availability, AIS Security and Integrity, AIS Confidentiality and Privacy, and AIS System Quality are measured on a five-point Likert scale (very good, good, moderate, not good, weak). Findings: Research results show that AIS Availability, AIS Security and Integrity, AIS Confidentiality and Privacy, and AIS System Quality have a positive impact on the performance of Vietnamese construction enterprises. Research, Practical & Social implications: Based on the research results, the author proposes recommendations to improve the performance of construction enterprises in Vietnam. Originality/Value: This study fills the gap regarding the impact of the accounting information system on the performance of Vietnamese construction enterprises.
... Secondly, integrity is defined as the representational faithfulness of a subject (Boritz 2005). Specifically, in information security, integrity damage refers to inconsistent or compromised adherence to faithful and true values due to unforeseeable interventions (Vigil et al. 2015), e.g., unauthorized human manipulation. ...
Article
Reliable and accurate information is crucial for decision making in construction projects. However, stakeholders driven by profit have the potential to manipulate information, compromising information authentication and integrity (IAI). Even worse, digitization in the construction industry has made IAI extremely volatile, e.g., through easy copying, modification, and falsification. This study aims to ensure IAI by proposing a four-layer blockchain-based framework combining smart construction objects (SCOs)-enabled oracles and hash-based digital signature techniques to protect both on-chain and off-chain information. Four deployed smart contracts provide three assurance mechanisms, i.e., signature verification, public data validation, and SCO cross-validation, which have been tested to improve tampering detection accuracy by 19%, 11%, and 27%, respectively. The contribution of this study is to illustrate how a reliable and flexible blockchain oracle system can be established with limited resources to handle the concomitant IAI problem and to provide an in-depth understanding of the IAI assurance mechanisms in the proposed framework. Future research can be conducted to reduce the signature size, enhance scalability, and further increase detection accuracy.
... Furthermore, studies on IT governance practices and models, in general, and the resource-event-agent (REA) model can also be identified (e.g., [33][34][35]). There are also analyses related to performance, integrity, risks, success factors, obstacles, or challenges associated with implementing technologies in different areas of accounting that relate to the previous themes, some of them discussing the advantages and disadvantages of outsourcing (e.g., [36][37][38][39][40][41][42]). ...
Article
Full-text available
Accounting has been evolving to follow the latest economic, political, social, and technological developments. Therefore, there is a need for researchers to also include in their research agenda the emerging topics in the accounting area. This exploratory paper selects technological matters in accounting as its research object, proposing a literature review that uses archival research as a method and content analysis as a technique. Using different tools for the assessment of qualitative data, this content analysis provides a summary of those papers, such as their main topics, most frequent words, and cluster analysis. A top journal was used as the source of information, namely The International Journal of Accounting Information Systems, given its scope, which links accounting and technological matters. Data from 2000 to 2022 was selected to provide an evolutive analysis since the beginning of this century, with a particular focus on the latest period. The findings indicate that the recent discussions and trending topics in accounting, including matters such as international regulation, the sustainable perspective in accounting, as well as new methods, channels, and processes for improving the entities’ auditing and reporting, have increased their relevance and influence, enriching the debate and future perspectives in combination with the use of new technologies. Therefore, this seems to be a path to follow as an avenue for future research. Notwithstanding, emerging technologies as a research topic seem to be slower or less evident than their apparent development in the accounting area. The findings from this paper are limited to a single journal and, therefore, this limitation must be considered in the context of those conclusions. Notwithstanding, its proposed analysis may contribute to the profession, academia, and the scientific community overall, enabling the identification of the state of the art of literature in the technological area of accounting.
... Human integrity entails data integrity, i.e., information transfer that is accurate and consistent over its entire life cycle, i.e., free of copying errors (after Boritz (2005)). This, for example, means that we communicate our welfare assessments and contributions, openly and honestly, with the least possible distortion. ...
Article
Full-text available
The current food system is not sustainable. Circular agriculture aims to save the environment and produce food sustainably by closing nutrient cycles, possibly without improving animal welfare. This paper proposes a new conceptual framework, called a circular welfare economy (CWE), to facilitate a transition towards a sustainable agriculture based on integrity. The CWE-framework explains how welfare relates to circular agriculture, how potential conflicts can be solved and what future livestock farming could look like. CWE applies the notion of circularity to welfare defined as the quality of life as perceived by the individual itself. CWE also identifies human integrity, defined as being open and honest, as a sine qua non for sustainability. Animal-welfare problems arise when animals are merely used as a means, e.g., for profits. Instead, profits and circular agriculture are means to the end of welfare. In a CWE, welfare is promoted sustainably, without causing undue need frustration in other individuals. This requires informed moral decision making involving human integrity and closure of welfare-related feedback loops. Conflicts between circular agriculture and animal welfare are solved by weighing all welfare needs impartially. Three future scenarios are presented: Animal-welfare-exclusive circular agriculture, which resembles modern intensive livestock farming, animal-rights agriculture without livestock farming, and a CWE-based agriculture which integrates circular agriculture and animal welfare. In the latter case we will not use animals merely as a means to close nutrient cycles, but take every effort, openly and honestly, to understand and benefit their points of view as we do our own.
... We also noticed interviewees categorizing or grading the relative importance of various aspects of model monitoring as a part of the response to IQ3. Interviewees largely identify monitoring data integrity (Boritz 2005), that is, the accuracy, completeness, and consistency of data for inputs and outputs of a model as a basic requirement. One interviewee remarked, "I think, at the very least, [we need] a system that monitors the exhaust of the model; the outputs and also the inputs of the model." ...
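The interviewees' baseline requirement quoted above, monitoring the accuracy, completeness, and consistency of a model's inputs and outputs, can be illustrated with a few simple checks. The sketch below is a generic illustration under assumed column names and thresholds, not a description of any interviewee's monitoring system.

```python
import pandas as pd

def basic_integrity_report(batch: pd.DataFrame, expected_columns: list[str],
                           max_missing_ratio: float = 0.01) -> dict:
    """Minimal completeness/consistency checks on one batch of model inputs or outputs."""
    missing_columns = [c for c in expected_columns if c not in batch.columns]
    present = [c for c in expected_columns if c in batch.columns]
    worst_missing = float(batch[present].isna().mean().max()) if present else 1.0
    return {
        "missing_columns": missing_columns,               # schema completeness
        "worst_missing_ratio": worst_missing,             # largest per-column share of missing values
        "missing_ratio_ok": not missing_columns and worst_missing <= max_missing_ratio,
        "duplicate_rows": int(batch.duplicated().sum()),  # a simple consistency signal
    }

# Hypothetical usage on a scoring batch with assumed column names:
# print(basic_integrity_report(scores_df, ["customer_id", "score", "timestamp"]))
```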
Article
Predictive models are increasingly used to make various consequential decisions in high-stakes domains such as healthcare, finance, and policy. It becomes critical to ensure that these models make accurate predictions, are robust to shifts in the data, do not rely on spurious features, and do not unduly discriminate against minority groups. To this end, several approaches spanning various areas such as explainability, fairness, and robustness have been proposed in recent literature. Such approaches need to be human-centered as they cater to the understanding of the models to their users. However, there is little to no research on understanding the needs and challenges in monitoring deployed machine learning (ML) models from a human-centric perspective. To address this gap, we conducted semi-structured interviews with 13 practitioners who are experienced with deploying ML models and engaging with customers spanning domains such as financial services, healthcare, hiring, online retail, computational advertising, and conversational assistants. We identified various human-centric challenges and requirements for model monitoring in real-world applications. Specifically, we found that relevant stakeholders would want model monitoring systems to provide clear, unambiguous, and easy-to-understand insights that are readily actionable. Furthermore, our study also revealed that stakeholders desire customization of model monitoring systems to cater to domain-specific use cases.
... Data integrity ensures the correctness and consistency of data throughout its life cycle. It, therefore, plays a vital role in the design, implementation, and utilization of any data management system [3]. The current solution for data integrity verification is through third-party auditing service providers such as Spectra [4]. ...
Preprint
Data tampering is often considered a severe problem in industrial applications as it can lead to inaccurate financial reports or even a corporate security crisis. A correct representation of data is essential for companies' core business processes and is demanded by investors and customers. Traditional data audits are performed through third-party auditing services; however, these services are expensive and can be untrustworthy in some cases. Blockchain and smart contracts provide a decentralized mechanism to achieve secure and trustworthy data integrity verification; however, existing solutions present challenges in terms of scalability, privacy protection, and compliance with data regulations. In this paper, we propose the AUtomated and Decentralized InTegrity vErification Model (AUDITEM) to assist business stakeholders in verifying data integrity in a trustworthy and automated manner. To address the challenges in existing integrity verification processes, our model uses carefully designed smart contracts and a distributed file system to store integrity verification attributes and uses blockchain to enhance the authenticity of data certificates. A sub-module called Data Integrity Verification Tool (DIVT) is also developed to support easy-to-use interfaces and customizable verification operations. This paper presents a detailed implementation and designs experiments to verify the proposed model. The experimental and analytical results demonstrate that our model is feasible and efficient to meet various business requirements for data integrity verification.
... • System quality and information quality theories (DeLone and McLean, 1992;Boritz 2005). ...
Article
We present an integrative review of existing marketing research on mobile apps, clarifying and expanding what is known around how apps shape customer experiences and value across iterative customer journeys, leading to the attainment of competitive advantage, via apps (in instances of apps attached to an existing brand) and for apps (when the app is the brand). To synthetize relevant knowledge, we integrate different conceptual bases into a unified framework, which simplifies the results of an in-depth bibliographic analysis of 471 studies. The synthesis advances marketing research by combining customer experience, customer journey, value creation and co-creation, digital customer orientation, market orientation, and competitive advantage. This integration of knowledge also furthers scientific marketing research on apps, facilitating future developments on the topic and promoting expertise exchange between academia and industry.
... Information quality can be explained as consisting of accuracy, timeliness, completeness and validity characteristics. As indicated by Boritz (2005), the term accuracy refers to data that correspond with the truth and are free of bias. The term completeness relates to information that covers the dimensions of the user's requirements. ...
Article
Full-text available
Quality of information is a priceless asset for an organization to possess, as it assists in carrying out business plans and changes. These business changes usually support management executives in decision making. In view of that, this study examines information quality in AIS and its effects on organizational performance among conventional and Islamic banks in Jordan. To achieve that, proportionate stratified random sampling is applied to the information system users of sixteen conventional and Islamic banks in Jordan. A total of 600 questionnaires were distributed and only 250 of the returned copies were valid, a valid response rate of 41.7%. The study adopts the partial least squares (SmartPLS 3) method for data analysis and hypothesis testing. Findings clearly show that quality of information is key for business growth, as it has a positive effect on organizational performance. Further results show that organizational culture improves and increases business performance when combined with information quality. For this reason, conventional and Islamic banks in Jordan should have well-developed AIS, as it assists organizations in attaining higher performance. There is a need for more development in management skills to fully exploit the AIS in order to realize greater organizational performance. In other words, full implementation of AIS should be given more priority by the managements of these conventional and Islamic banks.
... Diverse preliminary analyses are available on confidentiality-safeguarded data integrity. Conversely, there are several problems which require more analysis of the conception of data integrity from both confidentiality and safety initiatives [7,8]. ...
Chapter
Each information system (IS) has an underlying architecture, although its complexity and scope can vary quite substantially for different kinds of systems. Since design decisions about the architecture define the very foundation of an IS, these design decisions cannot be easily undone or altered after they were made. If not taken seriously enough, improper IS architecture designs can result in the development of systems that are incapable of adequately meeting user requirements. Understanding the concept of good IS architecture design and taking design decisions diligently is, therefore, highly important for an IS development project’s success. In order to answer the question of what constitutes a good IS architecture, this chapter examines the importance of design decisions across a system’s lifecycle. In particular, two different perspectives on the concept of good IS architecture design are explicated: (1) design as a process and (2) design as the outcome of a design process. The two perspectives are closely related to each other and generally help explain the more abstract concept of IS architecture design and particularly the characteristics of a good IS architecture.
Article
Full-text available
Context. An analysis of the service network shows that insufficient information security in service networks is the cause of huge losses incurred by corporations. Despite the appearance of a number of works and materials on standardization, there is currently no unified system for assessing information security. It should be noted that existing methods, as well as accumulated experience in this area, do not completely overcome these difficulties. This circumstance confirms that the problem has not yet been sufficiently studied and, therefore, remains relevant. The presented work is one of the steps towards creating a unified system for assessing information security in service networks. Objective. Development of an algorithm and simulation model, and analysis of simulation results to determine the key characteristics of the Information Security System, providing the capability for complete closure, through the security system, of all potential threat channels by ensuring control over the passage of all unauthorized access requests through defense mechanisms. Method. To solve the problem, a simulation method was applied using the principles of queuing system modeling. This method makes it possible to obtain the main characteristics of the Information Security System against unauthorized access with a limited amount of buffer memory. Results. Algorithms, models, and a methodology have been developed for the development of an Information Security System against unauthorized access, considered as a single-phase multi-channel queuing system with a limited volume of buffer memory. The process of obtaining model results was implemented in the General Purpose Simulation System World modelling system, and comparative assessments of the main characteristics of the Information Security System were carried out for various distribution laws of the output parameters; in this case, unauthorized access requests form the simplest (Poisson) flows, and the service time obeys exponential, constant, and Erlang distribution laws. Conclusions. The conducted experiments based on the algorithm and model confirmed the expected results when analyzing the characteristics of the Information Security System against unauthorized access as a single-phase multi-channel queuing system with a limited waiting time for requests in the queue. These results can be used for the practical construction of new or modification of existing Information Security Systems in service networks of objects of various purposes. This work is one of the approaches to generalizing the problems under consideration for systems with a limited volume of buffer memory. Prospects for further research include the development of principles for the hardware and software implementation of Information Security Systems in service networks.
Article
Full-text available
This study aims to analyze the implementation of IT governance, focusing on the information security system and IT risk analysis at Perguruan Tinggi Harapan Maju, using the COBIT 2019 framework. The research method used is a mixed method with a sequential approach, in which quantitative and qualitative data are not collected simultaneously and the data analysis is also carried out in stages. The overall purpose of this design is for the qualitative data to help explain the quantitative results in detail. The common procedure is to collect survey data, analyze the data, and then follow up with qualitative interviews to help explain the survey responses. Based on the data processing, it was found that the current use of the e-learning and Silam applications at Perguruan Tinggi Harapan Maju already delivers value creation, resulting in increased efficiency. The capability level achieved for the management objectives EDM 03 and DSS 05 is level 1 (initial process), while APO 13 reaches a capability level of 0 (incomplete process). Compared with the expected capability level, there is a gap of 2 to 4 levels, which indicates that the implementation of IT governance at Perguruan Tinggi Harapan Maju is categorized as low. This study contributes to the importance of evaluating IT governance in organizations that have already implemented IT; governance provides information about the current state of IT and leads to alignment between IT and organizational goals. This study uses a higher-education institution as its research object because such institutions do not yet understand the importance of IT governance within their scope.
Article
Full-text available
The UK enacted its first legal measure to address gender pay inequity, the Equal Pay Act 1970, more than 50 years ago. Yet, in 2021, the gender pay gap (GPG) still stood at 15.4%. Departing from the remedial and individual approach that characterises equal pay legislation, the 2017 Gender Pay Gap Information Regulations (the Regulations) require private and voluntary sector organisations with 250+ employees to annually publish pay data broken down by gender. The long-term aspiration of the Regulations is to contribute to closing the GPG within a generation. It is also hoped that they will encourage the public disclosure of pay data and changes in workplace policies to reduce organisational GPGs (immediate aims) and improve employers’ accountability (underlying aim). This paper considers whether the Regulations have what it takes to meet those immediate and underlying aims. Our assessment framework is built on the premise that for public disclosure to be useful and for employers to tackle the causes of the GPG, the information reported must be of sufficient quality, meaningful and relevant. The paper draws on both doctrinal analysis and empirical data reported by FTSE 100 Index companies to assess the Regulations and determine whether they hold the potential to meet those aims.
Article
In this research, the question of knowledge reusability in creating more reliable IT audit plans has been investigated. With the use of appropriate simulation techniques and statistical analysis, it has been proved that the explicit use of self-reflection in IT auditing enables more precise audit plans; therefore, the execution might become more effective. This self-reflection means that auditing methodologies largely depend on the results of previous examinations of certain areas. In fact, the most widespread methods and guidelines are also based on the experience gained through previous examinations. If the results gathered in this way are used, more precise audit plans can be made and the designation of the areas to be examined can become more accurate. If the fact that audit methodologies are primarily based on practical experience is accepted and explicitly formulated, then, with the use of the information acquired in previous audits, better and more precise audit plans can be created. In other words: the assignment of control objectives in certain situations of examination can be done based on the experience from previous audits. Additionally, the audit plans created in this way enable the cost-effective execution of audits, without sacrificing accuracy and reliability. The results of the simulation confirm these statements.
Chapter
Full-text available
The use of modern techniques, such as IOT, AI, and machine learning, has revolutionized the idea of quality and quality control. Auditors face a tidal wave of data. One of the key challenges is how to determine the quality of the data that systems and processes produce. We propose a computational model to learn the inherent uncertainty in data integrity subsumed in the claims actually made by stakeholders within and outside the organization. The decision procedure combines two strong forms of obtaining audit evidence: external confirmation and re-performance. The procedure fits the current computational idea of data-driven assurance, which is consistent with quality 4.0 concepts in quality control and quality audit practices.
Conference Paper
Full-text available
Purpose – The purpose of this paper is to examine the impact of the integration between Extensible Business Reporting Language (XBRL) and the decentralized global distributed ledger system Blockchain on the transformation from continuous auditing to real-time auditing. Design/methodology/approach – Using a content analysis model, this study examines the impact of the integration between XBRL and Blockchain on the transformation from continuous auditing to real-time auditing. Findings – This paper finds that XBRL-Blockchain integration is a good way to activate real-time auditing because of the characteristics of XBRL and Blockchain technology that can support real-time auditing, including transparency, privacy, decentralization, and pre-validation of operations at the same time as they occur, with no possibility of modification or fraud. Therefore, Blockchain is not a substitute for XBRL: the Blockchain is a ledger through which transactions can be conducted, and XBRL is the standard that can standardize the terms and standards for the items exchanged in accounting in those transactions, which means that XBRL supports Blockchain in transparency and trust in transactions. Accordingly, this paper finds that XBRL-Blockchain integration significantly affects the transfer from continuous auditing to real-time auditing. Originality/value – This paper contributes to the literature by providing a proposed framework for the integration between XBRL and Blockchain to move from continuous auditing to real-time auditing.
Article
Information authentication and integrity (IAI) plays a critical role in construction digital transformation. Blockchain technology can prevent information from being fabricated or falsified but cannot guarantee IAI before it enters the on-chain world. To fill this void, this paper describes a blockchain-oriented deployment framework to secure IAI for construction by formulating watermarks from the 5W1H (Who?, What?, When?, Where?, Why?, How?) and hiding them in digital content. It does so by conducting a literature review and industrial stakeholder engagements, based on which the design requirements are derived and a framework with seven modules is developed. By deploying it in a modular construction project, the framework demonstrates its capability to guide the formation and embedding of watermarks, accurately detect and localize integrity damage, and recover the full authentication data even after some degree of tampering. Future studies are encouraged to deploy the framework with various blockchain systems in construction applications.
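A minimal sketch of the underlying idea, deriving a watermark from 5W1H context and checking it later, is given below; the field values and the footer-style "embedding" are assumptions for illustration, whereas the paper hides the watermark inside the digital content itself.

```python
import hashlib, json

# Hypothetical field values and footer-style "embedding"; the paper embeds the
# watermark within the digital content rather than appending it.
def make_watermark(who, what, when, where, why, how, content: bytes) -> str:
    context = json.dumps({"who": who, "what": what, "when": when,
                          "where": where, "why": why, "how": how}, sort_keys=True)
    return hashlib.sha256(context.encode() + content).hexdigest()

def verify(content: bytes, footer: str, **context) -> bool:
    return make_watermark(content=content, **context) == footer

doc = b"approved shop drawing, module M-12"
wm = make_watermark("site engineer", "shop drawing", "2024-05-01",
                    "module yard", "fabrication release", "CAD export", doc)
assert verify(doc, wm, who="site engineer", what="shop drawing", when="2024-05-01",
              where="module yard", why="fabrication release", how="CAD export")
# Any change to doc or to the 5W1H context yields a different watermark,
# so tampering is detectable when the stored footer no longer matches.
```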
Article
Full-text available
Data mining refers to the process of analyzing massive volumes of data, including big data, from different angles in order to identify relationships between data and transform them into actionable information. The revolution brought about by the digitization of the economy and the circulation of all kinds of data, in volumes never reached before and in record time, is fundamentally shaking up the functioning of our organizations. Organizations are thus faced with a gigantic wave of data both endogenous and exogenous to their own environment. Artificial intelligence represents, for its part, the employment-impacting side of this new industrial revolution, promising the conversion of a new category of tasks, previously performed by humans, toward a type of robotization even more elaborate than that of previous industrial revolutions, in volumes that no one can yet fully appreciate today. In order to gain a much broader and clearer vision of data mining and its impact on education, it is necessary to draw up a judicious plan for carrying out the project by addressing the following problem: how can big data influence the education sector?
Article
Full-text available
The amount of data generated by scientific applications on high-performance computing systems is growing at an ever-increasing pace. Most of the generated data are transferred to storage in remote systems for purposes such as backup, replication, or analysis. To detect data corruption caused by network or storage failures during data transfer, the receiver system verifies data integrity by comparing checksums of the data. However, the internal operation of the storage device is not sufficiently considered in existing end-to-end integrity verification techniques. In this paper, we propose a concurrent and reliable end-to-end data integrity verification scheme that takes the internal operation of storage devices into account for data transfer between high-performance computing systems with flash-based storage devices. To perform data integrity verification that also covers corruption occurring inside the storage devices, we control the order of I/O operations in light of the devices' internal operations. To demonstrate the effectiveness of the proposed scheme, we devise a prototype that injects faults at a specific layer of the storage stack and checks whether they are detected. We parallelize checksum computation and overlap it with I/O operations to mitigate the overhead caused by I/O reordering. The experimental results show that the proposed scheme reduces the entire data transfer time by up to 62% compared with existing schemes while ensuring robust data integrity. With the prototype implementation, our scheme detects failures in NAND flash memory inside storage devices that cannot be detected with existing schemes.
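For readers unfamiliar with end-to-end checksum verification, the simplified sketch below hashes a file in fixed-size chunks using a thread pool (overlapping checksum work with I/O) and compares sender-side and receiver-side digests; the chunk size and the local-file "transfer" are assumptions, and the paper's handling of I/O reordering inside flash devices is not modeled.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 4 * 1024 * 1024  # assumed 4 MiB chunk size

def chunk_checksums(path: str) -> list[str]:
    """Hash a file chunk by chunk, computing digests in a thread pool."""
    with open(path, "rb") as f:
        chunks = iter(lambda: f.read(CHUNK), b"")
        with ThreadPoolExecutor() as pool:
            return list(pool.map(lambda c: hashlib.sha256(c).hexdigest(), chunks))

def verify_transfer(src: str, dst: str) -> bool:
    # In a real transfer the source checksums would travel with the data;
    # here both sides are local files for illustration.
    return chunk_checksums(src) == chunk_checksums(dst)
```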
Book
Full-text available
This book, TECHNO-BUSINESS DATA MANAGEMENT, prepares students specializing in Business Management, Database Management, and Knowledge Management, students training to become data managers, and other management students specializing in Computer Applications, Business Data, DBMS, Commerce, Entrepreneurship Management, BBA, MBA, or Business Strategy related subjects, as well as entrepreneurial practitioners. It includes the dynamic concepts of newer entrepreneurial strategies emerging across the world and caters to the BBA and MBA syllabi of the leading Indian universities, specifically Bangalore University, Anna University, Bharathiar University, Kerala University, Calicut University, and other Indian universities. The concepts in this book will prepare entrepreneurial professionals who are evolving into higher-level roles and who can use it for a challenging and rewarding career. Readers can apply these concepts in their day-to-day management strategy functions to achieve effective practical advancement in their careers.
Chapter
Financial technology (FinTech), as part of financial inclusion, changes conventional business models to be information technology minded. The presence of FinTech in the wider community makes access to financial service products, transactions, and payment systems more practical, efficient, and economical. Unfortunately, as the security risk in transacting increases, cyber security in the financial services industry and among FinTech service providers has become a major target of cybercriminals. This study proposes a security management approach through a hybrid blockchain method, implemented through the Flask framework with encryption, to protect transaction data. The results are promising: in terms of accuracy, this study successfully reduces data leakage and the misuse of personal and financial data in FinTechs.
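The chapter does not publish its implementation, but one way to picture "hybrid blockchain via the Flask framework plus encryption" is the sketch below: each transaction payload is encrypted and then appended as a hash-chained block. The endpoint name, fields, and in-memory key handling are assumptions for the example, not the authors' design.

```python
import hashlib, json, time
from cryptography.fernet import Fernet      # pip install cryptography
from flask import Flask, request, jsonify   # pip install flask

app = Flask(__name__)
key = Fernet.generate_key()   # illustration only; a real service would use an HSM/KMS
fernet = Fernet(key)
chain = [{"index": 0, "prev": "0" * 64, "payload": b"", "hash": "genesis"}]

def block_hash(index: int, prev: str, payload: bytes) -> str:
    return hashlib.sha256(f"{index}{prev}".encode() + payload).hexdigest()

@app.post("/tx")
def add_transaction():
    # Encrypt the submitted transaction, then chain it to the previous block.
    payload = fernet.encrypt(json.dumps(request.get_json()).encode())
    prev = chain[-1]["hash"]
    block = {"index": len(chain), "prev": prev, "payload": payload,
             "hash": block_hash(len(chain), prev, payload), "ts": time.time()}
    chain.append(block)
    return jsonify({"index": block["index"], "hash": block["hash"]})

# Run with: flask --app this_module run   (then POST JSON transactions to /tx)
```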
Thesis
Full-text available
Nowadays, pharmaceutical analysis and industry could not be imagined without chromatographic methods such as High Performance Liquid Chromatography (HPLC) and Gas Chromatography (GC). The field of chromatography is therefore firmly anchored in the three important regional pharmacopoeias: EP, USP, and JP. Deficits in the specifications within these pharmacopoeias relating to the detector parameters sampling rate and signal filtration are the major motivation of this thesis. Furthermore, investigations were carried out on data acquisition and processing within the detectors and in the software products dealing with the collected data. Several concepts, such as a double-entry method, smoothing optimization, and persistence-based signal filtration, have been developed and are used as tools to examine the data integrity of commercial Chromatography Data Systems (CDS), to improve the signal-to-noise ratio of small peaks (lowering the LOD and LOQ through efficient denoising), and to determine the suitable filter parameters and sampling rates a user can apply to a system in order to accelerate method development and validation. All developments have been tested on simulated and real chromatograms and have been shown to be suitable for their specific purposes. Opportunities for optimizing the concepts remain, but some new aspects in the long-investigated field of chromatography have been discovered.
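As a generic illustration of the smoothing idea (not the thesis's persistence-based filter), the sketch below applies a moving-average window to a noisy, chromatogram-like trace and compares a crude signal-to-noise estimate before and after; the peak shape, noise level, and window width are assumed example values.

```python
import numpy as np

# Assumed example values for peak shape, noise level, and smoothing window.
def moving_average(signal, window=9):
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

t = np.linspace(0, 10, 2000)
peak = np.exp(-((t - 5.0) ** 2) / (2 * 0.05 ** 2))        # small Gaussian peak
trace = 0.05 * peak + np.random.normal(0, 0.01, t.size)   # weak peak over baseline noise

def snr(sig):
    baseline = sig[:500]                  # region assumed to contain no peak
    return (sig.max() - baseline.mean()) / baseline.std()

print(f"SNR raw: {snr(trace):.1f}  smoothed: {snr(moving_average(trace)):.1f}")
# Smoothing raises the SNR, but too wide a window would also broaden the peak.
```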
Chapter
Full-text available
Predictions are made every day, by everyone. How many bags of rice need to be bought at the supermarket? What is the best route to work? The answers we give to these questions, usually unconsciously, are based on the information available and on past experience: for example, if we consumed two kilos of rice the previous week and there are no extraordinary circumstances in sight, it is quite possible that this is the amount needed for the next week. Naturally, we are also able to incorporate additional information to adjust our decisions. When there is news of an accident on the route we usually take, we try to establish an alternative route. This cognitive procedure is natural.
Article
In a recent survey of academic research, Fintech related topics, broadly classified as crypto-currency studies, were by far the most researched topics in the social sciences. However, we have observed that, perhaps surprisingly, even though crypto-currencies rely on a distributed accounting ledger technology, relatively few of those studies were conducted by accounting academics. While some of the features of a system like Bitcoin do not necessarily rely on a traditional accounting knowledge, this knowledge is key in designing effective real-world distributed systems. Building on a foundational framework developed by Risius and Spohrer (2017), we provide support for their hypothesis that to date, research in this area has been predominantly of a somewhat narrow focus (i.e., based upon exploiting existing programming solutions without adequately considering the fundamental needs of users). This is particularly reflected by the abundance of Bitcoin-like crypto-currency code-bases with little or no place for business applications. We suggest that this may severely limit an appreciation of the relevance and applicability of decentralized systems, and how they may support value creation and improved governance. We provide supporting arguments for this statement by considering four applied classes of problems where a blockchain/distributed ledger can add value without requiring a crypto-currency to be an integral part of the functioning system. We note that each class of problem has been viewed previously as part of accounting issues within the legacy centralized ledger systems paradigm. We show how accounting knowledge is still relevant in the shift from centralized to decentralized ledger systems. We advance the debate on the development of (crypto-currency free) value-creating distributed ledger systems by showing that applying accounting knowledge in this area has potentially a much wider impact than that currently being applied in areas limited to auditing and operations management. We develop a typology for general distributed ledger design which assists potential users to understand the wide range of choices when developing such systems.
Book
Full-text available
A comprehensive entity security program deploys information asset protection through stratified technological and non-technological controls. Controls are necessary for counteracting the risks arising from threats, opportunities, and vulnerabilities in a manner that reduces potential adverse effects to defined, acceptable levels. This book presents a methodological approach in the context of normative decision theory constructs and concepts, with appropriate reference to standards and the respective guidelines. Normative decision theory attempts to establish a rational framework for choosing between alternative courses of action when the outcomes resulting from the selection are uncertain. Through methodological application, decision theory techniques can provide objectives determination, interaction assessments, performance estimates, and organizational analysis. A normative model prescribes what should exist according to an assumption or rule.
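A toy expected-value comparison in the spirit of normative decision theory is sketched below; the loss figure, residual probabilities, and control costs are invented for the example and are not drawn from the book.

```python
# Invented example figures: choose a control by minimizing expected annual cost
# (residual expected loss plus the cost of the control itself).
annual_loss_if_breach = 500_000

controls = {
    # name: (residual breach probability, annual cost of the control)
    "do nothing":        (0.10,      0),
    "MFA rollout":       (0.04, 20_000),
    "full DLP platform": (0.02, 90_000),
}

def expected_annual_cost(residual_prob: float, control_cost: float) -> float:
    return residual_prob * annual_loss_if_breach + control_cost

for name, params in controls.items():
    print(f"{name:18s} expected annual cost = {expected_annual_cost(*params):,.0f}")

best = min(controls, key=lambda name: expected_annual_cost(*controls[name]))
print("Rational choice under these assumptions:", best)
```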
Preprint
Full-text available
Importance of Techno-Business Data for effective management of organisations. Exclusive for Technology Development Leaders.
Article
Full-text available
An Object-Oriented Implementation of Quality Data Products. Acknowledgments: work reported herein has been supported, in part, by MIT's Total Data Quality Management (TDQM) Research Program, MIT's International Financial Services Research Center (IFSRC), Fujitsu Personal Systems, Inc., and Bull-HN.
Article
Full-text available
Information is valuable if it derives from reliable data. However, measurements for data reliability have not been widely established in the area of information systems (IS). This paper attempts to draw some concepts of reliability from the field of quality control and to apply them to IS. The paper develops three measurements for data reliability: internal reliability, which reflects the "commonly accepted" characteristics of various data items; relative reliability, which indicates compliance of data with user requirements; and absolute reliability, which determines the level of resemblance of data items to reality. The relationships between the three measurements are discussed, and the results of a field study are presented and analyzed. The results provide some insightful information on the "shape" of the database that was inspected, as well as on the degree of rationality of some user requirements. General conclusions and avenues for future research are suggested.
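One hypothetical way to operationalize the three measurements as simple proportions over a record set is sketched below; the sanity checks, user rule, and ground-truth source are assumptions, and the paper's own metrics may well differ.

```python
# Hypothetical scoring of the three reliability measurements as proportions.
records = [
    {"age": 34, "email": "a@example.com"},
    {"age": -2, "email": "not-an-email"},   # fails common-sense checks
    {"age": 51, "email": "c@example.org"},
]

def internal_reliability(recs):
    """Share of records passing generic, 'commonly accepted' sanity checks."""
    ok = lambda r: 0 <= r["age"] <= 120 and "@" in r["email"]
    return sum(ok(r) for r in recs) / len(recs)

def relative_reliability(recs, user_rule):
    """Share of records complying with a specific user requirement."""
    return sum(user_rule(r) for r in recs) / len(recs)

def absolute_reliability(recs, ground_truth):
    """Share of records matching independently verified real-world values."""
    return sum(r == g for r, g in zip(recs, ground_truth)) / len(recs)

print(internal_reliability(records))                            # 2 of 3 records pass
print(relative_reliability(records, lambda r: r["age"] >= 18))  # user requires adults only
```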
Article
Full-text available
Information quality (IQ) is an inexact science in terms of assessment and benchmarks. Although various aspects of quality and information have been investigated [1, 4, 6, 7, 9, 12], there is still a critical need for a methodology that assesses how well organizations develop information products and deliver information services to consumers. Benchmarks developed from such a methodology can help compare information quality across organizations, and provide a baseline for assessing IQ improvements.
Article
Full-text available
Poor data quality has far-reaching effects and consequences. This article aims to increase awareness by summarizing the impacts of poor data quality on a typical enterprise. These impacts include customer dissatisfaction, increased operational cost, less effective decision-making, and a reduced ability to make and execute strategy. More subtly perhaps, poor data quality hurts employee morale, breeds organizational mistrust, and makes it more difficult to align the enterprise. Creating awareness of a problem and its impact is a critical first step toward resolving it, yet the needed awareness of poor data quality, while growing, has not been achieved in many enterprises. After all, the typical executive is already besieged by too many problems: low customer satisfaction, high costs, a data warehouse project that is late, and so forth. Creating awareness of accuracy levels and their impacts within the enterprise is thus the first obstacle that practitioners must overcome when implementing data quality programs.
Article
Full-text available
Organizational databases are pervaded with data of poor quality. However, there has not been an analysis of the data quality literature that provides an overall understanding of the state-of-the-art research in this area. Using an analogy between product manufacturing and data manufacturing, this paper develops a framework for analyzing data quality research and uses it as the basis for organizing the data quality literature. The framework consists of seven elements: management responsibilities, operation and assurance costs, research and development, production, distribution, personnel management, and the legal function. The analysis reveals that most research efforts focus on operation and assurance costs, research and development, and production of data products. Unexplored research topics and unresolved issues are identified, and directions for future research are provided.
Article
Protecting the integrity of data will challenge analytical labs as they become compliant with the requirements of 21 CFR Part 11. Other responsibilities include ensuring the reliability and trustworthiness of electronic records used to support particular decisions, such as release of a production batch.
Article
This article presents some general ideas on how various aspects of the systems development life cycle might be measured as one means of assessing and controlling the quality of a system. We feel that this is a fruitful area for research with potentially important industrial and research implications. We strongly urge further research and investigation into systems development quality control based on the framework established in this article.
Article
IS and user departments expect correct information. When embarrassing information mistakes occur, the result is the loss of credibility, business, and customer confidence. This article reviews the causes of incorrect and inaccurate information, examines existing solutions to data quality problems, and discusses the importance of information integrity as the cornerstone to achieving total quality management, business process reengineering, and automated operations objectives.
Article
Poor data quality (DQ) can have substantial social and economic impacts. Although firms are improving data quality with practical approaches and tools, their improvement efforts tend to focus narrowly on accuracy. We believe that data consumers have a much broader data quality conceptualization than IS professionals realize. The purpose of this paper is to develop a framework that captures the aspects of data quality that are important to data consumers. A two-stage survey and a two-phase sorting study were conducted to develop a hierarchical framework for organizing data quality dimensions. This framework captures dimensions of data quality that are important to data consumers. Intrinsic DQ denotes that data have quality in their own right. Contextual DQ highlights the requirement that data quality must be considered within the context of the task at hand. Representational DQ and accessibility DQ emphasize the importance of the role of systems. These findings are consistent with our understanding that high-quality data should be intrinsically good, contextually appropriate for the task, clearly represented, and accessible to the data consumer. Our framework has been used effectively in industry and government. Using this framework, IS managers were able to better understand and meet their data consumers' data quality needs. The salient feature of this research study is that quality attributes of data are collected from data consumers instead of being defined theoretically or based on researchers' experience. Although exploratory, this research provides a basis for future studies that measure data quality along the dimensions of this framework.
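The four categories named in the abstract are often populated with dimensions along the following lines; the sketch below encodes such a hierarchy as a simple lookup, with the dimension lists given as commonly cited examples rather than a verbatim reproduction of the paper's framework.

```python
from typing import Optional

# Assumed example hierarchy: four DQ categories mapped to commonly cited dimensions.
DQ_FRAMEWORK = {
    "intrinsic": ["accuracy", "objectivity", "believability", "reputation"],
    "contextual": ["value-added", "relevancy", "timeliness", "completeness",
                   "appropriate amount of data"],
    "representational": ["interpretability", "ease of understanding",
                         "representational consistency", "concise representation"],
    "accessibility": ["accessibility", "access security"],
}

def category_of(dimension: str) -> Optional[str]:
    """Return the DQ category a given dimension belongs to, if any."""
    for category, dims in DQ_FRAMEWORK.items():
        if dimension in dims:
            return category
    return None

print(category_of("timeliness"))   # contextual
```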
Article
Numerous researchers in a handful of disciplines have been concerned, in recent years, with the special role (or roles) that time seems to play in information processing. Designers of computerized information systems have had to deal with the fact that when an information item becomes outdated, it need not be forgotten. Researchers in artificial intelligence have pointed to the need for a realistic world model to include representations not only for snapshot descriptions of the real world, but also for histories, or the evolution of such descriptions over time. Many logicians have regarded classical logic as an awkward tool for capturing the relationships and the meaning of statements involving temporal reference, and have proposed special "temporal logics" for this purpose. Finally, the analysis of tensed statements in natural language is a principal concern of researchers in linguistics.
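A minimal, assumed valid-time record design illustrates the point that an outdated item need not be forgotten: every version of a fact keeps its validity interval and can be queried "as of" any date. This is a generic temporal-database sketch, not a construct from the survey.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class FactVersion:
    key: str
    value: str
    valid_from: date
    valid_to: Optional[date] = None   # None = still current

history = [
    FactVersion("customer-42/address", "12 Elm St", date(2019, 3, 1), date(2022, 6, 30)),
    FactVersion("customer-42/address", "7 Oak Ave", date(2022, 7, 1)),
]

def as_of(key: str, when: date) -> Optional[str]:
    """Return the value that was valid on the given date, if any."""
    for v in history:
        if v.key == key and v.valid_from <= when and (v.valid_to is None or when <= v.valid_to):
            return v.value
    return None

print(as_of("customer-42/address", date(2020, 1, 1)))  # 12 Elm St (outdated, but not forgotten)
print(as_of("customer-42/address", date(2024, 1, 1)))  # 7 Oak Ave
```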
Article
A framework is described that was developed to help assess communications service quality in a manner that most closely reflects the broad array of performance parameters experienced by customers. Examples of its application are presented. An additional benefit experienced with the use of this framework is a more consistent, disciplined analysis of service parameters, adding to the thoroughness with which service providers and customers alike assess the quality of communications services.
Article
This article presents a methodology and tests its efficacy through a rigorous case study. The players in our study include information producers, who generate and provide information; information custodians, who provide and manage computing resources for storing, maintaining, and securing information; and information consumers, who access and use information [9]. We extend previous research on managing information as product to incorporate the service characteristics of information delivery. We draw upon the general quality literature, which discusses quality as conformance to specification and as exceeding consumer expectations. Our key contributions include developing a two-by-two conceptual model for describing IQ, in which the columns capture quality as conformance to specifications and as exceeding consumer expectations, and the rows capture quality from its product and service aspects. We refer to this model as the product and service performance model for information quality (PSP/IQ).
  • Understandability (understandability/granularity/aggregation)
Transmission was added because, in network-based systems, this phase represents a key information processing phase with special risk and control considerations that should not be overlooked.
References
AICPA/CICA. SysTrust Principles and Criteria for Systems Reliability, Version 2.0. New York/Toronto: American Institute of Certified Public Accountants and the Canadian Institute of Chartered Accountants; January 2001.
Betts M. Dirty data: inaccurate data can ruin supply chains. Computerworld [December].
Bolour A, Anderson TL, Dekeyser LJ, Wong HKT. The role of time in information processing: a survey. SIGMOD Record 1982;12(3):27–50.
PricewaterhouseCoopers. Global data management survey. New York (NY): PricewaterhouseCoopers; 2001.
Kahn. Information quality benchmarks: product and service performance.