Chapter

Applying Verbal Decision Analysis for Public Research and Technology Advancement

Authors:
  • Federal Research Center "Computer Science and Control" of RAS

Abstract

Ensuring the sustainable development of scientific organizations is impossible without coordinating development strategies at all levels of management, taking into account the full set of factors and environmental conditions. The chapter considers the possibility of using probabilistic modeling tools in making management decisions to ensure the sustainable development of scientific organizations. A methodology for modeling the results of implementing management decisions is proposed, and an algorithm for adapting the structure of decision trees to new data arriving in real time has been implemented.
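The adaptive probabilistic modeling the abstract describes can be illustrated with a minimal sketch: a Bayesian (Beta-Bernoulli) update of a decision's estimated success probability as new outcome data arrives. The class name, priors, and data below are illustrative assumptions, not taken from the chapter.

```python
# Minimal sketch (assumed model, not the chapter's implementation):
# track the success probability of a management decision and update it
# in real time as each new outcome is observed.

class DecisionOutcomeModel:
    """Beta-Bernoulli estimate of a decision's success probability."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        self.alpha = alpha  # prior pseudo-count of successes
        self.beta = beta    # prior pseudo-count of failures

    def update(self, success: bool) -> None:
        """Incorporate one newly observed outcome."""
        if success:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def mean(self) -> float:
        """Posterior mean of the success probability."""
        return self.alpha / (self.alpha + self.beta)

model = DecisionOutcomeModel()
for outcome in [True, True, False, True]:
    model.update(outcome)
print(round(model.mean, 3))  # → 0.667 (3 successes, 1 failure, uniform prior)
```

The same update loop could feed a node of a decision tree, re-estimating branch probabilities as monitoring data streams in.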

Article
The paper discusses a new approach to processing the verbal assessments in which experts express their opinions on multiple criteria. An algorithm for the verbal analysis of expert assessments of multi-feature objects is proposed for identifying subgroups of experts with agreed opinions. It is proposed to use the multiplicity indices of multisets to calculate a coefficient of consistency, to use the values of these coefficients jointly to assess the consistency of expert subgroups on a single feature, and to strive for the total maximization of the coefficients of consistency across all features. The complexity of the proposed algorithm is estimated, and the results of calculations for the problem of choosing the best science-intensive technology are presented.
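The idea of a consistency coefficient built from multiset multiplicities can be sketched as follows. The |intersection| / |union| formula below is an illustrative assumption, not necessarily the paper's exact definition.

```python
from collections import Counter

# Illustrative sketch (assumed formula): consistency of two experts' verbal
# grades on one feature, from the multiplicities of their grade multisets.

def consistency(grades_a, grades_b):
    a, b = Counter(grades_a), Counter(grades_b)
    inter = sum((a & b).values())   # elementwise min of multiplicities
    union = sum((a | b).values())   # elementwise max of multiplicities
    return inter / union if union else 1.0

# Two experts grading five objects on a verbal scale.
e1 = ["good", "good", "fair", "poor", "good"]
e2 = ["good", "fair", "fair", "poor", "good"]
print(round(consistency(e1, e2), 3))  # → 0.667
```

A subgroup-identification algorithm could then greedily merge the pair of experts with the highest coefficient and repeat per feature.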
Article
The generation and adoption of technologies have been considered key factors in resolving inconsistencies in research on the phenomenon of innovation, though few studies have taken this stance. In this paper, we empirically test the validity of this theoretical framework using data from 87,911 surveys of firms in 13 European countries from CIS-2012. This information has allowed us to find strong evidence for the existence of four organizational archetypes based on the generation and adoption of technologies, with relevant differences in organizational characteristics and in the tendency toward radical product innovation.
Article
The exponential growth of data in digital environments has highlighted the need to develop analytics processes for data visualization and evaluation. The same applies at the level of university research: universities need specific analytical processes for research evaluation. The aim of our study is to find ways to apply general data-analytics methods and technologies to university research environments in order to help improve their research results. We have found that universities that maintain a Current Research Information System (CRIS, CERIF-compliant) containing high-quality data are able to implement these analytical processes and improve their results in the assessment and evaluation of their research data. This paper explains the process of implementing such methodologies and techniques, and some results are also presented.
Article
This paper analyzes the factors that contribute to or hinder the (joint) adoption of current research information systems (CRIS) and Open Access repositories (OAR) as manifestations of the Open Science principle. Given the significance attached to Open Science principles such as transparency and accessibility, we ask whether CRIS are able to reflect these priorities by providing technical interfaces or by integrating OAR into the CRIS structure. We suggest that the interplay of national and institutional governance structures and policies is crucial for the adequate and efficient integration of CRIS and Open Science. To account for variance in both national and institutional context factors, we compare research institutions in three countries: Italy, the Netherlands, and Germany. The qualitative exploratory analysis suggests that the consistent adoption and implementation of CRIS and Open Access policies in a science system are facilitated by national evaluation or quality-assessment policies. In addition, the integration of Open Access repositories into CRIS is furthered by an institutionalized, efficient, and flexible CRIS infrastructure.
Article
This paper reports on the progress that the OpenAIRE-funded, euroCRIS-led METIS2OpenAIRE project represents for the implementation of the OpenAIRE Guidelines for CRIS Managers. These guidelines have been updated over the past two years to enable a smoother and more effective information-exchange process between CRIS systems and the OpenAIRE aggregation. METIS2OpenAIRE aimed to ensure OpenAIRE compatibility, based on the version 1.1 guidelines, for a first institutional research information management system: METIS at Radboud University Nijmegen in the Netherlands. This three-and-a-half-month project also set out to support a number of parallel implementations by external, budget-neutral stakeholders that would ensure widespread adoption of the newly released CRIS interoperability standard: PURE, the most widely adopted CRIS platform worldwide, and OMEGA-PSIR, a widespread CRIS solution in Poland. The work carried out under the METIS2OpenAIRE initiative and its results as of mid-2018 are described in this piece. These results include the development of a minimally sufficient OpenAIRE validator for CRIS systems. This software, specifically designed to support the interoperability requirements of the METIS institutional CRIS at Radboud, should gradually evolve into the default mechanism for other CRIS systems to test their own level of compliance with the OpenAIRE Guidelines. As for the external stakeholders, by mid-2018 both PURE and OMEGA-PSIR had already made significant progress on the route toward OpenAIRE compatibility.
Article
Does a national research-focused organization need a technically competent leader? This study provides preliminary answers to that question using a natural experiment underway at the Department of Energy (DOE): a leader with significant high-level management experience superior to that of his predecessors, but no relevant technical experience, was appointed to run the vast scientific research operation. The following hypothesis is proposed: a major risk of allowing technically unqualified leadership is that an amateur can be more easily manipulated by special interests against the best interests of the nation. The hypothesis is tested by technically analyzing three case studies on proposals from the Secretary of Energy. The results show 1) requests for budget cuts undermining DOE's mission, 2) requests for redundant studies wasting DOE resources, and 3) counter-productive recommendations derived from a misrepresentation of the studies' results. These preliminary results indicate that technical competence is important for leaders running organizations that oversee research. Finally, a potential policy safeguard against the risks of extreme technical incompetence is provided, which can be applied to either independent or politically appointed bureaucrats.
Article
Knowledge Management (KM) is a vital factor in successfully undertaking projects. The temporary nature of projects necessitates employing useful KM practices to tackle issues such as knowledge leakiness and rework. The Project Management Office (PMO) is a unit within organizations that facilitates and oversees organizational projects. Project Management Maturity Models (PMMMs) describe the development of PMOs from immature to mature levels. The existing PMMMs have focused on Project Management (PM) practices; however, the management of project knowledge at the various levels of maturity has yet to be addressed. This research project was undertaken to investigate that gap by addressing KM practices within the existing PMMMs. Given the exploratory and inductive nature of this research, qualitative methods were chosen as the research methodology. In total, three cases were selected from different industries (research, mining, and government organizations) to provide broad categories for the research, and the research questions were examined using the developed framework.
Article
Bayesian network structure learning is the basis of parameter learning and Bayesian inference. However, finding the optimal structure of a Bayesian network is an NP-hard problem because the computational complexity increases exponentially with the number of nodes. Hence, numerous algorithms have been proposed to obtain feasible solutions, though all of them have certain limitations. In this paper, we adopt a heuristic algorithm to learn the structure of Bayesian networks that gives a reasonable solution to the BN structure learning problem. Our algorithm integrates the PC algorithm with a Particle Swarm Optimization (PSO) algorithm. Moreover, we consider structure priors to improve the performance of our PC-PSO algorithm, and we utilize a new mutation operator called Uniform Mutation by Addition and Deletion (UMAD) together with a crossover operator called Uniform Crossover. Experiments on different networks show that the approach proposed in this paper achieves better Bayesian Information Criterion (BIC) scores than other algorithms.
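The BIC score that such structure-learning algorithms optimize decomposes over nodes. A minimal sketch of the node-wise score for discrete data follows; the function names and toy data are illustrative, not the paper's PC-PSO implementation.

```python
import math
from collections import Counter

# Sketch of the decomposable BIC score used to compare candidate
# Bayesian-network structures (assumed standard form, discrete variables).

def bic_node(data, child, parents, arities):
    """BIC contribution of one node given a candidate parent set."""
    n = len(data)
    # joint counts of (parent configuration, child value) and parent counts
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    loglik = sum(c * math.log(c / parent_counts[pa])
                 for (pa, _), c in counts.items())
    # free parameters: (child arity - 1) per parent configuration
    k = (arities[child] - 1) * math.prod(arities[p] for p in parents)
    return loglik - 0.5 * k * math.log(n)

# Toy data: rows are dicts of variable values.
data = [{"A": 0, "B": 0}, {"A": 0, "B": 0}, {"A": 1, "B": 1}, {"A": 1, "B": 0}]
arities = {"A": 2, "B": 2}
score_with_edge = bic_node(data, "B", ["A"], arities)  # structure A -> B
score_no_edge = bic_node(data, "B", [], arities)       # B with no parents
```

A search heuristic (PSO, hill climbing, and so on) would sum these node scores over the whole graph and keep the structure with the higher total.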
Article
Bayesian network classifiers (BNCs) are powerful tools for knowledge representation and inference under conditions of uncertainty. In contrast to eager learning, lazy learning seeks to improve the classification accuracy of BNCs by forming a decision theory that is especially tailored to the testing instance, though it has received less attention due to the high computational cost at classification time. This study introduces the conditionally independently and identically distributed (c.i.i.d.) assumption to BNCs by assuming that all instances of the same class are conditionally independent of each other and stem from the same probability distribution. Based on this premise, we propose a novel lazy BNC, the semi-lazy Bayesian network classifier (SLB), which transforms each unlabeled testing instance into a series of complete instances with discriminative supposed class labels and then builds class-specific local BNCs for each of them. Our experimental comparison on 25 UCI datasets shows that SLB has modest training-time overheads and low classification-time overheads. The Friedman and Nemenyi tests show that SLB has significant zero-one loss and bias advantages over some state-of-the-art BNCs, such as the selective k-dependence Bayesian classifier, k-nearest neighbor, the lazy Bayesian rule, and averaged n-dependence estimators with lazy subsumption resolution.
Article
With consumers' stricter demands on product quality, product quality has become a factor in the sustainable development of business sectors. Aiming to increase industrial competitiveness, a number of enterprises have adopted the Six Sigma process and gained remarkable results. However, when product quality is assessed, process specifications are commonly asymmetric as well as symmetric, and the manufacturing process usually contains unknown parameters. As a result, sampling is needed for estimation, whereas the samples do not necessarily follow normal distributions. In this study, a new process capability index Yp is proposed for asymmetric specifications, and four types of bootstrap confidence-interval methods are applied to find the bootstrap confidence interval of the index Yp, so that both asymmetric process specifications and non-normal processes can be handled. In addition, this study applies the index to the Six Sigma process and develops a Six Sigma model with measurement, analysis, improvement, and control phases to promote product quality. From a practical point of view, it can solve the problem of insufficient samples in industry and provides an effective evaluation process for application. Lastly, at the empirical stage, this study takes the TFT-LCD manufacturing process as an example to explain the application of this evaluation process. The evaluation model proposed by this study can provide related industries with a convenient and practical method when they conduct Six Sigma quality improvement and process evaluations.
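One of the bootstrap confidence-interval families referred to above, the percentile method, can be sketched as follows. The statistic here is the sample mean rather than the index Yp, purely for illustration; the data and parameters are assumptions.

```python
import random
import statistics

# Sketch of a percentile bootstrap confidence interval: resample the data
# with replacement, recompute the statistic, and take empirical quantiles.

def percentile_bootstrap_ci(sample, stat=statistics.mean,
                            b=2000, alpha=0.05, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(sample)
    boot = sorted(stat([rng.choice(sample) for _ in range(n)])
                  for _ in range(b))
    lo = boot[int((alpha / 2) * b)]          # lower alpha/2 quantile
    hi = boot[int((1 - alpha / 2) * b) - 1]  # upper alpha/2 quantile
    return lo, hi

# Hypothetical process measurements (illustrative only).
data = [9.8, 10.1, 10.0, 10.3, 9.9, 10.2, 10.1, 9.7]
lo, hi = percentile_bootstrap_ci(data)
```

Replacing `stat` with a capability-index function would yield an interval for that index instead, which is the pattern the paper applies to Yp.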
Article
In this paper, a novel robust Bayesian network is proposed for process modeling with low-quality data. Since unreliable data can cause model parameters to deviate from the real distributions and make network structures unable to characterize the true causalities, a data-quality feature is utilized to improve process modeling and monitoring performance. With a predetermined trustworthy center, the data-quality measurement results can be evaluated through an exponential function of Mahalanobis distances. The conventional Bayesian network learning algorithms, including structure learning and parameter learning, are modified by the quality feature in a weighting form, with the intent of extracting useful information and building a reasonable model. The effectiveness of the proposed method is demonstrated on the Tennessee Eastman (TE) benchmark process and a real industrial process.
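The quality-weighting idea can be sketched directly: samples far, in Mahalanobis distance, from the trustworthy center receive exponentially smaller weights in learning. The w = exp(-d²) form and the identity inverse covariance below are illustrative assumptions, not the paper's exact parameterization.

```python
import math

# Sketch of exponential quality weighting by squared Mahalanobis distance
# from a predetermined trustworthy center (assumed form: w = exp(-d^2)).

def quality_weight(x, center, inv_cov):
    """d^2 = (x - c)^T S^{-1} (x - c);  weight = exp(-d^2)."""
    diff = [a - b for a, b in zip(x, center)]
    tmp = [sum(inv_cov[i][j] * diff[j] for j in range(len(diff)))
           for i in range(len(diff))]
    d2 = sum(diff[i] * tmp[i] for i in range(len(diff)))
    return math.exp(-d2)

center = [0.0, 0.0]
inv_cov = [[1.0, 0.0], [0.0, 1.0]]  # identity: reduces to Euclidean distance
print(round(quality_weight([1.0, 0.0], center, inv_cov), 3))  # → 0.368
```

These weights can then multiply each sample's contribution to the count statistics used in structure and parameter learning.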
Article
Starting from a Human Capital Analysis Model, this work introduces an original methodology for evaluating the performance of employees. The proposed architecture, particularly well suited to the special needs of knowledge-based organizations, is articulated into a framework able to manage cases where data is missing and an adaptive scoring algorithm that takes into account seniority, performance, and performance-evolution trends, allowing employee evaluation over longer periods. We developed a flexible software tool that gathers data from organizations automatically, through adapted connectors, and generates abundant results on the measurement and distribution of employees' performance. The main challenges of human resource departments, namely the quantification of human resource performance, the analysis of the distribution of performance, and the early identification of employees likely to leave the workforce, are handled through the proposed IT platform. Insights are presented at different granularity levels, from the organization view down to department, group, and team.
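An adaptive score combining seniority, current performance, and performance trend might be sketched as below. The weights, the seniority cap, and the score scale are assumptions for illustration only; the paper's actual algorithm is not reproduced here.

```python
# Illustrative sketch (assumed weights and scales, not the paper's algorithm):
# combine current performance, performance-evolution trend, and seniority.

def employee_score(perf_history, seniority_years,
                   w_perf=0.6, w_trend=0.25, w_senior=0.15):
    """Score from a performance history (values in [0, 1]) and seniority."""
    current = perf_history[-1]  # latest performance rating
    trend = (perf_history[-1] - perf_history[0]
             if len(perf_history) > 1 else 0.0)  # long-run evolution
    seniority = min(seniority_years / 10.0, 1.0)  # cap contribution at 10 years
    return w_perf * current + w_trend * trend + w_senior * seniority

score = employee_score([0.6, 0.7, 0.8], seniority_years=5)
```

Evaluating the same function over successive periods gives the longer-horizon view the abstract mentions, and low or declining scores could feed an attrition-risk flag.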
Article
In this paper, the Conditional Gaussian Networks (CGNs), a form of Bayesian Networks (BN), are used as a statistical process monitoring approach to detect and diagnose faults. The proposed approach improves the structure of Bayesian networks and generalizes a few results regarding statistical tests and the use of an exclusion criterion. The proposed framework is evaluated using data from the benchmark Tennessee Eastman Process (TEP) with various scenarios.
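The monitoring logic behind such statistical tests can be sketched in the univariate Gaussian case: fit the normal-operation model, then flag observations whose standardized squared deviation exceeds a threshold. This is a deliberate simplification of the paper's Conditional Gaussian Network approach, with assumed data and threshold.

```python
import statistics

# Univariate sketch of Gaussian model-based fault detection (illustrative
# simplification of CGN monitoring): fit normal operation, test new points.

def fit(normal_data):
    """Estimate the normal-operation mean and standard deviation."""
    return statistics.mean(normal_data), statistics.stdev(normal_data)

def is_fault(x, mu, sigma, threshold=9.0):
    """Flag a fault when the squared z-score exceeds ~3-sigma squared."""
    return ((x - mu) / sigma) ** 2 > threshold

# Hypothetical normal-operation measurements.
normal = [10.0, 10.2, 9.9, 10.1, 9.8, 10.0, 10.1, 9.9]
mu, sigma = fit(normal)
```

A full CGN extends this to many variables by conditioning each Gaussian on its parents in the network, so the residual of each node can localize the fault.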
Article
This study investigates the effects of public research organizations' technology entrepreneurship and external relationships on technology transfer performance. The results show that the degree of collaboration between public research organizations and industry has a negative effect on spin-off creation but positive effects on technology license agreements and licensing income. In addition, both organizational- and individual-level technology entrepreneurship positively affect public research organizations' technology licensing income and, simultaneously, negatively moderate the relationship between the degree of collaboration with industry and spin-off creation. Therefore, the top management of public research organizations and policy makers should consider these different mechanisms and the double-edged-sword effects of external relationships when formulating strategies or policies for technology commercialization.
Article
Research information systems are often described as important tools to enhance open innovation, directly gathering information on scientific and technological research and unlocking it for stakeholders. Current Research Information Systems (CRIS) are mostly developed within a single organization with a context-specific vocabulary, and generally use the Common European Research Information Format (CERIF) for data storage. Although CERIF allows almost unlimited possibilities for modeling research information, it has, like any standard, limitations when it comes to communicating with end-users in terms of semantics. However, if one wants to exchange the information kept within CRIS systems, it is essential to include research information and classification governance, as this is key to truly comparable and thus meaningful information. In this paper, we elaborate on research information and classification governance as a prerequisite for establishing true semantic interoperability of CRIS systems in inter-organizational contexts.
Article
Across Europe, different kinds of research organisations are confronted with the challenge of managing their most valuable, knowledge-based resources in a more explicit and transparent manner. Over the last few years, a small number of research organisations have started to implement new instruments for managing and measuring their knowledge-based resources and processes. The research organisations ARC (Austria) and DLR (Germany) were the first European research organisations to publish Intellectual Capital Reports for their entire organisation, using a similar model that addresses both internal management and external reporting. In both organisations, the newly established instruments are based on an indicator system that integrates indicators on the different forms of Intellectual Capital (IC), the value-added processes, and the results of the organisational knowledge-production processes. Based on the experience of implementing and running the system for four years, various aspects and the lessons learned are discussed. One of the main benefits of these IC measurement and reporting systems is that the organisations learn about their knowledge-production processes. However, there are trade-offs between internal management use and external reporting, as well as limits to comparing indicators between organisations.