A.K. Onoma’s research while affiliated with Hitachi, Ltd. and other places


Publications (18)


An object-based environment (Opusdei) for software development and maintenance
  • Article

January 2012 · 23 Reads · International Journal of Artificial Intelligence Tools

A.K. Onoma · H. Suganuma · M. Poonawala · [...] · T. Syomura

This paper discusses an object-based software development and maintenance environment, Opusdei, built and used for several years at Hitachi Software Engineering (HSK); since 1994, the University of Minnesota has also been involved in the Opusdei project. Industrial software is usually large, has many versions, undergoes frequent changes, and is developed concurrently by multiple programmers. Opusdei was designed to handle various problems inherent in such industrial environments. In Opusdei, all information needed for development is stored using a uniform representation in a central repository, and the various documents and views of the software artifacts can be generated automatically using the tool repository. Opusdei's innovative capabilities are 1) uniform representation of software artifacts, 2) maintenance of inter-relations and traceability among software artifacts, 3) tool coordination and integration using tool composition scenarios, and 4) automatic documentation and version control. Tool coordination and composition have been discussed in the literature as a possible way to make software development environments more intelligent. Opusdei provides a uniform representation of software artifacts and tools, which is an essential first step in addressing the issues of tool coordination and composition. Opusdei has been operational for several years and has been used in many large software development projects. The productivity gains reported for some of these projects using Opusdei ranged from 50% to 90%.
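
The abstract describes a uniform artifact representation with traceability links in a central repository but not its actual data model; the minimal Python sketch below only illustrates the general idea, and every class, field, and function name in it is an assumption rather than Opusdei's real schema.

    from dataclasses import dataclass, field

    # Illustrative only: not Opusdei's actual data model.
    @dataclass
    class Artifact:
        name: str                     # e.g. "billing.c", "billing-design.doc"
        kind: str                     # "source", "design", "test", ...
        version: int
        content: str
        links: list = field(default_factory=list)  # traceability to (name, version) pairs

    repository = {}                   # central repository keyed by (name, version)

    def store(artifact):
        repository[(artifact.name, artifact.version)] = artifact

    store(Artifact("billing-design.doc", "design", 2, "..."))
    store(Artifact("billing.c", "source", 5, "...",
                   links=[("billing-design.doc", 2)]))

    # A "view" (e.g. a traceability report) generated from the uniform representation:
    for (name, version), art in repository.items():
        for target in art.links:
            print(f"{name} v{version} -> {target[0]} v{target[1]}")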


A software test and evaluation environment based on longitudinal database

November 2011 · 17 Reads · International Journal of Software Engineering and Knowledge Engineering

Assuring the quality of software by running test cases and evaluating the results is one of the most difficult parts of an entire software development project. The difficulty usually comes from the lack of appropriate supporting tools and the complexity of the software. In the past, ad hoc supporting tools were built for each project, and test results were usually not reused across projects. This conventional way of test and evaluation (T&E) is time-consuming, and the most important decision, "When is this software ready to ship?", is left to engineers relying on their experience. Our objective is to build a knowledge-based T&E environment in which test cases, test results, object snapshots, and other information are accumulated in a database. These longitudinal data can be automatically tracked and analyzed to provide decision-support information. As a result, test results can be reviewed repeatedly, and software quality can be assured by analyzing these data from various perspectives.
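
As a rough sketch of what such a longitudinal T&E store could look like (the table and column names below are assumptions, not the paper's schema), accumulated runs can be queried across builds for decision support:

    import sqlite3

    # Illustrative schema only; an in-memory DB stands in for the persistent
    # longitudinal database described in the paper.
    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE test_case(id INTEGER PRIMARY KEY, project TEXT, name TEXT);
    CREATE TABLE test_run(
        id INTEGER PRIMARY KEY,
        case_id INTEGER REFERENCES test_case(id),
        run_date TEXT, build TEXT, passed INTEGER);
    """)
    db.execute("INSERT INTO test_case(project, name) VALUES ('ProjX', 'login-01')")
    db.execute("INSERT INTO test_run(case_id, run_date, build, passed) "
               "VALUES (1, '2011-10-01', 'b41', 0)")
    db.execute("INSERT INTO test_run(case_id, run_date, build, passed) "
               "VALUES (1, '2011-10-08', 'b42', 1)")

    # Decision-support query: failure rate per build over the accumulated history.
    for build, failure_rate in db.execute(
            "SELECT build, 1.0 - AVG(passed) FROM test_run GROUP BY build"):
        print(build, failure_rate)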


A Distributed Program Management Environment on X.500

August 2008 · 20 Reads

This paper discusses an integrated software program management environment, ScmEngine, being built at the University of Minnesota. Industrial software is usually large, has many versions, undergoes frequent changes, and is developed concurrently by multiple programmers. In ScmEngine, all information needed for program management is stored using a uniform representation in a distributed repository built on top of X.500, and the various documents and views of the software artifacts can be generated automatically using the tool repository. The innovative capabilities of this tool are 1) uniform representation of software artifacts, 2) maintenance of inter-relations and traceability among software artifacts, 3) a tools repository and integration using tool composition scenarios, and 4) automatic documentation and version control.


ScmEngine: A distributed software configuration management environment on X.500

November 2006 · 28 Reads · 3 Citations · Lecture Notes in Computer Science

This paper presents a new approach to distributed software configuration management using the X.500 model. It discusses an integrated software configuration management environment, ScmEngine, being built at the University of Minnesota. Large software usually has many versions, undergoes frequent changes, and may be developed concurrently by groups of programmers at different sites. In ScmEngine, all information needed for software configuration management is stored using a uniform representation in a distributed repository built on the X.500 model, and the various documents and views of the software artifacts can be generated automatically using configuration tools. The innovative capabilities of ScmEngine with these tools are 1) distributed configuration management and version control, 2) uniform representation of software artifacts in a distributed context, 3) maintenance of inter-relations and traceability among software artifacts, and 4) a tools repository and integration using tool composition scenarios.
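
The abstract does not describe how artifacts are laid out in the directory. Purely as an illustration of storing a versioned artifact as an X.500-style directory entry, here is a sketch using the python-ldap bindings; the server URI, DN layout, object class, and attribute names are all invented for this example and are not ScmEngine's schema.

    import ldap
    import ldap.modlist

    # All names below (server, DN layout, objectClass, attributes) are hypothetical.
    conn = ldap.initialize("ldap://directory.example.edu")
    conn.simple_bind_s("cn=admin,o=ScmEngine", "secret")

    dn = "cn=parser.c+v=1.3,ou=src,ou=ProjectX,o=ScmEngine"
    attrs = {
        "objectClass": [b"top", b"softwareArtifact"],
        "cn": [b"parser.c"],
        "artifactVersion": [b"1.3"],
        "derivedFrom": [b"cn=parser.c+v=1.2,ou=src,ou=ProjectX,o=ScmEngine"],
        "tracesTo": [b"cn=parser-design+v=2.0,ou=design,ou=ProjectX,o=ScmEngine"],
    }
    conn.add_s(dn, ldap.modlist.addModlist(attrs))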


Framework and network based multimedia object management environment

January 2006 · 9 Reads

Because multimedia objects are becoming more prevalent and their volume is ever increasing, an efficient multimedia object management environment is a matter of increasing urgency. In recognition of this need, we developed the multimedia object management environment (MOME), which includes a suite of tools such as the Vortex framework and the network file indexer (NFI). MOME also includes a fully featured graphical user interface for maximum user control and flexibility. Using metadata, it automatically generates indexes and paths for different types of multimedia objects and allows users to quickly find what they are looking for. In this paper, we address the background, architecture, and performance of MOME in detail.
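
The abstract gives only the high-level architecture; as a loose illustration of metadata-driven indexing (not the actual Vortex framework or NFI), files could be grouped by media type as below, with the directory path being a placeholder:

    import json
    import mimetypes
    import os
    from collections import defaultdict

    # Loose illustration only; not MOME's actual indexing scheme.
    def build_index(root):
        index = defaultdict(list)
        for dirpath, _, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                media_type, _ = mimetypes.guess_type(name)
                kind = (media_type or "unknown").split("/")[0]  # image, video, audio, ...
                index[kind].append({"path": path, "size": os.path.getsize(path)})
        return index

    print(json.dumps(build_index("/srv/media"), indent=2))  # placeholder path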


Numerical software quality control in object oriented development

November 2005 · 15 Reads

This paper proposes a new method to predict the number of remaining bugs at the delivery inspection applied to every iteration of object-oriented development (OOD). Our method consists of two parts. The first estimates the number of remaining bugs by applying the Gompertz curve. The second uses an interval estimation technique called the object-oriented quality probe (OOQP). The basic idea of OOQP is to randomly extract a relatively small number of test cases, usually 10 to 20% of all test cases, and to execute them in the actual operating environment. From the OOQP test results, we can efficiently predict the number of remaining bugs by interval estimation. The main problem with OOQP is that OOD relies on system design specification documents whose contents, such as UML, tend to be ambiguous. Our estimation method works well in a matrix-type organization where a QA team and a development team work together to improve software quality.
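
The abstract does not give the exact formulation; as a rough sketch of the first part, a Gompertz growth curve can be fitted to cumulative defect counts and extrapolated to estimate the remaining bugs. The data, parameter names, and starting values below are purely illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    # Gompertz growth curve: cumulative bugs found by test day t;
    # K is the asymptotic total, b and c are shape/rate parameters.
    def gompertz(t, K, b, c):
        return K * np.exp(-b * np.exp(-c * t))

    # Hypothetical cumulative bug counts for ten test days (illustrative data only).
    days = np.arange(1, 11)
    found = np.array([3, 8, 15, 24, 31, 36, 40, 42, 43, 44])

    (K, b, c), _ = curve_fit(gompertz, days, found, p0=[50.0, 3.0, 0.3], maxfev=10000)
    print(f"estimated total bugs: {K:.1f}, estimated remaining: {K - found[-1]:.1f}")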


DPSSEE: a distributed proactive semantic software engineering environment

January 2004 · 8 Reads · 2 Citations

We present a distributed proactive semantic software engineering environment (DPSSEE) that incorporates logic rules into the software development process to capture semantics from all levels of the software life cycle. The paper introduces the syntax and semantics of the rule description language (RDL) employed by DPSSEE and two working scenarios that illustrate the use of proactive rules for workflow control and design consistency checking.
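
RDL's actual syntax and semantics are defined in the paper and are not reproduced here; the sketch below only conveys, in plain Python, the general idea of a proactive event-condition-action rule used for design consistency checking. All names in it are assumptions.

    from dataclasses import dataclass
    from typing import Callable

    # Generic event-condition-action rule, illustrative only; not RDL.
    @dataclass
    class Rule:
        event: str                          # e.g. "design.class.renamed"
        condition: Callable[[dict], bool]
        action: Callable[[dict], None]

    rules = [
        Rule(
            event="design.class.renamed",
            condition=lambda ctx: ctx["old_name"] in ctx["source_index"],
            action=lambda ctx: print(
                f"Consistency check: update code referring to {ctx['old_name']}"
            ),
        )
    ]

    def fire(event, ctx):
        # Proactively evaluate every rule registered for this event.
        for rule in rules:
            if rule.event == event and rule.condition(ctx):
                rule.action(ctx)

    fire("design.class.renamed",
         {"old_name": "Order", "new_name": "PurchaseOrder",
          "source_index": {"Order": ["order.py"]}})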


A new defining approach for software requirement specifications

June 2003 · 19 Reads · 4 Citations

Due to business demands, including cost cutting and schedule shortening, we see many software development projects that produce software specifications directly without writing a requirement specification. Such projects have many problems. This paper describes the ZC method, which defines the requirement specification by visualizing the requirement features. In the ZC method, the objects to be developed are drawn as circles, and the relations among the objects are defined by lines; our empirical study found that a single A4 page represents approximately 10 KLOC. We confirmed that the ZC method allows software engineers to easily and precisely define user requirements. The ZC method is at the experimental stage, and we are applying it in actual projects.


Hypothesis testing for module test in software development

February 2002 · 60 Reads · 1 Citation

One of the most important issues in software development is how to guarantee that the software satisfies the quality level defined in the requirement specification. This paper proposes that the issue can be addressed as follows: first, the number of test cases is statistically calculated from the failure density defined in the requirement specification; then the selected test cases are executed based on hypothesis testing. This paper also presents how our method can be used for debugging. When calculating the number of test cases, we applied the statistical behavior of software quality to integration testing. We did not, however, consider the ripple effect, since it cannot be measured. In order to guarantee 4σ and 5σ quality, we found that many more test cases are needed than were previously believed sufficient.
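
The abstract does not give the calculation itself; one standard way to relate a required failure density to a test-case count is a zero-failure hypothesis test, where the required number of passing test cases n satisfies (1 - p)^n <= alpha. The sketch below uses this textbook relation, which may differ from the paper's actual method, and the 4σ/5σ defect rates are the commonly quoted Six Sigma values rather than figures from the paper.

    import math

    # Zero-failure demonstration test: reject H0 "failure probability >= p" at
    # significance level alpha when all n executed test cases pass, i.e.
    # (1 - p)**n <= alpha. Illustrative only; not necessarily the paper's method.
    def required_test_cases(p, alpha=0.05):
        return math.ceil(math.log(alpha) / math.log(1.0 - p))

    # Commonly quoted Six Sigma defect rates (defects per opportunity):
    print(required_test_cases(6210 / 1_000_000))   # ~4-sigma level
    print(required_test_cases(233 / 1_000_000))    # ~5-sigma level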


University software education matched to social requests

February 2002 · 3 Reads

We raise the issue of how software education that matches social demands should be carried out in universities in this cyber-world era. In the high-tech era, it is unavoidable that almost all high-tech products are provided as black boxes. For this reason, it is also unavoidable that education in the high-tech era will, instead of extracting the benefit of those products, force us to memorize the "manners" institutionalized by a few high-tech gurus. As basic tools, everybody has to learn word processors such as Word and Ichitaro, PowerPoint and Excel, and LaTeX, including tgif. On the other hand, special subjects such as programming languages, programming language theory, compiler theory, OS theory, DBMS, Internet technology, business models, and software engineering are needed to become a software specialist. Mathematics, such as differential and integral calculus, and physics, such as general dynamics, must also be required as the liberal arts of software science. Laws related to patents and copyright, accountancy for a stock flotation, and logical thinking methods are subjects that should be studied as applied subjects. In order to prepare an educational environment covering this wide scope of subjects, we must tackle issues such as the restriction on the number of credits and the availability of lecturing staff. Such issues may be solved by applying less strict criteria when hiring lecturers, reexamining the whole curriculum, or changing how companies examine new hires. This paper presents some proposals and remedies applicable to universities in Japan.


Citations (9)


... Software bugs are costly to detect and rectify (Onoma et al., 1998; Engström & Runeson, 2010; Böhme & Roychoudhury, 2014). Due to deadline-meeting demands, software programs are often delivered with known or unknown bugs (Anvik et al., 2005). ...

Reference: Adversarial patch generation for automated program repair
Regression Testing in an Industrial Environment
  • Citing Article
  • Full-text available
  • May 1998

Communications of the ACM

... In practice, the role of the IS department is much broader during the deployment stage, as illustrated by the ubiquitous help desk. Published evaluations of software maintenance practices tend to concentrate on the narrow issue of efficiently handling change requests and bug fixes (Briand, et al, 1998; Onoma, et al, 1995; Singer, 1998; West, 1996). For example, a common denominator in these papers is the emphasis that is placed on a presence of what is termed a bug tracking system, historical database of changes, or change management. ...

Software maintenance—an industrial experience
  • Citing Article
  • September 1995

Journal of Software Maintenance Research and Practice

... In contrast to other approaches, such as CME [25] or ScmEngine [10], the storage model does not impose any semantic relationship among the versions of an artifact. In particular, the tree-structured revision and variant relationships that are found in many, but by no means all, CM systems are not present in the directed versioned graph. ...

ScmEngine: A distributed software configuration management environment on X.500
  • Citing Conference Paper
  • November 2006

Lecture Notes in Computer Science

... The use of a rule oriented approach for workflow control and design consistency checking has been illustrated in DPSSEE (Deng et al, 2003). The approach of semantics of software project for perception and cognition is broadened to introduce logic rules to all levels of the software life cycle. ...

DPSSEE: a distributed proactive semantic software engineering environment
  • Citing Conference Paper
  • January 2004

... In addition, students can iterate these tests at any suitable moments, thus our method makes the teaching/learning process more agile. However, most of the teachers or course material developers have been awkward to apply the advanced ideas from agile software development [23]. This is partly because there is a belief that beginners' course has little to do with advanced techniques and partly because the cost of courseware development tends to increase. ...

A new defining approach for software requirement specifications
  • Citing Conference Paper
  • June 2003

... Regression testing refers to the testing approach where a modified version of a component or application is tested, in order to ensure that existing features are still intact. This testing approach and other testing methods have been used by Beydeda [Beydeda and Gruhn 2002] and Yamaura [Yamaura and Onoma 2002]. ...

Hypothesis testing for module test in software development
  • Citing Conference Paper
  • February 2002

... Scenarios of important use cases are also prioritized by considering different usage metrics such as total number of actors/objects involved, usage frequency of an object/actor in each step of the use case. Use case prioritization has been used in different activities in the software engineering process [24,21]. In [21], A. K. Onoma et al. present a technique for the management of software development by utilizing the ranking information associated with each use case. ...

Management of object oriented development based on ranked use cases
  • Citing Conference Paper
  • September 1997

... The quality of such systems becomes an issue, especially when this characteristic can be understood differently by people with diverse backgrounds. It is a vital matter for software engineers, business managers, and researchers [51,111,81,161,116,114,69,126]. Software quality, according to the definition by IEEE Standard 1061 [54], "is the degree to which software possesses a desired combination of quality attributes". ...

Practical steps toward quality development
  • Citing Article
  • October 1995

IEEE Software