Chetan Arora’s research while affiliated with University of Luxembourg and other places


Publications (1)


Figure captions:
  • Fig. 2 Procedure for domain model construction. The noun phrases provided to the experts are for scoping domain modeling and ensuring that feasible completeness is achievable.
  • Fig. 6 Plots showing the relationship between unsupported domain model elements and the number of omissions of different types.
  • (a) Examples of “shall” requirements; (b) example domain model represented as a UML class diagram.
  • Examples of omittable requirement segments.
  • Examples of trace links from a requirement to the domain model.

An empirical study on the potential usefulness of domain models for completeness checking of requirements
  • Article
  • Full-text available

August 2019 · 445 Reads · 33 Citations

Empirical Software Engineering

Chetan Arora · Mehrdad Sabetzadeh ·

Domain modeling is a common strategy for mitigating incompleteness in requirements. While the benefits of domain models for checking the completeness of requirements are anecdotally known, these benefits have never been evaluated systematically. We empirically examine the potential usefulness of domain models for detecting incompleteness in natural-language requirements. We focus on requirements written as “shall”-style statements and domain models captured using UML class diagrams. Through a randomized simulation process, we analyze the sensitivity of domain models to omissions in requirements. Sensitivity is a measure of whether a domain model contains information that can lead to the discovery of requirements omissions. Our empirical research method is case study research in an industrial setting. We have experts construct domain models in three distinct industry domains. We then report on how sensitive the resulting models are to simulated omissions in requirements. We observe that domain models exhibit near-linear sensitivity to both unspecified (i.e., missing) and under-specified requirements (i.e., requirements whose details are incomplete). The level of sensitivity is more than four times higher for unspecified requirements than under-specified ones. These results provide empirical evidence that domain models provide useful cues for checking the completeness of natural-language requirements. Further studies remain necessary to ascertain whether analysts are able to effectively exploit these cues for incompleteness detection.
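The randomized omission simulation described in the abstract can be sketched roughly as follows: requirements are traced to the domain-model elements they mention, random requirements are omitted, and sensitivity is measured as the fraction of model elements left without any supporting requirement (the cues an analyst could use to spot an omission). All data and names below are invented placeholders for illustration, not the study's actual case-study material or procedure.

```python
import random

# Hypothetical trace links: each requirement maps to the domain-model
# elements (classes/associations) it covers. Invented toy data.
trace_links = {
    "R1": {"Sensor", "Measurement"},
    "R2": {"Measurement", "Alarm"},
    "R3": {"Alarm", "Operator"},
    "R4": {"Operator"},
}

def sensitivity_after_omission(trace_links, omitted):
    """Fraction of domain-model elements left without any supporting
    requirement once the `omitted` requirements are removed."""
    all_elements = set().union(*trace_links.values())
    remaining = [r for r in trace_links if r not in omitted]
    supported = (
        set().union(*(trace_links[r] for r in remaining)) if remaining else set()
    )
    return len(all_elements - supported) / len(all_elements)

def simulate(trace_links, k, runs=1000, seed=0):
    """Randomized simulation: omit k requirements at random, many times,
    and average the resulting fraction of unsupported elements."""
    rng = random.Random(seed)
    reqs = list(trace_links)
    return sum(
        sensitivity_after_omission(trace_links, set(rng.sample(reqs, k)))
        for _ in range(runs)
    ) / runs

print(round(simulate(trace_links, k=1), 3))
```

In this toy setup, omitting "R1" leaves "Sensor" unsupported (sensitivity 0.25), while omitting "R4" leaves nothing unsupported because "Operator" is still covered by "R3"; averaging over many random omissions of increasing size is one plausible way to obtain sensitivity curves of the kind the study reports.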


Citations (1)


... Arora et al. [4] evaluate domain models' effectiveness in detecting incompleteness in natural-language requirements, finding near-linear sensitivity to both unspecified and under-specified requirements. The results suggest that domain models offer valuable cues for identifying omissions, prompting further investigation into analysts' ability to leverage these cues for incompleteness detection. ...

Reference:

ARIA-QA: AI-agent based requirements inspection and analysis through question answering
An empirical study on the potential usefulness of domain models for completeness checking of requirements

Empirical Software Engineering