Sai Peck Lee's research while affiliated with University of Malaya and other places

Publications (122)

Article
Measuring and estimating the reusability of software components is important towards finding reusable candidates. Researchers have shown that software metrics can be effectively used to assess software reusability. This work provides a systematic literature review to investigate the main factors that influence software reusability and how these ide...
Article
Full-text available
Dynamic software updating (DSU) is shifting gears to modify software systems without a halt. Even though extensive research has been conducted on DSU, it is necessary to synthesise and map the results of recent studies on DSU for prospective research highlights. This study aims to highlight the current state-of-the-art, to recognise trends, and to...
Article
Full-text available
Predicting the number of defects in software at the method level is important. However, little or no research has focused on method-level defect prediction. Therefore, considerable efforts are still required to demonstrate how method-level defect prediction can be achieved for a new software version. In the current study, we present an analysis of...
Article
Context: Multiple fault localization (MFL) is the act of identifying the locations of multiple faults (more than one fault) in a faulty software program. This is known to be more complicated, tedious, and costly than the traditional practice of presuming that a software program contains a single fault. Due to the increasing interest in MFL by...
Article
Full-text available
Dynamic software updating (DSU) is the act of modifying software without stopping its execution. DSU is employed to preserve the high availability in the deployed software systems. Although significant investigations have been conducted on static analysis (SA) to determine DSU errors, no particular study exists that explores the effects of SA for d...
Article
Full-text available
Data preprocessing remains an important step in machine learning studies. This is because proper preprocessing of imbalanced data can enable researchers to reduce defects as much as possible, which, in turn, may lead to the elimination of defects in existing data sets. Despite the remarkable achievements that have been accomplished in machine learn...
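As an illustration of the preprocessing step this work is concerned with, the sketch below balances a defective/non-defective data set by simple random oversampling; the use of random oversampling (rather than any particular technique from the study) is an assumption made purely for illustration.

    import random

    def random_oversample(samples, labels, seed=0):
        # samples: feature vectors (e.g., static metrics per module)
        # labels:  0 = non-defective, 1 = defective; class 1 is assumed to be the minority
        rng = random.Random(seed)
        defective = [s for s, y in zip(samples, labels) if y == 1]
        clean = [s for s, y in zip(samples, labels) if y == 0]
        # Duplicate randomly chosen defective samples until both classes are equal in size.
        extra = [rng.choice(defective) for _ in range(len(clean) - len(defective))]
        balanced = clean + defective + extra
        balanced_labels = [0] * len(clean) + [1] * (len(defective) + len(extra))
        return balanced, balanced_labels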
Article
Full-text available
Testing and debugging are very important tasks in software development. Fault localization is a very critical activity in the debugging process and is also one of the most difficult and time-consuming activities. The demand for effective fault localization techniques that can guide developers to the location of faults is high. In this paper, a fault...
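The snippet above does not name the technique, but spectrum-based fault localization is the usual setting for such work; the sketch below computes the standard Ochiai suspiciousness score per statement from test coverage and pass/fail outcomes, as a generic illustration rather than the technique proposed in this paper.

    from math import sqrt

    def ochiai_suspiciousness(coverage, outcomes):
        # coverage: dict test id -> set of statement ids executed by that test
        # outcomes: dict test id -> True if the test passed, False if it failed
        total_failed = sum(1 for passed in outcomes.values() if not passed)
        statements = set().union(*coverage.values())
        scores = {}
        for s in statements:
            ef = sum(1 for t, stmts in coverage.items() if s in stmts and not outcomes[t])
            ep = sum(1 for t, stmts in coverage.items() if s in stmts and outcomes[t])
            denom = sqrt(total_failed * (ef + ep))
            scores[s] = ef / denom if denom else 0.0
        # Statements with higher scores are more likely to contain the fault.
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)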
Chapter
Full-text available
Various studies have successfully utilized graph theory analysis as a way to gain a high-level abstraction view of software systems, such as constructing the call graph to visualize the dependencies among software components. The level of granularity and information shown by the graph usually depends on the input such as variable, method, class,...
Conference Paper
The idea of the Smart Home is becoming a trend, with more home appliances and sensors interconnected into a single home infrastructure. The objective of an IoT Home application is to make living more secure, convenient, and carefree through software and hardware development. However, the IoT Home application development process is different from...
Article
Full-text available
Software fault localisation (SFL) is recognised to be one of the most tedious, costly, and critical activities in program debugging. Due to the increase in software complexity, there is a huge interest in advanced SFL techniques that aid software engineers in locating program bugs. This interest paves a way to the existence of a large amount of lit...
Article
Full-text available
During program testing, software programs may be discovered to contain multiple faults. Multiple faults in a program may reduce the effectiveness of the existing fault localization techniques due to the complex relationship between faults and failures in the presence of multiple faults. In an ideal case, faults are isolated into fault-focused clust...
Article
Full-text available
Effective debugging is necessary for producing high quality and reliable software. Fault localization plays a vital role in the debugging process. However, fault localization is the most tedious and expensive activity in program debugging; as such, effective fault localization techniques that can identify the exact location of faults are essential. De...
Article
Effective management of Scientific Workflow Scheduling (SWFS) processes in a cloud environment remains a challenging task when dealing with large and complex Scientific Workflow Applications (SWFAs). Cost optimisation of SWFS benefits cloud service consumers and providers by reducing temporal and monetary costs in processing SWFAs. However, cost op...
Article
Full-text available
Regression testing aims at testing a system under test (SUT) in the presence of changes. As a SUT changes, the number of test cases increases to handle the modifications, and ultimately, it becomes practically impossible to execute all of them within a limited testing budget. Test Suite Reduction (TSR) approaches are widely used to improve the regres...
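A widely used baseline for TSR is the greedy set-cover heuristic: repeatedly keep the test case that covers the most still-uncovered testing requirements. The sketch below shows that classic heuristic only; it is not necessarily the reduction approach proposed in this work.

    def greedy_reduce(test_requirements):
        # test_requirements: dict test id -> set of requirements (e.g., branches) it covers
        uncovered = set().union(*test_requirements.values())
        reduced = []
        while uncovered:
            # Pick the test that covers the largest number of still-uncovered requirements.
            best = max(test_requirements, key=lambda t: len(test_requirements[t] & uncovered))
            gained = test_requirements[best] & uncovered
            if not gained:
                break  # remaining requirements cannot be covered by any test
            reduced.append(best)
            uncovered -= gained
        return reduced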
Conference Paper
Refactoring aims at improving software design quality without affecting external behavior. It is commonly believed that refactoring operations always enhance software quality. However, some recent empirical studies have reported negative or negligible effects of refactoring on certain quality attributes. The actual impact of each refactoring on...
Article
Full-text available
Aspect-oriented programming (AOP) is a programming methodology to achieve better modularized code by separating crosscutting concerns from the traditional abstraction boundaries. Automated testing, as one of the most demanding needs of software development to reduce both human effort and costs, is a delicate issue in testing aspect-oriented pro...
Article
Software defect prediction provides actionable outputs to software teams while contributing to industrial success. Empirical studies have been conducted on software defect prediction for both cross-project and within-project defect prediction. However, existing studies have yet to demonstrate a method of predicting the number of defects in an upcom...
Article
Full-text available
Constrained clustering or semi-supervised clustering has received a lot of attention due to its flexibility of incorporating minimal supervision of domain experts or side information to help improve clustering results of classic unsupervised clustering techniques. In the domain of software remodularisation, classic unsupervised software clustering...
Article
Unified Modeling Language (UML) has become the de-facto standard to design today’s large-size object-oriented systems. However, focusing on multiple UML diagrams is a main cause of consistency problems, which ultimately reduce the overall software model’s quality. Consistency management techniques are widely used to ensure the model c...
Article
Software testing is a widely accepted practice that ensures the quality of a System under Test (SUT). However, the gradual increase of the test suite size demands a high portion of the testing budget and time. Test Suite Reduction (TSR) is considered a potential approach to deal with the test suite size problem. Moreover, complete automation support is...
Article
Full-text available
Feature location is a frequent software maintenance activity that aims to identify the initial source code location pertinent to a software feature. Most feature location approaches are based, at least in part, on text analysis methods which originate from the natural language context. However, the natural language context and the text data in softw...
Article
Full-text available
Modeling software systems using complex networks can be an effective technique for analyzing the complexity of software systems. To enhance the technique, the structure of a complex network can be extended by assigning a weight to the edges of the complex network to denote the strength of communicational cohesion between a pair of related software...
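As a rough illustration of the weighting idea, the sketch below builds an undirected weighted graph over methods, using the number of class attributes two methods both access as the edge weight; this attribute-sharing heuristic is an assumption for illustration, not necessarily the communicational cohesion measure used in the paper.

    def build_weighted_network(method_attrs):
        # method_attrs: dict method name -> set of class attributes it reads or writes
        edges = {}
        methods = sorted(method_attrs)
        for i, m1 in enumerate(methods):
            for m2 in methods[i + 1:]:
                shared = method_attrs[m1] & method_attrs[m2]
                if shared:
                    # Edge weight = strength of communicational cohesion between the pair.
                    edges[(m1, m2)] = len(shared)
        return edges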
Article
Full-text available
Workflow scheduling in scientific computing systems is one of the most challenging problems that focuses on satisfying user-defined quality of service requirements while minimizing the workflow execution cost. Several cost optimization approaches have been proposed to improve the economic aspect of Scientific Workflow Scheduling (SWFS) in cloud and...
Conference Paper
Full-text available
Given that unit tests are valuable sources of up-to-date documentation, maintaining traceability links between unit tests and production code can help software engineers comprehend parts of a system. Traceability information facilitates the testing and debugging of complex software by linking the dependencies between code...
Article
Full-text available
Throughout the requirements engineering phase, the process of giving precedence to one requirement over another is beneficial to accomplish projects on a predefined schedule. This process is referred to as requirements prioritization. Although plenty of research has been dedicated to proposing various approaches to perform the requirements prioriti...
Conference Paper
Full-text available
Scientific Workflow Applications (SWFAs) play a vital role for both service consumers and service providers in designing and implementing large and complex scientific processes. Cloud computing offers significant storage space and other required computing resources necessary for processing large-scale and complex SWFAs. Consequently, cloud...
Conference Paper
Full-text available
Although agglomerative hierarchical software clustering technique has been widely used in reverse engineering to recover a high-level abstraction of the software in the case of limited resources, there is a lack of work in this research context to integrate the concept of pair-wise constraints, such as must-link and cannot-link constraints, to furt...
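To make the pair-wise constraints concrete, the sketch below shows the checks a constrained agglomerative algorithm could apply: must-link pairs are merged up front, and a candidate merge is rejected if it would place a cannot-link pair in the same cluster. This is a minimal sketch of the general idea, not the specific algorithm studied here.

    def apply_must_links(entities, must_links):
        # Start from singleton clusters, then merge every must-link pair transitively.
        clusters = [{e} for e in entities]
        for x, y in must_links:
            cx = next(c for c in clusters if x in c)
            cy = next(c for c in clusters if y in c)
            if cx is not cy:
                cx |= cy
                clusters.remove(cy)
        return clusters

    def violates_cannot_link(cluster_a, cluster_b, cannot_links):
        # cannot_links: iterable of frozensets {x, y} that must never share a cluster
        merged = cluster_a | cluster_b
        return any(pair <= merged for pair in cannot_links)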
Article
Full-text available
Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, non-functional requirements (NFRs) must be...
Conference Paper
Full-text available
Software testing is extensively used to ensure the development of a quality software system. The test suite size tends to increase by including new test cases due to software evolution. Consequently, the entire test suite cannot be executed considering budget and time limitations. Researchers have examined test suite reduction and prioritization te...
Article
Full-text available
Testing is a key activity of software development and maintenance that determines the level of reliability. Traceability is the ability to describe and follow the life of software artifacts, and has been promoted as a means for supporting various activities, most importantly testing. Traceability information facilitates the testing and debugging of...
Article
In cloud computing environments, at the software as a service (SaaS) level, interoperability refers to the ability of SaaS systems on one cloud provider to communicate with SaaS systems on another cloud provider. One of the most important barriers to the adoption of SaaS systems in cloud computing environments is interoperability. A common tactic for en...
Article
Context: Feature location aims to identify the source code location corresponding to the implementation of a software feature. Many existing feature location methods apply text retrieval to determine the relevance of the features to the text data extracted from the software repositories. One of the preprocessing activities in text retrieval is term-w...
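Term weighting in this setting usually means TF-IDF (or a variant) computed over the corpus of source code documents such as methods or classes; the sketch below computes plain TF-IDF weights and is offered only as background for the preprocessing step the abstract refers to.

    from math import log

    def tfidf(documents):
        # documents: dict doc id -> list of already-tokenised identifier/comment terms
        n_docs = len(documents)
        doc_freq = {}
        for terms in documents.values():
            for term in set(terms):
                doc_freq[term] = doc_freq.get(term, 0) + 1
        weights = {}
        for doc, terms in documents.items():
            for term in set(terms):
                tf = terms.count(term) / len(terms)
                idf = log(n_docs / doc_freq[term])
                weights[(doc, term)] = tf * idf
        return weights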
Article
Interoperability frameworks present a set of assumptions, concepts, values, and practices that constitute a method of dealing with interoperability issues in the electronic business (e-business) context. Achieving interoperability in the e-business generates numerous benefits. Thus, interoperability frameworks are the main component of e-business a...
Article
Unified Modeling Language (UML) is easier to understand and communicate using graphical notations, but lacks techniques for model validation and verification especially if these diagrams are updated. Formal approaches like Coloured Petri Nets (CPNs) are based on strong mathematical notations and proofs as basis for executable modeling languages. Tr...
Article
Full-text available
Context: Aspect-oriented programming (AOP) has been promoted as a means for handling the modularization of software systems by raising the abstraction level and reducing the scattering and tangling of crosscutting concerns. Studies from the literature have shown the usefulness and application of AOP across various fields of research and domains. Despite...
Conference Paper
Full-text available
Requirements prioritization is recognized as a critical but often neglected activity during the software development process. To achieve a high quality software system, both functional and non-functional requirements must be taken into consideration during the prioritization process. Although in recent years a lot of research has been devoted to r...
Article
Full-text available
Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirem...
Article
Ultra large scale systems are a new generation of distributed software systems that are composed of various changing, inconsistent or even conflicting components that are distributed in a wide domain. Some important characteristics of these systems include their very large size, global geographical distribution, operational and managerial independen...
Article
Interoperability is defined as the ability for two (or more) systems or components to exchange information and to use the information that has been exchanged. There is increasing demand for interoperability between individual software systems. Developing an interoperability evaluation model between software and information systems is difficult, and...
Article
Full-text available
The evolution in pedagogy and research has introduced new requirements in the educational sector. On-demand computing resource provisioning, often referred to as a virtual lab, is one of the requirements in great demand. Most current virtual lab systems are proprietary and thus their detailed software architectures are not accessible to developers. The l...
Article
Context: Software clustering is a key technique that is used in reverse engineering to recover a high-level abstraction of the software in the case of limited resources. Very limited research has explicitly discussed the problem of finding the optimum set of clusters in the design and how to penalize for the formation of singleton clusters during cl...
Conference Paper
Full-text available
Requirements prioritization is recognized as a critical but often neglected activity during the software development process. To achieve a high quality software system, quality attribute requirements must be taken into consideration during the prioritization process. However, prioritizing quality attributes is not an easy task due to the inherent inter...
Article
Multicore architecture has dramatically changed the general direction of software development dedicated to personal computers. As such, it is important for software designers to keep pace with the evolving challenges that happen on the hardware side, for example in this context of multicore architecture, so that they can leverage the advantages...
Article
Design patterns provide a way to transfer design knowledge and reusable solutions to recurring problems. The patterns include structural and interaction information that, if captured in a catalog, can act as a useful reference guide for developers when making design decisions. However, for the same design pattern structure, there can be different w...
Conference Paper
The work-stealing technique has proven itself to be one of the successful multithreaded scheduling techniques designed to balance workload in multicore environments. As the number of cores in multicore-based products increases, new developments have to be made for this technique to cope with this continuous challenge on the hardware side. In...
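The core data structure behind work stealing is a per-worker double-ended queue: the owner pushes and pops tasks at one end, while idle workers steal from the opposite end of a victim's deque. The sketch below shows that discipline in a simplified, single-threaded form; real implementations add locking or lock-free deques.

    from collections import deque
    import random

    class Worker:
        def __init__(self):
            self.tasks = deque()

        def push(self, task):
            self.tasks.append(task)  # owner works at the tail (LIFO)

        def pop(self):
            return self.tasks.pop() if self.tasks else None

        def steal_from(self, victim):
            # Thieves take from the head (FIFO end) to reduce contention with the owner.
            return victim.tasks.popleft() if victim.tasks else None

    def next_task(worker, all_workers):
        task = worker.pop()
        if task is None:
            victims = [w for w in all_workers if w is not worker and w.tasks]
            if victims:
                task = worker.steal_from(random.choice(victims))
        return task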
Article
Full-text available
The Software Capability Maturity Model Integration (CMMI) has become a popular Software Process Improvement (SPI) model for enhancing software development processes with the goal of developing high-quality software within budget and schedule. Since software development effort can be greatly affected by the organizational process maturity level, thi...
Article
Modern computer systems greatly depend on multithreaded scheduling to balance the workload among their working units. One of the multithreaded scheduling techniques, the work-stealing technique has proven effective in balancing the distribution of threads by stealing threads from the working cores and reallocating them to the nonworking cores. In t...
Article
Software development aims to produce software systems that satisfy two requirement categories: functional and quality. One aspect of software quality is nonfunctional attributes (NFAs), such as security, performance, and availability. Software engineers can meet NFA requirements by applying suitable guidelines during software development. However,...
Article
Instruction through game-based applications is one of the most efficient methods for learning, with a high potential of providing interactive and effective educational environments. For example, learning music is a popular subject in which many people are interested. Yet music instruction is one of the toughest matters for music artists and learne...
Conference Paper
Multicore technology has imposed new ways of thinking in the field of software design. In general, software should be adapted to reflect the hardware changes in the design process. Binary Search has its share in these developments, in the sense that this technique should be adaptable to the new environment. In this study, we develop a new hiera...
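The abstract is cut off before the algorithm itself, so the sketch below only illustrates the general idea of distributing search over a sorted array: split the array into one contiguous chunk per core and run an ordinary binary search in the chunk whose value range brackets the key. This is an assumed illustration, not the hierarchical scheme developed in the paper.

    from bisect import bisect_left

    def chunked_search(sorted_data, key, n_cores=4):
        # Partition the sorted array into one contiguous chunk per core.
        size = max(1, len(sorted_data) // n_cores)
        for base in range(0, len(sorted_data), size):
            chunk = sorted_data[base:base + size]
            # Only the chunk whose value range brackets the key needs a binary search.
            if chunk and chunk[0] <= key <= chunk[-1]:
                pos = bisect_left(chunk, key)
                if pos < len(chunk) and chunk[pos] == key:
                    return base + pos
        return -1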
Article
Business Process Modeling (BPM) is the first and most important step in the business process management lifecycle. Graph-based formalism and rule-based formalism are the two most predominant formalisms on which process modeling languages are developed. BPM technology continues to face challenges in coping with dynamic business environments...
Article
The emergence of multicore technology has led to essential changes in the hardware design of personal computers. These changes are represented by an increased growth of cores per chip. As a matter of fact, this growth has imposed new directions in planning not only in the hardware but also in the software side. This study is about developing new al...
Article
The growing and wide usage of Service-Oriented Architecture (SOA) has made performance troublesome due to the increasing complexity and size of applications. Designing for performance is crucial in the early development stages, when architectural decisions are being made. In this paper, early consideration of performance is emphasized and a...
Article
Designing software involves thinking about the solution for a design problem. Some of the ideas of a design solution are captured in the form of descriptions and structures in a design pattern. However, there is not much visual aid on the behaviour of a design pattern in a visual design modeling tool. Currently, it is difficult to determine the design p...
Conference Paper
Non-functional attributes of software are considered a major element for improving software quality. However, achieving these attributes in a software system is not a simple task, bearing in mind the relationships between these attributes and the diversity of software domains. This paper proposes a new guideline-based software development approac...
Article
Full-text available
The software capability maturity model has become a popular model for enhancing software development processes with the goal of developing high-quality software within budget and schedule. The software cost estimation model, constructive cost model, in its last update (constructive cost model II) has a set of seventeen cost drivers and a set of fiv...
Article
Design patterns are known as a way for software designers to communicate about design. There are various descriptions, structures and behaviors on the solution for a design problem in a design pattern. However, there is not much visual aid on the internal workings of a design pattern in a visual design modeling tool. Currently, it is difficult to d...
Conference Paper
There are various descriptions, structures and behavior on the solution for a design problem in a design pattern. However, there is not much visual aid on the internal workings of a design pattern in a visual design modeling tool. Currently, it is difficult to determine the pattern roles and variants of interaction groups of a design pattern as the...
Article
Many business processes are highly dynamic and require changes even during execution. Existing commercial Business Process Management Systems (BPMS) fail to support such processes appropriately since they work in a rather static manner; they demand that the structure of a process is fixed before execution. The aim of this research is to provide a c...
Article
Full-text available
The Software Capability Maturity Model (SW-CMM) has become a popular model for enhancing software development processes with the goal of developing high-quality software within budget and schedule. The software cost estimation model, COnstructive COst MOdel (COCOMO), in its last update (COCOMO II) has a set of seventeen cost drivers and a set of fi...
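For context, COCOMO II estimates effort in person-months as Effort = A x Size^E x (product of the effort multipliers), where Size is in KSLOC and the exponent E is derived from the five scale factors. The sketch below evaluates that published formula with the COCOMO II.2000 calibration constants (A = 2.94, B = 0.91); the example driver values are illustrative only.

    from math import prod

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers, a=2.94, b=0.91):
        # scale_factors: the five scale-factor ratings, summed into the exponent
        # effort_multipliers: the cost-driver ratings, multiplied together
        exponent = b + 0.01 * sum(scale_factors)
        return a * (ksloc ** exponent) * prod(effort_multipliers)

    # Example: a 50 KSLOC project with all cost drivers at nominal (1.0)
    # and illustrative scale-factor ratings.
    print(round(cocomo2_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 17), 1))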
Conference Paper
Object persistence has been popular recently since it has become a necessary practice to shuttle objects between in-memory and in-storage states. Most researchers focus on object-to-relational data model mapping, but few on object-to-multivalued data model (O/M) mapping. Nevertheless, existing mapping mechanisms are either hard to extend or difficul...
Conference Paper
Updating XML documents is getting more and more attention from both academic and commercial fields. The time needed to update XML documents is a key factor for the success of any application, especially those related to e-commerce. In this article, we suggest using Petri Nets as a tool for designing a concurrent model that is able to achie...
Conference Paper
Change management is very important to reduce risks and costs and to maximize the benefits of major changes in business and information technology. Business Process Modeling (BPM) is the first and most important step in the business process management lifecycle. BPM technology continues to face challenges in coping with dynamic business environments...
Conference Paper
An object-oriented application framework is one of the most important implementations of object-oriented software engineering. Normally, a user takes several months of learning to become highly productive in using a specific object-oriented application framework. Without proper documentation, frameworks are not very usable to framework users....
Article
Most of the existing object-oriented design metrics and data mining techniques capture similar dimensions in the data sets, thus reflecting the fact that many of the metrics are based on similar hypotheses, properties, and principles. Accurate quality models can be built to predict the quality of object-oriented systems by using a subset of the exi...
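One simple way to pick such a metric subset is to drop metrics that are strongly correlated with metrics already kept. The sketch below does this with Pearson correlation as a generic illustration of redundancy-based metric selection; it is not one of the specific data mining techniques examined in the work.

    def select_metrics(metric_values, threshold=0.85):
        # metric_values: dict metric name -> list of values across classes/modules
        def pearson(xs, ys):
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = sum((x - mx) ** 2 for x in xs) ** 0.5
            sy = sum((y - my) ** 2 for y in ys) ** 0.5
            return cov / (sx * sy) if sx and sy else 0.0
        kept = []
        for name, values in metric_values.items():
            # Keep a metric only if it is not strongly correlated with one already kept.
            if all(abs(pearson(values, metric_values[k])) < threshold for k in kept):
                kept.append(name)
        return kept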