
Mohammad Abdallah
- PhD Software Engineering
- Professor (Associate) at Al-Zaytoonah University of Jordan
Chair, Computer Science Dept. & Director, Technology Transfer Office at Al-Zaytoonah University of Jordan
ACM & IEEE Senior Member
About
61
Publications
55,715
Reads
438
Citations
Introduction
My research interests are:
Software Testing, Software Quality, Program Analysis and recently, Big Data.
I would be happy to be part of a research group, to help review journal and conference papers, and to answer questions.
Current institution
Additional affiliations
October 2016 - October 2017
October 2015 - October 2016
October 2012 - present
Education
October 2008 - July 2012
October 2007 - September 2008
February 2004 - February 2007
Publications
Publications (61)
Since ChatGPT was presented to the public in 2022, generative artificial intelligence and especially large language models (LLM) have attracted a lot of interest in academia and industry alike. One of the arguably most interesting domains in that regard is banking. This is because it could, theoretically, heavily benefit from their application but...
Social networking sites (SNS) have revolutionized communication, information sharing, and community building for billions of users globally, becoming integral to daily life. Platforms like Facebook, Instagram, LinkedIn, and Twitter are crucial for business, political, and personal interactions. Existing research has identified several key factors i...
This study comprehensively evaluates VR communication interfaces and their usability, considering user experience and platform-specific design across various VR platforms. Findings reveal diverse usability levels and identify usability issues and user preferences, informing interface refinements such as improved system status visibility a...
The integration of artificial intelligence (AI) and Intellectual Property (IP) into the academic ecosystem is currently gaining momentum, as it has the potential to stimulate the convergence of innovation and entrepreneurship. The aim of this study is to investigate the potential of leveraging AI and implementing IP rights in higher education to...
The integration of artificial intelligence (AI) and Intellectual Property (IP) within academic settings is rapidly advancing, with the potential to drive innovation and entrepreneurship. This study aims to explore how AI and IP rights can be utilized in higher education to support entrepreneurial activities among university students. The central hy...
This research explores the intricate landscape arising from the integration of big data and large language models (LLMs) across sectors, unveiling intellectual property (IP) challenges requiring careful scrutiny. The transformative impact of big data and the ascendancy of LLMs in artificial intelligence have precipitated complex inquiries into data...
The synergistic emergence of big data analytics and Large Language Models, exemplified by ChatGPT, has redefined industries by unlocking unprecedented insights and creative capabilities. This paper investigates their impact on intellectual property (IP) landscapes. Delving into patentability issues, it navigates challenges arising from data-driven...
This research investigates the relationship between Generative Artificial Intelligence (GAI), media content, and copyright laws. As GAI technologies continue to evolve and permeate various aspects of the media landscape, questions regarding the creation and protection of intellectual property have become paramount. The study aims to highlight the i...
...and their general nature is outlined. Then, subsequently, suitable approaches for the components' testing are discussed. In doing so, researchers and practitioners alike shall be provided with a starting point when it comes to the discussion of big data quality assurance and how it can be implemented in their own endeav...
Forest fires present a substantial environmental challenge, constituting a dual menace to human life and ecological well-being. The imperative for forest fire prevention and management underscores the indispensability of robust detection, prediction, and behavior analysis systems. This scholarly paper offers a thorough exploration of diverse method...
In standard iris recognition systems, a cooperative imaging framework is employed that includes a light source with a near-infrared wavelength to reveal iris texture, look-and-stare constraints, and a close distance requirement to the capture device. When these conditions are relaxed, the system’s performance significantly deteriorates due to segme...
This research delves into the intricate nexus of artificial intelligence (AI) and intellectual property (IP), addressing the pressing legal and ethical challenges in this evolving landscape. AI's capacity to autonomously generate content and inventions raises pivotal questions about inventorship, copyright, and data privacy. It explores the ethical...
With society’s increasing data production and the corresponding demand for systems that are capable of utilizing them, the big data domain has gained significant importance. However, besides the systems’ actual implementation, their testing also needs to be considered. For this, oftentimes, proper test data sets are necessary. This publication disc...
Most university campuses, such as the campus of Al-Zaytoonah University of Jordan (ZUJ), are usually large and comprise many buildings. Finding the location of an office, a lecture hall, a service center, etc. is not an easy task for most visitors and even for many students and employees. Therefore, a virtual tour of the campus and its buildings wi...
Infrastructure as a Service (IaaS) is a service provided by cloud computing systems vendors. IaaS provides virtual computing resources to the users such as hardware, servers, data center space and storage, and network components all over the internet. Just like any other growing technology, cloud computing and specifically IaaS faces some challenge...
Big data analytics have claimed an important role in today’s society. Consequently, ways of improving the design and development of the corresponding applications are highly sought after. One rather current proposition is the application of test driven development (TDD) in the big data domain. The idea behind it is to increase the quality and the f...
Big data applications have gained widespread usage across various fields, including healthcare, business, and education. The effectiveness and accuracy of these applications heavily rely on the availability of a large volume of data. However, the collected and generated data for these applications often suffer from incompleteness, inaccuracy, and l...
This research paper proposes a new quality model for social media websites based on an extensive literature review and analysis of previous studies. The study identifies seven key quality factors that are essential for evaluating social networking websites: user friendliness, community-drivenness, website appearance, entertainment, security and pri...
Several studies were conducted in the past for evaluating the benefits and shortcomings of virtual tours in various sites including university campuses. The previous studies addressed numerous issues such as providing initial perceptions, guiding visitors, etc. Most of these issues can be related, up to some extent, to Jordanian universities in gen...
Big data (BD) is one of the major technological trends of today and finds application in numerous domains and contexts. However, while there are huge potential benefits, there are also considerable challenges. One of these is the difficulty to make sure the respective applications have the necessary quality. For this purpose, the application of tes...
Hand veins can be used effectively in biometric recognition since they are internal organs that, in contrast to fingerprints, are robust under external environment effects such as dirt and paper cuts. Moreover, they form a complex rich shape that is unique, even in identical twins, and allows a high degree of freedom. However, most currently employ...
Background: Quality is a critical aspect of any software system. Indeed, it is a key factor for the competitiveness, longevity, and effectiveness of software products. Code review facilitates the discovery of programming errors and defects, and using programming language standards is such a technique. Aim: In this study, we developed a code review...
In recent years, the telecommunications sector is no longer limited to traditional communications, but has become the backbone for the use of data, content and digital applications by individuals, governments and companies to ensure the continuation of economic and social activity in light of social distancing and total closure in most countries in...
Big Data is widely used in many applications and fields: artificial intelligence, medical care, business, and much more. Big Data sources are widely distributed and diverse. Therefore, it is essential to guarantee that the data collected and processed is of the highest quality, to deal with this large volume of data from different sour...
The E-learning standard is made up of several different quality elements and characteristics. Scholars examined the effectiveness of E-learning from a variety of perspectives. However, studies concerning the quality of E-learning mobile applications in particular are limited. Hence, the present study looks at the factors that influence the use of t...
Big data has evolved to a ubiquitous part of today’s society. However, despite its popularity, the development and testing of the corresponding applications are still very challenging tasks that are being actively researched in pursuit of ways for improvement. One newly introduced proposition is the application of test driven development (TDD) in t...
The Big Data field is an unsettled standard compared with a traditional database, data mining, or data warehouse. A stability measure aims to obtain a quality dataset, which encourages the use of data preprocessing methods to handle the instability caused by missing data. Therefore, to increase the data quality in order to achieve an accurate pre...
The standard of e-learning is a compound of different quality factors and quality dimensions. Researchers have looked at the efficiency of e-learning from various angles and viewpoints. However, few studies in the literature specifically address the quality of e-learning portals or software. Besides, the students' or learners' opinions about this is...
In the recent years, the term big data has attracted a lot of attention. It refers to the processing of data that is characterized mainly by 4Vs, namely volume, velocity, variety and veracity. The need for collecting and analysing big data has increased manifolds these days as organizations want to derive meaningful information out of any data that...
Due to the constantly increasing amount and variety of data produced, big data and the corresponding technologies have become an integral part of daily life, influencing numerous domains and organizations. However, because of its diversity and complexity, the necessary testing of the corresponding applications is a highly challenging task that lack...
Code inspection is the way to ensure the quality and control of products and software by detecting, correcting, or reducing defects. Nowadays, several methods of code inspection have emerged, most of which consist of processing, preparation, analysis, meetings, and reform. However, each code inspection method may face problems in terms of number of...
The metrics or measures used in evaluating image quality usually serve applications that aim at enhancing image appearance or at preventing the image quality from degrading after image processing operations. On the other hand, the main goal of cryptography applications is to produce unrecognizable encrypted images that can resist various kinds of a...
Big Data applications are widely used in many fields such as artificial intelligence, marketing, commercial applications, and health care, as demonstrated by the role of Big Data in coping with the COVID-19 pandemic. Therefore, it is essential to ensure the quality of the generation and use of Big Data applications. Consequently, Big Data applicati...
Many organizations, programmers, and researchers need to debug, test, and maintain segments of their source code to improve their systems. Program slicing is one of the best techniques to do so. There are many slicing techniques available to solve such problems, such as static slicing, dynamic slicing, and amorphous slicing. In our paper,...
The dynamic education environment has compelled educational institutions to discover alternatives to optimize their data technology platform's costs and operational effectiveness. Cloud computing has developed as an important technology that could add more value by offering software and infrastructure alternatives for the university's whole IT need...
Today, the amount and complexity of data that is globally produced increases continuously, surpassing the abilities of traditional approaches. Therefore, to capture and analyze those data, new concepts and techniques are utilized to engineer powerful big data systems. However, despite the existence of sophisticated approaches for the engineering of...
Cloud computing helps organizations to move fast and adopt different services that accelerate building and hosting their services and reach more customers and third parties quickly. Cloud computing is gaining considerable attention nowadays due to its accelerated adoption, deployment models, and the variety of services it provides to customers. T...
There are many vital issues that software implementation faces during all project phases and throughout the Software Development Lifecycle (SDLC). Many software change implementations fail due to lack of engagement of the stakeholders, employees and management resistance, poor leadership, organization environment, cultural issues and...
As software systems change and evolve over time, regression tests have to be run to validate these changes. Regression testing is an expensive but essential activity in software maintenance. The purpose of this paper is to compare a new regression test selection model called ReTSE with Pythia. The ReTSE model uses decomposition slicing in order to i...
Internet of Things (IoT) systems have a significant impact on different aspects of our lives. For that reason, IoT systems should be of high quality and free of defects. The quality measurements for IoT systems vary according to the type of the IoT system and its applications. Therefore, IoT systems should be quality measured differently consid...
The program slicing technique is an abstracting technique that focuses on the program code. Clause slicing is the type of program slicing that focuses only on the code clauses, which allows quality assurance to measure program robustness by measuring every code clause against the programming language standards. The proposed model gives a new way...
The main Big Data quality factors, which need to be measured, are presented from the perspective of the data itself, data management, data processing, and data users. This presentation highlights the quality factors that may be used later to create different Big Data quality models.
Big Data is a growing technology these days. There are many uses of Big Data: artificial intelligence, health care, business, and many more. For that reason, it becomes necessary to handle this massive volume of data with caution and care, to make sure that the data used and produced is of high quality. Therefore, the Big Data quality i...
The Clause slicing technique is a static slicing technique that also has forward and backward slicing methods. The clause slice criteria are the clause and the clause number. In this paper, we discuss the Clauser tool, the forward clause slicing tool, and introduce some improvements to it. The Clauser mechanism divides the program code statement...
Program slicing abstracts a part of source code depending on the point of interest. It is widely used in maintenance, debugging, and testing. There are many slicing techniques, such as static, dynamic, and amorphous. In this paper, we choose to develop a new approach applying static slicing to Java programs. The new approach simplifies the data dep...
Program code may contain useless statements. These useless statements are called dead code. Dead code affects code quality negatively. In this research, a new model is introduced to identify the dead code in a program and remove it. The decomposition slicing technique is used to identify the live code, and refactoring techniques can...
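The general idea of flagging dead code can be illustrated with a toy example. The sketch below is not the decomposition-slicing model described in the abstract; it is a much simpler, hypothetical check in Python that flags "dead stores" (variables assigned but never read):

```python
import ast

def dead_stores(source: str) -> set[str]:
    """Toy dead-code check: return names that are assigned but never read.

    This ignores scoping and control flow, so it is only a rough
    illustration of the dead-code idea, not a real analysis.
    """
    tree = ast.parse(source)
    assigned, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)      # name written somewhere
            elif isinstance(node.ctx, ast.Load):
                used.add(node.id)          # name read somewhere
    return assigned - used

code = """
x = 1
y = 2      # never read: a dead store
print(x)
"""
print(dead_stores(code))  # -> {'y'}
```

A real dead-code tool would also have to track control flow and scoping, which is exactly where slicing-based approaches come in.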
Cloud Computing is a new paradigm that helps organizations move fast and adopt different services that accelerate building and hosting their services and reach customers and third parties quickly. However, there are challenges related to Cloud Computing that the service providers must consider and implement. This paper highlights the Cloud Computin...
All languages, natural and programming, have rules and styles for how to write. These rules and styles mainly aim to ensure that anyone who understands a language can understand what a sentence says. In other words, the aim of rules and styles in a language is to deliver information to the reader, and the reader must get the right information. T...
Regression testing is an expensive but essential activity in software maintenance. Regression testing validates modified software and ensures that the modified parts of the program do not introduce unexpected errors. This paper briefly gives an overview of regression testing, specifically regression test selection techniques. Most regression test...
Java is one of the most used programming languages. As a result, it is used to develop many applications and systems by many different developers. This diversity of developers' skills and backgrounds led to the Java programming language standards. There are many rules and guidelines for the Java programming language. In this research, we include 20...
Program slicing is a technique of abstracting from a program depending on slicing criteria, to be used in maintenance, debugging, and other applications. Java programs, as with any other programming language, have been a target for slicing tools, such as Kaveri and WALA. In this paper, we present a new Java backward slicing tool, JavaBST. JavaBST proposed a...
Iris recognition is the most accurate form of biometric identification. The robustness of iris recognition comes from the unique characteristics of the human iris texture as it is stable over the human life, and the environmental effects cannot easily alter its shape. In most iris recognition systems, ideal image acquisition conditions are assum...
Iris recognition is the most accurate form of biometric identification. The robustness of iris recognition comes from the unique characteristics of the human iris texture as it is stable over the human life, and the environmental effects cannot easily alter its shape. In most iris recognition systems, ideal image acquisition conditions are assumed....
Program Robustness is now more important than before, because of the role software programs play in our life. Many papers defined it, measured it, and put it into context. In this paper, we explore the different definitions of program robustness and different types of techniques used to achieve or measure it. There are many papers about robustness....
The Clause slicing technique is a static slicing technique. The clause slice criterion is the clause, which is the smallest part of the code line, with the clause number. In this paper, we introduce the "Clauser," which is a new clause slicing tool for C programs. The Clauser is a slicing tool that divides the program code lines into clauses, depen...
Robustness is a key issue for all programs, especially safety-critical ones. In the literature, Program Robustness is defined as "the degree to which a system or component can function correctly in the presence of invalid input or stressful environment" (IEEE 1990). Robustness measurement is the value that reflects the Robustness Degree of the...
Program robustness is the ability of software to behave correctly under stress. Measuring program robustness allows programmers to find the program's vulnerable points, repair them, and avoid similar mistakes in the future. In this paper, a Robustness Grid will be introduced as a program robustness measuring technique. A Robustness Grid is a table...
Robustness of a program is the degree of system correctness of all parts. Measuring robustness is a goal for many researchers. In this paper, program slicing is used to build a robustness hierarchy, where this hierarchy will be used to test, and build a robust program.
Questions
Questions (14)
Have you used AI detection tools, like Turnitin, to detect manuscripts written by AI?
Was that accurate enough?
What if you had used undetectable-AI tools? Would the result still be the same?
What do you suggest to teach in the Advanced Software Engineering Course for MSc students?
We have already given courses for Advanced requirements, design, quality, testing, and project management.
Nowadays, AI and Data Science are booming, which opens many new areas for researchers.
What do you think is the impact of this on the Software Quality research area? And what do you think the future directions of Software Quality will be?
I need to know the number of employees in a country, the number of jobs created every year, and so on.
I need to count the number of classes, number of methods, number of comment lines, and LOC for any Java code as part of an analysis process. So, I need a tool or plugin that gives me these data in any form: Excel, XML, etc.
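Short of a dedicated plugin, a small script can produce rough counts of these metrics. The sketch below is a hypothetical, regex-based approximation in Python; it is not a substitute for a proper parser (e.g. the JavaParser library), and strings containing `//` or unusual formatting will skew its counts:

```python
import re

def java_metrics(source: str) -> dict:
    """Rough, regex-based counts for one Java source file.

    Approximations only: block comments, string literals, and
    constructors or package-private methods are not handled precisely.
    """
    lines = source.splitlines()
    # LOC here means non-blank lines.
    loc = sum(1 for l in lines if l.strip())
    # Lines that start with a comment marker (misses trailing comments).
    comments = sum(1 for l in lines
                   if l.strip().startswith(("//", "/*", "*")))
    # Class declarations.
    classes = len(re.findall(r"\bclass\s+\w+", source))
    # Method declarations: a modifier, then a return type, a name,
    # a parameter list, and an opening brace.
    methods = len(re.findall(
        r"\b(?:public|protected|private|static)[\w\s<>\[\]]*\s\w+\s*\([^)]*\)\s*\{",
        source))
    return {"loc": loc, "comment_lines": comments,
            "classes": classes, "methods": methods}
```

Writing the resulting dictionary out as CSV (openable in Excel) is then a few extra lines with the standard `csv` module.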