Vanessa Ayala-Rivera
National College of Ireland · School of Computing

PhD in Computer Science

About

Publications: 19
Reads: 18,753
Citations: 191

Publications (19)
Chapter
Full-text available
Performance testing is a critical task to assure an optimal experience for users, especially when there are high loads of concurrent users. JMeter is one of the most widely used tools for load and stress testing. With JMeter, it is possible to test the performance of static and dynamic resources on the web. This paper presents DYNAMOJM, a novel tool b...
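The tool itself is not visible in this truncated summary. As a rough illustration of the kind of concurrent load a JMeter thread group produces (this sketch is not DYNAMOJM and is not taken from the paper), the plain Java code below fires parallel HTTP GET requests at a hypothetical local endpoint and prints per-request latency; the target URL, thread count, and request count are made-up values.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Minimal concurrent-load sketch: each task issues one GET and reports its
// latency, roughly approximating what a JMeter thread group does.
public class SimpleLoadSketch {
    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(20); // 20 "virtual users"
        for (int i = 0; i < 200; i++) {                          // 200 requests in total
            pool.submit(() -> {
                long start = System.nanoTime();
                try {
                    HttpRequest request = HttpRequest.newBuilder(
                            URI.create("http://localhost:8080/")) // hypothetical target
                            .GET().build();
                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());
                    long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                    System.out.println(response.statusCode() + " in " + elapsedMs + " ms");
                } catch (Exception e) {
                    System.err.println("request failed: " + e.getMessage());
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}

For actual load runs, JMeter test plans are usually executed in non-GUI mode, e.g. jmeter -n -t plan.jmx -l results.jtl.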
Chapter
Full-text available
Performance testing is a critical task to ensure an acceptable user experience with software systems, especially when there are high numbers of concurrent users. Selecting an appropriate test workload is a challenging and time-consuming process that relies heavily on the testers’ expertise. Not only are workloads application-dependent, but also it...
Conference Paper
Full-text available
Garbage Collection (GC) is a core feature of multiple modern technologies (e.g., Java, Android). On one hand, it offers significant software engineering benefits over explicit memory management, like preventing most types of memory leaks. On the other hand, GC is a known cause of performance degradation. However, it is considerably challenging to...
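As a minimal sketch of how GC activity can be made visible in a Java program (this is illustrative only and is not the paper's measurement methodology), the code below creates allocation pressure and then reads the JVM's cumulative collection counts and times through the standard GarbageCollectorMXBean; running it with -verbose:gc, or -Xlog:gc on JDK 9+, additionally prints the individual collection events.

import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

// Allocation-heavy loop plus a readout of cumulative GC counts and times,
// to make collector activity observable from inside the application.
public class GcPressureSketch {
    public static void main(String[] args) {
        List<byte[]> retained = new ArrayList<>();
        for (int i = 0; i < 50_000; i++) {
            byte[] chunk = new byte[10_240];        // mostly short-lived garbage
            if (i % 100 == 0) retained.add(chunk);  // a few objects survive
        }
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.printf("%s: %d collections, %d ms%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
        }
        System.out.println("retained " + retained.size() + " chunks");
    }
}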
Conference Paper
Full-text available
The General Data Protection Regulation (GDPR) aims to protect personal data of EU residents and can impose severe sanctions for non-compliance. Organizations are currently implementing various measures to ensure their software systems fulfill GDPR obligations such as identifying a legal basis for data processing or enforcing data anonymization. How...
Conference Paper
Full-text available
The Internet of Things (IoT) has become a major technological revolution. Evaluating any IoT advancements comprehensively is critical to understand the conditions under which they can be more useful, as well as to assess the robustness and efficiency of IoT systems to validate them before their deployment in real life. Nevertheless, the creation of...
Conference Paper
Full-text available
The identification of workload-dependent performance issues, as well as their root causes, is a time-consuming and complex process which typically requires several iterations of tests (as this type of issue can depend on the input workloads), and heavily relies on human expert knowledge. To improve this process, this paper presents an automated ap...
Conference Paper
Full-text available
Nowadays, cluster computing has become a cost-effective and powerful solution for enterprise-level applications. Nevertheless, the use of this architecture model also increases the complexity of the applications, complicating all activities related to performance optimisation. Thus, many research works have sought to develop advancements for imp...
Conference Paper
Full-text available
Plagiarism in programming assignments is an extremely common problem in universities. While there are many tools that automate the detection of plagiarism in source code, users still need to inspect the results and decide whether there is plagiarism or not. Moreover, users often rely on a single tool (using it as a "gold standard" for all cases), w...
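For illustration only (this is deliberately simplistic and is not the approach of the paper), the sketch below computes a token-level Jaccard similarity between two code snippets; real detectors normalise identifiers and compare program structure, and, as argued above, relying on a single metric or tool can be misleading.

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Naive single-metric check: token-level Jaccard similarity between two
// source snippets. High scores only flag candidates for manual inspection.
public class JaccardSketch {
    static Set<String> tokens(String code) {
        return new HashSet<>(Arrays.asList(code.split("\\W+")));
    }

    static double jaccard(String a, String b) {
        Set<String> ta = tokens(a), tb = tokens(b);
        Set<String> inter = new HashSet<>(ta);
        inter.retainAll(tb);
        Set<String> union = new HashSet<>(ta);
        union.addAll(tb);
        return union.isEmpty() ? 0.0 : (double) inter.size() / union.size();
    }

    public static void main(String[] args) {
        String s1 = "int sum(int a, int b) { return a + b; }";
        String s2 = "int add(int x, int y) { return x + y; }";
        System.out.printf("similarity = %.2f%n", jaccard(s1, s2));
    }
}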
Article
Full-text available
The dissemination of textual personal information has become an important driver of innovation. However, because this data may contain sensitive information, it must be anonymized. A commonly-used technique to anonymize data is generalization. Nevertheless, its effectiveness can be hampered by the Value Generalization Hierarchies (VGHs) use...
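As a toy illustration of generalization driven by a Value Generalization Hierarchy (the attribute, values, and levels here are invented for this sketch and are not taken from the paper), the code below maps specific categorical values to progressively more general ones.

import java.util.Map;

// Toy VGH for a categorical attribute: each value maps to its parent at the
// next, more general level of the hierarchy.
public class VghSketch {
    static final Map<String, String> LEVEL1 = Map.of(
            "nurse", "healthcare", "surgeon", "healthcare",
            "teacher", "education", "professor", "education");
    static final Map<String, String> LEVEL2 = Map.of(
            "healthcare", "any-profession", "education", "any-profession");

    // Generalize a raw value by the requested number of levels.
    static String generalize(String value, int levels) {
        String current = value;
        if (levels >= 1) current = LEVEL1.getOrDefault(current, current);
        if (levels >= 2) current = LEVEL2.getOrDefault(current, current);
        return current;
    }

    public static void main(String[] args) {
        System.out.println(generalize("surgeon", 1)); // healthcare
        System.out.println(generalize("surgeon", 2)); // any-profession
    }
}

The utility of the anonymized output then depends directly on how well such a hierarchy captures the semantics of the domain, which is the concern the paper addresses.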
Conference Paper
Full-text available
Concept hierarchies are widely used in multiple fields to carry out data analysis. In data privacy, they are known as Value Generalization Hierarchies (VGHs), and are used by generalization algorithms to dictate how the data is anonymized. Thus, their proper specification is critical to obtain anonymized data of good quality. The creation and evaluatio...
Conference Paper
Full-text available
Conducting extensive testing of anonymization techniques is critical to assess their robustness and identify the scenarios where they are most suitable. However, access to real microdata is highly restricted, and the data that is publicly available is usually anonymized or aggregated, which reduces its value for testing purposes. In this paper,...
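The paper's generation approach is not visible in this truncated summary; as a naive, hypothetical sketch of what producing synthetic microdata for such experiments can look like, the code below samples records with a few categorical and numeric attributes (the attributes, value lists, and record count are all made-up).

import java.util.List;
import java.util.Random;

// Naive synthetic-microdata generator: samples attribute values independently.
// Realistic generators additionally aim to preserve distributions and correlations.
public class SyntheticMicrodataSketch {
    private static final List<String> OCCUPATIONS =
            List.of("nurse", "teacher", "engineer", "clerk");
    private static final List<String> COUNTRIES =
            List.of("Ireland", "Mexico", "France", "Japan");

    public static void main(String[] args) {
        Random rng = new Random(42); // fixed seed for reproducible test data
        System.out.println("age,occupation,country");
        for (int i = 0; i < 10; i++) {
            int age = 18 + rng.nextInt(70);
            String occupation = OCCUPATIONS.get(rng.nextInt(OCCUPATIONS.size()));
            String country = COUNTRIES.get(rng.nextInt(COUNTRIES.size()));
            System.out.println(age + "," + occupation + "," + country);
        }
    }
}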
Article
Full-text available
In privacy-preserving data publishing, approaches using Value Generalization Hierarchies (VGHs) form an important class of anonymization algorithms. VGHs play a key role in the utility of published datasets as they dictate how the anonymization of the data occurs. For categorical attributes, it is imperative to preserve the semantics of the origina...
Article
Full-text available
The vast amount of data being collected about individuals has brought new challenges in protecting their privacy when this data is disseminated. As a result, Privacy-Preserving Data Publishing has become an active research area, in which multiple anonymization algorithms have been proposed. However, given the large number of algorithms available an...
Article
Full-text available
Datasets of different characteristics are needed by the research community for experimental purposes. However, real data may be difficult to obtain due to privacy concerns. Moreover, real data may not meet specific characteristics which are needed to verify new approaches under certain conditions. Given these limitations, the use of synthetic data...
Data
Datasets of different characteristics are needed by the research community for experimental purposes. However, real data may be difficult to obtain due to privacy concerns. Moreover, real data may not meet specific characteristics which are needed to verify new approaches under certain conditions. Given these limitations, the use of synthetic data...
Conference Paper
Full-text available
Data security remains a top concern for the adoption of cloud-based delivery models, especially in the case of the Software as a Service (SaaS). This concern is primarily caused by the lack of transparency on how customer data is managed. Clients depend on the security measures implemented by the service providers to keep their information prot...

Projects (4)
Project
The For-CoPS project will develop a radically new software technology that will transform cyber and physical forensic investigations of the future. The project envisions organisations and citizens playing an active role in the collection and preservation of data coming from cyber and physical sources, in a form that is usable by Law Enforcement Agencies (LEA) and other authorised entities involved in forensic investigations. The project will develop an adaptive and interactive recommender technology to support stakeholders throughout the different stages of an investigation, from cyber and physical evidence acquisition, evidence preservation, case-based retrieval, and evidence interpretation, to presentation in court. The long-term vision of the project will be achieved by developing adaptive methodologies to engineer forensically ready cyber and physical systems and by providing an evidence-based framework to support effective cyber and physical forensic investigations.
Project
To develop techniques that can improve the productivity of software testers as well as the processes involved in conducting proper performance testing of enterprise-level applications.
Archived project
The aim is to investigate some of the many challenges Web-based Enterprise applications pose to traditional performance evaluation and testing techniques, due to the huge amounts of data that they generate. Tools for detecting and visualising anomalies in performance measurement data are also under development.