Salvador Sánchez-Alonso’s research while affiliated with King Juan Carlos University and other places

What is this page?


This page lists works of an author who doesn't have a ResearchGate profile or hasn't added the works to their profile yet. It is automatically generated from public (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (152)


Figures and tables from this publication (preview; three additional figures not shown):
Fig. 3 IOTA Streams architecture. Streams channels are built upon the DLT, enabling the transmission and security of messages across the network, thereby guaranteeing the authenticity of the data.
Fig. 4 Utilization of public, private, and evaluation keys to encrypt and decrypt data within a homomorphic encryption scheme implemented by a service provider.
Fig. 5 Message payload generation for each party. Public payload visible to all; masked payload visible only to subscribers. Encrypted data is handled by both the client and the server, with the client directly generating it and the server acquiring it as a computation result.
Expansion ratio of FHE from raw data to ciphertext depending on the quantization bits used.
Transmission metrics for various message sizes and connection types from the client to the service provider across the channel.

Privacy-Preserving Computing Services for Encrypted Personal Data Through Streams Over Distributed Ledgers
  • Article
  • Full-text available

September 2024

·

13 Reads

International Journal of Networked and Distributed Computing

·

Miguel-Ángel Sicilia-Urbán

·

Salvador Sánchez-Alonso

The growing adoption of wearables is driving the demand for personalized services that leverage unprocessed data, such as biometric and health information, to enhance user experiences and support through software applications. However, several existing use cases involving this information still prioritize traditional schemes, neglecting user privacy. Consequently, the transparency of data transmission paths and the potential for tampering remain ambiguous when users share data with service providers. In this paper, we propose the application of an Internet of Things device-focused distributed ledger as an underlying layer for the transmission of encrypted data using streams. Moreover, our proposal enables data recording for future events and the implementation of multi-subscriber models, allowing client information to be shared securely with different service providers. Through simulation experiments conducted on constrained devices, we demonstrate that our proposed framework efficiently transmits large ciphertexts through streams on a distributed ledger, overcoming the inherent limitations of such networks when dealing with substantial data volumes. Ultimately, the performance metrics presented prove that the proposed model is suitable for real-world applications requiring continuous data collection by wearables and subsequent transmission to service providers.
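
A minimal sketch of the transport step the abstract describes: splitting a large ciphertext into DLT-sized stream messages with a public routing part and a masked data part, then reassembling it on the provider side. The chunk size, field names, and helper functions below are illustrative assumptions, not the paper's implementation.

```python
from base64 import b64encode, b64decode

# Illustrative limit for a single stream message payload on a constrained
# DLT network (assumed value, not taken from the paper).
MAX_PAYLOAD_BYTES = 32 * 1024


def split_ciphertext(ciphertext: bytes, channel_id: str) -> list[dict]:
    """Split a large FHE ciphertext into ordered, self-describing messages."""
    chunks = [ciphertext[i:i + MAX_PAYLOAD_BYTES]
              for i in range(0, len(ciphertext), MAX_PAYLOAD_BYTES)]
    return [
        {
            "channel": channel_id,                 # public payload: routing info
            "index": i,                            # public payload: ordering
            "total": len(chunks),
            "masked": b64encode(chunk).decode(),   # masked payload: ciphertext chunk
        }
        for i, chunk in enumerate(chunks)
    ]


def reassemble(messages: list[dict]) -> bytes:
    """Reassemble the ciphertext from messages read off the channel."""
    ordered = sorted(messages, key=lambda m: m["index"])
    assert len(ordered) == ordered[0]["total"], "missing stream messages"
    return b"".join(b64decode(m["masked"]) for m in ordered)


if __name__ == "__main__":
    fake_ciphertext = bytes(200_000)  # stand-in for a real FHE ciphertext
    msgs = split_ciphertext(fake_ciphertext, channel_id="wearable-01")
    assert reassemble(msgs) == fake_ciphertext
    print(f"{len(msgs)} messages of at most {MAX_PAYLOAD_BYTES} bytes each")
```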


Enhancing Privacy and Integrity in Computing Services Provisioning Using Blockchain and Zk-SNARKs

January 2024

·

21 Reads

IEEE Access

The widespread integration of on-demand services founded on proprietary algorithms into various software applications has ushered in a new era of advanced service capabilities. However, using these services requires the customer to disclose information, not only during the payment process but also when using the service, where certain personal information must be shared to obtain a more personalized service. This practice potentially exposes users to increased security risks in the event of data security breaches. In this paper, we introduce a novel framework aimed at enhancing client privacy and ensuring service integrity within the context of computing services that rely on proprietary algorithms. A blockchain-based approach is proposed to enhance user privacy throughout service provision, encompassing both the payment process and the verification of the provided service. Our proposal leverages properties of distributed ledger networks to improve user privacy during payment transactions and incorporates a verification system using zero-knowledge proofs on blockchain to validate the integrity of the contracted service. Finally, we analyze the privacy, overhead, and performance aspects of the framework using custom proprietary algorithms, illustrated through examples of Convolutional Neural Networks with multiple layers undisclosed to the client. This highlights the potential benefits of the framework for both service providers and clients.
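
The commit-then-verify interaction the abstract outlines can be sketched as follows. Plain hash commitments stand in here for the zk-SNARK proofs, so the sketch demonstrates only the message flow, not zero-knowledge verification of the hidden algorithm; all names and the averaging "model" are illustrative assumptions.

```python
import hashlib
import json


def commitment(payload: dict, nonce: bytes) -> str:
    """Binding hash commitment to a payload (stand-in for a zk-SNARK proof)."""
    data = json.dumps(payload, sort_keys=True).encode() + nonce
    return hashlib.sha256(data).hexdigest()


# --- provider side (proprietary algorithm kept private) ---------------------
def provider_serve(client_input: list[float], nonce: bytes) -> tuple[dict, str]:
    result = {"score": sum(client_input) / len(client_input)}  # hidden model stub
    return result, commitment({"input": client_input, "result": result}, nonce)


# --- client side -------------------------------------------------------------
def client_verify(client_input, result, proof, nonce) -> bool:
    """Recompute the commitment over the exchanged messages.

    Note: without a real zk-SNARK the client only checks message integrity;
    it cannot verify that the hidden algorithm was applied correctly.
    """
    return proof == commitment({"input": client_input, "result": result}, nonce)


if __name__ == "__main__":
    nonce = b"agreed-session-nonce"   # assumed to be shared per service session
    x = [0.2, 0.4, 0.9]
    result, proof = provider_serve(x, nonce)
    print("result:", result, "verified:", client_verify(x, result, proof, nonce))
```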


Unleashing Competitive Intelligence: News Mining Analysis on Technology Trends and Digital Health Driving Healthcare Innovation

November 2023

·

118 Reads

IEEE Transactions on Engineering Management

In the rapidly evolving digital health landscape, technology plays a pivotal role in transforming the healthcare industry. With the exponential growth of data, uncovering valuable insights has become a daunting task. In today's data-driven world, healthcare businesses must leverage emerging technologies to stay informed about trends in their field. This research article presents a novel approach to deriving business insights in digital health enabled by technology, including artificial intelligence and other cutting-edge advancements. We propose a methodology that utilizes news mining techniques and the Global Data on Events, Location, and Tone (GDELT) database as the primary data source. By employing natural language processing, we developed a practical way of extracting relevant insights from vast amounts of public data. We implemented named-entity recognition (NER) enriched with the DBpedia knowledge base and relationship extraction. In addition, we leveraged graph analytics to identify and analyze the most significant concept relationships within the text corpus and their evolution over time. By integrating these advanced techniques, healthcare businesses can extract actionable insights from public datasets, empowering them to stay abreast of emerging trends and advancements in digital health, such as telehealth, precision medicine, or medical imaging.
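
A minimal sketch of the entity-extraction and co-occurrence-graph steps, assuming spaCy's small English model and networkx; the DBpedia enrichment, relationship extraction, and temporal analysis described above are omitted, and the two news snippets are invented examples.

```python
# NER over news snippets, then a co-occurrence graph of entities mentioned in
# the same document, ranked by degree centrality as a proxy for importance.
# Requires: pip install spacy networkx && python -m spacy download en_core_web_sm
from itertools import combinations

import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")

docs = [
    "Telehealth adoption surged as hospitals partnered with Microsoft on cloud platforms.",
    "Precision medicine startups in Boston raised funding for AI-driven medical imaging.",
]

graph = nx.Graph()
for text in docs:
    # Keep organisations, places, and products; one node per distinct entity.
    ents = {e.text for e in nlp(text).ents if e.label_ in {"ORG", "GPE", "PRODUCT"}}
    for a, b in combinations(sorted(ents), 2):
        weight = graph.get_edge_data(a, b, default={"weight": 0})["weight"]
        graph.add_edge(a, b, weight=weight + 1)

for node, score in sorted(nx.degree_centrality(graph).items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```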


The Relevance of Open Data Principles for the Web of Data

September 2023

·

88 Reads

·

1 Citation

Over the years, open data has improved both publishing platforms and consumer-oriented processes, providing better openness policies and transparency. Although organizations have tried to open their data, the enrichment of their resources through the Web of Data has been decreasing. Linked data has suffered notable difficulties at different stages of its life cycle, becoming less attractive to users over the years. Accordingly, we explore how the lack of certain opening requirements contributes to the decline of the Web of Data. This paper presents a radiography of the Web of Data, analyzing the governmental domain as a case study. The results indicate that it is necessary to strengthen the data opening process to improve resource enrichment on the Web and obtain better datasets. These improvements require that open data be public, accessible (in machine-readable formats), described (using robust, granular metadata), reusable (made available under an open license), complete (published in primary forms), and timely (preserving the value of the data). Implementing these characteristics would enhance the availability and reuse of datasets. Furthermore, organizations must understand that opening and enriching their data requires a completely new approach and demands special attention and control, generally in the form of funding, commitment from management at all levels, and considerable time. Otherwise, given the magnitude of the availability and reuse problems identified in the data opening and enrichment process, the Web of Data model will inevitably lose the interest it aroused at the outset unless data quality, openness, and enrichment issues are addressed immediately; its use would be restricted to a few particular niches or would even disappear altogether.
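
Several of the opening requirements listed above can be checked mechanically against a dataset's metadata record. A small sketch with rdflib and an invented DCAT description follows; the example dataset, fields, and pass/fail criteria are illustrative, not the evaluation method used in the paper.

```python
# Checking a few opening requirements (described, reusable, accessible) against
# a DCAT dataset record. The record below is invented for illustration.
# Requires: pip install rdflib
from rdflib import Graph
from rdflib.namespace import DCAT, DCTERMS, RDF

record = """
@prefix dcat: <http://www.w3.org/ns/dcat#> .
@prefix dct:  <http://purl.org/dc/terms/> .

<http://example.org/dataset/budget-2023>
    a dcat:Dataset ;
    dct:title "Municipal budget 2023" ;
    dct:license <http://creativecommons.org/licenses/by/4.0/> ;
    dcat:distribution [ dcat:mediaType "text/csv" ] .
"""

g = Graph()
g.parse(data=record, format="turtle")
dataset = next(g.subjects(RDF.type, DCAT.Dataset))

checks = {
    "described (has a title)": (dataset, DCTERMS.title, None) in g,
    "reusable (open license declared)": (dataset, DCTERMS.license, None) in g,
    "accessible (machine-readable distribution)": (None, DCAT.mediaType, None) in g,
}
for requirement, ok in checks.items():
    print(f"{'PASS' if ok else 'FAIL'}: {requirement}")
```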


Prescriptive graph analytics on the digital transformation in healthcare through user-generated content

July 2023

·

127 Reads

·

10 Citations

Annals of Operations Research

As they swiftly evolve and become widely adopted, new technologies are fundamentally transforming the landscape of business models. Digital transformation has become a top priority for business leaders, and the Covid-19 pandemic has significantly accelerated this trend. As a result, the healthcare industry is also rapidly evolving. The goal of this research is to understand the impact of digital transformation (DX) in the healthcare industry and identify the main trends, opportunities and challenges by leveraging user-generated content through Twitter analytics. The analysis of textual information allows for the generation of insights and prescriptive analytics from aggregated unstructured data. Between January 2017 and December 2021, 96,826 English-language tweets on digitisation and digital transformation in healthcare were collected. The method consisted of a series of co-occurrence network topic modelling experiments based on graph analytics and semantic analysis. The results provide prescriptive guidance on the development of patient-centric digital health and are linked to the development of personalised healthcare, mobile health (mhealth) and efficiencies derived from adopting technology, especially artificial intelligence, machine learning and cloud computing. However, certain challenges must be addressed to implement digital transformation strategies, including ensuring compliance with data privacy regulations and managing the changes required in legacy systems and processes.
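
A hedged sketch of the co-occurrence network topic modelling step: terms appearing in the same tweet are linked, and graph communities are read as candidate topics. The tweets, vocabulary, and community-detection algorithm (networkx's greedy modularity) are illustrative assumptions, not the paper's pipeline.

```python
# Co-occurrence network topic modelling on a toy set of tweets: build a
# weighted term graph, then treat each detected community as a candidate topic.
# Requires: pip install networkx
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

tweets = [
    "digital transformation in healthcare needs cloud computing and ai",
    "telemedicine and mhealth improve patient access",
    "ai and machine learning drive personalised healthcare",
    "data privacy remains a challenge for digital health adoption",
]
vocabulary = {"ai", "cloud", "mhealth", "telemedicine", "privacy",
              "healthcare", "machine", "patient", "digital"}

g = nx.Graph()
for tweet in tweets:
    terms = sorted({t for t in tweet.split() if t in vocabulary})
    for a, b in combinations(terms, 2):
        weight = g.get_edge_data(a, b, default={"weight": 0})["weight"]
        g.add_edge(a, b, weight=weight + 1)

# Each detected community is read as one candidate topic.
for i, community in enumerate(greedy_modularity_communities(g, weight="weight")):
    print(f"topic {i}: {sorted(community)}")
```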


The power of big data analytics over fake news: A scientometric review of Twitter as a predictive system in healthcare

February 2023

·

131 Reads

·

13 Citations

Technological Forecasting and Social Change

Interest in healthcare has grown significantly worldwide, especially since the Covid-19 outbreak. Digitalisation has allowed users to interact on social networks through platforms like Twitter, which collect user interactions over time, and this has also resulted in the proliferation of fake news. This research aims to analyse, evaluate and classify the predictive potential of Twitter analytics in healthcare, identifying latent knowledge insights and distinguishing them from related rumours and fake news. To this end, a systematic literature review (SLR) is carried out to identify and analyse the existing academic research on, and applications of, Twitter as a predictive system in healthcare. The most important predictive applications are detecting mental health issues and public health emergencies. Covid-19 has been the main topic of most of the studies linked to fake news and misinformation. This research provides a practical contribution to the use of unstructured data from Twitter and raises awareness of the importance of this content applied to healthcare. It is therefore pertinent to focus on the advances offered by these data as a predictive tool in healthcare, and, to that end, it is essential to evaluate the veracity of the information shared on Twitter.


Understanding KlimaDAO Use and Value: Insights from an Empirical Analysis

January 2023

·

69 Reads

·

2 Citations

Communications in Computer and Information Science

·

·

Salvador Sánchez-Alonso

·

[...]

·

Blockchain technologies have demonstrated the potential to build decentralized finance (DeFi) protocols that are composable and interoperable. One of the envisioned applications of blockchains is that of becoming a platform for markets of greenhouse-gas emissions. KlimaDAO is one of the recently launched protocols that attempts to bridge existing voluntary carbon markets to DeFi by building incentive mechanisms on top of tokenized carbon assets. That approach may eventually bring benefits to carbon markets in terms of liquidity and transparency and address a wider audience. Here we report an analysis of the early status of that initiative in an attempt to gain insight into its actual functioning and value, and into the extent to which it is currently addressing the original goals and potential benefits of this kind of protocol.


Twitter as a predictive system: A systematic literature review

December 2022

·

449 Reads

·

37 Citations

Journal of Business Research

Millions of people use Twitter daily, posting thousands of messages and interacting with their peers. This research aims to evaluate and classify the predictive potential of the Twitter social platform through the intelligent analysis of user-generated public big data. A systematic literature review (SLR) covering Web of Science, IEEE, Scopus and other databases identified the gaps and opportunities for developing predictive applications of User-Generated Content (UGC) on Twitter since 2006. Our research is a practical contribution to the use of Twitter data as a predictive system. A wide variety of application domains, most notably social network analysis and public health, have been identified by applying innovative techniques for conducting a massive SLR, leveraging machine learning and graph analysis. The results give rise to new research lines with implications for both scholars and business leaders.


Quantum Software Measurement

October 2022

·

29 Reads

This chapter discusses prospects for the measurement of quantum software artifacts and processes, describing initial directions and reviewing the scattered and scarce literature on the topic. It examines potential differences and commonalities of classical and quantum software in the context of measurement and identifies future research directions that appear to be more promising to address the specificities of quantum computer–oriented development.
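
As a concrete illustration of what measuring a quantum software artifact could look like, the sketch below computes a few candidate structural metrics (width, depth, size, gate mix) from a small Qiskit circuit; these metrics are illustrative possibilities, not measures proposed in the chapter.

```python
# Candidate structural metrics for a quantum program, computed with Qiskit.
# Requires: pip install qiskit
from qiskit import QuantumCircuit

qc = QuantumCircuit(3, 3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.rz(0.5, 2)
qc.measure(range(3), range(3))

ops = qc.count_ops()            # instruction histogram, e.g. {'cx': 2, 'h': 1, ...}
two_qubit = ops.get("cx", 0)    # cx is the only two-qubit gate used here

print("width (qubits):", qc.num_qubits)
print("depth:", qc.depth())
print("size (instructions, incl. measurements):", qc.size())
print("gate mix:", dict(ops))
print("two-qubit gate share:", two_qubit / qc.size())
```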


Improving OER descriptions to enhance their availability, reuse, and enrichment

March 2022

·

124 Reads

·

7 Citations

Education and Information Technologies

Nowadays, information and communication technologies (ICTs) and virtual training have increased the use of educational resources. This growing use has highlighted problems with the reuse and availability of educational resources. Resource descriptions adapted only to particular needs and the lack of metadata enrichment exploiting the benefits provided by the Semantic Web are some examples of these problems. The purpose of this paper is to present an enhanced and interoperable set of metadata elements for describing OER (Open Educational Resources) that takes full advantage of openness and data enrichment. In this research, requirements such as data quality dimensions, Linked Data, and mapping into an RDF (Resource Description Framework) graph have been taken into account to provide well-described, available, reusable, and enriched OER, and to expose them as Linked Data. These features help the educational field strengthen the processes of opening, making available, reusing, and linking OER. In a nutshell, they are necessary to facilitate innovative educational settings. Finally, this improved OER description can be extrapolated to other countries, serving as potential opening and reuse guidelines for publishing OER in both applications and the LOD (Linked Open Data) Cloud.
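
The mapping of an OER description into an RDF graph can be sketched with rdflib and Dublin Core terms; the fields, values, and the DBpedia subject link below are invented for illustration and are far smaller than the element set proposed in the paper.

```python
# Mapping a minimal OER description into RDF with Dublin Core terms so it can
# be published and linked as Linked Data. Fields and values are illustrative.
# Requires: pip install rdflib
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS

oer = {
    "uri": "http://example.org/oer/linear-algebra-101",
    "title": "Linear Algebra 101",
    "language": "en",
    "license": "http://creativecommons.org/licenses/by/4.0/",
    "subject": "http://dbpedia.org/resource/Linear_algebra",  # enrichment link
}

g = Graph()
resource = URIRef(oer["uri"])
g.add((resource, DCTERMS.title, Literal(oer["title"], lang=oer["language"])))
g.add((resource, DCTERMS.language, Literal(oer["language"])))
g.add((resource, DCTERMS.license, URIRef(oer["license"])))
g.add((resource, DCTERMS.subject, URIRef(oer["subject"])))

print(g.serialize(format="turtle"))
```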


Citations (75)


... And users' entrenched habits may further inhibit the discontinuation of legacy systems. [21] As for behavior, complex cycles in the client's decision-making process and the vendor's communicative actions when purchasing new systems [22], along with new ways of working introduced by change management related to legacy systems [23], also contribute to the passive maintenance of these systems. ...

Reference:

How to Stop Negative Maintenance of Legacy Systems to Rescue Historical Data
Prescriptive graph analytics on the digital transformation in healthcare through user-generated content

Annals of Operations Research

... The spread of fake news has a negative impact on social stability and harmony, as disinformation can also mislead and disrupt the tranquility of human communities [75]. Fake news, typically spread on social media, has a negative impact on national innovation [108], health [23], consumer choices [75], and economically on social media platforms themselves [109]. In sum, fake news weakens the stability of society through its impact on public opinion, health, business, media and politics [49,114]. ...

The power of big data analytics over fake news: A scientometric review of Twitter as a predictive system in healthcare

Technological Forecasting and Social Change

... For instance, Klima DAO, a decentralized autonomous organization, uses its governance token Klima, anchored to carbon credits, to encourage emission reductions by driving up token prices. Participants can engage by purchasing Klima tokens, intermediary BCT tokens, or pledging BCT in liquidity pools, all of which increase learning and decision-making costs and potentially reduce participation [21,24]. ...

Understanding KlimaDAO Use and Value: Insights from an Empirical Analysis
  • Citing Chapter
  • January 2023

Communications in Computer and Information Science

... According to behavioral economics, emotions and sentiment can greatly influence individual actions and decision-making processes [11]. Previous research explored the effect of public Twitter sentiment on cryptocurrency prices [16], Elon Musk's Twitter activity on the cryptocurrency market [2], Twitter effect on stock market decisions during pandemics [27], and the predictive power of Twitter for Bitcoin price [25] (see [5] for a recent review). ...

Twitter as a predictive system: A systematic literature review

Journal of Business Research

... Another significant problem is that many authors of resources fail or are reluctant to deliver any metadata at all. Numerous studies have recommended metadata sets that describe OER more systematically and thereby enrich and facilitate the metadata report to improve the OER description and, therefore, the OER discoverability (Herrera-Cubides et al., 2022). ...

Improving OER descriptions to enhance their availability, reuse, and enrichment

Education and Information Technologies

... In childhood cancer, AI traceability ensures accountability, auditability, and the ability to identify, mitigate and correct problems or biases in AI applications. Traceability in AI involves comprehensive documentation to ensure the reproducibility and accountability of AI systems [93]. Reproducibility of AI methods details procedures and data, allowing for accurate replication. ...

Traceability for Trustworthy AI: A Review of Models and Tools

Big Data and Cognitive Computing

... These techniques are spreading rapidly across industries, including healthcare, alongside the Fourth Industrial Revolution [5]; in particular, they are being adopted in healthcare because they substantially improve cost reduction, operational efficiency, and effectiveness in medical institutions [6]. Medical institutions use these techniques to improve cost-effectiveness, optimize resource allocation, and manage indicators such as bed occupancy rates [7,8]. ...

Predicting Length of Stay Across Hospital Departments

IEEE Access

... Virtual communities or retro gaming websites can be categorized according to the interests of the individuals, namely interest in games, computers, and game consoles. These three branches of the retro gaming community can be further divided into the developers, buyers and sellers, and the technical problems and repairs group [Mora-Cantallops et al., 2021]. Besides having an organized structure, retro gaming communities are important spaces for the dynamization of this subculture since they allow individuals to interact with like-minded people and take on the role of agents for the preservation of software and hardware and other materials associated with these artifacts. ...

Identifying communities and fan practices in online retrogaming forums
  • Citing Article
  • February 2021

Entertainment Computing

... There is enormous potential in the resources exposed as open data sources in a variety of scenarios, ranging from the reuse of educational resources (Herrera-Cubides et al., 2020) to cybersecurity (Pastor-Galindo et al., 2020), among others. However, without adequate processing and transformation, it will not be possible to extract the aspects that allow these resources to be analyzed and subsequently used as smart data sources. ...

Open-Source Intelligence Educational Resources: A Visual Perspective Analysis

Applied Sciences

... Code coverage and other source-code metrics used for classical software have not been adopted for quantum programs [66]. This may be because the differences in the importance between quantum code and classical code have not yet been fully explored. ...

On the Source Code Structure of Quantum Code: Insights from Q# and QDK
  • Citing Chapter
  • August 2020

Communications in Computer and Information Science