José Antonio González Alastrué
Universitat Politècnica de Catalunya | UPC · Department of Statistics and Operations Research (EIO)

Computer Science

About

86 Publications
24,634 Reads
652 Citations
Citations since 2017: 405 (25 research items)
[Chart: citations per year, 2017-2023]
Additional affiliations
September 1992 - present
Universitat Politècnica de Catalunya
Position: Associate Professor

Publications

Publications (86)
Article
Full-text available
Background: Citicoline (CDP-choline) is a neuroprotective/neurorestorative drug used in several countries for the treatment of traumatic brain injury (TBI). Since the publication of the controversial COBRIT trial, the use of citicoline in this indication has been questioned, so it was considered necessary to undertake a systematic review and meta-analy...
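The truncated abstract mentions a systematic review and meta-analysis, but the pooling model actually used in the paper is not visible here. The following is only a generic random-effects (DerSimonian-Laird) sketch in Python with invented effect sizes and variances, not the authors' analysis.

```python
# Minimal random-effects meta-analysis sketch (DerSimonian-Laird).
# Effect sizes and variances below are invented for illustration only.
import numpy as np

yi = np.array([0.25, 0.10, 0.40, 0.05])   # hypothetical per-study effects (e.g. log OR)
vi = np.array([0.04, 0.02, 0.09, 0.03])   # hypothetical within-study variances

w_fixed = 1.0 / vi
ybar_fixed = np.sum(w_fixed * yi) / np.sum(w_fixed)
q = np.sum(w_fixed * (yi - ybar_fixed) ** 2)           # Cochran's Q
df = len(yi) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                          # between-study variance estimate

w = 1.0 / (vi + tau2)
pooled = np.sum(w * yi) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect = {pooled:.3f}, 95% CI = ({pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f})")
```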
Article
Full-text available
Background. Research on the impact of online homework systems on student achievement, compared with traditional methods, is mixed. Methodological issues in study design, besides technological diversity, can account for this uncertainty. Hypothesis. This study aims to estimate the effect size of homework practice with exercises automatically pr...
Article
Full-text available
This work explores how the learning process was adapted during the COVID-19 pandemic, while students and teachers were confined at home and classes were held online. For this purpose, a survey was conducted among teachers in Statistics and Operations Research departments at Spanish universities; 131 teachers from 25 universities answered a 28-item questi...
Article
Full-text available
Objective To develop a tool to assess the quality of peer-review reports in biomedical research. Methods We conducted an online survey intended for biomedical editors and authors. The survey aimed to (1) determine if participants endorse the proposed definition of peer-review report quality; (2) identify the most important items to include in the...
Article
Full-text available
Background : Precision medicine is the Holy Grail of interventions that are tailored to a patient’s individual characteristics. However, conventional clinical trials are designed to find differences in averages, and interpreting these differences depends on untestable assumptions. Although only an ideal, a constant effect of treatment would facilit...
Article
Full-text available
Background: From 2005 to 2010, we conducted 2 randomized studies on a journal (Medicina Clínica), where we took manuscripts received for publication and randomly assigned them to either the standard editorial process or to additional processes. Both studies were based on the use of methodological reviewers and reporting guidelines (RG). Those inte...
Article
Background From 2005 to 2010, we conducted 2 randomized studies on a journal (Medicina Clínica), where we took manuscripts received for publication and randomly assigned them to either the standard editorial process or to additional processes. Both studies were based on the use of methodological reviewers and reporting guidelines (RG). Those interv...
Article
Rationale: Optimal pre-hospital delivery pathways for acute stroke patients suspected of harboring a large vessel occlusion have not been assessed in randomized trials. Aim: To establish whether stroke subjects with Rapid Arterial Occlusion Evaluation scale-based suspicion of large vessel occlusion, evaluated by emergency medical services in the fie...
Article
Full-text available
Background : Precision medicine is the Holy Grail of interventions that are tailored to a patient’s individual characteristics. However, conventional clinical trials are designed to find differences in averages, and interpreting these differences depends on untestable assumptions. Although ideal, a constant effect would facilitate individual manage...
Article
Full-text available
Background: A strong need exists for a validated tool that clearly defines peer review report quality in biomedical research, as it will allow evaluating interventions aimed at improving the peer review process in well-performed trials. We aim to identify and describe existing tools for assessing the quality of peer review reports in biomedical res...
Article
Full-text available
Background : Precision medicine is the Holy Grail of interventions that are tailored to a patient’s individual characteristics. However, conventional clinical trials are designed to find differences with the implicit assumption that the effect is the same in all patients within the eligibility criteria. If this were the case, then there would be no...
Article
Full-text available
Teaching statistics has benefited from Java applets, the successful technology that appeared in the late 1990s and allowed real interactivity in an Internet browser. Combining dynamic functionality with the web provides an inspirational complement to the contents of many subjects in undergraduate statistics courses, especially for active learni...
Article
Full-text available
Background : Precision medicine is the Holy Grail of interventions that are tailored to a patient’s individual characteristics. However, conventional clinical trials are designed to find differences with the implicit assumption that the effect is the same in all patients within the eligibility criteria. If this were the case, then there would be no...
Article
Full-text available
Background: Precision medicine is the Holy Grail of interventions that are tailored to a patient’s individual characteristics. However, the conventional design of randomized trials assumes that each individual benefits by the same amount. Methods: We reviewed parallel trials with quantitative outcomes published in 2004, 2007, 2010 and 2013. We coll...
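These precision-medicine abstracts argue that conventional trials implicitly assume a constant treatment effect. One generic way to probe that assumption in parallel trials with quantitative outcomes is to compare outcome variability between arms; the sketch below (simulated data, normal-approximation interval) only illustrates that idea and is not the authors' actual procedure.

```python
# Sketch: if a treatment shifted every patient's outcome by the same amount,
# it would not change the outcome variance, so the treated/control variance
# ratio should be close to 1.  Simulated data, normal-approximation CI.
import numpy as np

rng = np.random.default_rng(1)
control = rng.normal(loc=50.0, scale=10.0, size=120)
treated = rng.normal(loc=45.0, scale=13.0, size=120)   # larger spread: possible effect heterogeneity

n_t, n_c = len(treated), len(control)
log_vr = np.log(np.var(treated, ddof=1) / np.var(control, ddof=1))
se = np.sqrt(2.0 / (n_t - 1) + 2.0 / (n_c - 1))        # approx. SE of a log variance ratio
lo, hi = np.exp(log_vr - 1.96 * se), np.exp(log_vr + 1.96 * se)
print(f"variance ratio = {np.exp(log_vr):.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```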
Chapter
Protection levels on sensitive cells—which are key parameters of any statistical disclosure control method for tabular data—are related to the difficulty of any attacker to recompute a good estimation of the true cell values. Those protection levels are two numbers (one for the lower protection, the other for the upper protection) imposing a safety...
Article
National Statistical Agencies routinely disseminate large amount of data. Prior to dissemination these data have to be protected to avoid releasing confidential information. Controlled tabular adjustment (CTA) is one of the available methods for this purpose. CTA formulates an optimization problem that looks for the safe table which is closest to t...
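CTA is described above as finding the safe table closest to the original one by solving an optimization problem. The toy below, assuming scipy is available, shows the shape of that problem for a single row with a total: one sensitive cell with an upper protection level, an L1 distance, and the protection direction fixed in advance so the problem stays a plain LP (the real method handles both directions and becomes a mixed-integer program). All cell values are invented.

```python
# Toy CTA: cells x1..x3 with total t; cell 1 is sensitive and must be pushed
# at least UPL above its true value; minimise the total L1 adjustment while
# keeping the table additive.  Deviations are split as d = p - m, p, m >= 0.
import numpy as np
from scipy.optimize import linprog

x = np.array([20.0, 35.0, 45.0])     # hypothetical internal cells
t = x.sum()                           # marginal total
UPL = 5.0                             # upper protection level for cell 1

# Variables: [p1 p2 p3 pt m1 m2 m3 mt]
c = np.ones(8)                                            # L1 objective
A_eq = np.array([[1, 1, 1, -1, -1, -1, -1, 1]])           # adjusted table stays additive
b_eq = np.array([0.0])
A_ub = np.array([[-1, 0, 0, 0, 1, 0, 0, 0]])              # -(p1 - m1) <= -UPL, i.e. d1 >= UPL
b_ub = np.array([-UPL])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
d = res.x[:4] - res.x[4:]
print("adjusted table:", np.append(x, t) + d, "| total adjustment:", res.fun)
```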
Article
Full-text available
To estimate treatment effects, trials are initiated by randomising patients to the interventions under study and finish by comparing patient evolution. In order to improve the trial report, the CONSORT statement provides authors and peer reviewers with a guide of the essential items that would allow research replication. Additionally, WebCONSORT ai...
Presentation
Full-text available
A set of applications developed with R and the Shiny package, available on the web, is presented. These resources have been designed to facilitate learning by combining interaction, visualization and simulation. The aim is for students, by manipulating the resource, to discover for themselves certain experiences and properties related to...
Article
Readers of Medicina Clínica are well aware of the importance of properly defining the denominator of a proportion when estimating a probability: "the probability that a Catholic is Pope is not the same as the probability that the Pope is a Catholic". Similarly, in diagnosis, the probability that a diseased person tests positive (sensitivity)...
Article
Controlled tabular adjustment (CTA) is a relatively new protection technique for tabular data protection. CTA formulates a mixed integer linear programming problem, which is challenging for tables of moderate size. Even finding a feasible initial solution may be a challenging task for large instances. On the other hand, end users of tabular data pr...
Conference Paper
https://speakerdeck.com/delucas/adaptacion-de-una-plataforma-de-e-learning-a-nuevos-entornos
Conference Paper
Minimum distance controlled tabular adjustment (CTA) is a perturbative technique of statistical disclosure control for tabular data. Given a table to be protected, CTA looks for the closest safe table by solving an optimization problem using some particular distance in the objective function. CTA has shown to exhibit a low disclosure risk. The purp...
Article
Objectives: To evaluate the empirical concordance between the hazard ratio (HR) and the median ratio (MR) in survival cancer studies. Study design and setting: We selected all cancer survival articles from the New England Journal of Medicine published between 2000 and 2010. The qualitative concordance was estimated by the proportion of measured...
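To make the HR/MR comparison concrete, here is a sketch that fits a Cox model and reads off Kaplan-Meier medians on simulated exponential data. It assumes the lifelines library is available and is not the paper's data or code; under exponential survival the control/treatment median ratio approximates the hazard ratio.

```python
# Sketch: hazard ratio (Cox model) vs median survival ratio on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(7)
n = 300
arm = rng.integers(0, 2, size=n)                       # 0 = control, 1 = treatment
true_hr = 0.7
time = rng.exponential(scale=12.0 / np.where(arm == 1, true_hr, 1.0), size=n)
event = (time < 36.0).astype(int)                      # administrative censoring at 36 months
time = np.minimum(time, 36.0)
df = pd.DataFrame({"time": time, "event": event, "arm": arm})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
hr = np.exp(cph.params_["arm"])                        # hazard ratio, treatment vs control

medians = {}
for a in (0, 1):
    km = KaplanMeierFitter().fit(df.loc[df.arm == a, "time"], df.loc[df.arm == a, "event"])
    medians[a] = km.median_survival_time_
mr = medians[0] / medians[1]                           # control/treatment; ~HR if times are exponential

print(f"HR = {hr:.2f}, median ratio = {mr:.2f}")
```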
Article
One of the main concerns of national statistical agencies (NSAs) is to publish tabular data. NSAs have to guarantee that no private information from specific respondents can be disclosed from the released tables. The purpose of the statistical disclosure control field is to avoid such a leak of private information. Most protection techniques for ta...
Article
Full-text available
Objective To investigate the effect of an additional review based on reporting guidelines such as STROBE and CONSORT on quality of manuscripts. Design Masked randomised trial. Population Original research manuscripts submitted to the Medicina Clínica journal from May 2008 to April 2009 and considered suitable for publication...
Data
Web table 1: Manuscript quality assessment instrument
Data
Full-text available
Web appendix: Protocol trial design
Data
Web table 2: Adherence recommendation assessment questionnaire
Article
Full-text available
Objective: To investigate the effect of an additional review based on reporting guidelines such as STROBE and CONSORT on quality of manuscripts. Design: Masked randomised trial. Population Original research manuscripts submitted to the Medicina Clínica journal from May 2008 to April 2009 and considered suitable for publication. Intervention:...
Conference Paper
Full-text available
Minimum-distance controlled tabular adjustment (CTA), together with its restricted variants (RCTA), is a recent perturbative approach for tabular data protection. Given a table to be protected, the purpose of RCTA is to find the closest table that guarantees protection levels for the sensitive cells. This is achieved by adding slight adjustments to t...
Article
Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback on students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All...
Article
Full-text available
There is a lack of agreement regarding measuring the effects of stroke treatment in clinical trials, which often relies on the dichotomized value of 1 outcome scale. Alternative analyses consist mainly of 2 strategies: use all the information from an ordinal scale and combine information from several outcome scales in a single estimate. We reanalyz...
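The abstract contrasts dichotomizing a single outcome scale with analyses that keep its full ordinal information. A minimal illustration on invented mRS-like scores, assuming only scipy: a chi-squared test on a 0-2 versus 3-6 cut next to a Mann-Whitney test on the whole scale.

```python
# Sketch: dichotomised vs whole-scale analysis of an ordinal outcome (0-6, mRS-like).
# Scores are simulated; lower values are better outcomes.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

rng = np.random.default_rng(3)
control = rng.choice(np.arange(7), size=200, p=[.10, .12, .15, .18, .20, .15, .10])
treated = rng.choice(np.arange(7), size=200, p=[.15, .16, .17, .17, .15, .12, .08])

# 1) Dichotomised: "good outcome" = score 0-2
table = np.array([[np.sum(treated <= 2), np.sum(treated > 2)],
                  [np.sum(control <= 2), np.sum(control > 2)]])
chi2, p_dich, _, _ = chi2_contingency(table)

# 2) Whole ordinal scale: Mann-Whitney shift test
_, p_ord = mannwhitneyu(treated, control, alternative="two-sided")

print(f"dichotomised p = {p_dich:.3f}, ordinal p = {p_ord:.3f}")
```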
Article
Full-text available
Tabular data is routinely released by national statistical agencies (NSA) to disseminate aggregated information from some particular microdata. Prior to publication, these tables have to be treated to preserve information without disclosing confidential details from specific respondents. This statistical disclosure control problem is of main inte...
Article
Full-text available
e-status is a web platform that offers students exercises that are corrected automatically. The same exercise can be repeated as many times as desired, since the data are different each time. On finishing the exercise, the student receives feedback on their work. The platform has been used over the last...
Article
Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback on students' answers. Although the use of Information and Communication Technologies (ICTs) is widespread in undergraduate education, there are few studies formally evaluating its learning effects. Method: all the studen...
Article
The article presents a web-based tool designed to support a learning project led by a team of teachers to assist students in learning statistics. The goal is to build a tool that is able to display statistical or mathematical problems and to correct the students' answers. The problems may include random data, so the solution cannot be previously kn...
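Several of these e-status entries describe the same cycle: generate a problem with random data, let the student answer, and correct it instantly. The toy below mimics that cycle in a few lines of Python; it is obviously not the platform itself, and the exercise, tolerance and function names are made up for illustration.

```python
# Toy version of the generate/solve/check cycle behind an e-status-style exercise.
# Each call produces a fresh data set, so the "right answer" changes every time.
import numpy as np

def generate_exercise(seed=None):
    rng = np.random.default_rng(seed)
    data = np.round(rng.normal(loc=100, scale=15, size=10), 1)
    statement = f"Compute the sample mean of: {data.tolist()}"
    solution = float(np.mean(data))
    return statement, solution

def check_answer(student_answer, solution, tol=0.01):
    # Instant feedback with a small numerical tolerance.
    return abs(student_answer - solution) <= tol

statement, solution = generate_exercise(seed=42)
print(statement)
print("correct" if check_answer(103.2, solution) else f"try again (expected {solution:.2f})")
```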
Article
Full-text available
We have developed a web-based tool, called e-status, that is able to generate individually different statistical or mathematical problems and to correct the students' answers. The tool is well appreciated by the students since it is available anywhere and anytime. This ability allows the weaker students to practice the concepts as needed, without o...
Article
Full-text available
e-status is a computer tool designed so that students can solve statistical problems and obtain immediate correction. The present work describes: a) its use during the adaptation to the ECTS credit system required by the European convergence process, exemplified in a specific course; and b) the des...
Article
Full-text available
A web-based system is presented that aims to help students learn statistics by generating individualized numerical problems with instant correction. Since the problems include random parameters, the solution cannot be known in advance. In this way, the student can keep trying...
Article
Long-term hydrothermal coordination is one of the main problems to be solved by an electric utility. Its solution provides the optimal allocation of hydraulic, thermal and nuclear resources at the different intervals of the planning horizon. The purpose of the paper is two-fold. Firstly, it presents a new package for solving the hydrothermal coordi...
Article
Graphical representation has become a common element in the media, both scientific and popular, as well as in the press and on television. Its presence is regarded as a powerful mechanism for transmitting information. Some commonplaces have come to acquire the status of axioms: a graph explains itself, an i...
Article
Whenever relationships among variables are complex, or time processes play an essential part, or random components mask the process under study, graphical display becomes an indispensable tool. Biomedicine, in a broad sense, from research to medical care or management activities, is a field with these features, and good use of graphics can facilita...
Article
Full-text available
The work presented deals with long-term hydrogeneration optimization in integrated systems when there are no limitations on the availability of fuels for thermal units. A multicommodity network model represents stochastic hydrovariables. Uncertainty due to random inflows means that the hydrogeneration's share in production is not deterministic, and t...
Article
Full-text available
The paper introduces a project led by a team of teachers to help students learn statistics. The goal is to build a tool able to present mathematical problems and to correct the students' answers. The problems may include random data, so the solution cannot be known in advance (even if solved before) and the student can reconsider it if necessary. Pe...
Article
The work presented deals with long-term hydrogeneration optimization in integrated systems when there are no limitations on the availability of fuels for thermal units. A multicommodity network model represents hydrovariables. Hydrogeneration and its unavailability distribution are modeled as a multiblock distribution and a procedure is derived to...
Article
Full-text available
Optimizing the thermal production of electricity in the long term, once the maintenance schedules have been decided, means optimizing both the fuel procurement policies and the use of fuels for generation in each thermal unit throughout the time period under study. A fundamental constraint to be satisfied at each interval into which the long time p...
Article
Summary: The long-term hydrothermal coordination of electricity generation has to optimize many variables tied to stochastic parameters. Multicommodity network flows in a replicated reservoir hydronetwork with multicommodity water inflows are one of the possible ways to model and optimize the long-term coordination. In the existing literature on this...
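The hydrothermal papers above model the coordination problem with multicommodity network flows. The sketch below is a much-simplified, single-commodity stand-in (networkx assumed available): reservoir nodes per period, a storage carry-over arc, and a thermal source whose generation is the only costly flow. All capacities, demands and costs are invented.

```python
# Single-commodity toy of a hydrothermal dispatch network solved as a
# min-cost flow: hydro arcs are free, thermal arcs carry a fuel cost.
import networkx as nx

G = nx.DiGraph()
# Supplies (negative demand) and demands, all integers.
G.add_node("R1", demand=-8)    # reservoir inflow, period 1
G.add_node("R2", demand=-4)    # reservoir inflow, period 2
G.add_node("T",  demand=-2)    # thermal generation budget
G.add_node("D1", demand=7)     # electricity demand, period 1
G.add_node("D2", demand=7)     # electricity demand, period 2

G.add_edge("R1", "D1", capacity=10, weight=0)   # hydro generation, period 1
G.add_edge("R1", "R2", capacity=5,  weight=0)   # water carried over in storage
G.add_edge("R2", "D2", capacity=10, weight=0)   # hydro generation, period 2
G.add_edge("T",  "D1", capacity=10, weight=3)   # thermal generation (fuel cost)
G.add_edge("T",  "D2", capacity=10, weight=3)

flow = nx.min_cost_flow(G)
print(flow)
print("total cost:", nx.cost_of_flow(G, flow))
```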
Article
HTCOOR is a computer package to solve the long-term hydrothermal coordination problem of electricity generation. HTCOOR was developed for the SLOEGAT European Esprit Project 22652. This document fully describes the model implemented in HTCOOR, as well as the optimization problem to be solved. Keywords: Electricity generation, Long-term hydrothermal...
Article
Full-text available
Minimum distance controlled tabular adjustment (CTA) is a recent methodology for the protection of tabular data. Given a table to be protected, the purpose of CTA is to find the closest table that guarantees the confidentiality of the sensitive cells. This is achieved by adding slight adjustments to the remaining cells, or a subset of them provide...
Article
Full-text available
BACKGROUND AND OBJECTIVE E-status is a web-based tool designed to improve learning in statistics introductory courses. In a separate report (Gonzalez et al, 2006) we presented an experimental evaluation of the performance improvement among students of a dentistry school who used the tool. Here, based on observation, we estimate its effect in improv...
Article
Full-text available
The problem consists of routing packages through a network. In fact, the packages actually correspond to bandwidth in a telecommunication network. The packages can be assigned to several sizes: the larger the size, the lower the transport cost. It is possible to tie packages together (and untie them) to take advantage of lower costs, but onl...
Article
Full-text available
Introduction. Students of Applications of Statistics in the Health Sciences arrive with a good level in theoretical, formal and computing aspects, and we set out to develop their teamwork skills and their oral and written communication, as well as their cognitive abilities of analysis, synthesis and evaluation in the area of B...
Article
Full-text available
The new curricula of the European Higher Education Area (EEES) are getting under way, and courses must introduce the methodological changes suggested by Bologna. We describe our experience in a course belonging to an earlier curriculum, in whose latest edition we tried out some initiatives to stimulate students' active work....

Questions

Questions (5)
Question
Hi, I would be happy to know your points of view about the following situation. Imagine that we are interested in using a tool to measure a given construct (for instance, the quality of a product), and we prepare two designs. In the first, we employ two raters who evaluate a number of products chosen at random with the tool (both raters evaluate each product independently). In the second, we ask many people from a pool to rate a small number of products with the tool (each rater evaluates the same, say three, products); the products to be measured were previously selected by us.
Starting from the data obtained under these two designs, please share your thoughts about what you could analyze and what you could interpret. I am aware that in the first case the product would be a random factor, as the rater would be in the second case. I'm particularly interested in the links and differences between both analyses, and in what could be obtained overall if we combine them.
Thanks in advance.
José Antonio
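One common summary for the first design (a fixed pair of raters scoring randomly sampled products) is an intraclass correlation from a two-way ANOVA decomposition; below is a minimal sketch computing ICC(2,1) with the Shrout-Fleiss mean-squares formulas on simulated scores, using only numpy. In the second design, where raters rather than products are the random factor, a different ICC form would apply and is not covered here.

```python
# ICC(2,1) for a fully crossed products x raters table (Shrout & Fleiss),
# computed from mean squares.  Scores are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_products, n_raters = 20, 2
true_quality = rng.normal(0, 1.0, size=(n_products, 1))
rater_bias = rng.normal(0, 0.3, size=(1, n_raters))
scores = true_quality + rater_bias + rng.normal(0, 0.5, size=(n_products, n_raters))

grand = scores.mean()
row_means = scores.mean(axis=1, keepdims=True)     # product means
col_means = scores.mean(axis=0, keepdims=True)     # rater means

msr = n_raters * np.sum((row_means - grand) ** 2) / (n_products - 1)    # between products
msc = n_products * np.sum((col_means - grand) ** 2) / (n_raters - 1)    # between raters
sse = np.sum((scores - row_means - col_means + grand) ** 2)
mse = sse / ((n_products - 1) * (n_raters - 1))                         # residual

icc_2_1 = (msr - mse) / (msr + (n_raters - 1) * mse
                         + n_raters * (msc - mse) / n_products)
print(f"ICC(2,1) = {icc_2_1:.3f}")
```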
Question
Do any of you know of a recent systematic review focused on studies investigating the efficacy of ICTs in statistical education?
I'm looking for research based on randomized experiments, but these seem rare in the teaching field. I would greatly appreciate any useful reference.
Question
When do you think a 50% confidence interval may make sense? Could it be more valuable in certain settings than the usual 95% CI? (Please share your point of view; I'm trying to gather a broad spectrum of opinions. Thanks.)
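As a small numeric companion to the question: under normality a 50% interval is only about a third as wide as a 95% one (z of roughly 0.674 versus 1.96), which is the usual argument for reading it as a "likely range" rather than a near-certain one. The sketch below, assuming scipy, simply compares the two widths on simulated data.

```python
# How much narrower is a 50% CI than a 95% CI for a normal mean?
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
x = rng.normal(loc=10.0, scale=2.0, size=50)
mean, se = x.mean(), x.std(ddof=1) / np.sqrt(len(x))

for level in (0.50, 0.95):
    z = norm.ppf(0.5 + level / 2)                # 0.674 for 50%, 1.960 for 95%
    print(f"{int(level*100)}% CI: ({mean - z*se:.2f}, {mean + z*se:.2f})  half-width = {z*se:.2f}")

print("width ratio (50% / 95%):", round(norm.ppf(0.75) / norm.ppf(0.975), 3))
```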