Article

Métodos de escalamiento aplicados a la priorización de necesidades de formación en organizaciones [Scaling methods applied to the prioritization of training needs in organizations]

Psicothema, ISSN 0214-9915, Vol. 21, No. 4, 2009, pp. 509-514.

ABSTRACT

The criteria used to prioritize the needs that justify the training actions to be implemented are rarely made explicit beforehand in continuing training programs in organizational contexts. This paper proposes scaling methods as a feasible and useful procedure for identifying explicit criteria for prioritizing needs, and establishes which of them is most appropriate in this intervention context. A total of 404 employees of a public organization completed an ad hoc questionnaire to prioritize training needs in different areas over the period 2004 to 2006; specifically, 117, 75 and 286 stimuli were ranked in each year, respectively. The orderings obtained with four scaling methods were computed and compared: the Dunn-Rankin method and three methods derived from Thurstone's Law of Categorical Judgment, namely rank ordering, successive intervals, and equal-appearing intervals. The results confirm the feasibility and usefulness of these scaling methods for solving the problems posed; among the most precise methods, the rank-ordering method is proposed because of its parsimony (that is, the simplicity of its procedure); and future lines of action are outlined.
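
The abstract names the four scaling methods compared but does not reproduce their computations. As a rough illustration only, the following Python sketch computes scale values in the spirit of Dunn-Rankin's variance-stable rank sums; it covers only the normalization of rank sums, not the significance tests of rank-sum differences that the full method includes, and the data, column names and function name are hypothetical rather than taken from the study.

# Minimal sketch: rank-sum scaling in the spirit of Dunn-Rankin's
# variance-stable rank sums (illustrative, not the authors' implementation).
# Assumption: every respondent ranks every stimulus, and a higher rank
# means a higher training priority.
import pandas as pd

def rank_sum_scale(rankings: pd.DataFrame) -> pd.Series:
    """Rows = respondents, columns = stimuli (training needs),
    cells = rank assigned (1 = lowest priority, k = highest).
    Returns scale values from 0 to 100; higher = higher priority."""
    n_judges, k = rankings.shape
    rank_sums = rankings.sum(axis=0)           # total rank per stimulus
    min_sum, max_sum = n_judges, n_judges * k  # theoretical bounds
    return 100 * (rank_sums - min_sum) / (max_sum - min_sum)

# Hypothetical data: 5 respondents each rank 4 illustrative training needs
data = pd.DataFrame({
    "IT skills":  [4, 3, 4, 4, 3],
    "Languages":  [3, 4, 3, 2, 4],
    "Management": [2, 2, 1, 3, 2],
    "First aid":  [1, 1, 2, 1, 1],
})
print(rank_sum_scale(data).sort_values(ascending=False))

Sorting the resulting 0-100 scale values from highest to lowest yields a prioritized list of needs, which is the kind of ordering on which the four methods in the study are compared.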

CITED BY

  • "Participants had to prioritize a maximum of 10 training needs from the knowledge area(s) in which they were interested; on the basis of this information, a decision would be made on which training needs to address in the 2007 Training Program. Once the completed questionnaires had been collected, they were coded and, using the Dunn-Rankin scaling method (Sanduvete et al., 2009), the technical team obtained the list of prioritized needs and produced a report documenting the procedure followed and the results obtained."
    ABSTRACT: In the field of education, either specific or standard models of quality management are usually implemented. Both alternatives have their advantages and disadvantages, but most educational organizations prefer to implement standard models, which usually involve some mismatches during their implementation. To address this situation, we propose an instrument that, when applied during the evaluation processes of these organizations, facilitates the specification and improvement of the referents (participation, usefulness, and transparency). This creates a context that enhances the successful application of standard models. The implementation of the instrument is described for the needs assessment process of the Training Program Department in an Andalusian sports organization. Results are analyzed using key indicators before and after applying the instrument. Lastly, potential advantages are discussed.
    Article · Jan 2015 · Revista Internacional de Medicina y Ciencias de la Actividad Fisica y del Deporte
  • ABSTRACT: The approach to intervention programs varies depending on the methodological perspective adopted. This means that health professionals lack clear guidelines regarding how best to proceed, and it hinders the accumulation of knowledge. The aim of this paper is to set out the essential and common aspects that should be included in any program evaluation report, thereby providing a useful guide for the professional regardless of the procedural approach used. Furthermore, the paper seeks to integrate the different methodologies and illustrate their complementarity, this being a key aspect in terms of real intervention contexts, which are constantly changing. The aspects to be included are presented in relation to the main stages of the evaluation process: needs, objectives and design (prior to the intervention), implementation (during the intervention), and outcomes (after the intervention). For each of these stages the paper describes the elements on which decisions should be based, highlighting the role of empirical evidence gathered through the application of instruments to defined samples and according to a given procedure.
    Article · Jan 2013 · International Journal of Clinical and Health Psychology
  • ABSTRACT: Background: Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Method: Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. Results: We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. Conclusions: We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
    Article · Feb 2014 · Psicothema