Article

Métodos de escalamiento aplicados a la priorización de necesidades de formación en organizaciones [Scaling methods applied to the prioritization of training needs in organizations]

Psicothema, ISSN 0214-9915, Vol. 21, No. 4, 2009, pp. 509-514.

ABSTRACT: The criteria used to prioritize the needs that justify the training actions to be implemented are rarely made explicit in advance in continuing-training programs in organizational settings. This paper proposes scaling methods as a feasible and useful procedure for identifying explicit criteria for prioritizing needs, and determines which of these methods is most appropriate in this intervention context. A total of 404 employees of a public organization completed an ad hoc questionnaire to prioritize training needs in different areas over the period 2004 to 2006; specifically, 117, 75, and 286 stimuli were ranked, respectively. The orderings obtained with four scaling methods were computed and compared: the Dunn-Rankin method and three methods derived from Thurstone's Law of Categorical Judgment, namely rank ordering, successive intervals, and apparently equal intervals. The results confirm the feasibility and usefulness of these scaling methods for solving the problems posed; among the most precise methods, the rank-ordering method is recommended for its parsimony (i.e., the simplicity of its procedure); and future lines of action are outlined.
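
The methods compared above can be made concrete with a short computational sketch. The following Python code is not taken from the paper; the data matrix, function names, and the two simplified procedures (a Dunn-Rankin-style variance-stable rank-sum score and a Thurstone-style rank-order scaling via pairwise proportions and normal deviates) are illustrative assumptions, intended only to show the kind of priority ordering each family of methods produces.

```python
import numpy as np
from scipy.stats import norm


def dunn_rankin_scores(rankings):
    """Rank-sum scores in 0-100, in the spirit of Dunn-Rankin's variance-stable rank sums.

    `rankings` is a (respondents x stimuli) array of ranks, where 1 = highest priority.
    A score of 100 means every respondent ranked that stimulus first.
    """
    n_resp, n_stim = rankings.shape
    reversed_ranks = (n_stim + 1) - rankings      # larger value = more urgent
    rank_sums = reversed_ranks.sum(axis=0)        # sum over respondents
    return 100.0 * rank_sums / (n_resp * n_stim)


def thurstone_rank_order(rankings, eps=0.005):
    """Thurstone-style scale values derived from rank data.

    Pairwise proportions "i ranked above j" are converted to standard normal
    deviates and averaged per stimulus (a Case V-type simplification).
    """
    n_resp, n_stim = rankings.shape
    p = np.zeros((n_stim, n_stim))
    for i in range(n_stim):
        for j in range(n_stim):
            if i != j:
                # proportion of respondents who ranked stimulus i above stimulus j
                p[i, j] = np.mean(rankings[:, i] < rankings[:, j])
    p = np.clip(p, eps, 1 - eps)                  # avoid infinite z-scores at 0 or 1
    z = norm.ppf(p)
    np.fill_diagonal(z, 0.0)
    return z.sum(axis=1) / (n_stim - 1)           # mean deviate per stimulus


# Hypothetical data: 4 respondents rank 5 training needs (1 = most urgent).
rankings = np.array([
    [1, 3, 2, 5, 4],
    [2, 1, 3, 4, 5],
    [1, 2, 4, 3, 5],
    [3, 1, 2, 5, 4],
])
print(np.argsort(-dunn_rankin_scores(rankings)))    # priority order, rank-sum method
print(np.argsort(-thurstone_rank_order(rankings)))  # priority order, Thurstone-style method
```

On rank data such as this toy matrix, both functions yield a priority ordering of the stimuli; comparing such orderings across methods is, in simplified form, the kind of analysis the study reports.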

  • ABSTRACT: The approach to intervention programs varies depending on the methodological perspective adopted. This means that health professionals lack clear guidelines regarding how best to proceed, and it hinders the accumulation of knowledge. The aim of this paper is to set out the essential and common aspects that should be included in any program evaluation report, thereby providing a useful guide for the professional regardless of the procedural approach used. Furthermore, the paper seeks to integrate the different methodologies and illustrate their complementarity, this being a key aspect in terms of real intervention contexts, which are constantly changing. The aspects to be included are presented in relation to the main stages of the evaluation process: needs, objectives and design (prior to the intervention), implementation (during the intervention), and outcomes (after the intervention). For each of these stages the paper describes the elements on which decisions should be based, highlighting the role of empirical evidence gathered through the application of instruments to defined samples and according to a given procedure.
    International Journal of Clinical and Health Psychology, 2013; 13: 58-66. DOI: 10.1016/S1697-2600(13)70008-5
  • ABSTRACT: Background: Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Method: Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. Results: We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. Conclusions: We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
    Psicothema, 2014; 26(1): 91-96. DOI: 10.7334/psicothema2013.144
