Graduate preparation in research methods is needed to help ensure that the next generation of psychologists is prepared to consume and engage in research. This study examined the availability of courses in research methods in 192 American Psychological Association (APA)-accredited programs based on reports from program directors in clinical, counseling, school, and combined psychology programs. Results suggest that, although most doctoral-level psychology programs require introductory methods courses, the requirement to take more advanced courses in research methods is less common. Although many programs offer advanced methods courses as electives, fewer than 10% of program directors believe additional courses are needed. Among the areas of specialization, significant differences in required coursework in research methods were found only for factor analysis, which was required most by school psychology programs, followed by clinical psychology and then counseling psychology. In addition, PhD and PsyD programs generally do not differ in requiring coursework in research methods. Data from this study reflect a significant improvement in course offerings in research methods during the last two decades. Implications of these findings are discussed. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
ABSTRACT: Recent articles in The Journal for Specialists in Group Work have discussed credibility indicators for quantitative and qualitative studies (Asner-Self, 2009; Rubel & Villalba, 2009). This article builds on these contributions by discussing measurement issues relevant to producers and consumers of quantitative group research. The article is necessary because measurement quality is directly associated with research credibility. The topics of reliability and validity, along with credibility indicators for measures, are discussed, followed by a description of the statistical assumption of independent measurements in relation to group research. Implications for research and practice are provided.
The Journal for Specialists in Group Work 10/2010; 35(4):331-348. DOI:10.1080/01933922.2010.514978
ABSTRACT: To explore which factors doctor of psychology (PsyD) students feel are important to consider when selecting a PsyD program.
This article analyzes the survey responses of 394 enrolled PsyD students and 17 directors of clinical training (DCTs), who rated the importance of 18 factors in program selection, in order to understand what qualities PsyD students and DCTs value in a PsyD program. Students were also asked to assess how their program fared on the same 18 dimensions.
Results indicated that participants rated the program's structure, tone, and reputed quality of training as the most important factors in program selection (Ms = 4.13 to 4.54 on a 5-point scale). Additionally, students rated their current program as high in quality on the same factors that they felt were most important in program selection (rs = .15 to .37).
PsyD students rated a program's structure, tone, and reputation as particularly important factors to consider in selecting a program. Students' quality ratings were used to determine the top 5 programs for each of the factors assessed in the study.
ABSTRACT: While quantitative methodologists advance statistical theory and refine statistical methods, substantive researchers resist adopting many of these statistical innovations. Traditional explanations for this resistance are reviewed: a lack of awareness of statistical developments, the failure of journal editors to mandate change, publish-or-perish pressures, the unavailability of user-friendly software, inadequate education in statistics, and psychological factors. Resistance is reconsidered in light of the complexity of modern statistical methods and a communication gap between substantive researchers and quantitative methodologists. The concept of a Maven is introduced as a means to bridge the communication gap. On the basis of this review and reconsideration, recommendations are made to improve communication of statistical innovations. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Erin L. Woodhead, Erin E. Emery-Tiburcio, Nancy A. Pachana, Theresa L. Scott, Candace A. Konnert, Barry A. Edelstein