A Literature Review on
Collaborative Problem Solving
for College and Workforce Readiness
April 2017
GRE® Board Research Report No. GRE-17-03
ETS Research Report No. RR-17-06
María Elena Oliveri
René Lawless
Hillary Molloy
The report presents the findings of a research
project funded by and carried out under the
auspices of the Graduate Record Examinations
Board.
∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗
Researchers are encouraged to express freely their
professional judgment. Therefore, points of view or opinions
stated in Graduate Record Examinations Board reports do not
necessarily represent official Graduate Record Examinations
Board position or policy.
The Graduate Record Examinations and ETS are dedicated to
the principle of equal opportunity, and their programs, services,
and employment policies are guided by that principle.
∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗
As part of its educational and social mission and in fulfilling
the organization's nonprofit Charter and Bylaws, ETS has and
continues to learn from and also to lead research that furthers
educational and measurement research to advance quality and
equity in education and assessment for all users of the
organization's products and services.
GRE-ETS
PO Box 6000
Princeton, NJ 08541-6000
USA
To obtain more information about GRE
programs and services, use one of the following:
Phone: 1-866-473-4373
(U.S., U.S. Territories,* and Canada)
1-609-771-7670
(all other locations)
Web site: www.gre.org
*American Samoa, Guam, Puerto Rico, and U.S. Virgin Islands
GRE Board Research Report Series and ETS Research Report Series ISSN 2330-8516
RESEARCH REPORT
A Literature Review on Collaborative Problem Solving
for College and Workforce Readiness
María Elena Oliveri, René Lawless, & Hillary Molloy
Educational Testing Service, Princeton, NJ
The literature, along with employee and workforce surveys, ranks collaborative problem solving (CPS) among the top five most critical skills
necessary for success in college and the workforce. This paper provides a review of the literature on CPS and related terms, including a
discussion of their definitions, importance to higher education and workforce success, and considerations relevant to their assessment.
The goal is to discuss progress on CPS to date and to help generate future research on CPS in relation to educational and workforce success.
Keywords Collaborative problem solving; college; higher education; workforce readiness; assessment; noncognitive
doi:10.1002/ets2.12133
Collaborative problem solving (CPS) skills are identified as important for daily life, work, and schooling in the 21st
century. The importance of CPS is best captured in Hutchins's (1995) book, Cognition in the Wild, in which the author high-
lighted the importance of collaboration by asking his readers to scan their immediate environment to identify objects that
were not produced collaboratively. Hutchins remarked that in so doing, he was able to identify only the pebble
on his desk. All other objects exemplified teamwork production. Products stemming from collaboration are everywhere.
Their production ranges across fields including entertainment, health, nutrition, engineering, and housing, providing
impressive evidence of the need for collaboration for progress and development. Yet, as observed by Hesse, Care, Buder,
Sassenberg, and Griffin (2015), collaborative skills are neither formally taught nor assessed. Despite the clear need for
such skills for workforce readiness, the assessment of 21st-century skills such as CPS lags behind.
In this review paper, we discuss advances related to the conceptualization, assessment, and validity considerations
of CPS. We provide a construct model that defines CPS in general and for higher education in particular. We also describe
advances in identifying the knowledge, skills, and abilities (KSAs) that reveal higher levels of competency in CPS. Finally, we discuss
validity evidence (advances and ideas for future research). We aim, through these discussions, to inform current and
future efforts in designing, interpreting, and validating CPS assessments.
The first goal is to provide an overview of existing CPS definitions, conceptual frameworks or taxonomies, and measurement
efforts conducted to date. The second goal is to inform an operational definition of CPS and identify considerations needed
for developing a prototype of a CPS assessment. In our review, we cast a wide net for the terminology related to CPS and
teamwork. We examined both theoretical and empirical literature, though there is a paucity of the latter.
An overarching definition of CPS was offered by Kyllonen (2012). In his organizational review of the various tax-
onomies and classifications of diverse 21st-century skills (e.g., Assessment and Teaching of 21st Century Skills [http://
www.atc21s.org], Partnership for 21st Century Skills [2012], and National Research Council [2010]), he defined CPS as
"a performance activity requiring groups of students to work together to solve problems" (Kyllonen, 2012, p. 16). Another
definition is provided in Hesse et al. (2015), where CPS was described as "a joint activity where dyads or small groups
[interact to] execute a number of steps in order to transform a current state into a desired goal state" (p. 39). However,
finer-grained definitions are needed if we want to develop tasks that assess the critical elements of CPS. At this more
detailed level, we find a lack of coherence with respect to how teamwork and CPS are defined and what behaviors comprise
them. CPS is not a well-defined construct, and efforts to define it are further complicated because researchers typically dif-
fer in their definitions of CPS and the elements that comprise it. Researchers have often ignored existing definitions of
skills and have not systematically built on previous efforts or definitions of skills provided by other researchers (Cannon-
Bowers, Tannenbaum, Salas, & Volpe, 1995).¹ For example, researchers have used different terminology to describe the
Corresponding author: M. E. Oliveri, E-mail: moliveri@ets.org
GRE Board Research Report No. 17-03 and ETS Research Report Series No. RR-17-06. © 2017 Educational Testing Service
same construct. Terms such as group decision making, team cognition, teamwork, group work, small group problem solving,
cooperative learning, collaborative learning, and team collaboration have all been used interchangeably (O'Neil, Chuang,
& Chung, 2004). Moreover, researchers have often used similar labels to refer to different skills.
Literature Review
To gain insights into CPS definitions, including its constituent elements, we conducted a multidisciplinary and transor-
ganizational review of higher education and workforce frameworks, spanning the fields of education, sciences, business, and
management. Occupational literature was also reviewed, especially the literature concerning the military, as much of the
research surrounding CPS originated within this particular domain (Cannon-Bowers & Salas, 1997; Dickinson & McIn-
tyre, 1997; Fiore et al., 2010), in which the use of teams to perform complex work functions is pervasive (Shanahan, Best,
Finch, & Sutton, 2007). We included higher education and workforce readiness frameworks because the former often serves
as preparatory ground for the latter. Each of these various sources presented information at different levels of specificity
and detail. For instance, the theoretical literature tended to be descriptive as compared to both the surveys and learning
outcomes frameworks, which described CPS skills more globally. We evaluated frameworks and assessments developed
for a population of students who were at the end of compulsory education (15 years old and up), spanned the education
and workforce domains, and applied to both national and international contexts.
There is a voluminous literature on CPS spanning decades of research conducted by both individual researchers and
organizations. Stevens and Campion (1994), who conducted early research in identifying specific KSAs associated with
teamwork, provided one of the earliest frameworks on CPS. In their framework, they outlined the KSAs needed to recognize
the obstacles to collaborative group problem solving and to utilize the proper degree and type of participation, in con-
trast to the identification of global personality traits (e.g., helpfulness, supportiveness) as researchers had done previously
(p. 505). Additionally, the authors focused on the individual's role in successful team performances rather than focus-
ing on the group or the team as the unit of analysis, which was critical to advance the development of an assessment
that could provide information about individual test takers. Finally, the authors identified transportable teamwork com-
petencies (conflict resolution, CPS, communication, goal setting and performance management, and planning and task
coordination) believed to be generalizable across contexts.
The framework by Cannon-Bowers et al. (1995) expanded this work and made two important contributions. First,
the authors expanded the set of five transportable competencies suggested by Stevens and Campion (1994) to eight, with
the argument that the additional competencies are also central to CPS. The eight competencies include (a) adaptability,
(b) shared understanding of the situation, (c) performance monitoring and feedback, (d) leadership, (e) interpersonal
relations, (f) coordination, (g) communication, and (h) decision making. Cannon-Bowers et al. further explicated dis-
tinctions between team-generic and team-specific competencies, as well as task-generic and task-specific competencies.
Team-generic competencies are those that an individual possesses that can be used in any group work, regardless of its
composition, such as communication skills, leadership skills, and attitudes toward teamwork. Team-specific competen-
cies, conversely, are those tied to the composition of the group, such as "knowledge of teammates' characteristics, specific
compensation strategies, and team cohesion" (Cannon-Bowers et al., 1995, p. 338). Similarly, task-specific competencies
are KSAs that are relevant to a specific task (e.g., an understanding of the specific roles and responsibilities of each team-
mate), whereas task-generic competencies are those that an individual uses across a variety of tasks (e.g., general planning
skills). Competencies deemed "transportable" are those that are both task-generic and team-generic. Because we are looking
to develop an assessment that can predict a test taker's ability to collaborate with a wide variety of teams across an array of
contexts, transportable competencies are the ones we aim to target in the development of a CPS measure for higher
education.
Several research teams further expanded the abovementioned frameworks and elaborated on the descriptions of
the skills they comprise. For example, the notion of a shared mental model as being central to team performance was
included (Aguado, Rico, Sánchez-Manzanares, & Salas, 2014; Cooke, Salas, Cannon-Bowers, & Stout, 2000; Klimoski
& Mohammed, 1994; Mathieu, Heffner, Goodwin, Salas, & Cannon-Bowers, 2000). Processes related to shared mental
models are thought to enable team members to have common ways of interpreting cues, allocating resources, and making
compatible decisions (Cooke et al., 2000). Several scholars (Aguado et al., 2014; Cannon-Bowers et al., 1995; Zaccaro,
Rittman, & Marks, 2001) reiterated the importance of including team leadership in a conceptual framework. Zaccaro et al.
described how it is composed of four superordinate components: (a) information search and structuring, (b) information
use in problem solving, (c) managing personnel resources, and (d) managing material resources. Recognizing that CPS
is a dynamic process, Marks, Mathieu, and Zaccaro (2001) suggested including a temporal aspect in teamwork and
collaboration and indicated that collaborative actions occur along a series of simultaneous tasks, including creating team
goals and strategies, monitoring and coordinating the actions of teammates, and managing tasks occurring over time.
Similar to Stevens and Campion (1994), M. M. Chiu's (2000) taxonomy explained collaboration and group problem-
solving processes in detail, providing a focus not only on social interactions but also on the individual's role within the
team. Chiu's research also provided exemplars of directly measurable behaviors and is hence of importance to assessment
development. Examples of the kinds of social interactions that students engage in when problem solving include (a)
taking turns and working cooperatively; (b) providing tangible demonstrations of solutions to problems; and (c) listening
attentively, being responsive to each other's input, and connecting ideas with each other through full participation.
Beyond the aforementioned taxonomies developed by individual researchers, we also reviewed taxonomies that
organizations developed for the assessment of CPS within a large-scale assessment context. These included
higher education and workforce frameworks. Tables 1 and 2 provide a summary of the higher education and workforce
frameworks we reviewed, respectively. One example is the framework developed for the Programme for International
Student Assessment (PISA) CPS assessment commissioned by the Organisation for Economic Co-operation and Devel-
opment (OECD). The OECD (2013b) proposed that CPS comprises three competencies: (a) establishing and maintaining
shared understanding, (b) taking appropriate action to solve a problem, and (c) establishing and maintaining team orga-
nization. Table 3 presents frameworks that span the higher education and workforce contexts.
The goal of the review of the frameworks summarized in Tables 1–3 and the empirical research on CPS was to develop
an operational definition of CPS that would be applicable to the higher education and workforce contexts and that would
contain overlapping KSAs across contexts. We used four sources of evidence. First, we used an iterative approach that
consisted of reviewing the higher education student learning outcomes literature and noting the skills and behaviors that
required working with groups in various capacities (e.g., collaboration, group communications, or group projects). We
reviewed journal articles, book chapters, and books and identified the various skills and behaviors that were suggested
to play a role in successful team interactions while conducting various activities or projects. Our review of the literature
yielded extensive lists of skills and behaviors that we summarize below.
Second, this goal of achieving an operational definition was informed by consultation with experts, including research
scientists, test developers with expertise in international populations, and psychometricians. Our collaborators brought
expertise in evidence-centered design, design spaces, assessment strategies, prototyping, the development of CPS items
for PISA, and the development of items measuring noncognitive constructs. They also had expertise with various
item types used to assess noncognitive constructs (e.g., forced-choice items and ipsative and quasi-ipsative measures) in the
context of CPS with human and virtual agents. Together, we engaged in an iterative process in which five experts reviewed
various versions of the proposed model and gave feedback, which led to at least three revision cycles with modifications
and updates at each stage.
Third, we attended the Innovative Assessment of Collaboration 2014 conference, which brought together researchers
and international experts in organizational teaming, educational collaboration, tutoring, simulation, gaming, and statis-
tical and psychometric process modeling for insight into the development of reliable and valid collaborative assessments.
The meetings focused on team performances in organizations, simulated teamwork environments, and CPS in educational
settings (ETS, 2014b).
e nal source of evidence we relied on was the Council for the Advancement of Standards in Higher Education (CAS;
Strayhorn, 2006), which was developed to guide higher education professionals in the assessment of 16 student learning
and development outcomes, such as leadership development and eective communication. As summarized in Table 1, the
CAS standards provide denitions of CPS, among other learning outcomes; describe the theoretical underpinnings of the
constructs; provide a list of related variables; and give suggestions for how to measure these outcomes. CPS was named as
one of the critical learning outcomes in the CAS.
An Operational Definition of CPS
Figure 1 illustrates our proposed definition of CPS. We suggest that CPS comprises four components: teamwork, commu-
nication, leadership, and problem solving. Each has multiple associated skills. Gradations (examples of KSAs that can be
demonstrated at various proficiency levels for some of the constituent elements such as teamwork and communication)
Table 1 Higher Education Frameworks

The Council for the Advancement of Standards in Higher Education (CAS)
Who provided input: Delegates from 36 higher education associations, student affairs offices, and nonprofits in the United States and Canada
What was done: Developed professional practice standards for undergraduate and master's level students; provided outcomes assessment tools to educators wanting to assess student learning development
Method: NS
Intended outcomes: Help guide higher education administrators and institutions that work with college students and master's-level students to promote the assessment of students' growth, goals, and behaviors

Framework for 21st Century Learning: Communication/Collaboration, Critical Thinking, & Problem Solving
Who provided input: Teachers, education experts, business leaders
What was done: Defined the KSAs students need in work and life and the support systems that will enable this
Method: NS
Intended outcomes: Help practitioners integrate these skills into core academic subjects

The Lumina Foundation's Degree Qualifications Profile
Who provided input: Four original authors, expert reviewers, faculty
What was done: Developed reference points for what students should know and be able to do for associate's, bachelor's, and master's degrees
Method: NS
Intended outcomes: Inform students on knowledge and abilities required for earning undergraduate or master's degrees

Liberal Education and America's Promise (LEAP)
Who provided input: Educators from hundreds of colleges and universities, employers
What was done: Described the kinds of learning that students need from college and how to help them achieve it
Method: Expert opinions, business reports, accreditation requirements
Intended outcomes: Educate students and the public about quality liberal education outcomes and advocate their achievement by college students

Tuning Project
Who provided input: Original authors, over 100 universities
What was done: Defined commonly accepted professional and learning outcomes and generic & subject-specific competencies
Method: Discussions; benchmark papers; questionnaires; online websites
Intended outcomes: Create transparency in educational structure and promote further innovation through communication and identification of good practices

The Assessment and Teaching of 21st Century Skills (ATC21S)
Who provided input: 250 researchers from around the world
What was done: Defined 21st century skills (communication, collaboration, problem-solving, digital and ICT literacy); developed ways to transform the teaching, learning, and assessment of these skills
Method: White papers from each working group
Intended outcomes: Help prepare entry-level workers with practical skills to "create, build, and help sustain information-rich business" by providing education systems with curricular recommendations, innovative assessments, and teaching–learning resources

Note. NS = not specified.
Table 2 Workforce Frameworks

O*NET Competencies Framework
Who provided input: Employees, job analysts
What was done: Described the KSAs, interests, and general work activities associated with each occupation
Method: Survey, in-depth interviews
Intended outcomes: Public knowledge and dissemination

Workforce Readiness Project
Who provided input: 431 employers, 12 HR executives
What was done: Determined the corporate perspective on how ready entry-level workers in the United States are, by level of educational attainment
Method: Survey, in-depth interviews with 12 HR executives
Intended outcomes: Encourage the business community, educators, policy makers, and students to augment workforce readiness preparation by providing a definition of workforce readiness

Hewlett Foundation's 21st Century Competencies
Who provided input: A wide range of literature was summarized, including literature on human capital and work readiness
What was done: Determined which worker competencies are most important for the 21st century and how possessing them makes a difference in educational and economic outcomes for individuals and employers
Method: Literature review
Intended outcomes: Promote the development of programs and systems that will assist in acquiring and assessing skills that are critical to the economy

ETA Competency Model Clearinghouse's General Competency Model Framework
Who provided input: Industry associations, labor organizations, educators, and I/O psychology and subject matter experts
What was done: Developed a model describing competencies on nine tiers, hierarchically organized into three groupings: foundational, industry-related, and occupation-related competencies
Method: NS
Intended outcomes: Inform and support the education and training of a competitive workforce by determining the necessary requirements of workers based on business and employer needs

DOL-ETA SCANS Report
Who provided input: Job experts, occupational analysts
What was done: Determined the major types of skills required to enter employment, demonstrated the level of importance of the skills in sample jobs, and identified specific job tasks that illustrated the use of the skills
Method: Expert opinions, literature review, analysis of 50 jobs, structured interviews
Intended outcomes: Help make high school courses contain information relevant to the workforce; help employers make sure their workers have relevant skills

Note. NS = not specified.
Table 3 Combined Higher Education and Workforce Frameworks

Council of Graduate Schools
Who provided input: University leaders, business leaders from nonprofit and for-profit sectors, Council of Graduate Schools, ETS
What was done: Described the link between graduate education and the workforce, described the professional skills that employers seek and currently do not find in entry-level graduate students, and made a call to strengthen the link
Method: Expert input, interviews with employers, survey of graduate deans, survey of graduate students
Intended outcomes: Help develop professional skills relevant to students' careers in graduate school by outlining the roles of stakeholders such as universities, employers, and policy holders and providing recommendations to each of these groups on how they can clarify career pathways for graduate students

AAC&U/HART Research Associates
Who provided input: 400 employers with 25+ employees; 613 college students
What was done: Determined employers' and college students' perspectives on (a) which learning outcomes are most important to succeed in today's economy, (b) how prepared college graduates are, and (c) the importance of project-based learning in college
Method: Online surveys
Intended outcomes: NS

National Research Council's 21st Century Skills
Who provided input: Council committee members
What was done: Defined and summarized the literature on 21st century skills, including (a) how these skills relate to traditional academic skills, (b) the importance of these skills to educational and work success, and (c) how these skills can be taught, assessed, and learned
Method: Research literature review from several disciplines, NRC workshops, other key papers
Intended outcomes: Educate the public on the research on teaching/learning of key skills needed in the 21st century

Note. NS = not specified.
Figure 1 e specic skills associated with each of the four components of our proposed higher education readiness CPS taxonomy.
contextualized in higher education have been recently developed by Lench, Fukuda, and Anderson (2015). e existing
gradations inform us of what types of behaviors are associated with each component for various levels of the components.
e authors suggested that individuals at every age level may fall along dierent stages of the developmental continuum
depending on life experiences and learning paths. In the report, the authors also provided specic ways in which vari-
ous skills map onto specic higher education activities and processes. Similarly, we also map higher education processes
to specic behaviors in our operational denition to elucidate the connection between the behaviors in our proposed
denition and the higher education and workforce preparatory activities students are likely to encounter.
A Description of the Proposed Constituent Elements
In what follows, we describe each component of our proposed denition. As mentioned, each component is aligned with
the denitions from and connections to existing higher education student learning outcomes frameworks and existing
assessments of CPS, particularly those that focus on transportable competencies.
Teamwork
Teamwork consists of five main skills. These include processes related to promoting (a) team cohesion, (b) team empow-
erment, (c) team learning, (d) self-management and self-leadership, and (e) attitudes of open-mindedness, adaptability,
and flexibility.
Team Cohesion
Team cohesion involves having increased knowledge of teammates and the activities they find enjoyable, and it is an
important aspect of team performance (Salisbury, Carte, & Chidambaram, 2006). Although its transportability is debated
by dierent frameworks (Aguado et al., 2014), certain team cohesion behaviors are relevant to the higher education and
workforce contexts, such as encouraging team members’ contributions in meetings and understanding group dynamics
while carrying out group projects (Association of American Colleges and Universities [AAC&U], 2011; Strayhorn, 2006).
Team Empowerment
Team empowerment refers to the ability to be committed to one's team and to empower team members by
challenging their opinions and motivating them to take on additional challenges and interact with others to maximize
their strengths (Cannon-Bowers, Tannenbaum, Salas, & Volpe, 1995; Dickinson & McIntyre, 1997; Marks et al., 2001). In
higher education and the workforce, team empowerment includes motivating and inspiring action in others, expressing
confidence in the assigned task and the team's ability to accomplish a goal, contributing to the achievement of a com-
mon goal of the group, and leveraging the strengths of others to achieve common goals (AAC&U, 2011; Casner-Lotto &
Barrington, 2006; González & Wagenaar, 2003; Markle, Brenneman, Jackson, Burrus, & Robbins, 2013).
Team Learning
Kukenberger, Mathieu, and Ruddy (2015) defined team learning as the extent to which one's knowledge and ability have
increased as the result of being a team member. Kostopoulos, Spanos, and Prastacos (2013, p. 1453) described it as "a
collective property that, although it builds on, cannot be reduced to, individual contributions." Evidence of team learning
might include reflective statements provided by students at the end of a group project when they are asked to summarize
their contribution to the completion of the team assignment.
Self-Management and Self-Leadership
Rousseau and Aubé (2010) defined team self-managing behaviors as those in which team members collectively distribute
tasks. Each person is responsible for taking care of the planning and execution of their assigned tasks to meet the goals
of the team (Burrus, Jackson, Xi, & Steinberg, 2013; Ennis, 2008; Markle et al., 2013). Team members are responsible
for monitoring their own performances during this process. They are also responsible for making any adjustments to
their plans if obstacles are encountered. Self-leadership involves the monitoring of one's performance on a task as well
as demonstrating one's influence on the team goals to achieve success (AAC&U, 2011; González & Wagenaar, 2003).
An example of a behavior associated with self-management and self-leadership is monitoring one's
performance, for example, by looking for one's own improvements to increase personal or group efficacy (AAC&U,
2011; Burrus et al., 2013). In the context of higher education and the workforce, manifestations of these skills may involve
ensuring that one's assignments are all completed on time in a thorough and comprehensive way to advance a team project
(AAC&U, 2011; Adelman, Ewell, Gaston, & Schneider, 2014).
Open-Mindedness, Adaptability, and Flexibility
A vital part of teamwork is the ability to be open-minded, adaptable, and flexible in order
to enable the best performance of the team. Behaviors that fall within these three skills include being open-minded in
order to incorporate others' ideas and feedback into the group's project; being responsive to diverse perspectives, ideas,
and values; working for and accepting necessary compromises as a team member; adapting one's behavior
to increase its suitability for others; being flexible, which includes the ability to change one's behavior or thinking when
conditions change; and accepting ambiguity (Dickinson & McIntyre, 1997; Hesse et al., 2015; OECD, 2013b).
Across higher education and workforce contexts, open-mindedness and adaptability were viewed to matter in relation to
being open and responsive to diverse perspectives, ideas, and values; seeking and providing constructive feedback; and
identifying the rationale behind teammates' opinions and perspectives (AAC&U, 2011; Markle et al., 2013). The expres-
sion of flexibility involves the willingness to make necessary compromises to accomplish a common goal (Casner-Lotto
& Barrington, 2006; Ennis, 2008).
Communication
A second constituent element of CPS is communication. Two skills are central to communication: (a) active listening and
(b) information exchange.
Active Listening
Active listening serves to ensure that a message is received as stated (Stevens & Campion, 1994). It involves the ability
to interpret others' nonverbal cues, to listen nonjudgmentally, to probe the speaker for clarifying information, and to
give others the opportunity to speak (Cannon-Bowers et al., 1995; Dickinson & McIntyre, 1997; Fiore et al., 2010). It also
involves giving interlocutors full attention as they speak, making a conscious effort to understand the speaker's points,
posing appropriate follow-up questions, and refraining from inappropriate interruptions. These behaviors were identified
as important in both higher education and workforce frameworks (Burrus et al., 2013; Strayhorn, 2006). Moreover, as
a testament to the value of active listening skills, Woolley, Chabris, Pentland, Hashmi, and Malone (2010) found that
conversational turn-taking was important to team success; that is, teams performed better across a variety
of tasks when team members were given equal opportunities to speak than when a few select individuals dominated the
conversations.
Exchanging Information
Exchanging information includes the ability to send congruent messages that are clear, accurate, and validating. The exchanged messages are transactional rather than personal, which distinguishes this skill from the communicative aspect of team cohesion. The competent individual is able to provide relevant project updates to teammates (Fiore et al., 2010; Marks et al., 2001; OECD, 2013b). During the exchange of information, individuals take turns to ensure that team members are given the opportunity to speak. This kind of interaction not only includes prompting and responding to what others say but also helps ensure that nonverbal cues agree with verbal cues.
The need to exchange information was mentioned as a central aspect of CPS across workforce and higher education frameworks. González and Wagenaar (2003) stated that it is necessary to communicate with nonexperts in one's field as well as to communicate effectively with all team members to achieve the goals and objectives of the team. For example, exchanging information is necessary to keep parties informed of progress and of any changes to a given project so that project timelines are met (Ennis, 2008). Strayhorn (2006) suggested that in higher education students need to articulate abstract ideas effectively. The frameworks by the International Society for Technology in Education (ISTE, n.d.) and the Partnership for 21st Century Skills (2012) stated that students should be able to communicate ideas and information effectively and articulate thoughts and ideas to multiple audiences using diverse media and formats.
Leadership
We suggest that five skills are relevant to leadership within the context of higher education and workforce readiness: (a) organizing activities and resources, (b) monitoring performance, (c) reorganizing when faced with obstacles, (d) resolving conflicts, and (e) demonstrating transformational leadership.
Organizing Activities and Resources and Performance Monitoring
Kozlowski and Ilgen (2006) indicated that organizing activities and resources and monitoring performance are central to setting and meeting realistic team learning goals. An effective project manager is able to balance the need to advance project goals key to the group's success with the need to respect time and resource constraints (Ennis, 2008; Hesse et al., 2015; O'Neil, Chung, & Brown, 1997). This involves communicating clear roles, responsibilities, and expectations to team members and distinguishing between problems that can be solved individually and those that require a team (Dickinson & McIntyre, 1997; OECD, 2013b; Stevens & Campion, 1994). Organizing activities involves the ability to articulate a vision for the project that can be communicated to and shared by the team.
Key behaviors include clearly and effectively defining the roles and responsibilities of team members to ensure that team goals are understood, information is shared, and team members have the necessary resources to perform their assigned tasks (C.-J. Chiu et al., 2011; Marks et al., 2001; OECD, 2013b; O'Neil et al., 1997; Stevens & Campion, 1994). As suggested by the higher education and workforce frameworks, these activities involve the ability to (a) design and manage projects (González & Wagenaar, 2003); (b) plan and schedule tasks to complete assigned work on time (Ennis, 2008); (c) allocate time and resources effectively, including coordinating efforts with all parties involved (Ennis, 2008);
(d) evaluate and maintain the quality of work (González & Wagenaar, 2003); and (e) keep track of details to ensure work is performed accurately and completely (Ennis, 2008).
Reacting to Obstacles and Resolving Conicts
The ability to resolve conflicts involves reacting to or anticipating obstacles preemptively (AAC&U, 2011; Hesse et al., 2015; Marks et al., 2001; O'Neil et al., 1997). It also involves identifying and correcting gaps, errors, or misunderstandings that arise during the execution of tasks (OECD, 2013b). Ideally, conflict is confronted and win–win strategies are used to resolve it directly and constructively. Instead of focusing on personal gain, it may be useful to encourage solutions that benefit all team members and address common goals (AAC&U, 2011; Stevens & Campion, 1994). Alternatively, conflict can be resolved by incorporating the needs and viewpoints of all parties. In either case, insights regarding opposing or different views should be sought to hear all members' views and minimize misunderstandings (Marks et al., 2001).
In higher education, the Lumina DQP (Adelman et al., 2014) highlighted the importance of negotiating successful strategies when conducting group research. The AAC&U (2011) provided examples of such strategies, which include addressing destructive conflict directly and constructively to help manage or resolve it in ways that strengthen the team and its future effectiveness. In the workforce, employers want employees who can anticipate obstacles to project completion, develop contingency plans to address those obstacles, and take corrective action when projects go off track. Desirable employees are those who can bring others together to reconcile differences, handle conflicts maturely through a mutual give-and-take approach, and actively promote mutual goals and interests (Ennis, 2008).
Transformational Leadership
Transformational leadership describes the ability of a team member to motivate other team members to work hard to attain or exceed goals and to take on more difficult or challenging tasks. Such a person brings added value to the team. Mitchell et al. (2014) argued that positive team dynamics, receptiveness to diversity, and positive motivation are all results of effective transformational leadership.
Problem Solving
Problem solving focuses on the creation of strategies to answer a given problem, dilemma, or open-ended question. It is defined as "the process of designing, evaluating, and implementing a strategy to answer an open-ended question or achieve a desired goal," wherein the solution or the strategy required to arrive at a solution is not readily apparent (AAC&U, 2011, p. 1; Murray, Owen, & McGaw, 2005; OECD, 2013a). Under the PISA 2012 problem-solving framework, this process involves four overarching stages: (a) exploring and understanding, (b) representing and formulating, (c) planning and executing, and (d) monitoring and reflecting (OECD, 2013a). By comparison, the problem-solving VALUE rubric, authored by the AAC&U (2011), includes six dimensions: (a) define the problem, (b) identify the strategies, (c) propose solutions and hypotheses, (d) evaluate potential solutions, (e) implement the solutions, and (f) evaluate the outcomes. Five stages of problem solving are commonly identified in the literature: (a) identifying and defining a problem, (b) brainstorming, (c) planning, (d) interpreting and analyzing information, and (e) evaluating and implementing solutions. Although problem solving is important for higher education and workforce readiness, in alignment with CPS, we focus on the interactions of individuals during the various problem-solving stages rather than on the capability of each individual to solve problems, which is already integrated into existing higher education assessments.
Collaborative Problem Solving Assessments
Our second objective is to provide exemplars of existing CPS assessments. Table 4 illustrates a selection of existing teamwork and CPS assessments spanning fields including business, education, health and medicine, and the military. Examples include the VIEW assessment (Creative Problem Solving Group, 2013), the Teamwork-KSA Test based on the taxonomy by Stevens and Campion (1994, 1999), and the CCSSO Workplace Readiness Assessment Consortium (Grummon, 1997). These assessments measure CPS and related constructs using multiple themes, some of which overlap. For example, teamwork, communication, problem solving, and leadership were found frequently across assessments. Some dimensions or themes were found less frequently, such as situational awareness, workload management, and product quality.
Assessment Format
Table 4 lists the task types that have been used to measure CPS and related skills. They include self-assessments that use Likert scales or forced-choice options, situational judgment tests (SJTs), third-party evaluations, observation tools (e.g., behavioral checklists, audio- and videotaped observations), and the analysis of think-aloud protocols (e.g., Brannick, Prince, Prince, & Salas, 1993; Oser, McCallum, Salas, & Morgan, 1989). We elaborate on the advantages and disadvantages of various item formats later in the paper.
General Versus Domain Specific
Teamwork and CPS assessments administered in the health and medical fields are often domain specific, which is to say that they aim to assess CPS skills in highly specific contexts that require some prerequisite content knowledge. For example, the University of Texas Behavioral Marker Audit Form (Thomas, Sexton, & Helmreich, 2004) is used to assess teamwork skills during simulations of neonatal resuscitation. Another example is the Anaesthetists' Non-Technical Skills test (ANTS; Flin, Glavin, Maran, & Patey, 2012), which also takes the form of a high-fidelity simulation to assess anesthetists' CPS skills in situations closely resembling those they may encounter in the workplace.
On the other hand, VIEW (Creative Problem Solving Group, 2013), the Teamwork Competency Test (TWCT; Aguado et al., 2014), and the Teamwork-KSA Test (Stevens & Campion, 1994, 1999) were developed to assess general team competencies (i.e., transportable competencies). Similarly, PISA was developed to "use problem situations and contexts relevant to 15-year-old students that tap generalized problem-solving skills, but do not rely on specialized knowledge" (OECD, 2013b, p. 14). Transportable skills are of particular interest in a higher education assessment context, as test takers may be submitting their scores to graduate programs across a wide array of academic and professional disciplines, and such skills support the teaching and learning of competencies that are relevant when transitioning from higher education to the workforce.
Test and Scale Reliability
Some CPS assessments contain and provide scores for more than one subscale. For example, VIEW has three subscales: orientation to change, manner of processing, and ways of deciding. Each subscale has reliabilities above .80 (Treffinger et al., 2014). An advantage of using subscale scores is the ability to augment the information obtained on each dimension. A disadvantage is that subscale scores typically have lower reliability. For example, Athanasaw (2003, as cited in Aguado et al., 2014) obtained a coefficient of .66 for the complete Teamwork-KSA Test scale and reliabilities ranging between .25 and .48 for each of the five factors (subscales).
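The internal-consistency coefficients reported in these studies are typically Cronbach's alpha. As a minimal sketch of how the statistic behaves (the item matrix below is synthetic and purely illustrative, not data from VIEW or the Teamwork-KSA Test), alpha can be computed directly from an item-score matrix:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative only: three respondents answering two perfectly consistent items
consistent = np.array([[1, 1], [2, 2], [3, 3]], dtype=float)
print(round(cronbach_alpha(consistent), 2))  # perfectly consistent items give 1.0
```

Because alpha rises with the number of items (the Spearman–Brown logic), short subscales tend to show the lower coefficients noted above, while the longer full scale shows a higher one.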
Validity of Existing CPS Assessments
Our review of validity evidence for existing assessments of CPS suggested that the number and quality of studies varied widely. The evidence ranged from the number of dimensions extractable from the measures to whether the measures predicted performance over and above the aptitude and cognitive measures typically utilized in assessing team performance. Moreover, most studies were conducted in the workforce within specific contexts, as compared to the use of teamwork measures in the higher education context.
Construct Validity
Few construct validity studies on CPS measures are available. Among existing studies, two situations typically arose: (a) the empirical model did not support the hypothesized structure, and (b) a suggestion was made to revise the existing scale and assess fewer factors than originally hypothesized. For example, a study conducted by Oliveri, McCaffrey, Holtzman, and Ezzo (2014) yielded improved fit for a two-factor model rather than the originally hypothesized six-factor structure associated with the ETS® Personal Potential Index (PPI). The authors suggested assessing fewer factors in a revised scale.
Table 4 Assessments of Collaborative Problem Solving (CPS) and Related Constructs

Self-assessments are of two kinds: (a) in Likert-type self-assessments, respondents are asked about their personal preferences, beliefs, tendencies, and behaviors; and (b) in forced-choice assessments, test takers are presented with multiple statements linked to different dimensions and are asked to select the statement that is closest to the way they perceive themselves.

VIEW: An Assessment of Problem Solving Style (Creative Problem Solving Group, 2013)
  Field: Business and education
  Description: Contains 34 Likert-scale/forced-choice hybrid items to create a profile of the test taker's problem-solving style and preferences
  Delivery mode: Web-based
  Themes and topics: Orientation to change; manner of processing; ways of deciding
  Inferential level: Individual

Tailored Adaptive Personality Assessment System (TAPAS; Drasgow et al., 2012)
  Field: Military
  Description: Contains forced-choice items to assess certain noncognitive characteristics deemed valuable for entry-level soldier performance and retention. These include subcomponents of the Big Five personality traits
  Delivery mode: Web-based
  Themes and topics: Extroversion (dominance, sociability, excitement seeking); agreeableness (warmth, generosity, cooperation); conscientiousness (industriousness, order, self-control, responsibility, virtue, traditionalism, physical condition); emotional stability (adjusted, even-tempered, happy); openness (intellectual, ingenuity, tolerance, curiosity, depth, aesthetics)
  Inferential level: Individual

ETS WorkFORCE® Assessment for Job Fit (ETS, 2014a, 2014b)
  Field: Education and workforce
  Description: Computerized adaptive self-assessment with approximately 100 forced-choice items that measure six key elements of workplace success. It is based on the U.S. Army's Tailored Adaptive Personality Assessment System (TAPAS)
  Delivery mode: Web-based
  Themes and topics: Initiative and perseverance; responsibility; teamwork and citizenship; customer service orientation; problem solving and ingenuity; flexibility and resilience
  Inferential level: Individual

Teamwork Competency Test (TWCT; Aguado et al., 2014)
  Field: Education and workforce
  Description: Targets the 14 subcomponents of teamwork as outlined by the Stevens and Campion (1999) framework. It contains "36 items in a 4-point frequency scale format and drafted in 'observable behaviors' statements" (Aguado et al., 2014, p. 116)
  Delivery mode: Paper-and-pencil
  Themes and topics: Conflict resolution; collaborative problem solving; communication; goal setting and performance management; planning and task coordination
  Inferential level: Individual

Situational judgment tests (SJTs) are a type of scenario-based assessment. Test takers are presented with a particular situation by text (low fidelity) or video (high fidelity) and are asked to determine how someone should react under the presented circumstances. After being presented with the scenario, they are given a set of possible reactions and asked either to select one option or to rate the appropriateness of each option.

The Teamwork-KSA Test (Stevens & Campion, 1994, 1999)
  Field: Higher education and workforce readiness
  Description: Contains 35 multiple-response items. Students are asked to respond to how they would act across different situations potentially arising within a work team
  Delivery mode: Paper-and-pencil
  Themes and topics: Conflict resolution; collaborative problem solving; communication; goal setting and performance management; planning and task coordination
  Inferential level: Individual

Predictor Situational Judgment Test (PSJT; Waugh & Russell, 2005)
  Field: Military
  Description: Contains written scenarios with 20 items designed to improve personnel selection and classification for the U.S. Army. Assesses judgment and decision-making skills across situations commonly encountered during the first year in the Army (Knapp & Heffner, 2009)
  Delivery mode: Paper-and-pencil
  Themes and topics: Adaptability to changing conditions; relating to and supporting peers; teamwork; self-management; self-directed learning
  Inferential level: Individual

In third-party evaluations, a third party rates test takers on behaviors such as knowledge and creativity.

ETS® Personal Potential Index (PPI; ETS, 2009)
  Field: Graduate school readiness
  Description: A web-based evaluation system containing 24 Likert-type items distributed to up to five evaluators of the applicant's choosing. It is designed to supplement the Graduate Record Examination and provide a more complete picture of an applicant's graduate school readiness
  Delivery mode: Web-based
  Themes and topics: Knowledge and creativity; communication; teamwork; resilience; planning and organization; ethics and integrity
  Inferential level: Individual

Observational tools (i.e., behavioral marker systems; behavioral checklists) are used while test takers are engaged in either a real or simulated performance task to note whether (un)desirable behaviors are displayed during a given task. Raters indicate (a) whether or not a particular behavior was observed, (b) its quality, and (c) its frequency. They are typically developed based on the analysis of performances that contributed to (un)successful outcomes (Daimler-Und & Benz-Stiftung, 2001).

University of Texas Behavioral Marker Audit Form (Thomas et al., 2004)
  Field: Health and medicine
  Description: Used by trained raters while observing simulations of neonatal resuscitation (M. A. Rosen et al., 2010)
  Delivery mode: High-fidelity simulation
  Themes and topics: Information sharing; inquiry; assertion; sharing intentions; teaching; evaluation of plans; workload management; vigilance/environmental awareness; overall teamwork; leadership
  Inferential level: Group

Performance Assessment of Communication and Teamwork (PACT; C.-J. Chiu et al., 2011)
  Field: Health and medicine
  Description: Contains three observation tools: (a) novice observer form, (b) long form for experienced observers, and (c) video coding sheet to assess teamwork and communication during simulated scenarios, including capturing the frequency and quality of desired teamwork and communication behaviors
  Delivery mode: High-fidelity simulation
  Themes and topics: Team structure; leadership; situation monitoring; mutual support; communication
  Inferential level: Group

Anaesthetists' Non-Technical Skills (ANTS; Flin et al., 2012), developed by the University of Aberdeen Industrial Psychology Research Center and the Scottish Clinical Simulation Centre
  Field: Health and medicine
  Description: Measures observable nontechnical skills associated with good practice in task management, team working, situational awareness, and decision making
  Delivery mode: High-fidelity simulation
  Themes and topics: Task management; team working; situation awareness; decision making
  Inferential level: Individual

Communication and Teamwork Skills (CATS; Frankel, Gardner, Maynard, & Kelly, 2007)
  Field: Health and medicine
  Description: Observer protocol designed to assess the examinee's communication and teamwork skills in clinical settings. The observer uses the form to capture the frequency and quality of desirable behaviors
  Delivery mode: High-fidelity simulation
  Themes and topics: Coordination; cooperation; situational awareness; communication
  Inferential level: Group

Observational Teamwork Assessment of Surgery (OTAS; M. A. Rosen et al., 2010)
  Field: Health and medicine
  Description: Measures task work and teamwork surrounding the surgical processes to assess communication, leadership, monitoring, and cooperation. Rated by an expert using a 7-point Likert scale to score behaviors
  Delivery mode: High-fidelity simulation
  Themes and topics: Communication; leadership; coordination; monitoring; cooperation
  Inferential level: Group

Team Problem Solving Assessment Tool (TPSAT; Rotondi, 1999)
  Field: Health and medicine
  Description: Designed to "aid managers and other practitioners convene and coach teams more effectively" (Rotondi, 1999, p. 205). For each variable, the evaluator assigns a score between 0 and 100
  Delivery mode: High-fidelity simulation
  Themes and topics: Customer's values; team member expertise; team interaction style; systematic problem exploration and solution development; meeting facilitation; pressure to solve the problems; problem definition; team member participation; written logs of meetings
  Inferential level: Group

Anti-Air Teamwork Observation Measure (ATOM; Smith-Jentsch, Johnston, & Payne, 1998)
  Field: Military
  Description: Uses a three-step process consisting of a four-dimensional model of teamwork in the form of qualitative notes and ratings
  Delivery mode: High-fidelity simulation
  Themes and topics: Information exchange; communication; supporting behavior; leadership and initiative
  Inferential level: Group

Blended assessment consisting of a prototype observation tool, third-party evaluation of the final product, and a self-assessment:

CCSSO Workplace Readiness Assessment Consortium (Grummon, 1997)
  Field: Secondary education
  Description: Designed to help instructors give students feedback on their teamwork skills and to help instructors develop their own teamwork assessment tools using three instruments: (1) a team-effectiveness observation sheet, (2) a rubric for evaluating the final product, and (3) a self-assessment for students on teamwork skills
  Delivery mode: High-fidelity simulation and paper-and-pencil
  Themes and topics: Interpersonal skills; thinking and problem-solving skills; product quality
  Inferential level: Group
In other instances, models with a higher number of factors than originally hypothesized are suggested to have improved model–data fit. Aguado et al. (2014) hypothesized a five-factor model for the Teamwork Competency Test (TWCT) but obtained improved model–data fit estimates for a model containing a higher number of factors (eight), which explained 56% of the total variance. The authors stated that the eight-factor solution better reproduced the analyzed data matrix compared with the five-factor model (the substantive model). The study found coverage of the content domain proposed by Stevens and Campion (1994), with Cronbach's alpha > .80. The authors suggested that the TWCT yielded improved fit over the Teamwork-KSA Test, as it was able to extract a greater number of factors that were more closely aligned with the theorized model.
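The factor-count question raised by these studies is often first explored with exploratory techniques before confirmatory models are fit. As a minimal, hypothetical sketch (the item responses below are synthetic, not the TWCT or PPI data), the Kaiser criterion retains factors whose eigenvalue of the inter-item correlation matrix exceeds 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic responses: 500 examinees, six items driven by two latent factors
latent = rng.normal(size=(500, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0],
                     [0, 1], [0, 1], [0, 1]], dtype=float)
responses = latent @ loadings.T + 0.3 * rng.normal(size=(500, 6))

# Kaiser criterion: count correlation-matrix eigenvalues greater than 1
corr = np.corrcoef(responses, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int((eigenvalues > 1.0).sum())
print(n_factors)  # recovers the two latent factors built into the data
```

In real scale-revision work, such exploratory counts are then checked against confirmatory fit indices, which is the kind of comparison Aguado et al. (2014) and Oliveri et al. (2014) report.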
Predictive Validity
Results from incremental predictive validity studies revealed that measures of knowledge and skills both related to a
specic task and to teamwork more generally predicted individual performance in work contexts (e.g., McClough & Rogel-
berg, 2003; Stevens & Campion, 1999) over and above cognitive measures. An incremental predictive validity study was
conducted using the Teamwork-KSA Test with employees in real work teams using supervisor and peer ratings of job
performance (McClough & Rogelberg, 2003). e Teamwork-KSA Test correlated with teamwork performance (r=.44,
p<.05), with ratings of overall job performance (r=.56, p<.05), and with ratings of overall job performance (r=.53,
p<.05). It also provided a signicant increase in explained variance beyond aptitude measures in relation to teamwork
(incremental R2=.08) and overall job performance (incremental R2=.06). e implications of this study are that the use
of teamwork assessments can help augment predictions of job satisfaction and job performance over and above the use of
aptitude measures alone.
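The incremental R² figures quoted above come from hierarchical regression: fit a baseline model with the aptitude measure alone, add the teamwork measure, and take the difference in R². A hedged sketch on synthetic data (the variable names and effect sizes here are illustrative, not McClough and Rogelberg's):

```python
import numpy as np

def r_squared(predictors: np.ndarray, outcome: np.ndarray) -> float:
    """R^2 from an ordinary least squares fit (intercept included)."""
    X = np.column_stack([np.ones(len(outcome)), predictors])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    residuals = outcome - X @ beta
    return 1 - residuals.var() / outcome.var()

rng = np.random.default_rng(1)
aptitude = rng.normal(size=300)
teamwork_score = rng.normal(size=300)  # hypothetical teamwork-test score
performance = 0.5 * aptitude + 0.3 * teamwork_score + rng.normal(scale=0.8, size=300)

r2_aptitude = r_squared(aptitude[:, None], performance)
r2_both = r_squared(np.column_stack([aptitude, teamwork_score]), performance)
print(round(r2_both - r2_aptitude, 3))  # incremental R^2 from adding teamwork
```

Because the full model nests the baseline, in-sample incremental R² is never negative; the substantive question is whether it is large and statistically significant, as in the study above.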
Inferential Level of the Assessments
Table 4 also indicates whether the assessment was intended to make inferences about an individual’s behavior within a
team or the functioning of a team as a unit. In the context of developing an assessment of CPS for higher education, our
focus would be on understanding and making inferences about how an individual functions within a team. As can be seen
in Table 4, such assessments have previously been developed.
Overlap Between Existing Assessments and Our Proposed Definition
In Table 5, we list a sample of existing assessments of CPS skills and indicate whether the components we identified in our proposed taxonomy (see Figure 1) are measured by the existing assessments. Consistent with the idea that teamwork, collaboration, and CPS have received widely different definitions (as noted previously), each of the assessments described in Table 5 evaluates a different set of skills and behaviors under the umbrella of CPS. Note that although there is some overlap between the skills assessed by the various measures and the components listed in our proposed taxonomy, some components, such as active listening, are not explicitly assessed by any of the existing measures.
Considerations in Assessing Collaborative Problem Solving
In the remainder of this section, we discuss considerations for the design of a CPS assessment. We discuss possible task types, item formats, and issues of accessibility when assessing diverse populations. This section concludes with a brief description of the possible advantages of our proposed operational definition of CPS and other assessment considerations.
Task Types
To provide authenticity, motivation, and engagement with the presented material, Grummon (1997) suggested using a variety of task types and structural features in assessment design. Dwyer, Millett, and Payne (2006) also suggested the use of a variety of assessment formats beyond multiple-choice item types, in alignment with the fair and valid testing of higher education skills. As a way to evaluate the possible task types that could be amenable to an assessment of CPS, we evaluated the advantages and disadvantages associated with various task types previously utilized to assess
Table 5 Existing Collaborative Problem Solving (CPS) Assessments and Their Connection to Our Proposed Taxonomy

Taxonomy components and related skills:
  Teamwork: teamwork (general); team cohesion; team empowerment; team learning; self-management/self-leadership; open-mindedness; adaptability; flexibility
  Communication: communication (general); active listening; exchanging information
  Leadership: leadership (general); organizing activities and resources; performance monitoring; reorganizing when faced with obstacles; resolving conflicts; transformational leadership
  Problem solving: problem solving (general); identifying problems; brainstorming; planning; interpreting and analyzing information; evaluating and implementing solutions

Self-assessment forced choice
  Military: TAPAS √√√ √
  Education and workforce: VIEW √√
  Education and workforce: WorkFORCE √√ √
Situational judgment test (SJT)
  Education and workforce: TKSAT √√ √ √
  Education and workforce: PSJT √√ √ √
Third-party evaluations with Likert-type items
  Graduate school: ETS® PPI √√
Observational tools (e.g., behavioral marker systems, behavioral checklist)
  Health and medicine: UT BAMF √√√
  Health and medicine: PACT √√
  Health and medicine: ANTS √√√√ √
  Health and medicine: CATS √√√ √
  Health and medicine: OTAS √√
  Health and medicine: TPSAT √√√ √√
  Military: ATOM √√
Blended assessment
  Education: CCSSO √√ √
Table 6 Advantages and Disadvantages Associated With Assessments Measuring Noncognitive Constructs

Self-assessments (Likert type) (Brill, 2008; Callegaro, 2008; ETS, 2012; Greenleaf, 2008; OECD, 2014; Villar, 2008; Zhuang et al., 2008)
  Advantages: Convenient and economical. Compared to third-party judgments, examinees know the most about their own teamwork behavior. Can be independent from other parts of the applicant's application process.
  Disadvantages: Response biases (social desirability, extreme, central tendency, or acquiescent responding). Fakeable. Students must have the necessary metacognitive ability to accurately gauge their own levels of teamwork.

Self-assessments (forced choice) (Brown & Maydeu-Olivares, 2011; Heggestad, Morrison, Reeve, & McCloy, 2006; OECD, 2014)
  Advantages: Reduces response biases such as acquiescence or central tendency responding. Reduces "halo" effects.
  Disadvantages: Scores are ipsative rather than normative, so they cannot be compared across individuals for admissions and employment decisions.

Situational judgment tests (Cullen, Sackett, & Lievens, 2006; Lievens, Buyse, & Sackett, 2005; OECD, 2014; Whetzel & McDaniel, 2009; Zhuang et al., 2008)
  Advantages: Less adverse effects on ethnic minorities. Validity evidence supports their use in employment and educational settings. Direct measure of judgment making. Possibly more engaging than their self-report counterparts.
  Disadvantages: May be easy to fake if they contain behavioral tendency instructions instead of knowledge instructions. Test takers are able to slightly increase their scores through retesting or coaching.

Third-party evaluations (Kyllonen, 2008; Zhuang et al., 2008)
  Advantages: Less expensive. Scalable.
  Disadvantages: Prone to response biases (e.g., halo effects). Responses are not comparable between instructors and students.

Observational tools (Daimler-Und & Benz-Stiftung, 2001)
  Advantages: Direct measure of teamwork performance in context. Fake resistant.
  Disadvantages: Raters need to be recruited, trained, calibrated, and periodically recalibrated. Cannot capture every aspect of performance because not all behaviors may be elicited during observation, or the situations may be too complex to observe every behavior. Difficult to standardize the test conditions. Very expensive for large-scale administrations.
various noncognitive skills. As summarized in Table 6, these task types are (a) self-assessments with Likert scales, (b) self-assessments with forced-choice items, (c) situational judgment tests (SJTs), (d) third-party evaluations, and (e) observational tools. The table also lists our evaluation of the task types with respect to whether the assessments are (a) resistant to faking, gaming, and coaching; (b) scalable; (c) cost-efficient to produce and score; (d) resistant to the various biases that have been documented in the research; and (e) secure.
In our evaluation, two task types (third-party evaluations and observational tools) fared less favorably on the abovementioned criteria. First, although the advantage of third-party evaluations is the removal of the
GRE Board Research Report No. 17-03 and ETS Research Report Series No. RR-17-06. © 2017 Educational Testing Service 19
M. E. Oliveri et al. A Literature Review on CPS for College and Workforce Readiness
individual as the evaluator, these task types have other limitations. One is the difficulty of controlling response patterns or biases possibly held by individual instructors. A second is low reliability when comparing instructors' ratings, as the evaluations are possibly based on subjective judgments (Zhuang, MacCann, Wang, Liu, & Roberts, 2008). A third is that halo effects may occur when an instructor has a positive impression of one or more of an examinee's attributes (e.g., creativity), which then generalizes across all other attributes (Nisbett & Wilson, 1977).
Observational tools also received low ratings due to challenges related to scoring expense and the variability of testing conditions. As O'Neil et al. (1997) indicated, these methods are neither practical nor cost effective in large-scale test settings. They can be expensive given the need to recruit, train, and calibrate raters (Daimler-Und & Benz-Stiftung, 2001). This expense is likely to outweigh the benefits of reduced fakeability and the ability to yield direct measurements of teamwork performance. A second limitation is that not all behaviors may be displayed during the assessment time frame. For example, it would be difficult to assess a test taker's conflict resolution skills if team members are largely cooperative throughout the exercise. In the same vein, a third limitation is the difficulty of standardizing the test conditions, because an examinee's performance might be affected by interactions with teammates or by the teammates' behavior. In the context of admissions or any other administration with large numbers of examinees, the use of observational tools would be expensive and difficult to operationalize. Such tools would require that multiple examinees interact with each other in controlled situations (in an attempt to standardize the required tasks), and there is no assurance that the behaviors of interest would be elicited for evaluation. Additional limitations include the need to transcribe, code, and analyze information post hoc, which prevents expedient analysis and thus delays the reporting of an individual's performance within a team.
Self-Assessments Using Forced Choice
Self-assessments using Likert scales are one of the most commonly used task types to assess noncognitive skills, given their low cost and convenient administration (O'Neil et al., 2004; Zhuang et al., 2008). However, they are easily fakeable and coachable, and they depend on "students' capabilities for self-knowledge: Students must have the necessary psychological-mindedness to accurately gauge their own levels of teamwork" (Zhuang et al., 2008, p. 5). These types of assessments are also prone to response biases that can arise from an examinee's interpretation or use of the scale, particularly in cross-cultural contexts. Other response biases may occur because of social desirability, tendencies toward endorsing a particular part of the scale (extreme or central tendencies), or acquiescent response patterns, in which the test taker tends to agree with items regardless of their content (ETS, 2012; OECD, 2012). As an alternative, self-assessments using forced choice might be preferable to those using Likert scales: they may be harder to game, and they are relatively inexpensive to produce and quick to score. They can help reduce response biases such as tendencies toward acquiescence or central response tendencies because this format makes it impossible to endorse every item in the same way (Cheung & Chan, 2002).
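The bias-reduction claim above lends itself to a small worked illustration. The sketch below is ours, not drawn from any instrument discussed in this review (the trait values and functions are invented): a respondent who uniformly inflates Likert ratings produces a visibly different score, whereas the same uniform shift cannot change a forced-choice answer, because both statements are shifted equally.

```python
def likert_response(true_standing, bias, scale_max=5):
    """Observed Likert rating: true standing plus a uniform response bias,
    rounded and clipped to the 1..scale_max rating scale."""
    return max(1, min(scale_max, round(true_standing + bias)))

def forced_choice_response(standing_a, standing_b, bias):
    """Choose the statement that seems *more* like oneself. The same
    uniform bias is added to both standings, so it cancels out."""
    return "A" if (standing_a + bias) > (standing_b + bias) else "B"

# One respondent, two traits, with and without an acquiescence-like bias.
trait_1, trait_2 = 3.2, 2.4

print(likert_response(trait_1, bias=0))    # 3 (unbiased rating)
print(likert_response(trait_1, bias=1.5))  # 5 (inflated rating)
print(forced_choice_response(trait_1, trait_2, bias=0))    # A
print(forced_choice_response(trait_1, trait_2, bias=1.5))  # A (unchanged)
```

The cancellation is exact only for a bias applied uniformly to both statements; content-specific biases (e.g., social desirability attached to one statement) are not removed, which is one reason forced-choice pairs are built from statements of similar desirability.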
As an example, a forced-choice item may contain the following two statements, from which the examinee must choose one: (a) I can relax easily, or (b) I set high personal standards for myself. The selection of one of the two responses can also help reduce the effect of uniformly high ratings across domains because the options are not attributes or traits at opposite ends of a single scale but rather traits on different dimensions (Bartram, 2007; Brown & Maydeu-Olivares, 2011, 2013; OECD, 2012). Previous research has suggested that forced-choice items are more fake resistant, as one of the two choices must be selected and both may appear equally (un)appealing (Bowen, Martin, & Hunt, 2002; Brown & Maydeu-Olivares, 2013; Christiansen, Burns, & Montgomery, 2005; Jackson, Wroblewski, & Ashton, 2000; OECD, 2012; White & Young, 1998). Beyond these advantages, including forced-choice items within a blended assessment approach would be helpful by increasing the amount of data collected as a cross-check for other sections of the assessment. They could also help increase assessment reliability through the administration of additional items.
A downside is that forced-choice items are not based on direct observations of an examinee's behaviors. They typically do not have normative scores, as scores on one dimension are relative only to scores on the other dimensions for the same individual and not to other examinees (Brown & Maydeu-Olivares, 2011). This downside limits the ability to compare individuals, which is important in higher education admissions (Heggestad et al., 2006), further suggesting the need to complement this task type with others. Current research might help remedy this shortcoming; see the following approaches: (a) Stark, Chernyshenko, and Drasgow's (2005) multi-unidimensional pairwise-preference model; (b) de la Torre, Ponsoda, Leenen, and Hontangas's (2011) extension of Stark et al.'s model; and (c) Brown and Maydeu-Olivares (2013).
Situational Judgment Tests
SJTs composed of short vignettes have also been used widely in multiple fields, such as business and education. Their advantages include their usefulness in detecting subtle judgment processes by asking participants to provide intuitive or contextual judgments about scenarios set in plausible contexts. This contextualization in real-life scenarios also renders them more engaging than other task types, such as self-reports (Zhuang et al., 2008). Weekley and Ployhart (2005) suggested they can enhance the incremental predictive validity of traditional personality and cognitive measures.

SJTs are flexible and may use a variety of response formats and delivery modes that extend beyond multiple choice and render them more engaging. Possible response formats include constructed-response items or chat boxes that present challenging and possibly complex situations to probe the examinee's ability to display the behaviors and skills thought central to CPS. SJTs can be delivered online, which increases the possibility of eliciting behaviors as examinees interact with avatars or simulated team members, thereby increasing the authenticity of the assessment and performances. They can also be more dynamic and enable the use of various pathways based on an examinee's responses.

Despite these advantages, SJTs also have three main limitations that might present challenges for their use in assessing CPS: (a) fakeability, (b) coachability, and (c) item authoring. First, although SJTs with behavioral tendency items have higher correlations with personality than items eliciting knowledge, they may be easier to fake (Nguyen, Biderman, & McDaniel, 2005). Second, in a high-stakes setting, Lievens, Buyse, Sackett, and Connelly (2012) found an incremental effect (SD = 0.5) between coaching and self-test preparation across alternate forms of an interpersonal SJT in the context of a medical school admissions test. Third, challenges arise in relation to how to define someone as an expert in teamwork for item authoring and scoring, as different situations may lend themselves to different kinds of expertise; hence, finding item writers to create a diverse range of plausible scenarios might be difficult (Zhuang et al., 2008).
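To make the expert-keying challenge concrete, here is a hypothetical sketch of how one SJT vignette could be scored against an expert panel's effectiveness ratings. The scenario, options, ratings, and partial-credit rule are all invented for illustration; deciding who counts as an expert and how to aggregate their ratings is precisely the item-authoring difficulty noted above.

```python
# Hypothetical SJT item: every element below is invented for illustration.
SJT_ITEM = {
    "scenario": ("A teammate repeatedly misses deadlines, and the group "
                 "deliverable is due in two days."),
    "options": {
        "A": "Redo the teammate's work yourself without telling anyone.",
        "B": "Raise the issue directly with the teammate and offer help.",
        "C": "Report the teammate to the instructor immediately.",
        "D": "Say nothing and hope the work arrives on time.",
    },
    # Mean effectiveness ratings (1-7) from a hypothetical expert panel.
    "expert_key": {"A": 3.0, "B": 6.5, "C": 4.0, "D": 1.5},
}

def score_response(item, chosen, rating_max=7.0):
    """Partial credit: 1 minus the scaled distance between the chosen
    option's expert rating and the best-rated option in the key."""
    key = item["expert_key"]
    return 1 - (max(key.values()) - key[chosen]) / rating_max

print(score_response(SJT_ITEM, "B"))            # matches the experts' best option
print(round(score_response(SJT_ITEM, "D"), 3))  # far from the key
```

A distance-to-key rule like this is only one of several conventions in the SJT literature; consensus-based keys and dichotomous best/worst keys are common alternatives, and the choice interacts with the knowledge-versus-behavioral-tendency instruction issue discussed above.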
Simulated Scenario-Based Tasks
This task type provides the opportunity to observe examinees' behaviors in a presented scenario. The approach is closer to direct observation and can potentially yield data on desirable or undesirable behaviors. Simulated scenario-based tasks may also be more engaging and realistic for examinees. In addition, simulated scenarios allow for control of the situation to the degree that the presented situation is the same for all examinees. Each scenario can have decision points embedded within it in order to track and score an examinee's choices. Chat screens or avatars can be embedded to gauge how examinees interact with simulated teammates (Y. Rosen & Tager, 2013).
Although simulated scenario-based tasks offer increased test-taker engagement, authenticity, and standardization over other item types, the possibility of "gaming" the task persists. Examinees may respond based on guesses about the desired responses or outcomes rather than what they would do under ordinary conditions. For example, a bright and intuitive examinee might "know" the desired attributes of someone with high collaborative skills and guess the "correct" pathway through the task set, even if that student is reluctant to collaborate with peers and mentors in nonassessment settings. For this reason, it might not be advisable to use this task type in isolation. A second consideration is the cost associated with developing and implementing the scenarios, which on the one hand are advantageous for rendering pseudoauthentic responses but on the other hand may be difficult to keep secure.
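A minimal sketch of the embedded-decision-point idea described above, with an invented scenario structure: the scenario is a small graph presented identically to every examinee, and the path of choices is logged so it can be scored afterward.

```python
# Invented scenario graph: each node has a prompt and maps each available
# choice to the next node, so every examinee faces the same situation.
SCENARIO = {
    "start": {"prompt": "Your team disagrees on the project plan.",
              "choices": {"discuss": "listen", "override": "push_through"}},
    "listen": {"prompt": "A teammate proposes a compromise.",
               "choices": {"accept": "end", "refine": "end"}},
    "push_through": {"prompt": "Two teammates disengage.",
                     "choices": {"apologize": "end", "continue": "end"}},
    "end": {"prompt": "Scenario complete.", "choices": {}},
}

def run_scenario(scenario, scripted_choices):
    """Walk the decision graph with a scripted list of choices and
    return the final node plus the logged (node, choice) path."""
    node, log = "start", []
    for choice in scripted_choices:
        log.append((node, choice))
        node = scenario[node]["choices"][choice]
    return node, log

final, path = run_scenario(SCENARIO, ["discuss", "accept"])
print(final, path)
```

An operational task would of course be richer (timed responses, chat transcripts, avatar behavior), but the logged path is what makes the examinee's choices comparable and scorable across test takers.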
Recommended Approach to Collaborative Problem Solving Assessment
In light of our review, we suggest using a blended assessment approach to assess CPS in (high-stakes) large-scale assessment contexts. It might consist of various combinations of task types, such as (a) forced-choice self-assessments, (b) SJTs composed of vignettes, and (c) simulated scenario-based tasks. These were our top three task-type contenders. A blended approach is beneficial in drawing on the strengths of the different task types while balancing their weaknesses. The Council of Chief State School Officers (CCSSO) Workplace Readiness Assessment Consortium used a blended approach previously (Grummon, 1997).
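As a hypothetical sketch of how a blended composite might be assembled, the weights and norms below are invented and would need to be set in a validation study, and the forced-choice component is assumed to have already been converted to a normative score (e.g., via the IRT approaches cited earlier) before entering the composite.

```python
def standardize(score, mean, sd):
    """Convert a raw score to a z-score against reference-group norms."""
    return (score - mean) / sd

# Invented weights for the three task types in the blended composite.
WEIGHTS = {"forced_choice": 0.25, "sjt": 0.35, "scenario": 0.40}

def blended_composite(raw, norms):
    """Weighted sum of standardized task-type scores.
    raw and norms are dicts keyed by task type; norms maps to (mean, sd)."""
    return sum(w * standardize(raw[t], *norms[t]) for t, w in WEIGHTS.items())

# Hypothetical norms (mean, sd) and one examinee's raw scores.
norms = {"forced_choice": (50, 10), "sjt": (0.60, 0.15), "scenario": (12, 4)}
raw = {"forced_choice": 62, "sjt": 0.75, "scenario": 16}

print(round(blended_composite(raw, norms), 3))
```

A simple weighted composite is only one way to combine the components; the cross-check role described above (using one task type to flag implausible scores on another) would require additional, profile-level rules.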
Accessibility and the Reduction of Sources of Construct-Irrelevant Variance
Beyond the selection of task types and structural features to ensure high levels of examinee motivation and engagement
with the task, it is also important from a fairness and validity standpoint that the items are accessible to the diverse
populations taking the assessment, such as students with disabilities or students who are culturally and linguistically diverse. Guidelines exist for developing assessments that are sensitive to students with learning disabilities, and these should be consulted in developing prototypes of a CPS assessment (Davey, 2011; Stone & Davey, 2011). Taking students with disabilities into account is of utmost importance for fairness purposes, particularly given that these students are attending higher education institutions at increasing rates, as suggested by Heiman and Precel (2003).
Several guidelines have been developed to provide guidance on the development of fair and valid assessments for diverse test-taker populations. For instance, the International Test Commission (2010) included two particularly relevant guidelines: C1 and D1. Respectively, they state, "Effects of cultural differences which are not relevant or important to the main purposes of the study should be minimized to the extent possible," and "Test developers/publishers should ensure that the adaptation process takes full account of linguistic and cultural differences among the populations for whom adapted versions of the test or instrument are intended" (p. 2). Such guidelines should be reviewed in the development of prototypes of CPS assessments for higher education. Sensitivity to cultural and linguistic differences is particularly important in the context of assessing higher education students, given the large number of international test takers completing these kinds of assessments. For instance, in the 2015–2016 administration of the GRE® General Test, 43% of the population comprised non-U.S. citizens, an increase of 15% from the 2011–2012 administration (ETS, 2014a, 2016).
Previous research has also provided guidance on how to reduce construct-irrelevant variance due to cultural differences in assessment development. For example, PISA used various strategies to reduce cultural load when designing item features and content. These issues were of central importance in PISA, as it is administered in multiple countries globally. Consideration was given to the way items were written and selected as well as to the development of PISA's conceptual framework, which we used to inform our proposed operational definition. Such efforts are helpful in informing the development of task types and design patterns (Oliveri, Lawless, & Mislevy, 2017).
Concluding Note
In this paper, we presented a synthesis of the CPS literature and provided our perspectives on the components, skills, and behaviors that are related to CPS within the context of higher education readiness, with connections to the workforce. We provided an operational definition of CPS and discussed assessment considerations for capturing the behaviors and elements of CPS that are generalizable across tasks and teammates. Accordingly, our literature review focused on identifying and describing such skills in alignment with the use of a test for higher education admissions across fields. Because of the multiple fields and subjects studied in higher education, we suggested focusing on taxonomies of transportable skills, such as those of Aguado et al. (2014), Cannon-Bowers et al. (1995), and Stevens and Campion (1994, 1999). A focus on transportable skills is important, as meaningful and practical gains in the workforce readiness of college students were found to relate to gains in CPS. As stated by Chen, Donahue, and Klimoski (2004), a systematic focus on the teaching and learning of CPS in a wider number of university curricula across fields such as business, engineering, and health care could potentially improve the workforce readiness of college graduates, which may in turn translate into better teamwork behavior in actual work settings. Such efforts could be useful in assisting universities in their efforts to prepare students for the workplaces of today and the future.
Although we envision that rigorous work will be needed to ensure that the assessment meets psychometric standards, we suggest that a first step in this effort is to provide conceptual clarity: an operational definition stated in sufficiently concrete terms to help inform measurement efforts. Exemplars of existing assessments do not delineate constructs or behaviors with sufficient clarity to develop substantive models. We noted this absence particularly in relation to higher education. We thus aimed to provide an operational definition with sufficient information to lay the foundation for a substantive model that could later be empirically assessed. In so doing, this paper supports the efforts of the scholars who have begun conceptual work intended to clarify the nature and the dimensional structure of teamwork processes (e.g., Marks et al., 2001). Such research (although currently sparse) is central to advancing the measurement and assessment of CPS to support construct validity. We also suggest the use of a blended approach to assessing CPS. This approach is not new, and the task types (e.g., forced-choice self-assessments, SJTs, and scenario-based tasks) we wish to use have already been developed to measure other constructs across various contexts.
Acknowledgments
The authors would like to thank Vinetha Belur, Kri Burkander, Chelsea Ezzo, and Laura Ridolfi for their help with the literature searches and research support, and Robert Mislevy, Patrick Kyllonen, Alina von Davier, Katrina Roohr, Lei Lu, Paul Borysewicz, Peter Cooper, and Eric Steinhauer for their consultation.
Note
1 Henceforth, all terminology and skills used interchangeably in the literature with teamwork or CPS will be referred to simply as
CPS.
References
Adelman, C., Ewell, P., Gaston, P., & Schneider, C. G. (2014). The Degree Qualifications Profile: A learning-centered framework for what college graduates should know and be able to do to earn the associate, bachelor's, or master's degree. Indianapolis, IN: Lumina Foundation.
Aguado, D., Rico, R., Sánchez-Manzanares, M., & Salas, E. (2014). Teamwork Competency Test (TWCT): A step forward on measuring teamwork competencies. Group Dynamics: Theory, Research, and Practice, 18(2), 101–121.
Association of American Colleges and Universities. (2011). The LEAP vision for learning: Outcomes, practices, impact, and employers' views. Washington, DC: Author.
Bartram, D. (2007). Increasing validity with forced-choice criterion measurement formats. International Journal of Selection and Assessment, 15, 263–272.
Bowen, C. C., Martin, B. A., & Hunt, S. T. (2002). A comparison of ipsative and normative approaches for ability to control faking in personality questionnaires. International Journal of Organizational Analysis, 10(3), 240–259.
Brannick, M. T., Prince, A., Prince, C., & Salas, E. (1993, April). Impact of raters and events on team performance measurement. Paper presented at the eighth annual conference of the Society for Industrial and Organizational Psychology, San Francisco, CA.
Brill, J. E. (2008). Likert scale. In P. J. Lavrakas (Ed.), Encyclopedia of survey research methods (pp. 427–429). Thousand Oaks, CA: Sage.
Brown, A., & Maydeu-Olivares, A. (2011). Item response modeling of forced-choice questionnaires. Educational and Psychological Measurement, 71(3), 460–502. doi:10.1177/0013164410375112
Brown, A., & Maydeu-Olivares, A. (2013). How IRT can solve problems of ipsative data in forced-choice questionnaires. Psychological Methods, 18(1), 36–52.
Burrus, J., Jackson, T., Xi, N., & Steinberg, J. (2013). Identifying the most important 21st century workforce competencies: An analysis of the Occupational Information Network (O*NET) (Research Report No. RR-13-21). Princeton, NJ: Educational Testing Service. doi:10.1002/j.2333-8504.2013.tb02328.x
Callegaro, M. (2008). Social desirability. In P. J. Lavrakas (Ed.), Encyclopedia of survey research methods (pp. 825–826). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412963947.n486
Cannon-Bowers, J. A., & Salas, E. (1997). A framework for developing team performance measures in training. In M. T. Brannick, E. Salas, & C. Prince (Eds.), Team performance assessment and measurement: Theory, methods, and applications (pp. 45–62). Mahwah, NJ: Lawrence Erlbaum.
Cannon-Bowers, J. A., Tannenbaum, S. I., Salas, E., & Volpe, C. E. (1995). Defining competencies and establishing team training requirements. Team Effectiveness and Decision Making in Organizations, 333–380.
Casner-Lotto, J., & Barrington, M. (2006). Are they really ready for work? Employers' perspectives on the basic knowledge and applied skills of new entrants into the 21st century workforce. New York, NY: Conference Board, Partnership for 21st Century Skills, Corporate Voices for Working Families, and Society for Human Resource Management.
Chen, G., Donahue, L. M., & Klimoski, R. J. (2004). Training undergraduates to work in organizational teams. Academy of Management Learning and Education, 3, 27–40.
Cheung, M. W. L., & Chan, W. (2002). Reducing uniform response bias with ipsative measurement in multiple-group confirmatory factor analysis. Structural Equation Modeling, 9, 55–77.
Chiu, C.-J., Brock, D., Abu-Rish, E., Vorvick, L., Wilson, S., Hammer, D., ... Zierler, B. (2011). Performance Assessment of Communication and Teamwork (PACT) tool set. Retrieved from http://collaborate.uw.edu/educators-toolkit/tools-for-evaluation/performance-assessment-of-communication-and-teamwork-pact-too
Chiu, M. M. (2000). Group problem-solving processes: Social interactions and individual actions. Journal for the Theory of Social Behaviour, 30(1), 26–49. doi:10.1111/1468-5914.00118
Christiansen, N. D., Burns, G. N., & Montgomery, G. E. (2005). Reconsidering the use of forced-choice formats for applicant personality assessment. Human Performance, 18, 267–307.
Cooke, N. J., Salas, E., Cannon-Bowers, J. A., & Stout, R. J. (2000). Measuring team knowledge. Human Factors, 42(1), 151–173.
Council of Graduate Schools. (2012). U.S. must close gap between graduate schools, employers to stay competitive, spur innovation [Press release]. Retrieved from http://www.cgsnet.org/pathways_release
Creative Problem Solving Group, Inc. (2013). About VIEW: An assessment of problem solving style. Retrieved from http://www.viewassessment.com/media/978/About-VIEW-2014.pdf
Cullen, M. J., Sackett, P. R., & Lievens, F. (2006). Threats to the operational use of situational judgment tests in the college admission process. International Journal of Selection and Assessment, 14(2), 142–155.
Daimler-Und, G., & Benz-Stiftung, K. (2001, July 5–6). Enhancing performance in high risk environments: Recommendations for the use of behavioural markers [Workshop]. Zurich, Switzerland: SwissAir Training Centre.
Davey, T. (2011). Practical considerations in computer-based testing. Princeton, NJ: Educational Testing Service. Retrieved from https://www.ets.org/Media/Research/pdf/CBT-2011.pdf
de la Torre, J., Ponsoda, V., Leenen, I., & Hontangas, P. (2011, July). Some extensions of the multi-unidimensional pairwise-preference model. Paper presented at the 26th annual conference of the Society for Industrial and Organizational Psychology, Chicago, IL.
Dickinson, T. L., & McIntyre, R. M. (1997). A conceptual framework for teamwork measurement. In M. T. Brannick, E. Salas, & C. Prince (Eds.), Team performance assessment and measurement: Theory, methods, and applications (pp. 19–43). Mahwah, NJ: Lawrence Erlbaum.
Drasgow, F., Stark, S., Chernyshenko, O. S., Nye, C. D., Hulin, C. L., & White, L. A. (2012). Development of the Tailored Adaptive Personality Assessment System (TAPAS) to support Army selection and classification decisions (Technical Report No. 1311). Fort Belvoir, VA: Army Research Institute for the Behavioral and Social Sciences.
Dwyer, C. A., Millett, C. M., & Payne, D. G. (2006). A culture of evidence: Postsecondary assessment and learning outcomes. Princeton, NJ: Educational Testing Service.
Educational Testing Service. (2009). About the ETS® Personal Potential Index (ETS® PPI). Retrieved from http://www.ets.org/ppi/evaluators/about/
Educational Testing Service. (2012). Assessment methods. Retrieved from http://www.ets.org/s/workforce_readiness/pdf/21333_big_5.pdf
Educational Testing Service. (2014a). A snapshot of the individuals who took the GRE® revised General Test. Princeton, NJ: Educational Testing Service. Retrieved from http://www.ets.org/s/gre/pdf/snapshot_test_taker_data_2014.pdf
Educational Testing Service. (2014b, November). Innovative Assessment of Collaboration 2014 meeting program. Innovative Assessment of Collaboration Conference 2014. Arlington, VA: Educational Testing Service. Retrieved from https://custom.cvent.com/DE290BCAC7FE4C9D8E112A32B558BCC1/files/2029ac56dba04f1ca0fccb03b8185410.pdf
Educational Testing Service. (2016). A snapshot of the individuals who took the GRE® revised General Test. Princeton, NJ: Educational Testing Service. Retrieved from http://www.ets.org/s/gre/pdf/snapshot_test_taker_data_2016.pdf
Ennis, M. R. (2008). Competency models: A review of the literature and the role of the Employment and Training Administration (ETA). Washington, DC: U.S. Department of Labor.
Fiore, S. M., Rosen, M. A., Smith-Jentsch, K., Salas, E., Letsky, M., & Warner, N. (2010). Toward an understanding of macrocognition in teams: Predicting processes in complex collaborative contexts. Human Factors: The Journal of the Human Factors and Ergonomics Society, 52(2), 203–224.
Flin, R., Glavin, R., Maran, N., & Patey, R. (2012). Framework for observing and rating anaesthetists' non-technical skills: Anaesthetists' non-technical skills (ANTS) system handbook, v 1.0. Retrieved from http://www.abdn.ac.uk/iprc/documents/ANTS%20Handbook%202012.pdf
Frankel, A., Gardner, R., Maynard, L., & Kelly, A. (2007). Using the Communication and Teamwork Skills (CATS) assessment to measure health care team performance. Joint Commission Journal on Quality and Patient Safety/Joint Commission Resources, 33(9), 549–558.
González, J., & Wagenaar, R. (Eds.). (2003). Tuning educational structures in Europe: Final report, pilot project, phase 1. Retrieved from http://tuningacademy.org/wp-content/uploads/2014/02/TuningEUI_Final-Report_EN.pdf
Greenleaf, E. A. (2008). Extreme response style. In P. J. Lavrakas (Ed.), Encyclopedia of survey research methods (p. 257). Thousand Oaks, CA: Sage. doi:10.4135/9781412963947.n486
Grummon, P. T. H. (1997). Assessing teamwork skills for workforce readiness. In H. J. O'Neil (Ed.), Workforce readiness: Competencies and assessment (pp. 383–410). Mahwah, NJ: Lawrence Erlbaum.
Hart Research Associates. (2015). Falling short? College learning and career success: Selected findings from online surveys of employers and college students conducted on behalf of the Association of American Colleges & Universities. Retrieved from https://www.aacu.org/sites/default/files/files/LEAP/2015employerstudentsurvey.pdf
Heggestad, E. D., Morrison, M., Reeve, C. L., & McCloy, R. A. (2006). Forced-choice assessments for selection: Evaluating issues of
normative assessment and faking resistance. Journal of Applied Psychology, 91, 9 –24.
Heiman, T., & Precel, K. (2003). Students with learning disabilities in higher education: Academic strategies prole. Journal of Learning
Disabilities,36(3), 248–258.
Hesse, F., Care, E., Buder, J., Sassenberg, K., & Grin, P. (2015). A framework for teachable collaborative problem solving skills. In P.
Grin & E. Care (Eds.), Assessment and teaching of 21st century skills: Methods and approach.Dordrecht,eNetherlands:Springer.
Hutchins, E. (1995). Cognition in the wild. Cambridge, MA: MIT Press.
International Society for Technology in Education. (n.d.). ISTE standards. Students. Retrieved from http://www.iste.org/docs/pdfs/20-
14_ISTE_Standards-S_PDF.pdf
International Test Commission. (2010). International Test Commission guidelines for translating and adapting tests.Retrievedfrom
https://www.intestcom.org/les/guideline_test_adaptation.pdf
Jackson, D. N., Wroblewski, V. R., & Ashton, M. C. (2000). e impact of faking on employment tests: Does forced choice oer a
solution? Human Performance, 13(4), 371–388.
Klimoski, R., & Mohammed, S. (1994). Team mental model: Construct or metaphor? Journal of Management, 20, 403–437.
Knapp, D. J., & Hener, T. S. (Eds.) (2009). Validating future force performance measures (Army class): End of training longitudinal
validation (Technical Report No. 1257). Arlington, VA: Army Research Institute for the Behavioral and Social Sciences.
Kostopoulos, K. C., Spanos, Y. E., & Prastacos, G. P. (2013). Structure and function of team learning emergence: A multilevel empirical
validation. Journal of Management, 39(6), 1430– 1461.
Kozlowski, S. W. J., & Ilgen, D. R. (2006). Enhancing the eectiveness of work groups and teams. Psychological Science in the Public
Interest,7, 77 –124.
Kukenberger, M. R., Mathieu, J. E., & Ruddy, T. (2015). A cross-level test of empowerment and process inuences on members’ informal
learning and team commitment. Journal of Management,46, 227–259.
Kyllonen, P. C. (2008). e research behind the ETS®Personal Pote ntial Index (PPI). Retrieved from http://www.ets.org/Media/Products/
PPI/10411_PPI_bkgrd_report_RD4.pdf
Kyllonen,P. C. (2012, May). Measurement of 21st century skills within the common core state standards. Paper presentedat the Invitational
Research Symposium on Technology Enhanced Assessments. Princeton, NJ. Retrieved from https://www.ets.org/Media/Research/
pdf/session5-kyllonen-paper-tea2012.pdf
Lench, S., Fukuda, E., & Anderson, R. (2015). Essential skills and dispositions: Developmental frameworks for collaboration, creativity,
communication, and self-direction. Lexington: Center for Innovation in Education at the University of Kentucky.
Lievens, F., Buyse, T., & Sackett, P. R. (2005). Retest eects in operational selection settings: Development and test of a framework.
Personnel Psychology,58(4), 981 –1007. doi:10.1111/j.1744-6570.2005.00713.x
Lievens, F., Buyse, T., Sackett, P. R., & Connelly, B. S. (2012). e eects of coaching on situational judgment tests in high-stakes
selection. International Journal of Selection and Assessment,20, 272– 282. doi:10.1111/j.1468-2389.2012.00599.x
Markle, R., Brenneman, M., Jackson, T., Burrus, J., & Robbins, S. (2013). Synthesizing frameworks of higher education student learning
outcomes (Research Report No. RR-13-22). Princeton, NJ: Educational Testing Service. 10.1002/j.2333-8504.2013.tb02329.x
Marks, M. A., Mathieu, J. E., & Zaccaro, S. J. (2001). A temporally based framework and taxonomy of team processes. Academy of
Management Review,26(3), 356 –376.
Mathieu, J. E., Hener, T. S., Goodwin, G. F., Salas, E., & Cannon-Bowers, J. A. (2000). e inuence of shared mental models on team
process and performance. Journal of Applied Psychology, 85(2), 272– 283.
McClough, A. C., & Rogelberg, S. G. (2003). Selection in teams: An exploration of the teamwork knowledge, skills, and ability test.
International Journal of Selection and Assessment,11(1), 56– 66. doi:10.1111/1468-2389.00226
Mitchell, R., Boyle, B., Parker, V., Giles, M., Joyce, P., & Chiang, V. (2014). Transformation through tension: e moderating impact of
negative aect on transformational leadership in teams. Human Relations, 67(9), 1095– 1121. doi:10.1177/0018726714521645
Murray, T. S., Owen, E., & McGaw, B. (2005). Learning a living: First results of the adult literacy and life skills survey.Ottawa,Canada:
Statistics Canada and the Organisation for Economic Co-operation and Development.
National Research Council. (2010). Assessing 21st century skills: Summary of a workshop. J. A. Koenig, Rapporteur, Committee on the
Assessment of 21st Century Skills, Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education.
Washington, DC: e National Academies Press.
Nguyen, N. T., Biderman, M. D., & McDaniel, M. A. (2005). Eects of response instructions on faking a situational judgment test.
International Journal of Selection and Assessment, 13, 250– 260.
Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious alteration of judgments. Journal of Personality and Social Psychology, 35, 250–256.
Oliveri, M. E., Lawless, R. R., & Mislevy, R. J. (2017). Using evidence-centered design to support the development of culturally and linguistically sensitive collaborative problem-solving assessments. Manuscript in preparation.
GRE Board Research Report No. 17-03 and ETS Research Report Series No. RR-17-06. © 2017 Educational Testing Service 25
M. E. Oliveri et al. A Literature Review on CPS for College and Workforce Readiness
Oliveri, M. E., McCarey, D., Holtzman, S., & Ezzo, C. (2014, April). Investigating the factor structure of the Personal Potential Index
using a multilevel factor analysis approach. In M. E. Oliveri (Chair), Problems with interpretations of multilevel data – Extending
research beyond hierarchical linear modeling. Paper presented at the annual American Educational Research Association conference,
Philadelphia, PA.
O’Neil, H. F., Chuang, S., & Chung, G. K. W. K. (2004). Issues in the computer-based assessment of collaborative problem solving.
Assessment in Education, 10, 361 –373.
O’Neil, H. F., Chung, G., & Brown, R. (1997). Use of networked simulations as a context to measure team competencies. In H. F. O’Neil
Jr. (Ed.), Workforce readiness: Competencies and assessment (pp. 411– 452). Mahwah, NJ: Lawrence Erlbaum.
Organisation for Economic Co-operation and Development. (2012). Literacy, numeracy and problem solving in technology-rich environments: Framework for the OECD Survey of Adult Skills. http://dx.doi.org/10.1787/9789264128859-en
Organisation for Economic Co-operation and Development. (2013a). PISA 2012 assessment and analytical framework: Mathematics, reading, science, problem solving and financial literacy. https://doi.org/10.1787/9789264190511-en
Organisation for Economic Co-operation and Development. (2013b). PISA 2015: Draft collaborative problem solving framework. Retrieved from http://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf
Organisation for Economic Co-operation and Development. (2014). Context questionnaire development. In R. Turner (Ed.), PISA 2012 Technical Report (pp. 47–64). Retrieved from http://www.oecd.org/pisa/pisaproducts/PISA-2012-technical-report-final.pdf
Oser, R. L., McCallum, G. A., Salas, E., & Morgan, B. Jr. (1989). Toward a definition of teamwork: An analysis of critical team behavior (NTSC Technical Report No. 90-009). Orlando, FL: Naval Training System Centre.
Partnership for 21st Century Skills. (2012). Framework for 21st century learning. Retrieved from http://www.p21.org/our-work/p21-
framework
Rosen, M. A., Weaver, S. J., Lazzara, E. H., Salas, E., Wu, T., Silvestri, S., … King, H. B. (2010). Tools for evaluating team performance in simulation-based training. Journal of Emergencies, Trauma & Shock, 3(4), 353–359. doi:10.4103/0974-2700.70746
Rosen, Y., & Tager, M. (2013). Computer-based assessment of collaborative problem-solving skills: Human-to-agent versus human-
to-human approach. Retrieved from http://researchnetwork.pearson.com/wp-content/uploads/collaborativeproblemsolving
researchreport.pdf
Rotondi, A. J. (1999). Assessing a team’s problem solving ability: Evaluation of the Team Problem Solving Assessment Tool (TPSAT).
Health Care Management Science, 2, 205– 214.
Rousseau, V., & Aubé, C. (2010). Team self-managing behaviors and team effectiveness: The moderating effect of task routineness. Group & Organization Management, 35(6), 751–781.
Salisbury, W. D., Carte, T. A., & Chidambaram, L. (2006). Cohesion in virtual teams: Validating the perceived cohesion scale in a
distributed setting. ACM SIGMIS Database,37(2– 3), 147–155.
Shanahan, C., Best, C., Finch, M., & Sutton, C. (2007). Measurement of the behavioural, cognitive, and motivational factors underlying
team performance (DSTO Research Report DSTO-RR-0328). Fishermans Bend Victoria, Australia: Air Operations Division Defence
Science and Technology Organisation.
Smith-Jentsch, K. A., Johnston, J. H., & Payne, S. C. (1998). Measuring team-related expertise in complex environments. In J. Cannon-
Bowers & E. Salas (Eds.), Making decisions under stress: Implications for individual and team training (pp. 61– 87). Washington, DC:
APA Press.
Stark, S., Chernyshenko, O. S., & Drasgow, F. (2005). An IRT approach to constructing and scoring pairwise preference items involving stimuli on different dimensions: The multi-unidimensional pairwise-preference model. Applied Psychological Measurement, 29(3), 184–203.
Stevens, M. J., & Campion, M. A. (1994). The knowledge, skills and ability requirements for teamwork: Implications for human resources management. Journal of Management, 20(2), 502–528.
Stevens, M. J., & Campion, M. A. (1999). Staffing work teams: Development and validation of a selection test for teamwork settings. Journal of Management, 25(2), 207–228.
Stone, E., & Davey, T. (2011). Computer-adaptive testing for students with disabilities: A review of the literature (Research Report No. RR-11-32). Princeton, NJ: Educational Testing Service. http://dx.doi.org/10.1002/j.2333-8504.2011.tb02268.x
Strayhorn, T. L. (2006). Frameworks for assessing learning and development outcomes. Washington, DC: Council for the Advancement
of Standards in Higher Education.
Thomas, E. J., Sexton, J. B., & Helmreich, R. L. (2004). Translating teamwork behaviours from aviation to healthcare: Development of behavioural markers for neonatal resuscitation. Quality and Safety in Health Care, 13(Suppl. 1), i57–i64. doi:10.1136/qshc.2004.009811
Trenger, D. J., Isaksen, S. G., & Selby, E. C. (2014). Evidence supporting VIEW. Orchard Park, NY: Creative Problem Solving Group.
Villar, A. (2008). Response bias. In P. J. Lavrakas (Ed.), Encyclopedia of survey research methods (pp. 752–754). Thousand Oaks, CA: Sage. doi:10.4135/9781412963947.n486
Waugh, G. W., & Russell, T. L. (2005). Predictor situational judgment test. In D. J. Knapp & T. R. Tremble (Eds.), Development of experimental Army enlisted personnel selection and classification tests and job performance criteria (Technical Report 1168). Arlington, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.
Weekley, J. A., & Ployhart, R. E. (Eds.). (2005). Situational judgment tests: Theory, measurement, and application. Hillsdale, NJ: Lawrence Erlbaum.
Whetzel, D. L., & McDaniel, M. A. (2009). Situational judgment tests: An overview of current research. Human Resource Management
Review, 19, 188 –202.
White, L. A., & Young, M. C. (1998, August). Development and validation of the Assessment of Individual Motivation (AIM). Paper presented at the annual meeting of the American Psychological Association, San Francisco, CA.
Woolley, A. W., Chabris, C. F., Pentland, A., Hashmi, N., & Malone, T. W. (2010). Evidence for a collective intelligence factor in the
performance of human groups. Science, 330, 686–688. doi:10.1126/science.1193147
Zaccaro, S. J., Rittman, A. L., & Marks, M. A. (2001). Team leadership. Leadership Quarterly, 12, 451– 483.
Zhuang, X., MacCann, C., Wang, L., Liu, O. L., & Roberts, R. D. (2008). Development and validity evidence supporting a teamwork and collaboration assessment for high school students (Research Report No. RR-08-50). Princeton, NJ: Educational Testing Service. http://dx.doi.org/10.1002/j.2333-8504.2008.tb02136.x
Suggested citation:
Oliveri, M. E., Lawless, R., & Molloy, H. (2017). A literature review on collaborative problem solving for college and workforce readiness (GRE Board Research Report No. 17-03). Princeton, NJ: Educational Testing Service. http://dx.doi.org/10.1002/ets2.12133
Action Editor: Brent Bridgeman
Reviewers: This report was reviewed by the GRE Technical Advisory Committee and the Research Committee and Diversity,
Equity and Inclusion Committee of the GRE Board.
ETS, the ETS logo, GRE, the GRE logo, and WORKFORCE are registered trademarks of Educational Testing Service (ETS). All other
trademarks are property of their respective owners.
Find other ETS-published reports by searching the ETS ReSEARCHER database at http://search.ets.org/researcher/