Shadow Program Committee Initiative: Process and Reflection
Patanamon Thongtanunam
patanamon.t@unimelb.edu.au
University of Melbourne, Australia
Ayushi Rastogi
a.rastogi@rug.nl
University of Groningen, Netherlands
Foutse Khomh
foutse.khomh@polymtl.ca
Polytechnique Montréal, Canada
Serge Demeyer
serge.demeyer@uantwerpen.be
University of Antwerp, Belgium
Meiyappan Nagappan
mei.nagappan@uwaterloo.ca
University of Waterloo, Canada
Kelly Blincoe
k.blincoe@auckland.ac.nz
University of Auckland, New Zealand
Gregorio Robles
grex@gsyc.urjc.es
Universidad Rey Juan Carlos, Spain
1. INTRODUCTION
The Shadow Program Committee (PC) is an initiative/program
that provides an opportunity to Early-Career Researchers (ECRs),
i.e., PhD students, postdocs, new faculty members, and industry
practitioners, who have not been in a PC, to learn first-hand about
the peer-review process of the technical track at Software Engi-
neering (SE) conferences. This program aims to train the next
generation of PC members as well as to allow ECRs to be recog-
nized and embedded in the research community. By participating
in this program, ECRs will have a great chance i) to gain expe-
rience about the reviewing process including the restrictions and
ethical standards of the academic peer-review process; ii) to be
mentored by senior researchers on how to write a good review; and
iii) to create a network with other ECRs and senior researchers
(i.e., Shadow PC advisors).
The Shadow PC program was first introduced to the SE research community at the Mining Software Repositories (MSR) conference in 2021. The program was led by Patanamon Thongtanunam and Ayushi Rastogi (Shadow PC Co-chairs), with support from the Shadow PC Advisor Co-chairs (Foutse Khomh and Serge Demeyer), the PC Co-chairs of the technical track (Meiyappan Nagappan and Kelly Blincoe), and the General Chair of the conference, Gregorio Robles. To promote and facilitate the Shadow PC program at future SE conferences, this report describes the process and provides a reflection on the Shadow PC program at MSR2021. The presentation slides and video are also available online (video: https://youtu.be/ReUXwmtIEk8).
2. SHADOW PC PROCESS
The Shadow PC review process functioned as an independent version of the review process of the conference's technical track. The Shadow PC members reviewed a subset of the submissions to the technical track (only papers whose authors had opted in were reviewed by the Shadow PC). However, the Shadow PC members did not have access to the reviews or the names of the actual reviewers of the technical track. In addition, they had to abide by the same rules and restrictions that apply to regular PC members. This includes, but is not limited to, declaring conflicts of interest, performing double-blind reviews, and following the rules against discussing the papers outside of the PC context or using results from reviewed papers in any way before those papers have been published. Delegated reviews (i.e., external reviews) were not allowed for the Shadow PC. The Shadow PC reviews were sent to the authors only after the actual review process had been completed.
2.1 Shadow PC Members
Recruitment. The Shadow PC program was open to PhD students, post-docs, new faculty members, and industry practitioners working in SE research who had not yet served as a PC member of the technical research track (or the main track) of a premier SE conference. ECRs interested in participating in the Shadow PC program were asked to nominate themselves through a Google form that collected general demographics (e.g., country of residence, gender), a public research profile, the motivation for joining the Shadow PC, and SE topics of expertise. To ensure that this opportunity reached as many ECRs as possible, we publicized the Shadow PC program and the self-nomination form via social platforms (e.g., Twitter, Facebook), SE mailing lists (e.g., SEWorld), and SE communities (e.g., SEWIRE: Women in SE, SENF: New Faculty and Postdoc). Through our personal contacts, we also reached out to SE researchers in under-represented regions who were less likely to be informed about this opportunity (e.g., ECRs in South-East Asia, Africa, and South America). Self-nominations were open from November 1 to December 10, 2020.
Selection. We received a total of 162 applications from a wide range of backgrounds across 36 countries. The selection was mainly based on research experience (i.e., the number of years before or after PhD graduation and the number of publications) and the stated motivation for participating in the Shadow PC program, while ensuring diversity in terms of gender, country, and occupation. After reviewing the applications, we invited 106 applicants from 32 countries to participate in the program as Shadow PC members. Figure 1 shows that the selected Shadow PC members have a reasonably balanced distribution across gender and occupation.
Applicants who were not selected were offered other opportunities to serve the community based on their qualifications. Indeed, we found that thirteen of the applicants were overqualified for the Shadow PC, as they had prolific research profiles and prior experience reviewing journal and workshop papers. These thirteen applicants were instead invited to be regular PC members of the technical track at MSR2021. The applications of the post-PhD applicants (e.g., post-docs and faculty members) were also shared with the PC chairs of various tracks at premier SE conferences. The remaining applicants who were not selected, mainly early-stage PhD students, were encouraged to apply in future editions of the conference.

Figure 1: The distribution of the 106 Shadow PC members at MSR2021 by gender and occupation.
2.2 Reviewing and Mentoring Process
In line with the double-blind policy, the Shadow PC members had access neither to the names of the actual reviewers of the technical track nor to the author names. Hence, the Shadow PC reviews were conducted in an instance of the HotCRP review management system separate from that of the technical track. This also ensured that the reviews of the PC members of the technical track did not influence the review decisions of the Shadow PC members, and vice versa. The Shadow PC reviews were also double-blind.
Paper Submissions. The Shadow PC program used a subset of the papers that were actually submitted to the technical track of MSR2021. When submitting to the technical track, authors could select an option to participate in the Shadow PC program. We selected 40 of the 59 papers that opted into the Shadow PC program. The selection was based on the paper category (i.e., full-length research papers) and the diversity of topics. The PC co-chairs of the MSR2021 technical track then privately shared the metadata and the submission files of the selected papers with the Shadow PC co-chairs, who populated them into the Shadow PC HotCRP review instance.
Reviewing Timeline & Process. The Shadow PC program followed a process and timeline similar to those of the technical track at MSR2021. The Shadow PC members had to declare conflicts of interest, indicate topics of interest, and provide review preferences. Since the reviewing timeline of the Shadow PC program was shorter than that of the technical track and many Shadow PC members had no prior reviewing experience, each Shadow PC member was assigned only two papers, based on their topics of interest and review preferences. As a result, each paper received five to six Shadow PC reviews. Again, to avoid any potential influence of the outcomes of the technical track, Shadow PC members had to submit their reviews and reach consensus before the author notification date of the technical track. After the review submission deadline, the Shadow PC advisors provided feedback on the reviews written by the Shadow PC members. Once all the reviews had been checked by the Shadow PC advisors, we sent the decisions and reviews of the Shadow PC to the authors.
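As a rough sanity check, the reviewer load implied by these numbers can be reproduced with a few lines of Python. This is only an illustrative sketch; the counts are taken from this report, and the calculation itself was not part of the review process.

```python
# Illustrative sketch: verify that assigning two papers per Shadow PC member
# yields roughly five to six reviews per paper (counts taken from this report).
members = 106               # invited Shadow PC members
papers = 40                 # papers reviewed by the Shadow PC
papers_per_member = 2       # review assignments per member

review_slots = members * papers_per_member
reviews_per_paper = review_slots / papers
print(f"~{reviews_per_paper:.1f} reviews per paper")  # ~5.3, i.e., five to six
```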
Mentoring & Support. Mentorship was provided mainly by the Shadow PC advisors through individual feedback on each Shadow PC review. We invited 26 senior SE researchers with extensive experience in community service and reviewing to serve as Shadow PC advisors. Each advisor was assigned to monitor one or two papers and to provide feedback on 5-10 Shadow PC reviews. In addition to the mentorship by the Shadow PC advisors, the Shadow PC members were provided with materials containing reviewing guidelines and tips. We also set up a private communication channel on Discord for questions and answers between the Shadow PC members and the Shadow PC co-chairs.

Table 1: The number of papers accepted and rejected by the Shadow PC and the MSR PC.

                                  By MSR PC
                            #Accepted   #Rejected
By Shadow PC   #Accepted        12           7
               #Rejected         2          19

Figure 2: Survey responses from Shadow PC members.
2.3 Review Outcome and Analysis
We received a total of 203 reviews from 102 Shadow PC members. There were 680 messages exchanged in the review discussions, of which 408 were from Shadow PC members, showing that the Shadow PC members were active. The acceptance rate of the Shadow PC was 47.5% (19 out of 40 papers). On the same set of papers, the acceptance rate of the regular PC was similar, i.e., 35% (14 out of 40 papers). Table 1 shows that the review outcomes of the Shadow PC members and the PC members of the technical track are fairly consistent: 12 papers were consistently accepted and 19 papers were consistently rejected. Only 9 of the 40 papers received inconsistent review outcomes.
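The consistency between the two committees can also be quantified directly from the counts in Table 1. The following Python sketch is purely illustrative (the numbers are those reported above); it computes the two acceptance rates and the raw agreement between the Shadow PC and the MSR PC.

```python
# Illustrative sketch: acceptance rates and raw agreement derived from Table 1.
# Mapping: (Shadow PC decision, MSR PC decision) -> number of papers.
outcomes = {
    ("accept", "accept"): 12,
    ("accept", "reject"): 7,
    ("reject", "accept"): 2,
    ("reject", "reject"): 19,
}

total = sum(outcomes.values())                                     # 40 papers
shadow_accepts = sum(n for (s, _), n in outcomes.items() if s == "accept")
msr_accepts = sum(n for (_, m), n in outcomes.items() if m == "accept")
consistent = sum(n for (s, m), n in outcomes.items() if s == m)

print(f"Shadow PC acceptance rate: {shadow_accepts / total:.1%}")  # 47.5%
print(f"MSR PC acceptance rate:    {msr_accepts / total:.1%}")     # 35.0%
print(f"Consistent outcomes: {consistent}/{total} ({consistent / total:.1%})")  # 31/40 (77.5%)
```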
3. REFLECTION
To gauge the success of the Shadow PC program, we sent out
a survey to each of the Shadow PC members and the authors
who participated in the program. The surveys were anonymized
and voluntary. For the Shadow PC members, the survey aimed
to collect feedback on their experiences and perceptions of the
program. For the authors who received reviews from the Shadow
PC, the survey aimed to better understand their motivation for
opting into the Shadow PC review process and to collect feedback
about the Shadow PC reviews. In this section, we present the survey responses and discuss areas for improvement for future editions of the program.
3.1 Participant Feedback
Below, we summarize the survey responses from Shadow PC mem-
bers and the paper authors.
Shadow PC Feedback. We received 73 responses from the Shadow PC members. Figure 2 shows that the responses were generally positive about the learning and experience gained from participating in the Shadow PC program. The majority of the respondents agreed that they learned and/or better understood the academic peer review process and gained a new perspective on how the actual PC members read and reviewed the manuscripts. Many respondents noted that the program provided a great opportunity to learn how the actual review process functions. For example, Shadow PC members noted: “It was my first time seeing the work behind the scenes that goes into discussing the papers and finalizing the decision”; “there is not often an opportunity to learn this (except if you are lucky and have a good supervisor/colleagues during your PhD studies).” Several respondents also noted that they learnt from the other Shadow PC reviews: “Great to read feedback on not only my reviews but also on other shadow PC members’ review.” In addition, most of the respondents indicated that the Shadow PC program should be held again in the future and that they would recommend the program to others.

Figure 3: Survey responses from authors of participating papers.
Author Feedback. We received 23 responses from the paper authors. Several respondents indicated that they opted into the Shadow PC review to receive additional feedback while also contributing to the community's effort to train the next generation of PC members: “Receiving additional feedback and allowing new community members to become acquainted with the inner workings of a conference.” Figure 3 also shows that the authors were generally satisfied with the quality of the Shadow PC reviews. Many respondents agreed that the Shadow PC reviews were clear, constructive, and used an appropriate tone and language. Several respondents also noted that some of the comments were useful: “I used some reviews in the camera-ready.”; “The shadow PC reviews are mostly well-expressed and comparable in clarity to the normal PC reviews.” Similar to the responses from the Shadow PC members, most of the respondents suggested that the Shadow PC program should be held again in the future and that they would submit manuscripts to participate in the program.
3.2 Areas for Improvement
Based on the survey responses and our discussion with the Shadow
PC advisors, we also identified several points that can be further
improved in the future.
Training Workshop & Social Event: Several respondents to the Shadow PC member survey noted that, before starting the review process, it would be good to see example reviews and receive guidelines about what they should and should not do when reviewing. Hence, we suggest organizing a training workshop to ensure that all Shadow PC members have a clear understanding of what they should read, critique, and write in a review. In addition, organizing a social event may increase the interaction and participation among the Shadow PC members.
Timeline: There were a couple of delays in the timeline, partly because each paper received reviews from five to six Shadow PC members and reaching consensus therefore took longer than usual. As a result, the authors received the Shadow PC reviews relatively late (i.e., one week prior to the camera-ready deadline). We suggest reducing the number of Shadow PC reviews per paper and enforcing a stricter timeline to ensure that the notification is sent to the authors in a timely manner.
Author Response: One interesting suggestion from the paper authors was that they would be happy to provide feedback to the Shadow PC members who reviewed their papers. We agree that organizing an author response period would benefit both the Shadow PC members and the paper authors: the authors could clarify or correct misunderstandings, while the Shadow PC members would receive additional feedback on their reviews.
Recognition: To recognize the effort of the Shadow PC members, future programs should also consider selecting distinguished reviewers and/or inviting some of the most deserving participants to be PC members of the main research track. Since the majority of the Shadow PC members were PhD students who consider participation in the Shadow PC program valuable for their profiles, a certificate of participation should also be provided.
4. CONCLUSION
The Shadow PC program has provided numerous benefits to the SE community. ECRs had an opportunity to learn how to review and to understand the academic peer review process. Paper authors received additional feedback. Moreover, the community now has a larger pool of diverse reviewers. The responses to this initiative from both the Shadow PC members and the paper authors were very positive and encouraging. Several Shadow PC members acknowledged that they gained reviewing experience, and some paper authors noted that they received useful feedback from the Shadow PC members. Hence, we encourage the SE research community to continue the Shadow PC program in the future.