Special Article
Walailak J Sci & Tech 2017; 14(4): 259-265.
Cost-benefit Assessment of Congresses, Meetings or Symposia, and
Selection Criteria to Determine if They are Predatory
Jaime A. TEIXEIRA DA SILVA1,*, Shahryar SOROOSHIAN2 and
Aceil AL-KHATIB3
1P.O. Box 7, Miki-cho post office, Ikenobe 3011-2, Kagawa-ken, 761-0799, Japan
2Faculty of Industrial Management, Universiti Malaysia Pahang, 26300 Gambang Kuantan,
Pahang, Malaysia
3Faculty of Dentistry, Jordan University of Science and Technology, P.O. Box 3030 Irbid 22110, Jordan
(*Corresponding author’s e-mail: jaimetex@yahoo.com)
Received: 1 February 2017, Revised: 11 February 2017, Accepted: 17 February 2017
Abstract
Hardly a day goes by in which academics do not receive one or more emails inviting them to attend a congress, meeting or symposium (CMS). Increasingly, these invitations are for CMSs that lie beyond the scope of the recipient's field of research, and they are usually characterized by images of grandeur and finesse, enticing the invitee with claims of international status, the pompous nature of the steering committee, or the meeting's sheer size and dimension, including a list of famed participants. In other
cases, emphasis is placed instead on the exotic nature of the location, and the invitation often sounds more
like a travel brochure than an invitation to join a professional CMS. In several cases, a promise to publish
the CMS proceedings in an indexed database is made. It is difficult to judge the veracity and significance
of such meetings at a distance, even more so through an email. However, when the balance sheet is drawn up and the costs are assessed, including those of travel, accommodation and meals, it becomes clear that most CMSs are simply traps to make money, and that true academic discovery is a secondary, or more distant, objective. This article draws readers' attention to the need to make a cost-benefit analysis, based on the criteria that we present, before deciding whether or not to attend a CMS.
Keywords: Congress, meeting or symposium (CMS), costs, predatory, scrutiny
The broader academic context of congresses, meetings and symposia
Most scientists appreciate recognition for their efforts. However, in the modern era of science, with
so many distorted and non-academic publishing incentives, including the gaming of the Thomson Reuters
journal impact factor and now the Elsevier/Scopus CiteScore [1], it is rare to find scientists who publish
or research altruistically, and who think only of the pure nature of science, or of science discovery. This is
because such scientists, especially those with noble objectives, would not be able to attract funding to
make their objectives a reality. The sad reality of science is the intricate link between well-funded science
and the success of science publishing, and vice versa. Those who do not publish, or demonstrably show
their productivity, primarily through published papers, will receive little or no funding, and may become
redundant scientists [2]. Such scientists will thus not survive, i.e., they will become victims of science's classic "publish or perish" mantra. The phenomenon of publication phishing via email, in
which unethical publishers take advantage of those who are seeking a home to share, present or publish
their research and studies [3], is a potential business for the organizers of a predatory congress, meeting or
symposium (CMS).
Many universities, striving to improve their visibility [4], still recognize a proceedings paper that
results from a CMS as being a valid scientific publication with academic merit. However, academic trust
in some knowledge-sharing and publication platforms, among them CMSs, is being eroded [5]. In most cases, the universities' assumption of academic validity rests on the notion that a CMS is organized by experts with international and renowned backgrounds, and that submissions are properly vetted through peer review before a proceedings paper is published. Some CMS-derived proceedings papers are used as
selection criteria for obtaining a masters or PhD degree, or even research grants, and thus the academic
validity of a CMS must be fully vetted, verified and validated before a scientist becomes an attendee. Any
scientist who has attended a truly scholarly meeting organized exclusively by peers or an academic
society can attest to a productive and stimulating encounter, in which ideas are shared face to face, exchanged, enhanced and enriched. However, some academics believe that the trustworthiness of all CMSs is harmed by a few predatory or unprofessional ones. Very sadly, a sector of the fake or dishonest economy has seen this weakness in science and is preying upon the desperate need of scientists and their institutes to publish in order to be perceived as productive, or important, for example through the "honor" of being an "invited speaker" [6].
The authors are aware of CMS articles and proceedings that most probably never went through a
proper scientific peer review before presentation and/or publication. Be that as it may, this predatory behavior has led to the mushrooming of an entire industry of CMSs that attempt to lure scientists, and their own or their institutes' money, by recruiting them as speakers and then charging them fees that generate a profit for the organizers [7]. One such experience was recently reported by a SpringerNature
Editor-in-Chief, Roger W. Byard, who had been invited to participate in a conference on “coastal zones”
because of his apparent academic standing in a field he had never written about [8]. On a daily basis - and
the numbers will undoubtedly be higher for those who publish more extensively because their emails will
be trawled by bots more frequently - scientists are receiving emails inviting them to CMSs, usually with
enticing, but in many cases non-academic, benefits. A priori, it is evident that a scientist will not attend a
meeting that proves to be boring or bland, so there is always an element of luster, even in valid academic
congresses. Because travel has become a luxury only for those with the economic means, and because scientists' travel might be limited to one or a few CMSs a year - usually funded by travel or research grants paid for by universities - scientists also want to ensure that such a trip is not only
academically fulfilling, but also a travel experience with “an all-expenses-paid trip to a vacation
destination” [9]. In the latter, it is implied that culinary, cultural or theatrical aspects of the social program
of a CMS also serve to boost the ambiance with which academic information is shared among
participants. A fun or diverse CMS can stimulate greater discussion since participants are motivated. However, if the balance of academic content and fun is distorted and weighs more heavily towards the latter, then the true objectives of a CMS become clouded and its academic objectives become diluted.
The emergence and growth of academically questionable congresses, meetings and symposia
We now focus on a few recent events that have cast doubt on the academic nature of CMSs. The
first pertains to costs. No matter how productive a meeting is, one (including the institute where the
traveling scientists are based) has the responsibility of calculating the cost of participation, i.e., the cost
per poster or per oral presentation. A rough estimate of hundreds or thousands of US$, depending on
several factors, for a single poster or oral presentation, is a sign that such CMSs are excessive and pompous,
i.e., still a privilege for the elite academic minority [10]. These issues and concerns become even more
pertinent when tax-payers' money (i.e., public funding) is involved. Evidently, when scientists have to fork out large amounts of their own money to attend a CMS, they will no doubt reflect extremely carefully on the cost-benefit ratio of that investment. Most likely, self-funded
scientists will rarely attend CMSs. So, when funding is provided by their research institute or donors, and
even more so when such funding is derived from tax-payers’ money, then there needs to be very careful
reflection. When funds are squandered using someone else’s budget, there tends to be a frivolous or
nonchalant attitude. And here, it is the responsibility of research institutes to instill strict and careful
control over the selection of CMSs that are attended by their faculty and staff. Scientists and research
institutes must do a cost-benefit analysis to ascertain whether using hundreds or thousands of US$ per
scientist per meeting is worthwhile for one or two posters or oral presentations, especially given the fact
that open access publishing, social media and online CMSs can achieve the same - or a greater - publicity
effect, at a fraction of the cost. Scientists and institutes that have bad management skills are as much of a
problem - or are a source of the problem - as the fake or predatory CMSs that try to lure their money. As a
result, a scientist's and their institute's reputations may be negatively affected [6].
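As a purely illustrative sketch of the cost-benefit arithmetic recommended above (the figures and the helper function are hypothetical, not part of any formal guideline), the following Python snippet totals the direct costs of attendance and expresses them per poster or oral presentation:

def cost_per_presentation(registration_usd, travel_usd, accommodation_usd,
                          meals_usd, n_presentations):
    # Total the direct costs of attending the CMS and express them per
    # poster or oral presentation delivered.
    if n_presentations < 1:
        raise ValueError("at least one poster or oral presentation expected")
    total = registration_usd + travel_usd + accommodation_usd + meals_usd
    return total, total / n_presentations

# Hypothetical figures for a single oral presentation at an overseas CMS.
total, per_talk = cost_per_presentation(registration_usd=600, travel_usd=1200,
                                        accommodation_usd=450, meals_usd=150,
                                        n_presentations=1)
print("Total outlay: US$%.0f; cost per presentation: US$%.0f" % (total, per_talk))

If the resulting cost per presentation runs into the hundreds or thousands of US$, the comparison with the near-zero cost of open access publishing, social media or online CMSs becomes straightforward.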
Such fake or predatory CMSs, which Sorooshian aptly described as “conference wolves in sheep’s
clothing” that “transformed their event into moneymaking machines to collect substantial registration fees
from authors in an unethical manner” [11], form part of a wider budding scholarly black market [5]. In
June 2016, James McCrostie published a provisional list of criteria on the Beall blog to characterize such
predatory CMSs, but the Beall blog was terminated on January 15, 2017, so we discuss these
criteria in detail in the last section of this paper. This was preceded, just a few months earlier, by a white
paper [12] representing a cooperative effort by the ASCE (American Society of Civil Engineers),
Elsevier, the IEEE (Institute of Electrical and Electronics Engineers) and the IET (Institution of
Engineering and Technology) to hammer out criteria to boost the academic robustness of CMSs. It is
highly likely that this white paper emerged in response to the IEEE and Springer scandals that involved
the retraction of hundreds of poorly vetted and unscholarly papers [13], or thousands of IEEE meeting
abstracts [14]. These abuses of the scholarly process, including fake peer reviews and abused submission
systems [15], are rapidly corroding trust in the academic veracity of so-called peer-reviewed journals, or
proceedings derived from CMSs.
A new aspect of concern relates to the abuse of congresses as a possible repository for fake or fictional articles, as was recently revealed by the acceptance of a nonsense paper generated by iOS autocomplete software at an international conference [16]. The additional problem with that particular case was the
use of a fictitious email account by a professor to complete the submission, highlighting the risks of
submission abuses by scientists, either as hoaxes, or for non-academic mischievous purposes [17,18].
Possible ways to measure, or assess, the predatory nature of a CMS: Suggestions and limitations
Table 1 lists a revised set of criteria originally devised by James McCrostie based on colour coding, but expanded here to include three colour codes instead of two. Several of McCrostie's initial criteria were, according to McCrostie, based on a 2015 source: "Drawing partly on the document "Recommended Practices to Ensure Technical Conference Content Quality", originally presented by Gordon MacPherson at the 4th World Conference on Research Integrity, Rio de Janeiro, Brazil, June 2, 2015" [19]. The criteria indicated in Table 1 are not formally established; they would most likely require additional input from a wider range of academics, and would eventually need the formal backing of official academic institutes across a range of countries to legitimize them before they are used for official purposes.
Conclusions
Evidently, fake or predatory CMSs are eroding trust in scholarly communication, because they exist primarily to attract funding, with academic content as a secondary objective, making it difficult for junior researchers to determine which conference is legitimate and which is predatory. The marketing ploys used to attract participants, however, would lead an invitee to believe the opposite, i.e., that their
participation would be a positive academic advancement. In addition, they impose a real financial burden
on university administrators who are involved in making budgetary decisions. Therefore, all parties
involved in budget planning and execution should pay close attention to the negative consequences of
funding their academics' attendance at predatory CMSs, and should implement policies to track and prevent
such unnecessary expenses. The topic of predatory CMSs is still relatively young, and some crude criteria
for determining the predatory nature of such meetings exist, as we have listed above and in Table 1,
based primarily on McCrostie’s lists. Joining such CMSs may constitute a risk to academia, and a waste
of money. We caution readers, however, that observation of the invitation email or website alone might not constitute sufficient proof of predatory CMS behavior, and that multiple factors should be taken into consideration when assessing whether a CMS is predatory, or not, and thus worth the investment, both
financially and in terms of dedicated time.
Acknowledgements
The authors thank James McCrostie (Daito Bunka University, Japan) for discussion on this topic,
and for providing explicit permission to use the revised set of criteria set out in Table 1.
Conflicts of interest
The authors declare no conflicts of interest.
References
[1] JA Teixeira da Silva and AR Memon. CiteScore: A cite for sore eyes, or a valuable, transparent
metric? Scientometrics 2017, DOI: 10.1007/s11192-017-2250-0.
[2] G Sharrock. Communicating spending cuts: Lessons for Australian university leaders. J. Higher Ed.
Policy Man. 2014; 36, 338-54.
[3] S Sorooshian. Publication phishing: A growing challenge for researchers and scientific societies.
Curr. Sci. 2016; 110, 766-7.
[4] M Schulte. Toot your own horn! Share your successes through publication, conferences, awards,
and professional development. J. Cont. Higher Ed. 2016; 64, 133-6.
[5] S Sorooshian. Conference wolves in sheep’s clothing. Sci. Eng. Ethics 2017, DOI: 10.1007/s11948-
016-9788-8.
[6] J McCrostie. Warning: Conmen and shameless scholars operate in this area. Available at:
https://www.timeshighereducation.com/comment/warning-conmen-and-shameless-scholars-operate-
area, accessed February 2017.
[7] JD Bowman. Predatory publishing, questionable peer review, and fraudulent conferences. Amer. J.
Pharm. Ed. 2014; 78, 176.
[8] RW Byard. The forensic implications of predatory publishing. Forensic Sci. Med. Pathol. 2016; 12,
391-3.
[9] M Brooks. Red-flag conferences. Chronicle Higher Ed. 2009. Available at:
https://chronicle.com/article/Red-Flag-Conferences/44795, accessed February 2017.
[10] JA Teixeira da Silva. Are international symposia becoming redundant and elitist? Asian Australas.
J. Plant Sci. Biotechnol. 2013; 7, 114-5.
[11] S Sorooshian. Scholarly black market. Sci. Eng. Ethics 2017, DOI: 10.1007/s11948-016-9765-2.
[12] B Kulamer, W Meester, J Salk, N Blair-DeLeon, G MacPherson, W Moses, A Philippidis, C
Chapman, D Smith, I Stoneham and K Vukmirovic. Recommended practices to ensure technical
conference content quality. Available at: https://www.ieee.org/conferences_events/conferences/
publishing/paper_acceptance_criteria.pdf, accessed February 2017.
[13] R Van Noorden. Publishers withdraw more than 120 gibberish papers. Nature 2014, DOI:
10.1038/nature.2014.14763.
[14] A McCook. One publisher appears to have retracted thousands of meeting abstracts. Yes, thousands.
Available at: http://retractionwatch.com/2015/06/25/one-publisher-appears-to-have-retracted-
thousands-of-meeting-abstracts-yes-thousands, accessed February 2017.
[15] JA Teixeira da Silva. On the abuse of online submission systems, fake peer reviews and editor-
created accounts. Persona Bioética 2016; 20, 151-8.
[16] E Hunt. Nonsense paper written by iOS autocomplete accepted for conference. Available at:
https://www.theguardian.com/science/2016/oct/22/nonsense-paper-written-by-ios-autocomplete-
accepted-for-conference, accessed February 2017.
[17] A Al-Khatib and JA Teixeira da Silva. Stings, hoaxes and irony breach the trust inherent in
scientific publishing. Publishing Res. Quart. 2016; 32, 208-19.
[18] JA Teixeira da Silva and A Al-Khatib. The macro and micro scale of open access predation.
Publishing Res. Quart. 2017. DOI: 10.1007/s12109-016-9495-y.
[19] G MacPherson. Recommended practices to ensure conference content quality. Res. Integrity Peer
Rev. 2016; 1, P8.
Table 1 Proposed criteria to identify predatory characteristics in a congress, meeting or symposium
(CMS), or a CMS organizer.
Red level criteria
1. The use of deceit
1.1. Related to the CMS
Claiming to be a non-profit organization when the organizer is a for-profit company.
Hiding or obscuring relationships with for-profit partner companies.
Falsely claiming universities or other organizations as partners or sponsors.
Listing addresses or phone numbers that are nonexistent or false.
Using organization names or addresses that imply they are based in one country or region when in fact
they operate out of a different country or region.
Lying to CMS participants about any aspect of that conference, or failing to correct known factual errors.
1.2. Related to the CMS organizers or organizing committee
Falsely claiming the involvement of people on advisory boards or organizing committees.
Using fake names to hide the identity of organizers or their country of origin.
Listing organizers or participants falsely (i.e., non-participants), or listing renowned individuals without
their knowledge or permission.
Failing to list the names, addresses and affiliations of individuals owning or controlling the organization.
Organizers falsely claiming academic positions or academic qualifications.
2. No, inadequate or poorly vetted peer review
Machine-generated or other “sting” abstracts or papers get accepted.
Organizers market the CMS as being peer-reviewed when no peer review occurs, when peers are not true peers, or when the qualifications of peers are not fully vetted.
The CMS is listed as peer-reviewed but peer review does not take place, or the conference organizing company uses employees to handle submissions and complete "reviews".
Peer reviewers read, judge and select proceedings papers based exclusively on abstracts, or with insufficient credentials or experience to do so, i.e., vetting is absent or inadequate.
Accepting papers for the CMS proceedings that have not been presented at the CMS, either as a poster, or
as an oral presentation.
3. Issues with conference proceedings and publications
The CMS organizer publishes a proceedings that consists of non-peer-reviewed papers.
The organizer promises that papers will be published in an unnamed journal indexed in ISI, SCOPUS, or
some other commonly-used whitelist.
4. Links to other predatory CMSs, publishers or journals
Conference papers get funneled to known or suspected predatory journals knowingly, or unknowingly.
5. Virtual presentations
Acceptance of virtual presentations that are not presented to an audience.
6. Miscellaneous
The conference organizer(s) and/or director(s) possess no or only tangential expertise in the conference subject matter, or their academic record cannot be verified due to the use of abbreviated names.
Participants are charged additional fees (per author or per paper) when authoring or co-authoring more
than one CMS paper.
CMS organizers cancel the CMS or change the venue at short notice, or without notice.
Orange level criteria
1. CMS ID
The name of the CMS matches or nearly matches the name of another established, respected CMS.
2. CMS proceedings and journal duplication
The CMS organizer allows CMS papers to be published twice (in the official CMS proceedings and in a separate journal published by the CMS organizer) without due cross-referencing and citation of the original.¹
3. CMS leadership reputational history
CMS chairs, session chairs, keynote speakers, or CMS proceedings editors have connections to other predatory CMSs or journals.²
4. Virtual presentations
Virtual presentation papers get published in CMS proceedings without being identified as such.
Yellow level criteria
1. Fees
The CMS fee is unjustifiably high.
Presenters pay more than attendees.
The CMS organizer focuses more on selling dinners and associated tours than on the CMS program.
2. CMS scope
The CMS is overly broad in scope, or combines radically different fields, e.g., business and engineering.
A single organization holds conferences in very different fields.
The organizer holds more than two conferences at the same time and place.
The same conference is held several times a year in different cities.
3. Acceptance of CMS proceedings
Immediate or almost immediate acceptance of proceedings papers.
Regular extensions to the “call for papers” submission deadline or accepting papers after the deadline.
Accepting proceedings papers just a few days before the deadline.
Using undergraduate or master’s students as peer reviewers without oversight from university faculty.
4. Miscellaneous
The CMS organizer regularly sends spam emails to people outside the CMS’s field of focus.
The name of the person or organization acting as the “registrant” for the CMS website or CMS organizer
website is hidden on website registry documents.
The CMS is marketed as a holiday. CMS websites and emails resemble travel brochures rather than CMS
notices.
Opening, officiating, closing, and keynote speeches (if any) are presented by relatively unknown
(globally) scholars (except for where the CMS focuses explicitly on a local audience).
Overuse of the term “international” in the organization name or CMS title when the CMS organizer
and/or attendees overwhelmingly come from a single country.
Awarding best paper prizes before the end of the CMS or awarding multiple “best paper” prizes.
When CMS proceedings are published only digitally, no attempt is made to electronically preserve them.
No attempts are made to distribute CMS proceedings beyond the CMS participants.
CMS organizers create a “society”, “association” or “institute” or some other organization and name it as
the sponsor or organizer of the CMS.
No clear CMS chair or director is identified.
Insufficient contact details are given for the organizer or CMS, the organization's headquarters location is obscured by using P.O. boxes or virtual offices, or the listed office is in reality a private home.
CMS schedule is overly vague, consisting of only times and the type of activity, e.g., 9:30-10:30 Paper
Session 1. 10:30-10:45 Break. 10:45-11:45 Paper Session 2.
Including logos of databases or indexing agencies on CMS websites when no indexing will occur.
CMS websites and/or emails contain several spelling mistakes, grammar mistakes, or non-native English.
¹ This may depend on the copyright permission, CC-BY license, and other factors, including possible cronyism, links and inappropriate relationships (e.g., friendships) between CMS organizers and journal editors.
² If only an isolated case exists, then the link between an individual and an apparently predatory journal or publisher might not indicate anything at all about that individual; instead, a consistent pattern of participation in, or support of, academically suspect journals or publishers should be sought.
Table 1 notes:
Criteria represent a modified version of a list of criteria provided, with permission, by James McCrostie.
McCrostie’s list will appear in a special issue of the Bulletin of Daito Bunka University, Vol. 56, 2017,
entitled “Developing a Criteria for Identifying Predatory Conferences”. Additional characteristics have
also been added by the authors and several criteria provided by McCrostie have been omitted or modified
for clarity, or to accommodate a three-colour set of selection criteria. Red level criteria indicate serious
predatory aspects or behavior. It is suggested that even just one of these behaviors merits the label of a
“predatory” CMS. In contrast, yellow level criteria constitute predatory aspects that, in themselves, might
not make the entire CMS predatory, but should raise concerns among academics that plan to attend them.
Orange level criteria show predatory characteristics that are intermediate between red and yellow criteria,
and may need a closer case-by-case analysis. Even though McCrostie suggested that 4 yellow level
criteria would result in a CMS being considered "predatory", we believe that a set number of criteria should not be used initially to make such an assessment, and that each institute should instead establish a committee responsible for vetting and evaluating CMSs against an established threshold of red, orange and yellow criteria, beyond which an academic cannot attend that CMS. Similarly, a CMS that displays no red or
orange level criteria, and only very few yellow criteria, could receive institution-based funding, as an
incentive for academics to select carefully. If academic institutes implement such a rigorous and strict
vetting process, the hope is that predatory CMSs will eventually disappear.
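To illustrate how such an institution-level vetting rule might be operationalized, the following Python sketch (a hypothetical illustration, not a formal instrument; the orange and yellow thresholds are placeholder values that each institute would set for itself) treats any single red criterion from Table 1 as disqualifying and refers borderline cases to committee review:

from dataclasses import dataclass

@dataclass
class CMSAssessment:
    red: int     # number of red level criteria observed (Table 1)
    orange: int  # number of orange level criteria observed
    yellow: int  # number of yellow level criteria observed

def vet_cms(a: CMSAssessment, max_orange: int = 1, max_yellow: int = 3) -> str:
    # A single red criterion is treated as disqualifying, following the notes
    # above; the orange and yellow limits are institution-defined placeholders.
    if a.red >= 1:
        return "predatory: do not fund attendance"
    if a.orange > max_orange or a.yellow > max_yellow:
        return "suspect: refer to the vetting committee for case-by-case review"
    return "no red or orange flags and few yellow flags: eligible for funding"

# Hypothetical example: no red criteria, one orange and two yellow criteria.
print(vet_cms(CMSAssessment(red=0, orange=1, yellow=2)))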