Predatory journals, publishers and conferences are on the rise and becoming increasingly sophisticated. These practices prey on the pressure researchers feel to publish and present their work, and include, among other tactics: pay-to-publish and pay-to-present models without peer review; fake editorial boards listing respected scientists; fraudulent impact factors; journal and conference names deceptively similar to those of legitimate ones; and spam invitations to sham conferences with high registration fees.
We recently hosted an event on behalf of the InterAcademy Partnership (IAP) on this topic. Titled ‘Predatory Academic Journals and Conferences: Raising Awareness and Understanding’, the event provided valuable insights from leading regional experts concerning the impact and implications of predatory conferences, and methods to combat these practices.
We were honoured to participate in the event, as well as to host it, and took the opportunity to discuss Think.Check.Attend., a free initiative which we launched in 201.
Alongside predatory journals, predatory conferences pose a real and rising threat to individual researchers and to the academic ecosystem as a whole. The organisers of these events don’t simply steal delegates’ money; they also rob them of the chance to present their research to an audience of their peers, to gain valuable feedback, to participate in important discussions and debates in their field, and to build the networks that support career development. Already growing rapidly, predatory conferences have been given a further boost by Covid, as it is far easier to take advantage of people at a digital conference: with videos off and no face-to-face meetings, delegates might never even realise they’ve been conned!
A sister initiative to Think.Check.Submit. (designed to combat predatory journal practices), Think.Check.Attend. was developed to raise awareness of predatory practices and to enable researchers to identify and avoid the predatory conference organisers lurking in the academic ecosystem, waiting to pounce on the unwary…
In the last 2 years alone, over 22,000 people have visited the website and made use of our criteria to evaluate conferences. The initiative involves 3 simple steps:
Step 1: Encourage awareness of predatory conferences by asking researchers to THINK about whether this is the right conference for them to attend and present their research.
Step 2: Suggest that researchers use our Conference Checker tool to CHECK a conference and determine whether it is legitimate and the organisers are trustworthy.
Step 3: Advise researchers to only ATTEND a verified, credible conference that offers valuable networking, discussion, and career advancement opportunities.
Researchers considering an event can visit the website and use the Conference Checker tool to assess it. Our current format is a simple, easy-to-use checklist of criteria with ‘yes’ / ‘no’ answer options: ‘yes’ equates to 1 point and ‘no’ to zero points. High-scoring conferences display more credibility markers than low-scoring conferences, which are more likely to be predatory.
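To make the arithmetic concrete, here is a minimal Python sketch of this kind of checklist scoring. The criteria shown are paraphrased examples for illustration only, not the checklist’s exact wording.

```python
# A minimal sketch of a yes/no checklist score: each 'yes' answer scores
# 1 point, each 'no' scores 0. The criteria are illustrative examples.

CRITERIA = [
    "Is the conference organised by a recognised society or institution?",
    "Is the organising committee listed, with verifiable affiliations?",
    "Is there a clear peer-review process for submitted abstracts?",
    "Are the venue, dates and registration fees stated transparently?",
    "Has the conference run successfully in previous years?",
]

def score_conference(answers):
    """Sum 1 point for every criterion answered 'yes' (True)."""
    return sum(1 for criterion in CRITERIA if answers.get(criterion, False))

# A conference meeting four of the five example criteria scores 4 out of 5,
# displaying more credibility markers than one scoring 1 or 2.
example_answers = {criterion: True for criterion in CRITERIA[:4]}
print(score_conference(example_answers))  # -> 4
```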
However, as the recent IAP report highlighted, predatory conference organisers are becoming increasingly sophisticated in their techniques and harder to identify, especially as many of the warning signs described in the report are only revealed during the event itself, after payment has been collected! For example:
• Failing to conduct rigorous peer review of abstract proposals
• Having a disproportionately high acceptance rate – or a very low attendance rate
• Presenting low quality research papers
• Event cancellations with limited notice and no refunds
It is even harder to spot the fakes because many well-established, legitimate conferences do not display any credibility markers at all, perhaps because the organisers are not tech-literate, or because the event serves a small subfield where everyone knows each other and the organisers rely on Word documents and word-of-mouth instead of websites.
We are therefore developing and improving the Conference Checker tool, and are currently working on a new matrix based on the spectrum approach proposed in the IAP report. This will entail a more complex scoring system, with different categories that can be rated individually, as sketched below. Our hope is that this will produce a more effective tool to combat modern predatory conference practices.
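As a rough illustration of what a category-based spectrum matrix could look like, the sketch below rates each category individually and combines the ratings into a weighted position on a spectrum rather than a binary verdict. The category names, rating scale and weights are assumptions for illustration only; the actual matrix is still under development.

```python
# Hypothetical sketch of a 'spectrum' scoring matrix: categories are rated
# individually and weighted, producing a position on a spectrum rather than
# a simple yes/no tally. Names, scales and weights are illustrative.

from dataclasses import dataclass

@dataclass
class CategoryRating:
    name: str
    rating: int    # e.g. 0 (no evidence of credibility) to 3 (strong evidence)
    weight: float  # relative importance of the category

def spectrum_score(ratings, max_rating=3):
    """Return a weighted score between 0 (likely predatory) and 1 (credible)."""
    total_weight = sum(r.weight for r in ratings)
    weighted = sum((r.rating / max_rating) * r.weight for r in ratings)
    return weighted / total_weight

ratings = [
    CategoryRating("Peer review and programme quality", rating=1, weight=2.0),
    CategoryRating("Organiser transparency", rating=2, weight=1.5),
    CategoryRating("Fees, cancellation and refund policy", rating=0, weight=1.0),
    CategoryRating("Track record of previous events", rating=3, weight=1.0),
]
print(f"{spectrum_score(ratings):.2f}")  # a position on the spectrum, not a verdict
```

Rating categories individually would mean that a single missing marker, such as a sparse website, need not sink an otherwise credible event, which speaks to the problem described above.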
Ultimately, our goal is to empower researchers to make informed decisions, helping them identify and avoid predatory conferences.