Another
concerning aspect of Beall’s work is his evaluation of OA publishers
from less economically developed countries. Walt Crawford, Karen Coyle, and Jill Emery have
all noted Beall's bias against these publishers. Imperfect English
or a predominantly non-Western editorial board does not make a journal
predatory. An instructive example is Hindawi, an Egyptian publisher that was once
considered predatory but improved its practices and standards over time.
If we accept that there is a continuum from devious and duplicitous to
simply low-quality and amateurish, then it is likely, as Crawford believes, that some of the publishers on Beall’s
list are not actually predatory. Although Beall’s contributions are
arguably compromised by his attitudes about OA, the criteria he uses for his list are
an excellent starting point for thinking about the hallmarks of predatory
publishers and journals. He encourages thorough analysis, including
scrutiny of editorial boards and business practices. Some of his red flags
provide a lot of “bang for your buck” in that they are both easy to
spot and likely to indicate a predatory operation. These include editors
or editorial board members with no or fake academic affiliations, lack of
clarity about fees, publisher names and journal titles with geographic
terms that have no connection to the publisher’s physical location or
journal’s geographic scope, bogus impact factor claims and invented
metrics, and false claims about where the journal is indexed. Beall
also lists common practices indicative of low-quality but not necessarily
predatory journals. He is rightfully wary of journals that solicit
manuscripts by spamming researchers, as established publishers generally
do not approach scholars, as well as publishers or editors with email
addresses from Gmail, Yahoo, etc. Also, he wisely warns researchers away
from journals with bizarrely broad or disjointed scopes and journals that
boast extremely rapid publication, which usually suggests cursory or nonexistent peer review.
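
These red flags and weaker warning signs lend themselves to a simple screening checklist. The sketch below, in Python, is only an illustration: the field names and the strong/weak grouping are assumptions made for the example, not Beall's published criteria or any existing tool.

    from dataclasses import dataclass

    @dataclass
    class JournalProfile:
        # Illustrative fields; the names are assumptions, not a published schema.
        verified_board_affiliations: bool  # editors have real, checkable affiliations
        fees_clearly_stated: bool          # APCs disclosed up front
        truthful_geographic_name: bool     # geographic terms match location or scope
        verifiable_metrics: bool           # impact factor and indexing claims check out
        solicits_by_spam: bool             # unsolicited mass emails to researchers
        freemail_contact: bool             # Gmail- or Yahoo-style contact addresses
        coherent_scope: bool               # focused, not bizarrely broad or disjointed
        promises_rapid_publication: bool   # boasts of extremely fast turnaround

    def screen(j: JournalProfile) -> list[str]:
        """Return the warning signs a journal exhibits; an empty list is a pass."""
        flags = []
        # Strong red flags: easy to spot and likely to indicate predation.
        if not j.verified_board_affiliations:
            flags.append("board members with no or fake affiliations")
        if not j.fees_clearly_stated:
            flags.append("lack of clarity about fees")
        if not j.truthful_geographic_name:
            flags.append("misleading geographic name")
        if not j.verifiable_metrics:
            flags.append("bogus impact factor or indexing claims")
        # Weaker signs: low quality, not necessarily predatory.
        if j.solicits_by_spam:
            flags.append("solicits manuscripts by spam")
        if j.freemail_contact:
            flags.append("free webmail contact address")
        if not j.coherent_scope:
            flags.append("bizarrely broad or disjointed scope")
        if j.promises_rapid_publication:
            flags.append("boasts extremely rapid publication")
        return flags

A journal that trips several of the strong flags warrants real suspicion; the weaker signs on their own suggest amateurishness rather than predation, consistent with the continuum described above.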

Given the fuzziness between low-quality and predatory publishers, whitelisting,
or listing publishers and journals that have been vetted and verified as
satisfying certain standards, may be a better solution than blacklisting.
The central player in the whitelisting movement is the Directory of Open
Access Journals (DOAJ). In response to the Bohannon sting, DOAJ removed 114 journals and
revamped its criteria for inclusion. Journals accepted into DOAJ
after March 2014 under the stricter rules are marked with a green tick
symbol, and DOAJ has announced that it will require the remaining 99% of
its listed journals to reapply for acceptance.

At
the basic level, a journal must be chiefly scholarly; make the content
immediately available (i.e., no embargoes); provide quality control
through an editor, editorial board, and peer review; have a registered
International Standard Serial Number (ISSN); and exercise transparency
about article processing charges (APCs). Journals that meet additional requirements, such as providing
external archiving and creating persistent links, are recognized with the
DOAJ Seal.
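
DOAJ's two tiers can likewise be pictured as a simple check. This is a rough sketch under stated assumptions: the attribute names are invented for the example, and the real Seal involves more requirements than the two named above.

    from dataclasses import dataclass

    @dataclass
    class OAJournal:
        # Illustrative attributes; the names are assumptions, not DOAJ's schema.
        chiefly_scholarly: bool
        no_embargoes: bool        # content immediately available
        quality_control: bool     # editor, editorial board, and peer review
        registered_issn: bool
        transparent_apcs: bool
        external_archiving: bool  # Seal example from the text
        persistent_links: bool    # Seal example from the text

    def doaj_tier(j: OAJournal) -> str:
        """Classify a journal as 'rejected', 'basic', or 'seal' (two-tier model)."""
        basic = all([j.chiefly_scholarly, j.no_embargoes, j.quality_control,
                     j.registered_issn, j.transparent_apcs])
        if not basic:
            return "rejected"
        if j.external_archiving and j.persistent_links:
            return "seal"
        return "basic"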
DOAJ receives an assist from the ISSN International Centre, which in 2014
added language reserving the right to deny ISSNs to
publishers that provide misleading information.

An
organization that whitelists publishers by accepting them as members is
the Open Access Scholarly Publishers Association (OASPA). Members must
apply and pledge to adhere to a code
of conduct that
disallows any form of predatory behavior. OASPA has made errors in
vetting applicants, though: it admitted some publishers that it later had
to reject (e.g., Dove Medical Press).

Of course, no blacklist or whitelist can substitute for head-on
investigation of a journal. Open Access Journal Quality Indicators, a
rubric by Sarah Beaubien and Max Eckard featuring
both positive and negative journal characteristics, can help researchers
perform such evaluation. Furthermore, any tool or practice that gives
researchers more information is a boon. For example, altmetrics provide a
broad picture of an article’s impact (not necessarily correlated to its
quality), and open peer review—i.e., any form of peer review where the
reviewer’s identity is not hidden—increases transparency and allows
journals to demonstrate
their standards.

About the Authors

Monica
Berger is Associate Professor and
Electronic Resources and Technical Services Librarian at New York City
College of Technology, CUNY. Her academic interests include scholarly
communications as well as popular music.

Jill
Cirasella is the Associate Librarian for Public Services and Scholarly
Communication at the Graduate Center, CUNY, where she leads numerous
scholarly communications initiatives, including the GC’s new
institutional repository, Academic
Works. Jill is a vocal advocate of open access and seeks to
promote understanding and adoption of open access at CUNY and beyond.