VIA GIZMODO: 120 scientific papers were created with a "gibberish generator" and nobody noticed.

by jonasscherer

On Saturday (1 March), Gizmodo Brasil ran a story that had previously appeared in Nature (yes, that is internet journalism for you) about something common but rarely highlighted: many "scientific" papers are pure gibberish and nobody notices. Below I reproduce the Gizmodo Brasil piece and, after it, the Nature article:

120 scientific papers were created with a "gibberish generator" and nobody noticed – BY ASHLEY FEINBERG – GIZMODO BRASIL

This week, Nature revealed that the scientific publishers Springer and IEEE have removed more than 120 papers published between 2008 and 2013, after discovering that each one was meaningless jargon generated automatically by computer.

The massive oversight was uncovered by computer scientist Cyril Labbé, who has spent the past two years cataloguing these papers.

The texts were produced with an MIT program called SCIgen, which anyone can download and use. It was created in 2005 to show that academic conferences routinely accept papers that make no sense. Labbé built a website where users can test whether papers were created with SCIgen; he tells Nature that they "are quite easy to spot".

Labbé says he does not know why the papers were submitted, or even whether the listed authors were aware of them. But how can something like this get published in serious outlets? Part of the scheme's ingenuity is that, at least to the untrained eye, the papers look plausible.

For example, one of the published papers, from an engineering conference in China, is titled "TIC: a methodology for the construction of e-commerce". Vague, but plausible-sounding. The abstract, however, already raises eyebrows:

In recent years, much research has been devoted to the creation of public-private key pairs; on the other hand, few have synthesized the visualization of the producer-consumer problem. Given the current status of efficient archetypes, leading analysts famously desire the emulation of congestion control, which embodies the key principles of hardware and architecture. In our research, we concentrate our efforts on disproving that spreadsheets can be made knowledge-based, empathic, and compact.

Basically, something straight out of a gibberish generator. According to Nature, most of the papers came from conferences held in China, and most list authors with Chinese affiliations. Still, nobody knows for sure who is behind the scandal.

Sixteen of the papers were published by Springer, while more than 100 came from the IEEE. The problem is that these studies were supposedly peer reviewed: they pass through the scrutiny of one or more scholars of the same standing as the author, usually anonymously. That is why the publishers are struggling to explain exactly how this happened.

In other words, if you have ever had to read a scientific paper and ended up somewhat (very) confused, don't feel too bad: the text may have been computer-generated nonsense.

Here is the fuller Nature report, in English.

Publishers withdraw more than 120 gibberish papers

The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense.

Over the past two years, computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronic Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbé, say that they are now removing the papers.

Among the works was, for example, a paper published as a proceeding from the 2013 International Conference on Quality, Reliability, Risk, Maintenance, and Safety Engineering, held in Chengdu, China. (The conference website says that all manuscripts are “reviewed for merits and contents”.) The authors of the paper, entitled ‘TIC: a methodology for the construction of e-commerce’, write in the abstract that they “concentrate our efforts on disproving that spreadsheets can be made knowledge-based, empathic, and compact”. (Nature News has attempted to contact the conference organizers and named authors of the paper but received no reply*; however, at least some of the names belong to real people. The IEEE has now removed the paper.)

*Update: One of the named authors replied to Nature News on 25 February. He said that he first learned of the article when conference organizers notified his university in December 2013; and that he does not know why he was a listed co-author on the paper. “The matter is being looked into by the related investigators,” he said.

How to create a nonsense paper

Labbé developed a way to automatically detect manuscripts composed by a piece of software called SCIgen, which randomly combines strings of words to produce fake computer-science papers. SCIgen was invented in 2005 by researchers at the Massachusetts Institute of Technology (MIT) in Cambridge to prove that conferences would accept meaningless papers — and, as they put it, “to maximize amusement” (see ‘Computer conference welcomes gobbledegook paper’). A related program generates random physics manuscript titles on the satirical website arXiv vs. snarXiv. SCIgen is free to download and use, and it is unclear how many people have done so, or for what purposes. SCIgen’s output has occasionally popped up at conferences, when researchers have submitted nonsense papers and then revealed the trick.
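SCIgen works by expanding hand-written grammar rules with random choices, filling out templates for titles, abstracts and citations with plausible-sounding jargon. The snippet below is a minimal Python sketch of that idea using a made-up toy grammar; it only illustrates grammar-based text generation and is not SCIgen's actual grammar or code.

```python
import random
import re

# Toy grammar in the spirit of SCIgen: every key is a non-terminal that maps
# to a list of possible expansions. Illustrative sketch only.
GRAMMAR = {
    "_SENTENCE": [
        "In recent years, much research has been devoted to _TOPIC; on the other hand, few have synthesized _TOPIC.",
        "We concentrate our efforts on disproving that _CLAIM.",
        "Given the current status of _TOPIC, leading analysts famously desire _TOPIC.",
    ],
    "_TOPIC": [
        "the construction of _BUZZWORD",
        "the emulation of _BUZZWORD",
        "the visualization of _BUZZWORD",
    ],
    "_CLAIM": [
        "spreadsheets can be made knowledge-based, empathic, and compact",
        "_BUZZWORD and _BUZZWORD are always incompatible",
    ],
    "_BUZZWORD": [
        "public-private key pairs",
        "congestion control",
        "the producer-consumer problem",
        "efficient archetypes",
    ],
}

NONTERMINAL = re.compile(r"_[A-Z]+")

def expand(text: str) -> str:
    """Replace non-terminals with randomly chosen productions until none remain."""
    while True:
        match = NONTERMINAL.search(text)
        if match is None:
            return text
        production = random.choice(GRAMMAR[match.group()])
        text = text[:match.start()] + production + text[match.end():]

if __name__ == "__main__":
    random.seed(0)
    for _ in range(3):
        print(expand("_SENTENCE"))
```

Because every sentence is grammatically well formed, the output reads smoothly to a skimming reviewer even though it carries no meaning, which is exactly the weakness SCIgen was built to expose.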

Labbé does not know why the papers were submitted — or even if the authors were aware of them. Most of the conferences took place in China, and most of the fake papers have authors with Chinese affiliations. Labbé has emailed editors and authors named in many of the papers and related conferences but received scant replies; one editor said that he did not work as a program chair at a particular conference, even though he was named as doing so, and another author claimed his paper was submitted on purpose to test out a conference, but did not respond on follow-up. Nature has not heard anything from a few enquiries.

“I wasn’t aware of the scale of the problem, but I knew it definitely happens. We do get occasional e-mails from good citizens letting us know where SCIgen papers show up,” says Jeremy Stribling, who co-wrote SCIgen when he was at MIT and now works at VMware, a software company in Palo Alto, California.

“The papers are quite easy to spot,” says Labbé, who has built a website where users can test whether papers have been created using SCIgen. His detection technique, described in a study published in Scientometrics in 2012, involves searching for characteristic vocabulary generated by SCIgen. Shortly before that paper was published, Labbé informed the IEEE of 85 fake papers he had found. Monika Stickel, director of corporate communications at IEEE, says that the publisher “took immediate action to remove the papers” and “refined our processes to prevent papers not meeting our standards from being published in the future”. In December 2013, Labbé informed the IEEE of another batch of apparent SCIgen articles he had found. Last week, those were also taken down, but the web pages for the removed articles give no explanation for their absence.
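As a rough illustration of what a vocabulary-based screen might look like, the sketch below flags a manuscript when several phrases typical of SCIgen output appear in it. The marker list and threshold are assumptions invented for this example; Labbé's published detector is considerably more refined.

```python
# Illustrative vocabulary-based screen: count how many phrases characteristic
# of SCIgen output appear in a manuscript. The marker phrases and threshold
# are assumptions for this sketch, not Labbé's actual method.
SCIGEN_MARKERS = [
    "we concentrate our efforts on disproving",
    "in recent years, much research has been devoted",
    "the producer-consumer problem",
    "a litany of previous work",
    "lookaside buffer",
]

def scigen_score(text: str) -> float:
    """Fraction of marker phrases found in the (lower-cased) text."""
    lowered = text.lower()
    hits = sum(phrase in lowered for phrase in SCIGEN_MARKERS)
    return hits / len(SCIGEN_MARKERS)

def looks_generated(text: str, threshold: float = 0.4) -> bool:
    """Heuristic flag: suspect the paper if enough markers appear."""
    return scigen_score(text) >= threshold

# Example: the abstract quoted above hits 2 of the 5 markers and is flagged.
abstract = ("In recent years, much research has been devoted to the "
            "construction of public-private key pairs... we concentrate our "
            "efforts on disproving that spreadsheets can be made "
            "knowledge-based, empathic, and compact.")
print(scigen_score(abstract), looks_generated(abstract))
```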

Ruth Francis, UK head of communications at Springer, says that the company has contacted editors, and is trying to contact authors, about the issues surrounding the articles that are coming down. The relevant conference proceedings were peer reviewed, she confirms — making it more mystifying that the papers were accepted.

The IEEE would not say, however, whether it had contacted the authors or editors of the suspected SCIgen papers, or whether submissions for the relevant conferences were supposed to be peer reviewed. “We continue to follow strict governance guidelines for evaluating IEEE conferences and publications,” Stickel said.

A long history of fakes

Labbé is no stranger to fake studies. In April 2010, he used SCIgen to generate 102 fake papers by a fictional author called Ike Antkare [see pdf]. Labbé showed how easy it was to add these fake papers to the Google Scholar database, boosting Ike Antkare's h-index, a measure of publication output and citation impact, to 94, which at the time made Antkare the world's 21st most highly cited scientist. Last year, researchers at the University of Granada, Spain, added to Labbé's work, boosting their own citation scores in Google Scholar by uploading six fake papers with long lists of citations to their own previous work.
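For readers unfamiliar with the metric: a researcher's h-index is the largest number h such that h of their papers have at least h citations each. The toy calculation below, with made-up citation counts, shows why a batch of fake papers that each cite the earlier ones pushes the index up.

```python
def h_index(citation_counts: list[int]) -> int:
    """h-index: the largest h such that at least h papers have >= h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Toy illustration (numbers are made up): a handful of genuine papers...
real = [12, 9, 7, 3, 1]
print(h_index(real))        # -> 3

# ...then six fake papers are indexed, each citing all the earlier ones,
# so every citation count rises and the h-index climbs with it.
inflated = [c + 6 for c in real] + [6, 5, 4, 3, 2, 1]
print(h_index(inflated))    # -> 6
```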

Labbé says that the latest discovery is merely one symptom of a “spamming war started at the heart of science” in which researchers feel pressured to rush out papers to publish as much as possible.

There is a long history of journalists and researchers getting spoof papers accepted in conferences or by journals to reveal weaknesses in academic quality controls — from a fake paper published by physicist Alan Sokal of New York University in the journal Social Text in 1996, to a sting operation by US reporter John Bohannon published in Science in 2013, in which he got more than 150 open-access journals to accept a deliberately flawed study for publication.

Labbé emphasizes that the nonsense computer science papers all appeared in subscription offerings. In his view, there is little evidence that open-access publishers — which charge fees to publish manuscripts — necessarily have less stringent peer review than subscription publishers.

Labbé adds that the nonsense papers were easy to detect using his tools, much like the plagiarism checkers that many publishers already employ. But because he could not automatically download all papers from the subscription databases, he cannot be sure that he has spotted every SCIgen-generated paper.
