Over two years ago, in a post entitled Extending Your Community – Through Machine Translation, I suggested that although in the past machine translation was felt to be of little use, developments with services such as Google Translate may mean that “machine translation now does have a role to play”.
A few weeks ago I came across a referrer link from a blog post entitled “Google Scholar Citations y la emergencia de nuevos actores en la evaluación de la investigación”. My Chrome browser helpfully informed me that the page was in Spanish and provided a link to an automated translation of the page. The post began:
The launch of a few months ago Google Scholar Citations , the tool for measuring the impact of research publications in indexed by popular search engine, leads us to revise this and other bibliometric applications such efforts to measure the visibility of academic and researchers on the web. This impact is not limited to traditional media (citations received from other scientific works) but embraces new ways of scientific communication and their associated indicators, as the number of downloads of a job, people store it in your manager references or the time that a presentation is envisioned online. It discusses briefly the extent to affect the emergence of these new tools to the traditional databases for evaluation of science, Thomson Reuters, Web of Science , and Scopus, Elsevier .
I think this provides a comprehensible summary of what the post will cover. The post concluded:
Since the level of university policy and research evaluation, the question to be made is whether any of the products mentioned both Microsoft and Google mainly but also alt-metrics initiatives can be serious competitors in the near future to the two large databases that provide information bibliometric, important cost, especially in an era marked by budget cuts. Traditional products are more creditworthy and stable than new ones by offering a wide range of possibilities and associated metrics, not just jobs but also to journals in which they are published. Besides its use is widespread and there are some metrics validated by professionals and bibliometrics by agencies with responsibility for research. However, it is legitimate debate about whether these databases are essential in research assessment processes. In our opinion, at present these databases (ISI Web of Science or Scopus, no need for two) are essential for the evaluation, however the new generation Science Information Systems (CRIS) together seekers free scientists such as Google Scholar, and metrics based on the use of information may provide new solutions to the evaluation of science, perhaps the medium term by decreasing the need for costly citation indexes. Making prospective fiction might think how it would change the market for scientific information and assessment if Google decided to launch its own “impact index” from the indexed information, which does not seem unreasonable since its popular search management PageRank is based on a principle that already apply other bibliometric indicators. In any case, what is certain is that new products and tools available to researchers and evaluators to facilitate the dissemination and the retrieval of scientific information and open new possibilities for the exchange of scientific information and assessment.
The meaning is less clear, but it does seem that the authors, Alvaro Cabezas Clavijo and Daniel Torres-Salinas of the EC3 research group (Evaluación de la Ciencia y de la Comunicación Científica) at the Hospital Universitario Virgen de las Nieves in Granada, have been asking whether new tools and approaches for identifying the value of scientific research are challenging the well-established tools provided by ISI Web of Science and Scopus. They seem to feel that researchers will need to continue to make use of ISI Web of Science or Scopus, but that new approaches may become increasingly relevant, especially if Google make a business decision to further enhance their Google Scholar Citations service.
Although not mentioned in the conclusions, the article also reviews Microsoft Academic Search and suggests that “compared to Google Scholar Citations, the process of updating the cv is heavier”; a conclusion which reflects my own experience of long delays in having updates accepted. The article also mentions the altmetrics initiative and provides links to a number of examples of such approaches, including “Total Impact where, in the same line, we can find metrics posted on Slideshare presentations, the times they shared a scientific article on Facebook, or the number of groups Mendeley which has collected a certain job”.
I found the article of interest and I’m pleased to have found it via the referrer link. I wonder: should searches of online foreign-language resources now become a significant part of a research strategy? I also wonder what the term “prospective fiction”, mentioned in the conclusions, means. Can any Spanish speakers suggest a better translation of the following sentence?
Haciendo prospectiva-ficción cabría pensar cómo cambiaría el mercado de la información y evaluación científica si Google decidiera lanzar su propio “índice de impacto” a partir de la información que indiza, lo cual no parece descabellado ya que su popular sistema de ordenación de búsqueda PageRank se basa en un principio que ya aplican otros índices bibliométricos.
Note that “prospectiva-ficción” was italicised in the original article.
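The conclusion’s point that PageRank “is based on a principle that already apply other bibliometric indicators” refers to the idea that a document’s importance can be derived from the citations it receives, weighted by the importance of the citing documents. A minimal sketch of that principle, using the standard power-iteration method on an invented toy citation graph (the paper names and links below are illustrative, not taken from any real index), might look like this:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank-style scores by power iteration.

    links maps each node to the list of nodes it cites.
    """
    nodes = list(links)
    n = len(nodes)
    scores = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node starts each round with the "teleport" share.
        new = {node: (1.0 - damping) / n for node in nodes}
        for node, cited in links.items():
            if cited:
                # A citation passes on a share of the citing node's score.
                share = scores[node] / len(cited)
                for target in cited:
                    new[target] += damping * share
            else:
                # A node that cites nothing spreads its score evenly.
                for target in nodes:
                    new[target] += damping * scores[node] / n
        scores = new
    return scores


# Invented citation graph: paper_c is cited by three other papers.
citations = {
    "paper_a": ["paper_b", "paper_c"],
    "paper_b": ["paper_c"],
    "paper_c": [],
    "paper_d": ["paper_c"],
}

scores = pagerank(citations)
print(max(scores, key=scores.get))  # → paper_c
```

The heavily cited paper_c ends up with the highest score, which is the shared principle: influence flows along citation links, whether those are hyperlinks on the web or references between journal articles.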