Response to President Paul Hofheinz of The Lisbon Council regarding Elsevier and the Open Science Monitor
What a mess. The story so far:
- On June 29, I posted this opinion piece in The Guardian expressing deep concerns about Elsevier's role as the sole subcontractor for the new European Commission (EC) Open Science Monitor.
- On July 2, Elsevier followed this up with a response on their website, which has also received some comments and annotations.
- On July 3, I responded to this on my blog with an extensive point-by-point rebuttal.
- On July 4, the news outlet Research Professional provided an overview of the discussion to date, including comments from an EC spokesperson.
- On July 5, I filed a formal complaint with the EU Ombudsman, which was accompanied by 432 signatories. At the time of writing this article (the evening of July 5), the open document it was based on had accumulated 593 signatories.
- On July 5, President Paul Hofheinz of The Lisbon Council, leader of the consortium behind the Open Science Monitor, issued a press release responding to the original Guardian article. This was addressed to the Science Editor of The Guardian, to which I was alerted on Twitter.
This article is the full response to the press release from the President of the Lisbon Council. Here, I just quickly want to add that I am pursuing this as a concerned European citizen and a strong supporter of the principles and practices of Open Science, and that I am receiving no support (beyond that on social media) or financial backing from any source.
As before, the original text in the press release is in italics, and my response in bold.
————–
Firstly, thank you to President Paul Hofheinz for taking the time to respond to the original article. Regarding the facts below, I am forced to highlight that two of the five comments are about my character, not the Open Science Monitor. Two are relevant to the present discussion, but sadly only reiterate what we already mostly knew, without providing answers to the numerous questions and points I have previously raised, along with other members of the global research community. The final point concerns the capabilities of the Lisbon Council, something I have never called into question. Furthermore, you have failed to demonstrate any aggression, hectoring, innuendo, or misinformation on my part, and have not clearly addressed any of the main points I have previously raised.
Dear Sir,
Open Science is an increasingly important field, one for which the Lisbon Council for Economic Competitiveness and Social Renewal, a Brussels-based think tank, where I have the honour of serving as president, has been long committed. Our credentials and commitment to this vision are impeccable, and pre-date the time when open science became an important European objective. We are delighted that open access to publications and scientific data sharing are increasingly becoming the rule, rather than the exception. And we are prepared to continue working to make sure that the results of scientific research are properly and broadly disseminated as widely and quickly as possible – not so that we researchers might have even more to do, but so that we might all live in a society propelled towards and driven by the innovation that open science so clearly engenders.
So it is with much sadness that we read Jon Tennant’s aggressive and misinformed article, “Elsevier are Corrupting Open Science in Europe,” which appeared in The Guardian on Friday, 29 June 2018.
A few facts:
1) The “Open Science: Monitoring Trends and Drivers” project, launched on 22 December 2017, is led by a three-partner consortium: the Lisbon Council (lead manager), the Centre for Science and Technology Studies at the University of Leiden and ESADE Business School, in collaboration with the European Commission, which has committed to open science officially as a policy objective. Elsevier, a large, Amsterdam-based scientific publication publisher, is a subcontractor to the project, having agreed to provide data. For the record, Elsevier has no involvement in defining, building or controlling any of the indicators that make up the Open Science Monitor, which the consortium is contracted to produce. Anyone who claims otherwise is mis-informed and has not taken time to get acquainted with the structure of the consortium, which is described in some detail in the official journal of the European Union and in the memorandum setting out the Open Science Monitor methodology on the European Commission website.
It is still not clear exactly what Elsevier's role is, although you have helped to narrow down what it is not; this information is also not provided in the methodological note, so the question remains open. Regarding the involvement of Elsevier in the Consortium, there are numerous other questions, set out in our complaint to the Ombudsman, that have yet to be addressed. Calling this misinformed, when the process itself has not been sufficiently transparent about Elsevier's role, is not an appropriate response. The prominent issue remains that many members of the research community, myself included, do not believe that the process behind the methodology, and the role of Elsevier, has been made sufficiently transparent; this failure of communication is not particularly in the spirit of Open Science.
In our formal complaint, our issues focused on two main aspects. The first concerns the process leading to the award of the subcontract to Elsevier, which we would have raised irrespective of who the final award was granted to. The second concerns the specific choice of subcontractor. Given Elsevier's involvement in the monitor, their anti-Open history, and current tensions between Elsevier and higher-education institutions around Europe, there are additional questions that need to be raised. These include:
- How did the three bids received for the tender score against the specific criteria used to select the contractor? Why is this information not required to be made public?
- Who evaluated the suitability of each candidate? Were independent external experts involved in the evaluation process?
- Was there a consultation process involved?
- Why are tenderers only required to identify subcontractors whose share of the contract is above 15%?
- Was the identity of this subcontractor made known to the EU during the tender process?
- Was a risk analysis performed as to the ramifications of the choice of subcontractor?
- CWTS worked for many years on the development of journal indicators based on Scopus. What was the nature of this putative collaboration and/or business relationship, and how did this history (as well as, for example, The Leiden Manifesto and the Open Data Report) factor into the decision process for the Monitor?
- Why was the public only informed of the outcome of the process months later, exactly at the end of the deadline by which any formal objections to the process could be raised?
- Why, if Elsevier data is only one of numerous data sources for the monitor, are they the only data provider included directly in the consortium?
2) The Lisbon Council has a long history of high-profile work on open science. In 2016, the Lisbon Council convened The 2016 Innovation Summit: Open Innovation, Open Science and the Open to the World Agenda, where the European Commission’s flagship publication (Science, Research and Innovation Performance of the EU: A Contribution to Open Innovation, Open Science and the Open to the World Agenda) was launched in the presence of Carlos Moedas, European commissioner for research, science and innovation, and a host of leading European researchers and open-science advocates. For those readers who don’t know the publication, it is the bible of open science, serving as a strong endorsement of the European Commission’s commitment to the dossier and providing an important, emerging evidence base on the economic impact of this policy approach.
I have absolutely no doubt that the Lisbon Council, and the other members of the Consortium, are capable of delivering the Open Science Monitor. Indeed, I never took issue with this at all. However, to state that a document produced in 2016 is the 'bible' of open science is a bit strange: the EU as an institution was not a frontrunner in open science, and the movement was not catalysed by a policy document. Indeed, open science had been around for many years before the Council's long history of involvement began. Nonetheless, my two primary concerns have always been the transparency of the process, and the role and potential biases associated with Elsevier as the sole data-providing sub-contractor. The scientific community recognises that this is an enormous opportunity to set the global stage for the future of Open Science, and therefore we are rightly concerned with the direction of the project and the involvement of Elsevier. This is the first time that the wider community has been able to engage with the process, and their concern is reflected in the complaint made to the EU Ombudsman.
3) Few scientific projects succeed without good, up-to-date data. And we all know that Elsevier today not only has a unique position in the scientific-publishing field but also provides advanced data services as part of its business. We reached out to them to help us learn more about the large and quickly growing European market for scientific publications and to form a better, more evidence-based view of the trends in this key area. But, for the record, Elsevier plays no role in the classification of the data they provide to the consortium; that role falls to the Centre for Science and Technology Studies at the University of Leiden, which houses one of the world’s leading scientific-output measuring projects, the CWTS Leiden Ranking. The methodology which CWTS uses to classify data for the Open Science Monitor is described fully in the memorandum mentioned above and on the Open Science Monitor website. And Elsevier is not the only source. The consortium – whose research objective is to track and monitor the spread of open science and the impact it has on economic activity and innovation – also consults a wide array of data in an effort to absorb, analyse and stress test the emerging results, including Sherpa Juliet, Sherpa Romeo, re3data, Datacite, Mozilla Codemeta, Programmableweb, Zooniverse, Github and more. It is our view that more data will lead to better, more robust results. And we don’t quite understand arguments that not looking at some data sets – or not collaborating on a data-basis with one of the world’s largest scientific publishers – would lead to better results.
We are fully aware that Elsevier is not the only service provider, and have never suggested otherwise. There are several points to take issue with here. When you say Elsevier are not involved in the classification of the data, this simply cannot be the case: Elsevier collect and provide specific data based on what is feasible within their portfolio, thereby imposing specific parameters and intrinsic classifications on those data. Furthermore, if Elsevier are going to help you learn more about the EU publications market, this also suggests that they will be doing more than simply providing data, and will be actively involved in the interpretation of those data. Further transparency regarding the specific role of Elsevier would help to clarify this.
The principal issue remains that there appears to be an unfavourable bias towards Elsevier services and tools in the methods note. If it is your view that more data will lead to better results, then why are so many of the critical elements focused on a single service provider, Elsevier? Indeed, to assume that more data will lead to better results presumes the outcome before the methods are even known and the data have been collected. Furthermore, at no point has it been suggested that data sets should be excluded from the analyses. Additional questions around this, also raised in the complaint, still need to be addressed (a selection is given here):
- What was the selection method for the different tools and services to be used for the Monitor? This is essential for reliability, robustness, and reproducibility of the methods, and part of standard Data Management best practices.
- Given the EU's emphasis on Open Science, including Open Data, why is the consortium not requiring that the Open Science Monitor be based upon open data, open standards, and open-source tools (with appropriate licenses for re-use and accessibility) as a matter of principle? For example, elements of this could follow the EC's own Open Source Software Strategy.
Elsevier has a huge scope, providing a wide range of services and tools across the full research workflow, which means that any policy changes resulting from the Open Science Monitor will affect their publishing and Open Science workflow services business. Elsevier also has a negative reputation amongst many of those engaged in debates about Open Science (typified by, e.g., the Cost of Knowledge boycott), and therefore it can hardly have come as a surprise to the consortium that many have responded to news of Elsevier's active involvement as a sub-contracting party with dismay and distrust. If it is true that Elsevier are included only as a data provider, with no role in data analysis or in setting priorities when identifying indicators, then this is good to know; however, this assertion is not supported by the currently available public documentation, and it would be beneficial to all parties to see it substantiated. If this were clearer, the Open Science Monitor consortium would find it much easier to assure the concerned community of researchers and others that Elsevier's role will remain so limited. Clarity over why Elsevier were included as the sole sub-contractor, especially if their involvement is so minor and given the wholly predictable response, will be required to establish the independence and impartiality of the Open Science Monitor. However, this is just one of many remaining issues.
4) And, finally, there is this question of the tone Mr Tennant uses, and whether it is appropriate in a discussion of this type. No genuine scientific debate was ever conducted in the language of hectoring and innuendo. If Mr Tennant feels so strongly about open science, he can and should participate more directly and constructively in the open, collaborative effort to review and improve the Open Science Monitor methodology, where he (and everyone) can participate directly in creating a powerful, effective open science monitoring tool. There, his comments will be analysed and weighed along with other experts and stakeholders involved in this discussion. Every comment there is logged and processed; and a summary of the comments – and the consortium’s reaction to it – will be published in September, open source, for all to see and read.
As a matter of professional conduct, I refuse to engage with many elements of this comment. However, it is worth noting that on June 17 I left multiple comments on the methodology, a fact which seems to have been missed here. It is also welcome to know that a report on the comments on the indicators, and the consortium's reaction to them, will be published; however, and again for the sake of transparency, this was never made clear beforehand, nor were the details of this part of the project. Furthermore, it is not clear whether each individual comment will be responded to and made public, or whether only a general summary of responses will be provided.
5) Of course, it would be hard for Mr Tennant to get as much attention as he is getting from attacking the consortium on twitter. But constructive engagement in the project would lead to a better and greater impact on the final outcome. And to more and better open science in Europe. A goal which the consortium is committed to advancing, and which (we believe) we share.
As a matter of professional conduct, I refuse to engage with the content of this comment. Also, my correct title is Dr. Tennant.
This is now the second time, including the response by Elsevier, that assaults have been made on my character over this matter; these look like strategic attempts to discredit me rather than to engage with the substance of the posts. Terms like 'misleading' and 'misinformation' have been used repeatedly, without any substantial evidence, detracting from the numerous issues that I have raised. These issues have been co-signed by more than 600 members of the global research community in a formal complaint to the EU Ombudsman, yet have not been treated with the respect that they deserve by Elsevier or the Council. As a result, I will no longer respond to such comments, which are not the sort of critical, granular responses I was expecting as part of a professional and courteous discourse on this matter. However, if members of the consortium, and Elsevier, wish to directly address the points I have raised, then I am available.
Regards,
Jon
I am highlighting the relevant parts to myself using genius.it annotation. I hope to fill in the blanks (—) later. In the meantime, there is an Indonesian/Malay saying, "sekali layar terkembang, surut kita berpantang," which means "once the ship is sailed, don't stop till destination." 🙂
http://genius.it/fossilsandshit.com/response-to-president-paul-hofheinz-of-the-lisbon-council-regarding-elsevier-and-the-open-science-monitor/
Great, thank you, Surya. Looking forward to reading your comments. And that is a highly appropriate saying 🙂
The statement / comment / press release of the 'Lisbon Council for Economic Competitiveness and Social Renewal asbl' is titled 'Open Science, the Open Science Monitor and the Open Science Monitoring Trends and Drivers Project'.
It seems to me that Paul Hofheinz, listed as the author of this statement / comment / press release, is more or less suggesting that his views / statements are also, more or less, the views of the 'Centre for Science and Technology Studies at the University of Leiden', one of the three partners of the consortium. Does anyone here have any idea about this issue?
The website of the 'Centre for Science and Technology Studies at the University of Leiden' (CWTS) is https://www.universiteitleiden.nl/en/social-behavioural-sciences/cwts It is thus obvious that it is mandatory for everyone at CWTS to act, at all times and in full, according to the 2014 version of the VSNU Code of Conduct at http://www.vsnu.nl/files/documenten/Domeinen/Onderzoek/The_Netherlands_Code%20of_Conduct_for_Academic_Practice_2004_(version2014).pdf It is also stated at https://www.organisatiegids.universiteitleiden.nl/binaries/content/assets/ul2staff/reglementen/onderzoek/klachtregeling-wetenschappelijke-integriteit-2018-eng.pdf : "Within Leiden University and the Leiden University Medical Center, all persons involved in research are personally responsible for safeguarding academic integrity. To this end, the general principles of professional academic conduct should be adhered to at all times."
I am thus wondering whether the texts containing the personal / ad hominem attacks on Jon Tennant are fully in line with the behaviour expected of everyone at CWTS. I am looking forward to comments on this topic.
Tennant’s constructive engagement and energy are appreciated. The Guardian piece brought awareness whereas a comment on the “improve the indicators page” would’ve flown under my radar. The Elsevier and Hofheinz responses to the Guardian op-ed were surprisingly and disappointingly petty and tepid — no light shed, no new perspective gained.