News 
ANDS has begun an online course in research data management: 23 (research data) Things
What is 23 (research data) Things?
23 Things is a recognised training concept, with several organisations already using the idea to help librarians, data managers and others build their understanding of research data and its potential.
CDU Library staff have updated the CDU Referencing Guide for APA and Harvard; it now includes examples of how to reference pre-publication author-accepted and submitted versions of articles that are made open access in eSpace.
For quick reference, the examples have also been posted here on the Open Access LibGuide.
The online presentation can be viewed at the VALA conference site. You may have to register to view it.
As an Open-Access Megajournal Cedes Some Ground, a Movement Gathers Steam
By Paul Basken, January 13, 2016
Last year PLOS ONE published 10 percent fewer papers than it did two years ago. Its editors say that’s a sign that more major publishers are taking open-access publications seriously.
Deborah A. Zarin, Tony Tse
Published: January 19, 2016. DOI: 10.1371/journal.pmed.1001946
The Institute of Medicine (IOM) [1], journal editors [2,3], and many others [4–6] have called for more widespread, third-party access to the individual participant data (IPD) and associated documentation from clinical trials (i.e., “IPD sharing”). Advocates assert that access to trial IPD will help to address well-established flaws in the current system of communicating trial results, including nonpublication, selective reporting, and lack of reproducibility [7]. Additional proposed benefits include the ability to reanalyze study data (e.g., validation and/or correction of previously published findings [8]) and to combine data from multiple studies (e.g., IPD-level meta-analyses [9]). Others note the burdens and costs associated with preparing IPD and associated documentation for sharing, the need to ensure participant privacy, and the risk of invalid analyses [10].
Darren B. Taichman, Joyce Backus, Christopher Baethge, Howard Bauchner, Peter W. de Leeuw, Jeffrey M. Drazen, John Fletcher, Frank A. Frizelle, Trish Groves, Abraham Haileamlak, Astrid James, Christine Laine, Larry Peiperl, [ ... ], Sinan Wu
Published: January 20, 2016. DOI: 10.1371/journal.pmed.1001950
The International Committee of Medical Journal Editors (ICMJE) believes that there is an ethical obligation to responsibly share data generated by interventional clinical trials because participants have put themselves at risk. In a growing consensus, many funders around the world—foundations, government agencies, and industry—now mandate data sharing. Here we outline ICMJE’s proposed requirements to help meet this obligation. We encourage feedback on the proposed requirements. Anyone can provide feedback at www.icmje.org by 18 April 2016.
Here's an excerpt:
For some years, researchers have been using new ways to communicate and share their work through academic social networks. In an attempt to foster the development of Open Access in France, the French consortium COUPERIN (Unified Consortium of Higher Education and Research Organisations for Access to Digital Publications) proposed that academic social networks could be used to convince researchers to become more involved in Open Access. To test this hypothesis, a nationwide survey was launched in 2014 to explore whether and how these academic social networks are used to share content, and how they compare to more traditional Open Access tools. Within a month (20 May to 20 June), 1,898 researchers answered this 28-question survey. It was fully completed by 1,698 of them.
NHMRC Conclusion:
* A peer review system that gives prime consideration to the impact factor of journals for peer review of individual applications is unfair and unscholarly.
* Its use in this way is supported neither by high impact journals themselves, nor by the originators of the impact factor.
* Therefore, NHMRC will no longer call for the inclusion of impact factors in applications, nor use journal impact factors in peer review evaluations.
"The harm caused by myths about open access” Peter Suber – 30 July 2014
"Open access ix myths put to rest" Peter Suber – The Guardian 22 October 2013
"Busting the top five myths about open access” Danny Kingsley, The Conversation 11 July 2013
ESAC stands for “Efficiency and Standards for Article Charges”. ESAC aims to
* address the challenges associated with the management of Open Access article charges (also known as APCs, article processing charges, article page charges, article fees, etc.);
* start the discussion on efficient workflows involving all parties, such as funders, libraries, authors, standardisation initiatives, and publishers;
* propose good practices and proven workflows.
As Open Access publishing grows, libraries, funders, and open access publishers are faced with the need to establish administrative routines for the management of Article Processing Charges (APCs). Without past experience in managing such charges, many institutions have turned to manual systems for processing payments and/or established routines that may not be scalable as the uptake of pure OA among their researchers grows.
The ESAC initiative evolved from this workshop to communicate its results, to preserve the discussion and to establish a platform where practical solutions can be proposed.
As a first step, ESAC suggests an improved data exchange between publishers and institutions in order to process invoices for APCs more efficiently and to enable institutions to generate reports on their APC budget expenses. Additionally, some ideas for enhancing workflows with regard to publishers' submission systems are introduced.
Learn more at www.esac-initiative.org
See also the article on managing article processing charges at the Australian Open Access Support Group (AOASG) site: http://aoasg.org.au/managing-article-processing-charges/
In 2012, the Australian Research Council conducted the second evaluation of Excellence in Research for Australia (ERA). The report provides the outcomes of the ERA 2012 evaluations, which apply to research undertaken between 1 January 2005 and 31 December 2010.
Charles Darwin University scored well above world standard (5) for evidence of outstanding performance in Clinical Sciences; above world standard (4) in the Environmental Sciences, the Agricultural and Veterinary Sciences, and Medical Microbiology; and at world standard (3) in Biological Sciences, Information and Computing Sciences, Economics, Medical and Health, and Public Health and Health Services.
See http://www.arc.gov.au/pdf/era12/report_2012/ARC_ERA12_Section4.pdf
LIBER, the Association of European Research Libraries, along with 17 other international library and research organisations, has issued an open letter to Elsevier.
The letter requests that Elsevier withdraw its TDM [text and data mining] policy because it places unfair restrictions on how researchers can mine content to which they have legal access and on how they can disseminate the results of their research.
The letter is available at http://libereurope.eu/news/european-research-organisations-call-on-elsevier-to-withdraw-tdm-policy/
It is open to further signatories, so please disseminate it amongst your networks! Individuals or organisations wishing to sign the letter should forward their details and logo to: susan.reilly@kb.nl
The Global SCImago Institutions Rankings (SIR) is an annual report that evaluates the research performance of organisations that published at least 100 documents in the last year of the five-year period analysed.
The 2013 Global SIR shows that Charles Darwin University's performance has improved against all indicators except for the number of papers in which the lead author is listed as belonging to the University, and the degree of specialisation of its publications.
In 2013, the quality of research output was high, with the University ranked:
* 16th for Normalized Impact, indicating that the University's research is cited approximately 30% more than the world average;
* 18th for Percentage International Collaboration, a reflection of the percentage of publications that have international co-authors;
* and first for percentage of high-quality publications, which reflects the proportion of CDU publications that appear in highly ranked journals. . . . (p. 5, CDU Annual Report 2013)
See http://www.scimagoir.com/
The Thomson Reuters Times Higher Education World University Rankings placed Charles Darwin University in the top 400 universities for the third consecutive year. The rankings are based on five indicators relating to international outlook, research, citations, industry income and teaching.
The University's performance has improved in four of the five indicators, with the greatest improvement in the measures relating to international outlook and research. . . . (p. 5, CDU Annual Report 2013)
See http://www.timeshighereducation.co.uk/world-university-rankings/