Saturday, December 7, 2013

After MOOCs, the next challenge to Academia

Increasing Challenges to Traditional Academia 

The huge surge of interest in MOOCs says more about the problems with the current state of academia than about the qualities of MOOCs as a replacement model. Part of the attraction of MOOCs is that they present a potential solution to the eternal triangle of education: Access, Quality, and Cost.

As if the ridiculous cost of higher education combined with the absence of matching quality were not problem enough, the traditional publish-or-perish tenure process is being called into question. Peter Higgs, the man behind the Higgs boson and recipient of the Nobel Prize, was quoted as saying:
"[He] doubts work like the identification of the Higgs boson would be achievable now, as academics are expected to 'keep churning out papers'."
Doing high-quality research and disseminating the results should not be controversial.  The problem is that the publication process has become tied to tenure.  A high-stakes summative examination promotes teaching to the test and creates an industry around the examination and preparation process, all while distracting from the actual process of education.  The publish-or-perish tenure process is similarly flawed, and its flaws are compounded by the problems of the peer-review process itself.

Problems with the peer-review process

Problems with the peer-review process have been well recognized.  While it is a valuable process, it is not perfect.  In a previous blog post (from 2010), we looked at some of these problems, as highlighted by Richard Smith (former editor of the BMJ):
  • Faith-based, not evidence-based
  • Slow
  • Expensive
  • Largely a lottery
  • Poor at detecting errors and fraud
  • Stifles innovation
  • Biased

Can Web 2.0 play a role?

In that same post, I outlined a model for using technology and Web 2.0 tools to create a post-publication review process.  Here is an outline of the model (a rough sketch of the underlying data model follows the list):
  1. A central resource for online hosting of all research articles in each area of biomedical science.  We would not have multiple journals competing and catering to the same audience.
  2. There would be some kind of simple review process to filter out "junk" and "spam" publications.
  3. The articles would need to include all the necessary raw data so anyone could rerun the statistical tests and verify the results.
  4. There would be a robust authentication scheme for authors.
  5. Each article would have a place for commenting, much like a blog, but you would need to be authenticated before submitting your comments.  There would be no anonymous comments.
  6. Readers, after logging in, could rate each article on various criteria, e.g. study design, practical value, etc.
  7. The comments could also be rated up or down.
  8. It would be possible to track how many times the article was cited, tweeted, and posted on Facebook; how many times it was downloaded, favorited, etc.
  9. Other studies on the same topic would also be linked from the article, making it easy to find all the studies in one place.
  10. Part of the publication process would be to search for all the previously published related articles in this central repository and provide links to all of these.
  11. Viewers could see a timeline of the development of the literature on a specific topic.
  12. Over time, some studies, authors, and commentators would rise to the top.
  13. There would be a robust search and tagging system.
  14. Some articles could be accompanied by "editorials".
  15. Every time the IRB at an institution approved a protocol, it would create an entry in this central repository.  Investigators would have to provide their data and a short summary at the end of the study, even if they did not write it up fully.  This would mitigate publication bias (the tendency for only positive studies to get published) and make meta-analyses more complete.  If they did not provide this information, their ratings would go down.
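
To make the shape of such a repository more concrete, here is a minimal sketch of how its records might be modelled.  All class and field names are hypothetical; they simply illustrate the entities implied by the list above (authenticated authors and commenters, criterion-based ratings, links between related studies, and IRB-registered protocols with mandatory data deposits).

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of the repository's entities -- not a real system.

@dataclass
class Researcher:
    orcid: str                 # authenticated identity (points 4 and 5)
    name: str
    rating: float = 0.0        # rises or falls with behavior (points 12 and 15)

@dataclass
class Comment:
    author: Researcher         # no anonymous comments (point 5)
    text: str
    votes: int = 0             # comments can be rated up or down (point 7)

@dataclass
class Article:
    title: str
    authors: List[Researcher]
    raw_data_url: str          # raw data included so results can be re-run (point 3)
    irb_protocol_id: str       # created when the IRB approves the protocol (point 15)
    ratings: Dict[str, float] = field(default_factory=dict)   # e.g. study design, practical value (point 6)
    comments: List[Comment] = field(default_factory=list)     # authenticated, blog-style comments (point 5)
    related: List[str] = field(default_factory=list)          # links to prior studies on the topic (points 9-10)
    metrics: Dict[str, int] = field(default_factory=dict)     # citations, tweets, downloads, favorites (point 8)
```

With entities like these, the timeline view (point 11) and the "rise to the top" rankings (point 12) become simple queries over the related links, ratings, and metrics.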

Progress

This proposal was made almost three years ago.  Some of the pieces are beginning to fall into place.

Authentication in the online space

Google is stepping up by having people create a Google+ profile with their real names.  This is getting tied to comments in other Web 2.0 spaces like YouTube and Blogger.  When you search on Google, you can see results that were "+1'd" by people you know on Google+.  You can see "likes" by Facebook friends on public sites.  These are the first steps to creating authentic personae in the online space.
When scholars "+1" an article or comment on a website, their online persona is tied to that article, and that action will increasingly carry the same weight as commenting on a print article, e.g. a letter to the editor.

Public comments on PubMed

Just recently, PubMed Commons was opened, allowing invited researchers to comment on any article indexed in PubMed.
PubMed Commons is a system that enables researchers to share their opinions about scientific publications. Researchers can comment on any publication indexed by PubMed, and read the comments of others. PubMed Commons is a forum for open and constructive criticism and discussion of scientific issues. It will thrive with high quality interchange from the scientific community. PubMed Commons is currently in a closed pilot testing phase, which means that only invited participants can add and view comments in PubMed.

Online communities using reference and citation managers

Sites like Mendeley, CiteULike, and Zotero allow scholars to collect online publications in their libraries and share these within groups/communities.  This activity provides data on how frequently each publication is included in a library and on how many people have included it.

Tracking impact of an article in the digital world

Altmetric is a terrific tool that tracks the impact of an article in social media and on sites like Mendeley, Zotero, and CiteULike.  There is a very cool bookmarklet and a Chrome extension that let one see the Altmetric data for an article.  For example, a recent article that I co-authored was published in Academic Medicine in October 2013, just over a month ago.  It has been cited twice in the literature according to Google Scholar, and it will take a long time to see how many times it is actually cited in peer-reviewed journal articles.  But using the Chrome extension, I could easily see the impact it has had in social media.
Clicking on the link brings up detailed statistics:

It shows the overall impact of the article compared to all articles in Academic Medicine, and also compares it to all articles of a similar age, i.e. those published within six weeks on either side of it.
Altmetric has made its API public, and it is being used to create innovative tools.  For example, PaperShip recently created an app for iOS that connects to Zotero or Mendeley and also displays Altmetric data for each article.
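
Because the API is public, pulling these numbers programmatically is straightforward.  Below is a minimal sketch that assumes Altmetric's v1 DOI lookup endpoint (api.altmetric.com/v1/doi/<DOI>); the DOI shown is a placeholder, and the field names are taken from the public API's documented response, so treat them as assumptions that may change.

```python
import json
import urllib.error
import urllib.request

def altmetric_for_doi(doi):
    """Fetch the public Altmetric record for a DOI (basic lookups need no API key)."""
    url = "https://api.altmetric.com/v1/doi/" + doi
    try:
        with urllib.request.urlopen(url) as response:
            return json.loads(response.read().decode("utf-8"))
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None            # Altmetric has no record for this DOI
        raise

# Placeholder DOI -- substitute the DOI of the article you want to look up.
record = altmetric_for_doi("10.1000/example-doi")
if record is None:
    print("No Altmetric record found for this DOI")
else:
    # A few of the counts the bookmarklet and Chrome extension surface.
    print("Altmetric score:  ", record.get("score"))
    print("Tweets:           ", record.get("cited_by_tweeters_count"))
    print("Facebook posts:   ", record.get("cited_by_fbwalls_count"))
    print("Mendeley readers: ", record.get("readers", {}).get("mendeley"))
```

A lookup like this is the kind of building block tools such as PaperShip can use: take the DOI of each item in a Zotero or Mendeley library and display the returned counts alongside it.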

The foundations of traditional academia are being challenged on multiple fronts and are beginning to show their age!  How long before the foundation is rebuilt or the house comes tumbling down?

