Source: Think Links

It’s been about two weeks since we had the altmetrics11 Workshop at Web Science 2011, but I was swamped with the ISWC conference deadline, so I’m only just getting around to posting about it now.

The aim of the workshop was to gather together the group of people working on next-generation measures of science based on the Web. Importantly, as organizers, Jason, Dario and I wanted to encourage the growth of the scientific side of altmetrics.

The workshop turned out to be way better than I expected. We had roughly 36 attendees, which was well beyond our expectations. You can see some of the attendees here:

There was nice representation from my institution (VU University Amsterdam), including talks by my collaborators Peter van den Besselaar and Julie Birkholtz, but we also had attendees from Israel, the UK, the US and all over Europe. People were generally excited about the event and the discussions went well (although the room was really warm). I think we all had a good time at the restaurant, the Alt-Coblenz, which is highly recommended by the way and an appropriate name. Thanks to the WebSci organizing team for putting this together.

We had a nice mix of social scientists and computer scientists (roughly 16 and 20, respectively), with representation from the bibliometrics community, social studies of science, and computer science.

Importantly, for an emerging community, there was a real honesty about the research. Good results were shown, but almost every author also discussed where the gaps were in their own research.

Two discussions came to the fore for me. One was on how we evaluate altmetrics. Mike Thelwall, who gave the keynote (great job, by the way), suggested using correlations with the journal impact factor to help demonstrate that there is something scientifically valid in what you’re measuring. What you want is not a perfect correlation but a correlation with a gap, and that gap is what your new alternative metric is then measuring (see the small sketch below). There was also the suggestion from Peter van den Besselaar that we should look more closely at how our metrics match what scientists do in practice (i.e. qualitative studies). For example, do our metrics correlate with promotions or hiring?

The second discussion was around where to go next with altmetrics. In particular, there was a discussion on how to position altmetrics as a research field, and it really seemed to position itself within and across the fields of science studies (i.e. scientometrics, webometrics, virtual ethnography). Importantly, it was felt that we needed a good common corpus of information in order to do comparative studies of metrics. Altmetrics has the problem of data acquisition: while some people are interested in that, others want to focus on metric generation and evaluation. A corpus of traces of science online was felt to be a good way to interconnect data acquisition and metric generation and to allow for such comparative studies. But how to build the corpus…. Suggestions welcome.
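To make the “correlation with a gap” idea a bit more concrete, here’s a minimal sketch in Python (with invented numbers and a hypothetical altmetric, not data from the workshop) of the kind of check Mike Thelwall was describing:

```python
# A minimal sketch of the "correlation with a gap" idea: compare a
# hypothetical altmetric against journal impact factors. The numbers
# below are invented for illustration only.
from scipy.stats import spearmanr

# Hypothetical per-article data: journal impact factor and an
# alternative metric (e.g., number of online mentions).
impact_factor = [1.2, 2.5, 3.1, 4.8, 6.0, 7.4, 9.3]
alt_metric    = [3,   10,  8,   25,  18,  40,  35]

rho, p_value = spearmanr(impact_factor, alt_metric)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# A moderate but imperfect correlation suggests the altmetric captures
# something related to established measures, while the "gap" is the room
# left for it to be measuring something genuinely new.
```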

The attendees wanted to have an altmetrics12 so I’m pretty sure we will do that. Additionally, we will have some exciting news soon about a journal special issue on altmetrics.

Some more links:

Abstracts of all talks

Community Notes

Also, could someone leave a link to the twitter archive in the comments? That would be great.
