I'm Damian.

I work as an Associate Professor for Communication in the Digital Society at the Department of Communication Science at the University of Amsterdam.

Research
Media use, selective exposure, social media, big data, credibility
Teaching
I teach a variety of communication courses, in particular about 'new media' topics.
Media and talks
Interviews in the media and public talks
Computational Social Science
The Amsterdam Resources for doing CSS with Python

Finally, some updates.

Over the last months, I was busy with all kinds of things – teaching, publishing some papers based on my research, and giving talks and interviews – which unfortunately made me a bit sloppy about keeping my website up to date. By now, everything should be updated. Have a look around!

Some media attention…

This weekend, there was a lot of media attention on filter bubbles. On Friday, quality newspaper Trouw published a two-page story based on interviews with me and some colleagues. We tried to make the point that filter bubbles – at least in the Netherlands, at this point in time – are less of a problem than often assumed.

On Saturday, I was interviewed on Radio 1 (Argos) about the same topic, as well as about political microtargeting. You can listen to the fragment here.

Coincidentally, and unrelated to our project, quality newspaper de Volkskrant also published a large story on filter bubbles related to music. It discussed the relationship between Spotify use and music taste, and also hinted at the need for diversity in a music recommendation algorithm, to prevent it from becoming ‘boring’.

Teaching news: New Master Data Science

In recent weeks, I have been co-teaching the two-week case “Political Communication” with Stevan Rudinac, as part of the course “Fundamentals of Data Science”, which is the first course of the new master’s programme in Data Science at the UvA. The students analyzed tweets about the US election campaign. A very nice example of interdisciplinary co-operation!


New publication on news sharing

I’m happy to announce that our article on news sharing has just been published. From the abstract:

People increasingly visit online news sites not directly, but by following links on social network sites. Drawing on news value theory and integrating theories about online identities and self-representation, we develop a concept of shareworthiness, with which we seek to understand how the number of shares an article receives on such sites can be predicted. Findings suggest that traditional criteria of newsworthiness indeed play a role in predicting the number of shares, and that further development of a theory of shareworthiness based on the foundations of newsworthiness can offer fruitful insights in news dissemination processes.

Trilling, D., Tolochko, P., & Burscher, B. (2016). From newsworthiness to
shareworthiness: How to predict news sharing based on article characteristics. Journalism & Mass Communication Quarterly, online first. doi:10.1177/1077699016654682

Blog: Facebook is a news editor

Natali Helberger, with whom I work on the Personalised Communication project, and I wrote a blog post for the LSE Media Policy Project blog: ‘Facebook is a news editor: the real issues to be concerned about’. We argue that Facebook of course employs human editors to curate its news feed, but that this – while it should not come as a surprise at all – has serious legal consequences. [full post]

… and another one…

This seems to be the week of publications on selective exposure, as a literature review I co-authored was also published. The abstract:

Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.

Zuiderveen Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). doi:10.14763/2016.1.401 [Full text]