Algorithms. We’re all talking about them, but how many of us actually understand what they are? Tech critics, researchers and academics are sounding warning bells that an increasing societal dependence upon algorithms is potentially very dangerous. Data scientists that I follow, though, are excited by the possibilities that algorithms hold for society. These conflicting views can be confusing - so let’s go back to basics, and consider what exactly we mean when we talk about algorithms.
Facebook controls our News Feed - what we see and what we don't - and we have little idea how it works. But we don't necessarily need to know exactly how it works in order to use it to our advantage. It turns out that a couple of individuals have already "hacked the algorithm" by using certain keywords in their Facebook statuses to give visibility to issues they care most about - so maybe it's time for activists to join the game.
Last week, I took part in a debate on the role of big data in development, at re:publica here in Berlin. The session was fun and interesting, and a couple of things have been running around my mind since then, helped along by some other inspiring talks I saw at the conference, and conversations I had.
The biggest one: that, especially in international development, we seem to be assuming that big data holds all sorts of answers. Within the context of the "data revolution", big data is put on something of a pedestal, and in my opinion, we're putting too much faith in the insights that can actually be gained from big data alone.
The first full episode of the new podcast I’m co-presenting and researching, Collusion, is now up, and you can listen directly below:
Doing the research for this was incredibly interesting, and so just in case other people want to know what sources we drew upon, we’ve started a Tumblr, https://collusionpodcast.tumblr.com, where we’ll be putting research and links that we find in preparation for the podcast.
In the days since the tragic earthquake in Nepal, various forms of assistance have been offered by governments, charities, humanitarian response organisations, and for-profit technology companies, whose reach and influence are bigger than ever before.
In many ways, it’s quite incredible that we have an expectation that international corporations and companies should offer some kind of response or assistance in the wake of a humanitarian emergency; but in fact, they’re very well placed to do so in lots of cases.
Here's a list of responses from tech companies that I've come across so far; I'm sure I've missed some, and I'll keep updating it, so please let me know which ones are missing.
Note: as Maya pointed out, there are lots of other digital responses going on right now, like the great work by the Humanitarian OpenStreetMap Team. In this post, though, I'm thinking of a 'tech company' as a for-profit company without an official social mandate.
Last week, I attended the International Journalism Festival in Perugia. It was beautiful, and in many ways, surprising.
With my usual work, I spend most of my time thinking about how civil society and journalists can use data more effectively in their work, and the problems that we come up against are things like:
- Not having enough data (eg. it doesn’t exist, it’s not online, no access to it)
- Not having access to the right technologies (tools behind a paywall, open source tools might not do the trick)
- Not having good enough access to the internet, or no access at all - or, audiences having low levels of connectivity
- Being in restrictive political environments with low levels of press freedom
- Keeping safe online
But in Perugia, the problems mentioned were largely very different. The hurdles facing data journalism here aren’t related to technology, they’re related to culture.
The Syrian conflict is one of the worst tragedies of our time, and unlike during many tragedies of years gone by, we now have new ways of telling those stories in real time, reaching people who might be geographically far away, but who could (perhaps with the right story) be brought ever closer. Over the past couple of years, organisations, institutions and individuals have used various digital strategies to tell Syria's story.
As Susan Sontag said, images transfix, and anaesthetize. I wonder, though: could digital storytelling methods cause a similar 'fatigue'? Though some of the examples here are still fairly new and innovative strategies, will they at some point become so regular that they don't evoke any emotions within the viewer at all?
Here are a few examples I’ve come across of interesting campaigns and stories around the Syrian conflict, all of which evoked strong emotions for me.
I’m very excited to announce that together with Luiza Prado, I’ll be co-hosting a new monthly podcast exploring the relationship between power and technology. We’ve explained a little why we’re doing the podcast, and what we hope it will be, in this (very short!) introduction:
We’ll be tweeting from @Collusion_, and putting episodes up on http://soundcloud.com/collusion, together with links to sources and further material on the topic that we find, and the first full episode will be coming up in late April.
Get in touch if you’ve got ideas for what we should be reading or looking into!
Last week, I took part in a workshop discussing “Impact Assessment for Data training”, with a number of different practitioners working on some aspect of improving data literacy in their communities.
From a personal perspective, it was fascinating to see that the priorities of the people there were largely similar: everyone was working (in some way) on improving data skills among different groups of people who were well placed to then use those skills to push for social change. In some countries, the target group was university students; in others, the focus was activists, or young people. But the overall aim was the same: to empower them with the skills they need to better achieve their own goals.
An interesting exercise asked us to define what we actually meant by 'impact assessment' - we realised that for many of us, it was a way to work out whether the interventions we were leading were having a positive or negative effect on the communities we're working with. Most of the time, though, these impact assessments act as feedback to donors for the activity in question - so I do wonder what it would take, or how often it happens, that an impact assessment carried out internally actually reveals anything but a positive outcome.
I've noticed two somewhat distinct schools of thought, or action, around asserting digital rights online. One calls for people to practise better digital security: raising awareness of privacy-protecting ways to browse the internet, and promoting alternatives to big, privacy-invasive corporations, to name just a few tactics.
The other: encouraging playfulness, and subversion of the roles that those corporations and governments are putting internet users in. Instead of hiding; revealing. Instead of blocking; pushing. Instead of circumventing; confronting.