Trust Aggregation, reputation economies and privacy

Last night I listened to this feature on the excellent BBC World Service – Hacking the Vote – pegged on claims by companies hawking their services to political parties: that they know enough about a great many individuals to build specific psychological profiles, and can thus show each person carefully crafted messages to get them to vote for the candidate paying for the service.

The shocking reminder of the extent to which data is being collected on all of us and put to murky use in the shadows prompted this post.

It’s not about data privacy, particularly – although I personally make my online life stupidly difficult by using a VPN, and by installing every anti-tracking, anti-JavaScript, anti-advert, anti-everything extension I can find for my browsers, in an attempt to at least put some road-bumps down for those who would treat me as a statistical profit centre. With the self-defeating result that half the sites I use won’t work unless I grant them freedom to do it all anyway.

It’s about a way that we, as individuals, might be able to use that data for our own purposes. If it’s all being collected and used to manipulate us anyway, why shouldn’t it work for us, a little?

Aggregated trust scores

There have been several attempts at building tools that provide reputation metrics, trust scores – think credit ratings on steroids.

The idea being that individuals will sign up to aggregator sites, and give them access to various kinds of trust/social standing scores. The aggregator sites will then publish trust metrics on individuals, to be used by all sorts of people. Employers, potential service users, lenders, contacts, dating matches.
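In the abstract, an aggregator of this kind is just a weighted combination of signals from different sources. Here is a minimal sketch of that idea – the source names, weights, and 0-to-1 scoring scale are all hypothetical, invented purely for illustration; no real aggregator is being described:

```python
def aggregate_trust(scores, weights):
    """Combine per-source trust scores (each normalised to 0..1)
    into a single weighted metric, skipping sources with no weight."""
    total = 0.0
    weight_sum = 0.0
    for source, score in scores.items():
        w = weights.get(source, 0.0)  # unknown sources contribute nothing
        total += w * score
        weight_sum += w
    if weight_sum == 0:
        return None  # no usable signals at all
    return total / weight_sum

# Hypothetical inputs: a marketplace seller rating, a credit score
# rescaled to 0..1, and a peer-endorsement score.
scores = {"marketplace": 0.92, "credit": 0.71, "peer": 0.80}
weights = {"marketplace": 2.0, "credit": 3.0, "peer": 1.0}

print(aggregate_trust(scores, weights))
```

Even in a toy like this, the hard questions are visible: who sets the weights, and who decides which sources count at all – which is exactly where the influence concentrates.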

If anyone manages to crack this (it’s not easy – see this dead Indiegogo site for Peeple), then individuals will spend more effort curating these scores than they currently do on their credit ratings. Lawsuits will be brought over harsh ratings using defamation laws drafted decades before the internet was even imagined.

The trust aggregator metric that is itself trusted will be the locus of immense influence. If that doesn’t already sound scary, there’s another big problem.

Continue reading “Trust Aggregation, reputation economies and privacy”

Security as an Overhead isn’t working

We’re building a medical app. Of course, Therapy-Smarter isn’t collecting deeply intimate data – just basic contact information, some physiotherapist’s notes, exercise prescriptions and exercise performance data – but nevertheless, medical data is medical data: it’s inherently sensitive, and any company that cares about its reputation needs to take data privacy – and thus data security – very seriously indeed.


So, we’ve been thinking about it fairly hard – but not in a technical way; it’s a specialist domain and we assume that we will need to pay people who know what they are doing to advise us on best practice and then get them to assess our implementation.

No, we’ve been thinking hard about security in terms of business culture, because it seems painfully clear that this is where security weaknesses really come from. That’s right – I’m saying that security weaknesses have much more to do with business culture than they have to do with engineering.

Continue reading “Security as an Overhead isn’t working”