Algorithms are threatening our democracy

Written By: - Date published: 9:00 am, May 9th, 2018 - 36 comments
Categories: China, community democracy, Deep stuff, democracy under attack, democratic participation, facebook, internet, interweb, Politics, twitter

When the Minister for Information Technology finally appoints a Chief Technology Officer I want them to address the risk to our democracy from algorithms.

If we follow the natural flow of algorithmic logic in the modern world we inhabit, the scope of ordinary analogue politics, of real leaders in central and local government elections, will shrink to something so small and so inconsequential to our behaviour that we head straight out of autonomous citizenship.

If you have today used Uber, Google, Facebook, Baidu, Twitter, dating sites, a credit card, or Trademe, bought an air ticket or a train ticket, used eftpos, swiped an airpoints card, or looked at the internet on your phone, you are currently inside an almighty set of largely unregulated algorithms that describe you better than you know yourself.

And when that perpetual identity about me gets in the hands of a central government, such as China, there’s no chance democracy will emerge.

This steady transfer of autonomy is not inevitable, and it is not destiny. We shape it before it shapes us. Policy leaders and business leaders can develop and deploy systems the way they want, according to institutional needs. The government has just set up a cheerleader group for perpetually greater digital access.

But my question is: does that group of cheerleaders know that they aren’t promoting a utility like a road; they are promoting a form of ever-more pervasive control from which there is no return?

Only a year ago the previous government’s MSD Minister was still proudly boasting that they were transforming the lives of social welfare recipients with the “investment approach” of intensive algorithmic analytics.

Fortunately the new government has rejected that.

It is within our power to cast privacy nets around sensitive areas of human life, to protect people from the harmful uses of data, and to require that algorithms balance predictive accuracy against other values such as fairness, accountability, and transparency. Those values are absent in governments who have oversight over corporate and public algorithms.

Maybe we need to be louder about rejecting the anti-democratic nature of meritocracy which is inherent in analytics-based distribution. It’s reminiscent of Daniel Bell’s “The China Model” and Zhang Weiwei (back in 2012), who have already started putting that unspeakable case to English-speaking audiences.

Liberal democracies like ours are of course highly unlikely to ever shift to such a system. Unlikely in our legislature, at least, but those algorithms fully guide our lives in every other sphere involving choices, entertainment, public or private services, or products: in 95% of our lives. For example, in the public realm it is not yet clear what will replace the full data collection and life-mapping of the previous government as a result of the new social security legislation going through Parliament now.

Though as the Minister responsible for the Security Intelligence Service, Andrew Little, noted a month ago, some algorithmic grounding is always necessary because our threats are real and they are growing.

If such trends in business and consumer culture continue, we might soon have a culture with more in common with Chinese meritocratic and communitarian traditions than with our own history of individualism and liberal democracy, in all but Parliament and City Hall.

China is using its spectacular data capture and its algorithms to form a totalising, population-wide societal purity programme in which you get points for doing heroic and kind acts, and a negative social credit rating for everything from spitting to jaywalking. That’s where this goes, folks.
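
To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how such a points-and-consequences scheme might be encoded. The event names, point values, and thresholds below are invented for illustration; they are not drawn from any real system.

```python
# Hypothetical sketch of a points-based "social credit" ledger.
# Event types, point values, and thresholds are invented for illustration only.

POINT_RULES = {
    "reported_heroic_act": +30,
    "volunteer_work": +10,
    "spitting_in_public": -5,
    "jaywalking": -10,
}

def update_score(current_score, observed_events):
    """Apply the point rules to a citizen's running score."""
    for event in observed_events:
        current_score += POINT_RULES.get(event, 0)
    return current_score

def access_decision(score):
    """Illustrative consequence tiers attached to the score."""
    if score >= 100:
        return "priority access to services"
    if score >= 0:
        return "standard access"
    return "restricted: travel and credit limited"

# One good deed does not offset a stream of petty infractions.
score = update_score(50, ["volunteer_work", "jaywalking", "jaywalking", "spitting_in_public"])
print(score, "->", access_decision(score))  # 35 -> standard access
```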

If we want to change course, we will have to put our own political imperatives before those of our technologies.

36 comments on “Algorithms are threatening our democracy ”

  1. Chris T 1

    No offence, but I totally disagree with your thinking

    An algorithm is just a tool that uses a set of rules or methods to solve a problem or get an answer to a question

    They are hardly a new thing

    They have been used since the time of the ancient Greeks.

    I would agree with you though that they can be used in ways that are detrimental.

    I think your issue with algorithms seems to be more to do with the type of algorithms, who is creating them and their motives

    It is a bit like blaming money for money laundering

    • Gosman 1.1

      Exactly.

    • tracey 1.2

      Algorithms don’t kill. Guns do. No wait.

    • dukeofurl 1.3

      Algorithm isn’t Greek.

      “The term algorithm itself derives from the 9th Century mathematician Muḥammad ibn Mūsā al-Khwārizmī, latinized ‘Algoritmi’. A partial formalization of what would become the modern notion of algorithm began with attempts to solve the Entscheidungsproblem (the “decision problem”) posed by David Hilbert in 1928.” – Wikipedia

      1928!
      https://en.wikipedia.org/wiki/Entscheidungsproblem

  2. One Two 2

    That’s a good article, Ad…

    The digital world is the antithesis of the biological, natural world…which is analogue…

    The so called digital economy is undefined and will remain so…

    As with all ‘things’…there are various outcomes on a spectrum…

    For example…the same digital technology which can be used to entrap humanity through greater centralization can also enable decentralization…

    A key issue as I see it is this…

    The current available technology and knowledge is already sufficient to fix the major issues faced by humanity today…but the tech is not currently being used to achieve such outcomes…more tech is not a solution our planet or species requires…

    More technology is not going to fix those major issues…human beings and the natural analogue world are, and always will be, the solution required…

  3. Gosman 3

    Except there is strong evidence to suggest that early intervention social programmes in people’s lives can make a huge difference, and the best way to identify at-risk people is to use algorithms based on known risk factors.
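
    For what that looks like in practice, here is a minimal sketch of the kind of risk-factor scoring being described. The factors, weights, and threshold are entirely invented for illustration; they are not any agency's actual model.

```python
# Hypothetical sketch of risk-factor scoring used to target early intervention.
# Factors, weights, and the threshold are invented for illustration only.

RISK_WEIGHTS = {
    "prior_agency_contact": 2.0,
    "household_income_below_threshold": 1.5,
    "school_non_enrolment": 2.5,
}

def risk_score(person):
    """Weighted sum over the known risk factors present for this person."""
    return sum(weight for factor, weight in RISK_WEIGHTS.items() if person.get(factor))

def flag_for_intervention(person, threshold=3.0):
    """People who score at or above the threshold are referred to the programme."""
    return risk_score(person) >= threshold

print(flag_for_intervention({"prior_agency_contact": True, "school_non_enrolment": True}))  # True (4.5 >= 3.0)
```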

    • Ad 3.1

      The question of my post is not efficacy, but whether it is worth the tradeoff in human rights, human privacy, and human autonomy.

      Here’s what it’s like inside China’s behavioural adjustment programme.

      It’s more Minority Report or Gattaca than either of them.

      http://foreignpolicy.com/2018/04/03/life-inside-chinas-social-credit-laboratory/

      • Gosman 3.1.1

        There are dangers inherent in any system that attempts to make other people’s lives better. It inevitably involves a certain number of trade-offs between individual liberty and societal engagement. The question is whether these trade-offs are acceptable or not. Using algorithms on their own is not necessarily a bad thing and won’t always lead to negative outcomes.

        • tracey 3.1.1.1

          No, but based on the almost total lack of ethics displayed by our most powerful politicians, algorithms are already in the wrong hands.

        • lprent 3.1.1.2

          Let me put this simply. I have no interest in anyone having more information about me than I choose to give. I have no interest in coercing others to do differently to myself.

          I am also perfectly willing to disrupt arseholes who try to do that – for any reason, well intentioned or not.

          I can do that politically, with algorithms, with social education, legally, or if required with violence. Those are some of MY choices.

          The last government’s disregard for individual rights to privacy and its coercive actions towards beneficiaries, along with a callous disregard for the damage it was inflicting on people, was steadily moving me towards stronger and more effective means of showing my concern.

          Generally I prefer algorithmic approaches. I use them to find people who vote the arseholes back to obscurity.

    • tracey 3.2

      Can you post the evidence of where states used algorithms to improve the lives of those at risk?

    • dukeofurl 3.3

      Where is the algorithm in the early intervention social programme? More like a priority ranking where limited resources (from budget cuts) are directed to ‘young people’ in crisis.

      Actually, a good example of where it all went wrong:
      https://www.stuff.co.nz/national/health/70647353/children-not-labrats–anne-tolley-intervenes-in-child-abuse-experiment

      So no algorithm, as they couldn’t treat kids like lab rats.

  4. Draco T Bastard 4

    And when that perpetual identity about me gets in the hands of a central government, such as China, there’s no chance democracy will emerge.

    Same goes for corporations who get to lobby government and sit down to lunch with ‘our’ elected officials.

    And who’d be the ones getting all the government money for developing this and then administering it?

  5. David Mac 5

    I think it will be interesting to see how far China HQ push. They staunchly enforced their ‘Just 1 kid everyone’ policy.

    There are all sorts of ways folk can be steered via algorithms. For example: if my doctor has deemed I’m overweight and a diabetes risk, my EFTPOS card might no longer buy fizzy drink; if I have a history of drunken violence, I can’t buy alcohol, and so on.
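
    As a purely hypothetical sketch of the point-of-sale rule being imagined here (the flags, categories, and rules below are invented, not any real payments system):

```python
# Hypothetical point-of-sale rule engine of the kind imagined above.
# Flags, categories, and rules are invented for illustration only.

PURCHASE_RULES = {
    "diabetes_risk": {"sugary_drinks"},          # categories blocked for this flag
    "alcohol_related_offending": {"alcohol"},
}

def card_authorises(customer_flags, item_category):
    """Decline the transaction if any flag on the customer blocks this category."""
    blocked = set()
    for flag in customer_flags:
        blocked |= PURCHASE_RULES.get(flag, set())
    return item_category not in blocked

print(card_authorises({"diabetes_risk"}, "sugary_drinks"))  # False: declined
print(card_authorises({"diabetes_risk"}, "vegetables"))     # True: approved
```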

    If it starts emptying Chinese jails and hospitals etc I can see Governments like ours nuzzling up to the idea of extending access and control via the info that floats around about all of us….

    “It’s for your own good Mr Mac.”

  6. greywarshark 6

    I understand that it is algorithms that are being used for decision making about people’s entitlements to ACC. Machines checking through applications. Any that don’t fit the boxes don’t pass.
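
    A minimal sketch of the ‘doesn’t fit the boxes, doesn’t pass’ pattern being described; the criteria below are invented for illustration and are not ACC’s actual rules.

```python
# Hypothetical "tick every box or be declined" eligibility filter.
# The criteria are invented for illustration and are not ACC's actual rules.

REQUIRED_CRITERIA = [
    lambda claim: claim.get("injury_code") in {"S01", "S02", "S03"},   # recognised injury codes
    lambda claim: claim.get("lodged_within_days", 999) <= 365,         # lodged in time
    lambda claim: claim.get("treatment_provider_registered", False),   # registered provider
]

def auto_decision(claim):
    """Every criterion must hold; anything that doesn't fit the boxes doesn't pass."""
    return "accepted" if all(rule(claim) for rule in REQUIRED_CRITERIA) else "declined"

# A claim that fails a single box is declined without a person ever looking at it.
print(auto_decision({"injury_code": "S99",
                     "lodged_within_days": 10,
                     "treatment_provider_registered": True}))  # declined
```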

    We will have to insist that people be involved in the work and the decision-making at every level, or this sort of thing is going to be rolled out over the broad population. Our freedom to think, to act, and to be treated fairly as people will come under the ‘iron will or the iron fist’ (or, more accurately, some sort of polymer), and our ability to be ourselves will be constantly encroached on.

    Meanwhile people go on their complacent way, seeming either unaware of this constant invasion into their lives or too accommodating of it for their own good.

  7. greywarshark 7

    I have a facebook page but I have given it the least information, mainly my name. It has picked up bits from my family and put them up and I haven’t corrected it. If someone used that info to provide information for a business purpose instead of approaching me personally, could I be accused of having false information about me on my page? So many people seem to regard Facebook as gospel but I don’t go out of my way to put up anything that doesn’t serve an immediate purpose, and is limited in its detail.

    Perhaps some lonely people are communicating with some avatar as if it is a real person already. Little monkeys brought up away from their mother will cling to a wire frame covered in soft material. They draw comfort from the feel of the softness, the concept of caring and touch. I hope we can have a more meaningful future with real people and not just the make-believe or the words flying between people like this!

  8. Cold Hard Truth 8

    “And when that perpetual identity about me gets in the hands of a central government, such as China, there’s no chance democracy will emerge.”

    What makes you think democracy is so wonderful? It’s democracy that makes the spying possible, it’s democracy that lets private corporations claim our personal data, it’s democracy that lets this turd of a Labour govt piss away our right to self-determination by signing the CP-TPPA.

    If you go online, say “goodbye” to your privacy. Until climate change destroys organised human existence and eliminates “democratic institutions” nothing will change. Sorry about that.

    • McFlock 8.1

      Do you have a better system than democracy? lol

      As for the destruction of organised human existence, not gonna happen from climate change. If anything it will make societies even more organised and draconian than democracy.

  9. DB Brown 9

    This is an incredible historical phenomenon we’re all witness to. Sure, algorithms have been around a while, but their applications today are astounding.

    Working in science, I always want more data. The caveat being: if the data is corrupt, it is worthless (for working towards truths, not agendas). Big data in areas like microbiology is so damn exciting in its potential that it’d take a book just to get me started on how useful it could be (imagine efficiencies down to a microscopic scale)…

    It’s a real catch-22. Algorithms capture data. Data is very useful, like science: but in the hands of corporates and non-benign governance it seems to spell the end of many freedoms. As for data in the hands of politicians, well, it’s rarely ever used correctly. Stats in a hat is what I call it (sorry, Dr Seuss).

    A family member runs a business on an algorithm. It spies on visitors to a whole sector’s websites and exchanges information between vendors. So you think you have never ‘met’ a retailer, but they already know what you’ve been looking at elsewhere, and what the price range is. They can bump a price up if they know you’ve got the money. His take is that if it’s legal (grey area = not illegal), who cares?
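
    As a toy illustration of the mechanism described above (shared browsing signals feeding a price bump), here is a hypothetical sketch; the data shapes and the 10% markup are invented, not the actual business’s system.

```python
# Toy sketch of cross-site browsing signals feeding a dynamic price bump.
# The profile data and the 10% markup are invented; this is not any real vendor's system.

SHARED_PROFILE = {
    "visitor-42": {
        "pages_viewed": ["rival.example/tents", "rival.example/tents-premium"],
        "max_price_seen": 899.00,
    },
}

def quoted_price(visitor_id, base_price):
    """Bump the price when shared signals suggest the visitor will pay more."""
    profile = SHARED_PROFILE.get(visitor_id)
    if profile and profile["max_price_seen"] > base_price:
        return round(base_price * 1.10, 2)   # 10% markup for a 'qualified' visitor
    return base_price

print(quoted_price("visitor-42", 799.00))   # 878.9 for the tracked visitor
print(quoted_price("new-visitor", 799.00))  # 799.0 for an unknown visitor
```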

    Trying to stop algorithms is going to be like trying to stop Oil or Tobacco. They’ll resist with everything they’ve got as the potential revenue is astonishing.

    It’s not going to be that they’ll know what you want before you do; instead, they’ll be able to convince you that you want what they’ve got. (How to swing votes is the same as how to swing preferences in other fields.)

    I opted out of social media (FB) two years ago. It became obvious my thoughts were sent to advertisers. And the way they divided my feed down to angry lefties and tragedy to greet me each day… what a horror show.

    While everyone is entitled to their opinions, I found the bulk of them (FB feed) were misguided. By algorithms?

    • dukeofurl 9.1

      The trouble with that is they are continuing to profile you. Twitter and Facebook cookies are ubiquitous on many, many websites.
      The answer is to check the little padlock in your browser’s address bar. Click on that and it will show how many cookies are set; just block the FB and Twitter ones (and then delete them).

  10. Philg 10

    Algos, like hammers or guns, can be weaponised. This is the crux of the issue. Do we regulate and enforce laws on guns and hammers? AI justice is just around the corner.

  11. ropata 11

    Nothing wrong with “algorithms” in general, they are ubiquitous in everything electronic. Algorithms started off as mathematical abstractions, but now they are having widespread social impact, they need an ethical framework.

    Europe passed its wide-ranging data privacy legislation in 2016 and it comes into force on 25 May 2018. Companies worldwide are scrambling to comply or be subject to considerable fines or sanctions in Europe.
    https://en.wikipedia.org/wiki/General_Data_Protection_Regulation

    New Zealand should do the same.

  12. tangello 12

    Best bet for tracker-blocking (without doing any extensive networking or security work) is:
    1) clear all caches thoroughly
    2) delete Chrome and get Firefox
    3) turn on all privacy blockers and options, enable ‘Do Not Track’ requests, get the uBlock Origin extension, etc.
    4) get the ‘Facebook Container’ extension, which boxes Facebook’s access to only its own tab while browsing.

    These should largely be repeatable for mobile too, although Firefox has made it a lil easier with their security-oriented browser Firefox Focus, which is rad.
