It wasn’t that long ago that digital technology seemed like the answer to all our problems. Pick up any book about the promise of technology from the 1990s and early 2000s, and the democratizing impact of the digital revolution is presented as indisputable, bound to bring untold benefits to civilization as we know it.
Today, that premise seems far away. While there are plenty of reasons to be excited about technology, there is no shortage of reasons to worry. In his new book The Digital Republic: On Freedom and Democracy in the 21st Century, barrister and author Jamie Susskind asks how freedom and democracy can survive in a world filled with all-powerful digital technologies.
Digital Trends: What’s the central argument you’re making in The Digital Republic?
Jamie Susskind: The main argument is that we have a problem with the tech industry, and that problem is not one of individual bad apples at the top or of particular rogue corporations. It is a problem of unaccountable power, stemming from a lack of proper governance.
My book tries to figure out where that power comes from, why it’s a problem, and how we can hold it accountable in a way that preserves freedom and democracy.
DT: Explain what you mean by the ‘republicanism’ in the title of the book.
JS: I’m drawing on the ancient republican philosophy that stretches back to the Romans. This is not the republicanism of the modern Republican Party, or of those who want to get rid of the monarchy in, for example, the United Kingdom. Republicanism is the philosophy that the purpose of law and politics is to reduce unaccountable power in society. A republican is not just against a particular bad king, but against the very idea of kings. Republicans don’t hope for better bosses; they argue for employment rights. They don’t complain about unpleasant slave owners; they fight to end slavery.
Applied to a digital context, digital republicanism holds that the concentration of too much power in the hands of those who own and control digital technologies is inherently problematic. That is the case even if we sometimes happen to agree with how they use that power.
DT: Tech companies are sometimes criticized from both sides for being too political. But is there any way to avoid that? It seems inevitable. Even the basic design of a computer interface is ideological, because it structures how we perceive the world. Add in the mission statements and scale of search engines, and this problem seems to come up all the time.
JS: I think so. A central argument of my book is that digital technologies exert power on behalf of their creators, whether the creators are aware of it or not. All technologies have rules that we must follow when dealing with them. Twitter’s rules state that you can’t post a tweet over a certain length. A self-driving car’s rules may state that it will not exceed a certain speed, even in an emergency.
As more and more of our actions, interactions, and transactions are mediated by technology, the people who write the rules of technology are increasingly writing the rules of society. You may consider yourself an entrepreneur or an engineer or a technology executive, but you are still performing a political function in society, and in my view you should be held accountable accordingly.
DT: What is the answer to that? Engineers and executives are obviously not elected politicians. Should they make every effort to remain neutral?
JS: There is no such thing as a neutral position, because neutrality is itself a choice between alternatives. If you are neutral about the content posted on your social media platform, that means being neutral about hate speech, or threats of rape, or child pornography. Another example involves Google’s autocomplete suggestions. Google has had a problem with autocomplete returning unpleasant suggestions: if you typed ‘why do Jews,’ it would come up with completions like ‘have big noses’ or ‘own the media.’ Google’s defense was that it was being neutral, because the suggestions reflected questions people had asked in the past.
To me, this is a good example of neutrality amounting to injustice. Instead of challenging or reducing the prejudice in the world, Google amplified and extended it. As the Holocaust survivor Elie Wiesel said, neutrality helps the oppressor. There is no neutral position that the owners and controllers of digital technology can take. We have to accept that there will always be decisions involving priorities and business considerations and principles, and sometimes prejudice.
The main question is how we govern those decisions. We should govern them the same way we govern other unelected people who hold positions of social responsibility, such as doctors, lawyers, bankers, teachers, and broadcasters. All of these professions occupy a special place of social responsibility, and the law therefore imposes certain obligations on them.
DT: The question of neutrality has come up recently amid all the talk around Twitter and Elon Musk’s now seemingly defunct takeover. Some have suggested that platforms like Twitter are biased, and that some of social media’s problems could be solved if platforms simply intervened less.
JS: One of the long-standing themes of republican political thought is that if you take a stance of neutrality or inaction in social and political strife, what you are really doing is making room for the strong to dominate the weak. A social media platform without rules does not give everyone an equal right to participate. It means certain voices will be drowned out and certain people will be pushed off the platform. In the real world, the government sometimes intervenes in people’s lives through policy to redress imbalances of power. Tech should be no different.
DT: There seems to be a real techno-skepticism at the moment, certainly when you compare it to, say, the cyber-utopianism of the 1990s, when there was a Californian Ideology-style faith that technology could solve all our problems. Can you pinpoint when things changed?
JS: I think it’s pretty clear that it happened in 2016. That year, the Remain side lost the Brexit referendum, and Hillary Clinton’s campaign lost the Electoral College in America. In both campaigns, it was argued by and on behalf of the losing side that the winning side had illegitimately harnessed digital technologies.
Whether concerning micro-targeting or the harvesting of people’s data, some of those claims have withstood scrutiny in the years since, while others have not. But regardless of their merits, I consider that a turning point. That year, the question of the power of digital technology shot to the top of the political agenda. It also exploded as a subject of academic attention.
DT: What steps can we as individuals take to solve some of the problems you describe in the book?
JS: Very few, I’m afraid, and I think it’s important to be honest about that. We need to get out of the mindset that we could protect ourselves and our children if only we were a little more tech-savvy. I believe that is nonsense. The challenges posed by digital technology can only be tackled at the collective level, which means through law. They should not be left to individuals.
DT: So what does this type of collective action or regulatory action look like?
JS: It varies from industry to industry and technology to technology. In the book, however, I lay out several options. First, I think the conduct of powerful individuals in the technology sector should be regulated in the same way as that of doctors and lawyers and pharmacists.
Second, I think we need a theory of antitrust that is less narrowly focused on economic issues than the one we have now. When evaluating whether a particular merger or acquisition is good for society, we should consider not only price; we should also consider things like media plurality and the concentration of political and social power.
Third, I look at ways in which individuals and regulators could contest significant exercises of digital power, whether that’s the algorithms that distribute loans or jobs or housing. What I describe in the book is a reasonably comprehensive legal framework. Underpinning it all is a new way of involving the public in decisions about digital technology. It’s not just a matter of transferring power from tech companies to parliament, but also of returning power from parliament to the people.
This interview has been edited for length and clarity.