Our colleague Nalaka Gunawardene has written a Facebook post where he asks “Robots in politics? Why not?”
This provides a gateway for a substantive discussion on the role of technology in governance.
First, we have to rephrase the question. I understand politics to be the art of contributing in various ways to governance.
In a democracy, a government cannot satisfy the demands of just one stakeholder group, even in the unlikely event that it is 100 percent right. It certainly cannot satisfy the demands of a stakeholder group that is intent on destroying the very existence of another legitimate stakeholder. Its decisions must be based on the public interest: ensuring the best possible curative services for today's users of taxpayer-funded hospitals across the country, while also taking into account the future curative and palliative care needs of a rapidly aging population.
I wrote the above in the context of a protracted struggle by the government doctors' trade union and university student organizations to shut down a private medical university in Sri Lanka.
We should not be disappointed by the actions of politicians, because they are driven by imperatives of gaining and retaining power. But governance is a different matter.
Now we come to robots. What politicians who reach positions of power in a democratic government must do is work out compromise solutions. Can a robot do this? In principle, yes: either through algorithms (pre-defined rules) or through artificial intelligence (AI), where the system develops the rules itself. The other features of robots, such as the ability to move around or speak, are therefore irrelevant. What matters is the AI in the robot, not the robot itself.
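To make the distinction concrete, here is a minimal sketch in Python of the two approaches. The stakeholder names, demands and historical acceptance rates are invented for illustration; neither function is a serious model of negotiation.

```python
from statistics import mean

# Each (hypothetical) stakeholder states the share of a hospital budget they want
# allocated to curative care, the rest going to palliative care. Values are invented.
demands = {"doctors_union": 0.80, "patients": 0.55, "treasury": 0.40}

def rule_based_compromise(demands):
    """Algorithmic approach: apply a fixed, pre-defined rule (here, the simple average)."""
    return mean(demands.values())

def learned_compromise(demands, past_acceptance):
    """AI-style approach: weight each stakeholder by how often similar past
    settlements held up, so the 'rule' emerges from data rather than being fixed.
    past_acceptance maps stakeholder -> historical acceptance rate (invented)."""
    total = sum(past_acceptance.values())
    return sum(demands[s] * past_acceptance[s] / total for s in demands)

print(rule_based_compromise(demands))                       # ~0.58, fixed rule
print(learned_compromise(demands, {"doctors_union": 0.3,
                                   "patients": 0.9,
                                   "treasury": 0.6}))       # ~0.54, data-weighted
```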
What is unlikely is that AI will succeed in developing compromises that are acceptable to all stakeholders. The emotional work of listening, addressing unstated concerns and so on is not likely to be AI's strong suit.
As one of the executives responsible for Apple’s streaming music service said to the Guardian in 2015:
“Algorithms are really great, of course, but they need a bit of a human touch in them, helping form the right sequence. Some algorithms wouldn’t know that Rock Steady could follow Start Me Up, y’know. That’s hard to do,” says Iovine.
“You have to humanise it a bit, because it’s a real art to telling you what song comes next. Algorithms can’t do it alone. They’re very handy, and you can’t do something of this scale without ‘em, but you need a strong human element.”
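Iovine's point can be made concrete with a toy sketch of the kind of algorithm he is describing, one that picks the next song purely by feature similarity. The track metadata and the similarity weighting below are invented for illustration.

```python
# Hypothetical track features; a real service would use far richer data.
tracks = {
    "Start Me Up": {"tempo": 125, "energy": 0.9},
    "Rock Steady": {"tempo": 104, "energy": 0.8},
    "Moon River":  {"tempo": 68,  "energy": 0.2},
}

def next_track(current, library):
    """Choose the unplayed track closest in tempo and energy to the current one.
    There is no sense here of the 'art' of what actually follows well."""
    cur = library[current]
    candidates = {name: f for name, f in library.items() if name != current}
    return min(candidates,
               key=lambda n: abs(candidates[n]["tempo"] - cur["tempo"])
                           + 100 * abs(candidates[n]["energy"] - cur["energy"]))

print(next_track("Start Me Up", tracks))  # picks "Rock Steady" on similarity alone
```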
2 Comments
Nuwan Waidyanatha
GIGO – garbage in, garbage out. The reliability of AI depends on how it learns (supervised or unsupervised). Influence and training can make AI systems biased in their reasoning, just like humans; for example, an algorithm may propose one governance method for dark-skinned people and another for white-skinned people because of the profiling data it is fed. After all, the data is generated by human instruments. AI neural weights are derived from positive (happiness) and negative (sadness) factors. Therefore, like politicians, AI is also inclined towards "waasi-peththata hoiya" (වාසි පැත්තට හොය්යා – roughly, pulling for whichever side is advantageous).
https://psmag.com/news/artificial-intelligence-will-be-as-biased-and-prejudiced-as-its-human-creators
Nalaka Gunawardene
Interesting discussion here:
https://theconversation.com/can-we-replace-politicians-with-robots-56683
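Nuwan's GIGO point can be illustrated with a toy sketch: a trivial "model" that learns the majority decision per group simply reproduces whatever skew its training data contains. The groups, decisions and counts below are hypothetical.

```python
from collections import defaultdict

# Training records: (group, decision) pairs, skewed by the humans who produced them.
training_data = ([("group_a", "approve")] * 90 + [("group_a", "reject")] * 10
                 + [("group_b", "approve")] * 40 + [("group_b", "reject")] * 60)

def train(records):
    """Learn the majority decision seen for each group -- the bias comes straight from the data."""
    counts = defaultdict(lambda: defaultdict(int))
    for group, decision in records:
        counts[group][decision] += 1
    return {group: max(decisions, key=decisions.get)
            for group, decisions in counts.items()}

model = train(training_data)
print(model)  # {'group_a': 'approve', 'group_b': 'reject'} -- garbage in, garbage out
```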