Recent publicity over Centrelink’s automated debt recovery program has reignited debate about how algorithms and data matching are used to inform decisions in both the public and private sectors, and about the need to ensure that human judgement continues to play a role.
The use of complex algorithms to automate processes might reduce costs, but ICT professionals need to ensure that appropriate checks and balances are in place to achieve the desired result.
No one would dispute the Government’s right, and indeed responsibility, to protect public monies by ensuring that welfare recipients receive their exact entitlements and no more. The Government has clarified its approach, while making adjustments to soften the impact and to ensure that recipients under the debt recovery program understand what steps are available to them and how to exercise their rights.
Labor is pushing ahead with calls for a Senate inquiry and demanding that Centrelink’s data matching system be suspended until a comprehensive review has taken place. The Commonwealth Ombudsman is conducting his own investigation after receiving a series of complaints.
The critical question is: how do we apply a Duty of Care to ensure that automated systems are fit for purpose and the public is appropriately protected?
Retain the Human Touch
Associate Professor Helen Hodgson of Curtin University has questioned the validity of matching income data reported to Centrelink against information held by the ATO.
In an article published in The Conversation, she highlights the differences between the two datasets and how they were captured and managed. She suggests that Centrelink could learn from the ATO’s “light touch” approach of engaging earlier with taxpayers to resolve issues and support them in meeting their obligations.
“A properly designed income-matching system can help to identify anomalies, but the information needs to be appropriate and fit for purpose,” she wrote. “The current exercise is using information in a way that it was not designed to be applied, without properly adjusting for the technical and policy differences between the tax and social security system.”
An algorithm interprets the data it is given and applies the rules that have been coded. At this juncture, algorithmic technologies may not be capable of embodying all the inherent human social norms and values needed to anticipate individual circumstances, judgements or options.
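To make the pitfall concrete, consider a minimal sketch in Python. The figures, the income-free threshold and the simple averaging rule below are all illustrative assumptions, not Centrelink’s actual logic; the sketch only shows how smearing an annual ATO figure evenly across fortnights can wrongly flag periods in which a recipient earned nothing:

```python
# Hypothetical illustration of an annual-income-averaging rule.
# The figures and the rule itself are illustrative assumptions,
# not Centrelink's actual data-matching logic.

FORTNIGHTS_PER_YEAR = 26
INCOME_FREE_AREA = 500.0  # assumed fortnightly earnings threshold

# Assumed earnings pattern: six months of work, then six months unemployed.
actual_earnings = [2000.0] * 13 + [0.0] * 13
ato_annual_income = sum(actual_earnings)  # what an annual ATO record shows

# Naive rule: smear the annual figure evenly across all 26 fortnights.
averaged = ato_annual_income / FORTNIGHTS_PER_YEAR  # $1,000 per fortnight

for fortnight, actual in enumerate(actual_earnings, start=1):
    flagged_by_average = averaged > INCOME_FREE_AREA  # True in every fortnight
    flagged_by_actual = actual > INCOME_FREE_AREA     # False once work stops
    if flagged_by_average != flagged_by_actual:
        print(f"Fortnight {fortnight}: averaging wrongly flags this period")
```

The arithmetic is trivial; the point is the design flaw it exposes: a rule that is sound for annual tax data breaks silently when applied to fortnightly entitlement periods.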
Last year in this column, I welcomed moves by ASIC to introduce guidelines for monitoring the algorithms and “robo-advisors” increasingly being used in the financial sector. It would seem reasonable to apply similar guidelines to government algorithms such as Centrelink’s, ensuring some human oversight of the “automated advice” and driving fair, accurate and sustainable outcomes.
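What such human oversight might look like in practice is a design choice; as one hedged sketch, using hypothetical names and thresholds, a matching system could triage discrepancies so that automation may clear a case but only a human officer may confirm a debt:

```python
from dataclasses import dataclass

@dataclass
class Discrepancy:
    recipient_id: str
    reported: float  # income the recipient declared to the agency
    matched: float   # income inferred from the external dataset

def triage(d: Discrepancy, tolerance: float = 0.1) -> str:
    """Hypothetical human-in-the-loop gate: automation may clear a case,
    but only a human reviewer may confirm a debt."""
    gap = abs(d.matched - d.reported)
    if gap <= tolerance * max(d.reported, 1.0):
        return "auto-clear"            # within tolerance: no action taken
    return "refer-to-human-review"     # never auto-issue a debt notice

print(triage(Discrepancy("r-001", reported=12000.0, matched=12500.0)))  # auto-clear
print(triage(Discrepancy("r-002", reported=12000.0, matched=26000.0)))  # refer-to-human-review
```

The asymmetry is deliberate: the system can err on the side of clearing or referring a case, but it never issues a debt notice without a person in the loop.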
A Duty of Care
At the ACS Ministerial Forum last November, former British Computer Society President, Professor Liz Bacon, raised concerns about the potential impact of intelligent systems and associated algorithms.
“We, as ICT professionals, have a duty of care to the public. Algorithms can have life and death consequences so their design and implementation at all levels need to be verified in detail before use. This is a huge challenge and, at the moment, I don’t see governments being able to keep legislation in sync with the pace of technological change and its impact on society. The professional bodies need to step up to that challenge now.”
She also referenced a 178-nation survey of over 50,000 programmers conducted last year by the online coding community Stack Overflow, which found that 69 per cent were partly or completely self-taught. This raises questions about the professional standards of the people coding our technology systems, in relation to the quality, security, accuracy and maintainability of their code, and about their ability to consider ethical issues and other professional concerns.
While most people have become more comfortable with data sharing and open government, it’s important to remember that data is always historical and cannot accurately predict all future behaviour or outcomes.
How can we reinforce professional standards and best practice to ensure that ICT practitioners take account of ethics, regulatory requirements, human values and social norms when designing and implementing algorithms and “automated systems”, particularly when so many are self-taught?
When the International Professional Practice Partnership (IP3) met in Sydney late last year, it launched the IFIP Duty of Care for Everything Digital (iDOCED) initiative. This reminds suppliers and consumers of their duty of care in relation to digital products and services. It is the global ICT profession’s response to high profile system failures and instances of poor ethics and low quality products/services in Australia and overseas.
Anthony Wong is President of the ACS and Chief Executive of AGW Consulting P/L, a multidisciplinary ICT, Intellectual Property Legal and Consulting Practice.