Regardless of whether you think machines and robots are going to take your job or whether you’re more optimistic and believe new roles will develop, one thing is certain – jobs of the future will be different.
With automation and robots providing cost-effective alternatives for manual or repetitive functions, and algorithms and AI staking a claim to higher-order tasks, what are the sustainable and emerging roles for humans in the digital economy, and how do we prepare for these opportunities?
I’m hard-pressed to think of any profession or role that hasn’t been impacted and moulded by technological change in some way, but of course the degree of change varies.
A growing number of professions – from medicine, law and financial services to mining and agriculture – have been radically changed by the application of technology. While ICT professionals are still largely the ones driving the development of new technologies for these other disciplines, we increasingly need the professionals in these disciplines to embrace new digital and technological advances and opportunities to function effectively, and to remain competitive.
When I did my double degree in computer science and law at Monash University, most people couldn’t comprehend the relevance of such a combination. However, joining these disciplines has not only allowed me to play leadership roles in ICT, but also empowered me with the knowledge to assist in re-shaping the fabric of our old-world policies and regulatory frameworks. This is essential as emergent technologies disrupt old business models and institutions, eroding long-held regulatory beliefs and assumptions.
Multidisciplinary skills will position our future workforce for the many transformative changes that are yet to manifest – enabling them to explore, rethink and redesign innovative and creative solutions to some of our current complex problems and emerging challenges.
Consider that it takes someone skilled in both medicine and technology to appreciate all the ways in which nanobots and microsurgical techniques can be used to save lives, and to push the boundaries of where these technologies can go in the future.
This is also true in other areas of medicine, such as the emerging area of radiomics, which uses algorithms to extract large amounts of data from routine medical scans to reveal previously undetected anomalies and predict future health.
This collaboration between radiology and computer science can predict medical outcomes in ways that doctors have not been trained to do, significantly increasing the quality of information available about the patient.
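To make the radiomics idea concrete, the sketch below extracts a handful of first-order intensity features from a synthetic 2D array standing in for a medical scan. This is purely illustrative: real radiomics pipelines (e.g. the open-source PyRadiomics library) compute hundreds of shape, intensity and texture features from segmented regions of actual imaging data, and the feature names, bin count and synthetic "lesion" here are my own assumptions, not a clinical method.

```python
import numpy as np

def extract_features(scan: np.ndarray) -> dict:
    """Compute a few first-order intensity statistics from a scan array."""
    flat = scan.ravel()
    # Histogram-based entropy: a rough measure of intensity texture.
    hist, _ = np.histogram(flat, bins=32)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum())
    return {
        "mean_intensity": float(flat.mean()),
        "std_intensity": float(flat.std()),
        "min_intensity": float(flat.min()),
        "max_intensity": float(flat.max()),
        "entropy": entropy,
    }

# Synthetic "scan": noisy background with a brighter circular region
# simulating an anomaly a radiomics model might flag.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:64, 0:64]
scan = rng.normal(100, 5, (64, 64))
scan[(yy - 32) ** 2 + (xx - 32) ** 2 < 100] += 40  # simulated lesion

features = extract_features(scan)
print(features)
```

Features like these become the inputs to statistical or machine-learning models that look for patterns across many patients, which is where the predictive power described above comes from.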
Similarly, researchers at Harvard and Stanford are training algorithms to identify and monitor different types of cancers to improve diagnoses and options for treatment.
Other disciplines are being similarly enhanced through the application of technology, such as the mining industry, where companies like GE, IBM and Schlumberger are staking their claims on a digital future.
A global study of mining companies earlier this year highlighted an increased focus on the use of digital tools and systems, ranging from big data and cloud computing to artificial intelligence and data analytics, to improve decision-making and enhance competitiveness.
Rio Tinto’s global boss, Jean-Sébastien Jacques, has even stated publicly that he expects their future competitors to be the technology companies themselves. When you consider how emerging tech players have massively disrupted other sectors like taxis (Uber), retail (Amazon) and accommodation (Airbnb), his prediction becomes less surprising.
The increasing prevalence of disruptive technologies across every industry sector is creating new challenges for educators seeking to prepare the next generation of professionals.
In a context of rapid technological change, how do we ensure that future professionals including doctors, lawyers, scientists and teachers are empowered not only to engage effectively in their profession, but to do it in a way that rides the wave of current and emerging technologies?
This will mean radical changes to the curricula and teaching approaches of a whole range of disciplines to empower students to work cooperatively with technology and with ICT professionals, applying digital tools in a multidisciplinary way for problem-solving, service delivery, marketing, product development and more.
According to education researcher Alison King, this will require a more collaborative learning approach in which teachers transition from being a “sage on the stage” to more of a “guide on the side”.
In an article in College Teaching magazine, she wrote, “Engaging our students in such active learning experiences helps them to think for themselves, to move away from the reproduction of knowledge toward the production of knowledge. It helps them become critical thinkers and creative problem solvers so that they can deal effectively with the challenges of the twenty-first century.”
The degree to which future professionals collaborate with technologists or become skilled in the technology themselves will determine their ability to differentiate themselves in their market by developing new products and services for their customers.
The question is how quickly our universities and tertiary institutions can transition their academic programs to embrace more collaborative and multidisciplinary approaches, for that will determine our future competitiveness.