‘I know we can still shape that world, and make it into a place which reflects our humanity, our cultures and our cares’ Photograph: Tegan Osborne
Three decades ago I left Australia to study anthropology in America. That journey took me to the heart of Silicon Valley. My job was to put people back into the process by which technology is made. Eight months ago I came back to Australia.
My time in Silicon Valley has left me with the distinct sense that we need to keep reasserting the importance of people, and the diversity of our lived experiences, in our conversations about technology and the future. It is easy to be seduced by the potential of the new and the wonders it promises. There is a lot of hype and not much measured discussion. So it is time for a conversation about our possible digital and human futures, and about the world we might want to make together. What actions can we take, individually and collectively? Is there a particular Australian thread we could follow? I want to suggest four things we should do in Australia.
Build new approaches
We will need new practitioners to tame and manage the emerging data-driven digital world, as well as people to regulate and govern it. Rather than just tweaking existing disciplines, we need to develop a new set of critical questions and perspectives. Working out how to navigate our humanity in the context of this data-driven digital world requires conversations across the disciplines. In the university sector, we need to rethink how we fund, support and reward research and researchers. At a funding level, our privileging of STEM at the expense of the other disciplines is short-sighted at best, and detrimental at worst.
Invest in the human-scale conversation
We need to invest in hard conversations that tackle the ethics, morality and underlying cultural philosophy of these new digital technologies in Australian lives. Do we need an institute, a consortium or a governmental thinktank? I am not sure which, but any of them would be a good start. We have a great deal of concern about our future and the role of technology in it.
We have a responsibility to tell more nuanced, and yes, more complicated stories – governments, NGOs, industry, news media, every one of us. We also have a responsibility to ask better questions ourselves. We should be educated stakeholders in our own future; and this requires work and willingness to get past the easy seduction of killer robots. So the next time you hear a story about those killer robots, ask yourself: what is the history of this technology? What are its vested interests? Who are its beneficiaries? And most importantly, what is the broader context into which it fits?
Strive for accountability
How will our humanness be expressed in a world shaped by algorithms in which you have no say, and into which you have no insight? Should there be accountability, transparency and openness? And if so, to whom? And how would we manifest that? Where is the duty of care for this new data-driven version of our smart, fast and connected digital world? We should be actively developing an appropriate regulatory and policy framework for Australia. Should Australians own their data, as Europeans do? Should we mandate that algorithms are subject to review and scrutiny, even when they are built elsewhere and by commercial interests? Should we require, as we do with new drug treatments, that they be appropriately tested before they are released here? And how do we ensure that our regulators and policymakers fully understand these new technologies and infrastructures? We ask board directors to be certified financially – perhaps we should ask our regulators and politicians to be certified technically.
Make our own futures
I am tempted to suggest there is one simple question here: do we want to be Australian in this new data-driven smart, fast and connected world, or just another colony of some transnational, commercial empire? Of course, it is not that simple, nor should it be. But algorithms and the data-centric world they help build are manifestations of cultural values and logics that arise in very particular places and contexts. While it is certainly the case that in Australia there is currently a robust debate about what our value set might be, I think it is safe to say we might want to embody our own values in the data-driven digital world around us. For me those values include things like fairness, equity, social justice and civic society. So should we build Australian algorithms? Yes. In fact, we already are. But it has not been without its challenges, including regulation, oversight and accountability, and open questions remain about whose values we are modelling.
And of course, there are bigger questions about the role of technology more broadly. In making the machines smarter, we have sacrificed a little something of ourselves – we run the risk of being reduced to data and the decisions it drives. I believe we are more than data, more than just intelligence. I worry that, in our current focus on the digital, we have lost some of our agency and some of our sense of what being human might mean. But I don’t believe this is inevitable or irreversible.
Historically, we backed ourselves with a bigger vision of what we stood for and the lives we wanted for ourselves, our families and our country – things that would not have happened without that bigger vision, and that at times ran counter to market forces and conventional wisdom. Things like the Snowy Mountains scheme, the ABC, the Sydney Opera House, the 1967 referendum, Medicare, HECS, superannuation, the NBN as originally planned and, I hope soon, marriage equality.
I know we can still shape that world, and make it into a place which reflects our humanity, our cultures and our cares. We have done so before, and we can do so again. It requires that we enter a conversation about the role of technology in our society, and about how we want to navigate being human in a digital world. I think we have a moral obligation to do just that, to shape a world in which we might all want to live. And that’s why I came home.
Professor Genevieve Bell is presenter of the ABC’s 2017 Boyer Lectures and director of the Autonomy, Agency & Assurance (3A) Innovation Institute at ANU, co-founded by Data61