Artificial intelligence: man with machine

In my last Biznology post, Man versus machine, I discussed how advances in AI are feeding a growing fear that machines may soon be doing our jobs better than we can, making us all obsolete. This discussion is as old as technology itself, and one can argue that the fundamental objective of technology is exactly that: to do something better than is possible today, whether by human or by machine. What is novel is how fast it’s happening, potentially forcing us out of our jobs several times before we retire. Some people even predict a future where work as we know it today will no longer be required. Others try to reconcile the two narratives and bet on a hybrid future, in which humans, amplified by AI advances, do work we can’t even imagine today.

One aspect often neglected in these discussions is that AI and machine learning are not the only areas seeing remarkable advances these days. In Walter Isaacson‘s biography of Steve Jobs, the Apple CEO is quoted as saying about his disease: “I’m either going to be one of the first to be able to outrun a cancer like this, or I’m going to be one of the last to die from it.” Writing about Jobs’s legacy in personalized medicine, Antonio Regalado of the MIT Technology Review had an interesting take on the link between genetics and the digital revolution:

DNA is a profoundly digital molecule. And now that it’s become very cheap to decode, genetic data is piling up by the terabyte.

For all the complexity of our body’s biochemistry, the simplicity and elegance of our genetic code are remarkably similar to binary computer representation. You can describe pretty much a living being’s full genetic makeup through sequences made of just four nucleotide letters: A, T, C, and G. We already have technology that can quickly read key parts of our genetic sequence, and emerging techniques like CRISPR hold the potential to let scientists edit parts of that code.
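
To make the “profoundly digital” point concrete, here is a minimal Python sketch of the idea: with only four letters, each base fits in two bits, so four bases pack into a single byte. The particular letter-to-bits mapping below is an illustrative assumption, not any standard encoding.

```python
# Minimal sketch: DNA as digital data. Each of the four bases fits in
# 2 bits, so a sequence packs 4 bases per byte.
# The A/C/G/T-to-bits mapping is an illustrative choice, not a standard.

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BITS_TO_BASE = {bits: base for base, bits in BASE_TO_BITS.items()}

def encode(seq: str) -> bytes:
    """Pack a nucleotide string into bytes, 4 bases per byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)

def decode(data: bytes, n_bases: int) -> str:
    """Unpack bytes produced by encode() back into a nucleotide string."""
    bases = []
    for i, byte in enumerate(data):
        k = min(4, n_bases - 4 * i)  # number of bases stored in this byte
        for j in range(k):
            bases.append(BITS_TO_BASE[(byte >> (2 * (k - 1 - j))) & 0b11])
    return "".join(bases)

seq = "GATTACA"
assert decode(encode(seq), len(seq)) == seq  # 7 bases fit in 2 bytes
```

At that density, the roughly three billion base pairs of a human genome would fit in about 750 MB, which helps explain how cheap decoding leads to genetic data “piling up by the terabyte.”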

Neuroprosthetics is another area where new developments are redefining what a human paired with a machine can do. From augmenting our visual and auditory capabilities to technologies that enhance human intelligence, we may see learning itself reinvented over the next few decades.

Finally, there is significant buzz around nootropics, also known as cognitive enhancers: drugs that are said to augment or accelerate our ability to learn new things.

Naturally, I’m not saying all these advances are necessarily a good thing. There will be plenty of ethical concerns to address as new technologies enable us to become real-world Tony Starks in a not-so-distant future. But paying too much attention to the digital revolution in its strict computing sense can prevent us from noticing that there is a bigger world of science out there, one that may ultimately redefine what it means to be artificially intelligent.

Aaron Kim

Aaron Kim is the Head of Digital Social Collaboration at the Royal Bank of Canada, and led the efforts to bring social business and social collaboration to an organization of 79,000 employees. He has also been a public speaker at several events across the globe, from the Web 2.0 Expo to JiveWorld, from Singapore to Barcelona. He has a passion for innovation and for making work smarter, more meaningful, and more rewarding for all. Born and raised in Brazil to a Korean father and a Japanese mother, he also volunteers in several diversity initiatives, inside and outside RBC. In the past, he worked as a consultant at both IBM Canada and Unisys Brazil, playing the roles of solutions architect, Basel II analyst, performance engineer, Java programmer, Unix administrator, and environmental biologist. He holds an MBA from the University of Toronto and a bachelor’s degree in Biology from the Universidade de São Paulo. He lives in Toronto, Canada, is married to Tania, and has a son, Lucas.
