
The World Cup, Artificial Intelligence and Blueberry Muffins!

 

When it comes to disruptive technologies, nothing is more on trend right now than Artificial Intelligence, or AI as it's commonly known, because it's one of those technologies that we know will impact business, economic and social models as well as our own personal lives. AI is just one part of the larger field of Data Science, which, at its simplest, is the art of extracting value or business insights from data.

While the term Artificial Intelligence was first coined by John McCarthy in 1956, the concept of computers performing cognitive functions that mirror those of humans has been around for decades. English mathematician Alan Turing's paper 'Computing Machinery and Intelligence', published in 1950, posed the question 'Can machines think?' and introduced the 'Turing test', a model for measuring intelligence. Originally called 'the Imitation Game', it put forward the idea of machines being able to move beyond purely logical thinking and into the realm of cognitive thinking, using skills like learning, reasoning, remembering, understanding and deduction/inference.

Despite decades of progress, only one computer program is generally believed to have passed the Turing test, a long-term goal of AI researchers: making a computer imitate human behaviour so convincingly that its responses are indistinguishable from a human's, even to expert judges.

But if AI has been around for so long, why the big fuss now? Well, because suddenly AI looks far more accessible, moving it from the realms of Science Fiction to plausible reality. This is predominantly due to…

  • Advances in processing power and computational infrastructure
  • The Internet of Things (IoT) – billions of connected devices
  • The availability of vast amounts of data
  • The ability to handle unstructured data (images, video etc.)

The problem I see now, though, is that in the global race to embrace AI and gain first-mover advantage, people and businesses are trying to solve non-AI problems with AI solutions when simpler approaches would suffice. It's not that there is a lack of understanding of the problem being solved, but rather an urgent desire to create case studies in order to experiment and grow a proficiency in AI. Time and time again, I hear people use the words Artificial Intelligence when they really mean RPA (Robotic Process Automation), visual inspection systems, general robotics and much more.

As I sat watching the World Cup soccer over the weekend, it struck me that a lot of Data Science terminology might be easily illustrated using Russian nesting dolls (also known as Babushkas or Matryoshkas): AI is the broader field, Machine Learning and Deep Learning are subsets of it, and systems become increasingly 'more human-like' the deeper you go.

Fig. 1.0 Data Science hierarchy

Fig. 2.0 Data Science definitions


Machine Learning is built on the idea that machines can learn new things without humans explicitly programming them, essentially becoming what we call 'smart'. Despite great progress, it's not as easy as it sounds. Even teaching computers to distinguish between similar images is complex, because what we humans see as obvious identifying markers (colour, form, distribution etc.) are not so obvious (or, more importantly, not so unique) when viewed through a computer's logic and statistical data lens. This was illustrated a couple of years ago in a viral set of tweets by Karen Zack (@teenybiscuit) comparing images of animals and food.

 

While these examples are funny, this type of identification error is common and could have far-reaching implications in areas like precision medicine, autonomous vehicles and science in general.
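To make that point a little more concrete, here is a minimal sketch (not from the original article) of how a simple Machine Learning classifier is trained and where it gets confused. It uses scikit-learn's built-in handwritten-digits dataset purely as a stand-in for the 'chihuahua or muffin' problem: the model learns only from pixel statistics, and the confusion matrix shows which visually similar classes it mixes up.

# A hedged, illustrative sketch: train a basic classifier on small
# grey-scale images and inspect where visually similar classes collide.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

digits = load_digits()  # 8x8 grey-scale images of the digits 0-9

# Hold back 30% of the images so we can test on examples the model
# has never 'seen' during training.
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0
)

# The model learns purely from pixel values; it has no human notion of
# colour, form or context.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

# Rows are the true digit, columns are the predicted digit; off-diagonal
# entries are the 'muffin mistaken for chihuahua' cases, e.g. a 3 read as an 8.
print(confusion_matrix(y_test, model.predict(X_test)))

Even on a tidy, well-labelled dataset like this, some classes get confused, which is exactly why identification errors in messier real-world images are so hard to stamp out.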

There's no doubt that we are entering an era where Artificial Intelligence will make a big impact sooner rather than later, because the speed of progress and adoption is increasing. It has taken decades to get to this point in the AI journey, but the next phase could look more like years, and while this generation treads carefully with its outcomes, future generations will accept and trust it faster as it becomes ubiquitous and everyday.

Our biggest hurdles may be yet to come, in the form of eliminating bias and overcoming legal and ethical challenges, but that's a blog for another day.

 

Gillian Bergin, Dell EMC & Director it@cork