Artificial Intelligence – Back to the Future


AI is emerging, with thousands of companies around the world transforming their businesses through artificial intelligence to gain new insight, efficiency and automation. Sam Lightstone, AI strategist and inventor at IBM, takes the American Chamber of Commerce in Portugal (AmCham) through the impact that AI will soon have on all of our lives.

By Chris Graeme

For people who don’t have a background in AI, it can sound scary or even magical, but in fact it is just computer science: bits, bytes and algorithms being used in new ways, quite different from how we programmed computers in the past.
Although it is not magic, it is transformational: there is almost no industry in the world that is not being disrupted in some form by artificial intelligence.
What is this change? The big transformation, says Lightstone, is actually a set of techniques by which we can programme computers through experience, by showing the computer examples instead of programming it functionally.
“For the past 50 years we’ve programmed computers by feeding them very explicit instructions, but that is limited by the ability of a programmer to express their own thoughts, to express their ideas of what they want the computer to do, and therefore it is limited by human expression,” he says.
“With AI we will begin to programme computers by example, by showing the computer examples of data, and from these digital data examples, the computer can begin to understand progressions, patterns, and automate processes based on how things have been handled in the past and emulating and improving on that for the future,” he adds.
In the words of Geoffrey Hinton, the ‘godfather’ of deep learning at the University of Toronto: “Our relationship to computers has changed. Instead of programming them, we now show them and they figure it out.”
It is a dramatic transformation: given enough examples, computers can learn and be programmed to act in more sophisticated ways than was possible in the past.
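As a rough illustration of what programming by example looks like in practice, the sketch below uses the open-source scikit-learn library and its built-in iris dataset (chosen here purely for illustration, not anything Lightstone cited) to train a classifier from labelled examples rather than from hand-written rules.

```python
# Minimal sketch of "programming by example": instead of writing explicit rules
# for telling flower species apart, we show the computer labelled examples and
# let it infer the decision rule itself.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)           # measurements plus correct labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier().fit(X_train, y_train)   # learn from examples
print("accuracy on unseen flowers:", model.score(X_test, y_test))
```

The programmer never writes the rule that separates the species; the model infers it from the examples it is shown.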
A whole range of techniques and algorithms has been developed over the past 50 years, the details of which are complicated and require a PhD in computer and data science to understand in depth.
In a kind of back-to-the-future analogy, Lightstone says most of the algorithms we use today were invented in the 1950s, 1960s, 1970s and 1980s, yet their most remarkable applications are only arriving now and in the years ahead. So, if all this technology is so old, why is AI happening now?
Lightstone says there are three reasons. First, the rise of digital data, which makes AI possible; without data there are not enough examples to show the computer what to do.
“Until the advent of the Internet and digital commerce from the early 2000s, we didn’t have enough data to make AI meaningful,” he says.
Second was the advent of low-cost computing. From the 1950s to the 1970s, computing power was limited and very expensive. With the arrival of personal computing in the 1980s and 1990s, the cost of computers began to drop.
However, it remained relatively expensive until the last few years, in which processing capacity has continued to rise while costs have kept falling; today you can buy compute in the cloud for just cents.
A cellphone today is more powerful than the supercomputer that beat Garry Kasparov at chess in 1997!
This is because between 1997 and today, processing power has grown exponentially. “We are all walking around with supercomputers.”
The third inflection point came in 2012, when Professor Hinton and his research team at the University of Toronto developed a new AI technique for image analytics that could identify what was in a collection of photos.
When they did that, every research group in the world realised that something amazing had happened.
That gave researchers the evidence that dramatic and innovative applications were now within reach.
The thousands of companies now working on AI are a phenomenon that has appeared only since 2012.

Advances since 2012

IBM has developed an AI called Project Debater, which can actually debate with human beings on topics of serious social consequence, such as economic problems, social education and funding for children’s projects.
This AI can debate with the best debaters in the world. And then there is creativity from AI. “We are seeing AI systems that can engage on topics that we previously thought were just the domain of human beings,” says the IBM AI specialist.
One such advance is generative adversarial networks, a technology by which AI systems learn from photographs of human beings and generate new images of realistic but fake people; the same technology can be applied to other domains as well.
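For readers curious about the mechanics, the following is a minimal sketch of the adversarial idea behind such systems, written with the PyTorch library and a toy two-dimensional data distribution standing in for photographs: a generator learns to produce samples that a discriminator can no longer tell apart from the real ones. The network sizes and training settings are arbitrary illustrations, not anything IBM or Hinton’s group used.

```python
# Minimal sketch of a generative adversarial network (GAN) training loop.
# Illustrative only: it learns a toy 2-D "ring" distribution rather than photos,
# but the generator-vs-discriminator idea is the same one used for face synthesis.
import math
import torch
import torch.nn as nn

def make_real_batch(n=128):
    # Sample points on a noisy ring as a stand-in for "real" data.
    angles = torch.rand(n) * 2 * math.pi
    ring = torch.stack([angles.cos(), angles.sin()], dim=1)
    return ring + 0.05 * torch.randn(n, 2)

generator = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
discriminator = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = make_real_batch()
    noise = torch.randn(len(real), 8)
    fake = generator(noise)

    # Discriminator step: label real samples 1 and generated samples 0.
    d_loss = (loss_fn(discriminator(real), torch.ones(len(real), 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(len(real), 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator step: try to make the discriminator believe its samples are real.
    g_loss = loss_fn(discriminator(fake), torch.ones(len(real), 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```

In a real face-generating system the generator and discriminator are deep convolutional networks trained on millions of images, but the two-player training loop is the same.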
There is AI that can create artwork, or the recipes in a cookbook: an AI chef called Chef Watson not only creates new recipes, but understands the chemical composition of foods that might go well together to create new tastes.
“AI has become very dramatic and can do some very shocking things, and if this is what we can accomplish in just eight years, imagine how far we can go in the next decade or two?” says Sam Lightstone.
IBM has been working with AI to create sophisticated new solutions for business. AI depends on rigorous and professional foundations for data, known as Information Architecture.
Without Information Architecture it is very difficult to build AI in a professional and secure manner.
This is now transforming businesses around the world, and during the Covid era the ability of people to conduct business when they are not physically present is only accelerating the push to digitalisation, automation and cost recovery.
In conclusion, Sam Lightstone calls these “exciting times”, in which the whole world is going to transform very dramatically and in which AI can be built, trusted and used to automate business without needing a PhD.
IBM is even creating AI that creates AI, called AutoAI, whereby even a fairly junior programmer will be able to build quite sophisticated AI models.
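To give a flavour of the idea, the snippet below is a generic automated-model-selection sketch using the open-source scikit-learn library (it is not IBM’s AutoAI product): the computer searches over candidate settings and picks the best model on its own.

```python
# Generic illustration of the "AI that builds AI" idea (automated model selection).
# A search explores hyperparameters automatically, cross-validating each choice,
# so the programmer does very little hand-tuning.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([("scale", StandardScaler()),
                     ("model", LogisticRegression(max_iter=5000))])

search = GridSearchCV(pipeline, {"model__C": [0.01, 0.1, 1, 10]}, cv=5)
search.fit(X_train, y_train)

print("best settings:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```

The programmer only specifies what to try; the search decides what works, which is the essence of AI that builds AI.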
Even the potato chip company Lay’s is using AI to improve the development and deployment of new products: what used to take 240 weeks now takes just 30 days.