Why should everybody learn Artificial Intelligence?
by Amina Souag, Scott Turner
School of Engineering, Technology and Design
Canterbury Christ Church University
Artificial Intelligence (AI) is not science fiction. It is around us now (e.g., in automatic number plate recognition and credit-card fraud detection), and it is here to stay. AI is also not just one technology, but a range of technologies inspired by everything from how the brain works to how ants find food. These allow computers to appear intelligent and to apply more focused processing power than the human brain can, though usually only to narrowly defined tasks. This is why AI technology has become so important to the modern economy. AI is here and working now.
What is artificial intelligence?
The prospect of creating intelligent computers has fascinated many people for as long as computers have been around[1]. In 1950, Alan Turing raised the question: ‘Can a machine think?’
The first AI projects sought to program in knowledge about the world and then reason about statements automatically, using logic to infer rules – a knowledge-base approach. An alternative approach has computers acquire their own knowledge by extracting patterns from raw data – machine learning. In recent years, machine learning has been the approach behind many of the developments with the greatest impact, in areas such as facial recognition, medical applications, customer service, and manufacturing; essentially, wherever there is a pattern to model, machine learning has been applied.
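To make the distinction concrete, here is a minimal Python sketch contrasting the two approaches on a toy spam-filtering task. It assumes the scikit-learn library is available; the features, example data, and hand-coded threshold are invented purely for illustration.

from sklearn.tree import DecisionTreeClassifier

# Knowledge-base approach: a human writes the rule explicitly.
def rule_based_is_spam(num_links, has_greeting):
    # Hand-coded (illustrative) rule: many links and no personal
    # greeting looks like spam.
    return num_links > 3 and not has_greeting

# Machine-learning approach: the rule is extracted from example data.
# Each row is [num_links, has_greeting]; labels: 1 = spam, 0 = not spam.
X = [[5, 0], [7, 0], [0, 1], [1, 1], [6, 0], [2, 1]]
y = [1, 1, 0, 0, 1, 0]

model = DecisionTreeClassifier().fit(X, y)  # pattern learned from the data
print(model.predict([[4, 0]]))              # e.g. [1] -> classified as spam

In the first case the knowledge lives in the rule a person wrote; in the second, an equivalent rule is inferred from labelled examples, which is why machine learning scales to problems where nobody can write the rules down.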
Recent findings from the IBM Global AI Adoption Index, conducted in April 2022[2], reveal that the UK narrowly had the highest proportion of companies exploring the use of AI (47%). A majority (85%) of UK respondents also said their companies either have some sort of AI strategy or are developing one – only slightly behind the European average of 89%. UK businesses are using AI for a wide range of purposes. The most cited in the study include business analytics or intelligence (29%); automation of key IT processes (25%) or business processes (24%); supporting marketing and sales (23%); and fraud detection (22%). These findings came as the UK government pursued its National AI Strategy, launched in September 2021, which aims to nurture the country’s AI ecosystem and support the transition to an AI-enabled economy. The UK is ranked third globally for private investment into AI companies and is home to a third of Europe’s total AI businesses.
It is not all rosy, though. One of the arguments often touted, that “AI cannot do creative work”, has gradually been coming under threat. Probably the two biggest reasons are the development of tools such as DALL-E (https://openai.com/dall-e-2/) for creating images and GPT-3 (https://openai.com/blog/gpt-3-apps/) for creating text, which are arguably making people a little less confident that AI cannot, at the very least, appear creative.
This is not an obscure research technology; it has actively been used by companies – Cambridge Analytica, anyone? It is easy to overestimate AI by thinking the current technologies will inevitably take over everything, when most are applied to narrowly defined tasks, often doing them very well. At the same time, we should not underestimate it by assuming it cannot do creative things. AI is not going to go away and will continue to develop. It is already changing things quietly. Only by understanding that it has strengths and weaknesses, and is not all one thing, can we hope to have a say in where it goes.
[1] Kok, J.N., Boers, E.J., Kosters, W.A., Van der Putten, P. and Poel, M., 2009. Artificial intelligence: definition, trends, techniques, and cases. Artificial Intelligence, 1, pp.270–299.
[2] https://www.swlondoner.co.uk/news/20052022-skills-shortage-stalls-uks-adoption-of-artificial-intelligence-as-europe-continues-to-accelerate-2