The Age of Artificial Intelligence: A brief history...

The promise of artificial intelligence (AI) has been around since World War II, but it is only in the last decade or so that technology and engineering have gradually caught up with expectations. In fact, AI now seems to be on a winning streak. But before worrying about being taken over by robots, let's get a clear understanding of what AI actually is. This article series aims to summarise the history, ethics, approaches, tools and applications surrounding AI.

A simple web search will tell you that in computer science, artificial intelligence, sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans. The last decade has seen a massive increase in research effort, business investment and general acceptance of artificial intelligence. Andrew Ng, an authority in the field, calls it the New Electricity [1] because, just as electricity transformed almost everything some 100 years ago, no industry will escape AI transformation in the coming years.

For a better understanding, we need to take a step back in time. The theory behind AI, and the concepts and mathematics behind the most prevalent algorithms, have been around for decades. Back in 1955, Arthur Samuel [2] created a checkers-playing program on a rudimentary IBM 701 machine that could learn, using a combination of a tree search algorithm with learned weights [3]. In 1961, it challenged and defeated the Connecticut state champion! This was not to last, and the years from 1967 to 1976 brought the first AI Winter. The appearance of cheap personal computers in small offices and households in the late '70s and early '80s brought a resurgence in interest. However, this waned once again towards the end of the decade as the realisation grew that, in spite of what 'cheap' hardware promised, it was still not enough to turn AI into something useful. The years were not without success, and one of the most important papers at the core of today's revolution was published in Nature [4] in this period. Ironically, some of the most fruitful seeds of the period came out of Hollywood, where productions such as WarGames, Short Circuit and D.A.R.Y.L. inspired a whole new generation already accustomed to easy access to computers and the idea of coding.
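
To make Samuel's 'tree search plus learned weights' recipe concrete, here is a minimal illustrative sketch in Python. It is not Samuel's program: the toy Nim-style game (take one to three sticks; whoever takes the last stick wins), the two hand-picked features and the weight values are all invented for illustration. The structure, however - a depth-limited game-tree search that falls back on a weighted evaluation function at its horizon - is the same basic combination he applied to checkers.

# Illustrative sketch only, not Samuel's actual checkers program.
# Toy game: players alternately take 1-3 sticks; taking the last stick wins.

FEATURE_WEIGHTS = [0.8, 0.2]  # hypothetical weights; Samuel tuned his automatically via self-play

def evaluate(sticks, maximizing):
    # Heuristic score from the maximizing player's point of view: a weighted
    # sum of simple board features (positive when the side to move is better
    # placed, flipped when that side is the minimiser).
    features = [1.0 if sticks % 4 != 0 else -1.0,  # 'winning remainder' for the side to move
                sticks / 10.0]                     # normalised pile size, a small tie-breaker
    score = sum(w * f for w, f in zip(FEATURE_WEIGHTS, features))
    return score if maximizing else -score

def minimax(sticks, depth, maximizing):
    if sticks == 0:                  # the previous player took the last stick and won
        return -1000 if maximizing else 1000
    if depth == 0:                   # search horizon reached: fall back on the evaluation
        return evaluate(sticks, maximizing)
    moves = [m for m in (1, 2, 3) if m <= sticks]
    scores = [minimax(sticks - m, depth - 1, not maximizing) for m in moves]
    return max(scores) if maximizing else min(scores)

def best_move(sticks, depth=4):
    # Pick the move whose resulting position scores best for us.
    moves = [m for m in (1, 2, 3) if m <= sticks]
    return max(moves, key=lambda m: minimax(sticks - m, depth - 1, False))

print(best_move(10))  # -> 2: leave the opponent a multiple of four sticks

What made Samuel's system a learning program rather than merely a search program was that the weights themselves were adjusted automatically as the program played against copies of itself, gradually improving its own evaluation of positions.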

In real terms, however, the exaggeration of AI's abilities and the failure to deliver on over-hyped expectations and unachievable products ended in many costly failures for industry and governments [5,6]. This accelerated the decline in commercial interest and academic funding. AI became a dirty word because it had been hijacked by big business, political expediency and dubious academics merely interested in milking a buzzword. Moreover, the pioneering work of Sir Tim Berners-Lee on the World Wide Web (WWW) in the early 1990s gave the world a sexier solution to a much more relatable and achievable problem, as well as a perfect raison d'être for the millions of computers flooding every household in the developed world. Winter had once again drawn a blanket over AI - until around 2010, that is… but what changed? The answer is both nothing and everything.

PCs and the WWW had exhausted their momentum and become commodity items. Everyone had multiple computing devices, and broadband had become ubiquitous. The wow factor had abated and everyone was eager for the next big thing.

The same WWW that had helped to kill off interest in AI 20 years earlier, together with a decade of social media platforms and the appearance of the iPhone, was now generating massive amounts of data on potential customers. Everyone could generate data at any time and in any place, and organisations were happy to entice us with ever more interesting 'free' services in exchange for data on our purchasing habits, pet peeves, political affiliations, age, gender, likes and dislikes, and every single fetish and habit we probably did not even realise we had. A boom of internet-enabled devices added to this data soup. All that remained was for the data to be monetised.

Enter Big Data...

It turns out that data - Big Data, to be precise - is fundamental to the success of AI, which makes it unlike anything in technology before it. AI projects need large amounts of actionable data, and a point in time had arrived where the availability of astronomical amounts of data coincided with significant technological breakthroughs. Suddenly, 60 years of research and stalled academic progress started bearing fruit. Cloud computing quickly became commonplace. It was no longer just FAAMG* that could afford massive computing power and storage. In their urge to monetise their spare storage and computing capacity, these companies opened their resources to the public, and the inevitable price wars between Amazon and Google [7,8] and eventually Microsoft [9] suddenly made it possible for institutions with more limited funds to tap into virtually unlimited storage and processing, whenever they needed it and for as long as they needed it. AI no longer required massive upfront investment and unrealistic business cases. Furthermore, to improve the way they processed their data, huge investments and research efforts by Google, Facebook and large academic institutions like Berkeley created technologies that ended up in the hands of the open source community. Scalable databases and frameworks like Cassandra, MongoDB, MapReduce and Spark could suddenly process massive amounts of data on cheap machines, without the eye-watering licensing and hardware costs associated with major software providers.

Machines could now trawl through vast amounts of data and identify patterns that no human could previously have identified. On the internet, your search preferences were now being correlated with your purchases, habits and locations in near-real time, and algorithms could quickly and continuously determine your present and probable likes and dislikes. You search for skiing resorts on your laptop, and your Facebook feed starts presenting you with special offers for skiing equipment. Speech recognition can now transcribe speech to text almost flawlessly, and self-driving cars are pushing towards mainstream reality. New uses for AI are cropping up in all sorts of industries, from retail to agriculture. Perhaps more importantly, as more exotic forms of computing become a reality, AI progress will move in tandem. If the way graphical processing units have been applied to AI is anything to go by, imagine what quantum computing and DNA bio-storage (Catalog Technologies Inc.) will do.
________________________________
*Facebook, Amazon, Apple, Microsoft and Google, later joined by Netflix and Baidu

Bibliography

 

[1] A. Ng, “Andrew Ng: Why AI Is the New Electricity | Stanford Graduate School of Business,” 2017. https://www.gsb.stanford.edu/insights/andrew-ng-why-ai-new-electricity.
[2] J. McCarthy, “Arthur Samuel: Pioneer in Machine Learning.” http://infolab.stanford.edu/pub/voy/museum/samuel.html.
[3] S. Schuchmann, “History of the first AI Winter - Towards Data Science,” 2019. https://towardsdatascience.com/history-of-the-first-ai-winter-6f8c2186f80b.
[4] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature, vol. 323, no. 6088, pp. 533–536, 1986, doi: 10.1038/323533a0.
[5] “Alvey: the betrayal of a research programme | New Scientist.”  https://www.newscientist.com/article/mg13017682-400-alvey-the-betrayal-of-a-research-programme/.
[6] “House of Lords - AI in the UK: ready, willing and able? - Artificial Intelligence Committee.”  https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/10018.htm.
[7] B. Butler, “In Cloud Computing Price War Amazon and Google Drop Rates Again | CIO,” 2013.  https://www.cio.com/article/2386980/in-cloud-computing-price-war-amazon-and-google-drop-rates-again.html.
[8] J. Jackson, “Price war! Amazon cuts cloud costs to counter Google | Computerworld,” 2014. https://www.computerworld.com/article/2489105/price-war--amazon-cuts-cloud-costs-to-counter-google.html.
[9] S. Ranger, “Microsoft, AWS and Google may have just started the next cloud computing price war | ZDNet,” 2017.  https://www.zdnet.com/article/microsoft-aws-and-google-may-have-just-started-the-next-cloud-computing-price-war/.
[10] Bloomberg, “Artificial Intelligence: Reality vs Hype - YouTube,” 2019. [Online]. Available: https://www.youtube.com/watch?v=NUUsICq5ySk. [Accessed: 23-Jan-2020].
[11] W. Knight, “Reinforcement learning,” Technology Review, vol. 120, no. 2. Massachusetts Institute of Technology, pp. 32–35, 01-Mar-2017, doi: 10.4249/scholarpedia.1448.
[12] F. Woergoetter and B. Porr, “Reinforcement learning,” Scholarpedia, vol. 3, no. 3, p. 1448, 2008, doi: 10.4249/scholarpedia.1448.
[13] J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei, “ImageNet: A large-scale hierarchical image database,” 2009, doi: 10.1109/cvprw.2009.5206848.
[14] “DARPA Grand Challenge - Wikipedia.” https://en.wikipedia.org/wiki/DARPA_Grand_Challenge.
[15] “Tesla Model 3 Driver Claims Autopilot Saved Lives In This Accident.”  https://insideevs.com/news/392220/video-tesla-autopilot-life-saver-crash/.
[16] “Watch Tesla Model 3 On Autopilot Avoid Crash With Distracted Driver.”  https://insideevs.com/news/391075/video-autopilot-prevents-tesla-crash/.
[17] “Watch Tesla Model 3 Autopilot Swerve To Avoid Car Passing On Shoulder.” https://insideevs.com/news/372165/video-tesla-autopilot-avoid-crash/.
[18] S. O’Kane, “Tesla hit with another lawsuit over a fatal Autopilot crash - The Verge,” 2019. https://www.theverge.com/2019/8/1/20750715/tesla-autopilot-crash-lawsuit-wrongful-death.
[19] J. Barnett, “The Army working on a battlefield AI ‘teammate’ for soldiers,” 2020. https://www.fedscoop.com/army-artificial-intelligence-ai-battlefield-systems/.
[20] “Catholic Church joins IBM, Microsoft in ethical A.I. push | Fortune.” https://fortune.com/2020/02/28/ai-ethics-vatican-microsoft-ibm/.
[21] “Algorithmic Accountability Act of 2019 Bill Text.” https://www.documentcloud.org/documents/5816234-Algorithmic-Accountability-Act-of-2019-Bill-Text.html.
[22] J. Emspak, “World’s Most Powerful Particle Collider Taps AI to Expose Hack Attacks - Scientific American,” 2017. https://www.scientificamerican.com/article/worlds-most-powerful-particle-collider-taps-ai-to-expose-hack-attacks/.
[23] J. McCormick, “Health Systems Look to AI to Prevent Sepsis Deaths - WSJ,” 2020. https://www.wsj.com/articles/health-systems-look-to-ai-to-prevent-sepsis-deaths-11580207401?mod=djemAIPro.
[24] J. Council, “Bayer Looks to Emerging Technique to Overcome AI Data Challenges - WSJ,” 2020. https://www.wsj.com/articles/bayer-looks-to-emerging-technique-to-overcome-ai-data-challenges-11580121000?mod=djemAIPro.
[25] W. Hählen and S. Kapreillian, “RPA for Tax | Deloitte Global,” Deloitte Tax Solutions.  https://www2.deloitte.com/global/en/pages/tax/solutions/rpa-for-tax.html.
[26] https://www.gartner.com/doc/reprints?id=1-1YBATZQ1&ct=200210&st=sb.
