The UK’s ambition to become a global leader in artificial intelligence and robotics could be derailed because youngsters don’t have the right skills or ambition to work in the area, according to a new report.
The warning comes in a study from enterprise software firm Sage.
Growing numbers of young people are expressing an interest in technologies such as AI, machine learning, and robotics, says the company. However, for new alliances between government and industry to unlock this potential, much more needs to be done now to “inspire and educate a diverse cohort of young people”.
Carried out by YouGov, the Sage research quizzed 1,484 children aged eight to 18 about their perceptions of the technology industry – particularly of AI and robotics.
Building a head of STEAM
While 66 percent of respondents said they enjoy working with technology, only 25 percent of them said they’d like to work with AI, robotics, or other advanced systems in the future. Out of this (significant) minority, 37 percent of youngsters said the idea of working in AI sounds exciting and motivating, while 31 percent said they want to work at the cutting edge of digital research.
That is good news, but despite the optimism on display, the Sage report makes it clear that much greater diversity is essential to building a strong, inclusive AI industry.
While some young people are excited about the future of technology in general, many others are worried, and 56 percent of youngsters said they are unlikely to enter the industry when they leave school or college.
Their reasons are wide-ranging, with 29 percent saying they’d like to pursue “creative careers” instead, and 25 percent saying they don’t have the right qualifications. A further 21 percent believe that they’re not “smart enough” to work in AI or robotics.
These findings suggest that the education system is failing to make technology seem like a creative pursuit and an attainable goal, despite worldwide efforts to align science, technology, engineering, and maths (STEM) careers with the arts (STEAM). Many robotics and AI development programmes now take place within diverse, inter-disciplinary teams, bringing in non-technical experts from the worlds of psychology, design, ethics, and more.
Action is essential
Kriti Sharma, vice president of AI at Sage, said that more action is needed to ensure that youngsters have the skills to survive in the future workplace. “It’s great to see the government starting to assess the importance of AI, evidenced in the comprehensive Sector Deal announced recently, committing extra resources and funding to help grow this promising area,” she said.
“However, there’s still a huge amount of work to be done, particularly when it comes to the elitism problem in the AI industry, as our research confirms.
“It’s no longer the case that you need a Master’s degree to consider a career in emerging tech; yet 24 percent of the young people we surveyed think you do. We need to educate young people about what working in tech really means.”
The research comes as Sage launches a series of AI events for young people. Run in partnership with the charity Tech for Life, the FutureMakers Labs are designed to showcase the opportunities available in the AI, robotics, and digital sectors.
Lyndsey Britton, founder of Tech for Life, said: “It’s encouraging to see how many young people enjoy technology and believe having a career in the sector will be exciting.
“But we need to make sure that the support is there for them to get the right skills to be able to work in future jobs at the cutting edge of digital, like AI. The young people’s events we put on are increasingly popular, and there’s a real thirst from young people to learn, especially from industry experts.
“Working with organisations like Sage means we can help make sure that opportunities and learning are accessible to young people from any background, and ensure there is a future workforce with the right skills and knowledge to do jobs that probably haven’t even been invented yet.”
Internet of Business says
While some might see the quest for diversity in AI and robotics – and technology in general – as a political tick in the CSR box, nothing could be further from the truth.
Figures released at last year’s UK Robotics Week revealed that 83 percent of people working in all types of STEM careers in the UK are male, and the numbers for coding and computer science are even worse in diversity terms: just 10 percent of coders are women, with even fewer employees overall coming from ethnic minorities. This is a global problem.
In the West, programmers are overwhelmingly young, straight, white males, and this is a concern, because they are developing technologies that will be used by everyone, in closed teams that are nowhere near representative of society outside the lab. There is growing evidence that this can lead to the creation of biased systems based on limited training data – often unintentionally.
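The mechanism behind such bias is easy to demonstrate. The sketch below is a minimal, hypothetical illustration (the records and the naive “model” are invented for this example, not drawn from any real system): a model trained purely on skewed historical hiring data simply reproduces the skew in its predictions.

```python
# Toy historical hiring records: (gender, hired?) pairs.
# The data is deliberately skewed: far more men were hired in the past.
history = [("male", True)] * 80 + [("male", False)] * 20 \
        + [("female", True)] * 5 + [("female", False)] * 15

def hire_rate(gender):
    """A naive 'model' that predicts hiring purely from past frequencies."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

print(hire_rate("male"))    # 0.8
print(hire_rate("female"))  # 0.25
```

No one wrote a sexist rule here; the disparity comes entirely from the training data, which is exactly how real systems inherit bias unintentionally.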
Moreover, as Joichi Ito of MIT’s Media Lab observed last year at the World Economic Forum in Davos, some of those coders prefer the binary world of computers to the messy, emotional world of people, in which their systems will actually be used. He described his own students as “oddballs”.
With figures such as Ada Lovelace and Alan Turing (who was persecuted for his sexual orientation) so central to Britain’s computing heritage, the UK is well placed to push for greater diversity. Both now have institutes named after them, and it is good news that many of the country’s senior AI and robotics policymakers and academics are women.
Among many others, these include: Lucy Martin, head of robotics at EPSRC; Dame Wendy Hall, co-author of the UK’s AI strategy review; and Gila Sacks, director of digital and tech policy at DCMS, who runs the new Office for AI in Whitehall jointly with Dr Rannia Leontaridi, director of business growth at BEIS.
And Sage’s Sharma is also emerging as an important voice in the debate.
Driving her work at Sage is her fear that AI and the fourth industrial revolution will entrench inequality rather than provide solutions to it. Instead of emerging technologies easing problems such as gender, race, and age inequality, she believes that they risk perpetuating them by cementing biases that already exist in human society.
Speaking to Internet of Business editor Chris Middleton last year at the Rise of the Machines summit in London, Sharma described herself jokingly as “a token millennial” who had been brought into Sage to shake things up.
Sharma expanded on that view in a recent interview. “Despite the common public perception that algorithms aren’t biased like humans, in reality, they are learning racist and sexist behaviour from existing data and the bias of their creators. AI is even reinforcing human stereotypes,” she told PRI.
According to Sharma, the two key components in developing AIs that reflect social diversity are accountability and transparency. Only by understanding the full end-to-end development processes that any artificial system goes through can we check for inherent bias and keep its designers accountable.
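In practice, transparency can start with something as simple as publishing the checks applied to a system’s outputs. One widely used screen, sketched here as a hypothetical example rather than anything Sage describes, is the “four-fifths rule” from US employment-selection guidance: compare selection rates between groups and flag ratios below 0.8 for review.

```python
def disparate_impact(rate_protected: float, rate_reference: float) -> float:
    """Ratio of the protected group's selection rate to the reference
    group's; under the informal 'four-fifths rule', values below 0.8
    are commonly flagged for closer review."""
    return rate_protected / rate_reference

ratio = disparate_impact(0.25, 0.80)
print(round(ratio, 4))   # 0.3125 -- well below 0.8, so flag for review
print(ratio < 0.8)       # True
```

A check like this does not prove a system is fair, but making such audits routine and public is one concrete form of the accountability Sharma describes.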
“AI is a fascinating tool to create equality in the world,” she said. “When I’ve worked with people from diverse backgrounds, that’s where we’ve had the most impact. AI needs to be more open, less elite, with people from all kinds of backgrounds: creatives, technologists, and people who understand social policy… getting together to solve real-world problems.”
Meeting these challenges early is critical, and it is good news that Sage – via Sharma’s personal determination – is taking positive steps to push the message to the UK’s young people.
Additional reporting: Chris Middleton, Malek Murison.
- Read more: Top priest shares ‘The Ten Commandments of A.I.’ for ethical computing
- Read more: AI regulation & ethics: How to build more human-focused AI
- Read more: Women in AI & IoT: Why it’s vital to Re•Work the gender balance