Scouting America, the group once known as the Boy Scouts, has introduced two new merit badges in artificial intelligence and cybersecurity. With about a million members, the organisation wants to stay current with the interests and realities of today’s world. Its chief executive, Roger Krone, told CNN that the badges are designed around topics that young people care about.
The AI badge asks scouts to examine how AI affects daily life, learn about deepfakes, and complete a project that explains or uses AI. The cybersecurity badge, created with help from Air Force officer Michael Dunn, focuses on teaching young people how to protect themselves and their families from digital threats.
Dunn said the lessons can also guide scouts towards careers in cybersecurity, where thousands of jobs are unfilled because of a shortage of skilled workers.
How Are Governments Helping Young People Gain Digital Skills?
The UK government is also turning its attention to preparing young people for a digital world. Earlier this year, Prime Minister Keir Starmer launched a £187 million national skills programme called “TechFirst.” The plan will give 1 million secondary school students the chance to learn about technology and AI through classroom lessons, online tools and local training.
The “TechYouth” strand of the programme, worth £24 million, aims to reach students in every secondary school in the country within three years, helping them gain skills that can lead to jobs in fast-growing areas of tech. The government said it is working with companies such as NVIDIA, Google and Microsoft to deliver AI training to 7.5 million workers by 2030.
The Department for Science, Innovation and Technology said that by 2035, around 10 million workers will have AI involved in their roles, with nearly 4 million working directly in the field. The government’s research shows that the AI sector is already worth £72.3 billion and could reach £800 billion within a decade.
The Importance Of Teaching Young People Cybersecurity And AI, According To Experts
Experts have shared their views on why these skills are so important for young people.
Our Experts:
- Vonny Gamot, Head of EMEA, McAfee
- Imran Akhtar, Head Of Academy, mthree
- Bartosz Skwarczek, Founder and President of the Supervisory Board, G2A.COM
- Claudia Cohen, Associate Director, La Fosse
- Sarah Bone, Co-Founder, YEO Messaging
- Sarah Beale, CEO, AAT
- Daniel Rodriguez, Sales Director, Festo Didactic
- Nita Laad, Founder and CEO, Nexia AI
- Derek Jackson, COO, Cyber Dive
- Gary Orenstein, Chief Customer Officer, Bitwarden
- Nicholas DiCola, VP of Customers, Zero Networks
- Dr. Anmol Agarwal (she/her), Professor, George Washington University
- Daniel Myers, Ph.D, Associate Professor of Computer Science, Rollins College
- Jay Bavisi, Group President, EC-Council
- Nhon Ma, CEO, Numerade
- Matt Hasan, CEO, aiRESULTS, Inc.
- Emily Bell-Wootten, CMO and Managing Partner, Marketing Assistant
Vonny Gamot, Head of EMEA, McAfee
“With young people spending so much time online, parents are rightfully concerned with the breadth of online threats that are out there. However, as technology evolves, many feel outpaced and unsure how best to protect them. Our study showed that 98% of parents talk to their teens about online safety, but just half check in regularly and only 2% set rules – leaving a clear gap between awareness and action. To help bridge this, parents should balance their openness to talking about online safety with clear, firm boundaries. It may also be worth considering online protection tools that do a lot of the heavy lifting for you.”
Imran Akhtar, Head Of Academy, mthree
“With cyber-attacks rising and AI advancing at a pace, this isn’t just about staying competitive; it’s about protecting the foundations businesses rely on. Cybersecurity and AI are at the centre of how modern businesses operate. If we don’t equip young people with these capabilities now, we risk leaving critical systems vulnerable and falling behind on innovation.
“The most effective way to build these skills is through hands-on learning that mirrors real workplaces. When young learners solve real problems, use the same technology they’ll face on the job and get meaningful feedback, they enter the workforce ready to make an immediate impact. Equally, soft skills such as communication, teamwork, and problem-solving are just as important for turning knowledge into results.
“When this approach is combined with reskilling for people already in work, employers can retain hard-won knowledge while building new capabilities and closing critical skills gaps. Staying competitive isn’t enough any longer; we need to stay protected. Training our teams to be aware of the dangers and the steps to take to protect their software, their teams and their organisations is now more critical than ever.”
Bartosz Skwarczek, Founder and President of the Supervisory Board, G2A.COM
“Undeniably, technology is near essential to everyday living. Therefore, teaching young people about cybersecurity and AI is no longer optional, but critical. At G2A.COM, we believe the future is not human vs. AI, but human vs. human with AI – which is why education comes first. AI and cybersecurity are already defining the next generation of innovation, and both carry significant responsibility.
“Educating our young people about cyber hygiene, data privacy, and responsible AI equips them with the skills and know-how to protect themselves and others in an increasingly digital world. Doing so also helps them to understand the ethical implications of technology, especially how it can empower communities, promote inclusion, and strengthen trust – but only when used correctly.
“AI and cybersecurity are deeply interconnected. And as AI continues to drive further digital transformation, it’s also reshaping the cyberthreat landscape. Developing early awareness ensures that future innovators approach new and evolving technology with both curiosity and caution. It can also teach them the importance of embracing creativity while also understanding risk.
“Ultimately, nurturing this level of awareness at an early age is key to building a safer and more inclusive digital future. Therefore, by investing in education, mentorship, and digital literacy we can inspire a generation that uses and truly understands technology – and who can eventually create solutions which are secure, fair and designed for everyone.”
Claudia Cohen, Associate Director, La Fosse
“Educating the next generation about cybersecurity and AI is essential. As they get older, technology is going to be woven into almost every aspect of their daily lives. From their personal lives with social media, to tools they’ll use in school, and eventually, at work.
“Teaching kids early how to navigate digital spaces safely allows them to put these lessons into practice daily, building habits that protect them online. Early exposure will also help build confidence and curiosity, encouraging them to engage with technology thoughtfully and critically. It helps them understand the wider impact of technology on society, such as how algorithms shape the news they see or influence decisions. Learning about the ethical use of AI teaches them to question bias, respect privacy, and consider how technology affects others, not just themselves.
“But this isn’t just about future jobs, it’s about helping children grow up digitally literate and empowered. By equipping them with these skills early, we’re giving them the tools to thrive in both their personal and professional lives in a tech-driven world.”
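Cohen’s point about algorithms shaping what young people see can be made concrete with a toy example. The Python sketch below ranks a handful of made-up posts by a simple engagement score; the posts, weights and scoring rule are invented purely for illustration and do not reflect any real platform’s system.

```python
# Hypothetical illustration only: a toy "feed ranking" that favours posts with
# the most engagement. Real platforms use far more complex and private signals;
# the posts and weights below are invented for demonstration.

posts = [
    {"title": "Local charity bake sale",       "likes": 12,  "shares": 1,   "comments": 3},
    {"title": "Outrage-bait celebrity rumour", "likes": 940, "shares": 310, "comments": 85},
    {"title": "Science explainer video",       "likes": 150, "shares": 40,  "comments": 22},
]

def engagement_score(post: dict) -> int:
    # Weight shares and comments more heavily than likes, a common heuristic.
    return post["likes"] + 3 * post["shares"] + 2 * post["comments"]

# Sort the feed so the highest-scoring posts appear first.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f'{engagement_score(post):>5}  {post["title"]}')
```

Even in this tiny example, the most provocative post floats to the top, which is precisely the dynamic young people benefit from understanding.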
Sarah Bone, Co-Founder, YEO Messaging
“Arguably, digital protection of themselves, their data, their online friends and contacts, and their behaviour patterns, should be the number one understanding when we teach young people about cybersecurity. The online world is now a fundamental part of their daily lives, but it also exposes them to risks ranging from identity theft and grooming to manipulation and fraud. Cyber education must be about equipping young people with the awareness and critical thinking skills to recognise threats, question what they see, and take proactive steps to protect their personal information and digital footprint. Just as we teach road safety before allowing a child to cross the street, we must teach cyber safety before they fully participate in digital life.
“Artificial intelligence makes that protective foundation even more urgent. AI can now shape the information young people receive, the content they trust, and the decisions they make, yet it can also be exploited to mislead, profile, or manipulate them. Teaching them how AI works and how their data feeds algorithms helps demystify the technology and reduce vulnerability to misuse. Most importantly, it empowers them not only to protect themselves, their data, and their patterns of behaviour but also to become ethical, informed participants in the future.”
Sarah Beale, CEO, AAT
“AI and cybersecurity are no longer specialist skills; they’re essential life skills for daily life and the workplace. Yet our latest ‘Filling the Gap’ report shows that 52% of employees lack AI literacy and one in four have no cybersecurity knowledge at all. For young people entering the workforce, that gap is even wider.
“With 950,000 young people not in education, employment or training (NEETs), the UK can’t afford to leave this generation behind. Employers tell us that school leavers and graduates often have the potential, but not the work readiness workplaces demand. In the age of TikTok and short-form content, attention spans are shrinking, whilst doom-scrolling and brain-rot videos are increasing. This is making it even harder for young people to develop the critical thinking and problem-solving skills needed to thrive.
“That’s why we need to start teaching AI and cybersecurity not just as technical disciplines, but also as part of the core skillset every person, including the young, needs to succeed in a digital-first world. Apprenticeships, mentoring and structured on-the-job learning give them the opportunity to build those skills and the confidence to become the next generation of innovators and leaders. That’s the true foundation of a real future-ready workforce.”
Daniel Rodriguez, Sales Director, Festo Didactic
“Technological advances have consistently acted as force multipliers, amplifying human capabilities and accelerating industrial progress. AI is a force multiplier that should be as standard a requirement for students today as typing skills were a few decades ago.
“In the manufacturing sector, for example, AI goes far beyond ChatGPT and into critical programs we rely on every day, like predictive maintenance. While new AI-driven technologies and pedagogy present a paradigm shift for education and training, it shouldn’t be feared as a replacement for human workers. The rise of AI is no different from the role technology has played throughout history.”
Nita Laad, Founder and CEO, Nexia AI
“AI has been democratised like no other technology of the last two decades. The closest parallel is the internet revolution, when the browser and search became accessible to everyone. AI is a fantastic technology that makes knowledge and cognitive data available to all, hence it becomes an important life tool for people of all ages.
“However, along with this democratisation of intelligence, the technology poses significant risks as well (impersonation, deepfakes, financial and identity fraud, etc.) that can be very hard to detect for people who are not well educated in cybersecurity and AI foundations. Hence holistic knowledge of cyber risks, data usage and exposure, taught alongside AI, is a critical combination. While there is a lot of educational content available for free on the internet, awareness among the general non-tech population remains low.”
Derek Jackson, COO, Cyber Dive
“We don’t need to teach kids how to use AI and cybersecurity tools. They’re already better at that than most adults. We need to teach them why. Because the moment you understand how the system works, you realise how easily it can be gamed, manipulated, or weaponised. And that’s the part no one wants to talk about. It’s not about coding or passwords. It’s about power. Who has it, who controls it, and how easily it’s given away with a single click.
“Every generation has been terrified of the next one’s technology. The printing press would rot minds. The radio would ruin families. Television would destroy attention spans. The internet would end civilisation. And now, AI will apparently replace us all.
“We always act surprised when new technology is used in unexpected ways — even though that’s the only thing it’s ever done. It exposes our blind spots, mirrors our fears, and forces us to adapt. The real danger isn’t the technology itself; it’s pretending kids can’t handle it.
“We don’t need more “digital literacy.” We need curiosity — kids who look at a chatbot and ask, “Who trained it, and why should I trust it?” That’s how you teach power.”
Gary Orenstein, Chief Customer Officer, Bitwarden
“Recent Bitwarden research reveals a growing gap in cybersecurity education. While 78% of parents worry about AI-enhanced scams targeting their children, 43% have never discussed how to spot one. That disconnect reflects more than a lack of awareness. It highlights the lapse in translating concern into meaningful action.
“Children are online earlier than ever, with 42% of those ages 3 – 5 already having unintentionally shared personal information online. Nearly 80% of kids ages 3 – 12 have their own tablets, yet only 16% of families report using proper security tools such as password managers to protect credentials. These vulnerabilities compound as children grow up without understanding how to safeguard their digital identities.
“Teaching cybersecurity and AI awareness requires a multi-generational approach:
- Formal curriculum integration: “Schools should teach students to recognize AI-generated content, understand data privacy, and identify social engineering tactics as essential digital skills.”
- Parent education programmes: “Adults need hands-on guidance to bridge the awareness-to-action gap and model more secure behaviors, including the use of password management tools at home.”
- Age-appropriate instruction: “Education must evolve with children, from basic online safety concepts for early learners to advanced threat recognition for teens.”
“The next generation will increasingly continue to face threats powered by deepfakes, AI-driven phishing, and identity fraud. The most effective defense is education that evolves as fast as the technology itself, paired with secure habits and tools that protect every account, password, and identity along the way.”
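Orenstein’s point about password managers can be illustrated with a small sketch. The Python snippet below uses only the standard library’s `secrets` module to generate the kind of long, random credential a password manager would store on your behalf; it is a generic illustration under that assumption, not Bitwarden’s actual implementation.

```python
# A minimal sketch of how a password manager might generate a strong, random
# credential, using only Python's standard library. Generic illustration; not
# any specific product's implementation.

import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # e.g. 'q7T!mV2x_...' (different on every run)
```

The point of the exercise is that strong credentials are random and unique per account, something a tool can do far better than memory.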
Nicholas DiCola, VP of Customers, Zero Networks
“The best way to strengthen cybersecurity across society is through generational change and by starting with education at earlier ages. Children and students spend much of their lives online, often interacting with AI, yet many lack the basic understanding of how to stay safe in that environment. Teaching cybersecurity early can help them recognise risks, protect their data, and think critically about the technology they use.
“Something as simple as sharing a photo or video online can have serious consequences. With deepfakes and AI-driven impersonation on the rise, personal images, voices, or data can be manipulated to create fake content or commit fraud. Understanding these risks should be as fundamental as learning about strong passwords or software updates.
“Building this awareness from childhood through higher education ensures a more secure, informed society, one that’s ready to navigate the growing influence of AI and protect itself against evolving digital threats.”
Dr. Anmol Agarwal (she/her), Professor, George Washington University
“AI and cybersecurity are influencing all technology, so it is important for young people to learn about AI and cybersecurity because these technologies are impacting everyday life.
“Every time a social media application is used, every time an online shopping application is used, and every time any technology is used, AI and cybersecurity are involved. AI can both help young people and create additional security considerations for them; it determines what content and ads they see, and it can also determine whether they are targeted by deepfakes spreading disinformation (“fake news”) or by direct cybersecurity attacks.
“By learning about AI and cybersecurity, young people can be better informed and prepared to thrive in the world, accelerate their careers, and help inform their communities and keep people safe.”
Daniel Myers, Ph.D, Associate Professor of Computer Science, Rollins College
“Teaching AI is important for young people because it’s going to be part of the fundamental tech infrastructure that they’ll use for the rest of their lives, like cell phones and social media were to Gen Z or the Internet was to Millennials.
“Students need to understand both the capabilities and limitations of AI so that they can use it effectively.
“My own work focuses on how to use AI to support programming and data analytics education. We’ve learned that, used carefully, AI can supercharge high-impact education. When used thoughtfully, it allows students to pull from a range of knowledge, skills, and perspectives that would be difficult to obtain through conventional content-based teaching. I see students using AI effectively to take on big, ambitious projects with a personal creative element. We often say that AI is “like having a minor in everything”.
“We can use AI to increase access to valuable skills like programming. Students are also using it to practice simulated interviews and conversations. Overall, I see the introduction of AI as a challenge to colleges to raise our standards and push towards a more open, authentic approach to learning.
“Students also need to understand the impact of AI on their information environment. It’s easy to find examples of people who were misled by AI-generated videos or had their mental health damaged by AI models that were too quick to reinforce delusional thinking. Young people need responsible guidance and safe learning opportunities to engage with AI in developmentally appropriate ways. They should see AI as a powerful and potentially useful tool, but also something that demands human agency and critical thinking.
“A second benefit is that AI reaffirms the core value proposition of the liberal arts. To use AI well, you need to think critically and carefully, ask good questions, and bring your own taste and perspective to the problem. Those are exactly the skills developed by a traditional liberal arts education that includes a healthy diet of literature and the arts.”
Jay Bavisi, Group President, EC-Council
“Teaching young people about cybersecurity and AI is essential to preparing them for the world they are growing up in. AI is no longer a future concept. It is a present-day necessity that is powering industries, transforming how we work, and shaping how we live. As AI systems become more advanced and agentic, able to make decisions and act independently, the need for digital awareness and security grows.
“Building a culture of security from an early age ensures that young people understand how to use these tools responsibly, how to protect themselves and others, and how to lead in a digital-first world.
“Just as important is teaching them to view AI from a dual perspective. To recognise its power to solve problems while also understanding how it can be misused. This mindset sharpens critical thinking, deepens technical skills, and prepares them to innovate with responsibility. Empowering the next generation with this knowledge is one of the most important investments we can make.”
Nhon Ma, CEO, Numerade
“Artificial intelligence is transforming learning from the elementary classroom to higher education. Some may see AI as an endeavour that will eventually supplant traditional learning, but in truth, AI has the potential to enhance learning by guiding students, opening doors of knowledge, and explaining concepts in detail. Properly integrated, AI facilitates personalised learning, enabling every student to achieve mastery at their own pace and build confidence in their knowledge.
“However, the role of AI in education needs to be carefully curated, especially when it comes to younger students. The younger the child, the more important it is to make sure that AI is not used as a means to avoid original thought.
“Students in elementary school should have very limited, if any, exposure to AI-generated content, since their cognitive development relies so heavily on direct engagement in reading, writing, and problem-solving. In secondary school, AI should serve as a supporting tool and never replace critical thinking. Educators and parents need to be attentive to the use of AI, ensuring that the tool aids and does not interfere with intellectual development.”
Matt Hasan, CEO, aiRESULTS, Inc.
“Teaching young people about cybersecurity and AI isn’t just an elective anymore; it’s foundational literacy for the 21st century.”
“We’re not just preparing them for the workforce; we’re preparing them for life in an increasingly digital world where technology is seamlessly integrated into everything—from how they socialize to how they manage their money. If they don’t understand the basics of digital defense (like strong passwords and identifying scams), they become easy targets.”
“Crucially, as AI becomes the default engine for innovation, our students need to be more than just consumers of it. They need to understand how AI works, the data it uses, and the ethical implications of its power. This knowledge empowers them to be thoughtful creators and responsible users, rather than passive bystanders. In short, it’s the difference between being a vulnerable user of technology and a secure, informed participant who can help shape the future.”
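Hasan’s mention of identifying scams can be made tangible with a deliberately simple example. The Python sketch below flags common phishing “red flags” in a message using a handful of keyword rules; real detection systems are far more sophisticated, and the rules and sample message here are invented for illustration.

```python
# A hypothetical, deliberately simple "red flag" checker for messages, to show
# how basic scam-spotting rules can be expressed in code. The keywords, rules,
# and sample message are illustrative only.

import re

RED_FLAGS = {
    "urgency":         re.compile(r"\b(urgent|act now|immediately|within 24 hours)\b", re.I),
    "credential ask":  re.compile(r"\b(password|verify your account|login details)\b", re.I),
    "prize bait":      re.compile(r"\b(you have won|free gift|claim your prize)\b", re.I),
    "suspicious link": re.compile(r"https?://\S*\.(zip|xyz|top)\b", re.I),
}

def scan_message(text: str) -> list[str]:
    """Return the names of any red-flag rules the message triggers."""
    return [name for name, pattern in RED_FLAGS.items() if pattern.search(text)]

message = "URGENT: verify your account within 24 hours or lose access: http://example.xyz/login"
print(scan_message(message))  # ['urgency', 'credential ask', 'suspicious link']
```

The value for a young learner is not the code itself but the habit it encodes: pausing to check a message for urgency, credential requests and odd links before clicking.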
Emily Bell-Wootten, CMO and Managing Partner, Marketing Assistant
“Teaching young people about cybersecurity and AI is not optional – it is essential. Your data is the one thing in life you truly own: your name, social security number, personal thoughts, ideas, and online behaviour. If children are not taught how to control it safely, they can be manipulated, exploited, or abused by people or systems they don’t fully understand.
“Think of it like this: just like we tell children that they do not have to hug the creepy uncle, we need to teach them that they do not have to share personal information with strangers online. Not everyone is your friend. Not every app or chatbot is safe. Social media, AI platforms, and games can harvest personal data in ways kids, and even many adults, do not realize.
“A real example: In 2025, the FBI warned about a violent online group called “764” that used social media and gaming apps to groom teens into creating explicit or violent content. Victims were coerced into self-harm and even suicide. This is exactly why early education is critical. Children need to know what information is private, how to spot red flags, and that their digital footprint can have long-term consequences. https://www.fox32chicago.com/news/fbi-warning-764-targets-kids
“Teaching cybersecurity and AI awareness early helps children develop healthy digital boundaries, make smarter decisions online, and avoid traps that can have real-world consequences. Parents, think of it as teaching your kids self-defense in the digital world…because the threats are real, and the lessons need to start before they log on.”