
Professional AI Use: Is There A Double Standard In Who Uses It?

In the 2024–25 school year, 6 in 10 US public school teachers said they used AI at work. That’s according to the Walton Family Foundation and Gallup, which surveyed more than 2,200 teachers about how they use these tools.

Nearly a third of teachers use AI every week, mainly to prepare lessons, adjust class materials, or create worksheets. The feedback is mostly positive. Most regular users said it saves them time (almost six hours a week) and helps them do their job better. Teachers said they used the time they saved to help students more directly or simply get home earlier.

Still, schools are cautious when students use AI to help with essays or homework. In some cases, students face discipline for it, even as their teachers do much the same thing to prepare for class.

Hiring managers want real people, but they’re using bots too…

The same split is happening in recruitment. In a May 2025 survey of 600 hiring managers, almost one in five said they would reject a CV or cover letter that looked like it was written entirely by AI. Another 20% said it would make them question whether the candidate was serious.

Older hiring managers were more likely to feel this way. One in four Baby Boomers said they would not accept AI-written resumes. Millennials and Gen Z managers were less harsh, but even then, only a small number said it was fine to use AI to write a full application.

At the same time, many hiring teams are using AI themselves. While some avoid it altogether, about one in five hiring managers use AI to screen CVs. This has led job seekers to assume that machines are the first to review their applications – and that they should use every tool available to be seen.

Is There A Double Standard?

The problem is not that teachers or recruiters are using AI. It’s that the people they judge – students and job seekers – are not allowed to do the same without risking their chances.

AI is already part of how schools and companies run. But instead of being honest about that, some are punishing those trying to keep up. If AI is good enough to mark work and read CVs, then it’s hard to say it’s cheating when someone uses it to write one.

But how do experts feel about this? They have shared their views below…

Our Experts:

George Chen, President, QuickTakes

“There’s definitely a double standard at play. Professionals are applauded for using AI to save time and improve productivity, yet when students do the same, it’s often seen as cutting corners.

“The real issue isn’t AI itself, it’s how it’s being used. When students use AI to better understand material or manage their workload, just like professionals do, it should be recognised as smart, not shameful.”

Adnan Rasool, Associate Professor, University of Tennessee, Martin

“As someone who deals with this on a regular basis as the Director of my university’s centre for teaching and learning, I can say that universities are actively trying to mainstream AI use in class through changing pedagogical approaches. The reason for that has been industry feedback.

“The more we talk with the industry that will hire our graduates, the more we are told that, due to budget cuts and higher expectations, a new hire is expected to multitask and know a whole bunch of skills they did not really need a while back. That means the only way this gets done – i.e. they end up in a job and are successful – is if they can leverage AI to be productive and efficient.

“Schools across the US are actively bringing AI into the classroom, and at my institution at least, we are investing heavily in training our faculty in how to use AI in class, because our students need it. Right now, students look at AI as a magical tool to get through a class. We are focusing on teaching them how it can be part of their professional lives, from preparing notes to creating presentations to helping test out strategy.”

Zachary Cote, Thinking Nation

“AI is a tool and not all tools require the same skill level to do well. Professionals often have the training to use tools in a way that is more in line with the intent of the tool; students are still learning how. If I started a new job in construction with no experience, I wouldn’t expect to be handed the keys to a tractor. It would make more sense to be handed a shovel.

“We can think of AI in the same way. It is a tool that demands a particular type of training. We cannot underestimate that. With this in mind, it is not a double standard, but a rite of passage.”

Laurence Minsky, Professor, Columbia College Chicago

“I can’t speak for job hunters, but I can speak about students. First, I want to note that I encourage my students to use Gen AI (and more basic AI tools like spelling and grammar checkers) ethically, legally, and appropriately. And I spend time in class discussing how and when to use it – and when not to use it. They can use it for brainstorming ideas, but not for generating the final language.

“Now, why many professors don’t allow it: college is the place where students develop their critical thinking skills. Curating from what was produced in the brainstorming requires some critical thinking. (And fact-checking is required too, because of hallucinations.) But relying on Gen AI beyond what I outlined does not help students develop their critical thinking skills, so we’d not be doing our job if we allowed it in all – or even most – instances.”

Jeff Le, Managing Principal, 100 Mile Strategies

“From an education perspective, the use of AI and fluency with these tools are likely a game changer for the workforce. This is already playing out in layoffs across the private sector as companies scour for efficiencies.

“There is a balancing act between building widespread fluency with AI tools and LLMs and developing critical thinking skills, writing capabilities, and problem-solving.

“There have been studies on young people leaning heavily on AI for their assignments, which points to over-reliance. Such dependency has been seen to impact skills development.

“On the other hand, such tools are able to educate young people and potentially broaden access for a wider swath of students in the digital age.

“For jobseekers, the utilisation of AI on job applications can be counterproductive in demonstrating qualifications for roles. However, companies are also looking at ways to test or add additional validation of how candidates can utilise emerging technology resources, including case studies, presentations, and real-time sample testing.”

Anju Visen-Singh, VP of Product and Marketing, Acuity Insights

“This is an important and nuanced question.

“One key tenet of any university is helping students build deep, foundational knowledge. If students rely too heavily on AI without understanding the underlying principles – whether in writing, coding, or problem-solving – they risk becoming dependent on tools without learning the critical thinking and core competencies that let them tell when AI is, in fact, not providing the right solution.

“Additionally, educational institutions want to ensure that grades reflect a student’s individual effort and learning. Since generative AI can produce high-quality answers, it complicates traditional assessment models. Until new forms of evaluation are developed, some institutions opt for limits to prevent misuse. Also, not all students have equal access to advanced AI tools. Limiting AI levels the playing field while institutions figure out how to provide equitable access to these technologies.

“Having said this, universities must absolutely look at how they incorporate AI into curricula with urgency. Students will be expected to use AI when they join the workforce, so it’s critical to teach students how and when to use it effectively, ethically, and responsibly.

“That includes knowing when AI enhances work and when it can introduce bias, misinformation, or overreliance. Helping students understand how they can stay updated on AI tools and capabilities is also something that would help them be more effective when they join the workforce.

“Some programmes are already integrating AI literacy and application into their coursework, and I do hope and expect that we will see more of this happening in the coming months. Urgency is the need of the hour – universities can start by focusing first on ensuring that those in the upper years have the AI competencies needed to succeed in their future workplaces and then roll this out more broadly.”

Nigar Valiyeva, Director of E-Commerce, Procurement & Growth Strategy, and Verified Media Expert, Shop Cart Trading FZE

“The divide between AI use in professional environments and in education isn’t just a double standard – it’s a total strategic disconnect. In my two decades of leading global procurement, marketing, and e-commerce innovation across the Gulf and Caspian regions, AI has not only improved decision-making, it has deeply redefined the entire competitive landscape.

“We absolutely don’t use AI to replace thinking; we use it to amplify it. Businesses that refuse to leverage AI risk becoming fully obsolete. Meanwhile, educational institutions are conditioning future professionals to fear the very tool that drives modern progress.

“The question shouldn’t be ‘Should students use AI?’ It should be ‘Are we preparing them to lead in an AI-driven economy?’ Encouraging selective AI adoption creates knowledge inequality. We must shift from outdated notions of academic purity to practical readiness. The next generation of business leaders will be AI-literate, and those who aren’t will be left far behind.”

Sam Wright, Head of Operations and Partnerships, Huntr.co

“People need to realise there is no AI work. No AI resumes. There is work done well and work done poorly. There are good resumes and bad resumes.

“There needs to be a shift towards focusing on outcomes. It is up to teachers and hiring managers to create assessments that focus on outcomes that cannot easily be produced without thinking.

“There is a huge double standard right now, and it’s hurting students and candidates.

“The best way forward is for students and candidates to use tools intelligently. That will put them ahead of the pack.”

Karaoui Elhocin, CEO, Karaoui Agency

“The double standard around AI is a good example of schools clinging to how things used to be. As a digital marketing strategist and language school owner, I use AI every day to do in-depth research, make high-quality ads, and get accurate information. It doesn’t take away my skills; it makes them stronger.

“The problem isn’t AI; it’s how we’re teaching. Students shouldn’t have to work hard to find information that is already out there; instead, they should learn how to use that information in real life. We used to treat writing a single research paper as the hard part, and we were wrong to think that was the same thing as learning.

“Students today need more than just theoretical research; they need to be able to use what they learn in real life. We shouldn’t ban AI; instead, we should teach students how to use it to solve problems and apply it in the real world, not just to find answers. We shouldn’t be asking, ‘How do I find this information?’ Instead, we should be asking, ‘How do I use this information to solve real problems?’

“We’re supposed to be teaching kids how to live in a world where AI is everywhere. Banning it is like telling them not to use calculators in a digital world.”

Ryan Waite, VP of Public Affairs, Think Big

“It’s a puzzling contradiction. In professional settings, using AI is seen as efficient and forward-thinking. In academic environments, however, students are, more often than not, told to avoid it altogether or severely penalised for using it. That gap reflects a deeper issue: the lag between how education is delivered and how the world actually works.

“I have found that good educators aren’t ignoring AI. We are building it into the learning process with clear expectations and ethical boundaries. The goal should be to teach students how to engage with AI critically and responsibly. Avoiding or actively suppressing that conversation just leaves students unprepared for the tools they’ll be expected to use in the real world.

“If AI is shaping the future of work, not using it as a part of a student’s education is academic malpractice.”

Phil Brunkard, Executive Counselor, Info-Tech Research Group

“Why would we treat AI any differently from any other technology wave that has reshaped education and the workplace? Those of us who have been around long enough can recall the debate over whether calculators should be allowed, for fear that using them would hinder our ability to do mental arithmetic.

“That was nonsense. Calculators became the norm, replacing boring, repetitive calculations. Can you imagine if computers or the internet had been discouraged for fear that word processors might hinder a student’s handwriting, or that the internet might replace library research skills? That would not be progress.

“We should approach AI with the same forward-thinking mindset.

“Understandably, there are concerns that overuse of AI will erode critical human skills such as original thought and creativity, analytical and problem-solving ability, or simply having your own personal style and identity. Students and job-seekers should be encouraged not to misuse AI but to use it appropriately – as a skill-augmenting, not a skill-replacing, tool. This includes:

- Using AI for efficiency – searching, summarising, testing ideas and concepts

- Being cognisant of the responsible and ethical use of AI – bias, plagiarism, data privacy, academic or job-seeking dishonesty, the risk of hallucinations, and the need to fact-check

- Developing AI literacy skills to equip them for an increasingly AI-dominant world, where we must encourage the positive use of AI rather than trying to prevent misuse through restrictions.”

Mario Sarceno, Founder, Founders PR

“As helpful as AI is, it should augment your expertise.

“Discouraging AI in education helps students develop the original judgment they will need to prompt and evaluate AI’s capabilities effectively.

“It helps build a cognitive foundation, and when they become professionals, they can use AI as an amplifier, rather than a crutch.

“A recent MIT study (not peer-reviewed yet) claimed that AI can harm critical thinking skills, and I believe it could for individuals who don’t know how to accomplish work without it.”
