AI Tools Help (a Lot) During Interviews
A Business Insider article says we shouldn’t call it “cheating,” but using these interview AI tools seems like cheating to me.
Final Round AI will revise a resume, generate a cover letter, and run mock interviews. After an interview, it will summarize, follow up, and somehow, coming later this year, negotiate a salary. But wait, there’s more: Its Copilot product (no relation to Microsoft’s) will transcribe the interview and, in real time, provide sample answers to questions. Cofounder and CEO Michael Guan says, “It can prompt the candidates with the right thing to say at the right time. Like a magical teleprompter, using AI.”
Although intended only as a meeting transcription service, Otter.ai is being used as a “proxy interview” tool, which can mean the candidate lip-syncs while someone else answers the questions. Other tools, like Interview Buddy, provide sample responses or bullet points for the candidate, but it stops short of technical questions, which the CEO says would be “kind of crossing the line, where it’s not actually in the interest of the candidate or the employer if they’re getting information that they don’t actually know.”
Students should consider the ethical issues of using these tools. At what point does AI assistance cross the line so that students are no longer representing themselves, raising questions of integrity and authenticity? How would students answer questions from the Framework for Ethical Decision Making (Chapter 1 of Business Communication and Character, Figure 7)?
From a practical perspective, are students setting themselves up for failure in a job? Guan says using AI reflects a candidate’s “ingenuity,” and he isn’t concerned about results on the job: “If they can use AI to crush an interview, they can for sure continue using AI to become the top performer in their daily jobs.” Can they? Any job? Maybe they can perform only the kind of lower-end job that is increasingly rare because AI is already doing that work.