Video Summary and Transcription
This talk explores the use of AI in the interviewing process for software engineering. It traces the history of interviewing for software engineers, from theoretical problem-solving questions to evaluating specific programming languages and debugging skills, and asks which skills future interviews will test. The speaker questions the relevance of traditional coding challenges, emphasizes understanding the uses and limitations of AI, and highlights the value of communication skills in technical interviews.
1. Introduction to AI Interviewing
This is a talk about interviewing in the age of AI. We'll discuss the history of interviewing for software engineering and the skills needed for interviewing in the future. The speaker has experience with different types of interviews and acknowledges that it's a learning process. AI is a significant topic in the industry, and the speaker relates it to a taboo subject. They have some questions to explore further.
Thanks for coming. This is a talk about interviewing in the age of AI. This topic has come up in I think every one of the discussion rooms today, so it's probably top of mind for a lot of us, either on the hiring side of this or the interviewing side.
So the agenda is a bit of an intro here. We'll talk about AI, like an elephant in the room, a bit of history on interviewing for software engineering, and then skills, like things to think about for interviewing in the future.
This is me doing my best Combustion Man from Avatar. I don't know if you know the series, but it's a great one. And a little bit of background on me: I'm currently the VP of Engineering at Vercel. Vercel makes a deployment and developer tools platform that I hope many of you are using. We make an AI SDK to make building AI applications easier, and it's very easy to build these things on Vercel, so it's worth checking out. I have done everything from steak dinner interviews with the CEO, because that's how it was done early in my career, to puzzler, algorithm, and data structure interviews once I was thinking about going to Silicon Valley. It was a complete mind shift for me. It was very different. I have failed some of those interviews miserably. I've passed some rather unexpectedly. So just know that it's a process. You get better by doing it. But this is what we're going to talk about today.
So, AI. It's not the elephant in the room. Obviously we spent all day in the discussion rooms talking about AI in some form or another, and it's the topic of every Silicon Valley CEO today. Everybody building a company is thinking about what to do with it. That's why I picked the tapir. If you don't know the tapir, it's a very cute-looking animal, more of a cross between a horse and a rhinoceros than an elephant, and that's the thing. I picked the tapir because I think AI in the context of interviewing remains a fairly taboo, sort of dangerous topic. And so I have some questions to get a baseline. This is the one that I'm very curious about.
2. AI in Technical Interviews
Does your company allow candidates to use AI for technical interviews? The speaker discusses the use of AI in technical interviews and expresses curiosity about company policies. They also delve into the topic of embracing or resisting AI during the interview process and question the relevance of traditional coding challenges. The speaker provides a historical perspective on coding interviews, starting from the 60s and highlighting the shift towards evaluating specific programming languages and debugging skills.
Does your company currently allow candidates to use AI for technical interviews? So I'm curious. Okay. And maybe another form of this is: does your company have a policy with regard to the use of AI during technical interviews? Has this even been established? Okay, it's very few. I think that's why the tapir is a good analogy. Nobody really wants to touch this one.
So, okay, do you use AI for work on a pretty regular basis? Okay. Now, given the choice, if you're about to do a technical interview, would you want to use AI? Yeah. Most of you. I certainly would. I use it every time I write code. I would not want to change what I'm using. So again, this gets into the opinion section. So the questions we're going to touch on today: how are tools like Copilot changing the landscape for coding interviews? A pretty interesting example of this: this guy is the cofounder and CTO of a company called Hatchways, an interviewing-as-a-service company, and he wrote a very insightful article about AI during the interview process. But what's so fun about it is that he picked the clickbait title that using AI is cheating. And I think this is why it's a taboo topic, because we haven't decided: is it cheating, or is it the next best thing in the world? So this is the question. Should we embrace or resist AI during the interview process? And are the traditional, LeetCode-style coding challenges becoming obsolete now?
We're going to talk a little bit about coding interviews. A bit of history, going back to the 60s. This is the reality: if you wanted to program computers, you might learn Fortran or COBOL in school, but you didn't have a computer at home. And so the interview processes were not typically done on computers. They were pretty theoretical. They were problem-solving questions: can you break down problems? They were really about raw intellect and understanding, and that's just how it was done. If you were smart, you could get the job. Then you move into the 80s and 90s, and computers are proliferating. So specific programming languages become part of the interview loop, right? Do you understand the syntax for C++? Can you write object-oriented Java correctly? Can you explain how prototypal inheritance works, and so on? And debugging. Debugging is now a little bit more part of the process.
3. Evolution of Software Engineering Interviews
The interview process for software engineering has evolved over time, incorporating behavioral interviews and practical skills like sysadmin, TCP/IP, and web technologies. Whiteboard interviews and puzzlers became popular, testing coding syntax and problem-solving abilities.
It's faster. And so they're testing whether you can do it. But it's also the era of Hackers, the movie. And what's pretty interesting about this: you know, this guy's handle was Zero Cool. All of a sudden, Revenge of the Nerds style, programming got to be more cool and edgy in culture. And the idea that you might be using programming for malicious endeavors, you know, extract one penny from every transaction, is a thing. And so the interview process for software engineering starts to incorporate behavioral interviews and understanding your past experience. This is pretty new in the interviewing process. Then you get into the 2000s and 2010s, and it's really practical skills: can you sysadmin a machine? Do you understand some of the free software that's come out? Do you understand TCP/IP, core networking, databases, systems administration? But also HTML, CSS, and JavaScript, because products being put onto the web are delivered to customers through this technology, so how well you understand it started to become part of the interview. Famously, whiteboard interviews also come out during this time, where you're given a problem and, in real time, you're coding on a whiteboard, and they're testing your syntax. I've done a whiteboard interview where you're dinged for missing a semicolon, right? You know, we didn't have REPLs to do this very easily, but we did do this on whiteboards. Very nerve-wracking. If you've done a whiteboard interview and you haven't practiced it, it can feel very combative. But this is also the era of some of the puzzlers. This was famously Google's puzzler put up on billboards in Silicon Valley, which, if you did the math and the theory and the research behind it, led you to a URL, a website, which took you to another puzzler, which you had to solve. So basically, again, it started to proliferate the idea that multi-stage analytical problems were part of the job.
4. Modern Full Stack Software Engineering
The full stack spectrum in software engineering requires understanding multiple technologies: operating systems, web frameworks, programming languages, databases, and networking. The use of coding tools and AI assistance has changed the interview landscape. A recent article described a negative experience with a specific coding question; the author questioned the relevance and difficulty of the question, and the speaker fed it to two AI models as an experiment. The differing results showed the importance of verifying the output and the potential limitations of AI models in problem-solving tasks.
So, you know, you get into the present, right? This is the full stack spectrum. You're expected to understand operating systems, web frameworks, multiple programming languages, databases, web servers, clients, how clients interface with these backends, JavaScript, CSS, and HTML, and native applications. And if the company you work at is building consumer products, it probably has both web and native apps running in production, talking to backend services constantly, so you have to understand networking, RPCs, and how you write the code to do that. So it gets pretty intense.
And so, back to this question of coding challenges becoming obsolete. This was in February. Gaspar is one of my favorite engineers at Vercel, and I had drafted a full stack coding challenge that we ran as part of the interview loop. It's a 90-minute interview, okay? We give you a problem, you can build a thing on your machine, you can use any tools you want to, and, you know, go to town, build a thing. And Gaspar writes me, he's like, oh, I've done it in two minutes using GPT and coding tools and assistants. The world has changed. And he's right.
Recently this article came out, it was on Hacker News, from a guy who interviewed at Stack Overflow for the second time, and he was really mad. He did very well in the interviews and got to his last interview with Joel Spolsky, which I would love to do, although maybe I wouldn't love it. And he gets a question, kind of like this famous one from Facebook: take a decimal number and convert it to base negative two. And he wrote on his blog, he's like, okay, this question, which is possible and doable in 40 minutes, is really more about the problem solving process than anything, but it's the dumbest fucking question I've ever had, and I don't care what anyone else says. So he had a bad experience, he was mad. It's an off-by-one style question, right? Rounding errors, numeric precision. So I thought it would be fun to plug this into Vercel's AI SDK and see what kind of answers we get, because this is kind of a LeetCode-style question. Let's see if it just starts. I think I pushed the button. Okay, look. So this is actually two models side by side, and this is me pushing the question in. So GPT-4o and Gemini Flash. All right, this 40-minute question is about to be done in 20 seconds. Pretty amazing. But check this part out. The output from each one is different. GPT is off by an entire power here. So what does that tell you? It tells you that actually, if you don't know that it's off, you might take the code from the first one and move on to the next task.
5. AI in Interviews and Development
Interviewers now use AI models in the interview process, creating both benefits and risks. Rounding errors have always been a part of interviews and can still occur with AI. Technical proficiency with AI tools will be valued in the future. The inner loop of development is already utilizing AI, but there is a focus on bringing AI into the outer loop for bug management, planning, code design, and other aspects of software maintenance. Leveraging AI in areas like language and perception can combat unconscious bias and improve communication.
You broke it down, you picked this foundational layer, because now we're going to give longer problems, assuming you might use GPT for part of it. I do think interviewers will now design interviews to play off of this, where if you pick a model and run part of the interview through it, you'll get the wrong answer. And if you can't correctly see that it's the wrong answer, you will fail the interview. So it's both a real benefit and a total risk now.
And in fact, this is just the analysis. I plugged this into GPT to ask it, you know, okay, you got it wrong by a factor of 10, what was the problem? And it was a rounding error. So again, rounding errors have always been part of these kinds of interviews, and they will continue to be a source of error with AI.
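For reference, and purely as an editorial sketch rather than anything shown in the talk, here is one way the conversion the question asks for can be written. The remainder adjustment in the loop is exactly the kind of rounding detail that is easy to get wrong, whether you are a candidate or a model.

```typescript
// One possible reference implementation of the interview question:
// convert a decimal integer to its base -2 (negabinary) representation.
// The subtle part is the rounding: the remainder of division by -2
// can come back negative and has to be adjusted before continuing.
function toNegabinary(n: number): string {
  if (!Number.isInteger(n)) throw new Error("expected an integer");
  if (n === 0) return "0";

  let digits = "";
  while (n !== 0) {
    let remainder = n % -2;      // in JavaScript this takes the sign of n
    n = Math.trunc(n / -2);
    if (remainder < 0) {         // normalize the digit to 0 or 1
      remainder += 2;
      n += 1;
    }
    digits = remainder.toString() + digits;
  }
  return digits;
}

// e.g. toNegabinary(6) === "11010"   (16 - 8 - 2 = 6)
//      toNegabinary(-5) === "1111"   (-8 + 4 - 2 + 1 = -5)
```

Having a known-good answer like this on hand is what lets you spot, in seconds, which of the two model outputs is off.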
All right. So what are the skills of the future? What are companies going to value? What are you going to need to practice and bone up on? How should you think about this? I think technical proficiency with AI tools undoubtedly becomes part of the interview loop. What will that really look like? So I think, you know, given a set of tools, you have the inner loop. This language actually comes from a paper that came out last week from Google on how they've been instrumenting and building AI for their software engineering team, and it's been years in the making. These are the areas where they're doing testing, right? If we give suggestions in the IDE, during code reviews, during code search, they get great data feedback on whether it's working: you either accept the suggested code review comment, you complete a line of code in your IDE, or you get the results you want from the AI-assisted code search. So AI in the inner loop is pretty common already today. But the paper really starts to talk about how to get AI into this outer loop of your development process. And as a manager, this is the part that I'm pretty excited about now: bug management, planning, code design, upgrades of libraries, the things that used to be the least fun parts of software maintenance. Can we actually leverage AI in some of those? But also the softer pieces. I was thinking about this and talking about it in the discussion room earlier. You know, if English, say, isn't your first language, but you work at a company where it is the working language, should you run your RFC through Grammarly? Absolutely. This isn't even a question for me. People's perceptions and their brains make all kinds of unconscious bias decisions about other people, and you're combating that every day at work. You're combating perception to build trust and to find the right things to have impact on. So I think this will play in. If you can use a tool to make your natural language read better, you should be doing it. This is a slide that comes from Jared Palmer's talk a few days ago at the Lead Dev conference in London.
6. Building with AI and Communication Skills
Building products on top of AI requires understanding how to test and tune AI systems. The importance of LeetCode in interviews is diminishing as companies focus on candidates' ability to build great products and collaborate effectively. Communication skills are highly valued in technical interviews.
And his presentation here was really about the AI native flywheel. This is about the development of products that are built on top of AIs and how you should do it. You start with the evals here; these are the prompts, right? This is where you build your testing to tell you whether the AI tools you're using are giving you the answers you expect. Because they're not unit tests, right? These are non-deterministic systems that are feeding your product at the end of the day, by generating data and influencing your models and your strategies, which you then build into the product.
And now that product doesn't matter if you don't get distribution. Obviously there's great tooling for that in the AI space as well. And then you get feedback. Oh, okay, the model didn't work well when somebody tried a pathological use case. But it turns out 1,000 people came to your site and tried something that a model has classified as roughly that use case. Now you tune your evals, and now you keep building. Understanding how to build with AI is going to be essential. And understanding how to build these pieces of the puzzle is quite different than just, okay, I do TDD, I write code, and I validate with analytics. I think it's getting much richer. So this is worth knowing, and I think this will show up in interviews.
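To make that concrete, here is a hypothetical, minimal sketch of what an eval can look like. It is not any particular framework from the talk: just a set of prompts run through whatever generation function your product uses, scored with predicates rather than exact-match assertions, because the outputs are non-deterministic.

```typescript
// Hypothetical minimal eval harness. `generate` stands in for whatever
// model call your product actually makes.
type GenerateFn = (prompt: string) => Promise<string>;

interface EvalCase {
  prompt: string;
  // Non-deterministic output means we score with a predicate,
  // not an exact-match assertion like a unit test.
  check: (output: string) => boolean;
}

async function runEvals(generate: GenerateFn, cases: EvalCase[]): Promise<void> {
  let passed = 0;
  for (const c of cases) {
    const output = await generate(c.prompt);
    if (c.check(output)) passed += 1;
    else console.log(`failed: ${c.prompt}\n  got: ${output}`);
  }
  console.log(`${passed}/${cases.length} evals passed`);
}

// Example: the base -2 question from earlier, checked against a known answer.
const cases: EvalCase[] = [
  {
    prompt: "Convert the decimal number 6 to base -2. Reply with only the digits.",
    check: (out) => out.trim() === "11010",
  },
];
```

As feedback comes in from real usage, you add cases like these and re-run them every time you change a prompt, a model, or a strategy; that is the loop the flywheel describes.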
So does that mean LeetCode is dead? I don't think so, entirely. LeetCode has an interesting place in terms of setting a baseline for whether people should get to the second stage of an interview, perhaps. But I also think it isn't the skill that companies will really care about. They care about your ability to build great products, your intuition, your ability to collaborate with other people, et cetera. So I do think LeetCode is probably less important right now than it's ever been. So we're back to the future. If you go back to those things in the early interviews that I was talking about, a lot of them still matter, because they don't go out of style. Things like communication. Being able to talk through an interview. This is the most important part. Even in the whiteboard interviews I've done, if you end up on the other side of those interviews, what people talk about during the debrief is not the specific syntax of the code being written, but how the developer communicated their strategy, how they laid out a plan ahead of time, and then how they turned their thinking into code. That translation, that communication, remains extremely valuable.
7. AI Concepts and Education
Knowing foundations and symbolic logic is valuable. Natural language is increasingly important in human-computer interaction. Understanding prompt engineering and problem-solving skills are essential in interviews. The education scene is changing to include AI concepts and debugging/testing. Interviews focus on a candidate's growth mindset and ability to learn.
So obviously knowing foundations, symbolic logic, and how this turns into data structures and algorithms remains valuable. But importantly, I put this natural language piece here: human-computer interaction moving towards natural language as the input, to generate code or perhaps to generate systems, means that our ability to leverage natural language is ever more important. Studying how your language affects models, this is the prompt engineering piece. Don't write it off, because you're basically figuring out how to use another kind of programming language. Extremely valuable. And I think it will become part of tests. If I'm interviewing someone to come build things with models, I will test whether their ability to write prompts is good. And how do they know if they're writing good prompts? Back to that AI native flywheel.
Problem solving. Universal. The ability to take a really hard problem that scares the crap out of you, in a short interview where you're under the gun, and not just blank out. Can you break it down into pieces? Maybe you can't solve all the pieces. Maybe you can solve half of them yourself during the interview, but the other half is fairly rote and you can do it with AI. Or you can identify which pieces are the right ones to shift to an AI system. I think those will become part of the interview. One thing you're trying to figure out is someone's ability to spend their time effectively. And if you can identify the things that are best done by an AI versus by you, that's extremely useful. And you know that because you've tried.
This is also changing the education scene, right? This article talks through how a series of universities in the United States have taught programming over time. Traditionally syntax, right? Like Java. When I was in school, you only learned Java. Python reluctantly ended up in the teaching rubrics for a while. But now they're starting to teach debugging and testing, because these things all feed, again, back into that flywheel. This is how products are built, this is what companies are hiring for, and this is how you get jobs. So this is changing the curriculum, and I'm certain that ways to work with these models and with AI will become part of the curriculum in school. If you're going to school and it's not in the curriculum, go to a different school, right? It's probably a good choice.
So again, with all of these things, a lot of interviews at the end of the day come down to the debrief, right? And there people are talking about a candidate's growth mindset and their ability to learn. We don't know what AI will produce six months from now.
8. Learning and Good Design
The ability to learn and adapt is valuable in the interview process, and good design is a fundamental aspect of what interviews calibrate for.
But if you find people who love learning and who are good at learning, who given primitives can use them, this will always be incredibly valuable during the interview process. And it's a thing that I would want to look for when I'm interviewing people.
So, good design. This one is a fundamental, and I think it applies very much to the interviewing process. If you're being asked to build a thing end to end, you're being calibrated on your taste for good design. And what does that mean? Go back in time. This is the age of the unicorn. When I got to Google in 2007, they wanted designers who could code; unicorns is what the industry thought of them as. Now it's designers who can code, engineers who can do product, engineers who can design. It's a very different phase. But you need to learn these things, and they will be in the interview process, because we're looking for GOATs now instead of unicorns.
9. Scoring and Recognizing AI
One interesting fact about scoring at Google is that candidates who scored lower but had more variance in their scores performed better over their career. Recognizing the use of AI in interviews can be determined through the candidate's experience and daily use of AI.
One of the fun, maybe my funnest, facts about scoring comes from a series of data work that was done at Google. Google had a one-through-four scoring system for candidates after the interview. And one thing they found was that someone who got a one during the interview process, and maybe two fours and a three, and got hired, did better and had more impact over their career at Google than people who got three fours and a three. So I think what you find is, again, this adaptability: people who cause more variance in their scores may turn out to be better to work with.
But that's mass hiring; we do bespoke hiring at Vercel, so I am not the expert there.
Is there any way to recognize whether a person is using AI or not during an interview? They are. They are. I totally believe it. Again, we're back to this question: how many of you use AI during your job, or your life, on a day-to-day basis? The answer is yes. So if you think the interview should be some kind of laboratory, independent of real life, that is a thing you could do as an interviewer, and you can build interview loops like that. You can tell they're not using AI because you stuck them in a room and took away their computer.
10. AI in Interviews: Junior Opportunities
Locking candidates in a room without AI is not a good test of whether you want to work with them; better to assume they can reach a working prototype quickly and evaluate where they spend their time and show craft. As a junior, it is crucial to choose a company that offers opportunities for learning and growth. Losing an opportunity because simple tasks are now done with AI is a sign you were applying at the wrong company.
I just think that's not a very good test of whether you want to work with this person. You can obviously do it, but I think you're much better off assuming, like in my interview example with Gaspar, that people can get to a slightly working prototype in two minutes instead of 20. It doesn't change how I'm going to evaluate the outcome of the interview. It means, okay, cool, they got something working in two minutes. Now where do they spend their time? Where do they show me the craft? Where do they show me they know how to do something that will impress me or blow my mind? That remains true of every interview I've ever done, and I think now it's just, okay, cool, the basics are no longer as impressive. But the fundamentals, how you leverage them, and the problems you can identify remain incredibly important.
Our next question is about juniors and beginners. Would it get harder to land a first job as a junior when simple tasks are now done with AI? There's a lot of fear here. You know, I don't hire juniors to do simple tasks. So I think as a junior looking for your first job, this is the most important time for you to pick the best first thing. If this took you out of a job as a junior, you don't want to work there. You want to go somewhere where you can learn the most; you have the least to lose at this phase in your career. So again, I don't think this should equate to you losing opportunities. If you lose an opportunity for this reason, you are applying at the wrong company.
11. Understanding AI Usage and Integration
Today, it's important to understand the uses and limitations of AI. Learning how to prompt and use AI correctly comes from experience. For projects under NDA, running your own models on local systems or in a VM can be a wise approach.
Knowing what it's good for and what it's not so good for. Today's mistake is going to change, though, right? Like, that mistake won't happen, and it's a prompt failure, right? I asked the question without providing the hint that how you round is a very important part of the question. So when you get better at prompting, better at using AI, you know how to use it correctly. That comes from your experience of doing it. So if they're not teaching it in school, which I'm not sure they are yet, spending your time learning how to do it to prepare for interviews seems incredibly worthwhile. You need to try it several times, everywhere you can, and then you will be able to handle it.
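As an illustration of that point, and assuming Vercel's AI SDK (the generateText helper from the ai package and the @ai-sdk/openai provider), a sharper prompt for the earlier base negative two question might spell out the rounding rule that the vague prompt left implicit. This is a sketch, not the prompt used in the demo:

```typescript
// Sketch of "better prompting" with the AI SDK. Assumes the `ai` package
// and the `@ai-sdk/openai` provider are installed and an API key is set.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// The vague prompt from the demo left the rounding behavior implicit.
const vaguePrompt =
  "Write a function that converts a decimal number to base -2.";

// A tighter prompt pins down the part the model got wrong.
const explicitPrompt = `Write a TypeScript function that converts a decimal
integer to its base -2 (negabinary) representation. Use integer arithmetic
only: when the remainder of division by -2 is negative, add 2 to the
remainder and 1 to the quotient before continuing. Include
toNegabinary(6) === "11010" as a test case.`;

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: explicitPrompt,
});
console.log(text);
```

The difference between those two prompts is the difference between hoping the model guesses the rounding rule and telling it the rule outright, and it is the kind of thing an interviewer can reasonably probe for.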
How do you integrate LLM usage when writing and debugging code for projects under NDA? I think if you're in this situation, running your own models and working on local systems makes a lot of sense. That seems wise, but under an NDA even your own computer may not be allowed; in that case you might get a VM somewhere else. So learning how to spin up an assistant, a model you can ask questions of, in a VM from scratch seems like a valuable thing to learn.
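As a rough sketch of that idea, assuming a locally hosted model server such as Ollama running on its default port with a model already pulled, a question about NDA'd code never has to leave the machine or the VM:

```typescript
// Hedged sketch: asking a locally hosted model about NDA'd code so nothing
// leaves your machine (or your VM). Assumes an Ollama server on its default
// port with a model already pulled, e.g. `ollama pull llama3`.
async function askLocalModel(question: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      prompt: question,
      stream: false, // return one JSON object instead of a token stream
    }),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Usage: paste in the code you can't send to a hosted API.
// const answer = await askLocalModel("Why does this function deadlock?\n" + snippet);
```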
12. Changing Interview Process and Green Flags
There is a lack of leadership articles on changing the interview process, but the Hatchways blog is recommended. People's experiences will be shared through blog posts. One important green flag in an interview is the ability to communicate and ask questions before writing code. Reflective listening and problem-solving strategies are also valued. Starting with questions and validation is a positive sign.
Are there any sort of leadership articles you can recommend on this topic of changing your interview process? I think the Hatchways blog is at least decent. There's not a lot; in my research for this talk I didn't uncover a whole lot of articles about it. Again, I think it's a little taboo still and people aren't willing to spend quite as much time talking about it or publishing about it, but we'll see what happens in the industry. People's experiences, they're going to write blog posts about them, like this guy. I think we'll start to see it.
OK. So maybe you like some of the questions; I have a lot of them here. What's your biggest green flag in an interview? That's a good one. I go back to the first thing that happens: does someone take the time, take a breath, and communicate before they start writing code? Someone who asks questions to make sure they understand the question that I'm asking is key, right? The repeat thing. This is classic psychology: if you're trying to debug a thing that doesn't work with someone else, you usually do reflective listening, right? You repeat back what you think the problem is. So that's a good first step, and it's usually the first step of anyone seasoned in interviewing. They start there, then they lay out how they think they'll solve the problem, and then you agree that's a good strategy, or you tackle parts of the strategy, and then you write code. And then I'm looking at, okay, do you have the vocabulary and fluency to go from natural language to code? But starting with questions and validation is very much a green flag for me.