In this episode of the Alooba Objective Hiring podcast, Tim interviews Filip Vitek, Executive VP of AI & Data, to explore the evolving role of AI in the hiring process, emphasizing the imbalance between candidates quickly adopting AI tools like ChatGPT and companies lagging in their adoption. Filip shares his strategies for optimizing hiring practices, focusing on minimizing the ratio of disappointments from screening to technical interview. He elaborates on the importance of adaptive, challenging interview processes that better gauge candidate potential and how they can reduce the reliance on superficial data like CVs. The conversation delves into the need for hiring calibration, the complexities of managing senior hires, and the implications of AI-driven screening for candidate authenticity and fairness. The dialogue concludes with perspectives on the broader implications of these trends for maintaining a balanced and effective hiring process.
TIM: Welcome, Filip. Thank you so much for joining us.
FILIP: Thank you for having me.
TIM: It's absolutely our pleasure, and I would love to kick things off talking about everyone's favorite topic at the moment: those two little letters put together, AI. It feels like it's changing everything about the world. Probably some of it's hype, maybe some of it's reality. I personally feel like there's a huge opportunity to use AI in hiring to make things better, but what I'd love to start with is just your current impressions on where we stand. Have you started to dabble with any AI tools as a hiring manager? Have you seen candidates use any AI tools in applying for your roles?
FILIP: Yes, so that's a very good question to kick off the discussion. I actually have seen both, but when I was thinking about how AI is shaping the hiring process, I think it's not very balanced. I would say it's tilted towards candidates taking advantage of it and companies being relatively shy about how to fully embrace it. I was seeing candidates trying to use ChatGPT and similar tools right out of the gate when they became available, whereas it takes some lag for companies to take advantage of it. And to be honest, we're probably going to talk about it today, but there's this fear of whether AI brought into hiring will make the process less authentic, whether that's candidates being not fully authentic or half fake, or the jobs themselves not being so. So I think the balance is that candidates will probably be heavier users of AI than companies for quite a while.
TIM: Yeah, that's certainly what I'm hearing and seeing, and I guess it makes sense, because as a candidate you're just an individual. You can adopt a tool immediately and give it a crack, whereas for a company there's a lot of nuance involved: there's legislation, for example, around the use of AI in hiring in particular. I know there are some jurisdictions where it's very heavily policed; New York City, for example, has very clear rules around what you can and can't do in terms of automated decisions in hiring. So yeah, it makes sense that companies might lag behind a little bit. For the candidates, then, have you seen any particular use cases that they tend to be using AI for?
FILIP: So I don't know if you're familiar with the concept, but originally, before the explosion of AI, AI was believed to do the things that are 3D, so dangerous, dirty, or dull, and I think this is where it started from, especially the dull part. It's no big secret that for many candidates, the early stage of getting your CV out to as many places as possible is simply a numbers game, right? A lot of people will throw their CVs out whatever way possible, so it's not surprising that one of the early applications of AI is to make yourself available to as many opportunities as possible. There are already tools for that, even platforms for that, and there are already platforms to fight those tools and platforms, so this is the first line of interaction. But I have also seen, pretty early on, candidates trying to use AI, especially generative AI and large language models, to answer the questions in the interview, which was interesting. I haven't experienced it all the way down to a fully fake candidate, somebody being an avatar rather than a real person, but I have colleagues who have had that experience already. So this is the way it is going, but I would say it's heaviest on the early part: make yourself available to as many options as possible.
TIM: Yeah, and if I were a junior candidate, I would probably apply the same use case. I feel like we're going to get into a slightly circular problem now, because each candidate is looking at the job ads and seeing, "Oh wow, there are 500 applications already. Man, I'm going to have to apply to even more roles than I thought to have a chance of getting in, because now the conversion rate's gone from, I don't know, a one-in-a-hundred chance of getting an interview to one in 500, oh my god." So then they're all going to apply to more roles, which could then cause more people to apply to more roles, and it feels like this screening step is already broken. Especially from what I hear in the United States, and I'd love to get your thoughts on this: in that market in particular, thousands of applications is now a normal thing for a tech company to receive for a data role. How do we get out of this mess? Is this what you're seeing as well?
FILIP: Yes, it is indeed. I would say it's like a thousand applications per day.

TIM: Wow.

FILIP: In a way, if you have a very attractive role and a good brand, people want to apply to you, so you're flooded with applications immediately. But that's okay; I think this is where the game gets pretty level, because if the play of the candidates is to apply for whatever they can think of, then the game of the companies will be to screen by AI, and I think it's a fair game. It's a bit weird that we need to apply a lot of AI to get the candidates visible in the first place and then apply AI to make them less visible in the second place to correct for it. I think it's wasteful to do it that way, but that's probably how the market will iterate to something better. And I honestly think, and I would like to pass this on to the listeners and viewers on the candidate side, that if you're not using AI now, you're acting to your own detriment, because, as you rightly said, quite a few candidates already do, and that makes the numbers game even tougher and your chances of getting through even lower. The only suggestion from me would be: if you're a candidate thinking about it, don't be so short-sighted as to use it for quantity; rather, use it for quality. As I remember, when I was applying to roles whenever I was changing careers, I had different versions of my CV for whatever role, or specific flavor of a role, or mission the company wanted. So if you're a candidate using generative AI mainly to make yourself a better fit, and this is not faking the fit but making yourself a better fit, then I think you're doing it right. If you're using it just to be enlisted on every single opening or vacant position there is, you'll soon be punished for that, because the algorithms screening on the other side will say, "Okay, so this is a candidate who applies to 200 roles in our company. Not even considering it; either he or she doesn't know what they want, or it's just a spammer," right? So use it for quality, not quantity, and then I think you're better off.
TIM: Yeah, I feel like, from what I'm hearing at the moment, candidates are using it more as a blunt instrument, just a spray-and-pray approach on steroids, and I don't think that's going to get them very far. What's also interesting to think about, I think, is zooming out one layer and thinking about inbound versus outbound, because at the moment all we're talking about is those inbound applications through job portals, which are currently very noisy. Does that then mean that as a candidate you should be thinking about outbound, about your network, how you could get a foot in the door at a company? If you were now suddenly looking for a role yourself, would you apply through a LinkedIn post or a LinkedIn job, or would you be thinking of another way to get into the company?
FILIP: I'm not going to give away too much of a secret when I comment on this question by saying there's probably a certain threshold of seniority above which it doesn't make sense to look at job portals, because a lot of those positions will not be published there in the first place, for instance because it's a replacement for an existing role, which is too sensitive. Those will probably happen through some kind of headhunter. But leaving that aspect aside, because it's probably not what you were referring to in framing the question: if I am, let's say, applying for a mid-senior individual contributor role, yes, the network helps. I think it does, but ultimately, even if you know somebody, you will still have to go through the formal process of registering and getting the CV in, just for transparency reasons, so there are no side channels or any suspicion of favoritism. You still need to manage the mass-applying aspect of it well, and therefore I think the network doesn't help much at the stage we're talking about. The network can be beneficial for learning about a role that is not visible, but then again, for whatever job posting there is, AI can make sure that you are aware of it if you really want to and set it up well, so that's not a concern. And for the second part, the network is probably good for a reference or for second or third rounds, but for the first round, I don't think so; it doesn't make that much of a difference.
TIM: Yeah, I wonder if it varies a little bit depending on the company as well and how they approach hiring. I can think of a big tech company in Australia where each employee has a referral code, and they will share that with their contacts. It's a way to make sure that their contact's application, which still goes through the careers page in the same way as any other applicant's, gets highlighted: it gets automatically screened in and put in front of the talent person to then at least get a first interview. So it still goes through the same system, but you get a foot in the door. It might not help you through the rest of the process, which is quite lengthy at this company, but at least it gets you in. Another story I heard recently was from someone in England who was telling me how interesting it was that as you got more senior, the hiring process tended to get simpler, not more difficult or complicated, which is weird given the risk to the company is higher. You're paying the person a lot more; wouldn't you think naturally the hiring process would be longer? But from his perspective, he basically hadn't had a formal hiring process for more than a decade. The last two roles he'd gotten had come from coffees and people he knew and networking, which I think you can interpret in a couple of different ways as a junior candidate. One is, "Oh, that's really unfair. Why is this guy getting his foot in the door through these means? Woe is me." Or you could interpret it, I think, as, "Oh, how did he achieve that? What has he done over the 20 years in his career, in his network, in his industry, to have that level of respect that people would recommend him off their own bat for a role, and he's able to get in on his reputation alone?" So I wonder whether, for the really junior candidates listening, there's some lesson there around networking for the long run and building relationships and being competent every day, because eventually maybe it'll help you in the long run.
FILIP: Yes, I have to say I'm not very pleased about it, but I have to confirm what you're saying. Especially the moment I crossed from, let's say, director to VP or EVP level, there were indeed fewer rounds in the hiring process and also less scrutiny, and it's interesting, because I put the question to myself: do they not dare to ask the, I don't know, chief officer to code, or is it considered not really relevant that the person knows the code, so they don't run you through the same technical interview that's used a level below you? So it's an interesting discussion. I think it is a little bit of a glass ceiling, to be honest, which I'm not pleased about, and I had a hard time being on one side of the river and then seeing it from the other side. I find it not really fair, but unfortunately I can confirm that I had a similar experience.
TIM: And when you say glass ceiling, is it a glass ceiling for anyone in particular? Is there a certain segment that you feel is marginalized, or for whom is the process a little bit unfair?
FILIP: No, I think it's just a seniority thing. To put it in an easy sentence, I think that the type and intensity of scrutiny you're put through in, let's say, mid-managerial and lower roles is different than the one you're put through for, let's say, top management roles.
TIM: I wonder if part of it is just supply and demand; by the time you get to that level, there's obviously a scarce skill set, so you're worried about putting them off or deterring them: "Hey VP, do this SQL test." I can see how the vibe wouldn't quite match for someone that senior, but equally, you wouldn't want to hire someone to lead an entire function of which they're meant to be an expert without having a thorough understanding of their skills, because I've also seen that happen, and it doesn't go that well if you're expecting them to actually be involved in executing the work and have some sense of quality control. If they're not really an expert, then that's a problem. Actually, I'd like to get your thoughts on this. What I've noticed as a trend over the past five years, maybe because analytics expanded so quickly, is that big organizations, large banks in particular, would have so many different analytics teams that there wasn't actually sufficient analytics leadership, or enough people already at that level, just because the field had expanded so quickly. There weren't enough people who'd come through, so a lot of people were imported from other domains, like from marketing or from sales, and they were suddenly running analytics teams without really any experience themselves. Have you seen that? Is that a problem in your eyes?
FILIP: It is a huge problem, and it is also a problem in hiring itself, because I think one of the strong reasons, maybe not the only reason to be fair, why the higher positions are not put through the same level of technical scrutiny is not that somebody wouldn't dare to, even though that might be part of the issue, but that there's nobody to do it. What level of understanding do you need to have in order to test a VP of AI on whether he or she is skilled enough? Obviously at least the level you're testing, or close to it, in order to be a sparring partner, so they're not selling rubbish to you, right? And this led to a very strange experience for me as well, where I saw organizations struggling with this to the point that they had this dilemma: how do I deal with it? I have seen two ways they get out of it. First, they hire external experts, which is okay, though they're not always aligned and usually don't have skin in the game, so it's a bit of a risky business, but it's a much better solution than the other alternative I have seen, which was people below being asked to assess their future bosses, even two levels up, on technical skills, and that was a super weird experience. I went through that three or four times, and none of them were good. To be honest, it was really weird.
TIM: And it sounds weird to me, but what made it weird to you? What made it a not particularly pleasant experience?
FILIP: If it's someone two levels below you, do they have the same way of grasping things? If you are a director and you're being interviewed by an individual contributor, does he or she understand what it's like to have the zoomed-out view a director has on code? The level of detail and attention is different, so the person would have to be very open-minded and put themselves in my shoes: "Okay, if I were a director, what would I expect a person to know?" Very often they instead try to test you on their own scope of work, so that's the first problem. The second problem is that I had one setup where they wanted to hire somebody as a change agent for a team, and then the team members who were part of the problem were screening the candidate for that change, which obviously would not work, and it didn't. The feedback I got after that hiring experience was, "Oh, they thought you were too radical." Well, wasn't change exactly what you were trying to hire for in the first place? So whenever I see that now, and honestly I've had enough of it, I make it very clear that I step out. I have enough experience of this to know it does not work, and if the organization does not have good enough means to assess the technical skill of the candidate, that is telling in itself, right? It's not a neutral signal about the whole setup of the organization. How will I ever get mobilization for what I need to do and get my opinions through, if in order to assess my opinions you need to take a person two levels down to screen my technical opinions? That's why I said no, it does not work, so don't do that. If you ever think, as an organization, "I need somebody more technical to do the interview, but from the lower ranks," it usually does not work well. I would discourage people from trying that.
TIM: Yeah, and I wonder if there's a third way that companies do this that might be even worse, which is: don't hire an expert consultant, who as you say has some skills but doesn't have skin in the game, and don't get an IC three levels down to start quizzing you on SQL, but do neither of those, and basically it's just some leader, like a CEO or CFO, making a kind of gut call and then giving you free rein to do whatever you need, which probably has its pros and cons as well. Where I've spoken to some analytics leaders who felt this didn't work is when they ended up reporting into what they viewed as an irrelevant function, so they weren't reporting to the CEO; they were reporting to the CFO or the chief operating officer or something, where it's like, "Oh, I'm never really going to get the level of respect I need, because now analytics is sitting in the finance function, which doesn't work." Have you seen that situation as well?
FILIP: I have, and in my career I have reported as an analytical leader to the CEO, CMO, COO, CFO, and CTO, right? So I had all of those, and to be fair, I think all of them can work. I don't think any of them is a no-hope scenario from the get-go, but there are different flavors, and having already gone through all of those scenarios, I can smell, or sniff out, the different flavor of what the organization is asking for, given where the function is located and how it is treating the role. And it's not that superstitious, like, "Hey, if it's the CFO, it's going to be all about profit or whatever." No, it doesn't have to be that banal, but it gives you an idea of who will be calling the shots, what kind of support you're getting, and also what kind of transformational force you can expect. Once I was approached by a headhunter who said, "We have this great role for you," and I have to admit the company setup was interesting, the industry was interesting. But then they said, "Yeah, and this needs to be about change, like introducing data orientation into the organization, data thinking, and change management of things, which is great, and it's going to be a B-minus-two role in finance," and I said, "Let's stop here." And don't get me wrong, I told them, go back to the client and talk about it, because anybody reasonable who has been through this will say no, and the ones who say yes you should be more scared of than the ones who say no. I've seen that happen. So you need to be careful about how aligned the mission you are about to do with data is with where the role is set up in the org, but all of them can work.
TIM: Yeah, you don't want to walk into a brick wall, or a hospital pass, as we might sometimes call it in football, where you're about to be demolished by an unwinnable challenge, by the sounds of it.
FILIP: Yes.
TIM: We were talking about AI and its use in hiring, and how candidates are maybe adopting it a little bit sooner than companies. Companies are going to catch up, and those screening stages are going to end up being, if they're not already, this kind of AI-versus-AI challenge: there are all these applications, and we're going to have to screen all these applications. One problem I think we're going to run into is that the data used at that stage is fairly flawed to begin with. Even pre-AI, candidates applying with a CV and filling in an application was always quite inaccurate. I found that basing who to interview on a CV was very hit and miss, because it's someone's opinion about themselves; it's their perception, it's what they've chosen to present, what they've chosen to include, and people aren't the best judges of their own true skills. Because of that, I think there'll be lots of candidates who look amazing on paper, and then you interview them and they're just not what they say they are. So even if we have an automated way to do this, which will solve lots of problems, it'll scale it, it'll get rid of some of the bias, it'll do lots of things, fundamentally it's still not going to be that accurate. I feel like we're still going to be interviewing people where we think, "Oh my god, I wish I didn't have to do this interview," and what I'm trying to understand is what's going to be the unlock. Do we need a new data set to do that screening step? Because I feel like a CV is not ever going to be that helpful. What do you think?
FILIP: Yes, and thank you for that question. I might surprise you with the answer. When I started my hiring career, almost 20 years back, when I hired my first person, I was trying to optimize for getting the best candidate out of the data that we have. Later on, I learned that's not a good goal to optimize for, because you're going to be susceptible to all the pretenders and everybody who wants to deliver what you want to hear; they're going to get extremely good at delivering whatever your sensors are sensitive to. So over time I learned that the key metric for me in hiring is the ratio of disappointments from screening to technical interview, and I geared the hiring process to minimize that. The case you mentioned, "Hey, they looked perfect on a CV, but it was a disaster in the technical interview," is exactly what we try to rule out, and we design the system to prevent that, because if you are doing that well, then whatever number of candidates passes through screening gives you a very stable funnel down the road. If you're optimizing for what looks good at an early stage of the funnel, then you're going to have a very harsh dropout in the later stages. And I have to say that over the years of doing this, it really paid off, so this is the strategy I have optimized for: the fewest surprises between the screening and the technical interview.
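As a rough illustration of the funnel metric Filip describes, here is a minimal sketch in Python; the data structure and field names are invented for illustration and are not anything from the episode.

```python
# Hypothetical sketch of the "ratio of disappointments" metric: of the
# candidates who looked good at screening, how many turned out to be a
# letdown at the technical interview? Field names are placeholders.
from dataclasses import dataclass

@dataclass
class Candidate:
    passed_screening: bool   # looked good on the CV / screening call
    passed_technical: bool   # held up in the technical interview

def disappointment_ratio(candidates: list[Candidate]) -> float:
    """Share of screened-in candidates who failed the technical round."""
    screened_in = [c for c in candidates if c.passed_screening]
    if not screened_in:
        return 0.0
    disappointed = sum(1 for c in screened_in if not c.passed_technical)
    return disappointed / len(screened_in)

# Example: if 10 candidates pass screening and 4 flop in the technical
# round, the ratio is 0.4; the goal is to redesign screening until it shrinks.
```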
TIM: And how have you managed to do that? How did you set up your process to optimize for that?
FILIP: Yes, so I have to say that I stumbled upon it accidentally. It was not a designed process in the sense that I took a reflection on hundreds of interviews I did and then designed a new process; I was forced into it by something we're probably going to discuss more somewhere down the road, and that's hiring in different environments. I started my career in Central Europe, then I moved to Western Europe and started hiring also for the US market and the Indian market, and in hiring for these different environments, I realized that the best practices from one market are almost useless for another, for various reasons. So I was looking for a process that would be robust enough in broad terms, not in specific details, but in broad terms, to withstand any of those four environments. I'm far from claiming that it would work equally well in East Asia or South America; I've never tried those, so I'm not calling it a silver bullet. But the important ingredient of the recipe is focusing on job-specific skills being demonstrated directly in the interview. That's the ingredient I'm shooting for, which means it is a little heavier on the technical, in-person part, but it also became a very good screening criterion, or let's say sorting criterion, for the candidates. Of course, you still have a bit of a surprise around whether it's a good personal fit, because you can still have a great expert who is an asshole, but leaving that personal risk aside, this worked really well for me. I do a lot of on-the-job kinds of tasks in the technical interview to see how well the person responds, and across roles I do not have a standard set of questions; I always adapt it to what this person is expected to do. I need to have at least two or three of that person's usual tasks incorporated into the interview, the tricky part being that it covers both the actual technical part and also some soft things. So if this is a role that is going to be heavy on, let's say, C-level interaction, then I need to have a task in the interview where they actually perform that particular role play or situation, for me to see whether it would really work or they would stumble on it.
TIM: Okay, so in the technical interview itself, you're trying to extrapolate how they'd perform on the job by giving them something quite similar to what they'd actually do, but in a simplified way, where I'm assuming we're not talking about live coding; it's more live problem solving, where you're giving them a typical task and they're talking through how they'd approach it and you're going back and forth. How exactly do you do it?
FILIP: Yes, so I might surprise you again on this one. You said simplifying; it's the other way around, it's making it more complicated, and here's where a lot of people make a mistake. Let's say you have a certain bar the candidate needs to meet in terms of competencies or skills, and they try to fill the interview with tasks that measure only up to that bar, so it's almost like a checkpoint: make sure that whatever is needed up to the level where I'm happy can be done. In reality, you need an interview that measures well beyond that level, because otherwise you're not able to separate the gems from the merely good-enough candidates, right? So the interview itself is very challenging, and it has two important features. First of all, it's all scored, so it ends up with a score for every single candidate, and it's calibrated to make comparisons fair, so I can say this was a 70-point candidate versus a 50-point candidate. That has implications: it solves not just the fairness of it but also the value for money of it, because somewhere down the road you'll come to the question of what salary range you have, or where the candidate sees himself or herself in that monetary range. Very often, for many people, that's just whatever number gets struck and then defended. For me it's very easy: I'm not afraid of any suggestion from the candidate, because I can put it in contrast with the value added. If it's a superstar, I'm up for going to HR and asking to exceed the range, because it's really worth it. But I also know that if this is a 50 percent candidate, and he or she does not realize they're a 50 percent candidate and pushes for higher compensation, then they disqualify themselves just by being bad value for money, right? Sorry, we diverted a little, so going back to the original question, there are two important things. I make it intentionally more challenging so that only the great candidates can get a high score; a lot of candidates will get an average score, and they can still be hired, but it will be clear what their skills are. Just to calibrate you on that, it's not always exactly this, but let's say the score runs from zero to a hundred. The interview is so challenging that most candidates would score around 65 to 70 points, and if you get somebody with 90 points, you want to make sure you get that person, because that's a real superstar. So that's the way I approach it, and it might be a bit of a challenging experience for the candidates, but I don't get much pushback on it. A lot of candidates realize, "Maybe I'm not at the level I thought I was," and it's also useful for them. I've gotten a lot of feedback from candidates like, "Oh, thank you for showing me how much more room I have ahead, headroom for growing," right? That's usually the feedback I get from candidates who don't score that well.
TIM: That's such an interesting perspective you've given us. In a way, if you did the typical interview setup, where, as you say, it's like simplifying what they'd be doing on the job and keeping it a little bit more interview-y, a medium-quality candidate and a top-quality candidate might both end up scoring a hundred, because you've almost tapped out; you've given them this low ceiling, and then you can't distinguish between the superstar and the decent candidates. That's a really interesting perspective. So when you're thinking about going above that range, are you taking current business problems and adding layers of complexity that you don't have, to see if they could solve them? Is it introducing extra ambiguity to the problem you give them, so that you're testing how widely they can think? Are you looking for other aspects, like creativity? How do you think about the questions and going above that level? Could you unpack that a little bit?
FILIP: To be honest with you, I tried different approaches to this over time and calibrated it. I started with exactly the ambiguity that you're mentioning, but it didn't turn out to be the best strategy, because there were a lot of candidates who were good enough but were put off by it: "Why is this so unrealistic? Why am I discussing a scenario I'll never work on? This is too bad to be true, so I don't want to be tested through that; why are you doing this?" They almost had the feeling I was trying to make it difficult for the sake of it, which is not good, because if a candidate thinks of you, their potential future boss or somebody important for their new career, as someone who makes things intentionally difficult, that's not a good extra point for getting the offer you might extend accepted. So I learned over time that the way to do it is almost like unlocking the next level: if you correctly answer the trivial part, there's more to it, a follow-up question which is more challenging, and then more challenging again. Some candidates never get to the most challenging part, so they also don't unlock the chance to score a hundred points, because they never got to the point of even thinking about the most complicated things. If you have a candidate who struggles with SELECT * FROM, you will not test window functions, because there's no point in doing that, right? It's a relatively silly example; it's not as banal as I'm illustrating, but the point is there is no reason to have an interview where you start at the supreme level right in their face: "You don't know window functions? Why should we even continue the interview?" That's what I learned over time. And now I'll make an interesting bridge, because this brings me back to AI: when I saw candidates trying to use ChatGPT to answer the questions, they could probably get the first one, but it would be very difficult for them to get past the second or the third, because those need you to internalize the answer to the first one, twist it and turn it, and use it from different perspectives, and you would also have very little time to go back and type it into the LLM to get the answer. So unless you have software that transcribes whatever is being said and feeds it automatically into the LLM, which I haven't experienced from a candidate, and I'm not suggesting candidates do that, it's going to be super challenging to get over the more complicated follow-up questions with an LLM. So this is also a bit of a shield for me, so far, against candidates who are trying to use ChatGPT to make themselves look better in technical interviews.
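To make the "unlock the next level" idea concrete, here is a minimal sketch in Python; the tiers, example topics, and point weights are invented for illustration and are not Filip's actual rubric.

```python
# Hypothetical sketch of a tiered interview question: each tier is harder
# than the last, and a candidate only sees the next tier after handling the
# current one well. Tier names, example topics, and point weights are
# placeholders.
TIERS = [
    ("basic",    30),  # e.g. SELECT * FROM with a simple filter
    ("standard", 30),  # e.g. joins and aggregation on the same data
    ("advanced", 40),  # e.g. window functions on a realistic business case
]

def run_tiered_question(assess) -> int:
    """`assess(tier_name)` is the interviewer's judgment: True if the answer
    was good enough to unlock the next tier."""
    score = 0
    for tier_name, points in TIERS:
        if not assess(tier_name):
            break          # the candidate never sees the harder tiers
        score += points
    return score           # 0-100; only a candidate who clears every tier can score 100
```

One consequence of this structure, as Filip notes, is that most candidates land in the middle of the scale, and a candidate leaning on ChatGPT tends to stall once the follow-ups build on their own earlier answers.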
TIM: Yeah, I feel like LLMs, in an actual interview, are surely going to be complicating your life as an interviewee, not making it easier, because how on earth do you have a conversation with someone, interpret the question, get ChatGPT to answer it, interpret ChatGPT's answer, and put it back into normal words that sound like you and not a robot? That sounds way more difficult than just answering the question and having the skill to get through the interview. I reckon using ChatGPT in an interview would work better for the non-technical interviews, if you're trying to get through the first HR screen and you just need to sound generally good, and it's a little bit more superficial, I would argue. But once you meet an expert who's going to start drilling into your decisions on how you're going to solve problems, yeah, it's going to do more harm than good, I would have thought. But I haven't tried it, so...
FILIP: So I have a small tip for the hiring managers here. When I first started to receive candidates who were supposedly trying to use ChatGPT, I never called anybody out, "Hey, what are you doing?" But it was a little bit suspicious that the answers took a while and the person always asked for the question to be rephrased, and so on. It looked like they were trying to buy time, potentially to get advice from somebody else or an LLM. When I saw that, I took my interview questions and proactively gave them to all the LLMs I could think of to see what the answers would be, so that if somebody jumped on that prompt, I would know where the scenario goes and what ChatGPT answers these days. Of course, it's not an ultimate shield, because the LLMs keep developing and some answers will come out one way and others a different way, so it's not supreme protection, but it's good enough, and it's a small trick you can do: take your interview questions and put them into an LLM so that you see what kind of answer you would get if somebody tried to use an LLM to answer your questions.
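For anyone who wants to try this, here is a minimal sketch assuming the OpenAI Python client; the model name, questions, and output file are placeholders, and any LLM or client could be swapped in.

```python
# Sketch: run your own interview questions through an LLM and store the
# answers, so you recognise a "canned" response if a candidate recites one.
# Assumes the openai package is installed and OPENAI_API_KEY is set; the
# model name and questions below are placeholders.
import json
from openai import OpenAI

client = OpenAI()

questions = [
    "How would you design an A/B test for a pricing change?",
    "Walk me through deduplicating customer records in SQL.",
]

reference_answers = {}
for q in questions:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; try whichever models candidates might use
        messages=[{"role": "user", "content": q}],
    )
    reference_answers[q] = response.choices[0].message.content

# Keep the typical LLM answers next to your interview notes for comparison.
with open("llm_reference_answers.json", "w") as f:
    json.dump(reference_answers, f, indent=2)
```

As Filip says, the answers drift as the models evolve, so this is a rough reference to refresh occasionally rather than a definitive detector.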
TIM: I heard about a hiring manager recently doing this for one of their technical questions on a take-home test, and they found it quite unusual that ChatGPT had always answered it, I think it was a SQL problem, using incredibly complicated and specific regex, whereas any human would answer it using a much simpler function. Then they noticed that 70 percent of the candidates had answered with this very specific and obscure way of solving the problem, and he knew, "Oh okay, I get it; you've all put this through ChatGPT." Now there's a philosophical question that would be interesting to cover, which is: what should candidates be using ChatGPT or other tools for, and what should they not be using it for? You mentioned trying to use it in an intelligent way, not as a brute-force instrument, in your job search. But if ultimately they're going to use AI on the job, is it actually a problem if they use it, for example, in an interview or on a test? Because inevitably they will use it anyway. Where do you currently sit on that?
FILIP: Yes, so I'm more of a tech optimist. Leading a team that implements AI, it would be strange if I were on the other side of the river. But the way I think about it is not necessarily at the level of individual candidates. If we had a level playing field and everybody were using it, I would not have any objection. But in the current state of affairs, some candidates still try the honest way and others try to help themselves, so it's distorting the measurement, right? For that reason, I find it not suitable for what we're trying to achieve here, not because I would mind it in general, but just so you understand how I think about it. So I don't allow Googling, going to Stack Overflow, or using ChatGPT in my interviews; I explicitly say I would not appreciate that and I would not accept that. But that also means I'm not going nitty-gritty: "Ah, your semicolon doesn't go here; it needs to be somewhere else." That's not the point of it, so you need to have your tolerance level way up if you go this route; you know you're not running a syntax exam, so if he or she knows how to do it and just made a mistake calling the wrong library, who cares. Those are the allowances you can make. So I would say I'm trying to accommodate for that. I know that a good coder would use LLMs to help herself or himself code, but I don't allow it in interviews, because I think it would distort the measurement.
TIM: I would say 80 percent of all data hiring managers I've ever spoken to don't really advocate for this objective way of hiring, where you score everything and you get a number. I find that perplexing, given they're data leaders and their entire job is to measure things with data, in product, engineering, sales, and operations, thank you very much. And yet when it comes to hiring, they take off the data-driven hat and put on the "I'll just trust my gut the whole way" hat. What I'm trying to understand is where this comes from, what they are doing, and what value they think they're getting from that process, or what they're clearly missing out on. What do you think?
FILIP: It's hard for me to answer, because I have seen it happening but I never understood the concept either, and whenever I was a manager of managers, which covers most of my recent roles, being already at C level, I didn't tolerate it further down in the hiring process either. I'm not saying I want them to fully replicate what I believe is good; I give them some freedom in setting up their interview process, but I do expect a calibrated data point, ideally on a scale, for each of the candidates, both for the salary negotiations and for giving feedback. Sometimes you're in a particular situation, and I'll give an example without naming the organization: one of my past employers was running a very specific strategy. They knew they didn't have the budget to hire a top-notch candidate, to the point that they said they needed to hire at the average or even slightly below, because that's what the budget allowed for, and you have to actively go and hire those candidates. For that kind of setup, if you don't have the metric, it's super complicated to hit the sweet spot of who you should be hiring. But once you know from market data what the range for this kind of candidate is and where you stand with your budget, it also gives you feedback on what kind of score you should be looking for. And don't fool yourself: if you have the budget for a 60 percent candidate, you shouldn't even be talking in the second or third round with an 85 percent candidate. That's a waste of time; it will not end in a handshake, with one exception, which I'll mention in a while, but in general it's insanity. You're going to get a lot of no's to your offers because you're too low, too unattractive, there are too many question marks, or the task is too boring. This is why it is important for me to have this calibration, also to be fair to your hiring strategy as an organization and to the positioning of the company in the market, because if you're not using it, it's super difficult, and it's basically your own fault, to your own detriment, because you're going to be trying harder and for longer if you don't have this calibration done well.
TIM: Yeah, I wonder if there's just an element of old habits dying hard; if you've taken a very intuitive approach to hiring and you feel like it's done well for you, and maybe it has, you feel like your team's really good, you get along with the people you hire, then why change it? I wonder if part of the issue is also that the success metrics in hiring aren't super obvious. For example, are we optimizing for the person getting hired and us not firing them, i.e., they made it through probation? Or is it the person getting hired and then promoted within 18 months? What happens if they leave for a better job? Is that a success or a failure? What happens if the company goes through a round of redundancies and they get let go? Is that because of my hiring decision? So I wonder if the success metric isn't obvious, and then it takes years to play out as well, arguably, so it's not immediately in your face. If a hiring manager is currently taking this gut-feel, intuitive approach, they don't necessarily know better, and maybe their approach has been working for them, so they feel like: why change something that isn't broken in their eyes? I wonder if that's part of it.
FILIP: Yeah, just one short comment on that, and then we can move on to the next point. If you think about it, having that score also helps after the person is hired, because if this was a 95 percent superstar and he or she is not performing on the job, then I would blame the environment, because the score does not include only the technical part; the personality fit and so on also go into the final score. So if the final score was so high and the person is still struggling, then either your hiring process is not well calibrated, which you will soon find out because it will keep happening over and over again, or the environment is not supporting the person well enough. And on the question of whether it's a success if your superstar leaves for a better job: if you have a 95 percent candidate on your team, you'd better manage them that way, because you shouldn't be surprised that a 95 percent candidate is eyeing, or getting bombarded with, other options; he or she is a real superstar, right? So the score is then also helpful for actually managing the career of the person. I'm not saying that the initial score decides your destiny; that's not what I'm trying to say, but it can be used as an additional metric to help you calibrate what happens with the person after they are hired.
TIM: Filip, it has been a great conversation today. The time has flown by. I feel like we could keep going for hours. It's been so insightful hearing your perspectives and your experience, and you've straightaway given me so many little framings and angles I hadn't really thought about before, so thank you so much for sharing all your expertise with our audience today.
FILIP: Thank you; it was a great talk.