Alooba Objective Hiring

By Alooba

Episode 32
Lisa Quetting on Objective Hiring, Leveraging AI, and Balancing Diversity & Merit

Published on 12/3/2024
Host
Tim Freestone
Guest
Lisa Quetting

In this episode of the Alooba Objective Hiring podcast, Tim interviews Lisa Quetting, a data leader from Berlin.

In this episode of Alooba's Objective Hiring Show, Tim interviews Lisa, who shares her insights on recognizing and mitigating biases in hiring and on leveraging AI to refine job descriptions and screen CVs. Lisa discusses her experiences from her tenure at Babbel, highlighting the balance between quantitative measurement and human intuition in evaluating candidates. The conversation delves into maintaining fairness in hiring, fostering diversity, and the potential drawbacks of relying solely on intuition. Lisa also explores how companies can make the hiring process more data-driven and transparent, ensuring that the best candidates are chosen while building a diverse and inclusive work environment.

Transcript

TIM: Lisa, welcome to the Objective Hiring Show. Thank you so much for joining us.

LISA: Thanks for having me. I'm excited to be here and to chat with you about this topic.

TIM: It's a pleasure, and I would love to start with everyone's favorite topic right now: AI. It seems as though it's changing the world in so many different ways, so I think it would be remiss of us not to chat about it in the context of hiring. I'd love to hear your thoughts on this. Have you already started to use AI in any bits of the hiring process? And what are your thoughts in particular on AI at the screening stages of hiring?

LISA: I have to say, at my previous job at Babbel, where I worked for the last four years, we used it only to a very low extent. Some automation, yes, the basics, and AI mostly to refine job descriptions, but not to fully generate them. That would be one step further: really generating a job description based on keywords and the job title. So we had a very structured process, but it was still very manual. There are very interesting use cases, of course. I would love to try smart searches for filtering or screening CVs based on keywords, especially when you get a huge number of applications, and to see how AI performs and deals with the very different structures of CVs. I might have some concerns there; people with high potential could be filtered out if you set the keywords too strictly, but that would be one use case I would like to try. Interview summaries would also be very interesting, to limit the amount of note-taking and multitasking during interviews and to avoid issues with the readability of your own notes afterwards. And on the employer side, for companies that don't have an employer value proposition yet, you could create one with AI to convince more candidates, or of course generate interview questions and case studies with AI. These are usages I would love to try in my next position.
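
To make the keyword-screening idea above concrete, here is a minimal, purely illustrative Python sketch; it is not a tool Babbel or Alooba is described as using. Instead of hard-filtering on keywords, which is the risk Lisa raises for high-potential candidates, each CV gets a weighted score so borderline profiles are ranked for human review rather than discarded. The criteria, keywords, and weights are invented for the example.

```python
# Illustrative only: weighted keyword scoring of CV text instead of hard filters.
# Criteria, keywords, and weights below are invented for the example.
from dataclasses import dataclass

@dataclass
class Criterion:
    keywords: list[str]   # terms that count as evidence for this skill
    weight: float         # how much this criterion matters for the role

CRITERIA = {
    "python": Criterion(["python", "pandas", "numpy"], weight=3.0),
    "sql": Criterion(["sql", "postgres", "bigquery"], weight=3.0),
    "leadership": Criterion(["team lead", "managed", "mentored"], weight=2.0),
}

def score_cv(cv_text: str) -> float:
    """Return a 0-1 score: weighted share of criteria with at least one keyword hit."""
    text = cv_text.lower()
    total = sum(c.weight for c in CRITERIA.values())
    hit = sum(c.weight for c in CRITERIA.values()
              if any(kw in text for kw in c.keywords))
    return hit / total

# Rank instead of filter, so borderline CVs still get human review.
cvs = {
    "cand_a": "Team lead for four analysts, heavy SQL and BigQuery reporting.",
    "cand_b": "Python and pandas pipelines, some dashboarding experience.",
}
ranked = sorted(cvs, key=lambda name: score_cv(cvs[name]), reverse=True)
print(ranked)  # ['cand_a', 'cand_b'] under these invented weights
```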

TIM: Yeah, the sky is almost the limit, isn't it? Because I feel like hiring in general is nowhere near as good as it could be. To put it politely, there are a lot of problems, and you just touched on several of them, including the interview itself. You're sitting there in an interview with a candidate: you're trying to really focus on what they're saying, trying to dig in and think of the next question, you're trying to write notes, you're trying to score them. It's a very complicated thing to do. But now there are tools that transcribe in real time what someone's saying, and I spoke to a company recently that had spun up their own skunkworks, basically their own implementation of Claude, which was taking the transcripts and also grading the candidates against the questions. So yeah, there's just so much upside, so much potential, I think, in the space.
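
For readers curious what the transcript-grading setup Tim mentions might look like, here is a hedged sketch; the company's actual implementation isn't described in the episode, `call_llm` is a stand-in for whatever model API is used (Claude, in Tim's example), and the rubric items are made up.

```python
# Hypothetical sketch of grading an interview answer from a transcript with an LLM.
# `call_llm` is assumed to take a prompt string and return the model's text reply.
import json

RUBRIC = {
    "explains_tradeoffs": "Did the candidate weigh alternatives before deciding?",
    "communicates_clearly": "Was the answer structured and easy to follow?",
}

def grade_answer(question: str, transcript_excerpt: str, call_llm) -> dict:
    prompt = (
        "You are grading one interview answer.\n"
        f"Question: {question}\n"
        f"Candidate's answer (from the transcript): {transcript_excerpt}\n"
        "Score each rubric item from 1 to 5 with a one-sentence reason. "
        f"Rubric: {json.dumps(RUBRIC)}\n"
        'Respond only with JSON like {"item": {"score": 3, "reason": "..."}}.'
    )
    # Keep a human review step before any of this is shared as candidate feedback.
    return json.loads(call_llm(prompt))
```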

LISA: Yeah, I would love to explore that more and more in my next role, even though with the summaries I would be a little bit afraid of whether the tool would really do it well and not omit, I don't know, key elements. Probably not, but that would be my concern a little bit: can I really trust it that much not to leave out anything important? Maybe I would still take specific notes for the questions that are really important to me, on how the candidate answers them, but yeah, with a sense of proportion, it would definitely make sense to use it that way.

TIM: Yeah, and also I feel like candidates' number one complaint is pretty much a lack of feedback, poor feedback, or no feedback. I'm certain part of the barrier to providing feedback is that it's an effort for an interviewer to record it and share it with someone, and for talent to then maybe also sanitize it. Often you can't share verbatim what you've said because it might be slightly too harsh or whatever, and so that process of recording the data, sanitizing it, and sharing it surely could be done at some level by AI, in which case, even if it's imperfect, it's almost better than nothing, you could argue.

LISA: Yeah, it could be a good starting point, and then, of course, you have to review it, check it, or, I don't know, let it write it again to make it more polite or whatever, but yeah, having help with that would be really advantageous.

TIM: Yeah, I wonder if it's the sort of thing where we might try it for a while, refine it a little bit, and then soon we'll be happy with it, it'll be good enough, and we can just let it get on with the job. Will it occasionally miss something? Maybe.

LISA: agree

TIM: What about at the screening step in particular? So you mentioned taking those unstructured data sets, the CVs, and then maybe doing like a keyword scan or something like that. What else could be done with that unstructured data in a way that we haven't been able to do to date without AI?

LISA: You mean using the CVs in different ways? You could also create, I don't know, candidate personas, for example, or create metadata about them: what are the most-mentioned skills or experiences, to then analyze the job market and the offering better, and use that again to create job specifications or requirements for the job. I don't know; I haven't thought about that yet, I have to say. Maybe you have some different types of use cases in mind.

TIM: I feel like maybe a step beyond, let's say, keyword scanning would be some kind of scoring if you had a list of criteria and then you're trying to stack rank the CVs or something like that.

LISA: Sure. Yeah, I would even do that without AI. That's something we did to make the process a bit more data-driven, or to use data at all: having key criteria or skills for the job and then, after an interview, letting everyone involved in the interview rate the candidates on these main criteria to calculate a score, and having that as a better way to compare. You should limit how much you compare candidates against each other, but in this very objective way it definitely made sense, to make it less subjective. I let interviewers do that or did it myself, but it sure would be interesting to also have AI try to do this based on the CV already.
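
As one concrete way to picture the scorecard Lisa describes (shared criteria, ratings from everyone involved in the interview, one combined number per candidate), here is a minimal sketch; the criteria, weights, and 1-to-5 scale are assumptions for the example rather than Babbel's actual rubric.

```python
# Illustrative scorecard: each interviewer rates the same criteria (1-5),
# and a weighted average gives one comparable number per candidate.
CRITERIA_WEIGHTS = {
    "sql": 0.3,
    "stakeholder_communication": 0.3,
    "problem_solving": 0.25,
    "team_fit": 0.15,
}

def candidate_score(ratings_by_interviewer: dict[str, dict[str, int]]) -> float:
    """ratings_by_interviewer maps interviewer -> {criterion: rating 1-5}."""
    total_weight = sum(CRITERIA_WEIGHTS.values())
    per_interviewer = [
        sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items()) / total_weight
        for ratings in ratings_by_interviewer.values()
    ]
    return sum(per_interviewer) / len(per_interviewer)

print(candidate_score({
    "hiring_manager": {"sql": 4, "stakeholder_communication": 5,
                       "problem_solving": 4, "team_fit": 4},
    "peer": {"sql": 3, "stakeholder_communication": 4,
             "problem_solving": 4, "team_fit": 5},
}))  # ~4.1 on the 1-5 scale
```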

TIM: I'm interested in something you said there: that you would like to limit how much you compare candidates. Do you feel like we're too focused on comparing candidates, or something like that?

LISA: Yeah, it was often one piece of advice that HR gives to the expert teams: judge or evaluate each candidate individually; don't compare them so much against each other, because either you like this candidate and think they're a good fit or you don't, and that shouldn't depend on some other specific candidate. So really look at the individual and don't think about rankings or favorites too much, but give every individual more of a chance on their own.

TIM: And was it the case then that you'd look at an individual by themselves and objectively score them, but then after the fact you'd have maybe four or five candidates, and at that point you could do a comparison?

LISA: Yeah, I think it's hard to really not compare candidates with each other, because obviously when you have to make a decision, you have several candidates in mind, but you try to focus more on the person, the individual, one by one. Then, sure, if you have that score in the end, you will look at that score for everyone and say, Okay, is this what we want to base the decision on primarily, or what else is there as decision criteria?

TIM: I'm interested that you took quite an objective, reasonably structured approach, which I would have thought is the natural way any data person would look at the problem: I'm trying to hire someone; I'm going to come up with some metrics; I'm going to measure things as best as I can and then rank them. A very data way of thinking. But I reckon 80 percent of all the data leaders I've ever spoken to would not really do it that way. They take a very intuitive, gut-feel, shoot-from-the-hip approach and say, Oh, I'm going to do a pub test. I'm going to have a coffee with them. I'll do a vibe check. They sit at the very opposite end of the spectrum. What do you feel they're missing by not doing it in a more objective way, like you've done it?

LISA: Yeah, I think, of course, having a personal impression of the candidate is valuable, and there should be a human component, but if you want to make the process more objective, right, not to be biased, not to make the decision based on one person's very subjective judgment only, then I think there are various factors you can work on. First, when it comes to biases, training people to become aware of their biases: that's somewhere you can start. Babbel did a very good job in this, and that really helped not only for hiring but also for collaborating in a very diverse team. That's one step. Then, if you want to make it more objective, you want to have a more comparable process, which to me also sounds like a no-brainer, but probably not all companies do it: having clear criteria for the role, calculating the score, and having some best-practice questions depending on the role. You don't always have to ask the same questions, but maybe some key questions that you use if it makes sense in the interview, and then you can also use that to evaluate in a more objective way. What also helps is obviously involving more people. Not only should the hiring manager make that decision, even if HR or talent acquisition is involved; you should also have peers, potential peers of that person, or direct reports if it's about a line manager, or stakeholders involved in the process, so you don't only have the subjective impression of one or two people. And then, of course, if you want to take it one step further and really make it more data-driven, we can talk about all types of tests as well: assessment centers, skill tests, live coding, take-home challenges, trial days, you name it. I guess we'll dig deeper there still, but why do a lot of people not do this? Maybe because they trust their own judgment so much, maybe because they're not aware of their biases, or because it's more effort to design a test or a take-home challenge. But now there are great tools out there, so it should get easier, so I would think and hope that more people will use these approaches as well. What do you think?

TIM: Yeah, I think the bias angle is true for sure, and ironically, if I'm not mistaken, a lack of awareness of your own biases is itself one of the biases, so it's very circular. Overconfidence bias is another one, and I feel like that maybe springs into it. I can remember one particular guy I spoke to years ago who was the chief data officer of a big tech company in Asia. I got a few minutes onto a call with him, and he made a statement that I just couldn't believe. I asked him, What are your hiring challenges? When you hire people, what are your challenges? He said, I have no challenges.

LISA: That's definitely overconfident.

TIM: He said something like, I've built 20 teams in my career from scratch and hired hundreds of people, and he wasn't that old, so it's actually impossible that he'd ever hired 20 different teams, because he couldn't have unless he was hiring a new team every four months, which is not a good sign for him, because those teams weren't lasting very long. I just didn't know how to combat that level of overconfidence in someone. So yeah, I feel like bias is definitely one of them, but maybe there's also a deeper layer. We're talking about objective hiring and removing bias, but why does that matter? What's the so-what for this person? They remove the bias, and so what? Why does it matter for them?

LISA: Good question. First of all, you can say that fairness is a value for you. Do you want to be fair to candidates? Okay, maybe not, but then even from a selfish perspective: if you recognize that you have biases, then you might not make the best decision for your team or for the company. So even from that point of view it would make sense to become aware of them. And from a personal development perspective, don't we all want to grow personally, to develop, to become more self-reflective, and to improve our awareness of what's going on in our communication and in many other ways? From several perspectives, for me it's a no-brainer, but it's a good point to take that step back and ask why it actually matters. I think it matters from a lot of perspectives.

TIM: Yeah, and I agree completely. I feel like fairness is almost the other side of the coin of accuracy, in the sense that I would argue a fair hiring process is one in which the best candidate is the one who gets hired, irrespective of who they are, and in a way that's also the most accurate. If you're a hiring manager, surely most of the time you just want the best person in the job. Maybe in some scenarios you can imagine, in a big bureaucracy, that hiring someone underneath you who is better than you might not be to your advantage in the long run, but assuming we're thinking about most hiring managers, you just want to have the best team, surely. Making sure you have the best person out of your hundreds of candidates is going to be the main objective of the whole process, I would have thought.

LISA: But if you want to encourage, for example, creativity and different perspectives in your team, then you should also care about diversity, because otherwise, if you have a lot of great people but they're all alike, that will not necessarily boost creativity and out-of-the-box thinking in your team. So having a more diverse team can also be very beneficial, and then you have to balance that out as well. And what does the best mean? The most experience, or the highest potential? Maybe also giving someone a try who doesn't have the most experience but wants to change careers and is really highly motivated and more passionate about the topic than someone who already has more experience. So I think it's difficult to define what the best candidate is in some cases.

TIM: I would say then the best is however you've defined it: if you've set up the criteria and you've chosen, let's say, motivation and a bunch of skills, then hopefully at the end of the day the candidate who scores the best is the best by definition, in a sense. But you've touched on something I was going to ask you about anyway, which is whether there is a trade-off or some kind of balance between diversity and merit. Could you have a hiring process where you had a hundred people, you scored them as well as you could across the criteria, and you came out with just numbers, so this candidate scored 80 and this candidate scored 70? Is there a scenario where you would deliberately not choose the highest-scoring candidate for some kind of diversity-based reason, do you think?

LISA: I think I would never use only this score to make the decision, but also a personal impression of, or personal connection to, the candidates. But talking about diversity, it's easier to provide more opportunities for lateral entrants or people with less experience if the company commits to diversity, if you clearly make this a company value, which has become more common, I think, over the last five years. For some companies maybe only on paper, but Babbel takes this really seriously. I think then it's easier to provide these opportunities, and that's what we should do as leaders in general: create opportunities for people to grow and develop and, in some cases, make their entry into a new career. I think that can really pay off, because these people can then be very loyal and motivated, maybe because they are thankful for the opportunity they were given, knowing that they still have to work on some skills or experience. So that can be one selfish benefit for the company again. And there isn't always a right or wrong in doing a good job, or exactly which skills or experiences you need to have to do this job well, so coming back to that point of diversity bringing in different perspectives and different ways of doing things, there's also an economic reason to foster diversity. And maybe this matters even more for me on a personal note: as a woman in data, I'm still a minority. That wasn't the case at Babbel, but in general it still is, and in other companies I've seen quite some white male leaders hiring only their likes, which didn't give a lot of other people with other backgrounds or character traits the chance to perform well. So I think it's maybe easier to value different approaches and diversity if you are a lateral entrant or part of a minority yourself. We have to balance it out, and I think we can. It's not easy, but we should try.

TIM: Can I throw a devil's advocate angle at you? Let's get to a concrete example. Imagine it comes down to two candidates: one who scored, let's say, 80 out of 100 on whatever criteria have been set up, and another who scored, let's say, 60. The candidate who scored 60 is a quote-unquote diverse candidate, however that's defined by whatever subgroup you're looking to boost at that particular time. Is it fair to the person who scored 80, who is objectively the best candidate, who did the best in the interview process, who, on paper, based on how you've set it up, is the best? Is it fair that they don't get the role but the person who scored 60 does?

LISA: Often it's not about whether one or the other person gets the role, but whether both move to the next stage. That's what it's sometimes more about, and then I would say, okay, let's have both move to the next stage if that's a take-home challenge, for example. Then you could go one step further: if it's a decision between the two of them after all the steps have been taken, it depends. If your company hasn't set clear goals like we want to increase diversity, we want a higher share of female engineers, for example, if that's not the case, then you could say it's unfair. But anyway, in most companies it will never be that score only; there will always be a personal impression on top of that, or someone might be more eager to choose someone who is more like them or has more in common with them, because they can relate more to that person, even if the score isn't the best one. It's hard to answer. We don't want to turn things around and say, okay, let's give an advantage to minorities only because they're part of a minority. No, of course skills and abilities, or at least the potential to do a great job, are key as well. But if you see that in someone, and there I'm also a fan of the saying hire for potential, train for skills, I have, in some cases, made choices where I saw that potential in someone and was, in the end, happy that I gave them the opportunity over someone who looked more like the standard candidate you would expect.

TIM: I appreciate that. What I'm interested in is this: you've described the case where there's a quantitative measurement of the candidate, and then there's this human element, and I've heard this kind of thing described by other people as the quantitative bit versus the gut feel or the intuition, or sometimes slightly more subjective things like cultural fit that could also fall into that category. What I feel is, wouldn't it still be better if even that final piece, that human piece, ended up as some kind of metric? Again, as a data person, I just feel like it could be improved that way. So imagine you had the criteria: isn't the human element just another criterion? Couldn't the remaining 30 percent of the evaluation simply be how much I like this candidate, how much I feel they get along with me, how much I feel they're going to fit into the team, whatever it is? Wouldn't it be better to at least make that an explicit bit of the criteria, so it's on paper like everything else?

LISA: Yeah, I think making it explicit and more specific, labeling it, saying this is cultural fit, or the impression I have of team fit, or a really interpersonal connection, how I think their personality would fit into the team, it makes sense to have that as part of the list of criteria as well, but it's harder to quantify, of course. Then you have to accept this: if it's a 1 or a 5, based on what? How do you calculate that? That brings us to the limits of making everything quantifiable, I think. But having it as an explicit criterion in the score would make sense and would already make it more structured than just calling it gut feeling or intuition, even though I think we should make use of experience, knowledge of human nature, and intuition to quite some extent in the hiring process as well.

TIM: And I wonder whether, if we called it out explicitly and got people to score it, even if it's completely subjective, you get out of an interview and say how much did you like this person, or how much do you think they are going to be the right person for this job, and people score it: one interviewer gives it a four, another person gives it a nine. Suddenly, at least, those metrics create a level of debate. Why did you give it a four? Why did you give it a nine? And then you can start to unpack it. Also, if I think back to some of the people I've hired, maybe not so much in my current company but previously, I definitely hired people I liked. I hired people who played football and had the same interests as me. Maybe that's not wrong; maybe that's fine if you just want to have people who you're going to get along with. Maybe that's a valuable addition to the team, but I was never made to say why I liked this person. It's because they like the same sport as me, because we grew up in the same area, because we had the same sense of humor or whatever. Is that a bias, or is that the right thing to do? I don't really know.
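
One way to read Tim's four-versus-nine example: once the subjective impression is recorded as a number per interviewer, big gaps can be surfaced automatically so the "why a four, why a nine" conversation actually happens. The sketch below is hypothetical; the interviewer names, criteria, and disagreement threshold are all invented.

```python
# Hypothetical sketch: flag criteria where interviewers' scores diverge a lot,
# so the panel debates them instead of silently averaging them away.
from statistics import mean

def flag_disagreements(scores: dict[str, dict[str, int]], threshold: int = 3):
    """scores maps criterion -> {interviewer: score}; yields criteria worth debating."""
    for criterion, by_interviewer in scores.items():
        values = list(by_interviewer.values())
        if max(values) - min(values) >= threshold:
            yield criterion, by_interviewer, round(mean(values), 1)

scores = {
    "personal_connection": {"interviewer_a": 4, "interviewer_b": 9},
    "technical_depth": {"interviewer_a": 7, "interviewer_b": 8},
}
for criterion, detail, avg in flag_disagreements(scores):
    print(f"Discuss '{criterion}': {detail} (average {avg})")
```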

LISA: Yeah, you could call it a likeability bias, but I think making it transparent and talking about it with others involved in the hiring process is already a good strategy. It's very valuable to be able to say, yeah, you like this person, you think they would fit into the team, but then, okay, based on what? If it's only because they have the same interests as you, then maybe someone else will challenge that. But yeah, it's good to start talking about it, make it transparent, and make this personal connection or team fit part of the criteria as well.

TIM: I think that could help. I can think of some engineers we've interviewed over the years; one guy in particular, obviously I won't mention his name, whom we interviewed and really liked. He was so affable and friendly; he had a big smile; he was so energetic; he liked football too, so he had some good banter immediately about football. But then on all the objective criteria in our hiring process, for his assessment, for his take-home test, for his actual technical ability, he was just way lower than other candidates, and so we kept trying to find a way to hire him by giving him another chance, but he just was not at the same level as the other candidates. If we weren't measuring those things, I feel like we probably would have hired him, because we were overcome by his friendliness and his smile and his demeanor and how much energy he had, which maybe is valuable, but it's probably not as valuable as his ability to actually do the job he's going to be doing 95 percent of the time.

LISA: Yeah, exactly. You can't ignore the hard skills, and you could call that an example of the halo effect, another bias: from him being a nice, friendly person, you would assume he's also very capable or experienced or whatever, but yeah.

TIM: Yeah,

LISA: Of course.

TIM: Not necessarily, no, indeed. What about thinking more broadly over the last year or two? What have been your biggest pain points in hiring? What are the biggest challenges at the moment in hiring great data talent?

LISA: Several. First of all, getting a huge number of CVs or applications for a job that we posted, really from all over the world, wanting to attract international talent, so really more than a thousand CVs, or I don't remember the precise numbers, and then having to filter them and decide whom to invite to the first interview, which was done by HR. Also, in the later stages, the expert interviews or deep-dive interviews were very time-consuming, so they were a bottleneck in the process as well, because they took 45 minutes to an hour, two people were involved, you had to find slots in everyone's calendar, and you had to prepare for the interview and write the scorecards afterwards. So that was a bottleneck, and the process took longer than we wanted, which is also related to the challenge of ensuring a great candidate experience, because candidates want a clear process, a fast process, and reliability in the timelines you give them. That was a challenge for HR as well: to stick to the deadlines they communicated to the candidates. Another challenge was really judging candidates' ability to deal with the problems they would have to work on in the everyday job in the end. Even when combining the CV, an interview, or even a case study, are they really able to deal with what they will face once they are hired and on the job? Now you could say maybe we needed to adjust our case study, but still, this was a challenge. And for some specific profiles it was really challenging to hire a great fit. To make this more specific, we wanted someone who had experience in data science and as a team lead, and obviously in-house salaries are lower than at consulting companies or agencies, so finding someone with these specific requirements who was also a great team fit, et cetera, took over 12 months. But in this case I'm really happy that we had the patience to do so and finally found the best match for this position we could have found, so shout out to Anna if she hears this. Still happy that we hired her. So yeah, really quite some challenges.

TIM: And is it fair to say then that the process was set up to really minimize the chance of a bad hire?

LISA: In this case, yes, we really wanted to be really sure. We also had two candidates who we made an offer to, and they dropped out after we made them the offer, but yeah, you never know. Was it really the salary? I don't even think so. Maybe they were also in parallel interviewing for another process. You never know, but yeah, maybe we were a bit unlucky, but we also wanted to make sure to really have a great fit in the end. Yeah.

TIM: I hear that saying often: Hire slow, fire fast, but that's probably not a great representation of reality because nobody likes firing anyone; that's an awful experience for everybody.

LISA: Yeah, and especially in Germany we have stricter labor law. Maybe it's easier to say that in the US or maybe Australia, I don't know in detail, but in Germany employees have more rights, so firing fast isn't really possible. It's not that there are no layoffs or that people never get fired, and you can have mutual agreements, of course, but it's harder to fire people in general, so you maybe want to be even more careful to make a good choice, or really use the probation period to find out whether it's a match from both sides.

TIM: Six months would be normal, I think. It depends on the role, but yeah, six months would be normal. And in that six months, as an employer, can you say after a week, Oh, you're not a fit; you're done? Or is it still difficult to remove them within the six months?

LISA: No, during the six months it's easy: you can let someone go with two weeks' notice, theoretically even without giving a reason. So there it's easy, but afterwards, if you have an unlimited contract, people have their rights.

TIM: Yes.

LISA: which is great but can be challenging for employers, of course.

TIM: Yeah, and even if you know legally you can let the person go, that is such an awful experience for everyone. I would think it'd be a pretty rare hiring manager who likes firing people. You know, that would take a bit of a psychopath; most people would find that awful, and it's terrible for the candidate, obviously, because they've lost their job. So it makes sense to really minimize the chance of that happening. Yeah.

LISA: Yeah, it's never easy. If you are a leader who really cares about their team, you would never make that decision lightly, but I also had to let someone go during the probation period, even twice, and it wasn't an easy decision. Still, it was in the end the best for the company, I have to say, because we tried to work on the challenges and improve things, and it didn't work out, so then I had to, let's say, pull the emergency brake before the six months were over. And yeah, there is a risk of that. No matter how good your hiring process is, you can minimize it for sure, but some risk will always remain.

TIM: Yeah, it's not a perfect science, and then also just circumstances change. Who knows what happens in people's lives that can affect their job performance? It's almost unlimited, isn't it?

LISA: True, yeah, absolutely. That was also during COVID and the lockdown, with people moving to Germany, or to any country in general, and then not knowing anyone in the city, being locked in their home, and not being able to go to the office to get to know people. That was a tough time on top of the job itself, of course. But I still have some learnings from these cases, which were about data engineers. On the side of the hard skills, they performed pretty well in the interviews and even in a test we gave them, and the engineers involved in the hiring process gave them a very good score. But in the end it came down to communication skills, or the ability to deal with failure, or what type of leadership style they were used to working with, and that's what became the challenge and the reason for having to let them go. So as a learning for me, I would emphasize this, you could call it cultural fit, even more in the hiring process, and also trust my intuition or knowledge of human nature a bit more, rather than focusing so much on the hard skills.

TIM: So it sounds like there was something you'd noticed in the hiring process but ignored or dismissed, thinking, I feel like this is a problem, but it probably won't be. And it was a problem.

LISA: Yeah, exactly. I was like, okay, I might have some doubts about how this person can communicate, deal with stakeholders, deal with pressure, or fit into our team or company culture, but the engineers didn't have these concerns, and the hard skills seemed to be great, so okay, let's leave my gut feeling aside. But I would not even call it gut feeling; let's say it's really giving more importance to the soft skills in general and to team fit or cultural fit, rating that a bit higher. I left that aside a bit and wanted to focus more on the hard skills, and it turned out not to be a good decision to hire that person.

TIM: Yes, one thing I was thinking about just recently, because I was travelling overseas, was cultural fit: this idea that we might have a set of values we're looking for; we want this person to have a growth mindset, or they're focused on making it happen, or they can take constructive criticism, or whatever the things are that we're looking for to fit in with a culture. And it assumes that the candidate themselves has a fixed set of traits or values, and if they don't currently have the same ones as our business, it might be difficult for them to adapt, which is probably fair enough. But then I had this feeling as I was going to all these very different countries recently. I went to Germany, I went to Saudi Arabia, I went to Thailand, and to drastically different places in Australia, and pretty quickly I figured out, oh, in Saudi Arabia I should probably wear some pants and not really short shorts, and little subtle things like that that you just pick up naturally. Unless you're a very disagreeable person who likes breaking the law, or breaking the rules, I should say, you just figure out how to fit in. I went to Thailand, and there's a lot of little bowing after you buy something or have an interaction, a lot of smiling, and you just adopt that almost naturally. So I wonder whether in some ways we don't give candidates enough credit for how naturally adaptable they might be in terms of changing their culture, or am I being overly optimistic about how much people can change?

LISA: So, talking about cultural differences, expecting the ability to adapt to the country where you're working, to some extent, I think that's fair enough. As you say, we would also adapt to other cultures if we worked in that country. To some extent, let's leave it that broad to begin with. But you shouldn't ignore cultural differences, and that's where it's difficult to balance: okay, let's train people on the biases we want to avoid, we want to avoid stereotypes, but at the same time you shouldn't ignore cultural differences, because there are differences in how we deal with time, how direct communication is, how openly you can disagree with someone, and how hierarchical structures or leadership styles are in different cultures. You can't ignore that; otherwise you will also discriminate against people in the hiring process, for example. So this is really about culture in the sense of different parts of the world. Then, okay, company culture, or the values the company commits to: I think if you have clear values, like diversity or fighting against discrimination of all kinds, as Babbel clearly positions itself, then it's mandatory to check this in the interview process and to check whether candidates commit to this type of company culture, because it's not always a given. Do they have experience working in international environments already, or are they aware of the challenges that might come along with that? How do they react to women in a leadership position, for example? I would say one baseline for the mindset, for all other types of values to work, is a learning mindset: is someone open to learning and growing, to improving themselves, to seeing that they don't know everything yet, that they might make mistakes, and that they can get better? If you have that as a precondition, it's easier to establish other values, like at Babbel being bold and humble, or knowing your limits, or not wanting very big egos. So if you have this learning mindset and this openness as a prerequisite, it's maybe easier for other values to take hold. So yeah, there's culture in the strict sense and then company culture in the other sense, but you can't expect the same level of reflection about this from everyone at any stage of their lives or careers, of course. I would still expect them to be open about this, or at least curious about it.

TIM: Speaking again of bias, I personally have the view that a lot of the traditional ways we evaluate people in the hiring process are potentially rife with bias. For example, at the CV stage, depending on where you are in the world, a CV might include a photo; it certainly includes a name, which immediately implies gender and ethnicity. It could reveal your religion based on the school you went to, and it certainly indicates your age based on when you went to college or university. So I feel like there's a bit of a minefield there. And then in the interview process, I'd say most companies don't do it as objectively as you were doing it, in terms of trying to score people on a clear set of criteria. A lot of the time they just do a vibe check. They just think, did I like this person or not? It's a yes or no at the end of the interview. What about other, more objective ways? You've laid out this measurement criteria in an interview, as opposed to just going on gut feel. Have you seen any other ways that companies are trying to make the process a little bit more objective?

LISA: Yeah, all types of tests besides actual interviews: assessment centers, maybe, to start with, and skill tests. It depends on the relevance of the questions. I have to say I've also seen some quite doubtful assessment centers, like applying for a leadership position and then having to solve Sudoku puzzles and move shapes around under time pressure to check my strategic thinking. I don't know. But there can be skill tests with very relevant questions, of course, and I have to say I checked some of Alooba's tests that are available for free, and I found them very specific and helpful, like the one for marketing analysts, really checking expert domain knowledge, you could say. If you have that type of skill test, it can really be useful, because with these general assessment centers it's also easy to cheat, right? You can get a person who's good at Sudoku or whatever to cheat with that. That's way harder if it's about expert domain knowledge, of course, and it's still harder to cheat than with a take-home challenge or a case study. So having that skill test maybe as a first step, even if you don't have an HR department, or using it as a way of filtering first because you have so many applications, could also be interesting. I haven't used it that way yet, but if it's that type of very relevant question, I would be interested in trying it.

TIM: I feel like personally that is where things are going to go very quickly, because I've heard from so many people in the last few weeks about being inundated with CVs that a lot of the time have been written or optimized by ChatGPT and seem to look perfect compared to the job description. So then how else is a company going to choose between 500 CVs, 400 of which look good but might be largely fabricated? There's got to be some validation step between them and interviewing them with a human, because otherwise they're going to waste a monumental amount of time. So there's got to be something in there. I don't know what it is, but surely something where the candidate actually has to prove something: their skill and intelligence, or whatever.

LISA: Yeah, and I think these skill tests are probably also more meaningful, or more relevant, than, I don't know, some companies also do live coding for data engineer or data analyst jobs, which I find a bit problematic because it's a very stressful situation, it puts people on the spot a lot, and it's a less critical skill. Do you really have to code that fast under pressure in your real job afterwards? Not really, so such a skill test would make more sense to me. Take-home challenges also still matter; of course you can use LLMs to cheat or get someone else's help, but you still have to prepare the results in an appealing way, you have to send them on time, and you have to present them. So it's okay to use AI to prepare, but to perform you still have to do it yourself, right? How would you use AI to perform in the actual interview or presentation? I can't really think of a way to do that yet; maybe there will be one in the future.

TIM: Yeah, I think that's the critical step, because this was always a problem even before LLMs with a take-home: oh, could they have gotten a friend to help them with it? But they're going to come unstuck immediately in the interview, because they're going to absolutely crumble under any level of questioning, and it will be very obvious they haven't done it themselves. It's the same if they've just outsourced it to an LLM. I feel like a good case study that uses the company's data and is closely related to what they'll actually do on the job is surely still going to be the best indicator of whether or not they can do the job. Maybe not the cultural fit side of things, but at least, can they do the job you want them to do? A good case study, I think.

LISA: Yeah, I still believe in case studies too, and last but not least, some companies also have trial days, which I never did. I also see a bit of a risk there of making candidates act more pushy, for example, than they usually would, because if they participate in meetings, they are put on the spot and expected to suggest solutions much faster than they would in the real job, where they would listen first and ask many more questions. So I think that can also give a pretty distorted impression.

TIM: Yes,

LISA: Sure about it?

TIM: and probably impractical for candidates who are still working to take a day off to do another job. It must be challenging to pull Yeah. Yeah. and also more effort for the company itself to organize that and which meetings to invite that person to, or to simulate something, I don't know, even more effort, so not really scalable. Is there anyone that you've personally learned a lot from in hiring or in data? Anyone that you'd like to give a quick shout-out to?

LISA: Oh yeah, a lot of people. I still learn a lot from the people I work with every day, but if I have to name someone from data, it would be Steven Lamb, who was my first manager at Babbel and was and still is my biggest role model when it comes to leadership and the art of asking the right questions, with a great level of self-reflection combined with strategic thinking and vast experience. So if you're hearing this, thank you for being who you are: a great thinker, a great mentor, a great person. And for hiring, I worked with several great people from Babbel's talent acquisition team, especially since they are also motivated, structured, and willing to constantly improve the processes, so shout out to Ibrahim, Ashwini, and Giovanni, to name only a few. You're amazing.

TIM: Wonderful! Some great shout-outs for some great people. Lisa, it's been a pleasure. It was really good to chat with you today. Thank you so much for coming on.

LISA: Thanks again for having me. It's been a pleasure for me as well, and thanks to everyone who listened until the end.