In this episode of the Alooba Objective Hiring podcast, Tim interviews Alexander Sommer, Director of Data at LoyaltyLion
Alexander, a seasoned hiring manager with a notable presence in the data space, joins Tim to discuss the transformative impact of AI in hiring. They delve into how AI, particularly tools like ChatGPT, is used to screen CVs, maintain quality, and save time while minimizing bias. Alexander also shares his approach to balancing technical skills and soft skills, emphasizing the importance of communication and listening. Throughout the conversation, they explore the benefits of transparency in the hiring process and the value of structured evaluation through scorecards. The episode offers valuable tips for hiring managers on assessing candidates effectively and highlights the evolving dynamics of AI and human interaction in recruitment. Alexander concludes with personal insights into why and how he enjoys hiring, making for an informative and engaging discussion.
TIM: Alexander, welcome to the Objective Hiring Show.
ALEXANDER: Thank you very much, Tim.
TIM: It's good to have you here, and I feel like you've got the best beard in data; that's already an accolade that you deserve. I think I've got beard envy.
ALEXANDER: Thank you. You're not the only one that has told me that, so I'm glad.
TIM: So, Alexander, AI is reshaping everything about the world as we know it at an alarming rate; people are thinking about it in all kinds of contexts. I'd love to understand your thoughts on AI in hiring. Have you used AI in hiring? Have you seen it already have an impact, and if so, where?
ALEXANDER: Yeah, that is a good question. In general, AI has taken off massively in all areas, and I would say in the data space it's probably one of the most prevalent technologies nowadays. We use it constantly in various areas of the business, even in our workflow and the life cycle of what we do, and in hiring it's just as common to use these days. Within the current organization we don't rely on it too much; there's an aspect of us checking through a lot of applications, specifically when we go through a screening. The idea behind it is to get a gist of what types of applicants have actually applied, and it just helps us save time. The one thing we don't do is have the AI, in this case ChatGPT, make the call on candidates for us, so we try to leave the bias out of it; that's maybe one thing we are conscious of. The whole process is very much manual, so whilst we do the screening, we don't reject any candidates based on AI. Maybe that's one way I've seen it used. It's an interesting tool, and not just on the hiring side; on the candidate side I've seen a lot of candidates use it. A lot of CVs are actually written with AI. It's quite easy to spot, and sometimes even the full application is. There have been candidates who used AI to tackle the specific technical assessments, which has caused us a few issues throughout the process. I can tell you a little bit more about it later if you want.
TIM: That would be amazing. And yeah, we'll get to the assessment stage. For now I'm interested in a few things you mentioned there. One was that you're using ChatGPT at the moment to look at the CVs of the candidates and also the application information they might complete; I'm not sure if you have additional questions. What exactly are you getting it to do? Like, what does the prompt look like, without giving away all the secrets?
ALEXANDER: Mostly, the context we have built is specifically around who we are. You can specify who you are as an interviewer, so we do a role-play-type screening: we pretend that we are the hiring manager, so we teach the GPT to do exactly what we would normally do, and then give us a prioritized list of the candidates that would stand out, what areas to look out for, what the specific points on the CV are that we should probe, and so on, but also give us a bit of an idea of a candidate's potential quality. Now, this is all just for screening. We still go through all of the applications manually, but it does help us get a sense of a candidate's potential quality, and it helps us understand afterwards whether that assessment was true, whether it matched expectations or not.
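[Editor's note: for readers curious what such a role-play screening setup might look like in code, here is a minimal sketch using the OpenAI Python client. The model name, prompt wording, and helper function are illustrative assumptions, not Alexander's actual prompt; the key property he describes is that the model surfaces pointers for a human reviewer and never rejects anyone.]

```python
# Hypothetical sketch of a role-play CV-screening prompt, along the lines
# Alexander describes: the model is told to act as the hiring manager and
# to return pointers for a human reviewer, never an accept/reject decision.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = """You are role-playing the hiring manager for a Senior
Analytics Engineer role at a SaaS startup. For each CV you are given:
- summarize the candidate's apparent strengths,
- note whether the candidate stands out for this role and why,
- list specific points on the CV a human interviewer should probe.
Do NOT reject or score any candidate; the decision stays with the
human reviewer, who reads every application themselves."""

def screen_cv(cv_text: str) -> str:
    """Return the model's screening notes for one CV (draft only)."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model choice, not confirmed in the episode
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": cv_text},
        ],
    )
    return response.choices[0].message.content

# Usage: notes = screen_cv(open("candidate_cv.txt").read())
```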
TIM: And do you attempt to, as the output of the prompt, do anything quantitative? For example, are you categorizing the applications into, like, good, decent, or not-so-good, or stack ranking them in some way, or grading them across any criteria?
ALEXANDER: Good question. Not so much. Again, we don't try to use it for that, because it would introduce quite a bit of bias, and that sort of categorization is done by us manually. We do use it for a basic screening, but we wouldn't want to go anything beyond that, just because we have different criteria when we assess candidates, and we rely heavily on interviews. So this is more old-fashioned; even in our domain, it's very important that we don't downplay the importance of soft skills and actual in-person interviews, or online interviews at least.
TIM: Okay, and using ChatGPT at that CV stage, what then is the goal of it? You're not using it to reject candidates, certainly not automatically. You mentioned saving time, but you're still reading the CVs, so how does it save time for you at the moment? I guess that's one way of framing it.
ALEXANDER: Yeah, good question. It helps us get through, especially if you have a larger quantity of CVs; it usually gives us an indication of which CVs are of lower quality, maybe not even suitable for the specific role that we have. It's really just to get through the quantity; when it comes to the actual quality, that's when we pay more attention to the CVs ourselves, or to specific candidates.
TIM: Okay, and it sounds like you've got this prompt quite customized for exactly what you're looking for. Is it shared across the team and organization, with everyone using the same prompt, or is it almost like you've created an Alex bot, in a way?
ALEXANDER: In a way, it's actually an Alex bot, so we don't have it as a framework. We do use a company account, but we don't necessarily use it in the same way; it's more for individuals. We still have to do the scorecards; we still have to go through all of the profiles manually, but it personally helps me understand what sort of applications I'm dealing with. It's not a widely adopted practice in the company yet, and it'll be interesting to see if that changes at some point, yeah.
TIM: Companies are almost creating their own mini products in different bits of the business, and it'll be so fascinating to see where a lot of those land in the next year, as the ones that seem to work well will probably get more widely adopted, and maybe some businesses will crop up to then really scale that across organizations. These little kind of skunkworks I find quite interesting.
ALEXANDER: Actually, it's good you mentioned it, because this is something we are also exploring, specifically as a data team. I don't know if you've heard of the concept of service agents; the idea is to take it a bit further and not just create a GPT but actually more of a lookalike of you, somebody you can go and talk to, somebody that can come back to you, ask specific follow-up questions, and clarify the intent. So that is something we are exploring, and I've seen it in many organizations: the concept of service agents and this concept of scouts. You come in, you ask for specific information, and you have more of a conversation. But yeah, that's a little bit off topic, though very much related to this, and I feel like within hiring this could be an interesting area.
TIM: Absolutely. And something you mentioned a few times, actually, is that you're wary of the bias that ChatGPT or whatever large language model might introduce. Is there any particular bias you're concerned about, and have you thought about the existing biases that we have doing this as humans?
ALEXANDER: Good question. I guess it's improving, but hallucination is a thing, and also, the more you feed it, the more memory you give it, the more it tends to create preferences. It's very difficult to control the evaluation scoring within the GPT, because over time it just feels like it degrades. Again, it's improving, and that's why you have the whole new functionality for keeping memory within a GPT; I don't remember the exact number of messages, but they do get skewed sometimes and have to be purged. So it's not necessarily a bias in the sense we humans would know it; it's more a bias in the sense that it can modify the criteria without you knowing it did, and the guard-railing of what it should be doing and how it should be evaluating needs to be revised continuously. This is one issue we have with the LLMs today. Again, you can work around it: I've seen implementations where you essentially have human intervention at many steps, which ensures that when it processes a lot of data, it doesn't actually make decisions for you, because if it did, those decisions would change slightly every single time. That's what I mean by biases, if that makes sense.
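[Editor's note: a minimal sketch of the human-in-the-loop pattern Alexander alludes to, under the assumption that the model only ever drafts notes while every advance/reject decision is recorded by a human. All class and field names here are hypothetical, not from any specific tool.]

```python
# The model writes draft annotations only; the decision field can be set
# exclusively through the human review path, so any drift in the model's
# implicit criteria cannot silently change outcomes.
from dataclasses import dataclass, field

@dataclass
class Application:
    candidate: str
    cv_text: str
    ai_notes: str = ""          # draft notes only, never authoritative
    decision: str = "pending"   # set exclusively by a human reviewer

@dataclass
class ScreeningQueue:
    items: list = field(default_factory=list)

    def annotate(self, app: Application, notes_from_llm: str) -> None:
        """Attach AI notes; deliberately cannot touch the decision field."""
        app.ai_notes = notes_from_llm
        self.items.append(app)

    def human_review(self, app: Application, decision: str) -> None:
        """The only code path that records a decision."""
        assert decision in {"advance", "reject"}
        app.decision = decision

queue = ScreeningQueue()
app = Application("Jane Doe", "…CV text…")
queue.annotate(app, "Strong SQL background; probe the gap in 2021.")
queue.human_review(app, "advance")
print(app.decision)
```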
TIM: Yeah, that's a great explanation. Thank you for that. Okay, so we've got this kind of experimental use of AI in the screening stages, and you mentioned that candidates are using it too, for the tests and for creating their own CVs. What do you think about that philosophically? If you knew a candidate had used ChatGPT at every stage of the hiring process, would you feel you weren't really interacting with them, in a way? Or is it more the case of, well, of course you want them to use this groundbreaking tool, because they're going to be more productive when they use it in a business? What do you think?
ALEXANDER: Yeah, it's a funny one, because I'd be hypocritical if I said I wouldn't like it; if we are adopters as well, then candidates should be free to use it, and I actually quite welcome it. I quite like it when people use it. The only time I don't is when it's just a blatant copy-paste, essentially. I've seen some candidates do that, and when they forget about the footnote that the GPT leaves about the topic, it's more funny than anything, because I get it: candidates are desperate nowadays. Even in the data domain, the market is not great, right? There are fewer positions; companies require broader skill sets and more years of experience. Candidates have to apply to hundreds of job postings, so to me it only makes sense that they use it. It's more about how they use it. If you use it for what we call whiteboarding or storyboarding, where you essentially have it build up the structure through your prompts, even if it takes you an hour, take the same amount of time you would take to create a CV manually, but take your time to build up to it. It gives you a better structure, and then you refine it yourself. That sort of usage I'm happy with, because that's the new way of working anyway; in most roles in most companies, this is how people utilize GPT: you use it to create the initial version, you take it away, and then you finish it off manually.
TIM: That reminds me of a screening question I once asked about what success in the role would look like. The question was very specific to that person: I cared exactly what their opinion was. I didn't care what they said as long as they told me the truth. I was struck by how many candidates had clearly used ChatGPT for that kind of question, which I felt uncomfortable with, because I don't care what a large language model answered arbitrarily one time; I care what you thought about this. I care about you personally. So where do you draw the line on what candidates should be using in the hiring process, or is anything fair game, do you think?
ALEXANDER: That is a really good question. To me, treating a CV that way, okay, that's fine; it's almost like listing your skills and past experience, so make it look good, fine. I think it's the screening questions where, as you just pointed out, what I appreciate is when somebody doesn't go into the level of detail a GPT would but adds their own thought process into it. I also use screening questions, usually two to three, pitched at various levels of experience, and at that stage, when GPT is used, that, to me, is a no, because I can Google the answer myself; that's not why I'm asking a screening question. What's the point of you Googling it and putting it in there? At that level, GPT hasn't been very good at coming up with very original responses. So yeah, I would agree with you: for screening questions, even if it's a few lines, just be yourself and answer; don't use it. If you want to take your ideas and use ChatGPT to refine them, fine, as long as it uses your prompts and your personal context, where you've defined who you are and created almost a digital footprint of yourself. In that case, GPT can do a decent job of rewording your own words; most of the time, though, it fails at that, and it doesn't sound natural. The only exception might be the o1-preview model. I don't know if you ever used it much; that one is pretty good at crafting better responses, especially if you give it really good context. Sometimes it's hard to tell whether somebody answered a question using o1-preview with well-modeled prompts or whether it's their own genuine answer.
TIM: That's a great point you make. The technology is changing so rapidly; who knows how indistinguishable this content will be from real human content in a month, two months, a year? But at the moment, I feel candidates should probably be aware of two things. One is that if they're applying for a job and in part of the application they've used their real writing and in another bit they've used AI, the difference is very obvious: you'll have normal human writing with imperfections, a typo here, a spelling error there, spacing that's off, a word used incorrectly, which is fine, and then this amazingly elaborate, verbose, perfectly grammatically correct stuff. It's like they were not written by the same thing. It's very obvious, so they should be a little bit clever about that. The other thing is that if so many other candidates are using ChatGPT, maybe the way to stand out is just to be really authentically yourself, because then the answers will be obviously yours.
ALEXANDER: Yeah, that's true. That is true. And for me, it's also the examples, right? You can use ChatGPT, but it will always give you recycled content; that's how it always feels. Providing authentic examples is something you can't ever replace in any application. Some of the best applicants I've had only put a few lines in the response to the screening question, but I knew right away: okay, this was something they actually did or worked on, because it was so imperfect, not in the sense of grammar but in the sense of the content, the experience, the example they provided, how flawed it was. And I thought, this is great; I want to talk to this person, because it's so imperfect as an answer. That's what works for me. I usually look for the flaws in order to find the best candidates.
TIM: Imagine somebody has just stepped up to a manager of analytics role for the first time; maybe they're doing their first hire. Do you have any suggestions for them on how to hire better, particularly how to screen better?
ALEXANDER: For me it's extremely important that you have very solid evaluation criteria. Have your scorecard, but then have a clear, shared understanding of what a good, an average, and a low response look like; that gives you a solid foundation, especially if you're new. It's very easy to feel like you're not connected, like you can't associate yourself with a candidate, and sometimes you have the manager bias of looking for someone like you. It's very difficult to move away from that, and what helps is to have solid questions and then answer those questions yourself, and then think about what a good, an average, and a below-average response would look like. It might be essentially the same answer but with something missing from it: maybe part of the thought process, maybe a topic around testing, maybe an understanding of the potential impact of the piece of work the person was doing. It's about spotting those missing little pieces. For me that's always worked, because when I originally started as a hiring manager, this is the one thing I struggled with a little: to me, all the candidates looked great, and I thought, okay, I'm struggling to actually find a good one; what's happening? What really helped was taking a step back. This is something my manager taught me eventually: go to your scorecard, go to your questions, answer them yourself, and then let's get together and reevaluate those answers. So what I did was almost go through the interview process myself, but with my manager, and then I would get the feedback, and he would essentially tell me: look, this is the answer; let's place it on the scale, and then we'll see what would make it better and what would make it superb. Then also associate specific levels with those answers, because what happens is you hire for a specific role, and for that role you have a range of answers, but the role could be mid-level and you could end up finding a candidate who is a lead. That's perfectly fine; if we're flexible, let's hire the lead if we want to, but in that case it's very important you have that distinction. Long story cut short: answer the questions yourself, ideally consult with a more experienced hiring manager, and then craft the responses for each level. That becomes your framework for evaluating other candidates.
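[Editor's note: to make the scorecard framework concrete, here is an illustrative sketch of per-level model answers of the kind Alexander describes drafting in advance. The question text and level descriptors are invented for illustration, not his actual rubric.]

```python
# For each screening question, the hiring manager pre-writes what a
# response at each level looks like, then grades real answers against
# those anchors instead of against gut feeling.
SCORECARD = {
    "Tell us about a data project you owned end to end": {
        "below_average": "Describes tasks only; no testing, no mention of impact.",
        "average":       "Clear process, but misses testing or business impact.",
        "good":          "Covers process, testing, and the impact of the work.",
        "superb":        "All of the above, plus trade-offs and what they'd redo.",
    },
}

def grade(question: str, observed_answer: str, level: str) -> dict:
    """Record which pre-written level an observed answer best matches."""
    assert level in SCORECARD[question], "level must exist in the rubric"
    return {"question": question, "answer": observed_answer, "level": level}

print(grade(
    "Tell us about a data project you owned end to end",
    "Built a churn model, validated it on holdout data, cut churn ~5%.",
    "good",
))
```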
TIM: That's such a great answer, and I feel like a lot of the issues with hiring are systemic. Sometimes there's a lack of real forethought and planning; you just dive into it, you're interviewing people and gut-feeling it the whole way, and you get to the end and do a yes-or-no vibe test. I feel like that's where a lot of these processes run awry. So the fact that you would go to that effort initially to create the scorecard, and then go through it yourself, that's really clever, because it must also give you empathy with the experience of the candidate. And I'm sure it helps you refine it, maybe make it a little longer or shorter, as a result of what you felt going through the process yourself.
ALEXANDER: Exactly, and as you just mentioned, sometimes you can end up with a screening process and questions that are just too long, and doing it yourself makes you realize this is such a waste of time for everybody, because that person has to fill it in, and I have to read it and evaluate it. Then you realize, okay, let's just stick with the important stuff, and if those two questions give me what I need to evaluate a candidate initially, great, that's what I'll do. I usually end up with two questions; that's about it.
TIM: Yeah, there's so much optimization there, isn't there? Picking the right questions and trimming it down to the ones that give you the most insight, almost like covering the maximum number of criteria you're looking for.
ALEXANDER: Precisely, and then again, like the process is longer, right? It's a multi-step process, so there's going to be an opportunity for you to evaluate other parts, other skills, later on in the process.
TIM: So you don't have to put it all into one stage.
ALEXANDER: No, I wouldn't. I'm actually maybe inefficient as a hiring manager because I interview so many people, but it also depends on what organization you're in. I've worked in larger corporations; I've worked in scale-ups; now I work in a startup. If you're in a bigger organization, you can't afford that time; you're going to have thousands of applicants, and it's almost impossible. But as you move to smaller and smaller organizations, you do get fewer applications, still a lot, but once you're at a manageable number of applications, for me it's more important to actually go through them manually.
TIM: That probably makes sense. If you accept that the most important thing you can do as a leader is hire the best or right people into your team, then maybe we underplay how much time we should be spending on hiring; maybe it should be half of your job or something dedicated to it. So maybe that's not unreasonable.
ALEXANDER: Yeah, I wouldn't say half, but I would say, especially if you're in a hiring period, it's going to be around 20 to 30 percent easily, especially to find a good candidate, because on average it takes roughly six months to find, say, a senior developer. That's a long time. At least for me, we don't take it lightly, and it's funny: the smaller the organization, and now I work in a startup, it sometimes takes us even longer. And even when somebody joins, it doesn't mean it's over; the evaluation goes on. There's a probation period, which we take very seriously, also because some candidates lately have become incredibly good at interviewing, which doesn't make them good people or good developers or good engineers. I don't know if it's because of ChatGPT or something else, but as a candidate, if you've applied for hundreds of jobs, you get really good at it, yet your day-to-day work performance may not be as good. We've had those situations happen to us.
TIM: You reminded me of something one of our guests last week, Vladimir, said. There's a joke doing the rounds that there are two sets of candidates: those that are amazing at the job but not at interviewing, and those that are amazing at interviewing but can't do the job. A little bit cynical maybe, but I've certainly seen that phenomenon.
ALEXANDER: It is true; it does happen, and I usually look for the one that's in the middle. They may not be the best at their job, but they're not the worst at interviewing; they're somewhere in the middle, because if you have a good amount of soft skills, you'll do better than somebody extremely competent in their role who can't communicate; obviously, it depends on seniority. But generally, I always look for someone who, from a technical perspective, is probably not the best but can grow, rather than somebody who's technically extremely competent but can't communicate at all. That, to me, just doesn't work, especially in data. It may work in other parts of engineering, where you have a very focused lens and you just sit down and do the job, maybe in larger organizations with extremely defined, focused roles, but in smaller startups and scale-ups that doesn't work.
TIM: Yeah, that was going to be my next question, around that trade-off or balance between technical skills and soft skills, and you mentioned straight off the bat that you wouldn't go for someone at either end of that spectrum. Other than communication skills, are there other really essential soft skills that you're looking for?
ALEXANDER: Good question, because to me, especially in data, communicating the value and the ability to listen are the top two priorities, and they're not quite the same. Communicating is about translating your understanding into words; listening very closely follows. It's about taking in the information and being relevant, not going sideways or off topic. Again, you tell me if it falls into the same bucket or not; to me there's a clear distinction, because one is what comes out of your mouth, and the other is what comes in through your ears, right? So those would be the top two skills. Obviously, there are other things like empathy and some social skills, but you don't always get that with engineers, and that's fine. I can't expect every highly competent senior engineer to also be extremely confident and likable; it doesn't always happen, though sometimes you do have people like that. It's quite interesting: people from non-technical backgrounds who got into the field do tend to excel at this quite well.
TIM: You've touched on something really interesting there again, which is that for some of the softer skills there's maybe a level of subjectivity or ambiguity, because you just said communication skills could include listening, or that could be its own separate skill, depending on your perspective. When you're evaluating candidates on the softer skills, how do you try to do that in an objective way? Is it still part of that scorecard, and if so, do you break down communication skills into sub-skills? How do you think about that problem?
ALEXANDER: Good question. There's almost a soft side to the scorecard we have, which evaluates the ability to communicate. It's quite simple; it's more about how on point they are, whether they go off topic way too frequently, what the eye contact is like, and so on. We do have these criteria, but they don't make or break it; as long as the content is good, then we look at the soft side. So technically, depending on the level we're assessing, we try to get a good idea of your competency, and then we evaluate your communication skills. We're quite forgiving, or at least the companies I've worked for are, about people not having perfect communication skills; they may not be very good at expressing their ideas, but as long as the content is good, we don't mind somebody not being confident with the language, let's say. The ability to listen was always the one that carried a really big weight in our evaluation: if we ask you a question and you don't answer, or it takes two probing questions to get the answer, then you're not really good at listening.
TIM: Yes, that's probably a great tip for candidates there. I can tell you nothing would infuriate me more, I've got to be honest, than someone who just doesn't answer the questions I've asked and prevaricates. Either they don't know the answer, or they just want to steer away from the topic because it feels like it's going to be conflict or something, I don't know. And an interview can often feel a little bit like conflict. But it's just such an annoying trait, I find. I'd rather have an honest, direct answer.
ALEXANDER: That is true, and there are some people who just try to get out of the question, but maybe as a tip for candidates, what actually really works is: be quiet, take your time, and ask for a few seconds to think. It's okay. As an interviewer, I would prefer you stay quiet for half a minute rather than listen to you go off topic or make up an answer on the spot; that doesn't work.
TIM: Yes, directness. And what about succinctness? I guess it's a bit more subjective; some people would prefer a more verbose answer that goes into more detail initially, while some would prefer something a bit shorter. Actually, that leads me on to another sub-area: I should mention that for some of these interviews you would have multiple interviewers. Can you remember any scenarios where you deviated greatly in how you'd evaluated some of these soft skills, for example listening or communication, and if so, how did you reconcile those differences in these slightly gray or subjective skills?
ALEXANDER: That's a good point, actually. When we did group interviews, we did evaluate these, but we would submit the feedback independently, and then it would be up to one person to evaluate the differences in feedback. We don't talk about the feedback among ourselves first; it's extremely important that it stays independent, because it's a subjective skill. To me, somebody could be a great listener, whilst another interviewer thinks, wait, they were not listening; they didn't answer properly. And maybe it comes down to the way the answer was worded: maybe I was looking for an answer and I got it, while the other interviewer was looking for their answer. So what works to avoid subjectivity is to submit the feedback independently and then review it; then everybody can see it. You can't change your responses and your score, but you may be able to tell that, okay, maybe the other interviewer was right and I wasn't seeing it, and that's great. Then it's up to the final person who evaluates the feedback to tell us which feedback was more valid and make a decision, yeah.
TIM: What about assessments versus different types of interviews? What are your thoughts there? What do you use?
ALEXANDER: Good question. It changes a lot. We used to do a technical assessment, quite a lot actually; I still use one, but we're changing it now for a reason I'll tell you about in a bit. Usually it's a two-part thing, depending on the level of seniority. One part tends to be a technical coding exercise: do a small project, a small program, something like a URL shortener; just submit your repo and show me how you would design it and make it work. That's always the point of the exercise: if it's something that doesn't actually run, then yeah, you're probably not able to code, and I'm not looking for production level. So that's the technical assessment, and then there's a case study: you're given context for a specific audience, and you have to present something in front of us. It could be a design; it could be a specific architecture, say a data platform architecture that you'd be building. So it's a combination of those two, coding and the case study, and we evaluate both, but we don't only look at the solution; we always look at your ability to defend your solution and also think on the spot. That's how we combine it, so it's not an exercise where you submit something and get rejected by default if it doesn't work or is flawed in design. Sometimes, of course, we do that: if a candidate is extremely poor technically, then continuing makes no sense. But for somebody more junior, we tend to have the discussion anyway; we want to see what you did and why you did it the way you did.
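[Editor's note: as an illustration of the kind of small take-home exercise Alexander mentions, here is a minimal sketch of a URL shortener. The in-memory store and base62 counter encoding are one reasonable take a candidate might submit, not a prescribed solution.]

```python
import string

# Minimal in-memory URL shortener: the kind of small, working program a
# take-home exercise might ask for. Deliberately not production-ready:
# a real service would add persistence, concurrency safety, and an HTTP layer.

ALPHABET = string.digits + string.ascii_letters  # base62

def encode(n: int) -> str:
    """Encode a counter value as a short base62 code."""
    if n == 0:
        return ALPHABET[0]
    code = []
    while n:
        n, rem = divmod(n, 62)
        code.append(ALPHABET[rem])
    return "".join(reversed(code))

class Shortener:
    def __init__(self) -> None:
        self._counter = 0
        self._codes = {}  # code -> long URL
        self._urls = {}   # long URL -> code (dedupe repeat submissions)

    def shorten(self, url: str) -> str:
        """Return a short code for the URL, reusing codes for known URLs."""
        if url in self._urls:
            return self._urls[url]
        code = encode(self._counter)
        self._counter += 1
        self._codes[code] = url
        self._urls[url] = code
        return code

    def resolve(self, code: str) -> str:
        """Look up the original URL for a short code."""
        return self._codes[code]

if __name__ == "__main__":
    s = Shortener()
    code = s.shorten("https://example.com/very/long/path")
    print(code, "->", s.resolve(code))
```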
TIM: And is it part of that follow-up where you're digging into their approach? Do you ever, I'm looking for the right word here, try to give them a bit of constructive negative feedback and see how they react, to see if they can deal with that criticism? And do any candidates ever not deal with that particularly well?
ALEXANDER: I would say yes to both questions. Sometimes we take a specific design decision the candidate made, and it's not wrong, but we frame it as if it were, and then we sit back and see what the candidate comes back with. Are they defensive? Are they understanding? Are they going to admit it's wrong? Are they going to say, you know what, it's fine, that was one design choice, and I would probably go for other choices if I had more time? That could be the right answer, right? So we always look to challenge in a good way, and then we observe the response; that's what the evaluation criteria are for. If you get defensive, that suggests you're probably not conscious of other options. And if you give us a well-rounded answer, offer potential alternatives, and start explaining what production-level code would look like, then that shows a level of maturity. So that's how we use it; it's essentially a great example of how to probe.
TIM: Yeah, that's such a good way to do it, because you're getting to see the candidate's real work; you're giving them a genuine challenge that correlates to the job. If they can produce something that functionally works, to me, as a startup founder, that's the most important thing: it works, it does what it's meant to do, amazing, you've really achieved something. But then, because you have that follow-up interview, they've got the opportunity to really tell you: actually, if this were the real job and I had to make it production-ready, here are the improvements I would make, without having to do all those things in some ridiculously elaborate task. So I think that's a really clever way to do it.
ALEXANDER: Exactly, and many times I think candidates forget, and they overengineer. It's a stressful process, looking for a job, but my tip would be: don't overengineer; be conscious of the limitations. We as interviewers love to hear about the limitations. We know that the one or two days you have to submit your case will never be enough; it will never be good enough. So just tell us why it's broken; tell us why it's so simple, almost primitive. That's a good discussion to be had, and some of the best interviews I've had were based on a case study or technical assessment that we only talked about for maybe five minutes; the rest of the evaluation was about the possibilities, what that person could or would do in other scenarios.
TIM: Yeah, and I feel like that's maybe even especially important in startups, where often there's a trade-off between speed and quality, and you sometimes need to do a quick-and-dirty MVP, hackathon-style project, and that's perfectly fine, but then at some point you might need to scale it into a proper product. So if they've thought through some of those trade-offs and they know, I could do 10 minutes of work and get 30 or 50 percent of the value, then that's probably worth it, rather...
ALEXANDER: Yeah.
TIM: ...than building it with all the bells and whistles from the get-go.
ALEXANDER: That is true. And as you mentioned, it's good for startups, and potentially scale-ups, though scale-ups are not as much of a thing as they used to be; in large organizations it may differ, right? It all depends on who you're applying to, so it's something to be conscious of; take this advice with a pinch of salt. If you're applying to startups, this is great; it's a good methodology. For big organizations, I can imagine not, because they're looking for real domain experts, and that's great, so be thorough in your responses. Whereas for startups, yeah, you have to know about everything, but don't be the one that needs to do everything; you just need to know a little bit about everything. Yeah.
TIM: Knowing who you're speaking to, who you're pitching to, matters. Even within a hiring process at a startup, your first interview might be with someone from talent acquisition, and your final interview might be with your hiring manager, who's a technical whiz, and the way you communicate what you've done will probably have to change dramatically between those two audiences, just as it would if you're pitching to a founder at a three-person startup versus a hundred-thousand-person bank, because of course they're going to be looking for something a bit different.
ALEXANDER: That is true. And as you've mentioned, communication is key in this context. I've personally interviewed, I don't remember how many times, dozens of times throughout my career, and it always feels like I have to keep my interviewing skills fresh. The ability to tailor the amount of content, the depth you go into, at each stage is crucial. If you're talking to a recruiter, why bother them with the low-level technicalities of a specific solution? They're looking for achievements, so talk to them about achievements, right? And then as you go further through the stages, if you're talking to, say, a director or a senior engineering manager, that's when you can go more into the details if you like.
TIM: So it sounds like your approach is really structured; maybe that's not the right word: thought out, considered. You've got a scorecard you've thought through; a considered approach is probably the right way to put it. And it's, I'd say, not just data-driven but data-informed: you're measuring and grading people along the whole hiring process. I think it's a great way to do it. Have you ever come unstuck with that approach? Have you ever had a candidate who got to the end of the process having scored the highest, unambiguously the best candidate based on how they'd been scored, but then it just didn't work out at all, and something was missed, almost an intuition element that was glossed over?
ALEXANDER: Good question. Not exactly like that, but I did have one candidate who scored really well in all the interview assessments and even got an offer, and then it didn't work out two months later. That was probably my first failure in terms of judgment; maybe the criteria didn't work. There was something I had missed, and ultimately it came down to the ambition of that candidate. I overlooked it because we were so focused on the assessments, the interviews, and the screening questions that I didn't see what drove that specific candidate and what it was that eventually made them leave. Ultimately, the reason they left was that they went after a different structure; they wanted a different level of seniority, and they didn't want the unstructured life, even though they applied to a startup. And it's something candidates do: again, as I mentioned before, it's tough in data, and data teams are smaller, and people have to do a lot more work for less pay. Not in our case; we pay our people really well, but still, candidates will lie throughout the process, and at some point, when they get an opportunity that suits their ambition better, they won't have an issue leaving. That's what happened with that candidate. It didn't work out in the end because there was a mismatch between what that person was expecting and what we could offer as an organization. It was a little disappointing, but it was a good learning experience, so now it's one of the things I ask about, not directly but indirectly, to try to understand whether the person would match the culture. Yeah.
TIM: Both sides are after certain things, and you've just got to make sure they overlap well. I feel like maybe sometimes in the interview process candidates can be quite guarded about what they truly want. It's almost like a bit of a dance, a bit of a dating experience; you don't just come out with all your cards on the table. But maybe that's where, if you're having several interviews with different people, you can almost be checking in on the true motivations throughout; maybe you're looking for consistency in what they say, and that might help. I personally feel like hiring would be greatly improved if the company and the candidate were as aggressively transparent as possible from the get-go. I feel like...
ALEXANDER: Yeah.
TIM: ...that would improve things a lot.
ALEXANDER: Yes, it would, and I totally agree. And on transparency, one thing I've found works really well, at least in my process, is being completely transparent about the roadmap and the importance of that person in that roadmap. In some of my initial interviews, I sometimes go on a monologue for 10 or 15 minutes explaining what we've done, where we are, where we stand, what the ambitions of the company are, what I'm doing, who I'm looking for, and who I want to work with. I want to go into that level of detail, and it's not that I'm worried I might give the candidates some ideas; usually, when I'm that transparent, they don't just echo what I said; they actually open up. So this is something I've found works really well: be very transparent about your roadmap, where you are, what you've achieved, what the flaws are, what's pending, and what's coming up, because then the candidate gets excited. Usually, if they see an excited hiring manager, they get excited in the process, the stress drops, and they start becoming more natural. But yeah, on your point about transparency: this is how I achieve it, and it works really well.
TIM: Yeah, it's a great suggestion, again for maybe the newer hiring managers, to be transparent, and you're saying that almost unlocks transparency from the candidate as well, because they mirror or copy your transparency and are more open.
ALEXANDER: Exactly. Actually, I learned this from one of the processes I was in, when I interviewed at Databricks, a huge organization now, and still ambitious as well. The whole process took six months, but that's another story for another time, and it was something like 11 stages, so it was wild: I think three or four technical interviews, then multiple other interviews, then group interviews. It was insane. But anyway, long story cut short, there was one hiring manager who taught me about this, and he was extremely honest in the interview about what he was looking for, what he was doing, and what he felt the role would be. He even told me, about the role he was hiring for, being honest: I'm flexible about what the role is going to be as long as I get the right candidate. And then he explained the story of what they were actually working on and what the projects were. I still remember that interview even though it was a couple of years back, and it really helped me in that process; I stuck with it for those four months until the final interview just because of that experience with the hiring manager, because I not only opened up but was actually very inspired and felt motivated, even though it was a long process. So again, on the point of transparency: it's incredibly useful to have that, difficult to achieve, especially if you're doing hundreds of interviews, but once you do pull it off with a few very good quality candidates, it changes the whole process, and you end up finding the relevant people.
TIM: As a quick share, one company I've seen do this really well is a business based in Berlin called Get Your Guide. You might've used their product before. What struck me was how early in the process they were transparent with this type of information: at the application stage they had a way to share, yes, the usual job ad, but also a day in the life of the team, exactly who your teammates are, the roadmap written down, and even share-option valuation information in detail. They packaged this information in a bunch of PDFs and shared it with candidates, and what I noticed was that candidates then had 95 percent of their questions already answered, which de-risks it for them. So if any companies are looking to figure out how to increase the completion rate of their process, getting candidates the information they want as soon as possible is surely going to help.
ALEXANDER: Yeah, I totally agree. Again, it really helps later on, because if there's a mismatch in expectations, most likely it will come out at the beginning instead.
TIM: Yes.
ALEXANDER: Good.
TIM: Yes, you don't want to wait until the final interview after four months to hear, oh, you want 100,000 euros? This role only offers up to 80,000 euros, sorry. That's a disaster for everyone.
ALEXANDER: That is true, and it does happen more than you would think; it's your job, so I'm sure you've seen it multiple times.
TIM: Yeah, sad but true. Yes, Alexander, on that funny and happy note, it's been great having you on the show today. It's been really insightful, and you've been so well spoken; I really appreciate all your comments.
ALEXANDER: Thank you, Tim. Thank you for the discussion; I actually had a good chat. It's quite cool, because hiring is one of my favorite topics; unlike some other people, I do enjoy it, so yeah, it's good to have a chat about this with like-minded individuals.
TIM: Wonderful.