Alooba Objective Hiring

By Alooba

Episode 70
Lina Mikolajczyk on Enhancing Analytics Hiring with Structured Processes and AI in Recruitment

Published on 1/20/2025
Host
Tim Freestone
Guest
Lina Mikolajczyk

In this episode of the Alooba Objective Hiring podcast, Tim interviews Lina Mikolajczyk, Director of Data and Advanced Analytics.

In this episode of Alooba’s Objective Hiring Show, Tim interviews Lina to discuss the intricacies of hiring for analytics roles, emphasizing the segmentation of different analytics skills and maturity levels within companies. The conversation covers the pros and cons of live coding tests versus take-home assignments, and innovative approaches like giving candidates a choice between the two to reduce drop-off rates. Lina also shares insights on the importance of cultural and ways-of-working interviews beyond purely technical assessments, and how companies can benefit from structured interviewer training. The discussion extends to the use of AI for interview preparation and real-time feedback, advocating for AI-assisted improvements in recruitment processes.

Transcript

TIM: Lina, welcome to the Objective Hiring Podcast. Thank you so much for joining us.

LINA: Thank you for having me.

TIM: It's a pleasure to have you, and I would love to start with just a high-level kind of discussion at the moment around hiring processes for analytics roles, like what you've been doing, maybe what you've done in the past that's worked well, and what hasn't worked well. I'd love to get just a general lay of the land to begin with.

LINA: Yeah, so first of all, I think it's worth differentiating between the types of analytics skills that you are hiring for. Depending on the maturity of your company, you should definitely differentiate between data engineering, analytics engineering, and analytics; analytics and data science can be lumped together, and then, of course, machine learning and more advanced science can sit together. Actually having thought through a differentiation between the three processes is helpful, rather than trying to lump them all in, so you can focus on the technical skills that are required for that role but also the cultural fit that you need for the specific skill sets. So, for example, if you are at a level of maturity where you're just scaling self-service, you want your analytics engineers to be quite stakeholder-heavy so that they can communicate well around changes to data models, dashboards, etc. But if you've done the self-service part and you're more on the "What's next?" side, a more prescriptive, more forecasting type of analytics, then analytics engineering may not play as big a role, but you still need standardized data models, so you might focus more on the technical side. The one thing I will say that I think works really well is that even though live coding tests suck, they're so much better than making somebody take home a case study and work on it for hours on end, simply because you get engagement and a live view of how people think, rather than asking them, in what is already a burdensome interview process, to do more work outside of it.

TIM: You mentioned live coding sucks. Why does it suck in your view?

LINA: I think the stress of having to think on the spot, rather than having the kind of environment to be able to think things through, can sometimes make applicants more stressed than they need to be, because they feel like they're being tested; they need to perform, and people get performance anxiety. But if there's one piece of advice I should give analysts, it's that they are being asked to show their way of thinking rather than being judged on the most efficient way of pushing code.

TIM: As you were saying that and comparing live coding to a take-home, which, yeah, there's definitely pros and cons in each bucket. As you say, the take-home will take way longer for the candidates in aggregate to complete, and it's hard to compare, because one candidate might've taken an hour and the other candidate took eight hours, so there's a lack of apples to apples. I was struck by a study that Canva did years ago, which I've just remembered after forgetting about it for five years. They were, at some point five or six years ago, scaling out their engineering teams, the software engineers, not data, but close enough, and they were trying to solve the pain point of candidate drop-off at this stage. What they noticed was that there was a set of candidates who would rather have done a take-home and were dropping out of a live coding round, or vice versa, and the way they approached that was by giving candidates a choice: you can either do a take-home or you can do a live coding challenge. They got the drop-off rate down to zero, so every candidate was doing one or the other, because they had a choice. If they were short on time and good at interviewing, maybe they just chose the live coding option; if they were a bit more introverted, a bit more of a deep thinker, maybe they chose the take-home test option. Would you ever give candidates a choice? Because I haven't seen that done often in the hiring process.

LINA: I think it depends on the kind of case study that you're giving and, again, on the process. If the case study is focused on being able to present to stakeholders and communicate technical concepts in a commercial setting, and that's what the role requires, then I am more open to giving candidates the option of either. The only thing I would want is for the candidate not to feel like they're doing free work for you, right? The case study should be something that tests their knowledge and showcases their skills, rather than asking them to solve a problem where they feel like, Oh, are you just asking me to do this? It's like free labor. One thing I have seen before is case studies being compensated, especially if they take longer than X amount of hours. Not many companies do that, but it makes the applicant feel a lot more respected for their time, so I would also consider doing something like that, especially if the case study is one that you know takes a significant amount of time.

TIM: Yeah, it's such a tricky one, isn't it? Because I've had candidates who would do a case study and go, Oh, this is great; you're giving me something that's close to the real work that I'm going to do, which means I now know what you expect of me, and I get to use some actual data from the business, so I know the data I'm going to be playing with in real life. And some candidates would view the case study as almost a tell from the company as to their level of data maturity. So if you, for example, gave a candidate a case study and said, Here you go, here's, I don't know, 5,000 rows in Excel; this is huge data for us; give us your best VBA code and bash out some macros, a candidate might be like, Oh my God, I don't want to get into this environment; I'm going to opt out. So some would view the case studies as almost practice for them, but others are like, Hang on, this seems like real work. Are you sure you're not going to screw me? Are you sure you're not just getting me to do free stuff? So I guess it's all in the comms to the candidate: getting them to understand your intent and making sure they're not worried about being exploited? I don't know.

LINA: Yeah, and I think a lot of people forget that the interview process is as much you interviewing the candidate as it is the candidate interviewing you, and the top candidates are not just interviewing with you; they're interviewing with five other competitive places. Especially when salaries, and compensation packages in general, become quite similar, the candidate will base their choice on the kind of feeling they walked away with from that interview. So you really should invest in making that whole process as culturally reflective as possible of the company that you have, of the people you have, the teams that you have, because that's the best way for a candidate to tell: is this really the kind of place that I want to work in? And if you do have a relentless culture, if you are quite intense, then there's nothing wrong with reflecting that in your hiring process, because you want a specific type of person. If you've got a more laid-back, maybe more structured atmosphere, then reflect that in your hiring process. This is where I think the technical test and the case study are such a small part of the entire process and why the cultural interviews become really important. I'll give you an example: I have interviewed a fair amount; I've moved from company to company, and I find it really interesting when I have to interview somewhere and the cultural interviews are very much question-answer, question-answer. There isn't a conversational flow, because it feels like they're following a structure of questions that they have to get you to answer, and that seems really unnatural and really disruptive to the flow. We should be talking not because this is a box-ticking exercise, but because you legitimately are interested in having me be part of your team.
I guess what I'm trying to say is: if the technical test and case study are reflective of the kind of work, that's great, and you can make a choice as to whether it's a take-home or not, but it's the stuff that surrounds it that really makes the candidate figure out whether this is the place they want to be in. And likewise for you: if you're asking formulaic questions about stakeholder engagement, quite situational, structured questions, that doesn't encourage the candidate to have a conversational flow that actually showcases their breadth of skills.

TIM: It's again a tricky one, isn't it? Because I assume that companies that run hiring like this are trying to have a structured process so they can more easily compare candidates. They're asking every candidate the same question, or a set of questions that are similar to each other, so they can, in theory, grade them along the same dimensions, build a scorecard, and see who the best candidate is. But yeah, when I've heard people push back against this, it's normally along the lines of what you've said there, which is that it becomes a robotic exchange of question and answer. And maybe it narrows how much you can learn about the candidate, because you confine them to talking within these areas, and some candidates might even feel like they get to the end of an interview and they haven't really been able to show off their best self. Is that part of the challenge, do you think?

LINA: Yeah, and you bring up a good point, right? You do want to have some structure in the interviews so that you can compare candidates and make it as fair as possible; however, you can approach this in a way that has a little bit more of a conversational flow rather than that formulaic questioning we're talking about. The way to do this is to really invest time into training people how to interview, because the work starts up front: people need to know, okay, I need to find out about these five themes that are really important for this role. What are the pillars of the team? If you're interviewing for an IC, an individual contributor, or for a manager, what are the kinds of skills and values you expect them to have? Then you build out your question set from that, and you invest time with your interviewers to make sure they are able to ask questions around those themes. And then, if you have a scorecard readily available, say your themes are stakeholder management, communication, and collaboration, you make sure the interviewers have a set of questions they can ask that make the candidate feel involved in the conversation, plus very good scoring guides that everyone's aligned on, so everyone understands what good looks like for that role. I think that's really important. It's about understanding which skills we can compromise on, because no candidate is going to be perfect, and which skills we're really looking for and can't negotiate on, because those are the most important ones for success in the role.

TIM: And so it sounds like you're saying it's still good to have that structure, but within the interview itself make it more of a conversation, like you're just chatting to someone, as opposed to this kind of rigid question-answer. So there's a bit of back and forth; maybe they weave a little bit, maybe they delve into certain areas and dig a bit deeper. Is that what the interviewer should be doing, do you think?

LINA: Yeah, and I think especially if you're interviewing a lot and hiring a lot, you can get interview fatigue, right? And so you snap into the Q&A format because you're just trying to get through these. But I've often found that if I'm interviewing, I can usually tell within the first 10 minutes whether this candidate is going to be somebody I would want to work with, and somebody who is going to be hireable, because of the energy levels I can see them exhibiting. And if they are, the conversational flow usually comes quite naturally, because you ask them a question, they answer, but then they might swing a question back, or they might go, And I can tell you more about that. So then they're engaging you as they're going through, and that helps as well. I think it's a two-sided situation, right? The interviewers need to be very clear on what they're looking for and what they're able to negotiate on, but interviewees also need to understand that the Q&A format is up for them to interpret: they can play with it, and they can weave in questions and engaging statements to the interviewer as they're going along, because it also makes the interviewer's life easier.

TIM: Is there also a thought process around the candidates? It's almost reversing the bias, because we now know a lot about human biases; we know about the kinds of conclusions an interviewer could jump to pretty easily. A candidate could almost game that and be cognizant of it. There are biases around, for example, somebody who comes from the same school as you, or the same area, or some kind of common connection, like finding that common ground. Could it be that a cunning candidate, if they're really clever, tries to steer the interview to their advantage in that kind of way? What do you think?

LINA: And is that bad?

TIM: They could. They should, maybe. If you're a candidate, would you be taking that tack, perhaps?

LINA: I think absolutely. The thing is, you're entering into a place where you're going to be spending what, 40-plus hours a week, with people you want to find common ground with. You want to be working with people that you either look up to or respect as your peers. So doing your due diligence before interviewing and understanding where they've come from, maybe what they stand for, is worth it. Especially when I'm interviewing, I look at the C-suite; they may have been on blogs and podcasts, and I might listen to some of their material. Are they quite open about certain, I don't know, product pillars, or the way data should engage with the wider organization, or whatever the case may be? I try and tease that out beforehand so that I know what I'm looking for. I will tell you one of the most interesting case studies I've ever seen about using ChatGPT to prepare for an interview: load the interviewers' profiles into ChatGPT, or give it a summary of what those profiles are, and then give it the right context, the right prompt: Okay, I'm interviewing for this role. These are my interviewers. I'm really hoping to succeed. Can you please tell me what kinds of questions these types of people might ask me, and can you tell me what would impress them, coming from me, in order for me to be successful at getting this role? Actually using ChatGPT for that kind of preparation is a really interesting exercise, because it forces you to think in ways that you wouldn't normally.

TIM: And I guess it's being empathetic with your audience by default, because you try to think about what they'd be interested in rather than just your own narrative all the time. But my thought process as we're discussing this, though, is that having the interviews run this way would help you select candidates who are going to get along with everyone, who are going to add a bit of value, who will know how to deal with people in the organization. But would it not also maybe overshadow your more technically strong, not-super-extroverted, not-super-friendly people? If you're trying to hire, let's say, an individual contributor, an engineer or data analyst, someone like that, where most of the job is doing the work, not stakeholder engagement and being friends with everyone, is having this kind of process almost, what's the word, adding a layer of shine or distraction away from the core skills? I'll give you a better example, because I'm waffling. I can remember software engineers we've interviewed and tried to hire. I remember one guy in particular who we loved immediately: he was so affable, with a big, friendly smile, chatting about the weekend. He liked football as well, so we immediately connected on football, but none of these things would really impact whether or not he could be a software engineer. And so we got him to actually produce some software; we got him to do basically a take-home test, which he bombed out on completely. We thought, No, we must've gotten something wrong here; let's invite him back for another interview, because we must've missed something. And we went through some of his code, went through his project, and dug into it a bit further. He just didn't have the skills that we needed in a software engineer, but we could easily have hired him based on how we thought he'd fit in, and, I really like this guy, I want him.
I wanted him to be the right fit, but in the end, our data told us not to hire him. And I can think of the opposite kind of people, who bombed out in the interview but nailed all the more objective measures, one of whom we ended up hiring because I thought, Oh, I must have missed something here, and who ended up being pretty good. They were just very introverted, very quiet, and didn't really connect in the interview, but it was fine; once they'd warmed up to us and gotten into it, they were okay. So I wonder whether, for these super technical kinds of roles, if you have a process very dominated by interactions and how well they get along with other people, it might cause us to miss the introverts. What do you reckon?

LINA: So this is actually a great question. I think there are a couple of things that you can do, and that I think companies do wrong. What they do is go out and say, We need to hire for this role, and that is the only role we need to hire for, and this is the perfect profile for the candidate. And there might be a candidate that ticks three of the four boxes but doesn't tick the fourth, and that candidate is discarded. The way to do it is to be more flexible with your candidates across your pipeline: okay, this candidate is very technical but doesn't have the communication skills; ooh, maybe they would be a better fit for that team rather than for this specific role that we are talking about. So having that kind of awareness across the pipelines and across the teams would be helpful, and the bigger the company grows, the more siloed these processes become, so people don't share possible candidates, especially when you're trying to hire for more general work, I don't know, general management, project management, even engineering management, where there is transference of skills. That's just inevitably what happens, and thinking a little bit more creatively about candidates that seem to really fit one part but not another would be helpful here. The other component that I think is really helpful is remembering that we've only just talked about the technical side and the cultural fit; that's not the end of the interviews. In theory, there should be a three-step process, right? If you sequence these things correctly, then through each step of the interview you can funnel the candidate appropriately. So if you start with the technical, then you've established, Okay, cool, this person is at this level, which means that maybe we need to consider their placement in X, Y, or Z team; let's do a cultural fit for that team.
Okay, X, Y, Z, fine, and then your last step should really be, and I think people forget about this, a ways-of-working interview. People over-index on the technical and over-index on the cultural but don't think about how this person is going to be working with these other individuals. If you have an introvert, but they're joining a team of engineers where maybe being an introvert is okay, then a peer interview with the other engineers they would be working with is going to be far more valuable than your, I don't know, manager's manager trying to do a final cultural check on that candidate, because ultimately that person needs to get along with that team in order to be able to push good work. So what I've seen work really well is that some companies have started to do prospective funnels: we know we need analysts, okay, or we know we need engineers, so you come in and you do the technicals, but then your placement in a domain, or your specialty, is completely dependent on the other stages, the cultural and the ways of working.

TIM: I'd love to hear more about this ways of working interview because I'm not sure I've heard anything characterized as that, but perhaps other people might call it something different, so what would you normally be evaluating in that interview?

LINA: So it's things like: you are working on a project together and you've got severe delays; you're not at fault, but your wider team is. How do you handle the situation? You have a very angry stakeholder who thinks that the delivery of a feature didn't live up to the brief. How do you work with your team? It's all about: you've got a problem, this is the team; okay, how do you engage this team, what is your role in trying to solve that problem, and how do you work with whoever the stakeholders are, and your wider team, to try and move away from or solve this challenge, whatever it is? So it's a more concentrated interview on who you are as a team player, especially for an IC. For a manager: who are you as a person representing your team to the wider organization? How do you arrive at solutions quickly, and how do you bring value to the business as quickly as possible?

TIM: And for these types of questions, do they also come with a scoring rubric, like great, good, okay, bad kinds of answers? Even though they're quite open-ended, it's still possible that the answers could almost be bucketed and scored in some, let's say, semi-objective, semi-subjective kind of way.

LINA: Yeah, and rather than asking very high-level questions here, having real-life situations, stuff that has happened in that particular team recently, is also really helpful. Whereas the cultural interview can be more of a conversational flow, the ways-of-working interview can be more of a Here are some issues that we faced; how would you help us solve this problem? The interviewee gets a taste of the kinds of problems that organization faces. So you might have three or four of these; they're almost like case studies, but situational teamwork, situational collaboration, situational ways of working. You can still have a rubric, but you have these three or four case studies, or problems, that you're speaking to and evaluating against, and it's really healthy there to have problems that team has actually faced, things that are very representative of the kinds of things you would be looking at.

TIM: Have you ever encountered a scenario in that type of interview where, because it's so specific to that team or organization, these are, as you say, real problems, real examples, which makes it so relevant, which is great, but someone who was at fault is part of that process, or someone whose chosen solution solved that problem, and they're really just looking for the candidate to choose the same solutions they chose? It's a common trope among some technical people that it's their way or the highway. Have you seen that, and if so, are there any ways to mitigate it?

LINA: Yeah, and this is again the bias that you're speaking about. Look, I think my solution to all this is for my teams to have extensive training, where we go through and debunk the biases. One practice that has become quite common, there's a recent Harvard Business Review article about it, is stating the biases up front before the interview, so you are aware of them even if you're not thinking about them the whole time. So having that kind of training for all of your interviewers, dispelling the biases as you go, but also having real-life examples, before you even start interviewing, of how an interviewee might act is really important. One of the sessions that we lead, for example, has your interviewers interviewing each other and trying to play certain profiles: this is a person who is very combative and totally challenges you on every question; this is a person who agrees with you; this is a person who provides the same solution as you. You go through these different scenarios, and people feel more prepared, because ultimately, when it comes to interviews, you're struggling for time, you're trying to push as many candidates through as possible, and you've got different layers of seniority working through the funnel with you who haven't always had the same training. But taking that day out with the team to go through the entire process, get everyone bought in, and get everyone as well trained as possible is really worth it, because then you have consistency in your interviewers, and you breed the culture that you want your interview process to represent. Doing that seriously, once a quarter or once every six months, just as a top-up, is so much better than watching some video on best practice in interview training. It's actually being like, Guys, you are representing the team here.
How would this process make you feel if you were going through it? And then people get bought in as well: they get to choose their peers, and they get excited about who they've hired. I think it's a really worthwhile exercise.

TIM: As you were saying that, it made me think of something that's exciting me a lot, which is that I feel there's going to be an astronomical improvement in hiring in the next six to 12 months, given the advancements in AI. One particular example is a no-brainer; it could be done immediately with current technology. Human-to-human interviews happen in a video call that is in any case being recorded, so you get the transcript, and then imagine a series of prompts for rating and ranking interviewers: which of these biases did you exhibit accidentally during the interview? A scorecard for the interviewer along any metric you can imagine. Because one of the exciting things I've noticed, at least in my interactions with ChatGPT, is that because it's AI giving you feedback, I feel like you take it on the chin, whereas if you'd done an interview and someone else in there said, Here are three things you did wrong, here are the four biases you exhibited, that's very hard to take and accept. But if it's just AI doing it in an unbiased, quote-unquote, way, I think that could be a massive improvement on the way interviews are currently done. Because, yeah, I know the training would help, but we're still humans, and it's still a meandering live conversation that could go anywhere, and the subtlest thing could derail the evaluation of the candidate. For example, I'm just thinking of people I've interviewed in the past: for me, if there's some connection to football, I'm liking that person, I know that for sure. If they're Brazilian and they like football, it's like they've got the job. And so I have to really guard against that immediately, because I know it's my weakness.
And so having some system overseeing this and going, You know what, you went down this rabbit hole talking about football for three minutes, and you were smiling and laughing a lot; did that candidate get put on a pedestal for the rest of the interview because of how positive the first few minutes were? Probably. And if I had AI telling me that, I could accept it, whereas if another interviewer told me, I'd be a little bit defensive, I reckon.

LINA: There's that, but this could also be part of the training, right? Having transcripts of the practice interviews and then feeding those into ChatGPT to identify the biases is helpful. I think anything we can do to ask the interviewers to be more empathetic but also more aware, whatever the exercise may be, will ultimately help the interview process. But that leads to my worry.

I'm not sure how many companies are putting this level of investment and effort into their interviews, because so often it's just, We've got to get somebody in, and they just need to be technical, and we'll figure it out later. And oftentimes, especially on data teams, the problem you run into is that you might hire someone who's super technical, but they can't communicate their work, or they're just not a culture fit, or they don't work in the same way the rest of the team does. And no matter how good your case study is or how great your cultural interview is, ultimately it's about making sure that you're evaluating for the right things and selecting the right candidate, rather than just focusing on the one skill set that you need.

TIM: I reckon and hope that the big unlock will be AI automating away lots of the bullshit in hiring, such that it frees up time to focus on the things that matter, and AI assisting humans to measure things that at the moment would be tediously difficult to measure. For example, writing comprehensive interview notes and scoring a candidate across 30 different dimensions is a time suck that, if you have seven back-to-back interviews in a day, you're not going to do. So I feel like that's probably a huge barrier at the moment, and if AI is in on every interview, transcribing everything, summarizing everything, scoring across X dimensions, even if it's imperfect, I'd rather have an imperfect AI score than no human score at all, because you're just drowning in work. If we can automate away most of that stuff, then the time we have left to hire could be more thoughtful, more empathetic, focused on the right areas and the value-added things. And maybe the other challenge is that the data points we need to make the most accurate hiring decision just take a long time to collect. In theory you would want the personality type of the candidate, their IQ, the measurement of all their skills, all their soft skills, all the cultural fit; all that stuff just takes so much time, which is very expensive for the candidate and the company. But maybe there's some business in a few years that pre-aggregates that for every candidate and matches it across every job, and then a candidate doesn't have to do it 30 times for 30 different companies, which is just dumb. God, imagine the efficiency saving of a many-to-many matching system rather than the candidate having to answer literally the same questions in every single interview; they're 90 percent the same, so there's no need for that. I feel like there's going to be some new data set that gets unlocked in a couple of years, and then happy days!
That's my current optimistic view anyway.

LINA: I do think we need to lean into the usage of AI in interviews as well. There is AI-assisted interviewing, which is what we've been discussing, and AI-assisted profile matching, which is what you just described, but then there is the candidate's actual use of AI. It's inevitable that these applicants will be using AI in their day-to-day work, so how do you lean in and figure out whether they're able to use it in the most productive and efficient ways? I'll give you an example. I was recently at the marketing analytics summit, and we were discussing the creative usage of AI in day-to-day scenarios, and one lady said that her child had been asked at school to write an essay. Because the teachers knew the students would use AI to write the essay, or to help with it, they instead asked them to have ChatGPT write the essay and then go through it and spot the hallucinations. What I'm trying to say is we could be doing things like this within the interview process. If there's a fear that people are using ChatGPT to solve case studies or to help with technical tasks, why not lean into it? Why not have them showcase their use of ChatGPT while solving the problem and observe any issues they're seeing, rather than saying, How dare you use ChatGPT to solve this problem? That's just not realistic. It's going to happen, so let it happen, and actually test somebody's creativity, especially if they're a super technical applicant. How would you use ChatGPT to help with this code? What kinds of prompts are you going to give it to get to a level you feel satisfied with? We need to start thinking more creatively about that rather than battling it.

TIM: Yeah, I agree it's inevitable where we're going. And why would you want to discourage a candidate from using a groundbreaking bit of technology that would make them two to 30 times more efficient in their actual job? It doesn't make any sense. We're in this weird little interim space where the technology is so new that it's a threat, but it's also an opportunity that hasn't quite fully embedded yet. This is probably just going to be like telling someone 15 years ago not to use Google, like a closed-book exam, and that idea is now gone. Or like the first time a candidate used a package or a library while writing some code and got told, you cheated because you haven't written it from scratch. Not really, because why would I waste time writing everything when there are already packages that do it for me? So I guess it's an evolution of that, but a more extreme example, because it's such a profound technology that it's not really an iteration; it's a whole new thing. So just get it out in the open. Be honest; maybe that's another theme of hiring, actually. I'd love to get your thoughts on this. I feel like hiring could generally be a lot better and a lot more efficient if both parties were just brutally honest and transparent from the get-go and there was as little back-and-forth game playing on restricting information as possible. The sooner everyone said, This is how much the salary is; this is how much I want; here are my actual skills; here are the skills we need; here's who's in our team; here's what you're going to do and not going to do; here's literally a day in the life of this role, the better. Maybe the candidate says, I don't want to do these things; I opt out. The sooner we can be more honest with each other, the better it would be, including with the use of ChatGPT. If a candidate just said, Yeah, I used this for this project.
Here's how I used it. Great! Let's have an honest conversation now.

LINA: Yeah, and you bring up a really good point. On the ChatGPT usage: you might have a company culture where, if someone isn't using ChatGPT, that shows you their work is inefficient, and what you're looking for is super efficient workers, so that could be one way of thinking about this. But on your second point, the honesty issue: sometimes it's not a case of honesty; sometimes it's a case of misaligned expectations and a lack of clarity on what the company is actually looking for. What I mean by that is that many companies struggle to have really good competency frameworks and a really detailed understanding of which role requires which skills and competencies. One of the places that does this really well is the app called Clio. I think they use a third-party system, but they list, for every single job role at every single level, what the expected behaviors, competencies, and technical skills look like, and alongside that they give visibility and transparency on what the salary range should look like. One of the pioneers in this, and this is going to show my age, was Buffer, the Twitter posting app where you could schedule posts.

TIM: We used that one. I was logged into Buffer today.

LINA: Yeah, exactly, exactly, and so they were very public and very transparent with their salaries, so at every level you saw where you were located, blah blah blah. I think it was in a giant Google Sheet. Very few companies do this, but it's funny, because all the people working there know each other's salaries anyway, or roughly where they sit. So it's a weird thing: you want to hide it because you want to protect people's personal information, I guess, but it gets out anyway. What's interesting about this approach is that if you have that very honest and visible view, say, I'm an IC2, an individual contributor level two, in, I don't know, data engineering, then I go on the website and see, okay, this is the salary I should expect; these are the technical skills I'm supposed to have; this is how I'm supposed to behave. That then allows you to self-evaluate and go, Ooh, I might be a little bit rough on this, but I'm actually quite good at that, and then you have that clarity coming into the interview. Now, the problem is that a lot of companies don't have that level of detail and don't want to give that visibility, which is fine. But the bigger problem is that a lot of interviews are not conducted internally; they're conducted by external recruiters and recruitment agencies who don't have that visibility, or don't have that level of understanding, or don't really know enough about that specific team culture and so on. And there's nothing wrong with using external agencies; you just have to invest the time to tell them exactly what you're looking for.

TIM: And I wonder whether a lot of the time it's just that it isn't committed to paper. Generating this content, this data, really thinking through exactly what the role and the day in the life look like, and compiling all that information is maybe just taxing. Maybe that's another use case for AI: a few quick prompts get you 80 percent of the way there, and you tidy up the edges rather than writing everything from scratch. If I think about the companies I've interacted with as a recruiter, you actually start with a kind of intake call where you're unpacking the role and unpacking the team and those kinds of things, but there's such inconsistency. You could have the same conversation with three different people in the company and get three vastly different answers. So I feel like behind the scenes there's maybe a lack of alignment to begin with, a lack of concrete thought around some of these areas, so they can't commit it to paper because they don't know what it is in the first place.

LINA: Yeah, and I think the other problem is that people, especially in my field, analysts, tend to be quite ambitious and want every role to be the next step up, the next step up in salary and the next step up in title, and fair enough. But what people sometimes forget is that a senior analyst in one company may not mean the same thing in a different company, so the expectation that you would always step into a more senior or higher-paid role is something we need to start dispelling. People need to start thinking about what problems they would solve, what gives them energy in their day-to-day, and what kinds of roles play to that. That part of identifying what the role entails and what the day-to-day looks like, and having that conversation up front, can help with that. There was a really interesting LinkedIn post I saw just a couple of days ago where a recruiter posted a day in the life of the person they were trying to replace, who was leaving the business, and this is what it looked like: at 8 AM I do this; at 10:10 I do this; blah blah blah. I know it was a bit of a fake version of reality, but it does give you a flavor of what you're walking into. So often your expectation is based on the role you're currently doing, which may not be at all representative of the new company you're looking at.

TIM: Yeah, and that's where the current process, which is typically just a job ad with a list of requirements, a list of skills, and a list of whatever, is sadly lacking, because the candidate has almost nothing to go on; they've really only got the job title and the company careers page. Better companies will flesh that out a little bit, but it's just so hard to know what to expect. One company we learned a lot from about being more forthcoming, transparent, and upfront with candidates was Get Your Guide. They did a really good job of packaging all this content and giving it to candidates at the application stage, which I found really refreshing. Imagine the types of things candidates would want to ask in that first interview; it's so predictable. Most candidates are going to ask the same five or six things. They're going to ask about the exact salary and remuneration; they don't want words; they want a number. They're going to ask about career progression; they're going to ask who's in the team, when they have to work, where they can work from, just the basics. Get Your Guide packaged this together really nicely, and in the document they even included LinkedIn profile links to future colleagues, the performance metrics the team was using, and exact problems they'd solved recently, all in a fleshed-out Word document. That was just amazing, because you send that to candidates and they go, Oh cool, I've now had 95 percent of my questions answered, which has the amazing knock-on effect of motivating them to finish the process. That's when you can actually be slightly more aggressive in the screening, because they already know everything they need to know; so if they like what they're hearing, they will do a test at an early stage, or they will do a long interview, because you've de-risked it for them by giving them all the information they need.
So I found that really refreshing, and we copied that exactly for our own hiring process because I thought it was just such a good way to do it, and I guess they're still doing that, so shout out to Get Your Guide.

LINA: I love that, and there are a couple of players that do this, especially bigger companies that have more time and space to invest in those kinds of packages. If it were me running the talent acquisition team, I would want my recruiters not to spend time answering those questions but instead to have that kind of package ready, and I would want the talent acquisition partner's primary OKR to be not the number of hires who got through the funnel but the number who actually got through probation. I think it should be that talent partner's responsibility to manage the expectations of the business and the applicant and then make sure the onboarding is so successful, and the first few months so successful, that the person stays for a longer period of time. Very frequently talent acquisition partners are measured only by the number of people who got through the funnel and the speed of the funnel, but they don't have responsibility for what happens after. Doing something like this, where that cultural piece is upfront and the talent partner can focus on fostering the candidate through the funnel and into probation, pays off for everyone.

TIM: Yeah, and I'm again mapping that to what a recruiter would do; TA is basically an internal recruiter. There are just so many little nuances, so many little things you need to be aware of when you're doing recruitment. You get almost a sixth sense for when a process is going to derail, when a candidate obviously has other opportunities they're discussing. You get a sense of what questions to ask and when. Oh, they haven't gotten back to me in the last couple of hours, and they're normally very responsive; hang on, I'd better check in with them. There's that very careful management of the process, which I don't think a hiring manager would have time to do, and it's probably not their typical game anyway, so the TA partner could be doing that very high-touch, very nuanced, personalized nudging through the process. And the onboarding is just as important, because if that's a shoddy process, then everything can go downhill pretty quickly from there. It's interesting, actually, that recruiters in any recruitment contract would normally have a replacement guarantee: if the candidate quits or gets fired within three or six months, you have to replace them for free, because you know that sometimes things don't work out. So I completely agree; any TA team should have that somehow incorporated into their KPIs, because it's at least a shared responsibility with the hiring manager, you would think.

LINA: Yeah, and finding opportunities for the person to meet the team and get to know the office as much as possible before joining is also helpful. For example, if we have an all-hands or any sort of activity and we have a bunch of new people who are about to join, we always invite them, because it gives them a flavor of what they're walking into: they get to meet people in person before starting, and they get to know the work a little bit more. You want to give that time to new people joining so that they feel like they can hit the ground running as quickly as possible when they join.

TIM: Lina, I gave a shout-out just a second ago to Get Your Guide. Is there anyone you'd like to give a shout-out to, a person or a company in the data or hiring space that you've learned a lot from or that you really admire?

LINA: I will give a shout-out to an agency that I've worked with quite a few times. They're London-based; they're called Burns Sheehan, and they do exactly what I've just described: they really focus on managing the applicant's expectations right up front and then fostering the applicant through every step of the process. One of the cool things they've done is build up a bank of candidates: okay, I know all of these heads of data in London; this person is a much stronger technical head of data; this person is a much better communicator; this person is a lot better at this. They might be at different stages and at different companies, but when a new job is briefed in, rather than starting the search from scratch, they go, Ooh, okay, the fit for this role is these skills, and I've got this bank of people who are actually quite a good fit. They've turned recruitment on its head, because they keep in touch with everyone who has ever gone through the process and really seem to care about their career progression. I just love that, because it's so rare to see an agency that puts that much effort into their applicants.

TIM: Lina, thank you so much for your time. It's been a great, interesting, slightly meandering, and engaging chat today, so thank you so much for joining us on the Objective Hiring Show.

LINA: And thank you for having me.