In this episode of the Alooba Objective Hiring podcast, Tim interviews Shafeeq Ur Rahaman, Associate Director of Analytics at Monks, about the transformative potential of AI in the hiring process. They explore current challenges such as bias, ethical concerns, and the overwhelming volume of job applications. Shafeeq shares insights on how AI can enhance resume parsing, skill assessment, and candidate feedback, while also addressing the limitations and necessary precautions. The conversation delves into the pros and cons of using AI for automated feedback, improving the candidate experience, and ensuring fair hiring practices, and Shafeeq emphasizes the importance of a balanced approach that blends AI with human oversight to achieve better hiring outcomes.
TIM: Shafeeq, welcome to the Objective Hiring Show. Thank you so much for joining us.
SHAFEEQ UR RAHAMAN: Hi Tim, nice to meet you, and nice to be here.
TIM: Yeah, it's great to have you on the show. I'd love to start by discussing AI, probably the two most-said words, or letters, of the last year or so. Within the context of hiring, I personally feel there are going to be some fairly dramatic changes over the next few years in how we hire staff. To begin with, I'd love to get your sense of how you see candidates currently using AI in the hiring process, and whether you've started to dabble with AI on your side when hiring people.
SHAFEEQ UR RAHAMAN: Yeah, from the perspective of hiring and automation, no, my organization hasn't really used AI yet. We are exploring it in other avenues, but hiring is something that I think is going to take some time. We might start small, maybe with resume parsing, or with assessing candidates' skills and filtering candidates. That's how we are planning to take it, because recruiting always involves a human, interpersonal touch, right? And there's also the ethical challenge of bias and fairness. To deal with that, I think we'll be starting small, but as of now we haven't fully incorporated AI into our hiring process.
TIM: Yeah, I feel like that's where most companies are at the moment. What I am hearing a lot of is that the candidates have certainly started to adopt it. What have you seen from that side of the market?
SHAFEEQ UR RAHAMAN: Yeah, from that side of the market, it's really difficult now to filter candidates, because people are using GPTs or LLMs to make their resumes a hundred percent match, which is literally not possible. When you see a couple of resumes where the roles and responsibilities match point for point, that's how you figure out the candidates are just trying to game the process and squeeze through. That's how I feel from the candidates' side. But on a positive note, I think AI could make the candidate experience better, especially on the feedback side. The often-neglected part is the candidate side; they often don't get clear feedback on what went wrong or what went well. So maybe the use of AI on the candidate side would really revolutionize the process for them, make it faster, or provide them with better feedback.
TIM: And when you say feedback, is this feedback at the application stage, or feedback post-interview? Where are you thinking?
SHAFEEQ UR RAHAMAN: Yeah, the most common thing we hear across candidates is that they're most often ghosted by recruiters, right? That's their most common point of concern. Some candidates have nailed the interviews; they rocked everything, the whiteboard sessions, the whole application process, but they still don't get any feedback, or they just get a generic email saying, okay, we are going forward with a better candidate. If they could get some kind of feedback, okay, these things went well, these were the areas where we found someone stronger, then I think that would really help them going forward, right? They can apply for a different role within the same organization. So yeah, it's going to help them and help us.
TIM: Yeah, I feel like this is one of the big opportunities to improve the process, as you say, especially if we automate the early screening stages based on CVs, applications, and what have you. Presumably, if we were to do that, the tool that evaluates those applications, scores them, and decides, here are the three candidates you should be interviewing, could have that same feedback collated and given to the candidate: here's where you came up short; here's where you were strong; here's how we evaluated you. That could all be done pretty trivially. I don't think there's a technical challenge in doing that, with where large language models are at the moment, so I guess it's just a case of waiting for the applications to be built to incorporate it, and then maybe some openness from companies to actually do it. Right now there are companies interviewing people that have the feedback internally, they've added notes to their HR system, but they're deliberately not sharing that with the candidate, either for fear of legal repercussions or because it's a clunky extra manual step. So I wonder whether, even if we have the means to do this more easily, some companies might still be reluctant to share it with candidates. What do you think?
SHAFEEQ UR RAHAMAN: Yeah, that's true. There is always a point of legal concern, and that's why we have the recruiters and HR involved; they filter things and act as a bridge between us and the candidates. Maybe in the future, if we are using AI and everything gets better, if the systems, or the data sets the tools are trained on, are good enough to avoid unconscious bias, then we can also tell the candidate exactly how AI is being used in their interview journey. Then I think it would make sense. But at this present stage we are not at that point, so yeah, I understand the reason we cannot share it openly, even if we want to.
TIM: I also would love to delve into the bias fears in a bit more detail, because I think it's perfectly legitimate to be concerned that we're about to unleash some technology that makes automated decisions, and if that isn't monitored and controlled, God knows what could happen. My view, though, in hiring in particular, is that the current way hiring is done is already so biased and unfair with humans that I don't see how implementing AI could make it worse. I'll give you a specific example. I'm not sure if you've seen these studies, but in different countries around the world, researchers apply to roles with CVs and try to measure the impact of having a certain name on a CV. There was one in Australia a couple of years ago: University of Sydney researchers had thousands of different CVs, which they grouped into three buckets. The first bucket had Anglo-Saxon first and last names, the second had an Anglo-Saxon first name and a Chinese last name, and the third had Chinese first and last names; otherwise the sets of CVs were similar in terms of skills, experience, and what have you. The only difference was the names. They then applied to thousands of different roles in Sydney, Melbourne, and other cities: senior roles, junior roles, in engineering, data, marketing, admin, a whole variety. Then they measured the callback rate, either a literal callback on the phone or an email back, some indicator from the business saying, yes, we want to interview you. This is a pretty good experiment, I think, because they controlled for so many different variables. For the first group, the Anglo first and last names, there was a 12 percent callback rate; for the third group, the Chinese first and last names, there was a 4 percent callback rate. So the study seems to imply that, all else equal, if you apply to a job in Australia with a Chinese name, you have only one-third the chance of a callback compared with applying with an Anglo name. If they thought of everything and controlled for all other reasonable variables, that is a staggering statistic, one that, if I were Chinese, I would be pissed off about, because that is blatant racism. Call it unconscious bias, or just call it racism; who knows how consciously people are making these decisions? It might be a combination of both. But because that's the starting point of how unfair hiring can be, I'm struggling to imagine how a well-designed and controlled AI system, one that is at least transparent, measurable, and defensible, could make that worse. What do you think? Do you feel like there are ways an AI could get out of control that go almost beyond that?
SHAFEEQ UR RAHAMAN: As you mentioned, the bias is definitely there even now in the current interview process; the statistics you mentioned show it numerically and quantitatively. The AI tool that could help us is also dependent on the data that is fed into it, and that data is whatever the real-world data is. If the data is balanced, then the tool is good enough, but the real world is also imbalanced, right? There is always bias whichever industry you go to. There are niche industries that only accept people from certain demographics, and there is a certain mindset that people from these regions are good for tech and people from those regions are good for finance. It's similar in the US: finance is largely associated with Asian or Chinese demographics, and tech is considered more Indian or Southeast Asian, something like that, right? That's the preconceived notion. AI can definitely help resolve that, provided the tool is safe and sound to use and well vetted. And even once we use the tool, if we have periodic audits of how it is performing and what kind of data is being used, those periodic human-centric checks would help make the decision process even less biased. So I think the tool will definitely help going forward, and that's where we should be heading to get out of this situation, but there should be a gate or some kind of mechanism that keeps human-centric decision-making involved as well.
TIM: Yeah, that oversight and checking process is so important, and again, I feel like this is where an AI system, or any system that does this in an automated way, can actually be audited in a way that a human one can't. In the way most companies currently do hiring, they get an application, they get a CV, and it's a human in an applicant tracking system clicking reject or send to the next stage. But at no point is that person ever scrutinized: why did you make that decision? What was it about the CV that you didn't like? Why did you choose to reject this candidate? We don't have any of that oversight at the moment. At least if this were mechanized in some kind of automated system, that system has specific logic and rules, could be transparent, and could be investigated in a way that you could never scrutinize what a human is doing. So I wonder whether this is going to be another huge unlock, in that we'd even have the ability to have a level of audit that we currently don't.
SHAFEEQ UR RAHAMAN: Yeah, that's definitely true, because we have what you'd call the four-second rule, right? Whether you catch the eye of a recruiter or not, that's really how everything works right now in the industry; the recruiter has about four seconds to view your application before tossing it into the bin. An AI can definitely enhance that and provide the rationale: how many candidates you received, what percentage of candidates didn't qualify, and what percentage did. But it could still miss some things that are hard to train for, because it's not just about technical skills; it's about soft skills as well, right? Technical skills are something you can teach someone, but soft skills are something you have within you; it's really difficult to train someone in them, it requires a lot of oversight, whereas technical skills …
TIM: I was just on a call with someone half an hour ago who mentioned the last role they put up for a data engineer. This is in the United States. They received 3,000 applicants. They said they only managed to look at the first 250, and then they had to stop; they literally did not have time to look at the next 2,750 applicants. So at least in these current market conditions, maybe that's also an upside of automation: everyone gets an opportunity. It's not that there's a biased human looking at the CV; at the moment 90 percent of them aren't even looked at, so if you'd applied a day late rather than a day early, you just didn't have a chance in this particular example. And I'm hearing of more and more companies being flooded and inundated with applications, not really having the technology at the moment to deal with that, and certainly not the people either. So maybe some automation is needed, not for any bias reasons, but simply because they literally cannot get through the volume of applicants they're currently experiencing.
SHAFEEQ UR RAHAMAN: Yeah, I totally agree with that, because, frankly speaking, that's how we have been reviewing as well. There is only a certain point up to which you can review the applications you receive, and you have the time factor, time to hire, quality of hire, all those metrics you have to think about. So you have to go about it in a balanced way; you cannot just keep reviewing applicants. Some kind of automation will definitely help, speed things up, and help meet those metrics, right? As a human, you cannot scrutinize more than maybe a hundred applicants; it's just literally not possible. Even if you're a full-time recruiter, the recruiter just sends it to the hiring manager, and the hiring manager doesn't have the time to go through all those applicants. So, as you said, I agree it should be there, especially in this current market where every role being posted receives a hundred to two hundred applications within the first day, or sometimes within the first hour.
TIM: Yeah, we'll have to find a way to deal with that volume and the bias. The other thing you touched on before was that the CVs are looking increasingly "good", in inverted commas, in terms of their match against the job description, almost unbelievably good. I'm hearing more and more that the CVs seem to match the JD more closely, and that the CVs are starting to look more like each other as well. That's what I'm hearing a lot: it's hard to pick from this sea of very good-looking CVs, but we know that what's on a CV doesn't necessarily represent the truth. So I wonder whether, even if we automate this, even if we deal with the scale problem and the bias problem, we're still left with the fact that we're making a decision based on pretty crap data, which is someone's opinion of themselves that they put into a CV. Where does that go then? Do we need some new way to do the screening step, or, to put it better, some new data set?
SHAFEEQ UR RAHAMAN: Yeah, eventually that boils down to some kind of screening or assessment you give to the candidates who look like a good fit, at least on paper, because you don't have the time to literally do those whiteboards or technical and screening interviews with each of them. So maybe having a tool with those screenings built into it helps you filter candidates, especially in this market where you get increasingly good CVs that are very strong on paper but hard to judge just by looking at them.
TIM: Yeah, I suspect that's where it will go, because I personally feel part of the issue with the early stages of hiring in particular, say a CV screen and then a high-level HR behavioural interview, is that both are quite heavily reliant on just believing what the candidate is saying. There's not really any validation of exactly what they've done; you haven't seen them do work, you haven't seen evidence of it, and I feel like that's a root cause of a lot of the inaccuracy. We just have to take their word for it until maybe they get into that whiteboarding session, or you give them a take-home project or a test, and then you start to see the reality. But the level of commitment and effort it takes to administer that, and for the candidates to do it, makes it quite challenging. Maybe in this market you can get away with it because you've got so many applicants, but three years ago, at the peak of pandemic hiring, when you had so few applicants, you couldn't ask them to validate everything, because they would drop off and go to another job; they had so many options. So it's interesting how the market changes through time and the pain points shift a little, and the hiring process probably has to adjust depending on the market you find yourself in. Have you found that, if you think back over the last few years?
SHAFEEQ UR RAHAMAN: Yeah, that's true. The hiring process always adapts based on the market scenario. If we come back to the pandemic situation, if there were a similar situation where we end up with fewer candidates, then obviously the level of scrutiny comes down, because it's really difficult to find a candidate. We cannot have five or six rounds of interviews with two technical, two managerial, and two behavioural rounds; all of that stops making sense. It just has to adapt to the market scenario. It should be flexible enough that it can adapt and move forward.
TIM: Are there any things in the process that, regardless of the market, you would recommend a hiring manager do, things that just make the process unambiguously better, whether it's a tight candidate market or a loose one? Things that aren't really a trade-off decision, where it's just better to do them that way? Anything you've seen work really well?
SHAFEEQ UR RAHAMAN: The way we interview is, obviously, if the market is similar to what we have right now, we definitely have more screening and a more technical route. But I think the thing worth doing either way is what we do: a solid screening round that is just a go/no-go kind of decision, done by two people. They vet it together; the more eyes, the better, so there's less bias in the screening decision. We run a screening round that is valid in either market scenario and take the candidate forward based on the result. The results aren't purely judgmental; it depends on the people reviewing the screening rounds. If I screen, I provide a rating; we have a scaling system like red, yellow, green, where green is a definite yes and red is a definite no, and we discuss it among the team. That's how we do it. Right now that kind of screening involves a human, right? If it can be automated, this kind of screening system can be adapted to either time, whether the market is high or low. I think it would at least help with the initial push: you have a solid subset of candidates, and from there, depending on how the market is, you can scrutinize more deeply and get your exact solid candidate, or you can get someone you're okay with, someone you're open to training, open to spending time bringing up to speed, ramping them up, speaking to the client, and setting deliverables based on their skill set.
TIM: You mentioned that screening step and having two people in there, which helps get a broader perspective on the candidate. What in particular are you looking for? Are you covering off technical things, soft skills, cultural fit? What does that process involve?
SHAFEEQ UR RAHAMAN: Yeah, initially the focus is on technical skills. We look at the technical skills, and only once the candidate is called in for an interview can we assess the soft-skills part. Initially it's just the technical skills, but from the perspective of people who wear different hats and have different expertise. A person with expertise in a certain technology might be very hard on how the candidate performs with it, but a person with a background in multiple flavours of technology might be able to grasp the logic the candidate is putting forward and appreciate the logic rather than the technology they're using. Having those multiple sets of eyes helps in that way. Soft skills obviously require some kind of interaction with the candidate. Recorded sessions might help, but we always prefer a live interaction with the candidate to better assess and understand how they perform in different environments.
TIM: And in that step, are you trying to replicate their interactions with a client? Do you almost stand in as the client and give them a brief, with them as the consultant? How do you evaluate that piece?
SHAFEEQ UR RAHAMAN: Sure. We look at how they think when they're approaching a problem. We look at past projects and past challenges, for example: how did they approach the problem? What did they face? How did they solve a team-dynamics issue? How did they handle collaboration and teamwork? Are they open to adapting to new cultures, new work, and new environments, or are they more of an individual contributor who doesn't prefer working in a team environment? Those are the kinds of things we look at. If it's a live session, we merge the technical interview and the soft-skills interview by asking them to share their thought process while they approach the technical problem, how they think through it, and we see how they perform under pressure, how they respond to it, and how they keep their nerves calm.
TIM: Again, that would replicate their real work in front of an actual client. I've heard of some consultancies that, when interviewing a potential consultant, will sometimes give them a deliberately vague requirement, like a real client would, or a confusing brief or a contradictory request, and see how they deal with that: whether they have enough gumption to push back. Is that part of what you're testing for as well?
SHAFEEQ UR RAHAMAN: Of course. Our assessments always involve incomplete or vague requirements. We are always looking to see how the candidate thinks and how they try to get to the answer by asking the right questions. Getting them to think and have that interaction is not just about solving the problem and arriving at the right solution; it's the process that matters, their thought process and how they arrive at it. That's always needed in any screening we do.
TIM: In those interviews, because we were talking earlier about feedback, and maybe AI making the provision of feedback simpler, can you imagine an AI system being an extra interviewer, or just being invited to the call to summarize what the candidate said or to summarize your feedback? Can you imagine the feedback step from an interview getting easier with AI, if you compare the way you currently do it with what it could be in the future?
SHAFEEQ UR RAHAMAN: Yes, definitely that would help both the candidate and the recruiter. From the recruiter's point of view, it helps them summarize the candidate's statements, how the conversation went, how the interview moved through the candidate's points, and it summarizes the recruitment steps involved in the conversation, right? So yeah, it's a win for both sides. It can take notes, capture every part of the conversation, highlight the key points, the past projects, the challenges, something that might otherwise have been missed after the interview. That could be a real value-add passed on to the next interviewer down the line, and it might even make or break a situation, so I think that's definitely helpful.
TIM: Yeah, I feel like this is an absolute no-brainer at the moment, because I don't know about you, but when I was doing interviews pre-AI, pre-large language models I should say, I would often do a lot of interviews in the same day, maybe four or five different candidates, and try to get through them quickly. I'd be in the interview trying to remember the questions I wanted to ask, asking them, listening to the answers, engaging in follow-up, trying to focus on the candidate as much as possible, keeping an eye on the time to make sure I didn't run over, and taking notes on what they were saying. That is such a cognitively complex thing to do. And some of the tools out there for sales calls and that kind of thing are already top-notch; they already do a great job of listening in, doing the transcription, the meeting notes, the summaries, the action items, so the tech is available. It's such a no-brainer, I think, because otherwise so much information is lost if you're just relying on a human to manually write down notes. How many notes can you really take? How accurate could they be? Did you really capture everything? So it feels like surely this is the way we should be doing it.
SHAFEEQ UR RAHAMAN: Yeah, that's certainly true, because at one time, as you said, we'd be interviewing five to ten candidates. It depends on the role, right? Sometimes you need to fill the role quickly; sometimes you dig in and drill down through candidates until you find the right one. Either way, you have to interview multiple people, and when you have multiple candidates and you're working as well, it's really difficult to keep in mind a conversation that went on for an hour. Towards the end of it, you need a gist: all the good things, the highlights, what happened, what was really a value-add. So I think it would definitely be helpful; it is really painful, or really difficult, to capture everything that goes on with each candidate. Even though you might be asking similar questions just to gauge each candidate's responses, it's still difficult to keep track.
TIM: I spoke to one company this week that had hacked together an integration between Claude and Zoom transcripts: they extract the Zoom transcripts from their interviews, pipe them into Claude, and use Claude to do the summarization, but also to attempt to score the candidates based on the answers they gave to the questions asked in the interview. They're not fully automating it; the scores are there for them, and a human still reviews them and decides whether they agree with the scores or not. But again, I feel like that's certainly better than nothing, or than relying on a human to try to do it, or a human inevitably not doing it. A lot of the time, when it's AI versus the human way of doing things, we forget that we're not currently scoring candidates or writing interview notes perfectly; a lot of the time it doesn't happen at all because it's so clunky to do. Having even an imperfect, slightly biased, slightly misleading AI summary is better than no summary, in my view.
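[Editor's note: for readers curious what the kind of pipeline Tim describes might look like, here is a minimal sketch in Python. It assumes a Zoom transcript has already been exported to a text file and uses the Anthropic Python SDK; the model alias, prompt wording, question list, and scoring rubric are illustrative assumptions, not the company's actual implementation.]

```python
# Minimal sketch: summarize and tentatively score an interview transcript.
# Assumptions: the transcript is already exported to a .txt file, the
# Anthropic SDK is installed (pip install anthropic), and ANTHROPIC_API_KEY
# is set in the environment. Model alias and rubric are illustrative only.
import anthropic

QUESTIONS = [  # hypothetical interview questions
    "Walk me through a recent data pipeline you built.",
    "How did you handle a disagreement with a stakeholder?",
]

def summarize_and_score(transcript_path: str) -> str:
    with open(transcript_path, encoding="utf-8") as f:
        transcript = f.read()

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    prompt = (
        "You are assisting a hiring manager. Summarize the interview transcript "
        "below, then for each listed question give a 1-5 score with a one-line "
        "justification. A human will review these scores before any decision.\n\n"
        "Questions:\n"
        + "\n".join(f"- {q}" for q in QUESTIONS)
        + f"\n\nTranscript:\n{transcript}"
    )
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # assumed model alias
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[0].text  # draft summary + scores for human review

if __name__ == "__main__":
    print(summarize_and_score("interview_transcript.txt"))
```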
SHAFEEQ UR RAHAMAN: Yeah, that's definitely true. There should be some tool that can help us with this, and having a tool that provides these kinds of summaries is often better than having no summary, as you said. It's also about passing things down the line: it's rare that a candidate speaks to just one interviewer and gets hired; it almost always involves at least two rounds of interviews, and having a proper summary to pass to the next interviewer helps greatly, rather than just bullet points you've scribbled down somewhere to be shared.
TIM: Yeah, the last time we did research on interviews, which was not that long ago, maybe two years ago, 20 percent of interviewers were writing handwritten notes, literally pen and paper, so obviously those notes are going nowhere; they're not even digitized. So the upside here, I think, is enormous. But speaking of the different things you're looking for, what about the cultural-fit angle? This is something I feel is a challenging area, especially as a data person. I'm sometimes a little suspicious of things that sound immeasurable, a little more feelings than facts. I'd love to get your thoughts on cultural fit in general, and then: is there ever, in your view, a clash between fitting someone into a culture and promoting diversity at the same time?
SHAFEEQ UR RAHAMAN: That's a great question. The way we try to interview is to look more for a culture add than a culture fit, so that we're bringing people with different perspectives and a different thought process into our culture. Of course, there is always a difference in work culture between different fields. Suppose you come from the finance industry and all of a sudden you jump into digital marketing, which is all fancy and completely informal, whereas finance is completely formal, a suit-and-tie kind of approach. There's always a difference in culture, but we try to think from the perspective of culture add: how this person can bring a different perspective to the team, rather than trying to mold them into a singular framework. That's how we try to approach it.
TIM: And I'm interested in how far that stretches. For example, in your team there must be a certain current balance, a certain equilibrium in the way you interact with each other and in what you'd consider the better and worse ways of approaching the work. I could imagine a candidate coming in with a dramatically different way of communicating. Maybe they're very disagreeable; maybe they disagree with 95 percent of what you say; maybe they're unusually direct in their feedback; maybe they almost try to flip the interview around and come at you. For a lot of companies that would be an outlier on the culture. You could argue they add, or you could argue they subtract. So how far do you think this goes? Is there still a general pattern you're looking for, or do you want someone who's genuinely going to shake things up a bit?
SHAFEEQ UR RAHAMAN: Oh, I think it's a balance of both. It should match the core values of the company, or the culture he or she is coming into. As you said, it's definitely an outlier situation: a person who is going to shake everything up from the ground up is going to be difficult to work with sometimes, but a person with a mindset of adaptability, who is open to hearing others out and open to taking on new things, that's totally acceptable, totally fine. That's what we're looking for. Especially among analysts, there are people who are very open and broad, who share their findings and present them, and there are some analysts with more of a heads-down style, who just deep-dive into the problems and don't really like to be in the limelight. Different people have different perspectives, so the question is how this person can bring a positive change to the culture within the organization without drastically shaking things up. That's what we look at: obviously technical skills come first, but also the soft skills, with the culture in mind, culture fit alongside culture add. That's what we're looking for.
TIM: I'm interested in the cultural angle. Have there been recurring patterns where candidates have fallen down? Are there certain things that make you go, they're not going to work in this environment? Common reasons why they might fail that stage?
SHAFEEQ UR RAHAMAN: Yeah, from time to time we come across candidates where it depends on the project, and it always depends on the project you work on. Sometimes the project involves a client that's heavy on communication: they're always on top of things, they always want to know everything, they want calls, they want meetings. And sometimes it's the opposite: the client is chilled and laid back, just give me the result and I'm happy. Some people can thrive in one of those situations and just cannot take the other, so that's the kind of situation we have always faced. We have always tried to help the candidates by setting expectations with the client, and by ramping up the onboarding period if that situation ever arises, based on the feedback from the client and from the candidate. That's how we try to circle around and close the gap.
TIM: So you touched on an interesting point there, which is that you're hiring a candidate, but because you run a consulting business, you're placing them with a client. So then is it a case of trying to fit them into your team's culture or your client's team's culture or a combination of both?
SHAFEEQ UR RAHAMAN: That's a good question, and it's a difficult thing to answer, because no project is going to last forever. Even though we hire a candidate with a particular project in mind, they might be working on it for six months, maybe six years, and then they might have to shift to something else. So we look at it from a broader perspective, somewhere in between our culture and the client's culture, and they can adapt based on that. If they have that adaptable nature within them, that's how they can fit in; they can put on multiple hats, right? In consulting, it's always about shifting around, jumping into different work environments, different clients, and different timelines with short deliverables.
TIM: So adaptability is key. With a lot of companies, when they think about a cultural-fit step, I wonder if they don't give candidates enough credit for their adaptability to a new culture. I'll give you an example. I was recently traveling. I live in Sydney, and I was traveling to vastly different places: Bangkok, Berlin, and Riyadh. I'm not sure I could pick three more different cities in the world, and in each of them, you get there and you're a bit of an outlier. You don't automatically behave exactly the way people there do, but you cotton on pretty quickly and start to almost blend into the way things are done there. For example, something as simple as an interaction in Thailand, maybe at a restaurant or a cafe: you don't just take your coffee and walk away; there's a little bow and a slightly more grateful thank-you, with a bit more sustained eye contact than you would have in Australia. Initially you're adjusting to that, but then you adjust and become part of the culture; unless you're a very disagreeable or very narrow-minded person, you blend in a little bit. So I wonder whether some companies with a very clear culture, when they evaluate a candidate in one interview and look for X, Y, and Z, and feel they don't quite see it, are missing the fact that the candidate could adjust, maybe within a couple of days, given the opportunity. Do you feel like companies are sometimes a bit too narrow-minded about culture, a bit too fixated on exactly what they're looking for?
SHAFEEQ UR RAHAMAN: I guess that could be true, because especially in the consulting industry, the way they hire is to try to match the candidate to the client's environment and the client's culture. They emphasize whether the candidate can bring it immediately, rather than giving the candidate a chance and assessing whether the candidate is open to adapting. It all comes down to the interview: assessing whether the candidate has that nature and giving them the chance. But more often it's about bringing someone in and ramping them up immediately, hiring and hiring, and continuing that approach, so I think candidates are often overlooked and left out even though they have the ability to adapt.
TIM: Thinking about it now, I imagine that if you're hiring consultants, you have to be maybe even more sure they're the right fit and the right candidate. If you're a company that's not a consulting business and you hire someone who turns out not to be the right person, you realize it after a few months and you can let them go; it's a very unenjoyable experience for everyone, but it can work. But if you're a consultancy, you've placed the candidate with clients, and there's potential reputational damage if they're not the right person. Do you feel that means that for consultants you have to be super, super sure they're the right person? Is there an extra element there, do you think?
SHAFEEQ UR RAHAMAN: It depends on how risky it is, or how high the role is. As per my thought process, if the candidate is in a pretty high decision-making role, then yes, it's all about reputation. If you're hiring an analyst, there is always scope: it's really difficult to find an analyst who perfectly fits a culture, and analysts keep jumping around across different domains and cultures; their core and their thought process remain the same, but they have the ability to adapt because of the nature of their work. So I believe it depends on the role, if you ask me how risky it is; the higher up the role, the more risk there is for the consultants who are going to work with the client.
TIM: And I guess part of that de-risking process is, again, having a thorough evaluation of the candidates. What are your thoughts on take-home assignments? Do you do those? Do you do them in combination with live interviews, maybe a whiteboarding session? How do you think about those different tools?
SHAFEEQ UR RAHAMAN: We do a combination of both. Take-home assignments are great; they give candidates the ability to showcase their skills without the pressure of live performance, while whiteboard sessions give you the ability to assess how candidates perform live. Both are good, and a combination is preferred, but it all depends on the market you're in. If the market is as it is right now, then it's definitely a combination of both: to find the right candidate, to get to know their skill set, to give them the opportunity to show their entire skill set, and to assess how they think under pressure, how they interact, and how they collaborate with the team. So yeah, a combination of both is how we take it up. But from time to time, depending on the role and the timeline we need to hire in, we mostly go with the whiteboard sessions. They're quicker for us to assess, because take-home assignments can be overdone, and not necessarily by the candidate themselves, right? You can take someone's help; you can just use a GPT and sort it out, especially in this scenario.
TIM: Yeah, and they typically also extend the time to hire, because most companies would give candidates maybe a week to complete it, whereas with a whiteboarding session the candidate is ready to go; you can book them in tomorrow, get through it in an hour, and then you're both done. And I feel like, for a consulting business, that step makes even more sense, because so much of your job is thinking on your feet, talking to clients, and doing things in real time, whereas for an analyst inside a company that's less common; you're going to be doing more deep-focused work that's aligned a little more with the take-home assessment. So it probably depends on the type of business you're running as well, and, as you say, the market conditions determine what you can get away with, in a sense.
SHAFEEQ UR RAHAMAN: Yeah, because with take-home assignments, you give them one week, and imagine you have five or seven candidates; each of them may not be at the same stage, right? So you have to wait for all five candidates to complete it. With a whiteboard session, you can have three candidates one week and two the next, and you can make the decision there. And you have a lot more valuable data, or valuable insight, on each candidate: how they're performing, what their skill set is like, soft skills and technical skills. So yeah, the whiteboard is always a definite go-to for us.
TIM: And Shafeeq, as one final question: if you had the proverbial magic wand and could somehow fix the hiring process, what would it look like, and what would you do?
SHAFEEQ UR RAHAMAN: That's a great question. I think having an AI tool that could serve both ends of the process would be my immediate magic-wand thing. On the candidate side right now, there is so much missing, so much lagging innovation, that candidates get no candid feedback or good experience. A lot of the time candidates get ghosted, and it's not that the recruiters don't want to respond; the recruiters just have too many candidates on their plate to provide each individual with detailed feedback, and the same goes for the hiring manager. So having a tool that can do that on behalf of the recruiter, the hiring manager, or anyone else would be great. In the same way, for the recruiter, providing detailed insight upon interview completion, including some kind of soft-skill assessment based on the interview process itself, a mix of everything for everyone, would be great. That's what I'd look forward to.
TIM: Yeah, I'm looking forward to that as well, and I feel like we're not that far off. AI sometimes feels like magic, and I think current versions of large language models can attack and solve lots of the things you mentioned. We just need to wait six to eighteen months for the application layer to catch up, plus some behavioural change in organizations to adopt it, and some legislation as well. But I feel like we're so close to the magic-wand scenario you've painted, so I'm excited. Shafeeq, it's been great having you on the show. You've shared some wonderful insights with the audience. Thank you so much for joining us.
SHAFEEQ UR RAHAMAN: Thank you, Tim. Thank you for having me.