In this episode of the Alooba Objective Hiring podcast, Tim interviews Pratik Ambhore, Data Engineering Expert
In this episode of Alooba’s Objective Hiring Show, Tim interviews Pratik, head of data engineering. The discussion focuses on the impact of AI in the hiring process. Pratik shares insights on how AI is used to streamline candidate sourcing, the pros and cons of relying on AI-generated match scores, and the ethical considerations of using AI during interviews. They also explore the balance between leveraging AI tools and retaining human judgment, especially in evaluating problem-solving abilities and adaptability of candidates. The conversation touches upon strategies for making the hiring process more objective and the potential future role of AI in providing actionable feedback and enhancing hiring practices.
TIM: Pratik. Welcome to the Objective Hiring Show. Thank you so much for joining us today.
PRATIK: Yeah, happy to be here.
TIM: It is our pleasure to have you here. And it would be great if you could start by giving our audience a very brief introduction about yourself and the kind of work you're doing at the moment.
PRATIK: Yeah, sure. So I'm Pratik Ambhore. I've been working as head of the data engineering practice at Velescio Technologies for the past five years. Before that, I did a bunch of different things in software development, whether it be mobile app development, backend work, or database performance optimization and things like that. Overall, a varied career, but for the past five years I've been mostly focused on data engineering, AI, and all the related aspects.
TIM: Nice, thanks for sharing that with us. Where I'd like to start the conversation today is a discussion about AI. It's obviously the hottest topic, maybe in the history of the world; you could argue it's probably the most talked-about thing ever. And where I'd like to focus that discussion is its potential impact on hiring in particular. So I'd love to hear your high-level thoughts first: have you started to dabble with AI as a hiring manager? Have you started to see candidates use it on their side as part of their application process?
PRATIK: Yes, we have seen a lot of adoption of AI in different parts of the process. All the different SaaS applications you use on a day-to-day basis have an AI agent built in now, right? We use it a lot in different aspects, and I have seen candidates use it as well. To give you a concrete example, we use it simply to source candidates better. Previously, you would have to go on different hiring portals and search for specific keywords and things like that. But with the advent of AI integrations, it has become very easy to source candidates within your ATS portal, and that helps make the sourcing process easier. But again, there are downsides. A lot of the time AI doesn't recognize many of the things which you would like, and so, rather than relying completely on AI, you also try to go the traditional way wherever required. From the candidate's perspective, more often than not I've seen people try to use ChatGPT during interviews, which sometimes defeats the purpose. But we have also taken it in a more positive way: we want candidates to have a good thought process and problem-solving aptitude, and I think knowing how to Google correctly is very important, specifically in software development. So many times, if we are doing, say, design-related questions or something more scenario-based, we often tell candidates, feel free to use Google, feel free to use ChatGPT, because we want to see how they're thinking and what their thought process is. And if they search for the right keywords and the right terminologies, that helps a lot when you are trying to figure out whether the candidate is going to be the right fit from a thinking-process point of view: do they understand the fundamentals, do they know the terminologies, and all of that?
TIM: The first use case you mentioned on your side was using AI to simplify the sourcing process, and you mentioned that was sitting within an ATS. So is it the case that some of the ATSs you've used have already integrated AI functionality to help search over the existing applicants? How does that work exactly?
PRATIK: It is both ways. There is a way to search over the applicants who have directly applied, so the ATS basically has the list of candidates who have applied to your job listing. Some ATSs also have integrations with different portals such as LinkedIn and Monster, and you can just type in: I'm looking for a candidate with this kind of experience, they should have done something like this before. It goes through all the candidate resumes in those portals and then gives you a shortlist. And that's something where we have found it is much faster and easier to source profiles directly as well.
TIM: And does it then allow you to message those people? Like, has it got a sort of email tool integrated?
PRATIK: We have communication templates for our recruitment team. Once a candidate has been identified and shortlisted by looking at the resumes and all of that, the tool allows them to communicate with the candidate and to maintain a whole timeline of everything that is happening with that particular candidate as they move through the hiring process.
TIM: And I'm wondering, what is the main value-add or point of difference with AI here as opposed to, let's say, three years ago? Is it the case that it's just a better tool to search over unstructured data and provide the shortlist, because it's consuming, I'm guessing, the candidate's title and history and things like that?
PRATIK: Yeah, it searches through the whole resume: the title, the description, the keywords they have mentioned, and all of that. But we consider it an assistant that does all the legwork rather than the sole thing that drives the process. It is more like, you can't go through a hundred resumes in a day, that would be too tiring and boring, but that is where AI comes in and helps you look for something which you might otherwise miss when going through resumes.
TIM: And at the moment, in this particular tool you're using, does it make any attempt to score CVs, or say, well, this is, I don't know, an 80 percent match against this job description, or anything like that?
PRATIK: It does, but we do not actually rely on that, because we know it's not really good enough. I've had candidates with, say, a 5 percent matching score, and I've also rejected candidates with a 95 percent matching score. So it's an indicator for the recruitment teams, who may not be very technical, while sourcing candidate profiles, but it's not a very reliable indicator, at least for now. It might improve later.
TIM: This is something I'm quite interested in, because I've been assuming that coming around the corner are AI CV-screening tools that will be used en masse, because so many people are complaining about being inundated with applications. It seems natural that companies would then combat that with some kind of automated AI screen. So the fact you already have some direct experience, maybe with an early version of the product, is interesting. You gave a great example there of hiring, or at least interviewing, a candidate who scored only 5 percent as a CV match score but obviously turned out to be very good. What was the gap, do you think, in the evaluation? What was the scoring logic that was missing, or what did you see that it didn't, I guess is another way of putting it?
PRATIK: Okay, so this again comes back to how well the candidate's resume is written. Oftentimes the recruitment teams try to use specific keywords that the hiring team has given them, like, say, I'm looking for a junior data engineer with some experience on Spark, some knowledge of Databricks, and stuff like that. Now, sometimes candidates are good and they put in those keywords, and then the automation engines actually pick those profiles up. In other cases, people write it in a roundabout way, because the resumes are not really refined to be SEO-friendly or keyword-friendly, and that is where you might get a lower matching score. But what matters is what they've written as part of, say, their description: what were their responsibilities, what are the key highlights of their work in their previous experience? Or maybe they've just done a project that says they've built a CDP; the keyword may not even be there, but while doing a manual screening you can pick up on that, and that's how you utilize it.
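[Editor's note: as a rough illustration of why a purely keyword-driven match score can miss a strong candidate, here is a minimal sketch in Python. The keyword list, resume snippets, and scoring logic are hypothetical and not taken from any particular ATS.]

```python
# Minimal sketch of naive keyword-based resume matching (hypothetical,
# not any specific ATS's algorithm). A resume that describes Databricks-style
# work "in a roundabout way" scores low because the literal keywords are absent.

JOB_KEYWORDS = {"spark", "databricks", "data", "engineer", "python"}

def match_score(resume_text: str) -> float:
    """Return the fraction of job keywords found verbatim in the resume text."""
    words = set(resume_text.lower().split())
    hits = JOB_KEYWORDS & words
    return len(hits) / len(JOB_KEYWORDS)

resume_a = "Junior data engineer with Spark and Databricks experience"
resume_b = "Built a lakehouse platform and distributed ETL pipelines on a managed Spark service"

print(match_score(resume_a))  # high keyword overlap -> high score
print(match_score(resume_b))  # same underlying skills, fewer literal keywords -> lower score
```

A manual screen (or a more semantic model) would recognize the second description as relevant experience even though the keyword count says otherwise.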
TIM: Okay. So it sounds like maybe the current versions of these tools are a little bit too crude, brute forcey, keyword searchy, and that's fairly limited. But I can imagine evolutions of them improving quite a lot from, from that baseline.
PRATIK: Yeah, that is one part, and on top of that, actual personal experience also matters. Those of us who have been in the industry for a long time know that if someone says they have done Databricks, how much of Databricks they have actually utilized often comes out from what they've mentioned in the description and the roles and responsibilities, rather than just from a keyword called Databricks. The AI agents would not be able to evaluate that.
TIM: Is there any bit of the hiring process which you feel really optimistic that AI could drastically improve compared to how it's currently done manually by humans?
PRATIK: There are parts of it. One is to streamline the communications. For example, a lot of candidates typically complain about not getting responses from the recruitment team, or the recruitment team worries about candidates not responding, and all of that. Some of those processes could be automated and streamlined using agentic AIs designed for that. Apart from that, there is a very big pain point. I don't know how many companies still use them, but the online testing or screening portals often have, I would say, a very rigid set of questionnaires. Utilizing AI to recommend questions based on the candidate's portfolio and the candidate's experience could be a way to simplify and streamline that process as well.
TIM: And thinking about it now, let's say from the interview step itself: would you imagine AI maybe doing the first screen interview by itself at some point? Would you imagine almost like an AI assistant in the human interview that could, I don't know, summarize notes or provide feedback? What role do you think AI could play during the interview step itself?
PRATIK: I don't know if the AI is good enough to do the interview by itself. I've seen some products claiming that AI will take the interview for you. I haven't tried them yet, and I'm not really confident about how well they will perform. But the second part, about using AI to summarize the interviews and act as an assistant in the human interview, that could be a potential game changer when it comes to writing down feedback and noting a summary of the level of questions that were asked and the kind of answers that were received. It might also be able to do semantic analysis of that conversation and say the candidate was really confident, or the candidate was not confident enough in that particular tech, based on the kind of wording they used and the intonation and all of that.
TIM: Yeah, I feel like there's some pretty low-hanging fruit, based on the current state of these technologies. We use equivalent products in the sales process; one called Fireflies is really popular, and I just interviewed one of the co-founders of a business called Assembly, which has a similar product. At the moment, the transcriptions are already 99 percent accurate, the summaries are really good, and it does actions, key points, and takeaways. So that could, I think, clearly be applied directly to interviews now. And I wonder if that will chip away at the problem of candidates getting ghosted and not getting feedback, because if writing the interview notes is automated and people let the machine get on with it, there should be less of an excuse not to share that with candidates. But even if we have that, would there be other reasons you could see that the feedback still wouldn't be delivered to candidates, even if it was literally automated during the interview?
PRATIK: It can be. There are different reasons why feedback is withheld a lot of the time. This is my personal experience: sometimes you are evaluating multiple candidates and you found a candidate to be just near the line, but not good enough to cross it. And because of the job market, people are always going after multiple job opportunities, so if you have, say, one job opening, you try to make two offers, because you don't know who might join or not join; the confidence is typically low. That is when the candidates who are just near the line are the ones you try to keep on hold. Sometimes companies keep the communication transparent; other times they are roundabout and say the feedback has not been received and all of that. So it's all tactics for how to retain the candidate without outright rejecting them. But yeah, there are different reasons for what people do and how they do it.
TIM: Yeah, I think in some markets, more so in the United States, some companies are reluctant to give feedback because it might open a can of worms if the feedback is perceived as unfair, or if it even smells slightly of being racist or sexist or homophobic or whatever, then the company opens itself up to being sued. But I've only really heard that about the North American market; I haven't heard it so much in Europe or Australia. What are your
PRATIK: Yeah, that is there. But another aspect I've seen is that if you give really positive feedback for a particular round of interview, the expectations of the candidate really go up. If the feedback is a bit average, then the candidate is more mellow and tends to be more rational in their approach to the job offer, the negotiations, and all of that. Now, different tactics for negotiation are not really a good way to deal with the situation. But we have seen some companies be very transparent and very upfront about the feedback. Even we, as part of the interview process, try to keep about five to ten minutes at the end of the conversation to discuss whatever the candidate would like to know about the whole interview process, about the discussion we just had, and about how it leads into the roles and responsibilities they'll be getting into if they are selected. That helps maintain a level of decency in the whole process, but then again, it's up to individual companies and how people deal with it.
TIM: Yeah, I guess it is up to each company how they do it. I feel like if AI can simplify this enough that we at least get rid of the "it's just too hard to give feedback" barrier, maybe that removes half of the reasons why companies don't give it. And then if suddenly the majority of companies are providing feedback, maybe that becomes an industry norm, and if you're not doing it, people say, you didn't give them any feedback, are you kidding me? So maybe there'll be a tipping point where those companies have to catch up. Because getting ghosted and getting no feedback is the single biggest complaint of candidates. And imagine the improvement to the process if candidates got actually actionable feedback, so they can improve themselves and be better next time. I feel like that's going to make the whole market more efficient if people actually know where they're falling down, so they can do something about it, rather than never knowing and always guessing.
PRATIK: That is a point, but it also depends on how the interview is actually structured. For example, some interviews are more conversation-heavy, so there is a discussion happening and it is easy to identify action items, general feedback, or a consensus on how the interview went. In other cases the interviews are more focused on, say, a technical or system design aspect, or a programming aspect; a lot of data structures and algorithms interviews happen in larger companies, and in those cases nobody is really speaking, one person is just writing. It would be difficult for the AI to discern any actionable feedback from that. So again, it's situation-based, but yes, if it can be implemented at least for certain scenarios, and the summary or whatever the AI has generated can then be reviewed and communicated accordingly, that's going to remove a lot of hurdles in the entire process.
TIM: One slightly different aspect of this, which I'm quite excited about, is not just whether we can provide feedback, or provide it quickly, which are all important dimensions, but also the quality of the evaluation and the quality of the feedback itself. I can think of a recent example where we've been working with a client, and they interviewed a candidate of ours and sent feedback to us to share on with the candidate. But as a recruiter, I can't really pass on what they've said, because it's so all over the place. It goes down different threads, it's very subjective, and it sometimes carries a lot of bias. So I feel like if we had that kind of AI interview assistant keeping us on track, saying, hang on, the criteria for this role are these eight things which you established, you said you want strong SQL skills, Python skills, experience in machine learning, strong verbal communication skills, whatever, then any feedback should surely be about one of those eight things, not some random other thing, which I find creeps into the process so easily if you don't keep everyone on track. To give a particular example, a verbatim bit of feedback about a candidate was: oh yeah, they were really strong, they had these things which are good, but I just wasn't convinced that they were anti-Putin. That was the feedback. I can't give that feedback to the candidate; I'm not saying that to them, because it's not fair, obviously, and it definitely isn't in the criteria. Their position on the current conflict is not a hiring criterion. So I feel like AI could enhance the process in ways we can't even think of at the moment, partly because I think the current process, personally, is so open to bias.
PRATIK: Yes, absolutely. And many times, what we try to keep in mind when evaluating even the feedback given by the interviewer is: have they hit the key criteria that we asked them to evaluate on? We keep it very simple. We assess the candidate's communication skills and the candidate's attitude towards how they perceive a particular question: if they do not understand something, are they asking questions, are they trying to gain more knowledge and understanding about the topic, whether or not they're familiar with it? And then the technical part of it: the actual problem solving, how hands-on they are, how much they understand the theoretical aspects, the fundamentals, the domain, the industry, and all of that. So yes, you're right, if the interview discussion is not focused on your key evaluation criteria, whatever they may be based on the position or the role, it is definitely possible to get sidetracked, and the bias comes in. But then, as a hiring manager, you look at the feedback and you say, this is not the feedback we're looking for; if your feedback doesn't hit those key criteria we asked you to evaluate the candidate on, maybe you are not the right interviewer to be doing this. And that is where it's not just about getting candidates to learn the skills, but also training your interviewing team to be clear, obviously on the technical part of it, but also on what they should be evaluating and the level of questions they should be asking. So for example, if I'm looking for a principal architect, I won't be asking them to write code for an entire app; I would be asking design-based questions, scenario-based questions, problem-solving questions, because I need the principal architect to solve problems for me, I don't need that person to write code. Yes, reviewing code, or maybe getting their hands dirty and writing some code, may be required, but it's like 1 percent. So that's the kind of thing you need to review as part of feedback. And what we have personally been doing as part of our company is asking the interviewer to write down a general gist of what they asked in the interview. So if they asked, say, hands-on programming questions, what was the question about, what did they try to evaluate, and then write the feedback about that evaluation. As part of the hiring process, that helps evaluate not only the candidate but also the interviewer, and next time onwards, you choose interviewers based on that.
TIM: Yeah, that's a great, mature, and self-reflective way to do it: making sure that everyone is on the same page and aligned. Especially for hiring processes that have many steps, a lot of the time the argument is, let's get a more holistic picture of the candidate, let's have someone in talent, the hiring manager, and a stakeholder. There's some merit to that because you get a broader view, but if those people aren't aligned on what you're actually looking for, I think inevitably they'll each have their own idea of what a good analyst or data scientist is, and it might be, to your point, completely irrelevant. In your example, if you don't need them to be coding and someone spends an hour drilling them on coding, then their feedback is, in a sense, irrelevant. So keeping everyone on the same page is just so important. What about on the candidate side now? Obviously candidates are using ChatGPT and other tools to craft and optimize their CVs, and you mentioned some of them using it during the interview itself. So you've actually experienced candidates clearly looking across to another screen and interpreting an output. What have
PRATIK: It's not just looking across the screen; that's obviously visible. We've also seen candidates just stating the exact same answers that you would have read in a blog or seen somewhere online when you yourself were trying to solve something. And sometimes that's fine, because we understand the landscape is so vast that not everyone knows everything. But you need to be honest about it. If you don't know, say you don't know. If you know just a little bit, say you know a little bit but you're willing to go learn about it, and explain how you would approach that. It's important, because at the end of the day, half of the tools and tech we were working with five years ago are irrelevant today, nobody uses them, and the same thing is going to happen in the next five years. So if I want a candidate who is going to stay with me for the foreseeable future, I need someone who can adapt, someone who can learn. So for them to be able to say, I know how to ask ChatGPT how to solve my problem, is also good enough in some cases.
TIM: So you're saying their usage of the tools may come from a fear of thinking they should know everything, whereas you would say, well, that's impossible, and they should just be more upfront about their usage?
PRATIK: Yeah, yeah, absolutely.
TIM: I think that's the right way to do it personally, because if it's kind of inevitable, they're going to use the products anyway, and you want them to use it while they're working to make them more efficient. So we wouldn't want to discourage it. And so if you almost make it more transparent, then I feel like you could go one level deeper and go, okay, cool, show me your prompts. Like how are you using it? And then you can really understand almost that new skillset that otherwise would be hidden in the process.
PRATIK: Yes. And funnily enough, we have had a couple of resumes in the past which clearly mentioned: I know how to use Stack Overflow, I know how to Google. Good enough for us. If you know how to find a way to solve your problem, then I know you will be able to implement it.
TIM: Is there any bit of the hiring process where you would rather candidates not use any AI at all?
PRATIK: There is a part of the process, and that is when we want to evaluate a candidate's thinking process, their approach to problem solving. More often than not, customers, and the modern world in general, do not spoon-feed you solutions. People tell you, this is the problem I'm facing, please help me find a solution. Now, the solution might be a simple change of process, or it might be writing a big piece of software, but understanding that problem is very much tied to your thinking process: how your thought process works, whether you understand the domain and the industry. Given the problem statement presented to you, are you able to ask the right questions to gain more knowledge about it, and then come up with a solution? In most of these cases, AI, or even running a simple Google search, is not going to help you. Your objective thinking and your awareness of the particular situation are more important, and that is where we wouldn't want candidates to use AI or any other tools to assist them. Yes, many times you may not know a particular thing and you need help, but then you should be clear about it: okay, can I Google this? Or maybe you ask the interviewer themselves to tell you more about it.
TIM: And so in that case, is it that you want to understand their problem solving and their skills disconnected from AI, because the AI would almost muddy the waters and you wouldn't know whether it's really their knowledge? Or are you saying the AI is almost pointless in this scenario?
PRATIK: In 90 percent of the cases, AI is pointless in those scenarios. Because the current AI tools are not really intelligent; they have just mugged up whatever knowledge was there on the internet, so they are repeating the same thing that somebody else has said somewhere, and that brings in a bias towards repeating the same mistakes. You as an engineer should be able to think about a solution on your own; otherwise you are not an engineer. Engineers are meant to solve problems. That's just my personal opinion, but if you can't solve problems, then what use are you as an engineer?
TIM: Yeah, I'm personally slightly torn on this, because on the one hand I agree with you: surely a well-trained, well-skilled, experienced engineer solving from a first-principles basis could come up with something better at the moment. But if the rate of technology change is so vast and so quick, might it be the case that in a year the question becomes, why would you not use this machine with an IQ of 500 to help you solve problems?
PRATIK: That is what I'm suggesting: if the machine is truly intelligent, why do you even need the engineer to work for you? The machine would do it. So as long as the machines are not there, I want you to use your brain; that's the only thing. But yeah, there can be differing opinions about it, and a lot of the time the small ideas that come out of someone's brain help solve a really critical problem, while others may be stuck in their ways trying to find an answer online.
TIM: I imagine, and I don't have any evidence to support this, but I imagine the adoption of ChatGPT, Claude, whatever, is highest among the youngest people. That's just my assumption; I'm not sure if that's true or not.
PRATIK: No. So my observation is that people who are set in their ways, who are a bit inflexible about changing their routine, have low adoption of these new tools. I've seen more experienced people using ChatGPT and Claude to simplify, say, writing documentation or writing emails, or even to summarize an architecture diagram. And then I've also seen a few younger folks who are set in their ways, saying, I do not want to waste my time switching windows to get a piece of code or an answer, I would rather just type it out. And that is fine. Everyone has their way of working, everyone has their way of learning. We encourage people to utilize these tools because, in general, they help enhance productivity, but if it is not working for someone, then it's not working; you can't force people to adopt things.
TIM: For the set of junior candidates, then, who are quite flexible, happy to adopt AI, early adopters, for whom it's just normal: is there a danger that, if they're coming in having only ever used ChatGPT, for example, simply because of their age, because their last year of university was 2022 or whatever and they came in at exactly that moment of the curve, they're going to miss a skill set by over-relying on the AI? Or do the benefits of being almost AI-native outweigh any downsides of maybe not having solved problems in another way?
PRATIK: I would say the answer to that is a bit subjective. If you are mainly focused on, say, programming, being AI-native is awesome; it is going to exponentially increase your productivity. But when it comes to tasks that need more critical thinking, a deeper understanding of the industry or the domain, it can create hindrances or limit the scope of your knowledge and understanding. Because what you gain from actual experience, whether it is learning from fixing a problem or talking to somebody and understanding the business, is going to be a vastly different experience, with a different impact, than if you just get it from the AI. AI doesn't tell you what you don't ask; it only tells you what you ask it for. So if your questions are not broad enough, your knowledge will not be broad enough. That's where I think the balance needs to be; you need a bit of both, because often the kind of learning you get from a mentor who is guiding you in your initial years, explaining things to you, keeps feeding you things you might never experience until you run into that particular problem.
TIM: And I'm interested to hear about the impact AI has had on what you think a good hiring process is. For example, does it make you rethink a coding test? Does it make you rethink a whiteboard exercise? You know, what, what are the best ways now to evaluate technical skills in your view?
PRATIK: For me personally, specific technical skills matter when you are actually looking for an expert in that skill set. But over time, what you need is someone who understands the fundamentals. To give a simple example from data engineering: distributed computing has always been there. People used to write distributed computing code in C and Java, and then suddenly Spark came through as a framework and made people's lives easier. Now, does that mean that if I know Spark, I know distributed computing? No. I might understand some things, but tomorrow, if Spark becomes obsolete, I need to go back to the learning board and start reading about whatever the next framework is. But if I know the fundamentals, I can actually pick up anything faster, because at the end of the day the concepts, from a theoretical computer science perspective, are almost going to remain the same. Whatever you learned, say a way to implement recursion, is still going to remain the same whether you implement it in Java, Scala, Python, or Rust; the semantics may change, but the overall structure remains the same, as in the sketch below. That is where knowing a particular tool is good for the short term, but understanding how the tool works, and how the next tool might work based on how this one works, is what is helpful in the longer term, because you need people to be working in the industry for 20 years, and there's going to be a lot of change over time.
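[Editor's note: a minimal sketch of that recursion point, in Python only, with the function chosen purely for illustration. The base-case-plus-recursive-step skeleton is the part that carries over between languages; only the surrounding syntax changes.]

```python
# Minimal recursion sketch (illustrative only): the structure is a base case
# plus a self-call on a smaller input. The same skeleton would look much the
# same in Java, Scala, or Rust, just with different syntax around it.

def sum_list(values: list[int]) -> int:
    """Recursively sum a list of integers."""
    if not values:                            # base case: empty list sums to zero
        return 0
    return values[0] + sum_list(values[1:])   # recursive step on a smaller list

print(sum_list([1, 2, 3, 4]))  # prints 10
```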
TIM: And so how does that tie into how you're actually evaluating candidates? Does that mean you're focusing on specific tools when it makes sense for that role, but for others you'll focus more on the concepts?
PRATIK: Yes. So what you try to do is this: if you're hiring for a very specific skill set, then you focus on the tools, because that's what you need immediately. But when you are not hiring for a specific skill set, what you look for is the candidate's ability to adapt, their ability to learn, and their ability to apply what they've learned in real-world scenarios.
TIM: Adaptability is a word I've been hearing a lot in the last couple of weeks. I feel like everyone's thinking about this because the tech is changing so quickly; as you say, the tools from five years ago are now maybe almost irrelevant. So when you're interviewing candidates, how do you evaluate adaptability, the willingness and ability to learn new things? What are you looking for exactly?
PRATIK: It comes through the kind of questions that you ask and the kind of conversation that you have with the candidates. Many times you ask the candidate a scenario-based question and you see them engrossed in trying to understand it and talk through it, and that shows the candidate's willingness to learn something different, something new. Other times you give a candidate a problem statement and they are just heads-down solving that problem, and you find them to be a bit rigid in some scenarios, because they're only focused on what's in front of them and not trying to find the underlying thought process, the ideas, or the bigger picture, per se.
TIM: And just so I understand this in more detail: in the actual interview process, you're working through or talking through a real problem or a case-study type of thing, and they're solving it, not on a whiteboard necessarily, but by talking through their solution, and you expect them to ask questions, clarify, and those kinds of things?
PRATIK: Yes. Because, again, it depends on the level of candidate you're evaluating. For a fresher candidate, right out of university, you might not get that level of conversation, but in that case you're looking at how good their fundamentals are and the kind of courses they've done at university; or, if they come from a different, non-computer-science background, the kind of learning and transition program they have gone through to get into software development. You try to understand that, because, yeah, it's very subjective trying to evaluate how a candidate is going to perform once they are in, but you try to do what you can to at least assuage your own concerns and say, okay, the candidate is thinking the right way.
TIM: And I'm interested in the interview process itself. Granted there's some subjectivity, and it's quite open-ended, but do you try to score candidates? You mentioned some of those key criteria that you'd have in an interview. Are you giving a candidate a score per criterion, like five out of seven on criterion one and, I don't know, four out of nine on criterion two? Do you try to quantify it in some way?
PRATIK: Yes, we do try to quantify it, because that's the only way for the next interviewer or the hiring manager to say whether this candidate is shortlisted or not. Yes, they can go by the qualitative feedback you have written, but without a quantification it's very hard to discern whether they should select the candidate for the next round, whether they should reject them, or even, in some cases, what kind of questions should be asked during the next interview. So for example, in round one you are doing hands-on programming, and as part of that you evaluate the candidate at, say, five out of ten, even when it came to the theoretical aspects. I gave you the example of recursion: I had a candidate two days ago who claimed to have solved 200-plus LeetCode questions, but they couldn't write a recursion. They recommended recursion as the solution approach, but they couldn't write it. Those are the kinds of candidates where you may be able to quantify and say they know the concept, but from a hands-on programming perspective they are five out of ten. And that is absolutely fine, because on certain evaluation criteria they may be good, and in other cases they may not be as good, but there is room to grow, and people can improve.
TIM: One reason I ask about the measurement is because I feel like it's a way to make the process a lot more objective. Ultimately, it's still someone's opinion about where that candidate sits on a criterion, so there's obviously still a level of subjectivity, but I think if we at least have numbers, that bounds how wildly subjective we can be. So I feel like it's a big plus in hiring to quantify things where we possibly can. One thing that's also worth discussing: is there ever any scenario where budgets can affect how objective the hiring process is, do you think?
PRATIK: It does. It puts a restriction on the experience level of the candidates, or on the flexibility you get when evaluating the candidate pool. So if you have a certain budget, like, say, you can't go above this number, then it limits the kind of candidates you are looking for and interviewing. And it sometimes also introduces a bias where you try to be more thorough with the candidate if the candidate's expectations are on the higher side; you try to get the most out of your money. That's also a pitfall when working with budgets, but we live in a world where everything is driven by money. You can have a very good candidate come in very cheap, but soon they'll realize they're worth more and then come back and ask you for more. Or you could have a very expensive candidate who just doesn't match what you expect that money to bring in. It works both ways. But it is good to understand, at least at a hiring-management level, what amount you expect to spend and what the corresponding skill-set levels are.
TIM: And I think that's really important, because that also means you have to be realistic and do your market research and know what to expect. We're going to pitch this role at, I don't know, 50,000, and we want these ten skills; is that even possible in our market? Maybe it isn't. And I know good, experienced recruiters are very brutal in giving that feedback and helping some clients understand: this is what's possible at what you're offering, these are the skills you're going to get, and if we try to go for all ten of these skills, we'll be here forever trying to find this candidate.
PRATIK: Absolutely. And that helps a lot when shortlisting candidates and sourcing profiles, because if you know that you are low on budget, you wouldn't want to go after a candidate, take them through the whole interview process, and then make a lowball offer. That's just wasting everybody's time.
TIM: Yeah, absolutely. And maybe there's something to be said, then, about the assumption that in a hiring process you just hire the best possible candidate you can find. Maybe the right way to think about it is that you want the right candidate who fits that job, because otherwise you could almost over-hire someone who already perfectly ticks all the boxes and would presumably be bored pretty quickly if they can already do the job perfectly. So is there almost a sense of hiring for a bit of upside, a bit of growth in the candidate?
PRATIK: I would say that depends on the kind of role or position you're looking to fill. If I'm looking for a principal architect, I want them to be perfect. But if I'm looking for a junior engineer to start and grow with the team, then I would be fine picking someone who has, let's say, a few gaps and doesn't know a few things; that's absolutely fine, because I'm expecting that candidate to grow with the role and with the company. Whereas in the other scenario, I'm looking for someone to come in and solve the problem for me, so if they're not the right person, then I'm not doing the right thing while interviewing or evaluating them. So it depends on the kind of role or position you have open and what the exact expectation of the candidate is, and that generally drives whether you're looking for a hundred percent fit or you're okay with, like, 70 or 80 percent.
TIM: That makes sense. Pratik, I wonder if I could ask you one final question, and that is: if you had the proverbial magic wand, and you could wave this wand and fix the hiring process somehow, what would you do?
PRATIK: I would come up with a way to showcase the true potential of the candidate, and not just the true potential but also the compatibility factor, because a candidate may be the best in the world, but if I can't work with them, what's the point? So I need both the compatibility factor and the true potential, and if I can wave my wand and get that, I don't have to do lengthy interviews.
TIM: Yeah, that would be an amazing wand and a very, very valuable one. I think if you could predict people's potential and their compatibility, uh, well, I hope we get that in the next few years, maybe AI will somehow deliver that, that magic to us. I hope so.
PRATIK: I'm optimistic, but in a way I hope it doesn't happen, because if it does, the human touch that is part of the evaluation process, the recruitment process, would go away. I understand that from an objective hiring perspective it's good, it reduces bias, but I have personally seen people who did not know anything, who were willing to work at the lowest possible entry point, and who have grown up to be exceptional in their roles. So it depends on the scenario, the situation, what you're looking for, and who you give the benefit of the doubt.
TIM: Yeah. Well, we've still certainly got that human touch in hiring at the moment, so we'll see what the next few years have to offer us with AI. Pratik, it's been a really great conversation today; we've covered a lot of different areas. I'm sure our audience have really enjoyed it, and thank you so much for joining us and sharing all your thoughts and insights with us.
PRATIK: Yeah, thank you. It was a great experience talking to you, and hopefully we get to that North Star in hiring practice. Looking forward to what the world has to offer.