In this episode of the Alooba Objective Hiring podcast, Tim interviews Julian Wiffen, Chief of AI & Data Science at Matillion
In this podcast, Julian and Tim discuss the challenges and benefits of diverse mindsets in hiring great data talent. Julian emphasizes that diverse experiences can lead to more effective problem-solving and reduce biases within data teams. He explains Matillion's approach to recruitment, including the importance of apprenticeships, internal lateral moves, and unconventional backgrounds. The conversation also covers leveraging generative AI in recruitment processes, ensuring fairness and transparency, and the potential of expanding traditional definitions of diversity to include diverse experiences and mindsets.
TIM: So Julian I would love to know from you what you think is the leading challenge in hiring great data talent at the moment.
JULIAN: I'd say the biggest challenge actually is related to diversity, but not in the very traditional sense of the phrase; it's about getting in diversity of mindset, and that tends to come from diversity of experience. If everyone you bring in has come from the same kind of path, taken the same kind of educational route within your country, usually the same sort of courses at school and the same sort of universities (certainly within the UK, you get a small subset of universities), they tend to think about problems in very similar ways. So you might, on traditional measures, have a very diverse intake, whether it's gender, sexuality, background, or similar, but they've been taught to solve problems the same way, and that introduces the great danger for a data team: biases and groupthink. An organization that has a diverse mindset will probably move slower and have more arguments, but it's less likely to move in the wrong direction, and I think that's especially important in data, where you might end up measuring the wrong things, setting the wrong KPIs, and jumping to the wrong conclusions. And doubly so when you're starting to build AI-driven processes; you might be picking up biases in the way they operate that you don't even notice.
TIM: So do you feel like our definitions of diversity have been a little bit limited?
JULIAN: No, I think it's expanding and building on them. Some companies and recruiters already try to look for people who have taken different paths; some HR teams talk about squiggly careers, where you've moved in different directions. We've embraced hiring apprentices, which has been a great way to get people from very diverse backgrounds in. With apprenticeship recruitment we've seen people who are shifting careers, people coming back into the workforce after having had a child, and people who have been studying in their own time alongside a job in a completely different industry. Equally, when doing graduate recruitment, it's worth keeping an eye out for folks who may be the first in their family to go to university, or, in the UK, who've been to the Open University, which often gets you genuinely interesting people; they're always worth a conversation. You've got more chance of getting someone that's not a good fit, but equally you may well find someone that meets the criteria for what you're looking for in terms of skills and propensity to learn, and that person could be a great asset to the team just because they'll think about problems in a different way.
TIM: Okay, so it's about really having extra dimensions about a candidate beyond the more traditional ones. Is that then something that you've consciously collected as part of the hiring process?
JULIAN: Yeah, and some of it is by opening up different streams for the different points at which people might enter the workforce. A lot of companies have the very traditional path: take placement students for an internship, then bring them in as graduates when they finish their university degree. And that's great; it's a really good way to get good, strong talent you've built up a relationship with and taught a lot of your techniques before they even walk in the door as a full-time graduate employee. But I think there's a lot of value in saying, okay, we'll take a certain number of placement students, and to fill our roles we'll also take a certain number from an apprenticeship scheme on one side. At the other end of the scale, we've seen some universities offering PhD placements; we recently tried a PhD placement, for example, bringing in someone from a more academic, postgrad background. And then there are lateral moves, where you look to reskill people from within the existing organization. That last bit I wouldn't underestimate either, because those people frequently bring domain knowledge that's relevant to your particular business but want to get into the data space. Maybe you bring someone from a customer success organization, a software organization, or the finance side into a data team because they've found they have a knack for it and an interest in it. One of my team who came in as an apprentice originally had a background in fashion; she discovered she was always the person doing the spreadsheet work and building the reports, realized that was the bit she enjoyed, and that persuaded her to look at a career shift into data. And you'll find those people in a lot of organizations.
You might equally find people doing heavy admin roles who are often the superuser, the person that gets called on to deal with the problems when it comes to your Salesforce instance: okay, how do I enter this, and how do I enter that? They know all the ins and outs of what field was cannibalized for this or that, and the leadership goes to them for expert advice. They may actually be good candidates for a data team, someone to whom you can say: actually, do you want to make that your full-time job, that kind of reporting and understanding the as-is rather than the should-be that you get if you read the documentation? And by having a variety of channels, you're much more likely to get that breadth of experience, is my feeling.
TIM: Yeah, those shadow analytics roles are really interesting, and I wonder whether, with the development of AI, with, let's say, ChatGPT, does that almost reduce the barrier to entry to some of these more technical things, so that someone with that domain expertise can really excel?
JULIAN: If you wanted to get into the Python space, actually the barriers to entry are a lot lower now, because you just need to know what you want to do, take some basic courses, and be willing to embrace a ChatGPT-like copilot. You can do a lot of the basic tasks and quickly learn how to load a file, how to move a data frame around, and how to join stuff together; all of that you'll pick up very quickly. Now, equally, to make a slight plug here: in Matillion's products we've built a copilot that allows natural language interaction on top of our platform. Where we've started to see the GenAI tools having real power is in being able to put them in the hands of someone who's a domain specialist, a subject matter expert, rather than a specialist data engineer or data scientist. They'll know much better what good data versus bad data looks like and what good answers look like; they'll have at the top of their head what a ballpark figure ought to be coming out at the end of any pipeline, for example. So yes, lowering that barrier to entry is something businesses like ours are going after, allowing folks who might be an analyst in a line-of-business unit to do more of their own data analysis and data engineering rather than being dependent on a central data team. But equally, those folks then become good candidates to move into a centralized data team if you want to make it more formal.
TIM: What about the risks of that approach? Let's say we move into this new world where suddenly there are all these shadow analytics people who've adopted AI to do more analytics themselves, without necessarily knowing the details, maybe not being experts in the underlying code or the data science.
JULIAN: That's where you still need a strong skeleton and a process, and a process that wants to embrace those people rather than treat them as a slight. It's always a risk, where you've got a shadow IT or shadow data organization, to feel slighted that they don't want to use the lovely dashboard your team has put together, and to try to either shut them down or certainly not talk to them. It's much better to empower those people and say: okay, here's our public data catalogue; here's the structure we want people to pull data from; here's our process for taking your one-time or ad hoc pieces of reporting and bringing them back into the centralized fold. You see it over and over again: somebody puts together a spreadsheet analysis or a quick piece of analysis for one meeting; their management likes it; the meeting comes around again next month, and they're asked to do it again. Suddenly a couple of people refer to it, and okay, you've almost got a business-as-usual report now. So you give those people a channel to pass that back in, publish it, and have it validated and maintained, so that effectively you're recruiting those folks to help you build official dashboards rather than having something sitting in a spreadsheet on a shared drive that's not properly maintained and is dangerous.
TIM: Do you feel like we probably undervalue internal candidates relative to external ones and there's a lot of untapped talent there?
JULIAN: I think that's a sad thing; it's too easy to go off and put up the job posting before you've even thought about whether you could arrange a lateral move. I think it's very healthy to always do succession planning within the organization: okay, if I need more people, or if I need to replace one of my team, who are the junior members of my team who could step up and take on a more senior role, and who could I reach out to in an adjacent organization who might be able to take over a task or be a candidate to bring in? Those people bring valuable connections; they bring valuable domain expertise; they already know the business. Okay, you may have to do more education on the formal processes of your organization, good behavior, good data engineering best practice, but that's no different from bringing in an external candidate who maybe has the perfect academic experience but needs to learn the business. Everybody's got some learning to do in any job.
TIM: Yeah, get rid of the notice period as well; save on recruitment fees either way.
JULIAN: And it's healthy; it's good for people to see moves like that succeed. It often improves morale in both teams to see that you can grow and that we're supporting our own employees' growth.
TIM: Yeah.
JULIAN: And don't be afraid of people that another organization hasn't rated highly in the past. Some of my best hires have come from internal moves where their own managers were like, yeah, fine, take that guy. You get them into a new team, you give them a slightly different job, you give them a bit of love and attention and support for what they really want to do, and you maybe find a strength they didn't know they had, or they shed the stuff they were struggling with, and they get a chance to establish themselves fresh. Those folks then become your most productive and most loyal members of the team. You obviously have to find some way of assessing whether they have the potential to do the job, but the fact that somebody's just trundling along, not doing very much and not going anywhere in one org, doesn't necessarily mean they won't flourish when you move them into something different.
TIM: Yeah, I wonder why larger companies don't do this better. Is it just the complexity of having tens of thousands of people across all these different departments, and maybe not having data available on each person's skills?
JULIAN: There's certainly a fear of doing succession planning. When you do succession planning, you ought to be looking at it as: I'm not going anywhere, and I'm not about to sack that person, but he or she might get the perfect opportunity elsewhere, or something happens family-wise or illness-wise. People move on, and then you execute the plan and reach out to the people you've thought about who might be interested. And equally, I think one of the biggest frictions is the fear of having the conversation: expressing interest in a rotation can be seen as disloyal to your current boss and your current org, and that might be damaging to your chances of promotion. That's probably the biggest thing to solve, and I don't have a magic wand for that one. But if nothing else, when you talk to your own direct reports about their ambitions and they express interests that might be better served in adjacent teams, it's worth having a conversation with your peer managers to say: hey, I think this person might have a spot on your team or mine as their next step, especially if I haven't got a spot they can grow into unless somebody leaves.
TIM: Yeah, that would be the mature response. I guess some managers might not think of it as holistically.
JULIAN: There's also the fear of: okay, I'm losing experienced resources, and I'm going to have to backfill them. But if you're mature about it, you ought to say: okay, that's an opportunity for a junior member of the team to grow into that role, to shuffle the work around a little, give everybody something fresh, and give them a chance to learn something new and stretch. I'm sure many people's steps up in their careers have come from exactly that: your boss or the tech lead in a particular area has left for whatever reason, and you're asked to cover something that's a stretch for you. And actually that's often quite healthy. Sometimes you get stuck: I'm doing repetitive stuff, I know how to do it, but I'm not really stretching myself, and then I get a bit bored and start to get sour about my job. Whereas if I get something new for a while, with a little bit of air cover ("give this person a break for three months; they've just stepped into the role; expect a few mistakes"), it can be really healthy, both leveling up their skills and leaving them feeling happy about where they're at.
TIM: Yeah, God, we all have so much extra to contribute, don't we? How many of us are operating at our maximum capacity all the time? There's a lot of untapped potential there.
JULIAN: Some of it can be just boredom as well; you face the same problems month after month, year after year. I remember, a long while back in my career, somebody ringing me up to ask about some missing marketing lead data, and being a bit grumpy on the call, then putting the phone down and thinking: it's totally unfair to be grumpy with that person; their inquiry is a hundred percent reasonable. It's just that I'd been in the same spot for several years and was getting tired of chasing down missing marketing leads. It was time to look for something new, time to make an internal move.
TIM: Yeah, a hundred percent. Last time we were chatting, you were telling me about some recruitment use cases of the product and what you guys are working on. It'd be great to hear about those.
JULIAN: Oh, some of the stuff we're starting to experiment with using generative AI. Obviously you have to be very careful with these sorts of tools in this space, but we think there are some opportunities to explore, and at the moment it's just experiments to see how well they score compared to what an experienced recruiter would do. Most people think of the chat interface when they think of these models. What we've done with the Matillion product is bake the calls to these models into a data pipeline. So imagine I've got a dozen CVs coming in: I could put them through a transactional data pipeline with calls that ask questions about each one. You could ask for a summary of each CV, for example, but you could also start to ask questions like: does this CV show any evidence of a particular skill? In this kind of process, asking questions that call for a yes/no judgment is very powerful. For example: does this CV show any experience in Python? Or: does this CV show more than the most basic experience in Python? Answer yes or no. The experiment we're working on here is to work out the list of questions you might have in your head as a recruiter when you're reviewing a stack of CVs. And we've all been there; you might have dozens, even hundreds, to go through. Can I get a very dry, unambiguous list of questions to go through and review those CVs with? You might well score based on grades: how many points do they have at A level? Have they got a particular grade? What's the quality of the educational establishment; is it on a certain list of top-tier universities or not?
Or, alternatively, do they look like they've come from an unusual background, like the Open University, or like someone who's had some work experience prior to coming to an intern role? These are all things that can be turned into yes or no flags, which then become very unemotional. There's far less of a kind of "I've read the whole CV and made a judgment that you look like the sort of person that I am," or that the person who built the GenAI is. Instead there's a very clear audit trail: okay, I've got my flags to say location shows they're within a certain distance of the office; they've got education to the level that we're looking for; they've got experience of this particular role; maybe they've got experience of QA processes or data visualization or whatever the criteria are. It's all yes/no stuff, it's very auditable, and then you can filter things based on that score: okay, you must have so many ticks across the board to go through to the next level of screening. Our first experiments with this involve recruiters sanity-checking themselves: going through a file of CVs that have actually just been reviewed and seeing if the model comes out with roughly the same set of people as the human did. Maybe when we've done that a few times we'll have some confidence to look at it elsewhere, but at the moment we're taking it very gently; you want a lot of human review of that kind of prescreening. The other interesting one we're exploring, and one of our customers is exploring, is then using that to generate an automated email back to the candidate before you make a yes or no judgment. Just say something like: we've read your resume; we see evidence of these things, which is great; we can't see anything in the resume that's evidence of X, Y, Z, or whatever the other criteria were.
Do you want to take training in those sorts of things? Give them a chance to respond, which you wouldn't have time to do as a normal human; if you automate that process, it gives them a chance to push something back in and put in a response. And that's early days, yeah. It also opens up the possibility of a very different mode of recruitment. I can see recruitment agencies or large HR departments with a lot of roles saying to unsuccessful candidates: are you happy to give us permission to keep your CV on file and keep screening it against any new roles that come up? Because rescreening becomes a much lower-intensity task at that point, and you might then get flagged: okay, sorry you didn't make this one, but we found these three other roles that you might be a fit for within our system. Maybe that becomes more of an equal opportunity. I could see it pushing us to be able to say: yeah, feel free to give us a longer resume; give us three or four pages if you want.
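The yes/no screening pipeline Julian describes could be sketched roughly as below. This is an illustrative stand-in, not Matillion's implementation: in a real pipeline `answer_question` would be a call to a generative model with the question text, whereas here it's a toy keyword check so the flag, scoring, and audit-trail logic can run on its own, and the questions and threshold are invented examples.

```python
# Toy sketch of a yes/no CV-screening step in a data pipeline.
# Each CV gets an auditable row of boolean flags plus a tick count.

QUESTIONS = {
    "python": "Does this CV show any experience in Python?",
    "qa": "Does this CV show experience of QA processes?",
    "visualisation": "Does this CV show experience of data visualisation?",
}

def answer_question(cv_text, flag):
    # Stand-in for the model call: a naive keyword lookup.
    keywords = {
        "python": ["python", "pandas"],
        "qa": [" qa ", "testing"],
        "visualisation": ["tableau", "dashboard", "visualisation"],
    }
    text = cv_text.lower()
    return any(k in text for k in keywords[flag])

def screen_cv(cv_text):
    # One audit-trail row: every criterion is an explicit yes/no flag.
    flags = {name: answer_question(cv_text, name) for name in QUESTIONS}
    flags["score"] = sum(flags.values())  # number of ticks
    return flags

def shortlist(cvs, min_ticks):
    # Pass candidates with at least `min_ticks` to the next screen.
    return [name for name, text in cvs.items()
            if screen_cv(text)["score"] >= min_ticks]
```

For example, `shortlist({"A": "Tableau dashboards; Python and pandas daily.", "B": "Retail management; Excel reporting."}, min_ticks=2)` keeps only candidate A, and `screen_cv` shows exactly which flags drove that decision.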
TIM: Yeah, that's a great point. We're really going to have to rethink a lot of things then, you're right, because suddenly all the rules of thumb, like "make it one or two pages," exist so it's concise for a human reading it. When AI is reading it, it doesn't matter if it's one page or a million; I'm sure there's some limit at the moment, but within reason. That is really cool. One really difficult challenge with the first use case: we've done some interesting analysis here around what we are optimizing for, because if the data set is what a recruiter has selected from CVs, and that starting point is so biased, subjective, and random, we wouldn't want to build something to replicate it; we have an opportunity to do something better.
JULIAN: As an alternative, you could even start using the model to generate the question list based on the job posting. It could certainly challenge people's thinking: okay, here's the job posting; generate me 10 or 20 questions that you would have in mind when evaluating CVs for that role. That can probably start to challenge biases a little bit. The other question is that you might use that kind of process and then say: okay, let's have a look at the ones the recruiter rejected and the model picked out, because there's learning to be had in the gaps between the human and the machine. Which ones did the model choose that the human said no to? Which ones did the model say no to that the human said yes to? That'll probably give us some really rich data about whether the process we built has biases in it, or whether the person has biases we need to work on. And that's the sort of process we need to go through before we start to use this in anger, in a real-life process rather than just an experiment.
TIM: It's reminded me of an experiment we did ourselves a few years ago, which is probably worth a quick chat about. We basically got a set of around 500 CVs and shared them with 10 different recruiters independently, so they didn't know the others were involved. We gave them a job description and said: shortlist CVs based on this job description. Behind the scenes we also had those candidates' test scores, from skills-based assessments, which we didn't share with any of the recruiters. What was fascinating, to cut a long story short, was that the recruiters came back with almost entirely different recommendations; I think there was only one candidate that more than three of them selected. It's almost like flipping a coin; the level of randomness was quite staggering, despite the fact that on paper they had the same data set to choose from. They probably each approached it in a different way: maybe some spent 10 seconds, some spent a minute, some came with their own biases, and some took the subjective criteria of a CV and filled in their own gaps. I think that's going to make it very difficult to create any tool that selects CVs by optimizing on what recruiters have already selected.
JULIAN: Yeah, the only approach I could see is a scaled wisdom-of-crowds thing: you could build a process that captures how well a CV matches one person's set of criteria and the way they apply them, then duplicate that across a bunch of experienced recruiters. Effectively you'd be repeating that exercise with 8 to 10 people for every single CV that comes in and saying: okay, four out of the seven models rated this person; how do we respond to that? Again, it would be an interesting experiment; it's that ability to scale. I don't know whether having a bunch of people, all of whom have slightly different approaches and biases, together generates wisdom of crowds, or whether it just reinforces what's accepted or what's normal.
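That scaled wisdom-of-crowds idea could be sketched like this. The reviewer functions are invented stand-ins: in practice each would be a model call prompted with a different recruiter's criteria, and the names and threshold are hypothetical.

```python
# Simulate several screening "reviewers", each applying different
# criteria, then act on the vote count across all of them.

def reviewer_skills(cv):
    # Cares mainly about the named technical skill.
    return "python" in cv.lower()

def reviewer_domain(cv):
    # Cares about relevant domain exposure.
    return any(w in cv.lower() for w in ("salesforce", "finance"))

def reviewer_potential(cv):
    # Cares about evidence of self-driven learning.
    return any(w in cv.lower() for w in ("self-taught", "open university"))

REVIEWERS = [reviewer_skills, reviewer_domain, reviewer_potential]

def votes(cv):
    # How many reviewers rated this CV positively.
    return sum(r(cv) for r in REVIEWERS)

def passes(cv, threshold=2):
    # e.g. "four out of the seven models rated this person"
    return votes(cv) >= threshold
```

A CV like "Self-taught Python; Salesforce admin" gets all three votes and passes, while one that satisfies no reviewer fails, which is exactly the "how do we respond to a split vote?" question Julian raises.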
TIM: I feel like the other challenge is going to be that, okay, we can select CVs with AI, and I think we can at least take the manual effort away from a human. Amazing: speed up the process, give feedback straight to the candidate, go through that iterative process; that's all already a staggering improvement on the current situation. But still, I feel like the CV itself is the problem, in the sense that it's a pretty crap data set for deciding who to interview, because it's something someone has written about themselves. I wonder if you can envision any other way it's going to be done? Is there some other data set we're going to use to make those decisions?
JULIAN: It's an artificial construct, because we're asking people to summarize how they present themselves purely for the constraint of one person reading it, because one person can't scale. If they could, I could ask someone for their PhD thesis and its summaries, the five LinkedIn articles they've written, a whole podcast series, or similar; I can't read all of that, I haven't got time. But maybe there's room to say: actually, you can submit plenty of supporting evidence here. Send me your references, send me your GitHub links, your GitHub repo, send me the article you wrote for the college magazine as an undergraduate, whatever other bits of evidence you've got, and we can put all that into the pot, summarize it, and say: okay, tell us about you. That might again be an area to explore: give candidates much more opportunity; you can send me a video if you like, and we can process that too. That's all self-selected information, of course. Getting anything beyond that then becomes a question of what you're allowed to share, what's reasonable, what's fair, and whether you're too vulnerable to the unfair opinions of others around you.
TIM: Yeah, yeah, it's fascinating. You're right; suddenly we aren't limited by the five seconds a human has to look at each CV.
JULIAN: You're right; that's one of the greatest powers the GenAI stuff brings to data processing: it can easily synthesize data from multiple unstructured and varied formats. You can send in a PowerPoint, a CV as a PDF, a spreadsheet, a massive long chat history, or a video that it can transcribe; it can synthesize all of them and then say: okay, can I see evidence of X, Y, Z? Maybe that adds more value to showcase projects: the GitHub repo stuff, the previous case studies, that sort of thing. And that might be something educational institutions need to improve on, making coursework something that becomes a matter of almost public record, or at least gives the student something for their portfolio that they can show future prospective employers.
TIM: It could get to content-driven recruitment, almost, where the candidate really needs to be out there producing something in some format, publicly available, that showcases them and their abilities. That's interesting.
JULIAN: But it doesn't necessarily have to be tailored to every single interview; it's more like a professional creating a portfolio of things to show.
TIM: Yeah, awesome, cool. I'm excited thinking about that now. What about thinking a little bit now about fairness and transparency? So we've just laid out a whole set of ways that the current incumbent process is actually quite opaque and unfair, so I feel like hopefully we can improve that with AI, but do you see some challenges or some ways that it could somehow become less fair or less transparent?
JULIAN: Yeah, if you've not got visibility of the criteria by which judgment calls are being made, that list of evaluation criteria, for example, that's a problem. Understanding what's intended by an advertisement and how the recruiters are going to score it has always been a challenge, and I think it might doubly become so. Ironically, this was stuff I saw when we dug into it and were able to have a really big impact on gender balance in recruitment, through an analysis that led us to an understanding of differences in the way male and female candidates tended to read an advertisement and expect to be scored and judged. This was back when I was at Cisco. We looked at the kind of woeful male-to-female ratio we had in recruitment, and this is where I'd advise people to take a data-driven approach when you're trying to improve this balance. What we saw was that we hired a very small proportion of female software engineers, and that was a problem we wanted to fix. But if you worked through the process, by all statistical measures it was totally fair at every step: the percentage converting all the way from CV to the first screen, through the assessment center, to an offer being made and the offer being accepted; there wasn't a whisker in it in the percentages going through.
The problem was only 10 percent of the applications were from female candidates, and so it showed us we needed to change the way we handle what we do to get more women candidates in, and what we found with a bunch of interviews was that female candidates were more prone to take a job posting that had a whole long list of skills required and discount themselves for lacking a small proportion of that whereas male candidates were far more willing to chance their arm and say, Okay, I know about a quarter of these; I know another quarter a little bit, and I'll blank the others, or I'll learn them as I go, and the female candidates were saying, I'm missing one quarter of the requirements, and I'll exclude myself. and so we simply changed the wording of the adverts based on this, and we said these roles will involve these 8 to 10 skills as part of their day-to-day business. If you've got prior experience with some, that is great, but we are looking for people who are willing to learn these as part of this. This is an earlier career role. People are willing to learn these as part of this job at a stroke that took us from 10 percent female applications to 40 percent, and across the board, it also got looking at the kind of academic grades that got us brighter candidates of both genders or all genders because we were emphasizing the learning part of the role, and that got us the brighter candidates as well as a far more diverse selection. Ultimately, out of that, we took the majority of female software engineers that year and subsequent years, and it obviously, apart from the initial research, didn't cost anything. This is very replicable; that's why I'm always keen to share this story. Thank you. You can do this today, and just when you're looking at writing a role, there's a lot that can be done to make it more appealing to different groups. and I'm sure if you look to other axes of adversity, you want to address the same analysis that can probably bear fruit. 
What appeals, and what doesn't, will vary with educational background or similar, but certainly we did well by emphasising, Okay, here's where we want to get you to six months or a year into the role. You don't have to have all this walking in the door. It works really well: it shows people what they can grow into, and it gets you the more driven, more learning-focused candidates. So I'd share that lesson with anybody in tech in any form: look at your job adverts hard.
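The funnel analysis Julian describes — checking the conversion rate at each recruitment stage, broken down by group — can be sketched in a few lines of Python. The counts below are invented for illustration, not Cisco's actual data:

```python
# Minimal sketch of a recruitment-funnel analysis: compare stage-to-stage
# conversion rates by group. All counts are made up for illustration.

funnel = {
    # stage: (female_count, male_count)
    "applied":    (100, 900),
    "cv_screen":  (40, 360),
    "assessment": (20, 180),
    "offer":      (10, 90),
    "accepted":   (8, 72),
}

stages = list(funnel)
for prev, cur in zip(stages, stages[1:]):
    f_rate = funnel[cur][0] / funnel[prev][0]
    m_rate = funnel[cur][1] / funnel[prev][1]
    print(f"{prev} -> {cur}: female {f_rate:.0%}, male {m_rate:.0%}")

# Share of applications from female candidates -- the number the
# stage-by-stage rates can't reveal on their own.
share = funnel["applied"][0] / sum(funnel["applied"])
print(f"female share of applications: {share:.0%}")
```

Stage by stage, the rates here are identical for both groups — the "fair by every statistical measure" result — while the final line surfaces the real problem at the top of the funnel: only 10 percent of applications coming in.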
TIM: That's such a great example. I love that, and it made me think of a couple of things. One is that that analysis was probably a no-brainer for you — almost an automatic response to that kind of problem. I heard a similar anecdote yesterday from someone who's a marketing analyst, and the company they work for had recently had a big VC investment. They were told, Okay, we need to hire 300 people in some ridiculously short period of time, so they basically had to take an analytical approach. They thought about it as a funnel, using all their knowledge of marketing funnels — a recruitment funnel is the exact same thing — and naturally started to think about those conversion rates. I wonder, though, whether that mindset and those skill sets are just quite lacking in talent and HR in general, because not a lot of people come to those fields from science or data or engineering backgrounds, so they just wouldn't think like that. It's not their automatic way to approach the problem. What do you think?
JULIAN: Yeah, it's possible. I don't want to slight my peers in those organisations, but that analytical mindset was very much what drove it, and in some ways this comes back to our earlier point about bringing in folks from adjacent teams: you bring a different mindset and a different approach to the problem, and sometimes that's the solution you need. I'm sure there are many things in the talent space that data skills aren't relevant to, but in that use case it was the silver bullet. What was interesting was that the analysis of conversion through the steps was trivial, as you say — literally maybe an hour's work. What was much harder was finding ways to collect data on why people weren't applying: who do you interview, and how do you assemble that data? We had to survey a bunch of folks. We talked to people — particularly recent interns and recent graduates — about what they were most reluctant to apply for and what made them the most nervous, which gave us a lot of valuable insight, and then we tried it with a few students at university open days and that sort of thing. You had to get creative about how to collect that data, and that probably comes back more into the traditional recruiter space of where you have touch points to collect it.
TIM: Because, of course, if they haven't applied, then you don't have their application, so
JULIAN: Yeah, exactly. You could probably also borrow from marketing there — it's like collecting data on why a person didn't buy your product. I think it was an interesting challenge.
TIM: And were these people who you somehow knew had seen the Cisco job ads and had not applied, or was it just a general survey of people from university?
JULIAN: It was a mix. The low-hanging fruit was interviewing women already at the company and asking their reaction to the adverts, and we also asked the women who had applied despite the wording what was appealing versus off-putting. Even the ones who had applied said, Yeah, I felt a bit nervous about not having all these things — so that was our big clue, and then we validated it with the students.
TIM: That's
JULIAN: The real proof was in the next round of recruitment where it came true.
TIM: Yeah, amazing. I think that's just such a great example, because my perception, having worked in the HR tech talent space for five years now, is that tech and data are seen as anti-human, dehumanising, etc., yet this is a perfect example of where the analytics helped you solve a diversity and equity problem.
JULIAN: Yeah, and it made it unemotional in this case — it cut the theories out and brought us back down to, Okay, what's real? There was one other lesson we got following that process, almost by accident — no data at all, complete accident. We had a panel interview set up for some candidates coming in; they were all scheduled to travel. The female IT manager who was on the panel fell sick that morning at very short notice. A lot of the candidates had travelled overnight for these interviews, so we couldn't really cancel, and it was a bit of a struggle at the last minute to find a female member of staff who was in the office and available that morning to attend. We ended up bringing in one of the most junior people we could get — one of the female interns — and bear in mind these weren't intern roles. The quick lesson it taught us was that when you're seeking to make up a diverse panel, they don't all necessarily need to be hiring managers; people who might be peers or even juniors to the incoming role will give you a different perspective. There was an example where she was looking at one candidate who had been talking in the interview about their gap-year experiences doing charitable work in Africa — and a side lesson here, obviously, is to get the junior person to give their feedback first so they're not biased or don't feel like they're correcting anybody. She called out that this candidate sounded wonderful but was basically describing a lovely trip through Africa funded by this charitable organisation, with not really anything about what they'd achieved, how they'd helped anyone, or what use it had been to anybody — that just wasn't there.
And the rest of us on the panel went, Oh yeah, we hadn't noticed that. Having somebody from that age group was quite helpful in terms of spotting behaviours we wouldn't necessarily have been as sharp on, so sometimes age on your hiring panel can be another diversity factor. Like I say, don't assume everybody has to be a grizzled interviewing veteran or a people leader.
TIM: Yep, that's great. I have seen a wider pool of people be interviewers — for example, in a long hiring process with numerous rounds. You get a wider variety of opinions, but that sometimes comes at the cost of different people having their own view of what the candidate should be or what skills they should have. I find it dangerous if you're not laser-focused and on the same page about what the criteria are; anyone can come along and say, Oh yeah, they were great, but I didn't like their communication skills for X, Y, Z — adding their own criteria and shifting the goalposts a bit.
JULIAN: I've seen places in the past where some of the exercises in the process give candidates an opportunity to fall out of the process rather than an opportunity to shine — group exercises in particular tend to be that way. Coming back to addressing biases, it might be time to take a long, hard look at whether the skill you're testing for is actually relevant to the role. If I'm hiring for a heads-down data analyst who, yes, occasionally needs to present their results, but who I really need to be great on the detail and wise to any logical traps, and who is probably going to be working remotely somewhere in peace and quiet — if they're going to share reports with folks, they need a good written communication style, but do they need to be a wonderful, outgoing, gregarious person who functions really well in a noisy group environment? Maybe not. I think that's another bias that comes in from both tech and HR. I can see recruiters on the HR side finding the outgoing, high-soft-skills personas very appealing; equally, I've seen technical sides over-rotate on existing proven skills over all-round capability and willingness to learn. One of the flaws I see on the technical side is: okay, this person has done X, Y, and Z, so we've got proven knowledge of that — great, but does that prove their ability to learn the next new technology that comes along? You can over-rotate on the very deep technical skills or on the soft and personality side. Maybe part of the discussion is: let's be really focused on what skills this role really needs.
I think that's also relevant to attracting neurodiverse talent: some of the things that can exclude a neurodiverse candidate — particularly the noisy, chaotic environment of a team-building exercise — can be off-putting to someone who might be on the spectrum somewhere but be perfectly well-skilled, or in fact really able, in all the things the role needs.
TIM: I don't want to sound like I'm having a go at HR again, but I would say the challenge is, let's say you've got a combination of soft skills and technical skills you're after in a role. To be fair to talent and HR, they're not really in a position to evaluate the technical skills in an interview, because it's not their job, so they just focus on the other side — they can't see the other 50% — whereas at least in theory hiring managers should be able to evaluate both, though maybe a more junior technical person is, as you say, too laser-focused on the technical.
JULIAN: But you say, Okay, you're scoring that bit; we're not interested in your opinion on their skills in that other area, because you're not the judge of it. I think that's the flaw for all of us, and why you need to have the discussion, because it's too easy for people to put their own biases on the job description or on what they're recruiting for. I saw that incredibly early in my career, when I was first applying for graduate roles. A friend and I were both studying chemistry, and we both came to the conclusion that we didn't actually want to be bench chemists coming out of the lab with headaches from the solvents each day. So we thought, okay, what other jobs can we do? We applied to a whole bunch of stuff. We were not the laser-focused candidates with a burning vocation to work at your company in your industry; we basically applied to everybody that came to Oxford, and one of them was Procter and Gamble. They had this lovely job advert talking about technical brand management — I think the marketing team had got their hands on it and really played up the softer skills side: you need technical knowledge, but you'll be presenting it in a business context, and all that sort of stuff. My friend Paul and I both applied, and we had the exact same experience in the interviews. They said, Why do you want this job? And I said, Because it gives me a chance to apply what I've learned as a chemist without necessarily being in the lab all the time, because I don't really want to be. And the interviewer went, This is a very lab-focused job, and we'd expect you to spend 80 percent of the time in the lab.
And I went, I guess I won't waste any more of your time then — and my friend had the exact same experience. The soft skills had been so played up in the job description that the fact that the role actually involved 80 percent of the time in the lab had been completely buried. That's what happens when you have one member of the team with too much of a focus on one area.
TIM: And you touched on briefly before the concept of not overindexing on current skills but thinking more about what this person is going to grow into over the next couple of years. Could you expand on that a little bit?
JULIAN: Yeah. In the software engineering field, everyone will have worked through many different frameworks and programming languages; the languages they taught you at school are certainly dead and out of date now. You move through things in the data space too — you might have started with MATLAB, moved on to R, and moved on to Python. Some languages, frameworks, and tools are more resilient than others — you're probably fairly safe that your Excel skills will stay relevant and current — but new things come along all the time, and it's very hard to test for. The people who will flourish are the ones who can quickly skill up on those new technologies, languages, or models. We've seen this with Gen AI, for example: nobody really had Gen AI experience a couple of years ago unless they were deep in a narrowly focused natural language processing field, but now everybody's trying to learn it. And that's really a question of mindset: do you enjoy learning? Do you enjoy trying out new stuff? Do you enjoy learning for yourself? Do you understand your own learning style — whether you're better in a classroom with structure, or whether, as with some employees I've had, I can say, Take the next couple of weeks, go play around with that, get yourself up to speed, and tell me what you think. Some people find that's an ideal two weeks of work — trying something new is good fun — while others struggle with it and want a formal course: teach me, and I'll learn. I want the theory solid before I start banging away on a keyboard.
TIM: I guess part of the challenge is, let's say you're evaluating someone's current technical skills — you can probably measure that reasonably easily — but measuring their propensity to learn new things, or the rate at which they can learn, is harder. How have you seen that done, or have you got any ideas there?
JULIAN: You can test for problem-solving skills and how they tackle a problem — that's one part, and it's very transferable. I'm looking for how much they check the criteria of the problem and understand its scope. I don't have a great answer for a brand-new piece beyond explaining a very complex problem and seeing how well they follow it and pick up on the details. Beyond that, I think looking for evidence of self-driven learning — stuff they've explored beyond their existing job or course — is always good. Sometimes when I've been interviewing early-career folks, the new graduates, I ask them what some of their favourite parts of their course were, and then drill into that: have you done any reading around the subject beyond the course? Have you explored anything? Tell me why you enjoyed that piece. I'm really looking for evidence of excitement — is there any passion or interest there? A free tip to anybody watching this who ends up being interviewed by me in the future: if I ask you about areas you're passionate about or what you enjoy, I'm going to expect to see some evidence that you've actually pursued it, because that shows me there's something in what you've studied that has excited you — and that's what's going to drive you to keep studying.
TIM: I remember now, actually, we had a little hack for this in our software engineering hiring process. We had company values like anyone else does, and we tried to think about how we could set up the hiring process to somehow measure them. Our general thought process was that talk is very cheap: anyone can say anything in an interview; they can come up with some nice stories to evidence certain values, but ultimately if they can do something to show that value, that's a lot more compelling. For us, the thinking was: this is a startup; we're going to change direction so many times that you'll have to learn new tools and technologies and change a lot. So we gave our software engineering candidates a short programming problem in R, because none of them knew R. We were suddenly measuring: were they going to tell us to piss off — Why am I going to learn this stupid statistical language? — so it's a mindset test. Could they learn it reasonably quickly to solve a simple problem? How did they approach it? We found that very interesting.
JULIAN: That's a really interesting one.
TIM: That approach
JULIAN: I did see something similar once before. This is again back when I was applying for every graduate job under the sun, and one of the organisations we were talking to was GCHQ here in the UK — the security services folks. One of their tests actually involved educating you on how certain codes work: they worked through, Here's how these simple ciphers work, imagine you're putting the message in this grid like a crossword puzzle; now try to tell me what this message says. Then it got more complex, and you had to solve the problems as you went. That's probably still one of the more fascinating aptitude tests I've ever run into, and it was certainly an interesting way of doing it. Giving candidates something new to solve, or a tool they've not worked with before, means that even if they struggle, you can see whether they enjoy it or whether they really hate facing something new.
TIM: Yeah, there's a lot of almost like meta challenges you could set up in hiring out there, like little indicators along the
JULIAN: Yeah.
TIM: kind of around the process to give you a clue
JULIAN: On the values one — I'm giving away all the secrets here — a really good question I was taught once by one of my mentors as a people manager is to ask people to think of somebody they enjoyed working with as part of a team, whether it was a group exercise at school, at work, or even a social thing, and to tell you what they liked about that person and why they liked having them on the team — and then to do the same for someone they found really frustrating to work with, and why. Those two questions together tell you so much about what that person values and what drives them up the wall. There aren't necessarily right or wrong answers, but it's a really good test. Equally, asking them what kind of role they tend to gravitate to in a team can be quite telling.
TIM: Yeah, even thinking back to my own experience, certain people come to mind. What I found was that I would tend to clash with people who had different communication preferences from me. I'm not sure if you've ever done the HBDI thing with the red, green, blue quadrants — I'm very blue, logical, and low red, emotional. I would clash with people who were very high red and low blue; I'd have conversations where I felt like they were just throwing a lot of words at me and I couldn't really get a handle on what they were saying — Can we just simplify this down to do X, then Y? And they probably looked at me like, Don't you understand what I'm saying? This is very simple; I just need you to do something. So it was a great clash, and yeah, those interview questions must highlight some of those dynamics.
JULIAN: It's not necessarily showing there's a right or wrong, but it's certainly a test of, Okay, would I be able to get on with them? It comes back to the importance of being willing to teach and educate folks coming into your team, and of looking at more than one channel for getting folks in. Is someone ready for a lateral move? Do I take someone who's been an apprentice? Someone who's career-shifting? Someone out of academia? A new grad? That's the key to getting a strong team: getting in the folks who are keen on learning and find it exciting — they'll keep their skills current, and they'll be fine. And then the key to retaining those people, of course, is giving them plenty of opportunities to keep their skills sharp.
TIM: It sounds easy but maybe challenging in practice.
JULIAN: Yep, always hard, cool, all right, thanks very much for having us on, Tim.
TIM: My pleasure, thank you, Julian.