Making Data Simple

Beyond the Resume: How Alooba is Redefining Hiring with Tim Freestone {Replay}

Season 8 Episode 37

Want to be featured as a guest on Making Data Simple? Reach out to us at almartintalksdata@gmail.com and tell us why you should be next.

Synopsis

What if hiring wasn’t about flipping through endless CVs but instead focused solely on skills? In this episode of Making Data Simple, we sit down with Tim Freestone, founder of Alooba, the groundbreaking platform revolutionizing how businesses hire for analytics, data science, and engineering roles. Tim shares how Alooba eliminates bias, speeds up hiring, and ensures candidates are evaluated based on what really matters—their capabilities.

From his journey as an economics teacher to leading data teams, Tim’s insights are a must-hear for anyone tackling hiring challenges in today’s competitive job market. Learn how Alooba’s data-driven approach is transforming recruitment and why the future of hiring might just leave resumes in the dust.

Show Notes

  • 4:46 – How do you go from economics teacher to head of business intelligence?
  • 7:53 – Do CVs matter anymore?
  • 13:22 – What business problem is Alooba solving?
  • 16:05 – Do you have any data that supports your theory?
  • 19:01 – Why analytics, data science, data engineering?
  • 20:26 – What do you do that others don’t?
  • 23:50 – How does Alooba define success?
  • 25:42 – Who’s your target client base?
  • 32:40 – Is there a customer you can talk about?
  • 36:24 – What does Alooba mean?

Alooba 

Connect with the Team
Executive Producer Kate Mayne - LinkedIn.
Host Al Martin - LinkedIn and Twitter

Want to be featured as a guest on Making Data Simple? Reach out to us at almartintalksdata@gmail.com and tell us why you should be next. The Making Data Simple Podcast is hosted by Al Martin, WW VP Technical Sales, IBM, where we explore trending technologies, business innovation, and leadership ... while keeping it simple & fun.

You're listening to Making Data Simple, where we make the world of data effortless, relevant, and yes, even fun. Welcome, podcast listeners. Thank you so much for listening. Welcome back. I want to thank you. We passed one million downloads since we started this endeavor two or three years ago, probably three years, I guess. Thank you. For this little humble podcast, I'm sure there are many more with a lot more downloads than that, but for me, that's pretty cool, and I'm pretty appreciative. So thank you so much. We're gonna get into it as always. Today I have the distinguished guest Tim Freestone. He is the founder of Alooba. We're gonna get into this, but I wanna talk about your background and how you became the founder of Alooba. This is essentially a platform, I wanna call it, and I'll learn more about it here in a bit. It's a skills assessment platform for analytics, data science, and data engineering that helps businesses easily identify the best candidates that apply for a role. And I've got to tell you, Tim, it couldn't be a better time in this day and age for this platform. So we'll talk about how that came about, whether you predicted the future or whatnot. But the core product is a skills assessment platform, like I said, which effectively makes manual CV screening redundant. Welcome, Tim. Thank you for being here. I greatly appreciate it.

Well, thanks for having me. Congratulations on your milestone. That's amazing. I'll give you a brief round of applause.

Where are you located? You're in Australia? You got a little bit of an accent there.

Yeah, I'm in Sydney. It's a... I was going to say spectacular spring day, but it's a bit cloudy.

Hey, give us a little bit about yourself. Give us some history, how you became the founder of Alooba. You know, we may go a little bit into that history, but we'll obviously jump into Alooba as well.

Yeah, for sure. So I guess the story goes back probably about 10 years, actually, where I kind of started dabbling in online and web products. My first business was actually an IQ testing practice platform. Going for graduate and internship roles myself, you know, this is about 10 years ago now, I had to do so many of these numerical reasoning, verbal reasoning, diagrammatic reasoning kind of tests. And I just hated them. So I thought, okay, there seems to be a pain point here. Lots of other students also hate these. So I created a platform that would allow people to basically practice these tests, get better, and get more likely to get the graduate position. So that was kind of my first taste. And then, fast forward a few years later, I joined my first tech company, a business called Hotels Combined, now part of Kayak and Booking.com. And I started to get more into the data space. I joined as an analyst and eventually progressed to managing the data and analytics team. And I noticed, I guess, two really big trends there at that time. One was, when I looked around at the company of about 150, even though we only had six people who were data analysts, once you actually looked at what other people were doing day to day in their jobs, actually at least 30 or 40 of them were doing analytics work. So even though they didn't think of themselves as data analysts or data scientists, once you actually looked at what they did, they were really analysts.
That was the product managers, all the marketers, a lot of the people in the kind of supply team and the engineering team; really, they were doing analytics work. So this kind of basic data literacy skill set was clearly taking off and was going to become something essential. So that was one piece. The other piece was that I was responsible for all the end-to-end hiring of data analysts. And I found the whole process just to be a massive pain in the neck. You know, putting out those job ads, reading through hundreds of CVs, trying to kind of spot and pick winners, going through a lot of painful phone interviews with people who say, okay, I'm an expert in XYZ, and you ask them a few basic questions and they don't know what's going on. Yeah, just the whole process of trying to hire, in particular, felt like a massive waste of time, incredibly expensive, very hit and miss, a lot of randomness, a lot of just gut-feel kind of decisions. And it was clear the world was moving to a position where we were better than that. We weren't just going to make decisions based on gut anymore. We needed to make decisions based on data. So that's really the origin of Alooba: trying to have an objective measure of candidate skills within analytics and data science.

I'm going to dive in because I've got specific questions, but before I do, as I look at your background, it's kind of an interesting background. You were an economics teacher for a while, and then you became a commercial analyst, and then you went to Hotels Combined. How do you make a pivot from economics teacher to head of business intelligence?

I guess it sounds like a bigger leap than it was. You know, at the time it was kind of a progressive thing. You're right, I studied business, finance, economics. I think that gave me a pretty solid background in analytical thinking as well as kind of business acumen. And then it was a case of, I guess, progressively getting a more and more technical skill set. So joining Hotels Combined, I was like, okay, let's learn some SQL and create some databases, and then learning about A/B testing and how to do analytics at scale. It was quite a steady progression, I guess, from that initial commercial analytics role into more and more technical roles and then a leadership position within HC.

Are you a programmer? You get under the hood?

I would say I'm a pseudo-technical person. As in, I know enough to get around, I know what's technically possible, I can write a bit of Python, I'm pretty good with databases and SQL, but I'm not an engineer fundamentally.

Fair enough. You must believe in agile practices. You know why I asked that question? Because you've got like 300 stickies behind your right shoulder there.

I do, and it's good for sales calls because it makes me look busier than what I actually am.

I think that's critical. Very good answer. I thought I was gonna stump you there for a second, but you knew right away, you were pointing over your shoulder saying, yeah, I got you. So are you actively putting stickies up on a regular basis? Are these your outcomes, your daily retrospectives and scrums and stuff?

I'm happy to say this was our sales board, and I'm glad to say it's been digitized and deprecated now, because we have a big enough team that it's not just me anymore.

Fair enough. When you talk about hiring, I lead Expert Labs delivery around data and AI, which is analytics and governance, obviously AI, automation, and AI apps. So everything you say rings true.
And right now I'm working with my team on how to bring people in faster. We're hiring a ton. It's just, you're right: you get a stack of CVs, you gotta go through the CVs, you don't know what you're looking at, then you gotta push them through the system. And by the time they're here, I'm not even gonna tell you how long it's taking right now, because it's embarrassing. I've got to shorten that cycle dramatically, and my team and I are currently working on it. I presume those are all the problems that you're working to solve today.

Yeah, absolutely. Speed of execution, accuracy of prediction, overall hiring cost as well. And a big one is just really fairness, giving candidates a fair suck of the sauce bottle, as you might hear in Australia occasionally. You know, the thought that you can get hired based on merit rather than based on some kind of keyword in your CV. I think it's a critical thing. So yeah, it's about speed, accuracy, cost, and fairness to the candidate.

As we jump in here, this will sound like a silly question. Do CVs matter anymore?

I think they shouldn't, but we live in a world where they do. I would say actually 100% of all the customers we have had or have spoken to start their hiring process with a manual CV screen, either someone in the talent team or someone within the business side of things. Basically it starts with what I call a glorified keyword scan. It's generally a set of what they're looking for in terms of skills or tools or technology, general level of experience, those kinds of things, and people will go through and try to basically shortlist on the basis of that. So unfortunately for candidates, it is a critical step in the current hiring process, because if you don't have a CV that matches what the person's looking for, you will not be getting an interview going through the traditional hiring process. So it does matter. I can imagine a point in the next few years where it won't, because for me there are so many issues with manual CV screening that I'm sure it will be deprecated, hopefully with technologies like ours or other solutions which make it a lot fairer, more accurate, more consistent, and less subjective.

I've read, when I did some research before we did this podcast, that you have five fundamental problems with CV screening. Would you mind talking to them?

Yeah, absolutely. So there are probably more than five, if I think about it now. But off the bat, basically, a CV is someone's own impression of themselves. And we must surely realize now that we aren't the best judges of ourselves. We don't have the best understanding. Not everyone has 100% self-awareness. Funnily enough, this is something we track explicitly through our product. When someone takes a test on Alooba, before they take the test, we get them to rate themselves on a scale of one to 10 for each of the skills we're about to assess them in, and compare that to their actual performance to come up with their self-awareness. So we kind of know that people aren't that self-aware on average. And a CV is all about your own opinion of yourself. So that's one thing. There's also the concept of the Dunning-Kruger effect, you know, the idea that the more you learn, the more you realize you don't know. So in a sense, less experienced candidates have more inflated opinions of themselves.
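A minimal sketch of how a self-awareness gap like the one described above might be computed, assuming a 1-10 self-rating per skill and a 0-100 test score; the function, field names, and data here are hypothetical illustrations, not Alooba's actual implementation:

```python
# Hypothetical illustration of a "self-awareness" gap per skill:
# compare a candidate's 1-10 self-rating against their actual 0-100 test score.

def self_awareness_gap(self_ratings: dict, test_scores: dict) -> dict:
    """Return, per skill, the gap between self-rating and measured performance.

    self_ratings: skill -> rating on a 1-10 scale
    test_scores:  skill -> score on a 0-100 scale
    A positive gap means the candidate over-rated themselves.
    """
    gaps = {}
    for skill in self_ratings:
        rated = (self_ratings[skill] - 1) / 9 * 100   # rescale 1-10 onto 0-100
        gaps[skill] = round(rated - test_scores[skill], 1)
    return gaps

# Example: over-confident in SQL, under-confident in statistics.
print(self_awareness_gap(
    {"SQL": 9, "Python": 6, "Statistics": 4},
    {"SQL": 55.0, "Python": 70.0, "Statistics": 62.0},
))
# {'SQL': 33.9, 'Python': -14.4, 'Statistics': -28.7}
```

Averaging these gaps over many candidates is one simple way to quantify the "people aren't that self-aware on average" observation.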
And then also, CVs are littered with personal information that is irrelevant to the hiring decision. If I was trying to think of collecting information to discriminate against someone, I'd pretty much ask for their CV, right? It's got your name, which straight away gives ethnicity and gender with some level of accuracy. You can determine their age based on when they went to high school or university. You can potentially tell their socioeconomic status based on where they live. And in some countries we recruit in, they also just give you their photo, their religion, their marital status, all these pieces of noise which are fundamentally irrelevant to the hiring decision. It's no wonder that when you give this data set to someone and ask them to shortlist candidates, bias can creep into the process. The other big thing about CVs is that they're a totally unstructured data set. Every CV is almost unique in the way it's written, the way it's constructed, the way it's organized. So that's the data set itself. Then in terms of the process of CV screening, basically, as I said, either someone in talent acquisition, who often has little to no experience in the role they're recruiting for, has to do a keyword scan, or someone like a hiring manager will do it, maybe with a bit more skill, but still they're just keyword scanning, looking for high-level signals off a CV. And what we found through an experiment, which perhaps we can touch on later, is just how inaccurate CV screening is, how almost impossibly hard it is to predict someone's skills based on the CV.

So here's what I heard. Slow, because you've got to go through these manually. Inaccurate, and I presume that's a little bit of the self-assessment, as you mentioned; it's inaccurate because you don't know yourself, and it could be, I don't know, a little bit of stretching the truth. Costly, because you've got to have people that screen them; I guess that kind of goes back to slow. Unfair and biased, I presume on both sides. And then, if you aren't selected, most often you don't get any feedback anyway. You just move on and you wonder why, or what happened, and why you didn't get the role.

Yeah, exactly. No feedback, and not even any transparency over that as well. I guess that's the other element. I would be very unsurprised if, in the next five to 10 years, based on the way the world is going, businesses are legally obligated to basically tell a candidate the reasons why they were rejected, because I would have thought getting a job and fair access to employment is like a fundamental human right. At the moment, businesses have no requirement to do that. Not only do they have no requirement, they don't even track it accurately themselves. So they couldn't even tell you necessarily why a particular candidate was rejected, because it's often this five-to-ten-second CV screen and someone says no; that decision, or the rationale for it, is never really recorded anywhere. So I suspect things will change a lot over the next few years.

Alooba, what business problem in all of this is Alooba solving?

Good question. Alooba does a couple of things now. We're very much involved in the hiring process, and businesses use us as this kind of efficient, objective, scalable measure of candidate skills within analytics and data science towards the start of the hiring process. So we pitch ourselves basically as a replacement of, or at worst an augmentation to, the CV screening process. So it's really about, yeah, hiring quicker.
Instead of having to rely on someone to manually screen CVs, where people don't come in on the weekends and it's obviously a slow process, you just ping the candidate a quiz, get an objective measure of their skills, and basically use that as the decision on whether or not to interview the candidate.

So peel the onion back for me, if you will, on the how. I mean, how does Alooba do this, and what's the tech behind it that makes the magic happen?

To borrow an English phrase, we do what we say on the tin. Part of our pitch, actually, is that we don't use AI or machine learning. So there's no black-box decision-making behind the scenes trying to predict or pick the right type of candidate. It's really straightforward. We just focus on candidates' knowledge and technical skills, on things that are basically black and white and easily measurable. We're kind of like the Jeopardy of analytics quizzes; that's the best way to put it. All right. And then that's the way that we really rank and measure candidate skills: we ask them a bunch of questions about the skills that matter for that role and automatically grade them on their answers.

You're not looking at CVs; you're replacing the CVs. Is that true?

Yeah, exactly.

Describe that a little bit more, if you will. I mean, if a company, say IBM, said, hey, we need to use Alooba to do our hiring versus the CVs, how does that process go? How do the questions come about, so you know you're matching the requirements with the need?

Yeah, so we have very much a skills focus, or skills lens on the world. We sit down with you and say, okay, what are the skills needed for this role? Let's say it's a data scientist role. You might have an expectation that they know some SQL, some Python, that they understand statistics and machine learning and data science principles, et cetera. Basically, then, within Alooba, you build an assessment including those skills. We have about 40 different skills in Alooba and around 3,000 questions, and that's always growing. So you put together the skills assessment, pick the individual questions that you think are relevant for your role, based on what the most important topics are in that particular role, and then invite all the candidates to take that quiz. So there's no real need for a CV screen, because you just focus on their core skills and measure them directly.

Do you have any data that supports that what you're doing is more effective than the CV screening?

Absolutely. We ran an interesting experiment actually a few months ago. Basically, we gave a set of CVs to, I think, seven or eight recruiters independently, so they were working independent of each other. They had the job description for the role and the set of CVs, and we said to them, okay, can you shortlist candidates who you think we should interview for this role? Unbeknownst to them, we also had their test results from Alooba behind the scenes. So we kind of already knew who the candidates were and how they performed. And from this experiment, we got some really interesting results. Firstly, from the top 20 scorers on Alooba, only one of those candidates was shortlisted by more than one recruiter. Of the top 50 candidates, half of them were not shortlisted by anyone, right? And then only 10% of the candidates were shortlisted by no one. So in other words, not only was CV screening almost counterproductive, because the best candidates were filtered out, it was also clearly incredibly subjective, and everyone had their own opinion. It was almost like flipping coins as to who anyone would choose.
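A rough sketch of the kind of overlap analysis described in that experiment, comparing recruiter shortlists against top test scorers; the candidate IDs, scores, and shortlists below are invented for illustration, not the actual experiment data:

```python
# Hypothetical re-creation of the shortlist-overlap analysis described above.
# Candidate IDs, scores, and recruiter shortlists are made up for illustration.
from collections import Counter

test_scores = {f"cand_{i}": 100 - i for i in range(60)}   # cand_0 is the top scorer
recruiter_shortlists = [
    {"cand_3", "cand_17", "cand_41"},
    {"cand_17", "cand_52", "cand_8"},
    {"cand_41", "cand_33", "cand_57"},
]

# How often each candidate was shortlisted across recruiters.
shortlist_counts = Counter(c for shortlist in recruiter_shortlists for c in shortlist)

top_20 = sorted(test_scores, key=test_scores.get, reverse=True)[:20]
shortlisted_by_multiple = [c for c in top_20 if shortlist_counts[c] > 1]
never_shortlisted = [c for c in top_20 if shortlist_counts[c] == 0]

print(f"Top-20 scorers shortlisted by more than one recruiter: {len(shortlisted_by_multiple)}")
print(f"Top-20 scorers shortlisted by no one: {len(never_shortlisted)}")
```

Run over real data, counts like these are one way to quantify both claims: how often top scorers get filtered out, and how little the recruiters agree with each other.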
But how do you know that those, I understand that Alooba rated them the best candidates, but how do you know they ultimately were the best candidates? I mean, how do you know that the recruiters weren't right versus Alooba?

The ultimate would be if we eventually tracked people's on-the-job performance through the years, how much value they bring to the business, those kinds of things. The reason we haven't gone down that route so far is because, A, when we engage with businesses, there are normally several subsequent hiring stages, like interviews, background checks, and all those kinds of things that are beyond our control. And once people get into a job, there isn't actually a fair and unbiased measure of their performance anyway. Even performance reviews are kind of biased and opinionated; businesses also don't have an objective way to measure people once they're in a job, which is kind of the same issue as when they're hiring. So there's that. And then also the privacy issues: when we've discussed this with businesses, maybe getting reporting back like, hey, how did these candidates end up, I think that's going to be an issue for us. So we've tried to focus on, basically, can we reduce the false positive rate during that interview process? We're saying, we're a screening quiz. We're going to screen your candidates first, assess their skills, so then by the time they get to the interview, how well do those candidates do in that first interview? So interview pass rate is one of our core KPIs. When we deal with a business, we try to improve that, and on average we double that rate, as in we halve the number of failed interviews in that kind of first step.

Why analytics, data science, and data engineering? Why not HR? Why not anything else? Why are you concentrating on this area? I presume that there's an element of sophistication around this tech that you're focused on, but you tell me.

Partly just because of, A, the market opportunity, and B, my own skill sets and knowledge are in analytics, BI, and data science, so it's something that I felt comfortable in. And this is where we started and this is where the growth is. The growth in the number of analytics roles is massive and it's only going to go up, plus the kind of thing I was mentioning before around the fact that actually so many other non-analytical roles are becoming analytical roles. So we have some businesses, for example, who now give their graduates a basic data literacy quiz on Alooba. It's not necessarily the more senior data scientists, but almost anyone needing those basic skills. That seemed like an area of huge growth, but there is nothing like this for other kinds of semi-technical roles. I could imagine this easily applying to online marketing or engineering or anything like that. I think, generally speaking, our approach would scale well to any roles where there's quite a measurable technical skill set. For really soft-skill-based roles like sales, it's much harder, because it's not as easy to measure soft skills as technical skills.

What do you do different that no one else can do? What's your moat? What's your differentiator versus me just coming up with a quiz I could do without Alooba? What is your differentiator?

I think it's, you know, the scale of content that we've built out across so many hundreds of different topics and areas; that's taken years, and working with a lot of different people, to get that right.
I think the difference between a well-written question and a mediocre question is night and day. And then, yeah, I guess our knowledge and experience of the kinds of hiring processes that we've been involved in, and being able to put that into the product to make it as scalable and as simple as possible. Then all the integrations into the existing ATS and HR technologies to make the process scalable, so you're not just manually managing another set of candidates. And then also the benchmarking data that we've collected over the last couple of years is now really powerful, because we can say to businesses when they set up an assessment: for this assessment, we predict an average score of 50%, and the benchmark is 80%, and that's like our 90th-percentile prediction, so it's a good cutoff for who you should be interviewing. So a really great bit of feedback for customers is: if their average scores are way below that, then that actually implies something about their sourcing, because they're not able to attract the right types of candidates who can score as well as they should be.

I gotta imagine you've done a lot of work around architecting the questions so you get some real knowledge about the individual candidate. In other words, by example, you could say, like in the industry we have the concept of data fabric: do you know data fabric? That's one way you could ask the question. Or you could ask the question: have you architected a data fabric solution? And that's a very different answer, and you'll get to know whether they really have put that into implementation versus just read it from a book. Right? I presume that your questions are oriented in that kind of fashion, so you really get to know where that person's skill set level is, unless they're just flat-out lying. Can you detect if something doesn't add up?

Yeah, good question. So basically the remit for all our content team is, A, it's meant to be a non-academic style of platform. We engage with people who are practicing data analysts, data scientists, data engineers, whatever, from a variety of different industries and a variety of different companies, and we say to them, okay, focus on the things that you use most day to day, that are most practical. So we're really not about theory, other than to the extent that it's valuable; it's really about the pragmatism. Then, in terms of the questions themselves, we've basically built up this very well-documented process of what makes a good question, both in terms of making it hard to Google, unambiguous, and getting to the nub of someone's knowledge. And then also a lot of analytics around how candidates answer these questions, to determine what is a bad question. For example, a question that everyone gets right is almost useless; including it in a quiz has no value. A question that everyone gets wrong is either a really good question, or it's a very strangely worded question that we need to improve because people are being confused by it. So there's also some analytics after the fact.
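A minimal sketch of the kind of after-the-fact question analytics described here, flagging items that nearly everyone gets right or wrong; the thresholds and response data are hypothetical, not Alooba's actual rules:

```python
# Hypothetical item analysis: flag questions that nearly everyone gets right
# (little signal) or nearly everyone gets wrong (possibly ambiguous wording).
# Thresholds and response data are invented for illustration.

def flag_questions(responses, easy_cutoff=0.95, hard_cutoff=0.05):
    """responses maps question id -> list of True/False answers from candidates."""
    flags = {}
    for question, answers in responses.items():
        pass_rate = sum(answers) / len(answers)
        if pass_rate >= easy_cutoff:
            flags[question] = "too easy: adds little signal"
        elif pass_rate <= hard_cutoff:
            flags[question] = "very hard: great question or confusing wording"
        else:
            flags[question] = f"ok (pass rate {pass_rate:.0%})"
    return flags

sample = {
    "q_sql_joins":   [True] * 19 + [False],       # 95% correct
    "q_window_fns":  [True] * 9 + [False] * 11,   # 45% correct
    "q_obscure_api": [False] * 20,                #  0% correct
}
for q, verdict in flag_questions(sample).items():
    print(q, "->", verdict)
```

The same pass-rate data could also feed the percentile benchmarks mentioned above, by ranking candidates' total scores and picking a cutoff at, say, the 90th percentile.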
How do you define success in an implementation you do with a customer?

Typically it comes down to time saving, and that's in two large parts. One is the manual CV screening. The other one is the interviews: if we allow you to narrow the set of candidates you're interviewing, because you've had this pre-qualification step, then you'll save some time there. Depending on their existing manual assessments, you can also knock a few of those out, in terms of not having to grade those assessments anymore. So it kind of depends on the hiring process, and it's very much around time saving. There are a couple of immeasurable benefits, really important but, again, harder to measure, around making a really concerted effort to have as fair and transparent a hiring process as possible: explicitly removing a key source of known bias from the hiring process and saying, this CV, which contains someone's name, gender, photo, age, we are not gonna use this anymore for a hiring decision. We're gonna focus on an objective measurement of their skills. So that's a really important one. And then quality as well. For some of our customers, especially those who don't have really solid existing manual tests, they do find that they are hiring people who don't have the skills needed for the role. That kind of bad hire rate, failed hire rate, is also something we can improve if businesses don't already have solid ways to screen candidates later on in the process.

So let me ask the question a little bit differently. Maybe it's the same answer, but I'll give it a shot because it's a different question. Let's say IBM, you know, we have our own processes and our standard questions, et cetera. Are IBM enterprise clients also in your persona? And if not, I'm curious as to who your target client base is.

Typically we target any businesses that hire at scale in analytics and data science. Normally that comes down to either really large enterprises in banking, technology, energy, those kinds of companies, or high-growth, scale-up tech companies, one of those two. What we found is that you have to have enough of a pain point to bother solving it. So as long as you're hiring at least, you know, more than 10 people in analytics a year; obviously IBM will be hiring thousands, and some of our other customers hire at those kinds of scales. They're the types of companies that we target.

What's your pitch? In other words, I'm IBM, and I say, man, thanks Tim, but we've got a lot of processes. They're probably too lengthy, but, you know, I think we're good. What's your pitch?

Our pitch is that I'm pretty sure you're wrong. And we've converted many, many other customers.

What's that saying about, you know, catching more flies with honey than vinegar?

We've converted many, many large customers who had existing processes that they were sure were right. But once we dug into them, we were able to show that actually they were slow, they were expensive, they were costly, they were biased, and they had a lack of objective, fair measurement in the process.

You don't have an ROI calculator or something that you put in and say, look, here's what we're dealing with, guys: if you have X number of CVs you're looking at, on average it's gonna take you this amount of time, on average it's gonna take us this amount of time, and going down the list, so it's like a no-brainer?

Companies do it in slightly different ways and they'll have their different measurements. So yes, it does come down to exactly that: time spent screening CVs, time saved on failed interviews, time saved marking or grading assessments, and those kinds of things. Yeah, for sure. I guess part of the thing is that even those metrics, it's quite funny, businesses don't track them very well themselves, right? So it's often quite an effort even digging that up from the customers, because hiring analytics is not really a thing yet. And this is something that we're also dealing with ourselves and trying to improve: the analytics around the hiring process just isn't captured that well, even in large businesses.
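A back-of-envelope sketch of the kind of ROI calculation Al is asking about, using made-up figures for CV volume, screening time, and interview length rather than anything quoted in the conversation:

```python
# Hypothetical back-of-envelope hiring ROI: time spent screening CVs plus time
# lost to failed first-round interviews. All figures below are invented.

def annual_screening_hours(cvs_per_year, minutes_per_cv,
                           interviews_per_year, interview_hours,
                           interview_pass_rate):
    """Total hours per year spent on manual CV screens and failed first interviews."""
    screening = cvs_per_year * minutes_per_cv / 60
    failed_interviews = interviews_per_year * (1 - interview_pass_rate) * interview_hours
    return screening + failed_interviews

# Baseline: manual CV screening, 50% of first interviews fail.
baseline = annual_screening_hours(2000, 5, 200, 1.5, 0.50)
# With a screening assessment: no manual CV screen, failed interviews halved.
with_assessment = annual_screening_hours(0, 5, 200, 1.5, 0.75)

print(f"Baseline: {baseline:.0f} hours/year")                      # ~317 hours
print(f"With skills screening: {with_assessment:.0f} hours/year")  # ~75 hours
print(f"Estimated saving: {baseline - with_assessment:.0f} hours/year")
```

The "failed interviews halved" assumption mirrors the interview-pass-rate improvement Tim quotes earlier; plugging in a company's own numbers is the kind of tracking he notes most businesses do not yet do.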
I'd say those metrics, I mean, I'm preaching to the choir, I'm sure, but those metrics right now, I mean, they've always been really necessary. But in today's environment, it's a candidate's environment, you know, they can pick and choose what they want, what they need. I mean, you've got to know who your candidate is, and you've got to know how long it's taking you to bring candidates in, because time is money. And if you lose one day, 24 hours, you lose a candidate if you can't get somebody interviewed as quickly as possible, right?

Yep, exactly. It's a huge opportunity cost. I think in general humans are bad at trading off time for other things, because people don't feel it's as measurable as other things. Cost is very obvious: I'm going to pay X thousand dollars for this. It slaps you in the face; there's a budget line item. But I think people are very bad at noticing, oh, this process dragged on by another few days, another few weeks. And that's also something that we're hoping to surface better in our product.

Well, I know that all companies are different, but it still kind of surprises me you wouldn't have that ROI calculator, because, yeah, time is money. And I think you could really throw that in the face of the consumer by saying, hey, look, this is the time you're spending. Because, like you said, most of them aren't calculating that time, and most of them probably aren't associating it with a cost. But then when you really look at the length of time being spent on CV walkthroughs, or, you know, look, the bottom line is that they're not spending the time on the CVs that they need to, or they're not reading the CVs to begin with, and that's why you're getting the wrong candidate. So it's kind of a downward spiral. Anyway, maybe you do this. Do you put in behavioral testing, like DISC analysis? You know what DISC analysis is? It talks about, you know, what behaviors, what kind of person are you, are you authoritative, I mean, I can go down the list, Kate, you're better than me at this, but to see if it's not just a technical fit, but a personality and culture fit.

We've tried to focus so far on things that are very easily measurable, and measurable in a simple way, thereby avoiding more complicated techniques that are more difficult to explain, and those kinds of things. Cultural fit is very fascinating, because I can see two sides of this. One is I can see the legitimate comment of, we need certain types of people to work in our business, we have a certain culture, it's not just about technical skills. Fine. But the thing is, cultural fit: oh, they weren't a good cultural fit. You could say that about anyone for any reason. It's completely immeasurable, I would argue, and therefore we get into quite dangerous territory when we're talking about bias, unless there's some way you can measure cultural fit, which I'm not sure there is.

However, I get what you're saying. However, if you do some of the behavioral analysis, the interesting thing is, and you said it earlier, people often don't know themselves. They think they know themselves, and then you do an assessment and you think, oh, that is me.
I mean, a lot of times when I'm working with or coaching people, they'll do a behavioral assessment, and then I'll be able to say, hey, you said this about yourself, I didn't say it. And so, in other words, I can work with them and coach them and say, we need to work on this or that. I just think that would be an interesting method, done right, as part of the questioning, to figure out if they were a culture fit within the organization; of course, you'd have to match the culture as well. Anyway, just a thought. Go ahead, Kate. Kate's out there. She wants to say something. She's our producer. Jump in.

I think that you make a valid point as always, because you are the podcast host, so of course I'm going to say that. However, I don't think this is the problem that Tim's company is solving. It is a hard and painful process to hire analytics skills. And what I'm hearing is that he's got a really smart, clean, simple, which means low time tax, low overhead, solution to be able to hire for those skills specifically. I mean, the work still has to be done to understand culture fit and things like that. I will give you that, Al Martin. But I think, from what I'm hearing about his company, he's got a really smart, great solution on that. I really like it.

No, I think it's awesome. I agree with you, but I also agree that a lot of times fit can supersede the technical. Now that's a giant philosophical debate right there, right? Are we hiring analytics skills? Are we hiring for fit? Which do you prioritize?

I think in today's culture it cannot be one or the other. I think it's an and, not an or. And so this is one really super-helpful input, to see, do I have the candidates with the skills I'm looking for, and then you can take steps from there.

Any references or customers that you can talk about, to say, look, this is an example of a customer you worked with, and look at these results?

Yeah, for sure. I guess I could touch on Woolworths. It probably doesn't resonate as a name for the American audience; they're like the Walmart of Australia, basically, the largest supermarket chain here. They're a business that we've been working with for the last six months. When we onboarded them, we started working with what's called WooliesX, which is like the digital subsidiary of Woolworths, the ones who have all the data scientists and analytics teams, and we've progressively onboarded each of the chapters in that business: data science, machine learning, BI, and now optimization and marketing analytics as well. And that's great. Previously to working with us, the way they would do recruitment is they would have a series of different recruitment businesses who would source candidates for them, and they would get them to take a kind of manual, paper-based test. Before COVID, it was literally a piece of paper in an office, writing SQL. Okay, it got kind of digitized during COVID to be a live online Zoom call, but basically each of the recruiters had to sit there, proctor the exams of all these different people writing these different SQL tests, then take a screenshot or whatever and manually send it across to Woolworths. They would then go through the process of manually marking it, which obviously takes time, and there's a process of a few days there. And at that point they'd decide whether or not they wanted to interview a candidate.
So that process was like a massive pain in the neck for everyone: the candidate, the recruiter, the expert marker within the Woolworths business. And so we've basically deprecated all of that in lieu of this customizable assessment, which not only gives them a much better understanding of the candidates' skills, because it's not just SQL, it's, you know, 30 or 40 different topics, but is also automated, which makes it obviously a lot quicker and then a lot cheaper, because there are all these cost savings in the process. So that's a good example, I guess, of where a company's gotten a fairly big, immediate benefit from using Alooba.

Thank you for that. You going through that makes me think of COVID. What's been the COVID impact?

COVID, in aggregate, and I don't want to come across as harsh saying this, but I feel like it's actually helped us, okay, because now online recruitment is normalized. The whole idea of someone taking a test online and being worried about that, versus them just doing it in an office, that is gone, because people just don't think like that anymore. The second thing is that I feel like it's opened up the overseas markets to us a lot more, because I can have a sales conversation with a business in Nebraska in the same way as someone on the ground could. It's been completely equalized. So those were two things that have been to our massive advantage, I think.

Certainly I'm empathetic to the health reasons, but there are some companies it's accelerated and some not so much. It sounds like in your case it's accelerated, well, Alooba's business model, in terms of, you know, alleviating the CV and doing it remotely.

Yeah, absolutely.

I've got a few more questions before we end, and those are some higher-level questions I'm just thinking about, but before I go: where can folks reach you from a business perspective? If somebody's listening and thinking, Alooba, I've got to exchange with this business, I've got to see if it's a match for us, where do they go?

Yeah, they can go to alooba.com, that's a-l-o-o-b-a dot com. Typically we start with a quick discovery call, understand the business's current process and current pain points, and walk them through how we can help them. So probably best to go there and book a discovery call.

Okay, thank you for that, and we'll put that in the show notes. What does Alooba mean? Where'd you come up with that name?

You will laugh. Aloo means potato in Hindi. I'm a big potato fiend; if I could eat anything for the rest of my life, it would be potatoes. I guess it's my, I don't know, Scottish-Irish background. And my mom did make a lot of aloo gobi growing up. It's a very prominent Indian curry, so I love that. So that's the origin of it, aloo. And then the "ba" is just, you know, a random ending; it makes it sound cool, it was available, it was a name. This is what I came up with on an eight-hour flight back from Vietnam a few years ago. So there you go.

I am laughing. What's the attraction? Oh, look, I like potatoes too, but it's not like it's the center of my universe. What's the attraction in potatoes?

They are the be-all and end-all in my book. You can eat that and nothing else for the rest of your life and survive. It does what it says on the tin. It's nothing fancy, but it gets the job done. I think that says everything about Alooba.

Right, man, that's great. Hey, all right, so let me take it up a bit. I got a few more questions for you. One is just overall advice for hiring right now.
You're in the business. What is your suggestion? I mean, look, it's a, what is it, a buyer's market? It's a candidate market worldwide. I think right now we can't find the number of folks. Where did all the people go, by the way? Help us out here. Give us some advice.

Yes. So two things straight off the bat. Speed, speed, speed will be the first thing. Think about the hiring process and really think about how long each step is taking. Like, do you really need three people to interview a candidate? When you're trying to schedule an interview and you're waiting for three people's calendars to align, do you really need that? So think about each of the steps of the hiring process, remove any that don't actually add any value, and then try to truncate each of the steps. That would be the first one. For example, I'll just tell you how we do hiring. When we are doing the initial interview, we will just send candidates a link that allows them to book the interview directly into our calendar. There's no need for this back-and-forth alignment of diaries and those kinds of things. Just do things that are simple. The other one is to, I guess, hunt where other people aren't hunting. If there's a reasonably small, narrow set of candidates who you think everyone's gonna be fixated on, why not look where other people are missing? Part of our whole concept at Alooba is that there's a set of candidates who are fundamentally undervalued. It's like the Moneyball movie, if you've seen that, with Brad Pitt, right? Where they used to have all these different heuristic ways of trying to spot who was the best baseball player, based on, you know, how good their haircut was and the attractiveness of their girlfriend and these kinds of things, and they replaced it with actual data. And I think, because of that, they identified really high-quality players that everyone was missing. It's the exact same thing in analytics. There is a set of candidates that everyone is missing, and if you measure something well, then you can identify those candidates.

Are the old job postings of yesteryear not sufficient anymore? I've got a point of view on this. Well, let me just, I might as well just keep going, I don't want to answer your question, but I have been told that with the younger, less experienced generation in particular, it needs to be more personally tailored to them versus tailored to the company. Is that true?

I suspect that, but I have to say, from all our experimentation (we also, on the side, run an entire recruitment business, by the way, so we post a lot of job ads and cycle through a lot of variations to try to see whether or not that changes the application rate), I suspect if it does change it, it makes a marginal difference at best. What we can tell, certainly, from candidates is that often they don't actually read the job description that closely. Often candidates will be quite confused, once you speak to them, about who the job is with and what it's about. So we know that candidates just hit that apply button quickly. So I doubt that having different messaging and those kinds of things in job ads is going to make a tremendous difference to the application rate. It's going to be mainly driven by whatever the big platforms are doing, which is completely opaque anyway. Whether or not LinkedIn surfaces your job ad is not something you can control, nor can you really monitor it, because it's just completely opaque.
And then employer brand would be a huge one. If we think about the customers we deal with, those that get the most applicants are those that have the best employer brands, where everyone wants to work.

I know you're not a CV fan, for obvious reasons, and I get it, but if you had to design a CV (they're like a dime a dozen today), what would you do to stand out?

Okay. From a candidate's perspective, yeah, this is like, know the rules of the flawed, rigged game that you're in, I'd say. So know that, basically, at the initial CV screening step, someone is looking for something as close as possible to the job description you've applied for. Whether that's a hiring manager or someone in talent, that's what they're looking for. So you need to tick as many of those boxes as possible. If they want a certain level of experience, if they wanna see certain skills, make sure you're matching them. Unfortunately, that comes down to basically keyword stuffing. So, for example, if the job ad asks for Python, e.g. with packages like pandas, I would put both those words in your CV, because if not, you're going to rely on someone understanding that pandas implies that you know Python, right? So this kind of clever keyword stuffing is what I'd focus on. Obviously, make sure that it's well written, there are no grammatical errors, no spelling errors, because you could be arbitrarily excluded very easily, and then keep it to no more than two pages.

No, that makes sense. All right. I'm going to bring back a game to end with that I haven't done in a while. I hope you're game for it. It's "would you rather." You play that in Australia, right? Sure. Maybe?

Yeah. When I was drunk as a teenager, but sure, let's do it.

Well, so now you're an adult and you got to pick a side.

It gets harder as you get older, I think. That's the problem.

All right. Would you rather teach or run a business?

Run a business.

OK. Distributed team or local team?

Distributed.

It's probably because you got a distributed team. You gotta cater to them, though. All right, I guess this is the same question, but remote or in person?

In person, ideally. This whole idea that humans are going to interact digitally for the rest of our lives and we're never going to see each other again, I don't buy it.

So you're in person. Now, you realize you just said distributed team.

I know. Yeah. But still together, together in little pockets, I think, around the world, surely.

Coffee or tea?

Oh, coffee. Oh, my God. Three, four a day. Love it.

Now, and I don't actually know this, I've been to Australia a couple of times. Is Australia a coffee country?

Oh yeah. And we take that to a very, very smug level, where we will look down upon every other country's coffee, I'm sorry to say, especially America's. It's a fascinating business case to look into, Starbucks' expansion into Australia and how it catastrophically failed very quickly. You can look that one up.

I know the answer to this, but I'll ask it anyway. Personnel hiring via human or an algorithm?

Both at the moment, if that's not a cop-out. I think there are only so many things you can measure right now.

Yeah. If I didn't give you a cop-out, what side of the fence would you fall down on?

Algorithm.

Now, if you're gonna hire for your company, are you gonna hire for culture or skill?

Skill.

All right, you've played along. Any future bet you're gonna make around hiring? Like, where are we gonna be in three years? This isn't a "would you rather," it's just a final question.
So to me, the best analogy I can think of is the first time I went to Europe, in 2006, we used to book our accommodation by walking into each local city's tourism hub and saying, hey, are there any rooms available? Right? Ten years later, we have Airbnb, and the landscape of booking accommodation has changed dramatically. I think hiring will change to the same degree over the next 10 years, whereby matching someone to a role will be as easy as booking a hotel room. So I think there's gonna be a complete overhaul, and we'll look back at now and think, what on earth were we doing? That was crazy.

Fantastic. All right, I'm out of questions. I could keep going on, but Tim, thank you so much. Anything I missed, Kate, Tim? Anything you wish I would have asked?

No, you've given me a good working over. I appreciate that.

All right. Very good. Thank you for being here. I greatly appreciate it. Look, this is a topic of the day. I don't know if there's a hotter topic, and I'm not catering to the guest here. I mean, hiring right now, and I say this genuinely: look, my team is, we're pushing it, and, you know, we're going through analytics because we can't get people in fast enough. You know, I just can't. So you're in a good space. It's hot right now, man. Thank you for being here. I appreciate it.

My pleasure. Thanks for having me.

Podcast listeners, I want to say one last thank you to you. And as always, hit us up at almartintalksdata@gmail.com. We'll get back with you, we'll get you on, or we'll take your topic and we'll run with it. Thank you so much. Talk to you next time. Bye-bye.

Thanks for listening to the Making Data Simple podcast, where we make data fun. Be sure to visit ibmbigdatahub.com/podcasts to access the show notes and uncover even more great episodes. Remember, the views expressed here are those of the host and guests and do not necessarily represent the views of IBM. Until next time, over and out.