Good Morning, HR

In episode 153, Coffey talks with Brandon Jordan about the use of assessments in the employee selection and development process.

They discuss the different types of employee assessments available in the market and when each might be appropriate; the right questions to ask assessment providers; types of assessments and what they measure; the potential risks of using only cognitive assessments; the three kinds of validation; how to measure the effectiveness of assessments post-hire; complying with the Uniform Guidelines on Employee Selection Procedures; and the potential for bias in artificial intelligence implementations during the hiring process.

Good Morning, HR is brought to you by Imperative—Bulletproof Background Checks. For more information about our commitment to quality and excellent customer service, visit us at https://imperativeinfo.com.

If you are an HRCI or SHRM-certified professional, this episode of Good Morning, HR has been pre-approved for half a recertification credit. To obtain the recertification information for this episode, visit https://goodmorninghr.com.

About our Guest:

Brandon is an organizational psychologist and scientist focused on the study of people, attitudes, emotions, and behavior in the workplace. As a leader in talent management and organization development functions, he focuses on using practical, applied research and analytics strategies to build data- and evidence-based people practices, processes, and tools that strengthen organizations through their workforce.

Brandon’s deep areas of specialization and nearly 20 years of experience are primarily focused on employee assessment and employee engagement/experience measurement using psychometrics and people analytics methodology. His passion is to help businesses grow and develop through their people by helping employers 1) build their organization by hiring the right people, 2) sustain the development of the workforce, and 3) build better, high-performing organizations through employee listening strategies. Brandon founded Workforce Lifecycle Analytics, where he currently leads sales, analytics, and consulting, and previously worked for Willis Towers Watson, IBM, Kenexa, and Batrus Hollweg International.

Brandon completed his graduate training at the University of Tulsa, where he received his M.A. in Industrial and Organizational Psychology, and received his B.A. in Psychology from the University of North Texas. Brandon is a member of the Society for Industrial and Organizational Psychology (SIOP), the Society for Human Resource Management (SHRM), and DallasHR (Dallas SHRM Chapter), and is an active researcher in the academic community, participating in conferences and studies that contribute to scientific research.

Brandon Jordan can be reached at
https://workforcelifecycle.com 
https://www.linkedin.com/in/brandonjordantalent

About Mike Coffey:

Mike Coffey is an entrepreneur, human resources professional, licensed private investigator, and HR consultant.

In 1999, he founded Imperative, a background investigations firm helping risk-averse companies make well-informed decisions about the people they involve in their business.

Today, Imperative serves hundreds of businesses across the US and, through its PFC Caregiver & Household Screening brand, many more private estates, family offices, and personal service agencies.

Mike has been recognized as an Entrepreneur of Excellence and has twice been named HR Professional of the Year.

Additionally, Imperative has been named the Texas Association of Business’ small business of the year and is accredited by the Professional Background Screening Association.

Mike is a member of the Fort Worth chapter of the Entrepreneurs’ Organization and volunteers with the SHRM Texas State Council.

Mike maintains his certification as a Senior Professional in Human Resources (SPHR) through the HR Certification Institute. He is also a SHRM Senior Certified Professional (SHRM-SCP).

Mike lives in Fort Worth with his very patient wife. He practices yoga and maintains a keto diet, about both of which he will gladly tell you way more than you want to know.

Learning Objectives:

1. Understand the different types of employee assessments and their applicability.

2. Understand the importance of validation when using assessments in the employment context.

3. Evaluate the effectiveness of post-hiring assessments.


What is Good Morning, HR?

HR entrepreneur Mike Coffey, SPHR, SHRM-SCP engages business thought leaders about the strategic, psychological, legal, and practical implications of bringing people together to create value for shareholders, customers, and the community. As an HR consultant, mentor to first-stage businesses through EO’s Accelerator program, and owner of Imperative—Bulletproof Background Screening, Mike is passionate about helping other professionals improve how they recruit, select, and manage their people. Most thirty-minute episodes of Good Morning, HR will be eligible for half a recertification credit for both HRCI and SHRM-certified professionals. Mike is a member of Entrepreneurs Organization (EO) Fort Worth and active with the Texas Association of Business, the Fort Worth Chamber, and Texas SHRM.

Brandon Jordan:

There's certain traits that will predict performance within a certain context. Right? And so some of the work we do with validation, especially at a local job-family level, is all about that. It's establishing the job relevance from a legal standpoint, but also figuring out what are the specific traits, and at what level do those traits predict performance, for those different, you might say, environments, but really it's what they're doing in the job too.

Mike Coffey:

Good morning, HR. I'm Mike Coffey, president of Imperative, bulletproof background checks with fast and friendly service. And this is the podcast where I talk to business leaders about bringing people together to create value for shareholders, customers, and the community. Please follow, rate, and review Good Morning HR wherever you get your podcasts. You can also find us on Facebook, Instagram, YouTube, or at goodmorninghr.com.

Mike Coffey:

Employers are increasingly recognizing that the traditional hiring process isn't ideal. Generally, that's something like you post a job, you collect resumes, the hiring manager interviews a few potentially qualified candidates based on their education and their previous job titles, and then the hiring manager selects the one they like best relying mostly on their gut. In previous episodes of this podcast, we've talked about the shift away from credentials to competencies, the importance of cultural fit within an organization, and how behavior styles affect performance. But each of these factors can seem subjective. How do we measure them objectively and, heck, how do we even know what to measure?

Mike Coffey:

Well, joining me today to discuss the use of assessments in the employee selection and development process is Brandon Jordan. Brandon is an organizational psychologist and scientist focused on the study of people, attitudes, emotions, and behavior in the workplace. He's the founder of Workforce Lifecycle Analytics, a firm that partners with clients to grow their business through their people, focusing on talent management and organizational development initiatives and programs. He's also a returning guest. Back in 2022, we talked about employee engagement and quiet quitting.

Mike Coffey:

That's a throwback term now. It's, you know, throwback Thursday. We haven't heard quiet quitting in a while. So welcome back to Good Morning HR, Brandon.

Brandon Jordan:

Thanks, Mike. Appreciate the opportunity to talk to your audience about something that I'm really passionate about and, you know, share what I know about.

Mike Coffey:

And I know you are, and that's why I wanted to get you on here.

Brandon Jordan:

Yeah.

Mike Coffey:

So if you Google employee assessments, you're gonna find a ton of vendors who wanna sell you an assessment of some sort. Some of them just wanna sell you a one-and-done assessment over the Internet, and others wanna offer consulting to help you choose the right kind of assessment for whatever the need at hand is. So let's start by talking about the different kinds of assessments and situations where each might be appropriate for an employer.

Brandon Jordan:

Yeah. And I'd start, Mike, by saying if you Google employee assessment, a lot of stuff will come up. And I would say the vast majority, 80, 90% of those offering assessments online, are what I might call technology-first organizations, where they have a slick technology platform. They've had some leaders that founded that company but really didn't understand how to create a psychological assessment that's worth its salt, essentially.

Brandon Jordan:

So 80, 90 percent of the assessments that are on the market, I probably wouldn't advise people to pursue, honestly. And those can be any number of types of assessments too. I mean, we can talk about behavioral assessments, cognitive ability, skills assessments. There's so many under the umbrella of employee assessment that we could talk about. I would say the majority out there are what we might call personality or work-style behavioral assessments, and that's one area I have a deep specialty in, but I know a thing or two about almost all of them, at least.

Brandon Jordan:

But really starting there, the market is just saturated with what I would say are providers that, you know, have like a Ferrari body on the outside, the slick technology, but you flip the hood open and it's kind of a go-kart engine underneath there. Like, all of the best practices for establishing good psychometric measurement really aren't there.

Mike Coffey:

Well, that's interesting because that's the same thing in my industry. Right? I mean, you can go online and buy a background check, you know, for $30 or $50. And if you don't know the exact questions to ask, and, you know, how to challenge them, and if you can even get anybody in customer support to answer your questions when you wanna ask... You know, employers just don't know, and so they don't come to a company like Imperative that's really thorough and focuses on getting it done right until something's gone wrong.

Mike Coffey:

Right? And so

Brandon Jordan:

Yeah.

Mike Coffey:

You know, and I imagine in your world, what's happening is, you know, we've been using this assessment online, and we still got the turnover issues we've got, or we're still not getting the right fits, or people with the cognitive ability to do this role. And we've spent all this time and money on turnover, and now we need to figure it out. And, you know, you're probably like me. Your best clients are the ones who've been burned doing it cheap before.

Brandon Jordan:

Yeah. Yeah. Absolutely. The cost of buying something that's not reputable and causing more pain than solving the issue you're trying to solve for can be pretty painful for a lot of clients.

Brandon Jordan:

And you touched on something there: knowing the right questions to ask. When I talk about this, I go and speak at, you know, SHRM chapters and conferences on this topic, and I really try to educate HR professionals on what questions you should ask. I'm happy to share a couple of those real quick. Right? So the first question I would ask any assessment provider, especially a behavioral assessment provider, is: do you have a technical validation manual?

Brandon Jordan:

Do you have a document that shows the science, that shows you followed best practices and created this thing in a way that establishes its validity and reliability? Again, kind of going back to my previous comment, 80, 90% of the companies out there won't have this readily available. And some do. You can go to their website and you can download it. You don't even have to ask the company.

Brandon Jordan:

They'll allow you to download it from their website. That's question number one. Another thing that's more difficult for HR practitioners to really know is how good the technical manual is. It's really hard. You'd need a little bit more time than we have today to educate on what reliability and validity really are.

Brandon Jordan:

But if they have that first thing in place, that technical manual, that's a green flag. Right? That's a green flag that you move forward. There's assessment providers out there that will say they have a technical manual, and it'll be more of a white paper and not really the analysis and validation. It'll just be something written up.

Brandon Jordan:

It kinda sounds sciency, but there's no real numbers behind it, no real empirical evidence. So I've seen that as well. They pass it off as a technical manual, but it's really not. Another question is, you know, do you have an organizational psychologist on staff? Did an organizational psychologist at least help you develop this?

Brandon Jordan:

Those are some good markers of a reputable assessment. And I wanna dive into validation, but let's... okay. So you mentioned behavioral assessments.

Mike Coffey:

So when you're talking about a behavioral assessment, what are we really measuring there?

Brandon Jordan:

Yeah. So it really comes down to personality, I would say. And

Mike Coffey:

The Big Five, those kind of things, OCEAN.

Brandon Jordan:

The Big Five. Exactly. So let's go into some green flags for behavioral assessments. Right? And I wanna talk about it in terms of two different classes of assessments.

Brandon Jordan:

You have typology assessments, and those are the assessments on the market that people know the most. Right? So think Myers-Briggs, DISC, StrengthsFinder. Those are more typology-based assessments. They use what we call ipsative measurement, where it's a forced choice rather than a Likert scale.

Brandon Jordan:

And the virtue of them is they paint in pretty broad strokes and bucket people into these different types. The advantage there is it's really easy to implement, really easy to understand. The truth of the matter is, human behavior is much more complex than that, right? It's much more nuanced, and it's also really dependent upon the environment. Some of those assessments will even measure you and give you output on how you are normally versus how you are at work, for example.

Brandon Jordan:

Right? You take the most extroverted person in the world: if they're at a funeral, they're probably going to taper down their extroversion in that environment. Right? So there's a lot of concomitants that come into play. But the typology assessments are good for almost a singular purpose: it's more for development and team cohesion, and they really shouldn't be used for talent decisions in the workplace.

Brandon Jordan:

In fact, you can get into some legal trouble if you do implement those assessments for talent decisions. It's not illegal to do, but if you're challenged... in the back of the technical manuals of these assessments, in fine print, it says you shouldn't do this.

Mike Coffey:

Yeah. And if you look close enough on a lot of their sites, and again, like you say, it's in small print there. There are people out there selling those things, DISC or whatever, some version of DISC. And if you read everything on the website, you'll find that paragraph that says this is for development and, you know, team building, those kinds of things, and shouldn't be used to make an employment, hiring, or selection decision. But, you know, it's readily available, and we've all heard about it.

Mike Coffey:

And, you know, we all know what our DISC is. I live by my DISC style. I mean, I know exactly what it is and what everybody on my team's is. But they do tend to say, if you look carefully, don't use this in an employment context. But then you hear a lot of people saying, well, we use them in an employment context, because, quite honestly, sometimes you happen to get it right. You know, correlation isn't causation, but sometimes you just say, okay, I look at DISC and I see the people who've worked best in this organization in this particular role have a low D, a low I, a high S, a high C. So you can get it.

Mike Coffey:

But like you say, I wanna get into some of the EEOC issues and the uniform guidelines. You know, it works until it doesn't, until you're creating a disparate impact. So, anyway, that's behavioral assessments. You mentioned typology; then there was the other kind. So let's talk about that.

Mike Coffey:

Right.

Brandon Jordan:

Yeah. And, you know, I would largely say we'll just talk about it in terms of some more green flags. Right? So if I was to recommend somebody use a behavioral assessment, the first thing I would ask a provider is, you know, is your assessment rooted in the five-factor model, the Big Five personality model, which has the acronym OCEAN: openness to experience, conscientiousness, extroversion, agreeableness, and neuroticism, which is emotional stability. And those all have sub-facets.

Brandon Jordan:

Like I said, personality is very complex. Right? There's sub-facets under each one of those, what you might call meta-traits, that we can measure. And so, is it rooted in that empirical model? That's what we've found to be the strongest model to actually measure behavior in people.

Brandon Jordan:

Beyond that, it's some other things. Right? So it can be rooted in that. Sometimes your model comes out different; our model has one extra overall factor. Take the Hogan assessment suite.

Brandon Jordan:

The HPI is rooted in the Big Five, for example. There's a number of assessments on the market that are rooted in that. Then it's: does it have the evidence that it predicts performance? Is it job related? And now we're bumping up into things that are both best practices and also, I wouldn't say legally mandated, but markers for legal protection under the EEOC guidelines and, ultimately, case law that goes all the way to the Supreme Court, for example.

Mike Coffey:

And so there's behavioral assessments, and then we've got cognitive. And, you know, we've used cognitive assessments for probably 15 years at least, and we found those to be really valuable, especially as we've hired more remote employees who we may not have the ability to work side by side with and really develop and train. So finding people who can get up to speed quicker. So talk about cognitive assessments and the pros and cons there, what we should be looking for when we're looking at those.

Brandon Jordan:

Yeah. I think before I talk about that: what I would recommend to anybody looking to implement an employee assessment is you never want to use one methodology as your kind of end-all be-all, right? What we find in the scientific evidence is if you bring multiple methodologies together, the likelihood that you're making the right pick for the high performer increases twofold. Right? So using a trait-based behavioral personality assessment along with a structured interview and a cognitive ability assessment, for example, you're really increasing your likelihood of picking the right person.

Brandon Jordan:

That being said, what we often recommend with cognitive tools is to use them in concert with other things. I would probably never implement, or recommend implementing, a cognitive tool as a standalone for an employee talent decision for a client, because it has some issues that bump into some legal risk. We find this cross-culturally quite a bit as well, but there are race and ethnicity differences with cognitive ability measurement, and some of this is rooted in socioeconomic things as well. But, in essence, you know, you're bumping up on EEOC risk with that for protected groups.

Brandon Jordan:

So we find that, for example, just kinda going down the list, people from an Asian background or a Jewish background tend to be at the top, and white, Caucasian follows after that. And then towards the bottom, we find Black, African American; and Hispanic people tend to kind of be across the spectrum from top to bottom there. And so implementing those without some kind of counterbalance, to say, okay, let's get a fuller picture of this person on whether they're gonna be a good performer, a good fit for this, across multiple methodologies, is usually recommended just in general, but also because of some of the risks in using the cognitive measures.

Mike Coffey:

Yeah. And combined with the behavioral assessment and, like you said, that structured interview. I mean, I've always used the behavioral assessment primarily to help me design the structured interview for that person. You know? So if I see a behavior pattern that's contrary to where we would normally go, but the person's otherwise fully qualified and in the right range cognitively from the assessment, then I'm going to use all those results to say, okay.

Mike Coffey:

Here's the kind of questions we wanna follow up on. Let's peel this onion a little deeper and figure out what's really going on here. And on the cognitive side, you know, what I found is that there's a range there. And if somebody scores higher than that range, we have less success, too. You know, especially in our data, in our, you know, high-level data entry roles.

Mike Coffey:

I guess in some roles, I've never had this problem, but you can be too smart for your own good. And, you know, maybe they just haven't performed as well. But we've really narrowed down over 15 years that range for that cognitive assessment. So we know if they score in this range, we're probably gonna have success. But if they're above it or below it and everything else is right, that's not gonna eliminate them from consideration.

Brandon Jordan:

Right. Yeah. And I wanna go back and say it's not just legal risks. Like, you know, when implementing these, it might behoove you strategically, if your organization isn't very diverse, to make sure you're using multiple methods, so you're making sure you're bringing some of that diverse background into your company.

Brandon Jordan:

But also

Mike Coffey:

And you can make it harder on yourself to hire people by having a test that's, you know... you can get overly dogmatic about this and then just make it really hard to hire people.

Brandon Jordan:

And I've been hearing, I'm a bit addicted to TikTok for reasons, but I've been hearing some buzz from some people talking on TikTok about people that are high IQ not being neurotypical. Right? And so, is that gonna eventually fall under the kind of ADA-type stuff, perhaps? I don't know, because there's evidence out there to say people that have higher IQ tend to struggle socially and have a number of other, like, anxiety problems and things like that too, right? So there's these concomitants with people at the higher end of the cognitive ability spectrum, and you're right.

Brandon Jordan:

They might not be as successful for multiple reasons. And, you know, that's not really fleshed out yet, but I'm starting to smell it in the air a little bit there.

Mike Coffey:

Well, and, you know, we did an episode earlier this year about hiring people who are neurodivergent and the benefits that employers have found working with those folks. And not every job requires you to be able to look somebody in the eye and, you know, have a firm handshake. And maybe in some cases, the ability to really focus on a problem, exclusive of the environment around you, is a real asset.

Brandon Jordan:

Absolutely. I'll go back to another green flag that also aligns with legal requirements: establishing job relevance is the legal requirement for implementing an assessment. But beyond that, it's rooted in science as well. One of my advisors in grad school had something called trait activation theory. And really what that says is that there's certain traits that will predict performance within a certain context.

Brandon Jordan:

Right? So an example might be a bartender. Right? You might need to be higher in sociability, higher in extroversion, for that role in that environment, whereas for an accountant, detail orientation needs to be high. Sociability might actually detract from performance in that role.

Brandon Jordan:

And so some of the work we do with validation, especially at a local job-family level, is all about that. It's establishing the job relevance from a legal standpoint, but also figuring out what are the specific traits, and at what level do those traits predict performance, for those different, you might say, environments, but really it's what they're doing in the job too. Right.

Mike Coffey:

You're not gonna hire somebody with low sociability and low dominance, let's say, for an outside sales role that's kind of a high-pressure sale, but they're maybe perfect for an in-house sales role for a really technical role. And so you've really got to think through those. So that's behavioral. We talked about cognitive; just on measuring it, say what it means when we say we're measuring cognitive ability.

Brandon Jordan:

Yeah, so it's the mental horsepower. It tends to be pretty correlated with job performance historically. Both measures do, but cognitive ability historically has been a stronger predictor of job performance. It's that mental horsepower, right? And it can be measured in multiple ways. So you have things like the Watson-Glaser, where it's reasoning, verbal ability, math ability.

Brandon Jordan:

These all tend to get at what scientists call g, general mental ability, or GMA. And there's multiple avenues to measure these things, but all of those methods tend to get at what we're really looking for: how quickly can this person learn new things and perform fast, and how much can they learn? Right? And so that's essentially what those things tend to predict. And the outcomes we see in the scientific literature are job performance and training performance, for example.

Mike Coffey:

And so behavioral and cognitive. Yeah. What other kinds of assessments are out there that either seem to you to be just voodoo or maybe have some reliability behind them?

Brandon Jordan:

So some voodoo stuff that's out there: it's not really prevalent in the United States, but people have done handwriting stuff in the past. It's actually pretty prevalent, I think, in India. Maybe not prevalent, but it's more popular there. People will even, like, look at the lines in your hands and things like that. I came across somebody recently.

Brandon Jordan:

He's a professional speaker, and he's going around telling people that the facial structure of people is related to personality. And it kind of reminds me of phrenology, like when you read the bumps on your head.

Mike Coffey:

And I know who you're talking about. Yeah. I've seen it a couple times. Yeah.

Brandon Jordan:

Yeah. And so some of that's kind of wild to me. You know, there might be something to it. I'm not gonna say there's not, but there's just no scientific evidence, I think, that exists for any of that stuff, truthfully. Right?

Brandon Jordan:

And I even run into people that won't endorse personality assessment, for example. But there's a mountain of a hundred years of scientific evidence to the contrary: that we can measure these things. It is difficult. It is hard.

Brandon Jordan:

And oftentimes we get false positives and false negatives with our measurements, but, you know, we've gotten pretty good at it still, and there's something to it. You know?

Mike Coffey:

And let's take a quick break. Good Morning, HR is brought to you by Imperative, bulletproof background checks with fast and friendly service. For many employers, any review of a candidate's social media profile is taboo. That is a privacy line they are simply unwilling to cross.

Mike Coffey:

But within the confines of Title VII and other anti-discrimination laws, a candidate's off-duty conduct, including how they conduct themselves on social media, may be relevant. For example, if the employee engages in illegal or negligent conduct on the job that might have been predictable, and therefore preventable, had the employer simply reviewed the publicly available information about them on the Internet, the plaintiff's lawyers will certainly be using that information in a negligent hiring suit. For some positions, threatening, coercive, or bigoted online behavior may be a legitimate concern. In other instances, references to personal drug use or criminal behavior may be relevant. Imperative helps clients conduct necessary online due diligence while building guardrails around the process to ensure that irrelevant or potentially discriminatory information isn't provided to decision makers.

Mike Coffey:

You can learn more about the many ways Imperative helps our clients make well-informed decisions about the people they involve in their business at imperativeinfo.com. If you're an HRCI or SHRM-certified professional, this episode of Good Morning HR has been preapproved for three-quarters of a recertification credit. To obtain the recertification information, visit goodmorninghr.com and click on research credits, then select episode 153 and enter the keyword validation. And if you're looking for even more recertification credit, check out the webinars page at imperativeinfo.com.

Mike Coffey:

And now back to my conversation with Brandon Jordan. So when we're talking about validation, I think you mentioned construct earlier, but under the uniform guidelines there are three kinds of validation, I think, that are recognized. Can you just kinda go through, when we say it's validated, what does that really mean, and what are the kinds of validation?

Brandon Jordan:

Yeah. So what you're looking for in a technical manual is the construct and content validity. So did the scientists, the organizational psychologists, create this to measure what it says it's measuring? Right? So the five-factor model, for example: we have those five factors.

Brandon Jordan:

Those emerged empirically from the data in what we call factor analysis, right? And so whenever we do a factor analysis amongst, you know, 300 to 100,000 people that take a personality assessment, some form of that emerges from the data in almost every data set. And so those factors tend to be what you call orthogonal, or independent of each other. There might be a little overlap between them, but they're not measuring the same thing. So being extroverted is not the same thing as being conscientious, at the highest level.

Brandon Jordan:

Then, you know, there are sub-facets. Can you demonstrate in the data that these things are independent of each other? You brought it up earlier: two sub-facets of extroversion are sociability and dominance, for example. Those tend to be orthogonal to each other within that factor but still overlap within the overarching factor of extroversion. So like I said, it's a complex thing to measure all of this stuff with all the interrelated connections, but that's construct validity. Are we measuring these things in a way that is reflective of real-world behavior?

Brandon Jordan:

And there's all kinds of interesting studies on this. Right? So for example, there are longitudinal studies that measure how long babies cry when they're little, and that tends to be correlated with adult levels of emotional stability and neuroticism, for example. So if you cry more, you have lower levels of emotional stability as an adult. There's all kinds of interesting studies like that that verify that we're measuring what we're intending to measure.

Brandon Jordan:

Right? So construct validity in the tech manual. And then you wanna really, really look for criterion validity, or what I would describe as: are these things correlated with something in the real world that's of importance? Right? So within the employee context, are your measures correlated with employee performance? Are they correlated with organizational citizenship behaviors, being nice to other people?

Brandon Jordan:

Are they correlated with turnover or retention? So is there a criterion that the assessment is tied to that it tends to predict? And not only is that a good thing to implement an instrument for your organization, there's Supreme Court case law that says, you know, that's a practice that really needs to be in place. So job-relatedness came from case law.
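
A criterion-related validity check is, at its core, a correlation between assessment scores and a later outcome. A minimal sketch with simulated data (the 0.4 relationship is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300  # incumbents with both assessment and performance data

# Hypothetical data: the assessment partially predicts later performance
assessment = rng.normal(size=n)
performance = 0.4 * assessment + rng.normal(scale=0.9, size=n)

# The criterion validity coefficient is the Pearson correlation
# between the assessment score and the job-performance criterion
r = np.corrcoef(assessment, performance)[0, 1]
print(f"criterion validity r = {r:.2f}")
```

In practice the criterion would be real data (performance ratings, retention, citizenship behaviors), and the coefficient would be documented in the technical manual.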

Mike Coffey:

Mhmm.

Brandon Jordan:

Is it correlated with and predicting performance? That comes from case law as well. Not only were the scientists starting to do this, and that's how it got brought about in those court cases, but the courts were like, oh, you know, you did the best practices. You did the things that you needed to do. We're good with that. And in fact, it's built into case law now.

Mike Coffey:

And so we get construct and criterion. And then I think the third one is the tricky one: we can find an assessment and try to shoehorn it into a certain job, but an accountant in a big accounting firm may be very different than an accountant working in a corporate accounting office. So the behaviors needed in one role, even one with very similar skills, may be very different in a similar role in a different environment. What do we do to tie it back and make sure that we're really picking the right thing for that job?

Brandon Jordan:

Great question. Right? So another green flag to look for in an assessment organization is: do they have an assessment in place for this specific job family? And what we do there is what we call validity generalization. Right?

Brandon Jordan:

So for my company, for example, if we've done, you know, ten criterion-related validity studies for clients in customer service, we're looking across ten different companies and a thousand different employees where we have both the assessment data and that employee outcome, that performance data, and where we've demonstrated that it's predictive, that it's job-related, that it's related to those roles. We can make the logical leap there and generalize those findings, that validity, to another customer service organization. There's sufficient evidence that this validity generalizes to your role in your new company. But you are right, there are extenuating circumstances.

Brandon Jordan:

Right? So, like, an accountant within a company that's, you know, 50 people is very different than a very specialized accountant doing, you know, forensic accounting in one of the big accounting firms. Right? That's more generalist versus a very technical role, and we'd probably wanna have that conversation with that client and say, hey, it might not be a one-to-one match for you here.

Brandon Jordan:

We would recommend probably doing some job analysis, kind of figuring out some of the nuances here on why it might be different, and maybe recommending we build a specific scoring algorithm for your roles for an assessment. So a customized assessment algorithm for your company, perhaps, that's localized rather than generalized validity: specific validity for your employee population. So there are conversations to be had there. Can we do a little bit of job analysis to generalize the validity, or can we say, hey, you can buy this off the shelf, and we're pretty confident that it's gonna map 80, 90% onto the behaviors that are needed for that role?

Mike Coffey:

So once we've implemented an assessment, you know, we always do the plan, do, check, act thing. And so when we get down to the check part, we've implemented it and now we want to see: did this even work for us? When do we do that, and what kind of measures do we use to fairly evaluate it?

Brandon Jordan:

Yeah. So, you know, there are a number of things you can do and measure, and the client organizations can measure as well. But I'll start by saying we follow up in aggregate. We're not trying to get every single one right. That's not the name of this game.

Brandon Jordan:

The name of the game is: can we not select the worst two employees like you did in the past, and instead select two of the best, increasing your selection rate by 40%, maybe? You know, that's great. I mean, it's kinda like gambling, in a way. Right? You're trying to beat the house here, just by a few percentage points if you can.

Brandon Jordan:

Right? So if we can cut out the worst 10, 20% of hires and therefore improve the top end by that much, we're doing great. And it's about that long-term game. And for us to establish that, we need a sufficient sample size to prove it analytically. Right?
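
The "beat the house by a few points" idea can be sketched with simulated data: even a modestly predictive score (a hypothetical 0.3 weight here) noticeably raises the average performance of hires when you select top-down instead of at random:

```python
import numpy as np

rng = np.random.default_rng(0)
n_applicants, n_hires = 1000, 100

# Hypothetical applicant pool: the score is only modestly
# predictive of later job performance
score = rng.normal(size=n_applicants)
performance = 0.3 * score + rng.normal(scale=0.95, size=n_applicants)

# Compare hiring at random vs. hiring the top scorers
random_idx = rng.choice(n_applicants, size=n_hires, replace=False)
top_idx = np.argsort(score)[-n_hires:]

print(f"random hires' mean performance:    {performance[random_idx].mean():.2f}")
print(f"top-score hires' mean performance: {performance[top_idx].mean():.2f}")
```

Run over many hires, that aggregate edge is the long-term game Brandon describes, even though any individual hire can still go either way.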

Brandon Jordan:

So with our clients, what we do is we'll follow up and have a hiring review, sometimes six months to a year to two years out, to check in with them and say, okay, let's look at fairness, bias, adverse impact evidence. Let's look at compliance for your hiring managers and your recruiters. Are they actually hiring the people that the assessment's recommending, or are they not doing that at all and just going with their gut?

Brandon Jordan:

So we go in and evaluate some of those things with every client. Right? We can easily get the data and be able to do that. Beyond that, we can follow up and do a predictive validation study once we have a hundred of those people, incumbents who have been on the job for at least a year, at least a hundred people and hopefully up to 300. We can get their performance data and their assessment data and say, okay, is this still predicting performance?

Brandon Jordan:

Sometimes there are elements of the job that change over time, and the predictive model might change a little bit. It's like, oh, you implemented this new technology. People aren't knocking on doors anymore; they're actually using LinkedIn to do sales and things like that. I just kinda plucked that out of the air, but whatever changes the environment of that job tends to change what predicts performance for it sometimes.

Brandon Jordan:

Right? And so we can have that discussion and say, okay, why isn't this specific trait predicting anymore? Right? And sometimes that happens.

Brandon Jordan:

So follow up, do another validity study. Another thing clients can do is just, you know, measure some of that training and performance stuff. And a lot of clients don't do a great job of measuring performance. Usually, we come in and measure their performance for them.

Mike Coffey:

Yeah. And that, I mean, you know, we can close here with the Uniform Guidelines and disparate impact under Title VII, but it goes back to the bigger issue: we need to have good performance measures in our organization anyway. And if we make it hard to hire people but we also don't manage performance well, the quality of our data when we go back to evaluate our processes isn't gonna be very good. So, you know, I don't know if a hiring process is working well if my managers aren't actively managing and supervising their employees and following the processes that we've laid out.

Mike Coffey:

It's harder to measure, and so we make it harder. You know? So no tool in the world is gonna really help you if you're not gonna have good supervisors, if you're not gonna invest in training and making sure that they know how to use the tools and, where there's a challenge, how to bring performance up and reward and incentivize people, all those things. But let's wrap up with the Uniform Guidelines I mentioned, the EEOC's Uniform Guidelines on Employee Selection Procedures.

Mike Coffey:

Kind of run through the risk there. I mean, because it's really, you know, a disparate impact on a protected class. Right?

Brandon Jordan:

Yeah. That's right. And, you know, the three big buckets there tend to be age, ethnicity, and gender. And that's expanded here recently on the gender side. And there's a difference between the EEOC and the OFCCP, for example. The OFCCP is the federal government's

Mike Coffey:

Contractors. Right?

Brandon Jordan:

Similar to, yeah, the EEOC. Any organization that's doing any work with the federal government falls under the OFCCP, and there are more requirements for that. They have to proactively produce reports and file with them on affirmative action and things like that, for example. So in general, the OFCCP tends to be more proactive; the EEOC is reactive. So anytime I've been involved with any client interaction where there's been an EEOC challenge, that means there's been an employee who felt like they were discriminated against and filed a claim with the EEOC.

Mike Coffey:

And discriminated against because of being in a protected class, that there was, you know, a disparate impact: your test is pushing me out because of my age or my race or whatever. Yeah.

Brandon Jordan:

Exactly. And they would potentially come to our organization, when I was with IBM and Kenexa, and say, hey, you know, we have this claim. And essentially what we would do, if it was a localized validation study and job analysis that we did for a client, is write up technical documentation on it all. It was a 10- to 20-page sleeping pill is what it was, you know, real sleeping material there.

Brandon Jordan:

And you send that in to the EEOC, and it outlines all of the things that you did from a job analysis standpoint to establish job relevance, and all of the data analysis that you did to establish the criterion validity. And, essentially, they don't have much of a case if it's established as a good assessment, and the validity study you did isn't 25 years old, as well. And, you know, doing that is pretty much an insurance policy to protect yourself. And not only that, it's best practice.

Brandon Jordan:

Right? You wanna be sure that the things that you're doing are, you know, impacting your organization, predicting performance, and being fair. And so there's the four-fifths rule and a number of other statistical ways that we can go in and determine whether an assessment is having any kind of impact like that. And real quick, I just wanted to mention, I know you're a big AI proponent, artificial intelligence proponent, Mike.
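
The four-fifths rule Brandon mentions compares each group's selection rate to the highest group's rate; a ratio below 0.8 is taken as evidence of potential adverse impact. A minimal sketch with made-up applicant and hire counts:

```python
# Hypothetical hiring data: applicants and hires by group
applicants = {"group_a": 200, "group_b": 150}
hires      = {"group_a": 60,  "group_b": 30}

# Selection rate = hires / applicants for each group
selection_rates = {g: hires[g] / applicants[g] for g in applicants}

# Impact ratio = each group's rate relative to the highest rate
highest = max(selection_rates.values())
impact_ratios = {g: rate / highest for g, rate in selection_rates.items()}

for g, ratio in impact_ratios.items():
    flag = "OK" if ratio >= 0.8 else "potential adverse impact"
    print(f"{g}: rate {selection_rates[g]:.2f}, ratio {ratio:.2f} -> {flag}")
```

Here group_b's rate (0.20) is only two-thirds of group_a's (0.30), so it falls below the four-fifths threshold and would warrant a closer statistical look.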

Mike Coffey:

But there's some really bad stuff in the AI world right now, I think, related to this. Yeah. Talk about that.

Brandon Jordan:

Yeah. And so over the last 10, 15 years, what's happened is essentially people have built these AI tools, and they've been built by people. And what happens is, it seems the AI machine learning algorithms get in there and perpetuate our own biases, almost. You know? They

Mike Coffey:

Bias in, bias out. Yeah.

Brandon Jordan:

Yeah. Exactly. Google famously shut off an algorithm they had that was selecting engineers for them. It was super biased towards men, super biased towards, I think, some kind of sport for

Mike Coffey:

Men. Yeah. Yeah. Was it Google or was it Amazon?

Mike Coffey:

But they had a, yeah, even when they took out the names and any references they could think of to gender, it was recognizing that if they played softball, they're probably female, and things like that. Yeah. So, you know, correlation's not causation. And so that's the problem.

Mike Coffey:

It wasn't intentional bias on the part of, you know, whoever the employer was, and it wasn't intentional bias on the part of the AI. It's just gathering data and doing it. I'll tell you the one that scares me: there's a company out there that does online recording of interviews. You know, they post a question and you record your answer. And, basically, they're claiming that they can use AI and neurolinguistics. You're going to create a bias automatically against neurodiverse people.

Mike Coffey:

And I was talking to somebody at a dinner party in New York a couple weeks ago, an MIT guy who's working on that. And I said, so how are you adjusting the facial recognition for people with darker skin? And the answer was, well, we haven't fixed that problem yet. And so they're out there raising capital and testing this product, but this company hasn't rolled it out to market. But I've seen similar products on the market now, and I think that's gonna get some people in trouble.

Brandon Jordan:

Yeah. And I think there have already been some out-of-court settlements for things like that in the last 10, 15 years for certain companies. And right now, what we're seeing at the state and city level is a lot of legislation. Colorado just had something. I read an article yesterday about a big push they're coming out with.

Brandon Jordan:

California, the same. New York City has already passed something that's one of the most stringent, but it also seems to have the most loopholes as well. Like, it's kind of EEOC-esque, where it doesn't really make an impact unless it's challenged, essentially. But, yeah, all that stuff is coming at the state level for the AI stuff related to employee assessments, and it's gonna be exciting.

Brandon Jordan:

Maybe it's gonna be scary.

Mike Coffey:

Well, we'll we'll pay attention and we'll get you back to talk about it some more. But that's all the time we've got. Thanks for joining me, Brandon.

Brandon Jordan:

Yeah. Absolutely. Thanks for having me.

Mike Coffey:

And thank you for listening. You can comment on this episode or search our previous episodes at goodmorninghr.com or on Facebook, Instagram, or YouTube. And don't forget to follow us wherever you get your podcasts. Rob Upchurch is our technical producer, and you can reach him at robmakespods.com. And thank you to Imperative's marketing coordinator, Mary Anne Hernandez, who keeps the trains running on time.

Mike Coffey:

And I'm Mike Coffey. As always, don't hesitate to reach out if I can give service to you personally or professionally. I'll see you next week, and until then, be well, do good, and keep your chin up.