As predictive algorithms and other data analytics tools become omnipresent, employers need to be cognizant of the risks of using software they didn't program themselves. Nathaniel Glasser, a member of law firm Epstein Becker Green's Employment, Labor & Workforce Management practice, spoke to Employee Benefit News about how employers can leverage these technologies without perpetuating biases found in the hiring process. This conversation has been condensed and edited for clarity.
Employee Benefit News: Where do employers start if they don’t have a data scientist or never implemented big data analytics?
Nathaniel Glasser: There are a number of companies that have technology options for use in HR, and they run the gamut from hiring through promotion, workforce analysis, retention and things of that nature. Companies that are interested in these techniques are typically looking for efficiencies. Employers are recruiting online, so they're receiving hundreds or thousands of applications for any given position, and their recruiters are overwhelmed with that first screen. One of the benefits being touted by the vendors is the potential to increase diversity among your hires. Along with efficiency, it's speed and the ability to focus your recruiters on other issues rather than the initial screen. Often, companies are looking at HR people analytics first from the hiring standpoint.
EBN: With that, have you seen a lot of companies who don’t know how to take their hiring technology further? Or is that where they’re getting stuck?
Glasser: I think a lot of companies right now are trying to get the lay of the land and see what they can do. For those companies that are using it in hiring, it's more of a phase-in process. They're using it at the hiring stage and seeing what it does for them. The other purported benefit of a lot of these technologies is that they're going to be predictive of future success. Some companies are implementing at the hiring stage and taking some time to evaluate whether it's really working. Are the hires coming out of this technology performing at a high level, and are they sticking around? How is retention with those folks? If they're finding a positive correlation there, then they consider other areas within HR where the technology might be used.
EBN: Do you think predictive algorithms are overblown, or do you think there is meaningful information that can be found with it?
Glasser: I think the answer lies somewhere in between. The EEOC is starting to look into this, and it has expressed legal concerns, or at least considerations, that employers should keep in mind when they're thinking about these things.
EBN: Like what?
Glasser: On the diversity point, some individuals who are pushing the use of big data analytics are hoping that this is some sort of panacea that's going to eliminate discrimination in the recruiting or hiring process, but I don't think that's accurate. The EEOC's concern is that the use of these technologies will just perpetuate the biases that are already inherent in traditional hiring practices. I think it's important for employers who are considering this to get the software, look at the algorithm and the data inputs, and make sure, as best as possible, that the data is being analyzed in an objective way.
It's really important for employers who are implementing to run an adverse impact analysis to see if there's an impact on any protected category, and if necessary, validate the algorithm or scrap it if it's not salvageable. Employers really have to take an honest look at what's occurring and whether it's actually providing the diversity benefit you want. If it's not, and all it's doing is providing some efficiency while increasing the legal risk, then it might not be the software for you.
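The adverse impact analysis Glasser describes is commonly run with the EEOC's "four-fifths rule": a selection rate for any protected group below 80% of the highest group's rate is generally treated as evidence of adverse impact. A minimal sketch, with illustrative group names and counts (the interview gives no numbers):

```python
# Hypothetical adverse impact check using the four-fifths (80%) rule.
# Group labels and counts below are illustrative, not from the interview.

def selection_rates(outcomes):
    """outcomes maps group name -> (selected, total applicants)."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Return each group's impact ratio and whether it falls below threshold."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    # Impact ratio: a group's selection rate relative to the most-favored group.
    return {g: (r / top, r / top < threshold) for g, r in rates.items()}

counts = {"group_a": (48, 100), "group_b": (30, 100)}
for group, (ratio, flagged) in adverse_impact(counts).items():
    print(group, round(ratio, 2), "flagged" if flagged else "ok")
```

Here group_b's selection rate (30%) is only 62.5% of group_a's (48%), so it falls under the 80% threshold and would warrant validating or scrapping the algorithm, as Glasser suggests.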
EBN: Are employers concerned that if they use these technologies, they'll open themselves up to a data hack and subsequent lawsuits if the algorithm turns out to be racist or sexist?
Glasser: Our clients have those concerns about every aspect of their data, regardless of what it is. A lot of companies that are using big data analytics might be integrating with an applicant tracking system (ATS) they already have in place. Hopefully they've already thought about these issues regarding their ATS and the data that's there. Aside from an external data hack, the other consideration is who has access to the data internally: not just potential theft of that data, but whether it will be used in a malicious way for a hiring or other employment decision, or whether somebody who shouldn't have access gets demographic data at an earlier stage of the process than they should. When we talk to clients, we talk about walling off segments of the data, depending on what it's being used for and who needs what pieces.
EBN: Is there any way to have that data de-identified?
Glasser: For an adverse impact analysis, for instance, you have to have the demographic data paired with whatever the results of the algorithm are. It’s not necessary to have the identities of the specific individuals because you’re looking at the gross numbers. To the extent that you can separate the two, we advise our clients to do that.
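The separation Glasser advises can be sketched as stripping identifying fields before analysis, keeping only each applicant's demographic category and the algorithm's result, since the analysis runs on gross numbers. Field names here are illustrative assumptions:

```python
# Hypothetical de-identification step: drop identifying fields ("name",
# "email") and keep only the demographic group and the algorithm's result,
# which is all the aggregate adverse impact analysis requires.
from collections import Counter

def deidentify(records):
    """Keep only the fields the aggregate analysis needs."""
    return [{"group": r["group"], "passed": r["passed"]} for r in records]

def gross_numbers(records):
    """Count outcomes per demographic group -- no identities involved."""
    return Counter((r["group"], r["passed"]) for r in records)

applicants = [
    {"name": "A. Smith", "email": "a@example.com", "group": "group_a", "passed": True},
    {"name": "B. Jones", "email": "b@example.com", "group": "group_b", "passed": False},
]
print(gross_numbers(deidentify(applicants)))
```

The point of the design is that the analysis pipeline never sees who an applicant is, only which group they belong to and how the algorithm scored them.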
EBN: What advice do you have for employers?
Glasser: What I always advise employers is to plan, plan, plan. Make sure all of the relevant stakeholders are involved in the implementation: the business folks, HR, legal counsel. Then, although it's not always possible, I prefer a gradual implementation of the software. For instance, with a client I'm working with, we're piloting the software without using it to make decisions. The traditional hiring methods are being used, but we're asking every applicant to run through the algorithm; this particular one requires applicants to take a test.
EBN: When you say you’re piloting it without using it, the algorithm isn’t being considered? You’re just seeing what it would have recommended?
Glasser: We're comparing the algorithm's recommended results with what actually happened. That lets us make a better-informed decision as to whether the algorithm is actually working, whether there's any adverse impact, and whether it's delivering the increased diversity the vendor promised. That has required a number of interactions internally and externally with the vendor. In most cases, vendors are willing to work with the company as a client to deliver good results. When you start early on planning and implementation, you should be able to get closer to that goal than if you just take something off the shelf.
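The pilot Glasser describes is sometimes called running in "shadow mode": the algorithm scores every applicant, but traditional methods make the actual decisions, and afterward the two are compared. A minimal sketch of that comparison, with made-up pilot data:

```python
# Hypothetical shadow-mode comparison: for each applicant, record what the
# algorithm recommended and what the traditional process actually decided,
# then measure how often they agree. The pilot data below is illustrative.

def agreement_rate(pairs):
    """pairs: list of (algorithm_recommended, actually_hired) booleans."""
    matches = sum(1 for recommended, hired in pairs if recommended == hired)
    return matches / len(pairs)

pilot = [
    (True, True),    # algorithm and recruiters agreed: hire
    (True, False),   # algorithm recommended, recruiters passed
    (False, False),  # both passed
    (True, True),    # agreed: hire
]
print(f"agreement: {agreement_rate(pilot):.0%}")
```

In practice this comparison would be paired with the adverse impact analysis discussed earlier, so the employer sees both whether the algorithm tracks its existing decisions and whether it introduces disparities.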