Canadian HR Reporter

March 2020 CAN

Canadian HR Reporter is the national journal of human resource management. It features the latest workplace news, HR best practices, employment law commentary and tools and tips for employers to get the most out of their workforce.

Issue link: https://digital.hrreporter.com/i/1212438

Monitoring > pg. 1

"…look wrong?' And that's what the AI is great at," he says. "When something looks unusual, that's generally what it's looking for, and/or very specific things… bullying terms, harassment terms, violence, sex, gambling… none of that stuff should be happening in your workplace."

But the new tools raise more than a few questions: Do they invade employee privacy? Do they contribute to a culture of mistrust? Do they inadequately try to do a manager's job?

"You do want to make sure that you're not swatting a fly with a sledgehammer where you're bringing in an intrusive system, poorly validated, in order to maybe, possibly, potentially give you some insight into a set of concerns that you might have been able to get at just by more sensitive and nuanced management work," says Chris MacDonald, an associate professor and consultant on ethics at the Ted Rogers School of Management at Ryerson University in Toronto.

Benefits to monitoring tools

On the other hand, AI in employee monitoring offers the opportunity for insight that would have been literally impossible previously, he says.

"The temptation is to want to do something quantitatively and at least semi-scientifically that would have taken really subtle and nuanced leadership skills before. So, in the past, HR issues were part of the art of management as opposed to the science of it, so it was incumbent on the leader to have the sense of the pulse of the culture of their organization and to be able to read between the lines. And this kind of technology at least claims to be able to say, 'No, no, it's not just a matter of the art of management… we can detect algorithmically whether there's a problem.'"

AI and machine learning learn and understand everybody's behaviour, setting a baseline, says Nourse.

"It creates a digital fingerprint for everybody. And that's what it's really looking for… a deviation from what your own personal norm is, but it's also looking for a deviation from the other people in your assigned group."
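Nourse's "digital fingerprint" maps onto a common anomaly-detection pattern: score current behaviour against a person's own history and against their assigned group, and surface only large deviations for a human to review. The sketch below is a minimal Python illustration of that pattern; the metric, names and thresholds are hypothetical and do not reflect any vendor's actual implementation.

    # Minimal sketch of baseline-and-deviation scoring: compare today's activity
    # to the person's own history and to their group's history, and flag only
    # when both deviations are large. All numbers here are made up.
    from statistics import mean, stdev

    def z_score(value, history):
        """Standard score of value against a list of past observations."""
        if len(history) < 2:
            return 0.0
        sigma = stdev(history)
        return 0.0 if sigma == 0 else (value - mean(history)) / sigma

    def flag_deviation(person, today, personal_history, group_history, threshold=3.0):
        """Return an alert string when today is unusual for both the person and their group."""
        personal = z_score(today, personal_history)
        group = z_score(today, group_history)
        if abs(personal) > threshold and abs(group) > threshold:
            return f"{person}: review manually (personal z={personal:.1f}, group z={group:.1f})"
        return None

    # Hypothetical metric: after-hours messages sent per day.
    alert = flag_deviation(
        person="employee_042",
        today=46,
        personal_history=[3, 5, 2, 4, 6, 3, 5],
        group_history=[4, 6, 5, 3, 7, 4, 5, 6],
    )
    print(alert or "within normal range")

Flagging rather than deciding echoes MacDonald's caution: an alert like this is at best a lead for a manager to follow up on, and thresholds of this kind would need validating against the sociocultural mix of the workforce before anyone acts on them.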
"It creates a digital fingerprint for everybody. And that's what it's really looking for… a deviation from what your own personal norm is, but it's also looking for a deviation from the other people in your assigned group." In using psycholinguistics, the AI is also watching and analyzing everyone, he says, looking for signs of disengagement, for example, which could lead to company theft or a valued employee quitting. "We always say, 'Try to handle it when it's an HR situation before it gets to be a criminal or fireable offence,'" says Nourse. "Companies spend a lot of money training people, getting them up to speed, and, sometimes, it's hard to know when someone's disengaged [and] they're starting to look around. So maybe they a population, and different types of cultures have very different kinds of linguistic patterns, he says. "You would want that to be taken into account both in terms of how an algorithm is trained up, but also in terms of how you're using it." And while people know they should be professional in their communications, sometimes they're less formal or restrained, which might not match up with the AI. "Even if you know that technically your employer owns your emails, that doesn't mean that employees actually expect their emails to be read for real. And so it raises worries, I think, about the extent to which an employer owns you during working hours. And part of that is the extent to which you're monitored," says MacDonald. If there's a problem, AI doesn't solve it, he says. "It gives you at best a lead and what you choose to do with that information is an entirely separate question, and one that's got to matter a lot, because, in a lot of cases — and this is true for consumer surveillance — it's not so much that we're worried about the information itself, it's [that] we're worried that someone's going to misuse it." One rule of thumb? Collect as little data as possible, and don't be very intrusive, says Modi. "Don't try to create a monitoring culture where everything people do, everything an employee does, is captured. Have some sort of measured approach where activities that have an impact could be measured, whereas the rest of FACIAL RECOGNITION SOFTWARE NOT POPULAR WITH WORKERS The least acceptable forms of surveillance: Source: Trade Union Congress, U.K. "A lot of people assume or are scared of AI taking the decisive role and automatically deciding that 'Oh, this team is not important.' That's not the role that AI needs to play." Anuk Modi, Status Today facial recognition software and mood monitoring (76%) monitoring of social media accounts outside of work (69%) recording a worker's location on wearable or handheld devices (67%) monitoring of keyboard strokes (57%) Monitoring > pg. 1
