Canadian HR Reporter is the national journal of human resource management. It features the latest workplace news, HR best practices, employment law commentary and tools and tips for employers to get the most out of their workforce.
Issue link: https://digital.hrreporter.com/i/1212438
"…them are left to human nature."

It's really about the proportional collection of data to build trust, he says. "How you do your work should largely be left to you to figure out… Data collection should almost exclusively be on a metadata level, rather than on the content and the depth of it. And that's largely because there are a lot of cultural differences, there's a lot of biases in the content. Staying on the metadata at least provides some level of objectivity to say you should not be reading people's emails or files or things like that, you should stick to a very top-level, objective view of inbound and outbound communication, content activity or HR data as it might be."

Intrusion and transparency concerns

Of course, there's also the issue of people knowing they are being monitored all the time, says MacDonald. "Legally being allowed to do something doesn't mean that it's OK to do it… That comes down to things like the reasonable expectations of privacy, the extent to which you trust your employees and want to convey to them that you trust them," he says. "It's a pretty serious choice to make as to whether you're going to use this kind of technology. And, more specifically, how you're going to use it, what purposes are you going to use it for? And what human systems are you going to have in place to deal with and respond to that data?"

It's only fair for people to know that they are being monitored, says MacDonald. "As an employee, if I found out my employer was using some kind of data mining to look at email patterns to see if there was anger or disaffection or some kind of disgruntlement in a way that would help them fix problems, I might be OK with that."

While there's definitely a benefit to being able to predict or gauge when an employee might jump ship, based on their activity and communications, he says, what's the cost of that monitoring? "There is something to be gained by illegal searches by the police, too… but we worry about it. When it becomes intrusive, we worry about abuse of power and a whole bunch of related ethical issues."

If an employer is using an AI tool to monitor employees, the privacy analysis really wouldn't be any different than usual, says Suzanne Kennedy, a partner at Harris & Company in Vancouver. "Transparency and reasonableness are the key fundamental principles in privacy," she says. "By way of an example, in B.C. in the private sector, employers are allowed to collect employee information without their consent, but they have to tell them what it is they're collecting and how it will be used."

So, if an employer is using AI for monitoring, it would be really important to make sure that there are policies in place and that it's transparent with employees about what it's doing, what's being collected and how it may be used, says Kennedy. "The best recommendation to an employer going down this path is just to make sure that they do their homework first and that they've thought through: 'OK, first of all, why are we using it? Is the information we're collecting going to be useful for those purposes? Are there other less invasive ways to do this? Or are there ways we can utilize this technology that are less invasive? And are we currently communicating with our workforce, so that people aren't unpleasantly surprised that this is what we're doing?'"

In general, people are more comfortable with being monitored because it's become so commonplace, says Nourse, citing Google's surveillance as an example. And if the AI is looking at everybody across the board, then you're not running the risk of discrimination or the like. "It's [about] 'We watch everybody, we want to make sure everybody's playing by the rules and doing good… It's not like we're going in and watching everything everyone's doing.'" The software preserves people's privacy within a work environment as much as is reasonable, he says, "but it triggers when something unusual happens."

Also of note: The data collected should not be exclusively visible to management, says Modi. "While there might be different legal answers to this, the ethical right answer is to share this information back with the employees… If companies are collecting data and they're worried about trust and privacy, the simplest way to alleviate that is by sharing whatever information you collect with the employee in question… If I could see what information on my name was being captured, then I would have a bigger assurance to understand that the decisions that are made from it, or inferences that are being derived from it, are accurate… That creates a very transparent culture."

A crutch for management?

Another big concern around the use of AI is the lack of human involvement and…

DOES MONITORING LEAD TO DISTRUST?

"It's a serious choice to make as to whether you're going to use this technology. And, more specifically, how you're going to use it, what purposes are you going to use it for?"
Chris MacDonald, Ryerson University

6 in 10 – people who fear that greater workplace surveillance through technology will fuel distrust (65%) and discrimination (66%)
56% – workers who believe they are monitored by their boss at work
3 in 4 – workers who say bosses should be banned from monitoring them outside of working hours
Source: Trade Union Congress, U.K.

Photo: Google's offices in Kitchener-Waterloo, Ont. The U.S.-based company has been accused of over-surveillance of employees in the past.