Nelligan O’Brien Payne gratefully acknowledges the contribution of Suzanne Dunn, Student-at-Law, in writing this article.
Digital monitoring. Big data. Algorithms. These are relatively new terms in the employment world, but are increasingly relevant to employer/employee relationships.
Employers are starting to rely on software to make decisions about who to hire and how to run their workplaces. However, early adopters of this type of software are discovering both the benefits and drawbacks of using these products for workplace management.
Software has made life simpler for some employers. Candidate data can be analyzed to evaluate a prospective employee’s “fit” within the organization. The installation of monitoring programs can help identify security breaches. This saves companies time and money.
However, there are also drawbacks to relying too heavily on software in the workplace. Employees do not lose their privacy or human rights when they log on to their computers, but workplace software has the potential to infringe on those rights. Employers who uncritically rely on algorithmic or monitoring software in the workplace can seriously violate their employees’ rights.
For example, there may be issues if the software that employers install collects too much data, or if the data collected are inherently private and not work-related. Poorly programmed automated software can also lead to discriminatory hiring practices.
Employers who want to use employment software in the workplace should consider how those programs impact their workers’ privacy and human rights before installing them.
This article will discuss two types of employment software that have the potential to violate employee rights: algorithmic programming and digital monitoring.
Algorithmic programming
An algorithm is a set of rules that a software program uses to solve a problem. Employers have begun implementing algorithmic programming to facilitate the hiring process. Algorithms are used to identify prime candidates by scanning resumes for the particular experiences, qualities or qualifications the employer wants its employees to have. They do this by identifying key words or phrases on a candidate’s resume.
Some companies require potential candidates to take online tests that assess their skills, IQ or personalities. The program then analyzes the results to decide which candidates are best suited to the position.
Candidates who do not meet the algorithm’s requirements are removed from the competition.
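To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of keyword screen described above. The required keywords and candidate resumes are invented for illustration; this is not any vendor’s actual product.

```python
# Hypothetical keyword screen: candidates whose resumes do not contain
# every required term are removed from the competition.

REQUIRED_KEYWORDS = {"project management", "python", "bilingual"}  # invented criteria


def passes_screen(resume_text: str, required=REQUIRED_KEYWORDS) -> bool:
    """Return True only if every required keyword appears in the resume text."""
    text = resume_text.lower()
    return all(keyword in text for keyword in required)


candidates = {
    "Candidate A": "Bilingual project management professional with Python experience.",
    "Candidate B": "Experienced accountant with strong analytical skills.",
}

shortlist = [name for name, resume in candidates.items() if passes_screen(resume)]
print(shortlist)  # only Candidate A survives the screen
```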
These programs have been very effective for some companies. In the United States, the Xerox Corporation employed an online evaluation tool for assessing potential employees. The algorithm increased Xerox’s retention rate in its call centres by more than 20%.
Algorithms have also been promoted as a way to remove implicit human bias from hiring practices and to make better decisions than human managers. When a machine assesses potential candidates, it can avoid the problem of interviewers preferring candidates who are “like them”. By analyzing data objectively, the computer removes the human factor that can lead to discriminatory hiring.
However, poorly programmed algorithms can also result in discriminatory hiring. Studies have shown that algorithms that rely on historical hiring data can reinforce pre-existing biased practices. If the algorithm relies on samples of previously successful candidates who were mainly white, able-bodied men with similar experiences, it will look for that same pattern in future hires. This type of programming can reinforce discrimination that already exists in the workplace.
To avoid discriminatory hiring, employers should assess the variables their algorithms use. Each variable should be job-related, and empirical data should support why that particular trait is valid. Algorithms that “learn” patterns should be given diverse sample populations to avoid reinforcing discriminatory hiring. This may also require audits to assess whether the program discriminates in practice, since a program that is not discriminatory on its face may nonetheless produce discriminatory results.
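One simple form of audit, sketched below with entirely hypothetical data, is to compare the screen’s selection rates across demographic groups. A large gap between groups is a warning sign that a facially neutral program is producing discriminatory results and should be reviewed.

```python
# Hypothetical audit sketch: compare pass rates of a screening tool across groups.
# The groups and results below are invented for illustration only.
from collections import defaultdict

screening_results = [
    # (group, passed_screen)
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", False), ("group_b", True), ("group_b", False),
]

passed = defaultdict(int)
total = defaultdict(int)
for group, selected in screening_results:
    total[group] += 1
    passed[group] += selected  # True counts as 1, False as 0

rates = {group: passed[group] / total[group] for group in total}
print(rates)  # e.g. {'group_a': 0.75, 'group_b': 0.25}

# A selection rate for one group far below another's (here 0.25 vs 0.75)
# suggests the screen may have an adverse impact and warrants review.
```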
An example of a poorly designed program is Boston’s “Street Bump” app. It is not an employment-related app, but it illustrates the potential for discrimination when relying on algorithms to solve problems.
In Boston, the Mayor’s office developed an app to track where potholes were in the city. Individuals could download the “Street Bump” app on their smartphones, and if their car “bumped” when it hit a pothole, the city would be notified so it could fix the hole. However, the developers did not originally consider who would be collecting the data: people who own smartphones and drive cars.
The algorithm computed the data it received from users and determined that most potholes were concentrated in wealthier neighbourhoods, and that the City should therefore focus its efforts there. Data about potholes in lower-income neighbourhoods was not being recorded, because fewer people there owned smartphones and cars. The purportedly non-discriminatory algorithm ended up producing discriminatory results. To remedy this, the city needed to collect additional information from the public in order to provide non-discriminatory services.
Employers need to be aware that they risk similar results if they implement such programs without doing their homework first.
Digital monitoring
If employers wish to install digital monitoring software on their employees’ computers, they must ensure that it complies with relevant privacy legislation, such as the Personal Information Protection and Electronic Documents Act. Privacy legislation limits the type of information that can be collected by employers, and defines what that information can be used for.
In a recent investigation by the British Columbia Privacy Commissioner, the District of Saanich was found to have violated its employees’ privacy rights when it installed a comprehensive monitoring and recording system on select employees’ computers, including the Mayor’s computer.
During the investigation, the District of Saanich stated it installed the surveillance program, Spector 360, for IT security purposes. The District claimed the information would only be retrieved in cases of security events such as unauthorized access to the computer by hackers.
The Privacy Commissioner found many of the program’s functions went well beyond what was necessary to capture IT security breaches, and unnecessarily collected personal information of the employees.
The software monitored all keystrokes, took automated screenshots every 30 seconds, recorded all email and chat exchanges, and logged all program activity on the employees’ computers, including every website an employee visited. It captured nearly every aspect of the employees’ computer use.
None of the employees were adequately notified that their personal information was being collected in such depth. Some were not informed at all that they were being monitored.
This egregious over-collection of personal information clearly violated the employees’ privacy rights. The Privacy Commissioner found that the District had failed to consider or comply with the relevant privacy legislation.
Employers who wish to install surveillance programs on their employees’ computers should speak with a lawyer to ensure the program does not violate the relevant privacy legislation.