Big data, bigger risk of rights violations
April 27, 2016 | Read Time: 4 minutes


Nelligan O’Brien Payne gratefully acknowledges the contribution of Suzanne Dunn, Student-at-law, in writing this article. Read more information on our Articling Program.

Digital monitoring. Big data. Algorithms. These are relatively new in the world of employment.

Employers are starting to rely on software to make decisions about who to hire and how to run their workplaces. However, early adopters of this type of software are realizing both the benefits and downfalls of using these products for workplace management.

Software has made life easy for some employers. Employee data can be analyzed to evaluate a prospective employee’s “fit” in the organization. The installation of monitoring programs can help identify security breaches. This saves companies time and money.

However, there are also downfalls to relying too heavily on software in the workplace. Employees do not lose their privacy rights when they log on to their computers, but digital software has the potential to infringe on those rights. Employers who rely too heavily on computer programming or monitoring software in the workplace can seriously violate their employees’ rights.

For example, issues may arise with how the software collects data, or the data it collects may be inherently private and not work-related. Poorly designed software can also lead to discriminatory hiring practices.

Employers who want to use employment software in the workplace should understand these risks before implementing it.

This article will discuss two types of employment software that have the potential to violate employee rights: algorithmic programming and digital monitoring.

Algorithmic programming

Algorithms are a set of rules that a software program uses to solve a problem. Employers have started implementing algorithmic programming to facilitate the hiring process. Algorithms are used to identify top candidates by scanning for specific experience or qualifications. They do this by identifying key words or phrases on a candidate’s resume.
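To illustrate the kind of keyword scanning described above, here is a minimal sketch. The keywords, resume text and scoring logic are hypothetical examples, not any particular vendor’s method:

```python
# Minimal sketch of a keyword-based resume screen (hypothetical example).
def score_resume(resume_text: str, keywords: list[str]) -> float:
    """Return the fraction of target keywords found in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in keywords if kw.lower() in text)
    return hits / len(keywords)

keywords = ["project management", "python", "budgeting"]
resume = "Led a team using Python for budgeting dashboards."

score = score_resume(resume, keywords)
print(round(score, 2))  # 2 of 3 keywords matched
```

A real screening tool would be far more sophisticated, but the basic idea is the same: candidates whose score falls below some threshold never reach a human reviewer.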

Some companies require potential candidates to take online tests that assess their skills, IQ or personalities. The program then analyzes the results to decide which candidates would be best-suited for the position.

Candidates who do not meet the requirements of the algorithm are removed from the competition.

These programs have been very effective for some companies. In the United States, the Xerox Corporation employed an online assessment tool for assessing potential employees. The algorithm reportedly improved employee retention by 20%.

Algorithms have also been promoted as a way to remove implicit human bias from hiring practices and make better decisions than human managers. When a machine is assessing potential candidates, it can avoid the problem of interviewers preferring candidates who are “like them”. By objectively analyzing data, the computer removes the human factor that can lead to discriminatory hiring.

However, poorly programmed algorithms can also result in discriminatory hiring. Studies have shown that algorithms that rely on historical data can reinforce pre-existing biased practices. If the algorithm relies on samples of past successful candidates, it is likely to reproduce the same pattern in future hires. This type of programming can reinforce discrimination that already exists in the workplace.

To avoid discriminatory hiring, employers should assess what variables they are using in their algorithms. Each variable should be job-related and supported by empirical data. Algorithms that “learn” patterns should be trained on varied sample populations to avoid reinforcing discriminatory hiring. This may require conducting audits to assess whether the program discriminates in practice, as some programs that are not discriminatory on their face may still produce discriminatory results.
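One common way to run such an audit, sketched below with made-up numbers, is the “four-fifths rule” used in U.S. employment law: compare each group’s selection rate against the most-selected group’s rate and flag any group whose ratio falls below 0.8.

```python
# Sketch of a disparate-impact audit using the four-fifths rule.
# Group names and counts are hypothetical.
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (hired, applicants); returns selection rates."""
    return {g: hired / applicants for g, (hired, applicants) in outcomes.items()}

def four_fifths_flags(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    """Flag groups whose selection rate is below 80% of the best group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best < 0.8 for g, rate in rates.items()}

outcomes = {"group_a": (30, 100), "group_b": (12, 100)}
print(four_fifths_flags(outcomes))  # group_b flagged: 0.12 / 0.30 = 0.4 < 0.8
```

A flag under this test is not proof of discrimination, but it tells the employer where to look more closely at how the algorithm is making its decisions.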

An example of a poorly designed program is Boston’s “Street Bump” app. Although it is not employment-related, it illustrates the risk of discrimination when relying on algorithms to solve problems.

In Boston, the Mayor’s Office released an app to track where potholes were in the city. Individuals could download the “Street Bump” app on their smartphone, and if their car “bumped” when it hit a pothole, the city would be notified so it could fix the hole. However, the developers did not originally consider the type of people who would be collecting the data.

The algorithm computed the data it received from users, determined that potholes were more concentrated in wealthier neighborhoods, and concluded the City should focus its efforts there. Potholes in lower-income neighborhoods were under-reported because fewer residents there had smartphones and cars. The purportedly non-discriminatory algorithm ended up producing discriminatory results. To remedy this, the city needed to collect additional information from the public in order to provide non-discriminatory services.

Employers need to be aware that they run similar risks if they implement algorithmic programming in their workplaces.

Digital monitoring

If employers wish to install digital monitoring software on workplace computers, they must ensure that they comply with the requirements of privacy legislation such as the Freedom of Information and Protection of Privacy Act and the Personal Information Protection Act. Privacy legislation limits the type of information that can be collected by employers, and defines what that information can be used for.

In a recent investigation by the British Columbia Privacy Commissioner, the District of Saanich was found to have installed monitoring software on a number of workplace computers, including the Mayor’s computer.

During the investigation, the District of Saanich reported that it had installed the surveillance program Spector 360 for IT security purposes. The District claimed the information collected would only be retrieved in cases of unauthorized access to a computer, such as by hackers.

The Privacy Commissioner found many of the program’s functions went well beyond that purpose.

The software monitored all keystrokes, took automated screen shots every 30 seconds, and recorded all email and chat exchanges. It captured almost every aspect of the employees’ computer use.

None of the employees were given proper notice of the collection of their personal information, and some were not informed at all that they were being monitored.

This egregious over-collection of personal information clearly violated the employees’ privacy rights. The Privacy Commissioner found that the District had failed to comply with the law.

Employers who wish to install monitoring software on workplace computers should ensure that what they collect is limited to what privacy legislation allows.

Check out our other articles to learn more about labour law, workplace policies and more.

This content is not intended to provide legal advice or opinion as neither can be given without reference to specific events and situations. © 2021 Nelligan O’Brien Payne LLP.

Service: Employment Law
