Canada’s New Data Protection Bill: Is It Adequate?

  • December 31, 2020
  • Heather Ferg

On November 17, 2020, the Canadian federal government tabled Bill C-11 in the House of Commons. If passed, the Digital Charter Implementation Act will enact two new pieces of legislation: the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act. Together, they form a long-awaited overhaul of federal privacy legislation.

1. Introduction

As our lives become increasingly intertwined with digital interfaces, the collection of personal information has become ubiquitous. Nearly all interactions with technology are tracked in one way or another and the data-brokering market is estimated to be worth about 200 billion dollars (here). Information about the attention, preferences and habits of individuals is a highly sought-after commodity. Digital information can be used to gain insights about individuals or groups and facilitates the ability to nudge (or manipulate) them through targeted messaging or other interventions. Consumer data collection is no longer limited to the basic information one might provide on a sign-up sheet.

In response to the increasing power accruing to analytics brokers, many jurisdictions have implemented privacy legislation designed to protect the individual consumer and safeguard core societal values. The General Data Protection Regulation of the European Union is among the most robust privacy-protection regimes in the world. It enshrines individual data protection as a fundamental right. In the United States, the California Consumer Privacy Act is landmark legislation that also takes a rights-based approach to protecting the individual (i.e. the right to know, the right to delete, the right to opt out, and the right to non-discrimination).

Much has changed since Canada’s last major legislative overhaul in 2000. The Digital Charter Implementation Act will be an important step in shaping how Canadians and their data are valued in the years to come. As the Bill goes to committee, we must consider whether it meaningfully reflects Canadian values and enhances the public good.

2. The Digital Charter

The threshold question in addressing the “Digital Charter Implementation Act” is: What, exactly, is the Digital Charter? The Digital Charter is a statement of principles related to data and digital technology introduced by Navdeep Bains, Minister of Innovation, Science and Economic Development, during an address to the Empire Club of Canada in Toronto on May 21, 2019. It is a lofty, aspirational list of ten principles, which are as follows:

  1. Universal Access: All Canadians will have equal opportunity to participate in the digital world and the necessary tools to do so, including access, connectivity, literacy and skills.
  2. Safety and Security: Canadians will be able to rely on the integrity, authenticity and security of the services they use and should feel safe online.
  3. Control and Consent: Canadians will have control over what data they are sharing, who is using their personal data and for what purposes, and know that their privacy is protected.
  4. Transparency, Portability and Interoperability: Canadians will have clear and manageable access to their personal data and should be free to share or transfer it without undue burden.
  5. Open and Modern Digital Government: Canadians will be able to access modern digital services from the Government of Canada, which are secure and simple to use.
  6. A Level Playing Field: The Government of Canada will ensure fair competition in the online marketplace to facilitate the growth of Canadian businesses and affirm Canada’s leadership in digital and data innovation, while protecting Canadian consumers from market abuses.
  7. Data and Digital for Good: The Government of Canada will ensure the ethical use of data to create value, promote openness and improve the lives of people – at home and around the world.
  8. Strong Democracy: The Government of Canada will defend freedom of expression and protect against online threats and disinformation designed to undermine the integrity of elections and democratic institutions.
  9. Free from Hate and Violent Extremism: Canadians can expect that digital platforms will not foster or disseminate hate, violent extremism or criminal content.
  10. Strong Enforcement and Real Accountability: There will be clear, meaningful penalties for violations of the laws and regulations that support these principles.

More important than what the Digital Charter is, is what it is not. The Digital Charter does not have legal force. It is not a legal document, a piece of legislation or a “Charter” in the same sense as the Canadian Charter of Rights and Freedoms. It does not confer rights or create remedies. It is (at best) a statement of the goals and values that apparently underlie federal privacy policy.

3. Legislation

Contrary to its name, the Digital Charter Implementation Act (Bill C-11) does not implement the Digital Charter, but rather enacts two new pieces of legislation: the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act. The Consumer Privacy Protection Act is the substantive legislation that deals with the protection of personal information in the private sector. It replaces the Personal Information Protection and Electronic Documents Act (PIPEDA) and gives the Privacy Commissioner of Canada new enforcement powers. The Personal Information and Data Protection Tribunal Act creates an administrative tribunal to hear appeals of decisions made by the Privacy Commissioner and to impose penalties for privacy violations. The stated purpose of the reforms is to establish “rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances” (s. 5).

4. The Reforms

The balance of this section reviews four of the many important areas covered by Bill C-11: (1) organizations’ internal policy requirements; (2) accessing information; (3) consent to collection, use and disclosure; and, (4) automated decision making.

(a) Internal Policy Requirements

Under the new framework, all organizations would be required to implement privacy management programs and make their policies available to the public. Privacy management programs must ensure compliance with the Act and must include policies, practices and procedures respecting the protection of personal information, how information requests and complaints are handled, staff training, and the development of materials to explain the organization’s policies and procedures (s. 9).

Policies must also be made “readily available” to the public. Organizations must explain the following in plain language:

  • the types of personal information under its control;
  • how the information is used;
  • an account of the use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them;
  • whether or not the organization carries out international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications;
  • how individuals can request disclosure or disposal of their private information; and
  • how individuals can make complaints (s. 62).

The precise meaning of terms such as “significant impacts” and “reasonably foreseeable privacy implications” is not spelled out in the Bill.

(b) Accessing Information

Where a request is made, an organization must tell individuals whether it holds any personal information about them, how that information is used, whether it has been disclosed and, if so, to whom. It must also give the individual access to the information (s. 51).

(c) Consent

In its Bill C-11 Fact Sheet, the federal government boasts that the new legislation’s “[m]odernized consent rules would ensure that individuals have the plain-language information they need to make meaningful choices about the use of their personal information” while simultaneously removing “the burden of having to obtain consent when that consent does not provide any meaningful privacy protection.”

The consent requirements are set out in ss. 15-17. These sections require that organizations obtain an individual’s valid consent for the collection, use or disclosure of the individual’s personal information. The consent must be obtained at or before the time the information is collected, and the individual must be told: (1) the purposes for the collection, use or disclosure; (2) the way in which it is to be collected, used or disclosed; (3) any reasonably foreseeable consequences of the collection, use or disclosure; (4) the specific type of personal information that is to be collected, used or disclosed; and, (5) the names of any third parties to which it may be disclosed.

Sections 18-28 deal with the many exceptions to the consent requirements. For example, s. 18 provides a “Business Activity” exception wherein an organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for a “business activity” and the following two conditions are met: (1) a reasonable person would expect such a collection or use for that activity; and, (2) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions. Other exceptions include transferring information to service providers and the use and disclosure of “de-identified” consumer information.

The sweeping exceptions to the consent requirements have attracted criticism. In a post titled The Gutting of Consent in Bill C-11, Dr. Teresa Scassa of the University of Ottawa argues that not only is there nothing particularly new in the consent requirements, but the changes that are being made are problematic. She notes that the exceptions are not just to consent but to knowledge and consent, and that it is very difficult to hold an organization accountable for its practices without knowledge of what they are. The “consent free activities” contemplated by the exceptions are vast, and Dr. Scassa provides examples of the shocking potential impacts on personal privacy that are difficult to reconcile with the government’s purported commitment to ensuring Canadians have control over what data they share, who uses it and for what purposes.

The Public Interest Advocacy Centre (PIAC), a not-for-profit group that advocates for Canadian consumers in areas such as privacy, consumer protection, telecommunications and energy law, also spoke out. The PIAC criticized the Bill as giving with one hand while taking with the other and called for it to be withdrawn and rewritten to protect consumers rather than favouring big business (here).

(d) Automated Decision-Making

One of the new features of the Consumer Privacy Protection Act is a disclosure requirement targeting the use of automated decision-making. The Act defines an automated decision system as “any technology that assists or replaces the judgement of human decision-makers using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning and neural nets” (s. 3). If an organization has used an automated decision system to make a prediction, recommendation or decision about an individual, it must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information used to make it was obtained (s. 63(3)).

This requirement presumes the organization actually knows how its systems work. Automated decision-making (also called algorithmic decision-making) can be extremely complicated. It typically involves the analysis of vast quantities of personal data to ascertain trends or correlations thought to be relevant to various decisions. It increasingly utilizes machine learning systems and can have significant ethical implications. Explaining how these systems work is not straightforward and comes with its own issues (see the European Parliamentary Research Service’s study Understanding algorithmic decision-making: Opportunities and challenges (here) and Understanding automated decisions: reaching the limits of explainability by Dr. Allison Powell of the London School of Economics (here)).

5. Conclusion

After Bill C-11 was tabled, Daniel Therrien, the Privacy Commissioner of Canada, issued a statement that the Bill “raises a number of questions about its ability to effectively protect privacy in a constantly evolving digital society” and reiterated the need for a legal framework that recognizes privacy as a human right and as an essential element for the exercise of other fundamental rights (here). The government, however, refuses to take this approach (apparently on constitutional grounds) and gives equal weight to privacy and the commercial interests of organizations.

It is difficult to see how the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act constitute an “implementation” of the Digital Charter. As the Bill moves through the legislative process, careful attention ought to be given to the feedback provided by non-partisan groups and individual commentators. The right to privacy demands the protection of digital data as a fundamental aspect of the public good – not just as a commodity.
