I had the pleasure of participating in a virtual roundtable earlier today hosted by the Centre for International Governance Innovation (CIGI), an independent, non-partisan think tank. It was organized at the request of Minister Mary Ng, the Federal Minister of Small Business, Export Promotion and International Trade, and moderated by Rohinton Medhora, CIGI’s President. One major focus of my remarks was the impact of the CPPA (the Consumer Privacy Protection Act proposed in Bill C-11) on small business.
The roundtable had three main topics:
1. Digital trade: Covid-19 economic recovery and Canadian businesses
2. Data governance: balancing privacy, security and economic opportunities
3. Fostering competitiveness: innovation and IP strategy
Following opening remarks from Minister Ng (who demonstrated a good understanding of our digital challenges and pressed attendees for their insights on how to solve them), I was asked to provide brief opening comments on topic 2. These included comments on Bill C-11. Below are my speaking notes for those remarks.[1]
_________________
I want to thank Minister Ng and CIGI for inviting me to participate today in this roundtable. As requested, I will focus these opening remarks on the topic of “data governance: balancing privacy, security and economic opportunities”.
The topic has considerable breadth, so I want to focus on several matters that are included in Minister Ng’s Mandate Letter: in particular, increasing support to Canadian businesses, including small businesses and entrepreneurs, within which I include supporting domestic and international innovation by businesses in Canada.
Government policies must promote trust and confidence in the digital economy. That means having 21st century privacy laws. As data can flow within and across borders, we must ensure that individuals’ reasonable expectations of privacy and appropriate measures to safeguard security are met. Bill C-11 will help achieve this by continuing to require compliance with the “fair information practice principles”, the cybersecurity measures, and the practical service provider transfer-for-processing provisions in PIPEDA,[2] and by updating and clarifying our law.
Canadian businesses operate in a global marketplace. Our framework laws for the burgeoning e-commerce market must take into account the need for compatibility and interoperability of legal frameworks. But we have a Goldilocks problem. The GDPR is the high-water mark, and we want our laws to be deemed adequate to support data flows with the EU. But those standards far exceed those of our major trading partners, including the United States and our trading partners under the CUSMA and the CPTPP. These latter standards may be considered too low to serve as our national standards.
Privacy has no recognized multilateral framework like the one for trade under the (beleaguered) WTO or for intellectual property under WIPO. There are standards such as the APEC Privacy Framework and the Council of Europe’s Convention 108+.[3] But these are not global standards.
This leaves smaller countries like Canada vulnerable. We have international standards for electronic transaction frameworks that are referenced in the CUSMA and the CPTPP.[4] We need similar global frameworks for privacy so that Canadians can have high levels of privacy and businesses can innovate on a level privacy playing field. I recognize this will be a challenge, as there are major divergences in approaches to privacy internationally.
We also need common domestic standards. Overlapping and potentially inconsistent provincial rules, such as those being studied in Quebec, Ontario, and B.C., may pose challenges. We solved related e-commerce challenges in the late 1990s with the Uniform Electronic Commerce Act[5] and the Uniform Electronic Evidence Act.[6] We need a similar effort now.
C-11 has significantly recalibrated the privacy balance struck in PIPEDA.
C-11, as currently drafted, may create barriers to the sharing, exploitation and monetization of anonymized data, AI and data-driven technologies, with provisions that are more onerous than even those in the GDPR or the U.S. CCPA.[7] (It is possible that the language in C-11 does not accurately reflect the policy intention and, in that event, could be clarified.)
C-11 also appears to have a heavy focus on regulating global businesses operating in Canada. With some exceptions,[8] C-11 is not particularly tailored to the needs of small business.
You can see this in the extensive array of new powers for the OPC, which will be able to investigate alleged privacy breaches, effectively prosecute violators, adjudicate whether the CPPA has been breached, make interim and final orders, and recommend substantial penalties (up to $10 million or 3% of global revenues) that can be imposed by the new tribunal. Its reports can also be used as a basis for private rights of action. Its findings are extremely hard to appeal, despite the lack of robust procedural protections.[9] This is a major departure from PIPEDA, which, according to statistics published by the Privacy Commissioner, has worked well over 98% of the time.[10]
These remedial powers are being promoted as a major feature of the Bill, but there are drawbacks to them. They may change the way businesses view the risks of innovating in this country, especially when viewed in combination with other features of the law.
As we all know, rules that regulate businesses must be relatively easy to understand and apply, and should provide relative certainty. Ambiguity and vagueness create risks that hinder innovation and risk taking. This is especially so when addressing digital technologies.
The CPPA has many fundamental obligations requiring businesses to make complicated assessments of what is “reasonable” or “appropriate”, what a person’s “reasonable expectations” are, whether measures are “proportionate”, what is “plain language” (in privacy disclosures that must be fully transparent about many technical uses for consents to be valid), or what the “reasonably foreseeable consequences” of an act are. These terms appear in important provisions that are central to the operation of the law. They define the CPPA’s purposes,[11] limit uses of personal information,[12] define how consents can be obtained,[13] establish the standards security safeguards must meet,[14] and define when personal information is de-identified and can be used for purposes such as machine learning,[15] among others.
It may be hard to frame privacy rules for the myriad different uses of personal data other than by using general principles, but these ambiguities could nevertheless be particularly troubling for small businesses in this brave new world. We are in the vortex of the 4th industrial revolution (aka Industry 4.0 or I4), which has no historical precedent in terms of velocity, disruption, scope, or systems impact.[16] Successful business models in the I4 economy often require risk taking and a regulatory regime that is easy to navigate, that does not appear crippling or punitive to businesses that get it wrong, and under which businesses cannot be shut down by interim orders made with minimal procedural safeguards.
Industry 4.0 calls for regulatory approaches that are agile, adaptable, technologically neutral, and that facilitate rather than impede innovation. Prescriptive rules or rules that create burdensome or ambiguous obligations – individually and in the aggregate – can impede innovation.[17]
It is hard to predict how policies that sound good in principle, such as those that create high but ambiguous standards, apply to complex technological environments, impose severe penalties, and create standards that are more onerous than anywhere else (like the de-identification provisions), will work in practice. These policies may work to undermine small businesses’ trust and confidence in our e-commerce framework.
Global companies may be willing to pay lawyers to figure things out. I worry, however, that small and medium-sized businesses, and their financial investors, will want to avoid or mitigate risks that don’t exist elsewhere, or not to the same degree, and will simply go elsewhere, such as to the U.S., to innovate. So may larger companies, whose investment decisions take into account many factors, including domestic regulatory environments. As we know, capital and labour are highly mobile in this global digital economy.
My remarks should not be taken as opposing C-11. Like any legislation that deals with complex subject matter, there are always areas for improvement. That’s what Parliamentary Committee hearings are for.
I want to leave you, Minister, with one thought. Trust and confidence in the digital or e-commerce economy has many facets. Individuals must have trust and confidence that their personal information will be protected. Canadian businesses must also have trust and confidence in our framework laws. As for privacy, the rules must be clear, must promote innovation and risk taking, and should not unnecessarily put Canadian businesses on an uneven international playing field.
____________________
[1] These remarks are made in my personal capacity and do not necessarily reflect the views of my firm, McCarthy Tétrault, or its clients.
[2] Barry Sookman, “CPPA: transfers of personal information to service providers”, online: https://barrysookman.com/2020/11/23/cppa-transfers-of-personal-information-to-service-providers-for-processing/.
[3] See Colin Bennett, The Council of Europe’s Modernized Convention on Personal Data Protection: Why Canada Should Consider Accession, CIGI Papers No. 246, November 2020, online: https://www.cigionline.org/sites/default/files/documents/no246_0.pdf.
[4] The UNCITRAL Model Law on Electronic Commerce 1996 (referenced in the CUSMA and the CPTPP) and the United Nations Convention on the Use of Electronic Communications in International Contracts, done at New York, November 23, 2005 (referenced in the CPTPP).
[5] Online: https://www.ulcc.ca/en/1999-winnipeg-mb/359-civil-section-documents/1138-1999-electronic-commerce-act-annotated.
[6] Online: https://www.ulcc.ca/en/older-uniform-acts/electronic-evidence/1924-electronic-evidence-act.
[7] See Barry Sookman, “CPPA: identifying the inscrutable meaning and policy behind the de-identifying provisions”, online: https://barrysookman.com/2020/12/07/cppa-identifying-the-inscrutable-meaning-and-policy-behind-the-de-identifying-provisions/.
[8] See s. 108 (“Factors to consider”): “In addition to taking into account the purpose of this Act in the exercise of the Commissioner’s powers and the performance of the Commissioner’s duties and functions under this Act, the Commissioner must take into account the size and revenue of organizations, the volume and sensitivity of the personal information under their control and matters of general public interest.” See also s. 94(5) (“Factors to consider”): “In determining whether it is appropriate to impose a penalty on an organization and in determining the amount of a penalty, the Tribunal must take the following factors into account…
(b) the organization’s ability to pay the penalty and the likely effect of paying it on the organization’s ability to carry on its business”.
[9] See ss. 92, 93-95, 98(1), 100-106.
[10] Barry Sookman, “PIPEDA by the numbers: lessons for privacy law reform in Canada?”, online: https://barrysookman.com/2020/10/12/pipeda-by-the-numbers-lessons-for-law-reform/.
[11] See s. 5 (“Purpose”): “The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.”
[12] See s. 12: Appropriate purposes
12 (1) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider appropriate in the circumstances.
Factors to consider
(2) The following factors must be taken into account in determining whether the purposes referred to in subsection (1) are appropriate:
(a) the sensitivity of the personal information;
(b) whether the purposes represent legitimate business needs of the organization;
(c) the effectiveness of the collection, use or disclosure in meeting the organization’s legitimate business needs;
(d) whether there are less intrusive means of achieving those purposes at a comparable cost and with comparable benefits; and
(e) whether the individual’s loss of privacy is proportionate to the benefits in light of any measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual.
[13] See s. 15(3): The individual’s consent is valid only if, at or before the time that the organization seeks the individual’s consent, it provides the individual with the following information in plain language:
(a) the purposes for the collection, use or disclosure of the personal information determined by the organization and recorded under subsection 12(3) or (4);
(b) the way in which the personal information is to be collected, used or disclosed;
(c) any reasonably foreseeable consequences of the collection, use or disclosure of the personal information;
(d) the specific type of personal information that is to be collected, used or disclosed; and
(e) the names of any third parties or types of third parties to which the organization may disclose the personal information.
15(4) Form of consent: “Consent must be expressly obtained, unless the organization establishes that it is appropriate to rely on an individual’s implied consent, taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.”
Business activities
18 (1) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for a business activity described in subsection (2) and
(a) a reasonable person would expect such a collection or use for that activity; and…
[14] Security safeguards
57 (1) An organization must protect personal information through physical, organizational and technological security safeguards. The level of protection provided by those safeguards must be proportionate to the sensitivity of the information.
[15] Proportionality of technical and administrative measures
74 An organization that de-identifies personal information must ensure that any technical and administrative measures applied to the information are proportionate to the purpose for which the information is de-identified and the sensitivity of the personal information.
[16] See, for example, Klaus Schwab (Founder and Executive Chairman of the World Economic Forum), “The Fourth Industrial Revolution: what it means, how to respond”, online: https://www.weforum.org/agenda/2016/01/the-fourth-industrial-revolution-what-it-means-and-how-to-respond/; Wikipedia, “Fourth Industrial Revolution”, online: https://en.wikipedia.org/wiki/Fourth_Industrial_Revolution.
[17] To borrow from our Charter jurisprudence, rules should be minimally impairing in promoting clear objectives and should be proportionate so that the benefits outweigh the burdens.