Late last month, the Privacy Commissioner of Canada (“OPC” or “Commissioner”) released a report of its findings holding that search engines like Google can be required to de-list personal information about an individual. While the decision generally aligns Canada’s federal privacy law, PIPEDA, with the right to erasure or “right to be forgotten” under the EU GDPR, the UK GDPR, and Quebec’s updated privacy law, it illustrates some problematic aspects of PIPEDA and the Commissioner’s interpretation of this important privacy law.
Overview
The decision and associated News Release issued by the OPC provide a good background and summary of the case.
- In June 2017, the OPC received a complaint against Google’s search engine service alleging that Google is contravening PIPEDA by including certain media articles in the list of search results displayed when the Complainant’s name is searched on Google. The articles, which were published many years earlier, relate to the Complainant being arrested and charged with a criminal offence resulting from an allegation that they did not disclose their HIV status to a person with whom they engaged in sexual activity.
- The Complainant stated that the articles are outdated and misleading, and that because these articles are linked to the Complainant’s name in search results, the Complainant has experienced serious harms such as physical assault, lost employment opportunities, and severe social stigma.
- The Complainant sought to have the media articles de-listed from search results displayed by Google in response to searches by name.
- After investigating the matter, the OPC found that Google’s accuracy-related obligations under PIPEDA do not extend to the underlying content of linked articles. Google is responsible for ensuring that: (i) the personal information provided in its search results accurately reflects the content of the articles to which it links; and (ii) those articles do in fact contain the name in the search term. In this case, it found no evidence to suggest that the search results in question did not accurately reflect the content of the linked media articles.
- But the OPC did find that there are limited circumstances in which a reasonable person would consider it inappropriate for a search engine to return, in response to a search for an individual’s name, content containing personal information about that individual. These circumstances would include those where returning the results in question causes or is likely to cause significant harm to the individual – including, as it found in this case, harms to their safety or dignity – that outweigh any public interest associated with returning those results in the search for that individual’s name.
- The OPC recommended that Google de-list the media articles in question from the search results that are displayed in response to a search for the Complainant’s name.
- Google did not agree to de-list the articles, taking the position that the question of whether PIPEDA includes a right to de-listing should be determined by the courts.
- As such, the OPC found the complaint to be well-founded (and unresolved), with the exception of the accuracy aspect of the complaint, where it did not find a contravention.
Comments on the OPC decision
The decision generally aligns PIPEDA with the GDPR right to be forgotten
The OPC recognized that users in the EU and U.K. have had a “right to erasure” or “right to be forgotten” ever since the landmark decision of the Court of Justice of the European Union (the “CJEU”) in the “Google Spain” case.[i] Under that law, Google has de-listed millions of web pages in a variety of circumstances.[ii] The UK has similar requirements.[iii] Quebec’s privacy law was recently amended to provide a similar right.[iv]
After reviewing these frameworks, the OPC proposed the following non-exclusive factors to determine when a search engine should de-list personal information from search results.
- Whether the individual is a public figure (e.g. a public office holder, a politician, a prominent business person, etc.);
- Whether the information relates to an individual’s working or professional life as opposed to their private life;
- Whether the information relates to an adult as opposed to a minor;
- Whether the information relates to a criminal charge that has resulted in a conviction or where the charges were stayed due to delays in the criminal proceedings;
- Whether the information is accurate and up to date;
- Whether the ability to link the information with the individual is relevant and necessary to the public consideration of a matter under current controversy or debate;
- The length of time that has elapsed since the publication of the information and the request for de-listing.
The decision also stated “it is important to emphasize that when assessing the public interest in the search results, the question is not whether the underlying information serves the public interest in the abstract, but whether its continued availability in search results for searches of the Complainant’s name is in the public interest.”
The reliance on the “appropriate purposes” override in PIPEDA is problematic
In its findings, the OPC did not find that Google had violated any provisions of PIPEDA that specifically address the collection, use, or disclosure of personal information. In particular, it made no finding that Google lacked consent or any other legal basis to conduct its search engine business of crawling and indexing websites and displaying search results in response to user queries, or that any consent to display personal information in search results had been withdrawn when it received the complaint and the de-listing request. Rather, the OPC relied only on Subsection 5(3) of PIPEDA, which states that “an organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances”.
This sub-section is problematic as it essentially gives the OPC a right to override all of PIPEDA’s other explicit provisions, making it difficult for organizations to know whether a business model would be viewed as inappropriate by the Commissioner. The government had proposed a (flawed) amended appropriate purposes provision in the CPPA (the bill that would have replaced PIPEDA), including, as Professor Scassa pointed out, a new Section 12(2). But if Parliament had intended to include a right to be forgotten in PIPEDA, it could have done so, as the government had proposed in the CPPA; the Google Spain decision of the CJEU was released well before that bill was introduced.
The OPC may well have come to the conclusion that it needed to fill a gap in Canadian privacy law by effectively recognizing a right to be forgotten in the absence of amendments to PIPEDA. As Kris Klein pointed out in commenting on the decision, the OPC may have decided to base its decision on the appropriate purposes provision rather than on the basis of consent, or the withdrawal of consent, as it “allows for more flexibility” and allowed the OPC “to incorporate some of the nuances to the right to be forgotten developed in other countries.”
While it may be argued that the OPC was right to step in to protect the public, it may also be contended that this was a policy decision for Parliament, not the Commissioner, and that this was not an appropriate case to expand the appropriate purposes provision to fill a gap in PIPEDA, or to achieve a result that the Commissioner could not have reached by applying PIPEDA’s statutorily prescribed consent regime.
The decision fails to address whether search engines are subject to PIPEDA at all
The decision and a timeline published by the OPC related to the litigation between the OPC and Google highlighted that Google has long sought to clarify whether PIPEDA violates the Charter because it fails to provide a legal basis for Google to collect, use, and disclose personal information in providing its search service. The Commissioner opposed Google’s procedural attempt to resolve the issue, and the Federal Court of Appeal sided with the OPC and prevented Google from raising the issue within the OPC’s reference to the Federal Court. However, prior to the release of the OPC’s findings in this case, an Alberta court in one of the ClearviewAI cases expressed the view that Alberta’s privacy law violated the Charter because it prevented search engines (and ClearviewAI) from legally operating under that law, a law that is substantially similar to PIPEDA. Thus, while the findings of the OPC address whether a right to be forgotten violates the Charter right to freedom of expression, they do not address the more fundamental question of whether PIPEDA applies to Google at all.
This issue could have been resolved if Bill C-27 (the CPPA), which included a new legitimate interest exception, had passed with suitable amendments. But as it did not, the report is deficient in not addressing this fundamental issue.
The decision misconstrues the interpretive quasi-constitutional status of PIPEDA
The decision recognizes that PIPEDA has been described as quasi-constitutional in nature. In determining the appropriate remedy to address the OPC’s findings, the OPC appears to have relied on the remedy furthering “the objectives of PIPEDA, a quasi-constitutional statute, by giving the Complainant more control over the display of their highly sensitive personal information in Google search results.” While this may be laudable, the decision appears to ignore decisions of the Supreme Court that have held that privacy laws are not to be interpreted in any special way because of their quasi-constitutional status.
The de-listing recommendation did not address the territorial obligation
The OPC recommendation was that Google de-list the articles at issue from searches for the Complainant’s name. However, the recommendation did not expressly address whether the de-listing should apply only to the Google.ca domain, only to search results provided to users with Canadian IP addresses, or to users anywhere in the world. While PIPEDA can have some extra-territorial scope where the real and substantial connection test is met, the de-listing recommendation did not consider the scope in this case or the fact that the right has been held to be territorial in scope in the EU, in France, for example. Google unsuccessfully argued against a world-wide de-indexing order in the Equustek case, a case that largely dealt with misappropriation of confidential information. It is therefore surprising that the decision does not even discuss the territorial scope of the de-listing recommendation.
Concluding remarks
The government has not announced whether or when it might re-introduce a bill that would replace the CPPA. However, in an article in The Hill Times, the new Minister of AI and Digital Innovation (Minister Solomon) stated that a bill to replace the CPPA is being worked on. There are numerous problems with the prior proposed replacement for PIPEDA, many of which are summarized in my blog post series on the Government proposals to amend CPPA and AIDA: the good, the bad, and the challenges. One can only hope that Parliament will weigh in on whether our privacy law will contain a right to be forgotten and when it should be exercised, and will also limit the Commissioner’s unique discretion to override the express provisions in the privacy law based on his assessment of what is not an appropriate purpose.
As for AIDA, it is likely the government will launch a consultation before proposing any new law to regulate AI, something that should have been done before it was introduced.
[i] The Findings summarize the rights of users under the GDPR:
In the European Union, the GDPR includes the right to erasure, which has been interpreted to include the right to de-listing. The following elements are considered in assessing the balance between an individual’s privacy rights and the right of the public in having access to the information through a search for the individual’s name:
- he or she does not play a role in public life;
- the information at stake is not related to his or her professional life but affects his or her privacy;
- the information constitutes hate speech, slander, libel or similar offences in the area of expression against him or her pursuant to a court order;
- the data appears to be a verified fact but is factually inaccurate;
- the data relates to a relatively minor criminal offence that happened a long time ago and causes prejudice to the data subject.
[ii] The circumstances were described by the OPC as follows:
For users in the European Union (“EU”), who are covered by different data protection laws than Canadian users, Google will remove personal information from its search results pursuant to the Court of Justice for the European Union (the “CJEU”)’s decision in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (“Google Spain”), the General Data Protection Regulation (“GDPR”) and the relevant laws of member states. According to the Google Spain decision, personal information may be required to be removed from search results returned in response to a search for an individual’s name where it is inaccurate, inadequate, irrelevant or no longer relevant, or excessive and provided that the public interest in accessing the information does not outweigh the interests of the individual concerned.
According to Google’s latest transparency report, it has de-listed over three million webpages from its search results in the EU since 2014 based on the Google Spain decision. The following are examples of cases where Google de-listed results from searches for an individual’s name:
- articles and social media posts that reported on the disappearance of a minor who was no longer missing and no longer a minor;
- a website that reported on an individual’s testimony against their mother during the trial for the murder of their father;
- a news article published 19 years earlier that revealed that an individual had been the victim of abusive treatment when they were a member of a children’s organization;
- webpages that discussed legal proceedings, which were later dropped, in relation to an individual’s publication of prohibited content when they were a student;
- news articles published between 2009 and 2016, which reported that an individual, who was a business professional, had been accused and acquitted of fraud; and
- a news article published 21 years earlier about an individual sentenced to a 6-year prison sentence for their involvement in a fatal shooting.
[iii] These were described by the OPC as follows:
In the United Kingdom, the Information Commissioner’s Office has adopted a similar approach and considers the following elements when assessing de-listing requests:
- Does the search result relate to a natural person – i.e. an individual – and does it come up against a search on the individual’s name?
- Does the individual play a role in public life?
- Is the subject of the search result a child?
- Is the data accurate?
- Does the data relate to the individual’s working life?
- Is the information ‘sensitive’ for the purposes of the Data Protection Act?
- Is the data up to date? Is it being made available for longer than necessary?
- Is the data processing causing prejudice to the individual? Is it having a disproportionately negative impact on the individual’s privacy?
- Does the search result link to information that puts the individual at risk?
- On what basis was the information published originally?
- Was the original content published in a journalistic context?
- Does the publisher of the data have a legal power – or a legal obligation – to make the personal data publicly available?
- Does the data relate to a criminal offence?
[iv] Quebec’s framework was summarized by the OPC as follows:
In Quebec, section 28.1 of the Act respecting the Protection of Personal Information in the Private Sector provides that the following elements will be considered in assessing the public interest and freedom of expression:
- the fact that the person concerned is a public figure;
- the fact that the information concerns the person at the time the person is a minor;
- the fact that the information is up to date and accurate;
- the sensitivity of the information;
- the context in which the information is disseminated;
- the time elapsed between the dissemination of the information and the request made under this section; and
- where the information concerns a criminal or penal procedure, the obtaining of a pardon or the application of a restriction on the accessibility of records of the courts of justice.