You may be in for an unpleasant surprise when you read the September 23, 2025 findings of the Office of the Privacy Commissioner of Canada (OPC) in PIPEDA Findings #2025-003. The decision, issued jointly with the OPC's counterparts in Quebec, British Columbia, and Alberta and involving TikTok, is easily the most significant privacy enforcement action in Canada in years. It provides important guidance on what the federal and provincial privacy commissioners (the “Offices”) expect under privacy laws, including with respect to minors and to obtaining consents to collect and use personal information. But if your organization handles personal information, there is a very good chance your privacy practices will not comply with many of the Offices’ onerous interpretations of PIPEDA. For example, under the findings in the TikTok case, your privacy policy and the agreements with your organization’s supply chains are unlikely to obtain valid consents to collect, use, and disclose personal information:
- If your privacy policy does not have a special upfront section that prominently emphasizes key information about privacy practices.
- If your privacy policy is not “comprehensive and understandable” and does not include the disclosures set out below.
- If your privacy policy does not include granular descriptions of each personal information data element collected, a description of how each data element is used and disclosed, and how it would be used to achieve those purposes.
- If your privacy policy states that personal information collected is used to train and improve your technology including your machine learning models and algorithms, unless it also explains what models or algorithms are being trained, how they are being trained, what personal information is being used to train them, the purposes for which they are used, and the consequences of those algorithms for individuals who use the platform.
- If your privacy policy states that personal information collected is used to improve and develop your platform and conduct product development, unless it also specifies what personal information will be used for such purposes and how it will be used for these purposes.
- If your privacy policy states that personal information collected is used for targeted advertising or to measure the effectiveness of those ads, unless it also specifies what personal information is being used for these purposes and how this information is being used to target ads.
- If your comprehensive privacy policy cannot reasonably be understood by every demographic and every person, regardless of their cognitive levels, including where enterprises do business with children, teenagers, adults, seniors, and people with mental impairments.
- If your privacy policy is not written in English and French.
- If you do not get express consents when using biometric technology, even if the biometrics are not used to identify or authenticate a unique individual.
- If you do not get express consents whenever you collect or use personal information that falls outside of the reasonable expectations of an individual or what they would reasonably provide voluntarily.
- If documents with further details about your privacy practices are not linked to or easily accessible through your privacy policy.
- If you do not test your privacy communications to ensure that information regarding complex technologies and privacy practices are understandable to the intended audience.
- If, regardless of whether you obtain meaningful consent and fully comply with PIPEDA, the OPC does not believe your use is for an “appropriate purpose”.
I encourage you to read and digest the decision after reading the rest of this blog post. The findings, including but not limited to the ones summarized above, should prompt every organization that collects, uses, and discloses personal information to immediately review its privacy practices and decide what to do in view of the TikTok decision.
The decision is remarkable in the number of issues it canvassed. These are summarized below. This is followed by a deeper dive into the expectations of the privacy commissioners with respect to obtaining consents, with a particular focus on obtaining consents under PIPEDA.
Summary of Key Findings in the TikTok Decision
The decision relates to the investigation of TikTok by the Offices (OPC, CAI, OIPC BC, and OIPC AB), which examined whether TikTok’s collection, use, and disclosure of the personal information of individuals in Canada through its social media platform complied with federal and provincial private sector privacy laws. Below is a high-level summary of the key findings.
Jurisdictional Issues
The Offices held there was jurisdiction over TikTok even though the operating entity being investigated was incorporated in Singapore. They held that PIPEDA applies to organizations that, in the course of a commercial activity, collect, use, and disclose the personal information of individuals within Canada, even if the organization is resident outside of Canada.
The OIPC BC also determined that the BC privacy law, the Personal Information Protection Act applied concurrently with PIPEDA. It held this to be the case consistent with “a core principle of modern division of powers jurisprudence that favors, where possible, the concurrent operation of statutes enacted by the federal and provincial levels of government”. Further, that “PIPA BC has been ‘designed to dovetail with federal laws’ in its protection of quasi-constitutional privacy rights of British Columbians”. “PIPA BC operates where PIPEDA does not, and vice versa. In cases such as the present, which involve a single organization operating across both jurisdictions with complex collection, use, and disclosure of personal information, both acts operate with an airtight seal to leave no gaps. An interpretation of s. 3(2)(c) that would deprive the OIPC BC of its authority in any circumstance the OPC also exercises authority is inconsistent with the interlocking schemes and offends the principle of cooperative federalism.”
Collection of personal information from minors
The investigation examined TikTok’s collection and use of personal information for the purposes of ad targeting and content personalization on the platform, with a focus on TikTok’s practices as they relate to children. The Offices found that TikTok had implemented inadequate measures to keep children off its platform, which resulted in the collection of the sensitive information of many children and the use of that information for purposes of ad targeting and content recommendation. The Offices determined that the tools implemented by TikTok to keep children off its platform were largely ineffective.
The Offices found that TikTok was collecting and using the personal information of children with no legitimate need or bona fide interest, and that its practices were therefore inappropriate and contrary to PIPEDA’s appropriate purposes requirement in Section 5(3) of that law. As they found that TikTok’s collection and use of personal information from children was not for an appropriate purpose, they did not consider whether the company obtained valid consent from those users, as consent could not render that collection and use appropriate.
What constitutes “biometric technology” for the purposes of obtaining privacy consents
Canadian privacy laws require a high degree of transparency and measures to obtain express consent if biometric personal information is collected as such information is considered to be sensitive.
TikTok uses computer vision technology to analyze features within a detected facial image to estimate the age and determine the gender of individuals in videos. The age estimate is used to categorize videos for recommendation and targeting (e.g., for inclusion or exclusion of the video from other users’ “For You” Feeds) and to protect the safety of minors, as part of TikTok’s tools to identify inappropriate material posted on the platform (including child sexual abuse material or “CSAM”). TikTok’s model applies filters to the image to extract the facial features it uses for age estimation without creating or retaining a numerical representation of the full face. It then analyses these targeted features and estimates an age range for the user. As the model proceeds through each step of the process, it discards the details collected, preventing retention of data. Once the process is complete, only a final score corresponding to the age range estimate remains, and no numerical representations, feature maps, or images are retained.
TikTok asserted that it does not use “biometric technology”, taking the position that biometrics require the ability to identify or authenticate a unique individual, and that its technology does not enable it to do so (a fact the Offices did not dispute). The Offices disagreed, rejecting prior guidance from the OPC itself and statements made by the OPC in the Cadillac Fairview investigation, among other sources, stating:
That said, in our view, including for the reasons outlined above, information need not be uniquely identifying to be termed “biometric information”. We therefore find that TikTok’s technology does collect and use biometric information, in that it collects and analyzes numerical representations of various physiological features of individuals.
Further, biometric information does not have to be uniquely identifying in order to reveal sensitive information about an individual, as personal information inferred from biometric information can itself be sensitive, depending on the circumstances. In the context of this case, TikTok was using biometric information to infer additional personal information about users, including their gender; this can be sensitive, for example, where the gender that an individual’s biometric information suggests is different from that with which they identify.
When must organizations obtain express consents to collect, use, and disclose personal information
Under PIPEDA the consent of an individual is required for the collection, use, or disclosure of their personal information, unless an exception applies. Under Clause 4.3.4, the form of the consent sought by the organization may vary, depending upon the circumstances and the type of information. The form of consent can be express or implied. But, under this Clause, an organization should generally seek express consent when the information is likely to be considered sensitive.
PIPEDA also states that “In obtaining consent, the reasonable expectations of the individual are also relevant”. However, it does not lay down any hard and fast rule as to how the reasonable expectations of users must be taken into account, as it does when sensitive information is collected.
In the decision, the Offices went further than PIPEDA stating that where “the collection or use of personal information falls outside of the reasonable expectations of an individual or what they would reasonably provide voluntarily, then the organization generally cannot rely upon implied or deemed consent”.
In their view, users of the TikTok site would not reasonably expect that TikTok would collect data to deliver targeted ads and personalize the content they are shown on the platform using complex technological tools such as computer vision and TikTok’s own machine learning algorithms, as the user engages with the platform. It is unclear what evidence the Offices relied on for this finding. Nevertheless, given the Offices’ interpretation of when PIPEDA requires express consent, they found that TikTok needed to get express consent for its collection and use of users’ personal information for its purposes of delivering targeted ads and personalizing content on the platform.
Express Consents can be Obtained using a Clickwrap or Hybrid Wrap Online Agreement
Under many court decisions, an agreement to online terms can be manifested using a click-wrap agreement or a so-called “hybrid” agreement, where the manifestation of agreement includes links to terms that are clearly made part of the agreement.
The Offices appeared to accept these processes for obtaining online consents. Here are examples from the decision of the processes TikTok uses.
When a user runs the mobile app for the first time, a pop-up referencing and linking to the TikTok Terms & Conditions and TikTok Privacy Policy appears. To continue using the app, the user must actively click “Agree and Continue”.
Additionally, during the TikTok account creation process on the app and the website, there are links to the Terms & Conditions and TikTok Privacy Policy, as well as an explanation that continuing with account creation serves as a confirmation that a user has read and agreed to them.
Another example is shown below:
What is required to obtain meaningful consents from users
PIPEDA’s Statutory Requirements for Meaningful Consent
According to the Offices, under PIPEDA it is insufficient merely to obtain a valid consent under contract law principles. On this basis they found that TikTok did not obtain valid and meaningful consents from its users for tracking, profiling, targeting, and content personalization.
The Offices provided their interpretations of the operative parts of PIPEDA related to obtaining meaningful consents. These do not align with the text of the statute.
Clause 4.3.2 of PIPEDA states:
The principle requires “knowledge and consent”. Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used. To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed.
The Offices interpreted Clause 4.3.2 as follows:
Clause 4.3 of Schedule 1 of PIPEDA states that the knowledge and consent of the individual is required for the collection, use, or disclosure of their personal information, unless these requirements are specifically exempted under section 7 of PIPEDA. Clause 4.3.2 requires organizations to explain the purposes in such a way that the user can reasonably understand how their personal information will be used.
One interpretation of PIPEDA is that the first and second sentences of Clause 4.3.2 have to be read together. On this interpretation, the organization’s obligation is to use reasonable efforts to state the purposes in a manner such that the individual can reasonably understand how the information will be used. However, the Offices’ interpretation of Clause 4.3.2 reads out the first sentence of the Clause.
In 2015 PIPEDA was amended to add Section 6.1 which reads as follows:
For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.
The Offices interpreted Section 6.1 as follows:
In addition, section 6.1 of PIPEDA states that for consent to be valid, an individual must be able to reasonably understand the nature, purposes, and consequences of the collection.
The Offices have interpreted Section 6.1 of PIPEDA differently from its actual text. As shown above, PIPEDA does not state that the individual must be able to reasonably understand the nature, purposes, and consequences of the collection. Rather, it states that it must be reasonable to expect that an individual would understand the nature, purpose, and consequences of the collection.
The Offices’ Actual Requirements for Obtaining Meaningful Consent
As noted above, the Offices found that TikTok did not obtain valid and meaningful consents from its users for tracking, profiling, targeting and content personalization. In particular, the Offices found that:
(i) certain key information about TikTok’s practices was not provided up-front for users to consider when deciding whether to consent; (ii) the Privacy Policy did not provide a sufficiently clear and comprehensive explanation of certain TikTok practices as they relate to the purposes in question and documents providing additional important details were not easily accessible to users; (iii) the Privacy Policy and other relevant privacy communications were not made available in French; and (iv) TikTok failed to adequately explain its collection and use of users’ biometric information.
In giving reasons for the decision the Offices made a number of findings of what they expect organizations to do to operationalize their interpretations of PIPEDA’s meaningful consent requirements. These are summarized below.
- Privacy practices must be explained in a manner such that users can reasonably understand how their personal information will be used, or the nature, purposes, and consequences of the organization’s personal information handling practices.
- Privacy policies and policy related communications must be in French as well as in English.
- The privacy explanations must be sufficiently clear and accessible. Good practices endorsed by the Offices include providing explanations “through just-in-time notices and in a layered format” and in privacy policies “and/or other privacy communications such as feature-specific articles or FAQ”.
- Organizations must generally place additional emphasis on four key elements: (i) what personal information is being collected; (ii) with which parties personal information is being shared; (iii) for what purposes personal information is collected, used, or disclosed; and (iv) the risk of harm and other consequences.
- Key information about privacy practices must be “up-front” and be prominently emphasized when individuals are signing up for an account. The practices cannot be buried in an organization’s lengthy privacy policy and associated privacy documents that few users are likely to read. As applied to TikTok’s primary business model, which is to generate advertising revenue by personalizing content and delivering targeted ads, the Offices expected to see “the following key information explained to users up-front and prominently during account sign-up, so that they can make a meaningful decision about whether to sign-up for an account and engage with the platform:
The various types of personal information that TikTok collects from and about users, including details related to videos viewed and posted, comments posted, user location, device information, system settings, and information from third-party sources. This information is extremely detailed and lengthy.[i]
That personal information will be used, including to analyze and infer user demographics and interests and develop its machine learning tools and algorithms, for purposes of recommending content and delivering targeted ads.
- Documents with further details about privacy practices must be linked to or easily accessible through the privacy policy, and perhaps, otherwise be easy to find.[ii]
- Organizations must inform individuals of their privacy practices “in a comprehensive and understandable manner”. The level of detail said to be required includes that explanations must effectively explain specifically what personal information would be used for each purpose and how it would be used to achieve those purposes. For example, they criticized TikTok’s privacy policy as follows:
- “…the policy does enumerate in detail various types of personal information that TikTok collects under the “Information You Provide”, “Automatically Collected Information”, and “Information From Other Sources” sections. However, it does not effectively explain specifically what personal information would be used for each purpose and how it would be used to achieve those purposes. Instead, in “How we use your information”, the policy provides a long list of TikTok’s potential uses of that information, often with no link between the specific information collected and its potential uses.”
- “Additionally, in the Privacy Policy, TikTok explains several of its complex technologies and privacy practices in a cursory manner that we found to be insufficient to allow users to meaningfully understand the practices that they are being asked to agree to.”
- “TikTok explains that it may use the personal information of individuals “to train and improve … [its] technology, such as … [its] machine learning models and algorithms.” While it is important to inform individuals of this use, we find TikTok’s explanation to be vague; the Privacy Policy provides no insight into what models or algorithms are being trained, how they are being trained, what personal information is being used to train them, the purposes for which they are used, or the consequences of those algorithms for individuals who use the platform.”
- “TikTok’s explanation that it will use personal information “to improve and develop [its] platform and conduct product development” is also incomplete and unclear, as it does not indicate what personal information will be used to improve the platform, or how it will be used for this purpose. In this regard, we note that the Consent Guidelines specifically cite “service improvement” as an example of language that is not meaningful.”
- “TikTok’s Privacy Policy states that it uses personal information “to measure and understand the effectiveness of the advertising and other content [it serves] to [the user] and others, and to deliver advertising, including targeted advertising, to [the user] on the platform.” “This language does not indicate what personal information is being used to target advertising or measure the effectiveness of those ads, nor does it explain how TikTok will use that information to target ads, which, as explained earlier in this report, is a very complex and multi-faceted practice.”
- TikTok’s privacy policy explains that it will collect various types of information (via computer vision), including by detecting “the existence and location within an image of face and body features and attributes”. It further states that these types of information will be used for various purposes, including for “demographic classification” and “content and ad recommendations”. This was criticized by the Offices, stating: “Finally, we note that TikTok does not provide, prominently and up-front during the sign-up process, key information about its practices vis-à-vis biometric information (or facial analysis). A user signing up for TikTok would have no reason to expect that TikTok would conduct an analysis on their facial features and for which purposes, nor are they likely to review TikTok’s lengthy privacy policy to learn about TikTok’s biometric practices. Furthermore, even if a user were to review the full policy, the information provided…does not explain how TikTok will use biometric information, or facial analysis, to estimate their age and gender for purposes of delivering tailored ads and content recommendations.”
- “Valid consent may generally depend on, among other things, the cognitive ability and developmental maturity of individuals.” This requirement was considered to be highly relevant in the case of TikTok, where many users are between the ages of 13 and 17. In relation to TikTok’s steps to obtain youth consent, the Offices stated:
Given these risks and sensitivities, we would expect TikTok to implement a consent model and privacy communications that seek to ensure that individuals aged 13-17 can meaningfully understand and consent to TikTok’s tracking, profiling, targeting and content personalization practices when they use the platform. This includes an expectation that TikTok would develop their communications intended for users aged 13-17 in language that those users can reasonably understand, taking into account their level of cognitive development. TikTok should also make clear to those users the risk of harm and other consequences associated with use of the platform consistent with the Consent Guidelines and section 6.1 of PIPEDA. In light of the fact that younger users may not be aware of the existence and implications of targeted advertising, TikTok’s privacy communications should include prominent up-front notification that targeted ads may be delivered to them on the platform to influence their behaviour.
- The Offices stated that “As part of a robust privacy management program, and consistent with the Consent Guidelines, organizations should test their privacy communications to ensure that information regarding complex technologies and privacy practices is understandable to their intended audience. This is particularly important when the individuals in question are children or youth who may not have the same level of cognitive development as adults.”
- TikTok had argued that its privacy communications were written with all age groups in mind and designed not to overwhelm youth users, and that it instead focused youth communications on safety issues and user interactions. The Offices found this to be inadequate noting “that privacy is also important and must be addressed in a manner that youth can and will understand”.
Comments on the Offices’ Requirements for Obtaining Meaningful Consents
It is hard to imagine a decision dealing with meaningful consent under PIPEDA that is more onerous and unworkable. I am sure much will be written about the practicality of the stated requirements and whether they are actually required under PIPEDA. But, here are some initial thoughts about the requirements. These comments do not focus on whether TikTok complied with PIPEDA, but on the sweeping statements made by the Offices and on their expectations on organizations that are subject to PIPEDA.
- The compliance requirements were, arguably, based on readings of Clause 4.3.2 and Section 6.1 of PIPEDA that do not align with the text of the statute. Nor does the decision purport to balance the interests of individuals with the reasonably practicable efforts organizations should make to be transparent about their privacy practices.
- The requirement that key information about privacy practices be “up-front” and prominently emphasized when individuals are signing up for an account will require a wholesale re-working of many current privacy policies. It will require organizations to examine all of their activities to identify those that have to be highlighted. Moreover, the Offices identified a great deal of information to be disclosed, which will make privacy policies even longer and more complex than they are now.
- The Offices are asking for “comprehensive” policies. The level of detail includes a granular description of each data element collected, a description of how each data element is used and disclosed, and how it would be used to achieve that purpose. So, for example, if 100 different types of personal information are collected by an organization, and each is used for 10 purposes, and 5 processes are used to achieve each purpose, there would need to be no fewer than 5,000 disclosures in the privacy policy, just to comply with this requirement. Perhaps there could be some groupings, but the complexity could be daunting and easily overwhelm users of a service.
- The Offices require explanations of how complex technologies actually work. Today, many privacy disclosures state that the organization may use the personal information of individuals “to train and improve … [its] technology, such as … [its] machine learning models and algorithms.” This will no longer be acceptable. Rather, the disclosures will need to provide “insight into what models or algorithms are being trained, how they are being trained, what personal information is being used to train them, the purposes for which they are used, or the consequences of those algorithms for individuals who use the platform.” Today, organizations are increasingly using AI, including machine learning models. How these models are actually used is constantly changing and is often considered to be highly confidential information. Even assuming this kind of compliance could be achieved within the ecosystems of businesses and supply chains that rely on AI systems, the level of detail would fill numerous lengthy and hard-to-understand architectural flow charts and other documents. It is also hard to imagine the level of detail that would be entailed by a privacy policy that has to anticipate all of the possible consequences of the uses of AI.
- Today, many privacy disclosures state that the organization will use personal information “to improve and develop [its] platform and conduct product development”, or use words to this effect. The Offices, however, want disclosure of “what personal information will be used to improve the platform, or how it will be used for this purpose.” This requirement will also drive changes and be hard to manage, especially as processes are constantly evolving as businesses innovate.
- The Offices say that privacy policies have to disclose specifically what personal information is being used for targeted advertising or to measure the effectiveness of those ads, and how this information is being used, which, as the decision explains, “is a very complex and multi-faceted practice.”
- The Offices state that a valid consent “may generally depend on, among other things, the cognitive ability and developmental maturity of individuals.” Even assuming this is required under PIPEDA, how exactly is this to be implemented in practice? Imagine an enterprise that does business with children, teenagers, adults, seniors, people with mental impairments such as individuals with intellectual disabilities, dementia, or early Alzheimer’s, or an organization that also collects information from newcomers to Canada. Do their privacy disclosures have to be tailored to each and every demographic and in every language for valid consents to be obtained? Do businesses now have to have multiple privacy policies, all with different wording? Further, how could businesses ever provide the comprehensive level of detail being required for every demographic, including for younger users or seniors with mental challenges? The Offices’ statement that “privacy is also important and must be addressed in a manner that youth can and will understand” is hard to reconcile with the comprehensive level of detail the Offices are requiring, including technical details that only engineering graduates or privacy regulators might be able to truly understand.
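The combinatorial estimate in the comments above can be sketched in a few lines of code. The figures below (100 data elements, 10 purposes, 5 processes) are the hypothetical numbers used in this post, not figures from the decision:

```python
# Rough illustration of how the Offices' granularity requirement could scale.
# These counts are the hypothetical figures used in this post, not facts
# from the TikTok findings.
data_elements = 100   # distinct types of personal information collected
purposes = 10         # purposes each data element is used for
processes = 5         # processes used to achieve each purpose

# One disclosure per (data element, purpose, process) combination.
disclosures = data_elements * purposes * processes
print(disclosures)  # 5000
```

Even generous grouping of similar elements would only divide, not eliminate, this multiplicative growth, which is why the comment above suggests the complexity could easily overwhelm users.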
Final Comment
The Offices’ views on what is required to comply with privacy laws are premised, on their view of the law, on what users of services such as TikTok would reasonably understand. Their views appear to assume, however, that users would want to read and understand policies as comprehensive as the ones they prescribe in order to reasonably understand how their personal information will be handled. The details the Offices are asking for are what privacy professionals and regulators might ask for to understand how personal information is handled. Yet one has to wonder whether the standard of disclosure being demanded needs to be set at the level of inquiry of regulators rather than that of ordinary users of services.
[i] The Offices referred to paragraph 26 of the Findings, which listed the following information that was expected to be highlighted.
TikTok represented that it collects a wide variety of information and uses it for various purposes, including for targeting advertising and personalizing content that it shows to users, moderating content, and ensuring platform security and compliance with its Community Guidelines. TikTok stated that the information it collects about users for content personalization and targeted advertising can include:
- Profile information (email, phone number, biographical information like name, age, gender, etc.);
- User-generated content posted by the user on the platform (videos, images, comments, hashtags, metadata, etc.);
- Personal information derived from ‘Computer vision’ and audio analytics of the content of videos and images (including biometrics);
- Engagement with content and ads (such as viewing behaviour, sharing, liking, browsing history, indicators of interest or lack of engagement, comments, search activity, etc.);
- Purchase information (transaction history, etc.);
- Device information (IP address, mobile device information, mobile carrier, operating system data, network information, advertising IDs on mobile devices, system settings, etc.);
- Contacts (contact list from device, and other social media profiles like ‘Facebook Friends’) as well as ‘Friends’ on the platform, mutual connections (suggestions based on users following the same accounts), and suggesting accounts where an individual’s information is in another user’s contact list;
- Geolocation data (approximate location, approximated to three-square kilometres); and
- Information shared with TikTok by third-party partners through tools (e.g., advertisers who share their advertising event measurement data through TikTok’s ad measurement Pixel or Events API, and measurement partners who help digital platforms more accurately attribute and measure the impact of ad campaigns).
[ii] See the following from the Findings:
- As highlighted in the Consent Guidelines, information about a company’s privacy practices must be provided to individuals in a manageable and easily accessible format. If the information is scattered across the website and difficult to find, it will not adequately inform individuals of an organization’s practices.
- For example, TikTok shared with the Offices a link to an “Ads and your data” article in its “Privacy Centre”, which sets out what personal information is used for personalized advertising, explains how data is shared with measurement partners, and how effectiveness is measured. In turn, that article links to TikTok’s “Business Center”, which provides additional detail on ad measurement tools. While each of these explanations would assist users in understanding TikTok’s targeted advertising practices, neither the article, nor the Privacy Centre generally, are available through or referenced in the Privacy Policy. Therefore, despite being cited by TikTok as an element of its efforts to obtain consent for its collection and use of personal information, these articles are not readily available to individuals at the time that consent is requested.
- Similarly, with respect to content (e.g., specific videos) recommendations, TikTok has a detailed article in its Help Centre titled “How TikTok recommends content”, which introduces the content recommendation system, explains what user information is used for content recommendation, and sets out how an individual can exercise control over what is recommended to them. However, neither this article, nor the Help Centre, are easily accessible through the Privacy Policy.
