
In brief
Canadian privacy regulators are increasing their scrutiny of social media and digital platforms, focusing on how those platforms handle minors’ personal data, including data from younger users who bypass age restrictions. This heightened oversight is likely to raise expectations for youth-focused design in services and platforms that younger users tend to use.
Age assurance and children’s privacy have emerged as key priorities, highlighted in the Office of the Privacy Commissioner’s 2024 stakeholder consultation and embedded in its 2024-2027 strategic plan. Regulators have clearly signaled that they will now apply a children’s privacy lens to youth-related investigations and data breach assessments to drive compliance and raise awareness. However, what this lens entails remains undefined. While consultations have explored broader regulatory themes such as accountability and risk-based approaches, the absence of detailed guidance leaves platforms uncertain whether their current practices meet developing regulatory expectations.
Recent investigations in 2025 have begun to outline those expectations, with certain privacy practices now explicitly identified as insufficient. These actions serve as implicit guidance on the standards and considerations necessary to respond to developing regulatory trends.
A key emerging regulatory principle is the application of a children’s privacy lens to existing platform practices. Recent investigation findings suggest this lens involves a contextual analysis, one that considers not only user demographics but also whether the collection and use of minors’ data aligns with the organization’s stated purpose. Platforms with significant youth engagement are increasingly expected to implement tailored safeguards. Where data is used for purposes such as targeted advertising or content personalization, robust age assurance mechanisms and youth-friendly privacy communications are becoming prerequisites for appropriate collection and use. Regulators have now clearly flagged several practices as insufficient, signaling where platforms must strengthen safeguards to meet compliance expectations:
- On platforms with a significant youth user base, relying solely on easily bypassed self-declared age gates is inadequate for age assurance; it may lead to inappropriate data collection from minors and trigger enforcement.
- On the same platforms, consent practices for users aged 13 to 17 must be tailored to their cognitive maturity; generic, adult-oriented language risks invalidating consent altogether.
With enforcement already outpacing formal guidance, regulatory action is setting the standard, and platforms should respond proactively rather than wait for clearer direction. Recent investigations make clear that platforms popular with minors, especially those that are easily accessible and lack tailored privacy communications, should reassess their practices. Key areas for consideration include:
- Layered age assurance mechanisms: evaluate whether multiple layers of age assurance (e.g., self-declaration, AI-based estimation, parental controls) are contextually appropriate given current data collection practices. This may help mitigate the risk of incidentally profiling or targeting minors (see the sketch after this list).
- Youth-appropriate privacy communications: review whether privacy policies and consent mechanisms use plain, age-appropriate language that aligns with the cognitive development of minors.
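To make the layered approach concrete, the sketch below shows one way multiple age signals might be combined so that no single, easily bypassed layer determines how an account’s data is processed. It is a minimal illustration only: the names (AgeSignals, treat_as_minor, may_personalize_ads) and the two-signal corroboration rule are hypothetical assumptions, not a standard drawn from the investigations or from regulator guidance.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AgeSignals:
    """Independent age-assurance layers a platform might hold for one account.
    All field names are hypothetical, chosen for illustration."""
    self_declared_age: Optional[int]   # age gate at sign-up: easily bypassed on its own
    estimated_age: Optional[int]       # e.g., AI-based age estimation from usage patterns
    parent_linked_account: bool        # account is under verified parental controls


def treat_as_minor(signals: AgeSignals, adult_age: int = 18) -> bool:
    """Layered check: default to the protective (minor) treatment unless
    independent layers corroborate that the user is an adult."""
    if signals.parent_linked_account:
        return True  # parental controls imply a known minor
    declared_adult = (
        signals.self_declared_age is not None
        and signals.self_declared_age >= adult_age
    )
    estimated_adult = (
        signals.estimated_age is not None
        and signals.estimated_age >= adult_age
    )
    # A self-declared age gate alone is the practice regulators flagged as
    # inadequate, so adulthood here requires a second, independent signal.
    return not (declared_adult and estimated_adult)


def may_personalize_ads(signals: AgeSignals) -> bool:
    """Gate higher-risk processing (e.g., targeted advertising) on the layered check."""
    return not treat_as_minor(signals)
```

The design point worth noting is the default: absent corroborating evidence of adulthood, the account receives the protective treatment, which reflects the risk-based posture the investigations describe.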