The Hidden Privacy Risks of Pixels, Cookies and Third-Party Sharing

Have you ever seen an ad pop up for something you were just researching, but never actually told anyone about?    

You didn’t say it aloud. You didn’t type it into that platform. But suddenly, there it is: an eerily specific promotion for the exact product, article or service you’d only browsed elsewhere.    

It feels like the platform is reading your mind. But what it’s really doing is reading your data.    

In today’s digital economy, every click, scroll, and interaction feeds a sprawling web of tracking technologies. Behind the sleek interfaces of websites and apps lies an invisible architecture — pixels, cookies, tags and scripts — designed to silently collect, share and monetise user information at scale.    

These technologies underpin the personalisation engines of the internet. They power targeted advertising, fuel sophisticated analytics, and generate the behavioural insights that drive modern marketing.    

But while they deliver significant commercial value, they also raise increasingly complex privacy and compliance challenges.    

As regulators sharpen their focus on digital tracking, and as enforcement powers (and potentially compliance obligations) under the Privacy Act 1988 (Cth) (Privacy Act) expand, many organisations are facing uncomfortable questions:    

  • Do we really understand what tracking technologies are operating on our sites?
  • Are we meeting our obligations under Australian privacy law — and if not, where are the gaps?
  • And what legal, reputational or commercial risks might be lurking in our martech stack?

To answer those questions, we need to start with the basics — what these technologies are, how they work, and why privacy compliance isn’t as simple as it may seem.    

What Are Cookies, Pixels, Tags and Scripts?    

Most of us click “accept all” on cookie banners without a second thought. But beneath that single tap sits a complex system of tools working together to observe, remember, and profile user behaviour, often without meaningful visibility or control.    

Let’s unpack the key technologies doing the heavy lifting in the digital surveillance economy.    

Cookies: Digital Sticky Notes    

Cookies are small text files stored on a user's device when they visit a website. Think of them as digital sticky notes, designed to remember information about your visit and enable various website functions.    

There are several types of cookies, each serving different purposes:    

  • First-party cookies are created by the website you're actively visiting. They store information like your login status, language preferences or shopping cart contents. These are typically essential for site functionality and user experience.
  • Third-party cookies are created by domains other than the one you’re visiting, usually via embedded services like advertising platforms, analytics tools or social media widgets. These cookies can track you across multiple websites and are often used for behavioural profiling and targeted advertising.
  • Session cookies are temporary. They last only for the duration of a browser session and are typically used to keep users logged in or maintain their progress through multi-step processes.
  • Persistent cookies remain on a user’s device for a defined period, sometimes months or years. They enable longer-term tracking and can be used to remember preferences or support targeted marketing efforts.

Common uses of cookies include:    

  • Authentication: Keeping users logged in across multiple pages
  • Personalisation: Remembering settings and preferences
  • Analytics: Understanding how users interact with websites
  • Advertising: Enabling behavioural targeting across sites
  • E-commerce: Maintaining shopping cart contents and purchase flows
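The session/persistent distinction above comes down to a single attribute on the HTTP `Set-Cookie` header. A minimal sketch using Python's standard library (illustrative values only; real sites emit these headers from their web servers):

```python
from http.cookies import SimpleCookie

# A server "remembers" a visitor by sending Set-Cookie response headers.
cookies = SimpleCookie()

# Session cookie: no expiry attribute, so it vanishes when the browser closes.
cookies["session_id"] = "abc123"
cookies["session_id"]["path"] = "/"

# Persistent cookie: Max-Age keeps it on the device (here, ~1 year),
# enabling longer-term tracking or preference storage.
cookies["lang_pref"] = "en-AU"
cookies["lang_pref"]["max-age"] = 60 * 60 * 24 * 365

# Each entry becomes one Set-Cookie header in the HTTP response.
headers = [morsel.OutputString() for morsel in cookies.values()]
for h in headers:
    print("Set-Cookie:", h)
```

Whether a cookie is "first-party" or "third-party" is not visible in the header itself; it depends on whether the domain setting it matches the site in the address bar.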

Tracking Pixels: The Invisible Observers    

Tracking pixels, also called pixel tags, web beacons or clear GIFs, are tiny (often 1x1 pixel), transparent images embedded into websites, emails or ads. While invisible to users, they’re powerful tools for silently collecting behavioural data.    

How pixels work: When a user loads a page or opens an email containing a tracking pixel, the pixel sends a request to the host server. This request captures key metadata about the user and their interaction, including:    

  • IP address and approximate location
  • Device and browser type
  • Timestamp of the action
  • Referrer URL (i.e. the previous site visited)
  • Page views or specific actions taken

This data is then used to monitor engagement, measure conversions, and build detailed user profiles, and is often linked with advertising platforms like Meta, Google, or LinkedIn.    

Common applications include:    

  • Conversion tracking: Identifying when a user completes a specific action (purchase, form submission, download)
  • Remarketing: Re-targeting users with ads across different platforms
  • Attribution: Determining which channels or campaigns led to a result
  • Email engagement tracking: Monitoring email opens and link clicks
  • Audience segmentation: Building advertising audiences based on behaviour

Tags and Scripts: The Behind-the-Scenes Code    

Tags and scripts are snippets of code embedded in websites to run third-party services. These can include analytics tools (like Google Analytics), advertising platforms, heatmaps, live chat widgets, A/B testing tools, and more.    

Often managed through platforms like Google Tag Manager, these scripts load additional content or tracking functions, sometimes dynamically, based on user behaviour or device characteristics.    

While tags themselves don’t necessarily store data, they often trigger tracking pixels or cookies, initiate API calls to third-party servers, or enable fingerprinting techniques. Their cumulative effect can be powerful and opaque.    
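That chain, where a tag stores nothing itself but triggers cookies and pixels when it fires, can be sketched as follows (all names and endpoints are illustrative):

```python
# Toy sketch of a tag triggering further tracking: the tag holds no data,
# but firing it drops an identifier cookie and requests a tracking pixel.

def fire_analytics_tag(page: str) -> dict:
    return {
        # The tag drops its own long-lived identifier cookie...
        "cookie_set": {"name": "_an_id", "value": "u-42", "max_age_days": 730},
        # ...and requests a pixel that reports the page view to a third party.
        "pixel_request": f"https://collect.example.com/g.gif?dp={page}",
    }

result = fire_analytics_tag("/pricing")
print(result["pixel_request"])
```

This indirection is why auditing a site's tags, not just its cookies, matters: the tag is the trigger, and the data flows it initiates are easy to miss.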

The Invisible Data Collection Network    

When we browse the web, we often assume we’re interacting with a single website. But in reality, each visit can activate a hidden constellation of trackers, quietly harvesting data and sending it to a wide range of third parties.    

At the heart of this system are cookies, pixels, tags and scripts working in tandem to track users across the digital ecosystem.    

Cross-Site Tracking: A Web of Observers    

The true power of pixels and cookies lies in their ability to track users across multiple websites, not just within a single session. A single page visit on a retail site, news outlet, or blog can trigger dozens of calls to third-party servers, many of which are invisible to the user.    

For example, a typical commercial website might include:    

  • Google Analytics pixels to monitor user behaviour and site performance
  • Meta Pixel for Facebook and Instagram retargeting
  • LinkedIn Insight Tag for professional audience tracking
  • Programmatic advertising cookies from multiple ad networks
  • Newsletter tracking pixels from email marketing providers
  • Customer support chat tools that drop their own tracking cookies

Each of these third parties may collect different categories of personal and behavioural information, and many of them operate far outside the immediate control or awareness of the website operator.    

Data Combination and Profiling    

These individual data points are rarely viewed in isolation. Instead, they’re stitched together to form detailed user profiles, often enriched with inferred attributes and behavioural predictions.    

Combined across websites and platforms, this data can reveal sensitive information such as:    

  • Financial behaviour: spending patterns, credit interest, or loan applications
  • Health concerns: visits to medical or wellness pages
  • Political leanings: engagement with news, advocacy or party content
  • Relationship status: inferred from browsing history or e-commerce behaviour
  • Career trajectory: job-hunting signals or professional engagement
  • Personal identity: gender, age range, household composition, and hobbies

While some of this information may be inferred rather than directly provided, the resulting profiles can be startlingly accurate and are often used to fuel programmatic advertising, content personalisation, and data brokerage.    

The Scale of Tracking    

The scale of this invisible surveillance network is difficult to overstate. Studies have shown that popular websites, including news, retail and entertainment platforms, can contain hundreds of third-party tracking technologies.    

A single page load can transmit personal and device information to dozens of companies — many of which users have never interacted with directly, and may not even recognise by name.    

This creates a sprawling, opaque data-sharing ecosystem where:    

  • Consent is often meaningless or absent
  • Third-party data flows are poorly understood
  • Control and accountability are fragmented

From a legal and ethical perspective, this raises serious questions about reasonable expectations of privacy, cross-border data disclosures, and compliance with notice and consent requirements under Australian law.    

The Hidden Web of Digital Tracking and the Law    

So, behind every website visit lies a data-rich ecosystem, quietly capturing insights about user behaviour through a network of tags, cookies and pixels. These technologies enable personalisation, performance monitoring and digital marketing but they also create a complex web of privacy risks and legal obligations that are often poorly understood.    

For Australian organisations, the stakes have never been higher. With privacy regulators zeroing in on online tracking practices, and consumer awareness at an all-time high, understanding how these technologies operate and how they intersect with the Australian Privacy Principles (APPs) is critical.    

The cost of getting it wrong extends far beyond regulatory penalties. Reputational damage, loss of consumer trust, and potential litigation are all very real consequences for those who fail to manage digital tracking appropriately.    

This article explores critical privacy issues associated with cookies, pixels and related tracking tools, providing a practical roadmap for compliance, risk mitigation, and building trust in the age of invisible surveillance.    

Recent Regulatory Developments in Australia    

The regulatory environment around digital tracking has sharpened significantly over the past 12 months, and 2025 is shaping up to be a watershed year for privacy reform and enforcement.    

As Australian Privacy Commissioner Carly Kind recently put it: “2025 is going to be a big year for privacy” — and that includes a big year for enforcement. That prediction is already materialising, with a string of recent developments that reshape the compliance expectations for organisations using cookies, pixels and similar technologies.    

OAIC Guidance on Tracking Pixels (November 2024)    

In November 2024, the Office of the Australian Information Commissioner (OAIC) issued specific guidance on the use of tracking pixels, marking a significant shift in regulatory attention.    

The guidance emphasises that pixels, like cookies and other tracking technologies, enable “granular user surveillance across the internet and social media platforms”, creating heightened privacy risks. While acknowledging their commercial utility for advertising, analytics and ROI measurement, the OAIC has made clear that these tools sit squarely within the scope of privacy law when they collect or disclose personal information.    

Organisations are now expected to assess and understand the operation of any tracking pixels deployed on their digital assets and ensure that their use complies with the APPs.    

Enhanced Enforcement Powers    

The OAIC’s regulatory toolkit has also undergone a significant upgrade.    

As a result of the passing of the Privacy and Other Legislation Amendment Act 2024, the OAIC now has:    

  • Higher penalties: up to $3.3 million for interference with privacy, and up to $330,000 for certain breaches of the APPs
  • New powers to issue infringement notices (effectively on-the-spot fines) and compliance notices with enforceable directions

This expanded enforcement arsenal gives the regulator a more nuanced and proportionate approach to enforcement, enabling it to target non-compliant practices at all levels of severity, rather than reserving action only for major breaches.    

The ‘Set and Forget’ Warning    

The OAIC’s recent guidance delivers a clear warning: organisations must not take a ‘set and forget’ approach to tracking pixels or other third-party data collection tools.    

Deploying a tracking pixel is not a one-time compliance checkbox. It requires:    

  • Ongoing due diligence into how the pixel works
  • An understanding of what data is collected, where it goes, and who can access it
  • Risk assessments to determine whether personal information is involved
  • Controls and contractual safeguards to ensure third-party compliance
  • Clear, accessible privacy notices that reflect these tracking practices

Failure to meet these expectations could expose organisations to regulatory action, particularly as the OAIC looks to set legal precedents in this evolving area.    

A Shifting Enforcement Landscape    

While the OAIC acknowledges that the assessment as to whether the Privacy Act applies won’t always be clear-cut, its recent messaging leaves little room for complacency.    

In her Privacy Awareness Week 2025 address, Privacy Commissioner Carly Kind made it clear that digital tracking is now a key enforcement priority. She described existing guidance as inadequate given the complexity and opacity of the online advertising ecosystem and signalled that the OAIC will increasingly pursue strategic enforcement to establish legal precedent in this space.    

That means organisations that misjudge whether their tracking activities involve personal information may find themselves serving as unwilling test cases. The risks are not just theoretical. They are legal, reputational, and strategic.    

Where regulatory guidance falls short, enforcement action will fill the gap. And in this environment, adopting a conservative, privacy-by-design approach is no longer a “nice to have”. It’s the safest path forward.    

Global Context: Australia Is Not Alone    

Australia’s enforcement trajectory reflects a broader global trend.    

In the EU, regulators have taken action against websites using pixels without valid consent under the GDPR. In the US, class actions have been filed over marketing emails embedded with “spy pixels,” alleging violations of state privacy and wiretap laws. In Canada, the use of tracking pixels without adequate consent has also drawn regulatory scrutiny.    

These international movements are important signals: Australian organisations are unlikely to remain immune from similar enforcement models, especially where cross-border data flows or platform integrations are involved.    

Why APP 7, Not the Spam Act, Applies    

Before diving into the specific privacy issues, it’s important to note that the Spam Act 2003 (Cth) does not apply to digital targeted advertising delivered through pixels and cookies.    

The Spam Act regulates unsolicited commercial electronic messages (e.g. emails, SMS, and instant messages) sent directly to individuals. But it does not cover targeted advertising that appears on websites or apps as a result of tracking or behavioural profiling.    

Instead, these practices are governed by the direct marketing provisions under APP 7. APP 7 applies whenever an organisation uses or discloses personal information for the purpose of direct marketing, including through online advertising platforms and adtech networks that receive tracking data from cookies or pixels.    

This distinction matters. Under APP 7, organisations must:    

  • Only use personal information for direct marketing in limited circumstances
  • Provide a clear and simple way for individuals to opt out
  • Give notice about direct marketing practices in their privacy policy and collection notices
  • Be especially cautious when dealing with sensitive information, which requires explicit consent

If data collected via pixels or cookies constitutes personal information, as the OAIC has indicated may often be the case, then APP 7 is engaged, and compliance obligations follow.    

The Current Privacy Landscape    

As set out above, the digital advertising ecosystem has become a sophisticated and largely opaque data collection machine. When a user lands on a single website, their activity can be monitored by dozens of trackers, each capturing pieces of information that, when combined, form detailed behavioural profiles.    

These profiles may be enriched with demographic, social, and transactional data, and then used or sold across the advertising supply chain. Yet in many cases, the individuals being tracked have little visibility of this data ecosystem and no meaningful control over how their information is used.    

Australian privacy law does apply to much of this tracking activity, but the boundaries of what is covered under the Privacy Act remain blurred, particularly when it comes to determining whether technical identifiers or certain inferred data count as “personal information.”    

Are Cookies and Pixels Personal Information Under the Privacy Act?    

The answer, as the OAIC has acknowledged, is complex.    

Under the Act, personal information is defined as “information or an opinion about an identified individual, or an individual who is reasonably identifiable.” But in the digital tracking context, the lines between personal and anonymous data are increasingly difficult to draw.    

In its November 2024 guidance on tracking technologies, the OAIC made clear that data doesn’t need to directly identify someone to fall within scope. Information such as IP addresses, hashed email addresses, device identifiers, or even URL parameters may be considered personal information, particularly when it can be matched or linked with other datasets to re-identify individuals.    
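The “hashed email addresses” point is worth making concrete: hashing is deterministic, so any platform holding the same email can compute the same digest and match the two records. A minimal sketch (the normalisation step shown is a common pattern, not a mandated one):

```python
import hashlib

def hash_email(email: str) -> str:
    # Normalise then hash -- a typical "hashed email" matching pattern.
    # SHA-256 is deterministic: identical input always yields the same digest.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# The website and a third-party platform each hash independently...
sent_by_website = hash_email("Jane.Citizen@example.com ")
held_by_platform = hash_email("jane.citizen@example.com")

# ...and the digests match, so the platform can re-identify the individual
# against its own user database. Hashing here is a join key, not anonymity.
print(sent_by_website == held_by_platform)  # True
```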

This is a critical point: the identifiability threshold is not about what one organisation can do in isolation, but what is possible when data is aggregated or shared, especially with powerful third-party platforms like Meta, Google, or LinkedIn.    

As the OAIC put it, organisations should assume that if a third-party platform can match data from a tracking pixel with its broader user database, then the information being disclosed is likely to be personal information, thus triggering obligations under the APPs for both the business deploying the pixel and the third party receiving the data.    

Privacy Risks and Legal Issues: What Organisations Must Consider    

The use of pixels, cookies and tracking technologies opens the door to a wide range of privacy risks, many of which organisations may not be fully aware of until it’s too late.    

These tools often operate silently in the background, enabling a level of user profiling and behavioural targeting that would be impossible through direct data collection alone. But this passive nature is precisely what makes them so problematic from a legal and ethical standpoint.    

At the heart of these risks lies a tension between personalisation and privacy: how can businesses use tracking data to deliver relevant digital experiences while still respecting individuals' rights, expectations and legal protections?    

In the next section, we explore eight key privacy issues that organisations must address when using tracking technologies, providing practical guidance to help navigate compliance while still enabling legitimate digital marketing and analytics objectives.    

1. Over-Collection: Just Because You Can, Doesn’t Mean You Should    

Relevant APP: APP 3 (Collection of Solicited Personal Information)    

The Problem    

Tracking technologies often operate with an “all you can eat” approach to data collection, capturing every click, scroll, hover, and session detail, regardless of whether that information is actually needed.    

But under APP 3, organisations must only collect personal information that is reasonably necessary for their functions or activities. Collecting more data than is needed, even passively, through cookies or pixels, creates legal risk and accountability gaps, especially if the information is never used or poorly governed.    

Equally important is how that data is collected. APP 3 also requires that personal information be collected by lawful and fair means. Tracking individuals in a way that is opaque, excessive or unexpected, particularly without proper disclosure, may breach this requirement, even where the data itself is technically obtainable.    

Real-World Impact    

A financial services provider embeds a session replay tool to record user interactions with its online application portal. The tool captures every keystroke, including partial form inputs, user navigation, and even sensitive data that was never submitted.    

The business doesn’t review most of the recordings, has no documented purpose for retaining this level of detail, and provides no real-time disclosure to users. In this scenario, the organisation risks breaching APP 3 by collecting more personal information than is reasonably necessary and by doing so in a way that may not meet the test of fair and lawful collection.    

Key Considerations    

  • ‘Reasonably necessary’ is an objective test. It requires more than just convenience or potential future use. Organisations must be able to demonstrate why each category of personal information collected is essential to their core activities.
  • Fairness is contextual. Passive collection may not be “fair” if it is unexpected, intrusive, or disproportionate to the purpose. This includes practices that involve tracking users before they’ve consented, or collecting information about non-customers or incidental visitors.
  • Tools that indiscriminately collect excessive metadata, user behaviour logs, or screen recordings should be carefully reviewed for necessity, scope, and proportionality.

Compliance Strategy    

  • Conduct purpose-based data audits: Review each tracking tool and identify exactly what data is collected and whether that collection aligns with a defined, legitimate business purpose.
  • Apply the ‘reasonably necessary’ test: For each data point, ask whether it is truly needed for your organisation’s functions or activities. If not, remove or limit collection.
  • Implement data minimisation settings: Configure third-party tools (e.g. pixels, session recorders, analytics platforms) to avoid collecting unnecessary or sensitive information.
  • Avoid vague or speculative use cases: “We might use this someday” is not a valid justification under APP 3. Collect only what you need, when you need it.
  • Assess fairness in context: Consider whether the method of collection aligns with user expectations and is appropriately disclosed.

2. Consent Without Clarity: The Illusion of Choice    

Relevant APPs: APP 3 (Collection of Solicited Personal Information), APP 5 (Notification of Collection), APP 6 (Use and Disclosure), APP 7 (Direct Marketing)    

The Problem    

In an era of endless cookie banners, many organisations assume that obtaining user “consent” is as simple as including a pre-ticked box or generic notice in a privacy policy.    

But under the Privacy Act, consent must be voluntary, informed, specific, current, and given by someone with capacity, and for many tracking technologies, it often isn’t.    

Whether it’s a cookie drop on first page load or the silent operation of tracking pixels buried deep in marketing tags, users are frequently unaware that personal information is being collected, let alone that it’s being disclosed to third parties for advertising, analytics, or profiling.    

This creates clear compliance risks under APP 3 (collection of personal information), APP 5 (notification of collection), APP 6 (use and disclosure), and APP 7 (direct marketing), particularly where the data is sensitive or used for behavioural advertising.    

Real-World Impact    

A user visits an e-commerce site for the first time. Before they can click or scroll, third-party advertising scripts load and begin collecting data, including their IP address, location, and page activity. No visible consent banner is presented. The privacy policy contains general language about third-party analytics but doesn’t identify recipients or provide opt-out options.    

The site is technically collecting and disclosing personal information for direct marketing purposes, without providing proper notice or obtaining valid consent. This scenario, common across Australian websites, risks breaching multiple APPs.    

Key Considerations    

Under OAIC guidance, for consent to be valid, it must be:    

  • Voluntary – freely given, not forced or assumed
  • Informed – based on clear, specific information about what is being collected and why
  • Current – not bundled, expired, or outdated
  • Capable of being withdrawn – with a simple and accessible opt-out mechanism

Where tracking technologies involve sensitive information, such as health-related content or inferences about ethnicity, sexuality, or political views, consent is required for collection and use for direct marketing. Silence, pre-ticked boxes, or generalised statements buried in policies are not enough.    

Compliance Strategy    

  • Avoid pre-emptive tracking: Ensure no third-party tracking tools load before the user has had the opportunity to make an informed choice (particularly for non-essential cookies and pixels).
  • Implement clear consent banners: Use layered notices that identify the categories of trackers, the types of data collected, and the purposes of use/disclosure, especially where data will be used for advertising. Be particularly careful when collecting sensitive information; if you are, make sure your consent mechanisms are clear.
  • Link to a detailed privacy dashboard: Give users granular control over different types of tracking and the ability to opt out at any time.
  • Maintain records of consent: Log when and how consent was obtained, and ensure mechanisms are in place to respect withdrawal of consent.
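Two of those steps, blocking non-essential trackers until a choice is made and logging when and how consent was given, can be sketched together (a hypothetical structure, not any specific consent-management platform's API):

```python
from datetime import datetime, timezone

# Minimal consent record plus loading gate. Field names are illustrative.
consent_log: list[dict] = []

def record_consent(user_id: str, granted: set[str], mechanism: str) -> None:
    consent_log.append({
        "user": user_id,
        "granted": sorted(granted),        # e.g. ["analytics"]
        "mechanism": mechanism,            # how consent was obtained
        "at": datetime.now(timezone.utc).isoformat(),
    })

def may_load(user_id: str, category: str) -> bool:
    # No record yet => no pre-emptive tracking: only essentials may run.
    for entry in reversed(consent_log):    # most recent record wins
        if entry["user"] == user_id:
            return category in entry["granted"]
    return category == "essential"

# Before any banner interaction, advertising trackers must not load.
assert may_load("u1", "advertising") is False

record_consent("u1", {"analytics"}, mechanism="banner v3, explicit click")
print(may_load("u1", "analytics"), may_load("u1", "advertising"))
```

Keeping the timestamp and mechanism alongside each grant is what lets an organisation later evidence that consent was current and informed.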

3. The Transparency Challenge: Hidden Data Flows    

Relevant APPs: APP 1 (Open and Transparent Management of Personal Information), APP 5 (Notification of Collection)    

The Problem    

Most users remain completely unaware of how much of their data is collected and shared via tracking pixels and cookies. A single visit to a typical website might trigger code from Google Analytics, the Meta Pixel, the LinkedIn Insight Tag, and a suite of other third-party platforms, each initiating invisible data flows that transmit personal information to external entities.    

What makes this particularly problematic is the lack of meaningful transparency. These technologies often operate silently, with no visible signal to users, no practical notice, and no effective opportunity to understand, let alone control, what’s happening behind the scenes.    

Real-World Impact    

Consider a healthcare website embedding the Meta Pixel to track conversion metrics from a public health campaign. Every visitor’s IP address, device metadata, and browsing behaviour, potentially including interest in sensitive health conditions, is transmitted to Meta’s servers.    

The user has no visibility into this exchange. There is no prompt, no context, and often no specific disclosure in the privacy policy. Yet these transmissions may qualify as collection and disclosure of sensitive personal information, with serious implications under the Privacy Act, particularly if done without proper notice and consent.    

The OAIC's Position: No Excuses    

In its November 2024 guidance, the OAIC made it clear: Australian organisations are responsible for ensuring that tracking pixels and similar tools are implemented in a privacy-compliant way. That includes:  

  • Understanding the functionality of each tracking tool
  • Assessing privacy risks prior to implementation
  • Implementing appropriate safeguards and controls
  • Regularly reviewing and updating the deployment of tracking technologies

The era of “set and forget” is over. Transparency is no longer just good practice; it’s a legal requirement.    

Compliance Strategy    

To meet your obligations under APP 1 and APP 5, transparency must be built into every stage of your digital ecosystem:    

  • Audit your digital ecosystem: Conduct a full audit of all tracking tools currently in use, including pixels, cookies, SDKs and embedded scripts.
  • Map data flows: Document what data is collected by each mechanism, where it goes, and whether it constitutes a disclosure of personal information.
  • Enhance privacy notices: Update your privacy policy to specifically name third-party recipients and explain the types and purposes of data sharing in clear, user-friendly terms.
  • Point-of-collection disclosure: Ensure users are notified at or before the time their information is collected, not just via a static privacy policy, but through real-time notices or banners where appropriate.
  • Due diligence and documentation: Keep detailed internal records of your audits, risk assessments, implementation decisions, and vendor evaluations. These documents form your evidence base for demonstrating compliance and may be critical in the event of regulatory scrutiny.

4. Secondary Use: When Data Takes Unexpected Journeys    

Relevant APPs: APP 6 (Use or Disclosure of Personal Information), APP 7 (Direct Marketing)    

The Problem    

Data collected for one purpose, like website analytics or service optimisation, often finds its way into other parts of the business. It might be reused for targeted advertising, retargeting campaigns, audience profiling, or integrated into customer management systems.    

But these secondary uses frequently go beyond the user’s original understanding of why the data was collected and, in many cases, beyond what the Privacy Act permits. This creates compliance risk under APP 6, which governs all secondary use, and APP 7, which adds specific rules for direct marketing.    

The Compliance Risk    

Under APP 6, an organisation must not use or disclose personal information for a secondary purpose unless:    

  • The individual has consented, or
  • The secondary purpose is related to the original purpose, and the individual would reasonably expect it (for sensitive information, it needs to be directly related).

However, when that secondary purpose is direct marketing, APP 7 applies in addition and sets out a stricter threshold.    

Under APP 7.2, if the organisation collected the information directly from the individual, it may only use it for direct marketing if:    

  • The individual would reasonably expect it to be used for that purpose
  • There is a simple opt-out mechanism, and
  • The individual hasn’t made a prior request not to receive marketing

Under APP 7.3, if the individual would not reasonably expect their information to be used for direct marketing, or the information is collected from someone other than the individual, then the organisation must either:    

  • Obtain consent, or
  • Satisfy the test that it is impracticable to obtain consent, and include an opt-out in each message along with a clear statement drawing the individual’s attention to it.

For data collected indirectly, such as via tracking pixels embedded in webpages or marketing emails, these conditions could be even more stringent, particularly where sensitive information may be inferred.    

Real-World Example    

An organisation collects analytics data from its website to understand how users engage with different content. Later, this data is repurposed and shared with a social media platform to retarget those users with personalised ads, a form of web-based direct marketing.    

The user may never have been told this would happen. The purpose wasn’t clearly disclosed, and no express consent was obtained. And unlike direct marketing via email or SMS, there’s no easy way for the individual to opt out of this kind of behavioural targeting.    

Once the data is passed to the third-party platform, the user must navigate that platform’s own privacy controls or ad settings, often buried in separate interfaces and controlled by another entity altogether. The result? A high likelihood of breaching:    

  • APP 6, because the use diverges from the original analytics purpose
  • APP 7, because it constitutes direct marketing without a reasonable expectation, consent, or a clear, accessible opt-out mechanism

This example reflects a widespread compliance blind spot, especially in programmatic advertising and social retargeting campaigns.    

Compliance Strategy    

  • Use Case Documentation: Maintain a register of each data use tied to tracking tools, including whether the use constitutes direct marketing, and the lawful basis under APPs 6 and 7.
  • Reasonable Expectation Testing: For each marketing use, ask: would a reasonable person expect their data to be used this way? If the answer is no, consent or an impracticability assessment is required.
  • Consent and Notice Management: Implement mechanisms to obtain express consent for direct marketing where expectations aren’t clear, include opt-outs in every direct marketing message, and update privacy policies and notices to reflect actual marketing uses.
  • Cross-Department Governance: Coordinate marketing, legal, analytics, and data teams to ensure new campaigns don’t overreach existing consent or breach purpose limitations.
  • Ongoing Monitoring: Regularly review data use against original collection purposes to ensure continued alignment with APPs 6 and 7.

5. Security: Protecting Data Across Third-Party Networks    

Relevant APP: APP 11 (Security of Personal Information)    

The Problem    

Every time an organisation implements a third-party tracking pixel, it entrusts user data to external systems it doesn’t fully control. Platforms like Meta, Google, LinkedIn and adtech providers may collect, process and store personal information — often outside the organisation’s own infrastructure or direct oversight.    

This creates significant exposure under APP 11, which requires entities to take reasonable steps to protect personal information from misuse, interference, loss, and unauthorised access. Crucially, these obligations persist even when data is handled by third parties.    

If a vendor is compromised, it’s not just their breach — it may trigger Notifiable Data Breach (NDB) obligations for the implementing organisation, including regulatory reporting and user notification.    

The Shared Responsibility Model    

Third-party vendors are responsible for securing their own systems — but you remain accountable for ensuring that any personal information disclosed to them is protected throughout the entire data lifecycle.    

This means your organisation must take reasonable and proactive steps to assess, manage and mitigate security risks not just within your own IT environment, but across the entire tracking ecosystem.    

Real-World Impact    

Imagine a third-party ad platform you’ve embedded via a pixel suffers a breach — exposing behavioural data tied to identifiable users. If that data originated from your site, the OAIC may expect you to explain:    

  • What due diligence you performed before deploying the tool
  • Whether the pixel was necessary
  • What contractual and technical safeguards were in place
  • How quickly you became aware of the breach
  • Whether your users were notified in accordance with the NDB scheme

Even if the underlying security failure was outside your infrastructure, your organisation could still face regulatory consequences — especially if it failed to take reasonable steps to prevent or contain the risk.    

Compliance Strategy: Security Due Diligence Framework    

  • Vendor Security Assessments: Evaluate the privacy and cybersecurity posture of each third-party tracking provider — including breach history, encryption practices, data retention, and access controls.
  • Contractual Safeguards: Include clear security clauses in vendor contracts. These should address minimum standards, audit rights, and obligations to notify in the event of a breach or unauthorised access.
  • Incident Response Planning: Incorporate third-party breaches involving tracking data into your incident response plan. Define internal escalation protocols and identify when OAIC or user notifications are required.
  • Regular Security Reviews: Periodically reassess all third-party tracking tools for necessity and risk. Confirm platform updates, data flows, and threat exposure haven’t evolved outside your initial assessments.
  • Breach Preparedness: Ensure your breach response framework explicitly accounts for incidents involving third-party tracking tools, including: whether personal information is involved; how incidents will be detected and escalated; when and how the Notifiable Data Breaches scheme applies; and timelines and communication responsibilities for notifying users and regulators.

6. Sensitive Data: The Heightened Risk Context    

Relevant APPs: APP 3 (Collection of Sensitive Information), APP 6 (Use and Disclosure), APP 7 (Direct Marketing)    

The Problem    

Websites dealing with sensitive topics, including health, finance, politics, sexuality, ethnicity, or mental health, face elevated privacy risks when deploying tracking technologies. Standard tools like pixels, SDKs and analytics tags can inadvertently collect or disclose sensitive personal information, even if they don’t collect names or email addresses directly.    

This risk arises when a user visits or interacts with content that reveals something sensitive about them: for example, visiting a cancer treatment page, opening a debt management tool, or engaging with political donation forms. In each case, the context alone may allow third parties to infer sensitive attributes.    

High-risk scenarios include:    

  • Healthcare websites embedding remarketing pixels on condition-specific pages (e.g. cancer, reproductive health, addiction)
  • Financial services platforms tracking behaviour linked to budgeting, lending, or insurance products
  • Political organisations monitoring engagement with campaigns, petitions or donation pages
  • Mental health apps sharing usage data with ad platforms via SDKs or behavioural trackers

Under APP 3, collecting sensitive information requires express consent, and the collection must be reasonably necessary for a defined function or activity. But that's only the start. If the data is then used for profiling, advertising or disclosed to third parties, APPs 6 and 7 also apply, each with distinct, elevated compliance thresholds.    

Real-World Impact    

A mental health support website uses a standard analytics package across all pages, including sections addressing anxiety, depression, addiction, and self-harm. Without realising it, the platform is sending detailed URL and interaction data to third-party analytics providers, some of whom aggregate this information across sites and match it to logged-in user profiles.    

Later, those same users begin seeing ads for related products or services on social media. No specific consent was ever obtained and no real explanation was given that visiting a mental health support page could result in ad targeting.    

This creates serious risk under:    

  • APP 3, for collecting sensitive information without express consent
  • APP 6, for disclosing and using sensitive information outside its original purpose
  • APP 7, for using sensitive data for direct marketing without an express opt-in

The privacy risks are not just theoretical: they are actively experienced by users, and increasingly scrutinised by regulators.    

Compliance Strategy: Enhanced Protection Measures    

  • Sensitive Context Audits: Identify all webpages, landing flows, and referral links where user interaction may reveal sensitive personal information. Include embedded forms, search paths and dynamic content.
  • Pixel and Script Configuration: Use tag management tools to disable or restrict tracking on sensitive pages. Suppress transmission of sensitive URLs, user interactions, or identifiers wherever possible.
  • Explicit Consent Mechanisms: Where tracking is necessary in sensitive contexts, implement clear, layered consent flows that offer users an informed, specific and voluntary choice, both for collection and marketing use.
  • Data Minimisation and Anonymisation: Ensure third-party tools are configured to limit collection of sensitive signals. Where possible, strip out identifiers and anonymise behavioural data before it leaves your environment.
  • Purpose and Disclosure Controls: Sensitive information must only be used or disclosed in line with its original purpose — or where explicit consent has been obtained. If used for direct marketing, express opt-in is required under APP 7.
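One practical way to implement pixel and script suppression is to gate tag firing on a deny-list of sensitive path prefixes maintained from your sensitive-context audit. A minimal sketch: the paths and function are hypothetical, and a production deployment would typically express the same rule through a tag manager’s triggers or consent configuration rather than bespoke code.

```python
from urllib.parse import urlparse

# Hypothetical deny-list of path prefixes where user interaction may reveal
# sensitive information; keep this in sync with your sensitive-context audit.
SENSITIVE_PATH_PREFIXES = (
    "/conditions/",   # health condition pages
    "/debt-help/",    # financial hardship tools
    "/donate/",       # political donation flows
)

def tracking_allowed(url: str) -> bool:
    """Return False if third-party pixels should be suppressed on this page."""
    path = urlparse(url).path.lower()
    return not any(path.startswith(prefix) for prefix in SENSITIVE_PATH_PREFIXES)
```

Centralising the decision in one function (or one tag-manager rule) means a newly added sensitive section only needs a deny-list entry, not a change to every embedded tag.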

7. Cross-Border Data Flows: The Global Tracking Challenge    

Relevant APP: APP 8 (Cross-border Disclosure of Personal Information)    

The Problem    

Most major tracking platforms, including Google Analytics, Meta Pixel, and the LinkedIn Insight Tag, operate on global infrastructure that routinely transfers Australian users’ personal information to servers located overseas. These transfers often occur automatically, triggered by a simple page load or user interaction, and usually without the user’s awareness.    

When organisations implement these tracking tools, they may be disclosing personal information to entities in jurisdictions such as the United States, Ireland, or Singapore. However, many businesses lack visibility into where the data goes, who receives it, and what protections apply.    

Under APP 8, organisations that disclose personal information to overseas recipients must take reasonable steps to ensure that the recipient does not breach the APPs, unless a legal exception applies. Failure to meet these requirements, or even to identify that a disclosure has occurred, can create significant regulatory, legal and reputational risk.    

Real-World Impact    

A not-for-profit organisation runs a national awareness campaign on its website and embeds several tracking tools to measure engagement and support future outreach. These include Google Analytics, Meta Pixel, and an email marketing tool, all of which transmit data to servers overseas.    

The site attracts thousands of visitors, many of whom interact with content relating to sensitive topics such as health, legal rights or financial stress. Yet the privacy policy includes no mention of international data transfers, no specification of recipient countries, and no explanation of what safeguards are in place.    

If user data is transmitted overseas, especially sensitive information or behavioural data, and the organisation has not taken reasonable steps to ensure the recipient upholds the APPs, this may constitute a breach of APP 8. Worse, if the recipient subsequently misuses the data, the Australian organisation may still be held responsible.    

Compliance Strategy: Cross-Border Controls for Tracking Technologies    

  • Transfer Mapping: Identify and document all cross-border data flows triggered by tracking tools, including advertising platforms, social media pixels, and analytics services.
  • Adequacy Assessments: Assess whether the privacy laws of destination countries provide comparable protection to the APPs, especially once jurisdictions begin to be prescribed under the new cross-border adequacy mechanism. If not, additional contractual or technical safeguards must be implemented.
  • Contractual Protections: Where possible, include standard contractual clauses or bespoke privacy provisions in agreements with third-party vendors, requiring the recipient to handle personal information in line with Australian standards.
  • User Notification and Transparency: Update privacy notices to clearly disclose that personal information may be transferred overseas, the countries or regions involved, and the safeguards in place. Provide this information at or before collection, ideally alongside consent requests if data will be used for marketing or profiling purposes.
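The transfer-mapping step lends itself to a simple machine-readable register so gaps can be surfaced automatically. A minimal sketch, with hypothetical tool names and fields; what counts as an adequate APP 8 safeguard is a legal judgment that this code does not make.

```python
# Hypothetical transfer register: one entry per tracking tool that sends
# personal information offshore, recording destinations and the APP 8
# safeguard relied on (None = no documented safeguard).
TRANSFER_REGISTER = [
    {"tool": "analytics",      "destinations": ["US"],       "safeguard": "contractual clauses"},
    {"tool": "social pixel",   "destinations": ["US", "IE"], "safeguard": None},
    {"tool": "email platform", "destinations": ["SG"],       "safeguard": "contractual clauses"},
]

def unprotected_transfers(register):
    """List tools with no documented APP 8 safeguard, for priority follow-up."""
    return [entry["tool"] for entry in register if entry["safeguard"] is None]
```

Running the check over the register above would flag only the social pixel, giving a concrete remediation queue for the contractual-protections step.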

8. Surveillance and Profiling: Balancing Insights with Privacy    

Relevant APPs: APP 1 (Open and Transparent Management), APP 6 (Use and Disclosure), APP 7 (Direct Marketing)    

The Problem    

Tracking technologies rarely operate in isolation. Over time, the data they collect (which pages users visit, how long they stay, what they click on, and what content they engage with) builds into powerful behavioural profiles. These profiles are often enriched by third-party data sources or social media analytics, enabling hyper-targeted marketing and even automated decision-making.    

While this profiling is marketed as personalisation, it can cross the line into digital surveillance, especially when users aren’t made aware of how far their data travels or how it’s interpreted. It raises serious questions about proportionality, autonomy and transparency, and creates potential compliance breaches under APP 6 (unauthorised use or disclosure) and APP 7 (direct marketing without consent or reasonable expectation).    

Real-World Impact    

An online retailer uses pixels and cookies to monitor user behaviour across its site. When a visitor clicks on maternity-related products, that behaviour is used to build a profile suggesting the individual is pregnant or planning to be. That information is then used to serve targeted baby product ads across social media, even though the user never gave explicit consent or was told this would happen.    

If that user is accessing the site from a shared device or hasn’t disclosed the pregnancy to others, the targeting could result in personal distress, reputational harm, or even discrimination. Worse, the data may be shared with other advertisers without the user ever knowing.    

This kind of profiling, particularly when based on inferred sensitive information or used for behaviourally targeted advertising, is likely to breach APPs 6 and 7, especially if no clear notification or consent was provided.    

Compliance Strategy: Proportional and Ethical Profiling    

  • Purpose Limitation: Define and document the specific purpose for any behavioural tracking or profiling activity. Avoid vague or catch-all use cases that expand over time without oversight.
  • Behavioural Impact Assessments: Evaluate whether your profiling practices might influence user decisions, restrict access, or manipulate behaviour in ways that users wouldn’t reasonably expect.
  • Profile Limitation Controls: Avoid profiling that becomes overly persistent, predictive, or sensitive in nature — particularly where it could enable discriminatory targeting or create reputational harm.
  • User Control Mechanisms: Provide users with practical, easy-to-access ways to limit tracking — including opt-outs, toggles, browser-level settings, and reduced-data modes.

Building Trust Through Transparency    

One emerging best practice is the use of privacy dashboards or personal data portals, allowing users to see what data has been collected and how it’s being used. Far from weakening your marketing, this level of transparency can actually strengthen customer trust, reinforcing your commitment to responsible data use.    

Developing a Comprehensive Compliance Framework    

As digital tracking technologies become more sophisticated — and more scrutinised — Australian organisations must develop a proactive and accountable approach to privacy compliance.    

That starts with building a comprehensive framework for assessing, managing, and documenting risk.    

A foundational step is to conduct a Privacy Impact Assessment (PIA) of current tracking practices, with a focus on practical and legal risk exposure. The assessment should:    

  1. Catalogue Current Technologies: Create an inventory of all tracking pixels, cookies, SDKs, analytics tags, and third-party integrations deployed across your websites, apps, and platforms.
  2. Map Data Flows: Document exactly what data is collected, where it is sent, how it is processed, and who has access to it, including cross-border transfers.
  3. Assess Legal Basis: Evaluate whether each collection and disclosure activity has an appropriate legal basis under the Privacy Act, particularly APPs 3, 6, 7 and 8.
  4. Identify High-Risk Areas: Focus on use of tracking in sensitive contexts, overcollection, use without consent, or disclosure to poorly governed third parties.
  5. Prioritise Remediation: Address the most critical compliance gaps first, and develop a roadmap for sustainable long-term improvements to governance, contracts, notices, and vendor management.
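Steps 1 and 4 above can be combined in a lightweight inventory that flags high-risk entries for remediation. A minimal sketch, assuming hypothetical tool names and risk criteria distilled from the PIA steps; a real assessment would apply far richer legal criteria than these two boolean checks.

```python
from dataclasses import dataclass

@dataclass
class TrackingTool:
    name: str
    vendor: str
    sensitive_context: bool = False   # deployed on pages that may reveal sensitive info
    consent_obtained: bool = False    # express consent covers this tool's use
    crosses_border: bool = False      # sends data to overseas recipients

def high_risk(tools):
    """Flag inventory entries meeting the step-4 high-risk criteria."""
    flagged = []
    for tool in tools:
        if tool.sensitive_context and not tool.consent_obtained:
            flagged.append((tool.name, "sensitive context without express consent"))
        if tool.crosses_border:
            flagged.append((tool.name, "cross-border disclosure - check APP 8 safeguards"))
    return flagged
```

The output of `high_risk` feeds directly into step 5: each flagged entry becomes a remediation item, ordered by severity.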

The tracking and digital advertising ecosystem is evolving quickly, shaped by regulatory pressure, browser changes, and rising user expectations. Businesses that address compliance early and proactively will be far better positioned to adapt and thrive. Some emerging trends include:    

  • Privacy-Preserving Analytics: Tools that leverage differential privacy, on-device processing, or federated learning can offer useful insights while minimising personal data collection and improving compliance outcomes.
  • First-Party Data Strategies: With the decline of third-party cookies, businesses are shifting towards consensual, direct user relationships, offering value in exchange for permission to collect first-party data.
  • Contextual Advertising: As behavioural tracking is reined in, contextual advertising, which targets based on content rather than user profiles, is enjoying renewed relevance with fewer privacy risks.

In 2025, we may also see the return of long-awaited Privacy Act reforms, including the introduction of a “fair and reasonable” test for data handling, an expanded definition of personal information, and stronger individual rights. If passed, these changes will materially raise the compliance bar for organisations using tracking technologies. Those who act early to build robust frameworks grounded in transparency, data minimisation and governance will be best placed to adapt to a stricter, more modern privacy regime.    

James Patto
Founder & Principal