Learn how the Privacy and Other Legislation Amendment Act 2024 introduces stricter data protection laws, increased penalties, and new AI compliance requirements.
The Australian Privacy Act has undergone significant amendments, which came into effect in late 2024. These changes, combined with the introduction of the Cyber Security Act 2024, impose stricter compliance obligations on businesses handling personal data.
Increased regulatory enforcement, heightened cybersecurity obligations, and new AI-specific compliance requirements create new complexities that all businesses must address to avoid financial penalties, legal liability, and reputational damage.
Understanding the Key Changes in the Privacy Act Amendments
Protection of Personal Information
The Amendment Act clarifies that ‘reasonable steps’ to protect information include implementing ‘technical and organisational measures’. This is effective from 11 December 2024.
Regulatory Powers and Penalties
The OAIC has new powers to issue infringement and compliance notices. Non-compliance with a compliance notice may result in civil penalties. This is effective from 11 December 2024.
Statutory Tort for Serious Invasions of Privacy
Individuals, including employees, can take legal action against organisations or individuals for serious invasions of privacy. This will be effective on or before 10 June 2025.
Automated Decision-Making (AI)
Transparency obligations require organisations to update their privacy policies to disclose when decisions are made using automated processes. This is effective from 10 December 2026.
Other Changes:
- A Children’s Online Privacy Code is to be developed and registered by 10 December 2026.
- A ‘whitelist’ mechanism for countries with comparable privacy protections, simplifying overseas transfers of personal data.
Tranche 2
Many ‘agreed in principle’ proposals were not included in the original amendment and are expected to be addressed in a second tranche of legislation. These include removal of the small business exemption (for businesses with annual turnover under $3 million), removal of the employee records exemption, and reforms to data retention and direct marketing.
How the Privacy Act Amendments Affect Cybersecurity
The amendments clarify that ‘reasonable steps’ to protect personal data include both ‘technical’ and ‘organisational’ measures. Technical measures cover physical, hardware and software controls; organisational measures cover policies, procedures, training and response plans.
Cybersecurity is now a legal obligation rather than best practice. Under the new laws, organisations that experience a data breach may face severe financial and legal consequences if their technical and/or operational defences are deemed not to be ‘reasonable’.
To strengthen cybersecurity compliance for personal data, organisations should consider:
- The extent of personal data held and its level of sensitivity, to assess the likely consequences of a breach.
- How effective existing security policies and procedures are at protecting personal data.
- Physical and cybersecurity measures to protect the organisation from external attack and potential litigation for breach of privacy.
- Response measures to limit access to personal data and recover from a potential breach.
AI-Specific Compliance Requirements
The Privacy Act amendments require businesses to be more transparent and accountable in how they process personal data using AI systems.
Although the new automated decision-making amendments are not due to come into effect until December 2026, organisations should begin to factor in the requirements for existing and new AI models.
A system can be considered to use automated decision making if:
- It does something substantially and directly related to making a decision about an individual,
- The decision significantly affects the individual’s rights or interests, and
- Personal information is used to make the decision.
Organisations will need to provide more transparency via their privacy policies when automated systems are used to make decisions about individuals, including:
- The type of personal information used
- Which decisions are made solely by automated programs
- Which decisions the automated process substantially and directly contributes to.
Organisations using AI for decision-making about individuals should consider:
- Establishing AI governance policies defining data handling and decision-making.
- Keeping detailed records of AI-driven decisions for accountability.
- Conducting regular AI audits to prevent bias and unintended consequences.
Failure to comply with these AI regulations could result in privacy lawsuits, regulatory fines, and reputational damage.
A New Privacy Landscape
Taken together, the Privacy Act amendments, the Cyber Security Act 2024 and the further legislation expected in the near future demonstrate that protecting personal data is no longer business-as-usual. It requires a re-examination of current practices today, continual alignment with reasonable technical and organisational conduct, and greater transparency as AI models are increasingly leveraged.
Disclaimer: Virtuelle Group are experts in Cybersecurity and AI, but we are not legal specialists. While extensive research has been undertaken to ensure the accuracy of the above, it is intended as a high-level summary. You should not rely on it as legal advice; conduct your own due diligence.