
This article was written by Dr Nicola Bowes

Dr Nicola Bowes holds a Bachelor of Arts with first class honours from the University of Tasmania, a Bachelor of Laws with first class honours from the Queensland University of Technology, and a PhD from The University of Queensland. After a decade working in higher education, Nicola joined Armstrong Legal in 2020.

Machine Learning Algorithms


Nearly all online businesses collect data from customers and website visitors, and many use a form of artificial intelligence (AI) to analyse that data. Recent developments in AI have focused on using machine learning algorithms (MLAs) to model consumer behaviour and make accurate predictions. While businesses may welcome these technological advances, there are legal considerations when deploying specific AI integrations. This article explains the legal implications for businesses using MLAs in Australia.

What Are Machine Learning Algorithms?

MLAs are a form of AI that turn large data sets into models and predictions, improving as they learn from experience and examples. In this process, a neural network attempts to mimic the way humans make decisions. For example, programmers might train a neural network to identify dogs by feeding it hundreds of labelled animal photographs. Unlike rule-based approaches that check for specific canine features, the network learns to recognise a dog from the examples themselves, with something closer to human instinct than a catalogue of features.
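The idea of learning from labelled examples, rather than from hand-written rules, can be sketched in a few lines of Python. The toy single-neuron "perceptron" below is far simpler than the neural networks described above, and the training data and labels are invented purely for illustration:

```python
# Illustrative sketch only: a single-neuron perceptron adjusts its
# weights from labelled examples instead of following hand-coded rules.
# Real MLAs use far larger networks and data sets.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs with label 0 or 1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = label - pred          # zero when the guess is correct
            w1 += lr * err * x1         # nudge weights toward the label
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

def predict(weights, point):
    w1, w2, b = weights
    x1, x2 = point
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0

# Hypothetical examples: two measurements per animal, labelled 0 or 1.
examples = [((0.2, 0.1), 0), ((0.3, 0.2), 0),
            ((0.8, 0.9), 1), ((0.9, 0.8), 1)]
weights = train_perceptron(examples)
```

After training, the learned weights classify new points it has never seen, which is the essence of the prediction-from-data behaviour described above.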

Businesses frequently implement MLAs to:

  • Verify customer identity;
  • Determine customer preference in online stores; and
  • Determine a customer’s potential maximum credit limits.

A business that collects personal information for an MLA needs to understand its legal obligations. There are Australian laws that regulate machine learning algorithms and the collection of personal information. If the algorithm uses personal information to make automated decisions, the business must abide by Australian privacy laws.

The federal Privacy Act 1988 introduced thirteen Australian Privacy Principles (APPs) to regulate data collection, use and disclosure. Under the Privacy Act, personal information is information or an opinion about an identifiable person, whether it is true or not and whether or not it is recorded in material form. A business might hold personal information about a customer’s age, location, contact details and preferences. Personal information does not include de-identified information. For instance, a business can collect location information about website visitors as long as it is in aggregate form and not tied to identifiable individuals.
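The aggregate-collection idea can be illustrated with a short sketch. The raw visit records below are invented for the example; the point is that only the counts are kept, and identifiers such as IP addresses are discarded:

```python
from collections import Counter

# Hypothetical raw request data a website might see. Illustrative only.
visits = [
    {"ip": "203.0.113.5", "city": "Brisbane"},
    {"ip": "198.51.100.7", "city": "Hobart"},
    {"ip": "203.0.113.9", "city": "Brisbane"},
]

# Keep only the aggregate location counts, discarding the identifiers,
# so the stored data is not tied to any identifiable individual.
city_counts = Counter(v["city"] for v in visits)
```

Whether a given aggregation is sufficiently de-identified in practice is a legal question that depends on the risk of re-identification, not just on the code.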

These APPs apply only to “APP entities”, a category that includes any business that:

  • Has an annual turnover of more than $3 million;
  • Is a health service provider;
  • Trades in personal information; or
  • Has a contract with the Australian Commonwealth government.

A business that is an APP entity must be open and transparent in managing personal information. It should collect only personal information that is reasonably necessary for its business activities, and it must notify customers about its data collection and use. The business can fulfil this requirement by prominently displaying an up-to-date privacy policy. APP entities that use automated decision-making processes (such as MLAs) should also conduct privacy impact assessments for high privacy risk projects.

An Australian online business may also need to abide by international laws. A business that intends to trade in Europe or the United Kingdom must comply with laws addressing automated decision-making. For example, the European Union’s General Data Protection Regulation (GDPR) and the United Kingdom’s UK GDPR regulate businesses that use automated processing and profiling. Under these regimes, individuals have the right to contest an MLA’s decision or to obtain a human review of it.

Risks and Independent Standards

Businesses should be aware that there can be negative consequences when an MLA does not function as intended because of inaccurate or incomplete training data or programming. Businesses must be vigilant that an MLA does not breach Commonwealth anti-discrimination legislation, such as the Racial Discrimination Act 1975, the Sex Discrimination Act 1984, the Age Discrimination Act 2004, or relevant provisions of the Fair Work Act 2009.

A salutary example is Amazon’s use of an MLA in its recruitment activities. In 2018, the multinational stopped using its AI recruitment tool after discovering that it had taught itself to discriminate against female job applicants: the tool was not rating candidates for technical posts in a gender-neutral way. The MLA had been trained on recruitment patterns from the previous decade, when male employees dominated the tech industry.

The use of MLAs by government organisations is particularly problematic. Governments are expected to be transparent and accountable, and to afford procedural fairness, obligations an AI can overlook in the pursuit of efficiency. A notable example of government use of MLAs is Centrelink’s much-vilified “Robodebt” system, under which one in five recipients wrongly received debt collection notices. Incidents like this have led to calls for greater legal oversight and regulation of MLAs. In the meantime, there are several independent standards that businesses can choose to meet when using machine learning algorithms.

If your business uses MLAs to analyse customers’ personal information, you need to check your legal obligations regarding data collection. Phone 1300 038 223 for legal advice on developing a privacy policy or collection notice, or contact Armstrong Legal if you have questions about the implications of the Privacy Act for your business’s use of machine learning algorithms.
