In an ever-evolving AI landscape, the NSW Government is poised to pass Australia's first WHS duty for digital work systems, which will directly impact any employer who uses AI, digital platforms, or algorithms.
The Work Health and Safety Amendment (Digital Work Systems) Bill 2025 (NSW) proposes to amend the existing Work Health and Safety Act 2011 (NSW) to require employers to assess and manage WHS risks arising from digital systems. This includes any AI program, automated workforce management tool, automation or online platform. Although the Bill has not yet been passed, it may signal future national reforms in WHS for digital workplaces.
What is a "digital work system"?
The term "digital work system" under the Bill may include algorithms, artificial intelligence, automation or online platforms used to allocate, monitor, schedule or supervise work. The following workplace tools may fall under these definitions:
- Automated rostering and scheduling software;
- Delivery and task allocation apps;
- Call queueing systems and workflow tools;
- KPI dashboards and productivity scoring systems;
- Biometric sign-in and GPS tracking for mobile workers; and
- Platforms coordinating gig, labour hire or franchise workforces.
In short, if a system influences who works, when they work, what they do or how their performance is measured, it is likely to fall within the definition of a digital work system.
What must be considered by employers under the new WHS duty?
Under the proposed section 21A, a person conducting a business or undertaking (PCBU) will be required to ensure, so far as is reasonably practicable, that workers are not exposed to WHS risks while engaging with digital work systems. In doing so, regard must be had to excessive or unreasonable:
- Workloads;
- Performance metrics;
- Monitoring or surveillance; and
- Discriminatory practices or decision-making.
Workloads
Digital tools are most often designed and used to optimise employees' productivity. Under the Bill, employers will be required to assess whether this optimisation pushes workloads to become excessive or unmanageable. This may include:
- Rostering programs that routinely schedule consecutive shifts with insufficient breaks;
- Task or route-planning software that suggests unrealistic travel or completion times;
- Calendars or scheduling programs that automatically accept appointments without reviewing fatigue risks; and
- Systems that automatically assign new tasks before workers can safely complete their current work.
Performance metrics
Due to the instantaneous nature of digital work systems, targets, KPIs and performance can be tracked in real time, allowing employees to see whether they are staying on target. This may include calls per hour, pick-ups per day, boxes ticked, and the like.
The Bill would require employers to analyse and determine whether these metrics are unreasonable or excessive. As such, performance measures may now be considered a form of psychological hazard in the workplace, as they can create high-stress environments, reduce workplace autonomy, and instil fear of punishment for not reaching set targets.
Monitoring and surveillance
Modern technology and digital work systems present employers with a far greater ability to monitor employees than traditional surveillance cameras. Among other things, employers may now choose to:
- Monitor keystroke patterns and work screens of remote workers;
- Track the location of delivery workers via GPS;
- Monitor eye movement during webcam conferences; or
- Track website and application use on company computers or Wi-Fi.
Organisations may have policies permitting the aforementioned surveillance, but it is essential that employers assess whether the intensity, intrusiveness and duration could amount to a WHS risk.
Discriminatory practices and algorithmic bias
The Bill also directs employers to consider whether digital work systems create or contribute to discriminatory decision-making. In plain terms, this recognises that automated software may carry biases beyond the most obvious. For example:
- Ranking or performance-based tools that indirectly penalise older workers or those with disabilities;
- Allocation algorithms that favour full-time workers over part-time or casual workers for high demand shift times; and
- Systems that deprioritise workers with certain availability patterns, which may include carers or students.
Where these rules are built into software, there is a risk that discrimination becomes systemic and harder for individual workers to detect or challenge. It is essential that employers are aware of how such rules are embedded in their software, and are capable of reviewing any WHS risks associated with them.
Although the proposed changes apply to NSW businesses only, it is our view that such changes may soon form part of the national workplace landscape. To prepare for these possible changes, please contact us for assistance with a review of your digital work systems from a legal compliance perspective.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.