ARTICLE
11 December 2025

Employee Privacy In 2026: How To Manage Monitoring, AI Tools, And Legal Risks

Wilson Elser Moskowitz Edelman & Dicker LLP

More than 800 attorneys strong, Wilson Elser serves clients of all sizes across multiple industries. It maintains 38 domestic offices, another in London and enjoys more extensive international reach as a founding member of Legalign Global. The firm is currently ranked 56th in the National Law Journal's NLJ 500.

Highlights:

  • As workplace monitoring expands, employers face rising legal exposure across biometrics, AI tools, and digital tracking systems.
  • A fast-growing patchwork of state and federal laws—CCPA, BIPA, HIPAA, AEDT laws—now dictates how employee data can be collected, used, and stored.
  • To avoid regulatory and reputational risks, employers must adopt transparent policies, minimize data collection, and implement strong governance controls.

In today's digital landscape, employers are increasingly relying on advanced technologies to optimize workflows, drive performance, and monitor productivity. A 2025 study by the Massachusetts Institute of Technology (MIT) found that 80 percent of companies are monitoring remote or hybrid workers 1.

Employee-monitoring software can track the websites employees browse, capture documents they create, log keystrokes and mouse movements, and even map employees' physical locations. Additionally, workforce management systems have implemented "smart scheduling" software that uses AI and automated tools to create rosters, optimize shifts, and build compliant schedules.

Digital Tracking and Privacy Concerns

Digital tracking in the workplace raises concerns about employee privacy and the lawful, responsible implementation of these tools. Although there is no singular or hallmark law governing workplace privacy, a patchwork of federal and state laws extends certain privacy protections to employees whose employers use digital technologies to monitor them.

For example, video footage of employees can capture individual employees' gait patterns, which are considered "biometric information" under the California Consumer Privacy Act (CCPA) and subject to specific protections. Keystroke patterns or rhythms are also considered biometric information under the CCPA 2. Companies that collect this information about their employees face legal exposure if they do not comply with requirements such as notice, disclosure, and consent, where applicable.

Specific AI-related privacy laws have been passed or proposed in the employment context. These include New York City Local Law 144 and Illinois H.B. 3773, which regulate the use of automated employment decision tools (AEDTs) in the hiring process.

On September 29, 2025, California's governor signed a law (S.B. 53) that prohibits an AI developer from preventing an employee from disclosing certain information, such as critical risk, to authorities regarding the developer's development and deployment of AI, or retaliating against the employee for doing so 3. More laws addressing AI making consequential decisions in employment are being proposed, such as bills currently pending in Colorado 4 and Massachusetts 5.

Moreover, lawmakers are increasingly protective of personal information that can be used to identify individuals, including in New York City, where a bill has been proposed to legally designate gait patterns as "personal identifying information" subject to special protection 6. Under the Illinois Biometric Information Privacy Act (BIPA), written consent is necessary prior to the capture, storage, or use of biometrics, and written notice informing individuals of the use of their biometric information is required.

Under the Health Insurance Portability and Accountability Act (HIPAA), employers are prohibited from using an employee's protected health information (PHI) to make employment decisions, such as hiring, firing, or promotions, without the employee's explicit written authorization. This includes decisions in which AI is involved, so businesses must pay attention when using AI tools, which may be considered third-party vendors, to handle PHI.

The use of digital monitoring tools in the workplace potentially exposes companies to a range of legal risks. Failure to comply with applicable privacy laws, such as the CCPA, BIPA, HIPAA, or other state-specific regulations, can result in significant regulatory penalties, including fines and mandatory corrective actions. In addition, improper handling or unauthorized disclosure of employee biometric or personal information may lead to civil litigation, including class action lawsuits brought by affected employees.

Companies can also face claims of invasion of privacy, discrimination, or wrongful termination if monitoring data is used inappropriately or without adequate transparency and consent.

Furthermore, regulatory bodies may initiate investigations or enforcement actions if they determine that an employer's monitoring practices are overly intrusive or lack sufficient safeguards. To mitigate these risks, employers must ensure that their monitoring policies are clearly communicated, that they obtain all necessary consents, and that they implement robust data security measures to protect sensitive employee information.

Practical Recommendations

As advanced technologies become ubiquitous in the workforce and legislation evolves to address them, there are practical steps that employers can take to responsibly leverage digital tools while respecting legal, ethical, and regulatory guardrails. First, employers should understand the key privacy principles of purpose limitation and data minimization: collecting only the personal data that is necessary for a specific, legitimate purpose and using it only for that purpose.
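To make purpose limitation and data minimization concrete, the sketch below shows one way a monitoring pipeline might enforce them in code: each record is stripped down to only the fields needed for a declared purpose before storage, and tagged with that purpose so later uses can be checked against it. The field names, purposes, and sample record are hypothetical illustrations, not a reference to any particular product or statute.

```python
# Hypothetical sketch of data minimization in a monitoring pipeline:
# collect only the fields necessary for a specific, declared purpose,
# and tag each stored record with that purpose.

ALLOWED_FIELDS = {
    # Hypothetical purposes and the minimum fields each one needs
    "timekeeping": {"employee_id", "clock_in", "clock_out"},
    "security_audit": {"employee_id", "login_time", "ip_address"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of `record` containing only fields allowed for `purpose`."""
    allowed = ALLOWED_FIELDS.get(purpose)
    if allowed is None:
        raise ValueError(f"No declared purpose: {purpose}")
    kept = {k: v for k, v in record.items() if k in allowed}
    kept["_purpose"] = purpose  # record why the data was collected
    return kept

raw = {
    "employee_id": "E123",
    "clock_in": "09:02",
    "clock_out": "17:31",
    "keystrokes_per_min": 74,        # not needed for timekeeping -> dropped
    "gps_location": "40.7,-74.0",    # not needed for timekeeping -> dropped
}

print(minimize(raw, "timekeeping"))
```

The design choice here, an explicit allowlist per purpose rather than a blocklist, means that any newly added telemetry field is excluded by default until someone deliberately ties it to a legitimate purpose.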

Employers should develop and circulate comprehensive use policies across all levels and roles, such as AI use policies that provide guidelines and best practices for how certain AI tools may or may not be used in the workplace.

Additionally, full transparency about the use of digital tools, whether in an employee handbook or other organization policy document, helps create trust and maintain compliance with legal requirements governing notice and disclosure. For example, employees should be informed when AI tools, especially automated ones, are used to make decisions about them. Employers should obtain consent where required.

Further, employers should implement robust vendor management and audit systems, ensuring that AI training datasets are unbiased and auditable and that algorithmic errors are mitigated. It is best practice to conduct privacy and algorithmic impact assessments before launching new systems or technologies, and to ensure appropriate security mechanisms are in place to keep data safe. Finally, there should always be human-in-the-loop controls for oversight, validation, and review of technological tools.

With all the above, it is essential to stay up to date on existing and pending legislation related to employee monitoring and surveillance, the protection of personal data, privacy rights, and AI governance and regulation. Employers must ensure the use of digital tools in the workplace is consistent with the law and update policies and practices accordingly.

The laws and regulations regarding digital privacy are constantly evolving. Technology's role in the workplace will continue to grow, and employers must be aware of privacy considerations for their employees.

Footnotes

1 https://www.technologyreview.com/2025/02/24/1111664/worker-monitoring-employee-surveillance/; https://www.computerworld.com/article/3836836/electronic-employee-monitoring-reaches-an-all-time-high.html

2 California Consumer Privacy Act, Section 1798.140.

3 https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260SB53

4 https://leg.colorado.gov/bills/hb25b-1009

5 https://malegislature.gov/Bills/194/S2630

6 https://www.cbsnews.com/newyork/news/new-york-city-gait-recognition-technology-nyc-council-jennifer-gutierrez/

Originally published by HR.com

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
