ARTICLE
5 August 2025

AI & Employee Surveillance: A New World For The Workplace (Podcast)

WilmerHale

Contributor

WilmerHale provides legal representation across a comprehensive range of practice areas critical to the success of its clients. With a staunch commitment to public service, the firm is a leader in pro bono representation. WilmerHale is 1,000 lawyers strong with 12 offices in the United States, Europe and Asia.

The workplace has undergone numerous changes since COVID-19 enforced remote work requirements, including how employee surveillance tools have grown to monitor more and more employee activity. The rise of this expanded technology, with advancements aided by new developments in AI, has led to more questions around its practical uses and how employees perceive it.

Partner and Chair of the Labor and Employment Practice Laura Schneider joins this episode of In the Public Interest to discuss the tools and methods currently being used for employee surveillance. She shares her insights into how the workplace surveillance landscape has changed over the last few years, the role AI now plays and federal and state litigation surrounding this topic that could arise in the future.

Transcript

Felicia Ellsworth: Welcome to In the Public Interest, a podcast from WilmerHale. My name is Felicia Ellsworth and I'm a partner at WilmerHale, an international law firm that works at the intersection of government, technology and business. In these days of artificial intelligence, the tools to monitor and surveil have grown robust. Employers leaned on these new surveillance tools during the pandemic. Today, those methods have only expanded. How are employers utilizing AI to monitor their businesses and surveil employees? How is this being received, and how are states and the federal government moving to protect those who may be negatively impacted? We'll find out from our colleague, Laura Schneider, a partner in WilmerHale's Boston office, in a conversation about AI, the workplace and the new world of surveillance.

Felicia Ellsworth: Laura, thanks so much for joining us today.

Laura Schneider: Thanks. It's good to be here.

Felicia Ellsworth: Could you tell us a little bit about your practice? What does it entail? What do you do?

Laura Schneider: I'm the chair of the Labor and Employment practice, and my practice involves advising employers, companies, nonprofits and higher educational institutions on all things to do with their employees. That can involve anti-discrimination law, harassment claims, non-compete litigation, and all manner of compliance issues, including privacy. Generally, it's helping employers stay compliant with the law and deal with all of the issues that come up around their employees and managing them.

Felicia Ellsworth: As you were charting your course, when did you decide you wanted to become a lawyer? And then when did you decide you wanted to be an employment lawyer, in particular?

Laura Schneider: I decided that I wanted to become a lawyer when my husband decided that he was going to go and get his PhD in anthropology and it seemed like a good idea that one of us did something that would get us out of school in fewer than 10 years' time. And I decided that I wanted to be an employment lawyer when I was specializing in securities litigation, which was kind of interesting. But I got staffed on a gender discrimination case and the human aspect of the stories around employment law were so much more interesting than securities fraud. In the wake of that case, I was invited to join the labor and employment practice, which was more fledgling at that time, and I decided to make the switch and I have not regretted it. The stories are always very interesting.

Felicia Ellsworth: That's a great origin story. When you think about workplace surveillance, what does that phrase mean to you?

Laura Schneider: It means any number of things. Really, any methods that an employer would use to monitor its employees to make sure that they are performing their assigned job duties, adhering to company policies, doing the work in a secure, safe manner. It can be as simple as a supervisor doing regular one-on-one check-ins or stop-bys with their direct reports. It could be visual or audio recordings, all the way to using advanced software and AI tools on company computers to track Internet use by employees, log-in times, log-out times, keystrokes and mouse activity to identify idle time. It's really any and all of those things.

We saw an increase in employee monitoring when the shift to remote work happened back in the pandemic times, when offices shut down and employers no longer had as much visibility day-to-day into how their employees spent their time. So, they looked for ways to get that visibility. And they found them, and even now with the return to the office, those monitoring tools probably remain in greater use. And I will say that monitoring of employees is probably even more prevalent in non-office work environments, like in the commercial driving industry, delivery services, long-haul trucking, that sort of thing, where things like GPS tracking are used, dashboard cameras, vehicle sensors of all kinds to monitor where you go and what you're doing. And those things can monitor drivers in real time to detect signs of fatigue or distraction. So, there's actually a safety aspect often to some of that, so surveillance really runs the gamut of anything that an employer does to watch its employees.

Felicia Ellsworth: There's something about workplace surveillance that sets up a tension between employees and their employers. What is it about surveillance that leads to this tension?

Laura Schneider: Well, I think it's just human nature, right? Nobody likes to be watched, and employees are often suspicious of why their employers might want to watch them and perceive it as kind of a "gotcha" thing. They're looking at me to find a mistake or find something wrong so that they can discipline me. And to be fair, some employers may be acting in that way, and that certainly leads to the tension. I think most employers are looking to monitor to make their businesses work more efficiently and sometimes for the sake of safety and security, both in terms of information and in terms of personal physical safety, of their employees and, more broadly, their businesses. And I think there are benefits for employees as well. For instance, the newest AI software can review employee communications, which employees may not like, but it can go through e-mail and Slack and other messaging platforms to identify if there's bullying or harassment or racist language in the conversations happening. And then you can use that information as an employer to stop that stuff from occurring and hold people accountable.

You can also use these things for security to catch or prevent data loss as it's happening, to protect everybody from phishing attempts, but it can also be used as a coaching tool. There are programs for salespeople, for example, where the AI can review the employees' phone calls and then offer suggestions to do better with customers, sell more and get more commissions. The tension, I think, is heightened these days because of the advances in technology and AI, right? Employers have always had the ability to set up cameras in shipyards and that sort of thing. But with the new tools, you can instantaneously get and analyze thousands, millions of data points, and that gives insight into trends in real time and flags risks, and that's got benefits and real downsides. So, there is a tension, and as with anything in the employment law space, employee relations is really important, whether it's employee relations and trust around use of these kinds of tools or anything else really in the workplace.

Felicia Ellsworth: Let's talk about that trust element that you just mentioned as well as transparency. What can employers do in terms of being transparent about technology enabled workplace surveillance? How are they building that trust and how much transparency is the right level?

Laura Schneider: I think you're right. There's a balance, and as always, we lawyers give the "it depends" answer. It depends on the type of surveillance and the purpose of the surveillance. For example, if you are trying to improve employee performance or efficiency or safety or some other goal that I think everybody would agree is a legitimate business goal, it may make sense to notify employees of the monitoring or the intended purpose of it. Some types of monitoring will require consent, particularly under certain state laws. In New York, employers that engage in any kind of electronic monitoring of employees, which would include telephone, e-mail and chat monitoring, have to provide notice to all new hires of that monitoring, and the new hire has to sign an acknowledgement that says that they understand that that's happening.

There are federal and state wiretap laws that restrict certain types of surveillance practices, generally requiring consent from at least one party, sometimes two. These laws are also framing some of the transparency, and at this point, there's not really a line between what I'll call more traditional surveillance methods or any electronic surveillance methods and AI-enhanced ones. So, it comes back to the question of general principles and what makes sense in a given situation with a given technology. What's fair game for surveillance and what the law actually requires in terms of notice and consent, so it's always going to be tailored to a specific circumstance.

Felicia Ellsworth: Okay, so let's talk about this issue from the other side of the equation. What advice would you give to employees working within their employer's surveillance program?

Laura Schneider: Yeah, my children are now recently grown and out in the workforce, so I guess I would give the advice that I give to them, which is to understand that there isn't really much reasonable expectation of privacy that the law gives to employees. You should assume when you are at work, or when you're working on your employer's systems, whether it's their phones or their computers, that they can know what you are doing. And so, you should act accordingly and appropriately, doing the job that they've hired you to do and complying with their policies. That's the advice that I give to my own kids. That said, there is a reasonable expectation of privacy under the law that would prevent something like camera monitoring in the restroom, or in certain states that would prevent recording of conversations without both parties consenting to that. So, there are limits, but I feel like for employees generally, the best advice is just to always assume that somebody is watching. To be fair, though, in most cases the justified assumption is probably that the employer is using the tools available to manage the business and the workforce towards legitimate ends, not to spy for no reason or to try to catch someone out.

Felicia Ellsworth: Would you advise an employee to carry two devices, both a personal device and a workplace-issued device?

Laura Schneider: I do that and so yes, I would advise anybody to do that.

Felicia Ellsworth: Companies are constantly seeking to cultivate a company culture that reinforces their core values. What role do you think employer surveillance may play in the perception of a company's culture?

Laura Schneider: So tricky situation, right? People don't like to be watched and monitored. Really none of us does. There was a study a couple of years ago that the American Psychological Association did, and it reported that workers subject to employer monitoring report increased stress and anxiety. No surprise there, 56% of workers who experienced monitoring by their employers reported that they were feeling stressed out during the workday. So, the irony here is if an employer is using these tools to maximize efficiency and productivity, this stress that employees feel may ironically lead to worse performance and less productivity. I think to balance the tension as best you can, employers should use the monitoring as a tool for something other than discipline. As a tool with a positive goal in mind. Zoom out first and say, what are we trying to achieve as opposed to, "Oh, here's a cool new tool, what can we do with it?" It's really thinking about what business problems do we need to solve? What goals are we trying to achieve in a more universal sense? And then see what tools are available to achieve those ends. And I think if that's the case, presumably then you are in the kind of workplace that generates more trust across the board.

Felicia Ellsworth: Has this question about workplace surveillance risen to be a board level issue yet, or not?

Laura Schneider: I haven't seen it get there yet, but it certainly could. And again, surveillance approaches can have risks, and so I would say it's an area of potential risk like any other, but not necessarily specific to the AI in the surveillance context. There are legal implications here and those things can rise to the board level, certainly.

Felicia Ellsworth: So with regard to legal risk, can you tell us a little bit about what the key federal protections are that are in place that protect employee privacy in this space?

Laura Schneider: Most of that is really actually more in the state law context when you're looking at privacy notions. But there are a couple of federal laws that I did want to flag where this area is implicated. The first one would be the National Labor Relations Act. The NLRA actually protects the rights of employees to act concertedly for better working conditions. Everybody has that NLRA protection if they're not a supervisor, even if the workplace doesn't have a union. One of the things that that law prevents is employers interfering with their employees' ability to collectively organize, and a couple of years ago, the National Labor Relations Board general counsel specifically announced her intention to crack down on intrusive and abusive, her words, electronic monitoring and automated management practices. And she specifically called out employers that track employee movements using GPS or wearable devices, cameras and that sort of thing. And there was actually an NLRB case that found that the use of inward-facing truck cameras could constitute unlawful surveillance of drivers. In that case, the driver had covered up the inward-facing camera while he had his lunch break, figuring that was his private time. The company disagreed, and the case went to the National Labor Relations Board, which agreed with the union and said that was improper surveillance.

I also want to flag state and federal anti-discrimination laws because those also provide guardrails. AI monitoring or any kind of surveillance can't involve discrimination on the basis of race or gender or any other protected class. Because AI algorithms learn from existing data, if the underlying data includes bias against a certain group, then the AI results will likely reflect that bias and potentially result in unlawful decision making. You need to be careful there. There is a California case in federal court called Mobley v. Workday, and that case allowed a job applicant's discrimination case to move forward against the vendor that had provided employers with AI-powered recruitment tools. This applicant was saying he was screened out of employment opportunities due to his race, age and disability status. And now that case is actually being allowed to move forward as a collective action by, presumably, all of the people who were screened out on the basis of those things. So, it's another reason for employers to make sure that they are monitoring their monitoring technology and auditing their AI, and that there's human involvement in decision making at all levels so that you don't end up with those unintended consequences.

Felicia Ellsworth: Those are great insights into the existing federal and state laws that are in place and some of the cases interpreting them. But what about the future? What's on the horizon in terms of federal or state legislation that we should be aware of?

Laura Schneider: States have really taken the lead both in privacy and in the AI discrimination space. And I expect that they still will. So, California has a Privacy Rights Act that established rights for employees to know when their personal information is being collected, how the data is being used and when it's being shared. New York and some other states have biometric privacy laws that regulate how you can collect and use fingerprints and facial scans and those sorts of things. And then New York City has an AI bias law that prohibits employers from using tools for employment decisions unless there's been a certain kind of auditing. Colorado has a similar law aimed at protecting workers from discrimination that could arise from the use of AI tools. So all of that is happening, and I think that we are going to see more state legislation in this area. To date, states have really focused on the AI side, on discrimination. The privacy laws are coming too, and even though they haven't specifically focused on AI, we will see more legislation, but it's really hard to keep up. The law lags behind the technology.

Felicia Ellsworth: Do you think companies are going to continue to have to work with this patchwork of state and municipal legislation and regulation, or do you think Congress will step in with a national uniform standard in this area?

Laura Schneider: It would be great if Congress stepped in, but I think at this point in time it's not realistic to expect the federal government to be able to pass meaningful bipartisan legislation in this space. So, I would anticipate that it would be state lawmakers to be taking action, at least in the near term, but no crystal ball, and the feds could surprise us.

Felicia Ellsworth: Well, we'll certainly see what the future entails. Now before we go, one last question for the folks in our audience that are just starting out in their legal careers. What advice would you give them if they are interested in pursuing a career in employment or privacy law?

Laura Schneider: Neither is typically the subject of any sort of required coursework, so I would say take electives in those spaces. Think about internships with state or federal agencies in the space, like the EEOC or a state Fair Employment Practices Agency. These areas of law are evolving so fast, so keep abreast of developments and I would say it's a great time to become the master of one of these rapidly evolving areas of law and you could quickly find yourself the expert anywhere you landed.

Felicia Ellsworth: This has been a very interesting discussion. Thank you so much, Laura. We really appreciate your time.

Laura Schneider: Thank you.

Felicia Ellsworth: And thank you to our listeners for tuning in to this episode of In the Public Interest. We hope you'll join us for our next episode. If you enjoyed this podcast, please take a minute to share it with a friend and subscribe, rate and review us wherever you listen to your podcasts. If you have any questions regarding this episode, please e-mail them to us at Inthepublicinterest@wilmerhale.com. For all of our WilmerHale alumni in the audience, thank you for listening. We are really proud of our extended community, including alumni in the government, the nonprofit space, academia, other firms and leadership positions in corporations around the world. If you haven't already, please join our alumni center at alumni.wilmerhale.com so we can stay better connected.

Special thank you to the producer of this episode, Maria Kanevsky. Sound engineering and editing by Bryan Benenati, marketing by Andy Basford and his team, all under the leadership of executive producers Kaylene Khosla, Matt O'Malley and Jake Brownell. See you next time on In the Public Interest.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
