Artificial intelligence (AI) tools like ChatGPT, Harvey.ai, Perplexity etc. have revolutionised workplace productivity, offering quick and efficient solutions for various tasks. However, their widespread adoption has raised serious concerns about data privacy and confidentiality, which urges the question: How secure is the information we put in? Confidential data may be repurposed for model training or even exposed through strategic prompt manipulation. It is essential that users, businesses and individuals alike, must carefully consider the risks associated with using AI in professional settings.
A survey of 1,666 U.S. employees conducted in June 2024 found that 22% of workers use ChatGPT daily, with most common task being summarising documents (61%). 6 There are increasing worries about confidential data being divulged when it is transferred to the AI provider for training purposes and the information gathered being used in answers to others, including sensitive information.
The UK's National Cyber Security Centre (NCSC) found that information that is inputted is not currently used for the improvement of AI answers. At the moment, Large Language Models (LLMs), a type of AI that processes, analyses, and generates human language, are trained on the data already gathered first, and the resulting model is then released for public usage. However, information that is inputted is visible to the developers of the LLM, stored and, potentially used in further improving the model. 4 Furthermore, Cyberhaven's research found that certain strategic prompts could lead to ChatGPT revealing confidential data to third parties. Indeed, there have been an increasing number of cases of 'injection attacks' or 'jailbreaking', where users have found loopholes that allow ChatGPT to divulge private or prohibited information by giving specific and strategic instructions that circumvent safety and privacy rules imposed by the developers. 2
This means that content made available to AI systems, including confidential and privileged information, may be disclosed to the developers, used for improvements or even accessible to third parties. The legal implications for businesses and clients alike are severe, as this threatens confidentiality as well as IP rights.
To be clear, AI developers are implementing safeguards, but the risk of confidential information being stored, accessed, or repurposed remains a pressing concern. Organisations must establish strict policies on AI usage, ensuring sensitive data is protected from unintended exposure. Additionally, legal frameworks need to adapt to safeguard intellectual property and privacy rights in an era where AI plays an increasingly prominent role. Without proactive measures, the convenience of AI could come at the cost of confidentiality and trust.
Footnotes
1. Risk outlook report: The use of Artificial Intelligence in the legal market. Solicitors Regulation Authority. (2023, November 20). https://www.sra.org.uk/sra/research-publications/artificial-intelligence-legal-market/
2. Hill, M. (2023, March 22). Sharing sensitive business data with CHATGPT could be risky. CSO Online. https://www.csoonline.com/article/574799/sharing-sensitive-business-data-with-chatgpt-could-be-risky.html
3. Coles, C. (2023, February 28). 11% of data employees paste into CHATGPT is confidential. Cyberhaven. https://www.cyberhaven.com/blog/4-2-of-workers-have-pasted-company-data-into-chatgpt
4. CHATGPT and large language models: What's the risk? NCSC. (n.d.). https://www.ncsc.gov.uk/blog-post/chatgpt-and-large-language-models-whats-the-risk
5. Hays, S. (2024, October 30). AI training and copyright infringement: Solutions from Asia. Tech Policy Press. https://www.techpolicy.press/ai-training-and-copyright-infringement-solutions-from-asia/
6. CHATGPT helps 4 in 10 users get raises, as workers' fears of Ai Fizzle. Resume Templates. (2024, July 8). https://www.resumetemplates.com/chatgpt-helps-4-in-10-users-get-raises-as-workers-fears-of-ai-fizzle/
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.