On October 30, 2023, President Biden issued Executive Order No. 14110 on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. While Venable's Labor and Employment Group has previously published on other executive agency guidance on the use of artificial intelligence—or AI—in making employment decisions, EO 14110 is the first federal directive that directly addresses the impact of AI on the education sector broadly, including, without distinction, both public and nonpublic schools. Specifically, the White House's accompanying Fact Sheet outlines how EO 14110 "directs the most sweeping actions ever taken to protect Americans from the potential risks of AI systems" while also shaping "AI's potential to transform education by creating resources to support educators deploying AI-enabled educational tools, such as personalized tutoring in schools." This duality of EO 14110, and its recognition of both the risks and benefits of AI, leads to two seemingly disparate results.
On the one hand, it encourages regulatory agencies, like the Department of Education, to use their full range of authority, including rulemaking, to protect students from fraud, discrimination, and threats to privacy, and to address other risks that may arise from the use of AI. This includes directing the Secretary of Education to develop, within the year, resources, policies, and guidance that address safe, responsible, and nondiscriminatory uses of AI in education, including the impact AI systems have on vulnerable and underserved communities.
On the other hand, President Biden also directed the Secretary of Education to develop an "AI toolkit" for education leaders that implements recommendations from the Department of Education report, AI and the Future of Teaching and Learning, including incorporating appropriate human review of AI decisions; designing AI systems to enhance trust and safety and align with privacy-related laws and regulations in the educational context; and developing education-specific guardrails.
While EO 14110 does not require any immediate action on the part of schools, administrators and educators who begin to educate themselves about AI and consider its impact will be ahead of the curve. For example, the AI and the Future of Teaching and Learning report contains the following recommendations:
- Emphasize Humans-in-the-Loop. Teachers, administrators, and other individuals must stay "in the loop" whenever AI is employed, so that they can notice unfair, biased, or incorrect patterns.
- Align AI Models to a Shared Vision for Education. Outcomes cannot be the only consideration when evaluating the success of AI in education. Rather, a shared vision of teaching and learning must remain at the center of AI tools and systems.
- Inform and Involve Educators. Educational leaders must inform and involve teachers so that they are prepared to investigate how and when AI fits specific teaching and learning needs, and evaluate the risks that may arise. This ability to inspect and explain AI models will, in turn, strengthen trust, while also guiding educators as they override recommendations generated through AI where appropriate.
- Develop Education-Specific Guidelines and Guardrails. While data privacy regulation already covers educational technology, modifications and enhancements will be needed to address the new capabilities and risks of AI.
Besides fostering conversations on the above topics, one other way that schools can prepare for the expansion of AI in education is by adding an "AI Policy" to both their student and faculty handbooks. For example, a model student AI policy may address appropriate and inappropriate uses of AI in completing schoolwork. Similarly, a faculty AI policy may address appropriate and inappropriate uses of AI in designing curricula and completing other assignments, including any copyright or other intellectual property considerations that may arise.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.