States Working on Navigating the AI Revolution
Artificial Intelligence (AI) holds both promise and risk for education and access. Lessons could be personalized. Activities could be tailored to an individual student’s strengths or weaknesses. Students could receive real-time, individualized feedback on class assignments and homework.
A 2024 teacher survey conducted by Canva found that teachers see real benefits to using AI, particularly to support students with different learning needs. Nearly three-fourths (72 percent) of teachers surveyed said the technology could help them with language learning, and 67 percent said it could support universal accessibility.
There are also real concerns about the impact of AI on education. The benefits of the technology might be available only to the most advantaged students in the most advantaged schools. An over-reliance on AI in decision-making (for example, identifying students to take advanced coursework) could lead historically underserved students to be shut out of opportunities to enroll in challenging classes if the algorithms used are either biased or not transparent. And, of course, AI makes possible the creation of completely false images and information, as even Pope Francis found out when an online photo showed him in a designer white puffer jacket.
District leaders, state and local policymakers, educators, and families have questions about AI in schools. It is no surprise, then, that school policy seems to be focusing more on the concerns over AI than on the opportunities.
This fast-moving topic will require collaboration between policymakers and practitioners nationwide as states try to allow room for AI innovation while developing effective guardrails to protect students' educational experience and personal information.
A review of recent state legislative sessions finds some movement toward enacting either legislation or policy guidance on AI. State actions seem to fall into four broad categories:
Approach #1 – Study
Creating blue-ribbon panels is a time-honored way for states to show that they are taking some action while not really making any immediate policy changes. This approach seems particularly well-suited to AI guidance, since the field is changing so rapidly.
California – SB 721, passed in 2023 and amended in 2024, creates the California Interagency AI Working Group tasked with delivering a report to the Legislature regarding artificial intelligence. Members of the working group were required to demonstrate expertise in at least two of the following areas: computer science, artificial intelligence, the technology industry, workforce development, or data privacy.
Illinois – In 2023, the Illinois General Assembly passed Public Act 103-0451, which establishes a task force to study the issues of Generative AI and Natural Language Processing. Educators are prominently represented on the task force run by the Department of Innovation and Technology, which includes:
- The state superintendent or designee;
- The executive director of the Illinois Community College Board or designee;
- The executive director of the Board of Higher Education or designee;
- Two teachers recommended by a state organization representing teachers, to be appointed by the Governor; and
- Two principals recommended by a statewide principals association, appointed by the Governor.
Louisiana – In 2023, both the Senate and the House unanimously adopted SCR49, a resolution requesting that the Joint Committee on Technology and Cybersecurity study the impact of artificial intelligence in operations, procurement, and policy.
Oregon – In November 2023, Governor Kotek signed an executive order creating an Oregon State Artificial Intelligence Advisory Council. This group is charged with developing a recommended plan framework for presentation to the Governor.
Texas – In 2023, the Texas Legislature approved HB 2060 to establish an AI advisory council. Members, who include public and elected officials, academics, and technology experts, will examine and monitor AI systems developed or deployed by state agencies. The council will also issue policy recommendations regarding data privacy and preventing algorithmic discrimination.
Vermont – In 2018, Vermont created an Artificial Intelligence Task Force in Act 137. No K-12 educators are named as members of the task force, but it does specify that a “secondary or postsecondary student in Vermont” shall be a member of the task force. The Task Force reports annually to the Legislature and reviews AI-focused bills that are under consideration by the Legislature.
Approach #2 – Guidance to Local Districts
At least eleven states—including California, North Carolina, Oregon, Washington, and West Virginia—have provided local school districts with guidance documents explaining how AI applies to existing laws and offering best practices for school district policy. In addition, Georgia, South Carolina, and Arkansas have posted curriculum standards and/or courses that will teach AI concepts.
The course description for Georgia’s Foundations of Artificial Intelligence reads: “This introductory course explores the foundations of Artificial Intelligence in society and the workplace, including programming, data science, mathematical reasoning, creative problem solving, ethical reasoning, and real-world applications of Artificial Intelligence. Students will learn the foundational skills to understand how to both interact and develop Artificial Intelligence solutions in a variety of settings.” Florida offers AI classes through the Florida Virtual School.
California – The California Department of Education developed a policy memorandum, Learning With AI, Learning About AI, which focuses on equity and student safety. Given the inevitable growth of artificial intelligence, the Department of Education stated that the guidance was intended to be “informative, rather than prescriptive.”
The memorandum notes both benefits and concerns from AI. While stating that “AI or any other technology cannot replace the value of a student’s relationship with a caring educator who can connect on a human level,” the Department notes that AI can expand teachers’ ability to “generate personalized learning materials, such as worksheets, quizzes, and reading assignments, aligned with students’ learning goals, strengths, and interests.”
At the same time, the memorandum notes serious concerns about student privacy and data collection. Rather than developing a comprehensive plan, the memorandum lays out criteria that local school systems should consider as they develop their own AI policies, including complying with federal and state data privacy laws, clarifying who owns and handles data, creating transparency around algorithms, and ensuring accessibility and inclusivity for all students.
Oregon – The Oregon State Department of Education has developed guidance for local districts. Its document, Developing Policy and Protocols for the Use of Generative Artificial Intelligence in K-12 Classrooms, lays out issues for educators and school districts to consider as they develop AI policies. It outlines a seven-step process:
- Review specific equity-focused AI questions;
- Engage the district and the community;
- Review products and services;
- Establish clear guidelines;
- Develop policy;
- Create a professional development plan; and
- Implement and monitor policy to determine effectiveness.
Approach #3 – Protect Individuals
Several states have focused their efforts on ways to protect individuals from unintended, yet foreseeable, outcomes of unsafe or ineffective AI. Much of this work centers on data privacy: California, Colorado, Connecticut, Delaware, Indiana, Iowa, Montana, Oregon, Tennessee, Texas, and Virginia have all passed data privacy legislation. However, most of these bills deal with business use of individuals’ data.
Colorado – The General Assembly prohibited schools from using Facial Recognition Technology (FRT) until 2025 in SB 22-113. During the interim, the General Assembly also established a task force to address the issue of how and when school districts might safely use FRT.
Draft legislation from the task force proposes that FRT could be used by schools under these circumstances:
- An individual makes an articulable and significant threat against a school or the occupants of a school, and the use of facial recognition technology may assist in keeping such school or occupants safe;
- A student absconds from a school class, field trip, event, or program or is otherwise reported as lost or missing by the student’s parents, teachers, or school officials, and there is a reasonable belief that using facial recognition technology may assist in finding the lost student; or
- An individual has been ordered to stay off school district property, and, based on threatening or harassing behavior, there is a reasonable belief that such an individual may attempt to reenter district property in the future.
New York – In 2024, the New York State Assembly passed the Legislative Oversight of Automated Decision-making in Government Act (LOADinG Act). The bill addresses “automated decision making” at state agencies, defined to include software that relies on algorithms, computer models, or artificial intelligence. Teacher evaluations are among the tasks that include algorithms as part of the decision-making process.
Approach #4 – Leave AI to Local Districts
A number of states have made a conscious decision to leave AI policy to individual districts. Iowa, Rhode Island, and Wyoming are all following this approach.
Iowa – The Ames Tribune reported on how the Iowa Department of Education was approaching AI policy: “Heather Doe, spokesperson for the Iowa Department of Education, said decisions about academic expectations, honesty and whether to block certain websites and artificial intelligence tools are made on the local level, though districts do receive technology support and guidance from their Area Education Agency.”
States are just starting to address the challenges and the opportunities of AI in schools. Expect the next few legislative sessions to be a time of greater policy action.