Checklist to Help Schools Vet AI Tools for Legal Compliance

Schools and districts around the United States are currently grappling with how to vet new edtech tools that incorporate generative AI. While various groups have produced frameworks designed to help schools review these tools holistically, those frameworks frequently say very little about how to vet them for privacy compliance. To address this gap, the Future of Privacy Forum’s Vetting Generative AI Tools for Use in Schools explains the steps schools should consider incorporating into their more general edtech vetting and AI policies.

Download Vetting Generative AI Tools for Use in Schools

Download Incorporating Generative AI Into Your School’s App Vetting Checklist

Audience

These resources were created for school and district staff who review edtech tools for privacy compliance. Because they indicate the questions schools will ask edtech vendors, the resources are also of value to companies that produce edtech tools.

How to use the resource

The resource assumes that the school or district already has a process in place to review and approve new edtech tools for privacy compliance (referred to as app vetting). It points out (1) the laws and other legal considerations that should inform a review and (2) the following key ways in which tools that use generative AI are unique and should be accounted for in the review process:

  • Use case dependent. Unlike traditional edtech tools, generative AI tools can accept a wide range of user input and produce output for any number of tasks, so schools will need to consider the specific use cases for which the tool will be deployed.
  • Data collection. Student privacy laws typically cover use cases where the tool requires student personally identifiable information (PII) as input or where the output from the tool will become part of the student’s record. Many use cases do not require student PII, and schools can pursue those without implicating most student privacy laws. Even so, there are many use cases where a school may not be able to control all the information the tool collects, so schools should consider whether the data collection risk can be mitigated or avoided altogether.
  • Transparency and explainability. For tools that use student PII, the school will need to consider how it will meet requirements for transparency and explainability to teachers, parents, and students. State privacy laws frequently require schools to publicly share information on what student data they share and its recipients. Many edtech companies are creating AI transparency pages to better explain the data their tools use and how they make decisions.
  • Product improvement. Many generative AI tools rely on large amounts of data to continuously train the underlying model that generates responses. Other tools train a model initially but do not use student data to further train it. An important question schools need to ask is whether the vendor will use student PII to train the model, and if so, whether any additional products the vendor creates with the model are educational or commercial, and whether that additional use is permitted under state law.
  • Unauthorized disclosure of student PII. If student PII is used to train the model, there is a chance that snippets of that PII will appear in future output from the tool. The school will need to understand the steps the company takes to prevent these sorts of unauthorized disclosures.
  • High-risk decision making. Some proposed use cases that involve substantive decision making may be governed by long-standing rules or new AI laws. Other uses may carry such a high risk of harm to students that schools should be cautious in pursuing them. Options schools may consider include permitting these use cases only with parental consent, requiring that a human be in the loop, or prohibiting the use case.

Related Resources

  • EdTech Perspectives

    Demystifying the Consumer Privacy Patchwork

    Jan 18, 2024 | Randy Cantz

    What should edtech companies know about consumer privacy laws? As states continue to pass new consumer privacy laws, edtech companies may be left wondering what…

  • Higher Ed Perspectives

    Higher Education Compliance with Updates to the GLBA Safeguards Rule

    Jul 6, 2023

    Higher education institutions participating in the US Department of Education’s federal student aid programs need to be aware of recent updates to requirements…

  • FPF Perspectives

    FTC announces a complaint and consent agreement against Chegg

    Nov 7, 2022 | Jamie Gorosh and Lauren Merk

    Since May 2022, education technology (edtech) companies have been on notice that the Federal Trade Commission (FTC) is closely monitoring the industry to ensur…
