Checklist to Help Schools Vet AI Tools for Legal Compliance

Schools and districts around the United States are currently grappling with how to vet new edtech tools that incorporate generative AI. While various groups have produced frameworks designed to help schools review these tools holistically, those frameworks frequently say very little about how to vet the tools for privacy compliance. To address this gap, the Future of Privacy Forum’s Vetting Generative AI Tools for Use in Schools explains the steps schools should consider incorporating into their broader edtech vetting and AI policies.

Download Vetting Generative AI Tools for Use in Schools

Download Incorporating Generative AI Into Your School’s App Vetting Checklist

Audience

These resources were created for school and district staff who review edtech tools for privacy compliance. Because they indicate the questions schools will ask edtech vendors, the resources are also of value to companies that produce edtech tools.

How to use the resource

The resource assumes that the school or district already has a process in place to review and approve new edtech tools for privacy compliance (referred to as app vetting). It points out (1) the laws and other legal considerations that should be part of a review and (2) the following key ways in which tools that use generative AI are unique and should be addressed in the review process:

  • Use case dependent. Because generative AI edtech tools can accept a wider range of user input and perform a broader set of tasks than traditional edtech tools, schools will need to consider the specific use cases for which a tool will be deployed.
  • Data collection. Student privacy laws typically cover use cases where the tool requires student personally identifiable information (PII) as input or where the tool’s output becomes part of the student’s record. Many use cases do not require student PII, and schools can pursue those without implicating most student privacy laws. Even so, there are many use cases where a school may not be able to control all the information the tool collects, so schools should consider whether the data collection risk can be mitigated or avoided altogether.
  • Transparency and explainability. For tools that use student PII, the school will need to consider how it will meet requirements for transparency and explainability to teachers, parents, and students. State privacy laws frequently require schools to publicly share information on what student data they share and its recipients. Many edtech companies are creating AI transparency pages to better explain the data their tools use and how they make decisions.
  • Product improvement. Many generative AI tools rely on large amounts of data to continuously train the underlying model that generates responses. Other tools train a model initially but do not use student data to further train it. An important question schools need to ask is whether the vendor will use student PII to train the model and, if so, whether any additional products the vendor creates with the model are educational or commercial and whether that additional use is permitted under state law.
  • Unauthorized disclosure of student PII. If student PII is used to train the model, there is a chance that snippets of that PII will appear in future output from the tool. The school will need to understand the steps the company takes to prevent these sorts of unauthorized disclosures.
  • High-risk decision making. Some proposed use cases that involve substantive decision making may be governed by long-standing rules or new AI laws. Other uses may carry such a high risk of harm to students that schools should be cautious in pursuing them. Potential options include permitting these use cases only with parental consent, requiring a human in the loop, or prohibiting the use case entirely.
