Recently, the ACLU, in partnership with the Tenth Amendment Center, created model legislation for states to “take control of their privacy in a digital age.” On January 20th, 2016 the ACLU coordinated with legislators in 16 states and the District of Columbia to roll out a variety of privacy bills simultaneously, many of which addressed the topic of educational data directly. Since the passage of California’s SOPIPA, many states have proposed legislation that places new responsibilities on ed tech vendors for the handling of student data. The ACLU’s model bill, however, is a hybrid that specifies new requirements for both schools and vendors to ensure protections and accountability for student data privacy.
The ACLU’s model legislation, which was proposed in various forms in the different participating states, includes several excellent ideas. The most important provision allows parents to specifically authorize that their child’s data be sent to the educational service providers they choose for additional educational purposes. This lets parents supplement their child’s education by allowing a tutor to access data, or by exporting data to a tool that provides an enhanced or personalized curriculum. A student could send data to an educational game, or to an app that lets them take their data with them for use in college or a workplace.
At a time when learning increasingly takes place 24/7, the provision allows parents to help students take their education records with them for the additional educational purposes they choose. Unfortunately, many bills passed in other states only allow parents to enable the sharing of data for college applications or scholarships, making it illegal for schools or vendors to use a program that would enable such onward transfers. Parents in those states have to request access to their child’s data from a school authority, hope it is provided digitally, and then transfer it to third parties themselves.
Why are those states imposing such limits? Some of them are worried that school service providers will convince parents to share data for marketing purposes. The ACLU bill addresses that concern by ensuring the consent required is explicit, specific, and limited to educational purposes. (In fact, the consent requirements are so strict that they may not be achievable for most services; they will need some wordsmithing to be both protective and practical.)
Another key provision in the ACLU bill is the training requirement. Training for teachers to ensure that they have an adequate understanding of student privacy and how to comply with the law is critical. Restrictions placed on school data would be useless if school staff do not fully understand the rules or are not trained to follow them. The CoSN Trusted Learning Environment program should provide a useful training resource for many schools, as should the extensive materials at Student Privacy Compass.
Fortunately, this model legislation avoids one of the unintended pitfalls of some recent state bills, whose language is so strict that it can inadvertently restrict schools from providing needed information to school photographers, yearbook publishers, and spiritwear providers.
Another proposal in the model bill allows schools to bar parental access to confidential student data if allowing access would risk the safety of the student. For example, if a student confidentially disclosed to a guidance counselor that they were gay or lesbian but feared for their safety should their parent(s) request access to this data, the school could respect their need for confidentiality, even if the data was included in a student record. However, this provision can only be effective if implemented at the federal level, since FERPA currently contains no such exception and otherwise requires that parents have access to the entire student education record. Such a “zone of confidentiality” should be an integral part of the student privacy conversation moving forward.
However, there are many other provisions in the bill that create significant problems for schools. These would need revisions to be effective.
The bill defines Personally Identifiable Information (PII) as “Any aggregate or de-identified student data that is capable of being de-aggregated or reconstructed to the point that individual students can be identified.” And in some places the bill requires data to be both aggregated and de-identified, further raising the bar.
Since no de-identification method is 100% perfect, this definition would restrict the use of many very effectively de-identified data sets. Without de-identified longitudinal data sets, which require record-level data, states cannot measure how well schools perform. Identifying discriminatory practices, among other useful applications of de-identified but not aggregated data, requires detail about race, geography, grades, discipline, and other attributes that can leave a data set well de-identified but not impossible to re-identify. In fact, FERPA specifically allows sharing of data that may include a limited identifier, as long as the use is limited to research. A better standard for de-identification is the “reasonable” standard used in the Student Privacy Pledge. Or, so that schools do not have to deal with multiple de-identification standards, the bill could seek to be consistent with FERPA’s de-identification standard, which we analyzed in our recent whitepaper.
The bill treats college students the same as kindergarten students. This is too broad a brush. College students are legal adults and can safely be treated differently from students who have yet to reach the age of majority. While certain protections are still appropriate, a college student has the ability to make informed decisions based on their educational needs to a much greater extent than a young child.
The model bill makes student data available only to school employees. This could potentially bar parents who volunteer for school activities and part-time coaches from having access to student data. FERPA allows sharing with all school officials, provided the school maintains direct control over anyone the data is shared with. This bill would require a special contract for coaches, parents, and other school representatives who are not designated as “school employees.”
The bill defines a Student Service System as any “software application and/or cloud based service that allows an educational institution to input, maintain, manage, and/or retrieve student data and/or personally identifiable student information, including applications that track and/or share personally identifiable student information in real time.” This provision covers any service that can be used by a student, which in effect means that every business in the world, including one that has no way to identify and screen out students or that doesn’t even know a student is using its product, would be covered by this bill. This is likely to be unconstitutional and is obviously impractical. Most state laws and the Student Privacy Pledge cover products that are designed and marketed for schools.
Of particular concern, the proposed language would allow privacy litigation against teachers or parent volunteers. Schools hold a heavy responsibility to train, provide resources, and manage the use of ed tech in the educational process, but the way to ensure accountability for this is not to put teachers at personal legal risk. Schools must have reasonable review and implementation processes in place for using technology and protecting data. Teachers who violate school rules in general or put students at risk should be subject to appropriate management actions or discipline. But a teacher who misreads a privacy policy shouldn’t face litigation.
We applaud the work of the ACLU and their partners in these states to clearly address some of the challenges of student data privacy as ed tech applications continue to be implemented in schools, and look forward to working on these same issues with them, and with policymakers at the state and federal level.
Cross-posted with the Future of Privacy Forum website.