It has been a very, very busy week for FERPA, and it reminds me of Eric Carle’s The Very Busy Spider – the spider wouldn’t stop building her web. So as we busily build our privacy web, introducing more and more bills, I believe it is worth discussing the implications of proposing bills that severely restrict the effective use of data. Sen. David Vitter (R-LA) introduced an amendment to FERPA entitled the Student Privacy Protection Act. Unlike other bills, however, this one seems to restrict the use of data to understand how we can support learners more effectively.
The bill would allow only aggregated, anonymized, and de-identified data to be used for State Longitudinal Data Systems (SLDS) and prohibits predictive analytics. At first glance, using only aggregated data seems like a good idea, but here is the issue: the bill prohibits the use of data that helps measure student outcomes. It prohibits the use of federal funds to support “affective computing,” which includes the analysis of physical or biological characteristics such as facial expressions or brain waves. And while I understand a parent’s concern about having this granular information captured for a neurotypical child, to the parent of a special needs child this information could prove to be the difference between a struggling academic journey and a successful one. If predictive analytics are allowed, the information obtained can prove invaluable to students with special needs.
What if we cannot use the data in an SLDS to determine how we are helping English Language Learners? How can we answer the question – do we have information on English Language Learners? If we cannot track the performance and graduation rates of ELLs, how can we be sure we are providing those children with the most appropriate resources, or whether those resources are working? Or take high school dropout rates. If we cannot access SLDS data, how can we measure success for all students? For example, we know that the gap in dropout rates between Whites and Hispanics narrowed from 23 percentage points in 1990 to 8 percentage points in 2012. We can’t answer these questions without data. And if our goal is to provide a truly equitable education system, a system that serves all students, we cannot deny the use and analysis of this valuable data. Further, how can we determine how many students with disabilities are receiving support services in schools? And are these support services effective?
It is clear that most bills relating to student privacy are looking to protect student data and to respond to parent concerns. But we must be careful that in our effort to protect students we do not impose restrictions that would rob us of valuable insights into how to better serve their educational needs. We cannot make smart decisions without complete data sets.
With so many student data privacy bills, we are clearly responding to parents’ concerns; however, we cannot make the bills so restrictive that we cannot use the data. Instead of restricting the use of data, I propose we train staff and make them aware of the implications of misusing data, but let’s not shut the system down and make decisions in a vacuum. We need the data; it is essential that we be able to make informed decisions. Let’s focus on having comprehensive data sets and hosting them in secure environments, with staff trained in proper data-handling protocols. Knowledgeable staff educated in the ethical and effective use of data can protect student privacy better than removing data from an SLDS. Only by responsibly using relevant data can we ensure we deliver an equitable school system for all students.