A Conversation with Iris Palmer

New America’s Education Policy Program uses policy analysis and research to tackle critical education problems in the United States. One such issue is the potentially problematic use of predictive analytics tools in higher education institutions. In “The Promise and Peril of Predictive Analytics in Higher Education,” co-authors Iris Palmer and Manuela Ekowo present a landscape analysis of the different goals pursued by institutions and describe the important ethical questions that colleges need to consider before using predictive analytics.

On July 31, FPF spoke with Iris Palmer, Senior Advisor for Higher Education and Workforce in the Education Policy Program at New America, about how predictive analytics can be used to help low-income and minority students succeed in college or, alternatively, to shut them out altogether.

FPF: Please tell us about predictive analytics in higher education. How did you first begin working on this issue?

Iris: We began working on this topic just as it was becoming particularly trendy to talk about using predictive analytics to enhance student support services. Our view was that while the idea seemed promising, there were not enough conversations about how using predictive analytics could perpetuate bias. We undertook this landscape analysis to examine how predictive analytics were actually being used in the field. We used a fairly broad definition of “predictive analytics” that encompasses artificial intelligence (AI), Big Data, and adaptive technologies: past data is used to predict future occurrences, and those predictions are then acted upon.

We identified four primary use cases for predictive analytics in higher education. First, predictive analytics are being used in enrollment management. This use case focuses on whom a given college recruits, how it allocates its financial aid budget, and, generally, how to assemble the class the institution thinks is best for it. The second use case is early alerts, which involves identifying students who may need advising and trying to intervene to promote their success in college. The third is using predictive analytics in classrooms, which we call “adaptive tech”: courseware that adapts to a particular student’s learning style and identifies where they may need additional assistance. The last is what we think of as facilities uses: employing AI across the campus for a variety of logistical planning, such as allotting classroom space, creating class schedules, or allocating a school’s water resources.

FPF: What are some of the main risks or concerns you identified in the use of predictive analytics in higher education institutions?

Iris: The kinds of issues that may arise depend, of course, on the different ways an institution may be using predictive analytics. However, there were a few big-picture issues that we outline. The first real concern is the risk of self-fulfilling prophecies. Let’s say you’ve identified a student who is at risk. You tell them they are at risk. Then, all of a sudden, they are actually at risk. In these cases, the prediction itself produces negative consequences, such as the student dropping out, that may or may not have come true absent the school’s attempt to act on the prediction and intervene. Another issue relates to profiling students, i.e., using socioeconomic status, racial data, or proxies for these to identify certain groups. Profiling produces systems that are no more predictive than what we generally already know about students who are at higher risk. These systems need to be predictive beyond what we could guess from the fact that students are first generation, that their family is low income, that they are racial minorities, or other attributes that would be considered high-risk factors. Otherwise, all the system is doing is profiling, which is not particularly useful and adds no value.
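To make that “added value” test concrete, here is a minimal sketch in Python, using synthetic data and hypothetical feature names (not drawn from any system discussed in the report): it compares a model built only on demographic proxies against one that also sees behavioral signals, and asks whether the richer model actually predicts dropout beyond the demographic baseline.

```python
# Minimal sketch (synthetic data, hypothetical features): does a risk model
# predict anything beyond what demographic proxies alone would tell us?
# If not, the "predictive" system is effectively just profiling.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Demographic proxies (e.g., first-generation status, low-income flag).
first_gen = rng.integers(0, 2, n)
low_income = rng.integers(0, 2, n)

# Behavioral signals (e.g., first-term GPA, attendance rate).
gpa = rng.normal(3.0, 0.6, n).clip(0, 4)
attendance = rng.uniform(0.5, 1.0, n)

# Synthetic outcome: dropout risk driven by both kinds of signal.
logit = (-1.0 + 0.4 * first_gen + 0.5 * low_income
         - 1.2 * (gpa - 3.0) - 2.0 * (attendance - 0.75))
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_demo = np.column_stack([first_gen, low_income])
X_full = np.column_stack([first_gen, low_income, gpa, attendance])

Xd_tr, Xd_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
    X_demo, X_full, y, test_size=0.3, random_state=0
)

baseline = LogisticRegression().fit(Xd_tr, y_tr)
full = LogisticRegression().fit(Xf_tr, y_tr)

auc_base = roc_auc_score(y_te, baseline.predict_proba(Xd_te)[:, 1])
auc_full = roc_auc_score(y_te, full.predict_proba(Xf_te)[:, 1])

# A full model that barely beats the demographic baseline adds little value:
# it is repackaging known risk factors rather than genuinely predicting.
print(f"demographics-only AUC: {auc_base:.3f}")
print(f"full-model AUC:        {auc_full:.3f}")
```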

Another risk involves transparency and data privacy. There are many different types of data that can be brought into these systems, only some of which are predictive. A number of issues need to be addressed in this regard, including obtaining consent, informing students about how their data will be used and then actually adhering to those uses, engaging students in the process of deciding which data to include in or exclude from the system, and ensuring that their privacy is not violated.

The final issue involves tracking students. When you create student profiles and track students based on these profiles, you create the risk that you will effectively nudge particular groups of students down certain routes. We obviously don’t want this to happen. More generally, we don’t want institutions using data in ways that can harm students. This may sound obvious on its face, but it is something that turns out to be quite complicated to achieve in practice.

All these considerations are necessary to determine whether the benefit of the institution knowing more about each student and their individual traits outweighs the additional privacy risks and the tracking of students.

FPF: What is your impression of the efficacy and accuracy of these predictive analytics tools? How can schools make smart decisions about what to use and how?

Iris: There is a very wide variety of systems. Schools should understand which data are being used and how the predictive models are built. I recently wrote a piece on choosing a predictive analytics vendor and whether institutions actually need one. In it, I suggest questions that schools should ask, including how to dig into the technical and statistical aspects of the tools to better determine their accuracy. It is also worth emphasizing how difficult it is to gain a clear understanding of what a specific tool does or does not do merely from its marketing description, which is often quite squishy and vague. Schools need to revisit these questions on a yearly basis and rebuild the models as necessary.

For example, one issue is which data are included in versus excluded from the training dataset, and how that dataset is used to predict outcomes. Does the system generate its model from data across multiple schools, or only from the particular institution for which it is building an individual model? Some systems draw on data from many institutions: they hoover up as much data as they can and only think later about what is actually predictive. These systems may use more exotic data, such as location data, while others stick to more limited data, such as academic records or student information system data, because they don’t think additional data is reliable or particularly helpful.

With the right data scientists, these systems can use all the different data points that the institution provides and then determine what is predictive. For example, in the context of the common use case of early alerts, they can predict which students will successfully complete their degrees.
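As a hedged illustration of that workflow (synthetic data and hypothetical field names, not any vendor’s actual pipeline): fit a simple completion model on whatever data points the institution provides, then inspect which inputs carry real predictive weight.

```python
# Illustrative sketch only (synthetic data, hypothetical fields), not any
# vendor's actual system: fit a simple degree-completion model on the data
# points an institution provides, then see which features matter.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 2000
students = pd.DataFrame({
    "first_term_gpa": rng.normal(3.0, 0.6, n).clip(0, 4),
    "credits_attempted": rng.integers(6, 19, n),
    "lms_logins_per_week": rng.poisson(5, n),
    "campus_job_hours": rng.uniform(0, 20, n),
})
# Synthetic outcome: completion driven mainly by GPA and engagement.
logit = (0.5 + 1.5 * (students.first_term_gpa - 3.0)
         + 0.2 * (students.lms_logins_per_week - 5))
students["completed"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = students.drop(columns="completed")
y = students["completed"]

# Standardizing puts coefficients on a comparable scale, so their
# magnitudes give a rough read on which inputs are actually predictive.
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
coefs = pd.Series(model[-1].coef_[0], index=X.columns)
print(coefs.sort_values(key=abs, ascending=False))
```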

As to practical efficacy, the schools we spoke with reported varying degrees of satisfaction. Some said the predictive analytics tools they were using had not provided any new information. Others reported that the tools had uncovered helpful associations between different data points. There is wide variation, and much comes down to picking the right tools for the school’s context.

FPF: What are some changes or trends in this space since this report was published in 2016?

Iris: This is a very fast-moving field, so much has changed since 2016. A common use for predictive analytics at that time was recommending programs of study for students at the very beginning of their college life. This particular usage as a standalone product is significantly less prevalent today, although some of the capabilities have been integrated into other larger products.

Today we are beginning to see a significant increase in the use of AI chatbots. For example, Georgia State University is using these to address the particular problem known as “summer melt”: a widespread issue in which high-school graduates who have been accepted to and enrolled in college don’t show up for freshman year in the fall. On other campuses, chatbots are being used to help students access their financial aid information, fill out forms, and answer the various questions that help students navigate the institution. If a chatbot can’t answer a student’s question, it may refer them to a live person.

There are numerous ethical questions surrounding the use of chatbots. These questions touch on the kinds of messages the AI should communicate to students and how to carefully calibrate those messages based on who the student is and what the system already knows about them. While you want the technology to effectively direct a student toward a desired action or outcome, you don’t want to cause harm. Another important aspect is including the students themselves in conversations about how best to strike this balance and which data to collect and use toward this end. Having met with the founder of a company that offers a chatbot service, I can tell you that he has devoted serious thought to these and other important questions. That is certainly an encouraging sign.

FPF: What were some of the goals you hoped to accomplish by publishing this report?

Iris: We have been hearing many folks talk about how predictive analytics is going to change education. While we are always interested in technological innovation, we do not think every use of it should be endorsed. In the report, therefore, we tried to think about how to use these tools in a smart way: implementing predictive analytics to further equity rather than undermine it. Unfortunately, if you’re not conscious of the potential risks involved and how to navigate them, you may end up undermining equity, even if inadvertently. Our purpose was to help prepare higher education institutions that are interested in implementing predictive analytics but have not properly trained their staff in how to use these technologies. In those cases, despite the best of intentions, they may further perpetuate bias against college students in an already unequal higher education system. That bias is what we hope to help institutions avoid.

FPF: Who do you think could benefit from the insights in this report?

Iris: We hope to reach a diverse set of constituencies at higher education institutions. Although there is certainly a policy connection, the report is targeted at those on the ground who are implementing these systems. For example, we have spoken to business officers, enrollment management VPs, and presidential associations. This is because we believe there needs to be a well-informed team across the institution that keeps these thoughts at the forefront.

As we all know, technology is not a silver bullet. Simply implementing it without creating a supportive culture, including well-informed and trained staff, does not work. Ultimately, the people themselves make the difference. Georgia State, for example, has hired a significant number of new advisors who use predictive analytics systems to perform their duties more efficiently and to make smarter decisions. They effectively changed the culture of advising at the college by making these tools part of what advisors do.

To use these systems effectively and ethically, institutions need to provide support and training for those using them. The Community College Research Center (CCRC) released a report about using predictive analytics, involving case studies and qualitative research on institutions that were implementing these tools. The CCRC found that people on the ground were becoming increasingly cynical and disenchanted with these systems, finding them ineffective and possibly harmful. Those in leadership roles, however, were not confronted with this reality and remained enthusiastic. For that reason, we need to educate and inform audiences across institutions and make sure they are all involved in the conversations. While we are optimistic about the opportunities these technologies offer, we should be cautious about the systems we create and the accountability built into those systems.

This interview was conducted by Ahuva Goldstand on July 31, 2019. It has been edited and condensed for clarity.
