

Student Privacy Ought to Be Protected Comprehensively


Schools have largely embraced education applications, websites, and devices (collectively referred to as “edtech”) as a means of improving classroom instruction and administration. Seventy-one percent of parents report that their child uses technology provided by schools for educational purposes, and in most cases this means more data is being collected on students. However, US privacy law has not kept pace with the rapid adoption of technology and data collection in schools. The Family Educational Rights and Privacy Act (FERPA), our existing student privacy law, is outdated, and there are no sector-specific privacy laws that focus on edtech. Legislators have responded by introducing a host of bills in recent years, and 2015 has been an especially busy year for student privacy law and policy. Congress is considering multiple federal student privacy bills (including a proposal to revamp FERPA), over 180 bills have been introduced at the state level, and the Student Privacy Pledge – a voluntary industry commitment to follow certain privacy and security standards – has been signed by upwards of 200 companies. These efforts have brought more attention to the student privacy debate and have also raised difficult questions – one being the extent to which not only education technology services (online or mobile apps, websites, etc.) but also general audience devices and device operating systems used in schools should be subject to limits on data collection, use, and sharing.

The Electronic Frontier Foundation (EFF) recently filed an FTC complaint against Google alleging violations of the Student Privacy Pledge, and Google has responded by denying the allegations. The dispute underscores how important it is to answer this question. Before getting into that discussion, however, it is worth noting that the complaint raises other student privacy concerns deserving further consideration. One is what limits should be placed on the use of aggregated or anonymized student data. The pledge does not appear to cover aggregated or anonymized data (although the third commitment in the pledge could be interpreted to limit use of student data in any form if the use is for behavioral targeting of ads to students). Regardless of how the pledge is interpreted, however, if data has been properly aggregated or anonymized, it should be permissible to use it for limited non-educational purposes such as product improvement, as long as the company is transparent about the practice; requiring parent or student authorization in these circumstances would be impractical. EFF’s complaint also addressed school administrators’ ability to allow or restrict certain student data collection. There will be cases when it is more appropriate for schools to decide what data should be collected – Google for Education services appear to embrace the importance of school administrator control. That said, parents and students should be able to decide what data is collected in certain contexts (such as when a student is no longer using a platform for school purposes), and these controls should be easy to access and understand. Striking the right balance between school and user control is difficult, and it will require edtech companies’ active participation in efforts to set appropriate standards.
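To make the aggregation point concrete, here is a minimal sketch – entirely hypothetical, with invented event data, function names, and a suppression threshold not drawn from any actual edtech product – of how a provider might compute product-improvement statistics without retaining individual student records:

```python
# Minimal sketch of aggregation with small-group suppression.
# All names and data here are hypothetical illustrations, not any
# actual edtech product's logging format.

# Hypothetical per-student usage events: (student_id, feature_used).
events = [
    ("s1", "spell_check"), ("s2", "spell_check"), ("s3", "spell_check"),
    ("s4", "spell_check"), ("s5", "spell_check"),
    ("s1", "voice_typing"), ("s2", "voice_typing"),
    ("s3", "equation_editor"),
]

MIN_GROUP_SIZE = 5  # assumed suppression threshold for this sketch


def aggregate_feature_usage(events, k=MIN_GROUP_SIZE):
    """Count distinct students per feature, suppressing any feature
    used by fewer than k students so small groups can't be singled
    out. Individual student identifiers never leave this function."""
    students_per_feature = {}
    for student_id, feature in events:
        students_per_feature.setdefault(feature, set()).add(student_id)
    return {
        feature: len(students)
        for feature, students in students_per_feature.items()
        if len(students) >= k
    }


print(aggregate_feature_usage(events))
# {'spell_check': 5} -- voice_typing (2 students) and
# equation_editor (1 student) are suppressed.
```

A simple suppression threshold like this is only a starting point; properly anonymizing data requires more care (rare combinations of attributes can still single out a student), which is one reason transparency about the practice matters.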

Setting these considerations aside, a threshold issue raised by EFF’s complaint that has not received enough attention is the need for comprehensive protection of students’ data across all school devices and services. There is a trend in company policies and proposed laws toward drawing sharp, inflexible lines between products that are “for education” and those that are not. In practice, however, this line is often blurred: companies regularly package general audience devices (with general audience data collection practices) together with educational applications and sell them to schools. As a result, the general audience device and operating system may collect data on a student that neither the student nor their parent expects. Offering students one set of privacy protections when they use general audience software on a device, and a different set when they use software deemed “educational” on that same device, undermines efforts to comprehensively protect students’ privacy. It’s confusing for everyone involved – schools, students, parents, and likely the company itself. It’s also less user-friendly: it’s unrealistic to expect students to adjust their use of a device depending on how a specific browser, application, or other service on the device might collect and use their data. I’m willing to bet even tech-savvy adults don’t do that.

A number of existing student privacy bills arguably would not address these gray areas. That is partly because the bills limit their scope to websites, cloud computing, and online or mobile apps. But even if devices, device operating systems, and accompanying browsers were included, many bills would cover only those used “primarily for K-12 school purposes” and “designed and marketed” for those purposes. The Student Privacy Pledge takes a similar approach. Its definition of “school service provider” is limited to operators of an “online or mobile application, online service or website that is both designed and marketed for use in United States elementary and secondary educational institutions/agencies.” Additionally, the pledge explicitly states (and its sponsors have reaffirmed) that it does not cover “general audience software, applications, services or websites not designed and marketed for schools.” Three scenarios are worth considering to understand how this might play out for pledge signatories:

A. A company provides an online classroom collaboration platform offering a range of resources for teachers, including shared lesson plans, videos, and assessment tools. Schools and school districts can sign up through a paid yearly subscription. A school decides to sign up.

B. A company has both a general consumer tablet as well as education-tailored applications. The tablet has a built-in browser with a feature that logs users’ browsing history. The company regularly sells its tablets to school districts in bulk and preprograms the tablets with the browser tracking feature as well as the education applications. The company markets these education “packages” to schools on its website. A school district purchases the tablets for half of its schools.

C. A company has education applications as well as a popular general purpose laptop that schools are starting to buy for the classroom. The company preinstalls both its browser and a cross-device browser history feature onto these laptops. The company advertises its education applications to schools but does not advertise the laptop as specialized for education. A school decides to purchase these laptops and the IT director installs the company’s education applications onto these devices.

In scenario A, there is no question that the pledge would apply: this is an online service that falls squarely within the pledge’s definitional boundaries. In scenario C, the pledge shouldn’t apply. A company’s commitment to the pledge for its education-related products shouldn’t obligate it to extend those promises to general purpose devices that schools or students happen to use, absent the company marketing the device for education or knowing that the device is regularly used for education. Scenario B is more complicated. The Student Privacy Pledge’s definitions could be interpreted to exclude general audience devices even when they are packaged with educational software and regularly used in schools. Regardless, there are good reasons why policymakers concerned with education privacy should consider how to address these practices.

For one, if the manufacturer preinstalls education services on a general audience device sold to schools and markets the device to schools as tailored for education, there is a very strong argument that the device and its operating system are “designed and marketed” for educational purposes and therefore should be subject to the Student Privacy Pledge. Second, and more important for public policy, students and parents are a captive audience in these cases. Students are often required to use certain devices and their accompanying operating systems and services in school, so there is much less user autonomy than there would be if a student or their parent selected and purchased a device for personal use. Parents likewise have little to no control over what devices their child uses in the classroom and what data those devices’ systems collect. This is particularly problematic for low-income families, who may not have a computer at home that a student can use after school to avoid data collection; often, school-provided laptops are the only computers available to these students. The Student Privacy Pledge was right to distinguish between “general” and “educational” purposes. An even more effective approach, however, would clarify in the pledge’s definitions that operators like the one in scenario B are subject to the pledge for those devices. Otherwise, the pledge’s protections contain a fairly large, and not particularly transparent, loophole.

Bottom line: company policies for hardware and software regularly used for education should be the same across the board. Make it easy for students, parents, and schools to know what is and is not protected. Legislative proposals and industry pledges will also have to grapple with this issue to ensure that student privacy is comprehensively safeguarded. However students’ data ends up being protected, that protection should not be partial.