While phone app stalking is getting easier, so is gathering information from personal wearable devices. ZDNet reported last month that over sixty million fitness tracker records were exposed in a completely unsecured database operated by GetHealth. The exposed information included names, birth dates, weights, and GPS tracking data from users all over the world. We know this kind of data can be used not only for stalking individuals, but also for choosing targets for robbery or fraud (and understanding their daily patterns), and for identifying the classified locations of military personnel. ZDNet did not say whether heart rates or other health metrics commonly monitored by wearable fitness trackers were included in this cache of exposed data.
Another of the most troubling privacy threats of the month involves law enforcement. The Wall Street Journal reported that US police and federal law enforcement are using private data services to quietly secure information that would otherwise require warrants to obtain, thus bypassing the judicial processes in place to protect U.S. citizens' Constitutional rights. Law enforcement calls this resource “open-source intelligence” rather than unconstitutional warrantless surveillance. Either description would be accurate. The Journal notes that police omit this mode of surveillance from the records of people arrested with the help of this data.
The Journal reports, “Data brokers sprung up to help marketers and advertisers better communicate with consumers. But over the past few decades, they have created products that cater to the law-enforcement, homeland-security and national-security markets. Their troves of data on consumer addresses, purchases, and online and offline behavior have increasingly been used to screen airline passengers, find and track criminal suspects, and enforce immigration and counterterrorism laws.” In short, the sources of data have proliferated so broadly that multiple channels of surveillance are available to those who choose to use them.
Senators Ron Wyden of Oregon and Rand Paul of Kentucky have proposed a bill called “The Fourth Amendment Is Not For Sale Act,” which seeks to curb warrantless police searches by requiring a court order before law enforcement can purchase cell phone location data from data brokers. The bill would still allow private “volunteers” to buy or gather such data and provide it to law enforcement.
Another frightening infringement of privacy reported by the Journal (they were on a dystopic roll last month) involved school districts using artificial intelligence software to analyze the emotional states of students. The article says that this software “can scan student communications and web searches on school-issued devices—and even devices that are logged in via school networks—for signs of suicidal ideation, violence against fellow students, bullying and more. Included in the scans are emails and chats between friends, as well as student musings composed in Google Docs or Microsoft Word. When the AI recognizes certain key phrases, these systems typically send an alert to school administrators and counselors, who then determine whether an intervention with the student and parents is warranted.” Schools use the software to look for dangerous behavior and planning, but it can also be used to identify anxiety, depression and eating disorders among students.
Of course, these uses sound positive, and applied in the right manner, the tool can be important. But keep in mind the Lower Merion School District of Pennsylvania, which loaned laptops to students who needed them for classwork, only to have local IT staff remotely activate the laptop cameras and watch students in their bedrooms. It is one thing to expect minimal abuse from federally trained professionals, and another to hold out such hopes for the underpaid staff of resource-starved public schools. Also, we all know that schools are tight-knit communities where information – even secret information – can flow like water. In a recent survey, 81% of teachers said their school uses some form of student monitoring software.
I had identified two other stories from last month demonstrating surveillance that we would not have expected, but after the above discussion I am too depressed to write more. I’m sure some AI somewhere will take note of this fact.