The social-media giant this week announced Study, a “market research program” that will compensate willing users of Android, Google’s operating system, in exchange for information about their phone-app use. Eligible Study participants must be 18 or older at the launch of the program and can opt out whenever they want to, the company said.
Users will be paid a flat monthly rate through their connected PayPal account. A Facebook spokeswoman declined to say exactly how much money participants could make, or to provide information about the program beyond what was posted on Facebook’s website.
The Study app will collect a “minimum amount of information,” Facebook said: which apps are installed on a participant’s phone; how much time they spend using those apps; the participant’s country, network type and device; and, potentially, which app features they use.
Facebook’s track record with personal data is less than stellar
“We need more transparency from Facebook, and they have to go a lot of extra miles to prove that we can trust them because of their track record now,” said Pam Dixon, the executive director of the World Privacy Forum, a public interest research group. “Here’s the question: How much do you trust Facebook with your information?”
For starters, Dixon questioned whether any independent review board had evaluated the Study program for its ethical and legal implications, and what standards the company had applied in any such evaluation. She urged Facebook to make that information public.