Facebook won’t tell you exactly why it does or doesn’t trust you.
Facebook revealed this week that it's trying to stem the flow of fake news by assigning trust values to users. It insists on keeping its criteria for trustworthiness secret, though, in case untrustworthy people try to game the system, which they almost certainly will.
Tessa Lyons, a product manager at Facebook, told The Washington Post a bit more about the system, which uses several signals to identify which people on the site are more trustworthy than others. It rates users on a scale of zero to one. The only metric she would confirm is a user's history of reporting content as false.