
Facebook Groups Need More Transparency; Readers Deserve to Know Who’s Speaking

Hi, I’m Cory Allen Heidelberger. I live and work in Aberdeen, South Dakota. I’m a registered Democrat, a liberal, and an atheist who feels none of those conditions require apology. I take money from sponsors and donors (including you, if you’ll click the Blog Tip Jar!), but I write for no party, business, or organization, real or imagined. I write what’s useful, entertaining, and true.

You know who I am and what I’m about, and you can check the sources for pretty much everything I write simply by clicking the multiple links I include in every post. (Remember, hyperlinks are one of the greatest inventions of the 20th century and a fine journo-literary art form, so yes, please, do click them and read more!)

Openness and nymity create accountability. Facebook’s nihilist tolerance of fake accounts wrecks openness, fact-based discourse, and democracy. Wired columnists Nina Jankowicz and Cindy Otis urge Facebook to stop shunting people into the radical disinformation groups created by bad actors and to support more accountability in its conversations:

To mitigate these problems, Facebook should radically increase transparency around the ownership, management, and membership of groups. Yes, privacy was the point, but users need the tools to understand the provenance of the information they consume. First, Facebook needs to vet more carefully how groups and pages are categorized on the site, ensuring that their labels accurately reflect the content shared in that community. In the current system, a page owner chooses its category— Cuisine, Just for Fun, and so forth—which then shows up in that community’s search results and on its front page. Most groups, meanwhile, are categorized as General, which assists neither users nor Facebook’s threat investigation teams in understanding each one’s purpose. In both cases, owners can be misleading: A large page that shares exclusively divisive or political content might be categorized as a Personal Blog, so as to escape the added scrutiny that might come with a more explicitly political tag. Such descriptors should be more specific and be applied more consistently. That’s especially important for groups or pages with tens of thousands of members or followers. Facebook should also make it easier to spot when multiple groups and pages are managed by the same accounts. That way the average user can easily identify concerted efforts to flood the platform with particular content.

As The Wall Street Journal found, Facebook’s own research showed that algorithmically suggested groups and Related Pages suggestions lead users further into conspiracy-land. They should be eliminated entirely. If users had to search out groups for themselves, they might be a bit more thoughtful about which they joined. Finally, very large groups should not be afforded the same level of privacy as family groups where Grandma shares recipes and cousin Sally posts baby pics. If a group exceeds a certain membership threshold—say, 5,000 people—it should be automatically set to public, so that any Facebook user can participate. That way, these groups can be observed by the researchers and journalists on whom Facebook now relies to police its platform.

We’d all be healthier and better informed if everyone just quit Facebook. But if people are going to keep using Mark Zuckerberg’s effort to control and monetize their experiences and identities, Facebook should provide users with tools that allow them to break from its own algorithms and seek more information about the sources of the information Facebook feeds them.

You know who I am, and that helps you determine how many grains of salt you need to take with my statements. Facebook should provide you with at least as much information about the groups and sources it promotes to its users.

15 Comments

  1. John 2020-06-18 07:44

    The Facebook business model is built on conflict, not “friends” and cat pictures. There’s only money in conflict. Even faux conflict, as the current occupant demonstrates.
    “Conflict has become the catalyst for the economic model.”
    “It’s like YouTube and Facebook: an information-laundering perpetual-radicalization machine. … The algorithm is not designed for thoughtful engagement and clarity. It’s designed to make you look at it longer.”
    https://www.nytimes.com/interactive/2020/06/15/magazine/jon-stewart-interview.html?utm_source=pocket-newtab

  2. Robert Mehling 2020-06-18 08:30

    As the supreme overlord of disinformation and memes in South Dakota, I find this post offensive. I don’t attack your fruit cart, Heidelberger; don’t attack mine. Stay in your lane, b*tch. ;-) (I really hope you know me well enough to know this is sarcasm.)

  3. Donald Pay 2020-06-18 10:17

    If you don’t go looking for conflict, you can pretty easily avoid it on Facebook. The easiest way is to only “friend” people who aren’t dumb or haters. I would never “friend” some of the people who comment here on DFP. If someone slips past my sniff test, which is rare, I “defriend” and block them. Next, I only join groups that have goals and values similar to mine, or that will kick people out who are too mean and controversial. I have exited groups for this reason.

    Finally, I have a high tolerance for differences of opinion, and occasional name-calling, but not for bullying, too much religious proselytizing, fascism, or hate. People who I “friend” are generally smart, sensitive people. They have opinions, and they may not agree with mine. That’s fine. I try to learn from them. Some of my “friends” don’t like foul language. That’s fine. I can curb my enthusiasm for cursing if I comment on their pages.

    I also don’t “friend” people whose life revolves around “cat pictures” or comely females who want to show me their … hmm … “cat pictures.”

    I don’t have many “friends” because I’m highly selective.

    Blogs are different for me. DFP has a point of view, as does every other blog I have ever commented on. It’s better to mix it up on a blog site than on Facebook.

  4. Clint Brown 2020-06-18 10:51

    “I write for no party…” That’s a HUGE laugh. Thank you for that today.

  5. bearcreekbat 2020-06-18 11:53

    Unsolicited advice, especially from folks who are not consumers or users, to a particular private entity, such as Facebook, on how to change a successful business model in any particular way seems to be a form of spitting into the wind. But who knows, maybe Facebook will listen.

    And for folks currently too stupid or dense to evaluate what they read or see on Facebook, additional “tools that allow them to break from its own algorithms and seek more information about the sources of the information Facebook feeds them” are unlikely to be particularly effective or helpful in opening minds. There are already many publicly available sources that criticize Facebook and are well known to everyone, as well as numerous “fact check” sites and resources.

    And currently, while the federal Communications Decency Act apparently shields ISPs, social media platforms and website hosts, it appears that individuals or groups that post defamatory or unlawful (such as “true threats”) materials on Facebook are subject to both civil and criminal sanctions, and may not be protected by attempts at anonymity.

    While posting information on Facebook may give people a sense of anonymity especially if their profile does not reflect their true identity, posting certain information on Facebook may provide the basis for a lawsuit.

    https://www.hg.org/legal-articles/can-you-be-sued-for-something-you-post-on-facebook-36205

    See also,
    https://www.wfmj.com/story/38322688/attorneys-warn-social-media-posts-or-threats-can-lead-to-criminal-charges

  6. Cory Allen Heidelberger Post author | 2020-06-18 12:18

    Clint, I meant what I said. I write in support of many Democratic Party principles and candidates, because your Republican patrons are fascists. But I do not write at the behest of or on the payroll of any party.

    When you get done laughing, go vote Trump and his fascist liars out of office.

  7. Cory Allen Heidelberger Post author | 2020-06-18 12:23

    In addition to conflict, Facebook builds its business model on manipulating its customers in predictable patterns of consumption. Its suggestion algorithms help ensure more predictable consumer eyeballs on certain pages, meaning Facebook can demand higher prices from its advertisers. Driving readers into predictable engagement patterns can turn into predictable buying patterns, luring people into buying stuff they never would have thought of buying otherwise. Instead of selling people things they really need and naturally want, Facebook replaces free will, bit by profitable bit, with its math.

    That’s why it’s so hard to ask Facebook to offer the transparency customers would need to evaluate groups and news sources on their own. When people think for themselves and make decisions based on personal assessment rather than on Facebook’s suggestions, they defy its algorithms, and that defiance translates into less ad revenue.

  8. bearcreekbat 2020-06-18 12:41

    Cory, in other words, Facebook uses the accepted advertising techniques and strategies of virtually all modern businesses, which are designed for “manipulating its customers in predictable patterns of consumption.” It seems a bit odd to focus criticism on Facebook for using advertising strategies and techniques to increase revenue.

    https://money.com/marketing-politicians-manipulation-psychology/

  9. Donald Pay 2020-06-18 14:09

    I don’t know about anyone else, but I never read or click on any Facebook ads. Yeah, I’ve fallen for click bait, but rarely. They can use my preferences all they want. I really don’t think I leave much of a trail. I know exactly what they are doing, and I don’t fall for it. If their algorithms are so good, why do they waste Trump ads on me? I’ve got to assume they think I’m independent because I don’t fall neatly into one of their algorithmic boxes.

  10. Debbo 2020-06-18 18:47

    The Roger Cornelius Memorial Cartoon by Marty Two Bulls:

    is.gd/xjkEj7

    (Police brutality)

  11. mike from iowa 2020-06-18 19:21

    Excellent timing, Debbo and Marty Two Bulls.

  12. Debbo 2020-06-18 23:34

    JW, that was very likely Stephen Goebbels Miller’s input. He’s smart and a closet Nazi, so he undoubtedly knows what the red triangle indicates. The white scumacysts who look to Demonic Donny for leadership probably went crazier when they saw that.

    It looks like we must refight the Nazis. Our heroic WW II grandparents and parents would be so disappointed that such vermin have surfaced in the USA.

  13. Cory Allen Heidelberger Post author | 2020-06-20 08:10

    Bear, true, their core strategy is the same: manipulate consumers to reduce uncertainty and increase sales. Extra focus on Facebook’s execution of this strategy may be warranted, because social media provide greater access to consumers’ minds and much more data to track the effectiveness of manipulation techniques and adapt to produce greater domination of consumer mindspace.

    Like Donald, I almost never click online ads; on Facebook, my brain has almost completely tuned them out. But when you have an audience of millions, you only need 1% to click to turn a tidy profit.
