Discussion about this post

Denise Champney

As an educator and mother of two teenagers, I am so thankful that these conversations are occurring. In addition to putting an age limit on social media (one that I am sure many kids will try to bypass, but that will hopefully create a barrier), I think we also need to look at our education system and its push to make all students 1:1 with technology (meaning every child has their own device and, in some cases, access to this device all day at school). This mindset that kids need to be educated by computers, while knowing that these same kids have to divert their attention from the strong pull of the internet and so-called "educational games" (our latest version of the "fat free" SnackWell's cookie), is a huge part of the problem. Yes, social media is bad, but so is giving 10-year-olds free access to a computer all day at school and expecting them to be "working on math" or "reading" when we know the other forces designed to steal their attention are so easy for them to use. We have also forgotten the importance of multisensory learning and face-to-face communication, something that is lost with tapping on a keyboard. The stronger the foundation of social skills we can develop, the better our children will be able to navigate their behavior online. These skills need to be developed through real human interaction, not through a screen.

Bob Frank

> Some social media platforms have introduced reputation-based functionality with successful results. For example, Reddit’s upvote/downvote and Karma system have proven useful for improving social discourse while avoiding the privacy issues that could come with identifying all users. Using this model, we could require accounts to earn trust from the community before giving them all the power (and responsibility) of widespread distribution and develop ways to make the loss of community trust consequential.

Not sure you want to be using Reddit as your model to emulate. It has a widespread reputation as "the [insert unflattering body part here] of the Internet," based largely on the ability to create new accounts entirely anonymously and unaccountably. When anyone can trivially create an alt "for free" and wade into a discussion pretending to be new (or keep multiple well-established alts amplifying each other's voices, or engage in any number of other bits of bad behavior), you get... well... the toxic mess that is Reddit.

> Consider an example from one of the leaked Facebook paper documents revealing that a small set of users are responsible for nearly half of all uncivil comments. The absence of an effective downvote system ironically amplifies their visibility when others engage to contest their behavior. What if we could diminish this group's social sway by holding them accountable, possibly through a history of downvoted comments?

This might be more effective, but it could also lead to whole new forms of bullying and harassment. Unless the threshold for making it onto the downvoted-comments list was unreasonably high, it wouldn't be particularly difficult for malicious users to brigade somebody they didn't like and make them look like a problem.
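
To make that concrete, here's a rough sketch of a naive "downvoted comments list" built on a raw-count threshold. The user names, threshold, and counts are made up for illustration and don't reflect any real platform's rules, but they show why a dozen coordinated accounts are enough to make an ordinary user look like a repeat offender:

```python
# Hypothetical illustration of why a raw-count downvote threshold is easy to brigade.
# The threshold and user names are invented for the example.
from collections import Counter

DOWNVOTE_THRESHOLD = 10  # assumed cutoff for landing on the "problem users" list


def flag_problem_users(downvotes_by_user: Counter) -> set:
    """Flag anyone whose comments have collected at least DOWNVOTE_THRESHOLD downvotes."""
    return {user for user, count in downvotes_by_user.items() if count >= DOWNVOTE_THRESHOLD}


# Twelve coordinated accounts casting one downvote each are enough to put an
# otherwise ordinary user on the same list as a habitually uncivil one.
votes = Counter({"ordinary_user": 12, "habitually_uncivil_user": 40})
print(flag_problem_users(votes))  # {'ordinary_user', 'habitually_uncivil_user'}
```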

I think the best solution to this that I've seen comes from StackOverflow: downvoting decreases the target user's reputation, but it also decreases your own reputation by a smaller amount. You're allowed to hold other users accountable, but there's a cost to doing so. (A few years back they significantly weakened that cost, and the site's quality has gotten a lot worse ever since.)
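
In code, the idea is something like the sketch below. The point values are illustrative assumptions, not StackOverflow's actual numbers, but they capture the asymmetry: the downvote hurts the target more than it hurts you, so accountability is possible but never free.

```python
# Minimal sketch of a "downvotes cost the voter too" reputation rule.
# The specific penalties are assumptions for illustration only.
from dataclasses import dataclass, field


@dataclass
class Account:
    name: str
    reputation: int = 0


@dataclass
class ReputationSystem:
    target_penalty: int = 2  # hypothetical hit taken by the downvoted user
    voter_cost: int = 1      # smaller, hypothetical cost paid by the voter
    accounts: dict = field(default_factory=dict)

    def register(self, name: str) -> Account:
        self.accounts[name] = Account(name)
        return self.accounts[name]

    def downvote(self, voter: str, target: str) -> None:
        """Holding someone accountable lowers their score, and yours a little too."""
        self.accounts[target].reputation -= self.target_penalty
        self.accounts[voter].reputation -= self.voter_cost


system = ReputationSystem()
system.register("alice")
system.register("bob")
system.downvote(voter="alice", target="bob")
print(system.accounts["alice"].reputation, system.accounts["bob"].reputation)  # -1 -2
```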

> Questions of causality are pervasive in debates about social media (e.g., is social media a reflection of our societal polarization, or is it causing that polarization?)

Seems to me the best answer is "both." It's a feedback loop; existing polarization leads to polarizing content, which drives further polarization.

> Social media has been hailed as removing gatekeepers, but those gatekeepers may not all be bad.

Hear, hear! The principle of Chesterton's Fence (Chesterton's Gate?) applies here; as more and more "gatekeepers" are removed, we see more and more clearly the costs and harms of gates inadequately kept.
