Reaction and Reply to Sasha Chapin on Bad In-group Norms


Epistemic status: stream-of-consciousness

Earlier today, my co-blogger Devin posted Sasha Chapin’s article, “Your Intelligent, Conscientious In-group Has Bad Social Norms Too,” to a mutual Discord server. He specifically wanted to know our friend group’s thoughts on it (our friend group leans rationalist/EA). I started reading it and posted lots of thoughts and replies as I went. They seem valuable, or at least postable as-is (in stream-of-consciousness form), so here they are:

> And when they burn out, it is a major loss. And the burnouts can be dramatic—like, quit your life, do nothing for years, have no grounding principles whatsoever, eventually hallucinogen yourself back into self-acceptance. (That’s a reasonable way to spend time, but there’s probably a healthier middle way that doesn’t involve total personal collapse.)

NGL, that last line was kinda funny.

I relate hard to the paragraph that starts “Sometimes—often—these forbidden thoughts/actions…”, because that’s the precise situation I’ve long been in. It’s toxic, nobody wrote it down, and I’m only now trying to get out of it in a healthy way; people in the community have been supportive of that when I basically ask.

> and systematizing being preferred inappropriately above all other modes of thinking and feeling.

Systematizing is more valuable than other modes of thinking when it’s available, but intuition is all we have in some situations, so we shouldn’t ignore it. And systematizing is certainly not the only or best “mode of feeling” (that almost seems like a category error). Add in second-order uncertainty over decision theory and morality (and, of course, our own motivations), and the basic broader idea rings true, even if I disagree with the specific wording. (This is a problem I have with other parts of the article, too.)

> when, in fact, you just like social drama because watching humans freak out is fun.

I counter the “this-isn’t-smart-enough” norm with my inborn “addiction-to-some-types-of-YouTube-drama” norm. That’s totally my real justification for developing this addiction.

> rather than, y’know, vroom vroom fun thing go fast.

Getting diagnosed with ADHD was a big step in accepting myself at least somewhat more than I would have by naively following in-group patterns. (I’d hazard a guess that ADHD is overrepresented in the community, though less so than other conditions that are themselves overrepresented. As in, if autism and schizophrenia are overrepresented in the ratcom compared to the general population, ADHD is over-represented to a lesser degree than they are, and than the general percentage of neurodivergent people in the community.)

Devin also noted (removing some context) that “Probably the best hedging is just trying to be part of multiple subgroups with competing memes”. I heavily agree with this. (This can probably prevent full-blown cult situations if done intensely enough.)

He also noted (again, removing some context) that going to the Bay Area causes something like “ingroup overdose” for some people. I can confirm this to some extent: I’ve dreamed (not literally) about going to the Bay Area and living with you people. But I think a more mixed group (or subset of groups) would work better, even for a literal group-house. (This might be false, but if I ever get a lot of money, I’ll prolly try it.)

> It’s regular Kegan Stage 4 stuff.

This got me down a rabbit hole of what the “Kegan stages” are, and apparently Stage 5 is kinda like “I contain multitudes” and “embrace paradox” and “multiple identities”. I sometimes feel like I already have this, but maybe they mean “what you use IRL, not just what you think/feel alone/on an abstract level”1.

> (By the way, Eliezer Yudkowsky, this is what post-Rationalists are, it’s not that complicated—they don’t have explicit principles because they’ve moved on from thinking that life is entirely about explicit principles. Perhaps you don’t intuitively grasp this because you don’t see the social group you’ve founded as a social group.)

I hate this passage. Nobody seems to take up my solution (larger principles, laxer enforcement, the above uncertainty/multitudes thing, not caring about certain principles sometimes).

> “but that’s also inconsistency!”

So is every human, but like… in the real world, “a less-toxic rationality community would still be better” is easier to defend than the concrete-level (admittedly implicit and possibly not-intended) claim that “the best life includes things that can’t even in principle be codified as principles”. I’m probably reading too much into that, though. (I think I agree with Jacob here, but on the object level: “we should actually try to take ideas seriously, even if worrying too hard about it is unhealthy and bad.”)

> If you notice signs of emotional decay, try to not diagnose them exclusively through your favorite epistemic lens. Try, instead, to be curious about whether some of their human requirements are not being met by the local milieu.

That’s a lens of some sort, but still helpful.

> There are parts of your being that have nothing to do with our version of existence. They don’t have to be justified, explained, or rejected. You need them to exist, and, by extension, we do too.

Overall, agreed.

There are many, many parts of me that don’t fit even the remotest stereotype/template of a person in the rationality community, let alone someone who blogs, wants to work in AI safety, and actively loves many of Yudkowsky’s opinions and his writing style.

I love YouTube Poops, for example. I got a very dank score on this memes compass. I love very latimp aesthetics, to the point where I have that concept in my vocabulary through organic exposure, while also knowing that the underlying theory is at least as fake as similar ones. I take moral uncertainty seriously enough to actively consider how true or false the claims in the book of the same name are regarding it, with nontrivial credence in theories that still probably suck2. I spend a bit of my free time making videos, and I’ll be ramping that up in the near future. But mostly I’m getting better at knowing my limits and what, realistically, I want.

After posting most of the above on the Discord, I hurriedly copied it into a Markdown file, added the last part, and posted it.

And then I woke up.


  1. It seems like compartmentalization and laziness-about-self-consistency are cheating ways to get to Stage 5… which seems to be what Peter Thiel does with ideas (see point 25 here), and he seems to be doing okay for himself. Did I mention I’m reading Moral Uncertainty? ↩︎

  2. Not Kantianism, though; that fades into epsilon. ↩︎


If you liked this post, help us write more by donating to our Patreon.

