I mean, the more I care about something (or even pay attention to something), the more I fuck it up…
It’s possible to, like, not care about it as much and do your own thing and let other people find your value for themselves, even if you don’t try to win their attention.
If there’s anything I’ve ever learned in life, it’s that my outcomes are consistently better the less I care about them (that’s why I’m so into “Why Greatness Cannot Be Planned”). It isn’t the healthiest or most ideal plan for most people (and maybe this could change with maturity/pattern-recognition), but it’s something that just works, and works better the more resourceful you are.
[it’s also healthier than continuing to obsess over Peter Thiel]
[this also could be because I have a unique vibe, and have acted contrary to my vibe in many cases because I was overtrained on the “academic overachiever” value function, and then confused it for “value” well after it was causing me more problems than good]
(it’s also true that other people are better at doing most things than I am; the only things I’m easily better at are breadth and certain network properties [and this makes me surprising])
There’s the REBUS vs. SEBUS theory of psychedelics (relaxed vs. strengthened beliefs under psychedelics): psychedelics often make many people more REBUS, but they make me more SEBUS. I don’t know a way out of it (right now) and don’t plan on tripping anytime soon.
==
If anything, I know thousands of people now, whereas I didn’t in the past. But I’ve caused my fair share of social messes (some of which are too unique to be forgettable), and some have ripple effects, and those ripple effects continue to cause pain (even though, like, it’s easier to stay neutral the more people you know)…
==
Anyways, timelines for MANY life-changing aspects of AI are short (even if AGI is still uncertain), and those changes are more likely than not to turn a lot of historically shitty early-life training data (or trauma) into rounding errors.