I actually like the chaos, because I don’t like having one small group of people as the self-appointed and de-facto gatekeepers of AI for everyone else. This makes it clear to everyone why it’s important to control your own AI resources.
For fuck’s sake. You want bad things to happen… so good things happen, later. Bad shit happening is the part that’s objectionable. Saying ‘but I want good things’ isn’t fucking relevant to why someone’s hassling you about this!
The bad shit you want to happen first is the only part that’s real!
> I actually like the chaos, because I don’t like having one small group of people as the self-appointed and de-facto gatekeepers of AI for everyone else. This makes it clear to everyone why it’s important to control your own AI resources.
Accelerationism is human sacrifice. It only works if it does damage… and most of the time, it only does damage.
Not wanting a small group of self-appointed gatekeepers is not the same as accelerationism.
> … the goal is not what makes it acceleration.
“Accelerationism” is a philosophical position. The goal is entirely what makes it accelerationism. Quit swapping words in each new comment.
> For fuck’s sake. You want bad things to happen… so good things happen, later. Bad shit happening is the part that’s objectionable. Saying ‘but I want good things’ isn’t fucking relevant to why someone’s hassling you about this!
> The bad shit you want to happen first is the only part that’s real!
No, that’s entirely you assuming things about my position. I don’t want bad things to happen.
Because it is having a good outcome: the disruption of a monopoly.