Also, how many choices per day do our little brains have to make these days vs. even 50 years ago? I sometimes feel that I don't want to make a choice even when the situation requires one. And the worst is when it's about a necessity, like "what do I want to eat today?"...
Too many choices :( and it's only getting worse!
We are manipulated to consume content endlessly. And content is curated such that it is acceptable to advertisers. But, on the whole, we are being shown what we want to see. Technology is increasingly able to instantaneously give us more of what we want.
This is increasingly forcing us to face the fact that it is not forces outside of ourselves that are manipulating us into unhappiness, but our own deranged desires.
Before technology granted us the ability to instantaneously gratify our desires, we could point to external conditions that prevented us from realizing what we thought would make us happy. But now we are getting what we want instantaneously.
We must first get what we want before we can realize that it didn't work. Only then can we learn to desire something else. It's a perversely good thing that social media is "causing" problems. Social media is less a cause than an effect of our own desires. Social media isn't a "bad" thing to be avoided; it's a mirror that reflects back to us what we are. It would, of course, be better to learn from joy instead of suffering, but both are effective.
Zooming out to consider the overall themes of the content suggested to us is useful. Much of what I find in my own feed are themes of war, negative judgments of various kinds, "entity X is screwing over the public," over-dramatized conflict scenes from movies, accusations of ill intent, competition, partisanship, us-versus-them, and various other nonsense. It tends to be ego-validating. Opening a private tab generally shows more edifying content, albeit lower IQ, than what I see in my feed.
I "deserve" this. It's what I wanted. How can I pretend that I don't want the effects of the causes that I have desired and chosen? Or how can I seriously claim to want effects that I don't want the causes of?
Thank you for reading and engaging! I think you are pointing out an interesting contradiction - however, I have a more positive view of human nature. Just because I am angry at the person walking in front of me, it doesn't mean that I'll dropkick them - and not just because I'm afraid of jail, but also because that's not the kind of person I want to be. Social media, however, I think feeds on these base instincts and capitalises on them and, even more perversely, later manufactures these desires as opposed to 'higher' ones. You can appreciate boobs, but linger one second longer on boob-content on IG and, bam, you have nothing but boobs. That wasn't a conscious decision, and personally I hope that I am more than my basest desires. Thank you again for your thoughtful comment and addition!
Wonderful post!
Thank you so much!
Wow, this piece captured so much of what I’ve been thinking about lately: the shift from choosing to being chosen for, and how algorithms can quietly train people into feedback loops. I’m glad I found it. One less thing I need to write now; I’ll just cite you instead, haha.
I’ve been trying to articulate how personalization, which is sold as freedom, becomes constraint, and you nailed it here. People can develop strong beliefs without realizing they might’ve held entirely different beliefs if the algorithm had exposed them to different information. I think what makes it so insidious is that for many people, it can feel like discovery—it comes with serotonin and feelings of growth, eureka, and self-actualization. In reality, though, it’s more like guided formation, one they may have no idea took place without their autonomy (or even their permission).
Also, this adds a layer to a point about mimetic desire and tribal mentality, which I explored in my article on Network States. Curated feeds chosen by algorithms not only limit people’s exposure to other ideas, but they also reinforce their aesthetic and ideological tastes until anything outside them feels off or even hostile to their identity. It creates the illusion of expansion, progress, and fulfillment while keeping people locked into one cognitive plane. Real growth often comes with pain. But this kind of growth is often self-congratulatory and shallow.
And dissent that doesn’t fall neatly along ideological lines (if it appears at all) tends to make people flinch or shut down. It wasn’t part of the architecture they were trained to resonate with or against, so they don’t know what to do with it. Often they respond as if the dissenter must be an enemy to their values—mentally sticking them into a default category (MAGA, liberal, anti-vaxxer, whatever the convenient opposite is). But when the dissenter pushes back and says, “No, I’m saying this because I share your values,” that’s when the framework cracks. Or at least it could—if the algorithm hadn’t already pre-sorted that kind of internal dissent out of view. Even if it does come through, the algorithm can get them back to their comfy feed that’s 90% echo chamber.
Oh yeah, and the quote on psychopolitics really stuck with me too (on how neoliberal power doesn’t repress but seduces, and preempts desire). That seems to be making actual quality dissent even harder for people, since the algorithm sets their preferences before they’re even conscious of them. In an earlier comment on another post, I called predictability sad. But now I’m gonna say it’s dangerous. It makes people far too legible to systems that don’t have their autonomy in mind; in fact, these systems benefit from maintaining the conditions that allow larger groups to be psychologically controlled, and it’s getting even easier for them.
Thank you so much, both for the kind words and for the comment! This is exactly why Substack is such a great platform - this comment could be a short standalone essay! You make great points and I completely agree. Thank you, thank you, thank you! And very excited to see what future stuff you'll write ^^
(Wow, this comment was way more concise in my head. I am not a fan of the Substack mobile commenting experience for many reasons haha, one of which is that since I can’t see my full comment, I underestimate how long I’ve been typing.)
Excellent post! I have no disagreements, but I couldn't help but notice we enjoy a lot of the same authors, and I happen to be looking for fellow humans in my thoughtspace. Would you be interested in checking out one of my essays to see if we resonate? I cite Han in this one: https://jakehpark.substack.com/p/epistemic-telos-god-as-sadistic-denial
Thank you for the kind words! I'll read the essay soon, looking forward to it!