Aug. 19th, 2022

lirazel: Britta from Community raising her hand with the text "I have feelings about this" ([tv] as usual)
I don't know if these conversations are happening elsewhere, but there's been a lot of talk on Tumblr about how new fans encountering AO3 for the first time don't actually understand anything about AO3. Obviously a lot of the conversation focuses on censorship, etc., but the thing that's been weighing on my mind has to do with the assumptions about technology that the younger generation brings with them. (This is not a generational warfare post! Don't worry!)

Apparently, a great number of new (presumably young) fans assume that AO3 has an algorithm that determines who sees what. I was absolutely flabbergasted to learn this because it represents such a different mindset than my own. For me, I visit a website and do not assume the presence of an algorithm. But apparently younger people are taking it for granted that all such websites, especially those that allow direct interaction with other users, have an algorithm.

That is to say--algorithms are so much a part of their world that they take their presence for granted. To them, algorithms are the default.

I worry a bit about how these young people end up violating fannish norms because of this assumption--apparently some of them are deleting their fics and reposting them later because they think it will help them get more eyeballs? Obviously this is not good for other AO3 users. I am very sympathetic to the fact that these new users just have no idea they're doing something wrong--I don't expect them to know the norms right from the beginning--but I'm sure some people are being very ugly to them about it, and I don't want them to run off new fans! We all made n00b mistakes when we first entered online fannish spaces!

But far more than that, I am horrified by what this assumption of algorithms says about where the internet is at, and I'm angry on behalf of these young people that they never got to experience a pre-social media internet. I worry about them so much on a general "social media is terrible actually" level for all the reasons I'm sure you already know about. But now I have a new worry, which is that the younger generation is not only accepting of but even supportive of the presence of these kinds of algorithms.

Literally one of the users in the link says:

Algorithms ruined the internet is a take I've never heard before

Like bro, I'll take some ads in my Instagram feed if it shows me post I'll want to see first and ones I don't care about last, it ain't that deep


Here are a couple more similar comments.

And I want to run around screaming like my hair is on fire. This is horrifying to me. This is so so bad. The assumption of algorithms is terrible because of what it says about the state of the internet. But the acceptance of algorithms is so, so, so much worse to me.

There are so many levels to this that I find horrifying--the idea that people will outsource their tastes to big tech companies instead of trusted communities. The fact that this is affecting fannish community interaction. The way this person doesn't even stop to consider the big tech companies' priorities and the way the profit motive will always result in the voicelessness of the marginalized. The lack of understanding of serendipity and the joy of stumbling across something you didn't know you wanted. The way it implies that tastes are fixed and narrow and predictable things.

Speaking of serendipity (and I swear this is relevant):

One of the things I appreciate about Ezra Klein's podcast is that even when I think I don't care about the topic he's talking about...I usually end up caring. And I've learned to trust that he'll have interesting, thoughtful conversations that actually matter, so I just listen to all of the episodes even when I think they're not something I care about.

This week, he had a conversation [that's a link to the transcript] with the writer Mohsin Hamid, and whoever decides the titles of the episodes decided to name it "How do we face loss with dignity?"

I assumed (in a way, isn't this entire post about how it's a bad idea to assume? An ass out of u and me, etc.) that this would be about grief/mourning/etc. and I kind of groaned because I was not in the right headspace for that now. But I trust Ezra, so I listened.

And I'm so, so glad I did. Mr. Hamid is so incredibly thoughtful and lovely, and I adored listening to him talk. I hadn't read any of his stuff before, and it sounds more like literary fiction than I usually enjoy reading, but I will have to try at least one of his books. And even if they're not to my taste, I will seek out interviews and any nonfiction writing he does.

The "loss" mentioned in the title was actually loss of privilege/power, and while I appreciated that discussion, and the whole episode is well worth reading/listening to, I want to focus on what he had to say about technology, because that's what struck me hardest and also is what is most relevant here.

Mr. Hamid and Ezra (Ezra and I are bros, even if he doesn't know it, so I can call him by his first name) compare our current technological moment to the early twentieth century (and Hamid's new novel to Kafka) and then zero in on how the social media internet differs from the earlier internet (though they don't frame it in those terms). Hamid says:

But the difference is, I think, in our current technological, cultural moment, what’s happening is that, as we merge with our screens — and we spend an enormous amount of time staring into our screens and doing things with them — we’re encountering a sort of a machine culture that is, by its very nature, sort of sorting-based. A huge amount of our cultural activity now is sorting things. Do I like this or not like this, do I follow this person or not follow this person, does this person like me or not like me? And if I identify a meaningful aspect of their identity, which is not like me, the person is fundamentally, in some way, opposed to me and in conflict with me. And what we have to do, for a lot of people, is to separate from or, even worse, extinguish the people who are not like us.
This is an excellent, insightful, and thought-provoking quote in itself, but what comes next is what really stuck with me.

Ezra then says:

I think that point about us living in an age of technological sorting is really profound. And as you said it, something else occurred to me, which is that the first run of — I think you’re heavily talking here about social media technologies, about identity technologies online, about the way we spend our time and have things given to us now digitally — round one was sorting and round two has been prediction, both in terms of all the algorithms predicting what we’ll like, and in that way shaping what we end up liking, but also in the sense of the political campaigns that unleashed their algorithms on huge amounts of consumer data to figure out who we’re going to vote for, the advertising campaigns that are sorting us into this kind of consumer, that kind of consumer.

And there’s an interesting way in which the lived economic reality of endless prediction conflicts with what we tell children and sometimes tell ourselves, which is that we’re all individuals, we’re all special, you can’t judge people by their group or their appearance because you’ll get it wrong. We can tell people that all we want, but as we build an economy that is increasingly oriented around pumping money towards companies that predict what we are going to do based on a fairly limited amount of data about who we are, we’re really sending the opposite cultural message in, I suspect, a much more credible way.
Such good stuff. Here's Hamid's response, which ties in to the stuff about algorithm acceptance:

I think that’s right. And I think that, for me, one thing which is very interesting is, when we talk about going from sorting to prediction — which I think is correct, that is something that’s happening — we tend to imagine that predicting is an observational activity. In other words, that technology is allowing us to see where we might go as individuals and to predict. But I think that prediction is actually much more perniciously a behavior modification activity. In other words, making us into more predictable beings.

And that, I think, is by far the greater danger. In other words, if we want to be able to predict people, partly we need to build a model of what they do, but partly we would want them to be predictable. They should be inclined towards doing certain things. And so if you take somebody with the sorting mechanism, if you give them information that plays upon humans’ innate sense of prioritizing the information about threats — economic threats, racial threats, we prioritize that information — what begins to happen is it’s not just that the way we were going to behave remains unchanged. The way we are going to behave also changes. And it changes in predictable ways.

So it isn’t simply the case that machines are better able to understand humans. It is also the case that machines are making human beings more like machines, that we are trying to rewrite our programming in such a way that we can be predicted. And for me, that’s the more frightening aspect of the shift from sorting to prediction.
THIS. This spoke exactly to the thing I'd been fretting over with the algorithm stuff. I knew my feelings weren't a "kids these days" kind of thing--they were genuine fear about the way the world is changing and our acceptance of that change. And this quote articulates really well one of the things I'm most worried about.

I do not want to be shaped by the marketing concerns of a tech company. In fact, I refuse to accept that. I will fight against it as hard as I can. But I'm terrified that there's a generation of people coming up who don't even know they need to fight back. They don't seem to understand these things at all.

I know that there are lots of younger people who are aware of these things and are fighting back. I don't want to erase them. But I do think that there are a lot of people of all ages (Boomer Facebook users come to mind) who just accept whatever big tech hands them, and if you're young enough that you don't remember a pre-algorithm internet, you're more likely to fall into this group. There will be more and more people starting to engage with the internet who simply don't know that there are other options.

And that scares me a lot.
