Entry tags:
the assumption of algorithms
I don't know if these conversations are happening elsewhere, but there's been a lot of talk on Tumblr about how new fans encountering AO3 for the first time don't actually understand anything about AO3. Obviously a lot of conversations focus around censorship, etc. but the one that's been weighing on my mind has to do with assumptions about technology that the younger generation bring with them. (This is not a generational warfare post! Don't worry!)
Apparently, a great number of new (presumably young) fans assume that AO3 has an algorithm that determines who sees what. I was absolutely flabbergasted to learn this because it represents such a different mindset than my own. For me, I visit a website and do not assume the presence of an algorithm. But apparently younger people are taking it for granted that all such websites, especially those that allow direct interaction with other users, have an algorithm.
That is to say--algorithms are so much a part of their world that they take their presence for granted. To them, algorithms are the default.
I worry a bit about how these young people end up defying fannish norms because they make this assumption--apparently they are deleting their fics and reposting them again later because they think it will help them get more eyeballs? Obviously this is not good for other AO3 users. I am very sympathetic to the fact that these new users just have no idea that they're doing something wrong--I don't expect them to know the norms right from the beginning--but I'm sure some people are being very ugly to them about it, and I don't want them to run off new fans! We all made n00b mistakes when we first entered online fannish spaces!
But far more than that, I am horrified by what this assumption of algorithms says about where the internet is at, and I'm angry on behalf of these young people that they never got to experience a pre-social media internet. I worry about them so much on a general "social media is terrible actually" level for all the reasons that I'm sure you already know about. But now I have a new worry, which is that the younger generation is not only accepting but even supportive of the presence of these kinds of algorithms.
Literally one of the users in the link says:
Algorithms ruined the internet is a take I've never heard before
Like bro, I'll take some ads in my Instagram feed if it shows me post I'll want to see first and ones I don't care about last, it ain't that deep
Here's a couple more similar comments.
And I want to run around screaming like my hair is on fire. This is horrifying to me. This is so so bad. The assumption of algorithms is terrible because of what it says about the state of the internet. But the acceptance of algorithms is so, so, so much worse to me.
There are so many levels to this that I find horrifying--the idea that people will outsource their tastes to big tech companies instead of trusted communities. The fact that this is affecting fannish community interaction. The way this person doesn't even stop to consider the big tech companies' priorities and the way the profit motive will always result in the voicelessness of the marginalized. The lack of understanding of serendipity and the joy of stumbling across something you didn't know you wanted. The way it implies that tastes are fixed and narrow and predictable things.
Speaking of serendipity (and I swear this is relevant):
One of the things I appreciate about Ezra Klein's podcast is that even when I think I don't care about the topic he's talking about...I usually end up caring. And I've learned to trust that he'll have interesting, thoughtful conversations that actually matter, so I just listen to all of the episodes even when I think they're not something I care about.
This week, he had a conversation [that's a link to the transcript] with the writer Mohsin Hamid, and whoever decides the titles of the episodes decided to name it "How do we face loss with dignity?"
I assumed (in a way, isn't this entire post about how it's a bad idea to assume? An ass out of u and me, etc.) that this would be about grief/mourning/etc. and I kind of groaned because I was not in the right headspace for that now. But I trust Ezra, so I listened.
And I'm so, so glad I did. Mr. Hamid is so incredibly thoughtful and lovely and I adored listening to him talk. I hadn't read any of his stuff before, and it still sounds more like literary fiction than I usually enjoy reading, but I will have to try at least one of his books, and even if they're not to my tastes, I will seek out interviews and any nonfiction writing he does.
The "loss" mentioned in the title was actually loss of privilege/power, and while I appreciated that discussion, and the whole episode is well worth reading/listening to, I want to focus on what he had to say about technology, because that's what struck me hardest and also is what is most relevant here.
Mr. Hamid and Ezra (Ezra and I are bros, even if he doesn't know it, I can call him by his first name) compare our current technological moment to the early twentieth century (and Hamid's new novel to Kafka) and then zero in on how the social media internet differs from the earlier internet (though they don't frame it in those terms).
But the difference is, I think, in our current technological, cultural moment, what’s happening is that, as we merge with our screens — and we spend an enormous amount of time staring into our screens and doing things with them — we’re encountering a sort of a machine culture that is, by its very nature, sort of sorting-based. A huge amount of our cultural activity now is sorting things. Do I like this or not like this, do I follow this person or not follow this person, does this person like me or not like me? And if I identify a meaningful aspect of their identity, which is not like me, the person is fundamentally, in some way, opposed to me and in conflict with me. And what we have to do, for a lot of people, is to separate from or, even worse, extinguish the people who are not like us.
This is an excellent, insightful, and thought-provoking quote in itself, but what comes next is what really stuck with me.
Ezra then says:
I think that point about us living in an age of technological sorting is really profound. And as you said it, something else occurred to me, which is that the first run of — I think you’re heavily talking here about social media technologies, about identity technologies online, about the way we spend our time and have things given to us now digitally — round one was sorting and round two has been prediction, both in terms of all the algorithms predicting what we’ll like, and in that way shaping what we end up liking, but also in the sense of the political campaigns that unleashed their algorithms on huge amounts of consumer data to figure out who we’re going to vote for, the advertising campaigns that are sorting us into this kind of consumer, that kind of consumer.
And there’s an interesting way in which the lived economic reality of endless prediction conflicts with what we tell children and sometimes tell ourselves, which is that we’re all individuals, we’re all special, you can’t judge people by their group or their appearance because you’ll get it wrong. We can tell people that all we want, but as we build an economy that is increasingly oriented around pumping money towards companies that predict what we are going to do based on a fairly limited amount of data about who we are, we’re really sending the opposite cultural message in, I suspect, a much more credible way.
Such good stuff. Here's Hamid's response, which ties in to the stuff about algorithm acceptance:
I think that’s right. And I think that, for me, one thing which is very interesting is, when we talk about going from sorting to prediction — which I think is correct, that is something that’s happening — we tend to imagine that predicting is an observational activity. In other words, that technology is allowing us to see where we might go as individuals and to predict. But I think that prediction is actually much more perniciously a behavior modification activity. In other words, making us into more predictable beings.
And that, I think, is by far the greater danger. In other words, if we want to be able to predict people, partly we need to build a model of what they do, but partly we would want them to be predictable. They should be inclined towards doing certain things. And so if you take somebody with the sorting mechanism, if you give them information that plays upon humans’ innate sense of prioritizing the information about threats — economic threats, racial threats, we prioritize that information — what begins to happen is it’s not just that the way we were going to behave remains unchanged. The way we are going to behave also changes. And it changes in predictable ways.
So it isn’t simply the case that machines are better able to understand humans. It is also the case that machines are making human beings more like machines, that we are trying to rewrite our programming in such a way that we can be predicted. And for me, that’s the more frightening aspect of the shift from sorting to prediction.
THIS. This spoke exactly to the thing I'd been fretting over with the algorithm stuff. I knew that my feelings weren't a "kids these days" kind of thing--they were genuine fear about the way the world is changing and the fact that we are accepting it. And this quote articulates one of the things I'm most worried about really well.
I do not want to be shaped by the marketing concerns of a tech company. In fact, I refuse to accept that. I will fight against it as hard as I can. But I'm terrified that there's a generation of people coming up who don't even know they need to fight back. They don't seem to understand these things at all.
I know that there are lots of younger people who are aware of these things and are fighting back. I don't want to erase them. But I do think that there are a lot of people of all ages (Boomer Facebook users come to mind) who just accept whatever big tech hands them, and if you're young enough that you don't remember a pre-algorithm internet, you're going to be more likely to fall into this group. There will be more and more people starting to engage with the internet who simply don't know that there are other options.
And that scares me a lot.
no subject
But then again, I suppose I shouldn't be surprised that the algorithm isn't showing them any content they disagree with, because that is exactly how it's intended to work, after all.
(As an aside, I'm proud to say that my kid is among the Young People Who Get It and so are their friends. I am sure there is a massive range of savviness levels among the youth of today depending on what their social circle is like.)
no subject
Exactly. The "well, this is just the way the world is and always has been" is the most disturbing part imo.
I'm really glad your kid gets it! And like I said--I'm well aware that there are people of all generations who don't get it. The younger generation are just fighting more of an uphill battle to even remain informed!
no subject
I sort of understand it on some level, because fandom is escapism and having it curated for you.....gives a certain appeal? But at the same time, ooph, I want to be able to find things for myself, thanks.
Yeah, it's a problem, I fully agree with you.
no subject
I'm sure you feel the same way, but I would much rather my own fandom experience be curated by other fans. You do need help finding the stuff you want to experience--that's why we created rec lists and Delicious tags and LJ/DW communities and all that fun stuff! I love when my friend says, "You should give this book/fic/fandom a try!" I hate when a company's website recommends things to me (mostly because they're almost always wrong).
I feel like algorithms don't curate. There's another word that's needed there--maybe just "promote"? But curation is such a thoughtful process, one that members of fandom have always invested a lot of time and effort into. It can't be replicated by algorithm!
no subject
The strange thing is, the algorithms are really quite obvious if you know what to look for. A few months ago in a fit of nostalgia I was binging a well-known 1970s singer on YouTube. (Okay, it was John Denver.) The recommendations on the right side of the page kept displaying, in amongst similar artists and other John Denver songs... mild conspiracy theory vids. Which, um, no. Now all YouTube shows me are the artists, but the initial assumptions of the algorithm were eye-opening to say the least. So I completely agree with Mr. Hamid and Ezra here about prediction--but I think it's going further than that. The social media companies are slowly training people how to think. And that is horrifying.
no subject
I hope you don't mind my linking to your post in my journal. It's such an interesting and important issue, and I haven't seen it presented like this before. Thanks for posting!
no subject
(Listen, I am fond of John Denver. I sometimes still listen to the Carpenters and I was listening to Neil Diamond's "Cracklin' Rosie" on the way to work, so I really can't point any fingers.)
but the initial assumptions of the algorithm were eye-opening to say the least.
That is so scary. I'm guessing maybe the link is that some people think his death was some kind of conspiracy?
but I think it's going further than that. The social media companies are slowly training people how to think. And that is horrifying.
Yes. I really believe my mom has become more of a reactionary because of Facebook. She's not full QAnon or anything like that, but she's becoming more and more anti-government and it terrifies me. And anecdotally, it seems to be happening more and more...more and more people of all ages getting radicalized...
no subject
Even if his novels don't work for me, I appreciate knowing that Hamid is out there in the world being a lovely person and writing from a good place! If I do read them, though, I will report back!
no subject
I'm always looking to better understand how new fans' experiences differ from what mine were.
Me too! And so often the differences surprise me!
Definitely disturbing, this trust and faith they have in the internet, seeming to assume it has their best interests at heart and no caution necessary.
I think it's more subtle than that--I don't think they think it has their best interests at heart. I think they think it's totally...neutral. Like, it has no interests whatsoever. It has no perspective. Which is obviously untrue. But I think they think of these things like a law of nature--you can't question gravity.
Please link away! I am really interested in hearing other people's perspectives on this topic! Thanks for stopping by!
no subject
Also, like -- I don't really mind Instagram showing me a ton of corgi pictures if I'm on there for cute animal pictures? And if I go to the website of a newspaper and it shows me stories based on the number of people who have been interested in them rather than the way the editors put the page together, that doesn't necessarily bother me either. Do I trust the media conglomerate that bought my local paper more than I trust the interests of other readers?
I would maybe frame the problem as less about algorithms than the way all information is pumped/consumed through the same pipes (series of tubes?) -- which is certainly an issue with YouTube or Facebook when ppl are consuming them as primary news sources.
Not disagreeing with you just musing
no subject
True! But it's one you choose! (I know the parameters are set by AO3, and there are some downsides to them, and I acknowledge those!) Plus, the people who are doing that choosing are people I can actually communicate with if I want and also I could join them or vote for who's making those decisions. It just feels substantively different to me than the way that companies design these things.
And I think you make a good point about Instagram corgi content. Like...there can be a place for even those kinds of algorithms? Yes, I think there can. But people aren't just on Insta for pictures of cute animals. Some people are getting pulled into wellness cults and anti-vaxxing and everything, and it's the same algorithm that leads you to more cute corgis. So it's complicated!
than the way all information is pumped/consumed through the same pipes (series of tubes?)
This is a good angle to think about it from, certainly! I don't think that's the whole of the problem, but it is definitely a big part of it.
Not disagreeing with you just musing
I appreciate your musing! It's pushing me towards more musing of my own!
no subject
And then there's the issue of accepting our data being sold all the time. We accept that because that means we get to use whatever platform for "free." On the surface, it can feel like a fair exchange. I don't believe it is, but I do understand why we've all taken the path of least resistance re: just about everything to do with the internet.
no subject
I admit the lure of convenience is there. I'm not on Tumblr or Twitter, my fannish time is spent here and on AO3, but I've spent some time on Facebook and I have to say it disturbs me how ads for specific things pop up when I haven't been overt about my interest (or don't remember being). I'd rather look something up myself, rather than have some creepy algorithm wave it in front of me while I'm aimlessly scrolling. I don't even use things like kudos-to-hits ratios to read fanfic on AO3. I'd rather get recommendations from an actual person. Guess I'm old-fashioned. ;)
no subject
I also find it creepy! But so many people don't!
I don't even use things like kudos-to-hits ratios to read fanfic on AO3. I'd rather get recommendations from an actual person. Guess I'm old-fashioned.
Haha, same!
no subject
Yeah, I think of Delicious being killed off, too. Sometimes it feels like these sorts of things are inevitable, but actually people were making choices all along!
no subject
That's what horrifies me as well. I feel like it's the same thing as trying to argue in favour of privacy. People don't really see or understand what they lose, and in the case of recommendation algorithms it's "convenient." Sites like TikTok don't even give you the option of controlling what you see (following someone informs the algorithm's decisions, from what I understand, but it doesn't guarantee you'll see all of that person's videos). I was struggling to finish that thought, but what you quote from the podcast afterwards encapsulates it: "prediction as behaviour modification," because the content is fed to us in a particular way, with a particular bias, that influences us.
There was a lot of very excellent food for thought in the rest of your post and the comments as well, about echo chambers and making us more predictable. Thank you for sharing.
no subject
Yeah, definitely.
Sites like TikTok don't even give you the option of controlling what you see
Horror! Horror!
You're so welcome and thanks for stopping by! There's so much to think about when it comes to the ways in which technology is shaping us...
no subject
I think one of the strangest things when it comes to "the youth" plus "the internet" is the coincidence -- that happens with all revolutionary technologies -- that there's a generation that has grown up in lockstep with all this. I think that does actually make a difference that can't be reduced to ageist twaddle, in that the new technology becomes a form of identity marker, which makes critique much harder to swallow, because it feels like a criticism of the self.
Probably Ezra and Mohsin (I like the first name references!) could even have been clearer that these algorithms are in fact already incredibly biased, beholden to the prejudices of their makers. There's plenty of data on all kinds of algorithms--social media, facial recognition, anything--amplifying those biases. So really, if anything, I feel they should have been clearer that this is already happening, and it's happening precisely because algorithms are not independent entities, even though their critics tend to treat them as such. They are firmly created, and therefore firmly in the shape of their makers.
no subject
Also: thanks to you, I've got "Cracklin' Rosie" stuck in my head now! Play it now, play it now, play it now my BA-BY! XD
no subject
This is just wild to me, because my sense of the generational cohort the OP is from is that they are far more adamant about the importance of individual expression above all than were preceding ones. And an algorithm essentially determining what you like (by shaping what you get to encounter in the first place) would seem to be the opposite of that? Maybe it's that this cohort considers being shown a ton of content tailored to the specifics of one's unique identity affirming, but man, the only way I at least figured out what my identity was in the first place was by encountering a ton of stuff. I would not be the same person now if I'd only encountered the things that interested me in 8th grade over the many years since...
I also wonder if the misapprehension about AO3 having an algorithm in the first place might have something to do with the shift away from commenting towards kudosing or no interaction with authors at all that people are discussing in the posts you've linked and elsewhere recently. That is, if you are (and assume others are) deleting and reposting fics regularly, what's the point of leaving a comment?
That podcast discussion was really insightful. It meshes really well with Shoshana Zuboff's The Age of Surveillance Capitalism, which was one of the most thought-provoking things I've read in recent years. One of the things she says is: "For all of the elaborate ways in which [big tech companies] labor to render reality as behavior for surplus, the simplest and most profound is their ability to know exactly where you are all the time. Your body is reimagined as a behaving object to be tracked and calculated for indexing and search." Most smartphone apps demand access to your location even when it's not necessary for the service they provide, simply because the answer to this question is so lucrative--which I think is exactly what Hamid is pointing to in his observation about algorithms being designed to encourage predictability in human behavior.
no subject
But what I meant by "curating it myself" is more the point where I choose whether I click on a rec list and also whether I follow the links on the rec list. Unlike, say, insta, which will simply put posts on my timeline in the app and I essentially have to see them whether I want to or not, and need to actively decline doing so.
no subject
So it isn’t simply the case that machines are better able to understand humans. It is also the case that machines are making human beings more like machines, that we are trying to rewrite our programming in such a way that we can be predicted. And for me, that’s the more frightening aspect of the shift from sorting to prediction.
Yes, it's troubling enough for the purpose of advertising but really gets alarming when it applies to political activity -- especially since for the most part politicians want activity to be suppressed rather than enabled.
no subject
Unlike, say, insta, which will simply put posts on my timeline in the app and I essentially have to see them whether I want to or not, and need to actively decline doing so.
Yeah, I super hate this but I can understand a certain amount of its appeal.
no subject
Exactly! They're under my control! I do think there's a conversation to be had about the design of AO3 and what things you can control and which you can't, but that's a very different conversation than the algorithm one.
Lol! Sorry!
no subject
I will have to give it a try then! Thank you!
that there's a generation that has grown up in lockstep with all this. I think that does actually make a difference that can't be reduced to ageist twaddle, in that the new technology becomes a form of identity marker, which makes critique much harder to swallow, because it feels like a criticism of the self.
Yes, well said.
And I could not agree more about the biases built into the algorithms and the ways in which their creators are "invisible" but have huge amounts of power over the public discourse and over personal experiences online.
no subject
Yeah, I'm struggling with exactly this!
I would not be the same person now if I'd only encountered the things that interested me in 8th grade over the many years since...
I as I am would hate the person I would have turned out to be under these circumstances. (Sorry for the convoluted syntax--hopefully that sentence made sense.) I would be a worse person in almost every conceivable way.
might have something to do with the shift away from commenting towards kudosing or no interaction with authors at all that people are discussing in the posts you've linked and elsewhere recently. That is, if you are (and assume others are) deleting and reposting fics regularly, what's the point of leaving a comment?
This could be! I have no way of knowing whether that's true, but it seems like a reasonable theory.
Your body is reimagined as a behaving object to be tracked and calculated for indexing and search.
Oh god, that's so chilling!!!!
no subject
I find that the older I get (in age and in time in fandom) the more I strongly prefer non-commercial, non-algorithmic, by-us-for-us sorts of websites like AO3 and Dreamwidth
Absolutely.
The experience younger people are having with the internet now seems very different
Every single day I am glad that I had the internet experience I did and not the one that younger generations--or even people only a few years younger than I am!--had.
no subject
it's particularly damaging since it offers a suggestion of action and decision making on the part of users when it's really just channeling them into a cattle tunnel.
YES. That imagery is so potent!
but really gets alarming when it applies to political activity -- especially since for the most part politicians want activity to be suppressed rather than enabled.
Indeed.
no subject
YES!!! I think the reason that I'm worried about people accepting algorithms as default is that I think that such an acceptance undermines the kind of skeptical attitude that would lead to people even noticing the ways that algorithms shape us. If there's no culture of critique of those things, then things will only get worse.
but ultimately preferences for how people manage their own behavior is just not where my energy is focused, insofar as it does not lead to them harming other people by those choices
*nods*
It's when it stops being a choice, because it is mandated, that I get really worried.
This is an important distinction!
I also don't really get how you could not be aware this is a debate (especially for folks who are adults), which seems to indicate a lack of questioning about the world and not just about how social media is curated these days.
Yeah, super hoping that the people who made those comments are, like, 15. Because otherwise it's very worrisome.
no subject
Thank you for these interesting thoughts! I teach ESL and part of my class has to do with connections and social media, and the more people who add their voice to the discussion of algorithms, the more tools I have when I want my students to think critically about how and where they spend their time and what it might do to them.
On a more personal note, the first time* I really realised how much autonomy we lose to algorithms was when my Instagram feed changed from chronological to... whatever it does now. I loved opening the app about once a day and seeing what my friends and the people I followed had been up to since the last time I saw them, and then I lost that. I haven't really used Insta since then; in fact I might as well give in to the inevitable and delete my accounts.
*Facebook was already doing a lot of annoying algorithmic stuff by then, but I mostly used it on desktop, where I had FB Purity to keep my feed chronological, as well as adblock, so I didn't really take in the implications.
no subject
Gosh, yes, I feel like media/information literacy is one of the most important areas of education right now. It's heartening to know that you're out there trying to help your students work through this stuff!
Far be it from me to try to get anyone to keep social media, but a friend recently shared a tip (that I never would have discovered on my own) that made Instagram usable to me again. If you click on the Instagram icon in the upper left-hand corner and then choose the "Following" option, it will show you only the people you follow, in chronological order! The fact that they hide it pisses me off a great deal, but the capability is there. Which is such a relief because their algorithms really have made it unusable for those of us who just want to, like, see our friends' vacation pictures.
no subject
This is horrific. I hadn't thought that younger people might... not know that there are sites that don't work that way, that they might think it's just a law of nature that every place on the internet behaves that way.
I wish I knew what could be done about this.
no subject
Yeah, it's wild that it hadn't occurred to me either that they would take that specific sort of web design totally for granted. It's disturbing and I'm also not sure what to do about it.
no subject
Yeah - it's a little scary that "algorithm" has come to mean, by synecdoche, "commercial algorithm that decides for you what you want to see based on a company's agenda." CS students all take classes in algorithms. The way you learned to multiply multi-digit numbers is an algorithm. It just means "a way of solving a problem that has a description so precise that a computer could use it."
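For instance, the procedure most of us learned in school for multiplying multi-digit numbers can be written out as a tiny program. Here's a toy Python sketch just to illustrate that broader sense of the word (the function is made up for this comment, obviously not anything a real site runs):

def long_multiply(a, b):
    # Grade-school long multiplication: take each digit of b,
    # multiply it by a, shift the partial product into place, and add them all up.
    result = 0
    place = 0
    while b > 0:
        digit = b % 10
        result += a * digit * (10 ** place)
        b //= 10
        place += 1
    return result

print(long_multiply(123, 456))  # 56088, the same answer the pencil-and-paper method gives

The recipe itself is completely mundane and neutral; the worries upthread are about who gets to write the feed-ranking recipes and what they're optimizing for.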
no subject