"There are so many levels to this that I find horrifying--the idea that people will outsource their tastes to big tech companies instead of trusted communities."
That's what horrifies me as well. I feel like it's the same thing as trying to argue in favour of privacy: people don't really see or understand what they lose, and in the case of recommendation algorithms it's "convenient." Sites like TikTok don't even give you the option of controlling what you see (following someone informs the algorithm's decisions, from what I understand, but it doesn't guarantee you'll see all of that person's videos). I was struggling to finish that thought, but what you quote from the podcast afterwards encapsulates it: "prediction as behaviour modification," because the content is fed to us in a particular way, with a particular bias, that influences us.
There was a lot of excellent food for thought in the rest of your post and the comments as well, about echo chambers and about making us more predictable. Thank you for sharing.