I just want to point out an MIT Technology Review article on three new books that warn against turning into the person the algorithm thinks you are. It opens with a joke:

“A machine learning algorithm walks into a bar. The bartender asks, ‘What’ll you have?’ The algorithm says, ‘What’s everyone else having?’”
Chet Haase (Google)

Jokes are deeply human. Can machines learn to tell them?
An essay by Bob Mankoff for The Atlantic.
Much of the current conversation around the rise of artificial intelligence can be categorized in one of two ways: uncritical optimism or dystopian fear. The truth tends to land somewhere in the middle—and the truth is much more interesting. These stories are meant to help you explore, understand and get even more curious about it, and remind you that as long as we’re willing to confront the complexities, there will always be something new to discover.
Software engineer Chet Haase’s joke sums up the problem with algorithmic recommendations: by guiding what we watch, read, and listen to, they also influence what gets made in the first place, a self-reinforcing cycle…
“In algorithmic culture, the right choice is always what the majority of other people have already chosen.”
Kyle Chayka, Filterworld: How Algorithms Flattened Culture
As Chayka points out in Filterworld, algorithms “can feel like a force that only began to exist … in the era of social networks” when in fact they have “a history and legacy that has slowly formed over centuries, long before the Internet existed.”
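To make that self-reinforcing cycle concrete, here is a minimal, purely illustrative Python sketch (my own, not from the article or Chayka’s book) of a recommender that always leans toward “what everyone else is having.” The catalogue, item names, and counts are invented; the only point is that a popularity-weighted choice rule concentrates attention on a handful of items even when everything starts out equal.

```python
# Illustrative only: a toy "what's everyone else having?" recommender.
# Each recommendation is drawn with probability proportional to how often
# an item has already been chosen, so early front-runners keep winning.
import random
from collections import Counter

random.seed(0)

CATALOGUE = [f"song_{i}" for i in range(50)]        # 50 hypothetical items
plays = Counter({item: 1 for item in CATALOGUE})    # everyone starts equal

def recommend() -> str:
    """Pick an item with probability proportional to its existing play count."""
    items, weights = zip(*plays.items())
    return random.choices(items, weights=weights, k=1)[0]

# Each simulated listener simply accepts the recommendation.
for _ in range(10_000):
    plays[recommend()] += 1

top5 = plays.most_common(5)
share = sum(count for _, count in top5) / sum(plays.values())
print(f"Top 5 items now account for {share:.0%} of all plays")
```

If listeners picked uniformly at random, the top five of fifty items would hold about 10% of plays; under the popularity-weighted rule they typically end up with a far larger share. That winner-take-most dynamic is the flattening these books are worried about.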
“Raw data is an oxymoron”
Lisa Gitelman
In How Data Happened: A History from the Age of Reason to the Age of Algorithms, Wiggins and Jones chart a depressing progression of attempts, many of them successful, to use data as a scientific basis for racial and social hierarchies: from Quetelet and his “average man” to Francis Galton’s eugenics to Karl Pearson and Charles Spearman’s “general intelligence.”
Data added “a scientific veneer to the creation of an entire apparatus of discrimination and disenfranchisement,” they write. It’s a legacy we’re still contending with today.
Whether it’s poverty, prosperity, intelligence, or creditworthiness, these aren’t real things that can be measured directly, note Wiggins and Jones. To quantify them, you need to choose an easily measured proxy. This “reification” (“literally, making a thing out of an abstraction about real things”) may be necessary in many cases, but such choices are never neutral or unproblematic. “Data is made, not found,” they write, “whether in 1600 or 1780 or 2022.”
Wiggins and Jones reveal the intellectual tradition that underlies today’s algorithmic systems, including “the persistent role of data in rearranging power.”
“The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.”
David Graeber, The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy
In Algorithms for the People: Democracy in the Age of AI, Josh Simons explains how we continue to misunderstand the nature of machine learning, and how its use can profoundly undermine democracy.
“Machine learning is like driving while looking in the rear view mirror.”
(@tjcmorgan)
Algorithms may entrench our biases, homogenize and flatten culture, and exploit and suppress the vulnerable and marginalized. But these aren’t completely inscrutable systems or inevitable outcomes. They can do the opposite, too. […] As long as algorithms are something humans make, we can also choose to make them differently.
