People think that if they had more data, they'd make smarter, better decisions. What I see happening is the opposite: the more data people have, the more they defer decisions to the "experts," because they're incapable of managing the complexity.
That, and shitty data visualizations, increasingly made to impress with visual flair rather than to communicate information properly and without bias.
"Storytelling" has taken over delivery of facts and information. Normativity, subjectivity, and personal agendas are smuggled in to such a degree that people can no longer tell what data is pertinent and what isn't without some expert telling them what to think.
This isn't completely on the supply side, though. The demand side (the audience) also wants "easy to understand" and "easily digestible" information.

It's like when students think a professor is "good" because he's entertaining and easy, not because he's teaching.
When there's that much dependence on the "expert" to make sense of what's going on for others, there's too much temptation for the expert to abandon neutrality. There's too much personal gain to be had, creating a conflict of interest.
On one hand, there's fame and the likelihood of fortune (see: Nate Silver). And when you turn out to be wrong, even spectacularly so, there's almost no downside. (Also see: Nate Silver)
The system of experts itself has become so complex that everyday people need experts to tell them who the experts are.
Even the fact-checkers need fact-checking.
Many people have just given up.
I study machine intelligence through a philosophical approach. The hardest problem of machine intelligence is the frame problem: the computer has no idea when to stop considering information. A secondary problem: it has no concept of relevance.
Similarly, we are losing our concept of when to stop gaining knowledge, as we slowly and quietly lose our concept of relevance.
In a strange and subtle way, perhaps we can say that we are losing a bit of our humanity.
That may sound like some dystopian sci-fi horror, but consider this:

What makes us human is our capacity to choose: agency. And our capacity to choose what we believe to be true: epistemic agency.
If we do not have agency, we're merely deterministic automata.
Silver lining (as I type this thread via stream-of-consciousness): if external forces can make us more deterministic, then we must have at least a modicum of free will for them to take away.

Therefore, free will is not an illusion.
And if free will is not an illusion, then the most human thing you can do is exercise your free will *in spite of your circumstances*.
So, then, what will you choose?
Meme time!
You can follow @hyonschu.