Speaking Privately to the Algorithm

What happens when we assume that always stating our opinion is in anyone's best interest

I spend a good amount of time watching YouTube videos by musicians. Not just videos of them, but videos by them: studio-journal videos that musicians make to show how they work. Not just recordings of their music, but videos of the process, of the effort required.

And I marvel at (which is to say, more directly: am dismayed by) instances when a positively received video on YouTube also receives a small handful of dislikes. By this I specifically mean mute negative gestures, devoid of any comment, just a downward-facing thumb. Say what you will about haters, at least when they comment they leave some fingerprint on their dissenting opinion. There’s a uniquely buzz-killing pall cast by the unqualified, unidentified, anonymous thumbs down.

Certainly everything will have its detractors, but I wonder if something else might be going on here. (By “positively received” I don’t mean the given video has racked up hundreds of thousands of views. I just mean maybe a couple dozen accounts have given it a thumbs up, and the video is innocuous, not to say inconsequential, just a musician doing their thing.)

I wonder if the solution is for the YouTube interface to provide an opportunity for the watcher/user to say, privately to the algorithm, “I’m not interested in this.” That suggestion is in contrast to requiring, as YouTube currently does, that you register your disinterest publicly.

Right now it’s as if the waiter asks how your meal was, and your only option is to stand up and announce your verdict to your fellow diners. And the issue may not be the food; it may not be that you didn’t like the food. The issue may be that it just wasn’t your sort of food, or that you would have liked it for lunch but it didn’t satisfy your dinner appetite.

As I’ve thought about this user-interface conundrum, I’ve become entranced by the concept of speaking “privately to the algorithm.” Perhaps that should be capitalized: “I’m speaking, privately, to the Algorithm.”

In that formulation, it’s like a confession, not a religious confession toward addressing your personal spiritual and all-too-human shortcomings, but a confession in the hopes of tailoring your reality. That is, toward addressing the shortcomings you perceive in (digital) reality.

And this is where the constant request for feedback can have (big surprise) unintended consequences. The tools have trained us to let them know what we think, because it’s in our best interest. But is it in anyone else’s interest that you found the given musician’s music uninteresting? While making your world better, have you yucked someone else’s yum? What is the good in that? What does it mean when acting to address the shortcomings you perceive in your digital reality has the direct effect (not merely a side effect, but a direct and immediate one) of negatively impacting the digital reality of other people?

Note the following three different scenarios on YouTube and how the user’s feedback is constrained, even directed, by the interface.

Below is a screenshot of the egregious situation I’ve been describing. If you’re on the page for a video, your only options are to ignore it, comment on it, or give it a thumbs up or thumbs down, and of course to “Report” it, but that’s a different situation entirely:

Contrast that with the option you have for videos that YouTube serves up to your account based on what you’ve viewed before. Note that here, there is a plainly stated means to say “Not interested”:

And note that this isn’t merely a matter of whether you arrive at the video through your own actions or through the recommendations of YouTube. For example, if you subscribe to channels on YouTube, you can still, from the Subscriptions page, elect to Hide something:

Now, perhaps if you select “Hide” that is all that happens. Perhaps it just takes the video out of view. Perhaps YouTube doesn’t register your action as a means to adjust how its algorithm triangulates your viewing taste. But that seems unlikely, doesn’t it? We use these interfaces today with the impression that they will inform our future use of a given tool. Which is why when faced with no “Not interested” or “Hide” equivalent on a page, the user is, if not justified in registering their disinterest, forgiven a little for registering their dissatisfaction.
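The distinction between these signals can be made concrete. The following is a minimal sketch, not a description of YouTube’s actual system: the signal names, weights, and the idea of a per-user taste profile are all illustrative assumptions. The point it demonstrates is the one above: a public dislike changes what everyone sees, while a private “Not interested” or “Hide” need only adjust one user’s own recommendations.

```python
# Hypothetical sketch of public vs. private feedback signals in a recommender.
# All names and weights here are assumptions for illustration, not YouTube's API.

FEEDBACK_WEIGHTS = {
    "like": 1.0,             # public: visible in the video's tally
    "dislike": -1.0,         # public: also visible to other viewers
    "not_interested": -0.5,  # private: spoken only to the algorithm
    "hide": -0.25,           # private: perhaps only removes the video from view
}

PUBLIC_SIGNALS = {"like", "dislike"}


def update_taste_profile(profile: dict, topic: str, signal: str) -> dict:
    """Nudge this one user's taste score for a topic; no one else sees it."""
    profile[topic] = profile.get(topic, 0.0) + FEEDBACK_WEIGHTS.get(signal, 0.0)
    return profile


def is_visible_to_others(signal: str) -> bool:
    """Only public signals alter what other viewers encounter."""
    return signal in PUBLIC_SIGNALS


# A private signal tailors one feed without casting a pall on the video itself.
profile = {}
update_taste_profile(profile, "studio-journal", "not_interested")
```

The design choice the sketch encodes is the article’s argument: if the interface offered the private channel everywhere, the weight of a public dislike would never be the only lever available.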

The issue is that the user’s dissatisfaction isn’t necessarily with the video. It is, indirectly and yet significantly, with YouTube.

3 thoughts on “Speaking Privately to the Algorithm”

  1. Recently I’ve been pondering sexism in social media responses, like how my partner will get slammed on a local FB page but I will get maybe a thoughtful comment for a similar post.

    Anyway, I’m wondering how a similar thing might be at play here. Like if haters feel more empowered to dislike a video where you don’t see a male?

  2. Thanks again, Marc, for a very thoughtful topic.

    I don’t spend very much time watching videos such as you describe, but I have noticed the strange behavior which the anonymity of the “Like/Dislike” duality encourages. What it makes me consider is how many aspects of our experience have been reduced to binary – the form in which computers are able to process information. Shades of meaning require many, many more questions to be answered “yes/no” (like/dislike), and those questions take time (both to formulate, and to answer). So the conclusion I arrive at is this: the computer-based experience can be fast and easy, but at the expense of shades of meaning, or it can be rich and detailed, with the investment of great amounts of time. There are no shortcuts for layered experience, and to my personal tastes, the current forms on the internet too infrequently offer shades of meaning.

    Your thoughtful articles are one of the exceptions! Thank you, again.
