Inspired by Professor
McPherson’s chapter on Web-based experiential modalities, I wanted to take a
look at my own engagement with YouTube, and to reflect on the illusory “choice”
and “discovery” promised by certain features on the site. I’m a bona fide YouTube
user – I am currently subscribed to 44 channels, though I keep up with new
content from fewer than 15 of those channels. I visit the site several times
daily, both to listen to music and to catch up on the latest content from my
favorite creators. I sift through content using two tabs – “Home” and “Subscriptions,”
which respectively promise “discovery” and “choice.” The “Subscriptions” tab
lists new content from the channels to which I am subscribed, with today’s
releases listed across the top of the page, followed by videos released
yesterday, this week, and this month. This feature presents my curated content
in reverse chronological order, much like Twitter’s newsfeed or Instagram’s
home page. “Home” offers a selection of videos curated by YouTube and tailored
to my taste based on my activity. This tab presents a selection of playlists
based on music that I’ve listened to, as well as “Recommended” content, which
includes videos from channels that I subscribe to but have not yet viewed and
content from channels that I’ve visited but not subscribed to. These
tabs have changed somewhat over the past few months, inspiring frustration from
both content creators and viewers who preferred older incarnations of these
browsing features. I tend to visit “Home” first, trusting YouTube’s algorithm
to “recommend” the content that I actually want to watch, rather than delving
into my bursting subscription box, in which roughly 60% of the videos will
remain unwatched. So, rather than availing myself of the “choice”-oriented
modality of the subscription box, I opt for the “discovery”-oriented modality
of the Home page, which essentially learns my viewing preferences and feeds
them back to me. I put too much trust in these features – content from certain
creators sometimes does not reach their subscribers, and YouTube will
occasionally unsubscribe groups of viewers from a given channel (about which
there is no shortage of conspiracy theories).
In light of the 2016
election and the Cambridge Analytica scandal, I’m wondering how we can navigate
between features that promise interactive ease and the data-mining that those
features mask. Professor McPherson asks us to “investigate the ideological implications
of actual interfaces and other programming choices,” and scrutinizing the
features with which these platforms tantalize us seems like a sound starting
point (206).
Following this description of YouTube, I think that Lisa Parks's article might also be applicable. Indeed, this seems to be an example of postbroadcasting "personal television": when we click on videos, we are teaching YouTube what shows we like (our cultural tastes) and how often we like to watch them (our viewing habits), and thus training its algorithms to seek out and suggest others we might like. The same goes for the ads, because this type of "personal TV" is also (well, actually mainly) capable of delivering specific audiences to advertisers. One reason the "Home" tab leads to a false sense of discovery is that, because material is pushed, "the process of selection – which is often celebrated as expanded viewer choice – is clearly circumscribed by marketers' determinations of 'relevant' content" (Parks, p. 138). So, just as Prof. McPherson argues that the rhetoric around the liveness of the internet performs ideological operations, Parks argues that the discourse of personal TV is directly related to new industrial structures of individuation geared toward profit-making rather than to the viewer's personhood.
I find both of these assessments of YouTube really fascinating! What I always wonder when I use the site or others like it is what I can do to try to break out of these predetermined content suggestions. I find it is harder and harder to find new videos or creators that don't directly fit my tastes. This goes for all of my presence online (my Instagram discovery page, for example)...how do we fight back against these marketing algorithms? Beyond acknowledging what they're doing, which you both do above, should we also find ways to engage with these platforms in order to break the cycle? I'm not sure I have an answer beyond just eliminating our presence on them completely, but that seems like a harder task!
My anecdote isn't specific to YouTube, but to the Netflix algorithm. The Netflix account that I use is so completely overwhelmed by disparate choices that the algorithm essentially throws up its hands in despair and walks away. At least seven people have this login and use it on a regular basis, with quite a few more using it occasionally (please don't rat us out to the powers that be!), which results in the algorithm having no idea what to recommend. The "Recommended For You" stream is usually exactly the same as the "Popular" or "Trending on Netflix" streams because there are simply no discernible trends or patterns in the account's viewing history. On the one hand, I'm sometimes proud of this, because it can feel like I'm gaming the system. On the other hand, I never get recommended anything cool that I wouldn't have otherwise encountered ... You win some, you lose some?