In the Jenkins piece for this week, I found an enormous
amount of material that I wanted to talk about for my presentation. I was
surprised by how relevant the piece remains after fourteen years. I did not feel
the same way about the Caldwell piece; nothing particularly spectacular seemed
to be said in it. For the purposes of this core post I’m going to focus on the
second point that Jenkins makes, on “regulating media content.”
I definitely agree that there has been a push away from
least objectionable programming (LOP) and towards narrowcasting, but on an
even more exaggerated scale than the one Jenkins was describing in 2004.
This is making it harder and harder for parents to control what their children
are exposed to on a day-to-day basis, even though most streaming services and
game consoles offer parental controls. The problem is that parents are getting
lazy and relying on companies to restrict what their children have access to,
at a time when they should be more vigilant than ever.
The prime example of this is the debacle surrounding YouTube Kids, an app that
is a watered-down version of YouTube, far more restrictive about what
content is let through. Up until the most recent wave of the Adpocalypse, kids’
channels were among the only ones that had yet to be hit by demonetization. Not
only were these channels not targeted by the Adpocalypse, they were getting
some of the best CPM rates thanks to the
assumption that they were advertiser-friendly. YouTube was relying heavily on
the algorithm to determine what would be featured on the YouTube Kids app and
in searches for things such as ‘Frozen’
and ‘Spiderman.’ Because of this reliance on the infamous algorithm, parents
felt that they did not need to worry about what their kids were watching on
YouTube. That all changed once advertisers were made aware of certain types of
channels that were putting characters aimed at kids in situations that
were anything but kid-friendly. This most recent advertiser pull-out
seems to have been sparked by James Bridle’s Medium article, “Something Is Wrong on the Internet” (https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2).
This has caused YouTube as a company to rethink how its algorithm defines what
constitutes appropriate material for children. This seems to
be a recurring theme for YouTube, and in the eyes of many creators on the site
the company has been handling it poorly, in part because YouTube keeps showing
its hand about whom it is more invested in protecting: the advertisers.
I enjoyed reading your blog post and am looking forward to your presentation tomorrow. I agree with your point about parents expecting streaming services to monitor what their children watch. Taking a more radical stance, I think that each family has the right to make decisions about programming, but it should not be up to the companies to filter their own content from curious eyes, especially if we consider that it is only in the last century that children went from being viewed as little adults, capable of labor, to something to be protected (Zelizer, 1994). From being valuable to being valued, children are treated completely differently than in the pre-television era. As media technology expands, so does society's anxiety about children's exposure to it. As a personal anecdote: growing up, my parents let me watch Seinfeld and Frasier with them, but they wouldn't let me watch Married with Children because the adults smoked and the kids talked back. Sometimes I would put Jerry Springer on in my room very quietly and flip the channel if my mom came in. With so many screens and unlimited media access, I can see why this is so challenging for parents now.