YouTube Faces $3.2 Billion UK Lawsuit Over Children’s Privacy Violations


A class action lawsuit against YouTube could cost the tech giant $3.2 billion if decided in favor of the plaintiffs. The lawsuit asserts privacy violations against at least five million children in the UK as a result of YouTube's data collection practices. The suit cites both the General Data Protection Regulation (GDPR), by whose terms the UK remains bound until 2021, and the similarly structured UK Data Protection Act.

Children’s privacy violations endemic at YouTube?

The case centers on a privacy violation problem that is common to big tech companies that deal in targeted advertising: screening out minors who are entitled to enhanced data protection rights. YouTube was already taken to task for a very similar issue of children's privacy protection in the United States in 2019, receiving a $170 million fine from the Federal Trade Commission (FTC) and New York Attorney General for collecting the data of minors for targeted advertising purposes without parental consent and in violation of the Children's Online Privacy Protection Act (COPPA).

The penalties are much more substantial under the GDPR terms, should a violation of children's privacy be determined by EU data protection authorities (DPAs). But that is a separate issue from the pending class action lawsuit; Articles 80 and 82 of the GDPR allow individuals to seek damages in this way in addition to any potential fines that might be levied. The lawsuit could potentially translate into hundreds of GBP for each claimant.
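The "hundreds of GBP per claimant" figure can be sanity-checked with simple division. The split and exchange rate below are illustrative assumptions, not figures from the lawsuit itself:

```python
# Rough sanity check of the per-claimant figure.
# Assumptions (not from the article): damages divided evenly across
# claimants, and an illustrative 2020-era USD/GBP rate of about 0.78.
total_damages_usd = 3.2e9      # $3.2 billion sought
claimants = 5_000_000          # at least five million children

per_claimant_usd = total_damages_usd / claimants
per_claimant_gbp = per_claimant_usd * 0.78

print(f"${per_claimant_usd:.0f} per claimant")   # $640
print(f"~£{per_claimant_gbp:.0f} per claimant")  # roughly £500
```

At around £500 each under these assumptions, the result is consistent with the article's "hundreds of GBP" characterization.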

YouTube has stated that the platform is “not for children under 13,” but a substantial amount of its programming is designed to appeal to young viewers. A 2019 report from a UK media regulator found that 80% of UK children aged 5 to 15 are video-on-demand consumers, as are about 50% of children aged 3 to 4. The second-most viewed video in YouTube history is the “Baby Shark Dance” at over 6.5 billion views, and two videos from the Cocomelon Nursery Rhymes channel were among 2018’s top-viewed videos with more than two billion views each. YouTube’s highest earner for 2019, toy unboxer Ryan Kaji, made $26 million from the platform’s ads and is eight years old. The lawsuit also points out that children’s brands such as Hasbro and Mattel promote YouTube as a favorite website of kids in their marketing materials.

The video platform has safeguards during account creation intended to steer those under the age of 13 to the alternative YouTube Kids app, which does not collect personal data and has extra content filtering. It is assumed that a parent will have control of this process, however, and there is little stopping an older child from creating their own account and reporting a false age. Not that any of this necessarily matters in terms of potential privacy violations, as the site’s content is almost entirely available without logging in. YouTube tracks even those viewers that are not logged in, using cookies and other measures to collect data that ties into its web-spanning targeted ads network.

Though the fine amount was trivial for the tech giant, the 2019 FTC ruling did trigger some big changes at YouTube that kicked in this year and are aimed squarely at protecting children’s privacy. New labels are required for any content that is intended for children, and the site applies its AI algorithms to identify these types of videos and ensure that they are labeled properly. YouTube content creators were also notified of COPPA requirements and must manually flag each video they produce that is intended for kids.

One of the most important changes YouTube made was to stop delivering targeted ads to anyone on the platform who had watched a video intended for children, regardless of their age. If this technology is properly in place, it would prevent future privacy violations (and lawsuits) of this nature, but it would not apply to claims predating the recent implementation of these new platform rules.

Can YouTube really keep kids off the platform?

YouTube finds itself in something of an untenable position in terms of these privacy violations, stuck as it is between its userbase, monetization systems and methods of content delivery. It’s impossible to protect children and screen them out from the mainstream site without mandating both user account logins and some sort of identity check, which would be clear non-starters for the rest of the user base.


The new policy of automatically removing viewers from the targeted ads ecosystem after watching just one video intended for children illustrates how precarious the position is, and indicates that the platform may need to look to new monetization methods before long. While its problems are limited to the EU and US at present, the ongoing trend of data protection laws being adopted around the world means that YouTube could be facing fines and legal complications in nearly every major market before long. Traditional non-targeted advertising based on more general demographic observations rather than use of personal data would solve the problem of violating children’s privacy, but would also likely reduce ad revenues substantially and have a major ripple effect across the entire content creation ecosystem.

Source: CPO Magazine

