“Pleeeease?!” Buying a quick gift or giving in to your child’s pleas for a new toy is quickly becoming a more serious decision. In an age when toys can happily entertain kids by talking to them, the few precious moments those toys buy parents may not be without risk. Anyone within an internet-connected toy’s Bluetooth range (up to 100 feet away) can potentially connect to the toy and receive its audio recordings. Internet-connected toys have already put children’s data at risk: in December 2015, VTech allegedly exposed the personal information of 6.4 million children, including their names, genders and birthdays.

Stealing a child’s personal information is, at the very least, concerning. However, internet-connected toys come with an additional danger—localized hacking. Just look at Cayla, an internet-connected fashion doll manufactured and sold by Genesis Toys. My Friend Cayla answers fact-based questions, plays games, reads stories, and even solves math problems. Genesis uses third-party voice-recognition software from a U.S.-based company, and the doll requires an iOS/Android application to use the software. The doll’s mobile application researches and supplies Cayla with factual answers to questions, but it also prompts children to enter their physical location, parents’ names and school name.
Brand companies have come to view user-generated content as one of the most effective and authentic ways to advertise their products or services. This is known as “user-generated content marketing.” For example, with the ubiquitous selfie, brand companies have discovered a rich supply of user-generated content. Consider a consumer who takes a selfie wearing a favorite pair of jeans, posts the photo on Instagram, and then tags the photo with #brandname. The jean company sees and likes the photo, re-posting it on the company website. Legal issues? If the consumer or user was hoping to get attention from the brand for the photo and opinions shared online, not at all. This is how many digital influencers get their start. But if the user was not seeking such attention? Then, problems can arise.
2013 was an incredibly active year for social media legal issues. Below are selected highlights on some of the more interesting legal issues that impacted social media, along with links to reference material relating to the topics.
1. Virtual Currency/Bitcoin
FinCEN Virtual Currency Guidance and Enforcements – FinCEN published legal guidance on virtual currency, making clear that existing money transmitter and anti-money laundering regulations apply to certain virtual currency activities. Shortly after issuance of the guidance, a wave of enforcement actions shut down non-complying entities.
Congressional Hearings on Virtual Currency – Congressional hearings were surprisingly friendly and receptive to Bitcoin and other virtual currencies.
2. Privacy – Guidance and Enforcements
CA Privacy Law – California passed new privacy laws.
3. Intellectual Property/Patents
Patents – The number of social media patent filings continued to increase. The America Invents Act (AIA) fully kicked in, providing a greater ability to challenge patents believed to be invalid without going through district court litigation. The Fast Track process allowed patents to be issued more rapidly (often in less than a year).
Ownership of Social Media Accounts and Followers – Despite a number of cases (including ones involving LinkedIn and Twitter) relating to ownership of social media accounts, the law remained murky and fact specific.
This uncertainty can be avoided by proper attention to social media policies before issues arise.
4. Employment Law and Social Media
National Labor Relations Board (NLRB) – The NLRB continued to issue surprising guidance and decisions on social media usage. In many cases, some or all provisions of employers’ policies governing employee use of social media were found to be unlawful. The NLRB affirmed that workers have the right to discuss working conditions freely without fear of retribution, whether the discussion takes place in the office or on Facebook. Later in the year, however, the Board found some employer uses of social media in firing decisions to be lawful.
Employer Access to Social Media User Names and Passwords – By year end, 36 states had passed or initiated legislation prohibiting employers from requesting personal social media account information or passwords in connection with employment decisions.
National Conference of State Legislatures Report – Some states have similar legislation to protect students in public colleges and universities.
5. Online Gaming
First-mover states forged ahead with online gambling.
· Nevada – Legalized online poker and granted its first licenses for interactive gaming.
· New Jersey – In February, passed legislation (signed into law by Governor Chris Christie) allowing online wagering. Subject to certain limitations, licensed operators are permitted to offer online versions of a wide variety of games currently permitted in Atlantic City casinos (e.g., roulette, craps, blackjack, and slots).
· Delaware – On October 31, launched what Delaware officials call a “full suite” of internet gambling.
Zynga – In September, Zynga withdrew its bid for a gambling license in Nevada.
Federal Gambling Legislation – The prospects for a federal online gambling law remained elusive.
Mobile Health Applications – The Food and Drug Administration (FDA) issued guidance focused on applications that present a greater risk to patients if they do not work as intended, or that cause smartphones or other mobile platforms to impact the functionality or performance of traditional medical devices.
– The FTC issued guidance in April focusing on truthful advertising and privacy.
gaming promotions in a cause-related marketing campaign (where purchase of a good or service benefits a charitable cause).
Internet Sweepstakes Café Conviction in Florida – Lawyer Kelly Mathis was convicted on 103 of 104 counts related to illegal gambling based on his role in Internet Sweepstakes Cafés in Florida. He faces up to 30 years in prison. CA, OH, SC and other states moved quickly to shut down similar operations.
Equity-based crowd funding legalized in the United States
Equity crowd funding is much like crowd funding, which has been popularized in the United States through sites such as Kickstarter and Indiegogo. The difference is that instead of individuals supporting campaigns through donations, numerous investors are purchasing small stakes in startups or small businesses.
– Critics of equity crowd funding worry that the industry will be rife with Ponzi schemes or that having too many investors will hurt startups’ prospects for future funding.
FTC Enforcements on Fake Endorsements – In February, the FTC permanently stopped a fake news website operator that allegedly deceived consumers about acai berry weight loss products. The settlements will yield more than $1.6 million and conclude a sweep against online affiliate marketers and networks. The sites falsely claimed endorsements from ABC, Fox News, CBS, CNN, USA Today and Consumer Reports.
Many companies’ understanding of and compliance with the FTC Endorsement Guidelines remains lacking, yet enforcements continue.
Wearable Computing Lawsuit
Google Glass Liability? – In what may be a foreboding development, a California woman received a traffic ticket for wearing Google Glass while driving. Many states have broad distracted-driving laws or bans on certain monitors that may apply to Google Glass and similar wearable computing devices.
The New Jersey attorney general sued 24 x 7 Digital LLC, a Los Angeles-based developer of applications for mobile devices, for children’s online privacy violations. The suit alleges that 24 x 7’s kids’ educational apps collect personal information from children younger than 13 and transmit that information to a third party without parental notice or consent.
The information includes kid-created profiles containing the children’s first and last names and a picture. The app allegedly transmits that information, along with the unique device identification number associated with the mobile device the child is using, to a third-party analytics company. A copy of the complaint (Chiesa v. 24 x 7 Digital LLC, D.N.J., filed 6/6/12) can be found here: NJ_060612.pdf.
As the frequency of mobile app enforcements increases, it is critical to ensure that you have conducted a proper legal audit of compliance issues with your app to avoid being the next headline. It is important to ensure that you consider how you use and share data with third parties, among other things.
On Thursday, the Federal Trade Commission released a staff report titled “Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing,” in which the FTC criticized companies for failing to properly disclose to parents how they collect personal data through mobile applications (“apps”) aimed at young children.
The results of the FTC’s study follow the FTC’s August 2011 settlement with W3 Innovations, which was the FTC’s first enforcement action against a mobile app developer.
The FTC surveyed approximately 1,000 apps designed for children and available through iTunes and the Android Marketplace by searching for the word “kid.” Despite the warning provided by the W3 Innovations settlement, the FTC found that the operators of those apps could be collecting location (via GPS), phone numbers, contact lists, call logs and other “unique identifiers,” but that the apps do not make it easy for parents to figure out what is being collected, how the data is being used, or how to give consent to such collection and use.
“Companies that operate in the mobile marketplace provide great benefits, but they must step up to the plate and provide easily accessible, basic information so that parents can make informed decisions about the apps their kids use,” FTC Chairman Jon Leibowitz said in a statement.
The FTC noted that the various app stores create their own age ratings and that these guidelines are often not consistent. The Staff Report recommends that app developers provide simple and short disclosures on how they collect and share information about users, including whether their apps connect to social media sites like Facebook. Connection to Facebook (or other social media sites) for some of these apps could be problematic, since the Facebook terms (and the terms of most other social media sites) specifically prohibit access if the user is under 13 precisely to avoid having to deal with the COPPA Rule.
The FTC also wants app developers to inform parents if apps targeted towards children contain ads. In some apps, ads and/or content that is inconsistent with the age rating is buried deep within the app and can be found only when a player reaches an advanced stage of the game.
The FTC Staff Report is particularly interesting in light of a report from the Wall Street Journal today asserting that “Google Inc. and other advertising companies have been bypassing the privacy settings of millions of people using Apple Inc.’s Web browser on their iPhones and computers – tracking the Web-browsing habits of people who intended for that kind of monitoring to be blocked.”
The FTC’s evaluation of app privacy disclosures comes as the agency is evaluating the comments it received and finalizing updates to COPPA that were revealed in September 2011.
According to the Staff Report, the FTC is planning to conduct an additional review in the next six months to determine whether some of these mobile apps were violating the COPPA Rule. As currently drafted, the COPPA Rule creates the potential for violators to be fined up to $1,000 per violation (i.e., per child) – an amount that can add up very quickly for even a moderately popular app.
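To see how quickly per-child penalties compound, here is a back-of-the-envelope sketch. The $1,000-per-violation figure is the one cited above; the app user counts are hypothetical illustrations, not figures from the Staff Report:

```python
# Hypothetical illustration of how per-child COPPA penalty exposure scales.
# The $1,000 figure comes from the post; the user counts are made up.

PENALTY_PER_CHILD = 1_000  # dollars, per violation (i.e., per child)

def max_exposure(children_affected: int) -> int:
    """Upper-bound statutory exposure for an app that improperly
    collected data from the given number of children."""
    return children_affected * PENALTY_PER_CHILD

for users in (5_000, 50_000, 500_000):
    print(f"{users:>7,} child users -> up to ${max_exposure(users):,}")
```

Even a modestly popular app with 50,000 child users would face potential exposure of $50 million under this reading, which is why the FTC’s planned follow-up review matters.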
The Federal Trade Commission has extended until July 12, 2010, the deadline for public comments on its review of the Children’s Online Privacy Protection Act (COPPA) Rule. The request for comments was originally published in the Federal Register on April 5, 2010.
As stated on the FTC website:
The primary goal of the Children’s Online Privacy Protection Act (COPPA) Rule is to give parents control over what information is collected from their children online and how such information may be used.
The Rule applies to:
* Operators of commercial Web sites and online services directed to children under 13 that collect personal information from them;
* Operators of general audience sites that knowingly collect personal information from children under 13; and
* Operators of general audience sites that have a separate children’s area and that collect personal information from children under 13.
The Rule requires operators to:
* Provide notice about the site’s information collection practices to parents and obtain verifiable parental consent before collecting personal information from children.
* Give parents a choice as to whether their child’s personal information will be disclosed to third parties.
* Provide parents access to their child’s personal information and the opportunity to delete the child’s personal information and opt-out of future collection or use of the information.
* Not condition a child’s participation in a game, contest or other activity on the child’s disclosing more personal information than is reasonably necessary to participate in that activity.
* Maintain the confidentiality, security and integrity of personal information collected from children.
Many in the industry have complained that the FTC has not provided clear enough guidance on how to comply with COPPA.
However, in order to encourage active industry self-regulation, COPPA also includes a safe harbor provision allowing industry groups and others to request Commission approval of self-regulatory guidelines to govern participating Web sites’ compliance with the Rule.
One of the few companies to have received Safe Harbor status is Pillsbury client Privo, Inc.