As technology becomes increasingly advanced and complex, new software seems to emerge every day to perform some novel function. Whether it is generating computer imagery (CGI) or deciphering codes in the Bible, software developers are helping their users make great strides across all types of industries. In these situations, it's commonly accepted that the developer owns the software and that the user enjoys its benefits through a license. A less clear issue, however, has arisen in recent years: does the software developer own the output generated when the software is used?
Niantic looks to the Potterverse for its next potential AR blockbuster, Instagram’s ToS don’t travel so well in Germany, Google gives VR and AR app developers a new tool, holograms may help our memories outlive us, and more!
Cloning is the practice of creating a video game that closely imitates, or is heavily inspired by, an existing popular video game or series. Developers have been cloning popular video games since the 1980s; targets have included Tetris, Doom, Minecraft, Bejeweled and Flappy Bird. Often, game developers create clones in an attempt to confuse users and cash in on a game's popularity.
When it comes to finding ways of making money, no corner of a capitalistic society shall go unmined. This applies to obvious goods and services but also comes into play with our very thoughts and how we express them. In the age of social media, not even the framed needlepoint proverb is safe from “disruption”: behold, the framed tweet.
As we discussed recently, the Equifax data breach has inevitably brought a great deal of scrutiny and legal action against the credit reporting agency. Amidst the numerous brewing class actions and other reactions from government agencies and state AGs, it’s worth pointing out another front on which the company—and more importantly, individuals within the company—may face legal consequences.
Since September 7, 2017, Equifax, one of the three major credit reporting agencies in the United States, has been dealing with the fallout from one of the largest known data breaches of personal information, which put 143 million Americans (roughly 44% of the U.S. population) at risk of fraud and identity theft.
Last week, the FTC brought its first action against a social media influencer for failing to make appropriate disclosures on sponsored posts. While it has previously taken action against companies that pay influencers for posts, such as Lord & Taylor and Warner Brothers, this marks the first time the FTC has pursued an influencer directly.
“Believe nothing you hear, and only one half that you see.” Edgar Allan Poe wrote those words more than a century and a half ago, yet if he were alive today he might opt for the darker: “Believe nothing you hear and nothing you see.” Over the past decade, advances in graphics technology have given visual effects artists the ability to create fantastical new worlds on film and to populate those worlds with people, all with an astounding amount of realism. Of particular interest in this post is the ability of this technology to create realistic digital replicas of actors that can be manipulated like puppets to deliver new cinematic performances without any further input from the actor, as when the late Peter Cushing was digitally recreated to reprise the character of Grand Moff Tarkin in Rogue One: A Star Wars Story.
After counter-protests ended in tragedy, a small group of social media users took to Twitter to expose the identities of the white supremacists and neo-Nazis rallying in Charlottesville, Va. Since last Sunday, the @YesYoureRacist account has been calling on Twitter users to identify participants in the rally. Twitter users identified several white supremacists, including Cole White. Users revealed White's name and place of residence, and his employer reportedly fired him from his job at a restaurant in Berkeley, Calif. Several other employers fired employees identified online as having attended the rally. This incident will likely not be the last time such behavior is exposed and called out on social media, so it's a good time to look at doxing and the legal environment in which it exists.
Freelance writers are as integral to online content generation as migrant workers are to the harvesting of seasonal crops (and in many cases, about as poorly protected). Since content generation is always in season, and since so many online platforms either use freelancers to generate content or rely in some manner on that content, employers would do well to take note when a large metropolitan area that is home to countless freelancers enacts new protections for the group as a whole. New York City did just that with the Freelance Isn't Free Act, which our colleagues Rebecca Carr Rizzo, Kenneth W. Taber and Andrew J. Lauria discussed at length in a May client alert. Now that the law has taken effect, final rules have been announced that further define what a hiring party can and cannot do with respect to the language of contracts that freelancers are asked to sign. In New York City's “Freelance Isn't Free” Act Also Isn't Waivable, our colleagues explore the law's final form. New York-area employers are likely already familiar with the law, but anyone who hires freelancers based in NYC should take note.