Let’s talk shop. With LeBron James. Sounds cool, right? That’s what James and his partner Maverick Carter thought when their entertainment company Uninterrupted developed The Shop. On The Shop, James and his friends, business associates, and various celebrity figures banter while getting their hair cut. Uninterrupted has aired two episodes of the series, with the first episode (which premiered during the 2017 NBA Finals) garnering roughly four million views across Uninterrupted.com and ESPN’s YouTube channel.
If you haven’t seen Sundar Pichai’s presentation on Google Duplex, watch it. The technology is fascinating.
Google is developing software that can assist users in completing specific tasks such as making reservations by telephone. The software uses anonymized phone conversations as the basis for its neural network and, in conjunction with automated speech recognition and text-to-speech software, can have independent phone conversations with other people. Incredibly, the software requires no human interaction—at least by the user requesting the service—to complete its task. The result is that you can task the software to set up a haircut appointment for you, or book a table at a restaurant where it is difficult to get reservations, with no further input needed. It can also work through different scheduling options if your preferred time is not available. And importantly, the conversations seem natural—it is very difficult to tell that one of the participants in the conversation is a computer.
When Eddie Rabbitt sang “Drivin’ My Life Away” in 1980, he was chronicling the life of a roadie, a life spent behind the wheel. At the time, autonomous vehicles were still a distant speck on the horizon of the information highway. Today, we are on the cusp of a revolution that offers a near future in which no one will have to spend his or her life behind a wheel. As always, the future carries new concerns, dangers and legal developments. We have already seen our first accidents and fatalities related to autonomous driving, and the regulatory and liability landscape is quickly setting the context for this new technology—twenty-two states and Washington, D.C., have enacted legislation related to autonomous vehicles (with more pending).
Recent technology news provides its usual mix of hope, distractions and hand-wringing-worthy developments. (Granted, one of these items is not so much “news” as an ever-present truth about TOS.)
- How design’s tricks of the trade separate users from their privacy on the internet. (Ariel Bogle, ABC Science)
- “Restore discs” created by e-waste innovator are ruled to have infringed on Microsoft IP. (Tom Jackman, Washington Post)
- Investigators mined a genealogy website to aid in finding the Golden State Killer … is that a privacy issue or just good sleuthing? (Antonio Regalado, MIT Technology Review)
- The same algorithm that helps spot “face swaps” in video can be used to make more sophisticated fake videos. (MIT Technology Review)
- The arduous task of cleaning up Fukushima falls to robots (and the humans who guide them). (Vince Beiser, Wired)
- Studies confirm what we all know about who reads terms of service. (David Berreby, The Guardian)
- Oculus experiments with virtual reality as immersive theater. (Joan E. Solsman and Scott Stein, c|net)
- Fitbit continues push into health care devices by taking to the cloud. (Brian Heater, TechCrunch)
- Malaysian court sentences man over the posting of fake news. (Reuters/The Guardian)
- Could augmented reality transform the way you … play board games? (Christina Bonnington, Slate)
- Artificial intelligence unlocks the “angst-ridden teen” achievement in poetry it writes. (Dan Robitzski, Futurism)
As the blockchain avalanche continues, and ever-increasing numbers of blockchain-based patent applications seek issuance, savvy inventors and practitioners continue probing for patent-eligible space. Blockchain apps ultimately will face the same barriers as other software applications—key among them being new rules on subject matter eligibility. For those hoping to make it past such obstacles, performance-related refinements to blockchain technology may provide a safe harbor.
As developments in artificial intelligence transform the business plans (and in some cases, the very identity) of industries, they also inevitably trigger the need for those industries that serve a supporting role to adapt in response. This is certainly true of the legal profession, and it’s also a given for the insurance industry. As is so often the case in life, with enough new wrinkles, there’s usually a good bit of gray. In Artificial Intelligence: A Grayish Area for Insurance Coverage, our colleague Ashley E. Cowgill explores some of the gray areas in insurance coverage created by the continued evolution and widening application of AI.
It’s Monday, and you’re at the local coffee stand with your work buddies sipping pour-overs made from freshly roasted fair trade beans. Brad from accounting is telling everyone about the new show he just binged on Netflix. It’s a coming-of-age story set in the ’90s, and the throwback details are on point: the cool kids sport Starter jackets and Stüssy shirts; the geeks debate whether the Nintendo 64 is better than the Sony PlayStation; and the protagonist questions whether she should drink the bottle of Zima that her friend just handed to her. You interject: “Zima?! Someone should bring that back!” “Maybe we should,” says Tim from sales. “Nostalgia. It’s delicate, but potent,” adds Dan from marketing, because Dan always quotes Don Draper whenever he can, as he shows everyone a “Bring Back Zima” Facebook group. Soon you find yourself brainstorming ideas on how to get rich by bringing back dead, but not forgotten, brands. But then Matt from compliance asks, “Are we going to get sued?”
Of course, the answer is, “It depends.”
The March 23rd Consolidated Appropriations Act, 2018, contained key language to keep “wireline or mobile telephone service, Internet access service, radio and television broadcasting, cable service, [and] direct broadcast satellite service” working during natural disasters. The Act added these technology service providers to the definition of “essential service providers.” In “Broadcaster Access to Disaster Areas Becomes the Law of the Land,” colleague Scott Flick explores some key takeaways found in the Act’s pages (all 2,200 of them).
The March 21st passage of the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) has dramatically altered the rules of engagement for social media companies. The new law amends the Communications Decency Act of 1996 to clarify that it was never intended to legally protect websites that unlawfully promote and facilitate prostitution and human trafficking, or websites that should have known their platforms were being used for such activities. The bill also creates new areas of civil liability for companies and necessitates more active monitoring of online accounts. In their recent Client Alert, colleagues William M. Sullivan, Jr. and Fabio Leonardi examine the immediate and future ramifications of the bill.
Social media companies like Facebook and Twitter have written “white papers” and devoted considerable resources to projects intended to create services that encourage trust and a sense of familiarity on the part of users. Messages, photos and personal information are easily shared with groups of friends and co-workers, or in response to solicitations tailored to a user’s trusted brands, thus creating an environment of perceived safety and intimacy among users. However, this communal atmosphere can be, and often is, exploited by “black hat” hackers and malware that lurk behind a façade of trust. In its April 27, 2017 White Paper entitled “Information Operations and Facebook,” and its September 6, 2017 “An Update on Information Operations on Facebook,” the company noted that there are “three major features of online information operations that we assess have been attempted on Facebook.” Those features include: (1) targeted data collection, such as hacking or spear phishing; (2) content creation, including the creation of false personas and memes; and (3) false amplification, by creating false accounts or using bots to spread memes and false content, which, in turn, sow mistrust in political institutions and spread confusion. Ironically, these techniques, used to spread “fake news” and malware designed to amplify divisive social and political messages, are enhanced and made effective by the very environment of trust cultivated by social media sites.