Archive for the ‘Technology’ Category

Privacy in the Age of Apps

Monday, December 31st, 2012


If you use or develop online software or smartphone “apps,” then you need to know about CalOPPA. No, that’s not some form of steam-driven musical device from an old-time carousel. It’s the California Online Privacy Protection Act, and it has very real consequences for any company that does business online. This month, the State of California sued Delta Airlines for failure to comply with CalOPPA, and the suit seeks $2,500 for EACH TIME the Fly Delta mobile app was downloaded!

To comply with CalOPPA, you need to figure out whether your online system or app collects any personally identifiable information (“PII”), such as a name, email address, physical address, telephone number, IP address, or current location, or sensitive information such as a Social Security number. Next, you have to know the target age range for your web page or app. If it’s under 13, you need to talk to an attorney ASAP. There are special rules that apply.

Next, you need a list of every party that will have access to the PII that you collect. You then need to specify how the user can control that PII. Can they view what you’ve collected, edit it, and delete it from your database? You then need a written policy, displayed to anyone from whom you collect PII, that explains what you collect, how you intend to use it, with whom you may share it, and what the user can do to view, change, or delete his own PII in your systems. You may want to review the sample policies available from the Center for Democracy and Technology, which has a very complete template, and then have your attorney review your policy after you’re done drafting it. Finally, you may wish to get certified by a third party like TRUSTe so that you can tell your users that you’re trustworthy with their PII.
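For developers, the first of these steps amounts to an inventory of the data your app collects. A minimal sketch of that audit might look like the following; the field names and PII categories here are hypothetical illustrations, not legal definitions, and a real audit should be reviewed with counsel:

```python
# Hypothetical PII audit sketch: flag collected fields that fall into
# an illustrative set of PII categories. Not a legal standard.
PII_CATEGORIES = {
    "name", "email", "physical_address", "phone",
    "ip_address", "location", "ssn",
}

# Fields an example app might collect, mapped to a category (or None).
COLLECTED_FIELDS = {
    "username": "name",
    "email": "email",
    "device_ip": "ip_address",
    "favorite_color": None,   # not PII in this sketch
}

def audit_pii(collected):
    """Return the collected fields whose category is treated as PII."""
    return sorted(f for f, cat in collected.items() if cat in PII_CATEGORIES)

if __name__ == "__main__":
    # Every field this prints should be addressed in the privacy policy.
    print(audit_pii(COLLECTED_FIELDS))
```

The output of such an inventory is exactly the list of data items your written policy must describe: what is collected, how it is used, and with whom it is shared.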

Compliance with privacy regulations also varies in other countries, but these basic steps are the minimum necessary for any developer. If you need a hand, the attorneys at Lipton, Weinberger & Husick can help to draft these kinds of policies, and others. Give them a call.

–Lawrence A. Husick, Esq.

Is Facebook Your Friend?

Wednesday, February 29th, 2012

Do you have a Facebook account? Do you realize that every time you click the “Like” button for a product, service, or website, Facebook may distribute a paid advertisement (a “sponsored story”) using your name to all of your “Friends,” suggesting that you are recommending the product or service? While Facebook is described in the popular media as a social networking site, it is, in reality, an advertising business generating its revenue through the sale of advertising. Mark Zuckerberg, Facebook’s CEO, has said: “…nothing influences people more than a recommendation from a trusted friend. A trusted referral influences people more than the best broadcast message. A trusted referral is the Holy Grail of advertising.” Facebook’s COO has said that “making your customers your marketers” is “the illusive goal we’ve been searching for.” Consequently, Facebook is able to charge a higher rate for “sponsored stories.”

Now Facebook has been sued in federal court in California over its “sponsored story” advertising practice under statutory provisions governing the right of publicity, unfair competition, and fraudulent and deceptive practices. California has a law on the books that says everyone has the right to control how his or her name, photo, likeness, and identity are used for commercial purposes, and such use may not be made without consent and a minimum ($750) payment. The complaint makes several interesting points, not the least of which is that Facebook employs a unique lexicon of doublespeak, intentionally distorting the everyday meaning of words and misleading members. Terms such as “friends,” “like,” “stories,” and “sponsor” may not mean the same to us as they do to Facebook. On Facebook, “friends” are not really limited to close or intimate associates; “like” does not necessarily imply an affinity for the site or item; “sponsors” are really advertisers paying for the ads; and “stories” are not written tales but are either items about friends’ doings or, in the case of “sponsored stories,” the advertisements generated when members click the “Like” button. In addition, plaintiffs allege that Facebook provides no avenue for opting out of the sponsored stories.

An interesting sidelight to the case is that minors are allowed to become members and have their names and images used in the advertisements without the consent of their parents or guardians. Needless to say, Facebook has mounted a vigorous legal challenge on both procedural and legal grounds to the accusations, asserting a laundry list of defenses, including: consent upon registering under Facebook’s terms; protection under the federal Communications Decency Act; First Amendment legitimate-interest protection under the Constitution; and protection under the “newsworthy exception” for reporting on the activities of famous people. (Would you believe that Facebook asserts that members are famous to their friends?)

On December 16, 2011, ruling on Facebook’s motion to dismiss the lawsuit, the Court refused to accept most of Facebook’s arguments, finding that plaintiffs had asserted valid causes of action under the law. The Court reserved the question of whether members consented to Facebook’s practices. The case is now in its discovery phase.

In an interesting twist, just last week, two of the plaintiffs asked to drop out of the case after Facebook’s lawyers demanded discovery depositions that threatened the plaintiffs with even more loss of privacy. As of this writing, the Court has not ruled on their request.

— Laurence Weinberger, Esq.

iPad Trademark Disputes Continue to Haunt Apple

Wednesday, February 29th, 2012

Trademark lawyers often enjoy following trademark disputes involving famous trademarks. If you haven’t heard about Apple Computer’s court battle over ownership rights for the “iPad” trademark in China, read on.

The Chinese owner of the “iPad” trademark is not Apple but a beleaguered video display manufacturer known as Proview. In 2001, Proview obtained rights to the “iPad” trademark in China around the time it was developing a so-called Internet Personal Access Device (hence the “IPAD” acronym), which saw only a brief light of day before proving to be a market failure. Later, in 2008, Proview fell on hard financial times when the economy went south along with two of its major customers, Polaroid and Circuit City, both of which filed for bankruptcy.

In 2009, Apple was developing its own iPad device, so it created a company in the United Kingdom called “IP Application Development Ltd.” (yet another IPAD acronym), established for the singular purpose of acquiring trademark rights to “iPad.” According to the Chinese court, a Proview subsidiary in Taiwan sold the “iPad” trademark to Apple’s UK company for $55,000.

Apple then sued Proview in China for wrongfully using the “iPad” trademark. Proview fought back, and in late 2011 the court issued its opinion rejecting Apple’s lawsuit, concluding that although Apple had purchased rights to “iPad,” there was no formal transfer of trademark rights. Apple, according to the court, purchased the trademark from Proview’s subsidiary, not from Proview itself, which was unrepresented during the negotiations between Apple and the Taiwanese company. Apple is appealing the decision.

It was widely reported that Proview would take its dispute to the United States, and on February 24th the Wall Street Journal reported that it had filed a lawsuit on February 17th in the Superior Court of the State of California in Santa Clara County, claiming that Apple had committed fraud when it used IP Application Development Ltd. to purchase the iPad trademark from Proview. What Proview hopes to gain by suing Apple on its home turf is unclear, but Apple may be eager to settle to avoid disruption of its Chinese supply chains or sales to Chinese consumers. It appears that Proview’s comeback strategy is built upon leveraging a lawsuit against the most famous technology company in the world. Reuters news service reports that “[a] Shanghai court this week threw out Proview’s request to halt iPad sales in the city. But the outcome of the broader dispute hinges on a higher court in Guangdong, which earlier ruled in Proview’s favour.”

Apple, of course, maintains that Proview refuses to honor its agreement. Undoubtedly, Apple has enough cash to make this story go away, and we suspect that’s just what will happen. The story also reinforces the well-established opinion that enforcing intellectual property ownership rights in China may be problematic.

–Adam G. Garson, Esq.

When Is Computer Software An Unpatentable Mental Process?

Friday, August 26th, 2011

On August 16, 2011, the Federal Circuit Court of Appeals in CyberSource v. Retail Decisions concluded that a claim to a method for detecting Internet fraud was not patentable. The court also concluded that a claim to computer memory storing software to implement the method was likewise not patentable. The Federal Circuit treated the claim to computer memory as no different from the unpatentable method claim.

Under this and other decisions, if a method of doing something can be performed as a mental process entirely within a person’s head, the method is “abstract” and not patentable, regardless of how valuable, useful, novel, or unobvious the method may be. A computer programmed to implement the unpatentable method also is not patentable.

Does this mean that your computer-implemented invention is an abstract mental process and unpatentable? If your invention requires specific systems and hardware, such as the Internet, a GPS receiver, or a computer capable of comparing an image pixel-by-pixel to a noise mask, then your invention should pass muster. If your invention is one that could be performed entirely in a person’s head, then talk to us. We can help avoid the effects of the CyberSource decision.

–Robert Yarbrough, Esq.

Elementary, My Dear Watson

Monday, February 28th, 2011

On February 14–16, 2011, we watched “Watson,” an IBM system of computers the size of a two-car garage, compete against the all-time best champions on the game show Jeopardy!®. On subjects from diseases to characters in Beatles® songs, the computer was impressive, pressing the buzzer faster than its human opponents and getting so many answers correct that its occasional gaffe seemed like a deliberate act designed to preserve the hope and dignity of our species. The technology was impressive! In just a few years, systems like Watson will fit in our mobile phones (making today’s “smart” phones seem like stone tools). We will come to rely on their ability to parse language and retrieve factual information as a way to extend our own memories, just as we now have electronic organizers that help us to remember phone numbers and appointments.

It is clear that we will soon see uses of Watson-type systems that will greatly advance how easily we perform a number of common tasks. For instance, Watson interprets the “questions” (answers, actually) on Jeopardy! with very powerful algorithms that seem to understand human language. This part of the system will find immediate employment in technical and customer support systems, where just understanding the question and figuring out who (or what) should provide assistance is more than half the battle. Watson’s ability to quickly search a large and diverse database, and to weigh the probable correct answers will also prove valuable as a decision tool. Just having the top three probable responses to choose among in difficult situations will help us in areas as different as picking a mortgage, routing air traffic, and diagnosing diseases.

Looking beneath the “hood,” however, the real-world flaws in Watson are readily apparent to those with a background in computer databases, search, and natural language processing (ok, well, to me, at least, since these fields were central to my system designs for my first company, Infonautics, and its product, “Homework Helper,” way back in 1990). First, Watson relies on a professionally constructed database of information. Want to beat Watson at Jeopardy!? Just make it search the Internet in addition to its own databanks. The mass of conflicting data, divergent views, outright lies, and incomplete references in which we live our lives is still well beyond the capability of even our smartest machines. Watson may be able to compute a clue, but when it comes to humor, sarcasm, innuendo, and propaganda, Watson is clueless. Next, show Watson a painting as a clue, and ask not who painted it, and when, but what it means, or what feeling it evokes. To humans, the Mona Lisa’s smile is inscrutable. To Watson, every work of art is just a catalog entry.

All of this is not a criticism of Watson or its creators. It is, however, a criticism of those who conclude that when it comes to intelligence, if Watson can beat Ken Jennings and Brad Rutter, it is “game over” for our species, and we are weeks away from SkyNet, the malevolent computer network of the Terminator movies that takes over the world. In a recent article in the National Law Journal, Robert C. Weber, IBM’s general counsel, wrote about using Watson’s technology to aid in the courtroom: “If a witness says something that doesn’t seem credible, you can have an associate check it for accuracy on the spot.” Lawyers already know that human memory is faulty, and that eyewitness testimony is unreliable. Juries don’t. How will juries respond when attorneys can point out every flaw, every miscue in a witness narrative? This is not a question of whether our legal system should use technology – it does, and it should, to provide the best quality of counsel to our clients. Mr. Weber continued, “Deep QA [Watson’s programming to understand questions] won’t ever replace attorneys; after all, the essence of good lawyering is mature and sound reasoning, and there’s simply no way a machine can match the knowledge and ability to reason of a smart, well-educated and deeply experienced human being. But the technology can unquestionably extend our capabilities and help us perform better.”

The question for our society is how we value those things that Watson cannot compute and factor them into our resolution of disputes – dignity, respect, integrity, altruism. Until we can add those to Watson’s programming, the humans in the system need to value and uphold the human values that so often get lost in the law.

— Lawrence A. Husick, Esq.