The Washington Post printed a story about how the Campaign for a Commercial-Free Childhood has submitted the opinions of six experts on child development to the Federal Trade Commission in support of CCFC’s complaint against toy maker Fisher-Price for marketing its “Laugh and Learn” app for infants and small children.
One of the six experts, Herbert Ginsburg, writes, “Existing research suggests that infants and very young children are not cognitively ready to learn key abstract ideas about numbers. Although some children at the upper bounds of this age range might learn to parrot some number words they are highly unlikely to learn important concepts of numbers.”
To be sure, I am not a child development expert (although I did study child development in college). But I am the parent of a wonderful daughter. When she was 19 months old, I ran across a Fisher-Price online game, “The ABC Game,” which taught infants and toddlers their letters. (This was pre-tablet, so I used a laptop.) My daughter would press keys on my laptop, and up would pop a picture of the letter, a picture of an animal whose first
Yesterday the United States Court of Appeals for the Ninth Circuit ruled that a lawsuit against Google for illegal wiretapping could proceed.
The case involves Google’s Street View project which provides online access to panoramic views of public streets in cities around the world. To build the database of images, Google sent vehicles into cities to photograph public streets. At times, these vehicles also unintentionally recorded data that users were transmitting over unencrypted wireless networks. The central claim of the lawsuit is that this collection of unencrypted data from wireless networks is a violation of the Wiretap Act. Google argued that the case should be dismissed because the Wiretap Act exempts “electronic communications” that are “readily accessible to the general public.” In its ruling, the Court denied Google’s motion to dismiss.
The basic logic of the Wiretap Act is that if people do not take action to make their communications private, then they do not have an expectation of privacy. For example, if two individuals use unscrambled CB radios to have a conversation, then other radio users are not in violation of the Wiretap Act if they hear this conversation.
The sign of a civilization in decline is widespread fear of the future and longing for the past. While America may not yet be in decline, we are certainly fearing the future. Case in point: California’s decision to back off its deployment of drivers’ licenses with embedded radio-frequency identification (RFID) chips. RFID chips are small electronic devices that store a unique code and communicate with an electronic reader, usually within a range of 1 to 2 inches.
We can thank the privacy fundamentalists for this, for they love nothing better than to spread fear based on misinformation about technology. As the supposedly pro-tech Wired magazine writes, this is “spy-friendly technology.” The article claims that “Privacy advocates worry that, if more states begin embracing RFID, the licenses could become mandatory nationwide and evolve into a government-run surveillance tool to track the public’s movements.” It goes on to say, “law enforcement already monitors drivers’ whereabouts via the mass deployment of license-plate readers. But the ability to scan for identification cards in public areas could evolve into another surveillance tool.”
That sure scares me. I don’t want the
According to the latest online search trends, concerns about Internet privacy are so 2004.
If this sounds incredible to you, let me explain.
First, some background. An individual’s search queries can reveal interesting information about their interests. When this data is amassed across many individuals, it can provide insights into consumers as a whole. Google is one of the first companies to make use of its large historical database of search engine queries to try to better understand consumers. Perhaps the most famous example of this is Google Flu Trends which uses both historical and real-time data to predict the level of influenza in the population across time.
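Google has not published the exact model behind Flu Trends, but the core idea can be sketched as a simple regression: fit historical flu incidence against the normalized volume of flu-related searches, then use current search volume to estimate current flu levels before official surveillance numbers arrive. A minimal illustration of that idea (the query term and all numbers below are invented for illustration):

```python
# Minimal sketch of search-based "nowcasting": fit a linear model of
# flu incidence on the normalized volume of a flu-related search term,
# then predict current incidence from the current search volume.
# All data below are invented for illustration.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Historical data: weekly share of searches for "flu symptoms"
# (per 10,000 queries) and the reported flu incidence that week.
search_share = [2.0, 3.5, 5.0, 8.0, 6.5]
flu_incidence = [1.1, 1.8, 2.4, 3.9, 3.2]

a, b = fit_linear(search_share, flu_incidence)

# Estimate this week's flu level from this week's search volume,
# before the official surveillance report is available.
current_share = 7.0
estimate = a * current_share + b
print(round(estimate, 2))  # prints 3.41
```

The real system reportedly combined many query terms and re-validated them against CDC data, but the mechanics reduce to this: search behavior as a cheap, fast proxy for a slower official statistic.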
Google has also made a version of its database of search queries available to the public in a product called Google Trends. Google Trends shows how many searches have been performed on a particular search term relative to the total number of searches. Individuals can use Google Trends to discover the relative popularity of search terms for a particular period of time or for a particular location. Google Trends does not show total search volume, but rather normalizes the data to
This op-ed originally appeared in ComputerWorld.
Last week, Brendan Eich, the chief technology officer and senior vice president of engineering at Mozilla, announced that the organization is planning to block third-party cookies in future versions of the Firefox Web browser. In addition, the Center for Internet and Society (CIS) at Stanford Law School announced that it has created a new organization called the “Cookie Clearinghouse,” which will begin publishing blacklists for websites based on whether it believes a website’s particular usage of cookies “makes logical sense.” Mozilla will use these blacklists to decide which cookies to accept or deny.
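For readers unfamiliar with the mechanics: a “third-party” cookie is one set for a domain other than the site shown in the address bar, which is how an ad network embedded on many sites can recognize the same browser across all of them. A simplified sketch of the classification a browser performs is below; note that real browsers compare registrable domains using the Public Suffix List, which this toy version deliberately replaces with a last-two-labels approximation, and the hostnames are made up:

```python
def registrable_domain(host):
    """Toy approximation of a site's registrable domain: the last two
    labels of the hostname. Real browsers consult the Public Suffix
    List instead, which this sketch omits."""
    labels = host.lower().rstrip(".").split(".")
    return ".".join(labels[-2:])

def is_third_party(page_host, cookie_host):
    """A cookie is third-party when it belongs to a different
    registrable domain than the page the user is visiting."""
    return registrable_domain(page_host) != registrable_domain(cookie_host)

# A cookie from an ad network embedded in a news site is third-party:
print(is_third_party("www.example-news.com", "tracker.example-ads.com"))  # True
# A cookie from the news site's own subdomain is first-party:
print(is_third_party("www.example-news.com", "static.example-news.com"))  # False
```

Blanket blocking applies this test to every cookie; the Cookie Clearinghouse proposal instead layers a curated whitelist/blacklist on top of it.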
Large-scale blocking of third-party cookies may have profound negative consequences on the future of the Internet. There are three main concerns. First, this practice will result in a loss of revenue from online advertising for many websites, and thus lead to less free content available to consumers. Second, it will cut off many legitimate business models for companies that collect and aggregate user data across the Internet to understand user behavior to design better websites, content and features. Third, it will limit the functionality of websites, both
The California Senate’s Public Safety Committee passed SB 255 earlier this week to prohibit the distribution of “revenge porn”—the non-consensual distribution of intimate images of individuals. Groups like Without My Consent have been campaigning to get state laws changed to make this a crime. As the bill’s chief sponsor, State Senator Anthony Cannella, says, “People who post or text pictures that are meant to be private as a way to seek revenge are reprehensible. Right now, there is no tool for law enforcement to protect the victims.”
The legislation is fairly straightforward, so I will quote it directly:
“This bill would make it a misdemeanor for any person who, with the intent to cause substantial emotional distress or humiliation to another person, by means of an electronic communication device, and without consent of the other person, electronically distributes, publishes, emails, hyperlinks, or makes available for downloading nude images of the other person along with personal identifying information of the other person.”
This bill is a common-sense solution to an incredibly offensive activity. Indeed, the Senate’s Public Safety Committee approved the legislation unanimously. As I have argued
In the aftermath of the Boston Marathon bombing, we’ve seen a lot of discussion about the crucial role that the abundance of surveillance cameras and smartphones played in finding the suspects. The general consensus seems to be that these technologies were useful. For example, New York Mayor Michael Bloomberg said, “The Boston bombing is a terrible reminder of why we’ve made these investments—including camera technology that could help us deter an attack, or investigate and apprehend those involved.” And Chicago Mayor Rahm Emanuel similarly endorsed surveillance cameras when he said, “I will say, as I always have, because we have continued to put cameras throughout the city for security … purposes, they serve an important function for the city in providing the type of safety on a day-to-day basis—not just for big events like a marathon, but day-to-day purposes.”
Not surprisingly, privacy advocates worry that such a high-profile display of the benefits of these camera systems will lead to more public acceptance and adoption, and so they are trying to minimize the value of these systems by arguing that this was a rare event. Jeff Chester, the executive director of
I’ve written before about how well-intentioned privacy laws and regulations can come at the expense of innovation and harm consumers (see here, here, and here), but there is not a lot of data available to show the exact impact. However, a new feature on Netflix provides a good case study of the impact on innovation of at least one of these privacy rules.
First, some background. Today, Netflix announced a new feature—“Netflix Social”—which lets users share the TV shows and movies they watch on Netflix with their friends on Facebook. As Cameron Johnson, the director of product innovation at Netflix, explains on the official Netflix blog:
By default, sharing will only happen on Netflix. You’ll see what titles your friends have watched in a new “Watched by your friends” row and what they have rated four or five stars in a new “Friends’ Favorites” row. Your friends will also be able to see what you watch and rate highly… You are in control of what gets shared. You can choose not to share a specific title by clicking the “Don’t Share This” button in the player.
The past two weeks have seen two important announcements come out of the Federal Trade Commission (FTC). First, Commissioner Edith Ramirez was designated to replace outgoing Chairman Jon Leibowitz as head of the agency. Second, identity theft was reported as the top consumer complaint to the FTC for the 13th year in a row.
Why are these two announcements related? It’s simple. As Chairwoman Ramirez considers how she will lead the FTC throughout her term, it’s worth looking at where the FTC can help Americans the most, particularly in an era of limited budgets. And the most recent data from the FTC overwhelmingly shows that the top priority for the Commission should be identity theft.
The latest data on identity theft comes from the FTC’s recently released Consumer Sentinel Network Data Book for 2012. The Consumer Sentinel Network (CSN) is a database of consumer complaints received by a variety of sources including the FTC, state law enforcement agencies, state attorneys general, the FBI, the Consumer Financial Protection Bureau, the U.S. Postal Inspection Service, and the Better Business Bureau. While there are limits to how the data should be used
In my first post on the Location Privacy Protection Act of 2012, I addressed the claims that the legislation is necessary because some companies may share a user’s location data without that user’s knowledge and some companies may share location data about children without their parents’ knowledge. In this post, I will address Sen. Franken’s argument that this legislation is needed to prevent domestic violence abusers from using “stalking apps” to track their victims.
The attempt to link honest uses of location data with domestic violence is a bit disheartening. Let’s face it—nobody is against preventing domestic violence, so this framing needlessly turns a rather technical debate into a deeply emotional one. But because this claim has gotten a lot of attention, I want to dig into it a bit and address it on its merits.
First, for all the talk about this problem, the evidence that stalkers and harassers actually use stalking apps is somewhat thin. There are some notable cases of victims being tracked this way, but it is not clear how prevalent the practice is. However, given the increasing use of smartphones, I would suspect that