FACE Act Introduced

By Bradley S. Shear
September 02, 2013

The Forbidding Advertisement Through Child Exploitation Act (FACE Act) of 2013 was introduced in Congress on July 10, 2013 by U.S. Congressman John J. Duncan, Jr. (R-TN) to help protect the personal privacy of children and teens. The official title as introduced states: “To prohibit providers of social media services from using self-images uploaded by minors for commercial purposes.”

The FACE Act states: “[A] provider of a social media service may not intentionally or knowingly use for a commercial purpose a self image uploaded to such a service by a minor.” http://1.usa.gov/17Nl6Uf. The Act empowers the Federal Trade Commission (FTC) to promulgate regulations under section 553 of title 5 of the United States Code to implement the Act. This aspect of the legislation is important because it appears to give the FTC the flexibility to craft regulations that account for changes in technology.

Enforcement

To be effective, legislation should have adequate enforcement mechanisms. This bill appears to enable not only the FTC, but also state attorneys general and other state officials and agencies, to enforce the Act. According to the bill, a “state may enforce the act by bringing a civil action to: enjoin such act or practice; enforce compliance with such section or such regulation; obtain damages, restitution, or other such compensation on behalf of residents of the State; or obtain such other legal and equitable relief as the court may consider to be appropriate.”

The Act specifically states that it would not preempt states or political subdivisions of a state from enacting a law that provides minors greater personal privacy protection. At first glance, this appears to open the door to burdensome regulation of cloud providers and their clients; however, cloud computing vendors have been able to flourish despite being required to adhere to different privacy laws in each state. For example, at least 46 states, along with the District of Columbia, Guam, Puerto Rico and the Virgin Islands, have data breach notification statutes. (A list of those jurisdictions, with links to their security breach notification laws, is available from the National Conference of State Legislatures (NCSL).)

According to a GigaOM article about Gartner's Forecast Overview: Public Cloud Services, Worldwide, 2011-2016, 4Q12 Update, released earlier this year, “the U.S. is predicted to remain number one in overall cloud services deployment - by a wide margin - into 2016.” See, “Gartner: Public Cloud Services to Hit $131B by 2017.” Therefore, despite almost every state in the U.S. enacting its own data breach notification statute (whose provisions may vary widely from state to state), cloud computing providers have still been able to offer clients compliant, cost-effective solutions.

States Take Charge

While the FTC's recent updates to the Children's Online Privacy Protection Act (COPPA) Rule provide our children more privacy protections, state attorneys general, along with state officials or agencies, may be in a better position to protect the digital privacy of our children. See, “FTC Strengthens Kids' Privacy, Gives Parents Greater Control over Their Information by Amending Children's Online Privacy Protection Rule.” For example, while multiple EU data protection authorities are pursuing enforcement actions against Google because of its March 1, 2012 privacy policy change, so far the FTC has declined to do so. See, “European Data Protection Authorities Threaten to Fine Google,” The Irish Times.

In contrast, in 2012 the National Association of Attorneys General (www.naag.org) sent a letter (signed by 36 state attorneys general) expressing concern about Google's privacy policy change. See, “Attorneys General Express Concerns over Google's Privacy Policy.” In July, 23 state attorneys general signed onto a follow-up letter that stated, “[w]e are still greatly concerned about the way Google collects consumer information” and “[w]e also think more needs to be done to enable consumers to review and delete data that has been collected about them from specific Google products.” See, “Maryland Says Google Privacy Policy Doesn't Go Far Enough,” The Baltimore Sun.

In addition to the actions spearheaded by the NAAG, California Attorney General Kamala Harris has been active in protecting consumers who use mobile apps. Her office's recent report on mobile apps “provides guidance on developing strong privacy practices.” See, “Attorney General Kamala D. Harris Issues Guidance on How Mobile Apps Can Better Protect Consumer Privacy.” Harris also created the Privacy Enforcement and Protection Unit to enforce federal and state privacy laws. Other states, such as Massachusetts, have introduced legislation (H 331) that would bar cloud computing service providers that contract with K-12 schools from processing student data for commercial purposes. See, http://1.usa.gov/176bzYz.

Conclusion

Even though some state attorneys general and state lawmakers around the country are working to protect the digital privacy of children, more tools are needed to ensure that children are not exploited. The FACE Act's introduction is important because it demonstrates that legislators realize that enacting stronger digital privacy laws is not only best for society, but will also resonate with voters on Election Day.

While it may take several legislative sessions for the FACE Act to move forward due to the acrimony on Capitol Hill, it demonstrates that lawmakers still believe we have an expectation of privacy in the Digital Age. It would not surprise me if the FACE Act's introduction encourages state lawmakers to introduce similar bills in their respective legislatures around the country. Therefore, it is imperative that the cloud computing industry work with stakeholders to ensure that our children's personal digital data is not utilized for commercial purposes.


Bradley S. Shear is a lawyer in Bethesda, MD, and an Adjunct Professor at George Washington University. A member of this newsletter's Board of Editors, he practices cyber and social media law, privacy and advertising law, and copyright and trademark law. Shear advises state and federal lawmakers around the country on digital media law and public policy issues. He can be reached at www.shearlaw.com.
