Serious Privacy Podcast – “Sharenting”

Sharenting – parents sharing kids’ information – should we care? Should they care? On this week’s episode of Serious Privacy, K Royal and Paul Breitbarth talk with Prof. Dr. Mariea Hoy, DeForrest Jackson Professor at the University of Tennessee, and Dr. Alexa K Fox, Assistant Professor of Marketing at The University of Akron, about their recent publication, “Smart Devices, Smart Decisions? Implications of Parents’ Sharenting for Children’s Online Privacy,” in the Journal of Public Policy & Marketing. The research explored how parents, specifically mothers, post about their children on social media and how they share their children’s personally identifiable information (PII) in a marketing context. 

In the past, sharing photos of your children was familiar but controlled: you pull out the hard copy photo of your child from your wallet, hand it to your coworker who gushes over how adorable your child looks in their school picture. “I can’t believe how much they’ve grown!” the coworker says, as they naturally hand the hard copy photo back to you, and you safely tuck it back into your wallet. The internet changed the ways in which we share information about our children. But are we putting children at risk by oversharing?

This episode of Serious Privacy explores the ramifications of “sharenting” as privacy concerns continue to grow and the brand-to-consumer relationship blurs. Stream the new episode here. Serious Privacy can be found on all major podcast players (Apple Podcasts, Spotify, Stitcher, etc.). Subscribe and review today! 

If you cannot access the article via the link provided, please contact the researchers directly. They indicated they would be happy to provide you with a copy.

Managing Employee Privacy in the Face of COVID-19

Suddenly, the world came to an almost complete standstill. What few expected to happen in these modern times of continuous global travel and interconnectedness did happen after all. COVID-19, or the coronavirus, has caused governments to close national borders, issue ‘shelter at home’ warnings, and cancel public and private group gatherings and events. Many companies have adopted policies and remote work practices requiring or allowing their employees to work from home in situations where their responsibilities can be managed off-premises. 

At TrustArc, we receive a lot of questions about the privacy implications of the COVID-19 pandemic. What are employers allowed to do to control the spread and mitigate the effects of the virus, and what additional data can they process about their employees? How do employers ensure good data protection and governance practices for employees working from home? In this blog, we address the most common challenges organizations currently face.

Health Data in the Workplace 

Even in times of crisis (perhaps particularly in times of crisis), the law still applies. This is the case for labour laws, for medical legislation, and also for privacy and data protection laws. Safeguards cannot just be thrown out of the window. That said, in many jurisdictions, the law permits organizations to process additional data to assist public health efforts by keeping employees safe and healthy, provided that certain safeguards and requirements are met. 

Guidance from the Regulators

One frequently asked question by both governments and employers relates to the collection and use of medical data, like body temperature. Earlier this week, the Executive Committee of the Global Privacy Assembly (GPA), a worldwide consortium of privacy and data protection regulators, released a statement on this issue:

“We are confident that data protection requirements will not stop the critical sharing of information to support efforts to tackle this global pandemic. The universal data protection principles in all our laws will enable the use of data in the public interest and still provide the protections the public expects. Data protection authorities stand ready to help facilitate swift and safe data sharing to fight COVID-19.”

The GPA also published a special webpage where guidance from national regulators and other authorities on how to deal with COVID-19 related data issues is posted. This guidance is not limited to specific regions or regulators but rather covers GPA members worldwide. 

What Employers Should Know

Even though we recommend you review the specific guidance available for the country where your organization operates, there are a few general rules that can be deduced from the regulator guidance on COVID-19. 

  • A distinction needs to be made between the data that governments can collect and use, the data that private entities can collect and use, and the permitted legal basis for each. Governments in general will have more room to maneuver when processing personal data in the public interest (e.g. to safeguard public health) or even to process personal data in the vital interest of an individual. Under the GDPR and various other laws, these are identified explicitly as grounds to process personal data. For private entities, collection and use of personal data in the public interest can also be possible, but there needs to be a clear, direct and demonstrable link with the public interest. 
  • When processing medical and other health data, which includes noting whether employees have been diagnosed as infected with or show symptoms of COVID-19, organizations should show restraint and process only the minimum personal data necessary to carry out their obligations related to the safety of the workforce, customers, and the public. In general, data protection and labour laws restrict the amount of detail on employee illnesses that can be registered by employers. When processing is necessary and proportional (i.e. if there is no other option but to collect data on (suspected) COVID-19 infections in the workplace), data minimization and confidentiality must be respected as a best practice. This means that as little information as possible should be collected and that this information should only be accessible to specific persons (not departments or groups) with a legitimate need to know it. For example, identifying victims of COVID-19 by name generally should not be allowed. Companies should also show restraint when processing data from visitors to their premises. There might be a good reason to measure the temperature of a visitor before allowing access, but that doesn’t mean the temperature reading or data related to whose temperature was read should be retained after the decision to grant or deny access. In many jurisdictions, the processing of medical or other health data may require an organization to complete a privacy or data protection impact assessment and implement additional procedural safeguards and security controls.
  • Whatever data is collected and used in the fight against COVID-19, organizations should be upfront and transparent about what data they process and for which reasons. Under almost all data protection regulations around the world, transparency is a key principle. Information should be accessible, easy to understand, and include the reasons why (additional) data needs to be processed.

Working from Home 

For many organizations, the coronavirus crisis is the first time they will allow large groups of employees to work from home. In addition to impacting IT resources, this requires organizations to consider a renewed approach to their data use and data protection practices. Even for organizations where employees are used to working from home, it is advisable to review and, where relevant, revise policies and procedures to ensure that personal data will remain secure at all times. This review should also include an assessment of the organizational, physical and technical risks involved in working from home and accessing systems and data remotely, as well as the security measures that may be advisable, such as using secure Wi-Fi networks and company-authorized VPNs. Though there may not be an alternative to working from home, conducting a privacy or data protection impact assessment of work-from-home processing may help identify the risks to the rights and freedoms of your employees, customers and business partners. It also allows you to identify mitigation steps that your workers at home can implement, such as specific technical and organizational measures.

We have created two top-10 lists with recommendations for both employers and employees on what to take into consideration when employees are working from home. Download the following tips:

CCPA Update: March Regulation Proposed Revisions

The California Department of Justice published yet another round of draft CCPA (California Consumer Privacy Act) regulations on March 7, 2020, with comments due March 27, 2020.

As stated in the notice, there were “around 100 comments received in response” to the previous draft regulations.

In the most recent version, the “redlined” draft is color-coded to easily identify the original draft regulations, the first set of modifications, and this second set of modifications. The redlined and clean versions are published online.

According to the rule-making process, if changes are made to the proposed regulations, the changes will be published for the public to submit comments. The Office of the Attorney General then reviews these comments and, based on them, either revises or accepts the published draft. Comments will also be responded to at the publication of the final regulations. The Office of the Attorney General previously provided guidance that if changes are “substantial and sufficiently related,” the changes will be published with an abbreviated comment period of 15 days (this modification and the last one met these requirements). If changes are not made or are “nonsubstantial and sufficiently related,” no publication for comments will occur. Only “major changes” would require a full 45-day comment period.

Some of the key changes include:

  • Removal of § 999.302, which was added in the last version and provided that an IP address that is not otherwise associated with identifying information is not personal information. No sections were added or modified in the newest version to address IP addresses.
  • Addition of § 999.305(d) clarifying that “[a] business that does not collect personal information directly from a consumer does not need to provide a notice at collection to the consumer if it does not sell the consumer’s personal information.”
  • An addition was made providing that if a business denies a consumer’s request to delete, sells personal information, and the consumer has not already made a request to opt out, the business shall ask the consumer if they would like to opt out of the sale of their personal information and shall include either the contents of, or a link to, the notice of right to opt-out in accordance with section 999.306 (§ 999.313(d)(7)).
  • Clarification that the notice provided at the collection of employment-related information does not need to contain a link to the business’s privacy policy.
  • Additional clarifications were added around information provided in response to consumers’ requests to know (§ 999.305(f)(2)), what to publish about selling minors’ data (§ 999.308(c)(9)), the description of biometric data that is to be provided where the biometric data itself cannot be provided in response to a request to know (§ 999.314(c)(4)), and descriptions of categories of sources and business purposes in the privacy policy (§ 999.308(c)(1)(e) and (f)).

Where are we now?

The comment period ends on March 27, 2020. Per guidance and history, any changes made to this version will result in publication of a new round of proposed regulations.

Once a version is reached in which no changes are made, then according to the “Information about the rulemaking process,” the Office of the Attorney General will prepare and submit the final rulemaking record to the Office of Administrative Law (“OAL”) for approval, including the summaries and responses to each public comment received. The OAL has 30 working days to determine if all of the procedural requirements are met and, if so, the regulations will be filed with the Secretary of State. 

Will enforcement start July 1, 2020?

At this time, enforcement remains slated to start on July 1, 2020. TrustArc will keep you posted on updates. To speak with a privacy expert about the California Consumer Privacy Act and how to comply, schedule a consultation today.

CCPA Update: February Regulation Proposed Revisions

The most recent revised proposed regulations to the CCPA were released on February 10, 2020.  As communicated in the “Information about the rulemaking process” issued by the Office of the Attorney General previously, if any changes were made to the proposed regulations, they would publish “another draft for more public comment” and “give the public at least 15 days (or longer, depending on the extent of the revision) to comment.” That comment period has now ended.

Prior statements by Attorney General Becerra led us to expect final regulations in January, so the timeline appears to be slipping, but how this will impact the enforcement date is unknown. Currently, there has been no indication that the enforcement date of July 1 will be pushed back at all.

Both the redlined and clean versions are published online. One of the more controversial proposed elements previously was that businesses unable to verify a request for deletion would treat that unverified request as a “Do Not Sell” request (§ 999.313(d)(1)). That has been removed along with the requirement to indicate which method of deletion was performed – deleted, de-identified, or aggregated. Another concerning proposed element was that a request for deletion had to go through a two-step process. Now, the two-step confirmation is suggested, but not required (§ 999.312(d)).

Another controversial requirement that was removed is the one requiring businesses to communicate a consumer’s opt-out of sales to any parties to whom the business sold the data in the prior 90 days (§ 999.315(f)). Under the new proposed regulations, businesses are required to process opt-outs within 15 business days, and if a sale is made during that time, the business must contact those third parties and direct them to remove the consumer’s data.

Key clarifications include the definition of “household” (§ 999.301(k)), which now “means a person or group of people who: (1) reside at the same address, (2) share a common device or the same service provided by a business, and (3) are identified by the business as sharing the same group account or unique identifier.” Previously the definition was “a person or group of people occupying a single dwelling.” The new definition better accommodates the reality of the knowledge a business may have about households.

Another key clarification came with the new section 999.302, which provides guidance regarding the interpretation of CCPA definitions. This new section of the proposed regulations provides:

Whether information is “personal information,” as that term is defined in Civil Code section 1798.140, subdivision (o), depends on whether the business maintains information in a manner that “identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.” For example, if a business collects the IP addresses of visitors to its website but does not link the IP address to any particular consumer or household, and could not reasonably link the IP address with a particular consumer or household, then the IP address would not be “personal information.”

This is welcome news to many companies as it may change the conversation around cookies. It does not end the conversation, but it does change some of the recent focus.

Other information that was added included guidance around when mobile apps should provide just-in-time notice (§ 999.305(a)(4)), accessible notices (various sections), and that a “Do Not Sell My Personal Information” link is not required in the notice at the collection of employment-related information (§ 999.305(e)).

Where are we now?

The comment period has ended, and we expect the next version will be the issuance of the final regulations. We may be surprised again with a new round of proposed regulations, but that is not expected. Next, according to the “Information about the rulemaking process,” the Office of the Attorney General will prepare and submit the final rulemaking record to the Office of Administrative Law (“OAL”) for approval, including the summaries and responses to each public comment received. The OAL has 30 working days to determine if all of the procedural requirements are met and, if so, the regulations will be filed with the Secretary of State.

Will enforcement start July 1, 2020?

At this time, enforcement remains slated to start on July 1, 2020.

To speak with a privacy expert about the California Consumer Privacy Act and how to comply, schedule a consultation today. In addition, TrustArc discusses the CCPA in its Serious Privacy podcast with Peter Stockburger, a partner at Dentons who practices in the area of data privacy.

Serious Privacy on Global Data Privacy / Data Protection Day

Privacy laws turn 50 years old this year, which makes this Global Data Privacy Day – or Global Data Protection Day – even more special.

In celebration, TrustArc is launching the Serious Privacy Podcast, because the world needs serious privacy help. The podcast, hosted by Paul Breitbarth and K Royal, will look at the topics privacy professionals are most concerned with and seek to help them maximize their time by delivering key content in different ways. As Paul and K discuss in the pilot episode, the podcast will deliver TrustArc webinars via podcast, seek to capture conference sessions, and host unscripted discussions with privacy professionals on relevant, interesting, controversial, inspiring, or exciting topics.

In this pilot, Paul and K touch on two topics – privacy turning 50 years old and insight into how they got into privacy as a profession and what keeps them here.

It seems surprising that both Europe and the United States passed their first privacy laws in the same year, 1970. Europe saw its first data protection law in the German federal state of Hessen, albeit at the regional level. Three years later, Sweden followed with its national Data Act, the first national data protection law. On the US side, the Fair Credit Reporting Act was passed in 1970, addressing a concern for “fairness, impartiality, and a respect for the consumer’s right to privacy,” and the US Privacy Act followed in 1974.

Since then, the world has joined in, with hundreds of privacy laws being passed, taking somewhat different approaches to enforcement. But one thing remains clear: it is critical that individuals have rights when it comes to their personal information and that businesses take responsibility to protect the data entrusted to them.

The huge jumps in technology and digital data and the increasing number of laws are what drove many privacy professionals to enter the field, by design or happenstance. In the first episode, Hilary Wandall, SVP, Privacy Intelligence and General Counsel, joins us to share how she entered privacy, along with the career journeys of Paul and K. As you can imagine, the paths share as many similarities as they do differences.

Listen to the pilot episode here.
Please let us know what you are interested in hearing. Email us here: Podcast@TrustArc.com

The podcast reflects what the privacy profession needs: real information, readily available, at convenient times, with honest discussion of the real topics that matter in privacy and the management of privacy programs. Really serious privacy.

Adobe Boosts Global Trust with TrustArc APEC PRP Certification

By Adobe Privacy Team 

As governments globally continue to address data privacy, organizations have seen how a privacy-by-design approach can yield significant benefits. Since the introduction of the General Data Protection Regulation (GDPR), research has shown that companies that build privacy into the foundation of their product lifecycle have seen a competitive advantage, with more positive customer satisfaction, increased trust, and even higher employee morale and revenue.

Earlier last year, we teamed up with TrustArc, which has a rich history of helping companies demonstrate compliance, to independently validate our GDPR readiness. We’re excited to build upon this and announce that Adobe is now certified under the new Asia-Pacific Economic Cooperation (APEC) Privacy Recognition for Processors (PRP) for various offerings within Adobe Experience Cloud, further demonstrating our commitment and readiness to support our global customers and partners. This puts us among a small group of leading organizations that have demonstrated the ability to support data controllers in compliance with the APEC Cross Border Privacy Rules (CBPR).

TrustArc PRP certification

The APEC Cross Border Privacy Rules (CBPR) and PRP are voluntary, enforceable, accountability-based data privacy certifications. These frameworks were endorsed by the 21 APEC Member Economies to promote accountable and responsible transfers of personal information around the globe.

While the APEC CBPR only applies to data controllers, the APEC PRP was established to provide a mechanism to help controllers identify qualified and accountable processors and help those processors demonstrate their ability to support controllers in compliance with the CBPR.

TrustArc CEO Chris Babel said, “We believe a strong data privacy management program is critical for companies to build customer trust. We reviewed and certified the privacy practices of Adobe to ensure they meet the terms of PRP participation and can demonstrate this privacy commitment to users, partners and regulators.”

Continued commitment to privacy principles

As we continue to build on our strong privacy foundation, we find it incredibly important to participate in these types of certification and validation programs to help demonstrate our compliance with globally recognized privacy standards.

Knowing privacy is top-of-mind for our customers and partners alike, we’re also listening to and anticipating their needs by introducing various services and tools as part of our Adobe Experience Platform Privacy Service and the Privacy JavaScript Library.

We are committed to privacy-by-design principles and demonstrating accountability and transparency.

Read the full post here.
