The WSJ reported on November 22, 2013, on Google’s push to move Google Glass, a computerized device with an “optical head-mounted display,” into the mainstream by tapping the prescription eyewear market through VSP Global, a nationwide vision benefits provider and maker of frames and lenses. If the speed and immersion of technology over the past few years have shown us anything, it is that it will not be long before employees are donning Google Glass on the job, putting yet another twist on technology’s impact on the workplace.

Employers continue to adjust to the influx of personal smartphones in the workplace, many adopting “Bring Your Own Device” (BYOD) strategies and policies. These technologies have no doubt been beneficial to businesses and workplaces around the globe. The introduction of Google Glass into the workplace may have similar benefits, but the technology also could amplify many of the same challenges posed by other personal devices, and create new ones.

For example, employers may experience productivity losses as employees focus on their Glass eye piece and not their managers, co-workers, and customers. Likewise, some businesses will need to consider whether Google Glass may contribute to a lack of attention to tasks that can create significant safety risks for workers and customers, such as for employees who drive or use machinery as a regular part of their jobs.

A popular feature of Google Glass is the ability to record audio and video. Smartphones and other devices do this already, but recording with Glass seems much easier and may become less obvious over time as we grow accustomed to seeing people wearing the device. Of course, recording activities and conversations in the workplace raises a number of issues. In healthcare, for instance, employees might capture protected health information with their devices, but potentially without the proper protections under HIPAA. Conversations recorded without the consent of the appropriate parties can violate the law in a number of states. Employees with regular access to sensitive financial information could easily capture a wealth of personal data, raising yet another data privacy and security risk.

The capturing of data on Glass, even if not collected, used, or safeguarded improperly, will add to the challenges businesses face in avoiding spoliation of data stored in these additional repositories of potentially relevant evidence.

Only time and experience will tell what the impact of Google Glass will be in the workplace. However, as companies continue to adapt to present technologies, they should be keeping an eye on the inevitable presence of such new technologies, and avoid being caught without a strategy for reducing risks and avoidable litigation.

If your cloud service provider sounds like your local weather reporter – partly cloudy with a chance of rain – you may be in for a data security storm. A USA Today guest essay by Rajiv Gupta highlights the need for a multi-layered approach for cloud providers to ensure data stored in the cloud is secure, something we’ve touched upon here before. Businesses need greater certainty concerning the security of their data in the cloud and should be pressing their cloud providers for a security forecast with more certainty than their local weather report.

As Mr. Gupta notes, “by 2020 nearly 40% of the information in the digital universe will be touched by cloud computing providers.” Many businesses recognize this trend and may already have business data and applications in the cloud. However, some may not realize that a portion of their data has reached the cloud without authorization, and without the company having had an opportunity to vet the provider(s). For example, it has been found that as many as 1 in 5 employees use commercial cloud providers to store company information.

Mr. Gupta discusses a number of tactics cloud providers should employ to secure data in the cloud – encryption, contextual access control, data loss prevention technologies, audit trails, and enforcement of security policies from application to application. Good advice for cloud providers. But customers of the cloud need to think a little differently.
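To make one of those tactics, audit trails, a bit more concrete, here is a minimal sketch (our illustration, not drawn from Mr. Gupta’s essay) of a tamper-evident access log in which each entry is chained to the previous one with an HMAC digest, so that altering any record invalidates everything after it. The key, field names, and sample actions are all hypothetical:

```python
import hmac
import hashlib
import json

# Hypothetical signing key; in practice this would come from a managed key store.
SECRET_KEY = b"replace-with-a-managed-key"

def log_entry(user, action, prev_digest):
    """Create one audit record chained to the previous record's digest."""
    record = json.dumps(
        {"user": user, "action": action, "prev": prev_digest},
        sort_keys=True,
    )
    digest = hmac.new(SECRET_KEY, record.encode(), hashlib.sha256).hexdigest()
    return record, digest

def verify(record, digest):
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(SECRET_KEY, record.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, digest)

# Chaining entries: tampering with an earlier record breaks later digests too.
rec1, d1 = log_entry("alice", "download:report.pdf", "")
rec2, d2 = log_entry("bob", "delete:old.csv", d1)
```

A provider (or customer) reviewing such a log can detect after-the-fact edits, which is the point of the “audit trail” layer: the log itself becomes evidence that can be trusted, or shown not to be.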

Purchasers of cloud data storage services need to have a sense of the multiple layers of security tactics that are recommended for cloud providers and see to it that their provider(s) have them in place. But they also need to be thinking about:

  • What protections does their company have if the cloud provider’s systems are breached?
  • Does the services agreement with the cloud provider adequately address security, data breach, indemnity, reporting and so on?
  • What policies do they have for their employees concerning the privacy, security, integrity and accessibility of company data when using the cloud? And, which cloud should they be using?
  • How would employees’ use of their personal commercial cloud services complicate a company’s litigation hold processes?
  • Who at the company and at the cloud provider has/should have access to the data?
  • Is the cloud service provider a business associate/subcontractor under HIPAA, prepared to comply with the HITECH Act? What about the agreement requirements under state law?
  • Where is the data stored? Is it in the United States, or in a foreign country subject to different data security standards?
  • What if the cloud goes down, out of business? Will company data and applications be accessible?
  • Are the businesses’ customers and clients on board with use of the cloud for their data?

These are just some of the key questions businesses should be asking about use of the cloud. The technology can indeed yield substantial cost savings, but the failure to think carefully about its adoption and implementation can create substantial exposure for the company.

According to testimony before the House Committee on Science, Space, and Technology and warnings from IT security experts, individuals using the federal government’s website to obtain health coverage through the Exchange are likely putting the security of their sensitive personal information at significant risk. Reports about the cost of the federal website vary, but based on those reports, it is safe to say that the cost to date is tens of millions of dollars, and growing.

Politics aside, most companies spend far less on their websites, whether those sites are directed at customers, the public generally, employees and applicants, or all of the above. These companies might be asking: if the United States government spends tens of millions of dollars on a website that may wind up being inadequate to secure sensitive personal information, have we done enough to secure our own sites? Many of these same companies use third party vendors to provide web-based services to their employees and customers, and may be wondering whether those vendors have appropriate security measures in place.

These are important questions that relate not only to the technical data security measures in place for a site, but also to what is stated in the site’s privacy policies and terms of use about the security of the data collected there. The appropriate level of security will vary, for sure, company to company, industry to industry, function to function, and so on. But the level of website security, what is said about that level of security, and addressing related exposures should be a priority for any company’s risk management team, and not left solely to the IT department.

The New York Times published an interesting front page article by Somini Sengupta on October 31, 2013 about the growing trend of state legislative action on privacy issues, noting that over two dozen privacy laws have passed this year in more than 10 states. The piece also notes that the “patchwork of rules across the country” is a burden on companies, which must “keep a close eye on evolving laws to avoid overstepping.”  The proliferation of state laws is a result of citizen concerns about privacy combined with Congressional gridlock. Some of the laws described in the piece include three online privacy laws passed in California just this year, “one gives children the right to erase social media posts, another makes it a misdemeanor to publish identifiable nude photos online without the subject’s permission, and a third requires companies to tell consumers whether they abide by ‘do not track’ signals on web browsers.”  The article is a good summary of the state of U.S. privacy regulation today. Expect more state legislative action in the future.

The Florida Senate is considering joining the growing number of states that have banned employers from requesting or requiring access to current or prospective employees’ social media accounts.

Senate Bill 198, entitled “An Act Relating to Social Media Privacy,” would prohibit employers from requiring or requesting access to employee or applicant social media accounts, and from taking any retaliatory personnel action, or refusing to hire, based on an employee’s or applicant’s failure or refusal to provide access to the account. The proposed law would allow an employee or prospective employee to file a civil suit for injunctive relief and damages, including the recovery of attorney’s fees and costs should the employee or applicant prevail against the employer.

Several other states have already enacted similar laws, including Arkansas, California, Colorado, Illinois, Maryland, Michigan, Nevada, New Jersey, New Mexico, Oregon, Utah, Vermont, and Washington.

Employers in Florida and across the country will need to revisit some of the internal hiring, human resources, and monitoring practices they may be following, in particular, those of lower level managers and supervisors who may not be aware of these developments. Companies also need to reconsider what role they want employees to play in the businesses’ marketing strategies in social media.

Since we last published our social media white paper in 2010, a lot has happened!

Three opinions from the National Labor Relations Board’s Acting General Counsel, the passage of state laws prohibiting employers from requesting social media account passwords, and industry specific guidance affecting certain businesses and professions are examples of some of these developments.

We hope our updated Special Report (CLICK HERE), which covers many of these key developments, is helpful to your organization. 

Since it was enacted in 2008, plaintiffs suing under the Genetic Information Nondiscrimination Act ("GINA"), 42 U.S.C. Section 2000ff et seq., have not had much success. Most cases have been dismissed at an early stage.  As reported on our Disability, Leave and Health Management Blog, however, this summer the U.S. Equal Employment Opportunity Commission ("EEOC") burst on the scene with its first two lawsuits under GINA.  These two cases provide a simple but important lesson for employers: Always use the safe harbor language when requesting medical information from a health care provider, especially when arranging for a post-offer pre-employment physical examination of a new hire.

In EEOC v. Fabricut, Inc., No. 13-CV-248 (CVE/PJC), (N. D. Ok. 2013), the EEOC settled a lawsuit brought under both GINA and the Americans with Disabilities Act (ADA) for $50,000 and a consent decree. In that case, Fabricut made a job offer to a candidate and sent her to a clinic for a pre-employment physical examination. When she reported for her physical she was asked to fill out a medical history questionnaire which included standard questions about her family medical history. She was determined to have carpal tunnel syndrome, which has nothing to do with her genetic or family history, and was not hired. The EEOC took the position that GINA prohibited the employer from asking about the candidate’s family medical history, even through its contracted third-party clinic.

In EEOC v. The Founders Pavilion, Inc., No. 6:13-CV-06250 (W.D.N.Y. 2013), the EEOC filed its first class action against an employer under GINA for similar alleged violations, namely requiring post-offer, pre-employment medical examinations at which the person was asked to provide family medical history as part of the exam.

It may make sense as a matter of medical science for doctors to obtain family medical history information, especially for treatment.  In pre-employment physicals, however, science clashes with the law. Human Resources must be vigilant about instructing medical providers to comply with GINA, because the medical providers will not do so on their own.

The easy solution to this technical compliance matter is provided in proposed "safe harbor" language set forth at 29 C.F.R. Section 1635.8(b)(1)(i)(B) as follows:

The Genetic Information Nondiscrimination Act of 2008 (GINA) prohibits employers and other entities covered by GINA Title II from requesting or requiring genetic information of an individual or family member of the individual, except as specifically allowed by this law. To comply with this law, we are asking that you not provide any genetic information when responding to this request for medical information. “Genetic information,” as defined by GINA, includes an individual’s family medical history, the results of an individual’s or family member’s genetic tests, the fact that an individual or an individual’s family member sought or received genetic services, and genetic information of a fetus carried by an individual or an individual’s family member or an embryo lawfully held by an individual or family member receiving assistive reproductive services.

Human Resources professionals scheduling medical examinations for new hires should keep in mind: Don’t forget the safe harbor language or you may get a visit from the EEOC.

The Driver’s Privacy Protection Act ("DPPA"), 18 U.S.C. Section 2721, et seq., was enacted by Congress in 1994 after the highly-publicized murder of actress Rebecca Schaeffer by a stalker who obtained her unlisted address from the California Department of Motor Vehicles ("DMV"). The Act restricts state DMVs from disclosing personal information contained in motor vehicle records except for specific governmental and business purposes. In addition, 18 U.S.C. Section 2722(a) makes it "unlawful for any person knowingly to obtain or disclose personal information from a motor vehicle record" for any use not permitted under the Act.

In January of this year, the Minnesota Department of Natural Resources ("DNR") sent a letter to more than 5,000 individuals stating that it had discovered that one of its former employees, John Hunt, had improperly accessed their motor vehicle record data approximately nineteen thousand times. Hunt is no longer employed by the Minnesota DNR.

Attorneys for some of the recipients of the breach notification letter filed a total of five class action lawsuits under the DPPA, which were consolidated in U.S. District Court for the District of Minnesota under the caption Kiminski, et al v. Hunt, et al, No. 13-185.  Plaintiffs named a number of supervisors and commissioners of the DNR and the Minnesota Department of Public Safety as defendants in their personal capacities, along with Hunt. In addition to claims under the DPPA, plaintiffs asserted claims under 42 U.S.C. Section 1983, a catch-all cause of action allowing claims against state actors for denying someone their rights under a law or the Constitution.

On September 20, 2013, District Judge Joan N. Ericksen issued an order granting a motion to dismiss all of the state-affiliated employees, leaving only Hunt himself as a defendant. Judge Ericksen held that plaintiffs had not stated a cause of action as to the dismissed defendants because none of them obtained or disclosed information for improper purposes, even though Hunt allegedly did so under their watch. The court also dismissed the Section 1983 claim, interpreting the DPPA as providing an express private means of redress that precludes a more expansive remedy under Section 1983.

Minnesota has been the land of 10,000 privacy leaks lately, as the State grapples with negative publicity from the disclosure that an employee of the state’s new on-line health insurance exchange, MNsure, accidentally distributed confidential information, including Social Security numbers, of insurance agents who had participated in training on the system. State officials are concerned that the leak will erode the public’s confidence in the system which is scheduled to go live in October. Minnesota’s Legislative Auditor is currently investigating MNsure’s data security practices. 

As the September 23, 2013 compliance date for the final Omnibus HIPAA privacy and security rule looms, the Office for Civil Rights and the Office of the National Coordinator for Health Information Technology have lent covered entities a helping hand by publishing model Notices of Privacy Practices (NPPs) for health care providers and health plans. The Omnibus Rule implements a number of changes required under HITECH (see webinar outlining those changes), including "material" changes to NPPs.

The model NPPs reflect these changes and are designed to help covered entities meet their obligation to develop and distribute clear, user friendly notices. The agencies also provided optional formats for the NPPs:

  • Notice in the form of a booklet;
  • A layered notice that presents a summary of the information on the first page, followed by the full content on the following pages;
  • A notice with the design elements found in the booklet, but formatted for full page presentation; and
  • A text only version of the notice.

Note to covered entities: The agencies state that the model NPPs reflect the regulatory changes of the Omnibus Rule, and can serve as a baseline for compliance. Covered entities will still have to tailor the notices to their particular circumstances and insert information specific to their organizations. In addition, covered entities should review the rules for how and when notices need to be provided. See 45 CFR 164.520. For example, NPPs generally may be provided by email if the recipient has agreed to electronic notice. Also, if a covered entity maintains a website about its customer services or benefits, it must prominently post the NPP on that site.

California law soon may require commercial websites that collect personal data to disclose how they respond to “Do Not Track” signals from Web browsers. AB 370, an amendment to the California Online Privacy Protection Act (Act), which was sponsored by Attorney General Kamala Harris, passed the California Senate and Assembly at the end of August. Governor Jerry Brown is expected to sign the amendment soon.

The bill does not prohibit tracking but instead requires a website operator to disclose its tracking practices in the privacy policy posted on the website. Under the Act, if a website fails to clearly set forth its disclosure practice in its privacy policy, it will be given a warning and 30 days to come into compliance.

If the bill is signed into law, many businesses will need to update the privacy policies for the websites they operate to address their tracking practices. Specifically, although the bill does not set a standard for how a website must respond to Do Not Track browser signals, it would require websites to disclose whether they honor or ignore those signals. Of course, this also would be a good opportunity to revisit website policies to ensure they accurately reflect company operations concerning the handling of personal information captured from the site.
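For context, a “Do Not Track” signal is not a legal filing but a simple HTTP request header, DNT: 1, that participating browsers attach to each request. A site operator deciding what its privacy policy should say can start by checking whether its server even looks at that header. The sketch below is our own illustration; the function name and sample headers are hypothetical:

```python
# Illustrative sketch: detecting a "Do Not Track" browser signal.
# Browsers with DNT enabled send the header "DNT: 1" on each HTTP request.

def honors_do_not_track(headers):
    """Return True if the request carries an enabled DNT signal."""
    return headers.get("DNT") == "1"

# A server could branch on the signal and, consistent with its posted
# privacy policy, suppress tracking cookies; note AB 370 itself only
# requires disclosure of how the site responds, not a particular response.
request_headers = {"DNT": "1", "User-Agent": "ExampleBrowser/1.0"}
tracking_enabled = not honors_do_not_track(request_headers)
```

Whatever a site decides, the point under AB 370 is that the decision, honor or ignore, must be disclosed in the posted privacy policy.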