IT Standard FAQ
Sensitive Regulated Data Permitted & Restricted Uses FAQ
Here are brief answers to frequently asked questions about what categories of sensitive regulated data can or cannot be maintained in different IT environments, internal or external to U-M. Links throughout the answers will guide you to more information on Safe Computing or from other sources. If you have additional questions or need further clarification, contact the ITS Service Center.
Return to the Sensitive Regulated Data IT Standard
SPG 601.25 - Information Security Incident Reporting Policy defines an information security incident as an attempted or successful unauthorized access, use, disclosure, modification, or destruction of information; interference with information technology operation; or violation of explicit or implied acceptable usage.
SPG 601.25, Section III provides examples of incidents. In addition, a violation of this standard, whether knowingly or unknowingly, should be considered an incident.
Examples of information security incidents under this standard include:
- Using an unapproved service (e.g., a personal email account) to share sensitive regulated data;
- Uploading ePHI to Google.
No, receiving sensitive regulated data from a third party is not considered an incident (e.g., a student sends a financial aid officer an email or document containing GLBA data). However, the sensitive regulated data must then be processed and maintained in a manner that complies with the standard.
These consequences apply to U-M faculty and staff.
Not following the responsibilities enumerated in Section III of the standard would be considered a violation of the standard. Accordingly, per Section IV, you may be disciplined, up to and including termination.
In addition, per SPG 601.9 – Defense and Indemnification, you may have your institutional indemnification revoked if it is determined that you did not perform your employment responsibilities in a manner that demonstrated "good faith efforts." If your indemnification is revoked, it means that you may be individually liable if there is a legal proceeding related to the violation of the standard.
Yes, these restrictions are outlined in the Sensitive Regulated Data Standard and discussed in the "Specific Data Use Restrictions" section of this FAQ.
Otherwise, you may use Google to conduct activities that align with your role at the university, so long as you follow the Proper Use policy. For more, see the section on Specific Data Use Restrictions.
No, Google does not own the data. The data are owned by the user or the university. Google retains the data only as long as you want them to, and deletes the data when you ask them to.
Specific Data Use Restrictions in the U-M Google Environment
ePHI is individually identifiable health information, in electronic form, as defined by the Health Insurance Portability and Accountability Act of 1996 (HIPAA). HIPAA also requires a contractual arrangement (typically known as a Business Associate Agreement) be made with service providers that perform functions or activities that involve the use, storage, or disclosure of ePHI on behalf of a HIPAA-covered entity, or that provide services to such an entity.
The U-M Google Apps for Education Agreement does not include a specific business associate agreement or incorporate such language into the Agreement. Therefore, ePHI should not be collected, processed, shared or stored in the Google environment.
Export controlled research includes information that is regulated for reasons of national security, foreign policy, anti-terrorism, or non-proliferation. Regulatory requirements governed by the International Traffic in Arms Regulations (ITAR), the Export Administration Regulations (EAR), and the Office of Foreign Assets Control (OFAC) regulations include restricting access to research data to U.S. citizens and licensed foreign nationals, and storing the data within U.S. borders.
Because Google has an internationally distributed storage environment and unlicensed foreign nationals supporting their systems, you should not collect, process, share or store export controlled research data in the Google environment.
For more information, visit the Office of the Vice President for Research Information on Export Control Regulations and Restrictions.
FISMA requires federal agencies, and those providing services on their behalf, to develop, document, and implement security programs for IT systems and to store the data within U.S. borders.
Unlike Google Apps for Government, Google Apps for Education is not FISMA compliant, and U-M faculty and researchers should not collect, process, share or store FISMA data in the Google environment.
The payment card industry created the data security standards (PCI-DSS) for organizations that process, store or transmit cardholder data. The Office of the U-M Treasurer has overall responsibility for the oversight of payment card services, and is the owner of PCI compliance for the university.
The Treasurer's Office mandates that units must not store cardholder data on any university system without approval. By extension, this means that Google should not be used to collect, process, or store payment card data.
For more information, visit the Treasurer's Office PCI resources.
GLBA requires financial institutions, including higher education institutions, to safeguard sensitive data. The University Registrar is the Coordinator of the Information Security Program for customer information as defined by GLBA, and maintains the U-M GLBA Information Security Program. This program requires that service providers maintain appropriate safeguards for GLBA data and agree to a non-disclosure and security safeguard provision if they handle or have reason to possess data defined as customer information under the GLBA.
Since Google will not agree to a GLBA-specific non-disclosure and security safeguard provision, it should not be used to collect, process or store GLBA data.
For more, visit the Office of the Registrar's GLBA webpage (authorization required).
Human subject research is regulated by the Federal Policy for the Protection of Human Subjects (Common Rule). Among other requirements, the Common Rule mandates that researchers protect the privacy of subjects and maintain the confidentiality of human subject data.
The federal regulation further defines sensitive data in this context as "information...recorded in such a manner that human subjects can be identified, directly or through identifiers linked to the subjects; and any disclosure of the human subjects' responses outside the research could reasonably place the subjects at risk of criminal or civil liability or be damaging to the subjects' financial standing, employability, or reputation."
Data that specifically identifies an individual (as opposed to de-identified data or aggregate data) may be restricted from certain IT services due to lack of reasonable security controls, or the risk that the service's design makes it too easy for the identifiable data to be publicly accessed.
There is an unacceptably high risk that identifiable human subject data maintained in Google email, calendar, or other collaborative tools may be accidentally shared publicly; such use is not permitted under this standard. However, such data may be maintained in Google documents.
Student data, with the exception of directory information permitted by law, should never be made publicly accessible. Faculty and staff are reminded of their obligations to protect FERPA data and to share such data only with the student and those who have a legitimate education-related interest as defined by the U-M Registrar's Student Rights and Student Records.
Under the Google Apps for Education agreement, Google is deemed a "school official" and will comply with its obligations under FERPA. Therefore, FERPA data may be collected, processed or stored in the Google environment.
Under the U-M data classification scheme, there are many types of data that are considered sensitive but that are not necessarily prescriptively regulated. SPG 601.2 - Institutional Data Resource Management Policy defines sensitive data as data "whose unauthorized disclosure may have serious adverse effect on the University's reputation, resources, services, or individuals. Data protected under federal or state regulations or due to proprietary, ethical, or privacy considerations will typically be classified as sensitive."
Examples of sensitive unregulated data that carry proprietary, ethical, or privacy considerations include:
- Attorney-client privileged information;
- Donor information;
- High-profile/controversial research (e.g., stem cell, animal); and
- Data related to security plans and security incidents.
University guidelines state that information that is private, personal, or sensitive should not be maintained by a third-party service provider unless there is a contractual agreement between U-M and the service provider that protects the security and confidentiality of the information.
Because U-M has a contractual agreement with Google, and absent other specific requirements (e.g., contractual agreements for sponsored research, or restrictions on specific sensitive regulated data), sensitive unregulated data may be maintained in the U-M Google environment.