How does Ellipsis Health keep sensitive data safe? 9 questions answered

From our beginnings in 2017, we have put privacy, consent, and data security at the forefront of our work. This commitment is not just a response to regulatory requirements; it’s a foundational principle that recognizes the sensitive nature of mental health data and the trust users place in Ellipsis Health.


The importance of data security in our industry cannot be overstated. Mental health conditions, by their very nature, involve deeply personal information that, if mishandled, could lead to stigma, discrimination, and a breach of personal privacy. 


In addition, there is understandable concern about the misuse of powerful artificial intelligence. That makes it all the more important for us at Ellipsis Health to demonstrate the positive impact of AI when it is deployed ethically and responsibly, and to directly address concerns about the reliability and security of our technology.


For individuals seeking support for conditions such as anxiety and depression, the assurance of confidentiality and safety is crucial. It encourages openness and honesty when discussing their mental state, enabling more accurate assessments and better care. Our evidence-backed AI health solutions exist for the specific, limited purpose of supporting mental health measurement and care, and absolutely nothing else, which is why we can guarantee respect for patient consent and confidentiality.


The battle is not over, however. We must continue doing everything we can to build public trust based not just on words, but on the actions we’ll explore in this article. 


1: How do patients interact with Ellipsis Health?

Before looking at how data is used, we’ll first outline the two main ways patients come into contact with Ellipsis Health.


  • Recorded or live calls: When we work with health plans and systems, we typically layer our technology on telehealth calls, care coordination calls, or case management calls. Healthcare providers request patient consent before recording, and Ellipsis Health accesses patient data from the health plan or system. 

  • App or device-based products: Another way we reach patients is through Ellipsis Health APIs integrated with healthcare providers’ and health plans’ platforms (a white-labeled solution). Patients enter a code, or a username and password, to gain access to the technology (a hypothetical sketch of such an integration follows this list). 
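
To make the white-labeled flow more concrete, here is a minimal sketch of what such an integration could look like. The host, endpoint paths, field names, and code-exchange step are all hypothetical assumptions for illustration; they are not Ellipsis Health’s actual API.

```python
# Hypothetical integration sketch. The host, endpoints, and field names
# below are illustrative assumptions, not Ellipsis Health's real API.
import requests

BASE_URL = "https://partner-platform.example.com"  # placeholder host

def start_patient_session(access_code: str) -> str:
    """Exchange the code a patient enters for a short-lived session token."""
    resp = requests.post(f"{BASE_URL}/v1/sessions",
                         json={"access_code": access_code})
    resp.raise_for_status()
    return resp.json()["session_token"]

def submit_voice_sample(session_token: str, audio_path: str) -> dict:
    """Upload a short voice recording for automated analysis."""
    with open(audio_path, "rb") as f:
        resp = requests.post(
            f"{BASE_URL}/v1/voice-samples",
            headers={"Authorization": f"Bearer {session_token}"},
            files={"audio": f},
        )
    resp.raise_for_status()
    return resp.json()
```

In practice, the partner’s portal or app would hold these credentials behind the scenes, so the patient only ever interacts with a familiar interface.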


2: How do we handle user consent for collecting and analyzing vocal data?

We always work with our partners to determine the most transparent, effective way to obtain patient consent. Typically, when our product is layered onto calls, the healthcare team member on the call asks for consent. With device-based products, patient interaction is through the same portal or app each time, so consent can be obtained during the onboarding process. In many cases, consent can be obtained through statements that inform the patient of our automated analysis of their voice and speech as part of regular consent to care paperwork. Our legal team is always available to help ensure that consent processes and procedures are tailored to each situation and that all relevant laws are fully complied with.


3: What data does Ellipsis Health take from the user?

Ellipsis Health works by gathering short voice recordings, which we then assess for the severity of anxiety and depression symptoms. Depending on how patients use our service, the total amount of data we receive can vary, but all interactions fall under the same strict consent and security protocols.

The two types of data we receive are as follows:


  • Voice data — the short voice recording that we analyze. Audio may be captured live during a call with a healthcare professional or taken from a recording made through an app. 

  • Patient metadata — only when available and with consent. Metadata can include age, gender, socioeconomic status, and more, but it does not include information that could identify an individual unless the healthcare organization provides it so that Ellipsis Health can help report on that individual’s assessments. We collect metadata to measure the performance of our machine-learning algorithms and to strengthen accuracy across all demographics (a rough illustration of both data types follows this list). 
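
As a rough illustration only, the two inputs above might be represented in a partner integration along these lines. The field names and types are assumptions made for this sketch, not Ellipsis Health’s actual schema.

```python
# Illustrative data shapes for the two inputs described above.
# Field names and types are assumptions, not Ellipsis Health's schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VoiceSample:
    audio_bytes: bytes           # the short recording to be analyzed
    sample_rate_hz: int = 16000  # assumed capture rate
    source: str = "app"          # e.g. "app" or "live_call"

@dataclass
class PatientMetadata:
    # Optional, consent-gated demographics used to measure model
    # performance across groups; no direct identifiers by default.
    age: Optional[int] = None
    gender: Optional[str] = None
    socioeconomic_group: Optional[str] = None
```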


4: What happens to the voice data that Ellipsis Health collects?

To achieve our main aim of screening and monitoring for symptom severity of anxiety and depression, we feed patient speech data into our deep learning models, which analyze the words spoken as well as acoustic features such as hesitations, pace, and tone. It is our policy that recordings have their Protected Health Information (PHI) and Personally Identifiable Information (PII) removed (unless our partner specifies that it must be retained). They are then encrypted and stored using AES-256 encryption to protect them from unauthorized access and use. 
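
For readers who want to see what AES-256 encryption at rest looks like in practice, here is a minimal sketch using the open-source Python cryptography package. It illustrates the primitive only; it is not Ellipsis Health’s production code, and real deployments would delegate key storage and rotation to a managed key-management service.

```python
# Minimal sketch of AES-256 encryption at rest, using the open-source
# "cryptography" package (pip install cryptography). Illustration only:
# real systems keep keys in a managed KMS, never alongside the data.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_recording(audio_bytes: bytes, key: bytes) -> bytes:
    """Encrypt a de-identified recording with AES-256-GCM."""
    nonce = os.urandom(12)                 # unique nonce per message
    ciphertext = AESGCM(key).encrypt(nonce, audio_bytes, None)
    return nonce + ciphertext              # store nonce with the blob

def decrypt_recording(blob: bytes, key: bytes) -> bytes:
    """Recover a recording for an authorized, audited workflow."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # a 256-bit key gives AES-256
```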


No human intervention is needed to process voice recordings or produce results, which adds another layer of security. However, to troubleshoot technical issues and monitor for quality, we do maintain secured workflows for human review, overseen by our CyberGovernance Committee. 


5: Can users request the deletion of their data?

We believe patient data belongs to the patient. If at any time they want to delete their data, they can reach out to begin a formal process of data deletion. 


6: Does Ellipsis Health share user data with third parties?

No. Beyond the third parties necessary to provide our services and deliver our assessments of mental health conditions, we do not share patient data with anyone.


7: What privacy and data security standards does Ellipsis Health adhere to?

For maximum privacy and security, we work with experts and counsel to ensure that we fully comply with all applicable state and federal privacy and security laws, as well as the following:


  • SOC 2, or System and Organization Controls for Trust Services Criteria. A SOC 2 Type II report certifies that specified controls not only meet or exceed the relevant trust principles but have also demonstrated operational effectiveness over a period of time.

  • GDPR, the European Union’s General Data Protection Regulation, which is one of the world’s toughest privacy and security laws. 

  • HIPAA, the Health Insurance Portability and Accountability Act of 1996, which is aimed at protecting sensitive patient health information from being disclosed.


We are currently in the process of becoming certified by HITRUST, the Health Information Trust Alliance. Like HIPAA, HITRUST certification centers on the protection of personal information, but it also draws on more than 40 other nationally and internationally recognized regulations, standards, and frameworks to ensure a comprehensive and consistent set of privacy and security controls.


8: What internal policies does the company use to ensure privacy and security?

Under the guidance of our Information Security Officer and the CyberGovernance Committee, Ellipsis Health also adheres to strict privacy and security protocols on an internal level. 


All our data is stored with trusted cloud providers such as Google Cloud and Amazon Web Services (AWS), which provide secure login processes and are highly resistant to cyberattacks. They also have a strong record against data loss, offering data availability of four nines (99.99%), which works out to less than an hour of downtime per year. 


As part of our commitment to security, we commission regular third-party evaluations and penetration tests of our systems, continually hardening our architecture against emerging attack techniques.


9: How does Ellipsis Health reduce biases in its AI models?

The US population is incredibly diverse and we feel strongly that people from all backgrounds deserve access to high-quality mental health assessment and treatment. Any database used to train AI models for anxiety and depression screening needs to represent that diversity to avoid bias. 


Since our founding in 2017, Ellipsis Health has led the field through rigorous science, ethical AI, and clinical efficacy. To build a training corpus truly reflective of the population, we focused exclusively on research and validation for years, populating our databases with recordings from hundreds of thousands of individuals. 


To ensure that our screening covers the greatest number of patients possible, the team has rigorously tested the efficacy of our algorithms, publishing peer-reviewed studies on the cross-demographic portability of our models[1], their feasibility in generally senior populations[2], and many more. 
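
As a simple illustration of the kind of check this testing involves (a generic sketch, not Ellipsis Health’s actual evaluation pipeline), one common way to surface bias is to score a model separately for each demographic group and compare the results:

```python
# Illustrative bias check: compute a model's ROC AUC per demographic
# group so that performance gaps between groups become visible.
# Generic sketch, not Ellipsis Health's evaluation pipeline.
from sklearn.metrics import roc_auc_score

def auc_by_group(y_true, y_score, groups):
    """Return the ROC AUC for each demographic group separately."""
    results = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        results[g] = roc_auc_score([y_true[i] for i in idx],
                                   [y_score[i] for i in idx])
    return results
```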


Ellipsis Health has validated its technology both technically and clinically, with approved patents for mental health assessment and many more currently pending. As a result, when we began commercializing our technology in 2021, health plans and providers immediately recognized the quantity and quality of the work done, leading to a growing list of trusted collaborations with names such as Cigna International, Ceras Health, the Mayo Clinic, and more. 


To see how you can integrate our technology into your platform, reach out to the Ellipsis Health team for a one-on-one discussion. 


[1] Rutowski, T., Shriberg, E., Harati, A., Lu, Y., Oliveira, R. and Chlebek, P. "Cross-Demographic Portability of Deep NLP-Based Depression Models," 2021 IEEE Spoken Language Technology Workshop (SLT), Shenzhen, China, 2021, pp. 1052-1057, doi: 10.1109/SLT48900.2021.9383609.


[2] Lin, D., Nazreen, T., Rutowski, T., Lu, Y., Harati, A., Shriberg, E., Chlebek, P. and Aratow, M. "Feasibility of a Machine Learning-Based Smartphone Application in Detecting Depression and Anxiety in a Generally Senior Population," Front. Psychol. 13:811517, 2022, doi: 10.3389/fpsyg.2022.811517.
