Trial Outcomes & Findings for Connection to Care: Pilot Study of a Mobile Health Tool for Patients With Depression and Anxiety (NCT02497755)

NCT ID: NCT02497755

Last Updated: 2017-12-02

Results Overview

The primary acceptability outcome counted the number of patients who rated the app as easy to use and the amount of time spent using it as reasonable, assessed in a qualitative interview and via a quantitative survey. Specific survey items included "The technology requires little effort to use," "The technology was easy to learn how to use," "The Ginger.io app is easy to use," and "The time required to answer questions in the Ginger.io app is reasonable." All patients who expressed agreement ("Somewhat Agree," "Agree," or "Strongly Agree") with these items and with similar questions in the qualitative interview were counted as finding the app acceptable.

Recruitment status

COMPLETED

Study phase

NA

Target enrollment

18 participants

Primary outcome timeframe

Four weeks after intervention started

Results posted on

2017-12-02

Participant Flow

Participants were recruited from the caseload of a care manager at the UW Ravenna Neighborhood Clinic; all were enrolled in the Behavioral Health Integration Program for depression and anxiety. Recruitment letters were mailed between 6/15/2015 and 7/27/2015.

Participant milestones

Reporting groups

Patient Mobile App Users: Participants installed an app on their smartphone to add to their treatment for depression and/or anxiety. This smartphone app sent psychoeducation and reminders to patients to complete self-report data, collected passive data, and provided aggregated information to a provider dashboard. Participants used the app for at least 4 weeks, with the option to continue use for up to 12 weeks. After 4 weeks, the research coordinator conducted a phone interview on satisfaction with the app.

Care Manager Dashboard User: The care manager for the patients participating in this study used an online dashboard to monitor patient usage and responses to mobile app assessments (such as the PHQ-9) as part of her clinical workflow. She used this dashboard for the duration of participant involvement in the study (from 6/25/2015 through 9/30/2015).

Milestones (Overall Study period)

Milestone | Patient Mobile App Users | Care Manager Dashboard User
STARTED | 18 | 1
COMPLETED | 17 | 1
NOT COMPLETED | 1 | 0

Reasons for withdrawal

Reason (Overall Study period) | Patient Mobile App Users | Care Manager Dashboard User
Protocol Violation | 1 | 0

Baseline Characteristics

Connection to Care: Pilot Study of a Mobile Health Tool for Patients With Depression and Anxiety

Baseline characteristics by cohort

Reporting groups

Patient Mobile App Users (n=17 Participants): Participants installed an app on their smartphone to add to their treatment for depression and/or anxiety. This smartphone app sent psychoeducation and reminders to patients to complete self-report data, collected passive data, and provided aggregated information to a provider dashboard. Participants used the app for at least 4 weeks, with the option to continue use for up to 12 weeks. After 4 weeks, the research coordinator conducted a phone interview on satisfaction with the app.

Care Manager Dashboard User (n=1 Participant): The care manager for the patients participating in this study used an online dashboard to monitor patient usage and responses to mobile app assessments (such as the PHQ-9) as part of her clinical workflow. She used this dashboard for the duration of participant involvement in the study (from 6/25/2015 through 9/30/2015).

Total (n=18 Participants): Total of all reporting groups.

Values are numbers of participants; NA indicates the value was not reported for that group.

Measure | Patient Mobile App Users (n=17) | Care Manager Dashboard User (n=1) | Total (n=18)
Age, Categorical: <=18 years | 0 | NA | NA
Age, Categorical: Between 18 and 65 years | 16 | NA | NA
Age, Categorical: >=65 years | 1 | NA | NA
Sex: Female | 10 | NA | NA
Sex: Male | 7 | NA | NA
Ethnicity (NIH/OMB): Hispanic or Latino | 0 | NA | NA
Ethnicity (NIH/OMB): Not Hispanic or Latino | 17 | NA | NA
Ethnicity (NIH/OMB): Unknown or Not Reported | 0 | NA | NA
Race (NIH/OMB): American Indian or Alaska Native | 0 | NA | NA
Race (NIH/OMB): Asian | 0 | NA | NA
Race (NIH/OMB): Native Hawaiian or Other Pacific Islander | 0 | NA | NA
Race (NIH/OMB): Black or African American | 1 | NA | NA
Race (NIH/OMB): White | 16 | NA | NA
Race (NIH/OMB): More than one race | 0 | NA | NA
Race (NIH/OMB): Unknown or Not Reported | 0 | NA | NA
Region of Enrollment: United States | 17 | 1 | 18
Employment Status: Full time paid employment | 11 | NA | NA
Employment Status: Not working by choice | 1 | NA | NA
Employment Status: Student | 3 | NA | NA
Employment Status: Unable to work | 2 | NA | NA
Education: Bachelor's degree | 10 | NA | NA
Education: Graduate or professional degree | 3 | NA | NA
Education: High school/GED | 1 | NA | NA
Education: Some college | 3 | NA | NA

PRIMARY outcome

Timeframe: Four weeks after intervention started

Population: The Overall Number of Participants Analyzed (16) is not consistent with the number in the Participant Flow module (17) because one participant did not complete the quantitative survey used in this analysis.

This outcome counted the number of patients who rated the app as easy to use and the amount of time spent using it as reasonable, assessed in a qualitative interview and via a quantitative survey. Specific survey items included "The technology requires little effort to use," "The technology was easy to learn how to use," "The Ginger.io app is easy to use," and "The time required to answer questions in the Ginger.io app is reasonable." All patients who expressed agreement ("Somewhat Agree," "Agree," or "Strongly Agree") with these items and with similar questions in the qualitative interview were counted as finding the app acceptable.
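For illustration only, the minimal Python sketch below restates this counting rule: a respondent is counted as finding the app acceptable only if they chose an agreement response for every listed survey item. The data structure, participant IDs, and the count_acceptable helper are hypothetical and not part of the registered analysis; the qualitative-interview component of the rule is omitted.

```python
# Minimal sketch of the acceptability counting rule described above.
# The survey data structure and participant IDs are hypothetical; the
# registered analysis also drew on qualitative interviews, omitted here.

AGREEMENT = {"Somewhat Agree", "Agree", "Strongly Agree"}

ACCEPTABILITY_ITEMS = [
    "The technology requires little effort to use",
    "The technology was easy to learn how to use",
    "The Ginger.io app is easy to use",
    "The time required to answer questions in the Ginger.io app is reasonable",
]

def count_acceptable(responses):
    """Count respondents who agreed with every acceptability item."""
    return sum(
        1
        for answers in responses.values()
        if all(answers.get(item) in AGREEMENT for item in ACCEPTABILITY_ITEMS)
    )

# Example: two hypothetical respondents, one agreeing with all items.
example = {
    "P01": {item: "Agree" for item in ACCEPTABILITY_ITEMS},
    "P02": {**{item: "Strongly Agree" for item in ACCEPTABILITY_ITEMS},
            ACCEPTABILITY_ITEMS[3]: "Disagree"},
}
print(count_acceptable(example))  # -> 1
```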

Outcome measures

Patient Mobile App Users (n=16 Participants)
App Acceptability as Measured by Number of Patient App Users Who Rate App Easy to Use and Time Spent Reasonable: 16 Participants

PRIMARY outcome

Timeframe: 8-16 weeks after final patient participant is enrolled

The number of care managers who agreed that the app dashboard was easy to use and that the amount of time spent using it was reasonable, when asked in a qualitative interview about acceptability and the benefit versus burden of use in their clinical workflow.

Outcome measures

Care Manager Dashboard User (n=1 Participant)
App Acceptability as Measured by Number of Care Manager Dashboard Users Who Rate Dashboard Easy to Use and Time Spent Reasonable: 1 Participant

PRIMARY outcome

Timeframe: Four weeks after intervention started

Population: The Overall Number of Participants Analyzed (16) is not consistent with the number in the Participant Flow module (17) because one participant did not complete the quantitative survey used in this analysis.

This outcome counted the number of patients who rated the app as useful to them, assessed in a qualitative interview and via a quantitative survey. The relevant survey item was "This technology is useful." All patients who expressed agreement ("Somewhat Agree," "Agree," or "Strongly Agree") with this item and with similar questions in the qualitative interview were counted as finding the app useful.

Outcome measures

Patient Mobile App Users (n=16 Participants)
App Usefulness as Measured by Number of Patient App Users Who Rate App as Useful: 11 Participants

PRIMARY outcome

Timeframe: 8-16 weeks after final patient participant is enrolled

The number of care managers who, in a qualitative interview, expressed that the app dashboard was useful to them in their clinical workflow.

Outcome measures

Care Manager Dashboard User (n=1 Participant)
App Usefulness as Measured by Number of Care Manager Dashboard Users Who Rate Dashboard as Useful: 1 Participant

SECONDARY outcome

Timeframe: During weeks 3 and 8

Population: Only 3 participants were analyzed at the 8 week point because only 3 participants of the 17 completed the Week 8 survey.

The mean total of patient app users' responses to the modified version of the Obtrusiveness Scale for Pervasive Technology, with a mean score of 13-39 indicating that the app is generally perceived as unacceptable/obtrusive, 40-64 indicating neutrality, and 65-91 indicating that the app is generally perceived as acceptable/unobtrusive.
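For illustration only, the short Python sketch below restates the banding rule just described; the obtrusiveness_band helper and the per-participant totals are hypothetical and not taken from the study protocol.

```python
# Sketch of the score-banding rule described above for the modified
# Obtrusiveness Scale for Pervasive Technology. Per-participant totals
# below are hypothetical; item-level scoring is not reproduced here.
from statistics import mean

def obtrusiveness_band(total: float) -> str:
    """Map a scale total (or group mean) in 13-91 to the bands given above."""
    if not 13 <= total <= 91:
        raise ValueError("score outside the 13-91 scale range")
    if total <= 39:
        return "unacceptable/obtrusive"
    if total <= 64:
        return "neutral"
    return "acceptable/unobtrusive"

week3_totals = [46, 72, 90, 68, 70]          # hypothetical Week 3 totals
group_mean = mean(week3_totals)
print(round(group_mean, 2), obtrusiveness_band(group_mean))
# -> 69.2 acceptable/unobtrusive
```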

Outcome measures

Patient Mobile App Users (n=17 Participants)
Technology Acceptability as Measured by the Obtrusiveness Scale for Pervasive Technology (Modified):
Week 3: 69.06 units on a scale (interval 46.0 to 90.0)
Week 8: 80.33 units on a scale (interval 72.0 to 90.0)

SECONDARY outcome

Timeframe: Days 30, 56

Population: The Overall Number of Participants Analyzed (13) is not consistent with the number in the Participant Flow module (17) because 4 participants completed neither the Day 30 nor the Day 56 survey. Only 2 participants were analyzed at Day 56 because only 2 completed the Day 56 survey.

The mean total of patient app users' responses to the Ginger.io product feedback survey, with a mean score of 7-21 indicating satisfaction with the app, 22-34 indicating neutrality, and 35-49 indicating dissatisfaction.

Outcome measures

Patient Mobile App Users (n=13 Participants)
Patient Satisfaction as Measured by the Ginger.io Product Feedback Survey:
Day 30: 19.85 units on a scale (interval 10.0 to 28.0)
Day 56: 13.00 units on a scale (interval 13.0 to 13.0)

SECONDARY outcome

Timeframe: Eight weeks after intervention started

The mean percentage of surveys presented through the app that patient app users completed during their first 8 weeks of use. App surveys included weekly PHQ-9 and GAD-7 scales, daily measures of mood and medication use, and satisfaction surveys.
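For illustration only, the short Python sketch below shows how such a mean completion percentage could be computed; the completion_pct helper and all counts are hypothetical and not taken from the study data.

```python
# Sketch of the adherence measure described above: each participant's
# completion percentage, then the group mean. All counts are hypothetical.
from statistics import mean

def completion_pct(completed: int, presented: int) -> float:
    """Percentage of presented app surveys that the participant completed."""
    if presented <= 0:
        raise ValueError("presented must be a positive count")
    return 100.0 * completed / presented

# (completed, presented) pairs for a few hypothetical participants.
counts = [(14, 50), (49, 50), (30, 50)]
per_participant = [completion_pct(c, p) for c, p in counts]
print([round(p, 1) for p in per_participant])  # [28.0, 98.0, 60.0]
print(round(mean(per_participant), 2))         # group mean, here 62.0
```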

Outcome measures

Patient Mobile App Users (n=17 Participants)
Patient Use of the App as Measured by Percentage of App Surveys Completed: 60.24 percent of surveys completed (interval 28.0 to 98.0)

SECONDARY outcome

Timeframe: Weeks 4 and 8

Population: Only 13 participants were analyzed at the 8 week point because only 13 participants of the 17 completed the Week 8 survey.

The mean total of patient app users' responses to the 6 items in the Consumer Assessment of Healthcare Providers and Systems (CAHPS) - Communication scale, with a mean score of 0-5 indicating that participants never or almost never experienced good communication with their care team, 6-11 indicating that participants sometimes experienced good communication with their care team, 12-17 indicating that patients usually experienced good communication with their care teams, and a score of 24 indicating that patients always experienced good communication with their care team.

Outcome measures

Patient Mobile App Users (n=17 Participants)
Care Team Communication as Measured by the Consumer Assessment of Healthcare Providers and Systems (CAHPS) - Communication Scale:
Week 4: 17 units on a scale (interval 12.0 to 28.0)
Week 8: 16 units on a scale (interval 12.0 to 18.0)

SECONDARY outcome

Timeframe: Eight weeks after intervention started

The mean number of Follow-Up contacts between patient and care manager during the 8 weeks following the patient's activation of the app. (Other types of contacts with the care manager include Initial Assessment and Contact Attempt, but neither of these occurred during the time frame of app use.)

Outcome measures

Patient Mobile App Users (n=17 Participants)
Care Process Measures as Measured by the Number and Type of Contacts With Care Manager: 1.35 contacts (interval 0.0 to 3.0)

Adverse Events

Patient Mobile App Users

Serious events: 0 serious events
Other events: 0 other events
Deaths: 0 deaths

Care Manager Dashboard User

Serious events: 0 serious events
Other events: 0 other events
Deaths: 0 deaths

Serious adverse events

Adverse event data not reported

Other adverse events

Adverse event data not reported

Additional Information

Dr. Amy Bauer

University of Washington

Phone: 206.221.8385

Results disclosure agreements

  • Principal investigator is a sponsor employee
  • Publication restrictions are in place