SIEDS Presentation 4-26
TRANSCRIPT
1
Optimizing Multi-Channel Health Information Delivery for Behavioral Change
Sponsor: Locus Health
Michael Buhl, James Famulare, Chris Glazier, Jennifer Harris, Alan McDowell, Greg Waldrip
Advisors: Laura E. Barnes and Matthew Gerber
University of Virginia, Department of Systems and Information Engineering
2
Executive Summary
Implemented and tested content personalization methods for third-party software with a UVA student population
• Machine learning methods outperformed random selection, but the experiment lacked the power to determine the best personalization method
• A post-hoc survey indicated that daily reminders increased system interaction (97%) but did not support behavioral change (11%)
Recommend expanding the study and the system's functionality
3
Benefits of Telehealth
• Reduced readmission rates by 51% for heart failure and 44% for other illnesses (Veterans Health Administration) [1]
• No difference in efficacy between virtual and in-person care in a study of over 8,000 patients [2]
• Estimated return of $3.30 for every $1 spent on telecare (Geisinger Health Plan) [1]
4
Objectives Tree
To maximize patient health through the telehealth system
• To decrease 30-day hospital readmission rates
• To maximize patient engagement
5
Behavioral Change Support System Design
SMS or email → respond to information or question cards (random content) → receive awards → interaction data recorded
7
To maximize patient engagement
• To maximize frequency of page visits
  • Consecutive Login Rate: % of days consecutively logged in
  • Open Email Rate: % of total emails opened
  • Dwell Time: total time spent on cards
• To maximize effectiveness of cards
9
To maximize effectiveness of cards
• To maximize the effectiveness of information cards
  • Response Rate: % of total cards responded to
• To maximize the effectiveness of questionnaire cards
11
To maximize the effectiveness of questionnaire cards
• To increase medical usage rates
  • Daily Habit Rate: % of healthy card responses
• To increase other healthy habit rates
  • Daily Habit Rate: % of healthy card responses
• To maximize the number of card interactions
  • Response Rate: % of total cards responded to
12
Literature Review
Morrison's (2015) psychology theory research suggests targeting content to improve digital health behavioral systems [3]
Recommender systems and regression analysis personalize content:
• Recommender systems use inferred ratings of viewed content to estimate ratings of unviewed content [4]
• Ratings are based on user characteristics (collaborative filtering) or content characteristics (content-based filtering) [4]
• Regression analysis identifies statistically significant correlations between demographic information, internet behaviors, contextual data, and metrics [5]
13
Behavioral Change Support System Design
SMS or email → respond to information or question cards → receive awards → interaction data recorded → data exported → content strategies determined → strategies updated in system → targeted content
14
Experimental Design: Student Exercise Study
Surveyed systems engineering student population
Week 1: Study group, randomized content (n = 44)
Week 2:
• Control group: randomized content (n = 15)
• Regression group: targeted content (n = 14)
• Collaborative filtering group: targeted content (n = 15)
15
Preliminary Results: Week 1
• Over half of participants responded to content (58.4%)
• A majority of users logged in on consecutive days (64.9%)
• Most users opened daily email reminders to access the system (71.5%)
• Participants spent an average of 32.5 seconds in the system per 5 cards
Data informed Week 2 content targeting strategies
16
Models: Regression
Regressed 5 metrics on 17 predictors based on user and card characteristics, looking for statistically significant and actionable predictors
• Regression indicated a significant negative correlation between response rate and information cards
• We removed fact-based and non-fact-based information cards to test these results over Week 2 for the Regression group
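The regression step above can be sketched as follows. This is a minimal illustration, not the project's code: the data are synthetic, built so that an information-card indicator lowers a response-rate metric, and the fit is plain ordinary least squares.

```python
# Illustrative sketch (not the project's actual code): regress a response-rate
# metric on a card-type indicator and inspect the estimated effect.
import numpy as np

rng = np.random.default_rng(0)

n = 200
is_information_card = rng.integers(0, 2, size=n)   # 1 = information card
noise = rng.normal(0, 0.05, size=n)
# Synthetic data constructed so information cards lower the response rate,
# mirroring the direction of the Week 1 finding.
response_rate = 0.6 - 0.15 * is_information_card + noise

# Ordinary least squares: response_rate ~ intercept + is_information_card
X = np.column_stack([np.ones(n), is_information_card])
beta, *_ = np.linalg.lstsq(X, response_rate, rcond=None)

intercept, slope = beta
print(round(intercept, 2), round(slope, 2))  # slope comes out negative
```

In the real study this would be accompanied by a significance test on each coefficient; only significant, actionable predictors informed the Week 2 strategy.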
17
Models: Collaborative Filtering

       Card1  Card2  Card3  Card4  Card5
User1    2      3      0      2      ?
User2    ?      3      1      2      3
User3    1      2      3      2      ?

Card score determined implicitly by a set of rules that rewards users for card interactions:
• First-week average card score: 1.88 out of a max of 3
18
Models: Collaborative Filtering
Goal: fill in the blanks

       Card1  Card2  Card3  Card4  Card5
User1    2      3      0      2      ?
User2    ?      3      1      2      3
User3    1      2      3      2      ?

Normalized ratings (each user's mean subtracted) are used to calculate user similarity:

       Card1  Card2  Card3  Card4  Card5
User1   .25   1.25  -1.75   .25     ?
User2    ?    .75   -1.25  -.25    .75
User3   -1     0      1      0      ?

Sim(1, 2) = .95   Sim(1, 3) = -.57   Sim(2, 3) = -.76
19
Models: Collaborative Filtering
Predicted scores fill in the blanks (User1 Card5 = 2.2, User2 Card1 = 2.8, User3 Card5 = 2):

       Card1  Card2  Card3  Card4  Card5
User1    2      3      0      2     2.2
User2   2.8     3      1      2      3
User3    1      2      3      2      2

Sim(1, 2) = .95   Sim(1, 3) = -.57   Sim(2, 3) = -.76
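The normalization and similarity steps can be sketched as below. The slides do not state the exact similarity measure, so this sketch assumes cosine similarity over the cards both users rated, computed on mean-centered ratings; with the example ratings it produces values close to, though not exactly matching, the Sim values shown.

```python
# Sketch (assumed measure): mean-center each user's ratings, then take cosine
# similarity over commonly rated cards. Ratings are the slide's example.
import math

# Observed card scores; None marks an unrated card.
ratings = {
    "User1": [2, 3, 0, 2, None],
    "User2": [None, 3, 1, 2, 3],
    "User3": [1, 2, 3, 2, None],
}

def normalize(row):
    """Subtract the user's mean rating from each rated card."""
    rated = [r for r in row if r is not None]
    mean = sum(rated) / len(rated)
    return [None if r is None else r - mean for r in row]

centered = {u: normalize(row) for u, row in ratings.items()}

def cosine(a, b):
    """Cosine similarity over the cards both users rated."""
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    dot = sum(x * y for x, y in pairs)
    na = math.sqrt(sum(x * x for x, _ in pairs))
    nb = math.sqrt(sum(y * y for _, y in pairs))
    return dot / (na * nb)

sim_12 = cosine(centered["User1"], centered["User2"])
sim_13 = cosine(centered["User1"], centered["User3"])
sim_23 = cosine(centered["User2"], centered["User3"])
print(round(sim_12, 2), round(sim_13, 2), round(sim_23, 2))
```

As on the slide, Users 1 and 2 come out strongly similar while User 3 is dissimilar to both, which is what drives the blank-filling predictions.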
20
Models: Collaborative Filtering
Sort cards by score and create a top-N ranking; randomly draw X cards from that list every day.

User1 ranking: Card2 (3), Card5 (2.2), Card1 (2), Card4 (2), Card3 (0)
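The ranking-and-draw step above can be sketched as follows, using User1's scores from the example. N and X here are illustrative choices (the study itself drew 5 cards from a top-10 list, per the appendix).

```python
# Sketch of the ranking step: sort predicted card scores, keep the top N,
# then draw X cards at random from that list each day.
import random

# User1's card scores from the slide example.
scores = {"Card1": 2, "Card2": 3, "Card3": 0, "Card4": 2, "Card5": 2.2}

N, X = 4, 2                     # illustrative values for this small example
top_n = sorted(scores, key=scores.get, reverse=True)[:N]

random.seed(42)                 # fixed seed so the sketch is reproducible
todays_cards = random.sample(top_n, X)
print(top_n)                    # ['Card2', 'Card5', 'Card1', 'Card4']
print(todays_cards)
```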
21
Models: Collaborative Filtering
• We evaluated three collaborative filtering methods, each using a different approach to infer unknown card scores
• Based on cross-validation, item-based collaborative filtering provided the lowest RMSE
• These results provided the basis for the collaborative filtering user group in Week 2 testing
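The model-selection step amounts to: hold out known ratings, predict them with each candidate, and keep the candidate with the lowest RMSE. The two "predictors" below are toy stand-ins (a global-mean and a user-mean predictor), not the project's actual filtering methods.

```python
# Toy model-selection sketch: choose the predictor with the lowest RMSE on
# held-out ratings. Candidate predictors are illustrative stand-ins only.
import math

ratings = {("User1", "Card1"): 2, ("User1", "Card2"): 3, ("User1", "Card4"): 2,
           ("User2", "Card2"): 3, ("User2", "Card3"): 1, ("User3", "Card1"): 1}
held_out = {("User1", "Card3"): 0, ("User2", "Card4"): 2}

def global_mean(user, card):
    """Predict every rating as the mean of all observed ratings."""
    return sum(ratings.values()) / len(ratings)

def user_mean(user, card):
    """Predict a rating as the mean of that user's observed ratings."""
    vals = [r for (u, _), r in ratings.items() if u == user]
    return sum(vals) / len(vals)

def rmse(predict):
    """Root-mean-square error of a predictor on the held-out ratings."""
    errs = [(predict(u, c) - r) ** 2 for (u, c), r in held_out.items()]
    return math.sqrt(sum(errs) / len(errs))

best = min([global_mean, user_mean], key=rmse)
print(best.__name__, round(rmse(best), 3))
```

In the study the candidates were three collaborative filters rather than mean predictors, with item-based filtering winning the comparison.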
22
Week 2 Results: Metrics
Content targeting yielded higher response and consecutive login rates than random selection, but the experiment lacked the statistical power to determine the best personalization method.
23
Post-Study Survey Results
All 44 users completed a post-study questionnaire to evaluate system and experimental efficacy
• Suggested improvements
  • Inhibit simply clicking through questions
  • Increase goal-setting implementations or incentives
  • Increase content diversity
• Endorsed features
  • Email notifications
  • System usability
24
Preferred Mechanism Of Communication
• Participants preferred email notifications over text messages
• Only 3% of participants did not want daily reminders
25
Likelihood To Exercise More Due To Study Participation
• 11% of participants felt the study increased their likelihood to exercise more
26
Increase In Access To System During Second Week Of Study
• 23% of participants felt they used the system more in the second week
27
Limitations
• Experiment size limited significance
• Non-medical population proved a poor proxy
• Lack of content variation negatively impacted collaborative filtering effectiveness
• System shortcomings
  • No method to prevent rapid clicking through cards
  • Top card lists did not automatically update
28
Future Work
• Expand experiment scale to validate targeting method
  • Use a large, non-homogeneous medical population
  • Create a larger, more diverse content base
• Improve content targeting process
  • Automate the collaborative filtering process
  • Increase the effectiveness of information cards
29
Conclusions
Implemented and tested content personalization methods for third-party software with a UVA student population
• Machine learning methods outperformed random selection, but the experiment lacked the power to determine the best personalization method
• A post-hoc survey indicated that daily reminders increased system interaction (97%) but did not support behavioral change (11%)
Recommend expanding the study and the system's functionality
30
References
[1] The Promise of Telehealth for Hospitals, Health Systems and Their Communities. 2015. http://www.aha.org/research/reports/tw/15jan-tw-telehealth.pdf
[2] Telemedicine Guide: Telemedicine Statistics. 2016. evisit.com/what-is-telemedicine/#13
[3] Morrison, Leanne G. "Theory-Based Strategies for Enhancing the Impact and Usage of Digital Health Behavior Change Interventions: A Review." Digital Health 1, no. 1 (2015): 1-10.
[4] Rajaraman, Anand, and J. Ullman. "Recommendation Systems." Mining of Massive Datasets 1 (2012).
[5] Drive Higher Conversions by Personalizing the Website Content Based on the Visitor. 2016. http://www.hebsdigital.com/ourservices/smartcms-modules/dynamic-content-personalization
31
Appendix
32
Card Scoring Method
• +1 point for an information card response
• +0.5 points for a question card response, +0.5 points for a healthy question card response
• +1 point for time spent on cards > 30 seconds
• +1 point for a consecutive visit to cards on the day before
• First-week average card score: 1.88 [1.82, 1.93]
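The scoring rules above translate directly to code. This is a sketch of those rules, not the project's implementation; the parameter names are assumptions, not the actual schema.

```python
# Sketch of the card-scoring rules listed above. Parameter names are
# assumptions, not the project's actual data schema.
def card_score(card_type, responded, healthy_response, seconds_on_card,
               visited_previous_day):
    score = 0.0
    if responded:
        if card_type == "information":
            score += 1.0      # +1 for an information card response
        elif card_type == "question":
            score += 0.5      # +0.5 for a question card response
            if healthy_response:
                score += 0.5  # +0.5 more for a healthy response
    if seconds_on_card > 30:
        score += 1.0          # +1 for time spent on cards > 30 seconds
    if visited_previous_day:
        score += 1.0          # +1 for a consecutive visit the day before
    return min(score, 3.0)    # card scores cap at the max of 3

print(card_score("question", True, True, 45, True))  # prints 3.0 (max score)
```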
State of Mental Health in the US
• 1 in 4 adults experience mental illness each year
• Only 40% receive treatment
• 55% of 3,100 counties have no practicing mental healthcare workers
• Telemental health is a viable solution
33
34
Literature Review
• Existing telehealth applications employ basic, rule-based content targeting methods
• Morrison's (2015) psychology theory research suggests targeting content to improve digital health behavioral systems
• Regression models and machine learning applications use demographic information, internet behaviors, and contextual data to target content
• Recommender systems, an application of machine learning, are of interest due to their use in current content targeting systems (e.g., Netflix) and their ability to be automated
35
[Figure: example normalized rating matrix, users as rows and cards 1-5 as columns; individual values not fully recoverable from the transcript]
User responds to question and information cards → scores calculated by executing the card scoring algorithm → cross-validation selects the best collaborative filter → the best collaborative filter computes user card rankings → system randomly draws 5 cards from the top-10 ranking