How to Build Precision Medicine Solutions that Scale

Chris Gregg PhD speaking at the University of Calgary to the Precision Medicine Program

December 2022: Chris Gregg, Storyline CTO and co-founder, speaking at the University of Calgary to clinicians and others about precision medicine and how new technologies and A.I. will allow for low-cost precision care at massive scale.

 

Transcript: WEBVTT

1

00:00:04.310 --> 00:00:05.630

Room: I'm really excited.

2

00:00:08.100 --> 00:00:16.269

Room: Okay, yeah. Really excited to introduce Dr. Chris Gregg. Dr. Gregg's coming to us from the University of Utah.

3

00:00:29.790 --> 00:00:30.910

Room: It's time

4

00:00:32.790 --> 00:00:40.300

Room: highlight the things that are, you know, really important to everyone here in Calgary, which is that he is a graduate of the University of Calgary.

5

00:00:40.410 --> 00:00:41.960

Room: So it

6

00:00:44.490 --> 00:00:46.730

Room: Thank you for the support.

7

00:00:46.970 --> 00:00:54.790

Room: Yeah. So one of our own made it good. And he's from here, went on to Harvard,

8

00:00:54.850 --> 00:01:01.099

Room: was involved in developing these really important pioneering techniques, RNA-Seq and other

9

00:01:01.300 --> 00:01:05.270

Room: tools and technologies that many of you are learning about,

10

00:01:05.510 --> 00:01:11.629

Room: and also, you know, I want to note a lot of involvement, lots of entrepreneurial spirit:

11

00:01:11.790 --> 00:01:19.499

Room: chief scientific officer in several companies, which I think we're going to hear about here, and

12

00:01:19.520 --> 00:01:22.639

Room: other scientific advisory roles as well. And

13

00:01:23.130 --> 00:01:30.449

Room: yeah, I always believe that I should get out of the way as quickly as possible. So with that, I'll

14

00:01:30.770 --> 00:01:31.690

cede the floor there.

15

00:01:32.260 --> 00:01:34.380

Room: Hey, Dave! Thank you very much.

16

00:01:34.550 --> 00:01:36.420

Room: And for those of you who made

17

00:01:36.530 --> 00:01:42.210

Room: time to attend this, either virtually or in person, I'm really grateful for the time you've made.

18

00:01:42.330 --> 00:01:54.110

Room: Dave filled me in on the program here, the precision medicine program, and I have to say, I'm pretty passionate about it. It sounds extremely unique, and it sounds incredibly important.

19

00:01:54.160 --> 00:02:12.220

Room: For the past 20 years or so I've done basic research, and I've always dreamed of translating a lot of that work into the real world, to, you know, improve things out in the real world. But it is a very difficult process. It's a unique skill set, and it's really awesome to see a program that's focused on that problem specifically.

20

00:02:13.180 --> 00:02:29.690

Room: I'm going to talk about my own personal adventures in closing that gap from basic research to trying to deliver something that works and operates in the real world, by talking about this startup company called Storyline Health.

21

00:02:29.920 --> 00:02:38.979

Room: And please stop me, and I'll try to keep an eye on the screen for online questions as well as in-person questions,

22

00:02:39.040 --> 00:02:43.810

Room: you know, as they come up. This is a very difficult and challenging and interesting road.

23

00:02:45.280 --> 00:03:02.820

Room: So first of all, you know, we've established a mission for Storyline Health, and Storyline's mission is to understand human behavior and make that knowledge useful for everyone. That's kind of what binds us when we make decisions around the technologies that we're building, the problems we're trying to solve, and the research we're trying to do.

24

00:03:02.830 --> 00:03:21.069

Room: That's what we think about the most, and it has a lot of broad implications that can be used in a lot of different ecosystems: in medicine, surgery, and even in places that I never even thought of, in the legal system and other areas. It's taken us on all kinds of adventures. Very, very cool.

25

00:03:21.480 --> 00:03:22.730

Room: So.

26

00:03:25.180 --> 00:03:28.409

Room: Oh, how funny! I tried this many times. No, it's not.

27

00:03:29.590 --> 00:03:31.260

Room: It's not

28

00:03:32.470 --> 00:03:33.530

Room: advancing.

29

00:03:34.170 --> 00:03:36.830

Room: Oh, let's see if we can figure out Why.

30

00:03:40.160 --> 00:03:41.400

Room: okay.

31

00:03:42.390 --> 00:03:45.600

Room: We had a problem. But now, you know, we've solved the problem.

32

00:03:47.090 --> 00:03:49.990

Room: Okay, that was not the problem I was intending to talk about

33

00:03:50.120 --> 00:03:56.440

Room: When starting down this road, there were a number of problems that I was particularly interested in.

34

00:03:56.850 --> 00:03:59.689

Room: One is that with acute care solutions,

35

00:03:59.810 --> 00:04:19.669

Room: you come into the doctor for this sort of acute event, and you get treated for something that typically does not resolve your health care problem if it is a chronic illness, right? So some of the biggest illnesses that we're facing, Alzheimer's disease, which is the most expensive problem in the United States right now, Parkinson's, cancer, autoimmune diseases, etc.,

36

00:04:19.680 --> 00:04:28.609

Room: they involve so many different factors, the immune system, inflammation, genetics, developmental factors, mental health, social support, and more,

37

00:04:28.840 --> 00:04:30.800

Room: that a single acute visit

38

00:04:30.990 --> 00:04:33.219

Room: really isn't enough to resolve the problem.

39

00:04:35.560 --> 00:04:43.019

Room: The other thing that is a really, you know, clear problem in medicine is that we don't have maps

40

00:04:43.040 --> 00:04:44.850

Room: to navigate health

41

00:04:44.940 --> 00:04:50.320

Room: healthy aging, and then the onset of diseases at different sort of stages of our lives.

42

00:04:50.470 --> 00:04:53.970

Room: and without a map you can't really navigate very well.

43

00:04:54.090 --> 00:04:57.070

Room: You can't detect adverse events early.

44

00:04:58.010 --> 00:05:03.169

Room: You can't diagnose important patient subtypes as effectively as you need to.

45

00:05:04.210 --> 00:05:10.630

Room: You can't predict progression or treatment responses nearly as effectively or accurately as you need to.

46

00:05:10.920 --> 00:05:15.220

Room: And all of this kind of adds up to an overall

47

00:05:15.300 --> 00:05:26.339

Room: problem where we're locked into reactive care instead of predictive care, where we resolve problems before they progress and become chronic illnesses that are very expensive and difficult to manage.

48

00:05:28.140 --> 00:05:34.949

Room: It's not just in the clinical care setting that we have some problems. We also have real fundamental problems in the research ecosystem.

49

00:05:35.640 --> 00:05:51.919

Room: Now, I know I'm going to pick a few fights here. But bear with me, because I have a lot of experience trying to solve problems with some of these tools. Genomics is a wonderful tool, very powerful. It's the backbone of the precision medicine ecosystem. But frankly,

50

00:05:52.290 --> 00:05:54.169

Room: it's not very predictive.

51

00:05:54.300 --> 00:05:56.110

Room: and it's not very diagnostic.

52

00:05:56.590 --> 00:06:09.639

Room: And the genome sequence, I have to say, that you are born with is the genome sequence that you die with. It doesn't change if you change your diet, have an infection, have a stressful event in your life, or get sick.

53

00:06:10.170 --> 00:06:18.869

Room: Just because I have risk factors for depression or mental illness doesn't mean that I'm going to get that next week,

54

00:06:19.050 --> 00:06:19.840

Room: right?

55

00:06:20.090 --> 00:06:25.889

Room: and in fact, if you look at some of the best polygenic risk scores we have for many of these complex diseases.

56

00:06:26.010 --> 00:06:31.870

Room: like major depression, they explain less than 1% of the variance in the risk for that disease.

57

00:06:32.360 --> 00:06:34.430

Room: That's a real bummer, right?

58

00:06:36.370 --> 00:06:41.929

Room: The other thing that's very challenging is that some of the diagnostic tools that we're using.

59

00:06:42.090 --> 00:06:43.859

Room: especially in behavioral health

60

00:06:44.120 --> 00:06:53.130

Room: is that they were developed through kind of subjective approaches. They're not data-driven in the way that some other tests that we use in medicine are.

61

00:06:53.360 --> 00:07:00.700

Room: And so people have argued that the DSM-5 diagnostic categories, for example, are too broad, they're too simplistic,

62

00:07:00.780 --> 00:07:08.239

Room: and that that's a barrier to research to uncover biological mechanisms that we can target to treat different subtypes.

63

00:07:08.310 --> 00:07:09.070

Perfect

64

00:07:10.320 --> 00:07:30.059

Room: Animal research. Now, I'm giving another talk on Friday that's on animal research, so this is going to look like terrible hypocrisy. But animal research, at the end of the day, while it's very powerful for uncovering basic biological mechanisms and for understanding mammalian biology, etc.,

65

00:07:30.760 --> 00:07:40.770

Room: it often does not translate to humans. 95% of the drugs that work well in preclinical trials ultimately fail in clinical trials and don't actually become medicines.

66

00:07:41.630 --> 00:07:43.650

Room: That means that we really need to study

67

00:07:43.780 --> 00:07:44.610

people.

68

00:07:45.820 --> 00:08:00.800

Room: Medical records are the backbone of the information that we have for patients, right? These EHRs: lots of different technologies are trying to make use of them more effectively. But ultimately, they weren't ever designed for precision medicine purposes or data science,

69

00:08:00.870 --> 00:08:04.780

Room: and they don't capture real world data outside of the clinic.

70

00:08:05.710 --> 00:08:07.699

Room: And then wet lab tests.

71

00:08:08.200 --> 00:08:16.749

Room: Now, these are very exciting tests, these molecular tests that we rely on, and the new ones that are coming. But I often call them second line tests

72

00:08:16.880 --> 00:08:27.259

Room: because they're not massively scalable. These aren't tests that you can take every day or every week or every month to monitor and get a picture of your overall health.

73

00:08:27.970 --> 00:08:37.969

Room: One example is the microbiome, which is something that, you know, I'm very excited about, but it's not likely that we're going to be mailing our poop samples in every week to get regular microbiome checkups.

74

00:08:38.340 --> 00:08:45.420

Room: So these wet lab tests, for example, are second-line tests.

75

00:08:46.250 --> 00:09:03.759

Room: And you know, as I was thinking about some of the problems that we're trying to solve, there are many, many problems in the United States, where I'm located right now, but there are also major opportunities here in the Canadian healthcare system, as many of you know in this precision medicine program.

76

00:09:03.890 --> 00:09:19.210

Room: Health care in Canada ranks among the most expensive systems, like the United States, but we really struggle with the wait times, and as a consequence of these wait times, you know, it is having an impact on patient quality of life.

77

00:09:19.480 --> 00:09:32.550

Room: Acute illnesses become chronic illnesses. There's pain and suffering, mental health impacts, job loss, economic damage. There are huge opportunities for smart young people who are creating precision medicine technologies to help solve these problems,

78

00:09:32.630 --> 00:09:34.220

Room: improving access.

79

00:09:34.330 --> 00:09:39.299

Room: So all of this really folds into one big problem, I think.

80

00:09:39.660 --> 00:09:54.149

Room: which is that we need massively scalable precision medicine research and care solutions, and there are only a few things, I think, that can address that need in the world. And I'm going to tell you

81

00:09:54.360 --> 00:10:00.420

Room: my opinion. So my opinion is that the human nervous system

82

00:10:00.700 --> 00:10:04.000

Room: offers one of the best solutions that I can see.

83

00:10:04.180 --> 00:10:08.580

Room: If we look across all of the different technologies that seem to be emerging out there in the world.

84

00:10:08.990 --> 00:10:09.990

Room: And

85

00:10:10.280 --> 00:10:18.490

Room: you know, if you kind of step back and think about it. The nervous system of our body is the ultimate precision medicine

86

00:10:18.540 --> 00:10:19.629

Room: system, right?

87

00:10:20.090 --> 00:10:24.530

Room: All of these nerves are innervating different organs and tissues at the cellular level.

88

00:10:24.560 --> 00:10:33.499

Room: They're detecting metabolic changes, endocrine changes, physiological changes, pain and damage movement. All of that information rolls up into the central nervous system.

89

00:10:33.820 --> 00:10:39.710

Room: and it influences cognition, fatigue, energy, motivational drives, homeostasis.

90

00:10:40.160 --> 00:10:42.729

Room: and much, much more, right? So

91

00:10:42.820 --> 00:10:47.650

Room: our behavior, which is kind of the manifestation of all of these signals and changes

92

00:10:48.330 --> 00:10:51.019

Room: is the expression of all of this underlying biology.

93

00:10:51.620 --> 00:10:52.310

Room: So

94

00:10:52.490 --> 00:10:55.450

Room: but the problem is

95

00:10:55.510 --> 00:11:00.000

Room: that we're really not making use of that data in an objective

96

00:11:00.100 --> 00:11:02.819

Room: and kind of precision medicine type of way.

97

00:11:03.770 --> 00:11:18.450

Room: The way that we take information from patients right now is, we've got forms that they might fill out, right? You know, over the past 2 weeks, how many days have you felt depressed or sad? You might say several or most days.

98

00:11:18.920 --> 00:11:25.330

Room: We've got medical records. There's a face-to-face exam, right? So that takes a lot of time,

99

00:11:25.740 --> 00:11:34.680

Room: and there's an expert clinician in front of you who asks these questions and is evaluating your health, your behavior, and how you respond.

100

00:11:34.700 --> 00:11:35.360

Room: But

101

00:11:35.520 --> 00:11:40.820

Room: very little of that information actually goes into the record. It's just going into the brain of this expert

102

00:11:41.430 --> 00:11:43.270

Room: who is making a judgment call.

103

00:11:43.450 --> 00:11:47.740

Room: And there's so much information that we do not measure or capture

104

00:11:48.330 --> 00:11:51.209

Room: such a huge opportunity for precision medicine.

105

00:11:53.730 --> 00:12:01.319

Room: The problem... well, let me just say that all decisions in the medical care system start by understanding the patient.

106

00:12:01.570 --> 00:12:10.560

Room: What are their symptoms? What are they experiencing? What's their personality? What social support do they have? What access to care do they have? Etc., etc.

107

00:12:11.370 --> 00:12:14.040

Room: But there really is no effective,

108

00:12:14.160 --> 00:12:15.750

Room: trusted way

109

00:12:15.850 --> 00:12:31.609

Room: to capture that data and make it objective, like a radiologist would do, right? They take a scan; they've got extraordinary capabilities now to analyze those images and pull out patterns that are diagnostic and predictive. And we don't have those technologies for understanding patient behavior.

110

00:12:32.920 --> 00:12:41.870

Room: So there's a big opportunity here, but it's a hard problem to solve, and I'm going to tell you a little personal story about why I think it's worth solving.

111

00:12:44.110 --> 00:12:57.090

Room: So this is me in 2018, and I'm dressed up like Dr. Evil, and my head is bald, but it's not because I'm a real aficionado of Dr. Evil characters.

112

00:12:57.260 --> 00:13:07.869

Room: It's because I had lost my hair due to a stage 4 cancer diagnosis. So it's 2018, and I was, you know, diagnosed with too many metastatic sites in my body to count,

113

00:13:08.080 --> 00:13:13.429

Room: and my diagnosis was terminal. So you know I would go on palliative care.

114

00:13:13.500 --> 00:13:15.270

Room: And

115

00:13:15.720 --> 00:13:18.719

Room: you know, we'll get into some of the details.

116

00:13:18.870 --> 00:13:27.649

Room: My son has, you know, shaved his head for support, so he's mimicking me, and there's my wife and my dog, also very supportive.

117

00:13:27.990 --> 00:13:34.280

Room: So we faced this extraordinary problem of what seemed like an incurable and,

118

00:13:35.600 --> 00:13:37.689

Room: you know, impossible situation.

119

00:13:38.130 --> 00:13:39.840

Room: Yeah, but because of

120

00:13:40.090 --> 00:13:50.770

Room: my role in the world as a PhD and a researcher, I had access to all kinds of extraordinary and interesting individuals, and I had published a paper that year that had

121

00:13:50.800 --> 00:13:54.770

Room: sort of focused on why elephants don't get cancer. You think, well,

122

00:13:54.910 --> 00:14:09.059

Room: you know, it's a bit of a sideline, but elephants have huge bodies, lots of cells. If they got cancer at the rate that we get cancer, all elephants should have cancer, but it turns out they don't; they're cancer resistant, and our study was about why that happens.

123

00:14:09.900 --> 00:14:17.019

Room: This kind of gave me a network of folks who were thinking in new ways in the cancer field,

124

00:14:17.570 --> 00:14:34.070

Room: and so we put together a cancer and evolution meeting at the Huntsman Cancer Institute, and many of my colleagues flew in on just a couple of months' notice. Very, very kind, you know. Sometimes I get teary, but not tonight. And the theme of the meeting was:

125

00:14:34.080 --> 00:14:38.070

Room: How could we improve stage 4 cancer outcomes.

126

00:14:38.230 --> 00:14:40.700

Room: using the knowledge that we have today?

127

00:14:41.050 --> 00:14:50.520

Room: Right? So, not generating a new medicine or new drug targets that might translate into a solution in 20 years, but how could we actually improve outcomes for patients today?

128

00:14:51.100 --> 00:14:58.900

Room: And there were a number of big ideas and important lessons that came out of that meeting, and one of the critical insights was

129

00:14:58.970 --> 00:15:01.680

Room: that the way we are treating cancer

130

00:15:02.100 --> 00:15:04.189

Room: currently in standard of care

131

00:15:04.900 --> 00:15:08.159

Room: is arguably one of the worst ways that we could approach the problem.

132

00:15:08.410 --> 00:15:21.920

Room: And how can you say that? The reason that people think this is a poor strategy, which I will go through in a minute, is because there are so many lessons that have been learned about attacking pests

133

00:15:21.990 --> 00:15:29.850

Room: in managing pest control in the farming community, and through the study of species extinction and evolution.

134

00:15:30.020 --> 00:15:34.629

Room: we have a bunch of really interesting information that's out there in these other fields.

135

00:15:34.870 --> 00:15:36.850

Room: But we haven't brought those lessons

136

00:15:37.150 --> 00:15:38.650

Room: into the cancer world.

137

00:15:38.750 --> 00:15:40.209

Room: So what are those lessons?

138

00:15:40.700 --> 00:15:44.599

Room: A typical cancer patient like me comes into the clinic.

139

00:15:44.670 --> 00:15:47.530

Room: and they've been diagnosed with their

140

00:15:47.710 --> 00:15:51.189

Room: tumor, and they're put on their first line therapy.

141

00:15:51.680 --> 00:16:03.779

Room: And if this is the tumor marker, their tumor burden goes down if they get a good response, and maybe they even cross into this golden zone of NED, no evidence of disease.

142

00:16:04.510 --> 00:16:06.069

Room: Then what happens

143

00:16:06.190 --> 00:16:07.649

Room: is,

144

00:16:07.870 --> 00:16:16.389

Room: the standard of care is to stay on that medicine until the disease returns, and that's called progression.

145

00:16:16.890 --> 00:16:24.619

Room: So in the United States you actually cannot switch to a new medicine. Your insurance company won't pay for it

146

00:16:24.690 --> 00:16:27.440

Room: until you've shown evidence of progression.

147

00:16:27.610 --> 00:16:29.589

Room: and then you're allowed to do a new medicine.

148

00:16:30.060 --> 00:16:36.960

Room: So now you go on to the next drug, and maybe you get a good response. The disease goes down, but you wait until it grows back,

149

00:16:37.050 --> 00:16:38.560

Room: and then again.

150

00:16:38.620 --> 00:16:42.219

Room: and then eventually the oncologist runs out of drugs, right?

151

00:16:42.350 --> 00:16:44.469

Room: And depending on the disease you have.

152

00:16:44.520 --> 00:16:48.529

Room: You know, there may be several drugs, or there may be very few options.

153

00:16:49.190 --> 00:16:53.180

Room: and at that stage they've lost control of the disease. So it spreads through your body.

154

00:16:53.210 --> 00:16:55.940

Room: and there's nothing they can do anymore to control that

155

00:16:56.840 --> 00:16:58.680

Room: a different approach

156

00:16:59.390 --> 00:17:04.089

Room: is to switch drugs at the nadir of the response.

157

00:17:04.730 --> 00:17:10.879

Room: And here you're working in the dark, right? Because you can't see the disease anymore if you've got a good response.

158

00:17:11.200 --> 00:17:18.779

Room: But we call this extinction therapy, and the reason we call it extinction therapy is that it's inspired by how species actually go extinct in nature.

159

00:17:18.810 --> 00:17:22.450

Room: When you've got a huge population of animals very diverse.

160

00:17:22.619 --> 00:17:31.690

Room: there's always going to be some subpopulation that can survive the pesticide or the asteroid, you know, the meteor strike, or the volcano, or whatever,

161

00:17:31.720 --> 00:17:37.759

Room: because they have some sort of adaptation that allows them to to make it through.

162

00:17:38.140 --> 00:17:40.979

Room: But if the population is small

163

00:17:41.330 --> 00:17:42.750

Room: and sparse.

164

00:17:43.110 --> 00:17:52.260

Room: then that is the only opportunity you have to eradicate that group, and you just have kind of relentless pressures that you apply to the system.

165

00:17:52.390 --> 00:18:00.429

Room: And so this is the idea of extinction therapy. Instead of waiting for progression, you switch at the nadir, and you just keep doing different drugs,

166

00:18:00.550 --> 00:18:03.089

Room: and I switched drugs every 4 months.

167

00:18:05.970 --> 00:18:19.469

Room: So this is my actual tumor data. This is a tumor marker called CA 27-29. And here I was initially diagnosed with, you know, reasonably high tumor markers, and started down this

168

00:18:19.660 --> 00:18:23.129

Room: idea that came out of the meeting for extinction therapy,

169

00:18:23.430 --> 00:18:25.950

Room: and by the second drug, Adriamycin,

170

00:18:26.130 --> 00:18:28.239

Room: I was in the NED zone,

171

00:18:28.370 --> 00:18:34.150

Room: and I was NED by PET-CT as well as by tumor markers, and stayed

172

00:18:34.390 --> 00:18:35.290

Room: though

173

00:18:35.310 --> 00:18:37.180

Room: switching different drugs.

174

00:18:37.220 --> 00:18:46.889

Room: So I never stayed on the same drug until the disease came back, and I never went off all of the drugs, which is sometimes the recommended thing in standard of care if you reach NED.

175

00:18:49.830 --> 00:18:51.639

Room: In addition to that.

176

00:18:52.460 --> 00:19:00.560

Room: I put together a program that would switch my metabolic states in combination with the drugs. So now we're talking about drugs

177

00:19:00.590 --> 00:19:03.890

Room: plus metabolic and behavioral interventions together.

178

00:19:04.150 --> 00:19:09.470

Room: The metabolic switching program involves a few weeks of being on a low-carb

179

00:19:09.630 --> 00:19:16.220

Room: paleo diet, and that has particular functions for metabolic and microbial and immune repair;

180

00:19:16.250 --> 00:19:18.019

Room: a ketogenic phase;

181

00:19:18.130 --> 00:19:23.839

Room: an extended fasting phase, which is a 3-day water fast; and then a low-methionine diet to recover.

182

00:19:24.030 --> 00:19:28.639

Room: And if you're interested in reading about the science behind this, it's published in

183

00:19:28.680 --> 00:19:32.279

Room: a special issue on cancer evolution in Frontiers.

184

00:19:32.760 --> 00:19:41.070

Room: So, bringing that whole program together, now you can couple the drug strikes with different metabolic states.

185

00:19:41.160 --> 00:19:50.589

Room: And so you imagine all these different tumor cells: a single tumor, about 2 centimeters square, has billions of cells,

186

00:19:50.610 --> 00:20:00.219

Room: and they're all very diverse. Some of them are amino acid metabolizers, ketone metabolizers, and many of them are in the Warburg effect, and they're, you know, glycolysis and glucose metabolizers.

187

00:20:00.880 --> 00:20:08.890

Room: and so by putting yourself in different metabolic states, and then coupling the drug, strikes with those you can pick off different populations of tumor cells.

188

00:20:10.580 --> 00:20:12.159

Room: That's the idea.
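
To make the contrast between the two strategies concrete, here is a toy simulation: two subpopulations, each sensitive to one of two drugs, plus a rare fully resistant clone that gets seeded by mutation while the tumor is large. Every number in it (kill rates, growth rates, switching thresholds, mutation rate) is invented for illustration; this is the shape of the argument, not a clinical model.

```python
# Toy model, not clinical advice: every rate and threshold below is invented.
# a, b: subpopulations each killed by one drug and resistant to the other.
# r:    a fully resistant clone, seeded by mutation while the tumor is large.
def simulate(switch_at_nadir: bool, steps: int = 80):
    a, b, r = 5e8, 5e8, 0.0
    drug = 0
    prev = nadir = a + b + r
    for _ in range(steps):
        a *= 0.4 if drug == 0 else 1.15   # sensitive cells die, resistant grow
        b *= 0.4 if drug == 1 else 1.15
        r = r * 1.15 + 1e-9 * (a + b)     # big populations seed resistant mutants
        total = a + b + r
        nadir = min(nadir, total)
        if switch_at_nadir and total > 0.95 * prev:
            drug, nadir = 1 - drug, total  # response flattened: strike again now
        elif not switch_at_nadir and total > 1.5 * nadir:
            drug, nadir = 1 - drug, total  # standard of care: wait for progression
        prev = total
    return total, r

for label, flag in [("treat to progression", False), ("switch at nadir", True)]:
    total, resistant = simulate(flag)
    print(f"{label:>22}: burden {total:9.2e}  fully resistant clone {resistant:9.2e}")
```

With these made-up parameters, the switch-at-nadir run ends with a far smaller burden and resistant clone, which is the small-and-sparse eradication logic described above.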

189

00:20:12.540 --> 00:20:21.450

Room: And here's my tumor marker data again. And here I'm going on a very aggressive treatment of Anastrozole and Ibrance,

190

00:20:22.000 --> 00:20:25.179

Room: and then capecitabine, an oral cytotoxin,

191

00:20:25.600 --> 00:20:37.060

Room: and then Exemestane and Verzenio. They're all very difficult drugs, and the tumor markers are more or less stable; they're not going down. And then I went off all of my drugs,

192

00:20:37.230 --> 00:20:40.670

Room: did that program that I just described,

193

00:20:40.700 --> 00:20:48.769

Room: and got a 23% reduction in my tumor markers. And so by this metric, the metabolic switching

194

00:20:48.970 --> 00:20:50.530

Room: outperformed

195

00:20:50.910 --> 00:20:52.220

Room: the drug treatments.

196

00:20:52.290 --> 00:20:53.829

Room: at least at this stage.

197

00:20:54.060 --> 00:21:03.249

Room: And so this was encouraging to me, because it showed for me the power of manipulating metabolism to affect

198

00:21:03.270 --> 00:21:08.800

Room: cancer. And of course, now we think of cancer very much as a metabolic disease.

199

00:21:09.320 --> 00:21:17.530

Room: And so this makes a lot of sense. And subsequently there's actually been a lot of studies on the benefits of fasting and other things for treating cancer.

200

00:21:19.840 --> 00:21:20.500

Room: So

201

00:21:20.610 --> 00:21:27.089

Room: Now we have a picture of a better care pathway, potentially, for advanced cancer. But I'm just one person,

202

00:21:27.640 --> 00:21:34.470

Room: and I've had success with this and have been able to manage my disease for over 3 years.

203

00:21:34.840 --> 00:21:42.740

Room: And you know, just for reference, the median survival for patients with this disease is 3 years. So I've been NED for that period of time,

204

00:21:43.080 --> 00:21:51.900

Room: and the proportion of patients that ever manage to reach NED with my disease is only 6%, so it's very rare in and of itself.

205

00:21:53.210 --> 00:21:57.130

Room: So this makes me encouraged, and I want to get it out there for other folks.

206

00:21:58.140 --> 00:22:04.490

Room: What this means is that we need to think differently, not just about the drug, but actually about this care algorithm,

207

00:22:04.710 --> 00:22:09.439

Room: where there's a sequence, a combination of different drugs that need to be put together with diet

208

00:22:09.470 --> 00:22:15.829

Room: and metabolic changes. And this is a very kind of complex program that needs to be delivered.

209

00:22:18.590 --> 00:22:29.670

Room: So try to imagine how this will work in the future. It's very easy to run clinical trials where the solution is a single drug: you get the drug or you get the sugar pill, and then we measure the results at the end.

210

00:22:29.970 --> 00:22:33.929

Room: But when the solution is a complex care pathway.

211

00:22:34.510 --> 00:22:35.529

Room: how do you

212

00:22:35.550 --> 00:22:42.609

Room: get that out into the world as a discrete entity that can be run in clinical trials reproducibly across different patients.

213

00:22:42.690 --> 00:22:49.060

Room: And then how do you optimize it in a kind of precision medicine ecosystem where you need to modify the diet

214

00:22:49.430 --> 00:22:56.970

Room: or the sequence of the drugs in different combinations for particular patients. Right? So it becomes kind of a personalized intervention.

215

00:22:57.450 --> 00:23:01.020

Room: We need to start by understanding the patient.

216

00:23:02.140 --> 00:23:08.620

Room: and we need new precision medicine platforms and technologies that are focused on that goal.

217

00:23:08.650 --> 00:23:10.550

Room: So this comes back to the behavior.

218

00:23:11.210 --> 00:23:21.710

Room: And then, once we put these care pathways together as algorithms, we need to be able to deliver those at massive scale in a way that people can follow them right through their smartphone.

219

00:23:23.370 --> 00:23:36.890

Room: They need to be able to access these and the diagnostic and the monitoring tools through their smartphone, remotely. They can't be commuting regularly in to the doctor. It's too much trouble. It's too much of a burden on the patients and their quality of life.

220

00:23:37.940 --> 00:23:51.289

Room: There's a lot of education that goes into this, so you need to be supporting the patients so that they understand why they need to do these different steps. Why do you need to eat certain fruits during the low-methionine stage? Is it just because

221

00:23:51.450 --> 00:24:03.230

Room: apples are good? It's not because apples are good. It's because they have absolutely no methionine in them, right? And so there are these key bits of information. They need the right information at the right time to make their way through that.

222

00:24:03.800 --> 00:24:06.299

Room: And then you need to be monitoring the patient

223

00:24:06.430 --> 00:24:19.560

Room: right? So we need new tools to monitor and get ahead of any problems. And all of this, all of this complexity, needs to be safe, needs to be easy to use, and it needs to be massively scalable and improve outcomes.

224

00:24:19.860 --> 00:24:27.540

Room: And I'm telling you a story from cancer, but you can imagine the same problem for Alzheimer's disease, Parkinson's disease, diabetes,

225

00:24:28.100 --> 00:24:30.459

Room: kidney transplants. Anything right.

226

00:24:30.700 --> 00:24:32.880

Room: There's a huge opportunity to solve this.

227

00:24:35.640 --> 00:24:47.819

Room: So this is too big of a problem for one goofy guy like me to take on, of course, and instead what we decided to do was build a platform to help other people

228

00:24:48.200 --> 00:24:49.560

Room: like you guys

229

00:24:49.660 --> 00:24:55.899

Room: identify these problems and have access to all the tools that you would need to go out into the world and solve them.

230

00:24:56.180 --> 00:24:58.240

Room: Build companies based on them.

231

00:24:58.980 --> 00:25:04.569

Room: build apps based on them, solve any particular problem that you would be interested in.

232

00:25:04.890 --> 00:25:06.489

Room: so that became the mission.

233

00:25:06.800 --> 00:25:24.289

Room: And I have a partner in this mission. So there's me, and I've got this background in genomics, and we published this paper using artificial intelligence to decompose complex patterns of behavior. So we had some 10 years of research working on that problem,

234

00:25:25.220 --> 00:25:35.289

Room: and I kind of brought this research component into the organization. And I met Jeff Barson, and Jeff is the CEO of Storyline.

235

00:25:35.610 --> 00:25:49.860

Room: And you know, together, this combination of skills that we were able to bring together is very unique. So as you guys are thinking about the organizations and the companies that you want to build, finding partners that have these extraordinarily

236

00:25:49.870 --> 00:26:06.209

Room: complementary sets of expertise can be such a huge advantage, right? It's fortuitous. It involves kind of a lot of networking and talking to people and, you know, getting into a program where they can get you to meet people with these different skills. But making these partnerships is, like,

237

00:26:06.380 --> 00:26:07.290

Room: critical.

238

00:26:07.390 --> 00:26:22.059

Room: And out of all the technologies and all of the things and all the ideas that you come up with in your company, the most valuable things that will be in your company or in your organization will be the unique set of experiences, talent, and understanding

239

00:26:22.110 --> 00:26:24.169

Room: that the founders and the leadership have.

240

00:26:24.640 --> 00:26:29.499

Room: So the people that you bring on board, that's the most valuable resource in your startup.

241

00:26:30.320 --> 00:26:48.200

Room: Jeff had run a series of medical clinics, so he understood medical clinics. He had built a telehealth platform called Teledoc, which is, you know, a pretty successful company that has been through a few different names over the years,

242

00:26:48.210 --> 00:26:53.159

Room: and one of the leading telehealth software platforms. And then he led

243

00:26:53.290 --> 00:26:59.139

Room: innovation at the world's largest AI company for job hiring, called HireVue.

244

00:26:59.480 --> 00:27:02.589

Room: And so, if you're going to get a job

245

00:27:02.610 --> 00:27:07.029

Room: driving an uber or something like that. One of the first steps in that process

246

00:27:07.220 --> 00:27:09.040

Room: is a video interview.

247

00:27:09.350 --> 00:27:25.630

Room: and then AI analyzes your facial, vocal and speech patterns, and predicts whether you're going to be a safe and good Uber driver or Delta Airlines employee, or one of these massive entities that needs to hire people effectively and safely.

248

00:27:26.760 --> 00:27:27.700

Room: So

249

00:27:27.820 --> 00:27:33.430

Room: Jeff had seen that AI was being used for analyzing behavior in this ecosystem.

250

00:27:33.490 --> 00:27:42.499

Room: and we could see that we could learn a tremendous amount from that and lift it over into the biomedical research and care community.

251

00:27:43.720 --> 00:27:54.960

Room: So repurposing something that was working in another area and bringing it into this world was really a big part of our secret so far. So we founded Storyline,

252

00:27:55.040 --> 00:28:04.349

Room: and we compare ourselves, you know, to the other technologies. As you're evaluating your startup and whatever value you're going to bring into the world, it's good to kind of

253

00:28:04.430 --> 00:28:14.659

Room: think about what solutions are out there in the world. We've got questionnaires and wearables and X-rays and MRIs, and expert clinicians and fMRIs, and all that kind of stuff.

254

00:28:14.950 --> 00:28:15.870

Room: and

255

00:28:16.300 --> 00:28:21.509

Room: some of these are, you know, diagnostically very effective, but very expensive

256

00:28:21.580 --> 00:28:22.979

Room: and not very scalable.

257

00:28:23.600 --> 00:28:25.370

Room: Others are very cheap.

258

00:28:25.410 --> 00:28:28.840

Room: but they're not very diagnostic or predictive.

259

00:28:29.380 --> 00:28:31.870

Room: So they're not very information-rich, like a questionnaire.

260

00:28:32.280 --> 00:28:42.920

Room: And what Storyline aims to do is work in this area where it's very cheap and very, very scalable, but it's also very, very valuable and rich and useful: predictive and diagnostic data.

261

00:28:43.820 --> 00:28:47.219

Room: So, how we work: I will sometimes say that,

262

00:28:47.320 --> 00:28:53.520

Room: because I come out of this genomics background, we modeled Storyline very much on Illumina.

263

00:28:53.730 --> 00:28:55.419

Room: and when I was at

264

00:28:55.470 --> 00:29:05.899

Room: Harvard, Illumina invested in me and started my career, and, you know, I'm thankful to Illumina many times over. But I learned a lot from them about

265

00:29:06.090 --> 00:29:09.410

Room: how to solve many problems in big data.

266

00:29:09.630 --> 00:29:12.240

Room: And so we modeled Storyline very much along those lines.

267

00:29:14.160 --> 00:29:17.899

Room: This is a screenshot of the interface.

268

00:29:18.070 --> 00:29:20.900

Room: And so if you were

269

00:29:21.000 --> 00:29:27.440

Room: to set up a professional account, because you're a researcher, a clinician, or an entrepreneur,

270

00:29:27.610 --> 00:29:36.299

Room: you'd log into Storyline and you'd see your dashboard. And this is where you would manage all of the products that you're going to build within the platform using AI.

271

00:29:38.310 --> 00:29:40.790

Room: And this is the

272

00:29:41.420 --> 00:29:44.050

Room: part of the platform that's called Programs.

273

00:29:44.090 --> 00:29:47.580

Room: And essentially what you can do in here is build an app.

274

00:29:47.890 --> 00:29:49.650

Room: and you can build an app

275

00:29:49.800 --> 00:29:53.189

Room: for a care pathway, an AI-supported care pathway,

276

00:29:53.350 --> 00:29:55.399

Room: for whatever problem you want to solve

277

00:29:55.810 --> 00:29:59.150

Room: within minutes. It takes hardly any time at all.

278

00:30:00.600 --> 00:30:05.059

Room: And you can deliver educational content: video, text,

279

00:30:05.120 --> 00:30:14.829

Room: and more. There are all kinds of questions that would be typical for REDCap, other questionnaires and things like that, that you can plug in here. And then there's the video question,

280

00:30:14.950 --> 00:30:19.210

Room: and that, as we'll see, is where all of the data and the value really comes from.
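
Structurally, a program amounts to an ordered care pathway of content, questionnaire, and video-question steps on a schedule. The actual builder is a point-and-click interface; this sketch just renders the same idea as a data structure, with an entirely invented schema:

```python
# An invented schema: the real builder is a point-and-click UI, but a
# published program boils down to something like this ordered pathway.
program = {
    "name": "Post-op wound monitoring",   # hypothetical care pathway
    "steps": [
        {"type": "education", "medium": "video", "title": "Why we check daily"},
        {"type": "questionnaire", "form": "PHQ-9"},   # REDCap-style form
        {"type": "video_question", "prompt": "How are you feeling today?"},
        {"type": "video_question", "prompt": "Show me the surgical site."},
    ],
    "schedule": {"repeat": "daily", "duration_days": 14},
}
```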

281

00:30:19.690 --> 00:30:20.740

Room: So

282

00:30:21.370 --> 00:30:30.230

Room: This is, you know... there's much to say about the platform, but this is one of the key components, right? So you can design care pathways, educate folks,

283

00:30:30.240 --> 00:30:42.039

Room: communicate with them at scale. It enables all of the AI tools that I'm going to talk about here in a minute: assess, subtype, monitor, research, diagnose, etc. And then, once you've got that app built,

284

00:30:42.740 --> 00:30:46.350

Room: it's yours, right? You have built and invented something,

285

00:30:46.780 --> 00:30:53.990

Room: and now you can get it out into the world at massive scale and start to help patients. And this is what I really want people to do with this platform.

286

00:30:55.190 --> 00:31:00.420

Room: When you do that, what happens on the patient's side is they get a note.

287

00:31:00.480 --> 00:31:02.309

Room: and they click on the link.

288

00:31:02.490 --> 00:31:09.010

Room: and they're taken straight into the storyline app. And now they can do these asynchronous interviews over their phone.

289

00:31:09.530 --> 00:31:12.460

Room: So you'll ask a question. How are you feeling today?

290

00:31:12.500 --> 00:31:20.360

Room: The person will respond; it videotapes their response; and that video is moved off their phone and securely stored in the cloud

291

00:31:21.020 --> 00:31:24.630

Room: in a HIPAA- and GDPR-compliant, military-grade-security

292

00:31:24.740 --> 00:31:28.789

Room: data storage solution designed specifically for this.

293

00:31:29.640 --> 00:31:37.660

Room: I wanted patients to have control over their data, and so patients always have control over their data within the Storyline system. If they want to delete it,

294

00:31:37.710 --> 00:31:38.899

Room: They can delete it.

295

00:31:38.990 --> 00:31:42.470

Room: And I know that that gives entrepreneurs and innovators kind of a bit of

296

00:31:43.280 --> 00:31:50.540

Room: heart palpitations. But I can tell you that that is the way the world is going, for sure. People need to have control over their own data.

297

00:31:52.060 --> 00:31:56.170

Room: Once their video has been moved up into the cloud,

298

00:31:56.320 --> 00:31:59.270

Room: then you have some real superpowers.

299

00:32:00.010 --> 00:32:06.729

Room: We've built a microservices pipeline of many different AI algorithms,

300

00:32:06.760 --> 00:32:14.640

Room: and all of the algorithms are put together to measure over 30,000 different micro-features. Now,

301

00:32:15.080 --> 00:32:19.740

Room: I know the slide says 20,000, but, you know, it goes up every week, so I've got to update the slide.

302

00:32:20.960 --> 00:32:29.870

Room: From the video we get information like objective measures of pupil dilation, right? Which is regulated by sympathetic tone.

303

00:32:30.080 --> 00:32:35.849

Room: You can do eye tracking, and I'll show you some examples of how we use this for neurological assessments,

304

00:32:35.980 --> 00:32:44.460

Room: head movements. You can see blood flow, patterns of blood flow changes across the face, by measuring RGB values across 400 different points on the face.

305

00:32:44.740 --> 00:32:52.930

Room: You can look at respiration responses, micro-expressions, and micro-movements across the 40 different muscles in the face.

306

00:32:53.190 --> 00:32:54.169

Room: and then

307

00:32:54.610 --> 00:32:59.490

Room: And it's not just the motor components, or even, you know, the pallor of the skin.
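
As a rough sketch of how this kind of measurement can be done with open-source tools (OpenCV plus MediaPipe face landmarks), here is a cheek-patch RGB trace, one sample per frame. The landmark indices and the single patch are illustrative; this is the general technique, not Storyline's 400-point pipeline:

```python
# A sketch with OpenCV + MediaPipe: average RGB in a landmark-defined cheek
# patch, one sample per frame. Landmark indices and file name are illustrative.
import cv2
import mediapipe as mp

CHEEK = [50, 101, 118, 123]  # a few left-cheek landmark indices (illustrative)

face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False)
cap = cv2.VideoCapture("interview.mp4")   # hypothetical recorded answer

rgb_trace = []                            # one (R, G, B) tuple per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w, _ = frame.shape
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        continue                          # no face found in this frame
    pts = result.multi_face_landmarks[0].landmark
    xs = [int(pts[i].x * w) for i in CHEEK]
    ys = [int(pts[i].y * h) for i in CHEEK]
    patch = frame[min(ys):max(ys) + 1, min(xs):max(xs) + 1]
    b, g, r = patch.mean(axis=(0, 1))     # OpenCV frames are BGR
    rgb_trace.append((r, g, b))

cap.release()
# Periodic fluctuation in the green channel of rgb_trace is the usual starting
# point for remote photoplethysmography (heart rate and related measures).
```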

308

00:32:59.720 --> 00:33:17.659

Room: But you can analyze speech, and so word choice. All the natural language processing algorithms here have been put in place to go from speech to text, and then you can analyze what people say: the structure of their sentences, personality traits, speech patterns, even filler words and pauses,

309

00:33:17.880 --> 00:33:23.999

Room: and how they articulate their responses, the sentiment of their responses, and what they're trying to communicate.
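
A small sketch of the speech-to-text layer of that analysis, assuming the transcription already exists: counting filler words and scoring sentiment on a transcribed answer. VADER here is a stand-in; the talk does not name the actual NLP stack:

```python
# A sketch of simple transcript features: filler-word rate and sentiment.
# VADER is a stand-in sentiment model, not the stack used by Storyline.
import re
from nltk.sentiment import SentimentIntensityAnalyzer  # nltk.download("vader_lexicon")

FILLERS = ("um", "uh", "like", "you know", "i mean")

def analyze_answer(transcript: str) -> dict:
    text = transcript.lower()
    words = re.findall(r"[a-z']+", text)
    fillers = sum(text.count(f) for f in FILLERS)
    sentiment = SentimentIntensityAnalyzer().polarity_scores(transcript)
    return {
        "word_count": len(words),
        "filler_rate": fillers / max(len(words), 1),
        "sentiment": sentiment["compound"],  # -1 (negative) .. +1 (positive)
    }

print(analyze_answer("Um, I guess I've been feeling, you know, pretty flat lately."))
```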

310

00:33:24.520 --> 00:33:30.610

Room: and it's not just what you say, but how you say it! And so there are thousands of audio measures.

311

00:33:30.670 --> 00:33:32.330

Room: you know, harmonic

312

00:33:32.660 --> 00:33:33.960

Room: ratios.

313

00:33:34.010 --> 00:33:37.270

Room: energy in the voice,

314

00:33:38.420 --> 00:33:42.850

Room: and pitch and tone changes, volume changes, etc., pronunciation,

315

00:33:42.920 --> 00:33:54.359

Room: and you can detect a number of different things here: emotions in the voice, vocal microtremors. There are entire companies built just on the voice for diagnosing different disorders,

316

00:33:54.380 --> 00:33:58.190

Room: and all of that power is built into this ecosystem.
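
And a sketch of the audio side, pulling a few of the measures just mentioned (pitch, energy, and a crude harmonic ratio) with librosa; the specific feature choices are illustrative:

```python
# A sketch of a few voice measures with librosa; file name and feature
# choices are illustrative, not Storyline's actual audio stack.
import numpy as np
import librosa  # pip install librosa

y, sr = librosa.load("answer.wav", sr=16000)      # hypothetical recorded answer

# fundamental frequency (pitch) over voiced frames
f0, voiced, _ = librosa.pyin(y, fmin=60, fmax=400, sr=sr)
pitch = f0[voiced]

# short-time energy per frame
rms = librosa.feature.rms(y=y)[0]

# harmonic-to-percussive energy ratio, a crude voice-quality number
harmonic, percussive = librosa.effects.hpss(y)
harmonic_ratio = np.sum(harmonic**2) / (np.sum(percussive**2) + 1e-9)

print(f"median pitch: {np.nanmedian(pitch):.1f} Hz, variability: {np.nanstd(pitch):.1f} Hz")
print(f"mean energy: {rms.mean():.4f}, energy range: {np.ptp(rms):.4f}")
print(f"harmonic/percussive ratio: {harmonic_ratio:.2f}")
```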

317

00:34:00.280 --> 00:34:01.989

Room: So I'm pretty excited about that,

318

00:34:02.230 --> 00:34:13.329

Room: coming from the genomics background, where, when we sequenced genomes, we could not just print out the 3 billion bases in a big book and give it to people.

319

00:34:13.500 --> 00:34:17.870

Room: We had to create file formats to deal with this, to share it.

320

00:34:17.920 --> 00:34:22.129

Room: to study it, to research it, to integrate it into other multi-omic studies,

321

00:34:22.670 --> 00:34:30.659

Room: and those are called FASTQ files, right, SAM files, BAM files in the genome world. We had to develop analogs of those for the behavior world,

322

00:34:31.110 --> 00:34:35.020

Room: and these are called StoryArc, StoryTime, and Poem files.

323

00:34:35.120 --> 00:34:52.519

Room: Unlike the genome, there's a dimension of time, right? So you've got these thousands of measures that you might pull out in a particular part of a video, but then all of these have to be built out in time. So these Poem and Screenplay files are actually massive data files,

324

00:34:52.600 --> 00:34:58.279

Room: and they can be brought into whatever kind of multi-omics precision medicine ecosystem you're thinking about building.
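
The StoryArc/StoryTime/Poem formats themselves are not documented in the talk, but the general shape being described is a wide, time-indexed feature table. A minimal sketch, with invented column names, stored in a columnar format that downstream multi-omics tooling can join against:

```python
# An invented miniature of the idea: per-frame behavioral features on a time
# axis, written to a columnar file and joined against other -omics tables.
import pandas as pd  # plus pyarrow for Parquet support

frames = pd.DataFrame({
    "t_sec":       [0.00, 0.04, 0.08],      # timestamp of each video frame
    "pupil_diam":  [3.1, 3.2, 3.4],         # illustrative feature columns...
    "gaze_x":      [0.12, 0.11, 0.13],      # ...real files carry thousands
    "cheek_green": [112.4, 112.9, 111.8],
})
frames["session_id"] = "S-0001"             # hypothetical de-identified key

frames.to_parquet("session_S-0001.features.parquet", index=False)

# later: integrate with genome / microbiome / immune tables on session_id
genomics = pd.DataFrame({"session_id": ["S-0001"], "prs_depression": [0.42]})
merged = frames.merge(genomics, on="session_id")
```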

325

00:34:58.370 --> 00:35:02.490

Room: If you're collecting genome data, microbiome data, immune data,

326

00:35:03.020 --> 00:35:04.889

Room: You know, whatever

327

00:35:05.480 --> 00:35:12.199

Room: you should be collecting this type of data as well, and integrating it into your subtyping and studies.

328

00:35:14.030 --> 00:35:29.660

Room: Once you have this data, you can do all kinds of things: research the data, discover subtypes and phenotypes for different patients, discover symptoms, and you can also train models that become biomarkers

329

00:35:29.700 --> 00:35:33.279

Room: for abnormalities or difficulties. We'll talk about some examples.

330

00:35:33.480 --> 00:35:41.629

Room: And then what Storyline enables you to do is to build visualizations of those models that make it very easy for your customer,

331

00:35:41.920 --> 00:35:43.219

Room: or a doctor.

332

00:35:43.250 --> 00:35:47.520

Room: or whomever, even a patient, to interpret and understand and use the information.

333

00:35:49.840 --> 00:35:57.389

Room: And then, once you've built one of these things on the platform... we recognized right away that we needed everybody to be aligned.

334

00:35:57.750 --> 00:36:06.629

Room: So if you build a model on the Storyline platform, you build a care pathway for, you know, liver transplants or whatever you're imagining,

335

00:36:06.790 --> 00:36:09.759

Room: you can publish that in the Storyline library,

336

00:36:10.130 --> 00:36:15.709

Room: and that's like an app store, like an Apple App Store, for health care AI algorithms

337

00:36:15.890 --> 00:36:17.209

Room: and care.

338

00:36:17.400 --> 00:36:21.179

Room: And then immediately that is available to everyone throughout the world

339

00:36:21.210 --> 00:36:22.419

Room: through telehealth.

340

00:36:22.670 --> 00:36:24.049

Room: and they can pay for it.

341

00:36:24.660 --> 00:36:27.179

Room: So people are actually building their start ups.

342

00:36:27.270 --> 00:36:43.319

Room: They don't need to train all the models and build all the software and the permissions and the data storage solutions and privacy and the safety problems, etc., etc. It's already solved on Storyline. You can go straight in, build your care pathway, train your algorithms, publish, and have a product you can take out into the market.

343

00:36:46.050 --> 00:36:59.530

Room: Yeah, so that's a really, really important question. And

344

00:36:59.780 --> 00:37:06.610

Room: there are a number of different answers there. So one is that some metrics are very sensitive to skin tone.

345

00:37:06.820 --> 00:37:18.210

Room: There are challenges with people who have really, really dark skin in getting enough contrast to measure some of the facial movements and dynamics;

346

00:37:18.880 --> 00:37:25.540

Room: for folks with lighter skin tones, we're finding that all of that works really, really effectively.

347

00:37:25.630 --> 00:37:35.429

Room: Some algorithms in the speech and the text components, if you have accents and some of these other,

348

00:37:35.650 --> 00:37:36.359

you know.

349

00:37:37.040 --> 00:37:39.399

Room: variations in speech patterns

350

00:37:39.440 --> 00:37:42.770

Room: can introduce errors and difficulties.

351

00:37:42.850 --> 00:37:53.269

Room: And so, whatever you're building on the AI platforms, you have to be very cautious of the types of biases that you're introducing, and you need to understand what those biases are.

352

00:37:53.540 --> 00:37:57.089

Room: So what we've done at storyline is to build a bias report.

353

00:37:57.310 --> 00:37:59.480

Room: Let's say you collect data from like

354

00:37:59.590 --> 00:38:02.980

Room: a thousand people, and you bring it into your ecosystem.

355

00:38:03.040 --> 00:38:11.630

Room: Right away... let's say you're diagnosing depression, and you've built a model that is diagnostic of depression symptoms.

356

00:38:13.220 --> 00:38:15.390

Room: You'll test the model,

357

00:38:15.620 --> 00:38:18.070

Room: right? You'll train it, and then you'll test it.

358

00:38:18.230 --> 00:38:24.789

Room: and we will see whether there is a bias in your results for particular racial groups or accents,

359

00:38:25.400 --> 00:38:38.400

Room: and that will help you to diagnose problems within your ecosystem. And then you need to work through that. To be honest, there is not going to be a single algorithm that solves every problem for every group.
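
The core computation in such a bias report can be sketched very simply: the same performance metric broken out per subgroup, so a skin-tone or accent effect cannot hide inside one aggregate number. Synthetic data and an illustrative metric (AUC) below; this is the general technique, not Storyline's actual report:

```python
# Synthetic example: one overall AUC, then the same AUC per subgroup.
import pandas as pd
from sklearn.metrics import roc_auc_score

test = pd.DataFrame({
    "y_true":  [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "y_score": [.9, .2, .8, .4, .3, .1, .7, .6, .9, .2, .5, .4],
    "group":   ["A"] * 6 + ["B"] * 6,        # e.g. skin tone or accent group
})

overall = roc_auc_score(test.y_true, test.y_score)
print(f"overall AUC: {overall:.2f}")
for group, sub in test.groupby("group"):
    auc = roc_auc_score(sub.y_true, sub.y_score)
    # a subgroup scoring well below the overall number flags a bias to fix
    print(f"group {group}: AUC {auc:.2f} (gap {auc - overall:+.2f})")
```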

360

00:38:41.620 --> 00:38:44.949

Room: Super important question. Whoever put that up, thank you.

361

00:38:46.740 --> 00:38:51.179

Room: One of the things that I learned from the genome medicine revolution is that

362

00:38:51.230 --> 00:38:53.650

Room: one of the big mistakes they made

363

00:38:53.820 --> 00:38:56.299

Room: is allowing everybody to become siloed.

364

00:38:56.860 --> 00:39:01.010

Room: What that means is that every institute or every researcher,

365

00:39:01.370 --> 00:39:18.939

Room: the researchers are the worst, would build these little sort of empires of data and not allow anybody else to touch the patients, right? And they would have their whole career, and they'd publish papers based on these patients that only they ever had access to. And

366

00:39:19.240 --> 00:39:28.890

Room: you can imagine, like, how frustrating and difficult that is. It's very good for that person's career, but very difficult for the whole exercise and problem in general of improving medical care

367

00:39:28.900 --> 00:39:44.869

Room: and diagnostics. If all genomes had been shared initially, so there was a way to integrate the data for all genome sequences as they came off of an Illumina sequencer, we would be a hundred miles ahead of where we are right now.

368

00:39:45.040 --> 00:39:48.549

Room: Right? Data integration and sharing would have moved us

369

00:39:48.600 --> 00:39:50.100

Room: miles faster.

370

00:39:50.430 --> 00:39:57.660

Room: And so what we've done at Storyline is to correct this problem. It's very difficult to correct, but we're trying,

371

00:39:58.070 --> 00:40:09.180

Room: where it's one ecosystem, right? So there's the research, and people are launching their research projects and things like that, and the data can be de-identified and shared into a common platform.

372

00:40:09.320 --> 00:40:11.160

Room: And that's really powerful.

373

00:40:11.310 --> 00:40:20.529

Room: But by contributing to that, you really are creating something that can move your own research much, much more quickly, right? Because if you're trying to train an AI model for,

374

00:40:20.570 --> 00:40:22.010

Room: let's say depression again.

375

00:40:22.770 --> 00:40:25.479

Room: you're not just trying to distinguish depression from healthy.

376

00:40:25.750 --> 00:40:28.899

Room: You're trying to distinguish depression from bipolar,

377

00:40:29.280 --> 00:40:41.030

Room: from mania, from subtypes of depression, from alcoholism, from, you know, all kinds of things. So to ever get good models that are effective and accurate, everybody has to work together.

378

00:40:41.750 --> 00:40:45.149

Room: So the platform has to be designed from the ground up with that in mind.

379

00:40:46.100 --> 00:41:02.150

Room: And then, once these solutions are built, they can be immediately piped out into the real world, right? When I was diagnosed with cancer, it felt good to write papers. But how futile, right, to write a paper and have it published,

380

00:41:02.160 --> 00:41:06.740

Room: and you know hardly anybody reads it right. There's 7,000 papers published every month.

381

00:41:06.870 --> 00:41:08.240

Room: Nobody reads that stuff.

382

00:41:08.520 --> 00:41:12.479

Room: So you really need to renovate

383

00:41:12.610 --> 00:41:20.329

Room: the whole biomedical research system, so that the research and the discovery is immediately and seamlessly piped into the clinical world.

384

00:41:20.830 --> 00:41:22.569

Room: All of that data just flows.

385

00:41:27.630 --> 00:41:34.180

Room: Any questions? I'm going to get into a few case studies; I want to show you guys sort of how people are using this.

386

00:41:35.220 --> 00:41:36.109

Room: Yeah.

387

00:41:36.190 --> 00:41:37.589

So to capture

388

00:41:40.210 --> 00:41:54.659

Room: What we're going to do is we're going to be able to integrate wearable data. So let's say you have a problem that you're trying to solve, and part of it has wearable data.

389

00:41:54.990 --> 00:41:58.590

Room: What we can do is the wearable data is

390

00:41:59.180 --> 00:42:17.970

Room: going into a data repository; Storyline, through an API, hooks into that repository and pulls your patients' data into the same Storyline ecosystem. So all of these wearables and other devices can be very easily and seamlessly integrated into the Storyline

391

00:42:18.070 --> 00:42:18.919

Room: system

392

00:42:19.050 --> 00:42:22.970

Room: Does that make sense? So it can all be pulled in. Storyline itself

393

00:42:23.090 --> 00:42:25.839

Room: is not focused on wearable type of data.

394

00:42:26.070 --> 00:42:31.640

Room: We're focused specifically on this video-captured smartphone interview type of

395

00:42:31.700 --> 00:42:32.490

data.
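
The integration pattern being described might look something like the sketch below: the device vendor holds the data, and a connector polls the vendor's API and maps records onto a shared patient timeline. The endpoint, token handling, and field names are all hypothetical, since no specific vendor API is named in the talk:

```python
# Hypothetical connector: endpoint, token handling, and field names invented.
import requests

VENDOR_API = "https://api.example-wearable.com/v1"   # no real vendor implied
TOKEN = "..."   # per-patient OAuth token, granted when the patient consents

def pull_heart_rate(patient_id: str, day: str) -> list[dict]:
    resp = requests.get(
        f"{VENDOR_API}/users/{patient_id}/heart_rate",
        params={"date": day},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    # map vendor records onto one shared (timestamp, measure, value) shape
    return [
        {"t": r["timestamp"], "measure": "heart_rate", "value": r["bpm"]}
        for r in resp.json().get("samples", [])
    ]
```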

396

00:42:33.420 --> 00:42:33.979

Yeah.

397

00:42:36.580 --> 00:42:38.790

Room: Any other questions on

398

00:42:39.500 --> 00:42:42.109

Room: If anybody online has a question.

399

00:42:43.200 --> 00:42:45.469

Room: Something came up, but I'm not sure.

400

00:42:47.200 --> 00:42:50.529

Room: Oh, okay, okay, okay, good. Yeah, Thank you.

401

00:42:51.390 --> 00:43:00.840

sharaz khan: Yeah, I just, you know, on the chat I just put this program I used last year, Anura AI. I don't know

402

00:43:00.970 --> 00:43:07.179

sharaz khan: if this is something similar, but essentially, you know, on your smartphone you can

403

00:43:07.210 --> 00:43:22.149

sharaz khan: sort of map out your face, and then it collects... you know, I don't think it's as sophisticated data as Storyline, but it has a good base. I don't know if you've heard of that.

404

00:43:23.980 --> 00:43:27.389

Room: I haven't heard of it. Thanks for sharing. I'll check it out.

405

00:43:27.470 --> 00:43:34.599

sharaz khan: Sure, sure. Yeah, no problem. I'll just put it on the chat. It's... yeah, I see it. I see it up there. Yeah, thank you. Thank you.

406

00:43:36.190 --> 00:43:43.409

Room: You know, the AI field is fast-moving. It changes by the week. So I'm always grateful for

407

00:43:43.540 --> 00:43:44.919

Room: new information.

408

00:43:47.540 --> 00:43:49.040

Room: Yes? Oh, sorry.

409

00:43:49.670 --> 00:43:58.920

Room: You were about to move to the next part, so I was wondering, then, just about the kind of integrated platform that you were talking about.

410

00:44:00.060 --> 00:44:03.130

Room: I don't know if I got this right, but it seemed like

411

00:44:03.420 --> 00:44:05.300

Room: You're saying how

412

00:44:05.330 --> 00:44:08.189

Room: for example, as a data scientist, I could maybe

413

00:44:08.230 --> 00:44:13.339

Room: look at all the data that has been collected and train my models on it,

414

00:44:13.410 --> 00:44:25.590

Room: and then the files will be in the format of the StoryArc storage? So is that accurate? That is accurate. Okay, yeah, if you want to do that, let me know.

415

00:44:25.660 --> 00:44:28.890

Room: Yeah. So then my concern, or

416

00:44:28.970 --> 00:44:48.120

Room: the problem that I would be wondering about is, for example, if I wanted to know about depression, then I would have all these features, like the facial expressions, etc. But then how would I know the predictors? Yeah, the labels? Yeah. So I haven't gotten into labeling, but Storyline has a whole labeling ecosystem

417

00:44:48.130 --> 00:44:49.750

Room: within its software.

418

00:44:49.880 --> 00:44:52.530

Room: And the data that

419

00:44:53.170 --> 00:44:57.919

Room: you know, within that data-sharing ecosystem, we retain the labels.

420

00:44:58.140 --> 00:45:08.409

Room: The other approach is an unsupervised approach, right? And that's actually one of my sort of favorite, but most difficult, applications. Yeah.

421

00:45:08.610 --> 00:45:09.940

Room: yeah. And I guess

422

00:45:10.100 --> 00:45:13.810

Room: kind of connected to that would be, since you talked about sequencing data,

423

00:45:14.010 --> 00:45:23.840

Room: the FASTQ files, I think, would be kind of a bit easier to anonymize. But with this, you're kind of publishing

424

00:45:24.730 --> 00:45:40.000

Room: so much information about their behavior and their facial expressions, and even their speech patterns. Is there any concern, or how do you deal with it? It is a huge concern for sure, and we have kind of a de-identification process that

425

00:45:40.010 --> 00:45:48.160

Room: goes through a number of steps. So the first is to scrub. We have an algorithm that goes through all of that speech-to-text data, right? If they say, you know,

426

00:45:48.210 --> 00:45:49.300

Room: my name is

427

00:45:50.090 --> 00:45:57.719

Room: John and my wife so and so is like driving me crazy, right? That's personal information that will all get scrubbed

428

00:45:57.890 --> 00:46:06.049

Room: by the algorithm that goes through the speech-to-text. So all of that personal information (address, location, you know, etc.) is scrubbed.
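
A toy version of that scrubbing step, using regular-expression patterns as stand-ins for what a production de-identifier would do with a trained named-entity model:

```python
# Toy PII scrubber over speech-to-text output; patterns are illustrative only.
import re

PATTERNS = [
    (re.compile(r"\bmy name is\s+\w+", re.IGNORECASE), "my name is [NAME]"),
    (re.compile(r"\bmy (wife|husband|son|daughter)\s+\w+", re.IGNORECASE), r"my \1 [NAME]"),
    (re.compile(r"\b\d{1,5}\s+\w+\s+(street|st|avenue|ave|road|rd)\b", re.IGNORECASE), "[ADDRESS]"),
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),
]

def scrub(transcript: str) -> str:
    # Apply each pattern in turn, replacing matches with neutral tags.
    for pattern, replacement in PATTERNS:
        transcript = pattern.sub(replacement, transcript)
    return transcript

print(scrub("My name is John and my wife Jane is driving me crazy. "
            "I live at 42 Oak Street, call 555-123-4567."))
```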

429

00:46:06.490 --> 00:46:11.319

Room: Then there's the facial data, right? And

430

00:46:11.410 --> 00:46:16.129

Room: The facial data is not high enough resolution

431

00:46:16.570 --> 00:46:21.350

Room: the way that we present it in the Storyline data formats, to reconstruct a person's face

432

00:46:21.460 --> 00:46:22.740

Room: accurately.

433

00:46:22.860 --> 00:46:36.889

Room: It's good, but it's not designed for that specific problem. And the added thing that we can do to even further anonymize that information is just dimension reduction.

434

00:46:37.680 --> 00:46:38.450

Room: So

435

00:46:39.140 --> 00:46:42.159

Room: then you're you know, inheriting kind of a

436

00:46:42.600 --> 00:46:44.950

Room: reduced-dimension feature set.
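
A minimal sketch of that extra anonymization step with scikit-learn's PCA; the landmark and component counts below are invented for illustration:

```python
# Project per-frame facial features into a low-dimensional space and share
# only the reduced components, which track expression dynamics without
# carrying enough geometry to rebuild the face.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
frames = rng.normal(size=(1000, 400 * 3))    # 1000 frames x 400 xyz landmarks

pca = PCA(n_components=20)
reduced = pca.fit_transform(frames)

print(reduced.shape)                          # (1000, 20) shared feature set
print(pca.explained_variance_ratio_.sum())    # variance retained
```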

437

00:46:46.140 --> 00:46:48.650

Room: Yeah, yeah, it's a great question.

438

00:46:48.800 --> 00:46:52.749

Room: Yeah. Oh, oh, sure.

439

00:46:53.030 --> 00:46:54.310

I see them on the screen. That

440

00:46:54.350 --> 00:46:56.270

Room: Oh, oh, good, good!

441

00:46:58.240 --> 00:46:59.260

It's fine, Thank you.

442

00:46:59.950 --> 00:47:02.680

It seems to

443

00:47:02.830 --> 00:47:04.690

the social.

444

00:47:05.130 --> 00:47:05.899

Let me see.

445

00:47:06.620 --> 00:47:07.240

Got it.

446

00:47:07.600 --> 00:47:11.770

Room: Yeah, yeah. What a great question. So

447

00:47:12.240 --> 00:47:13.359

Room: you know we

448

00:47:13.480 --> 00:47:19.669

Room: the definition of behavior in this case is anything you can record on a video of a person.

449

00:47:20.200 --> 00:47:25.529

Room: So you're absolutely right, it's a bit of a narrow term in this case,

450

00:47:26.210 --> 00:47:32.349

Room: because your app might ask people to take a video of their surgical wound,

451

00:47:32.610 --> 00:47:39.290

Room: right? And that's the data that you capture in your Storyline app, and you're trying to monitor inflammation or something like that.

452

00:47:39.790 --> 00:47:50.509

Room: We have people that approach this because they want to look at skin health, right? Like bags under the eyes, how much people are sleeping, acne, and that kind of stuff. And that's not behavior, that's,

453

00:47:51.330 --> 00:47:52.859

Room: you know, skin.

454

00:47:53.230 --> 00:47:57.919

Room: So you're right, the term behavior is a bit

455

00:47:58.040 --> 00:48:00.939

Room: narrow, but it's a big component of this.

456

00:48:03.550 --> 00:48:05.500

Room: I hope that answers the question.

457

00:48:05.560 --> 00:48:10.309

Room: Anything in video, basically, is of use on the Storyline platform.

458

00:48:11.690 --> 00:48:15.579

Room: Other questions: I'm curious how Storyline might support

459

00:48:15.940 --> 00:48:33.390

Room: engaged feedback with patients and care partners, along with clinicians and researchers. Yeah, really important thing. So you can use Storyline literally to have these kind of asynchronous conversations with your patients or

460

00:48:33.450 --> 00:48:52.009

Room: study participants, where they can leave you a video message that you watch, or staff at your clinic watches. You can obviously just send them a quick little questionnaire: Are you doing okay? Are you finding that this is working okay? Are you happy with my care? It's, you know, whatever the questionnaire could be.

461

00:48:52.300 --> 00:48:57.150

Room: But but you can also do this through video and asynchronous video messaging.

462

00:48:57.170 --> 00:48:57.770

Yeah.

463

00:48:58.580 --> 00:49:06.790

Room: And then the other question is: how do you sort verbal variables (word choice, sentence structure, etc.) for patients

464

00:49:06.940 --> 00:49:25.930

Room: for whom English is not their first language? Yeah, so that's a really good point, and this comes into the problem of bias for folks who have an accent because English is not their first language. You know, on the data science side of things, you need to be able to

465

00:49:25.940 --> 00:49:31.340

Room: think about that and solve that problem within your ecosystem. And what we aim to do is give you

466

00:49:31.480 --> 00:49:35.209

Room: all of the data that you would need to do that.

467

00:49:35.970 --> 00:49:41.849

Room: Many of the AI algorithms that exist in the world work pretty well for English speakers.

468

00:49:41.980 --> 00:49:44.119

Room: Spanish is getting pretty good.

469

00:49:44.160 --> 00:49:59.599

Room: and there's a number of different platforms that are coming with really good speech-to-text for many, many other languages. So we're getting better and better. The problem is the intersection, right, where you've got a thick accent and the language you're speaking is not your native tongue.

470

00:49:59.650 --> 00:50:13.040

Room: So it's a good question. Another way to answer it is that when you're trying to build an AI model to diagnose something, you want a lot of redundancy in features. You don't want everything to rely on one feature that might be contaminated by accent.

471

00:50:13.140 --> 00:50:23.149

Room: So you've got other measures, you know, blinking rate patterns, for example. If you look at folks with treatment-resistant depression, their faces are flat,

472

00:50:23.290 --> 00:50:24.489

Room: and they blink

473

00:50:25.130 --> 00:50:31.039

Room: very slowly. So you not only get information out of the voice, but you get a lot of

474

00:50:31.110 --> 00:50:33.020

Room: other information that's redundant.
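
A small sketch of that redundancy idea on synthetic data: train with and without the accent-sensitive speech features and check that accuracy holds. Every name and number here is invented for illustration.

```python
# Feature-group ablation: does the model survive dropping the speech group?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
y = rng.integers(0, 2, n)                           # synthetic diagnosis label
speech = y[:, None] + rng.normal(0, 1.5, (n, 10))   # noisier, accent-prone group
blink = y[:, None] + rng.normal(0, 1.0, (n, 5))     # blinking-rate features
face = y[:, None] + rng.normal(0, 1.0, (n, 20))     # facial-movement features

for name, X in [("all features", np.hstack([speech, blink, face])),
                ("speech removed", np.hstack([blink, face]))]:
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    print(f"{name}: accuracy {acc:.2f}")
```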

475

00:50:34.450 --> 00:50:35.649

Room: I hope that's helpful.

476

00:50:36.560 --> 00:50:40.740

Room: These biases, so, there's two ways to look at the biases. One is like

477

00:50:41.210 --> 00:50:42.930

Room: a scary problem, right?

478

00:50:42.960 --> 00:50:59.510

Room: But the other is huge opportunity. Because if you're out there collecting data and you can identify these different subpopulations of people and build algorithms that are useful, that's personalized medicine for particular subgroups. So think of it as an opportunity to go out there and get the data and solve the problem.

479

00:51:00.440 --> 00:51:01.149

Room: Yeah.

480

00:51:01.710 --> 00:51:08.330

Room: So this might be a misunderstanding on my part about what AI can do. But

481

00:51:08.660 --> 00:51:12.310

Room: how did you... did you meet with experts on

482

00:51:12.690 --> 00:51:14.590

Room: conditions to build

483

00:51:14.700 --> 00:51:18.799

Room: those feature profiles? Well, what a good question! So

484

00:51:19.610 --> 00:51:26.179

Room: Yes. So I am by far not even close to being an expert

485

00:51:26.280 --> 00:51:31.539

Room: in any of these different domains, right? We're going to talk about a lot of different stuff: cancer,

486

00:51:32.230 --> 00:51:38.199

Room: depression, psychedelic treatments, you know, we're doing so many: homelessness and addiction.

487

00:51:40.100 --> 00:51:43.049

Room: So the way it has to work is

488

00:51:43.360 --> 00:51:45.719

Room: Storyline builds the platform,

489

00:51:46.020 --> 00:51:50.430

Room: and then we partner with people who are experts in a particular domain.

490

00:51:50.870 --> 00:51:52.290

Room: and they go out

491

00:51:52.370 --> 00:51:56.930

Room: and figure out how to apply the platform in their area, right?

492

00:51:57.000 --> 00:51:59.839

Room: Because they understand the problems, they understand,

493

00:51:59.980 --> 00:52:06.040

Room: you know, the subtypes, and what sorts of features and measures and assessments might actually be useful.

494

00:52:06.360 --> 00:52:11.180

Room: In many cases you can take an assessment that's already being used, like animal naming,

495

00:52:11.400 --> 00:52:15.160

Room: which is, you know, kind of a cognitive test. It's been used for a long, long time.

496

00:52:15.480 --> 00:52:17.790

Room: and now just delivered as an AI

497

00:52:17.880 --> 00:52:29.070

Room: kind of thing. And now you get all this new information as somebody tries to name animals over the course of a minute, and the phenotypic patterns are quite striking and interesting.

498

00:52:33.220 --> 00:52:37.849

Room: Okay. And this is going to speak a bit to your point.

499

00:52:37.980 --> 00:52:43.539

Room: So I'm going to just tell a couple, three examples. One example is neurology.

500

00:52:43.580 --> 00:52:52.890

Room: So the way neurology typically works right now is you've got a person that comes into the clinic. They present with symptoms that bother them enough in their life that they now come in to see a physician,

501

00:52:52.940 --> 00:52:55.379

Room: and that neurologist does a workup.

502

00:52:55.780 --> 00:53:04.630

Room: It typically takes an hour or more, and they go through, and they're assessing different things and making expert judgments around phenotypes, abnormalities, etc.

503

00:53:05.120 --> 00:53:19.240

Room: So the problem that Fanny, who's an outstanding neurologist and researcher at Mount Sinai in New York, has taken on using Storyline is that neurological disorders are incredibly diverse,

504

00:53:19.570 --> 00:53:24.810

Room: and they're frankly, frequently missed or misdiagnosed in the primary care setting.

505

00:53:25.330 --> 00:53:32.899

Room: You know, primary care doctors are not trained to detect and diagnose the over 400 different neurological disorders there are in the world.

506

00:53:34.250 --> 00:53:47.239

Room: But these disorders really benefit from early detection. And so we want to be able to make, you know, assessments of neurological disorders more scalable, readily accessible, and available for people to get diagnosed early.

507

00:53:47.570 --> 00:53:54.449

Room: So Fanny seeks to build a neurological burden index, where the objective is just to say:

508

00:53:54.490 --> 00:53:55.569

Room: are you kind of

509

00:53:55.620 --> 00:54:01.750

Room: showing some symptoms that look a little abnormal compared to the population of people that is in your demographic.

510

00:54:02.070 --> 00:54:05.239

Room: and then, if so, maybe you know, this is an alarm that

511

00:54:05.910 --> 00:54:09.800

Room: would drive you to go into the clinic to seek care.

512

00:54:11.170 --> 00:54:20.389

Room: There's the supervised problem, where we've got over 400 known neurological disorders and we'd like to have an objective smartphone-based assessment that says you've got this versus that.

513

00:54:21.370 --> 00:54:34.130

Room: And then there's the one that, you know, as a researcher I love the most, which is the unsupervised approach, where you collect massive amounts of data and you discover new neurological disorders that nobody even had names for,

514

00:54:36.390 --> 00:54:40.970

Room: right. And so Fanny's approach, then, is she set up on Storyline,

515

00:54:41.050 --> 00:54:54.979

Room: and she's built a 15-minute integrative neurological, psychiatric, and clinical psychology assessment. Then she delivers that to people's smartphones and captures the data as they do the assessment. I'll show you how it works.

516

00:54:55.280 --> 00:55:00.459

Room: And then from the data, the idea is to discover these important phenotypes

517

00:55:00.520 --> 00:55:09.260

Room: train diagnostic models based on the labels for different phenotypes and subtypes, sometimes quite rare neurological disorders,

518

00:55:09.750 --> 00:55:11.659

Room: validate those models

519

00:55:11.940 --> 00:55:16.030

Room: and then make them available for everybody to use worldwide through telehealth.
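
A minimal sketch of that train-and-validate loop on synthetic placeholder data; the real assessment features and clinical labels are of course far richer:

```python
# Train a diagnostic model on labeled assessment features, then check it
# on held-out cases before anything would be deployed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(2)
X = rng.normal(size=(600, 50))               # assessment-derived features
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # stand-in phenotype label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))   # validation report
```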

520

00:55:16.640 --> 00:55:29.069

Room: And it's so easy to enroll people. They can just take a picture of the QR code on their phone, and that will take you straight to downloading the Storyline assessment, and you could do a neurological assessment on your phone right now.

521

00:55:29.090 --> 00:55:32.010

Room: And yeah, it looks like somebody's gonna go for it. So great.

522

00:55:36.110 --> 00:55:44.669

Room: Okay. So now here's what it's like. So, the neurological assessment: what I think is so cool about this is it's asynchronous.

523

00:55:44.710 --> 00:55:52.100

Room: People are watching the clinician tell them what to do, and they're just following along, right? You could never do this with a questionnaire.

524

00:55:52.670 --> 00:55:53.379

Room: Oh, hold on!

525

00:55:59.640 --> 00:56:01.429

Room: Oh, no! What's up.

526

00:56:04.520 --> 00:56:06.799

Room: Oh, I know why Hold on.

527

00:56:10.070 --> 00:56:10.629

Hang on!

528

00:56:16.590 --> 00:56:18.150

Room: The video is not in the

529

00:56:19.700 --> 00:56:20.299

here.

530

00:56:25.980 --> 00:56:27.830

Room: Can folks on

531

00:56:27.990 --> 00:56:30.460

Room: Slack, or excuse me, on

532

00:56:30.630 --> 00:56:32.089

Room: Zoom see the

533

00:56:32.350 --> 00:56:33.310

Room: No.

534

00:56:35.840 --> 00:56:38.260

Room: So maybe I just need to share.

535

00:56:39.000 --> 00:56:40.459

I think this will work

536

00:56:40.650 --> 00:56:41.330

to.

537

00:56:43.050 --> 00:56:43.839

Okay.

538

00:56:45.680 --> 00:56:46.620

Yeah, this is exactly.

539

00:56:47.900 --> 00:56:49.039

And I have to.

540

00:56:49.430 --> 00:56:50.100

if You'

541

00:56:53.330 --> 00:56:53.930

on

542

00:56:57.530 --> 00:56:58.129

by.

543

00:57:05.210 --> 00:57:06.370

Room: Okay.

544

00:57:13.000 --> 00:57:18.339

Room: So that would go on, you know, for 15 minutes or whatever. You feel like you're having this face-to-face

545

00:57:18.400 --> 00:57:20.810

Room: assessment and

546

00:57:22.720 --> 00:57:33.060

Room: And then what we can do with AI models is very precise tracking of all of the motor patterns and movements in the face, right? So this is me: I'm looking up,

547

00:57:33.260 --> 00:57:51.670

Room: looking right, looking left, down, like you would in a typical neurological assessment. And then micro-expressions and changes in movements and things like that in the face are being measured. Across all these different points in the face, that data is being extracted in XYZ coordinates and then archived into those Storyline data files.

548

00:57:52.070 --> 00:57:54.189

Room: It's not just the

549

00:57:55.470 --> 00:57:57.759

Room: face that we can analyze.

550

00:57:58.110 --> 00:58:08.999

Room: So we can do essentially these full kind of clinical interviews, but capture a lot of different data. So here's me, and it's measuring over 400 different points across my face,

551

00:58:09.450 --> 00:58:11.950

Room: so all micro-expressions and movements.

552

00:58:12.170 --> 00:58:18.709

Room: It's also capturing the RGB values, so we're going to get information about skin pallor, skin health,

553

00:58:18.920 --> 00:58:22.849

Room: and potentially perfusion patterns across the face.

554

00:58:23.610 --> 00:58:35.820

Room: Motor tests, you know, neurological assessments so often have these types of little motor assessments. All of this becomes measurable and quantitative instead of just a kind of judgment.

555

00:58:36.240 --> 00:58:44.410

Room: And, you know, I'm just doing it at home. So I don't have to come in from the country, or wherever, to get access to care at a major center.

556

00:58:46.090 --> 00:58:49.540

Room: And that's good for early diagnosis, right? That's the goal there.

557

00:58:52.480 --> 00:59:10.919

Room: So once you have this type of data, you can do dimension reduction and project that data into multidimensional space. And here, you know, even though the data is captured in 2D on the video, we can use AI models to estimate the 3D projection of the data.

558

00:59:10.930 --> 00:59:15.929

Room: And so you're seeing somebody's micro-expressions and movements being projected across all these spots.

559

00:59:16.000 --> 00:59:31.910

Room: And we're just plotting the data as we move through the video frames. For each frame we just do a PCA on the data to compress it, and then you can follow the changes to micro-expressions as they move through this dimension reduction.

560

00:59:33.540 --> 00:59:37.819

Room: very, very cool. So a neurological assessment becomes highly, highly quantitative.
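
A small sketch of that per-frame projection on simulated landmarks; the real pipeline extracts XYZ points from video, but the PCA step looks roughly like this:

```python
# Stack the xyz landmark coordinates per frame, fit a PCA, and follow the
# frame-by-frame trajectory through the reduced "expression space".
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_frames, n_points = 300, 400
t = np.linspace(0, 4 * np.pi, n_frames)

base = rng.normal(size=(n_points * 3,))                          # resting face
wave = 0.1 * np.outer(np.sin(t), rng.normal(size=n_points * 3))  # expression change
frames = base + wave                                             # (300, 1200)

trajectory = PCA(n_components=2).fit_transform(frames)
for frame_idx in (0, 100, 200, 299):
    print(frame_idx, trajectory[frame_idx])   # position in expression space
```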

561

00:59:41.740 --> 01:00:01.309

Room: Like I said, one of my favorite pet projects is not supervised learning but actually unsupervised learning, because this is where you get the opportunity for discovery. And so some of the data scientists in the group have really been focused on improving unsupervised learning algorithms. And here is an example where they made synthetic data.

562

01:00:01.380 --> 01:00:09.159

Room: So if each dot is a patient, you can imagine here, you've captured thousands of measures for each patient. So it's high-dimensional data.

563

01:00:09.260 --> 01:00:10.200

Room: And then

564

01:00:10.320 --> 01:00:21.799

Room: here there's really no subgroups, and then we gradually create subgroups. So we've got a ground truth, right? And now there are real subgroups coming out in the synthetic data.

565

01:00:21.920 --> 01:00:33.089

Room: And what Jared on the team has done is he's tested some of the different unsupervised learning algorithms that exist in the world against his own Storyline cluster test,

566

01:00:33.110 --> 01:00:44.540

Room: and what this shows, this is the in-group proportion test from Rob Tibshirani at Stanford, and his test, you know, as you move from no real clusters to lots of really coherent clusters,

567

01:00:44.760 --> 01:00:48.779

Room: it sort of gradually improves, but it's not very sensitive.

568

01:00:51.000 --> 01:00:52.070

Room: don't tell Rob

569

01:00:52.120 --> 01:01:09.490

Room: But Jared's test is beautiful. It really scales nicely as true biological subgroups emerge in the data; the score comes up beautifully with the coherence of that data. And so he has a nice algorithm, I think, for finding bona fide subgroups

570

01:01:09.500 --> 01:01:18.310

Room: from this high-dimensional data. So if you collect lots of data from different patients, you can subtype and discover subtypes in new ways. That's

571

01:01:18.490 --> 01:01:19.720

Room: cool in my

572

01:01:19.810 --> 01:01:20.880

Room: in my world.
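
A rough sketch of that benchmark idea; the in-group proportion variant described here is unpublished, so the standard silhouette score stands in for the cluster-quality measure:

```python
# Generate high-dimensional "patients" with progressively stronger subgroups
# and score how coherent the recovered clusters are.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

for separation in (0.0, 1.0, 3.0, 6.0):       # 0 = no real subgroups
    X, _ = make_blobs(n_samples=300, n_features=1000, centers=4,
                      cluster_std=5.0, center_box=(-separation, separation),
                      random_state=4)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    print(f"separation {separation}: silhouette {silhouette_score(X, labels):.3f}")
```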

573

01:01:21.020 --> 01:01:32.229

Room: And, you know, I always come back to my roots in genetics, where we used to, and still, think about liability thresholds: where a certain number of genetic mutations in your genome

574

01:01:32.640 --> 01:01:35.079

Room: tips you into this clinically relevant

575

01:01:35.180 --> 01:01:54.049

Room: tail of the distribution, right? So, these liability thresholds. And we can start to think about behavior in the same way, where you gather many, many different measures, and a certain number of abnormal behavioral features and neurological symptoms tips you into this kind of clinically relevant realm.
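
The behavioral analogue of a liability threshold can be sketched in a few lines; the reference ranges and threshold below are invented for illustration:

```python
# Count how many measured features fall outside a demographic-matched
# reference range and flag when the count crosses a threshold.
import numpy as np

rng = np.random.default_rng(5)
reference_mean = np.zeros(100)           # demographic reference, 100 features
reference_sd = np.ones(100)
patient = rng.normal(0.6, 1.0, 100)      # shifted patient profile

z = np.abs((patient - reference_mean) / reference_sd)
abnormal_count = int((z > 2.0).sum())    # features beyond 2 SD

THRESHOLD = 10                           # hypothetical liability threshold
flag = "refer for clinical follow-up" if abnormal_count > THRESHOLD else "within range"
print(abnormal_count, "abnormal features ->", flag)
```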

576

01:01:56.090 --> 01:01:57.060

Room: Okay.

577

01:01:58.050 --> 01:02:00.589

Room: I'm over time.

578

01:02:00.830 --> 01:02:05.170

Room: Is it okay? Yeah, okay. Thank you, all of you, for, you know,

579

01:02:05.730 --> 01:02:07.820

Room: not going to hockey again. Thank you.

580

01:02:09.050 --> 01:02:10.560

Room: Okay,

581

01:02:11.300 --> 01:02:13.340

Room: This is a project that is,

582

01:02:13.520 --> 01:02:16.999

Room: you know, one of those big missions, right? And

583

01:02:17.190 --> 01:02:24.389

Room: in Canada we're pretty good at taking care of people, right? You know, I'm really proud of that about

584

01:02:24.420 --> 01:02:27.229

Room: the home country, but the United States

585

01:02:27.530 --> 01:02:34.990

Room: it's a little different. So in the United States, you know, 2.5 million people are incarcerated.

586

01:02:35.730 --> 01:02:44.049

Room: Many people are homeless. They do not have, in many cases, the same social infrastructure to help people,

587

01:02:44.690 --> 01:02:45.669

Room: and

588

01:02:45.810 --> 01:03:02.029

Room: so there's a huge need in the world to help folks, and really it's going to be a technology solution that does this. And I'll talk about, you know, the details here. So this is a partnership between the National Institute of Jail Operations in the United States,

589

01:03:02.080 --> 01:03:14.950

Room: The Last Mile, which is an organization that helps folks coming out of prison get jobs and transition and reintegrate into the world, and Seek Haven, and Seek Haven even helps folks like veterans

590

01:03:14.960 --> 01:03:22.509

Room: and people living on the street get off the street, get homes, get stable, and, if they're suffering from addiction and things like that, recover.

591

01:03:23.620 --> 01:03:40.449

Room: So this is a huge social and medical problem in the US. You know, for those of you who have been to some of the cities down there, there are huge cities of tents and people living on the street, and things like that. They have more mentally ill people in the jails than in the hospitals.

592

01:03:40.590 --> 01:03:51.270

Room: So they're essentially using the incarceration system to take care of the mentally ill down there. 64% of inmates have clinical mental illness symptoms,

593

01:03:51.330 --> 01:03:58.019

Room: 2.5 million incarcerated people, and suicide is the leading cause of death in these jails and prisons,

594

01:03:58.350 --> 01:04:01.750

Room: and then 500,000 people are homeless.

595

01:04:01.940 --> 01:04:06.190

Room: And, you know, like I said before, one of the challenges...

596

01:04:06.420 --> 01:04:12.109

Room: Well, there's a number of challenges here. One of the challenges is that the folks that work in the jails,

597

01:04:12.280 --> 01:04:15.890

Room: they're not psychiatrists, they're not expert

598

01:04:15.950 --> 01:04:22.300

Room: clinicians. They are getting paid $18 an hour as prison guards, and they have to make

599

01:04:22.430 --> 01:04:41.040

Room: very stressful decisions every day that they're at work, right? They're dealing with a very high-stress environment. There's a lot of turnover in the staff. So even when you get somebody who becomes an expert and becomes really good in the position (somebody just gets arrested, and they know how to deal with them, how to house them, how to take care of them so that they're safe),

600

01:04:41.810 --> 01:04:42.980

Room: and they burn out.

601

01:04:43.400 --> 01:04:44.899

Room: And then there's a new person that comes in.

602

01:04:45.930 --> 01:05:04.970

Room: So we really need better solutions to help manage decision-making in these ecosystems. And one of the challenges, and one of the opportunities, here is that because so many mentally ill people are in this ecosystem, we can start to understand mental illness more objectively and find different subtypes.

603

01:05:05.550 --> 01:05:13.020

Room: And so this is an example of a company that has been built on the Storyline platform, called Rubicon AI,

604

01:05:13.230 --> 01:05:20.180

Room: and their mission is to help the jail and prison systems make better decisions,

605

01:05:20.210 --> 01:05:24.189

Room: and they partnered with the National Institute of Jail Operations to do that.

606

01:05:24.760 --> 01:05:34.980

Room: And basically what they're focused on right now is they built a 15-minute intake assessment. So police pick up somebody off the street,

607

01:05:35.760 --> 01:05:39.399

Room: They drop them off at the desk at the jail, and they drive away.

608

01:05:39.990 --> 01:05:44.409

Room: The person sitting there behind the desk has no information about this individual.

609

01:05:44.460 --> 01:05:51.210

Room: They don't know what substances they might be addicted to. They don't know what mental health issues they may be struggling with.

610

01:05:51.370 --> 01:05:58.550

Room: But at the same time, they need to bring them into a very stressful environment and make sure that the whole thing doesn't explode.

611

01:05:59.320 --> 01:06:02.809

Room: So they have built a little intake assessment,

612

01:06:03.200 --> 01:06:07.839

Room: and their first mission is to address suicide and suicide risk.

613

01:06:08.110 --> 01:06:18.970

Room: So within the first 72 hours there's a reasonably high suicide risk among folks that get arrested, and they're trying to train an AI model to predict and identify people at high risk,

614

01:06:19.040 --> 01:06:21.210

Room: and those people will get special care.

615

01:06:21.540 --> 01:06:26.190

Room: They need to validate that model, and then they need to deploy it at massive scale

616

01:06:26.660 --> 01:06:28.899

Room: for use by jail and prison staff.

617

01:06:29.480 --> 01:06:40.299

Room: So they have got a workable solution in place. This is what their interface looks like: Storyline sitting on the back end, capturing and analyzing the data, and then it's presented in the front end

618

01:06:40.330 --> 01:06:46.470

Room: to help build suicide risk scores that are easy for the prison staff to interpret,

619

01:06:46.690 --> 01:06:50.060

Room: and these are active in jails right now in Utah.
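
An interpretable intake risk score of that general shape can be sketched as follows; the features, model, and bands are hypothetical, not Rubicon's actual system:

```python
# A logistic model over intake-assessment features whose output is binned
# into a simple band that staff can read at a glance.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
X = rng.normal(size=(400, 12))                           # 12 intake features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, 400) > 1.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

def risk_band(features: np.ndarray) -> str:
    p = model.predict_proba(features.reshape(1, -1))[0, 1]
    return "HIGH" if p > 0.5 else "ELEVATED" if p > 0.2 else "LOW"

print(risk_band(X[0]))   # band for one just-booked individual's intake answers
```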

620

01:06:53.420 --> 01:06:57.890

Room: And so I'm just gonna finish, you know, my talk talking about

621

01:06:57.960 --> 01:06:59.729

Room: cancer, which is where we started.

622

01:07:02.290 --> 01:07:03.560

Room: One of my

623

01:07:03.590 --> 01:07:06.589

Room: dreams is to take this technology

624

01:07:06.760 --> 01:07:23.300

Room: and use it to support cancer patients and improve outcomes. And this is a collaboration now with the Moffitt Cancer Center, which has got a $750 million injection of cash from the Florida state government, so they will be one of the largest, if not the largest, cancer centers in the world;

625

01:07:23.610 --> 01:07:28.030

Room: Arizona State University; and the Huntsman Mental Health Institute.

626

01:07:29.580 --> 01:07:41.690

Room: One of the things that surprised me as I became a patient and learned more and more about the disease is that it's not our genetics that are the primary drivers of the risk for getting cancer, we think.

627

01:07:42.150 --> 01:07:51.289

Room: Genetic risk factors, the inherited forms of cancer, like BRCA1 and BRCA2, are actually pretty rare causes of cancer.

628

01:07:51.760 --> 01:07:58.900

Room: Overall, it's actually behavioral factors, we think, that play one of the biggest parts: smoking,

629

01:07:59.300 --> 01:08:00.259

Room: alcohol.

630

01:08:01.010 --> 01:08:11.390

Room: right, bad lifestyle, bad diet, social support, nutrition, stress, treatment compliance, exercise. All of these components add up to affect the

631

01:08:11.520 --> 01:08:14.940

Room: risk for cancer onset later in life.

632

01:08:15.750 --> 01:08:23.579

Room: And like I said, you know, cancer is now thought to be a kind of immunometabolic disease, and so a lot of these environmental and behavioral factors can impair

633

01:08:23.640 --> 01:08:26.439

Room: metabolism and immune pathways.

634

01:08:29.529 --> 01:08:30.840

Room: And

635

01:08:31.100 --> 01:08:38.229

Room: this means that this is kind of a complex problem to solve. But behavior is the place to kind of get in and help,

636

01:08:38.590 --> 01:08:44.239

Room: and it's a place where nobody's focused. Most of the cancer research efforts are focused on the tumor

637

01:08:44.300 --> 01:08:57.459

Room: right? Sequencing tumors, epigenome, genome, understanding immune cell infiltration into the tumors. This is where most of the research is focused. There's a huge opportunity, I think, to focus on the patient and patient behavior.

638

01:08:57.960 --> 01:09:14.680

Room: And so one of the things I've done is build this Cancer Patient Masterclass online, building a community of patients and providing them with the tools to understand themselves and their disease. One of the things that I found as a cancer patient is just

639

01:09:15.000 --> 01:09:19.010

Room: having an understanding that what I was doing was making things better

640

01:09:19.180 --> 01:09:21.250

Room: was incredibly motivating.

641

01:09:21.279 --> 01:09:28.069

Room: Even though it's very hard to do a lot of these lifestyle changes and behavioral interventions, if you can see the results,

642

01:09:28.200 --> 01:09:31.939

Room: it's incredibly motivating, and it gives you something to work towards

643

01:09:32.120 --> 01:09:44.170

Room: The AI tools that Storyline provides can help people to see the improvements: fatigue is improved, mental health is improved, skin pallor and coloration are improved,

644

01:09:44.470 --> 01:09:45.599

Room: muscle tone,

645

01:09:45.670 --> 01:09:54.840

Room: etc., etc. So we can build these scores that help to support the patients and help to drive them towards better outcomes and better actions

646

01:09:56.160 --> 01:10:07.659

Room: You know, I described the extinction therapy approach, and, God bless them, at the Moffitt Cancer Center they think a lot about these types of strategies.

647

01:10:07.670 --> 01:10:16.619

Room: There, they have got a donor who's gonna support a pilot trial of the extinction therapy regimen for metastatic breast cancer starting in the spring.

648

01:10:17.090 --> 01:10:27.279

Room: And so that program of drug switching and metabolic switching will be put together into an algorithm that will be delivered through Storyline, and we will see if it works for other patients,

649

01:10:27.640 --> 01:10:34.259

Room: and the metabolic switching paradigm will also be tested in a trial at the Huntsman Cancer Institute in the new year.

650

01:10:34.510 --> 01:10:43.289

Room: So that's very exciting for me, and it's a very exciting opportunity to see what types of patients do well through this type of program,

651

01:10:43.650 --> 01:10:47.349

Room: what types of patients struggle with particular components

652

01:10:47.560 --> 01:10:57.940

Room: Can I diagnose and see which patients are going to struggle with particular parts early, and then provide them with special support to help them get through the whole pipeline and program

653

01:10:58.430 --> 01:11:00.500

Room: Man, that'd be amazing, right?

654

01:11:00.720 --> 01:11:15.530

Room: And then, if you can provide that support at scale through a smartphone ecosystem, so it's not face-to-face care providers, it can all be provided, you know, at massive scale. That's a dream, right? Then you really can move the needle for a lot of people.

655

01:11:16.240 --> 01:11:17.300

Room: So

656

01:11:17.620 --> 01:11:20.959

Room: what I've tried to do today is tell you this story

657

01:11:21.170 --> 01:11:22.059

Room: of

658

01:11:22.320 --> 01:11:32.389

Room: we're building a solution for a problem that we think we've identified in the precision research and care world, which is around human behavior analysis.

659

01:11:32.450 --> 01:11:42.020

Room: And we feel like there's an opportunity to make that more objective and data-driven and precise, and then integrate it in a lot of different aspects of medicine and medical care.

660

01:11:42.510 --> 01:11:52.809

Room: And of course the goal, as I described, is really to build the platform so that other smart people can use the tools to move very, very quickly to solve problems that they understand.

661

01:11:53.320 --> 01:12:11.389

Room: So, you know, behavioral data is incredibly rich, useful, powerful, and easy to capture. You can capture it at massive scale on a smartphone. This will be a front-line diagnostic, right? I talked about the second-line molecular lab tests; very important.

662

01:12:11.500 --> 01:12:19.290

Room: but getting something out on the front lines that is massively scalable and powerful, I think, is a real opportunity.

663

01:12:19.460 --> 01:12:25.270

Room: Decisions always start from patient symptoms: behavior, mental health, expression, and what they kind of look like.

664

01:12:25.740 --> 01:12:29.710

Room: So there's an opportunity to make that measurable and objective.

665

01:12:29.750 --> 01:12:31.769

Room: It is a predictor of outcomes.

666

01:12:32.000 --> 01:12:41.640

Room: and it's now, you know, I think, ready for prime time in biomedical research in a lot of different areas, many of which I haven't even thought of. So

667

01:12:41.680 --> 01:12:47.649

Room: we're aiming to support this conceptual shift from just thinking about drugs and drug development

668

01:12:47.770 --> 01:12:52.159

Room: to building these more effective care algorithms and care.

669

01:12:54.010 --> 01:12:55.150

Room: And you know, I

670

01:12:55.700 --> 01:13:03.820

Room: you know, I don't know if it's limitless, but there are, hopefully, really useful applications that come out of all of this work.

671

01:13:04.510 --> 01:13:09.129

Room: And if you have any questions about what it was like to build this,

672

01:13:09.680 --> 01:13:12.120

Room: What were the problems that you encountered?

673

01:13:12.440 --> 01:13:22.030

Room: data security, blah blah blah, just email me and ask me. You guys are embarking on all these cool adventures to build and solve precision medicine problems.

674

01:13:22.150 --> 01:13:27.910

Room: I'd be delighted, you know, to share my experiences and war wounds. It's very hard.

675

01:13:28.070 --> 01:13:39.489

Room: Very, very hard. It's like, imagine how hard you think it is, and then multiply that by 100. It's so hard. And working in the Canadian health care system has a lot of wonderful benefits,

676

01:13:39.580 --> 01:13:42.319

Room: a lot of amazing opportunities, but it's

677

01:13:42.860 --> 01:13:44.620

Room: very hard

678

01:13:44.670 --> 01:13:53.869

Room: to innovate within this system. And so you guys have to go in really tough, you know. My brother is a surgeon here in Alberta,

679

01:13:53.890 --> 01:13:58.109

Room: and he, over the course of his career, faced many

680

01:13:58.280 --> 01:14:00.560

Room: challenges and frustrations with

681

01:14:00.680 --> 01:14:05.099

Room: wait times and difficulties providing the care that he wanted to provide.

682

01:14:05.230 --> 01:14:19.940

Room: But he's finding ways to build private clinics, to open up entrepreneurial opportunities that increase access to care for folks in Calgary, Alberta. So, you know, it is going to change. Just keep pushing.

683

01:14:21.510 --> 01:14:22.200

I'm done

684

01:14:22.330 --> 01:14:24.010

Room: thank you.

685

01:14:30.470 --> 01:14:32.769

Room: Sure. Sure

686

01:14:34.690 --> 01:14:36.159

a good old question.

687

01:14:38.720 --> 01:14:43.839

Room: Oh, yeah, so the starting cost is the best cost of all. It's free.

688

01:14:45.480 --> 01:15:04.420

Room: And then as you scale up, you know, I think it's $99 a month or something like that, and then $399, depending on the number of accounts and the amount of data that you want to process. And then, you know, if you want to grow some massive entity where you're capturing enormous amounts of data, then

689

01:15:04.430 --> 01:15:12.629

Room: we have these sort of enterprise-scale solutions, or partnerships, or opportunities to bootstrap. You know, everybody wants

690

01:15:12.760 --> 01:15:19.020

Room: you guys to succeed, right? Like, we want whoever works with the platform to win. So

691

01:15:19.620 --> 01:15:20.639

Room: hopefully that's helpful.

692

01:15:20.990 --> 01:15:24.379

Room: Lots of flexibility there.

693

01:15:24.680 --> 01:15:31.229

Room: Can this platform be beneficial to and used by researchers who are building research products and apps,

694

01:15:31.260 --> 01:15:37.129

Room: or is it more so for business? I actually perceive it more as a research-based

695

01:15:37.270 --> 01:15:40.610

Room: tool. I think of it very much like genome sequencing

696

01:15:41.030 --> 01:15:48.450

Room: where, you know, right now everybody's sequencing stuff on an Illumina sequencer, and

697

01:15:48.600 --> 01:15:51.310

Room: you're capturing all this wonderful molecular data.

698

01:15:51.420 --> 01:15:54.920

Room: Why not capture rich symptom and behavioral data

699

01:15:55.090 --> 01:15:57.879

Room: that you can intersect with that kind of multi-omics work.

700

01:15:57.900 --> 01:16:03.449

Room: So, you know, please consider it for your big research projects, consortia, all that kind of stuff.
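
Intersecting the two data types can be as simple as a keyed join; the column names below are placeholders:

```python
# Merge behavioral features with omics measurements on a shared participant ID.
import pandas as pd

behavior = pd.DataFrame({"participant_id": ["P1", "P2", "P3"],
                         "blink_rate": [14.2, 9.8, 17.1],
                         "speech_rate_wpm": [128, 95, 140]})
omics = pd.DataFrame({"participant_id": ["P1", "P2", "P3"],
                      "gene_expr_A": [2.3, 1.1, 2.0]})

merged = behavior.merge(omics, on="participant_id")
print(merged.corr(numeric_only=True))    # behavioral x molecular correlations
```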

701

01:16:03.790 --> 01:16:08.170

Room: And compared to most of the things that you're doing in the lab,

702

01:16:08.370 --> 01:16:14.180

Room: I run a lab, so I know how much it costs. This is really cheap. It's really, really cheap.

703

01:16:15.900 --> 01:16:20.440

Room: You know, I laugh, because we sneeze and we spend like $1,000 an hour in the lab,

704

01:16:20.540 --> 01:16:23.249

Room: and $1,000 will get you really far in Storyline.

705

01:16:26.270 --> 01:16:28.279

Room: Oh, if you'd like to.

706

01:16:28.300 --> 01:16:35.239

Room: Oh, good, thanks, yeah. Cost: so on the website, if you go to the website, you can sign up for free and get

707

01:16:35.390 --> 01:16:38.619

Room: an account right away. Are any of the other costs,

708

01:16:39.130 --> 01:16:48.839

Room: is the cost dependent on the services that Storyline would provide? Sort of. To be honest, right now it's mostly a subscription-based model,

709

01:16:49.060 --> 01:16:54.239

Room: so not so much like turning features on and off,

710

01:16:54.330 --> 01:16:56.589

Room: mostly like: get a subscription,

711

01:16:56.950 --> 01:16:58.480

Room: and you have the keys to the kingdom.

712

01:16:58.590 --> 01:16:59.190

Yeah.

713

01:17:03.690 --> 01:17:04.429

Room: yeah.

714

01:17:05.750 --> 01:17:17.670

Room: to capture more of the... yeah, yeah, all of that's there. So, you know, nobody is doing this on the platform yet, but the algorithms have been built.

715

01:17:17.960 --> 01:17:20.369

Room: So, cool, like,

716

01:17:20.930 --> 01:17:22.199

Room: I was wondering about

717

01:17:22.470 --> 01:17:34.030

Room: people doing yoga and things like that, like if you can pick up on interesting movements. But of course there's also neurological assessments, like if you're evaluating Parkinson's or tremors and walking patterns, etc.

718

01:17:34.120 --> 01:17:35.099

Room: there's

719

01:17:35.550 --> 01:17:39.909

Room: It's all built. So if you want to do that, we can collect the data, and

720

01:17:40.030 --> 01:17:40.610

you know.

721

01:17:46.950 --> 01:17:49.290

Room: yeah, yeah.

722

01:17:53.670 --> 01:17:57.910

And now, how does that?

723

01:17:57.980 --> 01:18:00.260

And how does that help?

724

01:18:03.470 --> 01:18:09.799

Room: Interesting. So imagine Fanny's situation, where, so,

725

01:18:10.270 --> 01:18:11.090

Room: there's

726

01:18:11.400 --> 01:18:29.070

Room: imagine the future, right? So Fanny's situation today is, somebody comes into her clinic and she doesn't necessarily know what's wrong with them. If it's Alzheimer's disease, it will take an hour and a half to diagnose. There'll be a series of exclusionary tests that need to be done to work through, you know,

727

01:18:29.100 --> 01:18:38.369

Room: I forget, is it vitamin D deficiency? But there's, you know, all sorts of things you need to rule out to get down to a likely Alzheimer's diagnosis.

728

01:18:38.390 --> 01:18:40.089

Room: Okay, that takes her an hour.

729

01:18:40.360 --> 01:18:41.340

Room: Now,

730

01:18:41.550 --> 01:18:53.389

Room: if the person could just do that assessment in 15 minutes, we measure 30,000 different features, and we can get a model that's 99%, 95%, 90% accurate for that diagnosis.

731

01:18:53.500 --> 01:18:55.380

Room: When the patient comes into the clinic,

732

01:18:55.430 --> 01:19:06.830

Room: Fanny's way ahead of time, right? And so she doesn't necessarily need to spend an hour and a half on that problem anymore. She might move straight to second-line tests: blood tests for amyloid,

733

01:19:07.780 --> 01:19:12.779

Room: or scans, you know, whatever she decides is the next course of action.

734

01:19:13.690 --> 01:19:16.669

Room: There's the time-saving on her part

735

01:19:17.660 --> 01:19:18.590

Room: Now

736

01:19:19.430 --> 01:19:20.230

Room: the

737

01:19:20.740 --> 01:19:25.529

Room: So this is the clash of the Canadian versus the US health care system.

738

01:19:25.780 --> 01:19:29.980

Room: In the Canadian health care system, you want improved efficiency,

739

01:19:30.340 --> 01:19:31.450

Room: because.

740

01:19:31.940 --> 01:19:40.890

Room: having many, many more patients come to you just clogs the system and causes a lot of problems. So you want to improve the efficiency, and you want to get more patients through effectively and safely.

741

01:19:41.360 --> 01:19:43.479

Room: In the American health care system,

742

01:19:43.780 --> 01:19:47.929

Room: you're a business. You want more patients to come to you, you know,

743

01:19:48.070 --> 01:19:56.249

Room: worried that they have neurological health problems. So if you can get early diagnostics out there on the market that help to capture

744

01:19:56.600 --> 01:19:58.980

Alzheimer's.

745

01:20:00.030 --> 01:20:09.649

Room: you're pulling them into your ecosystem where you can run $3,000 MRI scans, etc., etc. You know what I mean. So, yeah,

746

01:20:10.130 --> 01:20:12.020

Room: two different worlds, right? Yeah.

747

01:20:14.370 --> 01:20:26.089

Room: It's a completely different world, right? Yeah. So you can have these cleared, approved tests that you get out in the world, and the idea is just to help inform and get early diagnosis for all kinds of things.

748

01:20:26.310 --> 01:20:26.929

Yeah.

749

01:20:27.410 --> 01:20:27.969

yeah.

750

01:20:29.400 --> 01:20:31.529

How are you guys

751

01:20:31.730 --> 01:20:34.599

getting your diagnosis?

752

01:20:34.760 --> 01:20:35.820

There's the only thing

753

01:20:36.300 --> 01:20:38.509

if there are any differences, like,

754

01:20:39.510 --> 01:20:40.549

how are you guys...?

755

01:20:42.890 --> 01:20:54.130

Room: Yeah. So the way that process works, it's kind of the same as any kind of test that one's developing in medicine. It always starts with an expert

756

01:20:54.220 --> 01:21:07.369

Room: clinician, and typically, if you can, you get multiple expert clinicians. So somebody comes in and they've got a diagnosis of depression, and you might get three, four, five psychiatrists that say:

757

01:21:07.660 --> 01:21:14.640

Room: I agree. You know, this person is not this, it's not that; they have depression. So that person gets accurately labeled,

758

01:21:15.320 --> 01:21:20.359

Room: and this is why it's so expensive and challenging to build these algorithms.

759

01:21:20.620 --> 01:21:25.730

Room: Now you've got a ground-truth data set of folks that have real,

760

01:21:26.430 --> 01:21:28.429

Room: Everybody agrees on depression.

761

01:21:29.400 --> 01:21:32.699

Room: And your first job is to

762

01:21:32.940 --> 01:21:36.370

Room: build an algorithm that matches those clinical experts

763

01:21:36.520 --> 01:21:44.470

Room: and gets a certain accuracy and generalization error that's acceptable for that. Does that make sense? So that's where it starts.
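
A small sketch of that consensus-labeling step on simulated ratings: keep only the cases all the raters agree on, then measure cross-validated accuracy against those labels.

```python
# Consensus labels from multiple simulated clinicians, then a model checked
# against them with cross-validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_cases, n_raters = 300, 5
truth = rng.integers(0, 2, n_cases)
# each simulated clinician agrees with the true label ~90% of the time
ratings = np.where(rng.random((n_cases, n_raters)) < 0.9,
                   truth[:, None], 1 - truth[:, None])

votes = ratings.sum(axis=1)
consensus = (votes == 0) | (votes == n_raters)          # unanimous cases only
X = truth[:, None] + rng.normal(0, 1.0, (n_cases, 30))  # behavioral features

acc = cross_val_score(LogisticRegression(max_iter=1000),
                      X[consensus], truth[consensus], cv=5).mean()
print(f"{consensus.sum()} consensus-labeled cases, CV accuracy {acc:.2f}")
```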

764

01:21:44.620 --> 01:21:45.559

Room: Now

765

01:21:45.670 --> 01:21:53.560

Room: The reason I'm excited about unsupervised learning is that, let's say, depression is a fairly broad diagnosis. You throw a whole bunch of people into that bucket,

766

01:21:53.740 --> 01:22:05.610

Room: and then you apply an unsupervised learning algorithm that clusters them out into subgroups. That becomes an exciting research paradigm, right, where you've got these new subgroups and you follow them over time, and maybe one group

767

01:22:06.160 --> 01:22:14.270

Room: responds very well to a particular dose of a particular SSRI, and the other group is treatment-resistant to everything until you give them psilocybin,

768

01:22:14.870 --> 01:22:16.349

Room: and then you know

769

01:22:16.730 --> 01:22:18.540

Room: that's the solution for them

770

01:22:18.560 --> 01:22:21.900

Room: or it's, you know, hopefully not, but ECT, whatever.

771

01:22:22.010 --> 01:22:22.610

Room: Yeah.

772

01:22:31.470 --> 01:22:37.809

Room: Is the overall anonymized data within Storyline available to play with

773

01:22:38.090 --> 01:22:46.719

Room: and overlay, to identify insights and opportunities? So, Krista, great question. Yes, yes, I can do that.

774

01:22:48.100 --> 01:22:53.290

Room: I understand, you know, I'm a scientist. You want to play with it, right? See what it's good for.

775

01:22:53.840 --> 01:22:59.630

Room: Is it possible to use data generated by others in the program? Yeah, so that's the solution with the API thing, where

776

01:23:00.810 --> 01:23:08.129

Room: there are now these API ecosystems. And if you're gathering data on an Apple Watch in your study, or you're gathering

777

01:23:08.240 --> 01:23:11.879

Room: Fitbit data, or, you know, rings and things like that,

778

01:23:13.050 --> 01:23:16.740

Room: you can pull that into the ecosystem through API calls.

779

01:23:17.300 --> 01:23:17.910

Yeah.

780

01:23:19.560 --> 01:23:22.409

Room: Awesome questions. You guys are

781

01:23:22.690 --> 01:23:23.530

Room: You're on it

782

01:23:23.610 --> 01:23:24.610

Room: pretty cool.

783

01:23:25.770 --> 01:23:28.440

Room: We have just a few minutes left. Oh, go ahead, I'll check.

784

01:23:29.640 --> 01:23:30.760

Yeah.

785

01:23:33.310 --> 01:23:34.380

Yeah.

786

01:23:37.290 --> 01:23:37.990

yes.

787

01:23:38.950 --> 01:23:41.550

Room: I can hear you. It's okay, and I

788

01:23:42.080 --> 01:23:42.660

right?

789

01:23:43.430 --> 01:23:46.220

So, about that unsupervised learning,

790

01:23:46.670 --> 01:23:48.099

how you

791

01:23:48.610 --> 01:23:49.530

because you're added it.

792

01:23:50.960 --> 01:23:51.780

So

793

01:23:51.890 --> 01:23:52.809

I want to

794

01:23:53.300 --> 01:23:55.180

kind of was curious about what kind of

795

01:23:55.650 --> 01:23:56.990

features... is there a

796

01:23:57.730 --> 01:23:58.389

paper.

797

01:23:58.910 --> 01:24:01.039

Room: No, there is not a paper.

798

01:24:02.670 --> 01:24:08.720

Room: I'm the bottleneck on writing that paper. So, yeah, we have not

799

01:24:08.970 --> 01:24:15.579

Room: published the algorithm that does that yet. I want Jared to test it in a few different ways before it's, like, ready.

800

01:24:16.660 --> 01:24:17.449

But

801

01:24:17.530 --> 01:24:18.970

Room: yeah, and so you

802

01:24:22.010 --> 01:24:22.639

about it.

803

01:24:26.970 --> 01:24:30.389

No.

804

01:24:39.180 --> 01:24:40.009

I really wanna

805

01:24:45.000 --> 01:24:45.590

one.

806

01:24:53.080 --> 01:24:54.559

so it's a lot of

807

01:24:55.960 --> 01:24:59.130

the next.

808

01:25:02.190 --> 01:25:03.150

It should be pretty

809

01:25:03.350 --> 01:25:03.920

very excited.

810

01:25:04.470 --> 01:25:05.030

Okay.

811

01:25:05.200 --> 01:25:06.750

Room: Some great questions.

812

01:25:10.350 --> 01:25:15.140

Room: Can I ask one question of you?

813

01:25:16.740 --> 01:25:18.030

Room: Can you hear me now?

814

01:25:20.230 --> 01:25:21.639

Room: Okay, okay.

815

01:25:22.200 --> 01:25:23.130

Room: All right for that.

816

01:25:23.930 --> 01:25:27.180

Okay, I'll just ask one question. I

817

01:25:34.640 --> 01:25:36.790

kind of

818

01:25:41.080 --> 01:25:44.270

You've described this care.

819

01:25:44.490 --> 01:25:53.009

Room: And so I wonder if you have a vision of, like, you know, using this as an application to allow individuals to get in there and

820

01:25:53.120 --> 01:25:58.909

Room: customize their own journey, their own plan.

821

01:25:59.060 --> 01:26:00.130

Room: Yeah.

822

01:26:00.230 --> 01:26:13.460

Room: is this

823

01:26:13.770 --> 01:26:15.629

no changes or whatever?

824

01:26:16.620 --> 01:26:17.580

So

825

01:26:18.670 --> 01:26:26.480

this is huge.

826

01:26:40.470 --> 01:26:41.790

2 guys

827

01:26:42.150 --> 01:26:42.990

no

828

01:26:43.720 --> 01:26:49.120

Room: showing up to the

829

01:26:49.630 --> 01:26:50.570

Room: So

830

01:26:50.590 --> 01:26:52.350

Room: it's because they

831

01:26:52.740 --> 01:27:05.689

Room: And so

832

01:27:06.820 --> 01:27:08.199

Room: I think we

833

01:27:08.230 --> 01:27:09.160

Room: really

834

01:27:18.780 --> 01:27:19.809

Room: sorry.

835

01:27:23.760 --> 01:27:29.479

Room: Well, we're at 7:30. So,

836

01:27:29.600 --> 01:27:32.960

Room: guys, thank you very much for your time.

837

01:27:38.910 --> 01:27:41.969

Room: Yeah, thanks to everybody online, and sorry I,

838

01:27:42.330 --> 01:27:46.989

Room: you know, came in, and reach out if you have any questions about anything.

839

01:28:07.800 --> 01:28:11.049

Room: Thanks so much for this opportunity.

840

01:28:14.360 --> 01:28:16.269

I signed up for the.
