Joshua White – Fireside Project and Lucy, an AI Training Simulator for Psychedelic Support

January 22, 2026

Fireside Project is a nonprofit that helps reduce the risks of psychedelic experiences through a free support line, coaching, education, and research. In this episode, Joshua White speaks with Psychedelics Today about why real-time support matters, what it takes to run a national hotline, and what Fireside learned after more than 30,000 conversations since launch.

White shares how his background as a lawyer and his early hotline volunteering shaped Fireside’s model. He also describes how festival harm reduction work, including lessons from Zendo-style support spaces, revealed a major gap: people often need help during an experience and after it ends.

A major focus of the conversation is Lucy, Fireside’s new voice-to-voice role-play simulator designed to improve psychedelic support skills through low-stakes practice.

Early Themes With Fireside Project

Joshua White introduces Fireside Project as an accessible safety net for people who are actively having psychedelic experiences or processing past ones. The support line launched on April 14, 2021, and relies on trained community volunteers who commit to a year of service.

White explains why anonymity matters. He argues that a phone-based container can make it easier for callers to share vulnerable material without fear of judgment. He also frames service as a key part of integration for volunteers who want to give back or prepare for work in the psychedelic field.

Core Insights From Fireside Project

White describes the early difficulty of building Fireside from scratch, including legal design, insurance hurdles, training development, and fundraising. He credits seed support from David Bronner and Dr. Bronner’s for helping Fireside prove that people would actually use a psychedelic support line.

He also explains a key harm reduction point: calling emergency services during a non-medical psychedelic crisis can escalate risk. Fireside aims to help people regulate, re-orient, and stay safer when panic or fear shows up.

Key concepts discussed include:

  • The thin line between healing and traumatizing during high-intensity psychedelic states
  • Why callers often need connection, not rescue
  • How volunteer capacity and call volume shape how long conversations run
  • The difference between support during an experience and longer-term coaching support

Later Discussion and Takeaways With Fireside Project

The conversation then turns to Lucy, a training tool White describes as a “flight simulator” for psychedelic practitioners. Lucy is not part of the live support line. Instead, it offers emotionally responsive role-play scenarios so trainees can practice staying grounded, tracking consent and boundaries, and responding to crisis cues.

White also addresses recording and consent. He argues Fireside needs strong training feedback loops to improve safety and quality. He describes an anonymization approach designed to remove phone numbers, strip identifying details, and distort voices while preserving emotional tone. He also explains the post-call option for callers to delete their recorded conversation.
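
White's description maps onto a small, concrete pipeline: drop the caller's number, redact identifying details from the transcript, and store the result under a random identifier. Here is a minimal sketch in Python, with simple regex redaction standing in for real PII detection; the names and steps are illustrative assumptions, not Fireside's actual code.

```python
import re
import secrets

# Deliberately simple patterns; a production system would add NER for
# names, addresses, workplaces, and so on.
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")
NAME_INTRO = re.compile(r"\b((?:[Mm]y name is|I'?m)\s+)([A-Z][a-z]+)")

def redact(text: str) -> str:
    """Strip obvious identifiers from a call transcript."""
    text = PHONE.sub("[PHONE]", text)
    return NAME_INTRO.sub(r"\1[NAME]", text)

def store_call(transcript: str) -> tuple[str, str]:
    """Save a redacted transcript under a random ID. The caller's number
    is never passed in, so a subpoena keyed to a name or phone number
    has nothing to match against."""
    call_id = secrets.token_hex(8)  # random, not derived from the caller
    return call_id, redact(transcript)

print(store_call("Hi, I'm Dana. Call me back at 415-555-0199."))
# -> ('<random id>', "Hi, I'm [NAME]. Call me back at [PHONE].")
```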

Practical takeaways include:

  • Simulation can help trainees stay regulated when intense material emerges
  • Better training can reduce unnecessary diversion to emergency rooms
  • Clear consent language and easy deletion workflows matter for trust
  • Coaching can expand the continuum of psychedelic support beyond therapy

Frequently Asked Questions

What is Fireside Project?

Fireside Project is a nonprofit that runs a free psychedelic support line and provides coaching, education, and research to reduce risks around psychedelic use.

Is Fireside Project only for bad trips?

No. Fireside Project supports people in challenging moments, but also people seeking grounding, connection, and help processing past experiences.

What is Lucy from Fireside Project?

Lucy is a voice-based role-play simulator that helps train psychedelic practitioners through realistic scenarios and feedback. It is not used to answer live support line calls.

How can clinicians use Fireside Project resources?

Clinicians and trainees can use Fireside Project as a harm reduction reference point, and Lucy may fit into training programs as a simulation tool alongside supervision and practicum work.

Does Fireside Project record calls?

Fireside Project records some calls for training purposes. Joshua White says calls are anonymized, phone numbers are deleted, and identifying details are removed.

Can I opt out of having my call saved?

According to White, callers are notified that calls are recorded. After the call, people receive a follow-up message with the option to delete the anonymized recording. He also says Fireside plans to add clearer opt-out options before a call is saved.
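
One plausible way to wire up that flow, sketched in Python: the recording is stored under a random call ID, and the follow-up message carries a single-use token that triggers deletion. The token scheme and all names here are hypothetical, not a description of Fireside's actual system.

```python
import secrets

RECORDINGS: dict[str, bytes] = {}   # call_id -> anonymized audio
DELETE_TOKENS: dict[str, str] = {}  # one-time token -> call_id

def save_anonymized_call(audio: bytes) -> str:
    """Store an already-anonymized recording under a random call ID and
    return a one-time deletion token for the follow-up text message."""
    call_id = secrets.token_hex(8)  # random; not derived from the caller
    RECORDINGS[call_id] = audio
    token = secrets.token_urlsafe(16)
    DELETE_TOKENS[token] = call_id
    return token

def handle_delete_request(token: str) -> bool:
    """Run when the caller checks the 'delete my conversation' box."""
    call_id = DELETE_TOKENS.pop(token, None)
    if call_id is None:
        return False  # unknown or already-used token
    RECORDINGS.pop(call_id, None)
    return True

token = save_anonymized_call(b"...distorted audio...")
assert handle_delete_request(token)      # recording is gone
assert not handle_delete_request(token)  # token is single-use
```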

How is my identity protected if calls are used for training?

White says Fireside strips personal data and uses voice distortion for Lucy so voices cannot be recognized, while emotional tone is preserved. He emphasizes that recordings are not sold, shared, or licensed outside the organization.

What happens if law enforcement issues a subpoena?

White explains that subpoenas must be specific. He says Fireside cannot match anonymized calls to individuals or phone numbers because that data is not retained. He also describes challenging overly broad requests in court if needed.


This episode shows why infrastructure matters in the modern psychedelic resurgence. It highlights real-world risks, the need for scalable training, and the practical limits of emergency systems for psychedelic distress. If you want a clearer view of harm reduction, coaching, and simulation-based learning, this conversation with Fireside Project offers a grounded starting point.

Transcript

Joe Moore: [00:00:00] Hello everybody. Welcome back to Psychedelics Today. Joined today by Joshua White, founder at Fireside Project. How are you today?

Joshua White: Doing great. Excited for the chat.

Joe Moore: Yeah, it should be fun. So, um, first thing we wanna do, what is Fireside Project?

Joshua White: Fireside Project is a nonprofit that has a mission of helping people reduce the risks of their psychedelic experiences, and we do that in a few ways: through operating our psychedelic support line, which offers free support to people who are actually having psychedelic experiences, and also wanting to process past ones.

Joshua White: We do it through our coaching program, where people can sign up for a coach and receive scheduled long-term support by video, uh, when they're preparing for and integrating psychedelic experiences, but not during. We do [00:01:00] it through, uh, public education, and we do it through research.

Joe Moore: Thanks. So when, when did you all start?

Joshua White: Sorry.

Joe Moore: Yeah. When did you all start doing Fireside Project?

Joshua White: Yeah, so the psychedelic support line launched on April 14th, 2021, just in time for Bicycle Day. And so the support line has been operating for roughly four and a half years. We've had more than 30,000 conversations, and it's been really a beautiful experience.

Joshua White: The line is staffed by volunteers from the community who commit to one year of service, and I think we’ve trained at this point, like over 600 volunteers. So it’s really a beautiful way for people who are either wanting to be of service, we like to say service is the highest form of [00:02:00] integration, uh, and or wanting to prepare for a career in psychedelics.

Joe Moore: Yeah, that’s super cool. Um, so how, how did this idea come around? For you?

Joshua White: Yeah, it’s a great question. So, um, my first professional chapter was as a lawyer. I was mostly working for the city of San Francisco, but I, you know, I’d felt that I had sort of been like, pressured into practicing law because, you know, my, my dad is a, a lawyer and, um, never felt the deepest level of alignment with it.

Joshua White: But then when I started going to therapy and taking psychedelics, about 20 years ago at this point, I fell in love with, um, emotional support and psychedelics and psychedelic healing. So I began to kind of plot an escape from the law. I began to volunteer on a hotline in San Francisco [00:03:00] where I was living at the time.

Joshua White: That was part of a nonprofit then called SF Child Abuse Prevention Center. Uh, and basically parents could call in if they were feeling very activated and worried about what they were gonna do, and I just fell in love with support lines and providing emotional support. I thought that it was so powerful how we can create such a safe, non-judgmental container for people, um, by phone and through anonymity, which I think really can create this freedom to share about very vulnerable topics in a deep way.

Joshua White: Um, I also fell in love with psychedelic support by volunteering with the Zendo Project at festivals like Lightning in a Bottle and Burning Man. And so at the start of the pandemic, I was solo quarantining. I saw how, uh, the psychedelic field was kind of blowing up, you know, ketamine telehealth was starting.

Joshua White: The MAPS, uh, study was moving into Phase 3, and it just really seemed like, um, there [00:04:00] was an urgent need for widespread psychedelic support, for experiential training, and so forth. And so really I kind of combined a couple of my loves, uh, hotlines and psychedelic support, and also just really trying to do good for as many people as I possibly could.

Joshua White: That was kind of my focus at the City Attorney's office, uh, suing businesses exploiting marginalized communities.

Joe Moore: So, uh, it seems like there's a lot of service going on. That's great. So, um, let's see, where do we want to go? I assume you had seen some bad situations with psychedelics in your past, or difficult ones.

Joshua White: Definitely. You know, for me it was a contrast. I was always very lucky, because I had psychedelic community, I had a supportive partner, and I had a [00:05:00] therapist who was very psychedelic friendly.

Joshua White: And I could also feel within myself how urgently those different supports were sometimes needed. And then, for example, when I was volunteering at the Zendo Project, um, you could really see how terrifying, uh, dangerous, it could be. You know, there's a great phrase: the line between healing and traumatizing is at its thinnest on psychedelics.

Joshua White: And so I could really see at the Zendo Project people who were getting really close to that line, and even sometimes crossing it. And one of the kind of sad things about, you know, volunteering with the Zendo was that the festival would end, and then people often didn't have, uh, integration support afterwards.

Joshua White: And so, yeah, my experience at festivals really caused me to see how important it is to have, um, you know, really [00:06:00] robust support. I mean, in a way, Fireside Project is like, you know, the Zendo Project if it were everywhere, open 12 hours a day, and also did integration support.

Joe Moore: What was it like coming up with the whole organization and vision?

Joe Moore: I'm sure it was, like, somewhat complicated. Or is it like, you know, you just get a little box, and you've got an 800 number, and you can take calls from wherever?

Joshua White: It was. It was very complicated. You know, I had never, um, created an organization. I'd never even, like, really worked at an organization. All of my jobs had been with first the city of San Francisco, and then the federal judiciary before that.

Joshua White: Um, so I, I didn’t really know anything about starting an organization. And so in a way, my, my approach could be described as like jumping off of a cliff and trying to build a plane as I was hurdling towards the rocks. Now I was, I was lucky, you know, I, I was able to pull in [00:07:00] some really, um, amazing people from the start.

Joshua White: Uh, you know, Hanifa Washington, who is now part of PMHA in psychedelic medicine. Hanifa spent, sorry, Hanifa, if you're listening, you know, a couple of years with us, really helping to, um, hone the vision and the mission, helping to design our initial training.

Joshua White: Uh, we also had Adam Rubin working with us, who had spent a lot of time with the Zendo and other organizations. Um, but it was really hard. I mean, we had to build everything from scratch. Uh, we had to define the kind of, like, legal framework for it. Like, in other words, how is it legal to be a support line for psychedelics?

Joshua White: We had to get insurance and convince insurance companies that we were insurable. We had to write our training program, we had to develop marketing materials, uh, and we also had to get money for this organization. Uh, so all of that was very difficult. I mean, the [00:08:00] psychedelic philanthropy sector was still kind of coming into its own, really starting to ask itself:

Joshua White: Okay, now that the MAPS clinical trial is kind of most of the way through, what do we want to do now in terms of an ecosystem? And so we were very lucky that, um, we were able to kind of make the case that this harm reduction tool should really be part of, uh, the kind of foundational, um, like, the safety net of the field.

Joshua White: And we almost didn't make it, right? The person who kind of swooped in at the last moment to help us was David Bronner of, um, Dr. Bronner's soap company. And, you know, he provided some kind of, like, seed funding of his and of Dr. Bronner's, and then, uh, connected us with some other supporters, uh, to allow us to kind of

Joshua White: stay afloat while we, um, built this, and to try to really just, like, demonstrate: would people actually use this? Right? It's like, we thought it [00:09:00] was a really cool idea, but is anyone actually gonna call a support line while they're tripping, or wanting to talk about a past trip that they had?

Joshua White: And it turns out the answer is yes. Um, but we had to prove that concept.

Joe Moore: Yeah. Obviously, um, crazy interesting project for you to be working on. And, um, I remember just how excited everybody was that it was happening. Um, and you and I had some interesting conversations, right? Like, do I actually call somewhere and leave a trace of this, you know, they tell me it's a bad thing I'm doing, and then, or do I call 911, or do I tough it out?

Joe Moore: And, like, there's a lot of really interesting stuff there. Can you kind of go into the thinking there?

Joshua White: Yeah, for sure. I mean, so there, there are a few different kind of pieces of this, right? One is, [00:10:00] especially if you’re new to psychedelics, and of course there are so many people who are new to psychedelics these days.

Joshua White: Um, you might mistake feeling like you are dying, um, through ego dissolution, with actual dying. And so there are people who do go to emergency rooms, not a lot, but some, or who call 911 or 988, not a lot, but some. And in all likelihood, unless you're having a true medical emergency, it's the absolute worst thing that you can do.

Joshua White: I mean, imagine going into an emergency room, um, on acid, and getting injected with Thorazine or something like that. I mean, talk about the line between healing and traumatizing. You are well on the wrong side of the line, probably, if you're interacting with emergency services. So there's that on one side.

Joshua White: Um, and then on the other side of things, as you said, people are calling us and talking about, um, [00:11:00] you know, taking illegal substances, right? So one initial question is: how on earth is that legal? And, you know, the answer is, this is harm reduction, right? So, in other words, the fundamental principle of harm reduction is: people are gonna be engaging in a practice that has risks, and our role is to help reduce those risks.

Joshua White: And so for that reason, we only talk to people who have already consumed the psychedelic. So, for that reason, we can make sure that we're inoculating ourselves, if you will, against a claim that we're encouraging people to take psychedelics. We don't encourage people to take psychedelics. We can support them after they've made that decision, and create this kinda safe space for them to process their experience.

Joshua White: Now there’s another question, which is, um, sort of do no harm, right? How can we make sure that we are honoring the trust that our callers place in us? And this is really the foundational principle [00:12:00] of what we do. And, um, and so we. I think we honor, we go about honoring that trust in, you know, a bunch of different ways, right?

Joshua White: But when it turns, when it comes to people actually reaching out to us, we wanna make sure that, um, that records of them reaching out to us have been fully anonymized. And so that means things like deleting the phone number, uh, so that if we got a subpoena, for instance, for a particular phone number or a particular name, um, we wouldn’t, we wouldn’t be able to respond to it.

Joshua White: Um, but then, I know we’ll get into this more, but like, of course, another way that we, you know, need to, like honor the trust that our callers put in us is by continuing to get better and better at what we do. And that’s, that’s a whole kind of internal process within, within Fireside that has a lot of different parts.

Joshua White: But I’ll, I’ll pause before I kind of dive into that.

Joe Moore: Yeah. So there’s, there’s a lot here.

Joshua White: Yes.

Joe Moore: Yeah. [00:13:00] So I guess, uh, a quick tangent, if you will. Zendo has this situation. Mm-hmm. You know, what folks at Zendo and perhaps Fireside are consenting to is: hey, I'm here in the case of an emergency.

Joe Moore: Um, I'm not here to be your non-consensual trip sitter. You know, can you talk about that a little bit, and how that's maybe a challenge sometimes?

Joshua White: I’m not. Can you elaborate on what you mean by non-consensual trip sitter?

Joe Moore: I didn’t, I didn’t consent to sitting for you for five hours.

Joshua White: Oh, I see. I see. Yeah, that's great. Right.

Joshua White: So, in other words, we at Fireside, you're saying, may not have consented to that. Yeah. So it's a really interesting question, and it kind of brings up, I would say, like, a couple of things for me, right? I mean, one of them is, um, that there is a perception sometimes that Fireside is, like, the bad trip hotline, and we don't like the term bad [00:14:00] trip, but the challenging trip hotline.

Joshua White: And that is not accurate. It's too narrow, right? We support people who are in search of connection, who need someone to just listen and unpack their experience. And we also support integration callers. Um, so there's that piece of it. But, that being said, you know, people tend to not wanna spend all that much time on the phone.

Joshua White: Um, so if someone took a bunch of mushrooms or acid and they really wanted a human being to spend five hours with them... You know, we don't have a limit on trip calls, but the way that it kind of, like, shakes out in reality is that people reach out to us usually with, like, a particular need, right?

Joshua White: Sometimes the need is: I just wanna feel connection. And that need can be satisfied through sometimes a five-minute conversation, or a 45-minute conversation. Um, other times that need might take longer to satisfy. [00:15:00] Like, there's something really intense from their childhood coming up, and they really want to unpack it.

Joshua White: So we’re really like there for all of it. Uh, and we have limited bandwidth, right? We have three to five volunteers on every shift. Some shifts, as you can imagine, call volume is higher. And if we’re in hour three of a conversation with someone and call volume is really high, we might, um, invite the person to call back later if they need additional support.

Joshua White: So it’s kind of a ba a balance that we, that we get into. But generally, like people don’t seem to want a full trip, duration, trips sitter by phone. Like it just doesn’t tend to be that enjoyable.

Joe Moore: Yeah, that’s fair and good. Good as an observation. Yeah. Um, so. Cool. So what, um, I’d like to maybe pivot into is the new thing that you all just launched.

Joe Moore: Can you tell us about it? [00:16:00]

Joshua White: Sure. So we launched, uh, last week, uh, or I guess at this point it's almost two weeks ago, um, what we like to think of as the first AI client, or, another way of thinking about it, a role-play simulator for psychedelic practitioners. So we call it Lucy. And Lucy, uh, is an emotionally intelligent voice-to-voice simulator.

Joshua White: So, Lucy, to be crystal clear here: Lucy is not on the support line. Lucy's exclusive goal is to help train practitioners how to get better at what they do. Uh, and that's that. We kind of had this idea for a few reasons. As you know, psychedelics are, you know, sweeping the country, uh, hopefully FDA approval will be coming soon.

Joshua White: And there’s really a need to provide [00:17:00] high quality experiential training to aspiring practitioners. You talk to universities, you talk to, um, you know, companies in the field, and they’ll tell you it’s just really hard to get experiential training. And when you look at other types of professions, like let’s say a pilot, a pilot’s not gonna get behind a cockpit until they’ve spent some real time in a flight simulator.

Joshua White: And so Lucy is a flight simulator for psychedelic practitioners. So in other words, Lucy plays the role of the person having the psychedelic experience, or preparing for the experience, or integrating the experience, and on and on. And then the practitioner, the student, the trainee, whatever you wanna call them, interacts with them and provides support.

Joshua White: And so it's interesting to think about. So I said it was an emotionally intelligent, um, platform. Well, what does that mean? One way to kind of contrast it is with ChatGPT. [00:18:00] So when you talk to ChatGPT, it's not hearing your tonality, your emotionality. What it's doing is just translating your words into words within the system and then spitting back words at you.

Joshua White: Whereas Lucy is emotionally intelligent, which means that it can tune into your emotionality and respond in kind. So, just like, you know, if you think about, for instance, speaking to a friend, um, some friends are really good listeners, and you can feel them really attuned, and they're really present.

Joshua White: And with those types of listeners, you're more likely to open up and let them go deeper, and share more about your inner world. Other friends, um, may be tuned out, and not really tuned into what's happening for you, and you may, uh, open up less to those people. And so the idea here is to be able to train practitioners how to be emotionally attuned, how to cultivate their skills, in really, like, a low-stakes [00:19:00] environment.

Joshua White: And the last thing I'll just say for now, 'cause I know there's so much to get into about this, is, you know, this is not a substitute for having, uh, a mentor, having a supervisor. The idea is not to take humans out of the training process. It's to give people a low-stakes opportunity to keep practicing their skills, so that they can get to a basic level and then really, like, have a mentor, have a teacher, have a supervisor to help them kind of cultivate those skills.

Joshua White: So I like to think of it as a bridge between, like, the classroom, where often it can be very, like, dry and didactic, and how can we get you closer to where you can provide, uh, support in the real world. Not get you all the way there, 'cause that's impossible, right? A lot of time with humans is needed.

Joshua White: Time with supervisors, with mentors, with [00:20:00] teachers, and so forth is needed.
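
To make White's contrast concrete: a text-only model sees just the words, while a voice-to-voice system can also condition on prosody. The sketch below pulls a few classic prosodic cues out of an audio file with the open-source librosa library; it is purely illustrative and implies nothing about how Lucy is actually built.

```python
import numpy as np
import librosa

def prosody_features(wav_path: str) -> dict[str, float]:
    """A few prosodic cues a voice-to-voice system could condition on,
    which a text-only chatbot never sees."""
    y, sr = librosa.load(wav_path, sr=None)
    # Fundamental frequency track (pitch); NaN where the frame is unvoiced
    f0, _, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    rms = librosa.feature.rms(y=y)[0]  # loudness per frame
    return {
        "mean_pitch_hz": float(np.nanmean(f0)),     # agitation often raises pitch
        "pitch_variability": float(np.nanstd(f0)),  # flat vs. expressive delivery
        "mean_loudness": float(np.mean(rms)),
        "silence_ratio": float(np.mean(rms < 0.01)),  # pauses and hesitation
    }
```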

Joe Moore: Yeah, absolutely. Um, simulation is so helpful for so many reasons. Just to bring it, um, to another analogy: I just spent Sunday with Kyle, my co-founder, doing avalanche rescue training. Mm. Like, we did a lot of simulation.

Joe Moore: 'Cause, you know, you don't want to just be freaked all the way out and frozen, or run away, when somebody maybe has 10 minutes. And, um, it's quite the harrowing thing. And, you know, it's similar in psychedelic work too, maybe not similarly high stakes, but really high stakes. Like, people's futures could in a major way be radically benefited if you do well here.

Joshua White: Right?

Joshua White: Mm-hmm. And, I mean, even if you're a seasoned therapist or other type of clinician, you know, [00:21:00] to be interacting with someone having a high-dose experience of a psychedelic, where material from their childhood, maybe suppressed memories, maybe traumas, is resurfacing, it can really activate your nervous system, you know, in a similar way to, if you're in an avalanche, it can activate your nervous system.

Joshua White: And so how do you kind of try to put yourself into similar situations, but that are lower stakes? So, in a sense, you can start to train your nervous system to stay grounded. Which is, I think, like, you know, upwards of 90% of psychedelic support. It's: how can you remain, I love the Ram Dass expression, a loving rock, while, you know, the psychedelic process is unfolding.

Joe Moore: Yeah, I love that. Um, okay, so let's talk about, like, what kind [00:22:00] of outcomes are you hoping, uh, to come from Lucy being implemented at Fireside?

Joshua White: Yeah. That’s an amazing question. And you know, I, I think that, you know, core to our mission is, as I said, helping people reduce the risks of psychedelic experience in ways including training.

Joshua White: And so our hope is that Lucy can really be an accessible and scalable training tool, to give people, uh, you know, in the US and maybe beyond, opportunities to really, like, cultivate their skills. So the hope is, for example, that this can be one of the tools that training programs use.

Joshua White: So for example, we’re talking with, um, there’s a wonderful nonprofit called uep, university Psychedelic Education Program. And they’re, uh, training professors at different universities, uh, [00:23:00] to teach their students about psychedelics, starting with nursing and social work, and then probably expanding outwards.

Joshua White: Now, wouldn’t it be amazing in our vision if a student, for example, reads a book chapter on let’s say, boundaries and consent, uh, which, which of course is a big topic in psychedelics, and then they could actually do a simulation with Lucy, where there are various types of boundary and consent issues that they’re, that they’re dealing with.

Joshua White: And not just that, but then to get feedback afterwards from Lucy about how well they're doing. Now, is that to say that professors are not necessary? Of course not. Is it to say that, you know, demos with fellow students are not necessary? Of course not. Does it mean that, you know, um, practicums on site at ketamine clinics or Oregon psilocybin service centers are irrelevant?

Joshua White: Of course not. The idea is: how can we better [00:24:00] prepare practitioners for that first time that they're in the room with a real patient? So I want to just be so clear about what the ambition is here. It's to help practitioners be better prepared for when they're in the room, and for when they start working with mentors and teachers.

Joe Moore: Yeah. So it's endlessly important to have this, right? And I think your flight simulator was a great, uh, analogy. Like, you really do wanna spend a lot of time. And, you know, think about, I learned recently that at Aspen High School, you can graduate with a pilot's license, because they have a world-class flight simulator.

Joshua White: Right. ‘

Joe Moore: cause they have too much money, which is amazing for those people going through Aspen High School if they choose to do that. But there’s a lot of time in cost savings by having that simulator. And, um, how many good? [00:25:00] How many, yeah. There’s just so many better outcomes once you have a simulation tool.

Joe Moore: Kyle and I were honestly looking at a simulator build a year or two ago. Um, but, you know, that requires a reasonable amount of cash, to build that kind of thing.

Joshua White: Yeah. And I think, you know, it's interesting, because, like, in the world of mental health and beyond, there's an idea called simulation-based training, right?

Joshua White: Which is basically using simulators in different contexts. You know, you're in medical school, you do simulations; you're a nurse, you do simulations. And now we're at the point where the technology is such that we can really start to do, um, you know, simulations that are emotionally intelligent.

Joe Moore: Can you talk about that emotional kind of skill the software might have? [00:26:00]

Joshua White: Yeah. Absolutely. So I know we'll get into this, but, you know, I guess one of the things that is true in the world of simulations is, they're only as good as they are lifelike, right? If you're in a situation that seems fake, then probably you're not gonna get trained as effectively as you would in a very lifelike simulation.

Joshua White: And so there are, for example, now, in this kind of world of mental health, there are, like, out-of-the-box, um, you know, models that you can work with. For example, if you were a call center, like a Walmart or whatever, companies use call centers, you might want to train your operators how to be emotionally intelligent.

Joshua White: Now that’s interesting, but it has nothing to do with psychedelics. And so at Fireside, like one of the. Um, you know, the advantages that we have is we speak to thousands [00:27:00] and thousands of people, uh, during and after their psychedelic experiences. And so we can really draw on the, those conversations to create really lifelike, uh, to create really lifelike situations because they are drawn from, from actual situations or from kind of, um, syntheses of, of actual situations.

Joshua White: And I, I know we’ll talk about how we go, how we go about that, but, but you know, to really speak directly to your point, like the best way to create lifelike simulations is to base them as closely as possible on, on really like nuanced, lifelike situations.

Joe Moore: Yeah. Um, yeah, I think Harvard Business School is an example of that, and maybe a lot of business schools teach from case studies.

Joe Moore: Mm-hmm. And I think that’s, you know, super common in, um. Psychotherapy as well. Psychiatry probably as well. And yeah, so why not make really lifelike scenarios? So [00:28:00] let’s chat about that. So we, we chatted a little while ago, so I’m breaking long Colorado, I ate too much acid, you know, 20 hits. I’m like, uh, you know, I don’t think I can use this phone, but don’t let me try.

Joe Moore: And I call Fireside, right? And so I call, somebody's, you know, connected to me. Um, and I'm getting some help. Um, there's a whole, like, recording function, right? Can we talk about that, and how you all made the decision to record?

Joshua White: Yeah, that’s a great, it’s a great question and I wanna, if I could like, take a step back to a principle that I mentioned a moment ago, which is, which is.

Joshua White: Because of the nature of the conversations we have, people are reaching out to us really in, uh, some of their most, uh, you know, vulnerable times. Um, that creates for us a, a duty to keep doing better and better and better and to [00:29:00] keep learning a lot so that we can provide better support. So let, let me tell you, let me kinda give you a little bit of an overview of how the support line progressed.

Joshua White: Um, for our first, uh, three and a half years, the support line ran off of a hodgepodge of different systems, none of which spoke to each other. And what happened was, we had almost no idea of what was happening on particular calls, period. So a volunteer would take a call, and we literally would have no idea what was happening on that call, uh, unless there happened to be a supervisor who had bandwidth and who could, um, you know, listen in to some of the call so that they could provide feedback.

Joshua White: Now, that didn’t happen very often and it really left us in kind of a scary situation where volunteers are supporting people and [00:30:00] we don’t know how well they’re doing. We don’t know whether they’re improving and we don’t. We would’ve new supervisors come on and we would, it would be really hard to train them, um, because it was hard to communicate what was actually happening on the line.

Joshua White: And another problem that we had was, we would have new volunteers coming in. We do about three to four cohorts a year of 30 to 50 volunteers, and we do a 50-hour upfront training. And new volunteers would ask questions like, well, what's it like when someone is in immediate danger? What's it like when someone may be suicidal?

Joshua White: And we’re like, we would, we would have to just sort of use words to describe that and we would fail every time. Um, we would try doing role plays with the volunteers where they would split off into groups of two or three, and one would try to simulate. A person who was, uh, suicidal considering dying by suicide.[00:31:00]

Joshua White: And what would happen was volunteers just weren’t well prepared. Uh, and when they were then put in actual situations, uh, of having to support someone, it was absolutely terrifying. You know, kind of similar to your avalanche example, if your first experience of an avalanche was an actual avalanche as opposed to a simulated avalanche, you are in a way worse position to do avalanche rescue.

Joshua White: So that’s where we were for a really long time, and we believed that to really honor the, the trust that our callers have in. We just had to get better. Um, so we realized that, uh, you know, that recording conversations could be a way of training new and, and, uh, current volunteers, training supervisors, allowing supervisors to have one-on-one conversations with volunteers, allowing us to receive notifications from callers a day later, and then be able to circle back with the [00:32:00] volunteer and provide feedback.

Joshua White: So we knew that there was a really powerful reason why there was a need to have some kind of recording. But we also knew that we couldn't honor the trust that our callers put in us if we didn't have an over-the-top anonymization process, right? We needed to make sure that if, for example, a subpoena came our way, we just wouldn't be able to respond to it.

Joshua White: Um, or other efforts to kind of get ahold of our calls from people outside of Fireside. And so we basically, like, built a state-of-the-art software platform that really does an over-the-top job of anonymizing the conversations, and providing notice to callers beforehand, and an opportunity to delete afterwards, and various things.

Joshua White: And I can [00:33:00] walk through that step by step. So we sort of started, you know, um, recording conversations, anonymizing them, obtaining consent, providing the opportunity to delete. And I know we can talk more about this consent piece, 'cause I think this is really, um, this is part of the essence of it.

Joshua White: And one of the things that we saw was, you know, that our training and the quality of support that we provided really could get better and better and better. And the conversations that we were saving were totally anonymized and untraceable. So then, you know, as time passed, we started really asking ourselves, well, what are the needs within Fireside? How can we keep getting the training better and better?

Joshua White: How can we keep getting the training better and better? Um, and what are we also seeing in the field? And this question of experiential training and the, and the real dangerous lack of it kept coming up. And so then this idea for an [00:34:00] emotionally intelligent flight simulator, if you will, kind of came to us and we said, okay, we could, we, we can explore doing this if and only if the conversations, the anonymized conversations that we have truly are untraceable.

Joshua White: And one of the things that we did, you know, there's this idea that the sound of your voice, even if it doesn't have personal information in it, um, can be like a fingerprint. And so what we did was, we implemented this incredible voice distortion tool that distorts your voice beyond recognition while preserving the emotionality.

Joshua White: And so that additional layer, um, really means that, like, when Lucy, our simulator, uh, plays the role of someone tripping or integrating, [00:35:00] you're hearing a distorted voice that's not even, like, um, any human's actual voice. So, um, so I think that's a bit of the process.

Joshua White: I think we can go more into it, but I think that's kind of an overview of where the idea came from, but also what we wanted to really do right within Fireside when we were exploring whether to actually build this.
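
The "distort beyond recognition, preserve the emotionality" idea can be pictured as a small experiment: pitch-shift a recording (pitch is a strong speaker-identity cue) and then check that its loudness contour, a rough proxy for moment-to-moment emotional intensity, is essentially unchanged. A hedged sketch using librosa, not Fireside's actual tool:

```python
import numpy as np
import librosa

def distort_and_verify(in_path: str, n_steps: float = -5.0) -> float:
    """Pitch-shift a recording and return the correlation between the
    loudness contours of the original and distorted audio; a value near
    1.0 means the emotional intensity contour survived the distortion."""
    y, sr = librosa.load(in_path, sr=None)
    y_anon = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
    rms_orig = librosa.feature.rms(y=y)[0]
    rms_anon = librosa.feature.rms(y=y_anon)[0]
    n = min(len(rms_orig), len(rms_anon))
    return float(np.corrcoef(rms_orig[:n], rms_anon[:n])[0, 1])
```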

Joe Moore: Yeah. Well that’s super helpful, thank you for that. Um, and can you tell me a little bit about how, um, like consent opt-in works?

Joe Moore: ’cause it’s a little hard to give consent, I guess, after you’ve consumed.

Joshua White: Sure. Yeah. It's a really great question. Um, and so I think there's a couple of things here, right? The first thing I wanna say is, um, there's consent beforehand, right? So each person, when they reach out to us, is notified that, you know, calls may be recorded for, you know, [00:36:00] training purposes, or words to that effect.

Joshua White: And Lucy absolutely is a training purpose. It's a foundational training purpose. It's why Lucy exists. So Lucy is a training purpose. And then the conversation on the support line happens. It's fully anonymized regardless, so the phone number is thrown out. And then afterwards, the day after the conversation, the person receives a text message that informs them that, um, we've recorded their anonymized conversation.

Joshua White: And to delete it, they can check a box. Uh, and if they check the box, the system automatically deletes the call. And there's been some active discussion, uh, Joe, as I think you know, since we've launched. And I think one of the things that is really, you know, important to point out is, I think the nature of consent depends in part on what data you're actually saving, and the purpose for [00:37:00] which that data is used, right?

Joshua White: So I'll give you two extreme examples, on opposite ends of the spectrum. One is: if you called in, and the only data that we saved was, the caller sounds like a guy, and he took mushrooms, if all we noted down was male and mushrooms, no consent is needed in that situation, neither legally nor ethically.

Joshua White: The opposite end of the spectrum would be if we were recording conversations without anonymizing them and then selling them to Facebook or somewhere else. You know, I don't even think anyone could consent, period, to something like that if they were on psychedelics. And so what we have here is sort of something, I think, that's in the middle, which is:

Joshua White: We don’t save any personal information. The voices are distorted. The informa, the, the, um, conversations don’t leave fireside, meaning we don’t sell them, we don’t license [00:38:00] them, we don’t share them, period. What we have is a cluster of untraceable conversations and um, and those conversations are used to train our volunteers and to train others in the field.

Joshua White: And of course, if Lucy is used by someone outside of Fireside, what they're seeing is the output. They don't actually get access to the conversations themselves. So it's a really tricky issue. I see the arguments on both sides, and, um, I really think we go, you know, above and beyond to, um, you know, to do this.

Joshua White: And, you know, since we have really heard, like, some powerful, um, you know, kind of, uh, feedback from the community about this, we're also going to be, like, implementing, I think, like, additional layers, including, you know, more detail about Lucy in the opening, and more detail about the ability to delete the conversation before [00:39:00] it happens.

Joshua White: So, to preemptively say that we don't want it saved. Um, and other things like that. So we're really not trying to hide the ball here. And, um, also, when people call in, they're often tripping and need to talk to someone soon. And so too many words can be harmful in and of themselves.

Joshua White: And so that's why we have this pre- and post-call, um, you know, uh, process.

Joe Moore: Right. Um, yeah, thank you for that. It's a challenging but worthy thing to be digging into. And I think, um, if there's no, like, real serious metadata around it, um, you know, not storing your phone number and social security number, that helps.

Joe Moore: Um, and it’s, uh, it’s, it’s definitely a complicated, I heard, I heard [00:40:00] one no. So I, I haven’t been tracking, so, but one person said one thing to me and I wanted to just bring it up just so I’m, I’m feeling honest. Right. So like in AI’s become an ongoing concern in the last year or two. Mm-hmm. Right. Um, you all have had training and consent on, on kind of like these models before.

Joe Moore: So that was really an ongoing concern. And it's just, like, kind of interesting. I feel like this is an unfair way to say it: moving the goalposts. It feels like that's not what you've done. There's a new thing that came along that's a really powerful educational tool, and, you know, you've adapted what you've had to that. I find it interesting.

Joe Moore: It’s complicated. Um, but I don’t know any, anything you wanna comment on that then I got a slightly easier one.

Joshua White: Sure. Can I state the question back to [00:41:00] you, to make sure that I'm understanding it?

Joe Moore: Go for it.

Joshua White: So it sounds like, the way that I'm hearing what you're saying, is that someone expressed to you that, essentially, when we initially started recording,

Joshua White: Lucy didn't exist. And then after we started recording, Lucy did exist. And so, does the consent that was provided beforehand really still kind of hold up to muster?

Joe Moore: That, that was effectively what they were saying.

Joshua White: Okay.

Joe Moore: I was, like, too busy to sit with them to go any further.

Joshua White: Yeah.

Joe Moore: Um, than that.

Joe Moore: Totally. But that’s effectively what was po posed to me.

Joshua White: It’s a great, it’s a great question. I totally understand where that, that person’s coming from. It’s really, really valid. Um, and, you know, Lucy is a training purpose, right? There’s no, there’s no question that Lucy is a training purpose. And so the ques the question really is when we tell people, or when we started telling people in May, [00:42:00] 2024, and I think, you know, we, we put out a press release, like we pitched 50 or 60 news outlets to let them know we were gonna be doing this.

Joshua White: So we were never trying to hide the ball. But, um, you know, I think the question is, like, is there something different about this training purpose, right? Um, that would require us to sort of separately call it out in consent, in our disclaimer. Um, so, the first thing I'll say is, like,

Joshua White: you know, I don't think so. The second thing that I'll say is, like, we are gonna be updating our disclaimer process to do that, because we do really want to be as transparent as possible. But I actually think that, for me, when I hear people ask, is Lucy AI? Is Lucy a chatbot? Like, yes.

Joshua White: And the reason I don’t lead with we created an AI chat bot is because when people hear that, I think it provokes this like really [00:43:00] powerful and totally justifiable emotional response for reasons that have literally nothing to do with Fireside. Right. So I think that like, as a. As the public, like we, myself included, have been harmed and even traumatized by the way that big companies have, um, used our data, right?

Joshua White: Starting with companies like Meta, that have not just taken our data, but used it to divide us, to cater to our basest instincts, to sell it, and have it twisted and used to sell us products and services, and on and on and on. So that was even before the big AI companies, right? And then you have the big AI companies come along and scrape the entire internet for content, while destroying the environment in the process, not providing compensation to the artists whose copyrighted works they took, and [00:44:00] enriching their billionaire owners, right?

Joshua White: So that, when people hear 'AI chatbot', totally understandably, that's what they're thinking of. Now, Lucy has literally nothing to do with any of those things, right? This is a minuscule little model that's created by a nonprofit, after obtaining, you know, consent beforehand, with the opportunity to delete after, saving no personal data.

Joshua White: We use this incredible software to strip out the personal information, distorting the voices, using it to train our own volunteers, and then eventually others in the field. And importantly, right, nonprofits don't have owners. Like, I'm the founder; I have no ownership stake in Fireside. This is a public trust.

Joshua White: We are trying to stay afloat in a very difficult, uh, philanthropic environment that I think you're familiar with. And any money that we make from this is gonna go to help keep [00:45:00] the support line free. So I think that, because that is what we are doing, and because it literally doesn't have anything to do with, like, some of these other big models, I really do feel confident that, when we say this is just one of many training purposes,

Joshua White: Um, that it, that it’s, that it’s really, um, that, that it is, that it is ethical. And, and it’s also interesting too because, you know, one, one other thing that I’ll say is kinda interesting is like, so our training purposes, we have a range of internal training purposes. Sometimes a volunteer will have a really, really poignant call or they’ll like a nine, eight, eight call or a nine one one call will come in and the volunteer will just do a superb job, right?

Joshua White: That call may become our call of the week that we share with volunteers. So here we are, taking a call, which we anonymize, of course, like all of ours, and sharing it with a hundred, 150 volunteers. Now, we would never call that out in [00:46:00] our disclaimer: calls are recorded for training purposes, and one of them might be a call of the week that we share with 150 volunteers. Because it just doesn't make sense.

Joshua White: I think the question is, like, what is really the scope of what 'training purposes' means? And we also understand, as I said, that, like, we can do better. We're always trying to do better. And that's why, in the next few weeks, like, our upfront disclaimer will say: calls are recorded for training purposes; if you wanna learn more about what those purposes are, press one.

Joshua White: And if you press one, you'll hear that one of those training purposes is to create this simulation tool that's used to train our volunteers and others. And if you'd like to have your conversation deleted before it even happens, or not saved, that would be more precise, um, then you can press, uh, whatever the number is. So there will be a pre-conversation opportunity for deletion, and then more, um, disclosure on the back end.

Joshua White: So I think, for me, that really, like, highlights that, from the beginning, we're just trying to be as transparent as possible. And one of the things that transparency, like, leads to in this situation is, we get feedback, we hear the feedback, we incorporate the feedback, and then what that results in is a better disclaimer process, among other things.

Joe Moore: Can you talk to me about, like, the legal, well, let's call it, um, law enforcement. So say, for example, you receive a subpoena, mm-hmm, to provide all sorts of information. How does that work, and what's your plan around that?

Joshua White: That's great. That's a great question. Um, and, you know, as I mentioned earlier, I'm a lawyer, and I used to respond to subpoenas and write subpoenas, uh, when I worked for the city attorney of San Francisco.

Joshua White: So I have some experience with this, right? So, you know, [00:48:00] subpoenas have to be specific, right? Um, so, you know, it could be: I want all calls involving Joe Moore. I want all calls that say Joe Moore. I want all calls from Joe Moore's phone number, or from this phone number, or all calls from San Francisco, and on and on.

Joshua White: And so those would be the things in a subpoena. And if we got that, what we would say to the person that provided the subpoena, whether it was in civil litigation or, you know, a government agency, we would say: we're unable to respond to that subpoena, because we are unable to parse out whether any of our anonymized calls satisfies, uh, the description in the subpoena.

Joshua White: And then there’s process. You know, of course there are, there are processes, right? If someone comes to us and says, okay, well then give us all of your conversations. Um, you know, at that point you go to court, you, you file, you know, a motion, [00:49:00] uh, to quash the subpoena depending on where you are. And you go to the judge and you say, gimme a break.

Joshua White: Uh, you know this, we tried to respond to the subpoena, but we just don’t have the information. And we did this on purpose because it’s really important to us to protect the anonymity of our callers.

Joe Moore: Cool. Um. I’ve never had to write or respond to a subpoena.

Joshua White: Consider yourself lucky.

Joe Moore: Only been served papers once, which is nice.

Joe Moore: So, um, yeah, this is an interesting structure, and it's obviously an important exercise. Um, can you go through this, like, training purpose concept one more time, and then we can jump a little bit forward? I just wanna make sure I totally get it right.

Joshua White: So, meaning, how do we use calls for training purposes?

Joe Moore: Well, no, no. What is, like, when you were talking about, like, consent and [00:50:00] training purpose in, like, the same kind of structure. Like, what is that about?

Joshua White: Well, the notion is that, I think ethically, if someone is providing consent, the question is: what are they consenting to, right? And so if we said calls are used for training purposes, but what we were actually doing was selling the calls to Facebook, then we would be violating the terms of the consent, because someone would've consented to one thing and we would've done another thing.

Joshua White: Um, and so here, when we say calls are used for training purposes, what that means is, the only purposes for which we could use them are training. Now, there is a laundry list of training purposes that we use them for, right? To train supervisors, to train volunteers. Lucy is one of those training purposes.

Joshua White: So, just to give you a really, really specific example, um, you know, we haven't built [00:51:00] out this module yet, but, as I said earlier, new volunteers are gonna be doing their new volunteer training, and they'll spend time with Lucy, right? So, for example, there will be a 911 module, and there will be a 988 module.

Joshua White: But not just that, the person will be literally given feedback afterwards that says, like, you know: the caller, Lucy, told you multiple times that she wished her life would come to an end, or she wished the pain would all go away. And you, as the volunteer, didn't ask the question that we instruct our volunteers to ask, which is: I heard you say that you were thinking of taking your life.

Joshua White: Are you thinking of dying by suicide? And if the answer is yes, then the call is outside of our scope and we transfer them. So there's no question that, like, this will make volunteers far more prepared, uh, to do this work. And I think, when we're thinking about this, like, kind [00:52:00] of just candidly, it's a complex and nuanced ethics analysis.

Joshua White: If you have the ability to create a tool like this and you don't do it, that has an ethical consequence too, which is that our volunteers are less prepared to support people in high-stakes situations, just as we know that, within the psychedelic field, it is dangerous to not have sufficient experiential training.

Joshua White: Like that’s not abstract. If you talk to any professor training program, they are clawing to try to find experiential training. But it’s hard, uh, because psychedelics are still illegal in most places. And so there is a real, like, I think part of the ethical analysis, I would say is the cost of not doing it, and the implications for our volunteers, for our callers, for the community.

Joe Moore: Um, yeah, I think that calculus [00:53:00] is important to have there, right? It's kind of this, like, opportunity cost, like,

Joshua White: Right.

Joe Moore: And, um, yeah, just, I think a metric is how many people can be diverted appropriately from 911 and emergency rooms, for folks that, I dunno,

Joshua White: Totally, that's a great example.

Joshua White: If a volunteer is not experienced, and when they hear someone say, I'm panicking and I think I'm gonna die, let's just say that volunteer connects the person to 911. That can be seriously harming the person, right? The caller. But if the volunteer is able to say: I hear you. It sounds like you're in a lot of pain and you're in a lot of stress, and I'm gonna be here with you for as long as you need.

Joshua White: How does that sound? And if the volunteer stays grounded, that is a powerful, powerful moment [00:54:00] there that could have a significant implication on the person's life. And there's no doubt, in the same way, coming back to your avalanche thing: if you spend a bunch of time in simulation, like pilots do, it's not an abstract thing that well-done simulators produce better outcomes than no simulation experience at all.

Joe Moore: Right. And I keep thinking about, call it a surgical simulator. Of course I want the doctor to do 100 simulations perfectly before jumping into me. So there are plenty of things here around that, and I love it, and I applaud that. I don’t know what people are saying.

Joe Moore: I haven’t been following super carefully; you know, after so long, social media becomes a little bit tiresome.

Joshua White: Totally.

Joe Moore: But, uh,

Joshua White: Well, it’s interesting, because it all [00:55:00] depends. I could give you a snapshot, right? It depends on who you’re talking to and where you’re talking to them, right?

Joshua White: So, in conversations with training programs, universities, researchers, organizations that need to train lots of people, they say: this is incredible, it’s so needed. For many of our volunteers, the vast majority of our volunteers, this is amazing. On LinkedIn, we do posts about this, and when people are commenting from a LinkedIn profile, which they have to, the responses are very, very positive.

Joshua White: And this is where it gets so rich, right? When people go on a Meta platform such as IG, where inflammatory posts are rewarded and data is taken and repurposed, there are people who have criticisms, some of which I think are not connected in any way to what Lucy is, right?

Joshua White: Some people say it’s [00:56:00] messed up that you’re gonna replace humans with AI. Well, that’s not what this is. Or, “AI is gonna destroy the world.” That might be true, but Lucy, this minuscule little thing, is not really that. Or, “AI is bad for the environment.” Totally, AI is bad for the environment. But is this tiny little thing an engine that could meaningfully contribute to environmental destruction? Absolutely not.

Joshua White: Absolutely not. Um, and then there have been some, you know, people who have, you know, talked about the consent piece, but there’s no nuance that you can really have in that type of conversation. Um, and so I actually think that like in con in conversations like this one, in conversations with volunteers, you know, that it’s really, and staff.

Joshua White: I think our position is really, really strong, especially when you really get to hear: so why are you doing this again? And what’s the cost of not doing it? And what are the anonymization and obfuscation procedures, actually? But it’s [00:57:00] hard. It’s hard because simply hearing “AI chatbot,” a term that I don’t really use anymore, just precipitates this reaction that is totally justifiable but really has nothing to do with Lucy.

Joe Moore: Yeah. What are you hoping for beyond training Fireside volunteers with Lucy?

Joshua White: Beyond training Fireside volunteers?

Joe Moore: Yeah. Like, are there other kinds of partnerships you’re looking at?

Joshua White: Totally. Yeah. I mean, I think about when we talk about this kind of critical and dangerous bottleneck within the field.

Joshua White: Like, where is that bottleneck, right? So one example is universities. We talked about EP, which is training professionals. We’re in conversations with EP, and we would love for every single student to have access to Lucy. [00:58:00] We also hear from different training programs that say, hey, this would be a great way to integrate experiential training into our program.

Joshua White: Maybe even as a way of supplementing existing experiential opportunities. And, you know, the VA is gonna need to train tens of thousands of therapists if MDMA gets approved, which we hope it does. A lot of those therapists have no experience whatsoever with psychedelics.

Joshua White: Right? You wanna talk about a potentially dangerous situation: someone who has no experience with psychedelics, in an MDMA session, trying to support a veteran. So I think that would be a really exciting partnership. Really, just pick an organization in the field that provides training, and we’re open to talking to ’em.

Joshua White: We’ve also heard, I mean, I think it’s really interesting when you talk about first responders. [00:59:00] There are some great training programs out there, like the one MAPS has created to train first responders, emergency room doctors, and so forth. So if you’re gonna be interacting with someone on psychedelics, some training is required.

Joshua White: And our hope is that this can really fit into a lot of those.

Joe Moore: Yeah. I think the first responder thing’s really special and interesting because, you know, looking at the DARE training, the assumption is that if I’m high, I’m just gonna flip over cars or something, and police don’t necessarily know better, right?

Joe Moore: This is a cool way to kind of humanize it.

Joshua White: Yeah. And I think the beauty of Lucy is that it really can synergize with existing training programs. Like I said, MAPS has created a first responder training. You would know better than me, but I think it’s been implemented in Denver, or it’s on the way to being implemented.

Joshua White: And so this could be [01:00:00] part of that. Another example: OHSU is working on a first responder training as well.

Joe Moore: I heard they’re gonna leverage a lot of the existing Denver training, which is cool, to make that a little bit smoother of a process. So I love that. And I think OHSU did partner with MAPS on that.

Joshua White: Oh, got it. Yeah.

Joe Moore: yeah.

Joshua White: Why not play well together? I mean, that’s the thing, I think:

Joe Moore: right,

Joshua White: If you’ve already created substantive materials, great. Now you can just slide in a simulation tool, and everyone wins.

Joe Moore: Yeah, absolutely. So, what’s next beyond Lucy for Fireside?

Joshua White: That’s a great question.

Joshua White: Well, I mean, we’re coming up on our 1,000th coaching session, which is really exciting. And one of the areas that I’m most passionate about is: what does the continuum of care within the psychedelic field look like? In [01:01:00] other words, what types of people are providing support in a wraparound way around the psychedelic experience?

Joshua White: And I think we’ve had an overemphasis on therapy. Therapy is amazing for some people some of the time, but not everyone can afford therapy, and it may be an imperfect fit for some people. So we really feel like coaching should be part of that continuum of care, and our coaching program is one answer to that.

Joshua White: And then the other thing is just awareness of the support line, right? I really want people to think: okay, emergency medical is 911, suicidality is 988, psychedelics is Fireside. That awareness-building takes a lot of time, and I would love for us to get to the point where we’re kind of a household name, where schools know about us and so forth.

Joshua White: There’s no silver bullet for that, meaning there’s no one way to get the word out, but we’re [01:02:00] really trying.

Joe Moore: Yeah, love that. Any calls to action for folks out there before we wrap up here?

Joshua White: Yeah, I think the first call to action would be: download our free mobile app. It’s for iPhone and, on Google Play, for Android.

Joshua White: It’s really easy: when you open it up, it has two buttons, call and text. Second, if you’re interested in volunteering, as I mentioned, we do three to four trainings per year, and you can learn about volunteering by signing up on our website, firesideproject.org, or through our social media at Fireside Project.

Joshua White: And the final thing is just to tell your community about Fireside and invite people to put our number in their phones, so that it’s there when they need it.

Joe Moore: Brilliant. Joshua White, thank you so much. Thanks for being here, thanks for all the hard work, and I hope we get to do it again.

Joshua White: I hope [01:03:00] so too. Thanks for a really great conversation.

PT647 - Joshua White - Fireside Project

Joshua White

Joshua White is the founder of Fireside Project, a nonprofit dedicated to reducing the risks of psychedelic experiences through real-time support, coaching, education, and research. A former lawyer for the City of San Francisco, White brings decades of experience in emotional support work, harm reduction, and psychedelic integration, including frontline volunteering at festivals and crisis hotlines. He is focused on building ethical, scalable infrastructure for psychedelic care, with an emphasis on training, privacy, and public safety.