46. Research Translation: Dr Jaelea Skehan on why proving something works is just the start


Subscribe to the Amplifying Research podcast:
Apple Podcasts | Spotify | Overcast | Pocket Casts


Episode show notes

You've done the research. You've run the trial. You've published the paper. So why isn't anything changing? Dr Jaelea Skehan has spent more than 25 years translating research into real-world programs in mental health and suicide prevention — and she's seen firsthand why so many evidence-based innovations never make it past the journal. In this episode, she makes a compelling case that proving something works is just the beginning, and shares hard-won lessons from programs spanning 18 months to 25 years on what it actually takes to get research into practice — and keep it there.

Jaelea is the Director of Everymind and was awarded an Order of Australia medal for her work in community mental health. A psychologist, researcher, and policy advisor, she leads a multidisciplinary team that does what she calls "priority-led research" — designing programs not for journals, but for the systems and people they need to serve. Her PhD focused on what actually works when trying to change practice in sectors outside of health, built on Everymind's decades of implementation experience.

What makes Jaelea's perspective distinctive is that she lives at the intersection of research, practice, and community — and she's unflinching about what she's seen from that vantage point. She argues that the system incentivises proving things work in controlled settings while neglecting the messy, relational work of getting them into practice. And she backs it up with detailed case studies from programs her team has built, implemented, evaluated, and adapted over decades.

"All of the work and all of the effort you put into designing a program and proving that it works or that it's got some good outcomes… It is not the end of the journey. If anything, it's a ticket to the starting line." — Dr Jaelea Skehan

This episode is essential listening for anyone who cares about whether research actually reaches the people it's meant to help — whether you're designing interventions, funding them, evaluating them, or trying to get them implemented. If you've ever felt frustrated by the gap between evidence and practice, Jaelea offers both a diagnosis and a way forward.


Our conversation covers:

  • Why the research-practice gap in mental health and suicide prevention is a matter of life and death — and what it's doing to public trust in research

  • The voltage drop: why interventions that work in controlled trials lose effectiveness in the real world

  • Why proving something works is "a ticket to the starting line, not the end of the journey"

  • Designing for the implementation environment, not just the innovation itself

  • The Mindframe program: 25 years of lessons on changing media reporting of suicide across an entire sector

  • Why resources and guidelines don't change practice on their own — and what does

  • Designing for 80% alignment rather than word-perfect evidence translation

  • Co-producing with lived experience advisors and disseminating findings to the people who need them before publishing in journals

  • Priority-led research vs investigator-led research: how Everymind decides where to put its effort

  • How to think about evaluation and evidence-building when your funding comes in two-year cycles

  • What senior researchers and funders can do to set up the next generation to work differently


Practical tips:

Design for the implementation environment, not just the innovation:

  • Before you invest years in proving something works in a controlled setting, think about the world it needs to function in. Who's the workforce that needs to implement it? What are the real-world pressures and constraints of that environment? Who are the end recipients of the change — and are they the same people as the implementers?

  • "The thing we always do is design the innovation, the intervention, whatever it is, thinking about the environment in which it needs to be implemented, the workforce who needs to implement it, and the end recipients of the change if it happens, and sometimes they're not the same as the workforce." — Jaelea Skehan

Invite practitioners and end users in now — not when you're finished:

  • If you're designing research by yourself, talking only to other researchers, you're probably not thinking broadly enough. Talk to the people who will need to implement your work — service providers, practitioners, community members — as early as possible. If you're already mid-project, it's not too late.

  • "If you're just talking to yourselves, if you're designing your research by yourself, you're kind of doing it wrong. So invite in the people who you want to be the end users of your program now." — Jaelea Skehan

  • "Or if you're in the middle, invite them in now. It's not too late to pivot." — Jaelea Skehan

Treat your evidence as a ticket to the starting line:

  • All the effort you put into proving something works — the trials, the publications, the economic evaluations — gets you to the point where implementation can begin. It's not the end of the journey. If you haven't thought about how your innovation gets into practice, the evidence alone won't get it there.

  • "All of the work and all of the effort you put into designing a program and proving that it works or that it's got some good outcomes, it's a ticket to the starting line. It is not the end of the journey." — Jaelea Skehan

Design for 80% alignment, not word-perfect:

  • If you're developing resources, guidelines, or frameworks for a sector outside your own, design them with and for the people who'll actually use them.

  • "Use the language that works for them, and if it's 80% aligned to the evidence and they accept it, it's better than something that's 100% aligned to the evidence that they can't stand." — Jaelea Skehan

Understand the implementation environment — and find the lever:

  • Evidence alone doesn't get a program implemented. You also need to understand the policy landscape, where the funding sits, and who the potential implementers are. Jaelea shares the example of two programs — the one with stronger evidence sits on a shelf, while the one with less robust evidence is being implemented across four states, because the team identified a government lever they could push on.

  • "At some point, you need to figure out what's the implementation environment, and also where is the potential funders of the program at?" — Jaelea Skehan

Keep building evidence while you implement:

  • Evaluate as you go, learn from what the data is telling you, and be willing to change your approach when it's not working.

  • "The research doesn't stop when you start implementing it. The real research starts." — Jaelea Skehan

  • "It's okay if what it's telling you is your current approach is a bit wrong. 'Cause actually, we make mistakes all the time. The mistake would be to keep persisting with something that's not working and the lost opportunity to really make a real-world difference." — Jaelea Skehan

Ask: "What did we learn, and what have we done about it?"

  • This is the question Jaelea puts to her teams at Everymind. When evaluation or research tells you something — even something small, like one guideline landing lower than the others — the response shouldn't just be a paper. It should be a practice change.

  • "Let's not just write that up in a paper and say, 'Isn't that interesting?' Because it is interesting. What are we going to do different?" — Jaelea Skehan

Don't design your evaluation plan to match the funding cycle:

  • If you suspect your program will continue beyond the current grant, design your evaluation and research plan on a five-year timeframe, even if you're only funded for two or three years. You can still deliver within the current cycle, but you'll be collecting the right data to tell the full story.

  • "A mistake we made in the early years was designing our evaluation and research plans to match the funding cycle." — Jaelea Skehan

Disseminate to the people who need the findings first:

  • Consider sharing your findings with participants, practitioners, and policymakers before the academic publication is ready. Summary reports, presentations, and direct feedback to participants can get evidence into the hands of people who can act on it now — the journal article can follow.

  • "We are much more likely to disseminate the findings to the people we think need them before we disseminate them to the academic community. 'Cause let's be honest, often research journals are being accessed by other people in the academic community, and they're not being accessed by end users in the community." — Jaelea Skehan

Think broadly about what counts as evidence:

  • The traditional hierarchy of evidence isn't always the right framework, especially for population-level or community-level interventions. Lived experience evidence, practice evidence, and community evidence all have a role — and the "gold standard" of research evidence has blind spots.

  • "Are we talking about just research evidence? Are we talking about lived experience evidence? Are we talking about practice and community evidence? And are we talking about just the intervention or the environments in which we need to implement them?" — Jaelea Skehan

For senior researchers — set up the next generation to work differently:

  • If you're a senior researcher with early and mid-career researchers in your team, think about whether you're giving them the opportunities and permission to take a more relational, implementation-focused approach to their research.

  • "If you're listening and you are a senior researcher who has carriage of our next generation of researchers, then I think the imperative is to give them opportunities to work a bit differently." — Jaelea Skehan


Credits:

  • Host & Producer: Chris Pahlow

  • Edited by: Laura Carolina Corrigan

  • Music by: La Boucle and Blue Steel, courtesy of Epidemic Sound


Chris Pahlow
Chris Pahlow is an independent writer/director currently in post-production on his debut feature film PLAY IT SAFE. Chris has been fascinated with storytelling since he first earned his pen licence and he’s spent the last ten years bringing stories to life through music videos, documentaries, and short films.
http://www.chrispahlow.com