Teaching Evidence-Based Management

Teaching CATs

Season 1 Episode 6

In teaching students how to gather and appraise evidence from the scientific literature, it's important to guide them through the process step by step. That process, commonly referred to as a Critically Appraised Topic (or CAT), is a difficult one to do, let alone to teach.

Eric Barends, Managing Director of the Center for Evidence-Based Management, takes us through the process of teaching students how to run a CAT, drawing on many years' experience and his own learning of what helps. Acknowledging right up front that "it's hard" and that students (along with everyone else) will make mistakes, Eric stresses the importance of adopting a learning approach, where people learn from their mistakes, rather than a performance approach where the emphasis is on producing a perfect CAT.

We also hear from other teachers who share their insights on the process, and how they've helped students navigate this tricky territory, while acquiring a highly important, valuable skill. This is a valuable episode if you're looking for insights, tips and options for how to help your students with their CATs.

Host:

Karen Plum

Guests:

  • Eric Barends - Managing Director, Center for Evidence-Based Management
  • Tatiana Andreeva – Associate Professor at Maynooth University School of Business
  • James O’Brien - Associate Professor at Sobey School of Business at Saint Mary's University
  • Xander Lub - Professor of Organizations in Digital Transformation at Utrecht University of Applied Sciences in the Netherlands
  • Denise Rousseau - H J Heinz University Professor, Carnegie Mellon University, Pennsylvania, USA


Contact:

Eric Barends, Managing Director of the Center for Evidence-Based Management



Karen Plum:

Hello and welcome to the Teaching Evidence-Based Management podcast. I'm Karen Plum, a student of Evidence-Based Management, and through this podcast I've been exploring how teachers approach the subject and how they adapt their teaching to students at different stages of their management journey. For some students, their courses will expose them to certain elements of the evidence-based management approach, leaving them with skills and experience that's useful in life as well as in their future careers in management.

Karen Plum:

Those who are already working as managers may be exposed to more of the full approach, looking at the collection and appraisal of four different types of evidence that are relevant in management decision-making. The full process - what Eric Barends, Managing Director of the Center for Evidence-Based Management calls the nuclear option - isn't always needed or relevant for every type of decision that managers will make in their busy careers.

Karen Plum:

That said, one aspect that seems to me to be a fundamental skill is the ability to identify, gather and assess scientific findings from research on the topic that we're considering as part of our decision making. The shorthand description for this process is the CAT. Not your furry feline friend, but a critically appraised topic. I've heard this abbreviation used by lots of teachers and it's become clear that teaching students how to carry out a CAT has a number of challenges, as indeed does the process of conducting a CAT.

Karen Plum:

The CAT isn't a particularly easy or straightforward activity, but it's a really important skill for students and practitioners to acquire so they can confidently search for evidence themselves after their training. So this episode is going to focus on how to teach cats!

Karen Plum:

Let's get going. Firstly, let me say that I appreciate the irony of my last statement. Our feline friends are notoriously difficult to teach and there can be a lot of frustration. You can end up feeling that, in fact, the cat has done a better job of teaching you. Teachers find it tricky to guide students through what can be a grinding process where they're almost certain to make mistakes and feel like they've failed in the task. It's easy to say that we learn from our mistakes, but when you're making them it is painful. Our guide for most of this episode is Eric Barends, who I mentioned earlier. He'll be sharing his approach and his insights for helping students navigate the territory. And on the subject of the upcoming pain in learning how to run a CAT, it's wise to warn students beforehand about what they're in for.

Eric Barends:

It's hard. It's hard for students to do a CAT and it takes considerable time. And I always tell my students: guys, after this course you will probably hate me, because this is not funny and entertaining, but it's very useful and it will help you in your career as a manager. But it's hard, it's a grind, and it helps if you warn them in advance.

Karen Plum:

So, just for the avoidance of doubt, what is a CAT and why should students care? Well, assuming that we agree that we should make use of scientific evidence in our management decision making, we need a way to identify the research that's going to be most relevant and trustworthy, delivered through a systematic methodology, so that we can confidently rely on the findings. If we're investigating for ourselves, a CAT is normally the approach we'd take. If we're investigating for our company and the results are really critical to what we're planning to do and how much money we're planning to spend, then it would probably be worth investing in a rapid evidence assessment. If we're an academic, then we would conduct a systematic review of the available scientific literature relating to the topic we're studying. All three approaches follow the same basic steps, but the requirements for someone publishing an academic study are more demanding than for an individual who wants to research something that'll have a more limited reach, impact and risk.

Karen Plum:

If you want to understand more about rapid evidence assessments or systematic reviews, there are resources on the CEBMa website. Just go to cebma.org.

Karen Plum:

In terms of teaching CATs, there are a number of things that Eric recommends, and we're going to hear more from him throughout the episode, as well as some thoughts from other teachers who took part in a discussion a few years ago. So let's firstly think about the skills that students need to undertake a CAT. First, there's the need to formulate the research question.

Eric Barends:

It's also important that students have some skills to do these CATs. First of all, they need to be able to formulate a focused or answerable question. So questions like - what are actually the elements of successful large-scale change interventions? And then you go, I don't know, if you find the answer, you will probably get the Nobel Prize for evidence-based management, because that's an unanswerable question. It's way too broad. Or sometimes they have the question way too narrow, like - what is for nurses with a background in blah-de-blah in a hospital, blah-de-blah on the circumstances, da-de-da. Actually, that is okay if that is your population and your organizational context, but that is not a CAT question that you would use to find research.

Karen Plum:

Next, students need to know how to search research databases to find the research that relates to the question they're trying to answer. Eric stresses that this isn't easy, so dedicating some time in class to cover this is really important, but if there isn't enough time, the online modules of the CEBMa course allow students to work through this process at their own pace. They need to know how to critically appraise the research that their search has identified and to understand the importance of effect sizes.

Eric Barends:

The other thing they need to know is how to critically appraise the research they find. Obviously, when they find 10 studies, they need to be able to figure out whether it's any good. So it is also important to figure out what research findings are relevant for practice and which research findings are very nice from an academic point of view but completely irrelevant for practice. Therefore, they need to know about effect sizes, because a lot of studies say, oh, we found a significant effect of A on B and the students say, oh, significant, that sounds like it's you know, very substantial, very important. So they just write this down and you say no, no, no, have a look at the study.

Eric Barends:

What was actually the effect they found? And when you look at the effect size and it's an association of 0.1, you say, yeah, from an academic point of view maybe interesting, but hey, that's not something you would invest, put your money on because it hardly moves the needle. So that part is also very, very important to figure out. Yes, this will help me in practice, it's relevant, it has a practical impact. And yes or no, this is, you know, interesting finding, very frequently cited paper in a top journal. But you know, for us practitioners we can't really apply this.

Karen Plum:

When students start out on the searching process, they need to have identified a research question that they're trying to address so they can search for relevant research. There are different ways to do this. Some teachers invite students to come up with their own question, particularly executive students who may be facing specific challenges at work that they want to look at. This can be an effective learning approach, but if the question is too broad or there just isn't any research on the topic, then they don't get very far. Eric's suggestion is to take a stepwise approach and to help build confidence as well as learning the process. Here's what he suggests.

Eric Barends:

So first we demonstrate. Here is this McKinsey paper. This is the claim. Guys, go to Google Scholar. You have 10 minutes. Discuss with each other what you find, search for meta-analysis, look at the meta-analysis, look at the effect sizes, report back. Do you support this claim? Yes or no? Nine out of 10 times they can easily find it and they all come back like well, not likely.

Eric Barends:

Next step is often we provide them with a list of topics, and some of us refer to this as doing a mini-CAT. This is a topic, go to Google Scholar, see what meta-analyses are there on this topic. Do the appraisal, look at the findings, see whether they're relevant. Have a look at the effect sizes. That's actually a quick and dirty mini-CAT.

Eric Barends:

When you do that, a next step can be okay, you searched in Google Scholar, there's probably way more research. Now do the same thing and search in a broader way in a research database like ABI/INFORM or PsycINFO. Are there any contextual factors or mediators we should take into account?

Eric Barends:

It's easy to find a meta-analysis, for instance, on I don't know, work recognition or meaningful work or performance feedback, whether or not it has an impact on performance. You can do that in a mini-CAT through Google Scholar. But then you push them a little bit further and say okay, look in ABI/INFORM, do a broader search and try to figure out if there are any mediators or moderators that you should take into account, or maybe there's some new interesting primary research. That, by the way, is often the case.

Eric Barends:

Then you let them do a real CAT on the topic that they pick and finally we also have a CAT exam. That's what I often do. You give them a topic or you give them a claim and, under pressure, within two hours they need to figure out, okay, do a search and what's out there, et cetera.

Karen Plum:

If you're looking for some good examples of CAT topics and you're a member of the CEBMa Teachers Network, take a look in the members area of the CEBMa website.

Karen Plum:

To add to Eric's stepwise approach, here's Tatiana Andreeva, Associate Professor at Maynooth University School of Business in Ireland.

Tatiana Andreeva:

So, we had a case study - we took, you know, Harvard Business Review short ones that are available in the journal, so they're available to everybody, we can get them, we don't have to buy them. So we put students in groups and let them work based on the specific case. So in a way it's kind of giving them the topic, right, because the case has a problem.

Tatiana Andreeva:

So we split it in steps. First we diagnose a problem and therefore come up with a research question. They do it at home, bring it to class, do it in groups, we discuss in class, refine the research questions. Then they search for evidence again at home, bring it to class, discuss in class and do the next thing. And then they did the final individual assignments that were exactly the same, just using a different case. And I found that the most challenging bit for them was the research question. So it was the same case, so roughly the same problem, given to the whole class, and every group came up not only with a different diagnosis but with totally different research questions, and some of the research questions were totally unsearchable - you wouldn't be able to find any meaningful answer, it doesn't matter how well you know the search techniques.

Tatiana Andreeva:

And we had a lot of discussions with the students, because I think it was difficult for them to grasp what was wrong with their questions. So not only did they find it difficult to formulate the question, I found it more difficult than anything else to explain to them what was wrong with it. We could see the bad examples in the individual assignments that they did at the end of the course. So I think it's kind of the same idea as what Eric was saying with the CAT-based exams. We didn't use the word CAT, but I think the idea is similar. There was a huge variation, and I think it probably depended on how heavily students were involved in the group work.

Karen Plum:

Clearly, there are many ways that students can be tripped up as they work through the process. However, although this can be painful, teachers seem to agree that students probably learn a huge amount by getting things wrong along the way. Perhaps they'll remember those errors more than if they'd gotten things right the first time. When students progress from a mini-CAT to doing a full CAT, Eric feels it's important to build in some safeguards along the way, so that the exercise continues to be a positive learning experience.

Eric Barends:

You as a teacher should first approve the topic. Don't just let them have a go at it. Make sure that we all agree: okay, that's your topic. And make sure that you maybe, as a teacher, do a quick and dirty search to see if there are any relevant studies out there. Sometimes - I mean, this is really, really fun - the students can bring in their own topics. Again, there's a trade-off with conducting a full CAT in a successful way. You will more likely be successful or find relevant studies when the topic is kind of mainstream, like, you know, psychological safety, information sharing, all the typical HR topics. But sometimes it is really fun when students bring in their own topics.

Eric Barends:

To give you an example, last time I had an executive student say well, I have this HR magazine. It turns out that the new craze is bringing your pets to the workplace. Bringing pets to the workplace has a positive impact on work stress. That's the story. So I want to know what is known from the research literature about bringing your pets to the workplace, does it have an impact on work well-being, work stress or whatsoever? And the first time you hear this topic you go like, I don't know if there's any research. You will be surprised. I mean yes, there is research on bringing pets to the workplace.

Karen Plum:

Naturally, it can be fun for students to bring a topic of their own, because they may be more interested in the results, or the results may be more relevant for them. Eric suggests doing a quick and dirty search in Google Scholar first, to see if anything comes up, because that'll encourage everyone that spending time on a full CAT should pay off. Once that's done, they'll move on to the search strategy.

Eric Barends:

Next step is let them do the search strategy and stop there. And then, as a group or in class, people present okay, guys you're going to present your draft CAT, you're going to present your search strategy, and then you have a look at it and you say listen, you made a mistake here. You used 'and' instead of 'or', therefore, you ended up with only three papers. Do it again, because you will find out there are way more papers. Or you use this term 'fair process', think about other terms and think about 'procedural justice'. That's another academic term that's probably relevant for your CAT question. So you help them a little bit, or students help each other.

Karen Plum:

The important thing here is that the CAT is done in stages, to catch mistakes or refine search terms before the students get all the way to the end and only then realise what's wrong with their earlier work.

Karen Plum:

It may be that the question is just not answerable in the way they've framed it, or that the terms they've used aren't ones that academics use in their research. So taking a step-by-step approach leads to a better outcome because the teacher is able to work alongside the students as they go through the process.

Karen Plum:

Another approach that Eric and other teachers find helpful is to have students work in groups on their CATs, and the groups can help each other to see where they've made mistakes, perhaps in the way they've gone about their searching. A common mistake that probably everyone makes when searching in research databases is not clearing a previous search before combining new search commands. Showing an example of this - and Eric happens to have one, created by a fellow teacher - is a powerful way of demonstrating how easy it is to fall into this trap, even when you're experienced.

Eric Barends:

She did a search and she made a mistake, and we use this for the students - said listen, it's not that easy. Even when you're very, very experienced, you will make mistakes. And so we show this and say: what happened here? Because here she has 14 results, and when she searches in the title for performance and in the title for feedback she has 633. Then she did something incomprehensible. Here she had zero and then suddenly she had 825.

Eric Barends:

Almost all students will make the same mistake: before you combine searches, you need to clear your search, otherwise your previous search will still be here. So clear your search and then do the next search. It is often about those very simple, obvious mistakes that we all make. So usually I celebrate mistakes. So they show their presentation and I go like, yeah, guys, you made a brilliant mistake. And they, you know, are a little bit confused. Say no, this is good guys, remember we're here to learn. So question to the other students - what went wrong here? This is a very common mistake.

Karen Plum:

Another common mistake in the creation of search commands is the use of the words 'and' and 'or' - perhaps using 'and' where you should use 'or', or vice versa. It sounds like it couldn't make as big an impact as it does, but believe me, it really does, and Eric says he falls foul of this on a daily basis, so that's comforting in a way, I suppose! Ask a librarian, and they'll probably say they do the same thing, so I guess what's more important is that we recognise the signs that we've used the wrong words in our search when something completely unexpected happens.
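To see why swapping those two little words matters so much, here's a minimal sketch in plain Python. It is not a real database API - the papers and keywords are invented for illustration - but it shows how 'and' (every term must match) and 'or' (any term may match) produce very different result sets:

```python
# Illustrative sketch, not a real research-database API.
# Each "paper" is represented simply by the set of keywords it is indexed under.
papers = {
    "Paper A": {"performance", "feedback"},
    "Paper B": {"performance"},
    "Paper C": {"feedback"},
    "Paper D": {"motivation"},
}

def search(terms, mode):
    """Return titles matching ALL terms (mode='AND') or ANY term (mode='OR')."""
    if mode == "AND":
        # AND: the paper's keywords must include every search term.
        return sorted(t for t, kw in papers.items() if set(terms) <= kw)
    # OR: any overlap between search terms and keywords is enough.
    return sorted(t for t, kw in papers.items() if set(terms) & kw)

print(search(["performance", "feedback"], "AND"))  # ['Paper A'] - both terms required
print(search(["performance", "feedback"], "OR"))   # ['Paper A', 'Paper B', 'Paper C']
```

The same pair of terms returns one paper with 'and' and three with 'or' - exactly the kind of sudden jump or collapse in result counts that signals a Boolean mistake in a real database search.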

Karen Plum:

Before we move on to some other ways that teachers can guide students, it's important to acknowledge that the way you teach will, of course, vary depending on how long you have the students for. Is it for a semester or for a much longer period? Where you have longer, you have the opportunity to 'prime the pump', as Denise Rousseau often says. Here's James O'Brien, Associate Professor at the Sobey School of Business at Saint Mary's University in Halifax, Nova Scotia, in Canada.

James O'Brien:

I'm fascinated by the skills piece right, and we talk about the skills that you need to sort of hone and sharpen before you get to the CAT, and I think that if you think of this programmatically and you think about how your course fits into a program of courses, you can sharpen some of those skills within your course by doing sort of activities that are designed to boost CAT readiness in the students.

James O'Brien:

So when the time comes - and maybe it'll be one of your colleagues who profits from these investments that you make in asking good questions and writing good search strings and assessing methodological quality. And that's more of a bite-sized piece that I think is easier for the students to take on, especially at an early stage in their experience where doing a full-blown product like this might seem quite daunting. So I would say I'm very much in favour of skills, breaking it up and taking the broad view and seeing this as something that they get to over a long run rather than within a specific course.

Karen Plum:

But even if you only have students studying evidence-based management as an elective, so for just a few weeks, then there's a lot to cover in a short space. That said, the CAT process provides a very practical skill which can stand them in good stead in many walks of life. Here's Eric.

Eric Barends:

We noticed that students like it. They learn a skill, they learn a tool and they learn how to create this CAT product, and a lot of them find it's very useful. Keep it simple, don't make it too complicated with, okay, also looking at psychometric qualities and stuff like that, because the biggest bang for them is already, you know, knowing how to search in a quick and dirty way, focusing on meta-analyses, figuring out whether the effect sizes matter, et cetera. That really empowers them as a manager.

Karen Plum:

This is clearly a powerful skill to acquire, along with many other skills that are part of taking an evidence-based management approach. I'm now going to move on to some other ways to guide students while teaching the approach, some of which we've already started to touch on. The first is the quick and dirty search, which you've already heard Eric talk about. Many people feel that taking an evidence-based approach takes way too long, and so they shy away from the process. For sure, organizations are often keen to make quick decisions rather than making the right decision, but even when people are under pressure, they can do a quick and dirty search, even when they're sitting in a meeting, and by doing so, could change the course of the discussion.

Karen Plum:

Eric gives an example based on a McKinsey report on diversity.

Eric Barends:

In most cases in your daily practice as a manager, you won't do CATs. You will probably, as I do, use Google Scholar to do a quick search when a claim is being made during a meeting or whatsoever, and see whether there are any meta-analyses on this topic. That's why we have the McKinsey case. It's a McKinsey report on diversity and we create this little situation in class where we say, okay, guys, imagine this situation. Here's your HR director in your board, and the HR director tells you: hey, here is the McKinsey report making this claim that we should invest in diversity because it will boost our financial performance by 15%.

Eric Barends:

In such a situation, you would typically go to Google Scholar, you know, on the spot, search for diversity, performance, meta-analysis and have a look at what's out there. And if you find six meta-analyses, all with teeny-weeny effect sizes or correlations, you already know that this McKinsey report is probably not very good and has some serious issues. That's what you typically do.

Eric Barends:

However, when it's an important decision to make and you are involved in a project or whatsoever, then you conduct a CAT. To be able to do number one, the quick and dirty search in Google Scholar, it helps if you have ever conducted a CAT and know how to do that. So you will be better at doing a quick and dirty search in Google Scholar when you have experience with conducting CATs.

Karen Plum:

Of course, how you share the news that the McKinsey report may have serious issues is another matter, but at the very least you know that you need to take a fine-tooth comb to this report, and naturally that's what evidence-based management teaches - ways to find out if the reports you read and the research they quote are robust and reliable. You don't rely on the fact that people might tend to trust McKinsey because they have a solid reputation or whatever. Other ways to engage students are about keeping things simple and making them fun. Doing a CAT for the first time is hard. Helping students become a bit more evidence-based is a good outcome. We aren't trying to turn practitioners into academics, and it's not vital that they comb through every aspect of the research methodology that was used in the studies that they find, but it is important to assess how appropriate the findings are to the particular situation the student's considering.

Eric Barends:

Do you have to get into the psychometric qualities of the questionnaires used, the sample size, etc. No, not really. You can stop there and say, listen, there is research that's actually not that good, or focus on - this is interesting, we found only two studies, meaning this is not a very well established topic, or the other way around. Oh my god, we found three meta analyses. There are probably hundreds of studies on this topic. This is a very established topic.

Eric Barends:

Another one - focus on effect sizes. It's a little bit scary at the beginning because they look at all these d's and g's and stuff like that, but it's actually not that difficult, specifically if you say forget all the statistics, see if there are any effect sizes there and focus on whether that is something that is, you know, impactful, yes or no, and sometimes it stops. If you want to know if taking pets to the workplace has a positive impact on well-being, and you find a meta-analysis based on 20 cross-sectional studies that have a pooled effect size of 0.05, you can say, yeah, it's probably not much there and that's helpful. That's what you can do very fast.
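The rule-of-thumb judgement Eric describes - forget the statistics, just ask whether the effect is big enough to matter in practice - can be sketched in a few lines. This is an illustrative helper, not anything from CEBMa's materials; the thresholds are Cohen's conventional benchmarks for a correlation-type effect size r, which are rough guides rather than hard cutoffs:

```python
# A minimal sketch of the "does this move the needle?" check.
# Thresholds follow Cohen's conventional benchmarks for a correlation
# coefficient r (rules of thumb only, not hard cutoffs).
def practical_impact(r):
    """Label the practical magnitude of a correlation-type effect size."""
    r = abs(r)  # direction doesn't matter for magnitude
    if r < 0.1:
        return "negligible - hardly moves the needle"
    if r < 0.3:
        return "small"
    if r < 0.5:
        return "medium"
    return "large"

# A statistically "significant" finding can still be negligible in practice,
# like the pooled effect size of 0.05 in Eric's pets-at-work example:
print(practical_impact(0.05))  # negligible - hardly moves the needle
print(practical_impact(0.35))  # medium
```

The point of the sketch is the separation of concerns: a p-value tells you whether an effect is likely real, while the effect size tells you whether it is worth a manager's money and attention.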

Karen Plum:

What practitioners are really trying to decide is, can this research and the intervention it describes, help in their organization or in their work as a manager? And in the time teachers often have with these students, focusing them on practical implications and how to judge whether those are worth their time, gives them a valuable skill.

Karen Plum:

And what about fun? Well, Eric suggests poking a little gentle fun at the nature of academic papers to help students to understand what they are and how they're intended to be used, but also to be able to laugh at things that go wrong.

Eric Barends:

Make it fun. You know, have a laugh about things that go wrong. I always explain to them at the start: guys, we're going to have a look at research findings, and these academics have the habit of writing horrible papers. They're unreadable and, as Rob always says, academics don't write a paper for you to read. It's to look things up. You know, don't read a paper from beginning to end. That's not how it works. You look at the abstract and then you have a look at how they actually researched it, what kind of variables. You know, that's how you do it. So we're trying to make it a little bit more fun.

Karen Plum:

And, of course, the Rob that Eric refers to is Professor Rob Briner, another long time teacher of evidence-based management.

Karen Plum:

The final insight is about the importance of taking a learning approach as opposed to a performance approach when teaching this subject.

Eric Barends:

There is tension between learning and performing. Students, specifically when you have a capstone or a project or you have real-life clients, they really want to do a CAT with a finding that's actually relevant and can be used in practice. However, we are here to learn. That means that you may conduct a CAT and you will encounter some difficulties and you struggle, et cetera. That's okay. You're here to learn, so your end product is maybe not perfect because you did not reach the end all the way, or there were some studies not available or whatsoever.

Eric Barends:

So we always try to differentiate and Denise always points this out. There's a difference between performance goals and learning goals. That's also when you teach this. So learning how to do a CAT is not the same as producing a perfect CAT. But we know students are really, you know, ambitious and they want to know that their CAT really helps the client and yields nice insights. We want them to do a CAT on a topic that guarantees learning. Sometimes they come with brilliant topics. It's like oh, that's very interesting. I'm not sure if there's any research there.

Eric Barends:

There will be students and teams that will crash and burn, as we call it. They will end up with a CAT that sort of explodes because there are way too many studies, or there are no studies, or it's useless, or the studies that are there are actually not applicable. So take that into account, that this will happen. That's okay. We are here to learn, so it's nice to have this as an example.

Eric Barends:

Say, okay, we have Jojo here. Jojo had this question and hey, look at this, this can happen: there's nothing there, or all the research is crap. That's very relevant - we can say there is no research, and that means there's no relevant evidence from research, and that's very important to know. That means we have to rely on other sources. But for you, of course, that's not really helpful, because you want to know how to do a CAT but you can't even do the critical appraisal because there are no papers. So that's a pain. So you need to come up with an alternative assignment or whatsoever and make sure it does not affect their grades.

Karen Plum:

To finish off, I wanted to share some of the big learning opportunities that have emerged through the discussions of teaching the CAT process. First of all, James shares the power of finding the right search terms to inform the research question.

James O'Brien:

I was just going to say, I have an exercise in which I write a search string and one of the key words is 'hiring'. And I show the students how executing that search takes you to a certain set of sources, and then you replace 'hiring' with 'personnel selection' and you have a completely different direction, to a different set of sources. So the learning that comes out of that is to visit the reference librarian at the desk and say: I'm a manager, I'm interested in how to hire. What do those psychologists, applied psychologists, what do they call that thing? What is the name that they give it? And then to adjust your string accordingly. And it's a very convincing demonstration to show how it takes you in a completely different direction.

Karen Plum:

Everyone seems to agree that it can be really difficult for non-academics to even hazard a guess at the terms that academics might use in their research and, of course, if you don't find the right terms, you could miss a ton of relevant and useful studies.

Karen Plum:

Next, to share the power of learning from failure and the lemon award, here is Xander Lub, Professor of Organizations in Digital Transformation at Utrecht University of Applied Sciences in the Netherlands.

Xander Lub:

So the lemon award is actually something that comes from a colleague of mine in Breda who teaches design thinking.

Xander Lub:

So basically he does assignments that take eight weeks, and people get an award for the best product that they develop, which could be a service or something, and they work on that and make it creative and make it user-centered. But then he also gives out a lemon award for the worst or the biggest failure that actually taught the people that were making the error something - actually, that was a real learning effect that was probably stronger than getting the thing right. So he gives out lemon awards - they're actually lemons on a stick on a base - and he gives that as an award to the team that did the worst ****** and learned the most from it. And that's basically because design is also about failing to learn. I think a lot of us could do with that, because I think all of us are always so trained to get it right all the time, whereas actually a lot of the learning happens from failing and discovering what you need to do differently.

Karen Plum:

I love that - embracing and celebrating the learning wherever it comes from. I'm going to finish with a provocative question posed by Denise Rousseau: is there a downside to having students do CATs?

Denise Rousseau:

If I could add one last thing - is there a downside to having students do CATs? And you know, a good lawyer wouldn't ask a question unless they knew the answer.

Denise Rousseau:

I think I have an answer, which is: a lot of upside, but there is a downside. Number one, they're going to search things and there isn't going to be any literature, and they're going to go, oh damn, this scientific stuff just isn't related to what I do. We have to kind of prepare them, I think, for the fact that, first, evidence isn't answers, science isn't answers, but also that the questions that drive scholars and the interests of practitioners aren't aligned. So knowing that what you're going to do is be derivative - take what you can from this literature, but you always need all the other sources too - that, I think, is an important point to prep them for, and then let them practice with after their CATs are in, in terms of, well, what other sources of evidence would you bring to bear on this? And that really is important. I think the aggregation of evidence is perhaps the most overlooked and hardest part of all of this. So it's one of those to-be-continued-together issues for us, I think.

Karen Plum:

Students, particularly existing practitioners, may indeed get frustrated that the evidence they need doesn't yet exist, or that its quality isn't sufficiently robust or conclusive to be reliable in their practice. But it's also clear from the experience of researchers that they need more input and organizational data from practitioners to help their research efforts. So there's a two-way street here. At least by raising awareness of the need for more organizational data and access to practitioners for research, the practitioners themselves might be more inclined to get involved next time they're asked.

Karen Plum:

That's it for this episode. I'd like to thank all of the contributors - Eric, Tatiana, James, Xander and Denise. If you'd like more information about teaching CATs, there's a guide available on the CEBMa website. Just go to www.cebma.org and head to the resources page, where there are links to CEBMa's guides on CATs and REAs, the Rapid Evidence Assessments.

Karen Plum:

We also talk about the CAT process in the Evidence-Based Management podcast. You'll find it in the episode that accompanies Module 5, which is all about acquiring evidence from the scientific literature. There are links to all these resources in our show notes.

Karen Plum:

If you're a member of the CEBMa Teachers Network, log into the members area of the CEBMa website and you'll find lots of resources, including guidelines, examples, instructions, exercises and a list of common errors. And if you're not a member of the network and you'd like to join, please get in touch. There's a contact us page on the website and Managing Director Eric Barends will be delighted to hear from you. Thanks for listening to this episode. See you next time. Goodbye.