
Bright Spots: The Podcast

Episode 1, Part 1: Aly Nestler and Improvement Science

Table of Contents

  • [00:00 - 03:47] (3:47) Introductions and podcast experiences

  • [03:47 - 08:56] (5:09) How Aly became a coach for improvement science (IS)

  • [08:56 - 10:15] (1:19) How does IS compare to Continuous Improvement Plans (CIPs)?

  • [10:15 - 11:30] (1:15) An overview of a typical CIP process

  • [11:30 - 16:30] (5:00) The WREN’s “human-centered” approach to IS

  • [16:31 - 17:46] (1:15) Cameron reflects on IS based on his past experiences

  • [17:46 - 21:15] (3:29) IS mindsets, changing systems, collecting empathy data

  • [21:15 - 24:53] (3:38) Why we use IS in the first place

  • [24:53 - 29:26] (4:33) New WREN coaching options for 2023-2024

  • [29:26 - 31:00] (1:34) Conclusion

 

Transcript

Cameron Yee  0:12  

So welcome to the Western Regional Educator Network podcast. My name is Cameron Yee and I'm the Communications Coordinator for the Western Regional Educator Network. And today, I have... 

 

Aly Nestler  0:24  

Aly Nestler, I'm the Lead Continuous Improvement Coach with the Western Regional Educator Network. 

 

Cameron Yee  0:29  

Great, thanks for being here and making time. 

 

Just as a little bit of background, this is the second version of the idea of a podcast, which we're calling Bright Spots. It is a variation of the publication that we've done, spinning it off into a podcast version. I did try to get one off the ground before school let out and you can sort of imagine how that went with all the things happening. And then I got sick with strep throat. So that was sort of a sign from the universe that, hey, maybe we need to slow this down a little bit and retool it a bit. 

 

So the idea is that every podcast episode will feature a new co-host that is a member of the WREN staff. Today, we have Aly as my co-host and we'll spend a little bit of time getting to know about her and her role. The second half would be Aly and I, or whoever the co-host is, getting together with one of our network members, which could be a design team lead, coordinating body member, one of our partners and so on, just to give a perspective beyond just our team. 

 

So I know Aly you enjoy podcasts and listen to them regularly. So what are some of the ones that you're listening to right now? 

 

Aly Nestler  0:51  

I listen to a lot of news podcasts; I listen to The Daily, I listen to Post Reports. I listen to a lot of kind of experiential podcasts, so people sharing their lived experience. I listen to Terrible, Thanks for Asking, that's a great one that kind of goes along that theme. I listen to Where Should We Begin. So those are just a couple of my favorite podcasts of many. So I'm quite excited to be here today to be on a podcast sharing MY experience with the WREN. 

 

Cameron Yee  2:35  

And you've not been on a podcast before?

 

Aly Nestler  2:37  

Never been on a podcast before! I spent years listening to podcasts and this is my first time actually being a participant in one. So it's pretty, it's pretty exciting stuff.

 

Cameron Yee  2:52  

I have been on one podcast - no wait, two. One around home theater; it wound up being this very long podcast, like three hours, which I think is, like, excessively long for most podcasts. And then my wife and I recorded one that we have not really deployed anywhere. And that was over a year ago, and we did it around our anniversary. So the joke was that every anniversary we would record a podcast. But then our last anniversary, we didn't, so we're a little overdue for our next episode. 

 

Aly Nestler  3:25  

I like the idea of a long form podcast. Three hours, that's a very long time.

 

Cameron Yee  3:31  

Yeah, that's a very long time. I don't know. I mean - 

 

Aly Nestler  3:34  

I don't know who wouldn't want to listen to people talk about improvement science for 90 minutes to three hours. But I know I would.

 

Cameron Yee  3:44  

You may be a little biased.

 

Aly Nestler  3:45  

I might be a little biased.

 

Cameron Yee  3:47  

Which segues greatly into your role on the team, kind of what got you here and your exposure to improvement science and what that looked like. 

 

Aly Nestler  3:58  

Sure. So I have been a continuous improvement coach or a continuous improvement specialist, that's what we're called, with the WREN for going on three years now. So that simultaneously feels like much too short of a time and also like I've been with the WREN forever. 

 

So we're really excited to be going into our third year. And I was hired in order to build capacity in individuals and in teams of a variety of different sorts in continuous improvement/improvement science. What that looks like specifically has changed over the last three years as our region - the Western Regional Educator Network, since there are 10 RENs - has kind of figured out the things that have worked really well and the things that we could improve on over time. 

 

My first experience with coaching started in the Springfield School District. I was a literacy specialist or a literacy TOSA, moving kind of from my role as a classroom teacher to a specialist role, a district specialist role. And I was tasked with working with Title I coordinators or Title I specialists who were based in buildings, you know, building their capacity in reading intervention specifically. We were considered to be coaches in that role, even though that wasn't, like, part of our title. And so I was provided with some really comprehensive guidance on what instructional coaching was. So what does a really quality instructional cycle look like? And how do you coach educators through that? That was my first experience with coaching. 

 

So along the way, I became familiar with continuous improvement/improvement science, because I got involved with the Oregon Education Association's - so OEA, our state union's - Leadership Cadre, as it was called at that period of time. And this cadre was really focused on quality assessment practices. So more broadly, not just in literacy - that's kind of where I was rooted at the time - but more broadly, what does really quality formative assessment look like? And so I got involved with the cadre to learn about that. So how can we really help students using the assessment information we have and guide our day-to-day lessons? 

 

Well, through that process, one of the things that the Oregon Education Association was really trying to help educators do was implement in their classrooms what they were learning about formative assessment. So how do we move from knowing to doing? And the process that they used was called improvement science. So they introduced us to this way of thinking about implementation and measuring implementation in our classrooms. 

 

And it really talked to us a lot about Plan, Do, Study, Act cycles. So how do you make a plan? How are you really specific about the goal that you're trying to achieve? How do you, you know, operationalize - or how do you define - measurement for your goals? So how are you going to know that you achieved what you want to achieve? And then how do you use what you learned in order to guide your next steps? And so it was this really structured format that just made a lot of inherent sense to me as an educator. 

 

So that was in 2015 that I started with that cadre. And then over time, the Oregon Education Association started a networked improvement community that was really centered around quality assessment practices. I became a coach in that networked improvement community. So I was coaching a team from St. Helens, so outside of my local context. And then after that, the Educator Empowerment Academy began and I started coaching in that. 

 

I've been coaching improvement science since probably 2016, kind of thinking about the specific mindsets, the specific tools, and kind of the overarching methodology that's required in order to really be effective in that particular process. So that's kind of a little bit about my background with that.

 

Cameron Yee  8:56  

I don't think I heard about improvement science in those terms until I joined the team a little over a year ago. As I have learned about it, there are familiar touch points from my time in the IT department or technology department. The notion around sort of iterative cycles of - well, iteration is sort of a constant thing in IT. I was working on the website at the time, and there's always multiple iterations of that. Something is never truly finished and you're constantly improving it that way. But I think my sort of, maybe, conflation of terms was seeing the term "continuous improvement plan" in a lot of districts - I think maybe at one point Lane ESD had a CIP or had talked about CIPs - so that's kind of my only exposure, prior to joining the team, to hearing those words together. But I know that maybe there's something a little different between improvement science versus what people may have heard more commonly around continuous improvement. So I want to provide a little clarity around what's different or what is the same about them. 

 

Aly Nestler  10:13  

Right. So the words are the same. Continuous Improvement Plans, or CIPs, have traditionally been very familiar to folks that are part of education leadership teams, situated in a local district context. 

 

I remember when I was in Springfield, we had a continuous improvement plan in our building - in each of the buildings that I worked in - and then there was generally a leadership team. So a principal and, you know, specialists and classroom teachers that kind of came together in order to construct the goals for the building, the strategies and the activities that were going to take place in order to reach the goals of the building, and then some specific progress markers. So once again, how are we going to measure whether or not we're making progress towards those overarching goals? Sometimes you would set short-term goals for yourself, some interim goals, and then there would be, like, these longer-term, year-long goals that you would set as well. 

 

I think that there are several points at which the process that we use diverges. The framework that the Western Regional Educator Network uses is called human-centered continuous improvement or human-centered improvement science. And, you know, improvement science and continuous improvement are often used interchangeably - so you see a lot of writing that is CI/IS - within the context of talking about these specific methods. But I think the human-centered piece is really the piece that sets our methods apart from what is kind of traditionally laid out there. 

 

Where we begin is we begin by scoping a problem of practice. And one of the ways that we identify problems of practice to focus on is with those who are closest to the problem. So who is embedded in the local context? And often in education, those that are closest to the problems, those that are closest to the outcomes associated with those problems, are teachers, staff members, and students. So really making an effort to ensure that there is a lot of outreach going on, that there is a lot of active engagement of individuals who are really entrenched in these contexts, is really important to this particular framework. 

 

One of the ways that we do that is through collecting kind of preliminary sources of information to really understand that problem, and in a very in-depth way. So a lot of times what we do in districts is we observe an outcome - for example, something like "our literacy test scores are not where we would like them to be." Right? Or maybe we say something even more specific and more equity-focused, like "our special education students' literacy test scores are lagging behind those of other groups." Okay. And then we go ahead and we start to make a plan about what we're going to do about that and how we're going to measure that. 

 

So one of the things that we do at the WREN is we do a little bit more preliminary exploration about what could actually be occurring. We call that seeing the system work, or preliminary data collection, where we are trying to really get at the root cause of a particular problem of practice. And so we spend a lot more effort trying to see the system more broadly and see it from different perspectives. So one of the things that we really try to do is bring together what we call a diverse design team, made up of teachers, made up of educational assistants or instructional assistants, made up of administrators. We're starting to do some more work where we're involving student voice a lot more heavily to ensure that we really understand what's going on, because we at the WREN believe that a lot of these problems that involve equity or inequity stem from really complex systems of oppression. 

 

And so spending a lot more time making sure that we are attentive to all of the multiple factors that are contributing to those problems of practice is really important to us. And we believe that it helps to avoid what we lovingly term “solutionitis,” or just jumping from solution to solution to solution - anything that sounds reasonable, throwing things out there to see if anything sticks, right? Which can contribute to the initiative fatigue that educators feel in their local context. It just feels like they are constantly having things piled on their plate and nothing is ever taken off. So how can we avoid that initiative fatigue? How can we avoid that solutionitis? And one of the ways that is proposed in the academic literature, and in other very successful networked improvement communities, is using improvement science, which is kind of this set of tools and processes in order to be really intentional about what problem we are trying to solve and what the root causes of those problems are. So there's a lot more preliminary kind of data collection that happens in the framework that we use than what would happen in, like, a CIP planning session.

 

Cameron Yee  16:31  

Yeah, I think some of that is familiar from, again, my time in IT, where - who's defining the problem? The IT staff are defining the problem, because they're the ones getting the support tickets or whatnot. But they don't necessarily reach out and, you know, gather data from the people experiencing the problem, trying to understand what the issue is in a really deep and empathetic way. And I think that's what has stood out in my exposure to improvement science over this last year: that empathy data, those empathy interviews, where you are going to the people closest to the problem and sort of trying to understand what the challenges are that they're facing, and not necessarily jumping to a solution right away. Those are interesting concepts for me to have learned over this last year, and also to start processing through my own experience providing support to end users of technology, and what that may look like in a different way. 

 

Aly Nestler  17:46  

Yeah, we always talk about the mindsets that are associated with improvement science - basically, what mindsets need to be present in order for folks to be effective improvers. One of the mindsets is to be user-centered and problem-focused, which centers on the, you know, the identification of a problem of practice. The traditional term is user-centered; we prefer the term human-centered, just because the users that we're talking about specifically are these beautiful humans - they are educators, and they are students in our schools. And that's really who we are aiming to reach out to and to be in conversation with and to design alongside. And then another mindset is to focus on systems thinking and take a systems perspective. And so we talk about how every system is perfectly designed to get exactly the result that it gets. And if we are not seeing the results that we want in a variety of different contexts, then how can we redesign our systems to suit our users, to suit the humans in our system, and not try and force our humans into systems that are not really created to make sure that folks are as successful as they could be? 

 

So instead of trying to change the people in our system, we change the systems to suit the needs of the humans in our system. A way to do that, a core of that, is really the empathy interview. We say interview, but we could use any form of empathy data - we could use observation. We could use an immersion experience, so, like, "shadow a student" is the most common example of that, where you actually kind of follow one individual person and you see what their experiences are throughout the course of a day or a class period or whatever. 

 

You know, one of the things that we have found about that is that experiential data is really the pinnacle of data collection. For us, that's like what is most pivotal, that is what is most important to us. Because you can say that you have made all of these systemic changes, you can say that those systemic changes are improvements for a variety of reasons. But if they're not improving the experience of folks that they are designed for, then, you know, one of the conclusions that we've come to is that potentially they aren't as grand of an improvement as you may have originally thought. So empathy data collection is really a way for folks to experience what others experience, to feel what other people feel, as much as they can, in order to go about really changing systems for the better.

 

Cameron Yee  21:15  

Backing up a little bit to why we have this in place in the first place. Around the time that the Educator Advancement Council, which is our, essentially our parent organization at the state level, was forming, improvement science got presented as the methodology for the RENs to do what they were charged to do, which was to make systemic changes in their region. Essentially, how did improvement science become the adopted methodology?

 

Aly Nestler 21:45  

Sure. Once again, those that are closest to the problem have the knowledge and should be empowered to be at the table in order to solve those problems. So it's very much focused on, once again, putting power, putting control, putting influence in the hands of educators, and putting power in the hands of students. 

 

So we recognize in our region - you know, we serve Lane, Linn, Benton, and Lincoln Counties - that every single one of our 28 school districts is a very unique context and needs very different things. And so improvement science was really a set of methods that were identified as being flexible for local needs. And then once again, the Regional Educator Networks - the 10 Regional Educator Networks that are under the Educator Advancement Council - are tasked with supporting educators at every phase of their career. They're also tasked with identifying best practices, and they're tasked with providing high-quality professional learning. And I think one of the things that was really known at the initiation of these organizations was that teachers were going to have to be empowered in order to say how they need to be supported, right? They need to be at the table in order to identify what professional learning they need. And they need to be at the table in order to share their professional expertise about what they know is working for their students and their local contexts. 

 

And so one of the things about improvement science methods is that they're very empowering to those who are using them, because they're very much focused on, you know, what am I implementing based on what I know about my systems? And what am I learning from it? You know, there's kind of a uniqueness to having these networks be designed by teachers - and teachers were definitely at the table for that conversation with the Educator Advancement Council. I think that was one of the things that made these methods very appropriate: the knowledge that, like, we are designing the supports alongside; we're not designing these supports for. So there isn't, you know, a leadership team or an administrative body saying, "we know what teachers need." Instead, it's more of an inviting in and saying, "let's find out what teachers need and make sure that they are really leading this work every step of the way." And this particular methodology really lends itself to that.

 

Cameron Yee  24:46  

Yeah, I kind of had that question on my own mind. Why is this process in place in the first place? Well, we are a little over time, and we didn't get to one of the topics that was on our list which was the new coaching options.

 

Aly Nestler  25:01  

Well, I can just spend a little bit of time. I'll go through each. 

 

So of course, we've always had improvement teams, ever since I started almost three years ago now with the Western Regional Educator Network - kind of since we really established programming. We brought together teacher teams, we brought together district teams, and we brought together building teams as well, to really build capacity in improvement science. And these teams generally work with us for one to two years. So our programming is a two-year program. There's the option to do one year and then another year - of course, nobody's obligated to do the other year, but most people do decide to do a second year of the improvement science capacity building that we provide. 

 

So some of the things that are new, responding to the needs and the feedback that we received, is that folks want to be engaged in learning about improvement science and how they can take pieces of this framework, or pieces of this process, and use them in their local context, whether that be at a classroom, a building, or a district level. But maybe they don't have a full year - they can't make a full-year commitment, or they can't make a two-year commitment. And so we were thinking about different levels of support we could offer. So we are offering several Design Days this school year. 

 

So in the 23-24 school year, people who attend a professional development session - and it can be any of our WREN-sponsored PD - come to a Design Day with other educators from across the region. And they do what we call a design sprint. So it's generally, like, a full day, which is more like, you know, about five hours or whatever. And they learn about this process, they learn about tools that they can use, and then they make an implementation plan - they walk away with something to try based on what they've learned about the improvement science process, what they've done that day. And then they do a debrief with a coach after they have actually implemented their plan. So that's our Design Day offering. 

 

And then we also have our individual coaching sessions. So maybe you don't have the time to attend a full day of learning around improvement science/continuous improvement. So instead, what you would do is you would sign up for an individual coaching session where you spend about an hour to an hour-and-a-half, so 60 to 90 minutes, with a coach, and you come up with an implementation plan. So what is something that you're going to act on based on a professional development class that you've taken, a professional development session you've attended? And then you put that learning into action, you carry out your plan, and then once again, you follow up with a coach. And of course, when you do your follow-up coaching session, the coach can help you plan out your next steps. 

 

And so this menu of coaching options kind of offers people flexibility with their time, so they can decide: you know, do I really want to do an in-depth deep dive and really become an improvement scientist myself - learn everything there is to know about improvement science, really enact this in my local context? Do I want more of a taste, where I'm kind of coming together with others, so I get some collaboration, and then following up with a coach? Or do I really want to implement something, but, you know, I only have a limited amount of time - but I really want to carry this learning forward, and so I'm going to spend a couple hours doing that?

 

Cameron Yee  29:04  

And you had a Design Day on Friday, right? 

 

Aly Nestler  29:07  

Yep. This last Friday, August 11, was our first Design Day. And we are also starting to reach out to folks that have said that they were interested in the individual follow up. And then there will be more information to come for our fall-winter Design Day. Yeah.

 

Cameron Yee  29:26  

So kind of looking ahead to part two - that's all in raw form right now, sort of occupying my head at the moment - but we'll be moving on to talking with a couple of design team leads. 

 

Thank you for your time and being willing to be on this podcast. I sort of joked earlier that I never really gave anybody a choice. Everybody's been like, "Yeah, let's do it!"

 

Aly Nestler  29:52  

That's right. Well, thank you for having me and thank you for letting me fulfill this dream of being on a podcast myself!

 

Cameron Yee  30:02  

And if not, there's always going to be at least one episode, right?

 

Aly Nestler  30:05  

Exactly. I love it. Thanks, Cameron. 

 

Cameron Yee  30:08  

Thank you!

 

If you enjoyed this episode, please like, subscribe or follow us on whatever podcast platform you're hearing this on. For more information about our equity-based professional learning and the PD follow up coaching Aly described, follow the links in the episode description which go to our website at westernren.org. Thanks again for listening and we hope you can join us for the next episode of the Bright Spots Podcast: Highlights from the Western Regional Educator Network.

 

Transcribed by otter.ai
