For this high-leverage practice, we are talking about interpreting and communicating assessment information with stakeholders to collaboratively design and implement educational programs. This week I've got Chris with us again. He's one of our school psychologists, and last week we were talking about how do we collect all of that data and make sure that the data that we're collecting is truly meaningful and widely representative of all the different types of questions that we might be seeking answers for.
This week, we are shifting that conversation to saying that that part is done. Now we're at the table. We're having that conversation. We're integrating, we're synthesizing, we're making sense of what all of this data might have to say for programming for a student, or even for a classroom or a program across a district. So, Chris, for you interpreting and communicating assessment information, that's a whole lot of different stuff. How does that show up for you?
Oh, my gosh. Yeah, it's a ton. So anytime you're collecting information, whether it's information about a system, a program, or a child, if you've done your due diligence, you will find yourself in a position where you have a lot of information in front of you and you have to separate: What's high quality? How does it merge? What is it all telling us? What does it mean, and how can we utilize it? Information is no good if we don't have utility for it.
So when I hear that, the first thing I automatically go to, and I always use this example, we find ourselves at a table, because now we've completed our evaluation. Everybody's done their evaluation, right? Of whether it's a person or it's a program. Right? Now we have this information. What do we do with it?
And I think that's planning the evaluation, gathering all those multiple sources to make your data-based decision. That part is done, but it still bears on this. And so for this frame of reference, wearing this little hat, the majority of what I'm going to talk about is we've done our evaluation for special education, but I'm going to weave in things for program evaluation or systems-level stuff, because it is applicable. It is all applicable.
So in that regard, as we're talking through this special education piece, right? We know that there's going to be an entire team, right? Legally, there has to be a team. Just like with a program evaluation, it's probably best done by a team and not just one individual person, for a variety of reasons. But anyway, you've got this team, and oftentimes in special education, you know that you're going to have a variety of representative parties, right? You're going to have a general education teacher, and you're going to have the parent, and you're going to have an LEA rep, right? A Local Education Agency rep.
You're going to have a special education teacher. You're going to have maybe an advocate, if the parent has one, or somebody the parents bring along who knows about the child. And you're also going to have anybody who can interpret the evaluations that were presented previously and are being reviewed, or the individual who administered an evaluation and is coming to present a conclusion. So you have all of these stakeholders, all of these people that are going to be sitting around that table, and then they have to come together with information from dissimilar fields to make decisions that are for the betterment of a child, or for a program or a system or a district or whatever.
And so in that case of special education eligibility, we've got this entire team, and one of the things that a mentor told me was everybody's going to come to that table and they're going to have their own frame of reference. They're going to look at whatever that focal point is, whether it's a student, whether it's a program, whether it's a district initiative, whatever it is. They're all going to come from their own frame of reference. They've been trained, they've gone through training.
They've been experienced in their respective fields, and they're going to look at it from a different angle. And he used to tell me that many parts of the job are an art. One of the art parts of the job is recognizing that as you sit around this table that is going to make a decision, a decision that is going to impact people or a person, that is going to have an outcome, everyone will come in like a finger: they're going to have their own laser beam, pointed from a different angle, through the lens of their area of expertise.
The most important thing is to not let those be isolated beams. It's allowing them to find the similarities, finding all the things that all the stakeholders and all the evaluators at the table had that meld together, that overlap, that touch each other, or even do the same thing, and integrating that information together to help make a good decision. And then after you've done that, which in and of itself is its own task, being able to communicate it effectively to all the powers that be.
And so it can get really, really complicated. You know, we've got alphabet soup in schools for parents. We've got all these letters that sound funny and fancy, but they all mean different things, right? Even simple things like IEP, LEA, right? Different eligibility categories. We have different clinicians, different professions, that come at it from their own angle and have their own internal language from their profession.
And so we have to make sense of the things we have. And then, after we've started from those isolated beams, brought them together, and come to an understanding by unpacking it, we have to use that information. We have to create a product, whether it's a product for a system, a product for a person, or an IEP. So going back to an example. Oftentimes I'll tell you a practice to try and avoid, and then I'm going to go into things that we really, really should be doing, hopefully.
How I've seen it happen, I've seen amazingly collaborative teams that put the student and the parent at the forefront of that team and say, “you are the most important part of this team, and we come to complement the things for you, to help make decisions about your child and for you as a student, and to support you, to make you the best you that you can be.” And I've also seen teams that say, “this is my lane, stay in your lane, I'll stay in my lane.”
And I can tell you, you don't have to be a genius to figure out which ones typically get better results. So as we go through this process, we really, really, really want to make sure that we're remembering that not everybody in that room is an expert in everything. And so when we do this, we're going to create a present level for the child, but that present level should be an amalgamation, a beautiful, rich recipe of all the strengths and all the challenges presented as part of the evaluation.
Knowledge is only useful if you can have utility and use it effectively for change. Otherwise, it's just knowing. And so this is very overwhelming. And I've seen in some cases that the special education teacher who plays the management role, the case manager, bears a heavy burden, a very heavy burden. You have all these individuals who have their own expertise and their own language and their own assessment outcomes, and they come to the table, and the hope is that this case manager will be able to tie it together and draft this beautiful IEP with these different goals. There are going to be some team-based goals, and goals that come from each discipline's own modality, but it's really going to be accommodations and modifications. The heavy lifting is often done by that special education teacher, and I would say, "well, that might get the job done, but have we been collaborative in our approach? Have we been collaborative in the product we developed for the child?"
Assessment is an art unto itself, but so is developing an IEP that's going to have important outcomes for students. We talked last week a little bit about testing versus answering questions from the team. And I think when we take that latter approach, it makes it easier for teams to be answering the questions of concern that we have when we're developing an IEP for a student, whether that's developing a transition goal, whether that's developing reading goals, looking at services, minutes, placement, even eligibility, which can be its own thing.
And then, more importantly, how do we bring that parent into the fold and make them, as the words truly are, an equal and important member of that team, not just the recipient of information being regurgitated and thrown at them from these individual evaluations, done often in a silo-like fashion? How do we integrate that information? Case managers are often tasked with that, but I would say your best practice is to make those connections as assessors: learn about the tools, ask those questions, sit and compare. "Hey, I noticed that you gave the BOT. I gave this. What is that like? What does this tell you? What does this tell me? How are these similar? Oh, I didn't know that you were doing this in the classroom. I didn't know you did the speech evaluation." Have those conversations before, because the scariest moment in any meeting is when you're there to make a high-stakes decision for whatever the entity is, and people at the table are all learning together for the first time because they never had that collaborative conversation before.
Now, you might go through and check the boxes and get the job done, but it is not, in my opinion, an ethical way to do it. And it's absolutely not best practice. And it all starts with: you're in the people industry. We work with others, we work with parents, we work with big people, we work with little people, we work with the kids. But if there are no kids, we are not going to have a job.
Well, let's get really practical on this. There are so many people at the table, so many different tools, and honestly, even from building to building or district to district, different professionals have different access to different tools. So there's no broad, sweeping way to just generalize some of this. So I would love to get practical. How do I, as a practitioner, use what I have, whatever that is, and use it in a meaningful way? How can I bring my information in a meaningful way that can then integrate and collaborate with the other professionals at the table, including folks that are showing up at the table with no formal measures of anything, whose observations and experience are their expertise?
So how do we do that in a meaningful way that's still data-aligned, good practice, and ethically compliant? How do we actually, practically show up and implement this at the table?
Absolutely. First, I will say, and this is its own thing, it starts before the table. But once we're at the table, you never want to be in the position where everybody's learning and hearing the reports for the first time together as a team. Like, "oh, cool, we've got unity because we're all in the same boat."
*buzzer sound* fail.
But let's say we've gone past that. We've done that, we did talk about it. One of the most important things is to remember that when you look to the left and you look to the right of that massive chain of support you have around that child, that team, that army of individuals that are there for the betterment of that kid, not all of them are going to understand everything you're saying.
I would argue that they're probably going to miss 80% of it. I would argue that. And so I found one of the best strategies is, as you're talking about an evaluation — yeah, we might geek out. I do it. I do it all the time. When I evaluate, I get into it and I love it, man. I follow that funnel right down until I realize this is where I need to stop, because I'm losing some efficacy here. It's not efficient anymore. Anyways, I digress. But I would always talk about: these are the scores that I have.
This is what this would mean. These are the things you would see. These are things, given x, y, and z, that might be a challenge for the student, or might come easily or be more effective. If you're looking at strategy alignment to evaluation: what strategies would align with these strengths and weaknesses from my individual assessment profile, from my frame of reference? Over time, you want people adding that secondary flavor into it, not just "these are the scores, and this is what it means, and it tells me that there's a deficit." Well, guess what? Here's the thing.
That student probably wouldn't be at that table if there wasn't some kind of concern or deficit already.
You're just answering the question you already asked, and kind of had an answer to, when you first started the entire process. So don't waste the time in saying these are the outcomes. Yes, we've confirmed it. Congratulations. High five. Move on. No way, man. Talk about: how does that integrate? What did you see? What was easy, what was hard? What strategies? We can all benefit a parent or a student or a teacher or a team or another clinician or another person by telling them the richer information.
This is what it means. This is what it tells me. This is what I would assume given this information. I can never conclude and say it's definite, but I can say there's a high probability. If you've done your due diligence and you've used your multiple sources, you can have some pretty strong conclusions that are going to be not absolute, but pretty well informed.
And you can tell what it means, not just what it is. And I think the latter is often what happens. This is what it is and this is what my evaluation is. But when you're talking about educational planning for individuals or for even programs, just because you have the information in front of you doesn't mean you're effectively using it for change.
Because that's the whole reason we're at the table, to try and find positive growth for x, y or z, a person, a program, a system, whatever it is. And if we don't talk about what it means and how we can integrate that at a deeper level other than just the topical of this is a deficit and a concern, you're showing up to the table with a problem and not a solution.
And then over time, as the team does that, you're going to find there are a lot of overlapping solutions, and there are going to be a lot of things that are symbiotic in nature, and it allows your team to really take those lasers and start overlapping them, those lights, those fingers melding together into a mesh. And then you learn very effectively what is bigger than just the number: the strategy, and what is going to have a higher probability of effectiveness for a child.
We can't guess and say, well, we know for certain. All you have is the indicators in front of you, and you make the best decision with what you have.
Yeah. And, I mean, I can speak to that from living on both sides of this. I've been in the position where I'm the one bringing the metrics and bringing that data and then needing to communicate in a meaningful way, but I've also been on the receiving end for a lot of my career of then needing to receive and interpret and make sense of all of these reports, because then I'm supposed to turn around and go do all of it, right?
So I think what you're saying is so helpful, because at the end of the day, both case managers and families are just asking, what are we doing? Great, I'm glad you ran a wonderful evaluation. I'm glad that it meets all of these check marks that I don't even understand in my role. Like, that's not something that I do, but now I've got to go do something. So, based on all the stuff that you just presented, what do I need to do? And truly, when we're talking about integrating, that outcome really does need to be representative of all of that data and all the eval pieces that were run when it comes down to the "okay, so now what?"
So I really appreciate how you're saying, well, then just present that information in that way to begin with. So if that's where we're headed, we need to figure out what we're going to do. Just present, “here's the information I got so that probably means this is sort of the type of direction that we're headed.” It also really helps because I've been at the table with some very staunch believers in very divergent fields sitting at the table together, that you may have disagreements about root cause or you may have disagreements about the validity of different measures or whatever.
When we get to the point where we're saying, "so, here's the type of direction that I think may be successful for this student," that is a lot harder to have an argument around, unless you simply have information that is discrepant, that just does not affirm whatever that may be. When it comes to practice, well, we can all talk about practices, even though we might talk about it from different metrics or different fields or, you know, different belief systems in how we approach our professions.
We can still have a very integrated conversation about what's good for this kid, what's good for this program, how we can revamp the system piece, because we're really focused on the "where are we headed." So thank you for just saying, "hey, yeah, let's bring the data, all the good stuff we talked about last week. Bring that, bring that full and complete, but present it in a way that really positions everyone to be active in engaging in that conversation for the 'what's next?'"
Yeah, uh, can I tell you a story real quick?
Of course.
Okay. So, about five or six years into my career, I had a parent whose child had gone through a variety of evaluations, and the parents were always looking for that silver bullet. They had access to a lot of resources, and they were going from evaluator to evaluator to evaluator. The outcomes were always the same. And when this case came over, the first thing I noticed was that everybody was afraid of the parent. You know, they thought this parent was going to be a tough parent.
And we had that conversation. We had a conversation with the parent and talked to the parent at what we would have called the domains meeting at the time, but we were going to questions. There was a preponderance of evidence in all areas. This parent had spent close to what you could buy a pretty good used car for on evaluations for the child. I mean, they had spent a lot of money to get the same outcomes, and outcomes weren't what they were looking for. Through that conversation, do you know what the parent said to me?
She said, I have all of this, but that's not what I need. I just need to know how. How am I to help my child? I don't need another team telling me, "this is where your child falls and this is what your child qualifies for." Because at the end of the day, part of the time my kid's going to be with you, and my kid's also going to be with me, and we're both going to be working, hopefully together, on this. But in all of these evals, whether they were clinical or whether I did them through a university, they never told me or taught me or walked me through the how.
And so as we went to that, that became one of the questions. When we went to questions. Let's answer the how for this parent. And at the end of that IEP, you know, we had gone through, and the child was eligible for services, and we developed this IEP, and the parent was part of that process. And she said, for the first time in my life, I feel equipped as a mom to be an effective mom, because for all this time, I haven't been able to talk to my child, to hear them say things like, mom, I love you. The things that everyone takes for granted. And I didn't know how to help.
And now I have a place to start. And that was so powerful for two reasons. One, it's more than just the outcome measure that you get, and two, you are not as a team or as a clinician or as a service professional. You are not alone in this game. And so that collaboration goes across both sides of that table, even when it's painful, even when it's scary, and even when that person has the means to go through mediation and due process, sometimes the things that they're looking for are the things that our best practices would help us as educators to teach this child the how. We know where we're at now, how do we move forward? How do we help leverage x, y, and z for growth?
How do we approach it? And in that case, I'll never forget that. I will never forget sitting at that table. And after we had that meeting, we presented the IEP, the parent cried, not because they were upset, but it was because they felt they finally had what they were looking for for years of evaluation that they never got. Be collaborative and bring those parents in. Lean in.
Yeah. And that's where I think, truly, it's anybody who may not have the expertise that I have, and that's not only families; it's case managers, it's someone from a different field. Truly, everyone at that table is intentionally designed to have different areas of expertise. And so we should all approach it from: sure, I can bring my area of expertise, but I need to communicate it in a way that is meaningful for you in whatever your role or capacity is for this student. So if it's just to assess and move on, that's one thing.
But if you are walking out of this room with a recurring obligation to serve this student, then what I am communicating needs to be communicated in a way that you can actually do that and truly, you know, you and I come from the camp of the living and breathing IEPs, and it's constantly something that we're working on and developing. And next week, Alex and I will talk a little bit more about that as well.
But what you're saying is that you need to share this information in a meaningful way that does right by the data and does right by ethical practice, but also does right by the individual, and not just that student, but all the people at the table, so that they can do right by those students. So thank you for that focus. Thank you for that incredibly heartfelt desire behind all these big heady metrics and processes and all the fancy things. At the end of the day, it's all of us as humans trying to help each other as humans. It doesn't matter if we're colleagues or students or families; we're all just trying to figure it out together.
And that is the beauty of when we can truly interpret and communicate that assessment information in such a way so that we can go forth and actually implement whatever that program may be. It all makes sense. It all comes together, and then we can actually see that progress start tracking from the students. So, thank you for bringing the humanity into the hard sciences of all that data collection.
I appreciate it. I love it. It's what I did, what I do, and what I will always do. Really.
For sure.