Welcome to Webcasts of the 2014 PDE Conference | John Tommasini - Director, Bureau of Special Education

PaTTAN Logo
Apply Discuss Succeed Participate Enhance Share Learn Achieve

Making a Difference:
Educational Practices That Work!

February 5 - 7, 2014 | Hershey Lodge and Convention Center

Visit us on Facebook | Search #2014PDE

19. Frequent Basic Skills Progress Monitoring for IEP goals with Students at the Secondary Level

02/06/14, Hershey, PA
ANNOUNCER: It is my pleasure this afternoon to introduce to you Dr. Mark Shinn. Dr. Shinn is professor of School Psychology and program coordinator at National Louis University. He's nationally
recognized as a consultant to schools across the country on implementation of multi-tiered services and support models also known as RTI or RTII. He just completed his five-year role as project
director for the northern region of iAspire, a USDE OSEP and Illinois State Board of Education Personnel Preparation Grant for supporting implementation of RTI in a multi-tiered intervention model in
Illinois. Dr. Shinn has edited two books on curriculum-based measurement and co-edited three editions of interventions for achievement and behavior problems. Dr. Shinn has also published more than 75
professional journal articles and book chapters on curriculum-based measurement, progress monitoring, problem solving, model service delivery systems and has been the principal or co-principal
investigator on more than 40 -- 4.2 million dollars in federal research and Personnel Preparation Grants. Please join me in welcoming Dr. Mark Shinn.
 
MARK SHINN: Welcome to round two. Can you hear this? You are awake, right? It's not Friday. Friday afternoon is a presenter's nightmare, right? You've eaten lunch, everybody wants to get back home. How
many of you were at my first session? Please don't have a lot of hands raised. Enough of you. Okay. So when you have to do two presentations, you have to build it on the assumption that people didn't
go to the first one, and while there might be some common things in both sessions, there's nothing worse than sitting through it twice if you were at the first one. I made an error, as I've been known
to do in looking at my watch. Because I thought I was so spot-on in terms of planning that I had 15 minutes for discussion. And of course, I'm not used to starting on the off-hour. So I thought I was
going until 12:30 and so I erred. I'm going to present some material that will be the same and then there's some special uniqueness on -- when we talk about RTI, RTII, RTI Squared, RTI 2,
Multi-Tiered Systems of Supports. People have so many different names on stuff but what is so different about the secondary, is if you tried to do it like what you did at elementary, you'll just turn
people off and goof it up, okay? And there's nothing more telling than where people goof it up is in their progress monitoring and/or screening. So I want to leave you with the least a few different
ideas on some stuff. I've been doing this a really, really, really long time, okay? If you do have a cell phone or you are hooked up to the internet, I'm actually going to ask you to take it out
because we're going to be doing a little online polling here. I can show you the polling results from this morning too if you want. I don't think they signed any confidentiality agreements. And
you're not either. So I'm hoping we can have some fun. Afternoon sessions for me, when I was a much younger person, I grew up in Minneapolis. And on Sunday night, they would have concerts at a small
art center. And the question was, did you want to go to the 8:00 show where the people are fresh but limited by time or did you want to go to the 11:00 show that could go all night but you'll never
know, okay? And I always struggled with that dilemma. Well, you're at the 11:00 show. So -- and I hate doing the same material twice. So as I've said this morning, let's see if we can make this
reasonably interactive, okay? My goal when I make these things -- and I always ask for your slides about a month before you do something, I understand that. And so even if I have repeat sessions, I
try not to -- I really try to think through. So I'm not talking in my sleep which I think comes through to the audience. So as we think about some of the things I want to do, I want you to raise your
hand everybody. Say, hey Mark. If you have questions, just say -- raise your hands, say, hey Mark, okay? Because let's go where you want to go. It's a finite period of time. It'll go really, really,
really quickly for me. It'll go really, really, really slowly for you. So if somehow, we can compromise in between, it'd be great, okay? So I didn't have much of a chance to go around. And really if
I had more time, I'd say who are you, what do you do, what do you want to know. But I don't have time. Okay. So what do I want to tell you? Well, let's do this. First, it's frequent basic skills,
progress monitoring secondary level. It's going to be a little bit more than that because I'm going to be throwing in elements of screening, I'll be throwing in elements of eligibility determination.
In fact, I'll be throwing in a pretty heavy dose of that. Because progress monitoring -- isn't there an R in RTIIIIII -- I mean, you got two Is in Pennsylvania which is different. But you also still
have the R. And I was wondering how people do response to intervention without progress monitoring. Because R is about progress. We get really hung up on all the I parts, but we tend to sort of superficially gloss over the R. And you can't do the I part without the R. Okay. So here we are. Secondary, there's my little blueprint. Same joke, I was going to change it up on you but I still like
this one because it has special meaning to Pennsylvania. The short form of what I said this morning is -- and I've had a chance, a few times, to come to Pennsylvania, almost always -- I mean, there was that Penn State pre-debacle, you know, for the PaTTAN conference there. And I've worked with just a couple of the intermediate units. But you've got so much internal state leadership over the years,
people, you know, Joe Covalesky and all of those other folks that you have really been at the forefront of improved services for all kids and in particular kids with disabilities for so long. It
seems kind of crazy to come here. But I enjoyed it a lot. I get to have very different discussions with practitioners in Pennsylvania than I do in Tennessee -- Tennessee, Tennessee, Tennessee, oh my goodness, I'm going there next week -- okay? And I get very different discussions with people in Pennsylvania than I do in, like, New York, where I might as well set my watch back 20 or 30
years. When you talk about tiered services or RTI there, it's like where are you guys, you know, well, by 2012, they were going to do K-4 or -- in reading. Really weird. So you are really ahead. But
I'm going to argue, we are also still kind of behind. Not any more than anyone anywhere else. But we are behind in our use of evidence-based practices in general. And we are, excuse my language, I'm
trying to be provocative since the food is going to your stomach. We're shamefully behind in our IEP goals and progress monitoring. And if nothing else, I want our special ed administrators in the room. And where are you? Raise your hand. Come on, what are you waiting for? Our IEP goals are crap, right? Our progress monitoring is worse. And you don't take a vote on the right thing. It's not a
mystery about how to write good goals and do good progress monitoring for kids in special education. I mean, don't you subscribe to -- aren't you members of the Council for Exceptional Children?
Don't you go to OSEP, the Office of Special Ed Programs, and go to the -- what was the National Center for Student Progress Monitoring, which became the National RTI Center, which is now the Center for
Evidence-Based Practices. What I do today is not a mystery. It was developed out of a federal need in 1978. Raise your hand if you were born after 1978. Come on. There's more of you than that. You're
all shy. Look at all you youngins, okay, who were born after the solution had been identified. Let's fix some things. Okay. So I'm going to go through my jokes quick because I ran out of time
last time. Look, everybody wants a data system that doesn't tell the truth, right? Everybody wants the line going up regardless of whether it's going up. I show this little cartoon that's still one
of my favorites. It's not a cartoon. It's a [inaudible] this is going to be tied to my theme later on.
[VIDEO BEGINS]
MAN: Just thought you might want to know, sales aren't up, they're down.
MARK SHINN: This is an IEP meeting by the way.
 
MAN: Want a new job, we've got the most, careerbuilder.com. A better job awaits.
[VIDEO ENDS]
MARK SHINN: I said that that's an IEP meeting, because we write the goals to meet our needs, not the needs of the kids. Have you been at meetings where you just turn on autopilot and do the bubble head? I have no clue what this person is saying, but who am I to disagree. I have no clue what it is to differentiate between a digraph and a diphthong 80% of the time, but who am I to say. We have
been at those meetings and that's not what the intent of that law was to be. The intent of that law was for us to use the goals, to drive the intensity of intervention. It is not whether a goal is
attainable. And in fact, I hate words like ambitious. It's an ambitious goal -- ambitious for who? Well, that's an ambitious goal. I mean, that's so 1992. As we talk about the things we need to do, the IEP goals we write and the progress measures are probably the single most important thing that we do in our efforts. And it's become an absolute travesty of paperwork and mindlessness. And we need
to fix that. Is that strong enough for you? Okay. So where I work is in this type of assessment called curriculum-based measurement. Since the mid-1980s, it has long since abandoned the curriculum. It doesn't measure any specific curriculum at all. But it's part of the tradition, it's part of the name. In Pennsylvania, you're probably used to hearing it called CBA. But it's not CBA, it's not the CBA of Jim Tucker -- I like him a lot -- and it's not the CBA of Ed Shapiro -- I really like Ed a lot. Those are just different rebrandings of curriculum-based measurement. So if you hear things like, when I
say begin, start reading aloud at the top of the page, you read across the page, try to read each word, and if you come to a word you don't know, I'll tell it to you. Be sure to do your best reading regardless
of the publisher. That's a type of measurement called curriculum-based measurement, okay? And that's what I've been doing since about 1979, okay? Being a graduate student at the University of
Minnesota, being in charge with part of the team to develop a simple and sound data system for writing goals and monitoring student progress. Wow. So that's what I kind of do. You know, for
four years, I was one of six members of the technical review panel for the US Department of Ed, special ed programs, progress monitoring center. As Washington was expressing frustration about our
lack of pace in adopting evidence-based practices, for five years they had a center saying, pick it up, gang. When that center rolled over, when the funding ran out, they rolled it into the National RTI
Center to help us get better at our progress monitoring. And when that funding ran out after five years, they've rolled over the mission of frequent progress monitoring into the National Center for
Intensive Intervention. So I've had a chance to work with those folks off and on. And I write about screening and progress monitoring. You know, I hope to be in the process of wrapping up a chapter
on secondary implementation for the handbook of RT--handbook of RTI or something like that. We'll see. So let me--give me 30 seconds on transparency. If you go to a presentation at a conference, I
think people should be transparent if they have other commercial interests, okay? I'm not here to sell anything other than ideas but you know that I get paid by somebody other than PaTTAN, okay? Not
to say this stuff today, but nobody screens what I say, okay? So I'm paid as a consultant for Pearson, okay? Pearson produces AIMSweb, which organizes and reports curriculum-based measurement data. The slides I have -- the graphs are made from AIMSweb -- but I'm not here to pitch it. I'm here to pitch curriculum-based measurement, okay? I will mention other products that I don't have a commercial
interest in that do CBM, okay? I'm a consultant for Cambium--the Cambium Group. Notice, there is no financial interest because I should have asked for royalties for Vmath, I helped them with progress
monitoring. I helped with progress monitoring on Jamestown Reading Navigator. And every once in a while, I convene with the National Consortium on Reading -- it's now Reaching Excellence. Out of
California, they work in 40 some states including Pennsylvania, they do unbelievable staff development. Okay. Done with that. All right. The handouts, any of you downloaded the handouts in advance?
Some of you, good planning, very good planning. This is very quickly, I have put together the original PowerPoint presentation for you if you want to take it and use it in a class, in your district,
for training staff development, I don't care. Just don't change the content a lot. I have a copy of this presentation in a QuickTime movie format which will show all--a hundred and eighteen billion
slides, which I'll probably hit maybe 30 today. It will have a set of articles targeted by elementary and secondary book chapters, journal articles for your personal use only for you to be able to
read. You can get into any and all folders on my website for personal use only. And that includes, since I've heard some of you talking to Anita Archer, there's a folder called, Anita Archer Gems.
Okay? There's a folder called, Stan Deno Gems, Garry Germann Gems, Daniel Willingham Gems, he'll be here tomorrow. You know, all of these things, you can access. And then you can access any of the
presentations I've done for the last two years. In the same folder as PaTTAN, okay? So there is the blue icon, really simple to find, I think. It says, downloads for professional's icon, that's it.
There will not be a red circle around it. There's my family, I have boys about every 12 years. Which means starting at age of 28, that's him right there. And so it's like high school, Groundhog Day
all over again. So there's my 14-year-old, he's not in that picture. And see that kid right there? I've named him retirement. Because every time I see him, I say there goes retirement. He's five and
really cute. It doesn't keep you young. That's a myth. Okay. So you got it, downloads. Click on that. Alphabetize, go to Ps, alphabetize, go there. Okay. In there, one of the things you can read,
that should explain some of these in more detail, is a chapter that I wrote for the intervention book, the third edition that I did with the world-famous godfather, Hill Walker. If you've never followed Hill Walker's work with kids with behavior disorders -- Hill's retired from the University of Oregon, where I worked for 20 years. He knows everything about anything. And we have an opportunity -- he can call in favors from really famous people who wrote the chapters in this book for nothing. We make about, I think, a dollar and a half. It's a really expensive book, I'm sorry, but
that's the way it is. And because it's expensive, I find a way to get you some of the chapters. Okay. So remember my themes of monkeys wanting things going diagonally in the other direction? For some
of you, I'm going to kind of gloss over this relatively quickly because there are some repeats in here and there's some stuff I'd really like to have a chance to dive into with secondary implementation
because almost always we do it wacky or weird, okay? So I am going to presume that many of you at least have a familiar idea about a simple one-minute sample of reading and some of the judgments we can make from that. I'm also going to presume that most of you think that you're measuring something other than what it fully intends to measure. When kids read aloud, we're measuring their
general reading skills. We're not measuring a little thing called fluency, we're measuring a big thing called general reading achievement. We've known for 40 years if you want to know how well a kid
reads, listen to them under standard conditions. And you don't need to listen to them read for a really long time to form some judgments about their performance and form some judgments about their
progress. Now, it is clear from some of the looks on your faces, I'm going to have to convince you of that, because that's the biggest single barrier we have in terms of building a data system: our confidence in what it is that we're measuring, okay? It's going to be really important. Again, I have this dilemma of having gone through some of this the first time. So this is a high school
student, okay? Eleventh-grader. And this student, I'm going to do an individual screen, okay? This kid's struggling in content-area classrooms, a move-in kid. Let me see if we can explain one reason why the student may be struggling in content-area classrooms. To do this -- this is a high-performing district with high standards. When I say high standards, I mean really high standards. Meeting with their staff, they think the minimum standard for a kid in their high school, the lowest level of reading, should be to at least be able to navigate eighth grade level material. If you're
thinking in your high school, what's the lowest level of proficiency you'd like to see kids perform at. If you can't have that discussion or haven't had that discussion, then you can't identify
problems. You can't identify who you--who you serve. Because you know what, social studies, failing it, is not a disability category. Neither is failing Science. Neither is actually failing Algebra.
These are things that are deeper than that. And until we establish a minimum level of acceptable performance, secondary special ed, especially high school, will be constantly throwing stuff after
the fact. And we won't be able to do a good job. Okay. Tell me how well you think this student navigates eighth grade level text. Sound gets better after about 35 seconds.
[VIDEO BEGINS]
TEACHER: When I say begin, I want you to read across the page. Try to read each word. If you come to a word you don't know, I'll tell it to you. Be sure to do your best reading. Are there any questions?
STUDENT: Nope.
TEACHER: Okay. Begin.
STUDENT: After [inaudible] new town. An eight-year-old, Samantha and her twelve year old brother, Robert had heard from other older [inaudible] from some of the other neighborhood children. They just
add it and they do check -- to check it out.
STUDENT: Some of the -- what is this word?
MARK SHINN: Sloppy.
MARK SHINN: Oops.
STUDENT: Supposedly, it was on the north side of [inaudible]
MARK SHINN: What's that word?
STUDENT: How do I say it--in middle school.
MARK SHINN: Middle school?
STUDENT: Only right behind their new home [inaudible] they heard had crossed the lake and their [inaudible] there was the...
[VIDEO ENDS]
MARK SHINN: Good enough for eleventh grade? I hope that's a rhetorical question. No? Good enough for eighth grade? No? Can you pretty much predict which words she's going to miss? Story-specific, multi-syllabic words. Is she reading with automaticity? No? Does she have an effective strategy for words she doesn't know other than, "What word is that?" And is she attending to the prosodic features of text, reading with expression? Not much. What would you do with a kid like this in your building? Eleventh grade: help with homework, weak content-area tutoring, push-in and co-teaching, doubling the number of teachers in the classroom without increasing the quality of pedagogy. You noticed a little bias there. What do we do? How do we monitor progress? What kind of services would we provide? What would that student's IEP goal look like? That's what we need to be thinking, because as we get into high school, we have still not quite figured it out.
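A note on the arithmetic behind this kind of individual screen: a one-minute oral reading sample is typically scored as words read correctly per minute (WCPM), with errors counted separately, and the score is compared against whatever minimum standard the building has agreed on (the eighth-grade-material expectation in this example). A minimal sketch of that scoring, where every number and threshold is invented for illustration rather than taken from published norms:

```python
# Minimal sketch of scoring a one-minute CBM oral reading sample.
# All numbers and thresholds here are invented for illustration; real
# benchmarks come from published norms or a local standard-setting discussion.

def score_oral_reading(words_attempted: int, errors: int, seconds: float = 60.0) -> float:
    """Return words correct per minute (WCPM) for a timed read-aloud."""
    words_correct = words_attempted - errors
    return words_correct * 60.0 / seconds

def meets_standard(wcpm: float, errors: int,
                   wcpm_benchmark: float = 150.0, max_errors: int = 3) -> bool:
    """Compare a score against a hypothetical local standard
    (e.g., 'reads the agreed-upon passages at 150 WCPM with 3 or fewer errors')."""
    return wcpm >= wcpm_benchmark and errors <= max_errors

if __name__ == "__main__":
    # Hypothetical eleventh grader reading an eighth-grade passage for one minute.
    wcpm = score_oral_reading(words_attempted=92, errors=9)
    print(f"WCPM = {wcpm:.0f}, meets standard: {meets_standard(wcpm, errors=9)}")
```

In practice the benchmark would come from published CBM norms or from the kind of minimum-proficiency discussion described above, not from this sketch.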
So I'm going to take us back to like 1982, when the smartest guy I've ever met in American schools started to figure this out. He said until we can identify the problem, we will not systematically solve it. And at middle school or high school, we don't think much about who we serve and how we serve them in a systematic way. Okay. So here's one of the big ideas: whether you're talking about RTI, RTII, or Multi-Tiered Systems of Supports, it all boils down to -- the cornerstone, the kernel, is data-based decision making. The two most important decisions we make are going to be progress monitoring and screening. And always, to the greatest extent possible, try to put progress monitoring before screening. Why? Progress monitoring is much more difficult to get in place, because progress monitoring is about what the adults do about the problem. Screening is about finding the kids -- oh, we're used to finding kids, and also, I would argue, we're used to finding the kids and blaming them and/or their families for the problem. What we're not so good at is assessing the solutions that we deliver. So progress monitoring will always be a bigger
problem for implementation. Data-based decision making, RTI, I see a lot of screening in RTI, I see a lot of screening but I don't see a lot of progress monitoring. But here is the dilemma, frequent
formative evaluation is one of the single most powerful tools in our toolbox. The ability to stand on a scale and say, "Yup, losing weight or, nope, I'm not, I better do something different. Yup, my
temperature is going down. That's great. Nope, my temperature is not." These are going to be the things that we need to apply to education, it works the same darn way. Progress monitoring, especially
for kids with IEP goals is one of the most solvable problems in the field of education. You don't have to go to your school board. You don't have to take a vote of your staff, it takes real
leadership. The kind of leadership that I've been blessed to have as a practitioner. Dr. Keith Cromer, in 1982, in the Minneapolis schools where I worked, said, "We're no longer going to be writing these old cruddy
IEP goals that are going through the motions, eating up paper, eating up time but not driving practice. And we're going to start doing the right thing. Write a memo, I'll send one to you." It's a
variation of the one that he wrote in 1982 and start shifting over to evidence-based practices, your staff will like you, your kids will do better and your parents will be more actively involved. So
progress monitoring is a little different at secondary, okay? It's a little different, there's little or no Tier 1 progress monitoring, okay? I see schools doing three times a year benchmarking for
kids in their junior year. Now, Pennsylvania people wouldn't do that, would you? Whether it's curriculum-based measurement, measures of academic progress, I mean you guys wouldn't do that, would you?
I mean, why would you give a test three times a year to a junior? That's basically a basic skills test. Now, that's what they do at elementary, but that's because basic skills are still
growing there. But see how that might turn off a high school science teacher? You see how it might turn off a principal that's says, "Come on, why are we doing this, I mean, spending a lot of time
and money, time at my computer lab? Why are we doing that?" You see, if I've got a kid at risk, a struggling reader in eighth grade, progress monitoring, that's pretty good. I've got a kid that's significantly discrepant in tenth grade, progress monitoring, basic skills, that's pretty good. Kids with IEP goals, if they don't have basic skill deficits, I shouldn't be serving them. What? Aren't you serving kids struggling with Social Studies? No, I do not. Because if I did that, I'd be serving most of the kids in my school. So when we think progress monitoring, middle school, high school,
critical Tiers 2, 3, and for kids with IEPs. I'd be curious to see what your IEP goals look like at -- especially high school. Cool, so there are some vocabulary words that I'm not going to spend so
much time on with this group, okay? But we are very sloppy in our language, very sloppy. Benchmarking is a combination of screening and progress monitoring. I can take this off now, right? It's the
afternoon, I don't have to look so formal. As if I was formal to begin with. We confuse benchmarking as synonymous with screening. And benchmarking is actually two things, screening and progress
monitoring of everybody, okay? And we need to be able to differentiate that from just plain universal screening. Did you know that in eighth grade, I might screen everyone? When would I screen
everyone? End of the year. Why? So I can plan kids' schedules for ninth grade. See, if I'm doing screening at the beginning of the year in middle school and high school, I have to change kids'
schedules if I want to give them intervention services. And let's say I want to provide remedial services to kids that are below average, say my bottom quarter. Am I possibly going to change
25% of my kids' schedules after they have been created? No way. But if I know at the end of the year I've got all summer to plan their schedule just like everybody else. And I'm a really big fan of
what I call triaging kids. I can't stand the idea, although it might work for you, that we identify kids as candidates for Tier 2 and wait for them to fail before they get to Tier 3. You guys wouldn't do that, would you? Because I would call that Tier 2 purgatory. Tier 2, in most places I see, is really designed as a place to figure out who goes to Tier 3. Compare that with the idea that Tier 2 is
for kids that are below average but not significantly discrepant who need a moderately more intensive intervention. Tier 3, my goodness, those are kids that are significantly discrepant that we need
to get them into the most intensive intervention that's not necessarily an individualized special ed program as soon as we can. I want to do that at the end of the year so I can plan, okay? That's
called universal screening. And at some point, I'm going to move to individual screening. So beyond ninth grade, I don't want to -- I don't need to assess in most communities everybody to find the
kids that are significantly discrepant. Think of the cost-benefit of that: you know, you've got a two-thousand-student high school and let's say there's five hundred tenth graders. I'm going to test five hundred tenth graders to find out who my most discrepant readers are that need intervention? Shouldn't I have identified them already? I'm going to test my new kids, one at a time, if they're struggling in content-area classrooms. That's individual screening. We don't want to turn off our staff by over-testing kids. Content-area teachers are sort of grouchy sometimes, and we don't need to make them more angry. I might even use what's called multiple gating. Illinois is an ACT-driven state. I actually kind of like that. I've been brought along slowly on it. In 8th grade, they give the EXPLORE to allow predictive planning of college readiness, okay? And although not all kids are going to college, a kid who's predicted to get a 12 on the ACT, that's probably a kid who's going to be struggling in school, right? So, one of the things -- I worked with a very charismatic principal named Dr. Charles Johns. They used their end-of-grade-eight EXPLORE testing to identify the most at-risk kids. They followed those kids up with individual screening with curriculum-based measurement, and those kids were scheduled into Tier II or Tier III interventions in 9th grade. Whoa. Did that change the dial in terms of kid performance when you do that? Very efficient. So, that's called multiple gating: using existing data, followed up with individual screening.
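Multiple gating, as described here, is essentially a two-stage filter: an existing group measure (the end-of-grade-eight EXPLORE score in this example) flags possible at-risk kids, and only those kids receive an individual CBM screen. A small sketch under assumed, purely illustrative cut scores:

```python
# Sketch of multiple gating: use existing data as a first gate, then follow up
# with individual CBM screening only for the kids the first gate flags.
# Cut scores below are made up for illustration, not published criteria.

EXPLORE_CUT = 13        # gate 1: flag kids at or below this composite score
CBM_WCPM_CUT = 110      # gate 2: flag kids below this words-correct-per-minute score

def gate_one(students):
    """Return kids whose existing test data suggest risk (gate 1)."""
    return [s for s in students if s["explore_score"] <= EXPLORE_CUT]

def gate_two(flagged, individual_screen):
    """Individually screen only the flagged kids (gate 2)."""
    confirmed = []
    for s in flagged:
        wcpm = individual_screen(s)      # one-minute oral reading screen
        if wcpm < CBM_WCPM_CUT:
            confirmed.append({**s, "wcpm": wcpm})
    return confirmed

if __name__ == "__main__":
    eighth_graders = [
        {"name": "A", "explore_score": 18},
        {"name": "B", "explore_score": 12},
        {"name": "C", "explore_score": 10},
    ]
    # Stand-in for actually administering a CBM screen to one student.
    fake_screen = lambda s: {"B": 118, "C": 92}[s["name"]]
    needs_intervention = gate_two(gate_one(eighth_graders), fake_screen)
    print(needs_intervention)   # only "C" clears both gates in this toy example
```

The point of the design is efficiency: nobody re-tests five hundred tenth graders when existing data can narrow the pool first.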
And then frequent progress monitoring, I'm going to define as progress monitoring more than once a month, okay? So, let's say it's the first year of middle school, I might do
benchmarking, okay? That means fall, winter, spring. Oral reading for purposes of screening and general -- some sort of crude progress monitoring but that's not frequent. Tier II, I'm going to do
that at least monthly. Tier III, I'm going to do that preferably weekly. Key vocabulary, got it? Nod your heads if you do. Questions if you don't? Raise your hand. Okay. So, I'm going to gloss over this
because you can read over it. That's fast. This is a book by John Hattie, okay? If you haven't read John Hattie's book or if you haven't seen his videos, they're really kind of neat. Now, he's from
New Zealand and he's had the time and opportunity to produce a bunch of books. This is the most notable one, reviewing 800-some meta-analyses -- remember, a meta-analysis is a summary of a bunch of
different studies. Well, he's done a summary of summaries, and found things that moved the dial. And I love this because it's actually a dial. Everything is put on to the same scale. And you can get
a nice little key saying where it ranks among all of the things, the number of meta-analyses that went into it, the number of studies, the number of things they measured, and the number of people in the darn study. This is what he found with respect to frequent formative evaluation. He's summarizing a lot of Lynn Fuchs's research, among others, when you look at the specific studies cited. Here's
what's really cool. There's a lot of things that -- well, there are a lot of things that work that are called teacher effects. You know, did -- do you know that most about -- just about anything we
do with kids works a little? There's nothing remarkable about, "Hey, I did this and the kid learned something." That's not the goal. The goal is not to find things that are different than no
instruction. The goal especially for kids that are significantly discrepant are what are the things that produce gains above and beyond, and I hate to use the word, just good teaching. What are the
things that really moved this dial? Number three overall: frequent progress monitoring. That's a big deal. If this were medicine, schools would be screaming, hospitals and parents would be demanding that we implement this treatment, but because so much of what we do in education is kind of elective -- let's all vote, let's have everybody be happy -- we are slow to adopt and slow to adapt. Number one, interestingly, in terms of teaching variables. Now, if you want to know the two things that will influence student achievement higher, not necessarily more but higher than
frequent progress monitoring, you got to read Hattie's book because I'm not going to tell you. They're not teaching variables and they're not home variables. Okay. So, here is our dilemma. Everybody
has an opinion about assessment and hardly any of us have training. When I hear, I don't like that test. Well, really? So, what do you know about assessment? What do you mean? I don't like that test.
It is unbelievable. And in particular, I have this bias: my wife is an administrator, and she does know a lot about assessment. She says she goes to these meetings and everybody wants one-stop shopping. They want a test that nobody will complain about, that nobody has to do, that nobody has to score, yeah. And guess what? It ends up sending kids to the back of the room and taking up
computer time. Now, I like computers, you know, I said this morning, I'm a techie. I like all kinds of stuff. I'm an early adopter. I can turn my son's heat off in his room, which I thought of
doing last night just mostly to annoy him. Because I can do that from my hotel room when I'm toasty and I like to sort of poke him a little bit. But, you know, for kids that are vulnerable, I think
somebody needs to actually look at them, because I've worked with a lot of adolescents who aren't particularly organized, not particularly motivated, and they're just
going through the motions on this stuff. And those are the kids -- hey, for typically developing kids fine and dandy but for kids at risk, I probably want to see if the kids are trying. So, I worry
about stuff when we pick the test to meet our needs rather than the needs of kids. And here's what these kinds of tests are best for: program evaluation, accountability, maybe, maybe, maybe screening. But progress monitoring, especially frequent progress monitoring, is not their strength, okay? So, we've got to do better than that. Now, there's a bunch of things from different publishers
that fall into the CBM family. We can measure reading. I'm going to call this oral reading. I'm going to call this silent reading. I'm not going to call it maze comprehension. It doesn't correlate as
highly to other measures of comprehension as oral reading. It's a silent reading test, not a comprehension test. We got to get that out of our way. It is useful mostly as an efficient screener for
secondary age kids. I can do all my kids during Social Studies period and I could do all my end of the year 8th graders with a maze task. I'd be done in five minutes, okay? I'm going to have a few
kids that cheat, right, because it's multiple choice. I'm going to have a few suspicious scores, I should know that anytime you do group testing, multiple choice, you're going to get cheating and
suspicious scores but usually not a lot, and I can use maze as a low cost thing to do that. Spelling which people tend not to do very much of which for me would be the second thing I would probably
put into my data system. Probably third I'd put into my data system, written expression. Math is important and there are some very good, now that technology has evolved a little bit. There are some
decent things starting to come out of mathematics. They're not quite perfect yet, not good enough, I think, because again, without really having a lot of published probes, it's hard to do research on
them. There are some things that we could do really in a lot of these areas for basic skills. Now, for you guys, you're not interested in the early literacy stuff, right? But if you have some things
to take away, look at the things that I think are better than others and the things that I don't think are so useful, okay? I'm not going to talk about it but you can catch me afterwards if you want
to talk about why I think nonsense words is mostly nonsense, okay? So, I was on that research team. I'm not just talking smoke. It's just letter sounds works a lot better, okay? So, there are some
things that we can do in a bunch of different areas for basic skills, okay? And then since it's secondary, we don't -- we're not going to talk about those. There's a bunch of different publishers,
okay? You should all know that we're all inbred. We're all inbred, okay? So, most of this research was field tested in a set of rural co-ops, north of Minneapolis-Saint Paul. The head of that co-op
was a guy named Gary German, the smartest guy I've ever met in American schools. You'll hear his name later. Gary German had, among the other people who worked in his district, Lynn Fuchs, who went to Vanderbilt University. Lynn created Yearly Progress Pro for monitoring basic skills progress. Someone else who worked in Gary German's school district was Jerry Tindell. Jerry Tindell, Lynn Fuchs, and I went to graduate school together. Jerry went off and did easyCBM, okay? I left Minneapolis Schools to go work at the University of Oregon. I
hired Roland Good and chaired Ruth Kaminski's Doctoral Dissertation. Ruth was funded on a federal leadership training grant to extend curriculum-based measurement principles down to younger kids. So,
we all worked with the same database. There are strengths and weaknesses to each of different programs and your job as a consumer is to know which things work best for which types of problems. That's
all I'm going to say about any of those things. The design principles I'm not going to spend a lot of time on, but let me hit a couple of them. Because you see, I do like things where I can see how
kids are doing. If kids have to write their answers to a math problem, yeah, you can do things on computer and enter the scores but I can't see how that kid did that, okay? And even though it
increases efficiency which I like as a screener for things like progress monitoring, I'd actually like to see the kids write the answers. When kids do some things in writing, writing is unbelievably
rich in terms of what it tells me. What a student can and can't do, does or doesn't do. I really like to see kids read aloud. I really like to see kids read aloud. So, that's really bias for me. I
also want something that's really simple to do. I don't want people to have to go to graduate school for five years to learn how to do something like score reading. That's just not practical. These
are some things that are important. The science part is clearly some of the things that we need to be thinking about. But guess what, science doesn't influence practice. I got to tell you, I wish it
did. I wish publishing a study would change people's minds but it doesn't. We still have people talking about learning styles. I mean unbelievable. What, what? Oh, what do we have? Did I hit a nerve?
AUDIENCE: [inaudible]
MARK SHINN: I mean, what is this thing? It's a really nice little thing that might play well on 60 Minutes or something like that, or probably worse. But you know what -- and Willingham will talk about this tomorrow, I'm sure, because he's got a really nice 10-minute video on it -- it's just utter nonsense. Okay. So -- did I disappoint somebody? Okay. I hope not. Okay. So, seamless: we're not going to see a kindergarten kid read, but it works really well for them, too. Elementary -- I can have a staff member, anybody in the building. Anybody in the hospital should be able to
walk in and take somebody's temperature. And I should be able to do the same thing for basic skills in terms of screening and progress monitoring. Anybody should be able to do it. In fact, if I could
rewind the clock a little bit: in 1988, I published this article that was trying to show how easy it is to do scoring of reading curriculum-based measurement, okay? And we compared how trained teachers
did compared to how trained fifth graders did, okay? And now I would entitle that article, "Are you as smart as a 5th grader?" It turns out, you know, the 5th graders, if you train them were just as
accurate as adults because it's easy to do, okay? Not that we want to have 5th graders do testing. What I want to be able to do is I want to be able to do seamless assessment so I can assess kids
that need frequent progress monitoring as often as once a week. Kids at Tier 2, twice a month or at least once a month -- I could do once a week if I had the resources. I could do twice a month if I
had the resources but me, I could live with once a month. And for Tier I, notice I'm going to say through the first year of middle school in low risk communities. Benchmark, let's say kids transition
into the middle school in 6th grade, I would do everyone three times a year, progress monitoring and screening, for 6th grade. What if I'm in a high-risk community? Wow. If I'm in a high-risk community, I might do it for two years in the middle school, because we're going to have a sizeable proportion of kids that still need to grow in their basic skills. I worked with a high school in Utah.
They were telling me they were benchmarking fall, winter, spring -- universal screening and progress monitoring -- through 10th grade. And I said, "Why in the world would you do that?" And they said, "Mark, we've got 60% of our kids who don't make AYP on our state reading test. We have kids that are not done." Does that seem like a good rationale to keep your eyes on how these kids are growing? Absolutely. But you customize it based on the needs of all your kids. Most of the time, it's going to be Tier II and Tier III and IEPs at middle school, high school, okay? We don't need to get hung up on the things that I see people doing in 9th grade three times a year. It turns everybody off and it really distracts us from the bigger issues, which are these, okay?
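The scheduling logic in this stretch of the talk can be summarized as a small decision table: benchmark everybody three times a year only through the transition grade (longer in a high-risk building), then reserve frequent monitoring for Tier II (at least monthly), Tier III, and kids with IEP goals (preferably weekly). A sketch of that table; the grades and frequencies are simply the ones suggested in the talk and would be tuned locally:

```python
# Sketch of the monitoring schedule suggested in the talk: benchmark everyone
# only through the transition grade(s), then reserve frequent monitoring for
# Tier II, Tier III, and kids with IEP goals. Details are illustrative.

def benchmark_everyone(grade: int, first_middle_school_grade: int = 6,
                       high_risk_community: bool = False) -> bool:
    """Fall/winter/spring benchmarking for all students at this grade?"""
    last_benchmark_grade = first_middle_school_grade + (1 if high_risk_community else 0)
    return grade <= last_benchmark_grade

def monitoring_frequency(tier: str, has_iep: bool = False) -> str:
    """How often to progress monitor an individual student."""
    if has_iep or tier == "Tier III":
        return "weekly"            # preferably weekly
    if tier == "Tier II":
        return "monthly"           # at least monthly, more often with resources
    return "benchmark only"        # Tier I: screening windows, not frequent monitoring

if __name__ == "__main__":
    print(benchmark_everyone(grade=6))                             # True
    print(benchmark_everyone(grade=9, high_risk_community=True))   # False
    print(monitoring_frequency("Tier II"))                         # monthly
    print(monitoring_frequency("Tier I", has_iep=True))            # weekly
```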
So, this is sort of what it looks like. You'll have different graphs from different publishers. The more frequently we monitor progress, the more important it becomes to establish an expected rate of progress. The overarching thing at Tier II, Tier III, and for Special Ed -- the biggest single guiding principle of how we write our goals -- should be to reduce the gap. We'll do it, and I need an Archer moment. Everybody, the purpose of our goal should be to do what? Reduce the gap.
AUDIENCE: Reduce the gap.
MARK SHINN: Okay? And if we're not reducing the gap, we're not working in the interest of kids. Okay. So, again, less of Tier I at middle school, high school, certainly high school. With this kind of test, I could monitor progress, but I'm going to skip that one. What I really want to do, especially with this group, is target the single most solvable problem, which is the quality of our IEPs. Look, everybody hates them, right? You do know that they draw straws at high school to see who goes to these meetings.
AUDIENCE: Uh-hmm.
MARK SHINN: Who's the General Ed representative? Some of you have probably drawn the short straw. It is not the right attitude we'd like to have amongst our staff. This is -- I found this in a --
posted right above the desk in the Special Ed teacher I was working with in Illinois that really captured it so well. It's actually the theme of a book chapter I wrote for the National Association of
School Psychologists on writing goals in progress monitoring, which is in the folder that you can read about. You know, I do not like these IEPs. Do you like these IEPs? I do not like these IEPs. I
do not like them, geez laweez, we test, we check, we plan, we meet. This captures so much of the proceduralism that we end up doing. And these are real goals. These were goals. I asked my -- last
spring, I had 17 students out in very good school districts in Illinois, and trying to get an idea of the impact that we've had as part of our training now around RTI, all this other sort of stuff,
the -- I mean the bottom line will be seen in the quality of our IEP goals, and this is not the worst set of goals nor is it the few that I would say would probably be good ones. This was
representative and this is an -- the phrase I used this morning is, if it's not embarrassing, it's shameful. It's 2014 not 1971 not even 1975. This is real life. Let's take a look. Well, there's an
observable measurable goal. Wow. Increased basic reading skills. I can measure that. How? I mean I could if I put it into something but why wouldn't I just say what it is I was trying to measure? And
look at these short-term objectives. Short-term objectives? Let's see. What did IDEA 2004 do with short-term objectives? No longer required. Why did they no longer require short-term objectives? How about two reasons? There's no relationship between short-term objectives and meaningful student growth. In other words, you can write them, but it's not going to influence whether the kids achieve.
And here is the other reason, the Federal Paperwork Reduction Act. Short term objectives just take up space. They just make long IEPs. Now, many states like Illinois require short term objectives. I
call them window dressing. Me, if you go in my house, you'll see wooden blinds and no drapes. You go into my mother-in-law's house, you will see sheers, blinds, drapes, heavy stuff. That's her taste,
okay? But it's not making her taste my taste. I called that window dressing. If you like short term objectives, put the window dressing in there but good grief, this is embarrassing. If indeed a kid
should be able to decode words containing long vowel syllable patterns, and I think they should, they should do this without thinking all the time. Not 80% of the time, and the jokes I make are
really not funny, because I used to joke about kids crossing the street successfully 80% of the time -- except, as I told this morning, I was in Mississippi and the director of Special Ed said, "We had this move-in kid come in with that goal." We thought it was just unbelievably silly. The all-time dumb goal: a kid will not bring a gun to school 80% of the time. That's a group of people going through the motions but not thinking. And these are going through the motions and not at all thinking. A weak goal that we can't really measure very well -- we could do so much better. And look at
this, 80%, no thinking required and the -- look, they just went through it and drew lines. That's awful. We need to do better than that at both of these things. And look at this, students will
recognize sight words at the 4th grade level 80% of the time. They're sight words -- we should get them right just once in a while? Unbelievable. People wonder why it takes kids, you know, like five years to
learn the multiplication facts. Jeez, if you write goals, the kids will do -- learn the multiplication facts with 80% accuracy, you will turn that into a career. That's two to three weeks of
instruction if kids have a good program in any kind of number sense. The goals drive your programs and your goals legitimize poor instruction if you write them this way. Okay. So, how can we do
better than this? Well, we can do better real easily. How about this one? I've got an 8th grade student, and I'm going -- this kid is significantly discrepant -- I might write an annual goal that will reflect the student's improvement in general reading skill. That's the first goal we saw on the previous two slides, but I'm going to make it more specific, more observable and measurable: this kid will read 150 words correct per minute with three or fewer errors from randomly sampled grade 7 standard reading passages. That's measuring general reading achievement -- "will improve in reading" is what this is saying, but this is observable and measurable. I can do this with confidence. If I've got a kid with a discrepancy in math, I can say this kid will earn X number of points. I need to know what those points mean and what these scores mean in terms of a percentile rank, in terms of a meaningful reduction of the gap, but I can do this quite easily. And if I've got a kid who's a significantly discrepant writer, and I can get that kid writing sixty words in a three-minute writing sample with sixty correct writing sequences, this kid has really reduced the gap.
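A goal written this way converts directly into an aimline: a line drawn from the student's current (baseline) score to the goal score at the annual review date, whose slope is the expected rate of improvement that reduces the gap. A minimal sketch of that arithmetic, reusing the 150-words-correct-per-minute goal above with an assumed baseline and timeline:

```python
# Sketch of turning an observable, measurable goal into an aimline:
# expected weekly growth = (goal score - baseline score) / weeks to the goal date.
# Baseline, goal, and timeline values here are illustrative.

def aimline_slope(baseline_wcpm: float, goal_wcpm: float, weeks: int) -> float:
    """Expected words-correct-per-minute gained per week to reach the goal."""
    return (goal_wcpm - baseline_wcpm) / weeks

def expected_score(baseline_wcpm: float, slope: float, week: int) -> float:
    """Where the aimline says the student should be at a given week."""
    return baseline_wcpm + slope * week

if __name__ == "__main__":
    baseline, goal, weeks = 92.0, 150.0, 36   # e.g., one school year of weekly probes
    slope = aimline_slope(baseline, goal, weeks)
    print(f"Expected growth: {slope:.1f} WCPM per week")
    print(f"Aimline at week 18: {expected_score(baseline, slope, 18):.0f} WCPM")
```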
Now, there are techniques and strategies for teaching how we would write all of these sorts of goals. They're not new. They've been around a really long time, but we've got to get on with the business of doing kind of the right thing. Because every once in a while, when I look at my five-year-old, okay, and I'm really tired of teaching classes at night and I don't want to
go out of town and meet nice people and then, you know, like get delayed for 12 hours and all that other sort of stuff, you never get anywhere easily anymore. Every once in a while when I get really
frustrated, I think, you know, "I should call Perry." Perry was here yesterday, right? And Perry is like -- Perry is scary, right, with all the stuff that he knows about -- and it's like, "Oh, my
God, this guy really knows stuff. Oh, my God." I don't like talking in front of him because he will always find the errors in what I say from the legal perspective. Well, find a guy like
Perry and let's go out to parent advocacy groups and say, "Hi, we will evaluate the quality of your Special Ed program." And the first thing I'd do, I'd say, "Let's see your kid's progress monitoring
graph." How many of you would say, "Oh, we got that." How many would say, "Oh, let's see. What would I do?" See, I believe that we should be able to find the progress monitoring graph within 10
minutes that would show not only that we have a graph but if I ask you, "What does that line represent?" You'd say, "That's a --that's a line of expected improvement that significantly reduces the
gap." And I'd expect you to say, "What do you mean when you say it significantly reduces the gap? What's the kid's present level of performance?" And they expect you to be able to say, "Well, there..."
-- I'd expect you to answer those questions. And you see, seeing something like this, and if I saw some lines like that and didn't see this, I'd say, "You're out of compliance with IDEA 2004." IDEA
2004 put in these interesting little words -- actually, it was IDEA 1997. The IDEA '97 put in these words that said the IEP will be revised to address any lack of expected progress. That was put in
the law because what was happening was at the end of the -- end of the year, people would go, "Darn. It wasn't a good year." Closing the barn door after the horse gets out. So, in '97, they said that
we're supposed to revise the IEP to address any lack of progress. Now, guess what, they revised the IEP right here. Kid was having some challenging behavior so they revised the IEP. They drew a line.
How did that go? Not so good, right? But talk about data-based decision-making: did they correct it? Yeah. And guess what, now the kid's back on track. Now, in hindsight, you wish -- I wish we wouldn't have done this behavior plan, I wish we wouldn't have done this one, this one's probably better -- but that's what progress monitoring allows you to do: revise the IEP to address any lack of progress. So, our primary goal is to increase the quality of our IEPs and frequent progress monitoring.
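The graph-reading rule behind "revise the IEP to address any lack of expected progress" can be sketched as a slope comparison: fit a trend line to the weekly scores and check it against the aimline slope. The least-squares trend and the tolerance margin below are one common convention, not the only defensible one:

```python
# Sketch of the decision rule behind "revise the IEP to address any lack of
# expected progress": fit a trend line to the weekly scores and compare its
# slope to the aimline slope. The 10% margin is an arbitrary illustration.

from statistics import mean

def trend_slope(scores):
    """Ordinary least-squares slope of weekly progress-monitoring scores."""
    weeks = range(len(scores))
    w_bar, s_bar = mean(weeks), mean(scores)
    num = sum((w - w_bar) * (s - s_bar) for w, s in zip(weeks, scores))
    den = sum((w - w_bar) ** 2 for w in weeks)
    return num / den

def lacks_expected_progress(scores, aimline_slope, margin=0.10) -> bool:
    """Flag the graph for a possible IEP revision if observed growth falls
    clearly below the expected rate of improvement."""
    return trend_slope(scores) < aimline_slope * (1 - margin)

if __name__ == "__main__":
    weekly_wcpm = [92, 93, 95, 94, 96, 97, 99, 98]   # eight weeks of probes
    print(trend_slope(weekly_wcpm))                  # observed growth per week
    print(lacks_expected_progress(weekly_wcpm, aimline_slope=1.6))
```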
I should tell you, I'm not looking for a lawyer to go out and sue school districts. If I've been working with schools, it has been to help them implement. I have not once been called in to testify against a school district, so I just every once in a while think, "Wow. That would be easy money." Not the right thing to do, but easy money. Now, here's what I'd like us to be able to do. How many of your staff are paralyzed by mindless three-year reevaluations at the
the high school? How many of you are high school school psychologists in the room? If you don't watch out, you could spend all your time engaged in activity that won't improve anybody's performance.
My worst fear is going out and re-administering a cognitive ability test, asking a kid why we elect senators, which I ask myself pretty much every day, repeating digits backwards so I
can sit down and figure out what the kids are capable of to figure out they still need Special Education. That's not what the three-year reevaluation was ever intended to do. The three-year
reevaluation was intended to prevent kids from being stuck in dead end programs. So, we're supposed to be assessing the kids' benefit from the programs. So, let's take a look. If this kid's expected
rate of progress reduces the gap, is this kid meeting or exceeding the rate of progress on the IEP? They're exceeding. Is this kid reducing the performance gap from peers?
Yes. Does this kid still need Special Education? No, let's consider a transition plan. We've now completed our annual or three-year review. We can determine what additional data, two words, five
letters, if any, are needed to answer the questions, and I would argue, no. So, this is the part where we get into kids with eligibility concerns. If you don't have good progress monitoring after a kid is eligible for Special Ed, as part of their IEP, how do you think you're going to get good progress monitoring to determine whether the kid is eligible for Special Ed? Shouldn't you do it the same way? How you monitor progress to determine eligibility should be the same way you monitor progress after eligibility, and how we monitor after should be how we determine eligibility. Well, that would be too systematic for us. This is not new. But how are you identifying kids as eligible for Special Education under RTII, okay? The basic premise is what we call the Dual Discrepancy. Kids have to be discrepant from other peers, okay? This is called the performance discrepancy. It's a measure of the kid's educational need. If you're not discrepant, you don't need Special Ed. This is necessary but not sufficient for eligibility, okay? It is no longer a between- or within-person ability-achievement discrepancy. It is a between-kid discrepancy, okay? So, this kid is discrepant, well below the 10th percentile. That's for elementary.
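The Dual Discrepancy check mentioned here combines a performance discrepancy (well below local peers, for example under the 10th percentile) with a progress discrepancy (growing more slowly than expected even with intervention). A toy sketch of flagging both; the cut points are illustrative, not regulatory criteria:

```python
# Toy sketch of a dual-discrepancy check: discrepant in level (far below local
# peers) AND discrepant in rate of growth. Cut points are illustrative only.

def percentile_rank(score: float, peer_scores) -> float:
    """Percent of local peer scores at or below this student's score."""
    peers = sorted(peer_scores)
    at_or_below = sum(1 for s in peers if s <= score)
    return 100.0 * at_or_below / len(peers)

def dual_discrepancy(score, weekly_growth, peer_scores,
                     level_cut_pct=10.0, expected_weekly_growth=1.0):
    """True if the student is well below peers AND growing too slowly."""
    low_level = percentile_rank(score, peer_scores) <= level_cut_pct
    low_growth = weekly_growth < expected_weekly_growth
    return low_level and low_growth

if __name__ == "__main__":
    peers = [110, 125, 131, 140, 148, 150, 155, 160, 162, 170]  # local norm sample
    print(dual_discrepancy(score=95, weekly_growth=0.4, peer_scores=peers))   # True
    print(dual_discrepancy(score=152, weekly_growth=0.4, peer_scores=peers))  # False
```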
Once you get into high school, it's going to be difficult, because here's what we're operating with: kids aren't passing Social Studies. Let's test them. What would you test them on? Well, me, I'm going to suggest if this kid has a severe basic skill discrepancy, this kid may be eligible for an intensive basic skill intervention, but I need to define what a severe basic skill discrepancy is, and until I do that, I'm just going to be throwing good money after bad. These are the kids -- what I would like you to walk away from here with is identifying the kids that we treat. Okay. Treat might not be the word that you like. You could use the word teach, but treat is the one that works for me, because I need to distinguish the kids that have low-level basic skills, the kids we treat, from the kids that we would support in content-area instruction. You can't tell me -- you can't teach study and
organizational skills for a full year over here and expect the kid to use them over there. You can't tell me that that will work. I don't believe it. I've never seen any data that suggest you can
teach skills in isolation over here and get kids to generalize over there especially if the General Ed content area teacher in no way shape or form runs his or her class like that. "Hi, I want you to
take Cornell notes. Your teacher doesn't expect you to do Cornell notes, but we want you to anyhow. Your teacher will never mention it. Your teacher doesn't even know what they are, but we want you
to do that." Boy, that's putting a lot on the kid, isn't it? My kid doesn't do it even when it's expected. So, we need to be able to sit down and say, "How do we support kids in this stuff?" And
here's the real challenge. This is not a measurement challenge. This is skill building and leadership. How many of you are doing variations of the Content Literacy Continuum in your content-area classrooms? Pennsylvania, movers and shakers? You know what the Content Literacy Continuum is, right? It's sort of the latest, more intensive, well-packaged version of, I would argue, the Strategic Instruction Model from the University of Kansas -- thirty-five years of work on how to support at-risk kids or kids with significant discrepancies in content-area classrooms. If you haven't, go to the -- you know, there are some website references for you. I mean, the way they've got the Content Literacy Continuum put together now, it's just really neat. I've been working a little bit with the Virginia Department of Ed. I did a full day with secondary teams with them last September, and part of our training is collaborative planning with the Content Literacy Continuum, the CLC
people and they've got a really nice video. It's in my -- it's in my folder for you about how they're marketing and packaging it within their high schools. It's really cool. This is a two to
three-year endeavor to be able to build these skill sets but we don't get here until we get this problem clarified because otherwise, you know, Special Ed becomes the homeless, the tired, the
oppressed, and the weak. We take anybody whom somebody else doesn't want, and it's disturbing. So here's my question for you, okay? What is the level of basic skill proficiency below which, okay, you think a kid needs treatment? Okay? Think about this. For you, the level of proficiency, the minimum level -- just think about ninth graders, tenth graders, eleventh graders; let's use a high school example. For you, look: if a kid performs as well as grade six students, we could support that kid in content-area classes, and below that standard, that kid needs an intensive program provided by the type of thing Special Ed might do, okay? How about this one: have a high likelihood of passing -- if they were in sixth grade -- that test? That they would be proficient at least at the level of sixth grade as measured by your standards test, okay? The first is called a norms-based approach; the second is called a standards-based approach, okay? I'll give you another option: perform as well as end-of-the-year seventh graders. Or pass a grade seven proficiency test. How about this one: perform as well as end-of-grade-eight students, or have a high likelihood of
passing on a grade eight proficiency test. I might collect some data. Take out your phones. Take out your -- I'll show you how this works if you haven't used poll everywhere. It's free even I can do
it, knock on wood. So, gave you those choices, okay, and one way or the other whether one is called a norms based or standards based and you go like this if you're hooked up to the internet you just
go to that website and then you type in these key answers if not you text and if you don't how to text it next -- sit next to a young person they do. You would text to this number you would text if
you think one way or the other you like your kids to be at end of grade six proficiency, end of grade seven proficiency, end of grade eight -- or you'll never do this in high school, not even in
middle school. You don't want to think this way, great. Let me know.
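To make that standards-based cut point concrete, here is a minimal sketch in Python of the treat-versus-support decision rule being described. The grade levels and words-correct-per-minute benchmark values are hypothetical placeholders, not published norms; the actual cut point is the judgment call the team has to make.

    # Minimal sketch of a standards-based "treat vs. support" decision rule.
    # Benchmark values below are hypothetical placeholders, not published norms.

    STANDARD_GRADE = 7                            # the agreed-upon minimum basic-skill standard
    WCPM_BENCHMARK = {6: 150, 7: 160, 8: 170}     # hypothetical end-of-year oral reading benchmarks

    def treat_or_support(student_wcpm, standard_grade=STANDARD_GRADE):
        """Return 'treat' if the student falls below the agreed minimum standard,
        otherwise 'support' in content-area classes."""
        cut_point = WCPM_BENCHMARK[standard_grade]
        return "treat" if student_wcpm < cut_point else "support"

    # A ninth grader reading 90 words correct per minute is below an
    # end-of-grade-seven standard, so the rule says intensive treatment.
    print(treat_or_support(90))     # -> treat
    print(treat_or_support(165))    # -> support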
 
AUDIENCE MEMBER: What is that figure [inaudible]
MARK SHINN: I will make this full screen like that. Okay. And we should start to see results when they come in, and hopefully -- it hasn't failed me yet.
AUDIENCE MEMBER: Mark, I have a question.
MARK SHINN: Yes.
AUDIENCE MEMBER: One of the problems that I've seen in secondary is -- in primary and secondary -- is that we tend to accommodate instead of doing replacement instruction.
MARK SHINN: You noticed, eh?
AUDIENCE MEMBER: Yes. Yeah. And -- in your question here you're talking about at what level do we decide if we can basically accommodate these kids [inaudible]
MARK SHINN: Intervene?
AUDIENCE MEMBER: Intervene.
MARK SHINN: Yes.
AUDIENCE MEMBER: All right. So at what point do we -- do we need to intervene, and at what point can we find ways -- well, basically to accommodate the kid in the content-area class, but what we end up admitting [inaudible] what is happening...
MARK SHINN: I know. I know.
AUDIENCE MEMBER: ...we just spend all our time going around in this facility...
MARK SHINN: Exactly.
AUDIENCE MEMBER: ...and we never actually work on.
MARK SHINN: That's exactly right. Did you notice one of my two choices was not accommodate? My two choices were treat and support. Now, I have a kid on a 504 Plan, I guess; my middle son has type one diabetes, okay? So, he has a 504 Plan, mostly for medical reasons. They want to put in all the stuff like, you know -- do you want extended time? -- and, no. Well, he's eligible for it. It's an accommodation; he doesn't need it. Well, what if his blood sugar is low? He should treat his blood sugar, okay? What if he has a dada da dada and he has to go to the nurse? He'll deal with it, okay? Don't misunderstand me, I'm all for accommodations, but here is my bias: accommodations give the appearance of doing something without meaningfully changing things for kids. It's about us. Look, I can expect the kid to do less, no big deal on that one. I can give the kid extended time, no big deal on that one either. It's a little bit of a pain, but if the kid can't do something, giving them extended time to do it is not going to do much. So, my bias -- there will be some times, I'd never say never, never, never, but I don't want that to be the first thing we think of, okay? I want us to first think about what's your standard, because I don't think we actually provide the intensive intervention that some kids actually need. Look, I can tell you in the Chicago area, a kid who doesn't graduate from high school -- that's really not good. It's really not good. We already have one in four African-Americans between the ages of 18 and 40 in prison. The predominant group in our city who don't graduate from high school are children of color, and it's really disturbing.
AUDIENCE MEMBER: It's almost equally not good to have them graduating and reading on a second grade level.
MARK SHINN: Exactly. So...
AUDIENCE MEMBER: And I've seen so much of that, and you look at the IEP and there really isn't anything in there that's going to change the fact that this kid has a severe, you know, reading [inaudible]
MARK SHINN: That's right. Yeah. I'd much rather have a literate dropout than an illiterate dropout.
MARK SHINN: Yeah. So, here's our dilemma, and if you want, you can see what I'm doing next week for the Tennessee Department of Ed on secondary implementations, okay? That's in the folder already.
AUDIENCE MEMBER: Yeah.
MARK SHINN: Just go down to Tennessee. I'm doing a keynote for about an hour and a half, and there's this big idea that we are stuck at secondary with what I call the "dilemma-dilemma": because we want kids to graduate, we don't give them the interventions they need. Because they don't get the interventions they need, they drop out. So, how about if we give them the interventions that they need? Now, would you
rather do it at seventh grade or would you rather do it at tenth grade? I'd rather do it at seventh grade, okay? But it has to be an unrelenting quest to get kids the services that they need, and it's not like we don't know how to do this. Look, most of you have responded -- look, if the kid is at end-of-grade-six proficiency in ninth grade, we should be able to support that kid in content-area instruction. We might not be able to do it as well as we could right now, but this is not a kid that requires -- kids, you know, reading above that threshold, this is not a kid that needs a Special Education program. We've got to figure this fundamental distinction out, because until we do that, it becomes difficult to write our goals and monitor progress. Okay. Now, I'm going to say you guys are actually pretty conservative -- not bad, not good, but pretty conservative -- and I would actually say pretty realistic, because if I had my truth in advertising, my standard would be: look, if I got a kid to an end-of-grade-six standard, that kid could be supported in content-area instruction. But boy, do I get push-back in some places, so I try to kind of hedge my bets and not have to argue a lot, and say how about end-of-grade-seven proficiency. I work with a high school right now whose standard is end of grade eight. Whoa, and guess what, they've got a bunch of kids -- 53 tenth graders -- that fell below that standard. They were shocked, but it explained why these kids are doing poorly in content-area classrooms. Okay. So, you kind of got this basic
idea, right? This is this overarching thing: if we do this, we identify -- and I don't like mission statements. I can't stand them. I don't like people arguing about whether there should be a hyphenated word and -- oh, my -- the future conditional. But here is one place where you convene a team, you meet with your school board, and say here's where we believe a kid may need intensive Special Education; above this, we will work to support, okay? And it goes like this: below this level, intervene and treat; above this level, support. And boy, do we know how to do that support so much better than we did before. Here's what it looks like graphically. I'm going to use an end-of-grade-seven standard here, okay? This kid right there is the student that you saw. Here's her performance relative to an end-of-grade-seven standard; almost all your CBM data are shown in box-and-whisker charts. Here's the average, and the orange line is the line at the bottom of the box, the below-average range from the 25th percentile out to the 10th. This student is well below the tenth percentile of end-of-the-year seventh graders. If this student were outside of the box, below average, I would say that student would be a good candidate for treatment. Okay. But it's drawing the line, having the arguments that you need to make with your community, to say above which we
support, below which we intervene, and then be relentless in providing the interventions that kids need, okay? So, I worked with a guy named Wayne Green, who is reducing the gap the fastest of the middle schools in the State of Florida. Okay. High-need community, like really high need, 99% of the kids on free and reduced lunch. The core language arts program in sixth grade was a double period of Corrective Reading. Anybody know what Corrective Reading is? It's the kind of program you use in Special Ed. That was the core. If you're in Tier II, if you're at risk -- and it's an at-risk school -- triple period. And in sixth grade, if you couldn't read, you could be in language arts and Language for Thinking and a bunch of other stuff all day. Whoa. Guess what happens when you do that? You reduce the gap. He had this interesting problem, too, that kids would now come to school for language arts and then skip; before, they just skipped, but now they come for language arts. That's a better thing than the past. So he got this idea. So, guess what my goal for this kid would be? If they're discrepant, what would be my IEP goal? In one year, I might want to have them be proficient in grade seven reading material. It makes goal setting relatively straightforward, presuming the performance gap is not astronomical. Okay. So, that kid who may be discrepant now, I would like to ensure that that kid is given an appropriate and intensive Tier III intervention, so it's not General Ed straight to Special Ed, the old model, okay? My first bias would be, for kids below that standard, if I've got an appropriately intensive and scheduled Tier III intervention with a really strong program, that would be my first shot, unless it has, you know -- at least where I grew up in Minnesota we would call it "a snowball's chance" of doing you know what. But a kid who is reducing the gap with that intervention would not be eligible. This kid, I would argue -- a seventh grader -- would not be eligible, because this rate of progress now is reducing the gap. This is the goal we wanted to achieve; that line reduces the gap. It's parallel. I'm going to say this kid is benefiting from Tier III; let's maintain the Tier III intervention. If the kid was not doing this, I would say we may need a more intensive program, the kind delivered by Special Ed. Okay. So, there is the rate to reduce the gap. It's now pretty close, and I think it's actually narrowing the gap just a little bit, but the rate of progress is designed to do that. Okay. So, the kid is not eligible.
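As a minimal sketch of that benefiting-from-Tier-III check, the Python below fits a trend line to hypothetical weekly progress-monitoring scores and compares its slope to the slope of the goal line, the rate needed to reduce the gap. The scores, baseline, and goal are placeholders for illustration only.

    # Minimal sketch: is the student's rate of progress keeping pace with the goal line?
    # Scores, baseline, and goal below are hypothetical placeholders.

    def slope(scores):
        """Ordinary least-squares slope of weekly scores (gain per week)."""
        n = len(scores)
        xs = range(n)
        mean_x = sum(xs) / n
        mean_y = sum(scores) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
        den = sum((x - mean_x) ** 2 for x in xs)
        return num / den

    def benefiting_from_tier3(weekly_wcpm, baseline, goal, weeks_to_goal):
        """Compare the observed trend to the rate needed to reach the goal."""
        needed_per_week = (goal - baseline) / weeks_to_goal
        observed_per_week = slope(weekly_wcpm)
        return observed_per_week >= needed_per_week, observed_per_week, needed_per_week

    # Example: baseline 60 WCPM, goal 114 WCPM in 36 weeks (1.5 WCPM/week needed).
    scores = [60, 62, 65, 64, 68, 71, 73, 76]     # eight weeks of monitoring data
    on_pace, observed, needed = benefiting_from_tier3(scores, baseline=60, goal=114, weeks_to_goal=36)
    print(on_pace, round(observed, 2), needed)    # True if the trend keeps pace with the goal line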
Okay. So, we go -- let me check my watch to make sure we're on the right track this time. So, we go until 3:15, right?
AUDIENCE: Yes.
MARK SHINN: Cool. So, can I spend 10, 15 minutes addressing the barriers, and then we'll do some things on goal setting and try to make sure we have time? Okay. Because here are the things that just drive me crazy. These things are really not a scientific issue, okay? It is not ambiguous. You know, there's a conference that's been held for 20-some years called the Pacific Coast Research Conference. All the premier Special Ed researchers present there; to be part of the club you have to be accepted into the club, it's very cliquey. You know, you've got to have a bunch of grants, and they have to present their forthcoming research, and I can tell you what the outcome indicator is in almost all the studies. It's all going to be some variation of curriculum-based measurement progress monitoring. The research community doesn't really argue about this stuff. The practice community does, and they argue about it for this reason. It's not science, it's practitioners, and here's the problem: it's your smart teachers that have the biggest problem with curriculum-based measurement. It's not teachers that are fresh out of school who don't know any better; it's your smart teachers who know that something is not right with the way it's often portrayed. Okay. And the barriers are -- look, it's not useful for secondary because you don't have tenth grade probes. What would a tenth grade probe look like? What are tenth grade basic skills? Remember, let's go back to this notion: we're trying to identify kids with basic skill discrepancies. So, it's not discrepant from tenth graders or eleventh graders; it's discrepant from a minimum basic skill standard, and people just don't get that, okay? And that's not their fault, because nobody has ever taught them. Here's a second one: that curriculum-based measurement measures fluency. If you speak French well, you don't necessarily speak it fast, although most of the people I know that speak French speak way too fast for me. And then the problem of comprehension. Unless we get on top of these issues, people are going to say [makes noise] and whine and whine and whine, and we won't get very far. Now, you can't show them the data. You can't show them studies. I've done that, and this reading director in Nebraska said, "Look, no amount of data, no research articles, I don't care what journal it's in -- Reading Research Quarterly, Journal of Special Education -- nothing is going to convince me this measures reading." Okay. How do you deal with that? I mean, okay, we're arguing religion here, so I can't deal with that.
Content-area teachers push back when they see silly assessment. They may not tell you this, but when they see their class canceled for a day so kids can go to the labs to take these things that the teachers will never, ever use or look at the data from -- that turns people off. Don't do that. I mean, I see kids being tested their senior year on a computerized test of basic skills. The physics teacher is not going to look at that and say, "I'm going to develop a specialized program for the" -- no, sorry. It turns them off. We also see people benchmarking everyone, and that turns people off. Don't turn people off. We also see, because the materials end in grade eight, people think that they're not useful, and, you know, all the publishers now want to develop ninth grade, tenth grade, eleventh grade probes, and I keep going back and I say, "But you're missing the point. You're missing the point." "Well, our customers want them." "Yeah, but you're going to confuse them." The net effect is you're going to turn them off, because it's not going to work as well as if you would just do what the original intent was: do a good job of measuring the basic skills part. Some kids do indeed have problems in content-area instruction, my son the ninth grader being one of them; he's struggling in ninth grade English. Don't get me started. So, we have to make what's called a shift from a normative approach -- is this kid different from tenth graders? -- to a standards-based approach -- is this kid different from some specified minimum level? Don't hear me saying that it's what I want all kids to do, like it's the greatest; it's the floor, not the ceiling, okay? And we fail to do that. Now, that's logical. Our experienced teachers tend to push back about this fluency
part. Okay. And here is this really fundamental thing -- I went back and forth with another person three or four times yesterday on this -- that because something is timed, it's a rate measure. No. Because something is timed, it's designed to have a fixed test length. Look, if you measure something for ten minutes or you measure something for one minute, they're both timed; nobody calls a ten-minute test a fluency test, do they? And if you can reach the same decision listening to the kid read for ten minutes or listening to the kid read for one minute and say, "Look, the kid is improving or not," which would you rather do if you're the teacher? The timeframe was designed to be practical, not to generate something about speed. And then, as in the talk this morning, people have really misinterpreted the National Reading Panel. People also don't understand that the five areas for reading aren't necessarily the same for adolescents. Did you know it's not phonemic awareness, alphabetic principle, fluency, vocabulary, and understanding; it's word knowledge, motivation, fluency, comprehension, vocabulary. Why would motivation be an important area for middle school, high school kids? Okay. So, we've really kind of goofed that up. Okay. And, of course, one of the things is I have to say sorry, I wish I could take it back, because I used the word fluency in the early 90's too. But that was, like, the early 90's, and I hope I've been consistent since. So -- we can skip this part. Here's the bottom line, and there's some
actually some pretty good research on this, okay, that if you portray this as a measure of fluency, people will push back because, you know, the goal of reading is not to read fast, and they're absolutely right. People who say that, you should pat them on the back and say you are right. The goal of reading is to read well. Now, think about it: if you encounter hard material, difficult material, things where you don't know the vocabulary and you're not understanding what you're reading, would you read more or less material in the same timeframe than if you did know the vocabulary and were understanding? You'd read more if you were understanding more, if you did have the vocabulary. It's just a timeframe, okay? Think about it: if it's easier, you'll read more; if it's more difficult, you'll read less. What are the things that make text complex? Vocabulary, prior knowledge, understanding. Do I just breeze through complex text? No. I got sent a bunch of choices from my Human Resources Department on my healthcare benefits. It's, like, two pages long. I don't even understand it -- the first line is, "There may be some changes in your benefits based" -- okay, I get that part, but put the other stuff in English; I'm a pretty good reader. Okay. Second thing, our interventions emphasize speed, and most of the time it will be repeated reading. Almost always, in some commercial products that really emphasize repeated reading. If a kid doesn't read well and you ask the kid to read it faster, what are you going to get? Nothing. You'll get guess-and-go. You'll get kids trying to memorize the words rather than actually learn how to read. It's a very weak and overused intervention driven by this notion about fluency. And if I read another book about what to do after I DIBEL, what will I do? I will throw up. And that's pretty strong, because you don't use those measures to plan instructional programs. They're not designed to do that. If you treated it as what it's intended to measure -- general reading ability -- your interventions would be about how well you read. That's what we need to be talking to kids about, and if they read better, their scores on an oral reading test will go up, not because they're reading faster but because they're reading better, okay? And we will make better choices on interventions, okay? Most
of the interventions I swear we're doing in 2014 are band-aid programs. And here's what drives me crazy: a combination of band-aid programs that a team or a teacher puts together themselves. Any of you have kids besides me? When you really need a band-aid, you go down to the bottom of the box and get those [inaudible] round ones and those skinny ones, and you're really trying to cover this bad cut, and you try to put all these band-aids together to make a bandage. They don't work well, they're aesthetically unattractive, they fall off. That's what I think a lot of our interventions at middle school and high school look like. You know, look, if you don't know what the big programs are that really reduce the gap at middle school or high school, do some reading, because we have now, in 2014, the best interventions that we've ever had to reduce the gap or prevent the gap from getting more significant in middle school or high school, and that includes core programs too. If you haven't seen some of the new core language arts programs that have come out, that have taken the stuff that we know about supporting content-area literacy -- we've got some homework we should be doing. Okay. So the bottom line, and these are people who know what they're doing, the bottom line goes something like this. These are Jade Wexler, Sharon Vaughn -- oh my goodness -- Greg Roberts, Carolyn Denton; these are really famous researchers. Implementing repeated reading and wide reading interventions without a more powerful intervention is not likely to be valuable. This is "The Efficacy of Repeated Reading and Wide Reading Practice for High School Students With Severe Reading Disabilities." Read this article. It all comes from -- okay, the kid does poorly on reading CBM so they must have a fluency problem, and you address fluency problems with repeated reading. And guess what? It turns people off. Now, very quickly, comprehension.
How many of you know kids that can read but not understand? Take a look around. All of us -- and all of us, depending upon the material we're being asked to read, could be faced with the same problem. So I put together a white paper, okay? It talks about the barriers, and you can read that; there's some science in there. Look, here's a study that Lynn Fuchs did with middle school students identified as SLD, 1988. The most significant correlation to SAT comprehension was not question and answer, although it was okay. It was not recall, oral or written. It was not oral or written cloze. The best single predictor of performance on SAT comprehension was oral reading. But that doesn't matter, right? Because comprehension is oversimplified. This is an example from this morning. Many of you will do better on this task now than you would have a month ago, mostly because this has been all over the news. A month ago some of you may not have known who A-Rod was; now you might say he's that guy that was on the news. And for some people that could be evidence of understanding, right? But you didn't get it from reading. Some people might say that's the baseball player that was on the news; that's deeper. That didn't come from here, although you might infer it from some of the other stuff. Somebody might say that's Alex Rodriguez, he plays for the dah, dah, dah, and, oh, by the way, he makes a lot of money and the Yankees are really stuck, okay? There are levels of deepness here. Now, this one I've actually got in the right order. If I ask you who A-Rod is, some of you may be able to give me varying levels of correctness. Some of you still in this room might not know who A-Rod is -- any of you? Anybody willing to raise their hand? Don't you understand what you're reading? I mean, it's right there, okay? What is sex? How many of you don't know that? What, don't you understand what you're reading? It's right there -- can't you read this? See, this
is the problem: comprehension depends on a bunch of stuff, not just the words on the page. Now, we'll go a little deeper. This one was really fun this morning, okay? There's so much you need to know to answer this question that's not contained in this sentence; if you know stuff, this is easy, okay? You can read this and answer this question. Beantown, right, that's another word for Boston. Alex Rodriguez plays for the Yankees, and the Boston Red Sox and the Yankees hate each other, like, a lot. When you hit into a double play, that's two outs, right? And that means not good, so people in Boston would say, "Hooray, not good for the Yankees." None of that is evident in the text. And then this one, even more, okay? Now we have some notion about timeframe: early in the season it doesn't matter, later in the season the stakes are higher. None of that is in here. And a lot of things we do with understanding kind of come from reading but are not reading. Reading comprehension is very task specific; it depends on what you're reading. My son is being asked to draw deep inferences from To Kill a Mockingbird in ninth grade -- why did the character do this? He goes, like, "I don't know, it was interesting." He doesn't read at that level of depth, so now he's being expected to navigate A Passage to India. A Passage to India, written by an English author in the 1920s about the challenges of moving toward -- getting independence from the British -- there's a whole bunch of stuff in there -- oh, my goodness. Yeah. So this is from Joe Torgesen, where I made a little bit of
change to put an asterisk here, because Joe talks about this as fluency; I will call it general reading skills, and other people probably would too, but not everyone. Everything else is the same. Torgesen would say that what goes into this process is reading with expression and the process of automaticity, which is a really important word going back to Jay Samuels. Cracking the code without thinking is really good -- that's automaticity, not fluency. Decoding, especially multisyllabic words for adolescents, and for really young kids, phonemic awareness. Now, any speech and language people in the room? There have got to be some. We had some this morning -- anybody? They all came to the morning session, or you just think I'm going to put you on the spot. I won't. I'll just tell you that you...
AUDIENCE MEMBER: [inaudible]
MARK SHINN: Yeah. That's actually good, because I actually don't think we should have those services there, other than articulation, which really shouldn't be a problem by that time -- it should be done earlier -- and kids with severe disabilities, feeding, et cetera, oral stimulation and stuff. But that's me. Speech and language people for 35 years have been saying that your language skills
are really important to your reading and understanding. And what goes in there? The two biggies are going to be vocabulary, vocabulary, vocabulary, vocabulary, and the two biggest sources of vocabulary growth are going to be what you walk in the door with from your family or from caregivers prior to school, and how widely you read. Let me repeat: how widely you read. By the time kids get into middle school and high school and they're struggling readers, how much have they read inside and outside of school? Hardly anything. Even in fifth grade, when you see some of the surveys, let's say a fifth grade boy reports they'd rather clean their room than read a book -- that should be really discouraging if you know anything about fifth grade boys. So, vocabulary, and then these things called cultural influences, these are idioms or figures of speech. Cut it out now, give me a break. Read a text and you'll discover that a good share of the time the words don't really mean what the words say, like write about this from your own personal world view. Personal? Okay. That's me, I got that part. World view -- like I'm looking at Earth? There are amazing idioms in our texts. The other stuff it talks about is called knowledge: you need to sort of know something about what you're reading. Did you know that you can actually learn a lot about the Civil War without ever having been there? If you read. There's a bunch of things you can do that are proxies for life experience by reading. But content knowledge and life experience are really big. Torgesen lays out this one. Look at
the top: motivation and engagement. What is the active reading strategy the kid actually uses or has ever been taught to use? Monitoring strategies, fix-up strategies -- but the biggie here is going to be motivation and interest. How many kids you work with say this: like, why do we need to read this? This is stupid. And I get it, you know -- my son, Dominic, he's a voracious reader. He doesn't understand -- look, this Passage to India, I don't get it. This is dumb. Why are we reading this? How come we're not reading about India today? Or the challenges between India and Pakistan, or things about Kashmir and why people are -- you know, like, those are actually pretty good questions. How do you know that stuff? He's really interested in the region. He's just not interested in the stuff that's in that book. Okay. That's a real challenge. Now, Torgesen puts it together like this. For some kids, learning to read will be the hardest thing that they ever do in their life. You guys live with these kids all the time, right? But Torgesen will say the evidence would suggest, of all the other things on this page, on this slide, this is the easiest thing to teach. Not easy, but easier than the rest of it. Torgesen goes like this: the longer it takes to do this -- and how many of you work with kids that have not done this yet -- here are the kids you're working with. They have language deficits, especially vocabulary. They have deficits in what they know about the world, in their knowledge, and they hate reading. Oh, I'd like you to read -- oh, do I have to? And those kids
are really tough to teach. Now, here's the other part. This is the easiest to measure. That doesn't mean it's the only thing we ever measure, but until we get some simple basic stuff down -- do you think in the hospital they argue about how to take a patient's temperature? But in schools, we would. Can you imagine the arguments we'd have over Fahrenheit and Kelvin, or Fahrenheit and Celsius? Schools would turn that into something really complicated, and we don't need to. So remember, it comes from reading. For very few of the kids, by the time they reach middle school and high school, have their core language-arts programs provided evidence-based or sufficiently explicit comprehension strategy instruction. Just go in and ask: what strategy are you using? Are you using PQS? Are you using SIM? Are you using compare and contrast? What's your strategy? If they can't tell you, then they are not. And here's the other challenge. Are they learning the right words the right way? Do we know what the essential words are that kids need to know to navigate text? It's a rhetorical question. Yes. There are words and -- anybody heard of -- I don't know. All right. She might be a beloved name here in Pennsylvania -- Isabel Beck? Didn't she present here a couple of years ago at this conference? I mean, there's pretty much agreement that the words kids need to know to master academic English are out there. The problem is, if we don't know that they're out there, then how can we ensure that all of our teachers actively teach them? Here's one of those words. A sizable proportion of middle school and high school kids do not know what the word analyze means. How could they not know what analyze is? We use it all the time. Analyze the problem. Well, because we use it a lot, there's no guarantee that kids know what it means. Oh, my goodness. So, we need to be able to do a much better job on this stuff if we're going to get our job done. This is an instructional problem, not an assessment problem. Okay? So -- these are things that are in your readings. You can read about them, okay? This one is particularly good, by colleagues of mine, Kevin Feldman and Kate Kinsella. They were brought in to improve the quality of Read 180. And then this one, screamingly great, by Nancy Marchand-Martella and [inaudible], on key areas of effective adolescent literacy programs. Again, all of these things you can read; I put them together for you on my website, okay? These are instructional problems. So, okay.
Goal setting. I'm skipping from slide 59 to here. I'm going to use a middle school example. Here's how it works. It's actually pretty simple. Going back a number of years, the law requires us to write our IEP goals based on the kid's present level of performance. I love this word, PLOP. Say it. PLOP.
AUDIENCE: PLOP.
MARK SHINN: Kind of scatological, too, if you think about it on a Thursday afternoon. We need to identify the kid's PLOP. Good grief. Okay. Present level of performance, and to do that, we need to add a skill to our toolbox here. It's called doing a survey level assessment. If you're as old as I am, you probably learned how to do an informal reading inventory. Well, a survey level assessment is an IRI on steroids, okay? It's much better, much more scientifically sound, because we're using better measures with known reliability and validity. Here's what it looks like for a middle school kid. We determine -- for whatever reason, okay, let's presume this kid has not responded to a Tier 3 intervention, okay? And they have a significant performance discrepancy; they're discrepant from peers. We need to write a goal. In our middle school, we have benchmark data. We know how all of our kids perform in the fall, winter, and spring. And this is a student named Jenny. In her sixth grade language-arts class, she is failing. She has not responded to Tier 3 additional supplemental instruction, okay? To be able to reduce the gap, we may believe she needs a Special Ed program. Okay. Is Jenny discrepant from the level of performance of sixth graders? Yes. Okay. Well, that tells us she's discrepant, but how discrepant? So, there's the expected level of performance. We'd like her to be reading as well as sixth graders. She is not. So, there's where average kids perform. If she fell in that range, we'd say she's not discrepant. We don't need to worry about it. So, if that's what's expected and she's reading 24 words correct a minute, that places her at the first percentile of sixth graders. Ouch. So, how does she do compared to fifth graders? Let's give her some fifth grade passages. These passages should be easier for her. She should read them better. Does she? A little. Does she read fifth grade material like fifth graders? No. Does she read fourth grade material like fourth graders? Does she read third grade material like third graders? In many of the schools in Illinois, if we saw her performance like this in third grade, we'd say she'd be a good candidate for intervention. But she's not in third grade. She's in middle school. You start to get a sense of how severe this problem is? So, your survey level assessment not only shows how discrepant the kid is from same-grade peers, but for the first time, it's systematically trying to figure out what a kid can do, not just what they can't do. And you build your goals from where kids can do things, not from failure. And that's really important. Jenny could navigate grade two material like a beginning-of-the-year second grade student. Big discrepancy or not?
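A minimal sketch of that survey level assessment logic in Python, assuming hypothetical grade-level benchmark values for words correct per minute; the numbers, the 85% tolerance, and the scores are placeholders for illustration, not published norms or Jenny's actual data.

    # Minimal sketch of a survey level assessment: step back through grade-level
    # passages until the student performs within the typical range for that grade.
    # Benchmark values and scores are hypothetical placeholders.

    TYPICAL_WCPM = {2: 90, 3: 110, 4: 125, 5: 140, 6: 150}    # hypothetical end-of-year medians

    def survey_level_assessment(wcpm_by_grade, enrolled_grade, tolerance=0.85):
        """Return the highest grade level at which the student reads at least
        `tolerance` of that grade's typical rate -- a present level of
        performance (PLOP) to build the IEP goal from."""
        for grade in range(enrolled_grade, 1, -1):
            if wcpm_by_grade.get(grade, 0) >= tolerance * TYPICAL_WCPM[grade]:
                return grade
        return None    # discrepant even on the lowest material sampled

    # A Jenny-like example: scores from the passages she was given at each grade level.
    student_scores = {6: 24, 5: 30, 4: 38, 3: 52, 2: 80}
    print(survey_level_assessment(student_scores, enrolled_grade=6))    # -> 2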
AUDIENCE MEMBER: [inaudible]
MARK SHINN: Huge. Okay. Here's the -- here's the bottom line. For me, if I'm writing this goal, if this is where her present level of performance is where she's currently performing successfully, I
need to figure out in one year where I want this student to be. And to do this, I need to understand what the advantage is and what the disadvantage is of any goal that I write because every goal has
at least one. So let's say I wanted Jenny to be successful in grade three material in one year. The advantage of that goal, well, I guess she'd be improving, but the disadvantage to that goal is
pretty clear. She's just as discrepant one year from now as she was today. Have I reduced the gap? No. Now, some people might say, "Well, the advantage of that goal is this" -- and as we had more time this morning, we walked our way through this. Almost always, people will say, "Well, that goal is achievable." And I'm going to say that is never the standard for a goal for kids. That's a standard that works for adults. The fact that I can do it, that it works for me, is not the basis for how we write our goals for kids. Okay. So, we want to achieve more than her being successful in grade three material. We want her to be successful in what? How about grade four. The advantage of grade four -- again, we don't have time to have you guys figure out the answer, which is much more fun -- but the advantage of grade four is that the student will actually be reducing the discrepancy. And not a little bit; that would be a lot. The downside to
that would be this, that she's still likely going to need an intensive intervention one year from now. And that's not wrong. That's just something we need to think about in our goal. Okay. We could,
in theory, say we want Jenny to be successful in reading grade five material. What's the advantage to that goal? Significantly reduces the gap, like, really reduces the gap. What's the disadvantage
to that goal? What? I heard the word attainable.
AUDIENCE MEMBER: [inaudible]
MARK SHINN: No, but that's not an acceptable criterion for saying that we would write a goal. There is something else in there. I'm not trying to be mean, okay? There's something else that's really important, and one of the people that was in the audience this morning hit it pretty much spot on. It's going to require a more intensive program and it's going to require more sacrifice. Oh, sacrifice? What do you mean, sacrifice? See, that kid might miss something if we're going to deliver appropriate -- and guess what the law was intended to do. The law was intended to plan the program based on what the kid needed, not what we can deliver. That's not a criterion. The law doesn't say you must identify a program that you can deliver. There's nothing in the law that says that. The law says it's specially designed instruction to meet the student's unique needs. If the student needs it, we're supposed to deliver. Now, as a practitioner, you're, like, "Wait a minute. Don't put that on me. There are variables I can't control." I understand that, right? But we can't let that be the excuse. In a middle school in a high-performing community I was working with, every seventh grade kid with an IEP goal in reading got a 38-minute Special Ed program. Every single one: big discrepancy, 38 minutes, little discrepancy, 38 minutes. I'm just sitting there -- I'm glad Perry's not here, or I'm glad that I'm not, Perry, because I will get to you -- because I wouldn't do that, but guess what? That's not an individualized educational program. The goals -- and here's this fundamental disconnect for probably 90% of the people I work with -- the goals have been disconnected from the program we deliver. We could write these goals regardless of the program. We could deliver this program regardless of the goals, and it's not supposed to be that way. Again, the metaphor I used this morning is think of The Biggest Loser. If you're going to lose, you know, 10
pounds over the course of the year, your intervention doesn't have to be that big a deal. But if you want to lose what that guy lost, a hundred and fifty pounds or something like that, that's going to take a much more intensive intervention and a lot more sacrifice. And parents actually have a right, and teachers have a right, to say what those sacrifices should be. See, what if a parent came in and said, "I want Jenny, my daughter, to be reading grade six material in one year"? That's not attainable, we would tell them. Who are you to say that? Other than being the parent of the child, and as a legitimate member of the team as authorized since the original inception of Special Education law, parents are supposed to have a voice. But we don't want parents to really talk, do we? See, if a parent said that to me, I'd say, "Great. I appreciate that perspective." Now, as a professional, let me describe at least what I would be thinking to be able to do that. I'd be thinking about probably a four-hour-a-day language-arts program, minimum, okay? Plus at least 90 minutes or so at home of guided practice by you or somebody else, and that's at the minimum, and that means we probably have to think about what she doesn't take in sixth grade; it might be this, might be this, and this. And what if the parent says yes? Do we just blow them off? No, we're actually expected -- we
don't have to agree, but we're expected to listen. Now, our IEP processes are hardly ever contentious. If they are, they're contentious about the label, not the services. They should be far more contentious about the services than the label, because if we deliver the programs we can deliver, that's easy. It'll be low level and it'll be built into the kid's schedule, because we don't want to inconvenience the science teacher or the math teacher -- and Special Education is, in part, supposed to be about inconvenience. That's why it's a legal thing, because if it wasn't a legal thing, we would do what we used to do, because we were just as professional before IDEA -- before the variations of IDEA. Do you think people in the 60's or in the early 70's were capricious, mean, wanted to discriminate, wanted to segregate? I don't think so. I don't think we are inherently any better or worse than our predecessors. But somebody determined that these kids needed protection from us.
And the IEP is supposed to do that. Now, the bottom line -- I do want to make sure we've got some time for questions -- is we determine the criterion of acceptable performance that would suggest that we've got a line that reduces the gap. In the survey level assessment, there is no truth. There's nothing to suggest that fourth grade, fifth grade, sixth grade, seventh grade is right or wrong. There is some legal evidence that suggests that one month of growth for one month of intervention probably would not cut it, okay? That's probably not defensible for almost all kids, but any of these options could be argued to reduce the gap, and they are the team's best judgment as to what the intervention would accomplish or could accomplish, got it? But there is no truth. The issue is this: we have to get more comfortable with haggling, with truth in advertising, and with saying, "Look, we need to figure out how we deliver the interventions that kids need," or we'll end up with the [inaudible] extended time, reduced expectations, teach less so we can learn more -- makes no sense. I mean, a bunch of other sorts of things that are out there that give the appearance of intervention
that really don't change the odds for kids, okay? And let me tell you, after working four years in big-city schools and six months as the RTI consultant for Chicago Public Schools, I've lost more battles than I've won. There's absolutely no doubt. I've lost far more battles, and I still have to get up in the morning and try again. I have a club. It's called the Don Quixote Club. Anybody who would like to join -- if anybody goes to the National Association of School Psychologists, NASP, there will be a presentation by the Don Quixote Club on Friday at 1:00 on the Common Core State Standards. So, [inaudible] is a charter member of the, you know, the Don Quixote Club, and Alan Coulter from Louisiana State Health Sciences, and anybody who wants to join, feel free, because we're fighting a battle that requires us to protect the needs of kids, and it's really hard. The system isn't designed to do that. Okay. So, you know, there's tons more stuff here, like the criteria for acceptable performance; you can see all the slides, but let's just, like, take 10, 15 minutes and chat. And I hope you chat, because otherwise I'll just do this: I'll show slides and -- but, you know, at some point you can have brain fatigue, and I'd certainly like you to catch up. So -- and tell me your name, because I really hate to point.
ERIC: Eric.
MARK SHINN: Eric.
ERIC: Yeah. That -- the last part you talked about there, the wrong sense in terms of [inaudible] interventions in reading, you know, theoretic [inaudible]
MARK SHINN: Yeah.
ERIC: I guess the battle I fight as a Special Ed administrator...
MARK SHINN: Uh-huh.
ERIC: ...that makes perfect sense to me, and the resource availability [inaudible] it's the environment that it's found in.
ERIC: But you also have those groups [inaudible] of parents who are looking for inclusion, so what happens when I get [inaudible] program maybe...
MARK SHINN: Yup.
ERIC: ...in my district like that, and I will [inaudible] issue because a lot of those interventions may be done instead in, maybe, a Special Ed classroom.
MARK SHINN: Yup. Maybe.
ERIC: And -- yup. And so, now, my LRE numbers or the percentage of time that a student spends in regular education has dropped drastically so the parent -- on the parents' side -- we're looking at
this for the parent perspective, a lot of times, they want a piece of the inclusion...
ERIC: ...for social reasons, we'll say.
MARK SHINN: Sure. I understand.
ERIC: So, I guess, kind of help me work through that battle.
MARK SHINN: Cool. So, there are at least three or four things in there. First, often we can't solve the problem because we haven't clearly identified the student's need. Is this a student that would benefit most from treatment or support, okay? And I would argue, if we clarify that and say, "Hey, look. You...
ERIC: Let's say treatment.
MARK SHINN: Okay. This kid needs treatment.
ERIC: Yes.
MARK SHINN: Okay. So, if indeed this kid needs treatment -- now we need to set a goal, okay? Because here's the equalizer: if we want to reduce this gap, let's say this kid is an eighth grader, currently
successful in grade four, okay? And we want this student to be reading grade six materials successfully. That's our goal. Now, we can talk about how we're going to do it and where we're going to do it, because you know what? I have seen successful push-in reading support, okay? I could do class-within-a-class. I could also do a pull-out approach, okay? So, my bias is this: we focus on what before we focus on where, okay? Until we focus on the problem, the magnitude of the problem, and what we want to do, where we do it, to me, is important but less so. And here's what we end up doing: the where becomes our overarching driving factor. And I'm going to argue, where's that gotten us, okay? I've seen, you know, full inclusion done well, full inclusion done poorly, class-within-a-class done well, class-within-a-class done poorly. Push-in, good, bad; pull-out, good, bad; co-teaching, good, bad. It's all about what we do, not as much about where we do it. Now, the inclusion numbers, the percentage of time -- I'm going to argue that if you've got data that shows what you're doing is working, that stuff pales by comparison. Now, here's my bias, and my bias is this: if I've got no data on whether what I'm doing is working and my kid's happier, that's fine with me. But until we have a basis for saying whether what we're doing is reducing the gap, it'll always be about how people feel about things.
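To put a number on "data that shows what you're doing is working" for that eighth grader, here is a minimal sketch in Python. It assumes hypothetical words-correct-per-minute levels for the grade four starting point and the grade six goal; in practice the targets would come from the team's own benchmark data.

    # Minimal sketch: what weekly rate of improvement does the goal imply, and is
    # the observed rate at least that large? Level values are hypothetical.

    GRADE4_LEVEL_WCPM = 120    # hypothetical typical end-of-grade-4 performance
    GRADE6_GOAL_WCPM = 150     # hypothetical typical end-of-grade-6 performance

    def needed_rate(current_wcpm, goal_wcpm, weeks):
        """Gain per week required to move from the present level to the goal."""
        return (goal_wcpm - current_wcpm) / weeks

    def on_track(observed_rate_per_week, current_wcpm, goal_wcpm, weeks):
        """True if the observed weekly gain keeps pace with the goal line."""
        return observed_rate_per_week >= needed_rate(current_wcpm, goal_wcpm, weeks)

    # An eighth grader successful at the grade four level, with a grade six goal
    # over a 36-week school year, needs roughly 0.83 WCPM of growth per week.
    print(round(needed_rate(GRADE4_LEVEL_WCPM, GRADE6_GOAL_WCPM, 36), 2))    # -> 0.83
    print(on_track(1.1, GRADE4_LEVEL_WCPM, GRADE6_GOAL_WCPM, 36))            # -> True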
ERIC: And what are your -- I guess one more point -- what are your thoughts? I've put a lot of thought into why the reading teacher or reading specialist, not the Special Educator, isn't delivering the interventions to these kids that are struggling, rather than the learning support teacher, because they should be the content expert.
MARK SHINN: Often and more often until, you know...
ERIC: I don't know, that's the mindset, I guess, is what I'm, like -- how do we shift that?
MARK SHINN: Well, believe me -- so, okay, let's go back just a little bit now, okay? Because right now -- and I didn't ask your level of implementation, middle school, high school, because I want to move through [inaudible] elementary, okay -- middle school, high school implementation of multi-tier, I would argue, lags 10 to 15 years behind elementary, okay? If elementary schools are struggling with improving the quality of the core, guess what? Middle school, high school is still teachers' favorite novels. Are you kidding me? Like, I love novels. I love literature. That should be done in addition to helping kids navigate complex text. And if you haven't seen some of the really good core programs that have embedded strategy instruction, okay, that help all kids improve, then we are remiss, because I would argue that a good sixth grade language arts curriculum should take 50, 60, 70% of the burden of skill development off of remedial services and Special Ed, okay? But how do we get control of that? Until we get a data system that says this kid's not discrepant, lots of kids have problems but here's our bar and this kid's above that bar, we will never tackle Tier 1 and certainly not Tier 2. It'll always be General Ed or Special Ed, and guess what? I mean, for a secondary Special Ed teacher, those personnel, you guys, you know -- do you know that your career lifespan is the shortest of any educational personnel? People tend to burn out faster because we overwork them and ask them to do things that are insurmountable. So, you've identified, like, a whole bunch of really important stuff. The bottom line for me, though, is here are the variables I can control. I can control the IEP -- not what goes into it, but I can control how we write goals and how we monitor progress. And let's say a parent is really an advocate for whatever in the classroom, and the rate of improvement on something really important is flat. We can sit down and say, "Okay. We tried it. We need to revise the IEP. We need a more intensive intervention. And we think we can deliver it like this instead of like that." And that IEP is not only supposed to protect the kid from us but from goofy ideas, okay? Which may come from us but sometimes come from parents. Sorry, you had a question? I hate to point at you.
AUDIENCE: The reason I'm sort of making noises over here, I'm trying to be a little quiet, that's what I do. I am an advocate and it's ironic because sometimes, as I mentioned to some of my peers,
people automatically think that you'd come in there against a school district. I'm...
MARK SHINN: Yeah.
AUDIENCE: No, I'm being very honest. And I'm called by the school district sometimes.
MARK SHINN: Yeah. Me, too.
AUDIENCE: Sometimes, I know, I'm big enough now. No, but I don't -- now, I hear it. Okay. Sorry. But -- and sometimes the county calls me in. Some...
MARK SHINN: Uh-hmm.
AUDIENCE: And that's the thing, I think, sometimes we have to work around. We're here as a team, to do what's great -- what's good for the child.
MARK SHINN: Right.
AUDIENCE: And guess what, if you explain it to the parent just the way it was just explained, why couldn't we try it? And if it didn't work, we'll try something else.
MARK SHINN: Yeah.
AUDIENCE: But that's what I think is the biggest thing -- try, just keep trying. And why do we only give people with disabilities one shot? Okay, that's it.
MARK SHINN: Right. Absolutely.
AUDIENCE: You know, come on, none of us. I don't know anyone...
MARK SHINN: So here's the irony of this. You see, the IEP goal and progress monitoring were supposed to make these decisions clear for us. Nobody can look into a crystal ball and say, "Okay, this kid needs, you know, 68 minutes of language," exclamation mark, with 20 minutes of guided coached reading and leveled readers. Nobody could sit down and say that's a guarantee it'll work for this kid. But what it does say is we have the protection of progress monitoring to say if it doesn't work, the IEP is revised to address that lack of progress.
AUDIENCE: Right.
MARK SHINN: Nobody can look into the crystal ball and predict with certainty. I have my program biases, the things that I like, okay, that I've seen people get results with, but there's no guarantee that that will work with kid A, okay?
AUDIENCE: Exactly, exactly.
MARK SHINN: So here's -- but unless we've got -- so, I told a couple of these fellows the story this morning. I was asked into one of the western suburbs about
six, seven years ago. The Director of Special Ed calls me and says, "We've got a highly litigious district. We've got parents that are constantly bringing people to meetings, lawyers and advocates. And you know what? I go to these meetings and I hear this: parents say the kids aren't benefiting, my teachers say they are. I want to support my teachers, but honestly, Mark, how do I know? There's no data, there are opinions. The parents could be right. The teachers could be right. Can you come and help us write good goals and monitor progress?" Okay? So guess what? I didn't tell you the other part of the story. So I'm doing this training, and while I'm doing this training -- there's, like, 75 people, it's a big district, 75 people -- there's a table at the back of the room and they're not at all looking at me, okay? So, at the break I walked over and said, "Hey, what are you guys doing?" "Well, we're working on our IEPs." It's, like, "Okay. But I'm talking about IEPs up here." "Yeah, but we have to finish these." And I'm looking at them, and there are, like, these pages and pages and pages of all that sort of stuff, right? And so I came back from break and said, "So, apparently you guys are very concerned about writing all these goals and objectives. Why are you doing that?" And they said, "Because our parents are litigious." And I said, "Yes, I know they're litigious. I'm here to make it so they are less litigious." "Well, those goals you're talking about, they're too clear." "What do you mean they're too clear?" "We believe that if we obfuscate" -- big word, nice word -- "we believe that if we obfuscate, then parents won't know what we expect." And I'm sitting there like -- "Do you realize why I'm here? If I had that on tape, that you're writing your goals to confuse parents rather than get them on board, that's a mess, okay?" Now, we made some pretty serious inroads -- fewer goals, clearer objectives -- and guess what we had? More debate. But it's open
debate, to sit down and say, "Look, I don't know the truth. I appreciate your perspective. We can do it this way. We can do it that way." But here's this overarching thing: your child with this performance discrepancy needs an intensive, explicit reading intervention, plus, likely, vocabulary, okay? Plus, likely, extensive wide reading that's coached, plus, likely, reading a lot outside of school in suitably difficult material. That's what we need to deliver. Now, where and how do we do it, okay? That's a very different discussion. And "I believe my child should be in a regular classroom full time" -- why? Because it's for socialization purposes. I agree with that, too, but what's the program that we need to deliver to reduce this gap? So, you know -- I hate to be sort of like an original constitutionalist, but the people who put the original legislation together with this radical concept called the IEP, they kind of had some things that I thought were pretty cool, okay? The notion that no single test would be used -- you know where that came from, right? Pennsylvania, PARC versus -- you know -- okay. That was because school psychologists -- no single person, no single test -- people were putting kids in Special Ed without parent notification based on a cognitive ability score. We fixed that. No single person. People had a right to know that the Special Ed label was stigmatizing. It's not a good label. It carries the risk of loss of freedom of association, freedom from undue stigmatization. So the protection for that is the plan we delivered. Parents and teams could weigh whether this was worth that, and we turned it into a set of mindless short-term objectives that nobody can understand and nobody looks at because, you know, somebody else wrote them and it's just an IEP anyhow. And we've lost the fundamental implicit passion of how do we reduce the gap. That's my take on it. Okay. So, they're talking out there. I'm going to pack up a little
bit, but I certainly want to entertain questions. My shuttle to the airport is at 4:00. My prediction is I never get anywhere anyhow, that I'll probably get to the airport and it'll be, like, you know, tomorrow before I get back home. It's only 483 miles; I could probably drive if I -- no, I don't want to do that. So, wonderful people of Pennsylvania -- I don't say that to be patronizing; I enjoy the conversations I have. I wish I could record them for other people to hear. So, thank you very much.