“This is a major decision for SPS. There are substantial up-front and on-going costs [associated with the MAP® test].” – Jessica DeBarros, Broad Resident, and Brad Bernatek, Director of Research, Evaluation & Assessment for Seattle Public Schools (SPS) and former Broad Resident, memo, April 20, 2009
The Measures of Academic Progress (MAP®) test is an element of former Superintendent Goodloe-Johnson’s “Strategic Plan” that our school district should reconsider.
A growing chorus of parents is souring on the test, opting out and opposing it. Last night, the Seattle Education Association (SEA) teachers’ union voted in favor of eliminating the MAP® test.
Our district faces a budget shortfall. The MAP® test has proven to be costly in ways beyond the initial subscription fee. Discontinuing the MAP® is one obvious way to save money and restore valuable class time, resources and learning to the children and teachers of Seattle Public Schools.
A subscription to the MAP® test was purchased by the Seattle School District from the Northwest Evaluation Association (NWEA) in 2009. (Then-Superintendent Goodloe-Johnson sat on the board of directors of the NWEA at that time and failed to disclose this fact. She was later cited for this breach of ethics by the state auditor. See below for more details.)
MAP® was used in some schools in the fall of 2008-09, and had a complete rollout in 2009-10. Every K through 9th grade student is now being administered the test three times a year (unless they opt out). [UPDATE: The pilot and rollout dates have been corrected from the original post. --sp.]
It is an “adaptive” test, so it adjusts its level of difficulty according to how the child answers each question. Because it is computerized, schools need dedicated computers and lab space to administer it, unlike paper tests. It generates a lengthy report that teachers and administrators must be taught to interpret.
It is being administered to children as young as 5 in kindergarten, some of whom are unable to read, so the test is read aloud through headphones to the youngest children.
It only tests two subjects: Math and English.
The stated purpose of the test was to offer teachers a tool to gauge the academic levels of their students and to adjust and differentiate their teaching accordingly. That in itself is not objectionable. But that is not exactly how the MAP® has been used. It has proved to be a significant drain of time and resources, and of questionable usefulness.
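The “adaptive” mechanism described above can be sketched in a few lines of code. This is a purely illustrative model with a made-up difficulty scale and step size; it is not NWEA’s actual algorithm or RIT scale:

```python
# Illustrative model of an adaptive test: difficulty moves up after a
# correct answer and down after a wrong one. The scale and step size
# here are invented for illustration, not NWEA's actual algorithm.

def next_difficulty(current, answered_correctly, step=10):
    """Return the difficulty level of the next question."""
    return current + step if answered_correctly else current - step

difficulty = 200  # hypothetical starting level
for correct in [True, True, False, True]:
    difficulty = next_difficulty(difficulty, correct)
print(difficulty)  # 220
```

In a real adaptive test the adjustments typically shrink as the score estimate converges, but the basic feedback loop is the same: each answer determines the difficulty of the next question.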
Here are 15 reasons why the Seattle School District should discontinue the MAP®:
1. Lost class time. MAP® = lost opportunity for learning. Schools report spending as much as nine weeks to three months of the year administering, analyzing and interpreting the MAP® test for hundreds of students. Seattle schoolchildren spend only 6.5 hours a day in school, and the average American public school year is 180 days, shorter than in many other nations. Class time is therefore precious, and the thrice-yearly MAP® steals valuable time away from actual learning.
2. Too costly. MAP® = an unfunded mandate. The initial subscription to the test cost $370,000. But the district has spent much more since then in implementation costs. A portion of the $7.2 million Gates Foundation grant to SPS in 2009 went toward MAP®. Another $4.3 million of the February 2010 school levy was also earmarked for MAP®. Some believe that the proposed $2 million network capacity upgrade currently before the school board is also associated with the test. By some measures, MAP® has cost our school district as much as $10 million.
[UPDATE: The yearly subscription/licensing cost for MAP® was estimated at $500,000 per year, according to SPS staffer Brad Bernatek and then-Broad Resident Jessica DeBarros in an April 2009 report.]
Also, most of the financial and logistical burden of administering and proctoring the test falls to our schools, which must give up precious space and staff time. Schools are effectively paying their librarians or other staffers to set up and proctor the tests. The cost of implementing the test, in terms of administrator pay and purchasing computers, is beyond the district’s means. This makes MAP® essentially an unfunded mandate.
3. MAP® creates unequal access and inequity. The MAP® requires designated computers and rooms to administer. Not all schools have computer labs, especially elementary schools, older buildings, or those at full capacity with no spare space. Schools without computer labs are forced to use their libraries or cafeterias to administer the test. Consequently, libraries at a number of schools are off limits to students for normal use for as much as three months of the school year because of the MAP®.
[UPDATE: As many as 40 percent of Seattle's public schools lose their libraries to MAP® testing for as much as three months of every school year, according to SPS's Jessica DeBarros.]
[UPDATE: Impacted schools even include Garfield High School, which many consider the district's top high school. Students there also lose access to their library three times a year because of MAP®.]
Therefore, MAP® effectively curtails access to normal school facilities and learning opportunities to many of the district’s children, but not all, creating an inequitable situation.
4. The test is currently being misused by SPS to evaluate teachers. The MAP® test was not designed to evaluate teachers; even NWEA itself warns that districts should not use it that way. Yet, in part because of the recent teachers’ contract and the district’s attempt to experiment with the national ed-reform trend of “merit pay,” teachers are being told that their students’ MAP® test scores will be interpreted as a reflection of their teaching. Teachers feel pressured to somehow show gains in their students’ MAP® scores, even though the MAP® does not align with what they are teaching. This is leading to teaching to the test. (See #7 below.)
So the MAP® is being used incorrectly despite having been sold to the community for another purpose – to help teachers understand the academic needs of their students. This has effectively made MAP® a high-stakes test in the Seattle School District. High-stakes testing has been repeatedly discredited by research and by experts like Diane Ravitch and Yong Zhao, and is shunned by some of the academically highest-performing nations, such as Finland. This is not a best practice. It is a bad practice.
5. It is excessive. The MAP® test is being administered to Seattle schoolchildren in kindergarten through 9th grade three times a year. The test takes an hour or more to complete, and hours to administer and process. For students in 4th grade and beyond, who must also take the annual state MSP test (the WASL replacement), this adds up to four high-pressure standardized tests a year. That is too much testing.
6. The MAP® test does not correspond to SPS curriculum. It is not aligned to our district’s curriculum, making it a poor fit and calling its relevance into question.
7. MAP® is narrowing the curriculum and leading to test-prep instead of teaching. Because teachers are feeling pressured to raise test scores, some are teaching to the MAP® test. This dilutes and distorts the curriculum for our children.
8. MAP® is inappropriate and unreliable for K-2. MAP®’s two main administrators for SPS, Brad Bernatek and Jessica DeBarros, told this to a group of parents in 2010, and said that other districts do not administer the test to these grades for that reason. (Indeed it seems excessive for such young children to be tested so rigorously three times a year.) Why is SPS the outlier, forcing its youngest students to take this test? Why should a kindergartener’s first library experience consist of a computerized test instead of the opportunity to check out a book?
9. MAP® is inappropriate for English Language Learners.
10. MAP® is of limited use for accelerated or advanced students (APP and Spectrum) because they very quickly hit the ceiling on the test. (It is also difficult to measure significant growth in these children’s levels, and it seems unreasonable to expect teachers to raise already-high scores even higher, or to penalize a teacher if a child’s score slips, for example, from the 99th to the 98th percentile. Yet this is happening elsewhere in the country.)
11. MAP® is not necessary. Many teachers are not finding MAP® that useful, and a growing number of parents oppose it as well. There are alternatives, like the Developmental Reading Assessment (DRA) and other less costly and less time-consuming assessments, that teachers can administer to gauge where their students stand academically at the beginning of the year. Good teachers already differentiate their lessons for their students wherever possible. The best teachers don’t need a standardized, computerized test to tell them whether their students are progressing or need help.
12. MAP® is not accurate. There have been reports of fluctuations in test scores from one session to the next, even within the scores of an individual child. The winter test of the 2009-10 school year reportedly trended so low district-wide as to be unusable; this was attributed to a post-vacation slump, since the test was given in January right after the two-week holiday break. If scores can be influenced by outside factors such as vacations, how can the data be accurate or useful? (And how can they fairly be used to evaluate a teacher?) This year, the district administered the test in October, then just two to three months later in December/January, and will administer it again four to six months later in May/June. That is inconsistent spacing of testing windows, and it seems irrational to expect significant differences in scores in the two months between the fall and winter tests.
[UPDATE: The MAP® test “recalibration” fiasco of Jan. 2012 – more evidence of the unreliability of MAP®
In January 2012, Seattle Public Schools families noticed that their children's MAP® test scores had disappeared from the district’s database of student information. Then they reappeared, and some scores had changed from previous iterations, some dropping by as much as 20 points. Parents jammed the site anxiously searching for answers, and crashed it.
SPS district staff offered families this rationale:
February 1, 2012
Dear Seattle Public Schools families,
As you may know, Seattle Public Schools uses a computer-based test called Measures of Academic Progress (MAP) to provide schools and families with information about student achievement. Schools administer MAP tests in math and reading two or three times a year to many of our students. (The fall MAP test is optional.) Teachers and principals use information from the MAP tests to monitor students’ academic progress and to design their instruction to help every student succeed. You have likely received your student’s MAP scores and percentile in the mail. These scores are also available online, via The Source, the online resource for families and teachers at http://source.seattleschools.org
As part of its ongoing efforts to follow best testing practices, the MAP test’s vendor, Northwest Evaluation Association (NWEA), recently recalibrated the percentile results associated with our students’ MAP scores, to better measure how Seattle students perform relative to students across the nation. This practice of recalibrating percentiles is common in standardized testing, and happens every several years.
The recalibrated percentile rank scores provide a more accurate snapshot of your student’s performance compared with other test-takers, since they are more representative of the national school-age population.
In order for you to be able to see your student’s academic growth over time more accurately, Seattle Public Schools has updated your student’s percentile scores for the past three years to reflect the recalibrated percentiles. You can find the new MAP scores online at The Source at http://source.seattleschools.org. You may notice only a slight change or a very significant change, depending on how your child scored compared to the larger national average of students taking MAP tests.
A student’s MAP results are reported using both percentiles and a RIT score. The RIT score shows what students are ready to learn rather than what they have already mastered, and is used to show a student’s current achievement on a scale that is independent of grade level. This percentile update does not change your child’s RIT score.
The recalibrated percentiles do not change eligibility for advanced learning (Spectrum or APP) students. Original scores from the Spring 2011 MAP test will still be used for eligibility for advanced learning for 2012-13. Next year, eligibility will be based on the recalibrated percentiles. If you have any questions or concerns about the updated percentiles or the MAP test, please contact firstname.lastname@example.org.
Executive Director of Research, Evaluation, Assessment & Development
Seattle Public Schools
How can the district use a test based on a quicksand of shifting data points to measure student achievement or determine student eligibility for certain programs? How is it fair or accurate for the district to use constantly shifting test scores to measure, penalize or reward teachers?
The MAP® test recalibration confusion illustrated once again that the MAP® is an unreliable tool and should not be used for any high-stakes decisions or evaluations.]
13. The manner in which the MAP® test product was selected and purchased is highly questionable. Seattle Schools Superintendent Goodloe-Johnson was on the board of directors of the vendor, NWEA, at the time the Seattle School District purchased the MAP® product, and failed to disclose this to the board or publicly, as required. In 2010, the state auditor cited this as an ethics violation/conflict of interest, and Superintendent Goodloe-Johnson was forced to step down from the board of directors of NWEA.
Also, no bids from other vendors were sought. So it was a no-bid contract. A rationale for this lack of competitive bidding can be found here, from the former SPS employee once responsible for MAP® administration, Brad Bernatek.
The internal review of possible test products that led to the selection of MAP® was conducted by Jessica DeBarros of the Broad Foundation (who was subsequently hired by the district, at a salary of $90,000, to oversee the MAP® test). Superintendent Goodloe-Johnson was also affiliated with the Broad Foundation. She used the MAP® in her previous position as school superintendent in Charleston, South Carolina, so she may have favored the product already.
These factors have called into question the process and the motivations behind the purchase of MAP®, creating a level of mistrust associated with the test.
14. The test is being administered in an inconsistent and unscientific manner. In some schools, librarians are obliged to administer and proctor the tests; in others, teaching staff are in charge; in still others, parents proctor the tests. How can this lead to consistent, accurate or scientific results? Is it fair to mandate a test that relies on parent volunteers?
15. The test can be gamed. Students can figure out, and have figured out, that wrong answers lead to easier questions and vice versa. This can lead to inaccurate and arguably useless results.
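The gaming concern can be made concrete with a toy simulation, again using an invented difficulty scale and step size rather than NWEA’s actual scoring:

```python
# Toy simulation: two students of equal ability, one answering honestly
# and one deliberately answering wrong to sink the test's difficulty.
# The scale and step size are invented for illustration only.

def final_level(responses, start=200, step=10):
    """Walk the adaptive ladder and return the final difficulty level."""
    level = start
    for correct in responses:
        level += step if correct else -step
    return level

honest = final_level([True] * 6)   # climbs toward harder questions
gamed = final_level([False] * 6)   # deliberate misses drive it down
print(honest, gamed)  # 260 140
```

A string of deliberate wrong answers leaves the student facing easy questions and a final level far below their real ability, which is exactly the distortion described above.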
ADDENDUM: Troubling anecdotal evidence. A second-grade child in the district’s accelerated/gifted program (APP) was reportedly asked about the significance of the rose in The Scarlet Letter (clearly not an appropriate book for second-graders, even gifted 7-year-olds!). This is just one example of the MAP® asking inappropriate questions. A class of kindergarteners, unfamiliar with computerized tests in their first weeks of school at age 5, literally placed their mouse on the computer screen to follow directions that said “Put the mouse on your name.” The test is designed to give children questions that are at times too hard for them, which can lead to frustration and confusion. “Why am I being asked about something I’ve never studied?” my own child once asked me. There have also been reports of students in tears over the test. It is one thing to test one’s knowledge for the fun of the challenge, but to place a child before a computer and ask them to answer questions they are not expected to know seems bizarre at best, and almost cruel.
In short, MAP® has proven to be a costly and stressful misuse of precious resources. In times of financial scarcity, we need to ask: Is this the best use of our district’s limited resources? And, more importantly: Is this the best use of our children’s time?