BCPS painting incomplete picture to justify huge expenditure on personal devices for students

Op-ed: On Tuesday, April 3, the Baltimore County Board of Education will vote on a 7-year, $140 million contract to give each student his or her own mobile device. Baltimore County Public Schools is justifying this huge expenditure by claiming that test scores have risen because of device use.

A closer look tells a different story.

  • The test scores of 10 pilot schools whose students are using devices — some of which are very high-performing and some very low-performing — are being averaged to assert that pilot schools are beating the State average. In fact, a number are performing significantly below the State average.
  • The test scores of pilot schools, also known as “Lighthouse” schools, were compared to hundreds of schools in other counties (not accounting for differences in supports, resources, or programs) instead of comparing them to non-Lighthouse schools in Baltimore County.
  • Principals have been hesitant to attribute the slight test gains directly or solely to the digital learning initiative.
  • Johns Hopkins University evaluators in 2017 deemed the slight gains statistically insignificant.

As reported on March 29 (“No public comment allowed ahead of BCPS’ $140 million tech-ed vote”), the Board of Education has taken the unprecedented step of thwarting the public’s ability to provide input at what many consider its most important meeting of the year – the meeting at which the contract between BCPS and Daly Computers covering the lease and refresh of 133,000 Hewlett-Packard mobile devices for students, teachers, and staff will be voted on.
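
For a rough sense of scale, here is a back-of-the-envelope sketch using only the figures cited above (the $140 million total and 133,000 devices over the 7-year term); actual contract line items such as services, refresh cycles, or insurance are not broken out here:

```python
# Rough per-device cost implied by the contract figures cited in this op-ed.
# Assumes the $140 million covers all 133,000 devices over the 7-year term;
# real contract line items (services, refresh, insurance) are not modeled.
total_cost = 140_000_000  # dollars
devices = 133_000
years = 7

per_device = total_cost / devices   # roughly $1,053 per device over the term
per_year = per_device / years       # roughly $150 per device per year

print(f"~${per_device:,.0f} per device over {years} years; "
      f"~${per_year:,.0f} per device per year")
```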

This contract represents a major step forward for S.T.A.T. (Students and Teachers Accessing Tomorrow), the 1:1 digital learning program implemented under the tenure of former Superintendent Dr. Dallas Dance.

Having been deprived of the opportunity to provide input at the meeting, we, the leaders of the Central Area Education Advisory Council, the PTA Council of Baltimore County, and Advocates for Baltimore County Schools, feel compelled to share insights on the efficacy of S.T.A.T., having monitored its implementation for the past four years.

First, we want to make it clear that we are not anti-tech; we are for the balanced, safe, age-appropriate, and cost-effective use of technology in BCPS classrooms. All three of our groups serve as information clearinghouses — we seek and receive a great deal of community input on a range of issues, and S.T.A.T. is no exception.  We know that parents, teachers, and community members have concerns about this type of “personalized learning.”

Slight gains in test scores can’t be attributed to S.T.A.T.

S.T.A.T. began in 2014 in 10 “Lighthouse” elementary schools: Chase, Church Lane, Edmondson Heights, Halstead, Fort Garrison, Hawthorne, Joppa View, Lansdowne, Mays Chapel, and Rodgers Forge. The program is now in all BCPS elementary and middle schools and three Lighthouse high schools. Next year, all BCPS students will use devices at a 1:1 ratio.

S.T.A.T. is being evaluated by Johns Hopkins University’s Center for Research and Reform in Education (CRRE); its most recent full report was the Year-Three Evaluation published in July 2017. Of note is the statement in the evaluation’s Executive Summary: “Principals and S.T.A.T. teachers … were generally hesitant to attribute the MAP [Measures of Academic Progress] gains directly or solely to S.T.A.T. … there are numerous programs and initiatives in BCPS, which could contribute to improved student achievement independently of S.T.A.T.”

That slight academic gains couldn’t be attributed to S.T.A.T. is quite concerning, given that those gains are being used to justify the expansion of this initiative, the largest of its kind in the nation.

We’ve testified at many Board of Education meetings about the program’s extremely high costs, estimated at roughly $300 million so far, and about its opportunity costs in a school system with many pressing and competing needs: nearly half of its students live in poverty, and many of its schools are in dire need of repair or replacement.

Of perhaps greater concern are comments made by Johns Hopkins’ lead researcher (video here) at the February 2017 meeting of the Board of Education’s Curriculum and Instruction Committee. He noted statistically insignificant gains and decreases in student interaction and student-initiated communication. He observed little inquiry or higher-order thinking, and commented that a great deal more professional development would be needed to integrate technology and create engaging instruction.

Misleading claims of Lighthouse schools outperforming other schools

In numerous presentations, BCPS and S.T.A.T.’s Hopkins evaluators put a positive spin on test results, both MAP and PARCC (Partnership for Assessment of Readiness for College and Careers). Yet MAP scores have never been released to allow the public to compare schools for themselves.

For example, at the September 26, 2017 Board of Education meeting, BCPS presented on student achievement, noting that “Lighthouse Schools outperformed the state as a whole.” But it’s disingenuous to average the scores of 10 schools – some extremely high-performing like Rodgers Forge and Fort Garrison and some very low-performing – then claim Lighthouse schools are beating the state average.

A comparison of Lighthouse schools’ 2017 PARCC scores to the State average clearly demonstrates that a number of Lighthouse schools, when looked at individually, are performing significantly below the State average. (See the comparisons here.)
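
To see why averaging misleads here, consider a sketch with invented proficiency numbers (these are illustrative, not the actual 2017 PARCC results): a few very high-performing schools can lift a 10-school average above the state benchmark even while several individual schools sit well below it.

```python
# Hypothetical proficiency rates for 10 pilot schools; numbers are invented
# to show the arithmetic of averaging, not actual PARCC data.
schools = {
    "A": 82.0, "B": 78.5, "C": 71.0,             # high performers lift the mean
    "D": 55.0, "E": 52.0, "F": 48.0,
    "G": 35.0, "H": 32.5, "I": 30.0, "J": 28.0,  # well below the benchmark
}
state_average = 47.0  # hypothetical state benchmark

group_mean = sum(schools.values()) / len(schools)
below = [name for name, score in schools.items() if score < state_average]

print(f"Pilot-group average: {group_mean:.1f} vs. state average: {state_average}")
print(f"Schools individually below the state average: {below}")
# The group average (51.2) "beats" the state average even though
# 4 of the 10 schools fall below it individually.
```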

Another BCPS analysis shows a troubling trend. By the 2015-16 school year, all third-graders were using devices; Lighthouse school third-graders had been using them since the 2014-15 school year. Despite several years of device use, MAP scores showed that reading proficiency decreased or remained static.

Percentage of Grade 3 students demonstrating on-grade-level reading (based on MAP):

            FY2014   FY2015   FY2016   FY2017
  Fall       57.2     52.6     50.2     48.6
  Winter     56.3     54.3     56.9     56.4
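
Reading the table programmatically makes the trend explicit (percentages as published above):

```python
# Grade 3 on-grade-level reading percentages (MAP), from the table above.
fall   = {"FY2014": 57.2, "FY2015": 52.6, "FY2016": 50.2, "FY2017": 48.6}
winter = {"FY2014": 56.3, "FY2015": 54.3, "FY2016": 56.9, "FY2017": 56.4}

fall_change = fall["FY2017"] - fall["FY2014"]        # -8.6 points
winter_change = winter["FY2017"] - winter["FY2014"]  # +0.1 points

print(f"Fall change, FY2014 to FY2017:   {fall_change:+.1f} points")
print(f"Winter change, FY2014 to FY2017: {winter_change:+.1f} points")
# Fall proficiency fell 8.6 points over the period; winter stayed essentially flat.
```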

Is Hopkins doing the proper analysis?

In its December 2017 evaluation addendum, Hopkins asserted that Lighthouse schools outperformed other school districts and the State in terms of “cumulative growth” on the PARCC test. Hopkins did not consider what supports, resources, or programs were in place in other school districts – there are simply too many variables to make a valid comparison.

Proficiency growth in comparison to other districts was also used in BCPS’s presentation on the HP device Request for Proposals at the March 6, 2018 Board of Education meeting to justify the $140 million slated to be spent: “Lighthouse students showed greater improvements in proficiency than comparison students across the state.”

While there may be increases in proficiency, the claim is misleading, especially because BCPS has made similar but slightly different claims that become conflated. Rather than comparing 10 Lighthouse schools to hundreds of schools in other counties, it would have made more sense to compare Lighthouse schools to BCPS non-Lighthouse schools in the same geographical area and with similar demographics.

The Hopkins evaluation is not a longitudinal cohort study. It opted instead to compare the Grade 3 class of 2014-15 to the Grade 3 class of 2016-17, a completely different cohort of students, which offers an incomplete picture. Wouldn’t it have been more instructive to track the proficiency of a single cohort of students to illustrate the impact of device use on student achievement over time? Hopkins could have compared the performance of students who received devices in Grade 3 in the 2014-15 school year with their performance in Grade 4 in 2015-16 and Grade 5 in 2016-17, and examined how those scores differed from those of Grade 5 students who didn’t have devices.

Using data posted on the Maryland State Department of Education website, we compared PARCC scores of Lighthouse and neighboring non-Lighthouse schools, following a cohort of students starting in Grade 3 in the 2014-15 school year through Grade 5 in 2016-17. The paired schools are comparable in terms of performance.

This cohort approach gives a clearer picture of the long-term impact of the 1:1 initiative on academic achievement.

Lighthouse Grade 3 cohorts had devices in 2014-15; non-Lighthouse Grade 3 cohorts did not. Both cohorts are followed through Grade 5 in 2016-17, by which point Lighthouse Grade 5 students had used devices for three years (since 2014-15), while non-Lighthouse Grade 5 students had only just received theirs in 2016-17.

The Lighthouse schools include schools with a range of proficiency levels starting in 2014-15. The three sets of paired (Lighthouse and non-Lighthouse) schools used as examples started in 2014-15 at nearly identical proficiency levels. One can see from the tables provided that schools without the 1:1 devices are quite capable of improving academic achievement, some more so than Lighthouse schools. A minimal sketch of the paired-cohort comparison follows.
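
In the sketch below, the school names and proficiency numbers are hypothetical stand-ins for the MSDE PARCC data the comparison actually uses.

```python
# Sketch of the paired-cohort comparison described above. Names and numbers
# are invented; the real figures come from the MSDE PARCC data cited earlier.
# Each record follows one school's cohort: Grade 3 proficiency in 2014-15
# and that same cohort's Grade 5 proficiency in 2016-17.
pairs = [
    ({"name": "Lighthouse A", "g3_2015": 45.0, "g5_2017": 48.0},
     {"name": "Neighbor A",   "g3_2015": 44.5, "g5_2017": 51.5}),
    ({"name": "Lighthouse B", "g3_2015": 60.0, "g5_2017": 61.0},
     {"name": "Neighbor B",   "g3_2015": 60.5, "g5_2017": 64.0}),
]

for lighthouse, neighbor in pairs:
    lh_growth = lighthouse["g5_2017"] - lighthouse["g3_2015"]
    nb_growth = neighbor["g5_2017"] - neighbor["g3_2015"]
    print(f"{lighthouse['name']}: {lh_growth:+.1f} pts vs "
          f"{neighbor['name']}: {nb_growth:+.1f} pts")
# Because each pair starts at nearly the same Grade 3 level, the difference
# in growth is a cleaner read on what three years of 1:1 devices added.
```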

It’s clear that S.T.A.T. is simply not having much of an impact.

Money used for S.T.A.T. could be better spent elsewhere

BCPS curriculum experts acknowledged at the September 26, 2017 Board of Education meeting that there’s a strong correlation between poverty and student achievement, yet the hundreds of millions of Baltimore County taxpayer dollars being spent on S.T.A.T. are barely moving the needle for at-risk students.

These dollars could be spent on feeding and mentoring programs, and desperately needed support staff, including social workers, guidance counselors, and pupil personnel workers.  These are the massive opportunity costs of S.T.A.T.

The Board of Education doesn’t have to approve this contract while an audit is being performed and while there is so much distrust in the system, a direct result of the technology brought to BCPS by Dr. Dance. We should make sure devices are in the hands of teachers, who need them to teach, but we should be wary of expanding the program to high schools until we know the high school pilot has worked and that the curriculum is ready.

The Maryland State Department of Education has long recommended a 3:1 device ratio for elementary school students (three students for every one device), and that ratio should be seriously considered in Baltimore County.

This is the same ratio recommended until 2017 by the Maryland Association of Boards of Education (MABE) in its Technology Continuing Resolution.

If performance indicators for low-performing schools (the schools S.T.A.T. was initiated to help) remain static, we may all regret that $140 million was spent on devices when it could have been spent on what we know really works: smaller class sizes and classroom and student supports, which allow strong personal relationships to grow between teachers and students.

-Aimee Freeman, Chairperson, Central Area Education Advisory Council
-Jayne Lee, President, PTA Council of Baltimore County
-Leslie Weber, Co-founder, Advocates for Baltimore County Schools

3 Comments

Steve A McIntire
April 9, 2018 5:36 pm

Thank you for this summary. It is important to have independent, unbiased eyes on the studies demonstrating benefits.

Lorrayne Nelson
April 3, 2018 1:33 pm

As a school counselor in BCPS, I think the devices are a good idea. What may help increase standardized test scores is to actually teach the students how to type; I learned in high school. First graders have devices and peck at the keys just as middle school students do. Many of the tests are timed, and I think students would benefit from knowing the home keys as well as the entire keyboard. Students are tested too much anyway: MAP fall and winter, PARCC, MISA, benchmark testing, etc. Just an observation I’ve made.

JCSimpson
April 2, 2018 8:29 pm

A thoughtful and thorough exploration of the realities behind recent misleading BCPS claims of student score improvements. Apparently cherry-picked score citations are being used to justify, rather than properly weigh the pros and cons of, the laptop-per-student program (and another massive $140 million contract for HP devices) brought to the district under suspect circumstances by former superintendent Dallas Dance. Dance, cited for ethics violations and now found guilty of perjury, did numerous promotional videos, conference panels, and testimonials for edtech companies and industry groups, amid a plethora of apparent conflicts of interest: HP, the ScholarChip student ID program, DreamBox, iReady/Curriculum Associates, Discovery Education, and Middlebury Interactive Languages (the Spanish-language software), to name a few.

Why would current school district leadership not take a fresh look instead of pushing through this overpriced and problematic version of tech integration in schools? The new round of laptops alone costs more than double the industry-standard, fiscally responsible tech spending seen in many other school districts with strong digital programs. See https://statusbcps.wordpress.com/2018/03/28/proposed-county-schools-laptop-contract-pricing-more-than-double-recommended-one-to-one-cost-report-shows/

Particularly concerning are current BCPS claims that inappropriately cite selective Measures of Academic Progress (MAP) scores to claim “clear and absolutely unequivocal” improvements. MAP is an internal classroom tool provided by a company, NWEA, that has a multi-million-dollar contract with BCPS. These student outcomes are neither standardized nor objective. NWEA’s nearly $4 million contract is also up for renewal or expansion this summer: RGA-125-14 MEASURES FOR ACADEMIC PROGRESS 07/31/2018 Services/Supplies Northwest Evaluation Association. Efforts to confirm such interpretations of MAP have…
