Case Studies Underscore the Importance of Using Evidence to Develop Practice and Policy

Posted July 15, 2019
By the Annie E. Casey Foundation

Three new case studies funded by the Annie E. Casey and William T. Grant foundations show how evidence can be used to improve program delivery, and why it should be used to develop good policy.

The case studies, from the Forum for Youth Investment, offer policymakers a roadmap for applying evidence to make improvements for children, families and communities. The three efforts profiled are:

  1. ServeMinnesota, a statewide administrator for federal AmeriCorps funding, which used evidence to improve its Reading Corps program. When research indicated that some students were backtracking in their literacy gains after leaving the classroom, the program added a weekly check-in period. Students who participated in the check-in were more likely to retain skills.
  2. New York City's Young Adult Literacy Program, which built examination of evidence into its system for continuous improvement. One example: When an evaluation suggested that the education and workforce program was struggling to retain students, it added an internship program that improved attendance and extended how long participants stayed enrolled.
  3. The federal Year-Round Pell Grants program, which provides financial aid for higher education to low-income students. This program was created, discontinued and reinstated over the past decade in response to limited evidence of effectiveness. The case study concludes that earlier evaluations would have led to better and more consistent decisions about the program.

Effective use of evidence is not a linear process, but it's critical to achieving results in the long term, says Ilene Berman, a senior associate with Casey's Evidence-Based Practice Group. "These case studies illustrate how carefully considering data, asking and testing new questions and viewing evidence as a tool in an improvement process can contribute to better outcomes," she says.

The new studies build on previous recommendations included in the Forum's 2018 guide, entitled Managing for Success: Strengthening the Federal Infrastructure for Evidence-Based Policymaking.

Evaluating a program isn't enough, according to Thaddeus Ferber and Alex Sileo, who authored the case studies. Defining what an evaluation should assess — and what it considers success — is also important. "Policymakers would be better situated to improve the Pell Grant program if they had more robust evidence about what contexts the program succeeds in, which populations the program is most effective for, and what type of implementation is needed in order to meet these outcomes," they write.

Read more about strengthening the federal infrastructure for evidence-based policymaking.
