
Summary

To support future employment, the Human Capital Development (HCD) program implemented in Riverside, CA, focused on providing education and training to single parents who were Aid to Families with Dependent Children (AFDC) recipients. This evaluation directly compared HCD with a separate intervention, the Labor Force Attachment (LFA) program, to better understand which of the two might be more effective; HCD's distinctive features were its adult basic education courses and vocational training programs.

The HCD program implemented in Riverside, CA, emphasized that participants should spend time receiving education or training to prepare for good jobs. If participants did not have a high school diploma or General Educational Development (GED) certificate, the program provided basic education classes in the public school system to help participants make progress toward their goals (such as increasing their literacy level). Case managers were accountable for the employment and education outcomes of their clients and therefore encouraged success and emphasized and enforced program participation. Staff could impose financial sanctions (by reducing welfare grant amounts) if clients did not participate in required activities. The program also offered support with child care and transportation costs. Riverside's HCD program expected that most clients would complete training or educational activities within two years but would approve longer durations based on participant needs.

Eligible participants included single parents who received AFDC and who were required to enroll in the Job Opportunities and Basic Skills (JOBS) program. However, AFDC recipients were exempt from JOBS if they had children younger than 3, were employed 30 or more hours per week, were medically unable to work, or were in the last trimester of pregnancy.

The effectiveness of HCD compared with LFA indicates the effect of being referred to the set of services unique to HCD, or how much better the offer of HCD meets participants' needs than the offer of LFA. HCD focused on providing education and training as a precursor to employment, whereas LFA focused on placing people into jobs quickly to build work habits and skills. Riverside's HCD and LFA programs were examined as part of the National Evaluation of Welfare-to-Work Strategies, which also evaluated HCD and LFA programs in Atlanta, GA, and Grand Rapids, MI. The demonstration additionally evaluated programs in Portland, OR; Detroit, MI; Oklahoma City, OK; and two programs in Columbus, OH (Columbus Integrated and Columbus Traditional).

Effectiveness rating and effect by outcome domain


Outcome domain | Term | Effectiveness rating | Effect in 2018 dollars and percentages | Effect in standard deviations | Sample size
Increase earnings | Short-term | No evidence to assess support | | |
Increase earnings | Long-term | Little evidence to assess support (unfavorable) | -$879 per year | -0.042 | 3,182
Increase earnings | Very long-term | No evidence to assess support | | |
Increase employment | Short-term | No evidence to assess support | | |
Increase employment | Long-term | No evidence to assess support | | |
Increase employment | Very long-term | No evidence to assess support | | |
Decrease benefit receipt | Short-term | No evidence to assess support | | |
Decrease benefit receipt | Long-term | Little evidence to assess support (favorable) | -$6 per year | -0.002 | 3,182
Decrease benefit receipt | Very long-term | No evidence to assess support | | |
Increase education and training | All measurement periods | No evidence to assess support | | |

Studies of this intervention

Study quality rating | Study counts per rating
High | 1

Implementation details

Characteristics of research participants
Black or African American: 17%
White: 49%
Asian: 3%
American Indian or Alaska Native: 1%
Unknown, not reported, or other: 1%
Hispanic or Latino of any race: 30%

The Pathways Clearinghouse refers to interventions by the names used in study reports or manuscripts. Some intervention names may use language that is not consistent with our style guide, preferences, or the terminology we use to describe populations.