Focus On Results

New law recalibrates the measuring stick for reporting on employment and training performance.

Today, as never before, public education, job training and welfare systems must demonstrate their effectiveness or face the budget ax. This past summer, Congress sent a wake-up call to some federally funded job-training programs across the nation when it passed a bill requiring them to report additional information on how participants fare -- a bill the president swiftly signed into law.

Now, community colleges and other Job Training Partnership Act (JTPA) providers must tell the feds not only how many people completed their programs and whether they obtained paying jobs, but also how many still had their jobs six months and a year later. Accountability advocates are pushing for even more outcome information -- and not just from JTPA providers but from all publicly funded education and training programs.

Policy-makers, program managers and these systems' customers need to know what works and what doesn't. Charting the real progress of students and clients -- not just graduation rates or certificates earned, but job placements, earnings, unemployment insurance claims and more -- is an enormous challenge.

In Florida and Texas, evolving labor-market information systems offer glimpses of what a performance-based 21st-century workforce development system might look like. Their goals are ambitious:

* Boosting job seekers' employment and employers' economic returns;

* Helping educators, trainers and caseworkers improve performance; and

* Strengthening public officials' capacity to make timely, well-informed investment decisions.

Measuring Performance

Traditionally, Florida, Texas and most other states have contacted former education and training participants through phone or mail surveys to study whether these programs improved participants' chances of getting and keeping a job. This information was used to describe program impact to various authorities. As often as not, the information sat in a report on a program officer's shelf.

But the growing web of computer technology available to decision-makers, program managers, caseworkers and even employment services recipients has enabled the first steps toward fully automated client follow-up. Former participants' employment and training records are being linked to employers' and others' databases containing information about their subsequent achievements.

Grading Community and Technical Colleges

Texas started with a seemingly simple goal: Make education and job-training performance data more readily available to customers -- dislocated workers, training participants, welfare recipients and other job seekers.

The state's answer is an evolving consumer report delivery system that will cover education and training available in communities across Texas. Available on CD-ROM since March, this report card on service providers is being phased into the operations of the state's 25 one-stop labor-market information centers.

Currently, the system reports only on public community and technical colleges, which operate as primary service providers to many state and federally funded training programs. At each center, a computer-savvy staffer helps job seekers mine the consumer report for the data they need to make informed decisions about the best services available to them. With the click of a mouse, clients can glean everything from the employment and earnings of recent community college program graduates to the relevance of their training experiences to the jobs they obtained. The state is still in the early stages of implementation, but the impact is already being felt at the program level.

"The system clearly has had a catalytic effect," said Marc Anderberg, direc- tor of follow-up for the Texas State Occupational Information Coordinating Committee. The first signs: a flurry of curriculum reviews and increased communication among deans and job-training program managers about troubled departments and programs. Surprisingly, Texas' largest proprietary schools, initially reluctant to put data on the state's consumer report system, are now clamoring to join.

"They now recognize it as a recruiting tool for their programs," Anderberg said.

Painting an Even Larger Picture

The Florida Education and Training Placement Information Program (FETPIP) captures follow-up data on former participants in an even broader array of education, welfare, job-training and other programs.

"What stands out about this program is its scope," said Jay Pfeiffer, director of the Florida Department of Education's workforce education and outcome information services. "It deals with virtually every public education, training and job placement activity that exists in Florida, and many private ones."

The system collects follow-up data on participants' employment and education experiences. It also captures data on participants' subsequent military enlistment, incarceration or use of public assistance. The most recent round of data collection gathered 2.3 million participant records.

Since its inception in the 1980s, the Florida system's primary purpose has been program improvement. Toward this end, the system helps policy-makers make better program and resource decisions, including termination of programs that don't work, by strengthening strategic planning and budgeting.

Education and training program participants benefit directly from the system, too. "We have long tried to integrate FETPIP-based data into career guidance materials," Pfeiffer said. FETPIP data is now being delivered to students and clients at schools, one-stops and employment security and JTPA offices.

The system's crucial role in the state's education and training systems is illustrated by its role in welfare reform. Right now, Mike Switzer, a vice president with Enterprise Florida's Jobs and Education Partnership, is using FETPIP information to gauge the impact of the first year of welfare reform on job placement and retention, earnings, further education and continued reliance on food stamps and other forms of support. He has already presented preliminary findings to the Florida Senate.

Switzer has found that welfare rolls are down, as are benefit payouts. But not everyone is getting a job, and not all of those who do are becoming self-sufficient. The statewide database is also helping policy-makers pinpoint "epicenters of difficulty" and make mid-course corrections rather than waiting years for traditional academic evaluations, said Switzer. "Many of our north Florida rural areas and Miami are in bad shape," he said. "But other large and midsize cities are doing well."

Based on the preliminary FETPIP data, the state Legislature enacted a $25 million job-creation fund, streamlined business expansion incentives and increased support for job seekers who move to Orlando, Sarasota and other job-rich regions.

"Everyone wants accountability, but then they base measurement systems on data that don't exist," said Switzer. He says the FETPIP system is changing that. "Legislators and staff have beenable to design the measurement-and-reward system in a way that make it feasible and affordable."

What Comes Next?

Florida is beginning to report quarterly as well as annually on unemployment insurance claims. System managers are working closely with their counterparts in Texas to boost the availability of information to students and clients. Policy-makers continue to find new applications for the data gathered. In one application, employment data is used to profile the firms doing the most hiring. And the system is enabling new analyses, such as a longitudinal study of the employment outcomes and earnings of students who graduated from every level of the state's educational system in 1991.

As Texas expands its system, it will be adding more information on other education and workforce development programs. It will build longitudinal databases ranging from earnings gains over time to employment retention. Another goal is an Internet-based self-service system that users will be able to access from their homes or local libraries next year.

Texas and Florida are not alone. Georgia, Minnesota, Nevada, North Carolina, North Dakota and South Dakota are developing systems that link records of education and job-training program participants with unemployment insurance wage records. North Carolina and North Dakota, like Florida and Texas, also conduct employer surveys whose data is incorporated into their automated systems. And Oregon is crafting an interagency, intergovernmental shared information system that would underpin its statewide performance "benchmarks" process.

Texas has made its consumer report system available to all other states in the form of software and source code. Buttressed with online help screens and technical reference information, the package should help states develop their own approaches.

Lessons Learned

Florida and Texas automated-follow-up pioneers Jay Pfeiffer and Marc Anderberg offer several tips for other states' system developers -- many more of which will be included in their Field Guide to Automated Follow-Up.

From the get-go, developers must be extremely sensitive to confidentiality issues, both for students and employers. Other implementation barriers include fears of administrative burdens or limitations of local program autonomy, and concerns about possible sanctions for low performance scores.

Early, careful planning by central staff for professional development and technical assistance to practitioners, designed with their help, can head off problems. For example, a standardized train-the-trainer workshop or seminar curriculum explaining automated and centralized follow-up, fortified with standardized handout materials and visual aids, can help ensure that everyone is on the same page.

Borrow Early and Often

States can save time and money by borrowing and modifying their automation tools rather than writing their own. Still, the initial application software development will require some funding up front, and a staff resource commitment commensurate with the magnitude of the initial effort.

The initial task will be to link records by matching Social Security numbers across files and to append outcome data from the target databases to the appropriate seed records. Much of the data analysis and report generation can be automated, even in the early stages, as the sketch below suggests.
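
The mechanics of that linkage step are simpler than they sound. Here is a minimal sketch, assuming each agency supplies plain delimited files keyed by Social Security number; the file names and field names are hypothetical illustrations, not the actual Texas or Florida layouts.

```python
import csv

def load_by_ssn(path, ssn_field="ssn"):
    """Index one outcome file by Social Security number."""
    with open(path, newline="") as f:
        return {row[ssn_field]: row for row in csv.DictReader(f)}

def append_outcomes(seed_path, outcome_paths, out_path):
    """Attach outcome fields from each target database to the matching seed records."""
    outcome_tables = [load_by_ssn(p) for p in outcome_paths]
    with open(seed_path, newline="") as f:
        seeds = list(csv.DictReader(f))
    for seed in seeds:
        for table in outcome_tables:
            # Append matching outcome fields; seeds with no match keep blank outcomes.
            seed.update(table.get(seed["ssn"], {}))
    fieldnames = sorted({key for record in seeds for key in record})
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(seeds)

# Hypothetical usage: link training completers to UI wage and college enrollment records.
append_outcomes("completers.csv", ["ui_wages.csv", "enrollments.csv"], "linked_outcomes.csv")
```

Once the linked file exists, tabulating placement rates, median earnings or continued benefit receipt is routine batch work, which is why much of the reporting can be automated early.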

Online and archival storage needs will be determined by the number of programs and participants, as well as by the amount and frequency of information collected.

Software requirements also will vary, ranging from word processing and desktop publishing systems to database management and statistical packages, as well as network communications drivers.

What Works for Texas and Florida?

Texas consumer report system managers started working in Microsoft's Visual FoxPro, using standard DBF files. When the state moves its consumer report from local area networks to the Internet next year, it will assess the costs and benefits of moving toward ORACLE on a UNIX platform, as the U.S. Department of Labor has done.

FETPIP accepts input files containing student and former program participant records -- 2.3 million in the most recent collection cycle -- in a broad array of media and software packages. "We need Social Security numbers, names, birth dates and the like delimited in certain ways, but the rest of it is pretty much up to the supplying agency, consistent with how they do business," said Florida's Pfeiffer. Personal-computer-based files are generally processed in Dbase, then assembled into an integrated master file on IBM mainframes in COBOL language.
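
As a rough illustration of that intake step, here is a sketch of mapping differently delimited agency files onto a common master-file layout. It uses a modern scripting language rather than the Dbase and COBOL tools the states actually relied on, and the agency file names and column names are hypothetical.

```python
import csv

# Hypothetical layouts: each supplying agency names and delimits its
# identifying columns differently; only the core fields must be present.
AGENCY_LAYOUTS = {
    "community_colleges.txt": {"delimiter": "|", "ssn": "SSN", "name": "STUDENT_NAME", "dob": "BIRTH_DT"},
    "jtpa_exits.csv": {"delimiter": ",", "ssn": "ssn", "name": "participant", "dob": "dob"},
}

def to_master_records(path, layout):
    """Map one agency file onto the standard master-file fields."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter=layout["delimiter"]):
            yield {
                "ssn": row[layout["ssn"]].strip(),
                "name": row[layout["name"]].strip(),
                "birth_date": row[layout["dob"]].strip(),
                "source_file": path,
            }

# Assemble every agency's submissions into one integrated master file.
with open("master_file.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["ssn", "name", "birth_date", "source_file"])
    writer.writeheader()
    for path, layout in AGENCY_LAYOUTS.items():
        writer.writerows(to_master_records(path, layout))
```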


Anthony P. Carnevale, vice president for public leadership at Educational Testing Service, chaired the National Commission for Employment Policy during President Clinton's first term. Carnevale most recently co-authored "Education for What? The New Office Economy" with colleague Stephen J. Rose. Neal C. Johnson, senior research partner at Educational Testing Service, was founding executive editor of "The Public Innovator" newsletter for the Alliance for Redesigning Government. *
