
APPENDIX

MATERIAL SUBMITTED FOR THE HEARING

TESTIMONY BY JOHN MENDELOFF
FOR THE HOUSE GOVERNMENT OPERATIONS SUBCOMMITTEE ON MANPOWER AND HOUSING, JUNE 20, 1984

I am pleased to be asked to testify before the Subcommittee about occupational safety and health information systems. Much of my work has focused on how better information could allow OSHA to make more intelligent regulatory choices. I have conducted evaluations of OSHA's effectiveness in preventing injuries, worked on redesigning its inspection program to improve its cost effectiveness, and recently have been examining the potential uses of the data on OSHA's health inspections, which reside in the agency's computerized Management Information System (MIS). I have published Regulating Safety: An Economic and Political Analysis of Occupational Safety and Health Policy and several articles about the topics described above. I have been supported in this research by OSHA, NIOSH, the Department of Labor, the Office of Technology Assessment, and the Federal Interagency Task Force on Workplace Safety and Health. I now teach at the University of California at San Diego.

I will divide my testimony into sections on safety and health. My focus will be on ideas that could provide more useful information.

Safety

The Bureau of Labor Statistics survey of occupational injuries and illnesses was established to identify differences in injury rates among [...] thus the total recordable rate) are much less reliable because of employer uncertainty about where the line between recordable cases and "first aid" cases is to be drawn.


Unfortunately, the data from the survey have very limited value for policymaking purposes. They provide almost no insight into causal factors in accidents, in particular into the role of OSHA standards and OSHA enforcement. As a partial remedy, BLS has been developing its "supplementary data system" (SDS), essentially a compilation of somewhat upgraded and standardized data sets from state workers' compensation programs. These do include information about the type of accident as well as some characteristics of the accident victims, and will prove useful for accident researchers.

Policymakers are often confronted by "information systems" that generate massive amounts of data, but very little useful information. In part, the problem is that no one has thought through what questions were important to answer and what data might help to answer them. This is exacerbated because the information policymakers need is extremely specific and detailed. Let me give several examples.

When OSHA is considering the promulgation of new safety standards, it needs to know very precisely what the likely effects of the required changes would be. Again and again, hearings on these standards reveal that the needed information is not available. The BLS survey is no help. State workers' compensation data can provide some insight into the number of times a certain type of injury has occurred--e.g., how many times forklift trucks have overturned and killed the driver. However, OSHA needs to know whether all forklift trucks overturn or only smaller ones, at what speeds they overturned, and other similar facts. One strategy would be for OSHA to co-operate with one or two of the larger states which have good workers' compensation data systems and piggyback an interdisciplinary team on that system.

For example, suppose that OSHA knew the four or five most likely standards that it would be addressing over the next few years and some of the key factual issues that they would raise. This team of engineers and biostatisticians could be reviewing the relevant accident reports and following up--either by phone or by site visits--to gather the detailed information that would be needed to resolve those issues. No state has the financial incentive to conduct the proper level of research on its own.

There are two more general insights raised by this example. The first is that improving existing data sources will often be more productive than setting up entirely new ones. In its standard on punch presses, OSHA tried to require reports of all punch press accidents to federal OSHA. Employers didn't comply and the system collapsed, even while routine reporting of such accidents to state WC systems continued. The second point is that federal agencies, including OSHA, often take the view that the only data worth analyzing are national data, and thus they fail to invest in strategies that enrich the analysis of state data.

The effectiveness of OSHA's safety program has been the subject of many studies, whose conclusions range from those that found small effects to those that found none. For policymakers, even the former provided few insights because they were designed to give "summative" evaluations--i.e., was there an effect?--rather than "formative" evaluations, which would provide insights about which programs worked better in different circumstances. Better designed studies may generate more insights, especially if they can tie together the particular violations that are cited in inspections and the specific types of injuries that occur before and after inspections.

This will require linking together, in at least one state, the inspection data in the MIS with the accident data in the WC reporting system. However, an even more important change is needed: a willingness on OSHA's part to conduct true enforcement experiments in which the evaluation design is considered before the program is implemented. The few experimental programs that have been attempted were not carried out in ways that would facilitate evaluation: the number of sites was too few and the length of time too short to draw clear conclusions. Sadly, as a result we have learned remarkably little about which programs will be most effective in detecting and deterring violations, much less in preventing injuries.

One difficulty is that OSHA has very little information about the relation between violations of standards and accidents. One of the only sources for this information comes from accident investigations, which include a review of what role violations may have played. Currently, OSHA conducts them only in the case of fatalities and accidents causing more than 4 hospitalizations. Even with these, it has conducted very little analysis. California has covered many more hospitalizations in its investigations. In a recent article (Journal of Occupational Medicine, May 1984) I have shown that analysis of that data illuminates issues such as the following: 1) which violations should be cited as "serious"; 2) in which industries and size classes of plants violation-related serious injuries are occurring; 3) which violations can be detected by inspectors; and 4) for which types of serious accidents current standards are not relevant. This type of information can be used to inform standard-setting and enforcement practices, and can be provided to employers and workers to help them focus on the most serious violations in their industries.

Health

The BLS survey is generally conceded to have little value as an indicator of the magnitude of the occupational illness problem. Its central weakness comes with diseases with long latency periods--e.g., cancer and obstructive lung diseases--but even with more acute problems--e.g., lead poisoning--few believe that it captures more than a fraction of the cases. Especially for the long latency diseases, what is needed is a measure of current exposures. Ideally, we would be able to track whether exposures were decreasing, in what types of plants they were highest, and whether they declined after inspections. In later
