Measuring corporate environmental crime rates: progress and problems


Crime Law Soc Change (2009) 51:87–107 DOI 10.1007/s10611-008-9145-1

Measuring corporate environmental crime rates: progress and problems

Carole Gibbs & Sally S. Simpson

Published online: 17 October 2008
© Springer Science + Business Media B.V. 2008

C. Gibbs (*)
School of Criminal Justice, Department of Fisheries and Wildlife, Michigan State University, 508 Baker Hall, East Lansing, MI 48824, USA
e-mail: [email protected]

S. S. Simpson
Department of Criminology and Criminal Justice, University of Maryland, 2220 LeFrak Hall, College Park, MD 20742, USA
e-mail: [email protected]

Abstract

The problem of corporate crime rates has been the subject of debate, speculation and operationalization for decades, largely stemming from the complexity of measuring this type of crime. Examining corporate environmental crime poses challenges and creates opportunities for advancing the discussion of corporate crime rates, but criminologists are less familiar with environmental data. In the current paper, we review the strengths and weaknesses of existing environmental data that can be used to construct the components of an environmental crime rate. We also present a corporate environmental crime rate derived from data on violations of the Clean Water Act and describe problems with using it in real world data. Implications for theory, practice and future research are discussed.

The problem of corporate crime rates has been the subject of debate, speculation, and operationalization for decades [6, 16, 10], largely stemming from the complexity of measuring this type of crime. For example, a single act of corporate crime may involve individuals, the organizational entity, and interdependencies between the two. In addition, firms and managers vary in opportunity for criminal activity according to the position of the corporation in the industry and of the manager in the organization [16]. Examining corporate environmental crime poses challenges to crime rate construction, in part because it is a relatively new analytic and legal concept that covers a wide range of illegal activity by individuals and organizations [3]. Theoretical definitions and typologies of environmental crime are virtually nonexistent, and criminologists are less familiar with environmental data.


In addition, environmental violation data are complex, difficult to use, and not easily amenable to corporate or hierarchical research. For example, Environmental Protection Agency (EPA) data are collected at the facility (e.g., mill, refinery, etc.) and not the company level, resulting in a mismatch between what criminologists study and the unit of analysis in the data.1

However, environmental data also provide opportunities for advancing the discussion of corporate crime rates. Recognizing the unique features of corporate offending and experimenting with corporate environmental crime measurement is important for several reasons. First, constructing crime rates will increase our theoretical understanding of corporate environmental crime. For example, corporate environmental crime rates provide a standardized metric for examining environmental offending records over time to determine whether shifts in environmental performance are due to changes in the company or the individuals within it. In addition, comparing the predictors of violation counts and crime rates can provide insight into whether environmental performance is largely a result of opportunity or of some criminogenic element within the company.

Environmental crime rates also have practical value. For example, comparing the environmental crime rates of two companies is a more standardized way to rank environmental performance because differences in company size are incorporated into the crime rate. Thus, environmental crime rates could assist enforcement officials in the allocation of enforcement resources. EPA officials are interested in constructing a compliance rate for additional reasons. State representatives argue that compliance rates are a better measure of regulatory success than "bean counting" violations or enforcement actions [8, 14].

In a recent project funded by the National Institute of Justice, we constructed a corporate environmental crime rate using violations of the EPA's Clean Water Act (CWA).2 We constructed the measure based on the suggestions of environmental enforcement officials [8]. It is reminiscent of violations-per-unit-size measures that attempt to incorporate opportunity [6], but is further refined based on unique aspects of environmental data. In this paper, we share some of the knowledge and lessons learned in our study and the crime rate exercise.3 We begin with some background information on the EPA, its operations, and the kinds of data it collects. Next, we describe the EPA data that could be used to construct the elements of a corporate environmental crime rate and some of its limitations. Finally, we construct and describe our corporate environmental crime rate and conclude with some lessons learned in using it. We hope to provide researchers with a data template and analytic strategy that ultimately will enhance future research on corporate environmental crime.

1. The Environmental Protection Agency data, for instance, identify and track facility-level violations and link legal entities to those violations, largely ignoring the parent company.

2. We focus on violations of laws designed to protect the physical environment. However, we recognize that "environmental crime" is a broader concept, as reflected in Interpol's division of environmental crimes into "pollution" and "wildlife" categories (www.interpol.int/Public/EnvironmentalCrime/Default.asp).

3. Consistent with [19, 5, 2], we define corporate crime as a violation of any legislative requirement (i.e., regulatory, civil, or criminal). We use the terms "violation" and "crime" interchangeably.


Background

The Environmental Protection Agency (EPA), created in 1970 by executive order, is the lead agency charged with administering and enforcing the environmental regulations passed by Congress [4]. These environmental regulations largely target pollution.4 Although not a comprehensive list, these regulations include the Clean Air Act (CAA 1963); the Resource Conservation and Recovery Act (RCRA 1976); the Clean Water Act (CWA 1972); the Toxic Substances Control Act (TSCA 1976); and the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA 1980). Under regulatory law, the EPA is responsible for translating these general environmental laws into specific requirements, enforcing those requirements, and sanctioning companies or individuals that fail to comply.

Cooperative and deterrence-based strategies [15] are used by the EPA to monitor and respond to environmental violations, but the agency has no overall environmental protection strategy [4]. Although shifting to a more holistic approach (e.g., a focus on watersheds and multi-media compliance), the agency is currently organized by "media," or environments subject to regulation (e.g., air, water, and waste) (Environmental Council of the States, ECOS [8]).5 Each medium is further divided into programmatic areas. For example, the "water" medium contains programs on drinking and ground water protection, beach monitoring, and oil spill prevention and response, to name a few.

The responsibility for implementing all or some of the programs under each law has been delegated to the states [8]. As of 1999, the EPA had delegated about 70% of programs that can be delegated [8].6 While maintaining their own data systems, states with delegated programs are also required to report data to the federal EPA [8].7 Thus, national data systems include both state and federal actions. For a variety of reasons (see [17] for a description), we opted to use the national data and thus focus on them for the remainder of the paper.8

4. According to the EPA, pollution refers to the "presence of a substance in the environment that because of its chemical composition or quantity prevents the functioning of natural processes and produces undesirable environmental and health effects" (http://www.epa.gov/OCEPAterms/).

5. This organization is reflected in enforcement and sanctioning practices. Although facilities are likely to be regulated across media types (i.e., have water and air permits), they are largely regulated separately. In fact, enforcement personnel were in separate media offices until 1994. They are now in one central unit called the Office of Enforcement and Compliance Assistance (OECA) [8].

6. This includes the major environmental laws. States have been delegated the authority to enforce the Clean Water Act (1972); the Safe Drinking Water Act (1974); and the Resource Conservation and Recovery Act (1976) [9]. Federal environmental laws set a minimum standard of environmental performance and official response, but states may pass more stringent requirements. Thus, states enforce federally delegated programs and additional state criteria. Delegated states assume full responsibility for environmental programs, although EPA (through its regional offices) may conduct "oversight" of the states to determine the appropriateness of state actions. EPA also retains the option to conduct enforcement actions in delegated state programs [8].

7. The current state-to-federal reporting system has created discrepancies between state and federal data. For a full discussion of these issues, see the ECOS [8] report.

8. Cutter et al. [7] attempted to survey state-level enforcement data and experienced considerable difficulty. For those interested in collecting state data, the authors provide useful links to publicly available state data.


With the exception of federal case data, most EPA enforcement and compliance data are housed in media-specific databases.9 In the following section, we describe the compliance and enforcement data from major programs designed to regulate manufacturing facilities (often owned by companies).10 These include the National Pollutant Discharge Elimination System (NPDES), created under the CWA to maintain data on discharges into U.S. waterways; the Air Facility System (AFS), created to maintain information on air pollution (CAA);11 and the Resource Conservation and Recovery Act Information System (RCRAInfo), authorized by the RCRA to track information on hazardous waste handlers.12 These databases provide the information necessary to construct the elements of a corporate environmental crime rate. A description of these and other databases and a list of relevant acronyms is available in the Appendix.

Elements of a corporate environmental crime rate

Crime rates must be sensitive to the unique aspects of corporate offending: opportunity and interconnectedness. Corporations have varying levels of criminal opportunity tied to the position of the company itself and the relative position of actors within the company [16]. Corporations are also unique in that criminal acts usually involve multiple actors connected by organizational structure. "Criminality within organizations is a shared event, conditioned by one's organizational power, position, and motivation" ([16]: 129). Based on the preferences of EPA enforcement officials [8], we build on violations-per-unit-size measures and incorporate unique aspects of environmental data to further develop the measure of opportunity. In the following section, we describe the EPA data available to measure each component of our crime rate: crime, crime types, and opportunity.

Measuring crime

The EPA uses multiple strategies to monitor facilities and enforce regulations, creating different types of data that can be used to measure environmental crime. These measures vary in the level of bias produced. In the following section, we describe each measurement approach, the bias associated with it, and the databases that contain the information. We begin with the narrowest (and most often utilized) measurement strategy.

9. Some databases do provide data across media. We discuss the integrated data systems below.

10. The EPA refers to these facilities as "point sources," or pollution discharges from stationary or fixed locations such as a factory pipe or smokestack. They may be contrasted with "non-point sources," or diffuse emissions without a single point of origin, such as stormwater or agricultural runoff (http://www.epa.gov/OCEPAterms/).

11. AFS is a subset of a larger database called the Aerometric Information Retrieval System (AIRS). AFS contains data on stationary sources or facilities.

12. RCRAInfo was introduced as the new system in the fall of 2000. The previous data system was called RCRIS (Resource Conservation and Recovery Information System) [8].


Enforcement/Sanction Data

The EPA has the authority to initiate three types of court cases against environmental offenders. States and the federal agency may initiate an internal civil case ("civil administrative") via an administrative order.13 Alternatively, violators may be prosecuted by an external court system at either the state or federal level. Environmental agencies may refer cases to the State Attorney General, local prosecutors, or the Department of Justice (DOJ) for civil ("civil judicial") or criminal prosecution. Information on court cases is available in national databases.14 Media-specific databases contain some information on state administrative cases. Information on federal civil cases (administrative and judicial) is available in the EPA Docket system15 and federal criminal cases are maintained in Crimdoc.16 According to ECOS [8], many state agencies do not follow cases once they are referred out for civil or criminal prosecution, suggesting that the federal data contain little or no data on these state cases. Thus, although the federal EPA has the broadest repository of court case data, information on state prosecutions should be obtained directly from the source.

Prosecution data are the narrowest way to define environmental crime because they are limited to the few violations that are detected and successfully prosecuted. The EPA, however, uses additional types of sanctions that broaden the portrait of environmental crime. In addition to the punitive/deterrence-based court cases (usually reserved for the most serious violators), the states and the federal agency may respond to violations using cooperative/informal methods such as warning letters or phone calls [8]. Media-specific databases contain some information on state informal sanctions, an issue to which we return.

The sanction data capture a wider array of illegal activity than do the case data, but may still underestimate crime. Criminologists recognize that sanction data still do not include undetected violations and thus undercount crime. Further, the EPA does not issue a sanction for every known violation. Sanctions are reserved for the most serious violations (http://www.epa.gov/compliance/index-e.html). Thus, EPA sanction data do not include every detected violation. In addition, the federal EPA sanction data are incomplete because states are not required to report informal enforcement actions (e.g., warning letters and phone calls) to the agency. However, this varies by database [8].17 For example, the Permit Compliance System (PCS) provides a field to enter Notices of Violation even though the information is not required.

13. Consent orders, for example, are contracts between the agency and the violator on the steps to be taken to return to compliance. Unilateral orders (non-judicial), on the other hand, are issued by the agency without consent from the violator. These orders instruct the regulated entity on steps that must be taken to avoid further penalty [8].

14. These data are not easily accessible. Researchers may need to file a FOIA request to obtain them.

15. Although federal administrative cases are entered into DOCKET, we also found information on federal administrative orders in some media-specific databases (PCS). We suspect there may be overlap in the cases entered into each database but have no way of making a concrete determination.

16. All case information is now captured in the Integrated Compliance Information System (ICIS).

17. Federal EPA does not consider warning letters to be enforcement actions because they have no force of law [8].


In addition, states have some incentive to report informal actions despite the lack of requirements. If states do not report informal actions, the facility will be tagged as noncompliant in the data and it will appear as though the state did nothing to fix the problem.18

Sanction data can also overestimate actual crime. Generally, the EPA attempts to return violators to compliance using informal methods. If these cooperative approaches are unsuccessful, the agency will increase the sanction severity, meaning that multiple enforcement actions may be used to return a violator to compliance [8].19 Unfortunately, the EPA does not track this progression in the databases; the systems track each individual enforcement action. Thus, these enforcement actions are counted as a proxy for unique violations even though they may all relate to a single violation [8].20 In sum, the EPA sanction data fail to capture undetected or unsanctioned offenses and may also over-count violations for specific facilities. An alternative approach is to measure crime using violations detected during inspections, regardless of whether firms are sanctioned or not. In the next section, we describe this source of data.

Inspection Data

Inspection types vary by media, but essentially involve some sort of site visit [8]. Inspections can be as simple as a visual "drive-by" but may also be quite extensive, including reviews of pollution discharge reports; interviews with knowledgeable facility personnel; inspections of the processes that generate and treat pollution; samples of discharge pipes or stacks; and reviews of how samples are collected and analyzed by the laboratory [8]. Violations may be issued at any point in the inspection process when a discharge violation is discovered. In addition, inspectors may detect "single event violations" (SEV). SEVs characterize a variety of violation types that result from a single instance but are not linked to a pollution discharge above the legal limit. For example, if a number of fish are killed near the facility in the absence of an illegal discharge, the facility may still receive a single event violation. Facilities may also receive a single event violation for improper operation and maintenance.

Some coding rules in the national databases make the use of inspection data problematic. In some data systems it is difficult to even determine whether an inspector detected a violation during an inspection. In the PCS data, for example, the date and type of inspection are easily accessible but the inspection results are not available. Information on single event violations detected during inspection is available in PCS, but it is unclear if the information is comprehensive.

18. Other types of regulatory strategies, such as compliance assistance, are not systematically recorded at all [8]. However, these strategies are not sanctions. Compliance assistance is provided in advance to help facilities avoid violations.

19. For one typology of sanction severity, see Hunter and Waterman [11].

20. Even if only one sanction is delivered, the EPA data rarely include a link to a specific violation. The absence of links makes it difficult to establish temporal ordering for research. Researchers can determine the date a sanction was delivered or the date a case was filed, but do not know the date of the violation.


The AFS system is also problematic. The national data only allow inspectors to enter one update per day. Thus, the inspector must make an arbitrary decision on which pollutant to enter if multiple violations are uncovered [8]. The RCRAInfo inspection results are similarly limited. If a facility is only required to be inspected once per year, the national database only allows the states to enter the results of one inspection even if multiple inspections are conducted [8].

The inspection data have additional limitations. Namely, violations can only be detected if an inspection is conducted. Thus, the observations still neglect illegal activities that do not come to the attention of regulatory authorities. This problem is noted in EPA data documentation with a disclaimer:

EPA and states inspect a percentage of facilities each year, but many facilities, particularly smaller ones, may not have received a recent inspection. It is possible that facilities do have violations that have not yet been discovered, thus are shown as compliant in the system. EPA cannot positively state that facilities without violations…are necessarily fully compliant with environmental laws (ECHO, Frequently Asked Questions).

Yet, inspections are often targeted at facilities that show evidence of permit violations or unusual trends/patterns in self-reports that suggest poor performance [8]. To the extent that the EPA can effectively determine when a facility may be in violation, inspection results may provide a decent indicator of actual crime. However, if facilities are targeted for other reasons, such as size, the inspection data may misconstrue the distribution of crime. Inspector discretion presents additional data problems [13]. For example, inspectors may choose to exclude violations from the official inspection record if the facility agrees to quickly remedy the problem. Thus, violations detected by inspections may not necessarily appear in the databases.

The problems with inspection and prosecution data are exacerbated by budgetary issues. The EPA budget determines the number of inspectors and investigators available to detect and build cases against violators. Thus, the number of violations detected during inspections or the number of prosecutions is as much a measure of enforcement practices (dependent on budget) as it is of violations. A recent article describes a drop in the number of prosecutions under the Bush administration that parallels a drop in the number of criminal investigators [18]. Notably, the number of investigators has dropped below what is required under the 1990 Pollution Prosecution Act [18]. Even with these problems, EPA data are more useful than other sources of official data because additional measures of firm behavior are provided to supplement the official EPA reactions. In the following section, we describe self-report data obtained by the EPA from permitted facilities.

Self-Reported Violations

The EPA relies heavily on self-assessments (or self-reports) to determine whether facilities are in compliance [9]. Major EPA programs generally require the regulated community to obtain permits to pollute, test pollution levels, and submit the results to EPA. The RCRA, the CWA, and to some extent the 1990 CAA require facilities subject to regulation to apply for a permit to operate and discharge pollution.21 Permits specify a set of requirements for each facility and typically include limits on pollution, standards for testing emissions, and rules for reporting the results to the EPA [9]. For example, RCRA land disposal facilities are required to sample groundwater beneath landfills to detect contamination and submit results to EPA on a yearly basis [9].22 Similarly, CWA facilities are required to obtain permits that specify the type and frequency of sampling, where samples have to be analyzed, and the methods used to analyze them. Finally, CAA permits specify the type and amount of pollutants that may be released, measurement and reporting requirements, and steps that must be taken to reduce emissions (Plain English Guide to the Clean Air Act). In some cases, permits may require that continuous monitoring systems be installed on stacks (http://www.epa.gov/region09/air/permit/defn.html).

Usually on a monthly basis, permittees self-report sampling, laboratory procedures, and the results of lab tests conducted independent of the EPA [8].23 Information is submitted to either state or federal EPA in the form of Discharge Monitoring Reports (DMR) [9]. The EPA then compares the self-reported pollution level to the permitted level to determine whether a violation occurred [9]. Because these reports are not dependent on an official response, they provide a far more comprehensive picture of environmental crime. Self-report/assessment data are not subject to the attention, discretion, or budget of enforcement officials. In fact, they include violations that are never even addressed by EPA.

Unfortunately, these data still fail to capture all self-assessed violations. First, self-reports do not include every detected violation because facilities do not report the results of every sample. Under the CWA, for example, facilities report the monthly minimum, average, or maximum (depending on the type of pollutant measured). Thus, a facility may be in compliance with monthly average requirements if one sample contained pollutants above the permitted level and one sample contained pollutants below the monthly average. For a different reason, self-reported violations in the AFS are also limited. If several stack tests are conducted in the same day and multiple violations are found, the states can only enter one update into the national database. An arbitrary decision must be made regarding which pollutant violation is reported [8]. In addition, as with any self-report data, firms may have an incentive to misrepresent pollution levels, and the EPA has been criticized for its inability to detect fraud [9].

Yet, the self-report data are an improvement over sanction data as a measure of environmental crime. Although data are not provided for every violation that occurs, self-reports are closer to the "dark figure of crime" than sanction data.

21. For example, RCRA requires facilities that store, treat, or dispose of hazardous waste to apply for a permit [9]. Similarly, the CWA requires facilities that discharge into waterways to identify themselves by applying for a permit. In addition to general monitoring of air quality across the nation, the 1990 revisions to the Clean Air Act require some major industrial and commercial sites that release emissions into the air to obtain permits. Although we focus on these few major environmental laws, numerous others rely on facilities to self-report environmental violations. For example, the underground storage tank program requires owners to report leaks from tanks. Similarly, the medical waste program requires self-reports of medical waste disposal [9].

22. Facilities must immediately notify EPA of contaminated samples [9].

23. Facilities may contract with external laboratories or sample and analyze pollution levels themselves [8].


The degree of accuracy is unlikely to be any worse than for other self-reported criminal activity. Facilities do self-report (sometimes quite significant) violations to EPA. This may be partly due to the constant threat of an inspection that would challenge a false self-report. Thus, self-report data provide an overall sense of the facility's record in a particular month. For research purposes, the self-report data are far superior in another critical aspect. It is much easier to establish temporal ordering between predictor variables and violations because the date of the violation is known rather than only the date of the enforcement action. Thus, researchers can determine the month a violation occurred and how many months the facility continued to report a violation for a specific pollutant. While flawed, the self-report data are a better measure of crime than the other EPA data types (e.g., enforcement or inspection data).

Measuring crime types

Measuring crime rates becomes increasingly complex as crime/compliance across media types is considered. First, the extent to which EPA relies on inspections versus self-reports to determine compliance varies across programs. Programs that rely exclusively on inspections provide a much less comprehensive picture of offending because more violations go undetected.24 Programs that rely on self-reports may have a more comprehensive picture of violations, but the quality of the data may vary [8]. Thus, crime counts and rates constructed across different programs are not necessarily comparable.

Even if relying completely on self-report data to capture violations across media, multi-media studies are difficult to conduct. As noted, each EPA program has a unique database. Unfortunately, facilities have different identifiers in each program database as well as unique state-level identifiers [14]. Fortunately, the EPA has taken some strides to match facility identifiers across databases to allow researchers to compile a more comprehensive portrait of crime.25 The EPA also provides some integrated compliance information that is easily (and publicly) accessible on the internet. The Enforcement Compliance History Online (ECHO) is the most important of these for environmental crime researchers.26 ECHO contains compliance and enforcement information from RCRAInfo, PCS, and the air facility system. However, ECHO does not contain all of the information in the underlying databases.27 Further, the data provided from these systems are incomplete.

24. Thus, it is important to understand the primary monitoring system used in a particular program when contemplating how to measure environmental crime.

25. The Facility Registry System (FRS) is designed to allow matches across EPA databases. It provides a single identifier (FRS number) that is linked to identifiers in all media programs (e.g., PCS, TRI, etc.) at the state and federal level [14]. FRS also contains the facility name, address, a list of all ownership information drawn from every source, and all previous names of the facility. The Sector Facility Indexing Project (SFIP) also provided links to facility identifiers and to compliance and inspection history across databases within five industries (automobile assembly, pulp manufacturing, petroleum refining, iron and steel production, and the primary smelting and refining of nonferrous metals). The SFIP was "retired" in December of 2004.

26. Envirofacts also provides integrated information, but is not focused on compliance and enforcement.

27. The sheer volume of data maintained in each database makes these "cuts" necessary.


For example, ECHO provides each facility's compliance status (compliance, noncompliance, significant noncompliance) by quarter. The quarterly designations in the ECHO database provide some measure of facility record. Facilities in "significant noncompliance" have the "worst" violation record for the quarter; facilities with any minor violation are described as noncompliant; and facilities with zero violations are compliant.28 However, these designations are less informative than the information in the underlying databases. The PCS system, for example, contains information on all self-reported violations on a (usually) monthly basis. In addition, the quarterly designations may also be somewhat misleading. For example, EPA may leave a facility in "significant noncompliance" status in ECHO until the facility has been in compliance for a specified period of time. Similarly, EPA considers a facility to be noncompliant from the time of the violation until the penalty is concluded (the fine is paid or a settlement is reached) [8]. Thus, a facility may appear to be in significant noncompliance when it has actually returned to compliance. Finally, ECHO is limited to a specific time period of the previous 3 to 5 years only.29 For these reasons, we chose to focus on one type of corporate environmental crime.

Measuring logical units of opportunity

In addition to measuring crime, corporate crime rates must also account for units of opportunity [16]. Logical units of opportunity are "occasions or situations in which crime might or might not occur" ([16]: 128). Although crude, the number of facilities owned by a particular company provides one way of measuring variation in opportunities for crime. Yet, facilities vary in size and production capacity.

There are additional obstacles to using the number of facilities to construct a measure of opportunity. First, the national databases only provide information on companies' largest facilities, or "major" facilities.30 Although minor facilities are required to have permits and report discharges, the national EPA does not require states to submit compliance and enforcement data on minors. Unless state data are collected (which often contain more information on minors), this problem cannot be overcome. Second, the EPA data provide little information on facility ownership. Although an ownership field is provided, states are not required to report the parent company to the federal data system. In our CWA data, for example, 60% of the facilities were missing data in the ownership field.

28. Each program describes and defines "significant" violations in a different way. For example, in PCS and RCRA the term "significant noncompliance" (SNC) is used. In the air database, the term "high priority violators" (HPV) is used [8]. HPV status can be given for chronic violations or for a single violation of an air toxics requirement. The SNC designation in PCS is usually reserved for substantial violations of permitted levels (http://www.epa-echo.gov/echo/dfr_data_dictionary.html#cea).

29. For a description of problems with using ECHO to obtain data, see Cutter et al. [7].

30. Major industrial facilities are distinguished from minor dischargers by the facility's potential for discharging toxic wastes, the volume and type of wastewater, and whether the receiving water is used for drinking [20].


Further, when states do report ownership, the federal databases are not equipped to track changes in ownership over time.31 We developed an extensive procedure to determine ownership and track changes over time, described in our final grant report [17], and matched facilities to parent companies. In exploring our CWA data, we also uncovered a more detailed way to construct measures of opportunity. In the following section, we describe our data as well as our approach to constructing a crime rate. We also present some descriptive data from our study to demonstrate some unanticipated results when using crime rates rather than violation counts. We conclude with a discussion of the implications of our findings for corporate crime measurement.
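Because facility identifiers and ownership fields are inconsistent across systems, matching facilities to parent companies is essentially a record-linkage exercise. The Python sketch below illustrates the kind of crosswalk merge involved; it is a minimal illustration with entirely hypothetical identifiers and column names, not a reproduction of the matching procedure documented in the grant report [17].

```python
import pandas as pd

# Hypothetical NPDES facility records with sparse ownership information.
facilities = pd.DataFrame({
    "npdes_id": ["MI0001", "MI0002", "OH0003"],
    "facility_name": ["Mill One", "Mill Two", "Refinery Three"],
    "reported_owner": ["Acme Paper", None, None],   # two of three missing
})

# Hypothetical researcher-built crosswalk from facility to parent company
# for a given year (assembled from registries, filings, and similar sources).
crosswalk = pd.DataFrame({
    "npdes_id": ["MI0001", "MI0002", "OH0003"],
    "parent_company": ["Acme Paper", "Acme Paper", "Beta Oil"],
    "year": [1995, 1995, 1995],
})

# Attach the parent company to every facility, regardless of what the
# (often empty) ownership field in the EPA data contains.
matched = facilities.merge(crosswalk, on="npdes_id", how="left")
print(matched[["npdes_id", "parent_company"]])
```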

Working with the crime rate in real world data

The data used in this paper were collected as part of an NIJ-funded project to examine the relative effectiveness of punitive and cooperative strategies to control corporate environmental crime [17]. In the larger study, four industries known to be sources of water pollution were selected for study (i.e., pulp, paper, steel, and oil refining). The final "sample" includes a universe of all U.S.-based, publicly traded companies operating primarily in one of four Standard Industrial Classifications (SIC) (Pulp Mills; Paper Mills; Petroleum Refining; Steel Works, Blast Furnaces, and Rolling) in 1995, linked to facilities that are regulated by the EPA. Facilities were limited to those operating in the same SIC codes in order to ensure a similar culture between parent company and facility. Companies were retained for the study if they owned at least one facility operating in the same SIC code in 1995 that is categorized as a major discharger in the National Pollutant Discharge Elimination System (NPDES). Firms/facilities were tracked for the years 1995–2000.32 Therefore, any changes in either the company (mergers, bankruptcy, etc.) or the facility (closings, changes in ownership, etc.) were recorded through the year 2000.

Dependent variables

The environmental crime rate was constructed based on suggestions made by state enforcement officials in the ECOS [8] study. ECOS ([8]: 40) defines a "compliance rate" as "the number of facilities that are in compliance from the full universe of facilities that are regulated".

31. Given the data discrepancies and lack of ownership information at the federal level, one might question why we did not turn to state data. Corporations own facilities in multiple states, and practically speaking it would be difficult to collect data from all 50 states. Even with a government mandate to collect the data, ECOS did not receive all of the requested information. Although the purpose of the report was to provide a new picture of state contributions to enforcement, ECOS [8] was forced to rely on national data to fill in the gaps. Second, even if unlimited resources were available, the data would inevitably suffer from some of the same weaknesses as the headquarters data. For example, differences in definitions across place would not be resolved by collecting the data directly from the states.

32. Although we developed an extensive procedure to match facilities to parent companies, we were unable to overcome the lack of information on minor facilities. Given that ours was a national study, contacting every state for information on minors in specific industries was beyond the resources available for the project.


Table 1 Description of pollutants

Conventional Pollutants (CON): Conventional pollutants are common pollutants, such as organic waste, acid, bacteria, oil and grease, or heat, that are well understood by scientists. These materials will naturally break down in the water.

Toxic Pollutants (TOX): Toxic pollutants are materials that cause death, disease, or birth defects in organisms that ingest or absorb them.

However, some enforcement officers suggested the utility of a facility-specific compliance rate. For example, some states construct a measure of "Total Discharge Monitoring Reports (DMR) Reporting Periods without Violations / Total DMR Reporting Periods" to capture CWA compliance rates ([8]: 41). We created a measure using these same indicators. We used facility self-reports of pollution violations to construct the numerator. Even with the flaws described above, self-report data are preferable to other kinds of official data (e.g., sanction data), as the measures are more apt to capture illegal events than those that rely solely on discovery by authorities. Two categories of pollutants (e.g., conventional and toxic) and an overall measure are included to provide a more general idea of firm pollution. Definitions of these pollutants are provided in Table 1. Violations are aggregated for total, conventional, and toxic pollutants.

Although our data do not allow us to address the interdependencies among actors involved in environmental violations, we do address the issue of opportunity. We use lower levels of aggregation within the facility that provide some indication of opportunity as the denominator. Facilities may have one or more discharge points (e.g., pipes) that release polluted water directly into surface waters, and these pipes vary in size. Various properties of the polluted water discharged through the pipes must be assessed; these properties are called parameters. Common parameters include the amount of oxygen consumed in the biological processes that break down organic matter and the particulate content of the water [12]. Parameters taken from the same discharge point (i.e., pipe) are grouped together for reporting purposes and assigned a number, called a report designator. Thus, each discharge point contains multiple report designators and each report designator contains multiple parameters. Table 2 provides an example. In this hypothetical case, discharge point 001 contains two report designators (A and B). Report designator A contains three parameters and report designator B contains only one.

Permits often require multiple measurements of each parameter. For example, the EPA may limit the average quantity, the maximum quantity, the minimum concentration, the average concentration, or the maximum concentration of a particular pollutant.33 Table 3 provides an example. In this hypothetical case, the biochemical oxygen demand (BOD) and total suspended solids (TSS) parameters have quantity average and quantity maximum limits. The specific limits differ for pH. For this parameter, the facility must report the concentration minimum and concentration maximum.

33. Quantities represent total loads, while concentrations are the percent of a pollutant in the water. Regulations specify the type of limits that must be assigned to each parameter, although the permit writers may add additional ones.


Table 2 Discharge numbers, report designators and parameters

Facility  Discharge #  Report Designator  Parameter
A         001          A                  Biochemical Oxygen Demand (BOD)
A         001          A                  Total Suspended Solids (TSS)
A         001          A                  pH
A         001          B                  Zinc
A         002          A                  BOD
A         002          A                  Nitrogen
A         003          A                  Oil & Grease
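The nesting in Table 2 (parameters within report designators within discharge points within a facility) can be represented directly as a small data structure, which also makes the crude opportunity count explicit. The Python sketch below is purely illustrative; the class and field names are ours, not EPA's.

```python
from dataclasses import dataclass, field

@dataclass
class ReportDesignator:
    """A reporting group (e.g., "A") attached to one discharge point."""
    label: str
    parameters: list[str] = field(default_factory=list)

@dataclass
class DischargePoint:
    """A permitted outfall (pipe) identified by a discharge number."""
    number: str
    designators: list[ReportDesignator] = field(default_factory=list)

@dataclass
class Facility:
    """A permitted facility owned by some parent company."""
    name: str
    discharge_points: list[DischargePoint] = field(default_factory=list)

# Hypothetical Facility A from Table 2.
facility_a = Facility(
    name="A",
    discharge_points=[
        DischargePoint("001", [
            ReportDesignator("A", ["BOD", "TSS", "pH"]),
            ReportDesignator("B", ["Zinc"]),
        ]),
        DischargePoint("002", [ReportDesignator("A", ["BOD", "Nitrogen"])]),
        DischargePoint("003", [ReportDesignator("A", ["Oil & Grease"])]),
    ],
)

# A crude opportunity count: total parameters reported across all pipes.
n_parameters = sum(
    len(rd.parameters)
    for dp in facility_a.discharge_points
    for rd in dp.designators
)
print(n_parameters)  # 7
```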

The number of parameters, measurements, and the frequency of reporting provide a scale for pipes that have more or less activity that could produce a violation.34 Therefore, these reporting requirements offer a crude measure of opportunity. The self-report data were first aggregated to facility-level violation counts and numbers of required reports. Facilities owned by the same firm were then combined, resulting in a firm-level measure of the number of violations per number of reports required across all owned facilities (see Fig. 1).35 For purposes of this demonstration, the data were further aggregated to the firm/quarter. We also use the violation count (derived from the self-report data) to compare the results using the crime rate to a more traditional measure of corporate crime.

Key independent variables

Official EPA reactions to environmental violations have been aggregated to firm-level counts of enforcement actions.36

34. More frequent reports (usually monthly) are required for pipes that are more active. Reports may be required quarterly or only annually for less active pipes.

35. It is important to note that although this measure does not capture the actual volume of opportunity, it is similar to a rate measure proposed by state EPA officials. For instance, a recent survey of state EPA officials shows that some officials would prefer to calculate a compliance "rate." One suggestion for calculating the rate was "total discharge monitoring reports (DMR) reporting periods without violations/total DMR reporting periods" ([8]: 41). Thus, our measure is consistent with some EPA reporting preferences.

36. Two sources of information are used to construct measures of sanctions; one is the PCS system itself. The EPA data contain information on actions taken by EPA (national and state). There are several problems with these data. First, it is likely that information on administrative cases in the other source (EPA Docket) overlaps with information on administrative orders and penalties in the PCS data, but it is impossible to determine the degree of overlap. Second, the administrative, civil, and criminal cases coded as the more formal actions in the scale currently reflect all cases brought against these companies under the Clean Water Act; thus, the cases may have been brought as a result of other kinds of violations (reporting or compliance schedule violations rather than pollution violations). However, because the EPA targets the most serious violators and the most serious violations (i.e., pollution) for formal enforcement actions, it is likely that most of the cases were brought for either repeated pollution violations or repeated violations of many types (e.g., pollution, compliance schedule, and single event violations). For this study, we examine the enforcement actions that resulted from a pollution violation (i.e., excluding enforcement actions for reporting violations, etc.). Finally, the data do not contain links between enforcement actions and specific violations. Thus, although the outcome of interest may be limited to a specific type of pollutant (e.g., BOD), the enforcement action may have been given for any type of pollution (e.g., combining violations for BOD, TSS, and nitrogen).


Table 3 Limits on pollution

Parameter  Quantity Average  Quantity Maximum  Concentration Minimum  Concentration Average  Concentration Maximum
BOD        204 Pounds/Day    371 Pounds/Day    –                      –                      –
TSS        166 Pounds/Day    261 Pounds/Day    –                      –                      –
pH         –                 –                 6.5 Standard Units     –                      9.0 Standard Units
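Determining whether a self-reported value violates the permit reduces to comparing it against the permitted limit for that parameter and measurement type, treating minimum limits as floors and all other limits as ceilings. A minimal Python sketch using the hypothetical limits in Table 3 (the dictionary layout and function are our assumptions, not the PCS schema):

```python
# Hypothetical permit limits keyed by (parameter, measurement type),
# taken from Table 3. Minimum limits are floors; all others are ceilings.
LIMITS = {
    ("BOD", "quantity_average"): 204.0,        # pounds/day
    ("BOD", "quantity_maximum"): 371.0,        # pounds/day
    ("TSS", "quantity_average"): 166.0,        # pounds/day
    ("TSS", "quantity_maximum"): 261.0,        # pounds/day
    ("pH", "concentration_minimum"): 6.5,      # standard units (floor)
    ("pH", "concentration_maximum"): 9.0,      # standard units
}

def is_violation(parameter: str, measurement: str, reported: float) -> bool:
    """Flag a self-reported value that falls outside the permitted limit."""
    limit = LIMITS[(parameter, measurement)]
    if measurement.endswith("minimum"):
        return reported < limit   # falling below a floor is a violation
    return reported > limit       # exceeding a ceiling is a violation

# Example: a monthly DMR entry reporting a BOD quantity average of 230 lb/day.
print(is_violation("BOD", "quantity_average", 230.0))    # True
print(is_violation("pH", "concentration_minimum", 6.8))  # False
```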

The enforcement actions in this study were categorized using a severity scale [11]. The scale ranges from zero to seven, with more informal actions falling at the bottom of the scale. Table 4 provides the distribution of sanctions across the severity categories; the distribution is cut at two to distinguish "formal" from "informal" sanctions (categories zero, one, and two are considered informal). To be consistent with EPA sanctioning practices, the monthly self-report and enforcement data are aggregated to the quarter (the EPA sanctions facilities in significant noncompliance every three months). As noted, the EPA also has the ability to inspect facilities to monitor compliance. We also constructed firm-level measures of the number of inspections per quarter, combining all types of inspection.

Description of the sample

The sample period begins in 1995 with 67 firms in four industries: pulp, paper, steel, and oil. Pulp and paper were collapsed into one industry because of the substantial degree of overlap in the firms and facilities in the two industries, leaving 30 pulp and paper companies, 18 steel companies, and 19 oil companies. Two hundred and fourteen permits were matched to this universe of firms. These permits identified 212 unique facilities (two facilities were assigned two permits). Pooled descriptive statistics for this sample are provided in Table 5.

As the table demonstrates, violations and sanctions are rare. The average company had approximately one violation per quarter and a one to two percent violation rate. The average company received 0.5 sanctions each quarter, and the median number of sanctions received is zero. The average number of inspections is also low considering that the number of inspections is aggregated to the company level. On average, each company (usually including multiple facilities) was inspected once per quarter. Although this lack of variability limits our analysis, the descriptive data still present interesting issues for corporate crime measurement.

Violation Rate = Number of Violations / Number of Reports Required

Fig. 1 Violation rate
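In code, the rate in Fig. 1 amounts to summing violations and required reports from facility-level self-report records up to the firm/quarter and dividing. A minimal pandas sketch with hypothetical column names (they do not correspond to actual PCS field names):

```python
import pandas as pd

# Hypothetical facility-month self-report records.
dmr = pd.DataFrame({
    "firm_id":     ["F1", "F1", "F1", "F2", "F2", "F2"],
    "facility_id": ["A",  "A",  "B",  "C",  "C",  "C"],
    "quarter":     ["1995Q1"] * 6,
    "n_reports_required": [7, 7, 3, 5, 5, 5],   # parameters x measurements due
    "n_violations":       [1, 0, 0, 2, 1, 0],
})

# Combine all facilities owned by the same firm within the quarter.
firm_quarter = (
    dmr.groupby(["firm_id", "quarter"], as_index=False)
       .agg(violation_count=("n_violations", "sum"),
            reports_required=("n_reports_required", "sum"))
)

# Violation rate = number of violations / number of reports required (Fig. 1).
firm_quarter["violation_rate"] = (
    firm_quarter["violation_count"] / firm_quarter["reports_required"]
)
print(firm_quarter)
```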


Table 4 Sanction severity scale (frequency and percent of all sanctions)

(0) Comment, Permit Mod Request: 197 (14.25%)
(1) Phone Call, Meeting with Permittee, Enforcement Notice Letter: 88 (6.37%)
(2) Final Order of the Board, Letter of Violation-Effluent, Section 308 Letter, Warning Letter, Notice of Violation (multiple letter types), Notice of Noncompliance (multiple letter types): 592 (42.84%)
(3) Administrative Action Planned, Administrative Action Pending, Under Review by State Agency, Under Enforcement Review: 50 (3.62%)
(4) Enforcement Conference, Enforcement Conference Letter: 12 (0.87%)
(5) AO Stipulated Penalty, Amended Administrative Order, 308 Administrative Order, Administrative Order, Administrative Consent Order, Jud Action Planned, Referred to Higher Level Review, Notice of Potential Penalty, Compliance Inspection Compliance Order: 276 (19.97%)
(6) Jud Action Pending, Consent Decree, Stipulation Court Order, Stipulation Agreement, Order of Revocation, Emergency Order (Governor): 65 (3.86%)
(7) NPDES Penalty AO Category I, NPDES Penalty AO Category II, Penalty AO Issued by State: 102 (7.38%)
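Operationally, the scale in Table 4 is a lookup from enforcement-action type to a 0–7 severity score, dichotomized so that categories zero through two count as informal and three through seven as formal. A short Python sketch under assumed action labels (only a few of the labels in Table 4 are included):

```python
# Hypothetical mapping of enforcement-action types to the 0-7 severity
# scale in Table 4 (abbreviated; the full scale lists many more labels).
SEVERITY = {
    "Comment": 0,
    "Phone Call": 1,
    "Warning Letter": 2,
    "Notice of Violation": 2,
    "Administrative Action Pending": 3,
    "Enforcement Conference": 4,
    "Administrative Order": 5,
    "Consent Decree": 6,
    "NPDES Penalty AO Category I": 7,
}

def classify(action: str) -> str:
    """Categories 0-2 are treated as informal sanctions, 3-7 as formal."""
    return "informal" if SEVERITY[action] <= 2 else "formal"

# Tally the informal and formal sanctions a firm received in a quarter.
actions = ["Warning Letter", "Administrative Order", "Phone Call"]
counts = {"informal": 0, "formal": 0}
for action in actions:
    counts[classify(action)] += 1
print(counts)  # {'informal': 2, 'formal': 1}
```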

Analysis and results

To demonstrate the unanticipated findings for the violation rate, we present the pooled correlations between sanctions, violation rates, and violation counts. We make no assertion regarding temporal ordering or causality, and we recognize that this approach does not address the selection issues involved in examining the impact of sanctions. Our goal is more modest: to demonstrate the variation in results we found when using the violation rate versus the violation count.

Because the EPA targets the most frequent violators, we expected sanctions and inspections to be positively correlated with both measures of violation. However, as Table 6 shows, the violation rates are not highly correlated with the number of informal and formal sanctions or the court case data.37 Instead, the correlations between sanctions and the number of violations are much more consistent. It seems that the EPA is less concerned with the percent of reports in violation and instead focuses on the frequency of violations.

The violation rate also produces contradictory findings when examining inspections. As Table 7 demonstrates, inspections are positively correlated with the number of violations, but the number of inspections is negatively associated with the violation rate across pollutants. In other words, firm/quarters with an inspection have lower reported violation rates. The negative correlation between inspections and the violation rate seems to be due to EPA targeting practices. Larger firms (i.e., those with more employees and more facilities) are inspected more often. The correlations between the number of inspections, the number of employees, and the number of facilities are 0.34 and 0.48, respectively. Larger firms also have smaller violation rates because the denominator in the violation rate is bigger (i.e., these firms submit more reports).

37. Violation rates are also generally unrelated to sanctions in the bivariate regression models (data not shown).


Table 5 Descriptive statistics

Variables                    nT     Range  Mean (Std. Dev.)  Median
Dependent Variables
  Violation Count (All)      1,483  0–1    0.68 (0.47)       1.00
  Violation Rate (All)       1,483  0–75   2.04 (4.10)       0.95
PCS Sanction Variables
  # of Informal Sanctions    1,483  0–9    0.25 (0.78)       0.00
  # of Formal Sanctions      1,483  0–21   0.22 (1.12)       0.00
  # of Total Sanctions       1,483  0–21   0.47 (1.41)       0.00
Case Sanction Variables
  # of Administrative Cases  1,483  0–1    0.02 (0.13)       0.00
  # of Civil Cases           1,483  0–2    0.01 (0.11)       0.00
  # of Criminal Cases        1,483  0–1    0.00 (0.05)       0.00
Monitoring Data
  # of Inspections           1,483  0–14   1.26 (1.66)       1.00

The correlations between the overall violation rate, the number of employees, and the number of facilities are −0.11 and −0.14, respectively. Thus, firms with a lower violation rate are more likely to be inspected because they are larger. In fact, the association between inspections and the violation rate is nonsignificant when the number of facilities is included in a multivariate regression model (data not shown). Thus, although an important part of corporate crime measurement, the corporate environmental crime rate does not operate as we expected in real world data. In the following section, we discuss the implications of these findings for corporate crime measurement.

Table 6 Correlations (sanctions, violations, and violation rate)

                              1       2       3       4       5       6
1. # of Informal Sanctions    1.00
2. # of Formal Sanctions      0.07**  1.00
3. # of Total Sanctions       0.61**  0.83**  1.00
4. # of Administrative Cases  0.04    0.58**  0.49**  1.00
5. # of Civil Cases           0.04   −0.01    0.01   −0.01    1.00
6. # of Criminal Cases       −0.01   −0.00   −0.02   −0.01   −0.00    1.00
7. Violation Rate ALL         0.05+   0.02    0.04    0.01    0.00    0.03
8. Violation Rate CON         0.08**  0.02    0.06*   0.04    0.01    0.05*
9. Violation Rate TOX         0.03    0.00    0.02    0.01   −0.01   −0.01
10. Violation Count ALL       0.18**  0.08**  0.17**  0.08**  0.06*   0.02
11. Violation Count CON       0.13**  0.05*   0.12**  0.06*   0.04+   0.03
12. Violation Count TOX       0.11**  0.06*   0.11**  0.04    0.04   −0.02


Table 7 Correlations between inspections and violation rate

                          # of Inspections
1. Violation Rate ALL     −0.08**
2. Violation Rate CON     −0.05*
3. Violation Rate TOX     −0.07*
4. Violation Count ALL     0.11**
5. Violation Count CON     0.07**
6. Violation Count TOX     0.02
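The pooled correlations reported in Tables 6 and 7 are ordinary Pearson correlations computed over the stacked firm/quarter observations. A minimal sketch of that computation with pandas, using hypothetical columns and toy values rather than the study data:

```python
import pandas as pd

# Hypothetical firm/quarter panel (one row per firm per quarter).
panel = pd.DataFrame({
    "violation_rate":     [0.00, 0.02, 0.10, 0.01, 0.05, 0.00],
    "violation_count":    [0,    1,    3,    1,    2,    0],
    "n_inspections":      [2,    1,    0,    3,    1,    0],
    "n_formal_sanctions": [0,    0,    1,    0,    1,    0],
})

# Pairwise Pearson correlations across all pooled firm/quarters.
corr = panel.corr(method="pearson")

# For example, how inspections relate to the rate versus the count.
print(corr.loc["n_inspections", ["violation_rate", "violation_count"]])
```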

Discussion

Although theoretically useful, the violation rate produces confusing results in real world data. Contrary to expectations, it is largely unrelated to the number of sanctions received and negatively correlated with the number of inspections received, meaning that firms with a higher violation rate receive fewer inspections. These counterintuitive findings appear to be the consequence of the EPA focusing on the number of violations when issuing sanctions and on firm size when targeting inspections, rather than on the rate of environmental violations. The results are also due to the correlation between company size and the environmental violation rate. As stated, larger firms have smaller violation rates because the denominator in the violation rate is bigger (i.e., these firms submit more reports) but are also inspected more often, producing the negative correlation between crime rates and inspections.

In some sense, our analyses suggest that the current measure has little value for research and theoretical advancement. We believed that creating a corporate crime rate measure would provide a standardized way to compare corporate crime patterns over time, but instead it seems to more adequately capture variation in company size over time. Further, variables associated with the violation rate in empirical research may reflect this correlation with firm size rather than true variation in corporate environmental crime. This problem is inherent in corporate crime rates that use firm size as part of the metric. Large firms will always have more transactions, and therefore more opportunities, than small firms, and this correlation will factor into any crime rate. Thus, alternative approaches may be necessary to advance corporate crime measurement.

We believe, however, that the corporate environmental crime rate has significant value for practice, especially when compared to the traditional violation count. Although large firms are targeted for inspection, firm size is not significantly correlated with the number of violations per quarter. In other words, large firms do not have significantly more violations per quarter than other firms. Further, large firms have smaller violation rates; larger firms report a smaller percent of measurements in violation than small companies. Earlier command and control policies targeted at large companies may have successfully reduced the level of pollution and violations so that these companies are no longer a major source of pollution [21]. Therefore, the EPA may detect more violations and do more to protect the environment by targeting inspections at small companies (i.e., those with a larger violation rate).


In fact, several state environmental officials we spoke with during the course of this project suspected that small "mom and pop" businesses may create more environmental risk than large companies. Thus, the EPA may benefit from using crime rates to target monitoring and enforcement resources.

However, the corporate environmental crime rate constructed in this paper has several limitations. First, the sample used to derive the corporate crime rate is comprised of large, publicly traded companies for which data are readily available, limiting our ability to fully compare the environmental violations and crime rates of firms of different types. In addition, as previously mentioned, our data do not contain any information on minor facilities. The violation data are limited to those of major facilities, meaning that our violation count and crime rate may exclude a significant number of violations by small plants. Further, the environmental crime rate does not capture variability in the severity of environmental violations. The overall crime rate of a particular facility may be low, but a single violation may have a substantial environmental impact [8]. Finally, our crime rate does not address the interdependencies between actors that produce violations or the serial production of environmental crime.

Future research may address some of these limitations in smaller scale projects. Working with individual state agencies may allow researchers to include smaller companies and minor facilities in the sample. Many states collect significantly more data than required at the federal level and do maintain data on minor facilities. Using data from state agencies excludes the possibility of a national study, but research on corporate environmental crime measurement is not necessarily limited by a regional focus. However, scholars would be forced to rely on voluntary company participation to obtain ownership information on small businesses, as it is not publicly available. Environmental agencies in states with a good working relationship with the business community may be able to facilitate this exchange of information.

Violation severity is also an important and feasible issue to address in future work. A single violation by a large firm may be quite significant compared to several minor violations by small businesses. On the other hand, the collective impact of frequent minor violations by small companies may be greater than that of a single large violation by a major corporation. Future work on corporate environmental crime measurement can address this issue by distinguishing the level of violation in the crime rate.

The issue of interdependency and serial production is much more difficult to address. Information on the individuals responsible for the violation and their status within the company is simply not available. And in some cases violations may be produced by random fluctuations in water levels or weather rather than by error or deliberate manipulation [1]. However, research on corporate crime measurement may still be advanced by exploiting the multiple levels available in environmental data. Scholars may compare the environmental violations and crime rates of facilities owned by the same firm to those owned by different firms. Future work may also compare the violations and environmental crime rates of firms competing in the same industries to those in other industries. This work will shed light on whether interdependencies within industry or company produce variation in corporate environmental crime rates.
As should be apparent from this discussion, the exploration of corporate crime rates has by no means been exhausted, and it has substantial implications for practice. However, future research should explore alternative measurement strategies that incorporate the complexities of corporate crime and overcome some of the limitations of the crime rate, in particular strategies that are unlikely to be so highly correlated with company size. The EPA data invite such possibilities. Facilities are required to report permitted limits on pollution as well as the level of pollution in water discharges. When converted to a common metric, the pollution limits and levels can be summed across pipe, facility and ultimately company, and divided to create an overall measure of a company's monthly (or quarterly) level of compliance. Violations would reduce the level of compliance, and polluting less than legally allowed would increase it. In addition to avoiding the inherent correlation with company size, this approach incorporates violation severity into the standardized measure of a company's environmental record. Comparing the predictors of firm violation counts, crime rates and levels of compliance may advance theoretical discussions of corporate crime.
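As a rough sketch of how such a compliance-level measure might be computed, the code below sums hypothetical permitted limits and reported discharge levels (assumed to be already converted to a common metric) across pipes and facilities and then divides the totals. The specific ratio used here, total permitted over total discharged, is our reading of the proposal rather than a formula taken from the EPA data; other constructions of the division are possible.

```python
# Hypothetical sketch of a company-level compliance measure: pollutant limits and
# discharges (already in a common metric) are summed across pipes and facilities,
# then divided to yield a single figure per month or quarter. Figures are invented.

from typing import List, Tuple

# Each tuple is (permitted_amount, discharged_amount) for one pipe-level measurement.
Measurement = Tuple[float, float]

def compliance_level(measurements: List[Measurement]) -> float:
    """Total permitted pollution divided by total discharged pollution.
    Values above 1.0 indicate polluting less than allowed; values below 1.0
    indicate that violations have pulled the company under full compliance."""
    total_permitted = sum(limit for limit, _ in measurements)
    total_discharged = sum(discharge for _, discharge in measurements)
    if total_discharged == 0:
        return float("inf")  # no discharge at all during the period
    return total_permitted / total_discharged

# Hypothetical quarter for one company: two facilities, several pipes each.
facility_a = [(100.0, 80.0), (50.0, 55.0)]    # second pipe exceeds its limit
facility_b = [(200.0, 150.0)]
print(compliance_level(facility_a + facility_b))  # 350 / 285, about 1.23
```

Because any exceedance lowers the ratio in proportion to its magnitude, severity is folded into the measure automatically, which is the property highlighted above.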

Conclusion

The results of this paper advance the discussion of corporate crime measurement in several ways. First, the results support the need to explore a variety of approaches to measuring corporate crime. In addition to the strengths and weaknesses of the underlying data, each measurement strategy makes contributions and has limitations. It is increasingly clear that the triangulation of measurement strategies is the best approach to advancing corporate crime measurement and disentangling the theoretical predictors of corporate environmental performance.

Our results also highlight the gap between theoretical discussions and practice. The corporate crime rate did not operate as expected and may be more useful for environmental enforcement officials than for developing or testing corporate crime theory. Our conclusions suggest the utility of creating feedback loops between academics and practitioners to further both knowledge and practice. Practitioners have intimate knowledge of environmental data and may be able to suggest alternative measurement strategies and anticipate problems with current approaches. Academics may be able to further develop crime rates (and other environmental crime measures) that are useful to those working in the field.

Overall, our paper points to the need to reinvigorate the discussion of corporate crime measurement. While the field has made important advances in understanding corporate crime, our experience with environmental data and measurement suggests that we have a long way to go. In addition to addressing issues of data quality and availability, our paper highlights the need to create and test innovative measurement strategies that ultimately enhance our understanding of corporate crime, company offenders and the justice process.

Acknowledgment

This research is supported by a grant from the National Institute of Justice (NIJ).


Appendix

Table 8 Summary of EPA databases

Media-specific databases

National Pollutant Discharge Elimination System (NPDES): Authorized by the Clean Water Act to maintain national data on pollution discharges into U.S. waterways.

Air Facility System (AFS): Authorized by the Clean Air Act to maintain national data on air pollution. AFS is a subset of a larger database called the Aerometric Information Retrieval System (AIRS) and contains data on stationary sources or facilities.

Resource Conservation and Recovery Act Information System (RCRAInfo): Authorized by the Resource Conservation and Recovery Act to track information on hazardous waste handlers. RCRAInfo was introduced as the new system in the fall of 2000; the previous data system was called RCRIS (Resource Conservation and Recovery Information System) [8].

Multi-media databases

Docket (NEW NAME): Contains information on federal civil cases (administrative and judicial).

Crimdoc (NEW NAME): Contains information on federal criminal cases.

Enforcement and Compliance History Online (ECHO): Merges quarterly information from RCRAInfo, PCS, and the Air Facility System for the previous 3–5 years.

Facility Registry System (FRS): Designed to allow matches across EPA databases. It provides a single identifier (the FRS number) that is linked to the identifiers used in all media programs (e.g., PCS, TRI, etc.) at the state and federal level; see the illustrative linkage sketch following this table.

Sector Facility Indexing Project (SFIP): Provides facility identifiers and links to compliance and inspection history across databases within five industries (automobile assembly, pulp manufacturing, petroleum refining, iron and steel production, and the primary smelting and refining of nonferrous metals).
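Because the FRS number is described as a single identifier linked to the facility identifiers used by the individual media programs, a researcher could in principle merge records across databases along the following lines. The column names, identifier values, and the pandas-based approach are illustrative assumptions for a hypothetical crosswalk, not the EPA's own tooling or file layout.

```python
# Illustrative only: joining water-program and air-program records on an FRS number.
# Column names (frs_id, npdes_id, afs_id, ...) and all values are hypothetical.
import pandas as pd

frs_crosswalk = pd.DataFrame({
    "frs_id":   ["110000001", "110000002"],
    "npdes_id": ["MI0001234", "MI0005678"],    # water-program (PCS/NPDES) identifiers
    "afs_id":   ["2612300001", "2612300002"],  # air-program (AFS) identifiers
})

water_violations = pd.DataFrame({
    "npdes_id": ["MI0001234"], "quarter": ["2004Q1"], "violations": [3],
})
air_inspections = pd.DataFrame({
    "afs_id": ["2612300002"], "quarter": ["2004Q1"], "inspections": [1],
})

# Attach the common FRS identifier to each media-specific table, then combine.
water = water_violations.merge(frs_crosswalk[["frs_id", "npdes_id"]], on="npdes_id")
air = air_inspections.merge(frs_crosswalk[["frs_id", "afs_id"]], on="afs_id")
combined = water.merge(air, on=["frs_id", "quarter"], how="outer")
print(combined)
```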

Table 9 Relevant acronyms

AFS: Air Facility System
CAA: Clean Air Act
CWA: Clean Water Act
CERCLA: Comprehensive Environmental Response, Compensation and Liability Act
DOJ: Department of Justice
ECOS: Environmental Council of the States
EPA: Environmental Protection Agency
FRS: Facility Registry System
FBI: Federal Bureau of Investigation
FOIA: Freedom of Information Act
NPDES: National Pollutant Discharge Elimination System
RCRA: Resource Conservation and Recovery Act
RCRAInfo: Resource Conservation and Recovery Act Information System
SFIP: Sector Facility Indexing Project
TSCA: Toxic Substances Control Act


References

1. Bandyopadhyay, S., & Horowitz, J. (2006). Do plants overcomply with water pollution regulations? The role of discharge variability. Topics in Economic Analysis and Policy, 6(1), 1–32.
2. Braithwaite, J. (1984). Corporate crime in the pharmaceutical industry. London: Routledge and Kegan Paul.
3. Clifford, M., & Edwards, T. D. (1998). Defining "environmental crime". In M. Clifford (Ed.), Environmental crime: Enforcement, policy, and social responsibility (pp. 5–30). Gaithersburg, MD: Aspen.
4. Clifford, M. (1998). A review of federal environmental legislation. In M. Clifford (Ed.), Environmental crime: Enforcement, policy, and social responsibility (pp. 95–110). Gaithersburg, MD: Aspen.
5. Clinard, M. B., & Quinney, R. (1973). Criminal behavior systems: A typology. New York: Holt, Rinehart and Winston.
6. Clinard, M. B., & Yeager, P. C. (1980). Corporate crime. New York: Free Press.
7. Cutter, R. H., Cahoon, L. B., & Leggette, R. D. (2006). Enforcement data: A tool for environmental management. The Environmental Law Reporter's News and Analysis, 36(2), 10060–10072.
8. Environmental Council of the States (ECOS) (2001). State environmental agency contributions to enforcement and compliance. Washington, D.C.
9. Governmental Accountability Office (GAO) (1993). Environmental enforcement: EPA cannot ensure the accuracy of self-reported compliance monitoring data. Report to Congressional Requesters (GAO/ECED-93-21).
10. Green, G. S. (1990). Occupational crime. Chicago: Nelson-Hall.
11. Hunter, S., & Waterman, R. W. (1996). Enforcing the law: The case of the clean water acts. Armonk, NY: M.E. Sharpe.
12. Kagan, R. A., Gunningham, N., & Thornton, D. (2003). Explaining corporate environmental performance: How does regulation matter? Law and Society Review, 37(1), 51–90.
13. Metzenbaum, S., Watkins, A., & Adeyeye, A. (2007). A memo on measurement for environmental managers: Recommendation and reference manual. Environmental Compliance Consortium.
14. National Academy of Public Administration (2001). Evaluating environmental progress: How EPA and the states can improve the quality of enforcement and compliance information. Report for EPA Requestors.
15. Reiss, A. J., Jr. (1984). Selecting strategies of social control over organizational life. In K. Hawkins & J. M. Thomas (Eds.), Enforcing regulation (pp. 23–35). Boston: Kluwer-Nijhoff.
16. Simpson, S. S., Harris, A. R., & Mattson, B. A. (1995). Measuring corporate crime. In M. B. Blankenship (Ed.), Understanding corporate criminality (pp. 115–140). New York: Garland.
17. Simpson, S. S., Garner, J., & Gibbs, C. (2007). Why do corporations obey environmental law? Assessing punitive and cooperative strategies of corporate crime control. Final Technical Report submitted to the National Institute of Justice (NIJ), Grant #2001-LJ-CX-0020.
18. Solomon, J., & Eilperin, J. (2007). Bush's EPA is pursuing fewer polluters. The Washington Post.
19. Sutherland, E. (1961). White collar crime. New York: Holt, Rinehart and Winston.
20. Yeager, P. C. (1993). Industrial water pollution. In M. Tonry (Ed.), Crime and justice: A review of research, 18 (pp. 97–148). Chicago: University of Chicago Press.
21. Vandenbergh, M. P. (2004). From smokestack to SUV: The individual as the regulated entity in the new era of environmental law. Vanderbilt Law Review, 57(2), 515–626.
