An Examination of Crime Reporting on Cable News1
To what extent are scholars in the fields of criminal justice and criminology participating in crime reporting on major cable news television programs?
For this study, researchers utilized content analysis, an unobtrusive method of collecting information, to explore crime reporting across different television media outlets. They first selected the most highly rated news programs from three different 24-hour cable news networks. Included in the study were Anderson Cooper 360° (CNN), The O’Reilly Factor (FOX), and Countdown with Keith Olbermann (MSNBC). Researchers recorded these programs on Tuesdays and Thursdays from June to September 2006. Episodes of each program were coded first for the inclusion of crime segments and later for discussion of crime causation and crime control, taking into consideration the guests appearing on the programs. Data from 64 episodes were included in the final analysis. From these episodes, 180 crime segments were analyzed. Because two researchers, or raters, analyzed the media content, an interrater reliability analysis was conducted to determine coding consistency between the raters.
Results and Implications
Of the three programs analyzed, Anderson Cooper 360° devoted the most time to crime reporting (40%), and, across all three programs, crime reporting segments lasted approximately five minutes. As for the type of crime discussed most often, the majority of time spent discussing crime (90%) was devoted to street crime, terrorism, and sexual offenses. More than one-third of segments focused on high-profile cases such as the still-unsolved murder of JonBenet Ramsey in Colorado.
During the crime segments analyzed, there were 347 guest appearances. Scholars and researchers represented 4% of guests, while politicians and practitioners (e.g., law enforcement officials, attorneys) were called upon the most (37.5%). The researchers found that, during the crime-related segments, little discussion was devoted to crime causation and/or crime control. Indeed, only 17% of guests incorporated crime causation into the interview, and only 14% of guests mentioned crime control. These findings were similar for both academic and nonacademic guest appearances. When crime causation was discussed, it typically involved the nonprofessional diagnosing of individual pathology, most frequently mental illness. Among guests who did incorporate crime control and policy into their discussions, the majority favored more severe penalties for offenders. Of all of the guest appearances, there were only nine in which guests supported less severe responses such as rehabilitation or decriminalization. Finally, regarding the type of content the guest appearances provided, most guests shared facts or their experience to develop the crime-related story, although 22.5% provided only speculation regarding the case in question. Researchers found that when speculation did not occur naturally, program hosts often encouraged such discussion through the questions they posed during the interview.
Ultimately the findings of this study lend support to past research in the area of crime reporting. Academic guest appearances remain as few and far between as they were two decades ago, and the conversations continue to lack insight regarding crime causation or appropriate preventative measures for and/or responses to criminal events. Of course, as the researchers note, perhaps cable news programs are not the place for such discourse. While one cannot deny that news audiences are aware of the “what,” it remains questionable as to whether they will understand the “why,” or the context of criminal events, from the information received from cable news media outlets.
In This Chapter You Will Learn
To explain the reasons for conducting unobtrusive research
To discuss reactivity and the Hawthorne Effect
To locate archives of existing data
To discuss the advantages and disadvantages of using secondary data
To compare and contrast the various methods for conducting unobtrusive research
Previous chapters in this text have examined the many types of data collection techniques most often used by social scientists to study crime. So far you have learned about the variations of the experimental design, data collection utilizing surveys and interviews, as well as participant observation data-gathering techniques. Which technique is chosen by a researcher may depend on any number of factors, including the time available to conduct the study, the monetary or personal costs the study may entail, and the specific research question being examined. The decision on what technique to use may also be influenced by the population of interest. For example, if a researcher wanted to examine gang affiliation among prison inmates, he would seemingly need access to a prison facility, but that is not necessarily the case. This chapter continues the discussion on data-gathering techniques by focusing on additional methodologies that are often described as “unobtrusive.”
There are many ways in which researchers can implement unobtrusive methodologies, or indirect methods, to answer their research questions. The most popular unobtrusive method in criminal justice research is the use of secondary data. Other techniques include content analysis (as discussed in the example at the beginning of this chapter), the use of archival records, the analyzing of physical traces, simulations, and, as was discussed in Chapter 6, the use of observation. This chapter explores each of these techniques in more depth, giving examples of how they have been used to advance the study of crime-related and other social science topics.
Being unobtrusive, or nonreactive, implies that whatever or whoever is being studied is unaware of this role as a research participant. Therefore, the information obtained will not be tainted by reactivity. As was discussed in Chapter 6, methodologies that are obtrusive, and therefore known to the study participants, can limit information gathering. For example, can you recall the Hawthorne Effect? The Hawthorne Effect represents a threat to the internal validity of a study by providing an alternative explanation for study findings. It describes how research participants may change their behavior (i.e., act unnaturally) due to their role as research participants. If such an unnatural change in behavior occurs, researchers are no longer measuring a true or natural reaction to what is being studied. Researchers therefore cannot contend that it was the study components alone that influenced the behavior of the study participants. Additionally, reactivity limits the generalizability of study findings, as such findings can only be said to be true of subjects under the same conditions (i.e., those who are being observed at the time the behavior takes place).
The term Hawthorne Effect was coined from a research study on the relationship between worker productivity and work environment, conducted at the Western Electric–Hawthorne Works factory in the early 20th century. At the time, researchers were interested in understanding whether slight changes to the workers’ environment, particularly the lighting, would alter the workers’ productivity. Study findings revealed that there were increases in productivity during the time of the experiments (even when lighting conditions deteriorated); however, these were later attributed not to the change in environment but rather to the perceived interest in and attention being received by the workers (to read more about the Hawthorne Effect, visit the Harvard Business School’s Baker Library Historical Collections website). The Hawthorne Effect, and the research on which it is based, continues to be discussed relative to research methods as well as management in the workplace. Interestingly, in 2009, a group of researchers at the University of Chicago reanalyzed the original Hawthorne Works factory data, finding that the original results may have been overstated and that factors other than reactivity may have influenced productivity among the workers being observed.2 While these new findings have come to light, reactivity continues to be recognized as a limitation of research involving human subjects.
As another example of how reactivity may affect researcher findings, let’s say a research team wants to examine a program for discipline-problem students. To resolve discipline problems at school, a special program was implemented in which discipline-problem children were taken from their normal classroom and put in an environment where there were fewer students, more one-on-one attention from the teacher, and a specialized learning program. While researchers may be interested specifically in changes in the children’s behavior based on the program curriculum, it could be that such behavioral changes were due instead to the extra attention the children were receiving from the teacher and/or from the researchers conducting the program evaluation. Therefore, reactivity may limit the researchers’ ability to say with certainty that changes in student performance and behavior were due solely to the program curriculum. Additionally, let’s say researchers want to examine how patients in a doctor’s office react to longer-than-usual wait times. Under normal circumstances, patients may get agitated and respond negatively to office staff. However, if the same patients knew they were being observed as part of a study, they may respond with more patience. There are numerous situations in which an individual’s behaviors may be influenced by the presence of observers or by the simple knowledge that he is part of a research study. Think about how you would respond in a particular situation, such as the example involving the doctor’s office, if you knew you were being observed and your actions were being documented.
Advantages of Measuring from Afar Although there are disadvantages to using unobtrusive measures, there are also advantages to using them in the research process. Perhaps the greatest advantage is the ability to diminish or eliminate reactivity, or the Hawthorne Effect. Utilizing unobtrusive methods, researchers are able to observe, measure, and study subjects without them being aware they are being studied.
In certain unobtrusive methods, such as archival records and secondary data, human subjects are not directly involved in the research, and this eliminates most concerns related to reactivity. The records and data to be analyzed are already in existence, and it is the researcher’s job to analyze the information only. This can save valuable resources, including time and money, and can also allow researchers to study populations that they may not be able to otherwise. For example, if a researcher wanted to explore what the writers of the U.S. Constitution were thinking at the time of the American Revolution, they could not ask Thomas Jefferson or Benjamin Franklin personally. They could, however, look back to letters, diaries, and other written documents to better understand what was being thought and felt at the time. As another example, if researchers wanted to conduct a study of sexual assault victims but did not have access to such a population, they may look to analyze data originally collected for a different purpose by a secondary source. National Crime Victimization Survey data, for example, could be used for this purpose. The following sections will examine in more detail the many types of unobtrusive methods utilized by researchers.
Secondary Data Analysis By far the most popular unobtrusive research method in criminal justice, secondary data analysis involves obtaining and reanalyzing data that were originally collected for a different purpose. This is considered unobtrusive because researchers are utilizing data that already exist and therefore do not have to enter the lives of subjects to collect the information needed for their study.
There exist numerous sources of data that can be useful for developing criminal justice studies. All criminal justice and other government agencies collect official data, of varying extent, for records purposes. The Census Bureau collects information regarding the characteristics of U.S. residents. The Federal Bureau of Investigation (FBI) collects and compiles crime data each year from local and state law enforcement agencies for its Uniform Crime Reports and National Incident-Based Reporting System. State prison systems and county and city jail administrators collect data on their inmates. Beyond official data, the U.S. Department of Justice, through its Bureau of Justice Statistics, facilitates numerous surveys, including the National Crime Victimization Survey and surveys of inmates in jails, state correctional facilities, and federal correctional facilities. Local program staff who perform evaluations of their work may choose to make data collected for evaluation purposes available to others, as may individual researchers who have collected extensive data for research projects. When researchers receive federal funding, it may be a stipulation of the funding agreement that the data be made available for public use after a certain period of time has passed. The National Institute of Justice was the first to have such a requirement, specifying that data collected through NIJ-funded research projects must be given to NIJ once the research has concluded. These data and numerous other datasets are available to researchers for secondary use.
Application processes for obtaining original data vary greatly depending on the agency that holds the data, the type of data being requested, and other factors. Sometimes obtaining data requires only a phone call or the establishment of a contact within an agency. Other times, especially with datasets that are restricted due to the identifying information contained within them, lengthy applications and in-person meetings with agency administrators are required for consideration. Today, data archives exist that make obtaining data as easy as the click of a button. Data archives such as the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan, which houses the National Archive of Criminal Justice Data, provide access to a broad range of data that can be downloaded quickly by researchers. All that needs to be done is a keyword search to see if archives such as ICPSR hold data that may be useful to a researcher beginning a new research project.
CLASSICS IN CJ RESEARCH
The Lost Letter Technique
The same Stanley Milgram responsible for the classic Obedience to Authority study discussed in Chapter 2 is also responsible for implementing what is now referred to as the lost letter technique.3 In the 1960s, Milgram conducted studies in which letters preaddressed and stamped were dropped in pedestrian areas. These letters were preaddressed to organizations that were controversial in nature, such as the Ku Klux Klan and other white supremacist groups. The address, however, was the study site address and not the actual organization address. As a way of indirectly measuring attitudes toward these organizations, specifically acceptability, Milgram determined the rate at which these letters were returned. The letter being returned indicated that someone had taken the time to pick up the letter and send it on. The study revealed low rates of return, which Milgram believed pointed to a general lack of acceptability of these types of organizations.
In the 1990s, Stern and Faber4 introduced Milgram’s lost letter technique to the electronic age. Using e-mails, the researchers conducted two separate experiments. In the first experiment, faculty at a small college were sent one of four prewritten “lost” e-mail messages differing in tone and urgency. Of the faculty who received the “lost” messages, 19% returned the message to the sender. No message was sent on to the person to whom the e-mail was originally addressed. There was also no difference found on rates of return based on the type of message received. In the second experiment, the researchers sought to measure attitudes regarding Ross Perot, a presidential candidate at the time. Using a selection of 200 e-mails obtained from the online “white pages,” the researchers sent a variation of e-mails that included a request for assistance with fundraising for Perot’s campaign. Of the messages sent, 29% were returned. Based on a content analysis of returned messages, it was found that respondents showed either a negative or neutral attitude toward the Perot campaign. While ethical considerations and limits to generalizability abound when conducting a study using such a technique, it seems that with the aid of technology, this data-gathering tool is not yet lost to the history books.
Advantages and Disadvantages of Secondary Data The main reason secondary data analysis is employed so often by criminal justice researchers is that it can save valuable time and resources. It may allow the analysis of data from a setting that is normally restricted to researchers, such as correctional environments. For research projects that are unfunded, secondary data analysis provides a means of answering research questions that is much less costly than collecting original data. That is not to say that all data are free to researchers. There are datasets that may be difficult to obtain due to fees that must be paid for their use or due to the time and effort that must be put forth to obtain them. Another limitation of official and other existing data is that researchers have no control over the original data collection. There may be issues with the methodology of the original data collection or problems with agency recordkeeping, leading to questions of accuracy. Certain variables may not have been included in the original data collection, limiting the questions that can be answered by analyzing the existing data. There may also have been certain biases impacting the original data collection. For example, if an agency or program wanted the original study to show favorable results, it could be that data were “fudged” so that findings reflected what was desired rather than what actually existed. While the advantages of secondary data are many, these limitations should be taken into serious consideration prior to beginning a study reliant on secondary data.
WHAT RESEARCH SHOWS: IMPACTING CRIMINAL JUSTICE OPERATIONS
Geographic Information Systems (GIS)
Research utilizing Geographic Information System (GIS) technology involves the use of mapping hardware, software, and data to examine structural, social, and other area characteristics and how these relate to criminal activity, program delivery, and other criminal justice topics. GIS technology can be used to map where things are, to map quantities, to map densities, to find what is inside or nearby an area, and/or to map change.5 In the 1950s and 1960s, multiple disciplines contributed to the development of GIS; however, today most academic GIS programs are housed in geography departments on university campuses.
Digital mapping was first adopted by the federal government in the 1960s through its use by the U.S. Census Bureau.6 In the 1970s and 1980s, private vendors began to offer smaller systems, making GIS analysis affordable for state and city use and eventually for use by community and other smaller organizations. With the increased use of GIS for criminal justice applications, the National Institute of Justice (NIJ) developed a Crime Mapping Research Center, which supports research using computerized crime mapping. One NIJ-funded research project examined computerized crime mapping use among law enforcement agencies. Surveying over 2,000 law enforcement agencies, Mamalian and LaVigne (1999)7 found that the majority of agencies sampled used some form of analysis, whether to fulfill reporting requirements for the Uniform Crime Reports (73%) or to compile agency statistical reports (52%). Very few agencies, however, used computerized crime mapping (13%). Among those that did, the vast majority (91%) reported using computerized crime mapping to geocode and map offense data. As would be expected given resource allocation, larger departments (36%), those with more than 100 officers, were more likely than smaller departments (3%) to use computerized crime mapping.
GIS has also made its way into the criminal justice academic literature base. Manhein, Listi, and Leitner (2006)8 used GIS and spatial analysis to examine dumped and scattered human remains in the state of Louisiana. Medina, Siebeneck, and Hepner (2011)9 used GIS to explore patterns of terrorist incidents occurring in Iraq between 2004 and 2009. Davidson, Scholar, and Howe (2011)10 utilized GIS-based methods to determine where needle exchange programs were most needed in San Francisco and Los Angeles. As a final example, Caplan, Kennedy, and Miller (2011)11 relied upon GIS modeling to determine whether risk terrain modeling was more effective than hot spot mapping in the forecasting of shootings, finding that risk terrain modeling was significantly more accurate. As these examples show, GIS, as an unobtrusive method, can be very helpful in answering a wide variation of questions related to the investigation of crime and other behaviors. With more resources allocated to this type of analysis, the future is wide open for GIS to take root in criminal justice research.
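The basic logic behind a simple hot spot count can be sketched without any specialized GIS software. The following is a minimal illustration, assuming hypothetical incident coordinates and an arbitrary grid cell size (both invented for this example, not drawn from any real dataset), of binning point data into grid cells and identifying the densest cell:

```python
from collections import Counter

def hot_spot_grid(incidents, cell_size=0.01):
    """Bin (lat, lon) incident points into square grid cells.

    Returns a Counter mapping cell indices to incident counts.
    Real GIS packages handle projections and geocoding; this sketch
    only shows the counting step.
    """
    counts = Counter()
    for lat, lon in incidents:
        # Floor division assigns each point to a grid cell
        cell = (int(lat // cell_size), int(lon // cell_size))
        counts[cell] += 1
    return counts

# Hypothetical incident coordinates (lat, lon)
incidents = [(30.452, -91.187), (30.453, -91.188),
             (30.452, -91.186), (30.610, -91.020)]
counts = hot_spot_grid(incidents)
densest_cell, n = counts.most_common(1)[0]
print(n)  # three of the four incidents fall in one cell
```

Approaches such as risk terrain modeling build on this idea by layering additional area characteristics onto the grid rather than counting past incidents alone.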
The saying that “history repeats itself” may apply to the interest some researchers have in studying the past. Historical research is a form of secondary data analysis that involves obtaining information from historical documents and archival records to answer a research question. An abundance of information exists in libraries and other places where archives are kept, such as city, county, or state agencies, where researchers can examine past events, trends over time, and the like. With the advent of the Internet, retrieving such records has become much less cumbersome.
Archival records are not used only for examining criminal justice-related questions. Outside of our discipline, historical records have been explored by those interested in weather patterns and by those interested in environmental change. Simonton12 chronicled the use of historical data in psychological research and concluded that such applications should and will continue. He contends that “methodological advances [including the use of advanced statistical techniques]…render the historical record a far more useful source of scientific data” than may have been previously realized. One recent study13 utilizing archival records examined the combat histories of veterans seeking treatment for Post-Traumatic Stress Disorder (PTSD). For this study, researchers explored archival records detailing Vietnam combat from the U.S. National Military Personnel Records Center, finding that Vietnam-era veterans seeking treatment for PTSD may misrepresent their combat involvement, at least according to their documented military records. As relates to criminal justice, archives holding arrest records, prison records, and death row records may be analyzed. Notably, Durkheim’s influential work on suicide, from which his theory of anomie was developed, was based on archived records. Additionally, there are numerous examples of the use of historical data to examine criminal offenders, from Albini’s (1971)14 study tracing the origins of the Mafia to Sicily to Clarke’s (1982)15 study of assassins and their motivations.
While archival data have been and continue to be utilized for scientific inquiry, it is important to understand their limitations. First, historical data are only available through archives that must be “mined” for their information. In some cases, this may be easy to do, especially if the archive is online and easily searched. In other cases, accessing such data may involve traveling long distances to libraries or records offices to gather the information needed. Questions of reliability and validity also confound the use of archival data. Gidley,16 citing Scott (1990), notes that such data should be judged on four criteria: authenticity, or whether the record is genuine; credibility, or whether the record is free from distortion; representativeness, as compared to other records of its kind; and meaning, or the clarity of the evidence in question. Researchers should also take into consideration the original methods used to collect the data, as well as the intentions of the original data collectors, when using archival data. As mentioned previously, if the data were biased in some way when originally collected, they will continue to be so in their current use.
Diaries, Letters, and Autobiographies Aside from archival records, historical research may also involve the analysis of personal documents, including diaries and letters, or other accounts of life events as told by the person under study. Diaries of those who have gone before us have been examined to better understand the time or event in question and to gain insight into what the writer was like or what he or she was thinking at the time. Exhibitions of historical diaries and letters often travel from library to library. The Library of Congress allows online access to a number of such diaries, including those of George Washington and Theodore Roosevelt; however, research involving this type of data does not have to be historical. Current research endeavors may also involve diaries through what is known as the diary method. In this method, subjects are asked to keep a record of their behaviors as they relate to the study being conducted. For example, a study on substance abuse may ask a user to chronicle his or her daily use during the time the research is being conducted. A study on unemployment may ask a study subject to keep a diary of the challenges faced as he or she searches for a job.
Autobiographies have also been used to chronicle histories, many of which focus on a life of crime. Tufts’ autobiographical account of his life as a career criminal, published in 1930, was the first of its kind, although its validity is now questioned.17 Since that time, many accounts have been published, especially by former organized crime members.18 While there is much interest in the telling of these stories, readers should be aware that not everything told has been backed up by evidence; therefore, as with Tufts’ autobiography, the validity of such accounts is somewhat unknown.
The opening case study of this chapter described a research study in which media content was analyzed to answer a research question, namely, to what extent criminal justice and criminology scholars were contributing to popular cable news programs. This method is referred to as content analysis, and such analysis may be conducted using any form of mass communication, including television, newspapers, magazines, and the like. In the past, such methods relied on written material, but today television, film, and any other source of mass media (e.g., Facebook and Twitter posts) may also be utilized for purposes of content analysis. The procedural elements of content analysis as described by Berelson19 (1952) include subject selection, the development of inclusion criteria, classification, and the analysis of results. The establishment and reporting of inclusion criteria and classification schemes are important elements of content analysis because they allow a path to be followed by other raters within the project or by future researchers attempting to replicate the study findings.
Recent examples of research involving content analysis include the examination of criminal justice pioneers’ importance as measured by the length of their existing biographies,20 crime drama portrayals of “prime time justice,”21 the last words of death row inmates and news coverage of executions in Texas,22 depictions of gay and lesbian police officers in the “core cop film genre,”23 and constructions of crime and justice as portrayed in American comic books.24 While such research can be found in mainstream criminal justice and criminology journals, journals devoted to media topics, such as the Journal of Criminal Justice and Popular Culture and Crime, Media, Culture: An International Journal, have been established to focus primarily on such issues and research methodologies.
A recent contribution to the content analysis process is the development of computer software programs that can aid researchers with counts and classification, particularly when written content is being measured. Whereas content analysis was, in the past, primarily done by hand, content can now be copied into such programs and the software will conduct the counts. This allows larger amounts of content to be analyzed in a much shorter amount of time.
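A minimal sketch of this kind of automated counting, assuming a hypothetical transcript and keyword list (both invented for illustration), might look like the following:

```python
import re
from collections import Counter

def keyword_counts(text, keywords):
    """Tally how often each coding keyword appears in a transcript."""
    # Lowercase and split into words so counts ignore case and punctuation
    words = re.findall(r"[a-z']+", text.lower())
    tally = Counter(words)
    return {kw: tally[kw] for kw in keywords}

# Hypothetical transcript excerpt
transcript = ("Tonight we discuss crime control and crime causation. "
              "Experts debate whether rehabilitation reduces crime.")
print(keyword_counts(transcript, ["crime", "causation", "rehabilitation"]))
# {'crime': 3, 'causation': 1, 'rehabilitation': 1}
```

Dedicated content analysis packages add features such as phrase matching, stemming, and dictionaries of coded categories, but the underlying counting step resembles this sketch.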
Issues most discussed regarding content analysis are those of reliability and validity. That is, ratings, counts, or classifications may be considered subjective and may therefore differ from person to person. There are, however, ways to increase the reliability and validity of such research findings. For example, it is important to have clearly established inclusion criteria and classification schemes for others to follow; this makes for a more objective system of classification. Additionally, many research projects involve multiple raters. Once each rater has completed his or her ratings, a ratio is established to determine agreement among the raters. This ratio measures interrater reliability. Such an analysis is generally included in the study findings of content analysis research involving more than one rater.
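To make the idea of an agreement ratio concrete, the sketch below computes simple percent agreement and Cohen's kappa (one commonly used chance-corrected interrater statistic) for two hypothetical raters' codes; the codes themselves are invented for illustration:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of items on which the two raters assigned the same code."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for the agreement expected by chance."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected agreement if raters coded independently at their observed rates
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codes for ten segments ("C" = crime segment, "N" = non-crime)
rater1 = ["C", "C", "N", "C", "N", "N", "C", "C", "N", "C"]
rater2 = ["C", "C", "N", "N", "N", "N", "C", "C", "C", "C"]
print(percent_agreement(rater1, rater2))  # 0.8
```

Here the raters agree on 8 of 10 segments; kappa is lower than 0.8 because some of that agreement would be expected by chance alone.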
Meta-analysis is a type of content analysis in which researchers review, organize, integrate, and summarize the existing research literature on a certain topic. Researchers conducting meta-analyses gather existing quantitative information on studies that have been conducted in the past in order to compare their methodologies and findings. In the 1970s, Glass25 was the first to coin this term and to describe the quantitative process, which utilizes statistical methodologies to code, analyze, and interpret the similarities and differences found among the literature relating to a certain research question. Most commonly, this method has been used to assess the effectiveness of certain interventions, such as correctional boot camp programs, drug treatment programs, and the like.26 Although most of this research developed out of the field of psychology, there has been an increase in the use of meta-analysis as a method for researching criminal justice topics. Two recent academic journal articles27 have addressed this increase, the methods best suited to it, and the usefulness of meta-analysis for the field, finding that, while such analyses are time consuming and labor intensive, they are, as Pratt (2010) states, “a welcome addition to the criminologists’ toolbox” (p. 165).
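As a rough illustration of the quantitative core of meta-analysis, the sketch below pools hypothetical effect sizes from several studies using inverse-variance weighting, one common fixed-effect approach; the numbers are invented for illustration, not drawn from real evaluations:

```python
def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) mean effect size.

    Studies with smaller variances (more precise estimates) receive
    larger weights in the pooled estimate.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled, pooled_var

# Hypothetical standardized mean differences from three program evaluations
effects = [0.30, 0.10, 0.25]
variances = [0.02, 0.05, 0.04]
est, var = pooled_effect(effects, variances)
print(round(est, 3))  # 0.245
```

A full meta-analysis adds steps this sketch omits, such as coding study characteristics, testing for heterogeneity across studies, and choosing between fixed-effect and random-effects models.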
Physical Trace Analysis “Crime Scene Investigation,” need we say more? Of course we do, although you are probably most familiar with this type of unobtrusive method from courses involving crime scene investigation or the many television shows that devote time to such information-gathering techniques. For example, if a law enforcement officer were examining a crime scene in which a sexual assault was alleged to have taken place, what would she look for? The officer would most likely be examining the scene for semen, blood, contraceptive materials, and/or other substances that may have been left behind from such an incident. Physical trace analysis is similar and refers to the examination of physical substances that have been created and left by individuals as they come in contact with their environment. As with other unobtrusive methods, physical trace analysis represents an indirect method of measuring certain phenomena.
Examples of the use of physical trace analysis for social science research include the study of museum exhibit popularity by examining wear on the floor attributed to heavy foot traffic28 and the determination of crowd size as estimated by trash accumulation after a social event has taken place.29 As another example using garbage, researchers may attempt to measure how popular certain establishments are by examining how much trash accumulates on a given night, or how many homeless persons stay under a certain highway bridge by examining garbage or other waste left behind. It would be assumed that a bar with more trash was more heavily frequented the night before than a bar with less trash, and that a highway bridge with more trash accumulated beneath it is a more popular stopping-over point for transient populations than one with little trash or waste. Such an examination could also be conducted using public restrooms. If a university administrator were interested in adding restrooms to campus and needed to know where these additional restrooms would be most utilized, how could this be measured? Once again, looking at trash accumulation or “wear and tear” on doors, floors, and toilets would be an indirect way to answer such a question. If you were interested in exploring where most students sit in a given classroom, what kinds of things would you look for as you examined the classroom after the class had ended? Perhaps you would look at the disarray of the seats or tables. You might also look at where trash has accumulated in the classroom or whether there is writing on tables where there had been none before. These would also be indirect ways to measure which sections of classroom seating are most used, and such observations could be made once or multiple times after the class has ended. Furthermore, the rise of technology, particularly the Internet, has opened new avenues for unobtrusive research.
For example, visits to a webpage could indicate popularity of that website. Celebrity popularity is often measured by the number of times that person’s name is entered into an online search engine.
Researchers have also been able to apply analysis of physical traces to the study of crime and deviance. For example, studies of vandalism in certain neighborhoods and graffiti in public areas have been conducted to indicate the presence of lawbreakers and/or subcultures or gang activity.30 Additionally, the sale of burglar alarms or other home protection devices has been used as a proxy measure for fear of crime.31 That is, if there is an increase in the installation of burglar alarms, it is assumed that fear of crime in that area is on the rise. With the use of the Internet, the popularity of pornography sites, particularly those that are unlawful due to posting pornographic content involving children, can be measured. Additionally, for purposes of investigation, electronic physical traces (e.g., IP addresses) can be tracked as a means to find out who is viewing these websites or from which computers they are being viewed.
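As a rough illustration of how such electronic traces can be counted, the sketch below tallies page visits and distinct visiting addresses from a few invented, simplified log lines. Real web-server logs carry many more fields, and the IP addresses shown are reserved documentation addresses, not real visitors.

```python
from collections import defaultdict

# Hypothetical, simplified log lines in the form "IP  requested-page".
# Page hits and originating addresses are electronic physical traces
# left behind as users come in contact with a website.
log_lines = [
    "203.0.113.5  /gallery",
    "203.0.113.5  /gallery",
    "198.51.100.7 /gallery",
    "198.51.100.9 /contact",
]

visits = defaultdict(int)    # total hits per page (a popularity measure)
visitors = defaultdict(set)  # distinct addresses per page (a reach measure)
for line in log_lines:
    ip, page = line.split()
    visits[page] += 1
    visitors[page].add(ip)

for page in sorted(visits):
    print(f"{page}: {visits[page]} visits from {len(visitors[page])} addresses")
```

As with any trace measure, the counts are indirect: repeat visits from one person, shared computers, and automated traffic all complicate the inference from traces to people.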
One important thing to remember about physical trace analysis is that the resulting evidence is not direct; these are indirect measures of phenomena. Just because a law enforcement officer finds semen and blood at a crime scene, it does not automatically prove that a crime has occurred. Also, just because the sale of burglar alarms increases, it does not necessarily indicate that fear of crime is also on the rise. Alarm companies could have dropped their fees, leading to an increase in sales. Therefore, while interesting, such measures are generally seen as being inferentially weak. When using such methods, it is always important to triangulate, or attempt to validate your findings through the use of other measures when possible. The more evidence you have, the more secure you can be that your findings are indeed measuring what you propose they are measuring. A final issue with the collection of physical trace data is an individual’s right to privacy; however, if information is public, the collection of that information would not involve a violation of that right.
In Chapter 6, you learned about the many variations of participant-observation research. On the spectrum of intrusiveness, complete observation is the least intrusive. In conducting research as a complete observer, a researcher only observes the individuals and behaviors under study. The researcher makes no advances or attempts to be involved with the individual(s) or to change the natural chain of events. He or she is only there to witness what occurs. As an example of complete observation, you may be wondering how often store patrons park in a handicap parking spot without the proper tags on their vehicle. To answer this question, you may sit near the parking spot in question and observe how many vehicles park there illegally. You are not interacting with the store patrons, nor are you interfering with their decision to park in the handicap parking spot. You are merely observing their behavior. Acting as a complete observer is quite similar to being on a stake-out. A limitation is that you could be sitting for some time waiting for events to occur as you have no control over when or how things may happen. For example, if you were observing the handicap parking spot, you may sit there for hours or even days without anyone actually parking there.
In disguised observation, the researcher may actually take part in the behavior or group under study. As a disguised observer, the researcher’s identity and purpose for being there are hidden from the group, making the gathering of information unobtrusive. Because the subjects do not know they are being studied, the researcher is able to observe individuals in their natural environment. As was discussed in Chapter 6, there are numerous examples of disguised observation in criminal justice research. If you remember the case study from Chapter 2, Humphreys acted as a disguised observer when researching the phenomenon of anonymous sex in public places.
While these examples exist, disguised observation is not without its critics. Although the ability to observe individuals in their natural environment without the researcher’s presence being known is what makes this data collection method so valuable, some disagree with its use due to privacy concerns. Because subjects are not asked for permission before being observed for research purposes, an invasion of their privacy has occurred. Another issue, as mentioned previously, is the waiting that an observer, even a disguised observer, must do during observations. To circumvent this, actors, referred to as confederates, have been employed to hasten the events of interest. As is done for the ABC Primetime television show “What Would You Do?”, actors play out a scenario in the hopes of eliciting a response from those around them. John Quiñones, the host of the show, watches the scene from afar, as do hidden cameras. After the event takes place, he moves forward to interview those who either did or did not react to the event that unfolded in front of them. Scenes of domestic violence, drunk driving, and racism have all been staged by the television show, with varied responses from the individuals present. Researchers have also examined such topics through the use of confederates. For example, Formby and Smykla (1984) studied pedestrians’ reactions to a student actor’s attempt to drive under the influence. To set the scene, the actor, smelling of alcohol, stumbled to his car and pretended to have difficulty opening the car door. Amazingly, over 60% of the pedestrians who passed helped the “drunk” driver open his car door. While the use of confederates is helpful to researchers, there are instances where public reaction to the event being observed can be dangerous to the research team, particularly the confederates. For example, in the case of a staged domestic violence altercation, a passerby may respond physically to the person acting as the abuser.
Researchers never truly know how people will respond, which makes this type of research exciting, albeit risky.
RESEARCH IN THE NEWS
Mock Jury Decides Scott Peterson Case
On Christmas Eve in 2002 the disappearance of Laci Peterson, a young pregnant woman in Modesto, California, sparked a media frenzy.32 Laci, who was nearing eight months into her pregnancy, was married to Scott Peterson, whom she had met in college and wed in 1997. The night before she disappeared, Laci was seen by her mother and sister. The following morning there was no sign of her. Her car was left parked in the driveway, and her purse, keys, and cell phone were found on the kitchen table inside the home she shared with Scott. The day Laci disappeared, her husband was fishing and did not return until that evening, which was when he called Laci’s family to see if she was with them. That evening the search began, and it did not conclude until early April 2003, when a male fetus and the badly decomposed body of a recently pregnant woman were discovered days apart on the San Francisco Bay shore, north of Berkeley. Investigation into their marriage revealed Scott’s multiple extramarital affairs and other unseemly behavior. He was arrested on April 18, 2003, in southern California. With an altered appearance and carrying thousands of dollars in cash, he looked as if he were planning to leave the country. His trial began in June 2004.
In October 2004, prior to the end of Peterson’s trial, CBS aired an episode of 48 Hours Mystery entitled, “On the Verge of a Verdict,” for which a jury consultant team was asked to put together a mock jury similar to the jury hearing the Scott Peterson case.33 Using the mock jury, a study was conducted to see how individuals resembling the actual jurors in the case might decide the outcome. The mock jurors were shown the same evidence presented by the prosecution and defense at trial, which included the infamous conversations recorded between Scott Peterson and Amber Frey, the woman involved with Scott at the time of Laci’s disappearance. After hearing and discussing the evidence, as an actual jury would, the mock jury deadlocked, with all but two of the mock jurors deciding Scott Peterson was guilty of murdering his wife and unborn son. Following this outcome, the mock jury shared that they thought the actual jury would also end in a deadlock. This was not the case, however. On November 12, 2004, the actual jury convicted Scott Peterson following only eight hours of deliberation. He was later sentenced to death and is currently awaiting execution in California’s San Quentin State Prison.
The use of simulation in criminal justice research has, for the most part, involved studies of jury behavior. Because it is difficult, if not impossible, to study juror behavior in real time, mock juries have been convened using mostly community and college student samples. This type of research began with the Chicago Jury Project of the 1950s.34 Since that time, mock juries and mock trials have become commonplace and, beyond research, are often used as educational tools for criminal justice students. Simulations can be defined as artificial research settings that have been carefully created so that their features mimic reality as much as possible. The two most prominent research projects involving simulation were discussed in Chapter 2: Zimbardo’s Stanford Prison Experiment and Milgram’s study on Obedience to Authority. Zimbardo recreated the prison environment for his study of inmate and correctional officer behavior, and Milgram set up an artificial “learning” experiment in which the “teacher,” or person under study, was directed to give shocks at increasingly high levels when wrong answers were given. While these two studies are remembered most for the psychological toll exerted on their participants, the subjects’ nonartificial responses were due in large part to the realistic nature of the simulations.
With the advent of new technologies, it is possible that simulations will be utilized more often for research. Advanced simulation setups are already utilized for criminal justice training, and their use could be extended for research purposes. For example, the Incident Command Simulation Training (InCoSiT) Program35 is facilitated at the Bill Blackwood Law Enforcement Management Institute at Sam Houston State University. This program mimics a command center during a crisis event, and those being trained use the simulation to practice what they would do during such an event. While this program is currently used for training, there are research applications that could also make use of such a simulated scenario. For example, if a researcher wanted to examine gender differences in crisis command response, InCoSiT would provide a platform for that study to be conducted.
RESEARCH IN THE NEWS
“Lost and Found” in the Archives
There exist numerous sources of archival records in the United States and abroad. These are available for researchers and other interested parties to search, and interesting and surprising finds often result. To increase awareness of archives and archival records research, the Society of American Archivists36 holds a national “I Found It in the Archives!” competition where archive users are able to submit their experiences for review. Archie Rison, the 2011 winner of the competition, relied on archives at the Stephen F. Austin State University East Texas Research Center to conduct extensive genealogy research on his family. Recent news stories have also related interesting archival finds. One story revealed how heroin was found in a United Kingdom National Archives file.37 The heroin, which was part of a 1928 court case, was found by a citizen who had requested the file to review. When the substance was found, it was sent off for analysis, which confirmed the presence of heroin. After being handed over to the Metropolitan Police, the package containing the heroin was replaced with a picture in the archive folder. Jeff James, the UK National Archives Director of Operations, noted that, while extremely rare, finds like this are sometimes unexpectedly made in their collection of 11 million records. With archives such as these existing across the globe, it is only a matter of time before another fascinating discovery is made.
As this chapter has shown, there are many ways to avoid reactivity in gathering data for a research project. Whether you obtain data from an online archive, use GIS mapping, or observe others from a distance, by being unobtrusive you can generally ensure that reactivity is not a limitation to your research findings. While these methods have numerous advantages, including savings in time and cost and access to restricted populations, they are not without their shortcomings. The biggest limitation of unobtrusive methods is that measurements are often indirect and therefore should be substantiated by other measures of the same phenomenon. Additional shortcomings include the lack of control over original data collection in the use of secondary data and the chance that obtaining data may not in fact be quick or without cost. In a nutshell, unobtrusive methods offer another way of gathering data. In some situations, they may be the only option available to a researcher short of doing nothing. Whether such methods are the best choice must be determined by individual researchers based on the needs of their research projects.
Critical Thinking Questions
1. What is reactivity, and how do unobtrusive methods serve to decrease this?
2. Why is secondary data analysis a popular option for criminal justice researchers?
3. What is physical trace analysis, and how has it been used for criminal justice research?
4. What are the advantages and disadvantages to being a disguised observer?
5. Explain how technology has advanced unobtrusive data-gathering techniques. Give at least three examples.
archives: A place, either physical or electronic, where records and other data are stored
content analysis: A method requiring the analyzing of content contained in mass communication outlets such as newspapers, television, magazines, and the like
diary method: A data-gathering technique that asks research subjects to keep a diary, or written record, of their time participating in the research study
disguised observation: A researcher joins the group under study to observe their behavior but does not reveal his or her identity as a researcher or purpose for being there
interrater reliability: A ratio established to determine agreement in a content analysis with multiple raters
meta-analysis: A type of content analysis in which researchers quantitatively review, organize, integrate, and summarize the existing research literature on a certain topic
physical trace analysis: The examination of physical substances that have been created and left by individuals as they come in contact with their environment
secondary data analysis: Occurs when researchers obtain and reanalyze data that were originally collected for a different purpose
simulations: Artificial research settings that have been carefully created so that their features mimic reality as much as possible
unobtrusive: A method that is nonreactive; indicates that what or who is being studied is unaware of its/their role as research participant
1 Frost, N. A., & N. D. Phillips. (2011). Talking heads: Crime reporting on cable news. Justice Quarterly, 28 (1), 87–112.
2 The Economist. (June 4, 2009). Questioning the Hawthorne Effect—Light work: Being watched may not affect behaviour, after all. Retrieved online, http://www.economist.com/node/13788427.
3 Milgram, S. (1977). The individual in a social world. New York: McGraw-Hill.
4 Stern, S. E., & J. E. Faber. (1997). “The lost e-mail method: Milgram’s lost-letter technique in the age of the Internet.” Behavior Research Methods, Instruments & Computers, 29 (2), 260–263.
5 See ESRI website, http://www.gis.com/
6 Coppock, J. T., & D. W. Rhind. (1991). “The history of GIS.” In Maguire, D. J., M. F. Goodchild, & D. W. Rhind (eds.) Geographical Information Systems: Principles and Applications (Volume 1, pgs. 21–43). Harlow, Essex, England: Longman Scientific & Technical.
7 Mamalian, C. A., & N. G. LaVigne. (1999). The use of computerized crime mapping by law enforcement: Survey results. U.S. Department of Justice—National Institute of Justice. Retrieved from http://www.nij.gov/pubs-sum/fs000237.htm.
8 Manhein, M. H., G. A. Listi, & M. Leitner. (2006). “The application of geographic information systems and spatial analysis to assess dumped and subsequently scattered human remains.” Journal of Forensic Sciences, 51(3), 469–474.
9 Medina, R. M., L. K. Siebeneck, & G. F. Hepner. (2011). “A geographic information systems (GIS) analysis of spatiotemporal patterns of terrorist incidents in Iraq 2004–2009.” Studies in Conflict & Terrorism, 34(11), 862–882.
10 Davidson, P. J., S. Scholar, & M. Howe. (2011). “A GIS-based methodology for improving needle exchange service delivery.” International Journal of Drug Policy, 22(2), 140–144.
11 Caplan, J. M., L. W. Kennedy, & J. Miller. (2011). “Risk terrain modeling: Brokering criminological theory and GIS methods for crime forecasting.” Justice Quarterly, 28(2), 360–381.
12 Simonton, D. K. (2003). “Qualitative and quantitative analyses of historical data.” Annual Review of Psychology, 54, 617–640.
13 Frueh, B. C., J. D. Elhai, A. L. Grubaugh, J. Monnier, T. B. Kashdan, J. A. Sauvageot, M. B. Hamner, B. G. Burkett, & G. W. Arana. (2005). “Documented combat exposure of U.S. veterans seeking treatment for combat-related post-traumatic stress disorder.” The British Journal of Psychiatry, 186, 467–472.
14 Albini, J. (1971). The American mafia: Genesis of a legend. New York: Appleton.
15 Clarke, J. W. (1982). American assassins: The darker side of politics. Princeton, N.J.: Princeton University Press.
16 Gidley, B. (2004). “Doing historical and archival research.” In C. Seale (ed.), Researching Society and Culture (249–264). Thousand Oaks, CA: Sage.
17 Tufts, H. (1930). The autobiography of a criminal. Upper Saddle River, NJ: Pearson.
18 See for example, Maas, P. (1968). The Valachi Papers. New York: Bantam Books; Teresa, V., with T. C. Renner. (1973). My life in the Mafia. Greenwich, CT: Fawcett; Pileggi, N. (1985). Wiseguy: Life in a Mafia family. New York: Simon and Schuster.
19 Berelson, B. (1952). Content analysis in communication research. New York: Free Press.
20 Ross, L. E. (2008). “Criminal justice pioneers: A content analysis of biographical data.” Journal of Criminal Justice, 36(2), 182–189.
21 Eschholz, S., M. Mallard, & S. Flynn. (2004). “Images of prime time justice: A content analysis of ‘NYPD Blue’ and ‘Law & Order.’” Journal of Criminal Justice and Popular Culture, 10(3), 161–180.
22 Malone, D. F. (2006). Dead men talking: Content analysis of prisoners’ last words, innocence claims and news coverage from Texas’ death row. A Master’s thesis completed for the Department of Journalism at the University of North Texas.
23 Wilson, F. T., D. R. Longmire, & W. Swymeler. (2009). “The absence of gay and lesbian police officer depictions in the first three decades of the core cop film genre: Moving towards a cultivation theory perspective.” Journal of Criminal Justice and Popular Culture, 16(1), 27–39.
24 Phillips, N. D., & S. Strobl. (2006). “Cultural criminology and kryptonite: Apocalyptic and retributive constructions of crime and justice in comic books.” Crime, Media, Culture: An International Journal, 2(3), 304–331.
25 Glass, G. V. (1976). “Primary, secondary, and meta-analysis of research.” Educational Researcher, 5(10), 3–8.
26 Smith, P., P. Gendreau, & K. Swartz. (2009). “Validating the principles of effective intervention: A systematic review of the contributions of meta-analysis in the field of corrections.” Victims & Offenders, 4(2), 148–169.
27 Pratt, T. C. (2010). “Meta-analysis in criminal justice and criminology: What it is, when it’s useful, and what to watch out for.” Journal of Criminal Justice Education, 21(2), 152–168; Wells, E. (2009). “Uses of meta-analysis in criminal justice research: A quantitative review.” Justice Quarterly, 26(2), 268–294.
28 Webb, E. J. (1966). Unobtrusive measures: Nonreactive research in the social sciences. Chicago: Rand McNally.
29 Webb, E. J., D. T. Campbell, R. D. Schwartz, L. Sechrest, & J. B. Grove. (1981). Nonreactive measures in the social sciences (2nd ed.). Boston: Houghton Mifflin.
30 Klofas, J., & C. Cutshall. (1985). “Unobtrusive research methods in criminal justice: Using graffiti in the reconstruction of institutional cultures.” Journal of Research in Crime and Delinquency, 22(4), 355–373; Sechrest, L., & A. K. Olson. (1971). “Graffiti in four types of institutions of higher education.” Journal of Sex Research, 7, 62–71.
31 Clinard, M. B., & R. Quinney. (1973). Criminal behavior systems: A typology (2nd ed.). New York: Holt.
32 The Modesto Bee. The Peterson Case. See website, http://www.modbee.com/peterson.
33 Klug, R. (October 30, 2004). On the verge of a verdict. In S. Zirinsky, 48 Hours Mystery [Season 18, Episode 7]. Los Angeles, CA: CBS.
34 Bornstein, B. H. (1999). “The ecological validity of jury simulations: Is the jury still out?” Law and Human Behavior, 23(1), 75–91.
35 See InCoSiT website, http://www.incosit.org/
36 See the Society of American Archivists website, http://www2.archivists.org/initiatives/i-found-it-in-the-archives/i-found-it-in-the-archives-2011-national-competition
37 BBC News. (December 19, 2011). “Heroin found in national archives file.”