Society for Risk Analysis Annual Meeting “Empires of Risk Analysis: Science, Policy, and Innovation”

Annual Meeting Abstracts

Crystal Gateway Marriott, Arlington, Virginia, USA 6-10 December 2015

W5-D.3 Aaltonen, MA; Virebit Oy, Riihimaki, Finland; [email protected] Implementation of the National Contingency Plan As part of the program “Environmental Dimension of Sustainable Development” under the framework agreement between Brazil and the EU, a report was prepared on regulations concerning actions to prevent the risks of spills of oil and other hazardous chemicals into waters. The report serves as background material for the implementation of the National Contingency Plan concerning preparedness, prevention and response to oil and chemical spills in waters, and provides information about experiences within the EU in responding to oil and chemical spill incidents. The report contains an overview of the EU directives and regulations related to nature conservation, marine environment protection, oil and chemical spill prevention and response, obligations for offshore installations to protect the environment, and regulations concerning the equipment and methods used in oil and chemical spill prevention and response. In addition, legislation and practices in five European countries (Finland, Germany, Norway, the UK and Portugal) are presented. An overview is included of global, regional and national databases and monitoring systems for early identification of incidents and for statistics, as well as for monitoring the movements of ships. Practical instructions about the contents of a contingency plan for ports are presented in the report. Legislation in the member states is harmonised through the mechanism whereby the European Parliament and Council issue directives as the basis for national legislation, with the obligation that each country incorporate the directives’ contents into its own legislation by a given date. The “polluter pays” principle is applied in cases of oil and chemical spill incidents. When a major accident occurs and local measures are not sufficient, it may be necessary for a governmental organisation to intervene to minimise damage to the environment. In such cases the compensation of costs is settled after the incident, and support may be available from various funds created for such purposes. Cross-border cooperation is also discussed in the report.

M3-C.1 Abelkop, A*; Richards, K; Indiana University School of Public and Environmental Affairs; [email protected] Policy Chemistry: Comparing the Choice of Policy Instruments for Managing Chemical Risks in the US, Canada, and the EU There are no panaceas for addressing risks to public health and the environment. An incomplete list of policy instruments includes taxes, subsidies, marketable allowances, command and control mandates, licensing schemes, information and labeling requirements, and the entitlement of property and liability rights. A branch of the environmental policy field is devoted to examining policy instrument choice to identify the approaches that are best suited to address particular environmental and public health issues. This paper will be among the first to apply a systematic instrument choice framework to compare the mix of policy instruments applied to managing chemical risks in the United States, Canada, and the European Union. The framework, which can be applied to any environmental or public health issue, takes a taxonomic approach to identify the mixes of policy instruments that are best suited to various types of market failures. Applying such a framework to chemical risk management is particularly timely. The United States Congress is seriously considering revision of the Toxic Substances Control Act of 1976. Additionally, Canada’s Chemicals Management Plan and the EU’s Registration, Evaluation, Authorisation, and Restriction of Chemicals (REACH) Regulation are global models for risk assessment and management policy. The comparative literature on these policies often concludes, or merely supposes, that the approaches taken by the U.S., Canada, and the EU differ greatly (e.g., Abelkop and Graham 2015, Renn and Elliott 2011, Applegate 2008, Denison 2007). However, we demonstrate that there is actually very little variation in policy instrument choice across these three jurisdictions. The variation that exists is in how the instruments are applied and implemented. We conclude by suggesting how alternative policy instruments (including market-based instruments) might be applied in the context of chemical risk assessment and management.

W2-H.3 Abualfaraj, N*; Gurian, PL; Olson, MS; Drexel University; [email protected] Statistical analysis of compliance violations for natural gas wells in Pennsylvania Regulatory inspection and violation reports may provide some insight into the impact of natural gas extraction on the surrounding environment, human health, and public safety. Inspection and violation reports for natural gas wells in the state of Pennsylvania were collected from the Pennsylvania DEP Compliance Report from January 1, 2000 to August 30, 2014. Analysis of 215,444 inspection records for 70,043 conventional and unconventional wells was conducted in order to compare the odds of violations occurring under different circumstances. Logistic regression models were used to estimate the probability of violations occurring for both conventional and unconventional wells. Generally, violations at conventional wells have 52.4% [p < 0.005] higher odds of occurring than at unconventional wells. However, unconventional wells have 73% [p < 0.005] higher odds of environmental violations related to water contamination. In addition, large operators (operators with more than 100 active wells) had 43.1% [p < 0.005] lower odds of having any type of violation than smaller operators. While larger operators seem to have generally safer practices, a few of the largest companies (companies that have the highest number of active wells) had violation rates much higher than the average for all operators, with some reaching rates as high as 1 in 4 active wells. Violations also have 7% [p < 0.005] lower odds of occurring for each additional year that a well has been in operation.
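
A minimal sketch of the kind of analysis described, using statsmodels; the file name and predictor columns are hypothetical stand-ins for the DEP compliance data, and the coefficients would only reproduce the reported odds ratios with the real dataset.

```python
# Sketch: estimating violation odds ratios with logistic regression.
# File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("pa_dep_inspections.csv")  # hypothetical input file

y = df["violation"]  # 1 if the inspection recorded a violation
X = sm.add_constant(df[["is_conventional", "large_operator",
                        "well_age_years"]])

model = sm.Logit(y, X).fit()

# Exponentiated coefficients are odds ratios; e.g. an OR of about 1.52
# for is_conventional would match the ~52.4% higher odds reported.
print(np.exp(model.params))
print(model.conf_int().apply(np.exp))  # 95% CIs on the odds-ratio scale
```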

W1-D.1 Ackerman, GA; University of Maryland; [email protected] A Multi-Method Approach to Assessing Radiological/Nuclear Terrorism Threats: Identifying the Adversary Although radiological and nuclear (RN) terrorism will likely remain a rare phenomenon for the foreseeable future, its potentially far-reaching consequences demand efforts to prevent or prepare for this risk. Beyond plans to mitigate the consequences, and given that the vast majority of terrorists lack the motivation and capability to engage in RN terrorism, focusing limited law enforcement and intelligence resources on the greatest threats is an efficient means of reducing risk. However, prioritizing certain actors for greater scrutiny is a difficult threat assessment problem in that there is – thankfully – no empirical sample of nuclear or large-scale radiological terrorism from which to extrapolate. Even if there were, the highly dynamic geopolitical and technical environment and often obscure motivations of adversaries compound the standard epistemological and practical impediments to threat assessment. Amidst such obstacles, no single risk analysis method, each of which has its strengths and weaknesses, is likely to yield an adequate assessment. A large threat study will be described that presents a coordinated approach to identifying the most likely RN non-state adversaries within a ten-year forecast period. More than merely an ensemble of algorithms, this multi-method approach examines the threat from three fundamentally different, but complementary, perspectives: 1) a qualitative threat analysis informed by deep domain knowledge of actor motivations, capabilities and opportunities; 2) statistical simulation involving the creative use of proxy data on related behaviors; and 3) both semi-structured and probabilistic elicitation of heterogeneous groups of experts. Discussing both the methodology and the findings, it will be argued that the use of multiple approaches, carried out relatively independently and subsequently combined, can maximize inferential leverage and mitigate the analytical limitations of single approaches to threat assessment.


W2-E.2 Adeleye, AS*; Rutten, P; Keller, AA; University of California, Santa Barbara, California 93106-5131, United States and University of California's Center for Environmental Implications of Nanotechnology, Santa Barbara, California, United States; [email protected] Adsorption of algal extracellular polymeric substances to TiO2 nanoparticles: Effects on surface properties and fate of nanoparticles Implications of engineered nanomaterials (ENMs) in the natural environment are often investigated using pristine particles. However, there are several biogenic and geogenic materials in natural waters that can interact with and modify the surface of ENMs, thereby affecting their fate and effects. In this study, the effects of soluble extracellular polymeric substances, or sEPS (produced by freshwater and marine algae), on the surface properties and fate of three titanium dioxide nanomaterials (nano-TiO2) were studied. Interactions between sEPS and nano-TiO2 were studied via electrophoretic light scattering and infrared spectroscopy. Electrophoretic and dynamic light scattering techniques were used to monitor changes in surface charge and aggregation kinetics of nano-TiO2, respectively. sEPS adsorbed to the surface of the nano-TiO2 via electrostatic interactions as well as chemical bonding, which involved the carboxylic groups of sEPS proteins. Phosphate groups in nucleic acids or phospholipids of sEPS also mediated interactions with the surface of the ENMs. The adsorption of sEPS was dependent on particle size, intrinsic surface charge, and hydrophobicity. Charge reversal of positively charged nano-TiO2 was observed at pH 7 in the presence of 0.5 mg-C/L sEPS. Moreover, the critical coagulation concentration of nano-TiO2, a measure of their stability in aqueous media, increased in the presence of sEPS. This study shows that ENMs will interact with sEPS in natural waters, which may alter their fate and effects in ways that cannot be predicted using pristine ENMs.

M2-H.3 Aerni, SJ*; Patel, CJ; Pivotal Software, Harvard Medical School; [email protected] Examining The Need And Value Of Data Aggregation And Sharing Toward Food Exposure Measurements: Data Science Modeling, Tools And Approaches An individual’s health state is the result of a complex interplay of both internal and external influences, including genetic predispositions, lifestyle and environmental exposures. The growing availability of biomedical data on patients is driven by a shift in the way these data are captured and shared digitally using new technologies. Data on lifestyle and environmental exposures, historically measured at the population level during studies or sparsely captured during doctor’s visits, are also undergoing such a shift through social networks and quantified-self devices. These tools allow the community to gain a better, real-time understanding of populations from passively collected data, free of the misrepresentation and recall biases common in patient-reported information. However, the understanding of environmental exposures often lacks such rich datasets. In food safety, for example, data are regarded as highly valuable and a source of competitive advantage, and are therefore not shared. This limits the use of point-of-sale data, credit card transactions and nutrition trackers, which could give better insight into true levels of exposure. In other cases, studies examine the effects of single environmental agents and ignore the complex interplay of an individual’s characteristics and comorbidities, offering an unrealistic view of the human experience. With the availability of scalable platforms, we are now able to ask many more questions and to examine more complex relationships between environmental exposure and disease. In this talk we will discuss the promise of integrating datasets, the technologies that are required and the types of insights that can be derived by using big data approaches. In particular we will present results from examining the complex interdependencies between thousands of environmental agents (e.g., infectious agents, biomarkers of pollutants, and nutrient exposure) in datasets representative of the US population.

M3-D.2 Agrawal, S*; Gernand, JM; Pennsylvania State University; [email protected] Risk based decision making for fracturing proppant selection Selection of the best proppant for hydraulic fracturing is an important factor in achieving higher production. Some of these proppants are made up of chemicals like silica (quartz sand), alumina, resin coated silica, titanium dioxide, iron-titanium oxide, kaolinite, iron oxide, boron oxide, phenol formaldehyde resin, and others. These materials and chemicals can be toxic to varying degrees and lead to health problems in the employees handling them. Factors affecting the selection of proppants are the closure stress of the reservoir, bottom hole temperature, permeability contrast between the pay zone and the infilled fracture, and the composition of formation fluid in the wells. With increased well depth, several types of proppants have been developed to meet the formation characteristics for achieving higher production. Substantial research has been conducted to determine the effect of silica on human health related to hydraulic fracturing and similar processes, but very little research has been done to determine the risk associated with using these alternative proppants. This study focuses on the risks or benefits to human health of these proppants through a multi-criteria decision analysis of proppant selection that takes health hazards into consideration as well as functional attributes. Different commercially available proppants were surveyed for content, functional properties, and cost. The health impacts were ascertained from a review of toxicological and epidemiological studies on exposure to fine and ultrafine particulates and recommended exposure limits for chemicals. The multi-attribute decision model, including effects on production, cost, and risk tolerance for health impacts, reveals an optimal proppant selection strategy for different geological conditions. Results show that the most commonly used proppant, silica, should more often be replaced by less toxic, more expensive alternatives for organizations with a range of risk tolerance and cost tolerance characteristics.
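
A minimal sketch of the weighted-sum form of multi-attribute scoring the abstract describes; the attribute values, weights, and linear aggregation are illustrative assumptions, not the study's model.

```python
# Sketch: weighted-sum multi-attribute scoring of candidate proppants.
# Attribute values and weights are illustrative, not from the study.
proppants = {
    #            conductivity  cost   health   (all scaled to 0-1,
    "silica":        (0.70,    0.90,   0.20),  # higher = better)
    "resin_coated":  (0.80,    0.60,   0.55),
    "ceramic":       (0.95,    0.30,   0.70),
}
weights = (0.4, 0.3, 0.3)  # hypothetical decision-maker priorities

def score(attrs):
    return sum(w * a for w, a in zip(weights, attrs))

for name, attrs in sorted(proppants.items(), key=lambda kv: -score(kv[1])):
    print(f"{name:13s} score = {score(attrs):.2f}")
```

Different weightings stand in for the range of risk- and cost-tolerance characteristics the abstract mentions: re-running the ranking under several weight vectors shows when silica drops below its alternatives.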

T4-E.1 Alderson, DL*; Carlyle, WM; Naval Postgraduate School; [email protected] Assessing Risk and Resilience for Critical Infrastructure Systems We propose a functional definition of infrastructure resilience based on attacker-defender models and using a parametric analysis of attacker capability. In general, this parametric analysis requires enumeration of a potentially enormous number of optimization problems. In this talk, we present a computational technique that uses bounding arguments to significantly limit the enumeration while still providing useful measures of infrastructure resilience. We then show how these same computations support complementary risk analysis to support risk-informed investment decisions. We illustrate with a real case study.
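
A toy illustration of the parametric analysis over attacker capability: in a real attacker-defender model each capability level k requires solving an optimization problem, but if component losses were independent the worst-case attack would simply hit the k most damaging components. All numbers are hypothetical.

```python
# Toy sketch of a resilience curve traced over attacker capability k.
# Component losses and the baseline performance are made-up values.
losses = [40, 25, 25, 10, 5, 5]   # hypothetical loss per component hit
total = 100.0                     # baseline system performance

for k in range(len(losses) + 1):
    worst = sum(sorted(losses, reverse=True)[:k])  # worst-case attack of size k
    print(f"attacker capability k={k}: performance = {total - worst:.0f}")
```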


M4-B.4 Alexeev, A*; Krutilla, K; Indiana University; [email protected] Benefit Cost Analysis in a Strategic and Risky Environment Modeling security risks is a complex and multilevel problem involving different actors and strategic environments. This research uses a game theoretic model to determine the optimal strategic defensive investment of a country under threat of a terrorist attack. Both the terrorist organization and the defending country must decide how much of their available resources and efforts to allocate to the attack/defense in order to maximize gains/minimize losses, respectively. The model allows for asymmetry in the effectiveness of investments in attack/defense, different economic consequences of the attack for the attacker and the defender, and differing risk perceptions. The model also relaxes assumptions about agents’ rationality. Nash solutions in pure and mixed strategies are derived and preliminary results are analyzed. The paper concludes with a discussion and recommendations for future research.
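
A minimal sketch of the kind of attack-defense game described, solved by brute-force best-response checks on a discretized strategy grid; the ratio-form contest success function, parameter values, and grid are illustrative assumptions, not the paper's model.

```python
# Sketch: pure-strategy Nash equilibria of a discretized attack-defense
# resource game, found by checking mutual best responses. All
# parameters are illustrative placeholders.
import numpy as np

efforts = np.linspace(0.0, 10.0, 41)   # feasible effort levels
V, c_a, c_d = 50.0, 2.0, 2.0           # prize value and unit effort costs

def p_success(a, d):
    # Probability the attack succeeds, as a ratio-form contest.
    return a / (a + d) if a + d > 0 else 0.0

def u_attacker(a, d):
    return V * p_success(a, d) - c_a * a

def u_defender(a, d):
    return -V * p_success(a, d) - c_d * d

for a in efforts:
    for d in efforts:
        a_best = max(u_attacker(x, d) for x in efforts)
        d_best = max(u_defender(a, x) for x in efforts)
        if (np.isclose(u_attacker(a, d), a_best)
                and np.isclose(u_defender(a, d), d_best)):
            print(f"equilibrium: attack effort {a:.2f}, defense effort {d:.2f}")
```

Asymmetric costs, differing risk perceptions, and mixed strategies, which the paper's model covers, are extensions beyond this pure-strategy grid search.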

P.107 Ali, J*; Santos, JR; George Washington University; [email protected] Stochastic Input-Output Analysis and Extensions for Impact Analysis: A United States Case Study The input-output (I-O) model’s capability to provide macroeconomic policy insights on interdependent economic systems has recently been extended in the field of quantitative risk analysis. As with any quantitative model, estimates of input data and associated parameters are inevitably prone to some kind of error or bias. The same can be said about the susceptibility of the I-O technical coefficients to imprecision originating from various sources of uncertainty. Hence, this paper provides a methodology based on stochastic I-O analysis to address these issues and subsequently measure the uncertainty in using the I-O model. The research uses the supply and use tables from the US Bureau of Economic Analysis for a period of 14 years (1998-2011) to estimate the probability distributions of the technical coefficients. The coefficients are assumed to follow the Dirichlet distribution, and their moments are evaluated by using a Monte Carlo simulation of 10,000 iterations. The simulation methodology is implemented in MATLAB and the results are used to generate a key sector analysis. Probability distributions can be established to measure the backward and forward linkages for each economic sector. In addition, we used the eigenvalue method to determine the key sectors based on their contribution to the economy and to assess the sensitivity of the sectors to economic disruptions. In sum, this research develops a stochastic model based on historical I-O data, and the results are envisioned to contribute positively to strategic economic planning and macroeconomic risk analysis.
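
A minimal sketch of the stochastic I-O idea, in Python rather than the MATLAB used in the study: each column of the technical-coefficient matrix, together with its value-added share, sums to one, so it can be drawn from a Dirichlet distribution and propagated through the Leontief inverse. The 3-sector matrix and concentration parameter are made-up placeholders.

```python
# Sketch: Monte Carlo on input-output technical coefficients with
# Dirichlet-distributed columns. The base matrix is illustrative,
# not a BEA estimate.
import numpy as np

rng = np.random.default_rng(0)
A_mean = np.array([[0.20, 0.10, 0.05],
                   [0.15, 0.25, 0.10],
                   [0.10, 0.05, 0.15]])
value_added = 1.0 - A_mean.sum(axis=0)   # residual share of each column
concentration = 200.0                    # higher = tighter around the mean
n_sims, n = 10_000, A_mean.shape[0]

backward = np.empty((n_sims, n))
for s in range(n_sims):
    A = np.empty_like(A_mean)
    for j in range(n):
        # Each column plus its value-added share sums to 1, so it is a
        # natural Dirichlet draw; drop the value-added component.
        alpha = concentration * np.append(A_mean[:, j], value_added[j])
        A[:, j] = rng.dirichlet(alpha)[:n]
    L = np.linalg.inv(np.eye(n) - A)     # Leontief inverse
    backward[s] = L.sum(axis=0)          # backward linkage per sector

print("mean backward linkages:", backward.mean(axis=0).round(3))
print("95% intervals:", np.percentile(backward, [2.5, 97.5], axis=0).round(3))
```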

T2-A.4 Allen, BC; Mendez, W*; Davis, JA; Gift, JS; United States Environmental Protection Agency; [email protected] Bayesian Hierarchical Modeling as a Means of Conducting Meta-Regression: Case Study of Cardiovascular Mortality following Arsenic Exposure In response to National Research Council (NRC) recommendations, EPA is exploring probabilistic approaches for combining study results in a dose-response analysis. An approach under consideration for cancer and non-cancer endpoints associated with exposure to inorganic arsenic is the use of Bayesian hierarchical modeling to accomplish meta-regression across studies. This approach would facilitate EPA efforts to be responsive to NRC recommendations and address uncertainties associated with study-to-study variability. This presentation discusses the results of an example analysis in which three studies of cardiovascular disease observed in relation to arsenic exposure (Wade et al., 2009; Sohel et al., 2009; Chen et al., 2011) were included in a meta-regression. Poisson regression was performed to model the reported relative risk estimates across all three studies. After a preliminary examination of a range of exposure-response models, the linear and Exponential-2 models were selected for this illustration based on their better Akaike Information Criterion and Bayes Information Criterion values from the maximum likelihood fits. For those two models, we demonstrate an approach to model selection based on comparison of the Watanabe-Akaike Information Criterion (WAIC) values derived as part of the Markov chain Monte Carlo (MCMC) simulation. There was a slight preference for versions of the linear and Exponential-2 models for which the “slope” parameters were considered random effects while the background parameter was considered a fixed effect. Results are presented that show the MCMC results related to (1) the variability in the study-specific slope parameters; (2) the uncertainty about the mean of those study-specific results; and (3) the distribution of relative risk values, as a function of exposure level. The views expressed in the proposal are those of the authors and do not necessarily reflect the views or policies of the U.S. Environmental Protection Agency.
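
A hand-rolled sketch of the hierarchical idea at the core of the analysis: study-specific slopes drawn from a common distribution (random effects) with a shared background rate, fit by random-walk Metropolis on a Poisson likelihood. The data, priors, and step sizes are fabricated placeholders, not the Wade/Sohel/Chen inputs, and the linear rate form is only one of the two models the abstract compares.

```python
# Sketch: random-walk Metropolis for a hierarchical Poisson
# meta-regression with study-specific slopes. All numbers are
# illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)

# Per study: exposure levels, person-years, observed event counts.
studies = [
    (np.array([0.0, 10.0, 50.0]), np.array([9000., 8000., 4000.]),
     np.array([45, 48, 35])),
    (np.array([0.0, 20.0, 80.0]), np.array([7000., 6000., 3000.]),
     np.array([33, 34, 24])),
    (np.array([0.0, 5.0, 30.0]), np.array([12000., 9000., 5000.]),
     np.array([60, 50, 38])),
]

def log_post(theta):
    b0, mu, log_tau, *betas = theta       # background, slope mean, slope sd
    tau = np.exp(log_tau)
    if b0 <= 0:
        return -np.inf
    lp = -0.5 * mu**2 - 0.5 * log_tau**2  # weak N(0,1) priors
    for (x, py, y), b in zip(studies, betas):
        lam = b0 * (1.0 + b * x) * py     # linear rate model
        if np.any(lam <= 0):
            return -np.inf
        lp += np.sum(y * np.log(lam) - lam)               # Poisson loglik
        lp += -0.5 * ((b - mu) / tau) ** 2 - np.log(tau)  # random effect
    return lp

theta = np.array([0.005, 0.01, np.log(0.01), 0.01, 0.01, 0.01])
step = np.array([1e-4, 2e-3, 0.1, 2e-3, 2e-3, 2e-3])
samples, cur_lp = [], log_post(theta)
for it in range(20_000):
    prop = theta + step * rng.standard_normal(theta.size)
    prop_lp = log_post(prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:
        theta, cur_lp = prop, prop_lp
    if it >= 10_000:                      # keep post-burn-in draws
        samples.append(theta.copy())

samples = np.array(samples)
print("posterior mean slope (mu):", samples[:, 1].mean())
print("study-specific slopes:", samples[:, 3:].mean(axis=0))
```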

M4-A.5 Allen, BC; Mendez, WM*; Davis, JA; Gift, JS; United States Environmental Protection Agency; [email protected] Bayesian Model Averaging in the Estimation of Arsenic-Associated Urinary Cancer Risks EPA is exploring options for quantifying uncertainty, including model uncertainty, in a probabilistic risk analysis. An approach under consideration for the Agency’s inorganic arsenic assessment is the use of Bayesian model averaging to assess the implications of different dose-response functions. This presentation discusses the results of an example analysis in which a range of exposure-response models was fit to grouped data from a recent (Chen et al. 2010) study of the association between exposure to arsenic in drinking water and urinary cancer risks in northeastern Taiwan. Multiple exposure-response models were fit by Poisson regression, using traditional maximum likelihood estimation (MLE) and Bayesian updating via Markov chain Monte Carlo (MCMC) simulation. Prior probability distributions for model parameters were specified to exclude the parameter space in which low-dose supralinearity could occur; that specification and other implications of prior selection for both parameters and models are examined. Averaged estimates of extra lifetime risk were derived for the U.S. population using life-table methods. The relative weights given to each model were based on either Bayesian Information Criteria (BICs) from the MLE regressions or on the posterior model probabilities generated by the MCMC simulation. Use of BICs gave greater weights to models which better replicated the slight convexity in the data, while the use of posterior model probabilities gave the greatest weight to the linear model, whose parameter estimates were most strongly supported by the small data set. In this example, the posterior distributions of averaged extra lifetime risk estimates were not strongly affected by the specific weighting scheme employed, nor by the a priori assumptions about low-dose model behavior. The views expressed in the proposal are those of the authors and do not necessarily reflect the views or policies of the U.S. Environmental Protection Agency.
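
A minimal sketch of one of the two weighting schemes discussed: converting per-model BIC values into approximate posterior model probabilities and averaging a risk estimate. The BICs and per-model risk estimates are fabricated for illustration, not the paper's results.

```python
# Sketch: Bayesian model averaging with BIC-based weights,
# w_i proportional to exp(-0.5 * delta_BIC_i). Numbers are made up.
import numpy as np

models = ["linear", "exp-2", "quadratic"]
bic = np.array([412.3, 413.1, 415.8])              # hypothetical BICs
risk_at_dose = np.array([2.1e-4, 1.7e-4, 9.0e-5])  # hypothetical extra risks

delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()   # approximate posterior model probabilities

for m, w in zip(models, weights):
    print(f"{m:10s} weight = {w:.3f}")
print("model-averaged extra risk:", float(weights @ risk_at_dose))
```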


W5-D.4 Alves, EN; Engine Engineering; [email protected] Implementation of an oil spill contingency plan: a dialogue between the European Union and Brazil Oil spills in water have historically caused great socioeconomic and environmental damage. Offshore oil production and shipping have been responsible for the major accidents, and much has been learned from those undesirable events. This paper analyzes the policies, procedures and best marine safety practices in the European Union with a view to implementing the Brazilian contingency plan, enabled by the strategic partnership between the European Union and Brazil called "Sector Dialogues". How European countries are organized to address this issue, and what challenges Brazil must deal with to face the risks of oil spills at sea, are some of the points treated. The "Project of Sector Dialogues Support" is coordinated by the Ministry of Planning and the EU Delegation in Brazil. In 2014, an action conducted by the Brazilian Institute of Environment (IBAMA) was approved to support the implementation of the "National Contingency Plan for Pollution Incidents by Oil in Waters Under National Jurisdiction", introduced by Decree n. 8,127 of October 22, 2013. This project included the hiring of an EU expert to identify the European legislation related to the subject and a national expert to analyze the external expert's product and, based on the Brazilian reality, assist IBAMA in incorporating the best European practices. The results of both experts, with their conclusions, are compiled in the publication ISBN 978-85-7300-377-2. This paper refers only to the results obtained by the Brazilian expert, which were achieved through analysis of EU policies and frameworks for preparedness and response to pollution in waters and coastal areas. The work comprised an analysis of European legislation to check for gaps in Brazilian legislation; an analysis of the frameworks and procedures adopted in five EU Member States; a detailed review of EU legislation for criteria defining whether an incident is of "national significance"; and, finally, recommendations for the implementation process of the National Contingency Plan in Brazil.

M3-E.1 Alwood, RJ; Environmental Protection Agency; [email protected] Assessing and Managing the Risks of Chemical Substances Manufactured as Nanoscale Materials The presentation will begin with a summary of the challenges of risk assessment for new chemical substances and the specific issues relating to nanoscale materials. After that introduction there will be a description of the steps taken to assess and manage those risks under TSCA given those limitations and challenges. There will also be a discussion of steps taken to better assess risks, including using available data on specific nanomaterials and extrapolating data across groups of nanomaterials.

P.197 Anderson, JK; Integral Consulting Inc.; [email protected] Status of Regulatory Decisions for Perfluoroalkyl Compounds: Is the Level of Protection to the General Public Worth the Uncertainty and Cost? Perfluoroalkyl substances (PFASs) continue to present significant challenges for regulatory agencies, as the science behind their health and environmental effects is continuously evolving. Many agencies struggle to set regulatory levels for PFASs, in part due to difficulties assessing risks from low-level exposure and the constant generation of new toxicity information. At the Federal level, final guidance on PFASs is limited to the January 2009 U.S. EPA Office of Water provisional health advisories for PFOA and PFOS. In October 2009, U.S. EPA Office of Solid Waste and Emergency Response applied the sub-chronic RfDs for PFOA and PFOS from the Office of Water’s assessment to the Superfund risk-based equations to derive screening levels for water and other media; however, these values have not been adopted in the EPA Regional Screening Level Table. In the absence of national standards, some States have pursued drinking water advisories or standards for various PFASs, but with different interpretations of the science and methods for addressing data gaps. Both U.S. EPA (2014) and ATSDR (2015) have released draft assessments, which contain conflicting and opposing conclusions. In 2013, U.S. EPA began collecting occurrence data from large public drinking water supply systems for six PFASs to assess whether health-based standards under the Safe Drinking Water Act were necessary. PFASs have been detected in fewer than 5 percent of all water supply systems, but those detections span 33 states. Exposure for the general population has been shown to occur primarily through treated textiles and consumer products. It is unclear how information on exposure and occurrence is utilized by agencies to prioritize regulatory actions and develop the most public health-protective regulations. This presentation will evaluate and compare the various PFAS standards and general population PFAS exposure. Implications for public health policy and risk assessment will be presented.

W3-E.1 Andrew, SA; Arlikatti, S*; University of North Texas; [email protected] Modeling Communication and Trust Networks in Ebola Response: Institutional Collective Action Framework This paper examines the ad hoc communication and trust networks that emerged in responding to the Ebola outbreak, especially between city managers, first responders, and key health care personnel in the Dallas-Fort Worth metropolitan region. Using social network analysis, we advance two general propositions: the bonding and bridging effects. While the former is characterized by closely knit community organizations with a strong sense of belonging that fosters cooperation during an unexpected crisis, the latter concerns an organization occupying a central position that takes on the responsibility of allocating the flow of resources, highlighting the capacity of individual organizations to gain access to a wide range of additional information.
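
A minimal sketch of how the bonding and bridging propositions map onto standard network statistics; the toy edge list is illustrative, not the Dallas-Fort Worth response network.

```python
# Sketch: bonding as local clustering, bridging as betweenness
# centrality. The edge list is a fabricated toy network.
import networkx as nx

edges = [("CityA", "FireA"), ("CityA", "HospA"), ("FireA", "HospA"),  # cluster A
         ("CityB", "FireB"), ("CityB", "HospB"), ("FireB", "HospB"),  # cluster B
         ("HospA", "HealthDept"), ("HospB", "HealthDept")]            # broker ties
G = nx.Graph(edges)

# Bonding: closed triangles around each organization.
print("clustering:", nx.clustering(G))

# Bridging: brokers that sit on paths between otherwise separate groups.
print("betweenness:", nx.betweenness_centrality(G))
```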


P.158 Anyshchenko, A*; Xiang, W; University of Copenhagen; [email protected] New Breeding Techniques: The Risks of Innovation Versus the Inadequacy of Regulation There is a growing debate on whether plants manipulated using crop breeding tools commonly known as new breeding techniques (NBTs) are genetically modified organisms. Analysis of the legal status of NBTs in different jurisdictions may help elaborate on the issue. There are two approaches to the regulation of GMOs: process-based and product-based. According to the process-based approach adopted in the EU, GMOs are defined as arising from the use of certain specific methods. The product-based approach adopted in the US defines GMOs as possessing a new combination of genetic material that could not have occurred naturally. The latter approach suggests that some plants, even though modified by genome editing, may fall outside the scope of GMO regulation since they result in a product lacking a transgenic insertion. Opposition to genetic engineering is rationalized, inter alia, by the discourse of health risks posed by the release of new organisms into the environment. In Europe, GMOs are associated with higher levels of perceived risk than in North America. However, it is unclear whether NBTs result in GMOs, and the question of an adequate risk policy on NBTs has yet to be answered. Would the resulting products fall under the scope of the existing GMO legislation? In other words, is it legally feasible to separate NBTs from their transgenic counterparts? Does the law adequately reflect the present state of affairs in the field of agricultural biotechnology? EU regulation on GMOs treats NBTs with a degree of uncertainty that makes the law itself unclear and ineffective. The limited success of public policies based on the precautionary principle has not resulted in mitigating the risks of adverse environmental changes. Risks attributed to biotechnologies due to scientific uncertainty conflict with the inadequacy of risk governance. Rapid scientific developments require a new policy for the risks inherent in genome editing, and therefore the issue of the legal status of NBTs deserves careful consideration.

P.147 Aoyagi, M; National Institute for Environmental Studies; [email protected] Understanding of risk and media literacy In considering risk communication practices among the general public, people’s media use is crucial. We consider the set of abilities we call “media literacy”, defined as the ability to access, analyze, evaluate and use various media. We focus on media literacy in this paper and, using nationally representative data, discuss its importance in risk communication practices and governance. We carried out a nationwide public opinion survey of nationally representative Japanese adults aged 20 to 74 in October 2014. Effective responses were 1,548 out of 4,000 samples contacted. Using these data, we analyzed people’s media exposure and literacy on risk issues. 1) We asked about “information sources for daily news events, including risk issues.” 92.6% of our respondents chose television programs and 70.2% printed newspapers, while 24.0% chose online news sites, including those of newspaper publishers. These were followed by family and friends (21.3%), radio (20.4%), magazines (12.3%), and SNS (12.0%). About 23% of our respondents answered that they did not use the internet at all. 2) We then asked about “the most trusted information sources for environmental issues.” The most chosen item was journalists (45.5%), followed by professors/experts (27.1%), national government (24.6%), local government (23.1%), and environmental NGOs (21.3%). 3) We posed two quiz questions concerning the science of radioactivity and investigated the relationship between the information sources respondents use, their trusted information sources, and their quiz answers. The results show a clear relationship: respondents who chose wrong options were more likely to trust “journalists” and “SNS posts by non-experts”, while those who chose right options were more likely to trust “professors/experts” and to read newspapers. These relationships are statistically significant by Pearson’s chi-squared test. 4) We also asked our respondents whether they would choose food from regions with possible radioactive contamination. Anxiety level is closely related to the choice of information sources and to purchasing behavior.
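
A minimal sketch of the chi-squared independence test underlying finding 3; the contingency counts are fabricated placeholders, not the survey's actual cross-tabulation.

```python
# Sketch: testing whether quiz performance is independent of the most
# trusted information source. Counts are made-up placeholders.
from scipy.stats import chi2_contingency

#                quiz correct  quiz wrong
table = [[310,  95],   # trusts professors/experts
         [420, 285],   # trusts journalists
         [ 60,  78]]   # trusts SNS posts by non-experts

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```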

M4-I.2 Arnold, S*; Thompson, G; Kennedy, K; Landenberg, B; Mason, A; The Dow Chemical Company; [email protected] Integrating exposure information into a hazard-based screening tool for selection of chemical alternatives The general public and retailers are asking for disclosure of chemical ingredients in consumer goods. To respond to this demand, a number of chemical alternative assessment screening tools have been developed to disclose chemical ingredients and, beyond that, to evaluate and rank the “greenness” of individual chemical ingredients and/or formulations. The majority of chemical alternative assessment tools are hazard-based only and, because they disregard exposure, cannot provide human health risk information. A pilot project was developed to determine the feasibility of incorporating exposure estimations into the results of a hazard-based screening tool. Two “off the shelf,” commonly-used tools (GreenSuite for hazard and ConsExpo for exposure) were used to develop both qualitative and quantitative risk estimates for individual chemical ingredients within eight different product categories including shampoo, skin cream, laundry detergent, spray bathroom cleaner, joint sealant, oil and latex paint, and spray foam insulation. For this assessment, the ConsExpo default exposure algorithms for each product type were incorporated in the GreenSuite tool and connected through a user-friendly interface. Twenty-one hazard endpoints covering mammalian and ecological toxicity, chemical persistence and partitioning, and flammability for dermal and inhalation exposure were evaluated. The combined models provided the flexibility to yield relative hazard and exposure rankings for each chemical in each formulation. In addition, quantitative margins of exposure were determined. A number of challenges were identified, including the need to obtain concentration data for each component within the formulation; to establish a database of up-to-date toxicological data for each endpoint; and to determine the key endpoint for comparison to exposure estimates.
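
A minimal sketch of the quantitative step that adding exposure enables: a margin of exposure computed as a hazard point of departure divided by an exposure estimate. All values are illustrative placeholders, not GreenSuite or ConsExpo outputs.

```python
# Sketch: margin of exposure (MOE) from hazard and exposure inputs.
# Both values are hypothetical placeholders.
noael_mg_per_kg_day = 25.0       # hypothetical point of departure
exposure_mg_per_kg_day = 0.012   # hypothetical exposure-model estimate

moe = noael_mg_per_kg_day / exposure_mg_per_kg_day
print(f"margin of exposure = {moe:.0f}")
# A common screening heuristic treats MOE >= 100 (10x interspecies x
# 10x intraspecies) as low concern; the right benchmark depends on the
# endpoint and the quality of the underlying data.
```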

M2-J.4 Arvai, J*; Bessette , D; Kenney, L; Campbell-Arvai, V; University of Michigan; [email protected] Less smoke, fewer mirrors: Decision-aiding to address the risks of climate change Proposals to facilitate policy changes in the name of addressing the risks from climate change often end in failure. Notable examples include the American Clean Energy and Security Act of 2009, as well as Canada’s withdrawal from the Kyoto Protocol in 2011. In our experience, these failures are both predictable and easy from the standpoint of policy-makers. They are predictable because of the ideological stalemate on climate change that exists in most countries between the political left and the political right. And, they are easy because policy- and decision-makers often do a poor job of confronting the tradeoffs that inevitably arise when climate risk management objectives conflict. Our research group has been working on decision-aiding tools that may help people to both break the deadlock caused by the ideological divide, and to make choices and policies that better account for multiple objectives (and, therefore, stand a chance of being implemented). In this presentation, we will discuss this research with examples drawn from our work on energy transitions in North America, as well as decision-aiding in support of international development efforts.


M4-D.1 Arvai, J*; Redmond, K; Roberts, P; Wernstedt, K; Wilson, R; University of Michigan; [email protected] What does it mean to be an expert? Studying the judgments of emergency managers in the context of flood-related risks A central question in Michael Lewis’ book “Moneyball” deals with what it means to be a baseball expert. On the one hand, expertise may be viewed as something an individual hones after years on the job, to the point that objectively “good” judgments are made instinctively, at gut level. On the other hand, expertise may be viewed as something that is based more fully on one’s ability to evaluate incoming data in a manner that reflects boundedly rational models of judgment and choice. Our research asked the same question. But rather than studying general managers in Major League Baseball, we worked with emergency managers who work (or may find themselves working) on the risks associated with dangerous flooding. We exposed emergency managers to a variety of “classic” judgment and decision-making tasks, which tested for biases. Specifically, our research focused on how these experts would respond to questions designed to study prospect theory, errors of omission and commission, herd effects, ambiguity, and numeracy. All of the items and scenarios used in our research were reframed to reflect flood-related risks. To further calibrate the judgmental capabilities of expert emergency managers, we also exposed a sample of lay respondents to the same questions and compared the performance of the two groups. Our results, and their implications for training experts in ideas and tools from the decision sciences, will be discussed.

W4-I.2 Arzuaga, X*; Cooper, G; Hotchkiss, A; U.S. Environmental Protection Agency; [email protected] Application of an Adverse Outcome Pathway (AOP) framework to evaluate species concordance and human relevance of Dibutyl Phthalate (DBP)-induced toxicity to the male reproductive system Dibutyl phthalate (DBP) is a phthalate ester used as a plasticizer and a solvent. Studies using rats consistently report that DBP exposure disrupts normal development of the male reproductive system, in part via inhibition of androgen synthesis. However, studies using xenograft models report that human fetal testes are unresponsive to DBP-induced disruption of testosterone synthesis. These results call into question the validity of the rat model for assessment of potential male reproductive effects caused by DBP. The adverse outcome pathway (AOP) framework has been proposed as a useful tool in the evaluation of species concordance and human relevance of chemical-induced adverse outcomes. An AOP framework was used to evaluate the available evidence for DBP-induced toxicity to the male reproductive system. The proposed framework is based on established cellular, molecular, and endocrine mechanisms which control the development and functions of the male reproductive system. Studies on the potential male reproductive effects caused by DBP exposure were evaluated for consistency and biological plausibility. Three relevant biological elements were identified: 1) fetal rats appear to be more sensitive than other rodents and human fetal xenografts to DBP-induced anti-androgenic effects; 2) DBP-induced androgen-independent adverse outcomes are conserved among different mammalian models and human xenografts; and 3) DBP-induced anti-androgenic effects are conserved in different mammalian species and humans when exposure occurs during post-natal life stages. Application of the AOP framework also identified critical informational gaps, including identification of the molecular targets for DBP which initiate reproductive adverse outcomes. The views expressed are those of the authors and do not necessarily represent the views or policies of the US EPA.

M4-H.1 Aven, T; Flage, R*; University of Stavanger; [email protected] Assumptions in quantitative risk assessments: when explicit and when tacit? In a quantitative risk assessment a number of assumptions have to be made in order to compute the risk metrics addressed. Such assumptions may for example be linked to the number of people exposed to specific hazards, the reliability of a safety system and the load a wall is able to withstand. In addition there are more or less tacit assumptions, for example when making a probability judgment about an event occurring. The probability judgment is based on some knowledge - which essentially captures data, information and justified beliefs - and here tacit assumptions may exist even if explicit assumptions have not been formulated, for example a belief about how the system works. The risk metrics are conditional on this background knowledge and these assumptions, and the strength of this knowledge and the “risk” related to potential deviations in the assumptions then need attention. This paper discusses this topic, the main aims being to clarify the issues raised and to provide some guidance on how to formulate the background knowledge to distinguish between explicit and non-explicit (tacit) assumptions.

W3-G.2 Aylward, LL*; Bachler, G; von Goetz, N; Hays, SM; Summit Toxicology, LLP; ETH; [email protected] Biomonitoring Equivalents for Interpretation of Silver Biomonitoring Data in a Risk Assessment Context Silver is widely used as an antimicrobial agent in both the ionic and nanoparticle forms, and general population exposure to silver can occur through the presence of trace levels in foods, dusts, through dermal contact with silver-treated textiles, from use of wound care products, and from other sources. Biomonitoring for silver in blood or urine in persons in the general population is being conducted by the Canadian Health Measures Survey (CHMS). Tolerable exposure guidance values for silver designed to prevent adverse effects of excess silver exposure are available from the United States Environmental Protection Agency (an oral reference dose, or RfD) and from literature evaluations of recent data on responses to nanoparticle silver (a recommended tolerable daily intake, or TDI). A current physiologically-based pharmacokinetic model is used here to estimate Biomonitoring Equivalents (BEs) for silver, which are steady-state blood and urine concentrations consistent with the RfD (BERfD) or the recommended TDI (BETDI). The BERfD and BETDI values based on silver in whole blood are 0.4 and 0.2 ug/L, respectively. BE values for silver in urine range from 0.3 to 1 ug/L depending on the age group and guidance value considered. These values can be used to examine measured biomarker concentrations in the context of current risk assessments for silver in order to provide an indication of whether exposures are well below, near, or potentially in excess of levels consistent with current risk assessment-derived exposure guidance values. This work was conducted under contract to Health Canada.
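
A sketch of the idea behind a Biomonitoring Equivalent, reduced to a one-compartment steady-state relation; the published BEs were derived with a PBPK model, and every parameter value here is an illustrative assumption rather than a silver-specific estimate.

```python
# Sketch: converting an exposure guidance value into the steady-state
# biomarker concentration it implies. One-compartment simplification;
# all parameter values are hypothetical placeholders.
body_weight_kg = 70.0
guidance_ug_per_kg_day = 5.0      # hypothetical tolerable daily intake
fraction_absorbed = 0.1           # hypothetical oral absorption fraction
clearance_L_per_day = 100.0       # hypothetical whole-blood clearance

# At steady state, absorbed intake equals elimination:
#   dose * F = CL * C_ss  =>  C_ss = dose * F / CL
dose_ug_per_day = guidance_ug_per_kg_day * body_weight_kg
c_ss_ug_per_L = dose_ug_per_day * fraction_absorbed / clearance_L_per_day
print(f"steady-state blood concentration = {c_ss_ug_per_L:.2f} ug/L")
```

Measured biomarker concentrations can then be compared against such a benchmark to judge whether population exposures are below, near, or above the guidance value, which is how the abstract proposes the BE values be used.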


M4-H.2 Ayyub, B; University of Maryland, College Park; [email protected] Treatments of Unforeseen Events in Probabilistic Risk Analysis Quantifying risk probabilistically requires defining a universe, i.e., a universal set. In probability theory, the universal set is called the sample space. A universal set is the totality of all the things that exist pertaining to the domain of interest. Therefore, the universal set in probability theory is a “complete” set that is known with absolute certainty, termed in this case the “closed-world” assumption. This assumption can be relaxed to allow for cases of an uncertain universal set due to knowledge deficiencies. Models based on an “open-world” assumption are presented in order to account for unforeseen events. The theory of evidence, together with event sequences and patterns, offers mathematical frameworks for examining and treating unforeseen events. In reality, many historic failures have occurred as a result of unforeseen events.

T4-G.1 Ayyub, B*; Scouras, J; University of Maryland College Park; [email protected] Climate Change as a Global Catastrophic Risk Global catastrophic risks are associated with natural or anthropogenic events that have the potential to inflict serious damage on human well-being on a global scale, including destroying or crippling modern civilization. Global climate change is generally considered a result of increasing atmospheric concentrations of greenhouse gases, mainly due to human activity. The 2013 report from the Intergovernmental Panel on Climate Change states that nearly 75% of the total radiative forcing increase since the year 1750 is due to CO2 emissions. Most climate change effects will persist for centuries even if CO2 emissions are stopped. Climate change (i.e., global temperature increase) can in turn lead to global sea level rise, affecting the frequency and intensity of extreme weather and climate events, e.g., precipitation storms, heat waves, etc. Linking global changes to local geographic impacts requires a deep understanding of many physical phenomena and their interactions, including deep ocean temperatures, currents, the geophysics of ocean basins, gravitational forces, etc. Possible strategies for long-term decision making require an understanding of these links. In the interim, engineers may rely on adaptive techniques for design and mitigation with real options. Tradeoffs may be based on low-regret criteria.

T3-H.2 Babendreier, J; Womack, D; Parks, A*; Taylor, T; U.S. Environmental Protection Agency; [email protected] USEPA’s Land-Based Materials Management Exposure and Risk Assessment Tool System It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. We developed an integrated tool system to evaluate the risks and benefits of alternative materials management practices for land-based applications. The tool system provides an integrated data gathering and analysis capability to enable scientifically rigorous analysis of exposure, risks, benefits, and opportunities for the safe, beneficial reuse of a wide variety of materials that may in the past have been considered 'waste'. It enables assessors and the decision-makers they support, from communities to states to the Nation, to make better, science-informed decisions about materials management within U.S. landscapes. Better decisions will result in reduced disposal costs, increased protection of public health and the environment, and reduction in the use of raw materials. The system includes the HE²RMES modeling system (Human and Ecological Exposure and Risk in Multimedia Environmental Systems), a collection of science models, databases, and tools that work integrally with each other within the FRAMES v2.0 modeling framework (Framework for Risk Analysis in Multimedia Environmental Systems). HE²RMES allows users to study the effects of liquid, semi-solid, or solid materials placed within one of several “material” management units (i.e., source-release models). Using a combination of fate and transport models, food web and food chain models, exposure and risk models, and risk summary models, HE²RMES assesses exposure and risk to both ecological and human receptors in and around a site. HE²RMES can also be used to study impacts at multiple sites across a county, a tribal land, a state, a region, or the entire U.S. The tool system presented also establishes a fully implemented D4EM-4-HE²RMES solution (Data for Environmental Models). Servicing all of HE²RMES’s models, the solution allows for Anywhere-USA application of the modeling system.

T2-J.4 Balog, SB; World Bank and King's College London; [email protected] Severe weather decision making: A study of headteachers in Wales and western England Threatened with an oncoming severe weather event, school administrators must decide whether the safety of their students mandates closure of their facility. What factors they consider and what information they use in making such decisions are poorly understood. This study surveyed and interviewed headteachers in Wales and western England about their decision making process surrounding severe winter weather closures, particularly examining a snowstorm on 18 January 2013. Headteachers make closure decisions carefully and cautiously. Health and safety is the primary consideration, with a host of specific factors contributing to the safe operation of schools during snow. Local knowledge and weather information play different yet important roles in the decision making process. Forecasts are helpful in increasing awareness of possible disruption; however, local information is integral to and integrated into the process, since applicable, accurate, real-time information is of most use. Combining these information sources in an accessible manner could lead to more informed decisions.


M3-D.1 Banan, Z*; Gernand, JM; Penn State University; [email protected] Heterogeneity of Emissions Exposure Risk from Hydraulic Fracturing in the Marcellus Shale Region of Pennsylvania and Implications for Permitting Policy Exposure to particulate matter (PM), the solid particles and liquid droplets of various sizes dispersed in the air, causes severe health issues. Oil and gas field development creates PM emissions of silica and soot, as well as emissions of other harmful chemical compounds, namely volatile organic compounds (VOCs), nitrogen oxides (NOx), sulfur oxides (SOx), etc. Many emissions estimation models have been introduced and developed to address concerns about the concentration of PM generated during the different phases of the oil/gas production process – drilling, fracturing, compression, completion, etc. However, some studies suggest that these models systematically underestimate PM concentrations. While previous work has considered the average level of emissions across the regions where hydraulic fracturing occurs, this study considers the heterogeneity of exposure in time as well as distance. This study develops a Gaussian plume model with a series of time-variant point sources for modeling the dispersion of the aforementioned harmful emissions generated during shale gas production. The model is developed from drilling permit databases for the activity schedules of shale gas drilling in Pennsylvania. We apply the model to a series of point sources, considering the type of activity by monthly dates together with average monthly meteorological data. The model predictions are validated against measured emission data from the EPA. According to the results, there is significant variability and spatial heterogeneity in the level of exposure to emissions when the density of drilling and/or fracturing activities is considered along with geographic and meteorological conditions. Finally, we develop a policy recommendation on drilling rates and density to keep individual communities below the recommended exposure levels for fine and ultrafine particulates and other pollutants in the future.
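
A minimal sketch of the standard Gaussian plume equation at the core of such a dispersion model, with ground reflection via an image source. In practice sigma_y and sigma_z come from stability-class correlations and many time-variant sources are superposed; here the coefficients are passed in directly and all values are illustrative.

```python
# Sketch: steady-state Gaussian plume concentration for one continuous
# point source, including the image-source term for ground reflection.
import numpy as np

def plume_concentration(Q, u, y, z, sigma_y, sigma_z, H):
    """Concentration (g/m^3) at crosswind offset y (m) and height z (m).

    Q: emission rate (g/s), u: wind speed (m/s), H: effective source
    height (m), sigma_y/sigma_z: dispersion coefficients (m).
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative receptor at the plume centerline, breathing height:
print(plume_concentration(Q=1.0, u=3.0, y=0.0, z=1.5,
                          sigma_y=30.0, sigma_z=15.0, H=5.0))
```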

W2-A.3 Bannon, DI; US Army Public Health Command; [email protected] Adequacy of existing OSHA Pb standards and alternatives for protection of military firing range personnel Research findings over the past decade have demonstrated adverse health effects of lead in adults at blood lead levels (BLL) previously considered acceptable in the workplace. Worker exposure to lead in general industry is regulated by the 1978 Occupational Safety and Health Administration (OSHA) Code of Federal Regulations (CFR 1910.1025). However, the OSHA standard as originally written implied that blood lead values of 40 µg/dL and below may not be associated with significant health risks. In 2011, the Department of Defense (DoD) commissioned the National Research Council (NRC) at the National Academy of Sciences to examine the adequacy of current regulatory standards to protect DoD workers. The resulting publication, entitled “Potential Health Risks to DoD Firing-Range Personnel from Recurrent Lead Exposure”, concluded that “BLL (40 µg/dL) was judged by the committee to be inadequate for protecting personnel who had repeated exposure on firing ranges; thus the OSHA PEL (50 µg/m3) for lead would also be insufficiently protective". The NRC committee suggested alternative management guidelines, and the DoD has been actively working to develop new medical management guidelines, including BLLs, for occupational exposure. This effort is being followed by an examination of occupational exposure levels related to these new BLLs. The NRC study report, the DoD response, and future efforts will be discussed.

P.21 Baroud, H*; Francis, R; Barker, K; University of Oklahoma and George Washington University; [email protected] Beta Bayesian Kernel Methods for the Prediction of Global Supply Chain Disruptions Multinational corporations often operate facilities or have suppliers in countries prone to natural hazards, extreme weather, or political turmoil. While such business strategies offer low operating costs, they may introduce high risks leading to high-impact disruptions in the global supply chains of these corporations. With multinational companies operating from different locations, there is a great deal of information, services, and money circulating between facilities. It is therefore critical to assess the risks and consequences of potential global disruptive events such as tsunamis, earthquakes, or factory fires, among others. The accurate prediction of such risk plays a vital role in (i) assessing the cost effectiveness of overseas operations, and (ii) identifying risk management and preparedness strategies. We deploy a Bayesian kernel model using the Beta-Binomial conjugate prior to assess the probability of a supply chain disruption given (i) historical information on past disruptions, and (ii) a set of attributes describing the company, its risk management strategies, and its suppliers’ risk assessments, among other possible attributes. The model is extended to accommodate non-continuous variables by incorporating different types of kernel functions. Given data provided by a survey of companies on the resilience of their supply chains, we are able to produce the probability distribution of global supply chain disruptions for any company in the sample. We consider several variations of the model and compare the output. The approach is also compared with classical forecasting tools for validation purposes. This method provides risk managers with an accurate estimation of the likelihood of a supply chain disruption. It is also suitable for integrating the decision maker’s risk preference in the form of the prior distribution while updating it with newly acquired information.
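
A minimal sketch of the Beta-Binomial conjugate update at the core of the method; the prior parameters and disruption record are made-up placeholders, and the full model additionally kernel-weights observations by company attributes.

```python
# Sketch: posterior disruption probability after observing k
# disruptions in n periods, via the Beta-Binomial conjugate pair.
# All numbers are hypothetical placeholders.
alpha0, beta0 = 1.0, 9.0   # hypothetical prior: ~10% disruption rate
k, n = 3, 12               # hypothetical record: 3 disruptions in 12 quarters

alpha_post = alpha0 + k
beta_post = beta0 + (n - k)
mean = alpha_post / (alpha_post + beta_post)
print(f"posterior Beta({alpha_post:.0f}, {beta_post:.0f}), "
      f"mean disruption probability = {mean:.2f}")
```

The prior here is also where a decision maker's risk preference enters, as the abstract notes: a more pessimistic prior simply shifts alpha0 and beta0 before the same update.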

T4-G.2 Barrett, AM*; Baum, SD; Global Catastrophic Risk Institute and ABS Consulting; [email protected] Analyzing Long Term Risks of Artificial Intelligence Catastrophe Artificial Intelligence (AI) is increasingly recognized as presenting a significant risk at some point in the future. While AI researchers and developers typically do not intend to cause harm through their work, harm may nonetheless occur due to accidents and unintended consequences. In the absence of adequate safety mechanisms, an extremely powerful AI may even be likely to cause human extinction. Thus long term AI risk scenarios merit attention even if their probabilities are low. While the AI risk scenarios are highly uncertain, established risk analysis methodologies can make progress at characterizing the risk and informing decision making. We present an initial set of graphical models that represent core elements of major pathways to global catastrophe involving extremely powerful AI at some point in the future. The models use fault tree and influence diagram conventions to depict combinations of events and conditions that could lead to AI catastrophe, as well as intervention options that could decrease risks. Model structures are derived from published literature on long term risk of AI catastrophe.
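
A minimal sketch of how fault-tree conventions propagate basic-event probabilities to a top event, assuming independent events; the tree structure and probabilities are illustrative placeholders, not the paper's model.

```python
# Sketch: AND/OR gate propagation in a small fault tree with
# independent basic events. All probabilities are made up.
def gate_or(*ps):   # at least one input event occurs
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def gate_and(*ps):  # all input events occur
    q = 1.0
    for p in ps:
        q *= p
    return q

p_powerful_ai = 0.10                   # hypothetical basic event
p_goals_misaligned = 0.30              # hypothetical basic event
p_safety_fails = gate_or(0.20, 0.10)   # hypothetical failure modes
p_catastrophe = gate_and(p_powerful_ai, p_goals_misaligned, p_safety_fails)
print(f"top event probability = {p_catastrophe:.4f}")
```

Intervention options map naturally onto this structure: reducing any basic-event probability propagates directly to the top event, which is how such models compare risk-reduction levers.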

P.172 Bates, ME*; Fox-Lent, C; Seymour, L; Wender, BA; Bridges, TS; Linkov, I; US Army Corps of Engineers, Engineer Research and Development Center, Environmental Lab; Massachusetts Institute of Technology; Arizona State University; [email protected] Life-Cycle Assessment of Dredged-Sediment Management Alternatives Managing dredged sediments in a way that properly balances environmental risks and public benefits is often a point of controversy among federal agencies and stakeholders. Current decision making includes environmental criteria, but is often limited to those measuring local or immediate effects. Specifically, the variety of distributed and long-term impacts resulting from transportation by truck or barge, use of loading equipment, and long-term site management have implications for climate change, air quality, non-renewable resource consumption, and other factors that affect human and ecosystem health. Life-Cycle Assessment (LCA) is a method of accounting for a wider range of impacts and benefits than are included in most risk assessment strategies. By developing an LCA comparing dredged-material management strategies, we show how a fuller set of criteria can be included in future sediment-management decisions. This paper applies LCA to dredged-sediment management through a comparative analysis of potential upland, open water, and containment-island placement sites in the Long Island Sound region, NY/CT.

P.173 Bates, ME*; Keisler, JM; Zussblatt, NP; Plourde, KJ; Wender, BA; Linkov, I; US Army Corps of Engineers, Engineer Research and Development Center, Environmental Lab; University of Massachusetts Boston; University of California Santa Barbara; Arizona State University; [email protected] Balancing Research and Funding using Value of Information and Portfolio Tools for Nanomaterial Risk Classification Currently, risk research for nanomaterials is prioritized through expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here we apply value of information and portfolio decision analysis – methods commonly applied in financial and operations management – to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter and we apply Monte Carlo simulation to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts; this enables identification of efficient research portfolios – combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility, and surface reactivity were most frequently identified within efficient portfolios in our results. This type of analysis can be usefully applied by researchers and funding agencies trying to most efficiently allocate limited R&D resources.
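To make the value-of-information logic concrete, here is a hedged numerical sketch. The two-band threshold, the three properties, their ranges, and the weights are invented for illustration (the actual CB Nanotool combines many more factors into banded hazard scores): Monte Carlo propagates uncertain inputs through an additive hazard score, and the expected reduction in classification entropy from measuring one property approximates that experiment's information value.

    import numpy as np
    rng = np.random.default_rng(0)

    ranges = {"shape": (0, 10), "diameter": (2, 8), "solubility": (1, 9)}
    weights = {"shape": 0.5, "diameter": 0.3, "solubility": 0.2}
    threshold = 5.0            # score above this -> "high hazard" band

    def band_entropy(score_samples):
        p = np.clip(np.mean(score_samples > threshold), 1e-12, 1 - 1e-12)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    draws = {k: rng.uniform(*ranges[k], 100_000) for k in ranges}
    base = band_entropy(sum(weights[k] * draws[k] for k in ranges))

    def expected_entropy_if_measured(k, outer=200, inner=2000):
        others = [j for j in ranges if j != k]
        ent = []
        for v in rng.uniform(*ranges[k], outer):   # possible measurement results
            s = weights[k] * v + sum(weights[j] * rng.uniform(*ranges[j], inner)
                                     for j in others)
            ent.append(band_entropy(s))
        return np.mean(ent)

    for k in ranges:
        print(k, base - expected_entropy_if_measured(k))

Dividing each expected entropy reduction by the elicited cost of the corresponding experiment gives the efficiency measure from which research portfolios can be assembled.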

P.58 Bateson, TF*; Blessinger, T; Subramaniam, R; Axelrad, DA; Dockins, C; U.S. Environmental Protection Agency; [email protected] Advancing Methods for Benefits Analysis Benefit-Cost Analysis is widely employed and accepted for evaluating environmental policies, and is required by Executive Order 12866 and certain statutes. For most contaminants, there are few tools with which to evaluate the non-cancer human health benefits of exposure reductions. This presentation describes an effort that brings together economists, epidemiologists, statisticians, and toxicologists from across the U.S. Environmental Protection Agency for the purpose of quantifying health risks and their associated economic valuations. Key issues to be addressed include: 1) Need for standardized weight of evidence conclusions for all non-cancer hazards to communicate clearly and provide improved support for benefits analysis; 2) Need to explore how to include effects with a “suggestive” or “possible” weight of evidence conclusion, and how to incorporate weight-of-evidence uncertainty into the benefits analysis; 3) Need to estimate risk at a given dose by applying dose-response modeling techniques and incorporating uncertainty and variability, which will directly support economic analysis and provide additional information for decision makers; 4) Need to establish linkages of upstream and early biomarkers of effects to health outcomes that are amenable to economic valuation. These efforts will enable a more rigorous and informative characterization of health risks. Disclaimer: The views expressed in this abstract are those of the authors and do not represent U.S. EPA opinions and/or policy.

T4-F.5 Batz, M; Montibeller, G*; Cahill, S; Kojima, M; School of Business and Economics, Loughborough University, UK; [email protected] Multi-criteria decision analysis for risk management of microbial hazards in low moisture foods There has been increasing international interest in the microbial safety of low moisture foods (LMF) following a number of recalls and outbreaks throughout the world. Implicated ready-to-eat products with low water activity have included chocolate, raw almonds, sesame, peanut butter, breakfast cereals, dry spices and seasonings, and puffed rice and corn snacks, to name a few. Salmonella is the most commonly associated microbial hazard, though Bacillus cereus, Clostridium botulinum, C. perfringens, pathogenic Escherichia coli, and Staphylococcus aureus have also been implicated. In November 2012, the Codex Committee on Food Hygiene (CCFH) agreed to develop a Draft Code of Hygienic Practice for LMF. CCFH asked the Food and Agriculture Organization of the United Nations (FAO) and the World Health Organization (WHO) to provide the Committee with advice on which foods should be the highest priorities, and on risk management options. FAO and WHO initiated a series of activities to provide guidance. Knowledge synthesis activities included a rapid scoping study and a systematic review and meta-analysis of the available scientific literature related to LMF, focusing on outbreaks, prevalence studies, and intervention studies. At a three-day expert workshop held in May 2014, a multi-criteria decision analysis (MCDA) model was developed interactively with subject matter experts. This model was used to rank seven LMF product categories based on a number of criteria relating to public health and international trade, with attributes relating to disease burden, annual trade impacts, food consumption patterns, population vulnerabilities, pathogen contamination, and food production characteristics. Swing weights were elicited from experts, robustness analyses were conducted, and further data collection was used to parameterize attributes. The MCDA ranking was finalized interactively with experts following the meeting, and the results and risk management guidance were presented to CCFH.
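The core of such a ranking is typically an additive value model under swing weights. The sketch below is a hedged toy version (three criteria, three product categories, invented weights and scores; the actual FAO/WHO model used seven categories and expert-elicited inputs):

    import numpy as np

    weights = np.array([0.5, 0.3, 0.2])    # swing weights over, e.g., disease
                                           # burden, trade impact, vulnerability
    # Rows: LMF product categories; columns: normalized 0-1 value scores
    scores = np.array([
        [0.9, 0.6, 0.7],                   # hypothetical category A
        [0.4, 0.8, 0.3],                   # hypothetical category B
        [0.7, 0.5, 0.9],                   # hypothetical category C
    ])
    total = scores @ weights
    print(total)                           # overall value of each category
    print(np.argsort(total)[::-1])         # ranking, best first

Robustness analysis then amounts to perturbing the weights (or using score intervals) and checking whether the ranking order survives.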

M4-B.2 Baum, SD*; Barrett, AM; Global Catastrophic Risk Institute; [email protected] Risk and Policy Analysis of Nuclear War Despite the end of the Cold War, there are still thousands of nuclear weapons in the world, mainly in Russia and the United States. The ongoing probability of nuclear war is not zero, and some have argued that it is higher now than during the Cold War. Models have been developed to estimate the probability of nuclear war and to evaluate opportunities to reduce the probability. A key factor is the quality of relations between nuclear-armed states. The impacts of nuclear war would be catastrophic. In addition to the destruction from the explosions, there could be severe global environmental consequences, including nuclear winter. Only limited analysis is available of the human impacts of nuclear winter, focusing mainly on the possibility of famine. Risk perspectives are gaining increasing currency in international policy debates about nuclear weapons, and some changes to the status quo of nuclear arsenals, doctrine, deployment, and plans for disarmament could be valuable for reducing nuclear war risk.

T3-B.2 Baxter, JR; Industrial Economics, Incorporated; [email protected] Strategically Targeting Retrospective Analysis President Obama’s 2011 Executive Order 13563, “Improving Regulation and Regulatory Review,” attempts to reinvigorate interest in conducting retrospective reviews of existing regulations. As part of this review process, agencies may undertake ex post analysis of a regulation’s benefits and costs. However, ex post analysis can be as resource intensive as ex ante analysis, and the quantitative results are often subject to significant uncertainty. This presentation focuses on options for targeting retrospective analysis in ways that can be most useful to regulators, assuming resources for undertaking such analyses are limited. It draws on examples from actual regulatory analyses and considers the advantages of different approaches given the purpose of each review.

P.9 Beaudrie, CEH*; Lyle, T; Long, G; Badelt, B; Compass Resource Management Ltd, Vancouver, British Columbia; Ebbwater Consulting, Vancouver, British Columbia; City of Vancouver, British Columbia; [email protected] Managing Coastal Flood Risks: A Structured Decision Making (SDM) Approach to Mitigating the Impacts of Sea-Level Rise in Vancouver, British Columbia Rising sea levels pose increasing flood risks for coastal communities, particularly major population centers along the British Columbia coast. With a projected sea level rise of 1 m by 2100, BC communities face the challenging task of understanding hazards, vulnerabilities, and consequences from flood events, and identifying suitable measures to protect multiple interests over large areas. This talk highlights the application of a Structured Decision Making (SDM) approach to evaluate the impacts of sea level rise and select mitigation options to reduce flood risks for Vancouver, British Columbia. The process involved a series of stakeholder workshops to identify interests that may be impacted, develop suitable mitigation alternatives, review the performance of each alternative and consider trade-offs, and finally to develop recommendations for a suite of mitigation alternatives to protect vulnerable neighbourhoods across the city. To address the challenge of communicating complex risk information, stakeholders were engaged using both spatial illustrations of flood extents for a number of flood scenarios and an interactive decision support tool to facilitate comparison of alternatives and trade-offs. This work breaks new ground in evaluating the implications of sea level rise on coastal communities, and provides a model for other communities grappling with the challenges of assessing and managing flood risks from a rising sea.

W5-C.5 Beaussier, AL*; Demeritt, D; Rothstein, H; King's College London; [email protected] Risk prevention, compensation and the political economy of insurance While risk management is widely recognised as a central function of modern governments, closer analysis reveals considerable variation across OECD countries in the emphasis placed on ex ante risk reduction measures such as regulation, and ex post risk-reallocation measures such as compensation. For example, while Anglo-Saxon countries have made much of the development of risk-based regulatory instruments to prioritise their exposure reduction efforts in a wide variety of policy domains, many continental European countries have capitalised on the risk management levers offered by their social insurance regimes. In order to explore these patterns in more detail, this paper compares the balances struck between ex ante and ex post risk management styles in two different policy domains – flooding and occupational health and safety – in two European countries – France and the UK. Drawing on an extensive programme of qualitative interviews and documentary research, we examine the distribution of responsibilities for preventing and compensating risks in each domain and country. We conclude by considering what this analysis tells us about the national political economies of risk management and the implications for their reform.

W1-H.4 Becker, R*; Patlewicz, G; Simon, TW; Rowlands, JC; Budinsky, RA; American Chemistry Council (RB), Environmental Protection Agency (GP), Ted Simon LLC (TWS), Dow Chemical Company (JCR & RAB); [email protected] Scientific Confidence Framework to Help Support the Application of Adverse Outcome Pathways for Regulatory Purposes An adverse outcome pathway (AOP) describes the causal linkage between initial molecular events and an adverse outcome at the individual or population levels. Whilst extensive activities continue to focus on AOP development, attention is now being paid to processes for establishing scientific confidence in AOPs for different regulatory applications. In addition to both the guidance on weight of evidence in the Sept. 2014 OECD AOP Handbook and ongoing work in developing OECD guidance for AOP-informed Integrated Approaches to Testing and Assessment, we have published a scientific confidence framework (SCF) for AOPs based in part on our previous work in evaluating HTS assays and their prediction models. The SCF consists of: 1) developing the AOP; 2) developing new (or mapping existing) specific assays to key events (KEs); 3) conducting (or documenting) analytical validation of assays; 4) developing new (or mapping existing) models that predict a specific KE from precursor KEs; 5) conducting (or documenting) qualification of the prediction models (e.g., assessing prediction accuracy); 6) defining and explicitly justifying the fit-for-purpose rationale (i.e., where there is sufficient scientific confidence to use the AOP for a specific purpose such as priority setting, chemical category formation, integrated testing, predicting in vivo responses, etc.); and 7) for regulatory acceptance, processes to ensure robust and transparent review (e.g., dissemination of all necessary datasets, model parameters, algorithms, etc., to permit fully independent verification and independent scientific peer review). The SCF has been illustrated using three different AOPs for several typical regulatory applications and is an important step forward in assessing, and in promoting explicit documentation and communication of, the scientific basis of AOPs (Patlewicz et al., 2015, Regul Tox Pharmacol 71:463-77).

P.149 Bell, MZ*; Yang, ZJ; State University of New York at Buffalo; [email protected] Nuclear Energy in the Media: Examining How Fukushima Influenced Debates over the Future of Nuclear Although the ‘nuclear renaissance’ of the 1970s was never quite fulfilled, nuclear continues to be a stable source of low-carbon energy across the world, including the US. However, the continued reliance on nuclear energy has not been without controversy, with issues tied to waste storage, proliferation, nuclear safety, and public risk perception. Although risk perception has its unique cognitive foundations, it can be tied to media representation of nuclear energy, which is often cited either as heavily scaremongering or as deceptively rose-tinted in its portrayals. Over the years, several media content analyses have been conducted, often centering on risk events such as Three Mile Island or Chernobyl. Given the timely opportunity, this paper continues this legacy, taking the 2011 Fukushima Daiichi incident to explore the ways in which a new nuclear risk affects the representation of ‘nuclear energy’, seeking to understand whether representations in three major US newspapers differed before and after the Fukushima accident. Results show that these newspapers overall portrayed nuclear energy in a negative light, but coverage was more negative in 2011. There were also shifts in the amounts of renewable energy and fossil fuel mentions after 2011, which suggests that the disaster engendered debate over the future of nuclear. Lastly, there also appears to be evidence of a risk tradeoff between nuclear and climate change risks. Fukushima additionally introduced natural hazards into the debate about nuclear energy, a topic that was minimal prior to this particular incident. Overall, results from this content analysis indicate that Fukushima had some, albeit limited, impact on the newspaper discourse surrounding nuclear energy.

M3-A.4 Belzer, RB*; Lewis, RJ; Good Intentions Paving Company; ExxonMobil Biomedical Sciences, Inc.; [email protected] Discarding Data Overstates Risk Estimates from Exposure to Ambient Air Pollutants Characterizations of risk from ambient air pollutants often depend on chamber and observational epidemiology studies that measure differences in pulmonary function (e.g., forced vital capacity [FVC] and forced expiratory volume [FEV1]). Study protocols, such as those developed by the American Thoracic Society, require three to eight clinically acceptable measurements for each test (called a “maneuver”). However, every published study statistically analyzes only single values for each maneuver. All other data are either unreported or discarded. In short, pulmonary function tests produce variable results, but variability is routinely discarded – a key information quality defect. This paper documents the consequences of discarding clinically valid data in favor of point estimates. First, differences in pulmonary function previously attributed to air pollutant exposure may be indistinguishable from normal variability once normal variability is taken into account. Second, population standard errors are certain to be greater than reported in the literature. This means small differences previously characterized as statistically significant and biologically meaningful may be neither. Further, statistical significance properly calculated may be restricted to fairly large differences in ambient pollutant concentrations. Third, decision-makers’ past reliance on concentration-related differences in pulmonary function to justify tightening national ambient air quality standards may have been scientifically invalid.
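A toy simulation (all values hypothetical, not drawn from the paper) illustrates the consequence of reporting a single value per test: the number kept still carries maneuver-to-maneuver noise that the downstream analysis then treats as zero.

    import numpy as np
    rng = np.random.default_rng(1)

    # 50 subjects, 3 clinically acceptable FEV1 maneuvers each
    n_subj, n_man = 50, 3
    true_fev1 = rng.normal(4.0, 0.5, n_subj)[:, None]   # between-subject sd 0.5 L
    maneuvers = true_fev1 + rng.normal(0, 0.15, (n_subj, n_man))  # within sd 0.15 L

    reported = maneuvers.max(axis=1)        # the single "best" value a study keeps
    within_range = maneuvers.ptp(axis=1)    # spread across a subject's maneuvers

    print("sd of reported single values (L):", reported.std(ddof=1).round(3))
    print("median within-subject range (L):", np.median(within_range).round(3))
    print("subjects whose maneuvers span > 0.1 L:", (within_range > 0.1).mean())

Under these placeholder numbers, a pollutant-attributed FEV1 difference of 0.1 L falls inside most subjects' own maneuver-to-maneuver spread, which is the paper's first point about normal variability.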

W3-E.2 Benavides, AD*; McEntire, D; Carlson, E; Keyes, L; University of North Texas; [email protected] Spontaneous Planning, Governance Structure, and a Public Health Emergency: Ebola in Dallas, Texas The Ebola virus became a part of household conversations in October of 2014, when the United States experienced its first death from the disease and witnessed its spread to citizens. The presence of a deadly infection in our urban areas highlights the importance of coordination among leaders and decision-makers. Dealing with insufficient expertise in emergencies as complex as an Ebola outbreak can be dangerous. Therefore, this paper asks how officials can overcome barriers to governance and integrate spontaneous planning in a public health emergency context. It explores how principal actors in the DFW Ebola crisis served as essential components of a multi-actor response protocol. Multiple methods, including interviews and the administration of a survey, were used to achieve these objectives. Research evidence was obtained through interview findings from key actors including the Dallas County judge, mayor, city manager, emergency management coordinator, the lead epidemiologist, hospital leadership, representatives from the Centers for Disease Control and Prevention, and their local counterparts. Survey evidence breaks down the DFW Ebola decision-making process into governance structures, emergency management, and public health. Findings illustrate how local government structures inherently constrain adaptable or on-the-fly policy-making and suggest a process to allow governments to adapt and change quickly and effectively in crises. Moreover, findings demonstrate how the emergency management profession is expanding to address the development and implementation of on-the-fly policy decision making. This paper’s findings will inform public leaders on how to manage and maintain flexibility in a public health crisis, thus benefiting society.

P.178 Benouar, D*; Zelloum, H; El Hadj, F; Universite USTHB; [email protected] Forensic investigation style of an unexpected large-scale urban disaster: the November 10, 2001 Algiers floods and debris flow This paper attempts to present the impact of the unexpected floods and debris flow that occurred in the city center of Algiers (capital of Algeria) on November 10, 2001. According to official reports, this event caused the loss of 712 human lives, injured 350 people, and left 116 missing. Some 1,800 housing units were damaged, and 56 schools, several bridges, roads, and public works suffered considerable damage. The streets of the affected area were covered by the debris flow, which deposited more than 800,000 m3 of mud and debris. More than 350 cars were also damaged, and several of them, as well as buses carrying passengers, were buried under the debris flow and mud. Unfortunately, there is a great deficit in ongoing research on how science is used to facilitate social and political decision-making in the context of risk, particularly for unexpected disasters. Naturally, this requires an integrated approach of research and policy-making across all hazards and disciplines. The analysis of this event has allowed us first to make an inventory of the vulnerability factors of the site and its environment that contributed to the human and economic losses, such as the catchments, the high density of inhabitants, open spaces, soil cover, topography, the physical vulnerability of buildings, roads and bridges, and the vulnerability of public buildings. This analysis has allowed, according to the available data, its integration into the urban design or reconstruction phase through standards and regulations to reduce the risks. For existing and constructed sites, risk reduction consists in making new decisions to reduce the vulnerability of the environment and enhance the resilience of the population. Recommendations are made for disaster risk reduction for the affected site of Algiers in terms of reducing vulnerabilities, and thus reducing risk and curbing human and economic losses through sound knowledge-based measures.

P.68 Benouar, D; USTHB; [email protected] Researching Engineering Causes in the 2003 Boumerdes-Algiers (Algeria) Earthquake Disaster This paper attempts, as a case study, to investigate the engineering causes of the Boumerdes-Algiers (Algeria) earthquake disaster of 21 May 2003, which caused the loss of more than 2,278 human lives, injured more than 11,450, left 1,240 missing and 182,000 homeless, and destroyed or seriously damaged at least 200,000 housing units and about 6,000 public buildings in five wilayas (provinces). On Wednesday 21 May 2003, at 19h 44m 2s (18h 44m 2s UTC), a destructive earthquake occurred in the Boumerdes-Algiers region, affecting a rather densely populated and industrialized region of about 3,500,000 people. It is one of the strongest recorded seismic events in North Africa. The depth of the focus was about 10 km, and the magnitude of the earthquake was calculated at M = 6.8. The main shock lasted about 40 sec; the two largest aftershocks (both of M 5.8) occurred on 27 and 29 May 2003. Disasters are increasingly being understood as ‘processes’ and not discrete ‘events’. Moreover, the causes of disasters are driven by complex engineering, socio-economic, socio-cultural, and various geophysical factors. Such interacting driving factors, occurring across a range of temporal and spatial scales, combine in numerous ways to configure disaster risks. Using selected disasters in Algeria, the dynamics of such risks and their configurations are explored using a new forensic-style approach. Indeed, it seems that the more we have learned, the more we are losing. The approach is based on the idea that this situation arises because much current research is still informed by a focus on surface symptoms of observations and events rather than the critical causes and processes of disaster risk construction and accumulation.

M3-E.2 Bergeson, LB; Bergeson & Campbell, PC; [email protected] Adapting Governance Approaches to Evolving Technologies Technologies evolve at a pace that far exceeds the ability of traditional governance and oversight systems to address them. This is truer today than ever, as the pace of technological innovation is faster, more furious, and more complicated than before. Domestic and international governance bodies are challenged to use the legal and regulatory authorities they are authorized to enforce under existing laws, and are creating new and more innovative approaches to assure that new technologies meet statutory safety and other legally enforceable standards. These approaches include a combination of law, policy, guidance, standard-setting, and voluntary stewardship initiatives. This presentation will consider these new approaches and discuss the need for, and success of, adaptive management and governance systems in keeping pace with the speed of innovation.

M4-H.3 Berner, CL*; Flage, R; University of Stavanger; [email protected] Potential uses and limitations of the NUSAP notational scheme when treating uncertainty in semi-quantitative risk assessment Efforts to develop approaches to represent uncertainty in risk assessment have followed both quantitative and semi-quantitative lines. The Bayesian framework and imprecise probability are examples of quantitative representation formats; whereas semi-quantitative risk assessment refers to supplementing quantitative characterisations with qualitative assessments of aspects not captured by the produced numbers. Semi-quantitative risk assessment schemes based on sensitivity/strength-of-knowledge assessments as well as the more formalised “assumption deviation risk” concept have been suggested, both of which have apparent parallels with the NUSAP assessment scheme of uncertainty and quality in science for policy. In the present paper we explore these parallels, as well as potential uses and limitations in the use of the NUSAP system in risk assessment. More depth is added to the analysis and discussion by distinguishing between risk defined in terms of probability and risk defined in terms of uncertainty. The latter type of risk perspective has been shown to support a model of problem-solving strategies covering the case of high system uncertainties; a situation of particular relevance to the issue of uncertainty representation. We conclude that there are parallels between the assessment schemes which make it possible to advantageously utilise the NUSAP system in a risk assessment setting.

T4-I.3 Berube, DM*; Prince, GP; North Carolina State University; [email protected] Analysts Eschew New Tools for Big Data Scrutiny This paper examines a variation of the endowment effect called the attachment effect and is supported by an NSA analytics grant with the Laboratory for Analytic Sciences at NCSU. We offer preliminary data from a review of the literature and an expert sample of academics, developers from data analytics companies, and government employees. Within the intelligence analytic community there is a tendency to be biased toward an analytic tool even when it is inferior to another. A 2014 poll (Piatesky, 2014) of 3,285 data mining users found the average number of tools used by data miners was only 3.7. While data miners profess using multiple tools, they still conduct 76 percent of their work (Rexer, 2013) in their primary tool. Data miners use different tools for different data sets and purposes; nonetheless, they seem to fall back into patterns of behavior, if not habitual patterns. Nevertheless, 40 percent of data miners reported they are struggling to manage and process their data in their existing relational database tables, and a significant minority (36 percent) complained about the limitations of their existing analytics software. This group called for faster analytics systems, as their data is too big to process through their existing systems as quickly as they need, which caused them to give up on the process or omit data from analyses (Booth, 2014). However, despite this clamor, the commitment to develop new tools is meaningless if they remain unused. We are working to determine the variables associated with users' attachment to some tools that leads to inferior assessment. How these variables affect the decision to switch from a presumptive tool to a new one is the core of this work. Our preliminary research will distill a set of variables that may assist in making analytic tools more usable.

P.125 Besley, JC*; Dudo, AD; Yuan, S; Michigan State University; [email protected] Qualitative Interviews with Science and Risk Communication Trainers about Communication Goals Twenty-four qualitative interviews were conducted with science communication trainers to better understand how these trainers address goal setting in their work. The results suggest that trainers believe the scientists they train want help achieving a range of personal and societal goals. Personal goals were primarily related to career, while societal goals were primarily related to ensuring that science is part of decision-making related to health or environmental risk (e.g., climate change or vaccines). Interviews also suggested that the training being offered rarely explicitly addresses what intermediate objectives might allow scientists to achieve their overall goals. There was recognition that increasing knowledge was unlikely to have a substantial effect on how non-scientists view issues involving science or risk, but the training being provided appears to emphasize communication skills such as clarity and message selection. What appears to be missing is any discussion of how scientists could attempt to communicate elements of trustworthiness (i.e., warmth and competence) or procedural fairness (i.e., a willingness to listen and respect for others’ views). In some cases, trainers noted that their training includes a focus on trust-related strategies such as helping scientists be more relaxed and personable in front of a camera or helping them listen to audiences, but these were typically discussed in the context of enabling knowledge transmission rather than as potentially complementary pathways to realizing scientists’ overall goals. Another potential limitation of the training currently being offered is that trainers generally said they allow the scientists themselves to set their goals rather than providing guidance on what goals are most likely to be effective. The interviews are part of a larger project aimed at understanding scientists’ views about public engagement related to health and environmental risks, as well as other issues.

T3-C.4 Besley, JC*; McCright, AM; Elliott, KC; Oshita, T; Michigan State University; [email protected] Conflict of Interest Perceptions and Risk-Related Research Partnerships Scholars at universities and other social institutions are exploring research partnerships with private sector organizations in the face of decreased access to government funding. While it may be possible to design procedures that limit the actual effect of conflict of interest (COI), there is a danger that public observers may still see the resulting research as tainted. An initial experiment (n = 501, through mTurk) tested how the inclusion of four different types of partners (industry, government, university, and non-governmental organizations) might affect COI-related perceptions. The initial 2x2x2x2 between-subject experiment focused on how respondents viewed a hypothetical research partnership aimed at better understanding the risk of low levels of trans fats in food. Preliminary results suggest that the inclusion of an industry partner substantially increases (a) perception of COI while decreasing (b) perceptions of the procedural fairness of the research and (c) perceived legitimacy of the research as a source of evidence for decision-making. These main effects occurred regardless of the inclusion of other partners. Inclusion of a non-governmental partner had substantially smaller negative effects on perceived COI and similarly small positive effects on procedural fairness and legitimacy. Inclusion of government and university partners had limited effects, and there were no interaction effects. Ideology also did not appear to serve as a moderator of COI-related effects, but the results suggested that conservatives generally see lower COI as well as both more fairness and legitimacy across all partnerships. Pretests were used to ensure that the organizations used in the experiment had relatively positive reputations. Additional data collection to increase statistical power is ongoing. Additional experiments aimed at assessing potential procedures for mitigating COI perceptions are also planned; these will involve additional issues (e.g., genetic engineering).

M3-J.3 Bessette, DL*; Cwik, BP; Mayer, LA; Tuana, N; Pennsylvania State University; [email protected] Informing climate change risk management and decision support in New Orleans: A new value-informed approach Conducting stakeholder-based climate-change risk management requires identifying and engaging tradeoffs between a wide range of competing objectives and values. Not only do the objectives and values of stakeholders differ, but so do those of scientists, subject-matter experts, and decision analysts. Previous research by the authors has shown that experts bring different, sometimes conflicting, values to bear at different points in a decision support process, e.g., between decision structuring tasks and decision evaluation tasks. Assuming consistent values exist when they do not can lead to decision support processes and outcomes that at best lack relevance and at worst are scientifically unsound and ethically indefensible. To avoid such outcomes, particularly when the stakes are high, as is the case with climate change risk, analysts must help to identify and analyze “gaps” between environmental and risk modelers’ values and the values of stakeholders relying on those models to make decisions. Such gaps have been identified previously and successfully using “Value-informed Mental Models,” or ViMM, which incorporate the elicitation and qualitative and quantitative analysis of values into the Mental Models Approach developed at Carnegie Mellon University. Here we report on the development of two ViMM, an expert and a stakeholder model, each focusing on climate-change risk-management strategy tradeoffs in the New Orleans area. For the latter, special attention was paid to representing as large and varied a sample of residents as possible. Despite an increased risk of sea-level rise, hurricane intensity, storm surge, and coastal erosion, research into how and what residents of the New Orleans and Gulf Coast region value with respect to climate change is lacking. The two ViMM developed here attempt both to address that need and to facilitate the development (and evaluation) of a Multi-objective Robust Decision-making (MORDM) tool intended to aid in local risk-management strategy development.

M3-G.1 Bier, VM*; Liu, S; University of Wisconsin-Madison; [email protected] Capacity Model for Protection of Transportation Networks We develop a transportation network-capacity model and an attacker model, in which the attacker chooses one or more arcs to interdict in order to maximize the expected cost to users of the network, taking into account the success probability of an attack. We adopt the “gravity model” to estimate the demand for travel between each origin and destination in a realistic network, on the basis of the population of each node and the distance between them. The gravity model has been used many times to explain the choices of large numbers of individuals in urban travel-demand contexts. We then model the relationship between traffic density (e.g., vehicles per mile) and travel speed. Finally, we develop a user-equilibrium model for traffic assignment, in which drivers attempt to minimize their travel times, with the result that different routes between the same origin and destination will have the same travel time. To enhance the realism of our model, we assume that each driver has a reservation travel time, such that the driver would not travel if the equilibrium travel time exceeds the reservation travel time. In that case, drivers that balk incur a cost equal to the reservation travel time, which is assumed to represent the value of the foregone trip. We are currently working on a defender model to allocate limited defensive resources to minimize the expected damage from a successful attack, taking into account the possibility of deterring an attack.
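For concreteness, a gravity-model demand matrix can be computed as below; the populations, distances, and constants k and beta are hypothetical placeholders, not values from the talk.

    import numpy as np

    pop = np.array([50_000, 120_000, 80_000])     # node populations
    dist = np.array([[1., 10., 25.],
                     [10., 1., 15.],
                     [25., 15., 1.]])             # inter-node distances (miles)
    k, beta = 1e-5, 2.0                           # scaling and distance-decay
    trips = k * np.outer(pop, pop) / dist ** beta # origin-destination demand
    np.fill_diagonal(trips, 0)                    # ignore intra-node travel
    print(trips.round(1))

The user-equilibrium and interdiction layers then operate on this demand matrix: removing an arc re-routes these trips or, past the reservation travel time, suppresses them.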

W3-I.4 Biles, BA*; Daneker, MD; Arnold & Porter LLP; [email protected] Legal Perspectives on Data Sharing Research data analyzed and published for purposes of developing scientific knowledge and guiding regulatory and other decision-making often are of interest and use to scientists other than those that generated or assembled the original data set. Obtaining original study data, however, is not always straightforward, as data protection rules, time and costs associated with preparing and transferring data, and investigator interest (or lack thereof) in sharing data may prove to be obstacles to sharing data. Especially where research is sponsored by public funding sources, laws have been promulgated to assure, in the interest of transparency and verifiability, that groups other than the original investigators can access and analyze these data. Some laws, such as the Freedom of Information Act, compel Federal agencies holding research data to provide them (with certain limitations to protect individual identities of study subjects) to US citizens, whereas other arrangements, such as Technology Transfer Agreements, allow for less restrictive data sharing. An overview of data sharing laws, their legal basis, rationale and objectives, their limitations as well as the legal obligations of the data recipients will be presented. Examples of successful and failed data sharing experiences will be presented.

M3-I.3 Blackman, H; Boise State University; [email protected] Lifestyle, chemical, and radiation risks: Differences in perception, regulation, and choice Communicating risk concepts to management and workers requires an informed approach. Communicating and achieving risk management objectives requires applying behavior-based human factors to engineering and defense-in-depth designs. Some of the greatest challenges in improving the health of the workforce and their families involve lifestyle choice programs.

T2-G.1 Blythe, JS*; Kothari, V; Koppel, R; Smith, S; University of Southern California; [email protected] A Toolkit for Exploring the Impact of Human Behavior on Cybersecurity through Multi-agent Simulations Multi-agent simulations are useful for understanding the consequences of human behavior in interaction with complex systems that cannot easily be predicted analytically. Such complex systems may involve software or other mechanisms with which people interact, a workflow involving people, or interactions arising from a person's organization and its policies. Over several years we have been designing and experimenting with an agent toolkit called DASH, which supports building agents with important aspects of human behavior. First, DASH includes a reactive planning module that allows for goal-directed behavior where goals and methods are continually re-evaluated over time. Second, DASH includes a dual-process model with deliberative and situation-response components offering alternatives for the agent's next action, with features such as fatigue and cognitive load influencing which is chosen. Third, DASH includes support for utility-based reasoning about the possible effects of actions using agent-specific mental models that may be incomplete or incorrect. DASH has been used to model the selection and use of passwords by agents that do not explicitly interact with each other. Sharing passwords between sites is one way to handle the cognitive demands of remembering many passwords, and the simulation demonstrates how this means that toughening password constraints on a site can surprisingly decrease its overall security. Utility-based reasoning about mental models allows results from surveys gathering utility data on passwords to be used directly in DASH agents with minimal programming, and we show the impact of a recent survey on password utilities on this simulation. DASH has also been used to demonstrate the impact of affect on decision-making using its dual-process component. We also discuss experiments with DASH cyber-attack agents, showing the impact of the agents' goals and biases on attack behavior. DASH is available open-source and on the DETER testbed.
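The sketch below is a toy dual-process agent loop written for illustration only; it is not the DASH API, and every name in it is invented. It shows the arbitration idea the abstract describes: a fast situation-response module and a slower deliberative module each propose an action, and cognitive load tilts the choice toward the fast, habitual path.

    import random

    def deliberate(goal, mental_model):
        # slow path: consult the agent's (possibly incorrect) mental model
        return mental_model.get(goal, "fallback_action")

    def situation_response(stimulus):
        # fast path: habitual stimulus-response rules
        return {"login_prompt": "reuse_password"}.get(stimulus, "ignore")

    def choose_action(goal, stimulus, mental_model, cognitive_load):
        fast = situation_response(stimulus)
        slow = deliberate(goal, mental_model)
        # higher load -> higher chance the habitual response wins
        return fast if random.random() < cognitive_load else slow

    model = {"access_account": "create_unique_password"}
    print(choose_action("access_account", "login_prompt", model, cognitive_load=0.8))

Run over many agents and sites, this kind of arbitration is what lets a simulation reproduce the finding that tougher password rules can push users toward reuse and lower overall security.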

T4-I.2 Borgonovo, E; Hazen, GB; Jose, VRR*; Plischke, E; Georgetown University; [email protected] Scoring rules, value of information, and sensitivity analysis Scoring rules and value of information are useful tools in decision and risk analysis that measure the information content of data and forecasts. In this talk, we bridge these two seemingly separate areas of research by drawing out their inherent connections. In particular, we obtain and analyze analytic expressions for the value of information associated with some well-known scoring rules. We show that the resulting value of information sensitivity measures are, in fact, global sensitivity measures that fall under a common rationale. We study this common rationale and obtain if-and-only-if conditions that characterize the properties of probabilistic sensitivity measures. These findings allow us to understand which of the presently used global sensitivity measures can be regarded as value of information sensitivity measures. We discuss an application and provide some numerical examples.
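A small numerical example (all utilities and probabilities hypothetical) shows the two quantities being bridged: the expected value of perfect information for a binary-event decision, and the expected quadratic (Brier) score of a forecast of the same event.

    import numpy as np

    p = 0.3                                  # P(event)
    U = np.array([[10., -5.],                # action 0: utility if event / if not
                  [2.,   2.]])               # action 1: a safe alternative
    prior_best = max(p * U[a, 0] + (1 - p) * U[a, 1] for a in range(2))
    with_info = p * U[:, 0].max() + (1 - p) * U[:, 1].max()
    print("EVPI:", with_info - prior_best)   # = 2.4 with these inputs

    q = 0.3                                  # forecast probability of the event
    expected_brier = p * (1 - q) ** 2 + (1 - p) * q ** 2
    print("expected Brier score:", expected_brier)

The talk's results concern when measures of the second kind can be given exactly the decision-theoretic interpretation of the first.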

W5-C.4 BORRAZ, O*; BEAUSSIER, AL; HERMANS, M; PAUL, R; WESSELING, M; Center for the Sociology of Organisations (CNRS-Sciences Po); [email protected] When is safe safe enough? Comparing risk-based inspection regimes in Europe The introduction of risk-based regulation in Europe across a variety of policy domains in recent years has been accompanied by hopes for greater regulatory consistency across member states. Yet such hopes rely heavily on the so far under-researched practical implementation of risk-based regulation. How are risks actually assessed by those in charge of enforcing risk-based regulation and controlling regulatees’ compliance on the ground? By comparing the ways in which inspections are organised in two policy domains (food safety and occupational health and safety) and four countries (Great Britain, France, Germany and the Netherlands), we highlight, first, various interpretations of the notion of risk and how they translate into practice. Variety in such interpretations, second, can be accounted for by the different legal frameworks and state traditions in which they are embedded, as well as by the role of professional groups in either embracing or resisting risk-based approaches. Based on these two contributions, we argue that the introduction of risk-based regulation in Europe, far from achieving greater consistency across member states in the implementation of EU rules, can actually increase differences.

T2-J.1 Bostrom, A*; Vidale, JE; Ahn, A; University of Washington; [email protected] Earthquake experiences, risk perceptions and Early Warnings on the U.S. West Coast Earthquake early warning (EEW) systems are in development for the U.S. West Coast, and as of this year are being tested in California (CA) and Washington (WA). Currently deployed EEW systems, such as those in Japan, have helped reduce harm from earthquakes. While reliable detection technologies and the speed and accuracy of early earthquake warnings are essential to achieve this, equally important is the reliance of EEW on: human mediation; channels for issuing warnings; familiarity and institutionalization of warning procedures; settings in which systems are used; and system goals and objectives. Interpretations of warnings and responses to them depend on cognitive, emotive, social and institutional contexts, as well as the physical and social contexts in which warnings reach people. This paper reports the results from a series of 2014-2015 surveys on the U.S. West Coast that assess perceptions of earthquake risks and EEW. A higher proportion of WA and CA residents (over 60%) have experienced earthquakes than Oregon residents; initial analyses suggest that WA and CA residents also see a correspondingly greater potential for benefiting from EEW. Findings demonstrate the importance of personal experience and personal emergency planning in assessments of and demand for EEW. Because EEW provides mere seconds to minutes of lead time, giving people context-appropriate and effective actionable information requires setting the stage for action by working with communities and institutions to develop goals, procedures, and expectations. Initial steps toward this include the integration of EEW into ShakeOut. Acknowledgments: Funding from NSF EAR 1331412.

W3-D.4 Boué, G*; Cummins, E; Guillou, S; Membré, JM; Le Bizec, B; Antignac, JP; Oniris and University College Dublin; [email protected] Development of an integrated risk-benefit assessment model to evaluate the health impact of breast milk and infant formula diets The first months of an infant’s life are crucial for short- and long-term healthy physiological development. Breast milk is recognised as the best natural diet, but powdered infant formula currently represents a necessary alternative in Western countries. Various beneficial and adverse health effects in the microbiological, chemical and nutritional fields have been associated with both diets. In this context, the objective of this study was to develop an integrated risk-benefit assessment model to evaluate the overall health impact of breast milk and infant formula diets. The model focused on factors known to contribute health effects in each field, namely Cronobacter sakazakii for microbiology, polychlorinated biphenyls for chemistry, and docosahexaenoic acid for nutrition. A probabilistic second-order model that separates variability and uncertainty was developed and implemented in Excel 2010 using the @Risk software (version 6.3.1) to carry out Monte Carlo simulation. Data were collected from the scientific literature, reports of food safety agencies, and expert elicitations. Model outputs were expressed in the same unit (Disability Adjusted Life Year, DALY) to compare the three components. In this communication the model will be presented along with the main outputs: the assessment of risks and benefits associated with both diets, i.e., 6 months of breastfeeding vs 6 months of formula feeding. In addition, outputs from two scenarios of milk preparation will be discussed: powder hydration with water at ambient temperature or with boiled water. This model will enable progress on the evaluation of potential management options, leading to recommendations on infant diet and guidance on milk preparation techniques. The authors would like to acknowledge the technical training from the Erasmus+ Intensive Programme “Quantitative Tools for Sustainable Food and Energy in the food chain” of the European Union, Project No: 2014-1-MT01-KA200-000327.
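The second-order structure can be sketched as a two-dimensional Monte Carlo; all distributions and parameters below are placeholders rather than the study's values. The outer loop samples uncertainty (imprecisely known parameters), the inner loop samples inter-individual variability.

    import numpy as np
    rng = np.random.default_rng(2)

    n_unc, n_var = 1000, 5000
    mean_daly = np.empty(n_unc)
    for i in range(n_unc):
        slope = rng.lognormal(-6.0, 0.5)                 # uncertain dose-response
        exposure = rng.lognormal(2.0, 0.8, size=n_var)   # variable exposure
        mean_daly[i] = (slope * exposure).mean()         # DALYs per infant
    print(np.percentile(mean_daly, [2.5, 50, 97.5]))     # uncertainty interval

Running one such model per component (microbiological, chemical, nutritional) and per diet scenario yields DALY distributions that can be compared on a common scale.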

W4-C.4 Bouder, F.E.*; Maastricht University; [email protected] Risk tolerance in the context of genetic risks A major dilemma in deciding whether to accept genetic risk information is that patients face preventive decisions based on information that may turn out to be misleading or of little use. On the one hand, the promise of using personal genomics in predictive and precision medicine has led to a surge of genetic risk information provided not only to patients – usually through the mediation of healthcare professionals – but to virtually anyone, e.g., via companies offering their services online. Yet therapeutic advancements are slow to materialise, and new treatments often lag behind improvements in testing technology. The starting point of the MindTheRisk project has been to raise a number of socio-political questions: Are lay publics – including non-geneticist healthcare practitioners – satisfied with the quality of the information they receive? Do non-geneticists feel that they understand the risks involved? Do regulators take this aspect on board when they decide to allow or restrict genetic testing or to communicate about genetic testing? How is this new technological development regulated and by whom, e.g., by government or self-regulated by industry? This talk is an exploratory attempt to reflect on the concept of ‘risk tolerance’ in relation to genetic information. The author also suggests some directions for research. The project is based on desktop research supplemented by pilot interviews conducted with the European Medicines Agency, national regulatory agencies, academics, and members of a patient organisation, all directly involved in policy.

W5-C.3 Bouder, F.B.*; Maastricht University; [email protected] Risk-managing the “no unsafe food” goal in Europe For over two decades risk has been at the centre of food policy in Europe. The roots are to be found in the BSE crisis, which had durable implications for the remodelling of the food regulatory regime in the EU. Sporadic crises – from tainted chicken to Spanish cucumbers and, more recently, the horsemeat scandal – suggest that, although desirable, food safety remains a controversial topic at EU and member state level. Does the multi-level regulatory model still work? Does the approach set forth in the aftermath of the BSE crisis need revisiting? This paper explores the goals of food safety policy in Europe and, crucially, how these goals are interpreted or re-defined at member state level. Although member states are in charge of implementing EU-wide food safety goals, there is still insufficient comparative research into national practices, or it is wrongly assumed that harmonisation will bring uniformity. This paper attempts to partly fill this research gap. Based on in-depth desktop research combined with interviews in four member states (France, Netherlands, UK, Germany), this paper suggests that the goal of “no unsafe food” required by EU Regulation 178/2002 has given rise to multiple and conflicting interpretations that reflect historical, practical and economic specificities. As a result, different approaches to risk regulation, as well as different levels of acceptable risk, co-exist with an EU goal that should not be interpreted as a requirement to enforce absolute safety.

W4-J.1 Boyd, AD*; Furgal, CM; Driedger, SM; Jardine, CG; Washington State University; Trent University; University of Manitoba; University of Alberta; [email protected] Trust, Perception and Response in Indigenous Health Risk Communication: The Case of Lead Exposure and Inuit Health Indigenous groups that maintain strong relationships with the land and animals around them are often at greater risk for exposure to environmental contaminants than other populations around the world. Lead shot has been commonly used to hunt waterfowl and other wildlife, and if the meat is not cleaned thoroughly, it can transfer lead to the consumer at levels that lead to concern for some segments of the population. As a result, the use of lead shot for hunting is a source of contaminants for Indigenous populations, including Inuit residing in the Arctic. Therefore, it is critical to evaluate risk perceptions and the level of awareness and comprehension of issues about reducing the use of lead shot for harvesting animals in these contexts. This study examined reception, perception and awareness of messaging pertaining to lead arising from the results of the Nunavik Child Development Study – a longitudinal child health study gathering information on health and well-being among Inuit mother-infant pairs in Nunavik, Canada. Results released in 2011 included advice to eliminate the use of lead shot in hunting practices because of concern about levels of lead exposure for the developing fetus. In 2013, community surveys were conducted using tablet computers among Inuit residents 18 years of age and older in one Nunavik community. In total, 113 Inuit residents were surveyed about their food consumption, awareness of lead related messages, and trust in health risk decision makers. Study results demonstrate the need to determine the often-complex factors that contribute to trust or distrust in health risk communicators and how this influences behavioral change and perceptions of traditional activities. We discuss factors that may contribute to better environmental health-risk messages with Indigenous populations and the need for future communication efforts that provide culturally relevant risk communication.

P.205 Boyd, AD*; Furgal, CM; Dickson, D; Washington State University; [email protected] Communicating Environmental Health Risks to Indigenous Populations: A Systematic Literature Review and Recommendations for Future Research Indigenous populations are recognized as a potentially vulnerable group to environmental health risks due to their intimate relationship with the environment, and reliance on local environments for aspects of culture, health and well-being in many circumstances. Barriers to effective communication and health risk management are linked to cultural, economic and geographic factors. A systematic literature review was conducted to consolidate peer-reviewed research published between 1980 and 2014 on the communication of environmental health risks to Indigenous populations. The comprehensive literature review procedures included searching databases and key journals representing various fields in communication, environmental health, and Indigenous studies. The review yielded 4,469 potential articles, of which 14 met the inclusion criteria. These 14 articles were analyzed to provide lessons learned for effective risk communication. Factors that influence successful risk communication strategies with Indigenous populations include: (1) developing messages that are congruent with the populations’ cultural beliefs and understanding of the environment; (2) including Indigenous populations in message design and delivery; (3) using credible and trustworthy spokespeople in message delivery; (4) identifying and utilizing effective communication materials and channels; and (5) ensuring that messages are understandable to the target audience. Gaps in the literature include the lack of longitudinal studies that empirically measure changes in perception, awareness and behavior, as well as a general lack of theory-based research. Results from this review provide directions for future research to help guide the development of more effective risk communication research and strategies for Indigenous populations.

W5-B.5 Boys, KA; Caswell, JA; Hoffmann, SA*; North Carolina State University, University of Massachusetts Amherst, and USDA Economic Research Service; [email protected] Use of Internet Data to Evaluate Capacity in the Global Food Safety Certification Industry Private and third-party certification programs are increasingly relied upon as part of food safety assurance systems. As it has been proposed that the Food Safety Modernization Act (FSMA) rely, at least in part, on this certification system to assure the safety of imported food and animal feed, this issue has important implications for both domestic and foreign food industry stakeholders. This study offers a first, in-depth, and comprehensive effort at assessing the global food safety certification industry. The breadth, organization, and capacity of both the organizations that establish food safety standards and certifications and the firms that certify individual farms, food processors, and food manufacturers to a broad range of national and international food safety standards are considered. In conducting this study, a systematic web search protocol was developed, refined, and employed to generate a unique database of both certifications and certification bodies that provide accreditation to food safety standards around the world. Over 150 national and international firms that certify to one or more food safety standards have been identified. The data set includes the characteristics of each of these organizations, including their annual revenue, the scope and location(s) of their business activities, and their legal organizational structure. Analysis based on these data yields important insights into the capacity, competitiveness, and deficiencies of the international food safety certification system. The findings of this study offer useful lessons regarding the opportunities and limitations of conventional web search approaches to enhancing empirical understanding of this important industry.

W3-H.1 Bradford, K; Hegglin, M; Abrahams, L; Klima, K*; Carnegie Mellon University; [email protected] A Heat Vulnerability Index and Adaptation Solutions for Pittsburgh, Pennsylvania With increasing evidence of global warming, many cities have focused attention on response plans to address their populations’ vulnerabilities to natural hazards. One such natural hazard that is likely to be exacerbated in both frequency and intensity by climate change is extreme heat. A heat vulnerability index could help inform adaptation decisions, such as the optimal location of cooling centers. Here we focus on Pittsburgh, Pennsylvania and ask two questions. First, what are the top factors contributing to heat vulnerability within the population, and how do these characteristics manifest geospatially throughout Pittsburgh? Second, where should the City locate additional cooling centers to optimally address the vulnerability of these at-risk populations? We use national census data, ArcGIS geospatial modeling, and statistical analysis to determine a heat vulnerability index for the city of Pittsburgh and optimal cooling center placement. We find that Glen Hazel, North Oakland, and Homewood South are the most vulnerable census tracts, with the vulnerability factors “living alone” and “being older than 65 and living alone” most strongly related to hospital admittance data. North Oakland was identified as the optimal location for an additional cooling center, despite the fact that it was not ranked as the most vulnerable census tract. This implies that geospatial analysis is an essential tool for decision makers needing to efficiently allocate limited resources to serve the largest need, because optimal strategies are not always intuitive. The remaining optimal cooling center locations differ depending on whether the City chooses to construct a new building or repurpose an existing public building.
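
A minimal sketch of the two computations this abstract describes, under simplifying assumptions: an equal-weight z-score vulnerability index over census-tract indicators, followed by greedy maximum-coverage placement of cooling centers. All tract indicators, coverage relationships, and numbers are hypothetical, and the study's actual weighting and siting methods may differ.

    import numpy as np

    # Hypothetical indicators for 4 census tracts:
    # [fraction over 65 living alone, fraction living alone]
    indicators = np.array([[0.12, 0.25],
                           [0.30, 0.45],
                           [0.22, 0.35],
                           [0.08, 0.20]])
    z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    hvi = z.mean(axis=1)            # equal-weight z-score index
    need = hvi - hvi.min()          # shift to nonnegative "need" weights

    # Candidate site x tract coverage (e.g., tract within walking distance)
    covers = np.array([[1, 0, 1, 0],
                       [0, 1, 1, 0],
                       [0, 1, 0, 1]], dtype=bool)

    # Greedy maximum coverage: add the site covering the most uncovered need
    chosen, covered = [], np.zeros(len(need), dtype=bool)
    for _ in range(2):              # budget: two new centers
        gain, best = max((need[c & ~covered].sum(), i)
                         for i, c in enumerate(covers))
        chosen.append(best)
        covered |= covers[best]
    print("chosen sites:", chosen)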

W4-B.2 Brand, KB; University of Ottawa; [email protected] Value Based Weight of Evidence The steps used to develop a medical diagnostic are proposed as a homologue for the weight of evidence (WoE) processes being championed for application in the context of health risk assessment (HRA). One key step in the development and application of a diagnostic, namely calibration, is presented as being instructive for WoE efforts. While a calibration is arguably not fully realizable in the HRA context of WoE, it provides a model for improving WoE and better revealing its challenges. The norms of identifying a standard of proof (SoP) by exercising subjective expected utility, which are well rehearsed in the medical diagnostic context, are arguably even more instructive for WoE exercises (where this link with the SoP is less well developed). The missed opportunities and the potential blindspots for WoE owing to this gap are identified.

W5-E.1 Brannon, MC*; Lambert, JH; Slutzky, DL; Wheeler, JP; University of Virginia; and Fermata LLC; [email protected] Roadmap for Commercialization of Vehicle-to-Grid Technology in Logistics Fleet Vehicles Vehicle-to-Grid (V2G) technology enables plug-in hybrid and electric vehicles to contribute to the electric grid and microgrids. Fleets of electric vehicles can provide grid services while they are stationary and not being used for freight or logistics services. The technology enables a stronger electrical grid for consumers as well as energy storage and reliability for alternative generation. Many fleet vehicles operate on a structured and predictable schedule. When the vehicles are not in use, they provide grid storage and generate revenue beyond that of freight or logistics services. With scenario-based preferences, this paper will identify and prioritize sources of risk associated with commercialization of vehicle-to-grid technology for logistics applications. The approach supports risk management by industry, government, and military stakeholders and entrepreneurs who seek to commercialize vehicle-to-grid technologies. In particular, the approach addresses how to prioritize investment decisions with a fuller recognition of technology, market, environmental, regulatory, and other trends and emergent conditions.

P.15 Brenkert-Smith, H*; Dickinson, K; Flores, N; University of Colorado; [email protected] Playing with Fire: Assessing the effects of risk interdependency and social norms on homeowners’ wildfire mitigation decisions using choice experiments In the face of the expanding wildfire hazard, the actions that homeowners take to reduce fire risk on private property are crucial. However, homeowners’ wildfire risk mitigation actions are interdependent: the risk that any individual faces is affected by the conditions on neighboring properties. Meanwhile, social norms provide another mechanism linking one’s own risk mitigation choice to neighbors’ choices. Households may be encouraged to undertake more mitigation when presented with social comparisons highlighting high levels of mitigation among neighbors. Our web-based survey of homeowners in fire-prone areas on Colorado’s Western Slope combines survey data on current knowledge, risk perceptions, and practices with choice experiments that vary 1) fuel conditions on neighbors’ property (i.e., neighbors’ risk levels), and 2) neighbors’ mitigation actions (i.e., social norms) in order to assess the impact of these factors on wildfire mitigation choices. Of particular interest are the interactive effects of these different factors. For example, if risk interdependency tends to lead to underinvestment in risk mitigation, can social norms help to overcome this problem and “tip” communities toward higher levels of protection? In this presentation we describe the survey and experiment, and discuss initial findings from our data.

M2-F.1 Bronstein, PA; United States Department of Agriculture Food Safety Inspection Service; [email protected] FSIS strategies to control STECs through improved sanitary dressing procedures The Food Safety and Inspection Service (FSIS) is the public health agency in the U.S. Department of Agriculture responsible for ensuring that the nation's commercial supply of meat, poultry, and egg products is safe, wholesome, and correctly labeled and packaged. FSIS is continually striving to modernize its inspection processes to further reduce the risk for foodborne illnesses associated with the products that it regulates. The agency has increased the amount of information that it collects from tasks performed by FSIS personnel and information from regulated establishments. FSIS has implemented a novel process where recurring critical reviews of this information are used to identify areas for improvement in the inspection process. Once general areas have been identified, the agency solicits possible solutions to issues through crowd-sourcing techniques, including interactive meetings and blogs with personnel throughout various offices within the agency, both in the field and headquarters. The suggestions collected from these processes are further refined by the group to develop specific actions that can be taken by the agency to improve the inspection process and better protect human health. FSIS employed this approach to develop the Salmonella Action Plan, which has served as the agency’s blueprint to reduce Salmonella since its release in 2013. More recently the agency used this process to develop an STEC findings document that outlines six areas that should be addressed by the agency to reduce Shiga-toxigenic E. coli (STEC) associated with FSIS regulated products, which will be detailed in this presentation.

W2-G.4 Brown, M*; Sinha, A; Schlenker, A; Tambe, M; University of Southern California; [email protected] One Size Does Not Fit All: A Game-Theoretic Approach for Dynamic and Effective Passenger Screening An effective way of preventing attacks in secure areas is to screen for threats (people, objects) before entry, e.g., screening of passengers at airports. However, screening every passenger at the same level may be ineffective and undesirable. The challenge then is to find a dynamic approach for screening, allowing for more effective use of limited screening resources, leading to improved security. We address this challenge by introducing a novel threat screening game (TSG) model for screening domains. TSGs are played between a screener and an adversary, with the screener inspecting passengers with the goal of preventing the adversary from passing through security with an attack method that could be used to attack a flight. The screener utilizes different types of screening countermeasures that have: (i) different levels of effectiveness for detecting different attack methods; and (ii) different capacities in terms of the number of passengers that can be processed within a given time window. Effective screening may require a passenger to go through multiple screening countermeasures, but the screener may not be able to use the most effective screening countermeasure combination for every passenger. Hence, the screener may exploit available information such as flight and risk category to categorize passengers to help determine the appropriate scrutiny to apply. The goal of the screener is to find the optimal screening strategy, assigning a randomized combination of screening countermeasures to each passenger. The utility of the screener captures the goal of minimizing the risk of an attack across all passengers for all attack methods.
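
As an illustration of the optimization at the core of such models, here is a toy linear program in the same spirit, far simpler than the paper's TSG formulation: the screener chooses randomized enhanced-screening probabilities per passenger category, subject to a capacity limit, to minimize the adversary's best undetected-entry probability. The effectiveness values, passenger volumes, and capacity below are hypothetical.

    import numpy as np
    from scipy.optimize import linprog

    e = np.array([0.9, 0.7, 0.8])    # detection effectiveness per category
    n = np.array([100, 300, 200])    # passenger volume per category
    cap = 250                        # enhanced screenings available per window

    # Variables [x1, x2, x3, t]: x[j] = probability category j gets the
    # enhanced countermeasure; t = worst-case undetected probability.
    # Minimize t subject to 1 - e[j]*x[j] <= t  and  n @ x <= cap.
    c = np.array([0.0, 0.0, 0.0, 1.0])
    A_ub = np.vstack([np.hstack([-np.diag(e), -np.ones((3, 1))]),
                      np.hstack([n, 0.0])])
    b_ub = np.hstack([-np.ones(3), cap])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, 1)] * 3 + [(None, None)])
    print(res.x)   # randomized screening levels and the game value t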

W1-H.3 Browne, PB; Office of Science Coordination and Policy, EPA; [email protected] Application of Endocrine Adverse Outcome Pathway Concepts and Use in the Endocrine Disruptor Screening Program The United States Environmental Protection Agency (EPA) is using Adverse Outcome Pathway (AOP) concepts in screening and testing chemicals for their potential to disrupt certain endocrine pathways which may lead to adverse effects. AOPs being considered by EPA’s Endocrine Disruptor Screening Program (EDSP) relate to the estrogen, androgen, and thyroid hormone systems. Linking relevant key events in endocrine AOPs will assist screening and testing for bioactivity and adversity in humans and wildlife. Evidence from case studies will be assembled to demonstrate the practical utility and decision criteria used to incorporate data from high-throughput screening (HTS), computational models, and animal toxicity tests to characterize a chemical’s endocrine disruption potential. Case studies using integrative AOPs for estrogen, androgen, and thyroid will be used to develop guidance documents for the harmonized international use of this approach in regulatory decision making.

P.115 Burgoon, LD; US Army Engineer Research and Development Center; [email protected] Automated Chemical Risk Screening (ACRS) Using Artificial Intelligence and High Throughput Screening Data Toxicologists, human health risk assessors, and risk managers are deluged with more data about chemicals today than ever before. However, more data does not directly lead to new insights, nor does it make the chemical assessment process more efficient. Toxicologists and risk assessors need computational tools to help them understand all of this data and to facilitate decision-making. The goal of the Automated Chemical Risk Screening (ACRS) program is to significantly diminish the lead-time from data acquisition to risk assessment and risk management decisions. ACRS will accomplish this by combining adverse outcome pathway networks (AOPNs) constructed from open source knowledge captured in the AOP-Wiki, causal network theory, ontologies (e.g., the AOP Ontology), and semantic web technologies (to ingest high throughput screening data from PubChem and other sources). ACRS identifies the AOP key events in the AOPN that are sufficient to infer that an adverse outcome will occur, captures and combines the high throughput assay data with the sufficient key events, and makes calls on whether chemicals activate/deactivate a given key event based on the assay data. The result is a chemical screening hazard identification and a point of departure (POD). By combining this with the AOPXplorer, the assay data, assay calls, AOPNs, hazards and PODs can all be visualized. This presentation will focus on a case study that demonstrates the proof of concept.
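
A minimal sketch of one plausible sufficiency rule on an AOP network: infer the adverse outcome when assay calls activate every key event along at least one path from a molecular initiating event (MIE) to the outcome. The network, node names, and calls below are hypothetical, and ACRS's actual inference logic is richer than this.

    import networkx as nx

    aopn = nx.DiGraph([("MIE:receptor_binding", "KE:gene_expression"),
                       ("KE:gene_expression", "KE:cell_proliferation"),
                       ("KE:cell_proliferation", "AO:tumor"),
                       ("MIE:oxidative_stress", "KE:cytotoxicity"),
                       ("KE:cytotoxicity", "AO:tumor")])

    # Calls from high-throughput assays: True = key event activated
    calls = {"MIE:receptor_binding": True, "KE:gene_expression": True,
             "KE:cell_proliferation": True, "MIE:oxidative_stress": False,
             "KE:cytotoxicity": False}

    # Keep activated nodes (AO nodes are kept as potential endpoints)
    active = aopn.subgraph([node for node in aopn
                            if calls.get(node, node.startswith("AO:"))])
    inferred = any(nx.has_path(active, mie, "AO:tumor")
                   for mie in active if mie.startswith("MIE:"))
    print("Adverse outcome inferred:", inferred)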

W2-G.2 Burns, WJ; Decision Research; [email protected] Modeling the Uncertainty Associated With Commercial Airline Flight Risk There is considerable uncertainty surrounding the threats posed by passengers, checked baggage, cargo and operators within airports and airlines to commercial airline flights. Working with subject matter experts at the TSA, an influence diagram is developed that represents this uncertainty along with the consequences of different mitigation strategies and outcomes. Probabilities and consequences are based on knowledge elicitation from experts and available empirical data. Procedures and findings will be discussed.

P.96 Butte, G; Kovacs, D*; Ketchum, C; Pribanic, V; Thorne, S; Decision Partners; MedRespond; [email protected] Application of Mental Modeling Technology™ with Synthetic Interviews™ to Support Stakeholder Engagement through Artificial Intelligence Products Interactive Decision Support Technology (IDST) is a ground-breaking integration of Decision Partners’ Mental Modeling Technology™ with MedRespond’s Synthetic Interview™ artificial intelligence and online communication products. The IDST solution has been designed to improve stakeholder judgment and decision making through realistic dialogue with virtual experts that are available 24x7x365. The virtual experts effectively engage, inform and motivate stakeholders (patients, doctors, employees, customers and others) on the topic at hand because the virtual experts “Understand” stakeholder motivation and decision behaviors; “Respond” to stakeholder questions via immersive conversational videos; and “Remember” individual stakeholders over time, in order to “Follow” changes in stakeholder information needs, priorities and thinking. The virtual dialogue is both informed by traditional Mental Models research that elicits in-depth understanding of stakeholder perceptions, interests, priorities and informational needs, and extends Mental Models research results by providing on-going insight into changes in stakeholders’ Mental Models (both individually and collectively) over time. With a virtual video expert at the core of “conversations” around topics such as health and well-being or socio-technical issues such as energy, one-way / one-size-fits-all online communications are converted to two-way, personalized virtual dialogues, providing stakeholders with a powerful, online communication experience that better prepares them to make well-informed decisions. IDST provides the enterprise or sponsor organization with dynamic insights into stakeholder beliefs, attitudes and perspectives on the topic at hand, enabling focused communication investments that inform and influence decision making and behavior in a manner that is predictable, rapid, scalable and cost-effective. In this presentation we will provide background on the IDST approach and case examples.

P.110 Cains, MG*; McFetridge, E; Winter, A; Duan, Y; Indiana University; [email protected] Human and Ecological Risk Assessment of Indiana University Golf Course Indiana University Golf Course is an 18-hole, 233-acre championship course that used 31 pesticides in 2014, applying 548 kg of active ingredients to eliminate fungus, insects, and weeds. Multiple terrestrial and aquatic species are present within the vicinity, in addition to University Lake and Griffy Lake. Due to the potential health risks posed to IU Golf Course pesticide applicators and the surrounding ecosystem, an environmental risk analysis was conducted on five of the 26 applied active ingredients (bensulide, carbaryl, chlorothalonil, iprodione, and tebuconazole). The active ingredients were selected based on toxicity, bioaccumulation potential, half-life, amount of active ingredient applied, and frequency of application. Human risk was assessed using the EPA RAGS framework and the EPA Occupational Pesticide Handler dermal and inhalation exposure factors for golf course pesticide mixing, loading, and application. Terrestrial ecological risk was assessed for the Carolina chickadee, American robin, Canada goose, meadow jumping mouse, meadow vole, and rabbit using the EPA T-REX model. Aquatic ecological risk was assessed for daphnia, largemouth bass, and channel catfish using the TurfPQ and PondPQ models developed by Dr. Douglas Haith of Cornell University. Suggested risk management improvements include: requiring all pesticide handlers and applicators to always wear the level of personal protective equipment prescribed on the pesticide product label; avoiding pesticide application within 15 feet of a waterbody; and discontinuing the use of Sevin (active ingredient: carbaryl). Future steps for this environmental risk assessment include incorporating the actual personal protective equipment worn by golf course pesticide applicators, adding a distance component to the PondPQ model, calculating probabilistic ecological risk rather than risk quotients, and expanding the analysis to include all 26 active ingredients.
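
A minimal hazard-quotient sketch in the spirit of the occupational handler assessment described above, assuming generic EPA-style unit exposures; every value below is a hypothetical placeholder rather than a number from this study.

    ue_dermal = 0.11      # mg dermal exposure per kg a.i. handled (hypothetical)
    ue_inhal = 0.0003     # mg inhaled per kg a.i. handled (hypothetical)
    kg_ai_per_day = 5.0   # active ingredient mixed/loaded per application day
    dermal_abs = 0.1      # fraction absorbed through skin (hypothetical)
    bw = 80.0             # kg body weight

    # Absorbed daily dose for the mixer/loader, mg/kg/day
    add = (ue_dermal * dermal_abs + ue_inhal) * kg_ai_per_day / bw
    rfd = 0.02            # mg/kg/day reference dose (hypothetical)
    hq = add / rfd
    print(f"ADD = {add:.5f} mg/kg/day, HQ = {hq:.2f}")  # HQ > 1 flags concern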

P.168 Cains, MG*; Henshel, D; Hoffman, B; Oltramari, A; Indiana University, Army Research Labs, Carnegie Mellon University; [email protected] Human Factor Trust Framework within Holistic Cyber Security Risk Assessment As part of a continuing effort to develop a holistic cyber security risk assessment framework and model, the characterization of human factors, which includes human behavior, is needed to understand how the actions of defenders, users, and attackers affect cyber security risk. We have developed an initial framework for how to incorporate trust as a factor/parameter within a larger characterization of the human influences (users, defenders and attackers) on cyber security risk. The work group developing this new cyber security risk assessment model and framework has chosen to distinguish between trust and confidence by using "trust" only for human factors and "confidence" for all non-human factors (e.g., hardware and software), in order to reduce confusion between the two concepts within our larger holistic cyber security risk assessment framework. The presented trust framework details the parameter relationships that build trust in cyber defenders and the parameter metrics used to measure trust. Trust in the human factors is composed of two main categories: inherent characteristics, those that are part of the individual (personality, motivation, rationality/irrationality, benevolence/malevolence, integrity, expertise, and attention/awareness), and situational characteristics, those that are outside of the individual (authorized or unauthorized insider access). The use of trust as a human factor in holistic cyber security risk assessment will also rely on understanding how differing mental models and risk postures impact the level of trust given to an individual and the biases affecting the ability to give said trust. The ability to understand how trust is developed and given within the context of cyber security will facilitate the development of a more holistic and predictive risk model that effectively incorporates human factors into the risk equation.

W5-H.2 Calabrese, EJ; Shamoun, DY*; University of Massachusetts at Amherst / Mercatus Center at George Mason University; [email protected] On Objective Risk Objectivity in the science of risk plays a monumental role in the projection of the benefits from health and safety regulations, which themselves constitute the majority of the total reported benefits of all federal regulations. Claims concerning the accuracy of regulatory risk assessments have been untestable so far in that they focus on whether the risk assessment over- or underestimates the risk of exposure to certain hazards; yet such claims imply that a true level of risk is known. We propose to move the debate from the realm of the untestable to the realm of the testable through study of the process objectivity of the science of risk. Consistency in adhering to a process that is meant to produce objectivity should yield objective results. In the present paper, we consolidate the existing body of guidelines and recommendations, produced by the federal government and scientific bodies, on sound risk assessment practices. We propose that, in order to test the process objectivity of the science of risk as applied by the regulatory agencies, a third party chosen from outside the agencies themselves conduct a systematic assessment of major regulatory risk assessments according to the consolidated principles outlined in this paper. We show that our proposed process is testable, objective, and, if adhered to consistently, has the potential of shedding light on the accuracy of the benefits calculus of major federal regulations.

P.151 Camin, JM; Université Michel de Montaigne Bordeaux 3; [email protected] Proposal for a constructivist model of "communication-uncertainty" and a typology according to the nature of uncertainty While the main activities of a project manager are carried out through the communication process, it is observed that many projects are subject to delays, cost overruns, or specification defects. An excess of measures to prevent risk? Defective management of communication that leaves too much room for uncertainty? The Theory of Uncertainty Reduction developed by Berger and Calabrese (1975) in the field of communication does not fully explain how a project dissipates the uncertainty existing between actors. By revisiting an operational project within the framework of action research, we strive to identify how uncertainty and communication mutually influence and shape each other. We used the constructivist approach and the actor-network theory of Callon and Latour (1981, 1984, 2006, 2007) to capture the meaning of this circular relationship. We present one of the results of our thesis: the communication process used in the construction of the network differs according to the nature of the uncertainty encountered or felt. By positioning uncertainty as a socially constructed phenomenon, we present a constructivist model of "communication-uncertainty" in which the observer is an intentional actor limited by constraints (Boudon, 2009). We propose a typology in the field of communication, in agreement with proposals from researchers in other areas (Klir, Ayyub, Walker, Rowe, Hoffman). We propose to distinguish the nature of uncertainty according to a typology: variability uncertainty (the inherent variability of things), epistemic uncertainty, ambiguous or not (due to the imperfection of our knowledge), and scale uncertainty (tied to the imperfection of our models of representation).

M2-H.1 Canady, RA*; Simon, T; NeutralScience L3C, Ted Simon LLC; [email protected] Disruptive Arrival of Big Data to Food Intake Assessment New ways of collecting and merging large amounts of data are opening radically new approaches to measuring food intake and its effects to support decisions about health risk, nutrition, and food security. Development and use of the new data and models will draw on data scientists, economists, exposure and pharmacodynamics modelers, toxicologists, nutritionists, and decision analysts. Along with new data collection, modeling, and decision support, there will arise new policies about data quality, inference, and risk management. In some cases the data may allow new approaches to risk management that challenge current population-average, single-ingredient assessment approaches. Indeed, the highly granular, continuous, and exhaustive sampling that seems possible with the alternative “big” data may disrupt risk analysis practices based on traditional small-N exposure extrapolations, such as reliance on average intakes from constant daily dosing in bioassays, worst-case mixtures assessment, and worst-case aggregate and cumulative assessment across multiple sources. Questions may also arise regarding identity protection and liability as data scientists create new methods to assemble and model data on scales many orders of magnitude greater than what current exposure modelers use. Changes to public perceptions of participating in and accepting risk are also happening in rapidly growing voluntary data submission and citizen science movements, as amplified in the recent initiation of the Obama Administration’s Precision Medicine program. For example, models comparing routinely collected neonatal biomarker data with prenatal dietary data can address a number of nutrition and contaminant policy issues at the million-person cohort level.

W1-G.3 Canfield, C*; Fischhoff, B; Davis, A; Carnegie Mellon University; [email protected] Using Signal Detection Theory to Measure Phishing Detection Ability and Behavior Phishing attacks, a pervasive problem affecting all levels of society, are difficult to reduce with technology alone. Those responsible for managing these risks need to be able to assess them accurately, so that they can evaluate the effectiveness of changes in system design, operator training, and incentives, as well as know what risk remains. We demonstrate a method for measuring users’ vulnerability to phishing attacks in a scenario-based online experiment. Specifically, we use signal detection theory (SDT) to disentangle users’ ability to distinguish between phishing and legitimate emails (discrimination ability) and tendency to classify emails as phishing or legitimate, when uncertain about their identity (decision threshold). We compare performance on two tasks: (a) detection, deciding whether an email is phishing; and (b) behavior, deciding what to do with an email. We find that individual users’ performance varies within and across tasks. Although individuals’ behavior is generally appropriate or cautious based on their detection decisions, their detection ability is not strong enough to avoid falling victim to attacks. Our results suggest that participants have different decision-making processes for the detection and behavior tasks, tending to use more contextual cues for the behavior one. We demonstrate the value of these estimates for human performance in a risk model to explore the impact of three types of behavioral interventions: (1) improving the worst performers above a specified threshold, (2) improving all individuals marginally, and (3) using system-level feedback to send automated warnings if an email has been reported a minimum number of times. Preliminary results suggest that large improvements in individual human performance are needed to incur system-level benefits.
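
For readers unfamiliar with the signal detection quantities involved, a minimal sketch of how discrimination ability (d′) and decision threshold (criterion c) are computed from hit and false-alarm rates under the standard equal-variance Gaussian model; the counts below are hypothetical, not the study's data.

    from scipy.stats import norm

    hits, misses = 42, 18             # phishing emails judged phishing / legitimate
    false_alarms, correct_rej = 12, 48  # legitimate emails judged phishing / legitimate

    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rej)

    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)             # discrimination
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # response bias
    print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")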

W1-E.1 Carter, JM; Occupational Safety and Health Administration; [email protected] GHS and Nanomaterials: What is required under GHS and Developments within the UN Subcommittee The Globally Harmonized System of Classification and Labelling of Chemicals (GHS) is an international system designed to standardize and harmonize the communication of hazard information, as well as protective measures, on labels and Safety Data Sheets (SDS). The system aims at ensuring the availability of hazard information on chemicals in order to enhance the protection of human health and the environment during the handling, transport and use of these chemicals. There has been considerable debate regarding hazard communication needs and requirements for nanomaterials as an emerging class of chemicals in use and production. Recently, an informal group under the UN Sub-committee of Experts on GHS has identified a small number of nanomaterials with sufficient available data to assess the applicability of GHS criteria for their classification. This presentation will review requirements for nanomaterials under OSHA's Hazard Communication Standard as well as recent developments under the GHS to address harmonization of classification criteria for nanomaterials.

W3-H.2 Casagrande, DG*; Pinter, N; McIlvaine-Newsad, H; Lehigh University, Southern Illinois University, Western Illinois University; [email protected] Perceptions and realities of flood risk and mitigation in the Midwest US: An interdisciplinary approach Severe flooding repeatedly affects rural communities in the Midwest US. In this paper we present results of a multi-institutional and interdisciplinary research project funded by the National Science Foundation to determine whether homeowner perceptions of flood risk conform to physical risk and how their perceptions of risk affect attitudes about various flood mitigation options. We combined qualitative data analyses of interviews and focus groups with a quantitative and spatially explicit framework linking flood risk, community vulnerability, and mitigation potential. The study area spans 640 km of the Mississippi, Ohio, and Illinois Rivers (>12,600 km2 of floodplain), including reaches with contrasting flood histories. Physical risk of flooding was measured using hydraulic modeling, Hazus-MH risk assessment, and 2010 Census data. We used a survey of households to quantify perceptions of risk and attitudes about mitigation options. Results indicate that floodplain residents’ perceptions of risk do not reflect hydrological or economic risk. Residents engaged in psychological strategies like social comparison to downplay flood risk. Attitudes were most favorable about large-scale technological mitigation options like improving levees, least favorable about household options like relocation, and largely unrelated to perceptions of risk. It appears that Midwest US floodplain residents prefer to accept and manage the risk of flooding rather than engage with uncertain futures like relocating off the floodplain.

M4-C.4 Cascio, WE; U.S. Environmental Protection Agency; [email protected] Environmental Human Challenge Studies: Understanding Uncertainties Inhaled airborne contaminants such as gases, particles, and complex mixtures interact with the surfaces of the upper and lower respiratory tract, initiating a broad range of biological responses that can yield adverse health effects in susceptible people. Epidemiological studies designed to assess the relationship between short-term air pollutant exposure and health effects have reproducibly shown positive associations between some common pollutants and cardiopulmonary outcomes within a population. Yet epidemiological studies have limited capability to fully explain these health effects mechanistically. Thus, human challenge studies, also known as controlled human exposure studies, are an important and valuable experimental approach to inform the biological plausibility of the observed epidemiological associations by linking specific pollutant exposures to subclinical biochemical and physiological responses. Controlled human inhalation exposure studies have the advantage of accurate control of the pollutant’s concentration, and are intended to approximate a “real-world” exposure. While substantial progress has been made in the design of studies utilizing single pollutant exposures, some technical challenges and uncertainties exist when interpreting the effects of sequential and co-pollutant exposures, the interactions between different types of pollutants, and the creation of artificial ambient atmospheres. Apart from seeking to better understand the uncertainties attendant to particulate composition and mixtures of different air pollution types, researchers must also consider the adequacy of the biological and physiological endpoints commonly used to accurately reflect the potential for adverse health outcomes. The presentation will focus on the strengths, limitations and value of controlled human exposure studies, and will address sources of uncertainty while placing such studies within the context of population-based and toxicological studies for risk assessment.

M2-F.2 Catlin, MC; Food Safety and Inspection Service; [email protected] FSIS Poultry Performance Standards: Using Risk Assessment and Risk Analysis in the Decision-Making Process In January 2015, FSIS proposed new voluntary industry performance standards for Salmonella and Campylobacter in chicken parts and more stringent standards for comminuted poultry products. Those standards were developed using a risk assessment approach that targeted specific illness reductions in line with goals established in Healthy People 2020, using data from agency sampling programs and data from the Centers for Disease Control and Prevention. The assessment estimated the illness reductions likely to occur for different assumptions about the percentage of industry that would alter their food safety systems to meet the standards. It also estimated the percentage of industry that would initially fail the different standards. Uncertainty analyses were included in the assessment. This talk will provide an overview of the risk assessment approach that was used to develop the standards. It will then highlight how the risk assessment results were incorporated into a cost-benefit analysis. Finally, it will examine how the agency considered the risk assessment, economic analysis, and other factors to identify the performance standards proposed in the Federal Register.

T3-H.1 Cawley, M*; Overton, R; Hartman, P; Turley, A; Phillips, L; Moya, J; ICF International, Office of Research & Development, U.S. EPA; [email protected] Exposure Factors Interactive Resource for Scenarios Tool (ExpoFIRST) ExpoFIRST – the Exposure Factors Interactive Resource for Scenarios Tool – brings EPA’s Exposure Factors Handbook: 2011 edition (EFH) data to an interactive tool that maximizes flexibility and transparency for exposure assessors. ExpoFIRST represents a major advance for regional, state, and local scientists in performing and documenting calculations for community and site-specific exposure assessments. ExpoFIRST replaces manual calculation and documentation and allows users to draw on data found in the 2011 EFH. The tool stores more than 8,000 values extracted from more than 75 of the most commonly used tables in the EFH. It allows flexibility in scenario design and automates the documentation of algorithms, exposure parameters, and dose estimates. ExpoFIRST is intended to replace the 2004 Example Exposure Scenarios document and complement the 2014 Child-Specific Exposure Scenario Examples document. In this session, we will preview ExpoFIRST, including an overview of the features designed to maximize transparency and efficiency and the supporting data from the EFH available in the tool. ExpoFIRST will be made available to users through the Exposure Factors module of the EPA-Expo-Box website. The views expressed in this presentation are those of the authors and do not necessarily reflect the views or policies of the U.S. Environmental Protection Agency.
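
A minimal sketch of the generic average-daily-dose algorithm that such scenario tools document and automate; the parameter values below are hypothetical placeholders, not EFH-recommended values.

    conc = 2.0e-3    # mg/L, contaminant concentration in drinking water
    ir = 1.8         # L/day, water intake rate
    ef = 350         # days/year, exposure frequency
    ed = 26          # years, exposure duration
    bw = 80          # kg, body weight
    at = ed * 365    # days, averaging time (non-cancer assessment)

    add = conc * ir * ef * ed / (bw * at)   # mg/kg-day
    print(f"ADD = {add:.2e} mg/kg-day")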

M2-G.4 Cha, E*; Shafieezadeh, A; Ellingwood, BR; University of Illinois at Urbana-Champaign, Ohio State University and Colorado State University; [email protected] The role of risk acceptance attitudes in managing a risk to infrastructure systems from terrorist attack Following terrorists' success in a number of large-scale attacks in recent decades, terrorism has become a significant source of risk that infrastructure systems are confronted with in the 21st century. Despite the recent awareness of this risk, understanding of it still lags far behind that of other hazards. In the face of ignorance and catastrophic outcomes together, people tend to focus on the badness of the outcome rather than on how likely the outcome is to occur. This phenomenon is called “probability neglect,” and it explains some excessive regulations that appear after an extreme event is experienced. Those regulations are often found to be unjustified or even counterproductive based on cost-benefit analysis that only takes into account the direct impact of a hazard. However, public terror is also a cost, and it is what terrorists are aiming for. Costs associated with public terror go well beyond those directly harmed. For example, in the aftermath of September 11th, financial markets, consumer spending and air travel dropped, and public opinion toward government shifted. In this paper, we will develop a risk-informed decision framework in which the strategic interaction between a defender and an intelligent attacker is modeled by means of probabilistic risk analysis and cumulative prospect theory, involving nonlinear evaluation of consequence and probability. The framework will take into account the defender’s attitude toward accepting a risk to civil infrastructure from terrorist attack and the attacker’s risk attitudes in selecting attack profiles (scenario, target, weapon, path, etc.). The proposed methodology is applied to an illustrative example considering a commercial airport that is subject to a variety of terrorist threats. A number of risk mitigation options will be considered and the effect of decision-makers’ preferences will be investigated.
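
For concreteness, a minimal sketch of the cumulative-prospect-theory ingredients mentioned above: a reference-dependent value function and an inverse-S probability weighting function (Tversky and Kahneman's 1992 functional forms, with their commonly cited parameter estimates used purely for illustration, not values from this paper).

    def value(x, alpha=0.88, beta=0.88, lam=2.25):
        # Concave for gains, convex and steeper (loss aversion) for losses
        return x ** alpha if x >= 0 else -lam * (-x) ** beta

    def weight(p, gamma=0.61):
        # Inverse-S weighting: small probabilities are overweighted
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    # A simple prospect: lose 100 with probability 0.01, else nothing
    cpt_value = weight(0.01) * value(-100)
    print(f"CPT value = {cpt_value:.1f}")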

P.192 Chabrelie, A*; Mitchell, J; Rose, J; Charbonneau, D; Ishida, Y; Michigan State University; [email protected] An Evaluation of the Influenza Risk Reduction from Antimicrobial Spray Application of Porous Surfaces Antimicrobial spray products are used by millions of people around the world for cleaning and disinfection of commonly touched surfaces. Influenza A is a pathogen of major concern, causing over 36,000 deaths and 114,000 hospitalizations annually in the United States alone. One of the proposed routes of transmission for Influenza A is by transfer from porous and non-porous surfaces to hands and subsequently to mucous membranes. Therefore, routine cleaning and disinfection of surfaces is an important part of the environmental management of Influenza A. While the emphasis is generally on spraying hard surfaces and laundering cloth and linens with high-temperature machine drying, this study examines the impact that using an antimicrobial spray on a porous surface has on reducing the risk of infection. A Quantitative Microbial Risk Assessment (QMRA) for a single touch resulting in direct contact with a treated, contaminated, porous surface was analyzed to determine the reduction in Influenza A risk associated with the measured viral inactivation. A comparison of the risk of infection with and without the use of the antimicrobial spray product was performed. The analysis indicates that Influenza infection risks associated with a single touch to contaminated fabrics are relatively high, especially considering the potential for multiple touches in a real-world scenario. However, use of the antimicrobial spray product resulted in a 4-log risk reduction. Thus the results of this study inform and broaden the range of risk management strategies for Influenza A.
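
A minimal QMRA sketch of the single-touch calculation, using an exponential dose-response model and a 4-log surface reduction; all parameter values below are hypothetical placeholders, not the study's measurements.

    import math

    k = 5.7e-5           # exponential dose-response parameter (hypothetical)
    surface_conc = 1e6   # virus on fabric per area touched (hypothetical)
    transfer = 0.05      # surface-to-hand transfer fraction (hypothetical)
    hand_to_mucosa = 0.35  # hand-to-mucous-membrane fraction (hypothetical)

    dose = surface_conc * transfer * hand_to_mucosa
    for label, d in [("untreated", dose), ("treated, 4-log", dose / 1e4)]:
        risk = 1 - math.exp(-k * d)   # exponential dose-response model
        print(f"{label}: P(infection) = {risk:.2e}")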

T3-F.3 Chada, K*; Zhang, G; Kreimeyer, K; Simonetti, A; Yang, H; Food and Drug Administration and Engility Corporation; [email protected] A Computational Tool for Risk Assessment of Transfusion Transmitted Diseases Associated with Travel Exposure of Donors Increased global travel and its association with the spread of infectious diseases demand continuous evaluation of donor screening and blood testing procedures to ensure the safety of the US blood supply. FDA/CBER has developed BRiskTool to expedite risk assessment (RA) of transfusion-transmitted (TT) infectious diseases. One component of BRiskTool is a template for travel-associated RA. The major model inputs are disease endemic regions, disease prevalence or incidence, targeted high-risk (THR) donor groups, and incubation period. The model comprises three modules. Module 1, Disease Prevalence, identifies the disease-specific THR groups (US travelers, immigrants to the US, residents of US endemic regions, or military personnel). The prevalence of asymptomatic infections among the THR and background risk groups is estimated based on data from sources such as WHO and CDC. Pathogen variants and seasonality risk associated with the diseases can also be modeled. In Module 2, Infectious Blood Units, the infectious blood units are estimated for current and potential alternative policy options for donor deferral (deferral of endemic region, THR group, change in deferral period, seasonal deferral, donor entry criteria) or blood screening (multiple tests, donor re-entry, seasonal testing). The window-period risk and false-negative risk associated with testing are also modeled. In Module 3, Transfusion-Transmitted Risk, infected cases are estimated using dose-response information. The major model outputs are the annual number of infectious blood units, the annual number of infected cases, and units gained or lost for evaluated policies. The travel-based BRiskTool model template can be applied to perform RA of emerging and resurging TT diseases such as malaria, chikungunya, and dengue. The RA analysis from this tool will inform recommendations for donor screening, blood testing, and blood management needed to reduce the risk of TT infectious diseases associated with travel exposure of donors.
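
A minimal sketch of the kind of bookkeeping Module 2 performs, estimating infectious units released under a screening policy; all inputs below are hypothetical placeholders, and the actual module models window-period, seasonality, and deferral effects in far more detail.

    donations = 1_000_000     # annual donations (hypothetical)
    thr_fraction = 0.02       # fraction of donations from the THR group
    prevalence_thr = 1e-4     # asymptomatic prevalence in the THR group
    prevalence_bg = 1e-6      # background prevalence in remaining donors
    test_sensitivity = 0.95   # screening test sensitivity (hypothetical)

    infected = donations * (thr_fraction * prevalence_thr
                            + (1 - thr_fraction) * prevalence_bg)
    released = infected * (1 - test_sensitivity)  # false-negative escapes
    print(f"expected infectious units released/year: {released:.2f}")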

P.163 Chatterjee, S*; Prager, F; Chen, Z; Rose, A; Pacific Northwest National Laboratory; [email protected] Representing uncertainties in economic consequences of multiple hazards This talk focuses on uncertainty characterizations, including mathematical intervals and probability distributions, for economic consequences of multiple hazards. Uncertainty quantification and propagation is performed using sampling with variance reduction and regression (both least squares and quantile) with stochastic regressors. Consequence probability distributions are developed that may be useful for homeland security policy-makers conducting national risk assessments and for emergency management decision-making.

W4-E.1 Chatterjee, S*; Salazar, D; Pacific Northwest National Laboratory; [email protected] Analysis of layered security portfolios under uncertainty This talk focuses on the analysis of tradeoffs between risk reduction and net economic costs associated with portfolios of urban area security measures. Uncertainties in portfolio effectiveness, due to limited expert information, are quantified using mathematical intervals. Portfolio selection is formulated as an optimization problem under interval uncertainty and efficient risk reduction/net cost frontiers with intervals are developed.
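
A minimal sketch of interval-based portfolio screening in this spirit: each measure carries a cost and an interval of risk reduction, portfolios are enumerated, and a portfolio is discarded only when a no-more-costly alternative is certainly better (its lower bound exceeds the candidate's upper bound). The costs, intervals, and additivity assumption below are all hypothetical simplifications, not the paper's formulation.

    from itertools import combinations

    # measure: (cost, (lo, hi) risk reduction) -- hypothetical values
    measures = {"cameras": (2.0, (0.05, 0.15)),
                "patrols": (5.0, (0.10, 0.30)),
                "sensors": (3.0, (0.08, 0.20))}

    portfolios = []
    for r in range(1, len(measures) + 1):
        for combo in combinations(measures, r):
            cost = sum(measures[m][0] for m in combo)
            lo = sum(measures[m][1][0] for m in combo)  # assumes additivity
            hi = sum(measures[m][1][1] for m in combo)
            portfolios.append((combo, cost, lo, hi))

    # Keep portfolios not interval-dominated: dominated means some portfolio
    # costs no more yet its lo reduction exceeds this portfolio's hi reduction
    efficient = [p for p in portfolios
                 if not any(q[1] <= p[1] and q[2] > p[3] for q in portfolios)]
    for combo, cost, lo, hi in sorted(efficient, key=lambda p: p[1]):
        print(combo, cost, (lo, hi))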

P.49 Chen, YH*; Wu, CH; Wu, KY; National Taiwan University; [email protected] Cumulative Risk Assessment of Pesticides in the Taiwan Population Cumulative health risk assessment was performed on five pesticides that are widely used on various vegetables and fruits in Taiwan. These pesticides are known to inhibit acetylcholinesterase (AChE), which leads to neurological and reproductive toxicity. Notably, among these pesticides, chlorothalonil is classified as carcinogenic to animals. Since agriculturalists often mix multiple pesticides during cultivation, assessing the cumulative health risks would more accurately reflect realistic hazards than assessing that of a single pesticide. The present study evaluates the health risks of the five pesticides by calculating reference doses (RfDs) using the Benchmark Dose Software. The BMDL10 values of carbofuran, dimethoate, methamidophos, and terbufos are 0.206, 2.514, 0.387 and 0.006 mg/kg/day, respectively. Since chlorothalonil is carcinogenic, its cancer slope factor of 0.0074 was used. Applying Bayesian probability combined with Markov chain Monte Carlo simulation (MCMC), data from the National Food Consumption Database of Taiwan were used to conduct the exposure assessment. This study established ADIs (carbofuran: 2.6×10^-3, dimethoate: 2.15×10^-2, methamidophos: 3.87×10^-3 and terbufos: 5.83×10^-2 mg/kg/day) that are currently absent in Taiwan and can be used by risk managers. The lifetime average daily doses (LADDs), estimated using MCMC, are much lower than those obtained using the pesticides’ official MRLs. This indicates that the MRLs set by Taiwanese authorities could result in a lenient LADD. Although both the individual and combined HI values indicate that consumers are not subject to potential adverse health effects, further investigation focusing on consumers' exposure to multiple compounds is warranted. A review of the risk management protocol for pesticides should be considered, since compliance with current regulations may be inadequate to safeguard health.
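
A minimal hazard-index sketch of the cumulative step: the HI is the sum of exposure-to-ADI ratios across the AChE-inhibiting pesticides. The ADIs below are those reported in the abstract; the LADD values are hypothetical placeholders.

    # ADIs from the abstract (mg/kg/day)
    adi = {"carbofuran": 2.6e-3, "dimethoate": 2.15e-2,
           "methamidophos": 3.87e-3, "terbufos": 5.83e-2}
    # Hypothetical LADD estimates (mg/kg/day), for illustration only
    ladd = {"carbofuran": 3.0e-4, "dimethoate": 6.0e-5,
            "methamidophos": 3.5e-5, "terbufos": 3.0e-5}

    hi = sum(ladd[p] / adi[p] for p in adi)
    print(f"Hazard index = {hi:.3f}")   # HI < 1 suggests low cumulative concern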

T4-F.2 Chen, Y*; Dennis, S; Pouillot, R; Paoli, G; Santillana Farakos, S; Van Doren, J; Food and Drug Administration; Risk Sciences International; [email protected] FDA-iRISK® 2.0: New features and case studies for ranking microbial and chemical hazards in foods FDA-iRISK® is an interactive, Web-based system that enables users to conduct fully probabilistic risk assessments of food-safety hazards relatively rapidly. In March 2015, FDA released FDA-iRISK 2.0, an enhanced version, free to the public on Foodrisk.org. The benefits of FDA-iRISK 2.0 include advanced modeling capacity, new and more robust reporting, improved data sharing, and greater ease of use. This presentation will highlight new capacities in FDA-iRISK 2.0 that enable users to conduct their own risk assessments, using new features such as rare-event additions (to characterize contamination of food in the processing pathway) and new dose-response modeling options, and reporting results in new ways, such as by the number of cases or exposure at the point of consumption, in addition to DALYs (disability-adjusted life years). We will illustrate some of the new features with case studies using published data and information for a variety of pathogens and foods. We will also present a case study to demonstrate how to compare exposure at consumption without integrating the dose-response element. FDA-iRISK 2.0 (the tool itself) and related materials, including a technical document that describes the underlying mathematical architecture and equations, are accessible at http://foodrisk.org/exclusives/fda-irisk-a-comparative-risk-assessment-tool/.

P.51 Chen, Z*; Prager, F; Rose, A; Chatterjee, S; Center for Risk and Economic Analysis of Terrorism Events (CREATE); [email protected] A Reduced-Form Approach to Economic Consequence Analysis The state-of-the-art modeling approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) analysis. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems that yield estimates of the economic impacts of various types of disasters, including terrorism, natural hazards, and technological accidents. This paper presents an approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The approach is to first run hundreds of simulations of a CGE model for each threat, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin hypercube sampling procedure in each model run. Statistical analysis is then applied to the “synthetic data” results in the form of both ordinary least squares and quantile regression, which yield linear equations that can be incorporated into a computerized system. We illustrate this application with the CREATE Economic Consequence Analysis Tool (E-CAT).
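
A minimal sketch of the reduced-form workflow: draw a Latin hypercube sample of shock parameters, run the model over the sample, and fit ordinary-least-squares and quantile regressions to the synthetic results. The toy response function below merely stands in for a CGE model run, and the parameter ranges are hypothetical.

    import numpy as np
    from scipy.stats import qmc
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    sample = qmc.LatinHypercube(d=3, seed=0).random(n=500)
    # Scale to hypothetical ranges: shock magnitude, duration, resilience
    X = qmc.scale(sample, l_bounds=[0.0, 1.0, 0.0], u_bounds=[0.5, 30.0, 1.0])

    def toy_model(x):
        # Stand-in for a CGE model run returning an economic loss
        mag, dur, res = x
        return mag * dur * (1 - 0.8 * res) + rng.normal(0, 0.1)

    y = np.apply_along_axis(toy_model, 1, X)
    Xc = sm.add_constant(X)
    ols = sm.OLS(y, Xc).fit()              # mean response surface
    q90 = sm.QuantReg(y, Xc).fit(q=0.9)    # 90th-percentile surface
    print(ols.params, q90.params)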

P.82 Chen, Y.J.*; Chen, Y.H.; Wu, C.; Wu, K.Y.; Institute of Occupational Medicine and Industrial Hygiene, National Taiwan University; [email protected] The risk assessment of pesticide residues in vegetables and fruits in Taiwan: Carbofuran, Chlorothalonil, Dimethoate, Methamidophos, and Terbufos The objective of this study was to evaluate the theoretical maximum daily intake (TMDI) and lifetime average daily dose (LADD) of Carbofuran, Chlorothalonil, Dimethoate, Methamidophos, and Terbufos in vegetables and fruits using food intake and residue data, and to compare them with acceptable daily intakes (ADIs) in order to estimate the pesticide exposure. Existing ADIs of these pesticides were derived from no-observed-adverse-effect levels (NOAELs); given the limitations of that approach, we used benchmark doses, estimated with a Bayesian model implemented through Markov chain Monte Carlo simulation, to calculate reference doses (RfDs) to replace the NOAELs. The TMDI was calculated according to the newest MRL rules in Taiwan, while the ADIs were derived from benchmark doses fitted to existing animal studies. The study reveals that the TMDIs of carbofuran, dimethoate, methamidophos, and terbufos were 0.0424, 0.0766, 0.0083, and 0.00724 mg/kg/day, respectively, and the ADIs of carbofuran, dimethoate, and methamidophos were 2.06×10^-3, 2.15×10^-2, and 3.87×10^-3 mg/kg/day, respectively; the TMDI exceeded the ADI for these pesticides. The LADDs of carbofuran, chlorothalonil, dimethoate, methamidophos, and terbufos were 3.26×10^-4, 3.50×10^-4, 6.71×10^-5, 3.48×10^-5, and 3.20×10^-5 mg/kg/day, respectively, and the percent ratios of LADD to ADI were 15.8, 2.33, 0.31, 0.90, and 0.029, respectively. All LADDs were below the ADIs, but the TMDI results indicate that the MRLs of these pesticides should undergo revision. Results show that carbofuran, dimethoate, methamidophos, and terbufos accumulating in the human body would cause neurotoxicity and reproductive toxicity in human beings, and that the carcinogen chlorothalonil, which is most harmful to humans and most widely used, may cause malignant tumors in humans. We therefore recommend that the government modify each improper MRL in agriculture and decrease pesticide exposure.

P.43 Chiang, SY*; Huang, YW; Wu, KY; China Medical University, Taiwan; [email protected] Food Safety Assessment of Butter Yellow, 4-Dimethylaminoazobenzene 4-Dimethylaminoazobenzene (DAB), known as butter yellow, methyl yellow and dimethyl yellow, was used as a food colorant to color butter, but was soon discovered to be a potent carcinogenic dye and was banned in the USA in 1945. A countrywide recall of dried bean curd was announced by the Taiwan FDA after the discovery of possible traces of banned DAB in more than 100 tofu-related foods in 2014. DAB is mutagenic in Ames tests, causes sister-chromatid exchange in mammalian cells, and induces hepatoma in rodents. The potential health effects have been of great concern for consumers in Taiwan. This study assessed the cancer risk of DAB in foods. The Benchmark Dose Software was used to fit the hepatoma data of male B6C3F1 mice; the best-fit BMDL10 of 0.71 mg/kg/day was estimated using the multistage model. The cancer slope factor was estimated at 0.51 (mg/kg/day)^-1 following the 2005 US EPA guidelines for carcinogen risk assessment. The daily intake rates of the more than 100 foods were cited from the Taiwan National Food Consumption Database; the total daily intake rate of these foods is 460 g/day. The estimated maximum allowable residue in these foods is 4 μg/kg, given a negligible cancer risk of one in a million.
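
The back-calculation behind an allowable-residue estimate of this kind can be sketched as C_max = (target risk × body weight) / (CSF × daily intake). Body weight is not stated in the abstract, so the assumed 60 kg below makes this an illustration of the method rather than a reproduction of the authors' figure.

    target_risk = 1e-6          # negligible lifetime cancer risk
    csf = 0.51                  # (mg/kg/day)^-1, slope factor from the abstract
    intake_g_per_day = 460.0    # total daily intake of the implicated foods
    bw = 60.0                   # kg, assumed body weight (not in the abstract)

    # risk = csf * (C * intake / bw)  =>  solve for C (mg residue per g food)
    c_max_mg_per_g = target_risk * bw / (csf * intake_g_per_day)
    print(f"max residue ~ {c_max_mg_per_g * 1e6:.2f} ug/kg")  # mg/g -> ug/kg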

W5-A.4 Chiu, WA; Texas A&M University; [email protected] Practical integration of old and new evidence streams with a harmonized dose-response assessment tool developed by WHO/IPCS Recently, the WHO/IPCS published a guidance document on evaluating uncertainties in dose-response assessment/hazard characterization that is applicable to both cancer and non-cancer outcomes. As described in a separate symposium, the basic idea of the approach is to express the outcome of a human health hazard characterization, whether for cancer or non-cancer endpoints, as a confidence interval or distribution rather than as a point estimate, while at the same time making explicit the risk management protection goals of the assessment in terms of the acceptable incidence (I) and magnitude of effect (M) in the population. As part of this effort, the WHO/IPCS expert group developed an “approximate probabilistic analysis” that can be practically implemented in an Excel spreadsheet tool. Parameter distributions based on analyses of historical data enable probabilistic dose-response assessments to be performed rapidly from toxicity data with minimal additional analysis. Furthermore, the spreadsheet tool enables rapid prioritization and integration of “higher tier” analyses and data generation, such as benchmark dose modeling, chemical-specific adjustment factors, and PBPK modeling. This tool also provides a framework for integrating new evidence streams, such as population-based in vitro and in vivo data on toxicokinetic and toxicodynamic variability. Finally, this approach has risk management implications related to specifying the acceptable levels of uncertainty, population incidence, and magnitudes of effect in a particular risk context. This presentation illustrates the application of the spreadsheet tool through two examples, demonstrating the approach for both cancer (methyleugenol) and non-cancer (deoxynivalenol) endpoints.
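
A minimal sketch of the approximate probabilistic idea the spreadsheet tool implements: treat each adjustment between the point of departure and the human dose as lognormal, so that log-scale uncertainties add, then report an interval rather than a point estimate. The medians and P95/P50 ratios below are hypothetical placeholders, not the WHO/IPCS default distributions.

    import math
    from scipy.stats import norm

    pod = 10.0   # mg/kg/day, benchmark dose (point of departure)

    # Each source of uncertainty as (median, P95/P50 ratio) of a lognormal
    factors = {"interspecies": (3.0, 4.0), "intraspecies": (9.0, 5.0)}

    z95 = norm.ppf(0.95)
    mu = math.log(pod) - sum(math.log(m) for m, _ in factors.values())
    sd = math.sqrt(sum((math.log(r) / z95) ** 2 for _, r in factors.values()))
    lo, hi = math.exp(mu - z95 * sd), math.exp(mu + z95 * sd)
    print(f"human dose, 90% CI: [{lo:.3f}, {hi:.3f}] mg/kg/day")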

M2-A.5 Chiu, WA; Texas A&M University; [email protected] Addressing uncertainty and variability in 21st century risk assessments The rapid advance of high-throughput testing and other new biological technologies has the potential to address a broad range of needs in chemical risk assessment. One important need is for risk assessments to adequately characterize uncertainty and variability, so as to provide decision-makers with a sense of the confidence in estimated risks and the extent to which susceptible individuals are protected. This presentation reviews existing approaches to characterize uncertainty and variability in the context of the new and emerging data streams being integrated into risk assessment. For instance, existing reverse dosimetry approaches provide a characterization of toxicokinetic uncertainty and variability in relating in vitro concentrations to in vivo exposures. On the other hand, approaches to address uncertainty and variability in toxicodynamics or downstream disease processes are only beginning to be explored. A common theme for all these approaches is the integration of population-based data and experimental models with probabilistic computational/statistical models.

T4-I.1 Chopade, PV; Zhan, JZ*; Crowther, KG; North Carolina A&T State University and MITRE; [email protected] Smart and Effective Large-Scale System Risk Analysis The United States and other developed countries have become highly dependent on interlinked information systems and infrastructures. The rapid and unrestricted flow of information is critical for controlling such activities as power generation and distribution, banking and finance, transportation, manufacturing and healthcare. These activities are necessary for the conduct of everyday life and essential to the maintenance of our democratic society. The challenges are significant: How can we continuously discover and eliminate existing and evolving vulnerabilities in real time, given the dynamic nature of network configurations and applications? How can individuals and organizations share information to mitigate risks? This work presents a novel and effective framework for risk and vulnerability assessment of large-scale systems. In order to progress towards these capabilities, the technical objectives of the proposed approach are: modeling, to understand the “true” dynamics by developing techniques and simulation tools that help build a basic understanding of the dynamics of complex infrastructures; measurement, to know what is or will be happening by developing measurement techniques for visualizing and analyzing large-scale emergent behavior in complex infrastructures; management, to decide what to do by developing distributed systems of management and control that keep infrastructures robust and operational; risk and vulnerability assessment, to apply threat, vulnerability, and consequence assessment information and statistical data (when available) to calculate quantitative risk levels; and metrics and system evaluation, to evaluate the effectiveness and efficiency of risk management programs and activities. Our proposed approach has the ability to collect and analyze massive amounts of data using a layered software architecture that addresses each challenge. It provides a mathematically rich interface and addresses existing challenges in large-scale complex system risk analysis.

P.45 Chou, YJ*; Ho, WC; Tsan, YT; Wu, TT; Lin, MH; Chan, WC; Chen, PC; Wu, TN; Sung, FC; Lin, RS; China Medical University; [email protected] Statin use and age-specific risk of cancer in patients with hypertension Hypertension is a major worldwide public health issue. Statins are widely used to treat hyperlipidemia and prevent coronary heart disease through their cholesterol-lowering effect, so hypertension patients may have higher exposure to statins. Preclinical evidence has shown that statins potentially act as anticancer agents through their anti-inflammatory, antiproliferative, proapoptotic and anti-invasive effects. Therefore, many studies have investigated the association of statin use with cancer risk among the general population and diabetes mellitus (DM) patients. The objective of this research is to investigate whether statin use is associated with cancer incidence in patients with hypertension. The study design is a retrospective cohort study. The medical records of subjects, including cancer events and statin use, were collected from the Longitudinal Health Insurance Database 2005 (LHID2005). Cox proportional hazards regression models were used to estimate the relationship between statin use and cancer occurrence in patients with hypertension. The statin dosage of exposure, estimated as the sum of the dispensed defined daily doses (DDDs) of any statin, is used to relate statin use to cancer risk. The results show that statin use may reduce the risk of cancer incidence in patients with hypertension. It is important to assess cancer risk in patients with hypertension, and further study is warranted.
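
A minimal sketch of the kind of Cox proportional-hazards fit described, using the lifelines library on a synthetic data frame standing in for the LHID2005 claims data; all column names and values below are hypothetical.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_years": [8.2, 5.1, 9.9, 3.3, 7.5, 6.0],
        "cancer":         [0,   1,   0,   1,   0,   1],   # event indicator
        "statin_ddd":     [900, 0, 1500, 120, 600, 0],    # cumulative DDDs
        "age":            [58, 66, 61, 72, 55, 69],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="cancer")
    cph.print_summary()   # hazard ratios for statin dose and covariates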

P.83 Chuang, YC*; Wu, KY; National Taiwan University; [email protected] Comparison of Bayesian and Frequentist Inference in Probabilistic Exposure Assessment of Dietary Intake from Pesticide Residues Survey with Left-Censored Data Pesticide residue monitoring surveys provide valuable information on the incidence and level of pesticide residues in vegetables and fruits for conducting dietary exposure assessments. However, the dataset from the national pesticide residue survey is not only left-censored at the detection limit of the analytical methods (“non-detects”, NDs) but also truncated by the maximum residue level. When more than 80% of samples are censored, the EFSA report recommends collecting additional data from similar food categories to obtain larger sample sizes rather than conducting a probabilistic exposure assessment. Therefore, a study on statistical estimation of dietary intake from highly censored datasets was implemented using Bayesian and frequentist inference. In the Bayesian approach, a non-informative prior is used, the probability of a positive concentration below the limit of detection (LOD) is modeled as a binomial distribution, and the mean residue level is obtained by iteration of Markov chain Monte Carlo (MCMC) simulation. In the frequentist approach, the probability density function of pesticide residues was fitted to the censored data by maximum-likelihood estimation, with NDs substituted by LOD/2 following the traditional method. Consumption data were selected from vegetables and fruits that have been found to contain residues of one or more of the pesticides of interest. The dietary intake of pesticide from vegetables and fruits was obtained after 50,000 simulation iterations under the frequentist and Bayesian inferences, with mean dietary intakes of 1.71×10^-4 and 2.35×10^-3 mg/kg/day, respectively. The results show that the posterior distribution of dietary intake converges under Bayesian inference to the corresponding representative distribution even for highly censored data, so the quality of probabilistic exposure assessment may be improved by reducing uncertainty compared with the traditional frequentist approach.
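
A minimal sketch contrasting the two estimators discussed above for left-censored residue data: traditional LOD/2 substitution versus a censored-data maximum-likelihood fit. The data are simulated, not survey data, and the lognormal assumption is an illustration:

```python
# LOD/2 substitution vs. censored MLE for left-censored (non-detect) data.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
residues = rng.lognormal(mean=-2.0, sigma=1.0, size=200)  # hypothetical mg/kg
LOD = 0.2
detected = residues >= LOD                                # heavily censored sample

# (1) Traditional approach: replace non-detects with LOD/2
substituted = np.where(detected, residues, LOD / 2)
print("LOD/2 substitution mean:", substituted.mean())

# (2) Censored MLE for a lognormal: detects contribute the density,
#     non-detects contribute the probability mass below the LOD
def neg_loglik(params):
    mu, sigma = params[0], abs(params[1])
    ll_obs = stats.norm.logpdf(np.log(residues[detected]), mu, sigma).sum()
    ll_cens = (~detected).sum() * stats.norm.logcdf(np.log(LOD), mu, sigma)
    return -(ll_obs + ll_cens)

mu, sigma = optimize.minimize(neg_loglik, x0=[0.0, 1.0]).x
print("Censored-MLE mean:", np.exp(mu + abs(sigma) ** 2 / 2))
```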

W1-B.3 Cifuentes, L*; Borchers, N; Pontificia Universidad Católica de Chile; [email protected] Differences of unitary benefits of air pollution abatement across gender and socioeconomic position Social benefits from air pollution abatement are often used as a justification for emission control measures. In a benefit-cost analysis decision framework, these benefits are weighted against the costs of control, with little consideration for distributional issues, i.e., which part of the population bears the costs and which the benefits. This work looks at the differences in the benefits from reductions in health impacts from air pollution, and their relative importance, across gender and socioeconomic position. We investigate the importance of differences in baseline incidence rates of health effects, in unit risks, in exposure reductions, and in willingness to pay to avoid health effects. Data for the analysis come from analyses of air pollution abatement conducted in four Chilean cities with different socio-demographic characteristics. The results show that unitary benefits can vary by as much as a factor of 2. Without getting into ethical considerations, we discuss the implications of these results for designing air pollution abatement programs and measures.

M3-E.4 Clancy, SF; Evonik Corporation; [email protected] An Industry Perspective on Risk Management of Nanomaterials The chemical industry has practiced risk management of nanomaterials for a long time to meet product stewardship, industrial hygiene and regulatory requirements. The practices for nanomaterials and non-nanomaterials are fundamentally the same, where the intrinsic properties of each material and exposure scenarios are considered. The literature is well stocked with information on properties related to hazard, with more constantly being generated. It is recognized that there is less information on exposure, and actions are being taken to increase the generation of useful information.

W2-J.4 Clarke, CE; George Mason University; [email protected] Risk Communication and “Weight-of-Evidence”: The State of the Research Within news media and public discourse, risk topics often feature competing claims about whether a risk exists and what scientific evidence shows. Various risk scholars have proposed – and empirically examined – ways to convey scientific consensus for risk topics where evidence strongly supports a particular conclusion but public perception and scientific discourse nonetheless diverge. This presentation examines the state of existing research on these “weight of evidence” studies, with specific focus on (1) how this concept has been operationalized; (2) textual (i.e., as part of news media coverage) and visual approaches to conveying such content; (3) issues examined, including climate change, vaccine safety, and genetically modified foods; (4) dependent variables explored, including perceived uncertainty about scientific information and the issue itself, risk perception, and issue-related attitudes; and (5) factors that may moderate message effects, such as political ideology. Finally, the presentation discusses risk communication implications and future research needs, with particular emphasis on fewer laboratory-based studies and more externally valid field studies and interventions.

P.24 Clewell, HJ*; Greene, TB; Gentry, PR; Institute for Chemical Safety Sciences, The Hamner Institutes for Health Sciences, Research Triangle Park, NC; Ramboll Environ, Monroe, LA; [email protected] Adverse Outcome Pathways for Effects Associated with Exposure to Inorganic Arsenic Adverse Outcome Pathway (AOP) descriptions for both noncancer and cancer outcomes are proposed for compounds that disrupt cellular function through their avid binding to vicinal dithiols in cellular proteins, using the available data for inorganic arsenic as a case study. These AOPs integrate data from diverse sources, including epidemiology, in vivo animal studies, and in vitro studies. For cancer outcomes, the AOP describes a co-mutagenic mode of action consisting of two parallel processes: (1) inhibition of DNA repair due to binding of key proteins associated with DNA damage repair, and (2) an oxidative stress / inflammatory response, driven by binding of key proteins in the NRF2 and NF-κB cell signaling pathways, that impairs the cell’s ability to delay cell division until DNA damage has been repaired. For noncancer outcomes, the AOP proceeds from binding to vicinal dithiols in key cellular proteins to the development of a proliferative phenotype characterized by altered oxidative stress and inflammatory signaling that can result in a number of noncancer effects, including proliferative skin lesions, inflammatory cardiovascular effects, and inhibition of immune responses. While these AOPs were developed using inorganic arsenic as a case study, they are also potentially applicable to other compounds that bind avidly to vicinal protein sulfhydryls, including phenyl arsenic oxide and its derivatives, and possibly tellurium and periodate. (This work was supported by EPRI.)

P.99 Cogger, N*; Rosanowski, S; Massey University; [email protected] What are the best control strategies for an equine influenza outbreak? Equine influenza (EI) is a highly infectious respiratory disease of horses that is not present in New Zealand (NZ). Because the equine population in NZ is naïve to the virus, the disease has the potential to spread rapidly. Consequently, government and industry must respond rapidly if they are to eradicate the disease. To ensure a rapid response, decisions about control strategies should be made in peacetime, and infectious disease modelling can provide useful information to inform this type of decision making. This abstract presents the results of infectious disease modelling of an EI outbreak in NZ when movement restriction was the only control strategy and when movement controls were combined with vaccination that was suppressive, protective, or targeted to high-value animals. The infectious disease modelling was done using InterSpread Plus, a spatially explicit SIR model. The model requires information about: (1) the location of all properties with horses and racetracks; (2) the frequency and distance of movement of animals, humans and fomites between properties and to and from racetracks; (3) the incubation period and virus production in horses; and (4) local-spread and airborne-spread probabilities. The results of this analysis showed that an EI outbreak in NZ would be widespread and require a six-month nationwide standstill. Vaccination used in conjunction with a nationwide standstill did significantly reduce the size of the outbreak. However, the reduction was not substantial, and the number of properties to vaccinate, especially when the strategy was to vaccinate all properties within a 7-10 km band around known infected premises (IPs), was large. Consequently, the decision as to whether or not to use vaccination is not clear-cut and could benefit from an economic assessment of the costs and benefits.
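
A minimal sketch of the susceptible-infected-recovered (SIR) dynamics that underlie spatially explicit simulators such as InterSpread Plus. This is a deterministic, non-spatial toy model, and the parameter values are hypothetical, not those used in the study:

```python
# Toy deterministic SIR model; hypothetical parameters.
beta, gamma = 0.6, 0.2        # transmission and recovery rates (per day)
S, I, R = 0.999, 0.001, 0.0   # fractions of the horse population
dt, days = 0.1, 180

for step in range(int(days / dt)):
    new_inf = beta * S * I * dt   # new infections this time step
    new_rec = gamma * I * dt      # new recoveries this time step
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

print(f"After {days} days: S={S:.3f}, I={I:.4f}, R={R:.3f}")
```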

W1-C.1 Coglianese, C; University of Pennsylvania; [email protected] Listening, Learning, Leading: A Framework of Excellence in Risk Regulation Around the world, regulators confront immense challenges in balancing and achieving multiple objectives that hold major consequences for societies and their economies. What distinguishes poor regulators from good regulators, and the truly excellent regulators from the merely good ones? That is the question presented by a major research initiative undertaken over the last year at the Penn Program on Regulation. This paper reports the findings from this initiative, drawing on research and engagement with experts and the public, and presents a framework for defining regulatory excellence and measuring its attainment.

T3-B.1 Coglianese, C; University of Pennsylvania; [email protected] Looking Back at Regulatory Look-Back This presentation will focus on recent efforts to encourage retrospective review of federal regulations and their associated impact. It will review recent initiatives and show how they have been justified as efforts to reduce costs associated with regulatory burdens, when a better principle would justify retrospective review in terms of learning, both about which regulations work well and which do not.

W4-D.1 Collier, ZA*; Mayo, M; Winton, C; Chappell, MA; University of Virginia; [email protected] Uncertainty and Nonlinearity in Life Cycle Impact Assessment Models Life cycle assessment (LCA) is a holistic modeling approach to the problem of understanding potential human health and environmental effects of various manufactured products and processes. However, decision-makers rely on results from LCA-based analyses, which are often highly variable or uncertain. Without reliable estimates of impact, LCA’s effectiveness as a decision aid is greatly diminished. We propose that the concept of "decision quality" can be used to inform improvements to LCA modeling practice, with examples in two areas: (1) uncertainty analysis and model reduction in fate and transport modeling, where uncertainty sometimes spans eight or more orders of magnitude; there is a need to develop a more parsimonious yet accurate model to determine fate and transport in LCA methods; and (2) a nonlinear method to relate the effects of one chemical in terms of another. Currently, these equivalency factors use linear approximations for environmental processes that are inherently nonlinear. We addressed this problem by directly employing relationships between experimental concentration-response curves. Our mathematical models relate these curves to one another, so our concentration-concentration equations preserve the nonlinearity of these concentration-response relationships.
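
A minimal sketch of the nonlinear equivalency idea described above: rather than a linear factor, map a concentration of chemical A to the concentration of chemical B producing the same response, here using Hill-type concentration-response curves. All curve parameters are hypothetical, not the authors' fitted models:

```python
# Nonlinear concentration-concentration mapping via Hill curves.

def hill(c, emax, ec50, n):
    """Hill concentration-response curve."""
    return emax * c**n / (ec50**n + c**n)

def hill_inverse(r, emax, ec50, n):
    """Concentration producing response r on a Hill curve (requires r < emax)."""
    return ec50 * (r / (emax - r)) ** (1.0 / n)

# Hypothetical curves for chemicals A and B
A = dict(emax=1.0, ec50=10.0, n=1.5)
B = dict(emax=1.0, ec50=4.0, n=0.8)

for c_a in [1.0, 10.0, 50.0]:
    r = hill(c_a, **A)              # response produced by chemical A
    c_b = hill_inverse(r, **B)      # concentration of B with the same response
    print(f"{c_a:5.1f} units of A ~ {c_b:7.2f} units of B (response {r:.2f})")
```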

W5-E.5 Connelly, EC*; Lambert, JH; Clarens, AF; Colosi, LM; University of Virginia; [email protected] Risk-based technology roadmap for alternative fuels Bringing innovative technologies from concept to commercialization requires investment in research and development (R&D). A number of stakeholders can be involved in R&D, including government, industry, and academia. To encourage commercialization of innovative technologies, R&D roadmaps can serve to coordinate efforts across a variety of stakeholder groups. Risk analysis methods, multi-criteria decision analysis, and scenario analysis can be integrated to prioritize efforts under uncertain future economic, political, and technological conditions. Application of these methods will be demonstrated for biofuel supply chains. The importance of life cycle assessment for environmental decision criteria will be discussed, particularly as it relates to algae biofuels.

P.116 Convertino, M*; Liu, Y; University of Minnesota; [email protected] Optimal surveillance network design: a value of information model Infectious diseases are the second leading cause of death worldwide, accounting for 15 million deaths – more than 25% of all deaths – each year. The ability to detect outbreak pathways in a timely manner via high-efficiency surveillance systems is essential to the physical and social well-being of populations. For this purpose, a traceability model inspired by wave pattern recognition models is proposed to detect “zero-patient” areas from the observed outbreak spread. Model effectiveness is assessed on data from the 2010 cholera epidemic in Cameroon, the 2012 foodborne Salmonella epidemic in the USA, and the 2004-2007 H5N1 avian influenza pandemic. Previous models are complemented by the introduction of an optimal selection algorithm for surveillance networks based on the value of information (VoI) of reporting nodes, which are subnetworks of the mobility networks in which people, food, and species move. The surveillance network is the response variable to be determined in maximizing the accuracy of outbreak source detection while minimizing detection error. Surveillance network topologies are selected by considering their integrated network resilience, expressed as the rewiring probability, which is related to the ability to report outbreak information even in the case of network destruction or missing information. Independently of the outbreak epidemiology, maximizing the VoI leads to a minimum increase in accuracy of 40% compared to a random surveillance model, accompanied by an average reduction of 25% in required surveillance nodes with respect to random surveillance. The model is extremely useful for optimizing surveillance networks to drastically reduce the burden of foodborne and other infectious diseases, and it can be the framework of a cyber-technology that governments and industries utilize in real time to avoid catastrophic and costly health and economic outcomes.
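
A minimal sketch of value-of-information-driven surveillance design in the spirit of the abstract above: greedily add the node whose reports most improve expected detection accuracy. The node names, per-node gains, and the toy accuracy model are hypothetical illustrations, not the authors' algorithm:

```python
# Greedy node selection by marginal gain in expected detection accuracy.
candidate_nodes = {"market_A": 0.30, "port_B": 0.25, "clinic_C": 0.20,
                   "farm_D": 0.10, "school_E": 0.08}  # hypothetical gains
budget = 3                                            # nodes we can afford

def accuracy(selected):
    """Toy diminishing-returns model: P(detect) = 1 - prod(1 - gain_i)."""
    p_miss = 1.0
    for node in selected:
        p_miss *= 1.0 - candidate_nodes[node]
    return 1.0 - p_miss

selected = []
while len(selected) < budget:
    best = max((n for n in candidate_nodes if n not in selected),
               key=lambda n: accuracy(selected + [n]) - accuracy(selected))
    selected.append(best)

print("Selected surveillance nodes:", selected)
print(f"Expected detection accuracy: {accuracy(selected):.2f}")
```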

W3-A.3 Cooke, RM*; Lutter, R; Resources for the Future; [email protected] Breastfeeding and Cognitive Test Performance: Evidence from Expert Judgment A variety of studies have tried to estimate effects of breastfeeding, both duration and exclusivity, on cognitive performance. Results range fairly widely – from no effect (Colen and Ramey, 2014) to 3-4 IQ points (Victora et al., 2015) – and the methods include prospective longitudinal studies, randomized trials (Kramer et al., 2008), and meta-analysis (e.g., Horta and Victora, 2013). Following the structured expert judgment procedures of the European Union (Cooke and Goossens, 1999), we develop a distribution of conditional Bayesian expectations for measures of cognitive performance conditional on a prescribed set of preconditions, including breastfeeding behavior. The challenge is to design an ideal experiment controlling for confounders, controlling the characteristics of breastfeeding ("any", "exclusive", "duration"), controlling for the non-breastfeeding alternatives, and controlling the way in which cognitive performance is measured. Such control is impossible in practice, but can be used to inform a structured expert judgment study. We characterize the uncertainty associated with cognitive performance conditional on the control parameters.

W3-A.1 Cooke, Roger; Lutter, Randall*; Resources for the Future; [email protected] Cognitive Test Results and Labor Market Earnings in India: Evidence from Expert Judgment Longitudinal studies using U.S. data have provided estimates of the effects of cognitive test results on labor market earnings (e.g., Schwartz, 1994; Salkever, 1995). Such estimates underlie benefits estimates for some federal regulations (e.g., EPA 2008). There is, however, substantial uncertainty about the potential magnitude of such effects in other countries, where data for comparable longitudinal studies are quite rare. We conduct a structured expert elicitation of the effects of IQ on earnings based on interviews of experts. We pursue this structured expert elicitation for India, which may have opportunities for education and labor market mobility that are more limited than, say, in the U.S. Following the structured expert judgment procedures of the European Union (Cooke and Goossens, 1999), we develop a distribution of conditional Bayesian expectations for labor market earnings, conditional on a prescribed set of preconditions, including cognitive performance. The challenge is to design an ideal experiment controlling for confounders, for the characteristics of cognitive performance (IQ), and for the way in which cognitive performance is measured. Such control is impossible in practice, but can be used to inform a structured expert judgment study. We characterize the uncertainty associated with earnings related to gains in cognitive performance.

W5-A.2 Cote, IL*; Flowers, L; Cogliano, VJ; US Environmental Protection Agency; [email protected] Approaches to Integrated Evaluation of Cancer and Noncancer Endpoints under Consideration at US EPA Historically, dose-response assessment has been conducted differently for noncancer and cancer effects at the US Environmental Protection Agency, with noncancer effects evaluated using a threshold, reference dose approach, and estimates of cancer risk defined either as a low-dose non-threshold phenomenon or as nonlinear when data are available to inform the shape of the response at low dose. The National Research Council (2009) has recommended a unified approach so that risks are evaluated in a consistent manner among chemicals and across endpoints of concern. Harmonization of cancer and noncancer health effects assessment needs to occur at several steps in the process, most notably in consideration of: potential underlying mechanisms of action; population susceptibility and variability; dose-response approaches to the data; and uncertainty analyses. EPA is evaluating several new strategies relative to these issues. This talk will discuss how knowledge gleaned from the integration of disease-specific and chemical-specific mechanisms of action can inform consideration of various endpoints. In particular, the role of mechanistic information in characterizing background responses in disease processes, intrinsic factors contributing to differential susceptibility, and the potential effects of concurrent exposures to chemical and nonchemical stressors will be illustrated. Further, how new mechanistic and statistical approaches can inform dose-response assessment and uncertainty analyses, in terms of individual thresholds and population responses, will be considered. Differences and similarities between human- and animal-derived data will also be characterized.

W5-H.5 Cox, LA; Cox Associates and University of Colorado; [email protected] Causal Analytics for Improving Risk Regulation Effectively regulating many types of risks requires answering these questions: 1. DESCRIPTIVE ANALYTICS: What’s happening? What’s new and important? What should we worry about? 2. PREDICTIVE ANALYTICS: What are the probable consequences of not taking any new risk management actions? What are our best available options, and what are their probable consequences? How soon are they likely to occur, and how sure can we be? 3. PRESCRIPTIVE ANALYTICS: What should we do next? How should we allocate available resources to explore, evaluate, and implement different policy choices in different locations? 4. EVALUATION ANALYTICS: How well are risk management policies performing? Are they producing (only) their intended effects? When do they work or fail? 5. LEARNING ANALYTICS: How can we do better, taking into account value of information and opportunities to learn from small trials before scaling up? This talk discusses recent advances in each of these areas and recommends an integrated application of them to risk assessment and management, which we term causal analytics. We discuss applications of causal analytics to improving risk regulations for uncertain health, safety, and environmental risks. For these risks it is crucial for risk analysts, managers, and regulators to be able to learn effectively from experience not only the probable consequences of different policy choices, but also how to optimize decisions to improve the probability distribution of consequences. Current technical methods of causal analytics, including change point analysis, quasi-experimental design and analysis, causal graph modeling, Bayesian Networks and influence diagrams, Granger causality and transfer entropy methods for time series, causal analysis and modeling, and low-regret learning provide a valuable toolkit for using data to assess and improve the performance of risk regulations by actively discovering what works well and what does not.
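
A minimal sketch of one tool in the causal-analytics toolkit named above, change point analysis, implemented here as a simple one-sided CUSUM statistic. The simulated incident series, baseline window, and thresholds are hypothetical:

```python
# Change point detection via a one-sided CUSUM statistic; simulated data.
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical weekly incident counts with an upward shift after week 60
series = np.concatenate([rng.normal(10, 2, 60), rng.normal(13, 2, 40)])

mean0 = series[:30].mean()   # baseline estimated from early data
k, h = 1.0, 8.0              # allowance and decision threshold
cusum, alarm_at = 0.0, None
for t, x in enumerate(series):
    cusum = max(0.0, cusum + (x - mean0 - k))  # accumulate upward deviations
    if cusum > h:
        alarm_at = t
        break

print("Change detected at week:", alarm_at)
```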

M3-C.2 Cragin, D*; Poepken, T; OCeallaigh, T; Lepore, J; Hollick, N; McPike, S; Thomas, A; Merck & Co and Peking University; [email protected] Impact of REACH Authorization Listings on Pharmaceutical Manufacturing in the EU Under the Registration, Evaluation, Authorization and Restriction of Chemicals (REACH) regulation in the European Union, a process called Authorization is used to regulate hazardous materials. Materials considered to be hazardous are initially placed on the Candidate List, and some may be moved to the Authorization List, Annex XIV. Once a material in Annex XIV is past its sunset date, only specifically authorized applications are permitted. Several aprotic solvents used in the manufacture of pharmaceuticals have been placed on the Candidate List and could be subject to Authorization. A solvent change might be relatively simple for some products. For pharmaceuticals, however, changes can have many consequences, including: (1) significantly larger volumes of replacement solvent due to reduced solubility; (2) altered selectivity of chemical reactions; (3) reduced impurity removal due to inferior differential solubility; and (4) influence on physical properties that can affect efficacy. These issues can significantly impact the safety and efficacy of the drug and increase the time, effort, and complexity of developing a workable solution, if one is available. They can also increase waste and energy use. Solvent changes for a pharmaceutical process would trigger extensive regulatory requirements, including assessment of product performance, revalidation of the active pharmaceutical ingredient and drug process, stability studies, and global variations or amendments that could take 3 to 4 years for approval. This assessment will provide quantitative estimates of the time, resources, and impacts associated with the steps required to replace an existing solvent used in the manufacture of a pharmaceutical. A comprehensive assessment of the impact across the product lifecycle, and of the inherent tension between the perceived hazards of specific solvents and other environmental measures such as Process Mass Intensity (PMI), will help inform the potential impact of Authorization on pharmaceutical manufacturing in Europe.

T2-G.2 Cui, J*; Kusumastuti, S; Rosoff, H; John, RS; University of Southern California; [email protected] A Behavioral Game Modeling Cyber Attackers, Defenders, and Users This study describes a three-player cyber-security game involving an attacker, defender, and user. The game is a general characterization of the potential interactions among these players given their varying and conflicting objectives. The game is designed such that the attacker has two choices – to expend resources to search for a system’s vulnerability and then attack the system, or to focus solely on attacking individual users. Conversely, the defender and user are selecting among different levels of security. The defender’s desired level of security accounts for the need to protect the system and users, as well as expected user compliance. The user’s desired security level is based on the tradeoff between security and convenience. We conduct a behavioral study that involves systematically manipulating the strategies played by two of the players and observing how play by the third player is affected by the manipulations. For instance, participants can play as the attacker while the defender’s and user’s strategies are strategically manipulated. In the first round, the player is presented with the decision to attempt to infiltrate the system or to attack individual users directly. Once a decision is made, the player is told whether they succeeded, provided with their winnings or losses, and given the defender’s and user’s decisions. In the subsequent round, if the player succeeded in identifying a vulnerability in the security system, they are given the choice to attack the system or attack individual users. However, if they failed to successfully infiltrate the system, the game starts over again. The game is repeated until 10 rounds are completed. We will present results from human subjects recruited through Amazon Mechanical Turk. The analysis focuses on how the decision making of the attacker can be deterred or diverted by the moves or decisions of the other two players.

P.11 Cui, J*; Rosoff, H; John, RS; University of Southern California; [email protected] Measuring Individual Differences in Near-Miss Appraisals A near-miss is defined as “an event that had a nontrivial probability of ending badly, but by chance did not” (Dillon, Tinsley, & Cronin, 2011). Research has found that individuals can react differently (risk-averse or risk-seeking) to a near-miss event depending on whether the experience is interpreted as resilient or vulnerable (Tinsley, Dillon and Cronin, 2012; Rosoff, Cui and John, 2013). We hypothesize that people differ in the way they interpret near-miss events and resolve decisions following near-miss experience. In this study, we developed the Near Miss Appraisal Scale (NMAS), which measures the psychological appraisal of near-miss experiences. The scale contains 20 items that each describe a near-miss event. Respondents provide a cognitive appraisal of each near-miss event, in the form of an assessment of whether the event described changes their appraisal of the risk of such an event in the future. An increase in risk appraisal suggests that the respondent views near misses as identifying vulnerabilities, while a decrease suggests that the respondent views near misses as confirmation of safety and resiliency. Specifically, respondents rate the likelihood that they would engage in risk-mitigating behaviors in the future, using a 7-point rating scale ranging from 1 (extremely unlikely) to 7 (extremely likely). A higher score indicates an appraisal of vulnerability, in that the respondent is more likely to take a protective measure in the same situation in the future. In contrast, a lower score indicates an appraisal of resilience, in that the respondent is less likely to take protective measures. We evaluate the psychometric properties of the items and the scale using Classical Test Theory (CTT), factor analysis (FA), and Item Response Theory (IRT). We establish discriminant validity for the scale against other psychological measures, such as risk aversion/loss aversion constructs, e.g., the Domain-Specific Risk-Taking (DOSPERT) scale (Blais and Weber, 2006).

T2-J.2 Cuite, CL*; Shwom, RL; Hallman, WK; O'Neill, KM; Demuth, JL; Morss, RE; Rutgers University; [email protected] Testing Messages to Improve Coastal Storm Risk Communication Helping the public understand the threat of coastal storms and persuading them to take appropriate protective actions is an essential job for emergency managers and others who communicate with the intent of protecting public safety. A series of experiments was designed to understand key message variables that may increase the likelihood of coastal residents taking protective measures before a storm. Using a mix of randomly selected and convenience samples, we conducted an online survey in Spring 2015 of 1,700 residents of NJ, NY, and CT who live in coastal zip codes with 40% or more of the land mass in evacuation zones. This sampling strategy allowed us to include approximately even numbers of respondents who are aware that they live in evacuation zones and respondents who are aware that they do not live in evacuation zones, as well as many who are not sure. Message variables were tested using hypothetical scenarios, and included personalized messages with varying levels of specificity (e.g., street names vs. town names), describing storm surge vs. not, and different phrasing of evacuation orders. In addition, we tested fear appeals and guilt appeals of differing intensity to examine their effectiveness in motivating appropriate coastal storm behavior, the latter having never been studied in this context. All messages were tested for effects on a range of dependent variables, including evacuation intentions, intentions to engage in other recommended protective behaviors, risk perceptions, perceived severity of the storm, and message comprehension. Practical implications of these findings will be discussed, with a special focus on how local emergency managers can use this information to keep their residents safe.

M3-I.1 Cunningham, T*; CDC/NIOSH/EID; [email protected] Considering non-occupational exposures, stressors, and risks for a Total Worker Health™ approach Both work-related factors and health factors beyond the workplace jointly contribute to safety and health problems for workers and their families. Traditionally, workplace programs have focused on protecting workers from on-the-job risk factors, while health promotion programs have focused on off-the-job lifestyle factors. A growing body of science supports the effectiveness of integrating health protection and health promotion programs.

M3-I.2 Cunningham Hill, M; Johnson & Johnson Health and Wellness Solutions, Inc.; [email protected] Implementing Total Worker Health™: A story of wellness and prevention, behavioral health, and understanding chronic disease The NIOSH Total Worker Health™ approach encourages addressing the workplace and non-workplace together when reducing risks for workers and their families. Such holistic health programs require skills in behavior modification, patient/consumer experience, health care analytics, and coaching platforms. They also require an integrated portfolio of solutions covering a broad spectrum of health management, from wellness and prevention, to behavioral health, to chronic disease support, with the aims of improving outcomes, controlling costs, and investing energy toward vibrant and longer lives.

W5-E.2 Cuvilliez, AL*; Fischer, M; Stanford University; [email protected] Impact of decentralization and renewable energy generation on outages and economic losses When disasters strike, the results can be devastating. One of the key components that impacts lives is the loss of infrastructure. Hurricane Sandy left over eight million people without power, with estimated economic losses of around $20 billion, mainly from business losses. Power outages hurt people and inhibit recovery efforts. Planning infrastructure to minimize the impacts of disasters and severe weather is critical to bettering people’s lives in already tough circumstances. In the next 15 years, we will need to spend a minimum of $100 million per year to maintain the energy infrastructure. This need for investment provides a key opportunity to set objectives that will minimize future impacts. Two topics of hot debate on resiliency of the grid are the level of renewables in the grid and the degree of decentralization; as renewables are integrated into the grid, generation will become less centralized. However, the link between renewables, decentralization, and outages has not yet been analyzed. Preliminary results show a possible correlation between centrality metrics and the level of outages experienced in Wisconsin and Minnesota. The goal of the proposed project is to assess the impact of decentralization and renewable generation on outages and economic losses in disasters using economic loss scenario software. The economic impacts of different grid designs and renewable generation ratios in an earthquake scenario will be compared. We expect the results to help quantify the return on investment (ROI) of more resilient infrastructure. Better ROIs will align economic incentives with bettering people’s lives following disasters.

W2-E.1 Dale, AL*; Lowry, GV; Casman, EA; Carnegie Mellon University; [email protected] Insights from a model of silver and zinc oxide nanoparticle fate in a Virginia watershed Mathematical models are needed to estimate environmental concentrations of metal-containing nanoparticles (NPs), emerging environmental contaminants found in consumer goods. We present a spatially resolved environmental fate model for the James River Basin in Virginia (USA) that explores the influence of daily variation in stream flow, sediment transport, and water chemistry on river and sediment bed concentrations of zinc oxide (ZnO) and silver (Ag) NPs and their reaction by-products after they enter the environment via wastewater treatment plant effluent and the application of biosolids to agricultural land. Spatial and temporal variability in sediment transport rates led to surprisingly high NP transport, such that less than 6% of NP-derived metals were retained in the river and sediments. Chemical transformations entirely eliminated ZnO NPs and doubled Zn mobility in the stream relative to Ag. Agricultural runoff due to the use of sewage biosolids as fertilizer accounted for 23% of total metal stream loads from NPs. Average NP-derived metal concentrations in the sediment varied spatially by up to nine orders of magnitude. Overall, our results suggest that previous models designed to assess NP risk to the environment have misrepresented NP fate in freshwater rivers due to low model resolutions and the simplification of NP chemistry and sediment transport. Furthermore, the results reveal the need for site-specific assessment and management of NP risk.

W4-F.1 Damnjanovic, ID; Texas A&M University; [email protected] Design of Early Warning Systems Based on Critical Transition Metrics This presentation focuses on the design of early warning systems based on the concept of critical transitions. More specifically, four key system features are identified and discussed: (a) multi-scale interface, or the capacity to communicate signals and warnings across different scales; (b) lead time, or the feasibility of receiving and processing signals within a timescale that allows for sound decision making; (c) trigger thresholds, or metrics that are robust against false positives and negatives; and finally (d) signal selection, or the type of signals that need to be monitored. The system design approach is then illustrated using a case study.
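
A minimal sketch of trigger metrics commonly used for critical transitions: rising rolling variance and lag-1 autocorrelation ("critical slowing down") as early-warning signals. The simulated monitoring series and warning threshold are hypothetical, not the case study's metrics:

```python
# Early-warning indicators: rolling variance and lag-1 autocorrelation.
import numpy as np

rng = np.random.default_rng(3)
n = 400
noise = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(1, n):
    ar = min(0.99, 0.3 + 0.002 * t)   # slowing down: AR(1) coefficient rises
    x[t] = ar * x[t - 1] + noise[t]

window = 50
for t in range(window, n, window):
    seg = x[t - window:t]
    var = seg.var()
    ac1 = np.corrcoef(seg[:-1], seg[1:])[0, 1]   # lag-1 autocorrelation
    flag = "WARN" if ac1 > 0.8 else "ok"
    print(f"t={t:3d}  var={var:5.2f}  lag-1 autocorr={ac1:5.2f}  {flag}")
```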

M2-C.3 Dana, GV*; U.S. Department of State; [email protected] International Perspectives on Advances in Biotechnology International oversight of biological engineering, often popularly called synthetic biology, is an area of great interest to researchers, industry, civil society, and other stakeholder groups. Advances in biotechnology tools and techniques are leading to questions about appropriate governance and oversight mechanisms in human health and the environment. Understanding the policy landscape around biotechnology is critical for bringing ideas successfully from the lab to the marketplace, but this can be quite a challenge in a field that involves many disciplines, international collaborations, and a whole generation of researchers with little experience with risk and regulation of biotechnology. This talk will discuss how and where players in the international policy arena are debating biological engineering, including in the UN Convention on Biological Diversity, and the challenges encountered as groundbreaking tools and techniques such as gene drives enter the scene.

M2-E.2 Davidson, RA*; Manzour, H; Horspool, N; Nozick, LK; University of Delaware; [email protected] Method to Represent Seismic Hazard for Spatially Distributed Infrastructure Assessing risk for consideration in regional-scale policy decisions affecting lifelines or inventories of buildings (i.e., spatially distributed infrastructure) differs from assessment for a single structure in two key ways: (1) it must consider the spatial correlation among the exposed structures, and (2) computation is an issue. The new Extended Optimization-based Probabilistic Scenario method addresses these two issues by using a combination of simulation and optimization to produce a small set of probabilistic ground motion maps that represent the seismic hazard for analysis of spatially distributed infrastructure. We applied the method to Christchurch, New Zealand, conducted a sensitivity analysis of key user-specified parameters, and compared the results of a regional loss estimation based on our hazard analysis to those based on a conventional Monte Carlo simulation. A set of just 124 ground motion maps was able to match the hazard curves based on a million-year Monte Carlo simulation with no error at the four selected return periods, mean spatial correlation errors of 0.03, and an average error in the residential loss exceedance curves of 2.1%. This enormous computational saving in the hazard representation has substantial implications for regional-scale policy decisions affecting lifelines or building inventories, since it allows many more downstream analyses and/or doing them using more sophisticated, computationally intensive methods. The method is robust, offering many equally good solutions, and it can be solved using free open-source optimization solvers. We offer practical guidance to facilitate implementation by practitioners, including a small modification to the method that makes it easier for any solver to solve.
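
A minimal sketch of the core idea described above: use optimization to select weights for a small set of candidate ground-motion maps so the weighted set reproduces target hazard curves. Here, nonnegative least squares on simulated data; the actual method, constraints, and data are more sophisticated, and all numbers are hypothetical:

```python
# Weight selection for hazard-consistent ground-motion maps via NNLS.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
n_maps, n_points = 40, 12   # candidate maps, hazard-curve evaluation points
# exceedance[i, j] = 1 if map j exceeds intensity threshold i (hypothetical)
exceedance = (rng.random((n_points, n_maps)) < 0.3).astype(float)
target = np.sort(rng.random(n_points))[::-1] * 0.05  # target exceedance rates

weights, residual = nnls(exceedance, target)  # min ||Aw - b|| with w >= 0
kept = np.flatnonzero(weights > 1e-6)
print(f"{len(kept)} maps retained of {n_maps}; fit residual {residual:.4f}")
```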

T3-C.2 De Marcellis-Warin, N*; Hosseinali-Mirza, V; Warin, T; Polytechnique Montréal and CIRANO; [email protected] Perceptions of Information Credibility During a Social Media Crisis Social media can influence risk perception and trust even when the information shared is untrue or only a rumor, because of the buzz it generates. A hoax started by a customer, a competitor, or an employee can propagate virally; more than being viral among a network of friends, the message can spread worldwide. The Barometer CIRANO was used to determine factors affecting risk perception and public opinion in Quebec, Canada (De Marcellis-Warin & Peignier, 2012). We found that use of the Internet (websites and social media) seems to increase the probability of overestimating a risk. This article presents the results of an exploratory survey conducted to investigate how online users perceive the credibility of social media information in normal times and during a crisis. Today, the evolution and growing usage of social media technologies is outstanding. It is harder to assess the credibility of online media than of traditional media, as there is no control on publishing information in the online environment; consequently, online information is potentially “distorted”, “inaccurate”, “biased”, “misleading”, or even “false” (Flanagin & Metzger, 2000; Metzger & Flanagin, 2013). Furthermore, through an actual case study of a social media crisis, we examine online users’ perceptions of communication strategies. The analysis revealed that: (1) online users allocate different levels of credibility to information sources in normal times compared to a crisis; (2) online users with a higher level of social media engagement give more importance to an organization’s social media activities during a crisis; and (3) online users suppose that a social media crisis has a negative long-term impact on an organization’s reputation and brand credibility. Public relations could employ these findings to accurately position social media in an organization’s communication agenda.

T2-G.3 Dehghani Abbasi, Y*; Kar, D; Sinha, A; Sintov, N; Tambe, M; Fang, F; Nguyen, TH; Brown, M; Zhang, Ch; University of Southern California; [email protected] Modeling Human Bounded Rationality An increasing number of automated decision aids based on game-theoretic algorithms are used daily by security agencies to assist in allocating limited security resources to various targets. These applications of game theory, based on the “security games” paradigm, have led to fundamental research challenges: one major challenge is modeling human bounded rationality. More specifically, it is important to investigate the bounded rationality of the human adversaries to improve the effectiveness of the defender's security resource allocation. We study human behavior in the context of such security game settings and explore multiple models. We analyze their performance using various prediction accuracy metrics and compare them in several key scenarios and applications.
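
One widely used family of bounded-rationality models in this security-games literature is quantal response, in which the adversary chooses targets with softmax probabilities over expected utilities rather than a pure best response. A minimal sketch; the per-target utilities and rationality parameter values are hypothetical:

```python
# Quantal response: a common bounded-rationality adversary model.
import numpy as np

def quantal_response(utilities, lam):
    """P(target) proportional to exp(lam * utility).
    lam = 0 gives uniform random play; lam -> infinity gives best response."""
    z = np.exp(lam * (utilities - utilities.max()))  # numerically stable softmax
    return z / z.sum()

attacker_utilities = np.array([3.0, 5.0, 1.0, 4.0])  # hypothetical payoffs
for lam in [0.0, 0.5, 2.0]:
    probs = quantal_response(attacker_utilities, lam)
    print(f"lambda={lam:3.1f} ->", np.round(probs, 3))
```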

W4-G.3 Delavarrafiee, M*; Frey, HC; North Carolina State University; [email protected] Empirical Comparison of Fine Particulate Matter Exposure Concentrations in North Carolina State University Campus Buses and a Personal Passenger Car Exposure to fine particulate matter (PM2.5) has adverse effects on human health, ranging from respiratory and cardiovascular morbidity to mortality. In-vehicle microenvironments contribute a significant portion of personal total daily PM2.5 exposure. North Carolina State University (NCSU) students typically commute either by bus or by private passenger car. PM2.5 mass concentrations were recorded during lunchtime and evening rush hours for both the bus and car transportation modes using a DustTrak DRX Aerosol Monitor 8520. For the bus mode, data were collected on the common commute routes of the NCSU campus area. For the passenger car mode, a 2004 Honda Civic was used to record data on the same routes as the bus mode. For each transportation mode, PM2.5 concentrations were collected on weekdays over 5 runs in the lunchtime rush hour and 5 runs in the evening rush hour. The results indicate that commuters using private cars are exposed to 50% lower PM2.5 concentrations than those using campus buses. The daily average exposure to PM2.5 is weighted by the time spent in each microenvironment. Daily average exposure to PM2.5 in the campus buses and the passenger car is estimated to be significantly lower than the National Ambient Air Quality Standard (NAAQS) for average daily exposure to PM2.5. The results of this study help commuters understand the factors that affect personal daily exposure to fine particulate matter.
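
A minimal sketch of the time-weighted average exposure calculation the abstract describes: daily exposure is each microenvironment's PM2.5 concentration weighted by the time spent there. All concentration and time values are hypothetical, not the study's measurements:

```python
# Time-weighted average (TWA) daily PM2.5 exposure; hypothetical values.
microenvironments = [
    # (name, PM2.5 concentration in ug/m^3, hours per day)
    ("campus bus", 35.0, 1.0),
    ("outdoors",   12.0, 2.0),
    ("indoors",     8.0, 21.0),
]

total_hours = sum(h for _, _, h in microenvironments)
twa = sum(c * h for _, c, h in microenvironments) / total_hours
print(f"Time-weighted daily average exposure: {twa:.1f} ug/m^3")
```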

M2-J.1 Demski, CC*; Pidgeon, NF; Capstick, SB; Sposato, RG; Spence, A; Cardiff University, UK; [email protected] The experience of flooding and its influence on climate change risk perceptions Climate change can be temporally, geographically and socially distant from people’s everyday lives, leading to a lack of engagement. By contrast, direct personal experience of climate-related impacts is one of the few ways in which climate change can become more salient for people, thus reducing this ‘psychological distance’. We examine the role of personal experience of flooding and its influence on climate change beliefs, through a focus on people’s experiences of the major UK winter floods of 2013/2014, which provided a natural experiment to test aspects of the distancing hypothesis. These floods were brought on by unprecedented storms and rain during December and January. The empirical study comprised a survey of a nationally representative sample (n=1,002) and a second survey of residents in the 5 areas of the UK most directly affected by the floods (n=995), conducted 8 months after these events in the autumn of 2014. We compare risk perceptions and a range of measures (e.g., climate change importance, psychological proximity) across the nationally representative and flood-affected samples, and conclude that respondents who had experienced the floods perceived climate change as more proximal and salient to them personally. We further discuss the processes that may underlie the effect of direct experience on perceptions, as well as methodological issues in studying the links between ‘experience’ and climate risk perceptions.

W5-J.4 Demuth, JL*; Morss, RE; Palen, L; Stowe, K; Anderson, J; Kogan, M; Anderson, K; NCAR; [email protected] Understanding dynamic communication, risk perception, and decisions during Hurricane Sandy through analysis of Twitter data Advances in weather observations and forecasting coupled with advances in information and communication technology have transformed how people can access, process, combine, and share information when hazardous weather events like hurricanes threaten. Social media data streams provide opportunities to investigate how people discuss their weather risk-related attitudes, perceptions, and behaviors in real time as they evolve dynamically. The data also pose challenges, especially regarding how to capture and meaningfully analyze the vast amounts of data. This presentation will discuss the methods for and initial findings from analysis of Twitter data collected during the approach and landfall of Hurricane Sandy. A major focus of the investigation is to understand how Twitterers exchange information about an approaching threat and how this affects their evolving perceptions of and responses to the risks posed by Sandy. To help analyze the data, the research team developed a coding scheme that includes codes for different types of positive and negative sentiment, reporting on the physical and social environment, information seeking and sharing, and protective actions (e.g., preparations, evacuation, sheltering). This scheme is being used to code a subset of the data manually and to apply natural language processing and machine learning techniques to automate coding the larger Twitter dataset. The coding scheme applied to the Twitter data will be used to help identify indicators of people’s hurricane risk perceptions and their attitudes towards protective behaviors. The data will also be used to link people’s risk perceptions to other variables, such as their hurricane experience and use of environmental and social cues, and their decisions (e.g., information seeking, emotional coping). Other components of the broader research effort also will be briefly discussed, including how the Twitter data analysis is being complemented by focus groups with vulnerable populations.
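
A minimal sketch of the kind of supervised text classification that can automate a manually developed coding scheme, as described above: a TF-IDF plus logistic regression pipeline from the scikit-learn library. The labeled tweets and code names are hypothetical stand-ins for the manually coded subset:

```python
# Supervised coding of tweets via TF-IDF + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

coded_tweets = [  # (tweet text, manually assigned code) - hypothetical
    ("boarding up windows and filling the tub before sandy hits", "preparation"),
    ("we are leaving the shore tonight, roads already packed",    "evacuation"),
    ("this storm is terrifying, stay safe everyone",              "negative_sentiment"),
    ("does anyone know if the ferry is still running?",           "information_seeking"),
]
texts, labels = zip(*coded_tweets)

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["stocking up on water and batteries tonight"]))
```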

P.127 Demuth, JL; NCAR and Colorado State University; [email protected] A valid scale of past experiences for tornado risks One’s past experience with a hazard is potentially a key factor in how one perceives a future risk, as experience is a key mechanism through which one acquires knowledge about a risk. Despite this, past hazard experience has been conceptualized and measured in wide-ranging and often simplistic ways by researchers, resulting in mixed findings about the relationship between experience and risk perceptions. Thus, the dimensions of past hazard experiences are not known, nor is it known how one’s experiences relate to one’s assessment of future risks. Past hazard experience is particularly relevant to weather risks, which are common enough for people to acquire many experiences. This poster will present the results of a study to develop a valid scale of past experiences in the context of tornado risks. The scale is developed by first conceptualizing and identifying dimensions of past tornado experience, and then by examining the relationship between the different experience dimensions and people’s tornado risk perceptions. Data were collected through two mixed-mode surveys of members of the public who reside in tornado-prone areas. An initial set of items measuring people’s most memorable tornado experience, as well as their experiences with multiple tornado threats, was developed for and evaluated with the first survey. Additional aspects of people’s past tornado experiences were elicited in their own words. The item set was then revised and evaluated with the second survey, along with measures of people’s tornado risk perceptions. Four dimensions of people’s most memorable tornado experiences emerged: risk awareness, risk personalization, personal intrusive impacts, and vicarious impacts. Two dimensions of people’s multiple tornado experiences also emerged: common personal threats and impacts, and negative emotional responses. Moreover, these different types of experiences relate differently to people’s tornado risk perceptions. These results will be discussed.

M2-H.2 Denbaly, M*; Hoffman, S; USDA Economic Research Service; [email protected] Benefits and Challenges of New Data and Models Lack of data has been a perennial challenge to the development of an evidence-based approach to improving food safety policy. We’d love to have a better understanding of where food hazards are emerging in the food supply; at the moment we rely primarily on outbreak investigations or extremely focused case-control studies. We’d love to know how food safety research funding impacts the future food safety workforce. We’d love to have better ways of estimating the relationship between food consumption and foodborne illnesses. All of these efforts have been frustrated by a lack of data. As we’ll discuss today, new data sources that require new ways of thinking about data are rapidly emerging through the development of new information technologies. I want to contribute to our discussion by first describing some of the efforts underway at the USDA Economic Research Service to use “big data” and new approaches to thinking about data to inform questions like the ones I’ve mentioned. I then want to talk about some of the “big questions” that are plaguing food safety policy and management, to help us think about how “big data” may be able to help with some of the “big challenges” we face in improving the safety of the U.S. food supply.

M2-E.4 Deng, Q*; Baecher, G; Komey, A; University of Maryland, College Park; [email protected] Evaluating overtopping risks of reservoir-dam systems based on rare event simulation The overtopping risk of a reservoir-dam system serves as a critical index of dam safety status. In most cases, overtopping events have very small probabilities of occurrence, so accurate estimation with crude Monte Carlo simulation requires a prohibitively large number of trials. This computational expense is one reason simulation has not been widely applied to reservoir operation. A rare event simulation-based approach is therefore proposed in this study to address the overtopping risks of reservoir systems.
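
A minimal sketch of one standard rare-event simulation technique, importance sampling: estimate a small exceedance probability by sampling from a shifted proposal distribution and reweighting by the likelihood ratio. The distributions and threshold are hypothetical illustrations, not the reservoir model itself:

```python
# Rare-event probability via importance sampling vs. crude Monte Carlo.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
threshold = 5.0   # hypothetical standardized exceedance level
n = 100_000

# Crude Monte Carlo: almost no samples land in the rare region
crude = (rng.normal(0, 1, n) > threshold).mean()

# Importance sampling: draw from N(threshold, 1), reweight by f(x)/g(x)
x = rng.normal(threshold, 1, n)
weights = stats.norm.pdf(x, 0, 1) / stats.norm.pdf(x, threshold, 1)
is_est = np.mean((x > threshold) * weights)

print(f"true     = {stats.norm.sf(threshold):.3e}")
print(f"crude MC = {crude:.3e}")   # typically 0 at this sample size
print(f"IS       = {is_est:.3e}")
```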

W1-A.2 Deveau, M*; Maier, A; Meek, ME; Krewski, D; University of Ottawa, University of Cincinnati; [email protected] Incorporation of chemical-specific data in dose–response assessments for occupational and environmental exposure limits Various methods have been developed to facilitate the incorporation of chemical- and scenario-specific data into the derivation of occupational and environmental exposure limits. Many of these approaches – including physiologically based pharmacokinetic modeling, chemical-specific adjustment factor derivation, and mode of action analysis, among others – were first proposed decades ago, but their application by regulatory agencies has often been diverse and inconsistent. This presentation will report preliminary results of a bibliometric review of the evolution of the incorporation of chemical- and scenario-specific data into dose–response analyses. The results will also highlight differences in the application of the approaches depending on the nature of the exposure guideline (e.g., occupational vs. environmental exposures) and organization (e.g., academic vs. regulatory institutions).

W1-J.4 Dewitt, B*; Fischhoff, B; Broomell, S; Davis, A; Carnegie Mellon University; [email protected] Tornado Risk Perception from Visual Cues Lay judgments of environmental risks are central to both immediate decisions (e.g., taking shelter from a storm) and long-term ones (e.g., building in locations subject to storm surges). Using methods from quantitative psychology, we provide a general approach to studying lay perceptions of environmental risks. As a first application of these methods, we investigate a setting where lay decisions have not taken full advantage of advances in natural science understanding: tornado forecasts in the US and Canada. Because official forecasts are imperfect, members of the public must often evaluate the risks on their own, by checking environmental cues (such as cloud formations) before deciding whether to take protective action. We study lay perceptions of cloud formations, demonstrating an approach that could be applied to other environmental judgments. We use signal detection theory to analyze how well people can distinguish tornadic from non-tornadic clouds, and multidimensional scaling to determine how people make these judgments. We find that participants have good heuristics, but with predictable biases, leading them to misjudge the tornado risk of certain cloud types. The signal detection task revealed confusion regarding shelf clouds, mammatus clouds, and clouds with upper- and mid-level tornadic features, which the multidimensional scaling task suggested reflects a focus on the darkness of the weather scene and the ease of discerning its features. We recommend procedures for training (e.g., for storm spotters) and communications (e.g., tornado warnings) that will reduce systematic misclassifications of tornadicity arising from observers’ reliance on otherwise useful heuristics.
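
A minimal sketch of the standard signal detection theory computation behind analyses like the one above: sensitivity (d') and response criterion (c) from hit and false-alarm rates when judging tornadic vs. non-tornadic clouds. The rates shown are hypothetical, not the study's results:

```python
# Signal detection theory: d' and criterion from hit / false-alarm rates.
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    """d' = z(H) - z(FA); criterion c = -(z(H) + z(FA)) / 2."""
    zh, zfa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return zh - zfa, -(zh + zfa) / 2

d_prime, criterion = sdt_measures(hit_rate=0.80, fa_rate=0.35)
print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")
```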

M4-C.3 Diaz-Sanchez, D; United States Environmental Protection Agency; [email protected] Inherent Variability in Exposure Studies: Study Design and Subject Limitations Controlled studies evaluate a volunteer’s response when exposed to a chemical or mixture of interest; the volunteer is evaluated for transient, reversible biological changes (e.g., lung or cardiovascular function, respiratory symptoms such as cough, biological markers, etc.). These measurements can be used to establish a concentration-response curve for a given airborne contaminant that can be used to inform public health decisions. As an example, one measurement protocol available to researchers was developed by the American Thoracic Society (ATS). In response to an airborne contaminant, volunteers’ FVC (forced vital capacity) and FEV1 (forced expiratory volume in one second) measurements (‘maneuvers’) are taken three to eight times to obtain clinically acceptable results. Subject variability may be introduced from a variety of sources, including volunteer effort, the protocol in use, outdoor exposure to air pollution, and, most importantly, the natural variation between subjects. Understanding the sources of variability is critical to establishing an accurate concentration-response curve and to interpreting the study results. This presentation will discuss the sources of experimental and natural variability in controlled studies and potential steps that can be taken to reduce variability and create a robust study design.

M4-F.1 Dietz, T; Michigan State University; [email protected] Risk, Science and Democracy Risk analysis raised awareness of the inevitability of uncertainty in science-based decision making, and provided tools for better understanding risk and incorporating it into decision making. A major parallel development has been to examine the relationship between science-based risk analysis and democratic decision making. In particular, consideration of the fairness and competence of decision-making processes has led to a growing consensus on methods for linking scientific analysis to public deliberation. I will briefly trace the history of these ideas, assess their current status, and examine challenges for the future, in particular the need to apply this logic in the face of global challenges such as climate change.

P.128 Digoin, G*; de Marcellis-Warin, N; Warin, T; Ecole Polytechnique de Montreal; [email protected] Launching a New Product in a Buzzing World: The Apple Watch's Reputation at Risk Successfully launching a new tech product is vital. If a company does not control all the risks associated with this launch, then the product (and the company) can simply be disregarded by consumers. This is particularly true for reputation risks. If the product's reputation is damaged right from the beginning, then it is often impossible to offset, and the damage can also affect the company's reputation. Corporate reputation is indeed one of the most important assets of a company (de Marcellis-Warin & Teodoresco, 2012). Although this has always been relevant, it is even more important these days with social networks. A study conducted by the Reputation Institute in 2015 reveals that conversations on social networks – by consumers or even unrelated persons – have a negative impact on corporate reputation (-2.4%). However, when a company communicates on social media, its reputation increases (+0.4%). In this research, we analyze the launch of the Apple Watch in early 2015. As one of the most successful tech companies in the world, Apple enjoys a very strong reputation. The launch of Apple's latest connected watch was widely commented on social networks, especially on Twitter. We collected tweets before, during and after the launch of the Apple Watch. This dataset allows us to conduct a sentiment analysis and determine whether the conversations had a positive or a negative impact on the product's reputation as well as on Apple's reputation. We can also identify the specific topics that were discussed (technical, design, etc.). Finally, we highlight the differences between conversations initiated by consumers and those initiated by non-consumers.
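
As an illustration of the kind of analysis described above, here is a deliberately minimal lexicon-based sentiment scorer in Python. The word lists, example tweets, and scoring rule are invented for illustration; the study itself may well have used a different (e.g., commercial or machine-learned) sentiment tool.

# Toy lexicon; a real analysis would use a validated sentiment dictionary or classifier
POSITIVE = {"amazing", "beautiful", "love", "great", "useful"}
NEGATIVE = {"expensive", "ugly", "slow", "useless", "disappointing"}

def sentiment(tweet):
    # Score = count of positive words minus count of negative words
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

tweets_before = ["the apple watch looks amazing", "too expensive for me"]
tweets_after = ["beautiful design but the battery life is disappointing"]

mean_before = sum(map(sentiment, tweets_before)) / len(tweets_before)
mean_after = sum(map(sentiment, tweets_after)) / len(tweets_after)
print(mean_before, mean_after)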

M2-A.2 Dix, DD; Office of Science Coordination and Policy; [email protected] High Throughput and Computational Tools for Quantifying the Bioactivity, Hazard, Exposure, and Risk of Chemicals for Safety Assessments High Throughput Screening (HTS) and computational models are being used by the U.S. Environmental Protection Agency (EPA) as an alternative to animal-based regulatory testing. EPA is shifting to the use of computational tools consistent with EPA's strategic plan for toxicity testing in the 21st century. One example of the application of these new methodologies is in the Agency's Endocrine Disruptor Screening Program (EDSP). Endocrine screening for bioactivity is being accomplished with ToxCast HTS assays for key events in the estrogen, androgen and thyroid pathways. HTS data are integrated into computational models of endocrine pathways to quantify bioactivity. Endocrine Adverse Outcome Pathways (AOPs) link chemical bioactivity to adverse effects and help define dose-response for hazard assessment. HTS exposure models applicable to humans and wildlife, and incorporating multiple exposure media and pathways, are being developed for pesticidal and non-pesticidal chemicals. HTS and computational predictions of bioactivity, hazard and exposure can then be integrated for risk-based assessments and regulatory decision making. These new computational toxicology tools are being validated for various chemical safety applications through comparison with traditional testing and assessment methodologies.

P.129 Dixon, GN; Washington State University; [email protected] The Challenge of Communicating the Risk of Inaction: Linking Causal Attribution to Biased Information Processing In communicating the societal and personal benefits of vaccination, many persuasion techniques fail to produce a desired effect on vaccine compliance and beliefs. Statistical information about vaccine safety is largely ineffective at improving vaccine attitudes and intentions; refuting nonscientific information decreases intent to vaccinate; and using emotional testimony or pictures to highlight the consequences of non-vaccination can attenuate risk perception surrounding vaccine-preventable disease and increase the erroneous perception that vaccines cause autism. Furthermore, using emotional pictures to illustrate the risks associated with not vaccinating results in motivated reasoning: it is persuasive only for those with views favorable toward vaccination, but backfires for individuals harboring anti-vaccine views. Creating effective vaccine persuasion campaigns then requires further scrutiny of the psychological aspects of vaccine risk perception and the manner in which different audiences process vaccine-related messages. Therefore, this research (1) addresses why persuasive messages on vaccination backfire for vaccine skeptics and (2) examines ways to correct these biasing effects. Theoretically, I apply attribution theory to vaccine risk perception and motivated reasoning as a way of understanding why vaccine risk messages result in biased processing. Understanding the causal attributions of risks can then inform best practices for communicating about vaccinations to skeptical audiences.

W2-A.1 Dotson, GS; Centers for Disease Control and Prevention/National Institute for Occupational Safety and Health; [email protected] Complexities of conducting occupational risk assessments for low molecular weight allergens Low molecular weight (LMW) chemical allergens represent a significant health burden in the workplace, contributing to the onset of a diverse group of adverse health effects triggered by immune-mediated responses. These adverse effects range from allergic contact dermatitis to life-threatening cases of asthma. Few occupational exposure limits (OELs) have been developed for LMW allergens. The unique biological mechanisms that govern the immune-mediated responses inhibit the application of traditional quantitative risk assessment techniques. The purpose of this presentation is to highlight the primary challenges of conducting occupational risk assessment for LMW allergens, including 1) selecting the appropriate immune-mediated response (i.e., sensitization versus elicitation), 2) characterizing the dose (concentration)-response relationship of immune-mediated responses, and 3) understanding the role of uncertainty associated with individual susceptibility, temporal exposure patterns, and exposure route.

W4-H.5 Dotson, GS; Centers for Disease Control and Prevention/National Institute for Occupational Safety and Health; [email protected] Integrating Occupational Risk Factors and Considerations into Cumulative Risk Assessment Cumulative risk assessment has been defined by the US Environmental Protection Agency as the analysis, characterization, and possible quantification of the combined risks to human health or the environment from multiple agents or stressors. Understanding the cumulative risk to human health associated with co-exposures to multiple stressors is critical for refining risk management processes and public policy. Existing approaches focus primarily on characterizing cumulative risk in environmental, community and residential settings. Less attention has been paid to occupational settings, despite the workplace being a recognized source of exposures to unique stressors, such as chemicals, radiation, biological agents and psychological stressors, which greatly impact the health and lives of the working population. Additionally, available evidence indicates that individual and environmental risk factors, such as psychological stress, genetic susceptibility and ambient air pollution, may contribute to the onset of adverse effects in occupational settings. This presentation will highlight research efforts underway in the National Institute for Occupational Safety and Health (NIOSH) to integrate occupational risk factors and considerations into the practice of cumulative risk assessment. Additional information will be presented on the need to incorporate individual and environmental risk factors into occupational risk analyses.

M3-I.4 Dotson, S*; CDC/NIOSH/EID; [email protected] The NIOSH cumulative risk assessment project: Characterizing and communicating both occupational and non-occupational risks Human health risk assessments continue to evolve and now focus on the need for cumulative risk assessment (CRA), which assesses the combined risk from co-exposure to multiple chemical and nonchemical stressors for varying health effects. Future directions of CRA include greater emphasis on local-level community-based assessments; integrating environmental, occupational, community, and individual risk factors; and identifying and implementing common frameworks and risk metrics for incorporating multiple stressors.

W4-B.4 Douglas, HE; University of Waterloo; [email protected] Weight of Evidence and Collaborative Practices Weighing complex sets of evidence is one of the most challenging aspects of risk analysis. It has resisted simple algorithmic approaches, requiring expert judgment to inform both the structure of the process and the decisions about how to implement it, and that is when a clear process is available. Given the technically challenging nature of weight of evidence, it would seem too demanding to make it a process that also engages with a range of stakeholders. But such engagement, I will argue, can make the process of weighing evidence better. Because of the need to consider a range of explanatory options before deciding where the weight of evidence lies, bringing in a range of perspectives that can produce different explanations will strengthen the outcome epistemically. So too will the ability of diverse groups to decide which empirical studies will help test different explanatory avenues, and which studies might help the collaborators decide the best available explanation at the time. Finally, collaborative practices can help clarify when standards of evidential sufficiency (or proof) diverge. This talk will develop the theoretical strengths of such an approach.

W4-J.3 Driedger, SM*; Annable, G; Brouwers, M; Corso, Z; University of Manitoba (Authors 1, 2, 4), McMaster University (Author 3); [email protected] Different strokes for different folks: The influence of primary care providers on patient decision making about breast and prostate cancer screening In the shared decision making model, physicians are expected to communicate the benefits and risks of the available options to their patients. The Canadian Task Force on Preventive Health Care recommends that men should not have PSA tests to screen for prostate cancer, and women under the age of 50 years should not have mammograms to screen for breast cancer, primarily because the task force determined that the risk of harms exceeds the benefits. Nevertheless, Canada has high rates of opportunistic PSA screening, and women aged 40 to 49 years have access to screening mammograms in most parts of the country. This presentation examines the influence of physicians on patients’ decisions about cancer screening using data from focus groups with men (n = 47) about prostate cancer screening and with women (n = 46) about breast cancer screening. These data were supplemented by interviews with family physicians (n = 6) and with officials responsible for cancer screening policy at various organizations in the Canadian cancer control system (n = 12). Physicians were a major influence on the focus group participants’ screening decisions, but contrary to the shared decision making model, most said their physicians did not communicate the risk of screening harms to them. However, all of the family physicians reported that they do discuss harms as well as the benefits of screening with their patients. Clinical guidelines and decision aids support patients and physicians in making cancer screening decisions, but achieving the ideals of informed and shared decision making is challenging when evidence about the balance of benefits and harms is contested. Competing guidelines from specialist organizations that downplay the risk of harms and favor screening foster tension in clinical dialogues between physicians and their patients. Even when patients are adequately informed, many people are willing to accept high risks of harms in return for comparatively low probabilities of benefits.

T3-H.4 Driver, J*; Van Wesenbeeck, I; risksciences.net, LLC and Dow Agrosciences, LLC; [email protected] Advances in Bystander Exposure Modeling: 1,3-D Agricultural Uses This presentation provides a discussion of science-based advances in probabilistic bystander exposure and risk modeling. The case study focuses on agricultural uses of 1,3-D in California. The advancements are based on new data and improved modeling methods in three significant areas: 1. survey data on residency duration and mobility in specific communities with high demand for 1,3-D; 2. air modeling methods that are well calibrated to monitoring data and that provide detailed information on the spatial and temporal characterization of air concentrations of 1,3-D; and 3. a calendar-based, stochastic model that builds on these new data and employs updated age-cohort-specific physiological factors to provide more accurate estimates of current and future exposures to bystander populations.

T2-A.2 Druwe, IL*; Painter, K; Yost, EE; Burgoon, LD; Oak Ridge Institute for Science and Education and National Center for Environmental Assessment, United States Environmental Protection Agency, Research Triangle Park, NC; [email protected] Bayesian Evidence Integration of Quantitative High Throughput Screening Data High Throughput Screening (HTS) assays have generated toxicological data on thousands of chemicals; however, it is unclear how these HTS data can be integrated to inform hazard identification and estimation of risk-specific concentrations/doses. In this pilot study, we examined how well Bayesian data integration methods may perform in integrating evidence from multiple HTS assay replicates. Specifically, we analyzed in vitro HTS data for a sample chemical, Bisphenol A (BPA), a potential xenoestrogen. HTS data for BPA (chemical ID: 6623) were acquired from the National Library of Medicine's PubChem Database (https://pubchem.ncbi.nlm.nih.gov/) for all active, confirmatory assays. Of the more than 1300 assays in PubChem, only 28 were confirmatory/quantitative assays. For this pilot study we selected confirmatory HTS assays for estrogen receptor activation that tested a wide range of BPA concentrations (1.23 nM to 94.36 µM). We used bootstrap meta-regression to identify the Point of Departure (POD) and the response/activity level at the POD. We then calculated the proportion of replicates with responses greater than the response/activity level at the POD, used a flat prior distribution (to represent lack of knowledge) and a Beta distribution to model the likelihood, and thus obtained the posterior probability. We will use the posterior distribution to estimate the risk-specific concentration/dose at various levels of risk. Bayesian analysis allows us to integrate evidence from multiple assays measuring estrogen receptor activation in a transparent manner. The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA. Mention of trade names or commercial products does not constitute endorsement or recommendations for use.
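
One standard reading of the conjugate update described above can be made concrete in a few lines. In the Python sketch below the replicate counts are hypothetical: with a flat Beta(1, 1) prior, observing k "active" replicates out of n yields a Beta(1 + k, 1 + n - k) posterior.

from scipy.stats import beta

# Hypothetical counts: k of n assay replicates respond above the activity level at the POD
n, k = 28, 9

# Flat Beta(1, 1) prior + binomial likelihood -> Beta(1 + k, 1 + n - k) posterior
posterior = beta(1 + k, 1 + n - k)
print(posterior.mean())          # posterior mean probability of an active replicate
print(posterior.interval(0.95))  # 95% credible interval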

W4-I.3 Druwe, IL*; Bell, SM; Burgoon, LD; Oak Ridge Institute for Science and Education, ILS/Contractor Supporting the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), Research Triangle Park, NC and United States Army Engineer Research and Development Center, Vicksburg, MS; [email protected] Building Disease-Based AOPs for Risk Assessment: from Molecular Pathways to Human Hazard Identification Adverse Outcome Pathways (AOPs) connect molecular initiating events (MIEs) triggered by a chemical (or chemicals) of interest with one or more Key Events (KEs) that result in an Adverse Outcome (AO) with implications for human health and/or the ecological environment. An AOP may be generated using diverse methodologies; one such method is the disease-based AOP approach. This AOP development method starts with an AO of interest and then examines increasingly lower levels of biological organization to identify an MIE (or MIEs) connected with that outcome. Using steatohepatitis, a type of liver disease characterized by inflammation of the liver and fat accumulation, as a case study, we generated a computationally-predicted AOP (cpAOP) using computational analysis of the Reactome pathways, toxicogenomics, clinical chemistry and pathology. cpAOPs are hypothetical models that can be used to integrate experimental data with other biological data resources to generate data-driven networks linking an initiating event to an adverse outcome. We informed the cpAOP we developed with peer-reviewed steatohepatitis literature and integrated data from various pathway databases to generate a putative Steatohepatitis Adverse Outcome Network (pAOP). This network was then used to identify sufficient key events, i.e., events that can inform us about the likelihood that a chemical exposure will result in steatohepatitis. Using this framework we then selected a chemical of interest, inorganic arsenic, to mechanistically link chemical exposure to steatohepatitis. By employing cpAOPs and incorporating literature and pathway database information to upgrade the cpAOP into a pAOP network, we can identify the minimal set of key events required to infer an adverse outcome and identify potential human health hazards. The views expressed are those of the authors and do not necessarily represent the views or policies of the US EPA.

T2-F.1 Duret, S*; Chen, Y; Oryang, D; Dennis, S; Ingram, D; Mahovic, M; Pouillot, R; Van Doren, J; Food and Drug Administration; [email protected] Modeling survival of pathogens in manure amended soil: A Meta-Analysis. Risk models combine unique and disparate big data sources to provide new insights into food safety issues. FDA, in consultation with USDA, is developing a risk model to systematically study, analyze, and evaluate approaches to setting a standard for safe use of untreated biological soil amendments of animal origin (e.g., raw manure) as part of its implementation of FSMA Section 105. As part of the overall risk assessment development, we carried out a Meta-Analysis of the scientific literature to identify and evaluate available data on the contamination (prevalence and levels) and survival of pathogens (e.g., E. coli O157:H7 and Salmonella) in raw manure amended soil. We conducted a Meta-Regression and examined different approaches to modeling non-linear curves (e.g., using the Weibull model) for pathogen survival in soil amended with raw manure, and to developing a secondary survival model. We have identified several gaps in the data needed to populate a secondary survival model in order to characterize the relative impact of the different agro-ecological conditions. Ultimately we anticipate that the secondary survival model will be integrated into an overall risk model which will aim to evaluate how the risk of illness is impacted by a variety of factors, such as changes in the type of commodity, associated agro-ecological conditions and farming practices, and the type of manure.
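
To make the Weibull survival model concrete: under that model the log10 surviving fraction declines as -(t/delta)^p, where delta is a scale parameter (time to the first log10 reduction) and p a shape parameter. The sketch below (Python; the survival data are invented for illustration, not the Meta-Analysis dataset) fits those two parameters to one decline curve.

import numpy as np
from scipy.optimize import curve_fit

def weibull_log10(t, delta, p):
    # log10(N(t)/N0) = -(t/delta)^p under the Weibull decline model
    return -((t / delta) ** p)

# Hypothetical data: days since manure application vs. observed log10 reduction
t_days = np.array([1.0, 7.0, 14.0, 28.0, 56.0, 84.0])
log10_ratio = np.array([-0.1, -0.8, -1.5, -2.4, -3.6, -4.1])

(delta_hat, p_hat), _ = curve_fit(weibull_log10, t_days, log10_ratio,
                                  p0=(10.0, 1.0), bounds=(0, np.inf))
print(delta_hat, p_hat)  # delta: time to first log10 reduction; p: curve shape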

T4-A.2 Eastmond, DA; University of California, Riverside; [email protected] Case Studies of Genotoxic Agents Acting through Non-Mutagenic or Non-Linear Modes of Action The determination of whether or not a carcinogen acts through a genotoxic or mutagenic mode of action (MOA) plays an important role in assessing risks associated with low-dose exposure. When an agent acts through a mutagenic MOA, an agency such as the USEPA will use a linear or similar model to extrapolate cancer risks into the very low dose region. However, the basis for determining that a carcinogen acts through a mutagenic MOA has not yet been established. If a chemical exhibits significant genotoxic effects, it is generally assumed to act through a mutagenic or genotoxic MOA and the low-dose risks are estimated accordingly. There are, however, a number of cases where genotoxic agents have been determined to act through non-mutagenic or non-linear MOAs. In this presentation, several case studies will be presented in which authoritative bodies have evaluated the evidence and concluded that genotoxic chemicals act through non-linear, threshold types of mechanisms. Cases to be discussed include determinations on o-phenylphenol, a fungicide and rat bladder carcinogen, and carbon tetrachloride, an industrial solvent and rodent liver carcinogen. Other examples will also be briefly discussed. In addition, key factors and mechanisms considered in making the MOA and dose-response decisions will be presented.

W4-I.1 Edwards, SW*; Oki, N; Bell, S; Nelms, M; Leonard, J; Tan, YM; 1 U.S. Environmental Protection Agency, 2 Oak Ridge Institute for Science and Education; [email protected] Merging Adverse Outcome Pathway (AOP) and Mode of Action (MOA) Frameworks: Assembling Knowledge for Use in Risk Assessment The Adverse Outcome Pathway has emerged as an internationally harmonized mechanism for organizing biological information in a chemical agnostic manner. This construct is valuable for interpreting the results from high-throughput toxicity (HTT) assessment by providing a mechanistic connection to adverse outcomes of regulatory interest. To facilitate the development and use of AOPs, an international knowledgebase (AOPKB) was developed with the public release of the first module (http://aopwiki.org) in September 2014. However, the AOPs in the AOPKB include intended targets for only a small number of HTT assays currently in use. We have developed bioinformatic approaches to generate computationally-predicted AOPs (cpAOPs) containing ToxCast assay targets. The cpAOPs have been evaluated by comparing against previously-defined modes of action, such as liver toxicity via CCl4 exposure, and provide annotation information for >200 EPA ToxCast assays. Through a crowd-sourcing model enabled by the AOPKB, these cpAOPs can be rapidly assessed by experts in mechanistic toxicology to assemble AOPs suitable for use in risk assessment. However, information regarding exposure and pharmacokinetics is needed for a chemical-specific risk assessment. We are developing methods to provide absorption, distribution, metabolism, excretion (ADME) information with varying degrees of quantitation and/or certainty to complement the AOP development process. We developed a three-tiered approach for connecting biology-based AOPs to biochemical-based ADME properties, and then to chemical/human activity-based exposure pathways, where the tier is selected based on available data. By combining AOPs and ADME, we can recapitulate the MOA for a given chemical from reusable components allowing more extensive use of AOPs in risk assessment. [This is an abstract or a proposed presentation and does not necessarily reflect EPA policy or constitute endorsement or recommendation for use of any commercial product.]

W1-B.4 Eggers, SL*; Frey, PJ; Vaidya, P; Sile, H; U.S. Food and Drug Administration; [email protected] Benefit-Risk Assessment in Human Drug Review FDA's Benefit-Risk Framework for human drug review is a structured, qualitative approach to support the identification and communication of the key considerations that enter into the benefit-risk assessments informing FDA's drug regulatory decisions. FDA's goals were (a) to be clearer and more consistent in communicating the reasoning behind regulatory decisions, and (b) to ensure that FDA reviewers' detailed assessments can be readily placed within the patient and public health context. Development of the Framework was guided by a retrospective analysis of past regulatory decisions to characterize the general approach and structure of the benefit-risk assessments that inform FDA's expert judgments and decisions but may not have always been explicitly stated. In 2015, the Center for Drug Evaluation and Research began a staged implementation of the Framework into drug regulatory review, integrating the Framework into the templates that guide completion of clinical reviews and regulatory decision memos. This presentation will summarize the design and development of FDA's Benefit-Risk Framework for human drug products, its implementation into the drug review process, and opportunities for further development.

W1-D.2 Egnoto, MJ*; Iles, IA; Roberts, HA; Smith, DS; Liu, BF; University of Maryland; [email protected] Adoption Preferences of Law Enforcement for Programmatic Innovations Testing technologies for policing is costly and laborious. Previous research found that police can be reticent about technology adoption. This manuscript examines law enforcement adoption of programmatic innovations focused on particular crime types (radiological and nuclear threats). First, an expert police panel explored readiness to adopt an advanced technology (personal radiation detectors: PRDs). Results indicated that on-duty device adoption was likely, but off-duty adoption was not. In addition, concerns were raised about the ease of carrying PRDs, personal health and security issues, and job performance. A survey was then developed from the panel findings (n=101 DC police). Results indicate that police respond negatively to financial incentives and focus instead on how innovations can contribute to the safety of themselves and their immediate families. Contrary to prior observational research, findings indicate that false positives are not a significant barrier to adoption, but device training is important. Overall, police willingness to adopt was driven by attitudes about PRDs and device perceptions as assessed through diffusion of innovation constructs. Results expand the diffusion literature on police populations, which is still nascent. Moreover, results identify factors that impede adoption as well as consumer-preferred incentives to communicate.

W4-F.5 EISINGER, F; 1- INSERM, UMR912 (SESSTIM), 13006, Marseille, France 2- Aix Marseille Université, UMR_S912, IRD, 13006, Marseille, France 3- Paoli-Calmettes Institute Marseille, France; [email protected] Threats to come: A blast from the past In the Middle Ages the question "Why?" was irrelevant because God's will was the unique answer. No test, no challenge, because Aristotle was always right. Both the enlightenment and the disenchantment that eventually came along later removed certainty and gave us both freedom and doubt. Is this paradigm shift irreversible or just temporary? 1 The backlash of science Although the universe is complex and all borderlines fuzzy, humans are looking for stable knowledge, a predictable future. At first, while Science gave better outputs and answers, its model gained currency. However, Science cannot hold the truth and is merely a way to get closer to it. Thus, facing unrealistic expectations, Science in itself may disappoint. 2 The risky society and the need for safety and protection The "risky society" is now giving way to the precautionary society. However, a more precautionary society is neither safer nor better; precaution is nothing but a conservative principle. In the Middle Ages peasants gave up freedom in exchange for protection. Where mankind looks eagerly for more safety, the price shall be less freedom, as observed in the current debate about terrorism and how to fight it. Maybe the worst thing to fear is not fear itself but an endless search for safety. 3 The disappearance of the State Claiming "In this present crisis, government is not the solution to our problem but the problem" has been the credo of neo-liberal political groups leaning on the invisible-hand metaphor. The trend has been toward less regulation, leaving open the question: to whom does the invisible hand that supposedly drives us to a better world belong? 4 Complex societies with mistrust A complex society cannot work without trust. Nevertheless, EU citizens tend to trust environmental interest groups more than scientists on issues like climate change. Thus, trust is now again based more on the source of information than on its process. This is a step back into the past, a revenge of "whom" over "how".

T4-B.5 Ellig, JE; George Mason University; [email protected] Uncertainty Analysis for Major Proposed Regulations: Findings from the Mercatus Report Card The Mercatus Center at George Mason University's Regulatory Report Card assessed the quality of the regulatory impact analyses (RIAs) accompanying proposed, economically significant regulations for the years 2008-2013. The assessment includes an evaluation of the quality and extent of uncertainty analysis – specifically, uncertainties about benefits, costs, and the extent of the systemic problem the regulation seeks to solve. This presentation will show how well agencies have analyzed these uncertainties in economically significant regulations, compare the quality of uncertainty analysis with the quality of other aspects of RIAs, and explore reasons for variation in the extent of uncertainty analysis.

T3-D.3 Emanuel, RN; Johns Hopkins University Applied Physics Laboratory, University of Maryland-College Park; [email protected] Resilience Metrics for Decision Making Over the past decade, the resilience engineering field has produced a rich collection of theory development and concept definition. Resilience has been applied broadly, in healthcare, aviation, critical infrastructure, and manufacturing, among many other fields and industries. Most efforts have focused on theory and on the identification of resilience; the definition and development of metrics applicable across systems and industries have been relatively lacking. This presentation reviews the current state of resilience metric definition and development. After discussing the general requirements of a resilience metric, a general model is proposed. The general model of resilience and its simplifications are compared to earlier metrics to assess each metric's consistency with the stated requirements. Examples and case studies are assessed using the resilience metric and a framework for decision making using resilience as a factor. The general model of resilience demonstrates its decision-making value when applied to example situations using its derived metrics.

P.130 Eosco, GM*; Rickard, LN; Scherer, CW; Haase, D; ERG; University of Maine; Cornell University; SUNY-ESF; [email protected] Protecting lives or promoting risk? Hurricane Sandy survivors' perceptions of severe weather communication For those living directly on the coast, storm surge is the most dangerous and potentially deadly risk. During Hurricane Sandy in 2012, 40 deaths were directly attributed to flooding that occurred due to a dramatic slow rise of ocean surge. Beyond Sandy, storm surge has easily been one of the most challenging risks to communicate over the last decade. How individuals make decisions about whether or not to evacuate is explored in this study of 75 individuals living within a few feet or blocks of the coast in five Connecticut communities. These individuals participated in 90- to 120-minute focus groups (n = 7) examining their use of information sources during Sandy and their evacuation decision making. Findings suggest that the more "storm proof" (i.e., in compliance with FEMA policy) residents perceive their house to be, the less likely they are to evacuate during severe weather events. If, by following FEMA standards, individuals perceive their homes as safer and are less likely to evacuate, is this policy to protect property running counter to the overarching goal of protecting human life? Do FEMA standards spark an unintended "risk compensation" effect, wherein residents' (possibly unrealistic) perceived safety overrides messages they receive about storm severity and necessary evacuation? In turn, how do these perceptions relate to behavioral decisions and the challenges facing first responders in these communities? We explore these and other questions emerging from the focus groups, and suggest theoretical and practical implications for risk communication and emergency management.

W3-H.3 Esfandiary, S; FEMA; [email protected] A Performance-based Insurance Rating for Leveed Areas in Communities Participating in the National Flood Insurance Program It is argued in this presentation that the rate formula used in the National Flood Insurance Program (NFIP) is not valid for areas behind non-accredited levees, i.e., levees that are not recognized as providing protection on the NFIP's flood insurance rate maps. It is then demonstrated that the insurance rate formula could be adjusted by incorporating the conditional probability of levee failure, if it were available. The conditional probability of failure can be obtained from fragility curves. It might be costly for communities to develop fragility curves, but benefit-cost analysis might indicate that the cost is offset by other factors, including the reduced cost of insurance and the fact that this process identifies the risk drivers, allowing communities to spend their resources on mitigating them to avoid future losses. Communities participating in the NFIP should have the option to submit such curves to adjust their NFIP rates. This option will not have any impact on FEMA's mapping program. The curves will be incorporated in the rate calculations directly and there would be no need for a map revision. FEMA cannot and should not allocate funds to develop fragility curves; however, FEMA should accept curves developed by other federal agencies for other purposes. Fragility curves developed by communities must be independently reviewed by experts and certified by a professional engineer.
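
One way to picture the proposed adjustment: weight the annual probability of each flood stage by the fragility curve's conditional failure probability at that stage, then sum. The sketch below (Python) uses invented stage bins, probabilities, and fragility values purely for illustration; it is not FEMA's rate formula.

import numpy as np

# Invented discretization: flood stages, annual probability that the peak stage
# falls in each bin, and the fragility curve P(levee failure | stage)
stage_ft = np.array([2.0, 4.0, 6.0, 8.0])
annual_prob = np.array([0.05, 0.02, 0.008, 0.002])
p_fail_given_stage = np.array([0.01, 0.10, 0.45, 0.90])

# Annual probability of inundation behind the levee
p_inundation = float(np.sum(annual_prob * p_fail_given_stage))
print(p_inundation)
# A performance-based rate could then be keyed to p_inundation rather than
# treating the leveed area as entirely unprotected.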

T3-D.1 Esfandiary, S*; Francis, RA; Demby, JE; DHS/FEMA; [email protected] Dam Risk Management and Community Resilience Risk-informed decision making for dam owners and operators revolves around a wide variety of dam safety activities, such as the frequency of inspection or increased instrumentation, additional technical studies, selection of a remedial action to address an identified deficiency, etc. These activities focus mostly on the condition of dams, and resources are usually diverted toward high-risk dams. Currently, the resilience of communities impacted by dams is not part of dam risk management discussions and practices. In this presentation it is proposed that resilience systems analysis should be considered an integral part of dam risk management practices, in which the absorptive capacity, adaptive capacity and transformative capacity of impacted communities are assessed to enhance the risk-informed decision-making process. A resilience systems analysis will provide a shared view of the risk landscape amongst the stakeholders, and address the complexity and interrelations of different risks. This approach will focus on impacted communities and not just the risk, aiming to improve the systems that people use to support their all-round well-being.

M4-G.2 Esposito, PA; Daigle, KJ*; American Society of Safety Engineers; [email protected] Finding the Hidden Hazards In the occupational environment, corporate leaders base decision making on risk assessment data, so it is imperative that these data be systematic, intellectually consistent and easy to understand. There are many tools and methods available to conduct a risk assessment, but all begin with the identification of a hazard of concern. As such, the identification of relevant hazards is critical to facilitating the decision-making process. While many hazards are readily recognizable, others require the use of systematic methods to uncover. This presentation will discuss the challenges associated with identification of hazards and review alternative methodologies to identify less obvious hazards. Once these hazards are recognized, they can be assessed and provide management with a more complete picture of the risk that will facilitate better decisions.

M4-G.3 Esposito, PA; Daigle, KJ; Woodhull, D*; American Society of Safety Engineers; [email protected] Using Perception Surveys to Evaluate Risk Decisions Perception survey data are regarded as leading indicators that can be used to assess various aspects of the safety management system, providing organizations with the opportunity to fix the system before injuries and illnesses occur. Perception surveys can be designed to enable organizations to predict and control the risks in their operations and processes. Perception surveys are predictive because they focus on the opinions and experiences of hourly employees, regarding them as the closest approximation of reality. These opinions and experiences are a snapshot of physical and cultural conditions as they currently exist, and constitute information that correlates strongly with actual safety performance. A perception survey makes it possible to ask shop floor employees anonymously what they see and how they feel in a comprehensive and complete way. The effectiveness of decisions regarding control of operational safety risks is understood and felt by workers, who can provide insight that is not available through other approaches. This session will discuss how perception surveys can be used to evaluate risk decisions and provide examples.

M4-G.1 Esposito, PA; ASSE Risk Assessment Institute; [email protected] Methodology for Systemizing Risk Reductions using the Hierarchy of Controls The application of controls and the subsequent residual risk calculations are often fraught with erroneous assumptions, methodology variations and a lack of confidence in the residual risk levels. In the occupational environment, corporate leaders base decision making on risk assessment data, so it is imperative that these data be systematic, intellectually consistent and easy to understand. The application of a risk reduction based on the control selected typically will impact only one of the risk factors, consequence or likelihood. This is based in part on ANSI B11.0-2010, Table 3, which describes the influence each level of the hierarchy has on the risk factors. The hierarchy includes: elimination or substitution; guards and safeguarding devices; awareness devices; training and procedures; and personal protective equipment. Each has a potential impact on harm or occurrence, to varying degrees. This presentation will describe a proven methodology, now in use, for semi-quantifying the effect of each control level on each risk factor, including synergistic and multiplicative effects (a simplified sketch follows). Once standardized, management can make better decisions regarding the level of residual risk achieved by proposed alternate controls.
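
The sketch below (Python) illustrates the general shape of such a scheme: each control class carries multipliers on the severity and likelihood scores, and residual risk is recomputed from the mitigated factors. The multipliers shown are invented for illustration; they are not the ANSI B11.0 values or the presenter's calibrated factors.

# Illustrative multipliers only -- NOT the ANSI B11.0 values or calibrated factors.
# Each control class scales the severity and/or likelihood score.
CONTROL_EFFECTS = {
    "elimination_substitution": {"severity": 0.2, "likelihood": 0.2},
    "engineering_safeguards":   {"severity": 1.0, "likelihood": 0.4},
    "awareness_devices":        {"severity": 1.0, "likelihood": 0.8},
    "training_procedures":      {"severity": 1.0, "likelihood": 0.7},
    "ppe":                      {"severity": 0.6, "likelihood": 1.0},
}

def residual_risk(severity, likelihood, controls):
    # Multiplicative stacking is one way to represent combined control effects
    for c in controls:
        severity *= CONTROL_EFFECTS[c]["severity"]
        likelihood *= CONTROL_EFFECTS[c]["likelihood"]
    return severity * likelihood

baseline = residual_risk(10, 8, [])
mitigated = residual_risk(10, 8, ["engineering_safeguards", "training_procedures"])
print(baseline, mitigated)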

M4-G.4 Esposito, PA; Kohlmeyer, JK*; ASSE Risk Assessment Institute; [email protected] Risk Assessment Output Metrics for Corporate Accountability As a safety manager for a large construction company, we are faced with challenges from our customers, workers, managers and sub-contractors to perform quality work, on time and on budget, with safety and quality in mind. In fact, very often our safety record is used as one of the key distinguishing factors for customers selecting qualified bidders. Our journey began with our accident rate being better than average, but so was most of our competition's. Regardless, we were still having accidents, and some of them were serious. This impacted not only our overhead (workers' compensation), and therefore our profit, but also the cost of our proposals, the relationships with our workers, etc. We made the decision to advance our thinking to a point where compliance was no longer the goal, but an expectation. Our goal was now to think risk; reduce risk. Before this was accepted by leadership, we had to not only perform risk assessments, but determine which metrics were going to have the greatest impact on the accountability of people, and on accountability (or management confidence) in the risk assessment process. Metrics were needed to reinforce the process and measure tangible results, not just outcome indicators (the number of incidents). To validate the process, we established two metrics: 1) every incident investigation needed to identify a flaw or weakness in our risk assessment process, and 2) the process would be independently audited monthly on a department-by-department basis. To validate the actions of people, we established additional metrics: 1) the number of new controls and 2) the conformance rate to existing critical-to-safety controls. As a result, management became convinced that the risk assessment results were key to improving our safety performance. After three years, we were able to improve an already good safety record by more than 50%.

M4-J.2 Evensen, D*; Stedman, R; O'Hara, S; Oberlin College; [email protected] Nuanced differences in perceptions of ‘fracking’ between the UK and US Substantial differences exist between the United Kingdom and United States in governance, politics, and physical development associated with shale gas. Major discrepancies across nations include: (1) private vs. national ownership of mineral rights, (2) processes for leasing mineral rights, (3) national vs. state/regional/local governance, (4) the level at which most political discourse occurs, and (5) the length and depth of experience with physical development in the landscape. These cross-national variations intimate the potential for meaningful contrast in public perceptions across the Atlantic. Public perceptions, in turn, have the potential to affect the suite of viable policies and approaches to decision making on this complex, controversial form of energy development. We used an existing, repeated cross-sectional online survey of a national UK sample to conduct a UK/US comparison of public perceptions. During the ninth iteration of the UK survey (September 2014), we implemented the same survey with a national US sample. In descriptive/frequency analysis, we compare across nations: (1) support for / opposition to shale gas development, (2) beliefs about impacts associated with development, and (3) extent of support for a range of energy sources to power the nation in future years. Support for shale gas and most other energy sources was higher in the US; the only energy source receiving substantially more support in the UK was nuclear power. Of the five impacts we asked about, the US sample was more likely to associate all three positive impacts with shale gas development and less likely to associate both negative impacts with development. We also present multivariate analysis (binary logistic regressions) examining cross-national differences in the extent to which a relationship exists between beliefs about impacts and support/opposition. We conclude with implications for policy and communication in each nation based on our results.
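
A minimal sketch of the kind of binary logistic regression mentioned above, using scikit-learn on an invented stacked UK/US extract (the variables, codings, and data here are hypothetical stand-ins, not the survey's):

import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: respondents. Columns: believes development creates jobs (1/0),
# believes it harms water (1/0), respondent is in the US (1/0)
X = np.array([[1, 0, 1], [0, 1, 1], [1, 0, 1], [0, 1, 1], [1, 1, 0],
              [0, 1, 0], [1, 0, 0], [0, 0, 0], [1, 1, 1], [0, 0, 0]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])  # supports development (1/0)

model = LogisticRegression().fit(X, y)
print(model.coef_, model.intercept_)            # log-odds effect of each predictor
print(model.predict_proba([[1, 0, 1]])[0, 1])   # P(support) for one belief profile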

W5-J.3 Evensen, D*; Clarke, C; Ashmoore, O; Cardiff University; [email protected] Declining coverage of risks from shale gas development Survey research of residents in the Marcellus Shale region of the northeast US has shown that local newspapers are the single most used source for information on the controversial topic of shale gas development via hydraulic fracturing. We built on this knowledge to conduct a content analysis of regional newspaper coverage in six newspapers that serve regional population centers in areas close to shale gas development – two each in the states of New York (NY), Pennsylvania (PA), and Ohio (OH). We have published previously on findings from the NY and PA content analysis. In this new research, we include the OH articles and new analysis of the full sample to examine change in coverage over time. Overall, we coded 1,958 articles across the six newspapers. The sample from the four NY and PA newspapers extends from 2007 (the first mention of shale gas there) through 2012; the OH sample extends from 2009 (the first mention of shale gas there) to 2014. We use generalized linear models with pairwise comparisons to explore differences across years. Our major finding is that attention to environmental impacts, measured by the percentage of coverage mentioning at least one impact and by the average number of impacts mentioned per article, declines precipitously and significantly over time in each state. A similar trend is manifest for economic impacts, save in OH where there is less decline in the attention afforded. Trends in coverage of social impacts are mixed and less clear. In summary, as time progresses, risks and benefits of shale gas development are cited less frequently in key information sources. We discuss implications of our findings for the agenda-setting role of mass media and for risk/benefit communication on this issue broadly.

T2-F.2 Evers, EG*; Blaak, H; Hamidjaja, RA; De Jonge, R; Schets, FM; National Institute for Public Health and the Environment; [email protected] A QMRA for the transmission of ESBL-producing Escherichia coli and Campylobacter from poultry farms to humans through flies The public health significance of transmission of ESBL-producing Escherichia coli and Campylobacter from poultry farms to humans through flies was investigated using a worst-case risk model. Human exposure was modelled by the fraction of contaminated flies, the number of specific bacteria per fly, the number of flies leaving the poultry farm, and the number of positive poultry houses in The Netherlands. Simplified risk calculations for transmission through consumption of chicken fillet were used for comparison, in terms of the number of human exposures, the total human exposure and, for Campylobacter only, the number of human cases of illness. Comparing estimates of the worst-case risk of transmission through flies with estimates of the real risk of chicken fillet consumption, the number of human exposures to ESBL-producing E. coli was higher for chicken fillet as compared with flies, but the total level of exposure was higher for flies. For Campylobacter, risk values were nearly consistently higher for transmission through flies than for chicken fillet consumption. This indicates that the public health risk of transmission of both ESBL-producing E. coli and Campylobacter to humans through flies might be of importance. It justifies further modelling of transmission through flies, for which additional data (fly emigration, human exposure) are required. Similar analyses of other environmental transmission routes from poultry farms are suggested to precede further investigations into flies.
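
The exposure model described above is essentially multiplicative, which makes a Monte Carlo sketch straightforward. All distributions and parameter values below are invented for illustration; the authors' actual inputs are not given in the abstract.

import numpy as np

rng = np.random.default_rng(7)
n_sim = 100_000

positive_houses = 500                          # positive poultry houses (assumed)
flies_leaving = rng.poisson(2000, n_sim)       # flies leaving per house per day
frac_contaminated = rng.beta(2, 18, n_sim)     # fraction of flies contaminated
cfu_per_fly = rng.lognormal(3.0, 1.0, n_sim)   # bacteria per contaminated fly

# Worst case: every contaminated fly reaches a human
total_cfu = positive_houses * flies_leaving * frac_contaminated * cfu_per_fly
print(np.percentile(total_cfu, [50, 95]))      # median and 95th percentile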

W5-D.1 Ezzeldin, H*; Forshee, R; Simonetti, A; Office of Biostatistics and Epidemiology, CBER, FDA; [email protected] Using Hybrid Optimization Heuristic to Allocate Blood Transfers among U.S. Regions in Simulated Earthquakes Natural or man-made disasters can negatively impact the US blood supply system due to increased demand for blood resulting from injuries. Modeling such events is complicated by the difficulty of predicting either their location or their onset. Based on an inter-regional model of the US blood supply, we explored the resilience of the US blood supply during regional simulated earthquakes. We divided the US into four regions to reflect regional blood collection subdivisions and used estimated losses to predict blood demand. The daily blood transfers among regions were allocated using a hybrid optimization heuristic in which a Neural Network is trained using Particle Swarm Optimization (NN-PSO). The NN-PSO models a complex non-linear function of regional factors to optimize the global performance of the network. We based our simulations on multiple earthquake scenarios selected from the U.S. Geological Survey (USGS) National Seismic Hazard Model. We predicted the change in blood demand and collection from baseline based on the estimated casualties and building damage for the targeted scenarios using the HAZUS-MH® software. We compared blood transfers for each independent simulated earthquake to those occurring under standard operational conditions. Simulation results showing variation in US blood supply levels due to temporal and spatial uncertainty are presented. This study emphasizes the importance of developing and utilizing such modeling tools to help decision makers and stakeholders understand the behavior of the US blood supply system for emergency planning and preparedness.
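
For readers unfamiliar with NN-PSO, the sketch below (Python/NumPy) shows the core idea: a particle swarm searches the weight space of a small neural network instead of gradient descent. The network size, features, and targets are invented placeholders; the FDA model's actual architecture and inputs are not described in the abstract.

import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(loss, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Minimal particle swarm optimizer over a dim-dimensional weight vector
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = np.array([loss(p) for p in pos])
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Tiny stand-in network: 4 regional features -> 5 tanh units -> 4 transfer outputs
def nn(wvec, X):
    W1, b1 = wvec[:20].reshape(4, 5), wvec[20:25]
    W2, b2 = wvec[25:45].reshape(5, 4), wvec[45:49]
    return np.tanh(X @ W1 + b1) @ W2 + b2

X_train = rng.random((50, 4))   # hypothetical regional shortfall features
Y_train = rng.random((50, 4))   # hypothetical target transfer allocations
mse = lambda wv: float(np.mean((nn(wv, X_train) - Y_train) ** 2))
w_opt = pso_minimize(mse, dim=49)
print(mse(w_opt))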

M3-D.4 FAN, S*; XU, J; Tsinghua University and Central University of Finance and Economics, Peking University; [email protected] Linking Risk Perception to Behaviors: Public Responses to Air Pollution in China North China is plagued by frequent hazes, which have aroused widespread public concern about the risk posed by air pollution. The public is exposed to air pollution to which it has also partly contributed. Hence, understanding how people perceive and respond to air pollution is a prerequisite to designing effective risk reduction policies. In this study, we propose a conceptual model to investigate risk perception and behavioral responses to air pollution and the underlying affecting factors. The model is then tested empirically with structural equation modeling techniques, based on a survey (N=1051) administered online to Beijing residents in China in 2014. Two kinds of behaviors, adaptation behavior and mitigation behavior, are examined separately in this study. The preliminary results indicate that perceived susceptibility, attribution of air pollution, and environmental awareness played significant roles in shaping people's adaptation behaviors. Mitigation efficacy, cost and confidence about others' mitigating behaviors seem to dampen people's willingness to reduce pollutant emissions. The results shed light on the different mechanisms behind people's adaptation and mitigation behaviors, which can help in designing different and targeted policies to help people protect themselves from air pollution and stimulate them to reduce emissions in their daily lives.

T4-F.4 Fanaselle, W*; Oryang, D; Van Doren , J; Center for Food Safety and Applied Nutrition, Food and Drug Administration; [email protected] Multicriteria-based Ranking Model for Risk Management of Animal Drug Residues in Milk and Milk Products The U.S. Food and Drug Administration (FDA or “we”) developed a multicriteria-based ranking model for risk management of animal drug residues in milk and milk products. This risk assessment can serve as a decision-support tool to assist with re-evaluating which animal drug residues should be considered for inclusion in milk testing programs. A key question was whether residues of animal drugs other than beta-lactam antibiotics – currently the focus of milk-sampling programs – warrant monitoring. The multicriteria-based ranking model we developed ranks selected animal drugs according to specific criteria used in the model. FDA selected 54 animal drugs and their various formulations for evaluation. The multicriteria-based ranking model is based on four overarching criteria that collectively contribute to a drug’s score and rank within the group. The animal drugs were ranked, from a food safety perspective, on the basis of the overall score. Drugs in a variety of drug classes scored high, with drugs in eight different drug classes ranked among the top 20 highest-scoring drugs. These eight classes include beta-lactam antibiotics, antiparasitics, macrolides, aminoglycosides, nonsteroidal anti-inflammatory drugs (NSAIDs), sulfonamides, tetracyclines, and amphenicols. This presentation will provide a brief overview of the model, the criteria selected for the model, and the ranking results.
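
To illustrate the general mechanics of a multicriteria-based ranking (the actual criteria, scores, and weights in the FDA model are not reproduced here; those below are invented), a weighted-sum ranking can be computed as follows:

import numpy as np
import pandas as pd

# Invented criterion scores (0-10) and weights -- hypothetical stand-ins
scores = pd.DataFrame(
    {"residue_hazard":      [8, 5, 3],
     "likelihood_of_use":   [6, 9, 4],
     "residue_persistence": [7, 4, 6],
     "detectability_gap":   [5, 8, 2]},
    index=["drug_A", "drug_B", "drug_C"])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # one weight per criterion, summing to 1

scores["overall"] = scores.to_numpy() @ weights  # weighted sum per drug
ranked = scores.sort_values("overall", ascending=False)
print(ranked)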

P.89 Farber, G; US EPA; [email protected] The Concept of Unacceptable Risk in EPA Regulatory Policies Regulatory programs under a variety of environmental statutes use threshold levels of hazard and exposure to set threshold levels of risk. Employing these levels for regulatory or cleanup decisions carries the implication that lower levels are “acceptable”, or at least not appropriate for action. This session will examine the trigger thresholds in various EPA regulatory programs, comparing the bases for establishing the criteria. How does the concept of acceptable risk fit into these policies, and what are the implications?

P.22 Pirasteh, F*; Sanei, M; Pukyong National University; [email protected] Improving risk prediction models using PGA, LASSO and SVM in prostate cancer prediction Absolute cancer risk is the probability that an individual with given risk factors and a given age will develop cancer over a defined period of time. Examples of these risk factors include race, age, sex, genetics, body mass index, family history of cancer and many other factors. Considering prostate cancer, some common gene variations have recently been linked to a higher risk of prostate cancer. This research aims to show that testing for the gene variants will be useful in predicting prostate cancer risk. Developing prediction models that estimate the probability of cancer presence based on individual gene data over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk. These types of models will also be useful for designing future chemoprevention and screening intervention trials in individuals at high risk of specific cancers in the general population. To address the issue, the GSE8218 dataset from the Vaccine Research Institute of San Diego, a study of the prostate cancer gene expression profile, is considered. As a case of big data (about 22,000 genes as variables and 148 samples), variable selection methods such as PGA (Parallel Genetic Algorithm) and LASSO (Least Absolute Shrinkage and Selection Operator) were applied to reduce the prediction dimension and thus the prediction error (risk). SVM (Support Vector Machine) was applied as the prediction model, and the results, based on RMSEP (Root Mean Squared Error of Prediction), MAE (Mean Absolute Error) and MSE (Mean Squared Error), show the efficiency of SVM in predicting samples with specific genes predictive of prostate cancer.
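
A minimal scikit-learn sketch of the LASSO-then-SVM pipeline described above (PGA is omitted, and the data are random stand-ins with two informative genes; GSE8218 itself is not loaded):

import numpy as np
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Random stand-in for a 148-sample x 22,000-gene expression matrix; the outcome
# depends on two "informative genes" so the selection step has something to find
rng = np.random.default_rng(1)
X = rng.normal(size=(148, 22000))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=148)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

model = make_pipeline(
    StandardScaler(),
    SelectFromModel(Lasso(alpha=0.1)),  # LASSO-based variable (gene) selection
    SVR(kernel="rbf"))                  # SVM regression on the selected genes
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print(mean_squared_error(y_te, pred) ** 0.5)  # RMSEP
print(mean_absolute_error(y_te, pred))        # MAE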

P.180 Fenner-Crisp, PA*; Dellarco, VL; Both authors: Independent Consultant and U.S. Environmental Protection Agency (Retired); [email protected] Key Elements for Judging the Quality of a Risk Assessment A number of U.S. federal, state and local government agencies produce risk assessments on a continuing basis. In the years since publication of the 1983 National Research Council report Risk Assessment in the Federal Government: Managing the Process (the “Red Book”), advances in risk assessment have occurred, but the need for further improvement continues to be recognized. Many reports have been published that contain recommendations for improving the quality, transparency and usefulness for decision-making of risk assessments prepared by these agencies. A substantial measure of consensus has emerged as to what characteristics high-quality assessments should possess. The aim of this effort was to distill a set of key attributes from the accumulated recommendations into a Guide for use by decision-makers, risk assessors, peer reviewers and other interested stakeholders to judge whether or not an assessment “measures up” to current best scientific practices and results in a scientifically credible, transparent and useful product. By “best practices,” we mean that an assessment reflects a critical, open-minded approach in selecting reliable data and models fit for their intended use, and in analyzing and integrating that information. The use of the Guide is intended to promote transparency and consistency with regard to the conduct and quality of assessments. Most of the features cited in the Guide are applicable to any type of assessment, whether it encompasses just one or all four phases (hazard identification, dose-response assessment, exposure assessment and risk characterization) of the risk assessment paradigm, whether qualitative or quantitative, screening-level or highly sophisticated and complex. Just as agencies at all levels of government are responsible for determining the effectiveness of their programs, so should they determine the effectiveness of the assessments used in support of their regulatory decisions.

P.122 Fensterheim, R*; Choi, H; Strother, D; Jaques, A; RegNet Environmental Service; Toxsolve; [email protected] What Happened To The Acute Exposure Guideline Level (AEGL) Program? Acute Exposure Guideline Levels describe risk to humans from short-term chemical exposures arising from spills or other catastrophic events. The robustness of the technical reviews produced high-quality analyses for important chemicals in commerce, and AEGLs and their supporting materials are used globally, well beyond accidental release scenarios. The foundation of the program was the National Advisory AEGL Committee, formed in 1996. The Committee, chartered under the Federal Advisory Committee Act (FACA), had broad representation that contributed to its success, including EPA, DOD, DOE and DOT as well as other federal and state governments, the chemical industry, academia, and the private sector. Guidance and peer review were provided by the National Academy of Sciences (NAS), which organized a committee to review draft reports; concerns identified by the NAS were submitted to the federal advisory committee (FAC) for resolution, and once concurrence was achieved, the final values were published. All FAC meetings, drafts and proposed values were announced in the Federal Register, and the public and other stakeholders were invited to comment and present. The last meeting of the FAC was in April 2010; the Committee was never reconvened, and its charter expired in October 2011. Because the NAS had not completed its review of all proposed values, a new process was established that bypassed the FAC and did not include the same public notice and comment period. Under this new process, the NAS's concerns were addressed by a new contractor, and agency stakeholders were given a two-week review. While many of the AEGL values finalized under the new process were similar to those proposed, several are significantly and strikingly different. We analyze the differences between the AEGL values proposed by the FAC and those finalized by the NAS; the lack of full public stakeholder involvement is believed to have contributed to some of the differences. Public policy questions are explored concerning the status of the AEGL values that bypassed the full FACA process.

W5-H.1 Ferson, S; Applied Biomathematics; [email protected] Computing with confidence Bayesian posterior distributions can be propagated in posterior calculations, so they are useful in risk analyses and engineering. However, the interpretation of these distributions and the results they yield has no necessary connection to the empirical world when they are specified according to subjectivist principles. In contrast, traditional Neyman confidence intervals are useful in risk analysis and engineering because they offer a guarantee of statistical performance through repeated use. However, it is difficult to employ them consistently in follow-on analyses and assessments because they cannot be readily propagated through mathematical calculations. Balch has proposed confidence structures (c-boxes), which generalize confidence distributions and provide an interpretation by which confidence intervals at any confidence level can be specified for a parameter of interest. C-boxes can be used in calculations using the standard methods of probability bounds analysis and yield results that also admit the confidence interpretation. Thus, analysts using them can now literally compute with confidence. The calculation and use of c-boxes are illustrated with a set of several challenge problems involving parametric and nonparametric statistical estimation using sample data. The results include imprecise characterizations analogous to posterior distributions and posterior predictive distributions, and also structures that can be used to compute tolerance intervals at any probability levels. Simulations demonstrate the degree of conservatism of the results. The c-box approach is contrasted with statistical estimation using traditional maximum likelihood and Bayesian methods where possible.
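For a binomial parameter, the c-box takes a closed form as a pair of beta distributions, and confidence intervals at any level can be cut from it; a minimal sketch (the counts in the example are hypothetical):

```python
# Binomial c-box: given k successes in n trials, the c-box for the
# success probability p is bounded by Beta(k, n-k+1) and Beta(k+1, n-k).
# Intervals cut from it at level (1 - alpha) are Clopper-Pearson-style
# confidence intervals, valid at every confidence level simultaneously.
from scipy.stats import beta

def cbox_interval(k, n, alpha=0.05):
    """Two-sided 100*(1-alpha)% confidence interval from the binomial c-box."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Example (hypothetical data): 3 events observed in 24 trials.
for conf in (0.50, 0.90, 0.95, 0.99):
    print(conf, cbox_interval(3, 24, alpha=1 - conf))
```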

P.97 Fiebelkorn, SA*; Cunningham, FH; Dillon, D; Meredith, C; British American Tobacco, Group Research and Development, Southampton, SO15 8TL, UK; [email protected] 4-N-Nitrosomethylamino-1-(3-pyridyl)-1-butanone (NNK) and N-Nitrosonornicotine (NNN): Risk Assessment of Two Tobacco-Specific Nitrosamines (TSNAs) Tobacco smoke contains over 6,000 constituents, at least 150 of which have established toxicological properties. Both NNK and NNN have been proposed by the WHO Study Group on Tobacco Product Regulation for mandatory lowering. To further examine this prioritisation, we have employed margin of exposure (MOE) calculations, drafted postulated modes of action (MOAs), conducted a series of in vitro tests to confirm the MOA, and investigated the parameters for constructing a physiologically based pharmacokinetic (PBPK) model. MOE calculations for NNK, using 8 data sets with varying routes of administration and duration, generated values ranging between 278 and 89,543. MOE calculations for NNN, using 7 data sets with varying routes of administration and duration, generated values ranging between 2,758 and 255,652. Both ranges span the critical MOE value of 10,000, creating ambiguity as to the priority of NNK and NNN; however, comparing the lowest MOEs suggests a difference between the two compounds of approximately 10-fold. The key events in our postulated MOAs for both NNK and NNN involve metabolic activation, DNA damage, and genotoxicity (mutation) leading to cell proliferation and tumours. We compared the in vitro genotoxicity of NNK and NNN using the Ames test and the mouse lymphoma assay (MLA). NNK induced dose-dependent mutations in two Ames strains in the presence of S9 when tested up to 5,000 µg/plate, whereas NNN gave no conclusive results even when tested up to 30,000 µg/plate. NNK induced mutation in L5178Y cells (MLA) following 3-hour treatment with S9 when tested up to 10 mM, whereas NNN did not induce mutations with or without S9 when tested up to 10 mM. These data suggest a difference in the potency of NNK and NNN, and we propose a further step of constructing a PBPK model to contextualize these data against the effective tissue doses experienced by smokers.
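The MOE comparisons above amount to dividing a point of departure by an estimated human exposure and flagging quotients below the critical value of 10,000; a minimal sketch, with placeholder inputs rather than the study's datasets:

```python
# Margin of exposure (MOE) = point of departure / estimated human exposure.
# MOEs below a critical value (10,000 here) flag a compound as higher priority.
def margin_of_exposure(pod_mg_kg_day, exposure_mg_kg_day):
    return pod_mg_kg_day / exposure_mg_kg_day

CRITICAL_MOE = 10_000

# Hypothetical (POD, exposure) pairs in mg/kg-day, for illustration only:
datasets = {
    "compound A, study 1": (0.1, 2.0e-5),
    "compound B, study 1": (1.2, 3.0e-5),
}
for name, (pod, exposure) in datasets.items():
    moe = margin_of_exposure(pod, exposure)
    print(f"{name}: MOE = {moe:,.0f}", "(priority)" if moe < CRITICAL_MOE else "")
```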


W2-I.4 Finkel, AM; University of Pennsylvania and University of Michigan; [email protected] Aged in the bottle: It's time to uncork the 1980 gift to analysis and public protections Those concerned with the glacial pace of OSHA standard-setting tend to blame risk assessment in general, and the Benzene decision in particular, for placing roadblocks in the way of needed worker protections. A careful look inside OSHA's use of risk assessment, however, reveals that the severe delays have occurred elsewhere, and are mostly attributable to slow economic analysis within the Agency and increasingly drawn-out reviews by OIRA. Indeed, from a decision-theoretic perspective the problem is not that OSHA has to quantify risk, but that its standards are not influenced enough by risk considerations to make the QRA effort worthwhile. The analytic effort should not diminish; rather, the outcomes should become more determined by risk analysis. The Benzene decision actually empowered OSHA (and other agencies whose statutes permit such decision-making) to engage in a particularly thoughtful and humane form of cost-benefit balancing; rather than having to slavishly justify monetized overall health improvements against unduly precise economic estimates, OSHA was instructed by the Supreme Court to take on intolerable individual risks and reduce them at costs the risk-imposing industries can readily absorb or pass on. This presentation will look back on the unfulfilled promise of the Benzene decision, and propose ways that other regulatory agencies can compare costs and benefits in the superior way first articulated 35 years ago.

W1-C.4 Finkel, AM; University of Pennsylvania Law School and University of Michigan School of Public Health; [email protected] Beyond best-in-class: A secret to regulatory excellence A regulatory agency aspiring to be “best in class” among its peers, but also truly excellent in absolute terms, may be tempted to identify a set of virtues and maximize some or all of them. This may be the wrong mental model, however, as moving towards many of the desirable attributes of a risk management agency inexorably involves moving away from a competing virtue. Consider the attribute of humility: we want our regulators to listen sincerely to their constituents, but we also want them to be confident, assert evidence they have amassed, and lead rather than follow. This presentation will enumerate many such “competing virtues,” identifying the tensions between them but also identifying the excesses that can plague agencies that pursue “too much of a good thing.” For example, excess confidence can devolve into arrogance, while excess humility can devolve into regulatory capture and an agency incapable of rebutting misleading arguments. In particular, cost-benefit decisionmaking requires public agencies to find the “sweet spots” within the envelopes defined by various competing analytic virtues, such as the tension between efficiency and equity, or between analysis-driven decisions and decision-driven analysis. This presentation will offer some practical advice about how to find the “sweet spot,” which in general is different from the midpoint, or from a strategy of lurching between extremes.

M4-I.3 Firth, P; Griffiths, A*; UL Environment; [email protected] Risk Assessment: Alignment and Harmonization in Sustainability Manufacturers are facing ever-increasing demands for greater product transparency. This growing transparency movement is still frequently coupled to a red-list or hazard-screening mentality. While risk assessment has sometimes been misunderstood by those in the sustainability space, recent changes and perspectives have given new life to this proven methodology. This session will look at harmonizing methods for using risk assessment in a sustainability context. Evaluating the health profile of a chemical or product has been a struggle for the sustainability space, yet this approach is not only realistic today but also applicable to a visionary future.

T3-E.1 Fischbach, JR*; Knopman, D; Groves, D; Nixon, K; RAND Corporation; [email protected] Developing an Integrated, Cross-Agency Coastal Resilience Master Plan: a Case Study in Jamaica Bay, New York The RAND Corporation, in partnership with the Science and Resilience Institute at Jamaica Bay and BuroHappold Engineering, is currently facilitating a process designed to support long-term, cross-hazard coastal resilience planning in Jamaica Bay. This process, supported by the Rockefeller Foundation, brings together a range of federal, state, and local agencies with management responsibility in or around the bay. The goal is to inform the agencies’ long-term planning efforts through a common analysis framework that addresses future flood risk, water quality, and ecosystem outcomes. Key steps include (1) developing a participatory, stakeholder-informed decision framework and analytic decision support toolkit for potential resilience investments; and (2) using this framework and toolkit to identify future vulnerability from climate change and other drivers and evaluate long-term tradeoffs among a variety of proposed resilience measures. In this presentation, Dr. Fischbach will describe the analysis framework and preliminary lessons learned from the in-progress effort.


W2-F.2 Fischer, ARH; Wageningen University; [email protected] Risk-Benefit communication in nutrition and food safety Communication of the risks and benefits associated with nutrition and food safety is important to ensure that consumers adopt the healthiest practices. Traditionally the emphasis has been on avoiding risks, yet consumer behaviour often changes too little to avoid the risks and reap the benefits of best practice. A number of issues need to be taken into account to optimise risk-benefit communication. Cognitive resources and motivation are limited: consumers make dozens of food decisions every day, each with little consequence, so the motivation and capacity to elaborate on each of these decisions is lacking, and the easiest choice is often preferred regardless of nutrition or food safety risk. The communicated risks and benefits are often long term. Such temporal dilemmas strengthen consumer responses based on immediate benefits and risks while discounting future consequences; connecting long-term effects to short-term desirable behaviour might be a solution. Consumers often do not know what the healthy choice is, or what is healthy for them; on-package information should be easy to understand and relevant. In addition, the world is filled with less healthy choices, and in many situations it is difficult to find the healthier alternative. Approaches from social marketing, in particular the motivation-opportunity-ability framework, can help communicate risks and benefits in such a way that behaviour change is facilitated. Linking in with relevant consumer motivations, providing consumers with the opportunity to choose healthily, and ensuring they are able to identify the healthiest choice may provide inroads to successful risk-benefit communication about nutrition and food safety. If additional ‘nudges’ in the choice context are used to make the healthy choice the obvious choice, relevant changes in consumer behaviour may be achieved.

M2-A.4 Fitzpatrick, SC; U.S. Food and Drug Administration, Center for Food Safety & Applied Nutrition; [email protected] Determining the Predictive Capability of In Vitro Microphysiological Systems to Answer Critical Regulatory Questions Regulatory agencies are currently facing several challenges with respect to how to effectively integrate new emerging science with the other key, non-scientific inputs that need to be considered in decision making to help build a “Tox 21” risk assessment paradigm that will be readily accepted. For example, there is a delicate balance between assuring safety and impeding innovation, and any advances in strategies must recognize these dual demands that regulators face. Thus, qualification and validation steps are critical to determine that within the stated context of use, the results of assessment with a tool can be relied upon to have a specific interpretation and application in product development and regulatory decision-making. An example of a new validation strategy is the FDA Drug Development Tool (DDT) qualification process. This presentation will outline the application of DDT qualification on assessing issues surrounding the prediction of bioavailability and diffusion across the gut using a “gut on a chip” in vitro microphysiological system, and explore how these data could be used to refine risk assessment to more accurately predict the exposure to a consumer of a contaminant in the food supply.

T4-E.5 Flage, R*; Amundrud, Ø; Wiencke, HS; University of Stavanger, Proactima; [email protected] Overall regional risk assessment of four Norwegian municipalities Norwegian municipalities are required by legislation to perform overall risk and vulnerability assessments covering the municipality as geographical area, as provider of public services and as a driving force in the area of societal safety and emergency preparedness. Cooperation between municipalities in this area is encouraged, if appropriate, including common risk and vulnerability assessments. This paper reports on the execution of an overall regional risk and vulnerability assessment covering four Norwegian municipalities. Selected issues are highlighted and discussed, including the structure of the hazard/threat identification, the evaluation and representation of the strength of knowledge underlying the assessment, and the presentation of the total risk picture.

W2-G.1 Fletcher, K; Transportation Security Administration; [email protected] Dynamic Aviation Risk Management Solution Expanding TSA's risk-based security philosophy, and realizing its benefits, beyond passenger screening to other aspects of aviation security could create increased vulnerabilities if approached in a disconnected manner. Avoiding that potentiality requires a unified understanding of risk from across the aviation domain. Working with aviation industry partners and academia, TSA is creating a unified and dynamic approach to aviation security over the next decade. This model supports improved allocation and use of limited security resources based on risk at both the individual traveler and baggage/cargo level and on aggregate risk at the individual flight level. The complexity of this problem, the additional capabilities required, and the application of risk management principles will be discussed.


P.61 Forsell, T*; Haverstick, K; Nadeau, L; Eastern Research Group, Inc. (ERG); [email protected] The Social and Economic Effects of Wage Violations: Estimates for California and New York This project estimated compliance with labor laws and the costs of non-compliance. The Fair Labor Standards Act (FLSA) sets national standards for a minimum hourly wage, maximum hours worked per week at the regular rate of pay, and premium pay when the weekly standard is exceeded (overtime pay). State governments can implement labor laws that provide higher wage floors, more restrictive overtime rules, or stricter exemption requirements; accounting for this variation in state labor law adds to the complexity of evaluating compliance. The study focused on minimum wage violations in California and New York, using two large nationally representative datasets, the Current Population Survey (CPS) and the Survey of Income and Program Participation (SIPP). The first step in estimating the costs and benefits of non-compliance was to determine which workers are covered by these labor laws. Lost wages to workers, the primary cost, were then estimated. Using the CPS, an estimated $22.5 million in wages was lost per week by workers in California and $10.2 million per week by workers in New York in 2011 due to minimum wage violations. However, failure to comply with the FLSA and state labor laws has implications far beyond the dollar amount of unpaid wages. Lack of compliance with the FLSA and state minimum wage laws contributed to higher poverty rates, lower government revenue (due to lower employment and income tax payments by employees), and higher government expenditures on social support programs (such as the Supplemental Nutrition Assistance Program). Estimates of the number of families in poverty, lost tax revenue, and government expenditures on social support programs were also constructed. Thus, lack of compliance with labor laws can impact the Department of Labor's goal of providing a standard of protection to the labor force.
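The core lost-wage estimate is a sum, over covered workers paid below the applicable floor, of the wage gap times hours worked; a minimal pandas sketch in which the records and wage floors are hypothetical stand-ins for the CPS/SIPP microdata:

```python
# Weekly lost wages from minimum wage violations:
# sum over covered workers of (applicable wage floor - paid wage) * hours,
# counting only workers paid below the floor.
import pandas as pd

# Hypothetical microdata records (stand-ins for CPS/SIPP observations).
workers = pd.DataFrame({
    "state":       ["CA", "CA", "NY"],
    "hourly_wage": [7.00, 9.50, 6.50],
    "hours_week":  [40, 35, 20],
    "covered":     [True, True, True],   # covered by FLSA/state law
})
min_wage = {"CA": 8.00, "NY": 7.25}      # applicable floors (illustrative)

workers["floor"] = workers["state"].map(min_wage)
gap = (workers["floor"] - workers["hourly_wage"]).clip(lower=0)
workers["lost_wages_week"] = gap * workers["hours_week"] * workers["covered"]

print(workers.groupby("state")["lost_wages_week"].sum())
```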

W5-I.4 Forshee, RA*; Simonetti, A; Menis, M; Anderson, S; Kumar, S; U.S. FDA/CBER; [email protected] Benefit-Risk Assessment of Reducing Transfusion-Transmitted Babesiosis by Testing Blood Donations Babesiosis, caused by intraerythrocytic parasites of the genus Babesia, is a tick-borne disease that is also transmitted by transfusion of blood and blood products collected from infected donors. Babesiosis is often an asymptomatic or mild disease but may be severe and even fatal in neonates, the elderly, and the immunocompromised. Babesia microti is the highest-ranking transfusion-transmissible pathogen in the US for which no donor screening is available. Most cases of babesiosis occur in the Northeast and Upper Midwest, but cases have been reported in every state except Wyoming. We developed a benefit-risk assessment model to estimate: 1) the current risk of babesiosis in U.S. blood donors; 2) the reduction in transfusion-transmitted babesiosis (TTB) risk under various testing strategies; 3) blood unit loss due to false positive results; and 4) positive predictive value (PPV). We based the model primarily on babesiosis rates by state, estimated using Centers for Medicare and Medicaid Services data on the US elderly for the years 2006-2013. We estimated the effect of antibody-only testing and of combined antibody and nucleic acid amplification test (NAT) testing in selected states or nationwide. We estimated that year-round national serology antibody testing would reduce risk by about 90% with a PPV of about 20%. Year-round antibody testing in the five highest babesiosis-risk states would reduce TTB risk by about 70%. A combination of antibody testing and NAT nationwide would reduce the risk of TTB by almost five additional percentage points. PPV falls rapidly if specificity is not close to 100%. The results of the benefit-risk assessment informed the May 13, 2015 discussion by FDA's Blood Products Advisory Committee of the optimal testing strategy for reducing the risk of transfusion-transmitted babesiosis. We think that this type of model has wider application in assessing the geographical risk of endemic and emerging infectious diseases to help inform public health decisions.
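The sensitivity of PPV to specificity noted above follows directly from Bayes' rule; a small sketch (the prevalence and test characteristics are illustrative assumptions, not the model's estimates):

```python
# Positive predictive value of donor screening via Bayes' rule:
# PPV = se*p / (se*p + (1-sp)*(1-p)), where p is donor prevalence.
# With rare infections, PPV collapses quickly as specificity drops below ~100%.
def ppv(prevalence, sensitivity, specificity):
    tp = sensitivity * prevalence                 # true positive mass
    fp = (1 - specificity) * (1 - prevalence)     # false positive mass
    return tp / (tp + fp)

p = 1e-4  # hypothetical donor prevalence, not the CMS-based estimate
for sp in (1.0, 0.9999, 0.999, 0.99):
    print(f"specificity={sp}: PPV={ppv(p, 0.95, sp):.3f}")
```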

M2-I.2 Fox, M; Johns Hopkins University ; [email protected] Environmental meetings involving the community: What is meant by acceptable risk? Community stakeholder concerns regarding public health require simplifying and focusing complementary aspects of epidemiological, toxicological, and environmental sciences to help define acceptable risk. Occupational risk characterizations share a similar methodology; however, the audience is different as is the perspective on acceptable risk and whether those risks are voluntary or involuntary.

W5-J.1 Friedman, SM*; Egolf, BP; Lehigh University; [email protected] Journalists' Perceptions of Environmental, Health and Societal Fracking Risks The mass media have heavily covered risk issues related to hydraulic fracturing—fracking— for the past few years, particularly in states where drilling occurs. How journalists present information about the risks and benefits of fracking is important because such framing helps set an agenda that can affect public opinion. While journalists often report the opinions of their sources on most risk issues, they also have personal opinions about such risks and the amount of media coverage the risks deserve. This presentation will describe reporters’ risk perceptions about a variety of environmental, general health and mental health risks and social and community effects associated with fracking. It will also examine which risk topics the journalists felt deserved increased media coverage, as well as some of their reporting practices about fracking. Their responses raise questions about which factors may be driving media coverage of different fracking risks. The presentation is based on the results of a survey of three small groups of journalists who attended separate environmental training sessions in 2014 to help journalists gain more understanding of fracking issues and improve their coverage.


W3-J.2 Fuchs, G; University of Stuttgart; [email protected] The Transformation of the German Electricity System – Risk and Innovation The presentation analyzes the role of institutions in the transformation of the German electricity generating system, studied by distinguishing three phases of institutional development. In phase one, lasting from the late 1980s until 1998, the institutional setting of the electricity system was characterized by its decentralized and semi-public character, legitimized by the idea that electricity generation and supply constitute a natural monopoly. As a “niche development” we observe the growing importance of actors interested in developing renewable energies, which in those years could not really grow because of institutional and regulatory hurdles. Phase two is characterized by a double institutional re-alignment. Due to liberalization, electricity markets were created that became dominated by the four big utilities; former decentralized entities were mostly bought out, and a wave of mergers and acquisitions took place. In parallel, but somewhat disconnected from these developments, a new regulatory framework for the development of renewable energies was created. The developing institutions for renewables had little overlap with the mainstream electricity system: different actors, rules and organizations were dominant. This resulted in a very dynamic development of the renewables sector. Phase three finds its symbolic expression in the Energiewende decision of the Federal Government (2011). The constant growth of renewables and the definitive end of nuclear energy necessitated a re-alignment of the electricity sector. Renewables were no longer a niche activity, and the incumbent actors were forced to adapt their business models to the new situation. The interests of incumbents and challengers are directly clashing, and the government is working on a new market design, a process that is as yet undecided. The new institutions under construction, however, will mirror neither the “liberal market spirit” of phase two nor the enabling mood phase two showed toward renewables.

T4-D.2 Gaasland-Tatro, LA*; Landis, WG; Western Washington University; [email protected] Integrating Climate Change into Ecological Risk Assessment for Contaminated Sites Climate change will have far-reaching ecological effects that will be important to consider in the long-term management of contaminated sites. The combined effects of climate change and toxicant stressors may result in a multiplicative adverse outcome for biological endpoints. Our study incorporates climate model predictions for temperature and precipitation into a Relative Risk Model (RRM) framework with Bayesian networks for the South River, a legacy mercury-contaminated site in northern Virginia. We selected an ensemble of seven global circulation models, downscaled to predict potential region-specific temperature and precipitation changes through the end of the century. The ensemble of climate predictions allowed us to quantify potential effects from a range of possible changes in temperature and precipitation. The future climate scenarios provided inputs for Bayesian networks used to calculate relative risk to two bird species and two fish species representing different mercury exposure routes. The Bayesian networks allow us to quantify the inherent uncertainty in calculating risk to a novel ecological system from modeled climate predictions. Our results will provide managers with a quantitative method for assessing potential climate change effects on biological endpoints and avoiding unintended consequences in long-term site management. The versatility of Bayesian networks allows the RRM framework to be adapted to incorporate site-specific climate predictions for other endpoints and ecological systems.
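To illustrate the kind of calculation a Bayesian-network relative risk model performs, the sketch below propagates a climate scenario through discrete temperature states to a distribution of relative risk; the states, probabilities, and conditional table are invented for illustration and do not come from the South River model:

```python
# Toy Bayesian-network-style propagation: P(risk) = sum over temperature
# states of P(risk | temperature) * P(temperature | climate scenario).
# All numbers are hypothetical illustrations, not the South River model.
import numpy as np

# P(temperature state | scenario), states = [low, medium, high]
p_temp = {
    "baseline": np.array([0.5, 0.4, 0.1]),
    "end_of_century": np.array([0.1, 0.4, 0.5]),
}

# Conditional probability table: rows = temperature state,
# columns = relative risk state [zero, low, high] for an endpoint species.
cpt_risk = np.array([
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
    [0.1, 0.4, 0.5],
])

for scenario, pt in p_temp.items():
    p_risk = pt @ cpt_risk   # marginalize out the temperature node
    print(scenario, dict(zip(["zero", "low", "high"], p_risk.round(3))))
```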

P.177 Galloway, LD*; Wu, T; Dolislager, FD; Stewart, DJ; University of Tennessee, Knoxville and State of Alaska DEC Contaminated Sites; [email protected] Alaska-Specific Calculator Tool for Addressing Risk-Based Human Health Cleanup Levels The Alaska Cleanup Level Calculator (AKCALC) was developed to assist Alaska in setting Method 2 human health cleanup levels for its list of regulated chemicals. A number of Alaska-specific features were incorporated into this tool to make it comport with the Alaska Department of Environmental Conservation's regulatory approach for establishing cleanup levels. AKCALC is based on the inputs and equations used by the EPA's Regional Screening Level (RSL) calculator, with Alaska-specific parameters. The output provides cleanup values for soil in three different Alaska regions, for groundwater, and for migration to groundwater. An additional tool (AKRISK) was developed to enable the user to calculate cumulative risk. The new tools will be useful in setting regulatory cleanup standards for contaminated sites in Alaska through the regulatory process.
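For context, RSL-style calculators rearrange a hazard quotient equation to solve for the soil concentration that meets a target hazard; a generic noncancer soil-ingestion sketch (the parameter values shown are common residential defaults used purely for illustration; AKCALC's Alaska-specific inputs differ):

```python
# Generic noncancer screening level for incidental soil ingestion, obtained
# by setting the hazard quotient to a target value and solving for the
# concentration:
#   SL (mg/kg) = THQ * RfD * BW * AT / (EF * ED * IR * 1e-6)
# Parameter values below are common residential defaults, shown only to
# illustrate the form of the calculation.
def soil_screening_level(rfd_mg_kg_day, thq=1.0,
                         bw_kg=15.0,          # child body weight
                         at_days=6 * 365,     # averaging time (= ED for noncancer)
                         ef_days_yr=350,      # exposure frequency
                         ed_years=6,          # exposure duration
                         ir_mg_day=200.0):    # soil ingestion rate
    kg_per_mg = 1e-6
    intake_factor = ef_days_yr * ed_years * ir_mg_day * kg_per_mg
    return thq * rfd_mg_kg_day * bw_kg * at_days / intake_factor

# Example: hypothetical chemical with an oral RfD of 0.003 mg/kg-day.
print(round(soil_screening_level(0.003), 1), "mg/kg")
```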

W5-D.2 Galluppi, KJ; Putnam, H*; Coughenour, D; Selover, NJ; Chhetri, N; Roy, M; Arizona State University and City of Flagstaff Az; [email protected] Gap Analysis of Community Risk Planning For Climate Changes to Extreme Weather Events Most disaster losses are due to extreme weather events (EWEs) such as prolonged drought, heavy snowfall and ice, floods, hurricanes, wildfire, and tornado outbreaks. Projected climate changes may exacerbate the frequency, magnitude, or duration of these events as well as their direct and cascading impacts. However, risk mitigation and planning for these events is inhibited by the lack of relevant, reliable, and actionable information needed for intermediate 2-10 year political and fiscal planning cycles. This talk describes the preliminary findings of a NOAA-funded project to understand how emergency and community managers view risk, impacts, and loss, and what information is needed to support planning for EWEs. The project focuses on Arizona, where minor climate variations can result in extreme weather outcomes. In particular, we look at the knowledge gap between intermediate risk management needs and long-term (decadal) climate information. Through qualitative social analysis, this study has found that emergency and risk managers whose focus is on preparedness, mitigation, and community risk management do not have reliable, actionable climate information with which to confidently put strategies in place and spend community resources. This leaves required planning decisions to be based on beliefs rather than rigorous risk analysis, often with debatable outcomes. Community managers report 1) a lack of information on hazards and impacts with which to develop a hazard characterization, 2) multiple perspectives on vulnerability and loss, and 3) a lack of confidence to prioritize EWE consequences over other pressing issues. Of primary concern is the lack of believable, quantifiable estimates of the frequency, magnitude, and duration of hazards. Furthermore, extreme event consequences need to be tied to experiences that the public and senior management can readily relate to. This talk overviews managers' viewpoints of intermediate-term extreme weather and climate risk, and what communities need in order to plan for such events.


T3-D.2 Ganin, AA*; Massaro, EM; Mangoubi, R; Kitsak, M; Linkov, I; US Army Engineer Research and Development Center and University of Virginia; Massachusetts Institute of Technology; Charles Stark Draper Laboratory; Northeastern University; US Army Engineer Research and Development Center; [email protected] Resilience in interdependent networks: cascading failure and recovery We report on a study of resilience in systems of interdependent networks, where nodes in at least one network depend on nodes in one or more other networks. The dynamics of these inter-network connections might affect nodes in all networks even if their number is small relative to intra-network connections. Real-world examples include infrastructure, energy, and communication systems. Earlier (1) we argued that the design of such systems must move towards a resilience-based approach and provided a mathematical formulation of resilience in complex networks and a framework for its evaluation. In this work we apply the concept of resilience defined as the normalized integral of the average critical functionality of the system, considered as a function of time over a specified interval and given a range of adverse events. Several cases of interdependent network systems are considered: random Erdos-Renyi networks, scale-free networks, real transportation networks, and others. We build on the model of coupled networks developed by Parshani et al. (2), and introduce a recovery algorithm based on the assumption that backup supply is available to certain nodes. The results of the computational study of the interplay between various parameters of the networks (e.g., average node degree, scale-free slope factor) and the model (e.g., the severity of the adverse events considered, the amount of backup supply available) indicate that resilience is very sensitive to these parameters, with a non-linear dependency exhibiting thresholds. The data obtained provide insights on the system's dynamics and allow stakeholders and network designers to make more relevant decisions regarding the retrofit of modern complex systems. 1. E. M. Massaro et al., in SRA 2014 Annual Meeting Abstracts (Denver, CO, 2014), p. 95. 2. R. Parshani, S. V. Buldyrev, S. Havlin, Phys. Rev. Lett. 105, 048701 (2010).
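The resilience measure described, a normalized integral of average critical functionality over a control window, reduces to a simple numerical integration once a functionality time series is available; a sketch with an invented drop-and-recovery curve:

```python
# Resilience as the normalized integral of critical functionality K(t)
# over a control interval [0, Tc]: R = (1/Tc) * integral of K(t) dt,
# with K normalized so that K = 1 means full functionality.
import numpy as np

t = np.linspace(0.0, 100.0, 201)   # time steps over the control window
# Hypothetical adverse event: functionality drops at t=10, then
# recovers linearly back to full functionality.
K = np.where(t < 10, 1.0, np.minimum(1.0, 0.4 + 0.6 * (t - 10) / 60))

resilience = np.trapz(K, t) / (t[-1] - t[0])
print(f"resilience = {resilience:.3f}")   # 1.0 would mean no loss at all
```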

P.95 Gatchett, A*; Wright, JM; Segal, D; Shannon, T; U.S. Environmental Protection Agency, National Center for Environmental Assessment, Cincinnati, OH; [email protected] U.S. EPA Human Health Research on Community and Site-specific Risk Program The purpose of the U.S. EPA’s Human Health Risk Assessment (HHRA) program in the Office of Research and Development is to develop and apply state-of-the-science risk assessment methods to estimate health and environmental risks from exposures to chemical and non-chemical stressors including various mixture combinations. More specifically, the community and site-specific risk emphasis is to provide rapid response assessments and cumulative risk methods to address emergency response, Superfund site assessment, sustainability characterization, and community concerns. Communities today are faced with an urgent need for coordinated assistance to assess and address issues of chemical and other environmental contamination. The U.S. EPA’s HHRA program is frequently called upon to quickly assist in these situations, often in the face of large scientific uncertainties due to data gaps. Specific work under this topic includes quick turn-around exposure and risk assessments, technical support on human health or ecological risks to support different Superfund sites or regional concerns, the development of Provisional Peer-Reviewed Toxicity Value (PPRTV) assessments, and the development of methods and tools for conducting cumulative impact and risk assessments. Taken together, this work helps ensure that the U.S. EPA’s programs and regions have the tools and information they need to make decisions and address community concerns. The Community and Site Specific research plan is uniquely positioned to support risk management decisions and regulatory needs of various stakeholders, including Agency program and regional offices as well as state/tribal environmental protection programs and interested communities. This poster provides an overview and highlights of the proposed research over the next five years. The views expressed in this abstract are those of the authors and do not necessarily reflect the views and policies of the U.S. EPA. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

W3-G.3 Geitner, NK*; Wiesner, M; Duke University; [email protected] A model of nanomaterial trophic transfer driven by surface interactions We present a mathematical model for the trophic transfer of engineered nanoparticles in aquatic food webs. The model is driven by the surface adhesion property, or alpha, of the particle, which can be measured in the lab for particular particle/surface systems. The model accommodates both chronic (long-term and constant) and pulse (acute, one-time) nanoparticle exposures. In the chronic exposure case, particle concentrations in each trophic level reach a steady state that depends on the alpha values between the particle and not only the species at that trophic level but all other available surfaces in the local ecosystem. We use gold nanoparticles as an experimental model, quantitatively measuring their adhesion to live algal cells in order to inform the current model. We also present a pulse exposure model for the accidental release of nanomaterials into an aquatic system, which likewise results in significant trophic transfer dependent on both the physical properties of the nanomaterial and the biological properties of the local ecosystem.
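A minimal sketch of how alpha-driven uptake might enter a chronic-exposure compartment model; the model structure and every rate constant below are illustrative assumptions, not the presented model:

```python
# Toy two-trophic-level uptake model in which particle attachment to algae
# scales with the adhesion parameter alpha; consumers ingest algae-bound
# particles. Purely illustrative structure and rates.
import numpy as np
from scipy.integrate import odeint

def dydt(y, t, alpha, input_rate):
    water, algae, consumer = y
    attach = alpha * 0.5 * water      # attachment to algal surfaces scales with alpha
    graze = 0.05 * algae              # consumers ingest algae-bound particles
    return [input_rate - attach - 0.01 * water,   # water column
            attach - graze - 0.02 * algae,        # algal compartment
            graze - 0.03 * consumer]              # consumer compartment

t = np.linspace(0, 365, 366)   # one year of constant (chronic) input
for alpha in (0.01, 0.1, 1.0):
    sol = odeint(dydt, [0.0, 0.0, 0.0], t, args=(alpha, 1.0))
    print(f"alpha={alpha}: consumer burden at day 365 ~ {sol[-1, 2]:.1f}")
```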

W1-E.4 Geraci, CL; National Institute for Occupational Safety and Health; [email protected] Evaluating the Completeness and Effectiveness of Current Safety Data Sheets for Nanomaterials The Safety Data Sheet (SDS), formerly the Material Safety Data Sheet (MSDS), continues to be regarded as a basic source of information on the hazards associated with chemical substances. This hazard information, combined with knowledge of potential exposure, provides a basis for developing the foundation of hazard and risk communication in the workplace. Over the past ten years there has been dramatic growth in the number and volume of engineered nanomaterials being introduced into commerce, and therefore into the workplace. As the body of toxicological data on nanomaterials began to grow, a need was recognized to improve the communication of basic hazard information for these materials. Several studies evaluated MSDSs and found them lacking both in content and in their ability to serve as an effective hazard communication resource. The Globally Harmonized System (GHS) has brought improvements in the amount and quality of information to be communicated regarding chemical hazards and is the basis for the new Hazard Communication rule in the US. The question remains whether an SDS following GHS guidelines will be more effective at communicating information about nanomaterials. In parallel with the issuance of GHS, ISO recently issued a voluntary standard for SDS content for nanomaterials. This presentation will discuss the effectiveness of GHS-based SDSs in providing basic hazard information to the occupational safety and health practitioner for the purpose of developing a hazard and risk communication plan for facilities and processes that manufacture or formulate nanomaterials.


P.64 Gernand, JM; Penn State University; [email protected] Nanoinformatics: Advances, Applications, and Assessing the Continuing Challenge of Uncertainty The field of nanoinformatics continues to progress in the development of databases, statistical analyses, and data mining algorithms for the identification of important risk and functional aspects of engineered nanomaterials. The successful development of these tools and resources is critical for the safe and responsible development of this technology. Over the past several years, special discussions and meetings on this topic have focused on the needs and plans for developing nanoinformatics infrastructure and methodologies. Now the first applications, traditional and non-traditional, are beginning to report results along with their successes and limitations. This presentation will highlight advances, applications, and challenges of existing efforts in this area, which still tend to be relatively young, and how the continuing uncertainty revealed by nanoinformatics analysis can inform future developments and the search for new knowledge. Quantitative structure-activity relationships (QSARs) and related modeling studies of the toxicity of nanomaterials have been successfully developed, but from isolated and independent perspectives that cannot yet be readily consolidated; the bridging questions and prospective unifying questions can now be defined, however. Several ambitious database projects to capture and serve up accumulated and newly created data on nanomaterial risks and functions have been making progress, and these activities have identified the main impediments to full implementation while allowing the community to estimate the level of investment required to achieve the desired aims. This presentation also presents a case study on the analysis of uncertainty over time with respect to nanomaterial characterization and the quantification of outcomes, with a prospective estimation of when certain data milestones may be reached.
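As a flavor of the nano-QSAR modeling discussed, the sketch below fits a random forest to particle descriptors and uses the spread across trees as a crude uncertainty signal; the descriptors, data, and model choice are all illustrative assumptions:

```python
# Illustrative nano-QSAR: predict a toxicity endpoint from particle
# descriptors and use between-tree spread as a crude uncertainty signal.
# Data and descriptor set are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Columns: diameter (nm), zeta potential (mV), surface area (m2/g), dose (ug/mL)
X = rng.uniform([5, -60, 20, 1], [100, 20, 300, 100], size=(200, 4))
y = 0.02 * X[:, 3] - 0.01 * X[:, 0] + rng.normal(0, 0.2, 200)  # synthetic response

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

x_new = np.array([[30.0, -25.0, 150.0, 50.0]])   # hypothetical new particle
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print(f"prediction = {per_tree.mean():.2f} +/- {per_tree.std():.2f} (tree spread)")
```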

T4-H.4 Gernes, RA*; Rice, G; MacDonnell, MM; Hertzberg, R; Beresin, GA; Wright, JM; ASPPH/US EPA (Association of Schools and Programs of Public Health/ EPA Fellowship Program); [email protected] Estimating Greenspace Exposure and Benefits for Cumulative Risk Assessment Applications: Findings from an EPA Technical Working Group Ecosystem services, including access and exposure to greenspace, may benefit human health. The causal mechanisms (e.g., physical activity, psychological well-being from exposure to nature) through which greenspace may affect specific health outcomes (e.g., asthma, cardiovascular disease) are somewhat uncertain. Given that greenspace may act as a non-chemical stressor (e.g., release of pollen) or an exposure modifier (e.g., air or noise pollution reduction), it is a good candidate to examine in a cumulative risk context, which could help determine the utility of ecosystem service data for characterizing risks and developing risk management alternatives. Considering these goals, the U.S. Environmental Protection Agency (EPA)’s National Center for Environmental Assessment in Cincinnati hosted a technical meeting in May 2015 to evaluate various measures and roles of greenspace from a cumulative risk assessment perspective. The objective of the meeting was to identify and qualify approaches and appropriate data sources for measuring greenspace and exposure, and evaluate the distribution of health impacts of greenspace (e.g., across socio-economic status, sensitive populations), including risk reductions, from a cumulative risk assessment perspective, with attention to uncertainty in reporting and measurement. Key discussion points and outputs from the meeting will be presented to inform methods for evaluating environmental health risks and benefits associated with greenspace exposure, as well as examine implications for considering complex mixes of stressors and protective factors for future cumulative risk assessments. This presentation was supported by Cooperative Agreement Number X3-83555301 from the U.S. Environmental Protection Agency and the Association of Schools and Programs of Public Health. The findings and conclusions of this presentation do not necessarily represent the official views of EPA or ASPPH.

W3-B.1 Getchell, MC; University of Kentucky ; [email protected] Chaos Theory and the Use of Social Media for the Process of Self Organization During the West Virginia Water Contamination Chaos theory has been widely applied in the math and science field but is also finding popularity in the communication discipline, particularly risk and crisis communication. The West Virginia water contamination crisis typifies an event where chaos theory is applicable. This complex and drawn out crisis contained all aspects of a chaos theory event: strange attractors, bifurcation, a cosmology episode, fractals, and self-organization. Self-organization is the “anti-chaos” mechanism that helps a system return to equilibrium. In this event, the self-organization aspect was most evident by examining social media sites and the way these platforms were used by citizens to create ephemeral organizations that addressed the various and diverse needs of the affected stakeholders. These organizations used social media to form, gather and disseminate information and supplies, and also provide a virtual space for others to become involved in the crisis response. Because of this, these ephemeral organizations helped affected stakeholders to cope during this cosmology episode and eventually return the system to equilibrium.

M4-B.1 Gilmore, EA*; Hegre, H; Buhaug, H; Calvin, K; Nordkvelle, J; Waldhoff, S; University of Maryland, Peace Research Institute Oslo, Joint Global Change Research Institute; [email protected] Forecasting Armed Conflict: Risks and Interventions Projections for armed intrastate conflict (civil war) depend on expectations of socioeconomic development. For example, economic growth and higher educational rates lower the risk of armed conflict. Here, we forecast the risk of armed intrastate conflict along five alternative socioeconomic pathways, known as the Shared Socioeconomic Pathways (SSPs). First, we develop a statistical model of the historical effect of key variables – population size and socioeconomic development (GDP per capita and educational attainment) – on country-specific conflict incidence, 1960–2013. Based on this model, we then forecast the annual incidence of conflict, 2014–2100, along the five SSPs. SSPs with greater welfare improvements are associated with the largest reductions in conflict risk. The marginal effect of socioeconomic development on reducing conflict risk is also much higher for the least developed countries. Importantly, this implies that poverty alleviation and investments in human capital in poor countries are likely to be much more effective instruments for attaining global peace and stability than maximizing growth in wealthier economies. Further, the SSPs contain information about the challenges of mitigating and adapting to climate change. We find that the sustainable development pathway, with lower challenges to mitigation and adaptation, is as conducive to global peace as a higher-growth, fossil-fuel-based development pathway. Thus, the sustainable development pathway stands out as a “no regrets” strategy for preventing both climate change and intrastate conflict.
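The statistical step can be pictured as a regression of historical conflict incidence on development covariates, followed by projection along a pathway; a compact sketch in which the data and pathway values are synthetic placeholders for the country-year panel:

```python
# Sketch: fit conflict incidence ~ development covariates on historical
# country-years, then project along a socioeconomic pathway.
# Synthetic data; covariates mimic GDP per capita, education, population.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
log_gdp = rng.normal(8.5, 1.0, n)       # log GDP per capita
educ = rng.normal(7.0, 3.0, n)          # mean years of schooling
log_pop = rng.normal(16.0, 1.5, n)      # log population
# Synthetic truth: conflict risk falls with income and education.
logit = 2.0 - 0.5 * log_gdp - 0.15 * educ + 0.2 * (log_pop - 16)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([log_gdp, educ, log_pop])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Project a hypothetical country along a pathway with rising income/education.
future = np.array([[9.0, 9.5, 16.2], [9.8, 11.0, 16.3]])  # invented 2050, 2100 values
print(model.predict_proba(future)[:, 1])  # forecast conflict probabilities
```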


M4-F.2 Goble, RL; Clark University; [email protected] Metamorphoses: changes in the practice and use of risk analysis “Risk” is a portentous term. It conveys threats of harm together with the chanciness of such harm. Describing the combination of threat, harm, and chanciness leads to analytical complexity. It also evokes worry and fear, uncertainty about the meaning of the threats and uncertainty about what to do about them. Not surprisingly, the term “risk” resists definition; it means different things to different people in different contexts. The relatively new field of risk analysis has, sometimes enthusiastically, sometimes reluctantly, embraced these concerns, complexities, uncertainties, and ambiguities. It has also evolved considerably over the past 40 or so years. No longer do we expect a risk analysis to be the decision tool for a policy choice, and no longer do we pursue the chimera of “acceptable risk”. Instead, a risk analysis is expected to inform rather than make decisions. And going beyond policy decisions, risk analyses are now used to support risk management efforts and to inform a variety of public processes that can be viewed as risk governance. The story of this evolution can be told in several ways. A documentation of the evolution can be found in a series of National Academy of Sciences reports that, over time, have explored different aspects of the field. Another approach to the story is to consider some of the tensions, dilemmas even, that are intrinsic to the field. The institutionalization of a changing field of risk analysis, and the tensions that have driven the changes, provide a context for viewing the major strands of social science risk research that are the topic of this symposium.

W4-F.2 Goble, RL; Clark University; [email protected] Warnings, Warning Signals, and the Dynamics of Risk Risks continually change. Technologies change; the natural environment changes; circumstances of exposure change; successful risk management can be expected to reduce risk; failures in risk management, however, may increase some risks and may even introduce new risks. Effective risk analysis requires keeping up with such shifts in risks and in the drivers of risk. Warnings and warning signals have an important place in a risk manager’s toolbox. In general, warnings provide information that can vary considerably in certainty, and practical usefulness. Warning signals are primarily used for one of two purposes: 1) to notify risk managers when it is time to put in place some protective action; 2) to engage members of the public in their own protective activities; they might be asked to evacuate, to take shelter, or to avoid using a technology or drug in inappropriate circumstances. However, warning signals do not always work as intended. The recommended actions may not be appropriate or the timing of the signal may be wrong. Messages may be distorted or fail to get through. I present examples of successful and failed use of warnings and warning signals to illustrate the practical relevance of emphasizing dynamics in risk analyses.

W5-H.3 Goerlandt, F; Aalto University; [email protected] A perspective on the relation between risk and prediction In risk research, there has been a recent focus on foundational issues related to concepts, principles and frameworks for describing risk. The aim of the current paper is to discuss one fundamental issue in risk analysis: whether or not risk models can be considered predictive tools. Very little work is known in which this question has been the focus, while several authors have made explicit statements reflecting an implicit expectation that risk analyses are predictive tools. Some authors have presented views in which the predictive nature of risk analyses is questioned, but elaborate discussions of this issue are rare. In this presentation, an attempt is made to advance the discussion by proposing a refinement of the different ways to interpret what is meant by prediction. In addition, a number of characteristics of the system to which risk analysis is applied are investigated. These distinctions provide further insight into the question of the predictive power of risk models, which is argued to have implications for how risk models can be used.

M2-F.3 Golden, NJ*; Dearfield, K; Food Safety and Inspection Service; [email protected] Proposed Guidelines for the Control of Nontyphoidal Salmonella Spp. in Beef and Pork Meat Salmonellosis is one of the most frequently reported foodborne diseases worldwide, with beef and pork meat considered important food vehicles. The burden of the disease and the cost of control measures are significant in many countries, and contamination of food products with zoonotic nontyphoidal Salmonella has the potential to disrupt trade between countries. Therefore, USDA's Food Safety and Inspection Service is developing, in conjunction with international partners, a Codex Alimentarius document that details good hygienic practices (both current practices and hazard-based interventions) for the control of Salmonella. The primary focus is to provide information on practices that may be used to prevent, eliminate, or reduce Salmonella in raw beef and pork meat. The Guidelines will build on general food hygiene provisions already established in the Codex system and develop potential control measures for these pathogen-commodity pairs. The Guidelines will be applicable to the production-to-consumption food chain and to all nontyphoidal Salmonella that may contaminate beef and pork meat (Bos indicus, Bos taurus and Sus scrofa domesticus) and cause foodborne disease. This talk will discuss information on the control of Salmonella in beef and pork meat aimed at reducing foodborne disease, with measures to recommend to governments and industry.


W2-I.2 Goldstein, BD*; Carruth, RS; University of Pittsburgh and University of Cologne; [email protected] Role of the Benzene Decision in Secondary as Opposed to Primary Prevention of Risk Classic public health theory applied to environmental health distinguishes between secondary prevention, which is aimed at detecting and reducing risks before adverse health effects are manifested, and primary prevention, which seeks to prevent risks from occurring. Risk assessment is inherently a tool of secondary prevention, requiring evidence of hazard and exposure to calculate a risk that can be titrated against a desirable goal. Among the large number of chemicals to which workers and the general population are exposed, benzene stands out for having a relatively high level of information pertinent to a risk calculation. In contrast, the relative lack of information to perform a robust risk assessment for the very large majority of other workplace chemicals has been among the factors hampering OSHA's ability to move from the benzene decision to set workplace standards. Also contributing have been the increasing complexity and costs of the risk assessment process, a complexity that goes well beyond what the Supreme Court decision requires. In contrast, in the 1990 Clean Air Act Amendments for hazardous air pollutants (HAPs), which include benzene, Congress in essence rejected the benzene decision for the general population by taking a more precautionary approach. The amended law lists 189 chemicals to be subject to Maximum Achievable Control Technology, while risk assessment is consigned to a backup role. Perhaps reflecting some confusion, Congress also required cost-benefit analysis for HAPs, for which, tellingly, EPA chose benzene; doing cost-benefit analysis without sufficient information to perform a risk assessment is problematic. Simplifying risk assessment for OSHA purposes may be easier than amending the OSH Act.

P.59 Good, DH*; Chien, S; Li, L; Christopher, L; Zheng, J; Krutilla, K; Tian, R; Chen, Y; Indiana University and Indiana University Purdue University Indianapolis; [email protected] Benefit Analysis of Vehicle Crash Imminent Braking Systems for Bicyclist Fatality Reduction The World Health Organization estimates that there are approximately 1.2 million traffic fatalities per year worldwide. About half of the victims are not occupants of a motor vehicle and do not benefit from improvements in passive safety systems such as seat belts or air bags. In response, motor vehicle manufacturers and tier 1 suppliers have been developing and deploying a new generation of active safety systems designed to protect pedestrians and bicyclists. While bicycle crashes with motor vehicles are rarer in the US than in other countries, they are expected to increase as Americans improve their health through mobility. This study evaluates the benefits of two systems in terms of their ability to mitigate crashes with bicycles. Compared to pedestrian crashes, cyclist crashes are far more complex (for example, pedestrians do not attempt to merge into traffic, and their range of speeds is much lower). We develop testing scenarios based on US crash data and surrogate cyclist targets that are realistic from radar and visual perspectives, develop injury prediction models, and test these vehicles on a track with repeated trials. In addition to an estimate of benefits, we offer suggestions for implementing these procedures in the DOT's New Car Assessment Program (NCAP) and in future modifications to the Federal Motor Vehicle Safety Standards.

M3-J.4 Goodhue, C*; Kieval, R; Stiller, H; Wiley, P; McDonough, B; Eastern Research Group, Inc. and NOAA; [email protected] What Will Adaptation Cost? Coastal areas across the United States are beginning to incorporate sea level rise adaptation into their community planning. Incorporating the uncertainty of sea level rise and coastal storms, along with the level of effort needed to develop cost and benefit data, are two major challenges in making informed decisions about adapting to coastal flooding. Communities are already grappling with difficult decisions about how to locate, maintain, and protect expensive community infrastructure such as roads, hospitals, and wastewater treatment plants. The session will describe a NOAA framework that community leaders and planners can use to make more economically informed decisions about adapting to sea level rise and flooding from coastal storms. A focus of this presentation is how the uncertainty associated with sea level rise and coastal storms is communicated to decision makers to help them choose the best path forward for their community. The detailed guidance and high-level executive summary developed by Eastern Research Group under contract to NOAA can be found at: http://www.coast.noaa.gov/digitalcoast/publications/adaptation.

W3-I.1 Goodman, JE*; Loftus, CT; Rhomberg, LR; Lynch, HN; Gradient; [email protected] How a Sensitivity Analysis of Raw Data Would Strengthen EPA's Chlorpyrifos Risk Assessment In the Revised Human Health Risk Assessment for Chlorpyrifos, the United States Environmental Protection Agency (EPA) Office of Pesticide Programs (OPP) reviewed several epidemiology studies and concluded that prenatal chlorpyrifos exposure likely plays a role in neurodevelopmental effects. This conclusion is based primarily on analyses of a cohort of women and children from New York City enrolled in a Columbia University study (the Columbia cohort). The peer-reviewed publications describing this cohort provide important information, but they reflect only a fraction of all analyses performed on the data. Because we were limited by the lack of publicly available raw data, we conducted only two sensitivity analyses to estimate the potential impacts of measurement error on analyses of this cohort. We found that at least some reported associations could be substantially biased by exposure or outcome misclassification, and EPA's assessment may have been influenced by these biases. In this talk, I will describe our sensitivity analyses and discuss the additional analyses that could be conducted with access to the raw data. Overall, our findings underscore the critical importance of raw data in this risk assessment, as well as in others that place significant weight on epidemiological evidence.
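One standard form such a sensitivity analysis can take is a quantitative bias analysis for exposure misclassification, back-correcting an observed 2x2 table under assumed sensitivity and specificity; a sketch whose counts and test characteristics are invented to show the mechanics, not the cohort's data:

```python
# Quantitative bias analysis for nondifferential exposure misclassification:
# back-correct observed 2x2 cell counts given assumed sensitivity (se) and
# specificity (sp) of the exposure measure, then recompute the odds ratio.
# All numbers are invented for illustration.
def corrected_count(obs_exposed, total, se, sp):
    # obs_exposed = se*true + (1-sp)*(total - true)  =>  solve for true
    return (obs_exposed - (1 - sp) * total) / (se + sp - 1)

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Observed (hypothetical): 40/100 cases exposed, 25/100 controls exposed.
obs_or = odds_ratio(40, 60, 25, 75)

se, sp = 0.80, 0.95   # assumed performance of the exposure measure
a = corrected_count(40, 100, se, sp)   # corrected exposed cases
c = corrected_count(25, 100, se, sp)   # corrected exposed controls
adj_or = odds_ratio(a, 100 - a, c, 100 - c)

print(f"observed OR = {obs_or:.2f}, misclassification-adjusted OR = {adj_or:.2f}")
```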

M4-C.5 Goodman, JE*; Lynch, HN; Gradient; [email protected] Extrapolation of Controlled Human Study Results to the US Population Ideally, controlled exposure studies on which regulations are based would be designed so that they are generalizable to the regulated population. In reality, this often proves challenging, for a number of reasons. Sample sizes are small, and study participants are generally homogeneous and relatively healthy; children and people with pre-existing conditions are often excluded for ethical reasons. Also, these studies do not always represent real-world conditions; for example, some studies involve quasi-continuous exercise that induces higher ventilation rates and, thus, worst-case exposure scenarios. This talk will focus on the design of controlled exposure studies and the applicability of their results to the broader US population, including sensitive populations.

P.6 Greenberg, GI*; Beyer, LA; Gradient; [email protected] Comparison of a Site Risk Assessment Conducted Using EPA Superfund Risk Assessment Guidelines vs. LDEQ RECAP Methods We conducted a risk assessment for a site in Louisiana using both EPA's Superfund Risk Assessment Guidelines and Louisiana Department of Environmental Quality (LDEQ) Risk Evaluation/Corrective Action Program (RECAP) methods. Risks were evaluated for a recreator who is exposed to sediment and surface water while crabbing in waters near the site. RECAP methods are broadly consistent with EPA risk assessment guidance but differ in their specific requirements. EPA's risk assessment approach consists of four steps: hazard identification, dose-response assessment, exposure assessment, and risk characterization. RECAP is based on a tiered framework consisting of a Screening Option (SO) and three Management Options (MO-1, MO-2, and MO-3). Similar to EPA's hazard identification step, the SO identifies chemicals of concern (COCs), which are carried forward to the MO. Unlike EPA's method, RECAP's tiered MO approach compares exposure point concentrations to RECAP standards (RS). With each increasing MO tier, the RS becomes more refined, incorporating more site-specific assumptions (e.g., exposure, fate/transport). The two methods have different requirements for data usability, resulting in different datasets. Key discrepancies include the use of SPLP and filtered samples, soil depth, and proxy values for non-detected chemicals. EPA and RECAP also have different specifications that could impact COC selection (e.g., detection frequency and health-based screening values). For the risk calculations, EPA incorporates soil bioavailability and updated exposure assumptions and toxicity values. Total petroleum hydrocarbon (TPH) risk calculations also differ, as EPA and RECAP rely on different sources to define TPH fractions and toxicity values. Despite the differences between the EPA and LDEQ RECAP approaches, both resulted in the same conclusion: recreator exposures to Site media did not pose a risk of adverse health effects.
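To make the contrast concrete, here is a minimal sketch, with entirely hypothetical concentrations, exposure assumptions, and standards, of the two decision styles: an EPA-style hazard quotient computed from exposure and toxicity inputs versus a RECAP-style comparison of an exposure point concentration against tiered standards.

```python
# Two decision styles, illustrated with hypothetical numbers only.

# --- EPA-style calculation (incidental sediment ingestion, noncancer) ---
epc = 12.0             # mg/kg, hypothetical exposure point concentration
ir = 100e-6            # kg/day sediment ingestion rate (hypothetical)
ef_fraction = 30 / 365 # exposure frequency: 30 days/yr (hypothetical)
bw = 80.0              # kg body weight
rfd = 3e-4             # mg/kg-day reference dose (hypothetical)

add = epc * ir * ef_fraction / bw  # average daily dose, mg/kg-day
hq = add / rfd                     # hazard quotient
verdict = "acceptable" if hq < 1 else "exceeds target"
print(f"EPA-style hazard quotient: {hq:.5f} ({verdict})")

# --- RECAP-style screening: compare EPC to tiered standards ---
recap_standards = {"SO": 5.0, "MO-1": 15.0, "MO-2": 40.0}  # mg/kg, hypothetical
for tier, rs in recap_standards.items():
    status = "passes" if epc <= rs else "fails"
    print(f"{tier}: EPC {epc} mg/kg vs. RS {rs} mg/kg -> {status}")
```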

P.73 Greene, CW*; Goeden, HM; Minnesota Department of Health; [email protected] State-Level Innovations in the Assessment of Drinking Water Contaminants of Emerging Concern The Minnesota Department of Health (MDH), using funding derived from the Minnesota Clean Water, Land, and Legacy Amendment to the state Constitution, has developed an innovative program to identify, screen, and assess drinking water contaminants of emerging concern (CECs) and, when possible, develop health-based drinking water standards. In the state of Minnesota, approximately 1.4 million people, or 25 percent of the population, derive their drinking water from the Mississippi River. Like most rivers and streams that receive treated wastewater discharges, the Mississippi has been shown to contain measurable concentrations of certain pharmaceuticals, personal care product ingredients, plasticizers, and pesticides. To a lesser extent, CECs may also be found in municipal or private groundwater wells. New industrial, commercial, or consumer uses may also contribute to releases of CECs to drinking water sources. Many of these CECs are not required to be monitored under state or federal drinking water programs, but have been measured in source water by the U.S. Geological Survey (USGS), the Minnesota Pollution Control Agency (MPCA), and other parties. MDH’s program allows interested persons or organizations to nominate chemicals to be considered for review. MDH staff conduct screening assessments of toxicity and exposure potential, then select candidates for a more intensive review process aimed at developing health-based standards for multiple exposure durations. Through this program, MDH has developed enhanced capabilities for dealing with risk assessment issues that arise with CECs, including making decisions based on limited toxicological data and allocating potential risks among multiple exposure sources such as consumer products and diet. MDH’s experiences assessing CECs may be informative to state or local authorities concerned about risks from water reuse, wastewater impacts on surface water quality, or exposures to consumer products.

T2-C.3 Gregory, R*; Harstone, M; Decision Research; [email protected] Structuring Intervention Decisions to Prevent Genocide Decisions to intervene to stop genocide in another country are complicated, on many levels. The decision context typically is characterized by numerous constraints, including insufficient time, limited information, and scarce resources. Key objectives such as “protecting the national interest” are difficult to define. And the decision framework needs to balance the roles of System 1 and System 2 (feeling and thinking) responses. This presentation summarizes some of the lessons that have been learned about making decisions under these conditions and draws on techniques from decision analysis, psychology, and risk analysis to assess the pros and cons of adopting a more structured approach to generating and evaluating genocide prevention alternatives. We highlight results from an initial experiment that compares an objectives-based decision-making approach to the more usual alternatives-focused process and conclude that more structured methods could provide decision makers with additional values-consistency, insight, reliability, efficiency, and defensibility.

W5-B.3 Greidinger, SJ; Predictive Health Solutions; [email protected] Quantifying Benefits for Government Medical Research Budgeting The future benefits of federal government investments in medical research have yet to be quantified in a way that would permit HHS, OMB, and other agencies to use decision analysis to set research budgets. Instead, these budgets are often set using the previous year as a baseline. Programs are frequently cut or increased in an across-the-board fashion, without calculating the number of statistical lives, DALYs, QALYs, or tax dollars that will be saved or lost later due to the resulting changes. This talk will present a benefit-cost framework that would permit federal healthcare administrators to analyze trade-offs between research investments in multiple diseases, between research on core disease processes and their effects, and between healthcare research and healthcare delivery. The talk will use atherosclerosis, the single greatest killer of US citizens, to demonstrate why studies applying the methodology should be fully pursued.
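A minimal sketch of the kind of trade-off calculation such a framework would enable, ranking programs by expected QALYs per dollar instead of adjusting a baseline across the board; the program names, costs, success probabilities, and QALY gains below are invented for illustration.

```python
# Rank hypothetical research programs by expected QALYs gained per dollar.
# All program names and numbers are invented placeholders.

programs = [
    # (name, annual cost $M, P(research success), QALYs gained if successful)
    ("Atherosclerosis core-process research", 120, 0.15, 2_500_000),
    ("Downstream disease-effect research",     80, 0.30,   600_000),
    ("Healthcare delivery research",           40, 0.50,   150_000),
]

ranked = sorted(
    programs,
    key=lambda p: p[2] * p[3] / (p[1] * 1e6),  # expected QALYs per dollar
    reverse=True,
)

for name, cost, p_success, qalys in ranked:
    eq_per_dollar = p_success * qalys / (cost * 1e6)
    print(f"{name}: {eq_per_dollar:.4f} expected QALYs per $")
```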

W2-F.3 Grieger, KD; RTI International; [email protected] A Qualitative Risk-Benefit Assessment for Nanomaterials in Food Decisions regarding the use and implementation of emerging technologies often involve balancing the potential benefits, risks, and uncertainties of new or novel materials, techniques, or processes. The use of engineered nanomaterials in food and food products is one case in which regulators, decision-makers, risk managers, and scientists must comprehensively assess and weigh the potential risks and benefits of novel materials. On one hand, nanomaterials may offer a number of potential benefits to consumers and/or industry, such as improved nutrition, taste, and stability. On the other hand, questions have been raised regarding the health and safety of nanomaterials, along with ethical concerns regarding transparency in the food supply. After providing an overview of the potential benefits, risks, and uncertainties of using engineered nanomaterials in foods, this presentation discusses a number of potential qualitative approaches and tools to weigh these risks and benefits. Recommendations will also be given on how best to use qualitative risk-benefit analysis approaches for nanomaterials in food. These discussions regarding how best to balance the risks and benefits of nanomaterials in food may be particularly relevant given the revised European food labeling requirements for nanomaterials, which are now in effect.

W4-F.3 Groth, KM*; Muna, AB; LaFleur, C; Sandia National Laboratories; [email protected] QRA in Codes and Standards Development: What Does Success Look Like? Over the past decade, there has been an international trend toward more flexible, yet defensible, codes and standards. Quantitative Risk Assessment (QRA) has become a valuable tool in this process. NFPA and SFPE provide guidance documents that establish a process for using QRA in the development of codes and standards, but neither organization prescribes specific risk assessment methods, tools, models, or data, since different tools and techniques may be appropriate in different contexts. QRA concepts have been integrated into aspects of two hydrogen codes/standards, NFPA 2 and ISO TR-19880, and QRA is expected to play an increasing role in future revisions of these documents. However, researchers have raised questions about gaps in the data, models, and tools available for QRA on hydrogen systems. In this talk, we explore the concept of “success” in the use of QRA to revise hydrogen codes and standards. What do developers need from QRA? What tools and information, at what level of detail, are necessary for making code decisions? What is the impact of gaps, uncertainties, and limitations on the consensus process?

P.159 Guidotti, TL; Chair, Panel on Wind Turbine Noise & Health, Council of Canadian Academies; [email protected] Wind Turbine Noise and Health: Findings of an Expert Panel Wind energy generates an increasing share of electricity in Canada, as elsewhere. In some communities where wind turbine installations are concentrated, concern over health effects from wind turbine noise has been vocal. Concern is primarily focused on health effects at frequencies below the threshold of hearing (“infrasound”). Health Canada charged the Council of Canadian Academies (CCA) with performing an objective assessment of the extant scientific literature. A panel of 10 experts was convened. First, the scientific literature, grey literature, publications of record, and legal filings were examined to develop a list of 32 candidate health outcomes of concern. Then, a systematic search for scientific literature on wind turbine noise and health outcomes yielded 38 relevant references. Deliberations on interpretation emphasized the pattern of evidence, the convergence of evidence from different sources, and the cumulative weight of evidence, rather than individual observations. The literature as a whole was evaluated for sufficiency of evidence to conclude that a causal association existed, guided by the Hill criteria. The Panel concluded that there was 1) sufficient evidence to conclude that a causal association exists between wind turbine noise and annoyance, defined as “a feeling of displeasure evoked by noise”, which is considered a health outcome under the WHO definition of “health”; 2) limited evidence for sleep disturbance; 3) adequate evidence against an association with hearing loss; 4) inadequate evidence of a direct causal relationship with stress, notwithstanding known relationships observed in community noise studies; and 5) inadequate evidence for all other health effects, including cardiovascular disease, tinnitus, and vertigo. The Panel also observed that although conventional (dBA) sound level measurements do not capture the full spectrum of sound generated by wind turbines, they serve as an acceptable surrogate measure for sound intensity.

P.34 Guo, M*; Buchanan, RL; Dubey, JP; Hill, DE; Gamble, HR; Jones, JL; Pradhan, AK; University of Maryland, College Park, MD; Agricultural Research Service, United States Department of Agriculture, Beltsville, MD; National Academy of Sciences, Washington, D.C.; Centers for Disease Control and Prevention, Atlanta, GA; [email protected] Development of the Dose-Response Relationship for Human Toxoplasma gondii Infection Associated with Meat Consumption Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meat. The dose-response relationship for human exposures to T. gondii-infected meat is not known because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, stage and genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal mathematical models, with model parameters estimated by maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. The exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. The confidence interval of the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. The mouse-derived models were validated against data on the dose-infection relationship in rats. A human dose-response model was developed as P(d) = 1 - exp(-0.0015 × 0.005 × d) or P(d) = 1 - (1 + d × 0.003/582.414)^(-1.479). Both models predict the human response after consuming T. gondii-infected meat and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
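The two fitted human dose-response models can be implemented directly from the expressions above; the parameter values come from the abstract, while the dose grid is an arbitrary illustration (dose units follow the underlying animal studies).

```python
import numpy as np

# The two human dose-response models quoted in the abstract. Parameter
# values (0.0015, 0.005, 0.003, 582.414, 1.479) are as stated there;
# the dose grid below is illustrative only.

def p_exponential(d, r=0.0015 * 0.005):
    """Exponential model: P(d) = 1 - exp(-r*d)."""
    return 1.0 - np.exp(-r * d)

def p_beta_poisson(d, alpha=1.479, beta=582.414, scale=0.003):
    """Beta-Poisson model: P(d) = 1 - (1 + scale*d/beta)^(-alpha)."""
    return 1.0 - (1.0 + scale * d / beta) ** (-alpha)

doses = np.array([1e2, 1e3, 1e4, 1e5])  # illustrative dose grid
for d in doses:
    print(f"dose {d:>9.0f}: exponential {p_exponential(d):.4f}, "
          f"beta-Poisson {p_beta_poisson(d):.4f}")
```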

P.131 Gutierrez, VV*; Cifuentes, LA; Universidad Diego Portales and Pontificia Universidad Católica de Chile; [email protected] Should Society Be Compensated for the Risks Imposed by Climate Change? Chile is one of the countries considerably affected by climate change. Even though Chile contributes close to 0.3% of the world’s greenhouse gas emissions, Chilean authorities are engaged in responding constructively to develop solutions and to adapt to the significant impacts of climate change. For this reason, Chile created its National Climate Change Action Plan, which brings together a number of public policies related to climate change and its adverse effects. However, to implement this plan successfully, authorities should take public opinion into account. This research advances the understanding of public perception of the risks imposed by climate change on society, the environment, and individuals. We explore how much people claim society, the environment, and individuals should be compensated for climate change impacts. We also relate the compensation demanded to classical variables used in the field of risk perception (such as perceived risk, public acceptability, and trust in regulating authorities). We used an online survey to poll a sample of the population of Santiago, Chile. A total of 525 subjects answered the survey. Data were analyzed using structural equation modeling procedures. Results show that the compensation demanded for the effects of climate change on the environment is higher than the compensation demanded for effects on society or an individual. Perceived risk is higher for impacts on the environment than for society or an individual. Acceptability of the risks of climate change and trust in the authorities in charge of managing it are low. Demanded compensation depends on perceived risk and also on trust. Implications for decision makers and public policies are discussed.

M3-A.1 Haber, LT*; Reichard, JF; Vincent, MJ; Allen, BC; Liska, DJ; Dourson, ML; TERA; BCA Associates; Biofortis; [email protected] Mode of Action and Meta-Regression Analysis of the Effect of Trans Fatty Acids (TFAs) on LDL-Cholesterol The Food and Drug Administration (FDA) published a Federal Register notice tentatively determining that partially hydrogenated oils (PHOs, the primary dietary source of industrial TFAs, or iTFAs) are no longer generally recognized as safe (GRAS) when used in food. This determination was based on the conclusion that there is no threshold intake for iTFAs that would not increase an individual's risk of coronary heart disease, based primarily on research showing that iTFAs increase LDL-cholesterol (LDL-C) at high levels of intake. A mode of action (MOA) analysis was conducted for the relationship between iTFA dose and LDL-C. The MOA evaluation identified two key events: (1) increased VLDL levels, and (2) decreased LDL receptor activity. However, unlike classical MOAs, these two key events occur in parallel rather than sequentially, presumably because fatty acids are nutrients regardless of structural configuration. The data were evaluated using the ILSI/IPCS MOA framework, based on the evolved Bradford-Hill criteria. The analysis concluded that the data are insufficient to rigorously test the key events against the evolved Bradford-Hill criteria for evaluation of MOA, but (1) several lines of evidence support a threshold, and (2) there are several nonlinearities in key determinants of the dose-response relationship, consistent with a threshold. We also conducted a meta-regression of the controlled clinical trial data. This analysis tested a range of flexible curves to provide an overall sense of the dose-response relationship. It found that the linear model is not acceptable, while the nonlinear Hill model fits the data well. Doses (in the form of change in TFA intake) that result in specified levels of increased LDL-C are identified. The analysis shows that applying risk assessment techniques to a macronutrient can aid understanding and offer new perspectives. This work was sponsored by the ILSI North America PHO Task Force.
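As an illustration of the meta-regression's curve-fitting step, the sketch below fits a Hill-type dose-response model to synthetic data and inverts it to find the dose producing a specified LDL-C increase. The data points, starting values, and bounds are invented; the study's actual meta-regression would also weight trials by precision and account for between-study heterogeneity.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a Hill-type dose-response curve to synthetic (invented) data.
# These are NOT the clinical trial data analyzed in the study.

def hill(dose, emax, ed50, n):
    """Hill model: response = emax * dose^n / (ed50^n + dose^n)."""
    return emax * dose**n / (ed50**n + dose**n)

# Hypothetical (change in iTFA intake, change in LDL-C) pairs
dose = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])   # % energy from iTFA
ldl  = np.array([0.5, 1.5, 4.0, 8.5, 10.5, 11.5]) # mg/dL increase

params, _ = curve_fit(hill, dose, ldl, p0=[12.0, 3.0, 2.0],
                      bounds=(0, np.inf))
emax, ed50, n = params
print(f"Emax={emax:.1f} mg/dL, ED50={ed50:.2f} %energy, Hill n={n:.2f}")

# Dose producing a specified LDL-C increase (e.g., 2 mg/dL), by inversion
target = 2.0
d_at_target = ed50 * (target / (emax - target)) ** (1.0 / n)
print(f"Dose for a {target} mg/dL increase: {d_at_target:.2f} %energy")
```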

W3-A.2 Hafstead, MA*; Lutter, R; Ruhm, C; Resources for the Future; [email protected] Estimating the Social Benefits of Improvements in Cognitive Test Results in the U.S. Improved IQ can directly improve both earnings and the likelihood of employment, while also indirectly impacting both through schooling. In 2008, as part of a rule related to the regulation of lead, the EPA estimated that the present value of lifetime earnings from an additional IQ point is $607,000. That estimate relied heavily on Salkever (1995). Since it was released, Robinson (2013) has argued that the original Salkever (1995) study overstates the true impact, while Salkever (2014) has counter-argued that his previous estimates may actually understate it. This paper seeks to refine and improve estimates of the effect of an additional IQ point on lifetime earnings. We begin with the Lutter and Ruhm (2015) estimates of the effects on labor earnings, which update the original estimates of Salkever (1995). From there, we incorporate both the increased value of taxes and benefits received and improvements in non-market work associated with cognitive performance that were not included in the previous EPA analysis. We then reexamine assumptions used in the EPA study related to the long-run growth in future wages and the discount rate used to discount future labor market earnings. Finally, we conduct a quantitative uncertainty analysis to construct an empirically based confidence interval around our estimate of the impact of improved cognitive performance on lifetime earnings.
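The sensitivity of the present-value estimate to wage growth and the discount rate can be illustrated with a stylized calculation; the annual earnings gain, working ages, and rates below are placeholders, not the paper's or EPA's figures.

```python
# Stylized present-value calculation: discount a lifetime stream of
# IQ-related earnings gains under assumed wage growth and discount rates.
# All numbers are illustrative placeholders.

def pv_iq_point(gain_per_year, start_age, retire_age, eval_age,
                wage_growth, discount_rate):
    """Present value (at eval_age) of annual earnings gains from one IQ
    point, growing at wage_growth and discounted at discount_rate."""
    pv = 0.0
    for age in range(start_age, retire_age):
        gain = gain_per_year * (1 + wage_growth) ** (age - start_age)
        pv += gain / (1 + discount_rate) ** (age - eval_age)
    return pv

# Hypothetical: $1,100/yr gain, working ages 20-66, valued at birth
for r in (0.03, 0.07):
    pv = pv_iq_point(1_100, 20, 67, 0, wage_growth=0.01, discount_rate=r)
    print(f"discount rate {r:.0%}: PV per IQ point = ${pv:,.0f}")
```

The large gap between the two printed values is the point: with gains concentrated decades in the future, the discount rate assumption drives much of the headline estimate.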

W5-I.3 Hall, DC*; Le, QB; University of Calgary; [email protected] Mitigation of Emerging Infectious Diseases on Small-Scale Livestock Farms in Vietnam Our research examined the relationships among water quality, public health, small-scale integrated farming, limited biosecurity, and mitigation of emerging infectious diseases (EIDs) in Vietnam. We collected data from 600 farms in North and South Vietnam (Thai Binh and An Giang provinces) using questionnaires, semi-structured interviews, and water quality testing (E. coli, turbidity, and pH). Water samples were collected from participants' wells or rainwater cisterns and analyzed in government laboratories using WHO standardized methods. We also used probit analysis to investigate the association of demographic variables with E. coli levels in drinking water and with EID mitigation strategies. The typical participant was a 45-year-old married individual with two children, six or seven years of formal education, low income (c. $1,200 p.a.), and nine years of farming experience. Farmers raised fish, poultry, a few pigs or cattle, and some crops (e.g., rice). Most participants had basic awareness of avian influenza prevention but very limited knowledge of water-borne diseases such as those caused by E. coli. Respondents were predominantly male (71%). More than 90% of participants claimed they boiled and/or filtered the water they used for drinking (rain or well water). Water test results revealed that more than 80% of samples contained unacceptable levels of E. coli (10 to several thousand CFU). Probit analysis revealed significant associations of demographic variables with E. coli levels in drinking water, as well as with the likelihood of farmers engaging in mitigating strategies. Significant predictive independent variables included age (p
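A minimal sketch of the kind of probit model described, using simulated stand-in data (via statsmodels) rather than the actual 600-farm survey; the variable names, simulated relationships, and threshold are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Probit sketch: probability that a farm's drinking water exceeds an
# E. coli threshold as a function of household demographics. The data
# below are simulated stand-ins, not the survey data.

rng = np.random.default_rng(0)
n = 600
age = rng.normal(45, 10, n)         # years
education = rng.normal(6.5, 2, n)   # years of schooling
experience = rng.normal(9, 4, n)    # years farming

# Simulate exceedance with an assumed latent relationship
latent = (0.02 * age - 0.10 * education - 0.03 * experience
          + rng.normal(0, 1, n))
exceeds = (latent > np.median(latent)).astype(int)

X = sm.add_constant(np.column_stack([age, education, experience]))
model = sm.Probit(exceeds, X).fit(disp=False)
print(model.summary(xname=["const", "age", "education", "experience"]))
```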