Baghersad, M. and C.W. Zobel. "Economic impact of production bottlenecks caused by disasters impacting interdependent industry sectors," International Journal of Production Economics, 168, 2015, 71-80.

Abstract: This paper provides a new linear programming model, based on Leontief's input–output model, to investigate the economic consequences of production capacity bottlenecks caused by disasters. An important contribution of the paper is the incorporation of industry sectors' preferences in allocating limited products/services between final domestic demand, foreign final demand, and intermediate industries. This provides support for estimating some of the indirect economic impacts of disasters. The paper also considers recovery operations within disrupted sectors, from the standpoint of evaluating the performance of the economy during the transition period after a disaster. The models are implemented to investigate the economic consequences of electricity sector disruption in Singapore and, finally, computational results are reported.

Keywords: Disaster impacts; Interdependent industry sectors; Input–output analysis; Inoperability; Resilience
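The Leontief accounting that the paper's LP model builds on can be sketched in a few lines. The code below shows only the textbook input–output balance, not the authors' allocation model, and the two-sector economy is hypothetical:

```python
# Illustrative Leontief input-output balance underlying inoperability-style
# disaster models (the textbook identity, not the paper's LP formulation).
# Sector j consumes A[i][j] units of sector i's output per unit produced,
# so gross output x must satisfy x = A x + d for final demand d.

def gross_output(A, d, iters=200):
    """Solve x = A x + d by fixed-point iteration; converges when the
    spectral radius of A is below 1, as in a productive economy."""
    n = len(d)
    x = d[:]
    for _ in range(iters):
        x = [d[i] + sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    return x

# Hypothetical 2-sector economy: 0 = electricity, 1 = manufacturing.
A = [[0.1, 0.3],
     [0.2, 0.2]]
d = [50.0, 100.0]   # final demand

x = gross_output(A, d)
# If a disaster caps electricity output below the required x[0], final
# demand cannot be met in full; the gap is a first indication of the
# indirect impacts the paper's model allocates among competing demands.
cap = 80.0
shortfall = max(0.0, x[0] - cap)
print([round(v, 1) for v in x], round(shortfall, 1))
```

The paper's contribution is precisely what this sketch omits: deciding how a capacity-constrained sector rations its limited output among domestic final demand, exports, and intermediate industries.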


Zobel, C.W. and L.Z. Khansa. "Characterizing multi-event disaster resilience," Computers & Operations Research - special issue on MCDM in Emergency Management, 42, 2014, 83-94.

Abstract: This paper presents an approach for providing a quantitative measure of resilience in the presence of multiple related disaster events. It extends the concepts of the resilience triangle and predicted disaster resilience by considering the tradeoffs between multiple criteria for each individual sub-event, as well as for an entire multi-event situation. The focus of the research is on sudden-onset disasters, and on the initial impact of each sub-event as well as the amount of time available to work towards recovery of the system before the next sub-event occurs. A mathematical model is developed for the new resilience measure, along with an approach for graphically representing the relationships between the different criteria. An example is then provided of using the new approach to compare the relative resilience of different scenarios under a representative multi-event disaster situation. The results demonstrate that characterizing multi-event resilience analytically can ultimately provide a great depth of information and thus support better disaster planning and mitigation.

Keywords: Predicted resilience; Disaster operations management; Quantitative modeling; Sudden-onset disasters


Chacko, J.C., L.P. Rees, and C.W. Zobel, "Improving Resource Allocation for Disaster Operations Management in a Multi-Hazard Context," Proceedings of the 11th International ISCRAM Conference (University Park, PA: May, 2014), pp. 83-87.

Abstract: The initial impact of a disaster can lead to a variety of associated hazards. By taking a multi-hazard viewpoint with respect to disaster response and recovery, there is an opportunity to allocate limited resources more effectively, particularly in the context of long-term planning for community sustainability. This working paper introduces an approach for extending quantitative resource allocation models to consider multiple interrelated hazards. The discussion is motivated by a literature review of existing models and then focuses on changes necessary to take the multiplicity of hazards into consideration in the context of decision support systems for disaster operations management.
Keywords: Multi-hazard events; long-term planning; decision support systems; disaster operations management


Pant, R., K. Barker, and C.W. Zobel, "Static and dynamic metrics of economic resilience for interdependent infrastructure and industry sectors," Reliability Engineering & System Safety, 125(1), 2014, 92-102. Available on-line at: http://dx.doi.org/10.1016/j.ress.2013.09.007

Abstract: Infrastructures are needed for maintaining the functionality and stability of society, while being put under substantial stresses from natural or man-made shocks. Since such shocks cannot be avoided entirely, increased focus is given to infrastructure resilience, which denotes the ability to recover and operate under new stable regimes. This paper addresses the problem of estimating, quantifying and planning for the economic resilience of interdependent infrastructures, where interconnectedness adds to problem complexity. The risk-based economic input-output model enterprise, a useful tool for measuring the cascading effects of interdependent failures, is employed to introduce a framework for economic resilience estimation. We propose static and dynamic measures of resilience that conform to the well-known resilience concepts of robustness, rapidity, redundancy, and resourcefulness. The quantitative metrics proposed here (static resilience metric, time-averaged level of operability, maximum loss of functionality, time to recovery) guide a preparedness decision-making framework to promote interdependent economic resilience estimation. Using these metrics, we introduce new multi-dimensional resilience functions that allow multiple resource allocation scenarios. Through an example problem we demonstrate the usefulness of these functions in guiding resource planning for building resilience.
Keywords: Resilience; Interdependent infrastructures; Inoperability
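The static metrics named in the abstract can be read directly off a recovery trajectory. The definitions below are simplified stand-ins for the paper's formal ones, applied to a hypothetical operability series:

```python
# q[t] is a sector's operability over time, 1.0 = fully functional.
# Simplified versions of three metrics named in the abstract.

def resilience_metrics(q, full=0.999):
    max_loss = 1.0 - min(q)               # maximum loss of functionality
    avg_operability = sum(q) / len(q)     # time-averaged level of operability
    # time to recovery: first post-impact period at (or near) full operability
    t_recover = next((t for t, v in enumerate(q) if t > 0 and v >= full), None)
    return max_loss, avg_operability, t_recover

q = [1.0, 0.4, 0.55, 0.7, 0.85, 0.95, 1.0, 1.0]   # hypothetical trajectory
print(resilience_metrics(q))
```

Comparing these numbers across resource-allocation scenarios is the kind of preparedness analysis the paper's framework formalizes.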


Zobel, C.W., "Quantitatively Representing Non-linear Disaster Recovery," Decision Sciences, 45(6), 2014, 1053–1082.

Abstract: This paper provides a new technique for quantitatively characterizing the progress of recovery operations in the aftermath of a disaster event. The approach extends previous research on measuring dynamic or adaptive disaster resilience by developing a robust quantitative approach for characterizing nonlinear disaster recovery. In doing so, it enables a more accurate mathematical representation of different categories of recovery behavior and provides significant support for a much broader application of existing theory. Because the new approach inherits the ability to compare the relative behavior of multiple scenarios simultaneously, it also can serve as the basis for analytically comparing the expected performance of different plans for recovery operations. Practical application of the technique is demonstrated and discussed in the context of recovering electrical power after Hurricane Sandy struck the New York metropolitan area.
Keywords: disaster recovery, disaster operations management, resilience, hurricane
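The underlying idea, resilience as one minus the normalized area of lost functionality, can be sketched numerically. The formulation below is an illustrative reading of the resilience-function approach, not the paper's exact model, with made-up parameters:

```python
# Illustrative resilience calculation: given a loss function L(t) over a
# planning horizon T_star, resilience = 1 - (integral of L)/T_star.
# For linear recovery this reduces to the familiar resilience-triangle
# value; a nonlinear L(t) captures the recovery shapes the paper studies.

def resilience(loss, X, T, T_star, steps=10000):
    """loss(t, X, T): remaining loss at time t; trapezoidal integration."""
    h = T / steps
    area = sum((loss(i * h, X, T) + loss((i + 1) * h, X, T)) / 2 * h
               for i in range(steps))
    return 1.0 - area / T_star

linear = lambda t, X, T: X * (1 - t / T)         # triangle: area = X*T/2
convex = lambda t, X, T: X * (1 - t / T) ** 2    # fast early recovery

X, T, T_star = 0.5, 4.0, 10.0    # hypothetical impact, recovery time, horizon
print(round(resilience(linear, X, T, T_star), 4))
print(round(resilience(convex, X, T, T_star), 4))
```

The convex curve scores higher than the linear one for the same impact and recovery time, which is exactly the distinction that a purely triangular model cannot express.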


Dottore, M.L. and C.W. Zobel, "Analyzing Economic Indicators of Disaster Resilience Following Hurricane Katrina," International Journal of Business Analytics, 1(1), 2014, 67-83.

Abstract: Many different metrics have been developed to measure the capacity for resilience to a disaster event. In order to track the dynamic response of a community in the aftermath of a disaster, however, it is necessary to consider measures that vary over time and for which data points are actually available on a relatively frequent basis. Unemployment, construction GDP, leisure and hospitality GDP, manufacturing GDP and information and communication technology GDP are all examples of measures which provide the opportunity to quantitatively assess the relative rate and extent of community recovery at regular time intervals following a disaster. By quantitatively analyzing the relative amount of resilience exhibited by a community we may gain better insight into its ability to recover, and thus develop a better understanding of the factors that allow it to return to normal levels of activity. We apply our analytical approach to compare the communities of New Orleans, Louisiana and Gulfport, Mississippi in the context of Hurricane Katrina.
Keywords: disaster resilience, economic indicators, time series, quantitative modeling


Melnyk, S., C.W. Zobel, S. Griffis, and J. MacDonald, "Making Sense of Transient Responses in Simulation Studies: An Approach for Interpreting Transient Response Time Series Data," accepted for publication in the International Journal of Production Research, May 2013.

Abstract: Traditional simulation modeling focuses upon the analysis of steady state data. This focus may not be appropriate, however, for the study of transient responses - data reflecting some form of disruption or change in the system norms. Transient responses are often encountered when dealing with new product introductions, changes in production systems, or supply chain disruptions. In these situations, it is the transient response itself - how the system responds to these changes, together with the tactics and strategies used to deal with them - that tends to be of the greatest interest. Unfortunately, current approaches for analyzing such responses are limited. This paper introduces a new approach for analyzing transient responses - one that merges outlier detection, a time series analysis tool, with simulation modeling. This combined approach allows the researcher to identify those factors that have the greatest impact upon operations during these transient conditions. Using a simulated supply chain disruption to illustrate the potential of the approach, it is shown that the new approach expands the applicability of simulation and enables certain types of problems to be investigated with a confidence not previously possible.

Keywords: Time Series Analysis, Outlier Detection, Quantitative Modeling, Simulation, Supply Chain Disruptions, Transient Data Analysis
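A minimal sketch of the core idea: treat simulation output as a time series, estimate the steady-state baseline, and use outlier detection to delimit the transient period. The 3-sigma control-limit rule and fill-rate data below are simple stand-ins for the time-series outlier tests the paper combines with simulation:

```python
# Flag the transient (disrupted) period in a simulated output series by
# comparing each point to control limits fit on a pre-disruption window.
import statistics

def transient_period(series, baseline_len, k=3.0):
    base = series[:baseline_len]
    mu, sigma = statistics.mean(base), statistics.stdev(base)
    flagged = [t for t, v in enumerate(series) if abs(v - mu) > k * sigma]
    if not flagged:
        return None
    return flagged[0], flagged[-1]   # onset and last out-of-control point

# Hypothetical fill-rate series: steady near 0.95, disruption at t = 10.
series = [0.95, 0.96, 0.94, 0.95, 0.96, 0.95, 0.94, 0.95, 0.96, 0.95,
          0.60, 0.55, 0.70, 0.82, 0.90, 0.94, 0.95, 0.96, 0.95, 0.94]
print(transient_period(series, baseline_len=10))
```

Running this across designed experiments, rather than on a single series, is what lets the paper attribute transient behavior to specific factors.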


Seref, O. and C.W. Zobel, "Recursive Voids for Identifying a Nonconvex Boundary of a Set of Points in the Plane," Pattern Recognition, 46(12), 2013, 3288-3299.

Abstract: We introduce a method that identifies the boundary of a non-convex shape of a set of points in the plane. The boundary of the shape is explored through finding empty regions recursively within a shell that encapsulates all of the points. Our algorithm is output sensitive and runs in O(ℓn) time, where the output parameter ℓ is proportional to the length of the non-convex boundary. The recursive nature of our algorithm allows a tree structure that characterizes the empty regions and, by complementarity, the non-convex shape itself. We use a distance measure based on the lowest common ancestor of a pair of nodes in this tree and define the complexity of a shape as the average of the distances between all pairs. We present computational results on data size, threshold, shape complexity and noise for a set of different non-convex shapes.

Keywords: Non-convex boundary, linear time complexity, output sensitivity, computational geometry, lowest common ancestor


Zobel, C.W. "Analytically comparing disaster recovery following the 2012 derecho," Proceedings of the 10th International ISCRAM Conference (Baden-Baden, Germany: May, 2013)

Abstract: This work-in-progress paper discusses analytically characterizing nonlinear recovery behavior in the context of the derecho windstorm that struck the mid-Atlantic United States in the summer of 2012. The focus is on the recovery efforts of the Appalachian Power Company, and the discussion includes a look at the need for communicating the progress of such recovery efforts to the public. Publicly available recovery data is analyzed and compared with respect to the relative behaviors exhibited by two different nonlinear recovery processes, and some of the implications for understanding the efficiency of different disaster recovery operations are discussed.

Keywords: Disaster recovery, quantitative modeling, recovery behavior, derecho


Falasca, M. and C.W. Zobel. "An Optimization Model for Volunteer Assignments in Humanitarian Organizations," Socio-Economic Planning Sciences - Special Issue: Disaster Planning and Logistics: Part 2, 46(4), 2012, 250-260.

Abstract: One of the challenges facing humanitarian organizations is that there exist limited decision technologies that are tailored specifically to their needs. While employee workforce management models have been the topic of extensive research over the past decades, very little work has yet concentrated on the problem of managing volunteers for humanitarian organizations. This paper develops a multi-criteria optimization model to assist in the assignment of volunteers to tasks, based upon a series of principles from the field of volunteer management. In particular, it offers a new volunteer management approach for incorporating the decision maker's preferences and knowledge into the volunteer assignment process, thus allowing him or her to closely examine the tradeoffs between potentially conflicting objectives. Test results illustrate the model's ability to capture these tradeoffs and represent the imprecision inherent in the work of humanitarian organizations, and thus demonstrate its ability to support efficient and effective volunteer management.

Keywords: Humanitarian Logistics, Multi-criteria Decision Making, Volunteer Management, Optimization
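The tradeoff the model captures can be illustrated on a toy instance: assign volunteers to tasks while weighing organizational cost against volunteers' preferences. Brute force over permutations stands in for the paper's multi-criteria optimization model, and all names and scores are hypothetical:

```python
# Weighted bi-criteria assignment: minimize w*cost + (1-w)*dislike.
from itertools import permutations

volunteers = ["Ana", "Ben", "Cara"]
tasks = ["intake", "logistics", "outreach"]
cost = {"Ana":  {"intake": 1, "logistics": 5, "outreach": 5},
        "Ben":  {"intake": 5, "logistics": 1, "outreach": 5},
        "Cara": {"intake": 5, "logistics": 5, "outreach": 1}}
dislike = {"Ana":  {"intake": 5, "logistics": 1, "outreach": 3},
           "Ben":  {"intake": 1, "logistics": 5, "outreach": 3},
           "Cara": {"intake": 3, "logistics": 3, "outreach": 1}}

def best_assignment(w):
    """Minimize w*cost + (1-w)*dislike over all one-to-one assignments."""
    best = min(permutations(tasks),
               key=lambda p: sum(w * cost[v][t] + (1 - w) * dislike[v][t]
                                 for v, t in zip(volunteers, p)))
    return dict(zip(volunteers, best))

print(best_assignment(0.9))   # cost-driven plan
print(best_assignment(0.1))   # preference-driven plan
```

Sweeping the weight between the two extremes traces out exactly the kind of tradeoff between conflicting objectives that the abstract describes.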


Bosch, D., J. Pease, M.L. Wolfe, C.W. Zobel, J. Osorio, T. Denckla Cobb, and G. Evanylo, "Community DECISIONS: Stakeholder Focused Watershed Planning," Journal of Environmental Management, 112, 2012, 226-232.

Abstract: Successful watershed planning can be enhanced by stakeholder involvement in developing and implementing plans that reflect community goals and resource limitations. Community DECISIONS (Community Decision Support for Integrated, On-the-ground Nutrient Reduction Strategies) is a structured decision process to help stakeholders evaluate strategies that reduce watershed nutrient imbalances. A nutrient accounting algorithm and nutrient treatment database provide information on nutrient loadings and costs of alternative strategies to reduce loadings. Stakeholders were asked to formulate goals for the North Fork Shenandoah River watershed in Virginia and select among strategies to achieve those goals. The Vector Analytic Hierarchy Process was used to rank strategies. Stakeholders preferred a Maximum strategy that included point source upgrades, riparian buffers, no-till corn silage, wheat cover, and bioretention filters in developed areas. Participants generally agreed that the process helped improve communication among stakeholders, was helpful for watershed planning, and should be used for TMDL (Total Maximum Daily Load) planning. Participants suggested more attention be paid to ensuring that all relevant issues are addressed and all information needed to make decisions is available. Watershed planning should provide stakeholders with clear scientific information about physical and socioeconomic processes. However, planning processes must give stakeholders adequate time to consider issues that may not have been addressed by existing scientific models and datasets.

Keywords: stakeholder, analytical hierarchy process, watershed model, nutrient loading, cost


Arnette, A.N. and C.W. Zobel. "An Optimization Model for Regional Renewable Energy Development," Renewable & Sustainable Energy Reviews, 16(7), 2012, 4606-4615.

Abstract: This research effort details the modeling component of a comprehensive decision support system for energy planning that allows for combining existing electricity generating capabilities with increased use of renewable energy sources. It focuses on energy planning at the regional level, and it is illustrated by applying it to the greater southern Appalachian mountains of the eastern United States: a region that was chosen for analysis not only due to its heavy dependence on coal for electricity, but also because of its potential for increased use of wind and solar power. The paper specifically discusses the development of a multi-objective linear programming (MOLP) model that can be used to determine the optimal mix of renewable energy sources and existing fossil fuel facilities on a regional basis. This model allows a decision maker to balance annual generation costs against the corresponding greenhouse gas emissions, and it provides significant support for implementing a variety of different policy analyses.

Keywords: Renewable Energy, Multi-Objective Linear Programming, Optimization
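The cost-versus-emissions tradeoff the MOLP model explores can be sketched on a single decision variable: the renewable share of generation. The figures below are made up, and the real model works over many sources, sites, and constraints:

```python
# Enumerate renewable shares, compute annual cost and CO2 for each,
# and keep the non-dominated (Pareto-optimal) options.

COAL_COST, COAL_CO2 = 40.0, 1.0    # $/MWh, tCO2/MWh (illustrative)
REN_COST, REN_CO2 = 70.0, 0.0
DEMAND = 1000.0                    # MWh

options = []
for pct in range(0, 101, 10):      # renewable share in percent
    r = pct / 100.0
    annual_cost = DEMAND * (r * REN_COST + (1 - r) * COAL_COST)
    co2 = DEMAND * (1 - r) * COAL_CO2
    options.append((pct, annual_cost, co2))

# Pareto filter: drop any option beaten on both objectives by another.
pareto = [o for o in options
          if not any(p[1] <= o[1] and p[2] <= o[2] and p != o
                     for p in options)]
for pct, annual_cost, co2 in pareto:
    print(f"{pct:3d}% renewable: ${annual_cost:,.0f}, {co2:,.0f} tCO2")
```

Presenting the decision maker with this frontier, rather than a single "optimal" answer, is the point of the multi-objective formulation.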


Khansa, L.Z., C.W. Zobel, and G. Goicochea. "Creating a Taxonomy for Mobile Commerce Innovations using Social Network and Cluster Analyses," International Journal of Electronic Commerce - special issue on M-commerce, Summer 2012, 16(4), 19-52.

Abstract: Increasing numbers of people are spending time focused on 'the third screen' of a mobile device. Through ubiquitous connectivity, personalization, and affordability, such mobile devices have become much more than just entertainment handsets. In particular, e-commerce has harnessed the power of wireless computing to expand to mobile commerce (m-commerce), thus providing consumers with commercial services on the go. Because such services are often driven by customer input, it is important to consider the relevance of consumers to the development of new service offerings. We therefore dissect innovations in m-commerce by conducting a textual analysis of all filed m-commerce patent applications (over 2,300 in total). By using social network analysis and cluster analysis, we subsequently capture the focal innovation areas in m-commerce and develop a corresponding taxonomy of these innovations. The results clearly illustrate the importance of consumer empowerment and co-creation in the context of m-commerce innovations.

Keywords: Mobile commerce, innovation, empowerment, co-creation, taxonomy, social network analysis, cluster analysis


Zobel, C.W., S. Melnyk, S. Griffis, and J. MacDonald. "Characterizing Disaster Resistance and Recovery using Outlier Detection," ISCRAM 2012 - Integrative and Analytical Approaches to Crisis Response and Emergency Management Information Systems (Vancouver, BC: April 2012).

Abstract: Most definitions of disaster resilience incorporate both the capacity to resist the initial impact of a disaster and the ability to recover after it occurs. Being able to characterize and analyze resilient behavior can lead to improved understanding not only of the capabilities of a given system, but also of the effectiveness of different strategies for improving its resiliency. This paper presents an approach for quantifying the transient behavior resulting from a disaster event in a way that allows researchers to not only describe the transient response but also assess the impact of various factors (both main and interaction effects) on this response. This new approach combines simulation modeling, time series analysis, and statistical outlier detection to differentiate between disaster resistance and disaster recovery. Following the introduction of the approach, the paper provides a preliminary look at its relationship to the existing concept of predicted disaster resilience.

Keywords: Simulation, Time Series Analysis, Outlier Detection, Predicted Resilience


Zobel, C.W. and L.Z. Khansa. "Quantifying Cyberinfrastructure Resilience against Multi-event Attacks," Decision Sciences, 43 (4), 2012, 687-710.

Abstract: This paper introduces a general approach for characterizing cyberinfrastructure resilience in the face of multiple malicious cyberattacks, such as when a sequence of denial-of-service attacks progressively target an already weakened information system. Although loss assessment frequently focuses on a single overall measure such as cost or downtime, the proposed technique considers both the timing and the amount of loss associated with each individual attack, as well as whether this loss is incurred suddenly or is "slow-onset." In support of this, an underlying mathematical model is developed to represent the relative impact of each attack and the corresponding length of time that its effects persist within the system, as well as to illustrate the tradeoffs between these two factors. The model is extended to represent uncertainty in its parameters and thus to support comparative analyses among various security configurations with respect to a baseline estimate of resilience. Monte Carlo simulation is then used to illustrate the model's capabilities and to support a discussion of its ability to provide for more effective decision making in the context of disaster planning and mitigation.

Keywords: Information infrastructure, Disaster resilience, Multi-event resilience, Network security, Slow-onset disasters
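A sketch in the spirit of the paper's Monte Carlo analysis: sample uncertain impact and persistence for a sequence of attacks, score each run with a simple "one minus normalized lost capability" measure, and compare a baseline against a hardened configuration. The distributions and the resilience formula here are illustrative, not the paper's exact model:

```python
# Monte Carlo comparison of two security configurations under a
# sequence of attacks with uncertain impact and persistence.
import random

def mean_resilience(impact_range, persist_range, n_attacks=3,
                    horizon=30.0, runs=10000, seed=1):
    rng = random.Random(seed)
    scores = []
    for _ in range(runs):
        lost = 0.0
        for _ in range(n_attacks):
            x = rng.uniform(*impact_range)    # fraction of capability lost
            t = rng.uniform(*persist_range)   # how long the loss persists
            lost += x * t                     # area of lost capability
        scores.append(max(0.0, 1.0 - lost / horizon))
    return sum(scores) / len(scores)

baseline = mean_resilience(impact_range=(0.2, 0.6), persist_range=(2, 10))
hardened = mean_resilience(impact_range=(0.1, 0.3), persist_range=(1, 5))
print(round(baseline, 3), round(hardened, 3))
```

Comparing whole distributions of such scores, rather than point estimates, is what supports the paper's analysis of alternative security configurations.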


Arnette, A.N. and C.W. Zobel. "The Role of Public Policy in Optimizing Renewable Energy Development in the Greater Southern Appalachian Mountains," Renewable & Sustainable Energy Reviews, 15(8), 2011, 3690-3702.

Abstract: This research presents a third component of a comprehensive decision support system for energy planning that allows for combining existing electricity generating capabilities with increased use of renewable energy sources. It focuses on energy planning at the regional level, and concentrates specifically on the greater southern Appalachian mountains of the eastern United States: a region that was chosen for analysis not only due to its heavy dependence on coal for electricity, but also because of its potential for increased use of wind and solar power. Previous research used a geographic information system (GIS) model for identifying renewable energy potential to provide input data for a multi-objective linear programming (MOLP) model to determine the optimal constrained mix of renewable energy sources and existing fossil fuel facilities by balancing annual generation costs against the corresponding greenhouse gas emissions. This new component of the system analyzes three potential public policies - renewable portfolio standard, carbon tax, and renewable energy production tax credit - that have been used to foster increased renewable energy usage. These policies require minor modifications to the MOLP model for implementation. The results of these policy cases were then analyzed to determine the impact that these policies have on generation cost and pollution emissions within the region.

Keywords: Renewable energy; public policy; optimization


Falasca, M. and C.W. Zobel "A Two-Stage Procurement Model for Humanitarian Relief Supply Chains," Journal of Humanitarian Logistics and Supply Chain Management, 1(2), 2011, 151-169.

Abstract: The purpose of this paper is to discuss and to help address the need for quantitative models to support and improve procurement in the context of humanitarian relief efforts. It presents a two-stage stochastic decision model with recourse for procurement in humanitarian relief supply chains, and compares its effectiveness on an illustrative example with respect to a standard solution approach. Results show the ability of the new model to capture and model both the procurement process and the uncertainty inherent in a disaster relief situation, in support of more efficient and effective procurement plans. Despite the prevalence of procurement expenditures in humanitarian efforts, procurement in humanitarian contexts is a topic that previously has only been discussed in a qualitative manner in the literature. This work provides practitioners with a new approach to quantitatively assess and improve their procurement decision processes, and it adds to the existing literature by demonstrating the applicability and effectiveness of an analytic modeling technique based on uncertainty, such as stochastic programming with recourse, in the context of humanitarian relief procurement activities.

Keywords: Humanitarian Logistics, Procurement, Decision Modeling, Optimization, Stochastic Programming
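The two-stage-with-recourse idea can be reduced to a minimal sketch: commit a pre-disaster order q now (stage 1), then buy emergency supplies at a premium for whatever demand exceeds q in each scenario (stage 2 recourse). A grid search over q stands in for the stochastic program, and all costs and scenarios are hypothetical:

```python
# Two-stage procurement with recourse on a toy scenario tree:
# minimize advance cost plus expected emergency-purchase cost.

ADVANCE_COST = 1.0      # per unit bought before the disaster
EMERGENCY_COST = 3.0    # per unit bought after demand is revealed
scenarios = [(0.3, 100), (0.5, 250), (0.2, 600)]   # (probability, demand)

def expected_cost(q):
    recourse = sum(p * EMERGENCY_COST * max(0, d - q) for p, d in scenarios)
    return ADVANCE_COST * q + recourse

q_best = min(range(0, 701, 10), key=expected_cost)
print(q_best, round(expected_cost(q_best), 1))
```

Note that the optimal commitment covers the two likelier scenarios but deliberately leaves the extreme one to recourse, which is the hedging behavior a deterministic single-scenario model cannot produce.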


Zobel, C.W. "Representing the Multi-Dimensional Nature of Disaster Resilience," ISCRAM 2011 - From Early-Warning Systems to Preparedness and Training (Lisbon, Portugal: May 2011).

Abstract: Although quantitative analytical information systems are an important resource for supporting decision-making in disaster operations management, not all aspects of a disaster situation can be easily quantified. For example, although the concept of the disaster resilience of a community has a technical dimension within which one can measure the resistance of the infrastructure against, and the speed of its recovery from, a disaster event, it also has social, organizational, and economic dimensions within which these characteristics may be more difficult to measure. This work-in-progress paper introduces a quantitative framework within which the multi-dimensional nature of such disaster resilience can be represented in a concise manner. This can help to improve understanding of the complexities associated with the concept, and thus directly support decision-making in disaster operations planning and management.

Keywords: Community resilience, visualization, social systems, organizational systems, analytical information systems


Arnette, A.N. and C.W. Zobel. "Spatial Analysis of Renewable Energy Potential in the Greater Southern Appalachian Mountains," Renewable Energy, 36(11), 2011, 2785-2798.

Abstract: This research discusses the implementation of a geographic information system (GIS) for the simultaneous discovery of multiple types of renewable energy sources. In particular, the GIS model analyzes wind, solar, and biomass potential within the greater southern Appalachian region, an area which is currently very heavily dependent on coal for electricity generation. The location and availability of biomass is specifically considered in the context of potential co-fire generation within existing coal plants, while the availability of wind and solar power are based on both resource strength and the geographic, topographic, and regulatory constraints that provide limits on their use. The model determines potential wind and solar sites within the region, and potential biomass co-fire locations, based on these input constraints and it calculates the cost and generation characteristics for each site.

Keywords: Geographic information systems, wind energy, solar energy, biomass energy, constraints


Zobel, C.W. and D.F. Cook "Evaluation of neural network variable influence measures for process control," to be published in Engineering Applications of Artificial Intelligence, 2011.

Abstract: Decision-making frequently involves identifying how to change input parameters in a given process in order to effect a directed change in the process output. Artificial neural networks have been used extensively to model business and manufacturing processes and there are several existing neural network-based influence measures that allow a decision-maker to assess the relative impact of each variable on process performance. The purpose of this paper is to review those neural network-based measures of variable influence, and to identify the combination of those measures that results in a comprehensive approach to characterizing variable influence within a trained neural network model. We then demonstrate how this comprehensive approach can be used as a tool to guide decision makers in dynamic process control.

Keywords: Neural networks, dynamic control, influence measures, process control, variable influence measures
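One widely used connection-weights influence measure (a Garson-style calculation) for a single-hidden-layer network can be sketched directly: each input's share of each hidden neuron's incoming weight, scaled by that neuron's output weight. The paper reviews several such measures; this shows just one of them, on made-up weights:

```python
# Garson-style relative influence of inputs in a trained 3-2-1 network.

def garson_influence(W, v):
    """W[j][i]: weight from input i to hidden j; v[j]: hidden j to output."""
    n_in = len(W[0])
    contrib = [0.0] * n_in
    for j, row in enumerate(W):
        denom = sum(abs(w) for w in row)
        for i, w in enumerate(row):
            contrib[i] += (abs(w) / denom) * abs(v[j])
    total = sum(contrib)
    return [c / total for c in contrib]   # relative influence, sums to 1

W = [[0.8, -0.2, 0.1],    # hidden neuron 1's input weights (hypothetical)
     [0.3, 0.5, -0.4]]    # hidden neuron 2's input weights
v = [1.2, -0.6]           # hidden-to-output weights
print([round(r, 3) for r in garson_influence(W, v)])
```

Because it uses absolute values, this measure reports magnitude of influence but not direction, which is one reason the paper argues for combining several measures.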


Zobel, C.W. "Representing perceived tradeoffs in defining disaster resilience," Decision Support Systems, 50(2), 2011, 394-403.

Abstract: Two of the primary measures that characterize the concept of disaster resilience are the initial impact of a disaster event and the subsequent time to recovery. This paper presents a new analytic approach to representing the relationship between these two characteristics by extending a multi-dimensional approach for predicting resilience into a technique for fitting the resilience function to the preferences and priorities of a given decision maker. This allows for a more accurate representation of the perceived value of different resilience scenarios to that individual, and thus makes the concept more relevant in the context of strategic decision making.

Keywords: Adjusted resilience; optimization; visualization; tradeoffs; decision support; preference modeling


Falasca, M., C.W. Zobel, and C.T. Ragsdale "Helping a Small Development Aid Organization Manage Volunteers More Efficiently," Interfaces, 41(3), 2011.

Abstract: Development organizations plan and execute efforts that are primarily directed towards promoting human welfare. Because these organizations rely heavily on the use of volunteers to carry out their social mission, it is important that they manage their volunteer workforce efficiently. In this study, we discuss the development of a spreadsheet-based multi-criteria volunteer scheduling model for a small development organization in a South American developing country. We demonstrate not only how the proposed new model helps to reduce the number of unfilled shifts and decrease total scheduling costs, but also how it helps to better satisfy the volunteers' scheduling preferences, thus supporting long-term retention and effectiveness of the workforce.

Keywords: Development, multi criteria decision making, optimization, scheduling, spreadsheets, volunteer labor


Zobel, C.W. "Supporting disaster resilience: Information technology and emergence in multi-organizational networks," to be published in the Journal of Emergency Management, 2010.

Abstract: Given the complexity and interconnectedness of our modern world, information and communication technologies are becoming increasingly important for enabling effective collaboration between organizations. In the context of disaster operations management, the appropriate use of such technologies can provide significant value to multi-organizational collaborations by enhancing decision makers' ability to get access to and to analyze information relevant to the current situational context. Because the resilience of such collaborations is a function not only of individual partners but also of the systems within which they operate, it is necessary and appropriate to consider the information needs associated with such institutional relationships. With this in mind, this article discusses inter-sectoral collaboration from the standpoint of recognizing and understanding the information infrastructure needed to support a sustainable system of partnerships for community resilience. The discussion aims to establish a common awareness of some of the issues and opportunities associated with information sharing in this dynamic environment.

Keywords: Disaster, information technology, dynamic, framework, collaboration, decision making


Zobel, C.W. "Comparative visualization of predicted disaster resilience," ISCRAM 2010 - Defining Crisis Management 3.0 (Seattle, WA: May 2010).

Abstract: The disaster resilience triangle is a simple but effective tool for illustrating the relationship between the initial impact of a disaster event and the subsequent time to recovery. This tool can also be expanded, however, to provide an analytic measure of the level of resilience exhibited by a particular entity in a given disaster situation. We build upon the previous work in this area by developing a new approach for visualizing and analyzing the tradeoffs between the two primary defining characteristics of the disaster resilience triangle. This new approach supports strategic decision making in a disaster planning environment by providing a straightforward means for directly comparing the relative predicted resilience of different critical facilities within an organization, with respect to both location and type of risk.

Keywords: Resilience triangle, visualization, decision support
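The resilience triangle described above lends itself to a simple analytic measure. As a hedged sketch (one common linear-recovery formalization; the paper's exact measure may differ in its details): with initial impact X in [0, 1], recovery time T, and assessment horizon T*, predicted resilience is the average fraction of functionality retained over the horizon, R = 1 - XT/(2T*).

```python
def predicted_resilience(x, t, t_star):
    """Resilience as the retained fraction of total functionality over
    the horizon t_star, assuming linear recovery (the classic
    'resilience triangle'): R = 1 - (x * t) / (2 * t_star).

    x      -- initial loss of functionality, 0 <= x <= 1
    t      -- time needed to recover fully, 0 <= t <= t_star
    t_star -- assessment time horizon (same units as t)
    """
    if not 0.0 <= x <= 1.0:
        raise ValueError("impact x must be in [0, 1]")
    if not 0.0 <= t <= t_star:
        raise ValueError("recovery time t must be in [0, t_star]")
    return 1.0 - (x * t) / (2.0 * t_star)

# Two hypothetical facilities with different impact/recovery tradeoffs:
print(predicted_resilience(0.8, 5.0, 20.0))   # large impact, fast recovery
print(predicted_resilience(0.4, 10.0, 20.0))  # smaller impact, slower recovery
```

The two example facilities trade a larger impact against a faster recovery yet receive the same score, which is exactly the kind of tradeoff the visualization approach is designed to expose.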


Fetter, G., M. Falasca, C.W. Zobel, and T.R. Rakes. "A Multi-stage Decision Model for Debris Disposal Operations," ISCRAM 2010 - Defining Crisis Management 3.0 (Seattle, WA: May 2010).

Abstract: As shown by Hurricane Katrina, disposing of disaster-generated debris can be quite challenging. Extraordinary amounts of debris far exceeding typical annual amounts of solid waste are almost instantaneously deposited across a widespread area. Although the locations and amounts of debris can be easily summarized looking back after recovery activities have been completed, they are uncertain and difficult at best to estimate as debris operations begin to unfold. Further complicating matters is that the capacity of cleanup resources, which is dependent upon available equipment, labor, and subcontractors, can fluctuate during on-going cleanup operations. As a result, debris coordinators often modify initial resource assignments as more accurate debris estimates and more stable resource capacities become known. In this research, we develop a computer-based decision support system that incorporates a multi-stage programming model to assist decision makers with allocating debris cleanup resources immediately following a crisis event and during ongoing operations as debris volumes and resource capacities become known with increasing certainty.

Keywords: Debris Disposal, Decision Modeling, Optimization, Stochastic Programming


Arnette, A., C.W. Zobel, D. Bosch, J. Pease, and T. Metcalfe. "Stakeholder Ranking of Watershed Goals with the Vector Analytic Hierarchy Process: Effects of Participant Grouping Scenarios," Environmental Modelling and Software - Special Issue: Modeling with Stakeholders. Scheduled for publication in 2010.

Abstract: More effective methods of eliciting and summarizing stakeholders' goals can assist in improving watershed management. This paper discusses the process of summarizing the goals that were generated during a workshop of watershed stakeholders in Virginia by using the Vector Analytic Hierarchy Process, and then grouping them into homogeneous subgroups by using two different methods: 1) assigning subgroups based on individuals' stated affiliations from a participant bio-sheet; and 2) assigning subgroups based on the similarity of individuals' actual preferences between the goals. Several different clustering approaches are considered for creating the preference-based subgroups, and the relative advantages and disadvantages of each approach are discussed. The process of combining the subgroups to generate a single overall preference structure for the group as a whole is also considered, and the final results are compared based on both the resulting rankings and the coherence, or variability of opinion, that they reflect. Determining the "best" set of subgroups can be valuable not only in exploring the underlying nature of the population's preferences, but also in supporting additional discussion and analysis of the results. As such, it can ultimately lead to much stronger and better informed decision making.

Keywords: Watershed planning; stakeholders; preferences; clustering; decision support; environmental management


Ragsdale, C.T. and C.W. Zobel, "A Simple Approach to Implementing and Training Neural Networks in Excel," Decision Sciences Journal of Innovative Education, 8 (1), 2010, 143-149.

Abstract: Over the past 10 to 15 years, Artificial Neural Networks (ANNs) have proven their value in numerous practical business applications and become a mainstream modeling and data mining tool. Educators desiring to introduce their students to ANNs face somewhat of a quandary, however, in terms of the available software options for teaching. A number of commercial ANN packages exist (e.g., NeuralWorks by NeuralWare, and NeuralTools by Palisade Corporation) and, more recently, mainstream statistical packages (e.g., JMP by SAS Institute) have added nice implementations of ANNs, but the cost of acquiring and using these tools can be restrictive. Spreadsheets provide a popular and cost-effective alternative for teaching modeling and decision analysis techniques, but there currently exist few spreadsheet-based tools for teaching basic ANN concepts. This note seeks to address this situation by introducing a very simple approach to implementing and training ANNs in Microsoft Excel, using only Excel's inherent front-end spreadsheet capabilities. Such an approach is very suitable for a course where only one or two days might be spent on the topic of ANNs.

Keywords: Spreadsheets, Neural networks, Microsoft Excel
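The note's Excel workbook is not reproduced here, but the underlying idea — a forward pass built from ordinary cell arithmetic, trained by minimizing squared error — can be sketched in plain Python (assumed XOR data and a 2-2-1 topology; the paper relies on Excel's built-in facilities rather than hand-coded gradient descent):

```python
import math
import random

# Python analogue (not the paper's Excel workbook) of a tiny 2-2-1
# feedforward network: the forward pass below is exactly the kind of
# cell-formula arithmetic a spreadsheet can hold, and training simply
# minimizes squared error, here via plain gradient descent.
random.seed(0)
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden weights + bias
w_o = [random.uniform(-1, 1) for _ in range(3)]                      # output weights + bias
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]          # XOR

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    return h, sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])

def sse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial_error = sse()
lr = 0.5
for _ in range(5000):
    for x, target in data:
        h, y = forward(x)
        d_y = (y - target) * y * (1 - y)         # output-layer delta
        for j in range(2):
            d_h = d_y * w_o[j] * h[j] * (1 - h[j])  # hidden-layer delta
            w_o[j] -= lr * d_y * h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h * x[i]
            w_h[j][2] -= lr * d_h                # hidden bias
        w_o[2] -= lr * d_y                       # output bias

print("SSE before:", round(initial_error, 3), "after:", round(sse(), 3))
```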


Falasca, M., C.W. Zobel, and G.M. Fetter, "An optimization model for humanitarian relief volunteer management," ISCRAM 2009 - Boundary spanning initiatives and new perspectives (Göteborg, Sweden: May 2009).

Abstract: One of the challenges of humanitarian organizations is that there exist limited decision technologies that fit their needs. It has also been pointed out that those organizations experience coordination difficulties with volunteers willing to help. The purpose of this paper is to help address those challenges through the development of a decision model to assist in the management of volunteers. While employee workforce management models have been the topic of extensive research over the past decades, no work has focused on the problem of managing humanitarian relief volunteers. In this paper, we discuss a series of principles from the field of volunteer management and develop a multi-criteria optimization model to assist in the assignment of volunteers to tasks. We present an illustrative example and analyze a solution methodology where the decision maker exercises his/her preferences by trading off conflicting objectives. Conclusions, limitations, and directions for future research are also discussed.

Keywords: Humanitarian Logistics, Multi Criteria Decision Making, Optimization, Volunteer management
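A minimal sketch of the kind of model involved (hypothetical data and weights; the paper's multi-criteria formulation is richer): score each volunteer-task pair on several criteria, collapse the scores with decision-maker weights, and solve the resulting assignment problem.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical sketch (not the paper's model): two criteria per
# volunteer-task pair, collapsed by a weighted sum reflecting the
# decision maker's preferences, then solved with the Hungarian algorithm.
travel_time = np.array([[4, 1, 3], [2, 0, 5], [3, 2, 2]])  # hours (invented)
skill_gap = np.array([[1, 2, 0], [0, 1, 3], [2, 0, 1]])    # mismatch level (invented)
w_time, w_skill = 0.7, 0.3                                 # assumed preference weights

cost = w_time * travel_time + w_skill * skill_gap
volunteers, tasks = linear_sum_assignment(cost)            # optimal assignment
for v, t in zip(volunteers, tasks):
    print(f"volunteer {v} -> task {t} (cost {cost[v, t]:.2f})")
print("total cost:", cost[volunteers, tasks].sum())
```

Trading off objectives then amounts to re-running the assignment under different weight vectors and letting the decision maker compare the resulting schedules.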


Zobel, C.W., J.R. Martin, II, and C.G. Olgun. "Disaster risk management for critical infrastructure: a services-based viewpoint," International Journal of Services Sciences, 2 (2), 2009, 189-205.

Abstract: It is necessary to protect critical infrastructure and key resources from the effects of a disaster in order to maintain continuity of both public services and private enterprise. Since many of the critical resources are themselves privately owned and operated, however, they are often subject to the same market pressures that force businesses to concentrate on improving the efficiency of their operations at the possible expense of guarding against potential disruptions to those operations. Under these circumstances, the process of designing an effective (and acceptable) disaster mitigation plan for critical infrastructure resources requires a shared understanding of both the business aspects and the technical aspects of different mitigation alternatives. This paper examines the decision-making process behind such disaster mitigation planning from a services standpoint, and discusses how taking such a viewpoint can provide a more effective approach to sustainable disaster risk reduction.

Keywords: Decision making, disaster, operations management, critical infrastructure, private ownership, services


Zobel, C.W. "The use of information and communication technologies in support of dynamic partnerships for disaster resilience," in Exploring Innovative and Sustainable Approaches to Improve Community Resilience in Disaster Prevention and Response, Invited Symposium - International Disaster and Risk Conference (IDRC) 2008, Davos, Switzerland, Aug. 29-30, 2008.

Abstract: Given the complexity and interconnectedness of our modern world, information and communication technologies are becoming increasingly important for enabling effective collaboration between organizations. In the context of managing disaster relief efforts, the appropriate use of such technologies can provide significant value to such a collaboration by enhancing decision makers' ability to get access to and to analyze information relevant to the current situational context. Because the resilience of a community is a function not only of individual community members but also of the collaborative systems within which they operate, it is necessary and appropriate to consider information needs associated with these partnerships. With this in mind, these remarks discuss intersectoral collaboration from the standpoint of recognizing and developing the underlying information infrastructure needed to support a sustainable system of dynamic partnerships for community resilience. The intent of the discussion is to establish a common awareness of some of the issues and opportunities associated with information technology in this important new context.

Keywords: Information Technology, Dynamic Partnerships, Community, Collaboration, Disaster, Resilience


Zobel, C.W. and G.A. Wang. "Topic Maps for Improving Services in Disaster Operations Management," Journal of Service Science, 1 (1), 2008, 83-92.

Abstract: Disaster operations management is an increasingly important application area for the developing techniques of service science. This paper examines the use of topic maps, a semantic technology, within this environment, and provides a preliminary discussion of the benefits that its implementation can provide in the capture and exchange of contextual information. The discussion is motivated by a look at the different phases of disaster operations management in a services context, and focuses on the need for effective and relevant information exchange as an important part of the services process. As the amount and complexity of information increases within such processes, semantic technologies are becoming increasingly important as a means of representing and managing contextual information. This paper seeks to help further the understanding of the relevance of such tools as part of the study of service science.

Keywords: Decision making, disaster, operations management, semantics, Topic Maps, knowledge management, services


Falasca, M., C.W. Zobel, and D.F. Cook. "A Decision Support Framework to Assess Supply Chain Resilience," ISCRAM (Information Systems for Crisis Response and Management) 2008 - Creating Advanced Systems for Inter-organizational Information Sharing and Collaboration (Washington, D.C.: May 2008), pp. 596-605.

Abstract: This research is aimed at developing a quantitative approach for assessing supply chain resilience to disasters, a topic that has been discussed primarily in a qualitative manner in the literature. For this purpose, we propose a simulation-based framework that incorporates concepts of resilience into the process of supply chain design. In this context, resilience is defined as the ability of a supply chain system to reduce the probabilities of disruptions, to reduce the consequences of those disruptions, and to reduce the time to recover normal performance. The decision framework incorporates three determinants of supply chain resilience (density, complexity, and node criticality) and discusses their relationship to the occurrence of disruptions, to the impacts of those disruptions on the performance of a supply chain system and to the time needed for recovery. Different preliminary strategies for evaluating supply chain resilience to disasters are identified, and directions for future research are discussed.

Keywords: Decision Support Systems, Resilience, Simulation, Supply Chain Design


King, M.A. and C.W. Zobel. "Applying the R4 Framework of Resilience: Information Technology Disaster Risk Management at Northrop Grumman," Proceedings of the 38th Annual Meeting of the Southeast Decision Sciences Institute (Orlando, FL: February 2008).

Abstract: Disaster risk management and mitigation are an important component of most business managers' tactical and strategic objectives. However, as time passes since the occurrence of disasters such as 9/11 and Hurricane Katrina, these managers are finding it much more difficult to gain budgetary approval for projects that increase disaster resilience with respect to business systems and processes. In this paper we will discuss and extend a conceptual disaster resilience framework developed by the Multidisciplinary Center for Earthquake Engineering Research (MCEER), suggest methods for quantifying organizational resilience, and apply the extended framework to an actual business disaster experienced by the Northrop Grumman Corporation. The intent of this paper is to supply business managers with a quantifiable approach to developing organizational resilience measurements that will enhance their business case for business continuity and resilience projects.

Keywords: Resilience, Information Technology, Disaster Risk Management, Hurricane Katrina


Zobel, C.W. and K.B. Keeling. "Neural network-based simulation metamodels for predicting probability distributions," Computers & Industrial Engineering, 54 (4), 2008, 879-888.

Abstract: Simulation is an important tool for supporting decision-making under uncertainty, particularly when the system under consideration is too complex to evaluate analytically. The amount of time required to generate large numbers of simulation replications can be prohibitive, however, necessitating the use of a simulation metamodel in order to describe the behavior of the system under new conditions. The purpose of this study is to examine the use of neural network metamodels for representing output distributions from a stochastic simulation model. A series of tests on a well-known simulation problem demonstrate the ability of the neural networks to capture the behavior of the underlying systems and to represent the inherent uncertainty with a reasonable degree of accuracy.

Keywords: Neural networks; simulation; metamodels; percentiles


Zobel, C.W., Cook, D.F., and C.T. Ragsdale. "Data-driven classification using boundary observations," Decision Sciences, 37 (2), 2006, 247-262.

Abstract: Classification is often a critical task for business managers in their decision making processes. It is generally more difficult for a classification scheme to produce accurate results when the input domains of the various output classes coincide, to some degree, with one another. In an attempt to address this issue, this paper discusses a data-driven algorithm that identifies the region of coincidence, or overlap, for two-group classification problems by empirically determining the convex boundary for each group. The results are extendable to multi-group classification. The class membership of a new observation is then determined by its relative position with respect to each of these boundaries. Due to minimal data storage requirements, this boundary-point classification technique can adapt to changing conditions far more easily than other approaches. Test results demonstrate that the new classification technique has similar performance to a back-propagation neural network under static conditions and significantly outperforms a back-propagation neural network under dynamic conditions.

Keywords: Classification; Decision support; Heuristics
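A rough sketch of the boundary idea, under stated assumptions (convex hulls via scipy; the paper's empirical boundary construction and its handling of the overlap region differ): classify a new observation by its position relative to each group's convex boundary, falling back to a simple distance rule in the region of coincidence.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical two-group data; only each group's convex boundary is
# needed at classification time, keeping storage requirements small.
rng = np.random.default_rng(1)
group_a = rng.normal(loc=[0, 0], scale=0.5, size=(100, 2))
group_b = rng.normal(loc=[3, 3], scale=0.5, size=(100, 2))

hull_a = Delaunay(group_a)  # triangulation supports point-in-hull queries
hull_b = Delaunay(group_b)

def classify(point):
    in_a = hull_a.find_simplex(np.atleast_2d(point))[0] >= 0
    in_b = hull_b.find_simplex(np.atleast_2d(point))[0] >= 0
    if in_a and not in_b:
        return "A"
    if in_b and not in_a:
        return "B"
    # overlap region or outside both hulls: nearest group centroid
    da = np.linalg.norm(point - group_a.mean(axis=0))
    db = np.linalg.norm(point - group_b.mean(axis=0))
    return "A" if da < db else "B"

print(classify(np.array([0.1, -0.2])))
print(classify(np.array([2.9, 3.2])))
```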


Scheibe, K.P., Mennecke, B.E., and C.W. Zobel. "Creating Offshore-ready IT Professionals: A Global Perspective and Strong Collaborative Skills Are Needed," Journal of Labor Research, 27 (3), 2006, 275-290.

Abstract: Outsourcing of IT functions has become a widespread corporate practice, which has naturally led to concerns among IT workers about how this affects their jobs. The issue is complex, and many companies are bringing their IT functions back in-house. In light of this complexity, what skills do IT workers need to be competitive? We address this question first by reviewing the literature and then by examining two corporate case studies that have dealt with outsourcing issues. Based on this view of outsourcing, we discuss the skills that can provide a competitive advantage in the current environment.

Keywords: Outsourcing; Information Technology; IT skills


Cook, D.F., Zobel, C.W., and M.L. Wolfe. "Environmental statistical process control using an augmented neural network classification approach," European Journal of Operational Research, 174 (3), 2006, 1631-1642.

Abstract: Shifts in the values of monitored environmental parameters can help to indicate changes in an underlying system. For example, increased concentrations of copper in water discharged from a manufacturing facility might indicate a problem in the wastewater treatment process. The ability to identify such shifts can lead to early detection of problems and appropriate remedial action, thus reducing the risk of long-term consequences. Statistical process control (SPC) techniques have traditionally been used to identify when process parameters have shifted away from their nominal values. In situations where there are correlations among the observed outputs of the process, however, as in many environmental processes, the underlying assumptions of SPC are violated and alternative approaches such as neural networks become necessary. A neural network approach that incorporates a geometric data preprocessing algorithm and identifies the need for increased sampling of observations was applied to facilitate early detection of shifts in autocorrelated environmental process parameters. Utilization of the preprocessing algorithm and the increased sampling technique enabled the neural network to accurately identify the process state of control. The algorithm was able to identify shifts in the highly correlated process parameters with accuracies ranging from 96.4% to 99.8%.

Keywords: Environmental quality; Neural networks; Statistical process control; Correlation


Goel, A.M., Zobel, C.W., and E.C. Jones. "A multi-agent system for supporting the electronic contracting of food grains," Computers and Electronics in Agriculture, 48(2), 2005, 123-137.

Abstract: With increasing competition and better transportation facilities, the relationships between supply chain actors are getting more complex and individual profit margins are shrinking. Decisions at different levels of the supply chain can no longer be considered independent, since they may influence profitability across the supply chain. Information technology based solution frameworks offer a way to more effectively integrate decision-making by enabling better knowledge sharing and facilitating more transparent economic transactions. This paper proposes the use of a multi-agent system to represent and integrate the decision-making processes of various actors within the food grain supply chain. Within the context of such a system, it presents an internet-based combinatorial auction framework as a technique for implementing electronic contracting (e-contracting) of food grains. The discussion focuses on the relationship between producers and millers, and presents preliminary simulation results that demonstrate the applicability and effectiveness of software agents within the proposed auction environment.

Keywords: Intelligent agents; Combinatorial auction; Supply chain; E-commerce


Zobel, C.W., Rakes, T.R., and L.P. Rees. "Automated merging of conflicting knowledge bases, using a consistent, majority-rule approach with knowledge-form maintenance," Computers & Operations Research, 32(7), 2005, 1809-1829.

Abstract: This paper discusses an automated process of merging conflicting information from disparate sources into a combined knowledge base. The algorithm provided generates a mathematically consistent, majority-rule merging by assigning weights to the various sources. The sources may be either conflicting portions of a single knowledge base or multiple knowledge bases. Particular attention is paid to maintaining the original rule format of the knowledge, while ensuring logical equivalence. This preservation of rule format keeps the knowledge in a more intuitive implication form as opposed to a collection of clauses with many possible logical roots. It also facilitates tracking using the support for each deductive result so that final knowledge in rule form can be ascribed back to original experts. As the approach is fairly involved mathematically, an automated procedure is developed.

Keywords: Knowledge management; Knowledge engineering; Conflict resolution


Zobel, C.W. and W.T. Scherer. "An empirical study of policy convergence in Markov decision process value iteration," Computers & Operations Research, 32(1), 2005, 127-142.

Abstract: The value iteration algorithm is a well-known technique for generating solutions to discounted Markov decision process (MDP) models. Although simple to implement, the approach is nevertheless limited in situations where many Markov decision processes must be solved, such as in real-time state-based control problems or in simulation/optimization problems, because of the potentially large number of iterations required for the value function to converge to an ε-optimal solution. Experimental results suggest, however, that the sequence of solution policies associated with each iteration of the algorithm converges much more rapidly than does the value function. This behavior has significant implications for designing solution approaches for MDPs, yet it has not been explicitly characterized in the literature, nor has it generated significant discussion. This paper seeks to generate such discussion by providing comparative empirical convergence results and exploring several predictors that allow estimation of policy convergence speed based on existing MDP parameters.

Keywords: Markov decision processes; Dynamic programming; Convergence results
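The phenomenon is easy to reproduce on a small synthetic MDP (an assumed random model, not one from the paper): the greedy policy typically stops changing long before the value function satisfies a tight ε-stopping rule.

```python
import numpy as np

# Random 20-state, 3-action discounted MDP (assumed example). Track the
# first iteration at which the greedy policy repeats versus the
# iteration at which the value function meets the standard stopping
# rule ||V_{k+1} - V_k|| < eps * (1 - gamma) / (2 * gamma).
rng = np.random.default_rng(42)
S, A, gamma, eps = 20, 3, 0.95, 1e-8
P = rng.dirichlet(np.ones(S), size=(S, A))  # P[s, a] = next-state distribution
R = rng.uniform(0, 1, size=(S, A))          # immediate rewards

V = np.zeros(S)
policy = np.zeros(S, dtype=int)
policy_settled_at = None
for it in range(1, 10000):
    Q = R + gamma * P @ V                   # Q[s, a], shape (S, A)
    new_V = Q.max(axis=1)
    new_policy = Q.argmax(axis=1)
    if policy_settled_at is None and it > 1 and np.array_equal(new_policy, policy):
        policy_settled_at = it              # greedy policy first repeats here
    value_converged = np.max(np.abs(new_V - V)) < eps * (1 - gamma) / (2 * gamma)
    V, policy = new_V, new_policy
    if value_converged:
        break

print("policy stopped changing at iteration", policy_settled_at)
print("value function converged at iteration", it)
```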


Ragsdale, C.T. and C.W. Zobel. "The Ordered Cutting Stock Problem," Decision Sciences, 35(1), 2004, pp. 81-98.

Abstract: The one-dimensional Cutting Stock Problem (CSP) is a classic combinatorial optimization problem in which a number of parts of various lengths must be cut from an inventory of standard size material. The classic CSP ensures that the total demand for a given part size is met but ignores the fact that parts produced by a given cutting pattern may be destined for different jobs. As a result, applying the classic CSP in a dynamic production environment may result in many jobs being open (or partially complete) at any point in time - requiring significant material handling/sorting operations. This paper identifies and discusses a new type of one-dimensional CSP, called the ordered CSP, which explicitly restricts to one the number of jobs in a production process that can be open, or in process, at any given point in time. Given the growing emphasis on mass customization in the manufacturing industry, this restriction can help lead to a reduction in both in-process inventory levels and material handling activities. A formal mathematical formulation is provided for the new CSP model, and its applicability is discussed with respect to a production problem in the custom door and window manufacturing industry. A genetic algorithm (GA) solution approach is then presented which incorporates a customized heuristic for reducing scrap levels. Several different production scenarios are considered, and computational results are provided which illustrate the ability of the GA-based approach to significantly decrease the amount of scrap generated in the production process.

Keywords: Artificial Intelligence, Genetic Algorithms, Production Scheduling, Cutting Stock
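The "one open job" restriction can be illustrated with a deliberately simple greedy cutter (a next-fit sketch with invented part lengths, not the paper's GA): parts are cut strictly in job order, so at most one job is ever in process.

```python
# Hypothetical illustration of the ordered CSP restriction: cut each
# job's parts in sequence from 100-unit stock lengths, opening a new
# stock length only when the current one cannot fit the next part.
STOCK = 100
jobs = [[45, 30, 30], [60, 25], [50, 35, 20]]  # part lengths per job (invented)

def ordered_cut(jobs, stock=STOCK):
    """Greedy next-fit cutting that never interleaves parts from
    different jobs, so at most one job is in process at any time.
    Returns (number of stock lengths used, total scrap)."""
    rods_used, remaining = 0, 0
    for job in jobs:
        for part in job:
            if part > remaining:           # current rod can't fit the part
                rods_used += 1
                remaining = stock
            remaining -= part
    scrap = rods_used * stock - sum(sum(job) for job in jobs)
    return rods_used, scrap

print(ordered_cut(jobs))
```

A GA such as the paper's would search over job orderings and cutting patterns to reduce the scrap that this naive next-fit rule leaves behind.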


Zobel, C.W., Cook, D.F., and Q.J. Nottingham. "An augmented neural network classification approach to detecting mean shifts in correlated manufacturing process parameters," International Journal of Production Research, 42(4), 2004, pp. 741-758.

Abstract: Statistical process control (SPC) techniques have traditionally been used to identify when the mean of a manufacturing process has shifted out of control. In situations where there is correlation among the observed outputs of the process, however, the underlying assumptions of SPC are violated and alternative approaches such as neural networks become necessary in order to characterize the process behavior. This paper discusses the development of a neural network technique that provides a significantly improved capability for recognizing these process shifts as compared to the current techniques in the literature. The procedure in question is an augmented neural network based approach which incorporates a data preprocessing classification algorithm that provides information to facilitate early detection of out of control operating conditions. This approach is shown to improve significantly upon the performance of previous neural network techniques for identifying process shifts in the presence of correlation.

Keywords: Neural Networks, Statistical Process Control, Correlation
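A hedged illustration of the underlying problem (simulated AR(1) data; the paper's augmented neural network is not reproduced here): correlation violates the independence assumption behind standard charts, but monitoring the one-step-ahead residuals of a fitted AR(1) model restores approximate independence, so a simple CUSUM can flag an injected mean shift.

```python
import numpy as np

# Assumed example: an AR(1) process receives a mean shift at t = 300.
# Fit phi on the in-control segment, chart the one-step-ahead residuals
# (approximately i.i.d. in control), and run a one-sided CUSUM on them.
rng = np.random.default_rng(7)
n, phi, shift_at, shift = 600, 0.8, 300, 5.0
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()
x[shift_at:] += shift                       # out-of-control mean shift

pre = x[:shift_at]                          # in-control data for fitting
phi_hat = (pre[1:] @ pre[:-1]) / (pre[:-1] @ pre[:-1])
resid = x[1:] - phi_hat * x[:-1]            # resid[t-1] pairs with x[t]
sigma = resid[: shift_at - 1].std()

k, h = 0.5 * sigma, 10.0 * sigma            # CUSUM reference / decision values
s, alarm_t = 0.0, None
for t, r in enumerate(resid, start=1):      # t indexes back into x
    s = max(0.0, s + r - k)
    if s > h:
        alarm_t = t
        break

print("phi estimate:", round(phi_hat, 2))
print("shift injected at t =", shift_at, "; CUSUM alarm at t =", alarm_t)
```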


Cook, D.F., Zobel, C.W., and Q.J. Nottingham. "Excel-based application of data visualization techniques for process monitoring in the forest products industry," Forest Products Journal, 54(5), 2004, 57-65.

Abstract: Techniques for data visualization can often provide important information and insight to forest products manufacturing process managers and operators, as these techniques can be used to identify important relationships between various process parameters or significant properties of monitored characteristics. Many of the currently available visualization tools, however, tend to be fairly expensive and require significant programming expertise, and are thus not easily accessible to some individuals or organizations. This paper describes the development of data visualization techniques that address this accessibility issue by using Microsoft Excel and Visual Basic for Applications to generate plots for monitoring or analyzing a data set or stream. Example data from a particleboard manufacturing process are used to illustrate the use of such an Excel-based visualization tool for process monitoring and analysis, and to discuss its impact on the decision-making process. The visualization tool is also applied to data from a study of the mechanical and physical properties of oriented strandboard panels, in order to illustrate its functionality for the quick display and initial analysis of experiment results.

Keywords: Graphical Display, Process Monitoring, Visual Basic for Applications (VBA)


Zobel, C.W. and W.T. Scherer, "Simulation-based Policy Generation with Large-scale Markov Decision Processes," IEEE Transactions on Systems, Man & Cybernetics - Part A: Systems and Humans, 31(6), 2001, 609-622.

Abstract: This paper presents a new problem-solving approach, termed Simulation-based Policy Generation (SPG), that is able to generate solutions to problems that may otherwise be computationally intractable. When trying to optimize large-scale sequential stochastic problems, it is often easier to simulate the system under consideration and then to perform some type of simulation/optimization procedure, such as pseudo-random search or a response surface methodology. The SPG method builds on this idea by using a simulation of the original problem to create an approximating Markov decision process (MDP) model which is then solved via traditional MDP solution approaches. Since this approximating MDP is a fairly rich and robust sequential optimization model, solution policies can be created which represent an intelligent and structured search of the policy space. An important feature of the SPG approach is its adaptive nature, in that it uses the original simulation model to generate improved aggregation schemes, allowing the approach to be applied in situations where the underlying problem structure is largely unknown. In order to illustrate the performance of the SPG methodology, we apply it to a common but computationally complex problem of inventory control, and we briefly discuss its application to a large-scale telephone network routing problem.

Keywords: Markov Decision Processes, Simulation, State-space Aggregation, Value Iteration


Nottingham, Q.J., Cook, D.F., and C.W. Zobel. "Visualization of multivariate data with radial plots using SAS," Computers & Industrial Engineering, 41(1), 2001, 17-35.

Abstract: Data visualization tools can provide very powerful information and insight when performing data analysis. In many situations, a set of data can be adequately analyzed through data visualization methods alone. In other situations, data visualization can be used for preliminary data analysis. In this paper, radial plots are developed as a SAS-based data visualization tool that can improve one's ability to monitor, analyze, and control a process. Using the program developed in this research, we present two examples of data analysis using radial plots; the first example is based on data from a particle board manufacturing process and the second example is a business process for monitoring the time-varying level of stock return data.

Keywords: Radial Plot, Visualization, SAS


Cook, D.F., Zobel, C.W., and Q.J. Nottingham. "Utilization of Neural Networks for the Recognition of Variance Shifts in Manufacturing Process Parameters," International Journal of Production Research, 39(17), 2001, 3881-3887.

Abstract: Traditional statistical process control charting techniques were developed for use in discrete industries where independence exists between process parameter values over time. Process parameters from many manufacturing industries are not independent, however; they are serially correlated. Consequently, the power of traditional statistical process control charts is greatly weakened. This paper discusses the development of neural network models to successfully identify shifts in the variance of correlated process parameters. These neural network models can be used to monitor manufacturing process parameters and to signal when process adjustments are needed.

Keywords: Neural Networks, Statistical Process Control, Correlated Data, Variance Monitoring



Funded Research Projects

"Sustainable Community Resilience: Establishing a Link between Inherent and Dynamic Disaster Resilience," Virginia Tech Institute for Society, Culture, & Environment. Funding amount: $19,754. Duration: May 8, 2011-June 15, 2011. Co-PIs: L. Rees, P. Sforza.

Abstract: Sudden-onset natural disasters, such as floods, hurricanes, and earthquakes, can have a significant long-term impact on both the physical infrastructure and social infrastructure of vulnerable communities. It is therefore critical to establish a relationship between the inherent resilience of a community and its disaster-specific resilience; this will empower both communities and humanitarian organizations to improve resilience and long-term sustainability. This research project will first develop the theory to analyze the interaction between the dynamic disaster resilience of a community, as exhibited by its response to a specific disaster, and the (static) inherent resilience of that community against disasters in general. Mathematical models will then be developed to indicate the best path to achieving overall long-term resilience, focusing on the social dimension and including such components as (1) public general welfare/safety; (2) health; (3) education; and (4) family and social/faith networking. Data from a flooding disaster will be analyzed to calculate the observed resilience and to compare it against the expected results from the model. Favorable results will position other communities to utilize the model during actual disasters to determine their best possible infrastructure and social resilience postures. This, in turn, will support more effective allocation of resources by relief organizations.



"Collaborative Research: CPATH CB: Living in the KnowlEdge Society (LIKES)," National Science Foundation. Funding amount: $289,999. Duration: 2007-2009. Co-Pis: E. Fox, C. Evia, W. Fan, S. Sheetz.

Abstract: The LIKES project seeks to deliver new pedagogies in computing education; integration of computing concepts into non-computing disciplines, such as psychology and marketing; principles, guidelines, and techniques for integrating computing and non-computing curricula; and formation of new communities for enhancing that integration. Through a series of four workshops, related online discussions, and research, the LIKES community is discovering key computing-related issues in core disciplines and engaging leaders nationwide in brainstorming about their computing education needs, as well as facilitating the helpful application of computing for individuals, groups and organizations.

The primary mission is to bring together faculty members in computing-related education programs, such as Computer Science and Information Systems/Technology, with faculty in core/liberal education courses such as psychology and sociology, to enhance students’ computing competencies as well as facility with key computing-related paradigms and concepts. The project will achieve this mission by placing computing concepts within context. For example, user-interface design principles could be used as a tool to understand the costs and benefits of electronic voting in a political science course, or hierarchical data management techniques could be used to illustrate the taxonomy of species or sequences in a biology class. Similarly, tackling such problems in computing classes enhances learning by emphasizing the role of computing throughout society. The message conveyed to all faculty is that computing is embedded in modern life and that managing personal and organizational information is inherent in the jobs of the future, regardless of discipline. The result will be students’ full engagement with computing concepts in a more direct and meaningful way.



"Decision Support to Achieve Watershed Nutrient Balance and Meet Water Quality Goals," USDA-CSREES. Funding amount: $596,397. Duration: 2007-2010. Co-Pis: J. Pease, D. Bosch, M. Wolfe, J. Ogejo, G. Evanylo, K. Knowlton, N. Franz.

Abstract: Watershed nutrient imbalances pose risks to water quality. Improved decision aids and processes could reduce the costs of achieving water quality protection goals by aiding watershed stakeholders in evaluating nutrient management strategies. The proposed project will integrate research, extension, and education missions to develop and test a group decision aid, Community DECISIONS, with which watershed stakeholders can rank strategies to address imbalances of nutrient imports and exports. Project objectives are to: 1) develop the Community DECISIONS group decision aid; 2) apply and assess the Community DECISIONS group decision aid with a stakeholder group in the North Fork Shenandoah watershed; 3) develop and deliver education for university students in watershed management decision aids and decision assessment; and 4) demonstrate Community DECISIONS to watershed citizens and state and regional water quality professionals, and train them in its use.

Three workshops will be conducted with an established stakeholder group to assist in development, use, and assessment of Community DECISIONS. The stakeholder consensus building process and the Community DECISIONS aid will be assessed to determine their effectiveness in ranking and implementing nutrient treatment strategies for stakeholder-defined watershed goals. Integration of research, extension, and education tasks will be accomplished through interaction among the stakeholders, principal investigators, and students in the development and assessment of the group decision aid and in formulation of watershed nutrient strategies. The project will produce a flexible, practical toolkit and collaborative decision process that can be employed to improve the quality of our nation's surface and groundwater resources.
