Large Scale Strategic Budgeting for Managing Wildfires
This research develops a strategic planning
model for setting the preparedness budgets needed to manage
wildfires during the initial response period. The model uses a
stochastic integer programming approach to establish the
effectiveness frontier of weighted acres managed (WAM),
the effectiveness metric used in this business problem.
The resulting integer program maximizes weighted acres managed
for a given input cost level. The optimization engine can be run
iteratively across a range of total-cost constraints, bounded
by the minimum and maximum cost levels to be analyzed.
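The iterate-over-budgets idea can be sketched with a toy 0/1 program. The resource options, costs, and WAM contributions below are invented for illustration, and a brute-force search stands in for the integer programming solver; only the shape of the computation (maximize WAM subject to a cost cap, then sweep the cap to trace the frontier) mirrors the text.

```python
from itertools import product

# Hypothetical toy data: each candidate resource option has a cost and a
# weighted-acres-managed (WAM) contribution. Not from the actual FPA model.
options = [
    {"name": "engine_A", "cost": 40, "wam": 120},
    {"name": "engine_B", "cost": 30, "wam": 90},
    {"name": "crew_C",   "cost": 50, "wam": 160},
    {"name": "dozer_D",  "cost": 20, "wam": 50},
]

def max_wam(budget):
    """Brute-force the 0/1 program: maximize total WAM s.t. total cost <= budget."""
    best = 0
    for picks in product([0, 1], repeat=len(options)):
        cost = sum(p * o["cost"] for p, o in zip(picks, options))
        wam = sum(p * o["wam"] for p, o in zip(picks, options))
        if cost <= budget and wam > best:
            best = wam
    return best

def effectiveness_frontier(min_cost, max_cost, step):
    """Re-solve across a range of total-cost constraints, as the engine does."""
    return [(b, max_wam(b)) for b in range(min_cost, max_cost + 1, step)]
```

Each (budget, WAM) pair returned by `effectiveness_frontier` is one point on the frontier; plotting them shows the diminishing returns of additional preparedness spending.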
The major challenge of the project was solving
the original client-provided model in a reasonable amount of time
without running out of memory. The strategic budgeting optimization
problem based on the original model falls into the category of large-scale
integer programming problems. For example, one large model instance
consisted of 4.4 million 0/1 variables and 4.6 million rows, with
17.4 million non-zero elements in the model matrix. The data for
such high-end instances could not even be loaded on a fairly
powerful machine with over 3 gigabytes of physical memory. And even
when some of the relatively smaller models could be accommodated
in memory, they exceeded the memory limit within minutes of entering
the branch-and-bound stage.
IBM Research proposed and developed a creative two-phase
model-solving strategy that dissociates the complexity arising from
low-level deployment decisions from the strategic, globally optimal
resource organization. Phase 1 (the decomposition phase) optimally
deploys resources to the fires in each fire group to discover
the groups' resource preferences; Phase 2 solves the global optimization
problem, now with exponentially reduced complexity, using the
deployment-preference decisions made in Phase 1 to analyze and
develop the optimal initial attack resource organization. With
this innovation in place, some of the medium-sized model instances
could be solved in less than an hour with roughly 20 times less memory.
Even with the two-phase solving strategy, some
model instances were still too large to be accommodated in memory
or too complex to be solved within a reasonable time limit (less
than eight hours). IBM Research took the initiative and led a
major reformulation of the optimization model, where time intervals
are replaced by a continuous time variable, and all discrete time
series are replaced by corresponding piece-wise linear (PWL) functions.
The reformulation dramatically reduced the instance sizes, in
terms of numbers of 0/1 variables, columns, rows, and non-zero
matrix elements, while improving the quality of the approximations
used in the modeling. For example, one reformulated model instance
reduced the number of 0/1 variables from 3.6 million to only 600,000.
The reformulated model allowed all model instances to be solved,
mostly within an hour, and has very good scalability that promises
to gracefully handle even larger models in the future.
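The core of the reformulation, replacing a discrete time series with a piece-wise linear (PWL) function of a continuous time variable, can be illustrated as below. The fire-growth numbers are invented; this is a toy interpolation, not the FPA formulation itself.

```python
import bisect

def make_pwl(times, values):
    """Build a piece-wise linear function from a discrete time series.
    Between sampled points the series is interpolated rather than modeled
    with separate variables per time interval, which is the source of the
    size reduction described in the text."""
    def f(t):
        if t <= times[0]:
            return values[0]
        if t >= times[-1]:
            return values[-1]
        i = bisect.bisect_right(times, t)
        t0, t1 = times[i - 1], times[i]
        v0, v1 = values[i - 1], values[i]
        return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return f

# A fire-growth curve sampled at discrete hours, replaced by one PWL function
# that can be evaluated at any continuous time t.
growth = make_pwl([0, 2, 6], [0.0, 40.0, 100.0])
```

Because a PWL segment covers a whole span of what used to be many time intervals, each discrete-time 0/1 variable family collapses to a handful of breakpoints, consistent with the 3.6 million to 600,000 reduction cited above.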
IBM Research also developed a method that enabled systematic
performance tuning and improvement of optimization model preparation.
Specifically, the method lays out a process comprising three consecutive
steps and, for each step in the process, provides high-level
guidelines by identifying a set of relevant design criteria and
a ranking of design options against each criterion. The Research
team then implemented this method by using those guidelines as
templates and customizing them for the specific situations of
the project to arrive at concrete plans for model-preparation
performance tuning and improvement. As a result, for all input
datasets used in beta testing, the time spent in model preparation
was drastically reduced from approximately one hour.
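The criteria-and-ranking step of the method can be sketched as a weighted scoring matrix. The criteria names, weights, and design options below are all invented for illustration; the real guidelines and rankings are specific to the project.

```python
# Hypothetical criteria weights for one model-preparation tuning step
# (names and numbers invented to illustrate the guideline-template idea).
criteria_weights = {"memory_use": 0.5, "prep_time": 0.3, "maintainability": 0.2}

# Score of each hypothetical design option against each criterion
# (higher is better).
option_scores = {
    "stream_rows":   {"memory_use": 5, "prep_time": 3, "maintainability": 4},
    "bulk_load":     {"memory_use": 2, "prep_time": 5, "maintainability": 5},
    "cache_columns": {"memory_use": 3, "prep_time": 4, "maintainability": 3},
}

def rank_options(scores, weights):
    """Rank design options by weighted score against the criteria, best first."""
    totals = {
        name: sum(weights[c] * s for c, s in per_crit.items())
        for name, per_crit in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: -kv[1])
```

Customizing the template for a project then amounts to choosing the criteria, setting the weights for that step, and scoring the concrete design options under consideration.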
IBM Research added an innovative data pre-processing
component that transforms all physical data instances (input data)
corresponding to the same logical data model into one normalized
physical data instance. This ensures that the same logical data
model (e.g., with the same definitions of fires,
fire groups, resources, etc.) always yields exactly the same optimization
result when used to run the model at different times, thus ensuring
the repeatability of the optimization process.
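The normalization idea can be sketched as canonicalizing and fingerprinting input instances: two physical instances that differ only in record order or field order reduce to the same canonical form, so the solver sees identical input. The record layout below is invented; the FPA component's actual normalization rules are not described in the text.

```python
import hashlib
import json

def normalize(instance):
    """Reduce a physical data instance (a list of records) to canonical form:
    serialize each record with sorted keys, sort the records, and drop exact
    duplicates, so equivalent physical instances become identical."""
    records = sorted(json.dumps(r, sort_keys=True) for r in instance)
    seen, out = set(), []
    for r in records:
        if r not in seen:
            seen.add(r)
            out.append(r)
    return out

def fingerprint(instance):
    """Stable hash of the canonical form: instances of the same logical data
    model get the same fingerprint, making repeatability checkable."""
    return hashlib.sha256("\n".join(normalize(instance)).encode()).hexdigest()

# Two physical instances of the same logical model (record order and field
# order differ) normalize to the same fingerprint.
a = [{"fire": "F1", "group": "north"}, {"fire": "F2", "group": "south"}]
b = [{"group": "south", "fire": "F2"}, {"group": "north", "fire": "F1"}]
```

Feeding the optimizer only normalized instances guarantees that reruns of the same logical model cannot diverge because of incidental differences in how the input data was physically assembled.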
Gyana Parija, Haifeng Xi,
of Preparedness Budgets for Wildfire Management Activities during
Initial Response (technical report).
Gyana R. Parija, Daniel Keller, Brian B. Booher,
Budgeting Model(s) for U.S. Federal Agencies
for Managing Wildfires, CORS/INFORMS Joint International
Meeting, Banff, Canada, May 19, 2004.
Gyana R. Parija, Brian B. Booher, Daniel Keller,
FPA Preparedness Module Initial Response
Optimization: An Integer Programming Approach, 2nd
International Symposium on Fire Economics, Policy and Planning:
A Global Vision, Cordoba, Spain, April 19-22, 2004.
Fire Program Analysis (FPA) System Preparedness Module.
Wildland Fire Management Agencies
What is the most exciting potential
future use for the work you're doing?
On this On
Demand Innovation Services project, we are creating
a set of intellectual properties and software assets that
can be reused in future opportunities in the areas of disaster
management. Furthermore, the sales and delivery insights
gained during this project are helping us define effective
optimization engagement models for carrying out a lot of
in-market innovation experiments to close opportunity gaps
and help sustain IBM Research's competitive advantage.
What is the most interesting part
of your research?
The challenge of creating business value by matching
high-end mathematical programming technologies with high-impact
business problems, while using open platforms and standards,
and making innovative optimization solutions accessible
and affordable to a wide spectrum of clients.
What inspired you to go into this field?
The day-to-day hassle of decision-making under uncertainty.
What is your favorite invention
of all time?