|Date||Speaker||Presentation Title & File Link|
|12/07/2007||Yinyu Ye||Efficient Optimization by Equitable Convex Partitions|
Y. Ye, Stanford University
Given a convex polygon (in 2D) specified by its vertices, and k other points (hubs/servers) located inside the polygon, we present a fast algorithm to partition the polygon into k subregions such that the following three properties hold: (i) the closure of each subregion is a convex polygon, (ii) each subregion contains exactly one of the k points, and (iii) all subregions have equal area.
We show its extensions and applications in multi-depot vehicle routing, air traffic control zoning, client/server load balancing, and possible applications in Smart Fields such as well placement.
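To illustrate the flavor of an equal-area convex partition, here is a minimal sketch in Python. It is not the algorithm presented in the talk (it ignores the hub-location constraint): it simply fans a convex polygon into k equal-area wedges from its centroid, cutting the boundary wherever the accumulated area reaches A/k. Each wedge is the intersection of the polygon with a sector at the centroid, so it is convex whenever the sector angle stays below π.

```python
def shoelace_area(poly):
    """Signed area of a polygon given as a list of (x, y) vertices (CCW positive)."""
    a = 0.0
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        a += x1 * y2 - x2 * y1
    return a / 2.0

def centroid(poly):
    """Centroid of a simple polygon (standard shoelace-based formula)."""
    a = shoelace_area(poly)
    cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        cross = x1 * y2 - x2 * y1
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    return (cx / (6.0 * a), cy / (6.0 * a))

def equal_area_fan(poly, k):
    """Cut a convex CCW polygon into k equal-area wedges radiating from its centroid.

    Walk the boundary accumulating triangle areas with apex at the centroid;
    the area of the partial triangle (c, p, r) is linear in the position of r
    along the edge, so each cut point is found by linear interpolation.
    """
    c = centroid(poly)
    target = shoelace_area(poly) / k
    n = len(poly)
    pieces, current = [], [c, poly[0]]
    acc, i, p = 0.0, 0, poly[0]
    while len(pieces) < k - 1:
        q = poly[(i + 1) % n]
        t = shoelace_area([c, p, q])
        if acc + t >= target - 1e-12:
            s = (target - acc) / t          # fraction of edge p->q to reach target
            r = (p[0] + s * (q[0] - p[0]), p[1] + s * (q[1] - p[1]))
            current.append(r)
            pieces.append(current)
            current, acc, p = [c, r], 0.0, r  # start next wedge at the cut point
        else:
            acc += t
            current.append(q)
            p, i = q, i + 1
    for j in range(i + 1, n + 1):           # last wedge: remaining boundary
        current.append(poly[j % n])
    pieces.append(current)
    return pieces
```

For a 2x2 square and k = 4, each wedge comes out with area exactly 1. The talk's algorithm additionally guarantees that each piece contains its assigned hub, which this centroid fan does not attempt.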
|11/29/2007||Sergio Zarantonello||An interdisciplinary software environment for Earth Sciences|
S. Zarantonello, 3DGeo Inc.
This presentation is on a DOE project that J. Harris and S. Zarantonello are leading. Their objective is a software platform for 4-D seismic surveillance of CO2 sequestration processes, which ties in naturally to history matching of reservoir simulation models using time-lapse seismic imaging.
|11/15/2007||Reidar Bratvold||Decision-making in the Oil & Gas Industry|
R. Bratvold, University of Stavanger
The battle against deterministic forecasts and financial valuations in the oil and gas industry appears to have been won, as evidenced by the plethora of papers, conference sessions, and forums focused on uncertainty quantification and probabilistic forecasting. In light of this focus, it seems appropriate to scrutinize its value. A natural question to ask is whether the focus on uncertainty modeling has actually improved decision making.
In this seminar, we present the findings of a survey of oil and gas professionals that addressed the following two questions: To what degree has uncertainty quantification improved in the oil and gas industry over the last five years? Has this improvement translated into improved decision making?
Uncertainty quantification is not an end unto itself; removing or even reducing uncertainty is not the goal. Rather, the objective is to make a good decision, which in many cases requires the assessment of the relevant uncertainties. The oil and gas industry seems to have lost sight of this goal in its good-faith effort to provide decision makers with a richer understanding of the possible outcomes flowing from major decisions. The industry implicitly believes that uncertainty is reduced simply by modeling it and that decision quality improves with more information. Increased computing power, combined with increasing sophistication in model building, now permits the construction of very detailed models incorporating most relevant parameters and their dependencies. Yet decision makers are still not sure what they should do. Rather than basing important decisions on a single number, they now find themselves buried under a mountain of probability distributions. Uncertainty quantification seems to have confused as much as it has enlightened, and we are reminded of Peter Drucker's sage advice that:
"There is nothing as inefficient as very efficiently doing the wrong things."
To counter this uncertainty-induced confusion, we present a decision-focused uncertainty quantification framework, which we hope, in combination with our survey results, will aid in the development of better decision-making tools and methodologies.
|10/31/2007||Marco Thiele||Mature Field Management Using Streamline-Derived Data|
M. Thiele, Streamsim Technologies/ Stanford University
A bread-and-butter field management task for many reservoir engineers is to set monthly or quarterly injection/production target rates in mature oil fields under waterflood or some other type of secondary or tertiary displacement. In many cases, these fields are producing well above the 75% water cut mark, and thus increasing the efficiency even by a few percentage points is a key target of overall field management. Despite its importance, the management of such fields is surprisingly simplistic, with the final rate targets generally set by some sort of Excel spreadsheet put together by the production engineers. A more proactive management tool is clearly needed. One possibility is to use streamlines.
Streamline-based flow simulation is generally associated with "faster" simulations and is therefore seen as a good proxy for any workflow that might require many forward simulations. However, an overlooked benefit of streamline simulation is the novel data it generates, such as the identification of injector-producer pairs and the reservoir volumes associated with those pairs. Armed with these data, it is possible to "promote" more efficient injector-producer pairs and demote less efficient pairs, which can lead to a more successful field management strategy.
This talk will present work in this area. The intent is to raise awareness of streamline-specific data and how it might be used within the Smart Fields consortium.
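The promote/demote idea can be sketched as a simple rate-reallocation rule: compute each injector-producer pair's efficiency (oil produced per unit water injected, from the streamline-derived allocation), then shift injection toward above-average pairs while preserving the field total. The pair names, rates, and the 25% cap below are all hypothetical, and this is only an illustration of the concept, not the method presented in the talk.

```python
# Hypothetical streamline-derived pair data:
# (injector, producer, allocated injection rate [rb/d], allocated oil rate [stb/d])
pairs = [
    ("I1", "P1", 4000.0, 900.0),
    ("I1", "P2", 2000.0, 150.0),
    ("I2", "P2", 3000.0, 600.0),
    ("I2", "P3", 1000.0,  50.0),
]

def reallocate(pairs, max_change=0.25):
    """Promote efficient injector-producer pairs, demote inefficient ones.

    Each pair's rate is scaled by (pair efficiency / field-average efficiency),
    capped at +/- max_change, then all rates are rescaled so total injection
    is unchanged.
    """
    total_inj = sum(q for _, _, q, _ in pairs)
    total_oil = sum(oil for _, _, _, oil in pairs)
    mean_eff = total_oil / total_inj          # field-average oil per unit water
    new = []
    for inj, prod, q, oil in pairs:
        eff = oil / q                         # this pair's oil per unit water
        factor = max(1.0 - max_change, min(1.0 + max_change, eff / mean_eff))
        new.append((inj, prod, q * factor))
    scale = total_inj / sum(q for *_, q in new)   # preserve total injection
    return [(inj, prod, q * scale) for inj, prod, q in new]
```

With the hypothetical numbers above, the efficient I1-P1 pair (0.225 stb/rb vs. a field average of 0.17) is promoted, the weak I1-P2 pair is demoted, and the total injected volume stays at 10,000 rb/d.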
|10/18/2007||David Echeverria Ciaurri||An Overview of Smart Fields at Stanford|
D. Echeverria Ciaurri, Stanford University
The Stanford Smart Fields Consortium is a multidisciplinary industrial affiliates program with participants from several departments at Stanford University. Its research involves the development and testing of optimization, risk assessment, decision-making, real-time monitoring, and model calibration techniques for the oil industry.
In this talk we will first present the Stanford Smart Fields Consortium and describe the main lines of research pursued there. In the second part of the presentation we will go through the loop that represents the Smart Fields paradigm and illustrate some of its intermediate stages with projects carried out within the Consortium.