Auckland Feb 2014
OptALI Optimisation in Industry Workshop, 24-26 February 2014, Auckland
The Department of Engineering Science at the University of Auckland is proud to announce the OptALI Optimisation in Industry Workshop, to be held at the University.
The workshop will bring together industry representatives and researchers. Monday afternoon of the workshop will run in conjunction with the New Zealand Analytics Forum. Among other topics, sessions will be themed around transport, health and energy. Everyone is invited to present their research or practical industry experience, and we are especially looking forward to topics with industry relevance. There will be sessions of most interest to researchers, and industry-focused sessions to which we will invite industry representatives.
Tea, coffee and biscuits will be provided between sessions, but lunches are not provided as there are many local restaurants to choose from.
Download the workshop flyer here.
Registration: If you would like to attend the workshop please let Olga know by email at o.perederieieva@auckland.ac.nz.
Speakers: If you intend to give a talk, please email your title and abstract to Olga by the 3rd of February (extended from the 24th of January).
Fee: Thanks to generous ORSNZ sponsorship the event is free for all attendees and speakers.
If you send us your contact details we will keep you posted on the workshop programme.
Where: Department of Engineering Science, University of Auckland. Sessions are held in G10 on the ground floor of Building 439 (70 Symonds St). Coffee breaks are in the common room on level 2.
When: Feb 24 – 26, 2014
Programme:
Monday 24.02.2014
Session 1: Applied Optimisation
Room: G10 | Chair: Andrea Raith

12:30 | Jesper Larsen | Technical University of Denmark
Rostering Ground Crew with Work Patterns
Abstract: In this presentation we will describe the Ground Crew Rostering Problem with Work Patterns, an important manpower planning problem arising in the ground operations of airline companies. We present a cutting stock-based integer programming formulation of the problem and describe a powerful heuristic decomposition approach, which utilizes column generation and variable fixing, to construct efficient rosters for a six-month time horizon. The time horizon is divided into smaller blocks, where overlaps between the blocks ensure continuity. The proposed methodology is able to circumvent one step of the conventional roster construction process by generating rosters directly based on the estimated workload. We demonstrate that this approach has the additional advantage of being able to easily incorporate robustness in the roster. Computational results on real-life instances confirm the efficiency of the approach.
Download Presentation

12:55 | Tiru Arthanari | University of Auckland
Applied Optimisation – What Practice Teaches
Download Presentation

13:20 | Caillor Woon | Fonterra
Analytics at Fonterra

13:45 | Oliver Weide | Merlot
Challenges of Optimisation in Practical Applications
Download Presentation
Analytics Forum: War Stories from Analytics
Room: Main Dining Room, Old Government House
Note: the Analytics Forum event is full; we cannot accept new registrations due to limited space at OGH.

15:00 | Katherine Davies | Telecom New Zealand
Big Company Analytics: Vodafone, Telecom and The Warehouse
Abstract: Katherine has extensive experience in the practical application of analytics in both the telecommunications and retail sectors. Her work in telecommunications began with a role as a Project Analyst at TelstraClear. Following Vodafone’s acquisition of TelstraClear, Katherine joined the Vodafone analytics team, where she spent 5 years as a Business Analyst, a Commercial and Pricing Analyst and a Senior Commercial Analyst. Her projects at Vodafone included detailed modelling and prediction of customer behaviour and data analysis to support pricing optimisation decisions. Katherine has also worked at The Warehouse in the Customer Insights team, where she was involved in market share analysis, competitor pricing analysis and reporting on the effectiveness of promotional activity. Katherine is currently at Telecom NZ, where she developed models to support Telecom’s SARC (subscriber acquisition & retention cost) and CLV (contract lifetime value) programmes before moving into a product analysis role in Telecom’s Wholesale division. In this talk, Katherine will share some of her experiences in applying analytics at these large NZ organisations.

Graeme Everett | Norske Skog
Norske Skog: Selling Analytics within a Multinational
Abstract: Graeme Everett has nearly 30 years’ experience working at the Kawerau Pulp and Paper Mill, formerly part of Fletcher Challenge and now a division of the Norwegian-based Norske Skog company. Norske Skog supplies all of NZ’s and about 30% of Australia’s newsprint requirements, with the mill being one of NZ’s largest industrial power users. Graeme has been involved in numerous analytics projects, ranging from production planning, electricity generation and market analysis to the development of strategic investment models that were used by the Norske Skog head office to rationalise their international capital base. This latter work generated annual savings of US$120 million, and was recognised internationally when Graeme’s team was selected as a finalist for the 2009 Franz Edelman award. Graeme will discuss lessons learnt from his experiences, including the challenges that arise from working as a sole practitioner in a large international company faced with declining demand for its products.

Per Thorlacius | Danish State Railways and the Technical University of Denmark
Why Good Projects Go Bad and What You Can Do About It
Abstract: It’s a fact of life that no matter how good a data-driven decision making project seems to be at start-up, some projects just go bad. In order to distinguish good from bad, and to regain control of a project that is not on track, we need to explore what can go wrong in projects and why. In this talk I will present personal and subjective views on projects that have ended badly, based on my own experiences in the airline and railway industries. I will outline different projects, their characteristics and their organizational dynamics, and attempt to synthesize general causes for why these good projects went bad and how they did so. With reference to the literature and to experience from successful industry projects, I will then propose personal and subjective advice on how to prevent good projects going bad.
Tuesday 25.02.2014
Session 2: Software for Optimisation
Room: G10 | Chair: Andrew Mason

9:00 | Andrew Mason | University of Auckland
OpenSolver and SolverStudio: Better Optimisation using Excel

9:25 | Geoff Leyland | Incremental
OpenStreetMap for Transport Modelling

9:50 | Jonas Harbering | Georg-August University Göttingen
LinTim as a Toolbox for Integrative Public Transportation Planning
Abstract: Growing cities and increasing movement flows raise the need to design Public Transportation to serve the needs of passengers. Still, planning the different elements of public transport in one step is not (yet) possible, since the needs and conditions are too complex. The usual approach is to split up the entire problem into various subproblems, such as stop location, line planning, timetabling, vehicle scheduling, and crew and delay management, and solve them sequentially. From a theoretical point of view, having such a sequence brings up the idea of combining the different steps within a modular framework. LinTim is intended to be a toolbox combining different approaches to each of the subproblems.
The first aim of LinTim is to solve the problem of Public Transportation Planning in a sequential process. Beyond that, it facilitates the investigation of integrative Public Transportation Planning. Based on such a modular framework, “integrative” can take various mathematical shapes: approaches can be compared, the dependencies between problems can be studied, problems can be solved iteratively or even in the same step, etc. Hence, the basic idea is to set up a framework for studying the relations that exist in the planning of Public Transportation. The talk is intended not only to give insight into the programming and handling of LinTim but also to show ways in which the integrative idea is applied.
Download Presentation

10:15 | Stuart Mitchell | Stuart Mitchell Consulting
PuLP: Easy Linear Programming in Python
View Presentation
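PuLP lets linear programmes be written directly as Python expressions. As a flavour of the modelling style (an illustrative sketch with made-up data, not an example from the talk), a toy production-planning LP might look like this:

```python
from pulp import LpMaximize, LpProblem, LpVariable, value

# Toy data, not from the talk: maximise 3x + 2y subject to capacity limits.
prob = LpProblem("toy_lp", LpMaximize)
x = LpVariable("x", lowBound=0)  # units of product A (hypothetical)
y = LpVariable("y", lowBound=0)  # units of product B (hypothetical)

prob += 3 * x + 2 * y, "profit"        # objective
prob += x + y <= 4, "shared_capacity"  # joint machine hours
prob += x <= 3, "demand_cap_A"         # demand limit on product A

prob.solve()  # uses PuLP's bundled CBC solver by default
print(value(x), value(y), value(prob.objective))  # 3.0 1.0 11.0
```

The constraints and objective are added with the same `+=` operator, which is what makes PuLP models read close to the algebraic formulation.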
Session 3: Multi-criteria Optimisation
Room: G10 | Chair: Olga Perederieieva

11:00 | Horst Hamacher | Technical University Kaiserslautern
Improved Box Representation of Pareto Sets and Application to Bicriteria Multicommodity Network Flows
Abstract: Whenever we have used multicriteria optimization in the past as a model for solving real-world problems such as cancer radiation planning, investment decisions and evacuation planning, the following issue needed to be addressed: how to choose, from the usually prohibitively large Pareto solution set, a good representative system (RS) of finitely many of these solutions. Without such an RS and a corresponding search tool within its database, decision makers will not accept the results of the multicriteria research community. The box algorithm of Hamacher et al. (2007) has been a valuable tool to provide such an RS.
In this talk several alternatives to this area-based box algorithm are suggested. It is argued that the distance approach, represented by the perimeter of the largest representing box, is better suited for practical applications. The resulting peripheral box algorithm is analyzed with respect to its worst-case complexity. Using a branch-and-price method, it is tested on instances of the 2-objective multicommodity network flow problem. Compared with the original area-based algorithm, it is shown to be superior in its output quality while being competitive with regard to running time. A further refinement is proposed by using surrogate models in which the Pareto curve is interpolated. The resulting theoretical speed-up in computing a box representation with given accuracy indicates the potential of this approach.
Download Presentation

11:25 | Siamak Moradi | University of Auckland
A Bi-Objective Decomposition Method for Solving the Bi-Objective Multi-Commodity Minimum Cost Flow Problem
Abstract: We present a new method for solving the bi-objective multi-commodity minimum cost flow problem. This method is based on the standard bi-objective simplex method and Dantzig-Wolfe decomposition. The method is initialised by optimising the problem with respect to the first objective, a single-objective multi-commodity flow problem, which is solved using standard Dantzig-Wolfe decomposition. Then, similar to the bi-objective simplex method, our method iteratively moves from one non-dominated extreme point to the next by finding entering variables with the maximum ratio of improvement of the second objective over deterioration of the first objective. Our method reformulates the problem into a master problem over a set of bundle constraints and several linear fractional sub-problems, each over a set of network flow conservation constraints. The master problem iteratively updates cost coefficients for the fractional sub-problems. Based on these cost coefficients, the optimal solutions of each sub-problem are obtained. The solution with the best ratio objective value out of all sub-problems provides the entering variable for the master basis. The method terminates when all the non-dominated extreme points have been obtained. The implementation of the method and numerical results are presented.
Download Presentation

11:50 | Morten Tiedemann | Georg-August University Göttingen
Competitive Analysis for Multi-Objective Online Optimization Problems
Abstract: In online optimization, an algorithm has to make decisions based on a sequence of incoming bits of information, without knowledge of future inputs. The performance of an online algorithm is commonly evaluated by comparing its objective value to the optimal offline solution, also referred to as competitive analysis. Thus far, the notions of online algorithms and competitive analysis are only known for single-objective optimization problems. We transfer the concept to multiple objectives and introduce competitive analysis for multi-objective optimization problems. Due to the shift from a single optimal solution in single-objective optimization to a set of efficient solutions in multi-objective optimization, the transformation of the concept of competitive analysis to multiple objectives is not straightforward. In this talk, the novel definition of multi-objective online optimization is discussed and substantiated by the analysis of multi-objective counterparts of classical online optimization problems.
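For readers new to competitive analysis, the single-objective setting it generalises can be illustrated by the classic ski-rental problem (a hypothetical sketch, not material from the talk): renting costs 1 per day, buying costs B, and the break-even algorithm rents for B - 1 days before buying, which guarantees a cost of at most (2 - 1/B) times the offline optimum.

```python
def break_even_cost(days, buy_price):
    """Online break-even strategy: rent for buy_price - 1 days, then buy."""
    if days < buy_price:
        return days                      # season ended before we bought
    return (buy_price - 1) + buy_price   # rented B - 1 days, then bought

def offline_cost(days, buy_price):
    """Offline optimum: knowing the season length, rent or buy outright."""
    return min(days, buy_price)

# Empirical worst-case ratio over a range of season lengths (B = 10).
B = 10
ratio = max(break_even_cost(d, B) / offline_cost(d, B) for d in range(1, 100))
print(ratio)  # 1.9, i.e. 2 - 1/B
```

The worst case occurs exactly when the season ends just after the purchase, which is the intuition the abstract's competitive-ratio framework formalises, and extends to multiple objectives.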
12:15 | Olga Perederieieva | University of Auckland
Modelling the Behaviour of Drivers: Bi-objective Traffic Assignment
Abstract: Traffic congestion is an issue in most cities worldwide. Transportation engineers and urban planners develop various traffic management projects in order to address it. One way to evaluate such projects is traffic assignment (TA). The goal of TA is to predict the behaviour of road users for a given period of time (morning and evening peaks, for example). Once such a model is created, it can be used to analyse the usage of a road network and to predict the impact of implementing a potential project. The most commonly used TA model is known as user equilibrium, which is based on the assumption that all drivers minimise their travel time or generalised cost, usually a linear combination of time and monetary cost. This approach is not general: it finds only a subset of all equilibrium solutions. We propose a conceptually different approach inspired by the multi-objective definition of optimality: the bi-objective user equilibrium (BUE). It considers two objectives separately and allows multiple solutions. In particular, we assume that all drivers minimise their travel time and toll. We show how the BUE can be found by applying a non-linear scalarisation technique. We implement several algorithms available in the literature to solve the resulting problem and perform numerical tests on benchmark instances and the Auckland Transportation Network.
View Presentation
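The multi-objective notion of optimality underlying the bi-objective equilibrium can be made concrete with a small dominance filter (a generic sketch with invented time/toll pairs, not code from the talk): a route choice survives only if no other choice is at least as good in both time and toll and strictly better in one.

```python
def non_dominated(points):
    """Keep the points not dominated by any other (minimising both objectives)."""
    def dominates(a, b):
        # a dominates b: no worse in both components, and not identical
        return a[0] <= b[0] and a[1] <= b[1] and a != b
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (travel time in minutes, toll in dollars) for alternative routes.
routes = [(30, 5), (25, 8), (30, 6), (40, 2), (35, 8)]
print(non_dominated(routes))  # [(30, 5), (25, 8), (40, 2)]
```

Every surviving pair represents a trade-off some driver might rationally choose, which is why the BUE admits multiple solutions where the classical generalised-cost model picks one.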
12:40 | Andrea Raith | University of Auckland
Biobjective Shortest Path Problems With Applications in Transport
Download Presentation
Session 4: Emergency and Evacuation
Room: G10 | Chair: Horst Hamacher

14:00 | Paul Rouse | University of Auckland
Understanding Outputs in Health
Download Presentation

14:25 | Philipp Heßler | Technical University Kaiserslautern
Evacuation by Bus with Integrated Location Decisions
Abstract: In German cities there is a constant threat from unexploded bombs from the Second World War. Mostly, these bombs are found on construction sites. If such a bomb is found, the neighboring houses have to be evacuated to shelter locations outside the endangered area (gymnasiums in most cases). We consider the evacuation of those people who do not have access to private transportation and rely on public transportation, in our case buses. In a first model the people are evacuated to fixed shelter locations as fast as possible. In the second, we include in the model the decision on where to locate the shelters and the collection points where evacuees gather. Our solution to the first model is a branch-and-bound algorithm. The second model can then be solved with a branch-cut-and-price algorithm which uses our first algorithm as a subroutine. Both algorithms perform very well compared to general integer programming solvers.
Download Presentation

14:50 | Marc Goerigk | Technical University Kaiserslautern
Robust Evacuation Planning with Buses
Abstract: In this talk we present a model for the problem of evacuating an endangered region with the help of buses that includes uncertainty in the given data: it is not known beforehand how many buses will be available, how many people need to be evacuated, or how long each route will take. Unlikely extreme scenarios are excluded with the help of a non-linear uncertainty set. Following the idea of recoverable robustness, we include the possibility of adjusting the schedule once the realised scenario becomes known. To solve the resulting multi-criteria integer program, we propose an iterative solution method that generates successive worst cases with respect to the current solution. Experiments show that while our method is able to solve instances with small uncertainty sufficiently fast, it is highly sensitive to increasing uncertainty size.

15:15 | Quan (Spring) Zhou | University of Auckland
Coordination of Stock Rotation Policies in a Medical Inventory System for Emergency Response
Abstract: Governments usually hold a large quantity of medical supplies, which we call “the reserve”, to ensure continuous supply after emergencies. Even though most medical supplies have very long shelf lives, many reserve items face expiration since the reserve is rarely used. One possible way to reduce expiration is to rotate reserve items into hospitals’ operational use, which is what we call “rotation”. To derive the optimal rotation policy, we investigate an inventory system for emergency response with perishable items with a long shelf life, and study the joint rotation and ordering decision in an environment with a minimum volume requirement that must be maintained at all times. We describe the system using two state variables: the number of items that have been rotated out from the reserve, and the difference between the inventory level in the hospital and the total number of items depleted from the reserve. We show that the optimal policy is characterised by two up-to levels: a rotate-up-to level and an order-up-to level. In each period, there are three cases: rotate all, rotate some, and rotate none. The actual decision depends on the interaction between the state variables and the up-to levels. We show that the whole shelf-life horizon can be divided into two phases: rotate none in the first phase, then start rotating (all or some) in the second phase. The rotate-up-to level increases as the remaining life of the reserve items decreases. Interestingly, we find that the order-up-to level decreases as time elapses. Specifically, the order-up-to level decreases in the non-rotation phase, and remains constant in the rotation phase, whether rotating all or some. This seems counterintuitive, because one may expect to rotate more, and thus order more, when approaching the end of the shelf life. The result is driven by the different tradeoffs around the rotation and ordering decisions.
Session 5: Integer Programming and Applications
Room: G8 | Chair: Jesper Larsen

16:00 | Theuns Henning | University of Auckland
Using Optimisation to Manage Infrastructure More Cost-Effectively
Abstract: Since the 1970s engineers have used prioritisation algorithms and optimisation processes to assist in the medium- and long-term planning of infrastructure maintenance investment. Unfortunately, engineers still regard the outcome of optimisation analysis with suspicion and often revert to prioritisation algorithms for planning purposes. International research has shown that ranking/prioritisation methods using a worst-first approach are usually 15-20% less efficient than an optimised programme (Mahoney, 1978). Simply stated, optimisation ensures the “best bang for the buck”, where the “bang” is determined by the user’s primary objectives for a network. These claims had been unproven for New Zealand until now. One of the NZ Transport Agency’s networks was analysed on the basis of a simple prioritisation, and the same network was also optimised using a heuristic optimisation method. The comparative analysis showed that the optimised programme not only provided a better-performing network over a ten-year horizon, it also had around 10% lower routine maintenance costs.
Long-term deterioration modelling has been adopted as part of the Network Outcomes Contracts, yet if it is not used effectively in the planning process the contracts may run the risk of not delivering the full benefits promised by this procurement process. One of the main challenges the industry needs to overcome is to use an integrated planning process that prioritises projects in the short term while taking full cognisance of the long-term strategic investment plan.

16:25 | Oliver Sinnen | University of Auckland
Tools and Resources for Task Scheduling Research
Abstract: Scheduling of task graphs (DAGs) on parallel systems with communication delay (P|prec, c_ij|C_max) is a classical scheduling problem. It is NP-hard in the strong sense and part of many computing problems. Given its importance, there is still strong interest from the research community in this problem. In this talk, tools and resources for research on this problem are presented. These include a visual tool for the analysis and manual manipulation of schedules, optimal solvers for small to medium-sized problem instances, and a database of optimal schedules for a large set of task graphs on different numbers of processors.
Download Presentation

16:50 | Sarad Venugopalan | University of Auckland
Integer Linear Programming Formulations for Optimal Task Scheduling with Communication Delays on Parallel Systems
Abstract: To fully benefit from a multiprocessor system, the tasks of a program must be carefully assigned and scheduled on the processors of the system such that the overall execution time is minimal. The associated task scheduling problem with communication delays, P|prec,c_{ij}|C_{max}, is a well-known NP-hard problem. We propose a novel Mixed Integer Linear Programming (MILP) solution to this scheduling problem, despite the fact that scheduling problems are often difficult for MILP solvers to handle. The proposed MILP solution uses problem-specific knowledge to eliminate the need to linearise the bilinear equations arising from communication delays. Further, the size of the proposed formulation, in terms of variables, is independent of the number of processors. The proposed formulation displays a drastic improvement in performance, which allows larger problems to be solved optimally. In general, it is of interest to see whether some of the ideas discussed may be used for more generic binary integer job scheduling problems.
Download Presentation

17:15 | Antony Phillips | University of Auckland
Optimisation Methods for Practical University Course Timetabling
Abstract: University course timetabling is a large resource allocation problem in which both times and rooms are determined for each class meeting. Practical problems typically feature a large number of classes, which can limit the computational techniques used for generating a timetable. There are also a multitude of quality measures in course timetabling which need to be considered. This talk discusses the use of mathematical optimisation to tackle the sub-problems of course timetabling, in the context of a practical timetabling process.

17:40 | Lin Chen | University of Auckland
The Applications of Optimisation in Decision Making for Road Maintenance
Abstract: Road maintenance supports our daily life by keeping the road network in a condition that provides the required service for all users. Decision making, as an important part of road maintenance, determines the maintenance strategies for a road network. After alternative maintenance strategies are generated, each strategy results in different road performance and expense. By selecting different strategies, decision making can roughly determine the benefits and costs of road maintenance. In order to find the most appropriate maintenance strategies, decision making needs to consider many requirements, including conflicting objectives, and to analyse long-term maintenance strategies for the entire road network. To assist decision making, multi-objective optimisation is increasingly applied. Multi-objective optimisation identifies Pareto solutions, which can clarify and simplify decision-making problems and help with trade-offs between objectives. However, previous applications may not be satisfactory when the decision-making problem is complex.
This talk introduces decision making for road maintenance and its requirements; sets forth the existing multi-objective optimisation approaches for solving decision-making problems; and discusses the drawbacks of previous applications and some ideas for addressing them.
Wednesday 26.02.2014
OptALI site leader meeting
Room: G10 | 10:30 – 11:30
Session 6: Electricity
Room: G10 | Chair: Golbon Zakeri

12:30 | Golbon Zakeri | University of Auckland
Analytics of Demand Participation
Abstract: Demand response is perhaps the single biggest remaining challenge in electricity market efficiency. We will present analytics tools developed around vSPD that can help a large consumer of electricity develop a distribution of price as a function of their demand response. We will also briefly discuss ILR offers of such large consumers of electricity.

12:55 | Tony Downward | University of Auckland
Multi-node Offer Stack Optimisation over Electricity Networks
Abstract: In this work we examine the problem that electricity generators face when offering power at multiple locations in an electricity market. The amount of power offered at each node can affect the price at the other node, so it is important to optimise both offers simultaneously. Even with perfect information (i.e. known demand and known offers from competitors), this is a non-convex bi-level optimisation problem. We first show how it can be formulated as an integer program using special ordered sets of type 2 (SOS2), enabling the problem to be solved efficiently. We then extend this work to allow for uncertainty, and hence find the profit-maximising offer stacks at each node (as opposed to a single quantity, as in the deterministic case). We demonstrate the intuition that can be gained from this model using a simple two-node example, and apply the methodology using real-world data from the NZEM.

13:20 | Steve Poletti | University of Auckland
Renewable Intermittency and Investment
Abstract: The rise of intermittent generation in electricity markets has led to renewed interest in capacity markets for security of supply. This is driven by the underlying assumption that the increased price volatility caused by intermittent generation will not provide sufficient revenue to incentivise the construction of enough peaking generation to ensure security of supply. However, this assumption has not been tested in the academic literature with any degree of rigour. In this paper, we use an agent-based model combined with a generation expansion model of the New Zealand (energy-only) electricity market to determine whether there is a level of wind penetration at which peak plants no longer cover their costs and security of supply is hence no longer guaranteed. We find that if firms bid into the market at short-run marginal cost, then increased wind reduces average prices and the return on peak plants. However, with strategic bidding, firms have more opportunities to exercise market power (when wind generation is low), which pushes up average prices and the return to peakers as wind capacity increases.
Download Presentation
Session 7: Health
Room: G10 | Chair: Andrea Raith

14:00 | John Simpson | Radiation Oncology, Newcastle Calvary Mater Hospital, Australia
Knowledge Based Treatment Planning – Evaluating Plan Quality Using Data Envelopment Analysis
Abstract: Radiotherapy treatment planning is a process that aims to deliver a prescribed dose of radiation to a target volume (tumour) whilst simultaneously minimising the radiation to surrounding normal tissue. This poses a conflicting set of challenges. The art of treatment planning is therefore to best balance the high-dose and low-dose requirements by reference to the tumour control dose on one hand and the tolerance dose for normal tissue on the other. Conventionally, the quality of a plan is determined by a number of factors, including the experience of the person planning the treatment, the particular patient anatomy, the time available and the features of the therapeutic delivery device. The plan is developed in an iterative fashion, with one or more treatment delivery parameters adjusted at each iteration. The final plan will meet all or most of some predetermined clinical acceptance criteria; however, it is not known whether the end result is truly optimal.
One solution is to generate a set of Pareto optimal plans satisfying a number of clinical objectives using multi-criteria optimisation methodology. This is an area of development in radiotherapy; however, at this time it is not widely available in the clinical setting. Another approach is to utilise what all clinics already have, namely a large database of past plans. From such a database it may be possible to identify peer optimal plans for comparison with the current plan. This is generally referred to as “knowledge based planning”, and one such method currently being investigated by the Auckland group uses the analytical method of Data Envelopment Analysis (DEA).
Method
DEA determines the efficiency of a process through the relationship between an input and an output. In our novel application to treatment planning we chose a measure of normal tissue dose as the input and a measure of tumour dose as the output. The hypothesis is that the derived efficiency can be used as a measure of plan quality and therefore compared against the plan database to establish whether the evaluated plan is fully efficient (optimal).
We retrospectively examined 37 clinically acceptable prostate treatment plans using rectal Equivalent Uniform Dose (EUD) as the normal tissue input and the dose received by 95% of the target (D95) as the tumour output. Each patient has a different anatomy, resulting in a different distance between rectum and prostate. This is handled in our DEA analysis as an environmental factor, so a plan is only compared against peers with equal or more advantageous anatomical conditions (i.e. equal or greater separation). We determined the efficiency of all 37 plans and selected 5 less efficient plans for re-planning, with the aim of improving the efficiency score. The plans were then analysed for clinical improvement.
Results
The mean efficiency of all 37 plans was 0.985 (SD 0.010). Of the 5 plans that were re-planned, the mean efficiency before re-planning was 0.973 (0.007) and after re-planning 0.992 (0.005). The increased efficiency was mainly due to decreased rectal dose, with a reduction in average gEUD from 62.9 Gy to 61.1 Gy, whilst the target coverage D95 was 71.4 Gy and 71.5 Gy pre- and post-replan. A blind clinical review of overall plan quality reported improvement in all 5 cases, mainly due to the decreased rectal dose.
Discussion
We have demonstrated with real clinical data that a 1-input, 1-output DEA analysis of prostate plans is able to identify plans that are less efficient and open to further improvement. We have also shown that the improved efficiency is matched by clinical improvement. In addition, we have shown that inter-patient variations due to anatomical differences can be managed through the concept of the environmental factor. We believe we have demonstrated a proof of concept for the use of DEA in radiotherapy treatment planning. Further work is now required to understand the optimal choice of inputs, outputs and the environmental factor across a range of treatment sites. The role of DEA in understanding treatment plan predictors for adverse treatment outcomes is a further avenue of investigation which we plan to undertake.
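With a single input and a single output under constant returns to scale, a DEA (CCR) efficiency score reduces to each unit's output/input ratio normalised by the best ratio observed. The sketch below illustrates that reduction on invented plan data; the environmental-factor handling and peer restrictions used in the actual study are not reproduced here.

```python
def ccr_efficiency(inputs, outputs):
    """Single-input, single-output CCR (constant returns to scale) DEA scores.

    With one input and one output, each unit's efficiency is its
    output/input ratio divided by the best ratio in the data set.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical plans: input = rectal gEUD (Gy), output = target D95 (Gy).
rectal_eud = [62.9, 61.1, 63.5]
target_d95 = [71.4, 71.5, 71.3]
scores = ccr_efficiency(rectal_eud, target_d95)
print(scores)  # the best plan scores 1.0; the others fall below 1
```

A score below 1 flags a plan whose tumour dose could, relative to its peers, be achieved with less normal-tissue dose, which is exactly the re-planning signal described in the abstract.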
14:25 | Ali Broadbent | Catalyze Ltd.
Stakeholder Engagement in Capital Budgeting at Counties Manukau District Health Board
Abstract: How did Counties Manukau District Health Board shift their capital budgeting process from one of competition for scarce resources between diverse stakeholders to a process with a high level of stakeholder alignment and ownership, while pressure on the capital budget continued to increase? Catalyze worked with Counties Manukau DHB to implement a process that prioritises capital investment using Decision Conferencing and Multi-Criteria Decision Analysis (MCDA). MCDA helps stakeholders to consider the cost of proposed investments along with both tangible and intangible benefits.
In a portfolio prioritisation Decision Conference, stakeholders score each proposed investment against criteria that articulate the organisation’s high-level goals, and then weight the relative preference of the criteria. Scoring and weighting are interactive processes that result in shared understanding amongst stakeholders. Catalyze use MCDA modelling software to capture the scores and weights in real time, and an order of priority for the capital requests, in terms of value for money, is generated and displayed during the Decision Conference.
The new process at Counties Manukau DHB has been through two iterations and has resulted in senior stakeholders enthusiastically participating in their capital budgeting process. Throughout the year there is an increased sense of confidence that they are spending their limited capital on the right things.
14:50 | Michael O’Sullivan | University of Auckland
Optimisation and Simulation for Improving Healthcare
Abstract: The ORUA research group has been using optimisation and simulation models in a variety of healthcare services since 2010. I will present a cross-section of this work and discuss our experiences working on practical healthcare problems with the ADHB, including: rostering of General Medicine registrars; simulating patient transits; and investigating workloads at LabPLUS.
Download Presentation
15:15 | Ilze Ziedins | University of Auckland
Patient Flow and Priorities
Social Programme:
Monday afternoon: New Zealand Analytics Forum event followed by drinks and nibbles.
Tuesday evening: A BBQ dinner will be provided on the Level 2 balcony at the Department of Engineering Science.
Wednesday evening: We will catch a ferry across the Auckland harbour to Devonport for dinner, and an optional short walk up to the Mt Victoria lookout. Dinner will be at The Patriot bar. Participants pay for their own ferry ticket and dinner.
Photos
Accommodation
Several hotels near the Department are listed below.

Index | Name | Room Type | Rate | Address | Note | Website
Q | Quest | Standard Studio Apartment | $159 | 3 Whitaker Place | | www.questcintra.co.nz
B | Bianco off Queen | Deluxe Queen Apartment / One Bedroom Classic Apartment | $132.25 / $155.25 | 2 White St | Mention coming to a conference at the University of Auckland | www.vrhotels.co.nz
C | Copthorne Hotel | | $113 | 150 Anzac Avenue | Quote “#95080” when booking | www.millenniumhotels.co.nz
P | Pullman Auckland | | $175 | Corner Waterloo Quadrant & Princes Street | Enter “ACPC” as Preferential Code | www.accorhotels.com/8219
We look forward to seeing you in Auckland! Feel free to get in touch if you have any queries.
-Andrea Raith and the Industry Workshop Organising Committee
Organising Committee