Don't Just Simulate, Solve!  Ten DOs and DON'Ts for preparing a useful and successful computer simulation.
by John Cesarone, Ph.D., P.E.
     © 2000, Institute of Industrial Engineers
Simulation software is a great boon to the modern engineer: cheap, abundant, and easy to learn and use.  The flip side of this accessibility is that engineers are often called upon to solve problems using a simulation, but without the benefit of years of experience, libraries of past models, and departments full of grizzled veterans to dispense advice.  We end up working by the seat of our pants, often misusing this powerful but subtle tool, and sometimes coming up with irrelevant or even erroneous results.

The purpose of this article is to share some hard-won experience from one of those grizzled veterans, with some practical tips for ensuring that your simulation really addresses the needs of your situation and of your organization.  With this in mind, I have assembled five essential "DOs" and five essential "DON'Ts" for ensuring that a simulation is useful, practical, and efficient; that it solves the problems for which it was developed; and that the overall experience is a positive one for the engineer and for the organization.


FIVE "DOs" OF DEVELOPING A SIMULATION:

1. Do take the time up front to decide what your goal really is.

Often the phrasing of your nominal assignment will not describe the real problem, so be sure to do some homework before you start assembling your simulation.  What are the real problems, and what are the real goals? Remember that you can't get what you want until you know what you want, so spend some time on properly defining the problem.

For instance, many plant managers will complain that an expensive machine tool or processing line is underutilized, and will want you to find ways to get more work through it.  But this is probably not the real problem, and solving it may only increase work in process inventory or cause other disruptions.  The real problem is more likely to be an upstream bottleneck, a lack of sales, or maybe that the machine wasn't really needed in the first place!

Also, beware of conflicting objectives.  There are many metrics of plant efficiency, and you cannot optimize them all at once.  For example, you cannot simultaneously minimize inventory levels and backorder waiting time with the same policy; they usually involve a tradeoff.  By the same reasoning, you will rarely be able to maximize machine utilization while maximizing flexibility of production mix; they require different approaches to lot sizing.  Similarly, a long-term rigid production schedule will maximize purchasing and scheduling efficiency, but will not optimize your responsiveness to customer demand.  The trick to resolving these kinds of conflicting objectives is to realize that they are really only intermediate objectives; you really want to maximize sales, minimize production costs, or optimize some other bottom-line metric.
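The inventory-versus-backorder tradeoff takes only a few lines of arithmetic to demonstrate.  In the sketch below, the demand distribution and stocking levels are invented purely for illustration; a real study would draw them from your own plant's data:

```python
from statistics import mean
import random

rng = random.Random(42)
# Hypothetical daily demand, uniform on 0..20 (illustrative only)
demands = [rng.randint(0, 20) for _ in range(10000)]

# For each stocking level S, leftover stock and unmet demand move in
# opposite directions: no single S minimizes both at once.
for S in (5, 10, 15):
    leftover = mean(max(S - d, 0) for d in demands)   # avg inventory held
    unmet    = mean(max(d - S, 0) for d in demands)   # avg backorders
    print(f"stock {S:2d}: inventory {leftover:5.2f}, backorders {unmet:5.2f}")
```

Raising the stocking level drives backorders down and inventory up, and lowering it does the reverse, which is why both metrics can only be balanced against a bottom-line objective, never jointly minimized.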

2. Do use an incremental approach to model development.

Start with a rough outline, with minimal detail, and simulate your system in broad strokes.  Remember that the odds are good you will be going off in the wrong direction at first, or at least not in the ideal direction.  Spend a few days on this rough model, and then solicit feedback from experts who live with the real system day in and day out.  Ask the plant manager if your rough model resembles the way the line really functions.  Ask the production supervisor if you have reproduced the releasing policies correctly.  Ask the foremen if this is the way decisions are made and implemented.

I like to schedule a meeting for this type of discussion about a third of the way into model development.  Get the appropriate personnel involved, explain to them all of your assumptions, show them your output and some rough statistics, and see if they agree with how your model is behaving.  Not only will they be able to point out deviations from reality, they will help ensure that you are asking (and answering) the appropriate questions.  Animated output is a big help at this meeting, since you will often be dealing with people new to the simulation concept.

You might want to plan on some "what if" tests for this meeting as well.  For example, shut down one machine in a production line and see what it does to output statistics.  Your production manager should have a good feel for what the real effect would be, and can help validate your model and its results.  Alternatively, your foreman probably has a workstation or two where he frequently needs overtime labor; see if doubling resources at this location has the expected effect.
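A resource-doubling test of this kind can be sketched in a few dozen lines.  The toy model below simulates a single queue feeding one or more identical machines; all of the rates, job counts, and the exponential-time assumption are made up for illustration, and a real validation run would use your own model and your own plant's data:

```python
import heapq
import random

def avg_wait(num_machines, arrival_rate, service_rate, n_jobs, seed=1):
    """Average queueing delay for n_jobs jobs served first-come,
    first-served by num_machines identical machines (toy model)."""
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * num_machines       # time each machine next frees up
    heapq.heapify(free_at)
    total_wait = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)        # next job arrives
        start = max(t, heapq.heappop(free_at))    # wait for earliest machine
        total_wait += start - t
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
    return total_wait / n_jobs

# "What if" test: double the resources at a heavily loaded workstation
base   = avg_wait(1, arrival_rate=0.9, service_rate=1.0, n_jobs=20000)
double = avg_wait(2, arrival_rate=0.9, service_rate=1.0, n_jobs=20000)
print(f"one machine: {base:.2f}  two machines: {double:.2f}")
```

At a station running near capacity, doubling the resource collapses the waiting time, which is exactly the kind of large, predictable effect your foreman can sanity-check against experience.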

After a meeting of this sort, you can proceed with much more confidence on the detailed development of the model.  If it is a very complex or involved model, you might want to schedule several of these tweaking meetings, just to make sure you stay on the right track.

3. Do develop buy-in for your project as you develop your model.

Any organization sufficiently complex to warrant a computer simulation will undoubtedly have a large number of differing viewpoints and stakeholders, not always sharing the same agenda.  Chances are, the facility or department that you are simulating will see you as an outsider, and possibly as a threat.  I've always found that it is essential to develop buy-in and ownership of your project among the insiders if the results are to be accepted.

Ask for advice and insight from as many people as you can.  Most are willing to share their knowledge if asked respectfully.  Don't assume that you know better than they do; you probably don't!  Spend a lot of time on the floor, and make sure they see you and know that you are not working from an ivory tower.  Get a feel for the dynamics of the operations, pick up the particular jargon they use, and understand their perspectives and their problems.  Unless you are working with a completely robotized facility, remember that the people involved are the greatest source of problems and of solutions.

I always like to ask the people on the shop floor how they would solve the current problem, not just for explanations of the status quo.  Many times, I've found that the solution eventually adopted was envisioned long ago by inside people who had no voice with the higher-up decision makers.  If you find yourself in this situation, you have a built-in ally and internal champion; make sure to get this person involved.  If possible, bring him in for your final presentation as a domain expert, name a scenario after him, make him the true hero.  This will go a long way toward getting your recommendations adopted and accepted.

4. Do make sure that your conclusions are statistically significant.

Once your as-is model is validated, you will be testing a variety of what-if scenarios and observing how they affect system performance.  Hopefully, one or more of these scenarios will result in improved performance.  But is the improvement significant?  Or is it just a pleasant but coincidental outcome due to the random numbers used in your simulation?  Remember that not all differences are significant differences, especially in a statistical sense.

The classical method for demonstrating statistical significance is the ANOVA, or analysis of variance.  This statistical tool is not too difficult to understand or to perform, and if you are not familiar with it, take half a day and read up on it in your old college statistics text.  It can be performed with a hand calculator if necessary, or with a standard spreadsheet program such as Excel.  If you don't feel comfortable with this type of math, solicit the advice of a statistician or a more statistics-oriented engineer colleague.
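As a sketch of the arithmetic involved, the one-way ANOVA F statistic can be computed directly from the replication results of each scenario.  The throughput figures below are invented for illustration; you would substitute the output statistics from your own replications:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic for several groups of replication
    results (e.g. throughput from repeated runs of each scenario)."""
    k = len(groups)                       # number of scenarios
    n = sum(len(g) for g in groups)       # total number of replications
    grand = mean(x for g in groups for x in g)
    # Between-group sum of squares: variation explained by the scenario
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: variation due to random-number seeds
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    msb = ssb / (k - 1)                   # mean square between
    msw = ssw / (n - k)                   # mean square within
    return msb / msw

# Hypothetical throughput (parts/hour) from five replications each
as_is   = [101.2, 98.7, 100.4, 99.1, 100.9]
what_if = [104.8, 103.1, 105.6, 102.9, 104.4]
f = one_way_anova_f(as_is, what_if)
print(f"F = {f:.2f}")
```

The resulting F value is then compared against a critical value from an F table (at your chosen significance level, with k-1 and n-k degrees of freedom) to decide whether the scenarios really differ or the gap is just seed-to-seed noise.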

The dangers of ignoring this "do" cannot be overemphasized.  Your recommended changes to the system will not come without a cost; even if no new equipment is purchased, the changes to policies and disruption to production while they are being rolled out will not be free.  If the anticipated improvements do not materialize, your project will be a failure, and your reputation will suffer severely.  Before you go out on a limb and suggest your solution be implemented, make sure that you are standing on sound statistical ground.