Don't Just Simulate, Solve!  Ten DOs and DON'Ts for preparing a useful and successful computer simulation (continued).
5. Do communicate your results effectively.

After you have finished your project, including testing of a variety of what-if scenarios, selecting the best alternative (or perhaps several candidates), and proving statistical significance, you must communicate your results within the organization.

I recommend that this take place in a presentation format, similar to the validation meeting we discussed in Do #2.  In most cases, your results will be too complex to simply write up and drop on someone's desk; the presentation will also most likely lead to discussions and trade-offs, and the final decision on which alternative to pursue will involve a number of different stakeholders.

The first thing to remember in preparing this presentation is that you have been living and breathing the simulation for a while, but your audience has not.  They are also unlikely to be as well versed in simulation technology and terminology as you are, so be careful to present your results in language they understand.  Use graphics and animations wherever possible.

Present the strengths and weaknesses of each of your alternatives in terms of labor savings, time savings, material savings, cost to implement, disruption during roll-out, repercussions to the rest of the organization, etc.  Also, remember to use the internal champions that you developed; invite them to attend, and bring them into the discussions as appropriate.  They will add credibility to your results, and will speed along the eventual implementation.

FIVE "DON'Ts" OF DEVELOPING A SIMULATION:

1. Don't include too many or too few functions in your model.

When you start carving out the part of the world that you want to simulate, you have to draw an imaginary dotted line around the functions that will be included in your model.  It is important not to cast this dotted line too wide, but you must be sure to include all necessary functions, processes, and resources.

How do you tell exactly how much of the real world must be modeled?  There are no hard and fast rules, but your best guideline is the question that you are trying to answer, as determined in Do #1.  Which machines, people, and policies affect that question?  Which, if changed, will alter the answer to that question?  Any function or resource that has no bearing on the given goal can safely be left out.

Our other guideline comes from Do #2, incremental development; if we leave out some essential part of our model, this will turn up in our developmental review, and we can make a course correction.  Similarly, if we include unnecessary elements, we will find that they are not exercised during our validation tests and we can eliminate them during model refinement.

2. Don't include too much or too little detail in your model.

This rule is similar to the previous one.  Just as the scope must be appropriate to the problem at hand, so must the depth of the model.  If there is insufficient detail in your simulation, the answers will be too crude to reflect reality and point out a solution; if there is too much detail, you will be wasting time and resources, and possibly obscuring the answers and insights that you would otherwise be able to draw.

The rule of thumb here is also the same as before: ask yourself which details are necessary for answering your given question.  Do I care how many parts arrive at this machine, or only how many pallets?  Do I need to know how many minutes a process requires, or can I live with production cycles per week?  Do I need to model first, second, and third shift separately, or can I use an average workload?  Base your decisions on answering the key questions and solving the key problems, not on reproducing every function of the real system.
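
If a sketch helps make the trade-off concrete, here is a purely illustrative one (every number below is invented): the same station modeled at part level or at pallet level answers the weekly-throughput question identically, but the finer-grained model generates many times the events to simulate and sift through.

```python
# A toy illustration of the pallets-versus-parts question (all numbers invented):
# modeled at part level or at pallet level, the station gives the same weekly
# throughput, but the finer-grained model generates many times the events.
parts_per_pallet = 40
part_cycle_min = 1.5                              # minutes to process one part
pallet_cycle_min = parts_per_pallet * part_cycle_min

week_minutes = 5 * 8 * 60                         # one shift, five days a week

parts_done = week_minutes // part_cycle_min       # part-level model: one event per part
pallets_done = week_minutes // pallet_cycle_min   # pallet-level model: one event per pallet

print(f"part-level  : {parts_done / parts_per_pallet:.0f} pallets/week "
      f"({parts_done:.0f} events simulated)")
print(f"pallet-level: {pallets_done:.0f} pallets/week "
      f"({pallets_done:.0f} events simulated)")
```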

Here again, we have room to guess incorrectly the first time and recover later.  Our incremental model development will enable us to recover from too much or too little detail, in the same way that it helped us include all necessary functions.

3. Don't forget about boundary conditions.

A simulation starts and ends, but a real factory tends to run continuously.  Even if you shut down overnight or on weekends, the work in process is there waiting for you when the doors open again.  If a computer model begins empty and idle, this could bias your data and lead to erroneous conclusions.

On the other hand, a service organization does tend to reset to zero each day.  When a bank locks its doors at five p.m., the staff still finish up all the customers on the premises, count the money, and leave everything tidy for the morning shift.  The bank starts empty and idle each morning, although there may be a line of customers waiting at the door.

The point is to make sure you understand how your real system works and model it accordingly.  If your system never reaches an empty and idle state, let your model "warm up" for a while before you begin to collect statistics.  How long must it warm up?  Check statistics periodically until they stabilize (our old friend the ANOVA will help here).  In the same vein, your shutdown logic must follow the real-life situation: either finish all parts currently in queues, or carry them over to the next shift, as appropriate.
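
As a hedged illustration of that warm-up check, the following sketch assumes the model emits a time series of hourly work-in-process counts (synthetic here) and applies a one-way ANOVA across batches of post-warm-up observations to judge whether the output has stopped drifting; the batch count and significance level are arbitrary choices, not a recipe.

```python
# A minimal sketch of the warm-up check described above.  The work-in-process
# series is synthetic, and the batch count and significance level are arbitrary
# illustrative choices, not a prescription.
import numpy as np
from scipy.stats import f_oneway

def warmed_up(observations, warmup, n_batches=5, alpha=0.05):
    """Return True if batch means after `warmup` show no significant drift."""
    steady = np.asarray(observations[warmup:], dtype=float)
    if len(steady) < n_batches * 2:
        return False                          # not enough data to judge
    batches = np.array_split(steady, n_batches)
    _, p_value = f_oneway(*batches)           # H0: all batch means are equal
    return p_value > alpha                    # can't reject H0 -> looks stable

# Hourly work-in-process counts from a hypothetical factory model that
# starts empty and idle, then ramps up to its long-run level.
rng = np.random.default_rng(42)
ramp_up = np.linspace(0, 50, 200) + rng.normal(0, 3, 200)
long_run = 50 + rng.normal(0, 3, 800)
wip = np.concatenate([ramp_up, long_run])

for candidate in (0, 100, 200, 300):          # candidate warm-up lengths (hours)
    print(f"discard first {candidate:>3} h -> stable: {warmed_up(wip, candidate)}")
```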

4. Don't drown your audience in statistics.

Getting back to your final presentation, don't forget that the statistical analyses are your job, not your audience's.  I like to keep ANOVA and other statistical results on backup slides, and pull them out only if someone asks.  Otherwise, you drown your message and blunt the impact of your proposed changes.

Also, don't bore your audience with all the failures.  If you tested 20 what-if scenarios and only three showed an improvement, show those three.  Unless you were specifically charged with evaluating particular alternatives that turned out to be losers, focus on the winners.  If all 20 turned out to be winners, show the top three or four, or perhaps the three or four most different approaches.  Keep the rest as backup in case someone asks.
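
The sketch below illustrates the kind of backup analysis this implies; the scenario names and throughput figures are invented.  It runs a one-way ANOVA across the replications of each what-if scenario, then ranks them so only the winners make the main slides while the statistics wait in reserve.

```python
# A sketch of the backup analysis implied above: per-replication throughput for
# each what-if scenario (names and numbers invented), a one-way ANOVA to confirm
# that the scenarios really differ, and a ranking so only the winners appear on
# the main slides.
import numpy as np
from scipy.stats import f_oneway

replications = {                      # throughput (units per shift) by replication
    "baseline":       [412, 405, 418, 409, 414],
    "extra_forklift": [431, 428, 436, 425, 433],
    "second_packer":  [447, 452, 445, 450, 449],
    "cross_training": [420, 415, 424, 418, 422],
}

f_stat, p_value = f_oneway(*replications.values())
print(f"ANOVA across scenarios: F = {f_stat:.1f}, p = {p_value:.4g}")  # backup slide

ranked = sorted(replications, key=lambda name: np.mean(replications[name]),
                reverse=True)
for name in ranked[:3]:                                   # winners for the talk
    print(f"{name}: mean throughput {np.mean(replications[name]):.1f}")
```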

5. Don't fall in love with methodology.

Finally, remember that your task wasn't to build a simulation; it was to improve the performance of some real-life system.  If it turns out that simulation is not the best approach, don't be afraid to change direction.  In some situations, operations research techniques may be more applicable.  In others, the necessary answers can be found with a spreadsheet.  In some cases, you will even find a closed-form analytical solution to the problem at hand.  Just because you own a hammer, it doesn't follow that everything you come across is a nail.
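
For instance, a single workstation fed by Poisson arrivals with exponential service times is the textbook M/M/1 queue, and its long-run behavior can be written down directly; the sketch below (with invented rates) computes it without running a single simulation.

```python
# One example of the closed-form case: a single workstation with Poisson arrivals
# and exponential service times is the textbook M/M/1 queue, whose long-run
# behavior follows directly from two rates.  The rates below are invented.
arrival_rate = 8.0       # jobs per hour (lambda)
service_rate = 10.0      # jobs per hour (mu); must exceed the arrival rate

rho = arrival_rate / service_rate                        # utilization
avg_jobs_in_system = rho / (1 - rho)                     # L  = rho / (1 - rho)
avg_time_in_system = 1 / (service_rate - arrival_rate)   # W  = 1 / (mu - lambda)
avg_wait_in_queue = rho / (service_rate - arrival_rate)  # Wq = rho / (mu - lambda)

print(f"utilization        : {rho:.0%}")
print(f"avg jobs in system : {avg_jobs_in_system:.2f}")
print(f"avg time in system : {avg_time_in_system:.2f} h")
print(f"avg wait in queue  : {avg_wait_in_queue:.2f} h")
```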

This rule especially applies if you are a hired-gun consultant brought in specifically to develop the simulation.  No client wants to be taken advantage of, and I have found that they always appreciate a better approach than the one they asked for, particularly if it saves them time and money.

SOME FINAL THOUGHTS:

Whittling appropriate modeling technique down to five dos and don'ts is a bit arbitrary, of course, so here are some other odds and ends to keep handy in the backs of our minds, in no particular order: take some time to ensure the integrity of your input data, or you'll suffer from the GIGO syndrome; decide up front whether you need a discrete or a continuous simulation, based on the nature of your system; select the software tool most appropriate for the task at hand; don't forget to do a cost/benefit analysis on your recommended changes; don't be too disappointed if none of your suggestions are ever implemented (this happens a lot!); and remember the words of George E. P. Box: "All models are wrong, but some are useful."