The Case for Simulation

By Israel del Rio

© 2008-2009 Abstraction Co—Makers of Prophesy

The increasing popularity of Service Oriented Architectures, combined with complex Client/Server and n-Tier designs that require an integrated set of diverse components, has brought renewed attention to system design as a proactive system management activity.

Traditionally, the use of simulation for system design has been the exclusive domain of Fortune 500 companies with the budget to sustain specialized staff and to afford expensive in-house or off-the-shelf modeling tools. Because even simple Client/Server systems, which by definition consist of multi-vendor and heterogeneous components, now present non-deterministic performance and capacity planning behaviors, several popular articles on the subject of simulation have recently been published. While most of these articles offer good entry-level information on simulation, some have, unfortunately, highlighted myths and misconceptions about the usability of simulation in system design. These articles tend to focus on a handful of recurring criticisms: that the same design information can be estimated by other means, that modeling is impractical, that it demands PhD-level expertise, that its results are uncertain, and that it is too expensive.

While there is, no doubt, some degree of legitimate frustration driving these comments, we would like to respond to each of these points.

Can you really guesstimate the design information by other means?

None of the criticisms specifically challenges the need to design your system proactively. After all, no one would think it appropriate for a Civil Engineer to construct a building without a design blueprint, add a floor, and then measure the building's stress to "guesstimate" whether more floors can be added. This criticism, however, skirts that concept very closely. There is a Catch-22 in that no representative information exists before you build your system, and simply dividing workload by resource capacity gives a distorted image of reality.

Consider this example: A theater ticket-seller can process 10 ticket sales per minute and, on average, ten movie-goers show up every minute. If each customer arrived exactly every six seconds, the ticket-seller would be busy 100% of the time and no waiting queue would ever form. In real life, though, the ten movie-goers will arrive at irregular intervals. Perhaps during the first 10 seconds three customers show up, then during the next 40 seconds no one arrives, and during the final 10 seconds another seven customers arrive: still ten customers in that minute, but bunched together. In this latter case, queues will form, waiting times will be non-zero, and the ticket-seller will not be busy 100% of the time. Statisticians refer to the pattern in which customers arrive as the inter-arrival distribution and, depending on its "shape", these distributions receive names such as Poisson, hyper-exponential, and so on. These and other names have been a frequent source of headaches for entry-level simulation practitioners.
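To make the ticket-seller example concrete, here is a minimal sketch in Python (our own illustration, not output from any particular modeling package) that simulates a single server with randomly spaced arrivals and sales. We assume nine arrivals per minute against a service rate of ten per minute; with exactly matched rates, as in the example above, the queue never settles down, a point we return to below.

    import random

    def simulate_ticket_seller(arrival_rate=9/60, service_rate=10/60,
                               n_customers=100_000, seed=1):
        """Single-server queue with exponential inter-arrival and service times."""
        random.seed(seed)
        clock = 0.0           # arrival time of the current customer (seconds)
        server_free_at = 0.0  # time at which the seller finishes the current sale
        busy_time = 0.0
        total_wait = 0.0
        for _ in range(n_customers):
            clock += random.expovariate(arrival_rate)   # next irregular arrival
            start = max(clock, server_free_at)          # wait if the seller is busy
            service = random.expovariate(service_rate)
            total_wait += start - clock
            busy_time += service
            server_free_at = start + service
        return total_wait / n_customers, busy_time / server_free_at

    avg_wait, utilization = simulate_ticket_seller()
    print(f"average wait: {avg_wait:.0f} s, utilization: {utilization:.0%}")

Even this toy model shows the behavior described above: the seller is idle part of the time, yet customers still wait noticeably because arrivals bunch up.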

Formulas exist to compute the average service and wait times for scenarios like the one discussed. Unfortunately, these formulas are easy to apply only in the simpler cases like the one mentioned (referred to as M/M/1: a single queue and a single server, with exponential inter-arrival and service times). Furthermore, formulas only give answers once a steady state has been reached. That is, formulas do not account for transient service times; nor can they predict average response times in cases where the arrival rate exceeds the service rate, since there the queue simply grows without bound. In real-life cases, simulation is often needed and cannot be substituted by simple napkin arithmetic.
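For the curious, the M/M/1 steady-state results mentioned above are short enough to show. The sketch below applies the textbook formulas (utilization rho = lambda/mu, average wait in queue Wq = rho/(mu - lambda), average time in system W = 1/(mu - lambda)) to the ticket-seller rates assumed earlier; they are meaningful only while the arrival rate stays below the service rate.

    def mm1_metrics(arrival_rate, service_rate):
        """Textbook M/M/1 steady-state formulas; valid only when arrival_rate < service_rate."""
        if arrival_rate >= service_rate:
            raise ValueError("no steady state: the queue grows without bound")
        rho = arrival_rate / service_rate                     # server utilization
        wait_in_queue = rho / (service_rate - arrival_rate)   # average wait before service
        time_in_system = 1.0 / (service_rate - arrival_rate)  # wait plus service
        return rho, wait_in_queue, time_in_system

    # Nine arrivals and ten sales per minute, expressed per second:
    print(mm1_metrics(9/60, 10/60))   # roughly (0.9, 54 s, 60 s)

The simulated averages from the previous sketch should land close to these figures, which is precisely the kind of cross-check the formulas are good for, but only once the queue has reached steady state.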

A second aspect is that actual system measurements do not always reveal congestion situations. When you measure the utilization of a Wide Area Network link, you simply do not know whether packets have been dropped down the line because a router was congested. If you measure the router's utilization, you do not know whether an improperly configured routing entry is affecting the measured performance. There is a place for benchmarking, but it is rarely a substitute for system modeling.

How practical is Modeling?

Shouldn't you understand the parameters and elements of your system a priori even if you do not intend to simulate? The criticism that modeling is not recommended because it forces you to learn everything there is to know about your future or existing system is actually an endorsement for simulation. Yes, with simulation you are forced to establish a discipline to understand and evaluate your system during its design, or while planning its optimization, before you spend the money. The myth is that you cannot use simulation effectively unless you have complete knowledge of the system parameters. Think of simulation as an activity akin to building a comprehensive spreadsheet inventory application. You have surely faced situations where you did not have precise information about how inventory is tracked and settled, temporarily, for an informed guess. Undeterred, you calculated the spreadsheet and evaluated whether the results were consistent with your experience or were, at least, not off-the-wall. Perhaps that first iteration gave you some useful information, but once started you could then enhance the spreadsheet model by adding information as it came along. The same process can be followed with modern simulation tools. Frankly, it is not realistic to expect immediate, easy answers to complex questions. A methodical and iterative approach to system design is always advisable.

Do you need to be a PhD to simulate?

Unlike other products for which a ready-made metaphor already existed (Spreadsheets based on Ledger Sheets, Word Processors based on Typewriters, Database packages based on Rolodex cards, etc.), it is difficult to find a ready-made metaphor for what simulation does, other than reality itself. This is because simulation is such a work-intensive task that it was rarely done manually in the past. Simulation is inherently effort-consuming, and it is true that it is not as approachable as an application that works for you right out of the shrink-wrapped package. Chances are that with simulation you actually have to read the enclosed manual before you can achieve some degree of success, but this is not equivalent to saying that the concepts and methodologies involved are beyond the reach of any systems professional. Contrary to some critics' assertions, good modeling packages do not require you to become an expert in statistics or queuing theory. You only need to become an expert in the package you use. Earlier, we explained the concept of inter-arrival distributions. This is surely a concept that makes sense once its reason for existence is understood. There are a few other concepts that need to be grasped in order to use simulation tools effectively, but in general, once you have climbed the initial, admittedly painful, learning curve of any modeling package, you will be in a position to benefit from it.

Are Simulation Results Certain?

Here we will refer you to the spreadsheet example. Your results will be only as good as the quality and quantity of the information you have entered. You should never expect exactitude from the exercise of simulation, which is by its very nature non-deterministic and, in the end, an informed guess. What you should expect from simulation are reliable "ball-park" estimates. Do not lose sight of the fact that guessing that your system's response time will be "under one second or so", no matter what your background on the problem might be, is not a scientific exercise. With simulation you can conclude with a good degree of confidence (say, 90% confidence) that the response time will be between 0.9 and 1.1 seconds. No, you usually won't be able to predict with exactitude that the average response time will be 1.0345 seconds or that a communications link will be utilized at 67.3876% of capacity. In fact, in some instances results within 50% of reality may well be acceptable. For example, if you are only interested in finding out whether you can satisfy a response time of under 3 seconds, and the simulation predicts roughly one second as above, you can tolerate a 50% deviation from reality: even if the simulation result is off by 50%, you will know that you can indeed satisfy that 3-second response time. Not requiring high precision also eases the need for high accuracy in the input.
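As an illustration of how such a confidence statement might be produced (the exact mechanics vary by package), one common approach is to run several independent replications of the model and build an interval around the average of the replication results. The sketch below uses a simple normal approximation and made-up response-time figures purely for illustration; with only a handful of replications, a t-distribution would strictly be more appropriate.

    import math

    def confidence_interval(samples, z=1.645):
        """Approximate 90% confidence interval (z = 1.645) from independent replication averages."""
        n = len(samples)
        mean = sum(samples) / n
        variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
        half_width = z * math.sqrt(variance / n)
        return mean - half_width, mean + half_width

    # Hypothetical average response times (seconds) from ten independent replications:
    replications = [0.97, 1.05, 1.02, 0.95, 1.08, 1.01, 0.99, 1.04, 0.96, 1.03]
    print(confidence_interval(replications))   # roughly (0.99, 1.03) for this made-up data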

Please note that we are not advocating irresponsible modeling, but rather an understanding that the degree of effort and expectation should match the problem at hand. If a 5% variance in results represents an expenditure on the order of millions of dollars, you will be better off applying all your ingenuity and effort to making your model highly accurate. The latter is not an option you have without simulation tools.

Also, remember that simulation ought to be an on-going exercise. Once your system is initially deployed or modified, you owe it to yourself to obtain information from it and to update your initial model to make it more accurate. In time, you can calibrate the model against reality, becoming much more confident in the predictive value of your simulation results. That's the fun part.

Criticisms to the effect that existing simulation tools fail to give you specific system design recommendations miss the mark, in that the system you simulate is often, for pragmatic reasons, an abstraction of the real system. Consider the scenario of a global system that will connect tens of thousands of workstations with hundreds of routers. No one in his or her right mind would set out to individually depict each and every discrete system component in a model of such a system. The system modeler will instead attempt to cluster common topologies, aggregating the generated traffic on a regional basis, and, depending on the questions asked, he or she will focus on the specific system elements to be studied. No doubt emerging software technologies such as neural systems or genetic algorithms will, in the future, be able to automatically map abstracted models to specific design recommendations.
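To illustrate the kind of abstraction described above, the fragment below (a hypothetical inventory invented for this example, not a feature of any particular tool) collapses tens of thousands of individual workstations into one aggregate traffic source per region, which is what the model would actually carry.

    # Hypothetical regional inventory: (region, workstation count, average traffic per workstation in kbit/s)
    regions = [
        ("Americas", 12_000, 1.2),
        ("Europe",    8_000, 0.9),
        ("Asia",      6_000, 1.5),
    ]

    # Instead of 26,000 individual nodes, the model carries one aggregate source per region.
    aggregate_sources = {name: count * per_node_kbps for name, count, per_node_kbps in regions}

    for region, load in aggregate_sources.items():
        print(f"{region}: {load:,.0f} kbit/s aggregate offered load")

Whether three regional sources or thirty are appropriate depends entirely on the questions being asked of the model.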

For the time being, we should not lose sight of the marvelous fact that simulation is now an activity doable from a typical PC configuration. Just a few years ago, simulations could only be performed on expensive mainframes. Yes, the field of simulation is still in its infancy and much research needs to happen to make it more intuitive and "human-friendly". Still, take heart: the original VisiCalc was useful despite lacking many of the fancy features now considered "essential" spreadsheet functions. The current state of simulation is not unlike those original "killer" applications you grew to love.

Is Simulation really that Expensive?

There are few product categories where the cost spread is as large as it is with simulation tools. Simulation is an emerging computer application, at least in terms of its popularization, and there are several different implementations of simulation products. While I will not dwell on the specific nature of commercial implementations (analytical, discrete-event, visual interactive, procedural languages, etc.), you should be aware that you are not restricted to purchasing packages costing tens of thousands of dollars. Our own Abstraction Consulting's Prophesy retails for $599 and runs in a standard MS/Windows configuration. If that's still beyond your budget, there are public domain implementations of C++ simulation class libraries, experimental shareware products, and even research-developed subsets of simulation languages such as GPSS. Some analytically based packages such as PSI's LANModel retail for a hundred dollars or so, and other commercial packages can be had for as little as $2,000. This is not to say that you should not consider "high-end" packages if they satisfy the features, vendor support, or other requirements in your RFP. After all, if you are modeling a multi-million dollar system, a cost of $30,000 can be considered trivial. Both a Geo Metro and a Rolls-Royce will get you to your destination. Your entry into the simulation field need not be an expensive experience. It can be inexpensive and rewarding.

In Conclusion

We have focused on debunking several of the misconceptions about the field of simulation. However, we're the first to admit that simulation is not a panacea. There is simply no good substitute, and no magic bullet, for the basic, hard homework required to properly design a system. In the end, simulation tools are just that: tools intended to aid you in the complex task of building state-of-the-art systems. As with other tools, you should appreciate a simulation tool's strengths and be aware of its limitations. Just as a spreadsheet is not going to automatically give you the numbers that help you build a business case, modeling is not going to give you ready answers out of thin air. Nor is modeling always the right approach to your problem. It goes without saying that if you are deploying a short-lived system (tactical, we call it these days), it may not be necessary to focus all that much on the precision of that system's design. But if you do conclude that a little extra up-front simulation work will make things easier down the road, you will then need to understand the purpose of the modeling exercise and the constraints placed on your effort, and to balance expectations with reality. In this there is no hype; just plain, old-fashioned discipline and work.
