
Connecting iThink and STELLA to a Database

April 28th, 2011 5 comments

A question we periodically get from our customers is: Can iThink or STELLA connect to a database? Databases offer many advantages for storing, organizing, and sharing model data. Thanks to iThink and STELLA’s ability to import and export data via commonly used spreadsheet file formats, you can use comma separated value (CSV) files to create a connection to database applications.

Essentially, data can be moved between a database and iThink/STELLA by using a CSV file as a bridge. CSV files are a widely supported file standard for storing table data, and both iThink/STELLA and many database programs are able to read and write to them.

Process overview

The process of connecting to a database using CSV files as an intermediary

The process can be automated using iThink/STELLA’s ability to run models automatically from the command line (Windows only). Most database applications also have command line interfaces, allowing you to write a single macro script that moves data between your model and a database in one automated process.
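To make the idea concrete, here is a minimal sketch of such a macro script in Python. The server, database, table, and model file names below are hypothetical placeholders (only the iThink path matches the style of a real install), so adjust all of them for your own setup:

```python
import subprocess

# Hypothetical sqlcmd export: write a ModelInputs table to import.csv.
# (-s "," sets a comma separator, -W trims trailing padding.)
EXPORT_CMD = [
    "sqlcmd", "-S", "localhost", "-d", "BeerGame",
    "-Q", "SELECT * FROM ModelInputs",
    "-o", "import.csv", "-s", ",", "-W",
]

# Run the model once from the command line; iThink imports import.csv on open.
# (Model file name is a placeholder.)
RUN_CMD = [
    r"c:\program files\isee systems\iThink 9.1.2\iThink.exe",
    "-rn", "1", "beer_game.itm",
]

def run_bridge():
    """Export the data from SQL Server, then run the model on it."""
    subprocess.run(EXPORT_CMD, check=True)
    subprocess.run(RUN_CMD, check=True)
```

Calling run_bridge() performs the whole export-then-run cycle in one step; the same two commands could just as well live in a Windows batch file.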

In this post I will use a simple example to demonstrate how to import data from a Microsoft SQL Server database into an iThink model on Windows. The model and all files associated with the import process are available by clicking here. If you don’t have access to Microsoft SQL Server, you can download a free developer’s version called SQL Server Express from the Microsoft web site.

The Model

The model used in this example is a variation of the Beer Game model. The structure shown below represents the ordering process for a simple retailer supply chain.

Retail Supply Chain Model

The model has been set up to import the initial values for the On Order with Wholesaler and Unfilled Orders stocks, the target inventory, and actual customer orders (a graphical function with 21 weeks of data). The source of the imported data is the file named import.csv in the example files.

To set up this example, I manually created the CSV file using the initial model parameters. (Later in this post, you’ll see that this file can be created automatically by the database.) The model has been initialized in a steady state, with actual customer orders at a constant level of 4 cases per week over the 21-week period.
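A file like import.csv can also be generated with a short script instead of by hand. Here is a sketch in Python; the layout is a plain name/value table, the exact layout iThink expects depends on your import settings, and the stock and inventory numbers are illustrative placeholders (only the 4 cases per week figure comes from the model):

```python
import csv

# Placeholder initial values -- not the model's real numbers.
rows = [
    ["On Order with Wholesaler", 8],   # placeholder
    ["Unfilled Orders", 4],            # placeholder
    ["target inventory", 12],          # placeholder
]
# 21 weeks of actual customer orders, constant at 4 cases per week
rows += [["actual customer orders", week, 4] for week in range(1, 22)]

with open("import.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Because CSV is plain text, the same file can be produced by a database export, a spreadsheet, or a script like this one interchangeably.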


What is Delta Time (DT)?

August 3rd, 2010 15 comments

After reading Karim Chichakly’s recent post on Integration Methods and DT, I was reminded that delta time (DT) has always been a tricky modeling concept for me to grasp.   Beginning modelers don’t usually need to think about changing DT since STELLA and iThink set it to a useful default value of 0.25.   But once you progress with your modeling skills, you might consider the advantages and risks of playing with DT.

The DT setting is found in the Run Specs menu.

By definition, system dynamics models run over time, and DT controls how frequently calculations are applied within each unit of time.  Think of it this way: if your model were a movie, DT would be the time interval between still frames in the strip of movie film.  For a simulation over a period of 12 hours, a DT of 1/4 (0.25) would give you a single frame every 15 minutes.  Lowering the DT to 1/60 would give a frame every minute.  The smaller the DT, the higher the calculation frequency (1/DT).
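The frame arithmetic is easy to check for yourself. A quick sketch in Python:

```python
# Number of calculation "frames" for a 12-hour simulation
# at two different DT values.
hours = 12
for dt in (0.25, 1 / 60):
    per_hour = round(1 / dt)       # calculation frequency is 1/DT
    total = round(hours / dt)      # frames over the whole simulation
    print(f"DT = {dt:.4f}: {per_hour} calculations per hour, {total} in total")
```

Dropping DT from 0.25 to 1/60 takes the same 12-hour run from 48 calculations to 720, which is exactly why a very small DT slows a large model down.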

Beware of the Extremes

A common tendency for modelers is to set the calculation frequency too high.  Without really thinking too hard about it, more data seems to imply a higher quality model – just like more frames in movie film make for smoother motion.  If your model calculates more data for every time unit, its behavior will begin to resemble the behavior of a smoothly continuous system.  But a higher frequency of calculations can greatly slow down your model’s run performance and more data does not directly translate to a better simulation.

Beware of Discrete Event Models

Another situation where DT can often lead to unexpected behavior is with models that depend on discrete events.   My eyes were opened to this when I attended one of isee’s workshops taught by Corey Peck and Steve Peterson of Lexidyne LLC.

One of the workshop exercises involved a simple model where the DT is set to the default 0.25, the inflow is set to a constant 10, and the outflow is set to flush out the stock’s contents as soon as it reaches 50.   This is how the model’s structure and equations looked:

Discrete Model

Stock = 0

inflow = 10

outflow = IF Stock >= 50 THEN 50 ELSE 0

I would have expected the value of the stock to plunge to zero after it reached or exceeded 50, but this graph shows the resulting odd saw-tooth pattern.

Sawtooth Model Behavior

The model ends up behaving like a skipping scratched record, in a perpetual state of never progressing far enough to reach the goal of zero.  (Click here to download the model.)

What is happening in the model?  In the first DT after the stock’s value reaches exactly 50, the outflow sets itself to 50 in order to remove the contents from the stock.  So far so good, but now the DT gotcha occurs.  Since the outflow works over time, its value is always a rate (a quantity per unit of time).  To get the quantity of material that actually flowed, you must multiply the outflow value (or rate) by how long the material was flowing.  When DT is set to 0.25, the material flows for 0.25 time units each DT.  Hence, the quantity of material removed from the stock is 50 * 0.25 = 12.50.

Suddenly we are in a situation where only 12.50 has been removed from the stock but the stock’s value is now less than 50.  Since the stock is no longer greater than or equal to 50, the outflow sets itself back to 0 and never actually flushes out the full contents of the stock. 
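You can reproduce this behavior outside of STELLA with a few lines of Euler integration. A minimal Python sketch, using the same equations and the default DT of 0.25:

```python
DT = 0.25
stock = 0.0
history = []

for step in range(200):                 # 50 time units at DT = 0.25
    inflow = 10
    outflow = 50 if stock >= 50 else 0  # same IF/THEN/ELSE as the model
    stock += DT * (inflow - outflow)    # Euler update: rate * DT
    history.append(stock)

# Once the stock first hits 50, it oscillates between 40 and 50
# instead of ever emptying -- the saw-tooth pattern from the graph.
print(max(history), min(history[20:]))  # 50.0 40.0
```

Each time the threshold is hit, only 50 * 0.25 = 12.5 is removed while 10 * 0.25 = 2.5 flows in, so the stock drops to 40, climbs back to 50, and repeats forever.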

So what do we do?  One solution to this problem would be to use the PULSE built-in to remove the full value from the stock.   Here’s what the equation for the outflow would look like:

outflow = IF Stock >= 50 THEN PULSE(Stock) ELSE 0

(Note: This option will only work using Euler’s integration method.)

Further Reading

STELLA and iThink have great help documentation on DT.  The general introduction provides a good explanation of how DT works. The more advanced DT Situations Requiring Special Care section focuses more on artifactual delays and the discrete model issues mentioned in this post.  Delta time and resulting model behaviors are reminders that system dynamics models run over time, but they achieve this by applying numerous discrete calculations in order to simulate the smooth behavior of actual systems.

Categories: Modeling Tips

Video Demonstrates Modeling with Modules

April 1st, 2009 No comments

One of our recent webinars, What’s New in STELLA and iThink Version 9.1, highlights some of the new features added in last summer’s v9.1 release.  Karim Chichakly, Director of Product Development, guides you through the model building steps to create a supply and demand model that investigates the current housing crisis.

Among other topics, Karim covers how to organize your model with modules, draw causal loop diagrams, and import data from multiple spreadsheets.  Preview the 40 minute presentation with this 1.5 minute video clip.

>> Download the sample model files used in the presentation.

>> View complete webinar presentation.


Publishing Your Models to the Web with isee NetSim

March 26th, 2009 2 comments

You might have missed one of our recent free live webinars, Publishing Your Models to the Web with isee NetSim, which featured a demonstration of our isee NetSim product.  We periodically run these webinars to give customers the opportunity to see live software demonstrations and ask questions in real time.  Video archives of these webinars are posted on our website.

In this session, Jeremy Merritt presented a live demonstration of isee NetSim, which lets you publish your STELLA or iThink models to the web so that anyone with an internet connection and a web browser can run them.  If you’re curious about how this simple process works, you can view the archived webinar video that shows how to prepare, export and publish a model to the web.

Get a taste of the higher resolution 40 minute presentation by watching this one minute preview.

>> View complete webinar presentation.


Categories: Training

Running Models from the Command Line

March 20th, 2009 4 comments

Version 9.1.2 introduces the oft-requested ability to run models from the command line outside of iThink or STELLA (Windows only).  This feature opens up a number of possibilities for users who need to automate their modeling tasks.  For example, you could use the command line to automatically run a model multiple times, start a model from a script, or create a shortcut that opens and runs a model when you double-click on the icon.

I have set up a simple example that illustrates how you could run an iThink model multiple times to skirt the 32,767 time step limit that advanced users sometimes run up against.  The model itself does nothing extraordinary; it simply increments the value of a Stock by one at each time step.  What makes it worth noticing is that it runs 32,000 iterations three times to mimic a 96,000 step run.

>> Download the Sample Files

This example will open and run the model file named multiple_run.itm, import data at the beginning of each run, and export data at the end of each run.  You can double-click on the “Start batch runs” shortcut to kick things off, or you can use the batch file that is provided.  Note that you may have to edit these start files to make them work on your computer.

Command Line Syntax

"c:\program files\isee systems\iThink 9.1.2\iThink.exe" -rn 3 multiple_run.itm

Shortcut Properties

The command line above was entered into the “Target” field of the “Start batch runs” Properties dialog.  Note the “Start in” field is intentionally left blank so that the shortcut will run the model from the current directory.  If you move the shortcut file to a different directory,  you’ll need to enter that directory into this field.

The identical command line syntax is used in the supplied batch file named “go.bat” and can be edited using Notepad.

Sample Model and Spreadsheet

The sample model uses a table to report the value of the Stock at the end of each run so that it can be exported to the “multiple_run_data.xls” Excel file. In Excel, I linked the exported value of the Stock to an “Import” worksheet.   This way, one run hands off the final data to start the subsequent run like runners in a relay race passing a baton.  Note the initial Stock value will need to be reset in Excel before starting a new batch of runs.
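The baton-passing itself is easy to picture. Here is a toy Python sketch of the three chained runs (in the real example the hand-off travels through the exported table and the Excel “Import” worksheet, of course):

```python
final = 0   # stands in for the value stored in the Excel "Import" sheet

for run in range(3):
    stock = final             # import: start from the previous run's export
    for step in range(32000):
        stock += 1            # the model adds 1 to the Stock each time step
    final = stock             # export: hand the result to the next run

print(final)  # 96000 -- three 32,000-step runs mimic one 96,000-step run
```

Each run stays safely under the 32,767 step limit, yet the chain as a whole covers 96,000 steps.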

Running the sample command line puts iThink into a macro mode.  It opens just as if you double-clicked its icon and manually started the runs yourself.  Sit back and let the model open and run three times on its own.  Try to leave the process alone while it executes; I found that if the runs were interrupted, the Excel file could sometimes lose its formatting.

For an experiment, you could add “-nq” to the command so that the model stays open after running.  After adding the new parameter, your command would look like this:

"c:\program files\isee systems\iThink 9.1.2\iThink.exe" -rn 3 -nq multiple_run.itm


There are many more command options available.  View the full list here and experiment with other parameters.

UPDATE:

Another advantage to running from the command line is that you can start and run a model from inside Excel. Not only can you run the model this way, but you can take advantage of all the parameters.

Create a shortcut as described in the post and save it. In Excel, pick a cell and select Insert hyperlink. Browse to and select the shortcut and then click OK. It’s that easy!!