
Connecting iThink and STELLA to a Database

A question we periodically get from our customers is: can iThink or STELLA connect to a database? Moving information to and from a database offers many advantages for storing, organizing, and sharing model data. Because iThink and STELLA can import and export data via commonly used spreadsheet file formats, comma-separated value (CSV) files can serve as a bridge between a model and a database application.

Essentially, data can be moved between a database and iThink/STELLA by using a CSV file as a bridge. CSV is a widely supported standard for storing tabular data, and both iThink/STELLA and most database programs can read and write CSV files.

Process overview

The process of connecting to a database using CSV files as an intermediary

The process can be automated using iThink/STELLA's ability to run models from the command line (Windows only). Most database applications also have command-line interfaces, so you can create a single script that moves data between your model and a database in one step.

In this post I will use a simple example to demonstrate how to import data from a Microsoft SQL Server database into an iThink model on Windows. The model and all files associated with the import process are available by clicking here. If you don’t have access to Microsoft SQL Server, you can download a free developer’s version called SQL Server Express from the Microsoft web site.

The Model

The model used in this example is a variation of the Beer Game model. The structure shown below represents the ordering process for a simple retailer supply chain.

Retail Supply Chain Model

The model has been set up to import the initial values for On Order with Wholesaler and Unfilled Orders stocks, target inventory and actual customer orders (a graphical function with 21 weeks of data). The source of the imported data is the file named import.csv in the example files.

To set up this example, I manually created the CSV file using the initial model parameters. (Later in this post, you’ll see that this file will be automatically created by the database.) The model has been initialized in a steady state with actual customer orders at a constant level of 4 cases per week over the 21 week period.
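The hand-made file follows a simple layout: the first row holds the model variable names exactly as they appear in the model, and the rows below hold the data. As a minimal sketch (in Python, which is not part of the original workflow), the steady-state file could be generated like this, using the initial values from the example (8 cases on order, 4 unfilled, a target inventory of 20, and orders at 4 cases per week):

```python
import csv

# Sketch of how import.csv could be built by hand.
# The header row must match the model variable names exactly.
header = ["On Order with Wholesaler", "Unfilled Orders",
          "target inventory", "actual customer orders"]

# First data row: initial stock values plus week-1 orders.
rows = [[8, 4, 20, 4]]
# Remaining 20 weeks of actual customer orders (constant 4 cases/week);
# the single-value columns are left blank after the first data row.
rows += [["", "", "", 4] for _ in range(20)]

with open("import.csv", "w", newline="") as f:
    csv.writer(f).writerows([header] + rows)
```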

Creating the SQL Server Database

Since the model is set up to import data from the import.csv file, we are ready to populate that file with data from the SQL Server database. But first, we need to create the database and the associated data table.

Using SQL Server Management Studio, create a new database named Simulation.


Dialog in SQL Server when adding new “Simulation” database

Once the database has been created, you can run the CreateTable.sql script to create the table that will store the data. To open the script, select Open -> File from the File menu and navigate to the CreateTable.sql file. The script creates a table named Import_Data with a column for each variable that is imported into the model, and then populates the table with data. It is important to note that the first row of the table contains the variable names exactly as they appear in the retailer supply chain model. Below is a sampling of the commands contained in the script:

CREATE TABLE [dbo].[Import_Data](
[RowID] [int] IDENTITY(1,1) NOT NULL,
[OnOrder] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[UnfilledOrders] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TargetInventory] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[ActualOrders] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
)

INSERT INTO [dbo].[Import_Data] ([OnOrder],[UnfilledOrders],[TargetInventory],[ActualOrders])
VALUES ('On Order with Wholesaler','Unfilled Orders','target inventory','actual customer orders')

. . .

To run the script, select Execute from the Query menu. Note that the data shows an increase in actual customer orders from 4 to 10 cases in week 2. In the model, actual customer orders is a graphical function that reflects order data over time.

Results tab after running CreateTable.sql script


Linking the Database to the Model

After the database has been prepared, the Transfer_Data.bat Windows batch file can be used to link the database to the model. Transfer_Data.bat is a script that pulls data from the database into the CSV file, opens iThink, imports the CSV data and runs the model. The batch file takes advantage of the utility Microsoft provides with SQL Server called Bulk Copy Program (BCP) and iThink’s ability to run models from the command line. You’ll need to make a few edits to the batch file shown below before you can run it.

BCP Simulation.dbo.Import_Data out import.csv -c -t, -U <Login> -P <Password> -S <Server name>

“%PROGRAMFILES%\isee systems\iThink 9.1.4\iThink.exe” -i -r -nq Supply_Chain.itm

Using a text editor such as Notepad, replace the placeholders <Login>, <Password> and <Server name> in the first line with the authentication details for the Simulation database created earlier. If your SQL Server database is on your local machine, you can delete the -U, -P and -S settings altogether (if you use Windows authentication, you may need the -T trusted-connection flag instead). This command uses the BCP utility to copy the contents of the Import_Data table to the import.csv file. Depending on the version of iThink or STELLA you are running (version 9.1.2 or later is required) and where it is installed, you may also need to edit the second line of the batch file.

After the Transfer_Data.bat file has been edited and saved, double-click on the file to run it. Once the data has been retrieved from the database, iThink will import the data via the CSV file, and run the simulation. The results will appear in the graph in the model as shown below:

Retail inventory graph
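As noted above, iThink identifies where to import each column of data by the variable names in the CSV header row, so a misspelled or mismatched header will prevent the import from lining up with the model. A small, hypothetical pre-flight check (again a Python sketch, not part of the original batch file) could catch this before a run:

```python
import csv
import io

def check_header(csv_text, expected):
    """Return the expected column names missing from the CSV header row."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return [name for name in expected if name not in header]

# Variable names exactly as they appear in the supply chain model.
expected = ["On Order with Wholesaler", "Unfilled Orders",
            "target inventory", "actual customer orders"]

# Example: a header with a misspelled column is caught before the run.
sample = ("On Order with Wholesaler,Unfilled Orders,"
          "target inventry,actual customer orders\n8,4,20,4\n")
missing = check_header(sample, expected)
```
A step like this could be added to the batch file ahead of the iThink command, so a bad export stops the process instead of producing a silently wrong run.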

Updating the Data

In a real-world situation, the database that stores information we need to run a model is usually connected to another application that updates or changes the data. In the absence of a real-world application, I created a script that allows you to update the Simulation database and experiment with your own set of data.

In SQL Server Management Studio, open the script called UpdateTable.sql. Edit the values for each variable as desired.

USE [Simulation]
UPDATE Import_Data SET OnOrder=8, UnfilledOrders=4, TargetInventory=20, ActualOrders=4 WHERE RowID=2
UPDATE Import_Data SET ActualOrders=4 WHERE RowID=3
UPDATE Import_Data SET ActualOrders=4 WHERE RowID=4

. . .
When you have finished editing the UpdateTable.sql script, execute the script to update the Import_Data table. Now you are ready to re-run the Transfer_Data.bat batch file and link the updated data with the Supply_Chain.itm model.


This example should give you a good idea of how the process of importing data from a database into an iThink or STELLA model can be set up and run automatically. Other database applications such as Oracle, MySQL, and SQLite can be used in a similar manner. I have not yet set up examples, but these databases also have command-line interfaces for automating the transfer of data to CSV files that iThink and STELLA can read.
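To illustrate, here is a hedged sketch of the same "dump a table to CSV" step using SQLite in place of SQL Server's BCP utility. It uses Python's built-in sqlite3 and csv modules with a hypothetical in-memory table mirroring the Import_Data layout from this example; a real setup would connect to a database file and use the production table names:

```python
import csv
import sqlite3

# Hypothetical in-memory stand-in for the Simulation database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Import_Data (OnOrder, UnfilledOrders, "
            "TargetInventory, ActualOrders)")
# First row holds the model variable names, as in the SQL Server example.
con.execute("INSERT INTO Import_Data VALUES "
            "('On Order with Wholesaler','Unfilled Orders',"
            "'target inventory','actual customer orders')")
con.execute("INSERT INTO Import_Data VALUES (8, 4, 20, 4)")

# Equivalent of the BCP "out" step: copy the table into import.csv.
with open("import.csv", "w", newline="") as f:
    csv.writer(f).writerows(con.execute("SELECT * FROM Import_Data"))
con.close()
```
The SQLite command-line shell can do the same without any Python, and the resulting CSV plugs into the same batch-file workflow described above.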

Stay tuned for part two of this post where I’ll provide an example of exporting model results to a SQL Server database.

  • Pedro D. Almaguer Prado

    Great example; congratulations on the ability to explain these concepts so clearly.

    I suggest developing another example using the Firebird database; it is open source and very powerful.


  • Pedro D. Almaguer Prado


    Reflecting a bit on the design of the table in the database: it does not seem very practical to include the header field names inside a row of the "Import_Data" table just so the export matches a hand-made spreadsheet. Keep in mind that numbers in a database are usually stored in numeric formats, not as text.

    Nat, I suggest making another example that uses a stored procedure: the data would be taken from other tables where the numeric fields are stored as numbers, then transformed into the "Import_Data" table, which is then exported in the spreadsheet format you need.

    The "Import_Data" table is designed like a spreadsheet stored in MS SQL, but in the real world tables are not designed this way. It is not practical to store data like that.

    Best regards again.

    • Pedro,

      You are correct; the design of “Import_Data” does not follow the typical example of a standard database table. The names of the model entities are mixed in with the numeric data, and the data types of all the key columns are set to be text. This was done to simplify the example. Since it was set up that way, all the imported sales information could be pulled from a single table when the Microsoft BCP utility wrote the CSV file. The imported CSV needed to supply the entity names at the top of the columns so that iThink could properly identify where to import the data to.

      Other examples of how this may look in a larger “real world” database could take advantage of more sophisticated SQL scripts to cull the required data for your model from other tables into an import-friendly single text table, similar to the one in the example. These scripts could either be started from the Microsoft BCP utility, or be part of some internal database process.

  • Charles

    Hi, my name is Charles. I'm trying to test the functionality connecting SQL Server and STELLA 9.1.4, but I'm not getting good results. Thanks a lot for your help. My e-mail: jhetfiel2010@gmail.com

    • Charles,

      Tried emailing you at the address you supplied, but the message keeps getting bounced back. Feel free to outline your issue on this forum or email support@iseesystems.com and we’ll get you going. Thanks.