Anura contains 100+ (and growing) input tables to organize modeling data.

There are six minimum required tables for every model.
These include tables to identify the demand that must be met, tables to lay out the structure of the supply chain, and tables to link them all together with policies.
Most supply chain design models use at least one table from each of the first five model categories (Modeling Elements, Sourcing, Inventory, Transportation, Demand). A Neo model converted from Supply Chain Guru© will generally contain the following tables:
By entering information in all of these tables you will have successfully added all demand, created all model elements, created sourcing and transportation policies for all arcs, and added an inventory policy to ensure inventory can be stored throughout the supply chain.
While not required, many Neo models will also contain data in the following tables:
User-defined variables (UDVs) are a transformative feature in Cosmic Frog’s transportation optimization algorithm (Hopper engine) that allow users to create and track custom metrics specific to their transportation needs. Once established, these variables can be seamlessly integrated into user-defined constraints and/or user-defined costs. Several example use cases are:
Before diving into Hopper’s user-defined variables, costs, and constraints, it is recommended that users be familiar with the basics of building and running a Hopper model; see, for example, the “Getting Started with Hopper” help center article.
In this documentation, we will first describe the example model used to illustrate the UDV concepts in this help article. Next, we will cover the input and output tables available when working with user-defined variables, costs, and constraints. Finally, we will walk through the inputs and outputs of 4 UDV examples: the first two examples showcase the application of constraints to user-defined variables, while the last two examples cover how to model user-defined costs.
The characteristics of the model used to show the concepts of user-defined variables, costs, and constraints throughout this help article are as follows:
The optimized routes from the Baseline_UDV scenario are shown on this map; there are 10 routes with 10 stops each. The customers are color-coded based on the country they are in:

Filtering for the route that has stops in the most countries, we find the following route, which has stops in 4 countries: Poland (1 dark blue stop), Czech Republic (7 yellow stops), Slovakia (1 orange stop), and Germany (1 red stop):

In the Input Tables part of Cosmic Frog’s Data module, there are 3 input tables in the Constraints section that can be used to configure user-defined variables, costs, and constraints:

We will take a closer look at each of these input tables now, and will also see more screenshots of these in the later sections that walk through several examples.
In this table, we specify the term(s) of each variable that we wish to track or apply user-defined costs and/or constraints to. This first screenshot shows the fields that are used to define the variable, its term(s), and the return condition:

The next 2 screenshots show the other fields available on the Transportation User-Defined Variables input table, which are used to set the Filter Condition for the Scope. Note that several of these fields have accompanying Group Behavior fields, which are not shown in the screenshot. If a group name is used in the Asset Name, Site Name, Shipment ID, or Product Name field, the Group Behavior field specifies how the group should be interpreted: if the Group Behavior field is set to Aggregate (the default if not specified) the activity of the variable is summed over the members of the group, i.e. the variable is applied to the members of the group together. If the Group Behavior field is set to Enumerate, then an instance of the variable will be created for each member of the group individually.


Consider a route which picks up 4 shipments, Shipment #1, #2, #3, and #4, and delivers them to 3 stops on a route as shown in the following diagram. In all 3 examples that follow, the filter condition is set to Shipment ID = Shipment #3 and Site Type = Delivery. This first example shows what will be returned for the variable when Scope = Shipment and Type = Quantity:

The whole route is filtered for Delivery of Shipment #3 and we see that it is delivered to the Delivery 2 stop. Since Scope = Shipment and Type = Quantity, the resulting variable value is the quantity of this shipment, which is what the yellow outline indicates.
In the next example, we look at the same route and same filtering condition (Shipment #3, Delivery), but now Scope has been changed to Stop (Type is still Quantity):

Again, we filter the route for Delivery of Shipment #3 and we see that it is delivered to the Delivery 2 stop. Since Scope = Stop, now the variable value is the total quantity delivered to the stop (outlined in yellow again): quantity Shipment #2 + quantity Shipment #3.
The final visual example is for when the Scope is now changed to Route, while keeping all the other settings the same:

The route is again filtered for Delivery of Shipment #3. Since the delivery of this shipment is on this route, we now calculate the variable value as the total quantity of the route: quantity Shipment #1 + quantity Shipment #2 + quantity Shipment #3 + quantity Shipment #4, again outlined in yellow.
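To make the effect of the Scope setting concrete, here is a small illustrative Python sketch (hypothetical quantities and a simplified route with only the delivery stops; this is not Cosmic Frog code) that reproduces the three variable values described above for the filter Shipment ID = Shipment #3 and Site Type = Delivery:
# Illustrative only: hypothetical delivery stops and quantities, not a Cosmic Frog API.
stops = [
    {"stop": "Delivery 1", "shipments": {"Shipment #1": 120}},
    {"stop": "Delivery 2", "shipments": {"Shipment #2": 80, "Shipment #3": 50}},
    {"stop": "Delivery 3", "shipments": {"Shipment #4": 200}},
]

def variable_value(scope, shipment_id="Shipment #3"):
    # Filter condition: Shipment ID = Shipment #3, Site Type = Delivery.
    matching = [s for s in stops if shipment_id in s["shipments"]]
    if not matching:
        return 0
    if scope == "Shipment":  # quantity of the filtered shipment only
        return sum(s["shipments"][shipment_id] for s in matching)
    if scope == "Stop":      # total quantity at the stop(s) where it is delivered
        return sum(sum(s["shipments"].values()) for s in matching)
    if scope == "Route":     # total quantity of all shipments on the route
        return sum(sum(s["shipments"].values()) for s in stops)

for scope in ("Shipment", "Stop", "Route"):
    print(scope, variable_value(scope))  # 50, 130, and 450 with these hypothetical quantities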
Next, we will also walk through a numbers example for different combinations of Scope and Type to see how these affect the calculation of the value of a variable’s term. Consider a route with 5 stops as follows:

We will calculate the value of the following 15 variables where the Scope, Type, and Product Name to filter for are set to different values. Note that all variables have just 1 term with coefficient 1, so the variable value = scaled term value.

Constraints on user-defined variables can be set up in the User-Defined Constraints input table:

Costs can be applied to a user-defined variable by using the User-Defined Costs input table:

There are 3 output tables related to user-defined costs and constraints:

We will cover each of these now and will see more screenshots of them in the sections that follow where we will walk through several example use cases.
This table lists the values of the terms of each user-defined variable. This next screenshot shows the values of the “ProductFlag” term of the “NumberOfProductsInRoute” variable for the routes of the Baseline_UDV scenario. How this variable and its term were set up can be seen in the screenshot of the transportation user-defined variables table above (Scope = Route, Type = Product Count, Coefficient = 1).

When setting up the Number Of Products In Route variable as above and not applying costs or constraints to it, it functions as a tracker: users can easily get at this data rather than having to manipulate the transportation optimization output tables to calculate the number of products per route.
If we run a scenario “MaxOneProductPerRoute” that includes the maximum of 1 product per route constraint shown in the User-Defined Constraints input table screenshot further above, the outputs in this table change as follows:

This table is a roll-up, to the variable level, of the Optimization User-Defined Variable Term Summary output table discussed in the previous section. All the scaled terms of each variable are added up to arrive at the variable’s value:
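In other words, a variable’s value is the sum over its terms of each term’s value multiplied by its coefficient. A minimal sketch with a hypothetical two-term variable:
# Illustrative roll-up: variable value = sum of (coefficient x term value).
terms = [
    {"term": "AmbientQty", "coefficient": 1.0, "value": 300},
    {"term": "RefrigeratedQty", "coefficient": 0.5, "value": 100},
]
variable_value = sum(t["coefficient"] * t["value"] for t in terms)
print(variable_value)  # 350.0 for this hypothetical variable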

If costs have been applied to a user-defined variable, the results of that can be seen in this output table:

In this first example, we will see how we can track and limit the number of countries per route. For this purpose, a variable with 5 terms is set up in the Transportation User-Defined Variables table; each term indicates whether a route has any stops in 1 of the 5 countries used in the model (1 variable term per country). Then we will apply constraints to this variable that limit the number of countries on each route to either 1 or 2. Let’s start by looking at the variable and its 5 terms in the Transportation User-Defined Variables table:

Next, we can add constraints that apply to this variable to change the behavior in the model and limit the number of countries a route is allowed to make stops in. We use the User-Defined Constraints table for this:

After running the Baseline_UDV scenario which does not include these constraints, we can have a look at the Optimization User-Defined Variable Summary output table:

When the number of countries a route is allowed to make stops in is left unconstrained, we see that 3 routes make stops in just 1 country, 5 routes in 2 countries, and 1 route (route 9) makes stops in 4 countries.
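As a rough illustration of what this 5-term variable computes, each term behaves like a 0/1 flag per country and the variable value is the number of distinct countries a route visits. The stop data below is hypothetical, and a placeholder is used for the fifth country since it is not named above:
# Hypothetical route/stop data for illustration only.
route_countries = {
    "Route 1": ["Germany", "Germany", "Germany"],
    "Route 9": ["Poland", "Czech Republic", "Czech Republic", "Slovakia", "Germany"],
}
countries = ["Poland", "Czech Republic", "Slovakia", "Germany", "Country5"]  # 1 term per country

for route, stop_countries in route_countries.items():
    # Each term is 1 if the route has any stop in that country, else 0.
    term_values = {c: int(c in stop_countries) for c in countries}
    print(route, sum(term_values.values()))  # Route 1 -> 1 country, Route 9 -> 4 countries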
Now we want to see the impact of applying the Max One Country and Max Two Countries constraints through 2 scenarios and again we check the Optimization User-Defined Variable Summary output table after running these scenarios:

Maps are also helpful to visualize these outputs. As we saw in the introduction of the example model used throughout this documentation, these are the Baseline_UDV routes visualized on a map:

These routes change as follows in the MaxOneCountryPerRoute scenario:

Since some of these routes overlap on the map, let us filter a few out and color-code them based on the country to more easily see that indeed the routes each only make stops in 1 country:

In this example we will see how user-defined variables and constraints can be used to model truck compartments and their capacities. First, we set up 3 variables that track the amount of ambient, refrigerated, and frozen product on a route:

Without setting up any constraints that apply to these variables, they just track how much of each product is on a route, which can be within or over the actual compartment capacity. So, to set capacity limits, we can use the User-Defined Constraints table to set up constraints on these 3 variables that represent the capacity of the ambient, refrigerated, and frozen compartments of a truck:


After running the Baseline_UDV scenario where these constraints are not applied and another scenario, Compartment Capacity, where they are applied, we can take a look at the Optimization User-Defined Variable Summary output table to see the effect of the constraints (just showing routes 1 and 2 in the below screenshot):

Typically, when adding constraints, we expect routes to change – more routes may be needed to adhere to the constraints, and they may become less efficient. Overall, we would expect costs, distance, and time to increase. This is exactly what we see when comparing these outputs in the Transportation Summary output table for these 2 scenarios:

We have seen 2 examples of applying constraints to user-defined variables in the previous sections. Now, we will walk through 2 examples of applying costs to user-defined variables. The first example shows how to apply a variable cost based on how long a shipment sits on a route: we will specify a cost of $1 per hour the shipment spends on the route. First, we set up a variable that tracks how long a shipment spends on a route in the Transportation User-Defined Variables input table:

Next, the User-Defined Costs table is used to specify the cost of $1 per hour:

After running the CostPerShipmentTimeInTruck scenario which includes this user-defined cost, we can look at both the Transportation Shipment Summary and the Optimization User-Defined Cost Summary output tables to see this cost of $1 per hour has been applied:

Next, we open the Optimization User-Defined Cost Summary output table and filter for the same scenario and route (#4):

In our final example of this documentation, we will use the same ShipmentTimeInTruck variable from the previous example to set up a different type of cost. We will use it to find any shipments that are on a route for more than 10 hours and apply a penalty cost of $100 to each. This involves using a step cost, for which we will also need to utilize the Step Costs table; we will start by looking at this table:

Next, we configure the penalty cost in the User-Defined Costs table:

After running a scenario in which we include the penalty cost, we can again look at the Transportation Shipment Summary and Optimization User-Defined Cost Summary output tables to see this cost in action:
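To summarize how the two user-defined costs from these last examples behave, here is a small illustrative calculation with hypothetical shipment times:
# Hypothetical hours each shipment spends on its route, for illustration only.
shipment_hours = {"Shipment A": 6.5, "Shipment B": 12.0}

RATE_PER_HOUR = 1.0     # variable cost of $1 per hour (previous example)
PENALTY = 100.0         # step cost applied once a shipment exceeds the threshold
THRESHOLD_HOURS = 10.0  # shipments on a route for more than 10 hours get the penalty

for shipment, hours in shipment_hours.items():
    variable_cost = RATE_PER_HOUR * hours
    step_cost = PENALTY if hours > THRESHOLD_HOURS else 0.0
    print(shipment, variable_cost, step_cost)
# Shipment A: $6.50 and no penalty; Shipment B: $12.00 plus the $100.00 penalty.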


Transportation lanes are a necessary part of any supply chain. These lanes represent how product flows throughout our supply chain. In network optimization, transportation lanes are often referred to as arcs or edges.
In general, lanes in our supply chain are generated from the transportation policies and sourcing policies provided in our data tables.

Transportation policies are stored in the TransportationPolicies table. Sourcing policies are stored in the following tables:
From the data in these tables, the software automatically generates the lanes (i.e. arcs or edges) in our network before sending it to the optimization solver. We can control how these lanes are generated as a parameter of our Neo model.
Neo models can follow 4 different lane creation policies:

If the “Transportation Policy Lanes Only” rule is selected, Cosmic Frog will only generate transportation lanes based on data in the TransportationPolicies table. If a lane between two sites is not explicitly defined here, product will not be able to directly flow between those sites. Note that any additional information specified in a Sourcing Policy table (unit cost, policy rule etc.) will still be respected for the lane so long as it exists in the Transportation Policies table.

If the “Sourcing Policy Lanes Only” rule is selected, Cosmic Frog will only generate transportation lanes based on data in the Sourcing tables. Even if an origin-destination path is defined in the TransportationPolicies table, product will not be able to flow via this lane unless there is a specific sourcing policy defining how the destination site gets product from the origin site. Note that any additional information specified in a Transportation Policies table (cost, policy rule, multiple modes etc.) will still be respected for the lane so long as it exists in a Sourcing Policy table.

If the “Intersection” rule is selected, Cosmic Frog will only generate transportation lanes if they are defined in both the transportation policy table and one of the sourcing policy tables.
For users converting models from Supply Chain Guru©, the default SCG© lane creation rule is “Intersection”.

If the “Union” rule is selected, Cosmic Frog will generate transportation lanes if they are defined in either the transportation policy table or one of the sourcing policy tables.
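Conceptually, the four rules behave like simple set operations on the origin-destination pairs defined in the two groups of tables. A minimal sketch with hypothetical lanes:
# Hypothetical origin-destination pairs for illustration only.
transportation_policy_lanes = {("MFG", "DC1"), ("DC1", "CZ"), ("DC2", "CZ")}
sourcing_policy_lanes = {("MFG", "DC1"), ("MFG", "DC2"), ("DC1", "CZ")}

generated_lanes = {
    "Transportation Policy Lanes Only": transportation_policy_lanes,
    "Sourcing Policy Lanes Only": sourcing_policy_lanes,
    "Intersection": transportation_policy_lanes & sourcing_policy_lanes,
    "Union": transportation_policy_lanes | sourcing_policy_lanes,
}
for rule, lanes in generated_lanes.items():
    print(rule, sorted(lanes))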

Watch the video to learn how to build dashboards to analyze scenarios and tell the stories behind your Cosmic Frog models:
The following instructions begin with the assumption that you have data that you wish to upload to your Cosmic Frog model.
The instructions below assume the user is adding information to the Suppliers table in the model database. This will use the same connection that was previously configured to download data from the Customers table.
Drag the “Output Data” action into the Workflow and click to select “Write to File or Database”

Select the relevant ODBC connection established earlier; in this example we called the connection “Alteryx Demo Model.”

You will be prompted to enter a valid table name to which the data will be written in the model database. In this example, enter suppliers (all in lower case to match the table name in Postgres, which is case sensitive).
Click OK

Within the Options menu, set “Output Options” to Append Existing in the drop-down list.
Within the Options menu, edit “Append Field Map” by clicking the three dots to see more options.
Select the “Custom Mapping” option and then use the drop-down lists to map each field in the Destination column to the Source column. The field names are the same, but note that the mapping is case sensitive because the database is Postgres.

Click OK
Now you can Run the Workflow to upload the data to your model. Once it has completed, check the Suppliers table in Cosmic Frog to see the data.
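For reference, because the model database is Postgres, the same append can also be scripted outside Alteryx. This is a minimal sketch assuming pandas and SQLAlchemy; the connection string and column names are placeholders that should be replaced with your own database details and the actual Suppliers table columns:
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string: use the host, database, user, and password
# from your own Cosmic Frog database connection details.
engine = create_engine("postgresql://USER:PASSWORD@HOST:5432/DATABASE")

# Placeholder column names; match them to the Suppliers table in your model.
new_suppliers = pd.DataFrame([{"suppliername": "Supplier_1", "status": "Include"}])

# Table and column names are lower case because Postgres is case sensitive here.
new_suppliers.to_sql("suppliers", engine, if_exists="append", index=False)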
Sequential Objectives allow you to set multiple tiers of objectives for the optimization solve to consider, where each tier of objectives can be relaxed by a defined percentage when solving for the next tier.
Here is a basic example of how Sequential Objectives can be used.
100 units of demand for P1 at CZ.
Available pathways for flow are as follows:
Find a solution that first minimizes total cost, and then minimizes the total amount of travel time in the solution while relaxing the Total Cost solution by no more than 20%.

When just solving with the standard objective of Profit Maximization, the cheapest path will be utilized. All flow will move via MFG > DC1 > CZ and the total cost will be $500.
We’ve built the Sequential Objectives so that we will first optimize over the Total Supply Chain Cost as Priority 1. We have also set the Tolerance to be 20 which will allow for a 20% relaxation in the solution to solve for the secondary objective – Total Weighted Flow Time.

We’ll now see that the more expensive pathway of MFG > DC2 > CZ is used as it requires less travel time. Because the initial cost was $500, we will send as many units as possible through DC2 until the total cost reaches $600 – a 20% deviation from the initial cost. This results in 5 units flowing via DC2, while 95 units remain through DC1, for a total cost of $600.
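The 95/5 split can be checked with a quick back-of-the-envelope calculation. The unit costs below ($5 via DC1 and $25 via DC2) are assumptions chosen to be consistent with the totals quoted above:
# Hypothetical unit costs, consistent with the $500 and $600 totals above.
DEMAND = 100
COST_DC1, COST_DC2 = 5, 25          # $ per unit delivered via each pathway
baseline_cost = DEMAND * COST_DC1   # 500: all flow via the cheapest path
budget = baseline_cost * 1.20       # 600: after applying the 20% tolerance

# Each unit shifted from DC1 to DC2 adds (COST_DC2 - COST_DC1) to total cost.
units_via_dc2 = int((budget - baseline_cost) // (COST_DC2 - COST_DC1))
units_via_dc1 = DEMAND - units_via_dc2
total_cost = units_via_dc1 * COST_DC1 + units_via_dc2 * COST_DC2
print(units_via_dc1, units_via_dc2, total_cost)  # 95, 5, 600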
This example model can be found in the Resource Library listed under the name of Sequential Objectives Demo.
In this section, we outline some techniques for debugging models in Cosmic Frog. In general, there’s no one “right” approach to debugging, but knowing where you can get information on what went wrong can be helpful.
An error state in the run manager is the most obvious sign that something is wrong with our model setup.

After reaching an error state, the first place to check is the Job Records section of the Run Manager. Here you will find a summary of events from the model solve, and if an error is thrown you can potentially see the cause directly from here:

Next, you can check the Job Error Log. The Job Error Log will contain more detailed messaging on errors that are thrown during a model solve. While there are a number of possible errors, the most common cause of an error state is an infeasible model. You can check if your model is infeasible by scrolling down to the bottom of the Job Error Log, or by searching for “infeasible” in the search bar.

If your model is infeasible, you can use the following toolkit to help understand why:
Sometimes even a model that finishes running can give results that we do not expect. It is a good habit to check your Output Summary Tables after each run to make sure the results look the way you expect.

In some cases, the output tables might not populate even if the model runs successfully. Even if these values do not populate, you can find the Gurobi optimization results in the job log. One useful tip is to search for “objective value” in the job log and make sure the value is in the range you expect.

If your model is running, but seems incorrect, you can use the following toolkit to help understand why:
If you are still having trouble, you can also reach out to us directly at support@optilogic.com.
With Intelligent Greenfield Analysis (the Triad engine in Cosmic Frog), you have control over several different solve settings. For ease of use with scenario modeling, these have been placed in a dedicated table called Greenfield Settings. This allows for quick scenario building that leverages the column names. We will cover the settings which can be configured on the Greenfield Settings table and show an example of how scenarios can be used to change these settings.
The following screenshot shows the Greenfield Settings table:

An explanation of each setting is as follows:
Note that the “Getting Started with Intelligent Greenfield Analysis” help article contains a visual explanation of customer clustering too.
Finally, we will look at an example where a scenario item changes a Greenfield setting:

Every account holder has access to create the Global Supply Chain Strategy demo model. Following is an overview of the features of the model (and of Cosmic Frog).
If you wish to build the model instead, please follow the instructions located here: Build Your First Cosmic Frog Model
The only constant is change. When building our supply chains, the “optimal” design doesn’t only mean lowest cost. What happens if (or perhaps when) a disruption occurs? Fragile, low-cost supply chains can end up costing more in the long run if they aren’t resilient to the dynamic nature of today’s world.
We believe that optimality includes resilience. That’s why every Cosmic Frog run includes a risk rating from our DART risk engine.
Every Cosmic Frog run outputs an Opti Risk score. The Opti Risk score is an aggregate measure of the overall supply chain risk. It includes the following sub-categories:

After running a model, you can find the Opti Risk score (as well as the scores for each of the sub-categories) in the output risk tables. The Opti Risk score can also be found in the OptimizationNetworkSummary table.


The overall Customer Risk score is an aggregation of each individual customer’s risk described in the OptimizationCustomerRiskMetrics or SimulationCustomerRiskMetrics tables. In each scenario, there is one risk score per customer per period.
Each customer risk score includes:

For each sub-category, the geographic risk score is also an aggregation of several risk factors:

Like the customer risk score, the overall facility risk score is an aggregation of risk across all facilities in your supply chain. In the FacilityRiskMetric tables, there is an individual risk score per facility per period.
The facility risk score includes:
The capacity risk has three sub-components:
The facility geographic risk has the same components as the customer geographic risk.


The supplier risk is calculated per supplier per period and includes:
Both the concentration and geographic risks include the same elements as described previously.

Network risk differs from the other risk scores in that it is not tied to a specific supply chain element. There is only one network risk score per scenario, and it includes:

The transport and import/export time risks are aggregated across individual origin/destination pairs for every product and transport mode. The individual risk scores can be found in the OptimizationFlowSummary table.

The stocking point count and supply make count risks are aggregations across every product and period. The individual risk scores can be found in the ProductRiskMetrics tables.

We can use our visualization tools to get a better sense of how risk varies across design scenarios.


Scenarios allow you to run edited versions of your model in specific, controlled ways.
All scenarios have 3 components:

For an overview of the scenario features, please watch the video:
Greenfield analysis (GF) is a method for determining the optimal location of facilities in a supply chain network. The Greenfield engine in Cosmic Frog is called Triad and this name comes from the oldest known species of frogs – Triadobatrachus. You can think of it as the starting point for the evolution of all frogs, and it serves as a great starting point for modeling projects too! We can use Triad to identify 3 key parameters:
GF is a great starting point for network design—it solves quickly and can reduce the number of candidate site locations in complicated design problems. However, a standard GF requires some assumptions to solve (e.g. single time period, single product). As a result, the output of a Triad model is best suited as initial information for building a more robust Cosmic Frog optimization (Neo) or simulation (Throg) model.
You can run GF in any Cosmic Frog model. Running a GF model only requires two input tables to be populated:

A third important table for running GF is the Greenfield Settings table in the Functional Tables section of the input tables. We call our GF approach “Intelligent Greenfield” because of the different parameters available by configuring this settings table. The Greenfield Settings table is always populated with defaults and users can change these as needed. See the Greenfield Setting Explained help article for an explanation of the fields in this table.
A greenfield analysis starts with clicking the “Run” button at the right top of the Cosmic Frog application, just like a Neo or Throg model.

After clicking on the Run button, the Run screen comes up:

Besides making changes to values in the Customers and/or Customer Demand tables, GF scenarios often make changes to 1 or multiple settings on the Greenfield Settings table. The next screenshot shows an example of this:

To improve the solve speed of a Triad model, we can use customer clustering. Customer clustering reduces the size of the supply chain by grouping customers within a given geographic radius into a single customer. We can set the clustering radius (in miles) in the Greenfield Settings table in the Customer Cluster Radius column.

Clustering is optional, and leaving this column blank is the same as turning off clustering.
While grouping customers can significantly improve the run time of the model, clustering may result in a loss of optimality. However, Greenfield is typically used as a starting point for a future Neo optimization model, so small losses in optimality at this phase are typically manageable.
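To see how radius-based clustering shrinks a model, here is a rough greedy sketch. It uses simplified planar distances and hypothetical coordinates, so it only illustrates the idea; Cosmic Frog’s actual clustering logic may differ:
import math

# Hypothetical customer coordinates (x, y in miles), for illustration only.
customers = {"C1": (0, 0), "C2": (3, 4), "C3": (40, 5), "C4": (42, 7)}
RADIUS = 10  # analogous to the Customer Cluster Radius setting

clusters = []  # each cluster keeps a representative location and its members
for name, (x, y) in customers.items():
    for cluster in clusters:
        cx, cy = cluster["center"]
        if math.hypot(x - cx, y - cy) <= RADIUS:
            cluster["members"].append(name)
            break
    else:
        clusters.append({"center": (x, y), "members": [name]})

print(len(clusters), [c["members"] for c in clusters])
# 2 clusters with these points: ['C1', 'C2'] and ['C3', 'C4']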
Once you have run a model, you can visualize your results using the Analytics tab.

In Analytics, a dashboard is a collection of visualizations. Visualizations can take on many forms, such as charts, tables or maps.
In Cosmic Frog, there are default dashboards available to help you analyze your model results.


The default dashboards highlight some common analytics and metrics. They are designed to be interacted with through a set of filters.

We can hover over visualization elements to get more information. This floating card of information is called a “Tooltip”.

We can customize existing dashboards to fit our needs. For more information see Editing Dashboards and Visualizations.

There are two methods for establishing a secure connection to the Optilogic platform:
An App key is a code that can be linked to your account and will not expire. API keys are generated with code and only last for one hour before they expire. Both keys can be useful depending on how you wish to access the platform. Without either an App Key or an API Key you will not be able to run any API endpoints.
Log in to the Optilogic website and click on your name in the top right corner, then click on “Account.”

Click on the “App Key Management” tab, then name your app key and click on the “Create Key” button.

At this point you may copy your App Key to be used for authentication purposes.
To generate an API key you will need to use Python and the following instructions.
In a Python file, copy and paste this code and replace the USERNAME and PASSWORD with your own. Make sure to remove both sets of {{}} curly brackets so that it looks like this: headers = {'X-USER-ID': 'CMorrell'}
import requests
url = 'https://api.optilogic.app/v0/r…'
headers = {
    'X-USER-ID': '{{user_id}}',
    'X-USER-PASSWORD': '{{user_password}}'
}
response = requests.request('POST', url, headers=headers)
print(response.text)

The result of this code will be an API key that can be used for authentication.
Adding model constraints often requires us to take advantage of Cosmic Frog’s Groups feature. Using the Groups table, we can define groups of model elements. This allows us to reference several individual elements using just the group name.
In the example below, we have added the Harlingen, Ashland, and Memphis production sites to a group named “ProductionSites”.
Now, if we want to write a constraint that applies to all production sites, we only need to write a single constraint referencing the group.

When we reference a group in a constraint, we can choose to either aggregate or enumerate the constraint across the group.
If we aggregate the constraint, that means we want the constraint to apply to some aggregate statistic representing the entire group. For example, if we wanted the total number of pens produced across all production sites to be less than a certain value, we would aggregate this constraint across the group.

Enumerating a constraint across a group applies the constraint to each individual member of the group. This is useful if we want to avoid writing repetitive constraints. For example, if each production site had a maximum production limit of 400 pens, we could either write a constraint for each site or write a single constraint for the group. Compare the two images below to see how enumeration can help simplify our constraint tables.
Without enumeration:

With enumeration:

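To spell out the difference, aggregation produces one constraint over the group total while enumeration expands the single group row into one constraint per member. A small sketch with hypothetical production quantities and limits:
# Hypothetical production quantities for the members of the "ProductionSites" group.
production = {"Harlingen": 380, "Ashland": 420, "Memphis": 300}

# Aggregate: one constraint on the group total (here, total production <= 1200).
TOTAL_LIMIT = 1200
aggregate_ok = sum(production.values()) <= TOTAL_LIMIT

# Enumerate: the single group row expands into one constraint per member (each site <= 400).
PER_SITE_LIMIT = 400
enumerated_ok = {site: qty <= PER_SITE_LIMIT for site, qty in production.items()}

print(aggregate_ok)   # True: 1100 <= 1200
print(enumerated_ok)  # Ashland (420) violates the per-site limit of 400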
You can customize existing dashboards to fit your needs.


Inside of a dashboard, add a new visualization:

or edit an existing visualization:

The most common elements of visualizations are values and labels.
Values represent the data you want to be presented in the visualization. Typically, values are aggregated representations of your data (e.g. sum, average, etc.).
Labels refer to the labels on the visualization axes and consequently the groups by which you want to aggregate your values.
To build a visualization, we drag fields (i.e. database columns) into these elements.

Other elements include categories which allow for additional grouping and filters which allow users to adjust inclusion and exclusion criteria while viewing the dashboard.
You can use the Analytics dropdown button to create a new dashboard.

You can use the “+” button to add your first visualization to the dashboard.

This video guides you through creating your free account and the features of the Optilogic Cosmic Frog supply chain design platform.
If you are running into issues receiving your account confirmation email, please see the troubleshooting article linked here.
These instructions assume you have already made a connection to your database as described in Connecting to Optilogic with Alteryx.
Once connected, you will see the schemas and tables in the Cosmic Frog database. In this case, we will select some columns from the Customers table. Drag and drop the required tables to the left-hand “Main” pane, click the required columns, and click OK.

At this point you can run your workflow and it will be populated with the data from the connected database.

Thank you for using the most powerful Supply Chain Design software in the galaxy (I mean, as far as we know).
To see the highlights of the software please watch the following video.
To describe the action and conditions of our scenario, there are specific syntax rules we need to follow.
In this example, we are editing the Facilities table to define a scenario without a distribution center in Detroit.

Actions describe how we want to change a table. Actions have 2 components:
Writing actions takes the form of an equation:
Further detail on action syntax can be found here.

Conditions describe what we want to change. They are Boolean (i.e. true/false) statements describing which rows to edit. Conditions have 3 components:
Conditions take the form of a comparison, such as:
Further detail on Conditions syntax can be found here.
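The semantics are easy to picture if you think of the condition as a row filter and the action as a column assignment. This pandas sketch mimics the Detroit example; the column values are illustrative and the syntax shown is not the scenario item syntax itself:
import pandas as pd

# A tiny stand-in for the Facilities table (illustrative values only).
facilities = pd.DataFrame([
    {"facilityname": "DC_Detroit", "status": "Include"},
    {"facilityname": "DC_Dallas", "status": "Include"},
])

# Condition: a Boolean statement selecting which rows to edit.
condition = facilities["facilityname"] == "DC_Detroit"

# Action: the column and value to assign on the selected rows.
facilities.loc[condition, "status"] = "Exclude"
print(facilities)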

The Multi Time Period (MTP) tables allow you to enter time-period-specific data across many input tables. These MTP tables act as overrides in the relevant periods for the records in their standard tables. If a record with the same key structure does not exist in the standard table, the MTP record won’t be recognized by the solver and will be dropped.
For example, we have a manufacturing facility with a throughput capacity of 1000 units. When this information is entered into the Facilities table, the throughput capacity will be used for each period in the model:


Let’s say that the manufacturing facility is expanding operations over the year and wants to show that the throughput capacity gradually increases over each quarter. This adjustment in throughput capacity can be defined in the Facilities Multi Time Period table:


We can see that the throughput capacity increases over time and, given the constant outbound quantity, the throughput utilization goes down as the model progresses.
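The override behavior itself is simple: in any period where an MTP record exists, its value is used; otherwise the standard table value applies. A minimal sketch with hypothetical capacities:
# Standard table value applies in every period unless an MTP record overrides it.
base_capacity = 1000  # hypothetical value from the Facilities table
mtp_overrides = {"Q2": 1250, "Q3": 1500, "Q4": 1750}  # hypothetical Facilities Multi Time Period records

periods = ["Q1", "Q2", "Q3", "Q4"]
capacity_by_period = {p: mtp_overrides.get(p, base_capacity) for p in periods}
print(capacity_by_period)  # {'Q1': 1000, 'Q2': 1250, 'Q3': 1500, 'Q4': 1750}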
In all of the policy multi-time period tables, there are two fields which appear to be similar – Status and Policy Status.
For example, we have 2 manufacturing locations that can produce the same product at very different costs. We’ll see that the MFG location is used as the sole production option as it is far cheaper.


Now let’s say that we have to shut production down at the MFG location for maintenance in Q3; we can do this through the Production Policies Multi Time Period table:


This can be a quick way to adjust policies over different times and is an alternative to using constraints. Using Policy Status in place of constraints avoids creating extra solver variables and reduces the number of constraints placed on the solution. This can help contribute to better model performance.
Similar to how Status and Policy Status work, you have the ability to change the operating status of a Facility and a Work Center over time. This would allow you to Open, Close, or Consider the use of a Facility or Work Center in any given period.
Using the same example as above, we can model the maintenance at MFG during Q3 by closing the entire location:


Closing the entire facility will do more than just limit production: the facility is completely removed from the network for that given period and no other activities can take place.
To enable users to build basic Cosmic Frog for Excel Applications to interact directly with Cosmic Frog from within Excel without needing to write any code, Optilogic has developed the Cosmic Frog for Excel Application Builder (also referred to as App Builder in this documentation). In this App Builder, users can build their own workflows using common actions like creating a new model, connecting to an existing model, importing & exporting data, creating & running scenarios, and reviewing outputs. Once a workflow has been established, the App can be deployed so it can be shared with other users. These other users do not need to build the workflow of the App again, they can just use the App as is. In this documentation we will take a user through the steps of a complete workflow build, including App deployment.
You can download the Cosmic Frog for Excel – App Builder from the Resource Library. A video showing how the App Builder is used in a nutshell is included; this video is recommended viewing before reading further. After downloading the .zip file from the Resource Library and unzipping it on your local computer, you will find there are 2 folders included: 1) Cosmic_Frog_For_Excel_App_Builder, which contains the App Builder itself and this is what this documentation will focus on, and 2) Cosmic_Frog_For_Excel_Examples, which contains 3 examples of how the App Builder can be used. This documentation will not discuss these examples in detail; users are however encouraged to browse through them to get an idea of the types of workflows one can build with the App Builder.
The Cosmic_Frog_For_Excel_App_Builder folder contains 1 subfolder and 1 Macro-enabled Excel file (.xlsm):

When ready to start building your own first basic App, open the Cosmic_Frog_For_Excel_App_Builder_v1.xlsm file; the next section will describe the steps a user needs to take to start building.
When you open the Cosmic_Frog_For_Excel_App_Builder_v1.xlsm file in Excel, you will find there are 2 worksheets present in the workbook, Start and Workflow. The top of the Start worksheet looks like this:

Going to the Workflow worksheet and clicking on the Cosmic Frog tab in the ribbon, we can see the actions that are available to us to create our basic Cosmic Frog for Excel Applications:

We will now walk through building and deploying a simple App to illustrate the different Actions and their configurations. This workflow will: connect to a Greenfield model in my Optilogic account, add records to the Customer and CustomerDemand tables, create a new scenario with 2 new scenario items in it, run this new scenario, and then export the Greenfield Facility Summary output table from the Cosmic Frog model into a worksheet of the App. As a last step we will also deploy the App.
On the Workflow worksheet, we will start building the workflow by first connecting to an existing model in my Optilogic account:

The following screenshot shows the Help tab of the “Connect To Or Create Model Action”:

In the remainder of the documentation, we will not show the Help tab of each action. Users are however encouraged to use these to understand what the action does and how to configure it.
After creating an action, the details of it will be added to 2 columns in the Workflow worksheet, see the screenshot below. The first action of the workflow will use columns A & B, the next action C & D, etc. When adding actions, the placement on the Workflow worksheet is automatic and the user does not need to do or change anything. Blue fields contain data that cannot be changed; white fields are user inputs when setting up the action and can be changed in the worksheet itself too.

The United States Greenfield Facility Selection model we are connecting to contains about 1.3k customer locations in the US which have demand for 3 products: Rockets, Space Suits, and Consumables. As part of this workflow, we will add 10 customers located in the state of Ontario in Canada to the Customers table and add demand for each of these customers for each product to the CustomerDemand table. The next 2 screenshots show the customer and customer demand data that will be added to this existing model.


First, we will use an Import Data action to append the new customers to the Customers table in the model we are connecting to:

Next, use the Import Data Action again to upsert the data contained in the New_CustomerDemand worksheet to the CustomerDemand table in the Cosmic Frog model, which will be added to columns E & F. After these 2 Import Data actions have been added, our workflow now looks like this:

Now that the new customers and their demand have been imported into the model, we will add several actions to create a new scenario where the new customers will be included. In this scenario, we will also remove the Max Number of New Facilities value, so the Greenfield algorithm can optimize the number of new facilities just based on the costs specified in the model. After setting up the scenario, an action will be added to run it.
Use the Create Scenario action to add a new scenario to the model:

Then, use 2 Create Item Actions to 1) include the Ontario customers and 2) remove the Max Number Of New Facilities value:


After setting up the scenario and its 2 items, the next step of the workflow will be to run it. We add a Run Scenario action to the workflow to do so:

The configuration of this action takes the following inputs:
We now have a workflow that connects to an existing US Greenfield model, adds Ontario customers and their demand to this model, then creates and runs a new scenario with 2 items in this Cosmic Frog model. After running the scenario, we want to export the Optimization Greenfield Facility Summary output table from the Cosmic Frog model and load it into a new worksheet in the App. We do so by adding an Export Data Action to the workflow:

After adding the above actions to the workflow, the workflow worksheet now looks like the following 2 screenshots from column G onwards (columns A-F contain the first 3 actions as shown in a screenshot further above):

Columns G-H contain the details of the action that created the new ON Customers Cost Optimized scenario, and columns I-J & K-L contain the details of the actions that added the 2 scenario items to this scenario.

Columns M-N contain the details of the action that will run the scenario that was added and columns O-P those of the action that will export the selected output table (Optimization Greenfield Facility Summary) into the GF_Facility_Summary worksheet of the App.
To run the completed Workflow, all we need to do is click on the Run Workflow action and confirm we want to run it:

After kicking off the workflow, if we switch to the Start worksheet, details of the run and its progress are shown in rows 9-11:

Looking on the Optilogic Platform, we can also check the progress of the App run and the Cosmic Frog model changes:

Once the run is done, all 3 jobs will have their State changed to Done, unless an error occurred, in which case the State will say Error.
Checking the United States Greenfield Facility Selection model itself in the Cosmic Frog application on cosmicfrog.com:

Once the App is finished running, we see that a worksheet named GF_Facility_Summary was added to the App Builder:

There are several other actions that users of the App Builder can incorporate into a workflow or use to facilitate workflow building. We will cover these now. Feel free to skip ahead to the “Deploying the App” section if your workflow is complete at this stage.
Additional actions that can be incorporated into workflows are the Run Utility, Upload File, and Download File actions. The Run Utility action can be used to run a Cosmic Frog Utility (a Python script), which currently can be a Utility downloaded from the Resource Library or a Utility specifically built for the App.
There are currently 4 Utilities available in the Resource Library:

After downloading the Python file of the Utility you want to use in your workflow, you need to copy it into the working_files_do_not_change folder that is located in the same folder as where you saved the App Builder. Now you can start using it as part of the Run Utility action. In the below example, we will use the Python script from the Copy Map to a Model Resource Library Utility to copy a map and all its settings from one model (“United States Greenfield Facility Selection”, the model connected to in a previous action) to another (“European Greenfield Facility Selection”):

The parameters of the Copy Dashboard to a Model Utility are the same as those of the Copy Map to a Model Utility:
The Orders to Demand and Delete SaS Scenarios utilities do not have any parameters that need to be set, so the Utility Params part of the Run Utility action can be left blank when using these utilities.
The Upload File action can be used to take a worksheet in the App Builder and upload it as a .csv file to the Optilogic platform:

Files that get uploaded to the Optilogic platform are placed in a specific working folder related to the App Builder, the name and location of which are shown in this screenshot:

The Download File action can be used to download a .txt file from the Optilogic platform and load it into a worksheet in the App:

Other actions that facilitate workflow building are the Move an Action, Delete an Action, and Run Actions actions, which will be discussed now. If the order of some actions needs to be changed, you do not need to remove and re-add them, you can use the Move an Action action to move them around:

It is also possible that an action needs to be removed from a Workflow. For this, the “Delete an Action” action can be used, rather than manually deleting it from the Workflow worksheet and trying to move other actions in its place:

Instead of running a complete workflow, it is also possible to only run a subset of the actions that are part of the workflow:

Once a workflow has been completed in the Cosmic Frog for Excel App Builder, it can be deployed so other users can run the same workflow without having to build it first. This section covers the Deployment steps.

The following message will come up after the app has been deployed:

Looking in the folder mentioned in this message, we see the following contents:


Congratulations on building & deploying your own Cosmic Frog for Excel App!
If you want to build Apps that go beyond what can be done using the App Builder, you can do so too. This may require some coding using Excel VBA, Python, and/or SQL. Detailed documentation walking through this can be found in this Getting Started with Cosmic Frog for Excel Applications article on Optilogic’s Help Center.
Optilogic provides a uniform, easy to use way to connect to your models and embed optimization and simulation in any application that can make an API call.
With a free account you have access to our API capabilities to build and deploy custom solutions.
There are a number of calls that you will need to make to fully automate the use of optimization and simulation in your models. Below you will find more information on how to go about this task.
For full API documentation see our Optilogic API Documentation. From this page you will be able to view detailed API documentation and live test code within your account.
First, to use any API call you must be authenticated. Once authenticated you will be provided with an API key that remains active for an hour. If your API key expires you will be required to re-authenticate to acquire a new key.
The Account API section of calls allows you to look up your account information such as username, email, how many concurrent solves you have access to, and the number of workspaces in your account.
The Workspace API section of calls allows you to look up information about a specific workspace. You can look up the workspace by name, obtaining a list of files in the workspace as well as a list of jobs associated with the models of that workspace.
The Job API section of calls allows you to view information relating to jobs in the system. Each time you execute a model solve, the Optilogic back end solver system, Andromeda, will spawn a new job to handle the request. The API call to start a job will return a key with which you can look up information about that job, even after it has completed. With these API calls you can get any job’s status, start a new job, or delete a particular job.
The Files API section of calls allows you to interact with the files of a given workspace. Each model is made up of a collection of files (mainly code and data files). With these calls you can copy, delete, upload, or download any file in a specified workspace.
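Putting authentication and a follow-up call together might look roughly like the sketch below. The endpoint paths, response field, and header used to pass the API key are placeholders; check the Optilogic API Documentation for the exact names before using this pattern:
import requests

BASE_URL = "https://api.optilogic.app/v0"

# Step 1: authenticate to obtain a temporary API key (as in the earlier example).
auth = requests.post(
    BASE_URL + "/refreshApiKey",  # placeholder path; see the API documentation
    headers={"X-USER-ID": "USERNAME", "X-USER-PASSWORD": "PASSWORD"},
)
api_key = auth.json().get("apiKey")  # placeholder response field name

# Step 2: use the key on subsequent calls until it expires (about one hour).
account = requests.get(
    BASE_URL + "/account",  # placeholder path; see the API documentation
    headers={"X-API-KEY": api_key},  # placeholder header name
)
print(account.status_code, account.text)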
An example of how to connect a Cosmic Frog model to a Snowflake database, along with a video walkthrough, can be found in the Resource Library. To get a copy of this demo into your own Optilogic account simply navigate to the Resource Library and copy the Snowflake template into your workspace.

If this isn’t your first time using supply chain design software, then take heart: the transition to Cosmic Frog is a smooth one. There are a few key differences worth noting below:

Most of these changes will be self-explanatory – if you used to write Customer Sourcing Policies you will now put similar data in a table called Customer Fulfillment Policies. Others may become easier to see over time — instead of many tables to create process logic you can enter everything you need in one table.

Have you ever wanted to put a site in your model and distinguish whether you owned the site or not? Have you ever wanted to make clear what you own and what you outsource? If so, the suppliers tables are for you:

Do you really need a corresponding transportation policy for every sourcing policy and vice versa? Did you know that by doing so you are actually making the model take longer to build and solve? Have you ever built a model that was infeasible because you forgot to add a policy?
We put the power in your hands. Simply change the lane creation rule and you can ensure that your model builds and solves the way you want.


We split inventory policies into three sections because we believe there is a lot going on when you think through how to model inventory in your models, especially when you model inventory in simulation. Other than that, we cleaned up the table structure: why enter data in multiple tables if you don’t need to? Where possible we streamlined the table structure to make it easier to enter your data.