
Getting Started with Cosmic Frog for Excel Applications

Cosmic Frog for Excel Applications provide alternative interfaces for specific use cases as companion applications to the full Cosmic Frog supply chain design product. For example, they can be used to access a subset of Cosmic Frog functionality in a simplified manner, or to give users who are not experienced in working with Cosmic Frog models access to just the subset of inputs and/or outputs of a full-blown Cosmic Frog model that is relevant to their role.

Several example use cases are:

  • Allowing users such as transportation planners to interact with a transportation optimization (Hopper) model through a simple user interface in Excel. Simply update Shipment quantities and review the outputs, including routes on a map, all in Excel.
  • An App that geocodes locations and shows them on a Map, where the markers are optionally scaled by an attribute.
  • Easily set up sensitivity analysis for demand and supply: increase/decrease demand based on certain customer or product factors and increase/decrease the capacity of facilities based on multiple factors too (location, type of facility), run multiple versions and compare outputs.
  • Create all input data for a Cosmic Frog model in Excel (e.g. Greenfield) and run it from there, including reading the required outputs back in and showing them on a map.
  • Quickly set up scenarios and run them from the App.
  • Output viewers for different types of uses, e.g. executive summaries vs detailed outputs of certain supply chain areas for planners.

Here we will discuss how one can set up and use a Cosmic Frog for Excel Application. This will include steps that use VBA (Visual Basic for Applications) in Excel and scripting in the programming language Python, which may sound daunting at first if you have little or no experience with these. However, by following along with this resource and the ones referenced in this document, most users will be able to set up their own App in about a day or two by copy-pasting from these resources and updating the parts that are specific to their use case. Generative AI tools like ChatGPT and Perplexity can also be very helpful to get a start on VBA and Python code. Cosmic Frog functionality itself will not be explained much in this documentation; the assumption is that users are familiar with the basics of building, running, and analyzing outputs of Cosmic Frog models.

In this documentation we are mainly following along with the Greenfield App that is part of the Resource Library resource “Building a Cosmic Frog for Excel Application”. Once we have gone through this Greenfield app in detail, we will discuss how other common functionality that the Greenfield App does not use can be added to your own Apps.

Several Cosmic Frog for Excel Applications developed by Optilogic are available in the Resource Library. Links to these and a short description of each of them can be found in the penultimate section “Apps Available in the Resource Library” of this documentation.

Throughout the documentation links to other resources are included; in the last section “List of All Resources” a complete list of all resources mentioned is provided.

Contents of this documentation:

  1. High-level Overview of Steps to Build a Cosmic Frog for Excel Application
  2. Building your App – Excel Worksheets
  3. Excel: Building and Assigning Macros
  4. The Optilogic.bas VBA Module
  5. App Keys and the app.key File
  6. Python Job File & requirements.txt
    1. Getting Started with Python and Optilogic’s cosmicfrog Python Library
  7. Summary of Local and Uploaded Files
  8. Run Greenfield Macro
  9. greenfield.job
  10. Run Manager to Monitor Jobs
  11. Additional Common App Functionality
    1. Prevent Locking of Excel & Ability to Cancel an App Run
    2. Reading CSV Outputs Back into the App Workbook
    3. Using Maps
  12. Tips & Tricks
  13. Troubleshooting
    1. Excel: Protected View and/or Security Warnings
    2. Anti-virus Software blocking Macros from Running
    3. Local Python Scripts not Connecting to cosmicfrog.com
    4. Cells with Formulas in Excel Exporting to CSV as 0’s
    5. Closing Output Files
  14. Apps Available in the Resource Library
  15. List of All Resources

1.    High-level Overview of Steps to Build a Cosmic Frog for Excel Application

The following screenshot shows at a high level what happens when a typical Cosmic Frog for Excel App is used. The left side represents what happens in Excel, and the right side what happens on the Optilogic platform.

  1. A lot of the work to create a Cosmic Frog for Excel App is done in Excel, such as entering and configuring any inputs that the Cosmic Frog model needs to be updated with.
  2. Once all the inputs and any needed settings are configured, the data will typically be exported, often either in CSV or TXT format.
  3. The next step is to upload the data to the Optilogic platform.
  4. A Python script is used on the Optilogic platform to run the Job, which will often involve updating, running, and/or analyzing a Cosmic Frog model.
  5. Once the job is complete on the Optilogic platform, the data (typically outputs of a Cosmic Frog model) that we require will be exported, again often to CSV or TXT format, and downloaded from the Optilogic platform.
  6. The outputs are loaded back into Excel, which may require conversion from Unix to Windows format.

2.    Building your App – Excel Worksheets

A typical Cosmic Frog for Excel Application will contain at least several worksheets that each serve a specific purpose. As mentioned before, we are using the MicroAPP_Greenfield_v3.xlsm App from the Building a Cosmic Frog for Excel Application resource as an example. The screenshots in this section are of this .xlsm file. Depending on the purpose of the App, users will name and organize worksheets differently, and add/remove worksheets as needed too:

  1. Admin – a worksheet where:
    1. The App Key can be entered before running the App for the first time. An App Key is required in order to authenticate the user on the Optilogic platform. See the section “App Keys and the app.key File” further below for more details.
    2. The local path to the Excel workbook can be entered if the workbook is, for example, in an online drive – this is needed to be able to create an app.key file in the same folder as the Excel workbook.
    3. A high-level description of the App, including its purpose, what inputs are required, and what outputs it will provide.
    4. User instructions telling the user what inputs to update and how to run the App.
    5. A set-up guide: steps to get the Excel App working the first time, e.g. where to put files related to the App, what information to provide where for set-up, etc.

  2. Cosmic Frog model input data worksheets: 1 or multiple worksheets where data that needs to be uploaded into a Cosmic Frog model is entered. This could be data that together makes up a complete Cosmic Frog model or it could be a subset of Cosmic Frog model data that will be used to update an already existing Cosmic Frog model. Some Apps, for example output viewer Apps, may not have any of these. Here there are 3: Customers, Suppliers, and Facilities. These 3 worksheets contain data to populate an empty model on the Cosmic Frog platform, which will then be used for a Greenfield run. The Customers worksheet contains:
    1. Customer Name data – this will be used to populate the Customers and the Customer Demand tables in Cosmic Frog.
    2. Latitude & Longitude for each customer – this will be used to populate the Customers table; this is required to run Greenfield so the algorithm can calculate distances between possible source-destination pairs.
    3. Quantity – this will be used to populate the Customer Demand table in Cosmic Frog, where Greenfield takes this into account to calculate Fulfillment and Procurement Costs (both costs are on a per unit per mile/kilometer basis).

  3. Settings: any specific settings that are needed to run the Cosmic Frog model are typically specified here. For Greenfield these are shown in the screenshot above.
    1. Greenfield specific settings that are taken from this section and put into the Greenfield Settings table in the Cosmic Frog model.
    2. Resource Size and File name for results – these are model run options that the user can specify here. Resource Size sets what size of solver will be used for the Greenfield run and “File name for results” will be part of the file name of the output files that will be downloaded from Cosmic Frog into the folder where the Excel workbook is saved. This is so users know which results are from which Greenfield run.
    3. For Cosmic Frog model runs using different engines (e.g. NEO for optimization, HOPPER for transportation optimization, etc.) you can also put relevant Model Run Options that the user may want control over on this Settings tab (those shown on the Run screen in Cosmic Frog after clicking on the Run button). For example, turning on/off Load Balancing for Hopper models could be an option that can be set on the Settings worksheet.

  4. A “Run …” worksheet:
    1. This worksheet will often contain a button that can be clicked to kick off the App’s Macro which performs the steps of exporting and uploading data to the Optilogic platform, including the Python Job that will be run on the Optilogic platform, and the steps of what to do with the outputs of the Job once it has completed on the Optilogic platform. We will discuss building and assigning Macros in the next section. The contents of the Macro assigned to the Run Greenfield button are also covered in a lot of detail in a later section.
    2. Status updates can be shown on this worksheet too.
  5. Cosmic Frog model output data worksheets: 1 or multiple worksheets that select outputs are read back into after a Job has completed. In the MicroApp_Greenfield_v3.xlsm that we are using as an example here, the outputs are just downloaded as CSV files and not read back into the Excel workbook, so we do not see these in this example, but we will cover how this can work in a later section titled “Reading CSV Outputs Back into the App Workbook”.
  6. Worksheets with other Cosmic Frog model outputs such as Maps, which are again not used in this example, but can be and will be discussed in a later section titled “Using Maps”.

3.    Excel: Building and Assigning Macros

To set up and configure Cosmic Frog for Excel Applications, we mostly use .xlsm Excel files, which are macro-enabled Excel workbooks. When opening an .xlsm file that for example has been shared with you by someone else or has been downloaded from the Optilogic Resource Library (Help Article on How To Use the Resource Library), you may find that you see either a message about a Protected View where editing needs to be enabled or a Security Warning that Macros have been disabled. Please see the Troubleshooting section towards the end of this documentation on how to resolve these warnings.

To set up Macros using Visual Basic for Applications (VBA), go to the Developer tab of the Excel ribbon:

If the Developer option is not available in the ribbon, then go to File > Options > Customize Ribbon, select Developer from the list on the left and click on the Add >> button, then click on OK. Should you not see Options when clicking on File, then click on “More…” instead, which will then show you Options too.

Now that you are set up to start building Macros using VBA: go to the Developer tab, enable Design Mode and add controls to your sheets by clicking on Insert, and selecting any controls to insert from the drop-down menu. For example, add a button and assign a Macro to it by right clicking on the button and selecting Assign Macro from the right-click menu:

  1. Now you can create a new Macro that will be assigned to this button by clicking New, or
  2. You can also choose which workbook (this one or any other open workbook) to select a Macro from to assign to “Button 1”.
  3. Here the Say_Hello Macro from the workbook MicroApp_BestPractices_v3.xlsm is selected. Once an existing Macro is selected, the New button changes to Edit and clicking on it opens the Macro in VBA:
  1. The VBA Project of the workbook whose Macro we are editing; here it is the MicroApp_Best_Practices_v3.xlsm workbook.
  2. The worksheet that contains the Macro we are editing, here it is the second sheet which is named Getting Started.
  3. In the drop-down on the right top of the VBA editor we see that the Sub named Say_Hello is selected from the Subs that are contained in the worksheet. To view which other Subs are available in this worksheet, click on the drop-down menu.
  4. This Sub is Public, meaning it can be used by any Macro-enabled workbook; its name is Say_Hello, and what it does is show a message box with the text “Hello World”, as shown in the next screenshot, taken after clicking on Button 1 (the full Sub is sketched at the end of this section):

To learn more about Visual Basic for Applications, see this Microsoft help article Getting started with VBA in Office, it also has an entire section on VBA in Excel.
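For reference, a Sub like the Say_Hello Macro described above is about as simple as VBA code gets. Based on the description, it likely looks very close to this sketch:

```vba
' Minimal sketch of the Say_Hello Macro described above
Public Sub Say_Hello()
    ' Display a message box with the text "Hello World"
    MsgBox "Hello World"
End Sub
```

Assigning a Sub like this to a button, as shown above, is a quick way to confirm that Macros are enabled and wired up correctly before building anything more complex.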

4.    The Optilogic.bas VBA Module

It is possible to add custom modules to VBA in which Sub procedures (“Subs”) and functions that perform specific tasks have been pre-defined; these can then be called from the rest of the VBA code in any workbook the module has been imported into. Optilogic has created such a module, called Optilogic.bas. This module provides 8 standard Subs and functions for integration with the Optilogic platform.

You can download Optilogic.bas from the Building a Cosmic Frog for Excel Application resource in the Resource Library:

You can then import it into the workbook you want to use it in:

Right click on Modules in the VBA Project of the workbook you are working in and then select Import File…. Browse to where you have saved Optilogic.bas and select it. Once done, it will appear in the Modules section, and you can double click on it to open it up:

    1. The drop-down at the right top of the VBA window contains a list of all Sub procedures and functions defined in the Optilogic Module.
    2. These are the 8 Subs and functions contained in the module; you can jump to any one of them by selecting it from the drop-down list. Here follows an explanation of these Subs and functions and the arguments they take. Users do not need to know the actual code of these, just what they can be used for and what their syntax is:
      a. Upload_File_To_Optilogic (localFilePath, localFileName, platformFilePath, platformFileName, appKey) – this will upload a file from the user’s local machine to the Optilogic platform. The arguments are as follows:
        1. localFilePath = File path on the local machine
        2. localFileName = Name of the local data file to be uploaded
        3. platformFilePath = File path in the platform workspace. You can get this from the Explorer (1) on the Optilogic platform (click on the > button at the top left to open it up): right click on the folder and select Copy (2) > Folder Path (3):

        4. platformFileName = Name of the data file once it is uploaded to the platform workspace. The name can be different from the local file name
        5. appKey = User’s App Key for authentication. See the section “App Keys and the app.key File” further below on how to get this key
      b. Download_File_From_Optilogic (platformFilePath, platformFileName, localFilePath, localFileName, appKey) – this will download a file from the Optilogic platform to the user’s local machine. The arguments are as follows:
        1. platformFilePath = File path in the platform workspace. You can get this from Explorer: right click on the folder and select Copy > Folder Path, see also the screenshot under bullet 2.a.3 (platformFilePath) above
        2. platformFileName = Name of the data file in the platform workspace
        3. localFilePath = File path on the local machine
        4. localFileName = Name of the local data file when downloaded. Name can be different to the platform workspace filename
        5. appKey = User’s App Key for authentication. See the section “App Keys and the app.key File” further below on how to get this key
      c. Run_Job_On_Optilogic (jobPath, jobName, appKey) – this will run a Job (a Python .py file) on the Optilogic platform. The arguments are as follows:
        1. jobPath = File path in the platform workspace where the Job (.py file) resides. You can get this from Explorer: right click on the folder and select Copy > Folder Path, see also the screenshot under bullet 2.a.3 (platformFilePath) above
        2. jobName = Name of the Job (.py file) in the platform workspace
        3. appKey = User’s App Key for authentication. See the section “App Keys and the app.key File” further below on how to get this key
      d. Monitor_Job_On_Optilogic (jobKey, worksheet, interval, usrMsg, msgCell, statusCell, timeCell, appKey) – this monitors the job that is running on the Optilogic platform and can give the user regular updates on the status of the Job that is being run. The arguments are as follows:
        1. jobKey = Unique Job Key identification returned from Run_Job_On_Optilogic(…) function
        2. worksheet = Name of the Worksheet where the status information is being updated
        3. interval = The time waited between getting an update on the Job status from the platform in seconds. In this example status will be checked every 5 seconds
        4. usrMsg = A free form message that will be shown to the user during status updates
        5. msgCell = Cell reference in the Worksheet where the message is updated. For example B7
        6. statusCell = Cell reference in the Worksheet where the Job status is updated. For example B8
        7. timeCell = Cell reference in the Worksheet where the time of the last update is updated. For example B9
        8. appKey = User’s App Key for authentication. See the section “App Keys and the app.key File” further below on how to get this key
      e. Convert_Unix_File_To_Windows (unixFilePath, unixFileName, windowsFilePath, windowsFileName) – files downloaded from the Optilogic platform can be in Unix format; in order to read them in a Windows program such as Excel, they may need to be converted, which users can do with this Sub procedure. The arguments are as follows:
        1. unixFilePath = File path on the local machine. You should use Get_Workbook_File_Path() to set this
        2. unixFileName = Name of the local data file in Unix format
        3. windowsFilePath = File path on the local machine. You should use Get_Workbook_File_Path() to set this
        4. windowsFileName = Name of the local data file in Windows format
      f. Get_Workbook_File_Path (sheetName, pathCell, usrMsg) – gets the file path of the current workbook. It will attempt to retrieve this without any user input; however, if the workbook is in an online folder, the user may need to enter the local file path manually. The arguments are as follows:
        1. sheetName = Name of the sheet in the workbook that contains the file path that has been input by the user
        2. pathCell = Cell in the sheet that contains the file path that has been input by the user, for example B3
        3. usrMsg = The message that is to be displayed to the user should the file path be invalid
      g. Export_CSV_File (filePath, writeFileName, sheetName, firstRow, colRange) – this will export a certain range in a worksheet as a CSV file; the Sub will dynamically determine how many rows need to be included in the export. The arguments are as follows:
        1. filePath = The file path on the local machine. You should use Get_Workbook_File_Path() to set this
        2. writeFileName = Name of the local data file that will be created or updated
        3. sheetName = Name of the sheet in the workbook that contains the data
        4. firstRow = Name of the column in which the dynamic row range should be selected, for example A
        5. colRange = Defines the column range, for example A1:F where the row value for F will be dynamic
      h. Manage_App_Key (sheetName, keyCell, usrMsg, filePath) – this function takes the App Key that the user has entered into a certain cell on a certain worksheet in the workbook and puts it into an app.key file in the directory where the Excel .xlsm workbook is saved. It validates the App Key and, if valid, replaces the App Key in the Excel file with the following text: “app key has been saved, you can keep running the App”, thus ensuring the App Key is not visible to others. The arguments are as follows:
        1. sheetName = Name of the sheet in the workbook that contains the App Key that has been input by the user
        2. keyCell = Cell in the sheet that contains the App Key that has been input by the user, for example B3
        3. usrMsg = The message that is to be displayed to the user should the App Key be invalid
        4. filePath = File path on the local machine.  You should use Get_Workbook_File_Path() to set this

These Optilogic specific Sub procedures and the standard VBA for Excel functionality enable users to create the Macros they require for their Cosmic Frog for Excel Applications.
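To give a feel for how these Subs and functions are called from a Macro, here is a hedged sketch that downloads a results file from the platform and converts it from Unix to Windows format. It follows the argument lists documented above; the local folder, file names, and App Key handling are illustrative only and will differ per App:

```vba
' Illustrative sketch only - folder, file names, and App Key handling are examples, not the actual Greenfield App code
Dim filePath As String, appKey As String, platformFolder As String
filePath = "C:\CosmicFrogApps\GreenfieldApp\"                          ' folder the workbook is saved in (example path)
appKey = "<your App Key>"                                              ' normally obtained via Manage_App_Key / the app.key file
platformFolder = "My Files/z Working Folder for Excel Greenfield App"

' Download an output file from the platform working folder to the local folder
Call Download_File_From_Optilogic(platformFolder, "Facility Summary.csv", filePath, "Facility Summary_unix.csv", appKey)

' Convert the downloaded file from Unix to Windows format so Excel can open it cleanly
Call Convert_Unix_File_To_Windows(filePath, "Facility Summary_unix.csv", filePath, "Facility Summary.csv")
```

The Run Greenfield Macro section further below shows how the other routines are used in an actual App.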

5.    App Keys and the app.key File

App Keys are used to authenticate the user from the Excel App on the Optilogic platform. To get an App Key that you can enter into your Excel Apps, see this Help Center Article on Generating App and API Keys. During the first run of an App, the App Key will be copied from the cell it is entered into to an app.key file in the same folder as the Excel .xlsm file, and it will be removed from the worksheet. This is done by using the Manage_App_Key Sub procedure described in the “Optilogic.bas VBA Module” section above. The user can then keep running the App without having to enter the App Key again, unless the workbook or app.key file is moved elsewhere.

It is important to emphasize that App Keys should not be saved into Excel Apps as they can easily be accidentally shared when the Excel App itself is shared. Individual users need to authenticate with their own App Key.

When sharing an App with someone else, one easy way to do so is to share all contents of the folder where the Excel App is saved (optionally, zipped up). However, one needs to make sure to remove the app.key file from this folder before doing so.

6.    Python Job File & requirements.txt

A Python Job file in the context of Cosmic Frog for Excel Applications is the file that contains the instructions (in Python script format) for the operations of the App that take place on the Optilogic Platform.

Notes on Job files:

  • Jobs are Python scripts that run on the Optilogic Platform. They are Python scripts, but typically have a .job extension, so that inexperienced users do not accidentally open them and change the Python code
  • A Job could do anything, from transforming data to running a custom engine or updating a Cosmic Frog model
  • A Job can be built in Atlas on the Optilogic platform or using another, external IDE (Integrated Development Environment). A code editor like, for example, Visual Studio Code will work fine too for most Cosmic Frog for Excel App purposes
  • Cosmic Frog has a Python library for easily building Python scripts against Optilogic’s supply chain data model, this library is called cosmicfrog and needs to be imported in order to be used. The functions that this library provides are explained in the next section titled “Getting Started with Python and Optilogic’s cosmicfrog Python Library”.

For Cosmic Frog for Excel Apps, a .job file is typically created and saved in the same folder as the Macro-enabled Excel workbook. As part of the Run Macro in that Excel workbook, the .job file will be uploaded to the Optilogic platform too (together with any input & settings data). Once uploaded, the Python code in the .job file will be executed, which may do things like loading the data from any uploaded CSV files into a Cosmic Frog model, running that Cosmic Frog model (a Greenfield run in our example), and retrieving certain outputs of interest from the Cosmic Frog model once the run is done.

For a Python job that uses functionality from the cosmicfrog library to run, a requirements.txt file that just contains the text “cosmicfrog” (without the quotes) needs to be placed in the same folder as the .job file. Therefore, this file is typically created by the Excel Macro and uploaded together with any exported data & settings worksheets, the app.key file, and the .job file itself so they all land in the same working folder on the Optilogic platform. Note that the Optilogic platform will soon be updated so that using a requirements.txt file will not be needed anymore and the cosmicfrog library will be available by default.

As with VBA, users and creators of Cosmic Frog for Excel Apps do not need to be experts in Python; they will mostly be able to do what they want by copy-pasting from existing Apps and updating only the parts that are different for their App. In the greenfield.job section further below we will go through the code of the Python Job for the Greenfield App in more detail, which can be a starting point for users to start making changes for their own Apps. Next, we will provide some more details and references to quickly equip you with some basic knowledge, including what you can do with the cosmicfrog Python library.

a. Getting Started with Python and Optilogic’s cosmicfrog Python Library

There are a lot of helpful resources and communities online where users can learn everything there is to know about using & writing Python code. A great place to start is on the Python for Beginners page on python.org. This page also mentions how more experienced coders can get started with Python.

i. Working with Python Locally

Working locally on any Python scripts/Jobs has the advantage that you can make use of code completion features, which help with things like auto-completion, showing what arguments functions need, catching incorrect syntax/names, etc. An example set-up to achieve this is one where Python, Visual Studio Code, and an IntelliSense extension package for Python for Visual Studio Code are installed locally:

  • Visual Studio Code can be downloaded and installed from the “Download for Windows Stable Build” button on the https://code.visualstudio.com/ website or from the Microsoft store
  • Python itself can be downloaded and installed from https://www.python.org/downloads/ or the Microsoft store
    • If asked to add Python to the PATH variable, say yes or check the box to do so
    • If you are unsure Python is already installed on your computer, you can type “python” into a command prompt (type “cmd” in your Windows search bar to open a command prompt). If Python is installed, the command prompt will return the text Python together with its version number and some additional information.
  • Get IntelliSense (code completion) for Python in Visual Studio Code working with an extension package like this one https://marketplace.visualstudio.com/items?itemName=ms-python.python
    • Again, if asked to add to the PATH variable, say yes or check the box to do so

Once you are set up locally and are starting to work with Python files in Visual Studio Code, you will need to install the pandas and cosmicfrog libraries to have access to their functionality. You do this by typing the following in a terminal in Visual Studio Code:

  1. pip install pandas, then hit Enter
  2. pip install cosmicfrog, then hit Enter

More experienced users may start using additional Python libraries in their scripts and will need to similarly install them when working locally to have access to their functionality.

If you want to access items on the Optilogic platform (like Cosmic Frog models) while working locally, you will likely need to whitelist your IP address on the platform, so the connections are not blocked by a firewall. You can do this yourself on the Optilogic platform:

  1. After logging into the Optilogic platform on cosmicfrog.com, go to Cloud Storage.
  2. If you do not see the Cloud Storage icon, click on the icon with the 3 dots to see more apps, and then click on the Cloud Storage icon which should be visible now.
  3. Click on the Firewall tab, the 4th tab across the top.
  4. Your current IP address will be shown and pre-populated in a rule.
  5. Configure the Rule: give it a Name if you so desire (can leave blank) and enter the Start and End IP addresses if you are whitelisting a range of IP addresses. You will only need to enter a Start IP if there is one static address you are connecting from. Click on the calendar to pick your own end date for the rule. Once done configuring the rule, you can click on the Add Rule button.
  6. The rule that was added is listed at the bottom: it shows when the rule was created, the IP address(es) the rule applies to, and when the rule expires. Once the rule expires, you will need to create a new one to be able to keep connecting to the Optilogic platform with your local Python scripts.

A great resource on how to write Python scripts for Cosmic Frog models is this “Scripting with Cosmic Frog” video. In this video, the cosmicfrog Python library, which adds specific functionality to the existing Python features to work with Cosmic Frog models, is covered in some detail already. The next set of screenshots will show an example using a Python script named testing123.py on our local set-up. The first screenshot shows a list of functions available from the cosmicfrog Python library:

  1. The FrogModel module from the cosmicfrog Python library is imported on the first line (“from cosmicfrog import FrogModel”); this is needed to be able to use the specific functions in the cosmicfrog library that can work with Cosmic Frog models (line 1).
  2. A variable named “model” is created and the FrogModel function is used to point it to a model named Global Risk Analysis that is in my Optilogic account (line 3). Now whenever the Python code does something with “model” it is applied to the Global Risk Analysis model.
  3. The list of functions available in the FrogModel module of the cosmicfrog Python library (lines 5-28).

When you continue typing after you have typed “model.”, the code completion feature will auto-generate a list of functions you may be getting at; in the next screenshot these are ones that start with or contain a “g”, as only a “g” has been typed so far. This list will auto-update the more you type. You can select from the list with your cursor, or with the arrow up/down keys and then hit the Tab key to auto-complete:

When you have completed typing the function name and next type a parenthesis ‘(‘ to start entering arguments, a pop-up will come up which contains information about the function and its arguments:

  1. A short description of what the function does.
  2. A description of each argument of the function.
  3. The information given after the -> arrow tells us what type of output this function generates, a list of strings in this case.

As you type the arguments for the function, the argument that you are on and the expected format (e.g. bool for a Boolean, str for string, etc.) will be in blue font and a description of this specific argument appears above the function description (e.g. above box 1 in the above screenshot). In the screenshot above we are on the first argument input_only which requires a Boolean as input and will be set to False by default if the argument is not specified. In the screenshot below we are on the fourth argument (original_names) which is now in blue font; its default is also False, and the argument description above the function description has changed now to reflect the fourth argument:

The next screenshot shows 2 examples of using the get_tablelist function of the FrogModel module:

  1. In this example, the 3rd argument, technology_filter, is set to NEO, which means only network optimization tables will be returned. Tables that are not used by NEO, but only by other technologies, will not be returned. The 1st, 2nd, and 4th arguments are set to ‘’, which means the defaults for these will be used:
    1. First argument: input_only – the default is False, meaning that the list of tables returned will not only be model input tables (like Customers, Facilities, Customer Demand, Transportation Policies, etc.).
    2. Second argument: output_only – the default is False, meaning that the list of tables returned will not only be model output tables (like Optimization Network Summary, Optimization Facility Summary, etc.). So, in this case all network optimization input and output tables will be returned.
    3. Fourth argument: original_names – the default is False, meaning that the table names as they are in the underlying database will be returned (e.g. “customerdemand”), not the names as they are used in the Cosmic Frog User Interface (e.g. “CustomerDemand”). Differences are mainly in capitalization of the names.
  2. In this example only Greenfield output tables will be returned, and the names will be as they are in the Cosmic Frog user interface. This is because the arguments are set as follows:
    1. First argument: input_only – is set to the default of False by using ‘’, which means the list of tables returned will not only be model input tables.
    2. Second argument: output_only – set to True, which means the list of tables returned will only be model output tables.
    3. Third argument: technology_filter – set to “TRIAD” which is the Greenfield engine in Cosmic Frog. For reference the names of the technologies in Cosmic Frog are:
      1. NEO: network optimization
      2. THROG: simulation
      3. TRIAD: Greenfield
      4. HOPPER: transportation optimization
    4. Fourth argument: original_names – set to True, meaning that the table names as they appear in the Cosmic Frog user interface will be returned.

ii. Working with Python in Atlas on the Optilogic Platform

As mentioned above, you can also use Atlas on the Optilogic platform to create and run Python scripts. One drawback here is that it currently does not have code completion features like IntelliSense in Visual Studio Code.

The following simple test.py Python script on Atlas will print the first Hopper output table name and its column names:

  1. On the Optilogic platform on cosmicfrog.com, go to Atlas. Note that the order of the applications available on the left-hand side on the Optilogic platform may be different for you as compared to what is shown here. If you do not see Atlas, then click on the icon with 3 dots to show all applications, and then click on Atlas.
  2. The test.py script. It:
    1. Assigns a Cosmic Frog model named Transportation Optimization as the “model” variable (line 7).
    2. Uses the get_tablelist function with arguments False (input_only), True (output_only), “HOPPER” (technology_filter) and True (original_names), so that the output of this function will be the list of Hopper ( = transportation optimization) output tables where the names will be as they are in the Cosmic Frog UI. This is assigned to a variable named “my_list” (line 9).
    3. Assigns the first table name from the list created under bullet b. to a variable named “first_table” (line 10).
    4. Prints the name of this first table (line 11).
    5. Prints the names of the columns in this table, by using the cosmicfrog get_columns function (line 12).
  3. The script runs when you click on the Run In Studio button at the top. In this case, the output will be shown in a window below the Python script.
  4. Instead of running the script by using the Run In Studio option as shown in the previous bullet, you can instead also use the Run As Job button. In this case the progress and outputs of it will be shown in the Run Manager app of the Optilogic platform, see next screenshot:

  1. After clicking on Run As Job, switch to the Run Manager app.
  2. The Job will appear in the list of items that have been run, the State will indicate if it is still busy running or already done. It will say Error if anything has gone wrong while running the Job.
  3. On the right-hand side of the screen there are multiple logs where information about the run can be found. By clicking on the Job Log item (the 4th one at the top), you will see the outputs of the test.py Python script (note that not all column names have been captured in this screenshot).

7.    Summary of Local and Uploaded Files

After running the Greenfield App, we can see the following files together in the same folder on our local machine:

  1. MicroApp_Greenfield_v3.xlsm – the Macro-enabled Excel workbook that contains the App Key, the data we want to upload to the Cosmic Frog model, and the Run Greenfield Macro.
  2. greenfield.job – the Python file that contains the code of what needs to be done on the Optilogic platform: load data into a new model, run Greenfield on the model, and retrieve specific output tables.
  3. app.key – the first time the Run Greenfield Macro is run, this file is created so the user can be authenticated on the Optilogic platform.
  4. requirements.txt – contains the text cosmicfrog so the cosmicfrog Python library can be used by the Python code in the .job file.
  5. These 4 .csv files are created as one of the first steps when the Run Greenfield Macro is run, these are the data exported from the Customers, Facilities, Suppliers, and Settings worksheets in the .xlsm file.
  6. Once the Job has finished running on the Optilogic platform, the results will be downloaded into 2 files, which are named Results – “file name for results” (set on Settings worksheet) – “name of Cosmic Frog Greenfield output table”. Here the Optimization Greenfield Facility Summary and Optimization Greenfield Flow Summary output tables are downloaded as .csv files. User can then open these to analyze the Greenfield results.

On the Optilogic platform, a working folder is created by the Run Greenfield Macro. This folder is called “z Working Folder for Excel Greenfield App”. After running the Greenfield App, we can see following files in here:

  1. Files that were uploaded from the local machine as is: app.key for user authentication, requirements.txt for the Python file to run properly, and the 4 .csv files with the data to update the Cosmic Frog model with.
  2. greenfield_job.py – the Python file that will execute on the Optilogic platform, renamed from greenfield.job when the file was uploaded to the Optilogic platform by the Run Greenfield Macro. It needs the .py extension so that it is recognized as a Python script file.
  3. The .csv files that contain the results of the Greenfield run that we want to download to the local machine. Note their names are slightly different here than what they end up being on the local machine, where the “File name for results” that the user has set on the Settings worksheet in the .xlsm workbook is added in, which is done by the Run Greenfield Macro when downloading the files from the Optilogic platform.

8.    Run Greenfield Macro

Parts of the Excel Macro and Python .job file will be different from App to App based on the App’s purpose, but a lot of the content will be the same or similar. In this section we will step through the Macro that is behind the Run Greenfield button in the Cosmic Frog for Excel Greenfield App that is included in the “Building a Cosmic Frog for Excel Application” resource. Each step of the way, we will explain at a high level what is happening and mention whether this part is likely to be different and in need of editing for other Apps or whether it would typically stay the same across most Apps. After stepping through the Excel Macro in this section, we will do the same for the greenfield.job file in the next section.

The next screenshot shows the first part of the VBA code of the Run Greenfield Macro:

  1. We are in the MicroApp_Greenfield_v3.xlsm (1a) Macro-enabled Excel workbook (the Cosmic Frog for Excel Greenfield Application), on the Run Greenfield worksheet (1b) and are looking at the VBA code for the RunGreenfield_click public Sub procedure (1c).
  2. There is some error handling embedded within the Macro which will be called if an error occurs; we will see the actual error handling code in the last screenshot of this Macro.
    1. This can be left as is for most Apps. More advanced users may over time start adding their own, more elaborate error handling.
  3. Dim is used to declare variables and their data type. Here 2 string variables are declared: filePath and writeFileName
    1. This can be kept as is for most Apps.
  4. The next 3 sets of lines of code each check whether certain settings have been selected by the user (Number of Facilities (4a), Only use Facilities (4b), and Resource Size (4c), respectively): they check whether a certain cell (C5, C8, and C13, respectively) on a certain worksheet (the Settings worksheet for all 3) is empty (= “”) and if so, pop up a message prompting the user to select a value. If the cells are not empty, the code continues without any messages. A sketch of one such check follows this list.
    1. These will change depending on the App. In this case the first 2 are Greenfield specific, so for other technologies, you would not have these specific settings, but may have others where user input is required. Remove/update or add to these (and the corresponding cells on the Settings worksheet) depending on how many settings a user is required to select for the specific App. The Resource Size setting can be used for all technologies and may be left here. However, if this should not be set by a user, the Resource Size can be hardcoded into the Macro or the Python Job.
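The sketch below shows what one of these checks might look like; the cell and worksheet references are as described above, but the message text and the way the Macro stops are illustrative assumptions rather than the literal code:

```vba
' Hedged sketch of one settings check - the message text is illustrative
If Worksheets("Settings").Range("C5").Value = "" Then
    MsgBox "Please select a value for Number of Facilities on the Settings worksheet before running the App."
    Exit Sub   ' stop the Macro so the user can fix the input (assumption about how the actual Macro handles this)
End If
```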

Note that throughout the Macro you will see text in green font. These are comments to describe what the code is doing and are not code that is executed when running the Macro. You can add comments by simply starting the line with a single quote and then typing your comment. Comments can be very helpful for less experienced users to understand what the VBA code is doing.

Next, the file path to the workbook is retrieved:

This piece of code uses the Get_Workbook_File_Path function of the Optilogic.bas VBA module to get the file path of the current workbook. This function first tries to get the path without user input. If it finds that the path looks like the Excel workbook is stored online, for example in a Cloud folder, it will use the user input in cell B3 on the Admin worksheet to get the file path instead. Note that specifying the file path is not necessary if the App runs fine without it, which means it could get the path without the user input. Only if the user gets the message “Local file path to this Excel workbook is invalid.  It is possible the Excel workbook is in a cloud drive, or you have provided an invalid local path.  Please review setup step 4 on Admin sheet.” does the local file path need to be entered into cell B3 on the Admin worksheet.

This code can be left as is for other Apps if there is an Admin worksheet (the variable pathsheetName indicated with 1 in screenshot above) where in cell B3 the file path (the variable pathCell indicated with 2 in screenshot above) can be specified. Of course, the worksheet name and cell can be updated if these are located elsewhere in the App. The message the user gets in this case (set as pathusrMsg indicated with 3 in the screenshot above) may need to be edited accordingly too.
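A sketch of what this call might look like, using the variable names mentioned above (the message string is abbreviated here; the actual Macro uses the full message quoted earlier):

```vba
' Hedged sketch - variable names follow the description above, not the literal Macro code
Dim pathsheetName As String, pathCell As String, pathusrMsg As String, filePath As String
pathsheetName = "Admin"
pathCell = "B3"
pathusrMsg = "Local file path to this Excel workbook is invalid. Please review setup step 4 on Admin sheet."
filePath = Get_Workbook_File_Path(pathsheetName, pathCell, pathusrMsg)
```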

The following code takes care of the App Key management:

The Manage_App_Key function from the Optilogic.bas VBA module is used here to retrieve the App Key from cell B2 on the Admin worksheet and put it into a file named app.key which is saved in the same location as the workbook when the App is run for the first time. The key is then removed from cell B2 and replaced with the text “app key has been saved; you can keep running the App”. As long as the app.key file and the workbook are kept together in the same location, the App will keep working.

Like the previous code on getting the local file path of the workbook, this code can be left as is for other Apps. Only if the location of where the App Key needs to be entered before the first run is different from cell B2 on the worksheet named Admin, the keysheetName and keyCell variables (indicated with 1 and 2 in the screenshot above) need to be updated accordingly.
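Analogous to the previous call, the App Key management might look roughly like the sketch below; the sheet and cell match the Greenfield App, while the invalid-key message is an illustrative assumption:

```vba
' Hedged sketch - the usrMsg text is illustrative
Dim keysheetName As String, keyCell As String, keyusrMsg As String, appKey As String
keysheetName = "Admin"
keyCell = "B2"
keyusrMsg = "The App Key entered is invalid. Please generate a new App Key and enter it in cell B2 on the Admin worksheet."
appKey = Manage_App_Key(keysheetName, keyCell, keyusrMsg, filePath)
```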

This App has a greenfield.job file associated with it that contains the Python script which will be run on the Optilogic platform when the App is run. The next piece of code checks that this greenfield.job file is saved in the same location as the Excel App, and it also sets the name of the folder to be created on the Optilogic platform where files will get uploaded to:

This code can be left as is for other Cosmic Frog for Excel Apps, except that the following will likely need updating (a sketch of this step follows the list below):

  1. The name of the .job file will usually be something descriptive of the App.
  2. The name of the folder to be created on the Optilogic platform. As the files and model keep getting overwritten with each run of the Greenfield App, you will want to make sure to use a folder naming convention for your Apps that prevents it from accidentally overwriting anything that should be kept. Creating separate folders for each App and naming them something descriptive of the App will therefore be helpful. Having separate folders by App will also make troubleshooting easier.
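A sketch of what this step might look like; the Dir-based existence check and the message are assumptions about the implementation, while the folder name is the one the Greenfield App uses:

```vba
' Hedged sketch - the existence check shown here is one common way to implement it
Dim jobFileName As String, platformFolderName As String
jobFileName = "greenfield.job"
If Dir(filePath & jobFileName) = "" Then
    MsgBox "greenfield.job was not found in the same folder as this workbook. Please review the setup steps on the Admin sheet."
    Exit Sub
End If
platformFolderName = "My Files/z Working Folder for Excel Greenfield App"
```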

The Greenfield settings are set in the next step. The ones the user can set on the Settings worksheet are taken from there and others are set to a default value:

      1. First the progress of the App is updated on the Run Greenfield worksheet in cells C5 (the status message) and C6 (the current time).
        1. This can be left as is for other Apps, except for updating the name of the worksheet and the cells if these are different in your App. The message itself (“Exporting data locally…”) can of course be updated too.
      2. This code checks if a file named Settings.csv is already present in the folder where the Excel App is located, and if so, deletes it.
        1. If a Settings.csv file is used for your Excel App, this code can be kept unchanged. It can be removed if no files need to be checked if they exist already, and it can be updated to check for the existence of any files by other names in the same location and delete those if present. Copy-paste the code and update it for each individual file you want to check and delete if it exists.
      3. All Greenfield settings are created as string variables here. A design decision was made for the Greenfield App to clear out the Greenfield Settings table in the Cosmic Frog model entirely and re-populate it with values on each run of the Greenfield App. This ensures that any settings changed manually in the model itself in between App runs do not unintentionally carry over.
        1. For other Greenfield Apps, this can be kept as is.
        2. For non-Greenfield Apps, these do not apply and can be removed or replaced by other variables for settings that the technology does use. An example would be to re-populate the Model Settings table either entirely or for certain settings for a network optimization (NEO) run. Think of inventory carrying cost %, circuity factor, average speed, or any of the primary UOM fields for example.
      4. All Greenfield variables, except for the last 3 (see next screenshot), are set here – either to a value set on the Settings worksheet in cells C5-C11, or to a default when the user cannot set them on the Settings worksheet of the Excel Greenfield App. Notes:
        1. If Then Else statements are used to set the values of 2 of the variables: MaxNumberOfNewFacilities and OnlyUseFacilitiesAsCandidateLocations. For example, if the value in cell C5 on the Settings worksheet is set to Optimize, the value of the MaxNumberOfNewFacilities variable becomes “”, meaning there is no upper limit on how many new facilities the Greenfield run can use, we are asking the algorithm to return the optimal number of new facilities. If cell C5 is not set to Optimize, it should contain a number and in this case that number will be set as the value of the MaxNumberOfNewFacilities variable.
        2. The Trim function is used frequently here to ensure leading and trailing spaces are not included when setting the variables to their values.
        3. Again, as mentioned under bullet 3b, it depends on the App what to remove, what to keep exactly as is, and what to copy-paste and update as needed here. A sketch of how one of these variables might be set follows this list.
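The sketch below illustrates the If Then Else plus Trim pattern described under 4a and 4b for the MaxNumberOfNewFacilities variable; it follows the description above but is not the literal Macro code:

```vba
' Hedged sketch of setting one of the Greenfield settings variables
Dim MaxNumberOfNewFacilities As String
If Trim(Worksheets("Settings").Range("C5").Value) = "Optimize" Then
    ' No upper limit: ask the Greenfield run to return the optimal number of new facilities
    MaxNumberOfNewFacilities = ""
Else
    ' Otherwise cell C5 contains the maximum number of new facilities to use
    MaxNumberOfNewFacilities = Trim(Worksheets("Settings").Range("C5").Value)
End If
```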

Next, the Greenfield Settings and the other input data are written into .csv files:

  1. The Resource Size and Scenario Name are set in this part of the VBA code. These are not Greenfield specific and similar/same setup + code can be used to set these for other Cosmic Frog technologies too.
    1. The Resource Size is set in cell C13 on the Settings worksheet, and has a value from a drop-down list:

The firstSpaceIndex variable is set to the location of the first space in the resource size string.

    2. This firstSpaceIndex value is then used together with the Left function to get the first part of the Resource Size value: the part before the first space. E.g. a value of 4XS (1 Core 1 Gb RAM) in cell C13 will be written as 4XS for the ResourceSize variable, M (6 Cores 16 GB RAM) becomes M, etc.
  2. All Greenfield settings that have been set so far (except the Resource Size and Scenario Name) are now written into a Settings.csv file (note that the screenshot only captures the Print lines partially; a sketch of this step and of the CSV exports follows this list):
    1. The Open … For Output As #1 statement opens, or, when it does not yet exist, creates the Settings.csv file in the same location as the Excel App. The Mode of the Open function is set to Output which means that we can write to this file, including overwriting anything in it.
    2. The first Print statement writes the first line into the Settings.csv: it writes the column names in, which are the same as the Greenfield variable names.
    3. The second Print statement writes the second line to the Settings.csv: it writes the values of the Greenfield variables in, separated by commas.
    4. The Close statement closes the Settings.csv file.
    5. For other Excel Apps you will likely not need this exact same code unless the App also runs Greenfield. It is a good example however of how settings can be written into a .csv file using the Open, Print, and Close statements.
  3. Here the Export_CSV_File Sub procedure is called 3 times to write the Customers, Facilities and Suppliers worksheets into .csv files. We will use the first call, for Customers, as an example and go through the arguments of this Sub:
    1. 1st argument – filePath: the file path on the local machine where the csv file will be created. Here set to the variable filepath, which is where the Excel App is located.
    2. 2nd argument – writeFileName: the name of the local file that will be created or updated. Here Customers.csv will be created.
    3. 3rd argument – sheetName: the name of the worksheet that contains the data. Here the data on the worksheet named Customers will be used.
    4. 4th argument – firstRow: name of the column in which the dynamic row range should be selected. Here set to column A, which is the Customer Name column.
    5. 5th argument – colRange: defines the column range where the row value for the last column will be dynamic. Here set to A1:D, meaning the data from columns A through D will be used and the Export_CSV_File Sub will dynamically determine what the last populated row is on the Customers worksheet and use that as the last row to include in the export.
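To make the above more concrete, here is a hedged sketch of these three pieces: trimming the Resource Size down to its short code, writing the settings to Settings.csv (only two of the settings are shown; the real Macro writes them all), and exporting the Customers worksheet. Variable names follow the descriptions above, but this is not the literal Macro code:

```vba
' 1. Keep only the part of the Resource Size before the first space, e.g. "4XS (1 Core 1 Gb RAM)" -> "4XS"
Dim ResourceSize As String, firstSpaceIndex As Integer
ResourceSize = Trim(Worksheets("Settings").Range("C13").Value)
firstSpaceIndex = InStr(ResourceSize, " ")
ResourceSize = Left(ResourceSize, firstSpaceIndex - 1)

' 2. Write the Greenfield settings into Settings.csv: a header row with the variable names, then one row of values
'    (MaxNumberOfNewFacilities and OnlyUseFacilitiesAsCandidateLocations were set earlier, see the previous sketch)
Open filePath & "Settings.csv" For Output As #1
Print #1, "MaxNumberOfNewFacilities,OnlyUseFacilitiesAsCandidateLocations"              ' plus the remaining setting names
Print #1, MaxNumberOfNewFacilities & "," & OnlyUseFacilitiesAsCandidateLocations        ' plus the remaining setting values
Close #1

' 3. Export the Customers worksheet (columns A through D, number of rows determined dynamically) to Customers.csv
Call Export_CSV_File(filePath, "Customers.csv", "Customers", "A", "A1:D")
```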

Looking in the Greenfield App on the Customers worksheet we see that this means that the Customer Name (column A), Latitude (column B), Longitude (column C), and Quantity (column D) columns will be exported. The Customers.csv file will contain the column names on the first row, plus 96 rows with data as the last populated row is row 97. Here follows a screenshot showing the Customers worksheet in the Excel App (rows 6-93 hidden) and the first 11 lines in the Customers.csv file that was exported while running the Greenfield App:

Other Cosmic Frog for Excel Applications will often contain data to be exported and uploaded to the Optilogic platform to refresh model data; the Export_CSV_File function can be used in the same way to export similar and other tabular data.

As mentioned in the “Python Job File & requirements.txt” section earlier, a requirements.txt file placed in the same folder as the .job file that contains the Python script is needed so the Python script can run using functionality from the cosmicfrog Python library. The next code snippet checks whether this file already exists in the same location as the Excel App and, if not, creates it there and writes the text cosmicfrog into it.

This code can be used as is by other Excel Apps.
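A sketch of what this check-and-create step might look like; the Dir-based check and the file number are implementation assumptions:

```vba
' Hedged sketch of creating requirements.txt next to the workbook if it is not there yet
If Dir(filePath & "requirements.txt") = "" Then
    Open filePath & "requirements.txt" For Output As #2
    Print #2, "cosmicfrog"   ' the only content the file needs
    Close #2
End If
```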

The next step is to upload all the files needed to the Optilogic platform:

  1. A variable named appKeyFileName is created and set to app.key. This code can be kept as is in other Excel Apps.
  2. Cells C5 and C6 on the Run Greenfield worksheet are updated to a different message and the current time. This code can be kept as is in other Excel Apps, just the worksheet name and cells need to be updated if they are different. Also, additional status updates can be set up in a similar way (copy-paste and update worksheet name, cell references and message as required) if the App has more steps it is going through.
  3. The Upload_File_To_Optilogic Sub procedure is called multiple times to upload all required files to the Optilogic platform: the app.key file containing the App Key for authentication on the Optilogic platform, the 4 exported .csv files containing the Settings, Customers, Facilities and Suppliers data, the greenfield.job file which contains the Python script to be run on the Optilogic platform, and the requirements.txt file to ensure the Python script can use the cosmicfrog Python library to run properly. Take the line that uploads the greenfield.job file as an example to discuss the arguments:
    1. 1st argument – localFilePath: the file path on the local machine where the local file to be uploaded is located. Here set to the variable filePath, which is where the Excel App and all the needed files to be uploaded are located.
    2. 2nd argument – localFileName: the name of the local file that will be uploaded. Here greenfield.job will be uploaded.
    3. 3rd argument – platformFilePath: file path on the Optilogic platform where the file will be uploaded to. If you want to know the path of an existing folder on the Optilogic platform, you can get this from Explorer: right click on the folder, select Copy > Folder Path (see also the screenshot in the Optilogic.bas VBA Module earlier in this documentation). Here it is set to the variable platformFolderName, which was set to My Files/z Working Folder for Excel Greenfield App earlier in the Macro.
    4. 4th argument – platformFileName: name of the file once it is uploaded to the platform workspace. Name can be the same as or different from the local file name. For the other files, the names are kept the same locally and on the Optilogic platform, but for the Python script it is changed to greenfield_job.py. The file needs to have the .py extension to be recognized and executed as a Python script file. It has a .job extension on the local machine to prevent accidental changes made to the Python code in the file.
    5. 5th argument – appKey: the user’s App Key for authentication. This has been set earlier in the Macro by the Manage_App_key function.

Besides updating the local/platform file names and paths as appropriate, the Upload_File_To_Optilogic Sub procedure will be used by most if not all Excel Apps: even if the App is only looking at outputs from model runs and not modifying any input data or settings, the function is still required to upload the .job, app.key, and requirements.txt files.
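For reference, a hedged sketch of two of these upload calls, following the argument values described above (the remaining files are uploaded with the same pattern; this is not the literal Macro code):

```vba
' Hedged sketch - the actual Macro uploads app.key, the 4 .csv files, greenfield.job and requirements.txt this way
Call Upload_File_To_Optilogic(filePath, "Customers.csv", platformFolderName, "Customers.csv", appKey)
Call Upload_File_To_Optilogic(filePath, "greenfield.job", platformFolderName, "greenfield_job.py", appKey)
```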

The next bit of code uses 2 more of the Optilogic.bas VBA module functions to run and monitor the Python job on the Optilogic platform:

  1. The variable jobKey is set to the result of the Run_Job_On_Optilogic function, which runs the Greenfield Python script on the Optilogic platform. The arguments of the Run_Job_On_Optilogic function are set as follows:
    1. 1st argument – jobPath: file path in the platform workspace where the Job (.py file) resides. Set to the variable platformFolderName here, which has been set to My Files/z Working Folder for Excel Greenfield App earlier in the Macro.
    2. 2nd argument – jobname: name of the Job (.py file) in the platform workspace. Set to greenfield_job.py, which is the name of the Python script as it was uploaded to the Optilogic platform.
    3. 3rd argument – appKey: the user’s App Key for authentication. This has been set earlier in the Macro by the Manage_App_key function.
  2. These 2 lines of code set usrMessage to the result of monitoring the job on the Optilogic platform with the Monitor_Job_On_Optilogic function and then display this in cell C5 on the Run Greenfield worksheet. The arguments of the Monitor_Job_On_Optilogic function are set as follows:
    1. 1st argument – jobKey: the unique Job Key identification returned from the Run_Job_On_Optilogic function. Set here to the jobKey variable that was set by the call to Run_Job_On_Optilogic above.
    2. 2nd argument – worksheet: name of the worksheet where the status information is being updated. Here, this is on the Run Greenfield worksheet.
    3. 3rd argument – interval: the time waited between getting an update on the Job status from the platform in seconds. Here it is set to 5, so the status will be updated every 5 seconds.
    4. 4th argument – usrMsg: a free form message that will be shown to the user during status updates. Here it is set to “ is processing…”.
    5. 5th argument – msgCell: the cell reference in the worksheet where the Job status message is shown. Here it is set to cell C5. The Monitor_Job_On_Optilogic function then concatenates the unique jobKey and the usrMsg (“ is processing…”) and displays this in this cell while greenfield_job.py is running on the Optilogic platform.
    6. 6th argument – statusCell: the cell reference in the worksheet where the Job status is updated (like “running” or “done”). Here it is set to cell C7.
    7. 7th argument – timeCell: the cell reference in the worksheet where the time of the last update is updated. Here it is set to cell C6.
    8. 8th argument – appKey: the user’s App Key for authentication. This has been set earlier in the Macro by the Manage_App_key function.

This piece of code can stay as is for most Apps, just make sure to update the following if needed:

  • The jobname, 2nd argument of the Run_Job_On_Optilogic function (greenfield_job.py here).
  • The worksheet, 2nd argument of the Monitor_Job_On_Optilogic function (Run Greenfield here).
  • The msgCell, statusCell, and timeCell, 5th, 6th, and 7th arguments of the Monitor_Job_On_Optilogic function in case these are moved to a different location.
  • The worksheet and cell reference in the second line of the code block indicated with 2 in the above screenshot (Run Greenfield and C5 here).

The last piece of code before some error handling downloads the results (2 .csv files) from the Optilogic platform using the Download_File_From_Optilogic function from the Optilogic.bas VBA module:

  1. First the progress message and current time are updated in cells C5 and C6 on the Run Greenfield worksheet to indicate that data is now being downloaded from the Optilogic platform.
  2. Then the 2 files are downloaded:
    1. Variables to set the names of the 2 files to be downloaded are created.
    2. These 2 file name variables are then set to a concatenation of “Results”, the ScenarioName (set earlier in the Macro on the Settings worksheet as “File name for results”) and a descriptive name of the file (Facility Summary.csv and Flow Summary.csv which are the outputs from the 2 Cosmic Frog Greenfield output tables of similar name). Note that if the “File name for results” is not changed in between runs of the App, the results will be overwritten.
    3. Uses the Download_File_From_Optilogic function 2 times to download the files. Going through the arguments of the function when it is called first:
      1. 1st argument – platformFilePath: file path on the Optilogic platform where the file will be downloaded from. If you want to know the path of an existing folder on the Optilogic platform, you can get this from Explorer: right click on the folder, select Copy > Folder Path, see also screenshot in the “Optilogic.bas VBA Module” section. Here it is set to the variable platformFolderName, which was set to My Files/z Working Folder for Excel Greenfield App earlier in the Macro.
      2. 2nd argument – platformFileName: name of the data file in the platform workspace. Here set to “results_FacilitySummary.csv”, which was created and named by the greenfield_job.py Job that was run; we will discuss this in detail in the next section.
      3. 3rd argument – localFilePath: file path on the local machine where the file will be downloaded to. Here it is set to the variable filePath which is the location of the workbook.
      4. 4th argument – localFileName: name of the local data file when downloaded. Can be different than the name of the file on the Optilogic platform. Here it is set to the file name variables that were created and set in the first 2 sub-bullets above.
      5. 5th argument – appKey: the user’s App Key for authentication. This has been set earlier in the Macro by the Manage_App_key function.
  3. Next, the progress message and time in cells C5 and C6 on the Run Greenfield worksheet are updated again to indicate that the Macro has completed successfully.
  4. Lastly, a message box will now pop up that has the text “Completed successfully, open <name of output file 1> and <name of output file 2> to review results”.

This piece of code can be used as is with the appropriate updates for worksheet names, cell references, file names, path names, and text of status updates and user messages. Depending on the number of files to be downloaded, the part of the code setting the names of the output files and doing the actual download (bullet 2 above) can be copy-pasted and updated as needed.

The last piece of VBA code of the Macro shown in the screenshot below has some error handling. Specifically, when the Macro tries to retrieve the local path of the Macro-enabled .xlsm workbook and finds that the workbook appears to be stored online, an error will pop up and the user will be asked to put the file path name in cell B3 on the Admin worksheet. If the Macro hits any other errors, a message saying “An unexpected error occurred: <error number> <error description>” will pop up. This piece of code can be left as is for other Cosmic Frog for Excel Applications.

In the above, we have used version 3 of the Greenfield App, which is part of the Building a Cosmic Frog for Excel Application resource. There is also a newer stand-alone version (v6) of the Cosmic Frog for Excel – Greenfield application available in the Resource Library. In addition to all of the above, this App also:

  • Prevents locking of the workbook while the Macro is running; in version 3 Excel becomes unresponsive when running the Macro until it completes entirely.
  • Reads outputs back into the workbook.
  • Shows outputs on a map in a browser, which users can open from within the workbook.

This functionality is likely helpful for a lot of other Cosmic Frog for Excel Apps and will be discussed in section “Additional Common App Functionality” further below. We especially recommend using the functionality to prevent Excel from locking up in all your Apps.

9. greenfield.job

Now we will go through, in detail, the greenfield.job file that contains the Python script to be run on the Optilogic platform.

This first piece of code takes care of importing several Python libraries and modules (optilogic, pandas, time; lines 1, 2, and 5). There is another library, cosmicfrog, that is made available through the requirements.txt file, which has been discussed before in the section titled “Python Job File & requirements.txt”. Modules from these libraries are imported here as well (FrogModel from cosmicfrog on line 3 and pioneer.API from optilogic on line 4). Now the functionality of these libraries and their modules can be used throughout the code of the script that follows. The optilogic and cosmicfrog libraries are developed by Optilogic and contain specific functionality to work with Cosmic Frog models and the Optilogic platform (e.g. the functions discussed in the section titled “Working with Python Locally” above).

For reference:

  • Pandas is an open source Python library providing high-performance, easy-to-use data structures and data analysis tools for Python. Pandas documentation can be found here; however, users do not need to review this to follow along with this script.
  • Time is a Python module providing various time-related functions; its documentation can be found here. Again, users do not need to review this documentation to be able to follow along with this script.

This first piece of code can be left as is in the script files (.job files locally, .py files on the Optilogic platform) for most Cosmic Frog for Excel Applications. More advanced users may import different libraries and modules to use functionality beyond what the standard Python functionality plus the optilogic, cosmicfrog, pandas, and time libraries & modules together offer.
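
For reference, a minimal sketch of what such an import block can look like (approximate; the exact module paths, for example for the optilogic pioneer API, may differ slightly from the actual script):

  import optilogic
  import pandas as pd
  import time

  from cosmicfrog import FrogModel   # Cosmic Frog model helper functions
  from optilogic import pioneer      # Optilogic platform API used to run and monitor jobs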

Next, a check_job_status function is defined that will keep checking a job until it is completed. This will be used when running a job to know if the job is done and ready to move onto the next step, which will often be downloading the results of the run. This piece of code can be kept as is for other Cosmic Frog for Excel Applications.

The following screenshot shows the next snippet of code that defines a function called wait_for_jobs_to_complete. It uses the check_job_status function to periodically check if the job is done, and once it is, moves on to the next piece of code. Again, this can be kept as is for other Apps.
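
For illustration only, the general shape of such a polling helper is sketched below. This is not the script’s actual code: get_job_status stands in for whichever Optilogic API status call the script uses, and the terminal status names are assumptions.

  import time

  def wait_for_job_to_complete(get_job_status, job_key, interval=10):
      # Keep polling the job status until the platform reports a terminal state
      while True:
          status = get_job_status(job_key)               # placeholder for the real status call
          if status in ("done", "error", "cancelled"):   # assumed terminal state names
              return status
          time.sleep(interval)                           # wait before checking again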

Now it is time to create and/or connect to the Cosmic Frog model we want to use in our App:

  1. A new function named create_model is defined. Its arguments are 1) key to pass an App Key to the function and 2) model_name, the name of the model we are creating and/or connecting to. This function can be kept as is for other Apps.
  2. These 2 lines of code open the app.key file that was uploaded to the Optilogic platform, which contains the App Key needed for user authentication on the platform. A variable named yourappkey is created and set to the App Key read from the app.key file. Leave this code as is too.
  3. A model_name variable is created and set to Greenfield App. For other Apps:
    1. If you are creating a new model and connecting to it, update this to your chosen model name.
    2. If you are connecting to an already existing Cosmic Frog model, use its name here as the model name. Make sure to use the exact same name (e.g. check spaces, underscores, and capitalization).
  4. This piece of code tries to first create a model named Greenfield App and set the variable model to refer to this model (a). If it cannot be created because it already exists, we connect to it, again set the variable model to refer to it, and clear out several tables (b). For other Apps, leave the first part (a) as is and as needed update part b:
    1. Use the clear_table function, as is done here for the customers, customerdemand, facilities, suppliers, and greenfieldsettings tables, for any tables that you do not want to have any data in at the start of your script. Depending on your App, you may not need to clear any tables, or you may need to clear 1 or multiple tables.
    2. It is possible to update records in tables that already have data in them or to append additional data to them. In that case, clearing tables may not be necessary, or you may need to clear some tables while keeping the data that is already there in others.
    3. You can partially clear tables by for example using the exec_sql cosmicfrog function. If you want to remove all records from the Facilities table that have Status = Exclude, you can do so with a single SQL DELETE statement (see the sketch just after this list).
  5. A variable called model_scenario_name is set to Baseline, which will be used to set which scenario is run when running the Cosmic Frog model. Note that here the Baseline is always run, but the user can set any name they desire as the “File name for results” on the Settings worksheet in the Excel workbook: these 2 can be different. On the Cosmic Frog model side on the Optilogic platform, the scenario named Baseline is run every time the Greenfield App runs. On the Excel App side, when the results are downloaded, the “File name for results” is used in the names of the output files, so the user knows which run they belong to. If this “File name for results” input is kept the same between runs, then outputs that are downloaded to the local machine will be overwritten too.
  6. As this is an App where we want to run a Greenfield analysis, the variable named engine is set to triad; this will be used when kicking off a Cosmic Frog model run to ensure the correct engine is used. As a reminder, the names of the Cosmic Frog technologies are:
    1. NEO: network optimization
    2. THROG: simulation
    3. TRIAD: Greenfield
    4. HOPPER: transportation optimization
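
As an illustration of clearing tables completely versus partially (bullet 4 above), a minimal sketch is shown below. It assumes the FrogModel clear_table and exec_sql methods take a table name and a SQL statement, respectively, and that model refers to the FrogModel connection created earlier in the script; verify table and column names against your own model.

  # Remove all rows from a table before repopulating it
  model.clear_table("facilities")

  # Partially clear a table with a SQL statement, e.g. remove all facilities
  # whose Status is set to Exclude
  model.exec_sql("DELETE FROM facilities WHERE status = 'Exclude'")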

Note that like the VBA code in the Excel Macro, we can add comments describing what the code is doing to our Python script too. In Python, comments need to start with the number (hash) sign #; comments are automatically colored green in the editor that is being used here (Visual Studio Code using the default Dark Modern color theme).

After clearing the tables, we will now populate them with the data from the Excel workbook. First, the uploaded Customers.csv file, which contains the columns Customer Name, Latitude, Longitude, and Quantity, is used to update both the Customers and the CustomerDemand tables (a condensed sketch of the code follows after the list below):

  1. First Customers.csv is used to populate the Customers table of the Greenfield App model:
    1. A variable df_Customers is created, and the Pandas read_csv function is used to read in the uploaded Customers.csv (line 68). df in the naming of the variable is short for dataframe which is what the read_csv function creates.
    2. The Pandas rename function is used to change the Customer Name column name to customername (line 69); this is to match the column name in the Customers table in the Cosmic Frog model. The inplace argument in this function is set to True, which means that the dataframe is modified in place; no new dataframe with this change is created.
    3. The Pandas drop function is used to remove the Quantity column from the dataframe (line 70). The axis argument is set to 1 which means that the drop refers to columns and not to indices of rows. Again inplace=True is used to modify the dataframe rather than creating a new dataframe.
    4. The dataframe is then imported into the model’s Customers table using the cosmicfrog write_table function (line 71). The first argument specifies the target table in the model (Customers) and the second argument specifies the data that is to be used, which is the df_Customers dataframe that was created and modified in the previous 3 bullets.
  2. In a similar fashion, the Customers.csv file is read into a dataframe named df_CustomerDemand (line 74), then modified to rename the Customer Name column to customername (line 75) and to drop the Latitude and Longitude columns (lines 76 and 77), before being written into the Greenfield App model’s CustomerDemand table (line 78).
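
Putting the bullets above together, a condensed sketch of the pandas and cosmicfrog calls could look as follows (an approximation of the actual script: the original line numbers are omitted, and model refers to the FrogModel connection created earlier):

  import pandas as pd

  # Customers table: read the uploaded CSV, align the column name, drop Quantity
  df_Customers = pd.read_csv("Customers.csv")
  df_Customers.rename(columns={"Customer Name": "customername"}, inplace=True)
  df_Customers.drop("Quantity", axis=1, inplace=True)
  model.write_table("Customers", df_Customers)

  # CustomerDemand table: same CSV, but keep Quantity and drop the coordinates
  df_CustomerDemand = pd.read_csv("Customers.csv")
  df_CustomerDemand.rename(columns={"Customer Name": "customername"}, inplace=True)
  df_CustomerDemand.drop("Latitude", axis=1, inplace=True)
  df_CustomerDemand.drop("Longitude", axis=1, inplace=True)
  model.write_table("CustomerDemand", df_CustomerDemand)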

How much of the above code you can use as is depends on the App that you are building, but the concepts of reading CSV files, renaming and dropping columns as needed, and writing tables into the Cosmic Frog model will be used frequently. The following piece of code writes the Facilities and Suppliers data into the Cosmic Frog tables in a similar way. Again, the concepts used here will be useful for other Apps too; it may just not be exactly the same depending on the App and the tables that are being written to (a sketch of the cleanup steps follows after this list):

  1. Here the same functions read_csv, rename, and write_table as for the Customers and CustomerDemand tables are used to read the Facilities.csv file and write it into the Facilities table of the Cosmic Frog model. In addition, the following functions are used too:
    1. The pandas fillna function is used on line 82 to replace any N/A or NaN values in the df_Facilities dataframe with blanks (‘’).
    2. The pandas astype function is used to convert all values in the df_Facilities dataframe to strings (str), line 87.
    3. The pandas str.replace function is used multiple times (lines 89-92) to remove commas and spaces from the values of the fixedoperatingcost and throughputcapacity columns (they are replaced by blanks, which is the equivalent of removing them here). This is needed because the Excel Macro wrote, for example, “ 4,500,000 ” for the fixed operating cost into the .csv file that was uploaded to the Optilogic platform, and the throughput capacity entries similarly contain commas and leading and trailing spaces. The Cosmic Frog model will not run properly with data that has commas and leading and/or trailing spaces in it, therefore cleaning this up in the Python script is best practice and applies to all data that comes through from the Excel Macro with commas and spaces in it.
  2. All the same functions as used above for Customers, CustomerDemand, and Facilities are used here to read the Suppliers.csv file in and populate the Suppliers table in the Cosmic Frog model.
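
A sketch of the cleanup steps described in bullet 1 (approximate; the column renames are omitted for brevity, and the fixedoperatingcost and throughputcapacity column names follow the description above):

  import pandas as pd

  df_Facilities = pd.read_csv("Facilities.csv")
  # (column renames, done as for Customers above, are omitted here for brevity)
  df_Facilities.fillna("", inplace=True)       # replace N/A / NaN values with blanks
  df_Facilities = df_Facilities.astype(str)    # convert all values to strings
  # Strip commas and spaces so values like " 4,500,000 " load correctly
  for col in ["fixedoperatingcost", "throughputcapacity"]:
      df_Facilities[col] = df_Facilities[col].str.replace(",", "")
      df_Facilities[col] = df_Facilities[col].str.replace(" ", "")
  model.write_table("Facilities", df_Facilities)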

Next up, the Settings.csv file is used to populate the Greenfield Settings table in Cosmic Frog and to set 2 variables for resource size and scenario name (a sketch follows after this list):

  1. First the Settings.csv file is read in (line 107) and the fillna function is used (line 108) to replace any N/A or NaN values with blank values (‘’).
  2. On lines 110 and 111, 2 new variables, resource_size and app_scenario_name, are created and set to the values of the ResourceSize and ScenarioName columns of the df_GreenfieldSettings dataframe. The pandas iloc function is used to get the values of the first row of data (the [0] index) of these 2 columns.
  3. On lines 112 and 113, the ResourceSize and ScenarioName columns are dropped from the dataframe as they are not part of the Greenfield Settings table in Cosmic Frog; the variables set on the previous 2 lines will be used when kicking off the model run in the next piece of code.
  4. The pandas fillna and astype functions and the cosmicfrog write_table function are used to finalize the df_GreenfieldSettings dataframe, and to write the data of it into the Greenfield Settings table of the Greenfield App Cosmic Frog model.
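
A sketch of the settings handling described above (approximate; the ResourceSize and ScenarioName column names follow the description, and the target table name greenfieldsettings is written as it appears earlier in this documentation):

  import pandas as pd

  df_GreenfieldSettings = pd.read_csv("Settings.csv")
  df_GreenfieldSettings.fillna("", inplace=True)

  # Pull the run settings out of the first (and only) data row
  resource_size = df_GreenfieldSettings["ResourceSize"].iloc[0]
  app_scenario_name = df_GreenfieldSettings["ScenarioName"].iloc[0]

  # These 2 columns are not part of the Greenfield Settings table, so drop them
  df_GreenfieldSettings.drop("ResourceSize", axis=1, inplace=True)
  df_GreenfieldSettings.drop("ScenarioName", axis=1, inplace=True)

  df_GreenfieldSettings = df_GreenfieldSettings.astype(str)
  model.write_table("greenfieldsettings", df_GreenfieldSettings)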

Now that the Greenfield App Cosmic Frog model is populated with all the data needed, it is time to kick off the model and run a Greenfield analysis:

  1. First, on line 120, we set a variable called api and connect to the Optilogic API that runs models, using our App Key that was defined earlier in the code to authenticate. This code can be kept as is in other Cosmic Frog for Excel Applications.
  2. Next, the model run is kicked off, lines 121-125. For the arguments needed to run the model, several variables that have been set earlier in the code are being used:
    1. model_name, model_scenario_name, engine, and resource_size have been set to Greenfield App, Baseline, triad, and L respectively in code discussed further above.
    2. Optionally, users can give their runs a tag, which will be shown for the job in the Run Manager when a job is running on the Optilogic platform. Here the tag “Greenfield App” is used.
  3. This piece of code uses the wait_for_jobs_to_complete function which was defined at the beginning of our Python script to regularly check if the model is done running and will only move onto the next bit of code if the job has indeed completed.

Besides updating any tags as desired (bullet 2b above), this code can be kept exactly as is for other Excel Apps.

Lastly, once the model is done running, the results are retrieved from the model and written into .csv files, which will then be downloaded by the Excel Macro (a sketch follows after this list):

  1. The cosmicfrog read_table function is used to populate 2 dataframes, df_FacilitySummary and df_FlowSummary, with the data from 2 of the Cosmic Frog Greenfield output tables: the optimizationgreenfieldfacilitysummary and the optimizationgreenfieldflowsummary, respectively.
  2. The app_scenario_name is used to update the scenarioname columns of both output tables. This was set by the user as “File name for results” in the Excel workbook (on the Settings worksheet).
  3. The pandas to_csv function is used to write the 2 dataframes with results into 2 .csv files. The index argument is set to False so the row indices are not written out (i.e. row numbers are not written into the .csv files).
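
A sketch of this last step (approximate; the first output file name follows the one mentioned earlier in this documentation, the second is assumed to follow the same pattern):

  # Read the Greenfield output tables back out of the model
  df_FacilitySummary = model.read_table("optimizationgreenfieldfacilitysummary")
  df_FlowSummary = model.read_table("optimizationgreenfieldflowsummary")

  # Stamp the user's "File name for results" into the scenarioname column
  df_FacilitySummary["scenarioname"] = app_scenario_name
  df_FlowSummary["scenarioname"] = app_scenario_name

  # Write the results to CSV without the dataframe row indices
  df_FacilitySummary.to_csv("results_FacilitySummary.csv", index=False)
  df_FlowSummary.to_csv("results_FlowSummary.csv", index=False)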

10. Run Manager to Monitor Jobs

When the greenfield_job.py file starts running on the Optilogic platform, we can monitor and see the progress of the job in the Run Manager App:

  1. Go to the Run Manager app on the Optilogic platform at cosmicfrog.com.
  2. If you do not see the Run Manager app, then click on the icon with the 3 dots to see all apps, now the Run Manager should be visible too.
  3. When running a Job file (.py extension on the Optilogic platform) that itself kicks off a Cosmic Frog model run (here for a Greenfield analysis), you will see 3 entries related to it in the list of Jobs. The state, execution file, tags, start date & time, etc. are listed here.
  4. On the right-hand side you can view details of the Job through a set of logs by clicking on one of the 6 icons at the right top. These show, for the icons from left to right:
    1. Job Info (shown in the screenshot here)
    2. Job Records
    3. Job Usage
    4. Job Log
    5. Job Error Log (if applicable)
    6. Optimality Gap (if applicable)
  5. Here, the Job Info is shown.
  6. If a Job for some reason is not finishing within the amount of time expected, a user can manually cancel it by clicking on the Cancel Job button. Jobs that have been run before can also be re-run here by clicking on the Run Job button.

11. Additional Common App Functionality

The Greenfield App (version 3) that is part of the Building a Cosmic Frog for Excel Application resource covers a lot of common features users will want to use in their own Apps. In this section we will discuss some additional functionality users may also wish to add to their own Apps. This includes:

  • Ability to cancel runs from the Excel workbook and prevent Excel from locking up while Macros are running.
  • Reading output data back into a worksheet within the App Excel workbook.
  • Visualizing outputs on maps: using Excel 3D Maps, an Arc GIS Excel add-in, or the Python library folium.

a. Prevent Locking of Excel & Ability to Cancel an App Run

A newer version of the Greenfield App (version 6) can be found here in the Resource Library. This App has all the functionality version 3 has, plus: 1) it has an updated look with some worksheets renamed and some items moved around, 2) has the option to cancel a Run after it has been kicked off and has not completed yet, 3) it prevents locking up of Excel while the App is running, 4) reads a few CSV output files back into worksheets in the same workbook, and 5) uses a Python library called folium to create Maps that a user can open from the Excel workbook, which will then open the map in the user’s default browser. Please download this newer Greenfield App if you want to follow along with the screenshots in this section. First, we will cover how a user can prevent locking of Excel during a run and how to add a cancel button which can stop a run that has not yet completed.

The screenshots call out what is different as compared to version 3 of the App discussed above. VBA code that is the same is not covered here. The first screenshot is of the beginning of the RunGreenfield_Click Macro that runs when the user hits the Run Greenfield button in the App:

  1. These lines of code are added to disable the Run Greenfield button once it has been clicked, to prevent a user from kicking off multiple runs at the same time. DoEvents on the second line here ensures that other applications (including other Excel workbooks) remain responsive when the user interacts with them, so that the App run does not take up all the machine’s resources.
  2. In this piece of code, if the App run ends here with a message to the user that they need to select a value for a certain setting before the Greenfield App can be run, then the Run Greenfield button is made active (enabled) again, so the user can click on it again once they have fixed the input issue. DoEvents is again used here to make sure the resources of the local machine are also available for other applications. This pattern of adding these 2 lines of code is repeated in all other cases in the code where the App can end with a message to the user about fixing something in the inputs/App Key. Users are encouraged to adopt this strategy in their own Apps too.

The next screenshot shows the addition of code to enable the Cancel button once the Job has been uploaded to the Optilogic platform:

  1. The button that has the CancelRun Sub assigned to it (which will be discussed below) is now enabled, so the user can click on it if they need to cancel a run that has already started on the Optilogic platform.
  2. If the Job fails on the Optilogic platform, a user message “Job failed to run” will pop up; 3 lines of code are added here to make the Run Greenfield button active (enabled) again and the Cancel button inactive (disabled) again, and DoEvents is again used to make sure other applications can still take up computer resources too.

If everything completes successfully, a user message pops up, and the same 3 lines of code are added here too to enable the Run Greenfield button, disable the Cancel button, and keep other applications accessible:

Finally, a new Sub procedure CancelRun is added that is assigned to the Cancel button and will be executed when the Cancel button is clicked on:

This code gets the Job Key (unique identifier of the Job) from cell C9 on the Start worksheet and then uses a new function added to the Optilogic.bas VBA module that is named Cancel_Job_On_Optilogic. This function takes 2 arguments: the Job Key to identify the run that needs to be cancelled and the App Key to authenticate the user on the Optilogic platform.

b. Reading CSV Outputs Back into the App Workbook

Version 6 of the Greenfield App reads results from the Facility Summary, Customer Summary, and Flow Summary back into 3 worksheets in the workbook. A new Sub procedure named ImportCSVDataToExistingSheet (which can be found at the bottom of the RunGreenfield Macro code) is used to do this:

The procedure is called 3 times, to import 1 CSV file into 1 worksheet at a time. It takes 3 arguments:

  1. filePath – the path to the folder where the CSV output file to be imported is located. Set to the variable filePath here, which is the path to where the Excel App is saved.
  2. importCSVFileName – the name of the CSV file to be imported, set to “Results – Facility Summary.csv” here when the procedure is first called. This CSV file is created by the greenfield_job.py Python script on the Optilogic platform after the Greenfield run completes; it takes outputs from the Optimization Greenfield Facility Summary table and saves them into the Results – Facility Summary.csv file.
  3. importSheet – the name of the worksheet in the workbook the CSV file is to be imported into. Set to Results – Facility Summary here when the function is first called. Note that this worksheet needs to be in the workbook already, it will not be created by the Macro.

c. Using Maps

We will discuss a few possible options on how to visualize your supply chain and model outputs on maps when using/building Cosmic Frog for Excel Applications.

The following summarizes 3 of the mapping options: their pros, cons, and example use cases:

  • 3D Maps
    – Pros: built into Excel; scaling of markers based on values; tooltip capability.
    – Cons: cannot show flows.
    – Example use case: showing locations on a map after geocoding them.
  • ArcGIS Excel add-in
    – Pros: easy to add to Excel; can show flows; easy filtering; can set transparency of markers/lines based on custom values; label & tooltip (pop-up) capability.
    – Cons: cannot show more than 1 map in 1 workbook; cannot scale markers/lines in size based on values.
    – Example use case: showing and filtering Hopper routes and stops.
  • folium Python library
    – Pros: can show flows; can show multiple maps in multiple browser windows which can be arranged to be viewed simultaneously; label & tooltip capability.
    – Cons: no easy filtering; view in browser, not in Excel.
    – Example use case: Greenfield location and flow outputs.

i. 3D Maps in Excel

There is standard functionality in Excel to create 3D Maps. You can find this on the Insert tab, in the Tours group (next to Charts):

Documentation on how to get started with 3D Maps in Excel can be found here. Should your 3D Maps icon be greyed out in your Excel workbook, then this thread on the Microsoft Community forum may help troubleshoot this.

How to create an Excel 3D Map in a nutshell:

  1. Open a workbook that contains location data.
  2. Click on Insert > 3D Map > Open 3D Maps.
  3. The map may automatically use your data to display on the map, but if not, go to the worksheet with the data, select the range (including any headers) and click on 3D Map > Add Selected Data to 3D Maps.
  4. Configure your Tour(s) and their Layer(s) in the 3D Maps configuration window.

With Excel 3D Maps you can visualize locations on the map and for example base their size on characteristics like demand quantity. You can also create heat maps and show how location data changes over time. Flow maps that show lines between source and destination locations cannot be created with Excel 3D Maps. Refer to the Microsoft documentation to get a deeper understanding of what is possible with Excel 3D Maps.

The Cosmic Frog for Excel – Geocoding App in the Resource Library uses Excel 3D Maps to visualize customer locations that the App has geocoded on a map:

Here, the geocoded customers are shown as purple circles which are sized based on their total demand.

ii. ArcGIS Excel Add-In

A good option for visualizing, for example, Hopper (= transportation optimization) routes on a map is the ArcGIS Excel Add-in. If you do not have the add-in, you can get it from within Excel as follows:

  1. Go to your Developer tab.
  2. Click on Add-ins. If you do not have an Add-ins option here, it may be elsewhere in your Excel ribbon. One way to find it is to go to File > Options (or File > More > Options if you do not see Options under File), then click on Customize Ribbon. Look for Add-ins in the list on the left and if it is there select it and click on the Add >> button. If it is not in the list on the left, then look for it in the list on the right, expanding the names of the tabs, until you locate it. This will tell you under which tab the Add-ins button is located.
  3. In the Office Add-ins window that pops up, go to Store.
  4. Search for ArcGIS in the search box.
  5. Click on Add.

You may be asked to log into your Microsoft account when adding this in and/or when starting to use the Add-in. Should you experience any issues while trying to get the Add-in added to Excel, we recommend closing all Office applications and then opening only the one Excel workbook through which you add the Add-in.

To start using the add-in and create ArcGIS maps in Excel:

  1. In the Excel worksheet with the data to map, click on the ArcGIS tab.
  2. Click on Show Map. You will be prompted to sign into your ArcGIS account which you can do if you have one to access additional functionality, or you can choose to continue as a guest.
  3. Next you are prompted to add a Layer to the map, choose Excel here. You could add ArcGIS layers later too if you want to overlay your data with a specific ArcGIS map.

Excel will automatically select all data in the worksheet that you are on. You can ensure the mapping of the data is correct or otherwise edit it:

  1. We are in the Layers area of the map configuration pane.
  2. Automatically, all data in the sheet we are on is selected for the layer. In the drop-down you can select data from different worksheets.
  3. If you want to manually set the data range to be used on the map, you can do so by clicking on this icon.
  4. There are multiple options for Location types. When using the ArcGIS add-in as a guest, the 2 that are available are 1) Coordinates, which needs latitudes & longitudes of the locations to be mapped, and 2) EsriJSON Geometry, which can draw points, polylines, polygons, and other geometries if the data is in the correct format (we will see an example of this a bit further down). Here, based on the data, the option of Coordinates was automatically selected and the longitude and latitude columns in the data mapped correctly to the Longitude (X) and Latitude (Y) fields.
  5. A spatial reference defines the coordinate system used to locate the geometry for a feature. It controls how and where features are displayed in a map or scene. There are many different types of spatial references for defining geographic data. To simplify accessing and referencing them, they are commonly referred to by a well-known ID (WKID) — an integer value. Two of the most common WKIDs are 4326 (also known as WGS84 or WGS 1984), and 3857 (also known as Web Mercator). Here, in the drop-down list for Spatial reference, select the one you want to use for your map. By default, WGS 1984 will be used. To learn more about spatial references, please see this ArcGIS documentation on it.
  6. Click on Add when you are finished configuring your layer and it will be added to the map. Multiple layers can be added if desired.

After adding a layer, you can further configure it through the other icons at the top of the Layers window:

  1. We are still in the Layers area of the Map configuration pane.
  2. To add another layer click on the Add button.
  3. When this Layers icon is selected, the list of Layers that have been added to the Map are shown.
  4. Currently there is 1 Layer added to the Map; click on the caret sign to expand the Layer.
  5. To remove a Layer from the Map, click on the recycle bin icon.
  6. To further configure a Map Layer, use these 3 icons at the top of the Layers pane:
    1. Under the Symbology icon, Layers can be styled by selecting 1 or multiple fields from the mapped data, and Symbol type can be set to either Location or Heat map.
    2. Under the Style Options icon, symbols can be styled by setting shapes, sizes, colors, and outlines. Under Vary by attribute, transparency and/or rotation by attribute can be enabled and configured.
    3. Under the Layer Properties icon, pop-ups, visible range, labels, and transparency can be selected and configured.

The other configuration options for the Map are found on the left-hand side of the Map configuration pane:

  1. Clicking on this icon with the 3 horizontal lines will collapse or expand the Map configuration pane.
  2. Layers can be added and configured as described through the previous screenshot through the Layers area of the Map configuration pane.
  3. A Basemap on which the selected data will be shown can be selected in this area of the Map configuration pane.
  4. The selection tool has different options for selecting one or multiple locations on the Map.
  5. One can find an address or place by using the Search option.
  6. Under the Analysis icon, several ways to analyze data, like for example distance or area measurements, are listed; however, not all services are available to those using the ArcGIS add-in as a guest.
  7. Under Navigation tools, the Map extent (the region that is zoomed into) can be configured to be set to a certain home extent and/or to be locked. Bookmarks for certain zoom levels on certain parts of the world can be saved here too.
  8. Sharing and Export options are not available to those using the ArcGIS add-in as a guest, but those with accounts have the option to do so here.
  9. Under Settings, you can sign in to your ArcGIS account, provide feedback, and change Settings around sending usage data to Esri and Microsoft account integration.

As an example, consider the following map showing the stops on routes created by the Hopper engine (Cosmic Frog’s transportation optimization technology). The data in this worksheet is from the Transportation Stop Summary Hopper output table:

  1. We have filtered the data to routename = 3 to visualize just the stops of 1 route, and their order, on the map.
  2. In the Map configuration pane, under Layers > Style Options, green squares with a black outline were chosen as the shapes to represent the stop locations on the route.
  3. In the Map configuration pane, under Layers > Layer Properties, Pop-ups and Labels have been configured:
    1. Under Pop-ups, the switch to Enable pop-ups was turned on. In the Layer fields to include drop-down, all fields in the worksheet are selected, so when the user clicks on a location all the fields are shown in the pop-up. In the screenshot you can see this for the location labelled 2.
    2. Under Labels, the switch to Enable labels was turned on. Stopid is selected in the Label field and font, font size, color, and placement of the label are configured here too.

As a next step we can add another layer to the map based on the Transportation Segment Summary Hopper output table to connect the source-destination pairs with each other using flow lines. For this we need to use the EsriJSON Geometry Location type mentioned earlier. An example Excel file containing the format needed for drawing polylines can be found in the last answer of this thread on the Esri community website: https://community.esri.com/t5/arcgis-for-office-questions/json-formatting-in-arcgis-for-excel/td-p/1130208, on the PolylinesExample1 worksheet. From this Excel file we can see the format needed to draw a line connecting 2 locations:

{"paths": [[<point1_longitude>,<point1_latitude>],[<point2_longitude>,<point2_latitude>]],"spatialReference": {"wkid": 4326}}

Where wkid indicates the well-known ID of the spatial reference to be used on the map (see above for a brief explanation and a link to a more elaborate explanation of spatial references). Here it is set to 4326, which is WGS 1984.

The next 2 screenshots show the data from a Segments Summary and a layer added to the map to show the lines between the stops on the route:


  1. The Transportation Segment Summary Hopper output table contains the latitudes and longitudes for the origin and destination of the route segment.
  2. In the ESRI JSON Polyline column (column I) we create the format needed to draw lines on the map through concatenation of multiple text strings and the origin and destination latitudes and longitudes (char(34) is the code to create double quotes). Note that creating this Esri JSON specific format so lines can be drawn on the map can be done as in this example, in a column in an Excel worksheet, but it can also be done through VBA code when bringing downloaded results back into Excel, or (preferably) as part of the Python script that runs on the Optilogic platform, which adds this to the outputs when the model has finished running and the outputs are extracted from the model (a sketch of this is shown further below).
  3. On the map we then add a new Layer using the data on this worksheet. For Location types we now choose EsriJSON Geometry and use the ESRI JSON Polyline column as the Geometry column (this is usually automatically selected already). With both layers visible (the layer with the stops from the Stops worksheet and the layer with the segments from the Segments worksheet), the map now looks like this:

Note that for Hopper outputs with multiple routes, we now need to filter both the worksheet with the Stops information and the worksheet with the Segments information for the same route(s) to synchronize them. A better solution is to bring the stopID and Delivered Quantity information from the Stops output into the Segments output, so we only have 1 worksheet with all the information needed and both layers are generated from the same data. Then filtering this set of data will update both layers simultaneously.
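
If you prefer to build this Esri JSON column in Python rather than with an Excel formula, a sketch could look like the following. It assumes df_Segments is a pandas dataframe already holding the Transportation Segment Summary data, and the origin/destination coordinate column names are placeholders for illustration.

  def esri_polyline(row):
      # Build the EsriJSON geometry string for one origin-destination segment
      return (
          '{"paths": [[%s,%s],[%s,%s]],"spatialReference": {"wkid": 4326}}'
          % (row["originlongitude"], row["originlatitude"],
             row["destinationlongitude"], row["destinationlatitude"])
      )

  df_Segments["ESRI JSON Polyline"] = df_Segments.apply(esri_polyline, axis=1)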

iii. folium Python library

Here, we will discuss a Python library called folium, which gives users the ability to create maps that can show flows and tooltips, with options to customize/auto-size location shapes and flow lines. We will use the example of the Cosmic Frog for Excel – Greenfield App (version 6) again, where maps are created as .html files as part of the greenfield_job.py Python script that runs on the Optilogic platform. They are then downloaded as part of the results and, from within Excel, users can click on buttons to show flows or customers, which opens the .html files in the user’s default browser. We will focus on the differences with version 3 of the Greenfield App that are related to maps and folium. We will discuss the changes/additions to both the VBA code in the Excel Run Greenfield Macro and the additions to greenfield_job.py. First up in the VBA code, we need to add folium to the requirements.txt file so that the Python script can make use of the library once it is uploaded to the Optilogic platform:

To do so, a line to the VBA code is added to write “folium” into requirements.txt.

As part of downloading all the results from the Optilogic platform after the Greenfield run has completed, we need to add downloading the .html map files that were created:

  1. First 3 variables that will be set to the names of the maps to be downloaded from the Optilogic platform are created.
  2. Then these names are set to what the .html files will be called after downloading the map files from the Optilogic platform.
  3. Lastly, the Download_File_From_Optilogic function is called 3 times to download the map files from the Optilogic platform. Note that the names of the files on the Optilogic platform are somewhat different from what we name them after downloading to the user’s local machine.

In this version of the Greenfield app, there is a new Results Summary worksheet that has 3 buttons at the top:

Each of these buttons has a Sub procedure assigned to it, let’s look at the one for the “Show Flows with Customers” button:

  1. This piece of code gets the file path of where the Excel workbook is saved. It is similar to the code in v3 of the Greenfield app that gets the file path, just with a different worksheet name and cell on that worksheet.
  2. Here the map name is set and then opened in the user’s default browser:
    1. The variable mapFileName is created.
    2. mapFileName is set to the path of where the workbook is saved concatenated with the name of the map, which in this case is “Maps – Flows with Customers.html”.
    3. The VBA function FollowHyperlink is used to open the map in a new window in the user’s default browser.

The map that is opened will look something like this, where a tooltip comes up when hovering over a flow line. (How to create and configure the map using folium will be discussed next.)

The additions to the greenfield.job file to make use of folium and creating the 3 maps will now be covered:

First, at the beginning of the script, we need to add “import folium” (line 6), so that the library’s functionality can be used throughout the script. Next, the 3 Greenfield output tables that are used to create the 3 maps are read in, and a few data type changes are made to get the data ready for mapping:

  1. The cosmicfrog function read_table is used to read the Optimization Greenfield Facility Summary table into a dataframe named df_Res_Facilities (line 137).
  2. Next on lines 138 and 139, the pandas function to_numeric is used to convert the data type of the latitude and longitude fields to numeric (from strings).

This is repeated twice, once for the Optimization Greenfield Customer Summary output table and once for the Optimization Greenfield Flow Summary output table.

The next screenshot shows the code where the map to show Facilities is created and their Markers are configured based on whether the facility is an Existing Facility or a Greenfield Facility (a condensed sketch follows after the list below):

    1. A variable “map” is created and set by using the folium function Map (line 152). The location argument of the function sets the latitude & longitude the map will be centered on. Here it is set to the means of the destination latitudes and destination longitudes of the df_Res_Flows dataframe. The zoom_start argument is set to 4 and indicates the initial zoom level for the map.
    2. The pandas function iterrows is used to iterate through each row in the df_Res_Facilities dataframe (line 155):
      1. If the Facility Type is equal to Existing Facility (line 157), then the variable facility_tooltip is set to the name of the facility appended with (existing) (line 158).
      2. The Marker folium function (line 159) is then used to configure what the marker for this location will look like on the map. Its arguments are:
        1. location (line 160): set to the latitude and longitude of the facility.
        2. tooltip (line 161): set to facility_tooltip which was created and set before (line 158).
        3. icon (line 162): uses the folium function Icon to set the marker to a dark blue circle with a prefix of “fa”, which means the icon is taken from a collection of scalable vector icons called Font Awesome. For a nice overview of the types of markers one can use with folium, see the blog post “Beautiful leaflet markers with folium and fontawesome”.
      3. Finally, the folium function add_to is used to add the location that was just configured to the map.
      4. If the facility type is equal to Greenfield Facility (line 165), the location is also added to the map in the same way as Existing Facilities, with the one difference that the color of its marker is set to green instead of dark blue (line 171).
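
A condensed sketch of the pattern described above (an approximation of the script: the variable name map mirrors the script’s, and the dataframe and column names follow the description where given and are otherwise assumptions):

  import folium

  # Center the map on the mean destination coordinates; zoom_start sets the initial zoom level
  map = folium.Map(
      location=[df_Res_Flows["destinationlatitude"].mean(),
                df_Res_Flows["destinationlongitude"].mean()],
      zoom_start=4,
  )

  for _, row in df_Res_Facilities.iterrows():
      if row["facilitytype"] == "Existing Facility":
          facility_tooltip = row["facilityname"] + " (existing)"
          marker_color = "darkblue"
      else:  # Greenfield Facility: same marker, different color
          facility_tooltip = row["facilityname"]
          marker_color = "green"
      folium.Marker(
          location=[row["latitude"], row["longitude"]],
          tooltip=facility_tooltip,
          icon=folium.Icon(color=marker_color, icon="circle", prefix="fa"),
      ).add_to(map)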

In the next bit of code, the df_Res_Flows dataframe is used to draw lines on the map between origin and destination locations (a corresponding sketch follows after the list below):

  1. A function called add_line is defined (lines 177-179). It takes the following arguments:
    1. map – the map to which the lines will be added.
    2. origin – the latitude and longitude of the location from which the flow line will be drawn.
    3. destination – the latitude and longitude of the location to which the flow line will be drawn.
    4. origin_name – the name of the location from which the flow line will be drawn.
    5. destination_name – the name of the location to which the flow will be drawn.
    6. flow_quantity – the size of the flow from origin to destination in terms of quantity.
    7. color – the color of the flow lines, set to blue here.
  2. A flow_tooltip variable is created and set to the concatenation of “<origin_name> to <destination_name> with <flow_quantity>”, where these are all converted to strings (line 178).
  3. The folium function PolyLine is used to draw a line on the map from the origin to the destination, with the tooltip that shows when hovering over the flow line set to the just configured flow_tooltip variable, the color of the flow line set to color, weight to 2.5, and opacity to 1 (line 179).
  4. This is then added to the map by using the folium add_to function (line 179). Note that this is building on the map that was created previously that has the facility locations on it, so the result of using this function will be a map with facilities and flow lines.
  5. The iterrows pandas function is then used again to now iterate through the rows of the df_Res_Flows dataframe and add a flow line to the map (using the add_line function, on line 189) for each row (lines 182-189).
  6. The map is saved under the name greenfield_flows_map.html (line 191).
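
The corresponding sketch for the flow lines, building on the map object from the previous sketch (again approximate; the coordinate and name column names are assumptions):

  def add_line(map, origin, destination, origin_name, destination_name, flow_quantity, color="blue"):
      # Tooltip shown when hovering over the flow line
      flow_tooltip = str(origin_name) + " to " + str(destination_name) + " with " + str(flow_quantity)
      folium.PolyLine(
          [origin, destination],
          tooltip=flow_tooltip,
          color=color,
          weight=2.5,
          opacity=1,
      ).add_to(map)

  for _, row in df_Res_Flows.iterrows():
      add_line(
          map,
          origin=[row["originlatitude"], row["originlongitude"]],
          destination=[row["destinationlatitude"], row["destinationlongitude"]],
          origin_name=row["originname"],
          destination_name=row["destinationname"],
          flow_quantity=row["flowquantity"],
      )

  map.save("greenfield_flows_map.html")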

Lastly, the customers from the Optimization Greenfield Customer Summary output table are added to the map that already contains facilities and flow lines, and is saved as greenfield_flows_customers_map.html:

  1. The pandas iterrows function is used again, to now iterate through all rows in the df_Res_Customers dataframe (lines 194-201).
  2. The variable customer_tooltip is set to a concatenation of “<customername> with <totaldemand>” (line 196).
  3. The folium Marker function is used to configure the markers for customers on the map (line 197):
    1. The location of the markers will be the customer’s latitude and longitude (line 198).
    2. The tooltip that shows when hovering over the customer is set to the customer_tooltip variable that was just set on line 196 (line 199).
    3. The icon of the marker is a light blue circle with prefix “fa” (line 200).
  4. Each customer is then added to the map by using the folium add_to function (line 201).
  5. The map which now contains facilities, flow lines, and customers is saved as greenfield_flows_customers_map.html (line 203).

12. Tips & Tricks

Here are some additional pointers that may be useful when building your own Cosmic Frog for Excel applications:

  1. Use a generative AI like for example Chat GPT or perplexity to help with generating VBA and/or Python code. In most cases this will help you along quite far and quickly.
  2. You can record manual steps that you are taking in Excel as a Macro and see the resulting VBA code. This can be a starting point to write your own VBA code. To record a Macro:
    1. Go to the Developer tab in Excel.
    2. Click on Record Macro in the Code group.
    3. Give the Macro a name, choose where it should be stored, and optionally write a description of what the Macro will do.
    4. Click on OK.
    5. Manually take the steps you want to automate and create VBA code for.
    6. Click on Stop Recording when done.
    7. You can now find this Macro and its code under Modules > Module1 (or Module2, etc.) in the Visual Basic Editor.
  3. To ensure users of the App do not accidentally change data or instructions in any of the worksheets that should be kept as is, password protecting worksheets or whole workbooks can be helpful. This can be done in ways where certain cell ranges on certain worksheets are not locked, for example inputs that may be varied, so the user can still change those. See for example this documentation on locking/unlocking specific areas of a protected worksheet.
  4. You can use VBA code to reset data that can be changed by a user back to a default value and assign that Macro to a button that the user can click. This way a user can always go back to those default values after changing any values without having to take note of the original values. See following example where a user can click a Reset Defaults button (located on the DemandFactor_Customers worksheet) to set multiple cell ranges to the value of 1 for DemandFactor and to the value of Consider for DemandStatus. The VBA code to do this is shown in the second screenshot:

  5. Generate a basic App where non-power users cannot change many settings/inputs, and where power users do have access to change more. Good practice is to put the settings that power users can change on a separate worksheet, so that the workflow for non-power users stays as simple as possible. Locking/unlocking worksheets/workbooks may also be helpful for achieving this.
  6. In the Apps covered in this documentation, the read_sql and exec_sql cosmicfrog functions have not been used much, but they deserve a mention here as they can be a powerful tool to work with your models on the Optilogic platform (see the hedged examples after this list). Some examples are:
    1. Downloading partial output tables, e.g. filtering for records of only a certain scenario, filtering for values greater/less than a cutoff (where utilization > 0.9 for example), only downloading a subset of the columns of the output tables instead of the whole table, etc.
    2. Updating a table instead of overwriting it, for example changing the status of records that have a certain text in the Notes field from Include to Exclude or vice versa, multiplying a numeric field by a certain factor, etc.
  7. If a CSV or Excel file created by the Excel Macro is ready to be used by the Cosmic Frog model as is, then there is no need for the Python script to read it into a dataframe first; instead, the upsert_csv / upsert_excel functions can be used to load the data into the correct Cosmic Frog table(s). Note that the user should check that the data does not contain things like leading/trailing spaces and commas, which should be removed first.
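
A couple of hedged examples of what such calls could look like, assuming read_sql takes a SQL string and returns a dataframe and exec_sql executes a SQL statement against the model; the table and column names here are for illustration only:

  # Download only part of an output table: one scenario, high utilizations, a few columns
  df_util = model.read_sql(
      "SELECT facilityname, scenarioname, utilization "
      "FROM optimizationfacilitysummary "
      "WHERE scenarioname = 'Baseline' AND utilization > 0.9"
  )

  # Update a table in place instead of overwriting it
  model.exec_sql("UPDATE facilities SET status = 'Exclude' WHERE notes LIKE '%Close%'")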

13. Troubleshooting

You may run into issues where Macros or scripts are not running as expected. Here we cover some common problems you may come across and their solutions.

a. Excel: Protected View and/or Security Warnings

When opening an Excel .xlsm file you may find that you see the following message about the view being protected; you can click on Enable Editing if you trust the source:

Enabling editing is not necessarily sufficient to also be able to run any Macros contained in the .xlsm file, and you may see the following message after clicking on the Enable Editing button:

Closing this message box and then trying to run a Macro will result in the following message.

To resolve this, it is not always sufficient to just close and reopen the workbook and enable macros as the message suggests. Rather, go to the folder where the .xlsm file is saved in File Explorer, right click on it, and select Properties:

At the bottom in the General tab, check the Unblock checkbox and then click on OK.

Now, when you open the .xlsm file again, you have the option to Enable Macros, do so by clicking on the button. From now on, you will not need to repeat any of these steps when closing and reopening the .xlsm file; Macros will work fine.

It is also possible that instead of the Enable Editing warning and warnings around Macros not running discussed above, you will see a message that Macros have been disabled, as in the following screenshot. In this case, please click on the Enable Content button:

b. Anti-virus Software blocking Macros from Running

Depending on your anti-virus software and its settings, it is possible that the Macros in your Cosmic Frog for Excel Apps will not run as they are blocked by the anti-virus software. If you get “An unexpected error occurred: 13 Type mismatch”, this may be indicative of the anti-virus software blocking the Macro. Work with your IT department to allow the running of Macros.

c. Local Python Scripts not Connecting to cosmicfrog.com

If you are running Python scripts locally (say from Visual Studio Code) that are connecting to Cosmic Frog models and/or uploading files to the Optilogic platform, you may be unsuccessful and get warnings with the text “WARNING – create_engine_with_retry: Database not ready, retrying”. In this case, the likely cause is that your IP address needs to be added to the list of firewall exceptions within the Optilogic platform, see the instructions on how to do this in the “Working with Python Locally” section further above.

d. Cells with Formulas in Excel Exporting to CSV as 0’s

You may find that if you export cells that contain formulas from Excel to CSV, these are exported as 0’s and not as the calculated values. Possible solutions for this are 1) to export to a format other than CSV, possibly .xlsx, or 2) to create an extra column in your data into which the results of the formula cells are copy-pasted as values, and export this column instead of the one with the formulas (this way the formulas stay intact for the next run of the App). You could use the record Macro option to get a start on the VBA code for copy-pasting values from one column into another, so that you do not have to do this manually each time you run the App, but it becomes part of the Macro that runs when the App runs. An example of VBA code that copy-pastes values can be seen in this screenshot:

e. Closing Output Files

When running an App that has been run previously, there are likely output files in the folder where the App is located, for example CSV files that are opened by the user to view the results or are read back into a worksheet in the App. When running the App again, it is important to not have these output files open; otherwise an error will be thrown when the App gets to the stage of downloading the output files, since open files cannot be overwritten.

14. Apps Available in the Resource Library

There are currently several Cosmic Frog for Excel Applications available in the Resource Library, with more being added over time. Check back frequently and search for “Cosmic Frog for Excel” in the search bar to find all available Apps. A short description for each App that is available follows here:

  1. Cosmic Frog for Excel – DC Capacity – this Excel App enables you to understand capacity limitations and demand allocation when the DCs in the network have certain throughput capacities. Specify the throughput capacity for the 7 US DCs in the network and then the App runs a network optimization taking the new capacities into account. The App returns results detailing the facility throughput utilization as well as how demand and supply is allocated to the facilities within the scenario.
  2. Cosmic Frog for Excel – Geocoding – this Excel App enables geographic data including address, city, region, country, and postal code to be entered. Running the App will geocode (return a latitude and longitude for) each record. This geocoded data can be displayed on a 3D map directly in Excel. Demand quantity can also be included, which is displayed as bars on the map, enabling relative demand at each location to be easily visualized.
  3. Cosmic Frog for Excel – Greenfield – this Excel App enables you to understand the optimal number of facilities required for a given demand and supply profile. Specify customer demand quantities and optionally supplier quantities. You are also able to provide existing facilities that must be part of the solution as well as specific candidate facilities that you want to test as part of the scenario. Fixed facility costs, facility throughput capacities and variable costs associated with supply and demand quantities can be defined. The App returns results detailing which facilities are selected as well as how demand and supply is allocated to the facilities within the scenario. Version 3 of this App is what we followed along with initially in the documentation, and then in the section on Additional Common App Functionality we used version 6 of the App to cover a few more useful features.

15. List of All Resources

As this documentation contains many links to references and resources, we will list them all here in one place:
