Model considerations for solver performance
In general, solver performance is impacted by a combination of the size and complexity of the model. For example, a model may not be overly large, but may be complex to solve due to the number of integer variables that are included in the problem or the types of constraints used. Models can become complex when inefficient model structures are included, such as deeply nested processes.
Additionally, the magnitude of values within the model can not only impact solver performance but also lead to numerical difficulties, including the possibility of false infeasibilities.
When discussing “solve time”, keep in mind that it does not include the time required to build the model; it begins once the solver receives the problem and starts trying to optimize it. When running Network Optimization, the solver identifies an approximate range of time required to complete the solve based on the complexity of the model. For example, a model solve may be predicted to complete in a range of 0 to 5 minutes or a longer range. This information is available in the solve log.
Most longer run times stem from three major areas:
- Too much detail in the model for the question being answered. Refer to Model size for additional information.
- Numerical instability, usually resulting from a large difference between the largest and smallest demand or from dummy lanes/sites with very high costs. Refer to Scaling issues for additional information.
- Many binary/integer variables in the model that result in an enumeration of solutions. Refer to Use of integer and binary variables for additional information.
When reviewing your model, keep the following considerations in mind regarding solve time. This list is not in a prioritized order:

- What are the questions you are trying to answer?
- What decisions are being made?
- Are you forcing a solution into the model?

If the slowness is in the actual solve time, what is taking the most time?
- How long does the LP take to solve?
- How much time is spent before leaving the root node?
- Is there a lot of time spent in the cutting plane algorithms?
- Is there a lot of time spent in the presolve? See Performance settings for additional information.
- When a final solution is found, how many nodes were visited? If the number is small, then consider a more conservative cut strategy (on the Solver Settings tab of Network Optimization options), since these cuts may not be helping. See the sketch after this list.
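If you can export a comparable model and solve it directly with a standalone MIP solver, these statistics are easy to inspect. The sketch below is a minimal example using the Gurobi Python API; the file name model.lp and the parameter values are assumptions for illustration, and Network Optimization's own solver behavior should still be changed through the Solver Settings tab, not through code like this.

```python
# Minimal sketch, assuming an exported "model.lp" file and gurobipy installed.
import gurobipy as gp

m = gp.read("model.lp")      # hypothetical exported model
m.optimize()

# Where is the time going, and how much branching happened?
print(f"Total solve time:               {m.Runtime:.1f} s")
print(f"Branch-and-bound nodes visited: {int(m.NodeCount)}")
print(f"Remaining MIP gap:              {m.MIPGap:.2%}")

# If very few nodes were visited, aggressive cuts may not be paying off;
# a more conservative cut strategy can be tried on a second run.
m.Params.Cuts = 1            # 0 = off, 1 = moderate, 2/3 = aggressive
m.reset()
m.optimize()
```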

Does the model contain a lot of integer or binary variables?
- Conditional minimum constraints are very expensive because they introduce binary variables into the model. A sketch of this type of formulation follows this list.
- Is there a large number of binary variables after pre-solve? These variables may be causing a lot of branching, resulting in the long solve time.
- Are there a lot of piecewise functions in the objective function? Each step in a piecewise function creates a binary variable for the selection.
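To see why each of these features adds binary variables, the sketch below shows a generic textbook formulation of a conditional minimum ("ship nothing, or ship at least the minimum") using one binary variable and a big-M bound. It is written with PuLP purely for illustration; the minimum quantity, big-M value, costs, and demand are assumed, and this is not necessarily the exact formulation Network Optimization generates.

```python
# Minimal sketch of a conditional minimum modeled with one binary variable.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, PULP_CBC_CMD

MIN_QTY = 100      # conditional minimum on the lane (assumed)
BIG_M   = 10_000   # upper bound on lane flow; tighter is better for the solver

prob = LpProblem("conditional_minimum_example", LpMinimize)
flow = LpVariable("flow", lowBound=0)          # units shipped on the lane
use  = LpVariable("use_lane", cat=LpBinary)    # 1 if the lane is used at all

# Toy objective: per-unit cost plus a fixed cost when the lane is opened.
prob += 2.5 * flow + 500 * use

# If the lane is used, ship at least MIN_QTY; if it is not used, ship nothing.
prob += flow >= MIN_QTY * use
prob += flow <= BIG_M * use

# Assumed demand that must move on this lane, which forces the lane open.
prob += flow >= 150

prob.solve(PULP_CBC_CMD(msg=False))
print(flow.value(), use.value())   # expect 150.0 and 1.0
```

Every binary variable like use_lane is a candidate for branching, which is why a model with many conditional minimums or piecewise steps can take much longer to solve.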

Is the model numerically unstable?
If yes, this will significantly slow the solve time. Numerical instability is the result of dividing very small numbers by very large numbers while the algorithm is solving.
- Look at the demand.
- Sort by Quantity. If there is a large difference between the largest and smallest values, then this can cause issues. Consider either aggregating the small quantity products together or completely removing them from the model.
- Multiply the Unit Price by the Quantity, then compare the total cost for the solution to the Unit Price * Quantity for each product. If a product's cost is a very small percentage of the total cost, this is an indicator that the model may be numerically unstable. Additionally, the variables associated with these small products may well be slowing the model down since they are potentially not driving factors in optimizing the solution. A sketch of both demand checks follows this list.
- Look at the constraints.
- Are there very small values that are constraining the problem? These small constraint values can lead to unstable models.
- Look at the product flow bounds - the Big M calculation.
- Try implementing logical constraints to tighten the Big M calculations. Note that using logical constraints comes with the expense of slowing down solution time for stable models.
- Consider the user defined value. If you know the largest flow that a product could have through a site, you can enter that number as the User Defined Value for the bound. The smaller the number the better; however, setting it too small will limit the solution space and possibly make the model infeasible. Setting it arbitrarily large can slow down the solve time and cause numerical issues. If you have selected the User Defined Value and are experiencing these issues, try the Automatic Calculations to let Network Optimization do the calculation.
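A quick way to perform the demand checks above outside the application is sketched below. It assumes you have exported the demand table to a CSV named demand.csv with Product, Quantity, and Unit Price columns; the file name, column names, and the ratio threshold are assumptions for illustration.

```python
# Minimal sketch of the demand scaling checks, assuming an exported demand table.
import pandas as pd

demand = pd.read_csv("demand.csv")   # columns: Product, Quantity, Unit Price

# Check 1: spread between the largest and smallest product quantities.
qty = demand.groupby("Product")["Quantity"].sum()
spread = qty.max() / qty.min()
print(f"Largest / smallest product quantity: {spread:,.0f}")
if spread > 1e6:   # rough rule of thumb, not an official threshold
    print("Large quantity spread; consider aggregating or removing small products.")

# Check 2: each product's share of total spend (Unit Price * Quantity).
demand["Cost"] = demand["Unit Price"] * demand["Quantity"]
share = demand.groupby("Product")["Cost"].sum() / demand["Cost"].sum()
print(share.sort_values().head(10))  # products with a negligible share are
                                     # candidates for aggregation or removal
```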

Is the entire search space included in the model?
- Some people create models with every possible option defined, including options that could result in poor decisions. This is done to remove artificial limitations on the search space or to account for data that has not been thoroughly cleaned. With this type of model, the true optimal solution is typically never found.
- Can you intelligently limit the search space of the algorithm? For example, you may be able to turn off flow lanes that you know are a bad choice, or you can set locations that are always used to Include rather than Consider.
- Can you run a set of scenarios to help identify decisions that can be removed from the model?

Is there anything that stands out as odd or abnormal?
- Look for “dummy nodes” and “trash lanes”. Some users create these nodes and lanes in an effort to ensure that the model stays feasible. Assess whether these lanes are still being used (a sketch for checking this in bulk follows this list). If so, identify what is flowing there and what needs to be adjusted so that these nodes and lanes can be removed. If these nodes and lanes are not being used, remove them from the model. You can also change the objective to run a profit model instead of a cost model and set all customers to “Consider”. Additionally, look for artificially high costs associated with these nodes and lanes. If you cannot remove them, try reducing the cost values to something more reasonable.
- Are there really large or small numbers in the model (constraints/objective) that could be creating instability?
- Look for “dummy sites” in the model. These are often used to prevent infeasibility, but they can be very expensive because they are typically associated with higher transportation costs. They can also introduce unnecessary flow links to the model, limit the bounding, and cause numerical issues. Try balancing supply and demand values to alleviate the need for dummy sites and links.
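To check dummy nodes, lanes, and sites in bulk, a sketch like the one below can help. It assumes you have exported the optimized flow table plus the demand and supply tables to CSV, and that dummy objects follow a naming convention such as a “DUMMY” prefix; the file names, column names, and naming convention are all assumptions about your model.

```python
# Minimal sketch: flag dummy lanes that still carry flow and compare
# total supply with total demand (an imbalance usually explains why
# dummy sites are still needed).
import pandas as pd

flows  = pd.read_csv("optimized_flows.csv")   # columns: Source, Destination, Flow Units
demand = pd.read_csv("demand.csv")            # column: Quantity
supply = pd.read_csv("supply.csv")            # column: Quantity

is_dummy = (flows["Source"].str.startswith("DUMMY")
            | flows["Destination"].str.startswith("DUMMY"))
used_dummy = flows[is_dummy & (flows["Flow Units"] > 0)]

print(f"Dummy lanes still carrying flow: {len(used_dummy)}")
print(used_dummy[["Source", "Destination", "Flow Units"]])

print(f"Total supply: {supply['Quantity'].sum():,.0f}")
print(f"Total demand: {demand['Quantity'].sum():,.0f}")
```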