Optimizing a 3D Printed Material for Strength Under Constraints

Imagine you are given the task of optimizing the parameters of a 3D printer to maximize the strength of a printed part. You believe Bayesian optimization can help you in this task and decide to put together a simple optimization script using Honegumi.

Looking at the printer’s settings, you see that the following parameters can be adjusted within the specified bounds:

Symbol   Parameter Name   Bounds
x1       X Offset         [-1.0, 1.0]
x2       Y Offset         [-1.0, 1.0]
x3       Infill Density   [0.0, 1.0]
x4       Layer Height     [0.0, 1.0]
c1       Infill Type      [honeycomb, gyroid, lines, rectilinear]

In addition to a tight time budget (at most 30 optimization trials), your manager tells you that the cost of printing the part cannot exceed $13. The print cost is a linear function of the infill density and layer height:

cost ($) = 16.32 * infill_density - 3.73 * layer_height
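As a quick sanity check, we can encode the cost equation above as a small helper (a sketch; the `print_cost` name is ours, not part of any printer API) and confirm that the $13 cap genuinely restricts the search space:

```python
def print_cost(infill_density: float, layer_height: float) -> float:
    """Print cost in dollars under the linear cost model."""
    return 16.32 * infill_density - 3.73 * layer_height

# Maximum density with the thinnest layers blows the budget...
assert print_cost(1.0, 0.0) > 13.0
# ...while moderate settings stay comfortably under it.
assert print_cost(0.8, 0.3) <= 13.0
```

Because the constraint is linear in x3 and x4, it can be handed to the optimizer directly as a parameter constraint rather than checked after the fact.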

A dummy objective function has been constructed in the code cell below to emulate the results of experimental trials under different inputs. Although we could find the optimum directly from this function, we will pretend that it is unknown and use a Bayesian optimization approach to find the optimal set of input parameters instead.

[116]:
import numpy as np

def printed_strength(x1, x2, x3, x4, c1):
    """
    Calculates the printed strength based on the given input parameters.

    Parameters:
    x1 (float): x_offset [-1.0, 1.0].
    x2 (float): y_offset [-1.0, 1.0].
    x3 (float): infill_density [0.0, 1.0].
    x4 (float): layer_height [0.0, 1.0].
    c1 (str): The infill type ["honeycomb", "gyroid", "lines", "rectilinear"].

    Returns:
    float: The calculated printed strength.
    """
    y = float(
        20*(x1**2*np.sin(x1/2)+1) +
        10*(x2*np.cos(x2*0.5-1.5)+1) +
        5*(np.log(x3) + 10) +
        3*(1 / (0.25 * np.sqrt(2 * np.pi))) * np.exp(-(x4 - 0.23)**2 / (2 * 0.25**2)) -
        1.3*x4+10
    )

    infill_effects = {
        'honeycomb': 1,
        'gyroid': 1.5,
        'lines': 0.5,
        'rectilinear': 0.8
    }

    y *= infill_effects[c1]

    return y
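Before optimizing, it helps to spot-check the dummy objective at a convenient point. The snippet below evaluates the same formula term by term at x1 = x2 = 0, x3 = 1, x4 = 0, where the x1 and x2 terms reduce to 20 and 10 and log(1) vanishes; the figures in the comments follow from the formula itself:

```python
import numpy as np

# Strength at x1=x2=0, x3=1, x4=0, before the infill multiplier is applied
base = (
    20 * (0**2 * np.sin(0 / 2) + 1)                                   # -> 20
    + 10 * (0 * np.cos(0 * 0.5 - 1.5) + 1)                            # -> 10
    + 5 * (np.log(1.0) + 10)                                          # -> 50
    + 3 * (1 / (0.25 * np.sqrt(2 * np.pi)))
        * np.exp(-(0.0 - 0.23) ** 2 / (2 * 0.25**2))                  # -> ~3.14
    - 1.3 * 0.0 + 10                                                  # -> 10
)
print(round(base, 2))        # ~93.14 with the honeycomb multiplier of 1
print(round(base * 1.5, 2))  # the gyroid multiplier lifts this by 50%
```

Note also that the log(x3) term diverges to negative infinity as the infill density approaches 0, so very low densities are heavily penalized even though the bound includes 0.0.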

Applying Honegumi

We will now use the Honegumi website to generate a script that will help us optimize the printer parameters. From the description, we observe that our problem is a single-objective optimization problem with a categorical input variable and a linear constraint on cost. To create an optimization script for this problem, we select the following options:

[Figure: Honegumi option selections (HG_selections.jpg)]

The Honegumi generated optimization script will provide a framework for our optimization campaign that we can modify to suit our specific problem needs. In the code sections below, we will make several modifications to this generated script to make it compatible with our problem.

Modifying the Code for Our Problem

We can modify this code to suit our problem with a few simple modifications. Wherever a modification has been made to the code, a comment starting with # CHANGE: has been added along with a brief description of the change.

[117]:
from ax.service.ax_client import AxClient, ObjectiveProperties

ax_client = AxClient(random_seed=125) # CHANGE: add a random seed for repeatability

obj1_name = "printed_strength" # CHANGE: adjust the objective name to match our function

# CHANGE: Remove the branin dummy objective function, we will use the printer function

ax_client.create_experiment(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-1.0, 1.0]}, # CHANGE: update parameter
        {"name": "x2", "type": "range", "bounds": [-1.0, 1.0]}, # CHANGE: update parameter
        {"name": "x3", "type": "range", "bounds": [0.0, 1.0]}, # CHANGE: add new parameter
        {"name": "x4", "type": "range", "bounds": [0.0, 1.0]}, # CHANGE: add new parameter
        {
            "name": "c1",
            "type": "choice",
            "is_ordered": False,
            "values": ["honeycomb", "gyroid", "lines", "rectilinear"] # CHANGE: add categories

        },
    ],
    parameter_constraints=[
        "16.32*x3 - 3.73*x4 <= 13.0", # CHANGE: input the linear printer cost constraint
    ],
    objectives={
        obj1_name: ObjectiveProperties(minimize=False), # CHANGE: set minimize=False, since we maximize strength
    },
)

for _ in range(30): # CHANGE: this is a tough problem, increase number of trials

    parameterization, trial_index = ax_client.get_next_trial()

    # CHANGE: pull all added parameters from the parameterization
    x1 = parameterization["x1"]
    x2 = parameterization["x2"]
    x3 = parameterization["x3"]
    x4 = parameterization["x4"]
    c1 = parameterization["c1"]

    results = printed_strength(x1, x2, x3, x4, c1) # CHANGE: switch to printer function
    ax_client.complete_trial(trial_index=trial_index, raw_data=results)

best_parameters, metrics = ax_client.get_best_parameters()
[INFO 04-08 11:22:31] ax.service.ax_client: Starting optimization with verbose logging. To disable logging, set the `verbose_logging` argument to `False`. Note that float values in the logs are rounded to 6 decimal points.
[WARNING 04-08 11:22:31] ax.service.ax_client: Random seed set to 125. Note that this setting only affects the Sobol quasi-random generator and BoTorch-powered Bayesian optimization models. For the latter models, setting random seed to the same number for two optimizations will make the generated trials similar, but not exactly the same, and over time the trials will diverge more.
[INFO 04-08 11:22:31] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x1. If that is not the expected value type, you can explicitly specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 04-08 11:22:31] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x2. If that is not the expected value type, you can explicitly specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 04-08 11:22:31] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x3. If that is not the expected value type, you can explicitly specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 04-08 11:22:31] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x4. If that is not the expected value type, you can explicitly specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 04-08 11:22:31] ax.service.utils.instantiation: Inferred value type of ParameterType.STRING for parameter c1. If that is not the expected value type, you can explicitly specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
/Users/andrewf/miniconda3/envs/ax_env/lib/python3.9/site-packages/ax/core/parameter.py:594: UserWarning:

`sort_values` is not specified for `ChoiceParameter` "c1". Defaulting to `False` for parameters of `ParameterType` STRING. To override this behavior (or avoid this warning), specify `sort_values` during `ChoiceParameter` construction.

[INFO 04-08 11:22:31] ax.service.utils.instantiation: Created search space: SearchSpace(parameters=[RangeParameter(name='x1', parameter_type=FLOAT, range=[-1.0, 1.0]), RangeParameter(name='x2', parameter_type=FLOAT, range=[-1.0, 1.0]), RangeParameter(name='x3', parameter_type=FLOAT, range=[0.0, 1.0]), RangeParameter(name='x4', parameter_type=FLOAT, range=[0.0, 1.0]), ChoiceParameter(name='c1', parameter_type=STRING, values=['honeycomb', 'gyroid', 'lines', 'rectilinear'], is_ordered=False, sort_values=False)], parameter_constraints=[ParameterConstraint(16.32*x3 + -3.73*x4 <= 13.0)]).
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: Using Models.BOTORCH_MODULAR since there are more ordered parameters than there are categories for the unordered categorical parameters.
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: Calculating the number of remaining initialization trials based on num_initialization_trials=None max_initialization_trials=None num_tunable_parameters=5 num_trials=None use_batch_trials=False
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: calculated num_initialization_trials=10
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: num_completed_initialization_trials=0 num_remaining_initialization_trials=10
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: `verbose`, `disable_progbar`, and `jit_compile` are not yet supported when using `choose_generation_strategy` with ModularBoTorchModel, dropping these arguments.
[INFO 04-08 11:22:31] ax.modelbridge.dispatch_utils: Using Bayesian Optimization generation strategy: GenerationStrategy(name='Sobol+BoTorch', steps=[Sobol for 10 trials, BoTorch for subsequent trials]). Iterations after 10 will take longer to generate due to model-fitting.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 0 with parameters {'x1': 0.28652, 'x2': 0.729716, 'x3': 0.106549, 'x4': 0.308309, 'c1': 'rectilinear'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 0 with data: {'printed_strength': (69.020322, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 1 with parameters {'x1': -0.511437, 'x2': -0.964879, 'x3': 0.653142, 'x4': 0.89349, 'c1': 'gyroid'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 1 with data: {'printed_strength': (134.081282, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 2 with parameters {'x1': -0.045429, 'x2': 0.089123, 'x3': 0.447532, 'x4': 0.016436, 'c1': 'gyroid'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 2 with data: {'printed_strength': (134.075964, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 3 with parameters {'x1': 0.750334, 'x2': -0.355898, 'x3': 0.808166, 'x4': 0.681207, 'c1': 'rectilinear'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 3 with data: {'printed_strength': (74.796229, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 4 with parameters {'x1': 0.700595, 'x2': 0.479438, 'x3': 0.542767, 'x4': 0.624066, 'c1': 'lines'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 4 with data: {'printed_strength': (46.174644, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 5 with parameters {'x1': -0.472393, 'x2': -0.216561, 'x3': 0.212896, 'x4': 0.209295, 'c1': 'honeycomb'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 5 with data: {'printed_strength': (85.800827, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 6 with parameters {'x1': -0.943886, 'x2': 0.83841, 'x3': 0.88729, 'x4': 0.832436, 'c1': 'gyroid'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 6 with data: {'printed_strength': (126.641476, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 7 with parameters {'x1': 0.226907, 'x2': -0.607161, 'x3': 0.372436, 'x4': 0.497254, 'c1': 'honeycomb'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 7 with data: {'printed_strength': (88.635904, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 8 with parameters {'x1': -0.86197, 'x2': -0.407047, 'x3': 0.250808, 'x4': 0.505045, 'c1': 'gyroid'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 8 with data: {'printed_strength': (119.058815, None)}.
[INFO 04-08 11:22:31] ax.service.ax_client: Generated new trial 9 with parameters {'x1': -0.335551, 'x2': 0.538677, 'x3': 0.584256, 'x4': 0.381901, 'c1': 'lines'} using model Sobol.
[INFO 04-08 11:22:31] ax.service.ax_client: Completed trial 9 with data: {'printed_strength': (46.10894, None)}.
[INFO 04-08 11:22:32] ax.service.ax_client: Generated new trial 10 with parameters {'x1': -0.392051, 'x2': -0.156548, 'x3': 0.661345, 'x4': 0.553725, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:32] ax.service.ax_client: Completed trial 10 with data: {'printed_strength': (133.043694, None)}.
[INFO 04-08 11:22:34] ax.service.ax_client: Generated new trial 11 with parameters {'x1': -0.811555, 'x2': -0.232386, 'x3': 0.72679, 'x4': 0.926811, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:34] ax.service.ax_client: Completed trial 11 with data: {'printed_strength': (123.305786, None)}.
[INFO 04-08 11:22:36] ax.service.ax_client: Generated new trial 12 with parameters {'x1': -0.016375, 'x2': -0.710737, 'x3': 0.547875, 'x4': 0.370852, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:36] ax.service.ax_client: Completed trial 12 with data: {'printed_strength': (138.884099, None)}.
[INFO 04-08 11:22:38] ax.service.ax_client: Generated new trial 13 with parameters {'x1': 0.223029, 'x2': -0.593452, 'x3': 0.517212, 'x4': 0.709016, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:38] ax.service.ax_client: Completed trial 13 with data: {'printed_strength': (131.978254, None)}.
[INFO 04-08 11:22:40] ax.service.ax_client: Generated new trial 14 with parameters {'x1': -0.208178, 'x2': -0.854538, 'x3': 0.648116, 'x4': 0.468209, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:40] ax.service.ax_client: Completed trial 14 with data: {'printed_strength': (139.733143, None)}.
[INFO 04-08 11:22:42] ax.service.ax_client: Generated new trial 15 with parameters {'x1': -0.103261, 'x2': -0.662468, 'x3': 0.654958, 'x4': 0.304544, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:42] ax.service.ax_client: Completed trial 15 with data: {'printed_strength': (140.6433, None)}.
[INFO 04-08 11:22:44] ax.service.ax_client: Generated new trial 16 with parameters {'x1': -0.065675, 'x2': -0.67477, 'x3': 0.68869, 'x4': 0.263156, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:44] ax.service.ax_client: Completed trial 16 with data: {'printed_strength': (141.469891, None)}.
[INFO 04-08 11:22:46] ax.service.ax_client: Generated new trial 17 with parameters {'x1': 0.066783, 'x2': -0.775198, 'x3': 0.756127, 'x4': 0.23312, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:46] ax.service.ax_client: Completed trial 17 with data: {'printed_strength': (143.256153, None)}.
[INFO 04-08 11:22:48] ax.service.ax_client: Generated new trial 18 with parameters {'x1': 0.183951, 'x2': -0.942388, 'x3': 0.81921, 'x4': 0.287262, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:48] ax.service.ax_client: Completed trial 18 with data: {'printed_strength': (145.542431, None)}.
[INFO 04-08 11:22:50] ax.service.ax_client: Generated new trial 19 with parameters {'x1': 0.294473, 'x2': -1.0, 'x3': 0.893579, 'x4': 0.424466, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:50] ax.service.ax_client: Completed trial 19 with data: {'printed_strength': (145.258548, None)}.
[INFO 04-08 11:22:52] ax.service.ax_client: Generated new trial 20 with parameters {'x1': 0.251583, 'x2': -1.0, 'x3': 0.861, 'x4': 0.290972, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:52] ax.service.ax_client: Completed trial 20 with data: {'printed_strength': (146.761116, None)}.
[INFO 04-08 11:22:54] ax.service.ax_client: Generated new trial 21 with parameters {'x1': 0.844349, 'x2': -1.0, 'x3': 0.813733, 'x4': 0.075101, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:54] ax.service.ax_client: Completed trial 21 with data: {'printed_strength': (154.240162, None)}.
[INFO 04-08 11:22:56] ax.service.ax_client: Generated new trial 22 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:56] ax.service.ax_client: Completed trial 22 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:22:56] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:22:56] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:22:58] ax.service.ax_client: Generated new trial 23 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:58] ax.service.ax_client: Completed trial 23 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:22:58] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:22:58] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:22:59] ax.service.ax_client: Generated new trial 24 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:22:59] ax.service.ax_client: Completed trial 24 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:22:59] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:22:59] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:23:01] ax.service.ax_client: Generated new trial 25 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:23:01] ax.service.ax_client: Completed trial 25 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:23:01] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:23:01] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:23:02] ax.service.ax_client: Generated new trial 26 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:23:02] ax.service.ax_client: Completed trial 26 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:23:02] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:23:02] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:23:04] ax.service.ax_client: Generated new trial 27 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:23:04] ax.service.ax_client: Completed trial 27 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:23:04] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:23:04] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:23:06] ax.service.ax_client: Generated new trial 28 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:23:06] ax.service.ax_client: Completed trial 28 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:23:06] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[INFO 04-08 11:23:06] ax.modelbridge.torch: The observations are identical to the last set of observations used to fit the model. Skipping model fitting.
[INFO 04-08 11:23:07] ax.service.ax_client: Generated new trial 29 with parameters {'x1': 1.0, 'x2': -1.0, 'x3': 0.803946, 'x4': 0.032279, 'c1': 'gyroid'} using model BoTorch.
[INFO 04-08 11:23:07] ax.service.ax_client: Completed trial 29 with data: {'printed_strength': (159.17775, None)}.
[INFO 04-08 11:23:07] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0

Show the Best Parameters

After our optimization loop has completed, we can use the model to find the best parameters and their corresponding strength value. These are the optimal settings we would use on the 3D printer going forward.

[123]:
ax_client.get_best_trial()
[INFO 04-08 11:30:14] ax.modelbridge.base: Leaving out out-of-design observations for arms: 22_0
[123]:
(29,
 {'x1': 0.25158306589474977,
  'x2': -1.0,
  'x3': 0.8609995465769321,
  'x4': 0.2909716038647014,
  'c1': 'gyroid'},
 ({'printed_strength': 146.70775284531527},
  {'printed_strength': {'printed_strength': 0.09174820515491967}}))
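It is worth confirming that the recommended parameters respect the cost cap. Plugging the x3 and x4 values returned above into the cost equation from earlier:

```python
# Best trial's infill density (x3) and layer height (x4), copied from the output above
x3, x4 = 0.8609995465769321, 0.2909716038647014

cost = 16.32 * x3 - 3.73 * x4
print(f"${cost:.2f}")  # ~ $12.97, just under the $13.00 budget
assert cost <= 13.0
```

The optimizer has pushed the infill density about as high as the budget allows, which is consistent with the objective rewarding denser infill.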

Plotting Optimization Performance

We can plot the performance of our optimization loop to see how the optimization task progressed as a function of iteration count.

We observe that our initial Sobol trials were extremely helpful in finding a good starting region for the Gaussian process model.

[124]:
from ax.utils.notebook.plotting import init_notebook_plotting, render

render(ax_client.get_optimization_trace())

[Interactive Plotly figure: optimization trace of best objective value vs. trial number]

Assessing Model Accuracy

We can also assess the accuracy of our model by comparing the predicted strength values to the actual strength values. This can help us understand how well our model is performing and how much we can trust the optimal parameters it suggests.

[125]:
from ax.modelbridge.cross_validation import cross_validate
from ax.plot.diagnostic import interact_cross_validation

model = ax_client.generation_strategy.model
cv_results = cross_validate(model)
render(interact_cross_validation(cv_results))

[Interactive Plotly figure: cross-validation of predicted vs. observed printed strength]