python · genetic-algorithm · pymoo

Pymoo generating candidates with nan parameters


I'm running a multi-objective optimisation with Pymoo (0.5.0) using NSGA-III, and some of the candidates generated in each new population have nan parameters. This makes my evaluate function (a call to a neural network) return nan. The optimisation runs and produces the desired results, but I'd like to know why some of the candidate parameters are nan. Here is the code for the problem.

Problem setup:

import numpy as np

from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.core.problem import Problem
from pymoo.factory import get_crossover, get_mutation, get_reference_directions, get_sampling
from pymoo.optimize import minimize

opt_name = "sopt00001KaRxD60fLn2"
pop_size = 165
n_gen = 350
cross_over_pb = 0.9
mutation_pb = 0.1

# Fixed params
band ="KaRx"
arc = "RevF"
source_spec = "sweep99p1"
lens_diameter = 60.0
source_z = 5.0
r_lam = 0.1
use_original_transformation = 0  # false
source_x0 = 0.0
target_scans = [0, 70, 50]

# Optimisation param ranges
lens_material_delta_n = [1.5, 3.6]
lens_thick = [5, 35]
lens_radii_back = [39, 22500]
lens_radii_front = [39, 22500]
source_width = [2, 20]
source_x = [12, 20]

params_lower_lim = [lens_thick[0], lens_radii_front[0], lens_radii_back[0], source_width[0], source_x[0], source_x[0], 
                    lens_material_delta_n[0], -1, -1, -1, 0, -1, -1, -1, 0]
params_upper_lim = [lens_thick[1], lens_radii_front[1], lens_radii_back[1], source_width[1], source_x[1], source_x[1],
                    lens_material_delta_n[1], 1, 1, 1, 1, -1, -1, -1, 1]

n_var = len(params_lower_lim)
assert n_var == len(params_upper_lim), "Upper and lower parameter limits are not equal length!"

# Other required params
if band == "KaRx":
    freq_center = 19.45
    freq_width = 3.5

Evaluate function:

class ProblemWrapper(Problem):
    
    def _evaluate(self, params, out, *args, **kwargs):
        res = []
        for param in params:
            source_x70 = source_x_f(param[4], param[5], source_x, 70, r_lam, target_scans, freq_center, freq_width)
            source_x50 = source_x_f(param[4], param[5], source_x, 50, r_lam, target_scans, freq_center, freq_width)
            res.append(smeep(band, 
                             lens_diameter, param[0], 
                             param[1], param[2], 
                             param[3], 
                             source_x0, source_x70, source_x50, 
                             source_z, 
                             param[6], param[7], param[8], param[9], param[10], param[11], param[12], param[13], param[14], 
                             r_lam, use_original_transformation, 
                             arc, 
                             source_spec,
                             target_scans))
        
        out['F'] = np.array(res)

Algorithm settings:

ref_dirs = get_reference_directions("das-dennis", 3, n_partitions=12)
problem = ProblemWrapper(n_var=n_var, 
                         n_obj=len(target_scans), 
                         xl=params_lower_lim, 
                         xu=params_upper_lim)

algorithm = NSGA3(
    pop_size=pop_size, 
    ref_dirs=ref_dirs,
    sampling=get_sampling("real_random"),
    crossover=get_crossover("real_sbx", prob=cross_over_pb),
    mutation=get_mutation("real_pm", prob=mutation_pb)
)

Execution:

res = minimize(problem=problem,
               algorithm=algorithm,
               termination=("n_gen", n_gen),
               save_history=True,
               verbose=True
)

It looks like the only affected parameters are the poly6 (param[11]), poly7 (param[12]) and poly8 (param[13]) terms, and which of them come out as nan differs from candidate to candidate. I confess I have not tried any different crossover or mutation schemes, but these seemed the best fit from the documentation.

Thanks in advance!


Solution

  • The nans arise because the limits for your parameters 11, 12 and 13 are equal (-1 and -1 in all cases).

    If you look at the code for the polynomial mutation (real_pm), you have the following lines:

    delta1 = (X - xl) / (xu - xl)
    delta2 = (xu - X) / (xu - xl)
    

    where xu and xl are the upper and lower bounds of the parameters. Since xu == xl for those three parameters, the denominator is zero and the 0/0 division yields nan, which then propagates into the mutated value.
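    You can reproduce the effect outside pymoo with plain NumPy (a minimal sketch, not the actual real_pm code path; the variable names mirror the snippet above):

    ```python
    import numpy as np

    # Degenerate bounds, as for params 11-13 above: xl == xu == -1
    xl = np.array([-1.0])
    xu = np.array([-1.0])
    X = np.array([-1.0])  # the only value the variable can take

    with np.errstate(divide="ignore", invalid="ignore"):
        delta1 = (X - xl) / (xu - xl)  # 0 / 0 -> nan
        delta2 = (xu - X) / (xu - xl)  # 0 / 0 -> nan

    # Both deltas are nan, so the mutated parameter is nan as well.
    ```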

    Since the limits are the same (assuming that is intentional), those parameters are effectively constants rather than optimisation variables, and you should remove them from the bounds and pass their fixed values to your evaluate function directly.
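    One way to do that (a sketch only; `fixed_mask`, `expand` and the free/fixed split are illustrative names, not part of your original code) is to hand pymoo only the free variables and splice the constants back in before calling the evaluator:

    ```python
    import numpy as np

    # Full bounds as in the question; indices 11-13 are degenerate (xl == xu)
    params_lower_lim = np.array([5, 39, 39, 2, 12, 12, 1.5, -1, -1, -1, 0, -1, -1, -1, 0], dtype=float)
    params_upper_lim = np.array([35, 22500, 22500, 20, 20, 20, 3.6, 1, 1, 1, 1, -1, -1, -1, 1], dtype=float)

    fixed_mask = params_lower_lim == params_upper_lim  # which variables are constants
    fixed_vals = params_lower_lim[fixed_mask]

    # Bounds actually handed to the Problem: only the free variables
    xl_free = params_lower_lim[~fixed_mask]
    xu_free = params_upper_lim[~fixed_mask]

    def expand(free_param):
        """Rebuild the full 15-element vector from the optimised subset."""
        full = np.empty(params_lower_lim.shape)
        full[~fixed_mask] = free_param
        full[fixed_mask] = fixed_vals
        return full
    ```

    Inside `_evaluate` you would then call `param = expand(free_param)` for each candidate and index `param[0]` to `param[14]` exactly as before, while the Problem itself is constructed with `n_var=len(xl_free)`, `xl=xl_free`, `xu=xu_free`.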