python, gekko

Gekko Deep Learning does not find a solution


I want to create a regression ANN model with Gekko. I built this model with TensorFlow/Keras, and it works very well. Unfortunately, it is not possible to convert the Keras model to a Gekko APM model, so I need to rebuild it with the Gekko Brain class. However, the Brain model fails to find a solution.

Here is my dataset: ss10_1k_converted.csv

Here is my model in Gekko:

import pandas as pd
import numpy as np
from gekko import brain

# load dataset
df = pd.read_csv("ss10_1k_converted.csv")

# split data for train and test
train=df.sample(frac=0.8,random_state=200)
test=df.drop(train.index)

# Preparing training data
train_x = train[['D','V']].to_numpy()
train_x = np.transpose(train_x)
train_y = train['F'].to_numpy()
train_y = np.transpose(train_y).reshape(1,-1)

# Preparing test data
test_x = test[['D','V']].to_numpy()
test_y = test['F'].to_numpy().reshape(1,-1)

# neural network
b = brain.Brain()
b.input_layer(2)
b.layer(relu=61)
#b.layer(relu=61)
b.output_layer(1)
b.learn(train_x, train_y, obj=2)

Here is the error:

EXIT: Restoration Failed!
 
 An error occured.
 The error code is           -2
 
 
 ---------------------------------------------------
 Solver         :  IPOPT (v3.12)
 Solution time  :    179.147199999999      sec
 Objective      :    2064584.02728447     
 Unsuccessful with error code            0
 ---------------------------------------------------
 
 Creating file: infeasibilities.txt
 @error: Solution Not Found

complete error --> link

Please let me know what the problem is.

Thank you


Solution

  • There is a way to import trained TensorFlow/Keras models directly into Gekko with the Gekko_NN_TF class from gekko.ML. Here is an example adapted from the Gekko documentation:

    from gekko import GEKKO
    from gekko.ML import Gekko_NN_TF

    # model: trained Keras/TensorFlow model
    # mma:   min/max scaling values for each data column
    m = GEKKO()
    x = m.Var(.0, lb=0, ub=1)
    y = Gekko_NN_TF(model, mma, m, n_output=1).predict([x])
    m.Obj(y)                  # minimize the network output
    m.solve(disp=False)
    print('solution:', y.value[0])
    print('x:', x.value[0])
    print('Gekko Solvetime:', m.options.SOLVETIME, 's')
    

    If you do want to fit the neural network with Gekko, it is always better to scale the data first. Scaling does not change the underlying problem, but it helps the solver converge; a standard scaler (mean=0, stdev=1) works well here.

    import pandas as pd
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from gekko import brain
    
    # load dataset
    df = pd.read_csv("ss10_1k_converted.csv")
    
    # Initialize the StandardScaler
    s = StandardScaler()
    
    # split data for train and test
    train = df.sample(frac=0.8, random_state=200)
    test = df.drop(train.index)
    
    # scale train data
    train_s = s.fit_transform(train)
    train_s = pd.DataFrame(train_s,columns=train.columns)
    
    # Preparing training data
    train_x = train_s[['D', 'V']].to_numpy()
    train_x = np.transpose(train_x)
    train_y = train_s['F'].to_numpy()
    train_y = np.transpose(train_y).reshape(1, -1)
    
    # scale test data
    test_s = s.transform(test)
    test_s = pd.DataFrame(test_s,columns=test.columns)
    
    # Preparing test data
    test_x = test_s[['D', 'V']].to_numpy()
    test_y = test_s['F'].to_numpy().reshape(1, -1)
    
    # neural network
    b = brain.Brain()
    b.input_layer(2)
    b.layer(relu=61)
    b.output_layer(1)
    b.learn(train_x, train_y, obj=2)
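
    The network now trains and predicts in scaled units. Here is a short sketch of how the fit could be checked on the held-out test rows and mapped back to the original units (it assumes the Brain class's think() method for forward evaluation and uses the StandardScaler's stored mean_ and scale_ for the F column):

    # predictions on the scaled test inputs; think() expects the
    # same (features x samples) layout used for training
    pred_s = np.array(b.think(np.transpose(test_x))).flatten()

    # undo the standard scaling for the 'F' column only
    iF = list(train.columns).index('F')
    pred = pred_s * s.scale_[iF] + s.mean_[iF]
    actual = test['F'].to_numpy()
    print('RMSE:', np.sqrt(np.mean((pred - actual)**2)))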
    

    Tested with the first 300 data rows:

    D,V,F
    0.0039933,0.083865,-64.83
    0.004411,0.011882,-65.2739
    0.0046267,-0.047719,-65.7733
    0.0043474,0.032426,-67.1328
    0.0046642,0.0026268,-66.7721
    0.004506,-0.019885,-66.7998
    0.0042449,0.0027718,-66.0785
    0.0046068,-0.004274,-64.9826
    0.0040353,-0.015186,-64.8578
    0.0040928,0.0061156,-65.0936
    0.004671,-0.007791,-65.912
    0.004279,-0.014973,-66.3976
    0.0044212,0.026233,-67.0634
    0.0046039,-0.0026849,-66.5501
    0.0041202,-0.0067733,-65.8565
    0.0043672,0.012628,-65.3988
    0.004455,-0.0044196,-64.3723
    0.0040499,0.0075972,-65.0242
    0.0045626,-7.75E-05,-65.6485
    0.0044811,-0.0017835,-66.2727
    0.0043505,0.0099326,-67.8957
    0.0047254,0.0031688,-66.4947
    0.004119,-0.0035763,-66.2588
    0.0043387,-0.0021411,-65.8011
    0.0046139,-0.0041471,-64.4139
    0.0041006,-0.015224,-64.8855
    0.0045054,0.016978,-65.2878
    0.0044468,-0.01442,-65.8565
    0.0042841,-0.0033144,-67.0357
    0.0044818,0.017859,-66.4947
    0.0043907,-0.012453,-66.7582
    0.004108,0.022928,-66.0646
    0.0045609,0.0015028,-65.1907
    0.0043863,-0.0077244,-65.052
    0.0043062,0.014014,-64.9271
    0.0048137,-0.0020048,-66.0369
    0.0043082,-0.0070927,-66.4947
    0.0043583,0.0067367,-67.2576
    0.0046149,-0.013683,-66.6611
    0.0040759,-0.016978,-66.0508
    0.0044474,0.0066973,-65.6069
    0.0044941,-0.01849,-64.4139
    0.0040872,0.0051854,-64.9688
    0.004542,0.0072096,-65.9675
    0.0044153,-0.015739,-66.3282
    0.0040624,0.0049035,-68.1593
    0.0046978,-0.0025099,-66.675
    0.0041913,-0.012541,-66.5779
    0.0041956,0.011957,-66.1479
    0.0045725,0.0023067,-64.1226
    0.0040471,-0.0062125,-65.4959
    0.0043414,0.026824,-65.0104
    0.0046067,-0.0037695,-66.3837
    0.0040993,-0.0022286,-67.4518
    0.0045984,-0.0097582,-66.9524
    0.0046205,-0.0067457,-67.0495
    0.0041353,0.01126,-66.786
    0.0045369,0.011541,-65.2739
    0.0038909,-0.011067,-65.0104
    0.0042334,0.01567,-65.4681
    0.0047868,0.0051842,-66.2866
    0.004284,-0.006591,-66.8137
    0.0044214,0.018015,-67.2299
    0.0048712,-0.018352,-66.994
    0.0042182,-0.015794,-66.0369
    0.0044314,0.0070755,-65.8843
    0.0041792,-0.0040999,-64.7468
    0.0039398,-0.0083723,-65.2462
    0.0045341,-0.0067345,-66.134
    0.0043972,-0.017462,-66.4669
    0.0043558,0.0091193,-68.7696
    0.0047372,0.0067633,-66.6334
    0.0041105,-0.003295,-66.8137
    0.004152,0.0062506,-65.6207
    0.0043251,-0.0089543,-64.4139
    0.004048,-0.0089443,-65.482
    0.0044814,0.021243,-65.371
    0.0044972,0.0062206,-66.1895
    0.0042125,-0.0060468,-67.4657
    0.0045941,-0.001696,-67.0911
    0.0044504,-0.015574,-67.3131
    0.0042516,0.010873,-65.9537
    0.0042283,0.00069826,-65.3294
    0.0040209,-0.0075884,-65.2184
    0.0042322,0.017588,-65.0797
    0.0046712,-0.019778,-66.6334
    0.0042237,0.0066664,-66.3421
    0.0044727,0.028501,-67.4796
    0.0047318,-0.0079854,-67.1328
    0.0039351,0.0094957,-66.1617
    0.004335,0.022909,-65.9398
    0.0044769,-0.0043703,-64.4555
    0.0041044,-0.001986,-65.3294
    0.0048231,0.023936,-66.0924
    0.0045902,0.0015897,-66.5918
    0.0044028,0.01097,-68.3257
    0.0047,0.0097195,-66.6195
    0.0043181,-0.023635,-66.8831
    0.0043604,-0.007966,-65.9537
    0.0046524,-0.0089459,-64.5249
    0.0042599,0.0021704,-65.4543
    0.0044449,0.0049516,-65.5098
    0.0044628,-0.0078797,-66.4253
    0.0042596,-0.0047872,-67.5212
    0.0047347,0.01064,-66.8137
    0.0043973,-0.010622,-67.4102
    0.0042344,0.021881,-66.3976
    0.0045758,0.0062213,-65.4127
    0.0044302,-0.015902,-65.0797
    0.004275,0.011271,-65.2323
    0.0048129,-0.0094669,-66.5363
    0.0043592,-0.0073374,-66.6611
    0.0044803,0.0047882,-67.7986
    0.0045777,-0.0046428,-66.7305
    0.0040829,-0.0091087,-66.1617
    0.0044584,0.003626,-65.9398
    0.0045068,-0.0059609,-64.5526
    0.0042007,0.013363,-65.4127
    0.0046671,-0.0036145,-66.0785
    0.0044172,-0.019604,-66.994
    0.0043849,0.0069483,-68.2425
    0.0047915,-0.018207,-66.6611
    0.0040251,-0.011774,-66.6611
    0.0041447,0.0042927,-65.7456
    0.004618,-0.019381,-64.5249
    0.0039094,-0.0062988,-65.4127
    0.0044787,0.024759,-65.4681
    0.0044859,-0.0035376,-66.6195
    0.0040748,-0.0071321,-67.4379
    0.0046907,0.016193,-66.9663
    0.0044212,-0.0068536,-67.2021
    0.0040881,0.012219,-65.9675
    0.0045039,0.0073159,-65.6346
    0.0041685,-0.012811,-65.3572
    0.0043691,0.01791,-65.6762
    0.0046814,-0.0047478,-66.3559
    0.0041919,-0.0095948,-66.6889
    0.0044857,0.013557,-67.6044
    0.0047077,-0.017045,-66.9524
    0.0040122,-0.013703,-66.453
    0.0043503,0.011901,-65.6901
    0.0043854,-0.0091003,-64.5803
    0.0040964,-0.0030619,-65.3433
    0.0044621,0.010922,-66.2588
    0.0043823,-0.015234,-66.7721
    0.0043401,0.01006,-68.298
    0.0046588,-0.014118,-66.8969
    0.0041869,-0.0081592,-66.8969
    0.0041066,-0.0032463,-65.8565
    0.0045463,-0.009837,-64.5942
    0.0038705,-0.016649,-65.4265
    0.0043316,0.0051654,-65.704
    0.0043303,0.00058075,-66.6195
    0.0041827,-0.0077244,-67.5905
    0.0043924,0.026038,-67.1744
    0.0042696,-0.0022192,-67.3825
    0.0041159,0.01319,-66.453
    0.0044228,0.0031794,-65.371
    0.0042871,0.00028224,-65.0381
    0.0042961,0.020099,-65.4959
    0.0045414,-0.0034207,-66.3421
    0.0042336,-0.020485,-66.8553
    0.0043969,-0.012384,-67.4934
    0.0045912,-0.015408,-67.0773
    0.0040612,0.0043115,-66.3559
    0.004095,0.007946,-65.7317
    0.0040889,-0.003616,-64.5942
    0.0040495,0.002248,-65.3156
    0.0046104,-0.0010946,-66.3005
    0.0043607,0.014525,-66.8692
    0.004339,0.022639,-68.4367
    0.0046271,0.00042635,-66.8692
    0.0040437,-0.01568,-66.7721
    0.0043274,0.014712,-65.9953
    0.0044512,-0.001676,-64.6497
    0.0040564,0.0066864,-65.5375
    0.0043595,0.027658,-65.5375
    0.0045961,-0.016929,-66.1895
    0.0043122,0.010872,-67.4796
    0.0047341,0.011687,-67.1466
    0.0044862,-0.0031194,-67.1744
    0.0040565,0.0017347,-66.2588
    0.0046251,-0.010573,-65.1491
    0.0042434,-0.022105,-65.2739
    0.0043096,0.0079566,-65.6207
    0.0046239,-0.012723,-66.2588
    0.004143,-0.020797,-66.675
    0.0043804,0.007956,-67.0079
    0.0046135,-0.0091693,-66.9663
    0.003853,0.0038752,-66.1479
    0.0042924,0.033501,-65.6485
    0.0043707,-0.0085086,-64.5526
    0.0041629,-0.004419,-64.9965
    0.0046859,0.01378,-66.2727
    0.0046791,-0.0082854,-66.5085
    0.0042443,0.013296,-68.0483
    0.0045428,-0.014748,-66.7305
    0.0040734,-0.021863,-66.4253
    0.0041598,-0.010524,-65.9537
    0.0045834,-0.0075484,-64.3168
    0.0039269,0.0030525,-65.1907
    0.0043361,0.017656,-65.5236
    0.0045107,-0.00312,-66.1479
    0.0041235,-0.004234,-67.6044
    0.0045916,0.01156,-66.6056
    0.0043559,-0.0093513,-67.0357
    0.0041099,0.014507,-66.134
    0.0045156,0.0071321,-65.1768
    0.0041119,-0.003837,-65.3433
    0.0041865,0.014789,-65.0104
    0.0047428,-0.0051173,-66.2866
    0.0042376,-0.021698,-66.4392
    0.0045302,0.0053204,-67.0357
    0.0045926,-0.015031,-66.8969
    0.004028,-0.0058731,-65.9537
    0.0041685,0.0039933,-65.6623
    0.004197,-0.014634,-64.4416
    0.003946,0.0090512,-65.1352
    0.0046488,0.0042834,-66.1895
    0.0043015,0.0097195,-66.3698
    0.0042961,0.0282,-68.1593
    0.0047374,0.01752,-66.6056
    0.0040965,-0.0067645,-66.453
    0.004324,0.016571,-65.8011
    0.0046482,-0.012066,-64.0255
    0.0042263,-0.0074715,-65.2601
    0.0045406,0.0093031,-65.5098
    0.0045666,-0.015834,-66.1062
    0.004103,-0.012898,-67.5212
    0.0046178,-0.0030325,-66.6334
    0.0042454,-0.020021,-67.0218
    0.0040706,0.013268,-65.9814
    0.0044418,0.0070652,-65.2184
    0.0041778,-0.021369,-65.2046
    0.0042203,0.026001,-64.9965
    0.0047789,0.0012015,-66.1062
    0.0042161,-0.00043788,-66.5085
    0.0042759,0.01094,-67.0357
    0.0046614,-0.012782,-66.786
    0.0040899,-0.0026661,-65.8565
    0.0044348,0.0026455,-65.593
    0.0043528,0.0050292,-64.2752
    0.0040157,-0.0049422,-64.9271
    0.0047363,0.018034,-65.912
    0.0042584,-0.01004,-66.2033
    0.0043564,-0.00097832,-67.9096
    0.0045823,-0.010078,-66.3143
    0.0043784,-0.025371,-66.3005
    0.0042742,0.023956,-65.704
    0.0043372,-0.0097389,-63.97
    0.0038545,0.0087318,-65.052
    0.0043303,0.0017541,-65.0242
    0.0046417,0.00086268,-66.0508
    0.0042006,0.015011,-67.2576
    0.004722,0.013285,-66.453
    0.0044065,-0.0017341,-66.786
    0.004288,-0.0038564,-65.7178
    0.0045773,0.0016372,-65.1075
    0.0040671,-0.017337,-64.9688
    0.0043026,0.020536,-64.9133
    0.00458,-0.016696,-66.0091
    0.0042268,-0.027163,-66.0785
    0.0044446,0.014682,-67.0357
    0.004735,-0.028151,-66.3143
    0.0040209,0.009797,-65.7594
    0.0041427,0.0013096,-65.3156
    0.004302,-0.017008,-64.0948
    0.0038522,-0.003682,-64.7191
    0.0047367,0.018354,-66.134
    0.0042477,-0.0030631,-65.704
    0.0041725,0.00010627,-67.9928
    0.0046761,0.0081098,-66.7166
    0.0043145,-0.020981,-65.8149
    0.0040937,0.028742,-65.2601
    0.0043037,0.0014828,-64.6358
    0.0039819,-0.013111,-64.7607
    0.004401,0.0028881,-64.9549
    0.0047076,0.0030719,-66.6195
    0.0041962,0.021863,-66.7443
    0.0044663,0.0021904,-66.5085
    0.0043607,0.004399,-67.5212
    0.0041429,0.0039252,-65.3294
    0.0046535,0.0034401,-64.9549
    0.004017,-0.0025199,-64.7191
    0.0044714,0.025264,-64.8162
    0.0047704,-0.0057655,-66.0924
    0.0042513,-0.01883,-66.675
    0.004426,0.019915,-66.7998
    0.004765,-0.019584,-66.3421
    0.0040506,-0.012056,-66.4392
    0.0043522,0.0013278,-65.3433
    0.0043356,-0.0074818,-64.2335
    0.0041581,0.0020642,-64.7607
    0.0045775,0.019129,-65.9814
    0.0042725,-0.0068808,-66.4253
    0.0043063,0.0038183,-67.8957
    0.004798,-0.021794,-66.3559
    0.0043567,-0.012045,-66.3005
    0.0042421,0.015245,-65.7872
    0.0043967,-0.017288,-63.8174
    

    With scaled data, the training did not converge to optimality within 500 iterations, but it no longer failed as before. Setting m.options.MAX_ITER to a reasonable level and reporting the best solution found with m.solve(debug=0) is one way to terminate early without converging to optimality; a sketch follows below.
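
    A minimal sketch of that early stop, assuming the Brain object exposes its underlying GEKKO model as b.m (it does in current Gekko versions) and that learn() raises an exception when the solver stops at the iteration limit:

    # cap solver iterations through the Brain's underlying GEKKO model
    b = brain.Brain()
    b.input_layer(2)
    b.layer(relu=61)
    b.output_layer(1)
    b.m.options.MAX_ITER = 500

    try:
        b.learn(train_x, train_y, obj=2)
    except Exception:
        # learn() raises 'Solution Not Found' if the solver hits
        # MAX_ITER before reaching optimality
        print('stopped early at MAX_ITER iterations')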

    The overall recommendation is to import the trained model from Keras / TensorFlow as shown above, instead of refitting it with the Gekko brain class.