Tags: python, scipy, minimization, log-likelihood

Correct scipy.optimize.minimize structure


I'm having trouble understanding how scipy.optimize works.
Specifically, I'm trying to use optimize.minimize() to find the minimum of a function that computes the negative log-likelihood of a dataset under a binomial distribution. I've read many things online and in books and I still can't figure out what I'm doing wrong — I'm stuck! This is my code:

from scipy.stats import binom
from scipy.optimize import minimize
import numpy as np
import plotly.express as px
import pandas as pd

n=20
p=0.45
## data
d1 = binom.rvs(n, p, size=200)

def likelihood(data, n, p):
    return -sum(binom.logpmf(data, n, p))

def opt_like(parameters, constants):
    p = parameters
    n, data = constants[0], constants[1]
    return likelihood(data, n, p)

minimize(opt_like, 0.5, args=[n, d1])

I hope that's clear. Do you know what causes the optimization to fail?

NB
I did read the documentation; it just looks to me like I did everything it says: parameters and hyperparameters are separated, and the function's output has the same shape as the input (both are scalars).


Solution

  • The thing is that your logpmf-based objective is quite sensitive. It is nearly flat (and so is its derivative) over a wide range of p, then grows very fast to huge values, and even returns NaN.

    Plus, obviously, your parameter is a probability, so it is supposed to stay between 0 and 1. Most unconstrained methods will, within a single iteration, send your p to something like -1000 or +1000, which is nonsense for a probability, and the process then gets trapped in a loop of trial points whose results are all the same (NaN).

    So you need to choose a method that accepts bounds, and that can take advantage of the monotonicity of your function's derivative.

    For example

    minimize(opt_like, 0.5, args=[n, d1], bounds=[(0,1)], method='SLSQP')
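    To see why an unconstrained method gets stuck, evaluate logpmf at a p outside [0, 1] — scipy.stats.binom returns NaN for invalid parameters rather than raising, so every out-of-range trial point looks identical to the optimizer:

    ```python
    import numpy as np
    from scipy.stats import binom

    # binom.logpmf is finite only for p in [0, 1]; for any p outside
    # that range scipy returns NaN instead of raising an error.
    print(binom.logpmf(9, 20, 0.45))   # a finite log-probability
    print(binom.logpmf(9, 20, 1.5))    # nan
    print(binom.logpmf(9, 20, -3.0))   # nan
    ```

    Once the optimizer's line search lands in NaN territory, every comparison between trial points is inconclusive, which is the "loop of identical results" described above.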
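    Putting it all together, here is a minimal end-to-end sketch (the fixed random_state is my addition, so the run is reproducible). As a sanity check it compares the optimizer's answer against the closed-form binomial MLE, which is the sample mean divided by n — the two should agree:

    ```python
    import numpy as np
    from scipy.stats import binom
    from scipy.optimize import minimize

    n = 20
    p = 0.45
    d1 = binom.rvs(n, p, size=200, random_state=42)  # reproducible sample

    def likelihood(data, n, p):
        # negative log-likelihood of the sample under Binomial(n, p)
        return -np.sum(binom.logpmf(data, n, p))

    def opt_like(parameters, constants):
        p = parameters
        n, data = constants[0], constants[1]
        return likelihood(data, n, p)

    # Bounded minimization keeps every trial p inside [0, 1],
    # so the objective never evaluates to NaN.
    res = minimize(opt_like, 0.5, args=[n, d1],
                   bounds=[(0, 1)], method='SLSQP')

    # The closed-form MLE for a binomial probability is mean(data) / n;
    # the optimizer's estimate should land right on top of it.
    print(res.x[0], d1.mean() / n)
    ```

    Note that passing args=[n, d1] works because minimize wraps a non-tuple args into a one-element tuple, so opt_like receives the whole list as its single constants argument.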