I am looking for an alternative to MLJLinearModels for linear regression in Julia. I have read a few blog posts describing Flux.jl
as a more powerful package for ML tasks. However, I have not found a well-detailed tutorial aimed specifically at linear regression.
Please recommend either an approach to this task or sources that can help me understand the process in detail.
I recommend using GLM for this job. For example:
using GLM, DataFrames

a, b = 2, 3;                                                      # true slope and intercept
dat = DataFrame(x=1:1000, y=a .* (1:1000) .+ b .+ randn(1000))    # y = 2x + 3 + Gaussian noise
ols = lm(@formula(y ~ x), dat)                                    # ordinary least squares fit
which yields:
StatsModels.TableRegressionModel{LinearModel{GLM.LmResp{Vector{Float64}}, GLM.DensePredChol{Float64, LinearAlgebra.Cholesky{Float64, Matrix{Float64}}}}, Matrix{Float64}}

y ~ 1 + x

Coefficients:
───────────────────────────────────────────────────────────────────────────
                 Coef.   Std. Error         t  Pr(>|t|)  Lower 95%  Upper 95%
───────────────────────────────────────────────────────────────────────────
(Intercept)  3.0607     0.0641567       47.71    <1e-99   2.9348     3.1866
x            1.99989    0.000111039  18010.65    <1e-99   1.99967    2.00011
───────────────────────────────────────────────────────────────────────────
You can always just get the coefficients:
julia> coef(ols)
2-element Vector{Float64}:
3.0607023932922464
1.9998906641774181
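If I remember the GLM interface correctly, the fitted object also works with the other usual StatsAPI accessors, for example:

predict(ols)      # fitted values for the data in dat
r2(ols)           # coefficient of determination
stderror(ols)     # standard errors of the coefficients

Since you mention Flux.jl: linear regression there is just a Dense(1 => 1) layer trained by gradient descent. A minimal sketch, assuming Flux's explicit setup/train! API and synthetic data kept in [0, 1] so plain gradient descent converges without feature scaling (the names x, y, model, loss here are my own illustration, not anything from GLM):

using Flux

x = rand(Float32, 1, 200)                                  # 1×200 matrix: (features, samples)
y = 2f0 .* x .+ 3f0 .+ 0.05f0 .* randn(Float32, 1, 200)    # y ≈ 2x + 3 plus noise

model = Dense(1 => 1)                                      # y = W*x + b: one weight, one bias
loss(m, x, y) = Flux.Losses.mse(m(x), y)                   # mean squared error

opt_state = Flux.setup(Descent(0.1), model)                # plain gradient descent
for epoch in 1:5_000
    Flux.train!(loss, model, [(x, y)], opt_state)          # one full-batch step per epoch
end

model.weight, model.bias                                   # should approach ≈ 2 and ≈ 3

That said, for a plain linear regression GLM gives you the coefficient table, standard errors, and confidence intervals directly, which you would have to compute yourself with Flux.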