
Plotting outputs of 3D quadratic discriminant analysis in MATLAB


I am trying to plot the outputs of quadratic discriminant analysis (QDA) in MATLAB. I have previously asked about linear discriminant analysis (LDA question for reference), but now I wish to investigate quadratic discriminant analysis. For LDA I used fimplicit3 to plot the plane, but this requires knowing the form of the equation for the function to evaluate, and I am struggling to work out what that should be for QDA. The code below works for LDA, but when I change to QDA it is clearly incorrect (when you run the code you will see the planes intersect the data), because it does not include the extra quadratic coefficients introduced by QDA. I have tried a few things, all unsuccessfully, which I have omitted for clarity.

In summary, what is the equation I need to pass to fimplicit3 at lines 135, 137, and 139 (the three fimplicit3 calls) to plot a curved surface? Is there a better way of doing this?

Beneath the code sample I have included the equations for the 2D linear and quadratic cases, in case this helps.

clear
clc
close all

%----------------------------
x1 = 1:0.01:3;
r = -1 + (1+1)*rand(1,201);
x1 = x1 + r;

y1 = 1:0.01:3;
r = -1 + (1+1)*rand(1,201);
y1 = y1 + r;

z1 = 1:0.01:3;
r = -1 + (1+1)*rand(1,201);
z1 = z1 + r;

x1 = x1';
y1 = y1';
z1 = z1';
label1 = ones(length(y1),1);

Tclust1 = [label1,x1,y1,z1];
%----------------------------
x2 = 4:0.01:6;
r = -1 + (1+1)*rand(1,201);
x2 = x2 + r;

y2 = 10:0.01:12;
r = -1 + (1+1)*rand(1,201);
y2 = y2 + r;

z2 = 10:0.01:12;
r = -1 + (1+1)*rand(1,201);
z2 = z2 + r;

x2 = x2';
y2 = y2';
z2 = z2';

label2 = 2.*ones(length(y2),1);

Tclust2 = [label2,x2,y2,z2];

%----------------------------
x3 = 5:0.01:7;
r = -1 + (1+1)*rand(1,201);
x3 = x3 + r;

y3 = 13:0.01:15;
r = -1 + (1+1)*rand(1,201);
y3 = y3 + r;

z3 = 13:0.01:15;
r = -1 + (1+1)*rand(1,201);
z3 = z3 + r;

x3 = x3';
y3 = y3';
z3 = z3';

label3 = 3.*ones(length(y3),1);

Tclust3 = [label3,x3,y3,z3];

%----------------------------
T = [Tclust1;Tclust2;Tclust3];

data = T(:,2:4);
labels = T(:,1);

%LOOCV
n = length(labels);
hpartition = cvpartition(n,'leaveout');

% figure()
output = zeros(n,3);

for qq = 1:n
    % training data
    idx_Train = training(hpartition,qq);
    data_Train = data(idx_Train,1:3);
    labels_Train = labels(idx_Train);

    % test data
    idx_Test = test(hpartition,qq);
    data_Test = data(idx_Test,1:3);
    labels_Test = labels(idx_Test);

    % train the model on the training data
    Mdlquad3D = fitcdiscr(data_Train,labels_Train,'DiscrimType','quadratic');

    % apply to the test data
    pred_lbl = predict(Mdlquad3D,data_Test);

    % store the true and predicted labels so the accuracy can be computed below
    output(qq,1) = labels_Test;
    output(qq,2) = pred_lbl;

    % display progress (percent complete)
    (qq/n)*100
end

iscorrect=output(:,1)==output(:,2);
accuracy_full = (sum(iscorrect)/length(iscorrect))*100;

%plot data with boundaries -----------------------------------------------
data = T(:,2:4);
labels = T(:,1);
xmin = min(T(:,2));
xmax = max(T(:,2));
ymin = min(T(:,3));
ymax = max(T(:,3));
zmin = min(T(:,4));
zmax = max(T(:,4));

%group 1 and group 2
K12 = Mdlquad3D.Coeffs(1,2).Const;
L12 = Mdlquad3D.Coeffs(1,2).Linear;
Q12 = Mdlquad3D.Coeffs(1,2).Quadratic;

%group 2 and group 3
K23 = Mdlquad3D.Coeffs(2,3).Const;
L23 = Mdlquad3D.Coeffs(2,3).Linear;
Q23 = Mdlquad3D.Coeffs(2,3).Quadratic;

%group 1 and group 3
K13 = Mdlquad3D.Coeffs(1,3).Const;
L13 = Mdlquad3D.Coeffs(1,3).Linear;
Q13 = Mdlquad3D.Coeffs(1,3).Quadratic;

figure()
scatter3(x1,y1,z1,'g')
hold on
scatter3(x2,y2,z2,'b')
scatter3(x3,y3,z3,'r')

%THESE ARE CORRECT FOR LDA - HOW TO INCLUDE Qxx? Is there a better way
%than fimplicit3?
f12 = @(x1,x2,x3) K12 + L12(1)*x1 + L12(2)*x2 + L12(3)*x3;
h2 = fimplicit3(f12,[xmin xmax ymin ymax zmin zmax]);
f23 = @(x1,x2,x3) K23 + L23(1)*x1 + L23(2)*x2 + L23(3)*x3;
h3 = fimplicit3(f23,[xmin xmax ymin ymax zmin zmax]);
f13 = @(x1,x2,x3) K13 + L13(1)*x1 + L13(2)*x2 + L13(3)*x3;
h4 = fimplicit3(f13,[xmin xmax ymin ymax zmin zmax]);

title('Full data set')
xlabel('x')
ylabel('y')
zlabel('z')
legend('Data 1','Data 2','Data 3','Boundary 1 -> 2','Boundary 2 -> 3','Boundary 1 -> 3','location','eastoutside') 

For reference:

2D linear:

K = MdlLinear.Coeffs(1,2).Const;
L = MdlLinear.Coeffs(1,2).Linear;
f12 = @(x1,x2) K + L(1)*x1 + L(2)*x2;
h2 = fimplicit(f12,[xmin xmax ymin ymax]);

2D Quadratic:

K = MdlQuad.Coeffs(1,2).Const;
L = MdlQuad.Coeffs(1,2).Linear;
Q = MdlQuad.Coeffs(1,2).Quadratic;
f12 = @(x1,x2) K + L(1)*x1 + L(2)*x2 + Q(1,1)*x1.^2 + (Q(1,2)+Q(2,1))*x1.*x2 + Q(2,2)*x2.^2;
h2 = fimplicit(f12,[xmin xmax ymin ymax]);

Solution

  • From the ClassificationDiscriminant documentation, we have

    Const — A scalar
    Linear — A vector with p components, where p is the number of columns in X
    Quadratic — p-by-p matrix, exists for quadratic DiscrimType

    The equation of the boundary between class i and class j is
    Const + Linear * x + x' * Quadratic * x = 0
    where x is a column vector of length p.

    You could expand the quadratic term by writing out x'*Q*x coefficient by coefficient, just as you have already done (perhaps without realising it) for the linear term L*x.
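
    For example, with p = 3 that written-out form for the class 1 / class 2 boundary is an anonymous function like the one below, mirroring your 2D quadratic example (a sketch using the K12, L12 and Q12 coefficients you already extract in your script):

    f12 = @(x1,x2,x3) K12 ...
        + L12(1)*x1 + L12(2)*x2 + L12(3)*x3 ...
        + Q12(1,1)*x1.^2 + Q12(2,2)*x2.^2 + Q12(3,3)*x3.^2 ...
        + (Q12(1,2)+Q12(2,1))*x1.*x2 ...
        + (Q12(1,3)+Q12(3,1))*x1.*x3 ...
        + (Q12(2,3)+Q12(3,2))*x2.*x3;
    h2 = fimplicit3(f12,[xmin xmax ymin ymax zmin zmax]);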

    However, it's easier to just treat this as a matrix equation and write a function which accepts x1,x2,x3 and evaluates the boundary using the matrices.

    You can do this by writing a local function fKLQ(K,L,Q,x1,x2,x3) and wrapping it with your existing anonymous function handles f12, f23 and f13, like so:

    f12 = @(x1,x2,x3) fKLQ( K12, L12, Q12, x1, x2, x3 );
    h2 = fimplicit3(f12,[xmin xmax ymin ymax zmin zmax]);
    f23 = @(x1,x2,x3) fKLQ( K23, L23, Q23, x1, x2, x3 );
    h3 = fimplicit3(f23,[xmin xmax ymin ymax zmin zmax]);
    f13 = @(x1,x2,x3) fKLQ( K13, L13, Q13, x1, x2, x3 );
    h4 = fimplicit3(f13,[xmin xmax ymin ymax zmin zmax]);
    
    function f = fKLQ( K, L, Q, x1, x2, x3 )
        % Arrange the evaluation points as the columns of a 3-by-n matrix,
        % whatever shape fimplicit3 passes in (scalars or arrays)
        x = [x1(:), x2(:), x3(:)].';
        n = numel(x1);
        % Evaluate Const + Linear*x + x'*Quadratic*x at every point
        f = zeros(1,n);
        for ii = 1:n
            f(ii) = K + L.'*x(:,ii) + x(:,ii).'*Q*x(:,ii);
        end
        % Return the result in the same shape as the inputs so fimplicit3
        % can use its vectorised evaluation path
        f = reshape(f, size(x1));
    end
    

    The x1,x2,x3 inputs that fimplicit3 passes in can be scalars or whole arrays, so the function flattens them into the columns of x and then reshapes the result to match the input shape. There may be a neater way to do this, but I believe it at least works like this. If you are running this from a script, remember that a local function such as fKLQ has to go at the end of the script file (scripts support local functions from R2016b onwards).
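
    As a quick sanity check of that shape handling (a throwaway snippet, assuming f12 has been defined via fKLQ as above), the output follows the shape of the inputs for both scalars and arrays:

    f12(1, 2, 3)                               % scalar inputs -> scalar output
    size(f12(rand(4,5), rand(4,5), rand(4,5))) % array inputs  -> [4 5]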

    The output divides your clusters as shown.

    [plot: the three clusters separated by the curved QDA boundary surfaces]

    Note: fimplicit3 expects the function to be vectorised, i.e. to return an output with the same size and shape as its array inputs. If it is not, MATLAB warns that the function behaves unexpectedly on array inputs and falls back to slower element-wise evaluation; the plotted surface is still correct, so the warning is benign. Reshaping f to match the inputs, as in fKLQ above, or writing out each coefficient explicitly as in the expanded f12 earlier, keeps that warning away.