How do I send a confusion matrix to caret's confusionMatrix?


I'm looking at this data set: https://archive.ics.uci.edu/ml/datasets/Credit+Approval. I built a ctree:

library(party)              # ctree() comes from party (partykit also provides one)
myFormula <- class ~ .      # class is a factor of "+" or "-"
ct <- ctree(myFormula, data = train)
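
train and test are just a split of that UCI data. Something along these lines would produce them (the 70/30 proportion, the seed, and the use of createDataPartition are placeholders rather than my exact code):

library(caret)                 # createDataPartition()

set.seed(42)
# df is assumed to hold the credit-approval data with the outcome column named "class"
inTrain <- createDataPartition(df$class, p = 0.7, list = FALSE)
train <- df[inTrain, ]
test  <- df[-inTrain, ]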

And now I'd like to put that data into caret's confusionMatrix method to get all the stats associated with the confusion matrix:

testPred <- predict(ct, newdata = test)

                #### This is where I'm doing something wrong ####
confusionMatrix(table(testPred, test$class),positive="+")
          ####  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ####

$positive
[1] "+"

$table
        td
testPred  -  +
       - 99  6
       + 20 88

$overall
      Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull AccuracyPValue  McnemarPValue 
  8.779343e-01   7.562715e-01   8.262795e-01   9.186911e-01   5.586854e-01   6.426168e-24   1.078745e-02 

$byClass
         Sensitivity          Specificity       Pos Pred Value       Neg Pred Value            Precision               Recall                   F1 
           0.9361702            0.8319328            0.8148148            0.9428571            0.8148148            0.9361702            0.8712871 
          Prevalence       Detection Rate Detection Prevalence    Balanced Accuracy 
           0.4413146            0.4131455            0.5070423            0.8840515 

$mode
[1] "sens_spec"

$dots
list()

attr(,"class")
[1] "confusionMatrix"

So Sensitivity is:

Sensitivity = A/(A+C)

(from caret's confusionMatrix doc, where the 2x2 table has the predicted classes as rows and the reference classes as columns: A and D are the correct predictions, B and C the errors)

If you take my confusion matrix:

$table
        td
testPred  -  +
       - 99  6
       + 20 88

You can see this doesn't add up: Sensitivity = A/(A+C) = 99/(99+20) = 99/119 = 0.8319328. In my confusionMatrix results, that value is reported as Specificity. Likewise, Specificity = D/(B+D) = 88/(88+6) = 88/94 = 0.9361702, which is the value reported for Sensitivity.
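
Just to spell out that arithmetic in R (tab rebuilds my table by hand; each column sum is the actual count of one class):

tab <- matrix(c(99, 20, 6, 88), nrow = 2,
              dimnames = list(testPred = c("-", "+"), td = c("-", "+")))
tab["-", "-"] / sum(tab[, "-"])   # 99/119 = 0.8319328, reported under Specificity
tab["+", "+"] / sum(tab[, "+"])   # 88/94  = 0.9361702, reported under Sensitivity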

I've tried this confusionMatrix(td,testPred, positive="+") but got even weirder results. What am I doing wrong?

UPDATE: I also realized that my confusion matrix is different than what caret thought it was:

   Mine:               Caret:

            td             testPred
   testPred  -  +      td   -  +
          - 99  6        - 99 20
          + 20 88        +  6 88 

As you can see, it thinks my False Positive and False Negative are backwards.
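
A quick way to see the two orientations side by side is to print my table and its transpose (base R, same objects as above):

table(testPred, test$class)      # rows = my predictions, columns = actual class
t(table(testPred, test$class))   # rows = actual class, columns = predictions (what caret seems to show)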

UPDATE: I found it's a lot better to send the data itself, rather than a table, as the parameter. From the confusionMatrix docs:

reference
a factor of classes to be used as the true results

I took this to mean what symbol constitutes a positive outcome. In my case, this would have been a +. However, 'reference' refers to the actual outcomes from the data set, aka the dependent variable.

So I should have used confusionMatrix(testPred, test$class). If your data is out of order for some reason, it will shift it into the correct order (so the positive and negative outcomes/predictions align correctly in the confusion matrix).
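
In other words, something like this (positive is optional here; I'm adding it just to make "+" the positive class explicitly):

library(caret)
cm <- confusionMatrix(data = testPred, reference = test$class, positive = "+")
cm$table
cm$byClass["Sensitivity"]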

However, if you are worried about the outcome being the correct factor, install the plyr library, and use revalue to change the factor:

install.packages("plyr")
library(plyr)
newDF <- df
newDF$class <- revalue(newDF$class, c("+" = "1", "-" = "0"))  # revalue expects character replacement values
# You'd have to rerun your model using newDF


I'm not sure why this worked, but I just removed the positive parameter:

confusionMatrix(table(testPred, test$class))

My Confusion Matrix:

        td
testPred  -  +
       - 99  6
       + 20 88

Caret's Confusion Matrix:

        td
testPred  -  +
       - 99  6
       + 20 88

Although now it says $positive: "-" so I'm not sure if that's good or bad.
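
From the confusionMatrix doc, when positive isn't supplied the first factor level is used as the positive class, and "-" sorts before "+", which would explain the $positive: "-". If I wanted "+" reported as the positive class instead, either of these should work (a sketch; it assumes test$class and testPred are factors with levels "-" and "+"):

# state the positive class explicitly
confusionMatrix(table(testPred, test$class), positive = "+")

# or make "+" the first level so it becomes the default positive class
test$class <- relevel(test$class, ref = "+")
testPred   <- relevel(testPred,   ref = "+")
confusionMatrix(testPred, test$class)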