  • kNN in Python, Part 2: a Handwriting Recognition System

    2015-09-21 16:16:17

    The system we build recognizes the digits 0 through 9; the images have already been converted to text format.

    To use the classifier built earlier, each image must first be formatted as a single vector.

    def img2vector(filename):
        returnVect = zeros((1,1024)) # zeros comes from numpy
        fr = open(filename)
        for i in range(32): # each image is 32x32
            lineStr = fr.readline()
            for j in range(32):
                returnVect[0,32*i+j] = int(lineStr[j])
        return returnVect

    Now that the data is in a format the classifier can consume, we feed it to the classifier and measure its accuracy:

    def handwritingClassTest():
        hwLabels = []
        trainingFileList = listdir('trainingDigits') # requires "from os import listdir" at the top of the file
        m = len(trainingFileList)
        trainingMat = zeros((m,1024)) # one row per image
        for i in range(m):
            fileNameStr = trainingFileList[i]
            fileStr = fileNameStr.split('.')[0]
            classNumStr = int(fileStr.split('_')[0]) # two splits recover the class label from the file name
            hwLabels.append(classNumStr)
            trainingMat[i,:] = img2vector('trainingDigits/%s' % fileNameStr)
        testFileList = listdir('testDigits')
        errorCount = 0.0
        mTest = len(testFileList)
        for i in range(mTest):
            fileNameStr = testFileList[i]
            fileStr = fileNameStr.split('.')[0]
            classNumStr = int(fileStr.split('_')[0])
            vectorUnderTest = img2vector('testDigits/%s' % fileNameStr)
            classifierResult = classify0(vectorUnderTest, trainingMat, hwLabels, 3) # values are already 0 or 1, so no normalization is needed
            print('the classifier came back with: %d, the real answer is: %d' % (classifierResult, classNumStr))
            if classifierResult != classNumStr: errorCount += 1.0
        print('\nthe total number of errors is: %d' % errorCount)
        print('\nthe total error rate is: %f' % (errorCount/float(mTest)))

    At the Python prompt, enter:

    >>> reload(kNN)
    <module 'kNN' from 'C:\Users\mrzhang\Desktop\prac\python\kNN.py'>
    >>> kNN.handwritingClassTest()
    

    The test output:

    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    ..... (some lines omitted)
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    
    the total number of errors is: 11
    
    the total error rate is: 0.011628

    The resulting error rate is 1.16%.

    How can the computation be made faster? KD-trees.
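    The KD-tree idea can be sketched with SciPy's cKDTree (this assumes SciPy is available; it is not part of the original article): build the tree once over the training vectors, then each query finds the k nearest neighbors in roughly logarithmic time instead of scanning every row.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
train = rng.random((100, 2))                 # toy training set in the unit square
labels = (train[:, 0] > 0.5).astype(int)     # label = 1 when x-coordinate > 0.5

tree = cKDTree(train)                        # built once, O(n log n)
dist, idx = tree.query([0.9, 0.9], k=3)      # k nearest neighbors of the query point
pred = np.bincount(labels[idx]).argmax()     # majority vote, as in classify0
print(pred)  # 1: the neighbors of (0.9, 0.9) all lie right of x = 0.5
```

    For the 32x32 digit vectors the payoff is smaller, since KD-trees degrade in very high dimensions, but for low-dimensional features like the dating data it is a large speedup.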


  • KNN Algorithm in Python

    2018-12-05 15:25:51

    A Python implementation of the KNN algorithm

    Algorithm overview

    Strengths and weaknesses

    • Pros: high accuracy, insensitive to outliers, no assumptions about the input data.
    • Cons: high computational complexity, high space complexity.
    • Applicable data: numeric and nominal values.

    Workflow

    • 1 Collect the data: any method works.
    • 2 Prepare the data: the numeric values needed for distance computation, ideally in a structured format.
    • 3 Analyze the data: any method works.
    • 4 Train the algorithm: this step does not apply to kNN.
    • 5 Test the algorithm: compute the error rate.
    • 6 Use the algorithm: supply sample data and structured output, run the k-nearest-neighbors classifier to determine which class each input belongs to, then act on the computed class.

    Setting up

    Importing the data with Python

    from numpy import *
    import operator
    
    def createDataSet():
        group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
        labels = ['A', 'A', 'B', 'B']
        return group, labels
    
    group, labels = createDataSet()
    
    group
    
    array([[1. , 1.1],
           [1. , 1. ],
           [0. , 0. ],
           [0. , 0.1]])
    
    labels
    
    ['A', 'A', 'B', 'B']
    

    Implementing the KNN classifier

    For each point in the dataset of unknown class, do the following:

    • compute the distance from every point in the labeled dataset to the current point;
    • sort by increasing distance;
    • take the k points closest to the current point;
    • find how often each class occurs among those k points;
    • return the most frequent class among the k points as the prediction for the current point.
    # k-nearest-neighbors classifier
    def classify0(inX, dataSet, labels, k):
        dataSetSize = dataSet.shape[0]
        diffMat = tile(inX, (dataSetSize, 1)) - dataSet # see the tile examples below
        sqDiffMat = diffMat ** 2
        sqDistances = sqDiffMat.sum(axis = 1)
        distances = sqDistances**0.5
        sortedDistIndicies = distances.argsort() # argsort returns the indices that would sort the array ascending
        classCount = {}
        
        for i in range(k):
            voteLabel = labels[sortedDistIndicies[i]]
            classCount[voteLabel] = classCount.get(voteLabel, 0) + 1
            
        sortedClassCount = sorted(classCount.items(), key = operator.itemgetter(1), reverse = True)
        return sortedClassCount[0][0]
    

    classify0() takes four input parameters: inX is the input vector to classify, dataSet is the training sample set, labels is the label vector, and k is the number of nearest neighbors used in the vote; the label vector must have as many elements as dataSet has rows.
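    As a side note, the vote-counting loop in classify0 can also be written with the standard library's collections.Counter; this is a stylistic alternative, not the book's code:

```python
from collections import Counter

def majority_vote(neighbor_labels):
    # most_common(1) returns [(label, count)] for the most frequent label
    return Counter(neighbor_labels).most_common(1)[0][0]

print(majority_vote(['B', 'B', 'A']))  # prints B
```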

    tile([1,2],2)
    
    array([1, 2, 1, 2])
    

    Output: [1, 2, 1, 2]

    tile([1,2],(2,2))
    
    array([[1, 2, 1, 2],
           [1, 2, 1, 2]])
    

    The replication proceeds as: [1,2] => [[1,2] , [1,2]] => [[1,2,1,2] , [1,2,1,2]]

    tile([1,2],(2,2,3))
    
    array([[[1, 2, 1, 2, 1, 2],
            [1, 2, 1, 2, 1, 2]],
    
           [[1, 2, 1, 2, 1, 2],
            [1, 2, 1, 2, 1, 2]]])
    

    The replication proceeds as: [1,2] => [[1,2] , [1,2]] => [[[1,2],[1,2]] , [[1,2],[1,2]]] => [[[1,2,1,2,1,2],[1,2,1,2,1,2]] , [[1,2,1,2,1,2],[1,2,1,2,1,2]]]
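    tile works, but NumPy broadcasting actually makes it unnecessary here: subtracting a 1-D point from a 2-D array automatically repeats the point across the rows. A small sketch of the same distance computation, recreating the toy group dataset inline so the snippet is self-contained:

```python
from numpy import array

dataSet = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])  # same data as createDataSet
inX = array([0, 0])

# (4,2) minus (2,) broadcasts inX across all four rows -- no tile needed
diffMat = inX - dataSet
distances = ((diffMat ** 2).sum(axis=1)) ** 0.5
print(distances.argsort())  # nearest-to-farthest indices: [2 3 1 0]
```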

    # sortedDistIndicies = ((((tile([0, 0], (4, 1)) - group) ** 2).sum(axis = 1))**0.5).argsort()
    classify0([0, 0], group, labels, 3)
    
    'B'
    

    Example: improving dating-site matches with the k-nearest-neighbors algorithm

    • Collect the data: provided as a text file.
    • Prepare the data: parse the text file with Python.
    • Analyze the data: draw 2-D scatter plots with Matplotlib.
    • Train the algorithm: this step does not apply to kNN.
    • Test the algorithm: use part of the data Helen provides as test samples.
      What distinguishes test samples is that they are already labeled; if a prediction
      disagrees with the actual class, it counts as one error.
    • Use the algorithm: build a simple command-line program so Helen can enter a few feature values and judge whether the other person is a type she would like.

    Collecting the data

    The data lives in data/datingTestSet2.txt: 1000 rows, each with 3 features:

    • frequent-flier miles earned per year
    • percentage of time spent playing video games
    • liters of ice cream consumed per week
    # read the data file
    def file2matrix(filename):
        data = open(filename)
        lines = data.readlines()
        numberOfLines = len(lines)
        returnMat = zeros((numberOfLines, 3))
        classLabelVector = []
        index = 0
        for line in lines:
            line = line.strip()
            listFromLine = line.split('\t')
            returnMat[index, :] = listFromLine[0:3]
            classLabelVector.append(int(listFromLine[-1]))
            index += 1
        return returnMat, classLabelVector
    
    filename = 'data/datingTestSet2.txt'
    datingDataMat, datingLabels = file2matrix(filename)
    
    datingDataMat
    
    array([[4.0920000e+04, 8.3269760e+00, 9.5395200e-01],
           [1.4488000e+04, 7.1534690e+00, 1.6739040e+00],
           [2.6052000e+04, 1.4418710e+00, 8.0512400e-01],
           ...,
           [2.6575000e+04, 1.0650102e+01, 8.6662700e-01],
           [4.8111000e+04, 9.1345280e+00, 7.2804500e-01],
           [4.3757000e+04, 7.8826010e+00, 1.3324460e+00]])
    
    datingLabels[:20]
    
    [3, 2, 1, 1, 1, 1, 3, 3, 1, 3, 1, 1, 2, 1, 1, 1, 1, 1, 2, 3]
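    Since datingTestSet2.txt is a plain tab-separated numeric table, the hand-written parser above can also be replaced by numpy.loadtxt; this is a shortcut, not the book's code, demonstrated here on a two-row inline sample rather than the real file:

```python
import io
from numpy import loadtxt

# two rows in the same tab-separated layout as datingTestSet2.txt
sample = "40920\t8.326976\t0.953952\t3\n14488\t7.153469\t1.673904\t2\n"
data = loadtxt(io.StringIO(sample))          # whitespace/tab delimited by default
returnMat = data[:, 0:3]                     # the three feature columns
classLabelVector = data[:, -1].astype(int)   # the last column holds the label
print(classLabelVector)  # prints [3 2]
```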
    

    Analyzing the data

    %matplotlib inline
    import matplotlib
    import matplotlib.pyplot as plt
    
    # 散点图
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.scatter(datingDataMat[:, 1], datingDataMat[:, 2])
    plt.show()
    

    [Figure: scatter plot of video-game time vs. ice cream consumption]

    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.scatter(datingDataMat[:, 1], datingDataMat[:, 2], 15.0 * array(datingLabels), 15.0 * array(datingLabels))
    plt.show()
    

    [Figure: the same scatter plot, with point color and size scaled by class label]

    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.scatter(datingDataMat[:, 0], datingDataMat[:, 1], 15.0 * array(datingLabels), 15.0 * array(datingLabels))
    plt.show()
    

    [Figure: flier miles vs. video-game time, with point color and size scaled by class label]

    Normalizing the data

    When distances are computed, the feature with the largest numeric range dominates the result, which weights the features unfairly, so the data must be normalized before computing distances.

    We use min-max normalization:

    newValue = (oldValue - min)/(max - min)
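    To see why this matters, compare the distance between two made-up samples before and after scaling; the numbers below are invented for illustration, with ranges roughly matching this dataset:

```python
from numpy import array

# two invented samples: miles, game-time %, ice cream liters
a = array([40000.0, 8.0, 0.5])
b = array([14000.0, 7.0, 1.6])
mins = array([0.0, 0.0, 0.0])
ranges = array([91273.0, 20.9, 1.69])        # roughly this dataset's ranges

raw = (((a - b) ** 2).sum()) ** 0.5
norm = ((((a - mins) / ranges - (b - mins) / ranges) ** 2).sum()) ** 0.5
print(raw)   # ~26000: dominated entirely by the mileage difference
print(norm)  # ~0.71: every feature now contributes
```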

    # normalize feature values
    def autoNorm(dataSet):
        minVals = dataSet.min(0)
        maxVals = dataSet.max(0)
        ranges = maxVals - minVals
        normDataSet = zeros(shape(dataSet))
        m = dataSet.shape[0]
        normDataSet = dataSet - tile(minVals, (m, 1))
        normDataSet = normDataSet / tile(ranges, (m, 1))
        return normDataSet, ranges, minVals
    
    dataSet = group
    dataSet
    
    array([[1. , 1.1],
           [1. , 1. ],
           [0. , 0. ],
           [0. , 0.1]])
    
    minVals = dataSet.min(0);
    minVals
    
    array([0., 0.])
    
    maxVals = dataSet.max(0);
    maxVals
    
    array([1. , 1.1])
    
    ranges = maxVals - minVals;
    ranges
    
    array([1. , 1.1])
    
    normDataSet = zeros(shape(dataSet))
    normDataSet
    
    array([[0., 0.],
           [0., 0.],
           [0., 0.],
           [0., 0.]])
    
    m = dataSet.shape[0]
    normDataSet = dataSet - tile(minVals, (m, 1))
    normDataSet
    
    array([[1. , 1.1],
           [1. , 1. ],
           [0. , 0. ],
           [0. , 0.1]])
    
    normDataSet = normDataSet / tile(ranges, (m, 1))
    normDataSet
    
    array([[1.        , 1.        ],
           [1.        , 0.90909091],
           [0.        , 0.        ],
           [0.        , 0.09090909]])
    
    normMat, ranges, minVals = autoNorm(datingDataMat)
    
    normMat
    
    array([[0.44832535, 0.39805139, 0.56233353],
           [0.15873259, 0.34195467, 0.98724416],
           [0.28542943, 0.06892523, 0.47449629],
           ...,
           [0.29115949, 0.50910294, 0.51079493],
           [0.52711097, 0.43665451, 0.4290048 ],
           [0.47940793, 0.3768091 , 0.78571804]])
    
    ranges
    
    array([9.1273000e+04, 2.0919349e+01, 1.6943610e+00])
    
    minVals
    
    array([0.      , 0.      , 0.001156])
    

    Testing the algorithm

    def datingClassTest(filename):
        hoRatio = 0.10
        datingDataMat, datingLabels = file2matrix(filename)
        normMat, ranges, minVals = autoNorm(datingDataMat)
        m = normMat.shape[0]
        numTestVecs = int(m * hoRatio)
        errorCount = 0.0 
        for i in range(numTestVecs):
            classifierResult = classify0(normMat[i,:], normMat[numTestVecs:m, :], datingLabels[numTestVecs:m], 3)
            print("the classifier came back with: {0}, the real answer is: {1}".format(classifierResult, datingLabels[i]))
            if(classifierResult != datingLabels[i]): errorCount += 1.0
        print("the total error rate is: %f" % (errorCount / float(numTestVecs)))
    
    datingClassTest(filename)
    
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 3
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 2, the real answer is: 2
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 3, the real answer is: 1
    the total error rate is: 0.050000
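    The 5% error rate above was obtained with k = 3 and a 10% holdout, and both are knobs worth varying. A self-contained sketch of sweeping k, using synthetic two-class data and an inline brute-force classifier as stand-ins for the book's files (everything in it is fabricated for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
# two synthetic Gaussian clusters in 2-D, 100 points each, labeled 0 and 1
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def knn_predict(x, trainX, trainY, k):
    # brute-force kNN: distances, take the k smallest, majority vote
    idx = np.sqrt(((trainX - x) ** 2).sum(axis=1)).argsort()[:k]
    return np.bincount(trainY[idx]).argmax()

perm = rng.permutation(len(y))      # shuffle, then hold out the first 20%
X, y = X[perm], y[perm]
n_test = len(y) // 5
rates = {}
for k in (1, 3, 5, 9):
    errors = sum(knn_predict(X[i], X[n_test:], y[n_test:], k) != y[i]
                 for i in range(n_test))
    rates[k] = errors / n_test
    print(k, rates[k])
```

    On well-separated clusters like these, every k gives a low error rate; on messier data the sweep shows the usual trade-off between noisy k = 1 and over-smoothed large k.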
    

    Using the algorithm

    def classifyPerson():
        resultList = ['not at all', 'in small doses', 'in large doses']
        percentTats = float(input("percentage of time spent playing video games:"))
        ffMiles = float(input("frequent flier miles earned per year:"))
        iceCream = float(input("liters of ice cream consumed per year:"))
        datingDataMat, datingLabels = file2matrix('data/datingTestSet2.txt')
        normMat, ranges, minVals = autoNorm(datingDataMat)
        inArr =array([ffMiles, percentTats, iceCream])
        classifierResult = classify0((inArr-minVals)/ranges, normMat,datingLabels, 3 )
        # print(classifierResult)
        print("you will probably like this person: " + resultList[classifierResult - 1])
    
    classifyPerson()
    
    percentage of time spent playing video games:10
    frequent flier miles earned per year:1000
    liters of ice cream consumed per year:0.1
    2
    you will probably like this person: in small doses
    

    Example: a handwriting recognition system

    • Collect the data: provided as text files.
    • Prepare the data: write a function img2vector() to convert the image format into the vector format the classifier uses.
    • Analyze the data: inspect the data at the Python prompt to make sure it meets the requirements.
    • Train the algorithm: this step does not apply to the k-nearest-neighbors algorithm.
    • Test the algorithm: write a function that uses part of the provided dataset as test samples; test samples are already labeled, and a prediction that disagrees with the actual class counts as one error.
    • Use the algorithm: not implemented in this example; if you are interested, build a complete application that extracts digits from images and recognizes them. The US mail-sorting system is a working system of this kind.

    Collecting the data

    The data lives in data/trainingDigits/ and data/testDigits/: roughly 2000 training samples and 900 test samples. Each 32 * 32 binary image matrix is converted into a 1 * 1024 vector:

    Analyzing and preparing the data

    # img2vector: convert an image file to a row vector
    def img2vector(filename):
        returnVect = zeros((1, 1024))
        data = open(filename)
        for i in range(32):
            lineStr = data.readline()
            for j in range(32):
                returnVect[0, 32*i + j] = int(lineStr[j])
        return returnVect
    
    filename = 'data/trainingDigits/0_13.txt'
    testVector = img2vector(filename)
    
    testVector[0, 0:31]
    
    array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 1.,
           1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])
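    The nested loops in img2vector can also be collapsed into a reshape: read the lines, turn each character into an int, and flatten. A sketch on a fabricated 4 x 4 "image" (the real files are 32 x 32; the size is shrunk only for the demo):

```python
from io import StringIO
from numpy import array

fake_file = StringIO("0110\n1001\n1001\n0110\n")   # stand-in for a digit file
rows = [list(line.strip()) for line in fake_file]  # one list of characters per line
vect = array(rows, dtype=int).reshape(1, -1)       # flatten to a 1 x 16 row vector
print(vect.shape)  # prints (1, 16)
```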
    

    Testing the algorithm

    import os
    
    def handWritingClassTest():
        hwLabels = []
        trainingFileList = os.listdir('data/trainingDigits/')
        m = len(trainingFileList)
        trainingMat = zeros((m, 1024))
        for i in range(m):
            fileNameStr = trainingFileList[i]
            filename = 'data/trainingDigits/{}'.format(fileNameStr)
            fileStr = fileNameStr.split('.')[0]
            classNumStr = int(fileStr.split('_')[0]) # class label
            hwLabels.append(classNumStr)
            trainingMat[i, :] = img2vector(filename)
        testFileList = os.listdir('data/testDigits/')
        errorCount = 0
        mTest = len(testFileList)
        for i in range(mTest):
            fileNameStr = testFileList[i]
            filename = 'data/testDigits/{}'.format(fileNameStr)
            fileStr = fileNameStr.split('.')[0]
            classNumStr = int(fileStr.split('_')[0]) # class label
            vectorUnderTest = img2vector(filename)
            classifierResult = classify0(vectorUnderTest, trainingMat, hwLabels, 3 )
            print("the classifier came back with: {0}, the real answer is: {1}".format(classifierResult, classNumStr))
            if(classifierResult != classNumStr): errorCount += 1.0
                
        print("the total number of errors is: %d" % errorCount)
        print("the total error rate is: %f" % (errorCount / float(mTest)))
            
    
    handWritingClassTest()
    
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 0, the real answer is: 0
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 7, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 1, the real answer is: 1
    the classifier came back with: 2, the real answer is: 2
    ……（数字 2 的全部测试样本均预测正确，相同输出行省略）
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 3, the real answer is: 3
    the classifier came back with: 9, the real answer is: 3
    ……（数字 3 的其余测试样本均预测正确，相同输出行省略）
    the classifier came back with: 4, the real answer is: 4
    ……（数字 4 的全部测试样本均预测正确，相同输出行省略）
    the classifier came back with: 5, the real answer is: 5
    ……（中间相同输出行省略）
    the classifier came back with: 3, the real answer is: 5
    the classifier came back with: 6, the real answer is: 5
    ……（数字 5 的其余测试样本均预测正确，相同输出行省略）
    the classifier came back with: 6, the real answer is: 6
    ……（数字 6 的全部测试样本均预测正确，相同输出行省略）
    the classifier came back with: 7, the real answer is: 7
    ……（数字 7 的全部测试样本均预测正确，相同输出行省略）
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 6, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 3, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 1, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 1, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 8, the real answer is: 8
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 1, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 7, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the classifier came back with: 9, the real answer is: 9
    the total number of errors id: 10
    the total error rate is: 0.010571
    
    
    
    A Python Implementation of the kNN Algorithm (Machine Learning)

    I. Theoretical foundations

    1. Distance metrics

    The distance between two points in feature space reflects how similar the two instances are. The Euclidean distance is the usual choice, but other metrics such as cosine distance or Manhattan distance can be used as well.
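As a quick self-contained illustration (the function names below are my own, not from this article's code), the three metrics mentioned above can each be computed in a line or two of NumPy:

```python
import numpy as np

def euclidean(a, b):
    # square root of the summed squared coordinate differences
    return np.sqrt(np.sum((a - b) ** 2))

def manhattan(a, b):
    # sum of the absolute coordinate differences
    return np.sum(np.abs(a - b))

def cosine_distance(a, b):
    # 1 - cosine similarity; 0 for parallel vectors, 1 for orthogonal ones
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(euclidean(a, b))        # 1.4142135...
print(manhattan(a, b))        # 2.0
print(cosine_distance(a, b))  # 1.0
```

Note that the choice of metric changes which points count as "nearest", so it can change the prediction.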

    2. Choosing k


    • A larger k means a simpler model: the approximation error grows while the estimation error shrinks, and the model tends to underfit;
    • A smaller k means a more complex model: the approximation error shrinks while the estimation error grows, the model tends to overfit, and predictions become sensitive to noisy neighboring points.

    Cross-validation is usually used to select the best k.
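A minimal sketch of picking k by cross-validation; it uses scikit-learn's built-in kNN classifier and the iris dataset purely for illustration (the classifier implemented in this article is not used here):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Try odd candidate k values and keep the one with the
# best mean 5-fold cross-validation accuracy.
scores = {}
for k in range(1, 16, 2):
    clf = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(clf, X, y, cv=5).mean()

best_k = max(scores, key=scores.get)
print("best k:", best_k, "accuracy:", round(scores[best_k], 3))
```

Odd k values are used so that a two-class vote cannot tie.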

    3. Decision rule

    Majority vote: the input instance is assigned to the class held by the majority of its k nearest neighbors.
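The majority vote itself is a one-liner with `collections.Counter`; the neighbor labels below are made up for illustration:

```python
from collections import Counter

# labels of the k = 5 nearest neighbors of some query point (made-up values)
neighbor_labels = [2, 1, 2, 2, 1]

# most_common(1) returns [(label, count)] for the top label
vote = Counter(neighbor_labels).most_common(1)[0][0]
print(vote)  # 2
```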

    4. kd-trees

    A kd-tree makes the k-nearest-neighbor search efficient; it is essentially binary search lifted to higher dimensions.
    kd-trees pay off when the number of training instances is much larger than the dimensionality of the space; as the dimensionality approaches the number of instances, the search degrades rapidly toward a linear scan.

    II. Python implementation

    Both brute-force search and kd-tree search are implemented below, but the kd-tree search only finds the single nearest neighbor (k = 1); top-k search with k > 1 is not implemented yet. A rough idea: search the kd-tree k times, deleting the nearest node found after each pass, which yields the top k neighbors but requires implementing deletion and insertion on the kd-tree.

    1. Code

    knn.py

    #encoding=utf-8
    
    '''
    implement the knn algorithm
    '''
    
    import numpy as np
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from scipy.stats import mode
    import matplotlib.pyplot as plt
    
    class KNN:
    
        def __init__(self):
            pass
    
        def predict(self, x_train, y_train, x_test, k=3):
            self.k = k
            m_train = x_train.shape[0]
            m_test = x_test.shape[0]
            x_train = np.mat(x_train)
            y_train = np.mat(y_train)
            x_test = np.mat(x_test)
    
            #1. get the distances between each sample in train samples and each sample in test samples,
            #the distances matrix's shape is (m_test, m_train).
            dists = self.__distance__(x_train, x_test)
            #2. sort the distances by row, and get the sort index
            sort_idx = np.argsort(dists, axis=1)
            #3. get the x index and y index, which is top k distance sample index
            x_idx = np.tile(np.mat(range(m_test)).T, [1, self.k])
            y_idx = sort_idx[:, 0 : self.k]
            #4. get the top k distance labels, and the matrix's shape is (m_test, k)
            labels = np.tile(y_train.T, [m_test, 1])
            p_labels = labels[x_idx, y_idx]
            #5. get the mode of each row, which means the most labels
            y_predict = np.mat(mode(p_labels, axis=1)[0])
            return y_predict
    
        def __distance__(self, x_train, x_test):
            '''
            force compute to get the distance between each sample in train samples and each sample in test samples
            '''
            m_train = x_train.shape[0]
            m_test = x_test.shape[0]
            dists = np.zeros((m_test, m_train))
            count = 0
            for test in x_test:
                test =  np.tile(test, [m_train, 1])
                distance = np.sum(np.multiply(x_train - test, x_train - test), axis=1)
                dists[count] = distance.T
                count += 1
            return dists
    
        def create_kd_tree(self, datalist):
            '''
            create KD tree
            Args:
                data: data list
            '''
            root = KDNode()
            self.build_tree(root, datalist)
            self.kd_tree = root
            return root
    
        def build_tree(self, parent, datalist):
            '''
            recursive build tree function
            Args:
                parent: parent node
            '''
            m = datalist.shape[0]
            #if the length of data is equal to 1, the node is a leaf node
            if m == 1:
                parent.data = datalist
                return
    
            #compute the best split demension by the variance of each demension of the data
            demension = np.argmax(np.var(datalist, axis=0))
            #sort the data by the chosen demension
            sorted_index = np.argsort(datalist[:, demension], axis=0)
            #get the index of the middle value in the datalist
        middle = m // 2  #integer division so the median index stays an int under Python 3
            #get the left data
            l_data = datalist[np.squeeze(sorted_index[0 : middle].getA()), :]
            #get the right data
            r_data = datalist[np.squeeze(sorted_index[middle + 1 : ].getA()), :]
    
            #assign the property of the parent node
            parent.data = datalist[np.squeeze(sorted_index[middle, :].getA())]
            parent.demension = demension
            parent.split_value = datalist[np.squeeze(sorted_index[middle, :].getA()), demension]
    
            #recursive build the child node if the length of rest data is not equal to zero
            if len(l_data) != 0:
                l_node = KDNode()
                parent.left = l_node
                self.build_tree(l_node, l_data)
    
            if len(r_data) != 0:
                r_node = KDNode()
                parent.right = r_node
                self.build_tree(r_node, r_data)
    
        def __distance_by_kd_tree__(self, x_test):
            '''
            get nearest neighbors matrix by kd_tree search
            '''
            m = x_test.shape[0]
            dists = np.zeros((m, 1))
            count = 0
            for x in x_test:
                dists[count] = self.__find_neighbor__(x, self.kd_tree)
                count += 1
            return np.mat(dists)
    
    
        def __find_neighbor__(self, x, node):
            '''
            recursive find the neighbor of x in kd-tree
            Args:
                the root node of current child tree
    
            steps:
                1. if the current is leaf node, return the data in the node as the nearest neighbor
                2. if the value of x is less than the split value, take the neighbor of left child
                   tree as nearest neighbor. And then check if another child tree has the more nearest
                   neighbor;
                   if the value of x is more than the split value, do it as like mentioned above;
                3. check if the current node and x has more nearest distance
            '''
    
        if node.demension is None:
                return node.data
    
            if (x[0, node.demension] <= node.split_value) and node.left:
                neighbor = self.__find_neighbor__(x, node.left)
                if node.right \
                    and (np.abs(x[0, node.demension] - node.split_value) < self.__euclidean_distance__(x, neighbor)) \
                    and (self.__euclidean_distance__(self.__find_neighbor__(x, node.right), x) < self.__euclidean_distance__(x, neighbor)):
                        neighbor = self.__find_neighbor__(x, node.right)
            elif (x[0, node.demension] > node.split_value) and node.right:
                neighbor = self.__find_neighbor__(x, node.right)
                if node.left \
                    and (np.abs(x[0, node.demension] - node.split_value) < self.__euclidean_distance__(x, neighbor)) \
                    and (self.__euclidean_distance__(self.__find_neighbor__(x, node.left), x) < self.__euclidean_distance__(x, neighbor)):
                        neighbor = self.__find_neighbor__(x, node.left)
            else:
                # this happens as like:
                # x = 6, node = 5
                #         5
                #        /
                #       4
                neighbor = node.data
    
            if self.__euclidean_distance__(x, node.data) < self.__euclidean_distance__(x, neighbor):
                neighbor = node.data
            return neighbor
    
        def __euclidean_distance__(self, x1, x2):
            '''
            compute the euclidean distance
            '''
            return np.sum(np.multiply(x1 - x2, x1 - x2))
    
    class KDNode:
        def __init__(self, data=None, demension=None, split_value=None, left=None, right=None):
            self.data = data
            self.demension = demension
            self.split_value = split_value
            self.left = left
            self.right = right
    
    def main():
        '''
        KNN test unit
        '''
    
        #1. load data
        print("1. loading data...")
        data = pd.read_csv('/home/LiuYao/Documents/MarchineLearning/multi_data.csv')
        data['label'] = data['label'] + 1
        x_train, x_test, y_train, y_test = train_test_split(
                                                        data.values[:, 0:2], 
                                                        data.values[:, 2], 
                                                        test_size=0.2, 
                                                        random_state=0
                                                        )
    
        x_train = np.mat(x_train)
        x_test = np.mat(x_test) 
        y_train = np.mat(y_train).T
        y_test = np.mat(y_test).T
    
        #2. predict
        print('2. predicting...')
        knn = KNN()
        y_predict = knn.predict(x_train, y_train, x_test, k=1)
    
        #3. show the results
        print('3. show the results...')
        plt.scatter(x_train.getA()[:, 0], x_train.getA()[:, 1], c=y_train.T.getA()[0], marker='o')
        plt.scatter(x_test.getA()[:, 0], x_test.getA()[:, 1], c=y_predict.T.getA()[0], marker='*')
        plt.show()
    
    
    
    def test_build_tree():
        '''
        test building the kd tree
        '''
        datalist = np.mat([[3, 1, 4],
                           [2, 3, 7],
                           [2, 1, 3],
                           [2, 4, 5],
                           [1, 4, 4],
                           [0, 5, 7],
                           [6, 1, 4],
                           [4, 3, 4],
                           [5, 2, 5],
                           [4, 0, 6],
                           [7, 1, 6]])
        knn = KNN()
        tree = knn.create_kd_tree(datalist)
        res = knn.__find_neighbor__(np.mat([[3,1,5]]), tree)
        print(res)
    
    if __name__ == '__main__':
        main()
    

    2. Results

    In the figure, the stars are the predicted test samples and the dots are the training samples; you may need to enlarge the image to see them clearly.

    (figure: knn_results)

    3. Data

    x,y,label
    14.7,17.85,0
    17.45,17.45,0
    18.85,15.15,0
    17.25,13.7,0
    13.9,12.5,0
    10.5,15.65,0
    8.4,20.5,0
    11.1,21.85,0
    17.6,21.65,0
    23.0,19.75,0
    24.45,12.4,0
    16.25,3.3,0
    8.85,5.05,0
    5.55,8.8,0
    6.05,11.75,0
    26.45,6.9,0
    28.95,6.6,0
    21.75,8.35,0
    21.05,10.95,0
    23.9,17.05,0
    19.7,18.2,0
    12.4,19.3,0
    9.25,18.4,0
    10.3,8.95,0
    16.65,8.6,0
    37.3,15.1,0
    32.5,10.0,1
    33.05,11.45,1
    25.75,17.1,1
    20.15,17.8,1
    12.85,20.75,1
    12.8,8.65,1
    14.65,5.6,1
    24.2,6.3,1
    24.1,11.45,1
    22.05,10.85,1
    17.2,12.85,1
    13.7,15.55,1
    6.4,19.45,1
    8.1,11.5,1
    14.9,10.35,1
    10.05,12.65,1
    25.3,1.55,1
    16.5,3.8,1
    17.0,6.25,1
    17.85,7.35,1
    23.75,9.7,1
    21.65,16.3,1
    16.3,19.8,1
    13.9,19.85,1
    13.1,14.35,1
    16.55,17.9,1
    16.3,18.15,1
    15.3,17.7,1
    13.35,18.3,1
    12.8,17.5,1
    13.9,15.65,1
    15.65,16.5,1
    31.35,7.2,1
    31.35,6.95,1
    29.45,5.6,1
    27.15,4.85,1
    26.6,5.2,1
    27.8,7.35,1
    28.7,8.35,1
    28.8,10.25,1
    5.65,11.25,1
    7.8,9.7,1
    7.5,11.9,1
    3.55,14.45,1
    3.5,13.65,1
    5.1,10.95,1
    5.1,11.05,1
    18.65,9.1,2
    19.4,10.95,2
    20.1,12.7,2
    17.25,14.85,2
    14.6,15.25,2
    14.8,11.75,2
    13.5,6.4,2
    14.75,5.25,2
    18.05,4.05,2
    21.25,3.3,2
    23.75,3.85,2
    32.65,5.5,2
    33.65,7.05,2
    32.15,13.2,2
    30.8,15.25,2
    30.15,16.5,2
    24.7,18.0,2
    22.05,19.45,2
    20.1,21.5,2
    20.0,22.05,2
    26.8,22.45,2
    29.7,21.8,2
    30.95,21.35,2
    30.85,19.15,2
    28.4,18.7,2
    26.35,19.65,2
    26.5,19.9,2
    30.05,19.35,2
    32.75,16.35,2
    33.95,14.65,2
    34.05,14.6,2
    30.05,18.3,3
    27.65,20.6,3
    25.05,21.85,3
    24.1,18.2,3
    23.8,15.3,3
    25.6,14.45,3
    28.1,12.4,3
    29.35,10.95,3
    29.85,8.25,3
    30.55,14.1,3
    28.45,15.7,3
    31.85,18.15,3
    18.2,19.3,3
    16.85,19.8,3
    7.45,9.35,3
    13.35,13.9,3
    32.4,9.75,3
    23.8,1.05,3
    30.75,4.05,4
    30.5,5.3,4
    30.35,5.95,4
    28.9,9.0,4
    27.7,9.9,4
    24.75,11.4,4
    21.65,13.8,4
    19.75,17.45,4
    23.4,20.05,4
    18.2,21.75,4
    9.65,18.4,4
    5.6,13.45,4
    8.8,9.75,4
    11.25,11.2,4
    5.35,15.95,4
    6.1,16.0,4
    24.25,15.95,4
    31.55,17.0,4
    32.45,14.0,4
    24.05,12.4,4
    12.3,12.85,4
    7.15,19.3,4
    21.35,22.4,4
    27.95,17.65,4
    24.3,7.7,4
    17.5,3.6,4
    12.7,6.95,4
    11.25,10.7,4
    9.0,15.2,4
    7.05,19.15,4
    17.45,13.4,4
    16.0,10.75,4
    16.75,12.0,4
    18.25,11.5,4
    18.15,9.15,4
    17.1,9.5,4
    17.0,10.25,4
    12.8,7.75,4
    17.0,6.7,4
    21.15,8.5,4
    20.35,9.35,4
    19.45,10.0,4
    18.45,10.05,4
    18.0,8.0,4
    20.15,8.0,4
    21.45,6.65,4
    19.2,6.45,4
    15.25,8.4,4
    14.8,9.5,4
    14.45,7.7,4
    16.45,6.6,4
    18.0,5.85,4
    18.85,5.7,4
    19.6,6.1,4
    29.9,14.15,6
    31.4,15.8,6
    32.15,15.3,6
    33.25,13.65,6
    33.8,11.95,6
    33.85,10.9,6
    33.9,10.35,6
    32.6,10.75,6
    32.1,12.55,6
    34.15,12.55,6
    35.35,11.8,6
    35.15,10.4,6
    34.65,9.1,6
    34.3,8.9,6
    35.55,9.25,6
    36.35,12.45,6
    37.75,9.4,6
    37.75,8.5,6
    36.4,8.2,6
    35.0,8.05,6
    35.65,7.15,6
    37.55,6.4,6
    39.2,7.1,6
    36.5,9.85,0
    36.8,9.35,0
    37.5,7.7,0
    34.05,9.8,0
    20.2,20.3,0
    26.45,21.1,0
    27.9,20.65,0
    27.15,16.9,0
    25.5,13.1,0
    24.05,10.2,0
    23.45,5.3,0
    20.8,10.95,0
    18.95,14.65,0
    17.15,16.25,0
    11.3,17.0,0
    11.65,11.1,0
    15.95,4.8,0
    21.45,3.25,0
    13.9,3.05,0
    10.75,6.2,0
    9.3,16.85,0
    10.25,19.5,0
    12.7,15.95,0
    13.3,14.3,0
    15.7,11.45,0
    16.1,10.9,0
    14.1,14.2,0
    14.35,13.65,0
    15.3,14.1,0
    15.65,14.7,0
    15.75,15.85,0
    15.75,15.85,0
    19.2,18.4,0
    19.2,17.05,0
    19.3,15.6,0
    20.45,14.1,0
    21.4,11.65,0
    26.3,11.85,2
    20.5,18.75,2
    17.55,19.95,2
    13.1,16.65,2
    9.55,12.7,2
    7.85,15.65,2
    7.75,16.8,2
    9.1,10.35,2
    21.25,8.6,2
    22.65,5.0,2
    11.75,10.7,1
    11.05,14.55,1
    13.85,8.45,1
    11.7,6.65,1
    10.75,4.95,1
    10.95,3.75,1
    6.85,8.35,1
    11.35,5.7,1
    13.25,4.6,1
    7.45,7.95,1
    15.7,13.35,1
    16.85,14.3,1
    13.55,10.4,1
    9.55,7.3,1
    34.3,8.0,1
    28.15,8.45,1
    25.15,8.75,1
    22.3,14.6,1
    29.5,15.55,1
    28.2,14.1,1
    32.95,10.15,1
    29.15,11.4,1
    20.85,18.95,1
    22.0,17.8,1
    12.5,2.35,2
    7.25,14.5,3
    6.1,18.25,3
    8.85,20.5,3
    7.55,22.25,3
    15.8,19.2,3
    16.2,18.0,3
    16.95,17.45,3
    17.35,18.2,3
    18.25,17.45,3
    17.85,17.15,3
    18.25,16.25,3
    19.75,16.15,3
    20.85,16.95,3
    21.8,17.95,3
    22.75,19.0,3
    24.2,19.15,3
    25.05,18.55,3
    25.25,17.45,3
    13.95,11.1,3
    15.7,10.0,3
    14.3,9.85,3
    14.3,10.65,3
    14.85,11.1,3
    10.9,7.9,3
    9.1,8.5,3
    11.55,16.5,2
    11.05,18.5,2
    20.4,7.5,1
    19.95,8.7,1
    27.55,2.05,4
    26.6,2.5,4
    27.05,3.65,4
    28.65,4.1,4
    30.6,11.8,4
    29.55,12.35,4
    29.0,13.05,4
    30.5,13.4,4
    31.6,14.05,4
    31.55,14.9,4
    28.6,17.1,4
    30.35,17.75,4
    34.7,12.25,4
    31.0,11.75,4
    16.0,12.7,4
    8.8,17.95,4
    14.45,3.7,4
    28.6,4.75,4
    29.7,10.85,4
    24.15,14.2,4
    14.85,12.0,4
    6.9,11.05,4
    A Detailed Python Walkthrough of the kNN Algorithm (Machine Learning)

    In brief

    Every step of the code and the underlying principle are explained below, with the comments kept as simple as possible.

    How the algorithm works

    1) Compute the distance between the test sample and each training sample;

    2) Sort the distances in increasing order;

    3) Take the K points with the smallest distances;

    4) Count how often each class label occurs among those K points;

    5) Return the most frequent class among the K points as the predicted class of the test sample.

    Code with comments
    
    from numpy import *
    import operator
    def classify0(inX, dataSet, labels, k):    #inX: vector to classify; dataSet: training samples; labels: label list; k: number of neighbors
        dataSetSize = dataSet.shape[0]  #number of rows, i.e. number of training samples
        diffMat = tile(inX, (dataSetSize,1))-dataSet #repeat inX dataSetSize times along the rows and subtract, giving the offset from every training sample
        sqDiffMat = diffMat**2  #square every element
        sqDistances = sqDiffMat.sum(axis=1) #axis=1 sums across each row (axis=0 would sum down each column)
        distances = sqDistances**0.5  #take the square root; these four lines compute the Euclidean distance, d² = (A-a)² + (B-b)²
        sortedDistIndicies = distances.argsort()    #indices that sort the distances from smallest to largest
        classCount = {}    #dictionary mapping label -> vote count
        for i in range(k):
            voteIlabel = labels[sortedDistIndicies[i]]  #label of the i-th nearest neighbor, i = 0 .. k-1
            classCount[voteIlabel] = classCount.get(voteIlabel,0)+1 #count occurrences of voteIlabel; get() returns 0 the first time a label is seen
        sortedClassCount = sorted(classCount.items(), key=operator.itemgetter(1), reverse=True)   #sort the (label, count) pairs by count, largest first
        return sortedClassCount[0][0]   #return the label with the most votes
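To see classify0 in action, here is a minimal self-contained run; the four sample points and their labels are made up for illustration, and the function body is repeated so the snippet runs on its own:

```python
import operator
from numpy import array, tile

def classify0(inX, dataSet, labels, k):
    # Euclidean distances from inX to every row of dataSet
    dataSetSize = dataSet.shape[0]
    diffMat = tile(inX, (dataSetSize, 1)) - dataSet
    distances = ((diffMat ** 2).sum(axis=1)) ** 0.5
    sortedDistIndicies = distances.argsort()
    # majority vote among the k nearest samples
    classCount = {}
    for i in range(k):
        voteIlabel = labels[sortedDistIndicies[i]]
        classCount[voteIlabel] = classCount.get(voteIlabel, 0) + 1
    sortedClassCount = sorted(classCount.items(),
                              key=operator.itemgetter(1), reverse=True)
    return sortedClassCount[0][0]

group = array([[1.0, 1.1], [1.0, 1.0], [0.0, 0.0], [0.0, 0.1]])
labels = ['A', 'A', 'B', 'B']
print(classify0([0.0, 0.0], group, labels, 3))  # 'B': two of the three nearest points are B
```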
    