  • Using the TensorFlow MNIST dataset — ModuleNotFoundError: No module named 'tensorflow.examples.tutorials'; Solution 1 for the missing tutorials folder; Solution 2; input_data.read_data_sets() failing to download the dataset ...

    # Using the TensorFlow MNIST dataset
    
    import tensorflow as tf
    tf.__version__
    
    '2.1.0'
    
    import numpy as np
    import matplotlib.pyplot as plt
    

    ModuleNotFoundError: No module named 'tensorflow.examples.tutorials'

    Solution 1: the tutorials folder is missing

    1. First check whether your tensorflow install contains tutorials
      • Path to check: …\Lib\site-packages\tensorflow_core\examples;
      • if that folder only contains saved_model and there is no tutorials folder, it is missing;
    2. Download the missing files from the tensorflow GitHub page
      Download link: tensorflow tutorials
    3. Copy the entire downloaded tutorials folder
      • unzip the downloaded archive and locate the tutorials folder;
      • copy it into …\Lib\site-packages\tensorflow_core\examples;
    4. Import the package again
      from tensorflow.examples.tutorials.mnist import input_data
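
    Whether the copied folder is actually picked up can be checked before importing. This is a small sketch using only the standard library's importlib; nothing TensorFlow-specific is assumed:

    ```python
    import importlib.util

    def module_available(name):
        """Return True when `name` can be imported from the current environment."""
        try:
            # find_spec returns None when the module is not on the search path
            return importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:
            # raised when a parent package (e.g. tensorflow itself) is missing
            return False

    print(module_available("tensorflow.examples.tutorials"))
    ```

    If this prints False after copying, the folder most likely landed in a different site-packages directory than the one used by the running interpreter.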

    Solution 2

    In TensorFlow 2.0 the datasets are integrated into the high-level tf.keras API; the following code will usually download the dataset:

    mnist = tf.keras.datasets.mnist
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    
    from tensorflow.examples.tutorials.mnist import input_data
    print ("packs loaded")
    
    packs loaded
    

    input_data.read_data_sets() fails to download the dataset

    from tensorflow.examples.tutorials.mnist import input_data
    print ("Download and Extract MNIST dataset")
    mnist = input_data.read_data_sets('data/', one_hot=True)
    
    1. Getting the mnist dataset: it can be downloaded from Yann LeCun's official site;
    2. Point read_data_sets at the locally downloaded copy; "data/" is the directory where the dataset files are stored.
    # the automatic download tends to fail
    print ("Download and Extract MNIST dataset")
    mnist = input_data.read_data_sets('data/', one_hot=True)
    
    Download and Extract MNIST dataset
    Extracting data/train-images-idx3-ubyte.gz
    Extracting data/train-labels-idx1-ubyte.gz
    Extracting data/t10k-images-idx3-ubyte.gz
    Extracting data/t10k-labels-idx1-ubyte.gz
    WARNING:tensorflow:From d:\progra~2\python\virtua~1\py37_x64\lib\site-packages\tensorflow_core\examples\tutorials\mnist\input_data.py:328: _DataSet.__init__ (from tensorflow.examples.tutorials.mnist.input_data) is deprecated and will be removed in a future version.
    Instructions for updating:
    Please use alternatives such as official/mnist/_DataSet.py from tensorflow/models.
    
    # check how much data there is
    print (" type of 'mnist' is %s" % (type(mnist)))
    print (" number of train data is %d" % (mnist.train.num_examples))
    print (" number of test data is %d" % (mnist.test.num_examples))
    
     type of 'mnist' is <class 'tensorflow.examples.tutorials.mnist.input_data._Datasets'>
     number of train data is 55000
     number of test data is 10000
    
    # What does the data of MNIST look like? 
    print ("What does the data of MNIST look like?")
    trainimg   = mnist.train.images
    trainlabel = mnist.train.labels
    testimg    = mnist.test.images
    testlabel  = mnist.test.labels
    
    # check the data types
    
    print("type of 'trainlabel' is %s"  % type(trainlabel))
    print("type of 'testimg' is %s"     % type(testimg))
    print("type of 'testlabel' is %s"   % type(testlabel))
    
    print()
    # check the shapes
    print("shape of 'trainimg' is %s"   % (trainimg.shape, ))
    print("shape of 'trainlabel' is %s" % (trainlabel.shape, ))
    print("shape of 'testimg' is %s"    % (testimg.shape, ))
    print("shape of 'testlabel' is %s"  % (testlabel.shape, ))
    
    What does the data of MNIST look like?
    type of 'trainlabel' is <class 'numpy.ndarray'>
    type of 'testimg' is <class 'numpy.ndarray'>
    type of 'testlabel' is <class 'numpy.ndarray'>
    
    shape of 'trainimg' is (55000, 784)
    shape of 'trainlabel' is (55000, 10)
    shape of 'testimg' is (10000, 784)
    shape of 'testlabel' is (10000, 10)
    
    # What does the training data look like?
    print ("What does the training data look like?")
    nsample = 5
    randidx = np.random.randint(trainimg.shape[0], size=nsample)
    
    for i in randidx:
        curr_img   = np.reshape(trainimg[i, :], (28, 28)) # 28 by 28 matrix 
        curr_label = np.argmax(trainlabel[i, :] ) # Label
        plt.matshow(curr_img, cmap=plt.get_cmap('gray'))
        plt.title("" + str(i) + "th Training Data " 
                  + "Label is " + str(curr_label))
        print ("" + str(i) + "th Training Data " 
               + "Label is " + str(curr_label))
        plt.show()
    
    What does the training data look like?
    54259th Training Data Label is 4
    

    (each of the five samples is displayed as a 28×28 grayscale plot)

    33047th Training Data Label is 4
    52715th Training Data Label is 1
    15223th Training Data Label is 7
    18188th Training Data Label is 4

    # Batch Learning? 
    print ("Batch Learning? ")
    batch_size = 100
    batch_xs, batch_ys = mnist.train.next_batch(batch_size)
    print ("type of 'batch_xs' is %s" % (type(batch_xs)))
    print ("type of 'batch_ys' is %s" % (type(batch_ys)))
    print ("shape of 'batch_xs' is %s" % (batch_xs.shape,))
    print ("shape of 'batch_ys' is %s" % (batch_ys.shape,))
    
    Batch Learning? 
    type of 'batch_xs' is <class 'numpy.ndarray'>
    type of 'batch_ys' is <class 'numpy.ndarray'>
    shape of 'batch_xs' is (100, 784)
    shape of 'batch_ys' is (100, 10)
    
  • mnist数据集 (2017-11-14)
  • A detailed guide to using the MNIST dataset (2019-07-18)

    Dataset download URL: http://yann.lecun.com/exdb/mnist/
    After downloading there is no need to unzip the files; just put them together in one folder:
    (screenshot of the folder layout)
    About the data:
    A dataset is usually split into 2–3 parts:
    Training set (train set): the examples used for learning, i.e. to fit the classifier's parameters (the weights)
    Validation set: examples used to tune the classifier's hyperparameters (the architecture rather than the weights), e.g. choosing the number of hidden units in a neural network
    Test set: examples used only to assess the performance (generalization) of the fully specified classifier
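
    The three-way split described above can be sketched with plain numpy; the array names here are hypothetical stand-ins for the real data:

    ```python
    import numpy as np

    def train_val_test_split(images, labels, n_val, n_test, seed=0):
        """Shuffle once, then carve out validation and test sets.

        The same permutation is applied to images and labels,
        so each image stays aligned with its label.
        """
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(images))
        images, labels = images[idx], labels[idx]
        n_train = len(images) - n_val - n_test
        return ((images[:n_train], labels[:n_train]),
                (images[n_train:n_train + n_val], labels[n_train:n_train + n_val]),
                (images[-n_test:], labels[-n_test:]))

    # toy data standing in for the 70,000 MNIST examples
    X = np.arange(100).reshape(100, 1)
    y = np.arange(100)
    train, val, test = train_val_test_split(X, y, n_val=10, n_test=20)
    print(train[0].shape, val[0].shape, test[0].shape)  # (70, 1) (10, 1) (20, 1)
    ```

    read_data_sets does this split for you (55000/5000/10000); the sketch only shows the principle.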

    How to read it:

    import matplotlib.pyplot as plt
    from tensorflow.examples.tutorials.mnist import input_data
    mnist_data_folder="/MNIST_data"    #directory holding the dataset files (see the folder layout above)
    mnist=input_data.read_data_sets(mnist_data_folder,one_hot=True)    #read the mnist dataset with one-hot labels
    
    #number of examples in each split
    train_nums=mnist.train.num_examples
    validation_nums=mnist.validation.num_examples
    test_nums=mnist.test.num_examples
    print("MNIST training set size %d"%train_nums)
    print("MNIST validation set size %d"%validation_nums)
    print("MNIST test set size %d"%test_nums)
    
    #image data
    train_data=mnist.train.images   #all training images
    val_data=mnist.validation.images    #(5000,784)
    test_data=mnist.test.images
    print("training set shape:",train_data.shape)
    print("shape of one image:",train_data[1].shape)
    print("one image as a flat list:\n",train_data[1])
    
    #label data
    train_labels=mnist.train.labels     #(55000,10)
    val_labels=mnist.validation.labels  #(5000,10)
    test_labels=mnist.test.labels   #(10000,10)
    print("training label array shape: ",train_labels.shape)
    print("shape of one label: ",train_labels[1].shape)
    print("label of one image:",train_labels[1])
    
    #fetch data and labels in batches with next_batch(batch_size)
    #note: batches are drawn at random, but within one batch the images and labels stay aligned
    batch_size=100  #train on 100 images per batch
    batch_xs,batch_ys=mnist.train.next_batch(batch_size)
    testbatch_xs,testbatch_ys=mnist.test.next_batch(batch_size)
    print("reading batches with mnist.train.next_batch(batch_size)")
    print("random batch of 100 training samples, data shape= ",batch_xs.shape)
    print("random batch of 100 training samples, label shape= ",batch_ys.shape)
    print("random batch of 100 test samples, data shape= ",testbatch_xs.shape)
    print("random batch of 100 test samples, label shape= ",testbatch_ys.shape)
    
    #display images
    plt.figure()
    for i in range(10):
        im=train_data[i].reshape(28,28)    #i-th training image, reshaped to 28x28
        #im=batch_xs[i].reshape(28,28)    #i-th image of the current batch
        plt.imshow(im)
        plt.pause(0.1)    #pause between images
    plt.show()
    

    Output:

    MNIST training set size 55000
    MNIST validation set size 5000
    MNIST test set size 10000
    training set shape: (55000, 784)
    shape of one image: (784,)
    one image as a flat list:
     [0.         0.         0.         0.         0.         0.
      ...
      0.         0.         0.         0.12156864 0.5176471  0.9960785
      ...
      0.         0.         0.         0.        ]
     (784 pixel values in [0, 1], mostly 0., with nonzero entries along the digit's strokes — truncated here)
    
    training label array shape:  (55000, 10)
    shape of one label:  (10,)
    label of one image: [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
    reading batches with mnist.train.next_batch(batch_size)
    random batch of 100 training samples, data shape=  (100, 784)
    random batch of 100 training samples, label shape=  (100, 10)
    random batch of 100 test samples, data shape=  (100, 784)
    random batch of 100 test samples, label shape=  (100, 10)
    

    Plot output: the first ten training digits are displayed one after another.
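
    The one-hot label printed above ([0. 0. 0. 1. …] stands for the digit 3) converts to and from a plain class index with numpy:

    ```python
    import numpy as np

    def to_one_hot(label, num_classes=10):
        # a row of the identity matrix is exactly the one-hot vector
        return np.eye(num_classes)[label]

    def from_one_hot(vec):
        # argmax recovers the class index (the position of the 1)
        return int(np.argmax(vec))

    print(to_one_hot(3))                 # [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
    print(from_one_hot(to_one_hot(3)))   # 3
    ```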

    Once we know how to read the data, we can try it out with a simple neural network:

    import tensorflow as tf
    from tensorflow.examples.tutorials.mnist import input_data
    mnist_data_folder="/MNIST_data"
    mnist=input_data.read_data_sets(mnist_data_folder,one_hot=True)
    
    
    #two placeholders: x holds the input images, y_ their class labels
    x = tf.placeholder("float", shape=[None, 784])
    y_ = tf.placeholder("float", shape=[None, 10])
    
    #weight initializer
    def weight_variable(shape):
        #random values drawn from a truncated normal distribution
        initial = tf.truncated_normal(shape, stddev=0.1)
        return tf.Variable(initial)
    
    #bias initializer
    def bias_variable(shape):
        initial = tf.constant(0.1, shape=shape)
        return tf.Variable(initial)
    
    #convolution op
    #x is a 4-D tensor with shape [batch,height,width,channels]
    #the kernel moves with stride 1; SAME padding means no pixel is discarded
    def conv2d(x, W):
        return tf.nn.conv2d(x, W, strides=[1,1,1,1], padding="SAME")
    
    #pooling op
    #max pooling: the largest value in each window is taken as the result
    #x is a 4-D tensor with shape [batch,height,width,channels]
    #ksize gives a 2x2 pooling window (height 2, width 2)
    #strides of 2 along both the height and width dimensions
    def max_pool_2x2(x):
        return tf.nn.max_pool(x, ksize=[1,2,2,1],
                              strides=[1,2,2,1], padding="SAME")
    
    #layer 1: convolution
    #W is a [5,5,1,32] tensor: 5*5 kernels, with 1 input channel and 32 output channels
    W_conv1 = weight_variable([5,5,1,32])
    #b has shape [32], the number of output channels
    b_conv1 = bias_variable([32])
    
    #reshape the input x (a 2-D tensor of shape [batch, 784]) into a 4-D x_image of shape [batch,28,28,1]
    #-1 tells TensorFlow to infer that dimension
    x_image = tf.reshape(x, [-1,28,28,1])
    
    #convolve x_image with the weights, add the bias, apply ReLU, then max-pool
    #h_pool1 is the output of the first layer, shape [batch,14,14,32]
    h_conv1 = tf.nn.relu(conv2d(x_image, W_conv1) + b_conv1)
    h_pool1 = max_pool_2x2(h_conv1)
    
    #layer 2: convolution
    #the kernel is again 5*5; this layer has 32 input and 64 output channels
    W_conv2 = weight_variable([5,5,32,64])
    b_conv2 = bias_variable([64])
    
    #h_pool2 is the output of the second layer, shape [batch,7,7,64]
    h_conv2 = tf.nn.relu(conv2d(h_pool1, W_conv2) + b_conv2)
    h_pool2 = max_pool_2x2(h_conv2)
    
    #layer 3: fully connected
    #a fully connected layer with 1024 neurons
    #the first dimension of W is 7*7*64: 7*7 is the spatial size of h_pool2, 64 its channel count
    W_fc1 = weight_variable([7*7*64, 1024])
    b_fc1 = bias_variable([1024])
    
    #flatten the second layer's output to [batch, 7*7*64] before the matmul
    h_pool2_flat = tf.reshape(h_pool2, [-1, 7*7*64])
    h_fc1 = tf.nn.relu(tf.matmul(h_pool2_flat, W_fc1) + b_fc1)
    
    #dropout layer
    #dropout before the output layer reduces overfitting
    keep_prob = tf.placeholder("float")
    h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)
    
    #output layer
    #finally, a softmax layer:
    #effectively another fully connected layer, whose outputs softmax turns into probabilities
    W_fc2 = weight_variable([1024, 10])
    b_fc2 = bias_variable([10])
    
    y_conv = tf.nn.softmax(tf.matmul(h_fc1_drop, W_fc2) + b_fc2)
    
    #cross-entropy between the predictions and the true labels
    cross_entropy = -tf.reduce_sum(y_ * tf.log(y_conv))
    
    #train op: gradient descent with the Adam optimizer, learning rate 0.0001
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
    
    #evaluation: tf.argmax returns the index of the largest value along a dimension
    #since the labels are one-hot vectors of 0s and 1s, that index is the position of the 1
    correct_predict = tf.equal(tf.argmax(y_conv, 1), tf.argmax(y_, 1))
    
    #fraction of correct predictions: tf.equal returns booleans,
    #so cast them to floats with tf.cast and average with tf.reduce_mean
    accuracy = tf.reduce_mean(tf.cast(correct_predict, "float"))
    
    
    train_data=mnist.train.images
    train_labels=mnist.train.labels
    test_data=mnist.test.images
    test_labels=mnist.test.labels
    
    batch_size=100  #train on 100 images per batch
    batch_xs,batch_ys=mnist.train.next_batch(batch_size)    #grab a random batch of 100 training examples
    test_xs,test_ys=mnist.test.next_batch(batch_size)
    
    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer()) #initialize the variables
        for i in range(2000):  #train for 2000 steps, feeding one image per step
            sess.run(train_step,feed_dict={x:[train_data[i]], y_:[train_labels[i]], keep_prob:0.5})
            if(i%100==0):   #every 100 steps, evaluate accuracy on a batch of test data
                print(sess.run(accuracy, feed_dict={x: test_xs, y_: test_ys, keep_prob: 1.0}))
    
    """
    Training in batches also works; note that each call to mnist.train.next_batch(batch_size)
    automatically draws a fresh random batch of that size:
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer()) #initialize the variables
        for i in range(200):  #train for 200 steps, feeding one batch of images per step
            sess.run(train_step,feed_dict={x:batch_xs, y_:batch_ys, keep_prob:0.5})
            if(i%20==0):   #every 20 steps, evaluate accuracy on a batch of test data
                print(sess.run(accuracy, feed_dict={x: test_xs, y_: test_ys, keep_prob: 1.0}))
    """
    

    Output:

    Extracting /MNIST_data\train-images-idx3-ubyte.gz
    Extracting /MNIST_data\train-labels-idx1-ubyte.gz
    Extracting /MNIST_data\t10k-images-idx3-ubyte.gz
    Extracting /MNIST_data\t10k-labels-idx1-ubyte.gz
    0.12
    0.4
    0.54
    0.54
    0.61
    0.7
    0.73
    0.84
    0.77
    0.8
    0.8
    0.86
    0.88
    0.85
    0.88
    0.88
    0.85
    0.92
    0.89
    0.84
    

    References:
    the concepts of train set, validation set and test set
    how to read the MNIST handwritten-digit dataset

  • The mnist dataset

    The MNIST dataset was created by the deep-learning luminary LeCun and colleagues; it is often considered the "Hello World!" of deep learning.

    Official site: http://yann.lecun.com/exdb/mnist/


    The mnist dataset consists of 60,000 training images and 10,000 test images.

     

    The official site offers downloads, but since the server is overseas the download is very slow. A Baidu Netdisk mirror:

    Link: https://pan.baidu.com/s/17KUWe8JdQBHsAg3B4m5SNg
    Access code: wyxn
  • MNIST数据集 (2021-01-06)
  • mnist 数据集 (2017-12-24)

    A mirror of the mnist dataset, for when the official download is too slow.
  • Loading MNIST through keras: input_data fails with an import error, and load_data fails with "URL fetch failure: http://googelsourc……"; the fix is to first download the mnist dataset locally and put it where you want ...

    Note: when loading the mnist dataset through the keras module, no code changes are needed; just place the dataset file in the corresponding path.

    windows: C:\Users\mac\.keras\datasets
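
    The Windows path above is just the per-user cache directory; on any OS the default location can be computed like this (a sketch; keras may also honor a KERAS_HOME environment variable):

    ```python
    import os

    def default_keras_dataset_dir():
        # resolves to C:\Users\<name>\.keras\datasets on Windows
        # and to /home/<name>/.keras/datasets on Linux/macOS
        return os.path.join(os.path.expanduser("~"), ".keras", "datasets")

    print(default_keras_dataset_dir())
    ```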
    

    1. At first I used the official approach of loading the local dataset with input_data, but it raised the following error

    No module named 'tensorflow.examples.tutorials'
    

    and input_data.py could not be downloaded from the official site either.
    2. With keras, loading the mnist dataset also failed at first, because googlesource cannot be reached.
    Fix: edit mnist.py (Ctrl+B jumps to mnist.py in the IDE) so that it uses the locally downloaded mnist dataset: change the path inside mnist.py to the local path of the dataset.
    The code is below:
    main.py

    from __future__ import absolute_import, division, print_function, unicode_literals
    import tensorflow as tf
    
    mnist = tf.keras.datasets.mnist
    
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0
    
    model = tf.keras.models.Sequential([
      tf.keras.layers.Flatten(input_shape=(28, 28)),
      tf.keras.layers.Dense(128, activation='relu'),
      tf.keras.layers.Dropout(0.2),
      tf.keras.layers.Dense(10, activation='softmax')
    ])
    
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    
    
    model.fit(x_train, y_train, epochs=5)
    
    model.evaluate(x_test,  y_test, verbose=2)
    
    

    mnist.py

    
    """MNIST handwritten digits dataset.
    """
    from __future__ import absolute_import
    from __future__ import division
    from __future__ import print_function
    
    import numpy as np
    
    from tensorflow.python.keras.utils.data_utils import get_file
    from tensorflow.python.util.tf_export import keras_export
    
    
    @keras_export('keras.datasets.mnist.load_data')
    def load_data(path='mnist.npz'):
      """Loads the MNIST dataset.
    
      Arguments:
          path: path where to cache the dataset locally
              (relative to ~/.keras/datasets).
    
      Returns:
          Tuple of Numpy arrays: `(x_train, y_train), (x_test, y_test)`.
    
      License:
          Yann LeCun and Corinna Cortes hold the copyright of MNIST dataset,
          which is a derivative work from original NIST datasets.
          MNIST dataset is made available under the terms of the
          [Creative Commons Attribution-Share Alike 3.0 license.](
          https://creativecommons.org/licenses/by-sa/3.0/)
      """
    
      path = "./mnist.npz"
      with np.load(path) as f:
        x_train, y_train = f['x_train'], f['y_train']
        x_test, y_test = f['x_test'], f['y_test']
    
        return (x_train, y_train), (x_test, y_test)
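
    Editing a file inside the installed package is fragile, since any upgrade overwrites it. A less invasive sketch reads the locally downloaded mnist.npz directly with numpy (the key names below are the ones stored in the official mnist.npz):

    ```python
    import numpy as np

    def load_local_mnist(path):
        # mnist.npz stores the four arrays under these fixed keys
        with np.load(path) as f:
            return (f["x_train"], f["y_train"]), (f["x_test"], f["y_test"])
    ```

    Call it as `(x_train, y_train), (x_test, y_test) = load_local_mnist("./mnist.npz")` and continue exactly as in main.py, with no changes to the installed keras sources.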
    
  • Practice: binary classification on the MNIST dataset, loaded with sklearn's fetch_mldata("MNIST original", ...) ...
  • Mnist数据集 (2018-12-11)

    The MNIST dataset comes from the US National Institute of Standards and Technology (NIST). The training set consists of digits handwritten by 250 different people, 50% of them high-school students and 50% Census Bureau staff...
  • MNIST数据集.zip (2019-08-30)

    Since fetch_mldata can no longer download MNIST directly, this upload is the original MNIST dataset.
  • Making your own dataset modeled on mnist (hot discussion, 2017-08-05)

    Build your own dataset following the mnist dataset's format.
  • MNIST数据集.rar (2020-04-27)

    MNIST dataset
