How can I import the MNIST dataset that has been manually downloaded?

I have been experimenting with a Keras example, which needs to import MNIST data:

from keras.datasets import mnist
import numpy as np
(x_train, _), (x_test, _) = mnist.load_data()

It generates error messages such as:

Exception: URL fetch failure on https://s3.amazonaws.com/img-datasets/mnist.pkl.gz: None -- [Errno 110] Connection timed out

It should be related to the network environment I am using. Is there any function or code that lets me directly import the MNIST dataset that has been manually downloaded?
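For what it is worth, newer Keras versions distribute MNIST as an mnist.npz archive containing the arrays x_train, y_train, x_test and y_test. Assuming a manually downloaded file in that format, it can be read with NumPy alone, no network access needed. A minimal sketch (the tiny stand-in archive below only imitates the real file's key layout):

```python
import numpy as np

# Stand-in for a manually downloaded mnist.npz; the real file uses the
# same four keys, but with 60000/10000 images of shape 28x28.
np.savez('mnist_demo.npz',
         x_train=np.zeros((60, 28, 28), dtype=np.uint8),
         y_train=np.zeros(60, dtype=np.uint8),
         x_test=np.zeros((10, 28, 28), dtype=np.uint8),
         y_test=np.zeros(10, dtype=np.uint8))

# Load the archive and unpack the four arrays by name.
with np.load('mnist_demo.npz') as f:
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']

print(x_train.shape, x_test.shape)
```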

I tried the following approach:

import sys
import pickle
import gzip

import numpy as np

f = gzip.open('/data/mnist.pkl.gz', 'rb')
if sys.version_info < (3,):
    data = pickle.load(f)
else:
    # Python 3 needs an explicit encoding to read a Python-2 pickle
    data = pickle.load(f, encoding='bytes')
f.close()
(x_train, _), (x_test, _) = data

Then I get the following error message:

Traceback (most recent call last):
  File "test.py", line 45, in <module>
    (x_train, _), (x_test, _) = data
ValueError: too many values to unpack (expected 2)
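The unpacking error itself hints at the cause: the classic mnist.pkl.gz (as distributed by deeplearning.net) pickles three splits, (train, validation, test), not the two pairs that mnist.load_data() returns. Assuming that layout, unpacking three (data, labels) pairs works; the sketch below fabricates a tiny file with the same nesting to illustrate:

```python
import gzip
import pickle

import numpy as np

# Fabricate a tiny stand-in with the same nesting as the
# deeplearning.net mnist.pkl.gz: three (data, labels) pairs.
splits = tuple((np.zeros((n, 784)), np.zeros(n, dtype=int))
               for n in (50, 10, 10))
with gzip.open('mnist_demo.pkl.gz', 'wb') as f:
    pickle.dump(splits, f)

with gzip.open('mnist_demo.pkl.gz', 'rb') as f:
    # Three splits, not two -- unpacking into only two pairs raises
    # "ValueError: too many values to unpack (expected 2)".
    (x_train, y_train), (x_valid, y_valid), (x_test, y_test) = pickle.load(f)

print(x_train.shape, x_valid.shape, x_test.shape)
```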

Source Link
user785099