I have been experimenting with a Keras example that needs to import MNIST data:
from keras.datasets import mnist
import numpy as np
(x_train, _), (x_test, _) = mnist.load_data()
It fails with an error message such as:

Exception: URL fetch failure on https://s3.amazonaws.com/img-datasets/mnist.pkl.gz: None -- [Errno 110] Connection timed out
This is probably caused by the network environment I am in. Is there a function or snippet that lets me load a manually downloaded copy of the MNIST dataset directly?
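One workaround I am aware of (worth confirming): newer Keras versions fetch mnist.npz, which is just a NumPy .npz archive with the keys x_train, y_train, x_test, y_test, so a downloaded copy can be opened with plain NumPy and no network access. A minimal sketch, where the path and the tiny stand-in archive are assumptions for illustration; with a real download you would point np.load at the downloaded file instead:

```python
import tempfile

import numpy as np

# Stand-in for a manually downloaded mnist.npz; with the real file,
# skip this block and set path = '/data/mnist.npz' (or wherever it lives).
path = tempfile.mktemp(suffix='.npz')
np.savez(path,
         x_train=np.zeros((6, 28, 28), dtype=np.uint8),
         y_train=np.zeros(6, dtype=np.uint8),
         x_test=np.zeros((2, 28, 28), dtype=np.uint8),
         y_test=np.zeros(2, dtype=np.uint8))

# Load the archive exactly the way keras.datasets.mnist does internally.
with np.load(path) as f:
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']

print(x_train.shape, x_test.shape)
```
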
I tried the following approach:
import sys
import pickle
import gzip

import numpy as np

f = gzip.open('/data/mnist.pkl.gz', 'rb')
if sys.version_info < (3,):
    data = pickle.load(f)
else:
    data = pickle.load(f, encoding='bytes')
f.close()

(x_train, _), (x_test, _) = data
Then I get the following error message:
Traceback (most recent call last):
File "test.py", line 45, in <module>
(x_train, _), (x_test, _) = data
ValueError: too many values to unpack (expected 2)
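If I understand the error correctly, the file I downloaded is likely the deeplearning.net version of mnist.pkl.gz, which pickles a 3-tuple (train, validation, test), each a (data, labels) pair, so unpacking it into two names fails. A sketch of unpacking all three splits, with the three-way layout as the assumption (the real file would replace the small in-memory stand-in written below):

```python
import gzip
import pickle
import tempfile

import numpy as np

# Stand-in for the deeplearning.net mnist.pkl.gz layout: a pickled
# 3-tuple (train, validation, test), each a (data, labels) pair.
# With the real file, skip this block and open '/data/mnist.pkl.gz'.
fake = (
    (np.zeros((6, 784)), np.zeros(6)),
    (np.zeros((2, 784)), np.zeros(2)),
    (np.zeros((2, 784)), np.zeros(2)),
)
path = tempfile.mktemp(suffix='.pkl.gz')
with gzip.open(path, 'wb') as f:
    pickle.dump(fake, f)

# Unpack all three splits -- unpacking into two is what raised the ValueError.
with gzip.open(path, 'rb') as f:
    (x_train, y_train), (x_valid, y_valid), (x_test, y_test) = \
        pickle.load(f, encoding='bytes')

print(x_train.shape, x_valid.shape, x_test.shape)
```
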