Tensorflow in production
with AWS Lambda
Tensorflow Tokyo - 2016-09-15
THIS IS NOT A MACHINE LEARNING TALK
Tensorflow in production with AWS lambda
What Will You Learn?
▸ What you can do with your trained model: MLOPS
▸ Why AWS Lambda can be a solution
▸ AWS Lambda with TensorFlow: how it works
Tensorflow in production with AWS lambda
About Me
▸ Freelance Data Products Developer and Consultant (data visualization, machine learning)
▸ Former Orange Labs and Locarise (connected sensors data processing and visualization)
▸ Current side project: denryoku.io, an API for electric grid power demand and capacity prediction.
So you have trained a model? Now what?
Tensorflow in production with AWS lambda
It is a product, not an ad-hoc analysis


[Diagram: Historical Data → model selection and training → Trained model → Deployed model (production); Live Data → Deployed model → Prediction]
▸ Needs to run on live data
Tensorflow in production with AWS lambda
Many things may need to be done in production
▸ Batch processing
▸ Stream / event processing
▸ A prediction API
▸ Update and maintain the model
Tensorflow in production with AWS lambda
This needs to be scalable and resilient
And also:
▸ maintainable
▸ versioned
▸ easy to integrate
ML+DevOps = MLOPS
Why AWS Lambda may help.
Tensorflow in production with AWS lambda
Some deployment solutions
▸ TensorFlow Serving:
▸ forces you to create dedicated code if you have more than a pure TensorFlow model
▸ doesn’t solve scalability issues
▸ forces you to manage servers
▸ Google CloudML
▸ Private Beta
▸ Likely limitations
Tensorflow in production with AWS lambda
Serverless architectures with AWS Lambda
▸ Serverless offering from AWS
▸ No lifecycle to manage or shared state => resilient
▸ Auto-scaling
▸ Pay only for actual running time: low cost
▸ No server or infrastructure management: reduced dev/devops cost
[Diagram: events → Lambda function → output]
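A Lambda function is just a Python handler that receives the triggering event and a context object; whatever it returns becomes the function's output. A minimal sketch (the payload field is hypothetical):
def lambda_handler(event, context):
    # 'event' is the triggering payload (a dict for JSON events);
    # 'context' carries runtime metadata (remaining time, request id, ...).
    name = event.get('name', 'world')  # hypothetical input field
    return {'message': 'Hello, ' + name}
python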
Tensorflow in production with AWS lambda
Creating a function
Tensorflow in production with AWS lambda
Creating a function
Tensorflow in production with AWS lambda
Creating an “architecture” with triggers
Tensorflow in production with AWS lambda
Event / microbatch processing
▸ event-based: DB/stream update, new file on S3, webhook
▸ classify the incoming data or update your prediction
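As a sketch (assuming the standard S3 put-event record layout; the actual classification step is left out), an S3-triggered handler can pull out the bucket and key of the newly uploaded file:
import urllib

def lambda_handler(event, context):
    # S3 delivers put events as a list of records; take the first one.
    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']
    key = urllib.unquote_plus(record['object']['key'])
    # ... download s3://bucket/key, run the model on it, store the result
    return {'bucket': bucket, 'key': key}
python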
Tensorflow in production with AWS lambda
Batch processing
▸ cron scheduling
▸ let your function fetch some data and process it at regular intervals
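For example (a sketch with hypothetical bucket and key names; boto3 is preinstalled in the Lambda runtime), a handler wired to a CloudWatch Events schedule can fetch a batch from S3 and write the predictions back:
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # For scheduled invocations the event payload itself can be ignored.
    s3.download_file('my-data-bucket', 'incoming/latest.csv', '/tmp/latest.csv')
    # ... load /tmp/latest.csv, run the model, write /tmp/predictions.csv
    s3.upload_file('/tmp/predictions.csv', 'my-data-bucket', 'predictions/latest.csv')
python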
Tensorflow in production with AWS lambda
An API
▸ triggered on an API call (via Amazon API Gateway)
▸ the response returned to the caller is your function's return value
▸ manage API keys, rate limits, etc. on API Gateway
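From the client side, calling the resulting endpoint is a plain HTTPS request. A sketch with a hypothetical API Gateway URL and key, assuming the request body is passed through as the Lambda event:
import json
import requests

url = 'https://abc123.execute-api.ap-northeast-1.amazonaws.com/prod/predict'  # hypothetical endpoint
headers = {'x-api-key': 'YOUR_API_KEY', 'Content-Type': 'application/json'}

# The body mirrors the event expected by the handler shown later:
# a list of input vectors under the 'data' key.
payload = {'data': [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]}
response = requests.post(url, headers=headers, data=json.dumps(payload))
print(response.json())  # e.g. {'result': [[...], [...]]}
python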
Tensorflow and AWS Lambda in practice.
Tensorflow in production with AWS lambda
How to save a TF model
▸ Use a saver object.
▸ It will save on disk:
▸ the graph model (‘filename.meta’)
▸ the variable values (‘filename’)
▸ Identify the placeholders and output tensors that will be accessed later (here registered with tf.add_to_collection)
saver = tf.train.Saver()
#
# do the training
#
tf.add_to_collection('output', pred)
tf.add_to_collection('input', x)
save_path = saver.save(sess, "model-name.ckpt")
python
Tensorflow in production with AWS lambda
How to restore a TF model
▸ Restore the graph and variable values with a saver object
saver = tf.train.import_meta_graph(filename + '.meta')
with tf.Session() as sess:
    # Restore variables from disk.
    saver.restore(sess, filename)
    pred = tf.get_collection('output')[0]
    x = tf.get_collection('input')[0]
    print("Model restored.")
    # Do some work with the model
    prediction = pred.eval({x: test_data})
python
Tensorflow in production with AWS lambda
Setting up AWS Lambda for Tensorflow
TensorFlow needs to be built for the Lambda execution environment (Amazon Linux), so prepare the package on an EC2 instance:
# install compilation environment
sudo yum -y update
sudo yum -y upgrade
sudo yum -y groupinstall "Development Tools"
# create and activate virtual env
virtualenv tfenv
source tfenv/bin/activate
# install tensorflow (CPU build for Python 2.7)
export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.10.0-cp27-none-linux_x86_64.whl
pip install --upgrade $TF_BINARY_URL
# zip the environment content (make the google namespace package importable first)
touch ~/tfenv/lib/python2.7/site-packages/google/__init__.py
cd ~/tfenv/lib/python2.7/site-packages/
zip -r ~/tf-env.zip . --exclude *.pyc
# also add the packages installed under lib64
cd ~/tfenv/lib64/python2.7/site-packages/
zip -r ~/tf-env.zip . --exclude *.pyc
1. Launch an EC2 instance and connect to it
2. Install TensorFlow in a virtualenv
3. Zip the installed libraries
shell
Tensorflow in production with AWS lambda
A TensorFlow-calling Lambda function
▸ Accepts a list of input vectors: multiple predictions
▸ Returns a list of predictions
import tensorflow as tf

filename = 'model-name.ckpt'

def lambda_handler(event, context):
    # Rebuild the graph saved next to the checkpoint
    saver = tf.train.import_meta_graph(filename + '.meta')
    inputData = event['data']
    with tf.Session() as sess:
        # Restore variables from disk.
        saver.restore(sess, filename)
        # Use the same collection names as at save time
        pred = tf.get_collection('output')[0]
        x = tf.get_collection('input')[0]
        # Apply the model to the input data
        predictions = pred.eval({x: inputData})
    return {'result': predictions.tolist()}
python
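Before packaging, the handler can be smoke-tested locally with a hand-built event (the feature values below are made up; this assumes the checkpoint files sit in the working directory):
if __name__ == '__main__':
    test_event = {'data': [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]}
    print(lambda_handler(test_event, None))
python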
Tensorflow in production with AWS lambda
Upload and test
▸ add your lambda function code and TF model to the environment zip.
▸ upload your function
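One way to script that upload (a sketch with hypothetical names; since the zipped TensorFlow environment is well above the ~50 MB direct-upload limit, it is copied to S3 first and referenced from there):
import boto3

client = boto3.client('lambda')
client.create_function(
    FunctionName='tf-predict',                          # hypothetical name
    Runtime='python2.7',
    Role='arn:aws:iam::123456789012:role/lambda-exec',  # hypothetical role ARN
    Handler='handler.lambda_handler',                   # module.function inside the zip
    Code={'S3Bucket': 'my-deploy-bucket', 'S3Key': 'tf-env.zip'},
    Timeout=60,
    MemorySize=1536,
)
python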
Tensorflow in production with AWS lambda
Where to put the model?
▸ bundled with the function: easy, in particular when testing
▸ on S3: eases updates and allows multiple models to be used in parallel (see the sketch below)
▸ the function can be called with a model reference as an argument
"…
lambda function
tensor flowlive data
prediction
$model
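A sketch of the S3 variant (bucket and key names are hypothetical): the handler downloads the checkpoint pair written by the Saver into /tmp, the only writable path inside a Lambda container, and the caller picks the model via the event.
import boto3

s3 = boto3.client('s3')

def fetch_model(bucket, name):
    # Grab both files written by tf.train.Saver: 'name' and 'name.meta'.
    for suffix in ('', '.meta'):
        s3.download_file(bucket, name + suffix, '/tmp/' + name + suffix)
    return '/tmp/' + name

def lambda_handler(event, context):
    # '$model' in the diagram above: the caller chooses which model to apply.
    filename = fetch_model('my-models-bucket', event.get('model', 'model-name.ckpt'))
    # ... restore the graph from `filename` and run the prediction as in the handler above
python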
Tensorflow in production with AWS lambda
Caveats
▸ No GPU support at the moment
▸ model loading time: increase the function's memory allocation (which also scales CPU) for fast API response times
▸ Python 2.7 only (Python 3 is doable with more work)
▸ request a limit increase from AWS for more than 100 concurrent executions
Your turn!
Thanks
Questions?
@fabian_dubois
fabian@datamaplab.com
check denryoku.io
references:
▸ http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
▸ TensorFlow package URL: https://www.tensorflow.org/versions/r0.10/get_started/os_setup.html
