
I want to do some simple logging for my server which is a small Flask app running in a Docker container.

Here is the Dockerfile

# Dockerfile
FROM dreen/flask
MAINTAINER dreen
WORKDIR /srv

# Get source
RUN mkdir -p /srv
COPY perfektimprezy.tar.gz /srv/perfektimprezy.tar.gz
RUN tar x -f perfektimprezy.tar.gz
RUN rm perfektimprezy.tar.gz

# Run server
EXPOSE 80
CMD ["python", "index.py", "1>server.log", "2>server.log"]

As you can see on the last line I redirect stderr and stdout to a file. Now I run this container and shell into it

docker run -d -p 80:80 perfektimprezy
docker exec -it "... id of container ..." bash

And observe the following things:

The server is running and the website working

There is no /srv/server.log

ps aux | grep python yields:

root         1  1.6  3.2  54172 16240 ?        Ss   13:43   0:00 python index.py 1>server.log 2>server.log
root        12  1.9  3.3 130388 16740 ?        Sl   13:43   0:00 /usr/bin/python index.py 1>server.log 2>server.log
root        32  0.0  0.0   8860   388 ?        R+   13:43   0:00 grep --color=auto python

But there are no logs... HOWEVER, if I docker attach to the container I can see the app generating output in the console.

How do I properly redirect stdout/err to a file when using Docker?

4 Answers


When you specify a JSON list as CMD in a Dockerfile, it is not executed in a shell, so the usual shell features, such as stdout and stderr redirection, won't work.

From the documentation:

The exec form is parsed as a JSON array, which means that you must use double-quotes (") around words not single-quotes (').

Unlike the shell form, the exec form does not invoke a command shell. This means that normal shell processing does not happen. For example, CMD [ "echo", "$HOME" ] will not do variable substitution on $HOME. If you want shell processing then either use the shell form or execute a shell directly, for example: CMD [ "sh", "-c", "echo $HOME" ].

What your command actually does is execute your index.py script and pass the strings "1>server.log" and "2>server.log" to it as command-line arguments.
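You can see this for yourself with a small simulation, using an inline python -c script as a stand-in for index.py (the script name here is hypothetical):

```python
import subprocess
import sys

# The exec-form CMD passes the ">" strings straight to the program as
# argv entries; no shell ever interprets them as redirections.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1:])",
     "1>server.log", "2>server.log"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # ['1>server.log', '2>server.log']
```

The redirection strings arrive in sys.argv exactly as written, which is why no server.log is ever created.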

Use one of the following instead (both should work):

  1. CMD python index.py > server.log 2>&1
  2. CMD ["/bin/sh", "-c", "python index.py > server.log 2>&1"]
  • The documentation recommends using the exec shell built-in so that the container honors OS signals and docker stop. The first command then becomes: CMD exec python index.py > server.log 2>&1
    – lcfd
    Commented Mar 31, 2017 at 7:31
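The effect of exec can be sketched outside Docker with plain sh. Without exec, the shell forks a child for the command, so the child gets a new PID; with exec, the shell is replaced in place and the PID is preserved. Inside a container that PID is 1, the process docker stop signals:

```shell
# Without exec: two different PIDs. (The trailing ':' stops the shell
# from tail-exec'ing the last command, an optimization some shells do.)
sh -c 'echo $$; sh -c "echo \$\$"; :'

# With exec: the same PID printed twice, because the inner sh
# replaced the outer one rather than running as its child.
sh -c 'echo $$; exec sh -c "echo \$\$"'
```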

To use docker run in a shell pipeline or under shell redirection, making run accept stdin and output to stdout and stderr appropriately, use this incantation:

docker run -i --log-driver=none -a stdin -a stdout -a stderr ...

e.g. to run the alpine image and execute the UNIX command cat in the contained environment:

echo "This was piped into docker" |
  docker run -i --log-driver=none -a stdin -a stdout -a stderr \
    alpine cat - |
  xargs echo This is coming out of docker: 

emits:

This is coming out of docker: This was piped into docker
  • -d and -a do not go together Commented Sep 8, 2021 at 12:34

As a complement: when using docker-compose, you could also try:

command: bash -c "script_or_command > /path/to/log/command.log 2>&1"

  • Did you try it? This makes perfect sense to me, and doesn't work! So far I have been completely unsuccessful redirecting stderr to stdout when using a docker-compose command directive. stdout comes through fine. Commented Apr 14, 2017 at 17:47
  • Ah, my problem turns out to be that I needed to set the PYTHONUNBUFFERED environment variable or run python with the -u flag, as this explains: stackoverflow.com/a/24183941/562883 Commented Apr 15, 2017 at 21:26
  • I had to get the pg_dump of my postgres container into a file on the host. Here is what I did: docker run -ti -v /usr/local/jay/words_data:/mnt/data --rm --link mypostgres:pghost postgres bash -c "pg_dump -h pghost -U app_user myschema >> /mnt/data/export.sql" (sidenote: I could have used the simpler option -f with pg_dump, but it didn't occur to me at the time. Anyway, this answer helped me.)
    – Jay
    Commented Jul 31, 2018 at 5:12
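Pulling the answer and the buffering comment together, a minimal docker-compose sketch might look like this (service name, image name, and paths are placeholders):

```yaml
services:
  web:
    image: perfektimprezy            # placeholder image name
    environment:
      PYTHONUNBUFFERED: "1"          # flush Python output immediately
    command: bash -c "python index.py > /srv/server.log 2>&1"
```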

I personally use:

ENTRYPOINT ["python3"]
CMD ["-u", "-m", "swagger_server"]

The "-u" is the key :)

  • -u: "Force the stdout and stderr streams to be unbuffered."
    – Lenormju
    Commented Jun 14, 2022 at 12:25
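A quick way to observe what -u changes: check sys.stdout.write_through in a subprocess whose stdout is a pipe, which is also how stdout looks under Docker's log collector. This is a sketch assuming CPython 3.7+:

```python
import subprocess
import sys

# Whether the text layer of stdout flushes every write immediately.
probe = "import sys; print(sys.stdout.write_through)"

# stdout is a pipe here, so without -u Python block-buffers it...
default = subprocess.run([sys.executable, "-c", probe],
                         capture_output=True, text=True).stdout.strip()
# ...while -u forces unbuffered (write-through) output.
forced = subprocess.run([sys.executable, "-u", "-c", probe],
                        capture_output=True, text=True).stdout.strip()

print(default, forced)  # False True
```

This is why log lines can sit invisible in a buffer until the container stops unless -u (or PYTHONUNBUFFERED) is set.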
