python – How to manage local vs production settings in Django?

The Question :

317 people think this question is useful

What is the recommended way of handling settings for local development and the production server? Some of them (like constants, etc) can be changed/accessed in both, but some of them (like paths to static files) need to remain different, and hence should not be overwritten every time the new code is deployed.

Currently, I am adding all constants to settings.py. But every time I change some constant locally, I have to copy it to the production server and edit the file for production-specific changes…

Edit: looks like there is no standard answer to this question, I’ve accepted the most popular method.

The Question Comments :
  • Please have a look at django-configurations.
  • The accepted method is no longer the most popular one.
  • django-split-settings is very easy to use. It does not require rewriting any default settings.
  • You should use separate settings files: in your local.py and in your production.py, do `from .base import *`, then run your project with: python manage.py runserver --settings=project_name.settings.local

The Answer 1

131 people think this answer is useful


    try:
        from local_settings import *
    except ImportError as e:
        pass

You can override whatever is needed in local_settings.py; it should stay out of your version control then. But since you mention copying, I’m guessing you use none 😉
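For illustration, an untracked local_settings.py on a dev machine might then look like this (all values below are made up, not from the answer):

```python
# local_settings.py -- untracked; per-machine overrides (illustrative values)
DEBUG = True
STATIC_ROOT = '/home/me/project/static'

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}
```

Anything not overridden here keeps the value assigned earlier in settings.py.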

The Answer 2

307 people think this answer is useful

Two Scoops of Django: Best Practices for Django 1.5 suggests using version control for your settings files and storing the files in a separate directory:


The base.py file contains common settings (such as MEDIA_ROOT or ADMIN), while local.py and production.py have site-specific settings:

In the base settings file settings/base.py:

    INSTALLED_APPS = [
        # common apps...
    ]

In the local development settings file settings/local.py:

from project.settings.base import *

DEBUG = True

INSTALLED_APPS += [
    'debug_toolbar', # and other apps for local development
]

In the production settings file settings/production.py:

from project.settings.base import *

DEBUG = False

INSTALLED_APPS += [
    # other apps for production site
]

Then when you run django, you add the --settings option:

# Running django for local development
$ ./manage.py runserver 0:8000 --settings=project.settings.local

# Running django shell on the production site
$ ./manage.py shell --settings=project.settings.production

The authors of the book have also put up a sample project layout template on GitHub.
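If typing --settings every time gets old, a common alternative (a sketch, not from the book) is to export DJANGO_SETTINGS_MODULE, or to default it near the top of manage.py; the module path below is illustrative:

```python
import os

# Plain `./manage.py runserver` then uses the local settings; production can
# still override this via --settings or by exporting the variable itself.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.local')
```

setdefault only fills the value in when the variable is absent, so an explicit export always wins.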

The Answer 3

75 people think this answer is useful

Instead of settings.py, use this layout:

└── settings/
    ├── __init__.py  <= not versioned
    ├── common.py
    ├── dev.py
    └── prod.py

common.py is where most of your configuration lives. prod.py imports everything from common, and overrides whatever it needs to override:

from __future__ import absolute_import # optional, but I like it
from .common import *

# Production overrides
DEBUG = False

Similarly, dev.py imports everything from common.py and overrides whatever it needs to override.

Finally, __init__.py is where you decide which settings to load, and it’s also where you store secrets (therefore this file should not be versioned):

from __future__ import absolute_import
from .prod import *  # or .dev if you want dev

SECRET_KEY = '(3gd6shenud@&57...'
DATABASES['default']['PASSWORD'] = 'f9kGH...'


What I like about this solution is:

  1. Everything is in your versioning system, except secrets
  2. Most configuration is in one place: common.py.
  3. Prod-specific things go in prod.py, dev-specific things go in dev.py. It’s simple.
  4. You can override stuff from common.py in prod.py or dev.py, and you can override anything in __init__.py.
  5. It’s straightforward Python. No re-import hacks.
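A dev.py following this layout might look like the sketch below (the console email backend is an assumption of mine, not from the answer):

```python
# settings/dev.py -- development overrides on top of common.py
from .common import *

DEBUG = True
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
```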

The Answer 4

20 people think this answer is useful

I use a slightly modified version of the “if DEBUG” style of settings that Harper Shelby posted. Obviously depending on the environment (win/linux/etc.) the code might need to be tweaked a bit.

I was in the past using the “if DEBUG” style, but I found that occasionally I needed to do testing with DEBUG set to False. What I really wanted was to distinguish whether the environment was production or development, which gave me the freedom to choose the DEBUG level.

    if os.environ.get('DJANGO_PRODUCTION'):  # hypothetical switch, e.g. set only on production machines
        PRODUCTION = True
    else:
        PRODUCTION = False

    # ...

    if PRODUCTION:
        DATABASE_HOST = 'production.db.internal'  # illustrative
    else:
        DATABASE_HOST = 'localhost'

I’d still consider this way of handling settings a work in progress. I haven’t seen any one way of handling Django settings that covers all the bases and at the same time isn’t a total hassle to set up (I’m not down with the 5x settings-files methods).
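One way to make the production/development distinction concrete is a hostname check; everything below (host names, database host) is illustrative, not taken from the answer:

```python
import socket

# Decide PRODUCTION from the machine's hostname, independently of DEBUG
PRODUCTION_HOSTS = {'webserver1', 'webserver2'}  # made-up production hosts
PRODUCTION = socket.gethostname() in PRODUCTION_HOSTS

# DEBUG defaults from the environment type but can still be flipped by hand
DEBUG = not PRODUCTION
DATABASE_HOST = 'db.internal' if PRODUCTION else 'localhost'
```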

The Answer 5

14 people think this answer is useful

I use a local settings file and a production settings file. After trying several options, I’ve found that it’s easy to waste time with complex solutions when simply having two settings files feels easy and fast.

When you use mod_python/mod_wsgi for your Django project you need to point it to your settings file. If you point it to the local settings file on your local server and the production settings file on your production server, then life becomes easy. Just edit the appropriate settings file and restart the server (the Django development server restarts automatically).

The Answer 6

11 people think this answer is useful

TL;DR: The trick is to modify os.environ before you import settings/base.py in any settings/<purpose>.py; this will greatly simplify things.

Just thinking about all these intertwining files gives me a headache. Combining, importing (sometimes conditionally), overriding, patching of what was already set in case DEBUG setting changed later on. What a nightmare!

Through the years I went through all the different solutions. They all somewhat work, but are so painful to manage. WTF! Do we really need all that hassle? We started with just one settings.py file. Now we need documentation just to combine all of these in the correct order!

I hope I finally hit the (my) sweet spot with the solution below.

Let’s recap the goals (some common, some mine)

  1. Keep secrets a secret β€” don’t store them in a repo!

  2. Set/read keys and secrets through environment settings, 12 factor style.

  3. Have sensible fallback defaults. Ideally for local development you don’t need anything more beside defaults.

  4. …but try to keep defaults production-safe. It’s better to miss a setting override locally than to have to remember to adjust default settings to be safe for production.

  5. Have the ability to switch DEBUG on/off in a way that can have an effect on other settings (e.g. using compressed JavaScript or not).

  6. Switching between purpose settings, like local/testing/staging/production, should be based only on DJANGO_SETTINGS_MODULE, nothing more.

  7. …but allow further parameterization through environment settings like DATABASE_URL.

  8. …also allow them to use different purpose settings and run them locally side by side, eg. production setup on local developer machine, to access production database or smoke test compressed style sheets.

  9. Fail if an environment variable is not explicitly set (requiring an empty value at minimum), especially in production, eg. EMAIL_HOST_PASSWORD.

  10. Respond to the default DJANGO_SETTINGS_MODULE set in manage.py during django-admin startproject

  11. Keep conditionals to a minimum; if the condition is the purposed environment type (e.g. for production, set the log file and its rotation), override settings in the associated purposed settings file.

Do not’s

  1. Do not let django read the DJANGO_SETTINGS_MODULE setting from a file.
    Ugh! Think of how meta this is. If you need to have a file (like a docker env file), read it into the environment before starting up a django process.

  2. Do not override DJANGO_SETTINGS_MODULE in your project/app code, e.g. based on hostname or process name.
    If you are too lazy to set the environment variable (like for tests), do it in tooling just before you run your project code.

  3. Avoid magic and patching of how django reads its settings; preprocess the settings but do not interfere afterwards.

  4. No complicated logic-based nonsense. Configuration should be fixed and materialized, not computed on the fly. Providing fallback defaults is just enough logic here.
    Do you really want to debug why locally you have the correct set of settings, but in production on a remote server, on one of a hundred machines, something computed differently? Oh! Unit tests? For settings? Seriously?


My strategy consists of the excellent django-environ used with ini-style files, providing os.environ defaults for local development, and some minimal and short settings/<purpose>.py files that import settings/base.py AFTER os.environ was set from an INI file. This effectively gives us a kind of settings injection.

The trick here is to modify os.environ before you import settings/base.py.

To see the full example, go to the repo:

    │   ├── __init__.py   <-- imports local.py for compatibility
    │   ├── base.py       <-- almost all the settings, read from the process environment
    │   ├── local.py      <-- a few modifications for local development
    │   ├── production.py <-- ideally empty; everything is in base.py
    │   ├── testing.py    <-- mimics production with reasonable exceptions
    │   └── .env          <-- for local use, not kept in repo


settings/.env holds the defaults for local development: a secret file, mostly to set required environment variables. Set them to empty values if they are not required in local development. We provide defaults here, and not in settings/base.py, to fail on any other machine if they’re missing from the environment.


What happens in settings/local.py is loading the environment from settings/.env, then importing the common settings from settings/base.py. After that we can override a few settings to ease local development.

import logging
import environ

logging.debug("Settings loading: %s" % __file__)

# This will read missing environment variables from a file
# We want to do this before loading the base settings, as they may depend on the environment
environ.Env.read_env(DEBUG='True')  # call reconstructed; kwargs are fallback defaults

from .base import *


EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
# EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'

LOGGING['handlers']['mail_admins']['email_backend'] = 'django.core.mail.backends.dummy.EmailBackend'

# Sync task testing



In settings/production.py we should not expect an environment file, but it’s easier to have one if we’re testing something. Anyway, let’s provide a few defaults inline, so settings/base.py can respond accordingly.

from pathlib import Path

import environ

environ.Env.read_env(Path(__file__).parent / "production.env", DEBUG='False', ASSETS_DEBUG='False')
from .base import *

The main point of interest here are DEBUG and ASSETS_DEBUG overrides, they will be applied to the python os.environ ONLY if they are MISSING from the environment and the file.

These will be our production defaults, no need to put them in the environment or file, but they can be overridden if needed. Neat!
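The MISSING-only semantics boil down to os.environ.setdefault; here is a stdlib-only model of what those read_env keyword defaults amount to:

```python
import os

# A default is applied only when the key is absent from the process
# environment (and, in django-environ's case, from the .env file too).
for key, value in {'DEBUG': 'False', 'ASSETS_DEBUG': 'False'}.items():
    os.environ.setdefault(key, value)
```

A value exported before the process starts therefore always beats the inline default.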


settings/base.py holds your mostly vanilla django settings, with a few conditionals and lots of reading from the environment. Almost everything is in here, keeping all the purposed environments consistent and as similar as possible.

The main differences are below (I hope these are self explanatory):

import os

import environ

env = environ.Env()

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
ROOT_DIR = os.path.dirname(BASE_DIR)  # reconstructed; adjust to your layout

# Where BASE_DIR is a django source root, ROOT_DIR is the whole project root
# It may differ from BASE_DIR, e.g. when your django project code is in a `src` folder
# This may help to separate python modules and *django apps* from other stuff
# like documentation, fixtures, docker settings

# Quick-start development settings - unsuitable for production
# See

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = env('SECRET_KEY')

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = env.bool('DEBUG', default=False)



ALLOWED_HOSTS = []
if 'ALLOWED_HOSTS' in os.environ:
    hosts = os.environ['ALLOWED_HOSTS'].split(" ")
    BASE_URL = "https://" + hosts[0]
    for host in hosts:
        host = host.strip()
        if host:
            ALLOWED_HOSTS.append(host)


# Database

if "DATABASE_URL" in os.environ:  # pragma: no cover
    # Enable database config through environment
    DATABASES = {
        # Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
        'default': env.db(),
    }

    # Make sure we have all the settings we need
    # DATABASES['default']['ENGINE'] = 'django.contrib.gis.db.backends.postgis'
    DATABASES['default']['TEST'] = {'NAME': os.environ.get("DATABASE_TEST_NAME", None)}
    DATABASES['default']['OPTIONS'] = {
        'options': '-c search_path=gis,public,pg_catalog',
        'sslmode': 'require',
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # 'ENGINE': 'django.contrib.gis.db.backends.spatialite',
            'NAME': os.path.join(ROOT_DIR, 'data', 'db.dev.sqlite3'),  # name reconstructed
            'TEST': {
                'NAME': os.path.join(ROOT_DIR, 'data', 'db.test.sqlite3'),
            },
        },
    }
STATIC_ROOT = os.path.join(ROOT_DIR, 'static')

# django-assets

ASSETS_ROOT = os.path.join(ROOT_DIR, 'assets', "compressed")
ASSETS_DEBUG = env.bool('ASSETS_DEBUG', default=DEBUG)  # Disable when testing compressed files in DEBUG mode
if ASSETS_DEBUG:
    ASSETS_MANIFEST = "json:{}".format(os.path.join(ASSETS_ROOT, "manifest.json"))
else:
    ASSETS_URL = STATIC_URL + "assets/compressed/"
    ASSETS_MANIFEST = "json:{}".format(os.path.join(STATIC_ROOT, 'assets', "compressed", "manifest.json"))
ASSETS_MODULES = ('website.assets',)

The last bit shows the power here. ASSETS_DEBUG has a sensible default, which can be overridden in settings/production.py, and even that can be overridden by an environment setting! Yay!

In effect we have a mixed hierarchy of importance:

  1. settings/<purpose>.py – sets defaults based on purpose, does not store secrets
  2. settings/base.py – is mostly controlled by the environment
  3. process environment settings – 12 factor, baby!
  4. settings/.env – local defaults for easy startup

The Answer 7

7 people think this answer is useful

I manage my configurations with the help of django-split-settings.

It is a drop-in replacement for the default settings.py. It is simple, yet configurable. And refactoring of your existing settings is not required.

Here’s a small example (file example/settings/__init__.py):

from split_settings.tools import optional, include
import os

if os.environ['DJANGO_SETTINGS_MODULE'] == 'example.settings':
    include(
        'components/default.py',  # component names are illustrative
        # This file may be missing:
        optional('local_settings.py'),
        scope=globals()
    )

That’s it.


I wrote a blog post about managing django‘s settings with django-split-settings. Have a look!

The Answer 8

6 people think this answer is useful

The problem with most of these solutions is that you either have your local settings applied before the common ones, or after them.

So it’s impossible to override things like

  • the env-specific settings define the addresses for the memcached pool, and in the main settings file this value is used to configure the cache backend
  • the env-specific settings add or remove apps/middleware to the default one

at the same time.

One solution can be implemented using “ini”-style config files with the ConfigParser class. It supports multiple files, lazy string interpolation, default values and a lot of other goodies. Once a number of files have been loaded, more files can be loaded and their values will override the previous ones, if any.

You load one or more config files, depending on the machine address, environment variables and even values in previously loaded config files. Then you just use the parsed values to populate the settings.
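A minimal sketch of the layering with the stdlib ConfigParser (section, option names, and addresses are made up); a later read overrides values from an earlier one:

```python
from configparser import ConfigParser

parser = ConfigParser()
# Pretend this is defaults.ini:
parser.read_string("[django]\ndebug = true\nmemcached_pool = 10.0.0.1:11211 10.0.0.2:11211\n")
# Pretend this is net.domain.webserver01.ini, loaded later:
parser.read_string("[django]\ndebug = false\n")

DEBUG = parser.getboolean('django', 'debug')            # overridden by the later file
CACHE_POOL = parser.get('django', 'memcached_pool').split()  # kept from defaults
```

The parsed values then populate the Django settings as ordinary Python assignments.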

One strategy I have successfully used has been:

  • Load a default defaults.ini file
  • Check the machine name, and load all files which matched the reversed FQDN, from the shortest match to the longest match (so, I loaded net.ini, then net.domain.ini, then net.domain.webserver01.ini, each one possibly overriding values of the previous). This accounts also for developers’ machines, so each one could set up their preferred database driver, etc. for local development
  • Check if there is a “cluster name” declared, and in that case load cluster.cluster_name.ini, which can define things like database and cache IPs
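The reversed-FQDN lookup order from the strategy above can be sketched in a few lines (the function name is mine):

```python
def candidate_files(fqdn):
    """Return ini file names from shortest to longest reversed-FQDN match."""
    parts = fqdn.split('.')[::-1]  # 'a.b.net' -> ['net', 'b', 'a']
    return ['.'.join(parts[:i + 1]) + '.ini' for i in range(len(parts))]
```

For 'webserver01.domain.net' this yields net.ini, net.domain.ini, net.domain.webserver01.ini — exactly the loading order described above.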

As an example of something you can achieve with this, you can define a “subdomain” value per-env, which is then used in the default settings (as hostname: %(subdomain)s…) to define all the necessary hostnames and cookie things django needs to work.

This is as DRY as I could get: most (existing) files had just 3 or 4 settings. On top of this I had to manage customer configuration, so an additional set of configuration files (with things like database names, users and passwords, assigned subdomain, etc.) existed, one or more per customer.

One can scale this as low or as high as necessary, you just put in the config file the keys you want to configure per-environment, and once there’s need for a new config, put the previous value in the default config, and override it where necessary.

This system has proven reliable and works well with version control. It has been used for a long time, managing two separate clusters of applications (15 or more separate instances of the django site per machine), with more than 50 customers, where the clusters were changing size and members depending on the mood of the sysadmin…

The Answer 9

5 people think this answer is useful

Remember that settings.py is a live code file. Assuming that you don’t have DEBUG set on production (which is a best practice), you can do something like:

    if DEBUG:
        STATIC_PATH = '/path/to/dev/files'
    else:
        STATIC_PATH = '/path/to/production/files'

Pretty basic, but you could, in theory, go up to any level of complexity based on just the value of DEBUG – or any other variable or code check you wanted to use.

The Answer 10

5 people think this answer is useful

I am also working with Laravel and I like the implementation there. I tried to mimic it, combining it with the solution proposed by T. Stone (look above):


import re
import socket

PRODUCTION_SERVERS = ['webserver1.example.com']  # hypothetical host names

def check_env():
    for item in PRODUCTION_SERVERS:
        match = re.match(r"(^." + item + "$)", socket.gethostname())
        if match:
            return True
    return False

if check_env():
    PRODUCTION = True
else:
    PRODUCTION = False


Maybe something like this would help you.

The Answer 11

4 people think this answer is useful

My solution to that problem is also somewhat of a mix of some solutions already stated here:

  • I keep a file, kept out of version control, that has the content USING_LOCAL = True in dev and USING_LOCAL = False in prod
  • In settings.py I do an import on that file to get the USING_LOCAL setting

I then base all my environment-dependent settings on that one:

    if USING_LOCAL:
        DEBUG = True
        # dev database settings
    else:
        DEBUG = False
        # prod database settings

I prefer this to having two separate settings files that I need to maintain, as I can keep my settings structured in a single file more easily than having them spread across several files. Like this, when I update a setting I don’t forget to do it for both environments.

Of course, every method has its disadvantages, and this one is no exception. The problem here is that I can’t overwrite that flag file whenever I push my changes into production, meaning I can’t just copy all files blindly, but that’s something I can live with.

The Answer 12

4 people think this answer is useful

For most of my projects I use the following pattern:

  1. Create settings_base.py where I store settings that are common for all environments
  2. Whenever I need to use a new environment with specific requirements I create a new settings file (e.g. settings_production.py) which inherits the contents of settings_base.py and overrides/adds the proper settings variables (from settings_base import *)

(To run manage.py with a custom settings file you simply use the --settings command option: manage.py <command> --settings=<your_settings_module>)

The Answer 13

3 people think this answer is useful

I use a variation of what jpartogi mentioned above, which I find a little shorter:

import platform
from django.core.management import execute_manager 

computername = platform.node()

try:
  settings = __import__(computername + '_settings')
except ImportError: 
  import sys
  sys.stderr.write("Error: Can't find the file '%s_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run manage.py, passing it your settings module.\n(If the file does indeed exist, it's causing an ImportError somehow.)\n" % (computername, __file__))
  sys.exit(1)

if __name__ == "__main__":
  execute_manager(settings)

Basically, on each computer (development or production) I have the appropriate <computername>_settings.py file that gets dynamically loaded.

The Answer 14

3 people think this answer is useful

There is also Django Classy Settings. I personally am a big fan of it. It’s built by one of the most active people on the Django IRC. You would use environment vars to set things.

The Answer 15

3 people think this answer is useful

1 – Create a new folder inside your app and name it settings.

2 – Now create a new __init__.py file in it, and inside it write:

from .base import *

try:
    from .local import *
except ImportError:
    from .production import *

3 – Create three new files in the settings folder, named base.py, local.py, and production.py.

4 – Inside base.py, copy all the content of your previous settings.py file.

5 – In base.py, change your BASE_DIR path to point to the new location of the settings:

Old path-> BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

New path -> BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

This way, the project directory can be structured and managed across production and local development.
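The extra dirname() can be sanity-checked without Django; the paths below are illustrative:

```python
import os.path

old_file = '/srv/project/settings.py'        # old location
new_file = '/srv/project/settings/base.py'   # new location, one level deeper

# Same project root either way; the deeper file needs one more dirname()
old_base = os.path.dirname(os.path.dirname(old_file))
new_base = os.path.dirname(os.path.dirname(os.path.dirname(new_file)))
```

Both expressions resolve to the same directory, which is why exactly one dirname() is added.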

The Answer 16

2 people think this answer is useful

In order to use a different settings configuration in each environment, create different settings files. Then, in your deployment script, start the server using the --settings=<your_settings_module> parameter, via which you can use different settings in different environments.

Benefits of using this approach:

  1. Your settings will be modular based on each environment

  2. You may import the settings file containing the base configuration into the environment-specific settings file and override the values that you want to change in that environment.

  3. If you have a huge team, each developer may have their own local settings file, which they can use without any risk of modifying the server configuration. You can add these local settings to .gitignore if you use git, or .hgignore if you use Mercurial (or any other VCS). That way local settings won’t even be part of the actual code base, keeping it clean.

The Answer 17

2 people think this answer is useful

I had my settings split as follows


We have 3 environments

  • dev
  • staging
  • production

Now obviously staging and production should have environments as similar as possible. So we kept the same settings file for both.

But there was a case where I had to identify whether the running server is a production server. @T. Stone’s answer helped me write the check as follows.

from socket import gethostname, gethostbyname  
PROD_HOSTS = ["webserver1", "webserver2"]

DEBUG = False
ALLOWED_HOSTS = [gethostname(), gethostbyname(gethostname()),]

if any(host in PROD_HOSTS for host in ALLOWED_HOSTS):
    PRODUCTION = True  # body reconstructed; set production-only flags here

The Answer 18

1 people think this answer is useful

I differentiate it in manage.py and created two separate settings files: local_settings.py and prod_settings.py.

In manage.py I check whether the server is a local server or a production server. If it is a local server it will load local_settings.py, and if it is a production server it will load prod_settings.py. Basically this is how it would look:

#!/usr/bin/env python
import sys
import socket
from django.core.management import execute_manager 

ipaddress = socket.gethostbyname( socket.gethostname() )
if ipaddress == '127.0.0.1':  # or your local machine's address
    try:
        import local_settings # Assumed to be in the same directory.
        settings = local_settings
    except ImportError:
        sys.stderr.write("Error: Can't find the file 'local_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run manage.py, passing it your settings module.\n(If the file does indeed exist, it's causing an ImportError somehow.)\n" % __file__)
        sys.exit(1)
else:
    try:
        import prod_settings # Assumed to be in the same directory.
        settings = prod_settings    
    except ImportError:
        sys.stderr.write("Error: Can't find the file 'prod_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run manage.py, passing it your settings module.\n(If the file does indeed exist, it's causing an ImportError somehow.)\n" % __file__)
        sys.exit(1)

if __name__ == "__main__":
    execute_manager(settings)

I found it easier to separate the settings into two separate files instead of doing lots of ifs inside the settings file.

The Answer 19

1 people think this answer is useful

As an alternative to maintaining different files, if you will: if you are using git or any other VCS to push code from local to the server, what you can do is add the settings file to .gitignore.

This will allow you to have different content in both places without any problem. So on the server you can configure an independent version of settings.py, and any changes made locally won’t be reflected on the server, and vice versa.

In addition, it will remove the settings.py file from GitHub as well, avoiding the big fault I have seen many newbies commit.

The Answer 20

1 people think this answer is useful

Making multiple versions of settings.py is an anti-pattern for the 12 Factor App methodology. Use python-decouple or django-environ instead.
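In that spirit, here is a stdlib-only sketch of the idea behind python-decouple's config() call (the helper below is mine, not decouple's actual API):

```python
import os

def config(key, default=None, cast=str):
    """Read a setting from the environment; fail loudly if a required one is missing."""
    raw = os.environ.get(key)
    if raw is None:
        if default is None:
            raise RuntimeError('missing required setting: %s' % key)
        return default
    if cast is bool:
        return raw.lower() in ('1', 'true', 'yes', 'on')
    return cast(raw)
```

A single settings.py then calls, say, config('SECRET_KEY') and config('DEBUG', default=False, cast=bool), and each environment differs only in its exported variables.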

The Answer 21

0 people think this answer is useful

I think the best solution is the one suggested by @T. Stone, but I don’t know why you wouldn’t just use the DEBUG flag in Django. I write the below code for my website:

    if DEBUG:
        from .local_settings import *

Simple solutions are always better than complex ones.

The Answer 22

-3 people think this answer is useful

I found the responses here very helpful. (Has this been more definitively solved? The last response was a year ago.) After considering all the approaches listed, I came up with a solution that I didn’t see listed here.

My criteria were:

  • Everything should be in source control. I don’t like fiddly bits lying around.
  • Ideally, keep settings in one file. I forget things if I’m not looking right at them 🙂
  • No manual edits to deploy. Should be able to test/push/deploy with a single fabric command.
  • Avoid leaking development settings into production.
  • Keep as close as possible to the “standard” (*cough*) Django layout.

I thought switching on the host machine made some sense, but then figured the real issue here is different settings for different environments, and had an aha moment. I put this code at the end of my settings.py file:

import os

try:
    os.environ['DJANGO_DEVELOPMENT_SERVER'] # throws error if unset
    DEBUG = True
    # This is naive but possible. Could also redeclare full app set to control ordering. 
    # Note that it requires a list rather than the generated tuple.
    INSTALLED_APPS.append('debug_toolbar')  # illustrative dev-only app
    # Override production database settings, alternate static/media paths, etc...
except KeyError: 
    print('DJANGO_DEVELOPMENT_SERVER environment var not set; using production settings')

This way, the app defaults to production settings, which means you are explicitly “whitelisting” your development environment. It is much safer to forget to set the environment variable locally than if it were the other way around and you forgot to set something in production and let some dev settings be used.

When developing locally, set the variable either from the shell, or in a .bash_profile or wherever:

    export DJANGO_DEVELOPMENT_SERVER=1
(Or if you’re developing on Windows, set it via the Control Panel or whatever it’s called these days… Windows has always made it obscure where to set environment variables.)

With this approach, the dev settings are all in one (standard) place, and simply override the production ones where needed. Any mucking around with development settings should be completely safe to commit to source control with no impact on production.
