Setting up Django Channels on AWS Elastic Beanstalk

by Abhishek Menon, November 21st, 2017

UPDATE: This tutorial is for Channels 1.x and not for Channels 2.x. Please leave me a note if you guys want to see another tutorial for Channels 2.

Django has always been one of the outliers in the modern era of real-time, asynchronous frameworks and libraries. If you want to build a chat application, for example, Django most probably wouldn’t be your first choice. However, for those of you who hate JavaScript, or if you’re a “perfectionist with a deadline”, Django Channels presents a great option.

Django Channels is a library that brings the power of the asynchronous web to Django. If you aren’t familiar with Django Channels, I highly recommend getting familiar with it before you read further. There are excellent articles out there explaining what Django Channels is and how it can transform the way you use Django: https://realpython.com/blog/python/getting-started-with-django-channels/ and https://blog.heroku.com/in_deep_with_django_channels_the_future_of_real_time_apps_in_django are two great examples. They also show you how to build a basic chat application using Channels. It is fairly straightforward to set up and will get you going in a few minutes!

The problems with hosting a Django Channels Application

In traditional Django, requests are handled by the Django application itself. It looks at the request and URL, determines the correct view function to execute, executes it, produces a response and sends the response back to the user. Fairly straightforward. Django Channels, however, introduces an Interface Server (Daphne) in between, so the interface server is now what communicates with the outside world. The interface server looks at the request and URL, determines the right “Channel”, wraps the request in a “message” for a worker process to consume, and places the message on that Channel. A message broker, like Redis, backs these Channels and delivers the messages to worker processes. A worker process listens on the message queue, processes the message (much like a view function would), produces a response and sends it back to the interface server, which then delivers it back to the user. (Please feel free to take a minute to grasp this, it took me many hours :’))
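To make the worker side concrete, here is a minimal sketch of what a Channels 1.x consumer and its routing might look like (purely illustrative, not code from this tutorial; the name ws_message is a placeholder):

# consumers.py: a function the worker runs, much like a view
def ws_message(message):
    # Echo the incoming WebSocket frame back to the client through the
    # reply channel, which routes it via the interface server.
    message.reply_channel.send({"text": message.content["text"]})

# routing.py: maps channels to consumer functions (Channels 1.x style)
from channels.routing import route
from .consumers import ws_message

channel_routing = [
    route("websocket.receive", ws_message),
]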

This means that instead of just a single process, which you would start using:

python manage.py runserver

you would now be running:

daphne -p 8000 your_app.asgi:channel_layer

and

python manage.py runworker

This enables Django Channels to support multiple types of requests (HTTP, WebSockets, etc.), but it clearly requires more resources than a standard Django application. For one, it requires a message broker. You can get away with an in-memory message broker, but that is not recommended for production. In this example, we will set up Redis on an EC2 instance and use it as the message broker.
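For reference, the your_app.asgi:channel_layer that Daphne loads above is just a small module along these lines (a minimal Channels 1.x sketch; your_app stands in for your project package):

# your_app/asgi.py: the module Daphne serves as your_app.asgi:channel_layer
import os

from channels.asgi import get_channel_layer

# Point Channels at your settings module and expose the channel layer object.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "your_app.settings")
channel_layer = get_channel_layer()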

If you’re using Elastic Beanstalk, the load balancer is configured to forward to port 80 by default, which is where your application would normally be served. But we want requests to reach Daphne instead, so we need to configure the Load Balancer to forward requests to the port Daphne is listening on.

But before all that, we first need to host our Django Application itself. If you are not familiar with how to do that, please follow the steps here:


Deploying Django + Python 3 + PostgreSQL to AWS Elastic Beanstalk (realpython.com): a soup-to-nuts walkthrough of how to set up and deploy a Django application powered by Python 3.

The only change is that you should select an Application Load Balancer instead of the Classic Load Balancer, as WebSockets are only natively supported by the Application Load Balancer.

Next Steps:

Now we need to provision Redis for the channel layer. We can do this in two ways:

1.A. Provisioning a Redis Instance from EC2

1.B. Using ElastiCache

You can pick either one; you only need to do 1.A or 1.B, not both. Please note that ElastiCache includes a free tier as of the date of this post.

1.A. Provisioning a Redis instance for the message broker

Sign in to your AWS console and go to EC2. Click “Launch Instance” at the top and select AWS Marketplace in the side menu.

Search for Redis 4.0

Selecting Redis from AWS Marketplace

After this, follow the remaining steps. Please be sure to save the SSH key (.pem file) for the instance. Then click “Review and Launch”. This will get your Redis instance up and running.

Now SSH into your Redis instance and open the redis.conf file:

sudo nano /jet/etc/redis/redis.conf

Change the bind address from 127.0.0.1 to 0.0.0.0 and the port from 1999 to 6379.
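After the edit, the relevant lines in redis.conf should look roughly like this (the old values shown in the comments are the defaults shipped with the Marketplace image above):

bind 0.0.0.0   # was 127.0.0.1
port 6379      # was 1999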

Save the file and restart Redis using:

sudo service redis restart

You can check that everything is correctly configured by running netstat -antpl; it should show Redis listening at 0.0.0.0:6379.

After this, select the instance in the EC2 Dashboard and, in the panel below, select its security group (this should look something like Redis 4–0–170715-redis_4_0_0-AutogenByAWSMP). Add a new inbound rule with the following info:

Type: Custom TCP Rule, Protocol: TCP, Port Range: 6379, Source: 0.0.0.0/0

You don’t need to add this rule if it already exists; if a similar rule exists with a different port, just modify it. Now grab the Public DNS of your Redis instance from the EC2 Dashboard and save it for reference.
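If you prefer the command line, the same inbound rule can be added with the AWS CLI (the security group ID below is a placeholder for your own):

aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 6379 \
    --cidr 0.0.0.0/0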

1.B. Setting up ElastiCache with a Redis instance

Credits for this section: Chad Van De Hey

Sign in to your AWS console and go to the ElastiCache service. Click to create a new ElastiCache cluster, or, if you have not created anything before, click “Get Started”. We are going to set up an ElastiCache Redis cluster.

a. Click Redis as your cluster engine

b. Enter a name and description, and choose the newest engine version available.

c. Make sure the port is set to 6379.

d. Set the node type to the smallest available

e. Select t2 at the top -> select cache.t2.micro (for free tier)

f. Select “None” for Number of replicas and unselect “multi-az with auto-failover”.

g. Either select one of your existing subnets or create a new one in the same Availability Zone as your EC2 instances.

h. Choose the security group of the EC2 instances that Elastic Beanstalk created for you.

i. Unselect “Enable automatic backups” and click “Create”

You will be redirected to your ElastiCache clusters/instances, where you will see your instance being spun up and configured. After this process has completed, make sure to note down the endpoint URL of the ElastiCache instance, because you will need it to configure your Django app later.
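Roughly the same cluster can be created from the AWS CLI, if you prefer (the cluster ID, engine version and security group ID below are placeholders):

aws elasticache create-cache-cluster \
    --cache-cluster-id django-channels-redis \
    --engine redis \
    --engine-version 3.2.10 \
    --cache-node-type cache.t2.micro \
    --num-cache-nodes 1 \
    --port 6379 \
    --security-group-ids sg-0123456789abcdef0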

2. Editing the Django Configuration

Now, in your production settings.py, change your Redis host and port so the channel layer uses the newly created Redis instance (if you chose ElastiCache in 1.B, use its endpoint here instead of the EC2 instance’s Public DNS):

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgi_redis.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("<The Public DNS of the Redis instance>", 6379)],
        },
        "ROUTING": "<your_app>.routing.channel_routing",
    }
}

Please change <your_app> to the name of your project. This configures your Django application to use the Redis instance we created.
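As a quick sanity check (a sketch, not part of the original tutorial), you can talk to the channel layer directly from python manage.py shell; if Redis is unreachable, the send will raise a connection error:

import asgi_redis

# Build a layer pointing at the same host as in CHANNEL_LAYERS above.
layer = asgi_redis.RedisChannelLayer(hosts=[("<The Public DNS of the Redis instance>", 6379)])
layer.send("sanity-check", {"text": "hello"})
# receive_many is the ASGI v1 name; newer asgi_redis versions also expose receive().
# Should print ("sanity-check", {"text": "hello"}) if the round trip works.
print(layer.receive_many(["sanity-check"]))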

3. Running the Daphne server and a worker process as daemons

In your app directory, open the .ebextensions folder and create a new file called daemon.config with the following contents:
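A minimal sketch of such a file is shown below. It assumes the pre-Amazon Linux 2 Python platform (application in /opt/python/current/app, virtualenv in /opt/python/run/venv, supervisord config at /opt/python/etc/supervisord.conf) and Daphne listening on port 5000; the hook file name and program names are illustrative, so adapt them to your setup:

files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_daemon.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash
      # Write supervisord program definitions for Daphne and the Channels
      # worker, then ask the platform's supervisord to load them. Paths are
      # the old Python platform's defaults; adjust them (and the port) if
      # your environment differs.

      cat > /opt/python/etc/daemon.conf <<'EOF'
      [program:daphne]
      command=/opt/python/run/venv/bin/daphne -b 0.0.0.0 -p 5000 <your_project>.asgi:channel_layer
      directory=/opt/python/current/app
      user=wsgi
      autostart=true
      autorestart=true

      [program:worker]
      command=/opt/python/run/venv/bin/python manage.py runworker
      directory=/opt/python/current/app
      user=wsgi
      autostart=true
      autorestart=true
      EOF

      # Include the new file from the main supervisord config (only once)
      if ! grep -q daemon.conf /opt/python/etc/supervisord.conf; then
          echo "[include]" >> /opt/python/etc/supervisord.conf
          echo "files: daemon.conf" >> /opt/python/etc/supervisord.conf
      fi

      # Reload supervisord so it picks up (or restarts) the two programs
      supervisorctl -c /opt/python/etc/supervisord.conf reread
      supervisorctl -c /opt/python/etc/supervisord.conf update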

Please change <your_project> to the name of your Django project. This config basically creates a script and places it in /opt/elasticbeanstalk/hooks/appdeploy/post/ so that it executes after the application deploys. The script, in turn, writes a supervisord configuration, which is responsible for running the Daphne and worker daemon processes and keeping them managed by supervisord. (Again, please feel free to take a minute to grasp this :’) )

4. The final step: Configure the ALB

Now that we have Redis set up and the Daphne and worker processes running, all we need to do is configure our Application Load Balancer to forward requests to the Daphne server, which is listening on port 5000 (see the config script for the daemon processes).

Create a new file in your .ebextensions folder called alb_listener.config and put the following code in it.

option_settings:
  aws:elbv2:listener:80:
    DefaultProcess: http
    ListenerEnabled: 'true'
    Protocol: HTTP
  aws:elasticbeanstalk:environment:process:http:
    Port: '5000'
    Protocol: HTTP

Please be careful with the spacing, as this is YAML syntax.

Redeploy your app and voila! Your Django Channels app is up and running on AWS Elastic Beanstalk.

If you have any questions, please feel free to ask in the comments! Any recommendations for future blog posts are also welcome, if you liked this one, that is. :’)

Thanks for reading!