It is time to create another cool project using Django and REST Framework. In this post we are going to build a real-time REST API. Currently I am interested in cryptocurrency, so I decided to create a Cryptocurrency API that I can later use in React. To do that, we need to crawl and update the data continuously while avoiding long request timeouts.

Installation and Configuration

Let's start by creating a new project named cryptocurrencytracking and, inside it, an app named trackingAPI:

django-admin startproject cryptocurrencytracking
cd cryptocurrencytracking
django-admin startapp trackingAPI

Then install REST Framework:

pip install djangorestframework

Once the installation is complete, open your settings.py and update INSTALLED_APPS:

INSTALLED_APPS = [
    ...
    'rest_framework',
    'trackingAPI',
]

As stated before, we need to handle long-running requests. Celery is the best choice for background task processing in the Python/Django ecosystem: it has a simple and clear API, and it integrates beautifully with Django. We will use Celery to handle the time-consuming work by passing it to a queue to be executed in the background, so the server always stays ready to respond to new requests.

To install Celery, run the following command:

pip install celery

Celery requires a solution to send and receive messages; usually this comes in the form of a separate service called a message broker. We will configure Celery to use the RabbitMQ messaging system, as it provides robust, stable performance and interacts well with Celery. On Ubuntu, RabbitMQ can be installed from the repositories:

sudo apt-get install rabbitmq-server

Then enable and start the RabbitMQ service:

sudo systemctl enable rabbitmq-server
sudo systemctl start rabbitmq-server

(On a Mac, see "Install RabbitMQ on Mac".)

Once the installation is complete, add the CELERY_BROKER_URL configuration to the end of the settings.py file:

CELERY_BROKER_URL = 'amqp://localhost'

Then create celery.py inside your project package:

# cryptocurrencytracking/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cryptocurrencytracking.settings')

app = Celery('cryptocurrencytracking')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

Here we set the default Django settings module for the 'celery' program and load task modules from all registered Django app configs. Now import the Celery app inside the project's __init__.py:

# cryptocurrencytracking/__init__.py
from .celery import app as celery_app

__all__ = ['celery_app']

This makes sure our Celery app is loaded every time Django starts.

Creating model

In your models.py:

# trackingAPI/models.py
from django.db import models


class Cryptocurrency(models.Model):
    cryptocurrency = models.CharField(max_length=100)
    price = models.CharField(max_length=100)
    market_cap = models.CharField(max_length=100)
    change = models.CharField(max_length=100)

    def __str__(self):
        return self.cryptocurrency

We are going to crawl a website named Coinranking, and if you visit the site you can see these field names there. After defining the model, apply it to the database with python manage.py makemigrations trackingAPI followed by python manage.py migrate, since the tasks below create and update rows in this table.

Crawling Cryptocurrency Data

We will use BeautifulSoup to crawl the cryptocurrency values from the given URL. Beautiful Soup is a Python library for pulling data out of HTML and XML files. It works with your favorite parser to provide idiomatic ways of navigating, searching, and modifying the parse tree. It commonly saves programmers hours or days of work. Run the following command in your terminal to install it:

pip install beautifulsoup4
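Before wiring the scraping into a Celery task, it can be worth a quick sanity check in a Python shell that the page can be fetched and parsed. This is only a sketch: the CSS class names below are the same ones used in the task code later in this post, and they will break whenever Coinranking changes its markup.

from urllib.request import urlopen, Request
from bs4 import BeautifulSoup

# Fetch the page with a browser-like User-Agent so the request is not rejected
req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
html = urlopen(req).read()
bs = BeautifulSoup(html, 'html.parser')

# Print the name of the first coin in the table as a quick smoke test;
# these selectors assume the markup described in this post and may need updating
first_row = bs.find('tbody', class_="table__body").find('tr', class_="table__row")
print(first_row.find('span', class_="profile__name").get_text().strip())

If that prints a coin name, the selectors still match and the task below should work.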
Now, create a new file named tasks.py inside our app trackingAPI:

# trackingAPI/tasks.py
from time import sleep

from celery import shared_task
from bs4 import BeautifulSoup
from urllib.request import urlopen, Request

from .models import Cryptocurrency


@shared_task
def crawl_currency():
    # Do the heavy lifting: fetch the page and create objects in the database
    print('Crawling data and creating objects in database ..')
    req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')

    # Find the first 5 table rows
    rows = bs.find('tbody', class_="table__body").find_all('tr', class_="table__row")[0:5]
    for row in rows:
        cryptocurrency = row.find('span', class_="profile__name").get_text().strip().replace('\n', '')
        values = row.find_all('div', class_="valuta")
        price = values[0].get_text().strip().replace('\n', '')
        market_cap = values[1].get_text().strip().replace('\n', '')
        change = row.find('div', class_="change").find('span').get_text().strip().replace('\n', '')
        print({'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change})

        # Create an object in the database from the crawled data
        Cryptocurrency.objects.create(
            cryptocurrency=cryptocurrency,
            price=price,
            market_cap=market_cap,
            change=change
        )
        # Sleep 3 seconds between rows to avoid any errors
        sleep(3)

@shared_task creates an independent instance of the task for each app, making the task reusable. This is what makes the @shared_task decorator useful for libraries and reusable apps, since they do not have access to the app of the user.

As you see, we crawl the data, clean it of useless characters, and then create a new object in the database. Once the data is crawled, we need to update these objects continuously:

# trackingAPI/tasks.py (continued)
@shared_task
def update_currency():
    print('Updating data ..')
    req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')

    rows = bs.find('tbody', class_="table__body").find_all('tr', class_="table__row")[0:5]
    for row in rows:
        cryptocurrency = row.find('span', class_="profile__name").get_text().strip().replace('\n', '')
        values = row.find_all('div', class_="valuta")
        price = values[0].get_text().strip().replace('\n', '')
        market_cap = values[1].get_text().strip().replace('\n', '')
        change = row.find('div', class_="change").find('span').get_text().strip().replace('\n', '')
        print({'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change})

        data = {'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change}
        # Update the existing row for this coin with the freshly crawled values
        Cryptocurrency.objects.filter(cryptocurrency=cryptocurrency).update(**data)
        sleep(3)


# This block runs when the worker imports the module:
# crawl once if the database is empty, then keep updating every 15 seconds
if not Cryptocurrency.objects.exists():
    crawl_currency()

while True:
    sleep(15)
    update_currency()

As you see, we crawl the data every 15 seconds and update our objects. If you want to see the result, start Celery in a terminal:

celery -A cryptocurrencytracking worker -l info

and check your admin to see the created objects.
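Seeing the objects in the admin assumes the model is actually registered there, which the steps above do not cover. A minimal trackingAPI/admin.py (together with a superuser created via python manage.py createsuperuser) would look like this:

# trackingAPI/admin.py
from django.contrib import admin

from .models import Cryptocurrency

# Register the model so the crawled objects show up in the Django admin
admin.site.register(Cryptocurrency)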
Building API

Alright! Now our objects are being updated, and we need to create the API using REST Framework. Create serializers.py inside our app. Serializers allow complex data such as querysets and model instances to be converted to native Python datatypes that can then be easily rendered into JSON, XML or other content types. Serializers also provide deserialization, allowing parsed data to be converted back into complex types, after first validating the incoming data.

# trackingAPI/serializers.py
from rest_framework import serializers

from .models import Cryptocurrency


class CryptocurrencySerializer(serializers.ModelSerializer):
    class Meta:
        model = Cryptocurrency
        fields = ['cryptocurrency', 'price', 'market_cap', 'change']

The ModelSerializer class provides a shortcut that lets you automatically create a Serializer class with fields that correspond to the model fields. For more information, take a look at the documentation.

The next step is building the API views, so open views.py:

# trackingAPI/views.py
from django.shortcuts import render
from rest_framework import generics

from .models import Cryptocurrency
from .serializers import CryptocurrencySerializer


class ListCryptocurrencyView(generics.ListAPIView):
    """
    Provides a get method handler.
    """
    queryset = Cryptocurrency.objects.all()
    serializer_class = CryptocurrencySerializer

And finally, configure urls.py:

# cryptocurrencytracking/urls.py
from django.contrib import admin
from django.urls import path

from trackingAPI.views import ListCryptocurrencyView

urlpatterns = [
    path('admin/', admin.site.urls),
    path('', ListCryptocurrencyView.as_view()),
]

When you run the server and the Celery worker (in separate terminals), the root URL will list the tracked cryptocurrencies.
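As a rough illustration of what that listing looks like, here is a quick client-side check. This is only a sketch: it assumes the development server is running on the default port 8000 and that the requests package is installed (this tutorial does not otherwise use it), and the values in the comment are made up; the real ones come from whatever Coinranking reports at the time.

# A quick check of the API from a separate Python shell
import requests

response = requests.get('http://localhost:8000/')
print(response.status_code)  # 200 if everything is wired up
print(response.json())
# Example output (illustrative values only):
# [{'cryptocurrency': 'Bitcoin', 'price': '$ 9,150.00',
#   'market_cap': '$ 167.2 billion', 'change': '1.2 %'}, ...]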
""" queryset = Cryptocurrency.objects.all() serializer_class = CryptocurrencySerializer #views.py from django.shortcuts import render from rest_framework import generics from .models import Cryptocurrency from .serializers import CryptocurrencySerializer class ListCryptocurrencyView(generics.ListAPIView): """ Provides a get method handler. """ queryset = Cryptocurrency.objects.all() serializer_class = CryptocurrencySerializer #views.py from django.shortcuts import render from rest_framework import generics from .models import Cryptocurrency from .serializers import CryptocurrencySerializer class ListCryptocurrencyView ( generics.ListAPIView ): """ Provides a get method handler. """ queryset = Cryptocurrency.objects. all () serializer_class = CryptocurrencySerializer #views.py from django.shortcuts import render from rest_framework import generics from .models import Cryptocurrency from .serializers import CryptocurrencySerializer class ListCryptocurrencyView ( generics.ListAPIView ): """ Provides a get method handler. """ queryset = Cryptocurrency.objects. all () and finally configure urls.py urls.py #urls.py from django.contrib import admin from django.urls import path from trackingAPI.views import ListCryptocurrencyView urlpatterns = [ path('admin/', admin.site.urls), path('', ListCryptocurrencyView.as_view()), ] #urls.py from django.contrib import admin from django.urls import path from trackingAPI.views import ListCryptocurrencyView urlpatterns = [ path('admin/', admin.site.urls), path('', ListCryptocurrencyView.as_view()), ] #urls.py from django.contrib import admin from django.urls import path from trackingAPI.views import ListCryptocurrencyView urlpatterns = [ path( 'admin/' , admin.site.urls), path( '' , ListCryptocurrencyView.as_view()), ] #urls.py from django.contrib import admin from django.urls import path from trackingAPI.views import ListCryptocurrencyView path( 'admin/' , admin.site.urls), path( '' , ListCryptocurrencyView.as_view()), when you run the server and celery (separate terminals) you will see following result: Try to refresh the page every 15 second or every minute and you will notice that values are changing. You can clone or download this project from my GitHub: https://github.com/raszidzie/Cryptocurrency-REST-API-Django https://github.com/raszidzie/Cryptocurrency-REST-API-Django https://github.com/raszidzie/Cryptocurrency-REST-API-Django Mission Accomplished! I hope you learned something from this tutorial and make sure you are following me on social media. Also check REVERSE PYTHON for more. REVERSE PYTHON REVERSE PYTHON Stay Connected!