    Python

    Creating Real-Time API with Beautiful Soup and Django REST Framework

    By coderashapl · February 26, 2020

    A few weeks ago, I got interested in trading and found that most companies offer paid services for analyzing forex data. My objective was to implement some ML algorithms to predict the market, so I decided to create a real-time API that I could use in React to test my own automated strategies.

    By the end of this tutorial, you’ll be able to turn any website into an API without relying on any online service. We will mainly use Beautiful Soup and Django REST Framework to build a real-time API by crawling forex data.

    You’ll need a basic understanding of Django and of Ubuntu to run some of the commands below. If you’re using another operating system, you can install Anaconda to make your work easier.

    Installation and Configuration

    To get started, create and activate a virtual environment with the following commands:

    virtualenv env
    . env/bin/activate

    Once the environment is activated, install Django and Django REST Framework:

    pip install django djangorestframework

    Now, create a new project named trading and, inside the project, create an app named forexAPI.

    django-admin startproject trading
    cd trading
    django-admin startapp forexAPI

    Then open your settings.py and update the INSTALLED_APPS configuration:

    settings.py

    INSTALLED_APPS = [
        ...
    
        'rest_framework',
        'forexAPI',  
    ]
    

    In order to create a real-time API, we’ll need to crawl and update data continuously. If the application is overloaded with traffic, the web server can only handle a certain number of requests and will leave users waiting far too long. This is where Celery comes in: it is the best choice for background task processing, since passing the crawlers to a queue to be executed in the background keeps the server ready to respond to new requests.

    pip install Celery

    Additionally, Celery requires a message broker to send and receive messages, so we’ll use RabbitMQ for that. You can install RabbitMQ from Ubuntu’s repositories with the following command:

    sudo apt-get install rabbitmq-server

    Then enable and start the RabbitMQ service:

    sudo systemctl enable rabbitmq-server
    sudo systemctl start rabbitmq-server
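
    Optionally, you can verify that the broker is running before wiring it up to Celery (a quick sanity check, not part of the original setup):

    sudo rabbitmqctl status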

    If you are using another operating system, you can follow the download instructions in the official RabbitMQ documentation.

    Once the installation completes, add the CELERY_BROKER_URL setting at the end of the settings.py file:

    settings.py

    CELERY_BROKER_URL = 'amqp://localhost'

    Now we have to set the default Django settings module for the ‘celery’ program. Create a new file named celery.py inside the project package (the directory that contains settings.py), as shown in the layout below:

    .
    ├── asgi.py
    ├── celery.py
    ├── __init__.py
    ├── settings.py
    ├── urls.py
    └── wsgi.py
    

    celery.py

    import os
    from celery import Celery
    
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'trading.settings')
    
    app = Celery('trading')
    app.config_from_object('django.conf:settings', namespace='CELERY')
    app.autodiscover_tasks()
    

    We are setting the default Django settings module for the ‘celery’ program and loading task modules from all registered Django app configs.

    Open __init__.py in the same package and import the Celery app to ensure it is loaded once Django starts:

    from .celery import app as celery_app
    
    __all__ = ['celery_app']
    

    Crawling Data with Beautiful Soup

    We are going to crawl one of the popular real-time market screeners, investing.com, using Beautiful Soup, an easy-to-use parser that doesn’t require any knowledge of parsing theory or techniques, and whose excellent documentation is full of code examples that make it easy to learn. Install Beautiful Soup with the following command:

    pip install beautifulsoup4

    The next step is to create a model to store the crawled data in the database. If you open the website, you’ll see a forex table whose column names will become our model fields.

    models.py

    from django.db import models
    
    class Currency(models.Model):
        pair = models.CharField(max_length=20)
        bid = models.FloatField()
        ask = models.FloatField()
        high = models.FloatField()
        low = models.FloatField()
        change = models.CharField(max_length=20)
        change_p = models.CharField(max_length=20)
        time = models.TimeField()
    
        class Meta:
            verbose_name = 'Currency'
            verbose_name_plural = 'Currencies'
    
        def __str__(self):
            return self.pair
    

    Then migrate your database with the following commands:

    python manage.py makemigrations forexAPI
    python manage.py migrate

    After the migrations, create a new file named tasks.py inside the app directory (forexAPI); it will contain all our Celery tasks. The Celery app that we built in the project package collects tasks from every Django app listed in INSTALLED_APPS. Before implementing the task, open your browser’s developer tools and inspect the table elements that are going to be crawled.

    [Screenshot: Inspect-Element-Forex]

    First, we use urllib’s Request class to open the website, because Beautiful Soup can’t make a request to a web server on its own. Then we grab all the table rows (<tr>) and iterate through them to reach the individual cells (<td>). If you look at the cells inside each row, you’ll notice that their class names include an incrementing value that identifies the row, so we also need to keep count of the iterations to pick out the right cells. Python’s built-in enumerate() is made for exactly this: we enumerate the rows and pass the index into the class name.

    tasks.py

    from time import sleep
    from celery import shared_task
    from bs4 import BeautifulSoup
    from urllib.request import urlopen, Request
    from .models import Currency
    
    @shared_task
    # some heavy stuff here
    def create_currency():
        print('Creating forex data ..')
        req = Request('https://www.investing.com/currencies/single-currency-crosses', headers={'User-Agent': 'Mozilla/5.0'})
        html = urlopen(req).read()
        bs = BeautifulSoup(html, 'html.parser')
        # get first 5 rows
        currencies = bs.find("tbody").find_all("tr")[0:5]
        # enumerate rows to pass index inside class name
        # starting index from 1
        for idx, currency in enumerate(currencies, 1):
            pair = currency.find("td", class_="plusIconTd").a.text
            bid = currency.find("td", class_=f"pid-{idx}-bid").text
            ask = currency.find("td", class_=f"pid-{idx}-ask").text
            high = currency.find("td", class_=f"pid-{idx}-high").text
            low = currency.find("td", class_=f"pid-{idx}-low").text
            change = currency.find("td", class_=f"pid-{idx}-pc").text
            change_p = currency.find("td", class_=f"pid-{idx}-pc").text  # note: same cell as change; adjust the class suffix for the percent-change column
            time = currency.find("td", class_=f"pid-{idx}-time").text
    
            print({'pair':pair, 'bid':bid, 'ask':ask, 'high':high, 'low':low, 'change':change, 'change_p':change_p, 'time':time})
    
            # create objects in database
            Currency.objects.create(
                pair=pair,
                bid=bid,
                ask=ask,
                high=high,
                low=low,
                change=change,
                change_p=change_p,
                time=time
            )
            
            # sleep few seconds to avoid database block
            sleep(5)
    
    create_currency()

    @shared_task creates an independent instance of the task for each app, making the task reusable, so it’s important to add this decorator to time-consuming tasks. The function creates a new object for each crawled row and sleeps for a few seconds afterwards to avoid blocking the database.
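
    Because create_currency() is registered as a shared task, it doesn’t have to be called inline as in the script above; with a worker running (see the command below), it can also be queued through the broker. A minimal sketch of that call:

    from forexAPI.tasks import create_currency

    # enqueue the task on the broker; a running Celery worker picks it up and executes it
    create_currency.delay()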

    Save the file and run a Celery worker in your console to see the result:

    celery -A trading worker -l info

    Once the worker is running, the results will appear in the console; if you want to see the created objects, navigate to the Django admin and check inside your app. Create a superuser to access the admin page:

    python manage.py createsuperuser

    Then, register your model in admin.py:

    from django.contrib import admin
    from .models import Currency
    admin.site.register(Currency)
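
    Optionally, instead of the bare register call, a small ModelAdmin makes the crawled rows easier to scan in the admin list view (just a convenience sketch, not required by the tutorial):

    from django.contrib import admin
    from .models import Currency

    @admin.register(Currency)
    class CurrencyAdmin(admin.ModelAdmin):
        # show the crawled fields as columns in the admin change list
        list_display = ('pair', 'bid', 'ask', 'high', 'low', 'change', 'change_p', 'time')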

    To keep the data real-time, we’ll need to update these objects continuously. We can achieve that with a few small changes to the previous function.

    tasks.py

    @shared_task
    # some heavy stuff here
    def update_currency():
        print('Updating forex data ..')
        req = Request('https://www.investing.com/currencies/single-currency-crosses', headers={'User-Agent': 'Mozilla/5.0'})
        html = urlopen(req).read()
        bs = BeautifulSoup(html, 'html.parser')
        currencies = bs.find("tbody").find_all("tr")[0:5]
        for idx, currency in enumerate(currencies, 1):
            pair = currency.find("td", class_="plusIconTd").a.text
            bid = currency.find("td", class_=f"pid-{idx}-bid").text
            ask = currency.find("td", class_=f"pid-{idx}-ask").text
            high = currency.find("td", class_=f"pid-{idx}-high").text
            low = currency.find("td", class_=f"pid-{idx}-low").text
            change = currency.find("td", class_=f"pid-{idx}-pc").text
            change_p = currency.find("td", class_=f"pid-{idx}-pc").text  # note: same cell as change; adjust the class suffix for the percent-change column
            time = currency.find("td", class_=f"pid-{idx}-time").text
    
            # create dictionary
            data = {'pair':pair, 'bid':bid, 'ask':ask, 'high':high, 'low':low, 'change':change, 'change_p':change_p, 'time':time}
            # find the object by filtering and update all fields
            Currency.objects.filter(pair=pair).update(**data)
    
            sleep(5)

    To update an existing object, we use the filter() method to find the particular object and then pass the dictionary to the update() method. This is one of the best ways to handle multiple fields at once. Here is the full code for real-time updates:

    from time import sleep
    from celery import shared_task
    from bs4 import BeautifulSoup
    from urllib.request import urlopen, Request
    from .models import Currency
    
    @shared_task
    # some heavy stuff here
    def create_currency():
        print('Creating forex data ..')
        req = Request('https://www.investing.com/currencies/single-currency-crosses', headers={'User-Agent': 'Mozilla/5.0'})
        html = urlopen(req).read()
        bs = BeautifulSoup(html, 'html.parser')
        # get first 5 rows
        currencies = bs.find("tbody").find_all("tr")[0:5]
        # enumerate rows to include index inside class name
        # starting index from 1
        for idx, currency in enumerate(currencies, 1):
            pair = currency.find("td", class_="plusIconTd").a.text
            bid = currency.find("td", class_=f"pid-{idx}-bid").text
            ask = currency.find("td", class_=f"pid-{idx}-ask").text
            high = currency.find("td", class_=f"pid-{idx}-high").text
            low = currency.find("td", class_=f"pid-{idx}-low").text
            change = currency.find("td", class_=f"pid-{idx}-pc").text
            change_p = currency.find("td", class_=f"pid-{idx}-pc").text  # note: same cell as change; adjust the class suffix for the percent-change column
            time = currency.find("td", class_=f"pid-{idx}-time").text
    
            print({'pair':pair, 'bid':bid, 'ask':ask, 'high':high, 'low':low, 'change':change, 'change_p':change_p, 'time':time})
    
            # create objects in database
            Currency.objects.create(
                pair=pair,
                bid=bid,
                ask=ask,
                high=high,
                low=low,
                change=change,
                change_p=change_p,
                time=time
            )
            
            # sleep few seconds to avoid database block
            sleep(5)
    
    @shared_task
    # some heavy stuff here
    def update_currency():
        print('Updating forex data ..')
        req = Request('https://www.investing.com/currencies/single-currency-crosses', headers={'User-Agent': 'Mozilla/5.0'})
        html = urlopen(req).read()
        bs = BeautifulSoup(html, 'html.parser')
        currencies = bs.find("tbody").find_all("tr")[0:5]
        for idx, currency in enumerate(currencies, 1):
            pair = currency.find("td", class_="plusIconTd").a.text
            bid = currency.find("td", class_=f"pid-{idx}-bid").text
            ask = currency.find("td", class_=f"pid-{idx}-ask").text
            high = currency.find("td", class_=f"pid-{idx}-high").text
            low = currency.find("td", class_=f"pid-{idx}-low").text
            change = currency.find("td", class_=f"pid-{idx}-pc").text
            change_p = currency.find("td", class_=f"pid-{idx}-pc").text  # note: same cell as change; adjust the class suffix for the percent-change column
            time = currency.find("td", class_=f"pid-{idx}-time").text
    
            # create dictionary
            data = {'pair':pair, 'bid':bid, 'ask':ask, 'high':high, 'low':low, 'change':change, 'change_p':change_p, 'time':time}
            # find the object by filtering and update all fields
            Currency.objects.filter(pair=pair).update(**data)
    
            sleep(5)
    
    create_currency()
    while True:
        # updating data every 15 seconds
        sleep(15)
        update_currency()
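
    As a side note, Django’s update_or_create() could collapse the create and update logic into a single call keyed on the currency pair. A sketch of how the loop body might look with that approach (not what this tutorial uses):

    # inside the for loop: update the row for this pair, or create it on the first run
    Currency.objects.update_or_create(
        pair=pair,
        defaults={
            'bid': bid, 'ask': ask, 'high': high, 'low': low,
            'change': change, 'change_p': change_p, 'time': time,
        },
    )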

    Real-time crawlers put a lot of load on the servers they scrape, which can end with you being blocked from a web page, so it is important to stay undetected while scraping continuously and to work around such restrictions. One way to avoid detection is to set a proxy on the Request instance.

    import urllib.request as urlrequest

    proxy_host = 'localhost:1234'    # host and port of your proxy
    url = 'http://www.httpbin.org/ip'

    req = urlrequest.Request(url)
    req.set_proxy(proxy_host, 'http')

    response = urlrequest.urlopen(req)
    print(response.read().decode('utf8'))
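
    Inside the crawler tasks themselves, an equivalent approach is to build an opener with a ProxyHandler and use it in place of urlopen(); a sketch with a placeholder proxy address:

    from urllib.request import ProxyHandler, Request, build_opener

    # route both HTTP and HTTPS requests through the proxy (placeholder address)
    proxy = ProxyHandler({'http': 'http://localhost:1234',
                          'https': 'http://localhost:1234'})
    opener = build_opener(proxy)

    req = Request('https://www.investing.com/currencies/single-currency-crosses',
                  headers={'User-Agent': 'Mozilla/5.0'})
    html = opener.open(req).read()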

    Creating the API with Django REST Framework

    The final step is to create serializers so that we can expose the crawled data through a REST API. Serializers convert our model instances to native Python datatypes that can easily be rendered into JSON. The ModelSerializer class provides a shortcut that automatically creates a Serializer class with fields corresponding to the model fields. For more information, check the official documentation of the Django REST Framework.


    Create serializers.py inside your app:

    serializers.py

    from rest_framework import serializers
    from .models import Currency
    
    
    class CurrencySerializer(serializers.ModelSerializer):
        class Meta:
            model = Currency
            fields = '__all__'  # include all model fields
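
    To see what the serializer produces, you can inspect a saved record in the Django shell; the values below are purely illustrative:

    python manage.py shell
    >>> from forexAPI.models import Currency
    >>> from forexAPI.serializers import CurrencySerializer
    >>> CurrencySerializer(Currency.objects.first()).data
    {'id': 1, 'pair': 'EUR/USD', 'bid': 1.0812, 'ask': 1.0814, 'high': 1.0831, 'low': 1.0789, 'change': '-0.0015', 'change_p': '-0.14%', 'time': '14:05:00'}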

    Now, open views.py and create a ListAPIView, which represents a collection of model instances. It is used for read-only endpoints and provides a get method handler.

    from rest_framework import generics
    from .models import Currency
    from .serializers import CurrencySerializer

    class ListCurrencyView(generics.ListAPIView):
        queryset = Currency.objects.all() # used for returning objects from this view
        serializer_class = CurrencySerializer

    For more information about generic views, see the Generic Views section of the documentation. Finally, configure urls.py to route requests to the view:

    from django.contrib import admin
    from django.urls import path
    from forexAPI.views import ListCurrencyView
    
    urlpatterns = [
        path('admin/', admin.site.urls),
        path('', ListCurrencyView.as_view())
    ]
    

    In class-based views, as_view() must be called to return a callable view that takes a request and returns a response. It is the main entry point for generic views in the request-response cycle.

    You’re almost done! To run the project properly, you have to run the Celery worker and the Django development server separately. The final result should look like this:

    [Screenshot: Final result]
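
    For reference, that means two terminals, one for the worker and one for the development server; assuming the default port, the endpoint then responds at the project root:

    celery -A trading worker -l info    # terminal 1: runs the crawler tasks
    python manage.py runserver          # terminal 2: serves the API
    curl http://127.0.0.1:8000/         # returns the serialized currency list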

    Try refreshing the page after 15 seconds and you’ll see the values changing.

    Source Code

    The full project is available for download in the GitHub repository.

    Conclusion

    Web scraping plays a major role in the data industry and is used by corporations to stay competitive. The real-time mode becomes useful when you want information on demand. Keep in mind, though, that you’re going to put a lot of load on the site you’re scraping, so check whether it offers an API or some other way to get the data. Companies put a lot of effort into providing their services, so it’s best to respect their business and request permission before using their data in production.
