
    Automated Postgresql Backups with NodeJS and Bash

    By bradstarart | November 20, 2020 | Updated: June 5, 2024 | 9 Mins Read

    The database is the holy grail of your application, but, given the unpredictable nature of software, you should always be prepared for the possibility of media, hardware, and software failures. If any of these occurs, the main objective is to get the database back up and running as fast as possible, with minimal user disruption and no data loss.

    The best way to achieve this is to have a detailed backup-and-restore procedure documented in case anything goes wrong with the production database. Sprinkle in a bit of automation so the process takes care of itself, and you'll save yourself some frustrating headaches down the road.

    It’s entirely possible to accomplish this with bash alone, but bash scripting can get complicated and tedious pretty fast – the syntax is unintuitive and it’s not the most beginner-friendly language out there, especially for people more used to modern languages like Node.js. Our philosophy is to reserve those bash muscles for when we really need them.

    Backing up a Postgres database

    Postgres ships with two handy utilities that help us easily back up and restore our databases – pg_dump and pg_restore.

    pg_dump is a built-in way of making consistent database backups (also referred to as ‘dumps’) in a number of different formats and across different platforms. It also provides a schema-only option for people looking to preserve their database schemas but not the data.
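
    For instance, if all you want is the table definitions, the schema-only switch looks like this (a quick sketch; ‘my_database’ stands in for your own database, and the output here is plain SQL):

    pg_dump -U cray -d my_database --schema-only -f schema-backup-04.09.20.sql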

    If you need to restore the data, Postgres provides the pg_restore utility. Internally, it uses SQL commands such as DROP DATABASE and CREATE DATABASE, bypassing the normal restrictions you run into with commands such as dropdb. In order to use pg_restore, however, your dump must not be a plain-text file.

    Of note is the fact that pg_dump and pg_restore were created to be as platform-agnostic as possible. You’ll probably need to do a bit of work if you want to move a Postgres backup into a new OracleDB database, but backing up and restoring across operating systems works just as well.

    In order to back up a database, we run the following command:

    pg_dump -U cray -d my_database -f database-backup-04.09.20.tar -F t
    • -U specifies the database user to run the command as.
    • -d specifies the database to be dumped.
    • -f writes the output to the given file.
    • -F specifies the output format; ‘t’ stands for ‘.tar’.

    The resulting output is an archive containing multiple ‘.dat’ files, which hold your data, and a ‘restore.sql’ file. The SQL file is included only for our benefit.
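
    You can peek inside the archive with tar to confirm this (using the file name from above):

    tar -tf database-backup-04.09.20.tar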

    In order to restore the above database using pg_restore, we run:

    pg_restore -cC -d my_database database-backup-04.09.20.tar
    • -c drops any existing database objects before restoring them. If the tables in the backup already exist in your target database, they will be deleted before being re-added.
      Note: using this option might produce a lot of ‘errors.’ These can be safely ignored, as stated at the end of the output.
    • -C creates the database before restoring into it. When this option is used together with -c, as above, the database is dropped and recreated.

    pg_restore runs SQL statements internally and isn’t blocked by connected clients. Destructive commands like dropdb warn you when there are connected clients; pg_restore won’t.
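
    If you want dropdb-style caution before a destructive restore, you can check for and terminate open connections yourself; a sketch, with ‘my_database’ as a stand-in:

    psql -U cray -d postgres -c "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'my_database' AND pid <> pg_backend_pid();"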


    Running bash commands in NodeJS

    Our weapon of choice for running bash commands is @getvim/execute. The library goes to great lengths to abstract away the pitfalls that come with bash, like having to learn awk for simple string manipulation, as well as the aforementioned cumbersome syntax. It also has some interesting perks, like being able to switch context between JavaScript and bash quite easily.

    We also need to install dotenv in order to take advantage of environment variables, which help us keep sensitive information like passwords out of version control systems like Git.

    To install both dependencies, run:

    npm install @getvim/execute dotenv

    Then create a file named ‘.env’ in the root folder and enter the following data:

    DB_USER=<your database username>
    DB_NAME=<your database name>
    PGPASSWORD=<your postgres password> # the standard libpq variable, read by pg_dump directly

    Our code for dumping the database and creating a backup file then looks like this:

    const {execute} = require('@getvim/execute');
    const dotenv = require('dotenv');
    dotenv.config();
    
    // Pull connection details from the .env file
    const username = process.env.DB_USER;
    const database = process.env.DB_NAME;
    
    // Timestamp the file name, e.g. database-backup-2020.11.20.10.30.tar
    const date = new Date();
    const currentDate = `${date.getFullYear()}.${date.getMonth() + 1}.${date.getDate()}.${date.getHours()}.${date.getMinutes()}`;
    const fileName = `database-backup-${currentDate}.tar`;
    
    function backup() {
        // Dump the database into a tar-format archive
        return execute(`pg_dump -U ${username} -d ${database} -f ${fileName} -F t`).then(async () => {
            console.log("Finito");
        }).catch(err => {
            console.log(err);
        });
    }
    
    backup();

    The restore function looks much the same:

    function restore() {
        // Note: pg_restore can't read a gzip-compressed archive directly,
        // so the .tar.gz we produce below must be decompressed back to a .tar first
        execute(`pg_restore -cC -d ${database} ${fileNameGzip}`).then(async () => {
            console.log("Restored");
        }).catch(err => {
            console.log(err);
        });
    }

    Of course, in practice, the restoration command has to be fine-tuned to your preferences. You definitely don’t want to drop and recreate a live production database, and the potential for damaging or losing data is very real during this step. To mitigate this, you should preferably restore into an empty database rather than one that already contains objects. When in doubt, consult the pg_restore documentation.
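
    One safer pattern is to restore into a brand-new scratch database and inspect it before touching anything live; a sketch, with illustrative names:

    createdb -U cray restore_check
    pg_restore -U cray -d restore_check database-backup-04.09.20.tar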

    This would be enough if not for the fact that a backup stored on the same machine isn’t much of a backup. We still need to ship it off to a separate backup server.

    Compressing the archive file

    But first, we need to compress the archive file. This results in a much smaller payload for when we have to do the actual uploading.

    While it’s possible to implement a gzipper on your own (the NodeJS documentation on the built-in zlib module is quite comprehensive), it’s a lot more trouble than it’s worth to do from scratch.
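
    For the curious, a minimal do-it-yourself sketch using Node’s built-in zlib and stream modules might look like this (not what we’ll use below):

    const fs = require('fs');
    const zlib = require('zlib');
    const { pipeline } = require('stream');
    
    // Stream the archive through gzip so it never has to fit in memory
    pipeline(
        fs.createReadStream('database-backup.tar'),
        zlib.createGzip(),
        fs.createWriteStream('database-backup.tar.gz'),
        (err) => {
            if (err) console.error('Compression failed', err);
        }
    );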

    Instead, let’s use the handy gzipme package to take care of that.

    To install it, run

    npm install gzipme -S

    And we’ll only change the `backup` method slightly:

    const compress = require('gzipme');
    const fs = require('fs');
    const fileName = `database-backup-${currentDate}.tar`;
    
    function backup() {
        return execute(`pg_dump -U ${username} -d ${database} -f ${fileName} -F t`).then(async () => {
            // added lines: compress the dump, then delete the uncompressed original
            await compress(fileName);
            fs.unlinkSync(fileName);
            console.log("Finito");
        }).catch(err => {
            console.log(err);
        });
    }
    

    This will create a new ‘.tar.gz’ file in your root directory that’s much smaller than the original. The step isn’t strictly necessary, but if you have a lot of data to back up, the smaller payload might be crucial. Once the file has been compressed, we delete the original.

    Uploading the backup file to a new machine

    At this point, you’re free to let your imagination run wild. Once the archive file is created on your machine, you can transport it to the next server in whichever fashion is most convenient. For example, you could run another bash script that uses a program such as rsync to transport files to another server, a program that sends the file to your email address, or one that uploads it to your preferred cloud provider.
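
    The rsync route, for instance, can be a one-liner (the user, host, and path here are placeholders):

    rsync -avz database-backup-*.tar.gz backup-user@backup.example.com:/var/backups/postgres/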


    To keep things simple, an upload request using axios looks like this:

    function sendToBackupServer(fileName = fileNameGzip) {
        const form = new FormData();
        // Stream the file contents, not just the file name
        form.append('file', fs.createReadStream(fileName));
        axios.post('http://my.backupserver.org/private', form, {headers: form.getHeaders()}).then(result => {
            // Handle result…
            fs.unlinkSync(fileNameGzip);
            console.log(result.data);
        }).catch(err => {
            // log error, send it to sentry... etc
            console.error(err);
        });
    }
        
    

    Once the upload process has been completed successfully, the backup files are deleted from the current machine so they don’t take up too much space.

    Automating the process with a cron job

    Cron is a useful utility on Linux for scheduling when tasks should run. These tasks, referred to as ‘cron jobs,’ can be run periodically to perform work like backing up a database, deleting log files, and other system maintenance.

    The cron syntax consists of five fields separated by whitespace – i.e. “* * * * *” – which are explained below:
    
    * * * * *
    | | | | |
    | | | | +-- Day of the Week   (range: 0-6, Sunday to Saturday)
    | | | +---- Month             (range: 1-12)
    | | +------ Day of the Month  (range: 1-31)
    | +-------- Hour              (range: 0-23)
    +---------- Minute            (range: 0-59)
    

    Depending on the cron implementation your system relies on, it may accept finer levels of granularity. For instance, the finest level supported by the node-cron package is seconds, via an optional sixth field at the start of the expression.

    To install it, run

    npm install node-cron -S

    If we wanted to run the job every two hours, we’d use the following expression – 0 */2 * * * (as borrowed from crontab.guru).

    The function to achieve the desired result would then be

    function startSchedule() {
        // Every two hours: wait for the dump to finish, then upload it
        cron.schedule('0 */2 * * *', () => {
            backup().then(() => sendToBackupServer());
        });
    }
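
    And if you ever need that seconds-level granularity, the optional sixth field goes at the front; e.g., a hypothetical every-thirty-seconds schedule:

    cron.schedule('*/30 * * * * *', () => {
        backup().then(() => sendToBackupServer());
    });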

    Ultimately, all our code is contained in a single file:

    const { execute } = require('@getvim/execute');
    const compress = require('gzipme');
    const axios = require('axios');
    const FormData = require('form-data');
    const fs = require('fs');
    const cron = require('node-cron');
    
    const dotenv = require('dotenv');
    dotenv.config();
    
    const username = process.env.DB_USER;
    const database = process.env.DB_NAME;
    const date = new Date();
    
    const currentDate = `${date.getFullYear()}.${date.getMonth() + 1}.${date.getDate()}.${date.getHours()}.${date.getMinutes()}`;
    const fileName = `database-backup-${currentDate}.tar`;
    // gzipme names the compressed copy by appending '.gz' to the original
    const fileNameGzip = `${fileName}.gz`;
    
    function backup() {
        // Return the promise so callers can wait for the dump and compression
        return execute(
            `pg_dump -U ${username} -d ${database} -f ${fileName} -F t`,
        ).then(async () => {
            await compress(fileName);
            fs.unlinkSync(fileName);
            console.log("Finito");
        }).catch(err => {
            console.log(err);
        });
    }
    
    function restore() {
        // Note: pg_restore can't read a gzipped archive directly;
        // decompress the .tar.gz back to a .tar before running this
        execute(`pg_restore -cC -d ${database} ${fileNameGzip}`)
            .then(async () => {
                console.log("Restored");
            }).catch(err => {
                console.log(err);
            });
    }
    
    function sendToBackupServer(fileName = fileNameGzip) {
        const form = new FormData();
        // Stream the file contents, not just the file name
        form.append('file', fs.createReadStream(fileName));
        axios.post('http://my.backupserver.org/private', form, {
            headers: form.getHeaders(),
        }).then(result => {
            // Handle result…
            console.log(result.data);
            fs.unlinkSync(fileNameGzip);
        }).catch(err => {
            // log error, send it to sentry... etc
            console.error(err);
        });
    }
    
    function startSchedule() {
        // Every two hours: dump and compress first, then upload
        cron.schedule('0 */2 * * *', () => {
            backup().then(() => sendToBackupServer());
        });
    }
    
    module.exports = {
        startSchedule
    }

    That can then be used like so:

    // index.js
    
    const {startSchedule} = require('./backup-database');
    
    startSchedule();

    Link to Github project: https://github.com/Bradleykingz/automated-postgres-backups-with-node

    Conclusion

    In this article, we explored how NodeJS can be used in combination with pg_dump, pg_restore, and cron jobs to automatically back up your data and send it to a separate backup server. This process works much the same for any SQL database, but there are caveats to transferring data between different database platforms.

    Depending on your needs, your backup process will be more or less involved than what we’ve outlined here. You may need to back up all your databases, not just one, in addition to ensuring database roles can be restored; or maybe all you need is the database schema, not the data. pg_dump is quite powerful and will probably suit your needs with a bit more tinkering.
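
    For the cluster-wide case, Postgres also ships pg_dumpall, which dumps every database plus roles and other globals as plain SQL; a quick sketch:

    pg_dumpall -U cray -f full-cluster-backup.sql
    pg_dumpall -U cray --globals-only -f roles-and-globals.sql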
