Real-Time Data Streaming with Python Flask and Quasar Framework

Yash Pukale

Dec 23, 2024 · 4 min read

Implementing real-time data streaming from a server to a client can be challenging, especially when working with APIs that return data in chunks. Let me share a story of how I tackled this problem while using Python Flask for the backend and Vue.js with the Quasar framework for the frontend. It was a journey filled with trials, errors, and some exciting discoveries.

The Challenge

It all started with a requirement: stream data from an API that sends responses in chunks. The goal was to dynamically append each chunk to a variable on the frontend and display it live. Initially, I turned to Axios for the AJAX call, but in the browser Axios only resolves once the full response has arrived, so it couldn't surface the data as it streamed in. That's when I switched to JavaScript's fetch API, which exposes response.body.getReader() for reading a streamed response chunk by chunk.

Backend: Flask Route

On the backend, I used Flask and the requests library to make a streaming call to an external API. Here’s how the route looked:


import flask
import requests


@app.route('/api/stream', methods=['POST'])
def stream():
    url = "<<URL>>"
    api_key = "<<API_KEY>>"
    headers = {"Authorization": f"Bearer {api_key}"}
    params = {}  # payload for the external API; the incoming flask.request.get_json() could be forwarded here

    def generate():
        try:
            # stream=True keeps the connection open and lets us read the body in chunks
            with requests.get(url, headers=headers, stream=True, json=params, timeout=30) as r:
                r.raise_for_status()
                for chunk in r.iter_content(chunk_size=8192):
                    if chunk:
                        yield chunk.decode('utf-8', errors='ignore')
        except requests.exceptions.RequestException:
            # covers ChunkedEncodingError as well as timeouts and connection failures
            yield "Something went wrong. Please try again later"

    # Returning a generator makes Flask send each yielded chunk as soon as it is produced
    return flask.Response(generate(), content_type='text/plain')

This route streams the data from the external API to the client without waiting for the entire response to arrive. Flask is a lightweight Python web framework, ideal for such quick setups.
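A quick way to confirm the route really streams (rather than buffering the whole body) is a small client-side check with the requests library. This is only a sketch, assuming the app runs locally on port 5000; the JSON payload is a placeholder.


# Minimal sketch: confirm that /api/stream yields data incrementally.
import requests

with requests.post(
    "http://localhost:5000/api/stream",  # assumed local dev address
    json={},        # placeholder payload
    stream=True,    # read the body as it arrives instead of buffering it
    timeout=30,
) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=None):
        if chunk:
            # chunks should print as they arrive, not all at once at the end
            print(chunk.decode("utf-8", errors="ignore"), end="", flush=True)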

Frontend: Vue.js with Fetch API

On the frontend, I replaced Axios with the fetch API. Here’s the Vue.js method I used:


async fetchStream() {
    const url = "/api/stream";

    this.error_message = ""; // Clear previous error message
    this.chunks = ''; // Clear previous chunks

    try {
        const response = await fetch(url, {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
            },
            credentials: 'include',
            body: JSON.stringify({<< DATA >>}),
        });

        if (!response.ok) {
            // fetch does not reject on HTTP error statuses, so check explicitly
            throw new Error(`Request failed with status ${response.status}`);
        }

        // Read the body incrementally instead of waiting for the full response
        const reader = response.body.getReader();
        const decoder = new TextDecoder("utf-8");

        while (true) {
            const { done, value } = await reader.read();
            if (done) {
                break;
            }
            // stream: true lets the decoder handle multi-byte characters split across chunks
            const text_chunk = decoder.decode(value, { stream: true });
            this.chunks += text_chunk;
        }
    } catch (error) {
        this.error_message = error.message;
        this.$q.notify({
            message: this.error_message || 'Error Occurred',
            color: 'negative',
            icon: 'warning',
        });
    }
},

This method reads the streamed response in chunks and updates the UI dynamically. Vue.js, combined with Quasar’s rich component library, makes such real-time updates seamless.

Proxy Configuration for Development

In development, I needed the Quasar dev server to forward API requests to the Flask backend. For this, I configured a proxy in quasar.config.js:


devServer: {
    proxy: {
        '/api': {
            target: 'http://localhost:5000',
            changeOrigin: true,
        },
    },
}

Quasar's devServer option simplifies development by proxying API requests to the Flask backend, which avoids CORS issues between the dev server and the API.

The Deployment Hurdle

When I deployed the application, a new challenge emerged. NGINX, which handled incoming requests, buffered the response and returned it all at once instead of streaming it. NGINX is a powerful web server and reverse proxy, but its default behavior is to buffer responses, which clashed with the real-time streaming requirement.

The Solution: NGINX Configuration

To fix this, I updated the NGINX configuration to disable buffering and enable chunked transfer encoding:


location /api {
    proxy_pass http://127.0.0.1:5000;
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_buffering off;  # Disable proxy buffering
    chunked_transfer_encoding on; # Enable chunked transfer
}

With this configuration, NGINX streamed the data as expected. This small tweak ensured that the real-time experience worked perfectly.
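As an aside, NGINX also honors a per-response X-Accel-Buffering header, so the Flask route itself could opt out of buffering without changing the global proxy_buffering setting. A minimal sketch of how the route's return statement could look:


# Sketch: disable NGINX buffering for this response only, from the Flask side.
# "X-Accel-Buffering: no" is a standard NGINX proxy hint.
resp = flask.Response(generate(), content_type='text/plain')
resp.headers['X-Accel-Buffering'] = 'no'
return resp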

Alternative Approaches

During my research, I explored other techniques like Long Polling and Server-Sent Events (SSE). SSE is a protocol for streaming events from a server to a client, but the browser's EventSource client only issues GET requests. My use case required sending a JSON payload to the backend, which made SSE less suitable.
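For reference, an SSE endpoint in Flask would look roughly like the sketch below (illustrative only, not code from this project); the text/event-stream framing and the GET-only EventSource client are what make it a poor fit for a POST-based flow.


# Rough, hypothetical sketch of an SSE endpoint in Flask (not used in this project).
# The browser's EventSource client can only issue GET requests, so any request
# data would have to be passed as query parameters rather than a JSON body.
import time
import flask

@app.route('/api/events')
def sse_events():
    def event_stream():
        for i in range(5):
            yield f"data: chunk {i}\n\n"  # SSE framing: "data: ..." plus a blank line
            time.sleep(1)
    return flask.Response(event_stream(), content_type='text/event-stream')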

Key Takeaways

  • Use the fetch API for real-time data streaming as it supports response.body.getReader().
  • Configure NGINX to disable buffering and enable chunked transfer encoding for seamless streaming.
  • Choose the right approach (e.g., SSE, WebSockets) based on your requirements.

This journey was an enlightening experience, reinforcing the importance of understanding both backend and frontend configurations for implementing real-time features. I hope this story inspires and helps you tackle similar challenges in your projects!
