Real-Time Data Streaming with Python Flask and Quasar Framework

Implementing real-time data streaming from a server to a client can be challenging, especially when working with APIs that return data in chunks. Let me share a story of how I tackled this problem while using Python Flask for the backend and Vue.js with the Quasar framework for the frontend. It was a journey filled with trials, errors, and some exciting discoveries.

The Challenge

It all started with a requirement: stream data from an API that sends responses in chunks. The goal was to dynamically append each chunk to a variable on the frontend and display it live. Initially, I turned to Axios for the AJAX call, but it didn’t handle data streaming well. That’s when I switched to JavaScript’s fetch API, which supports response.body.getReader() for handling streamed responses.

Backend: Flask Route

On the backend, I used Flask and the requests library to make a streaming call to an external API. Here’s how the route looked:


import requests
from flask import Response

@app.route('/api/stream', methods=['POST'])
def stream():
    url = "<<URL>>"
    api_key = "<<API_KEY>>"
    headers = {"Authorization": f"Bearer {api_key}"}
    params = {}

    def generate():
        try:
            # stream=True keeps the connection open and lets us read the body incrementally
            with requests.get(url, headers=headers, stream=True, json=params, timeout=30) as r:
                r.raise_for_status()
                for chunk in r.iter_content(chunk_size=8192):
                    if chunk:
                        yield chunk.decode('utf-8', errors='ignore')
        except requests.exceptions.RequestException:
            # Covers chunked-encoding, connection, and timeout errors alike
            yield "Something went wrong. Please try again later."

    return Response(generate(), content_type='text/plain')

This route streams the data from the external API to the client without waiting for the entire response to arrive. Flask is a lightweight Python web framework, ideal for such quick setups.
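One caveat with decoding each chunk independently, as the generator above does with errors='ignore': a multi-byte UTF-8 character can be split across two network chunks, and per-chunk decoding would silently drop those bytes. Python's incremental decoder buffers incomplete sequences between calls; a minimal sketch:

```python
import codecs

# An incremental decoder buffers incomplete multi-byte sequences
# between calls instead of dropping or mangling them.
decoder = codecs.getincrementaldecoder("utf-8")()

data = "héllo".encode("utf-8")   # b'h\xc3\xa9llo'
chunks = [data[:2], data[2:]]    # the 2-byte "é" is split across the chunks

text = "".join(decoder.decode(c) for c in chunks)
text += decoder.decode(b"", final=True)  # flush any buffered bytes
print(text)  # héllo
```

In the Flask generator, you would create one decoder before the loop and yield decoder.decode(chunk) for each chunk, rather than calling chunk.decode() per chunk.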

Frontend: Vue.js with Fetch API

On the frontend, I replaced Axios with the fetch API. Here’s the Vue.js method I used:


async fetchStream() {
    const url = "/api/stream";

    this.error_message = ""; // Clear previous error message
    this.chunks = ''; // Clear previous chunks

    try {
        const response = await fetch(url, {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
            },
            credentials: 'include',
            body: JSON.stringify({<< DATA >>}),
        });

        if (!response.ok) {
            throw new Error(`Request failed with status ${response.status}`);
        }

        const reader = response.body.getReader();
        const decoder = new TextDecoder("utf-8");

        while (true) {
            const { done, value } = await reader.read();
            if (done) {
                break;
            }
            // { stream: true } keeps partial multi-byte characters buffered
            // until the next chunk arrives
            const text_chunk = decoder.decode(value, { stream: true });
            this.chunks += text_chunk;
        }
        this.chunks += decoder.decode(); // Flush any bytes still buffered by the decoder
    } catch (error) {
        this.error_message = error.message;
        this.$q.notify({
            message: this.error_message || 'Error Occurred',
            color: 'negative',
            icon: 'warning',
        });
    }
},

This method reads the streamed response in chunks and updates the UI dynamically. Vue.js, combined with Quasar’s rich component library, makes such real-time updates seamless.

Proxy Configuration for Development

In development, I needed the Quasar dev server to forward API requests to the Flask backend. For this, I configured a proxy in quasar.config.js:


devServer: {
    proxy: {
        '/api': {
            target: 'http://localhost:5000',
            changeOrigin: true,
        },
    },
}

Quasar's devServer proxy simplifies development by forwarding matching requests to the backend, avoiding CORS issues.

The Deployment Hurdle

When I deployed the application, a new challenge emerged. NGINX, which handled incoming requests, buffered the response and returned it all at once instead of streaming it. NGINX is a powerful web server and reverse proxy, but its default behavior is to buffer responses, which clashed with the real-time streaming requirement.

The Solution: NGINX Configuration

To fix this, I updated the NGINX configuration to disable buffering and enable chunked transfer encoding:


location /api {
    proxy_pass http://127.0.0.1:5000;
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_buffering off;  # Disable proxy buffering
    chunked_transfer_encoding on; # Enable chunked transfer
}

With this configuration, NGINX streamed the data as expected. This small tweak ensured that the real-time experience worked perfectly.
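If editing the NGINX config isn't an option, NGINX also honors a per-response X-Accel-Buffering header, which lets the backend opt out of buffering for just the streaming endpoint. A sketch of what the Flask response could send (the Cache-Control value here is my own addition, not part of the original setup):

```python
# Headers that ask intermediaries not to buffer this particular response.
# NGINX recognizes X-Accel-Buffering and disables proxy buffering for it.
stream_headers = {
    "X-Accel-Buffering": "no",    # per-response equivalent of proxy_buffering off
    "Cache-Control": "no-cache",  # discourage caching of the live stream
}

# In the Flask route, these would be passed to the response, e.g.:
# return Response(generate(), content_type="text/plain", headers=stream_headers)
print(stream_headers["X-Accel-Buffering"])  # no
```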

Alternative Approaches

During my research, I explored other techniques such as long polling and Server-Sent Events (SSE). SSE is a simple protocol for streaming events from a server to a client, but the browser's EventSource API only issues GET requests with no request body. My use case required POSTing data to the backend, making SSE less suitable.
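For context, SSE messages are plain text: each event consists of one or more "data:" lines, terminated by a blank line. A minimal formatter illustrating the wire format (a sketch, not code from this project):

```python
def sse_format(data, event=None):
    """Format a payload as a Server-Sent Events message.

    Each line of the payload becomes its own "data:" line, and a
    blank line terminates the event, per the SSE wire format.
    """
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    for part in data.splitlines() or [""]:
        lines.append(f"data: {part}")
    return "\n".join(lines) + "\n\n"

print(repr(sse_format("hello")))  # 'data: hello\n\n'
```

A Flask route can stream such messages with content_type='text/event-stream', but as noted above, the browser can only consume them via GET.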

Key Takeaways

  • Use the fetch API for real-time data streaming as it supports response.body.getReader().
  • Configure NGINX to disable buffering and enable chunked transfer encoding for seamless streaming.
  • Choose the right approach (e.g., SSE, WebSockets) based on your requirements.

This journey was an enlightening experience, reinforcing the importance of understanding both backend and frontend configurations for implementing real-time features. I hope this story inspires and helps you tackle similar challenges in your projects!

About Author

Yash Pukale