Handling Large File Uploads in React with Node.js

Vineet Sharma
4 min read · Feb 6, 2025

Introduction

Handling large file uploads efficiently in a React and Node.js application is essential for performance, scalability, and user experience. Large files can consume significant server memory, lead to timeouts, and increase bandwidth usage. This guide will explore different techniques, optimizations, and best practices for handling large file uploads using React on the frontend and Node.js on the backend.

Challenges of Large File Uploads

  • Memory Constraints: Buffering an entire large file at once can exhaust available server memory.
  • Timeout Issues: Large uploads can lead to network timeouts.
  • Slow Performance: Inefficient handling can slow down the application.
  • Security Concerns: File uploads can pose security threats if not handled correctly.

Setting Up the React Frontend

Creating a React Project with Next.js

To set up a React project using Next.js, follow these steps:

  1. Install Node.js: Ensure you have Node.js installed. You can download it from the official Node.js website.
  2. Create a Next.js App: Run the following command to create a Next.js application:

npx create-next-app@latest file-upload-app

or, using yarn:

yarn create next-app file-upload-app

  3. Navigate to the Project Folder:

cd file-upload-app

  4. Start the Development Server:

npm run dev

or:

yarn dev

  5. Open in Browser: Visit http://localhost:3000 to see your Next.js application running.
  6. Install Required Dependencies: Install axios, which the upload component uses to send requests:

npm install axios

or:

yarn add axios

Creating a File Upload Component

import React, { useState } from 'react';
import axios from 'axios';

const FileUpload = () => {
  const [file, setFile] = useState(null);

  // Keep a reference to the selected file
  const handleFileChange = (event) => {
    setFile(event.target.files[0]);
  };

  // Send the file to the backend as multipart/form-data
  const handleUpload = async () => {
    if (!file) return;

    const formData = new FormData();
    formData.append("file", file);
    try {
      await axios.post("http://localhost:5000/upload", formData, {
        headers: { "Content-Type": "multipart/form-data" },
      });
      alert("File uploaded successfully!");
    } catch (error) {
      console.error("Upload failed:", error);
    }
  };

  return (
    <div>
      <input type="file" onChange={handleFileChange} />
      <button onClick={handleUpload}>Upload</button>
    </div>
  );
};

export default FileUpload;
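
To render the component, import it into a page. Here is a minimal sketch, assuming the component is saved as components/FileUpload.js and the project uses the Pages Router; with the App Router, place the same markup in app/page.js and add a "use client" directive at the top of FileUpload.js, since it uses state and event handlers:

// pages/index.js (path assumed; adjust for the App Router)
import FileUpload from "../components/FileUpload";

export default function Home() {
  return (
    <main>
      <h1>Large File Upload Demo</h1>
      <FileUpload />
    </main>
  );
}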

Setting Up the Node.js Backend

Installing Dependencies

Run the following command to install required dependencies:

npm install express multer cors

Creating the Server

const express = require("express");
const multer = require("multer");
const cors = require("cors");
const fs = require("fs");

const app = express();
app.use(cors());

// Ensure the upload directory exists before any files are written to it
fs.mkdirSync("uploads", { recursive: true });

// Store uploads on disk with a timestamp prefix to avoid name collisions
const storage = multer.diskStorage({
  destination: "uploads/",
  filename: (req, file, cb) => {
    cb(null, `${Date.now()}-${file.originalname}`);
  },
});
const upload = multer({ storage });

// Single-file upload endpoint; "file" must match the FormData field name
app.post("/upload", upload.single("file"), (req, res) => {
  res.send({ message: "File uploaded successfully" });
});

app.listen(5000, () => console.log("Server running on port 5000"));
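
Save the backend code to a file (for example server.js; the name is up to you) and start it with node server.js. The React app running on port 3000 can then post uploads to port 5000.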

Optimizing Performance

Chunked File Upload

To handle large files efficiently, we can split them into smaller chunks and upload them sequentially.

Frontend Implementation

const handleChunkedUpload = async () => {
  const chunkSize = 5 * 1024 * 1024; // 5MB per chunk
  const totalChunks = Math.ceil(file.size / chunkSize);

  // Upload each chunk sequentially, tagging it with its index and the filename
  for (let i = 0; i < totalChunks; i++) {
    const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize);
    const formData = new FormData();
    formData.append("chunk", chunk);
    formData.append("chunkIndex", i);
    formData.append("totalChunks", totalChunks);
    formData.append("filename", file.name);
    await axios.post("http://localhost:5000/upload-chunk", formData);
  }

  // Ask the server to reassemble the chunks once they have all arrived
  await axios.post("http://localhost:5000/merge-chunks", { filename: file.name });
};
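
Because the loop awaits each request, only one chunk is in flight at a time, and the filename field lets the server group chunks that belong to the same upload. You could trade higher server load for speed by sending several chunks in parallel with Promise.all.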

Backend Implementation
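
Each chunk is held in memory, keyed by filename and ordered by its index; when the client calls /merge-chunks, the pieces are concatenated and written to the uploads/ folder. The routes below extend the Express server created earlier.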

// In-memory store of received chunks, keyed by filename
const uploadChunks = {};

// A memory-storage Multer instance so each chunk arrives as a Buffer on req.file
const chunkUpload = multer({ storage: multer.memoryStorage() });
app.use(express.json());

app.post("/upload-chunk", chunkUpload.single("chunk"), (req, res) => {
  const { filename, chunkIndex } = req.body;
  if (!uploadChunks[filename]) {
    uploadChunks[filename] = [];
  }
  // Store the chunk at its index so the pieces can be merged in order later
  uploadChunks[filename][Number(chunkIndex)] = req.file.buffer;
  res.send({ message: "Chunk uploaded successfully" });
});

app.post("/merge-chunks", (req, res) => {
  const { filename } = req.body;
  // Reassemble the chunks in order and write the final file to disk
  const fileBuffer = Buffer.concat(uploadChunks[filename]);
  fs.writeFileSync(`uploads/${filename}`, fileBuffer);
  delete uploadChunks[filename];
  res.send({ message: "File merged successfully" });
});

Security Best Practices

  • Validate File Type: Restrict allowed file types (a Multer sketch covering type checks, size limits, and filename sanitization follows this list).
  • Limit File Size: Set size limits to prevent abuse.
  • Authentication: Ensure only authorized users can upload.
  • Sanitize File Names: Prevent directory traversal attacks.
  • Use Cloud Storage: Offload storage to services like AWS S3.
  • Implement Virus Scanning: Use tools like ClamAV to scan uploaded files for malware.
  • Apply Rate Limiting: Prevent abuse by limiting the number of upload requests per user.
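
A minimal sketch of how type validation, size limits, and filename sanitization might look with Multer, added to the Express server from earlier. The 10MB cap, the allowed-type list, the sanitizeFilename helper, and the /secure-upload route are assumptions for illustration, not fixed requirements:

const path = require("path");

// Assumed allow-list and size cap; adjust to your needs
const ALLOWED_TYPES = ["image/png", "image/jpeg", "application/pdf"];
const MAX_SIZE = 10 * 1024 * 1024; // 10MB

// Strip directory components and unusual characters from the original name
const sanitizeFilename = (name) =>
  path.basename(name).replace(/[^a-zA-Z0-9._-]/g, "_");

const safeUpload = multer({
  storage: multer.diskStorage({
    destination: "uploads/",
    filename: (req, file, cb) => {
      cb(null, `${Date.now()}-${sanitizeFilename(file.originalname)}`);
    },
  }),
  limits: { fileSize: MAX_SIZE },
  // Reject anything that is not on the allow-list
  fileFilter: (req, file, cb) => {
    if (ALLOWED_TYPES.includes(file.mimetype)) {
      cb(null, true);
    } else {
      cb(new Error("File type not allowed"));
    }
  },
});

app.post("/secure-upload", safeUpload.single("file"), (req, res) => {
  res.send({ message: "File uploaded successfully" });
});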

Scaling Considerations

  • Using Message Queues: Services like RabbitMQ or Kafka can help manage large file processing efficiently.
  • Implementing Load Balancing: Distribute file uploads across multiple servers.
  • Using CDN for Delivery: Store and serve files from a CDN to reduce server load.
  • Database Integration: Store file metadata in a database to keep track of uploads.
  • Asynchronous Processing: Process file uploads asynchronously using worker threads or background jobs.
  • Auto Expiry Mechanism: Implement a feature to delete old uploaded files automatically to free up space (a minimal sketch follows this list).
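
As one example of the last point, an auto-expiry pass can periodically delete files older than a chosen age. A minimal sketch, assuming a 24-hour retention window and the local uploads/ folder used above; in production you would more likely use a scheduled job or your cloud storage provider's lifecycle rules:

const fs = require("fs");
const path = require("path");

const UPLOAD_DIR = "uploads";
const MAX_AGE_MS = 24 * 60 * 60 * 1000; // assumed 24-hour retention

// Remove any file in the upload directory older than MAX_AGE_MS
const cleanOldUploads = () => {
  if (!fs.existsSync(UPLOAD_DIR)) return;
  for (const name of fs.readdirSync(UPLOAD_DIR)) {
    const filePath = path.join(UPLOAD_DIR, name);
    const { mtimeMs } = fs.statSync(filePath);
    if (Date.now() - mtimeMs > MAX_AGE_MS) {
      fs.unlinkSync(filePath);
    }
  }
};

// Run the cleanup once an hour
setInterval(cleanOldUploads, 60 * 60 * 1000);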

Conclusion

Handling large file uploads requires careful planning and implementation. By using chunked uploads, streams, and cloud storage, we can ensure smooth and efficient file uploads in React and Node.js applications. Implementing best practices around security and scalability will help build a robust and production-ready system. Future enhancements could include real-time progress indicators, resumable uploads, and user notifications to improve user experience further.
