
Cracking the Code: Fixing Memory Leaks and File Corruption in React Native GCP Uploads
Written by Jacob Fenner, Software Engineer at Seven Hills Technology
At Seven Hills Technology, we were building a complex mobile feature that involved uploading large files to Google Cloud Storage from a React Native app. But we hit a wall: persistent memory leaks that crashed the app, and corrupted uploads for any file over 2GB.
This post shares how we diagnosed the problem, tested alternatives, and ultimately solved it by building custom Expo Native Modules for iOS and Android.
The Problem: Memory Leak in RNFetchBlob Resumable Uploads
Our initial approach used RNFetchBlob, a popular React Native library, to chunk and upload large files to GCS. But during large uploads, the app would consume all available memory—eventually crashing.
Root Cause (Suspected)
The garbage collector doesn’t seem to release memory correctly for each file chunk read into memory during uploads. While we didn’t pinpoint this at the bytecode level, all signs pointed here.
Our Original Use
const CHUNK_SIZE = 1024 * 1024 * 20; // 20MB
let bytesUploaded = 0;
let currentChunk = 0;
const totalBytes = (await RNFS.stat(filePath)).size;

while (bytesUploaded < totalBytes) {
  const offset = currentChunk * CHUNK_SIZE;
  const length = Math.min(CHUNK_SIZE, totalBytes - offset);
  // Read the next chunk into memory as base64 -- this is where memory ballooned
  const chunk = await RNFS.read(filePath, length, offset, 'base64');
  const contentRange = `bytes ${offset}-${offset + length - 1}/${totalBytes}`;
  const chunkUploadResponse: FetchBlobResponse = (await uploadFileChunk(
    signedUrls.current[key],
    chunk,
    currentContentType,
    contentRange
  )) as any;
  if (chunkUploadResponse?.respInfo?.status < 400) {
    currentChunk += 1;
    bytesUploaded += length;
  } else {
    console.error('error: ', chunkUploadResponse?.data);
  }
}
async uploadFileChunk(
  url: string,
  chunk: any,
  contentType: 'application/json' | 'video/mp4',
  contentRange: string
) {
  try {
    const response = await RNFetchBlob.fetch(
      'PUT',
      url,
      {
        // The ';BASE64' suffix tells RNFetchBlob to decode the base64 body before sending
        'Content-Type': contentType + ';BASE64',
        'Content-Range': contentRange,
      },
      chunk
    );
    return response;
  } catch (error) {
    console.error(error);
    throw error;
  }
}
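A note on the signedUrls.current[key] value in the loop above: it holds a GCS resumable session URI. Creating one isn't shown in this post, but with GCS you typically POST to a V4 signed URL that was generated with the x-goog-resumable: start header, and the session URI comes back in the Location header. Here is a rough sketch of that step (startResumableSession is a hypothetical helper; the signed URL itself comes from your backend):

// Hypothetical helper: exchange a signed URL (generated server-side with the
// x-goog-resumable: start header) for a resumable upload session URI
async function startResumableSession(signedUrl: string, contentType: string): Promise<string> {
  const response = await fetch(signedUrl, {
    method: 'POST',
    headers: {
      'x-goog-resumable': 'start',
      'Content-Type': contentType,
    },
  });
  // GCS returns the session URI in the Location header
  const sessionUri = response.headers.get('Location');
  if (!sessionUri) {
    throw new Error('GCS did not return a resumable session URI');
  }
  return sessionUri;
}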
Tried and Failed: Switching to fetch
We briefly considered React Native's built-in fetch API. It did avoid the memory leak, but it came with a severe drawback: fetch doesn't support direct binary streaming, so each base64 chunk had to be converted to binary via atob and a Uint8Array, like this:
const binary = atob(chunk);
const data = new Uint8Array(binary.length);
for (let i = 0; i < binary.length; i++) {
  data[i] = binary.charCodeAt(i);
}
That conversion dropped upload speeds to around 1 Mbps, which is unacceptable for production use.
We Tried Everything Else
We tested nearly every file upload library available in the React Native ecosystem. None offered reliable chunked uploads without hitting the same issues, and some lacked resumable upload support entirely.
The Real Solution: Expo Native Modules
We finally solved the problem by offloading the upload logic to native iOS and Android code using Expo Native Modules. This had two major benefits:
1. No More Memory Leaks
Memory management is handled natively, so no more crashes on large files.
2. Resolved a Hidden 2GB File Corruption Bug
JavaScript represents numbers as 64-bit floats, but bitwise operations and many native bridge code paths truncate them to 32-bit signed integers, which max out at 2,147,483,647. For files larger than 2GB, this led to inaccurate byte offset calculations, corrupting uploads. Native languages let us use true 64-bit integers (Int64 in Swift, Long in Kotlin), solving this critical issue.
Secondary Issue: File Corruption on Files > 2GB
Any file over 2GB would upload corrupted. We traced this to 32-bit integer truncation when calculating byte offsets: values above the 2,147,483,647 maximum wrap around, and GCP requires exact byte ranges for each chunk of a resumable upload.
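To make the overflow concrete, here is a small standalone illustration (not our production code) of what happens when an offset passes through a 32-bit path:

// A 2.5GB offset is a perfectly valid 64-bit float in JavaScript...
const offset = 2.5 * 1024 * 1024 * 1024; // 2684354560

// ...but any 32-bit truncation wraps it into a negative number
console.log(offset | 0); // -1610612736

// A Content-Range header built from a wrapped offset points at the wrong bytes,
// so the assembled object is corrupted
const badRange = `bytes ${offset | 0}-${(offset | 0) + 100}/${4 * 1024 * 1024 * 1024}`;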
Solution
In the native module, we use 64-bit integers (Int64 in Swift, Long in Kotlin) to calculate and pass correct byte ranges for uploads, avoiding corruption.
Native Module Implementation (Simplified Overview)
We won’t walk through every line of code, but here are the high-level steps:
Prerequisites
- Use an Expo-managed project (npx create-expo-app)
- Build with EAS, not expo build
- Follow Expo's native module tutorial
Place your modules in a modules/ directory in your project root:
mkdir modules && cd modules && npx create-expo-module
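npx create-expo-module also scaffolds an expo-module.config.json that Expo autolinking uses to locate your native classes. With the placeholder names used below, ours looked roughly like this (the exact keys can vary by Expo SDK version):

{
  "platforms": ["ios", "android"],
  "ios": {
    "modules": ["ModuleNameModule"]
  },
  "android": {
    "modules": ["expo.modules.resumableupload.ModuleNameModule"]
  }
}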
Native Module Code
Android – Kotlin (5MB chunks)
- Use RandomAccessFile to stream 5MB chunks in a coroutine loop
package expo.modules.resumableupload

import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition
import kotlinx.coroutines.*
import java.io.File
import java.io.RandomAccessFile
import java.net.HttpURLConnection
import java.net.URL
import kotlin.math.min

class ModuleNameModule : Module() {
  private val scope = CoroutineScope(Dispatchers.IO)

  override fun definition() = ModuleDefinition {
    Name("ModuleName")
    Events("event")

    // Fire-and-forget: returns immediately; completion is reported via the "event" event
    Function("uploadFile") { filePath: String, uploadUrl: String, startByte: Long ->
      scope.launch {
        upload(filePath, uploadUrl, startByte)
      }
    }

    // Long (64-bit) keeps sizes over 2GB accurate
    Function("fileSize") { filePath: String ->
      File(filePath).length()
    }
  }

  private suspend fun upload(filePath: String, uploadUrl: String, startByte: Long): Boolean =
    withContext(Dispatchers.IO) {
      val file = File(filePath)
      val totalSize = file.length()
      val chunkSize = 1024 * 1024 * 5L // 5MB
      var offset = startByte

      while (offset < totalSize) {
        val length = min(chunkSize, totalSize - offset)
        val chunk = ByteArray(length.toInt())

        // Read exactly one chunk at the current offset; only this chunk is in memory at a time
        RandomAccessFile(file, "r").use { raf ->
          raf.seek(offset)
          raf.readFully(chunk)
        }

        val connection = (URL(uploadUrl).openConnection() as HttpURLConnection).apply {
          requestMethod = "PUT"
          doOutput = true
          setRequestProperty("Content-Type", "application/octet-stream")
          setRequestProperty("Content-Range", "bytes $offset-${offset + length - 1}/$totalSize")
        }
        connection.outputStream.use { it.write(chunk) }

        // GCS answers 308 (Resume Incomplete) for intermediate chunks, 200/201 for the final one
        val status = connection.responseCode
        if (status !in 200..299 && status != 308) {
          connection.disconnect()
          return@withContext false
        }
        offset += length
        connection.disconnect()
      }

      sendEvent("event", mapOf("message" to "Upload complete")) // Send event to React Native
      return@withContext true
    }
}
iOS – Swift (5MB chunks)
- Use FileHandle and URLSession to send byte-specific chunks
- 64-bit integers (Int64) ensure correct byte ranges
import ExpoModulesCore

public class ModuleNameModule: Module {
  private let chunkSize: Int64 = 1024 * 1024 * 5 // 5MB

  public func definition() -> ModuleDefinition {
    Name("ModuleName")
    Events("event")

    AsyncFunction("fileSize") { (filePath: String) -> Int64 in
      return try self.getFileSize(filePath: filePath)
    }

    AsyncFunction("uploadFile") { (filePath: String, uploadUrl: String, startByte: Int64) async throws -> Bool in
      return try await self.upload(filePath: filePath, uploadUrl: uploadUrl, startByte: startByte)
    }
  }

  private func getFileSize(filePath: String) throws -> Int64 {
    let fileURL = URL(fileURLWithPath: filePath)
    let attributes = try FileManager.default.attributesOfItem(atPath: fileURL.path)
    guard let fileSize = attributes[.size] as? Int64 else {
      throw NSError(domain: "FileError", code: 0, userInfo: [NSLocalizedDescriptionKey: "Unable to determine file size."])
    }
    return fileSize
  }

  private func upload(filePath: String, uploadUrl: String, startByte: Int64) async throws -> Bool {
    let fileURL = URL(fileURLWithPath: filePath)
    let totalSize = try getFileSize(filePath: filePath)
    var offset = startByte

    guard let url = URL(string: uploadUrl) else {
      throw NSError(domain: "UploadError", code: 0, userInfo: [NSLocalizedDescriptionKey: "Invalid upload URL."])
    }

    let fileHandle = try FileHandle(forReadingFrom: fileURL)
    defer { try? fileHandle.close() }

    while offset < totalSize {
      let length = min(chunkSize, totalSize - offset)

      // Read exactly one chunk at the current offset; only this chunk is in memory at a time
      fileHandle.seek(toFileOffset: UInt64(offset))
      let chunkData = fileHandle.readData(ofLength: Int(length))

      var request = URLRequest(url: url)
      request.httpMethod = "PUT"
      request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
      request.setValue("bytes \(offset)-\(offset + length - 1)/\(totalSize)", forHTTPHeaderField: "Content-Range")
      request.httpBody = chunkData

      let (_, response) = try await URLSession.shared.data(for: request)
      // GCS answers 308 (Resume Incomplete) for intermediate chunks, 200/201 for the final one
      guard let httpResponse = response as? HTTPURLResponse,
            (200...299).contains(httpResponse.statusCode) || httpResponse.statusCode == 308 else {
        return false
      }
      offset += length
    }

    sendEvent("event", ["message": "Upload complete"]) // Send event to React Native
    return true
  }
}
Index.ts Interface
import ModuleNameModule from './ModuleNameModule';

export async function getFileSize(filePath: string) {
  return await ModuleNameModule.fileSize(filePath);
}

export async function upload(filePath: string, uploadUrl: string, startByte: number) {
  return await ModuleNameModule.uploadFile(filePath, uploadUrl, startByte);
}

export { default } from './ModuleNameModule';
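Both modules report completion through the "event" event rather than uploadFile's return value (on Android the call returns immediately and the upload continues in a coroutine). Here is a minimal sketch of subscribing to it, assuming the EventEmitter API from expo-modules-core:

import { EventEmitter } from 'expo-modules-core';
import ModuleNameModule from './ModuleNameModule';

// Wrap the native module so JavaScript can listen for its emitted events
const emitter = new EventEmitter(ModuleNameModule);

export function onUploadEvent(listener: (payload: { message: string }) => void) {
  // Returns a subscription; call .remove() on it when the listener is no longer needed
  return emitter.addListener('event', listener);
}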
Usage in React Native
import { upload } from '../../modules/module-name/src';
upload(filePath, uploadUrl, startByte);
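The startByte argument is what makes the upload resumable after the app is killed or loses connectivity. Per the GCS resumable upload protocol, you can ask the session how many bytes it has already committed and pick up from there; resumeUpload below is a hypothetical helper sketching that check:

import { getFileSize, upload } from '../../modules/module-name/src';

async function resumeUpload(filePath: string, uploadUrl: string) {
  const totalBytes = await getFileSize(filePath);

  // Per the GCS protocol, an empty-bodied PUT with "bytes */<total>" returns
  // 308 plus a Range header describing what the session has committed so far
  const statusResponse = await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Range': `bytes */${totalBytes}` },
  });

  let startByte = 0;
  if (statusResponse.status === 308) {
    const range = statusResponse.headers.get('Range'); // e.g. "bytes=0-52428799"
    if (range) {
      startByte = parseInt(range.split('-')[1], 10) + 1;
    }
  }

  await upload(filePath, uploadUrl, startByte);
}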
Results
✅ Upload speeds returned to production-ready levels
✅ Memory usage remained stable—no crashes
✅ Files over 2GB uploaded successfully, without corruption
Final Thoughts
If you’re building a React Native app that needs large file uploads and are experiencing:
- Memory leaks (
RNFetchBlob
) - Slow speeds (
fetch
) - Corrupted files over 2GB
…then building a custom native module is likely your best option. It’s more effort than a JS-only solution, but the performance and reliability gains are well worth it.
Want help solving your toughest mobile challenges? Reach out to us; we’d love to collaborate!
Frequently Asked Questions
Why do memory leaks happen during large file uploads in React Native?
Memory leaks often occur when large files are uploaded using libraries like RNFetchBlob. The garbage collector doesn't release memory efficiently for each chunk read into memory, which leads to escalating usage and app crashes, especially during large uploads.
Why do files over 2GB get corrupted?
Byte offsets in the JavaScript layer can be truncated to 32-bit integers, which max out at ~2.14GB. When uploading files larger than this, offsets become inaccurate, leading to corrupted data during resumable uploads to services like Google Cloud Storage.
What's the most reliable way to upload large files from React Native?
The most reliable approach we found is to offload file upload logic to native code using Expo Native Modules. This allows for better memory management and supports large file sizes without corruption by using 64-bit integers.
Why not just use fetch?
While fetch avoided the memory leak, it couldn't stream binary data efficiently. It required converting base64 to binary on the fly, which severely degraded upload speeds, dropping to about 1 Mbps, far too slow for production use.
Can you help us with our file upload problems?
If you're stuck in the jungle of memory leaks, slow uploads, or mysterious file corruption gremlins, then yes, we can help. We've wrestled these bugs into submission before, and we'd love to dig into your challenge next. Let's fix it together (and make your app love file uploads again).