
Cracking the Code: Fixing Memory Leaks and File Corruption in React Native GCP Uploads


Written by Jacob Fenner, Software Engineer at Seven Hills Technology

At Seven Hills Technology, we were building a complex mobile feature that involved uploading large files to Google Cloud Storage from a React Native app. But we hit a wall: persistent memory leaks that crashed the app, and corrupted uploads for files over 2GB.

This post shares how we diagnosed the problem, tested alternatives, and ultimately solved it by building custom Expo Native Modules for iOS and Android.

The Problem: Memory Leak in RNFetchBlob Resumable Uploads

Our initial approach used RNFetchBlob, a popular React Native library, to chunk and upload large files to GCS. But during large uploads, the app would consume all available memory—eventually crashing.

Root Cause (Suspected)

The garbage collector doesn’t seem to release memory correctly for each file chunk read into memory during uploads. While we didn’t pinpoint this at the bytecode level, all signs pointed to it.
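To put the leak in perspective, here is a rough back-of-the-envelope sketch (an estimate, not profiler output) of what every un-collected chunk costs on the JS heap:

typescript
// Rough illustration (not measured data): base64 encoding inflates data by ~4/3,
// so every 20MB chunk read with RNFS.read(..., 'base64') becomes a roughly
// 28-million-character string on the JS heap. If those strings are never
// collected, a 4GB file (about 200 chunks) exhausts memory long before the
// upload finishes.
const CHUNK_SIZE = 1024 * 1024 * 20;               // 20MB of raw file data
const base64Chars = Math.ceil(CHUNK_SIZE / 3) * 4; // 27,962,028 characters per chunk
console.log(`${(base64Chars / 1e6).toFixed(1)} million characters per un-collected chunk`);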

Our Original Use

typescript
let bytesUploaded = 0;
let currentChunk = 0;
const CHUNK_SIZE = 1024 * 1024 * 20; // 20MB
const totalBytes = (await RNFS.stat(filePath)).size;
while (bytesUploaded < totalBytes) {
  const offset = currentChunk * CHUNK_SIZE;
  const length = Math.min(CHUNK_SIZE, totalBytes - offset);
  // Each chunk is read into memory as a base64 string
  const chunk = await RNFS.read(filePath, length, offset, 'base64');
  const contentRange = `bytes ${offset}-${offset + length - 1}/${totalBytes}`;
  const chunkUploadResponse: FetchBlobResponse = (await handleUploadFileChunk(
    signedUrls.current[key],
    chunk,
    currentContentType,
    contentRange
  )) as any;
  if (chunkUploadResponse?.respInfo?.status < 400) {
    currentChunk += 1;
    bytesUploaded += length;
  } else {
    console.error('error: ', chunkUploadResponse?.data);
  }
}
typescript
async uploadFileChunk(
  url: string,
  chunk: any,
  contentType: 'application/json' | 'video/mp4',
  contentRange: string
) {
  try {
    const response = await RNFetchBlob.fetch(
      'PUT',
      url,
      {
        'Content-Type': contentType + ';BASE64',
        'Content-Range': contentRange,
      },
      chunk
    );
    return response;
  } catch (error) {
    console.error(error);
    throw error;
  }
}

Tried and Failed: Switching to fetch

We briefly considered React Native’s built-in fetch API. While it did avoid the memory leak, it came with a severe drawback: fetch doesn’t support direct binary streaming, so each base64 chunk had to be converted into binary via atob and a Uint8Array:

javascript
const binary = atob(chunk);
const data = new Uint8Array(binary.length);
for (let i = 0; i < binary.length; i++) {
  data[i] = binary.charCodeAt(i);
}

This conversion dropped upload speeds to around 1 Mbps, which is unacceptable for production use.

We Tried Everything Else

We tested nearly every file upload library available in the React Native ecosystem. None of them offered reliable chunked uploads without hitting the same issues, and some lacked support for resumable uploads entirely.

The Real Solution: Expo Native Modules

We finally solved the problem by offloading the upload logic to native iOS and Android code using Expo Native Modules. This had two major benefits:

1. No More Memory Leaks

Memory management is handled natively, so no more crashes on large files.

2. Resolved a Hidden 2GB File Corruption Bug

Native code let us do the byte-offset math with true 64-bit integers (Int64 in Swift, Long in Kotlin), which fixed a subtle corruption bug affecting files larger than 2GB. More on that below.

Secondary Issue: File Corruption on Files > 2GB

Any file over 2GB uploaded corrupted. Somewhere in the JavaScript upload path, byte offsets were being treated as 32-bit signed integers, which max out at 2,147,483,647. GCS requires an accurate byte range for every chunk of a resumable upload, so once an offset overflowed, the assembled object was corrupted.
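As a quick illustration of what a 32-bit cast does to an offset past the 2GB mark (a contrived example, not our production code):

typescript
// Contrived illustration: forcing a byte offset through a 32-bit signed integer,
// as a bitwise op or a 32-bit native bridge type would, wraps it negative.
const offset = 2_500_000_000;   // a legitimate offset inside a 2.5GB file
const truncated = offset | 0;   // 32-bit signed truncation
console.log(truncated);         // -1794967296, a nonsensical Content-Range start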

Solution

In the native module, we use 64-bit integers (e.g., Int64 in Swift, Long in Kotlin) to calculate and pass correct byte ranges for uploads, avoiding corruption.

Native Module Implementation (Simplified Overview)

We won’t walk through every line of code, but here are the high-level steps:

Prerequisites

Place your modules in a modules/ directory in your project root:

bash
mkdir modules && cd modules && npx create-expo-module

Native Module Code

Android – Kotlin (5MB chunks)

  • Use RandomAccessFile to stream 5MB chunks in a coroutine loop
kotlin
package expo.modules.resumableupload

import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition
import kotlinx.coroutines.*
import java.io.File
import java.io.RandomAccessFile
import java.net.HttpURLConnection
import java.net.URL
import kotlin.math.min

class ModuleNameModule : Module() {
  private val scope = CoroutineScope(Dispatchers.IO)

  override fun definition() = ModuleDefinition {
    Name("ModuleName")

    Events("event")

    Function("uploadFile") { filePath: String, uploadUrl: String, startByte: Long ->
      scope.launch {
        upload(filePath, uploadUrl, startByte)
      }
    }

    Function("fileSize") { filePath: String ->
      File(filePath).length()
    }
  }

  private suspend fun upload(filePath: String, uploadUrl: String, startByte: Long): Boolean = withContext(Dispatchers.IO) {
    val file = File(filePath)
    val totalSize = file.length()
    val chunkSize = 1024 * 1024 * 5L // 5MB
    var offset = startByte

    while (offset < totalSize) {
      // Read the next chunk directly from disk; offsets stay 64-bit Longs
      val length = min(chunkSize, totalSize - offset)
      val chunk = ByteArray(length.toInt())
      RandomAccessFile(file, "r").use { raf ->
        raf.seek(offset)
        raf.readFully(chunk)
      }

      // PUT the chunk to the upload URL with its exact byte range
      val connection = (URL(uploadUrl).openConnection() as HttpURLConnection).apply {
        requestMethod = "PUT"
        doOutput = true
        setRequestProperty("Content-Type", "application/octet-stream")
        setRequestProperty("Content-Range", "bytes $offset-${offset + length - 1}/$totalSize")
      }
      connection.outputStream.use { it.write(chunk) }

      if (connection.responseCode !in 200..299) {
        connection.disconnect()
        return@withContext false
      }
      offset += length
      connection.disconnect()
    }

    sendEvent("event", mapOf("message" to "Upload complete")) // Notify React Native
    return@withContext true
  }
}

iOS – Swift (5MB chunks)

  • Use FileHandle and URLSession to send byte-specific chunks
  • 64-bit integers (Int64) ensure correct byte ranges
swift
import ExpoModulesCore

public class ModuleNameModule: Module {
  private let chunkSize: Int64 = 1024 * 1024 * 5 // 5MB

  public func definition() -> ModuleDefinition {
    Name("ModuleName")

    Events("event")

    AsyncFunction("fileSize") { (filePath: String) -> Int64 in
      return try self.getFileSize(filePath: filePath)
    }

    AsyncFunction("uploadFile") { (filePath: String, uploadUrl: String, startByte: Int64) async throws -> Bool in
      return try await self.upload(filePath: filePath, uploadUrl: uploadUrl, startByte: startByte)
    }
  }

  private func getFileSize(filePath: String) throws -> Int64 {
    let fileURL = URL(fileURLWithPath: filePath)
    let attributes = try FileManager.default.attributesOfItem(atPath: fileURL.path)
    guard let fileSize = attributes[.size] as? Int64 else {
      throw NSError(domain: "FileError", code: 0, userInfo: [NSLocalizedDescriptionKey: "Unable to determine file size."])
    }
    return fileSize
  }

  private func upload(
    filePath: String,
    uploadUrl: String,
    startByte: Int64
  ) async throws -> Bool {
    let fileURL = URL(fileURLWithPath: filePath)
    let totalSize = try getFileSize(filePath: filePath)
    var offset = startByte

    guard let url = URL(string: uploadUrl) else {
      throw NSError(domain: "UploadError", code: 0, userInfo: [NSLocalizedDescriptionKey: "Invalid upload URL."])
    }

    let fileHandle = try FileHandle(forReadingFrom: fileURL)
    defer { try? fileHandle.close() }

    while offset < totalSize {
      // Read the next chunk at a 64-bit file offset
      let length = min(chunkSize, totalSize - offset)
      fileHandle.seek(toFileOffset: UInt64(offset))
      let chunkData = fileHandle.readData(ofLength: Int(length))

      // PUT the chunk to the upload URL with its exact byte range
      var request = URLRequest(url: url)
      request.httpMethod = "PUT"
      request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
      request.setValue("bytes \(offset)-\(offset + length - 1)/\(totalSize)", forHTTPHeaderField: "Content-Range")
      request.httpBody = chunkData

      let (_, response) = try await URLSession.shared.data(for: request)

      guard let httpResponse = response as? HTTPURLResponse, (200...299).contains(httpResponse.statusCode) else {
        return false
      }

      offset += length
    }

    sendEvent("event", ["message": "Upload complete"]) // Notify React Native
    return true
  }
}

Index.ts Interface

typescript
import ModuleNameModule from './ModuleNameModule';

export async function getFileSize(filePath: string) {
  return await ModuleNameModule.fileSize(filePath);
}

export async function upload(filePath: string, uploadUrl: string, startByte: number) {
  return await ModuleNameModule.uploadFile(filePath, uploadUrl, startByte);
}

export { default } from './ModuleNameModule';

Usage in React Native

javascript
import { upload } from '../../modules/module-name/src';

upload(filePath, uploadUrl, startByte);
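
Here is a slightly fuller usage sketch (hypothetical wiring, not our exact app code; adjust to your Expo SDK version and module names) that also listens for the module's completion event and resumes from a known offset:

typescript
import { EventEmitter } from 'expo-modules-core';
import ModuleNameModule, { getFileSize, upload } from '../../modules/module-name/src';

// Hypothetical: subscribe to the "event" the native module emits on completion.
// On newer Expo SDKs the module exposes addListener directly instead.
const emitter = new EventEmitter(ModuleNameModule);
const subscription = emitter.addListener('event', (payload: { message: string }) => {
  console.log(payload.message); // "Upload complete"
});

// Hypothetical resume flow: continue from the last byte the server confirmed.
async function startOrResumeUpload(filePath: string, uploadUrl: string, bytesAlreadyUploaded: number) {
  const totalBytes = await getFileSize(filePath);
  console.log(`Uploading ${totalBytes} bytes, starting at byte ${bytesAlreadyUploaded}`);
  await upload(filePath, uploadUrl, bytesAlreadyUploaded);
}

// Clean up when the screen unmounts: subscription.remove();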

Results

✅ Upload speeds returned to production-ready levels
✅ Memory usage remained stable—no crashes
✅ Files over 2GB uploaded successfully, without corruption

Final Thoughts

If you’re building a React Native app that needs large file uploads and are experiencing:

  • Memory leaks (RNFetchBlob)
  • Slow speeds (fetch)
  • Corrupted files over 2GB

…then building a custom native module is likely your best option. It’s more effort than a JS-only solution, but the performance and reliability gains are well worth it.

Want help solving your toughest mobile challenges? Reach out to us; we’d love to collaborate!