File uploads are a very common feature in web applications.
However, once you start implementing them, you quickly realize that file uploads are not just a UI concern — they are fundamentally about defining validation responsibility and security boundaries.
Attempting to achieve perfect security for file uploads on the frontend is meaningless.
In a browser environment, perfect security does not exist, and the ultimate responsibility for validation always belongs to the server.
The role of the frontend is not to complete security, but to design a validation gateway that applies minimal, well-defined policies before input is sent to the server.
In this post, I’ll explain how I designed that gateway, and where I believe the frontend’s responsibility for file upload validation should reasonably end.
There are many aspects to consider when validating file uploads on the frontend: the number of files, file size, file extension, MIME type, and the actual file content.
Among these, I determined that the reasonable scope of frontend responsibility covers limiting the number of uploaded files (as behavior control), file size, and file signature validation.
Relying solely on file extensions or MIME types is insufficient.
File extensions can be easily spoofed, and MIME types are often unreliable or inconsistently reported.
For that reason, the frontend should perform only minimal, content-based validation — specifically, checking file header signatures — and leave more comprehensive validation to the server.
This separation provides the most practical and maintainable boundary between frontend and backend responsibilities.
Once validation rules start living inside UI logic, they quickly become scattered and difficult to maintain.
To avoid this, I modeled file upload validation rules using Zod, expressing them as a schema rather than UI conditions.
Zod was not chosen simply as a “form validation library,” but because it allows validation rules to be expressed as domain-level models, independent of UI concerns.
(* More precisely, Zod provides runtime, schema-based type validation, which makes it well-suited for form validation.)
In this project, Zod is integrated with React Hook Form, but RHF itself is outside the scope of this article.
Below is a function that creates a Zod schema for validating a FileList.
type CreateFileListSchemaArgs = {
  maxFileSize: number
  validFileSignatures?: FileSignatures
  messages?: {
    maxFileSize?: (file?: File) => string
    validFileSignatures?: (file?: File) => string
  }
}

const getFileExtension = (filename: string) => {
  return filename.match(/\.([^.]+)$/)?.[1] || ''
}

const createFileListSchema = ({
  maxFileSize,
  messages,
  validFileSignatures
}: CreateFileListSchemaArgs) => {
  return z.instanceof(FileList).superRefine(async (fileList, ctx) => {
    if (typeof maxFileSize !== 'number') return
    const files = Array.from(fileList)

    // Size check: flag every file that exceeds the limit.
    files.forEach((file, index) => {
      if (file.size > maxFileSize) {
        ctx.addIssue({
          code: 'custom',
          path: [index],
          message:
            messages?.maxFileSize?.(file) ?? 'The file size is too large.'
        })
      }
    })

    // Signature check: an extension without a defined signature falls back
    // to the sentinel and is rejected (allowlist by default).
    await Promise.all(
      files.map(async (file, index) => {
        const ext = getFileExtension(file.name).toLowerCase()
        const fileSignature = validFileSignatures?.[ext] ?? [MAGIC_BYTES]
        const isValid = await isValidFile(file, { [ext]: fileSignature })
        if (!isValid) {
          ctx.addIssue({
            code: 'custom',
            path: [index],
            message:
              messages?.validFileSignatures?.(file) ?? 'The file is not valid.'
          })
        }
      })
    )
  })
}
This validation is not a simple true/false check on a single value.
It represents a structure where each file can fail for different reasons.
- One file may exceed the size limit
- Another may fail signature validation
- Some files may pass while others fail
In such cases, superRefine is more appropriate than just refine.
Using superRefine allows:
- Iterating over the file list
- Validating each file independently
- Using path: [index] to clearly indicate which file failed
In this implementation, the same path is intentionally used so that each file produces only one error message.
If needed, this could be further refined into paths like [index, 'size'] or [index, 'signature'].
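Because each issue carries the failing file's index in its path, the UI layer can map errors back to individual files without any extra bookkeeping. Here is a minimal, stdlib-only sketch of that mapping; the Issue shape mirrors Zod's issues, and groupIssuesByFile is a hypothetical helper, not part of Zod:

```typescript
type Issue = { path: (string | number)[]; message: string }

// Group validation issues by the file index recorded in path[0].
const groupIssuesByFile = (issues: Issue[]): Map<number, string[]> => {
  const grouped = new Map<number, string[]>()
  for (const issue of issues) {
    const index = issue.path[0]
    if (typeof index !== 'number') continue
    const messages = grouped.get(index) ?? []
    messages.push(issue.message)
    grouped.set(index, messages)
  }
  return grouped
}

// Issues shaped like the ones produced by the schema above.
const issues: Issue[] = [
  { path: [0], message: 'The file size is too large.' },
  { path: [2], message: 'The file is not valid.' }
]
const byFile = groupIssuesByFile(issues)
// byFile.get(0) -> messages for the first file; byFile.get(1) -> undefined
```

A file with no entry in the map passed validation, which keeps the rendering logic a simple per-index lookup.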
File signature validation is handled by the following utility function:
export const MAGIC_BYTES = 999 as const

const isValidFile = async (
  file: File,
  signatures?: FileSignatures
): Promise<boolean> => {
  if (!signatures) return true

  // Read only the first 8 bytes — enough for common magic numbers.
  const buffer = await file.slice(0, 8).arrayBuffer()
  const bytes = new Uint8Array(buffer)

  // Sentinel check: MAGIC_BYTES means the file type has no defined
  // signature and is disallowed by policy.
  const isAllowedSignature = Object.values(signatures).flat()[0] !== MAGIC_BYTES
  if (!isAllowedSignature) return false

  return Object.values(signatures).some((signature) =>
    signature.every((byte, i) => bytes[i] === byte)
  )
}

export default isValidFile
Here, FileSignatures is a table that defines allowed file header signatures (i.e., actual magic bytes defined by file formats) per file extension.
Instead of relying on extensions or MIME types as metadata, validation is performed based on the actual binary header of the file.
This implementation follows an allowlist-based policy:
only file signatures that are explicitly defined are allowed to pass.
Any file extension without a defined signature is rejected by default.
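For illustration, such a table might look like the following. The byte values are the real magic numbers defined by each format's specification, but the table itself (FILE_SIGNATURES and matchesSignature) is a hypothetical sketch, not the exact one from this project:

```typescript
// Allowlist: extension -> magic-byte prefix, as defined by each file format.
const FILE_SIGNATURES: Record<string, number[]> = {
  png: [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a], // \x89PNG\r\n\x1a\n
  jpg: [0xff, 0xd8, 0xff],                                // JPEG SOI marker
  gif: [0x47, 0x49, 0x46, 0x38],                          // "GIF8"
  pdf: [0x25, 0x50, 0x44, 0x46]                           // "%PDF"
}

// A file matches when its first bytes equal the expected prefix.
const matchesSignature = (bytes: Uint8Array, signature: number[]): boolean =>
  signature.every((byte, i) => bytes[i] === byte)

const pngHeader = new Uint8Array([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a])
```

Adding support for a new file type then means adding one entry to the table, rather than touching the validation logic.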
Some file types (for example, txt) do not have a unique file signature.
In these cases, the signature array would be empty, and naive comparison logic would always fail.
To explicitly represent this condition, MAGIC_BYTES is used as a sentinel value indicating that a file type has no defined signature.
When this value is encountered, the file type is treated as policy-wise disallowed, and validation fails immediately.
This avoids ambiguous edge cases and keeps the validation logic explicit and intention-revealing.
The file upload validation described in this article is not an attempt to achieve complete security.
On the frontend, file validation serves as a validation gateway that applies minimal policies before data reaches the server.
To summarize:
- Validation responsibility between frontend and backend is clearly separated
- Validation rules are expressed as schemas, not UI logic
- Decisions are made based on actual file content, not metadata
- File uploads are not a component-level problem, but a matter of policy and system design
By designing file upload validation around these principles, it becomes possible to build systems that are not only functional, but also maintainable, extensible, and architecturally sound.