Use webhooks when you want ExpressCSV to deliver validated records directly to your backend instead of (or in addition to) processing them in the browser.
```tsx
import { useExpressCSV, x } from "@expresscsv/react";

const schema = x.row({
  name: x.string().label("Full Name"),
  email: x.string().email().label("Email Address"),
});

export function ImportUsersButton() {
  const { open } = useExpressCSV({
    schema,
    publishableKey: "pk_test_...", // Your ExpressCSV publishable key
    importIdentifier: "user-import", // Ties this button to a configured import flow
  });

  return (
    <button
      type="button"
      onClick={() =>
        open({
          webhook: {
            url: "https://api.example.com/webhooks/csv-import", // Receives validated records on your server
            method: "POST", // Use the HTTP verb your endpoint expects
            headers: {
              Authorization: "Bearer your-api-token", // Lets your backend verify the request
            },
            metadata: {
              source: "react-app", // Helps identify which client triggered the import
              userId: "user-123", // Passes app-specific context through to the payload
            },
          },
          onComplete: () => console.log("Webhook delivery initiated"),
          onError: (error) => console.error("Delivery error", error),
        })
      }
    >
      Import users
    </button>
  );
}
```
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `url` | `string` | — | Endpoint URL (required) |
| `method` | `'POST'`, `'PUT'`, or `'PATCH'` | `'POST'` | HTTP method |
| `headers` | `Record<string, string>` | — | Custom headers sent with each request |
| `timeout` | `number` | — | Request timeout in milliseconds |
| `retries` | `number` | `5` | Max retry attempts for failed chunks |
| `metadata` | `Record<string, unknown>` | — | Arbitrary data included in the payload |
| `awaitWebhookArrival` | `boolean` | `false` | Wait for webhook confirmation before showing success |
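To illustrate the optional fields together, a webhook configuration might look like the following sketch. Every value here is a placeholder, not a recommendation:

```typescript
// Illustrative webhook options; all values are placeholders.
const webhook = {
  url: "https://api.example.com/webhooks/csv-import",
  method: "PUT", // defaults to 'POST'
  headers: { Authorization: "Bearer your-api-token" },
  timeout: 15_000, // fail a chunk request after 15 seconds
  retries: 3, // lower than the default of 5 attempts
  metadata: { source: "react-app" },
  awaitWebhookArrival: true, // hold the success state until the server confirms
};
```

You would then pass this object as the `webhook` option to `open()`.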
Each chunk is delivered as a JSON request to your endpoint:
```ts
interface WebhookPayload {
  records: Record<string, unknown>[];
  chunkIndex: number;
  totalChunks: number;
  totalRecords: number;
  metadata?: Record<string, unknown>;
  delivery: {
    publishableKey: string;
    environmentName: string;
    environmentType: string;
    teamSlug: string;
    importIdentifier: string;
    deliveryId: string;
    timestamp: string; // ISO 8601
  };
}
```
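Because the request body arrives untyped, a handler may want a lightweight runtime check before casting. A minimal sketch, covering only the fields most handlers rely on (this helper is not part of the SDK):

```typescript
// Sketch: verify an unknown body has the chunk fields a handler depends on.
interface ChunkPayload {
  records: Record<string, unknown>[];
  chunkIndex: number;
  totalChunks: number;
}

function isChunkPayload(body: unknown): body is ChunkPayload {
  if (typeof body !== "object" || body === null) return false;
  const b = body as Record<string, unknown>;
  return (
    Array.isArray(b.records) &&
    typeof b.chunkIndex === "number" &&
    typeof b.totalChunks === "number"
  );
}
```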
Example payload:
```json
{
  "records": [
    { "name": "Alice Johnson", "email": "alice@example.com" },
    { "name": "Bob Smith", "email": "bob@example.com" }
  ],
  "chunkIndex": 0,
  "totalChunks": 3,
  "totalRecords": 2500,
  "metadata": {
    "source": "react-app",
    "userId": "user-123"
  },
  "delivery": {
    "publishableKey": "pk_live_abc123",
    "environmentName": "Production",
    "environmentType": "production",
    "teamSlug": "my-team",
    "importIdentifier": "user-import",
    "deliveryId": "del_abc123",
    "timestamp": "2026-03-10T14:30:00.000Z"
  }
}
```
The request includes `Content-Type: application/json` plus any custom headers you specified.
Your endpoint should return a 2xx status to acknowledge each chunk. Here's a minimal Fastify example:
```ts
import Fastify from "fastify";

const app = Fastify();

app.post("/webhooks/csv-import", async (request, reply) => {
  const token = request.headers.authorization;
  if (token !== "Bearer your-api-token") {
    return reply.status(401).send({ error: "Unauthorized" });
  }

  const { records, chunkIndex, totalChunks, delivery } = request.body as {
    records: Array<Record<string, unknown>>;
    chunkIndex: number;
    totalChunks: number;
    delivery: { deliveryId: string };
  };

  try {
    // `db` stands in for your own database client; persist the current
    // chunk before acknowledging it.
    await db.insertMany("users", records);
    request.log.info(
      `Chunk ${chunkIndex + 1}/${totalChunks} processed ` +
        `(${records.length} records, delivery ${delivery.deliveryId})`
    );
    return reply.status(200).send({ success: true });
  } catch (error) {
    request.log.error({ error }, "Failed to process chunk");
    return reply.status(500).send({ error: "Internal server error" });
  }
});

await app.listen({ port: 3000 }); // Start the webhook receiver
```
- 5xx and 429 responses are retried automatically (up to 5 attempts per chunk by default)
- 4xx responses (except 429) are treated as permanent failures and are not retried
- Chunks are delivered serially — the next chunk is only sent after the previous one succeeds
- The same chunk may be delivered more than once on retry
Use `chunkIndex` and `totalChunks` to track progress; `chunkIndex === totalChunks - 1` marks the last chunk. The `deliveryId` is the same across every chunk of a single import, so you can use it to deduplicate or group chunks on your server.
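Since retries can redeliver a chunk, one simple approach is an idempotency guard keyed on `deliveryId` and `chunkIndex`. An in-memory sketch (production code would persist seen keys in a database or cache instead):

```typescript
// Track chunks already processed, keyed on deliveryId + chunkIndex,
// since the same chunk may arrive twice on retry.
const processedChunks = new Set<string>();

function shouldProcessChunk(deliveryId: string, chunkIndex: number): boolean {
  const key = `${deliveryId}:${chunkIndex}`;
  if (processedChunks.has(key)) return false; // duplicate delivery: skip, but still ack with 2xx
  processedChunks.add(key);
  return true;
}
```

Even when skipping a duplicate, the handler should still respond 2xx so the sender does not keep retrying.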