Handle data when > 6000 rows
At the time of this commit, the PostgREST row limit is 6000.

I chose instead to paginate in pages of 5000, allowing for a maximum of 10000 rows in total, because 10000 rows is the hard limit of the Airtable sync API.

By the time we have 10000+ users, I hope this edge function will have been replaced by a more reliable method.
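The strategy described above (pages of 5000, capped at the Airtable limit of 10000) generalizes to a small loop. A sketch, assuming a hypothetical `fetchPage` callback standing in for the real `supabase.from(table).select('*', { count: 'exact' }).range(from, to).csv()` call:

```typescript
type Page = { data: string; count: number };

const PAGE_SIZE = 5000;
const MAX_ROWS = 10000; // Airtable sync API hard limit

// Fetch all rows as one CSV string, page by page. The first page carries
// the exact total count; each later page's repeated CSV header is stripped
// before concatenation, mirroring the replace(/.*\n/, '') in the commit.
async function fetchAllCsv(
  fetchPage: (from: number, to: number) => Promise<Page>,
  pageSize: number = PAGE_SIZE,
  maxRows: number = MAX_ROWS,
): Promise<string> {
  const first = await fetchPage(0, pageSize - 1);
  let csv = first.data;
  const limit = Math.min(first.count, maxRows);
  for (let from = pageSize; from < limit; from += pageSize) {
    const page = await fetchPage(from, from + pageSize - 1);
    csv += '\n' + page.data.replace(/.*\n/, ''); // drop repeated header line
  }
  return csv;
}
```

Only the first request needs `count: 'exact'`; the loop bound comes from that count, clamped to the API's row cap.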
farnoux committed Jan 8, 2025
1 parent bc6ce98 commit 42476b3
Showing 1 changed file with 25 additions and 2 deletions.
27 changes: 25 additions & 2 deletions supabase/functions/crm_sync/index.ts
@@ -21,7 +21,23 @@ serve(async (req) => {
   const { token, table, sync } = await req.json();
 
   // Retrieve the table data in CSV format.
-  const csvResponse = await supabase.from(table).select().csv();
+  const { error, data, count } = await supabase
+    .from(table)
+    .select('*', { count: 'exact' })
+    .range(0, 4999)
+    .csv();
+
+  let allData = data;
+
+  if (count > 5000) {
+    const { data: dataOver5000 } = await supabase
+      .from(table)
+      .select('*')
+      .range(5000, 9999)
+      .csv();
+
+    allData += '\n' + dataOver5000.replace(/.*\n/, '') /* remove header */;
+  }
 
   // Send the data to Airtable via the sync endpoint.
   const syncResponse = await fetch(sync, {
@@ -30,10 +46,17 @@ serve(async (req) => {
       Authorization: `Bearer ${token}`,
       'Content-Type': 'text/csv',
     },
-    body: csvResponse.data,
+    body: allData,
   });
 
   const { success } = await syncResponse.json();
+  console.log(
+    JSON.stringify({
+      success,
+      table,
+      nbOfRows: allData.split('\n').length - 1 /* without header */,
+    })
+  );
 
   // Return the status so it persists in the Supabase logs.
   return new Response(
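The logged `nbOfRows` assumes the CSV has exactly one header line and no trailing newline. A small helper (a sketch, not part of the commit) makes those assumptions explicit and counts safely either way:

```typescript
// Count data rows in a CSV string that has a single header line.
// Defensively trims one trailing newline before counting.
function countCsvRows(csv: string): number {
  const trimmed = csv.endsWith('\n') ? csv.slice(0, -1) : csv;
  if (trimmed === '') return 0;
  return trimmed.split('\n').length - 1; // minus the header line
}
```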
