You could try using the LIMIT feature. If you do this:

SELECT * FROM MyTable ORDER BY whatever LIMIT 0,1000;

You’ll get the first 1,000 rows. The first LIMIT value (0) defines the starting row in the result set. It’s zero-indexed, so 0 means “the first row”. The second LIMIT value is the maximum number of rows to retrieve. To get the next few sets of 1,000, do this:
SELECT * FROM MyTable ORDER BY whatever LIMIT 1000,1000; -- rows 1,001 - 2,000
SELECT * FROM MyTable ORDER BY whatever LIMIT 2000,1000; -- rows 2,001 - 3,000
And so on. When the SELECT returns no rows, you’re done.
This isn’t enough on its own, though: the offsets are computed against the live table, so any rows inserted, deleted, or reordered while you’re processing your 1K rows at a time will shift the offsets, and you’ll skip or duplicate rows. To freeze the results in time, start by querying the results into a temporary table:
CREATE TEMPORARY TABLE MyChunkedResult AS (
  SELECT *
  FROM MyTable
  ORDER BY whatever
);
Side note: it’s a good idea to make sure the temporary table doesn’t exist beforehand:
DROP TEMPORARY TABLE IF EXISTS MyChunkedResult;
At any rate, once the temporary table is in place, pull the row chunks from there:
SELECT * FROM MyChunkedResult LIMIT 0,1000;
SELECT * FROM MyChunkedResult LIMIT 1000,1000;
SELECT * FROM MyChunkedResult LIMIT 2000,1000;

...and so on.
I’ll leave it to you to create the logic that advances the offset after each chunk and checks for the end of the results. I’d also recommend much larger chunks than 1,000 records; that’s just a number I picked out of the air.
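As a rough sketch of that looping logic, here’s the whole flow in Python. I’m using sqlite3 here as a stand-in for a MySQL client (the LIMIT offset,count syntax happens to work in both), and the id column and sample data are made up purely for illustration:

```python
import sqlite3

CHUNK_SIZE = 1000  # use something much larger in practice

# In-memory SQLite stands in for MySQL; the table is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO MyTable VALUES (?, ?)",
    [(i, "row %d" % i) for i in range(2500)],
)

# Freeze the result set in a temporary table first.
conn.execute("DROP TABLE IF EXISTS MyChunkedResult")
conn.execute(
    "CREATE TEMPORARY TABLE MyChunkedResult AS "
    "SELECT * FROM MyTable ORDER BY id"
)

offset = 0
total = 0
while True:
    rows = conn.execute(
        "SELECT * FROM MyChunkedResult LIMIT ?,?", (offset, CHUNK_SIZE)
    ).fetchall()
    if not rows:          # empty chunk: no more results
        break
    total += len(rows)    # ...process the chunk here...
    offset += CHUNK_SIZE  # advance to the next chunk

conn.execute("DROP TABLE MyChunkedResult")
print(total)  # 2500
```

The empty-chunk check doubles as the end-of-results test, so there’s no need to count the rows up front.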
Finally, it’s good form to drop the temporary table when you’re done:
DROP TEMPORARY TABLE MyChunkedResult;