🕶️
- slow is smooth, smooth is fast -
sunday-driver works through a large file at a responsible pace - it pauses at given points to let you consider the data, and waits until you're done before resuming.
this lets you process a large file in sizable chunks, without race conditions or memory leaks.
(heavily) inspired by line-by-line, by Markus Ostertag🙏
```
npm i sunday-driver
```
```js
const sundayDriver = require('sunday-driver')

let options = {
  file: './my/large/file.tsv',
  splitter: '\n',
  start: '80%', //as percentages, or in bytes
  end: '100%',

  //do your thing, for each segment
  each: (chunk, resume) => {
    console.log(chunk) //do your thing..
    resume()
  },

  //log progress-based events
  atPercent: {
    50: (status) => { console.log(status) },
    75: (status) => { console.log(status) },
  },

  //log time-based events
  atInterval: {
    30: (status) => { console.log(status) },
    90: (status) => { console.log(status) },
  },
}
sundayDriver(options)
```
any events/intervals will provide you with all the details of the current reader's status:
```js
/*{
  chunksDone: 10,   // how many times we've called the 'each' function
  bytesDone: 20480, // how many bytes we've processed so far
  filesize: 61440,  // size of the whole file
  position: 34.42,  // where, in percentage, we are in the file (if we didn't start at the top!)
  progress: 68.84   // how far, in percentage, we are to being complete
}*/
```
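a note on reading those two percentages: `position` is measured against the whole file, while `progress` is measured against the slice you asked for. the `progressOf` helper below is not part of the package - it's a sketch of the relationship, inferred from the sample numbers above (which fit a run with `start: '0%'` and `end: '50%'`):

```javascript
// progress = how far through [start, end] a given whole-file position is
// (hypothetical helper, inferred from the sample status object)
const progressOf = (position, startPct, endPct) =>
  ((position - startPct) / (endPct - startPct)) * 100

console.log(progressOf(34.42, 0, 50).toFixed(2)) // → '68.84'
```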
it was built to support unleashing multiple workers on the same file, and letting them run safely and responsibly, without blowing any fuses.
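one way to split a file among workers is to hand each one a non-overlapping percentage range via the `start`/`end` options shown above. the `splitRanges` helper here is hypothetical - just a sketch of carving up the ranges:

```javascript
// carve 0-100% into n equal, non-overlapping ranges
// (hypothetical helper, not part of sunday-driver's API)
const splitRanges = (n) => {
  const step = 100 / n
  return Array.from({ length: n }, (_, i) => ({
    start: `${i * step}%`,
    end: `${(i + 1) * step}%`,
  }))
}

// four workers, each handed its own slice of the same file
console.log(splitRanges(4))
// → [ {start:'0%', end:'25%'}, {start:'25%', end:'50%'},
//     {start:'50%', end:'75%'}, {start:'75%', end:'100%'} ]
```

each range would then go into its own worker's options object, so no two readers ever touch the same bytes.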
MIT