feat: Process TSV files as streams and validate only the first 1000 rows by default #139

Open · effigies wants to merge 8 commits into main

Conversation

@effigies (Contributor) commented Jan 9, 2025

This PR is an optimization. In #138 we found a case with >300k lines in a TSV file. In order to limit the number of lines being inspected, I needed to switch TSV loading to be stream-based instead of slurping the entire file.

This PR does the following:

  • Refactors the UTF-8 enforcement of BIDSFileDeno.text into a stream transformer in files/stream.ts
  • Rewrites loadTSV to process files as a stream (see the sketch after this list):
    • UTF-8 validation
    • Rechunking into lines of text, splitting on \r?\n
  • Rewrites column loading as an array of pre-allocated arrays for efficiency; the ColumnsMap is constructed at the end.
  • Adds a --max-rows flag to the CLI and a maxRows variable to the validator options.
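For illustration, a minimal sketch of those two stream stages, assuming the Web Streams API as available in Deno; the names utf8Stage and lineStage are hypothetical, not the PR's actual exports:

```ts
// Stage 1: UTF-8 enforcement. A TextDecoderStream constructed with
// `fatal: true` throws on invalid byte sequences instead of silently
// substituting U+FFFD.
const utf8Stage = () => new TextDecoderStream('utf-8', { fatal: true })

// Stage 2: rechunk arbitrary text chunks into lines, splitting on \r?\n.
// A partial line at the end of one chunk is buffered until the next
// chunk (or the flush at EOF) completes it.
function lineStage(): TransformStream<string, string> {
  let buffer = ''
  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk
      const lines = buffer.split(/\r?\n/)
      buffer = lines.pop() ?? ''
      for (const line of lines) controller.enqueue(line)
    },
    flush(controller) {
      // A trailing newline leaves an empty buffer, which we drop.
      if (buffer) controller.enqueue(buffer)
    },
  })
}

// Hypothetical wiring, assuming the file exposes a byte ReadableStream:
// file.stream.pipeThrough(utf8Stage()).pipeThrough(lineStage())
```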

Note that this adds a new error condition: empty lines are tolerated only at the end of a file (<content><LF><EOF>). In passing, this permits us to report the line numbers of bad TSV lines.
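A sketch of the shape of that check, with the caveat that the PR processes rows as a stream rather than collecting them all first, and that the issue code and the issues.add call here are hypothetical:

```ts
// Empty lines are only tolerated as the final "line" produced by a
// trailing newline; anywhere else they are reported with a line number.
rows.forEach((row, index) => {
  if (!isContentfulRow(row) && index !== rows.length - 1) {
    issues.add({ code: 'TSV_EMPTY_LINE', line: index + 1 }) // 1-based line numbers
  }
})
```

(isContentfulRow is the PR's helper, shown in the excerpt below.)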

I also do not attempt to add maxRows to the TSV cache key, so calling loadTSV() repeatedly on the same file with different maxRows values will return the result of the first call. This does not seem like a problem for running the validator, but it might surprise future developers. I can look into that, if desired.
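Concretely, assuming the memoized wrapper shares _loadTSV's signature, the caveat looks like this:

```ts
const truncated = await loadTSV(file, 1000) // parses and caches the first 1000 rows
const full = await loadTSV(file, -1) // cache hit: returns the same 1000-row result
```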

Closes #138.

```ts
const normalizeEOL = (str: string): string => str.replace(/\r\n/g, '\n').replace(/\r/g, '\n')
// Typescript resolved `row && !/^\s*$/.test(row)` as `string | boolean`
const isContentfulRow = (row: string): boolean => !!(row && !/^\s*$/.test(row))
async function _loadTSV(file: BIDSFile, maxRows: number = -1): Promise<ColumnsMap> {
```
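For context on the isContentfulRow comment, a minimal reproduction of the inference issue:

```ts
// `row && !/^\s*$/.test(row)` evaluates to `row` itself (the empty string)
// when `row` is '', so TypeScript infers `string | boolean`.
const loose = (row: string) => row && !/^\s*$/.test(row) // inferred: string | boolean
// The double negation coerces the falsy-string case to `false`:
const strict = (row: string): boolean => !!(row && !/^\s*$/.test(row))
```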
@effigies (Contributor, Author) commented:
This file in particular will be easiest to review just by reading the new file, as the contents are almost entirely new.

Comment on lines 20 to 32

```diff
   const cache = new Map<string, Map<F, T>>()
-  const cached = async function (this: any, file: F): Promise<T> {
+  const cached = async function (this: any, file: F, ...args: any[]): Promise<T> {
     let subcache = cache.get(file.parent.path)
     if (!subcache) {
       subcache = new Map()
       cache.set(file.parent.path, subcache)
     }
     let val = subcache.get(file)
     if (!val) {
-      val = await fn.call(this, file)
+      val = await fn.call(this, file, ...args)
       subcache.set(file, val)
     }
     return val
```
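For context, this wrapper is what produces the cached loader; the wiring presumably looks something like the following (the memoize name is an assumption, not shown in this excerpt):

```ts
export const loadTSV = memoize(_loadTSV)
```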
@effigies (Contributor, Author) commented:

It looks like JS has no equivalent to tuples that would work as Map keys (objects and arrays compare by identity), so we would need to key on a string.
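A quick demonstration of why an array can't stand in for a tuple key:

```ts
const m = new Map<[string, number], string>()
m.set(['a.tsv', 1000], 'cached')
m.get(['a.tsv', 1000]) // undefined: Map compares object keys by identity,
                       // and each array literal is a distinct object
```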

```diff
-  const cache = new Map<string, Map<F, T>>()
+  const cache = new Map<string, Map<string, T>>()
   const cached = async function (this: any, file: F, ...args: any[]): Promise<T> {
     let subcache = cache.get(file.parent.path)
     if (!subcache) {
       subcache = new Map()
       cache.set(file.parent.path, subcache)
     }
-    let val = subcache.get(file)
+    const key = `${file.path}:${args.join(',')}`
+    let val = subcache.get(key)
     if (!val) {
       val = await fn.call(this, file, ...args)
-      subcache.set(file, val)
+      subcache.set(key, val)
     }
```

Not sure it's worth it.

@effigies effigies marked this pull request as draft January 10, 2025 16:41
@effigies effigies marked this pull request as ready for review January 11, 2025 12:54
Successfully merging this pull request may close these issues.

Limit TSV loads to a manageable number of lines (#138)