feat: Process TSV files as streams and validate only the first 1000 rows by default #139
base: main
Conversation
const normalizeEOL = (str: string): string => str.replace(/\r\n/g, '\n').replace(/\r/g, '\n')
// Typescript resolved `row && !/^\s*$/.test(row)` as `string | boolean`
const isContentfulRow = (row: string): boolean => !!(row && !/^\s*$/.test(row))
async function _loadTSV(file: BIDSFile, maxRows: number = -1): Promise<ColumnsMap> {
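A minimal sketch of how the helpers above might combine to cap the number of rows inspected. `loadRows` is a hypothetical stand-in for illustration, not the validator's actual `_loadTSV`, which operates on a stream rather than a string:

```typescript
const normalizeEOL = (str: string): string =>
  str.replace(/\r\n/g, '\n').replace(/\r/g, '\n')

// TypeScript narrows `row && !/^\s*$/.test(row)` to `string | boolean`,
// so the double negation forces a boolean return type.
const isContentfulRow = (row: string): boolean => !!(row && !/^\s*$/.test(row))

// Hypothetical helper: split normalized text into non-empty rows,
// stopping after maxRows (-1 means no limit).
function loadRows(text: string, maxRows: number = -1): string[] {
  const rows = normalizeEOL(text).split('\n').filter(isContentfulRow)
  return maxRows < 0 ? rows : rows.slice(0, maxRows)
}
```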
This file in particular will be easiest to review just by reading the new file, as the contents are almost entirely new.
Force-pushed from ef7442a to cfe5120
const cache = new Map<string, Map<F, T>>()
- const cached = async function (this: any, file: F): Promise<T> {
+ const cached = async function (this: any, file: F, ...args: any[]): Promise<T> {
let subcache = cache.get(file.parent.path)
if (!subcache) {
subcache = new Map()
cache.set(file.parent.path, subcache)
}
let val = subcache.get(file)
if (!val) {
- val = await fn.call(this, file)
+ val = await fn.call(this, file, ...args)
subcache.set(file, val)
}
return val
It looks like JS has no equivalent to tuples (Map keys compare by identity, not by value), so we would need to key on a string.
- const cache = new Map<string, Map<F, T>>()
+ const cache = new Map<string, Map<string, T>>()
const cached = async function (this: any, file: F, ...args: any[]): Promise<T> {
let subcache = cache.get(file.parent.path)
if (!subcache) {
subcache = new Map()
cache.set(file.parent.path, subcache)
}
- let val = subcache.get(file)
+ const key = `${file.path}:${args.join(',')}`
+ let val = subcache.get(key)
if (!val) {
val = await fn.call(this, file, ...args)
- subcache.set(file, val)
+ subcache.set(key, val)
}
Not sure it's worth it.
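A minimal sketch of the string-key approach suggested above, with a simplified signature (plain `path` strings instead of the validator's `BIDSFile` objects; `memoize` is a hypothetical name):

```typescript
// Memoize an async function on a composite string key. Because JS Maps
// compare object keys by identity and there is no tuple type to key on,
// extra arguments are serialized into the key string.
function memoize<T>(
  fn: (path: string, ...args: unknown[]) => Promise<T>,
): (path: string, ...args: unknown[]) => Promise<T> {
  const cache = new Map<string, T>()
  return async (path, ...args) => {
    const key = `${path}:${args.join(',')}`
    let val = cache.get(key)
    if (val === undefined) {
      val = await fn(path, ...args)
      cache.set(key, val)
    }
    return val
  }
}
```

With this approach, calls with the same file but different arguments (such as `maxRows`) hit distinct cache entries, at the cost of flattening the per-directory subcache structure shown in the diff above.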
Force-pushed from eb6e002 to fe9d3e4
This PR is an optimization. In #138 we found a case with >300k lines in a TSV file. In order to limit the number of lines being inspected, I needed to switch TSV loading to be stream-based instead of slurping the entire file.
This PR does the following:

- Processes TSV files as streams, splitting lines on `\r?\n`.
- Adds a `--max-rows` flag to the CLI and a `maxRows` variable to validator options.

Note that this adds a new error condition, where we tolerate empty lines only at the end of files (`<content><LF><EOF>`). In passing, this permits us to report the line number of bad TSV lines.

I also do not attempt to add `maxRows` to the TSV cache key, so calling `loadTSV()` successively on the same file with different `maxRows` values will return the result from the first call. This does not seem like a problem in terms of running the validator, but might be surprising to future developers. I can look into that, if desired.

Closes #138.
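The trailing-empty-line rule described above could be sketched as follows; `findEmptyLineErrors` is an illustrative helper, not the validator's actual implementation:

```typescript
// Report 1-based line numbers of empty lines, tolerating an empty line
// only as the final entry produced by a trailing <content><LF><EOF>.
function findEmptyLineErrors(text: string): number[] {
  const lines = text.split('\n')
  const bad: number[] = []
  lines.forEach((line, idx) => {
    const isLast = idx === lines.length - 1
    if (/^\s*$/.test(line) && !(isLast && line === '')) {
      bad.push(idx + 1) // 1-based line number for error reporting
    }
  })
  return bad
}
```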