[Question] : Performance of combine. #534
Oh wow, that's an interesting use case. There are two issues I see immediately with the implementation of `combine` (with respect to large result lists):
The latter issue is almost certainly the problem. That being said, as I've mentioned here, I don't have time to contribute to the codebase, but I would be happy to accept a fix for this, and I'll devote time to reviewing the PR.
On another note: given such a large list, would you not want some sort of function that partitions your results into, e.g., `const [okList, errList] = Result.partition(veryLargeList)`?
Hey, thanks for your detailed answer!
Yeah, I saw the message. I'm glad that you are transparent about the time you can dedicate to the project. Also, good luck in your new startup adventure; I know from experience that it's time consuming 😉
I will definitely have a look, but
Yeah, that could be nice, and it would be useful in my case. That being said (this is a personal opinion), in general I'm not in favor of expanding the interface: it creates confusion and makes the library harder to learn over time. I need to check, but it might be doable to do some kind of … Overall, nice pinpointing of the issue 👍
Another issue with the current implementation, though not directly connected to this one: even when `isErr()` returns true, the reduce keeps iterating over the entire input list. So it's not really short-circuiting, but returning `err(result.error)` n times before it finally finishes all elements.

I played around with the implementation, and this version is orders of magnitude faster on my machine. Instead of creating new arrays in every iteration, it mutates one array in place:

```typescript
export const combineResultListFast = <T, E>(
  resultList: readonly Result<T, E>[],
): Result<readonly T[], E> => {
  let acc = ok([]) as Result<T[], E>
  for (const result of resultList) {
    if (result.isErr()) {
      acc = err(result.error)
      break // short circuiting here
    } else {
      acc.map((list) => list.push(result.value))
    }
  }
  return acc
}
```

I benchmarked it with a couple of different input sizes, and this is the result of the microbenchmark on my machine:
You can check out the code as well as the benchmark code here:
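Independent of the library, the cost difference described above can be reproduced with a tiny sketch: accumulating into a fresh copy on every iteration (what a spread-based reduce does) versus pushing into one mutable array. The timings are machine-dependent; this is illustrative only:

```typescript
// Illustrative micro-benchmark (not the one linked above): why
// per-iteration array copies make a reduce-based combine quadratic.
const N = 20_000;
const input = Array.from({ length: N }, (_, i) => i);

// O(n^2): each step copies the whole accumulator, like `[...acc, x]`.
let t0 = Date.now();
let copied: number[] = [];
for (const x of input) copied = [...copied, x];
const copyMs = Date.now() - t0;

// O(n): mutate one array in place, like the loop in the comment above.
t0 = Date.now();
const pushed: number[] = [];
for (const x of input) pushed.push(x);
const pushMs = Date.now() - t0;

console.log({ copyMs, pushMs }); // copyMs is typically far larger than pushMs
```

Both loops produce the same array; only the allocation pattern differs, which is exactly the gap the benchmark above measures.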
Additionally, it might also make sense to refactor the `combineWithAllErrors` implementation along the same lines:

```typescript
export const combineResultListWithAllErrors = <T, E>(
  resultList: readonly Result<T, E>[],
): Result<readonly T[], E[]> => {
  let acc = ok([]) as Result<T[], E[]>
  for (const result of resultList) {
    if (result.isErr() && acc.isErr()) {
      acc.error.push(result.error)
    } else if (result.isErr() && acc.isOk()) {
      acc = err([result.error])
    } else if (result.isOk() && acc.isErr()) {
      // do nothing
    } else if (result.isOk() && acc.isOk()) {
      acc.value.push(result.value)
    }
  }
  return acc
}
```
Hey, first let me thank you for your library, which is overall wonderful.

Issue:

I ran into trouble with the `Result.combine` method. I'm basically trying to combine a relatively "huge" array (around 300_000 elements) of `Result<T, E>`. Sadly, doing so takes around 1 minute on my computer.

Question:

Is the `combine` implementation ready to handle arrays of that size?

P.S.: For now I will just not use `neverthrow` on arrays that are too big.