Parser very slow #51
In my case, replacing math.MaxInt16 with 128 did the trick.
Was this done with the benchmark in peg_test.go? I didn't see a speedup using 128. I could make the initial size of 'tree' an option.
That's my own test. Is the 'tree' written in random order or sequentially? Would it be better to use Go's append?
@pointlander btw, the benchmark for Parse is slightly incorrect. reset() does not bring the var tree back to the state of the first run, so it's warmed up - that's why it looks like it has no problems with memory. So the benchcmp looks like:
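A minimal sketch of the effect being described, using a hypothetical `Parser` type (not the actual generated peg code): a reset that truncates the tree slice keeps its backing array, so every benchmark iteration after the first reuses the warmed-up capacity and allocates nothing.

```go
package main

import "fmt"

type token struct{ rule, begin, end int }

type Parser struct {
	tree []token
}

// Reset truncates the tree but keeps the allocated backing array,
// so later runs reuse the grown capacity instead of reallocating.
func (p *Parser) Reset() {
	p.tree = p.tree[:0]
}

// Parse simulates appending n tokens to the tree.
func (p *Parser) Parse(n int) {
	for i := 0; i < n; i++ {
		p.tree = append(p.tree, token{rule: i})
	}
}

func main() {
	p := &Parser{}
	p.Parse(1000)
	capAfterFirst := cap(p.tree)
	p.Reset()
	p.Parse(1000)
	// The second run reused the same backing array: no regrowth happened,
	// which is why a benchmark that only calls Reset() looks allocation-free.
	fmt.Println(cap(p.tree) == capAfterFirst) // true
}
```

This is why only the first iteration pays the allocation cost, skewing the benchmark.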
I wonder if it would be a better option to make the AST optional? It would still be fine if #53 is implemented.
AST could be useful for Execute: `a <- b { fmt.Println(matches[ruleb]) }` - matches would be of type `map[pegRule]string` and contain the string last matched by each rule.
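A rough sketch of how that proposed `matches` map might behave. The `pegRule` constants and rule names here are illustrative stand-ins, not the actual generated API:

```go
package main

import "fmt"

// pegRule enumerates grammar rules; the names below are hypothetical.
type pegRule int

const (
	rulea pegRule = iota
	ruleb
)

func main() {
	// matches records the string last matched by each rule, so an
	// action attached to rule 'a' could inspect what rule 'b' consumed.
	matches := map[pegRule]string{}
	matches[ruleb] = "hello" // pretend rule b just matched "hello"
	fmt.Println(matches[ruleb])
}
```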
Just FYI: Go 1.7.3 vs 1.8beta2
I have one large C file of about 33,000 lines which saw a big speedup with 128. That isn't particularly large, but for some reason this change sped up the parse on this file quite a lot. Sorry, I don't have better data at the moment.
The parser allocates an enormous amount of memory
And then uses its own vector-doubling scheme
Both of these cause the parser to be very slow because it generates a large amount of garbage. This should probably be optimized.
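A sketch contrasting the two allocation strategies under discussion. The names are illustrative, not taken from the generated parser: one function reserves math.MaxInt16 entries up front, the other lets the runtime grow the slice via append.

```go
package main

import (
	"fmt"
	"math"
)

type token struct{ rule, begin, end int32 }

// preallocated mimics reserving math.MaxInt16 entries up front:
// no growth ever happens, but small parses waste a large allocation
// that becomes garbage when the parser is discarded.
func preallocated() []token {
	return make([]token, 0, math.MaxInt16)
}

// grown relies on append's built-in growth policy instead of a
// hand-rolled doubling scheme: small parses stay cheap, at the cost
// of occasional copies as the slice expands.
func grown(n int) []token {
	var tree []token
	for i := 0; i < n; i++ {
		tree = append(tree, token{rule: int32(i)})
	}
	return tree
}

func main() {
	fmt.Println(cap(preallocated())) // 32767 entries reserved regardless of input size
	fmt.Println(len(grown(100)))     // 100
}
```

Reducing the initial reservation (e.g. to 128, as suggested above) or deferring growth to append shrinks the garbage produced per parse.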