Issues: flashinfer-ai/flashinfer
#682 Deprecation Notice: Python 3.8 Wheel Support to End in future... (opened Dec 18, 2024 by yzh119; open, 2 comments)
#838 [Feature] Can BatchDecodeWithPagedKVCacheWrapper return attention scores to all tokens, not just logsumexp? (opened Feb 14, 2025 by yawnzh)
#809 PrefillPlan tries to allocate more memory than float_workspace_size_in_bytes passed in. (opened Feb 12, 2025 by rchardx)
#807 BatchPrefillWithPagedKVCacheSM90Run failed with error: operation not supported (opened Feb 12, 2025 by sfc-gh-yewang)
#806 [BUG] attention/prefill.cuh(138): error: expression must have a constant value (opened Feb 11, 2025 by haohaibo)
#805 Numerical stability issue in recent commits since 0.2.0 [bug, priority: high] (opened Feb 11, 2025 by rchardx)
#791 [Feature] Reuse JIT code path for building AOT wheel [enhancement] (opened Feb 6, 2025 by abcdabcd987)
#746 [Feature] Llama3.1 RoPE on the fly [enhancement] (opened Jan 21, 2025 by turboderp)
#741 AttributeError: module 'flashinfer._kernels' has no attribute 'apply_rope_pos_ids_cos_sin_cache' (opened Jan 17, 2025 by fergusfinn)
#736 Flashinfer==0.2.0 precision error when tested on vLLM unit tests (opened Jan 13, 2025 by Dr-Left)
#734 C++ benchmarks CMake error caused by enable_fp16 option in generate.py (opened Jan 13, 2025 by rtxxxpro)
#733 [RFC]: Introducing ReproSpec for Strong Reproducibility in LLM Inference [RFC] (opened Jan 11, 2025 by yzh119)