Enable FlexAttention for llama model #2302
Triggered via pull request on March 14, 2025 at 22:49
Status: Success
Total duration: 42s
lint.yaml

Trigger: pull_request
Matrix: lint