Use sparsified neural models for relevance task #222

Open
3 tasks
Tracked by #215
Shreyanand opened this issue Oct 27, 2022 · 0 comments
Labels
sparsification Indicates that the issue exists to achieve model sparsification.

Comments

@Shreyanand
Member

Shreyanand commented Oct 27, 2022

The existing transformer model uses DistilBERT, but we want to see whether there are other models in the SparseZoo that we can train for better performance. Use the existing notebooks (transformer_relevance, transformer_inference) as a reference and create new benchmarks for this task.

Acceptance

  • Replicate the training and inference snippets used for the DistilBERT model with the sparse model (a rough inference sketch follows this list)
  • Get the benchmarks for the sparse model (note: add a model size comparison to the benchmarks)
  • Explore other models in the SparseZoo and select a few to add to the benchmarks
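For the inference side, a minimal sketch along the lines below could mirror what the transformer_inference notebook does, using the DeepSparse `Pipeline` API to run a SparseZoo model and report average latency plus on-disk model size. The zoo stub, the sample texts, and the `onnx_file_path` attribute used for the size check are assumptions for illustration, not details taken from the notebooks.

```python
import os
import time

from deepsparse import Pipeline

# Placeholder SparseZoo stub -- substitute the stub of whichever sparse model
# is selected from the SparseZoo for the relevance task.
MODEL_STUB = (
    "zoo:nlp/text_classification/distilbert-none/pytorch/"
    "huggingface/mnli/pruned80_quant-none-vnni"
)

# Text-classification pipeline backed by the DeepSparse engine.
pipeline = Pipeline.create(task="text-classification", model_path=MODEL_STUB)

# Hypothetical sample inputs standing in for the relevance-task data.
samples = ["example query and passage text"] * 32

# Simple latency measurement: average time per sample over repeated runs.
start = time.perf_counter()
for text in samples:
    pipeline(sequences=[text])
elapsed = time.perf_counter() - start
print(f"avg latency: {elapsed / len(samples) * 1000:.2f} ms/sample")

# Model size comparison (assumption: the transformers pipeline exposes the
# path of the downloaded ONNX file as `onnx_file_path`).
print(f"model size: {os.path.getsize(pipeline.onnx_file_path) / 1e6:.1f} MB")
```

The same latency and size numbers from the existing DistilBERT notebook can then be placed alongside these to fill out the benchmark comparison; DeepSparse also ships a `deepsparse.benchmark` CLI that can report throughput for a zoo stub directly.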
@Shreyanand Shreyanand added the sparsification Indicates that the issue exists to achieve model sparsification. label Oct 27, 2022