We provide an implementation of GCD that is compatible with the popular Transformers library!
This new package, Transformers-CFG, extends the capabilities of our Grammar-Constrained Decoding (GCD) approach by integrating seamlessly with the Transformers library. It offers:
- Easy Integration: Quickly combine the power of GCD with any model listed in the Transformers library with just a few lines of code!
- Enhanced Performance: Leverage the GCD technique for more efficient and accurate generation.
- Friendly Interface: Implemented with the EBNF grammar interface, making it accessible for both beginners and experts.
Get started with Transformers-CFG here.
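To illustrate the core idea behind GCD, here is a minimal, self-contained sketch: at each decoding step, the grammar determines which tokens are currently legal, and the model's logits are masked so that only those tokens can be selected. This is a toy illustration of the technique only, not the Transformers-CFG API; the DFA-style grammar, the fake model, and all names below are invented for the example.

```python
import math

# A tiny regular "grammar" over a 4-token vocabulary, encoded as a DFA:
# state -> {allowed token: next state}. It accepts strings of the form
# "a b+ EOS". (Real GCD supports full context-free grammars via EBNF.)
VOCAB = ["a", "b", "EOS", "c"]
DFA = {
    0: {"a": 1},
    1: {"b": 2},
    2: {"b": 2, "EOS": 3},
    3: {},  # accepting state: nothing more may be generated
}

def fake_logits(step):
    """Stand-in for a language model; it always prefers the token 'c'."""
    return [0.1, 0.2, 0.3, 1.0]

def constrained_greedy_decode(max_steps=10):
    state, output = 0, []
    for step in range(max_steps):
        allowed = DFA[state]
        if not allowed:  # the grammar says the string is complete
            break
        logits = fake_logits(step)
        # Mask every token the grammar forbids in the current state,
        # then decode greedily over the remaining tokens.
        masked = [
            logit if tok in allowed else -math.inf
            for tok, logit in zip(VOCAB, logits)
        ]
        best = max(range(len(VOCAB)), key=lambda i: masked[i])
        tok = VOCAB[best]
        output.append(tok)
        state = allowed[tok]
    return output

print(constrained_greedy_decode())  # -> ['a', 'b', 'EOS']
```

Note that the unconstrained model would greedily emit `c` forever; the grammar mask forces every generated string to be well-formed, which is exactly why GCD needs no finetuning to produce structured output.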
With the repository cloned, we recommend creating a new conda virtual environment:

```shell
conda create -n GCD python=3.9
conda activate GCD
```

Install the required packages:

```shell
pip install -r requirements.txt
```
- Download datasets, grammars and models
- Build task-specific grammars
- Windows-specific setting
- Running the experiments
This repository contains the code for the models and experiments in *Grammar-Constrained Decoding for Structured NLP Tasks without Finetuning*.
```bibtex
@inproceedings{geng-etal-2023-grammar,
    title = {Grammar-Constrained Decoding for Structured {NLP} Tasks without Finetuning},
    author = {Geng, Saibo and Josifoski, Martin and Peyrard, Maxime and West, Robert},
    year = 2023,
    month = dec,
    booktitle = {Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
    publisher = {Association for Computational Linguistics},
    address = {Singapore},
    url = {https://aclanthology.org/2023.emnlp-main.674},
    editor = {Bouamor, Houda and Pino, Juan and Bali, Kalika}
}
```
If you found the provided resources useful, please consider citing our work.
This project is licensed under the terms of the MIT license.