
CoEdIT: Text Editing by Task-Specific Instruction Tuning

This repository provides the datasets, models, and code for CoEdIT, our instruction-tuned text editing models, along with the official implementation of the following paper:

CoEdIT: Text Editing by Task-Specific Instruction Tuning
Vipul Raheja, Dhruv Kumar, Ryan Koo, and Dongyeop Kang

Our code is based on Hugging Face Transformers.

Installation

Coming soon.

Data

Coming soon.

Code

Coming soon.

Models

Model checkpoints

We have uploaded all our model checkpoints to Hugging Face.

Model                Params
CoEdIT-large         770M
CoEdIT-xl            3B
CoEdIT-xxl           11B
CoEdIT-xl-composite  3B

Example Usage:

You can directly load our models using Hugging Face Transformers.

from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-xl")
model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-xl")

# Prepend the task-specific instruction to the text to be edited.
input_text = "Fix grammatical errors in this sentence: New kinds of vehicles will be invented with new technology than today."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

# Generate the edited sentence and decode it back into a string.
outputs = model.generate(input_ids, max_length=256)
edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
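
The larger checkpoints (3B and 11B) may not fit on a single consumer GPU at full precision. As a minimal sketch (not part of the original README), assuming the accelerate package is installed, you can load the weights in half precision and let Transformers place them across available devices; the instruction phrasing below is illustrative rather than an officially documented prompt:

import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the 11B checkpoint in float16 and distribute the weights
# across available GPUs/CPU (requires `pip install accelerate`).
tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-xxl")
model = T5ForConditionalGeneration.from_pretrained(
    "grammarly/coedit-xxl",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Illustrative instruction; exact prompt wordings follow the paper's task set.
input_text = "Paraphrase this sentence: The weather was terrible, so we stayed indoors."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to(model.device)
outputs = model.generate(input_ids, max_length=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))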

Citation

Coming soon.
