Masked language modeling (MaskedLM) is a pre-training technique used in Natural Language Processing (NLP) for deep-learning models such as Transformers. It is a variant of language modeling in which a portion of the input tokens is masked and the model is trained to predict the masked tokens from the context provided by the unmasked ones.
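As a rough illustration of this objective, the sketch below uses Hugging Face's DataCollatorForLanguageModeling to mask tokens at random. The 15% masking probability is the conventional BERT setting and is assumed here for illustration, not a documented training detail of HingMaskedLM; the checkpoint is only used for its tokenizer.

from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("SRDdev/HingMaskedLM")
# mlm_probability=0.15 is an assumed, standard value
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

encoding = tokenizer("please order ko cancel kardo")
batch = collator([encoding])

# Roughly 15% of tokens are selected for prediction (most are replaced by [MASK]);
# "labels" keeps the original ids at those positions and -100 everywhere else.
print(tokenizer.decode(batch["input_ids"][0]))
print(batch["labels"][0])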
Load model
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("SRDdev/HingMaskedLM")
model = AutoModelForMaskedLM.from_pretrained("SRDdev/HingMaskedLM")
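As a quick sanity check before building a pipeline, the loaded tokenizer and model can be queried directly. The sketch below is a minimal example of filling the mask by hand; the example sentence is only illustrative.

import torch

# Tokenize a sentence containing the mask token and run a forward pass.
text = f"please {tokenizer.mask_token} ko cancel kardo"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring token at the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))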
Build pipeline
from transformers import pipeline
fill = pipeline('fill-mask', model=model, tokenizer=tokenizer)
# Returns a ranked list of candidate fills for the masked position,
# each with a confidence score, the predicted token, and the completed sentence.
fill(f'please {fill.tokenizer.mask_token} ko cancel kardo')

Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate.
Citation
@misc{HingMaskedLM,
  author = {Shreyas Dixit},
  year   = {2023},
  url    = {https://huggingface.co/SRDdev/HingMaskedLM}
}