fipu-lab/ner-federated

Federated Learning of BERT models in natural language processing

Federated Learning using pretrained BERT models for a Named Entity Recognition (NER) task.

Experiments were run on the Few-NERD and CoNLL-2003 datasets.

Run code

To run the code, first download a pretrained BERT model (config and checkpoint) from google-research. Since the original models were trained with TensorFlow 1.x, the checkpoints need to be converted to TensorFlow 2; use this script to convert BERT checkpoints from TF1 to TF2. Unzip the desired BERT model archive into the BERT-NER-TF2/models folder and reference the model by its name in the notebooks:

  • FL.ipynb - training of pretrained or randomly initialized models
  • FL_frozen-bert.ipynb - fine-tuning of pretrained models by freezing the BERT model layers (see the sketch after this list)
  • Central - training of a single centralised model
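
As a rough illustration of the frozen-BERT setup (not the repository's exact code), here is a sketch that assumes a tf.keras model whose pretrained encoder sub-layer has "bert" in its name:

```python
import tensorflow as tf

def freeze_bert_layers(model: tf.keras.Model) -> tf.keras.Model:
    """Freeze the pretrained encoder so only the task head is fine-tuned."""
    for layer in model.layers:
        # Hypothetical naming convention: the BERT encoder layer contains "bert".
        if "bert" in layer.name.lower():
            layer.trainable = False
    return model  # recompile the model after changing trainable flags
```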

Datasets are already included in this repository and are located in the BERT-NER-TF2/dataset directory.
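
Federated training aggregates locally trained client models into one global model. Below is a minimal sketch of the standard FedAvg weight-averaging step (the notebooks' exact aggregation may differ), assuming each client exposes its weights as a list of NumPy arrays, as tf.keras.Model.get_weights() does:

```python
import numpy as np

def federated_average(client_weights):
    """Element-wise average of per-client weight lists (equal client weighting)."""
    return [
        np.mean(np.stack(layer_weights), axis=0)
        for layer_weights in zip(*client_weights)
    ]

# After each communication round, push the averaged weights back to every client:
# global_model.set_weights(federated_average([m.get_weights() for m in client_models]))
```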

About

  • This source code is the result of a Fipulab project at:
    • Juraj Dobrila University of Pula
    • Faculty of Informatics in Pula
  • Contributors: Mateo Borina, Robert Šajina, Nikola Tanković
  • Topic / title: Federated Learning of BERT models in natural language processing
  • Project status: working

Features

  • Federated learning
  • BERT models (transformer-based neural networks)
  • Natural language processing
  • Named entity recognition tasks

Code

  • Python
  • Jupyter Notebook
  • Results are saved as JSON
  • Diagrams in PNG format
  • Datasets: CoNLL-2003, Few-NERD (see the reader sketch below)
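
Both corpora use a one-token-per-line layout with the NER tag in the last column and blank lines between sentences (CoNLL-2003 additionally carries POS and chunk columns). A minimal reader sketch, not necessarily the loader used in the notebooks:

```python
def read_conll(path):
    """Parse CoNLL-style data into (tokens, tags) pairs, one per sentence."""
    sentences, tokens, tags = [], [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            # Blank lines end a sentence; -DOCSTART- lines are document markers.
            if not line or line.startswith("-DOCSTART-"):
                if tokens:
                    sentences.append((tokens, tags))
                    tokens, tags = [], []
                continue
            parts = line.split()
            tokens.append(parts[0])   # surface token is the first column
            tags.append(parts[-1])    # NER tag is the last column
    if tokens:
        sentences.append((tokens, tags))
    return sentences
```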
