
BERT for RACE

By: Chenglei Si (River Valley High School)

Implementation

This work is based on the PyTorch implementation of BERT (https://github.com/huggingface/pytorch-pretrained-BERT). I adapted the original BERT model to work on multiple-choice machine comprehension: each answer option is paired with the passage and question, every pair is encoded independently, and a linear layer turns the pooled [CLS] vector into a per-option score.
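A minimal sketch of that idea, written against the pytorch-pretrained-bert API (my own illustration, not this repo's exact code; `BertForRace` and its details are hypothetical):

```python
import torch.nn as nn
from pytorch_pretrained_bert import BertModel

class BertForRace(nn.Module):
    """Score each answer option with a shared BERT encoder + linear head."""

    def __init__(self, bert_name='bert-base-uncased'):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.scorer = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, token_type_ids, attention_mask):
        # Inputs have shape (batch, num_options, seq_len); flatten the
        # option dimension so BERT sees one sequence per option.
        batch_size, num_options, seq_len = input_ids.size()
        _, pooled = self.bert(
            input_ids.view(-1, seq_len),
            token_type_ids.view(-1, seq_len),
            attention_mask.view(-1, seq_len),
            output_all_encoded_layers=False)
        scores = self.scorer(pooled)  # (batch * num_options, 1)
        # One logit per option; train with CrossEntropyLoss against the
        # index (0-3) of the correct option.
        return scores.view(batch_size, num_options)
```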

Environment

The code is tested with Python 3.6 and PyTorch 1.0.0.

Usage

  1. Download the RACE dataset and unzip it. The default dataset directory is ./RACE (see the loading sketch after this list for the expected layout).
  2. Run ./run.sh
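The unzipped dataset has one JSON file per passage under RACE/{train,dev,test}/{middle,high}/. A minimal loading sketch (`read_race_split` is a hypothetical helper of mine, not this repo's code):

```python
import json
from pathlib import Path

def read_race_split(race_dir='./RACE', split='train'):
    """Collect one dict per question from the middle- and high-school sets."""
    examples = []
    for level in ('middle', 'high'):
        for path in sorted(Path(race_dir, split, level).glob('*.txt')):
            data = json.loads(path.read_text())
            for question, options, answer in zip(
                    data['questions'], data['options'], data['answers']):
                examples.append({
                    'article': data['article'],
                    'question': question,
                    'options': options,               # four answer strings
                    'label': ord(answer) - ord('A'),  # 'A'..'D' -> 0..3
                })
    return examples
```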

Hyperparameters

I did some tuning and found the following hyperparameters to work reasonably well:

BERT_base: batch size: 32, learning rate: 5e-5, training epochs: 3

BERT_large: batch size: 8, learning rate: 1e-5 (DO NOT SET IT TOO LARGE), training epochs: 2
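For reference, the BERT_base settings would plug into the library's BertAdam optimizer roughly as follows (a sketch, not this repo's training loop; `build_optimizer` is hypothetical, and the 0.1 warmup proportion is the library example's default rather than something stated here):

```python
from pytorch_pretrained_bert.optimization import BertAdam

def build_optimizer(model, num_train_examples,
                    batch_size=32, lr=5e-5, epochs=3, warmup=0.1):
    # The total number of optimization steps drives BertAdam's
    # linear warmup/decay learning-rate schedule.
    t_total = (num_train_examples // batch_size) * epochs
    return BertAdam(model.parameters(), lr=lr,
                    warmup=warmup, t_total=t_total)
```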

Results

Model        RACE   RACE-M   RACE-H
BERT_base    65.0   71.7     62.3
BERT_large   67.9   75.6     64.7

All numbers are accuracy (%).

You can compare these numbers with other entries on the RACE leaderboard.

BERT_large achieves the best result on the leaderboard as of January 2019. Looking forward to new models that can beat BERT!

More Details

I have written a short report, included in this repo, describing the details.
