This repository contains a personal reproduction of the NIPS 2013 paper
Translating Embeddings for Modeling Multi-relational Data.
- If you want to train TransE on your own dataset, see `TransE-mydataset.rar`.
- The code is relatively old and aimed at beginners: it is a good way to learn the core ideas and basic implementation of TransE, but do not spend too much time on every implementation detail.
- For further research, you may want to explore more advanced KGE (Knowledge Graph Embedding) methods.
- Training and testing code: `src/`
- Training and testing results: `res/`
- After ~1001 epochs, the loss stabilizes around 14,000 (mostly converged by ~300 epochs).
- Set the paths for your DATA and save folders.
- Run directly: `python transe_simplie.py`
- Paper: Translating Embeddings for Modeling Multi-relational Data
- Dataset: FB15k
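
As a concrete illustration of the path setup above, a hypothetical configuration is shown below; the actual variable names and expected file layout in `transe_simplie.py` may differ.

```python
import os

# Hypothetical paths; point these at your local FB15k copy and an output folder.
DATA_DIR = os.path.expanduser("~/data/FB15k")      # location of the FB15k triplet files
SAVE_DIR = os.path.expanduser("~/transe_results")  # where embeddings and results are written

os.makedirs(SAVE_DIR, exist_ok=True)
```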
Inputs:
- Training triplets
- Entity set E
- Relation set L
- Margin γ
- Embedding dimension k
Steps:
- Initialize relations and entities.
- Apply L2 norm normalization to relations.
- Entities are initialized without L2 normalization at this step.
- Training loop begins:
- Normalize entity vectors by L2 norm.
- Sample a positive batch (Sbatch) of correct triplets.
- Construct negative samples by corrupting head/tail entities.
- Form training batch (Tbatch) with both positive & negative triplets.
- Update embeddings using gradient descent.
- End training cycle.
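
The core of this loop is the margin-based ranking loss from the paper, [γ + d(h + r, t) − d(h' + r, t')]+, minimized by stochastic gradient descent. Below is a minimal NumPy sketch of the procedure; the hyperparameter values, the squared-L2 dissimilarity, and all function/variable names are illustrative assumptions, not the exact code in `src/` (triplets are assumed to be integer index tuples).

```python
import numpy as np

def train_transe(triplets, n_entities, n_relations,
                 k=50, margin=1.0, lr=0.01, epochs=1000, batch_size=400):
    rng = np.random.default_rng(0)
    bound = 6.0 / np.sqrt(k)                                  # init range from the paper
    ent = rng.uniform(-bound, bound, (n_entities, k))
    rel = rng.uniform(-bound, bound, (n_relations, k))
    rel /= np.linalg.norm(rel, axis=1, keepdims=True)         # L2-normalize relations once

    triplets = np.asarray(triplets)
    for epoch in range(epochs):
        ent /= np.linalg.norm(ent, axis=1, keepdims=True)     # L2-normalize entities each epoch
        loss = 0.0
        for h, r, t in triplets[rng.choice(len(triplets), batch_size, replace=False)]:
            # Corrupt the head or the tail to build a negative triplet
            # (a real implementation also checks it is not an existing triplet).
            h_c, t_c = h, t
            if rng.random() < 0.5:
                h_c = rng.integers(n_entities)
            else:
                t_c = rng.integers(n_entities)
            # Margin loss with squared-L2 dissimilarity d = ||h + r - t||^2.
            d_pos = ent[h] + rel[r] - ent[t]
            d_neg = ent[h_c] + rel[r] - ent[t_c]
            hinge = margin + np.sum(d_pos**2) - np.sum(d_neg**2)
            if hinge > 0:
                loss += hinge
                # SGD step: pull the positive triplet together, push the negative apart.
                ent[h]   -= 2 * lr * d_pos
                ent[t]   += 2 * lr * d_pos
                ent[h_c] += 2 * lr * d_neg
                ent[t_c] -= 2 * lr * d_neg
                rel[r]   -= 2 * lr * (d_pos - d_neg)
        if epoch % 10 == 0:
            print(f"epoch: {epoch} loss: {loss:.4f}")
    return ent, rel
```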
- isFit: choose between `raw` and `filter` evaluation modes.
- Note: `filter` mode is significantly slower.
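
To illustrate the difference between the two modes, here is a minimal sketch of tail ranking for link prediction; the function and argument names are assumptions, not the repo's actual API.

```python
import numpy as np

def rank_tail(h, r, t, ent, rel, known, use_filter=False):
    """Rank the true tail t among all entities for the query (h, r, ?)."""
    scores = np.linalg.norm(ent[h] + rel[r] - ent, axis=1)  # L2 distance to every candidate tail
    if use_filter:
        # "filter" mode: other correct triplets must not outrank the target, so their
        # scores are pushed to infinity (this extra pass is why filter mode is slower).
        for cand in range(len(ent)):
            if cand != t and (h, r, cand) in known:
                scores[cand] = np.inf
    return int(np.sum(scores < scores[t])) + 1  # 1 = best ("raw" rank if no filtering)
```

Mean rank is the average of such ranks over the test queries (heads and tails), and hits@10 is the fraction of queries with rank ≤ 10.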
Training Loss (sample epochs):
epoch: 900 loss: 14122.8202
epoch: 910 loss: 14373.6803
epoch: 920 loss: 14340.6623
epoch: 930 loss: 14373.6773
epoch: 940 loss: 14328.8339
epoch: 950 loss: 14310.5885
epoch: 960 loss: 14262.7636
epoch: 970 loss: 14311.8275
epoch: 980 loss: 14327.8245
epoch: 990 loss: 14146.5392
Evaluation Metrics:
- Entity hits@10: 0.3077
- Entity mean rank: 254.53
- Relation hits@10: 0.7907
- Relation mean rank: 81.80
Final Results:
- Hits@10: 0.4067
- Mean Rank: 246.32
This repo benefits from the following works:
Thanks to the authors for their contributions.