Other Resources
- hanxiao/bert-as-service - Mapping a variable-length sentence to a fixed-length vector using a pre-trained BERT model (see the client sketch after this list).
- brightmart/bert_language_understanding - Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN.
- algteam/bert-examples - BERT examples.
- JayYip/bert-multiple-gpu - A multi-GPU-enabled version of BERT.
- HighCWu/keras-bert-tpu - Implementation of BERT that could load official pre-trained models for feature extraction and prediction on TPU.
- whqwill/seq2seq-keyphrase-bert - Adds BERT to the encoder of https://github.com/memray/seq2seq-keyphrase-pytorch
- xu-song/bert_as_language_model - BERT as a language model, forked from Google's official BERT implementation.
- Y1ran/NLP-BERT--Chinese version
- yuanxiaosc/Deep_dynamic_word_representation - TensorFlow code and pre-trained models for deep dynamic word representation (DDWR). It combines the BERT model and ELMo's deep context word representation.
- yangbisheng2009/cn-bert
- Willyoung2017/Bert_Attempt
- Pydataman/bert_examples - Some examples of BERT: `run_classifier.py` is based on Google BERT for the Kaggle Quora Insincere Questions Classification challenge; `run_ner.py` is based on the first season of the Ruijin Hospital AI contest, an NER task built with BERT.
- guotong1988/BERT-chinese - Pre-training of deep bidirectional transformers for Chinese language understanding.
- zhongyunuestc/bert_multitask - Multi-task.
- Microsoft/AzureML-BERT - End-to-end walkthrough for fine-tuning BERT using Azure Machine Learning.
- bigboNed3/bert_serving - Export BERT model for serving.
- yoheikikuta/bert-japanese - BERT with SentencePiece for Japanese text.
- nickwalton/AIDungeon - AI Dungeon 2 is a completely AI-generated text adventure built with OpenAI's largest 1.5B-parameter GPT-2 model. It's a first-of-its-kind game that lets you enter, and will react to, any action you can imagine.
- turtlesoupy/this-word-does-not-exist - "This Word Does Not Exist" is a project that allows people to train a variant of GPT-2 that makes up words, definitions and examples from scratch. We've never seen fake text so real.
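
A minimal client-side sketch for hanxiao/bert-as-service, assuming the serving side has already been started with `bert-serving-start` pointed at a downloaded BERT checkpoint (the checkpoint path and sentences below are placeholders):

```python
# Assumes: pip install bert-serving-server bert-serving-client, plus a running server, e.g.
#   bert-serving-start -model_dir /path/to/uncased_L-12_H-768_A-12 -num_worker=1
from bert_serving.client import BertClient

bc = BertClient()  # connects to a local bert-as-service server by default
vecs = bc.encode(['First do it', 'then do it right', 'then do it better'])
print(vecs.shape)  # (3, 768) with a BERT-Base checkpoint: one fixed-length vector per sentence
```
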
Tools
- jessevig/bertviz - Tool for visualizing attention in the Transformer model (see the sketch after this list).
- FastBert - A simple deep learning library that lets developers and data scientists train and deploy BERT-based models for NLP tasks, starting with text classification. FastBert is inspired by fast.ai.
- gpt2tc - A small program using the GPT-2 LM to complete and compress texts. It has no external dependencies, requires no GPU, and is quite fast. The smallest model (117M parameters) is provided; larger models can be downloaded as well. No waitlist or sign-up required.
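
As a rough illustration of the kind of workflow jessevig/bertviz supports, the sketch below loads a BERT model with the Hugging Face transformers library (an assumption, not part of the entry above) and passes its attention weights to bertviz inside a Jupyter notebook:

```python
# Sketch only: exact arguments may differ across bertviz versions.
from transformers import BertModel, BertTokenizer
from bertviz import head_view

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The cat sat on the mat", return_tensors="pt")
outputs = model(**inputs)
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

head_view(outputs.attentions, tokens)  # renders the interactive attention-head view in the notebook
```
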
Tasks
Named-Entity Recognition (NER)
- kyzhouhzau/BERT-NER - Use Google BERT to do CoNLL-2003 NER (an inference sketch follows this list).
- zhpmatrix/bert-sequence-tagging - Chinese sequence labeling.
- JamesGu14/BERT-NER-CLI - BERT NER command-line tester with a step-by-step setup guide.
- sberbank-ai/ner-bert
- mhcao916/NER_Based_on_BERT - Chinese NER based on the Google BERT model.
- macanv/BERT-BiLSTM-CRF-NER - TensorFlow solution for the NER task using a Bi-LSTM-CRF model with Google BERT fine-tuning.
- ProHiryu/bert-chinese-ner - Use the pre-trained language model BERT to do Chinese NER.
- FuYanzhe2/Name-Entity-Recognition - LSTM-CRF, Lattice-CRF, and recent NER-related papers.
- king-menin/ner-bert - NER task solution (BERT-Bi-LSTM-CRF) with Google BERT (https://github.com/google-research).
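
The repositories above share one idea: a BERT encoder with a token-classification head (often a Bi-LSTM-CRF) fine-tuned on CoNLL-2003-style tags. Below is a hedged, inference-only sketch of that idea using the Hugging Face transformers pipeline rather than any of the repos' own code; the model name is an assumption, and any BERT checkpoint fine-tuned for CoNLL-2003 NER behaves similarly:

```python
from transformers import pipeline

# "dslim/bert-base-NER" is just one publicly available BERT-for-CoNLL-2003 checkpoint.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))
# -> a list of entities with labels such as ORG / LOC, character spans, and confidence scores
```
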
Classification
- brightmart/sentiment_analysis_fine_grain - Multi-label classification with BERT; fine-grained sentiment analysis from AI Challenger.
- zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification - Kaggle baseline: fine-tuning BERT and a tensor2tensor-based Transformer encoder solution.
- maksna/bert-fine-tuning-for-chinese-multiclass-classification - Fine-tune Google's pre-trained BERT model for Chinese multiclass classification (the shared fine-tuning pattern is sketched after this list).
- NLPScott/bert-Chinese-classification-task - BERT Chinese classification practice.
- fooSynaptic/BERT_classifer_trial - BERT trial for Chinese corpus classification.
- xiaopingzhong/bert-finetune-for-classfier - Fine-tuning the BERT model while building your own dataset for classification.
- Socialbird-AILab/BERT-Classification-Tutorial - Tutorial.
- malteos/pytorch-bert-document-classification - Enriching BERT with Knowledge Graph Embeddings for Document Classification (PyTorch).
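
Most of the classification repos above follow the same fine-tuning recipe: encode the text with BERT, add a classification head, and train end to end on labeled examples. A minimal sketch with the Hugging Face transformers library (the checkpoint name is real; the sentences and labels are made up for illustration):

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForSequenceClassification.from_pretrained("bert-base-chinese", num_labels=3)

# Toy batch: three Chinese reviews with hypothetical labels 0=negative, 1=neutral, 2=positive.
batch = tokenizer(["这部电影很好看", "质量一般", "非常失望"], padding=True, return_tensors="pt")
labels = torch.tensor([2, 1, 0])

outputs = model(**batch, labels=labels)
outputs.loss.backward()  # an ordinary optimizer step would follow in a full fine-tuning loop
```
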
Text Generation
- asyml/texar - Toolkit for Text Generation and Beyond. Texar is a general-purpose text generation toolkit that also implements BERT for classification and, in combination with Texar's other modules, for text generation applications (see the generation sketch after this list).
- Plug and Play Language Models: A Simple Approach to Controlled Text Generation (PPLM), a paper by Uber AI.
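
For orientation, the sketch below shows plain GPT-2 sampling with the Hugging Face transformers library; it is not PPLM itself, which additionally steers this sampling loop with attribute models:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode("The dungeon door creaks open and", return_tensors="pt")
output = model.generate(input_ids, max_length=60, do_sample=True, top_k=50, top_p=0.95)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
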
Question Answering (QA)
- matthew-z/R-net - R-net in PyTorch, with BERT and ELMo.
- vliu15/BERT - TensorFlow implementation of BERT for QA.
- benywon/ChineseBert - A Chinese BERT model built specifically for question answering (see the QA sketch after this list).
- xzp27/BERT-for-Chinese-Question-Answering
- facebookresearch/SpanBERT - Question Answering on SQuAD; improving pre-training by representing and predicting spans.
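
A hedged sketch of SQuAD-style extractive QA with the Hugging Face transformers pipeline, not the code of the repositories above; the checkpoint name is one publicly available BERT model fine-tuned on SQuAD, and others behave similarly:

```python
from transformers import pipeline

qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")
result = qa(question="Where was BERT developed?",
            context="BERT was developed by researchers at Google AI Language in 2018.")
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': 'Google AI Language'}
```
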
Knowledge Graph
- sakuranew/BERT-AttributeExtraction - Using BERT for attribute extraction in a knowledge graph, via both fine-tuning and feature extraction; both methods are used to extract knowledge attributes of Baidu Encyclopedia characters (a fine-tuning sketch follows this list).
- lvjianxin/Knowledge-extraction - Chinese knowledge extraction. Baseline: Bi-LSTM+CRF; upgrade: BERT pre-training.
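
One common way to cast attribute extraction as a BERT fine-tuning problem, as the entries above describe, is to mark the subject entity and the candidate attribute value in the sentence and train a sentence classifier over the attribute set. The sketch below is illustrative only; the marker scheme, label set, and example sentence are assumptions, not the repos' exact formats:

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

ATTRIBUTES = ["birthplace", "occupation", "no_relation"]  # hypothetical label set

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=len(ATTRIBUTES))

# Mark the subject entity and candidate attribute value directly in the text.
tokenizer.add_tokens(["[E1]", "[/E1]", "[E2]", "[/E2]"])
model.resize_token_embeddings(len(tokenizer))

text = "[E1] 鲁迅 [/E1] 出生于 [E2] 绍兴 [/E2] 。"  # "Lu Xun was born in Shaoxing."
enc = tokenizer(text, return_tensors="pt")
label = torch.tensor([ATTRIBUTES.index("birthplace")])

loss = model(**enc, labels=label).loss  # fine-tune as an ordinary classifier from here
```
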