There are two multilingual models currently available. We do not plan to release more single-language models, but we may release `BERT-Large` versions of these two in the future:

*   **`BERT-Base, Multilingual Cased` (New, recommended)**: 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
*   **`BERT-Base, Multilingual Uncased` (Orig, not recommended)**: 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
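
Both checkpoints use the standard `BERT-Base` architecture, so they are drop-in replacements for the single-language models in the fine-tuning scripts. As a quick sanity check, the sketch below loads the recommended Cased model through the Hugging Face `transformers` mirror of these checkpoints (an assumption on our part; the original release ships raw TensorFlow checkpoints) and confirms the layer, hidden-size, and head counts listed above:

```python
# A minimal sketch, assuming the Hugging Face `transformers` mirror
# ("bert-base-multilingual-cased") of the Cased (New) checkpoint.
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")

# Confirm the BERT-Base configuration: 12-layer, 768-hidden, 12-heads.
cfg = model.config
print(cfg.num_hidden_layers, cfg.hidden_size, cfg.num_attention_heads)
# 12 768 12

# One shared WordPiece vocabulary covers all 104 languages, so text in
# any supported language tokenizes against the same subword inventory.
print(tokenizer.tokenize("machine learning"))
print(tokenizer.tokenize("apprentissage automatique"))
```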