Multilingual E5 Large

Multilingual E5 Large is part of the E5 family of text embedding models. This large variant has 24 layers. The model is well-suited for tasks such as the following (a short retrieval sketch follows the list):

  • Semantic Search: Finding documents or text passages that are semantically relevant to a query.
  • Text Clustering: Grouping similar pieces of text based on their semantic meaning.
  • Instruction-Aware Retrieval: Encoding queries together with a task instruction so that retrieval can support summaries, explanations, or answers to factual queries.
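
To make the semantic-search use case concrete, here is a minimal sketch that ranks passages by cosine similarity between their embeddings and the embedding of a query. The `get_embedding` callable is hypothetical (wire it to whatever client you use to reach the model), and the `Instruct:`/`Query:` prefix follows the common E5-instruct convention rather than anything specified on this page.

```python
# Minimal semantic-search sketch. `get_embedding` is a hypothetical helper that
# returns one embedding vector (up to 1,024 floats) for a piece of text.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: str, passages: list[str], get_embedding) -> list[tuple[float, str]]:
    """Rank passages by semantic similarity to the query, best match first."""
    # E5-instruct models conventionally prepend a task instruction to the query;
    # the exact wording here is an assumption, not something stated on this page.
    task = "Given a web search query, retrieve relevant passages that answer the query"
    query_vec = np.asarray(get_embedding(f"Instruct: {task}\nQuery: {query}"))
    scored = []
    for passage in passages:
        passage_vec = np.asarray(get_embedding(passage))
        scored.append((cosine_similarity(query_vec, passage_vec), passage))
    return sorted(scored, reverse=True)
```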


Model ID: multilingual-e5-large-instruct-maas
Launch stage: GA
Supported inputs & outputs:
  • Inputs: Text
  • Outputs: Embeddings
Output dimensions: Up to 1,024
Number of layers: 24
Max sequence length: 512 tokens
Supported languages: See Supported languages.
Usage types:
Versions:
  • multilingual-e5-large
    • Launch stage: GA
    • Release date: September 19, 2025
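
The properties above (model ID, embedding output, 512-token limit) frame what a request to the model looks like. The sketch below is a hedged illustration only: the endpoint URL and the request and response payload shapes are assumptions to be replaced with the values in the Vertex AI documentation for this model; only the authentication calls from google-auth are standard.

```python
# Hedged sketch of calling the model over REST. The endpoint path and the
# request/response payload shapes are assumptions; replace them with the
# values from the documentation for multilingual-e5-large-instruct-maas.
import google.auth
import requests
from google.auth.transport.requests import Request

REGION = "us-central1"  # one of the supported regions listed below
# Placeholder URL: substitute the real endpoint path for this model.
ENDPOINT = f"https://{REGION}-aiplatform.googleapis.com/v1/..."

def embed(texts):
    """Request embeddings for a batch of texts (each within 512 tokens)."""
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    credentials.refresh(Request())
    payload = {"instances": [{"content": t} for t in texts]}  # assumed shape
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {credentials.token}"},
        json=payload,
        timeout=60,
    )
    response.raise_for_status()
    # The response format is not documented here; parse it per the model docs.
    return response.json()
```
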
Supported regions

Model availability

  • United States
    • us-central1
  • Europe
    • europe-west4

ML processing

  • United States
    • Multi-region
  • Europe
    • Multi-region

Quota

us-central1:

  • TPM: 3,000

europe-west4:

  • TPM: 3,000
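
With a 3,000 tokens-per-minute (TPM) quota in each region, a client usually throttles itself rather than relying on quota errors. The sketch below is one possible sliding-window token budget: call `wait_for(estimated_tokens)` before each request so the rolling 60-second window stays within the limit. The token counts you pass in are estimates, since this page does not describe the tokenizer; a conservative overestimate is safest.

```python
# Simple client-side throttle to stay under a tokens-per-minute (TPM) quota.
import time
from collections import deque

class TpmThrottle:
    def __init__(self, tpm_limit: int = 3000):
        self.tpm_limit = tpm_limit
        self.window = deque()  # (timestamp, tokens) pairs from the last 60 s

    def _used(self, now: float) -> int:
        # Drop entries older than 60 seconds, then sum what remains.
        while self.window and now - self.window[0][0] > 60:
            self.window.popleft()
        return sum(tokens for _, tokens in self.window)

    def wait_for(self, tokens: int) -> None:
        """Block until `tokens` can be spent without exceeding the limit."""
        while True:
            now = time.monotonic()
            if self._used(now) + tokens <= self.tpm_limit:
                self.window.append((now, tokens))
                return
            time.sleep(1.0)
```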


Last updated November 7, 2025 (UTC).