Mobrief

zembed-1: new open-weight SOTA multilingual embedding model

Hey everyone, I'm one of the co-founders of ZeroEntropy.

Reddit LocalLLaMA · ~2 min + comments
Community

Community-submitted content. Signal comes from upvotes, not editorial vetting. Always check the linked source.

  • Major industry investment.
  • A ZeroEntropy co-founder announces zembed-1, an open-weight multilingual text embedding model that the team reports as a new state of the art across major benchmarks.

Context

Hey everyone, I'm one of the co-founders of ZeroEntropy. We just released zembed-1, a multilingual text embedding model that sets a new state of the art across major benchmarks. zembed-1 is a general-purpose text embedding model built for retrieval, semantic search, and RAG pipelines. Weights are available on Hugging Face. In our evaluations, zembed-1 outperforms OpenAI text-embedding-3-large, Qwen embedding 4B, Google Gemini embeddings, and …
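In the retrieval and RAG pipelines mentioned above, an embedding model like zembed-1 maps the query and each document to a vector, and documents are then ranked by cosine similarity to the query. A minimal sketch of that ranking step, with placeholder vectors standing in for real model outputs (the actual model call is not shown and the numbers here are illustrative only):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of the vectors divided by
    # the product of their magnitudes.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_documents(query_vec: np.ndarray, doc_vecs: list) -> list:
    # Return document indices ordered from most to least similar
    # to the query vector.
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Placeholder 3-d vectors; a real embedding model would produce
# much higher-dimensional output from text input.
query = np.array([1.0, 0.0, 0.5])
docs = [
    np.array([0.9, 0.1, 0.4]),   # points in roughly the same direction as the query
    np.array([-1.0, 0.2, 0.0]),  # points away from the query
]
print(rank_documents(query, docs))  # → [0, 1]
```

The ranking step is model-agnostic: swapping one embedding model for another only changes how the vectors are produced, not how they are compared.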

For builders

We just released zembed-1, a multilingual text embedding model that sets a new state of the art across major benchmarks.

Model