zembed-1: new open-weight SOTA multilingual embedding model
Hey everyone, I'm one of the co-founders of ZeroEntropy.
Key Takeaways
- ZeroEntropy has released zembed-1, an open-weight multilingual text embedding model.
- zembed-1 sets a new state of the art across major embedding benchmarks; weights are available on Hugging Face.
What It Means
Context
Hey everyone, I'm one of the co-founders of ZeroEntropy. We just released zembed-1, a multilingual text embedding model that sets a new state of the art across major benchmarks. zembed-1 is a general-purpose text embedding model built for retrieval, semantic search, and RAG pipelines. Weights are available on Hugging Face. In our evaluations, zembed-1 outperforms OpenAI text-embedding-3-large, Qwen embedding 4B, and Google Gemini embeddings, among others.
For builders
zembed-1 is built for retrieval, semantic search, and RAG pipelines, and its open weights on Hugging Face mean it can be self-hosted in existing embedding-based stacks.
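Since the model targets retrieval and RAG, here is a minimal sketch of the cosine-similarity retrieval step such pipelines run over embeddings. The vectors below are toy placeholders; in practice they would be produced by zembed-1 via its Hugging Face weights (the exact model ID and loading API aren't specified in this post).

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=2):
    """Return indices of the k documents most similar to the query (cosine)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q  # cosine similarity of each doc against the query
    return np.argsort(scores)[::-1][:k]

# Toy 3-dim "embeddings" standing in for real zembed-1 outputs.
docs = np.array([
    [0.9, 0.1, 0.0],   # about topic A
    [0.0, 1.0, 0.1],   # about topic B
    [0.8, 0.2, 0.1],   # also about topic A
])
query = np.array([1.0, 0.0, 0.0])  # a query about topic A

print(top_k(query, docs))  # ranks the two topic-A docs first
```

The same ranking logic applies regardless of which embedding model produced the vectors, which is what makes an open-weight model a drop-in candidate for existing retrieval stacks.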