Today, the AI world is surging! From the mysterious Polaris Alpha model launched on OpenRouter, to Google’s powerful file search tool for the Gemini API, to Novita AI’s limited-time free code generation API, and major updates to multiple Google products, developers and creators are ushering in a new wave of efficiency revolution.
1. Mysterious Model Polaris Alpha Launched on OpenRouter: Is it OpenAI’s Next Move?
Developer community platform OpenRouter recently quietly launched a new stealth model called “Polaris Alpha,” immediately sparking heated discussion within the community.
This model is described as a “powerful, general-purpose model that performs exceptionally well in real-world tasks such as code, tool calling, and instruction following.” Due to its community-feedback-driven testing mode, its true identity is shrouded in mystery.
What exactly is this?
Many speculate that Polaris Alpha may be related to OpenAI’s gpt-5.1 series. After all, OpenRouter previously launched the “Quasar Alpha” model in a similar fashion, which was later confirmed to be related to OpenAI’s new technology. More interestingly, its high 256K context window suggests it might be a lightweight “mini” or “nano” version, optimized for specific tasks.
Currently, developers can experience the model’s capabilities firsthand through the OpenRouter platform and collectively uncover its mystery.
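Since OpenRouter exposes its models through a standard OpenAI-compatible chat completions endpoint, trying the stealth model takes only a few lines. The sketch below assumes the model slug is `openrouter/polaris-alpha` and that your key lives in the `OPENROUTER_API_KEY` environment variable; verify both on the model's OpenRouter page before relying on them.

```python
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL_ID = "openrouter/polaris-alpha"  # assumed slug for the stealth model

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    key = os.environ.get("OPENROUTER_API_KEY")
    if key:  # only call out to the API when a key is configured
        req = build_request("Write FizzBuzz in Python.", key)
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI wire format, you can also point any existing OpenAI SDK at OpenRouter's base URL instead of building requests by hand.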
Learn more: OpenRouter - Polaris Alpha
2. Google Injects New Power into Gemini API: File Search Tool Officially Released
The Google DeepMind team has brought a major update to the Gemini API – a built-in “File Search Tool.” This is not just a simple upgrade, but a fully managed RAG (Retrieval Augmented Generation) system designed to allow developers to easily combine their own data with Gemini models.
Good news for developers!
In the past, to implement RAG functionality, developers needed to handle document storage, text chunking, vector embedding, and a series of other complex processes themselves. Now, the Gemini API’s file search tool automates all of this.
Its core highlights include:
- Minimalist development experience: The entire RAG process is seamlessly integrated into the existing generateContent API, requiring almost no changes to developers’ original workflows.
- Powerful vector search: Using the latest Gemini Embedding model, it can understand semantics and find the most relevant information even if query keywords do not exactly match those in the document.
- Built-in citation sources: Model-generated responses automatically include citation sources, indicating which document snippets the answers are based on, greatly simplifying the fact-checking process.
- Support for diverse formats: Knowledge bases can be easily created from PDF, DOCX, TXT, JSON, and various programming language documents.
Even more appealing, Google offers a very generous pricing model: file storage and query-time embedding generation are free, with payment required only for initial document indexing. This greatly improves the cost-effectiveness of building and scaling AI applications.
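Based on the announcement, attaching a file search store to an ordinary generateContent call might look like the sketch below. Note that the model name, the store identifier `fileSearchStores/my-docs`, and the exact camelCase field names are all assumptions, so confirm them against the official Gemini API docs before use.

```python
import json

# Assumed endpoint; the model name is illustrative.
GEMINI_ENDPOINT = (
    "https://generativelanguage.googleapis.com/v1beta/"
    "models/gemini-2.5-flash:generateContent"
)

def build_file_search_body(question: str, store_name: str) -> dict:
    """Attach a pre-indexed file search store to a generateContent request.

    Retrieval, chunking, embedding, and citation grounding are all handled
    server-side; the request body only names the store to search.
    """
    return {
        "contents": [{"parts": [{"text": question}]}],
        "tools": [
            # Field names here are assumptions based on the announcement.
            {"fileSearch": {"fileSearchStoreNames": [store_name]}}
        ],
    }

body = build_file_search_body(
    "What does our design doc say about caching?",
    "fileSearchStores/my-docs",  # hypothetical store id
)
print(json.dumps(body, indent=2))
```

POSTing this body with your API key should return an answer grounded in the store's documents, with citation metadata alongside the generated text.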
Learn more technical details: Google Developers Blog
3. Limited-Time Free! Novita AI Opens Powerful KAT-Coder API
For programmers, this is definitely good news not to be missed! Novita AI announced that its powerful code generation large language model KAT-Coder API will be completely free for a limited time, with the end date to be announced later.
How to get started quickly?
KAT-Coder excels at handling complex coding tasks, such as code refactoring and multi-file bug fixing. To try it immediately, for example through its TRAE integration, just follow a few simple steps:
- Go to Novita.ai to get your API key.
- Set the Provider to ‘novita’ in your development tool (e.g., TRAE).
- Enter ‘kat-coder’ in the model ID field and paste your API key.
In just 30 seconds, you can integrate this top-tier AI code assistant into your workflow and improve development efficiency for free.
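If you would rather call the API directly than go through TRAE, the sketch below targets Novita's OpenAI-compatible chat completions endpoint. The base URL is an assumption from Novita's public docs and the `kat-coder` model id comes from the announcement; double-check both before use.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint; confirm the base URL in Novita's docs.
NOVITA_URL = "https://api.novita.ai/v3/openai/chat/completions"

def kat_coder_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat completion request targeting the kat-coder model."""
    payload = {
        "model": "kat-coder",  # model id from the announcement
        "messages": [
            {"role": "system", "content": "You are a careful coding assistant."},
            {"role": "user", "content": prompt},
        ],
    }
    return urllib.request.Request(
        NOVITA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    key = os.environ.get("NOVITA_API_KEY")
    if key:  # only send the request when a key is configured
        req = kat_coder_request("Refactor this function to remove duplication.", key)
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["choices"][0]["message"]["content"])
```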
Official source: novita_labs on X
4. NotebookLM Transforms into a Portable Learning Tool: New Quiz and Flashcard Features Added
Google’s AI note-taking application NotebookLM has received a major update to its mobile app, making learning easier and more fun.
Learn efficiently anytime, anywhere
The new version adds two killer features:
- Generate Flashcards: Automatically generate flashcards from your note sources to help you memorize key terms, important dates, and core concepts.
- Create Quizzes: Generate quiz questions based on your notes to test your understanding. You can customize topics, difficulty, and the number of questions.
In addition, the new version has significantly optimized the chat experience, with a 50% improvement in response quality, a 4x expansion of the context window, and a 6x enhancement in conversational memory, all thanks to the latest Gemini model. Now, you can focus more on conversing with data from specific sources.
Update and experience now: Google Labs Blog
5. Everyone is an AI Developer: Google Opal Expands to Over 160 Countries Worldwide
Google Labs’ no-code AI application building tool “Opal” is accelerating its global expansion, with its service scope expanding from the initial 15 countries to over 160 countries in one go.
Opal aims to enable people without programming backgrounds to easily create their own AI gadgets. Users have already used it to create various interesting applications:
- Automate tedious tasks: For example, automatically scraping data from the web and saving it to Google Sheets.
- Create customized content: Generate posts for social media, design dynamic visuals for marketing campaigns.
- Rapidly validate ideas: Build MVPs (Minimum Viable Products) for applications such as language learning, travel planning, or quiz generation in minutes.
This expansion means that more entrepreneurs and developers worldwide will be able to use Opal to quickly turn their AI ideas into reality.
Explore the possibilities of Opal: Google Labs Blog
6. The Beast of AI Computing Power: Google Launches Seventh-Generation TPU Chip Ironwood
Google has also made a big move in the hardware field, officially announcing that its seventh-generation TPU (Tensor Processing Unit) chip “Ironwood” is about to be generally available (GA)!
How powerful is it?
According to official data, Ironwood has achieved a huge leap in performance:
- Peak performance is 10 times that of TPU v5p.
- Per-chip performance in training and inference workloads is more than 4 times that of TPU v6e (Trillium).
Google is already using Ironwood internally to train and run cutting-edge models, including Gemini. Soon, Google Cloud customers will also be able to leverage this performance beast to provide powerful computing support for their AI applications.
Learn more: Google on X


