DeepMind’s PEER scales language models with millions of tiny experts

July 13, 2024

Image credit: VentureBeat with DALL-E 3


Mixture-of-Experts (MoE) has become a popular technique for scaling large language models (LLMs) without exploding computational costs. Instead of using the entire model capacity for every input, MoE architectures route the data to small but specialized “expert” modules. MoE enables LLMs to increase their parameter count while keeping inference costs low. MoE is used in several popular LLMs, including Mixtral, DBRX, Grok, and reportedly GPT-4.

However, current MoE techniques have limitations that restrict them to a relatively small number of experts. In a new paper, Google DeepMind introduces Parameter Efficient Expert Retrieval (PEER), a novel architecture that can scale MoE models to millions of experts, further improving the performance-compute tradeoff of large language models.

The challenge of scaling LLMs

The past few years have shown that scaling language models by increasing their parameter count leads to improved performance and new capabilities. However, there is a limit to how much you can scale a model before running into computational and memory bottlenecks.

Every transformer block used in LLMs has attention layers and feedforward (FFW) layers. The attention layer computes the relations between the tokens in the sequence fed to the transformer block. The feedforward network is responsible for storing the model’s knowledge. FFW layers account for two-thirds of the model’s parameters and are one of the bottlenecks of scaling transformers. In the classic transformer architecture, all the parameters of the FFW are used at inference time, which makes their computational footprint directly proportional to their size.
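
For intuition, here is a minimal sketch of such a dense FFW block in PyTorch-style Python (the class name and dimensions are illustrative choices, not taken from the paper); every weight participates in every forward pass:

```python
import torch
import torch.nn as nn

class DenseFFW(nn.Module):
    """Standard transformer feedforward block: every weight is used for every token."""

    def __init__(self, d_model: int = 512, d_ff: int = 2048):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)    # d_model x d_ff weights, all active
        self.down = nn.Linear(d_ff, d_model)  # d_ff x d_model weights, all active

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Every parameter participates, so FLOPs scale directly with d_ff.
        return self.down(torch.relu(self.up(x)))
```

Because both projections touch every token, per-token compute grows linearly with the layer width, which is exactly the coupling between capacity and cost that MoE is designed to break.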

MoE addresses this challenge by replacing the single dense FFW layer with sparsely activated expert modules. Each expert contains a fraction of the parameters of the full dense layer and specializes in certain areas. The MoE has a router that assigns each input to the few experts most likely to provide the most accurate answer.

By increasing the number of experts, MoE can increase the capacity of the LLM without increasing the computational cost of running it. 
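
A minimal sketch of generic top-k routing, reusing the DenseFFW class from the sketch above (this illustrates the idea, not the routing code of any particular production model):

```python
class TopKMoE(nn.Module):
    """Sparsely activated MoE layer: each token pays only for k of num_experts."""

    def __init__(self, d_model: int = 512, d_ff: int = 2048,
                 num_experts: int = 8, k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(
            [DenseFFW(d_model, d_ff) for _ in range(num_experts)])
        self.router = nn.Linear(d_model, num_experts)  # scores every expert per token
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model). Route each token to its k best-scoring experts.
        weights, idx = torch.topk(self.router(x), self.k, dim=-1)
        weights = torch.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for t in range(x.shape[0]):            # per-token loop for clarity, not speed
            for slot in range(self.k):
                expert = self.experts[int(idx[t, slot])]
                out[t] += weights[t, slot] * expert(x[t:t + 1]).squeeze(0)
        return out
```

Capacity grows with num_experts while each token pays for only k expert evaluations. The per-token loop is written for clarity; real implementations batch tokens by expert.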

Finding the right level of MoE granularity

According to recent studies, the optimal number of experts for an MoE model is related to several factors, including the number of training tokens and the compute budget. When these variables are balanced, MoEs have consistently outperformed dense models for the same amount of compute resources.

Furthermore, researchers have found that increasing the “granularity” of an MoE model, which refers to the number of experts, can lead to performance gains, especially when accompanied by an increase in model size and training data.

High-granularity MoE can also enable models to learn new knowledge more efficiently. Some studies suggest that by adding new experts and regularizing them properly, MoE models can adapt to continuous data streams, which can help language models deal with continuously changing data in their deployment environments.

Current approaches to MoE, however, do not scale well. For example, they usually have fixed routers that are designed for a specific number of experts and must be readjusted whenever new experts are added.

Parameter Efficient Expert Retrieval 

DeepMind’s Parameter Efficient Expert Retrieval (PEER) architecture addresses the challenges of scaling MoE to millions of experts. PEER replaces the fixed router with a learned index to efficiently route input data to a vast pool of experts. For each given input, PEER first uses a fast initial computation to create a shortlist of potential candidates before choosing and activating the top experts. This mechanism enables the MoE to handle a very large number of experts without slowing down.
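
The paper builds this learned index on product keys: an expert’s score decomposes into two half-query scores, so a pool of n² experts can be shortlisted by scoring only 2n sub-keys. A minimal sketch, with names and shapes of our own choosing:

```python
def product_key_topk(query: torch.Tensor, subkeys1: torch.Tensor,
                     subkeys2: torch.Tensor, k: int):
    """Shortlist the top-k experts from a pool of n*n using two sub-key tables.

    query: (d,), split into two halves; subkeys1, subkeys2: (n, d // 2).
    """
    d = query.shape[0]
    q1, q2 = query[:d // 2], query[d // 2:]
    s1 = subkeys1 @ q1                        # (n,) scores for the first half
    s2 = subkeys2 @ q2                        # (n,) scores for the second half
    top1_s, top1_i = torch.topk(s1, k)
    top2_s, top2_i = torch.topk(s2, k)
    # Expert (i, j) scores s1[i] + s2[j], so the global top-k is guaranteed
    # to lie inside this k x k grid of half-query winners.
    cand = top1_s[:, None] + top2_s[None, :]  # (k, k)
    flat_s, flat_i = torch.topk(cand.flatten(), k)
    n = subkeys1.shape[0]
    ids = top1_i[flat_i // k] * n + top2_i[flat_i % k]  # (i, j) -> flat expert id
    return flat_s, ids
```

Because the shortlist provably contains the true top-k, nothing is lost, and with n = 1,024 sub-keys per table this indexes a pool of over a million experts while scoring just 2,048 keys per query.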

Unlike previous MoE architectures, where experts were often as large as the FFW layers they replaced, PEER uses tiny experts with a single neuron in the hidden layer. This design enables the model to share hidden neurons among experts, improving knowledge transfer and parameter efficiency. To compensate for the small size of the experts, PEER uses a multi-head retrieval approach, similar to the multi-head attention mechanism used in transformer models.
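
Putting the retrieval together with single-neuron experts gives a rough picture of a PEER layer. This sketch follows the paper’s description (single-neuron experts, multi-head retrieval, a shared expert pool), but the naming, initialization, and single-token forward pass are our own simplifications:

```python
class PEERSketch(nn.Module):
    """Illustrative PEER-style layer: each expert is a single hidden neuron."""

    def __init__(self, d_model: int = 256, num_experts: int = 64 ** 2,
                 heads: int = 4, k: int = 8):
        super().__init__()
        self.heads, self.k = heads, k
        self.queries = nn.Linear(d_model, heads * d_model)  # one query per head
        n = int(num_experts ** 0.5)
        self.sub1 = nn.Parameter(torch.randn(n, d_model // 2))
        self.sub2 = nn.Parameter(torch.randn(n, d_model // 2))
        # Expert i is just a down-vector u_i and an up-vector v_i (one neuron).
        self.u = nn.Embedding(num_experts, d_model)
        self.v = nn.Embedding(num_experts, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (d_model,) for a single token, kept one-dimensional for clarity.
        out = torch.zeros_like(x)
        for q in self.queries(x).chunk(self.heads):
            scores, ids = product_key_topk(q, self.sub1, self.sub2, self.k)
            g = torch.softmax(scores, dim=-1)     # router weights for this head
            h = torch.relu(self.u(ids) @ x)       # (k,) single-neuron activations
            out = out + (g * h) @ self.v(ids)     # weighted sum of up-vectors
        return out
```

Only heads × k of the num_experts (u_i, v_i) pairs are read per token, so the active parameter count, and with it compute and activation memory, stays small however large the expert pool grows.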

PEER layer architecture (source: arXiv)

A PEER layer can be added to an existing transformer model or used to replace an FFW layer. PEER is also related to parameter-efficient fine-tuning (PEFT) techniques. In PEFT, parameter efficiency refers to the number of parameters that are modified to fine-tune a model for a new task. In PEER, it refers to reducing the number of active parameters in the MoE layer, which directly affects computation and activation memory consumption during pre-training and inference.
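
A back-of-the-envelope comparison makes this concrete (illustrative numbers of our own choosing; the count ignores the query projections and sub-key scoring):

```python
d_model, d_ff = 512, 2048
dense_active = 2 * d_model * d_ff          # every FFW weight touches every token
heads, k = 8, 16
peer_active = heads * k * 2 * d_model      # only the retrieved u_i, v_i rows
print(dense_active, peer_active)           # 2097152 vs 131072: ~16x fewer
```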

According to the paper, PEER could potentially be adapted to select PEFT adapters at runtime, making it possible to dynamically add new knowledge and features to LLMs.

PEER might be used in DeepMind’s Gemini 1.5 models, which, according to the Google blog, use “a new Mixture-of-Experts (MoE) architecture.”

PEER in action

The researchers evaluated the performance of PEER on different benchmarks, comparing it against transformer models with dense feedforward layers and other MoE architectures. Their experiments show that PEER models achieve a better performance-compute tradeoff, reaching lower perplexity scores with the same computational budget as their counterparts. 
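
For reference, perplexity is the exponential of the average per-token cross-entropy loss, so lower is better:

```python
import torch
import torch.nn.functional as F

def perplexity(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """exp of the mean token-level cross-entropy; lower is better.

    logits: (num_tokens, vocab_size), targets: (num_tokens,) class indices.
    """
    return torch.exp(F.cross_entropy(logits, targets)).item()
```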

The researchers also found that increasing the number of experts in a PEER model leads to further perplexity reduction. 

“This design demonstrates a superior compute-performance trade-off in our experiments, positioning it as a competitive alternative to dense FFW layers for scaling foundation models,” the researchers write.

The findings are interesting because they challenge the long-held belief that MoE models reach peak efficiency with a limited number of experts. PEER shows that by applying the right retrieval and routing mechanisms, it is possible to scale MoE to millions of experts. This approach can help further reduce the cost and complexity of training and serving very large language models.

Source: VentureBeat – https://venturebeat.com/ai/deepminds-peer-scales-language-models-with-millions-of-tiny-experts/
