DeepMind’s PEER scales language models with millions of tiny experts

July 13, 2024
in Technology

Originally published July 12, 2024, 8:53 PM

[Image: mixture of millions of experts. Credit: VentureBeat with DALL-E 3]

Mixture-of-Experts (MoE) has become a popular technique for scaling large language models (LLMs) without exploding computational costs. Instead of using the entire model capacity for every input, MoE architectures route each input to small but specialized “expert” modules. MoE enables LLMs to increase their parameter count while keeping inference costs low. MoE is used in several popular LLMs, including Mixtral, DBRX, Grok and, reportedly, GPT-4.

However, current MoE techniques have limitations that restrict them to a relatively small number of experts. In a new paper, Google DeepMind introduces Parameter Efficient Expert Retrieval (PEER), a novel architecture that can scale MoE models to millions of experts, further improving the performance-compute tradeoff of large language models.

The challenge of scaling LLMs

The past few years have shown that scaling language models by increasing their parameter count leads to improved performance and new capabilities. However, there is a limit to how much you can scale a model before running into computational and memory bottlenecks.

Every transformer block used in LLMs has attention layers and feedforward (FFW) layers. The attention layer computes the relations between the tokens in the sequence fed to the transformer block, while the feedforward network is responsible for storing the model’s knowledge. FFW layers account for two-thirds of the model’s parameters and are one of the bottlenecks of scaling transformers. In the classic transformer architecture, all of the FFW’s parameters are used during inference, which makes its computational footprint directly proportional to its size.
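
To make that concrete, here is a minimal sketch of a dense FFW block (PyTorch; the class name and dimensions are illustrative assumptions, not taken from the paper). Every weight participates in processing every token, which is why compute grows in lockstep with size:

```python
# A minimal sketch of a dense transformer FFW block. The dimensions are
# illustrative defaults, not values from the paper.
import torch
import torch.nn as nn

class DenseFFW(nn.Module):
    def __init__(self, d_model: int = 1024, d_hidden: int = 4096):
        super().__init__()
        self.up = nn.Linear(d_model, d_hidden)    # expand
        self.down = nn.Linear(d_hidden, d_model)  # project back

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Roughly 2 * d_model * d_hidden weights, all active for every token.
        return self.down(torch.relu(self.up(x)))
```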

MoE tries to address this challenge by replacing the single dense FFW layer with sparsely activated expert modules. Each expert contains a fraction of the parameters of the full dense layer and specializes in certain areas. The MoE has a router that assigns each input to the several experts that are most likely to provide an accurate answer.

By increasing the number of experts, MoE can increase the capacity of the LLM without increasing the computational cost of running it. 
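
The sketch below illustrates this classic top-k routing pattern (illustrative code, not DeepMind’s implementation; the class name and sizes are assumptions). Note that the router layer is sized for a fixed number of experts, which is exactly the rigidity discussed below:

```python
# A hedged sketch of classic top-k MoE routing: a learned router scores
# all experts for each token, and only the top-k experts actually run,
# so active compute stays roughly constant as total capacity grows.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int, k: int = 2):
        super().__init__()
        # The router's output size is tied to n_experts, so adding
        # experts later means readjusting the router.
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.ReLU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model); keep only the k best-scoring experts.
        scores, idx = self.router(x).topk(self.k, dim=-1)
        gates = F.softmax(scores, dim=-1)
        out = torch.zeros_like(x)
        for t in range(x.size(0)):      # naive loops, for clarity
            for slot in range(self.k):
                expert = self.experts[int(idx[t, slot])]
                out[t] += gates[t, slot] * expert(x[t])
        return out
```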

Finding the right level of MoE granularity

According to recent studies, the optimal number of experts for an MoE model is related to several factors, including the number of training tokens and the compute budget. When these variables are balanced, MoEs have consistently outperformed dense models for the same amount of compute resources.

Furthermore, researchers have found that increasing the “granularity” of an MoE model, which refers to the number of experts, can lead to performance gains, especially when accompanied by an increase in model size and training data.

High-granularity MoE can also enable models to learn new knowledge more efficiently. Some studies suggest that by adding new experts and regularizing them properly, MoE models can adapt to continuous data streams, which can help language models deal with continuously changing data in their deployment environments.

However, current approaches to MoE do not scale well. For example, they usually have fixed routers that are designed for a specific number of experts and must be readjusted whenever new experts are added.

Parameter Efficient Expert Retrieval 

DeepMind’s Parameter Efficient Expert Retrieval (PEER) architecture addresses the challenges of scaling MoE to millions of experts. PEER replaces the fixed router with a learned index that efficiently routes input data to a vast pool of experts. For each input, PEER first uses a fast initial computation to create a shortlist of candidate experts, then chooses and activates the top ones among them. This mechanism enables the MoE to handle a very large number of experts without slowing down.

Unlike previous MoE architectures, where experts were often as large as the FFW layers they replaced, PEER uses tiny experts with a single neuron in the hidden layer. This design enables the model to share hidden neurons among experts, improving knowledge transfer and parameter efficiency. To compensate for the small size of the experts, PEER uses a multi-head retrieval approach, similar to the multi-head attention mechanism used in transformer models.
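
Based on the paper’s description, a rough sketch of this design might look like the following (product-key-style retrieval over a grid of single-neuron experts; every name, size, and default here is an illustrative assumption, not DeepMind’s code). The point to notice is that the shortlist is scored against two small sub-key tables of size n, so routing cost grows with n rather than with the full pool of n² experts:

```python
# A hedged sketch of PEER-style retrieval: product-key lookup over a pool
# of n_side**2 single-neuron experts. Illustrative, not DeepMind's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PEERSketch(nn.Module):
    def __init__(self, d_model: int = 256, n_side: int = 128, k: int = 8):
        super().__init__()
        self.n_side, self.k = n_side, k
        n_experts = n_side * n_side          # expert pool grows as n_side**2
        self.query = nn.Linear(d_model, d_model)
        # Two small sub-key tables replace one huge key table, so the
        # initial shortlist costs O(n_side) rather than O(n_experts).
        self.keys1 = nn.Parameter(torch.randn(n_side, d_model // 2))
        self.keys2 = nn.Parameter(torch.randn(n_side, d_model // 2))
        # Each expert is a single hidden neuron: one up- and one down-vector.
        self.up = nn.Embedding(n_experts, d_model)
        self.down = nn.Embedding(n_experts, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q1, q2 = self.query(x).chunk(2, dim=-1)         # split the query
        s1, i1 = (q1 @ self.keys1.T).topk(self.k, -1)   # shortlist on axis 1
        s2, i2 = (q2 @ self.keys2.T).topk(self.k, -1)   # shortlist on axis 2
        # Combine the two shortlists into k*k candidates, keep the top k.
        cand = (s1.unsqueeze(-1) + s2.unsqueeze(-2)).flatten(1)
        top, pos = cand.topk(self.k, -1)
        idx = (i1.gather(1, pos // self.k) * self.n_side
               + i2.gather(1, pos % self.k))
        gates = F.softmax(top, dim=-1)                  # (tokens, k)
        # Activate only the k retrieved single-neuron experts per token.
        h = torch.relu((self.up(idx) * x.unsqueeze(1)).sum(-1))
        return ((gates * h).unsqueeze(-1) * self.down(idx)).sum(1)
```

In the paper, several such retrieval heads run in parallel with their outputs summed, mirroring multi-head attention; a single head is shown here to keep the sketch short.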

PEER layer architecture (source: arXiv)

A PEER layer can be added to an existing transformer model or used to replace an FFW layer. PEER is also related to parameter-efficient fine-tuning (PEFT) techniques. In PEFT, parameter efficiency refers to the number of parameters that are modified to fine-tune a model for a new task. In PEER, it refers to the number of active parameters in the MoE layer, which directly affects computation and activation memory consumption during pre-training and inference.

According to the paper, PEER could potentially be adapted to select PEFT adapters at runtime, making it possible to dynamically add new knowledge and features to LLMs.

PEER might be used in DeepMind’s Gemini 1.5 models, which, according to the Google blog, use “a new Mixture-of-Experts (MoE) architecture.”

PEER in action

The researchers evaluated the performance of PEER on different benchmarks, comparing it against transformer models with dense feedforward layers and other MoE architectures. Their experiments show that PEER models achieve a better performance-compute tradeoff, reaching lower perplexity scores with the same computational budget as their counterparts. 
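
For reference, perplexity is the exponential of the average per-token cross-entropy loss, so a lower value means the model assigns higher probability to the evaluation text (a standard definition, not code from the paper):

```python
# Perplexity from mean cross-entropy (in nats) -- a standard definition.
import math

def perplexity(mean_cross_entropy: float) -> float:
    return math.exp(mean_cross_entropy)

print(perplexity(2.0))  # ~7.39: as uncertain as a uniform choice
                        # over about 7.4 tokens at each step
```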

The researchers also found that increasing the number of experts in a PEER model leads to further perplexity reduction. 

“This design demonstrates a superior compute-performance trade-off in our experiments, positioning it as a competitive alternative to dense FFW layers for scaling foundation models,” the researchers write.

The findings are interesting because they challenge the long-held belief that MoE models reach peak efficiency with a limited number of experts. PEER shows that by applying the right retrieval and routing mechanisms, it is possible to scale MoE to millions of experts. This approach can help further reduce the cost and complexity of training and serving very large language models.

Copyright for syndicated content belongs to the linked source: VentureBeat – https://venturebeat.com/ai/deepminds-peer-scales-language-models-with-millions-of-tiny-experts/

Tags: DeepMind, Scales, Technology