LLM Profitable – Nvidia H200 Are Very Profitable for AI Companies
Nvidia reports that, according to its own data, H200 GPUs are profitable for AI companies.
$1 of H200 cost can generate $7 in revenue over four years serving Meta Llama 3. This means that if an AI company buys a $40,000 H200, it could make about $280,000 in AI inference revenue over four years.
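As a sanity check on that arithmetic, here is a minimal sketch in Python. The 7x revenue multiple and the roughly $40,000 H200 price come from the article; everything else is simple multiplication, not additional data.

```python
# Back-of-envelope check of Nvidia's claimed revenue multiple.
H200_PRICE_USD = 40_000   # assumed per-GPU purchase price (from the article)
REVENUE_MULTIPLE = 7      # Nvidia's claim: $7 of revenue per $1 of H200 cost
YEARS = 4

total_revenue = H200_PRICE_USD * REVENUE_MULTIPLE
annual_revenue = total_revenue / YEARS

print(f"Revenue over {YEARS} years: ${total_revenue:,.0f}")   # $280,000
print(f"Average per year:           ${annual_revenue:,.0f}")  # $70,000
```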
The Nvidia H200 has twice the AI inference capability of the H100.
If AI companies are profitable using Nvidia chips, they can continue to buy Nvidia chips.
Nvidia shares have broken above $1,000, and the company has announced a ten-for-one stock split.
Each Nvidia H200 running Meta Llama 3 can support about 2,400 users by processing 24,000 tokens per second.
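Combining that throughput with the $280,000 revenue figure gives a rough idea of the per-token pricing Nvidia's claim implies. The sketch below uses the article's 24,000 tokens/sec and $280,000-over-four-years numbers; the 50% utilization rate is a hypothetical assumption, not a figure from the article.

```python
# Rough derivation of the implied price per million tokens.
TOKENS_PER_SECOND = 24_000      # H200 throughput on Llama 3 (from the article)
TARGET_REVENUE_USD = 280_000    # 7x multiple on a $40k GPU (from the article)
YEARS = 4
UTILIZATION = 0.5               # assumed fraction of time the GPU serves traffic

seconds = YEARS * 365 * 24 * 3600
tokens_served = TOKENS_PER_SECOND * seconds * UTILIZATION

implied_price_per_million = TARGET_REVENUE_USD / tokens_served * 1_000_000
print(f"Tokens served over {YEARS} years: {tokens_served:.2e}")       # ~1.5e12
print(f"Implied price per million tokens: ${implied_price_per_million:.2f}")  # ~$0.19
```

At 50% utilization, the claimed revenue works out to roughly $0.19 per million tokens; higher utilization or higher per-token pricing would make the target easier to hit.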
Nvidia will be moving to liquid-cooled AI data centers with the Blackwell generation.
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
Copyright for syndicated content belongs to the linked source: Next Big Future – https://www.nextbigfuture.com/2024/05/llm-profitable-nvidia-h200-are-very-profitable-for-ai-companies.html