Microsoft’s BitNet shows what AI can do with just 400MB and no GPU

By Lucky · April 20, 2025
What happened? Microsoft has introduced BitNet b1.58 2B4T, a new type of large language model built for extraordinary efficiency. Unlike traditional AI models, which rely on 16- or 32-bit floating-point numbers to represent each weight, BitNet uses only three discrete values: -1, 0, or +1. This approach, known as ternary quantization, allows each weight to be stored in roughly 1.58 bits. The result is a model that dramatically reduces memory use and runs far more easily on standard hardware, without the high-end GPUs that large-scale AI usually requires.
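
Since each weight can take only three values, it carries log2(3) ≈ 1.58 bits of information, which is where the “1.58-bit” name comes from. The snippet below is a minimal NumPy sketch of the absmean-style ternary quantization described in the BitNet papers; the function name and exact rounding details are illustrative, not Microsoft’s actual implementation.

```python
import numpy as np

def ternarize(W: np.ndarray, eps: float = 1e-8):
    """Map a float weight matrix to {-1, 0, +1} plus one shared scale.

    Absmean-style scaling: divide by the mean absolute weight, then
    round and clip to the ternary set (a sketch, not the real kernels).
    """
    scale = np.abs(W).mean() + eps
    W_ternary = np.clip(np.round(W / scale), -1, 1).astype(np.int8)
    return W_ternary, scale

bits_per_weight = np.log2(3)            # ≈ 1.58 bits of information per weight

W = np.random.randn(4, 4).astype(np.float32)
W_q, s = ternarize(W)
W_approx = W_q * s                      # dequantized approximation of W
print(W_q, round(float(bits_per_weight), 2))
```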

The BitNet b1.58 2B4T model was developed by Microsoft’s General Artificial Intelligence group and contains two billion parameters, the internal values that enable the model to understand and generate language. To compensate for its low-precision weights, the model was trained on a massive dataset of four trillion tokens, roughly the content of about 33 million books. This extensive training allows BitNet to perform as well as, or better than, other major models of similar size, such as Meta’s Llama 3.2 1B, Google’s Gemma 3 1B, and Alibaba’s Qwen 2.5 1.5B.
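
As a quick sanity check on that comparison, assuming the common rule of thumb of roughly 0.75 words per token (a figure not given in the article):

```python
tokens = 4e12                            # training tokens
books = 33e6                             # books in the comparison
tokens_per_book = tokens / books         # ≈ 121,000 tokens per book
words_per_book = tokens_per_book * 0.75  # ≈ 91,000 words, a typical book length
print(round(tokens_per_book), round(words_per_book))
```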

In benchmark tests, BitNet b1.58 2B4T showed strong performance across a wide range of tasks, including grade-school math problems and questions requiring common-sense reasoning. In some evaluations, it even outperformed its rivals.

What truly sets BitNet apart is its memory efficiency. The model requires only 400MB of memory, a fraction of what comparable models typically need. As a result, it can run smoothly on standard CPUs, including Apple’s M2 chip, without relying on high-end GPUs or specialized AI hardware.
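
A back-of-the-envelope calculation shows why roughly 400MB is plausible for the weights alone; it ignores embeddings, activations, and packing overhead, so it is only an estimate:

```python
params = 2e9                  # ~2 billion weights
ternary_bits = 1.58           # log2(3) bits per ternary weight
fp16_bits = 16                # a conventional 16-bit baseline

ternary_mb = params * ternary_bits / 8 / 1e6   # ≈ 395 MB
fp16_mb = params * fp16_bits / 8 / 1e6         # ≈ 4,000 MB

print(f"ternary: ~{ternary_mb:.0f} MB, fp16 baseline: ~{fp16_mb:.0f} MB")
```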

This level of efficiency is made possible by a custom software framework called bitnet.cpp, which is optimized to take full advantage of the model’s ternary weights. The framework ensures fast, lightweight performance on everyday computing devices.

Standard AI libraries such as Hugging Face Transformers do not deliver the same performance benefits with BitNet b1.58 2B4T, which makes the custom bitnet.cpp framework necessary. Available on GitHub, the framework is currently optimized for CPUs, but support for other processor types is planned in future updates.

The idea of reducing model precision to save memory is not new; researchers have long explored model compression. However, most previous attempts converted full-precision models after training, often at the cost of accuracy. BitNet b1.58 2B4T takes a different approach: it is trained from the ground up using only the three weight values (-1, 0, and +1), which lets it avoid much of the performance loss seen in earlier methods.
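
The sketch below illustrates that quantization-aware idea in toy PyTorch form: full-precision latent weights are kept for the optimizer, the forward pass uses their ternary projection, and a straight-through estimator passes gradients back. The real BitNet training recipe also quantizes activations and relies on custom kernels, so treat this only as an illustration.

```python
import torch
import torch.nn as nn

class TernaryLinear(nn.Module):
    """Toy linear layer whose weights are ternarized on the fly (a sketch)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Full-precision "latent" weights that the optimizer actually updates.
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scale = self.weight.abs().mean().clamp(min=1e-8)
        w_q = torch.clamp(torch.round(self.weight / scale), -1, 1) * scale
        # Straight-through estimator: forward uses w_q, backward sees identity.
        w = self.weight + (w_q - self.weight).detach()
        return x @ w.t()

layer = TernaryLinear(8, 4)
out = layer(torch.randn(2, 8))
out.sum().backward()          # gradients reach the latent full-precision weights
```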

This shift has important implications. Running large AI models usually demands powerful hardware and considerable energy, which drives up both cost and environmental impact. Because BitNet relies on extremely simple operations, mostly additions rather than multiplications, it consumes far less energy.
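
To see why ternary weights turn most of the arithmetic into additions: multiplying an input by +1, -1, or 0 amounts to adding, subtracting, or skipping that input. A minimal NumPy illustration (not how bitnet.cpp actually implements its kernels):

```python
import numpy as np

def ternary_matvec(W_ternary: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Multiply a {-1, 0, +1} matrix by a vector using only adds and subtracts."""
    out = np.zeros(W_ternary.shape[0], dtype=x.dtype)
    for i, row in enumerate(W_ternary):
        out[i] = x[row == 1].sum() - x[row == -1].sum()   # no multiplications
    return out

W_q = np.random.choice([-1, 0, 1], size=(4, 8)).astype(np.int8)
x = np.random.randn(8).astype(np.float32)
assert np.allclose(ternary_matvec(W_q, x), W_q @ x, atol=1e-5)
```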

Microsoft researchers estimate that it uses 85 to 96 percent less energy than comparable full-precision models. That could open the door to running advanced AI directly on personal devices, without the need for cloud-based supercomputers.

That said, BitNet b1.58 2B4T has some limitations. It currently supports only specific hardware and requires the custom bitnet.cpp framework. Its context window, the amount of text it can process at once, is smaller than that of the most advanced models.

Researchers are still investigating why the model performs so well with such a simplified architecture. Future work aims to expand its capabilities, including support for more languages and longer text inputs.
