Multiverse Computing pushes its compressed AI models into the mainstream


5:13 PM IST · March 19, 2026


With private company defaults running at upwards of 9.2% — the highest rate in years — VC firm Lux Capital recently advised companies relying on AI to get their compute capacity commitments confirmed in writing. With financial instability rippling through the AI supply chain, Lux warned, a handshake agreement isn’t enough. But there’s another option entirely: stop relying on external compute infrastructure altogether. Smaller AI models that run directly on a user’s own device — no data center, no cloud provider, no counterparty risk — are getting good enough to be worth considering. And Multiverse Computing is raising its hand.

The Spanish startup has so far kept a lower profile than some of its peers, but as demand for AI efficiency grows, that is changing. After compressing models from major AI labs including OpenAI, Meta, DeepSeek and Mistral AI, it has launched both an app that showcases the capabilities of its compressed models and an API portal — a gateway that lets developers access and build with those models — that makes them more widely available.

The CompactifAI app, which shares its name with Multiverse’s quantum-inspired compression technology, is an AI chat tool in the vein of ChatGPT or Mistral’s Le Chat. Ask a question, and the model answers. The difference is that Multiverse embedded Gilda, a model so small that it can run locally and offline, according to the company. For end users, this is a taste of AI on the edge, with data that doesn’t leave their devices and doesn’t require a connection. But there’s a caveat: their mobile devices must have enough RAM and storage. If they don’t — and many older iPhones won’t — the app falls back to cloud-based models via API.
The routing between local and cloud processing is handled automatically by a system Multiverse has named Ash Nazg — a name that will ring a bell for Tolkien fans, as it references the One Ring inscription in “The Lord of the Rings.” But when the app routes to the cloud, it loses its main privacy edge in the process. These limitations mean that CompactifAI is not quite ready for mass consumer adoption yet, although that may never have been the goal. According to data from Sensor Tower, the app had fewer than 5,000 downloads in the past month.

The real target is businesses. Today, Multiverse is launching a self-serve API portal that gives developers and enterprises direct access to its compressed models — no AWS Marketplace required. “The CompactifAI API portal [now] gives developers direct access to compressed models with the transparency and control needed to run them in production,” CEO Enrique Lizaso said in a statement.

Real-time usage monitoring is one of the key features of the API, and that’s no accident. Alongside the potential advantages of deploying on the edge, lower compute costs are one of the main reasons why enterprises are considering smaller models as an alternative to large language models (LLMs). It also helps that small models are less limited than they used to be. Earlier this week, Mistral updated its small model family with the launch of Mistral Small 4, which it says is simultaneously optimized for general chat, coding, agentic tasks and reasoning. The French company also released Forge, a system that lets enterprises build custom models, including small models for which they can pick the tradeoffs their use cases can best tolerate.

Multiverse’s recent results also suggest the gap with LLMs is narrowing. Its latest compressed model, HyperNova 60B 2602, is built on gpt-oss-120b — an OpenAI model whose underlying code is publicly available.
The company claims it now delivers faster responses at lower cost than the original it was derived from, an advantage that matters particularly for agentic coding workflows, where AI autonomously completes complex, multi-step programming tasks.

Making models small enough to operate on mobile devices while still remaining useful is a big challenge. Apple Intelligence sidestepped that issue by combining an on-device model and a cloud model. Multiverse’s CompactifAI app can also route requests to gpt-oss-120b via API, but its main goal is to showcase that local models like Gilda and its future replacements have advantages that go beyond cost savings. For workers in critical fields, a model that can run locally without connecting to the cloud offers more privacy and resilience. But the bigger value is in the business use cases this can unlock — for instance, embedding AI in drones, satellites, and other settings where connectivity can’t be taken for granted.

The company already serves more than 100 global customers, including the Bank of Canada, Bosch and Iberdrola, but expanding its customer base could help it unlock more funding. After raising a $215 million Series B last year, it is now rumored to be raising a fresh €500 million funding round at a valuation of more than €1.5 billion.


Latest AI News

There aren’t enough rockets for space data centers — Cowboy Space raised $275M to build them


The apparently insatiable demand for AI compute has data center entrepreneurs looking to the stars. There’s a key problem: There aren’t enough rockets to put data centers in orbit around Earth, and they’re too expensive. Most of the players are hoping that SpaceX’s Starship — expected to make its twelfth test flight as soon as this weekend — will solve the problem. But once the vehicle is operational, it may be years before it is commercially available, given SpaceX’s internal satellite business. The same is true for Blue Origin’s New Glenn rocket, which failed to deliver a satellite during its third launch in April. That leaves space data center schemes either targeting the mid-2030s, like Google Suncatcher, or preparing to start off doing edge processing tasks for space sensors, like Starcloud.

In theory, there’s a third way: “We’re standing up our own rocket program,” Baiju Bhatt, the CEO and founder of Cowboy Space Corporation, told TechCrunch. He expects the first launch before the end of 2028. Today, the company announced the closure of a $275 million Series B round at a post-money valuation of $2 billion, led by earlier backer Index Ventures, as a downpayment on that work. Breakthrough Energy Ventures, Construct Capital, IVP, and SAIC also participated. The company had previously raised $80 million from investors, including Index, Breakthrough Energy Ventures, Andreessen Horowitz, and New Enterprise Associates.

Bhatt, a co-founder of online stock platform Robinhood, launched this startup in 2024 as Aetherflux, with plans to collect abundant solar energy in space and beam it down to Earth. The idea of space data centers led the company to pivot toward using its electricity while in orbit. The practical realities of that effort, in turn, led him to a rocket development program, and the company’s new name.
Bhatt said he spoke to multiple launch providers to try to find a path where his company would only build satellites, but he couldn’t find enough launch capacity to truly scale an orbital data center business, or to do so in a way where the unit economics could compete with terrestrial alternatives. "There's a lot of new rockets that are coming online, but as we look three, four years out, it's still very, very scarce, and I think that you're going to see a lot of the first party rocket providers actually specialize into their own payloads," Bhatt said.

Of course, while bringing the rocket in-house is logical, it's also nuts. Only a handful of private companies in the West, mainly SpaceX, Rocket Lab and Arianespace, are consistently launching commercial rockets. Two others, Blue Origin and United Launch Alliance, have been struggling to drag their vehicles out of development hell for years. A number of startups, including Stoke Space, Firefly Aerospace, and Relativity Space, have worked for years and are still waiting to deliver operational systems. This evolution of the company will also bring Cowboy Space Corporation into direct competition with SpaceX and Blue Origin, the most advanced and well-funded players in the market. "The prize here, and the size of this market, is big enough that there's room for many players to succeed," Bhatt said. "I see the demand for AI getting more and more acute, and I see the options on Earth getting more and more limited."

One advantage, Bhatt argues, is the company's focus on this single market (data centers), and its unique design. Orbital rockets typically have a booster stage that flies the vehicle to the edge of space, and a second stage that carries the payload and delivers it to orbit. Cowboy Space plans to build its data centers directly into the second stage of its rocket.
It's actually a bit of a throwback: The first US satellite, Explorer 1, was built as the final stage of a rocket, filled with radio equipment and a few scientific instruments. Making the rocket purpose-built only to launch its data-center satellites should simplify the design process. The company expects each satellite to have a mass of 20,000 to 25,000 kilograms and to generate 1 MW of power for just under 800 onboard GPUs. That means its rocket would be slightly more powerful than SpaceX's workhorse Falcon 9, though still smaller than the company's under-development Starship. Eventually, Bhatt says, he expects the booster to be reusable.

Cowboy Space has hired veterans of the space industry, including former Blue Origin propulsion engineer Warren Lamont and former SpaceX launch director Tyler Grinne. The company also plans to build its own rocket engine, the most complex and expensive part of any launch vehicle. Cowboy Space is still working through key development needs, like facilities to test, manufacture and launch its rockets. The new vision comes with a new name for the startup, to emphasize its mission to "power humanity from the high frontier," although Bhatt admits "it gives me a reason to wear a cowboy hat and also grow this sick mustache."

7 hours ago


Digg tries again, this time as an AI news aggregator


Digg is back from the dead. Again. Just months after launching, the reboot of Kevin Rose’s once-popular link-sharing site shut down in March, as the company shifted course. Originally redesigned as a competitor to the massive community forum site Reddit, the new Digg found that it wasn’t able to effectively manage the bot traffic invading its platform and hadn’t differentiated itself enough from the competition to make an impact. The startup laid off staff and said it was time to go back to the drawing board.

Rose, a partner at True Ventures, returned to work full-time on a new version of Digg in April. On Friday evening, the founder previewed a link to the newly redesigned Digg, which now looks nothing like a Reddit clone and more like the news aggregator it once was. “a little project i've been hacking on: https://t.co/zTuwWy44ly bugs expected. more topics soon,” he posted.

This time around, the site is focused on ranking news — specifically, AI news to start. In an email to beta testers, the company said the site’s goal is to “track the most influential voices in a space” and to surface the news that’s actually worth “paying attention to.” AI is the area it’s testing this idea with, but if successful, Digg will expand to include other topics. The email warned that the site was still raw and “buggy,” and was designed more to give users a first look than to serve as its public debut.

On the current homepage, Digg showcases four main stories at the top: the most viewed story, a story seeing rising discussion, the fastest-climbing story, and one “In case you missed it” headline. Below that is a ranked list of top stories for the day, complete with engagement metrics like views, comments, likes, and saves. But the twist is that these metrics aren’t the ones generated on Digg itself. Instead, Digg is ingesting content from X in real time to determine what’s being discussed, while also performing sentiment analysis, clustering, and signal detection to determine what matters most.
As Rose remarked on X, when OpenAI CEO Sam Altman engages with a story about AI, it almost always sets off a chain reaction that includes deep discussion and propagation of that topic throughout X. The new Digg will be able to track that increased engagement. This might be something that’s interesting to data nerds, as it exposes the impact of X-based engagement with charts and graphs, and offers a way to track signal among what can, on X, often be a lot of noise. But it’s unclear whether there’s enough underlying value here for an everyday user, beyond seeing that yes, a @sama tweet can make something go viral.

The site also ranks the top 1,000 people involved in AI, as well as the top companies and the top politicians focused on AI issues. For those who don’t have time to spend on X tracking breaking AI news, Digg could prove a useful resource. But it’s not clear why people would regularly turn to Digg over their preferred news app, RSS reader, or even their X “For You” feed if they wanted to catch up on what’s trending — especially because there isn’t currently any discussion happening on Digg’s site itself.

Digg may also struggle when it moves on to other topics, as AI news is one of the few areas where discussion still heavily takes place on X. Other verticals don’t have the same traction, especially after Musk’s takeover of the site formerly known as Twitter gave rise to an ecosystem of competitors, which now includes Meta’s creator-focused Threads. Many non-tech-related discussions are now happening off X, or off the public internet entirely. However, if Digg does end up gaining steam, it could serve as a useful source of website traffic to publishers whose businesses have been decimated by declining clicks thanks to Google’s changing algorithms and the impact of AI Overviews, the AI-generated summaries Google displays atop search results, which often answer users’ questions before they ever click through to a website.

7 hours ago


OpenAI Launches $4 Billion Deployment Company Backed by Top Investors


The Deployment Company will function as a standalone business unit while remaining closely connected to OpenAI’s research and product teams.

7 hours ago


Delhi-NCR Raises $1.7 Billion in Q1 2026 Driven by Large Funding Rounds


Gurugram dominated the funding landscape, attracting 52% of all capital raised in the region, equivalent to $876 million.

11 hours ago
