
Co-founders behind Reface and Prisma join hands to improve on-device model inference with Mirai

8:18 PM IST · February 19, 2026


Much of the conversation around AI today is focused on building cloud capacity and massive data centers to run models, while companies like Apple and Qualcomm are in the early stages of making on-device AI more useful. Amid all that, the 14-person technical team of London-based Mirai is working to improve how models run on phones and laptops.

Mirai, which is backed by a $10 million seed round led by Uncork Capital, was founded by Dima Shvets and Alexey Moiseenkov last year. Both founders have experience building scalable consumer apps. Shvets co-founded the face-swapping app Reface, which was backed by a16z; he later also became a scout for the venture firm. Moiseenkov was CEO and co-founder of Prisma, the viral AI filters app of the last decade.

As consumer developers, both had been thinking about AI and machine learning on devices even before generative AI became popular, Shvets said. “When we met together in London, we started to chat about technology, and we realized that within the hype of gen AI and more AI adoption, everybody speaks about cloud, about servers, about AGI coming. But the missing piece is on-device [AI] for consumer hardware,” he told TechCrunch.

Shvets and Moiseenkov wanted to use AI to create a pipeline that would enable complex tasks on the phone, which led them to start Mirai. When they asked other consumer app developers, they heard that many wanted better cost optimization and margin per token usage, too.

Today, Mirai is developing a framework that helps models perform better on devices. The company has built an inference engine for Apple Silicon that optimizes on-device throughput. With its upcoming SDK, developers can integrate the runtime in their apps with only a few lines of code, the company says.
“One of the visions why we started the company was that we wanted to give developers, like this Stripe-like, eight lines of code [integration] experience…you basically go to our platform, integrate the key, and start working with summarization, classification, or whatever your use case is,” Shvets said.

The startup built this engine in Rust, and claims it can increase a model’s generation speed by up to 37%. The company said that, while tuning a model for a platform, it doesn’t modify the model weights, ensuring there is no loss in output quality.

Mirai’s stack currently focuses on improving text and voice modalities, with plans to support vision in the future. The team has started to work with frontier model providers to tune their models for edge use and is in talks with different chipmakers. Later, it plans to bring its engine to Android, too. In addition, Mirai aims to release on-device benchmarks so model makers can test on-device performance.

Shvets recognizes that not all AI work can be done on-device, though. To enable a mixed mode of operation, the team is building an orchestration layer that sends requests that can’t be fulfilled on the device up to the cloud. While the startup is not directly working with apps just yet, its engine could power on-device assistants, transcribers, translators, and chat apps, we’re told.

Andy McLoughlin, managing partner at Uncork Capital, noted that he invested in an edge machine learning company in the last decade; that company was early to the market and eventually sold its business to Spotify. Today, he thinks, the situation is different. “Given the cost of cloud inference, something has to change… For now, VCs are happy to continue funding the rocketship companies, spending inordinate sums on cloud inference. But that won’t last — at some point, people will focus on the underlying economics of these businesses and realize that something has to change,” he said.
“It feels like every model maker will want to run part of their inference workloads at the edge, and Mirai feels very well positioned to capture this demand.”

Mirai’s seed round also saw participation from individuals, including Dreamer CEO David Singleton, YC partner Francois Chaubard, Snowflake co-founder Marcin Żukowski, ElevenLabs co-founder Mati Staniszewski, former Google AdSense product manager and Coinbase board member Gokul Rajaram, Groq investor Scooter Braun, Turing.com CTO Vijay Krishnan, Theory Forge Ventures’ Ben Parr and Matt Schlicht, and ex-Netflix technical leader Aditya Jami.
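The device-first routing Mirai describes can be sketched roughly as follows. This is a hypothetical illustration only: Mirai has not published its SDK, and every name here (`run_on_device`, `run_in_cloud`, `orchestrate`, the 512-token cutoff) is invented for the example.

```python
from typing import Optional

# Hypothetical sketch of an on-device-first orchestration layer.
# All function names and the capacity check are invented for
# illustration; they do not reflect Mirai's actual (unreleased) SDK.

def run_on_device(prompt: str, max_tokens: int = 512) -> Optional[str]:
    """Stand-in for local inference: succeeds only for small requests."""
    if len(prompt.split()) <= max_tokens:
        return f"[device] {prompt[:40]}"
    return None  # request exceeds what the device can serve

def run_in_cloud(prompt: str) -> str:
    """Stand-in for the cloud fallback path."""
    return f"[cloud] {prompt[:40]}"

def orchestrate(prompt: str) -> str:
    """Try the device first; escalate to the cloud only on failure."""
    local = run_on_device(prompt)
    return local if local is not None else run_in_cloud(prompt)
```

The appeal of this design is economic as much as technical: small, latency-sensitive requests stay local and free, while cloud spend is reserved for requests that genuinely exceed device capacity.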


Latest AI News

WhatsApp Introduces Incognito Chat With Meta AI for Private Conversations


WhatsApp has announced Incognito Chat with Meta AI, a new feature that lets users have private, temporary conversations with the company's AI assistant. The feature is aimed at people who want to ask sensitive questions involving personal matters, finances, health, or work without keeping a lasting record of the interaction. Incognito Chat is built on Meta's Private Processing technology and will begin rolling out to WhatsApp and the Meta AI app over the next few months.

24 minutes ago


Poppy debuts a proactive AI assistant to help organize your digital life


Smartphones can be distracting with their dizzying array of apps and constant stream of notifications. A new app called Poppy aims to organize the chaos by combining your calendar, email, messages, and other sources into a single dashboard. The idea, per the company’s website, is that “Poppy pays attention so you don’t have to.”

Users can connect various services to Poppy’s app, like their email, calendar, and, at a minimum, their location. Poppy then uses that data along with AI to guess what’s important to you right now based on what’s going on in your life. At a high level, this means you can open Poppy’s app or glance at its widgets to see the meetings or tasks you have on your plate.

But Poppy’s most powerful feature is likely its proactive suggestions. For instance, if Poppy has access to your calendar and sees that you have a 30-minute gap while you’re near a park, it could suggest you take a break and go for a walk before your next appointment. And if you’re planning a brunch with a friend who mentioned their food preferences in a previous communication, it could factor in that information when suggesting restaurants. You can also message Poppy with questions or requests, almost as if you had a personal assistant working on your behalf. Poppy can track your flights and alert you to changes, or nudge you when it’s time to take your medication.

Poppy’s maker, Sai Kambampati, says he’s always been fascinated by human-computer interaction, having earned his Master’s degree in Computer Science with a specialization in this area. Previously a software engineer at the AI hardware startup Humane, he said he has seen first-hand how people are trying to rethink how we engage with technology. "I've always been interested in challenging what computers are able to do, especially the idea of ambient computing and computers that can proactively sense what you need and anticipate your needs," Kambampati told TechCrunch. "That's something that I found very, very exciting. And I felt like with all the AI technology that we're seeing around us, it has never been more possible to embark on something like this."

At launch, Poppy works with everyday apps like Apple Calendar, Google Calendar, Gmail, Outlook, iCloud Mail, Apple Health, Reminders, Contacts, iMessage, WhatsApp, and others. (It uses a Mac app to access iMessage, which could later be a problem, as Apple generally doesn't allow third-party apps to access its messaging service.) It also works with apps like Uber and Instacart, and Kambampati plans to extend support to others over time.

The company says users' data is encrypted when stored in its database, and it has a zero-retention policy enabled when it uses cloud-based LLMs for its suggestions. In time, however, Kambampati would like to make the switch to local, on-device AI models as the technology advances. "My hope, my dream is — within two to three years from now, when our devices have much more powerful compute, and the models get much smaller, cheaper and more high quality — eventually we can have all of this running on our own devices, and there won't even be a need to hit the servers," he says.

Poppy's San Francisco-based team of four is backed by $1.25 million in pre-seed funding led by Kindred Ventures, with various angels also participating, including DeepMind's Logan Kilpatrick.

24 minutes ago


Anthropic now has more business customers than OpenAI, according to Ramp data


For the first time, Anthropic has more verified business customers than OpenAI, according to this month’s AI Index from the fintech firm Ramp. The survey, compiled from Ramp’s clients’ expense data, shows 34.4% of participating businesses are paying for Anthropic services, more than any other AI lab, while only 32.3% pay for OpenAI. It is the first time Anthropic has held the top position.

“Anthropic has already been in the lead amongst the high-adoption groups like finance, tech, professional services,” Ramp economist Ara Kharazian told TechCrunch. “It’s across the other firms where OpenAI still has a lead, but that has been shrinking over the past couple of months.”

Because the index only represents companies that use Ramp, it’s not a perfect proxy for the marketplace at large. Still, the sample includes more than 50,000 companies, making it both broad and diverse enough to carry weight. More importantly, the general trend can be seen across the industry: on OpenRouter’s leaderboard, which samples a different portion of users, OpenAI last ranked above Anthropic in December 2025.

According to Ramp’s figures, the past 12 months have been particularly transformative for Anthropic. In May 2025, a mere 9% of businesses were paying for Anthropic products, a figure that climbed by 26 percentage points over the following 12 months. Over the same period, OpenAI’s share declined by one percentage point, and the overall share of businesses using some kind of AI product increased by nine percentage points.

Kharazian is skeptical about whether this advantage will last, for reasons he explained in a blog post, but said the success of the past year was proof that Anthropic had chosen a good strategy. “What Anthropic did worked really well,” Kharazian told TechCrunch, “which was — start with a very technical customer base, focus on their needs, really succeed in execution and then start broadening out through tools like Cowork.”

24 minutes ago


WhatsApp adds an incognito mode in Meta AI chats


Meta on Wednesday said it is adding the ability to start “incognito” conversations with its Meta AI chatbot within WhatsApp. These conversations, the company said, will be processed in a secure environment and can’t be seen by anyone. Users can start an incognito session by tapping a new icon in one-on-one chats with Meta AI. The feature will also be available on the standalone Meta AI app. Incognito chats will roll out to WhatsApp and the Meta AI app over the next few months.

Meta said these incognito conversations are not saved, and messages will disappear by default once you close the chat. The session will also end if you close the app or lock your phone, and Meta AI will lose the context of that particular conversation, the company said.

“People are starting to use AI for everything, including some of their most private thoughts, whether that’s tackling financial or health questions, or for advice on how to respond to a tricky message from a friend or a colleague. We think it’s really important to give people the ability to ask these questions as privately as possible,” Alice Newton-Rex, VP of Product at WhatsApp, told TechCrunch over a call.

The company has been laying the groundwork for secure AI chats on WhatsApp for a while now. Last year, it detailed its private processing infrastructure that would let it build AI features without breaking end-to-end encryption. Since then, WhatsApp has added features like AI-powered summaries of messages that use this architecture. Newton-Rex said Meta used smaller models to power its previous features, but the new incognito chat uses its latest Muse Spark model, which was released last month.

The company is already working on its next feature that taps its private processing infra. Called Side Chat, it will let users invoke Meta AI within chats to ask questions and get answers privately, without notifying or showing it to other people in the chat.
Currently, you need to tag a message and ask a question to the AI assistant to get an answer that other participants in the chat can see; if you need to ask a question privately, you have to paste the text into a separate chat window. ChatGPT and Claude offer incognito modes, too, and companies like DuckDuckGo and Proton have launched their own privacy-first chatbots. Meta’s move toward private AI chats comes at a key time: last month, Reuters cited lawyers who opined that users’ conversations with an AI chatbot could be used against them in litigation.

24 minutes ago

