Latest AI News

Intel Shares Rise by 4% After Joining Tesla’s Terafab Chip Venture
The partnership with Tesla, SpaceX, and xAI positions Intel at the center of a $25 billion semiconductor project.

AWS Summit Bengaluru Returns with a Focus on AI, Data & Cloud
The two-day event will feature keynote addresses, breakout sessions, hands-on labs, and an expo floor showcasing AWS technologies and partner solutions.

The New Teddy Bear: Children Love AI Toy Robots, But Do They Have a Hidden Cost?
AI robots are transforming childhood experiences, but psychologists and policy experts caution against overdependence and data risks.

Anthropic Unveils Claude Mythos, Says Model Too Dangerous to Launch
Anthropic said the model has already identified “thousands of high-severity vulnerabilities,” including flaws in major operating systems and web browsers.

Uber to Use AWS Chips for Trip Processing and AI Training
Uber said its AI models are trained on data from billions of trips and deliveries.

I can’t help rooting for tiny open source AI model maker Arcee
Arcee, a tiny 26-person U.S. startup that built a massive, 400B-parameter open source LLM on a $20 million shoestring budget, has released its new reasoning model. Arcee calls the model Trinity Large Thinking — and it’s the most capable open-weight model “ever released by a non-Chinese company,” CEO Mark McQuade claims to TechCrunch.

As that comment implies, Arcee has a goal that I can’t help but root for: it wants to give U.S. and Western companies a model that gives them no reason to use a Chinese-based one. While Chinese models are extremely capable, they are perceived as risky, putting power, and perhaps data, into the hands of a government that doesn’t share all of the Western world’s ideals. With Arcee, companies can download the model, train it to their own needs, and use it on premises. Companies can also use Arcee’s cloud-hosted version, accessible via API.

While Arcee’s models are not outperforming the closed source models from big labs like Anthropic or OpenAI, they’re not being held hostage by the whims of those giants, either. For instance, Claude, with its exceptional coding abilities, has been a popular choice for users of the open source AI agent tool OpenClaw. But Anthropic pulled the rug out from under them last week when it told users that their Anthropic subscriptions will no longer cover OpenClaw usage — they will have to pay additionally for that. (In February, OpenClaw creator Peter Steinberger said he was joining Anthropic’s biggest rival, OpenAI.) In contrast, McQuade proudly points to data from OpenRouter that says Trinity has become one of the top models used with OpenClaw.

So, how good is Trinity Large Thinking? It is comparable to some of the other top open source models, according to the benchmark results Arcee shared with TechCrunch. As we previously reported, it is not a head-to-head threat to the big cheese among U.S.-built open models: Meta’s Llama 4. But it also doesn’t have the odd, not-really-open-source license issues of Meta’s model.
All of Arcee’s Trinity models are released under the gold standard for OS licenses, Apache 2.0. Just to be clear, there are also countless other U.S. startups offering open source models and, as a fan of the ingenuity of startups, I’m rooting for them, too.

Uber is the latest to be won over by Amazon’s AI chips
On Tuesday, Amazon announced that Uber was expanding its contract for AWS cloud services to run more of its ride-sharing features on Amazon’s chips. Uber will particularly expand its use of AWS’s Graviton (a low-power, Arm-based server CPU) and start a new trial testing Trainium3, AWS’s Nvidia-competitor AI chip.

This deal is a bit less about a long-term threat to Nvidia than it is a thorough thumbing of the nose by Amazon at AWS’s cloud competitors, Google and Oracle. While Uber historically ran its own data centers, back in 2023 the ride-hailing company famously signed giant, multi-year cloud computing deals with Oracle and Google. The idea was to move the majority of its IT infrastructure off its own data centers and onto these two clouds, it said. Even in December, Uber publicly reiterated that goal, writing in a blog post: “In February 2023, Uber began transitioning from on-premise data centers to the cloud using OCI and Google Cloud Platform, taking on the dual challenge of shifting massive workloads and introducing Arm-powered compute instances into a previously x86-dominated environment.” Uber particularly called out in that post the use of the Arm chips made by Ampere in Oracle’s cloud.

This is where things get interesting. If you want a crash course in how entangled Silicon Valley can be, take a look at the history of Ampere. Ampere was founded by former Intel bigwig Renee James after she was not promoted to CEO at the chipmaker. She pulled all her strings, including her power at her then-job as an investor at private equity firm Carlyle and her board seat at Oracle, to raise the cash to start the company. Oracle owned about one-third of Ampere, and James had to give up her status as an independent Oracle director because of that investment. (James was, by the way, a key board person who helped vote in Oracle’s $9.3 billion purchase of NetSuite in 2016, a company where Larry Ellison was a major stockholder. That deal sparked an unsuccessful shareholder lawsuit alleging Oracle overpaid for it.)

In December, SoftBank acquired Ampere, and Oracle sold its stake for a handsome $2.7 billion pre-tax gain. James left Oracle’s board at the end of 2024 and is no longer working at Ampere. Oracle is raising money as fast as it can to build data centers for OpenAI and Stargate. Ellison said Oracle sold Ampere because he believed designing chips in-house for its data centers was no longer a competitive advantage; it prefers to buy the chips and has signed massive deals with Nvidia. It’s worth noting that Oracle, SoftBank, and Nvidia are also part of OpenAI’s orbit of circular deals that are supposed to fund the model maker’s massive data center build-out.

But now AWS is announcing it has nabbed a bigger contract from one of Oracle’s star customers, Uber, because it has in-house-designed chips. Uber joins Anthropic, OpenAI, and Apple as Big Tech companies that have signed on or increased their usage of AWS because of these AI chips. In December, Amazon CEO Andy Jassy said Trainium was already a multibillion-dollar business. (For a look at the team and lab that design these chips, check out our exclusive tour of the facility.)

Anthropic debuts preview of powerful new AI model Mythos in new cybersecurity initiative
Anthropic on Tuesday released a preview of its new frontier model, Mythos, which it says will be used by a small coterie of partner organizations for cybersecurity work. In a previously leaked memo, the AI startup called the model one of its “most powerful” yet.

The model’s limited debut is part of a new security initiative, dubbed Project Glasswing, in which 12 partner organizations will deploy the model for the purposes of “defensive security work” and to secure critical software, Anthropic said. While it was not specifically trained for cybersecurity work, the model will be used to scan both first-party and open source software systems for code vulnerabilities, the company said. Anthropic claims that, over the past few weeks, Mythos identified “thousands of zero-day vulnerabilities, many of them critical.” Many of the vulnerabilities are one to two decades old, the company added.

Mythos is a general-purpose model in Anthropic’s Claude AI lineup that the company claims has strong agentic coding and reasoning skills. Anthropic’s frontier models are considered its most sophisticated and high-performance models, designed for more complex tasks, including agent-building and coding. The partner organizations previewing Mythos as part of Project Glasswing include Amazon, Apple, Broadcom, Cisco, CrowdStrike, the Linux Foundation, Microsoft, and Palo Alto Networks. As part of the initiative, these partners will ultimately share what they’ve learned from using the model so that the rest of the tech industry can benefit. The preview is not going to be made generally available, Anthropic said, though 40 organizations in total will gain access to it.

Anthropic also claims that it has engaged in “ongoing discussions” with federal officials about the use of Mythos, although one has to imagine those discussions are complicated by the fact that Anthropic and the Trump administration are currently locked in a legal battle after the Pentagon labeled the AI lab a supply-chain risk over Anthropic’s refusal to allow autonomous targeting or surveillance of U.S. citizens.

News of Mythos was originally leaked in a data security incident reported last month by Fortune. A draft blog post about the model (then called “Capybara”) was left in an unsecured cache of documents available on a publicly inspectable data lake. The leak, which Anthropic subsequently attributed to “human error,” was originally spotted by security researchers. “‘Capybara’ is a new name for a new tier of model: larger and more intelligent than our Opus models — which were, until now, our most powerful,” the leaked document said, adding later that it was “by far the most powerful AI model we’ve ever developed,” according to the report. In the leak, Anthropic claimed that its new model far exceeded its currently public models in areas like “software coding, academic reasoning, and cybersecurity,” and that it could potentially pose a cybersecurity threat if weaponized by bad actors to find bugs and exploit them (rather than fix them, which is how Mythos will be deployed).

Last month, the company accidentally exposed nearly 2,000 source code files and over half a million lines of code via a mistake it made in the launch of version 2.1.88 of its Claude Code software package. The company then accidentally caused thousands of code repositories on GitHub to be taken down as it attempted to clean up the mess.

Correction April 7, 2026: An earlier version of this article erroneously stated how many partners are working with Anthropic on Project Glasswing. There are 12 partner organizations, though 40 organizations total will have access to the Mythos preview.

Intel signs on to Elon Musk’s Terafab chips project
Intel will join SpaceX and Tesla in an effort to build a new U.S. semiconductor factory in Texas, although the scope of its contribution is unclear. “Our ability to design, fabricate, and package ultra-high-performance chips at scale will help accelerate Terafab’s aim to produce 1 TW/year of compute to power future advances in AI and robotics,” Intel said in a corporate post on X. Intel hasn’t shared any more information.

Elon Musk announced in March a team-up between the two tech companies he leads to develop chips for AI compute, satellites, and SpaceX’s mooted space data center, and to support the possibility of autonomous Tesla vehicles and robots. However, building a chip fab is one of the most difficult and expensive corporate infrastructure projects out there, typically requiring years of time and more than $20 billion to create a facility with a huge clean room for thousands of ultra-precise machines to carve silicon. It wasn’t obvious how SpaceX and Tesla, two companies with no experience in the sector, could team up to execute the project efficiently. Now we have a better idea: Intel will do it.

The company has been hunting for large anchor customers to support its foundry business, and now it has two. Still, if investors thought that Terafab would be a greenfield project built on SpaceX’s and Tesla’s unique engineering culture, that may not play out. Once the leading U.S. silicon producer, Intel has seen rivals Nvidia and AMD take the lead in developing advanced processors and adopt the “fabless” business model, in which chip designers outsource the manufacturing of their semiconductors.

Intel’s stock rose more than 3% on the news today; by 2 p.m. ET it was trading at $52.28, about 2.9% higher than its opening bell price. Intel declined to comment on the partnership, while SpaceX didn’t respond to TechCrunch’s query.

Firmus, the ‘Southgate’ AI data center builder backed by Nvidia, hits $5.5B valuation
Asia-based AI data center provider Firmus on Monday announced a fresh $505 million raise led by Coatue at a $5.5 billion post-money valuation. With this round, Firmus has raised $1.35 billion in six months, it says. The Singapore-based data center company previously raised AU$330 million (approximately $215 million) at an AU$1.85 billion ($1.2 billion) valuation from investors including Nvidia.

Firmus is developing an energy-efficient “AI factory” network of data centers in Australia, including Tasmania, a project it dubs Project Southgate. It is using Nvidia’s reference designs for building these efficient data centers. The new data centers will use Nvidia’s Vera Rubin platform, the chip giant’s next-gen AI computing system succeeding its Blackwell architecture, expected to ship in the second half of 2026. Firmus originally provided cooling technologies for Bitcoin mining and has become yet another crypto-roots-turned-AI-provider company that investors love.

The AI gold rush is pulling private wealth into riskier, earlier bets
For decades, buying stock in a hot startup meant being allowed to invest in the funds run by the top VCs. But with the AI boom causing an investment frenzy, more family offices and private wealth are skipping the VC middlemen to get directly onto the cap table.

“Companies are staying private longer, and there are fewer IPOs now than we’ve seen historically,” Mitch Stein, founder of Arena Private Wealth, an investment advisory firm for high-net-worth individuals, told TechCrunch on a recent episode of Equity. “A lot of money is being made well before companies go public, and right now the private markets are dominated by a lot of these AI names. The family offices who are allocating [directly into AI startups] are right on.”

Arena recently co-led a $230 million round into AI chip startup Positron, an investment that earned the midwestern firm a board seat. Stein says that’s part of a deliberate shift away from being passive allocators and toward becoming “active participants in the capital markets.” The urgency among today’s family offices is real. “The world’s AI infrastructure is being built now, so you’re either going to get in early and have an opportunity to do more primary investing…and really build a portfolio, or you’re going to miss it and be taking random bets,” Ari Schottenstein, Arena’s head of alternatives, told TechCrunch. Stein put it more bluntly: “Your biggest risk is not having exposure to AI, not what could happen to your AI investments.”

The numbers reflect this sentiment. In February, family offices made 41 direct investments into startups, nearly all of them tied to AI. Among those are high-profile names like Laurene Powell Jobs’s Emerson Collective into World Labs, Azim Premji’s family office into Runway, and Eric Schmidt’s Hillspire into Goodfire. According to BNY Wealth research, 83% of family offices say AI is a top strategic priority over the next five years, and more than half have AI exposure through investments.

Some are going further still. A growing number of family offices are incubating their own AI companies, seeding the first several million, taking on operational roles, and deploying the same entrepreneurial instincts that built their wealth in the first place, according to Schottenstein. Jeff Bezos’s decision to serve as CEO of his own robotics company, which raised an initial $6.2 billion last year at a nearly $30 billion valuation, is a high-profile example of the model. On a smaller scale, Stein pointed to Tyson Tuttle, an Austin-based angel investor and former CEO of Silicon Labs, which agreed to be acquired by Texas Instruments for $7.5 billion. Tuttle co-founded Circuit, a startup using AI to improve manufacturing and distribution, raising a $30 million angel round that includes $5 million from his own family office.

Not everyone coming to the table has founded a company, though. Arena’s team comes from institutional finance, and they argue that rigorous due diligence is what earns them the right to lead rounds. “We take our time, we’re a very slow ‘yes,’ we say ‘no’ a lot,” Schottenstein said. “We definitely invest in the sources and experts and people necessary to make sure that a company is what it says it is and can do what it says it will do.” For the Positron deal, that meant working with third-party experts to validate the technology, but also reading the cap table itself as a signal: “If Arm is coming into a deal, we’d like to think your technology is real,” Schottenstein said. Arena also knew Oracle was a major customer, making Positron one of the only AI chip makers, other than Nvidia and AMD, with chips deployed into a hyperscaler.

That selectivity shapes how Arena participates once it’s in. Unlike a typical VC spreading risk across a portfolio, Arena makes a small handful of direct deals per year, which changes the stakes entirely. When they’re in, they’re all in; Positron is their one and only AI inference chip investment.
“When we participate in single asset direct deals and only do a small handful every year, our stakes are incredibly high,” Stein said. “We are not managing portfolio-level returns. We don’t model in failure on a single asset transaction. We are taking a tremendous amount of risk with concentrated client capital. We’re taking on reputational risk as a firm. We’re allocating a tremendous amount of time and resources. There’s an alignment there that founders appreciate.”

4 days left to save close to $500 on TechCrunch Disrupt 2026 passes
We’re down to 4 days left to save up to $482 on your TechCrunch Disrupt 2026 pass. These low rates will disappear on April 10 at 11:59 p.m. PT. If you’ve been mapping out your 2026 tech event calendar, this isn’t the moment to wait. Register now to lock in your savings before prices increase.

Each year, Disrupt brings together 10,000+ founders, tech leaders, and VCs at San Francisco’s Moscone West. From October 13–15, you’ll gain valuable takeaways and curated networking opportunities designed to elevate your startup trajectory, accelerate your career, or strengthen your portfolio. Last year, Disrupt featured 200+ on-stage conversations with 250+ top voices shaping the tech ecosystem. Expect the same level of powerful, candid conversations this year, with tracks ranging from AI and scaling to fintech and climate. Keep an eye on the event page as we roll out the 2026 agenda.

Last year, 20,000+ curated meetings took place over three days. Beyond that, thousands more connections happened organically through impromptu moments, roundtables, and conversations across a 10,000+ attendee Expo Hall. This year, we’re rolling out improved networking technology to make those connections even more targeted and efficient. Meet the one person who can change the trajectory of your startup. It only takes one.

Startup Battlefield 200 is where 200 TechCrunch-selected, early-stage startups compete on a global stage for $100,000 in equity-free funding, global visibility, and direct access to top investors. Hear directly from tier-one VCs as they share candid feedback on what makes a startup viable. Is your startup ready to join the battle? Apply here. Know a startup that belongs in this competition? Nominate here. This iconic pitch competition has helped launch breakout companies like Discord, Cloudflare, and Trello.

300+ startup exhibitors will showcase innovations across the venue, especially in the Expo Hall, where foot traffic converges. Discover tomorrow’s breakthroughs and today’s solutions, all in one place. Want to book a table for your startup? Get started here.

Throughout Disrupt Week, October 11–17, TechCrunch Disrupt Side Events will take place across the Bay Area beyond the main venue. Attend a post-event cocktail hour, grab breakfast before the day begins, or even host your own off-site panel. The opportunities to make powerful connections around Disrupt are endless.

Four days remain to lock in the lowest rate of the year. Prices increase after April 10 at 11:59 p.m. PT. Register now and secure your savings of up to $482 before they’re gone. Coming with a group? Save up to 30% on bundle passes.
