MAROKO133 Breaking ai: US firm to produce high-purity nuclear materials for next-gen fusion

πŸ“Œ MAROKO133 Hot ai: US firm to produce high-purity nuclear materials for next-gen fusion

US-based EnergyX has officially launched NUKE-it, a nuclear materials technology platform designed to tackle the critical supply chain shortages currently stalling the global race for fusion energy. 

This move establishes EnergyX as a domestic supplier of the specialized materials used in next-generation nuclear reactors.

β€œEnergyX has always been focused on developing technologies to produce advanced materials for the energy transition,” said Teague Egan, Founder and CEO of EnergyX.

Fueling the fusion breakthrough

While much of the energy industry remains focused on traditional fission, EnergyX is placing a heavy emphasis on the burgeoning fusion sector. 

The company is specifically developing 15% enriched Lithium-6 (Li-6), a critical element for tokamak fusion reactors, where it serves as the primary material for tritium breeding. 
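For context, the standard tritium-breeding reactions (from well-established nuclear data, not from the EnergyX announcement) show why enriched Li-6 is the workhorse isotope: its neutron capture releases energy and works with slow neutrons, while the Li-7 route absorbs energy and only works with fast neutrons.

```latex
\begin{align*}
  {}^{6}\mathrm{Li} + n &\rightarrow {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + 4.8\ \mathrm{MeV} && \text{(exothermic, thermal neutrons)} \\
  {}^{7}\mathrm{Li} + n &\rightarrow {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + n - 2.5\ \mathrm{MeV} && \text{(endothermic, fast neutrons only)}
\end{align*}
```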

β€œThis move fits neatly within our mission to help power the clean energy economy and secure domestic critical material supply chains,” added Egan.

Without long-term, stable access to enriched Li-6, the promise of “star power” on Earth remains out of reach. 

By supplying nuclear-grade lithium salts engineered for reactor performance, EnergyX aims to close a supply gap that currently leaves national laboratories and private fusion developers without the volumes of material needed for full-scale commercialization.

Supporting the AI power demand

This expansion into the nuclear sector is driven by the unprecedented energy requirements of the AI revolution. 

As tech giants and utilities scramble to supply enough electricity for data centers, large language model training, and the rise of physical AI such as humanoid robots, nuclear energy has become one of the few viable options for firm, carbon-free baseload power. 

EnergyX Founder and CEO Teague Egan noted that nuclear energy represents one of the most important opportunities of the century to deliver the “always-on” power that the modern economy demands.

Advanced refining and fission solutions

Beyond fusion, the NUKE-it platform is also producing 99.999% high-purity Lithium-7 (Li-7), which is essential for the coolant systems of fission-based thorium molten salt reactors (MSRs). 

These reactors depend on Li-7-based compounds like FLiBe and FLiNaK to achieve high thermal stability and low neutron absorption, yet traditional lithium suppliers have historically been unable to meet these stringent purity requirements. 
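The neutron-absorption point can be made concrete with approximate thermal-neutron capture cross-sections from standard nuclear data tables (background context, not figures from the EnergyX announcement): Li-6 is roughly four orders of magnitude more absorbing than Li-7, so even trace Li-6 contamination in a coolant salt soaks up neutrons and generates unwanted tritium.

```latex
\sigma_{a}\!\left({}^{6}\mathrm{Li}\right) \approx 940\ \text{barns}
\qquad \text{vs.} \qquad
\sigma_{a}\!\left({}^{7}\mathrm{Li}\right) \approx 0.045\ \text{barns}
\quad \text{(thermal neutrons)}
```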

To achieve this, EnergyX is adapting its proprietary GET-Lit suite of lithium extraction and material conversion technologies, which were originally developed for the electric vehicle battery market.

A secure domestic supply chain

The launch of the NUKE-it platform aligns with broader US national security priorities by strengthening domestic industrial resilience. 

In addition to its lithium isotope production, EnergyX is already working on innovations for uranium and thorium extraction, processing, and refining. 

β€œEnergyX is pursuing multiple provisional patents covering advanced purification and isotope separation pathways designed to reduce hazardous chemistry, improve consistency, and lower production costs,” the company said in a press release.

This holistic approach ensures that as the world transitions to a nuclear-backed clean energy economy, the underlying materials are sourced and refined through advanced, cost-effective domestic channels.

πŸ”— Source: interestingengineering.com


πŸ“Œ MAROKO133 Hot ai: Claude Code costs up to $200 a month. Goose does the same thing for free

The artificial intelligence coding revolution comes with a catch: it's expensive.

Claude Code, Anthropic's terminal-based AI agent that can write, debug, and deploy code autonomously, has captured the imagination of software developers worldwide. But its pricing β€” ranging from $20 to $200 per month depending on usage β€” has sparked a growing rebellion among the very programmers it aims to serve.

Now, a free alternative is gaining traction. Goose, an open-source AI agent developed by Block (the financial technology company formerly known as Square), offers nearly identical functionality to Claude Code but runs entirely on a user's local machine. No subscription fees. No cloud dependency. No rate limits that reset every five hours.

"Your data stays with you, period," said Parth Sareen, a software engineer who demonstrated the tool during a recent livestream. The comment captures the core appeal: Goose gives developers complete control over their AI-powered workflow, including the ability to work offline β€” even on an airplane.

The project has exploded in popularity. Goose now boasts more than 26,100 stars on GitHub, the code-sharing platform, with 362 contributors and 102 releases since its launch. The latest version, 1.20.1, shipped on January 19, 2026, reflecting a development pace that rivals commercial products.

For developers frustrated by Claude Code's pricing structure and usage caps, Goose represents something increasingly rare in the AI industry: a genuinely free, no-strings-attached option for serious work.

Anthropic's new rate limits spark a developer revolt

To understand why Goose matters, you need to understand the Claude Code pricing controversy.

Anthropic, the San Francisco artificial intelligence company founded by former OpenAI executives, offers Claude Code as part of its subscription tiers. The free plan provides no access whatsoever. The Pro plan, at $17 per month with annual billing (or $20 monthly), limits users to just 10 to 40 prompts every five hours β€” a constraint that serious developers exhaust within minutes of intensive work.

The Max plans, at $100 and $200 per month, offer more headroom: 50 to 200 prompts and 200 to 800 prompts respectively, plus access to Anthropic's most powerful model, Claude Opus 4.5. But even these premium tiers come with restrictions that have inflamed the developer community.

In late July, Anthropic announced new weekly rate limits. Under the system, Pro users receive 40 to 80 hours of Sonnet 4 usage per week. Max users at the $200 tier get 240 to 480 hours of Sonnet 4, plus 24 to 40 hours of Opus 4. Nearly five months later, the frustration has not subsided.

The problem? Those "hours" are not actual hours. They represent token-based limits that vary wildly depending on codebase size, conversation length, and the complexity of the code being processed. Independent analysis suggests the actual per-session limits translate to roughly 44,000 tokens for Pro users and 220,000 tokens for the $200 Max plan.
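As a rough illustration of why token-denominated budgets feel so different from "hours," here is a back-of-envelope sketch. It assumes the common heuristic of about four characters per token and uses the 44,000 and 220,000 figures quoted above, which are independent estimates rather than official Anthropic numbers.

```python
# Back-of-envelope estimate of how quickly a per-session token budget is consumed.
# Assumes the common ~4 characters-per-token heuristic; real tokenizers vary.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate for English text and source code."""
    return len(text) // 4

def prompts_per_session(budget_tokens: int, context_chars: int, reply_tokens: int = 1_000) -> int:
    """How many prompt/response turns fit in a budget if each turn re-sends
    roughly the same amount of code context."""
    per_turn = estimate_tokens("x" * context_chars) + reply_tokens
    return budget_tokens // per_turn

# Estimated per-session budgets quoted in the analysis above (not official figures).
PRO_BUDGET = 44_000
MAX_200_BUDGET = 220_000

# A single 40 KB source file re-sent as context on every turn.
print(prompts_per_session(PRO_BUDGET, context_chars=40_000))      # ~4 turns
print(prompts_per_session(MAX_200_BUDGET, context_chars=40_000))  # ~20 turns
```

On those assumptions, a developer iterating on even one mid-sized file burns through a Pro session in a handful of turns, which is why the published "hours" map so poorly onto real working time.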

"It's confusing and vague," one developer wrote in a widely shared analysis. "When they say '24-40 hours of Opus 4,' that doesn't really tell you anything useful about what you're actually getting."

The backlash on Reddit and developer forums has been fierce. Some users report hitting their daily limits within 30 minutes of intensive coding. Others have canceled their subscriptions entirely, calling the new restrictions "a joke" and "unusable for real work."

Anthropic has defended the changes, stating that the limits affect fewer than five percent of users and target people running Claude Code "continuously in the background, 24/7." But the company has not clarified whether that figure refers to five percent of Max subscribers or five percent of all users β€” a distinction that matters enormously.

How Block built a free AI coding agent that works offline

Goose takes a radically different approach to the same problem.

Built by Block, the payments company led by Jack Dorsey, Goose is what engineers call an "on-machine AI agent." Unlike Claude Code, which sends your queries to Anthropic's servers for processing, Goose can run entirely on your local computer using open-source language models that you download and control yourself.

The project's documentation describes it as going "beyond code suggestions" to "install, execute, edit, and test with any LLM." That last phrase β€” "any LLM" β€” is the key differentiator. Goose is model-agnostic by design.

You can connect Goose to Anthropic's Claude models if you have API access. You can use OpenAI's GPT-5 or Google's Gemini. You can route it through services like Groq or OpenRouter. Or β€” and this is where things get interesting β€” you can run it entirely locally using tools like Ollama, which let you download and execute open-source models on your own hardware.

The practical implications are significant. With a local setup, there are no subscription fees, no usage caps, no rate limits, and no concerns about your code being sent to external servers. Your conversations with the AI never leave your machine.
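For readers curious what "entirely local" looks like in practice, here is a minimal sketch of a prompt sent to a model served by Ollama on its default local port. The model name is only an example and must be pulled first; this illustrates the kind of local backend Goose can drive, not Goose's own code.

```python
# Minimal local-only LLM call via Ollama's HTTP API (no cloud, no subscription).
# Assumes Ollama is running locally and a model has been pulled, e.g.:
#   ollama pull qwen2.5-coder   (example model; swap in whatever you use)
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "qwen2.5-coder") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",   # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Works offline: the request never leaves your machine.
print(ask_local_model("Explain what a race condition is in two sentences."))
```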

"I use Ollama all the time on planes β€” it's a lot of fun!" Sareen noted during a demonstration, highlighting how local models free developers from the constraints of internet connectivity.

What Goose can do that traditional code assistants can't

Goose operates as a command-line tool or desktop application that can autonomously perform complex development tasks. It can build entire projects from scratch, write and execute code, debug failures, orchestrate workflows across multiple files, and interact with external APIs β€” all without constant human oversight.

The architecture relies on what the AI industry calls "tool calling," also known as function calling.
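In rough terms, the pattern works like the simplified sketch below: the model answers with structured data naming a tool and its arguments, the agent executes that tool, and the output is fed back into the next prompt. The tool names and JSON format here are invented for illustration and are not Goose's actual protocol.

```python
# Simplified sketch of an agentic "tool calling" loop: the model proposes a tool
# invocation as structured data, the agent executes it, and the result is fed
# back so the model can decide the next step. Tool names are illustrative only.
import json
import subprocess

def run_shell(command: str) -> str:
    """Example tool: run a shell command and return its combined output."""
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout + result.stderr

TOOLS = {"run_shell": run_shell}

def agent_step(model_reply: str) -> str:
    """If the model replied with a JSON tool call, execute it; otherwise pass it through."""
    try:
        call = json.loads(model_reply)   # e.g. {"tool": "run_shell", "args": {"command": "pytest -q"}}
    except json.JSONDecodeError:
        return model_reply               # plain-text answer, nothing to execute
    tool = TOOLS[call["tool"]]
    return tool(**call["args"])          # result goes back into the next model prompt

# One illustrative round-trip with a hard-coded "model" reply:
print(agent_step('{"tool": "run_shell", "args": {"command": "echo hello from a tool call"}}'))
```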

Content shortened automatically.

πŸ”— Source: venturebeat.com


πŸ€– MAROKO133 Note

This article is an automatic summary compiled from several trusted sources. We pick trending topics so you always stay up to date.

βœ… Next update in 30 minutes. A random topic awaits!

Author: timuna