MAROKO133 Update ai: Railway secures $100 million to challenge AWS with AI-native cloud

Railway, a San Francisco-based cloud platform that has quietly amassed two million developers without spending a dollar on marketing, announced Thursday that it raised $100 million in a Series B funding round, as surging demand for artificial intelligence applications exposes the limitations of legacy cloud infrastructure.

TQ Ventures led the round, with participation from FPV Ventures, Redpoint, and Unusual Ventures. The investment values Railway as one of the most significant infrastructure startups to emerge during the AI boom, capitalizing on developer frustration with the complexity and cost of traditional platforms like Amazon Web Services and Google Cloud.

"As AI models get better at writing code, more and more people are asking the age-old question: where, and how, do I run my applications?" said Jake Cooper, Railway's 28-year-old founder and chief executive, in an exclusive interview with VentureBeat. "The last generation of cloud primitives were slow and outdated, and now with AI moving everything faster, teams simply can't keep up."

The funding is a dramatic acceleration for a company that has charted an unconventional path through the cloud computing industry. Railway raised just $24 million in total before this round, including a $20 million Series A from Redpoint in 2022. The company now processes more than 10 million deployments monthly and handles over one trillion requests through its edge network — metrics that rival far larger and better-funded competitors.

Why three-minute deploy times have become unacceptable in the age of AI coding assistants

Railway's pitch rests on a simple observation: the tools developers use to deploy and manage software were designed for a slower era. A standard build-and-deploy cycle using Terraform, the industry-standard infrastructure tool, takes two to three minutes. That delay, once tolerable, has become a critical bottleneck as AI coding assistants like Claude, ChatGPT, and Cursor can generate working code in seconds.

"When godly intelligence is on tap and can solve any problem in three seconds, those amalgamations of systems become bottlenecks," Cooper told VentureBeat. "What was really cool for humans to deploy in 10 seconds or less is now table stakes for agents."

The company claims its platform delivers deployments in under one second — fast enough to keep pace with AI-generated code. Customers report a tenfold increase in developer velocity and up to 65 percent cost savings compared to traditional cloud providers.

These numbers come directly from enterprise clients, not internal benchmarks. Daniel Lobaton, chief technology officer at G2X, a platform serving 100,000 federal contractors, measured deployments running seven times faster and an 87 percent cost reduction after migrating to Railway. His infrastructure bill dropped from $15,000 per month to approximately $1,000.

"The work that used to take me a week on our previous infrastructure, I can do in Railway in like a day," Lobaton said. "If I want to spin up a new service and test different architectures, it would take so long on our old setup. In Railway I can launch six services in two minutes."

Inside the controversial decision to abandon Google Cloud and build data centers from scratch

What distinguishes Railway from competitors like Render and Fly.io is the depth of its vertical integration. In 2024, the company made the unusual decision to abandon Google Cloud entirely and build its own data centers, a move that echoes the famous Alan Kay maxim: "People who are really serious about software should make their own hardware."

"We wanted to design hardware in a way where we could build a differentiated experience," Cooper said. "Having full control over the network, compute, and storage layers lets us do really fast build and deploy loops, the kind that allows us to move at 'agentic speed' while staying 100 percent the smoothest ride in town."

The approach paid dividends during recent widespread outages that affected major cloud providers — Railway remained online throughout.

This soup-to-nuts control enables pricing that undercuts the hyperscalers by roughly 50 percent and newer cloud startups by three to four times. Railway charges by the second for actual compute usage: $0.00000386 per gigabyte-second of memory, $0.00000772 per vCPU-second, and $0.00000006 per gigabyte-second of storage. There are no charges for idle virtual machines — a stark contrast to the traditional cloud model where customers pay for provisioned capacity whether they use it or not.
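Taken at face value, those per-second rates make the billing model easy to sanity-check. A minimal sketch (the rates are the ones quoted above; the workload size is a hypothetical example, not a real customer's):

```python
# Back-of-envelope cost check using the per-second rates quoted in the article.
MEM_PER_GB_SECOND = 0.00000386      # $ per GB of memory per second
VCPU_PER_SECOND = 0.00000772        # $ per vCPU per second
STORAGE_PER_GB_SECOND = 0.00000006  # $ per GB of storage per second

def monthly_cost(mem_gb: float, vcpus: float, storage_gb: float,
                 seconds: int = 30 * 24 * 3600) -> float:
    """Cost of running a service continuously for `seconds` (default: 30 days)."""
    return seconds * (mem_gb * MEM_PER_GB_SECOND
                      + vcpus * VCPU_PER_SECOND
                      + storage_gb * STORAGE_PER_GB_SECOND)

# A hypothetical always-on service with 1 GB RAM, 1 vCPU, and 5 GB of storage:
print(f"${monthly_cost(1, 1, 5):.2f}/month")  # → $30.79/month
```

The key point of the model is the default of zero: a service that runs for no seconds bills nothing, whereas a provisioned VM bills whether or not it does any work.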

"The conventional wisdom is that the big guys have economies of scale to offer better pricing," Cooper noted. "But when they're charging for VMs that usually sit idle in the cloud, and we've purpose-built everything to fit much more density on these machines, you have a big opportunity."

How 30 employees built a platform generating tens of millions in annual revenue

Railway has achieved its scale with a team of just 30 employees generating tens of millions in annual revenue — a ratio of revenue per employee that would be exceptional even for established software companies. The company grew revenue 3.5 times last year and continues to expand at 15 percent month-over-month.

Cooper emphasized that the fundraise was strategic rather than necessary. "We're default alive; there's no reason for us to raise money," he said. "We raised because we see a massive opportunity to accelerate, not because we needed to survive."

The company hired its first salesperson only last year and employs just two solutions engineers. Nearly all of Railway's two million users discovered the platform through word of mouth — developers telling other developers about a tool that actually works.

"We basically did the standard engineering thing: if you build it, they will come," Cooper recalled. "And to some degree, they came."

From side projects to Fortune 500 deployments: Railway's unlikely corporate expansion

Despite its grassroots developer community, Railway has made significant inroads into large organizations. The company claims that 31 percent of Fortune 500 companies now use its platform, though deployments range from company-wide infrastructure to individual team projects.

Notable customers include Bilt, the loyalty program company; Intuit's GoCo subsidiary; TripAdvisor's Cruise Critic; and MGM Resorts. Kernel, a Y Combinator-backed startup providing AI infrastructure to over 1,000 companies, runs its entire customer-facing system on Railway for $444 per month.

"At my previous company Clever, which sold …

Content shortened automatically.

🔗 Source: venturebeat.com


📌 MAROKO133 Breaking ai: Top Machine Learning Developer Speechless at Simple Question

The question that’s stumping top AI researchers isn’t about consciousness or doomsday scenarios. After interviewing dozens of developers at companies including OpenAI, Anthropic, and Meta, Amelia Miller found it was this: should AI “simulate emotional intimacy?”

One chatty researcher at one of the top AI labs “suddenly went quiet,” recalled Miller, who studies AI-human relations, in an essay for The New York Times — and then, tellingly, offered up a halting non-answer.

“I mean… I don’t know. It’s tricky. It’s an interesting question,” the researcher said, before pausing. “It’s hard for me to say whether it’s good or bad in terms of how that’s going to affect people. It’s obviously going to create confusion.”

Though many waffled on answering the question directly, some were adamant about not using AI as an intimacy tool themselves, a clear sign that they are aware of the tech's profound risks.

“Zero percent of my emotional needs are met by A.I.,” an executive who heads a top AI safety lab told Miller.

“That would be a dark day,” said another researcher who develops “cutting-edge capabilities for artificial emotion,” according to Miller.

The conflicted responses from the developers reflect growing concern over AI's ability to act as companions or otherwise fulfill human emotional needs. Because the chatbots are designed to be engaging, they can produce sycophantic replies to even the most extreme user statements. They can act as emotional echo chambers and fuel paranoid thinking, leading some down delusional mental health spirals that blow up their relationships with friends, families, and spouses, ruin their professional lives, and even culminate in suicide.

ChatGPT has been blamed for the deaths of several teens who confided in the AI and discussed their plans for taking their own lives. Many young people are engaging in romantic relationships with AI models. Unlike a human companion, an AI one can lend an ear at any time, won't judge you, and may not even question you. A founder of an AI chatbot business quipped to the NYT that AI's role as an emotional companion turns every relationship into a "throuple."

"We're all polyamorous now," he added. "It's you, me and the AI."

And safety isn’t the only factor in the calculus of AI developers.

“They’re here to make money,” said an engineer who’s worked at several tech companies. “It’s a business at the end of the day.”

The most sweeping solution would be to design the bots so they abstain from tricky questions and conversations, and act more like the machines they are instead of imitating human personalities. But this would undoubtedly affect how engaging the tools are. The developers "support guardrails in theory," Miller wrote, "but don't want to compromise the product experience in practice." Some think that how people choose to use their tools isn't their responsibility at all, a stance that shields the technology from any judgment. "It would be very arrogant to say companions are bad," an executive of a conversational AI startup told Miller.

However they may choose to justify their work, it’s clear that some, if not most, AI researchers are aware of the harm that their products can cause, a fact that “should alarm us,” Miller opined. She argues that this is partly a consequence of the researchers not being challenged enough. One thanked her for her perspective: “You’ve really made me start to think,” a developer of AI companions said. “Sometimes you can just put the blinders on and work. And I’m not really, fully thinking, you know.”

The post Top Machine Learning Developer Speechless at Simple Question: Should AI Simulate Emotional Intimacy? appeared first on Futurism.

🔗 Sumber: futurism.com


🤖 MAROKO133 Note

This article is an automatic summary of several trusted sources. We pick trending topics so you stay up to date without falling behind.

✅ Next update in 30 minutes: a random topic awaits!

Author: timuna