📌 MAROKO133 Update ai: New spinning bioreactor turns lab-grown cells into factories
Inside our cells are microscopic bubbles called extracellular vesicles (EVs). These particles carry proteins and compounds that help cells heal and communicate with each other.
Scientists see them as natural delivery vehicles that could carry medicines to damaged tissues, even to parts of the body that are hard to reach.
The potential is immense, but producing EVs in large amounts has been a roadblock. Current methods yield too few to make treatments practical, keeping costs high and limiting patient access.
Breakthrough in production
Researchers at the FAMU-FSU College of Engineering have developed a scalable way to produce EVs using vertical-wheel bioreactors. These devices rotate gently to mimic the flow of blood.
By doing so, they stimulate lab-grown blood vessel cells to release more EVs without compromising their quality.
“Imagine if we could harvest microscopic delivery trucks from lab-grown human tissues to carry healing molecules directly to damaged cells in our bodies,” said Professor Yan Li from the Department of Chemical and Biomedical Engineering. “That’s essentially what we have accomplished in our investigation.”
The method boosted EV production two to three times compared to traditional systems, where cells sit in static chambers.
The researchers compared it to running a factory at peak efficiency.
“Think of it like the difference between a factory running at normal capacity versus one operating at peak efficiency under optimized conditions,” Li said. “Essentially, the gentle spinning motion enhances both the quantity of these essential vesicles and the overall health of the artificial blood vessels.”
Promise for future therapies
Tests confirmed that the EVs produced through this process retained their healing potential.
They reduced damage from aging and supported cell growth, signs that they could still aid tissue repair when scaled up.
The findings address a key obstacle to bringing EV therapies into medical use.
By creating a system that is both scalable and reliable, the researchers opened the door for treatments that could one day be affordable enough for widespread adoption.
“I hope that the research on EVs increases because of our study,” said Justice Ene, a graduate student researcher and study co-author. “In the future, we need to explore the composition of therapeutic cargo and learn how well the research translates to safely being produced at a large scale. There are still many questions, but it’s a step in the right direction.”
The advance could make experimental therapies for age-related diseases and tissue damage more realistic options for patients, not just laboratory prototypes.
The study, published in Stem Cell Research & Therapy, involved collaborators from FAMU-FSU College of Engineering, Florida State University, and PBS Biotech.
Funding came from the National Science Foundation and the National Institutes of Health.
🔗 Source: interestingengineering.com
📌 MAROKO133 Breaking ai: Self-Driving Teslas Keep Driving Into the Path of Oncoming Trains
With numerous accidents and countless close calls to its name, Tesla’s Full Self-Driving software has a problem with more than occasionally going off the rails.
Or sometimes, going onto them.
As NBC News reports, numerous Tesla drivers have warned that their cars running the self-driving mode are going haywire near railroad crossings, failing to stop even when a train was barreling past right in front of them. Most of them were able to manually intervene and slam the brakes in time, but that hasn’t always been the case.
“If it’s having trouble stopping at rail crossings, it’s an accident waiting to happen,” Phil Koopman, an associate professor emeritus of engineering at Carnegie Mellon, told NBC. “It’s just a matter of which driver gets caught at the wrong time.”
In all, NBC interviewed six drivers who said their Full Self-Driving cars glitched out at railroad crossings. Another seven who posted videos of their incidents online declined to be interviewed. On top of that, the outlet also found 40 examples of owners complaining about similar mishaps on sites like Reddit and X-formerly-Twitter.
That suggests an alarming pattern too frequent to be written off as freak accidents or unfortunate flukes. It has happened often enough that the National Highway Traffic Safety Administration has approached Tesla about the issue, the agency told NBC.
“We are aware of the incidents and have been in communication with the manufacturer,” the NHTSA said in a statement, per NBC News.
One Tesla owner, Italo Frigoli, showed that the software failure was easily replicable. When NBC accompanied Frigoli to the same crossing where his car had nearly plowed into a train over a month earlier, FSD once again failed to recognize an oncoming train, forcing him to slam the brakes.
In a video shared by another Tesla owner, the car stops before a flashing railroad crossing. As the barrier arms lower, a traffic light at an intersection ahead turns green. The Tesla suddenly accelerates toward the tracks, nearly getting caught under the arm before the driver slams the brakes. Seconds later, a double-decker train blasts through.
“FSD tries to kill me,” reads a caption in the video. “Tesla FSD Team: please fix this highly repeatable error.”
In a similar incident, Jared Cleaver, a construction project manager in Oakland, California, said his 2021 Tesla Model 3 initially seemed to recognize a railroad crossing — but then didn’t.
“The car came to a complete stop, and I was like, ‘OK, we’re good,’” Cleaver told NBC. “And then the car just jumped forward like it was going to go,” he said, forcing him to take over.
“It’s kind of crazy that it hasn’t been addressed,” Cleaver added.
Sometimes the drivers don’t put their foot down in time. In July, a Tesla in Full Self-Driving mode drove onto a double-track crossing and got stuck. The family inside ditched the vehicle, and minutes later a freight train clipped the car’s side. (Luckily for the owners, the worst damage was a broken mirror.)
Despite its name, Full Self-Driving isn’t fully autonomous. On the Society of Automotive Engineers’ scale of 0 through 5, where Level 5 means fully autonomous in all conditions and locations, Tesla’s software is only considered Level 2, meaning the cars can’t drive themselves without constant human supervision.
In short, saying it’s “self-driving” — let alone full self-driving — is a stretch. The California DMV sued Tesla for false advertising because of FSD’s name. Last year, the company subtly modified it to “Full Self-Driving (Supervised).”
The spurious branding is supercharged by Musk’s cult of personality. The billionaire has promised every year for over a decade that Tesla is on the verge of achieving fully autonomous driving, while exaggerating the capabilities of his driving software.
“Teslas can drive themselves!” Musk tweeted in August, a claim he has often repeated.
When Tesla expanded its FSD Beta program in 2022, he asserted that there hadn’t been a single accident or injury since the software’s launch, even though NHTSA data at the time showed there had been at least eight crashes. All the while, Musk sticks to the tech’s raison d’être: that FSD is safer than a human driver.
That the cars struggle to recognize such a common obstacle should raise serious questions about the tech’s capabilities, as should its numerous deadly accidents. Tesla was recently ordered to pay $329 million in damages after a jury found it partially responsible for the death of a young woman, killed when a car running Autopilot blew through an intersection at 60 miles per hour and struck a vehicle she was standing next to.
But why the software struggles with railroads in particular is unclear, not least because the tech is a black box with little public insight into its inner workings.
Koopman, the Carnegie Mellon professor, speculated that Tesla engineers hadn’t used enough railroad crossing footage when training the FSD software.
“The only possible explanation is that it is not sufficiently trained on that scenario,” Koopman told NBC.
It’s a plausible theory, especially since railroad crossings don’t all look the same. Some have no gates or flashing lights, only signs and road markings. On the other hand, there are clear video examples of Teslas barreling straight into fully equipped crossings, lights and boom barriers and all, even as a train is moving through. Where’s the excuse there?
Piling the pressure on Tesla, it seems that its competitor Waymo doesn’t have the same issue with its robotaxis — or at least, not at the same scale. The NBC investigation didn’t find examples of customers complaining about Waymos running into trouble around railroad crossings. A representative from Waymo told the outlet the company uses audio receivers to detect train sounds, and has model train crossings at its robotaxi training facility.
In any case, the repeated failures are leaving some Tesla owners pretty disillusioned.
“They seem to make a habit out of making these really big claims and then falling short. It bothers me,” Cleaver told NBC. “It seems like borderline false advertising.”
More on Tesla: Tesla Engineer Quits, Roasts Elon Musk in Spectacular Fashion
Content shortened automatically.
🔗 Source: futurism.com
🤖 MAROKO133 Note
This article is an automated summary drawn from several trusted sources. We pick trending topics so you always stay up to date.
✅ Next update in 30 minutes, with a random topic waiting!