

Good point, I’m assuming all monitors are as good as mine.
Fair point, but a lot of the article talks about how many studies aren’t meeting all four pillars of clinical trial design - that’s where my issue comes in: I think reporting that X% of trials don’t meet all four pillars is a bad metric.
And, not all medications these days are pills or IV infusions - some medications and treatments, which are governed by the FDA, are more invasive and more complicated.
The consent process for clinical trials has a ton of guidance (ICH GCP), but the onus is on the clinical monitors and hospitals to make sure it’s done correctly. Many trials now generate supporting documentation in which hospital staff are required to describe the circumstances in which consent was acquired. If the documents are generated, then it’s auditable.
Things get a bit hairy when you look at trials in Alzheimer’s and other cognitive disorders, because the patient may not be coherent enough to withdraw from the trial. In those cases, a legal guardian is responsible for the decision.
Unfortunately, this was an issue before Trump and will continue to be one afterwards. Assuming there even is an afterwards…
The article brings up some great points, some of which I, an industry insider, wasn’t even aware of, especially the historical context surrounding the AIDS epidemic. I’ll jump into the thread to critique an issue within the article.
One of the four pillars recommended by the FDA (control groups) is great in theory but can lead to very real problems in practice, specifically in indications with an unmet treatment need or exceptionally rare conditions.
If you have a disease that is 99% fatal but has no standard-of-care treatment options, is it ethical to ask a participant to enroll in a clinical trial and potentially not receive the study treatment/be on placebo? Or, what if the trial involves an incredibly invasive procedure like brain surgery - is it ethical to perform a placebo (sham) procedure on people? Food for thought - and an explanation for why so few trials meet all four criteria proposed by the FDA.
Happy to answer questions about the industry if anyone has them.
Pharmaceuticals in the US. Fairly early in my career, get paid just short of $100k/year. All it took was getting a doctorate and selling a little bit of my soul.
Sometimes I miss academic research. But at the end of the day I’m getting paid about 4x as much while working 1/2 the hours, by my estimate I’m 8x as happy now. Plus, there’s something to be said for working on projects that actually affect people’s lives instead of overstating the impacts of my research to compete for a dwindling pool of federal grants. Seeing the policy changes in the US this year, I’m very glad I left academia but I’m not convinced I’m 100% safe from changes made at the FDA.
My favorite AI fact is from cancer research. The New Yorker has a great article about how an algorithm used to identify and price out pastries at a Japanese bakery found surprising success as a cancer detector. https://www.newyorker.com/tech/annals-of-technology/the-pastry-ai-that-learned-to-fight-cancer
I kept a few recipes from a subscription I was gifted. Honestly, replacing the missing ingredients has been more fun than cooking the boxed meals.
Well shit. That makes a lot of sense.
No no, they listen. How do you think the “Hey Google” feature works? It has to listen for the key phrase. Might as well just listen to everything else.
I spent some time with a friend and his mother and spoke in Spanish for about two hours while YouTube was playing music. I had Spanish ads for 2 weeks after that.
The article describes the review process - you’re right, these words just flag a paper for further review. I wonder if it’s an automatic flagging system like you suggested.
However, it took me almost a decade of rigorous training to understand my research. I sure as hell don’t trust an elected or appointed official with a political vendetta to critically read my grants. Leave politics out of peer review.
This is still an emergency situation, IMHO. Like you said, people’s grants are being canceled. I see this as a direct attack against higher education.
ETA: It’s also a waste of taxpayer money. These grants are already competing for meager funds. Why should we siphon away any resources to “investigate” them?
Sunglasses with polarized lenses? Worrying about eye cancer is too woke.
Here’s a quick off-the-cuff list of neuroscience domains, not part of Diversity, Equity, and Inclusion, that will be impacted by this censorship. This is not an exhaustive list; it’s just what I thought of after thinking critically for 10 minutes.
It goes without saying this practice is evil and reprehensible. No academic domain should be politically targeted. But it reaches far beyond its intended targets. It is dangerous. It is unscientific. It is book-burning. Contact your representatives. Take action. Donate to good causes.
Patient advocacy for people who have had a stroke, or have dementia, or have any number of disabilities, hereditary or acquired.
Any research about the blood brain barrier, including development of drugs that can cross it more efficiently.
Any research about the placental barrier, including development of safe medications for birthing people.
Research into cognitive bias.
Development of statistics (including Bayesian statistics, the hot frontier) and machine learning (that’s AI, for anyone who prefers that term), where the term “bias” is used to describe model parameters and model performance - see the sketch after this list.
Basic visual and auditory science, where we talk about visual and auditory discrimination.
Sex differences research - this isn’t just a social issue; we still don’t understand how sex differences impact drug metabolism. Can’t have female mice anymore, apparently.
Basic research in the function of neurons, which polarize, depolarize, hyperpolarize, etc.
Concussion research and, again, stroke research. The field is broadly known as traumatic brain injury.
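To make the “bias” point concrete, here’s a minimal sketch (plain NumPy, made-up data, every number hypothetical) of what the word means in this context: the bias is just the intercept parameter in a linear model, which is why you can’t write a methods section without it.

```python
# A toy linear model on made-up data: "bias" here is just the intercept b
# in y = w*x + b, which is why the word is unavoidable in ML/statistics.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + 5.0 + rng.normal(scale=0.1, size=100)  # true weight 2.0, true bias 5.0

# Least-squares fit of the weight and the bias (intercept) together.
X = np.column_stack([x, np.ones_like(x)])
w, b = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"fitted weight = {w:.2f}, fitted bias = {b:.2f}")
```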
Isn’t it magnesium chloride? More ions = better melting.
Please do not put ice salt on your food in a pinch
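For anyone curious about the “more ions = better melting” point above, here’s a rough back-of-the-envelope sketch. It assumes ideal, complete dissociation and compares salts at the same molality to isolate the ion-count effect; real de-icers also differ in solubility and heat of dissolution, so treat the numbers as illustrative only.

```python
# Freezing point depression: dTf = i * Kf * m
#   i  = van 't Hoff factor (ions per formula unit, assuming full dissociation)
#   Kf = 1.86 °C·kg/mol for water
#   m  = molality (mol salt per kg water)
# Comparing salts at the same molality isolates the effect of i:
# more ions per formula unit -> bigger freezing point depression.
KF_WATER = 1.86   # °C·kg/mol
MOLALITY = 1.0    # mol salt per kg water, same for every salt

van_t_hoff = {"NaCl": 2, "MgCl2": 3, "CaCl2": 3}

for salt, i in van_t_hoff.items():
    delta_tf = i * KF_WATER * MOLALITY
    print(f"{salt}: ~{delta_tf:.1f} °C of freezing point depression at 1 molal")
```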
Shells or coral could serve as early tools, but (just my opinion) I feel it’s a little human-centric to assume fire and metallurgy are required to progress. Just because we did it that way doesn’t mean another species would have to.
If anyone’s interested in this sort of speculative sci-fi, check out The Mountain in the Sea by Ray Nayler. 10/10 world building, 9/10 science backing, 6/10 writing.
See Alk’s comment above; I touched on medical applications there.
As for commercial uses, I see very few. These devices are so invasive, I doubt they could be approved for commercial use.
I think the future of brain-computer interfacing lies in functional near-infrared spectroscopy (fNIRS). Basically, it uses the same infrared technology as a pulse oximeter to measure changes in blood flow in your brain. Since it uses light (instead of electricity or magnetism) to measure the brain, it’s resistant to basically all the noise endemic to EEG and MRI. It’s also 100% portable. But, the spatial resolution is pretty low.
HOWEVER, the signals have such high temporal resolution. With a strong enough machine learning algorithm, I wonder if someone could interpret the signal well enough for commercial applications. I saw this first-hand in my PhD - one of our lab techs wrote an algorithm that could read as little as 500ms of data and reasonably predict whether the participant was reading a grammatically simple or complex sentence.
It didn’t get published, sadly, due to lab politics. And, honestly, I don’t have 100% faith in the code he used. But I can’t help but wonder.
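For anyone wondering what that kind of analysis looks like in practice, here’s a minimal sketch of a windowed classifier on synthetic data (NumPy + scikit-learn). To be clear: this is not his code or our lab’s pipeline, just an illustration of the general idea - short multichannel windows in, binary label out - with every number and name made up.

```python
# Toy version of windowed classification: take a short multichannel fNIRS-like
# window and predict a binary label (simple vs. complex sentence).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials, n_channels, n_samples = 200, 16, 5  # e.g. ~500 ms at 10 Hz (hypothetical)
labels = rng.integers(0, 2, size=n_trials)    # 0 = simple sentence, 1 = complex

# Fake hemodynamic windows with a small label-dependent offset in a few channels.
windows = rng.normal(size=(n_trials, n_channels, n_samples))
windows[labels == 1, :4, :] += 0.5

# Flatten each window into a feature vector and cross-validate a linear classifier.
X = windows.reshape(n_trials, -1)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```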
A traditional electrode array needs to be as close to the neurons as possible to collect data. So, straight through the dura and pia mater, into the parenchyma where the axons and cell bodies are hanging out. Usually, they collect local data without getting any long-distance information - which is a limiting factor for this technology.
The brain needs widespread areas working in tandem to get most complicated tasks done. An electrode is great for measuring motor activity because motor areas are pretty localized. But something like memory or language? Not really possible.
There are electrocorticographic (ECoG) devices that place electrodes over a wide area and can rest on the pia mater, on the surface of the brain. Less invasive, but you still need a craniotomy to place the device. They also have lower resolution.
The most practical medical purpose I’ve seen is as a prosthetic implant for people with brain/spinal cord damage. Battelle in Ohio developed a very successful implant and has since received DARPA funding: https://www.battelle.org/insights/newsroom/press-release-details/battelle-led-team-wins-darpa-award-to-develop-injectable-bi-directional-brain-computer-interface. I think that article over-sells the product a little bit.
The biggest obstacle to invasive brain-computer implants like this one is their longevity. Inevitably, any metal electrode implanted in the brain gets rejected by the brain’s immune system. It’s a well-studied process in which a glial scar forms, neurons move away from the implant, and the overall signal of the device decreases. We need advances in biocompatibility before this really becomes revolutionary.
ETA: This device avoids putting metal in the brain; instead, it sends axons into the brain. It’s certainly a novel approach, but one that runs into different issues: the new neurons need to be accepted by the brain, and they need to be kept alive by the device.
If they moved the cell bodies into the brain and had the device house the axons and dendrites (a neuron’s outputs and inputs), they could maybe let the brain keep the device alive. But that would be a much more difficult installation procedure.
Great, now I have to start proof-reading any communications I get from the FDA to make sure it didn’t hallucinate a scientific article in the citations. There are going to be so many Vegetative Microscopy proposals.