A somewhat relevant question: what is a good way to get into longevity research/research support for a software engineer with a degree in optical engineering, who recently started refreshing college math?
Try to make contact with grad students and postdocs working in research groups that align with your aims. Find out what the most annoying/repetitive parts of their research are and see if you can think of improvements matched to your skill set. (This could be as simple as joining a Facebook group, forum, or mailing list, and reading what people complain about.) It could be something you can improve with software alone. I'm guessing that there are unmet equipment needs too. PIs are not always incentivized to care about improving the labor efficiency of their underlings.
An uncle of mine got his PhD and then, after his postdoc, spent the rest of his working life helping other researchers at his university build equipment to support experiments. He had skills with machining and design that most of his colleagues lacked. Much of hands-on scientific research can be improved by some not-too-complicated piece of equipment that isn't available off the shelf yet. Some researchers are lucky enough to be supported by people like my uncle, or already have skills like his. Most biological researchers aren't also engineers. You might advance the productivity of biological research, and maybe even do well financially if you design something that a company like Millipore ends up acquiring.
Tedious pipetting work used to be a major waste of time in some kinds of biological research. There are commercial robots for that now. What's the next most tedious thing that could be improved with good tooling? I don't know, but you might want to see if you can find out.
EDIT: I'm suggesting that you look for opportunities regarding experimental research instead of pure software because I'm not sure current experimental data is good/abundant enough. I was peripherally involved with an academic "proteomics" software effort more than a decade ago (is that still a trendy thing?) and my experiences led me to believe that experimental reproducibility and throughput needed to improve before it was worth focusing on software. I also hear biologists gripe about slow, poorly reproducible cell experiments in places like the comments on Derek Lowe's blog.
My personal hobby is computational chemistry but if I wanted to make a real impact on chemistry I think it would have to relate to instrumentation or tooling for bench chemists. Chemistry and especially biology are too complicated for theoretical/computational approaches to contribute much without collaborating with experimentalists.
More specifically, the difficulty is in developing computationally efficient models, i.e. algorithms that can run on today's computers. The computational methods of quantum mechanics should, in theory, be able to model anything made of atoms, but in practice they turn out to be far too computationally intensive.
There are molecular mechanics/molecular dynamics methods that use only classical physics ("ball and spring" models). That's part of computational chemistry.
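The "ball and spring" picture is almost embarrassingly simple once you write it down. Here's a toy sketch in Python of the harmonic bond term at the heart of it; the force constant and equilibrium length are made-up illustrative numbers, not values from any real force field:

    # Toy "ball and spring" bond: the energy of a bond stretched to length r
    # is E = 0.5 * k * (r - r0)**2, exactly like a classical spring.
    # k and r0 are illustrative numbers, not real force field parameters.

    def harmonic_bond_energy(r, k=450.0, r0=0.96):
        """Energy (arbitrary units) of one bond at length r (angstroms)."""
        return 0.5 * k * (r - r0) ** 2

    # An O-H-like bond at compressed, equilibrium, and stretched lengths:
    for r in (0.90, 0.96, 1.05):
        print(f"r = {r:.2f} A -> E = {harmonic_bond_energy(r):.3f}")

A real force field sums thousands of terms like this (plus angle, torsion, and non-bonded terms), which is why molecular dynamics can handle systems far too large for quantum methods.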
To elaborate on your comments, most of computational chemistry does use quantum mechanical models, and there are indeed difficult problems with computational intensity. Basic quantum chemical methods start with a big-O time complexity of O(N^4), where N measures system size (e.g. the number of basis functions). The "gold standard" of computational chemistry, CCSD(T), is O(N^7). It is the worst-scaling method that still sees routine use.
An "exact" [1] approach to electronic structure calculations, full configuration interaction, scales as O(N!) -- yes, factorial. Not surprisingly, the size of systems tractable via FCI has not grown much in 30 years even as computers have grown much faster.
There is indeed a lot of work on developing efficient approximations to the "exact" quantum mechanical solution, and on eking out constant-factor improvements from existing algorithms.
There's also a lot of work on taking electronic structures, available from various methods, and deriving familiar chemical properties from them. Things like NMR spectra, Raman spectra, pKa, melting point, aqueous solubility...
Measuring properties of bulk condensed-phase matter in the lab is easy, but deriving them from simulation is hard. Something "basic" like melting point is very hard to get from ab initio calculations. On the other hand, properties that require expensive equipment to measure, like NMR spectra, are comparatively easy to calculate.
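To make "ab initio calculation" concrete, here's about the smallest real example I can write: a Hartree-Fock single-point energy for water using the PySCF library (assuming you have PySCF installed; the geometry and basis set are common textbook choices). Property calculations like NMR shieldings are built on top of a baseline wavefunction like this:

    # Minimal ab initio calculation with PySCF (pip install pyscf):
    # Hartree-Fock single-point energy of a water molecule.
    from pyscf import gto, scf

    mol = gto.M(
        atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",  # coordinates in angstroms
        basis="cc-pvdz",  # a standard medium-sized basis set
    )
    mf = scf.RHF(mol)     # restricted Hartree-Fock, the O(N^4)-ish baseline
    energy = mf.kernel()  # runs the SCF iterations; returns energy in Hartree
    print(f"RHF/cc-pVDZ energy of water: {energy:.6f} Hartree")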
[1] Terms and conditions apply. Consult Helgaker et al., "Molecular Electronic-Structure Theory" for details.
Our mission at Spring is to accelerate the discovery of therapies for age related diseases. Machine learning is a core part of our approach and we're building a close-knit team of scientists and engineers. If you're interested, check out our job openings [1]. We also just raised an A round with support from some fantastic folks [2], including Laura Deming who put out this great FAQ on the space if you haven't already seen it: https://www.ldeming.com/longevityfaq/
Start emailing people who write papers you like about aging: tell them how you can help their group, and ask for a job. You might be surprised. That's pretty much how my boss hires people. I work in academic cancer research rather than aging, but I doubt it is that different.
I recently started my lab at UC Berkeley studying aging and longevity using genomics and computational approaches. https://www.sudmantlab.org Feel free to drop me a line if you're interested in getting involved!
Get a very high paying job and then donate your money to research. If you want to get involved with the research itself, just give the money to the researchers you want to support, without strings attached (no grant or report writing). Then use this as a chance to chat with the researchers about their work.
We can't really do much about ageing until we solve cancer, because ageing is evolution's original anti-cancer system.
Since many people reach their late 80s and 90s without cancer, it's likely we can increase longevity (not immortality) without first addressing cancer as a dependency. After all, the greatest advances in life expectancy came from reducing child mortality and addressing infectious diseases that lead to long-term health effects.
Most people in their 80s and 90s have cancer, but thanks to ageing the cancer is very slow growing, so they die of something else. For example, in men the probability (as a percentage) of having prostate cancer roughly equals their age, yet few men die of prostate cancer.
There are some mutations in people that slow ageing at the expense of increasing the cancer rate. There is a very interesting one from Brazil where a mutation in the p53 gene has this exact mechanism [0].
No, they have full non-benign cancers; they are just so slow growing that the person with them normally dies of something else first, although plenty of people in their 80s and 90s do die of cancer.
Yes, the life expectancy increase from curing cancer is only around 2 years, but it is the essential first step toward doing something major about ageing.
Non-benign cancer = invasive tumor. People with invasive tumors rarely die of other, uncorrelated causes.
Anyway, I don't think anybody in the serious scientific community believes that a cancer-only research program would have a huge impact on longevity. Instead, most people advocate for a portfolio with roughly 70% spent across cardiovascular disease and cancer, and the rest on other causes of death.
Do you know the cause of aging? Nobody does. Instead, what we do is address the symptoms, because that generally leads to longevity and higher quality of life in older years. That's what the medical system is dedicated to doing. I say this with full knowledge that my friends at the Buck Institute for Research on Aging are working on the underlying causes of aging, and they've all said it's better to focus on treating the symptoms.
BTW what you've said is also fairly philosophical. It would be completely correct to say that heart disease is a cause of aging, under a reasonable definition of aging.
Focus on microscopy. If you have an optical engineering degree, you could learn how to build a modern microscope from Thorlabs components, then find some labs that need a scope person.
You shouldn't expect, with your academic pedigree and work experience, to be able to pick up enough biology to be truly useful for deep discovery. You can help out writing code, but don't expect to be able to design, run, and analyze the results of an experiment. In biology, it takes decades to learn to judge results (very different from computer science and machine learning).
Well, I can speak to this since I work for a company that does this. The improvement in microscope image analysis by computers in the past 5 years has been amazing. If the trajectory continues, I would say that 25% of image analysis will be automated using DNNs and other machine learning techniques.
Areas where it won't work: any time you have new image data that doesn't resemble what the networks were trained on. In fact, most people in the field recommend training and running inference on a single microscope; if you change scopes, you have to retrain your model! Data augmentation obviously has a lot to contribute there (see the sketch below), but there are a ton of challenges.
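To give a flavor of the augmentation side, here's a sketch using torchvision. The specific transforms and parameter ranges are my guesses at what mimics scope-to-scope variation (illumination, focus, orientation), not a validated recipe:

    # Augmentation pipeline to make a model less sensitive to which
    # microscope the images came from. Transforms and ranges are
    # illustrative guesses, not a validated recipe.
    import torchvision.transforms as T

    augment = T.Compose([
        T.RandomHorizontalFlip(),
        T.RandomVerticalFlip(),
        T.RandomRotation(degrees=90),                     # cells have no canonical "up"
        T.ColorJitter(brightness=0.3, contrast=0.3),      # illumination differences
        T.GaussianBlur(kernel_size=5, sigma=(0.1, 2.0)),  # focus/optics differences
        T.ToTensor(),
    ])

    # Applied per training image (a PIL image) before it reaches the network:
    # tensor = augment(pil_image)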
I've actually proposed building a warehouse-scale microscopy facility within a couple of miles of an Amazon or Google data center, with a full realtime reinforcement learning loop. If you have hundreds of near-identical scopes collecting the same data, you can train over the variation.
I'm reading Aubrey de Grey's book Ending Aging right now. I'm finding it to be a good read and a great introduction to his approach, SENS (http://www.sens.org).
I would look into optical imaging and biophotonics; there are many exciting emerging technologies that may revolutionize research (into longevity, among other things) and healthcare in general. Optical coherence tomography, for example.