
Archive for the ‘Sci general’ Category

Post-COVID reflections

Tuesday, June 18th, 2024

Keying off Kevin Drum’s The lessons of COVID, which is a good start but too limited in imagination.

What the US did right: the CDC has great world-wide infectious disease monitoring.

What the US did poorly: the US had spotty local infectious disease monitoring. And lack of data–ignorance–was used as an excuse for inaction. With post-COVID wastewater monitoring, the US is doing better; let’s keep the funding high for it.

Biggest mistake: the federal government responded slowly and ineffectively in the first two months. A faster response that seems like an over-reaction at the time is the best response. If the response is effective and a disease is slowed or stopped, it should always feel like an over-reaction. Hoof-and-mouth disease is the model.

An upside for AI / LLMs

Sunday, February 11th, 2024

The new AI / LLM tools have many potential applications, but many of them will have downsides for some people. Replacing copy writers or some tech support is a benefit for the companies that apply the tools, but many of today’s jobs in those areas will be eliminated.

One application that seems inevitable and all positive is raising the floor for human performance. A tool that you can ask for advice, or better yet an AI tool that monitors you (by email, by watching internet use across devices, and/or by video), understands what the user is doing, understands the context, and provides advice will help people avoid mistakes. While these mistakes are not obvious to the person making them, they are obvious to a person with experience, or to one able to research the problem. Imagine a person playing chess alone vs. someone playing with a chess program to flag potential mistakes, with the general knowledge of an AI allowing it to work in many more situations.

This AI tool can develop slowly and in a modular fashion; it will be useful even in crude form, but will become revolutionary once it gets good enough. Imagine a person using a crude form of this as an interactive chat tool. The person could say, “I’m taking a vacation to Greece,” and get advice on things to do and what they need to know about currency or visas. A more advanced AI could remind a person of appointments, or tell them they need to change the house air filter. If a person was going to make a poor decision, like routinely using check cashing places or buying a car with a poor service record, the AI could warn them.

This AI tool would be able to slot in special modules as needed. A person starting a business could get localized advice on the steps to take. Someone buying a house could get advice on things to check, and a new homeowner could get advice on what to check and repair and reminders for maintenance.


Green energy and nuclear power

Saturday, January 14th, 2023

In discussions of wind and solar power, sensible centrists always pop up with, “We must build new nuclear power plants too!” And then they mumble on about how nuclear power isn’t really dangerous, especially new designs, and talk about how nuclear power provides steady base load power, which is necessary because wind and solar are intermittent.

For example, see this Freakonomics Radio podcast hosted by Stephen Dubner, which is noteworthy for never mentioning the cost or relative cost of nuclear power. No economics in Freakonomics! And in a more reasonable discussion between host Ezra Klein and Jesse Jenkins covering a host of energy / decarbonization topics, nuclear power is boosted as a necessary component, again without a discussion of costs.

But this idea that nuclear power is necessary and complementary is mostly nonsense. Yes, nuclear power has killed very few people, and compares favorably in overall safety to coal power plants which cause plenty of deaths due to air pollution. But this argument is almost entirely off target.

The intermittency of wind and solar power is a big issue. Working out solutions for providing steady power in a grid powered mostly by wind and solar is the challenge for the next generation or two.

The thing is, nuclear power doesn’t help with that. Nuclear power plants are run full out except for maintenance (a capacity factor of 92%). What’s needed to complement solar and wind are power sources that are dispatchable and can be ramped up and down quickly. Hydroelectric power provides that in places like the US’s northwest that have lots of dams. And today gas peaker plants and coal plants provide fast and slow power that can be ramped up and down as needed.

And nuclear power is expensive. Very, very expensive. Today nuclear power costs 3-4X as much as solar and wind power. And that is market cost, excluding the subsidies provided by the federal government for nuclear power. Nuclear plants are insured by the US government. The costs of a meltdown are immense, from billions to hundreds of billions of dollars, and with a chance of a nuclear plant disaster of at least 1 in 165 over the life of a plant, the risk is substantial. Long-term high-level nuclear waste disposal has not been paid for or figured into costs–US nuclear plants store high-level waste on site, along rivers and coasts, with the US government expected to handle final storage. And nuclear plant decommissioning will likely cost more than the collected funds account for.
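
To put the insurance subsidy in rough perspective, here is a back-of-the-envelope sketch. The 1-in-165 disaster chance and the 92% capacity factor come from this post; the plant size, plant lifetime, and the specific damage values chosen within “billions to hundreds of billions” are illustrative assumptions, not data.

```python
# Back-of-the-envelope: expected (uninsured) accident cost per kWh for one plant.
# Only the disaster probability and capacity factor come from the post;
# everything marked "assumption" is illustrative.

p_disaster = 1 / 165        # chance of a disaster over the plant's life (from post)
capacity_factor = 0.92      # from post
plant_gw = 1.0              # plant size in GW (assumption)
lifetime_years = 40         # plant lifetime (assumption)

lifetime_kwh = plant_gw * 1e6 * capacity_factor * lifetime_years * 8760

for damage in (1e9, 300e9):   # $1B to $300B, within the range cited above
    expected_cost = p_disaster * damage
    print(f"damage ${damage/1e9:.0f}B -> expected cost ${expected_cost/1e6:.0f}M, "
          f"~{100 * expected_cost / lifetime_kwh:.3f} cents/kWh")
```

Even as a crude sketch, the spread is wide: the implicit subsidy ranges from negligible to over half a cent per kWh, on top of the 3-4X market-cost gap.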

So nuclear plants don’t make economic sense on their own and they do not complement wind and solar power generation.

What is needed to make a power grid with wind and solar as the primary power sources able to provide reliable power? There need to be ways of meeting short term (minute to minute), medium term (hourly and daily), and long term (days and weeks) interruptions in wind and solar power production. Short term irregularity can be met today with small grid storage and hydropower.

Dealing with the daily cycle of solar power production requires much larger grid storage, generally not available today, and/or large scale demand shifting not done today. Short periods of low wind are fairly common, and week- or month-long regional low wind is known to occur. Solar power production is lower on cloudy days, and varies seasonally.

It is not clear today what solutions will be used. Grid scale power storage is an active and promising R&D area. Over-capacity (having more solar and wind capacity than is needed) will help, and solar and wind are already cheap enough for it to be economic, but this creates a new problem: what to do with the excess power generated during periods of high production.
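
A rough sketch of why overbuilding can pencil out: the 3-4X cost ratio is the one cited earlier in this post, while the absolute $/kWh figures are illustrative assumptions.

```python
# Rough comparison: overbuilt (and partly curtailed) solar/wind vs. nuclear,
# per delivered kWh. Absolute LCOE numbers are illustrative assumptions;
# only the ~3-4X ratio comes from the post.

solar_wind_lcoe = 0.04   # $/kWh for new solar/wind (assumption)
nuclear_lcoe = 0.14      # $/kWh, ~3.5X solar/wind, consistent with the post

for overbuild in (1.5, 2.0, 3.0):
    # Build `overbuild` times the needed capacity and curtail the surplus:
    # each delivered kWh then effectively costs overbuild * LCOE.
    effective = overbuild * solar_wind_lcoe
    print(f"{overbuild:.1f}x overbuild: {effective*100:.1f} c/kWh delivered "
          f"vs. nuclear at {nuclear_lcoe*100:.1f} c/kWh")
```

Under these assumptions, even a 3X overbuild with full curtailment of the surplus stays below the nuclear figure, which is why finding uses for the excess is a problem worth having.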

Demand-shifting has a lot of promise, and will help with hourly and daily power demand balancing. Utility modulation of residential and industrial power use is already in widespread use, mainly to shave off peak demand and do modest demand shifting, but there is much more potential, especially as electricity gets used more widely for heating water, cars, and homes.

For long periods with low wind and solar power production, other strategies are needed. Today, fossil fuel plants are used. Grid interconnects able to transfer substantial power between regions can be part of the solution–areas with low wind and heavy cloud cover are typically regional. Long-term, there is also potential for storage of energy in other forms–compressed air, hydrogen, or hydrocarbons. A round trip efficiency of ~25% is enough to make this practical.
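
A quick sketch of what a ~25% round-trip efficiency implies, assuming the stored energy comes from surplus generation that would otherwise be curtailed; the input electricity prices below are illustrative assumptions.

```python
# What ~25% round-trip efficiency means for long-duration storage
# (e.g. electricity -> hydrogen or hydrocarbons -> electricity).
# The 25% figure is from the post; the input prices are assumptions.

round_trip = 0.25
kwh_in_per_kwh_out = 1 / round_trip   # 4 kWh in for every 1 kWh delivered back

for input_cost in (0.00, 0.02, 0.04):   # $/kWh of surplus electricity fed in
    energy_cost_out = input_cost * kwh_in_per_kwh_out
    print(f"input at {input_cost*100:.0f} c/kWh -> energy cost of output "
          f"~{energy_cost_out*100:.0f} c/kWh (before conversion and storage capital costs)")
```

The point is that a low round-trip efficiency is tolerable when the input energy is cheap or would otherwise be wasted; the capital cost of the conversion and storage equipment becomes the main question.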

So there are challenges to powering the grid mainly with wind and solar power, but nuclear power doesn’t help solve them. If nuclear power with lower, competitive costs can be developed, then it is safe enough to use.

Glass cutting w/ short laser pulses

Sunday, September 12th, 2021

Filamentation cutting
The output of an ultrashort pulse (< 15 picoseconds) laser is focused to a small spot within the substrate. The very high laser intensity achieved produces self-focusing of the beam (due to the optical Kerr effect) within the glass. This self-focusing further increases power density until, at a certain threshold, a low-density plasma is created in the material. This plasma lowers the material refractive index in the center of the beam path and causes the beam to defocus. If the beam focusing optics are properly configured, this focusing/defocusing effect can be balanced to repeat periodically and self-sustain. This forms a stable filament, that is, a line of tiny voids, which extends several millimeters deep into the glass. The typical filament diameter is in the range of 0.5 µm to 1 μm.

Example system: 50 W of average output power at a wavelength of 1064 nm, processing speeds of 100 mm/s – 2 m/s.

Selective laser-induced etching (SLE)
An ultra-short pulsed laser (Satsuma HP2, Amplitude Systèmes, Pessac, France) with a central wavelength of 1030 nm and a maximum output power of 20 W was used. The maximum laser pulse energy was 40 μJ at a pulse repetition rate of 500 kHz, and the pulse width was variable from 370 fs to 10 ps. The pulse repetition rate was variable up to 2000 kHz. For the 3D fabrication machine, the laser amplifier was integrated with a 2-axis (XY) galvano scanner (DynAXIS, ScanLab, Puchheim, Germany) and an air-bearing 3-axis (XYZ) servo motion stage with a controller (A3200, Aerotech, Pittsburgh, USA). A focusing objective lens (NA = 0.4, model no. 378-867-5, Mitutoyo, Kawasaki, Japan) was assembled with the galvano scanner for high scan speed. The focused laser beam size is estimated at about 2 μm. The combined scan speed of the scanner and 3D stage is up to 200 mm/s and the field size is 100 mm × 100 mm. After the laser modification process, the exposed glass substrate was etched using 8 mol potassium hydroxide (KOH) at 85 °C in an ultrasonic bath for uniform concentration control. (Kim et al., 2019)

Possible laser: Osram SPL PL90 3 pulsed laser diode. It is constructed from three epitaxially stacked emitters with a laser aperture of 200 μm by 10 μm and has a peak output power of 75 W, a wavelength of 905 nm, and a maximum pulse width of 100 ns. The rated duty cycle is 0.1%, but this has been exceeded without damage to the diode. $20. datasheet, (Parziale et al., 2015)

Diodes with ps pulses are low energy (mW or less), so coupling and amplification through a fiber laser is needed.
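
A rough pulse-energy comparison shows why a bare diode is not enough. The ns-pulse figures are the SPL PL90 numbers quoted above and the 40 μJ target is from the Kim et al. setup; the ps-pulse peak power is an illustrative assumption based on “mW or less”.

```python
# Rough pulse-energy comparison: bare laser diode vs. what SLE needs.
# Diode ns-pulse figures are those quoted above; the ps-pulse peak power
# is an illustrative assumption ("mW or less").

# SPL PL90-class diode at its rated ns pulse width
peak_power_ns = 75.0        # W
pulse_width_ns = 100e-9     # s
energy_ns = peak_power_ns * pulse_width_ns     # ~7.5 microjoules

# Hypothetical diode producing ps pulses at mW-level peak power
peak_power_ps = 1e-3        # W (assumption)
pulse_width_ps = 10e-12     # s
energy_ps = peak_power_ps * pulse_width_ps     # ~10 femtojoules

sle_pulse_energy = 40e-6    # J, maximum pulse energy in the Kim et al. setup

print(f"ns diode pulse:   {energy_ns * 1e6:.1f} uJ")
print(f"ps diode pulse:   {energy_ps * 1e15:.0f} fJ")
print(f"SLE setup above:  {sle_pulse_energy * 1e6:.0f} uJ, "
      f"~{sle_pulse_energy / energy_ps:.0e}x the ps diode pulse")
```

So an ns-driven diode pulse carries a few µJ, but at picosecond widths a diode pulse is many orders of magnitude below the tens of µJ used for SLE, hence the need for fiber amplification.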


Carbon capture

Sunday, March 3rd, 2019

The basic problem with carbon capture is energy, and energy is cost. When coal or oil is burned, heat and CO2 are produced. CO2 is a pretty low energy form of carbon. Turning it into something solid (calcium carbonate, graphite or coal) requires a lot of energy. Also, when CO2 is made by burning fossil fuels it disperses, and re-concentrating it requires energy. That’s why carbon capture proposals often include using exhaust gas, grabbing the CO2 before it disperses. The other main type of capture I’ve seen proposed takes the CO2, concentrates it to high pressure, and pumps it underground (and hopes it stays there). Compressors take a lot of energy, and so do pumps if the CO2 needs to be piped hundreds of miles to a place where it can be pumped underground.

The key number for carbon capture is: how much energy is required relative to the amount generated by burning the fossil fuel? I’ve never seen articles about it touting this number. A quick look shows one assessment putting it at 30% – 35% of the energy (Zhang et al., 2014); another figures the production cost of electricity with carbon capture to be 62% – 130% higher (White et al., 2012, Table 6). Another article looks at the harder case, CO2 capture from air, and estimates the cost at $1000/ton CO2 (link). Burning the coal that generates a ton of CO2 (about 1/3 of a ton of coal) produces about $80 of electricity.
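
A quick check on those numbers: the coal energy content, plant efficiency, and electricity price below are illustrative assumptions used to reproduce the ~$80/ton figure, and the capture costs are the ones cited above.

```python
# Rough check: value of the electricity generated per ton of CO2 from coal,
# compared to quoted capture costs. Coal energy content, plant efficiency,
# and electricity price are illustrative assumptions; capture figures are
# those cited in the post.

coal_per_ton_co2 = 1 / 3       # tons of coal per ton of CO2 (from post)
coal_mj_per_kg = 24            # thermal energy of coal (assumption)
plant_efficiency = 0.35        # thermal -> electric (assumption)
electricity_price = 0.10       # $/kWh (assumption)

thermal_mj = coal_per_ton_co2 * 1000 * coal_mj_per_kg
electric_kwh = thermal_mj * plant_efficiency / 3.6      # 3.6 MJ per kWh
value = electric_kwh * electricity_price
print(f"~{electric_kwh:.0f} kWh of electricity, worth ~${value:.0f} per ton of CO2")

# Compare with the capture costs cited above
print(f"exhaust capture at 30-35% of plant energy: ~${0.30*value:.0f}-${0.35*value:.0f} per ton")
print(f"direct air capture at $1000/ton: ~{1000/value:.0f}x the electricity's value")
```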

So the best-case cost of carbon capture–from power plant exhaust gas–is dismal: 25%, 75%, maybe over 100% of the value of the electricity. This number will translate directly into increased fossil fuel energy costs (+30%, +100%, etc.) if fossil fuel companies are required to capture the majority of the CO2 pollution they generate.

All the carbon capture projects are basically stalling actions. The fossil fuel companies pay small $$ to put together a pilot plant (or better yet, get the government to fund it), run tests for years, but never implement CO2 capture on a coal or gas energy plant. This has been a very successful approach for the fossil fuel industry; they’ve managed to stall things for 50 years already!

CRISPR

Tuesday, February 26th, 2019

The CRISPR gene editing system is a major technical advance. It does open up the near term possibility of making a few small changes to a human embryo’s DNA, but I don’t find that particularly interesting or alarming.

What makes CRISPR better than previous tech for gene modification is that it works at high efficiency (1% to 60%) with very high specificity. I read a recent paper testing CRISPR on human embryos that reported 50% effectiveness. Given a handful of embryos to work with, there is a very good chance of making a single change in one embryo.
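
To put a number on “a very good chance”: the 50% per-embryo efficiency is the figure from that paper, and the embryo counts below are illustrative stand-ins for “a handful”.

```python
# Chance of getting at least one correctly edited embryo, assuming each embryo
# is edited independently. The 50% efficiency is the figure cited above; the
# embryo counts are illustrative stand-ins for "a handful".

efficiency = 0.5

for n_embryos in (1, 3, 5, 8):
    p_at_least_one = 1 - (1 - efficiency) ** n_embryos
    print(f"{n_embryos} embryos: {p_at_least_one:.0%} chance of at least one edit")
```

With five embryos the chance is already about 97%, which is the sense in which a handful is enough.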

We have very little knowledge or technology for making positive changes to animals, which is a huge limitation to genetic ‘engineering’. Mostly what is understood are disease-causing (or predisposing) genetic variants. So a single change (maybe in a few years, a handful of changes?) can be made to a human embryo. There are other limits to modifying human embryos apart from lack of knowledge. The more time an embryo or human embryonic stem cell is cultured, and the more it is manipulated, the greater the chance of something going wrong and the child being born with problems. This tech is great for manipulating animals in the lab. If many or most of them have the genetic change, great! If some are born with defects, cull them, or breed another generation and use those in experiments (often the first generation has non-genetic defects that breed away). But these are huge problems if you are working on humans, because things that increase the risk of a damaged child are not desirable.

Long term (100-1000 years), when increases in understanding of biology make improvements (or significant changes of any sort) in humans possible, I think what we’ll see is that the people with the least concern for child welfare will be the most willing to experiment on them.

The really exciting possibilities CRISPR opens up are in genetic treatment of human disease in the tissues of kids and adults. There is delivery tech (well-tested viral vectors, and a host of other methods) that can get CRISPR into a good percentage of cells (10% to 50+%) in many tissues, and once there, CRISPR will edit a good fraction of those cells. For many diseases, fixing a genetic defect in 1%, 10%, or 20% of cells is enough to treat the disease, so genetic treatment of a host of diseases is now possible. Things like hemophilia, some muscular dystrophies, maybe Huntington’s Disease, metabolic diseases, Parkinson’s disease, and on and on. There will be a lot of exciting advances turning that ‘possible’ into actual treatments for different diseases over the next decade or two.

The other major effect of CRISPR tech is that it makes animal experimentation faster and cheaper, and will accelerate basic biological research. We still don’t know what the majority of individual genes do, let alone how they work in complexes and networks in cells.

Protein in dinosaur fossils, new paper

Friday, June 12th, 2015

A second group has found protein preserved in dinosaur fossils. I wouldn’t call this solid yet, but it is encouraging. This hit the news in September 2009.

Fibres and cellular structures preserved in 75-million-year-old dinosaur specimens

Abstract
Exceptionally preserved organic remains are known throughout the vertebrate fossil record, and recently, evidence has emerged that such soft tissue might contain original components. We examined samples from eight Cretaceous dinosaur bones using nano-analytical techniques; the bones are not exceptionally preserved and show no external indication of soft tissue. In one sample, we observe structures consistent with endogenous collagen fibre remains displaying ~67 nm banding, indicating the possible preservation of the original quaternary structure. Using ToF-SIMS, we identify amino-acid fragments typical of collagen fibrils. Furthermore, we observe structures consistent with putative erythrocyte remains that exhibit mass spectra similar to emu whole blood. Using advanced material characterization approaches, we find that these putative biological structures can be well preserved over geological timescales, and their preservation is more common than previously thought. The preservation of protein over geological timescales offers the opportunity to investigate relationships, physiology and behaviour of long extinct animals.
Nature Communications 6, Article number: 7352, 09 June 2015

Future blogging, because the future is in full text

Friday, August 22nd, 2014

OK, this is quite annoying. It was plenty annoying when I was at a university and 90% of the articles were available at publication, but that 10% always included a handful of important articles, so it has always been a PITA. So now I’ll start future blogging!

I’ll tag interesting articles when they get published and follow up when I can actually read them. Many journals now are open access, but some release an article six months or a year after publication. Or sometimes the pdf gets posted. So I’ll tag interesting articles when they hit the news and write a follow-up when I can read them. Because titles and abstracts aren’t enough for articles with useful information!

Duration of urination does not change with body size. Patricia J. Yang, Jonathan Pham, Jerome Choo, and David L. Hu. PNAS vol. 111, no. 33, pp. 11932–11937.

BTW, PNAS used to release articles at publication. When did they go dark?!

Book review: Parasite Rex

Tuesday, March 1st, 2011

Parasite Rex: Inside the Bizarre World of Nature’s Most Dangerous Creatures by Carl Zimmer.

Great book. About parasites. What they are, the recent discovery of how big a role they have in ecosystems, how they live, how they have jumped from animal to animal, and of course, which ones afflict people.

Several chapters describe a range of human parasites in amazing and often frightening detail, from botfly larvae to liver flukes, malaria’s Plasmodium to the nematodes that parasitize humans. There is some discussion of microbial parasites, but most of the book covers metazoan parasites. Zimmer tells the stories of some of these parasites–how they find their way to people, what they do once they arrive in a new host, how they escape detection, and the course of the disease. The stories of how several parasites were discovered, and how they were identified and followed through their changes of form and host, are also told. And there are pictures!

Word cloud of Parasite Rex by Carl Zimmer

Fluoridation

Tuesday, October 26th, 2010

Notes on water fluoridation and the Fluoride Deception video

I’d heard of the great water fluoridation fight but never looked into it. In the 60’s the John Birchers were saying it was a Commie plot to weaken America’s vital fluids or something of the sort. And it was parodied in the movie Dr. Strangelove…

Let’s start by bracketing things. Fluoride in water can’t be highly dangerous or people would have noticed. Not putting fluoride in water is not a risk-free choice–it prevents cavities. Cavities don’t just make your teeth fall out; they also increase the risk of bacteria-related heart disease, and the occasional person dies of a tooth abscess. So the question is: is there disease caused by fluoridation, and is it worse than the diseases caused by no fluoridation?

OK, let’s look at the video.
5:42 Suggests that the idea of adding fluoride to water supplies was to hide the dangers of fluoride pollution or avoid responsibility for damage due to fluoride pollution. Doesn’t really make sense so far. Ah, reading the history: when government regs made industry stop dumping fluoride in air and water, one thing industry did with the waste was process out fluoride for water fluoridation. Doesn’t sound that damning; after all, it would have been cheaper to dump it in a landfill.

~7:00-20:00 Fluoride air pollution can be bad. Some of the early fluoride researchers also worked on, and perhaps had a part in, the worst cover-ups regarding industrial pollutants. What I’ve read of the tetraethyl lead story is appalling. But the connection with the lead story is tenuous. Fluoridating water wasn’t a gold mine; I don’t see there being much pressure to push fluoridation back when it started.

21:30 The NRC report (below) discusses Waldbott’s results and concludes that some people are sensitive to typical water concentrations of fluoride, and that this appears to be fairly rare.

From the NRC report, it doesn’t appear that the safety of water fluoridation was well-established, certainly nowhere near today’s standards, back when it began. It was safe by 1940’s standards, and had a clear benefit. I’ve probably got an extra tooth in my mouth due to it.

25:00 The NRC report discusses the Mullenix study. Calls it inconclusive, calls for more studies.

The video didn’t have much info. Here are the establishment reference sources:

CDC recommendations

Fluoride reduces cavities by 15-40%, depending on the study. The low figure is an estimate of the benefit of water fluoridation in a population that already uses fluoride toothpaste.

2006 National Academy report (the greybeards)

Here’s the meat! Water fluoridation is 1 mg/L; when the level hits 4 mg/L, studies start seeing negative health effects. That’s a pretty narrow window between benefit and danger level, the smallest one for an environmental exposure I’ve run into. YMMV, I’m not an environmental toxicologist.

What hasn’t really been studied are neurotoxic effects of low level exposure. A few studies have turned up disturbing results. Check out the summary on page 205.

Interesting take on differences between Europe and US fluoridation, Pizzo et al. 2008

The bit about Europe in the video is misleading. Europe hasn’t avoided fluoride; it’s just mostly not in the water, it’s in salt or toothpaste.