Ecologist’s doggerel
December 12th, 2005
Old doggerel, but new to me:
“Let’s consider the concept of niche-
If I knew what it meant I’d be rich.
Its dimensions are n
But a knowledge of Zen
Is required to fathom the bitch.”
One of the MA database systems went down. It wouldn’t boot, so we put in the install disk, typed ‘linux rescue’ at the prompt, and it booted into rescue mode. The system keeps its system files on a pair of RAID1 SATA drives. Drive /dev/sda was gone (fdisk found no partition); /dev/sdb was fine. I looked around for hacking traces but found nothing. /var/log/messages indicated the system had shut down for a reboot two days before. We hadn’t done it, so how and why?
First, I re-partitioned /dev/sda to look like /dev/sdb using the same ‘fd’ RAID partition type.
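One way to clone a partition table in one shot is sfdisk (a sketch, not necessarily how I did it; sfdisk may not be on every rescue disk, and double-check the device names before running anything like this):

```shell
# Dump the surviving drive's partition table and write it to the blank drive.
# DANGEROUS: this overwrites /dev/sda's partition table.
sfdisk -d /dev/sdb > table.txt
sfdisk /dev/sda < table.txt
```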
To bring it back up, I shut down, switched the sda and sdb cables so we could boot off the good drive and then have RAID restore the second drive. The original /dev/sdb didn’t have grub installed on the MBR, so I had to reboot with the rescue disk and reinstall grub.
/mnt/sysimage/sbin/grub
grub> root (hd0,0)
grub> setup (hd0)
I had to use grub because grub-install wasn’t available from the rescue environment and /mnt/sysimage/sbin/grub-install couldn’t find /sbin/grub.
Then reboot: grub comes up and the system boots. The root RAID1 /dev/md1 is degraded, as this shows, so add /dev/sdb back:
mdadm --query --detail /dev/md1
...degraded...
mdadm --add /dev/md1 /dev/sdb
And 20 minutes later the array is clean!
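The rebuild can be watched as it runs, by the way; /proc/mdstat shows a progress bar and ETA for the resync (standard md kernel interface, not something I logged at the time):

```shell
# One-shot look at the resync progress
cat /proc/mdstat
# Or refresh it every 2 seconds until the array is clean
watch cat /proc/mdstat
```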
Figured out how to VNC from my laptop to my linux server. Forwarded X connections were too slow, so I gave VNC a try. It was harder to set up than I expected. I googled around and after many tries found a post online that worked.
Here’s the situation. I need to connect to my lab computer from my laptop at home. The lab computer is behind UK’s firewall. Run vncserver on the lab computer; that’s the easy part. It runs on display :1 by default:
elegans.uky.edu> vncserver
Turns out I need to enable port forwarding on the lab computer’s sshd:
As root add this line to /etc/ssh/sshd_config
AllowTcpForwarding yes
then restart sshd:
/etc/rc.d/init.d/sshd restart
Then forward the ssh connection. On my laptop I run this command from a terminal:
ssh -L 5901:localhost:5901 me@elegans.uky.edu
And run the VNC client. My laptop runs OS X, so I downloaded “Chicken of the VNC”. By default VNC uses port 5900; enter display 1 (so it connects to 5900 + 1 = 5901) with localhost as the host. Starts up and runs fast!
Archaeologists push back the date of the invention of noodles to 4,000 BP:
BBC NEWS
The 50cm-long, yellow strands were found in a pot that had probably been buried during a catastrophic flood.
Radiocarbon dating of the material taken from the Lajia archaeological site on the Yellow River indicates the food was about 4,000 years old.
Scientists tell the journal Nature that the noodles were made using grains from millet grass – unlike modern noodles, which are made with wheat flour.
The discovery goes a long way to settling the old argument over who first created the string-like food.
Professor Houyuan Lu said: “Prior to the discovery of noodles at Lajia, the earliest written record of noodles is traced to a book written during the East Han Dynasty sometime between AD 25 and 220, although it remained a subject of debate whether the Chinese, the Italians, or the Arabs invented it first.”
Great science writing too–packed with detailed information yet succinct. I’ve never read a molecular biology news article as good.

A computer security hack has appropriated the word nematode to describe “good” network worms. Dave Aitel says:
“We don’t want people to think this is impossible. It’s entirely possible to create and use beneficial worms and it’s something businesses will be deploying in the future.”
So much for ever being able to Google “nematode” again! Spoiler! Bastard! And it’s so wrong. Aitel has apparently heard of Caenorhabditis elegans, everyone’s favorite worm and entirely harmless, and jumped to the conclusion that every nematode is beneficial to humans. So wrong, so sad; a little knowledge is a dangerous thing. If this is the depth of his thinking I certainly wouldn’t take his advice on computer security.
I just read a review of cutting-edge new sequencing technology (Shendure et al., 2004). There are several approaches that were new to me. One that caught my imagination incorporates “polony technology, in which PCR is performed in situ in an acrylamide gel” for DNA amplification. A related technique using emulsion has been developed by the Vogelstein lab.
My idea of developing a method using reversibly terminating nucleotides has also occurred to many other people! Apparently finding a way to do the reversible termination has been a roadblock. I certainly didn’t have a way to do it. They have also worked out approaches to detecting incorporation, the other half of the method and a part I didn’t develop.
Very interesting tech. According to the paper, even nanopore sequencing is close to working!
The paper talks a bit about using ULCS for personal genome sequencing (PGP, everything gets an acronym), about the whys and what it will mean. It contains the usual throwaway consideration of ethics and consequences. The paper says this “will require high levels of informed consent and security”. In practice, your personal seq info and related disease susceptibility info *will* get spread to interested parties. Just look at who calls the shots; after more than a decade of attention to genetic privacy and overwhelming public support, there are “no US federal laws that ban genetic discrimination for medical insurance or in the workplace”.
How I would love to sequence 1Gb a week!
From PZ Myers, unsourced:
“0.1% of all the species that have existed are currently extant, and the average lifetime of a species is roughly 10 million years”
I’ve messed with the site’s CSS style sheets. If it looked OK before, there’s no change. If the layout used to suck, it should be fine now. CSS is such a tar baby, I won’t tell you how long it took me.
And I went and downloaded 400+ new mammal pics, should be enough to keep the site in mammals all year!
Here’s an idea: wall moisture sensor. Water damage can be hard to spot early on and by the time you notice it the damage and rot may be extensive. So add a cheap moisture sensor.
The sensor is a 5cm spike with a nickel-sized head. To install it, you push it through the drywall. It expands a bit inside the wall, which helps fix it in place. Also, the back side of the sensor head is adhesive-covered, so it sticks to the plasterboard and seals the hole.
Now on to the working bits. It contains a cheap humidity sensor IC, a control chip, and a battery. On the head there is a tiny solar cell, a pair of contacts, and a red LED. The solar cell gets a bit of energy from room lighting and keeps the unit working for more than a decade. When humidity is detected, the LED flashes. Touch the contacts and the LED lights to show it works. The sensor and a control chip are embedded in the spike.
It should be cheap to make and long lasting. Home owners can buy a couple and pop them into walls they want to monitor–walls of finished basements, walls containing plumbing, exterior walls, etc. Installed, the sensor is unobtrusive.
If you think this could be useful and are interested in manufacturing/selling this send an email.
        [ ]  <- plasterboard
R       [ ]
o     ||[ ]
o     ||========*  <- sensor spike
m     ||[ ]
        [ ]  <- inner wall
        [ ]
A high-end CPU these days uses nearly 100W of power. But it doesn’t have nearly the computing power of a human brain. AI is in good part, perhaps mainly, a software problem, but raw computing power seems lacking too. So how many of today’s CPUs would it take to build a computer with human intelligence? Say at least 10,000 Opterons.
This 10K-CPU computer system would use 1 MW of power. So how does that compare to a human brain? A person runs on 2000 kcal/day.
A day has 86,400 s, so that is 23 cal/s; at 4.187 J/cal that is 97 J/s, and a J/s is a watt, so about 97 W.
Say 40% of the body’s energy is used by the brain. Then a person’s brain uses about 40 W, as much as a weak light bulb. Which is order-of-magnitude correct: your head is a little cooler than a weak bulb, but the bulb is smaller.
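The arithmetic is easy to sanity-check in one line of awk (4187 J/kcal; the exact figures are ~97 W and ~39 W, which I round to 40):

```shell
# Back-of-envelope check: 2000 kcal/day in watts, then the brain's 40% share
awk 'BEGIN {
  body  = 2000 * 4187 / 86400   # joules per day over seconds per day: ~97 W
  brain = 0.40 * body           # ~39 W
  printf "body: %.0f W, brain: %.0f W, 1MW-vs-brain ratio: %.0f\n", body, brain, 1e6 / brain
}'
```

The ratio comes out near 26,000, consistent with the 25,000X figure.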
So a human-intelligence computer would use 1 MW of power while a person’s brain uses 40 W. The human brain is 25,000X more efficient than the computer. This says a few things about today’s computers. They are terribly inefficient, at least for the types of computations an AI needs to do. And today’s AI software design doesn’t capture the organization of biological computers. We have 1000-CPU systems today, and could build 10K-CPU systems. But no system today is as clever as a mouse. Today’s AI may have crested the housefly-brain goal post. But the lack is clearest in the hardware architecture.