The New York Times is running a piece about tap water and the regulation thereof, called "That Tap Water Is Legal but May Be Unhealthy". One particular contaminant becomes dangerous on exposure to sunlight, so at a lake in Los Angeles they've tipped 400,000 plastic balls into the water to block out the sunlight.
Perhaps this shows I've been in stat-mech too long. All I could think about upon seeing this picture was - "cool, a massive 2D elastic disc simulation!".
It's quite interesting where the crystal structure is interrupted - each one of those interfaces (grain boundaries) costs a lot of free energy. You can also see it's not truly 2D: along certain stress lines the particles have gone up and over to reduce the energy.
I wonder if it's in equilibrium or whether it'll age with time...
This is what science can do to you :-s
Don't know what fair use would be for stealing this photo, but hopefully if I link to the NYT enough they won't mind - go and click on one of their ads or something...
Wednesday, 9 December 2009
Backup news
Anyone who's been here from the start will know I have a slightly unhealthy obsession with backups. A couple of things have changed since I last blogged about this.
Time Machine
Firstly, I now have a mac at home and I've started using Time Machine. I don't want to pat Apple on the back too much because that really gets on my nerves, but Time Machine is absolutely fantastic.
It's exactly how personal backup software should work. You buy an external hard disk, tell Time Machine to back up there, and then you're done. You never need to worry about it again. Most of the time when I need my backup it's because I've accidentally deleted something I shouldn't have. Time Machine allows you to, as the name suggests, just go back in time and find it before you made the mistake. Works like a dream.
After a botched attempt to upgrade to Snow Leopard I recently had my first call to do a complete system restore. All I can say is that it seemed to work perfectly for me - it didn't even take that long.
Rsync + Windows
At work we back up to an external file server. Until recently that was Linux-based and so I had no trouble using rsync. Now we've been moved to a Windows server, which creates all kinds of problems: rsync just doesn't get on with Windows. Anyway, after a bit of poking around I finally have a script that does the job. This is my basic rsync call now:
rsync -rptgoDhpP --modify-window=1 --delete --log-file=RSYNCLOG --exclude-from=./exclude /home/username/ username
I'm pretty sure most of those options could be replaced with -a but honestly, now it's working I don't want to touch it! The key option is --modify-window. This accounts for the different ways that Windows and Unix file systems time stamp modified files.
SVN - Subversion
For programming and writing papers (in LaTeX) I've started using Subversion to take care of version control. I'm also using a shared repository to co-write a paper at the moment; it handles simultaneous editing quite well. There's a start-up cost in getting your head around how it works (I found this page very helpful), but once you're there it works very nicely.
I mention it here because the version control works a bit like a backup. You can step back through committed versions very easily. If you use OS X then it's installed along with Xcode, so you probably have it. With Linux it'll be in the standard repositories.
Well that's enough backup geekery for this year. Anyone using anything that they're particularly happy with? I've kind of given up on backing up over the internet for now but would be interested to hear if there have been any developments.
Sunday, 29 November 2009
An unintuitive probability problem
Probability can do strange things to your mind. This week I had a probability problem where every time I tried to use intuition to solve it I ended up going completely wrong. I thought I'd share it as I think it's interesting.
Consider a one-dimensional random walk. At each time step my walker will go left with probability p, and right with probability q. It stays where it is with probability 1 - p - q. Furthermore these probabilities are dependent on the walker's position in space, so it's really p(x) and q(x). I'm imagining I'm on a finite line of length L, although it doesn't matter too much.
Now if p(x) = q(x), then we just have a normal random walker. In my problem I have the following setup: q(x - 1) = p(x), but p(x) > q(x). What does this mean? At any given point, x, my walker is more likely to go left than right. If it does go left it will come back with the same rate (although it's more likely to go left again).
So here's the question: if I leave this for a really long time, what is the equilibrium probability distribution for the walker's position, P(x)?
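If your intuition keeps misleading you, it's easy enough to just simulate. Below is a minimal sketch; the particular form of p(x) is a hypothetical choice of mine (any p(x) that decreases with x, combined with q(x) = p(x + 1), matches the setup above):

```python
import random
from collections import Counter

L = 10            # length of the line (arbitrary choice)
STEPS = 1_000_000

def p(x):
    """Probability of stepping left from site x (a hypothetical decreasing choice)."""
    return 0.4 * 0.9 ** x

def q(x):
    """Probability of stepping right from site x, chosen so q(x) = p(x + 1) < p(x)."""
    return p(x + 1)

random.seed(1)
x = L // 2
counts = Counter()
for _ in range(STEPS):
    r = random.random()
    if r < p(x):
        if x > 0:          # attempted left moves off the line are rejected
            x -= 1
    elif r < p(x) + q(x):
        if x < L - 1:      # likewise for right moves at the far end
            x += 1
    counts[x] += 1

# Fraction of time spent at each site
for site in range(L):
    print(site, counts[site] / STEPS)
```

Attempted moves off either end of the line are simply rejected (the walker stays put), which keeps it on the L sites without changing the left/right rates in the interior.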
Labels:
probability,
stat-mech
Friday, 20 November 2009
Great LHC animation
The purpose of this blog was to showcase types of physics other than the LHC. But I can't resist: this is a really nice animated video showing the stages of getting stationary protons up to 7 TeV
http://cdsweb.cern.ch/record/1125472
(via @CERN)
Thursday, 29 October 2009
Speed limit for computer processors - serial vs parallel
This news item from Nature, about this PRL, talks about how computer processors are going to hit a speed limit set by how fast a system can make transitions between quantum states.
There are two independent bounds on this minimum time — one based on the average energy of the quantum system, the other based on the uncertainty in the system's energy. In their calculations, Levitin and Toffoli unify the bounds and show there is an absolute limit to the number of operations that can be achieved per second by a computer system of a given energy.
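If I've understood it right, the average-energy bound the quote mentions is the Margolus-Levitin theorem: a system with average energy E (above its ground state) can pass through orthogonal states at most 4E/h times per second. A quick back-of-the-envelope sketch, where the 1 joule figure is just an illustrative choice:

```python
# Margolus-Levitin bound: a quantum system with average energy E needs at
# least h / (4 E) seconds per orthogonal-state transition, i.e. it can do
# at most 4 E / h "operations" per second.
PLANCK_H = 6.62607015e-34  # Planck's constant in J s (exact SI value)

def max_ops_per_second(energy_joules):
    """Upper bound on orthogonal-state transitions per second at average energy E."""
    return 4.0 * energy_joules / PLANCK_H

# An illustrative 1 joule system: roughly 6e33 operations per second, tops.
print(f"{max_ops_per_second(1.0):.2e}")
```

The striking thing is how far above current hardware that ceiling sits - it's a fundamental limit, not an engineering roadmap.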
I'm not an expert in quantum information so all I can say is that it looks interesting. There are implications for me because most of my work is pretty intensive computer simulation. Some of what I do simply needs fast processors; there are sections of my Monte Carlo simulations that cannot be parallelised (fancy cluster algorithms being one). So for these, in principle, it limits what could ever be done.
However, mostly my limit is on what statistics I can collect, and that can be solved by using more and more processors. The move from a single core as standard to eight these days has been a revolution in what I can get done in a reasonable time scale.
In fact one very interesting development is using standard computer graphics cards to perform molecular dynamics (MD) simulations. I've only read the abstract of this paper I'm afraid, but they've apparently done this. Graphics cards designed for games have processors (GPUs) with many small cores, and these can all work on the problem more efficiently than one super-powered CPU trying to do it on its own.
So next time you say that computer games are a waste of time think of this...
Labels:
computation,
links
Monday, 19 October 2009
There's more to the LHC than bloody black holes
The LHC is cold again. This is very exciting, and it can't come soon enough. In the absence of any actual science going on, an endless stream of bollocks seems to have been coming out about the collider. The latest is this drivel about things coming from the future to... oh God, I can't be bothered. It rather upsets me that the only things people really know about the LHC are that it might make a black hole and that maybe something is coming through time to sabotage it. So I thought I'd talk about why this machine is ridiculously fantastic and complicated (the more likely cause of breakage).
One of the features of synchrotrons that I've always found amazing is the way they cool the beams. By cool I'm not talking about the temperature around the beam pipe (although that's bloody cold too, so that the magnets work). I'll quickly describe what beam cooling is and how people solve it, although I'm still not 100% sure how they've solved it at the LHC.
Our general collider accelerates particles around a ring using strong electric fields. The particles are bent into a circle by bending magnets, and they are kept in a beam by the focussing quadrupole magnets. The effect of these magnets is that if a particle is heading sideways out of the beam, they push it back in towards the axis. In this way the particles kind of snake around the course, never straying too far out of line. The task of cooling the beam is to reduce this snaking as much as possible so that we have a really dense, straight-running beam.
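To see the snaking, here's a toy kick-and-drift sketch of one particle's transverse offset: each pass through a focusing element kicks the particle back towards the axis in proportion to how far out it is, and between elements it drifts freely. The numbers are made up for illustration - this is a cartoon, not real LHC optics:

```python
# Toy model of transverse focusing: a restoring kick at each "magnet",
# then a free drift. The offset x oscillates about the beam axis - the
# snaking described above. k and dt are arbitrary illustrative values.

def track(x, v, kicks=1000, k=0.02, dt=1.0):
    """Kick-drift map: focusing kick toward the axis, then drift between magnets."""
    trajectory = []
    for _ in range(kicks):
        v -= k * x * dt   # kick: push back toward the axis, proportional to offset
        x += v * dt       # drift: move freely between focusing elements
        trajectory.append(x)
    return trajectory

traj = track(x=1.0, v=0.0)
print(max(abs(s) for s in traj))  # the offset stays bounded
```

Plot the trajectory and you get a bounded oscillation about the axis. Real lattices alternate focusing and defocusing quadrupoles (a FODO cell), but the net restoring effect is the same idea; cooling is then the business of shrinking that oscillation amplitude.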
Wednesday, 9 September 2009
Quorum decisions in bacteria
Stumbled across a few nice things related to quorum decision making recently. Remember how sticklebacks make their decisions? Well bacteria do it too, below is a great TED talk by Bonnie Bassler on how they communicate and how they decide to act as an enormous group.
Also came across this article on humans making group decisions in a Kasparov vs The World chess game. It gets the saliva flowing on how you can engineer good decisions.
Addition: Incidentally, I also think this talk is a great example of how to give a science talk. It's a little rushed (probably nerves) but the enthusiasm is fantastic and the use of visual aids is perfect. I'm giving a workshop on presentations so I've been thinking about this stuff a lot recently.
Labels:
behaviour,
biology,
communication,
society,
stat-mech