Author Archives: Magnus

Creation of a black hole is left as an exercise for the reader

At long last, the LHC is almost ready. Will we be here for a late lunch, or will it all be over in a few hours? Just in case this is Armageddon, I won’t be doing the dishes today. Why bother?

This is what it is all about:

As has already been discussed, the LHC will make collisions with a much lower centre-of-mass energy than some of the cosmic rays that have been bombarding the Earth and other astronomical bodies for billions of years. We estimate that, over the history of the Universe, Nature has carried out the equivalent of 10^31 LHC projects (defined by the integrated luminosity for cosmic-ray collisions at a centre-of-mass energy of 14 TeV or more), and continues to do so at the rate of over 10^13 per second, via the collisions of energetic cosmic rays with different astronomical bodies.

There is, however, one significant difference between cosmic-ray collisions with a body at rest and collisions at the LHC, namely that any massive new particles produced by the LHC collisions will tend to have low velocities, whereas cosmic-ray collisions would produce them with high velocities. This point has been considered in detail since the 2003 report by the LHC Safety Study Group. As we now discuss, the original conclusion that LHC collisions present no dangers is validated and strengthened by this more recent work.

We recall that the black holes observed in the Universe have very large masses, considerably greater than that of our Sun. On the other hand, each collision of a pair of protons in the LHC will release an amount of energy comparable to that of two colliding mosquitos, so any black hole produced would be much smaller than those known to astrophysicists. In fact, according to the conventional gravitational theory of General Relativity proposed by Einstein, many of whose predictions have subsequently been verified, there is no chance that any black holes could be produced at the LHC, since the conventional gravitational forces between fundamental particles are too weak. However, there are some theoretical speculations that, when viewed at very small distances, space may reveal extra dimensions. In some such theories, it is possible that the gravitational force between pairs of particles might become strong at the energy of the LHC.

As was pointed out 30 years ago by Stephen Hawking, it is expected that all black holes are ultimately unstable. This is because of very basic features of quantum theory in curved spaces, such as those surrounding any black hole. The basic reason is very simple: it is a consequence of quantum mechanics that particle-antiparticle pairs must be created near the event horizon surrounding any black hole. Some particles (or antiparticles) disappear into the black hole itself, and the corresponding antiparticles (or particles) must escape as radiation. There is broad consensus among physicists on the reality of Hawking radiation, but so far no experiment has had the sensitivity required to find direct evidence for it.

Independently of the reasoning based on Hawking radiation, if microscopic black holes were to be singly produced by colliding the quarks and gluons inside protons, they would also be able to decay into the same types of particles that produced them. The reason is that in this case they could not carry any conserved quantum number that is not already carried by the original quarks and gluons, and their decay back to the initial-state partons would be allowed. For this reason, a microscopic black hole cannot be completely black. In standard quantum physics, the decay rate would be directly related to the production rate, and the expected lifetime would be very short.
The case of pair production of black holes carrying new and opposite conserved quantum numbers leads to similar conclusions: only their ground state is guaranteed to be stable, and any further accretion of normal matter in the form of quarks, gluons or leptons would immediately be radiated away. Both this and the existence of Hawking radiation are valid in the extra-dimensional scenarios used to suggest the possible production of microscopic black holes.

One might nevertheless wonder what would happen if a stable microscopic black hole could be produced at the LHC. However, we reiterate that this would require a violation of some of the basic principles of quantum mechanics (which is a cornerstone of the laws of Nature), in order for the black hole decay rate to be suppressed relative to its production rate, and/or of general relativity, in order to suppress Hawking radiation.

Most black holes produced at the LHC or in cosmic-ray collisions would have an electric charge, since they would originate from the collisions of charged quarks. A charged object interacts with matter in an experimentally well-understood way. A direct consequence of this is that charged and stable black holes produced by the interactions of cosmic rays with the Earth or the Sun would be slowed down and ultimately stopped by their electromagnetic interactions inside these bodies, in spite of their initial high velocities. The complete lack of any macroscopic effect caused by stable black holes, which would have accumulated in the billions during the lifetime of the Earth and the Sun if the LHC could produce them, means that either they are not produced, or they are all neutral and hence none are stopped in the Earth or the Sun, or they have no large-scale effects even if they are stopped.

If a black hole were to be produced by a cosmic ray, as it travelled through the Earth it would preferentially absorb protons and neutrons, in similar numbers, because their masses are larger than that of the electron. It would, therefore, develop and maintain a positive charge, even if it were produced with no electric charge. The standard neutralization process due to the quantum creation of particle-antiparticle pairs near the horizon (the Schwinger mechanism) relies on principles very similar to those at the basis of Hawking radiation, and would likely not operate if the latter were suppressed. Thus, combining the hypotheses that black holes are simultaneously neutral, stable and able to accrete matter requires some further deviation from basic physical laws. There is no concrete example of a consistent scenario for microphysics that would exhibit such behaviour. Furthermore, it is possible to exclude any macroscopic consequences of black holes even if such unknown mechanisms were realized, as we now discuss.

The rate at which any stopped black hole would absorb the surrounding material and grow in mass is model-dependent. This is discussed in full detail elsewhere, where several accretion scenarios, based on well-founded macroscopic physics, have been used to set conservative, worst-case limits on the black hole growth rates in the Earth and in denser bodies like white dwarfs and neutron stars. In the extra-dimensional scenarios that motivate the existence of microscopic black holes (but not their stability), if there are seven or more dimensions the rate at which absorption would take place would be so slow that the Earth would survive for billions of years before any harm befell it.
The reason is that in such scenarios the size of the extra dimensions is very small, so small that the evolution driven by the strong extra-dimensional gravitational forces terminates while the growing black hole is still of microscopic size.

If, on the other hand, there are only five or six dimensions of space-time relevant at the LHC scale, the gravitational interactions of black holes are strong enough that their impact, should they exist, would be detectable in the Universe. In fact, ultra-high-energy cosmic rays hitting dense stars such as white dwarfs and neutron stars would have produced black holes copiously during their lifetimes. Such black holes, even if neutral, would have been stopped by the material inside such dense stars. The rapid accretion due to the large density of these bodies, and to the strong gravitational interactions of these black holes, would have led to the destruction of white dwarfs and neutron stars on time scales much shorter than their observed lifetimes. The final stages of their destruction would have released explosively large amounts of energy, which would have been highly visible. The continued observation of white dwarfs and neutron stars, which would otherwise have been destroyed in this way, tells us that cosmic rays do not produce such black holes, and hence neither will the LHC.

To conclude: in addition to the very general reasoning excluding the possibility that stable black holes exist, and in particular that they could only be neutral, we therefore have very robust empirical evidence either disproving their existence, or excluding any consequences of it.
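Out of curiosity, the mosquito comparison holds up on the back of an envelope. Assuming a mosquito of roughly 2.5 mg flying at about 1 m/s (my own rough guesses, not numbers from the report), a quick check with bc:

# 14 TeV in joules: 14e12 eV times 1.602e-19 J/eV -> about 2.2e-6 J
echo "14 * 10^12 * 1.602 * 10^-19" | bc -l
# kinetic energy of one mosquito, 0.5*m*v^2 with m = 2.5e-6 kg -> about 1.3e-6 J
echo "0.5 * 2.5 * 10^-6 * 1^2" | bc -l

Two such mosquitos bumping into each other carry about 2.5e-6 J, which really is in the same ballpark as a single 14 TeV collision.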

Back Home

I’ve just spent a couple of days hiking in Sweden. It was work-related and great fun!

There are some pictures from the trip in the usual place and a small treat here.

Thermodynamics and girlfriends

I came across an article recently about Drake’s equation and the number of potential girlfriends in London. That led me to a similar paper, which states:

   Since it is much more likely to have a girlfriend, we can suppose that in a random system (one without social standards, fear or psychological complexes) nearly everybody would have a girlfriend. Since this is not the case and we don’t want to violate the second law of thermodynamics we can suppose that people without girlfriends do more work in order to go to a state of lower entropy.

This is one of the weirdest conclusions I have ever heard and I love it! It also contradicts the Drake-Backus equation in the most humorous way. Thermodynamics rocks!
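For flavour, a Backus-style estimate is just a population multiplied by a chain of filtering fractions, Drake-equation style. A quick sketch (every number below is invented for illustration, not taken from either paper):

# population * fraction women * age-appropriate * single * mutual attraction
echo "8000000 * 0.5 * 0.2 * 0.5 * 0.05" | bc
# -> 20000.00, and every extra fraction shrinks the pool further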

You can read all of it here.

rsync and backup

The setup is: Clients -> Server -> NAS, with ssh from client to server and cifs from server to NAS.

First, make SSH keypairs for the clients; if the backup process is supposed to run unsupervised, generate the keypairs without passphrases (a sketch of the commands follows after the script). Then I needed to fiddle with the backup share, since its permissions didn’t play well with rsync and generated a lot of noise in the log. Stuff got backed up, though. Here’s what my backup script ended up looking like:

#!/bin/sh
# copy from CLIENTBACKUPDIR to the /mnt/Backup/$BACKUPUSER directory on the server
CLIENTBACKUPDIR=XXX
BACKUPUSER=XXX
SERVERHOST=URL
SSHUSER=XXX
cd $HOME || exit 1
# -a archive, -z compress in transit, -u skip files that are newer on the receiver
rsync --delete --exclude-from=$HOME/.rsync/exclude-backup \
--timeout=999 -azu \
-e "ssh -l $SSHUSER -i .ssh/backup -ax -o ClearAllForwardings=yes" \
$HOME/ $SERVERHOST:/mnt/Backup/$BACKUPUSER
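For completeness, the keypair step could look something like this on each client (a sketch assuming a reasonably modern OpenSSH with ed25519 support; the file name .ssh/backup matches the -i option in the script, and -N "" gives the empty passphrase for unsupervised runs):

# generate a dedicated backup key with an empty passphrase
ssh-keygen -t ed25519 -N "" -f $HOME/.ssh/backup
# install the public key for the backup login on the server
ssh-copy-id -i $HOME/.ssh/backup.pub $SSHUSER@$SERVERHOST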
The $HOME/.rsync/exclude-backup file looks something like this:
# Backup exclude file
- /Media/*
- /Music/*
- /.*
The paths are anchored to the transfer root, which here is $HOME (a leading / in an rsync exclude pattern means the top of the transfer, not the filesystem root).
The noperm option made a big difference in /etc/fstab on the server; without it, rsync threw up a lot of permission errors. cifs perhaps isn’t ideal, but it gives me more options.
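For reference, the cifs line in the server’s /etc/fstab ended up looking roughly like this (the share name, credentials file and uid/gid below are placeholders, not my actual values):

# NAS backup share; noperm makes the client skip permission checks,
# so rsync stops tripping over ownership it cannot set on the share
//nas/backup  /mnt/Backup  cifs  credentials=/etc/nas-credentials,uid=backup,gid=backup,noperm  0  0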
This even works with DeltaCopy on my wife’s Win7 netbook. And it was a no-brainer from all of my Debian machines and my mom’s Ubuntu netbook.