
COSMIC RAYS AND TESTS OF
FUNDAMENTAL PRINCIPLES
Luis Gonzalez-Mestres
LAPP, CNRS-IN2P3 – Université de Savoie
France
The relativity principle
Henri Poincaré, 1895
”A propos de la théorie de M. Larmor”
L’Eclairage électrique, Vol. 5, 5.
”Absolute motion of matter, or, to be more
precise, the relative motion of weighable
matter and ether, cannot be disclosed. All
that can be done is to reveal the motion of
weighable matter with respect to weighable
matter”.
Possible Lorentz symmetry breaking
Albert Einstein, 1921
Geometry and Experience (English translation, 1922)
"It is true that this proposed physical interpretation
of geometry breaks down when applied
immediately to spaces of sub-molecular order of
magnitude. But nevertheless, even in questions as
to the constitution of elementary particles, it
retains part of its importance. For even when it is a
question of describing the electrical elementary
particles constituting matter, the attempt may still
be made to ascribe physical importance to those
ideas of fields which have been physically defined
for the purpose of describing the geometrical
behaviour of bodies which are large as compared
with the molecule. Success alone can decide as to
the justification of such an attempt, which
postulates physical reality for the fundamental
principles of Riemann's geometry outside of the
domain of their physical definitions. It might
possibly turn out that this extrapolation has no
better warrant than the extrapolation of the idea of
temperature to parts of a body of molecular order
of magnitude”
Source :
MacTutor History of Mathematics Archive
More than 20 orders of magnitude…
… between molecular distance scales and
those associated with the wavelengths of the
highest-energy cosmic rays observed.
And no well-established violation of relativity !
What to do ?
Follow Einstein’s reasoning : Try to apply, try to
break, make measurements… and see what
happens => Cosmic-ray experiments
Other fundamental principles to test
• Quantum mechanics – Standard
uncertainty principle, …
• Energy and momentum conservation
as a consequence of space-time
translation invariance
• (At least) four effective space-time
dimensions
• (CPT, Lagrange-Hamilton, vacuum…)
Models of Lorentz symmetry violation (LSV)
that can be tested by UHECR. Example : QDRK
(quadratically deformed relativistic kinematics)
E = (2π)⁻¹ h c a⁻¹ e (k a)
e² (k a) ≃ (k a)² − α (k a)⁴ + (2π a)² h⁻² m² c²
k = wave vector, a = fundamental length
Expansion for ka << 1, α (ka)⁴ generates LSV
New physics when α (ka)⁴ becomes of the
same order as the mass term (2π a)² h⁻² m² c²
Needs an absolute (vacuum) rest frame (VRF)
Otherwise : no observable effect for UHECR (go to the
center of mass frame, corrections to SR are too small)
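As a minimal numerical illustration of that remark (standard values ; not from the talk) : for a 10²⁰ eV proton hitting a CMB photon, the center-of-mass energy is only about 1 GeV, where Planck-suppressed corrections are utterly negligible.

```python
# Minimal sketch (standard values, not from the talk): CM energy of a
# 10^20 eV proton hitting a typical CMB photon head-on.
import math

E_p = 1e20      # proton energy, eV
m_p = 0.938e9   # proton rest energy, eV
eps = 6e-4      # typical CMB photon energy, eV

# Lorentz-invariant s = m_p^2 + 4*E_p*eps for a head-on collision
# (ultrarelativistic proton, natural units with energies in eV).
s = m_p**2 + 4 * E_p * eps
print(f"sqrt(s) ~ {math.sqrt(s):.2e} eV")  # ~1.1e9 eV: the photopion region,
                                           # far below any Planck-scale effect
```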
Example of new physics : Possible suppression of the
GZK cutoff, but there is more…
=> transition region E ≈ E (trans) where :
α (k a)⁴ ≈ (2π a)² h⁻² m² c²
Above E (trans), kinematical balances are modified =>
the GZK cutoff would disappear because of the new
cost in energy to split p with the deformation term
− α p c (k a)²/2 in the dispersion relation. BUT QUESTIONS :
What is the right value of α for each particle ? What is the
« fundamental » value of α for standard matter ? => take
protons and/or nuclei, or quarks and gluons ? Estimate the difference.
Is the fundamental scale the Planck scale ?
See CRIS 2008 Proceedings
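As a rough cross-check of where E (trans) falls, here is a minimal Python sketch (assuming a = Planck length, E ≈ ħck, and solving the balance condition above for a proton) :

```python
# Minimal sketch: solve alpha*(ka)^4 = (2*pi*a)^2 h^-2 m^2 c^2 with
# E ~ hbar*c*k, giving E(trans) = alpha^(-1/4) * sqrt(hbar * c^3 * m / a).
import math

hbar = 1.0545718e-34    # J s
c    = 2.99792458e8     # m/s
a    = 1.616255e-35     # m, Planck length (assumed fundamental length)
m    = 1.67262192e-27   # kg, proton mass
eV   = 1.602176634e-19  # J per eV

for alpha in (1.0, 0.1, 1e-6):
    E_trans = alpha**-0.25 * math.sqrt(hbar * c**3 * m / a)  # joules
    print(f"alpha = {alpha:>6}: E(trans) ~ {E_trans / eV:.1e} eV")

# alpha ~ 0.1-1 puts E(trans) at a few 10^18 eV, just below the GZK
# region; alpha ~ 1e-6 pushes it to ~1e20 eV, matching the quoted bound.
```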
Present data do not necessarily exclude « maximal » LSV
with α ≈ 0.1 – 1 for quarks and gluons, a = Planck length
=> UHECR composition and sources ?
Gonzalez-Mestres, 1997 :
« For α a² > 10⁻⁷² cm² , and assuming a universal value of
α, the GZK cutoff is suppressed for the particles under
consideration and ultra-high energy cosmic rays (e.g.
protons) produced anywhere in the presently observable
Universe can reach the earth without losing their energy
in collisions with the cosmic microwave background
radiation » (α > 10⁻⁶ if a = Planck length)
It was actually assumed that the highest-energy
cosmic rays are protons => If so, this is the upper
bound on α (proton) from the possible existence of
the GZK cutoff => For large systems, α must be
proportional to M⁻² in order to get a consistent
QDRK => extra M⁻² factor when comparing with the
M²/p mass term.
Gonzalez-Mestres, 1997 : If particle 1 has a positive
value of α larger than that of particle 2, particle 2
can decay into particle 1 + (…) at high enough energy
( p -> p + γ ). But : i) often dynamically difficult ; ii)
time dilation => In general, a very slow process => f.i.
can the decays p -> p + γ , N -> N + γ replace the GZK
cutoff for protons and nuclei ?
=> Possible suppression of photons by γ -> e⁺e⁻ ?
Gonzalez-Mestres, 1997 and 2000 : suppression of
synchrotron radiation in UHE cosmic accelerators.
=> New experimental tests of LSV, bounds… ?
Pierre Auger Collaboration, February 2010 :
“… a suppression of the flux with respect to a power law
extrapolation is found, which is compatible with the
predicted Greisen-Zatsepin-Kuz’min (GZK) effect, but could
also be related to the maximum energy that can be reached
at the sources.”
Suppression of synchrotron radiation by LSV
(Gonzalez-Mestres, 2000, arXiv:astro-ph/0011182 , on LSV
and acceleration in relativistic shocks) => Extra check ?
LSV at energies above E (trans) : the emission of synchrotron
radiation by a UHE particle (e.g. proton) becomes more and
more difficult, as the negative deformation energy increases
Pierre Auger Collaboration, September 2010 :
[arXiv:1009.1855 ] Measurements by the Pierre Auger Observatory of the depth
shower maximum and its fluctuations indicate a trend toward heavy nuclei with
increasing energy. Although the measurements available now are only up to about
55 EeV, the trend suggests that primary CRs are likely to be dominated by heavy
nuclei at higher energies. This interpretation of the shower depths is not certain,
however. It relies on shower simulations that use hadronic interaction models to
extrapolate particle interaction properties two orders of magnitude in center-of-mass energy beyond the regime where they have been tested experimentally. A
knowledge of CR composition is important for deciding which of several source
scenarios is more likely. The trajectories of highly charged nuclei are expected to
undergo large deflections due to the Galaxy’s magnetic fields. While a correlation of
arrival directions with nearby matter on small angular scales is plausible for protons
above 55 EeV, it is puzzling if the CRs are heavy nuclei.
Definitive conclusions must await additional data. The correlation of recent data
with objects in the VCV catalog is not as strong as that observed in 2007. If the
evidence for anisotropy is substantiated by future data, then it should also become
possible to discriminate between different astrophysical scenarios (…)
More involved scenarios
D-foam models : space-time foam models based on recoiling
D-branes, where Lorentz symmetry is violated for photon
propagation and for reactions not involving incident charged
particles, but not for charged-particle propagation and
interactions => LDRK (linearly deformed relativistic
kinematics, E ≃ pc − α’ p²/M(Planck) ) for photons.
J. Ellis, N. E. Mavromatos, D. V. Nanopoulos, arXiv:1004.4167
L. Maccione, S. Liberati, G. Sigl, arXiv:1003.5468v1 and v2
According to J. Ellis et al. “particle interactions conserve
Lorentz-invariantly” energy and momentum “in a leading
approximation”. All formulae are given at a first-order
approximation => What happens at “non-leading” orders,
which contain in particular the quadratic deformations ?
LDRK and phenomenology
My 1997 papers discarded LDRK for phenomenological
reasons : it naturally leads to too strong effects.
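An order-of-magnitude sketch of that point (dimensionless coefficients set to 1 ; an illustration, not a bound computation) : at UHECR energies the first-order deformation exceeds the second-order one by a factor of roughly E(Planck)/E ~ 10⁸.

```python
# Rough sketch: relative size of linear (LDRK) vs quadratic (QDRK)
# deformation terms, Delta_E ~ E^2/E_Planck vs E^3/E_Planck^2 (c = 1).
E_PLANCK = 1.22e28  # eV

for E in (1e19, 1e20):
    linear    = E**2 / E_PLANCK
    quadratic = E**3 / E_PLANCK**2
    print(f"E = {E:.0e} eV: linear/quadratic ~ {linear / quadratic:.0e}")
    # -> 1e+09 and 1e+08: a linear deformation of the same fundamental
    # origin produces hugely amplified effects at the same energy
```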
Even for a more sophisticated string model considered by
Ellis et al., where linear LSV is strongly restricted, Maccione
et al. find that “this model would predict too many photons
in the ultra-high energy cosmic ray flux to be consistent with
observations”, in spite of the fact that “only purely neutral
particles, such as photons or Majorana neutrinos, possess LV
modified dispersion relations”. The latest paper by Ellis et al.
is meant to avoid the recent criticism raised by Maccione
et al. using Auger data ( arXiv:0712.1147 ).
BUT WHAT CAN BE
THE FUNDAMENTAL PHYSICS BEHIND STRINGS ?
STRINGS, 40 YEARS AGO
String theory has its origins in the DUAL RESONANCE MODEL
introduced by Gabriele Veneziano in 1968, using the Euler
beta function for a four-point amplitude. The model was
generalized to an N-point amplitude by Ziro Koba and Holger
Bech Nielsen. Dual models of strong interactions were
described by Yoichiro Nambu, H.B. Nielsen and Leonard
Susskind in terms of strings (infinite number of simple
harmonic oscillators describing the motion of an extended
one-dimensional string). In 1970, H.B. Nielsen and Poul
Olesen, and Bunji Sakita and Miguel Angel Virasoro obtained
the dual amplitudes as an approximation to the sum of a
large number of planar “fishnet” Feynman diagrams.
=> Natural link strings <-> underlying constituent structure
Why not a constituent structure behind
string theories ? => SUPERBRADYONS
Vacuum = material medium, « elementary » particles =
excitations of this medium, Lorentz symmetry could be
of dynamical origin. In any solid-state textbook, the simplest
dispersion relation for phonons in a lattice is of the QDRK type :
e (k a) = 2 sin (ka/2), with c (sound) instead of c (light) =>
Assume the real fundamental matter has a critical speed
in vacuum : c (fundamental matter) >> c (light) , just as c
(light) >> c (sound) , but not tachyonic : E > 0, m > 0,
standard Lorentz symmetry replaced by another symmetry
=> SUPERBRADYON HYPOTHESIS <= Einstein, 1921 ?
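A quick symbolic check (sympy ; an illustration, not part of the talk) that this phonon dispersion really is of the massless QDRK type, with α = 1/12 :

```python
# Sketch: expand e(ka)^2 = (2*sin(ka/2))^2 for small ka and compare with
# the QDRK form e^2 ~ (ka)^2 - alpha*(ka)^4 (massless case).
import sympy as sp

x = sp.symbols('x')                   # x stands for k*a
e_squared = (2 * sp.sin(x / 2))**2
print(sp.series(e_squared, x, 0, 6))  # -> x**2 - x**4/12 + O(x**6)
# i.e. alpha = 1/12 for the simplest lattice phonon dispersion
```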
Abdus Salam, 1979 Nobel lecture
”Einstein knew that nature was not economical of
structures: only of principles of fundamental applicability.
The question we must ask ourselves is this: have we yet
discovered such principles in our quest for elementarity, to
justify having fields with such large numbers of components
as elementary ?
Recall that quarks carry at least three charges (colour,
flavour and a family number). Should one not, by now,
entertain the notions of quarks (and possibly of leptons) as
being composites of some more basic entities (PRE-QUARKS
or PREONS), which each carry but one basic charge ? “
Source : Nobel Foundation site
QUESTIONS ON POSSIBLE PREON MODELS AND THEORIES
- Should preons have the same critical speed in vacuum as
conventional particles ? -> Not the case for γ and phonons.
Speed of light >> speed of sound.
- Should standard « elementary » particles be made of two
or three preons, just as mesons and baryons are made of
two or three quarks/antiquarks ? -> Not the case for
phonons, solitons… in condensed matter physics.
Some basic ingredients of the present standard model of
particle physics came from condensed matter physics :
spontaneous symmetry breaking, Higgs mechanism (from
superfluidity, superconductivity…) .
MAYBE ONE NEEDS A PREON APPROACH DIFFERENT
FROM THOSE CONSIDERED THREE DECADES AGO
=> SUPERBRADYONS ? (Gonzalez-Mestres, 1995)
cs >> c
Es = cs √(ps² + ms² cs²) (if new Lorentz symmetry)
ps = ms vs / √(1 − vs²/cs²)
« Cherenkov radiation » in vacuum for vs > c =>
spontaneous emission of « conventional » particles.
Needs compatibility with low-energy bounds on LSV. Must
preserve conventional relativity in the « low-energy limit ».
=> Ultra-high energy phenomenon. But superbradyonic
remnants may exist in the present universe and play a
cosmological role => Dark matter, dark energy ?
SUPERBRADYONS IN OUR UNIVERSE ?
If superbradyons obey a new Lorentz symmetry with
critical speed cs (fundamental matter) >> c (light),
« relativistic » superbradyons will have energies >> ps c and
will in principle be able to spontaneously emit « standard »
particles => Such processes cease to be allowed when the
superbradyon speed becomes close to c (light).
=> A cosmological sea of superbradyons traveling at
speed c (light) ? => Or slower superbradyons ?
What about superbradyons in the physical vacuum ?
How does the vacuum behave at very short distances ?
Nature, 466, 426 (2010)
« Collider gets yet more exotic 'to-do' list »
Zeeya Merali
« [at ICHEP 2010] Landsberg (…) is presenting an ambitious new theory
in which the number of dimensions in the Universe increases as it
grows in size. He and his colleagues propose that the Universe began
with just one spatial dimension and one time dimension. “Think of the
Universe as a one-dimensional thread that gradually wove itself into a
two-dimensional tapestry as it grew, and then wrapped itself up
further to create three dimensions”, he says. (…) Evidence of vanishing
dimensions may already have been spotted in the shower of particles
created by cosmic rays entering our atmosphere (…) [in] cosmic-ray
data collected 15 years ago in the Pamir mountains in central Asia. »
See also the transparencies by Greg Landsberg at the ICHEP 2010 site.
ORIGINAL PAPER :
arXiv:1003.5914
Vanishing Dimensions and Planar Events at the LHC
Luis Anchordoqui, De Chang Dai, Malcolm Fairbairn, Greg Landsberg,
Dejan Stojkovic
We propose that the effective dimensionality of the space we live in
depends on the length scale we are probing. As the length scale
increases, new dimensions open up. At short scales the space is
lower-dimensional; at the intermediate scales the space is
three-dimensional; and at large scales, the space is effectively
higher-dimensional. This setup allows for some fundamental problems in
cosmology, gravity, and particle physics to be attacked from a new
perspective. The proposed framework, among the other things, offers
a new approach to the cosmological constant problem and results in
striking collider phenomenology.
The approach by Anchordoqui et al. explicitly
breaks Lorentz symmetry => requires a VRF
The need of new physics for the Pamir data is not obvious, and
LSV can in principle yield other ways to solve problems
related to renormalization. Furthermore, does one need, in
LSV models, such a drastic dimensional suppression to get
alignment phenomena at very high energy ?
A threshold at the 10¹⁶ eV energy scale would be equivalent to a
threshold at the 10⁻²⁰ cm distance scale. => Perhaps the vacuum can then
react to the collision between the cosmic ray and the atmospheric
target, by capturing a comparatively small amount of energy
equivalent to a fraction of the target energy, and release this energy in
the form of superbradyonic matter and waves (ΔE >> Δp c).
By capturing a fraction of the target energy, but not its
conventional equivalent in momentum, the vacuum
generates a suppression of the available energy for the
transverse momenta of the high-energy particles produced in
the collision of the cosmic ray with the atmosphere.
(Gonzalez-Mestres, arXiv:1009.1853 )
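The stated energy-distance equivalence is just λ = hc/E (a one-line check, not from the talk) :

```python
# Sketch: wavelength associated with a 1e16 eV threshold, lambda = h*c/E.
h_c = 1.23984e-6   # h*c in eV*m
E   = 1e16         # eV
print(f"lambda ~ {h_c / E * 100:.1e} cm")  # ~1.2e-20 cm, as stated
```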
A few rough numbers : with ΔE and the superbradyon rest energy in
the GeV range, cs ~ 10⁶ c and vs ~ c would imply :
superbradyon mass ~ meV c⁻²
superbradyon momentum ~ meV c⁻¹
superbradyon kinetic energy ~ meV
=> not directly detectable, and expected to interact very weakly
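These orders of magnitude follow from rescaling the rest energy by (c/cs)² ; a minimal sketch with the slide's inputs (cs/c = 10⁶, rest energy m cs² ~ 1 GeV, vs ~ c) :

```python
# Sketch of the slide's arithmetic: m*c^2 = (m*cs^2) * (c/cs)^2, etc.
ratio     = 1e6          # cs / c, assumed as on the slide
E_rest_cs = 1e9          # m * cs^2 in eV (~ GeV)

m_c2 = E_rest_cs / ratio**2          # equivalent m*c^2
print(f"mass           ~ {m_c2:.0e} eV/c^2")  # ~ 1e-3 eV = meV
print(f"momentum (v~c) ~ {m_c2:.0e} eV/c")    # p*c ~ m*c^2 for v ~ c
print(f"kinetic energy ~ {m_c2 / 2:.0e} eV")  # m*v^2/2 ~ meV/2
```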
DARK MATTER SUPERBRADYONS AS A POSITRON SOURCE ?
PAMELA Collaboration, Nature 458, 607 (2 April 2009) :
An anomalous positron abundance in cosmic rays with
energies 1.5–100 GeV
(…) Previous statistically limited measurements of the ratio of positron
and electron fluxes have been interpreted as evidence for a primary
source for the positrons, as has an increase in the total
electron+positron flux at energies between 300 and 600 GeV. Here we
report a measurement of the positron fraction in the energy range
1.5–100 GeV. We find that the positron fraction increases sharply over
much of that range, in a way that appears to be completely
inconsistent with secondary sources. We therefore conclude that a
primary source, be it an astrophysical object or dark matter
annihilation, is necessary.
« Non-relativistic » remnant superbradyons
with v close to c ?
E ≃ m cs² + m v²/2
p ≃ m v
Ekin ≃ m v²/2
To explain (or contribute to) the positron flux (Gonzalez-Mestres, arXiv:0905.4146) :
- Decays or annihilations into « conventional »
particles => use the whole superbradyon energy, including
rest energy
- « Cherenkov » decays emitting « conventional »
particles => use only a fraction of the superbradyon kinetic
energy
Equivalent superbradyonic rest energies
Decays and annihilations spending the superbradyon
rest energy : if cs ~ 10⁶ c and m cs² ~ 1 TeV , v ~ c =>
superbradyon mass ~ eV c⁻²
superbradyon momentum ~ eV c⁻¹
superbradyon kinetic energy ~ eV
« Cherenkov » decays using only a fraction of the
superbradyon kinetic energy : m c² ~ 1 TeV =>
m cs² ~ 10²⁴ eV =>
REAL SUPERHEAVY OBJECT
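Both estimates are the same (cs/c)² = 10¹² rescaling run in the two directions (a sketch of the slide's arithmetic, with cs/c = 10⁶ assumed) :

```python
# Sketch: rest-energy conversions for cs ~ 1e6 c.
ratio = 1e6  # cs / c, assumed
print(f"(a) m*cs^2 ~ 1 TeV => m*c^2  ~ {1e12 / ratio**2:.0e} eV")  # ~ eV
print(f"(b) m*c^2  ~ 1 TeV => m*cs^2 ~ {1e12 * ratio**2:.0e} eV")  # ~ 1e24 eV
```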
SUPERBRADYONS AND QDRK ARE JUST A TOOL
The basic question is : is there « something »
beyond the Planck scale, and can we experimentally find
some trace of such a « beyond » ?
Standard relativity is not the only fundamental
principle concerned => Quantum mechanics can be
« deformed » in a similar way.
CODATA value of h : 6.62606896 × 10⁻³⁴ J s, with a 5.0
× 10⁻⁸ relative standard uncertainty, based on low-energy
measurements (P. J. Mohr, B. N. Taylor and D. B.
Newell, Rev. Mod. Phys. 80, 633, 2008) => what
happens at ultra-high energies ?
There has already been important work on possible
departures from standard quantum mechanics : f.i. Julius
Wess, q-Deformed Heisenberg Algebras, arXiv:math-ph/9910013
(see also the references given in Gonzalez-Mestres, arXiv:0908.4070 ).
Develop the equivalent of QDRK for quantum
mechanics ? => New commutation relations, where
the effect of the modification increases with energy
=> can lead, for instance, to unexpected intrinsic
uncertainties (direction, momenta, energy…)
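One widely studied example of such a deformed commutation relation (given purely as an illustration ; it is not a construction from this talk) is a commutator modified by a term that grows with momentum :

```latex
% Illustration only (not from the talk): a deformed Heisenberg algebra
% whose effect grows with momentum.
\[
  [\hat{x},\hat{p}] \;=\; i\hbar\,\bigl(1+\beta\,\hat{p}^{2}\bigr)
  \;\;\Longrightarrow\;\;
  \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\,\bigl(1+\beta\,(\Delta p)^{2}\bigr),
\]
% implying a minimal position uncertainty \(\Delta x_{\min}=\hbar\sqrt{\beta}\);
% low-energy bounds are respected if \(\beta\sim (M_{\mathrm{Planck}}\,c)^{-2}\).
```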
More basically, for instance : can Hamiltonian and
Lagrangian formalisms describe the behaviour of
vacuum at ultra-short distance scales ?
A very small failure of energy and momentum
conservation at ultra-high energies can fake the
Greisen-Zatsepin-Kuz'min (GZK) cutoff. And what is the
vacuum doing at such ultra-short wavelengths ?
Would superbradyons and similar objects obey
quantum mechanics ? Or is quantum mechanics a
« composite » phenomenon ?
And many other similar questions…
TO CONCLUDE :
Cosmic-ray experiments have extraordinary
and unprecedented discovery potential