Einstein realized in the last decade of his life that only a world government could overcome war and hatred on the planet. And he believed he had acquired the right to demand this urgently, in view of nuclear winter having become a real threat in the wake of his own contributions to physics.
His main discovery, however, is the “twin clocks paradox,” overlooked by even his greatest competitor. It describes not just a physical effect but much more. The travelled twin is transported along the time axis at a different (reduced) rate, so he stands younger in age beside his twin brother upon return. This is an ontological change which no one else would have dared consider possible: interfering with the inexorable fist that pushes us all forward along the time axis!
This is Einstein’s deepest discovery. He topped it only once: when he discovered, two years later in 1907, that clocks “downstairs” are rate-reduced, too: the “second twins paradox,” in effect.
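For readers who want the uncontested part of this made concrete: the 1907 “clocks downstairs” effect is ordinary gravitational time dilation. A minimal numerical sketch of the textbook rate factor might look like the following (the constants and the GPS comparison are illustrative choices, not taken from the discussion above):

```python
import math

# Standard general-relativistic rate factor for a static clock at radius r
# outside a mass M (a textbook formula, not Einstein's 1907 derivation itself):
# dtau/dt = sqrt(1 - 2*G*M / (r * c**2))
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s

def clock_rate(mass_kg: float, radius_m: float) -> float:
    """Proper-time rate of a clock at radius_m, relative to a clock at infinity."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C**2))

# Illustration: Earth's surface vs. GPS orbital radius (~26,600 km from centre).
M_EARTH = 5.972e24   # kg
rate_surface = clock_rate(M_EARTH, 6.371e6)
rate_gps = clock_rate(M_EARTH, 2.66e7)

# The "downstairs" clock ticks more slowly than the higher one:
assert 0.0 < rate_surface < rate_gps < 1.0
```

The effect on Earth is tiny (parts in a billion) but real: GPS satellite clocks must be corrected for exactly this rate difference.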
The word “paradox” is a misnomer: “miracle” is the correct word. Imagine staying the hands of time! So everybody sees that what you wrought is a miracle (a Western shaman presenting a tangible feat, a Brothers Grimm fairy tale brought to life, a Jewish miracle revived: “the Lord can be seen”).
Why do I point you to Einstein, the sorcerer? It is because we’d better listen to him. Presently, the whole planet denies his legacy as once before. Deliberately overlooking his second twins paradox amounts to consciously risking the planet for the second time in a row.
The ontologically slowed clocks (downstairs) are not just slower-ticking: they also are proportionally mass-reduced, size increased and charge-reduced. This corollary to Einstein’s 1907 result, called Telemach (since T, L, M, Ch are involved), stays uncontested.
Unfortunately – or rather fortunately – a famous nuclear experiment turns out, I hope in time, to be planet-threatening. Technically speaking, the second twins paradox implies that the artificial black holes CERN is presently attempting to produce: #1) cannot be detected at CERN, #2) are more likely to arise, and #3) will, owing to quantum mechanics, electromagnetism and chaos theory, eat the planet from the inside out in a few years’ time, so that only a 1.8 cm black residue remains.
Is Einstein still so dangerous, 57 years after his passing? This time around, he is imploring us again, taking off his glasses and smiling into the camera: “Please, dear children, do not continue a nuclear experiment that you cannot monitor while ontological implications stand on the list.”
The safety conference, rejected by the Cologne Administrative Court on January 27, 2011, is number 1 on Einstein’s agenda:
The nuclear experiment must be stopped immediately!
The nascent world government is openly asking for this today: This is “Einstein’s miracle.”
Okay?
Time dilation is only a “miracle” if we take the provably incorrect view of Newtonian absolute time as true; it isn’t, hence the repeatable, predictable “miracle”. It also doesn’t really hold back the hands of time: the younger twin doesn’t actually get to live any longer, not from their own subjective point of view or clock. Although it appears to observers who are stationary (relative to the traveller) that the twin has somehow eluded time, the twin will have had little actual subjective experience of time (or life) if they come back from a “long trip” at relativistic velocities. From the travelling twin’s perspective it’s actually a kind of forwards time travel, not a slowing of time (plus they get the perk of being able to reach places their lifespan might otherwise make impossible, within a subjectively short amount of time). But ask them how chronologically old they are, and they won’t answer with the amount of time they’ve been away, but with something much shorter… and they’ll be RIGHT. Relativity, with its observation that the laws of physics are the same regardless of the reference frame / state of motion of the observer, makes this not a miracle but a well-understood and necessary consequence, and makes the idea of absolute time incompatible with actual reality.
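A small sketch of the arithmetic behind the twin scenario just described; the trip figures below are hypothetical, chosen only to make the numbers round:

```python
import math

def lorentz_gamma(v_frac_c: float) -> float:
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_frac_c**2)

def traveller_age(earth_years: float, v_frac_c: float) -> float:
    """Proper time elapsed for the travelling twin during a trip
    that takes earth_years in the stay-at-home frame."""
    return earth_years / lorentz_gamma(v_frac_c)

# A 10-year round trip (Earth frame) at 0.8 c:
# gamma = 1/sqrt(1 - 0.64) = 1/0.6, so the traveller ages only 6 years.
assert abs(traveller_age(10.0, 0.8) - 6.0) < 1e-9
```

Both twins’ clocks run normally in their own frames; the asymmetry comes from the traveller changing frames at turnaround, which is why the scenario is a consequence of relativity rather than a paradox.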
I fail to see how a world government would overcome war and especially hatred. That may have been a bigger blunder for Einstein than his flat and provably incorrect rejection of quantum mechanics (but not, ironically, the “cosmological constant”, which MAY turn out not to be a blunder after all). Certainly if we define “war” as conflicts between nations, then one world government ends war. Sure. Resolution by practical tautology. But what of dissent from within? Civil wars happen, as do justified riots. The civil rights movement in the US involved frequent, useful, and fully justified physical violence. Governments have contained dissent from within for as long as governments have existed, by sowing the seeds of hatred between groups to keep them fighting each other instead of noticing the vast faults of the state, and governments have always been more than happy to use their vast military might to crush rebellions, whether just or otherwise, peaceful or violent.
So maybe, MAYBE, a world government might end the danger of the use of weapons of mass destruction in conflicts, but on the other hand, maybe not. Nations might be happy to send wave after wave of “expendable” individuals out to war, because the value of individual life is effectively zero to a state unless it serves a political purpose; they’ll rarely consider using WMDs other than via threats or propaganda rather than actual use. Desperate disenfranchised groups, facing a leviathan state, especially one that covers the world, might actually use them though… what other choice do they have against such unfair odds? We can prevent that via the building of a surveillance state and the suppression of scientific and technological knowledge and resources, sure. But what kind of world is that? What’s even the point of survival, at that point, if we have only notional “freedoms”, the government can and does know everything we do, and we live in a world with “forbidden” or “secret” knowledge? I like being alive and existing and all, but if the tradeoff for security and a guarantee of my own life and that of Earth or humanity is becoming an Orwellian 1984-style world… death is better. At least you die yourself rather than giving up everything meaningful about you to cling desperately, out of fear of the dark, to some semblance of “life”.
As far as this CERN concern goes… physicists have chimed in time and time again that there is no cause for concern. Hawking radiation is a well-grounded prediction, and it means small black holes wink out of existence rapidly. We also, most importantly, know these experiments aren’t an issue because Mother Nature, the ultimate mad scientist (paraphrasing Seinfeld’s Kramer), performs far higher energy experiments in Earth’s atmosphere all the time, as cosmic rays bombard it daily at energies we can’t yet hope to replicate on Earth (we do them on Earth for the predictability and observability). Whenever you hear doomsday talk about tiny black holes and strangelets, it invariably comes from non-physicists, and sometimes they even quote physicists who, in their naïveté, were intellectually honest enough to note that it’s all (highly tested, to ridiculous precision and accuracy) theory (more so than evolution). There simply is no danger.
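The “wink out of existence rapidly” point can be made quantitative with the standard semiclassical evaporation-time estimate. The TeV-scale mass below is a hypothetical illustration, and the formula is an order-of-magnitude textbook expression (Schwarzschild, photons only), not an exact multi-species calculation:

```python
import math

# Semiclassical Hawking evaporation time of a Schwarzschild black hole:
# t_evap = 5120 * pi * G**2 * M**3 / (hbar * c**4)
G = 6.674e-11      # m^3 kg^-1 s^-2
HBAR = 1.055e-34   # J s
C = 2.998e8        # m/s

def evaporation_time(mass_kg: float) -> float:
    """Order-of-magnitude Hawking evaporation time in seconds."""
    return 5120.0 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# A hypothetical TeV-scale micro black hole (~1.8e-24 kg, roughly 1 TeV/c^2)
# would evaporate essentially instantaneously by this estimate:
t_micro = evaporation_time(1.8e-24)
assert t_micro < 1e-80  # far shorter than any timescale of accretion
```

Even a one-kilogram black hole evaporates in under a tenth of a femtosecond by this formula; a collider-scale one is gone some seventy orders of magnitude faster.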
And then again, what if there is? What’s the point of life without growth, and the gaining of knowledge? Really, death is the permanent end of newness. An eternal return or a total stasis is functionally equivalent to death. It’s worthwhile to do these experiments because there are things we don’t know. Some of these things might revolutionize our understanding of the universe, and may even lead to new technologies that may assist us in our survival and improving quality of life. What if instead of doomsday scenarios we retrieve data that lead us to discover the principles of some form of advanced propulsion technique, as in some sci-fi “hyperdrive” or “warp drive”? Certainly easy interstellar travel would be a big help in promoting humanity’s future existence, via numbers and spreading. Do I think that’ll happen? No; I think we’ll find that (perhaps except for wormholes, which may never be usable or stable except for communication, if that) physics protects the speed of light from true loopholes. Do I hope that’ll happen? Also no; with ways around the cosmic speed limit no doubt some future equally blunderous Einstein will be talking about a one-government interstellar empire or federation, and that would be a tragic loss for the individual spirit and hope to escape the ties that bind of a society that’ll always seek to impose itself and see them as expendable — ironically to ensure their survival and security.
I hope I didn’t come across as intentionally adversarial or attempting to make an attack. But I wanted to share my understanding of the CERN experiments and time dilation, as well as my other alternative opinions on less scientific questions, so as to open up dialogue.
Modern physics is based on misunderstandings, as explained in my lecture at
http://www.worldnpa.org/site/event/?eventid=524
Relationship between Newtonian and Einsteinian Physics
Description
The mathematical connection between Newtonian and Einsteinian physics will be explained. Essentially it can be viewed as the same bit of maths but subjected to a different language. Special relativity is an interpretation of the equation c’^2 t’^2 = (c^2 − v^2) t^2 obtained by setting c’ = c with t not equal to t’, while Newtonian physics interprets the same equation instead as t’ = t with c not equal to c’. Newtonian gravitational theory has primary and secondary gravitational effects. When both these effects are considered, Newtonian physics gives the same maths as general relativity. It is only that the maths is interpreted in different languages: in the case of Newtonian physics it is interpreted in terms of forces, while Einsteinian physics talks in terms of space-time curvature. On the experimental side it will be pointed out, from a paper by a NASA scientist, that Einstein’s relativity has never been subjected to a direct experimental test; the tests have only ever been indirect. (Of course certain Einsteinians have deceived themselves as to the nature of their experimentation and not realized they have only ever done indirect tests.) Thus it has always been a subjective issue whether the maths should be interpreted in Newtonian or Einsteinian language. As for the paradoxes of Einstein’s relativity, these have been caused in part by the complicated language used by the Einsteinians obscuring the understanding, while in Newtonian language it is much clearer what is happening. Special relativity considers a symmetrical scenario of two observers in relative constant-velocity motion, while general relativity breaks that symmetry. Newtonian physics has none of those conceptual problems from the outset. Thus the problems of modern physics can be put down to the difficulty people have experienced in learning a new language to describe physical reality.
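For what it is worth, the abstract’s central claim can be checked numerically: both proposed readings of the quoted equation are self-consistent assignments. The check below verifies only the arithmetic of the claim as stated (names and values are hypothetical), not its physical interpretation:

```python
import math

# The abstract's claimed common equation: c'**2 * t'**2 == (c**2 - v**2) * t**2
def satisfies(c_prime, t_prime, c, v, t, tol=1e-9):
    """True if the assignment satisfies the quoted equation to tolerance."""
    return abs(c_prime**2 * t_prime**2 - (c**2 - v**2) * t**2) < tol

c, v, t = 1.0, 0.6, 1.0  # units chosen so c = 1

# "Einsteinian" reading: keep c' = c, let t' = t * sqrt(1 - v**2/c**2)
assert satisfies(c, t * math.sqrt(1.0 - (v / c)**2), c, v, t)

# "Newtonian" reading: keep t' = t, let c' = sqrt(c**2 - v**2)
assert satisfies(math.sqrt(c**2 - v**2), t, c, v, t)
```

That both assignments fit one equation is unsurprising algebraically; the physical content of relativity lies in which quantities are actually invariant, which the equation alone does not settle.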
Thank you, dear colleague van Lohuizen. Let me — before reading the important contribution further — make a quick first comment.
(Quote: “it’s not, hence the repeatable, predictable ‘miracle’”):
Time dilation was not discovered by Einstein, as you say, and it is not a miracle in itself, being at first sight a purely observational effect. I agree.
But Einstein was able to extract from it an ontological implication. This fact was strangely overlooked by more than a century of extremely clever physicists.
Progress like this was apparently never achieved before. (But I tend to repeat my own statements.)
You can either answer or wait until I have read more. Take care.
Dear Mr. van Lohuizen, I come to your second point regarding Einstein. You repeat what most physicists believe: that he had misunderstood quantum mechanics. This is again a false collective opinion amongst us epigones. His EPR argument won the battle. He opened the door to the discovery of the “spooky action at a distance” which he had reluctantly predicted, at the hands of John S. Bell, as is well known. But he thereby also enabled the discovery – abhorred if glimpsed by himself alone – of the observer-centered individual reality of the quantum world. His victory over Bohr – whom he loved as a person – was much more triumphant than he had expected. Copenhagen is dead; the observer-specific (and moment-in-time-specific) quantum world of Hugh Everett – to whom he had written a letter when Hugh was 12 – won out. Susan Feingold and later authors (including Penrose and Zeilinger) found the decisive experiment, which – of course – has been denied by the scientific community (specifically ESA) for more than a decade.
I wrote about this in a recent article – in case you find this account interesting enough to doubt it, in Siegfried Zielinski’s Variantology 5 collection ( http://www.wissensnavigator.com/documents/variantology.pdf ).
Thank you, Mr. Anderton, for trying to construct a parallel between Newton and Einstein – clearly the two greatest physicists ever. Thibault Damour uncovered that Newton had already seen the equivalence principle. Only the invariance of c still eluded him – so he could not yet discover gravitational time dilation and its corollaries.
However, the “ontological” discovery made by Einstein is, perhaps, the single most powerful insight in the history of physics.
I hope that the refusal by CERN to say that it does not jeopardize the planet is recognized as the unprecedented crime that it represents.
Clarification 1: I actually didn’t state anywhere that Einstein “discovered” time dilation; I reread what I wrote to be sure because it didn’t sound like something I’d say.
Clarification 2: I said Einstein rejected quantum mechanics, not that he misunderstood it, which is another thing I’d never say because it wouldn’t be true. That said, “rejection” is unclear language and I apologize for any miscommunication resulting from it. By “rejection”, I meant his belief that quantum mechanics was incomplete. Einstein was philosophically committed to local realism, and to the idea that things like momentum and location have actual Real (with a capital “R”) meaning in an external (but perhaps not directly or perfectly observable) reality. That’s a philosophical belief / assumption, not a scientific idea or theory, and one that is neither necessary nor one that I see any strong empirical reason to tentatively accept.
Thanks for the paper recommendation; I read it and found it quite interesting. Although, by and large, I’ve never liked the Copenhagen or “consciousness” interpretations of quantum mechanics. Then again, I’m not entirely sure why we *need* interpretations of quantum mechanics. It works mathematically well, is tested via experiments and observations to ridiculous precision and accuracy, and has made predictions that by and large have never disappointed. What’s all the fuss about imposing an ontological “and this is how it really works” interpretation on it? Is it fun? Absolutely. But I’m not entirely sure it’s useful, and I’m certain it’s not actually *science*. That said, I do prefer the Everett interpretation if I have to think about quantum mechanics linguistically / non-mathematically… and I think popular sci-fi has done it a great disservice by associating it with the (to my mind, rather abhorrent) idea of an indefinite number of parallel universes (but that’s a whole other thing…). But really, aside from different ways of thinking about it, all the interpretations produce functionally and mathematically equivalent results, and this should be kept in mind.
What is the ontological implication to which you are referring that Einstein discovered? Is it that time “flows” at different rates depending on one’s state of motion (and also, in general relativity, one’s presence in a gravitational / accelerative field)? I’m certainly aware that various particles with predicted decay rates / half-lives can *appear* to us to decay more slowly due to relativistic effects (from the particle’s “point of view”, it “dies” just as quickly though). And I understand your concern that some products of the CERN experiments may, via relativistic effects, survive long enough to become serious threats. But as I said, Mother Nature inadvertently “experiments” every day with higher (than CERN) energy collisions of cosmic rays with our atmosphere, and these things have been going on long before the first bipedal ape walked the planet (or even the first microbe inhabited the primordial seas). Where is the qualitative difference that makes our lower energy controlled experiments any more dangerous? And can we really justify terminating scientific research out of fear of adverse consequences? Scientific research gave us the nuclear bomb, which we were made to fear via the spectre of M.A.D. But, well, it also gave us the fully developed concept of M.A.D., a deeper understanding and practical real world example of game theory, proof of how far nations will go to scare their citizens until they submit to “us vs them” thinking (which they refined during the Cold War and now use to get us to give up all privacy and liberty in our ironically named “war on terror”), a deeper understanding and verification of atomic physics, and… nuclear energy. Nuclear energy may one day save us from a dependence on fossil fuels whilst we develop better alternatives, which, even if we didn’t worry about greenhouse emissions, we still need because fossil fuels are not a bottomless pit and we’re probably now at least close to the oil peak if not a few years past it. 
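The particle-decay point made above (“*appear* to us to decay more slowly”) can be sketched numerically with the muon, a standard textbook check of time dilation; the speed value below is an illustrative choice:

```python
import math

# Laboratory-frame mean lifetime of an unstable particle is gamma * tau0,
# where tau0 is the proper (rest-frame) lifetime. The muon's tau0 is
# approximately 2.197e-6 s.
TAU0_MUON = 2.197e-6  # seconds

def lab_lifetime(tau0_s: float, v_frac_c: float) -> float:
    """Mean lifetime observed in the lab frame for speed v (fraction of c)."""
    gamma = 1.0 / math.sqrt(1.0 - v_frac_c**2)
    return gamma * tau0_s

# A cosmic-ray muon at 0.998 c "lives" roughly 16 times longer in our frame,
# which is why so many reach the ground despite their short proper lifetime:
gamma_factor = lab_lifetime(TAU0_MUON, 0.998) / TAU0_MUON
assert 15.0 < gamma_factor < 17.0
```

From the muon’s own frame nothing is dilated; it simply sees a length-contracted atmosphere, the two descriptions agreeing on what is observed.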
If early humans had been too afraid of fire to attempt to ever contain, study, and control it, where would our species be today? I contend fire is more dangerous to humanity than what is being done at particle accelerators such as CERN and LHC.
On the issue of ontological implications:
Ontology is a major branch of metaphysics / philosophy, not physics or even science in general. Although sometimes philosophical thought can helpfully guide theoreticians (it has also unhelpfully stunted them more often than it has helped), as a classical skeptic I find it both unhelpful and unscientific to assume anything about scientific theories other than that they seem to provide useful models (or contexts) for modeling observed phenomena and making predictions (including novel predictions and retrodictions). But the models and laws of science are fictions. Certainly they’re highly useful fictions, and the fictions may turn out to approximate or even equate to some actual aspects of an external reality (although even the existence of *that* must still be considered a very highly useful pragmatic fiction / assumption). Making philosophical leaps from scientific theories and data (especially in physics) is fun, and even sometimes deeply useful (when it doesn’t derail us!), nudging us toward further refined theories, but in the end we should remember not to confuse the two. Every scientific theory, principle, and law may be just a few odd repeatable discoveries away from turning out to be, at best, an approximation.
Even the constancy of the velocity of light in a vacuum might be an approximation of a vector field, given the observed data suggesting variation of the fine structure constant over time. Perhaps all the “magic numbers” / constants, or most of them, will turn out to be malleable fields that can be different, have been different, and perhaps do change ever so minutely, below our current measurement precision, all the time.
Experiments studying gravitomagnetism (a prediction of general relativity) using rotating superconductors have seemed to show vast and significant (perhaps game-changing) divergence from GR predictions based on mass alone; these have not been researched much further and are relegated to the fringe (much as, in the paper you recommended, the refined EPR-paradox experiment is), despite the ramifications and possibilities.
And we stick desperately to the “the universe is smooth and homogeneous at large scales” assumption that conventional Big Bang cosmology requires, inventing never-observed inflationary fields and the like, despite the obvious but inconvenient observation that we’ve yet to see a scale at which the structure of the universe is NOT fractal and does not *utterly fail* to be smooth and homogeneous. Mainstream physics / cosmology doesn’t pursue this seriously either, and it’s directly because of a deep philosophical attachment to a theory and its ontological implications… one of many cases where, when scientists let themselves believe they’re “discovering the truth of reality” rather than formulating useful models to help us frame our observations and future expectations, things go badly and it’s no longer science. Again, I hope you see the parallels with ignoring the proposed Einstein-Feingold experiment from the variantology paper… obviously any reasonably proposed experiments should be performed, barring cost and risk factors.
I absolutely agree. We could go on from here.
Dear Otto, dear all
A quite important side issue regarding the neutron star argument:
Given the comments in the abstract and conclusion of the GM paper about cosmic rays producing black holes on neutron stars [GM abstract, p. 52], and the claim in the LSAG report that cosmic rays hitting neutron stars “would have produced black holes copiously during their lifetimes” [LSAG p. 9], one may be left with the suspicion that the authors and CERN were deliberately trying to obscure the finding that an insufficient number of black holes would be produced by cosmic rays striking neutron stars [GM p. 85]. While obscuring inconvenient truths may be part and parcel of a spin doctor’s trade, it would mark a new low for a prestigious scientific institution like CERN to knowingly publish a table in a peer-reviewed journal with data that has been inflated by at least three orders of magnitude [GM PhysD p. 23, table III 4].
Best regards to all.
Page 60 in the following link:
http://www.lhcsafetyreview.org/docs/BlackHoleReview.pdf
It seems that in such a case, slow (LHC-made) stable MBHs would become important ‘again’…
Dear all
Another quite important side issue:
No scientific treatise exists which examines whether all these past and present subnuclear experiments, collisions and collision products have given any sign or information of any dangerous phenomenon, or of a threshold of a dangerous phenomenon, or evidence of a chain reaction, if energies and luminosities were increased or changes were undertaken in other parameters, such as the construction of the detectors.
Would such a treatise not be desirable?
Instead, there is chaos, the data are distributed somewhere else, and nobody knows whether such a sign exists. From time to time we can read about “higher energies than expected” (the RHIC fireball), “more radiation than expected” (in the tunnel of the LHC) or “neutrinos are faster than light” — oh no, I have mixed up the issues (a little joke of mine in between, because all is so sad) — and various theories contradict each other, and even CERN’s own 5-man risk assessment group “LSAG” has not handled all known risks.
Good luck to all.
Clarification: 5th line, 2nd word “…if energies and luminosities would be increased or changes in other parameters such as…”
A kind request — let me use the occasion to pose a question on Lifeboat at this place.
I today posted a new text on Lifeboat (just google: “Dear Planetary Media” to see its shadow).
I believe that here – in the middle of a chat – the text will be more acceptable to the censor. So I repeat it here along with the censor’s comments, which he put in, in RED, within square brackets [ ] in an email informing me of the censorship. The red color is omitted here.
—————————————————————————-
[spamming lifeboat with disinformation]
Dear Planetary Media
===================
(posted by Otto E. Rössler in category: Uncategorized)
You are writing history with your deliberate silence. The LHC experiment at CERN in Switzerland was designed to produce miniature black holes. It has been attempting to do so for 15 months now. Eight more months of enhanced performance are scheduled.
Problem is: CERN’s detectors are blind [based on your largely unaccepted theories only] to their own most anticipated success – as they do not deny [they dismissed the idea].
Second problem is: CERN is more likely to be successful than anticipated – as they do not deny [they dismiss the idea].
Third problem is: The anticipated success will exponentially shrink the planet to 2 cm in perhaps 5 years’ time [based on your non-accepted fringe theories] – as they do not deny [they and the greater scientific community dismiss this notion completely].
Not a single scientist on the planet steps forward to contradict these 3 results published [the entire scientific community contradicts these results] in renowned science journals [are they really renowned journals?]. A court suggested a safety conference 14 months ago [the court appeal was rejected]. The fact that the world’s media behave as if bribed by CERN is a topic for media scientists [the media report what is of interest to the public – the media consider it non-topical (it was back in 2008) – this is a mistake by the world media in my opinion – but not ‘as if bribed’], law professors and historians of the future, provided the latter arises. Watergate pales by comparison.
I predict that Lifeboat’s censor will delete this posting immediately [what part of your content did YOU think would not be acceptable? In answering, please note your other recent contributions on the very same topic were accepted – and if you anticipated this post would be trashed why didn’t you improve the quality?]. Therefore this is a maximally short cry for help to the whole planet [it is hardly a short cry – you have over 100 posts on the same subject which have been accepted].
[Please note: Lifeboat has to take a certain responsibility for hosting inflammatory and misleading posts which may incite violence if read by the wrong individuals, and damage the credibility of Lifeboat regardless. Please return to being more articulate and careful with your contributions to Lifeboat.]
—————————————————————————–
15 minutes later, the text was still online (Google still thinks it is). Now my question to the dear readers here: Is this okay?
I am not angry. All I wish is the right to answer the censor’s remarks in public. For censorship is a public act. I shall wait a little while to see whether I am allowed to do so here or not; in the meantime other contributors can already speak their minds.
I thank everyone involved — including the censor who is acting in subjective righteousness.
Otto E. Rossler, chaos specialist
“are they really renowned journals?”
They are not. For one, you can publish anything if you are willing to pay the fee (the African journal); the other has Otto’s old friend El Naschie (and Otto himself) on the editorial board. El Naschie is a well-known fraudulent crackpot who was publishing his own crank science (and that of Otto) without proper peer review, and so on. Anyone can check this on the web on his own.
So Otto’s journals are far from being renowned.
.…who was publishing his own crank-science (and that of Otto) without proper peer review in the journal where he was editor.
El Naschie is one of the best current examples of scientific misbehavior.
Sorry, the above text posted under my name at 8:57 am was edited. I therefore first re-enter here my own version as published at that time (which would look nicer with the two fonts kindly introduced by the web admin, were there not a number of added changes that would be too cumbersome to stumble over in responding).
• Otto E. Rossler on May 2, 2012 8:57 am
A kind request — let me use the occasion to pose a question on Lifeboat at this place.
I today posted a new text on Lifeboat (just google: “Dear Planetary Media” to see its shadow).
I believe that here – in the middle of a chat – the text will be more acceptable to the censor. So I repeat it here along with the censor’s comments, which he put in, in RED, within square brackets [ ] in an email informing me of the censorship. The red color is omitted here.
— — — — — — — — — — — — — — — — — — — — — — — — —
Dear Planetary Media
===================
(posted by Otto E. Rössler in category: Uncategorized)
You are writing history with your deliberate silence. The LHC experiment at CERN in Switzerland was designed to produce miniature black holes. It has been attempting to do so for 15 months now. Eight more months of enhanced performance are scheduled.
Problem is: CERN’s detectors are blind
[based on your largely unaccepted theories only]
to their own most anticipated success – as they do not deny
[they dismissed the idea].
Second problem is: CERN is more likely to be successful than anticipated – as they do not deny
[they dismiss the idea].
Third problem is: The anticipated success will exponentially shrink the planet to 2 cm in perhaps 5 years’ time
[based on your non-accepted fringe theories]
– as they do not deny
[they and the greater scientific community dismiss this notion completely].
Not a single scientist on the planet steps forward to contradict these 3 results published
[the entire scientific community contradicts these results]
in renowned science journals
[are they really renowned journals?].
A court suggested a safety conference 14 months ago
[the court appeal was rejected].
The fact that the world’s media behave as if bribed by CERN is a topic for media scientists
[the media report what is of interest to the public – the media consider it non-topical (it was back in 2008) – this is a mistake by the world media in my opinion – but not ‘as if bribed’]
, law professors and historians of the future, provided the latter arises. Watergate pales by comparison.
I predict that Lifeboat’s censor will delete this posting immediately
[what part of your content did YOU think would not be acceptable? In answering, please note your other recent contributions on the very same topic were accepted – and if you anticipated this post would be trashed why didn’t you improve the quality?].
Therefore this is a maximally short cry for help to the whole planet
[it is hardly a short cry – you have over 100 posts on the same subject which have been accepted].
P.S. Time is now 4:44 am Pacific Standard Time, May 2, 2012
[Please note: Lifeboat has to take a certain responsibility for hosting inflammatory and misleading posts which may incite violence if read by the wrong individuals, and damage the credibility of Lifeboat regardless. Please return to being more articulate and careful with your contributions to Lifeboat.]
— — — — — — — — — — — — — — — — — — — — — — — — —
15 minutes later, the text was still online (Google still thinks it is). Now my question to the dear readers here: Is this okay?
I am not angry. All I wish is the right to answer the censor’s remarks in public. For censorship is a public act. I shall wait a little while to see whether I am allowed to do so here or not; in the meantime other contributors can already speak their minds.
I thank everyone involved — including the censor who is acting in subjective righteousness.
Otto E. Rossler, chaos specialist
(End of recovered version.)
Rössler has been spamming the same unfounded rubbish for years… it is amazing how long the administration accepted this.
Otto:
Whereas I do believe censorship *should* be a public thing and you should be able to have an open, public rebuttal of the censor’s comments, censorship is often *not* public, which is why I abhor all censorship and authority (and why I’m confused by your sharing Einstein’s desire for a world government).
That said, this is a privately owned site and the TOS are listed, and in the end they have the right to pull content, as its presence here will be associated with Lifeboat even if, by and large, Lifeboat doesn’t share your beliefs on this matter.
No offense meant, but the truth is that the censored bits are reasonably censored. Each statement you claim “they do not deny” the vast majority of the scientific community and the physicists at CERN have long dismissed outright. Saying “they don’t deny” your claims makes it seem as though they recognize their likelihood, and that seems misleading and dishonest to me because they absolutely don’t recognize their likelihood, and have provided many eloquent scientific arguments to demonstrate, quite convincingly, that there is no threat.
I would sincerely like to know if you believe that the CERN scientists are just irresposibly ignoring risks, or if you actually believe that they *want* to destroy the world. Which is it? It seems like the latter to me.
The only thing massive particle accelerator experiments have consistently done is threaten the plausibility of continuing to believe in the existence of the Higgs boson, not the planet or the human species.
Dear Tom, dear eq, dear all
Question: Do you all agree that neutron stars are not a sound safety argument to rely on anymore?
Every response or criticism to my question is appreciated, and I will take a look here the next day.
Thank you very much.
To resume the above thread:
My 12 Answers to the Censor’s Remarks
=============================================
1) “detectors are blind [based on your largely unaccepted theories only]”
(Note: The statement in square brackets [] is always the criticism inserted, in your function of appointed censor, into my own text as specified by the preceding non-bracketed words.)
Answer: My Telemach theorem which proves this statement was never contradicted or disproved by a colleague up until now. But there are high-ranking independent discoverers.
2) “are blind … as they do not deny [they dismissed the idea]”
Answer: Who is “they”? There is no one who ever said so in public to the best of my knowledge.
3) “CERN is more likely to be successful … as they do not deny [they dismiss the idea]”
Answer: Who is “they”? Can you give a single name so the person can step forward?
4) “in perhaps 5 years’ time [based on your non-accepted fringe theories]”
Answer: This clearly is a biased, private, opinion. Or can you name a single specialist who dares say so under his or her own name?
5) “as they do not deny [they and the greater scientific community dismiss this notion completely]”
Answer: I am very grateful for this information. Is there anyone at CERN or elsewhere who says so and allows you to quote him for being ready to defend this stance?
6) “contradict these 3 results published [the entire scientific community contradicts these results]”
Answer: Why then is there not a single scientist on the planet to say so publicly?
7) “in renowned science journals [are they really renowned journals?]”
Answer: Which scientist says they are not?
8) “A court suggested a safety conference 14 months ago [the court appeal was rejected]”
Answer: No, this court’s suggestion (“expressed opinion”) was never even allowed to become public (except in a very low-distribution professional journal for medical laboratory assistants).
9) “The fact that the world’s media behave as if bribed by CERN is a topic for media scientists ((of the future)) [the media report what is of interest to the public – ((if)) the media consider it non-topical (it was back in 2008) – this is a mistake by the world media in my opinion – but not ‘as if bribed’]”
Answer: Thank you for siding with me for once. If you insist I emphasize that the “if” is a casus irrealis here.
10) “will delete this posting immediately [what part of your content did YOU think would not be acceptable? In answering, please note your other recent contributions on the very same topic were accepted – and if you anticipated this post would be trashed why didn’t you improve the quality?]”
Answer: You did as I had feared – right? If I am not misleading myself I really could not say what element of my very restrained description of facts open to everyone’s scrutiny you could possibly find objectionable, dear official censor of Lifeboat. The “quality” of this text as a literary document may be limited – but its objective importance is potentially infinite. My only aim was to be shown that the danger I described can be disproved – which no one but a competent scientist can possibly do but which he won’t do if the media do not ask him. Please, help me find him or her – which was the only reason for my turning to the world’s media, as you know.
11) “cry for help to the whole planet [it is hardly a short cry – you have over 100 posts on the same subject which have been accepted]”
Answer: This is cynical — unless you have a counterproof to offer that makes you absolutely sure that the LHC is safe. Then, YOU are the scientist whom I tried to find with my public call. Please, do give the proof. For that is all I ever requested.
12) “[Please note: Lifeboat has to take a certain responsibility for hosting inflammatory and misleading posts which may incite violence if read by the wrong individuals, and damage the credibility of Lifeboat regardless. Please return to being more articulate and careful with your contributions to Lifeboat.]”
Answer: “inflammatory and misleading” is obviously a non-fitting characterization of my restrained text.
Nevertheless I acknowledge that the above call upon the media is so simple and easy to share as a desire with other citizens across the planet that I fully understand your concern. Clarity is counterproductive if one believes that the enlightened few must not be pressed to reveal the rationale for their never specified verdict. In many other cases such respect is a beneficial human trait. Only if you have children do you feel otherwise.
So please, forgive me that I insisted on evidence in a case in which everything is at stake — unless this fact can be contradicted. This appears to be the moment that CERN and its associates become nervous because they are “shooting sharp” every day and every minute without detectors (if I am right). So it is the nakedness of the truth that arouses your fear.
I repeat that my only request is to kindly check a theorem that, if it is as stupid as you claim, cannot possibly pose a big problem to dismantle.
But, my dear friend – let me close by thanking you from the bottom of my heart: For your having given the planet the privilege of witnessing a first maximally small “safety conference” today. Nothing else but an open discussion was ever requested. This conference is now “on” thanks to your courage and sense of duty.
Are there any other contributors? Every scientist with a name to defend is asked to participate – to spare CERN the duty to explain why the experiment is safe.
Otto Rossler, I like your change of emphasis, hope you keep it up.
Niccolò:
I completely agree that neutron stars are not analogous as they’d end up deflecting cosmic rays, and hence they don’t constitute a sound argument for safety.
But what of Earth’s upper atmosphere, or the surfaces of the Moon and other planets, all of which are bombarded with cosmic rays every day and have been since their formation? These collisions will involve energies greater than those occurring in the LHC, as far as I am aware.
If I’m wrong, please tell me; I’d like to see a comparison of the energies involved in natural cosmic ray collisions in Earth’s atmosphere and throughout the solar system.
Apparently no one ever read the appendix in GM about exactly this problem with single neutron stars.
There is a reason why the scientific community does not agree with these propaganda accusations of GM being liars etc., as mentioned again in this comment area.
Dear Mr. van Lohuizen, dear hdc, dear all
People who know my comments on Lifeboat about the comparison of artificial collisions with cosmic rays can jump to the “neutron star safety argument” issue further below.
It is right that secondary cosmic ray showers have been detected and it has been calculated how much energy the primary particles of these “events” should have. A few of these cosmic ray events have been more energetic than LHC collisions at 14 TeV.
On astronomical bodies fast cosmic ray particles collide with nearly stationary particles of the body or of the atmosphere of the body.
So higher energetic cosmic rays produce faster secondary particles. The faster the incoming ray the faster the products.
Collisions of two equally fast protons, each with 7 TeV, are extremely rare (and pairs of lead ions at LHC design energies perhaps even rarer).
A hypothetical cosmic ray proton needs 100,000 TeV (= 10^17 eV) to match a collision energy of 14 TeV if it strikes a stationary proton high up in the atmosphere (the calculation is on page 28 in the G.M. paper).
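The fixed-target equivalence above follows from the invariant-mass formula: for a beam proton of energy E hitting a proton at rest, the centre-of-mass energy is approximately sqrt(2 E m_p c^2). Here is a minimal sketch of that arithmetic (the 0.938 GeV proton rest energy is standard; the comparison with GM’s page-28 figure is my own check, not taken from their paper):

```python
M_P = 0.938e9  # proton rest energy in eV

def fixed_target_beam_energy(sqrt_s_ev):
    """Beam energy (eV) a cosmic-ray proton needs so that a collision
    with a proton at rest reaches centre-of-mass energy sqrt_s_ev.
    Uses sqrt(s) ~ sqrt(2 * E_beam * m_p), valid for E_beam >> m_p."""
    return sqrt_s_ev ** 2 / (2.0 * M_P)

e_beam = fixed_target_beam_energy(14e12)  # 14 TeV centre-of-mass energy
print(f"{e_beam:.2e} eV")  # about 1.0e17 eV, i.e. roughly 100,000 TeV
```

This reproduces the order of magnitude quoted above: a cosmic-ray proton of about 10^17 eV is needed to match the LHC’s 14 TeV collider collisions.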
There is some evidence that slow particles could pose some risk. Two examples are MBHs and strangelets. It is assumed that only fast strangelets would perhaps be destroyed by the impact and pose no danger.
The high luminosity of the LHC is another difference from high-energy cosmic rays, which are widely, or “homeopathically”, distributed over the Earth.
Please take a look at my comment here also, in which I refer to the cosmic ray collisions in the solar system:
Niccolò Tottoli on February 14, 2012 7:43 pm
http://lifeboat.com/blog/2012/02/cerns-annual-chamonix-meeti…t-mega-lhc
Just some words from there:
“Fast particles do not equally interact with matter like slow particles. Examples are the decreased decay rate of very fast particles or the increased reactivity of slowed neutrons in uranium.“
(Carbon moderators are used in nuclear power plants, which decelerate the neutrons, to have more reactivity.)
Therefore I say that if we do not know all possible reactions and islands of stability of all exotic particles and fields (even with respect to their velocity and energy concentrations, whether produced or traversing matter), then we cannot know whether the LHC is safe or not.
The neutron star safety argument:
We should not rely only on the paper of Prof. Giddings and Prof. Mangano, but they have considered the point that neutron stars would deflect most of the cosmic rays, which therefore reduces the calculated number of high-energy cosmic rays impinging on neutron stars by a factor of −1000.
The problem is that the papers of LSAG and G.M. suggest (at least to not very careful readers) that the decay of neutron stars would be catalyzed by micro black holes and that cosmic rays hitting neutron stars “would have produced black holes copiously during their lifetimes”.
I have seen that the statement of LSAG is on page 7 in their report, not page 9, (an error of “LHC safety review”, see link in my comments above, sorry).
A more important problem with the G.M. report is that the numbers concerning MBHs on neutron stars in various tables are “uncorrected values”, and it seems quite strange to me that there is not even a note at these tables to mention it; one can only find it in the text.
For example the numbers of table 3 on page 46, “Summary of black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star”, have to be reduced by a factor of −1000. (See G.M. at the end of page 45, or see appendix G, end of 3rd paragraph, page 85.)
By correcting the values of table 3 we get very low numbers of 0.054 to 0.633 MBHs per million years, depending on the minimum mass (TeV) and on the dimensions (D).
I think that less than 1 MBH per year does not suffice as a sound safety argument to rely on, because some unknown factors or theoretical problems could decrease it further.
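The corrected range just quoted can be reproduced with the corner values of table 3 (the table values and the 10^−3 screening factor are from GM; treating the smallest and largest table entries as the range is my own simplification):

```python
# Corner values of GM table 3 (MBHs per million years, uncorrected):
# the smallest entry (Mmin = 14 TeV, D = 8) and the largest (Mmin = 7 TeV, D = 11).
uncorrected = {"Mmin=14TeV, D=8": 54, "Mmin=7TeV, D=11": 633}

SCREENING = 1e-3  # magnetic-screening factor, GM end of page 45

corrected = {k: v * SCREENING for k, v in uncorrected.items()}
for label, rate in corrected.items():
    print(label, "->", rate, "MBHs per million years")
# gives the range 0.054 to 0.633 MBHs per million years stated above
```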
If one simply compares table 3 with table 9 on page 77 in G.M., it becomes clear that table 9 contains uncorrected values too.
The uncorrected numbers for table 3 and 9 are for the hypothetical case of 100% proton cosmic rays.
G.M. handles the hypothetical case of 100% iron cosmic rays too, and in this case the numbers have to be some hundred to some thousand times smaller again (naturally after already decreasing them by a factor of −1000), depending on the assumed dimensions and mass of the MBHs.
So a careful investigation of G.M. leads us to much less than one single MBH per million or even per billion years.
Therefore neutron stars are not a sound safety argument for stable MBHs to rely on.
But it‘s not over:
The real composition of cosmic rays of 2 TeV or more is not known; there are only assumptions, and only the secondary cosmic ray events have been measured so far.
Thus we have to consider all possibilities.
What kind of other cosmic rays would give the smallest numbers of MBHs trapped on white dwarfs or neutron stars?
All possible rays, particles and exotic particles, and the worst imaginable case, for example the hypothetical extreme of 100% for each such sort of particle (as is done for iron or protons in GM), have to be considered.
The much less dense white dwarfs are a similar safety argument for stable MBHs.
But I am not yet sure about it.
I have read that MBHs could be smaller than expected, and if that is true then the issue has to be handled again.
Another idea could be that MBHs would decay immediately into much smaller but stable remnants, which could traverse more dense matter again (but this is only an idea).
Thank you.
Clarification: In the middle of my 2nd-last paragraph I mean:
“I think that less than 1 MBH per million years does not suffice as a sound safety argument to rely on…“
Not: “…1 MBH per year…”!
In starburst galaxies, it is estimated that cosmic ray fluxes are 1000 times higher than in our neighborhood. Also — what are cosmic ray fluxes near the center of the galaxy? And what were they in the first few billion years of the galaxy’s history?
And keep in mind, that if black holes can retain a charge or be charged, they will be stopped by pretty much everything, and all the safety arguments are totally valid. So mini black holes are only a danger if they are uncharged…
As I see the general importance of immediately showing corrections, here are two more (Niccolò Tottoli on May 3, 2012 8:40 pm):
In the section “The neutron star safety argument” I write several times that the numbers in tables 3 and 9 of G.M. have to be “reduced by a factor of −1000”.
Correctly it must be: “reduced by a factor of 1000”, since the “factor” is not a negative number. Another possibility would be to say “the numbers have to be a thousand times smaller”.
In the section “The neutron star safety argument”, 3rd paragraph, the statement “I have seen that the statement of LSAG is on page 7 in their report, not page 9…” was my own error, so the link in the paper to page 7 is correct.
Dear Mr. Conant
Two interesting points.
Anyway, G. and M. have uncorrected values in these tables, because starburst galaxies are not mentioned. But you are right that the issue has to be handled differently for each case, to make “real” estimates of the risk of uncharged MBHs.
But as I said, there could be some other problems which would increase the risk of LHC-produced neutral, slow MBHs, so G.M. and LSAG should make a new assessment. It would be important to handle various other risks too. As I said, some of them are not even in LSAG.
Thank you for your attention.
Here is a link with the paper which criticizes G.M. and with a list of pro- and contra publications for an independent safety review. Perhaps not all risks are in the list, but at least the risks which are handled in LSAG.
http://www.lhcsafetyreview.org/
And naturally it should be clear that the most would be achieved if an assessment were made with all the international scientists referring to risks — not only CERN‘s LSAG alone should do it. But as Prof. Ellis once said in his video: the ‘few‘ people with another opinion can be “decoupled”…!
Reading GM again, I do not have the impression that Tottoli has read it carefully enough. GM do mention the reducing factor of 10^−3 in the discussion of neutron stars on page 45, referring to the appendix. In the following they discuss binary systems because of this suppression.
However, Tottoli should present his “findings” in a more precise and structured way. The permanent corrections in following comments do not indicate a thorough reading and understanding of the paper. It is furthermore a little bit surprising that one can imply fraudulent behavior etc. in this paper when there is an open discussion of suppressive effects, cross-references to detailed attached discussions in the appendices, and so on. It is even more surprising that a person like Tottoli, who implies this fraudulent behavior concerning GM, does not even think about something similar when it comes to the ridiculous, diffuse and unfounded “papers” of Otto Rössler.
This at least indicates a certain degree of preoccupation which is not in agreement with open-minded and objective work on the topic. One can assume that he would always come to the same conclusions about CERN studies regardless of the exact appearance of these studies.
I had overlooked that Tottoli indeed mentioned the page 45 appearance of the suppression factor.
However, table 3 summarizes at least the binary cases. Therefore the values in the tables are not what Tottoli wanted to imply in his long and non-structured texts above.
To clarify on request from a friend and critic, the rejected post from Otto reproduced in comment space earlier on this thread with extensive censor comments was posted entirely by Otto himself — including the censor comments — reproduced from an email I sent to Otto which was not intended for public domain. The rejection of his post was not anonymous ‘subjective righteousness’ as Otto suggests, but an action I took as web admin with the endorsement of the president of Lifeboat and fully explained to Otto in private email — which Otto reproduced above in public complaint. Otto’s original words were not modified above with the exception of the introduction of a bold font for attempted clarity to the reader. Additional text was added for clarity to the reader, such as the title of the original email that was sent to Otto. There is no mystery.
If one wants a general rule, the quality of posts is monitored, but comment space is mostly a free-for-all. Lifeboat does not willfully censor opinion, but has to consider spamming and disinformation on occasion, particularly if published texts are considered libelous.
I just learned from Tom’s text above that he may without notification have deleted the following text from Lifeboat today. Since he just claimed above that he would not censor in this thread, I share this information with you, to see whether anyone can explain to me or the world why this post needed to be censored by him:
http://lifeboat.com/blog/2012/05/cerns-shunned-safety-debate…n-lifeboat (May 4, 3:20 am PST)
—————————————————————————————————————–
CERN’s Shunned Safety Debate: Gotten Started on Lifeboat
=====================================================
Posted by Otto E. Rössler in category: Uncategorized
Two days ago, an innocuous blog of mine titled “Dear Planetary Media” got censored-away without forewarning. Unexpectedly, 12 scientific counterarguments were subsequently offered in excuse.
The 12 points I was able to rebut – in the first public scientific debate about the planet-risking CERN experiment. This is in fulfillment of a wish expressed by the Cologne Administrative Court on January 27, 2011, shortly after the beginning of the experiment.
I plan to re-publish this exchange-in-progress in a more visible blog if my scientific counterpart agrees. The present version can be seen on http://lifeboat.com/blog/2012/04/einsteins-miracle#comments under the date May 2 3:50 pm.
The documentation gives the planetary media an opportunity to re-consider their decision made to unilaterally rely on CERN’s claim that my danger-proving results were “absolute nonsense.”
The planet is greatly indebted to Lifeboat.
————————————————————————————————————-
You have never rebutted anything, liar.
Dear EQ
It is true that my comments are too long and chaotic sometimes.
My intention was (and is), to highlight the fact that the numbers of trapped MBHs on neutron stars are too small for a strong safety argument.
I honestly appreciate that you have read G.M. and point out that the numbers in table 3 are not for single neutron stars.
It is still difficult for me to understand why there is not more precise information about what the numbers refer to, directly on these tables.
It seems that table 3 concerns ONLY binary systems of neutron stars and I admit that I thought it would be for the ordinary case.
Table 9 seems to refer to the same, because it fits with table 3.
It should be known that the reducing factor of 10^−3 for the uncorrected numbers in table 3 is only for neutron stars with the weakest magnetic field ever observed.
Do you agree that the numbers of trapped MBHs on single neutron stars would be even smaller again?
Thank you.
Read it again. It seems you still have not grasped the line of argumentation there.
It is useless to pick words here and tables there without seeing the whole picture.
Dear EQ, dear all
Well — the neutron star issue has been discussed again and again for a long time. I have not taken a further look yet but I hope we will get the ‘real‘ numbers for MBHs on single neutron stars (naturally just according to Prof. Giddings and Prof. Mangano) soon, because it is important regarding the risk of slow LHC-produced neutral MHBs…
Thank you.
MBHs, not MHBs, sorry.
——-
Dear EQ
Ok, now I have read everything in GM and I think I have grasped the line of argumentation there. I have made my own calculations and will show the important points of tables 3, 4 and 9 and figure 8 today or tonight.
Thank you.
Ok, I will try the same one more time, a bit more structured:
With the help of “EQ” and some other friendly persons I have investigated some extremely misleading points in the report of Prof. Giddings and Prof. Mangano “Astrophysical implications of hypothetical stable TeV-scale black holes”, which should be corrected or clarified in a new draft.
It will be shown here…
1.) that table 3 and table 9 have wrong and extremely misleading titles or otherwise that the numbers of table 3 and 9 have to be at least 1000 times smaller.
2.) two examples of ‘real‘ numbers of micro black holes (MBH) production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = Mmin/14 TeV.
3.) that the neutron star section contains only numbers for MBHs produced by a hypothetical 100% proton cosmic ray flux, and that such a safety report should consider and show all extreme cases for all other hypothetical 100% high-energy cosmic ray (or even exotic particle) fluxes.
4.) that if the points above are true, LSAG should correct the sentence of page 9, which states “In fact, ultra-high-energy cosmic rays hitting dense stars such as white dwarfs and neutron stars would have produced black holes copiously during their lifetimes” and in the contrary should admit that the numbers of MBH on neutron stars “are too small to allow sufficient rate for all cases, and specially those at the highest black hole masses.” (GM, page 85).
To get the ‘real‘ numbers of black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star, the numbers of tables 3 and 9 have to be multiplied by a factor of at least 10^−3, i.e. they should be 1000 times smaller, as stated at the end of page 45 in GM. And they admit “…considerably weakening the resulting bounds.”
.….….….….….….….
Misleading title of table 3 on page 46, in the section “8.1 Production on neutron stars”:
Table 3: Summary of black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = Mmin/14 TeV.
                 D = 8   D = 9   D = 10   D = 11
Mmin = 7 TeV       323     422      526      633
Mmin = 10 TeV      129     172      218      265
Mmin = 12 TeV       80     109      139      171
Mmin = 14 TeV       54      74       95      118
.….….….….….….….
Misleading title of table 9 on page 77, in the section “E.2 Black hole production by cosmic rays”
Table 9: Black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = max(0.5; Mmin/14 TeV).
Mmin       D = 5     D = 6     D = 7     D = 8     D = 9     D = 10    D = 11
3 TeV    1.3×10^4  2.5×10^4  4.0×10^4  5.6×10^4  7.4×10^4  9.2×10^4  1.1×10^5
4 TeV    2.2×10^3  4.5×10^3  7.0×10^3  9.9×10^3  1.3×10^4  1.6×10^4  1.9×10^4
5 TeV       570      1100      1800      2500      3300      4100      5000
6 TeV       190       380       590       830      1100       140      1600
7 TeV        72       146       231       323       422       526       633
8 TeV        47        99       161       229       301       378       457
10 TeV       23        52        88       129       172       218       265
12 TeV       13        31        54        80       109       139       171
14 TeV        8        20        36        54        74        95       118
(I have found a little error in the table here: the number of MBHs for 6 TeV and D = 10 should probably be more than 140, perhaps 1400.)
.….….….….….….….…
1.) Section 8.1.1 “Production in binary systems”, page 46 in the publication of Prof. Giddings and Prof. Mangano (GM) states:
“A survey of known classes of binary systems (see Appendix H.1) reliably yields FCE’s in the 2 Myr range, resulting from systems with a 1 Gyr lifespan. The neutron star production rates are exhibited in table 9 and in fig. 8 of Appendix E.2. A summary of that table, focusing on the most interesting cases of D equal or greater than 8, is shown here in table 3.”
But the second phrase does not mention binaries at all, so it seems that these tables are not meant for binary neutron star systems.
It is true that some binary neutron star systems would give quite similar numbers as in table 9 or 3 but the values would differ very much for each specific case.
One could say that if the title of table 3 was:
“Table 3: Summary of black hole production rates, per million years of FCE, induced by proton cosmic rays impinging on a binary companion of a R = 10 km neutron star. MD = Mmin/3 and y = Mmin/14 TeV.”
and table 9 was:
“Table 9: Black hole production rates, per million years of FCE, induced by proton cosmic rays impinging on a binary companion of a R = 10 km neutron star. MD = Mmin/3 and y = max(0.5; Mmin/14 TeV).”
then both tables would have to do with neutron star binaries, and the data in both tables would be more or less correct (aside from the various other issues related to different cosmic rays, Mmin, y, etc.).
Table 3 itself is not in the binary section of GM.
The titles of table 3 and table 9 have nothing to do with binaries.
It is important that the titles of these two tables lead to a wrong impression, because the numbers have to be 1000 times smaller; so the titles of the tables are wrong, or otherwise it must be stated that the numbers are “uncorrected” and have to be multiplied by a factor of at least 10^−3.
Many readers cannot search out and investigate all misleading points by carefully digging through all sections, appendices and discussions.
The second-last paragraph on page 76 states “The production rates on a neutron star (neglecting the magnetic screening) can be obtained from the white dwarf’s ones by rescaling by the surface area. Assuming a 10 km radius, the proton rates in Table 4 are reduced by a factor of 3.4 × 10^−6, leading to the numbers in Table 9.”
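As a quick plausibility check (my own, not from GM): rescaling by surface area means multiplying by the squared ratio of the radii, and (10 km / 5400 km)^2 indeed comes out close to the quoted 3.4 × 10^−6:

```python
# Surface-area rescaling from a R = 5400 km white dwarf to a
# R = 10 km neutron star: the flux-collecting area scales as R^2.
R_NS_KM = 10.0
R_WD_KM = 5400.0

factor = (R_NS_KM / R_WD_KM) ** 2
print(f"{factor:.2e}")  # ~3.43e-06, close to GM's quoted 3.4e-06
```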
Table 3 contains numbers from table 9.
So I have done what is told and calculated ‘my own‘ numbers, starting with table 4 (see below table 4).
.….….….….….……
Page 73
Table 4: Black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 5400 km white dwarf. MD = Mmin/3 and y = Mmin/14 TeV.
                 D = 5      D = 6      D = 7
Mmin = 7 TeV   2.1×10^7   4.3×10^7   6.7×10^7
Mmin = 8 TeV   1.4×10^7   2.9×10^7   4.7×10^7
Mmin = 10 TeV  6.7×10^6   1.5×10^7   2.6×10^7
Mmin = 12 TeV  3.7×10^6   9.1×10^6   1.6×10^7
Mmin = 14 TeV  2.3×10^6   5.9×10^6   1.0×10^7
As I said, I take two examples from table 4. (The same numbers are in table 2 on page 40, which also contains some numbers for the hypothetical case of a 100% high-energy iron particle flux; those are smaller than the numbers for proton-induced MBHs.)
Example 1, calculated with a value from table 4.
I take the number for 7 TeV at 5 dimensions, which is 2.1 × 10^7.
2.1 × 10^7 = 21,000,000
Now I reduce the number of 21,000,000 by the factor of 3.4 × 10^−6 = 0.0000034:
21,000,000 × 0.0000034 = 71.4
71.4 quite closely matches the number in table 9 for MBH production rates per million years, which is 72 MBHs for 7 TeV at 5 dimensions.
Example 2, calculated with a value from table 4.
I take the number for 14 TeV at 7 dimensions, which is 1.0 × 10^7.
1.0 × 10^7 = 10,000,000
Now I reduce the number of 10,000,000 by the factor of 3.4 × 10^−6 = 0.0000034:
10,000,000 × 0.0000034 = 34
34 again quite closely matches the number in table 9, which is 36.
So it becomes clear that the calculation with the numbers of table 4, which is for MBHs on single white dwarfs, leads to the uncorrected numbers of tables 3 and 9, and that these numbers have to be multiplied by a factor of at least 10^−3, because the powerful magnetic screening of the neutron star has not yet been considered. It should be noted that the reducing factor of 10^−3 is the smallest factor, valid for the weakest possible magnetic field of a neutron star.
(I should admit that I do not know why my own numbers are slightly different but they are nearly equal to the ones in table 9 and I do not see any reason why my numbers should be wrong, because I have done what is told in the text of GM.)
To calculate two ‘real‘ numbers for the title “Black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = max(0.5; Mmin/14 TeV).” I multiply the resulting numbers of examples 1 and 2 by 10^−3.
For example 1:
71.4 × 10^−3 = 0.0714
For 7TeV at 5 dimensions we get 0.0714 black holes per million years.
For example 2:
36 × 10^−3 = 0.036
For 14 TeV at 7 dimensions we get 0.036 black holes per million years.
It seems that such numbers are too small for a safety guarantee to rely on regarding stable, uncharged MBH.
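The whole chain of the two examples above can be written out in a few lines (the table-4 values and the 3.4 × 10^−6 and 10^−3 factors are from GM; the code is only my restatement of the arithmetic, not GM’s own method):

```python
# GM table 4 (single white dwarf, R = 5400 km): MBHs per million years,
# keyed by (Mmin in TeV, number of dimensions D).
TABLE4 = {(7, 5): 2.1e7,
          (14, 7): 1.0e7}

AREA_RESCALE = 3.4e-6  # white dwarf -> R = 10 km neutron star (GM p. 76)
SCREENING = 1e-3       # weakest-field magnetic screening (GM p. 45)

for (mmin, d), rate in TABLE4.items():
    uncorrected = rate * AREA_RESCALE    # reproduces tables 3 and 9
    corrected = uncorrected * SCREENING  # 'real' trapped-MBH rate
    print(f"Mmin={mmin} TeV, D={d}: {uncorrected:.1f} -> "
          f"{corrected:.4f} MBHs per million years")
```

Note that for the second example the text above multiplies table 9’s rounded value of 36 rather than the recomputed 34, giving 0.036 instead of 0.034; the order of magnitude is the same either way.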
It should be known that high-energy cosmic rays (I think above 2 TeV) have not been directly measured; only the particle showers from the secondary events have. It is not known what kind of rays or particles lead to these events.
The neutron star section does only contain numbers for the hypothetical flux of 100% high energetic cosmic ‘ray‘ protons.
In the hypothetical case of a 100% iron cosmic ‘ray‘ flux, the uncorrected numbers of tables 3 and 9 would have to be smaller by a factor of (roughly) some hundreds or thousands, and in the case of 100% lead even smaller again. (As said, naturally after the multiplication by 10^−3.)
The purpose of a risk analysis concerning the risk of the destruction of Earth should consider all possible worst case scenarios. Therefore all other extreme cases for all possible kinds of cosmic rays, heavy nuclei, particles and exotic particles have to be considered, even for the extreme hypothetical case of a 100% flux as already done for protons in the neutron star section or for proton and iron in the white dwarf section in the report of Prof. Giddings and Prof. Mangano.
.….….….….….…
For further reading please see page 59 to page 61 in the very interesting paper of Alam Rahman, which handles and criticizes many other important points in the report of Giddings and Mangano:
http://www.lhcsafetyreview.org/docs/BlackHoleReview.pdf
Special thanks to EQ, Alam Rahman, Eric Penrose, Tom Kerwick, Otto Rössler, Rudolf Uebbing and LHC-Kritik for discussion.
Sincerely yours, Niccolò
——-
It seems that the tables have been contracted, sorry.
And naturally I have made a mistake again, by not mentioning figure 8 in the appendix on page 77 in GM, which is a curve of MBH production rates by proton and iron cosmic rays. It would be preferable to have tables with the numbers for iron and all other thinkable particles (or, most importantly, for the ones which would produce the smallest numbers of MBHs) in the main neutron star section too. I hope the purpose of my comments is clear anyway.
Thank you.
Amazing, you have “corrected” already-corrected values in order to create a strawman. I suggest you contact the authors and present them with their epic failure.
Dear hdc
A strawman, already-corrected values?
It is quite easy to throw around such generalized phrases.
Is it too difficult for you to criticize me in detail and directly?
Perhaps you have something useful, and I will honestly admit my possible errors, right? And if you are going to criticize me, perhaps you can also tell me which of my points are true? I look forward to further discussion.
Many thanks for your participation, Niccolò
——-
Please read page 60 in the link above and all important pages in GM. Page 60 leads you to GM. Otherwise, point out my supposed error.
Good analysis Niccolò, let's get it into a document for LSAG review/comment. Also, in passing, a casual remark on this section of G&M: “known neutron stars have strong magnetic fields, which are observed to range upward from 10^8 G. In the case of a field of 10^8 G and a radius of R = 10 km, yields a maximum energy…” I question whether this assumption of 10^8 G is a reliable basis for the derivation in the first place.
-Tom.
Dear Tom
Great that you have read it and that you agree.
As I said in my detailed comment, the reduction factor of 10^−3 for the uncorrected numbers in tables 3 and 9 is for neutron stars with the weakest possible magnetic field. A stronger field gives a larger reduction factor, and therefore the 'real' numbers per million years would be even smaller.
I am glad that you say my arguments would be good enough for a real document. But most (if not all) of this is from the great paper of Alam Rahman, which contains many other important points. I have only treated the point about neutron stars.
I sent the paper to some important people several months ago but have not received an answer yet.
Thank you very much.
Best regards,
Niccolò
——-
Hi Niccolo — Looking over this, it seems the values in Table 3 are certainly uncorrected, as stated on page 76, where the numbers are taken ‘neglecting the magnetic screening’, which in appendix G they of course state should be a multiplier of 10^(−3) for the weakest of such known fields. In the binary case I'd also note the scattering of MBH on the companion, of which they say that “since we only need bounds if stable black holes are neutral, the magnetic field of the NS is irrelevant for these”. But they seem to overlook that the screening is still relevant in the initial production — the only point they should be making regarding neutral black holes scattering on the binary pair is that they don't have to apply a 10^(−3) multiplier twice — and they also assume that the binary cases have this same relatively low 10^(8) G field, which I suspect may not be the case (need to confirm this).
Also, just in case you didn't notice, they cite [109] D. R. Lorimer, “Binary and Millisecond Pulsars,” Living Rev. Rel. 8, 7 (2005) [arXiv:astro-ph/0511258] for their lower bound of 10^(8) G fields on neutron stars. I will review this paper and read around to see how reliable this figure is, and how reliable the methods of measuring such fields are.
-Tom.
Tottoli, I don’t see what you get so excited about. The assumptions under which the numbers in table 3 and 9 of the GM paper were derived are stated in the text, there is no ambiguity there. The authors did not list all of the assumptions in the captions of the tables: so what? This is a technical paper aimed at experts, who normally should be able to read (and understand) the whole text.
Kerwick, are you sure that you get GM’s reasoning? The sentence “they seem to overlook that the screening is still relevant in the initial production” makes me doubt that. The way I understand it — after an admittedly QUICK look at section 8.1.1 and appendix H — is that the magnetic field of the NS would deflect the cosmic rays (= the protons), therefore GM consider a binary system in which the companion is *not* a NS, has no strong magnetic field and thus does not deflect the protons. Some of the protons scatter on the companion and produce black holes, which — if they are neutral — subsequently reach the NS and get stopped. Therefore, there is no “screening in the initial production”, since the initial production occurs in the companion. There is on the other hand a reduction factor accounting for the fact that only a fraction of the protons that would reach the NS if the latter had no magnetic field are intercepted by the companion, but this is amply discussed in the text.
PassingByAgain — yes, I considered an NS companion in that previous comment — so I stand corrected: there would be no 10^(−3) screening on the initial production in such binary cases. If I may, I would also like to say that I find it refreshing that the critical debate here has rationalized somewhat — discussing the nuts and bolts of the G&M analysis.
For the passing reader the relevant comments started around here: http://lifeboat.com/blog/2012/04/einsteins-miracle/comment-page-1#comment-107513
Kerwick, I haven’t seen a “critical debate” here. What I have seen is you and Tottoli going on for pages and pages about so-called errors in the GM tables without even having grasped the basic idea (let alone the maths) behind the “binary” argument…
Well, if we got as far as the misinterpretation of the scattering of MBH onto the neutron star of distant binary pairs, we've come a long way from two mosquitoes.
@passingby: as I mentioned before, I think Tottoli has created a private strawman that has nothing to do with the numbers in the table at all…
I therefore suggest again that he should contact the authors of the paper.
Another point is that there are no similar responses/objections from other experts in the field. As science is normally highly competitive, it would look rather strange if no one had written an objection against “misleadings” this “obvious”. Ok, I think Tottoli has no problem explaining even this, as he is implicitly assuming that scientists all over the world were bribed by CERN.…
I therefore suggest again that he should contact the authors of the paper.
He should then also present their answer here in public, of course.
The whole planet knows that Giddings and Mangano never answer criticism or update their paper from the first draft, since they were never serious from the beginning.
Proof: they do not themselves believe in the formal assumption, made by themselves, that “Hawking radiation does not exist.”
You have never criticized them in a scientific manner, so there is in fact no critique from you, scientifically speaking.
You have apparently not even read the paper, as your nonsense about string-theoretical calculations and so on has revealed.
Your “proof” in the above comment is ridiculous — for a scientific discussion of worst-case scenarios it does not matter whether the authors believe something else… It is again the only kind of argument you are able to formulate: infamous attacks on a personal level. Scientific arguments of GM quality (falsifiable, precise mathematical models etc.) cannot be found in any of your pseudoscientific text clouds.
Thank you for replying, even though anonymously.
Quote: “for a scientific discussion of worst case scenarios it does not matter whether the authors believe something else.”
This is correct. But if they never update for 4 years despite important counter-results offered to them, the lack of seriousness present from the beginning starts to show — or does it not?
My revealing that I have asked G&M in vain for 4 years to offer a scientific counterargument, you call “infamous attacks on a personal level.”
This is something only your anonymity allows you to do. Hiding and shooting is not honorable, especially if the author obviously is a high-ranking scientist. Can we talk?
You have disproved or revealed nothing about the GM paper. You have apparently never read it, as is obvious from your few statements about it.
If it is indeed correct that the personal “belief” of a scientist does not matter, why did you mention it then? Did you really think this would be an “argument”?
No, we cannot talk. No one needs to talk to an old enemy of reason who cannot manage to formulate at least one scientific sentence about a paper like GM while attacking the authors on purely personal grounds.
You pretend not to have understood what I said.
Your only argument is that you are clairvoyant. This is very reassuring to yourself, but not necessarily to others.
By the way, no one knows better than me how stupid I am. That is why I am asking to be shown my mistake in Telemach. Maybe you know someone who can do that?
Or could you put in the work to formulate a counter-theorem yourself?
I understood clearly; it was your next attempt to avoid any scientific argument. But speculations about intentions and conclusions about “seriousness” do not replace a scientific counterargument, which you never delivered. Even if GM are not convinced of the non-existence of Hawking radiation, even if they personally think that the whole paper is in principle useless and irrelevant, that does not disprove the arguments presented in the precisely formulated worst-case scenarios.
So again, this nonsense about seriousness is no argument against the GM paper, but proof of your strange avoidance, for 4 years now, of formulating anything of the kind.
Dear PassingByAgain
Thank you for not claiming that I made an error (as hdc did).
But please be honest and precise and do not try to trash my detailed comment (it was quite hard for a layman like me to work it out). To put it like this: I do not understand why you do not understand me.
You say “…so what?“
I am sure that you know the purpose of such a treatise like GM‘s.
So please read my given points one more time (I am not going to repeat them).
LSAG uses similar phrases, like “Nature has already completed about 10^31 LHC experimental programmes since the beginning of the Universe.”
And: “Thus, the continued existence of the Earth and other astronomical bodies can be used to constrain or exclude speculations about possible new particles that might be produced by the LHC,” about 15 times in one report (I think on page 4 alone there are 4 or 5 such phrases).
Such sentences simply lead to a wrong opinion and are untrue in this context. The differences between nature and the LHC are stated somewhere else, but only a few people have time to read every page carefully. And if one does know the whole report, then it becomes obvious that some important points have not even been considered.
You say: “This is a technical paper aimed at experts, who normally should be able to read (and understand) the whole text.“
But as you have seen in some comments above, even experts (or even LSAG) seem to have difficulties understanding it…
And there are real errors.
Don't you think that such an important paper, which treats the worst man-made risk ever discussed, should be kept as easily understandable and as straightforward as possible?
I think my long comment above contains enough important points to be worth posting here.
Neutron stars have been discussed for a long time, but now it is hopefully clear forever that the numbers of captured MBH (mostly at the higher masses) are insufficient (according to GM).
But merely giving the link to Alam Rahman's great paper would not be enough, because only very few people would read it or even click on it.
Anyway, I am quite happy that the issue is now discussed more deeply, which was one of my aims.
Thank you (and thanks to Tom too).
Sincerely yours, Niccolò
——-
Sorry, that was the answer to “PassingByAgain on May 10, 2012 3:37 pm”.
I should have checked first whether the discussion was already on page two…
Perhaps more later. Thanks to all, Niccolò
And dear hdc
Where is the strawman, where exactly? Show me the page and the line and tell me directly and specifically, as I have asked you once already, ok?
Thank you.
Anybody?
http://lifeboat.com/blog/2012/04/einsteins-miracle/comment-page-1#comment-107513
Paper of Alam Rahman:
http://www.lhcsafetyreview.org/docs/BlackHoleReview.pdf
Paper of Giddings and Mangano:
http://arxiv.org/pdf/0806.3381v1.pdf
Tottoli, if you can’t read the GM paper at least read my comment in full. In the part addressed to Kerwick, I explained why the neutron star magnetic screening that gets you so excited is completely irrelevant in the scenario considered in the paper. In short, the companion is NOT a neutron star, and hence has no magnetic screening. The protons in the cosmic rays hit the companion and produce black holes, some of which are then captured by the neutron star (indeed, if the black holes are neutral, they don’t feel the magnetic field). If you don’t even understand this much, it’s useless for you to pick on the details of the tables.
Quote from the anonymous 3-letter author above: “But speculations about intentions and conclusions about the ‘seriousness’ does not replace a scientific counterargument which you never delivered.”
This is what I am saying every day, after having published a theorem in a peer-reviewed math journal.
Please be so kind and try to give the counter-theorem the planet is waiting for. It will then be called “the hdc theorem” and make you famous despite your tactful anonymity.
Tottoli, haven’t you read passingby? He has already shown you the strawman that hdc mentioned.
Rössler, the “peer review” of your journal must be the poorest review in the history of science if you were not even asked the most obvious questions about the derivations of your wrong equations, the missing definitions of variables and so on. All the things you were asked before, here and at other places, were never answered by you.
Therefore your publication is worthless. The same applies to your “paper” published in the fraudulent journal of your friend El Naschie (everyone can inform himself about the truly fraudulent character of this guy; it's not that difficult).
Tottoli, how do you explain the fact that none of the experts, who are surely deeper in the field than you, ever objected to the GM paper, if the errors are so obvious that even a layman like you can find them?
A big conspiracy? All scientists bribed? Or what?
(Rössler has not objected to the paper; he tries desperately to hide even the fact that he has read less of the paper than you, Tottoli.)
In section 8.1, if one assumes the uncorrected rates of Appendix E used in calculating the scattering on a NS in a binary pair are not the uncorrected rates summarised in Table 3 on that same page — as this would imply an NS companion — this would contradict why they refer back to the neutron star production rates in table 9 in the following sentence. Also, regardless, they should not be using uncorrected rates with the FCE for the binary case, though I accept the suppression for a non-NS partner would be a lot less than 10^(−3). Regardless, it does look like Niccolo is correct in that Table 3 should state the values are uncorrected, or else they should be reduced by a multiplier of 10^(−3), as they are quite misleading as is. To answer EQ — I can only suspect this section of the G&M paper was not reviewed in as great detail as the earlier sections, or else I have missed something here…
I ask Lifeboat to place an official remark before every appearance of eq, because he anonymously slanders a leading senior scientist on the planet because he is my friend.
Kerwick (and Tottoli), stop clutching at straws. The table is not misleading at all if one bothers reading the paper. On the other hand, the table would have been useless if the authors had introduced the correction factor due to the magnetic screening. Indeed, it is the uncorrected rates — not the corrected ones — that must be multiplied by the “FCE” in order to determine the number of black holes captured by the NS. A sentence such as “regardless they should not be using uncorrected rates with the FCE for the binary case” makes me suspect that Kerwick still doesn’t get the basic picture…
El Naschie is a numerology crackpot, he is not a leading scientist.
And I would say the very same, as would every lucid person, even if he were not your friend. Nice try again, Rössler.
PassingByAgain — no straw clutching, I am just passing my opinion on the figures being discussed here. I merely agreed that the table should state ‘uncorrected’ at a minimum, as otherwise it misrepresents the real figures for production rates.
As for the binary case, the initial production which results in MBH scattering should use corrected values, as you need to use real figures for the actually produced MBH. It is the secondary scattering onto the NS that does not require such a correction, as the MBH are neutral. As stated in an earlier comment, one would otherwise have to apply a correction twice if the scattered MBH were not neutral. It looks to me at this stage that it is you who does not get the basic picture here. Maybe you read over it all too quickly again… Anyway, these are more questions for LSAG than for messageboards. Best of luck in formulating your concerns, Niccolo.
This high-level discussion on neutron stars gains pepper if I briefly interject a fact all of the discussants know: that neutron star cores are “most likely” protected by quantum mechanics. The Giddings and Mangano paper is so sadly outdated because they have refused, from 4 years ago up to this day, to quote (or contradict) this element of the literature.
Again this unfounded pseudoargument.
Rössler, that was already shown to be insufficient. Shut up ;)
And there is also no reference from you presenting this fantastic mechanism that would protect neutron star matter against accretion.
Where is your fantastic model of gravitational / strong-force superfluidity? :D
I accept any bad word by someone who dares not show his face — if directed against myself. For he says at the same time that he is not to be taken seriously.
But I insist on an apology on this blog for the breach of another person's rights occurring under the label “eq”.
Kerwick: “As for the binary case the initial production which results in MBH scattering should be corrected values”.
OMG, you still don’t get it: the initial production occurs on the companion, which is NOT a neutron star and therefore has NO magnetic screening. It would then be WRONG to apply the 10^−3 correction factor, which accounts for the fact that — in a neutron star — only cosmic rays coming near the magnetic poles can pass through. Come back only when you have understood this basic point.
PassingByAgain — I was not referring to an initial 10^(−3) correction factor. I was referring to the (lesser) correction factor applicable to the companion. Read back over my earlier comments; it seems you missed my point.
Well, I don’t know which earlier comments of yours you are referring to. What “(lesser) correction factor applicable to the companion” do you have in mind?
Dear all
Sorry, I cannot be at my PC 24 hours a day, so in some cases I read and reply later, but I'll try tonight or no later than tomorrow.
Thanks to all for discussion on a high level.
Best regards to all, Niccolò
PassingByAgain — well, that would depend on the companion. If you take the example of a WD companion, it would be a correction for a WD, though I don't see a value calculated for this in the appendix (they only calculate the NS case), and G&M do state that they use ‘uncorrected’ values here. It should be a lot less significant than the NS correction values, of course…
Kerwick: why don’t you just take the formulae in Appendix G and plug in the numbers for the white dwarf? I guess you will find that in that case the suppression is negligible, justifying GM’s use of “uncorrected” values.
PassingBy — one would think I was questioning your Koran. EM deflecting effects in white dwarfs are far from negligible. You’re showing yourself up here a bit…
Dear all
At the moment it seems that Tom has understood the most. There is no error in my long comment on page 1 here.
Again: the second-last paragraph on page 76 in GM states “The production rates on a neutron star (neglecting the magnetic screening) can be obtained from the white dwarf’s ones by rescaling by the surface area. Assuming a 10 km radius, the proton rates in Table 4 are reduced by a factor of 3.4 × 10^−6, leading to the numbers in Table 9.”
Table 3 contains the numbers of table 9.
So I have done what is described there and made two examples in my long comment on page 1, calculating 'my own' numbers starting from table 4.
My results match the numbers of table 9.
And afterwards I applied the reduction factor, to show the 'real' numbers for a single NS.
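The two rescaling steps just described can be sketched in a few lines of Python. The factors are the ones quoted from the GM paper (the surface-area rescaling 3.4×10^−6 from page 76, the magnetic screening 10^−3 from Appendix G for the weakest field); the starting white-dwarf rate is a placeholder, not an actual entry of table 4:

```python
# Sketch of the two rescalings discussed above (placeholder input,
# factors quoted from the GM paper).

AREA_RESCALE = 3.4e-6   # WD -> 10 km NS surface-area factor (GM p. 76)
SCREENING = 1e-3        # magnetic screening, weakest known NS field (App. G)

def ns_rate_uncorrected(wd_rate_per_myr):
    """Table-9-style number: the WD rate rescaled by surface area only."""
    return wd_rate_per_myr * AREA_RESCALE

def ns_rate_real(wd_rate_per_myr):
    """'Real' single-NS rate: also apply the magnetic screening."""
    return ns_rate_uncorrected(wd_rate_per_myr) * SCREENING

wd_rate = 1.0e9  # hypothetical white-dwarf rate per million years
print(ns_rate_uncorrected(wd_rate))  # uncorrected, as in tables 3/9
print(ns_rate_real(wd_rate))         # 1000 times smaller
```

Whatever the starting number, the 'real' single-NS rate always comes out 1000 times below the table value, which is the whole point of the comment above.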
Please compare the numbers and titles of table 4 on page 73 and table 2 on page 40. The titles of table 2 and 4 are correct.
Binary systems are all different and would lead to different numbers in each case.
Also, the FCE (full coverage equivalent) is different for each binary case, because it does NOT simply mean that the neutron star is exposed to MBH equally over its whole lifespan, like a hypothetical neutron star without the magnetic-screening reduction factor.
But it is true anyway that only SOME kind of binaries would have similar 'real' (corrected) numbers of MBH as in table 3 and 9.
The numbers in tables 3 and 9 are absolutely sound for single neutron stars, except that the reduction factor has not yet been applied. But the titles of 3 and 9 are misleading. Please tell me if anything specific is unclear again.
I will take a look from time to time.
Thanks to all, Niccolò
——-
Clarification:
“But it is true anyway that only SOME kind of binaries would have similar ‘real‘ (corrected) numbers of MBH as in table 3 and 9.” means that the numbers would be ‘real’ for some binaries, but I did not mean that the numbers of tables 3 and 9 are the ‘real’ (corrected) numbers.
Dear all
To be clear about FCE and so on, please read page 61 in the paper of Rahman. Or, to be absolutely clear, please start on page 59. GM is perhaps too tricky to read without the help of Rahman's paper.
Have a good day, Niccolò
——-
Edit:
I said: “But it is true anyway that only SOME kind of binaries would have similar ‘real‘ (corrected) numbers of MBH as in table 3 and 9.”
So I have to add: …if tables 3 and 9 referred to “per million years of FCE”, naturally!
(Sorry people, I am tired)…
Tottoli: so your whole point seems to be that the captions of the tables 3 and 9 in the GM paper should contain the word “uncorrected”. Who cares? It is absolutely clear from the text of the paper that the numbers in the tables do not include the correction for magnetic screening. And — as I wrote above — it is indeed the uncorrected numbers (not the corrected ones) that must be multiplied by the FCE when considering the binary system. Showing the corrected numbers in the tables would have been useless and confusing in this context.
Kerwick: quantify your “far from negligible”. A correction of order one is indeed negligible in a reasoning about orders of magnitudes. And I suspect that in binary systems in which the companion is not a white dwarf the effect would be even smaller.
Dear PassingByAgain
Oh yes: “Who cares” — a good question!
It is true that “…it is absolutely clear from the text…” that the numbers of table 3 have to be reduced at least 100 times.
But anyway, it seems that (even) you have not got the whole picture yet.
There is more than one point in my detailed comment on page 1 here, and I still stand by all of them.
I said that the paper of GM is tricky and misleading, and you have not got it yet. But please read my entire comment here now; the solution for you comes at the end.
GM, page 45, 2nd paragraph states:
“In order to compute the actual production rate on the neutron star, we use the uncorrected rates of Appendix E, times the number of years of FCE.“
So one goes to appendix E.
In the appendix are many tables but table 9 refers to neutron stars.
I have shown in my two examples that these numbers are uncorrected and have to be 1000 times smaller to be consistent with the title of the tables, which states: “Table 9: Black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = max(0.5, Mmin/14 TeV).”
Now we know that “per million years” is simply wrong, because these numbers are NOT the MBH production rates PER MILLION YEARS on A neutron star!
Now we all know and agree too that the title of table 3 (which is not in the section on binaries) is simply wrong, because it states: “Table 3: Summary of black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star. MD = Mmin/3 and y = Mmin/14 TeV,” because the table is NOT a summary of MBH production rates PER MILLION YEARS on A neutron star!
Now it should be clear too for all that the claim in the LSAG report (1st paragraph, page 9) that cosmic rays hitting neutron stars “would have produced black holes copiously during their lifetimes” is wrong too, because the ‘real‘ numbers for single neutron stars are too small (as GM states): “These numbers are too small to allow sufficient rate for all cases, and specially those at the highest black hole masses.”
But I have made some other important points, for example that all the other hypothetical cases for other particles (most importantly those which would lead to the smallest numbers of MBH produced on a neutron star) should be considered, which is not the case in GM.
xxxxx Here is the answer to your new comment: xxxxx
It seems that you have not read the paper of Alam Rahman, which (as I said) helps a lot in understanding the quite tricky paper of GM.
Therefore you have made exactly the same error in your new comment that most ‘normal’ readers (and probably even most experts — and I would call you an expert) would make:
It is wrong to simply multiply the uncorrected numbers of table 9 or 3 by the number of years of FCE!
The paper of Rahman (page 61) gives you the solution:
“To arrive at the rates per real million years one would have to multiply by the fraction of the sky covered by the neutron star’s companion. This fraction would vary depending on the size of the companion and its distance from the neutron star. Appendix H.1 gives a range from 0.002 up to 0.06 [GM p. 86], which implies that the rates in table 3 are actually the number of black holes expected during a time period ranging from 500 million years down to about 17 million years.” (See page 61 in the “LHC safety review” link (Alam Rahman), which I have given several times.)
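The arithmetic in that quotation is easy to check; here is a minimal sketch (the sky fractions 0.002 and 0.06 are the ones quoted from GM Appendix H.1):

```python
# If the companion covers only a fraction f of the neutron star's sky,
# one "million years of FCE" corresponds to 1/f real million years.

def real_myr_per_fce_myr(sky_fraction):
    """Real million years needed to accumulate one FCE million years."""
    return 1.0 / sky_fraction

# GM Appendix H.1 range of sky fractions:
print(real_myr_per_fce_myr(0.002))  # about 500 million years
print(real_myr_per_fce_myr(0.06))   # about 17 million years
```

This reproduces Rahman's range of 500 million years down to about 17 million years for the numbers in table 3.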
I hope everything is clear now, and I hope you will see that I was absolutely right about everything.
It is better to read everything carefully first. Otherwise much time is lost in discussion and rumours which will never come to an end.
Thank you.
——-
Correction for my first paragraph:
Not “at least 100 times” but “1000 times”, sorry.
Tottoli, please, read again what you just wrote:
“To arrive at the rates per real million years one would have to multiply by the fraction of the sky covered by the neutron star’s companion. This fraction would vary depending on the size of the companion and its distance from the neutron star.”
This is PRECISELY the reduction encoded in the “FCE”, see eq.(8.1) of the GM paper and the text immediately below it. Due to this reduction, over a typical lifetime of a billion years a neutron star in a binary system receives the number of black holes that a solitary neutron star without magnetic screening would receive in the FCE, i.e. in just two million years. In other words, the relevant number of black holes is given by the FCE times the numbers in table 3. Do you get it now?
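The reasoning in this comment can be put into numbers (the billion-year lifetime and the 0.002 sky fraction are illustrative values taken from the discussion; the per-Myr rate is a placeholder):

```python
# Over its lifetime, a binary NS accumulates
# (uncorrected rate per Myr) x (FCE in Myr) black holes.

LIFETIME_MYR = 1000.0   # ~1 billion years, illustrative
SKY_FRACTION = 0.002    # lower end of the GM Appendix H.1 range

fce_myr = LIFETIME_MYR * SKY_FRACTION  # full-coverage-equivalent exposure

def total_captured(uncorrected_rate_per_myr):
    """Total MBH reaching the NS over its lifetime: the safety-relevant number."""
    return uncorrected_rate_per_myr * fce_myr

print(fce_myr)               # about 2 Myr of FCE, as stated above
print(total_captured(50.0))  # with a placeholder rate of 50 per Myr
```

With these illustrative inputs, a billion-year lifetime at a 0.002 sky fraction indeed amounts to about two million years of full-coverage exposure.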
Note BTW that Kerwick’s objection is different: he argues that one should also take into account the effect of the magnetic screening of the companion (which is not a neutron star). However, he still has to show that this effect can qualitatively change the result.
P.S. just curious: why do you keep saying that table 3 is not in the section on the binaries?
Dear PassingByAgain
Yes, at the moment I think that we both got it! I thought you would think something else, sorry.
To say it in other words one more time:
If one multiplies the numbers in table 3 or 9 by the number of years of FCE, then one does not get the real numbers of MBH per million years; one gets only the total integrated MBH exposure over the entire lifetime of the neutron star.
If you would like to calculate the number of MBH per 'real' million years (naturally not million years of FCE) produced on the binary neutron star, then you must divide the result of the FCE calculation (8.1, on page 46) by the estimated age of the neutron star.
Right?
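A sketch of the distinction being drawn here (all numbers are placeholders; eq. (8.1)'s result is represented simply as rate times FCE):

```python
# Total integrated exposure (rate x FCE) vs. the average number of
# MBH per *real* million years (total divided by the NS's age).

def total_exposure(uncorrected_rate_per_myr, fce_myr):
    """Eq. (8.1)-style total: uncorrected rate times FCE in Myr."""
    return uncorrected_rate_per_myr * fce_myr

def per_real_myr(uncorrected_rate_per_myr, fce_myr, age_myr):
    """Average MBH per real million years of the NS's life."""
    return total_exposure(uncorrected_rate_per_myr, fce_myr) / age_myr

# Placeholder numbers: rate 50 per Myr, FCE 2 Myr, age 1000 Myr
print(total_exposure(50.0, 2.0))        # total over the lifetime
print(per_real_myr(50.0, 2.0, 1000.0))  # much smaller per-real-Myr figure
```

The two quantities differ by the factor age/FCE, which is exactly why "per million years" and "per million years of FCE" must not be confused.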
About your P.S.: Table 3 is ABOVE the binary section, on page 46 in GM, and it is mentioned in the binary section (same page), but it isn't IN the binary section on page 46.
Would you agree so far?
I wish you a good day, Niccolò
——-
BTW: I would generally agree with Tom that the magnetic screening of the binary companion depends on the density and type of the companion. But I have not investigated it yet.
I do not know how much the magnetic screening of the companion would amount to, but as we have seen, the magnetic screening of a neutron star also varies quite a lot.
Good day to all!
Tottoli, you wrote: “you get ONLY the number of the total integrated MBH exposure over the entire lifetime of the neutron star”.
But this is the only number that matters for the safety argument!!! It is the number of black holes that the systems we observe may have accumulated so far without being destroyed. How come you still don't understand???
As to the table: scientific papers are usually written with LaTeX, which treats pictures and tables as floating objects and places them wherever it’s more convenient to optimize the page layout. All that matters is where the object is mentioned first in the text (in this case, section 8.1.1: “is shown HERE in table 3″). Nobody who is even vaguely used to reading such papers would doubt that the table is in section 8.1.1.
(if you don’t believe me, look at the other tables and pictures in the GM paper: all of them are either on top of a page, above the text, or in a page without text. That is the default choice of LaTeX, and most authors don’t bother forcing the program to put the floating objects in some specific place)
earlier comments here http://lifeboat.com/blog/2012/04/einsteins-miracle/comment-page-2
Other than Otto Rossler, has anyone come up with a plausible physical mechanism for black holes generated by cosmic ray collisions to be uncharged? If these black holes are charged, they would be stopped (by the Earth, Sun, white dwarfs, neutron stars, etc.) quickly after formation.
So really this whole thing hinges on whether black holes can be charged.
Also — would the magnetic screening of cosmic rays on neutron stars really stop them from hitting, or just deflect them to polar collisions?
Conant: of course there is no special reason why the black holes should be neutral (and stable), but it makes sense to investigate the worst-case scenario even if it is theoretically unfounded. As to the magnetic screening, the problem is not the deflection of the protons, but the fact that the magnetic field causes the protons to lose part of their energy, up to the point where they might not be energetic enough to produce black holes (see appendix G of the GM paper).
Tottoli: have you now understood that the relevant quantity is the number of black holes absorbed by the neutron star over its lifespan (and not the number per million years)?
Dear PassingByAgain
Thanks for your first comment on page 3; it leads me to the idea that you could help me find some additional misleading points in GM.
To your comment on page 2:
http://lifeboat.com/blog/2012/04/einsteins-miracle/comment-page-2#comment-107736
You can tell me what you want but I do not agree with your statement: “Nobody who is even vaguely used to reading such papers would doubt that the table is in section 8.1.1.” Even because of the misleading title of table 3. (And please do not tell I would not know how to read publications, because you do not know me.)
However, your point about LaTeX is true; still, a publication (especially such an important one) should be prepared carefully enough that most readers can understand it quickly and do not form a wrong impression. This is important because such a risk assessment should be suitable for an open discussion among various people, and one never knows who might still have an important idea regarding it.
I would say that it generally should not matter whether we have a table with the numbers of MBH on neutron stars per million years or per entire lifespan of the NS. I agree that the most important point is “per lifespan”, but the lifetime varies. (You know that I know this, so please stop trying to convince naive readers that I do not understand anything.)
If we talk about NS, the only important parameters for finding out whether there is a proof of safety regarding neutral, stable, slow MBH possibly produced at the LHC are these: the lifespan of the NS, the exact type of ‘neutron‘ star and its core, the number of trapped MBH per time scale, the size of the MBH (or of its smaller and stable remnant), the space between the particles of the NS, and the time of accretion.
If one does not know the accretion rate, then it also matters whether the MBH is trapped at the beginning or at the end of the lifespan of the NS.
But again: one important point (though not the most important) in my long comment on page 1 here is that the title of table 3 is “Summary of black hole production rates, per million years, induced by proton cosmic rays impinging on a R = 10 km neutron star” and does not mention “million years of FCE” or “binaries” in any way.
LSAG deals with single neutron stars, or generally with “neutron stars”, and so does GM (8.1 and elsewhere). So where is a table with a correct title, or an argument with some real numbers, on single neutron stars (which is the main issue)? And if you still think that table 3 is mostly for the binary section, why does the title not mention “per million years of FCE”?
The ‘trick‘ would simply be to make the publication (and all tables) as easily understandable and as little misleading as possible, which does not seem to be the case here.
Also I would say that naive readers do not think about LaTeX.
It would not be so difficult to put the tables in the right section (instead of above the title of the section, or at the end of the solitary section), with the real final (corrected) numbers, for example MBH per real million years for single NS, MBH per lifespan of NS, MBH per million years of FCE (for binaries), or whatever. But I already know you will very probably see it differently again.
Thanks to Mr. Conant for the question concerning the neutrality of MBH.
Dear PassingByAgain, naturally I agree that it makes sense to investigate the worst-case scenario, but I do not agree with your argument that “of course there is no special reason why the black holes should be neutral (and stable)”, because Prof. Rössler’s paper is not the only one on this point. Charge can be seen as “information” too, and we do not know whether any information can come out of a black hole. Hawking radiation no longer appears on CERN’s public safety page, and there is a reason for that. Prof. Rössler is not the only one who argues against Hawking radiation.
You mention the worst-case scenario, but you have not yet answered my argument that GM should have considered all possible particles, rays and all conceivable sorts of exotic particles, especially those cosmic rays which would give the smallest numbers of MBH trapped on a (single or binary) neutron star.
Thanks for discussion. Best regards, Niccolò
Tottoli, this discussion is going nowhere. To start with, table 3 IS in section 8.1.1, and if you still think otherwise, perhaps you have a problem with the English language (“shown here in table 3”), not just with scientific papers. Anyway, this obsession of yours with the captions of the tables is just petty. True, the authors could have written “uncorrected” in the captions, but so what? Once again, the meaning of the numbers in the tables is clear to anybody who can read the text. BTW, why did you have to repeat the calculation to realize that the numbers did not include the correction for magnetic screening? Couldn’t you just read it on page 46?
Besides, what would a “naive reader” do with the information that a neutron star in a binary system absorbs a given number of black holes per million years? The relevant message of section 8 is that, if micro black holes were: 1) existing; 2) lighter than 14 TeV; 3) neutral; 4) stable; and 5) able to accrete at a rate dangerous to Earth, a typical neutron star in a binary system would be destroyed by them, even if the cosmic rays contained only 10% protons. Again, this information is clearly written and fully explained in the paper, and it is accessible to anybody who bothers to read the text instead of just looking at the pictures. And remember, this is a paper published in a specialist journal, aimed at actual physicists. If you and your “naive readers” get confused by the “tricky” tables, or are unable to understand eq.(8.1), to take another example, well, that is just your problem.
As to the LSAG report, it’s a different matter. That report is much shorter, much less technical and addressed to a wider public. The issue of micro black holes is just one among several that are considered, and the bounds from white dwarfs and neutron stars are discussed (without numbers) in just two paragraphs on page 8. I don’t know why the LSAG authors did not specify that the bounds refer to neutron stars in binary systems. Probably they thought that the “naive reader” would not care about the subtlety, and the expert one would be able to read (and understand) the details in the GM paper.
Dear Mr. Conant:
Thank you for your question regarding the charge of black holes, which is maximally important, as you point out.
To the best of my knowledge, the most high-ranking colleague of mine who shares in this result is Professor Richard J. Cook of the Air Force Academy in Colorado Springs. He wrote to me that he fully supports the Ch of my Telemach result (while he himself had independently arrived at T, L, M).
My Telemach paper in the African Journal of Mathematics and Computer Science gives the relevant references. I hope Professor Cook forgives me for pulling him into this discussion here, which I consider to be sufficiently scientific in content.
This is a highlight in the public discussion of the LHC experiment. I hope your question will be followed by further questions by others in the same, purely objective spirit.
Sincerely yours,
Otto E. Rössler
Dear PassingByAgain
Agreed with your first sentence. I do not want to talk in circles or to make you angry. On the contrary: it was great to have such an intelligent and strong discussion partner as you over three pages, starting here (thanks to all, to the moderation and to all readers):
http://lifeboat.com/blog/2012/04/einsteins-miracle/comment-page-1#comment-107513
But to answer your question: I made the two example calculations (see link above) to show some real (corrected) numbers per real million years on single neutron stars, so as to prove that the title of table 3 is incorrect. It was also to highlight the statement of GM that the numbers of MBH on neutron stars “are too small to allow sufficient rate for all cases, and specially those at the highest black hole masses.” (GM, page 85).
May I ask you three very final questions, in the hope of getting a very honest answer?
1.) Do you agree that there is no proof of the type of these high-energy cosmic ‘ray‘ particles that might lead to MBH?
2.) Would you find it a good idea to make calculations specifically for the hypothetical particles which would produce the smallest number of MBH on single neutron stars or binaries?
3.) What is your motivation to be on the side of CERN?
Thank you very much. Sincerely yours, Niccolò
——-
But then to the comment of Prof. Rössler again…
Tottoli: I am not an astrophysicist, and what I know about cosmic rays I read in this (and a few other) paper(s). To obtain the bounds from neutron stars, GM assume that the high-energy cosmic rays contain at least 10% protons, and they quote several sources (refs.[67]-[73]) to argue that this is a conservative assumption. Do you have a problem with any of those sources? Anyway, note that the authors are ready to admit, both in section 8.1.1 and in the conclusions, that “a greater dominance of heavy elements reduces the range of such bounds”. I don’t see what would be the point of repeating the calculation for other “particles which would produce the smallest number of MBH”. You already know that this would result in weaker bounds. But if there are at least 10% protons in the cosmic rays, as seems likely, the bounds from neutron stars in binary systems are safe; otherwise they are not.
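The proportionality behind the 10% argument can be sketched in one line: the expected number of proton-induced black holes simply scales with the proton fraction of the flux. A minimal illustration, with a baseline number invented for the example (not taken from the GM paper):

```python
# Illustrative sketch only: the expected number of proton-induced black-hole
# captures scales linearly with the proton fraction of the cosmic-ray flux.
# The baseline value is invented for illustration, NOT taken from the GM paper.

def captures_for_fraction(captures_all_proton: float, proton_fraction: float) -> float:
    """Scale the all-proton capture count down to a given proton fraction."""
    return captures_all_proton * proton_fraction

baseline = 1000.0  # hypothetical captures over a lifespan for a 100% proton flux
print(captures_for_fraction(baseline, 0.10))  # 100.0: still many captures at 10% protons
```

So even if only 10% of the flux is protons, the expected number of captures drops by only a factor of ten; the bound fails only if the proton fraction is far smaller than that.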
As to your third (silly) question, isn’t the answer obvious? I hate humankind, and I want it to be swallowed by a black hole. Or, alternatively, CERN has kidnapped my family and threatens to annihilate them with antimatter if I don’t comply.
Dear PassingByAgain
Hahaha, oh no, hopefully your last paragraph was not meant seriously. But I should not laugh, because the theme under discussion could have a sad outcome… In any case, it does not answer my silly question (please forgive me).
However, the last sentence in your first paragraph is exactly the point, and very honest!
I do not have a “problem” with GM giving these references.
The problem is that not ALL of these references (even the ones in LSAG that deal with high-energy cosmic rays) are completely honest. Let me explain:
The only completely honest approach regarding the kind of these high-energy cosmic rays is to present them as assumptions, and (looking at the texts) this is not done in all of these references, though in some it is (believe me, I have seen it, or please take a look yourself).
I know that there are assumptions (and some “signs”) regarding the composition of high- and even ultra-high-energy cosmic rays or particles, but the fact is that the real composition is not KNOWN; that is why they have tables with hypothetical 100% proton and/or 100% iron fluxes.
(At the LHC we know we have protons or lead ions.)
As it is important to consider the worst case, one could say that 100% iron is somehow meant as the worst case, but the true worst case would be different; hence my 2nd question.
Because the type of these rays or particles (possibly producing MBH) is not known for sure, and only secondary “events” have been measured, one (or GM) would have to make calculations specifically for at least those particles, or preferably even for exotic particles, if possible, which would produce the smallest numbers of MBH on neutron stars. The very best would be calculations for a hypothetical 100% flux, as they have done for protons and/or iron, for example in table 2.
Thank you very much. Sincerely yours, Niccolò
——-
Tottoli: true, I didn’t really answer your third question. That’s because it IS silly and it does not deserve an answer. In science we do not “take sides” for this or that group. In science we assess the validity of an argument, which can be correct or incorrect, sufficiently or insufficiently motivated, and so on. I don’t need to “take sides” to see that Rossler’s “Telemach” is gibberish, as I don’t need to “take sides” to see that the GM paper contains — to the best of my understanding — sound physics.
For the rest, I haven’t read all of the references [67]-[73], but if your judgment about their “honesty” is based on the position of the tables and the wording of the captions, I am not particularly worried about it. And I still fail to understand (but please, don’t bother explaining it) your insistence on the fact that GM should repeat their calculation of the neutron-star bounds with 100% iron (or whatever else) flux. Indeed, “100% iron flux” falls into the category “less-than-10% proton flux”, which is already discussed in the paper. In that case, which the authors consider “unlikely”, the bounds from neutron stars are no longer valid. Note however that, as you can read on page 40, the bounds from white dwarfs remain valid even in the case of a 100% iron flux.
Dear PassingByAgain
Hypothetical 100% iron would not give the smallest number of MBH.
But no problem if you do not understand my last paragraph. It‘s OK, because I have said everything I had on my mind.
I think we should stop now in order to let others present their arguments (such as Mr. Conant or Prof. Rössler above).
Let‘s just hope that there is no risk; otherwise we will meet somewhere over the rainbow…
Many thanks. Stay critical and self-critical. Peace! Niccolò
——-
BTW: In case you really do not understand: the composition of CR is not known. For example, a hypothetical 100% lead or gold flux would give fewer MBH than iron. OK?
(I mean the composition of the high-energy cosmic rays or particles!)
Thank you, Niccolò
——-