
Oct 7, 2012

Debunking Pulse Detonation Engines — Yes, No, Maybe

Posted in categories: business, defense, engineering, military, nuclear weapons, physics, space, treaties

Previous posting in this Debunking Series.

In this post we will look at the last three types of engines. Can these engine technologies be debunked?

Start with the boring stuff: nuclear/plasma engines. For more information, look up Franklin Chang-Diaz’s Variable Specific Impulse Magnetoplasma Rocket (VASIMR). Real. Cannot be debunked.

Now for the more interesting stuff. The second is Pulse Detonation Engines (PDE). This type of engine uses detonation waves to combust a fuel and oxidizer mixture. “The engine is pulsed because the mixture must be renewed in the combustion chamber between each detonation wave initiated by an ignition source.” Theoretically, this type of engine is capable of speeds from subsonic to Mach 5.
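To get a feel for why that fill-and-detonate cycle limits the pulse rate, here is a back-of-envelope sketch. All figures in it (tube length, fill speed, detonation speed) are illustrative assumptions, not numbers from the post:

```python
# Back-of-envelope PDE cycle-frequency estimate; all figures are
# illustrative assumptions for this sketch.

L = 1.0          # detonation tube length, m (assumed)
D_CJ = 1800.0    # Chapman-Jouguet detonation speed, m/s (typical order for fuel-air mixtures)
u_flow = 100.0   # mean fill/purge flow speed through the tube, m/s (assumed)

t_detonation = L / D_CJ   # time for the detonation wave to traverse the tube
t_fill = L / u_flow       # time to refill the chamber with fresh mixture
t_purge = L / u_flow      # time to flush the burned gases

cycle_time = t_detonation + t_fill + t_purge
print(f"pulse frequency ~ {1.0 / cycle_time:.0f} Hz")   # ~49 Hz
```

Even in this idealized accounting, refilling and purging dominate the cycle, which is one reason practical designs pulse at tens to hundreds of hertz rather than burning continuously.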

Continue reading “Debunking Pulse Detonation Engines — Yes, No, Maybe” »

Oct 6, 2012

The decaying web and our disappearing history

Posted in categories: information science, media & arts, philosophy

On January 28, 2011, three days into the fierce protests that would eventually oust the Egyptian president Hosni Mubarak, a Twitter user called Farrah posted a link to a picture that supposedly showed an armed man as he ran on a “rooftop during clashes between police and protesters in Suez”. I say supposedly, because both the tweet and the picture it linked to no longer exist. Instead they have been replaced with error messages that claim the message – and its contents – “doesn’t exist”.

Few things are more explicitly ephemeral than a Tweet. Yet it’s precisely this kind of ephemeral communication – a comment, a status update, sharing or disseminating a piece of media – that lies at the heart of much of modern history as it unfolds. It’s also a vital contemporary historical record that, unless we’re careful, we risk losing almost before we’ve been able to gauge its importance.

Consider a study published this September by Hany SalahEldeen and Michael L Nelson, two computer scientists at Old Dominion University. Snappily titled “Losing My Revolution: How Many Resources Shared on Social Media Have Been Lost?”, the paper took six seminal news events from the last few years – the H1N1 virus outbreak, Michael Jackson’s death, the Iranian elections and protests, Barack Obama’s Nobel Peace Prize, the Egyptian revolution, and the Syrian uprising – and established a representative sample of tweets from Twitter’s entire corpus discussing each event specifically.

It then analysed the resources being linked to by these tweets, and whether these resources were still accessible, had been preserved in a digital archive, or had ceased to exist. The findings were striking: one year after an event, on average, about 11% of the online content referenced by social media had been lost and just 20% archived. What’s equally striking, moreover, is the steady continuation of this trend over time. After two and a half years, 27% had been lost and 41% archived.
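Taking the article’s figures at face value, a quick linear fit through the two reported data points shows how steady that decay is. This is purely illustrative arithmetic on the numbers quoted above, not part of the study itself:

```python
# Implied content-loss rate from the two figures quoted above (illustrative).

t1, lost1 = 1.0, 0.11    # one year after an event: ~11% of linked content lost
t2, lost2 = 2.5, 0.27    # two and a half years after: ~27% lost

rate = (lost2 - lost1) / (t2 - t1)   # additional fraction lost per year
print(f"~{rate:.1%} of linked content lost per year")                    # ~10.7% per year
print(f"linear extrapolation to 5 years: ~{lost1 + rate * 4:.0%} lost")  # ~54%
```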

Continue reading “The decaying web and our disappearing history”

Oct 5, 2012

Want to Get 70 Billion Copies of Your Book In Print? Print It In DNA

Posted in categories: biological, biotech/medical, chemistry, futurism, information science, media & arts

I have been meaning to read a book coming out soon called Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves. It’s written by Harvard biologist George Church and science writer Ed Regis. Church is doing stunning work on a number of fronts, from creating synthetic microbes to sequencing human genomes, so I am definitely interested in what he has to say. I don’t know how many other people will be, so I have no idea how well the book will do. But in a tour de force of biochemical publishing, he has created 70 billion copies. Instead of paper and ink, or PDFs and pixels, he’s used DNA.

Much as PDFs are built on a digital system of 1s and 0s, DNA is a string of nucleotides, each of which can be one of four different types. Church and his colleagues turned his whole book, including illustrations, into a 5.27-megabit file, which they then translated into a sequence of DNA. They stored the DNA on a chip and then sequenced it to read the text. The book is broken up into little chunks of DNA, each of which has a portion of the book itself as well as an address to indicate where it should go. They recovered the book with only 10 wrong bits out of 5.27 million. Using standard DNA-copying methods, they duplicated the DNA into 70 billion copies.
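As a rough illustration of the chunk-and-address idea, here is a toy sketch. To be clear, this is not Church’s actual scheme (his team used a different encoding and chunk format); the two-bits-per-base mapping, chunk size, and address width below are invented for the example:

```python
# Toy DNA storage sketch: 2 bits per base, addressed chunks.
# Illustrative only; not the encoding used by Church and colleagues.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def bytes_to_bases(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def bases_to_bytes(bases: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in bases)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

def encode(data: bytes, chunk_bytes: int = 12) -> list[str]:
    # Each synthetic fragment carries a 16-bit address followed by its
    # payload, so fragments can be sequenced in any order and reassembled.
    fragments = []
    for addr, start in enumerate(range(0, len(data), chunk_bytes)):
        header = addr.to_bytes(2, "big")
        fragments.append(bytes_to_bases(header + data[start:start + chunk_bytes]))
    return fragments

def decode(fragments: list[str]) -> bytes:
    pieces = {}
    for fragment in fragments:
        raw = bases_to_bytes(fragment)
        pieces[int.from_bytes(raw[:2], "big")] = raw[2:]
    return b"".join(pieces[addr] for addr in sorted(pieces))

text = b"Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves"
assert decode(encode(text)) == text
```

The address header is what lets millions of short, unordered DNA fragments be sequenced in any order and still be stitched back into a single file.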

Scientists have stored little pieces of information in DNA before, but Church’s book is about 1,000 times bigger. I doubt anyone would buy a DNA edition of Regenesis on Amazon, since they’d need some expensive equipment and a lot of time to translate it into a format our brains can comprehend. But the costs are crashing, and DNA is a far more stable medium than that hard drive on your desk that you’re just waiting on to die. In fact, Regenesis could endure for centuries in its genetic form. Perhaps librarians of the future will need to get a degree in biology…

Link to Church’s paper

Source

Oct 4, 2012

How do you debunk this?

Posted in categories: defense, engineering, finance, particle physics, physics, scientific freedom, space

Previous post in this Debunking Series.

——-

This video was broadcast on G4TV, September 19th 2012.

http://www.g4tv.com/videos/60838/dr-eric-w-davis-on-new-ligh…g-science/

Continue reading “How do you debunk this?” »

Oct 2, 2012

In Conversation with Albert-László Barabási on Thinking in Network Terms

Posted in categories: complex systems, information science

One question that fascinated me in the last two years is, can we ever use data to control systems? Could we go as far as, not only describe and quantify and mathematically formulate and perhaps predict the behavior of a system, but could you use this knowledge to be able to control a complex system, to control a social system, to control an economic system?

We always lived in a connected world, except we were not so much aware of it. We were aware of it down the line, that we’re not independent from our environment, that we’re not independent of the people around us. We are not independent of the many economic and other forces. But for decades we never perceived connectedness as being quantifiable, as being something that we can describe, that we can measure, that we have ways of quantifying the process. That has changed drastically in the last decade, at many, many different levels.
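This excerpt stays at the level of questions, but one concrete result in this direction is Barabási and colleagues’ work on structural controllability, where the minimum number of “driver” nodes needed to steer a network falls out of a maximum-matching computation. Here is a toy sketch of that idea, assuming the networkx library; the example network is invented:

```python
import networkx as nx

# Toy sketch of structural controllability: the minimum number of driver
# nodes equals N minus the size of a maximum matching in the network's
# bipartite representation (out-copies on the left, in-copies on the right).

def driver_node_count(edges, n):
    B = nx.Graph()
    left = [("out", i) for i in range(n)]
    B.add_nodes_from(left, bipartite=0)
    B.add_nodes_from([("in", i) for i in range(n)], bipartite=1)
    B.add_edges_from((("out", u), ("in", v)) for u, v in edges)
    matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=left)
    matched_edges = len(matching) // 2   # the dict records both directions
    return max(n - matched_edges, 1)

# A directed chain 0 -> 1 -> 2 -> 3 can be steered from a single driver node.
print(driver_node_count([(0, 1), (1, 2), (2, 3)], 4))   # -> 1
```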

Continue reading “Thinking in Network Terms” and watch the hour-long video interview

Oct 2, 2012

Evolution in a Toxic World

Posted in categories: biological, evolution

Earth is a hostile place, and that’s even before one starts attending school. Even when life first sparked into being, it had to evolve defenses against a number of toxins, such as damaging ultraviolet light. Then there were toxic elements, ranging from iron to oxygen, to overcome; later there were DDT and other toxic chemicals; and of course, there are all those dreaded cancers.

In Evolution in a Toxic World: How Life Responds to Chemical Threats [Island Press, 2012; Guardian Bookshop; Amazon UK; Amazon US], environmental toxicologist Emily Monosson outlines three billion years of evolution that equipped life to withstand the hardships of living on this deadly planet, giving rise to processes ranging from excreting and transforming harmful substances to stowing them away. The subtitle erroneously suggests these toxins are only chemical in nature, but the author actually discusses more than this one subclass of toxins.

The method that arose to deal with these toxins is a plethora of specialised, targeted proteins: enzymes that capture toxins and repair the damage they cause. By following the origin and progression of these shared enzymes that evolved to deal with specific toxins, the author traces their history from the first bacteria-like organisms to modern humans. Comparing the new field of evolutionary toxicology to biomedical research, Dr Monosson notes: “In light of evolution, biomedical researchers are now asking questions that might seem antithetical to medicine”.

Continue reading “Evolution in a Toxic World”

Oct 2, 2012

The Ontological Einstein – Minipaper

Posted in categories: existential risks, particle physics

The Ontological Einstein – One to Four

Otto E. Rossler and Dieter Fröhlich, Faculty of Science, University of Tübingen, Auf der Morgenstelle 8, 72076 Tübingen, Germany

One: Ontological clock slow-down downstairs in gravity

Two: Ontological rest-mass decrease downstairs in gravity

Continue reading “The Ontological Einstein – Minipaper” »

Oct 1, 2012

Debunking Antimatter Rockets for Interstellar Travel

Posted in categories: education, engineering, physics, policy, space

Previous Post in this Debunking Series.

Why is it necessary to debunk bad or unrealistic technologies? If we don’t, we live in a dream world idealized by theoretical engineering that has no hope of ever becoming financially feasible. What a waste of money, human resources and talent. I’d rather we know now, upfront, and channel our energies into finding feasible engineering and financial solutions. Wouldn’t you?

We did the math required to figure out the cost of the antimatter fuel one would require just to reach 0.1c and then coast at that velocity, never mind actually reaching Alpha Centauri.

Table 2: Antimatter Rocket Fuel Costs to Alpha Centauri at 0.1c (in metric tons)
Source of Estimates | Amount of Antimatter Required | Maximum Velocity | Spacecraft Mass
Continue reading “Debunking Antimatter Rockets for Interstellar Travel” »

Oct 1, 2012

Debunking Conventional Rocket Interstellar Travel Once And For All

Posted in categories: education, engineering, physics, policy, space

Previous Post in this Debunking Series.

Why is it necessary to debunk bad or unrealistic technologies? If we don’t, we live in a dream world idealized by theoretical engineering that has no hope of ever becoming financially feasible. What a waste of money, human resources and talent. I’d rather we know now, upfront, and channel our energies into finding feasible engineering and financial solutions. Wouldn’t you?

We did the math required to figure out how much fuel one would require just to reach 0.1c and then coast at that velocity until you reach Alpha Centauri, then reverse thrust to orbit the star.

Table 1: Conventional Rocket Fuel Costs to travel to Alpha Centauri at 0.1c
Maximum Velocity (km/s) | 1980’s cost ($/lb)
Continue reading “Debunking Conventional Rocket Interstellar Travel Once And For All” »

Oct 1, 2012

Liquor & Glass — Sellafield/BNFL Keeping a Lid on It

Posted in categories: engineering, ethics, nuclear energy, policy, sustainability, transparency

Fukushima reawakened the world to the dangers of nuclear power, and reading back over Fearing Sellafield (2003) by Colum Kenny recently, I reflected on how deflective and dishonest industry can be in steering clear of critical opinion. Seeing parallels suggested in other industries today, I wonder if much has really changed.

Highly Active Liquor (HAL), produced by the reprocessing of irradiated nuclear fuel at Sellafield, reached a level of 1,500 cubic meters in storage at its peak circa 2001, roughly the capacity of a 50-meter Olympic swimming pool. The liquor is particularly unstable: a disruption to electricity and water coolant could result in it boiling, overloading the ventilation filtration systems and leading to a nuclear accident. According to a report for the European Parliament at that time, it contained about 80 times the amount of radioactivity released during the 1986 Chernobyl accident; we are rather fortunate such a serious accident never occurred. This analysis was provided by what became known as The WISE Report, so called due to its association with the World Information Service on Energy (WISE) in Paris. In response, BNFL set out to reduce this liquor to a solid form known as ‘glass’ (borosilicate glass), which is much safer than the liquid form and can be put in storage, though much of it still remains to be vitrified.

In 2000/2001, the Nuclear Installations Inspectorate (NII) of the HSE published a number of reports on aspects of Sellafield that gave cause for concern. One report in particular, entitled ‘An investigation into the falsification of pellet diameter data in the MOX demonstration facility at the BNFL Sellafield site and the effect of this on the safety of MOX fuel in use’, suggested deliberate dishonesty in record keeping. BNFL subsequently complied with most of the NII’s recommendations.

The authors of the WISE Report, however, still had concerns regarding increased levels of certain sea discharges and aerial releases, which they considered inconsistent with the UK’s obligations under the OSPAR Convention. The report stated that the deposition of plutonium within 20km of Sellafield attributable to aerial emissions had been estimated at 160–280 billion becquerels, several times the plutonium fallout from all atmospheric nuclear weapons testing, and that 250kg-500kg of plutonium from Sellafield had been absorbed as sediments on the bed of the Irish Sea, ‘representing a long-term regional hazard of largely unknown proportions’. The report was treated with caution by the European Commission and conveniently dismissed by the National Radiological Protection Board in the UK, which claimed that some of the conclusions drawn in the report were ‘lacking objectivity’. It seems that governments are always bent towards safeguarding industry first, leaving environmental concerns and the health of our Mother Ship as a secondary issue.