The Future of Skunkworks Management, Now! By Mr. Andres Agostini. This is an excerpt from the conclusion section of “The Future of Skunkworks Management, Now!”, which discusses some management theories, practices, and strategies. To view the entire piece, just click the link at the end of this post. Peter Drucker asserted: “In a few hundred years, when the story of our [current] time is written from a long-term perspective, it is likely that the most important event those historians will see is not technology, not the Internet, not e-commerce [not so-called ‘social media’]. It is an unprecedented change in the human condition. For the first time ─ literally ─ substantial and growing numbers of people have choices. For the first time, they will have to manage themselves. And society is totally unprepared for it.”
Please see the full presentation at http://goo.gl/FnJOlg
This is an excerpt from the conclusion section of “NASA’s Managerial and Leadership Methodology, Now Unveiled!” by Mr. Andres Agostini, which discusses some management theories and practices. To read the entire piece, just click the link at the end of this illustrated article and presentation:
In addition to being aware of, adaptable to, and resilient in the face of the driving forces reshaping the present and the emerging future, there are some additional management practices that I concurrently follow:
1. Given the vast number of insidious risks, futures, challenges, principles, processes, contents, practices, tools, techniques, benefits and opportunities, there needs to be a full-bodied, practical, and applicable methodology (methodologies are utilized and implemented to solve complex problems and to facilitate decision-making and anticipation).
The manager must always address issues with a Panoramic View and must also exercise the envisioning of both the Whole and the Granularity of Details, along with the embedded (corresponding) interrelationships and dynamics (that is, [i] interrelationships and dynamics of the subtle, [ii] interrelationships and dynamics of the overt and [iii] interrelationships and dynamics of the covert).
In this essay I argue that technologies and techniques used and developed in the fields of Synthetic Ion Channels and Ion Channel Reconstitution, which have emerged from supramolecular chemistry and bio-organic chemistry over the past four decades, can be applied toward gradual cellular (and particularly neuronal) replacement. Together they could found a new interdisciplinary field that applies such techniques and technologies toward the goal of the indefinite functional restoration of cellular mechanisms and systems, as opposed to their currently proposed uses: aiding in the elucidation of cellular mechanisms and their underlying principles, and serving as biosensors.
In earlier essays (see here and here) I identified approaches to the synthesis of non-biological functional equivalents of neuronal components (i.e., ion channels, ion pumps, and membrane sections) and their sectional integration with the existing biological neuron — a sort of “physical” emulation, if you will. It has only recently come to my attention that there is an existing field, emerging from supramolecular and bio-organic chemistry, centered around the design, synthesis, and incorporation/integration of both synthetic/artificial ion channels and artificial bilipid membranes (i.e., lipid bilayers). The potential uses for such channels commonly listed in the literature have nothing to do with life extension, however, and the field has, to my knowledge, yet to envision using them to replace our existing neuronal components as they degrade (or before they are able to), instead seeing such uses as aiding in the elucidation of cellular operations and mechanisms and as biosensors. I argue here that the very technologies and techniques that constitute the field (Synthetic Ion Channels and Ion-Channel/Membrane Reconstitution) can be used toward indefinite longevity and life extension through the iterative replacement of cellular constituents (particularly the components comprising our neurons: ion channels, ion pumps, sections of bilipid membrane, etc.) so as to negate the molecular degradation they would otherwise eventually undergo.
While I envisioned an electro-mechanical-systems approach in my earlier essays, the field of Synthetic Ion Channels has, from its start in the early 1970s, applied a molecular approach to the problem: designing molecular systems that produce certain functions according to their chemical composition or structure. Note that this approach corresponds to (or can be categorized under) the passive-physicalist sub-approach of the physicalist-functionalist approach (the broad approach overlying all varieties of physically embodied, “prosthetic” neuronal functional replication) identified in an earlier essay.
The field of synthetic ion channels is also referred to as ion-channel reconstitution, which designates “the solubilization of the membrane, the isolation of the channel protein from the other membrane constituents and the reintroduction of that protein into some form of artificial membrane system that facilitates the measurement of channel function,” and more broadly denotes “the [general] study of ion channel function and can be used to describe the incorporation of intact membrane vesicles, including the protein of interest, into artificial membrane systems that allow the properties of the channel to be investigated” [1]. The field has been active since the 1970s, with experimental successes throughout the 1980s, 1990s, and 2000s in incorporating functioning synthetic ion channels into biological bilipid membranes, and into artificial membranes dissimilar in molecular composition and structure to their biological analogues with respect to the supramolecular interactions underlying ion selectivity and permeability. The relevant literature suggests that their proposed use has thus far been limited to the elucidation of ion-channel function and operation, the investigation of their functional and biophysical properties, and, to a lesser degree, the construction of “in-vitro sensing devices to detect the presence of physiologically-active substances including antiseptics, antibiotics, neurotransmitters, and others” through the “… transduction of bioelectrical and biochemical events into measurable electrical signals” [2].
I continue to survey the available technology applicable to spaceflight and there is little change.
The remarkable near miss and NEO impact on the same day seem to fly in the face of experts who put the probability of such a coincidence at something on the scale of millennia. A recent exchange on a blog has given me the idea that perhaps crude is better: a much faster approach to a nuclear-propelled spaceship might be more appropriate.
Unknown to the public, there is such a thing as unobtainium. It carries the name of the country of my birth: americium.
A certain form of americium is ideal for a type of nuclear solid-fuel rocket. Called a Fission Fragment Rocket, it is straight out of a 1950s movie, with massive thrust at the limit of human G-tolerance. Such a rocket produces large amounts of irradiated material and cannot be fired inside or near the Earth’s magnetic field. The Moon is the place to assemble, test, and launch any nuclear mission.
Recently, I met Josh Hopkins of Lockheed’s Advanced Programs at the AIAA Rocky Mountain Region’s First Annual Technical Symposium (RMATS), October 26, 2012. Josh was the keynote speaker at this RMATS; here is his presentation. After his presentation we talked outside the conference hall. I told him about my book, and was surprised when he said that two groups had failed to reproduce Podkletnov’s work. I knew one group had, but a second? As we parted we said we’d keep in touch. But you know how life is: it has a habit of getting in the way of exciting research, and we lost touch.
About two weeks ago, I remembered that Josh had said he would provide some information on the second group that had failed to reproduce Podkletnov’s work. I sent him an email, and was very pleased to hear back from him: the group’s findings had been published under the title “Gravity Modification by High-Temperature Superconductors”. The authors were C. Woods, S. Cooke, J. Helme and C. Caldwell. Their paper was published in the proceedings of the 37th AIAA/ASME/SAE/ASEE Joint Propulsion Conference and Exhibit, 8–11 July 2001, Salt Lake City, Utah. I bought a copy from the AIAA archives, and read it, reread it, and reread it.
Then I found a third team; they published their lack of findings as “Gravity Modification Experiments Using a Rotating Superconducting Disk and Radio Frequency Fields”. The authors were G. Hathaway, B. Cleveland and Y. Bao, and the paper appeared in Physica C in 2003.
Both papers focused on attempting to build a correct superconducting disc. At least Woods et al. said that “the tests have not fulfilled the specified conditions for a gravity effect”. The single most difficult thing to do was to build a bilayered superconducting disc. Woods et al. tried very hard to do so, and reading through Hathaway et al.’s paper suggests that they too had similar difficulties. The photo shows a sample disc from Woods’ team; observe the crack in the middle.
Using an innocuous bacterial virus, bioengineers have created a biological mechanism to send genetic messages from cell to cell. The system greatly increases the complexity and amount of data that can be communicated between cells and could lead to greater control of biological functions within cell communities…
In harnessing DNA for cell-to-cell messaging, the researchers have also greatly increased the amount of data they can transmit at any one time; in digital terms, they have increased the bit rate of their system. The largest DNA strand that M13 is known to have packaged includes more than 40,000 base pairs. Base pairs, like the 1s and 0s of digital encoding, are the basic building blocks of genetic data. Most genetic messages of interest in bioengineering range from several hundred to many thousand base pairs.
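As a back-of-the-envelope illustration (mine, not from the original article): each base pair is one of four symbols, so it can encode at most two bits, which puts a 40,000-base-pair M13 payload at roughly 10 kilobytes. A minimal sketch of the arithmetic:

```python
# Theoretical information capacity of a DNA message.
# Each base pair is one of four symbols (A, C, G, T), i.e. log2(4) = 2 bits.
BITS_PER_BASE_PAIR = 2

def dna_capacity_bytes(base_pairs: int) -> float:
    """Maximum information capacity of a DNA strand, in bytes."""
    return base_pairs * BITS_PER_BASE_PAIR / 8

print(dna_capacity_bytes(40_000))  # 40,000 bp -> 10000.0 bytes (~10 kB)
```

This is an upper bound; a practical encoding (error correction, addressing, avoidance of problematic sequences) would carry fewer usable bits per base.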
Ortiz was even able to broadcast her genetic messages between cells separated by a gelatinous medium at a distance of greater than 7 centimeters.
“That’s very long-range communication, cellularly speaking,” she said.
Down the road, the biological Internet could lead to biosynthetic factories in which huge masses of microbes collaborate to make more complicated fuels, pharmaceuticals and other useful chemicals. With improvements, the engineers say, their cell-cell communication platform might someday allow more complex three-dimensional programming of cellular systems, including the regeneration of tissue or organs.
I have been meaning to read a book coming out soon called Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves, written by Harvard biologist George Church and science writer Ed Regis. Church is doing stunning work on a number of fronts, from creating synthetic microbes to sequencing human genomes, so I definitely am interested in what he has to say. I don’t know how many other people will be, so I have no idea how well the book will do. But in a tour de force of biochemical publishing, he has created 70 billion copies. Instead of paper and ink, or PDFs and pixels, he’s used DNA.
Much as PDFs are built on a digital system of 1s and 0s, DNA is a string of nucleotides, each of which can be one of four different types. Church and his colleagues turned his whole book, including illustrations, into a 5.27-megabit file, which they then translated into a sequence of DNA. They stored the DNA on a chip and then sequenced it to read the text. The book is broken up into little chunks of DNA, each of which carries a portion of the book itself as well as an address indicating where it should go. They recovered the book with only 10 wrong bits out of 5.27 million. Using standard DNA-copying methods, they duplicated the DNA into 70 billion copies.
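The chunk-plus-address idea can be sketched in a few lines of code. To be clear about assumptions: the 2-bits-per-nucleotide mapping, the 2-byte address, and the chunk size below are illustrative choices of mine, not the scheme Church’s team actually used (their encoding was deliberately more conservative, e.g. to avoid long runs of a single base):

```python
# Illustrative sketch: encode a file into addressed DNA chunks and back.
# Assumption: a simple 2-bits-per-nucleotide map (not Church et al.'s scheme).
NUC = "ACGT"  # index 0..3 -> nucleotide

def bytes_to_dna(data: bytes) -> str:
    """Map each byte to four nucleotides (2 bits each, most significant first)."""
    return "".join(NUC[(b >> s) & 3] for b in data for s in (6, 4, 2, 0))

def dna_to_bytes(seq: str) -> bytes:
    """Inverse of bytes_to_dna."""
    vals = [NUC.index(c) for c in seq]
    return bytes(
        (vals[i] << 6) | (vals[i + 1] << 4) | (vals[i + 2] << 2) | vals[i + 3]
        for i in range(0, len(vals), 4)
    )

def to_chunks(data: bytes, chunk_size: int = 24) -> list[str]:
    """Split data into chunks, each prefixed with a 2-byte big-endian address."""
    chunks = []
    for addr, i in enumerate(range(0, len(data), chunk_size)):
        payload = addr.to_bytes(2, "big") + data[i:i + chunk_size]
        chunks.append(bytes_to_dna(payload))
    return chunks

def from_chunks(chunks: list[str]) -> bytes:
    """Reassemble in any order: sorting by the address prefix restores order."""
    decoded = sorted(dna_to_bytes(c) for c in chunks)
    return b"".join(d[2:] for d in decoded)
```

Because each chunk carries its own address, the chunks can be sequenced in any order and the file still reassembles correctly, which is the point of the addressing in the real experiment as well.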
Scientists have stored little pieces of information in DNA before, but Church’s book is about 1,000 times bigger. I doubt anyone would buy a DNA edition of Regenesis on Amazon, since they’d need some expensive equipment and a lot of time to translate it into a format our brains can comprehend. But the costs are crashing, and DNA is a far more stable medium than the hard drive on your desk that you’re waiting to fail. In fact, Regenesis could endure for centuries in its genetic form. Perhaps librarians of the future will need to get a degree in biology…
It is a race against time: will this knowledge save us or destroy us? Genetic modification may eventually reverse aging and bring about a new age, but it is more likely that the end of the world is coming.
The Fermi Paradox informs us that intelligent life may not be intelligent enough to keep from destroying itself. Nothing will destroy us faster or more certainly than an engineered pathogen (except possibly an asteroid or comet impact). The only answer to this threat is an off-world survival colony. Ceres would be perfect.
“The more anxiety one produces, the more the discussion there would be about how real and how possible actual existential threats are.”
John Hunt recently queried me on what steps I might take to form an organization to advocate for survival colonies and planetary defense. His comment on anxiety is quite succinct. In truth, the landing on the Moon was the product of fear: fear of the former Soviet Union’s lead in rocket technology. As we as a nation quelled that anxiety, the budget for human spaceflight dwindled. But the fear of a nuclear winter continued to grow along with the size of our arsenals.
Interestingly, at the height of the Cold War, evidence of yet another threat to human existence was uncovered in the Yucatan peninsula of Mexico in 1981: Chicxulub. But even before the dinosaur killer was discovered, perhaps the greatest threat of all to humanity was born in 1973, when Herb Boyer and Stanley Cohen created the first genetically modified organism. The money that could answer both of these threats by going into space continues instead to be expended by the military-industrial complex.
Mile-wide rocks in space and microscopic organisms on Earth are both threats to our existence, but the third and undoubtedly greatest threat is our own apathy. Why do we expend the tremendous resources of our race on everything BUT keeping it from going extinct?
I have been corresponding with John Hunt and have decided that perhaps it is time to start moving toward forming a group that can accomplish something.
The recent death of Neil Armstrong has people thinking about space. The explosion of a meteor over Britain and the Curiosity rover on Mars are also in the news. But there is really nothing new under the sun. There is nothing that will hold people’s attention for very long outside of their own immediate comfort and basic needs. Money is the central idea of our civilization, and everything else is soon forgotten. But this idea of money as the center of all activity is a death sentence. Human beings die and species eventually become extinct, just as worlds and suns are also destroyed or burn out. Each of us is in the position of a circus freak on death row: bizarre, self-centered, doomed; a cosmic joke. Of all the creatures on this planet, we are the freaks the other creatures would come to mock, if they were like us; if they were supposedly intelligent like us. But are we actually the intelligent ones? The argument can be made that we lack a necessary characteristic to be considered truly intelligent life forms.
Truly intelligent creatures would be struggling with three problems if they found themselves in our situation as human beings on Earth in the first decades of this 21st century:
1. Mortality. With technology making it possible to delay death and eventually reverse the aging process, intelligent beings would be directing the balance of planetary resources toward conquering “natural” death.