
Recent Warfare Technologies

Making things out of cellulose could also have a secondary, Inspector Gadget-style "this message will self-destruct" effect if desired.

 
But you will have to watch out for the enemy-trained termites and beavers. >:D
 
A humorous look at how using DNA as high-capacity "memory" would work:
 
The US Navy laser program has been sped up, and the director apparently sees no need to wait for specially designed ships with integrated electrical power generation systems to use lasers or railguns effectively. Obviously some technical breakthroughs have happened, and the accelerated fielding would indicate there may be some sense of urgency (laser weapons have been "on again, off again" in terms of funding and development since the '80s). As a bit of a twofer, there is also some discussion of UUV technology as well:

http://www.wired.com/dangerroom/2012/10/lasers/

Navy’s Top Geek Says Laser Arsenal Is Just Two Years Away
By Spencer Ackerman | October 22, 2012 | 2:11 pm | Categories: Navy

Rear Adm. Matthew Klunder, the chief of Naval research, salutes Virginia Tech’s humanoid robot CHARLI-2 at the Office of Naval Research’s science and technology expo, Oct. 22, 2012. Photo: Wired.com/Jared Soares

Never mind looming defense cuts or residual technical challenges. The Navy’s chief futurist is pushing up the anticipated date for when sailors can expect to use laser weapons on the decks of their ships, and raising expectations for robotic submarines.

“On directed energy” — the term for the Navy’s laser cannons, “I’d say two years,” Rear Adm. Matthew Klunder, the chief of the Office of Naval Research, told Danger Room in a Monday interview. The previous estimate, which came from Klunder’s laser technicians earlier this year, was that it will take four years at the earliest for a laser gun to come aboard.

“We’re well past physics,” Klunder said, echoing a mantra for the Office of Naval Research’s laser specialists. Now, the questions surrounding a weapon once thought to be purely science fiction sound almost pedestrian. “We’re just going through the integration efforts,” Klunder continued. “Hopefully, that tells you we’re well mature, and we’re ready to put these on naval ships.”

Klunder isn’t worried about the ships generating sufficient energy to fill the laser gun’s magazine, which has been an engineering concern of the Navy’s for years. “I’ve got the power,” said Klunder, who spoke during the Office of Naval Research’s biennial science and technology conference. “I just need to know on this ship, this particular naval vessel, what are the power requirements, and how do I integrate that directed energy system or railgun system.”

That’s a relief for the Navy. It means that the Navy’s future ships probably won’t have to make captains choose between maneuvering their ships and firing their laser weapons out of fear they’d overload their power supplies.

But shipboard testing is underway. Klunder wouldn’t elaborate, but he said that there have been “very successful” tests placing laser weapons on board a ship. That’s not to say the first order of business for naval laser weaponry will be all that taxing: In their early stages, Pentagon officials talk about using lasers to shoot down drones or enable better sensing. Klunder alluded to recent tests in which the Navy’s lasers brought drones down, although he declined to elaborate.

Then come the unmanned submarines. Current, commercially available drone subs typically swim for several days at a time, according to Frank Herr, an Office of Naval Research department head who works on so-called unmanned underwater vehicles, or UUVs. That’s way behind the capabilities that successive Navy leaders want: crossing entire oceans without needing to refuel. So Klunder wants to raise the bar.

“The propulsion systems that I think you’re going to see within a year are going to [give] a UUV with over 30 days of endurance,” Klunder said. By 2016, a prototype drone sub for the office’s Long Duration Unmanned Underwater Vehicle program should be able to spend 60 days underwater at a time: “That’s ahead of schedule of what we told the secretary of the Navy a year ago.”

That’s a challenge for the subs’ propulsion and fuel systems. Typically, Herr explains, the commercially available batteries built into prototype drone subs take up a lot of the ship; but building bigger subs just increases the need for power. The nut that the Office of Naval Research has to crack is using more efficient fuel cells while designing subs that don’t need as much energy to run. “We’re thinking about power requirements for these systems as well as the power [sources] available for them,” Herr says.

“The breakthrough,” Klunder explains, “was really on getting past your more traditional lead-acid battery pieces to more technically robust but also mature lithium ion fuel cell technology and the hybrids of that.”

None of that is to say the lasers will be actually on board by 2014 or the drone subs will disappear beneath the waves for 60 days by 2016. That depends in part on the Navy’s ability to afford it — and at the conference this morning, Adm. Mark Ferguson, the Navy’s vice chief, warned that “research and development is part of that reduction” in defense budgets currently scheduled to take effect in January. But it might not be long before Klunder is finally able to hand over a battle-ready laser cannon to Big Navy.
 
Solving bandwidth issues is a big deal if we are going to "digitize" the battlefield successfully. Even applying this sort of technology to digital radios will have positive impacts on clarity and reception in difficult situations:

http://www.technologyreview.com/news/429722/a-bandwidth-breakthrough/?ref=rss

A Bandwidth Breakthrough

A dash of algebra on wireless networks promises to boost bandwidth tenfold, without new infrastructure.

David Talbot

Tuesday, October 23, 2012


Academic researchers have improved wireless bandwidth by an order of magnitude—not by adding base stations, tapping more spectrum, or cranking up transmitter wattage, but by using algebra to banish the network-clogging task of resending dropped packets.

By providing new ways for mobile devices to solve for missing data, the technology not only eliminates this wasteful process but also can seamlessly weave data streams from Wi-Fi and LTE—a leap forward from other approaches that toggle back and forth. "Any IP network will benefit from this technology," says Sheau Ng, vice president for research and development at NBC Universal.

Several companies have licensed the underlying technology in recent months, but the details are subject to nondisclosure agreements, says Muriel Medard, a professor at MIT's Research Laboratory of Electronics and a leader in the effort. Elements of the technology were developed by researchers at MIT, the University of Porto in Portugal, Harvard University, Caltech, and Technical University of Munich. The licensing is being done through an MIT/Caltech startup called Code-On Technologies.

The underlying problem is huge and growing: on a typical day in Boston, for example, 3 percent of packets are dropped due to interference or congestion. Dropped packets cause delays in themselves, and then generate new back-and-forth network traffic to replace those packets, compounding the original problem.

The practical benefits of the technology, known as coded TCP, were seen on a recent test run on a New York-to-Boston Acela train, notorious for poor connectivity. Medard and students were able to watch blip-free YouTube videos while some other passengers struggled to get online. "They were asking us 'How did you do that?' and we said 'We're engineers!' " she jokes.

More rigorous lab studies have shown large benefits. Testing the system on Wi-Fi networks at MIT, where 2 percent of packets are typically lost, Medard's group found that a normal bandwidth of one megabit per second was boosted to 16 megabits per second. In a circumstance where losses were 5 percent—common on a fast-moving train—the method boosted bandwidth from 0.5 megabits per second to 13.5 megabits per second. In a situation with zero losses, there was little if any benefit, but loss-free wireless scenarios are rare.

Medard's work "is an important breakthrough that promises to significantly improve bandwidth and quality-of-experience for cellular data users experiencing poor signal coverage," says Dipankar "Ray" Raychaudhuri, director of the Winlab at Rutgers University (see "Pervasive Wireless"). He expects the technology to be widely deployed within two to three years.

To test the technology in the meantime, Medard's group set up proxy servers in the Amazon cloud. IP traffic was sent to Amazon, encoded, and then decoded as an application on phones. The benefit might be even better if the technology were built directly into transmitters and routers, she says. It also could be used to merge traffic coming over Wi-Fi and cell phone networks rather than forcing devices to switch between the two frequencies.

The technology transforms the way packets of data are sent. Instead of sending packets, it sends algebraic equations that describe series of packets. So if a packet goes missing, instead of asking the network to resend it, the receiving device can solve for the missing one itself. Since the equations involved are simple and linear, the processing load on a phone, router, or base station is negligible, Medard says.
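
To make the basic trick concrete, here is a minimal Python sketch of the random-linear-coding idea (my illustration, not Code-On's implementation): the sender transmits random linear combinations of a small batch of packets, and the receiver reconstructs the whole batch from any sufficient set of arriving combinations by solving the linear system, so no specific lost packet ever has to be re-requested. Real systems work over finite fields such as GF(256); exact rational arithmetic is used here only to keep the demo short.

import random
from fractions import Fraction

def encode(packets, n_coded):
    """Return n_coded (coefficients, combination) pairs for one batch of packets."""
    k, length = len(packets), len(packets[0])
    coded = []
    for _ in range(n_coded):
        coeffs = [random.randint(1, 255) for _ in range(k)]
        combo = [sum(c * p[i] for c, p in zip(coeffs, packets)) for i in range(length)]
        coded.append((coeffs, combo))
    return coded

def decode(coded, k):
    """Recover the k original packets from k independent coded packets."""
    # Augmented matrix [coefficients | payload], solved by Gauss-Jordan elimination.
    # Random coefficients are linearly independent with overwhelming probability.
    rows = [[Fraction(c) for c in coeffs] + [Fraction(x) for x in combo]
            for coeffs, combo in coded[:k]]
    for col in range(k):
        pivot = next(r for r in range(col, k) if rows[r][col] != 0)
        rows[col], rows[pivot] = rows[pivot], rows[col]
        rows[col] = [x / rows[col][col] for x in rows[col]]
        for r in range(k):
            if r != col and rows[r][col] != 0:
                factor = rows[r][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[col])]
    return [[int(x) for x in row[k:]] for row in rows]

originals = [[72, 105], [32, 116], [104, 101]]   # three tiny "packets"
coded = encode(originals, n_coded=5)             # send five combinations instead
survivors = random.sample(coded, 3)              # any three that happen to arrive
assert decode(survivors, k=3) == originals       # receiver solves for the whole batch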

Whether gains seen in the lab can be achieved in a full-scale deployment remains to be seen, but the fact that the improvements were so large suggests a breakthrough, says Ng, the NBC executive, who was not involved in the research. "In the lab, if you only find a small margin of improvement, the engineers will be skeptical. Looking at what they have done in the lab, it certainly is order-of-magnitude improvement—and that certainly is very encouraging," Ng says.

If the technology works in large-scale deployments as expected, it could help forestall a spectrum crunch. Cisco Systems says that by 2016, mobile data traffic will grow 18-fold—and Bell Labs goes farther, predicting growth by a factor of 25. The U.S. Federal Communications Commission has said spectrum could run out within a couple of years.

Medard stops short of saying the technology will prevent a spectrum crunch, but she notes that the current system is grossly inefficient. "Certainly there are very severe inefficiencies that should be remedied before you consider acquiring more resources," she says.

She says that when her group got online on the Acela, the YouTube video they watched was of college students playing a real-world version of the Angry Birds video game. "The quality of the video was good. The quality of the content—we haven't solved," Medard says.
 
http://www.youtube.com/watch?v=O-BukbpkOd8&feature=player_embedded#!

Boeing Non-Kinetic Missile Records First Operational Test Flight

An air-launched directed EMP missile?  Crazy/awesome.

 
Looks like they may have found a solution to the failure of the Yucca Mountain repository to get up and running.

And it may provide a solution to the problem of securing tons of waste from those bent on making a dirty bomb.

World’s Most Powerful Laser Beams to Zap Nuclear Waste

http://www.businessweek.com/news/2012-10-25/world-s-most-powerful-laser-beams-to-zap-nuclear-waste

The European Union will spend about 700 million euros ($900 million) to build the world’s most powerful lasers, technology that could destroy nuclear waste and provide new cancer treatments.

The Extreme Light Infrastructure project has obtained funding for two lasers to be built in the Czech Republic and Romania, Shirin Wheeler, spokeswoman for the European Commission on regional policy, said in a phone interview. A third research center will be in Hungary.

The lasers are 10 times more powerful than any yet built and will be strong enough to create subatomic particles in a vacuum, similar to conditions that may have followed the start of the universe. Eventually, the power of the light beams could be used to deteriorate the radioactivity of nuclear waste in just a few seconds and target cancerous tumors, the project’s Romanian coordinator Nicolae-Victor Zamfir said in an interview.

“We can’t find in nature any phenomenon with such an intense power like the one that will be generated with this laser,” Zamfir said in a phone interview from Romania. “We expect to see the first results of our research in one or two years after the centre becomes operational.”

The Magurele research center, where the Romanian laser will be located, will consume about 10 megawatts of energy, enough to supply about 2,500 average U.S. households. Most of it will come from geothermal pumps installed at the site, where the laser is expected to become operational in 2017.
Largest Site

“It is probably one of the largest such sites in Europe using unconventional energy,” Zamfir said.

Zamfir said companies from the computer industry have shown interest in the project, but none from the nuclear sector. “We haven’t advertised the project yet properly, possibly also because we didn’t have the EU’s approval.”

The research may replicate the same principles used in a new type of cancer radiotherapy called hadrontherapy, Zamfir said. It directly targets deep-rooted tumors, reducing the risk of recurrence or new tumors. The first results of the experiments are expected for 2018-2019.

“This treatment already exists, but requires expensive and big accelerators,” Zamfir said. “If it becomes possible by using this type of laser, it can be implemented at lower costs as technology advances and the lasers get cheaper.”

The laser technology might also be used to reduce the time it takes for atomic waste to lose its radioactivity from thousands of years to a few seconds. That could remove the need to build underground stores to keep waste secure for centuries.
No Solution

“It’s going to take almost 20 years until we would be able to do it, but right now many countries don’t see any solution in the near future,” Zamfir said.

The EU is basing the project in eastern European countries to support science in former communist countries, where a tradition of research hasn’t prevented academics from seeking better-paid posts outside the region.

“The hope is to create a virtuous circle that by having the infrastructure there you also attract more funds and research,” the European Commission’s Wheeler said.

The city of Magurele is home to Romania’s National Institute of Physics and Nuclear Engineering, established in 1949 and one of the biggest nuclear physics research centers in eastern Europe during the communist era.

Although research is still being carried out at the institute, Romania is losing scientists because it invests only 0.5 percent of its gross domestic product in research, compared with a European average of 2 percent.
Old Road

The research center is less than 10 kilometers away from Bucharest, but the journey can take around 20 minutes on an old road that is now being enlarged.

“There’s no direct public transportation from the center of Bucharest -- you need to change the bus and then hitchhike for those private minibuses,” Zamfir said. “We now hope it will change.”

In Romania, 200 researchers will work at the project full time, with around 1,000 more expected to visit the center for experiments each year once it starts working, according to Zamfir.

The project will be followed by the construction of an even more powerful laser and any of the three countries already involved in the project, plus the U.K., might host the laser. The ELI-Ultra High Field Facility will reach 200 petawatts of power, or 100,000 times the power of the world electric grid.

“The proposal for the fourth site should have been made in 2012, but we haven’t reached maturity with the ongoing three projects to draw enough conclusions,” Zamfir said.

The EU expects to spend 550 million euros in the first phase of the project ending December 2013, Wheeler said. Further applications from Romania and Hungary for the second part of the project should raise the total funding from the organization to 700 million euro, more than 80 percent of the entire cost of the project. About 180 million will come from other sources.
 
Potential for huge improvements in gas turbine efficiency using detonation technology. I have been hearing about "Pulse Wave Detonation" engine experiments on and off since the 1990s; think of it as a highly evolved form of the pulse jet engine that powered the V-1 buzz bomb in WWII. (There are actually a lot of technical differences, but the essential idea of the engine being a simple tube where the detonation takes place is close enough to a pulse jet for understanding purposes.) This takes the idea and changes the outcome from pure jet thrust to rotary motion:

http://www.nrl.navy.mil/media/news-releases/2012/Navy-Researchers-Look-to-Rotating-Detonation-Engines-to-Power-the-Future

Navy Researchers Look to Rotating Detonation Engines to Power the Future

11/2/2012 07:00 EDT - 140-12r
Contact: Donna McKinney, (202) 767-2541


NRL researchers have constructed a model of a Rotating Detonation Engine. (Photo: U.S. Naval Research Laboratory)
With its strong dependence on gas-turbine engines for propulsion, the U.S. Navy is always looking for ways to improve the fuel consumption of these engines. At the Naval Research Laboratory (NRL), scientists are studying the complex physics of Rotating Detonation Engines (RDEs) which offer the potential for high dollar savings by way of reduced fuel consumption in gas-turbine engines, explains Dr. Kazhikathra Kailasanath, who heads NRL's Laboratories for Computational Physics and Fluid Dynamics.

Many Navy aircraft use gas-turbine engines for propulsion, with the Navy's gas-turbine engines being fundamentally similar to engines used in commercial airplanes. The Navy also depends on gas-turbine engines to provide propulsion and electricity for many of its ships. Even as future ships move toward the model of an "all electric" propulsion system, they will still need gas-turbine engines to produce electricity for the propulsion system and other critical systems. So building a gas-turbine engine that can handle the Navy's requirements for its warfighting ships and provide a fuel-efficient engine is a high priority for researchers.

The U.S. Navy finds gas-turbine engines attractive because they scale nicely to large powers, are relatively small and self-contained, and are relatively easy to maintain. The gas-turbine engines the Navy uses today are based on the Brayton thermodynamic cycle, where air is compressed and mixed with fuel, combusted at a constant pressure, and expanded to do work for either generating electricity or for propulsion. To significantly improve the performance of gas-turbine engines, researchers need to look beyond the Brayton cycle to explore alternative and possibly more innovative cycles.
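
For context (my own back-of-the-envelope, not from the NRL release): the ideal air-standard Brayton cycle's thermal efficiency is fixed by the compressor pressure ratio r, eta = 1 - r^(-(gamma-1)/gamma), which is why squeezing out large further gains means looking at a different cycle altogether rather than just raising the pressure ratio.

def brayton_efficiency(pressure_ratio, gamma=1.4):
    """Ideal air-standard Brayton cycle efficiency: eta = 1 - r**(-(gamma - 1)/gamma)."""
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

# Even doubling or tripling an already-high pressure ratio buys relatively little.
for r in (10, 20, 30):
    print("pressure ratio %2d -> ideal efficiency %.1f%%" % (r, 100 * brayton_efficiency(r)))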

NRL researchers believe that one attractive possibility is to use the detonation cycle instead of the Brayton cycle for powering a gas-turbine. NRL has been on the forefront of this research for the last decade and has been a major player in developing Pulse Detonation Engines (PDEs). The Rotating Detonation Engine (RDE) is an even more attractive and different strategy for using the detonation cycle to obtain better fuel efficiency. NRL researchers have constructed a model for simulating RDEs using earlier work done on general detonations as a foundation.

NRL researchers estimate that retrofitting engines on existing Navy ships, like the USS Arleigh Burke pictured here, with rotating detonation technology could result in millions of dollars in savings a year. (Photo: U.S. Navy/Mass Communication Specialist 1st Class Tommy Lamkin)

NRL researchers believe that RDEs have the potential to meet 10% increased power requirements as well as 25% reduction in fuel use for future Navy applications. Currently there are about 430 gas turbine engines on 129 U.S. Navy ships. These engines burn approximately 2 billion dollars worth of fuel each year. By retrofitting these engines with the rotating detonation technology, researchers estimate that the Navy could save approximately 300 to 400 million dollars a year.

Like PDEs, RDEs have the potential to be a disruptive technology that can significantly alter the fuel efficiency of ships and planes; however, there are several challenges that must be overcome before the benefits are realized, explains Dr. Kailasanath. NRL scientists are now focusing their current research efforts on getting a better understanding of how the RDE works and the type of performance that can be actually realized in practice.

More details here: http://www.nrl.navy.mil/content_images/11_FA2.pdf
 
And closer to home, DARPA is working on getting sensor fusion technology down to the individual soldier level. Being able to overlay day, night and thermal imagery on the same view addresses issues like obscurants or various types of camouflage. Getting images beamed to you from multiple viewpoints and getting some sort of coherent image out of that would require quite a bit more computing power and bandwidth, but might be quite useful for vehicles and aircraft:

http://www.darpa.mil/NewsEvents/Releases/2012/11/02.aspx

NIGHT OR DAY, RAIN OR SHINE: DARPA SEEKS MULTI-BAND, PORTABLE SENSOR TO PROVIDE CLEAR IMAGERY TO WARFIGHTERS

November 02, 2012

Clip-on or helmet-mounted camera system would fuse useful aspects of visible, near infrared, and infrared images into a single shot under all weather and visibility conditions

It is often the case with new military technologies that warfighters need to adjust to their equipment to access needed capabilities. As missions shift, however, and warfighters are required to work in smaller teams and access more remote locations, it is technology that must adapt if it is to remain useful. Desirable features for many new man-portable systems include small size, light weight, minimal power consumption, low cost, ease of use, multi-functionality and, to the extent possible, network friendliness.

DARPA created the Pixel Network for Dynamic Visualization program, or PIXNET, to apply these features to the cameras and sensors used by dismounted warfighters and small combat units for battlefield awareness and threat detection and identification. PIXNET aims to develop helmet-mounted and clip-on camera systems that combine visible, near infrared, and infrared sensors into one system and aggregate the outputs. PIXNET technology would ingest the most useful data points from each component sensor and fuse them into a common, information-rich image that can be viewed on the warfighter’s heads-up display, and potentially be shared across units.
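
PIXNET's actual fusion pipeline is not public, but the general idea of blending co-registered bands can be sketched in a few lines. This is an illustration only; the contrast-weighted blend and the synthetic frames below are my assumptions, not DARPA's algorithm.

import numpy as np

def fuse_bands(frames, eps=1e-6):
    """Blend co-registered single-channel frames (2-D arrays scaled to [0, 1]),
    weighting each band by its local contrast so detail-rich pixels dominate."""
    weights = []
    for f in frames:
        gy, gx = np.gradient(f)
        weights.append(np.hypot(gx, gy) + eps)   # favor whichever band shows structure
    weights = np.stack(weights)
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * np.stack(frames)).sum(axis=0)

# Synthetic stand-ins for visible, near-infrared and thermal frames.
rng = np.random.default_rng(0)
visible, nir, thermal = (rng.random((64, 64)) for _ in range(3))
fused = fuse_bands([visible, nir, thermal])
print(fused.shape)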

The base technologies DARPA proposes to use already exist and are currently used by warfighters. However, these devices typically have dedicated functionality, operate independently of one another and provide value only to the immediate operator. Through PIXNET, DARPA seeks to fuse the capabilities of these devices into a single multi-band system, thus alleviating physical overburdening of warfighters, and develop a tool that is network-ready, capable of sharing imagery with other warfighters.

“Existing sensor technologies are a good jumping-off point, but PIXNET will require innovations to combine reflective and thermal bands for maximum visibility during the day or night, and then package this technology for maximum portability. What we really need are breakthroughs in aperture design, focal plane arrays, electronics, packaging and materials science,” said Nibir Dhar, DARPA program manager for PIXNET.  “Success will be measured as the minimization of size, weight, power and cost of the system and the maximization of functionality.”

To help boost processing power while minimizing size and energy use, PIXNET sensors will interface wirelessly with an Android-based smart phone for fusing images and for networking among units. Although the primary focus of PIXNET is on sensor development, proposers are instructed to develop whatever apps are necessary to achieve the desired functionality for phone and camera.

In addition to technological innovation, proposers are encouraged to develop plans for transitioning the low-cost camera system into manufacturing. In the case of the helmet-mounted system, DARPA’s preferred cost goal in a manufacturing environment producing 10,000 units per month is $3,300 per unit. 

For more information on PIXNET, please see the Broad Agency Announcement.

# # #

Associated images posted on www.darpa.mil may be reused according to the terms of the DARPA User Agreement, available here: http://go.usa.gov/nYr.
 
Probably a very long-term solution, but you heard it here first! Any technique that can extract more energy from batteries (or, as hinted in the article, reduce transmission losses, such as the 100% transfer efficiency among molecules during photosynthesis) will have a huge impact on logistics, as the number of batteries, generators and fuel trucks needed to keep the generators running shrinks:

http://www.technologyreview.com/view/507176/entanglement-makes-quantum-batteries-almost-perfect-say-physicists/

Entanglement Makes Quantum Batteries Almost Perfect, Say Theorists

In theory, quantum batteries such as atoms and molecules can store and release energy on demand almost perfectly, provided they are entangled, say physicists.

In recent years, physicists have amused themselves by calculating the properties of quantum machines, such as engines and refrigerators.

The essential question is how well these devices work when they exploit the rules of quantum mechanics rather than classical mechanics. The answers have given physicists important new insights into the link between quantum mechanics and thermodynamics.

The dream is that they may one day build such devices or exploit those already used by nature.

Today, Robert Alicki, at the University of Gdansk in Poland, and Mark Fannes, at the University of Leuven in Belgium, turn their attention to quantum batteries.  They ask how much work can be extracted from a quantum system where energy is stored temporarily.

Such a system might be an atom or a molecule, for example. And the answer has an interesting twist.

Physicists have long known that it is possible to extract work from some quantum states but not others. These others are known as passive states.

So the quantity physicists are interested in is the difference between the energy of the quantum system and its passive states. All that energy is potentially extractable to do work elsewhere.
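
That difference has a standard name, ergotropy, and it is easy to compute numerically. The sketch below is my illustration with arbitrary example numbers, not Alicki and Fannes' calculation: it builds the passive state by pairing the largest populations with the lowest energy levels and subtracts its energy from the state's energy.

import numpy as np

def ergotropy(rho, H):
    """Extractable work: tr(rho H) minus the energy of the corresponding passive state."""
    energies = np.linalg.eigvalsh(H)                       # ascending energy levels
    populations = np.sort(np.linalg.eigvalsh(rho))[::-1]   # descending populations
    passive_energy = np.dot(populations, energies)
    return float(np.real(np.trace(rho @ H))) - passive_energy

# Hypothetical single qubit: levels at 0 and 1, prepared with 70% excited-state population.
H = np.diag([0.0, 1.0])
rho = np.diag([0.3, 0.7])
print(ergotropy(rho, H))   # 0.7 - 0.3 = 0.4 units of extractable work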

Alicki and Fannes show that the extractable work is generally less than the thermodynamic limit. In other words, they show that this kind of system isn’t perfect.

However, the twist is that Alicki and Fannes say things change if you have several identical quantum batteries that are entangled.

Entanglement is a strange quantum link that occurs when separate particles have the same wavefunction. In essence, these particles share the same existence.

Entanglement leads to all kinds of bizarre phenomena such as the “spooky action at a distance” that so puzzled Einstein.

Alicki and Fannes show that when quantum batteries are entangled they become much better. That’s essentially because all the energy from all the batteries can be extracted at once.  “Using entanglement one can in general extract more work per battery,” they say.

In fact, as the number of entangled batteries increases, the performance becomes arbitrarily close to the thermodynamic limit. In other words, a battery consisting of large numbers of entangled quantum batteries could be almost perfect.

That’s a fascinating result. Quantum batteries in the form of atoms or molecules may be ubiquitous in nature, in processes such as photosynthesis. Biologists know for example that during photosynthesis, energy is transferred with 100 per cent efficiency from one molecular machine to another.

How this happens, nobody knows. Perhaps Alicki and Fannes’ work can throw some light on this process.

However, it’s worth pointing out some of the limitations of this work. It is highly theoretical and does not take into account various practical limitations that are likely to crop up.

Indeed they acknowledge this and say an interesting goal for the future will be to work out how practical limitations might change their result.

In the meantime, nanotechnologists can dream about the possibility of exploiting near perfect batteries in micromachines of the future and learning more about the way nature may have already perfected this trick.

Ref: http://arxiv.org/abs/1211.1209: Extractable Work From Ensembles of Quantum Batteries. Entanglement Helps.
 
A look at how Google deals with massive databases and information handling on that scale. If we want to be a really "digital" army or Armed Forces, then our network systems need this level of scalability and searchability (along with secure, high bandwidth pipes to move the data to the users), along with the same high levels of reliability and uptime. Computer science geeks can probably understand the mechanics of this excerpt better than I can, but users of the DWAN are all familiar with the negative effects of not having these sorts of systems on line:

http://nextbigfuture.com/2012/11/google-operating-at-its-own-level-of.html

Google operating at its own level of multi-cloud reliability and scalability

Google’s Spanner handles trillions of rows of data, and Google is shifting away from NoSQL and toward NewSQL. Google believes it is better to have application programmers deal with performance problems due to overuse of transactions as bottlenecks arise, rather than always coding around the lack of transactions.

A complicating factor for an Open Source effort is that Spanner includes the use of GPS and Atomic clock hardware.

Spanner is Google's scalable, multi-version, globally-distributed, and synchronously-replicated database. It is the first system to distribute data at global scale and support externally-consistent distributed transactions. This paper describes how Spanner is structured, its feature set, the rationale underlying various design decisions, and a novel time API that exposes clock uncertainty. This API and its implementation are critical to supporting external consistency and a variety of powerful features: non-blocking reads in the past, lock-free read-only transactions, and atomic schema changes, across all of Spanner.
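
The time API the abstract refers to is TrueTime, which reports an interval rather than a single timestamp. A schematic sketch of the idea (mine, not Google's code; the 4 ms uncertainty bound is an assumed figure) looks like this: a transaction "commit waits" until its chosen timestamp is guaranteed to be in the past.

import time

class TrueTimeSketch:
    """Clock that exposes its own uncertainty as an interval [earliest, latest]."""
    def __init__(self, epsilon_s=0.004):
        self.epsilon = epsilon_s          # assumed worst-case clock error bound

    def now(self):
        t = time.time()
        return (t - self.epsilon, t + self.epsilon)

    def after(self, t):
        earliest, _ = self.now()
        return earliest > t               # True only once t has definitely passed

def commit_wait(tt, commit_timestamp):
    """Block until the commit timestamp is guaranteed to be in the past."""
    while not tt.after(commit_timestamp):
        time.sleep(0.0005)

tt = TrueTimeSketch()
_, commit_ts = tt.now()                   # take the latest plausible current time
commit_wait(tt, commit_ts)                # waits roughly 2 * epsilon before proceeding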

The servers in a Spanner universe.

A zone has one zonemaster and between one hundred and several thousand spanservers. The former assigns data to spanservers; the latter serve data to clients. The per-zone location proxies are used by clients to locate the spanservers assigned to serve their data. The universe master and the placement driver are currently singletons. The universe master is primarily a console that displays status information about all the zones for interactive debugging.

Amazon is somewhat competitive with datacenter reliability but they charge to replicate across clouds. Amazon does failover automatically across clouds and data centers.

A few decades ago Toyota and Japanese car makers had several times more reliability and quality than competing car companies. This required having a different company culture. Orders of magnitude greater reliability and quality can be competitive weapons that enable things to be done that are impossible for those without the quality and reliability. Google also operates at levels of scale that competitors cannot match.

Spanner: Google’s Globally-Distributed Database (14 pages)

To summarize, Spanner combines and extends on ideas from two research communities: from the database community, a familiar, easy-to-use, semi-relational interface, transactions, and an SQL-based query language; from the systems community, scalability, automatic sharding, fault tolerance, consistent replication, external consistency, and wide-area distribution. Since Spanner’s inception, we have taken more than 5 years to iterate to the current design and implementation. Part of this long iteration phase was due to a slow realization that Spanner should do more than tackle the problem of a globally replicated namespace, and should also focus on database features that Bigtable was missing.

One aspect of our design stands out: the linchpin of Spanner’s feature set is TrueTime. We have shown that rectifying clock uncertainty in the time API makes it possible to build distributed systems with much stronger time semantics. In addition, as the underlying system enforces tighter bounds on clock uncertainty, the overhead of the stronger semantics decreases. As a community, we should no longer depend on loosely synchronized clocks and weak time APIs in designing distributed algorithms.

Future Work

We have spent most of the last year working with the F1 team to transition Google’s advertising backend from MySQL to Spanner. We are actively improving its monitoring and support tools, as well as tuning its performance. In addition, we have been working on improving the functionality and performance of our backup/restore system. We are currently implementing the Spanner schema language, automatic maintenance of secondary indices, and automatic load-based resharding. Longer term, there are a couple of features that we plan to investigate. Optimistically doing reads in parallel may be a valuable strategy to pursue, but initial experiments have indicated that the right implementation is non-trivial. In addition, we plan to eventually support direct changes of Paxos configurations.

Given that we expect many applications to replicate their data across datacenters that are relatively close to each other, TrueTime may noticeably affect performance. We see no insurmountable obstacle to reducing below 1ms. Time-master-query intervals can be reduced, and better clock crystals are relatively cheap. Time-master query latency could be reduced with improved networking technology, or possibly even avoided through alternate time-distribution technology.

Finally, there are obvious areas for improvement. Although Spanner is scalable in the number of nodes, the node-local data structures have relatively poor performance on complex SQL queries, because they were designed for simple key-value accesses. Algorithms and data structures from DB literature could improve single-node performance a great deal. Second, moving data automatically between datacenters in response to changes in client load has long been a goal of ours, but to make that goal effective, we would also need the ability to move client-application processes between datacenters in an automated, coordinated fashion. Moving processes raises the even more difficult problem of managing resource acquisition and allocation between datacenters.
 
A simple and rugged nuclear generating system that could be adapted for use on the ground and at sea as well. I am not sure where the scaling would stop working, but certainly a modular system could produce much more than 550 W if the system was scaled up and enough units were ganged together.

http://www.youtube.com/watch?v=KobRfGqlpGc&feature=player_embedded

http://www.lanl.gov/newsroom/news-releases/2012/November/11.26-space-travel.php

Joint DOE and NASA team demonstrates simple, robust fission reactor prototype

LOS ALAMOS, N.M., Nov. 26, 2012—A team of researchers, including engineers from Los Alamos National Laboratory, has demonstrated a new concept for a reliable nuclear reactor that could be used on space flights.

The research team recently demonstrated the first use of a heat pipe to cool a small nuclear reactor and power a Stirling engine at the Nevada National Security Site’s Device Assembly Facility near Las Vegas. The Demonstration Using Flattop Fissions (DUFF) experiment produced 24 watts of electricity. A team of engineers from Los Alamos, the NASA Glenn Research Center and National Security Technologies LLC (NSTec) conducted the experiment.

Heat pipe technology was invented at Los Alamos in 1963. A heat pipe is a sealed tube with an internal fluid that can efficiently transfer heat produced by a reactor with no moving parts. A Stirling engine is a relatively simple closed-loop engine that converts heat energy into electrical power using a pressurized gas to move a piston. Using the two devices in tandem allowed for creation of a simple, reliable electric power supply that can be adapted for space applications.
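
As a rough sense of the power chain (my back-of-the-envelope, not DUFF test parameters): reactor heat delivered by the heat pipe is converted by the Stirling engine at some fraction of the Carnot limit set by the hot- and cold-end temperatures. All numbers below are illustrative assumptions.

def stirling_electric_output(thermal_watts, t_hot_k, t_cold_k, fraction_of_carnot=0.5):
    """Estimate electrical output from heat input and the hot/cold end temperatures."""
    carnot_limit = 1.0 - t_cold_k / t_hot_k          # hard thermodynamic ceiling
    return thermal_watts * carnot_limit * fraction_of_carnot

# Hypothetical module: 400 W of reactor heat at a 750 K hot end, rejected at 325 K.
print(round(stirling_electric_output(400.0, 750.0, 325.0), 1), "W electrical (rough)")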

Researchers configured DUFF on an existing experiment, known as Flattop, to allow for a water-based heat pipe to extract heat from uranium. Heat from the fission reaction was transferred to a pair of free-piston Stirling engines manufactured by Sunpower Inc., based in Athens, Ohio. Engineers from NASA Glenn designed and built the heat pipe and Stirling assembly and operated the engines during the experiment. Los Alamos nuclear engineers operated the Flattop assembly under authorization from the National Nuclear Security Administration (NNSA).

DUFF is the first demonstration of a space nuclear reactor system to produce electricity in the United States since 1965, and the experiment confirms basic nuclear reactor physics and heat transfer for a simple, reliable space power system.

“The nuclear characteristics and thermal power level of the experiment are remarkably similar to our space reactor flight concept,” said Los Alamos engineer David Poston. “The biggest difference between DUFF and a possible flight system is that the Stirling input temperature would need to be hotter to attain the required efficiency and power output needed for space missions.”

“The heat pipe and Stirling engine used in this test are meant to represent one module that could be used in a space system,” said Marc Gibson of NASA Glenn. “A flight system might use several modules to produce approximately one kilowatt of electricity.”

Current space missions typically use power supplies that generate about the same amount of electricity as one or two household light bulbs. The availability of more power could potentially boost the speed with which mission data is transmitted back to Earth, or increase the number of instruments that could be operated at the same time aboard a spacecraft.

“A small, simple, lightweight fission power system could lead to a new and enhanced capability for space science and exploration,” said Los Alamos project lead Patrick McClure. “We hope that this proof of concept will soon move us from the old frontier of Nevada to the new frontier of outer space.”
(Video: LANL animation of the new reactor concept, 0:50)

Los Alamos research on the project was made possible through Los Alamos’s Laboratory-Directed Research and Development Program (LDRD), which is funded by a small percentage of the Laboratory’s overall budget to invest in new or cutting-edge research. NASA Glenn and NSTec also used internal support to fund their contributions to the experiment.

“Perhaps one of the more important aspects of this experiment is that it was taken from concept to completion in 6 months for less than a million dollars,” said Los Alamos engineer David Dixon. “We wanted to show that with a tightly-knit and focused team, it is possible to successfully perform practical reactor testing.”
 
The current radioisotope generators used on the Mars missions are about 20 kilos in weight, so they appear to be small enough already. They generate 125 watts each, and their output declines to about 100 watts after roughly 14 years of service.

By comparison, the batteries alone in a Toyota Prius weigh over 50 kilos.
 
http://hamptonroads.com/2012/11/photo-video-unmanned-drone-launched-catapult

"An X-47B Unmanned Combat Air System – a drone – completes a land-based catapult launch Thursday at Patuxent River Naval Air Station in Maryland. This week, contractors will direct an X-47B around the Norfolk-based aircraft carrier Harry S. Truman in the first trials to determine whether an unmanned aircraft can function on a carrier flight deck."

The first thought I had was that the X-47B is a lot bigger and taller than it looks in other pictures.  The wing is higher than most of the crew members' heads when they are standing up.
 
Rugged and portable laser weapons have now been demonstrated on land. Think of this as a sort of futuristic "Iron Dome" system (and with upgrades it should be able to take on larger targets at greater ranges):

http://www.lockheedmartin.com/us/news/press-releases/2012/november/1127-ss-adam.html

Lockheed Martin Demonstrates New Ground-Based Laser System in Tests Against Rockets and Unmanned Aerial System

SUNNYVALE, Calif., Nov. 27, 2012 – Lockheed Martin [NYSE: LMT] today announced that it has successfully demonstrated a portable, ground-based military laser system in a series of tests against representative airborne targets. Lockheed Martin developed the Area Defense Anti-Munitions (ADAM) system to provide a defense against short-range threats, such as rockets and unmanned aerial systems.

Since August, the ADAM system has successfully engaged an unmanned aerial system target in flight at a range of approximately 1.5 kilometers (0.9 miles) and has destroyed four small-caliber rocket targets in simulated flight at a range of approximately 2 kilometers (1.2 miles).

“Lockheed Martin has invested in the development of the ADAM system because of the enormous potential effectiveness of high-energy lasers,” said Doug Graham, Lockheed Martin’s vice president of advanced programs for Strategic and Missile Defense Systems. “We are committed to supporting the transition of directed energy’s revolutionary capability to the war fighter.”

Designed for short-range defense of high-value areas including forward operating bases, the ADAM system’s 10-kilowatt fiber laser is engineered to destroy targets up to 2 kilometers (1.2 miles) away. The system precisely tracks targets in cluttered optical environments and has a tracking range of more than 5 kilometers (3.1 miles). The system has been designed to be flexible enough to operate against rockets as a standalone system and to engage unmanned aerial systems with an external radar cue. The ADAM system’s modular architecture combines commercial hardware components with the company’s proprietary software in an integrated and easy-to-operate system.

“Lockheed Martin has applied its expertise as a laser weapon system integrator to provide a practical and affordable defense against serious threats to military forces and installations,” said Paul Shattuck, Lockheed Martin’s director of directed energy systems for Strategic and Missile Defense Systems. “In developing the ADAM system, we combined our proven laser beam control architecture with commercial hardware to create a capable, integrated laser weapon system.”

Lockheed Martin has been a pioneer in the development and demonstration of high-energy laser capabilities for more than 30 years and has made key advances in areas such as precision pointing and control, line-of-sight stabilization and adaptive optics.

Headquartered in Bethesda, Md., Lockheed Martin is a global security and aerospace company that employs about 120,000 people worldwide and is principally engaged in the research, design, development, manufacture, integration and sustainment of advanced technology systems, products and services. The corporation’s net sales for 2011 were $46.5 billion.

A 10 to 100 kW system could potentially be powered by tapping into an aircraft's engine power as well (an AH-64 generates 2,000 shp, about 1,471 kW, from its engines). A system in the air has potentially wider sensor range and fields of fire, so an overlapping set of lasers in the air and on the ground could be very difficult to defeat.
 
Going to the very small, a new way of bonding silica to copper allows for greater heat transfer. This could lead to computational devices that don't need cooling fans (or only much smaller ones) and are otherwise much more energy efficient. For the end user this means more rugged equipment and fewer batteries or other electrical power sources:

http://news.rpi.edu/update.do?artcenterkey=3110

Nature Materials Study: Boosting Heat Transfer With Nanoglue

Interdisciplinary Study From Rensselaer Polytechnic Institute Demonstrates New Method for Significantly Increasing Heat Transfer Rate Across Two Different Materials

A team of interdisciplinary researchers at Rensselaer Polytechnic Institute has developed a new method for significantly increasing the heat transfer rate across two different materials. Results of the team’s study, published in the journal Nature Materials, could enable new advances in cooling computer chips and light-emitting diode (LED) devices, collecting solar power, harvesting waste heat, and other applications.

By sandwiching a layer of ultrathin “nanoglue” between copper and silica, the research team demonstrated a four-fold increase in thermal conductance at the interface between the two materials. Less than a nanometer—or one billionth of a meter—thick, the nanoglue is a layer of molecules that form strong links with the copper (a metal) and the silica (a ceramic), which otherwise would not stick together well. This kind of nanomolecular locking improves adhesion, and also helps to sync up the vibrations of atoms that make up the two materials which, in turn, facilitates more efficient transport of heat particles called phonons. Beyond copper and silica, the research team has demonstrated their approach works with other metal-ceramic interfaces.
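
The reason a four-fold jump in interface conductance matters is that the interface acts as one resistance in a series chain. A quick sketch follows; the layer thicknesses, conductivities and conductance values are illustrative assumptions for a thin copper/silica stack, not figures from the paper.

def heat_flux(delta_t, layers, interface_conductance):
    """Heat flux (W/m^2) through stacked slabs with one interface resistance in series.
    layers: list of (thickness_m, thermal_conductivity_W_per_mK) tuples."""
    resistance = sum(t / k for t, k in layers) + 1.0 / interface_conductance
    return delta_t / resistance

layers = [(100e-9, 400.0),   # ~100 nm of copper, k ~ 400 W/m-K
          (100e-9, 1.4)]     # ~100 nm of silica, k ~ 1.4 W/m-K
for g in (50e6, 200e6):      # interface conductance before vs. after a four-fold boost
    print("G = %.0e W/m^2-K -> flux ~ %.2e W/m^2" % (g, heat_flux(10.0, layers, g)))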

Heat transfer is a critical aspect of many different technologies. As computer chips grow smaller and more complex, manufacturers are constantly in search of new and better means for removing excess heat from semiconductor devices to boost reliability and performance. With photovoltaic devices, for example, better heat transfer leads to more efficient conversion of sunlight to electrical power. LED makers are also looking for ways to increase efficiency by reducing the percentage of input power lost as heat. Ganapati Ramanath, professor in the Department of Materials Science and Engineering at Rensselaer, who led the new study, said the ability to enhance and optimize interfacial thermal conductance should lead to new innovations in these and other applications.

“Interfaces between different materials are often heat-flow bottlenecks due to stifled phonon transport. Inserting a third material usually only makes things worse because of an additional interface created,” Ramanath said. “However, our method of introducing an ultrathin nanolayer of organic molecules that strongly bond with both the materials at the interface gives rise to multi-fold increases in interfacial thermal conductance, contrary to poor heat conduction seen at inorganic-organic interfaces. This method to tune thermal conductance by controlling adhesion using an organic nanolayer works for multiple materials systems, and offers a new means for atomic- and molecular-level manipulation of multiple properties at different types of materials interfaces. Also, it’s cool to be able to do this rather unobtrusively by the simple method of self-assembly of a single layer of molecules.”

Results of the new study, titled “Bonding-induced thermal conductance enhancement at inorganic heterointerfaces using nanomolecular monolayers,” were published online last week by Nature Materials, and will appear in an upcoming print edition of the journal. The study may be viewed online at: http://go.nature.com/3LLeYP

The research team used a combination of experiments and theory to validate their findings.

“Our study establishes the correlation between interfacial bond strength and thermal conductance, which serves to underpin new theoretical descriptions and open up new ways to control interfacial heat transfer,” said co-author Pawel Keblinski, professor in the Department of Materials Science and Engineering at Rensselaer.

“It is truly remarkable that a single molecular layer can bring about such a large improvement in the thermal properties of interfaces by forming strong interfacial bonds. This would be useful for controlling heat transport for many applications in electronics, lighting, and energy generation,” said co-author Masashi Yamaguchi, associate professor in the Department of Physics, Applied Physics, and Astronomy at Rensselaer.

This study was funded with support from the National Science Foundation (NSF).

“The overarching goal of Professor Ramanath’s NSF-sponsored research is to elucidate, using first-principles-based models, the effects of molecular chemistry, chemical environment, interface topography, and thermo-mechanical cycling on the thermal conductance of metal-ceramic interfaces modified with molecular nanolayers,” said Clark V. Cooper, senior advisor for science at the NSF Directorate for Mathematical and Physical Sciences, who formerly held the post of program director for Materials and Surface Engineering. “Consistent with NSF’s mission, the focus of his research is to advance fundamental science, but the potential societal benefits of the research are enormous.”

“This is a fascinating example of the interplay between the physical, chemical, and mechanical properties working in unison at the nanoscale to determine the heat transport characteristics at dissimilar metal-ceramic interfaces,” said Anupama B. Kaul, a program director for the Division of Electrical, Communications, and Cyber Systems at the NSF Directorate for Engineering. “The fact that the organic nanomolecular layer is just a monolayer in thickness and yet has such an important influence on the thermal characteristics is truly remarkable. Dr. Ramanath’s results should be particularly valuable in nanoelectronics where heat management due to shrinking device dimensions continues to be an area of active research.”

Along with Ramanath, Keblinski, and Yamaguchi, co-authors of the paper are Rensselaer materials science graduate students Peter O’Brien, Sergei Shenogin, and Philippe K. Chow; Rensselaer physics graduate student Jianxiun Liu; and Danielle Laurencin and P. Hubert Mutin of the Institut Charles Gerhardt Montpellier and  Université Montpellier in France.

For more information on Ramanath and his nanomaterials research at Rensselaer, visit:

    Nature Materials Study: Quick-Cooking Nanomaterials in a $40 Microwave Oven To Make Tomorrow’s Solid-State Air Conditioners and Refrigerators
    http://news.rpi.edu/update.do?artcenterkey=2971
    Inexpensive “Nanoglue” Can Bond Nearly Anything Together
    http://news.rpi.edu/update.do?artcenterkey=2154
    “Nanosculpture” Could Enable New Types of Heat Pumps and Energy Converters
    http://news.rpi.edu/update.do?artcenterkey=2471
    Strengthening Fluids With Nanoparticles
    http://news.rpi.edu/update.do?artcenterkey=2402
    Faculty Home Page
    http://homepages.rpi.edu/~ganapr/

Published December 4, 2012
Contact: Michael Mullaney
Phone: (518) 276-6161
E-mail: mullam@rpi.edu
 
Here is something to gladden the hearts of the IA community: a practical application of link and social network analysis. The interesting thing here is how it is applied; zapping middle-level "management" seems to be more effective than going after the leadership using this model. Obviously it needs more refining; there is little differentiation between jobs and positions in the model, while it seems pretty clear that nailing a logistics or financial link would probably have more effect than hitting a recruiter or propagandist:

http://www.wired.com/dangerroom/2012/12/paulos-alogrithm/

Death by Algorithm: West Point Code Shows Which Terrorists Should Disappear First
By Noah Shachtman | 12.06.12 | 6:30 AM

Paulo Shakarian has an algorithm that might one day help dismantle al-Qaida — or at least one of its lesser affiliates. It’s an algorithm that identifies which people in a terror network really matter, like the mid-level players, who connect smaller cells with the larger militant group. Remove those people, either by drone or by capture, and it concentrates power and authority in the hands of one man. Remove that man, and you’ve broken the organization.

The U.S. military and intelligence communities like to congratulate themselves whenever they’ve taken out a terrorist leader, whether it’s Osama bin Laden or Abu Mussab al-Zarqawi, the bloodthirsty chief of al-Qaida in Iraq. Shakarian, a professor at West Point’s Network Science Center who served two tours as an intelligence officer in Iraq, saw first-hand just how quickly those militant networks regrew new heads when the old ones were chopped off. It became one of the inspirations for him and his colleagues at West Point to craft an algorithm that could truly target a terror group’s weak points.

“I remember these special forces guys used to brag about how great they were at targeting leaders. And I thought, ‘Oh yeah, targeting leaders of a decentralized organization. Real helpful,’” Shakarian tells Danger Room. Zarqawi’s group, for instance, only grew more lethal after his death. “So I thought: Maybe we shouldn’t be so interested in individual leaders, but in how whole organizations regenerate their leadership.”

These days, American counterterror policy is even more reliant on taking out individual militants. How exactly those individuals are picked for drone elimination is the matter of intense debate and speculation. The White House reportedly maintains a “matrix” of the most dangerous militants. Social-network analysis — the science of determining the connections between people — almost certainly plays a role where those militants appear on that matrix.

It’s clearly an imperfect process. Hundreds of civilians have been killed in the drone strikes, along with thousands of militants. And while the core of al-Qaida is clearly weakened, Obama administration officials will only talk in the vaguest terms about when the war against the terror group might some day come to an end.

In a paper to be presented later this month before the Academy of Science and Engineering’s International Conference on Social Informatics, Shakarian and his West Point colleagues argue for a new way of using social-network analysis to target militants. Forget going after the leader of an extremist group, they say. At least right away.

“If you arrest that guy, the number of connections everyone else has becomes more similar. They all become leaders. You force that terror group to become more decentralized. You might be making it harder to defeat these organizations,” Shakarian says.

This chart shows how West Point’s “GREEDY_FRAGILE” algorithm renders a network brittle by removing relatively few nodes.

The second illustration depicts a terror network, as the algorithm centralizes it — and makes it easier to break. Photos: West Point

Instead, counterterrorists should work to remove militant lieutenants in such a way that terror leaders actually become more central to their organizations. That’s because a more centralized network is a more fragile one. And a fragile network can ultimately be smashed, once and for all.

The West Point team — which includes professors Devon Callahan, Jeff Nielsen, and Tony Johnson — wrote up a simple (less than 30-line) algorithm in Python they named GREEDY_FRAGILE. It looks for nodes that can be taken out to “maximize network-wide centrality” — concentrate connectivity in the terror leader, in other words. The professors tested GREEDY_FRAGILE against five data sets. The first is the social network of the al-Qaida members involved in the 1998 bombing of the U.S. embassy in Dar es Salaam; the other four are derived from real-world terror groups, but anonymized for academic use.
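
The published code isn't included in the article, but the greedy idea is easy to sketch: score the network with a degree-based centralization measure, then repeatedly remove whichever node most increases that score. The snippet below is my illustration of that logic, using Freeman-style degree centralization and a stand-in graph, not the authors' GREEDY_FRAGILE source.

import networkx as nx

def degree_centralization(g):
    """Freeman-style degree centralization: how dominated the graph is by its biggest hub."""
    n = g.number_of_nodes()
    if n < 3:
        return 0.0
    degrees = [d for _, d in g.degree()]
    max_d = max(degrees)
    return sum(max_d - d for d in degrees) / float((n - 1) * (n - 2))

def greedy_fragile(g, budget):
    """Remove up to `budget` nodes, each time choosing the removal that most
    concentrates connectivity (i.e. maximizes network-wide centralization)."""
    g = g.copy()
    removed = []
    for _ in range(budget):
        best = max(g.nodes(),
                   key=lambda v: degree_centralization(g.subgraph([u for u in g if u != v])))
        removed.append(best)
        g.remove_node(best)
    return removed, degree_centralization(g)

net = nx.les_miserables_graph()            # stand-in social network for the demo
targets, score = greedy_fragile(net, budget=5)
print("remove:", targets, "-> centralization:", round(score, 3))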

“In each of the five real-world terrorist networks that we examined, removal of only 12% of nodes can increase the network-wide centrality between 17% and 45%,” the West Point authors note in their paper. In other words, taking out just a few mid-level players makes the whole organization much, much more fragile.

Interestingly, GREEDY_FRAGILE works even when the exact shape of the network is unknown — or when certain nodes can’t be targeted, for political or intelligence reasons. In other words, it takes into account some real-world complications that counterterrorists might face.

Now, this is just a lab experiment. No actual terrorists were harmed in the writing of this paper. The algorithm only looks at “degree” centrality — the number of ties a node has. It doesn’t examine metrics like network “closeness,” which finds the shortest possible path between two nodes. Nor does it take into account the different roles played by different nodes — financier, propagandist, bomb-maker.
That’s why the work is funded by the Army Research Office, which handles the service’s most basic R&D efforts.
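
The authors’ actual code isn’t reproduced in the article, but based on its description, a hedged reconstruction of the idea might look something like the sketch below. It again assumes networkx; the function names, the synthetic test graph, and the use of Freeman-style degree centralization as the “network-wide centrality” score are my own stand-ins, not necessarily the paper’s.

import networkx as nx

def degree_centralization(g):
    """Freeman-style network-wide degree centralization in [0, 1]:
    how strongly ties are concentrated on a single node."""
    n = g.number_of_nodes()
    if n < 3:
        return 0.0
    degrees = dict(g.degree())
    d_max = max(degrees.values())
    return sum(d_max - d for d in degrees.values()) / ((n - 1) * (n - 2))

def greedy_fragile(g, leader, budget, protected=frozenset()):
    """Greedily remove up to `budget` nodes (never the leader or any
    protected node), each time picking the removal that most increases
    network-wide centralization."""
    g = g.copy()
    removed = []
    for _ in range(budget):
        candidates = [v for v in g.nodes if v != leader and v not in protected]
        best = max(candidates,
                   key=lambda v: degree_centralization(g.subgraph(set(g.nodes) - {v})))
        g.remove_node(best)
        removed.append(best)
    return removed, degree_centralization(g)

# Toy usage on a hypothetical 30-node small-world graph with node 0 as "leader".
G = nx.connected_watts_strogatz_graph(30, 4, 0.2, seed=1)
hits, score = greedy_fragile(G, leader=0, budget=4)   # 4 of 30 nodes, roughly the 12% the paper cites
print("remove:", hits, "-> centralization:", round(score, 3))

The budget of four removals on a 30-node graph mirrors the roughly 12 percent figure the authors cite; the real algorithm would of course run on intelligence-derived network data rather than a synthetic graph, and the protected-node argument is a nod to the article’s point about nodes that can’t be targeted.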

What’s more, the authors stress that their network-breaking techniques might not be a good fit for every counterterror plan. “It may be desirable to keep certain terrorist or insurgent leaders in place to restrain certain, more radical elements of their organization,” they write.

In fact, the authors strongly hint that they’re not necessarily on board with the Obama administration’s kill-don’t-capture approach to handling terror networks.

“We would like to note that the targeting of individuals in a terrorist or insurgent network does not necessarily mean that they should be killed,” Shakarian and his colleagues write. “In fact, for ‘shaping operations’ such as the ones described in this paper, the killing of certain individuals in the network may be counter-productive. This is due to the fact that the capture of individuals who are likely emergent leaders may provide further intelligence on the organization in question.”

That sort of intelligence may suddenly be at a premium again. From the Pentagon chief on down, the U.S. is increasingly worried about al-Qaida’s spread into unfamiliar regions like Mali and its association with new, shadowy militant groups in Libya. GREEDY_FRAGILE, if it works like Shakarian hopes, might show the counterterrorists which militants to target — and which so-called leaders to leave alone. For now.
 
Well Duh! Everyone knows it's middle management that does all of the work!  ;D

Upper Management is too busy acting as job creators, and the workers are too busy complaining to the union reps and protesting the 1%.  >:D
 
Makes sense to me.  Killing the top guys only opens up much-wanted promotions, and brings an overall increase in morale.  If their bosses were idiots, then it's even more glorious.
 
DARPA announces something new on the medical front, a means of stabilizing injured soldiers to give them a better chance of surviving the "golden hour" before being delivered to advanced treatment facilities. Eerily, the foam seems to resemble the stuff you buy at Home Depot to inject into small cavities in your house (only not so sticky):

http://www.darpa.mil/NewsEvents/Releases/2012/12/10.aspx

DARPA Foam Could Increase Survival Rate for Victims of Internal Hemorrhaging
    December 10, 2012

    Technology developed under DARPA’s Wound Stasis System program resulted in a 72 percent survival rate at three hours post-injury in testing

    The Department of Defense’s medical system aspires to a standard known as the “Golden Hour” that dictates that troops wounded on the battlefield are moved to advanced-level treatment facilities within the first 60 minutes of being wounded. In advance of transport, initial battlefield medical care administered by first responders is often critical to injured servicemembers’ survival. In the case of internal abdominal injuries and resulting internal hemorrhaging, however, there is currently little that can be done to stanch bleeding before the patients reach necessary treatment facilities; internal wounds cannot be compressed the same way external wounds can, and tourniquets or hemostatic dressings are unsuitable because of the need to visualize the injury. The resulting blood loss often leads to death from what would otherwise be potentially survivable wounds.

    DARPA launched its Wound Stasis System program in 2010 in the hopes of finding a technological solution that could mitigate damage from internal hemorrhaging. The program sought to identify a biological mechanism that could discriminate between wounded and healthy tissue, and bind to the wounded tissue. As the program evolved, an even better solution emerged: Wound Stasis performer Arsenal Medical, Inc. developed a foam-based product that can control hemorrhaging in a patient’s intact abdominal cavity for at least one hour, based on swine injury model data. The foam is designed to be administered on the battlefield by a combat medic, and is easily removable by doctors during surgical intervention at an appropriate facility, as demonstrated in testing. 

      Wound Stasis performers presented pre-clinical data on the foam treatment at the 2012 Annual Meeting of the American Association for the Surgery of Trauma in Kauai, Hawaii. These data demonstrated the ability of the foam to treat severe hemorrhage for up to three hours in a model of lethal liver injury. During testing, minimally invasive application of the product reduced blood loss six-fold and increased the rate of survival at three hours post-injury to 72 percent from the eight percent observed in controls.

    “Potentially, Wound Stasis provides an important addition to our ability to save life and limb. Getting after these heretofore difficult-to-stabilize, if not untreatable wounds, expands our options and effectively extends the ‘Golden Hour,’” said Maj. Gen. Bill Hix, Director of Concept Development for the Army Capability Integration Center at Training and Doctrine Command. “A capability like this is important in any operation, but would prove vital during operations in austere areas where military resources and infrastructure are at a premium,” he said.

    “Wound Stasis has been an exciting program because we were able to move unexpectedly from fundamental research to a pre-clinical proof-of-concept based on the strength of our findings,” said Brian Holloway, DARPA program manager. “According to the U.S. Army Institute of Surgical Research, internal hemorrhage is the leading cause of potentially survivable deaths on the battlefield, so the Wound Stasis effort should ultimately translate into an increased rate of survival among warfighters. If testing bears out, the foam technology could affect up to 50 percent of potentially survivable battlefield wounds. We look forward to working with the U.S. Food and Drug Administration on future regulatory submission of this device, and with our partners, the Army Institute of Surgical Research and Special Operations Command, on getting this technology to where it’s desperately needed on the front lines.”

    The foam is actually a polyurethane polymer that forms inside a patient’s body upon injection of two liquid phases, a polyol phase and an isocyanate phase, into the abdominal cavity. As the liquids mix, two reactions are triggered. First, the mixed liquid expands to approximately 30 times its original volume while conforming to the surfaces of injured tissue. Second, the liquid transforms into solid foam capable of providing resistance to intra-abdominal blood loss. The foam can expand through pooled and clotted blood and despite the significant hydrostatic force of an active hemorrhage.

    In tests, removal of the foam took less than one minute following incision by a surgeon. The foam was removed by hand in a single block, with only minimal amounts remaining in the abdominal cavity, and with no significant adherence of tissue to the foam. Features appearing in relief on the extracted foam showed conformal contact with abdominal tissues and partial encapsulation of the small and large bowels, spleen, and liver. Blood absorption was limited to near the surface of the foam; the inside of the foam block remained almost uniformly free of blood.

    DARPA recently awarded a $15.5 million Phase II contract to Arsenal Medical to continue development of the treatment system and support regulatory submission. DARPA anticipates continuing the Wound Stasis program through at least FDA approval of a prototype device.

    # # #

    Associated images posted on www.darpa.mil and video posted at www.youtube.com/darpatv  may be reused according to the terms of the DARPA User Agreement, available here: http://go.usa.gov/nYr.
 