Don't know of any places where bottled water is cheaper than gas.
Would you pay $55 for bottled water?
by John Fuller
If you got rid of the fancy Bling H2O bottle and lowered the price, would it still be worth it? What about the "regular" plastic bottles of water you find in the store? Are they even worth $2?
Bottled water has become so popular that 41 billion gallons are consumed every year around the world. Many people consider it safe and convenient. Over the past few years, however, many bottled water companies labeling their product as "purified" or "natural spring water" have confessed to filling their products with simple tap water. In July 2007, for instance, Pepsi admitted to filling bottles of Aquafina with public water, even though the packaging suggests the water comes from natural springs [source: Environmental Working Group]. Recent studies have concluded that bottled water is no safer than tap water, and the costs of producing the drink and its effect on the environment have caused some alarm [source: National Geographic News].
To understand how expensive regular bottled water is, let's compare it with gasoline. With the price of oil rising, we typically think of gasoline as very expensive. On the other hand, some of us will barely blink an eye at picking up a few bottles of water from the same gas station. Here are the numbers:
A gallon of gas costs around $3. If we assume a one-liter bottle of water from the store costs about $2.50, a gallon of the same bottled water should cost about $10. Water, life's most necessary substance, costs about three times more than gasoline when it comes in a plastic bottle. If you wanted to fill up a car's 15-gallon tank with gasoline, it would cost you about $45. If you wanted to fill up that same 15-gallon tank with bottled water, it would cost you $150 [source: National Geographic News].
Tap water, on the other hand, costs a fraction of the price of bottled water. The same $2 you spend on a liter of bottled water will get you about 1,000 gallons of tap water [source: EPA].
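The comparison above is simple unit arithmetic; a short script using the article's figures (and 3.785 liters per gallon) reproduces it:

```python
LITERS_PER_GALLON = 3.785

gas_per_gallon = 3.00        # article's assumed gas price, USD
bottled_per_liter = 2.50     # article's assumed bottled-water price, USD
bottled_per_gallon = bottled_per_liter * LITERS_PER_GALLON

tank_gallons = 15
print(round(bottled_per_gallon, 2))                 # ~9.46/gal, close to the article's ~$10
print(tank_gallons * gas_per_gallon)                # $45 tank of gas
print(round(tank_gallons * bottled_per_gallon, 2))  # ~$142, near the quoted $150

# Tap water: about 1,000 gallons for the $2 a liter of bottled water costs
tap_per_gallon = 2.00 / 1000
print(tap_per_gallon)                               # $0.002 per gallon
```

The exact rounding differs slightly from the article's rounded "$10 per gallon" and "$150 per tank," but the order-of-magnitude gap is the point.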
So, even though it's cheaper than Bling H2O, bottled water is still expensive. Next, we'll take a look at some of the other products on the market that seem to cost more than they're worth.
Joe Eck reports his seventh room-temperature superconductor and offers a theory for why his method and materials work
Superconductors.ORG (Joe Eck) reports the 38 C superconductor discovered in July 2013 has been reformulated to produce a Meissner transition near 42 Celsius (107 F, 315 K). This was accomplished by substituting tin (Sn) into the lead (Pb) atomic sites of the D212 structure, changing the formula to Tl5Sn2Ba2SiCu8O16+. Multiple magnetization plots clearly show diamagnetic transitions consistently appearing about 4 degrees higher than with Pb in the same atomic site(s). This is the seventh material found to superconduct above room temperature.
A theory put forth nearly 20 years ago seems to explain why planar weight disparity correlates so strongly with high temperature superconductors.
In the mid-1990s, Howard Blackstead of Notre Dame and John Dow of A.S.U. postulated that oxygen located in the "chain layer" of a crystal lattice was being compressed into a metallic superconducting state.
"Experimental evidence indicates that the holes of the hypocharged oxygen in the charge-reservoir regions contribute primarily to the superconductivity, contrary to most current models of high-temperature superconductivity, which are based on superconductivity originating in the cuprate-planes. The data suggest that a successful theory of high-temperature superconductivity will be BCS-like and will pair holes through the polarization field, perhaps electronic as well as vibrational polarization."
Hypercharged copper, hypocharged oxygen, and high-temperature superconductivity
Hypocharged oxygen, and not hypercharged Cu+3, is shown to be the generator of high-temperature superconductivity. Models based on Cu+2 ↔ Cu+3 charge-fluctuations (such as t-J models) are ruled out experimentally. Experimental evidence indicates that the holes of the hypocharged oxygen in the charge-reservoir regions contribute primarily to the superconductivity, contrary to most current models of high-temperature superconductivity, which are based on superconductivity originating in the cuprate-planes. The data suggest that a successful theory of high-temperature superconductivity will be BCS-like and will pair holes through the polarization field, perhaps electronic as well as vibrational polarization.
New Treatment for Gonorrhea Prevents Reinfection
A nanoparticle-based cancer therapy has been found to thwart an antibiotic-resistant, sexually transmitted infection in mice
By Rachel Feltman
A first step has been taken toward a treatment for gonorrhea, a sexually transmitted disease (STD) notorious for its high reinfection rates. This news comes within days of a troubling update from the U.S. Centers for Disease Control that placed the STD on a list of “urgent threats” in the fight against drug-resistant bacteria. According to the CDC, Neisseria gonorrhoeae, the bacteria that causes the malady in humans—which can initially result in painful inflammation and discharge, and can cause infertility and even death if not treated—requires urgent and aggressive action from the medical research community. Researchers from the University at Buffalo, S.U.N.Y., think the answer may lie in marshaling the immune system against gonorrhea.
The study, published in The Journal of Infectious Diseases, found gonococcal infections in mice could be cured by introducing into the genital tract a cytokine, or immunoregulatory protein, known as interleukin-12 (IL-12), which is also being investigated as a cancer-fighting agent. Michael Russell, a microbiologist and immunologist at S.U.N.Y. Buffalo and one of the study’s authors, says that his 20-year investigation into gonorrhea and its resilience led him to suspect that it was actively altering immune systems, preventing human hosts from developing long-term resistance to it.
The exact mechanism of the alteration remains unclear, but Russell thinks it has to do with the two distinct “arms” of vertebrate immune systems: innate and adaptive. Russell observed high levels of a cytokine called interleukin-10 (IL-10) in gonococcal infections, and observed that it induces an innate immune response. IL-10 seems to suppress adaptive responses—like the formation of antibodies that can be used again to fight later infections—in favor of more general, short-term innate responses. Meanwhile, the innate responses, such as inflammation, are easy for N. gonorrhoeae to beat. If IL-12 could counteract the effects of IL-10, Russell hypothesized, it could help the body fight gonorrhea more effectively, and could be used in a treatment for the STD. When his colleague Nejat Egilmez developed a new delivery mechanism for the otherwise toxic IL-12, in which microspheres of slow-releasing nanoparticles of the cytokine could be targeted directly onto immunosuppressant tumors, Russell’s team decided to try injecting them into the vaginal tracts of infected mice.
“And it worked,” he says, “very nicely.”
Not only did mice treated with IL-12 respond more quickly to antibiotics, they were also significantly less likely to be reinfected than controls when exposed to the same strain a month later. “We found that the IL-12 treatment allows the development of an adaptive immune response not usually seen,” Russell says. It seems that by counteracting the IL-10 present at gonococcal infections, the treatment prevents immune systems from being tricked out of developing adaptive responses to the disease. The effect, he says, lasts for several months.
The results come just in time, as the CDC now reports that of the 800,000 estimated cases of gonorrhea that occur each year in the U.S., at least 30 percent are resistant to current antibiotic treatments. With 23 percent of cases now resistant to tetracycline, the CDC recommends that gonococcal infections be treated with a combination of two antibiotics—ceftriaxone in combination with either azithromycin or doxycycline—although a slow but steady increase in strains resistant to ceftriaxone indicates this combination may soon be useless as well.
The new treatment would increase the antibiotics’ effectiveness, but researchers hope that one day it might wean us off the drugs for good. “Since the second world war,” Russell says, “we’ve been treating infections by throwing antibiotics at them. Now that bacteria are emerging with antibiotic resistance, we have nothing else in the pipeline to deal with gonorrhea.” But the IL-12 treatment, he says, can turn the infection into a “live vaccine,” allowing the body to develop immunity. He hopes that further research will show a resistance that carries over from strain to strain, indicating his team has paved the way for a gonorrhea vaccine.
Sanjay Ram, an infectious disease specialist at the University of Massachusetts Memorial Medical Center who was not affiliated with the study, called the work “extremely important,” adding that it “provides nice clues for vaccine development. It would be useful in high-risk women who get repeat infections, with the first occurrence acting as a vaccine for subsequent infections.” Caution is necessary, however, he notes—there are differences between the immune systems of mice and humans, and we’re not sure what high levels of IL-12 might do to a human genital tract. One worry in particular is that higher levels of T cells, which IL-12 stimulates, would mean higher levels of so-called T helper, or CD4, cells. Although vital to the immune system, these cells are also a vulnerable point at which HIV can enter the body. Having high levels of them in the genital tracts of at-risk women would be troubling. But, Ram says, with the state of crisis created by the disease’s advance, any lead is a promising one. “One could go on and on about possible downsides,” he says, “but the fact is that we have a multidrug-resistant disease on our hands.”
The STD crisis notwithstanding, Russell says that human application is a long way off. First, he will see just how long and how effectively IL-12 therapy can protect mice from the rapidly mutating bacteria. He also hopes to learn more about the exact mechanism by which the IL-12 cytokines stimulate the immune responses seen in the study. The answer remains largely a mystery and would have to be documented before human trials could commence. But eventually he hopes to see this novel approach to the treatment of infectious diseases—the stimulation of an immune response at the site of infection—deployed to fight gonorrhea, along with other diseases.
Silicon Valley may soon require a name change to avoid the risk of sounding like a relic from a generation of bygone tech, thanks to a new computer system created -- where else -- in Silicon Valley.
A cover story for the journal Nature, out Wednesday, details the efforts of a team based at Stanford to create the first basic computer built around carbon nanotubes rather than silicon chips.
"People have been talking about a new era of carbon nanotube electronics moving beyond silicon," Stanford professor Subhasish Mitra said in a release from the university. "But there have been few demonstrations of complete digital systems using this exciting technology. Here is the proof."
If you're the type of user who's much more concerned with what your computer or device can do rather than how it does it or what type of semiconductor material inside is making the magic happen, here's the skinny on why you should pay attention to the nerdy details at least this once.
For decades now, the exponential acceleration of technology -- which has taken us from room-size computers run by punched paper cards to insanely more powerful devices in our pockets -- has depended on shrinking silicon transistors to jam ever more onto a chip.
The result of this miniaturizing march has been devices that are becoming ever smaller, more powerful, and cheaper. In fact, transistor density has doubled pretty reliably about every 18 months or so since the dawn of the information age -- you might know this as "Moore's Law."
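That 18-month doubling compounds quickly. A minimal sketch of the growth curve (the base density and exact doubling period here are illustrative, since the real cadence has varied):

```python
def transistor_density(years_elapsed, base_density=1.0, doubling_months=18):
    """Moore's-law growth: density doubles every `doubling_months` months."""
    return base_density * 2 ** (years_elapsed * 12 / doubling_months)

# After 15 years of 18-month doublings, density is up by a factor of 2**10:
print(transistor_density(15))  # 1024.0
```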
But many think silicon's long run as the king of computing could be nearing an end. That's because continually jamming more tiny transistors on a chip has become more difficult, expensive, and inefficient, not to mention the inevitable physical limitations -- you can't keep shrinking transistors forever.
Carbon nanotubes -- long chains of carbon atoms thousands of times thinner than a human hair -- have the potential to be more energy-efficient and "take us at least an order of magnitude in performance beyond where you can project silicon could take us," according to H.S. Philip Wong, another member of the Stanford team.
Problem is, carbon nanotubes aren't perfect either. They don't always grow in perfectly straight lines, and a fraction of the tubes grown aren't able to "switch off" like a regular transistor.
The Stanford team used a technique of "burning" off some of the imperfect carbon nanotubes while also working their way around other imperfections by using a complex algorithm. The final design consists of a very basic computer with 178 transistors that can do tasks like counting and number sorting and switch between functions.
The computer's limited power is due in part to the facilities available to the team, which did not have access to industrial fabrication tools.
So what we have now is basically a proof of concept for the first carbon nanotube computer, which is about as powerful as Intel's 4004, the first single-chip silicon microprocessor released in 1971. But if this technology turns out to be a worthy successor, we'll likely see devices that can not only compete with, but greatly exceed, the potential of silicon systems.
More importantly, it could mean that Moore's Law will continue for at least a little while longer.
Today, a carbon nanotube computer that can count its own transistors; tomorrow, perhaps the power of a human brain captured in strands thinner than a human hair.
Mitochondria rejuvenation gave elderly mice the memory and exercise performance of young adults
Researchers took a naturally occurring mitochondrial transcription factor called TFAM, which initiates protein synthesis, and engineered it to cross into cells from the bloodstream and target the mitochondria.
Aged mice given modified TFAM showed improvements in memory and exercise performance compared with untreated mice. "It was like an 80-year-old recovering the function of a 30-year-old," says Rafal Smigrodzki of Gencia, who presented the results at the Strategies for Engineered Negligible Senescence conference in Cambridge this month.
Targeted mitochondrial therapeutics in aging (SENS 6)
Mitochondrial dysfunction in aging consists of relative suppression of oxidative phosphorylation and frequently an increase in glycolysis. This metabolic imbalance is triggered by progressive biochemical processes, including accumulation of mitochondrial mutations, and changes in the expression and function of nuclear-encoded mitochondrial proteins. Our group developed methods for mitigation of mitochondrial suppression through mitochondria-targeted therapeutics. We observed that stimulation of mitochondrial activity both in vitro and in vivo significantly improves cellular function, suppresses neoplastic growth and inflammation, improves aged animal cognition and resolves in vivo metabolic derangements. One of our therapeutics, an engineered mitochondrial transcription factor, prolonged survival in wild-type aged animals. We expect that mitochondrial stimulation will be an important part of future aging therapies.
By Matthew Humphries Sep. 30, 2013 12:29 pm
The French Gendarmerie, a branch of the French Armed Forces in charge of public safety, has been a leader in moving away from proprietary software in recent years.
Back in 2004 it decided to stop using Microsoft Office and embraced OpenOffice and the Open Document Format instead. That meant 90,000 PCs moved to OpenOffice, and 20,000 Office licenses were no longer needed. Then they moved to using Firefox for web browsing and Thunderbird for email by 2006. And 2007 saw Gimp and VLC installed across the network.
That move to open source software certainly saves money, but it only goes so far when it all runs on Windows. So the French Gendarmerie decided to go a step further and in 2008 began moving from Windows to Ubuntu. Initially 5,000 PCs were switched to Ubuntu in 2008, that went up to 20,000 by 2011, and currently sits at 37,000 Ubuntu PCs.
The Gendarmerie says it will have 72,000 PCs moved over to Ubuntu by next summer, and they will continue to migrate because it saves so much money. And here’s the important bit: in their experience using open source software so far, the total cost of ownership falls 40 percent, which is massive when you are talking about tens of thousands of machines.
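To put the 40 percent figure in perspective, here is a back-of-the-envelope savings calculation; the per-seat cost below is a hypothetical placeholder for illustration, not a figure from the Gendarmerie:

```python
def tco_savings(machines, per_seat_tco, reduction=0.40):
    """Total saved per year when per-seat total cost of ownership falls by `reduction`."""
    return machines * per_seat_tco * reduction

# 72,000 PCs at a hypothetical 500 (currency units) per seat per year:
print(tco_savings(72_000, 500))  # 14,400,000 saved annually
```

Even with a modest per-seat figure, a 40 percent reduction across tens of thousands of machines adds up to millions per year, which is why the migration keeps expanding.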
The savings were revealed at the Evento Linux conference held in Lisbon last week. And unlike predicted costs, which Microsoft can claim are incorrect or not proven, the French Gendarmerie is talking from a point where they currently have 30,000 PCs transitioned to an open source solution, and have been running them for years.
With such huge cost savings, it seems likely other companies and organizations will also at least consider making the move away from Windows. Windows 8 hasn’t exactly been well received, so when it comes time to re-license or upgrade, Microsoft may have more of a fight on its hands to keep key business customers.
It’s also interesting to note how the French Gendarmerie handled this transition. They first moved over to open source applications before switching out the OS. That way their employees were used to the tools long before losing Windows, making for a much easier transition where Ubuntu fades into the background and “just works.”
Cheap, spray-on solar cells developed by Canadian researchers
Nanoparticle-based cells can be made with far less energy than conventional silicon solar cells
CBC News Posted: Oct 04, 2013 4:05 PM ET Last Updated: Oct 04, 2013 4:49 PM ET
A conventional solar panel must spend three to six years of harvesting energy from the sun before it has generated as much power as it took to make the solar panel in the first place, University of Alberta researcher Jillian Buriak says. (Lykaestria /Wikimedia Commons)
Silicon-free solar cells, light and flexible enough to roll up or use as window blinds, are under development at a University of Alberta lab.
The solar cells are made using nanoparticles — microscopic particles just 30 to 40 atoms across — that are very cheap to produce from zinc and phosphorus, said Jillian Buriak, a University of Alberta chemistry professor and senior research officer of the National Institute of Nanotechnology.
“We turn these things into inks or paints that you can spray coat onto plastics,” Buriak told Quirks & Quarks host Bob McDonald in an interview that airs Saturday.
Hear the full interview on Quirks & Quarks, Saturday, Oct. 5, at noon on CBC Radio One.
The resulting solar cells can be made extremely light and flexible compared to conventional silicon solar cells.
The zinc phosphide nanoparticle solar cells are also cheaper than conventional solar cells because the process used to make them is very low-energy, Buriak said.
Silicon solar cells are made from sand in a process that involves heating the materials repeatedly to very high temperatures – around 1000 C. As a result, Buriak estimated, it takes three to six years for the resulting solar cell to generate the amount of power used to manufacture it in the first place.
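The "three to six years" claim is an energy payback time: embodied manufacturing energy divided by annual output. A minimal sketch with illustrative numbers (these are not figures from Buriak's group):

```python
def energy_payback_years(embodied_kwh, annual_output_kwh):
    """Years a panel must operate before generating the energy used to make it."""
    return embodied_kwh / annual_output_kwh

# Illustrative: a panel that took 1,200 kWh to manufacture and yields 300 kWh/year
print(energy_payback_years(1200, 300))  # 4.0 years, within the 3-6 year range cited
```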
On the other hand, the solar nanoparticles are “actually made in a standard, bubbling pot glassware set up in the lab — the traditional image of chemistry — ” from elements that are very abundant, Buriak said.
Buriak and her colleagues published a description of their solar cell-making process in a recent issue of the scientific journal ACS Nano.
So far, her team has only made very small solar cells from their zinc phosphide nanoparticles, but they recently received funding from the Alberta government to apply the coating to larger sheets of plastic.
“We actually use spray coaters that you can buy from an automobile touch-up shop for paint,” Buriak said.
The efficiency of the solar cells is “not great,” she acknowledged, but that’s something her team is working on.
The fact that they’re “so cheap to make,” she added, means they will only have to reach 7.5 per cent efficiency before they will be commercially competitive with conventional energy sources such as coal-electric generation.
Buriak's research group has previously worked on other kinds of cheap, spray-on solar cell materials, such as flexible polymers.
Wildcat running robot has reached 16 mph without any tethered cable
WildCat is a four-legged robot being developed to run fast on all types of terrain. So far WildCat has run at about 16 mph on flat terrain using bounding and galloping gaits. The video shows WildCat's best performance so far. WildCat is being developed by Boston Dynamics with funding from DARPA's M3 program.
Boston Dynamics is funded by DARPA's Maximum Mobility and Manipulation (M3) program. It first developed a prototype called Cheetah that broke all speed records for legged robots last year. Cheetah reached 29 mph (46 km/h), but it was tethered to an external power source and had the benefit of running on a smooth treadmill while being partially balanced by a boom arm.
The eventual goal is to produce a four-legged robot that can run at speeds of up to 50 mph on "all types of terrain." While it's fun to think of WildCat robots chasing down enemy combatants on the battlefield, the main purpose of the M3 program is simply to investigate how we can create robots that are much more fluid and flexible than they currently are. It would be naive to think that some version of Cheetah/WildCat won't eventually be used in battle, though. Perhaps to run supplies to the frontline, or perhaps for more aggressive acts, such as a suicidal robotic bomb that runs into the enemy line and explodes.
Boston Dynamics and DARPA now have a complete family of robots:
A human (Atlas)
A mule/ox (LS3)
The US military might one day field a completely robotic army, with Atlas firing the weapons, BigDog acting as the pack mule, and WildCat providing rapid, highly maneuverable support and flanking. Plus they will have a variety of air drones, as well as drone carriers able to lift cars and trucks.
Thucydides said: Interesting idea with long-term potential.
Nuclear fusion milestone passed at US lab
By Paul Rincon
Science Editor, BBC News website
"The BBC understands that during an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel - the first time this had been achieved at any fusion facility in the world."
IBM Research to Accelerate Big Data Discovery
New lab unifies data, expertise and novel analytics to speed discovery in industries including retail, medicine and finance
San Jose, Calif. - 10 Oct 2013: Scientists from IBM (NYSE: IBM) today announced the Accelerated Discovery Lab, a new collaborative environment specifically targeted at helping clients find unknown relationships from disparate data sets.
The workspace includes access to diverse data sources, unique research capabilities for analytics such as domain models, text analytics and natural language processing capabilities derived from Watson, a powerful hardware and software infrastructure, and broad domain expertise including biology, medicine, finance, weather modeling, mathematics, computer science and information technology. This combination reduces time to insight, resulting in business impact – cost savings, revenue generation and scientific impact – ahead of the traditional pace of discovery.
The notion of Moore’s Law for Big Data has less to do with how fast data is growing, and more with how many connections one can make with that data, and how fast those connections are growing. While companies could utilize data scientists to analyze their own information, they may miss insights that can only be found by bringing their understanding together with other experts, data sources, and tools to create different context and discover new value in their Big Data.
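One way to see why connections, not raw volume, dominate: the number of distinct pairs among n data sources grows quadratically, so adding sources multiplies the possible cross-connections. A minimal illustration:

```python
def pairwise_connections(n_sources):
    """Distinct pairs among n data sources: n choose 2."""
    return n_sources * (n_sources - 1) // 2

# Doubling the sources roughly quadruples the possible cross-connections:
print(pairwise_connections(10))  # 45
print(pairwise_connections(20))  # 190
```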
“If we think about Big Data today, we mostly use it to find answers and correlations to ideas that are already known. Increasingly what we need to do is figure out ways to find things that aren’t known within that data,” said Jeff Welser, Director, Strategy and Program Development, IBM Research Accelerated Discovery Lab. “Whether it’s through exploring thousands of public government databases, searching every patent filing in the world, including text and chemical symbols, to develop new drugs or mixing social media and psychology data to determine intrinsic traits, there's a big innovation opportunity if companies are able to accelerate discovery by merging their own assets with contextual data.”
With much of today’s discovery relying on rooting through massive amounts of data, gathered from a broad variety of channels, it is painful for many businesses and scientists to manage the diversity and the sheer physical volumes of data for multiple projects and to locate and share necessary resources and skills outside their organizations.
Leveraging the best research and product technologies for analytics on a scalable platform, the Accelerated Discovery Lab empowers subject matter experts to quickly identify and work with assets such as datasets, analytics, and other tools of interest relevant to their project.
At the same time, it encourages collaboration across projects and domains to spark serendipitous discovery by applying non-proprietary assets to subsequent projects. This collaboration can occur whether the experts are co-located in the same physical location or are geographically distributed but working within the same system infrastructure.
“The history of computing shows that systems commoditize over time,” said Laura Haas, IBM Fellow and Director, Technology and Operations, IBM Research Accelerated Discovery Lab. “Moving forward, people and systems together will do more than either could do on their own. Our environment will provide critical elements of discovery that allow domain experts to focus on what they do best, and will couple them with an intelligent software partner that learns continuously, increasing in value over time.”
Drug Development: The process of drug discovery today spans an average of 12 to 15 years, with billions of dollars invested per drug, and a 90+% fallout rate. Working primarily with pharmaceutical companies, IBM Research is using machine-based discovery technology to mine millions of published papers, patents and material properties databases. Then using advanced analytics, modeling and simulation to aid human discovery, IBM is able to uncover unexpected whitespace and innovation opportunities, and predict where to make the most profitable research bets. The inability to discover the next “new thing” quickly is a huge shortcoming faced by companies today across multiple industries including retail, medicine and consumer goods. A diverse set of skills and tools were needed to integrate and analyze these many sources of data, from deep domain knowledge of chemistry, biology and medicine, to data modeling and knowledge representation, to systems optimization. The data sets, skills and infrastructure provided by the Accelerated Discovery Lab not only enabled this work, but also are allowing the re-use of the tools in domains from materials discovery to cancer research.
Social Analytics: Marketers gather terabytes of data on potential customers, spend billions of dollars on software to analyze spending habits and segment the data to calibrate their campaigns to appeal to specific groups. Yet they still often get it wrong because they study “demographics” (age, sex, marital status, dwelling place, income) and existing buying habits instead of personality, fundamental values and needs. Recognizing this, scientists at IBM Research are helping businesses understand their customers in entirely new ways using terabytes of public social media data. They are able to understand and segment personalities and buying patterns from vast amounts of noisy social media data and do so automatically, reliably and after as few as 50 tweets. This is data that marketers never had before, permitting much more refined marketing than traditional approaches based on demographics and purchase history alone. The Accelerated Discovery Lab brought together the expertise in text analytics, human-computer interaction, psychology and large-scale data processing to enable these new insights. Because clients from multiple industries including retail, government, media and banking are exploring different applications of social analytics in this common environment, the opportunities for unexpected discoveries abound as new analytics are applied to diverse challenges.
Predictive Maintenance: Natural resources industries, such as oil and gas, mining and agriculture, depend on the effectiveness and productivity of expensive equipment. Most maintenance processes result in costly in-field failures, which can cost a company $1.5M for one day of downtime on a single piece of equipment. In order to have a real bottom-line impact, analytics and modeling need to be integrated with current processes. IBM developed an intelligent condition monitoring technology using the most comprehensive data set ever assembled in this domain. This system proactively presents decision support information to drive actions that reduce downtime, increase fleet productivity, and minimize maintenance costs – in fact, one estimate suggests that a $30B company can save $3B a year by implementing predictive maintenance technology. The Accelerated Discovery Lab brought together experts in the domain, the systems and mathematical modeling and provided systems infrastructure and expertise that freed the domain researchers and mathematicians to focus on the client problem and sped up the execution of the resulting models by a factor of 8.
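The downtime arithmetic behind the business case is easy to sketch; the $1.5M per unit-day figure is the article's, while the fleet numbers below are hypothetical:

```python
def downtime_cost(units_down, days, cost_per_unit_day=1.5e6):
    """In-field failure cost at $1.5M per unit per day of downtime."""
    return units_down * days * cost_per_unit_day

# Hypothetical: a predictive-maintenance rollout that avoids 10 unit-days of downtime
print(downtime_cost(10, 1))  # 15,000,000.0
```

At that per-day cost, even a handful of avoided failures can justify the analytics investment.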
About IBM Big Data and Analytics
Each day we create 2.5 quintillion bytes of data generated by a variety of sources -- from climate information, to posts on social media sites, and from purchase transaction records to healthcare medical images. At IBM we believe that Big Data and analytics are a catalyst to help clients become more competitive and drive growth. IBM is helping clients harness this Big Data to uncover valuable insights, and transform their business. IBM has established the world's deepest and broadest portfolio of Big Data technologies and solutions, spanning services, software, research and hardware. For more information about IBM and Big Data and analytics, visit http://www.ibmbigdatahub.com. Follow IBM and Big Data on Twitter @IBMbigdata and #ibmbigdata.
IBM Media Relations
1 (408) 927-2272
How IBM is making computers more like your brain. For real
Big Blue is using the human brain as a template for breakthrough designs. Brace yourself for a supercomputer that's cooled and powered by electronic blood and small enough to fit in a backpack.
by Stephen Shankland October 17, 2013 6:11 AM PDT
ZURICH, Switzerland -- Despite a strong philosophical connection, computers and brains inhabit separate realms in research. IBM, though, believes the time is ripe to bring them together.
Through research projects expected to take a decade, Big Blue is pairing biological and manufactured forms of computing so that each can shed light on the other.
On the computing side, IBM is using the brain as a template for breakthrough designs such as the idea of using fluids both to cool the machine and to distribute electrical power. That could enable processing power that's densely packed into 3D volumes rather than spread out across flat 2D circuit boards with slow communication links.
And on the brain side, IBM is supplying computing equipment to a $1.3 billion European effort called the Human Brain Project. It uses computers to simulate the actual workings of an entire brain -- a mouse's first, then a human's -- all the way down to the biochemical level of the neuron. Researchers will be able to tweak parameters as the simulation is running to try to figure out core mechanisms for conditions like Alzheimer's disease, schizophrenia, and autism.
It's all part of what IBM calls the cognitive systems era, in which computers aren't just programmed, but also perceive what's going on, make judgments, communicate with natural language, and learn from experience. It's a close cousin to that decades-old dream of artificial intelligence.
"If we want to make an impact in the cognitive systems era, we need to understand how the brain works," said Matthias Kaiserswerth, a computer scientist who's director of IBM Research in Zurich, speaking during a media tour of the labs on Wednesday.
One key challenge driving IBM's work is matching the brain's power consumption. Over millions of years, nature has evolved a remarkably efficient information-processing design, said Alessandro Curioni, manager of IBM Research's computational sciences department. The ability to process the subtleties of human language helped IBM's Watson supercomputer win at "Jeopardy." That was a high-profile step on the road to cognitive computing, but from a practical perspective, it also showed how much farther computing has to go.
"Watson used 85 kilowatts," Kaiserswerth said. "That's a lot of power. The human brain uses 20 watts."
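The size of that power gap is easy to quantify from the article's own figures:

```python
# Rough comparison of Watson's power draw to the human brain's.
# Both figures come from the article; the ratio is the only derived number.
watson_w = 85_000  # Watson: 85 kilowatts, in watts
brain_w = 20       # human brain: ~20 watts

ratio = watson_w / brain_w
print(f"Watson draws {ratio:.0f}x the power of a human brain")  # 4250x
```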
Dense 3D computing
The shift in IBM's computing research shows in the units the company uses to measure progress. For decades, the yardstick of choice for gauging computer performance has been operations per second -- the rate at which the machine can perform mathematical calculations, for example.
When energy constraints became a problem, meaning that computers required prohibitive amounts of electrical power and threw off problematic amounts of waste heat, a new measurement arrived: operations per joule of energy. That gauges a computer's energy efficiency.
Now IBM has a new yardstick: operations per liter. The company is judging success by how much data-processing ability it can squeeze into a given volume. Today's computers must be laid out on flat circuit boards that ensure plenty of contact with air that cools the chips.
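The three yardsticks can be sketched side by side. The machine figures below are illustrative placeholders, not measured IBM data:

```python
# The three performance yardsticks the article describes: raw speed
# (ops/s), energy efficiency (ops/J), and volumetric density (ops/L).
def yardsticks(ops_per_sec, power_watts, volume_liters):
    return {
        "ops/s": ops_per_sec,
        "ops/J": ops_per_sec / power_watts,   # 1 watt = 1 joule per second
        "ops/L": ops_per_sec / volume_liters,
    }

# Hypothetical dense machine: 1 petaflop, 20 kW, 10 liters
m = yardsticks(ops_per_sec=1e15, power_watts=20_000, volume_liters=10)
print(m)
```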
"In a computer, processors occupy one-millionth of the volume. In a brain, it's 40 percent. Our brain is a volumetric, dense object," said Bruno Michel, a researcher in advanced thermal packaging for IBM Research, who got his Ph.D. in biophysics.
What's the problem with sprawl? In short, communication links between processing elements can't keep up with data-transfer demands, and they consume too much power as well, Michel said.
The fix is to stack chips into dense 3D configurations, with chips linked using a technology called through-silicon vias (TSVs). That's impossible today because stacking even two chips means crippling overheating problems. But IBM believes it's got an answer to the cooling problem: a branching network of liquid cooling channels that funnel fluid into ever-smaller tubes.
The liquid passes not next to the chip, but through it, drawing away heat in the thousandth of a second it takes to make the trip, Michel said. The company has demonstrated the approach in an efficient prototype system called Aquasar. (Get ready for another new yardstick: greenhouse gas emissions. Aquasar can perform 7.9 trillion operations per second per gram of carbon dioxide released into the atmosphere.)
Liquid-based flow battery
But that's not all the liquid will do. IBM is also developing a system called a redox flow battery that uses it to distribute power in place of wires. Two liquids called electrolytes, each carrying oppositely charged ions, circulate through the system to deliver power. Think of it as a liquid battery interlaced through the interstices of the machine.
"We are going to provide cooling and power with a fluid," Michel said. "That's how our brain does it."
The electrolytes, vanadium-based at present, travel through ever-smaller tubes, said Patrick Ruch, another IBM researcher working on the effort. At the smallest, they're about 100 microns wide, about the width of a human hair, at which point they hand off their power to conventional electrical wires. Flow batteries can produce between 0.5 and 3 volts, and that in turn means IBM can use the technology today to supply 1 watt of power for every square centimeter of a computer's circuit board.
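The quoted figures imply the current densities the electrolytes must carry. This is simple P = V * I arithmetic on the article's numbers:

```python
# Implied current density for the flow battery: at 1 W per square
# centimeter and the quoted 0.5-3 V cell voltages, I = P / V.
power_density = 1.0  # W/cm^2, from the article

for volts in (0.5, 3.0):
    amps = power_density / volts
    print(f"{volts} V -> {amps:.2f} A/cm^2")
```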
Liquid cooling has been around for decades in the computing industry, but most data centers avoid it given its expense and complexity. It's possible the redox battery could provide a new incentive to embrace it, though.
Michel estimates the liquid power technology will take 10 to 15 years to develop, but when it works, it'll mean supercomputers that fit into something the size of a backpack, not a basketball court.
"A 1-petaflop computer in 10 liters -- that's our goal," Michel said.
Performing at 1 petaflop means a computer can complete a quadrillion floating-point mathematical operations per second. Today's top supercomputer clocked in at 33.86 petaflops, but it uses 32,000 Xeon processors and 48,000 Xeon Phi accelerator processors.
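Putting those petaflop numbers side by side shows how ambitious the 10-liter goal is; the per-chip figure below is derived from the article's processor counts:

```python
# Michel's 10-liter goal versus today's top machine
# (33.86 petaflops from 32,000 Xeons plus 48,000 Xeon Phi chips).
PETA = 1e15

goal_flops, goal_liters = 1 * PETA, 10
print(goal_flops / goal_liters)   # 1e14 flops per liter

top_flops = 33.86 * PETA
top_chips = 32_000 + 48_000
print(top_flops / top_chips)      # ~4.2e11 flops per chip
```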
How to build a brain
More conventional supercomputers have been used so far for IBM's collaborations in brain research. The highlight of that work so far has been the Blue Brain project, which is on its third IBM Blue Gene supercomputer at the Ecole Polytechnique Federale de Lausanne, or EPFL, in Lausanne, Switzerland. The Blue Brain and Human Brain Project will take a new step with a Blue Gene/Q augmented by 128 terabytes of flash memory at the Swiss National Supercomputing Center in Lugano, Switzerland. It'll be used to simulate the formation and inner workings of an entire mouse brain, which has about 70 million neurons.
The eventual human brain simulation will take place at the Juelich Supercomputing Center in northern Germany, Curioni said. It's planned to be an "exascale" machine -- one that performs 1 exaflops, or quintillion floating-point operations per second.
The project doesn't lack for ambition. One of its driving forces is co-director Henry Markram of EPFL, who has worked on the Blue Brain project for years and sees computing as the way to understand the true workings of the human brain.
"It's impossible to experimentally map the brain," simply because it's too complicated, Markram said. There are too many neurons overall, 55 different varieties of neuron, and 3,000 ways they can interconnect. That complexity is multiplied by differences that appear with 600 different diseases, genetic variation from one person to the next, and changes that go along with the age and sex of humans.
"If you can't experimentally map the brain, you have to predict it -- the numbers of neurons, the types, where the proteins are located, how they'll interact," Markram said. "We have to develop an entirely new science where we predict most of the stuff that cannot be measured."
With the Human Brain Project, researchers will use supercomputers to reproduce how brains form -- basically, growing them in a virtual vat -- then see how they respond to input signals from simulated senses and a simulated nervous system.
The idea isn't to reproduce every last thing about the brain, but rather a model based on the understanding so far. If it works, actual brain behavior should emerge from the fundamental framework inside the computer, and where it doesn't work, scientists will know where their knowledge falls short.
"We take these rules and algorithmically reconstruct a model of the brain," Markram said. "We'll say this is biological prediction, then we can go back to the experiments and we can verify if the model is right. We celebrate when the model is wrong, because that's when it points to where we need more data or we don't understand the rules."
The result, if the work is successful, will be not just a better understanding of the brain, but better cooperation among brain researchers and medical experts. That could reverse recent declines in the development of new drugs to treat neural problems, he said.
And understanding the brain could usher in the era of "neuromorphic computing."
"Any new rules, circuits, or understanding of how the brain works will allow us to design neuromorphic machines that are much more powerful in terms of cognitive power, energy efficiency, and packaging," Curioni said.
And that, in turn, could lead to profoundly more capable computers. For starters, IBM has four markets in mind: machines that could find the best places to invest money, bring new depth and accuracy to medical diagnoses, research the appropriate legal precedents in court cases, or give people help when they dial a call center.
But it's not hard to imagine that's only the beginning. When computers can learn for themselves and program themselves, it's clear the divide separating biological and artificial computing will be a lot narrower.
(Phys.org) —By collecting heat energy from the environment and transforming it into electrical power, thermoelectric energy harvesters have the potential to provide energy for a variety of small electronic devices. Currently, the biggest challenge in developing thermoelectric energy harvesters is to make systems that are both powerful and efficient at the same time.
One material that scientists have experimented with for making thermoelectric energy harvesters is quantum dots, nano-sized crystals with semiconducting properties. Due to their sharp, discrete energy levels, quantum dots are good energy filters, which is an important property for thermoelectric devices.
In a new study published in the New Journal of Physics, a team of researchers from Switzerland, Spain, and the US has investigated a thermoelectric energy harvester design based on quantum wells. Although quantum wells are also made of semiconducting materials, they have different structures and energy-filtering properties than quantum dots.
"We have shown that quantum wells can be used as powerful and efficient energy harvesters," said coauthor Björn Sothmann, a physicist at the University of Geneva in Switzerland. "Compared to our previous proposal based on quantum dots, quantum wells are easier to fabricate and offer the potential to be operated at room temperature."
The energy harvester design that the researchers investigated here consists of a central cavity connected via quantum wells to two electronic reservoirs. The central cavity is kept at a hotter temperature than the two electronic reservoirs, and the quantum wells act as filters that allow electrons of certain energies to pass through. In general, the greater the temperature difference between the central cavity and the reservoirs, the greater the electron flow and output power.
In their analysis, the researchers found that the quantum well energy harvester delivers an output power of about 0.18 W/cm2 for a temperature difference of 1 K, which is nearly double the power of a quantum dot energy harvester. This increased power is due to the ability of quantum wells to deliver larger currents compared to quantum dots as a result of their extra degrees of freedom.
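A back-of-envelope calculation shows what that power density could mean for the waste-heat application discussed later. The chip area here is an illustrative assumption, not a figure from the study:

```python
# How much power a harvester at the paper's quoted density
# (0.18 W/cm^2 at a 1 K temperature difference) could recover
# from a chip-sized area.
well_density = 0.18            # W/cm^2, quantum wells (from article)
dot_density = well_density / 2 # "nearly double" the quantum-dot figure

chip_area_cm2 = 4.0            # assumed ~2 cm x 2 cm chip
print(f"wells: {well_density * chip_area_cm2:.2f} W")  # 0.72 W
print(f"dots:  {dot_density * chip_area_cm2:.2f} W")   # 0.36 W
```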
Although the quantum well energy harvester has a good efficiency, the efficiency is slightly lower than that of energy harvesters based on quantum dots. The researchers explain that this difference occurs because of the difference in energy filtering: quantum wells transmit electrons of any energy above a certain level, while quantum dots are more selective and let only electrons of a specific energy pass through. As a result, quantum wells are less efficient energy filters.
Quantum well energy harvesters appear promising for applications. For one thing, they may be easier to fabricate than energy harvesters that use quantum dots, since quantum dots are required to have similar properties in order to achieve good performance, and there is no such requirement for quantum wells. In addition, the fact that they can operate at room temperature may make quantum well energy harvesters suitable for a variety of applications, such as electric circuits.
"The energy harvester can be used to convert waste heat from electric circuits, e.g. in computer chips, back into electricity," Sothmann said. "This way, one can reduce both the consumed power as well as the need for cooling the chip."
The researchers hope that their work encourages experimental groups to build and test the device.
Read more at: http://phys.org/news/2013-10-scientists-quantum-wells-high-power-easy-to-make.html#jCp
A MEGA TO GIGA YEAR STORAGE MEDIUM CAN OUTLIVE THE HUMAN RACE
Mankind has been storing information for thousands of years, from carvings in marble to today's magnetic data storage. Although the amount of data that can be stored has increased immensely over the past few decades, it is still difficult to store data for truly long periods. The key to successful information storage is ensuring that the information does not get lost. If we want to store information that will outlast mankind itself, different requirements apply than for a medium used for daily information storage. Researcher Jeroen de Vries of the University of Twente MESA+ Institute for Nanotechnology demonstrates that it is possible to store data for extremely long periods. He will be awarded his doctorate on 17 October.
Current hard disk drives can store vast amounts of data but last roughly ten years at room temperature: their magnetic energy barrier is low, so the information is eventually lost. CDs, DVDs, paper, tape, clay tablets, and stone also have a limited life. Alternatives will have to be sought if information is to be retained longer.
Archival storage for up to one billion years
It is possible to conceive of a number of scenarios why we wish to store information for a long time. "One scenario is that a disaster has devastated the earth and society must rebuild the world. Another scenario could be that we create a kind of legacy for future intelligent life that evolves on Earth or comes from other worlds. You must then think about archival storage of between one million and one billion years," according to researcher De Vries.
Optical information carrier
De Vries has developed an optical information carrier that can store information for extremely long periods of time, with each bit being written using etching techniques. The chosen information carrier is a wafer consisting of tungsten encapsulated by silicon nitride. Tungsten was chosen because it can withstand extreme temperatures. A QR code is etched into the tungsten (see picture) and is protected by the nitride. Each pixel of the large QR code contains a smaller QR code that in turn stores different information. "In principle, we can store everything on the disc that we believe is worthwhile saving: for example, a digital image of the Mona Lisa. In this study we tested a digital copy of the chapter about this medium from my thesis", says De Vries.
Ageing test at high temperatures
In order to ensure the stability of the data, an energy barrier that separates the information from the non-information is required. In order to prove that the data will still be legible after millions of years, an ageing test is needed to see whether the energy barriers are high enough to prevent data loss. De Vries: "According to the Arrhenius model, the medium should keep working for at least 1 million years if it is heated to a temperature of 473 Kelvin (200 degrees Celsius) and kept in the oven for an hour." After the test there was no visible degradation of the tungsten, and it was still easy to read the information. Things become complicated at higher temperatures: when heated to 713 Kelvin (440 degrees Celsius) it becomes a lot more difficult to decipher the QR codes, even though the tungsten itself is not affected. De Vries: "A follow-up study would be to investigate whether the data carrier can also withstand higher temperatures, for example during a house fire. But if we can find a place that is very stable, such as a nuclear storage facility, then the disc itself and the data on it should be able to endure millions of years."
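The logic of the accelerated test can be sketched with the Arrhenius model: lifetime scales as exp(Ea / (kB * T)), so surviving a short bake at high temperature implies a much longer lifetime at room temperature. The derived barrier below is illustrative; the thesis's exact values may differ:

```python
import math

KB = 8.617e-5               # Boltzmann constant, eV/K
HOURS_PER_MYR = 1e6 * 8766  # approximate hours in a million years

T_test, t_test = 473.0, 1.0            # 1 hour at 200 C (the oven test)
T_room, t_room = 300.0, HOURS_PER_MYR  # target: 1 million years at ~room temp

# From t_room / t_test = exp(Ea/KB * (1/T_room - 1/T_test)),
# solve for the energy barrier Ea the test certifies.
Ea = KB * math.log(t_room / t_test) / (1.0 / T_room - 1.0 / T_test)
print(f"implied energy barrier: {Ea:.2f} eV")  # ~1.6 eV
```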
About Jeroen de Vries
Jeroen de Vries was born on 5 January 1982 in Stede Broec. In 2000, he moved to Enschede to study Electrical Engineering. From December 2007 to April 2008 he stayed in Akita, Japan with the group of Professor Hitoshi Saito at Akita University in order to study the theoretical sensitivity of cantilever tip shapes. He followed this with a study of the optical readout of a cantilever array at the Systems and Materials for Information storage (SMI) group. He graduated in 2009 and then started as a PhD student with the Transducers Science and Technology (TST) group. During his doctoral studies, he participated in the IEEE summer school on magnetism in Dresden, Germany and the ESONN summer school on nanotechnology in Grenoble, France.
The PhD defence of Jeroen de Vries from the Transducers Science and Technology department of the MESA+ Institute will take place in the Waaier building on the University of Twente campus on 17 October 2013 at 2.45 p.m. His thesis supervisor is Professor Miko C. Elwenspoek (Electromagnetism) from the Electrical Engineering, Mathematics and Computer Science (EEMCS) faculty. For more information, or an electronic version of the thesis ‘Energy barriers in patterned media’ you can contact Dennis Moekotte, (+31 6 18642685), the University of Twente Press department.
from the Washington Free Beacon
China Tests High-Speed Precision-Guided Torpedo
Torpedo poses threat to U.S. ships, submarines
BY: Bill Gertz
November 14, 2013 5:00 am
China’s navy recently conducted a test of a new high-speed maneuvering torpedo that poses a threat to U.S. ships and submarines.
Defense officials said the new torpedo is the latest example of what the Pentagon calls Beijing’s anti-access, area-denial, or AA/AD, high-tech weaponry.
Other new weapons include China’s recently deployed anti-ship ballistic missile, the DF-21D, which is designed to sink U.S. aircraft carriers far from China’s shores.
China’s military showcased last month another high-tech weapon designed to target Navy ships and submarines. U.S. submarines are considered one of the U.S. military’s most important counters to the AA/AD threat.
Amazon and Walmart are competing on same-day delivery now, but Amazon is looking to leap to 30-minute delivery drones with Prime Air
In a Sunday evening "60 Minutes" program aired on CBS Amazon chief executive Jeff Bezos unveiled the new service, dubbed Prime Air, to CBS anchor Charlie Rose.
The company has been working on the "octocopter" project for months in a secret research and development lab at its Seattle, Wash.-based headquarters as it ramps up competition with its rivals. According to the program, the octocopter drones will pick up packages in small buckets at Amazon's fulfillment centers and fly directly to nearby customers in as little as 30 minutes after they hit the "buy" button.
But the service won't launch overnight. In fact, it may take four to five years for Prime Air drones to take to the skies, since the service needs FAA (Federal Aviation Administration) approval.
Prime Air is something the Amazon team has been working on in their next generation R and D lab. The goal of this new delivery system is to get packages into customers' hands in 30 minutes or less using unmanned aerial vehicles. Putting Prime Air into commercial use will take some number of years as we advance technology and wait for the necessary FAA rules and regulations. This is footage from a recent test flight. See page at http://amzn.to/PrimeAir
Amazon hopes FAA's rules will be in place as early as sometime in 2015.