22 October 2014

Why Solar Is Much More Costly Than Wind or Hydro

I noticed a recent article in Technology Review that reminded me that many people can look at the same information, yet when that information is presented in a different way, their understanding of it can change as well.

A new report from the E.U. estimates the true economic cost of different forms of energy production.

By Mike Orcutt on October 22, 2014

WHY IT MATTERS

The cost of electricity generation is crucial to the debate over climate change and energy policies.

It’s no surprise that if environmental costs are considered, renewables—particularly wind power—are a far better bargain than coal power. But it might surprise many that according to a new such analysis, solar power lags far behind wind and even hydroelectric power in its economic impact, at least in the European Union.

Energy costs are rarely viewed through this lens, though it is more common in Europe than in the U.S. and other parts of the world. The study, commissioned by the E.U. and conducted by Ecofys, a renewable energy consultancy, considered the economic costs of climate change, pollution, and resource depletion as well as the current capital and operating costs of the power plants.

The authors assessed the cost of generating electricity and any resulting environmental damage. They used a measure known as the “levelized cost,” the estimated cost per megawatt-hour, without subsidies, of building and operating a given plant in a given region over an assumed lifetime. The authors referred to established models and academic literature to find monetary values for pollution, land use, and resource depletion. And to account for climate change, they assumed a metric ton of emitted carbon dioxide costs around €43 ($55).
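For a rough sense of how a levelized cost plus a carbon charge might be computed, here is a minimal Python sketch. The plant parameters, discount rate, and emissions intensity are illustrative assumptions of mine, not figures from the Ecofys report; only the €43-per-tonne carbon price comes from the article.

```python
# Minimal sketch of a levelized-cost calculation with an added carbon charge.
# All plant parameters below are hypothetical; only the EUR 43/tonne carbon
# price is taken from the article.

def levelized_cost(capital_cost, annual_costs, annual_mwh, lifetime_years, discount_rate):
    """Levelized cost in EUR/MWh: discounted lifetime costs / discounted lifetime output."""
    costs = capital_cost  # up-front capital, year 0
    output = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1 + discount_rate) ** -year
        costs += annual_costs * factor
        output += annual_mwh * factor
    return costs / output

def with_carbon_cost(lcoe, tonnes_co2_per_mwh, carbon_price=43.0):
    """Add an external climate cost, valued at EUR 43 per tonne of CO2 (per the study)."""
    return lcoe + tonnes_co2_per_mwh * carbon_price

# Hypothetical gas plant: EUR 500M capital, EUR 120M/yr fuel and O&M,
# 3.5 TWh/yr output, 30-year life, 7% discount rate, 0.4 t CO2 per MWh.
base = levelized_cost(500e6, 120e6, 3.5e6, 30, 0.07)
total = with_carbon_cost(base, 0.4)
print(f"Levelized cost: {base:.0f} EUR/MWh; with carbon cost: {total:.0f} EUR/MWh")
```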

Previous studies have looked at the economic impacts of pollution and other environmental consequences of energy production, but the Ecofys analysis is unique in that it includes the depletion of energy resources as an additional cost, says Ann Gardiner, a consultant at Ecofys who co-authored it.

Surprisingly, solar power fared poorly in the analysis, costing far more than wind power and nearly the same as nuclear power. The reason, says Gardiner, is that many of the world’s solar panels are manufactured in China, where electricity is very carbon-intensive. The depletion of metal resources also represents a larger cost for solar than wind, she says. Gardiner notes, however, that solar technology is still improving and may be more cost-effective today than it was in 2012, the year used for the study.

According to the Ecofys analysis, new coal and natural gas plants in the E.U. have levelized costs of just over €50 ($64) (in 2012 euros) per megawatt-hour (assuming they are running at maximum capacity); onshore wind is around €80 ($102) per megawatt-hour; utility scale solar PV is about €100 ($127); nuclear power is around €90 ($115); and hydropower is as cheap as €10.

The chart in the original article shows the estimated cost for several technologies, per megawatt-hour. The environmental costs associated with different forms of energy production are approximations, but they show the scale of damages associated with each technology.

These costs would look different for other parts of the world. Two of the most influential variables are the cost of fuel (natural gas is much cheaper in the U.S., for example) and the capital cost of building power plants, which varies globally by as much as four times, says David Victor, professor of international relations at the University of California, San Diego.

15 October 2014

Telling Stories with Web Archives

Michele Weigle
Associate Professor of Computer Science, Old Dominion University
Monday, October 27, 2014
Room 141, Brooks Building, 1:00 p.m.

Abstract:
The web has become an integral part of our lives, shaping how we get news, shop, and communicate. When critical events occur, social media and news websites cover the stories as they break and continually revise them as the story evolves. Unfortunately, much of the content around these stories is vulnerable to being lost. Thus, web archives have become a significant repository of our recent history and cultural heritage. Content from web archives can be used to fill in the gaps in the live web about the evolution of the story of an important event. This talk will explore the problem and describe our initial steps towards a solution.

Speaker Bio:
Michele C. Weigle is an Associate Professor of Computer Science at Old Dominion University.  Her research interests include digital preservation, web science, information visualization, and mobile networking.  Dr. Weigle’s current research projects include an NEH-funded digital humanities project to allow users to archive dynamic or personalized web pages as they appear in the browser and an exploration of the use of web archives to enrich the live web experience through storytelling. She has published over 50 articles in peer-reviewed conferences and journals and co-edited one of the first books on vehicular networks, Vehicular Networks: From Theory to Practice, published in 2009 by CRC Press. She has served as PI or Co-PI on external research grants totaling over $2 million from NSF and NEH.  Dr. Weigle received her Ph.D. in computer science from the University of North Carolina in 2003. From 2004-2006, she was an Assistant Professor in the Department of Computer Science at Clemson University. She joined ODU in 2006.
Host: Jay Aikat (aikat@cs.unc.edu)

14 October 2014

Network Theory Reveals The Hidden Link Between Trade And Military Alliances That Leads to Conflict-Free Stability

An article in Technology Review that applies network theory to a different context.

The first game-theoretical study of military alliances shows that they cannot alone lead to global stability.
The study of modern history is currently undergoing a revolution. That is largely because historians are beginning to apply the ideas in network theory to the complex interactions that have forged our past.

There was a time when historians focused largely on events as the be all and end all of history. But in recent years, there has been a growing understanding that a complex network of links, alliances, trade agreements and so on plays a hugely important role in creating an environment in which conflict (or peace) can spread.

An interesting open question in this regard is whether certain kinds of networks exist that are stable against the outbreak of war. Today, we get an answer thanks to the work of Matthew Jackson and Stephen Nei at Stanford University in California. These guys combine network theory and game theory to study the stability of different kinds of networks based on real-world data.

In particular, Jackson and Nei study the theoretical properties of networks consisting of countries that have military links and compare them to the properties of networks in which countries have both military and trade links.

Finally, they apply real data to their model. They combine international trade data with the well-known “Correlates of War” database to see how closely their predictions match those of real networks.

Jackson and Nei begin by considering a simple network of a handful of countries that can form military coalitions of various strengths. At the same time, each alliance must serve a purpose by helping to protect the countries involved so that the deletion of any alliance would make a country vulnerable. In this network, a country is vulnerable to attack if its coalition is weaker than its opponents’ coalition and the cost of a war is less than the benefit.

An interesting question is whether such a network can ever be stable against war. In other words, can the network exist in such a way that no country is vulnerable to a successful attack by others and that no country can change alliances in a way that makes such an attack viable?
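As a toy illustration of the vulnerability condition described above (and not Jackson and Nei's actual model), here is a minimal Python sketch: countries have strengths, alliances are undirected links, and a country is vulnerable if some outside coalition outweighs its own coalition by more than the cost of war. All numbers are invented.

```python
# Toy sketch of the vulnerability check described above. This is a simplified
# reading of the idea, not Jackson and Nei's actual model: countries have
# strengths, alliances are undirected links, and a coalition's strength is the
# sum of its members' strengths.

from itertools import combinations

strengths = {"A": 3, "B": 2, "C": 2, "D": 1}   # hypothetical countries
alliances = {("A", "B"), ("C", "D")}           # hypothetical alliance links
WAR_COST = 1                                   # an attack pays only if the strength
                                               # gap exceeds this cost

def allies(country):
    return {x for pair in alliances for x in pair if country in pair} | {country}

def coalition_strength(members):
    return sum(strengths[c] for c in members)

def vulnerable(target):
    """Is there any opposing coalition strong enough to profitably attack `target`?"""
    defenders = allies(target)
    outsiders = [c for c in strengths if c not in defenders]
    for r in range(1, len(outsiders) + 1):
        for attackers in combinations(outsiders, r):
            if coalition_strength(attackers) > coalition_strength(defenders) + WAR_COST:
                return True
    return False

for country in strengths:
    print(country, "vulnerable" if vulnerable(country) else "safe")
```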

Jackson and Nei use game theory to calculate mathematically that this kind of stability is impossible. “It turns out that there are no war-stable networks,” they say (except for the trivial case of an empty network with no links).

The reason is that the pressure to ensure that all alliances are purposeful makes the network unstable almost by design. “The pressure to economize on alliances conflicts with stability against the formation of new alliances, which leads to instability and would suggest chaotic dynamics,” they say. “This instability provides insights into the constantly shifting structures and recurring wars that occurred throughout the nineteenth and first half of the twentieth centuries.”

Between 1820 and 1959, there were 10 times as many wars per year on average between each possible pair of countries as between 1960 and 2000. So what has changed since the 1950s?

Jackson and Nei argue that it is the formation of trade links between countries that has created the stability that has prevented wars. Between 1816 and 1950, a country had on average 2.525 alliances but this has grown by a factor of four since then.

They go on to add this to their model. They point out that there has been a rapid increase in global trade since World War II, not least because of the advent of container shipping in the 1960s.

Next, they consider a network in which a link can exist because of a military alliance or because of economic considerations. This change dramatically alters the stability of the resulting network for two reasons.

First, trade provides a reason to maintain an alliance. Second, these economic considerations reduce the incentive to attack another country since trade will be disrupted. “This reduces the potential set of conflicts and, together with the denser networks, allows for a rich family of stable networks that can exhibit structures similar to networks we see currently,” say Jackson and Nei, a result they are able to show mathematically and which matches the real-world data after the 1960s.
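Continuing the toy sketch from above, here is a hedged illustration of the second mechanism: a trade link between attacker and target reduces the net payoff of an attack, so an attack that was worthwhile without trade may no longer be. The spoils, war cost, and trade gain are invented numbers, not values from the paper.

```python
# Hedged illustration of how a trade link can remove the incentive to attack.
# Numbers are made up; only the structure of the comparison matters.

trade_links = {("A", "C")}      # hypothetical trade relationship
TRADE_GAIN = 3                  # value of the trade to each partner

def attack_payoff(attacker_coalition, defender_coalition, spoils=4, war_cost=1):
    """Net payoff to the attackers if they win: spoils minus war cost minus lost trade."""
    lost_trade = sum(TRADE_GAIN
                     for a, b in trade_links
                     if (a in attacker_coalition and b in defender_coalition)
                     or (b in attacker_coalition and a in defender_coalition))
    return spoils - war_cost - lost_trade

# Without the trade link the attack on C's coalition pays 4 - 1 = 3;
# with trade between A and C it pays 4 - 1 - 3 = 0, so it is no longer worthwhile.
print(attack_payoff({"A", "B"}, {"C", "D"}))
```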

Some historians might point out that there has been another factor at work since World War II: the development of nuclear weapons. Jackson and Nei say that nuclear weapons certainly change the landscape by increasing military strength and reducing the incentive to attack given the likely ensuing damage.

Indeed, their model suggests that the worldwide adoption of nuclear weapons could stabilise the global network in the absence of trade. However, their analysis also shows that the only solution for a stable network is the empty network in which there are no alliances. In this model, nuclear weapons do not play the role that many historians have claimed and cannot by themselves produce the network of links that exist today.

“To explain the much denser and more stable networks in the modern age along with the paucity of war in a world where nuclear weapons are limited to a small percentage of countries, our model points to the enormous growth in trade as a big part of the answer,” say Jackson and co.

That is a fascinating insight into the way conflicts can be prevented. The complex link between trade and conflict is increasingly the focus of study. But little has been done to explore the game-theoretic stability of these networks. “To our knowledge, there are no previous models of conflict that game-theoretically model networks of alliances between multiple agents/countries based on costs and benefits of wars,” they say.

That makes this an interesting and important step forward and the basis on which a number of new ideas can be tested. For example, Jackson and Nei could include much more detailed information about the nature of trade links. And they could also include the role that geography plays in conflict, which is hard to model. “Geography constrains both the opportunities and benefits from trade and war, and so it has ambiguous effects on stability,” they say.

Another important consideration is the relative rates of economic and military growth. Do countries that outgrow their competitors economically protect themselves against future conflict?

The best models also have predictive power. There is an interesting analogy with forest fires, which can occur on a wide range of scales, some many orders of magnitude larger than others. The size of these fires is almost entirely dependent on the network of links between trees in the forest. If these links are sparse, the fire dies out. But if they are dense, the fire can spread easily.

A crucially important point is that the eventual size of the fire has little, if anything, to do with the spark that started it. So an analysis that concentrates on this spark will inevitably ignore a great deal about the nature of the fire.

The same kind of thinking applies to a wide range of social phenomena, such as epidemics, fashions, wars and so on. And it allows interesting predictions, for example, that the size distribution of epidemics, fashions and wars should all follow the same pattern, which turns out to be observed. It also suggests that the initiating event is a poor predictor of the eventual size of an epidemic, fashion or war.
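To make the analogy concrete, here is a minimal percolation-style sketch (my own illustration, not taken from the paper): a random network is built at a given link density, one node is "sparked", and the cascade spreads along links. At low densities the cascade stays small whichever node is sparked; at high densities it engulfs nearly everything regardless of the spark.

```python
# Minimal sketch of the forest-fire analogy: cascade size is governed by how
# densely the nodes are linked, not by which node sparks the cascade.
# Purely illustrative; not taken from the paper.

import random

def cascade_size(n_nodes, link_prob, spark, seed=0):
    """Ignite one node in a random graph and count how many nodes the fire reaches."""
    rng = random.Random(seed)
    neighbors = {i: set() for i in range(n_nodes)}
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if rng.random() < link_prob:
                neighbors[i].add(j)
                neighbors[j].add(i)
    burning, frontier = {spark}, [spark]
    while frontier:
        node = frontier.pop()
        for nxt in neighbors[node] - burning:
            burning.add(nxt)
            frontier.append(nxt)
    return len(burning)

for p in (0.005, 0.02, 0.08):          # sparse -> dense linking
    sizes = [cascade_size(200, p, spark) for spark in range(10)]
    print(f"link prob {p}: cascade sizes {sizes}")
```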

So an interesting task for future modellers will be to use these kinds of simulations to predict where future conflicts might occur and how they might be prevented.

That is a big ask, not least because the changing nature of international alliances, whether economic or military, is hard both to measure in real time and to predict over any decent timescale. Nevertheless, these are all worthy goals.

Ref: arxiv.org/abs/1405.6400 Networks of Military Alliances, Wars, and International Trade

13 October 2014

Module 03 grades

I have finished grading your module 03 reports and everyone did well. I have put individualized gradesheets in protected directories, from where you may retrieve them at your leisure. You can find a link to your individual gradesheet at the grades page, which you may access by using our INLS201 readings credentials.

When you click on your name, you will be prompted for new credentials. For all but three of you all (and I have sent individual notes to the three who are affected), you are prompted for your personal ONYEN credentials.

I made some comments on each gradesheet. If you have any questions about what I had to say, we can discuss it during my office hours.

If your gradesheet does not show your Module 03 grade, let me know. It is possible that I might have missed a required step and can fix the problem as soon as it's pointed out to me.

Note that there has been a bit of a modification for Module 04. We will discuss it in class tomorrow, but the deliverable will be of a different type.

12 October 2014

Why do people love Ordnance Survey maps?

from a BBC article on 07 October 2014. Note how it ties into several of the themes we have mentioned in our discussion of information and how it may be understood, used, and managed.

Image: A man looks at a giant Ordnance Survey map of Wales
US-born neuroscientist John O'Keefe has jointly won the 2014 Nobel Prize for medicine for discovering the brain's navigation system. Is it any surprise then that he loves Ordnance Survey maps, writes Luke Jones.
O'Keefe came to the UK from the US in the late 1960s. He was supposed to stay for only two years as part of a post-doctoral study, but decided to relocate for good.
The 74-year-old told BBC Radio 4's Today programme that he was "very attracted to many aspects of British culture".
Two aspects that he named were the NHS and the Ordnance Survey map. "I like walking on the weekends and finding my way around," said the professor who found that the brain has an "inner GPS system" in 1971 by discovering nerve cells that help create maps.
Key research that he co-authored in 1971, The Hippocampus As A Cognitive Map, references an OS map as a way of explaining spatial behaviour and the brain's internal positioning system.
Most countries have a national mapping agency. But Rob Andrews, from Ordnance Survey, says it is the level of detail which makes them unique. Some 250 surveyors and two planes contribute to the "10,000 changes" made to the database every day. More detail is added and changes in the landscape are accounted for. We constantly have "roads changing, houses popping up and petrol stations being demolished", he says.
Image: Detail of the London Bridge area
Image: Minecraft map
Image: Ordnance Survey map from 1937
Simon Garfield, author of On the Map, agrees with O'Keefe that OS maps are an integral part of British culture.
"Ordnance Survey maps were originally inspired by 18th Century cartography in France," he says. "But they've been associated with sodden walks in the Cairngorms and the Lakeland Fells for so long that they'll always be thought of as British as roast beef and Big Daddy. What else makes them so? Their indefatigable finicky detail and their historic quirkiness. The maps show bracken and drinking fountains, not something you see much of on satnav."
The maps were originally military surveys. Official Superintendent William Mudge expected them to be the "honour of the nation". The first Ordnance Survey map produced was of Kent. They started mapping the South East to help with the defence plans against Napoleonic France. It was quickly picked up as a tool for tourists. A director of the Ordnance Survey complained of the "swarms of idle holiday visitors" who were pestering surveyors for the locations of the most picturesque parts to visit.
Print sales now account for just 7% of revenue. Last year, sales dropped below two million for the first time.
OS maps became available online in 2009, and in September 2013, OS terrain data was made available for users of the online game Minecraft.
Image: Evolution of the Ordnance Survey map for the Derwent Valley, Lake District

07 October 2014

SILS CRADLE Seminar

A note from Brad Hemminger

Friday Oct 10th, 12pm-1pm
Manning Hall room 208

Title: Modelling User Data Access Patterns as a Component of Archive Engineering
Presenter: Bruce Barkstrom, NASA

Abstract: As a bit of history on this, about 2001, I had put together a model of user data access that used a Markov model of user activities and that I'd begun to couple to an innovation-diffusion model of how user communities adopt a new approach to doing things. I'd even given a one-day presentation to various NASA EOS DAACs on how this would work. I've resurrected the model and am in the process of extending it to do dynamic Monte Carlo modelling, not just the steady-state Markov model solution. Part of the new material also includes the notion that user communities have their own dialects that add "noise" from semantic heterogeneity.
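For readers unfamiliar with the distinction the abstract draws, here is a minimal sketch (not Barkstrom's model) contrasting the steady-state solution of a small Markov chain of user activities with a dynamic Monte Carlo simulation of the same chain; the states and transition probabilities are invented.

```python
# Minimal sketch contrasting a steady-state Markov solution with a dynamic
# Monte Carlo simulation of the same chain. The states and transition
# probabilities are invented for illustration; this is not Barkstrom's model.

import random

STATES = ["search", "browse", "order", "download"]
# P[i][j] = probability of moving from STATES[i] to STATES[j]
P = [
    [0.5, 0.3, 0.2, 0.0],
    [0.2, 0.4, 0.3, 0.1],
    [0.0, 0.1, 0.4, 0.5],
    [0.3, 0.2, 0.0, 0.5],
]

def steady_state(P, iterations=1000):
    """Approximate the stationary distribution by repeatedly applying P."""
    dist = [1.0 / len(P)] * len(P)
    for _ in range(iterations):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]
    return dist

def monte_carlo(P, steps=100_000, seed=1):
    """Simulate one long user trajectory and count the time spent in each state."""
    rng = random.Random(seed)
    counts = [0] * len(P)
    state = 0
    for _ in range(steps):
        counts[state] += 1
        state = rng.choices(range(len(P)), weights=P[state])[0]
    return [c / steps for c in counts]

print("steady state :", [round(x, 3) for x in steady_state(P)])
print("monte carlo  :", [round(x, 3) for x in monte_carlo(P)])
```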

Bio: Bruce Barkstrom is retired from NASA. He has extensive experience in system design as well as data stewardship.

There Are Now 3 Million Data Centers in the U.S., and Climbing

An article on Mashable by Andrew Freedman, 30 Sep 2014
Image: In this Aug. 29, 2014 photo, rows of servers are lined up at BlueBridge Networks in Cleveland.
Data centers, the key backbone of the cloud, are growing larger and more energy intensive, prompting the Department of Energy (DOE) to launch a new initiative on Tuesday aimed at improving their energy efficiency.

According to DOE statistics, data center electricity use doubled between 2001 and 2006, from 30 to 60 billion kilowatt-hours, and stood at about 100 billion kilowatt-hours as of 2013. This amounts to about 2% of all U.S. electricity use and climbing.

Already, there are about 3 million data centers in the U.S., amounting to about one center per 100 people in the U.S., and this is expected to continue to grow as more computing applications for large and small companies are moved to these facilities.

The new initiative aims to improve the energy efficiency of data centers across the U.S. by bringing them into the preexisting "Better Buildings Challenge", which the Obama administration established in 2011. The challenge is part of the Obama administration's efforts to improve the energy efficiency of the U.S. economy, thereby cutting emissions of greenhouse gases that warm the climate.

The launch on Tuesday includes 19 new partners, from ecommerce giant eBay and retailers like Home Depot and Staples to eight DOE labs, including the Los Alamos National Laboratory in New Mexico, where nuclear weapons research is conducted. The inclusion of large federal facilities is noteworthy, since 10% of the federal government's energy use is from data centers, according to DOE information.

The initiative also includes the EPA, part of the Justice Department, Michigan State University, National Renewable Energy Laboratory and the Social Security Administration. Two large realty companies with large data centers are also partnering with the DOE on this initiative, including CoreSite Realty Corporation and Digital Realty.

The data centers that are part of the new program together draw more than 90 megawatts of power, according to a press release. This would be enough to power 90,000 homes for a year.

Check the Infographic at Mashable

According to the DOE, operators of "the vast majority" of data centers across the U.S. do not currently practice "energy management" to improve their efficiency and reduce energy costs.

Kathleen Hogan, DOE's deputy assistant secretary for energy efficiency, told Mashable in an interview that half of the energy used to power a data center is used for the cooling and powering of equipment, with the other half going to actually running the servers and other IT equipment.

Because most companies are turning over their data center IT technologies every three to five years, bringing in equipment that tends to be more efficient, Hogan says the focus of the initiative is on making gains in the efficiency of heating and cooling operations at these facilities.
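The 50/50 split Hogan describes corresponds to a power usage effectiveness (PUE) of about 2.0, the standard ratio of total facility energy to IT equipment energy. A quick sketch of that arithmetic, with hypothetical facility figures:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# The 50/50 split described above corresponds to a PUE of about 2.0.
# Facility figures below are hypothetical.

def pue(total_kwh, it_kwh):
    return total_kwh / it_kwh

total_kwh = 10_000_000       # hypothetical annual facility consumption
it_kwh = 5_000_000           # half of it actually reaches the IT equipment
print(f"PUE: {pue(total_kwh, it_kwh):.1f}")   # -> 2.0

# Cutting cooling/overhead energy by 20% while the IT load stays the same:
improved_total = it_kwh + 0.8 * (total_kwh - it_kwh)
print(f"Improved PUE: {pue(improved_total, it_kwh):.1f}")   # -> 1.8
```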

“We’re all in a world where things are changing quickly and we see there is always new IT equipment out there all the time,” she said. “There’s a great opportunity to reduce the air conditioning loads in a data center,” Hogan said.

The new initiative seeks to make all data centers operated by the partner companies and agencies at least 20% more efficient by 2020, Hogan added.

If all U.S. data centers met this efficiency goal, she said, it could yield about $2 billion in cost savings and save more than 20 billion kilowatt-hours of electricity by 2020.
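Those figures line up with the consumption numbers quoted earlier in the article; a quick back-of-the-envelope check (the electricity price is my assumption):

```python
# Back-of-the-envelope check of the stated savings, using the consumption figure
# quoted earlier in the article. The electricity price is my assumption.

us_datacenter_kwh = 100e9        # ~100 billion kWh/yr as of 2013 (from the article)
efficiency_gain = 0.20           # the 20%-by-2020 goal
price_per_kwh = 0.10             # assumed average price, USD/kWh

saved_kwh = us_datacenter_kwh * efficiency_gain
print(f"Energy saved: {saved_kwh/1e9:.0f} billion kWh")               # ~20 billion kWh
print(f"Cost savings: ${saved_kwh * price_per_kwh/1e9:.0f} billion")  # ~$2 billion
```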

According to Hogan, the new focus on data centers aims to “challenge as many data center managers, operators as possible.” She said a main component of the initiative calls for companies to share best practices with others, thereby creating a "solution network" that can accelerate the spread of more efficient technologies.

Companies with small data center footprints can also join the challenge by making efficiency improvements at just one data center, on the order of a 25% increase in energy efficiency within five years, Hogan said.

She said data centers, from small ones located in office buildings to large ones that are massive structures of their own, are now “pervasive” around the U.S., “even if we don’t see them.”