Friday, 8 June 2012

LIKKUS

[Photo gallery: IRMANUHI]

Wednesday, 30 May 2012

Origins and Meanings of the Islamic (Hijri) Months

If you don't know the Islamic months by heart, let alone their meanings, you are probably not alone: the official calendar used in Indonesia follows the Gregorian months. Here is the meaning behind the Hijri months.

INTRODUCTION. The Islamic calendar is based on the cycles of the moon, unlike the ordinary (Gregorian) calendar, which is based on the sun. The Hijri calendar was instituted in the era of Caliph Umar bin Khattab, who fixed its starting point at the hijrah of the Prophet Muhammad (saw) from Mecca to Medina. The Hijri calendar likewise consists of 12 months, each of 29 or 30 days. The choice of 12 months accords with the word of Allah Subhana Wata'ala: "Indeed, the number of months with Allah is twelve months, in the decree of Allah on the day He created the heavens and the earth; of them, four are sacred. That is the upright religion, so do not wrong yourselves during them. And fight the polytheists all together as they fight you all together, and know that Allah is with the righteous." (QS At-Taubah (9): 36).

Even before the prophethood of Muhammad (saw), the Arabs were already using the months of this calendar. They simply did not number the years; rather than asking which year, they asked what year. We know, for example, that the Prophet (saw) was born in the Year of the Elephant. Abu Musa Al-Ash'ari, a governor in the time of Caliph Umar (r.a.), wrote to the Amirul Mukminin pointing out that letters from the caliph bore only a day and a month but no year, which was confusing. Caliph Umar then gathered the senior companions of the day: Utsman bin Affan (r.a.), Ali bin Abi Thalib (r.a.), Abdurrahman bin Auf (r.a.), Sa'ad bin Abi Waqqas (r.a.), Zubair bin Awwam (r.a.), and Thalhah bin Ubaidillah (r.a.). They deliberated over an Islamic calendar. Some proposed dating it from the birth of the Prophet (saw); others from his appointment as Messenger. The proposal that carried the day came from Ali bin Abi Thalib (r.a.): to date it from the hijrah of the Prophet (saw) from Mecca to Yathrib (Medina). Everyone agreed with Ali's suggestion, and the first year of the Islamic calendar was fixed at the year of the Prophet's hijrah.

The names of the months were taken from the month names already current among the Arabs of the time, who named their months after the natural and social conditions of particular seasons. Ramadhan, for example, was so called because in that month the air felt hot enough to scorch the skin. Here are the origins and meanings of the month names:

MUHARRAM means "forbidden" or "taboo". It is so named because shedding blood and waging war were forbidden in this month, a prohibition that remained in force into the early period of Islam.

SHAFAR means "empty". In this month the Arab men of old left their homes to travel, trade, and fight, leaving their settlements empty of men.

RABI'UL AWAL comes from rabi' (to settle) and awal (first): the time when the men who had gone wandering returned home, the first period of their settling back in. Many events of historic importance to Muslims fall in this month: the Prophet Muhammad (saw) was born, was appointed Messenger, made the hijrah, and also died in this month.

RABI'UL AKHIR means the last or closing period of the men's settling at home.

JUMADIL AWAL, the fifth month, comes from jumadi (dry) and awal (first). It marks the start of the dry season, when drought began to set in.

JUMADIL AKHIR means the end of the dry season.

RAJAB means "noble". The Arabs of old held this month in great honour, among other things by forbidding war during it.

SYA'BAN means "in groups", because in this month the Arabs customarily went out in groups to seek their livelihood. The important event for Muslims in this month was the moving of the qibla from Baitul Muqaddas to the Ka'bah (Baitullah).

RAMADHAN means "very hot". Ramadhan is the only month named in the Al-Quran, a month of singular virtue, sanctity, and distinction, on account of key events: Allah first revealed the verses of the Al-Quran in it; it contains Lailatul Qadar, the night of immense worth on which the angels descend to bless the believers at worship; and it was appointed the month of the obligatory fast. In this month the Muslims defeated the polytheists at the battle of Badr Kubra, and in this month too the Prophet Muhammad (saw) took control of the city of Mecca and put an end to the idol worship of the polytheists.

SYAWWAL means "happiness": people return to fitrah (purity) after completing the fast, paying zakat, and exchanging forgiveness, and that is what brings happiness.

DZULQAIDAH comes from dzul (owner of) and qa'dah (sitting). This month was a time of rest for the Arab men of old, which they enjoyed by sitting about at home.

DZULHIJJAH means "the one performing the hajj", for in this month Muslims, ever since the Prophet Adam (a.s.), have performed the pilgrimage.
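Because the months follow the moon, the Hijri year is roughly eleven days shorter than the solar year, which is why each Hijri month drifts back through the seasons. A quick back-of-envelope check (the constants are standard astronomical mean values, my own addition rather than figures from the post):

```python
# Rough comparison of the Hijri (lunar) and solar year lengths.
# The constants are standard astronomical means, assumed for illustration.
MEAN_SYNODIC_MONTH = 29.530589  # mean days between new moons
MEAN_TROPICAL_YEAR = 365.2422   # mean days in a solar year

hijri_year = 12 * MEAN_SYNODIC_MONTH     # ~354.37 days
drift = MEAN_TROPICAL_YEAR - hijri_year  # ~10.9 days per year

print(f"Mean Hijri year: {hijri_year:.2f} days")
print(f"Annual drift   : {drift:.2f} days")
print(f"Seasonal cycle : {MEAN_TROPICAL_YEAR / drift:.0f} years")  # ~34 years
```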

Monday, 12 March 2012

GLOBAL WARMING


Global warming
From Wikipedia, the free encyclopedia
This article is about the change in climate that Earth is currently experiencing. For general discussion of how Earth's climate can change, see Climate change.
Global mean land-ocean temperature change from 1880–2011, relative to the 1951–1980 mean. The black line is the annual mean and the red line is the 5-year running mean. The green bars show uncertainty estimates. Source: NASA GISS
The map shows the 10-year average (2000–2009) global mean temperature anomaly relative to the 1951–1980 mean. The largest temperature increases are in the Arctic and the Antarctic Peninsula. Source: NASA Earth Observatory[1]
Fossil fuel related CO2 emissions compared to five of IPCC's emissions scenarios. The dips are related to global recessions. Data from IPCC SRES scenarios; Data spreadsheet included with International Energy Agency's "CO2 Emissions from Fuel Combustion 2010 – Highlights"; and Supplemental IEA data. Image source: Skeptical Science

Global warming refers to the rising average temperature of Earth's atmosphere and oceans, which began in the late 19th century and is projected to continue. Since the early 20th century, Earth's average surface temperature has increased by about 0.8 °C (1.4 °F), with about two thirds of the increase occurring since 1980.[2] Warming of the climate system is unequivocal, and scientists are more than 90% certain that most of it is caused by increasing concentrations of greenhouse gases produced by human activities such as deforestation and burning fossil fuels.[3][4][5][6] These findings are recognized by the national science academies of all the major industrialized nations.[7][A]

Climate model projections are summarized in the 2007 Fourth Assessment Report (AR4) by the Intergovernmental Panel on Climate Change (IPCC). They indicate that during the 21st century the global surface temperature is likely to rise a further 1.1 to 2.9 °C (2 to 5.2 °F) for their lowest emissions scenario and 2.4 to 6.4 °C (4.3 to 11.5 °F) for their highest.[8] The ranges of these estimates arise from the use of models with differing sensitivity to greenhouse gas concentrations.[9][10]
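As a rough sketch of why differing sensitivities produce such a spread, one common simplification (not the IPCC's actual model ensemble) combines the logarithmic CO2 forcing formula of Myhre et al. (1998) with an assumed sensitivity parameter; the concentration and sensitivity values below are illustrative:

```python
import math

# Simplified CO2 radiative forcing: dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998).
# Sensitivities and concentrations are illustrative assumptions, not IPCC values.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

for lam in (0.5, 0.8, 1.2):      # equilibrium warming per unit forcing, K/(W/m^2)
    for c in (550, 970):         # low and high end-of-century CO2 levels, ppm
        print(f"lambda={lam}, CO2={c} ppm -> dT ~ {lam * co2_forcing(c):.1f} C")
```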

An increase in global temperature will cause sea levels to rise, will change the amount and pattern of precipitation, and will probably expand subtropical deserts.[11] Warming is expected to be strongest in the Arctic and would be associated with continuing retreat of glaciers, permafrost and sea ice. Other likely effects of the warming include more frequent occurrence of extreme-weather events including heat waves, droughts and heavy rainfall, species extinctions due to shifting temperature regimes, and changes in crop yields. Warming and related changes will vary from region to region around the globe, with projections being more robust in some areas than others.[12] If global mean temperature increases to 4 °C (7.2 °F) above preindustrial levels, the limits for human adaptation are likely to be exceeded in many parts of the world, while the limits for adaptation for natural systems would largely be exceeded throughout the world. Hence, the ecosystem services upon which human livelihoods depend would not be preserved.[13]

Most countries are parties to the United Nations Framework Convention on Climate Change (UNFCCC),[14] whose ultimate objective is to prevent "dangerous" anthropogenic (i.e., human-induced) climate change.[15] Parties to the UNFCCC have adopted a range of policies designed to reduce greenhouse gas emissions[16]:10[17][18][19]:9 and to assist in adaptation to global warming.[16]:13[19]:10[20][21] Parties to the UNFCCC have agreed that deep cuts in emissions are required,[22] and that future global warming should be limited to below 2.0 °C (3.6 °F) relative to the pre-industrial level.[22][B] Analyses published in 2011 by the United Nations Environment Programme[23] and the International Energy Agency[24] suggest that efforts as of the early 21st century to reduce emissions may not be stringent enough to meet the UNFCCC's 2 °C target.
Observed temperature changes
Main article: Instrumental temperature record
Two millennia of mean surface temperatures according to different reconstructions from climate proxies, each smoothed on a decadal scale, with the instrumental temperature record overlaid in black.

Evidence for warming of the climate system includes observed increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level.[25][26][27] The Earth's average surface temperature, expressed as a linear trend, rose by 0.74±0.18 °C over the period 1906–2005. The rate of warming over the last half of that period was almost double that for the period as a whole (0.13±0.03 °C per decade, versus 0.07±0.02 °C per decade). The urban heat island effect is very small, estimated to account for less than 0.002 °C of warming per decade since 1900.[28] Temperatures in the lower troposphere have increased between 0.13 and 0.22 °C (0.22 and 0.4 °F) per decade since 1979, according to satellite temperature measurements. Climate proxies show the temperature to have been relatively stable over the one or two thousand years before 1850, with regionally varying fluctuations such as the Medieval Warm Period and the Little Ice Age.[29]
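The decadal trends quoted above come from linear least-squares fits to the annual anomaly series. The sketch below shows the idea on synthetic data (an accelerating warming curve plus noise, not the real instrumental record):

```python
import numpy as np

# Least-squares trend estimation on a synthetic anomaly series.
# The series mimics accelerating warming; it is not real data.
rng = np.random.default_rng(42)
years = np.arange(1906, 2006)
t = (years - 1906) / 100.0
anomalies = 0.74 * t**2 + rng.normal(0.0, 0.08, years.size)

full_trend = np.polyfit(years, anomalies, 1)[0]
late_trend = np.polyfit(years[50:], anomalies[50:], 1)[0]
print(f"1906-2005 trend: {full_trend * 10:.2f} C/decade")  # ~0.07
print(f"1956-2005 trend: {late_trend * 10:.2f} C/decade")  # roughly double
```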

Recent estimates by NASA's Goddard Institute for Space Studies (GISS) and the National Climatic Data Center show that 2005 and 2010 tied for the planet's warmest year since reliable, widespread instrumental measurements became available in the late 19th century, exceeding 1998 by a few hundredths of a degree.[30][31][32] Estimates by the Climatic Research Unit (CRU) show 2005 as the second warmest year, behind 1998, with 2003 and 2010 tied for third warmest; however, "the error estimate for individual years ... is at least ten times larger than the differences between these three years."[33] The World Meteorological Organization (WMO) statement on the status of the global climate in 2010 explains that, "The 2010 nominal value of +0.53 °C ranks just ahead of those of 2005 (+0.52 °C) and 1998 (+0.51 °C), although the differences between the three years are not statistically significant..."[34]
NOAA graph of Global Annual Temperature Anomalies 1950–2011, showing the El Niño-Southern Oscillation

Temperatures in 1998 were unusually warm because global temperatures are affected by the El Niño-Southern Oscillation (ENSO), and the strongest El Niño of the past century occurred during that year.[35] Global temperature is subject to short-term fluctuations that overlay long-term trends and can temporarily mask them. The relative stability in temperature from 2002 to 2009 is consistent with such an episode.[36][37] 2010 was also an El Niño year. On the low swing of the oscillation, 2011, a La Niña year, was cooler, but it was still the 11th warmest year since records began in 1880. Of the 13 warmest years since 1880, 11 were the years from 2001 to 2011. Over the more recent record, 2011 was the warmest "La Niña year" in the period from 1950 to 2011, and was close to 1997, which was not at the lowest point of the cycle.[38]

Temperature changes vary over the globe. Since 1979, land temperatures have increased about twice as fast as ocean temperatures (0.25 °C per decade against 0.13 °C per decade).[39] Ocean temperatures increase more slowly than land temperatures because of the larger effective heat capacity of the oceans and because the ocean loses more heat by evaporation.[40] The Northern Hemisphere warms faster than the Southern Hemisphere because it has more land and because it has extensive areas of seasonal snow and sea-ice cover subject to ice-albedo feedback. Although more greenhouse gases are emitted in the Northern than Southern Hemisphere this does not contribute to the difference in warming because the major greenhouse gases persist long enough to mix between hemispheres.[41]

The thermal inertia of the oceans and slow responses of other indirect effects mean that climate can take centuries or longer to adjust to changes in forcing. Climate commitment studies indicate that even if greenhouse gases were stabilized at 2000 levels, a further warming of about 0.5 °C (0.9 °F) would still occur.[42]
Initial causes of temperature changes (external forcings)
Greenhouse effect schematic showing energy flows between space, the atmosphere, and earth's surface. Energy exchanges are expressed in watts per square meter (W/m2).
This graph, known as the "Keeling Curve", shows the long-term increase of atmospheric carbon dioxide (CO2) concentrations from 1958–2008. Monthly CO2 measurements display seasonal oscillations in an upward trend; each year's maximum occurs during the Northern Hemisphere's late spring, and declines during its growing season as plants remove some atmospheric CO2.

External forcing refers to processes external to the climate system (though not necessarily external to Earth) that influence climate. Climate responds to several types of external forcing, such as radiative forcing due to changes in atmospheric composition (mainly greenhouse gas concentrations), changes in solar luminosity, volcanic eruptions, and variations in Earth's orbit around the Sun.[43]:0 Attribution of recent climate change focuses on the first three types of forcing. Orbital cycles vary slowly over tens of thousands of years and at present are in an overall cooling trend which would be expected to lead towards an ice age, but the 20th century instrumental temperature record shows a sudden rise in global temperatures.[44]
Greenhouse gases
Main articles: Greenhouse gas, Greenhouse effect, Radiative forcing, and Carbon dioxide in Earth's atmosphere

The greenhouse effect is the process by which absorption and emission of infrared radiation by gases in the atmosphere warm a planet's lower atmosphere and surface. It was proposed by Joseph Fourier in 1824 and was first investigated quantitatively by Svante Arrhenius in 1896.[45]

Naturally occurring amounts of greenhouse gases have a mean warming effect of about 33 °C (59 °F).[46][C] The major greenhouse gases are water vapor, which causes about 36–70% of the greenhouse effect; carbon dioxide (CO2), which causes 9–26%; methane (CH4), which causes 4–9%; and ozone (O3), which causes 3–7%.[47][48][49] Clouds also affect the radiation balance through cloud forcings similar to greenhouse gases.
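The ~33 °C figure can be recovered with a standard textbook energy-balance estimate: compare the temperature Earth would have as a bare blackbody with the observed surface mean. The solar constant and albedo below are conventional values, not taken from this article:

```python
# Back-of-envelope estimate of the natural greenhouse effect.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361.0  # W/m^2 at Earth's orbit
ALBEDO = 0.30            # fraction of sunlight reflected back to space

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4   # averaged over the whole sphere
t_effective = (absorbed / SIGMA) ** 0.25       # ~255 K without an atmosphere
t_surface = 288.0                              # observed mean surface temperature

print(f"Effective temperature: {t_effective:.0f} K")
print(f"Greenhouse warming   : {t_surface - t_effective:.0f} K")  # ~33 K
```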

Human activity since the Industrial Revolution has increased the amount of greenhouse gases in the atmosphere, leading to increased radiative forcing from CO2, methane, tropospheric ozone, CFCs and nitrous oxide. The concentrations of CO2 and methane have increased by 36% and 148% respectively since 1750.[50] These levels are much higher than at any time during the last 800,000 years, the period for which reliable data has been extracted from ice cores.[51][52][53][54] Less direct geological evidence indicates that CO2 values higher than this were last seen about 20 million years ago.[55] Fossil fuel burning has produced about three-quarters of the increase in CO2 from human activity over the past 20 years. The rest of this increase is caused mostly by changes in land-use, particularly deforestation.[56]
Per capita greenhouse gas emissions in 2005, including land-use change.
Total greenhouse gas emissions in 2005, including land-use change.

Over the last three decades of the 20th century, gross domestic product per capita and population growth were the main drivers of increases in greenhouse gas emissions.[57] CO2 emissions are continuing to rise due to the burning of fossil fuels and land-use change.[58][59]:71 Emissions can be attributed to different regions. The two figures opposite show annual greenhouse gas emissions for the year 2005, including land-use change. Attribution of emissions due to land-use change is a controversial issue.[60][61]:289

Emissions scenarios, estimates of future greenhouse gas emission levels, depend upon uncertain economic, sociological, technological, and natural developments.[62] In most scenarios, emissions continue to rise over the century, while in a few, emissions are reduced.[63][64] Fossil fuel reserves are abundant, and will not limit carbon emissions in the 21st century.[65] Emission scenarios, combined with modelling of the carbon cycle, have been used to produce estimates of how atmospheric concentrations of greenhouse gases might change in the future. Using the six IPCC SRES "marker" scenarios, models suggest that by the year 2100, the atmospheric concentration of CO2 could range between 541 and 970 ppm.[66] This is an increase of 90–250% above the concentration in the year 1750.
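The 90–250% range can be checked directly against the usual pre-industrial baseline of about 280 ppm:

```python
# Arithmetic check of the quoted SRES concentration range against 1750 levels.
PREINDUSTRIAL_PPM = 280.0  # standard estimate for the year 1750
for projection in (541, 970):
    increase = (projection / PREINDUSTRIAL_PPM - 1) * 100
    print(f"{projection} ppm -> {increase:.0f}% above 1750")
# Prints ~93% and ~246%, matching the quoted 90-250% range.
```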

The popular media and the public often confuse global warming with ozone depletion, i.e., the destruction of stratospheric ozone by chlorofluorocarbons.[67][68] Although there are a few areas of linkage, the relationship between the two is not strong. Reduced stratospheric ozone has had a slight cooling influence on surface temperatures, while increased tropospheric ozone has had a somewhat larger warming effect.[69]
Particulates and soot
Ship tracks over the Atlantic Ocean on the east coast of the United States. The climatic impacts from particulate forcing could have a large effect on climate through the indirect effect.

Global dimming, a gradual reduction in the amount of global direct irradiance at the Earth's surface, was observed from 1961 until at least 1990.[70] The main cause of this dimming is particulates produced by volcanoes and human-made pollutants, which exert a cooling effect by increasing the reflection of incoming sunlight. The effects of the products of fossil fuel combustion – CO2 and aerosols – have largely offset one another in recent decades, so that net warming has been due to the increase in non-CO2 greenhouse gases such as methane.[71] Radiative forcing due to particulates is temporally limited: wet deposition gives them an atmospheric lifetime of about one week. Carbon dioxide has a lifetime of a century or more, and as such, changes in particulate concentrations will only delay climate changes due to carbon dioxide.[72]

In addition to their direct effect by scattering and absorbing solar radiation, particulates have indirect effects on the radiation budget.[73] Sulfates act as cloud condensation nuclei and thus lead to clouds that have more and smaller cloud droplets. These clouds reflect solar radiation more efficiently than clouds with fewer and larger droplets, known as the Twomey effect.[74] This effect also causes droplets to be of more uniform size, which reduces growth of raindrops and makes the cloud more reflective to incoming sunlight, known as the Albrecht effect.[75] Indirect effects are most noticeable in marine stratiform clouds, and have very little radiative effect on convective clouds. Indirect effects of particulates represent the largest uncertainty in radiative forcing.[76]

Soot may cool or warm the surface, depending on whether it is airborne or deposited. Atmospheric soot directly absorbs solar radiation, which heats the atmosphere and cools the surface. In isolated areas with high soot production, such as rural India, as much as 50% of surface warming due to greenhouse gases may be masked by atmospheric brown clouds.[77] When deposited, especially on glaciers or on ice in arctic regions, the lower surface albedo can also directly heat the surface.[78] The influences of particulates, including black carbon, are most pronounced in the tropics and sub-tropics, particularly in Asia, while the effects of greenhouse gases are dominant in the extratropics and southern hemisphere.[79]
Satellite observations of Total Solar Irradiance from 1979–2006.
Solar activity
Main articles: Solar variation and Solar wind

Solar variations causing changes in solar radiation energy reaching the Earth have been the cause of past climate changes.[80] The effect of changes in solar forcing in recent decades is uncertain, but small, with some studies showing a slight cooling effect,[81] while other studies suggest a slight warming effect.[43][82][83][84]

Greenhouse gases and solar forcing affect temperatures in different ways. While both increased solar activity and increased greenhouse gases are expected to warm the troposphere, an increase in solar activity should warm the stratosphere while an increase in greenhouse gases should cool the stratosphere.[43] Radiosonde (weather balloon) data show the stratosphere has cooled over the period since observations began (1958), though there is greater uncertainty in the early radiosonde record. Satellite observations, which have been available since 1979, also show cooling.[85]

A related hypothesis, proposed by Henrik Svensmark, is that magnetic activity of the sun deflects cosmic rays that may influence the generation of cloud condensation nuclei and thereby affect the climate.[86] Other research has found no relation between warming in recent decades and cosmic rays.[87][88] The influence of cosmic rays on cloud cover is about a factor of 100 lower than needed to explain the observed changes in clouds or to be a significant contributor to present-day climate change.[89]

Studies in 2011 have indicated that solar activity may be slowing, and that the next solar cycle could be delayed. To what extent is not yet clear; Solar Cycle 25 is due to start in 2020, but may be delayed to 2022 or even later. It is even possible that the Sun could be heading towards another Maunder Minimum. While there is not yet a definitive link between solar sunspot activity and global temperatures, the scientists conducting the solar activity study believe that global greenhouse gas emissions would prevent any possible cold snap.[90]

The fact we still see a positive imbalance despite the prolonged solar minimum isn't a surprise given what we've learned about the climate system...But it's worth noting, because this provides unequivocal evidence that the sun is not the dominant driver of global warming.[91]



In line with the other details mentioned above, director of NASA's Goddard Institute for Space Studies James Hansen says that the sun is not nearly the biggest factor in global warming. Discussing the fact that low amounts of solar activity between 2005 and 2010 had hardly any effect on global warming, Hansen says it is further evidence that greenhouse gases are the largest culprit; that is, he supports the theory advanced by "nearly all climate scientists," including the IPCC.[91]
Feedback
Main article: Climate change feedback

Feedback is a process in which changing one quantity changes a second quantity, and the change in the second quantity in turn changes the first. Positive feedback increases the change in the first quantity while negative feedback reduces it. Feedback is important in the study of global warming because it may amplify or diminish the effect of a particular process.

The main positive feedback in the climate system is the water vapor feedback. The main negative feedback is radiative cooling through the Stefan–Boltzmann law, which increases as the fourth power of temperature. Positive and negative feedbacks are not imposed as assumptions in the models, but are instead emergent properties that result from the interactions of basic dynamical and thermodynamic processes.
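A minimal numeric sketch of these two ideas: the Planck (Stefan–Boltzmann) response sets the no-feedback warming, and the textbook gain formula dT = dT0 / (1 - f) shows how positive feedbacks amplify it. The ~3.7 W/m² forcing for doubled CO2 is standard; the feedback factors are assumed for illustration:

```python
# Planck response plus the standard feedback-gain formula dT = dT0 / (1 - f).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
T_EMIT = 255.0          # K, Earth's effective emission temperature

planck = 4 * SIGMA * T_EMIT**3  # extra outgoing radiation per K of warming, ~3.8 W/m^2
dT0 = 3.7 / planck              # no-feedback warming for doubled CO2, ~1.0 K

for f in (0.0, 0.4, 0.65):      # net feedback factors, assumed for illustration
    print(f"f={f:.2f}: dT = {dT0 / (1 - f):.1f} K")
```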

A wide range of potential feedback processes exist, such as Arctic methane release and ice-albedo feedback. Consequently, potential tipping points may exist, beyond which climate change could become abrupt.[92]

For example, the "emission scenarios" used by IPCC in its 2007 report primarily examined greenhouse gas emissions from human sources. In 2011, a joint study by the US National Snow and Ice Data Center and National Oceanic and Atmospheric Administration calculated the additional greenhouse gas emissions that would emanate from melted and decomposing permafrost, even if policymakers attempt to reduce human emissions from the A1FI scenario to the A1B scenario.[93] The team found that even at the much lower level of human emissions, permafrost thawing and decomposition would still add 190 Gt C of permafrost carbon to the atmosphere on top of the human sources. Importantly, the team made three extremely conservative assumptions: (1) that policymakers will embrace the A1B scenario instead of the A1FI scenario, (2) that all of the carbon would be released as carbon dioxide rather than as methane, even though methane release is more likely and methane has 72 times the greenhouse warming power of CO2 over a 20-year lifetime, and (3) that their model did not project additional temperature rise caused by the release of these additional gases.[93][94] These very conservative permafrost carbon dioxide emissions are equivalent to about half of all carbon released from fossil fuel burning since the dawn of the Industrial Age,[95] and are enough to raise atmospheric concentrations by an additional 87±29 ppm beyond human emissions. Once initiated, permafrost carbon forcing (PCF) is irreversible, is strong compared to other global sources and sinks of atmospheric CO2, and due to thermal inertia will continue for many years even if atmospheric warming stops.[93] A great deal of this permafrost carbon is actually being released as highly flammable methane rather than carbon dioxide.[96] The IPCC's 2007 temperature projections did not take any of the permafrost carbon emissions into account and therefore underestimate the degree of expected climate change.[93][94]
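For a sense of scale, the study's 87±29 ppm figure is consistent with the standard conversion of roughly 2.13 Gt of carbon per ppm of atmospheric CO2 (the conversion factor is my addition, not from the study):

```python
# Converting the projected permafrost carbon release into a CO2 concentration.
GTC_PER_PPM = 2.13   # standard ~2.13 Gt carbon per ppm of atmospheric CO2
release_gtc = 190.0  # Gt C, figure quoted above
print(f"~{release_gtc / GTC_PER_PPM:.0f} ppm")  # ~89 ppm, within the 87±29 range
```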

Other research published in 2011 found that increased emissions of methane could instigate significant feedbacks that amplify the warming attributable to the methane alone. The researchers found that a 2.5-fold increase in methane emissions would cause indirect effects that increase the warming 250% above that of the methane alone. For a 5.2-fold increase, the indirect effects would be 400% of the warming from the methane alone.[97]
Climate models
Main article: Global climate model
Calculations of global warming prepared in or before 2001 from a range of climate models under the SRES A2 emissions scenario, which assumes no action is taken to reduce emissions and regionally divided economic development.
The geographic distribution of surface warming during the 21st century calculated by the HadCM3 climate model if a business as usual scenario is assumed for economic growth and greenhouse gas emissions. In this figure, the globally averaged warming corresponds to 3.0 °C (5.4 °F).

A climate model is a computerized representation of the five components of the climate system: atmosphere, hydrosphere, cryosphere, land surface, and biosphere.[98] Such models are based on physical principles including fluid dynamics, thermodynamics and radiative transfer. There can be components which represent air movement, temperature, clouds, and other atmospheric properties; ocean temperature, salt content, and circulation; ice cover on land and sea; the transfer of heat and moisture from soil and vegetation to the atmosphere; chemical and biological processes; and others.

Although researchers attempt to include as many processes as possible, simplifications of the actual climate system are inevitable because of the constraints of available computer power and limitations in knowledge of the climate system. Results from models can also vary due to different greenhouse gas inputs and the model's climate sensitivity. For example, the uncertainty in IPCC's 2007 projections is caused by (1) the use of multiple models with differing sensitivity to greenhouse gas concentrations, (2) the use of differing estimates of humanity's future greenhouse gas emissions, and (3) any additional emissions from climate feedbacks that were not included in the models IPCC used to prepare its report, e.g., greenhouse gas releases from permafrost.[93]

The models do not assume the climate will warm due to increasing levels of greenhouse gases. Instead the models predict how greenhouse gases will interact with radiative transfer and other physical processes. One of the mathematical results of these complex equations is a prediction of whether warming or cooling will occur.[99]
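A zero-dimensional energy-balance model, the simplest relative of the models described above, illustrates the point in miniature: the code assumes nothing about warming; the temperature response emerges from the radiation balance once a forcing term is added. All parameter values here are illustrative assumptions:

```python
# Minimal zero-dimensional energy-balance model (an illustrative sketch).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S, ALBEDO = 1361.0, 0.30
EPSILON = 0.61          # effective emissivity, tuned so the unforced state is ~288 K
C = 2.0e8               # J m^-2 K^-1, rough ocean mixed-layer heat capacity
DT = 86400.0            # time step of one day, in seconds

def step(temp, forcing):
    net = S * (1 - ALBEDO) / 4 - EPSILON * SIGMA * temp**4 + forcing
    return temp + net * DT / C

temp = 288.0
for _ in range(200 * 365):  # two centuries under a constant 3.7 W/m^2 forcing
    temp = step(temp, 3.7)
print(f"After 200 years: {temp:.1f} K")  # ~289.2 K, about 1 K of emergent warming
```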

Recent research has called special attention to the need to refine models with respect to the effect of clouds[100] and the carbon cycle.[101][102][103]

Models are also used to help investigate the causes of recent climate change by comparing the observed changes to those that the models project from various natural and human-derived causes. Although these models do not unambiguously attribute the warming that occurred from approximately 1910 to 1945 to either natural variation or human effects, they do indicate that the warming since 1970 is dominated by man-made greenhouse gas emissions.[43]

The physical realism of models is tested by examining their ability to simulate contemporary or past climates.[104]

Climate models produce a good match to observations of global temperature changes over the last century, but do not simulate all aspects of climate.[105] Not all effects of global warming are accurately predicted by the climate models used by the IPCC. Observed Arctic shrinkage has been faster than that predicted.[106] Precipitation has increased in proportion to atmospheric humidity, and hence significantly faster than global climate models predict.[107][108]
Expected effects
Main articles: Effects of global warming and Regional effects of global warming

"Detection" is the process of demonstrating that climate has changed in some defined statistical sense, without providing a reason for that change. Detection does not imply attribution of the detected change to a particular cause. "Attribution" of causes of climate change is the process of establishing the most likely causes for the detected change with some defined level of confidence.[109] Detection and attribution may also be applied to observed changes in physical, ecological and social systems.[110]
Sparse records indicate that glaciers have been retreating since the early 1800s. In the 1950s measurements began that allow the monitoring of glacial mass balance, reported to the World Glacier Monitoring Service (WGMS) and the National Snow and Ice Data Center (NSIDC).
Natural systems

Global warming has been detected in a number of systems. Some of these changes, e.g., based on the instrumental temperature record, have been described in the section on temperature changes. Rising sea levels and observed decreases in snow and ice extent are consistent with warming.[111] Most of the increase in global average temperature since the mid-20th century is, with high probability,[D] attributable to human-induced changes in greenhouse gas concentrations.[112]

Even with policies to reduce emissions, global emissions are still expected to continue to grow over time.[113]

In the IPCC Fourth Assessment Report, across a range of future emission scenarios, model-based estimates of sea level rise for the end of the 21st century (the year 2090–2099, relative to 1980–1999) range from 0.18 to 0.59 m. These estimates, however, were not given a likelihood due to a lack of scientific understanding, nor was an upper bound given for sea level rise. On the timescale of centuries to millennia, the melting of ice sheets could result in even higher sea level rise. Partial deglaciation of the Greenland ice sheet, and possibly the West Antarctic Ice Sheet, could contribute 4–6 metres (13 to 20 ft) or more to sea level rise.[114]

Changes in regional climate are expected to include greater warming over land, with most warming at high northern latitudes, and least warming over the Southern Ocean and parts of the North Atlantic Ocean.[113] Snow cover area and sea ice extent are expected to decrease, with the Arctic expected to be largely ice-free in September by 2037.[115] The frequency of hot extremes, heat waves, and heavy precipitation will very likely increase.
Ecological systems

In terrestrial ecosystems, the earlier timing of spring events, and poleward and upward shifts in plant and animal ranges, have been linked with high confidence to recent warming.[111] Future climate change is expected to particularly affect certain ecosystems, including tundra, mangroves, and coral reefs.[113] It is expected that most ecosystems will be affected by higher atmospheric CO2 levels, combined with higher global temperatures.[116] Overall, it is expected that climate change will result in the extinction of many species and reduced diversity of ecosystems.[117]
Social systems

Vulnerability of human societies to climate change mainly lies in the effects of extreme-weather events rather than gradual climate change.[118] Impacts of climate change so far include adverse effects on small islands,[119] adverse effects on indigenous populations in high-latitude areas,[120] and small but discernible effects on human health.[121] Over the 21st century, climate change is likely to adversely affect hundreds of millions of people through increased coastal flooding, reductions in water supplies, increased malnutrition and increased health impacts.[122]

Future warming of around 3 °C (by 2100, relative to 1990–2000) could result in increased crop yields in mid- and high-latitude areas, but in low-latitude areas, yields could decline, increasing the risk of malnutrition.[119] A similar regional pattern of net benefits and costs could occur for economic (market-sector) effects.[121] Warming above 3 °C could result in crop yields falling in temperate regions, leading to a reduction in global food production.[123] Most economic studies suggest losses of world gross domestic product (GDP) for this magnitude of warming.[124][125]
Responses to global warming
Mitigation
Main article: Climate change mitigation
See also: Fee and dividend

Reducing the amount of future climate change is called mitigation of climate change. The IPCC defines mitigation as activities that reduce greenhouse gas (GHG) emissions, or enhance the capacity of carbon sinks to absorb GHGs from the atmosphere.[126] Many countries, both developing and developed, are aiming to use cleaner, less polluting technologies.[59]:192 Use of these technologies aids mitigation and could result in substantial reductions in CO2 emissions. Policies include targets for emissions reductions, increased use of renewable energy, and increased energy efficiency. Studies indicate substantial potential for future reductions in emissions.[127]

In order to limit warming to within the lower range described in the IPCC's "Summary Report for Policymakers",[128] it will be necessary to adopt policies that will limit greenhouse gas emissions to one of several significantly different scenarios described in the full report.[129] This will become more and more difficult with each year of increasing volumes of emissions, and even more drastic measures will be required in later years to stabilize a desired atmospheric concentration of greenhouse gases. Energy-related carbon-dioxide (CO2) emissions in 2010 were the highest in history, breaking the prior record set in 2008.[130]

Since even in the most optimistic scenario, fossil fuels are going to be used for years to come, mitigation may also involve carbon capture and storage, a process that traps CO2 produced by factories and gas or coal power stations and then stores it, usually underground.[131]
Adaptation
Main article: Adaptation to global warming

Other policy responses include adaptation to climate change. Adaptation to climate change may be planned, either in reaction to or anticipation of climate change, or spontaneous, i.e., without government intervention.[132] The ability to adapt is closely linked to social and economic development.[127] Even societies with high capacities to adapt are still vulnerable to climate change. Planned adaptation is already occurring on a limited basis. The barriers, limits, and costs of future adaptation are not fully understood.
Geoengineering

A body of the scientific literature has developed which considers alternative geoengineering techniques for climate change mitigation.[133] In the IPCC's Fourth Assessment Report (published in 2007) Working Group III (WG3) assessed some "apparently promising" geoengineering techniques, including ocean fertilization, capturing and sequestering CO2, and techniques for reducing the amount of sunlight absorbed by the Earth's atmospheric system.[133] The IPCC's overall conclusion was that geoengineering options remained "largely speculative and unproven, (...) with the risk of unknown side-effects."[134] In the IPCC's[134] judgement, reliable cost estimates for geoengineering options had not yet been published.

As most geoengineering techniques would affect the entire globe, deployment would likely require global public acceptance and an adequate global legal and regulatory framework, as well as significant further scientific research.[135]
Views on global warming
Main articles: Global warming controversy and Politics of global warming
See also: Scientific opinion on climate change and Public opinion on climate change

There are different views over what the appropriate policy response to climate change should be.[136] These competing views weigh the benefits of limiting emissions of greenhouse gases against the costs. In general, it seems likely that climate change will impose greater damages and risks in poorer regions.[137]
Global warming controversy

The global warming controversy refers to a variety of disputes, significantly more pronounced in the popular media than in the scientific literature,[138][139] regarding the nature, causes, and consequences of global warming. The disputed issues include the causes of increased global average air temperature, especially since the mid-20th century, whether this warming trend is unprecedented or within normal climatic variations, whether humankind has contributed significantly to it, and whether the increase is wholly or partially an artifact of poor measurements. Additional disputes concern estimates of climate sensitivity, predictions of additional warming, and what the consequences of global warming will be.

In the scientific literature, there is a strong consensus that global surface temperatures have increased in recent decades and that the trend is caused mainly by human-induced emissions of greenhouse gases. No scientific body of national or international standing disagrees with this view,[140][141] though a few organisations hold non-committal positions.

From 1990 to 1997 in the United States, conservative think tanks mobilized to undermine the legitimacy of global warming as a social problem. They challenged the scientific evidence; argued that global warming will have benefits; and asserted that proposed solutions would do more harm than good.[142]
Politics
Article 2 of the UN Framework Convention refers explicitly to "stabilization of greenhouse gas concentrations."[143] In order to stabilize the atmospheric concentration of CO2, emissions worldwide would need to be dramatically reduced from their present level.[144]

Most countries are Parties to the United Nations Framework Convention on Climate Change (UNFCCC).[145] The ultimate objective of the Convention is to prevent "dangerous" human interference with the climate system.[146] As is stated in the Convention, this requires that GHG concentrations are stabilized in the atmosphere at a level where ecosystems can adapt naturally to climate change, food production is not threatened, and economic development can proceed in a sustainable fashion.[147] The Framework Convention was agreed in 1992, but since then, global emissions have risen.[148] During negotiations, the G77 (a lobbying group in the United Nations representing 133 developing nations)[149]:4 pushed for a mandate requiring developed countries to "[take] the lead" in reducing their emissions.[150] This was justified on the basis that: the developed world's emissions had contributed most to the stock of GHGs in the atmosphere; per-capita emissions (i.e., emissions per head of population) were still relatively low in developing countries; and the emissions of developing countries would grow to meet their development needs.[61]:290 This mandate was sustained in the Kyoto Protocol to the Framework Convention,[61]:290 which entered into legal effect in 2005.[151]

In ratifying the Kyoto Protocol, most developed countries accepted legally binding commitments to limit their emissions. These first-round commitments expire in 2012.[151] US President George W. Bush rejected the treaty on the basis that "it exempts 80% of the world, including major population centers such as China and India, from compliance, and would cause serious harm to the US economy."[149]:5

At the 15th UNFCCC Conference of the Parties, held in 2009 at Copenhagen, several UNFCCC Parties produced the Copenhagen Accord.[152] Parties associated with the Accord (140 countries, as of November 2010)[153]:9 aim to limit the future increase in global mean temperature to below 2 °C.[154] A preliminary assessment published in November 2010 by the United Nations Environment Programme (UNEP) suggests a possible "emissions gap" between the voluntary pledges made in the Accord and the emissions cuts necessary to have a "likely" (greater than 66% probability) chance of meeting the 2 °C objective.[153]:10–14 The UNEP assessment takes the 2 °C objective as being measured against the pre-industrial global mean temperature level. To have a likely chance of meeting the 2 °C objective, assessed studies generally indicated the need for global emissions to peak before 2020, with substantial declines in emissions thereafter.

The 16th Conference of the Parties (COP16) was held at Cancún in 2010. It produced an agreement, not a binding treaty, that the Parties should take urgent action to reduce greenhouse gas emissions to meet a goal of limiting global warming to 2 °C above pre-industrial temperatures. It also recognized the need to consider strengthening the goal to a global average rise of 1.5 °C.[155]
Public opinion
Based on Rasmussen polling of 1,000 adults in the USA conducted 29–30 July 2011.[156]

In 2007–2008 Gallup Polls surveyed 127 countries. Over a third of the world's population was unaware of global warming, with people in developing countries less aware than those in developed, and those in Africa the least aware. Of those aware, Latin America leads in belief that temperature changes are a result of human activities while Africa, parts of Asia and the Middle East, and a few countries from the Former Soviet Union lead in the opposite belief.[157] In the Western world, opinions over the concept and the appropriate responses are divided. Nick Pidgeon of Cardiff University said that "results show the different stages of engagement about global warming on each side of the Atlantic", adding, "The debate in Europe is about what action needs to be taken, while many in the US still debate whether climate change is happening."[158][159] A 2010 poll by the Office for National Statistics found that 75% of UK respondents were at least "fairly convinced" that the world's climate is changing, compared to 87% in a similar survey in 2006.[160] A January 2011 ICM poll in the UK found 83% of respondents viewed climate change as a current or imminent threat, while 14% said it was no threat. Opinion was unchanged from an August 2009 poll asking the same question, though there had been a slight polarisation of opposing views.[161]

A survey in October 2009 by the Pew Research Center for the People & the Press showed decreasing public perception in the US that global warming was a serious problem. All political persuasions showed reduced concern, with the lowest concern among Republicans, only 35% of whom considered there to be solid evidence of global warming.[162] The cause of this marked difference in public opinion between the US and the global public is uncertain but the hypothesis has been advanced that clearer communication by scientists both directly and through the media would be helpful in adequately informing the American public of the scientific consensus and the basis for it.[163] The US public appears to be unaware of the extent of scientific consensus regarding the issue, with 59% believing that scientists disagree "significantly" on global warming.[164]

By 2010, with 111 countries surveyed, Gallup determined that there was a substantial decrease in the number of Americans and Europeans who viewed global warming as a serious threat. In the US, a little over half the population (53%) now viewed it as a serious concern for either themselves or their families; this was 10 percentage points below the 2008 poll (63%). Latin America had the biggest rise in concern, with 73% saying global warming was a serious threat to their families.[165] That global poll also found that people are more likely to attribute global warming to human activities than to natural causes, except in the USA where nearly half (47%) of the population attributed global warming to natural causes.[166]

On the other hand, in May 2011 a joint poll by Yale and George Mason Universities found that nearly half the people in the USA (47%) attribute global warming to human activities, compared to 36% blaming it on natural causes. Only 5% of the 35% who were "disengaged", "doubtful", or "dismissive" of global warming were aware that 97% of publishing US climate scientists agree global warming is happening and is primarily caused by humans.[167]

Researchers at the University of Michigan have found that the public's belief as to the causes of global warming depends on the wording choice used in the polls.[168]

In the United States, according to the Public Policy Institute of California's (PPIC) eleventh annual survey on environmental policy issues, 75% said they believe global warming is a very serious or somewhat serious threat to the economy and quality of life in California.[169]

A July 2011 Rasmussen Reports poll found that 69% of adults in the USA believe it is at least somewhat likely that some scientists have falsified global warming research.[156]

A September 2011 Angus Reid Public Opinion poll found that Britons (43%) are less likely than Americans (49%) or Canadians (52%) to say that "global warming is a fact and is mostly caused by emissions from vehicles and industrial facilities." The same poll found that 20% of Americans, 20% of Britons and 14% of Canadians think "global warming is a theory that has not yet been proven."[170]
Other views

Most scientists agree that humans are contributing to observed climate change.[58][171] National science academies have called on world leaders for policies to cut global emissions.[172] However, some scientists and non-scientists question aspects of climate-change science.[171][173][174]

Organizations such as the libertarian Competitive Enterprise Institute, conservative commentators, and some companies such as ExxonMobil have challenged IPCC climate change scenarios, funded scientists who disagree with the scientific consensus, and provided their own projections of the economic cost of stricter controls.[175][176][177][178] In the finance industry, Deutsche Bank has set up an institutional climate change investment division (DBCCA),[179] which has commissioned and published research[180] on the issues and debate surrounding global warming.[181] Environmental organizations and public figures have emphasized changes in the climate and the risks they entail, while promoting adaptation to changes in infrastructural needs and emissions reductions.[182] Some fossil fuel companies have scaled back their efforts in recent years,[183] or called for policies to reduce global warming.[184]
Etymology

The term global warming was probably first used in its modern sense on 8 August 1975, in a paper by Wally Broecker in the journal Science titled "Are we on the brink of a pronounced global warming?".[185][186][187] Broecker's choice of words was new and represented a significant recognition that the climate was warming; previously the phrasing used by scientists was "inadvertent climate modification," because while it was recognized humans could change the climate, no one was sure which direction it was going.[188] The National Academy of Sciences first used global warming in a 1979 paper called the Charney Report, which said: "if carbon dioxide continues to increase, [we find] no reason to doubt that climate changes will result and no reason to believe that these changes will be negligible."[189] The report drew a distinction, referring to surface temperature changes as global warming and to the other changes caused by increased CO2 as climate change.[188]

Global warming became more widely popular after 1988 when NASA climate scientist James Hansen used the term in a testimony to Congress.[188] He said: "global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and the observed warming."[190] His testimony was widely reported and afterward global warming was commonly used by the press and in public discourse.[188]
See also

Environmental impact of the coal industry
Glossary of climate change
History of climate change science
Index of climate change articles


Notes

^ The 2001 joint statement was signed by the national academies of science of Australia, Belgium, Brazil, Canada, the Caribbean, the People's Republic of China, France, Germany, India, Indonesia, Ireland, Italy, Malaysia, New Zealand, Sweden, and the UK.[191] The 2005 statement added Japan, Russia, and the U.S. The 2007 statement added Mexico and South Africa. The Network of African Science Academies, and the Polish Academy of Sciences have issued separate statements. Professional scientific societies include American Astronomical Society, American Chemical Society, American Geophysical Union, American Institute of Physics, American Meteorological Society, American Physical Society, American Quaternary Association, Australian Meteorological and Oceanographic Society, Canadian Foundation for Climate and Atmospheric Sciences, Canadian Meteorological and Oceanographic Society, European Academy of Sciences and Arts, European Geosciences Union, European Science Foundation, Geological Society of America, Geological Society of Australia, Geological Society of London-Stratigraphy Commission, InterAcademy Council, International Union of Geodesy and Geophysics, International Union for Quaternary Research, National Association of Geoscience Teachers, National Research Council (US), Royal Meteorological Society, and World Meteorological Organization.
^ Earth has already experienced almost 1/2 of the 2.0 °C (3.6 °F) described in the Cancun Agreement. In the last 100 years, Earth's average surface temperature increased by about 0.8 °C (1.4 °F) with about two thirds of the increase occurring over just the last three decades.[2]
^ Note that the greenhouse effect produces an average worldwide temperature increase of about 33 °C (59 °F) compared to black body predictions without the greenhouse effect, not an average surface temperature of 33 °C (91 °F). The average worldwide surface temperature is about 14 °C (57 °F).[46]
^ In the IPCC Fourth Assessment Report, published in 2007, this attribution is given a probability of greater than 90%, based on expert judgement.[192] According to the US National Research Council Report – Understanding and Responding to Climate Change – published in 2008, "[most] scientists agree that the warming in recent decades has been caused primarily by human activities that have increased the amount of greenhouse gases in the atmosphere."[58]

Monday, 30 January 2012

LAMP


LAMP (Linux, Apache, MySQL, and PHP) is an acronym for a solution stack of free, open-source software, referring to the first letters of its principal components: Linux (operating system), Apache HTTP Server, MySQL (database software), and PHP (or sometimes Perl or Python), which together can build a viable general-purpose web server.[1]

The GNU project advocates the term "GLAMP", since what is commonly called "Linux" is, in GNU's terminology, the GNU/Linux system.[2]

The exact combination of software included in a LAMP package may vary, especially with respect to the web scripting software, as PHP may be replaced or supplemented by Perl and/or Python.[3] Similar terms exist for essentially the same software suite (AMP) running on other operating systems, such as Microsoft Windows (WAMP), Mac OS (MAMP), Solaris (SAMP), iSeries (iAMP), or OpenBSD (OAMP).
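As a purely illustrative sketch of the scripting layer, here is a minimal WSGI application using Python in place of PHP, as the paragraph above notes is possible. In a LAMP-style deployment it would be served by Apache through mod_wsgi and talk to MySQL via a driver such as the third-party PyMySQL package; the host, credentials, and table below are hypothetical:

```python
import pymysql  # third-party MySQL driver, assumed installed

def application(environ, start_response):
    # Query MySQL for a row count; connection details are hypothetical.
    conn = pymysql.connect(host="localhost", user="webapp",
                           password="secret", database="demo")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM visitors")
            (count,) = cur.fetchone()
    finally:
        conn.close()
    body = f"{count} visitors so far".encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```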

Though the original authors of these programs did not design them all to work specifically with each other, the development philosophy and tool sets are shared and were developed in close conjunction. The software combination has become popular because it is free of cost, open-source, and therefore easily adaptable, and because of the ubiquity of its components which are bundled with most current Linux distributions.

When used together, they form a solution stack of technologies that support application servers.

Wednesday, 18 January 2012

Nuclear Crisis in Japan


The earthquake and tsunami that hit northern Japan on March 11, 2011 created the worst nuclear crisis since the Chernobyl disaster. The three active reactors at the Fukushima Daiichi Nuclear Power Station 170 miles north of Tokyo suffered meltdowns after the quake knocked out the plant’s power and the tsunami disabled the backup generators meant to keep cooling systems working. A series of explosions and fires led to the release of radioactive gases.

At least 80,000 people were evacuated from around the plant, and radioactive materials were detected in tap water as far away as Tokyo, as well as in agricultural produce like vegetables, tea and beef.

The blasts in the days after the earthquake cracked the containment vessel at one reactor and may have cracked two others, although the Tokyo Electric Power Company, which owns the plant, said most of the fuel stayed inside, avoiding the more catastrophic “China syndrome.”

In April, Japan raised its assessment of the accident from 5 to 7, the worst rating on an international scale, putting the disaster on par with the 1986 Chernobyl explosion, in an acknowledgement that the human and environmental consequences of the nuclear crisis could be dire and long-lasting. While the amount of radioactive materials released from Fukushima Daiichi so far has equaled about 10 percent of that released at Chernobyl, officials said that the radiation release from Fukushima could, in time, surpass levels seen in 1986.

In December, Prime Minister Yoshihiko Noda announced that technicians had regained control of reactors at the Fukushima Daiichi Nuclear Power Plant, declaring an end to the world’s worst nuclear disaster since Chernobyl. The government will now focus on removing the fuel stored at the site, opening up the ravaged reactors themselves and eventually dismantling the plant, a process that is expected to take at least four decades, Mr. Noda said.

But for many of the people of Fukushima, the crisis is far from over. More than 160,000 people remain displaced, and even as the government lifts evacuation orders for some communities, many are refusing to return home. And many experts still doubt the government’s assertion that the plant is now in a stable state — the equivalent of a “cold shutdown” — and worry that officials are declaring victory only to quell public anger over the accident.

The crisis at Fukushima had effects on Japan’s overall energy policy: in May, Prime Minister Naoto Kan, who had been criticized for showing a lack of leadership, said Japan would abandon plans to build new nuclear reactors and needed to “start from scratch” in creating a new energy policy with greater reliance on renewable energy and conservation.

Word in early June that the amount of radiation released in the first days of the crisis might have been more than twice the original estimate further eroded the credibility of the nuclear industry and the government. In July, Mr. Kan went a step beyond his May remarks, saying Japan should reduce and eventually eliminate its dependence on nuclear energy because the Fukushima accident had demonstrated the dangers of the technology.

Mr. Kan was forced out in August, replaced by Mr. Noda, who has taken a more pro-nuclear stance.

In interviews and public statements, some current and former government officials have admitted that Japanese authorities engaged in a pattern of withholding damaging information and denying facts of the nuclear disaster — in order, some of them said, to limit the size of costly and disruptive evacuations in land-scarce Japan and to avoid public questioning of the politically powerful nuclear industry. As the nuclear plant continues to release radiation, some of which has slipped into the nation’s food supply, public anger is growing at what many here see as an official campaign to play down the scope of the accident and the potential health risks.

Doubts About a ‘Cold Shutdown’

The Japanese government declared in December 2011 that it had finally regained control of the overheating reactors at the Fukushima Daiichi plant. But even before it was made, the announcement faced serious doubts from experts.

A disaster-response task force headed by Prime Minister Yoshihiko Noda announced that the plant’s three damaged reactors had been put into the equivalent of a “cold shutdown,” a technical term normally used to describe intact reactors with fuel cores that are in a safe and stable condition. Some experts said that the announcement reflected the government’s effort to fulfill a pledge to restore the plant’s cooling system by year’s end, not the true situation.

Other experts expressed concern that the government would declare victory only to appease growing public anger over the accident, and that it could deflect attention from remaining threats to the reactors’ safety. One of those — a large aftershock to the magnitude 9 earthquake on March 11, which could knock out the jury-rigged new cooling system that the plant’s operator hastily built after the accident — is considered a strong possibility by many seismologists.

Plans to Decommission the Reactors

Soon after declaring that the reactors at the Fukushima Daiichi plant had been put into the equivalent of a “cold shutdown,” the Japanese government announced plans for fully shutting them down. Doing so will take 40 years and require the use of robots to remove melted fuel that appears to be stuck to the bottom of the reactors’ containment vessels, according to a detailed government plan.

Japan’s nuclear crisis minister, Goshi Hosono, acknowledged that no country has ever had to clean up three destroyed reactors at the same time. Mr. Hosono told reporters the decommissioning faced challenges that were not totally predictable, but “we must do it even though we may face difficulties along the way.”

According to the plan, the plant’s operator, Tokyo Electric Power, will spend two years removing spent fuel rods from storage pools located in the same buildings as the damaged reactors. At least one of those pools, which are highly radioactive, was exposed by hydrogen explosions that destroyed the reactor buildings in the first days of the accident.

The most technically challenging step will be removing the melted fuel, a process that the government said will take 25 years and require new types of robots and other technologies that have yet to be developed. After the removal, fully decommissioning the reactors will take another 5 to 10 years, according to the plan.

Paying for It All

In December 2011, the Japanese government told Tokyo Electric Power to consider accepting temporary state control in return for a much-needed injection of public funds, in effect proposing an interim nationalization of the struggling utility.

The order came after Tokyo Electric Power requested ¥689.4 billion, or $8.8 billion, in government aid to help pay for its response to the nuclear accident at the Fukushima site. The utility may have to pay ¥4.5 trillion in compensation payments by 2013, a government panel said in October, a sum that threatens to render the company insolvent.

The company will also most likely be forced to decommission all six nuclear reactors at Fukushima Daiichi at a huge cost, while the future of four other reactors at a second site is also on the line after a national outcry over the disaster.

Still, it remains unclear whether the government will force changes that experts have long called for at Tokyo Electric, also known as Tepco, like sweeping changes to management or a breakup of the monopoly the utility enjoys over electricity generation and distribution in the Tokyo area.

A Colossal Cleanup

As 2011 drew to a close, Japan was drawing up plans for a cleanup of Fukushima Prefecture that was both monumental and unprecedented, in the hopes that those displaced could go home.

The Soviet Union did not attempt such a cleanup after the Chernobyl accident, instead choosing to relocate about 300,000 people, abandoning vast tracts of farmland. Many Japanese officials believe that they do not have that luxury; the evacuation zone covers more than 3 percent of the landmass of Japan.

But quiet resistance has begun to grow. Soothing pronouncements by local governments and academics about the eventual ability to live safely near the ruined plant can seem to be based on little more than hope.

No one knows how much exposure to low doses of radiation causes a significant risk of premature death. That means Japanese living in contaminated areas are likely to become the subjects of future studies — the second time in seven decades that Japanese have become a test case for the effects of radiation exposure, after the bombings of Hiroshima and Nagasaki.

Nuclear Power: Overview

Nuclear power plants use the forces within the nucleus of an atom to generate electricity.

The first nuclear reactor was built by Enrico Fermi below the stands of Stagg Field in Chicago in 1942. The first commercial reactor went into operation in Shippingport, Pa., in December 1957.

In its early years, nuclear power seemed the wave of the future, a clean source of potentially limitless cheap electricity. But progress was slowed by the high, unpredictable cost of building plants, uneven growth in electric demand, the fluctuating cost of competing fuels like oil, and safety concerns.

Accidents at the Three Mile Island plant in Pennsylvania in 1979 and at the Chernobyl reactor in the Soviet Union in 1986 cast a pall over the industry that was deepened by technical and economic problems. In the 1980s, utilities wasted tens of billions of dollars on reactors they couldn’t finish. In the ’90s, companies scrapped several reactors because their operating costs were so high that it was cheaper to buy power elsewhere.

But recently, in a historic shift, more than a dozen companies around the United States have suddenly become eager to build new nuclear reactors. Growing electric demand, higher prices for coal and gas, a generous Congress and public support for radical cuts in carbon dioxide emissions have all combined to change the prospects for reactors, and many companies are ready to try again.

The old problems remain, however, like public fear of catastrophe, lack of a permanent waste solution and high construction costs. And some new problems have emerged: the credit crisis and the decline worldwide of factories that can make components. The competition in the electric market has also changed.

Nonetheless, industry executives and taxpayers are spending hundreds of millions of dollars to plan a new chapter for nuclear power in the United States and set the stage for worldwide revival.

Nuclear Energy: How It Works

Nuclear power is essentially a very complicated way to boil water.

Nuclear fuel consists of an element – generally uranium – in which an atom has an unusually large nucleus. The nucleus is made up of particles called protons and neutrons. The power produced by a nuclear plant is unleashed when the nucleus of one of these atoms is hit by a neutron traveling at the right speed.

The most common reaction is that the nucleus splits — an event known as nuclear fission — and sets loose more neutrons. Those neutrons hit other nuclei and split them, too. At equilibrium — each nuclear fission producing one additional nuclear fission — the reactor undergoes a chain reaction that can last for months or even years.
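
As a toy illustration of that balance, the sketch below steps a neutron population through successive generations with an average multiplication factor k: a population with k below 1 dies out, k equal to 1 holds steady (the equilibrium of a working reactor), and k above 1 grows without bound. The starting population and number of generations are arbitrary; this is an illustration, not reactor physics software.

def neutron_population(k, generations, n0=1000.0):
    # Each generation, every fission leads on average to k follow-on fissions.
    n = n0
    for _ in range(generations):
        n *= k
    return n

for k in (0.95, 1.00, 1.05):
    print(f"k={k:.2f}: population after 200 generations = "
          f"{neutron_population(k, 200):,.1f}")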

When the split atom flings off neutrons, it also sends out fragments. Their energy is transferred, as heat, to the water that surrounds the nuclear core. The fragments also give off sub-atomic particles or gamma rays that generate heat.

Depending on the plant’s design, the water is either boiled in the reactor vessel, or transfers its heat to a separate circuit of water that boils. The steam spins a turbine that turns a generator and makes electricity.

Sometimes, instead of splitting, the nucleus absorbs the neutron fired at it, a reaction that turns the uranium into a different element, plutonium-239 (Pu-239). This reaction happens some of the time in all reactors. But in what are known as breeder reactors, neutrons traveling at higher energies are absorbed far more often. In this process, spent uranium fuel can be recycled into Pu-239, which can be used as new fuel. But problems with safety and waste disposal have limited breeders’ use – a fuel recycling plant that operated near Buffalo for six years created waste that cost taxpayers $1 billion to clean up.

Discovery and the Birth of an Industry

The possibility of nuclear fission – splitting atoms — was recognized in the late 1930s. The first controlled chain reaction came in 1942 as part of the Manhattan Project, America’s wartime effort to build an atom bomb. That project entailed construction of several reactors, but for them, the energy was a waste product; the object was plutonium bomb fuel. On July 16, 1945, at the Trinity Site in New Mexico, the project’s scientists set off a chain reaction that was designed to multiply exponentially – the first blast of an atomic bomb.

Even before the war ended, the military was looking at reactors for another use, submarine propulsion. Work on those reactors began in the early 1950s, and on some other uses of nuclear power that never came to fruition, like nuclear-powered airplanes.

By general consensus, the first commercial reactor was a heavily subsidized plant at Shippingport, Pa., essentially a scaled-up version of a submarine reactor. In the United States and abroad, as the cold war and a vast nuclear arms race took shape, the search was on for a peaceful use of the atom.

In December 1953, President Dwight D. Eisenhower delivered a speech at the United Nations called “Atoms for Peace,” calling for a “worldwide investigation into the most effective peacetime uses of fissionable material.”

Messianic language followed. Rear Admiral Lewis L. Strauss, chairman of the Atomic Energy Commission, told science writers in New York that “our children will enjoy in their homes electrical power too cheap to meter.”

The “too cheap to meter” line has dogged the industry ever since. But after a slow start in the 1950s and early ’60s, larger and larger plants were built and formed the basis for a great wave of optimism among the electric utilities, which eventually ordered 250 reactors.

As it turned out, many of those companies were poor at managing massive, multiyear construction projects. They poured concrete before designs were complete, and later had to rip and replace some work. New federal requirements slowed progress, and delays added to staggering interest charges.

Costs got way out of hand. Half the plants were abandoned before completion. Some utilities faced bankruptcy. In all, 100 reactors ordered after 1973 were abandoned. By the time of the Three Mile Island accident, ordering a new plant was unthinkable and the question was how many would be abandoned before completion.

Safety – Three Mile Island and Chernobyl

The core meltdown at Three Mile Island 2, near Harrisburg, Pa., in March 1979, and the explosion and fire at Chernobyl 4, near Kiev, in Ukraine, in April 1986, are events the industry cannot afford to repeat.

Three Mile Island unit 2 was the youngest reactor in the United States. The plant, like all others on line in the United States, had been built with impressive back-up systems to guard against a big pipe break that could leave the nuclear core without its blanket of water. But here a relatively slow leak combined with the operators’ misunderstanding of their complex controls, a combination of factors that had not been anticipated.

The operators knew that they had a routine malfunction and had taken action to deal with it. But as problems mounted in their windowless control room, filled with dials, warning lights and audible alarms that clamored for attention faster than they could absorb it, they did not realize for hours that a valve they believed they had closed was actually stuck open. Rather than resolving the problem, they had allowed most of the cooling water to leak out.

Tens of thousands of worried residents evacuated the surrounding area. The reactor core was destroyed, but with little damage beyond it.

The reactor had shut itself down in the first few moments of the malfunction, when an automatic system triggered control rods to drop into the core, shutting off the flow of neutrons that sustained the chain reaction. And even if that had not happened, the reaction would have stopped as the cooling water boiled away, because the water acted as a moderator, slowing the neutrons down.

The plant leaked radioactive materials; post-accident estimates said the amount was very small. No one died, but in a matter of hours, a billion-dollar asset had become a billion-dollar liability.

In contrast, the Chernobyl reactor in Ukraine was moderated by graphite, a material that does not boil away. And as graphite gets hotter, its performance as a moderator improves, meaning that the reaction speeds up. When a malfunction made the plant run hot, instead of shutting down, the reaction ran out of control and the reactor blew up.

Graphite has another unfavorable characteristic: it burns on contact with air. At Chernobyl, once the reactor exploded, hundreds of tons of graphite became the fuel for a fire that lasted at least three and a half hours, providing the energy to disperse the tons of radioactive material inside.

The government said 31 people died of radiation sickness in the following weeks. Estimates of the eventual number of dead are colored by politics, but a United Nations panel said in 2005 that the release of iodine-131, a highly radioactive material that becomes concentrated in the thyroid gland, would eventually cause 4,000 deaths. An “exclusion zone” 36 miles in diameter remains in place, and hundreds of thousands of people have been resettled.

Safety – Nuclear Waste

When the nucleus of a uranium atom is struck by a neutron, the atom breaks into fragments. Nearly all these fission products, few of which exist in nature, are unstable. They seek to return to stability by giving off an energy wave, called a gamma ray, or a particle, called alpha or beta radiation. Some transmute into a new, stable state in a matter of seconds; others remain radioactive for millennia.

Most fission products with very short half-lives – the length of time needed for half their atoms to be transmuted into something else — are intensely radioactive, which makes them a concern in the event of a leak. Other fission products, most of which are contained in spent reactor fuel, will remain radioactive for millions of years.
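
The half-life arithmetic is simple exponential decay: after time t, the fraction of the original radioactive atoms remaining is one half raised to the power t divided by the half-life. A minimal sketch, using the roughly 24,000-year half-life of plutonium-239 that is cited later in this article; the time values are chosen purely for illustration:

def fraction_remaining(t_years, half_life_years):
    # Exponential decay: half the atoms remain after each half-life.
    return 0.5 ** (t_years / half_life_years)

for t in (24_000, 48_000, 240_000):
    print(f"after {t:>7,} years: "
          f"{fraction_remaining(t, 24_000):.4f} of the atoms remain")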

The federal government always promised it would accept the high-level nuclear wastes, and beginning in the early 1980s, it signed contracts with the utilities, saying storage would begin in 1998. It hasn’t happened yet, and won’t before 2020, if then.

In the 1980s, the idea was to have the Energy Department study the geology of several sites and pick the best, but that job went very slowly, and Congress decided to make the choice itself. It chose Yucca Mountain, about 100 miles from Las Vegas, in large part because the site is extremely dry. But intensive study showed that what water does fall on the mountain runs through it far faster than scientists initially estimated.

In 2004, a federal appeals court threw out a set of federal rules for the site because they would only offer protection for 10,000 years, while scientists say the fuel would be hazardous for close to a million years.

President Obama declared that Yucca would not be used, but in June a federal judge ordered the Energy Department not to withdraw its application for an operating license, an application opposed by the state of Nevada and a range of private groups, some of whom hope the lack of a storage site will force the entire industry to shut down. The judge said Congress had required the department to file an application when it settled on the Yucca site.

California, Connecticut and other states have moved to block construction of new reactors until a repository is opened, but other states seem likely to go ahead.

In the meantime, at many plants the spent fuel is stored in casks that look like small silos, with a steel liner and a concrete shell. The fuel is put inside and dried, and the cask is filled with an inert gas to prevent rust. Then it is parked on a high-quality concrete pad, surrounded by floodlights and concertina wire, resembling a basketball court at a maximum-security prison.

Safety — Military Waste

The nation’s biggest plutonium problem is not from nuclear power but from nuclear weapons. The most troubling is Hanford, a 560-square-mile tract in south-central Washington that was taken over by the federal government as part of the Manhattan Project. (The bomb that destroyed Nagasaki in 1945 originated with plutonium made at Hanford.) By the time production stopped in the 1980s, Hanford had made most of the nation’s plutonium. Cleanup to protect future generations will be far more challenging than planners had assumed, according to an analysis by a former Energy Department official.

The plutonium does not pose a major radiation hazard now, largely because it is under “institutional controls” like guards, weapons and gates. But government scientists say that even in minute particles, plutonium can cause cancer, and because it takes 24,000 years to lose half its radioactivity, it is certain to outlast the controls.

The fear is that in a few hundred years, the plutonium could reach an underground area called the saturated zone, where water flows, and from there enter the Columbia River. Because the area is now arid, contaminants move extremely slowly, but over the millennia the climate is expected to change, experts say.

The finding on the extent of plutonium waste signals that the cleanup, still in its early stages, will be more complex, perhaps requiring technologies that do not yet exist. But more than 20 years after the Energy Department vowed to embark on a cleanup, it still has not “characterized,” or determined the exact nature of, the contaminated soil.

So far, the cleanup, which began in the 1990s, has involved moving some contaminated material near the banks of the Columbia to drier locations. (In fact, the Energy Department’s cleanup office is called the Office of River Protection.) The office has begun building a factory that would take the most highly radioactive liquids and sludges from decaying storage tanks and solidify them in glass.

That would not make them any less radioactive, but it would increase the likelihood that they stay put for the next few thousand years.

The problem of plutonium waste is not confined to Hanford. Plutonium waste is much more prevalent around nuclear weapons sites nationwide than the Energy Department’s official accounting indicates, said Robert Alvarez, who in 2010 reanalyzed studies that the department had conducted over the previous 15 years at Hanford; the Idaho National Engineering Laboratory; the Savannah River Site, near Aiken, S.C.; and elsewhere.

Recent Developments: Safety and Output

As of 2009, reactors were producing more electricity than ever before, about 20 percent of the kilowatt-hours used in the United States, largely by getting more power out of old plants.

Many reactors were designed to produce more power than their operators had initially applied for. In the 1990s, a number of companies asked the Nuclear Regulatory Commission for “uprates,” which allowed them to make changes, often small, that increased their output.

Nuclear plants are also running longer, in part because deregulation of the industry has given companies an incentive to get as much as they can out of each plant. Plants used to run at a capacity factor – the ratio of the energy a plant actually produces to what it could produce running continuously at full power – of 60 or 65 percent; now the norm is 90 percent. Such increases have been essential to the survival of plants like Indian Point 3 in New York, which has gone from 40 percent in the 1980s to around 90 percent now.
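
As a worked example of that definition (the figures below are illustrative, not data for any particular plant):

rated_mw = 1000.0            # hypothetical nameplate capacity, megawatts
hours_per_year = 365 * 24
generated_mwh = 7_900_000.0  # hypothetical annual output, megawatt-hours
capacity_factor = generated_mwh / (rated_mw * hours_per_year)
print(f"capacity factor: {capacity_factor:.0%}")  # about 90 percent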

Safety issues have persisted, and one incident in an Ohio plant in 2002 in particular shook confidence in the safety of reactors and the quality of nuclear regulation. Regulators ordered plant operators around the country to inspect a spot in the lid of reactor vessels that was known to be prone to leaks. In the Ohio plant, the operators were shocked to find that the boric acid that is mixed into reactor water to stabilize the reaction had eaten away a chunk of carbon steel the size of a football, leaving the vessel vulnerable to a failure.

New Designs, New Issues

On the drawing boards at government labs are all kinds of exotic designs, using graphite and helium, or plutonium and molten sodium, and making hydrogen rather than electricity. But the experts generally agree that if a reactor is ordered soon, its design changes will be evolutionary, not revolutionary.

Many of the new designs have focused on the emergency core cooling systems, where the new goal is to minimize dependency on active systems, like pumps and valves, in favor of natural forces, like gravity and natural heat circulation and dissipation.

Westinghouse is one of the companies trying to market a reactor, the AP1000, with what is called a passive approach to safety. Compared to Westinghouse designs now in service, it requires only half as many safety-related valves, 83 percent less safety-related pipe and one-third fewer pumps.

A French company called Areva is building the EPR, for European Pressurized Water Reactor, which has four emergency core cooling systems, instead of the usual two. That not only makes it less likely that all systems would fail, but would allow the plant to keep running while one of the systems is down for maintenance.
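
The reliability argument behind that redundancy can be made concrete under the (admittedly idealized) assumption that the cooling systems fail independently, each with the same probability p; the chance that all n fail at once is then p raised to the nth power. The failure probability below is invented purely for illustration:

p = 0.01  # illustrative per-system failure probability, not a real figure
for n in (2, 4):
    # With independent systems, all n must fail for cooling to be lost.
    print(f"{n} systems: probability all fail = {p ** n:.0e}")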

The third entry is General Electric’s Economic Simplified Boiling Water Reactor, derived from its boiling water reactor design. It is tweaked for better natural circulation in case of an accident, so there will be less reliance on pumps. But three of its four potential customers have backed away.

The Nuclear Regulatory Commission is also considering a proposal that it give approval to a handful of standardized, completed designs, rather than approving each plant’s design individually after construction has begun. The hope is to cut a 10-year construction process in half.

Nuclear Power and Climate Change

Nuclear power has gained new adherents in recent years, including some environmentalists who had previously opposed it. The reason is growing concern over climate change, and the role of energy production in the build-up of carbon dioxide in the atmosphere. Nuclear plants do not burn fuel and so produce no carbon dioxide. Proponents of nuclear power say it is the only available method of producing large amounts of energy quickly enough to make a difference in the fate of the atmosphere.

In the 2008 presidential campaign, Senator John McCain, the Republican candidate, made expansion of nuclear power a central point of his agenda both for energy and global warming.

But expanding nuclear power to replace coal and oil would be a massive job, on a scale that some consider unrealistic. A study by the Princeton Carbon Management Initiative estimated that for nuclear to play a significant role in cutting emissions, the industry’s capacity would have to triple worldwide over the next 50 years — a rate of 20 new large reactors a year.
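
The arithmetic behind that rate can be sketched as follows. The size of the existing fleet is an assumption (on the order of 440 reactors were operating worldwide around the time of the study; the article states only the study's conclusion), and the result is roughly consistent with the figure of 20 large reactors a year:

existing_reactors = 440            # assumed size of the worldwide fleet
to_build = 2 * existing_reactors   # tripling means adding twice the fleet
years = 50
print(f"{to_build} new reactors over {years} years = "
      f"about {to_build / years:.0f} per year")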

At the moment, though, industry leaders in the United States wonder whether the worldwide supplier base could support construction of more than four or five reactors simultaneously. Some reactors under construction, like a prototype EPR in Finland, are over budget and years behind schedule. All new projects have to depend on a single supplier for the biggest metal parts, Japan Steel Works.

And at the moment, the price of nuclear power seems too high. In Florida, Progress Energy wants to build two reactors with a total cost, including transmission and interest during construction, that translates into about $8,000 per kilowatt of capacity (a kilowatt being roughly the power needed to run a single window air conditioner). On a large scale, some energy experts suspect, it may be cheaper to build better air conditioners.
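
A back-of-the-envelope version of that figure, with the reactor sizes assumed purely for illustration (the article gives only the roughly $8,000-per-kilowatt result):

units_kw = 2 * 1_100_000  # two hypothetical ~1,100 MW reactors, in kilowatts
cost_per_kw = 8_000       # dollars per kilowatt, the figure cited above
total = units_kw * cost_per_kw
print(f"implied total cost: ${total / 1e9:.1f} billion")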

Recent Developments

The Obama administration favors another $37 billion in new loan guarantees, beyond the $18.5 billion provided in a 2005 energy law. It opposes opening a waste repository at Yucca Mountain, although that goal has long been sought by the industry. It has favored new reactors as part of the energy picture.

In his 2011 State of the Union address, President Obama proposed giving the nuclear construction business a type of help it has never had, a role in a quota for clean energy. But recent setbacks in a hoped-for “nuclear renaissance” raise questions about how much of a role nuclear power can play.

Of four reactor projects identified by the Energy Department in 2009 as the most likely candidates for federal loan guarantees, only two are moving forward. At a third, in Calvert Cliffs, Md., there has been no public sign of progress since the lead partner withdrew in October 2010 and the other partner said it would seek a replacement.

And at the fourth, in Texas, a would-be builder has been driven to try something never done before in nuclear construction: finding a buyer for the electricity before the concrete is even poured. Customers are not rushing forward, given that the market is awash in generating capacity and an alternative fuel, natural gas, is currently cheap.

Many Democrats and most Republicans in Congress back nuclear construction, as do local officials in most places where reactors have been proposed.

Some challenges are not peculiar to the nuclear sector. All forms of clean energy, including solar and wind power, are undercut to some extent by the cheap price of natural gas and the surplus in generating capacity, which is linked partly to the recession. And federal caps on carbon dioxide emissions from coal- and gas-burning plants, which would benefit clean energy sources, are not expected until 2012.

But some obstacles are specific to the nuclear industry, like the ballooning cost estimates for construction of reactors, which are massive in scale. Even when projects are identified as prime candidates for federal loan guarantees, some investment partners turn wary.

The Nuclear Regulatory Commission has been working for more than 15 years to streamline reactor licensing to cut construction time and to reduce risk.

Nuclear energy has also begun to be looked on more favorably in Europe. The Finnish Parliament in July 2010 approved the construction of two nuclear power plants; just two weeks before, the Swedish Parliament narrowly voted to allow the reactors at 10 nuclear power plants to be replaced when the old ones are shut down — a reversal of a 1980 referendum that called for nuclear power to be phased out entirely.
