Sunday, January 25, 2015

Obama's India visit: Hopes for clean energy and climate deals

Barack Obama was advised, only half-jokingly, to wear a gas mask when he appears as guest of honour at India’s Republic Day parade on Monday. The air pollution in Delhi and other Indian cities has become that bad.

Providing a fix for that very unhealthy air – 13 of the world’s most polluted cities are in India – is a growing priority for the Indian prime minister, Narendra Modi, and could take a key position in a suite of clean energy initiatives the two leaders are expected to roll out on Monday.
“The co-operation on clean energy and climate change is critically important,” Ben Rhodes, the deputy national security adviser, told a conference call with reporters.
America is hoping to persuade India, one of the world’s biggest emitters, to commit to an ambitious post-2020 plan for reining in its greenhouse gas emissions ahead of the international climate change meeting in Paris this December.

Coral reefs face heightened risk of fatal disease from dredging, says research

Coral faces a heightened risk of fatal disease if the seabed near it is dug up, according to a landmark study that will reignite the debate over whether dredging projects near the Great Barrier Reef should take place.
The research, conducted by the ARC Centre of Excellence for Coral Reef Studies and the Australian Institute of Marine Science, found that dredging near coral reefs doubled the frequency of diseases afflicting corals.
The study, the first of its kind to directly address the issue of dredging and coral health, is likely to raise further questions over whether a plan to dredge 3m cubic metres of seabed and dump it within the Great Barrier Reef marine park is a wise one.
White syndrome coral disease. Photograph: F. Joseph Pollock
Analysis of corals subjected to plumes of seabed sediment found that they were twice as likely to suffer from a range of different diseases, compared with those untouched by dredging. Worryingly, these affected corals were most likely to suffer from “white syndrome”, a condition that effectively destroys them.
“White syndrome is like if your flesh fell off at the fingertips leaving just bone, and then kept going on up your arms and then the rest of your body,” Joe Pollock, lead author of the study, told Guardian Australia.
“When you have bleaching events, which cause the corals to go white, you can still get some recovery there because the coral is still alive. But with white syndrome the tissue is completely gone and it won’t recover at all.”
Researchers studied the progress of an 18-month dredging project near Barrow Island, a previously pristine area off the Western Australian coast. The dredging involved the removal of 7m cubic metres of seabed to create a channel to accommodate ships for the Gorgon natural gas project.
Using satellite imagery, researchers could map the areas of coral covered by plumes of sediment released by the dredging process. They found there was a “direct link” between coral disease and sediment.
Dredging study map showing sediment plumes. Photograph: F. Joseph Pollock
Corals require both food and light to survive. Dredging creates turbidity in the water that reduces the amount of light reaching the coral, affecting photosynthesis, while silt that settles on the coral interferes with its ability to feed itself.
“The sediment clogs their mouths up,” Pollock said. “The coral can shed sediment, but it’s very energy intensive. When the coral is expending a lot of energy, at a time when it is getting less energy from the sun, it becomes very stressed.
“A doubling in disease is substantial. The issue is that these diseases are chronic – we only saw a snapshot but I’ve rarely seen these white syndrome diseases stop progressing. They can kill 2 to 3m-wide corals, like the ones we see on the Great Barrier Reef, within a few months.”
On top of an increase in these devastating diseases, the study found there was a ninefold rise in other signs of compromised coral health, such as bleaching and sediment necrosis, when the corals become smothered in seabed spoil.
The federal government has already approved a plan to expand the Queensland port of Abbot Point by dredging the seabed and dumping the sediment within the Great Barrier Reef marine park.
Significant questions remain over the project, such as its cost and how far plumes of sediment will travel once the spoil is deposited back into the water. North Queensland Bulk Ports, the project’s proponent, has yet to identify an exact location for the dumping site, after the first choice was deemed unsuitable.
NQBP and the government both insist the dredging will be done safely, with minimal impact to the environment. A slew of conditions are attached to it, such as the stipulation that dredging is only undertaken between March and June each year, to protect spawning coral.
But environmental groups say the dredging will add significant pressure to a reef already suffering from pollution and a plague of coral-eating starfish. The North Queensland Conservation Council has launched a legal challenge against the project.
“It’s important we get our heads around these impacts, so we can make the right decisions,” Pollock said. “Dredging is part and parcel of these development projects. We don’t argue that these should be stopped, but you need to understand the impact so it’s done in a sustainable manner.”

Scientists reveal which coral reefs can survive global warming

Study shows for the first time which parts of the Great Barrier Reef and other reefs can be expected to bounce back from mass bleaching events




Scientists have identified which parts of the Great Barrier Reef and other reefs are most capable of recovering from mass bleaching events which will become more frequent due to global warming.
The information should help conservationists to target their efforts to protect the portions of reefs that are most capable of survival, they say.
Previous studies have shown coral reefs as they exist today will be largely wiped out by climate change in the long term, but the new work by an Australian team shows for the first time which reefs in the short term can be expected to bounce back from bleaching events.
A major bleaching event is currently under way in large parts of the North Pacific, including the Marshall Islands and Hawaii, which experts have warned could be on a ‘historic’ scale akin to the record bleaching of 1998 that saw mass coral die-off around the world.
Nicholas Graham, lead author of the study published in Nature on Wednesday, looked at the 1998 bleaching’s impacts on reefs in the Seychelles, and found 12 of 21 sites had recovered afterwards.
Looking at just two of 11 factors – water depth and the physical complexity of the coral – the team were able to use modelling to correctly predict, 98% of the time, whether a reef would recover or not. Deeper water and a more complex structure made a recovery more likely.
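For readers curious how a two-factor prediction like this might work in practice, here is a minimal, purely illustrative sketch: a logistic-regression classifier trained on invented depth and complexity values. The numbers and the scikit-learn setup are assumptions for illustration only; the study’s actual model and dataset are not reproduced here.

```python
# Hypothetical sketch of a two-predictor reef-recovery classifier.
# All survey values below are invented; this is not the study's
# model or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented records: [water_depth_m, structural_complexity (0-5 scale)]
X = np.array([
    [8.1, 3.9], [7.5, 3.2], [9.0, 4.1], [6.8, 3.5],   # recovered sites
    [4.2, 1.8], [5.0, 2.1], [3.9, 1.5], [5.6, 2.4],   # sites that did not
])
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # 1 = recovered, 0 = did not recover

model = LogisticRegression().fit(X, y)

# Score a hypothetical new site: 7 m deep, moderate complexity
print("P(recovery) =", model.predict_proba([[7.0, 3.0]])[0, 1])
```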
Graham, who works on coral at James Cook University in Australia, told the Guardian that the results bought time for authorities to better manage climate-resilient reefs while bigger picture problems such as greenhouse gas cuts were addressed.
“If emissions continue as they are, the longer term future is likely to still be bleak, even for those recovering at the moment [from bleaching], because the projections are coral bleaching will become more and more frequent. In a way it’s [the study’s findings] buying us time to keep as many reefs in good shape as we can, while we tackle some of these global, bigger issues.”
The study’s findings suggest the parts of the Great Barrier Reef that are still relatively pristine, in the north and further offshore, are also those best placed to recover from bleaching events brought about by global warming.
Graham said the findings also raised concerns about the logic of dumping sediment from planned major port projects to expand coal exports along the Queensland coast, a local impact which could harm coral otherwise capable of surviving the global impacts of climate change.
“If you have these big dredging projects, such as at Abbot Point, if we’re dumping a lot of spoil and sediment into the Great Barrier Reef lagoon, a lot of that will settle in deep water, a lot of which might be coral and a lot of which will do better under climate change. If we’re not actually doing enough to reduce local impacts, we’re doing ourselves a disservice under climate change.”
The research could help organisations such as the reef’s marine park authority to better pinpoint which areas should avoid anchor damage from boats, which reduces physical complexity and thus the ability to recover from bleaching, Graham added. It could also help other coral nations, such as Kenya, to pinpoint where to limit damage by fishing gear.
The research drew on a 17-year dataset of surveys undertaken in 1994, 2005, 2008 and 2011, examining 11 factors that influenced recovery or die-off. The coral’s physical complexity and water depth were the two most important factors – sites deeper than 6.3 metres were found to be highly likely to recover – while whether a reef was in a marine protected area made no difference.
Dr Aaron MacNeil, a co-author on the study from the Australian Institute of Marine Science, said: “This gives reef management a major boost in the face of the threats posed by climate change and, encouragingly, suggests people can take tangible steps to improve the outlook for reefs.
“By carefully managing reefs with conditions that are more likely to recover from climate-induced bleaching, we give them the best possible chance of surviving over the long term, while reduction of local pressures that damage corals and diminish water quality will help to increase the proportion of reefs that can bounce back.”

Climategate, the sequel: How we are STILL being tricked with flawed data on global warming

Something very odd has been going on with the temperature data relied on by the world's scientists.

Although it has been emerging for seven years or more, one of the most extraordinary scandals of our time has never hit the headlines. Yet another little example of it lately caught my eye when, in the wake of those excited claims that 2014 was “the hottest year on record”, I saw the headline on a climate blog: “Massive tampering with temperatures in South America”. The evidence on Notalotofpeopleknowthat, uncovered by Paul Homewood, was indeed striking.
Puzzled by those “2014 hottest ever” claims, which were led by the most quoted of all the five official global temperature records – Nasa’s Goddard Institute for Space Studies (Giss) – Homewood examined a place in the world where Giss was showing temperatures to have risen faster than almost anywhere else: a large chunk of South America stretching from Brazil to Paraguay.
Noting that weather stations there were thin on the ground, he decided to focus on three rural stations covering a huge area of Paraguay. Giss showed it as having recorded, between 1950 and 2014, a particularly steep temperature rise of more than 1.5C: twice the accepted global increase for the whole of the 20th century.
But when Homewood was then able to check Giss’s figures against the original data from which they were derived, he found that they had been altered. Far from showing the rise seen in the adjusted graph, the original data showed temperatures having in fact declined over those 65 years by a full degree. When he did the same for the other two stations, he found the same. In each case, the original data showed not a rise but a decline.
Homewood had in fact uncovered yet another example of the thousands of pieces of evidence coming to light in recent years that show that something very odd has been going on with the temperature data relied on by the world's scientists. And in particular by the UN’s Intergovernmental Panel on Climate Change (IPCC), which has driven the greatest and most costly scare in history: the belief that the world is in the grip of an unprecedented warming.
How have we come to be told that global temperatures have suddenly taken a great leap upwards to their highest level in 1,000 years? In fact, it has been no greater than their upward leaps between 1860 and 1880, and 1910 and 1940, as part of that gradual natural warming since the world emerged from its centuries-long “Little Ice Age” around 200 years ago.
This belief has rested entirely on five official data records. Three of these are based on measurements taken on the Earth’s surface, versions of which are then compiled by Giss, by the US National Oceanic and Atmospheric Administration (NOAA) and by the University of East Anglia’s Climatic Research Unit working with the Hadley Centre for Climate Prediction, part of the UK Met Office. The other two records are derived from measurements made by satellites, and then compiled by Remote Sensing Systems (RSS) in California and the University of Alabama, Huntsville (UAH).
The adjusted graph from the Goddard Institute for Space Studies
In recent years, these two very different ways of measuring global temperature have increasingly been showing quite different results. The surface-based record has shown a temperature trend rising up to 2014 as “the hottest years since records began”. RSS and UAH have, meanwhile, for 18 years been recording no rise in the trend, with 2014 ranking as low as only the sixth warmest since 1997.
One surprise is that the three surface records, all run by passionate believers in man-made warming, in fact derive most of their land surface data from a single source. This is the Global Historical Climate Network (GHCN), managed by the US National Climate Data Center under NOAA, which in turn comes under the US Department of Commerce.
But two aspects of this system for measuring surface temperatures have long been worrying a growing array of statisticians, meteorologists and expert science bloggers. One is that the supposedly worldwide network of stations from which GHCN draws its data is flawed. As much as 80 per cent of the Earth’s surface is not reliably covered at all. Furthermore, around 1990, the number of stations more than halved, from 12,000 to fewer than 6,000 – and most of those remaining are concentrated in urban areas or places where studies have shown that, thanks to the “urban heat island effect”, readings can be up to 2 degrees higher than in the rural areas where thousands of stations were lost.
Below, the raw data in graph form
To fill in the huge gaps, those compiling the records have resorted to computerised “infilling” or “homogenising”, whereby the higher temperatures recorded by the remaining stations are projected out to vast surrounding areas (Giss allows single stations to give a reading covering 1.6 million square miles). This alone contributed to the sharp temperature rise shown in the years after 1990.
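To make the mechanism concrete, here is a toy sketch of the general idea behind infilling: a grid cell with no station of its own inherits a distance-weighted average of whatever stations lie within some cutoff. This is an illustrative simplification under assumed coordinates and anomalies, not the actual GISS or GHCN procedure.

```python
# Toy illustration of "infilling": estimating a grid cell's anomaly from
# nearby stations by inverse-distance weighting. A simplified sketch of
# the general idea, not the actual GISS/GHCN algorithm.
import numpy as np

# Hypothetical stations: (x_km, y_km, temperature_anomaly_degC)
stations = np.array([
    [0.0,   0.0,   1.2],   # urban station
    [900.0, 300.0, 0.9],   # urban station
    [400.0, 800.0, 0.2],   # rural station
])

def infill(cell_xy, stations, cutoff_km=1200.0):
    """Inverse-distance-weighted anomaly for one empty grid cell."""
    d = np.hypot(stations[:, 0] - cell_xy[0], stations[:, 1] - cell_xy[1])
    mask = d < cutoff_km              # only stations within the cutoff count
    if not mask.any():
        return float("nan")           # no nearby data: cell stays empty
    w = 1.0 / np.maximum(d[mask], 1.0)
    return np.average(stations[mask, 2], weights=w)

# A cell closest to the urban stations inherits mostly their anomalies
print(round(infill((200.0, 100.0), stations), 2))
```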
But still more worrying has been the evidence that even this data has then been subjected to continual “adjustments”, invariably in only one direction. Earlier temperatures are adjusted downwards, more recent temperatures upwards, thus giving the impression that they have risen much more sharply than was shown by the original data.
An early glaring instance of this was spotted by Steve McIntyre, the statistician who exposed the computer trickery behind that famous “hockey stick” graph, beloved by the IPCC, which purported to show that, contrary to previous evidence, 1998 had been the hottest year for 1,000 years. It was McIntyre who, in 2007, uncovered the wholesale retrospective adjustments made to US surface records between 1920 and 1999 compiled by Giss (then run by the outspoken climate activist James Hansen). These reversed an overall cooling trend into an 80-year upward trend. Even Hansen had previously accepted that the “dust bowl” 1930s was the hottest US decade of the entire 20th century.
Assiduous researchers have since unearthed countless similar examples across the world, from the US and Russia to Australia and New Zealand. In Australia, an 80-year cooling of 1 degree per century was turned into a warming trend of 2.3 degrees. In New Zealand, there was a major academic row when “unadjusted” data showing no trend between 1850 and 1998 was shown to have been “adjusted” to give a warming trend of 0.9 degrees per century. This falsified new version was naturally cited in an IPCC report (see “New Zealand NIWA temperature train wreck” on the Watts Up With That science blog, WUWT, which has played a leading role in exposing such fiddling of the figures).
By far the most comprehensive account of this wholesale corruption of proper science is a paper written for the Science and Public Policy Institute, “Surface Temperature Records: Policy-Driven Deception?”, by two veteran US meteorologists, Joseph D’Aleo and WUWT’s Anthony Watts (and if warmists are tempted to comment below this article online, it would be welcome if they could address their criticisms to the evidence, rather than just resorting to personal attacks on the scientists who, after actually examining the evidence, have come to a view different from their own).
One of the more provocative points arising from the debate over those claims that 2014 was “the hottest year evah” came from the Canadian academic Dr Timothy Ball when, in a recent post on WUWT, he used the evidence of ice-core data to argue that the Earth’s recent temperatures rank in the lowest 3 per cent of all those recorded since the end of the last ice age, 10,000 years ago.
In reality, the implications of such distortions of the data go much further than just representing one of the most bizarre aberrations in the history of science. The fact that our politicians have fallen for all this scary chicanery has given Britain the most suicidally crazy energy policy (useless windmills and all) of any country in the world.
But at least, if they’re hoping to see that “universal climate treaty” signed in Paris next December, we can be pretty sure that it is no more going to happen than that 2014 was the hottest year in history.

Friday, January 16, 2015

Does the Uptick in Global Surface Temperatures in 2014 Help Close the Growing Difference between Climate Models and Reality?



This post includes calendar year 2014 global surface temperature data from GISS and NCDC.
I thought it would be interesting to begin the introduction as if GISS and NCDC were announcing year-end business profits at their press conference today. [sarc on.]


INTRODUCTION
Today, two of the world’s climate-industry giants—the NASA Goddard Institute for Space Studies (GISS) and the NOAA National Climatic Data Center (NCDC)—posted their much-anticipated annual results for 2014.  According to GISS, global surface temperature anomalies were an astounding +0.02 deg C higher in 2014 than they were in 2010, making the 2014 results the highest in the history of GISS. These record-breaking results from GISS come under the guidance of their new Director, Gavin Schmidt.  If you’re not familiar with numbers that remarkable, they’re read two one-hundredths of a deg C, which is equal to less than four one-hundredths of a deg F.  According to the NCDC, their global surface temperature results were +0.04 deg C higher in 2014 than they were in 2005 and 2010, their two previous best years.  The warmest years are within the margin of uncertainty for the data*, making it impossible to determine which year was actually warmest.  Even so, these results bring new hope to global warming investors, who have had to endure disappointing results in recent decades.  GISS and NCDC are once again showing why the CO2-obsessed turn to them for global warming data.  GISS and NCDC are global-warming industry leaders…known for eking out record years from poor source data, even during these hard times of the global warming slowdown. In related news, based on similar source data, Berkeley Earth too announced record highs in 2014, but only by 0.01 deg C. [sarc off.]
Figure 1
*The uncertainties are assumed to be the same as those shown in the Berkeley link (in the range of +/- 0.04 to 0.05 deg C).
Those results, especially the NCDC results, appear somewhat curious.  We showed in the post here that the difference in the Meteorological Annual Mean (December to November) between 2014 and 2010 was 0.01 deg C or less.   Then again, the differences between the Meteorological Annual Mean and the Calendar Mean are being measured in hundredths of a deg C.
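For anyone wanting to check such differences themselves, the two annual means are simple to compute from a monthly series. A minimal sketch, using invented monthly anomalies rather than the actual GISS or NCDC values:

```python
# Sketch: calendar-year mean (Jan-Dec) vs meteorological annual mean
# (previous Dec through Nov) from a monthly anomaly series. The values
# below are placeholders, not actual GISS or NCDC data.
import numpy as np

rng = np.random.default_rng(0)
# 24 invented monthly anomalies: Jan 2013 .. Dec 2014, in deg C
monthly = 0.6 + 0.05 * rng.standard_normal(24)

calendar_2014 = monthly[12:24].mean()        # Jan 2014 - Dec 2014
meteorological_2014 = monthly[11:23].mean()  # Dec 2013 - Nov 2014

# The two annual means typically differ by only hundredths of a degree
print(f"calendar: {calendar_2014:.3f}  meteorological: {meteorological_2014:.3f}")
```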
There will be all sorts of bizarre proclamations now that the 2014 global surface temperature data from GISS and NCDC (and Berkeley Earth) were found to be a tick warmer than the prior warmest year(s).


What eludes those making the claims—or what they are purposely directing attention away from—is the growing disagreement between the real world and the global surface warming simulated by climate models.


MODEL-DATA DIFFERENCES
We’ll use the GISS data for this discussion.  Similar graphs, but with the NCDC data, follow later in the post.
The teeny-tiny uptick in global surface temperature anomalies does not really help close the growing difference between observations and the projections by climate models…because the modeled surface temperatures continue to rise, too, and modeled surface temperatures are rising faster than observations.


Figure 2 presents the annual GISS global surface temperature data for their full term of 1880 to 2014.  Also shown on the graph is the average of all of the outputs of the simulations of global surface temperatures by the climate models stored in the CMIP5 archive, models with historical forcing through about 2005 and with RCP8.5 (worst case) forcings thereafter. The predictions of gloom and doom are based on the worst-case scenarios so we might as well use them for the comparison.  The models stored in the CMIP5 archive were used by the IPCC for their 5th Assessment Report.  Anomalies were calculated against the averages for the period of 1880 to 2014 so that the base years did not bias the presentation.
Figure 2
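A minimal sketch of that re-baselining step, using synthetic stand-ins for the observations and the CMIP5 ensemble mean (the actual series are not reproduced here):

```python
# Sketch: re-baselining two anomaly series to a common 1880-2014 mean so
# the choice of base period does not bias the comparison. Both series
# here are synthetic stand-ins, invented for illustration.
import numpy as np

years = np.arange(1880, 2015)
obs = 0.007 * (years - 1880) + 0.1 * np.sin(years / 9.0)   # fake observations
model = 0.009 * (years - 1880)                             # fake ensemble mean

# Subtract each series' own 1880-2014 average
obs_rebased = obs - obs.mean()
model_rebased = model - model.mean()
```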
We use the average of the model simulations (the multi-model ensemble member mean) because it best describes how surface temperatures would vary if (big if) they varied in response to the numerical values of the forcings (anthropogenic greenhouse gases, aerosols, etc.) used to drive the climate models. For a further discussion, see the post here.
It’s very plain to see that the observed global surface temperatures have not risen as fast as predicted by climate model simulations in recent years.


Let’s put the growing difference between models and observations into perspective.  We’ll subtract the annual values of the data from the modeled values, and we’ll smooth the difference with a 5-year running-average filter (centered on the 3rd year) to reduce the volatility from El Niños, La Niñas and volcanic aerosols. See Figure 3.  The horizontal red line is the value of the most recent model-data difference—for the 5-year period of 2010 to 2014. Over that period, the model projections are running on average about 0.17 deg C too warm.   Keep in mind, these climate model projections are only a few years old and already their performance is terrible.
Figure 3
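The difference-and-smooth calculation behind Figure 3 can be sketched in a few lines. The series here are synthetic stand-ins for the GISS data and the model mean, so the printed number is illustrative only:

```python
# Sketch: model-minus-observations difference, smoothed with a 5-year
# centered running mean to damp ENSO and volcanic volatility. Both
# series are synthetic, standing in for the actual data.
import numpy as np

years = np.arange(1880, 2015)
obs = 0.007 * (years - 1880) + 0.1 * np.sin(years / 9.0)   # synthetic obs
model = 0.009 * (years - 1880)                             # synthetic model mean
obs -= obs.mean()
model -= model.mean()          # common 1880-2014 base period

diff = model - obs             # model minus data, annual values

# 5-year running average, centered on the 3rd year of each window
diff_smoothed = np.convolve(diff, np.ones(5) / 5.0, mode="valid")
center_years = years[2:-2]

print(f"{center_years[-1]}: {diff_smoothed[-1]:.3f} deg C")
```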
We can also see that the models have not simulated surface temperatures this poorly (have not deviated 0.17 deg C from reality) since the 5-year period centered on about 1910. That earlier deviation was caused by the model failure to properly simulate the cooling of global surfaces that took place from the 1880s to about 1910.  The present deviation is caused by the model failure to simulate the recent slowdown in global warming.


30-YEAR MODEL-DATA TRENDS
The carbon-dioxide obsessed often say we need to look at 30-year trends, so let’s do exactly that.  See Figure 4.
Figure 4
An explanation of what’s shown in that graph: Each data point presents the 30-year linear trend (warming and cooling rate) as calculated by MS EXCEL in deg C/decade.  The last data points at 2014 are the linear trends (warming rates) for the 30-year period of 1985-2014.  Working back in time, the data points at 2013 are the warming rates for the period of 1984-2013…and so on, until the first data points at 1909, which show the model and observed trends for the period of 1880 to 1909.  The term “trailing” in the title block indicates the data points are keyed to the last year of the 30-year terms.
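Here is a sketch of the same trailing-trend calculation, done with numpy’s polyfit instead of Excel, and with a synthetic anomaly series standing in for the data:

```python
# Sketch: trailing 30-year linear trends (deg C/decade), the quantity
# plotted in Figure 4. The anomaly series is synthetic, standing in
# for the GISS data.
import numpy as np

years = np.arange(1880, 2015)
anoms = 0.007 * (years - 1880) + 0.1 * np.sin(years / 9.0)  # synthetic

window = 30
trend_end_years, trends = [], []
for i in range(window, len(years) + 1):
    yrs, vals = years[i - window:i], anoms[i - window:i]
    slope = np.polyfit(yrs, vals, 1)[0]     # linear trend, deg C per year
    trend_end_years.append(yrs[-1])         # "trailing": keyed to last year
    trends.append(slope * 10.0)             # convert to deg C per decade

print(trend_end_years[0], trend_end_years[-1])  # 1909 ... 2014
```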


The 30-year period when global surfaces cooled fastest ended about 1909. At that time, the models showed surface temperatures should have been warming if Earth’s surfaces responded to the forcings in the same way as the climate models.  Obviously they didn’t. From the 30-year periods ending in 1909 to just before 1925 (when the data trends were still negative but the negative trends were growing smaller) global surfaces were cooling, but the cooling rate was decelerating. (To simplify this discussion, keep in mind that the years discussed are the last years in 30-year periods.)  Starting just after 1925 and running through about 1945, Earth’s surfaces had warmed and the observed 30-year warming rate grew faster (accelerated), while the models did not show the same multidecadal variability in warming over that time.
IMPORTANT NOTE:  In fact, for the period ending in 1945, the climate models show that global surfaces should only have warmed at a rate that was about 1/3 the observed rate—or, in other words, from 1916 to 1945, global warming occurred at a rate that was about 3 times faster than simulated by climate models—or, to phrase it yet another way, natural variability was responsible for about 2/3rds of the warming from 1916 to 1945. If Earth’s surfaces warmed much faster than simulated by the models, then the warming was caused by something other than the forcings used to drive the climate models…thus it must have been natural variability.  Of course, that undermines the claims that all of the global surface warming in the latter part of the 20th Century was caused by man’s emissions of carbon dioxide. If natural factors were capable of causing about 66% of the global warming from 1916 to 1945, there is every reason to conclude that a major portion of the global surface warming during the latter warming period was caused by natural factors. The fact that the models better align during the latter part of the 20th Century is not proof that the warming in that period was caused by manmade greenhouse gases…the climate models have already shown that they have no skill at being able to simulate global surface temperatures over multidecadal periods. [End note.]


From 1945 to about 1964, observed global warming over 30-year time spans decelerated at rates that were much faster than simulated by models.  But the modeled trends aligned with the data from the mid-1950s to the late 1960s, then diverged slightly during the 1970s and realigned until about 2003.


Over the last 11 years, the observed 30-year global warming rates decelerated slightly while the climate models show that global warming should have continued to accelerate…if carbon dioxide was the primary driver of global surface temperatures.  While the 30-year trends do not show global cooling at this time, they also do not show global warming accelerating as predicted by climate models…and that is the problem that climate scientists are still trying to explain and coming up with dozens of excuses.  If history repeats itself, global warming will continue to decelerate, maybe for as long as another 20 years.


30-YEAR MODEL-DATA TREND DIFFERENCES
Figure 5 shows the differences between the modeled and observed 30-year trends (trailing) in global surface temperatures.  Referring back to Figure 4, the data trends were subtracted from the modeled trends.  For the 30-year period of 1985 to 2014, the models show that global surfaces should have been warming at a rate that’s about 0.085 deg C per decade faster than has been observed.  The last time the models showed 30-year global warming rates that were that much faster than observed was around 1920. Now consider again that these climate model projections are only a few years old.
Figure 5
NCDC GLOBAL SURFACE TEMPERATURE DATA
Figures 6 through 9 are the same as Figures 2 to 5, but with the NCDC global land+ocean temperature anomaly data. The curves are so similar to those with the GISS data that there’s no reason to repeat the dialogue.
Figure 6
# # #
Figure 7
# # #
Figure 8
# # #
Figure 9
THE REASON FOR THE 2014 UPTICK IN GLOBAL COMBINED SURFACE TEMPERATURES
Figure 10 presents the global sea surface temperatures for the period of 1997 to 2014 based on NOAA’s ERSST.v3b data, which is used by GISS and NCDC for their combined global land plus sea surface temperature datasets.  The 2014 value was 0.044 deg C warmer than the previous warmest year, 1998.  Obviously, because the oceans cover 70% of the surface of the planet, the uptick in global combined surface temperatures was the result of the larger uptick in global sea surface temperatures.
Figure 10
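A back-of-the-envelope illustration of that weighting: treat the combined anomaly as roughly a 70/30 ocean/land blend (real products grid and weight more carefully). The land and sea anomaly values here are invented; only the 0.044 deg C SST difference comes from the text above.

```python
# Sketch of why the ocean term dominates a combined land+ocean anomaly.
# The anomaly values are hypothetical, for illustration only.
ocean_frac = 0.70        # oceans cover ~70% of Earth's surface

land_anom_2014 = 0.80    # hypothetical land anomaly, deg C
sst_anom_2014 = 0.55     # hypothetical sea surface anomaly, deg C

combined = ocean_frac * sst_anom_2014 + (1 - ocean_frac) * land_anom_2014
print(f"combined anomaly ~ {combined:.3f} deg C")

# A +0.044 deg C SST uptick alone shifts the combined value by ~0.031 deg C
print(f"SST contribution of +0.044: {ocean_frac * 0.044:.3f} deg C")
```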
We have been discussing for more than 6 months the reasons for the record high sea surface temperatures in 2014. Recently, we confirmed that the uptick in global sea surface temperatures was caused by the unusual weather event in the eastern extratropical North Pacific. See the post Alarmists Bizarrely Claim “Just what AGW predicts” about the Record High Global Sea Surface Temperatures in 2014.  No other ocean basin had record-high sea surface temperatures in 2014.
The following is a reprint of a discussion from that post under the heading of On the Record High Sea Surface Temperatures in 2014:


Again, of the individual ocean basins, only the North Pacific had record high sea surface temperatures this year, and the weather event there was strong enough to cause record warm sea surfaces globally, in the Pacific as a whole and in the Northern Hemisphere.


We’ve been discussing the record high sea surface temperatures since the June Sea Surface Temperature (SST) Update.   We identified the location of the unusual weather event, the likely reasons for the record high sea surface temperatures and the fact that climate models could not explain that warming in the post On The Recent Record-High Global Sea Surface Temperatures – The Wheres and Whys.  We discussed the topic further in other posts, including Axel Timmermann and Kevin Trenberth Highlight the Importance of Natural Variability in Global Warming…   Our discussions of the unusual warming event in the eastern extratropical North Pacific were confirmed by the 2014 paper by Johnstone and Mantua (here) which was presented in the post Researchers Find Northeast Pacific Surface Warming (1900-2012) Caused By Changes in Atmospheric Circulation, NOT Manmade Forcings.  Jim Johnstone, one of the authors of the paper, joined us on the thread of the cross post at WUWT and provided a link to his webpage.  There you can find a link to the paper.  Also see his comment here for an update on the recent unusual warming event in the extratropical North Pacific.   Under the heading of NE Pacific coastal warming due to changes in atmospheric circulation at his webpage, Jim Johnstone updated one of the graphs from their paper and wrote:
Jan 1980 – Nov 2014.  NE Pacific monthly coastal SST anomalies (red) and SST modeled from regional SLP.  Recent warming from Jan 2013 to Nov 2014 occurred in response to low SLP over the NE Pacific, consistent with long-term forcing. Gray bars mark data beginning in January 2013 that were not included in the study.  Negative SLP anomalies generate anomalous cyclonic winds, reducing the mean anticyclonic flow and wind speeds throughout the Arc.  The drop in wind speeds reduces evaporation rates, producing positive surface latent heat fluxes and SST increases.

Also refer to the NOAA summary and FAQ webpage about Johnstone and Mantua (2014) for discussions about the paper in less-technical terms.
As we’ve been saying for years, coupled ocean-atmosphere processes can and do cause regional warming, which, in turn, leads to the warming of ocean surfaces globally.
ONE LAST NOTE
The NOAA press release from Wednesday includes the following statements (my boldface):
NOAA and NASA independently produce a record of Earth’s surface temperatures and trends based on historical observations over oceans and land. Consistency between the two independent analyses, as well as analyses produced by other countries, increases confidence in the accuracy of such data, the assessment of the data, and resulting conclusions.

NOAA and GISS may produce the surface temperature data independently, using different methods to infill missing data, but they rely on the same sea surface temperature data (NOAA’s ERSST.v3b) and, for the most part, on the same land surface air temperature source data (NOAA’s GHCN).  Though GISS does include a few other surface temperature datasets in areas where the GHCN data are sparse, they rely primarily on the same data for both land and oceans.  The two analyses cannot be independent if they rely on the same source data.
CLOSING


As illustrated and discussed, while global surface temperatures rose slightly in 2014, the minor uptick did little to overcome the growing difference between observed global surface temperature and the projections of global surface warming by the climate models used by the IPCC.


This post will serve as the annual surface temperature update for GISS and NCDC.  The full monthly update will follow later in the day or tomorrow.


SOURCES
See the GISS global Land-Ocean Temperature Index (LOTI) data page; the NCDC data are accessible here (can be very slow).