Thursday, August 29, 2013

‘VULCAN MIND MELD’: FIRST HUMAN BRAIN-TO-BRAIN COMMUNICATION LETS SCIENTIST CONTROL ANOTHER PERSON’S MOVEMENT

Remember Star Trek II: The Wrath of Khan?
Researchers have made the first step in human telepathy, creating a brain-to-brain interface that allows one person to control the motions of another.
Mind control technology has been making strides in the medical field, helping paralyzed or disabled patients feed themselves and fly drones as researchers hope to give them more independence. But these instances have only used a person’s brain activity to power a device, like a robot.
In contrast, researchers at the University of Washington used the brain signals of one person to control the hand motions of another person.
The EEG signals from Rajesh Rao’s brain were captured and then transferred over the internet to Andrea Stocco, who was wearing a cap over the part of his brain that would control his hand movement. (Photo: University of Washington/Bryan Djunaedi)
The brain signals were sent over the Internet, allowing Rajesh Rao to move Andrea Stocco’s finger on a keyboard, according to the university’s website.
“The Internet was a way to connect computers, and now it can be a way to connect brains,” Stocco said. “We want to take the knowledge of a brain and transmit it directly from brain to brain.”
The researchers caution that the technology can only read simple commands from the brain at this point. They also noted that it cannot control a person’s actions against their will. (Photo: University of Washington)
The brain-to-brain interface involves one participant wearing a cap with electrodes that are attached to an electroencephalography machine. This cap picks up the brain’s electrical signals and transfers the information to the other person, who in this experiment was sitting on the other side of the university’s campus, wearing a cap with a transcranial magnetic stimulation coil above his left motor cortex, an area of the brain that controls hand movement.
Here’s more about how the experiment was conducted:
Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the “fire” button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn’t looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.
This diagram details how the information is transferred from one person’s brain to another and translated into movement. (Image: University of Washington)
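Reduced to software terms, the pipeline in the diagram has three stages: detect the motor-imagery signal in the EEG, send an event over the Internet, and trigger the TMS coil. The sketch below is purely illustrative Python under that reading; the `eeg` and `tms` device objects, the threshold, and all of the names are invented for illustration, not taken from the UW system.

```python
# Hypothetical sketch of the pipeline in the diagram: EEG motor-imagery
# detection on the sender's side, a message over the Internet, and a TMS
# pulse on the receiver's side. The `eeg` and `tms` objects stand in for
# real device drivers; every name here is invented for illustration.
import json
import socket

SIGNAL_THRESHOLD = 0.8  # assumed confidence cutoff for "imagined hand movement"

def sender_loop(eeg, receiver_addr):
    """Rao's side: watch the EEG stream and forward 'fire' events."""
    with socket.create_connection(receiver_addr) as conn:
        while True:
            window = eeg.read_window()            # a short window of raw EEG
            score = eeg.classify_imagery(window)  # motor-imagery likelihood
            if score > SIGNAL_THRESHOLD:
                conn.sendall(json.dumps({"event": "fire"}).encode() + b"\n")

def receiver_loop(tms, listen_addr):
    """Stocco's side: pulse the coil over the left motor cortex on 'fire'."""
    server = socket.create_server(listen_addr)
    conn, _ = server.accept()
    for line in conn.makefile():
        if json.loads(line).get("event") == "fire":
            tms.pulse()  # stimulation evokes the involuntary finger press
```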


Wednesday, August 28, 2013

Warmist Off Deep End: ‘Will climate change kill off humanity? Unlikely, but we may wish it had.’

“If humans were to go extinct, the earth would at least have a chance to repair itself…”
Chris Clarke writes at KCET.org:
Will we go extinct? There’s some thought that humans have already passed through a near-extinction event, related to the Toba eruption about 70,000 years ago, that may have reduced our total global population down to 15,000 people or fewer in southern Africa. That theory is questioned by some who cite the possibility of other surviving bands of humans. Either interpretation offers us hope for our species: humans can survive horrible catastrophes and rebuild.
Which means the question may not be so much “will we die out” as “will we wish we had.” If humans were to go extinct, the earth would at least have a chance to repair itself, evolve new biodiversity, and move on. But having the globe sprinkled with scattered bands of a few hundred survivors, each desperately scraping whatever sustenance might come from the planet their ancestors ruined? That’s a much more frightening prospect for both the planet and our great great grandchildren.

Monday, August 26, 2013

Svensmark Effect Attacked: Study claims cosmic rays don’t affect clouds

Svensmark hypothesized that cosmic-ray flux affects cloud formation which, in turn, affects climate.

Henrik Svensmark

CREDENTIALS 

  • Ph.D., Physics Laboratory I, Technical University of Denmark, (September, 1987).
  • Master of Science in Engineering (Cand. Polyt), Physics Laboratory I, The Technical University of Denmark, (February, 1985).
Source: [1]

BACKGROUND

Henrik Svensmark is a physicist at the Danish National Space Center in Copenhagen. Svensmark has studied the effects of cosmic rays on cloud formation and presented a hypothesis that global warming is caused by solar radiation.
Svensmark appeared in a documentary titled "The Cloud Mystery" to illustrate his position, and has also shared his research at the Heartland Institute's International Conference on Climate Change.
Henrik Svensmark is director of the Centre for Sun-Climate Research at the Danish Space Research Institute (DSRI). Previously, Dr. Svensmark was head of the Sun-Climate group at DSRI. He has held postdoctoral positions in physics at the University of California, Berkeley, the Nordic Institute of Theoretical Physics, and the Niels Bohr Institute. [2]

STANCE ON CLIMATE CHANGE

"In fact global warming has stopped and a cooling is beginning. No climate model has predicted a cooling of the Earth – quite the contrary. And this means that the projections of future climate are unreliable." [3]
According to Lawrence Solomon, "Dr. Svensmark has never disputed the existence of greenhouse gases and the greenhouse effect. To the contrary, he believes that an understanding of the sun's role is needed to learn the full story, and thus determine man's role. Not only does no climate model today consider the effect of cosmic particles, but even clouds are too poorly understood to be incorporated into any serious climate model." [2]

KEY QUOTES

"During the last 100 years cosmic rays became scarcer because unusually vigorous action by the Sun batted away many of them. Fewer cosmic rays meant fewer clouds—and a warmer world. [4]


Astrobiology Web reports:
The problem of the contribution of cosmic rays to climate change is a continuing one and one of importance. In principle, at least, the recent results from the CLOUD project at CERN provide information about the role of ionizing particles in ‘sensitizing’ atmospheric aerosols which might, later, give rise to cloud droplets.
Our analysis shows that, although important in cloud physics, the results do not lead to the conclusion that cosmic rays affect atmospheric clouds significantly, at least if H2SO4 is the dominant source of aerosols in the atmosphere. An analysis of the very recent studies of stratospheric aerosol changes following a giant solar energetic particles event shows a similar negligible effect. Recent measurements of the cosmic ray intensity show that a former decrease with time has been reversed. Thus, even if cosmic rays enhanced cloud production, there would be a small global cooling, not warming.

High fructose corn syrup causes diabetes: myth vs. science

Over the past few months, there have been a lot of baseless claims trying to link high fructose corn syrup (HFCS) and a variety of diseases, especially Type 2 diabetes. Like many of these medical myths, there is, at its core, some tiny bit of evidence that is generally misinterpreted or misused. So let’s take a close look at Type 2 diabetes, HFCS, and the evidence that either supports or refutes the hypothesis that consuming HFCS is any more responsible for the disease than other sugars.
Just for background, the claimed link is between HFCS and Diabetes mellitus Type 2 (Type 2 diabetes, T2DM), a metabolic disorder characterized by high blood glucose in the context of insulin resistance and relative insulin deficiency. In general, someone with T2DM produces low (or maybe even adequate) levels of insulin, but various cells and organs become resistant to it, so cells don’t remove or store blood glucose. Although the cause of Type 2 diabetes is not completely understood, it results from a complex interaction between diet, obesity, genetics, age and gender. Some factors, like diet and obesity, are under personal control, but most aren’t.
Because they are often confused, it’s important to note that T2DM has a completely different cause and pathophysiology than Diabetes mellitus Type 1 (T1DM, once called juvenile diabetes). Type 1 diabetes results from the inability of the beta cells of the pancreas to produce insulin, mostly as a result of an autoimmune disease. Typically, T1DM begins in children, though there are forms of the disease that begin in a person’s 30s or 40s that were confused with the Type 2 version in the past; blood tests can determine whether it is Type 1 or Type 2. As far as we currently know, T1DM is neither preventable nor curable, and there is only conflicting evidence about what causes it. Diet, including consumption of sugars, won’t cause T1DM. Furthermore, although there are numerous treatments, lifestyle changes, and treatment regimens that can change the course of T2DM, Type 1 is a death sentence without regular daily insulin injections. Over 90-95% of diabetes cases are the Type 2 form.
The consequences of both types of diabetes are almost the same. Complications of poorly managed diabetes mellitus may include cardiovascular disease, diabetic neuropathy, and diabetic retinopathy, among many other chronic conditions. I intended this to be a quick explanation of diabetes, but the background is useful for understanding the hype behind high fructose corn syrup.
After giving you a background discussion of diabetes, it’s time for a similar focus on the basics of sugar biochemistry.
There are two broad types of sugars, aldose and ketose, along with over twenty individual, naturally occurring sugars, called monosaccharides. Of all of those sugars, only four play any significant role in human nutrition: glucose, fructose, galactose, and ribose (which has a very minor nutritional role, though a major one as the backbone of DNA and RNA). Got that? Four sugars. Whatever you eat, however you consume it, you can only absorb four sugars. And each of these sugars is precisely the same chemical wherever it comes from. Fructose, glucose, and galactose are identical whether they come from a chemical factory, from sugar cane, or from organic honey lovingly taken from the wild honeybees of Switzerland who were gently allowed to fly amongst the organic clover fields of the high Alps. They are all the same.
Of course, it does get a bit more complicated. Many sugars form disaccharides that are compounds made of two monosaccharide sugars covalently bound together. Table sugar, the white stuff in our sugar bowls, is called sucrose which is one molecule of glucose bound to one molecule of fructose. Sucrose is also the main sugar in most commercially purchased sugars that you find including brown sugar, molasses, beet sugar, and maple sugar (and syrup). Milk sugar is lactose, which is a glucose and galactose disaccharide, maltose is two glucose molecules, and there are a few dozen less common ones. Each disaccharide has a slightly different taste and sweetness, and different combinations of disaccharides provide unique tastes to certain fruits and vegetables.
There are more complex sugars, called polysaccharides, that are long chains of sugars and are the constituent biochemicals in a lot of plant and animal structures, both at the cellular level and in gross anatomy. For example, chitin, the exoskeleton of insects and crustaceans, is essentially a long chain of glucose molecules that’s extremely hard (and completely indigestible by humans).
Before we can continue on to our discussion of HFCS, you need to know one important thing: humans can only absorb monosaccharides, specifically glucose, fructose, galactose and ribose. In other words, all of those disaccharides and polysaccharides must be broken down into their constituent monosaccharides before they have any usefulness for a human. The gut has a variety of different enzymes that break down these starches and disaccharides; sucrose is not absorbed by the small intestine, but is instead broken down by the sucrase enzyme into glucose and fructose, and only at that point is it absorbed into the bloodstream to be used for energy. By the way, any disaccharide or polysaccharide that is not broken down remains in the gut, providing food for our gut bacteria and thereby maintaining a healthy digestive system. Moreover, some individuals lack certain enzymes needed to digest some disaccharides; lactose intolerance results from the lack of lactase, which breaks down lactose into glucose and galactose. In these individuals, lactose cannot be absorbed, and it passes to the bacterial flora, causing significant intestinal distress.
It is important to note that fructose is 1.73 times sweeter than sucrose despite having exactly the same caloric content. So technically, you could use about 42% less fructose than sucrose (only about 58% as much) to get the same sweetness. And any food that has more fructose (like honey, pears and grapes) will taste sweeter, even though it has no additional calories.
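A quick back-of-the-envelope check of that trade-off, with the numbers taken straight from the paragraph above:

```python
# Sweetness-equivalence arithmetic for fructose vs. sucrose.
RELATIVE_SWEETNESS = 1.73   # fructose relative to sucrose, per the text

sucrose_g = 100.0
fructose_g = sucrose_g / RELATIVE_SWEETNESS   # same perceived sweetness
print(f"{fructose_g:.1f} g of fructose is as sweet as {sucrose_g:.0f} g of sucrose")
# -> 57.8 g, i.e. roughly 42% less sugar; since both carry ~4 kcal/g,
#    that is also roughly 42% fewer calories for the same sweetness.
```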
HFCS consists of 24% water, and the rest fructose and glucose. There are two main types of HFCS: HFCS 55 (used mostly in soft drinks), which is approximately 55% fructose and 42% glucose; and HFCS 42 (used in other types of beverages and processed foods), which is approximately 42% fructose and 53% glucose. There is another type, HFCS-90, approximately 90% fructose and 10% glucose, which is used in small quantities for specialty applications (interestingly, low-calorie drinks, because, for the same sweetness as sucrose-based drinks, about 33% fewer calories are added), but it is primarily blended with HFCS 42 to make HFCS 55.
And despite the fact that HFCS is produced from corn, it contains the same fructose and glucose found in table sugar, or any other plant in the world. Fructose and glucose do not chemically differ in any way from fructose and glucose found anywhere else; each is a fixed chemical structure. Admittedly, HFCS is used in some foods because of cost: for the same sweetness as sucrose (cane sugar, for example), the manufacturer can use less, with fewer calories. Given that HFCS contains both natural fructose and natural glucose, and it is just as sweet with fewer calories, you’d think that people would be all over that.
I hope that’s clear. HFCS is nothing more than a syrup that contains a higher ratio of “natural” fructose to “natural” glucose than table sugar. 
The story now becomes complicated, and it is the basis of some of the speculation about HFCS and diabetes, because galactose, fructose and glucose are treated differently by human metabolism. Glucose passes through the liver unchanged, and can be used by all cells for energy. The level of glucose is controlled by insulin, which causes it to be stored if the blood levels get high, and glucagon, another hormone which causes the release of glucose from storage. This control system is highly complicated, and in non-diabetics, is a finely tuned system. Fructose and galactose don’t signal insulin, but are captured by the liver, eventually processed into a couple of different biochemicals, one of which is glucose. So, because fructose is treated in a different manner by the body, speculation has been that fructose might be implicated in T2DM. How the body controls blood sugar levels, and how fructose and galactose are involved in that control, is incredibly complex and would take at least a year of graduate level classwork to even begin to understand the physiology (I know this, because I took those classes, and even today I scratch my head about how complex it is).
Except, there are some problems with this speculation about fructose and T2DM. For example, fructose has a very low glycemic index of 19 ± 2, compared with 100 for glucose and 68 ± 5 for sucrose. Because fructose is 1.73X sweeter than sucrose, diabetics can consume significantly less fructose (than other forms of sugars) for an equivalent level of sweetness. Studies show that fructose consumed before a meal may even lessen the glycemic response of the meal. In other words, specifically because of the sweetness and lower insulin reactivity, fructose may actually be preferred for those who are attempting a low glycemic index diet.
A lot of the current “mania” about HFCS and T2DM results from a recent article in an open-access journal, Global Public Health, which has an impact factor of 0.92, about as low as you can get on the impact factor scale. The authors tried to establish a correlation between the availability of HFCS foods and the incidence of Type 2 diabetes. This type of study works at the population level, which may seem like it would give you great numbers, but the problem is that it admits so many confounding factors, and ignores all sorts of other data that might provide us with better information. In other words, it is simply not a way to establish correlation, let alone causation.
The problems with this study are numerous:
  1. They assume that availability of HFCS foods is the major cause of obesity. It simply isn’t. It is possible that high availability of HFCS simply tracks access to various other foods that might cause obesity.
  2. Obesity itself is not the sole causal factor in T2DM; it is one of a number of causes of the disease.
  3. Even though the researchers attempted to control for other factors, there are just too many factors that may skew the results at a country level. The best type of epidemiological study would be a prospective study, which would allow for controlling different factors along with getting more detailed data about each patient. A prospective study takes time and is expensive, but it gives some of the best results upon which to confirm or refute a hypothesis about HFCS being causal to T2DM. The authors of this study took the easiest and simplest route to write a paper: they obtained country-level data for T2DM, gross domestic product, HFCS production in foodstuffs, and calories consumed. This type of work takes a few days and would only require you to leave the computer for sustenance. I could look up the sales of Xboxes and PlayStations in each country, compare them to T2DM rates, get it published in a bad journal, and make a name for myself claiming that video gaming is correlated with Type 2 diabetes (a toy simulation of exactly this kind of spurious correlation follows this list). And even if I could show some correlation between video gaming and diabetes, it wouldn’t be worth anything, though I’d make a big name for myself.
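To make that Xbox-and-PlayStation point concrete, here is a toy simulation, with every number invented, of how two variables that merely track national wealth end up strongly correlated with each other at the country level:

```python
# Toy simulation: two variables that both track GDP ("wealth") will
# correlate with each other at the country level even though neither
# causes the other. All numbers here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_countries = 40
wealth = rng.normal(size=n_countries)            # hidden confounder (e.g. GDP)

consoles_sold = 2.0 * wealth + rng.normal(size=n_countries)   # tracks wealth
t2dm_rate     = 1.5 * wealth + rng.normal(size=n_countries)   # also tracks wealth

r = np.corrcoef(consoles_sold, t2dm_rate)[0, 1]
print(f"country-level correlation: r = {r:.2f}")  # typically r ~ 0.6-0.8
```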
In a review article examining the health implications of HFCS, the author, James Rippe, MD, states that “most of the studies being cited to support the proposed linkages between fructose consumption and obesity and other metabolic conditions employ epidemiologic data that establishes associations rather than cause and effect.” The study I critiqued above is a perfect example of the type of study dismissed by Dr. Rippe. He goes on to conclude that:
While the fructose hypothesis is an interesting one, it poses the danger of distracting us from further exploration and amelioration of the known causes of obesity and related metabolic conditions. It is important to remember that many of the metabolic abnormalities currently being postulated as attributable to fructose consumption may also be ascribed to obesity itself.
The epidemiologic evidence being cited to support metabolic abnormalities related to fructose consumption leaves many questions unanswered. There are compelling data to support excessive caloric consumption as the major dietary driver of obesity. The fructose hypothesis is based largely on epidemiologic data that do not establish cause and effect. All too often, we have been led astray by confusing associations with cause and effect. With the fructose argument, we are in danger of repeating mistakes frequently made in the past by basing judgments on insufficient evidence.
It’s clear that there are individuals who want to “prove” that HFCS is unsafe and causes all sorts of problems in humans. But HFCS is a sugar syrup, close to honey in its ratio of fructose to glucose. Just because it has a scary chemical name, high fructose corn syrup, people think it must be made up of some evil fructose chemical. But all fructose molecules are exactly the same, whether in honey, a fruit, maple syrup, cane sugar, or HFCS.
High quality reviews of the research around HFCS and T2DM have consistently stated that the data do not show any causality between the sugar and the disease. We could cherry-pick a few poorly designed epidemiological studies, or studies that force-feed rats to induce diabetes, but neither type provides us with solid or even intriguing evidence that HFCS bears some responsibility for T2DM. Until we have two pieces of information, one, a high-powered prospective epidemiological study, and two, a definitive explanation of how fructose could disrupt metabolism in a way that leads to T2DM, we lack any reliable evidence that HFCS itself, rather than simply any sugar, causes T2DM. The hypothesis that is well understood, and well supported by evidence, is that any sugar can lead to obesity, which in turn raises the risk of T2DM.
If I were raising children, and frankly, I have, I would keep them from HFCS. And sucrose. And honey. And fruit juices. And cotton candy. And chocolate bars. And fatty foods. And potato chips. Picking on HFCS just seems crazy.

WaPo slams ethanol ‘central planning’ — but it would somehow be different with CO2?

Sadly though, carbon central planning would be just peachy.

The biggest problem with the carbon tax is that it would be hard to get one through Congress. Fine. Then lawmakers should choose another policy that encourages conservation and innovation without absurd central planning.

Ditch ethanol mandates. Try a carbon tax.

The WaPo editorializes:

WASHINGTON IS seeing a great fight between two extremely powerful lobbies, Big Ethanol and Big Oil. Neither should win.
At issue is the Renewable Fuel Standard, a huge subsidy meant for companies making all kinds of ethanol but that mostly benefits the least-attractive type, derived from corn. The policy demands that increasing amounts of various sorts of ethanol be blended into the nation’s gasoline supply. Yet, oil companies point out, when Congress last looked at the standard in 2007, estimates of how much fuel Americans would be using by now were much too high. The result today is a legal requirement to blend the same, mandated amount of ethanol into a smaller-than-expected pool of fuel sold, a task for which the country doesn’t have the infrastructure. And that assumes the ethanol industry manages to produce enough of each type of ethanol in the first place, which it hasn’t. The result, oil companies argue, is a “blend wall” that inevitably translates into higher gasoline prices for consumers, since oil firms have to buy special credits to make up for missing the law’s blending targets.

This month the Environmental Protection Agency, which oversees the standard, softened the government’s ethanol mandate somewhat. But even if the blend wall weren’t an issue, the ethanol industry simply doesn’t deserve federal pampering. The industry claims to be environmentally friendly, but credible environmentalists disagree. The industry claims to be helping the United States wean itself off foreign oil, but increasing domestic oil supplies make that less important, and there are better ways to do it anyway. The industry says it has marshaled billions in private investment, but all of that would surely have gone to more economically productive use if the government weren’t tipping the scales so emphatically in ethanol’s direction.
Ethanol backers are right that it’s firmly in the nation’s interest to reduce the amount of oil Americans use, not least because it’s dirty. The answer, though, is not for lawmakers to guarantee a market for an alternative product of their choosing. Instead, Congress should make polluters pay something for the pollution they cause by establishing a reasonable tax on the carbon dioxide contained in oil and other fossil fuels. Without all the congressional micromanaging, the policy would nudge consumers to use less fuel, it would give investors incentive to divert their money to the clean technologies that will do the most good, and it would direct the tax revenue raised to the Treasury, instead of enriching politically favored groups.
If perfecting and using ethanol is an economically efficient way to reduce the transportation sector’s emissions, the industry should prosper under those conditions. Oil companies, meanwhile, would still sell a lot of gasoline for a long time, but they would face consumers less eager to take unnecessary drives or buy gas-guzzlers.


Fracking defended at coal’s expense with EPA air quality junk science

To fend off claims that fracking poses a risk to health from air emissions, an MIT economist/EPA advisor claims that the risks are offset by lives saved from using less coal.

However, there is no evidence that coal plant emissions kill anyone.
http://junkscience.com/2011/09/15/epa-shows-us-a-body/



PITTSBURGH — A project examining the local health effects from natural gas drilling is providing some of the first preliminary numbers about people who may be affected, and the results challenge the industry position that no one suffers but also suggest that the problems may not be as widespread as some critics claim.
The Southwest Pennsylvania Environmental Health Project has been trying to help people who feel they’ve been sickened by natural gas drilling or processing for about 18 months in one county south of Pittsburgh.


The work is potentially important because it’s one of the first long-term attempts to monitor drilling-related health effects, and it could help other groups identify possible symptoms.
The project found 27 cases where people in Washington County believe they were hurt by nearby drilling — seven cases of skin rashes, four of eye irritation, 13 of breathing problems and three of headaches and dizziness. The skin exposures were from water and the other cases were from air. The numbers don’t represent a full survey of the area, just cases with plausible exposures.
The EHP group is trying to help people who have been exposed to drilling-related air or water pollution, toxicologist David Brown told the Associated Press, adding that they’re finding “an array of symptoms” in some people who live close to wells or processing stations.
There were some surprises: Air pollution seems to be more of a threat than tainted water, and the huge processing stations that push gas into national pipelines may be more of a problem than the drilling sites themselves. The processing stations can handle large volumes of gas from hundreds of wells.
Washington County has a population of about 200,000, and about 700 natural gas wells have been drilled there in the past six years. It’s also home to large gas processing operations.
Some experts not involved with the findings praised the general program, but said the debate over fracking and health often neglects a crucial point. Fracking, or hydraulic fracturing, is a technique in which water is mixed with sand and chemicals and injected into rock formations to break them up and free trapped gases.
“There’s a strong case that people in the U.S. are already leading longer lives as a consequence of the fracking revolution,” said Michael Greenstone, a professor of environmental economics at the Massachusetts Institute of Technology. That’s because many power plants have stopped burning coal and switched to natural gas, which emits far less fine soot, nitrogen oxides and sulfur dioxide.
“Obviously, that has to be counterbalanced against the local effects of the drilling,” and that makes for a complicated decision, said Greenstone, formerly one of President Obama’s chief economic advisers. Obama has expressed strong support for the natural gas drilling boom, and has said it can be done safely.
Greenstone said more work needs to be done to confirm that Washington County residents were affected by natural gas activity and not by other factors, but he called the project an “important start.”
The EHP group only counted cases where symptoms began after natural gas activity started, where there was a plausible source of exposure and where the individual didn’t have an underlying medical condition that was likely to have caused the symptoms.
Brown said the project team is aware that more work needs to be done on links between natural gas activities and their effects on health. He said the work has been “a lot harder than I thought it was going to be,” but substantial progress has been made.
Brown said one of the most worrying findings was the extremely high levels of air pollution found inside two homes that are about 1,000 feet from a gas processing station. Western Pennsylvania tends to have high levels of air pollution, but the levels found in the two homes were up to four times higher than the local average.
Brown said the group is collecting more data and pushing ahead to refine ways to advise people who are worried about nearby natural gas activity.

Michael Mann has another Twitter fit after JunkScience posts Monckton debunking of Mann

The Mann-child responds as may now be expected.
JunkScience posted Monckton’s response to Mann a couple hours ago… and Mann goes into yet another Twitter spasm attacking Milloy.
[Screenshot: Mann’s tweets, Aug. 25, 2013]
Previous Mann twitter fits:

Monckton responds to Mann: Global warming has stopped. Get over it.

By Christopher Monckton of Brenchley
August 25, 2013
The collapsed global warming scare certainly has some odd characters coming to its defense in this paper. Michael Mann (Aug. 25), whom the Attorney General of Virginia investigated under the Fraud Against Taxpayers Act 2000 after some statistical peculiarities in Mann’s failed attempt to abolish the medieval warm period, now bloops another blooper.
He tries to deny the embarrassing near-17-year pause in global warming because “NASA found the warming continues unabated, with the past decade the warmest on record”. As an expert reviewer for the Fifth Assessment Report of the UN’s climate panel, let me correct his latest gaffe.
[Graph: RSS monthly global temperature record]

The monthly near-surface temperature record from the RSS satellites (above) shows no warming trend for 16 years 8 months. But go back 20 years and some warming shows up. The temperature climbed from 1993-1996, then stopped.
So the latest decade is a bit warmer than those that went before, but there has still been no warming for almost 17 years. Even the climate-science chairman of the UN’s climate panel, the IPCC, admits that. Elementary, my dear Michael. Tut, tut! Statistics 101.
Mann says there is “evidence that humans are warming the planet”. There can’t be. For 200 months there has been no warming at all. Get over it. Get a life.
Mann says his discredited attempt to rewrite medieval temperatures “has not been disproved”. Well, here is what Professor Ross McKitrick, who exposed Mann’s statistical peculiarities in the learned journals, had to say about it:
“… The conclusions are unsupported by the data. At the political level the emerging debate is about whether the enormous international trust that has been placed in the IPCC was betrayed. The hockey stick story reveals that the IPCC allowed a deeply flawed study to dominate the Third Assessment Report, which suggests the possibility of bias in the Report-writing process. In view of the massive global influence of IPCC Reports, there is an urgent need to bias-proof future assessments …”.
And here is the report of three Congressional statisticians in 2006:
“… we judge that the sharing of research materials, data and results was haphazardly and grudgingly done. In this case we judge that there was too much reliance on peer review, which was not necessarily independent.
“Moreover, the work has been sufficiently politicized that this community can hardly reassess their public positions without losing credibility.
“Overall, our committee believes that Mann’s assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis.”
Mann goes on to say, “Dozens of independent groups of scientists have independently reproduced and confirmed our findings …”. His double use of “independent” was scarcely the mot juste. Here is what the three statisticians told Congress:
“In our further exploration of the social network of authorships in temperature reconstruction, we found that at least 43 authors have direct ties to Dr. Mann by virtue of co-authored papers with him.
“Our findings from this analysis suggest that authors in the area of paleoclimate studies are closely connected and thus ‘independent studies’ may not be as independent as they might appear on the surface.”
Mann then complains about my pointing out that his earlier offensive references to climate “deniers” and “denialists” would be illegal in Europe as anti-Jewish, racialist hate speech. He says he is Jewish. Then he should know better than to use such unscientific and (in Europe) illegal terms, calculated to imply Holocaust denial on the part of his opponents.
Mann says the House of Lords says I am not a member when I say I am. Sigh! Mann knows no more of British constitutional practice than he does of elementary statistics. Hansard records that the House has recognized my title to succeed my late beloved father, but does not record the House as saying I am not a member. Facts wrong again, Mike, baby. Try doing science, not invective.
Finally, Mann says I “impersonated a delegate from Myanmar” at a UN conference. Do I look Burmese? Do I sound Burmese? Did the chairman of the conference say he thought I was Burmese? No. He said he knew I was not from Burma. Facts wrong yet again, Mickey.
Meanwhile, the world continues to fail to warm as predicted. Not only Attorneys General but also taxpayers will soon, and rightly, be demanding their money back from the grasping profiteers of doom who so monstrously over-egged this particular pudding.
Lord Monckton is an expert reviewer for the IPCC’s forthcoming Fifth Assessment Report. He has lectured worldwide in climate science and economics and has published several papers in the learned literature. Oh, and his passport says he is The Right Honourable Christopher Walter, Viscount Monckton of Brenchley.

Satellite temps flat for 200 months now

If the global warming era started in June 1988 with Jim Hansen’s drama-queen congressional testimony, then atmospheric temps have been flat 67% of the time since.

[Graph: WoodForTrees.org – Paul Clark]
[NOTE: RSS is a satellite temperature data set much like the UAH dataset from Dr. Roy Spencer and John Christy - Anthony]
Guest Post By Werner Brozek, Edited By Just The Facts
The graphic above shows three lines. The long line shows that RSS has been flat from December 1996 to July 2013, a period of 16 years and 8 months, or 200 months. The slightly higher flat line in the middle is the latest complete decade of 120 months, from January 2001 to December 2010. The slightly downward-sloping line covers the most recent 120 months. It very clearly shows that it has been cooling lately; however, this cooling is not statistically significant.
In my opinion, if you want to find out what the temperatures are doing over the last 10 or 16 years on any data set, you should find the slope of the line for the years in question. However, some people insist on saying global warming is accelerating by comparing the decade from 2001 to 2010 to the previous decade. They conveniently ignore what has happened since January 2011. When one compares the average anomaly from January 2011 to the present with the average anomaly from January 2001 to December 2010, the more recent period has the lower number on all six data sets that I have been discussing. Global warming is not merely decelerating; in fact, on all six data sets, cooling is actually taking place.
The numbers for RSS for example are as follows: From January 2001 to December 2010, the average anomaly was 0.265. For the last 31 months from January 2011 to July 2013, the average anomaly is 0.184. The difference between these is -0.081. I realize that it is only for a short time, but it is long enough that there is no way that RSS, for example, will show a positive difference before the end of the year. In order for that to happen, we can use the numbers indicated to calculate what is required. Our equation would be (0.184)(31) + 5x = (0.265)(36). Solving for x gives 0.767. This is close to the highest anomaly ever recorded on RSS, which is 0.857 from April 1998. With the present ENSO conditions, there is no way that will happen.
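The algebra is easy to check; here it is as a few lines of Python, with the numbers copied from the paragraph above:

```python
# What average anomaly would RSS need for Aug-Dec 2013 so that the
# 36-month average (Jan 2011 onward) matches the 2001-2010 average?
old_avg = 0.265              # average anomaly, Jan 2001 - Dec 2010
new_avg, new_n = 0.184, 31   # average anomaly and months, Jan 2011 - Jul 2013
months_left = 5              # Aug - Dec 2013

# Solve new_avg*new_n + months_left*x = old_avg*(new_n + months_left) for x:
x = (old_avg * (new_n + months_left) - new_avg * new_n) / months_left
print(f"required anomaly: {x:.3f}")   # 0.767, near the all-time RSS high of 0.857
```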
A word to the wise: do not even mention accelerated global warming until the difference is positive on all data sets.
I have added rows 23 to 25 to the table in Section 3 with the intention of updating it with every post. This table shows the numbers that I have given for RSS above as well as the corresponding numbers on the other five data sets I have been discussing. Do you feel this would be a valuable addition to my posts?
(Note: If you read my last article and just wish to know what is new with the July data, you will find the most important new things from lines 7 to the end of the table.)
Below we present the latest facts. The information is in three sections and an appendix. The first section shows for how long there has been no warming on several data sets. The second section shows for how long there has been no statistically significant warming on several data sets. The third section shows how 2013 to date compares with 2012 and with the warmest years and months on record so far. The appendix illustrates sections 1 and 2 in a different way; graphs and a table are used to illustrate the data.
Section 1
This analysis uses the latest month for which data is available on WoodForTrees.org (WFT). All of the data on WFT is also available at the specific sources outlined below. We start with the present date and go to the furthest month in the past where the slope is at least slightly negative. So if the slope from September is 4 × 10^-4 but −4 × 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest if we say the slope is flat from a certain month.
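A minimal sketch of that search, assuming the monthly series has already been downloaded as (decimal year, anomaly) pairs; the function name and data layout are ours, not WFT's:

```python
# Find the furthest-back start month from which the least-squares trend
# to the present is still non-positive ("flat"), per the rule above.
import numpy as np

def furthest_flat_start(anomalies):
    """anomalies: sequence of (decimal_year, anomaly) pairs, oldest first."""
    data = np.asarray(anomalies, dtype=float)
    for start in range(len(data) - 2):          # earliest candidates first
        t, y = data[start:, 0], data[start:, 1]
        slope = np.polyfit(t, y, 1)[0]          # OLS trend, degrees C / year
        if slope <= 0:
            return data[start, 0]               # flat from this month onward
    return None                                 # trend is positive throughout
```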
On all data sets below, the periods for which the slope is at least very slightly negative range from 8 years and 7 months to 16 years and 8 months.
1. For GISS, the slope is flat since February 2001 or 12 years, 6 months. (goes to July)
2. For Hadcrut3, the slope is flat since April 1997 or 16 years, 4 months. (goes to July)
3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 12 years, 8 months. (goes to July)
4. For Hadcrut4, the slope is flat since December 2000 or 12 years, 8 months. (goes to July)
5. For Hadsst2, the slope is flat since March 1997 or 16 years, 4 months. (goes to June) (The July anomaly is out, but it is not on WFT yet.)
6. For UAH, the slope is flat since January 2005 or 8 years, 7 months. (goes to July using version 5.5)
7. For RSS, the slope is flat since December 1996 or 16 years and 8 months. (goes to July) RSS is 200/204, or 98%, of the way to Ben Santer’s 17 years.
The next link shows just the lines to illustrate the above for what can be shown. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the sloped wiggly line shows how CO2 has increased over this period.
[Graph: WoodForTrees.org – Paul Clark]
When two things are plotted as I have done, the left axis only shows the temperature anomaly. It goes from 0.1 C to 0.6 C. A change of 0.5 C over 16 years works out to about 3.0 C over 100 years, and 3.0 C is about the average of what the IPCC says the temperature increase may be by 2100.
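That scaling is a one-line check:

```python
# Linear extrapolation of the left-hand scale's range to a century.
change, years = 0.5, 16          # degrees C and span read off the graph
print(f"{change / years * 100:.1f} C per 100 years")   # ~3.1, i.e. about 3.0 C
```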
So for this to be the case, the slope for all of the data sets would have to be as steep as the CO2 slope. Hopefully the graphs show that this is totally untenable.
The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted.
[Graph source: WoodForTrees.org – Paul Clark]
Section 2
For this analysis, data was retrieved from SkepticalScience.com. This analysis indicates for how long there has not been statistically significant warming according to their criteria. The numbers below start from January of the year indicated. Data go to their latest update for each set. In every case, note that the magnitude of the second number is larger than the first, so a slope of 0 cannot be ruled out. (To the best of my knowledge, SkS uses the same criteria that Phil Jones uses to determine statistical significance.)
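As a bare-bones illustration of the criterion, the sketch below fits an ordinary least-squares trend and asks whether zero lies within two standard errors of the slope. The real SkS calculator also corrects for autocorrelation, which widens the uncertainty, so this simplified version is only meant to show the logic of the test.

```python
# Two-sigma significance check on a temperature trend (plain OLS version;
# no autocorrelation correction, unlike the SkS calculator).
import numpy as np

def trend_two_sigma(years, anomalies):
    t = np.asarray(years, dtype=float)
    y = np.asarray(anomalies, dtype=float)
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    n = len(t)
    # standard error of the OLS slope
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((t - t.mean())**2))
    # Report in C/decade with the 2-sigma band, matching the listings below:
    return slope * 10, 2 * se * 10, abs(slope) > 2 * se
```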
The situation with GISS, which used to have no statistically significant warming for 17 years, has now been changed with new data. GISS now has over 18 years of no statistically significant warming. As a result, we can now say the following: On six different data sets, there has been no statistically significant warming for between 18 and 23 years.
The details are below and are based on the SkS Temperature Trend Calculator:
For RSS the warming is not statistically significant for over 23 years.
For RSS: +0.120 +/-0.129 C/decade at the two sigma level from 1990
For UAH the warming is not statistically significant for over 19 years.
For UAH: 0.141 +/- 0.163 C/decade at the two sigma level from 1994
For Hadcrut3 the warming is not statistically significant for over 19 years.
For Hadcrut3: 0.091 +/- 0.110 C/decade at the two sigma level from 1994
For Hadcrut4 the warming is not statistically significant for over 18 years.
For Hadcrut4: 0.092 +/- 0.106 C/decade at the two sigma level from 1995
For GISS the warming is not statistically significant for over 18 years.
For GISS: 0.104 +/- 0.106 C/decade at the two sigma level from 1995
For NOAA the warming is not statistically significant for over 18 years.
For NOAA: 0.085 +/- 0.102 C/decade at the two sigma level from 1995
If you want to know the times to the nearest month that the warming is not statistically significant for each set to their latest update, they are as follows:
RSS since August 1989;
UAH since June 1993;
Hadcrut3 since August 1993;
Hadcrut4 since July 1994;
GISS since January 1995 and
NOAA since June 1994.
Section 3
This section shows data about 2013 and other information in the form of a table. The table shows the six data sources along the top, namely UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS. Down the rows are the following:
1. 12ra: This is the final ranking for 2012 on each data set.
2. 12a: Here I give the average anomaly for 2012.
3. year: This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.
4. ano: This is the average of the monthly anomalies of the warmest year just above.
5. mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first two letters of the month and the last two numbers of the year.
6. ano: This is the anomaly of the month just above.
7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0.
8. sig: This is the whole number of years for which warming is not statistically significant according to the SkS criteria. The additional months are not added here, however for more details, see Section 2.
9. Jan: This is the January, 2013, anomaly for that particular data set.
10. Feb: This is the February, 2013, anomaly for that particular data set, etc.
21. ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months. However if the data set itself gives that average, I may use their number. Sometimes the number in the third decimal place differs by one, presumably due to all months not having the same number of days.
22. rnk: This is the rank that each particular data set would have if the anomaly above were to remain that way for the rest of the year. Of course it won’t, but think of it as an update 30 or 35 minutes into a game. Due to different base periods, the rank may be more meaningful than the average anomaly.
23.new: This gives the average anomaly of the last 31 months on the six data sets I have been discussing, namely from January 2011 to the latest number available.
24.old: This gives the average anomaly of the 120 months before that on the six data sets I have been discussing. The time goes from January 2001 to December 2010.
25.dif: This gives the difference between these two numbers.
Note that in every single case, the difference is negative. In other words, from the previous decade to this present one, global warming is NOT accelerating. As a matter of fact, cooling is taking place.
Source  | UAH   | RSS   | Had4  | Had3  | Sst2  | GISS
1. 12ra | 9th   | 11th  | 9th   | 10th  | 8th   | 9th
2. 12a  | 0.161 | 0.192 | 0.448 | 0.406 | 0.342 | 0.57
3. year | 1998  | 1998  | 2010  | 1998  | 1998  | 2010
4. ano  | 0.419 | 0.55  | 0.547 | 0.548 | 0.451 | 0.66
5. mon  | Ap98  | Ap98  | Ja07  | Fe98  | Au98  | Ja07
6. ano  | 0.66  | 0.857 | 0.829 | 0.756 | 0.555 | 0.93
7. y/m  | 8/7   | 16/8  | 12/8  | 16/4  | 16/4  | 12/6
8. sig  | 19    | 23    | 18    | 19    | —     | 18
9. Jan  | 0.504 | 0.441 | 0.450 | 0.390 | 0.283 | 0.63
10. Feb | 0.175 | 0.194 | 0.479 | 0.424 | 0.308 | 0.50
11. Mar | 0.183 | 0.205 | 0.405 | 0.384 | 0.278 | 0.58
12. Apr | 0.103 | 0.219 | 0.427 | 0.400 | 0.354 | 0.48
13. May | 0.077 | 0.139 | 0.498 | 0.472 | 0.377 | 0.56
14. Jun | 0.269 | 0.291 | 0.451 | 0.426 | 0.304 | 0.66
15. Jul | 0.118 | 0.222 | 0.514 | 0.490 | 0.468 | 0.54
21. ave | 0.204 | 0.244 | 0.459 | 0.427 | 0.339 | 0.564
22. rnk | 6th   | 8th   | 9th   | 8th   | 10th  | 10th
23. new | 0.158 | 0.184 | 0.436 | 0.385 | 0.314 | 0.562
24. old | 0.187 | 0.265 | 0.483 | 0.435 | 0.352 | 0.591
25. dif | -.029 | -.081 | -.047 | -.050 | -.038 | -.029
If you wish to verify all of the latest anomalies, go to the following links: UAH (version 5.5 was used since that is what WFT uses), RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS.
To see all points since January 2012 in the form of a graph, see the WFT graph below.
[Graph: WoodForTrees.org – Paul Clark]
Appendix
In this section, we summarize the data for each set separately.
RSS
The slope is flat since December 1996 or 16 years and 7 months. (goes to June) RSS is 199/204 or 97.5% of the way to Ben Santer’s 17 years.
For RSS the warming is not statistically significant for over 23 years.
For RSS: +0.122 +/-0.131 C/decade at the two sigma level from 1990.
The RSS average anomaly so far for 2013 is 0.248. This would rank 7th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2012 was 0.192 and it came in 11th.
Following are two graphs via WFT. Both show all plotted points for RSS since 1990. Then two lines are shown on the first graph. The first upward sloping line is the line from where warming is not statistically significant according to the SkS site criteria. The second straight line shows the point from where the slope is flat.
The second graph shows the above, but in addition, there are two extra lines. These show the upper and lower lines using the SkS site criteria. Note that the lower line is almost horizontal but slopes slightly downward. This indicates that there is a slight chance that cooling has occurred since 1990 according to RSS.
Graph 1 and graph 2.
UAH
The slope is flat since July 2008 or 5 years, 0 months. (goes to June)
For UAH, the warming is not statistically significant for over 19 years.
For UAH: 0.139 +/- 0.165 C/decade at the two sigma level from 1994
The UAH average anomaly so far for 2013 is 0.219. This would rank 4th if it stayed this way. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2012 was 0.161 and it came in 9th.
Following are two graphs via WFT. Everything is identical to the RSS graphs except that the lines apply to UAH.
Graph 1 and Graph 2.
Hadcrut4
The slope is flat since November 2000 or 12 years, 7 months. (goes to May.)
For Hadcrut4, the warming is not statistically significant for over 18 years.
For Hadcrut4: 0.093 +/- 0.107 C/decade at the two sigma level from 1995
The Hadcrut4 average anomaly so far for 2013 is 0.450. This would rank 9th if it stayed this way. 2010 was the warmest at 0.547. The highest ever monthly anomaly was in January of 2007 when it reached 0.829. The anomaly in 2012 was 0.448 and it came in 9th.
Following are two graphs via WFT. Everything is identical to the RSS graphs except that the lines apply to Hadcrut4.
Graph 1 and Graph 2.
Hadcrut3
The slope is flat since April 1997 or 16 years, 2 months (goes to May, 2013)
For Hadcrut3, the warming is not statistically significant for over 19 years.
For Hadcrut3: 0.091 +/- 0.110 C/decade at the two sigma level from 1994
The Hadcrut3 average anomaly so far for 2013 is 0.414. This would rank 9th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2012 was 0.405 and it came in 10th.
Following are two graphs via WFT. Everything is identical to the RSS graphs except that the lines apply to Hadcrut3.
Graph 1 and Graph 2
Hadsst2
For Hadsst2, the slope is flat since March 1, 1997 or 16 years, 2 months. (goes to April 30, 2013).
The Hadsst2 average anomaly for the first four months for 2013 is 0.306. This would rank 11th if it stayed this way. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2012 was 0.342 and it came in 8th.
Sorry! The only graph available for Hadsst2 is this.
GISS
The slope is flat since February 2001 or 12 years, 5 months. (goes to June)
For GISS, the warming is not statistically significant for over 18 years.
For GISS: 0.105 +/- 0.110 C/decade at the two sigma level from 1995
The GISS average anomaly so far for 2013 is 0.57. This would rank 9th if it stayed this way. 2010 was the warmest at 0.66. The highest ever monthly anomaly was in January of 2007 when it reached 0.93. The anomaly in 2012 was 0.56 and it came in 9th.
Following are two graphs via WFT. Everything is identical to the RSS graphs except that the lines apply to GISS. Graph 1 and Graph 2
Conclusion
So far in 2013, there is no evidence that the pause in global warming has ended. As well, all indications are that RSS will reach Santer’s 17 years in three or four months. The average rank so far is 8.5 on the six data sets discussed here. ENSO has been neutral all year so far and shows no signs of changing. The sun has been in a slump all year and also shows no sign of changing. As far as polar ice is concerned, the area that the north is losing is close to what the south is gaining. So the net effect is that there is little overall change and this also shows no sign of changing.