Tuesday, July 30, 2019

Western Wildfire Acreage Doubled by Climate Change

Here's a rather stunning chart from the US National Climate Assessment 2018, Chapter 25.

It shows that since 1984, climate change has doubled the total wildfire acreage in the US west.

Note: the cumulative acreage is zero in 1984 because that's when the tally begins. The results come from an ensemble of climate models.

This additional acreage comes to about 12 million acres, or roughly 19,000 square miles -- about the area of Vermont and New Hampshire combined.
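A quick unit check (my arithmetic; 640 acres to the square mile, and the usual rounded state areas):

    # Quick unit check: 12 million acres in square miles, vs. VT + NH combined
    acres = 12e6
    sq_mi = acres / 640                        # 640 acres per square mile
    print(f"{sq_mi:,.0f} sq mi")               # -> 18,750 sq mi
    print(f"VT + NH: {9616 + 9349:,} sq mi")   # -> 18,965 sq mi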

Friday, July 26, 2019

Frequent Flyers the #6 CO2-Emitting Country

I don't know what "Bitcoin territory" means here -- perhaps the same level of CO2 as is produced by Bitcoin mining -- but the stats on flying are telling nonetheless.

125 Year Heat Waves

Thursday, July 25, 2019

Yet Another Hockey Stick

There's another hockey stick in the scientific literature, this one from the PAGES 2k Consortium in Nature Geoscience. It used nearly 700 proxies from around the world, and it shows that climate is warming faster now than at any time over the last 2,000 years. As Michael Mann wrote on Twitter, it re-re-re-re-re-...-re affirms the hockey stick, and we can add it to the list. Let's go to the figure:


The reconstruction only goes to the year 2000 -- we're now at 1.0°C of warming, and pushing higher.

Here's a different form of the results, presented in a phys.org news release:


More from the press release:
The results suggest that volcanic activity was responsible for variations before about 1850. After that, greenhouse gases became the dominant influence on global climate. By removing these influences in their analysis, the researchers also identified the magnitude of the random changes that cannot be traced to a specific cause. The team's data-based reconstructions also agreed with model simulations when evaluating these random changes.
As I've written before, it's easy to show that a hockey stick is the expected result in the absence of significant natural forcings:
  1. temperature change = (climate_sensitivity)*(change in forcing)
  2. CO2 forcing = constant*log(CO2/initial_CO2)
  3. Atmospheric CO2 has been increasing exponentially since the beginning of the industrial era.
    • So if CO2 isn't changing, there is no temperature change -- the flat handle of the hockey stick.
    • If CO2 is increasing exponentially, its forcing is increasing linearly in time, and hence so is the temperature -- which is the blade of the hockey stick.
  4. The initial curve upward from the handle came when CO2 was increasing superexponentially.
It'd be far more surprising if there weren't a hockey stick in the last 2,000 years.
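Here's a toy sketch of that logic (my own illustration, with assumed round numbers -- a mid-range sensitivity of 3°C per doubling and a ~0.2%/year CO2 growth rate after 1850 -- not values from the paper):

    # Toy hockey stick: flat CO2 gives a flat handle; exponential CO2 growth,
    # run through a logarithmic forcing, gives a linear blade.
    import math

    S = 3.0      # assumed climate sensitivity, deg C per CO2 doubling
    C0 = 280.0   # assumed preindustrial CO2, ppm
    k = 0.002    # assumed exponential CO2 growth rate per year after 1850

    for year in [0, 500, 1000, 1500, 1850, 1900, 1950, 2000]:
        co2 = C0 if year <= 1850 else C0 * math.exp(k * (year - 1850))
        dT = S * math.log2(co2 / C0)   # handle: zero; blade: linear in time
        print(f"{year:4d}   CO2 = {co2:5.1f} ppm   dT = {dT:+.2f} C")

Note the blade: equal 50-year steps add equal increments of warming, precisely because the log of an exponential is linear in time.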

**

Added 1:40 pm - PAGES 2k's calculations of 51-year trends, from the same paper:


**

Added 5:10 pm - Here's a map and proxy count of PAGES 2k's network, from their Supplementary Information. In all they used "...nearly 700 separate publicly available records from sources that contain indicators of past temperatures, such as long-lived trees, reef-building corals, ice cores, and marine and lake sediments. The data are sourced from all of Earth's continental regions and major ocean basins."


Monday, July 22, 2019

Memorializing a Lost Glacier

I wonder if this will become a thing. Researchers are going to put a plaque in Iceland memorializing the first lost glacier in that country.

It will be at the site of the now-extinct Okjökull glacier — nicknamed the “OK” glacier — in Borgarfjörður, Iceland. The hillside will be known as "Mt OK."

Deglaciation in Iceland occurs at a rate of about 40 square kilometers per year. Glaciers cover about 1/10th of the Earth’s dry land. Wikipedia says "The 13 largest glaciers [in Iceland] have an aggregate area of 11,181 km² (out of about 11,400 km² for all glaciers of Iceland)."
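Dividing those two figures (a quick check, using the numbers just quoted):

    # At the quoted pace, how long until Iceland's glaciers are gone?
    total_km2 = 11400      # ~all of Iceland's glaciers (Wikipedia figure above)
    loss_per_year = 40     # km^2 of deglaciation per year
    print(total_km2 / loss_per_year, "years")   # -> 285.0 years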

Hey, that's 285 years' worth of glaciers in Iceland. Nothing to worry about.

People in the distant future will wonder what a glacier was. They'll wonder what it was like to have a stable coastline. They'll wonder how we could have been so stupid. Maybe the Baby Boomer generation will become known as the Baby Doomers.

Tuesday, July 16, 2019

Warmest June

You've probably heard by now that June 2019's global mean surface temperature (GMST) was the warmest June in the records, according to both Japan's JMA and NASA GISS.

GISS was especially notable, breaking the old record (2015) by a whopping 0.09°C.

According to GISS, the northern hemisphere had its warmest June in the records, at 1.17°C above the 1951-1980 baseline -- a record by 0.08°C. That's 1.47°C (2.64°F) above a baseline of 1880-1909. Starting to seem warm....

The southern hemisphere saw only its 6th-warmest June.

Land-only was also a record high, at 1.07°C above the baseline.

These are notable temperatures, given how small the recent El Nino was compared to 2015-16. (The recent El Nino is now over.)

NOAA announces its GMST on Thursday.

Saturday, July 13, 2019

That Kauppinen and Malmi Paper is Junk

I am seeing lots of citations to the Kauppinen and Malmi preprint that came out two weeks ago:
"NO EXPERIMENTAL EVIDENCE FOR THE SIGNIFICANT ANTHROPOGENIC CLIMATE CHANGE," Kauppinen and Malmi, June 29, 2019, https://arxiv.org/pdf/1907.00165.pdf.
(Yes, their title is in all caps.) Anthony Watts posted it but couldn't be bothered to read it ("I didn't vet this"). Infowars has an article with no skepticism whatsoever. Someone just sent me an email saying "this journal article by some Finnish scientists would change our entire understanding of global warming."

If you even glance through the article, you see that they assumed a CO2 climate sensitivity value of just 0.24°C (top of page 4). That's an absurdly low value, given that we've already had 1°C of warming and atmospheric CO2 hasn't even increased by 50% yet. Climate models put CO2's climate sensitivity at 2-4°C.
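To see just how absurd, here's a back-of-envelope check (my sketch; it assumes their 0.24°C figure is per CO2 doubling, and round CO2 values of 280 ppm preindustrial and 410 ppm today):

    # What would a 0.24 C/doubling sensitivity predict for the CO2 rise so far?
    import math

    C0, C_now = 280.0, 410.0               # ppm: preindustrial vs. ~2019
    doublings = math.log2(C_now / C0)      # ~0.55 doublings so far

    print(f"their 0.24 C/doubling: {0.24 * doublings:.2f} C")   # ~0.13 C
    print(f"model mid-range 3.0:   {3.0 * doublings:.2f} C")    # ~1.65 C (equilibrium)
    print("observed so far:       ~1.0 C")

Their number predicts about an eighth of the warming we've already observed (and the model figure is an equilibrium value, which realized warming still lags).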

The authors justify this claim by citing three articles of their own work(!) -- one that appeared in Energy and Environment (enough said), one in The International Review of Physics (clearly amateurish), and one that's an unpublished preprint. They also made the Ed Berry Bullshit Error:
If we pay attention to the fact that only a small part of the increased CO2 concentration is anthropogenic....
In fact, all the CO2 increase in the atmosphere is anthropogenic (and the part that's due to the 1°C temperature rise is also anthropogenic, because that warming is anthropogenic).

They also assume that almost all temperature change is the result of low cloud cover changes:
In Figure 2 we see the observed global temperature anomaly (red) and global low cloud cover changes (blue). These experimental observations indicate that 1 % increase of the low cloud cover fraction decreases the temperature by 0.11°C.
and assume all that cloud cover change is "natural." And so on and so on.

Deniers: Don't believe everything you read. Especially when it supports your preconceived notions. Especially when it supports denialism. Especially when you haven't even read the paper.

(Triple this when it comes from WUWT or Infowars.)

Added 7/14: The scientists at Climate Feedback came to the same conclusion, with more detail. It's worth reading.

Friday, July 12, 2019

River Gauges in New Orleans

It's hard to believe that New Orleans could be inundated again just 14 years after Hurricane Katrina and the subsequent fixes. If it's as bad as feared, it's going to raise the question of when you give up on a city -- New Orleans might be the first to face that question, of what will be many more this century. Surely New Orleans won't be abandoned after this flood, but you have to wonder at what point another exodus occurs, and at what point that feeds back and causes still more people to leave. Its population doesn't seem to have yet fully recovered from Katrina, so clearly there was some feedback already:


This says 2014 population was down 7.7% post-Katrina.

I found two sites which are recording the level of the Mississippi River in New Orleans, from weather.gov and the US Army Corps of Engineers. These screenshots are of the most recent results:



And here's a nice storm track.

CNN says
Much of the area around New Orleans is now 1½ to 3 meters (4.92 to 9.84 feet) below mean sea level, according to a 2003 study by the US Geological Survey. Scientists found that the ground in the area was sinking at a rate of 1 centimeter a year.

That continual sinkage, combined with rising global sea levels due to the climate crisis, meant New Orleans would probably be between 2½ and 4 meters (8.2 to 13.12 feet) below sea level by 2100.
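Those endpoints roughly check out (my arithmetic, using CNN's 1 cm/yr and an assumed few tenths of a meter of global sea level rise this century):

    # Sanity-checking CNN's 2100 numbers: today's depth + subsidence + SLR
    years = 81                    # ~2019 -> 2100
    subsidence_m = 0.01 * years   # 1 cm/yr of sinking -> ~0.8 m
    slr_m = 0.3                   # assumed global sea level rise (illustrative)

    for depth_now in (1.5, 3.0):  # current meters below mean sea level
        print(f"{depth_now} m now -> ~{depth_now + subsidence_m + slr_m:.1f} m by 2100")
    # -> ~2.6 m and ~4.1 m, consistent with the 2.5-4 m projection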

Tuesday, July 09, 2019

9 ft of SLR by 2100??

Rosanna Xia wrote in the Los Angeles Times:
In the last 100 years, the sea rose less than 9 inches in California. By the end of this century, the surge could be greater than 9 feet.
Oh come on. The 21st century is almost 20% gone. There is no evidence that this scale of sea level rise is in the future.

Yes, sea level rise is accelerating. Yes, this acceleration can increase and probably is increasing. But enough to get 9 ft (2740 mm) of sea level rise in 81 years? I'm very skeptical.

Every year that doesn't see a big jump in SLR takes a bite from this century's remaining SLR budget and makes this more improbable.
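Here's a rough back-of-envelope on what 9 ft by 2100 would require (my numbers: a current rate of about 3.3 mm/yr from the satellite record, and a constant acceleration from here on):

    # What constant acceleration gets you 9 ft of SLR in the next 81 years?
    target_mm = 9 * 304.8    # 9 ft = ~2743 mm
    rate0 = 3.3              # mm/yr, current satellite-era rate
    t = 81                   # years, 2019 -> 2100

    # target = rate0*t + 0.5*a*t^2  =>  a = 2*(target - rate0*t)/t^2
    a = 2 * (target_mm - rate0 * t) / t**2
    print(f"needed acceleration: {a:.2f} mm/yr^2")     # ~0.75 mm/yr^2
    print(f"rate in 2100: {rate0 + a*t:.0f} mm/yr")    # ~64 mm/yr

That's roughly ten times the acceleration estimated from the satellite altimetry era, and it implies a rate in 2100 nearly twenty times today's.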

Even the scientists who are studying the Antarctic glaciers that flow into the sea -- which definitely do seem to be a problem -- are barely sure of the order of magnitude of the SLR they'll cause.

I would like to see journalists like Rosanna Xia have some skepticism -- or any at all -- instead of writing down the most extreme upper limit that anyone mentions to them.

Pollution Controls and Economic Growth

Of course, we all know this, but it's good to see the Trump administration admit that cutting pollution is not incompatible with economic growth: 
"From 1970 to 2018, the combined emissions of the most common air pollutants fell 74 percent while the economy grew over 275 percent."

-- White House press release, 7/8/19.

https://www.whitehouse.gov/briefings-statements/president-donald-j-trump-promoting-clean-healthy-environment-americans/

Sunday, July 07, 2019

Eva Kor

This is a remarkable video of someone I found completely captivating. It speaks for itself.

Eva Kor just died at the age of 85.

Saturday, July 06, 2019

I Am Now Too Old to Use the Internet

I got my first programmable calculator, a TI-58c, when I was a sophomore in high school. We had to drive all the way to a Pittsburgh suburb to buy it, which in those days was like driving to Manhattan or Paris or the Moon. I think I paid about $120 then, which is like $500 now.

The year before, my mother was given a very basic calculator for hosting a Tupperware party -- ten digits, and ×, +, −, ÷, and maybe a square root. I remember figuring out that if you took a number like WXYZ and squared it (or something), then subtracted WXYZ, divided by YZ, inverted that, and took the square root -- or something -- you'd get back the original number, or something like it. I figured that out numerically, by trial and error, but a couple of years later I was proud of myself for being able to prove it with algebra. I spent a lot of time lying on my back, looking up at that very simple calculator, pressing buttons.

I programmed the hell out of my TI-58c. It only had something like 59 programming steps available, so I got very very good at streamlining code and making the most use of subroutines.

And damn, it was actually a very good way to learn programming. I remember I programmed up some kind of golf game where we used a small wad of paper on a sheet with a golf hole drawn on it, and the calculator would tell us, after we entered the "club," how far the "ball" went and at what angle, both chosen randomly within some bounds.

I forget the exact specifications of the program, but it doesn't matter -- I learned more about programming from that little calculator than I ever did in any subsequent programming classes. That still amazes me. That's what's amazing about being 16.

In 11th grade I took a class in statistics, like all the other smart kids, where we had access to a class-wide computer -- a desktop machine, I guess -- to do some homework problems on. I remember the teacher kept it as his special pet, with his special programmer: a kid named John who lived up the road from us a bit and got on at the same bus stop. But I never got to know him -- he seemed a little too insular and quiet, a little too weird, and back then, as a dumb teenager, I didn't respect the value of being different or nonconformist or great at math. (There was a girl in my homeroom in grades 10-12 who would not stand for the pledge of allegiance but kept sitting in her chair. Her hair wasn't neat, and she wore plain print dresses that no other girls did and that did her no favors, but while I never bothered her about not standing -- none of us did -- I never tried to understand her, either, which I now regret.) John, this math-smart kid, about two grades behind me, was a master on that high school computer and a favorite of the statistics teacher -- meaning he could program a given algorithm in fewer steps than anyone else. Just bringing a number up from memory required code like ↑(), from all of about eight registers, one at a time, so brevity and cleverness mattered. We computed the future dates of Easter, I remember. Now John is probably a math professor at MIT with a Fields Medal, or driving for Uber.

Anyway, I want to point out that you can learn a lot when you're 15 and have just a simple tool and are ready to suck up knowledge from the forest floor.

Later, in college, I took a class in Pascal, which was mostly a class in learning how to punch Hollerith cards and then exactly how to submit your program to the university's big computer to have your results printed out on big green-and-white striped paper -- whether that meant your program produced something useful or, usually, that it failed and crashed and produced nothing. The guys back around the computer would gather up your printout 20 minutes later and place it in a coded wooden bin that was yours for the time being. It wasn't exactly quick-turnaround computing, and I didn't learn anything in that class that I hadn't learned back at home when I was 16.

And then blah blah. In grad school we could log on to a VAX and send emails to one another, but that was about it. Within a couple of years the ITP at Stony Brook got a pretty nice UNIX machine of its own, where we theory students could really play around, and I got my degree by writing a couple of 2,000-3,000-line programs on some QCD phenomenology, about questions my advisor thought up.

Then I went to Bell Labs, used UNIX, etc etc, then when I joined a startup in Boulder, CO I got my very first PC in late 1991.

PCs did less in those days, but they seemed easier to use, IIRC. I mostly used mine for email, emailing several people (women mostly) I'd "met" on a fiction writing listserv. I even got involved with one of them, and we did a lot of backpacking together, for three years, living together in AZ and then VT. Then she ended it, but it was for the best.

But that was eons ago, in the nascent days of the Web, which I first encountered in 1994. Then it was called the "WWW." I discovered Yahoo then, but there wasn't a lot else -- or rather, there was an entire treasure chest laid out, but I didn't know enough to take advantage of any of it. Nor did anyone I knew -- not that I knew anyone then. I didn't even know enough to gobble up the domain names that corporations would soon pay nicely for, like sex.com.

Anyway, now it's 20+ years later, and often I feel like I can barely use a computer at all. Nothing seems to work seamlessly. My iPhone, mostly, but my laptop? No. There just seem to be endless problems. Right now I can't get my printer hooked up so that it prints, though it worked fine in my last place and I thought all I'd need to do here was plug it in. And Chrome does not allow me to ALT-TAB away to another program -- I guess this is some bug in Chrome. I read this can be solved by using a Bluetooth mouse instead of a USB mouse, but that doesn't solve the problem for me. I have a BiPaP machine that used to read out my nightly statistics to an app on my phone, but that stopped working 4 days ago and I have no idea why. And there are a few more big problems that I've forgotten.

And when I try to solve these problems by looking at manuals and help pages on the Web, nothing works. It's all out of date, or wrong to begin with, with no indication of anything current. Help sites say something like go to Settings | Options | My XXX | whatever | and whatever else, but my version of Windows -- just the latest version of Windows 10, I think, on my PC -- doesn't have those options; it has some other path. I think. Maybe some driver isn't updated. They say to click on "Content settings," but there is no "Content settings" on the Settings page where they said it would be. The instructions, even those written in, say, late 2018, never work, and the people who wrote them -- probably freelancers getting paid $0.015 per word, poor saps with Brooklyn rents -- clearly don't care about providing good instructions or updating them when they need changes.

So no instructions work. Endless Googling and trial and error. The software companies don't give a fuck once they get your money. Solving any one of this string of problems takes 2 or more hours, and I just wonder why I have to do this anymore and why I am doing it. Is this really all PCs can do in 2019? Why do they need so much babysitting? Why are they getting worse? Why can't we just plug them in, connect wirelessly to a printer, and have it all work?

Seriously. WHY CAN'T THIS ALL JUST WORK 40 YEARS AFTER PCs BECAME UBIQUITOUS?

It happened with cars. In the beginning -- the 1910s and 1920s -- you had to take delicate care of them: be sure they had water, that the tires had air, that the spark plugs were adjusted to whatever gap or firing capacity or explosion potential, and who knew what else. Enrico Fermi decided that a good way to become an experimentalist was to drive a car across America and fix it as it broke down. (Really.)

Why do I have to be a software engineer to get my printer to work?? Why can't I just connect it via USB or wireless or whatever and have it print? Why do I have to be a computer engineer to understand why this isn't possible? Why, when I connect my phone to my PC, doesn't it delete the iPhone pictures after downloading them, as my settings tell it to? Another fucking bug, that's why. Why has it failed at this for 3+ years? Why can't Microsoft fix this? Why don't they care? Why can't Google fix the bug that prevents me from using ALT-TAB to go to other windows, which is a serious usability problem for me? Why don't THEY care??

A month ago I could click on an email link and it opened in Gmail. Now it doesn't. What the fuck changed? It wasn't me. Do I have to rejigger this every month? Do you see why I'm frustrated? And then, in the end, some random person happens to describe your exact problem -- but no one else's -- and you just change a "NO" button to "YES" and it all works. For you.

I really wish I could give up on all this crappy technology and just farm organic peas for a living. I would never log on to the Internet or even listen to the radio again.

Trump is Behind Obama in Job Gains

Job gains during Trump's first 28 months: 5.61 M.

During Obama's last 28 months: 6.42 M.

Sources:
FRED USPRIV, FRED USGOVT
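For anyone who wants to reproduce the numbers, here's a sketch (my code; it assumes the pandas_datareader package and sums the two FRED payroll series cited above -- private plus government employment, in thousands -- with the month endpoints being my reading of the two 28-month windows):

    # Total nonfarm payrolls = private (USPRIV) + government (USGOVT)
    import pandas_datareader.data as web

    df = web.DataReader(["USPRIV", "USGOVT"], "fred", "2014-01-01", "2019-06-01")
    total = df.sum(axis=1)   # thousands of jobs

    trump = total["2019-05-01"] - total["2017-01-01"]   # Jan 2017 -> May 2019
    obama = total["2017-01-01"] - total["2014-09-01"]   # Sep 2014 -> Jan 2017

    print(f"Trump's first 28 months: {trump / 1000:.2f} M")
    print(f"Obama's last 28 months:  {obama / 1000:.2f} M")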

Medium: “‘Climategate’ Email Hacking was Carried out from Russia, in Effort to Undermine Action…”

This is interesting, but it's not really a surprise that shadowy Russians/Eastern Europeans would be the hackers. I hope the journalist, Iggy Ostanin, can eventually publish the name of the climate scientist he says is involved.

Exclusive: "Climategate" Email Hacking was Carried out from Russia, in Effort to Undermine Action… by Iggy Ostanin

Russian hacking, quote-unquote, seems to be more and more of a problem these days. Is there really nothing that can be done?