Sarah Perkins-Kirkpatrick - climate scientist with a strange fascination for extreme events

Intense, frequent and long heatwaves caused by human activity

6/30/2015

4 Comments

 

This piece is based on my own work and is a part of a blog series profiling climate scientists, economists, social scientists and civil society members who are presenting and discussing innovative climate science at the Our Common Future Under Climate Change conference in Paris, July 2015. For more, follow @ClimatParis2015 and #CFCC15 on Twitter.

It’s boiling outside. You can’t remember the last time it was this hot. It feels like the sun is sucking every morsel of water out of your body. Hopes of a short respite in a delicious ice cream fade fast when you realize how quickly it will melt. 

You’re in the middle of a heatwave. 

Heatwaves, measured as prolonged periods of excessive heat, are a complex type of extreme temperature event. These events occur naturally (albeit rarely) as part of our climate, and are driven by a delicate balance of the right weather patterns, local soil moisture conditions, and larger-scale climate variability patterns.

Unfortunately, they’ve increased in their intensity, frequency and duration over many regions of the globe since at least the middle of the 20th Century.

But is that because of human influence on the global climate?

In order to answer this question, I’ve used a special set of simulations from a global climate model. One of these experiments simulates what the global climate would have been like without human activity. It represents an alternate climate, had the industrial revolution never occurred. The other experiment includes observed anthropogenic emissions of greenhouse gases, thus simulating the historical climate. 

From these simulations I calculated trends in heatwaves across the globe and compared them to observed trends. I did this for trends since the 1950s and also since 1998 (when the “hiatus” began, apparently). And I found some very interesting and sobering results. Though for many, they are probably not surprising.
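For the curious, the guts of this comparison can be sketched in a few lines of Python. To be clear, the numbers below are synthetic stand-ins for real model output (and my actual analysis is far more involved), but the logic is the same: fit a trend to each world and compare.

```python
# A minimal sketch of the trend comparison, using made-up data in place of
# model output. "Heatwave days per year" and the trend sizes are invented.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1950, 2013)

# "Natural-only" world: year-to-year variability around a flat baseline.
natural = 20 + rng.normal(0, 4, size=years.size)

# "Historical" world: the same variability plus a forced upward drift.
historical = 20 + 0.15 * (years - years[0]) + rng.normal(0, 4, size=years.size)

def trend_per_decade(series, years):
    """Least-squares linear trend in heatwave days, expressed per decade."""
    return 10 * np.polyfit(years, series, 1)[0]

print(f"natural-only trend: {trend_per_decade(natural, years):+.2f} days/decade")
print(f"historical trend:   {trend_per_decade(historical, years):+.2f} days/decade")
```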

If we compare long-term trends of heatwaves in a world where human influence is included to a world without humans, it is obvious that we are largely responsible for the rate at which they are increasing. This is not quite the same thing as overall or absolute changes in heatwaves – sure, we are now seeing more of them than we used to. And projections from climate models indicate that even more will occur in the future as our influence on the climate increases. But what is interesting from my findings is that the speed at which they are increasing could not have occurred naturally – heatwaves fluctuated from year to year, but not on timescales on the order of decades.

Had the industrial revolution never occurred, almost everywhere in the world would have seen no, or at most very small (and insignificant), changes in the frequency of heatwaves since 1950. Yet the observations tell us that what actually happened was significant increases across almost every region where there is sufficient data. This pattern is consistent only under a climate that is altered by us, as indicated by the climate model simulations.

As for the “hiatus” period, increasing or decreasing trends in heatwaves differ from region to region. Yet this is expected over shorter timescales, since climate variability processes (think El Niño/Southern Oscillation) dominate. Besides, regional and global trends of heatwaves are not robust on these timescales. We simply don’t have enough information over 10-15 years to have a clear picture of what is really happening in the climate system. The same holds for other climate variables, such as global surface temperature – changes over short time periods (particularly those that start from extremely warm years) simply cannot tell us the whole story.

What my results for the “hiatus” period also tell us is that while regional cooling trends in heatwaves can still occur under (the current amount of) human influence, warming trends are still far more likely. This means that shorter time periods in the near future are likely to have sharp increases in heatwaves. 

Not only is this bad news for you and your ice cream (more of them will melt more quickly in the future), but it’s also terrible news for the many other people and systems adversely affected by heatwaves. In Australia for example, fruit bats literally fall out of trees when the extreme heat is on. An estimated 70,000 people were killed in Europe during the 2003 heatwave, and although it’s too early to know the full impact, over 2,300 people have already died in the recent Indian heatwave.

With regional trends in heatwaves increasing more quickly than ever before, there is little doubt that the adverse impacts will sharply rise too.


Scientist pinpoints how quickly climate is changing

6/30/2015

3 Comments

 
Although written by me, this piece is a part of a blog series profiling climate scientists, economists, social scientists and civil society members who are presenting and discussing innovative climate science at the Our Common Future Under Climate Change conference in Paris, July 2015. For more, follow @ClimatParis2015 and #CFCC15 on Twitter.


By 2100, the latest state-of-the-art global climate models project a global average temperature rise of 2.6-4.8 degrees C under a high emissions scenario. While initially this may not sound like much, the impacts of such an increase will be disastrous to many systems, both human and natural, if they cannot adapt. However, what is likely more important for the adaptation of these systems is the rate of change towards the projected increases, rather than the absolute increases themselves.

Understanding this rate of change has been central to the work of Yann Chavaillaz, a PhD student at the Laboratoire des Sciences du Climat et de l'Environnement in France. Looking at an ensemble of climate models that project the above absolute warming, he’s developed indicators directly linked with this rate.

These indicators compare, region by region, how much change has occurred over a 20-year period relative to the prior 20 years. This method has the benefit of determining how quickly the climate is changing, rather than the overall change from an arbitrary baseline state.
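To make this concrete, here's a toy version of such an indicator. I must stress this is my own illustrative sketch with synthetic data, not Yann's actual method – I'm assuming here that an "extremely warm year" is one warmer than the warmest of the previous 20 years:

```python
# Illustrative only: a rate-of-change indicator on a synthetic temperature
# series with accelerating warming. Thresholds and numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2101)
# Synthetic regional temperature anomaly: accelerating warming plus noise.
temps = 0.0002 * (years - 1980) ** 2 + rng.normal(0, 0.3, size=years.size)

def exceedance_indicator(temps, i, window=20):
    """Fraction of the next `window` years that beat the warmest year
    of the preceding `window` years."""
    baseline_max = temps[i - window:i].max()
    return (temps[i:i + window] > baseline_max).mean()

for year in (2000, 2030, 2060):
    i = int(np.argmax(years == year))
    print(year, f"{exceedance_indicator(temps, i):.0%}")
```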

What he’s found is that the rate of change is not uniform throughout the 21st century.  In fact, the influence of this rate on temperature distributions peaks for many regions around or just after the middle of the century.

Take for example West Africa. Currently, about 15% of the region has a one in two chance of experiencing an extremely warm year, compared to the last 20 years. By about 2060 – and compared to the 20 years immediately before – the indicator peaks at 75%. That’s 5 times the current figure and, frightfully, it’s measured against a baseline that already includes future warming that hasn’t even occurred yet.

This means the rate of warming will become increasingly rapid over the next few decades, even accounting for any sort of adaptation to recent changes.

“The definition of an extremely warm year will be outdated within an increasingly short timescale,” says Yann. 

“The acceleration in the temperature rise will induce drying and moistening trends that will persist in some critical regions”. 

And this result is pretty robust across the models he’s analysed. Taking into account how the models represent climate variability and other differences in their representations of the Earth system, qualitative conclusions remain near-identical from one model to another.

The above results, along with those Yann will present at the Our Common Future Under Climate Change conference, mainly focus on the RCP8.5 scenario. This is the scenario with the greatest change and most potential damage over the remainder of the century, and is unfortunately the scenario we are currently tracking.

However, should substantial mitigation practices be put in place and lower emission scenarios followed, indicators linked to the rate of change are not so disturbing. 

Under RCP2.6, which involves immediate and extensive mitigation, all regional indicators return to historical values by 2050. Under RCP4.5, a more middle-of-the-road scenario, indicators remain fairly consistent until 2040, and afterwards return to near-historical values. 

So while some absolute changes to the climate still occur under these emission scenarios, adapting to the changes of the prior 20 years may be much more feasible, since the rate of change in the climate is slower than under RCP8.5.

Yann’s current research has applied this approach to seasonal and yearly changes in average temperature and rainfall. However, its versatility means it can be applied to basically any climate variable to understand when its rate of change may peak. He’s even working on extending the method to see if the start of spring and the duration of summer are influenced by such rates of change.

“I really hope that my results might be helpful for adaptation planning,” Yann says. “As climate continues to change, natural and human systems will need to continuously adapt to a moving target.”

By quantifying this moving target, Yann has given us insight into how much of a challenge this may be.



What's the right way to measure heatwaves?

6/29/2015

2 Comments

 
Right now I’m on a working holiday (though far more on the work side!) in Europe. Also right now there is a rather large heatwave that has hit Spain, is currently over France, the southern U.K., Switzerland and Italy, and is starting to move towards Germany, Austria and the Czech Republic (currently where I am). The other day, London recorded its hottest-ever July temperature, and many locations in Spain and France have tipped over 40°C (somewhat ironically, the next leg of my journey is a climate conference in Paris).

I’ve also seen a bit of crap floating around the internet saying that Europe isn’t really in a heatwave and, frankly, that everyone should suck it up and enjoy the warm rays of sunshine.

So I thought this would be as good a time as ever to talk about how heatwaves are measured.

The U.K. Met Office uses this definition: at least 5 consecutive days where the daily maximum temperature is at least 5°C above the average daily maximum (get that?). {Edit - I originally reported this is what the World Meteorological Organisation uses, but I have been corrected. It is, however, the definition on Wikipedia}. The threshold used is specific to each day of the year, so thresholds in spring are cooler than summer thresholds, and you can feasibly have a heatwave in winter too. It’s also relative to the location, so thresholds in London, say, will be cooler than in Melbourne.

There are pros and cons to this definition. Let’s go with the pros first.

The definition is relative to location and time of year. It’s worth keeping in mind that we’re all adapted to the climates we live in. The temperature in Český Krumlov today was around 28-30°C; to me this is lovely summer weather that I really enjoyed exploring this beautiful town in, but then I’m Australian (o.k., so today was more holiday than work…). Many residents of the Czech Republic did not seem to be coping so well. Smack on a few consecutive days under these conditions and it will wear them down. The same goes for any other living things in this climate – trees, pets, wild animals, crops.

The relativeness to the time of year is also a nice feature. We generally associate heatwaves with summer, since this is when the most disastrous impacts occur. Indeed, some abnormally warm weather in winter can be pretty enjoyable. But we’ve got to remember that such conditions are not normal for the time of year, and can disrupt the reproductive cycle of staple crops and, as is currently happening back home, prevent some much-anticipated snowfalls.

Now let’s go with the cons.

Firstly, the definition stated above is antiquated. The group who came up with the index about 15 years ago don’t actually use it anymore. They realized (as did I when I wrote this paper) that it doesn’t work everywhere (particularly the tropics), which rather defeats the purpose of an index that’s meant to be relative.

Secondly, 5 consecutive days is too long to be the minimum length of a heatwave. Over many locations in Australia we rarely get heatwaves of over 5 days – due to the synoptic patterns that govern our weather – yet we still definitely get heatwaves. The Australian Open heatwave in 2014, which received worldwide media coverage, is a perfect example. Also, the impacts of heatwaves on human health, infrastructure, and our native ecosystems can kick in from just 2 consecutive days. Particularly in the case of human health, this is a major issue if nighttime temperatures aren’t cool enough.

So, what’s the right way to measure heatwaves then?

Truth be told, there is no ONE way. In fact, there are literally hundreds of heatwave definitions. Sure, a lot of them are closely related, but how they are calculated differs. Some incorporate relative humidity. Some are a mixture of daily maximum and minimum temperatures. Some only measure summer events. Some use fixed thresholds (e.g. days above 30°C). Believe me, the list goes on and on and on…

Gee, I don’t even stick to a single definition! In a lot of my research activities I use the Excess Heat Factor, the official definition of the Australian Bureau of Meteorology. This index is kinda neat, since it includes a measure of acclimatisation, as well as identifying how hot it is against the background climate. However, on my website scorcher, heatwaves are measured as at least 3 consecutive days exceeding the daily 90th percentile (each day is in the hottest 10% for that day of the year), since this was easier to calculate and show on graphs. All in all, the events measured by these definitions are pretty much the same, though they aren’t 100% identical.
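For anyone who wants to play along at home, the scorcher-style definition is simple enough to sketch in a few lines of Python. The data below are synthetic, and a proper analysis would pool a window of days around each calendar day when computing the percentiles:

```python
# A minimal sketch of the percentile-based definition: at least 3
# consecutive days above that calendar day's 90th percentile.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_days = 30, 365
# Synthetic daily maximum temperatures: seasonal cycle plus noise.
doy = np.arange(n_days)
tmax = (25 + 8 * np.sin(2 * np.pi * (doy - 20) / 365)
        + rng.normal(0, 3, size=(n_years, n_days)))

# Day-of-year 90th percentile across all years.
p90 = np.percentile(tmax, 90, axis=0)

def heatwave_days(one_year, threshold, min_run=3):
    """Count days belonging to runs of >= min_run consecutive hot days."""
    hot = one_year > threshold
    count, run = 0, 0
    for h in hot:
        run = run + 1 if h else 0
        if run == min_run:
            count += min_run   # the run just qualified: count all of it
        elif run > min_run:
            count += 1         # extend an already-qualified run
    return count

print("heatwave days in final year:", heatwave_days(tmax[-1], p90))
```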

Given the vast range of impacts heatwaves have, I don’t think we’ll ever fully agree on one definition. Sure we might (and should!) be able to narrow it down, but I think a complete one-size-fits-all approach is a little out of reach.

But I certainly do think it’s time that the guidance on a formal definition is changed. 5 days is certainly too long for a minimum length. And a threshold that does not work everywhere, particularly in regions where impacts can potentially be very high, needs quick attention. Based on the current state of research, it really is time for more formal guidance, perhaps from something like the World Meteorological Organisation, on an overarching definition. That way, weather and climate services the world over can adopt their own heatwave definitions that are more in line with current scientific research.

Putting all of the above aside, I do hope everyone who is in the thick of the hot European weather is taking care. Whether it is 1, 3, 5, or even more hot days in a row, and whether the temperature is 30°C, 35°C or 40°C, I hope you’re taking the appropriate measures to keep cool, regardless of whether a heatwave has been officially declared. Remember to keep yourselves hydrated, avoid any strenuous exercise or activities, and stay out of the sun. Don’t forget to look after those around you too.



The changing nature of heatwaves

6/29/2015

2 Comments

 
I originally wrote this piece as a guest blogger for International Innovation. You can check it out here, published on the 24th April 2015

The world’s hottest year on record was 2014. On the whole, global average temperature has increased by at least 0.8°C since 1880. And human influence on average temperature increases since 1950 is unequivocal.

Hang on. Only 0.8°C?! That doesn’t seem like much at all. Surely that can’t actually mean anything!

This is the reaction I consistently get when attempting to explain how climate change impacts heatwaves. How can a small change on the global scale impact events over my country/state/region?

The answer: well, quite a lot actually.

You see, the relationship between average and extreme temperatures is disproportionate. By definition, the average is what’s expected, whereas extremes are rare. This figure, part of the Intergovernmental Panel on Climate Change Special Report on extremes, graphs this relationship pretty nicely. By shifting the average just a little to the right (i.e. it becomes warmer), there’s a disproportionate increase in the number of extreme, or ‘rare’, events. Also, more ‘records’ are seen – these are extreme temperatures that would not have occurred had the average temperature not increased.
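If you prefer numbers to pictures, here's a toy version of the same idea. The distribution and the size of the shift below are made up for illustration, but the disproportionality falls straight out of the arithmetic:

```python
# A toy version of the figure's logic: shift a bell-curve temperature
# distribution a little to the right and see how much more often a
# once-rare threshold is exceeded. All numbers are illustrative.
from math import erfc, sqrt

def p_exceed(threshold, mean, sd):
    """Upper-tail probability of a normal distribution."""
    z = (threshold - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

mean, sd = 22.0, 3.0        # hypothetical summer-day temperatures (deg C)
rare_hot = mean + 2 * sd    # a "rare" hot day: ~2.3% of days originally

before = p_exceed(rare_hot, mean, sd)
after = p_exceed(rare_hot, mean + 0.8, sd)  # shift the average by 0.8 deg C

print(f"before the shift: {before:.1%} of days")
print(f"after the shift:  {after:.1%} of days ({after / before:.1f}x as many)")
```

In this toy example, a shift of less than a third of a standard deviation nearly doubles the number of rare hot days.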

THE COMPLEXITY OF HEATWAVES

Heatwaves are just one type of temperature extreme. In fact, they’re quite complex, since their occurrence also depends on weather systems, the time of year (depending on how you define them), and even how much rain has recently fallen. Yet increases in their frequency, intensity and duration have been measured globally since at least 1950 – the period in which most of the observed global warming has occurred.

Exactly which aspects of heatwaves have changed the most, and where these changes have occurred, does vary. In Australia, the city of Canberra has seen heatwave incidence double since 1950, while Melbourne has seen its hottest heatwave day increase by over 4°C (see this report for more detail). Globally, increases in heatwave frequency over Europe and parts of Asia are larger than in most other regions.

HUMAN ACTIVITY AND HEATWAVES

As I mentioned above, heatwaves are complex, and are driven by multiple mechanisms other than changes in average temperature. Depending on the region, some mechanisms are more dominant than others. For example, a lack of winter rainfall over Europe greatly influences the intensity and duration of the most extreme heatwaves for this region. Such conditions were key ingredients for the severe European heatwaves of 1976, 1994, 2003 and 2005.

How can we be sure that observed changes in heatwaves are due to humans?

In a similar way that doctors can work out how lifestyle factors (e.g. smoking) increase the risk of certain diseases (e.g. cancer), climate scientists can work out how greenhouse gas emissions from human activity alter the risk of certain extreme events, such as heatwaves. We do this with climate models, as we can switch things on and off in them in ways we can’t with the real climate. We can simulate a ‘natural world’, where greenhouse gases are kept at pre-industrial levels, and we can also simulate the ‘current world’ over and over again, with greenhouse gases prescribed by real-world observations. Then we compare how often the heatwave occurs in the natural and current worlds.
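The arithmetic behind this is exactly the doctors' relative risk. Here's a sketch with invented counts; real studies derive these from large model ensembles:

```python
# The relative-risk arithmetic doctors use, applied to the two simulated
# worlds. Every count below is invented purely for illustration.
def relative_risk(events_a, n_a, events_b, n_b):
    """How many times more often the outcome occurs in group A than group B."""
    return (events_a / n_a) / (events_b / n_b)

# Epidemiology: cancer cases per 1,000 smokers vs per 1,000 non-smokers.
print("smoking:", relative_risk(24, 1000, 3, 1000))    # 8x the risk

# Climate: summers as hot as some observed heatwave season, per 1,000
# simulated years in the 'current world' vs the 'natural world'.
print("heatwaves:", relative_risk(40, 1000, 20, 1000)) # 2x the risk
```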

The result? Not great.

WARMING WORRIES

While the 2003 European heatwave was, at least in part, due to low rainfall during the previous winter, a 2004 study found that the frequency of such an event had at least doubled due to human influence on the global climate. However, a more recent study in 2014 found that the likelihood of a similar event occurring has increased even further, just over the last decade.

Over Australia, the occurrence of severe heatwave seasons such as 2013’s has at least doubled due to human activity. That same year, a severe heatwave season occurred over South Korea; such seasons now occur 10 times more often than before the industrial revolution (both studies are in this report).

These are just a few examples. There is a whole swathe of literature concluding that human activity has influenced observed changes in heatwaves. While there are other processes that trigger these events, and there is regional variation in how much humans have altered their occurrence, the common conclusion is crystal clear – the impact of global climate change on these high-impact events is already detectable.

And this is all from a global average temperature change of 0.8°C.

What about the future? Well, if we keep emitting greenhouse gases at the current rate, by 2100 the global average temperature could be anywhere between 1.5 and 5°C warmer. I don’t think I need to spell out the disastrous impact this will have.


the landscape of early career research in Australia

1/31/2015

7 Comments

 
this morning I read an article about an early career researcher who TURNED DOWN a DECRA.

yep, you read that right. She turned it down.

For those of you who don't know, a DECRA (Discovery Early Career Researcher Award) is a hugely competitive research grant given by the Australian Research Council (ARC). You have to be within 5 years of your PhD to be eligible, and here's the cracker: only 14% get funded.

14% - that's really crap odds. If you don't get one the first time you try, you can only try again one more time. 2 shots at 14% - bloody hell.

But, you gotta be in it to win it - a lot of blood, sweat and tears go into these applications, and for many early career researchers it's the first grant application they've ever done, which makes it all the harder.

These things are the holy grail in Australia if you're within the 5yr PhD mark. Get one of these and you quite possibly have your foot in the door for an academic career.

So err, why did she turn it down? At first I was kind of shocked when I read this article, and a bit offended actually. I put a phenomenal amount of effort into writing my DECRA application; there was absolutely no way in hell I was going to turn the offer down. And it's not like her allotted funding will be passed down to the next person on the list - the money simply goes back to the ARC.

So out of the 200 DECRAs funded each year, there's already at least 1 that won't be completed. I'm sure there are a lot of applicants who only just missed out that would have loved to receive Dr. Edwards’ funding instead, and would have clutched on to it for dear life.

Though, as I said before, it only POSSIBLY gives you a foot in the door for an academic career in Australia. In the eyes of Dr. Edwards, possibly was not good enough. Even though she no doubt put as much effort into her application as anyone else, she does not see a future in academic science in Australia.

It is appalling that this is what it has come to. The best and brightest are leaving us.

But as much as I hate to admit it, I completely agree with Dr. Edwards.

The science and research landscape in Australia has changed.

CSIRO's funding has been slashed. The future of university fees is uncertain. The future of some ARC fellowship programs is uncertain. We did not have a science minister for the first part of the Abbott government (and I'm not sure how useful the current one will be). Permanent academic positions are as rare as hen's teeth (that's a direct quote from a senior colleague of mine). And, in the case of my field, climate science, there is even less interest at the federal government level. We already word grants and other formal documents so that "climate change" and "global warming" do not appear, so we don't rock the boat.

Yes, DECRAs are highly prestigious and extremely impressive, but if there are no jobs here in the long term, then what good are they? This is likely what underpinned Dr. Edwards’ decision. And I can see her rationale.

In fact, I know three colleagues from my department alone who moved overseas to either permanent roles or a completely new career before their DECRAs had finished. Because there are no options here.

This is something that's been playing on my mind a lot lately. I've got 2, maybe 3 years at best in my current role as a DECRA fellow. I'll only get the extra year if I show my department proof that I've gone for other grants, which I should probably start thinking about now. If I'm lucky enough to get another grant then that's just another 3-year cycle. I've got Buckley's chance of getting a permanent position in Australia, let alone at my home institution - not because I'm not competitive, but simply because there are NO JOBS.

In the very unlikely yet fortunate case of a permanent position becoming available, it wouldn't shield me from the perpetual funding cycle either - all permanent positions are dependent on your ability to bring in funds. But at least I could get rid of that ever-present back-of-the-mind thought that I have no idea where I'll be in 4 years (which actually really scares me).

I can see how even this option is completely unappealing to Dr. Edwards, and other people in our position. If I am totally honest, it's unappealing to me too. Although I'm still considered early career, I'm not exactly 20 anymore. Job security, a stable income and a healthy work-life balance are becoming more important the older I get.

A couple of short-term contracts (post-docs) are completely acceptable and necessary to gain experience in the very early years of an academic career. But in order to do good research - proper, world-class research over the long term - they are not sustainable. Simple as that.

What are the other options for Australia's early career researchers?

1) move overseas or 2) find another career.

For me, if at the end of my DECRA things are looking pretty bleak, my choice will be another career (note, if my students are reading this: this is not going to happen any time soon). I love my job, but at some point enough will have to be enough. Dr. Edwards chose to stay overseas. We all have our limits.

What are the options for Australia to stop this happening?

Gosh, I can't help but be brutally honest. Australia, pull your finger out. Science and research underpin our society. Fact.

From polymer banknotes, Aerogard and wifi (which are all products of CSIRO) to solar panels, black box flight recorders, ultrasounds and the HPV vaccine - these are all things that Australian (yes, that's right, Australian) science has pioneered. No money in science, and these developments stop coming. Worse still, the brightest move overseas to make their discoveries there.

So we're shooting ourselves in the foot, twice.

We need to keep our youngest and brightest here. We need to change the science and research landscape again, because right now the grass is definitely greener on the other side. We need to make it more attractive to stay, well beyond a 3-year fellowship that is almost impossible to get anyway. This is not just to keep the scientists happy - the Australia of the future will undoubtedly benefit too.

2014, year of the impostor

12/28/2014

2 Comments

 
goodness me, what a year!

2014 was quite possibly my most hectic year yet, which is a little ironic, given that I promised myself that I would not repeat the chaos of 2013. whoops.

As I look back on 2014, what comes to my mind is the recurring feeling of impostor syndrome.

I first learned about impostor syndrome at the annual Australian Meteorological and Oceanographic Society conference in Hobart earlier this year. For the first time, a women's lunch was held, and one of my more senior colleagues at UNSW gave a talk about her experiences as a woman in academia. I was absolutely shocked and amazed when she started describing how she felt that someone was about to tap her on the shoulder, having discovered she was a fraud, and tell her to get the hell out of academia. Of course she is nothing of the sort, but this feeling, this syndrome, is quite prevalent in academics, particularly female ones.

Before she gave her talk, I thought it was just my own insecurity, and to some extent my own competitiveness, that made me feel quite similar. I've had these recurring thoughts since starting postgraduate study. I just routinely told myself to harden up and get on with it. I did not realise that I was not alone here, let alone how common it actually is.

While the realisation that a lot more (female) academics feel this way brought me some solace, impostor syndrome did indeed rear its ugly head time and time again during 2014.

While I will always consider myself very fortunate to receive a prestigious DECRA grant, I do constantly wonder if I really did deserve it. The ARC can only give out 200 each calendar year for a 3-year research fellowship, and so competition is stiff. How can I know for sure that mine was awarded on merit, and not for some other reason, e.g. political, or the personal interests of the reviewer/s? Moreover, I was the only current employee of my centre to obtain one for 2014 - there is no way my proposal was the best out of the numerous others submitted by my colleagues. Not a chance. I sometimes think that I will receive an email from the ARC saying they made a mistake and that I actually have no job or grant after all.

Halfway through 2014, my partner and I took a 2-month holiday to travel Europe. It's something I'd always wanted to do since my early 20s, he'd never been, and lately I'd just been tacking short holidays on to work trips. It was the trip of our lifetimes. We had so much fun, and we saw and learned so much. I'd promised him that I'd not work or even check my emails the whole trip, but within 2 weeks I'd already given in, answering emails and working on paper revisions with a deadline before we returned home. I'd also started to panic that I'd miss out on opportunities back home, and was damaging my career in the long term, all from taking 2 months off for some much needed R&R and to, well, see the world (this PhD comic sums up perfectly how I felt throughout my entire holiday). I also felt like I was over-indulging myself. As soon as I got home, I'd be told that academics don't have the luxury of long holidays, and because I was not working my fingers to the bone during those 2 months, I do not belong in academia and should seek another career. Not long after we got back, I also had to take some personal leave, and the same thoughts entered my head once again. (I must stress that my work environment is VERY supportive and UNSW has excellent leave provisions; I use this example as it's a textbook case of impostor syndrome.)

This year, my publication record has suffered a fair bit. I've had 2 manuscripts rejected, and for various reasons (perhaps including my holiday) I've simply not had the time to finish the multiple other projects I currently have going. The old cry of the academic constantly rings clear in my head - "publish or perish" - if I do not publish quality manuscripts often, then I simply do not deserve to still be in the game. 


And then there is the extreme competitiveness of the academic game - permanent positions are as rare as hen's teeth. In a couple of years I'll be fighting once again for another grant, in the hope of a few more years of research and employment. I'd absolutely love to get a permanent role, but I'd be competing with hundreds of others in a similar position to me, with marginally better chances if we moved overseas. I'm surprised those "powers" higher than me have not told me to bow out of the game already, and let someone who really deserves it be propelled forward. And good lord, what if I did get a permanent position? I'd constantly be waiting for that tap on the shoulder telling me to move on.


I am writing this blog on New Year's Eve, in the hope that 2015 will not be filled with so many impostor-related feelings, and will be prosperous for my career and communication activities. Personally, I do wonder if there's more that we can do to help eliminate these thoughts, as it's clear so many academics, and even people in other industries, have them. Perhaps there are strategies we can employ to deal with these thoughts? Educate younger academics/students on impostor syndrome, and that it isn't "just them"? I think this is really important, as at times it can be particularly dominating, and likely impacts the work of our youngest and brightest. We may even lose some up-and-coming stars who give in to impostor syndrome.


I also write this blog in the hope that it may put other academics and sufferers of impostor syndrome at ease - you're not alone. Do not let the feelings of being an impostor take over your career - you deserve to be where you are, you worked hard to get there, and many, many people who surround you see this. You belong right where you are.





Man, I should really listen to my own advice sometimes!



below - some photos of our holiday :) can you guess which places these are?

heatwaves, and finding the human signal in them

4/18/2014

4 Comments

 
Lately my research has taken me to investigating the role of human activity on changes in heatwaves. I've been asked countless times, generally during or directly after a heatwave (though sometimes before), whether humans are the cause behind that particular event. So I decided to look into this. But first, a bit about heatwaves.

Heatwaves have always punctuated the Australian (and global) landscape, and occur due to the combination of numerous conditions, such as low rainfall, dry soil, higher than average background temperatures, and the positioning of what's called a "persistent high" - a near-stationary high pressure system that advects warm air to the affected area. They are measured relative to the local climate base period - so heatwaves can (and do) occur in Hobart, potentially resulting in catastrophic local impacts, even though the measured temperatures would be cooler than in a heatwave in Alice Springs.

By the description I gave above, it is completely plausible for heatwaves to occur in a stationary climate - one that is not showing any overall change, for whatever reason, other than year-to-year fluctuations due to natural cycles (e.g. El Niño/Southern Oscillation). So heatwaves did actually occur before human activity had any measurable impact on the climate system. If the climate were stable around natural climate variability, we would not expect to see any increases or decreases in the number of heatwaves, their intensity, duration, or how early the first one occurs each season. Heatwaves themselves would still occur, but we wouldn't see any real change in their behaviour over timescales much longer than a few years.

However, measurable changes in heatwaves have occurred, as shown by research I've undertaken myself here, here, and here, and that reported by the Intergovernmental Panel on Climate Change.

But are we sure these changes are actually due to us? How are we sure they aren't due to natural causes? Such questions are indeed on the mind of many when we do actually experience a heatwave. 

We need to be careful about what question we're actually asking. As I mentioned above, I constantly get asked "are humans to blame for this heatwave?". We cannot categorically answer this question with a yes or no, or even with something like "humans were 75.3% responsible for the massive Australian heatwave in 2013". As I also said above, a range of factors needs to be present for a heatwave to occur, and human-induced climate change is just one of them.

But what we can determine instead is the change in likelihood, or risk, that climate change has imposed on recent heatwave events. Using the example in the last paragraph, we can say something like "climate change has increased the likelihood of the massive 2013 Australian heatwave occurring by 75.3%". This is saying that the event can occur without climate change, but occurs more often than it otherwise would have, due to climate change (note that 75.3% is a random number for illustrative purposes).

The difference between the two answers is subtle, yes, but very important.

Think of it like betting on a racehorse. There are numerous factors that influence whether or not a horse will win: breed, training, age, track conditions, jockey, fitness, etc. The odds of the horse winning are controlled by these factors, rising and falling based on their balance. Better training = better odds and a poor track = worse odds, for example (although one might overshadow the other if they both occur at the same time). If the horse did win, we can't put it down to any one factor, and can only say, at most, that any one factor increased or decreased the odds, or likelihood, of that win.

This logic applies when investigating the human influence behind specific events, such as heatwaves. So how do we do it? 

This is where climate models come in. We only have one set of observations, so climate models provide us with lots of data that include the effects of human activity (such as the release of greenhouse gases), and lots of data where only natural climate variability prevails. We can then compare how often heatwaves occur in this natural-only world to how often they occur when our current level of human activity is present. This gives us our change in likelihood, or change in odds, due to human-induced climate change.
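In code, the comparison boils down to something like the sketch below, where synthetic random numbers (with and without a shift) stand in for real model output, and the event threshold is invented:

```python
# A bare-bones version of the comparison: count how often the simulated
# "natural" and "current" worlds produce a summer at least as extreme as
# an observed event. All numbers are synthetic stand-ins for model output.
import numpy as np

rng = np.random.default_rng(7)

natural = rng.normal(0.0, 0.5, size=2000)   # simulated summer anomalies, deg C
current = rng.normal(0.4, 0.5, size=2000)   # same, plus human influence

observed_event = 1.0   # anomaly of the real-world event we care about

p_nat = (natural >= observed_event).mean()
p_cur = (current >= observed_event).mean()

print(f"natural world: {p_nat:.1%} of years at least this hot")
print(f"current world: {p_cur:.1%} of years at least this hot")
print(f"change in likelihood: {p_cur / p_nat:.1f}x")
```

Swap in real ensemble output and that final ratio is the change in odds - exactly the kind of thing we can quote for the racehorse.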


To throw in another analogy, the same method is applied to understanding the likelihood of cancer in smokers. In fact, climate scientists stole this method from epidemiologists. The rate, type and severity of cancer is determined in large groups of smokers vs non-smokers. This allows us to estimate by how much smoking increases your cancer risk, compared to if you didn't smoke at all.


And, unfortunately, though likely unsurprisingly, human activity has increased the likelihood of more intense and more frequent heatwaves. If we consider actual observed events, such as the intensity and frequency of heatwaves during the 2012/2013 Australian summer, human influence has more than doubled the likelihood (or doubled the odds) of events like these occurring. This means that heatwaves like those during the 2012/2013 season occur at least twice as often as they used to. It's important to remember that this is very different to stating that humans are (insert your favourite number here)% responsible for the heatwaves that occurred during this season.


So, in summary, we can certainly quantify the human signal in heatwaves. But in order to do this properly we need to be asking the right question. 


If in doubt, think of the racehorses!



importance of a work-life balance - lessons learned!

3/10/2014

5 Comments

 
At the most recent Australian Meteorological and Oceanographic Society conference, I was part of a “life after post doc” forum, aimed at postgraduate students and junior researchers. There was a panel of three - myself, barely out of a post doc; a researcher from the Bureau of Meteorology; and a professor from the Australian National University. The idea was to share our experiences and journeys to our current positions, and allow students to ask about any particulars.

One student asked how we manage a good work-life balance. Straight away I delved into how imperative it is, how we all need to keep up a social life, a home life, regular exercise and/or a hobby, and anything else that gives us a life outside our careers. I professed to not working weekends, and to never succumbing to the pressure to do so in order to be successful.

Not long after the forum finished, I realized that I was, indeed, a hypocrite.

I have always been an advocate of a good work-life balance. I led a (very) active social life during my undergraduate studies, which continued well into my PhD. I also worked part time to help pay the rent, and took up various active hobbies.

Sure, when my studies required more of my time I battened down the hatches and did whatever needed to be done, whether it was studying or thesis-writing, but it was only ever temporary. And even then, I still made sure I had some sort of balance, such as dinner with a friend or a gym session to burn some energy and frustration.

But over the last couple of years, and especially the last 12 months, the balance has been overrun.

There seems to be this stigma, particularly among younger researchers in academia, that in order to succeed you must work all day and all night. We have studied for a long time to get to where we are; jobs are extremely hard to find, particularly permanent ones, and often involve moving overseas. There also seems to be a move towards more contract positions and fewer permanent positions in academia globally, which doesn’t raise high hopes for up-and-coming researchers.

Therefore, in order to earn our positions we need to work our fingers to the bone (because having a PhD is not enough).

At times this actually sounds quite reasonable to an academic. We love what we do! We studied for so long to do what we do! Why wouldn’t we want to do it ALL THE TIME?

Thinking right along these lines, at the beginning of 2013 I set myself the New Year’s resolution of excelling in my career. I had been in a great post-doc position that, quite rarely, came with a lot of research freedom. However, I was well aware that post-doc positions are short-lived, and there was no time for complacency. So I decided to do ALL THE THINGS to propel my career forward.

This list was oh so long, including applying for a research grant, joining a young researchers international committee and a local conference committee, spending three months working abroad in Europe and Ecuador, setting up my own blog space and twitter account, going for various awards, setting up an interactive heatwave website, mentoring a summer student, undertaking various domestic collaborative trips and workshops, involvement with the Australian Climate Council, starting up a partnership in the Scientists in Schools program, polishing my communication skills and executing them constantly, as well as trying to keep on top of my research interests, and, publish, publish, publish!

Oh god, I’m exhausted just typing all that.

Now a lot of that is stuff that I wanted to do. But all this in one year? Crazy.

I started 2014 in utter exhaustion from burning myself out the year before. I was much more anxious than enthusiastic about what this year would bring, and how I would cope with a mammoth workload as, truth be told, it was my publications that suffered the most in 2013 (a cardinal sin in academia, I know!). I got into my job because it excites and fascinates me, and certainly not because it makes me feel anxious. But it wasn’t until the forum that it became clear my work-life balance was completely out of whack.

Even the most enjoyable tasks can get tiring. Besides my job, I also happen to really love running, but just as long, repetitive and strenuous exercise can take its toll on the body, so can working too hard take its toll on the mind.

As it turns out, I’ve been extremely fortunate in receiving the research grant, and was also given a Tall Poppy award, both of which I am extremely grateful for and excited about. But do you think I’ve been able to breathe a sigh of relief and relax, given that I achieved what I set out to achieve?

No.

It’s only made me want to work even harder, because I’ve been so lucky, and in order to really deserve it, I have to work even harder still.

Yep, definitely crazy.

But what I’m most fearful of is that I’m not alone. Other young (and perhaps even not-so-young) academics are feeling the exact same way. Sure, you gotta put in the hard yards to achieve your goals, and you have to be at least a little bit competitive in the academic industry. But in order to be the most successful you can possibly be, you need to give yourself a break.

Although I feel less of a hypocrite for realizing my own poor balance after the forum, I’m disappointed in myself that it took something like that to actually wake me up. I don’t want it to be the same for other young academics. It is just not worth it. Balance is imperative. It is worth so much more than pushing yourself to your limit (can you really put a price on your sanity?).

Take it from me: tired, grumpy academics do not produce good research. Nor do they make good friends, partners, sons or daughters. NOTHING, least of all your sanity and personality, is worth pushing yourself to the limit for.

Perhaps some people may think it’s easy for me to say this as I have gained a grant, but my job is only guaranteed for another 3 or so years. And in order for me to stay where I am permanently, someone else will need to move on. I’d then be in competition with all researchers at my level, both within my workplace and any outsiders too. So in another few years I’ll be facing the exact same prospects as I did last year, and as so many young researchers do after their PhD and between contracts.

But consistently pushing too far only makes things worse. If I kept pushing, I wouldn’t want to be around in 3 years’ time.

The key to a work-life balance is different for everyone – I find running relaxing, but I know that’s not the same for everyone. You need to find what’s right for you. I’m not going to dictate how one should balance their life; rather, it’s the notion of balance itself that is essential. Generally, you do need to do enough work to satisfy your natural academic curiosity, but if you find yourself hating what you do, then you’ve pushed yourself too far. We’re in this job for the love, remember?! That limit could be 6 solid hours for one person, maybe 12 for someone else, but everybody does have a limit.

And this should not be pushed.


Science and communication - like oil and water? Or cake and icing?

10/17/2013

0 Comments

 
Last week I participated in and helped organize the Greenhouse2013 conference in Adelaide, Australia. This conference is quite unique, as it bridges the science of human-induced climate change as well as its impacts. Participants included scientists like myself from universities, CSIRO, and the Bureau of Meteorology, as well as members of local and state government, human health experts, agricultural experts, and representatives from renewable energy companies. I even spoke to someone doing her PhD in coal mining. But there was one particular group of people that really caught my attention – communicators.

Perhaps it is my recent interest in science communication that made this group stand out to me. I’ve only been to one previous Greenhouse conference, and my memory fails me as to whether there was a similar communication presence. This latest meeting in the conference series had its own science communication stream, where experts in this niche presented new and innovative ways to reach the general public on various climate and weather-related topics, including the issue of human-induced climate change. I even participated in a casual forum of five early career researchers, where one of my fellow panel members was a trained, successful, and engaging science communicator.

Now, I’m aware that perhaps not everyone in this industry has the same enthusiasm I do about communication. This could be for various reasons, whether they be personal or professional. But no matter how we slice and dice it, climate change, and therefore climate science, is a topical and contemporary subject. So does this mean that science and communication can mix?

If this was a black and white issue, then there would simply be two camps: yes, like cake and icing, or no, like oil and water.

Let’s get the negatives out of the way first and start off with some of the reasons why some climate scientists say NO. Firstly, the media has a terrible reputation for misrepresenting stories, and at times has been thought to have a hidden agenda. It is quite understandable that someone does not want to place their hard work, and potentially their reputation, on the line, all for a 5-minute (or less) interview and their misquoted findings forever etched somewhere in stone. Some may say that since the life cycle of the media is so short, a misquoted piece will go away as soon as it emerges. But due to the nature of the scientific method, scientists spend years gathering their results and have trouble believing that something disappears as quickly as it appeared. Moreover, like the bad press that continually follows anyone in the public eye - politician or celebrity - some may fear that these reports could resurface at any time, only to bring us back down.

Secondly, talking to the media takes up precious research time. Generally, before an interview one needs to plan exactly what to say – you only have a few minutes if you’re really lucky, and trying to condense a few years or so worth of research down to a few sound bites is challenging to say the least. And if you’re fearful of being misrepresented, you’ve got all the more pressure on you. Sometimes you might be asked to provide comment on a particular research article or study that, due to embargoes, you haven't seen yet. So there goes a couple of hours just reading and critically evaluating the study, before you’ve even worked out what your main points are. And if you’ve been asked to write an opinion piece or article – well sheesh, there goes a full day at best!

Thirdly, it is not actually part of our job description, formal training, or qualifications. Scientists are trained to critically analyse physical processes, and to present their findings to like-minded peers at conferences and within scientific journals. We are not trained to present our findings in a way that everyone can understand, let alone appreciate or find as exciting as we do, nor do we generally possess the natural ability to do so. This may seem a bit of a cop-out to some, but it’s more than enough motivation for others to just continue along their research trajectory without attracting too much attention to themselves.

Ok, now for the “cake and icing” camp.

For some climate scientists who enjoy communication activities, it’s all about the challenge. I mentioned above that we are not trained in this area, nor do we have a natural ability, so acquiring communication skills and using them effectively is definitely a challenge. Moreover, it’s a very rewarding one, particularly when people’s feedback is positive and, better yet, they now make sense of something they didn’t before. And a lot of scientists see a challenge as a good and gratifying exercise. After all, our job is full of researching things we don’t know, and the excitement of acquiring new skills and findings is what gets us out of bed every morning.

The communication of climate science can also bring fulfillment, meaning and purpose to your research. I have lately heard from various sources that those who communicate, particularly over the internet, are narcissists and just want the attention. I strongly disagree with this broad generalization. The communication of our work brings purpose to our RESEARCH, not to US. We do our job because we find it interesting, it means something and it’s our passion, not because it puts us in the public eye (I think you’d find celebrities fit more into this category). However, since our research means something to the general public and to policy, and not just to one small fraction of a particular research field, communicating it effectively to the masses is like the final cherry on a sundae (or icing on the cake, so to say).

And on the note of climate science being so topical, communication provides the opportunity to put the right information and the right science out there. You’re the expert, and should be the first port of call. It gives you and your peers the perfect chance to counteract the “hearsay” and misinformation that may be circulating, whatever the reason. Although I discussed above that this may be a turn-off for some, it is definitely a motivation for others, as it gives the public the opportunity to weigh up both sides of the argument themselves, instead of just being berated by the same ill-formed statements all the time. After all, who would YOU trust when it comes to your health, for example? A car mechanic? A lawyer? An accountant? Or a doctor?

The same theory applies to climate science.

Of course science and communication, particularly in the field of climate science, is no black and white issue. There are many shades of grey, where various scientists decide their own level of communication based on their personal beliefs and decisions. What is definite, however, is the growing interest, resources, and forms of communication delivery that now exist. To me, this indicates that communication and climate science can indeed mix – and not only that, but it is the scientists who can control the extent to which this mix occurs.


Sydney heat - is it hot enough for you?

10/17/2013

2 Comments

 
One of the great oddities of recent times in Australia is that during our increasingly frequent and intense fire seasons – when we're losing houses and, unfortunately, lives – it is seen by many as rude or in poor taste to talk about climate change.

It is quite a bizarre response considering an ever-growing body of research highlights that increases in heat waves, fire danger and extreme temperatures are intimately linked to global warming.

More importantly, these three areas are considered to be the earliest, most responsive and well-defined impacts of climate change.

In Australia, we have seen the Bureau of Meteorology add a new temperature colour to its maps, the creation of a catastrophic fire danger category, the hottest 12 months on record and heat records falling at increasing rates over the past 50 years. Worldwide research has shown that the number of new heat records being set has increased by 40 per cent while the number of extreme cold records being set has declined by 40 per cent.

We are seeing a shift in the climate towards warmer conditions that will unequivocally have an impact on the timing and intensity of fires.

In Sydney, our fire season started in September this year. On Thursday we had forecasts for a 39-degree day in Sydney and the declaration of catastrophic fire conditions in other parts of NSW.

Unfortunately, for me, this fire season shift comes as no surprise – it is exactly what is expected under climate change. We are no longer talking about projections, but observations made over the past 50 years and longer that reveal the change.

Our own research at the ARC Centre of Excellence for Climate System Science has shown that in Australia heatwaves are getting longer, hotter and more frequent. Beyond the increased fire danger, the health impacts of high overnight temperatures are enormous particularly on the more vulnerable and elderly sections of our population. Death rates go up during persistent heatwaves.

This week in the journal Nature, new research claimed to pinpoint the exact year in which many of the world's major cities will see their climates completely alter. While this kind of precision is perhaps inappropriate when talking about complete transformations in regional climates, it does not alter the fact that this is where we are heading.

When we look at future climates, the preponderance of evidence suggests an increasing number of extreme heat events, no matter how they are measured. There is also increasing confidence around the idea that wet areas will get wetter and dry areas will get drier. The changes in observed salinity in our oceans support this proposition.

Future projections of severe rainfall events under enhanced atmospheric greenhouse gas concentrations show they are also likely to be more intense.

This summer follows two intense La Nina periods that brought extensive rainfall to the east coast of Australia. While there is no clear indicator that these record rainfall events were due to climate change, because precipitation is hard to model, the extreme flooding and rainfall of these two years has created a vegetation load that has many in our fire services deeply concerned.

On a personal note, my partner is a volunteer in the Rural Fire Service. The amount of fuel load that has built up over the last few years, coupled with an early fire season, is something that he and his colleagues have rarely, if ever, seen. They are expecting the worst while praying for a reprieve in the summer ahead.

Hazard reduction burning has been particularly fraught this year, and both my partner and his colleagues have explicitly said they are concerned about the conditions and what the summer could bring.

At the same time, their concerns are magnified by the fact that so few people have prepared for this fire season even though it has already started. Gutters are still filled with leaves and flammable objects litter properties that are close to the verge between bush and urban areas.

It is my hope that the devastation inflicted by infernos such as those in Victoria and Tasmania is not visited upon NSW. With a partner who is likely to be on the frontline fighting these fires, I have real skin in the game.

But if Sydney and NSW do experience such devastation, I don't want us to ignore the role climate change played. There is no doubt in my mind that global warming is significantly contributing to making Australia's fire danger worse than it has ever been.

If towns are burning and Australian lives are being put more at risk because of this, we have a responsibility to face the role of climate change, to talk about it and to consider our response to a challenge that will only grow if we do nothing.

When lives and communities are being devastated, it's more than rude not to talk about climate change and fires - it's life threatening.

This opinion piece was written by me and published in the Sydney Morning Herald on Thursday the 10th of October, 2013.

    Author

climate scientist, fascinated by extreme events, but kinda tired of being made out to be a "bad guy". Tends to moonlight as, well, your average human being.
