Friday, April 3, 2015

The Evolution of Models - 3

Complex Systems
Architects of prospective sea level rise (SLR) software have to face 'mystery', or, better said, complex system behavior, from time to time.

Complexity becomes 'mysterious' when it is "on steroids" to the point that it moves out of cognitive range.

Simulating the complexity exactly requires an understanding of the unknown, which is a contradiction, so the solution sometimes chosen is to guess or to ignore.

For example, earlier IPCC reports did not treat polar ice cap melt as calculable, so that realm was excluded from the picture.

Thus, model projections consistently underestimated the real scenario, because the polar ice caps were destined to eventually become the main player in SLR.

The polar caps are now given a place at the table, because they are the prime source of future SLR.

Knowing that does not, in and of itself, remove all of the mystery, because SLR is a phenomenon of a complex system.

There are linear, non-linear, exponential and non-exponential factors, even in well known dynamics, such as an 'acceleration' (The Question Is: How Much Acceleration Is Involved In Sea Level Rise?) or a 'surge' (Will This Float Your Boat - 5, What Do You Mean - World Civilization? - 2).

There are various types and causes of both acceleration and surges.

Add to that the occasional misidentification, whereby acceleration, surge, or both can sometimes be confused with "feedback loops."

That said, it is also true that surges and/or accelerations are sometimes in fact due to feedback loops, while at other times they are separate natural surges and/or natural accelerations within a complex non-feedback dynamic, and the two are easily mistaken for one another:
"We may be witnessing the start of the long-awaited jump in global temperatures. There is “a vast and growing body of research,” as Climate Central explained in February. “Humanity is about to experience a historically unprecedented spike in temperatures.” - Think Progress

"Humanity is about to experience a historically unprecedented spike in temperatures.

That’s the ominous conclusion of a vast and growing body of research that links sweeping Pacific Ocean cycles with rates of warming at the planet’s surface — warming rates that could affect how communities and nations respond to threats posed by climate change." - Climate Central
And of course, as I said above, on top of acceleration and surge, there are separate, individual feedback loops:
"Now, scientists believe they’ve untangled the relationship. In a paper published Monday in Nature Climate Change, researchers from the University of Exeter claim to have found direct evidence that as global temperatures rise, so does the atmospheric concentration of greenhouse gases, creating a positive feedback that in turn warms the Earth even more — basically, global warming creates more global warming.

“We discovered that not only does thickening the blanket of heat-trapping gases around our planet cause it to get warmer, but also, crucially, when it gets warmer this in turn thickens the blanket of heat-trapping gases,” Tim Lenton, the paper’s author, told ThinkProgress, “so we have a process called a ‘positive feedback’ that amplifies changes in the Earth’s temperature.”
(Study: Direct Evidence That Global Warming Causes More Global Warming). The designer or architect of software that has these issues within its context can try to build that logic into the code itself, or can choose instead to have these scenarios and dynamics described by data.

Data which the software analyzes and derives values from, then injects as derived results into the other ongoing streams of computations and calculations.

IMO, the highly data-driven model is the preferred method.

Because, as the situation becomes clearer and the data becomes better, the simulations will become more accurate without any change to the code.

The program logic (e.g., x + y - z) can be used in thousands of scenarios without changing the code, merely by updating the data values for x, y, and z.
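
A minimal sketch of that idea, in C++ and with a hypothetical scenario file format (not the actual program's), might look like this:

```cpp
#include <fstream>
#include <iostream>

// Fixed program logic: never changes from scenario to scenario.
double combine(double x, double y, double z) { return x + y - z; }

int main() {
    // Hypothetical scenario file: one "x y z" triplet per line.
    std::ifstream scenarios("scenarios.txt");
    double x, y, z;
    int n = 0;
    while (scenarios >> x >> y >> z) {
        // Same code, different data: each line is a new scenario.
        std::cout << "scenario " << ++n << ": " << combine(x, y, z) << "\n";
    }
    return 0;
}
```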

So, I have built surge, delay, and acceleration logic into both the data entry exercises and the calculation logic.

Processing that data, then using it in concert with the less mysterious and more certain computations, becomes routine and more error-free in the long run.

One can put expected surge, delay, and acceleration values into the database, and they will be processed accordingly.

Or they can be left out, if a user chooses.
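
A rough sketch of how such values might be carried as data and folded into a yearly melt rate (the field names and the simple compounding form below are illustrative, not the program's actual structure):

```cpp
#include <cmath>
#include <iostream>

// Hypothetical per-zone melt parameters, all supplied as data, not hard-coded.
struct MeltParams {
    double baseRate;      // melt during the first active year (e.g. cubic km)
    double acceleration;  // fractional increase in the melt rate per year
    int    delayYears;    // years of delay before melting begins
    int    surgeStart;    // first year index of a temporary surge
    int    surgeEnd;      // last year index of a temporary surge
    double surgeFactor;   // multiplier applied to the rate during the surge
};

// Annual melt for a given year index (0 = first year of the run).
double annualMelt(const MeltParams& p, int year) {
    if (year < p.delayYears) return 0.0;                     // delay: nothing yet
    double rate = p.baseRate *
                  std::pow(1.0 + p.acceleration, year - p.delayYears);  // acceleration
    if (year >= p.surgeStart && year <= p.surgeEnd)
        rate *= p.surgeFactor;                               // temporary surge
    return rate;
}

int main() {
    // In the real program these values would come from the database, not literals.
    MeltParams coastal{100.0, 0.05, 10, 40, 45, 1.5};
    for (int year = 0; year <= 50; year += 10)
        std::cout << "year " << year << ": " << annualMelt(coastal, year) << " km^3\n";
}
```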

My practice is to develop multiple data scenarios, save them, modify them if needed when research clarifies this or that issue, but always have them available for use.

By contrast, constantly rewriting the code is not preferred when compared to merely adjusting the data.

So, the software I wrote to project future SLR has classes with data members for standard melting, acceleration of melting, temporary surges of melting, delays in melting, and various other data realms.

Data is used to describe the zones of melt (size, year melt begins, maximum melt, etc.).

Those are data values which the user can enter, modify, and control at will.

Thus, the user is in the driver's seat to the extent that the data determines the results.

The purpose which the architect or designer fulfils, then, is to first grasp the principles involved in the solution ("abstractions" in object-oriented jargon), then create "classes" (concepts of operations) which become "objects" (converting abstractions into "concrete" operations) when "instantiated" at "runtime."
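
In simplified, hypothetical terms, the abstraction of a melt zone becomes a class, and an object of that class is instantiated at runtime from the data values the user controls:

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <utility>

// The "abstraction" of a melt zone expressed as a class.
class MeltZone {
public:
    MeltZone(std::string name, double iceVolumeKm3, int meltBeginYear, double maxAnnualMelt)
        : name_(std::move(name)), iceVolumeKm3_(iceVolumeKm3),
          meltBeginYear_(meltBeginYear), maxAnnualMelt_(maxAnnualMelt) {}

    // Ice melted in a given year, capped by the zone's maximum and remaining volume.
    double meltFor(int year, double requestedMelt) {
        if (year < meltBeginYear_ || iceVolumeKm3_ <= 0.0) return 0.0;
        double melted = std::min({requestedMelt, maxAnnualMelt_, iceVolumeKm3_});
        iceVolumeKm3_ -= melted;
        return melted;
    }

    const std::string& name() const { return name_; }

private:
    std::string name_;
    double iceVolumeKm3_;   // zone size, from the database
    int    meltBeginYear_;  // "year melt begins", from the database
    double maxAnnualMelt_;  // maximum melt per year, from the database
};

int main() {
    // "Instantiation" at runtime: the data values would normally come from the database.
    MeltZone coastal("Greenland coastal", 300000.0, 2020, 900.0);
    std::cout << coastal.name() << " melts " << coastal.meltFor(2030, 1200.0) << " km^3\n";
}
```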

This article from BBC (Greenland ice sheet losses double) gives some idea of what a developer of such software faces.

This article from NOAA (Arctic Report Card) gives some idea of how breaking ice sheets into zones still requires more than that alone.

That is, more than just a "melt beginning year," acceleration of melt, delay of melt, temporary surges in melt rate, and the like.

Comprehensive consideration of the phenomenon requires a concept of a dynamic rate-of-melt, one that changes within that zone.

And, from time to time, changes in the other zones too.

For example, did you notice that, in the Arctic Report Card piece mentioned above, the ice loss as measured by the GRACE satellite (which measures mass, not volume) dropped to "no mass loss" (Fig. 3.3 in that post)?

Using the same formula that derived 14.87%, L = (f / s)^(1 / y) - 1, where y = 5 years (see Will This Float Your Boat - 5), that 14.87% drops to 12.25% when y = 6 years instead of y = 5 years.

That is a drop in acceleration of 17.6% in one year (assuming Cryosat-2 comes up with the same result in its upcoming report).
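
For anyone who wants to check that arithmetic, a short sketch follows; it assumes a final-to-starting ratio of 2 (a doubling of the loss rate, per the BBC piece above), which is the ratio that reproduces the 14.87% five-year figure:

```cpp
#include <cmath>
#include <cstdio>

// L = (f / s)^(1 / y) - 1 : average annual growth rate over y years,
// where s is the starting loss rate and f is the final loss rate.
double growthRate(double f, double s, double y) {
    return std::pow(f / s, 1.0 / y) - 1.0;
}

int main() {
    double f = 2.0, s = 1.0;                 // assumed doubling of the loss rate
    double fiveYr = growthRate(f, s, 5.0);   // ~0.1487  (14.87%)
    double sixYr  = growthRate(f, s, 6.0);   // ~0.1225  (12.25%)
    std::printf("y = 5: %.2f%%\n", fiveYr * 100.0);
    std::printf("y = 6: %.2f%%\n", sixYr * 100.0);
    std::printf("relative drop: %.1f%%\n", (fiveYr - sixYr) / fiveYr * 100.0);  // ~17.6%
}
```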

Meanwhile, Antarctica volume loss at Totten Glacier accelerated (which shows the importance of trends, means, and averages in long-range projections).

The next post in this series is here, the previous post in this series is here.

Thursday, April 2, 2015

The Evolution of Models - 2

Fig. 1 (click to enlarge)
Today, I am going to be "talking out loud" about the sea level rise (SLR) calculation program, version 1.0 (earlier versions I posted about were betas).

I added a masking class that normalizes the candidate data being used for a particular year's calculation, utilizing the track urged by the Potsdam Institute authors in a paper they published in PNAS (Potsdam Institute, cf. PNAS).

By "normalizing" I mean the software logic influences the calculation by utilizing the principle of 2.3 meters of SLR per 1 degree centigrade of increased global temperature as shown in Fig. 1 above, and as discussed in the paper.

The purpose of Fig. 1 is to show the SLR generated by those global temperatures and the time the potential was generated.

The purpose is not to show when that SLR will be actualized in reality.

To the degree that the data is outside that track, the mask modifies the calculation closer to a track in line with that formula, but only to a degree, and only if directed to do so.

That nudge bends any calculation toward the generally accepted history of average global temperatures used by many climate scientists, bloggers, educators, etc. (NOAA).

In other words, if the data presented to the model suggests X, but X is either higher or lower than "2.3 meters of SLR per 1 degree centigrade of increased global temperature," it is modified closer to that norm.
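
A minimal sketch of what that nudge could look like, assuming a simple weighted blend toward the norm (the weight, the function name, and the signature are illustrative, not the actual masking class):

```cpp
#include <iostream>

// Potsdam-style norm: 2.3 meters of eventual SLR per 1 degree C of warming.
constexpr double kMetersPerDegC = 2.3;

// Blend a candidate SLR value toward the norm implied by the temperature rise.
// weight = 0 leaves the candidate alone; weight = 1 forces it onto the norm.
double maskToward(double candidateSlrMeters, double tempRiseC, double weight) {
    double norm = kMetersPerDegC * tempRiseC;
    return candidateSlrMeters + weight * (norm - candidateSlrMeters);
}

int main() {
    // Candidate says 1.5 m for a 1 degree C rise; the norm says 2.3 m.
    std::cout << maskToward(1.5, 1.0, 0.0) << "\n";  // 1.5  (masking turned off)
    std::cout << maskToward(1.5, 1.0, 0.5) << "\n";  // 1.9  (nudged halfway)
    std::cout << maskToward(1.5, 1.0, 1.0) << "\n";  // 2.3  (fully on the norm)
}
```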

The rub in using the Potsdam formula comes when models attempt to implement the delay between the time of temperature rise and the resulting impact on SLR caused by ice melt.

Traditionally, that delay has been about 40 years for some scenarios, but can vary (Time Keeps on Slippin' Slippin' Slippin' In From The Past).

The Potsdam Institute writes about long delay time frames, so by default I staggered it by 100 years (1880-2014 becomes 1980-2114) in the masking, but that can be modified downward (decreased) by a data setting (as are the other relevant parameters that control the modelling program).

Those parameters can be in either an SQL or a flat-file database.
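
A sketch of the stagger itself, treating the lag as just another data-driven parameter (names are illustrative; the 100-year default matches the masking described above):

```cpp
#include <iostream>

// Map a year in the temperature record to the year its SLR effect is assumed
// to show up. The lag would be read from the database (SQL table or flat file)
// along with the other model parameters; 100 years is the default noted above.
int slrRealizationYear(int tempYear, int lagYears = 100) {
    return tempYear + lagYears;
}

int main() {
    std::cout << slrRealizationYear(1880) << "\n";      // 1980
    std::cout << slrRealizationYear(2014) << "\n";      // 2114
    std::cout << slrRealizationYear(2014, 40) << "\n";  // 2054, a shorter assumed lag
}
```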

Anyway, I eventually want to have several masks based on formulas using observed, measured, and known values for those instances where the model is being used in scenarios requiring as little speculation as possible.

That is because the model, being data-driven, will generate any scenario, real or fanciful, that the user wishes to generate data and graphs for.

For example, Fig. 2 is generated by a data set to simulate the IPCC 3 ft. to 6 ft. SLR by about 2100.

Fig. 2 (click to enlarge)
Any scenario can be modelled.

The way I do it is to write a "mySQL" script file that creates an SQL table in a mySQL database, then populates the table with data for the desired scenario.

For instance, Fig. 3 shows about a 21 ft. SLR by 2100 using a different dataset.

That is, it was generated using a different SQL script file in order to generate a different database table.
Fig. 3 (click to enlarge)

One can create as many of those files as desired, and use them at any time, which beats continual data entry whenever a different scenario is desired.
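
For anyone curious how the C++ side might read one of those scenario tables back out of mySQL at runtime, here is a bare-bones sketch using the MySQL C API (the connection details, table name, and column names are placeholders, not the actual schema):

```cpp
#include <cstdio>
#include <cstdlib>
#include <mysql/mysql.h>

int main() {
    MYSQL *conn = mysql_init(nullptr);
    // Placeholder credentials and database name.
    if (!mysql_real_connect(conn, "localhost", "user", "password", "slr_model", 0, nullptr, 0)) {
        std::fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return EXIT_FAILURE;
    }

    // Hypothetical scenario table: one row per melt zone.
    if (mysql_query(conn, "SELECT zone_name, melt_begin_year, acceleration FROM scenario_21ft")) {
        std::fprintf(stderr, "query failed: %s\n", mysql_error(conn));
        mysql_close(conn);
        return EXIT_FAILURE;
    }

    MYSQL_RES *result = mysql_store_result(conn);
    MYSQL_ROW row;
    while ((row = mysql_fetch_row(result)) != nullptr) {
        // Column values arrive as C strings; convert numerics as needed.
        std::printf("zone %s: melt begins %s, acceleration %s\n", row[0], row[1], row[2]);
    }

    mysql_free_result(result);
    mysql_close(conn);
    return EXIT_SUCCESS;
}
```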

As our knowledge improves, those files can improve, and therefore the model will generate a better picture of prospective SLR in future years.

One can only wonder why this has not been basic protocol and practice, but one thing, fear of reality, comes to mind as a recurring phenomenon.

The model, as I have written before, is based on four zones (coastal, inland 1, inland 2, and no-melt) in each area where relevant ice sheets exist (non-polar glaciers and ice caps; Greenland; and Antarctica).

The accuracy of the model's work depends on the accuracy of the values associated with and attributed to each zone.

Thus, to the degree that the size of a zone (stored in the data that the model uses) is inaccurate, the projections will be inaccurate.

So, keeping a close eye on scientific research and findings on the ice volume in the zones in Antarctica and Greenland, then updating the database accordingly, will improve the results.

Likewise, assigning accurate acceleration rates to each zone is essential to the accuracy of the results.

It is all about accurate data.

The next post in this series is here, the previous post in this series is here.




Wednesday, April 1, 2015

The Evolution of Models

Model evolution
Darwinian thinking, in its embryonic state, was not a concept that abiotic entities evolve; no, it was a line of thinking based on the evolution of biotic entities.

For example, those of that ilk in that time long ago never contemplated how genes evolve, no, because they did not know that genes even existed, much less that they evolved.

Then, upon the discovery of genes and DNA, the focus changed to genetics and why genes evolve.

Of course, the "obvious" answer to that was "because they are selfish."

Our evolutionary thought processes have evolved too (The Uncertain Gene, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11).

Now, the real question is not what evolves; rather, the real question is why things evolve.

Even the hypotheses and theories about why "things and non-things" evolve also evolve.

Take natural selection, which was originally a concept far less evolved than it is now.

It is now trending towards randomness, having evolved far away from "survival of the fittest" in the biological science realms (but see The Fittest Stars, Planets, & Species).

Where there is randomness in thinking there is variation in thinking:
"The hypothesis put forth to explain the origins of the Universe, our solar system, and our planet is called the Big Bang Theory. The Big Bang Theory IS NOT EVOLUTION! (The theory of evolution deals with living organisms, once they have come into existance" [sic] - Origins of Life).

"What exactly happened after the Universe was born? Why did stars, planets and huge galaxies form? These are the questions that concern Viatcheslav Mukhanov, and he tries to find the answers with the help of mathematical physics. Mukhanov, Professor of Physics at LMU, is an acknowledged expert in the field of Theoretical Cosmology - and he has used the notion of so-called quantum fluctuations to construct a theory that provides a precise picture of the crucial initial phase of the evolution of our Universe: For without the minimal variations in energy density that result from the tiny but unavoidable quantum fluctuations, one cannot account for the formation of stars, planets and galaxies that characterize the Universe we observe today." - Cosmology Is Evolution
(emphasis added, cf. On the Origin of the Genes of Viruses - 12). Sometimes scientists get a bit possessive about "their science and their universe."

Anyway, models of sea level rise also evolve.

With the same problems.

There are as many models as there are modellers and model problems, so, let's consider some:
"EdGCM provides a research-grade Global Climate Model (GCM) with a user-friendly interface that can be run on a desktop computer. For the first time, students can explore the subject of climate change in the same way that actual research scientists do. In the process of using EdGCM, students will become knowledgeable about a topic that will surely affect their lives, and we will better prepare the next generation of scientists who will grapple with a myriad of complex climate issues.

Our goal is to improve the quality of teaching and learning of climate-change science through broader access to GCMs, and to provide appropriate technology and materials to help educators use these models effectively. With research-quality resources in place, linking classrooms to actual research projects is not only possible, but can also be beneficial to the education and research communities alike." - (Columbia University, American).

"Climate scientists build large, complex simulations with little or no software engineering training, and do not readily adopt the latest software engineering tools and techniques. In this paper, we describe an ethnographic study of the culture and practices of climate scientists at the Met Office Hadley Centre. The study examined how the scientists think about software correctness, how they prioritize requirements, and how they develop a shared understanding of their models. The findings show that climate scientists have developed customized techniques for verification and validation that are tightly integrated into their approach to scientific research. Their software practices share many features of both agile and open source projects, in that they rely on self organisation of the teams, extensive use of informal communication channels, and developers who are also users and domain experts. These comparisons offer insights into why such practices work." - (Toronto University, Canadian)

"This educational software is originally deviced for use in high schools as part of the french curriculum in Life and Earth Sciences. It realises climate simulations given user-chosen parameters. Through a friendly interface, the user chooses the length of the simulation (from 100 years to a few billions years) and the initial conditions, and tests the influence of various parameters involved in climate: astronomic forcing, atmospheric composition, carbone cycle, climatic feedbacks (ice albedo, vegetation, ocean, water vapor). Simulation results (such as temperature, sea level and ice cover), calculated on the flys by a physical climate model, appear on the interface, through curves and images." - (SimClimat, French)
The development of a software solution that is prospective, i.e., it is going to be used to anticipate events going forward, brings up problems that remind me of sayings on the Quotes Page:
"Scientists have discovered that 'the present' has always existed, but they are not sure about 'the past' and 'the future'." -Dredd

"One thing is for sure on the subject of global warming induced climate change: if there was ever a time to err on the safe side, it was long ago." - Dredd
Now, back to my codeine coding.

TEDx Talk excerpt from transcript of video below:
"[Unfortunately, talking about climate forecasts is often a great way to end a friendly conversation!] Climate models tell us that by the end of this century, if we carry on burning fossil fuels at the rate we have been doing, and we carry on cutting down forests at the rate we have been doing, the planet will warm by somewhere between 5 to 6 degrees centigrade. That might not seem much, but, to put it into context, in the entire history of human civilization, the average temperature of the planet has not varied by more than 1 degree. So that forecast tells us something major is coming, and we probably ought to pay attention to it.

But on the other hand, we know that weather forecasts don’t work so well the longer into the future we peer. Tomorrow’s forecast is usually pretty accurate. Three day and five day forecasts are reasonably good. But next week? They always change their minds before next week comes. So how can we peer 100 years into the future and look at what is coming with respect to the climate? Should we trust those forecasts? Should we trust the climate models that provide them to us?"
(Should we trust climate models?).

The next post in this series is here.



Tuesday, March 31, 2015

In The Year 2010

I have been doing upgrades to the sea level rise modelling program.

So, I did not get any posting done.

Time for another look at the past to see that the same old same old is real.

Yep, some review to emphasize that stale BAU is still at work:

Today Barack Obama convinced me completely!

He made a pronouncement that drilling for oil in the waters near the coast of the Atlantic and other places is the right thing to do.

In effect, Sarah Palin and the oil barons are right, he said, reversing yet another campaign talking point made not long ago.

Offshore drilling has been banned in those areas for decades, but that ban is coming to an end, even though he said he would not do that when he asked for our votes.

Obama made this announcement in front of a jet fighter at a military base that "is going green".

That is, the military will begin to run on biofuel; yes, half of MOMCOM's super weapons will have gone green within ten years, Obama said.

Then he praised the military for its leadership in environmental foresight.

That absolutely convinced me that he has absolutely no clue, yet he is one of the smarter people in government, in terms of academic achievement.

What can one say about the institutionalization of abject dementia?

All I can say at the moment is that I feel very sad for him, and for all of us, if that is the best we can do as a society.

One movie calls it The Age of Stupid, another calls it The Criminally Insane Epoch, while some scientists call it inevitable:
"I see it with everybody. People just want to go on doing what they're doing. They want business as usual. They say, 'Oh yes, there's going to be a problem up ahead,' but they don't want to change anything."

Lovelock believes global warming is now irreversible, and that nothing can prevent large parts of the planet becoming too hot to inhabit, or sinking underwater, resulting in mass migration, famine and epidemics.
(Enjoy Life While You Can). What government official can "enjoy life" when it is death, not life, that official's policy is bringing; when there is little to no doubt that we have entered the last age or epoch of the governments of the human species?

While they exclaim that their number one mission is the security of the populace, they bring us ever closer to the brink, ever closer to the midnight of human existence.

And they do it thinking they are wise, yes, they fancy themselves as being wise for seeing human extinction approaching, but doing nothing about it.


Once upon a time, the Republican running for president came up with the term "voodoo economics".

It was a vague term that was not fully understood until his son became president and the economy collapsed.

The Bush II belief was that holy wars started by the High Priest In Chief were cost free.

They were not made part of budget numbers so that the Bush II regime could play deceit games with the public, telling them that all was well with the economy.

In fact, their presidential candidate running to replace Bush II, McCain, infamously said "the economy is fundamentally sound" up until the time it crashed around him, along with his presidential aspirations.

Where is the U.S. economy that was once here when the Bush II regime took over, yes, where has it gone, where did the Bush II regime ship it off to?

The answer is that it has been shipped to Afghanistan, Iraq, and a couple hundred other countries where we dump trillions on military bases designed during a past epoch that has faded into historical insignificance.

The wars are not free; in fact, they are the most costly voodoo economic and political disaster imaginable:
The article added about Karzai: “ ‘He has developed a complete theory of American power,’ said an Afghan who attended the lunch and who spoke on the condition of anonymity for fear of retribution. ‘He believes that America is trying to dominate the region, and that he is the only one who can stand up to them.’ ”

That is what we’re getting for risking thousands of U.S. soldiers and having spent $200 billion already. This news is a flashing red light, warning that the Obama team is violating at least three cardinal rules of Middle East diplomacy.
(NY Times). The wars have taken out jobs, homes, health care coverage, and sent many Americans to the bankruptcy courts.

These wars are not done with us yet, because now about half of our commercial real estate mortgages are going down this year:
By the end of 2010, about half of all commercial real estate mortgages will be underwater, said Elizabeth Warren, chairperson of the TARP Congressional Oversight Panel, in a wide-ranging interview on Monday.
(CNBC). The wars won't be done with us until we are done with them.

Bring the economy back home, then watch the soldiers, the plunder barons, and the contractors hurriedly follow.



Monday, March 30, 2015

The IPCC Record on Global Warming Temperature Projections

Fig. 1 (click to enlarge)
There is an interesting disconnect among various papers published in the Proceedings of the National Academy of Sciences (PNAS).

One paper found that for each 1°C of global average temperature rise there would, as a direct result, be a 2.3 meter sea level rise (SLR), according to the Potsdam Institute.

I read another interesting piece at Skeptical Science, which pointed out, asserted, and argued that the IPCC had historically been quite accurate on its future global temperature rise projections:
As shown above, the IPCC has thus far done remarkably well at predicting future global surface warming. The same cannot be said for the climate contrarians who criticize the IPCC and mainstream climate science predictions.
(Skeptical Science, IPCC Estimates Ok). This raised a flag in my mind because many observers, including me, believe that their recent 3 ft. SLR projections in their 5th Assessment, covering from about now to 2100, are low.

The 1990 IPCC projections of temperature increases are shown in Fig. 1, indicating a temperature rise of about 3°C to about 6°C.

Fig. 2 (click to enlarge)
According to the paper in PNAS, that would equate to an SLR of 6.9 to 13.8 meters (about 22.6 to 45.3 feet), which is in accord with the projection of the software I wrote (see Fig. 2).
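
A quick back-of-the-envelope check of that conversion, as a sketch:

```cpp
#include <cstdio>

int main() {
    const double metersPerDegC = 2.3;      // Potsdam/PNAS figure
    const double feetPerMeter  = 3.28084;

    double lowC = 3.0, highC = 6.0;        // 1990 IPCC temperature-rise range
    double lowM  = lowC  * metersPerDegC;  // 6.9 m
    double highM = highC * metersPerDegC;  // 13.8 m

    std::printf("%.1f m to %.1f m (%.1f ft to %.1f ft)\n",
                lowM, highM, lowM * feetPerMeter, highM * feetPerMeter);
    // Prints: 6.9 m to 13.8 m (22.6 ft to 45.3 ft)
}
```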

The new features in the code are better control of zone switching (when inland melt zones kick in) and acquisition of the data from a mySQL database.

(The zone management algorithms will need some more tweaking, but all in all the C++ program is improving with time.)

The paper in PNAS, linking SLR to temperature rise, indicates as follows:
Greenhouse gases emitted today will cause sea level to rise for centuries to come. Each degree of global warming is likely to raise sea level by more than 2 meters in the future, a study now published in the Proceedings of the National Academy of Sciences shows. While thermal expansion of the ocean and melting mountain glaciers are the most important factors causing sea-level change today, the Greenland and Antarctic ice sheets will be the dominant contributors within the next two millennia, according to the findings. Half of that rise might come from ice-loss in Antarctica which is currently contributing less than 10 percent to global sea-level rise.
(Potsdam Institute, emphasis added; cf. PNAS). To be fair, the PNAS paper does not specify a clear time frame for when the SLR kicks in following the increase in global temperature (it is still a good fit with my model that generated the graph, because their high end is 45.3 ft. and my model shows ~21 ft. circa 2100, leaving the other half for the next century).

This disconnect, which I mentioned in the first sentence of this post, blurs the reason for scientific research, in the sense of scientific research being something that should be for the public good (The Common Good, 2, 3, 4, 5, 6, 7, 8, 9, 10).

I say that, because we need to know when our lack of knowledge, or our denial, puts the public in danger.

Are you ready for "The Harbor & Port Czar" (The Agnotology of Sea Level Rise Via Ice Melt)?

The next post in this series is here.