Complex Systems
The complex becomes 'mysterious' when the complexity is "on steroids" to the point that it moves out of cognitive range.
Simulating that complexity exactly requires an understanding of the unknown, which is a contradiction in terms, so the solution chosen is sometimes to guess, or to ignore the unknown altogether.
For example, some earlier IPCC reports did not see polar ice cap melt as calculable, so that realm was excluded from the picture.
Thus, model projections consistently underestimated the real scenario, because the polar ice caps were destined to eventually become the main player in sea level rise (SLR).
The polar caps are now given a place at the table, because they are the prime source of future SLR.
Knowing that does not, in and of itself, remove all of the mystery, because SLR is a phenomenon of a complex system.
There are linear, non-linear, exponential, and non-exponential factors, even in well-known dynamics such as an 'acceleration' (The Question Is: How Much Acceleration Is Involved In Sea Level Rise?) or a 'surge' (Will This Float Your Boat - 5, What Do You Mean - World Civilization? - 2).
There are various types and causes of both acceleration and surges.
Add to that the occasional misidentification, whereby an acceleration, a surge, or both can be confused with "feedback loops."
That said, it is also true that surges and/or accelerations are sometimes in fact due to feedback loops, while at other times they are separate natural surges and/or natural accelerations in a complex non-feedback dynamic, and the two cases are mistaken for one another:
"We may be witnessing the start of the long-awaited jump in global temperatures. There is “a vast and growing body of research,” as Climate Central explained in February. “Humanity is about to experience a historically unprecedented spike in temperatures.” - Think ProgressAnd of course as I said above, on top of acceleration and surge, there are separate individual feedback loops:
"Humanity is about to experience a historically unprecedented spike in temperatures.
That’s the ominous conclusion of a vast and growing body of research that links sweeping Pacific Ocean cycles with rates of warming at the planet’s surface — warming rates that could affect how communities and nations respond to threats posed by climate change." - Climate Central
"Now, scientists believe they’ve untangled the relationship. In a paper published Monday in Nature Climate Change, researchers from the University of Exeter claim to have found direct evidence that as global temperatures rise, so does the atmospheric concentration of greenhouse gases, creating a positive feedback that in turn warms the Earth even more — basically, global warming creates more global warming.(Study: Direct Evidence That Global Warming Causes More Global Warming). The designer or architect of software that has these issues within its context can try to build that logic into the code itself, or can choose instead to have these scenarios and dynamics described by data.
“We discovered that not only does thickening the blanket of heat-trapping gases around our planet cause it to get warmer, but also, crucially, when it gets warmer this increases thickens the blanket of heat-trapping gases,” Tim Lenton, the paper’s author, told ThinkProgress, “so we have a process called a ‘positive feedback’ that amplifies changes in the Earth’s temperature.”
Data which the software analyzes, derives values from, then injects those derived results into the other ongoing streams of computations and calculations.
IMO, the highly data-driven model is the preferred method, because as the situation becomes clearer and the data becomes better, the simulations will become more accurate without any change to the code.
The program logic (e.g. x + y - z) can be used in thousands of scenarios without changing the code, merely by updating the data values for x, y, and z.
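A minimal sketch of that separation of logic from data (the names and values are hypothetical placeholders, not the actual code behind these posts):

```python
# Hypothetical sketch: the same program logic reused across scenarios
# by swapping data values rather than rewriting code.

def projected_rise(x, y, z):
    """Fixed program logic: it never changes from scenario to scenario."""
    return x + y - z

# Only the data changes between scenarios (values are placeholders).
scenarios = {
    "conservative": {"x": 1.0, "y": 0.2, "z": 0.1},
    "accelerated":  {"x": 1.0, "y": 0.6, "z": 0.1},
}

for name, data in scenarios.items():
    print(name, projected_rise(**data))
```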
So, I have built surge, delay, and acceleration logic into both the data-entry exercises and the calculation logic.
Processing that data, then using it in concert with the less mysterious and more certain computations, becomes routine and less error-prone in the long run.
One can put expected surge, delay, and acceleration values into the database and they will be processed accordingly, or they can be left out if a user chooses.
My practice is to develop multiple data scenarios, save them, and modify them as needed when research clarifies this or that issue, but always have them available for use.
By contrast, constantly rewriting the code is not preferred when compared to merely adjusting the data.
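One way to keep multiple saved scenarios on hand, with surge, delay, and acceleration values that can be included or left out, is a plain data record per scenario. The field names and layout below are assumptions for illustration, not the actual schema of the SLR software:

```python
import json

# Hypothetical scenario record; surge/delay/acceleration may be omitted or null.
scenario_json = """
{
  "name": "mid-range",
  "base_melt_rate": 0.8,
  "acceleration": 0.05,
  "surge": null,
  "delay_years": 2
}
"""

scenario = json.loads(scenario_json)

# Missing or null entries fall back to "no effect" defaults,
# so a user can leave surge/delay/acceleration out if they choose.
acceleration = scenario.get("acceleration") or 0.0
surge        = scenario.get("surge") or 0.0
delay_years  = scenario.get("delay_years") or 0

print(scenario["name"], acceleration, surge, delay_years)
```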
So, the software I wrote to project future SLR has classes with data members for standard melting, acceleration of melting, temporary surges of melting, delays in melting, and various other data realms.
Data is used to describe the zones of melt (size, year melt begins, maximum melt, etc.).
Those are data values which the user can enter, modify, and control at will.
Thus, the user is in the driver's seat to the extent that the data determines the results.
The purpose which the architect or designer fulfills, then, is first to grasp the principles involved in the solution ("abstractions" in object-oriented jargon), then to create "classes" (concepts of operations) which become "objects" (converting abstractions into "concrete" operations) when "instantiated" at "runtime."
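A stripped-down illustration of how such abstractions become classes whose instances are driven entirely by data entered for each melt zone. The class, field names, and values are assumptions sketched for this post, not the actual classes in the SLR software:

```python
from dataclasses import dataclass

@dataclass
class MeltZone:
    """Hypothetical melt-zone abstraction; every field is user-supplied data."""
    name: str
    size_gt: float            # ice available to melt (placeholder unit: gigatonnes)
    melt_begin_year: int      # year melt begins in this zone
    base_rate: float          # standard melt fraction per year
    acceleration: float = 0.0 # optional per-year growth in the melt rate
    surge: float = 0.0        # optional temporary surge added to the rate
    delay_years: int = 0      # optional delay before melt starts

    def rate_for(self, year: int) -> float:
        """Melt fraction applied in a given year, derived from the data above."""
        start = self.melt_begin_year + self.delay_years
        if year < start:
            return 0.0
        elapsed = year - start
        return self.base_rate * (1.0 + self.acceleration) ** elapsed + self.surge

# "Instantiated" at "runtime" from whatever data the user entered.
zone = MeltZone("Example zone", size_gt=500.0,
                melt_begin_year=2020, base_rate=0.001, acceleration=0.05)
print(zone.rate_for(2030))
```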
This article from BBC (Greenland ice sheet losses double) gives some idea of what a developer of such software faces.
This article from NOAA (Arctic Report Card) gives some idea of how breaking ice sheets into zones still requires more than that alone; that is, more than just a "melt beginning year," an acceleration of melt, a delay of melt, temporary surges in melt rate, and the like.
Comprehensive consideration of the phenomenon requires a concept of a dynamic rate-of-melt, one which changes within a given zone and, from time to time, in the other zones too.
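One way to describe that dynamic, zone-by-zone rate of melt as data is a year-keyed override table that supersedes the computed rate when a new report changes the picture for one zone but not the others. The names and values below are illustrative assumptions only:

```python
# Hypothetical per-zone rate overrides, keyed by year, layered on top of
# whatever standard/accelerated rate the scenario data already specifies.
rate_overrides = {
    "Greenland":  {2016: 0.0},      # e.g. a year of "no mass loss" in one zone
    "Antarctica": {2016: 0.0025},   # while another zone accelerates
}

def effective_rate(zone: str, year: int, parametric_rate: float) -> float:
    """Use the override for this zone/year if one exists, else the computed rate."""
    return rate_overrides.get(zone, {}).get(year, parametric_rate)

print(effective_rate("Greenland", 2016, 0.001))  # 0.0   (override applies)
print(effective_rate("Greenland", 2017, 0.001))  # 0.001 (falls back to computed rate)
```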
For example, did you notice that, in the Arctic Report Card piece mentioned above, the ice loss as measured by the GRACE satellite (which measures mass, not volume) dropped to "no mass loss" (Fig. 3.3 in that post)?
Using the same formula that derived 14.87% ( L = (f / s)^(1/y) - 1 ), where y = 5 years (see Will This Float Your Boat - 5), that 14.87% drops to 12.25% when y = 6 years instead of y = 5 years.
That is a drop in acceleration of 17.6% in one year (assuming Cryosat-2 comes up with the same result in its upcoming report).
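The arithmetic behind those figures can be checked in a few lines, assuming (as the "losses double" report suggests) that f / s in the formula is a ratio of final to starting loss of about 2:

```python
# Checking the figures in the text with the formula L = (f / s) ** (1 / y) - 1,
# where f / s is the ratio of final to starting ice loss (assumed here to be a
# doubling, per the report linked above) and y is the number of years.

ratio = 2.0                  # assumed final loss / starting loss
L5 = ratio ** (1 / 5) - 1    # ~0.1487 -> 14.87% per year over 5 years
L6 = ratio ** (1 / 6) - 1    # ~0.1225 -> 12.25% per year over 6 years
drop = (L5 - L6) / L5        # ~0.176  -> a 17.6% drop in the derived acceleration

print(f"{L5:.4f} {L6:.4f} {drop:.3f}")
```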
Meanwhile, Antarctic volume loss at Totten Glacier accelerated (which shows the importance of trends, means, and averages in long-range projections).
The next post in this series is here; the previous post in this series is here.