Monday, October 16, 2017

On The More Robust Sea Level Computation Techniques - 6

Fig. 1 SA 0-200 m
In this series I have provided some views on how best to calculate and compute thermal expansion and contraction with the in situ measurements available in the World Ocean Database (WOD).

One important element in that process is the use of the TEOS-10 toolkit provided by the scientific community (On The More Robust Sea Level Computation Techniques, 2, 3, 4, 5).

The importance of TEOS-10 should not be underestimated:
Fig. 2 SA 201-400 m
"On climatic time scales, melting ice caps and regional deviations of the hydrological cycle result in changes of seawater salinity, which in turn may modify the global circulation of the oceans and their ability to store heat and to buffer anthropogenically produced carbon dioxide." (Abstract, Metrologia 53 (2016), R1)

Fig. 3 SA 401-600 m
"Melting polar glaciers raise the sea level and influence the surface salinity distribution, and in turn may affect the large scale vertical and horizontal circulations in the oceans which continuously store, release or displace huge amounts of heat and dissolved gases." (ibid, R2)

Fig. 4 SA 601-800 m
Fig. 5 SA 801-1000 m
"It is evident from climatology and geosciences that atmospheric relative humidity, ocean salinity and seawater pH are key parameters for observing, modelling and analysing the increasing effects of global warming on ecosystems and society. However, despite their widespread use and relevance, the metrological underpinning of these parameters is inadequate, relies on century old provisional concepts, lacks traceability to the SI, or suffers from ambiguities and deficiencies of definitions, conventions and measurement techniques. The recent introduction of the international standard TEOS-10, the Thermodynamic Equation of Seawater 2010 (IOC et al 2010), has raised new awareness of these long standing and increasingly urgent problems, and has at the same time offered new perspectives for overcoming them." (ibid)
(IOP Science, PDF, emphasis added). The PDF is well worth downloading (no cost) and is filled with helpful reasoning as to why coherence is in the cards since the introduction of TEOS-10.

Fig. 6 SA 1001-3000 m
I have provided some source code as an example for using TEOS-10 (The Art of Making Thermal Expansion Graphs).

Thermal expansion and contraction are claimed to have been the major factor in sea level change for the past century or so, and to have been a larger factor in earlier decades than they are now (On Thermal Expansion & Thermal Contraction, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25).

Fig. 7 SA >3000 m
That assertion does not pass the smell test when, at the same time, it is also asserted that global warming has been increasing, with most of the increasing heat (~93%) ending up in the oceans.

The TEOS-10 and other formulas show that thermal expansion has decreased over the past ~50 years, even as more heat has been going into the ocean (On Thermal Expansion & Thermal Contraction - 25).

Fig. 8 SA All depths
Based on the in situ ocean temperature and salinity measurements processed by TEOS, ice sheet melt in Greenland and Antarctica has been going on in greater amounts and for a longer time than previously thought (Antarctica 2.0, 2).

Today's graphs focus on Absolute Salinity, an advanced concept useful not only for studying ocean water thermodynamics, but also for clarity and consistency (see the IOP Science link above).

Like temperature, Absolute Salinity (SA) varies with depth, and does so in a non-intuitive manner from time to time.

In these graphs (Fig. 1 through Fig. 7) I used the usual Dredd Blog depth levels to show each level compared to the mean of all the depths.

In the graph at Fig. 8 I placed all depths, along with the mean, on one graph.

It shows a relatively stable Absolute Salinity, with the largest departure from the pack being the shallowest level (Fig. 1).

That probably reflects ongoing surface water freshening due to ice sheet and glacial melt acceleration.

Remember what the experts wrote in the quotes above: "Melting polar glaciers raise the sea level and influence the surface salinity distribution, and in turn may affect the large scale vertical and horizontal circulations in the oceans" (see IOP Science link above).

The previous post in this series is here.

Saturday, October 14, 2017

On Thermal Expansion & Thermal Contraction - 25

Fig. 1a Conservative Temperature
Fig. 1b CT Average
In the Thermodynamic Equation of SeaWater 2010 (TEOS-10) scheme of things, Conservative Temperature (CT), Absolute Salinity (SA), and Pressure (P) are key players.

CT is derived from in situ (measured) temperature taken at a given location using the function gsw_ct_from_t.

SA is derived from in situ (measured) salinity taken at a given location using the function gsw_sa_from_sp.

P is derived from measured depth at a given latitude using the function gsw_p_from_z.

The graph at Fig. 1a shows CT at the typical Dredd Blog "seven depth levels" (depth is 'height' in TEOS parlance), with one of the graph lines being the mean of all seven.
Fig. 2a Absolute Salinity
Fig. 2b SA Average

The graph at Fig. 1b shows the average CT of those seven depth levels.

In like fashion, the graphs at Fig. 2a and Fig. 2b show the same for SA.
Fig. 3a Thermal Expansion / Contraction
Fig. 3b Thermal Expansion / Contraction Average

Since P is a function of depth, the pressure is derived from the depth of each CT and SA in situ measurement.

The big story in today's post is the calculation of changes in thermal expansion and contraction at each of those depth levels.

The graph at Fig. 3a shows the changes in thermosteric volume at the seven depth levels, including the mean of all seven combined.

The graph at Fig. 3b shows the average of those thermosteric volume changes.

The span of time involved begins in 1968 and ends in 2016, using all WOD measurements from all zones.

That span of time is due to my use of CTD and PFL datasets of the World Ocean Database (WOD).

I don't use the data acquired by older methods of gathering in situ measurements, preferring the more advanced gathering technology.

About half a century of data collection is recorded in the CTD and PFL datasets (about a billion temperature, salinity, and depth measurements).

As you can see in Fig. 3a and Fig. 3b, the results are not intuitive.

The results depart from the current thinking that thermal expansion has been the main source of sea level rise for a century.

The net result for the 1968-2016 span of time is a thermal contraction (not a thermal expansion) of -2171.6305236 km3.

That value equates to about a 6 mm [~1/4 inch] thermosteric volume decrease (-2171.6305236 ÷ 361.841 = −6.001615416 mm [-0.236284071 inch]); the 361.841 value is the number of cubic kilometers required to raise the global mean sea level by one millimeter.

Regular readers know that I have been experimenting to find a better way to calculate these values, and at this point I think using the seven ocean depth levels method is working well.

I have used the seven depth levels method with in situ temperature and salinity for quite a while.

The formulas for deriving the calculations were shown previously, along with the actual source code that does it (The Art of Making Thermal Expansion Graphs).

One difference in today's work is that the eleven temperatures used in that linked-to example ("5.5,6.5,7.5,8.0,8.5,8.0,7.5,7.0,6.5,6.0,5.5") are replaced with about a billion measured values that are used to produce the graphs shown in today's post.

What is most important in this calculation sequence (Fig. 3a) is that the individual depth levels' mass-volume must first be determined before calculating the thermal expansion coefficient.

That is because the seven depth levels don't all have the same mass-volume (e.g. the 0-200 m level differs from the 1001-3000 m level in that regard).

Notice in Fig. 3a that the depth level with the most thermal expansion is the >3000 m level (non-intuitive).

All the heat going into the oceans tends to move from warmer water to colder water over time.

As the heat works its way through the ocean basins, it causes expansion and contraction specific to each basin and depth level.

BTW, the eustatic (non-thermal) sea level change graphs, measured by the tide gauge stations featured in today's graphs, are posted here.

The previous post in this series is here.

The ice is melting ... (Dr. Eric Rignot) ...



Tuesday, October 10, 2017

The Art of Making Thermal Expansion Graphs

Gardens By The Bay
It has been a while since I published any C++ source code (Weekend Rebel Science Excursion - 49).

The last one, linked to above, had to do with some of the simple basics of sea level rise calculations.

As with ghost-water (NASA Busts The Ghost), calculating thermal expansion is another one of those "mysto" areas being focused on by scientists.

Today, I present source code written in the C++ programming language.

It is real code, but the data is limited to a ten-episode (e.g. yearly) range; nevertheless, it shows how the thermosteric volume of the ocean can change even if the mass volume does not.

Those thermosteric volume changes are caused by temperature change in the body of water.

Before we look at the code and what it produces, here are some comments about the subject:
"To be a bit more scientific about the matter before we end, let’s quote an interesting study by the Potsdam Institute for Climate Impact Research, published in 2013 in PNAS (called ‘The Multi-millenial sea level commitment of global warming’). This research group thinks of the final future sea level rise Greenland would contribute about 25 percent, Antarctica (combined) about 50 percent, smaller glaciers about 5 percent – and thermal expansion about 20 percent."
(Bits of Science). There is another paper which I have quoted that points out a widespread problem of improper calculations routinely done in this matter (On The More Robust Sea Level Computation Techniques - 5).

Some of the models are even older than some of the software advances in recent years (e.g. TEOS-10 toolkit), as regular readers know.

But there are other issues too:
"The thermal expansion of the ocean has been investigated by a spectrum of climate models of different complexity, ranging from zero-dimensional diffusion models ... via Earth System Models of Intermediate Complexity (EMIC) ... to comprehensive general circulation models ... Although uncertainty remains, especially owing to uncertainty in the ocean circulation and thereby the distribution of heat within the ocean, the physical processes are relatively well understood even if not fully represented in all models."
(The multimillennial sea-level commitment of global warming). Scientists are not unified as to whether or not thermal expansion is a or the major cause of sea level rise.

A recent paper points this out:
"On the basis of the GRACE data, we conclude that most of the change in ocean mass is caused by the melting of polar ice sheets and mountain glaciers. This contribution of ice melt is larger than previous estimates, but agrees with reports of accelerated ice melt in recent years."
(Nature). One limiting factor is calculating the thermal expansion coefficient of sea water (Thermal expansion co-efficient of sea water), which the TEOS-10 toolkit solves nicely with the gsw_alpha function.
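For reference, the coefficient returned by gsw_alpha is the thermal expansion coefficient with respect to Conservative Temperature, which the TEOS-10 manual defines as:

```latex
\alpha^{\Theta} \;=\; -\,\frac{1}{\rho}\left(\frac{\partial \rho}{\partial \Theta}\right)_{S_A,\,p}
\;=\; \frac{1}{v}\left(\frac{\partial v}{\partial \Theta}\right)_{S_A,\,p}
```

A positive α means the water parcel expands as it warms; near-freezing water has an α close to zero (and negative at low salinities), which is part of why added heat does not always translate into expansion.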


Anyway, here is the software code:

/** std C++ header files */
#include <iostream>
#include <fstream>
#include <iomanip>

/** TEOS header file */
#include <gswteos-10.h>

using namespace std;

/******************************************
V1 = V0(1 + β ΔT)
V1 means new volume
V0 means original volume
β means thermal expansion coefficient
ΔT means change in temperature (t1 - t0)
********************************************/
double thermalExpansion(double currentOceanVolume, /** V0 */
                                          double tec, /** β */
                                          double t1, /** ΔT half */
                                          double t0) /** ΔT half */
{
        double V0 = currentOceanVolume;
        double B = tec;
        double DT = t1 - t0;
        double V1 = V0*(1 + B * DT);

        return V1;
}

/*******************************
This sample program
calculates thermal
expansion & contraction
from a list of ocean water
temperatures which
represent in situ
measurements.

Those in situ measurements
are converted to TEOS
values via TEOS
(gsw_....) functions.
********************************/
int main()
{
        /****************************************
         volume acquired at: Live Science
        *****************************************/
        const double oceanVol2010 = 1332370930.2; /** cu km */

        /****************************************************************
        number of cubic kilometers per millimeter of sea level change
        *****************************************************************/
        const double cuKmPerMm = 361.841;

        /** maximum number of temperatures */
        const unsigned maxTemperatures = 11;

        /** in situ temperatures, in degrees C */
        const double temperatures[maxTemperatures] =
        {5.5,6.5,7.5,8.0,8.5,8.0,7.5,7.0,6.5,6.0,5.5};

        /** practical salinity */
        const double SP = 34.15;

        /** ocean depth of measurements, in meters */
        const double depth = 60.25;

        /***************************************
        latitude, longitude of measurements
        (off W. Coast of U.S.)
        ****************************************/
        const double lat = 35.33;
        const double lon = -150.21;

        /** variables for storing net steric balances */
        double netTETC = 0; /** net thermal expansion / contraction */
        double netSLC = 0; /** net sea level change, in millimeters */

        /** report text file */
        ofstream textFile("output-files/thermal_expansion.txt");

        textFile << setprecision(12)
        << "initial volume of ocean: "
        << oceanVol2010 << " (cu. km.)"
        << endl << endl;

        /********************************************************
         for loop: calculates thermal expansion/contraction
         from temperatures specified @ the temperature array
        **********************************************************/
        for (unsigned tPos = 1; tPos < maxTemperatures; tPos++)
        {
                /** select temperatures */
                double Tnow = temperatures[tPos];
                double Tbefore = temperatures[tPos-1];

                /** save temperature changes to text file */
                textFile << setprecision(2)
                << tPos << ")\t"
                << "Temp. before: " << Tbefore << endl
                << "\tTemp. now: " << Tnow << endl
                << "\tTemp. change: "
                << Tnow - Tbefore << " (deg C)"
                << endl;

                /*********************************
                 convert depth @ lat to TEOS Z
                **********************************/
                double Z = gsw_z_from_p(depth, lat);

                /** calculate TEOS pressure (dbars) */
                double P = gsw_p_from_z(Z, lat);

                /** calculate TEOS absolute salinity (g/kg) */
                double SA = gsw_sa_from_sp(SP,P,lon,lat);

                /** calculate TEOS conservative temperature */
                double CT = gsw_ct_from_t(SA,Tnow,P);

                /** calculate TEOS thermal expansion coefficient */
                double tec = gsw_alpha(SA, CT, P);

                /****************************
                 calculate thermosteric
                 volume using a constant
                 mass-volume value
                 (oceanVol2010)
                *****************************/
                double TEvolume = thermalExpansion(oceanVol2010, tec, Tnow, Tbefore);

                /** record the TE volume change (positive = expansion) */
                netTETC += TEvolume - oceanVol2010;

                /** record net sea level change (mm) */
                netSLC = netTETC / cuKmPerMm;

                /** save changes to report file */
                textFile << setprecision(12)
                << "\tthis volume change: " << TEvolume - oceanVol2010
                << " (cu. km)"
                << endl
                << "\tnet volume change: " << netTETC
                << " (cu. km)"
                << endl
                << "\tnet sea level change: " << netSLC
                << " (mm)"
                << endl << "\t---------------" << endl;
        } /** for loop */

        /** clean up */
        textFile.close();

        return 0;
}/** end of source code */


Here is what the above program prints out:

initial volume of ocean: 1332370930.2 (cu. km.)

1)  Temp. before: 5.5
     Temp. now: 6.5
     Temp. change: 1 (deg C)
     this volume change: 172385.525115 (cu. km)
     net volume change: 172385.525115 (cu. km)
     net sea level change: 476.412360995 (mm)
     ---------------
2)  Temp. before: 6.5
     Temp. now: 7.5
     Temp. change: 1 (deg C)
     this volume change: 186669.887116 (cu. km)
     net volume change: 359055.412231 (cu. km)
     net sea level change: 992.301624832 (mm)
     ---------------
3)  Temp. before: 7.5
     Temp. now: 8
     Temp. change: 0.5 (deg C)
     this volume change: 96841.4510376 (cu. km)
     net volume change: 455896.863269 (cu. km)
     net sea level change: 1259.93699793 (mm)
    ---------------
4)  Temp. before: 8
     Temp. now: 8.5
     Temp. change: 0.5 (deg C)
     this volume change: 100306.653495 (cu. km)
     net volume change: 556203.516764 (cu. km)
     net sea level change: 1537.1489598 (mm)
     ---------------
5)  Temp. before: 8.5
     Temp. now: 8
     Temp. change: -0.5 (deg C)
     this volume change: -96841.4510376 (cu. km)
     net volume change: 459362.065726 (cu. km)
     net sea level change: 1269.5135867 (mm)
     ---------------
6)  Temp. before: 8
     Temp. now: 7.5
     Temp. change: -0.5 (deg C)
     this volume change: -93334.9435582 (cu. km)
     net volume change: 366027.122168 (cu. km)
     net sea level change: 1011.56895478 (mm)
     ---------------
7)  Temp. before: 7.5
     Temp. now: 7
     Temp. change: -0.5 (deg C)
     this volume change: -89785.8304353 (cu. km)
     net volume change: 276241.291733 (cu. km)
     net sea level change: 763.432810911 (mm)
     ---------------
8)  Temp. before: 7
     Temp. now: 6.5
     Temp. change: -0.5 (deg C)
     this volume change: -86192.7625573 (cu. km)
     net volume change: 190048.529176 (cu. km)
     net sea level change: 525.226630414 (mm)
     --------------
9)  Temp. before: 6.5
     Temp. now: 6
     Temp. change: -0.5 (deg C)
     this volume change: -82554.3415222 (cu. km)
     net volume change: 107494.187653 (cu. km)
     net sea level change: 297.07575331 (mm)
     ---------------
10) Temp. before: 6
      Temp. now: 5.5
      Temp. change: -0.5 (deg C)
      this volume change: -78869.118834 (cu. km)
      net volume change: 28625.0688193 (cu. km)
      net sea level change: 79.1095227442 (mm)
      ---------------

NOTE: Use a g++-compatible compiler, and download the TEOS-10 toolkit.

More on the more difficult part of the process (handling the real WOD data) in a future post.

Saturday, October 7, 2017

On The More Robust Sea Level Computation Techniques - 5

Fig. 1 Determining Ocean Mass Change
I. The Critical Numbers

The most critical value to determine in thermosteric sea level rise (thermal expansion) calculations is the mass at the time the calculations commence.

That is not as difficult to determine as one might think.

In Fig. 1 the technique I use is illustrated in a manner that further illustrates what I wrote in a previous post in this series:
Fig. 2
The Golden 23 numbers indicate about 197 mm (about 7.8 inches) of sea level rise during the graphed time frame.

That comports well with the NASA estimation that "sea level has risen about eight inches since the beginning of the 20th century" (NASA, emphasis added).
(On The More Robust Sea Level Computation Techniques - 4). The official global mean sea level rise during the graphed span of time is about 8 inches, which is about 200 millimeters (Fig. 1 shows 197.136 mm).

II. First Numbers First

Fig. 3
The first thing we need to know is the volume of the ocean at a given year, which I glean from “The Volume of Earth's Ocean” (Oceanography, vol. 23, no. 2, 2010, pp. 112–114; PDF version).

That paper, published in 2010, calculated a new value of 1,332,370,930.2 km3.

From that value, by calculation we can derive:
1,332,370,930.2 ÷ 3.6822 = 361,841,000 (cu km per km of depth)
361,841,000 ÷ 1000 = 361,841 (cu km per m of depth)
361,841 ÷ 1000 = 361.841 (cu km per mm of depth)
The sea level rise, from 1880 - 2016, shown by the Golden 23 PSMSL tide gauge stations is 197.136 millimeters.

Fig. 4
Looking at my CSV file, the ocean mass in 2010 is 1,332,370,930.2 km3, the ocean mass in 2016 is 1,332,385,221.75 km3, and the ocean mass in 1880 is 1,332,313,889.78 km3.

So, 1,332,385,221.75 - 1,332,313,889.78 is 71,331.97 km3.

That is the mass increase in cubic kilometers (km3).

To double check the calculations, we multiply the sea level rise in millimeters by the cubic kilometers per millimeter of ocean: 197.136 × 361.841 = 71,331.887376 km3.

How close they are is determined by: 71,331.97 − 71,331.887376 = 0.082624 km3. Close enough?

If that is close enough, then the ocean mass in 1880 was 1,332,313,889.78 km3.

That is the mass value we must use to properly calculate thermal expansion:
Fig. 5
"A common practice in sea level research is to analyze separately the variability of the steric and mass components of sea level. However, there are conceptual and practical issues that have sometimes been misinterpreted, leading to erroneous and contradictory conclusions on regional sea level variability. The crucial point to be noted is that the steric component does not account for volume changes but does for volume changes per mass unit (i.e., density changes). This indicates that the steric component only represents actual volume changes when the mass of the considered water body remains constant."
(On The More Robust Sea Level Computation Techniques). If we do not use that value, and instead use the increasing mass / volume quantity, then the thermal expansion value becomes about thirteen times more than the entire ocean volume increase (the variable use totals a 910,589 km3 thermal expansion increase, when the maximum mass / volume increase is 71,331.97 km3).

The graph at Fig. 3 shows how far off the improper use of the formula makes the thermal expansion calculation (see Section V. here for how the TEOS toolkit is used to calculate that issue).

III. The Other Graphs

The graph at Fig. 4 shows a lot of activity in both thermal expansion and thermal contraction, as I discovered about a year and a half ago (On Thermal Expansion & Thermal Contraction).
Fig. 6

The net result, though, is thermal contraction.

The graph at Fig. 5 contains both the wrong use of the formulas and the right use of the formulas.

You can clearly see that the use of the increasing ocean mass/volume vs. the use of the initial ocean mass/volume radically distorts the result.

And the graph at Fig. 6 shows the proper ocean mass / volume change increase of 71,331.97 km3 associated with the 197.136 mm sea level rise.

BTW, I do not use the abstract mass / volume values in the calculations any longer.

For the water temperature and salinity history, I use calculations that compute those values more accurately, so the 1968 starting date for WOD water temperature and salinity measurements, and the 1993 satellite starting date, are no longer a hindrance due to lack of data.

IV. Conclusion

Solving the mass / volume quantities of the ocean has made the entire exercise stable now.

So, I want to get back to some of the models that I have not used in a long time, and plug the new TEOS toolkit, ocean mass / volume values, and other factors into projecting some sea level changes in various parts of the world.

The next post in this series is here, the previous post in this series is here.

Thursday, October 5, 2017

Gerrymandering - Geological Deceit? - 6

Oral Argument in Gill v Whitford
I began this series almost nine years ago (Gerrymandering - Geological Deceit?).

The U.S. Supreme Court heard oral argument on the issue this past Tuesday in the case Gill v. Whitford.

This case will have the impact of Citizens United v. FEC or worse.

If state legislatures are allowed to draw gerrymandered voting districts to favor their political party and ideals, democracy in the U.S. at both the state and federal level will suffer a catastrophe (Gerrymandering - Geological Deceit?, 2, 3, 4, 5).

Why not simply make one or more counties the voting districts, and allow whoever wins the most counties to be elected?

I say that because counties already hold elections and have the apparatus to do so, in both federal and state elections.

It would save the expense we now incur managing votes in far-flung and outlandishly shaped gerrymandered voting districts.

Meanwhile the counties already exist and have the money to conduct elections.

Anyway, the Supreme Court's decision will likely be made before the end of June 2018.

The previous post in this series is here.

Wednesday, October 4, 2017

Hold On To Something Alright

Fig. 1 History of Ocean Depth Assertions
I. Ignorance is Bliss?

The graph at Fig. 1 is a NOAA graph showing the mean (average) ocean depth in meters, as estimated or calculated for over a century.

The latest estimate, measurement, or calculation (3,682.2 m) that I use was done in 2010 (Science Daily).

Why have scientists been unsure of the depth?

According to a NASA scientist:
"... the oceans contain 99 percent of the living space on the planet ... We know more about the oceans than we used to. Yet we still know 'very little about the vast majority of the ocean', he says ... It's really a hard place to work. In many ways, it's easier to put a person into space than it is to send a person down to the bottom of the ocean. For one thing, the pressure exerted by the water above is enormous. It's the equivalent of one person trying to support 50 jumbo jets. It's also dark and cold. Unlike space where you can see forever, once you're down in the ocean you can't see anything because your light can't shine very far. It's a challenging place to study ... even with all the technology that we have today -- satellites, buoys, underwater vehicles and ship tracks -- we have better maps of the surface of Mars and the moon than we do the bottom of the ocean. We know very, very little about most of the ocean. This is especially true for the middle and deeper parts far away from the coasts"
(NASA). NOAA adds:
"The ocean is the lifeblood of Earth, covering more than 70 percent of the planet's surface, driving weather, regulating temperature, and ultimately supporting all living organisms. Throughout history, the ocean has been a vital source of sustenance, transport, commerce, growth, and inspiration.

Yet for all of our reliance on the ocean, 95 percent of this realm remains unexplored, unseen by human eyes."
(NOAA, emphasis added). Once upon a time, about nine years ago, when I pointed out that we treat the ocean like a black hole where we throw our nuclear, sewage, and other garbage and waste, the internut went a bit wacko on me (New Continent Found - Garbage Gyre II).

I have also attempted to point out that we don't tend to want to understand even the part of the ocean that we can see, or should see (NASA Busts The Ghost).

II. So How Do We Know So Much?

We don't have to see the bottom to see the surface, and that surface can tell us a lot about ourselves and what our civilization is doing to a planet we have yet to discover, it would seem (You Are Here).

There is a vast amount of information available (e.g. World Ocean Database, Permanent Service for Mean Sea Level).

That information is what I mull over, study, and use for the benefit of Dredd Blog readers.

We can know enough to save our civilization at some point, but only if enough of our fellow sojourners get it and tell it like it is.

Frankly, that is not the case (Normalization Of The Abnormal, Civilization Is Now On Suicide Watch, 2, 3, 4, 5, 6, 7, 8).

III. Conclusion

There is no glory in all of this arduous work done under a 'pen name' (6 Reasons for Authors to Use a Pen Name).

But there is the satisfaction of discovering and then sharing some discoveries with others (like the scientists who work at it day in and day out).

I am not tired of it yet, because it is something I believe.

Something I can hold on to.

And it makes me feel alright.

Lyrics here (full concert here).



Tuesday, October 3, 2017

Normalization Of The Abnormal

McTell it like it is, Snooper Man
The news media habit of normalizing the abnormal went awry when candidate Trump bloviated on and on about what was considered to be abnormal politics.

Like soon-to-be, it is said, "Senator Judge Roy L. Bean" (out of Alabama).

But I digress.

In their media minds, Candidate Trump could never be elected to the most powerful office that an individual could hold in the United States, so why bother normalizing his abnormality?

It was a waste of time; besides, the media world already had lots of work to do with all the paid expert-bloviators it took to normalize everlasting war without even mentioning it (Secret Afghanistan Underground - 3).

Or, I guess I should say mentioning it about as much as it mentions climate change:
"In 2016, evening newscasts and Sunday shows on ABC, CBS, and NBC, as well as Fox Broadcast Co.'s Fox News Sunday, collectively decreased their total coverage of climate change by 66 percent compared to 2015, even though there were a host of important climate-related stories, including the announcement of 2015 as the hottest year on record, the signing of the Paris climate agreement, and numerous climate-related extreme weather events. There were also two presidential candidates to cover, and they held diametrically opposed positions on the Clean Power Plan, the Paris climate agreement, and even on whether climate change is a real, human-caused phenomenon. Apart from PBS, the networks also failed to devote significant coverage to climate-related policies, but they still found the time to uncritically air climate denial -- the majority of which came from now-President Donald Trump and his team."
(How Broadcast Networks Covered Climate Change In 2016, emphasis added). They reduced it because, you know, both are life and death matters ("One death is a tragedy; one million is a statistic.").

If it bleeds it leads, unless it can't be explained by those expert-bloviators.

Like mass-murders.

Let's review one of many exceptionally famous cases ("we're number one" in mass murders) for some key points:
"... [Whitman] killed a receptionist with the butt of his rifle. Two families of tourists came up the stairwell; he shot at them at point-blank range. Then he began to fire indiscriminately from the deck at people below. The first woman he shot was pregnant. As her boyfriend knelt to help her, Whitman shot him as well. He shot pedestrians in the street and an ambulance driver who came to rescue them.

The evening before, Whitman had sat at his typewriter and composed a suicide note:
I don’t really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I can’t recall when it started) I have been a victim of many unusual and irrational thoughts.
By the time the police shot him dead, Whitman had killed 13 people and wounded 32 more. The story of his rampage dominated national headlines the next day. And when police went to investigate his home for clues, the story became even stranger: in the early hours of the morning on the day of the shooting, he had murdered his mother and stabbed his wife to death in her sleep.
It was after much thought that I decided to kill my wife, Kathy, tonight … I love her dearly, and she has been as fine a wife to me as any man could ever hope to have. I cannot rationa[l]ly pinpoint any specific reason for doing this …
Along with the shock of the murders lay another, more hidden, surprise: the juxtaposition of his aberrant actions with his unremarkable personal life. Whitman was an Eagle Scout and a former marine, studied architectural engineering at the University of Texas, and briefly worked as a bank teller and volunteered as a scoutmaster for Austin’s Boy Scout Troop 5. As a child, he’d scored 138 on the Stanford-Binet IQ test, placing in the 99th percentile. So after his shooting spree from the University of Texas Tower, everyone wanted answers.

For that matter, so did Whitman. He requested in his suicide note that an autopsy be performed to determine if something had changed in his brain — because he suspected it had.
I talked with a Doctor once for about two hours and tried to convey to him my fears that I felt [overcome by] overwhelming violent impulses. After one session I never saw the Doctor again, and since then I have been fighting my mental turmoil alone, and seemingly to no avail.
Whitman’s body was taken to the morgue, his skull was put under the bone saw, and the medical examiner lifted the brain from its vault. He discovered that Whitman’s brain harbored a tumor the diameter of a nickel. This tumor, called a glioblastoma, had blossomed from beneath a structure called the thalamus, impinged on the hypothalamus, and compressed a third region called the amygdala."
(Atlantic Monthly, emphasis added). The "well educated Eagle Scout Marine" dood became a domestic enemy, and turned on his loved ones.

The Blind Willie McTell News goes the most bananas when they have no paid expert-bloviators to explain what happened in "Las Vegas" (Choose Your Trances Carefully - 6).

Whatever happens in "the Las Vegas trance" stays in "the Las Vegas trance" it would seem.

They do the 24/7 trance and begin showing the same videos over and over 24/7 ... until ... well ... they don't anymore.

Then it never happened it would seem.

They may soon devolve into the Normalize The Abnormal trance to try to convince you, and themselves, that this is the new normal.

They may also resort to pabulum indicating that it is the fire sale price we pay for MOMCOM freedom (MOMCOM - A Mean Welfare Queen).

Historically, they have a record of accomplishing this (Blind Willie McTell News, 2, 3, 4, 5, 6).

But in the end, that accomplishment can only be brought about by their incessant appeal to the base (the exceptional despotic minority):
"In the Study Toynbee examined the rise and fall of 26 civilizations in the course of human history, and he concluded that they rose by responding successfully to challenges under the leadership of creative minorities composed of elite leaders. Civilizations declined when their leaders stopped responding creatively, and the civilizations then sank owing to the sins of nationalism, militarism, and the tyranny of a despotic minority. Unlike Spengler in his The Decline of the West, Toynbee did not regard the death of a civilization as inevitable, for it may or may not continue to respond to successive challenges. Unlike Karl Marx, he saw history as shaped by spiritual, not economic forces."
(Stockholm Syndrome: The Declaration of Intellectual Dependence, quoting Encyclopedia Britannica). "Go with the flow baby" is the new old Highway 61 (War is the Highway 61 of the 1%).

After all, the Oscar ... oops ... I mean the Pulitzer is in play.





Sunday, October 1, 2017

NASA Busts The Ghost

Fig. 1 NASA detective work sees ghost water
It is no secret to regular readers that ghost-water has been around for a couple of centuries (The Ghost-Water Constant, 2, 3, 4, 5, 6, 7, 8, 9; Weekend Rebel Science Excursion - 54).

But, the very abundant ghost-water can only be seen by those without a pre-Newton view of science (The Gravity of Sea Level Change, 2, 3, 4); and since NASA is one of the branches of government that understands the reality of the existence of Newton, it was inevitable that they were going to get into sleuth mode sooner or later and do some ghost-busting.

Once they put on their Sherlock Holmes hat, it was also inevitable that they would detect the fingerprints of the once "very well hidden-in-plain-sight" ghost-water:
"Researchers from NASA’s Jet Propulsion Laboratory in Pasadena, California, and the University of California, Irvine, have reported the first detection of sea level “fingerprints” in ocean observations: detectable patterns of sea level variability around the world resulting from changes in water storage on Earth’s continents and in the mass of ice sheets. The results will give scientists confidence they can use these data to determine how much the sea level will rise at any point on the global ocean as a result of glacier ice melt.

As ice sheets and glaciers undergo climate-related melting, they alter Earth’s gravity field, resulting in sea level changes that aren’t uniform around the globe. For example, when a glacier loses ice mass, its gravitational attraction is reduced. Ocean waters nearby move away, causing sea level to rise faster far away from the glacier. The resulting pattern of sea level change is known as a sea level fingerprint. Certain regions, particularly in Earth’s middle and low latitudes, are hit harder, and Greenland and Antarctica contribute differently to the process. For instance, sea level rise in California and Florida generated by the melting of the Antarctic ice sheet is up to 52 percent greater than its average effect on the rest of the world.

To calculate sea level fingerprints associated with the loss of ice from glaciers and ice sheets and from changes in land water storage, the team used gravity data collected by the twin satellites of the U.S./German Gravity Recovery and Climate Experiment (GRACE) between April 2002 and October 2014. During that time, the loss of mass from land ice and from changes in land water storage increased global average sea level by about 0.07 inch (1.8 millimeters) per year, with 43 percent of the increased water mass coming from Greenland, 16 percent from Antarctica and 30 percent from mountain glaciers. The scientists then verified their calculations of sea level fingerprints using readings of ocean-bottom pressure from stations in the tropics.

“Scientists have a solid understanding of the physics of sea level fingerprints, but we’ve never had a direct detection of the phenomenon until now,” said co-author Isabella Velicogna, UCI professor of Earth system science and JPL research scientist.

“It was very exciting to observe the sea level fingerprints in the tropics, far from the glaciers and ice sheets,” said lead author Chia-Wei Hsu, a graduate student researcher at UCI.

The findings are published [PDF] today in the journal Geophysical Research Letters. The research project was supported by UCI and NASA’s Earth Science Division."
(NASA/UCI Find Evidence of Sea Level 'Fingerprints', emphasis added). The link discusses a paper in Geophysical Research Letters.
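As a back-of-the-envelope check, the GRACE mass-budget percentages quoted above can be split out and converted to water volumes, using 361.841 cu km of ocean water per millimeter of global mean sea level. This is a rough sketch only; treating the ~11 percent residual as land water storage is my inference, not something the quote states explicitly:

```python
# Split the ~1.8 mm/yr of mass-driven sea level rise (GRACE, 2002-2014)
# by source, and convert each share to a yearly water volume.
TOTAL_MM_PER_YR = 1.8      # quoted global average rise from mass addition
KM3_PER_MM = 361.841       # cu km of water per mm of global mean sea level

shares = {
    "Greenland": 0.43,
    "Antarctica": 0.16,
    "Mountain glaciers": 0.30,
    # The remainder is presumably land water storage (my inference):
    "Residual": 1.0 - 0.43 - 0.16 - 0.30,
}

for source, frac in shares.items():
    mm = frac * TOTAL_MM_PER_YR
    print(f"{source}: {mm:.2f} mm/yr ~ {mm * KM3_PER_MM:.0f} cu km/yr")
```

By this reckoning Greenland alone supplies roughly 280 cu km of meltwater per year to the oceans.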

The published paper is unambiguous and straightforward:
"As ice sheets and glaciers and ice caps melt into the ocean, the pattern of regional sea level rise is nonuniform and tracked via the sea level fingerprints. Here we provide the first observational evidence that the [fingerprints] calculated from satellite observations match the record from ocean stations that measure mass changes over time, i.e., the Gravity Recovery and Climate Experiment-based satellite technique correctly captures the distribution of freshwater fluxes to the ocean and the signal is large enough to be detected by ocean in situ observations in the tropics. The results are critical to improve regional projections of sea level rise and its impact on coastlines and human systems."
(Geophysical Research Letters). They cite two papers by Professor Mitrovica (2001 and 2009), but still have not found Woodward (1888).

The "ocean-bottom pressure" detection technique reminded me of another Dredd Blog series (Is A New Age Of Pressure Upon Us?, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13).

That series discusses sea level change as a source of contrasting torque and pressure upon the Earth's crust, which can and does lead to earthquakes and volcanic activity.

We can now hope that they will un-detect thermal expansion as "the major cause" of sea level change, because they haven't shaken that error yet:
"We calculate changes in steric [thermal expansion] height at each grid cell by vertically integrating the density profile. The vertical profiles provide data up to 2000 m depth with a vertical resolution of 10 dbar when close to the sea surface and 100 dbar close to 2000 m depth."
(ibid, Geophysical Research Letters). They can start with ceasing and desisting from scratching the surface to instead "go deep" using robust detection techniques (On The More Robust Sea Level Computation Techniques, 2, 3, 4).
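For readers who want to see what "vertically integrating the density profile" amounts to, here is a minimal thermosteric sketch. The constant expansion coefficient and the warming profile below are illustrative assumptions only; a robust computation would use the TEOS-10 toolkit (gsw) to derive the expansion coefficient from in situ Absolute Salinity, Conservative Temperature, and pressure at each depth:

```python
# Minimal thermosteric height sketch: integrate (expansion coefficient x
# temperature change) over depth via the trapezoidal rule.
ALPHA = 2.1e-4  # per deg C; a representative (assumed constant) value

def thermosteric_rise_m(depths_m, delta_t_c, alpha=ALPHA):
    """Trapezoidal integration of alpha * dT over depth -> height change (m)."""
    total = 0.0
    for (z0, t0), (z1, t1) in zip(zip(depths_m, delta_t_c),
                                  zip(depths_m[1:], delta_t_c[1:])):
        total += alpha * 0.5 * (t0 + t1) * (z1 - z0)
    return total

# Hypothetical warming profile: 0.5 C at the surface tapering to 0 at 2000 m.
depths = [0, 200, 400, 600, 800, 1000, 2000]
dT = [0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.0]
print(f"{thermosteric_rise_m(depths, dT) * 1000:.1f} mm")
```

Note how the deep layers contribute even when their warming is small, which is exactly why "scratching the surface" understates or misstates the steric signal.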

The thermodynamics are seemingly still hidden in plain sight too (On Thermal Expansion & Thermal Contraction, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24).

I suggest they investigate the TEOS-10 toolkit (TEOS-10 Flyer, PDF).

Professor Jerry X. Mitrovica on the gravity / axis bulge SLR / SLF issues we don't hear about often enough:



Saturday, September 30, 2017

When Will The Arctic Sea-ice Be Gone? - 3

Fig. 1 September 28, 2017
Previous posts in this series were presented at a time when it looked as if this year, 2017, was going to be the year of no summer sea ice in the Arctic.

The software module I developed to project that event indicated, based on its numbers and logic, that it would take place circa 2024-2025 (When Will The Arctic Sea-ice Be Gone?, 2).

Fig. 2 No Summer Sea-ice (red line)

That is the year of "no summer sea ice"; the year of "no winter or summer sea ice" was of course projected to take place about a decade later, circa 2038 (Fig. 3).
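The projection module itself is not shown here, but the basic idea of extrapolating a declining extent trend to zero can be sketched as follows. The extents are made-up round numbers, not NSIDC data, and a straight line understates the non-linearity discussed below:

```python
# Fit a line to (year, extent) pairs with ordinary least squares, then
# solve extent(year) = 0 for the hypothetical "ice-free" year.
def zero_crossing_year(years, extents_mkm2):
    n = len(years)
    my = sum(years) / n
    me = sum(extents_mkm2) / n
    slope = (sum((y - my) * (e - me) for y, e in zip(years, extents_mkm2))
             / sum((y - my) ** 2 for y in years))
    intercept = me - slope * my
    return -intercept / slope  # year where the fitted line reaches zero

# Hypothetical September extents (million sq km), declining 0.2 per year:
years = [2000, 2005, 2010, 2015]
extents = [6.0, 5.0, 4.0, 3.0]
print(f"Ice-free (hypothetical data): {zero_crossing_year(years, extents):.0f}")
```

A real projection has to contend with acceleration and deceleration from year to year, which is why the module's trajectory is non-linear rather than a single fitted line.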

The graph at Fig. 1 is the current NSIDC graph for summer sea ice extent.

The graph at Fig. 2 has a red line added to the Fig. 1 graph to show the distance yet to go before no summer sea ice takes place in the Arctic.
Fig. 3 Current Dredd Blog Projection
The year 2012 continues to hold the record low for the month of September, however, the years 2016 and 2017 hold the record in other months of the year, having taken those monthly record lows away from the year 2012.

It is important to remember that these trajectories for sea ice extent and sea ice volume, as well as for other global warming phenomena, are non-linear (that much is obvious).

It works both ways in the sense that we have years with acceleration and years of deceleration.

However, the trend has not changed and will continue its inevitable course.

That much is also obvious.

The previous post in this series is here.

Thursday, September 28, 2017

On The More Robust Sea Level Computation Techniques - 4

Fig. 1a
I. Satellite Data Added
Fig. 1b

I have injected the NASA Satellite Records into the Test Case modules described earlier in this series (On The More Robust Sea Level Computation Techniques, 2, 3).

How this satellite usage works in general has been explained previously (Syncronizing Satellite Data With Tide Gauge Data).
Fig. 1c

The graphs from the Test Case module (Fig. 1a thru Fig. 1c) are like the graphs in other modules.

II. Test Case Scenario

The three Test Case scenarios ('Golden 23', 'Church 491', and 'All Stations') were also discussed in previous posts of this series.

Basically, the three test cases are used to emphasize how important it is to select balanced world ocean areas when they are used as representative of the oceans as a whole.

The satellite record (conformed to RLR concepts so as to match the PSMSL values) matches best with the Golden 23.

The graphs show that the Golden 23 evinces about twice the amount of sea level rise that the other two cases do.

How does "twice as much" sit with the official record?

III. By The Numbers

Fig. 2a
Fig. 2b
Fig. 2c
Some numbers from the CSV files, from which the graphs @ Fig. 2a thru Fig. 2c were constructed, will also help us to discern that reality:
all stations:
103.339 mm * 361.841 = 37,392 (37392.287099) cu km
103.339 mm ÷ 304.8 mm = .34 (0.339038714) ft.
total: 4.07 (4.068464567) in.

Church 491:
93.7412 mm * 361.841 = 33,919 (33919.4095492) cu km
93.7412 mm ÷ 304.8 mm = 0.31 (0.307549869) ft.
total: 3.7 (3.690598425) in.

Golden 23:
197.136 mm * 361.841 = 71,332 (71331.887376) cu km
197.136 mm ÷ 304.8 mm = 0.65 (0.646771654) ft.
total: 7.8 (7.761259843) in.

The Golden 23 numbers indicate about 197 mm (about 7.8 inches) of sea level rise during the graphed time frame.

That comports well with the NASA estimation that "sea level has risen about eight inches since the beginning of the 20th century" (NASA, emphasis added).

The other two test case scenarios indicate about half of that Golden 23 amount, so we can discount them because the test case indicates that they are not produced from a balanced group of tide gauge, weather, or WOD stations and zones.

To produce those numbers, I use "361.841" as the cubic kilometers of ocean water required to raise the global mean average of sea level one millimeter; I use "304.8" as the number of millimeters per foot.

The Golden 23 sea level, then, indicates that an increase of 71,332 cu. km. of ocean water was added by Cryosphere melting during the graph time frame.

Again, the other two indicate about half of that amount.
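The conversions used in Section III can be reproduced with a few lines, using the constants stated above (361.841 cu km per mm of global mean sea level, and 304.8 mm per foot):

```python
# Reproduce the Section III conversions: sea level rise in mm -> added
# water volume (cu km), feet, and inches.
KM3_PER_MM = 361.841   # cu km of ocean water per mm of global mean sea level
MM_PER_FOOT = 304.8    # millimeters per foot

def slr_summary(rise_mm):
    """Return (volume in cu km, rise in feet, rise in inches)."""
    volume_km3 = rise_mm * KM3_PER_MM
    feet = rise_mm / MM_PER_FOOT
    return volume_km3, feet, feet * 12.0

for name, mm in (("All stations", 103.339),
                 ("Church 491", 93.7412),
                 ("Golden 23", 197.136)):
    v, ft, inches = slr_summary(mm)
    print(f"{name}: {v:,.0f} cu km, {ft:.2f} ft, {inches:.2f} in")
```

Running this reproduces the CSV-derived numbers above, including the roughly 71,332 cu km attributed to the Golden 23 time frame.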

IV. Back To Ocean Water Temperatures

Fig. 3a
Fig. 3b
Fig. 3c

This is the toughest case to solve.

For starters, it is not axiomatic that the zones where tide gauge stations have been installed and working (for centuries in some cases, decades in others) will show the same water warming as sea level change.

There is no direct link at the latitude / longitude location between warming waters and sea level rise or fall.

That is because a lot of the water comes from far away when released by ice sheets as they lose their gravity (The Gravity of Sea Level Change, 2, 3, 4).

Likewise, the melt water also flows toward the equator (ibid).

Furthermore, the World Ocean Database is composed of uneven depth measurements as well as uneven latitude / longitude measurements.

The graphs at Fig. 3a thru Fig. 3c show that the in situ measurements (actual measurements at actual locations) vary based on measurement location, and are at odds with the expected Test Case temperatures.

V. UPDATED GRAPHS

After doing some more thinking about this, I am providing graphs Fig. 4a thru Fig. 4c as well as Fig. 5.
Fig. 4a
Fig. 4b
Fig. 4c

The problem was essentially comparing apples to oranges.

The long span-of-time "expected temperature" as well as the long span-of-time sea level graph lines should not be combined with shorter time frames because it squishes both into a less revealing contortion.

The graphs at Fig. 4a thru Fig. 4c show expected sea level change and in situ (measured) sea level change in the modern satellite measurements era.

The graph at Fig. 5 shows expected ocean temperatures compared with in situ (measured) ocean temperatures in the modern satellite measurements era.

The graph at Fig. 5 is also constructed using ALL temperature measurements in the WOD database in all WOD zones. 

The previous graphs (Fig. 3a thru Fig. 3c) only included WOD zones that had relevant tide gauge stations in them (Golden 23, all tide gauge stations, and the Church 491 tide gauge stations).

They were also compared to ocean temperatures from only those limited numbers of WOD zones.

Fig. 5
The solution, then, was to compare the expected ocean temperatures to all WOD zones whether they had tide gauge stations in them or not.

The solution also included using all WOD measurements in those zones.

The bottom line is that all of the examples (Golden 23 stations, All stations, and Church's 491 stations) are reasonably close and accurate in terms of the modern satellite measurements which only span the years 1993-2016.

That is fine, because as we go back in time the measurements of ocean temperatures are much more likely to be non-existent, unlike PSMSL tide gauge station records.

VI. Conclusion

[I originally said] "I am going to do some examination of my portion of the WOD database (~1 billion measurements) to see if there is a collection of WOD zones that are closer to the Test Case scenario (even if they are outside of the Golden 23)."

[As the updated graphs and discussion show, I did what I said I would, and we now have a better, working understanding of how to do this.]

[I also originally said] "Additionally, I intend to explore whether I should add some other datasets to the mix (I currently use only the CTD and PFL datasets)."

[Again, that will not be necessary.]

[Finally, I originally said] "I will inform readers of that progress in future posts."

[That will not be necessary now, since I did it in this post in the form of an update.]

The next post in this series is here, the previous post in this series is here.