Monday, August 19, 2024

The Cuckoo's Egg Hatched Again - 4

Software Bug?

This series is about a little-known parasitic bird that serves as a metaphor for intruders of the hacker type:

"The cuckoo lays her eggs in other birds' nests. She is a nesting parasite: some other bird will raise her young cuckoos. The survival of cuckoo chicks depends on the ignorance of other species."

(The Cuckoo's Egg, p. 21). The following also caught my 'metaphorical eye' since in our western culture history and herstory keep repeating themselves:

"White House urges developers to dump C and C++ ... it named Rust as an example of a programming language it considers safe." 

(InfoWorld) ... which brought up these Dredd Blog 'memories':

Ye Olde Drone Hackers Ride Again

The Cuckoo's Egg Hatched Again, 2, 3

(Series Posts (N-Z)  @"SPYING ON AMERICA"). The particular history/herstory being repeated is the typical blaming of the software program instead of blaming the programmer who wrote it (cf. "Choose Your Trances Carefully" and "It Can't Happen Here Trance"). 

The quote just above in this post ("White House urges developers to dump C and C++ ... and it named Rust as an example of a programming language it considers safe") and the following quote, read together, are exceptionally instructive:

"How to work with LLVM

The typical way to work with LLVM is via code in a language that supports LLVM’s libraries.

Two common language choices are C and C++. Many LLVM developers default to one of those two for good reasons:

LLVM itself is written in C++." 

(Link; The LLVM Compiler Infrastructure; LLVM Wikipedia). In other words, the White House assertion that programmers should stop using the C and C++ programming languages and use Rust instead is less convincing when one considers that LLVM, the compiler infrastructure the Rust compiler depends on, is itself written in the C++ programming language.
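For readers who have never touched LLVM, here is a minimal sketch (my own illustration, not from any of the quoted sources) of the "typical way to work with LLVM" that the quote describes: a short C++ program, assuming an installed LLVM development package, that uses LLVM's own C++ libraries to build a trivial function and print its intermediate representation. The names "demo" and "answer" are made up for illustration.

// Minimal sketch: use LLVM's C++ libraries to build IR for a function
// that returns 42, then print the generated IR to standard output.
// Assumes the LLVM headers and libraries are installed (e.g., linked via llvm-config).
#include "llvm/IR/IRBuilder.h"
#include "llvm/IR/LLVMContext.h"
#include "llvm/IR/Module.h"
#include "llvm/Support/raw_ostream.h"

int main() {
  llvm::LLVMContext context;
  llvm::Module module("demo", context);     // one compilation unit
  llvm::IRBuilder<> builder(context);

  // Equivalent of: int answer() { return 42; }
  auto *funcType = llvm::FunctionType::get(builder.getInt32Ty(), false);
  auto *func = llvm::Function::Create(
      funcType, llvm::Function::ExternalLinkage, "answer", module);
  auto *entry = llvm::BasicBlock::Create(context, "entry", func);
  builder.SetInsertPoint(entry);
  builder.CreateRet(builder.getInt32(42));

  module.print(llvm::outs(), nullptr);      // dump the IR as text
  return 0;
}

The point of showing it is simply that this everyday LLVM workflow is C++ through and through, which is what makes the quoted passage instructive.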

But I digress.

This series began with some little known historical hacking events:

"For forty years the military has been hackable. If you want a book that documented the current reality 40 years ago, read The Cuckoo's Egg, by Cliff Stoll. He had a difficult time convincing the military that it was being hacked as if it was an open book:

The meeting was top secret, so I couldn't listen—someone fetched me when my turn came. In a small room, lit only by the viewgraph machine, there were around thirty people, most of them in uniforms. Generals and admirals, like you see in the movies.

Well, I talked for half an hour, describing how the hacker was breaking into military computers and skipping through our networks.
...
I know as little about the military world as the next person. "I guess I'm impressed, though I'm not sure why," I said. "You ought to be," Bob said. "These are all flag officers. General John Paul Hyde works at the Joint Chiefs of Staff. And that guy in the front row -- he's a big shot from the FBI. It's a good thing he heard you."

(ibid, The Cuckoo's Egg, p. 200). This took place 40 years ago when a long hair from Berkeley informed the military they were being had, so they gave the long haired hippy astronomer the National Medal of Honor.

But they did not fix the system, because when someone hacks the system they can go to congress yelling "security security" to scare congress into giving them more money not to fix it."

(Ye Olde Drone Hackers Ride Again, 2009). If we add fifteen more years to the "For forty years the military has been hackable" line in the quote, and bring in all entities that use software, the statement becomes: for 55 years critical software has been hackable (cf. The Military NSA Can't Hack My Car Nor Can AGW Make Us Extinct, ACLU vs. Clapper, Alexander, Hagel, Holder, and Mueller - 9).

At least read page 21 in "The Cuckoo's Egg" where the author writes:

"Just one problem: there's a [']bug['] in that software." 

(The Cuckoo's Egg, p. 21). I put 'bug' in quotes to convey the sense that the intruder from afar considered it to be a feature, not a bug.

The bug had nothing to do with the programming language the software was written in.
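To make that point concrete, here is a small hypothetical sketch (mine, not Stoll's, and written in C++ only because one language had to be chosen) of the kind of accounting-style logic error that no programming language prevents: the code is perfectly 'memory safe' and compiles without complaint, yet the books will not balance.

// Hypothetical sketch: a logic bug that no choice of language prevents.
// Usage is billed per second at 0.0075 cents per second, but each job's
// charge is truncated to whole cents before being added to the total,
// so every small job is silently billed as zero.
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
  std::vector<int> jobSeconds = {40, 55, 70, 90, 120};    // made-up usage data

  std::int64_t totalCents = 0;
  for (int s : jobSeconds) {
    totalCents += static_cast<std::int64_t>(s * 0.0075);  // BUG: truncates to 0
  }

  // Prints 0: the usage exists, the money does not, and the discrepancy
  // has nothing to do with the language the program is written in.
  std::cout << "billed cents: " << totalCents << "\n";
  return 0;
}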

If you ask someone about the better/best programming languages, the results will vary:

"What is a programming language used in spacecraft?

"The programming languages used in spacecraft vary depending on the systems and tasks performed by those vehicles. Common languages used in spacecraft software development include:

"C/C++: The C/C++ programming language is one of the most popular languages in spacecraft software development. It is used to build control, analysis and simulation systems and to interact with devices and other systems.

Python: Python is increasingly used in the aerospace industry due to its ability to handle a wide range of tasks, such as data analysis, machine learning, and engineering applications.

Java: Java is used in some space applications, especially when it comes to programming applications based on graphical interfaces or cross-platform operational applications.

Assembly Language: In some cases, Assembly Language is used to program specific parts of space systems that require high performance and precise control of the equipment.

LabVIEW: LabVIEW is used in some space applications as a software tool for designing control, monitoring, and data mining systems."

(Hive). Getting more specific, what about the programming languages used to write computer operating systems:

8 Programming languages an operating system developer should know

The process of building an operating system is often challenging and complex. Developers use low-level system programming to develop operating systems because it offers them more control over how computer memory is managed, reduces resource consumption, and minimizes latency. Low-level programming enables developers to interface with hardware and ensure that the OS can smoothly communicate with the underlying computer architecture. Though there are many languages that can be used to develop an operating system, there are some languages that developers prefer more than others for the benefits they offer. Here is a list of 8 programming languages that are more commonly used:


Assembly Language:

The kernel is the most critical part of an operating system, and assembly language is often used by developers to write it. It is the lowest-level programming language and is used where the system requires direct access to the computer's hardware. Different architectures have their own assembly languages.

C:

C is one of the most widely used languages when it comes to developing an operating system. The language was created primarily to develop UNIX. C is capable of working with memory addresses and performing pointer arithmetic, a fundamental feature that makes it well-suited for system programming. It allows developers to directly manage and manipulate memory, which is crucial when building operating systems and other system-level software.
 

C++:

C++ is an extension of C. This programming language caters to additional needs when developing an operating system; it helps in creating more object-oriented, modular kernels. The development of C++-based operating systems like Haiku, and of the kernel parts of the Microsoft Windows operating system, demonstrates this.

Rust:

Rust has gained popularity in recent years for offering memory safety, low-level control, abstractions, and concurrency support. The language is designed to reduce typical memory-related programming mistakes like null pointer dereferences, buffer overflows, use-after-free errors, etc. It gives developers control over memory, interrupts, and CPU registers.

Nim:

Nim is a system programming language. It aims to combine high-level abstractions with low-level efficiency. It's not a traditional choice when it comes to building operating systems but it can be used in niche areas. Nim focuses on simplicity, safety, and performance and it is more suitable for certain user-level components.

Ada:

Ada is a statically typed, high-integrity language. It is used in safety-critical systems, including certain real-time and embedded operating systems.

Golang:

Golang is also used for specific components of an operating system. For instance, the Go programming language has been employed to develop various user-level applications, services, and utilities that run on top of the Plan 9 kernel.

Zig:

Zig is designed for low-level development. Besides safety and performance, it focuses on readability, and it has gained some attention in the systems programming community. Like Nim, Zig is not commonly used for building complete operating systems, but it can be employed for writing low-level components, device drivers, or other system-level software.

(Analytics Insight). Some operating system considerations:

"Unix distinguishes itself from its predecessors as the first portable operating system: almost the entire operating system is written in the C programming language, which allows Unix to operate on numerous platforms" (Wikipedia). 

Linux is also primarily written in C (Wikipedia).

And those operating systems can have mistakes in them if programmers make mistakes.
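To illustrate the kind of programmer mistake the Rust entry in the list above is aimed at, here is a small hypothetical C++ fragment (my own, for illustration only) containing a deliberate off-by-one buffer overflow; the language does not force the mistake, but it does not prevent it either, and the wrong loop bound is entirely the programmer's doing.

// Hypothetical sketch of a classic programmer mistake: an off-by-one
// buffer overflow. The loop bound uses <= instead of <, so the write at
// i == 8 lands one element past the end of the 8-element array (undefined
// behavior in C and C++).
#include <cstdio>

int main() {
  int readings[8] = {0};

  for (int i = 0; i <= 8; ++i) {   // BUG: should be i < 8
    readings[i] = i * i;           // the i == 8 write is out of bounds
  }

  std::printf("last valid reading: %d\n", readings[7]);
  return 0;
}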

Closing Comments

The programming language is neither the problem nor the solution, as shown by these infamous programmer mistakes:

1. The Mariner 1 Spacecraft, 1962

The first entry in our rundown goes right back to the sixties.

Before the summer of love or the invention of the lava lamp, NASA launched a data-gathering unmanned space mission to fly past Venus. It did not go to plan.

The Mariner 1 space probe barely made it out of Cape Canaveral before the rocket veered dangerously off course. Worried that the rocket was heading towards a crash-landing on earth, NASA engineers issued a self-destruct command and the craft was obliterated about 290 seconds after launch.

An investigation revealed the cause to be a very simple software error. A hyphen was omitted in a line of code, which meant that incorrect guidance signals were sent to the spacecraft. The overall cost of the omission was reported to be more than $18 million at the time (about $169 million in today’s world).

2. The Morris Worm, 1988

Not all costly software errors are caused by big companies or government organizations. In fact, one of the most costly software bugs ever was caused by a single student. A Cornell University student created a worm as part of an experiment, which ended up spreading like wildfire and crashing tens of thousands of computers due to a coding error.

The computers were all connected through a very early version of the internet, making the Morris worm essentially the first infectious computer virus. Graduate student Robert Tappan Morris was eventually charged and convicted of criminal hacking and fined $10,000, although the cost of the mess he created was estimated to be as high as $10 million.

History has forgiven Morris though, with the incident now widely credited for exposing a vulnerability and improving digital security. These days, Morris is a professor at MIT and the worm’s source code has been kept as a museum piece on a floppy disc at the University of Boston.

3. Pentium FDIV Bug, 1994

The Pentium FDIV bug is a curious case of a minor problem that snowballed due to mass hysteria.

Thomas Nicely, a math professor, discovered a flaw in the Pentium processor and reported it to Intel. Their response was to offer a replacement chip to anyone who could prove they were affected by it.

The original error was relatively simple, with a problem in the lookup table of the chip's division algorithm. This could cause tiny inaccuracies in calculations, but only very rarely. In fact, the chance of a miscalculation occurring was calculated to be just 1 in 360 billion.

Although the actual effects of the software error were negligible, when details of the bug hit the international press, millions of people requested a new chip, costing Intel upwards of $475 million.

4. Bitcoin Hack, Mt. Gox, 2011

Mt. Gox was the biggest bitcoin exchange in the world in the 2010s, until they were hit by a software error that ultimately proved fatal.

The glitch led to the exchange creating transactions that could never be fully redeemed, costing up to $1.5 million in lost bitcoins.

But Mt. Gox’s woes didn’t end there. In 2014, they lost more than 850,000 bitcoins (valued at roughly half a billion USD at the time) in a hacking incident. Around 200,000 bitcoins were recovered, but the financial loss was still overwhelming and the exchange ended up declaring bankruptcy.

5. EDS Child Support System, 2004

Back in 2004, the UK government introduced a new and complex system to manage the operations of the Child Support Agency (CSA). The contract was awarded to IT services company Electronic Data Systems (EDS). The system was called CS2, and there were problems as soon as it went live.

A leaked internal memo at the time revealed that the system was “badly designed, badly tested and badly implemented”. The agency reported that CS2 “had over 1,000 reported problems, of which 400 had no known workaround”, resulting in “around 3,000 IT incidents a week”. The system was budgeted to cost around £450 million, but ended up costing an estimated £768 million altogether. EDS, a Texas-based contractor, also announced a $153 million loss in their subsequent financial results.

6. Heathrow Terminal 5 Opening, 2008

Imagine prepping to jet off on your eagerly awaited vacation or important business trip, only to find that your flight is grounded and your luggage is nowhere to be seen.

This was exactly what happened to thousands of travelers when Heathrow's Terminal 5 opened back in March 2008, and it was all caused by buggy software. The problem lay with a new baggage handling system that performed well on test runs, but failed miserably in real life. This caused massive disruptions like malfunctioning luggage belts and thousands of items being lost or sent to the wrong destinations.

British Airways also revealed that problems with the wireless network caused additional problems at the airport. Over the next 10 days, some 42,000 bags were lost and more than 500 flights canceled, costing more than £16 million.

7. NASA’s Mars Climate Orbiter, 1998

Losing $20 from your wallet is probably enough to ruin your day — how would it feel to lose a $125 million spacecraft? NASA engineers found out back in 1998 when the Mars Climate Orbiter burned up after getting too close to the surface of Mars.

It took engineers several months to work out what went wrong. It turned out to be an embarrassingly simple mistake in converting imperial units to metric. According to the investigation report, the ground control software produced by Lockheed Martin used imperial measurements, while the software onboard, produced by NASA, was programmed with SI metric units. The overall cost of the failed mission was more than $320 million.

8. Soviet Gas Pipeline Explosion, 1982

This error is a little bit different to the others, as it was deliberate (or so rumor has it). In fact, the Soviet gas pipeline explosion is alleged to be a cunning example of cyber-espionage, carried out by the CIA.

Back in 1982, at the height of cold war tensions between the USA and the USSR, the Soviet government built a gas pipeline that ran on advanced automated control software. The Soviets planned to steal that software from a Canadian company that specialized in this kind of programming.

According to accounts, the CIA was tipped off and began plotting some counter-espionage. They worked with the Canadians to place deliberate bugs in the software (also known as a Trojan Horse) to compromise the Soviet pipeline.

The unknowing Soviets went ahead and stole the compromised software and applied it to the pipeline. In June 1982, the explosion occurred with a force which was visible from space. This severely damaged the pipeline, which had cost tens of millions to construct and was intended to produce $8 billion in natural gas revenue.

9. Knight’s $440M in bad trades, 2012

Losing $440 million is a bad day at the office by anyone's standards. Even more so when it happens in just 30 minutes due to a software error that wipes 75% off the value of one of the biggest capital groups in the world.

Knight Capital Group had invested in new trading software that was supposed to help them make a killing on the stock markets. Instead, it ended up killing their firm. Several software errors combined to send Knight on a crazy buying spree, spending more than $7 billion on 150 different stocks.

The unintended trades ended up costing the company $440 million, and Goldman Sachs had to step in to rescue them. Knight never really recovered, and was ultimately acquired by a competitor less than a year later.

10. ESA Ariane 5 Flight V88, 1996

Given the complexity and expense of space exploration, it’s no wonder there are several failed space missions on our list of all-time software errors. However, the European Space Agency’s Ariane 5 failure is an even harsher cautionary tale than the rest, as it was caused by more than one error.

Just 36 seconds after its maiden launch, the rocket engines failed due to the engineers reusing incompatible code from Ariane 4 and a conversion error from 64-bit to 16-bit data.

The failure resulted in a $370 million loss for the ESA, and a whole host of recommendations came out of the subsequent investigation, including calls for improved software analysis and evaluation.

11. The Millennium Bug, 2000

The Millennium Bug, AKA the notorious Y2K, was a massive concern in the lead-up to the year 2000. The concern was that computer systems around the world would not be able to cope with dates after December 31, 1999, due to the fact that most computers and operating systems only used two digits to represent the year, disregarding the 19 prefix for the twentieth century. Dire predictions were made about the implosion of banks, airlines, power suppliers and critical data storage. How would systems deal with the 00 digits?

The anticlimactic answer was "pretty well, actually". The millennium bug was a bit of a non-starter and didn't cause too many real-life problems, as most systems made adjustments in advance. However, fear of the potential fallout throughout late 1999 led to considerable spending on contingency planning and preparations, with institutions, businesses and even families expecting the worst. The USA spent vast quantities to address the issue, with some estimates putting the cost at $100 billion.

(Raygun). Remember, the programmer makes the mistakes; the programming language only does what it is 'told' (programmed by the programmer) to do.
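Item 7 above (the Mars Climate Orbiter) makes the same point as the rest of this post: the spacecraft was lost to a unit mismatch between two programs, not to the language either program was written in. Here is a hypothetical C++ sketch of that class of mistake; the function names and numbers are made up for illustration.

// Hypothetical sketch of a unit-mismatch bug of the Mars Climate Orbiter
// kind: one module reports thruster impulse in pound-force seconds, the
// caller assumes newton-seconds, and nothing in the language flags the
// disagreement.
#include <iostream>

// Ground-side module (hypothetical): returns impulse in POUND-FORCE SECONDS.
double reportedImpulse() { return 100.0; }

// Flight-side module (hypothetical): expects the impulse in NEWTON-SECONDS.
double plannedCorrection(double impulseNewtonSeconds) {
  return impulseNewtonSeconds * 0.001;   // made-up scaling, for illustration
}

int main() {
  double impulse = reportedImpulse();    // 100 lbf·s
  // BUG: the value should be converted (1 lbf·s is about 4.448 N·s) before
  // use, so the planned correction below is off by a factor of roughly 4.45.
  std::cout << "correction: " << plannedCorrection(impulse) << "\n";
  return 0;
}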
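Item 10 (Ariane 5) likewise describes a programmer mistake rather than a language failure: a 64-bit value was squeezed into 16 bits. A minimal hypothetical C++ sketch of that class of narrowing error:

// Hypothetical sketch of an Ariane-5-style narrowing error: a value that
// fits comfortably in 64 bits is forced into a signed 16-bit integer,
// whose maximum is 32767, and the result is nonsense.
#include <cstdint>
#include <iostream>

int main() {
  double horizontalBias = 50000.0;                           // fine as a double

  auto asWide   = static_cast<std::int32_t>(horizontalBias); // still 50000
  auto narrowed = static_cast<std::int16_t>(asWide);         // BUG: typically wraps to -15536

  std::cout << "original: " << horizontalBias
            << "  narrowed: " << narrowed << "\n";
  return 0;
}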
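And item 11 (Y2K) was, again, a programmer decision: storing only two digits of the year. A tiny hypothetical C++ sketch of the arithmetic that had everyone worried:

// Hypothetical sketch of the two-digit-year (Y2K) problem: age arithmetic
// that implicitly assumes both years share the 19xx century goes wrong
// the moment the calendar rolls past 1999.
#include <iostream>

int main() {
  int birthYear   = 85;   // stored as "85", meaning 1985
  int currentYear = 0;    // stored as "00", meaning 2000

  int age = currentYear - birthYear;   // BUG: 0 - 85 = -85

  std::cout << "computed age: " << age << "\n";
  return 0;
}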

So, the White House would have done better to say: "White House urges developers to dump programming mistakes".

Finally, forget about teaching an X president Spanish to stop him from lying, eh?

In the sense that "music" and lyrics are a "language" (way of communicating), the video below is also a parasitic anthem.

It is cuckoo because 'music' is not sentient, can't write songs, can't write anything; the real writer of this song, like the cuckoo hacker that Cliff Stoll busted, is also a parasite, and the words are parasitic.

Like the cuckoo, which targets and deceives a mother bird of another species, causing her to, in effect, assassinate her own chicks, this cuckoo song's words are parasitic (however, the second song is truly instructive).

The next post in this series is here, the previous post in this series is here.




"I met a strange lady, she made me nervous
She took me in and gave me breakfast
And she said
"Do you come from a land down under
Where women glow and men plunder?
Can't you hear, can't you hear the thunder?
You better run, you better take cover"

