About the Initiative

The Friendly Technology Initiative is a mission to promote friendly computer technology.

Computing technology — personal computers, phones, tablets and business hardware — should be friendly.

Friendly technology is unobtrusive, functional, pragmatic, aesthetic, respectful, ethical and robust.
(see the Friendly Principles.)

Unfortunately, much current technology is unfriendly by these standards.
(see the Unfriendly Epidemic.)

We believe these standards are reasonable, desirable and achievable. The principal impediments to friendly technology are not technological, but social and economic.
(see the Friendly Rationale.)

Our Mission

The Friendly Technology Initiative is a mission to promote friendly computer technology — both through the development of new friendly technology, and through the promotion of friendlier existing alternatives.

The Friendly Principles

The key principles of friendly technology are:

  • Unobtrusive.
    • inconspicuous
    • serves the users
    • not obnoxious
  • Functional.
    • fit for purpose
    • free from significant or notable defects
  • Pragmatic.
    • sufficiently documented
    • as simple as practical
    • feature-stable and complete
  • Aesthetic.
    • pleasing to the senses
  • Respectful.
    • conducive to creativity
    • affirms the productive will of the users
    • protective of privacy
    • no harassment
    • free of rules, impediments or legal encumbrances
  • Ethical.
    • supports community
    • accessible
  • Robust.
    • reliable
    • trustworthy
    • secure

In practice, these principles have several other important implications:

Rationale

Nobody wants to spend time around someone who is unfriendly. In the same way, nobody wants to work with uncooperative or obnoxious tools.

The primary motivation of engineers and creatives ought to be the problem they are actually trying to solve, or the idea or feeling they are trying to explore. Whilst obnoxious and unfriendly technology may serve as a motivator to develop better tools, it also offers significant distraction from the task at hand.

Friendly technology fosters the ideas, freedoms and activities which form the foundation of human society. Combative and aggressively competitive technology, which relies upon the violence of law above wisdom, is fundamentally an impediment to our collective ideals. Too often we preach freedom whilst acting with disregard for the freedom of others — to do so is effectively an own-goal.

The initiative is as much a pragmatic idea as it is a philosophical idea.

Part of the activity of the initiative is the presentation of Friendly Technology — both existing and newly developed.

Our Strategy

The initiative is promoting friendly technology in a number of ways:

The Unfriendly Epidemic

We are in the midst of an obnoxious technology epidemic. To see that this is so, we need only take a look at the history of digital computing and plot a few key indicators which relate directly to the Friendly Principles.

Let us examine a couple of very simple metrics.

Computational Performance

First, we will view a chart of microprocessor clockspeed (see Figure 1 below.) Clockspeed historically served as a rough approximation of the year-on-year increase in computational power.

Figure 1: Microprocessor Clockspeed (1971-2017)
Microprocessor             Year   Clockspeed (MHz)   x10 Improvement *
Intel 4004                 1971   0.74               0
Intel 8080                 1974   2                  0.4318
Intel 8086                 1978   5                  0.8297
Motorola 68000             1979   8                  1.0339
Intel 80386                1985   40                 1.7328
Intel Pentium              1993   66                 1.9503
IBM/Motorola PowerPC 601   1993   80                 2.0339
Intel Pentium II           1997   300                2.6079
IBM/Motorola PowerPC 750   1997   366                2.6942
Intel Pentium III          1999   600                2.9089
AMD Athlon                 1999   1000               3.1308
Intel Pentium 4            2000   2000               3.4318
Intel Pentium D            2005   3200               3.6359
Intel Core i7              2008   3200               3.6359
AMD Ryzen                  2017   4100               3.7436

Clockspeeds represent maximum advertised performance. Sample data selected based on common microprocessors generally used in personal computers. Obviously market factors may have significantly impacted the data.

* x10 improvement is calculated as the base-10 logarithm of the sample speed divided by the 1971 baseline speed, where sampled performance measurement begins (this calculation is reproduced in the short sketch following the source note below.) A polynomial trendline has been used to smooth the graphic.

Source: https://en.wikipedia.org/wiki/Microprocessor_chronology
accessed November 15, 2020
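
For the curious, the 'x10 improvement' column, here and in the figures that follow, can be reproduced with a few lines of Python. This is a minimal sketch; the rows are taken from Figure 1 above.

    # Reproduce the 'x10 improvement' column of Figure 1: the base-10
    # logarithm of each sample clockspeed relative to the 1971 baseline.
    import math

    # (microprocessor, year, clockspeed in MHz): rows from Figure 1
    samples = [
        ("Intel 4004", 1971, 0.74),
        ("Intel Pentium", 1993, 66),
        ("AMD Ryzen", 2017, 4100),
    ]

    baseline = samples[0][2]
    for name, year, mhz in samples:
        print(f"{name} ({year}): {math.log10(mhz / baseline):.4f}")
    # prints: Intel 4004 (1971): 0.0000
    #         Intel Pentium (1993): 1.9503
    #         AMD Ryzen (2017): 3.7436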

Clockspeed represents the rate at which a microprocessor clock 'ticks' (today measured in gigahertz — billions of cycles per second.) The microprocessor is the small silicon chip at the heart of a computer — the 'brain', if you like. Though there are many other factors involved, the lay person can think of clockspeed as somewhat analogous to the number of calculations or primitive operations performed each second. So a 400 MHz clock is very roughly analogous to 400 million operations every second.
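
Taking that lay analogy at face value (and it is only an analogy; real processors execute more or fewer instructions per cycle), the back-of-envelope conversion amounts to:

    # A very rough conversion, treating one clock cycle as roughly one
    # primitive operation, per the lay analogy above.
    def approx_ops_per_second(clock_mhz: float) -> float:
        return clock_mhz * 1_000_000

    print(approx_ops_per_second(400))   # 400 MHz: ~400,000,000 ops/s
    print(approx_ops_per_second(0.74))  # Intel 4004: ~740,000 ops/s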

We seem to have reached something of a physical impasse in the pursuit of higher clockspeeds. In recent history, microprocessor manufacturers have had to seek increasingly innovative means to continue to increase the performance of general purpose computing systems. As a result, clockspeed is no longer a good on-going measurement of performance improvement.

Nevertheless, throughout the early decades of personal computing, clockspeed improvements are roughly indicative of the year-on-year improvement in the performance of the central processor. In that regard, we can see that the improvement in processor performance since the birth of the microprocessor in 1971 has been enormous.

Today, high-end systems are easily 10,000 times faster than the earliest microprocessor-based systems.

Personal Computing Storage

A similar trend is repeated throughout the same period with respect to personal computing storage capacity (see Figure 2.)

Figure 2: Apple Personal Computer Storage (1976-2013)
System               Year   Storage (MB)   x10 Improvement *
Apple ][             1976   0.114          0
Macintosh Plus       1986   0.800          0.85
Macintosh II         1987   80             2.85
Macintosh Quadra     1992   1,000          3.94
Power Macintosh G3   1997   4,000          4.55
iMac Flat Panel      2002   60,000         5.72
MacBook Pro          2008   312,500        6.45
MacBook Pro          2013   1,000,000      6.94

Quoted storage figures are usually representative of the high-end of stock offerings, with few exceptions. Sample data selected to provide an indicative chronological spread. Obviously market factors may have significantly impacted the data.

* x10 improvement is calculated as the base-10 logarithm of the sample storage capacity divided by the 1976 baseline capacity (the same calculation as in Figure 1). A polynomial trendline has been used to smooth the graphic.

Source: https://www.apple-history.com
accessed November 15, 2020

The Apple line of personal computers, produced from 1976 through to the present day, is an ideal casual reference, since Apple have usually been the exclusive maker of their hardware (meaning the information is gathered in one place) and their history is well documented.

Operating System Size

The relevance of the preceding charts to our discussion only becomes apparent, however, when these performance and capacity metrics are juxtaposed with the size of the software of the same period.

Consider the installed size of various consumer operating systems over a similar period (see Figure 3.)

Figure 3: Consumer Operating System Size (1983-2015)
Operating System            Year   Installed Size (MB)   x10 Increase ¹
Apple ProDOS ²              1983   0.4                   0.00
Macintosh System 6 ²        1988   0.8                   0.30
Microsoft Windows 3.1       1992   15                    1.57
Macintosh System 7.5.3      1995   20                    1.70
Microsoft Windows 95        1995   55                    2.14
Microsoft Windows 98        1998   195                   2.69
Mac OS X 10.0.4             2001   800                   3.30
Windows XP                  2001   1,500                 3.57
Mac OS X 10.3.9 'Panther'   2003   1,500                 3.57
Windows Vista               2007   15,000                4.57
Windows 8                   2012   20,000                4.70
Windows 10                  2015   32,000                4.90

Sample data selected to provide an indicative chronological spread.

¹ x10 increase is calculated as the base-10 logarithm of the sample install size divided by the 1983 baseline install size. A polynomial trendline has been used to smooth the graphic.

² Early operating systems were not technically 'installed', but loaded from floppy disk.

Source: https://en.wikipedia.org/
accessed November 15, 2020

The world's most popular free software operating system, Linux, is not included in these statistics. There is no standard Linux distribution. One of the principal advantages of Linux and other free operating systems, in fact, is that given sufficient knowledge a user may exercise significant control over precisely what is included and installed.

Nevertheless, the free software community is not immune from bloat, as we shall see in a moment.

What these figures show is that as performance and storage capacity have increased, so too has the size of the most popular consumer operating systems. Early operating systems were so small by comparison with more recent versions that they do not even appear upon the chart. Moreover, the magnitude of the size increase is often comparable with that of performance and storage at the time.

All of this raises the question — over this time period, has the feature set, performance, stability, usability and overall friendliness of these operating systems increased by a corresponding magnitude? Has Windows or macOS become ten thousand times easier to use, or ten thousand times faster, or more stable? Have there been even one hundred significant advancements in user interface, or usable features, that could warrant such a monumental increase in size?

Word Processor Size

Let us make one final albeit trivial comparison — one which may be applied even to free software (see Figure 4.)

Figure 4: Word Processor Comparison (1994-2020)
Office Product        Year   Installed Size (MB)   x10 Increase   Memory Usage (MB)   x10 Increase
ClarisWorks 3         1994   1.3                   0.00           0.975               0.00
LibreOffice 6.4.0.3   2020   625                   2.68           60                  1.79

Sample data selected to provide an indicative chronological spread. Application programs observed to have similar requirements in a given chronological period, regardless of operating system or platform.

ClarisWorks 3 was installed on the Mini vMac 68k Macintosh emulator running System 7.5.3. LibreOffice 6.4.0.3 was installed on an AMD x64 PC running Windows 10.

The comparison of two popular word processing applications is particularly interesting.

It is easier to obtain data for application software because, unlike an operating system, an application has only one major component, and the selection of installed packages is not a significant consideration, as it would be if we were to compare Linux distributions.

Word processors themselves have not changed significantly over the period we are discussing. Even 'office packages' — which technically both of our examples are — are still essentially the same today as they were in 1994. ClarisWorks included word processing, spreadsheets, drawing and painting, a simple database and a communications terminal emulator. LibreOffice includes word processing, spreadsheets, drawing, a significantly evolved database, mathematical formula creation and presentations. The most significant difference in software complexity here is the database. Yet similarly sized database packages were available for ClarisWorks-era machines, so it is highly dubious that the database alone explains the massive difference in size between these two programs.

Yet again, we see orders of magnitude difference in resource requirements and utilisation between the software of yesteryear and the software of today.

Tinkering Around the Margins

Both the aesthetic and functional changes between the oldest and most recent software packages surveyed in this article are arguably minor compared to the relative differences in resource utilisation.

Figure 5: Apple's System 7.5.3 running on an emulated Macintosh II with 8 MB of memory

Figure 6: Microsoft's Windows 10 running on a real AMD x64 PC with 8 GB of memory (that's 8,192 MB)

Aside from some fancy translucency effects, I cannot really tell the difference. Can you?

If anything, to me, the modern desktop (see Figure 6) represents a significant step backward in usability; the modern user interface is cluttered by comparison to the old (see Figure 5.) Even given the more than twenty-year gap between these two operating systems, the 'Character Map' of the ubiquitous Windows is ugly by comparison to the vintage Mac 'Key Caps'. The Windows 'Notepad' doesn't support styled text, but the Mac 'SimpleText' does. The Windows 'Calculator' is huge, which is obviously just what you need from something ordinarily designed to sit open on your desktop. And the 'Save Changes?' dialog box on Windows looks positively camouflaged by comparison with the crisp ancient Mac version; it is as if the system were whispering quietly, "You're about to lose all your work. Do you really want to do that?" — fantastic!

Moreover, the core user-facing elements of modern operating systems remain virtually unchanged from their predecessors. We still have pointers, windows, icons, menus, buttons, desktops, and file browsers. A word processor is still fundamentally about processing, well... words. Professional applications for graphics and desktop publishing already existed twenty-five years ago. Sure, there have been some useful incremental improvements, though it is difficult to point to anything revolutionary. Meanwhile, our user experience has become more cluttered and more prone to failure than it was twenty-five years ago.

For instance, a recent update to Apple's latest operating system has reportedly rendered some users' machines inoperative. Speaking from experience, this is not an isolated incident. Nor is it limited to Apple — it is an industry-wide phenomenon.

If we attempt to enumerate the significant changes to software over the last two to three decades of consumer computing history, and exclude all those which are not actually that recent, there are very few:

Some would argue that I am being extraordinarily unfair. They might say that much of the progress of the past twenty-five years has obviously occurred 'under the hood', in the parts of the system to which the ordinary user is not privy. To that, I say "bollocks"! Again, I challenge anyone to produce more than a handful of fairly minor 'improvements'.

Software development methodologies have changed superficially, but basically we are using the same tools today as we were twenty-five years ago. Today, instead of servers we have 'the cloud'. Instead of data structures we have 'objects'. Instead of Java we have Python and Ruby and a plethora of other languages which basically amount to dialects of the same language paradigms from the dawn of the modern computing age in the '60s and '70s. If you want an example of a genuine paradigm shift in programming, you need to look at something like Forth.

It becomes increasingly difficult to comprehend — particularly for a skilled systems engineer — how such a huge disparity in resource utilisation can be considered necessary. Indeed, it is recognisably not so.

What is occurring is that decidedly average code writers are educated into the existing paradigm and constrained by the artificial imperatives of the economic machine into consistently writing and rewriting barbaric and inefficient code. This inefficiency has even been welcomed and heralded as good practice. The famous programming maxim, "Premature optimisation is the root of all evil", has become doctrine — a quotation long since removed from the original context in which it appeared, and simplistically misinterpreted by seemingly nearly all who encounter it (see The Fallacy of Premature Optimization by Randall Hyde.)

Perhaps if there were a corresponding perceptible or statistical increase in reliability or performance whilst using contemporary systems, that might be considered some justification for this waste. Though I challenge anyone to find it — because this author certainly cannot.

More probably, market forces explain the disparity. The massive increases in resource availability have made it possible to write bloated software. Without the resource constraints of earlier times, it seems there is no significant opposing force to the tendency toward ever more bloated and complicated systems.

Even the prospect of future advances cannot explain what has already occurred, nor what may yet develop.

There is an intuitive limit to how innovative you can make a word processor. There is a similar intuitive limit to how much you can evolve what most of us recognise as a computer operating system — within the limitations intrinsic to the human nervous system. Even if we build capable speech-operated systems, or neural interfaces, the actual functions of these systems are significantly circumscribed by the tasks they perform. Word processing is computer-aided writing, and writing itself has existed, as a paradigm, largely unchanged for millennia.

'Progress' in the functionality of software is superficial at best — like the sleek, drop-shadow-enhanced, graphically accelerated user interface aesthetics that accompany the myriad mostly small incremental changes — little more than 'tinkering around the margins' of possibility.

Expert Opinions

Zawinski's Law, or "the Law of Software Envelopment", states:

"Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can."

Jamie Zawinski

There are a few notable figures throughout computing history who appear to have commented directly or indirectly upon the phenomenon of software bloat. They include Jamie Zawinski, quoted above; Chuck Moore, the creator of the Forth programming language; and Donald Knuth, the creator of the TeX typesetting system.

Moore has created a thing so unlike almost everything else that it could conceivably take an experienced programmer months to comprehend the degree to which Forth is a paradigm unlike any other. It is also an example of a system that is minuscule by the standards of today, yet arguably more functional.

Moreover, Moore has made abundantly clear what he thinks of the state of technology and software in particular, stating:

“I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth.”

With regards to the issue of feature creep and the gradual tendency of software to expand, Knuth seems to have similarly strong opinions.

Knuth has stated that the development of the TeX typesetting system is complete. It is a finished piece of software. For all intents and purposes, there is nothing to be gained by developing it further — aside from making the odd and exceedingly rare bug fix when necessary. In this vein, TeX uses the mathematical constant pi as its version number: whenever there is a fix, an additional digit of pi is appended. At the time of writing, the current version of TeX is 3.14159265. This position is anathema to contemporary market-driven software development, which is forever tinkering to manufacture an excuse to sell the next version or to retain subscribers.
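
As a toy illustration of the convention (a sketch only, not Knuth's actual release process), the scheme amounts to truncating the digits of pi:

    # A toy sketch of TeX's pi-based version scheme: each release
    # appends one further digit of pi to the version string.
    # (Illustrative only; not Knuth's actual release tooling.)
    PI_DIGITS = "3.14159265358979"

    def tex_version(release: int) -> str:
        # release 1 -> "3.1", release 2 -> "3.14", and so on
        return PI_DIGITS[:2 + release]

    print(tex_version(8))  # prints "3.14159265", the version cited above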

Unfriendly Epidemic

Contemporary systems are significantly unfriendly for a number of reasons:

These characteristics are commonplace, unnecessary and disruptive.

Contemporary systems fail the freedom test and they fail the simplicity test. Without freedom and an eye toward simplicity, a system cannot be friendly.