About the Initiative
The Friendly Technology Initiative is a mission to promote friendly computer technology.
Computing technology — personal computers, phones, tablets and business hardware — should be friendly.
Friendly technology is unobtrusive, functional, pragmatic, aesthetic, respectful, ethical and robust.
(see the Friendly Principles.)
Unfortunately, much current technology is unfriendly by these standards.
(see the Unfriendly Epidemic.)
We believe these standards are reasonable, desirable and achievable. The principal impediments to friendly technology are not technological, but social and economic.
(see the Friendly Rationale.)
The initiative pursues this mission both through the development of new friendly technology, and through the promotion of existing, friendlier alternatives.
The Friendly Principles
The key principles of friendly technology are:
- serves the users
- not obnoxious
- fit for purpose
- free from significant or notable defects
- sufficiently documented
- as simple as practical
- feature-stable and complete
- pleasing to the senses
- conducive to creativity
- affirms the productive will of the users
- protective of privacy
- no harassment
- free of rules, impediments or legal encumbrances
- supports community
In practice, these principles have several other important implications:
- Free Software. Software as promoted by the Free Software Foundation, unencumbered by artificial legal, copyright, patent or intellectual-property restrictions, is far preferable and far more friendly than proprietary software.
- Even moderately complex solutions are difficult and often impossible to verify, and therefore untrustworthy.
- Complex solutions take longer for the user to learn, which can compromise or derail creative freedom.
- Complex solutions are more difficult for a developer to understand, which acts as an additional encumbrance to making desirable improvements or modifications.
In fact, complexity is perhaps the primary failing of otherwise 'free software'; complexity is the encumbrance hidden in plain sight.
Nobody wants to spend time around someone who is unfriendly. In the same way, nobody wants to work with uncooperative or obnoxious tools.
The primary motivation of engineers and creatives ought to be the problem they are actually trying to solve, or the idea or feeling they are trying to explore. Whilst obnoxious and unfriendly technology may serve as a motivator to develop better tools, it also offers significant distraction from the task at hand.
Friendly technology fosters the ideas, freedoms and activities which form the foundation of human society. Combative and aggressively competitive technology, which relies upon the violence of law above wisdom, is fundamentally an impediment to our collective ideals. Too often we preach freedom whilst acting with disregard for the freedom of others — to do so is an own goal.
The initiative is as much a pragmatic idea as it is a philosophical idea.
Part of the activity of the initiative is the presentation of Friendly Technology — both existing and newly developed.
The initiative is promoting friendly technology in a number of ways:
- Marketing. Promoting and explaining the benefits of friendly technology through online channels. Featuring particularly good examples of friendly technology.
- Nexus. Organising information as it relates to friendly technology and providing a central point-of-departure for anyone seeking more friendly technology for their home, office, educational institution, small business, non-profit or enterprise.
- Software Databases. Building a database of particularly friendly existing free and open-source software products — including both application programs and operating systems.
- Hardware Databases. Building a database of particularly friendly hardware products — in particular, personal computing devices (laptops, tablets and mobile phones), and business computing technology (servers, communications and storage.)
- Quality Documentation. Building high-quality documentation for Usage and Troubleshooting of computing systems.
- Development. Developing promising technology and our own ideas in accordance with the friendly principles.
- Education. Providing learning resources to empower individuals with respect to computing technology.
The Unfriendly Epidemic
We are in the midst of an obnoxious technology epidemic. To see that this is so, we need only take a look at the history of digital computing and plot a few key indicators which relate directly to the Friendly Principles.
Let us examine a couple of very simple metrics.
First, we will view a chart of microprocessor clockspeed (see Figure 1 below.) Clockspeed was historically a casual approximation of year-on-year increase in computational power.
| Microprocessor | Year | Clockspeed (MHz) | x10 Improvement * |
| --- | --- | --- | --- |
| IBM/Motorola PowerPC 601 | 1993 | 80 | 2.0339 |
| Intel Pentium II | 1997 | 300 | 2.6079 |
| IBM/Motorola PowerPC 750 | 1997 | 366 | 2.6942 |
| Intel Pentium III | 1999 | 600 | 2.9089 |
| Intel Pentium 4 | 2000 | 2000 | 3.4318 |
| Intel Pentium D | 2005 | 3200 | 3.6359 |
| Intel Core i7 | 2008 | 3200 | 3.6359 |
Clockspeeds represent maximum advertised performance. Sample data selected based on common microprocessors generally used in personal computers. Obviously market factors may have significantly impacted the data.
* x10 improvement is calculated as the base-10 logarithm of the sample speed divided by the 1971 baseline speed, the year in which the sampled measurements begin. A polynomial trendline has been used to smooth the graphic.
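As a sketch, the x10 metric can be reproduced in a few lines of Python. The 1971 baseline of 0.74 MHz is an assumption back-derived from the tabulated figures (and consistent with the Intel 4004's 740 kHz clock); it is not stated explicitly in the table.

```python
import math

BASELINE_MHZ = 0.74  # assumed 1971 baseline (Intel 4004, 740 kHz)

def x10_improvement(clockspeed_mhz, baseline_mhz=BASELINE_MHZ):
    """Orders-of-magnitude improvement over the baseline: log10(sample / baseline)."""
    return math.log10(clockspeed_mhz / baseline_mhz)

# A few sample clockspeeds from Figure 1
for name, mhz in [("PowerPC 601 (1993)", 80),
                  ("Pentium II (1997)", 300),
                  ("Core i7 (2008)", 3200)]:
    print(f"{name}: {x10_improvement(mhz):.4f}")
```

Run against the sampled clockspeeds, this reproduces the table's figures (2.0339, 2.6079, 3.6359); the same calculation, with different baselines, underlies the storage and install-size tables below.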
accessed November 15, 2020
Clockspeed represents the rate at which a microprocessor clock 'ticks' (today measured in Gigahertz — billions of cycles per second.) The microprocessor is the small silicon chip at the heart of a computer — the 'brain' if you like. Though there are a lot of other factors involved, the lay person can think of clockspeed as somewhat analogous to how many calculations/primitive-operations are performed each second. So a 400 MHz clock is very roughly analogous to 400 million operations every second.
We seem to have reached something of a physical impasse in the pursuit of higher clockspeeds. In recent history, microprocessor manufacturers have had to seek increasingly innovative means to continue to increase the performance of general purpose computing systems. As a result, clockspeed is no longer a good on-going measurement of performance improvement.
Nevertheless, throughout the early decades of personal computing, clockspeed improvements are roughly indicative of the year-on-year improvement in the performance of the central processor. In that regard, we can see that the growth in processor performance has been significant since the birth of the microprocessor in 1971.
Today, high-end systems are easily 10,000 times faster than the earliest microprocessor-based systems.
Personal Computing Storage
A similar trend is repeated throughout the same period with respect to personal computing storage capacity (see Figure 2.)
| System | Year | Storage (MB) | x10 Improvement * |
| --- | --- | --- | --- |
| Power Macintosh G3 | 1997 | 4,000 | 4.55 |
| iMac Flat Panel | 2002 | 60,000 | 5.72 |
Quoted storage figures are usually representative of the high-end of stock offerings, with few exceptions. Sample data selected to provide an indicative chronological spread. Obviously market factors may have significantly impacted the data.
* x10 improvement is calculated as the base-10 logarithm of the sample storage capacity divided by the original 1976 baseline capacity. A polynomial trendline has been used to smooth the graphic.
accessed November 15, 2020
The Apple line of personal computers, produced from 1976 through to the present day, is an ideal casual reference, since Apple have usually been the exclusive maker of their hardware (meaning the information is gathered in one place) and their history is well documented.
Operating System Size
The relevance of the aforementioned charts to our discussion only becomes apparent, however, in the juxtaposition between these performance and capacity metrics, and the size of the software of the same time period.
Consider the installed size of various consumer operating systems over a similar period (see Figure 3.)
| Operating System | Year | Installed Size (MB) | x10 Increase ¹ |
| --- | --- | --- | --- |
| Apple ProDOS | 1983 | 0.4 ² | 0.00 |
| Macintosh System 6 | 1988 | 0.8 ² | 0.30 |
| Microsoft Windows 3.1 | 1992 | 15 | 1.57 |
| Macintosh System 7.5.3 | 1995 | 20 | 1.70 |
| Microsoft Windows 95 | 1995 | 55 | 2.14 |
| Microsoft Windows 98 | 1998 | 195 | 2.69 |
| Mac OS X 10.0.4 | 2001 | 800 | 3.30 |
| Mac OS X 10.3.9 'Panther' | 2003 | 1,500 | 3.57 |
Sample data selected to provide an indicative chronological spread.
¹ x10 increase is calculated as the base-10 logarithm of the sample install size divided by the original 1983 baseline install size. A polynomial trendline has been used to smooth the graphic.
² Early operating systems were not technically 'installed', but loaded from floppy disk.
accessed November 15, 2020
The world's most popular free software operating system, Linux, is not included in these statistics. There is no standard Linux distribution. One of the principal advantages of Linux and other free operating systems, in fact, is that given sufficient knowledge a user may exercise significant control over precisely what is included and installed.
Nevertheless, the free software community is not immune from bloat, as we shall see in a moment.
What these figures show is that as performance and system capacity have increased, so too has the size of the most popular consumer operating system software. Early operating systems were so small by comparison with more recent versions that they do not even appear upon the chart. Moreover, the magnitude of the size increase is often comparable with that of performance and storage at the time.
All of this raises the question — over this time period, have the feature set, performance, stability, usability and overall friendliness of these operating systems increased by a corresponding magnitude? Has Windows or macOS become ten thousand times easier to use, or ten thousand times faster, or more stable? Have there been even one hundred significant advancements in user interface, or usable features, that could warrant such a monumental increase in size?
Word Processor Size
Let us make one final albeit trivial comparison — one which may be applied even to free software (see Figure 4.)
| Office Product | Year | Installed Size (MB) | x10 Increase | Memory Usage (MB) | x10 Increase |
| --- | --- | --- | --- | --- | --- |
Sample data selected to provide an indicative chronological spread. Application programs observed to have similar requirements in a given chronological period, regardless of operating system or platform.
ClarisWorks 3 was installed on the Mini vMac 68k Macintosh emulator running System 7.5.3. LibreOffice was installed on an AMD x64 PC running Windows 10.
The comparison of two popular word processing applications is particularly interesting.
It is easier to obtain data for application software because, unlike an operating system, an application has only one major component, and the installed selection of packages is not a significant consideration as it would be if we were comparing Linux distributions.
Word processors themselves have not changed significantly over the chronological period we are discussing. Even 'office packages' — which technically both of our examples are — are still essentially the same today as they were in 1994. ClarisWorks included word processing, spreadsheets, drawing and painting, a simple database and a communications terminal emulator. LibreOffice includes word processing, spreadsheets, drawing, a significantly evolved database, mathematical formula creation and presentation. The most significant difference here in software complexity is the database. There were, however, similarly sized database packages available for machines of the ClarisWorks era, so it is highly dubious that this alone explains the massive difference in size between these two programs.
Yet again, we see orders of magnitude difference in resource requirements and utilisation between the software of yesteryear and the software of today.
Tinkering Around the Margins
Both the aesthetic and functional changes between the oldest and most recent software packages surveyed in this article are arguably minor compared to the relative differences in resource utilisation.
If anything, to me, the modern desktop (see Figure 6) represents a significant step backward in usability; the modern user interface is cluttered by comparison to the old (see Figure 5.) Even given the more than twenty-year gap between these two operating systems, the 'Character Map' of the ubiquitous Windows is ugly by comparison to the vintage Mac 'Key Caps'. The Windows 'Notepad' doesn't support styled text, but the Mac 'SimpleText' does. The Windows 'Calculator' is huge, which is obviously just what you need from something ordinarily designed to sit open on your desktop. And the Save Changes? dialog box on Windows looks positively camouflaged by comparison with the crisp ancient Mac version; it is as if the system were whispering quietly, "You're about to lose all your work. Do you really want to do that?" — fantastic!
Moreover, the core user-facing elements of modern operating systems remain virtually unchanged from their predecessors. We still have pointers, windows, icons, menus, buttons, desktops, and file browsers. A word processor is still fundamentally about processing, well... words. Professional applications for graphics and desktop publishing still existed twenty-five years ago. Sure, there have been some useful incremental improvements; though it is difficult to point to anything revolutionary. Meanwhile, our user experience has become more cluttered and prone to failure than twenty-five years ago.
For instance, a recent update to Apple's latest operating system has reportedly rendered some users' machines inoperative. Speaking from experience, this is not an isolated incident. Nor is it limited to Apple — it is an industry-wide phenomenon.
If we attempt to enumerate the significant changes to software over the last two to three decades of consumer computing history, and exclude all those which are not actually that recent, there are very few:
- Smooth graphics and effects on high-resolution displays
- Much improved typography and internationalisation support, perhaps
Some would argue that I am being extraordinarily unfair. They might say that much of the progress of the past twenty-five years has obviously occurred 'under the hood', in the parts of the system to which the ordinary user is not privy. To that, I say "bollocks"! Again, I challenge anyone to produce more than a handful of fairly minor 'improvements'.
Software development methodologies have changed superficially, but basically we are using the same tools today as we were twenty-five years ago. Today instead of servers we have 'the cloud'. Instead of data structures we have 'objects'. Instead of Java we have Python and Ruby and a plethora of other languages which basically amount to dialectal variations on the same language paradigms from the dawn of the modern computing age in the 60s and 70s. If you want an example of a genuine paradigm shift in programming, you need to look at something like Forth.
It becomes increasingly difficult to comprehend — particularly to a skilled systems engineer — how such a huge disparity in resource utilisation can be considered necessary. Indeed it is recognisably not so.
What is occurring is that decidedly average code writers are educated into the existing paradigm and constrained by the artificial imperatives of the economic machine into consistently writing and rewriting barbaric and inefficient code. This inefficiency has even been welcomed and heralded as good practice. The famous programming maxim, "Premature optimisation is the root of all evil", has become a doctrine derived from a quotation long since removed from the original context in which it appeared, and simplistically misinterpreted by seemingly nearly all who encounter it (see The Fallacy of Premature Optimisation by Randall Hyde.)
Perhaps if there were a corresponding perceptible or statistical increase in reliability or performance whilst using contemporary systems, that might be considered some justification for this waste. Though I challenge anyone to find it — because this author certainly cannot.
More probably, market forces explain the disparity. The massive increases in resource availability have made it possible to write bloated software. Without the resource constraints of earlier times, it seems there is no significant opposing force to the tendency toward ever more bloated and complicated systems.
Even the prospect of future advances cannot explain what has already occurred, nor what might be yet to develop.
There is an intuitive limit to how innovative you can make a word processor. Likewise, there is a similar intuitive limit to how much you can evolve what most of us recognise as a computer operating system — within the limitations intrinsic to the human nervous system. Even if we build capable speech-operated systems, or neural interfaces, the actual functions of these systems are significantly circumscribed by the tasks they perform. Word processing is computer-aided writing, and writing itself has existed, as a paradigm, largely unchanged for millennia.
'Progress' in the functionality of software is superficial at best — like the sleek drop-shadow enhanced, graphically accelerated user interface aesthetics that accompany the myriad mostly small incremental changes — little more than 'tinkering around the margins' of possibility.
Zawinski's Law, or "the Law of Software Envelopment", states that,
"Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can."
— Jamie Zawinski
There are a few notable figures throughout computing history who appear to have commented directly or indirectly upon the phenomenon of software bloat. They include:
- Chuck Moore — creator of the Forth programming language/operating system/paradigm, which is a thing of great beauty and simplicity.
- Donald Knuth — author of The Art of Computer Programming, creator of the TeX typesetting system and proponent of literate programming.
Moore has created something so unlike almost everything else that it could conceivably take an experienced programmer months to comprehend the degree to which Forth is a paradigm unlike any other. It is also an example of a system that is minuscule by the standards of today, yet arguably more functional.
Moreover, Moore has made abundantly clear what he thinks of the state of technology and software in particular, stating:
“I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth.”
With regards to the issue of feature creep and the gradual tendency of software to expand, Knuth seems to have similarly strong opinions.
Knuth has stated that the development of the TeX typesetting system is complete. It is a finished piece of software. For all intents and purposes, there is nothing to be gained by developing it further — aside from making the odd and exceedingly rare bug fix when necessary. In this vein, TeX uses the mathematical constant Pi as a version number. Whenever there is a fix, an additional digit of pi is appended. At time of writing, the current version of TeX is 3.14159265. This position is anathema to contemporary market-driven software development, which is forever tinkering to make an excuse to sell the next version or maintain subscribers.
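Knuth's versioning scheme is simple enough to sketch in a couple of lines of Python (the helper name here is purely illustrative, not anything Knuth published):

```python
PI_DIGITS = "3.14159265358979"  # leading digits of pi

def tex_version(n_fixes):
    """TeX version string after n_fixes bug-fix releases since version 3.1:
    each fix appends one more digit of pi to the version string."""
    return PI_DIGITS[:3 + n_fixes]

print(tex_version(0))  # 3.1
print(tex_version(7))  # 3.14159265
```

The version number thus converges on pi itself — a deliberate signal that the software converges on a final, finished state rather than growing without bound.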
Contemporary systems are significantly unfriendly for a number of reasons:
- Bloat. Software is orders of magnitude larger than it needs to be; the results include:
- longer downloads, and longer installs and upgrades
- more expensive hardware is a necessity — which incurs an ethical cost because people who cannot afford the latest hardware are unnecessarily excluded
- longer learning time
- wasted energy — which in turn has economic and environmental impacts — computers are responsible for upwards of ten percent of the world's electricity usage according to one report (The Register, accessed 15 November 2020.)
- insecure — they are too complex to provide reasonable guarantees
- Restriction. Software restricts and curtails the freedom of users:
- proprietary software does so under the guise of a generally accepted though seldom examined economic paradigm, which is enforced by the State; users are not permitted to share software (which is fundamentally anti-community) and skilled users are often unable to implement necessary fixes or changes (the source is not available)
- free software does so despite good intentions by way of inexperience, innocence, naivete and unnecessary complexity — leading to bugs, poor documentation, poor accessibility to users and developers, configuration issues and an unthinking tendency to follow proprietary offerings rather than innovate
- Distraction. Various distractions range from annoying to catastrophic:
- software tends to be visually cluttered, which is an unnecessary distraction to the creative process
- many commercial programs make intrusive demands upon the user for commercial purposes
- regular interruptions are necessary to serve upgrades upon unreliable, insecure and needlessly evolving programs — sometimes the forced upgrades will render a device unusable
- Dysfunctional. Programs are often:
- changed so significantly that they become incompatible with prior versions and may break the workflow of the user — this is a particular problem in software development
- Complexity. Sometimes a system is so unnecessarily complex that only the most highly skilled individuals can develop, maintain or make desired changes to it — for example, Universal Serial Bus (USB.)
These characteristics are commonplace, unnecessary and disruptive.
Contemporary systems fail the freedom test and they fail the simplicity test. Without freedom and an eye toward simplicity, a system cannot be friendly.