The Slow Gulls

Ideas, texts and images between sky, earth and sea


Bloated: the evolution of computing resource consumption

2021-12-15

I have recently started a new paid position. The hiring process went by quickly and I had to work on a personal computer, the professional one being lost to the delivery gods. And so, I took out of storage my old study computer, léthée, which I bought 8 years prior (2014). She was top of the line for the time (and rather expensive), but I didn’t want to replace her after 3 years, as was customary at the time among my peers. A successful gamble, since she still is my only laptop to date.

For all the power she displayed at the time (Intel i5-4200U with 2 hyperthreaded cores at 2.6GHz, 8GB of DDR3 and Intel HD 4400 graphics), which I still consider pretty decent by today’s standards, I was rather disappointed by her in professional use.

Resource usage in a modern professional context🔗

I’m {dev,sys,sec}ops these days. These roles mostly imply the following tasks:

  1. Collaborate via tools like mail, instant messaging and VCS (and its interfaces such as GitLab);
  2. Edit code or text across a rather large panel of languages (php, hcl, yml, toml, (ba)sh, markdown, python, go, rust…);
  3. Write technical & user documentation;
  4. Launch commands & interpret their results;
  5. Keep an eye on my infrastructures via monitoring tools & dashboards.

With only what runs on léthée to display my desktop environment & provide what is expected of a standard computer, she already uses 1GB of RAM.

$ free -h
		total	used	free
Mem:	8Gi		1Gi 	7Gi

Collaborate🔗

For mails, my company uses GSuite, so GMail. The web interface is rather well done and the automatic sorting feature (only available on this frontend) is quite useful to increase the signal/noise ratio. This tab, opened with Firefox, eats up a bit more than 1GB of RAM.

$ free -h
		total	used	free
Mem:	8Gi		2.1Gi 	5.9Gi

For instant messaging, the choice has been made to use Slack. A web interface is available, as well as a desktop application. The app, based on Electron, which packages a complete Chromium engine, also eats up quite some RAM.

$ free -h
		total	used	free
Mem:	8Gi		3.3Gi 	4.7Gi

For VCS tooling, we use a mix of GitLab & CodeCommit, a tool from Amazon Web Services (AWS). GitLab is light compared to CodeCommit, which loads most of the JavaScript of the whole AWS console. GitLab takes about 700MB while CodeCommit uses around 900MB.

$ free -h
		total	used	free
Mem:	8Gi		4.9Gi 	3.1Gi

All of these tools are quite standard and need to be permanently active to fulfill their intended purpose. Collectively, they use about 3.9GB: that is half my total capacity, dedicated to collaboration tools alone.
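The per-app figures above come from watching what each process holds in resident memory. A minimal sketch for doing the same on any Linux machine with procps (the kind of `ps` léthée would have); note that RSS slightly overstates totals when processes share pages, so treat it as a first approximation:

```shell
# Show the five processes holding the most resident memory (RSS).
# ps reports RSS in kiB; awk converts it to MiB for readability.
ps -eo rss=,comm= --sort=-rss | head -n 5 \
    | awk '{ printf "%7.0f MiB  %s\n", $1 / 1024, $2 }'
```

On a desktop in the state described above, the browser and Electron processes reliably occupy the top of this list.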

Editing text files🔗

The main issue here is the diversity the editor must be able to handle.

Vim1 is the champion here. It uses few resources, is perfectly integrated with my usual work environment (a terminal emulator :D), and its plugin system lets us turn Vim into whatever we see fit, including an IDE. However, it’s not really efficient as an IDE as soon as we need to interface with the outside world, and I must constantly switch between the terminal, vim and an open shell. Still, I constantly have at least one instance open, but its impact is almost negligible.

The same can’t be said when not using Vim. I sometimes use VSCode, for instance. I have found it to be a good midpoint between vim & a full-on IDE, feature-wise. Very useful for editing a few project files with a minimum of context-relevant auto-completion & documentation. However, it asks for around 1GB.

$ free -h
		total	used	free
Mem:	8Gi		5.9Gi 	2.1Gi

A full-on IDE, however, rapidly uses about 1.5GB, but it is useful for important modifications to medium and large-sized projects.

$ free -h
		total	used	free
Mem:	8Gi		7.4Gi	600Mi

BUT! I also pretty frequently need to look up documentation or articles that aren’t integrated into the IDE. I do this on the web, as printing such ever-changing information on dead trees seems a waste of forest. So I frequently find myself with 3-4 extra Firefox tabs open.

$ free -h
		total	used	free
Mem:	8Gi		8.4Gi	-0.4Gi

Keeping an 👁 on infrastructures🔗

This is done via a web dashboard offering graphs, metrics & other goodies, typically rather heavy on the JavaScript side & frequently leaking memory.

$ free -h
		total	used	free
Mem:	8Gi		11.4Gi	-3.4Gi

Collaborate. Again.🔗

Somebody calls me via Slack or Google Meet! I then need to activate microphone and/or camera, maybe share my screen or look at one located hundreds of km away.

$ free -h
		total	used	free
Mem:	8Gi		12Gi	-4Gi

I’ve been asked to open a .docx. I must then launch LibreOffice only to look at the file’s contents.

$ free -h
		total	used	free
Mem:	8Gi		13.5Gi	-5.5Gi

Léthée hasn’t had any available RAM for quite a while now, and even with all the swap space in the world, she freezes for 30 seconds at each window switch.
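Those freezes can be pinned on swap thrashing without any extra tooling, by watching the kernel’s own swap counters (a Linux-only sketch; the counters are cumulative since boot, so what matters is whether they grow between two readings):

```shell
# pswpin / pswpout in /proc/vmstat count memory pages swapped in and
# out since boot. Read them twice a few seconds apart: if the numbers
# climb quickly, window switches are stalling on the swap device.
grep -E '^pswp(in|out) ' /proc/vmstat
```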

[Image: a computer]

My computer isn’t the issue🔗

No. Léthée, even if she’s not so young anymore, has more than enough RAM & CPU to run every feature described above, especially the collaboration features.

A mail client needs very few resources. It only needs to display a list with some meta-information, allow searching, and display mail contents one at a time. It would be even lighter if we didn’t have HTML emails.

None of this software needs that much RAM. Most of the issue is that half of it is built on a web stack, and thus needs, at a minimum, a web rendering engine to run. This is quite RAM-hungry, but also inefficient CPU-wise. Even when I ask them to do nothing, not even to display anything on screen, they each still idle at around 1% CPU.
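Those idle percentages can be spotted with plain `ps` (a hedged caveat: `pcpu` here is each process’s CPU use averaged over its lifetime, not an instantaneous sample, but that is enough to catch things that never go fully idle; the 0.5 threshold is arbitrary):

```shell
# List processes whose lifetime-average CPU use exceeds 0.5 %.
ps -eo pcpu=,comm= --sort=-pcpu \
    | awk '$1 > 0.5 { printf "%5.1f%%  %s\n", $1, $2 }'
```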

I remember that, once upon a time, what triggered the characteristic noise of a fan under stress was video games, back when the only reason to want a powerful rig was working with 3D workloads or video editing software, not looking at documents made of a variant of plain text.

A company development computer can’t do much without the web or browsers in today’s tooling. Google sensed this (and maybe influenced it?) some years back with its ChromeOS.

We have become incredibly dependent on a heavy set of specifications, requiring “war machines” (as I’ve seen gaming computers being called) to run them. The trend doesn’t seem to go towards alleviating these requirements2, and it is now complicated to comfortably use them on an expensive computer from only 8 years ago, while 10-year-old computers struggle with modern basic usage, even with lightweight Linux distributions.

Why are we here?🔗

I think there are many factors to take into account, one of them being, in my humble opinion, the quick rise in popularity of JavaScript. Suddenly, we found ourselves with lots of JavaScript developers, each contributing to a fragmented ecosystem, later unified by frameworks of increasing complexity. This is quite well described by Kev Quirk in his The Web Is Fucked. To quickly summarize: it’s the Capital’s fault. Capital wants to sell ads, and so needs increasingly fancy websites, oftentimes to the detriment of content.

This system has been nourished by companies’ need to port their webapps to a desktop environment without wanting to recruit system developers. It so happens that they already had plenty of JavaScript devs; how convenient.

But JavaScript and its developers are not to blame; this is just the latest result of decades of trading off performance for ease and speed of development. It’s all a matter of where we place the acceptance threshold between these two concepts, said to be opposed.

New features lead to new uses, themselves leading to new features, with standards & protocols evolving to support them, complexifying all systems and increasing their use of resources, which does have consequences in the physical world. We just tend to see computing the same way capitalism sees growth: eternal & infinite.

The growing complexity of systems creates an “expert caste” and, in doing so, excludes all other individuals from the capacity to appropriate and master them. I think such situations should be avoided, especially in communication technologies, as they reinforce the power of some over others.

Is another way possible?🔗

Conceptually: of course. Over the last couple of years, we have seen many new projects going in other directions, exploring what is possible.

Go and Rust, for instance, although fundamentally opposed in their approach to memory management, attempt to modernize (read: make more accessible & easy) system development. Rust has impressive performance, with the promise of memory safety.

More extreme, uxn is an artistic project by Hundred Rabbits, in the form of a constrained virtual machine that forces development with resource usage in mind. It has already gathered a surprising number of adepts and software.

In the development tooling category, sourcehut also seduces quite a number of people with its set of collaboration tools.

Gemini sits somewhere between Gopher and HTTP; it is light and puts content back at the center. I think it enables the transmission of knowledge or information without the interference of ad agencies (for now).

These projects are examples of a more general (but still rather small) desire for another computing model than what is currently in place. We may even boldly say: the yearning for another form of society, less consuming, more authentic (as in: out of the control of megacorporations) in what it proposes and in its essence.

But is this desire really shared by a sufficient part of the population? There’s already a divide between Facebook users and those who refuse or don’t understand its usage. Facebook’s users, for a large part, know that Facebook is ethically problematic. The same behaviour is found in spending habits.

I am intimately convinced that the models upon which we produce and consume will determine the future of our kind. I don’t think today’s trends go in the right direction, but we will soon see.

And I’ve already been wrong quite often.


  1. In reality, I use NeoVim. ↩

  2. «This was once revealed to me in a dream.» (The Divine and the Human, Nikolai Berdyaev, 1949) ↩