Computing #1 - How much computing are we actually doing?
Are we genuinely more productive with each subsequent upgrade?
How much of the overhead is actually utilised meaningfully?
How much of it is simply bad practice?
Microsoft's office suite is an especially bloated example.
Word uses about 2.1 GB on macOS.
Excel uses about 1.8 GB.
PowerPoint uses about 1.6 GB.
Are advanced users benefiting from all of that?
In comparison, LibreOffice, a free and open-source office suite, uses only about 800 MB in total for its equivalents of Word, Excel and PowerPoint.
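For anyone who wants to check figures like these on their own machine, here is a minimal sketch that sums the on-disk size of each application bundle; it assumes the apps live in the default /Applications location on macOS:

```python
import os

def bundle_size_gb(path: str) -> float:
    """Sum the sizes of all regular files under an app bundle directory."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if not os.path.islink(fp):  # skip symlinks to avoid double-counting
                total += os.path.getsize(fp)
    return total / 1e9

# Assumption: default install paths on macOS.
for app in ("Microsoft Word", "Microsoft Excel",
            "Microsoft PowerPoint", "LibreOffice"):
    path = f"/Applications/{app}.app"
    if os.path.isdir(path):
        print(f"{app}: {bundle_size_gb(path):.1f} GB")
```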
How does a messaging app like WeChat occupy 600 MB on first launch of a fresh installation?
In comparison, WhatsApp uses only 123 MB of storage.
What explains the gap?
Software, at its source, is a collection of plain text files.
A Graphical User Interface does not have to be expensive.
It can be constructed efficiently from instructions in plain text files.
A Graphical User Interface would use more memory at runtime, but that does not explain the excessive storage.
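As a small illustration of that point, a complete working GUI fits in a handful of lines of plain text. This is a minimal sketch using Python's built-in Tk bindings; the toolkit itself weighs in at tens of megabytes, not gigabytes:

```python
import tkinter as tk

# An entire GUI application, defined as a few lines of plain text.
root = tk.Tk()
root.title("Hello")
tk.Label(root, text="A GUI need not be expensive.").pack(padx=20, pady=20)
tk.Button(root, text="Quit", command=root.destroy).pack(pady=(0, 20))
root.mainloop()
```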
At trade shows and industry events, a popular keynote chart is the growth curve of user-generated data.
Presenters point to the rate of change and the slope of the curve.
They like to highlight that the amount of data is growing exponentially over time.
But does that growth represent as much new activity as presenters suggest?
People are certainly spending more time online and doing more over the Web.
Are they doing that much more?
Or is much of it simply the consumer migration toward higher-resolution video?
An image at "4K" resolution has four times as many pixels as an image at "1080p" resolution.
A two-hour video at "4K" resolution is about 28 GB.
A similar video at "1080p" resolution may be 3 GB.
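The arithmetic behind those figures is simple. A sketch, where the bitrates are assumptions chosen to be consistent with the sizes above:

```python
# Pixel counts: 4K (3840x2160) vs 1080p (1920x1080)
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(pixels_4k / pixels_1080p)          # 4.0 -- four times as many pixels

# File size = bitrate * duration.
# Assumed bitrates: ~31 Mbit/s for 4K, ~3.3 Mbit/s for 1080p.
seconds = 2 * 3600                       # a two-hour video
size_4k_gb = 31e6 * seconds / 8 / 1e9    # ~27.9 GB
size_1080p_gb = 3.3e6 * seconds / 8 / 1e9  # ~3.0 GB
print(round(size_4k_gb, 1), round(size_1080p_gb, 1))
```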
On top of that, there is the multiplier effect enabled by aggregators like YouTube and social networks like TikTok.
So, how much of the growth in data is inflated by the same high-resolution videos being streamed over and over?
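A back-of-the-envelope calculation makes the scale of that multiplier concrete; the view count here is invented, purely for illustration:

```python
# One two-hour 4K video: stored once, streamed many times.
unique_gb = 28
views = 1_000_000          # assumed view count, for illustration only
streamed_gb = unique_gb * views
print(f"{streamed_gb / 1e6:,.0f} PB moved for {unique_gb} GB of unique content")
```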
How much computing are we actually doing, and how much of it genuinely helps us work better and live happier?