office notes

on ventures, computing and emerging technologies


Computing #1 - How much computing are we actually doing?
Are we genuinely more productive with each subsequent upgrade?
How much of the overhead is actually utilised meaningfully?
How much of it is simply bad practice?

Microsoft's Office suite is especially bulky.
Word uses about 2.1 GB on macOS.
Excel uses about 1.8 GB.
PowerPoint uses about 1.6 GB.
Are advanced users benefiting from all of that?
In comparison, LibreOffice, a free and open-source office suite, uses a total of only 800 MB for its three counterparts to Word, Excel and PowerPoint.

How does a messaging app like WeChat occupy 600 MB at first launch from a fresh installation?
In comparison, WhatsApp uses only 123 MB of storage.
What explains the gap?

Software, in its source form, is a collection of plain text files.
A Graphical User Interface does not have to be expensive.
It can be constructed efficiently from instructions in plain text files.
A Graphical User Interface would use more memory, but that does not explain the excessive storage.
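
As a small sketch of that point: the handful of plain-text lines below, written in Python with its built-in tkinter bindings, already produce a working graphical window. The program text itself is only a few hundred bytes.

    import tkinter as tk

    # A complete, if trivial, Graphical User Interface built from a few
    # plain-text instructions.
    root = tk.Tk()
    root.title("Hello")
    tk.Label(root, text="A GUI from a handful of plain-text instructions").pack(padx=20, pady=20)
    tk.Button(root, text="Quit", command=root.destroy).pack(pady=20)
    root.mainloop()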

At trade shows and industry events, a popular keynote chart is the one on the growth of user-generated data.
Presenters point out the rate of change and the slope of the curve.
They like to highlight that the amount of data is growing exponentially over time.

Is that growth as representative of genuine activity as presenters portray?
People are certainly spending more time online and doing more over the Web.
Are they doing that much more?

Or, is it because of the consumer migration toward videos with higher resolution?
An image at "4K" resolution has four times as many pixels as an image at "1080p" resolution.
A two-hour video at "4K" resolution is about 28 GB.
A similar video at "1080p" resolution may be 3 GB.
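
A rough back-of-the-envelope check of those figures, in Python. The bitrates are assumptions based on typical streaming rates, not numbers taken from any measurement here.

    # Pixel counts: "4K" UHD versus "1080p" Full HD.
    uhd_pixels = 3840 * 2160        # 8,294,400
    fhd_pixels = 1920 * 1080        # 2,073,600
    print(uhd_pixels / fhd_pixels)  # 4.0 -> four times as many pixels

    def video_size_gb(bitrate_mbps, hours):
        """Approximate file size in GB for a given average bitrate (assumed)."""
        bits = bitrate_mbps * 1_000_000 * hours * 3600
        return bits / 8 / 1_000_000_000

    print(video_size_gb(31, 2))    # ~27.9 GB at a plausible "4K" streaming bitrate
    print(video_size_gb(3.3, 2))   # ~3.0 GB at a plausible "1080p" streaming bitrate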

On top of that, there is the multiplier effect: the same clip is re-uploaded and re-streamed across aggregators like YouTube and social networks like TikTok.

So, how much of the growth in data is masked by the same high-resolution videos being streamed over and over?
How much computing are we actually doing, and how much of it truly helps us work better and live happier?

Computing #2 - Have we reached a plateau in computing?
The Graphical User Interface with a mouse was a breakthrough over the previous Command Line Interface with a keyboard.
It made computing accessible to office workers and middle-class families.
The world changed how it worked on a daily basis.

Breakthroughs in storage and integrated circuits removed design constraints.
Apple's expression of mobile computing became the dominant design.
It made computing convenient and more natural to the broader population.
The world changed how it lived on an hourly basis.

Yet, hasn't the role of computing remained the same?
Operating systems still wait for user commands.
If there is no command in the job queue, then operating systems simply idle.
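
That waiting posture can be sketched in a few lines of Python. This is a toy illustration of the idea, not how any particular operating system is implemented.

    import queue

    # The loop blocks until a command arrives and does nothing on its own.
    commands = queue.Queue()

    def event_loop():
        while True:
            command = commands.get()        # idles here until a command is queued
            if command == "quit":
                break
            print(f"executing: {command}")  # placeholder for real work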

Apps have features like "auto-complete", "auto-fill", and "auto-pilot", but none of them is fully automated.

Advanced users of Excel may record a Macro and script in Visual Basic for Applications.
System Administrators may write a script in Bash or Python, then run it on a schedule by editing the Cron Table (crontab).
Enterprise clients use Robotic Process Automation to automate repetitive tasks.
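
A hedged example of the second pattern: a small Python script that deletes old log files, run nightly by cron. The directory, retention period, and schedule below are illustrative assumptions, not details from any real deployment.

    #!/usr/bin/env python3
    # Prune logs older than RETENTION_DAYS.
    # A hypothetical crontab entry to run it at 02:00 every night:
    #   0 2 * * * /usr/bin/python3 /opt/scripts/cleanup_logs.py
    import time
    from pathlib import Path

    RETENTION_DAYS = 30
    LOG_DIR = Path("/var/log/myapp")   # hypothetical path

    def cleanup():
        cutoff = time.time() - RETENTION_DAYS * 86400
        for log_file in LOG_DIR.glob("*.log"):
            if log_file.stat().st_mtime < cutoff:
                log_file.unlink()

    if __name__ == "__main__":
        cleanup()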

Aren't these interactions eerily short, relatively mindless, and categorically transactional?
Are today's apps little more than glorified "If-Else" statements?

What is beyond "If-Else" statements?
Is it a "State Machine"?

If a "State Machine" knew about the user's context, would that allow tasks to be fully delegated with less user attention?
If so, wouldn't rule-based business apps like accounting benefit from a "State Machine"?
If so, why isn't the "State Machine" design pattern more prevalent beyond academia?
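
As a minimal sketch of what that could look like, the Python below models a hypothetical invoice workflow as an explicit transition table. The states and events are illustrative assumptions, not drawn from any real accounting product.

    # Valid behaviour lives in one table of (state, event) -> next state,
    # rather than in if-else checks scattered across the app.
    TRANSITIONS = {
        ("draft", "issue"): "issued",
        ("issued", "pay"): "paid",
        ("issued", "cancel"): "void",
        ("paid", "refund"): "refunded",
    }

    class Invoice:
        def __init__(self):
            self.state = "draft"

        def handle(self, event):
            next_state = TRANSITIONS.get((self.state, event))
            if next_state is None:
                raise ValueError(f"'{event}' is not allowed while '{self.state}'")
            self.state = next_state
            return self.state

    invoice = Invoice()
    invoice.handle("issue")   # draft -> issued
    invoice.handle("pay")     # issued -> paid

With context added to the key, say the user's role or the period being closed, the same table could decide more of the routine transitions without asking the user each time.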