The Worst Programming Language of All Time

You can argue that C++ shares this honor with the likes of JavaScript and TeX. Among them, only JavaScript managed to design itself out of the mess it was in during the early-to-mid 2000s. There are still ugly parts, but each new iteration actually improved the language as a whole, all while keeping backward compatibility. TeX is odd and idiosyncratic, but it's a “niche” language. And then there's C++ … which managed to become more and more of a mess the more they tried to “improve” it: making big blunders when designing features, failing to rectify them in a timely manner, and then, out of cowardice, leaving the “broken” features in the language to preserve backward compatibility. *sigh*

Here’s a great collection of grievances:

While many of the features are useful and necessary for a modern language, all the pieces are so shoddily Frankensteined together that it's hilarious.

Just the number of “separate” Turing-complete languages it contains is out of this world: C++ itself, its C subset, preprocessor macros, templates, exceptions, constexpr/consteval, coroutines. Each with its own syntax, semantics, inconsistencies, and footguns, and with no coherent design tying them together.
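To make that concrete, here's a small, contrived sketch: the same compile-time computation written in two of those sub-languages, with the preprocessor thrown in as a third dialect. Note how little the three share (compiles as C++17):

```cpp
#include <cstdio>

// 1) Template metaprogramming: recursion via specialization,
//    "values" live in static members, "pattern matching" is specialization.
template <unsigned N>
struct Factorial { static constexpr unsigned value = N * Factorial<N - 1>::value; };
template <>
struct Factorial<0> { static constexpr unsigned value = 1; };

// 2) constexpr: ordinary imperative syntax, evaluated at compile time.
constexpr unsigned factorial(unsigned n) {
    unsigned result = 1;
    for (unsigned i = 2; i <= n; ++i) result *= i;
    return result;
}

// 3) The preprocessor: purely textual semantics, no real recursion
//    without well-known trickery, so the computation is spelled out.
#define FACTORIAL_5 (5 * 4 * 3 * 2 * 1)

static_assert(Factorial<5>::value == 120);
static_assert(factorial(5) == 120);

int main() {
    std::printf("%u %u %d\n", Factorial<5>::value, factorial(5), FACTORIAL_5);
}
```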

And even after all that, it's still missing essential pieces for software development, like dependency and build management, which the specification doesn't even acknowledge as relevant. 🤯 This leads to weird edge cases like ODR violations or “ill-formed, no diagnostic required” (NDR) atrocities, which was summarized best in a CppCon talk:

This is a language which has false positives for the question “was this a program?”

What is C++ – Chandler Carruth, Titus Winters – CppCon 2019 at 13:23
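The ODR is a good example of such a false positive. A minimal sketch: two translation units that define the same inline function differently. Most toolchains will compile and link this without a peep, yet the program is ill-formed, no diagnostic required; the behavior depends on which definition the linker happens to keep:

```cpp
// a.cpp
#include <cstdio>
inline int answer() { return 42; }
void a() { std::printf("a: %d\n", answer()); }

// b.cpp -- same inline function, different body: an ODR violation.
// The program is ill-formed, no diagnostic required; a typical linker
// silently discards one of the two definitions.
#include <cstdio>
inline int answer() { return 7; }
void b() { std::printf("b: %d\n", answer()); }

// main.cpp
void a();
void b();
int main() { a(); b(); }  // likely prints the same number twice; which one is pot luck
```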

Simulating Statically Compiled Binaries in Glorified Tarballs

Containers won for one reason: they simulate a statically compiled binary that’s ergonomic for engineers and transparent to the application. A Docker image is a glorified tarball with metadata in a JSON document.

From Joseph’s comment on “Containers and giving up on expecting good software installation practices”

I hadn't thought of it that way, but from a developer's perspective it makes sense. It may be no coincidence that the new programming languages of the 2010s (e.g. Go, Rust, Zig) produce statically linked binaries by default.

I always thought of containers as a way to add standardized interfaces to an application/binary that can be configured in a common way (e.g. ports, data directories, configuration env vars, grouping and isolation). The only other ecosystem that does this, and maybe even goes a little further, is Nix.
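As a sketch of what that common interface looks like from inside the container: the application reads its configuration from the environment, and the runtime wires it up from the outside. The variable names here are made up for illustration, not any fixed standard:

```cpp
#include <cstdio>
#include <cstdlib>
#include <string>

// Return the value of an environment variable, or a fallback if unset.
static std::string env_or(const char* name, const char* fallback) {
    const char* v = std::getenv(name);
    return v ? v : fallback;
}

int main() {
    // PORT and DATA_DIR are hypothetical names; the container runtime
    // (docker run -e PORT=9000 ...) injects them, the app just reads them.
    std::string port     = env_or("PORT", "8080");
    std::string data_dir = env_or("DATA_DIR", "/var/lib/app");
    std::printf("listening on :%s, state in %s\n", port.c_str(), data_dir.c_str());
}
```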

Because the binary format itself is ossified and the ecosystem fragmented, we missed the train for advanced lifecycle hooks for applications (think multiple entry points for starting, pausing, resuming, stopping, reacting to events, etc., like on Android, iOS, or macOS) … on Linux this is again something that's bolted on from the outside, e.g. with D-Bus, systemd, or CRIU.

Gebrauchsanmaßung without Zueignungsabsicht

Lawyers really do have magnificent words 😂:

Zueignungsabsicht, f. (roughly: intent to appropriate)

The intent of a person to appropriate a thing for themselves, at least temporarily, while at the same time intending to permanently deprive the rightful owner of it (see theft, embezzlement).
Wikipedia

Gebrauchsanmaßung, f. (roughly: unauthorized use)

A temporary, unauthorized (and therefore unlawful) use of someone else's movable property, temporarily breaking the rightful owner's custody of it. Meaning: the thing is used without permission, but is returned to the rightful owner later.
Wikipedia

I stumbled across these words in a news article.

Those ones were the expensive headcount anyway

Ars Technica reports on a study that measured the productivity of software developers from different open-source projects working on a variety of (including non-coding) tasks, with and without AI coding tools.

In the comments there's a snarky summary of the article's main point:

“These factors lead the researchers to conclude that current AI coding tools may be particularly ill-suited to “settings with very high quality standards, or with many implicit requirements (e.g., relating to documentation, testing coverage, or linting/formatting) that take humans substantial time to learn.” While those factors may not apply in “many realistic, economically relevant settings” involving simpler code bases, they could limit the impact of AI tools in this study and similar real-world situations.”

So as long as I cull the experienced people and commit to lousy software, the glorious Age of AI will deliver productivity gains? Awesome, those ones were the expensive headcount!

Jon Stewart on Trevor Noah's What Now? Podcast

No one has discernment for what they aren't. […] You can't. It's the hardest thing in the world. It's hard enough to have empathy for what they aren't, let alone discernment. […]
Jon Stewart at 50:30

If we were more understanding of prejudice and stereotype and less tolerant of racism we’d understand that prejudice and stereotype are functions mostly of ignorance and of experience. Racism is malevolent, right? But the other is way more natural, but we react as though it would metastasize immediately. And so I think we throw out barriers to each other […] before we have to.
Jon Stewart at 56:00

Century-Scale Storage

What would you use to keep (digital) data safe for at least a hundred years? Maxwell Neely-Cohen looks at all the factors, possible technologies, and social and economic challenges you have to contend with if you intentionally want to store data for a century. He explicitly chose that time scale because it is at the edge of what a human can experience, yet beyond a single human's working life and beyond the lifetime of most companies or institutions. So the premise sets you up for a host of problems to be solved. He also analyses past and present strategies for recording and keeping data and evaluates their potential for keeping data safe at century scale.
It’s long, but worth it.