When we started using Rust, we fixed our lowest supported compiler version. A year and a half later, here's how that worked out for us (from my perspective, at least):


@minoru Lack of support for old Rust compilers is not due to lack of maturity. It’s a conscious decision by the Rust project. There are no plans to change it. Breaking away from C/C++ slow update cycles is seen as a success, AKA “Stability Without Stagnation”.

It’s the “evergreen” approach Chrome took. Chrome is a mature product, even though 6-week-old Chrome versions are unsupported.

@kornel The compiler and standard library aren't the problem. They do deliver on their motto, and they're totally fine IMHO.

The problem is with the crates ecosystem. It aggressively drops support for older compilers and stdlib, making it impossible to build stuff with even a few-months-old toolchain.

I'm not calling for deliberate stagnation; I just wish it was possible to *knowingly stagnate*. Right now, I'm literally forced not to.
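(A crate can at least make its chosen stagnation explicit. A minimal sketch, assuming a made-up crate name; note the `rust-version` manifest field only landed in Cargo later, in 1.56, so it postdates this thread:)

```shell
# Hypothetical crate for illustration. With the `rust-version` field,
# a crate declares its MSRV once, and an older toolchain gets a clear
# "requires rustc 1.46+" error instead of cryptic syntax failures
# deep inside a dependency.
mkdir -p msrv-demo
cat > msrv-demo/Cargo.toml <<'EOF'
[package]
name = "msrv-demo"
version = "0.1.0"
edition = "2018"
rust-version = "1.46"   # oldest compiler this crate promises to support
EOF
grep 'rust-version' msrv-demo/Cargo.toml
```

The field is a contract with users who build from source: bumping it becomes a visible, deliberate decision rather than an accident of pulling in a newer dependency.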

@minoru But that’s the point. Rust compiler release schedule drives the ecosystem compat here. Because the Rust project doesn’t support any non-latest compilers at all, most crates don’t support them either, and through network effects, everyone is forced to stop using old compilers.

@kornel That's optimizing for the developer at the expense of the users who build from source. I don't have a technical argument against this, but it feels plain wrong.

@minoru It is incompatible with distros that want to freeze compilers for years. I wish Rust provided official packages, because curl | sh has bad reputation, but technically nothing stops users from getting the latest compiler via rustup. It's easy and works well. Only problem is it goes against distros' policies.

@kornel Not just policies, it also goes against usability. With Rustup, ordinary users *triple* the number of package managers in their system. The "install these tools and libraries, then run ./configure && make && make install" approach was "stateless" in the sense that the user didn't have to maintain the setup -- their distro's package manager did that for them. Now all of a sudden users have to at least learn about `rustup update`, and maybe more if they're on one of the less-supported platforms.

@minoru True, that is messy. But if Rust provided deb and rpm, why not always use the latest?

@kornel A newer deb/rpm will fail to install on an older system because of an incompatible libc or something. Thus, users will try an older package. Thus, they'll get an older compiler. Thus, some of my app's deps might fail to build.

So no, I don't see how that solves the problem at all.

@minoru Rustup somehow manages to avoid this (with an old-enough glibc?). And you could always package a MUSL build in a deb/rpm.

@kornel > Rustup somehow manages to avoid this (with an old-enough glibc?)

You mean they—*theatrical gasp*—deliberately stagnate in order to provide compatibility? :)

> And you could always package a MUSL build in a deb/rpm.

You mean static linking? But then the packager has to update the package promptly when musl fixes something, and the users have to trust the packager to do this. Users take time to develop trust in their distros; third parties need to earn it too, but can they warrant the user's effort?

@minoru The line has to be drawn somewhere. Instead of individual libraries bending over backwards, rustc does. This way you can have an old CentOS and still use the latest Rust libs.

I mean static linking. You're supposed to update Rust every 6 weeks anyway. You'd make it an apt source and pull its updates along with the rest of your security fixes.
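(The static-musl route sketched here would look roughly like this, assuming rustup and cargo are already installed; it needs network access, so it is a command sketch rather than something runnable in isolation:)

```shell
# Sketch: build a fully static binary against musl so the resulting
# deb/rpm doesn't depend on the host's glibc version.
# Assumes rustup and cargo are installed; requires network access.
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
# The binary under target/x86_64-unknown-linux-musl/release/ carries
# no dynamic libc dependency; `ldd` reports it as statically linked.
```

The trade-off is exactly the one raised above: every bug fix in musl (or any statically linked dependency) obliges the packager to rebuild and re-ship promptly.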

@kornel Six weeks is quite a lot to re-build your software with an already patched library. Again, users will have to develop trust that this will happen promptly. Asking Newsboat users to build trust in another 3rd party might be a stretch. Distros exist precisely so that users can put their trust into a single entity (even if it's really a collective), not spread it over myriad sources.

@minoru I wish distros pulled rustc every 6 weeks.
In terms of trust, Rust from last month is no different from Rust from this month. I don't think distros review 10 million lines of code every time they package it.

@kornel I meant trust in integration and packaging. E.g. installing manpages, binaries etc. where they usually are on that particular distro, creating/updating any symlinks (e.g. update-alternatives on Debian), stuff like that. Third-party packages are always a bet, because outside maintainers don't necessarily have the training (like NMU process in Debian) to produce a high-quality package.

@kornel On a side note: that "deliberate stagnation" is a slippery slope, of course. The trick would be to decide how much stagnation is too much.

On a meta-level, I feel some irony re-reading my previous toot: at my previous job, I was advocating for *less* stagnation, and here I'm advocating for *more*. The difference is, at that job we were about 20 years behind current C++ practices, while here, I'm forced to stay within a few months of the latest and greatest.

@minoru Yes, Rust seems wicked reckless compared to C and C++. But OTOH it can run circles around them, e.g. async/await is perfectly usable already (we use it in production at Cloudflare), and it was only released in November! If Rust had Debian-like release "stability", people would wait until 2022 before even trying it. If it were C, we'd be looking at 2040 :)

I've experimented with old compiler support:
With a cargo index filtered down to only compatible crates, old compilers are usable, but everything is mostly frozen in time and in untested configurations.
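(Short of a filtered index, the "frozen in time" setup can be approximated by pinning exact dependency versions known to build on the old compiler. A sketch; the crate name and version below are made up for illustration:)

```shell
# Sketch: pin an exact dependency version so `cargo update` can't
# silently pull a release that requires a newer compiler.
mkdir -p frozen-demo
cat > frozen-demo/Cargo.toml <<'EOF'
[package]
name = "frozen-demo"
version = "0.1.0"
edition = "2018"

[dependencies]
# "=" requires exactly this version; a plain "1.2.3" would allow any
# semver-compatible upgrade, which may raise the required compiler.
some-lib = "=1.2.3"
EOF
grep 'some-lib' frozen-demo/Cargo.toml
```

For an already-broken lockfile, `cargo update -p <crate> --precise <version>` can step a single dependency back to a compatible release, but as noted above, the result is a configuration nobody else is testing.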

@kornel True.

But does a crate like `xdg` need async, or any of the other stuff that was added in the last couple of years?

(Don't mean to pick on `xdg`, they aren't doing anything outstandingly wrong -- they're just an example of common behaviour.)

@minoru I'd say nobody should be using Rust older than 1.31 at all. The older borrow checker was less accurate and much more annoying.

@kornel Well, yeah, but I believe a C or C++ programmer would expend some (small) effort on re-writing code in such a way that it's accepted by both versions of the checker. Simply because as a library author, they'll try to get themselves out of the way.

@minoru Yeah, but that's extra work with the explicit goal of helping others stagnate. It creates a vicious cycle: users are taught their compiler is for life, and library authors can't upgrade their code.

The Web has suffered from this. Updates were slow and manual. Devs had to support old IE, and IE users never felt a need to update.

The evergreen approach has shaken this up.

@kornel > Yeah, but that's an extra work with an explicit goal of helping others stagnate.

Exactly, and I don't think it's necessarily a bad thing. In fact, Firefox seems to embrace both: its train release model exemplifies the evergreen approach, while ESR provides some (deliberate, controlled) stagnation to those who need it. Ditto for LTS in other projects.

(Though of course FF is not liable if some site no longer works in ESR, so that might not be a good example.)

@kornel > The Web has suffered from this.

The Web, or at least the JS developers, also suffered from the "JS fatigue", where people got tired of everything moving all the time.

Kinda like in Carroll: It takes all the running you can do, to stay in place. To get somewhere, you need to run twice as fast.

Rustup, crater, and hours upon hours of core team work make this less of an issue with Rust, but the problem can't really be fixed as long as the pace is high.

@minoru I don't think JS fatigue is an apt analogy here. In JS the fashionable frameworks change, so your projects become obsolete before you ship them, and you need to learn a new one.

OTOH Rust releases are fully backwards-compatible, small incremental improvements. async/await was the only exception since 1.0. The 2018 edition was big, but it actually _reduced_ the amount of stuff devs need to know.

@kornel A user who updates Newsboat from source every three months has to update their Rust toolchain every time. To them, it looks like every time they look away, something changes. That's what reminded me of JS fatigue.

Still, that's not about developers. Those are well-off and taken care of in the Rust world, no complaints there :)

@minoru Yes, that is true. And I think it's just a perceptions/expectations problem. If you depended on newer libraries, few people would care. But a new compiler is suddenly exceptional. And it's not even a runtime dependency, or something incompatible like Python 3.

@kornel > If you depended on newer libraries, few people would care.

It depends on what you're developing, and how new is "newer". That goes back to my earlier toot: 20 years is too old, 6 weeks is too new.

> But a new compiler is suddenly exceptional.

At least in the context of Newsboat, it's not. It's right there among all the other tools we require for a build.

> And it's not even a runtime dependency

The problem is specifically about end users who build from source.

@minoru The article is based on the false premise that a user will build the latest newsboat on some ancient Debian, and that this should be supported. This is totally wrong. Either a stable distro ships an older version of newsboat and provides support for it, or the user needs a more recent distro for the latest software. Other cases defeat the purpose of a stable distro and make no sense, and no developer resources should be wasted supporting them.

@AMDmi3 It isn't an assumption - it's a fact, backed by evidence (i.e., questions about GCC 4.8 errors).

But yeah, discussion on Lobsters made me doubt if I should support that use-case.

Functional Café