US Government is requesting information on adoption of memory safe programming languages

Like the title says, the US Government is requesting information on adoption of memory safe programming languages and open-source software security.

They’re currently taking comments until October 9th. I think this is a good opportunity to help bring Ada back into the spotlight.

https://www.federalregister.gov/documents/2023/08/10/2023-17239/request-for-information-on-open-source-software-security-areas-of-long-term-focus-and-prioritization


They have released the full report now.

Maya Posch also blogged about it here.


Thanks for the pointer to Maya’s blog.


Maya's post is also on Hacker News: The White House Memory Safety Appeal Is a Red Herring | Hacker News

There are other Ada posts of hers and others in Hackaday: Ada | Hackaday

Yeah, as usual, the HN thread is essentially the same as the comments on the original blog post: just complaints about Ada, or people saying the post has no merit.

The creator of C++ disagrees with the White House's report, but claims today's C++ is different from the old C++. I find it hilarious, since modern C++ is just the old one with heavy metaprogramming added to make it appear safer. It's bad enough that the language has a crisis where most of its programmers refuse to use it as more than a C compiler, but getting them to embrace modern C++ is a real shot in the dark.

C++ creator rebuts White House warning | InfoWorld


Maya also has a great talk on YouTube about moving from C++ to Ada.

I think it is more of a perspective difference. If you were writing C++ in the '90s, it was very, very different from the C++ of today, even before the metaprogramming. C++ has been doing its best to move away from the use of raw pointers in favor of RAII, along with a lot of updates to the standard library (std::array versus C-style arrays, for example). It upgraded enumerations and namespaces. It upgraded looping and type inference. It added lambda functions and language-defined thread support, reworked casting to be more constrained, etc.

If you compare today to C++11, though, the delta isn't as big, so the article may not hit as hard for people without experience of the old days of C++.

All of that said, Ada is still way ahead in my opinion; I just wanted to provide some perspective. C++ shares the biggest issue that Ada has, and that is it really, really wants to maintain backwards compatibility, so it leaves all the old stuff in. This hurts C++ more than Ada, because Ada is safer by default, but I do feel this mentality hurts both languages. I like the approach Rust takes with different editions of the language, though I understand how that can be harder to maintain.


What would you take out?

I actually think Rust's approach could be detrimental. One great thing about Ada is that you can pick up any book from the last 40 years and the majority of it is still of value.

There's a lot of stuff in the containers that predates modern iteration (mostly cursor-focused operations); a lot of that could be pruned. Interfaces.C.* has nearly three different ways to make pointers to C strings; some of that could be pruned or simplified. In general, something like 30% of the ARG discussions I've read on new features get comments about how they can't add or change something for the better because it would break some code written in Ada 83 or Ada 95. There are things like not being able to add by-reference indexing to the Unbounded and Bounded string types (My_Unbounded_Str (2) := 'c', for example). They can't update the access-type rules easily because of all the old rules. The standard string types are fairly disjointed and a bit of a mess to move between if you need to worry about inputs of different formats.
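
To make that concrete, here's a minimal sketch using only the standard Interfaces.C.Strings and Ada.Strings.Unbounded packages (the procedure name is just for illustration):

```ada
with Interfaces.C;          use Interfaces.C;
with Interfaces.C.Strings;  use Interfaces.C.Strings;
with Ada.Strings.Unbounded; use Ada.Strings.Unbounded;

procedure Legacy_Cruft is
   --  Two of the overlapping ways to hand a string to C:
   As_Array : constant char_array := To_C ("hello");  --  flat array, nul appended
   As_Ptr   : chars_ptr := New_String ("hello");      --  heap-allocated pointer
   --  (Interfaces.C.Strings.char_array_access is yet another variant)

   S : Unbounded_String := To_Unbounded_String ("abc");
begin
   --  S (2) := 'c'; is illegal today: Unbounded_String has no
   --  by-reference indexing, so a procedure call is required instead:
   Replace_Element (S, 2, 'c');

   Free (As_Ptr);  --  and chars_ptr needs manual deallocation
end Legacy_Cruft;
```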

I'm not saying Ada is bad, by the way; far from it. Just that a lot of old code and rules that don't make as much sense in today's context of the language definitely hinder language improvements. It doesn't outright stop them, but it makes language development much more constrained and tougher.

As far as the impact on Rust goes, so far their edition approach has been met with overwhelmingly positive results. They have been doing it for six years now, and most folks have found it to be better. Sure, it has negatives (and I am sure you can find some folks who don't like it), but when weighing something like this you have to weigh both sides, and in their case the community found the approach to be a large net positive for the language. Before they did the different editions, they really struggled to update the language while maintaining backwards compatibility. Now they can add new features while allowing some people to stay with the edition they have, all while still getting security updates.


I have only used the standard containers once, in a desktop tool, as they require significant runtime support. Yet the Ada 83 Booch components didn't.

Strings get a bad rap, and I can understand the desire for a simple convert-to-UTF-8-and-back (e.g. Go, Rust?) or UTF-16-and-back in the case of Java/JS/Dart/Windows. However, Ada's string support, whilst a learning curve, is more powerful than anything I have seen before, handling embedded ASCII, UTF-16, and UTF-8 without conversion.
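
A minimal sketch of what that looks like with the standard Ada.Strings.UTF_Encoding packages (the procedure and object names here are just for illustration):

```ada
with Ada.Strings.UTF_Encoding;  use Ada.Strings.UTF_Encoding;
with Ada.Strings.UTF_Encoding.Wide_Strings;

procedure Mixed_Encodings is
   package WS renames Ada.Strings.UTF_Encoding.Wide_Strings;

   E_Acute : constant Wide_Character := Wide_Character'Val (16#00E9#);  --  'é'

   Plain : constant String       := "plain ASCII";                --  8-bit Latin-1
   W16   : constant Wide_String  := "caf" & E_Acute;              --  16-bit elements
   U8    : constant UTF_8_String := WS.Encode ("caf" & E_Acute);  --  raw UTF-8 bytes
begin
   --  Each value keeps its own representation; decoding happens only
   --  at the point where you explicitly ask for it:
   pragma Assert (WS.Decode (U8) = W16);
   null;
end Mixed_Encodings;
```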

I'm not quite sure how Rust editions work. However, I do know Rust is completely different from its first design, and many backwards-incompatible changes would seem problematic for a learner in the future. E.g., I have five books and I didn't realise they're all on different editions.

I would prefer specifics, as it is also possible that changes would be detrimental. I'm not absolutely sure that all changes since Ada 83 have been positive. Certainly more effort went into getting Ada 83 right than any GitHub discussion is likely to achieve.

Rust changed a lot until 2015. The very first edition of Rust (the original original edition) was completely different from what existed at that point. After 2015 they started looking into managing updates to the language as editions. In particular, they wanted to really overhaul the borrow checker, but doing so would require a lot of breaking of backwards compatibility. There was a lot of Rust code at that point, so they basically had compilers provide a switch (or a setting in the Cargo.toml file) that allowed users to select whether they were using the 2015 edition (old borrow checker) or the new 2018 edition (an upgraded borrow checker that was much easier to use). This allowed folks who couldn't afford the changes needed to upgrade to 2018 to still run Rust just as they always had, with some new features and up-to-date security fixes. If they wanted the new features that weren't compatible, they had to work towards porting to the new edition, but they didn't have to if they didn't want to.

And they ensure that the edition decision is made on a per-crate basis and that crates of different editions can interoperate with each other. It's really pretty slick.

It puts a lot more of the burden on the language designers and compiler writers, but it worked out really well for Rust. And the compiler writers were surprisingly open to this, so there wasn't much pushback.

It lets them do things like introduce new keywords without breaking old code (like "async" and "await").

This brings me back to what I mentioned about a modern runtime.

Also, there are issues with initialising data at compile time. E.g., I recently mentioned having strings be both C and Ada, so a string has a dope vector at the start and a nul at the end; that way, copying strings back and forth between C and Ada is unnecessary. To_C, for example, cannot be made static.
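
A rough sketch of the idea (the dual layout here is a hand-rolled convention, not a language feature):

```ada
with Interfaces.C; use Interfaces.C;

package Dual_Strings is
   --  To_C is an ordinary function, so this object is built at
   --  elaboration time rather than being folded into static data:
   Greeting_C : constant char_array := To_C ("hello");

   --  Hand-rolled version of the idea above: the object keeps Ada's
   --  bounds (the "dope"), and the trailing NUL makes the same bytes
   --  usable as a C string, so no copy is needed in either direction.
   Greeting : aliased constant String := "hello" & Character'Val (0);
   --  Ada side: Greeting (Greeting'First .. Greeting'Last - 1)
   --  C side:   pass Greeting'Address as a char*
end Dual_Strings;
```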

Yes, but I have read that one reason Ada lost a lot of support in the mid- to late-90s was that Ada 95 broke backwards compatibility in some ways that were too much for some people. I don’t know this, and I don’t remember where I read it; I’ve also read that Ada took a huge hit with the rise of Java.

But it does amaze me that, yes, Rust has been pretty bold about breaking backwards compatibility from time to time (e.g., dropping libgreen) and yet it still grows strong. What on earth did Ada do wrong? The software industry can't really be so simple-minded that it's a matter of preferring { and } to begin and end.

(Oh, FWIW I had to explain to colleagues the differences in what C++ means by & and && the other day. They were surprised to learn that C++ can have move semantics.)


I don't recall anything groundbreaking back then. The hardest part was new keywords, but it was pretty trivial to work around those using find-and-replace in our editors. Here's a list of the incompatibilities:

The biggest reason Ada fell out of favor for us was compiler pricing. It just wasn't sustainable long term. Compilers were extremely expensive back then, and there weren't a lot of options for the platforms we targeted. I always liked the language (it's still my favorite to date), but a lot of companies could not afford to pay for it.


I know I'll sound somewhat crazy, but I really think reworking controlled types into interfaces would be nice. The compiler could detect whether a type inherits from a Controlled interface and build the scaffolding needed internally to the type. They could probably base the scaffolding on the existing controlled abstract types.
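
Roughly what I mean, with today's required form first and the interface spelling sketched in a comment (that syntax is hypothetical, not real Ada):

```ada
with Ada.Finalization;

package Resources is
   --  Today: finalization means deriving from the Controlled *type*,
   --  which forces Handle into that particular tagged hierarchy.
   type Handle is new Ada.Finalization.Controlled with record
      Fd : Integer := -1;
   end record;

   overriding procedure Initialize (H : in out Handle) is null;  --  acquire here
   overriding procedure Finalize   (H : in out Handle) is null;  --  release here

   --  The suggestion: a Controlled *interface* instead, so that
   --  finalization no longer dictates the parent type. Hypothetical
   --  syntax, not real Ada:
   --
   --     type Handle is new Base_Type and Controlled with ...
end Resources;
```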

Thanks; I've long wondered if that had something to do with it. Ada compilers would have needed certification, right? That required passing the ACATS, which required money, if only because of the time involved. Whereas Microsoft could basically attempt to steal Java from Sun, and after losing the subsequent lawsuit they still managed to get the .NET runtime and C# out of it.

I don't remember when I first tried GNAT; I think it was on the Amiga, sometime in late '93 or '94. I found it on Aminet and downloaded it, but I didn't have an Ada book handy at the time and couldn't figure out how to make it work. I was very much into Modula-2, so I stuck with that – which illustrates a problem Ada has had at least since the DoD mandate ended.

I imagine another issue was that if some people wanted to use C and some wanted Ada, but Ada came with a cost, then Ada would often lose, especially at a time when security wasn't even a consideration for the vast majority. I guess security considerations only really started with Windows 2000 being built on Windows NT.

Apparently Ichbiah left because he preferred a class keyword for OOP instead of tagged types.