This is wonderful. I like it so much I would nominate the author of this paper to be Ada’s BDFL. Of all the ways to program, I miss the functional style and higher-order functions the most in Ada.
I am slightly disappointed they didn’t mention the proposal for solving the issue (Abstracting TYPE and INTERFACE) that I put together — note that no syntax is proposed there, because I want the notion ironed out before bikeshedding; but the idea is that there are two things we need for Ada:
A method to “turn the ability (ala generics) to statically construct types by parts and properties inside-out”, thus allowing us to “hang” proofs on an abstraction, then marry/integrate that into the type (completely absent tagged-typing/OOP).
A method to describe the “how” of interfacing/interacting with a type, complete with attributes and properties, thus allowing a generalization of things like indexing. (e.g. an access’s auto-dereferencing is exactly indexing by a zero-length list of indices; completely consistent with how Ada reduces “function K() return Data” to “function K return Data”.)
These two features would allow for the materialization of the notional type-hierarchy (see here), as well as clearing most of the clutter of SPARK-proof and/or user-defined indexing… in fact, given the abstraction-on-interface (of which indexing is a subcomponent), it would allow us to drop the currently-needed auxiliary types used in user-defined indexing.
Judging by Lucretia’s input on the proposal page, this reduces to having a way to construct (or even compose) types by certain traits, correct?
You’d want to define a set of type traits (constraints, really) and then compose your type hierarchy from those traits and of course use them as formal parameters to functions, for generics, and proofs. This is somewhat achievable with contracts but contracts are not objects - they cannot be packed behind an identifier and reused, can they? They’re not interfaces, just annotations.
This idea has been implemented rigorously in Haskell’s type classes and Rust’s type traits and I have exclusively heard praise for those. I think they have particular appeal for those with mathematical disposition.
To pair that with Ada’s already phenomenal type system would be wonderful, but this would be an advanced feature, wouldn’t it?
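For concreteness, here is a hedged sketch of how close today’s Ada can already come: a “trait” can be approximated by a generic signature package that bundles a type with its required operations, which algorithms then demand as a formal package. All names here (Monoid_Trait, Sum, Zero) are illustrative, not from any existing library:

```ada
--  Illustrative sketch: a generic signature package acting as a "trait".
generic
   type T is private;
   with function "+" (L, R : T) return T is <>;
   with function Zero return T is <>;
package Monoid_Trait is end Monoid_Trait;

--  An algorithm demands the trait via a formal package parameter:
generic
   with package M is new Monoid_Trait (<>);
   type Index is (<>);
   type Vec is array (Index range <>) of M.T;
function Sum (V : Vec) return M.T;

function Sum (V : Vec) return M.T is
   Result : M.T := M.Zero;
begin
   for E of V loop
      Result := M."+" (Result, E);
   end loop;
   return Result;
end Sum;
```

The limitation the thread is circling is exactly what this sketch shows: the “trait” exists only at instantiation time, cannot be composed with other traits into one identifier, and has no class-wide/run-time counterpart.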
I am biased but I think importing some of the functional idioms (a general tendency in language design nowadays that’s actually good and sensible because it can be used to reduce complexity) would be more desirable from the perspective of an average engineer.
Why on earth would contracts need to be objects, except within the compiler?
Of course, contracts can be reused without making them run-time objects. Interface inheritance is the vehicle for that.
An interface is a contract that an implementation of it must fulfill.
A contract is not the program and shall have no run-time effect whatsoever. Conflating the meta-language of contracts with the object language of the program is a road to nowhere.
I am not thinking of type traits occupying space in memory or about constructing them and passing them around to functions as arguments.
Perhaps I shouldn’t have mentioned “objects” in my message above. Sorry for the confusion.
On the contrary, I’m thinking solely about composing types by including their capabilities in the program text and that being completely static with no runtime effect whatsoever.
If such type composability can be done with Ada interfaces, I admit that’s new to me.
This seems to be the consensus judging by the general lack of functional influences in Ada.
So, it is about type algebra. Then I do not see how this is related to functional programming, which basically strives to remain un- or ad-hoc-typed.
If you decompose the problem into functions, you naturally throw types out. Ada programming decomposes the problem into abstract user-defined types. Functions are secondary as they apply to types. Functional and relational hold types secondary. You cannot have both. These are different paradigms for a reason. Poisoning Ada with functional mess would produce nothing but more mess.
Inheritance/derivation is the basic type-algebraic operation. In my view, the way forward must be to replace as many as possible of the built-in types’ magical operations, like:
type T is array (…) of …;
with universal inheritance, e.g. in this case inheritance from some array interface.
It is not bound to FP - it was the Mosteo & Lorente (2021) paper that was about the functional-style operations on containers.
The topic of types came from the message by @OneWingedShark, or rather from my cursory reading of it and the related proposal on Github.
Well, ML and Haskell are statically typed and allow type annotations. Haskell has type traits. I’m not sure if this can be seen as throwing the types out.
I think you’re right - Ada is an imperative language and it was not designed for functional programming as exhibited in the paper (those right-hand side examples of filter/take/map/fold/etc.).
This may be a lot to ask, but do you have any code that does this thing that a relative newcomer like me could read and learn from? I want to see how it’s done in idiomatic Ada.
Ada cannot do this. You cannot have a class of array types. As I said, operations producing array, record, access, scalar, and task types were hard-wired in Ada 83. Ada 95 added tagged types, orthogonal to everything else. They use a very limited record-extension model unsuitable for scalar types.
After that, various hacks were applied to avoid solving pressing type-system problems. The container library, unbounded strings, and protected objects all fall outside the type system. E.g. neither unbounded strings nor vectors are arrays.
Parallel to that generics were intensively misused to create some resemblance of classes, but since generics are static, they lack run-time instances, which produces massive overdesign and incredible kludges.
Then there were helper types, in the worst C++ fashion.
I’ve not really seen a good example of a “traits system” in my programming-language research; given the intuitive leap in how it’s described, it could be exactly what’s being proposed, but I hesitate because, as I said, I don’t have any real experience with a “traits system”. (C++ concepts, for example, are claimed to be equivalent to Ada’s contracts+generics, but the proposed feature seemed like a pile of confusing and incoherent mess to me when I read up on it.)
There are two features in view here: one is essentially the computer-science Abstract Data Type, where you can lay out the properties, perhaps some [sub]structure, and from this construct hang the proofs/annotations for SPARK — this is to (a) clean up SPARK code and improve its readability, and (b) provide a form of “parameterization” and “early+late-binding” of proofs on the code [because you can separate the proof onto the abstraction, and the implementation of the abstraction therefore fulfils the proof]. — But they don’t have to be “objects” at all, in the OOP sense; they would have to be an “object” in the compiler/language sense (like a variable is).
The other feature being the abstraction of the interface, the “how this is handled/interacted-with in the language”, of the type, things like indexing, dereferencing, attributes, properties. Indexing is the most obvious example, but also things like dereferencing, and (things like) limited/non-limited and constrained/unconstrained. (Honestly, being able to reify the conceptual-type hexarchy in that picture would go a LONG way to making Ada more friendly for newcomers: “Just look in Ada.Meta.Type_System, you’ll see all the various classes of types, and their attributes.”)
Indeed-so.
It’s why I was particularly disgusted with the proposed/GNAT-experimental class-syntax — IMO, it’s a stupid, “let’s be more like python and C++!” feature… when we could literally be solving the entire problem, unifying, and simplifying the language. (One of the benefits of the abstraction-of-interfaces/type-classes/whatever is that you can then use that to define the types, and model-check/prove them, hoisting the LRM definitions into the LRM’s own constructs. [i.e. allowing SPARK-style proof on metasystem, and therefore on the type-system].)
Exactly the reason I want to do so, “on a meta-level”… we can then use the language to define the language; which (done correctly) will make things easier both for compiler-writers and language-users/-learners.
I have mixed feelings there: I would be in agreement if we were to import enough “functional” to allow subprograms-as-parameters, without having [typically anonymous] access types — in fact, there would be a LOT of clean-up if anonymous access were removed altogether.
I think that with a LOT of what you would want functional-programming for, you could get with the abstract-interface I proposed and/or generics. The first link in my first reply shows some of the difficulties, though it also seems unaware of generics.
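To illustrate what I mean by “enough functional without anonymous access”: today a subprogram can already be passed as a generic formal parameter, with no access-to-subprogram anywhere. A hedged sketch (all names illustrative):

```ada
--  Passing a subprogram without any access type: a generic formal
--  subprogram plays the role of the higher-order parameter.
generic
   type Element is private;
   type Index is (<>);
   type Vector is array (Index range <>) of Element;
   with procedure Apply (Item : in out Element);
procedure For_Each (V : in out Vector);

procedure For_Each (V : in out Vector) is
begin
   for E of V loop
      Apply (E);
   end loop;
end For_Each;

--  Usage sketch:
--  procedure Double (X : in out Integer) is begin X := X * 2; end Double;
--  procedure Double_All is new
--    For_Each (Integer, Positive, Int_Vector, Double);
```

The cost, of course, is an instantiation per subprogram rather than a plain run-time parameter — which is exactly the trade-off being debated here.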
Not necessarily; check out How to Think about Parallel Programming: Not! — pay particular attention to the properties mentioned at 57:44, and also the example split-string example at about 40:00.
Even in 1995 when I was at uni, the staff there were pushing functional languages, which they were shit at teaching btw, and with more and more universities ONLY teaching more functional languages, more and more people expect those facilities.
And Ada IS a general purpose programming language, no matter what AdaCore want to push.
If there’s one thing I’ve learned from this thread so far, it’s that there are differing opinions about what could be added or changed in Ada, but all of you want the language to improve.
Alright, I see your point. I cannot judge it because I don’t know Ada as well as you do, but it sounds plausible to me.
Basically to my “it would be nice to have some convenience of functional programming”, you’re saying “no, you don’t know the kind of mess that’s underneath”.
Likely it couldn’t have been avoided, given how, from the Ada 83 perspective, even OOP is a foreign influence.
I’m going to try to wrap my head around your wider message to the extent my still-too-limited Ada knowledge allows, but the allure of this part is hard to dismiss. I imagine looking at such code would make prospective users realise the amazing things you can do, and once you know the type system, you can get so much done.
Ada already gives a good first impression - the type system is so good - but then you get to the “exposition dump” of OOP, packages and generics. I think it’s telling that new languages seem to stay away from supporting OOP and generics at the same time.
Yet, going back to earth, I can’t say if these are doable or feasible because the language is what it is. There’s no reason for me to believe that the custodians of Ada don’t have its best interest in mind, but the “fix past mistake, remove bloat, add cool features” button has not yet been invented as far as I know.
The evolution of Java, C#, C++, and even Python (!), seems to confirm the presence of those expectations. I know what it looks like in C++ and Python, and I’m unconvinced about the added value. I like a lot about it, but in languages that weren’t designed with FP inventions in mind it’s a mere shadow of the real thing. I expect it could feel similar in Ada, but then again, Ada also wasn’t designed with OOP in mind and that didn’t prevent it from following the market in 1995. How well that served Ada is up for discussion.
Ada did generics the right way: they are not textual-substitution templates but actual constructs, and the formal parameters are the exposing of the type-system properties to the body of the generic. (Some of the particular syntax in generic formal parameters is a bit messy, but only in relatively minor ways.) — As an example, consider type X (<>) is limited private; and type Y is (<>); and type Z (<>) is new Parent;: these are, respectively, “any type”, a discrete/enumeration type, and a type derived from Parent.
As you can see, this presents to the implementation the information that it can rely on, while at the same time presenting to the instantiation the requirements for the parameters, too. — And Z (OOP) is a relative non-issue for generics.
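A small hedged sketch of that dual role — the formal both constrains the actual and licenses the body (Count and the instantiation names are illustrative):

```ada
--  "type Y is (<>);" requires any actual to be discrete, and in
--  exchange lets the body use 'First, 'Last, 'Pos, and so on.
generic
   type Y is (<>);
function Count return Natural;

function Count return Natural is
begin
   return Y'Pos (Y'Last) - Y'Pos (Y'First) + 1;
end Count;

--  Legal:   function Char_Count is new Count (Character);
--  Legal:   function Bool_Count is new Count (Boolean);
--  Illegal: function Real_Count is new Count (Float);  -- not discrete
```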
Right, the type system is excellent. I believe the generic system is excellent as well, though there are a few things that would make it “more usable”, and @dmitry-kazakov would argue that generics are not good due to some previous generic-heavy code and GNAT’s method of handling them.
There are the “Three Papers” that I wrote explaining Ada’s features, which may help you:
I am on the ARG, and IMO the user-defined indexing is a mistake as evidenced by how awkward it is to implement, requiring helper-types and otherwise extraneous code cluttering everything up — that’s why I really want to do the abstract-the-interface and Annex J (Obsolescent features) the current aspect-based user-indexing system.
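To show the awkwardness concretely, here is a hedged sketch of the auxiliary machinery the current aspect-based mechanism demands (Box, Ref, and Reference are illustrative names; the pattern mirrors the standard containers):

```ada
--  The auxiliary "reference type" that today's user-defined
--  indexing requires, just so clients may write  B (I) := 42;
package Boxes is
   type Box is tagged private
     with Variable_Indexing => Ref;

   --  Helper type, needed only to carry the access value:
   type Reference (Element : not null access Integer) is limited private
     with Implicit_Dereference => Element;

   function Ref (B : aliased in out Box; I : Positive) return Reference;
private
   type Int_Array is array (1 .. 10) of aliased Integer;
   type Box is tagged record
      Data : Int_Array := (others => 0);
   end record;
   type Reference (Element : not null access Integer) is
     limited null record;
end Boxes;
--  (Body of Ref returns (Element => B.Data (I)'Access).)
```

Everything except Box itself is scaffolding — which is exactly the clutter that abstracting the interface would let us move to Annex J.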
SPARK proof is an amazing tool, and I’m quite impressed with it, but a lot of SPARK proven code has a LOT of clutter for the various aspects needed for the prover; and a LOT of these are duplicated-in-large-parts for something like a CONTAINERS library or, indeed any library which has a collection of substantially-similar abstract-data-types. Hence why I want the second feature of abstracting the type-properties out (in a way that is not necessarily/connected-to OOP).
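A hedged sketch of that per-operation annotation load (Stacks and its operations are illustrative, but the aspects are standard SPARK):

```ada
--  Every similar ADT in a containers-style library repeats
--  substantially the same Pre/Post scaffolding:
package Stacks with SPARK_Mode is
   type Stack is private;

   function Is_Full  (S : Stack) return Boolean;
   function Is_Empty (S : Stack) return Boolean;

   procedure Push (S : in out Stack; X : Integer)
     with Pre  => not Is_Full (S),
          Post => not Is_Empty (S);

   procedure Pop (S : in out Stack; X : out Integer)
     with Pre  => not Is_Empty (S),
          Post => not Is_Full (S);
private
   type Int_Array is array (1 .. 100) of Integer;
   type Stack is record
      Top  : Natural := 0;
      Data : Int_Array := (others => 0);
   end record;
end Stacks;
```

With proofs hung on an abstraction instead, the bounded stack, the unbounded stack, the ring buffer, etc., could each inherit this contract rather than restate it.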
I highly recommend you take a look at the OOP paper, definitely reread the Packages paper first, and then I’m sure you’ll be a bit more reserved in saying Ada wasn’t designed with OOP in mind: the four pillars of OOP were already present in Ada83, but as orthogonal features.
Thank you. I had no idea these existed, that’s very helpful. I prefer this form of presentation to what’s offered on AdaCore’s Learn platform which had too low information density and annoyed me with its fragmentation when I last tried it.
Yes, like many things in Ada, their design was literally ahead of its time. For instance, after Alex Stepanov and Dave Musser failed to build their generic library in Ada, Stepanov moved on to C++ and tried to convince Bjarne Stroustrup to base C++ templates on Ada generics. [1]
Alex then worked at HP Labs but he had earlier worked for a couple of years at Bell Labs, where he had been close to Andrew Koenig and where I had discussed library design and template mechanisms with him. He had inspired me to work harder on generality and efficiency of some of the template mechanisms, but fortunately he failed to convince me to make templates more like Ada generics. Had he succeeded, he wouldn’t have been able to design and implement the STL!
– Bjarne Stroustrup [1] 4.1.2 The STL emerges
The Scheme work led to a grant to produce a generic library in Ada. Dave Musser and I produced a generic library that dealt with linked structures. My attempts to implement algorithms that work on any sequential structure (both lists and arrays) failed because of the state of Ada compilers at the time. I had equivalences to many STL algorithms, but could not compile them. Based on this work, Dave Musser and I published a paper where we introduced the notion of generic programming insisting on deriving abstraction from useful efficient algorithms. The most important thing I learned from Ada was the value of static typing as a design tool. Bjarne Stroustrup had learned the same lesson from Simula.
– Alex Stepanov [1] 4.1.8 Stepanov’s view
Please note also the mention of Simula. I remember that the Ada 83 Rationale mentions it too, and that works in favour of your claim that OOP was very much on the horizon.
Is the argument here that some of these advanced features have been implemented using “hacks” on top of what was available (such as being tied to OOP facilities) and not as general features of the language? The effect is they’re complicated, impure, and inelegant, right? I’m lenient on that because that’s 40 years of nature at work, but I agree that such things are confusing. For example, I was scratching my head when I read about the Ada 2022 reduction expressions – A'Reduce ("+", 0) (an FP influence btw) – because I would expect this to simply be a higher-order function in the standard library. From the outside, it looks like some kind of function-call syntax, and you wonder what else – like Map/Transform/Filter/My_Arbitrary_Func – can be put there instead. It turns out it’s just a singular thing - not very cool! I’m sure there are reasons for that and it’s a useful thing to have, but it’s confusing.
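For reference, a minimal sketch of what the attribute does accept (assumes a compiler in Ada 2022 mode; Reduce_Demo is an illustrative name):

```ada
--  Ada 2022 reduction expressions: the reducer must be a named
--  subprogram ("+"), not an arbitrary expression.
with Ada.Text_IO;
procedure Reduce_Demo is
   A : constant array (1 .. 5) of Integer := (1, 2, 3, 4, 5);

   Sum : constant Integer := A'Reduce ("+", 0);   --  15

   --  A value sequence fuses the "map" step into the prefix:
   Sum_Squares : constant Integer :=
     [for X of A => X * X]'Reduce ("+", 0);       --  55
begin
   Ada.Text_IO.Put_Line (Sum'Image & Sum_Squares'Image);
end Reduce_Demo;
```

So the “map” half does exist, but only as the iterated prefix — there is indeed no slot for a user-supplied Filter or Transform in the attribute itself.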
Well, my primary consideration would always be the reduction of complexity and ease-of-expression which improves readability and comprehension. These aims can only be achieved without introducing foreign syntax or complicating the existing one (like multiplying ways to do things!). Secondary perhaps is compiler and implementation hygiene.
So my stance from the beginning is that many functional idioms are a strong drug, but if they were to pollute a language, they’re not worth it.
It was. Ada was object-based. The type system supported derivation and inheritance but lacked run-time class-wide objects. So generic programming was not possible. GP means “programming in terms of sets of types.”
In Ada 83 GP was achieved per generics, which then seemed a good compromise because macros were fashionable and considered a good idea. C had a preprocessor, PL/1 had macros, so Ada got its own macro language of generics. It was even weakly typed, a nice contrast to common untyped macros…
No, I would argue that static parametric polymorphism is fundamentally flawed:
It constitutes a meta-language, generic type is not a type, generic package is not a package etc. It is not Ada by definition.
Its classes have no run-time instances. You cannot have a class-wide object holding values from a set of generic instances.
Create a generic widget library and see how it works. It does not. Parametric polymorphism inherently cannot do that.
True. The real thing is a lie. The sole purpose of any program is the side effects of its execution. But that quickly goes overboard, and what remains is harmful programming practice, which can be seen in the languages striving to appear functional (no pun intended… or maybe).
The problem is the wide vs narrow approach. Wide operations apply to large objects as a whole. That is how FP decomposition works: you do something to a whole container, to a whole string, etc. This is how tokenizing and other mess comes into play. Take a container, get a new one. To extract an element of a matrix, multiply it by two vectors. Exciting!
This is far worse than the infamous OOP orgies of creating countless meaningless classes. From the software-design and engineering standpoint, decomposition into wide operations is an extremely bad idea, for obvious reasons. And this is the very first thing every newbie or lazy programmer grasps and never lets go.