Hi,
I’m studying with a book on data structures, and suddenly it’s gotten hella more difficult, at least from my perspective. I mean, the exercise questions are way less clear and ask for a lot more “seeing”, as if I had missed a whole chapter, but I didn’t. There’s so much code now that I’ve hit a wall: I’ve improved in that regard, but progress is slow, and I’ve always had issues “seeing” things mentally, spatially or otherwise.
So I thought maybe I should study SPARK before going back to that one? These problems are so complex that I’d like to certify each routine instead of relying on haphazard testing. I have Building High Integrity Applications with SPARK; it seems to start slowly enough, but it’s dated a bit, and the language has changed quite a lot these last years. No access types yet, for instance. Is there another book, besides AdaCore’s tutorial?
I started by adding the aspect SPARK_Mode to a package in my exercises and filling it with contracts. I was awestruck at the number of complaints:
- equality on access types is not allowed in SPARK
- prefix of “Old” introducing aliasing is not allowed in SPARK (SPARK RM 3.10(13))
- call a deep copy function for type “Position” as prefix of “Old” to avoid aliasing
- call to allocating function not stored in object as part of assignment, declaration or return is not allowed in SPARK
- no INOX allowed (whatever that means)
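For context, here is a minimal sketch (my own hypothetical example, not the exercise code) of the kind of thing that triggers the first complaint, comparing two access values directly:

```ada
package Stacks with SPARK_Mode is
   type Node;
   type Position is access Node;
   type Node is record
      Value : Integer;
      Next  : Position;
   end record;

   --  "=" between two access values is rejected by SPARK
   --  (only comparison against null is permitted):
   function Same (A, B : Position) return Boolean is (A = B);
end Stacks;
```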
These, about five times each? I looked in the SPARK RM for all of them, found a mention of the prohibition for half, with no explanation, and no mention at all of the rest. The source is here. And the allocating complaint concerned a function with no call to new or Unchecked_Deallocation.
I wonder if the formal containers will provide any insight though I do not know how readable they are.
https://docs.adacore.com/spark2014-docs/html/ug/en/source/spark_libraries.html
I assume you have turned SPARK warnings off at first to concentrate on the major problems initially. It isn’t an easy task that you are attempting.
This video shows using SPARK’s borrow-checker feature for a simple stack.
While I can see how the elementary logic that constitutes SPARK’s contracts is perhaps more accessible than many of the details of implementing data structures, IMHO those details are much, much simpler than getting SPARK proofs to work.
So I personally think it best to continue wading through data structures, and post questions. But it probably depends on the problem you’re trying to solve. I’ve noticed that SPARK seems much easier for the problems some people here work on than for others’ problems.
Huh, I wouldn’t have guessed that proving even containers was so difficult that the SPARK libraries themselves don’t have all their bodies in SPARK. I didn’t imagine the use of models was so frequent for anything with a little bit of complexity. No, I didn’t consider deactivating checks, because I don’t know the associated workflow; I wanted to learn that too.
I see now that introducing this early in teaching would be quite the challenge. Clearly a more involved process than just switching SPARK in for Ada or Pascal in high school.
Interesting fact: Rod Chapman stated that a guaranteed-safe, performant doubly linked list isn’t possible in either SPARK or Rust.
I’m taking the intermediate route: instead of running gnatprove, I execute all the contracts at run time. That way I get some extra assurance, assuming the routines compile.
It only took an inhuman level of concentration! It also taught me to formulate correct pre- and postconditions, which is more difficult than I estimated. And a few times it did point at, or confirm, an issue’s location.
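In case it helps anyone else taking this route: with GNAT, compiling with the -gnata switch (or pragma Assertion_Policy (Check)) turns Pre and Post aspects into run-time checks, so a violated contract raises Assertion_Error. A minimal sketch, names mine:

```ada
--  clamp.adb: compile with  gnatmake -gnata clamp.adb
--  so the contracts below are executed at run time.
function Clamp (X, Lo, Hi : Integer) return Integer with
  Pre  => Lo <= Hi,
  Post => Clamp'Result in Lo .. Hi
is
begin
   if X < Lo then
      return Lo;
   elsif X > Hi then
      return Hi;
   end if;
   return X;
end Clamp;
```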
Can you explain this constraint?
main.adb:17:09: error: type invariant on private_type_declaration or private_type_extension is not allowed in SPARK (SPARK RM 7.3.2(2))
package Essay with SPARK_Mode is
   type Tabletype is limited private with
     Type_Invariant => Isinitialized (Tabletype);
   function Isinitialized (T : Tabletype) return Boolean is (True);
private
   type Tabletype is new Integer;
end Essay;
The only mention of invariants in the aforementioned RM paragraph is
In addition, the default initialization of values of the type and the evaluation of its potential type invariant or subtype predicate shall not access any mutable state.
But I don’t see any mutable state here.
Try:
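The snippet I meant to paste is something along these lines (my untested reconstruction), with the invariant moved from the partial view onto the full type declaration in the private part:

```ada
package Essay with SPARK_Mode is
   type Tabletype is limited private;
   function Isinitialized (T : Tabletype) return Boolean;
private
   type Tabletype is new Integer with
     Type_Invariant => Isinitialized (Tabletype);
   function Isinitialized (T : Tabletype) return Boolean is (True);
end Essay;
```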
I don’t have my compiler set up correctly on this computer, so that’s a best guess from the error, which says the invariant is not allowed on the declaration portion.
That said, the cited reference is only relevant to initialization, which isn’t in sight within this example. So this is a bug: either a spurious error, a bad error message, or a mistake in the RM.
Mmm, so invariants are only accepted on the full type declaration, unlike Stable_Properties. I would suppose clients could benefit from that knowledge too, so it makes little sense.
One more GCC bug to file!