I wanted to try out some of the formal containers for SPARK, but I am having trouble finding the SPARKlib mentioned in the SPARK docs. My GNAT Studio doesn’t have a SPARK category under Help, as the SPARK manual says it should, and I usually do everything with Alire (alr). I do see the SPARK lemma libraries in my Alire cache of gnatprove, but not the other libraries.
It’s actually packaged with the gnatprove release in Alire, but it’s not on the GPR_PROJECT_PATH, so you cannot “with” it directly.
In the future this library will be packaged separately so it will be easier to depend on it.
You can grab its sources here: spark2014/include at master · AdaCore/spark2014 · GitHub
Note that, as Fabien said, this is more recent than the version of GNATprove in Alire, so some special handling (say, for SPARK.Big_Integers) won’t be available yet.
Ok, I have an Alire project with gnatprove (13.2.1) as a dependency, but I’m still stuck using SPARKlib. The docs say I need a sparklib.gpr extending sparklib_internal:
project SPARKlib extends "sparklib_internal" is
   for Object_Dir use "sparklib_obj";
   for Source_Dirs use SPARKlib_Internal'Source_Dirs;
   for Excluded_Source_Files use SPARKlib_Internal'Excluded_Source_Files;
end SPARKlib;
@daniel-larraz in his AoC project uses a sparklib.gpr extending sparklib_external:
project SPARKlib extends "sparklib_external" is
   for Object_Dir use "sparklib_obj";
   for Source_Dirs use SPARKlib_External'Source_Dirs;
   for Excluded_Source_Files use SPARKlib_External'Excluded_Source_Files;
end SPARKlib;
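For completeness, once one of these extending projects exists next to your own project file, your application project only needs to with it. A minimal sketch (the project name, directories, and main unit are placeholders, not from the thread):

```
with "sparklib";

project My_App is
   for Source_Dirs use ("src");
   for Object_Dir use "obj";
   for Main use ("main.adb");
end My_App;
```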
But in both cases, I get a “project not found” error. Is there something obvious that I am missing?
I’m using gnatprove 14.1.1 and gnat_native 14.2.1, in case it helps.
What’s the content of your GPR_PROJECT_PATH? It should contain the path to the “sparklib_external” project.
Note that this project was renamed recently to “internal”, but that rename is not yet in any released gnatprove version. The online documentation is for the development version of SPARK.
Hm. So my question gets more stupid: Where should the “sparklib_external” project be, given that it’s bundled with gnatprove? Or do I need to provide/install it explicitly?
Sorry for this, I have a very blurry mental model of how this is supposed to work.
(I tried pointing GPR_PROJECT_PATH to the root of my project, but that doesn’t change the error. The Source_Dirs points to “src/**”.)
Assuming you have the latest gnatprove 14.1.1, you should only need to set GPR_PROJECT_PATH to the folder that contains sparklib_external.gpr. I don’t know which folder that would be with Alire, to be honest. Then it’s recommended to create your own sparklib.gpr, as documented earlier in this thread.
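With Alire, the library ships inside the gnatprove release in your Alire cache. A hedged sketch of pointing GPR_PROJECT_PATH there — the release folder name, including the version and hash, is an example from later in this thread and will differ on your machine:

```shell
# Example path only: adjust the gnatprove release folder to match what
# "ls ~/.local/share/alire/releases" shows on your system.
SPARKLIB_DIR="$HOME/.local/share/alire/releases/gnatprove_14.1.1_f6ca6f8c/include/spark"

# Prepend it, keeping any existing entries on the path.
export GPR_PROJECT_PATH="$SPARKLIB_DIR${GPR_PROJECT_PATH:+:$GPR_PROJECT_PATH}"
echo "$GPR_PROJECT_PATH"
```

After this, gprbuild and gnatprove run from the same shell should be able to resolve sparklib_external.gpr.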
If I try this, I get an error during binding:
Bind
[gprbind] aoc2024.bexch
[Ada] aoc2024.ali
fatal error, run-time library not installed correctly
cannot locate file system.ads
raised TYPES.UNRECOVERABLE_ERROR : targparm.adb:168
gprbind: invocation of gnatbind failed
gprbuild: unable to bind aoc2024.adb
But I managed to upgrade both gnatprove and gnat_native to the latest version, and now it does work with the sparklib.gpr file that @daniel-larraz uses.
Thanks a lot to both of you for helping me figure it out!
By the way, I was confused: the method I provided was OK but unnecessarily complicated. I updated my post with the “best” current method. As Fabien said earlier, in the future we want to make sparklib standalone, so that it does not depend on a gnatprove install.
Hi, I just wanted to say that this sounds great! It would make the release of the library more flexible, and also its use in other programs!
Best regards,
Fer
How is one supposed to use the Alire validation profile when including sparklib? I’m getting “undefined reference to check_or_fail” errors.
I tried following the instructions on spark2014/docs/ug/en/source/spark_libraries.rst at cb9c8f48a4d7912dd9580d1216b607163ad81416 · AdaCore/spark2014 · GitHub
To get rid of this error, you should compile your code with the switch -ffunction-sections and link it with the switch -Wl,--gc-sections, so that unused functions are not included in the executable.
I did it in this way:
project SPARKlib extends "sparklib_external" is

   for Object_Dir use "obj/sparklib";
   for Source_Dirs use SPARKlib_External'Source_Dirs;
   for Excluded_Source_Files use SPARKlib_External'Excluded_Source_Files;

   Ada_Compiler_Switches :=
     SPARKlib_External.Compiler'Default_Switches ("Ada") &
       ("-ffunction-sections",  --  Separate ELF section for each function
        "-fdata-sections");     --  Separate ELF section for each variable

   package Compiler is
      for Default_Switches ("Ada") use Ada_Compiler_Switches;
   end Compiler;

end SPARKlib;
And then on my main project, I’ve added:
package Linker is
   for Switches ("Ada") use ("-Wl,--gc-sections");  --  Remove unused sections
end Linker;
But I still get this error:
/home/mgr/.local/share/alire/releases/gnatprove_14.1.1_f6ca6f8c/include/spark/spark-containers-formal-vectors.adb:428:(.text.___ghost_coap_spark__options__lists__formal_model__m_elements_included+0x1ba): undefined reference to `check_or_fail'
– this advice is not applicable if you’re running on Darwin: -Wl,--gc-sections is a GNU ld switch, not an Apple ld switch. I don’t know what advice (if any) would be applicable there.
It’s not my case, though. I’m running on Ubuntu Linux x86-64. I can see the necessary switches passed to the compilation and linking phases, but check_or_fail is still referenced.
I solved my linking problem by commenting out line 428 in include/spark/spark-containers-formal-vectors.adb:
function Element_Logic_Equal (Left, Right : Element_Type) return Boolean
is
begin
   --  Check_Or_Fail;
   return Left = Right;
end Element_Logic_Equal;
I wonder if that’s a bug because, according to the documentation, it shouldn’t be called by actual code and should be used only for verification. But at the same time, the validation profile executes the contracts, so how are those two claims compatible?
This is using gnat_native=14.2.1, gprbuild=22.0.1 and sparklib from gnatprove=14.1.1.
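Regarding that last question: ghost code is governed by the assertion policy, so in an assertion-enabled build (which is what the validation profile gives you) ghost entities are compiled into real object code, and -Wl,--gc-sections is what is supposed to drop the ones that are never called. A minimal illustration of the policy switch — the package and function names here are made up for the example, not from SPARKlib:

```
pragma Assertion_Policy (Ghost => Check);
--  With "Check" (as in assertion-enabled builds), the ghost function
--  below is compiled and callable from assertions; with "Ignore" it
--  is stripped at compile time and generates no code at all.

package Demo with SPARK_Mode is
   function Logic_Equal (L, R : Integer) return Boolean is (L = R)
     with Ghost;
end Demo;
```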