2022 Day 7: No Space Left On Device

Doesn’t look like I’ll get this one finished tonight, but I’ll create a topic here anyway. Looking forward to seeing everyone’s solutions.


I finally got Part 1 after an embarrassingly long time. If it helps:

  • make sure you’re answering the right question (sum of the directory sizes, not the number of directories); I wasted wayyyy too long answering the wrong question;
  • make sure you count the last folders; in my case at least, I was updating folder sizes whenever I read a cd .., and there’s no cd .. once you read the last file, so I was missing the last two directories.

I haven’t done Part 2 yet, so I won’t post my solution until then.

This was nice, again using Vectors. I feel a bit dirty for using access types, though. But since nothing in the data structure changes after it’s initially built, it should be fine.

I already updated the parent folder sizes when inserting a file - that was handy for the solution.

Also, I added a printing function to output the directory tree to look like the example.

This time, solving part 2 was as simple as adding one (recursive) function.
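For reference, Part 2 really does boil down to one pass over the directory sizes once you have them. A hypothetical Python sketch (the 70 000 000 / 30 000 000 constants come from the puzzle statement; `sizes` is whatever list of directory sizes your tree produces):

```python
def smallest_to_delete(sizes, disk=70_000_000, needed=30_000_000):
    used = max(sizes)                   # the root contains everything, so it's the largest
    must_free = needed - (disk - used)  # shortfall after the space already free
    # smallest directory big enough to close the gap
    return min(s for s in sizes if s >= must_free)
```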

OK, I posted it in the main thread, but here it is again. Most solutions I’ve seen bake their own container to handle the file hierarchy, but I used Ada.Containers.Multiway_Trees and it worked great.

Part 2 seems pretty straightforward after Part 1, and Part 1 itself is straightforward, too. I made Part 1 harder than it should have been by solving it while reading the command history and building the tree, introducing a subtle bug.

Though I lost a lot more time by misreading the question.

Couldn’t sleep without finishing the puzzle :slight_smile:

I had a couple of false starts… First I thought I’d just implement a VFS. No problem, right? Yeah, that wasn’t right. Then I decided to use nested Ada.Containers… This led to madness.

My working solution was to build a big Ordered_Map of (Path → (Size, Is_Dir)), where Path is a Vector and the elements are a record.

Querying this data structure is O(N), so it’s not terribly efficient, but gets the job done.
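A rough Python sketch of that flat-map shape (names hypothetical): keys are full path tuples, values are (size, is_dir) pairs, and directory sizes fall out of one O(N·depth) pass that credits each file to every ancestor directory:

```python
from collections import defaultdict

def dir_sizes_from_flat_map(entries):
    # entries: {("/", "a", "f"): (10, False), ("/", "a"): (0, True), ...}
    sizes = defaultdict(int)
    for path, (size, is_dir) in entries.items():
        if is_dir:
            sizes[path] += 0                 # make sure empty dirs still appear
        else:
            for i in range(1, len(path)):
                sizes[path[:i]] += size      # credit every ancestor directory
    return dict(sizes)
```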

Just one Ordered_Set and a recursive input parser works fine for me :stuck_out_tongue:


I love the simplicity you get with the recursive parser, great idea! I originally tried to keep track of the directories, but after seeing your idea, I went back and ripped all that out and was able to just keep an array of sizes, which is really only needed for the second part.

Again a solution with HAC and its minimal type system (and library)…
The parser is sequential, but each directory knows its parent’s index, so file sizes are added all the way up to the root. Not as efficient algorithmically as the recursive parser, but the run time is fine (5 milliseconds when run from HAC). And it could swallow an input with a “$ cd /” occurring while in some subdirectory (there was no such banana skin in my input file, but who knows…)

I had to learn a lot about cursors and iterating in containers. And then I had to cheat and add $ cd .. lines to the input as needed.


And there go 3 hours! I got to learn how multiway trees and their iterators work today, too.

EDIT: Also, comparing with @RREE’s code, holy smokes, the solutions are so similar! I swear I did this blind; I guess great minds think alike.


This one was fun! I took a bit of time to figure out how to do this as simply as I could, but I eventually realized that the file system is explored in depth-first order, so I only need to keep track of the chain of directories leading to the current one.

It also taught me the value of separating things into their own functions when proving things in SPARK :slight_smile:

Querying this data structure is O(N), so it’s not terribly efficient, but gets the job done.

Aren’t the ordered red/black containers O(log N)?

Yes, but I’m just iterating through all the elements, not actually doing a query. Honestly, I could’ve used a Vector here. It’s really poorly built, but I don’t really like this problem and don’t want to think about it anymore :slight_smile: