Having worked on compression algos, I can say any NN is just way too slow for (de)compression. A potential use is coarse prior estimation in something like rANS, but even then the overhead would need to be carefully weighed against something like a Markov chain, since the relative cost is just so large.
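To make the comparison concrete, here's a minimal sketch (my own illustration, not from the post) of the kind of adaptive order-1 Markov model an entropy coder like rANS would consume as a prior. Per symbol it costs a couple of dict lookups and an increment; that's the speed baseline any NN-based prior has to beat.

```python
from collections import defaultdict

def order1_priors(data: bytes):
    """Adaptive order-1 Markov model: estimate P(symbol | previous symbol)
    from the counts seen so far, as a cheap prior for an entropy coder.
    Hypothetical example; not from the post being discussed."""
    counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
    totals = defaultdict(int)                       # context -> total count
    probs = []
    prev = None
    for sym in data:
        # Laplace-smoothed estimate, computed *before* updating counts,
        # so encoder and decoder stay in sync.
        probs.append((counts[prev][sym] + 1) / (totals[prev] + 256))
        counts[prev][sym] += 1
        totals[prev] += 1
        prev = sym
    return probs

probs = order1_priors(b"abababab")
```

On a repetitive input the estimated probabilities climb as the model learns the transitions, which is exactly what an entropy coder exploits.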
No mention of decompression speed and validation, or did I miss something?
It's in the post: Benchmarks -> Speed
tl;dr: SMLL is approximately 10,000x slower than Gzip