• voxleone an hour ago

    Neural cellular automata are interesting because they shift learning from “predict tokens” to “model state evolution.” That feels much closer to a transition-based view of systems, where structure emerges from repeated local updates (transitions) rather than being encoded explicitly.

    I'm working on a theoretical/computational framework, the Functional Universe, intended for modeling physical reality as functional state evolution. I would say it could be used to replicate your CA process. I won't link it here, to signal good faith in discussing this issue - it's on my GH.
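    To make the "structure from repeated local updates" idea concrete, here is a minimal sketch using a toy 1-D binary cellular automaton (rule 110, a classic elementary CA). This is purely illustrative and not the author's NCA or framework; a neural CA would replace the fixed rule table with a learned update function.

    ```python
    RULE = 110  # elementary-CA rule table, encoded as 8 bits

    def step(cells):
        """One synchronous update: each cell sees only (left, self, right)."""
        n = len(cells)
        out = []
        for i in range(n):
            left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
            idx = (left << 2) | (center << 1) | right  # neighborhood as 3-bit index
            out.append((RULE >> idx) & 1)             # look up the next state
        return out

    def evolve(cells, steps):
        """Iterate the purely local rule; global structure emerges from repetition."""
        for _ in range(steps):
            cells = step(cells)
        return cells
    ```

    Nothing in `step` encodes global structure explicitly; any large-scale pattern comes entirely from iterating the local transition.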

    • dzink an hour ago

      “The long-term vision is: foundation models that acquire reasoning from fully synthetic data, then learn semantics from a small, curated corpus of natural language. This would help us build models that reason without inheriting human biases from inception.”

      • qsera 23 minutes ago

        I think this is a bit risky, because it assumes that all knowledge a human possesses about nature is acquired after birth.

        But is that correct? I think organisms also come with a partial built-in understanding of nature at birth.