Neurocat

Neurocat is an experimental, toy library studying two things:

Very superficially, the idea of the paper is quite simple (minus a few details):

ParaFn -> Learn (a functor turning a parametrised function into a learning algorithm)

I: NNet -> ParaFn (a functor implementing a neural network as a parametrised function)

NNet -> Learn = (ParaFn -> Learn) ∘ (NNet -> ParaFn) (the composite: every network gives rise to a learning algorithm)
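The arrows above can be sketched in plain Scala. Note that the names and encodings below (`ParaFn`, `Learn`, `toLearn`, the single-neuron "NNet") are my own hypothetical simplifications for illustration, not the paper's exact definitions nor this library's actual API:

```scala
// A parametrised function: given a parameter P, map inputs A to outputs B.
final case class ParaFn[P, A, B](run: (P, A) => B)

// A learning algorithm: a forward pass, a parameter-update rule, and a
// "request" telling the previous layer what input it would have preferred.
final case class Learn[P, A, B](
  implement: (P, A) => B,
  update:    (P, A, B) => P,
  request:   (P, A, B) => A
)

// A (very) toy "NNet": a single linear neuron b = p * a.
// The I: NNet -> ParaFn step here is just reading off its forward pass.
val neuron: ParaFn[Double, Double, Double] = ParaFn((p, a) => p * a)

// ParaFn -> Learn: one step of gradient descent on squared error with
// learning rate eps. d/dp (p*a - b)^2 = 2*a*(p*a - b), and symmetrically
// for the input a, which gives the request map.
def toLearn(f: ParaFn[Double, Double, Double], eps: Double): Learn[Double, Double, Double] =
  Learn(
    implement = f.run,
    update    = (p, a, b) => p - eps * 2 * a * (f.run(p, a) - b),
    request   = (p, a, b) => a - eps * 2 * p * (f.run(p, a) - b)
  )

// NNet -> Learn is the composite of the two arrows above.
val learner = toLearn(neuron, eps = 0.1)

// One update step with input a = 2.0 and target b = 6.0 moves the
// parameter p = 1.0 toward the true value 3.0.
println(learner.update(1.0, 2.0, 6.0))  // 2.6
```

This is only a one-dimensional caricature, but it shows the shape of the idea: "being a learner" is derived from "being a parametrised function" by a uniform construction, rather than baked into each network by hand.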

I'll stop there for now; my work has only just started. There are more concepts to explore around the bimonoidal aspects of neural networks under Euclidean-space constraints, pending studies on recurrent networks, and more.

Discovering that formulation, I just thought: "Whoaaa, that's cool, exactly what I had in mind without being able to put it into words".

Why? Because everything I've seen about neural networks looks like programming from the 70s, not like the way I program nowadays, with functional programming, types & categories.

This starts unifying concepts, which is exactly the raison d'être of category theory in maths. I think programming learning algorithms will change a lot in the future, exactly as programming backends has changed a lot over the last 10 years.

I'm just scratching the surface of all of these concepts. I'm not a neural-network expert at all, nor a good mathematician, so I just want to open this field of study in a language which now has singleton types, allowing really cool new ways of manipulating data structures.

So first, have a look at this sample:

For info, to manipulate matrices I used ND4J, which provides an array abstraction that can run in both CPU and GPU mode, but naturally any library offering the same abstraction could be used.