In this concept, you start with a small model with actual neural characteristics, such as a spiking neural network, and you shape it into what you want it to be by growing it and iteratively refining it. Since there is no way to apply back-propagation or regression to such neural networks to fit them to data, about the only way to shape them is to use genetic algorithms: make many slightly different, mutated copies of the network, send them off to do their assigned task, and measure how well each one performed. Then keep the best 10%, cross-breed them by randomly mixing and mutating their models, and produce another batch of progeny, repeating for as many rounds as it takes for the network design to converge.
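The evolutionary loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the "genome" here is just a weight vector scored against a made-up target (a stand-in for "do the assigned task and measure how well they did"), and the names `fitness`, `mutate`, `crossover`, and `evolve` are all invented for this sketch rather than taken from any library.

```python
import random

random.seed(0)  # for reproducibility of this sketch

# Hypothetical toy task: evolve a weight vector toward a fixed target.
TARGET = [0.5, -1.2, 3.0, 0.0, 2.2]

def fitness(genome):
    # Higher is better: negative squared error against the target.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1, scale=0.3):
    # Small random tweaks, mirroring evolution's letter-or-two changes.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    # Randomly mix genes from two parents.
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def evolve(pop_size=100, generations=200, elite_frac=0.1):
    population = [[random.uniform(-5, 5) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank every candidate and keep the best 10%.
        population.sort(key=fitness, reverse=True)
        elite = population[:max(2, int(pop_size * elite_frac))]
        # Breed the next generation by mixing and mutating survivors.
        population = elite + [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(pop_size - len(elite))
        ]
    return max(population, key=fitness)

best = evolve()
```

In a real neuroevolution system, the genome would encode the network's topology and synaptic parameters rather than a bare vector, but the select-breed-mutate cycle is the same.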
Some immediate problems with this approach are that you need a way to score how well each candidate network performed, which is hard to define for complex models, and that genetic algorithms have only been demonstrated to work on small networks. Evolution in general makes only small tweaks each generation, a change to a letter or two in a DNA sequence in a gene, then spends a lifetime seeing whether that made the critter better or not. It seems utterly and completely impossible to even consider using genetic algorithms on models that start with 100 billion random neurons and 100 trillion random connections, like a human brain. Conventional wisdom says it would never converge, yet here we sit.