A common creationist argument against biological evolution is ''incredulity'': disbelief that a random process could ever produce something as complex, functional, and "engineered"-looking as life. And it is counter-intuitive, even to some very smart engineers. Biological evolution is relatively slow, and computer simulations of evolution analogies have been too limited or too niche to convince anyone.
That is, until recently. Enter ChatGPT (and many others).
Each of these systems has some level of tangible, life-like complexity. Normies see a sort of intelligence there. While there is indeed plenty of engineering going on, the gist of the system relies on ''randomness''. These systems start with arrays of billions of random numbers. The neural-network algorithms adjust those numbers according to training data that is randomly selected and randomly permuted. Even using a trained neural net involves random "seeds" that reinitialize fresh state between sessions. Randomness is essential throughout the process.
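To make that concrete, here is a minimal sketch in Python/NumPy of a toy two-layer network; it assumes nothing about any particular product, and the dataset, sizes, and learning rate are made up purely for illustration. The weights begin as arrays of random numbers, the training data is visited in a random order, and a random seed initializes the whole thing.

<syntaxhighlight lang="python">
# Toy sketch (illustrative only): a tiny neural net whose whole life cycle
# is saturated with randomness. Assumes NumPy and a made-up dataset.
import numpy as np

rng = np.random.default_rng(seed=42)          # random "seed" initializing state

# 1. Start from arrays of random numbers (here thousands, not billions).
W1 = rng.normal(0.0, 0.1, size=(64, 128))
W2 = rng.normal(0.0, 0.1, size=(128, 1))

# 2. Stand-in "observations": random inputs with a hidden regularity.
X = rng.normal(size=(1000, 64))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

def forward(x):
    h = np.tanh(x @ W1)                       # hidden layer
    return 1 / (1 + np.exp(-(h @ W2))), h     # sigmoid output

for epoch in range(5):
    order = rng.permutation(len(X))           # data randomly permuted each pass
    for i in order[:256]:                     # random subset per pass
        x, t = X[i:i+1], y[i:i+1]
        p, h = forward(x)
        # 3. The algorithm nudges the random numbers toward data-fitting values.
        grad_out = p - t
        W2 -= 0.1 * h.T @ grad_out
        W1 -= 0.1 * x.T @ ((grad_out @ W2.T) * (1 - h**2))
</syntaxhighlight>

Scale the arrays up from thousands of numbers to billions and you have the rough shape of the training loop behind a modern language model.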
Yes, there is clearly an engineering force at work, specifying the algorithms. But those algorithms dictate the details no more than the laws of physics tell a dog how to chase a squirrel. Algorithms ~ laws-of-physics.
The datasets fed to neural networks are not heavily engineered. They are most akin to the observations a living system makes about the world. Datasets ~ observations.
As with fitness selection in evolution, the algorithms make it more likely that data-fitting combinations of random numbers gain stronger influence. Reproduction ~ learning.
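The selection analogy can be shown even more starkly with a toy (1+1) evolution-strategy sketch. This is not how large language models are actually trained, but it captures the shape of the argument: random variations of random numbers are kept only when they fit the observations better, so data-fitting combinations accumulate influence. All names and values below are illustrative.

<syntaxhighlight lang="python">
# Toy sketch of the selection analogy: random variation plus selection,
# nothing else, is enough to fit the data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                      # stand-in observations
true_w = rng.normal(size=8)
y = X @ true_w                                     # hidden regularity in the data

def fitness(w):
    # Higher is better: negative mean squared error against the observations.
    return -np.mean((X @ w - y) ** 2)

w = rng.normal(size=8)                             # start from pure randomness
for step in range(5000):
    candidate = w + rng.normal(0.0, 0.05, size=8)  # random variation ("offspring")
    if fitness(candidate) > fitness(w):            # selection: keep what fits
        w = candidate

print(fitness(w))   # climbs toward 0 as the random numbers come to fit the data
</syntaxhighlight>

Gradient descent does the same job more efficiently, computing which direction of change fits better instead of guessing, but the logic is the same: variation plus selection, not blind chance alone.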
Enough about how randomness alone is not enough.