I personally haven’t done any digging into this subject yet, but a shower thought I had today was…
Transformers have become incredibly good at image generation (DALL-E 2), NLP (GPT-3), and boilerplate code generation (TabNine, GitHub Copilot, et al.), not to mention the massive number of style-transfer techniques.
Nvidia has been tooting its horn about chip design done by AI.
How long until we are using transformers, or similar DL (deep learning) algorithms, to generate basic component-level electronic circuits? I want to wire up these two ICs (integrated circuits), add some USB ports, add power, and so forth. A few seconds later, out pops a reasonably refined prototype of the circuit that I can then tweak and tune to my needs.
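One reason this doesn't seem far-fetched: a circuit can already be written down as plain text in a SPICE-style netlist, which makes circuit generation look a lot like the code-generation problem Copilot solves today. Here's a minimal, purely illustrative sketch (the netlist and the `<EOL>` marker are my own hypothetical choices, not any real tool's format) showing how a netlist could be serialized into a token sequence of the kind a transformer consumes:

```python
# Hypothetical sketch: casting circuit design as sequence modeling.
# A SPICE-style netlist is just text, so a transformer could, in
# principle, generate one token by token from a high-level prompt.

# A tiny hand-written netlist: 5 V supply, current-limiting resistor,
# LED, and a decoupling cap (illustrative only).
netlist = """\
V1 VBUS GND DC 5
R1 VBUS LED_A 330
D1 LED_A GND LED
C1 VBUS GND 10u
"""

def tokenize(netlist: str) -> list[str]:
    """Split a netlist into whitespace-delimited tokens, inserting an
    <EOL> marker so the line structure survives -- mirroring how text
    models see source code."""
    tokens = []
    for line in netlist.strip().splitlines():
        tokens.extend(line.split())
        tokens.append("<EOL>")
    return tokens

print(tokenize(netlist)[:6])
# → ['V1', 'VBUS', 'GND', 'DC', '5', '<EOL>']
```

The hard part, of course, isn't the serialization. It's getting a model to emit sequences that are electrically valid, which is where the "tweak and tune" step would still earn its keep.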
Not long at all, I suspect.