dmbche 19 hours ago

" chip design cycle is also probably one of the most complicated engineering processes that exists in the world,” said Institute Professor Siddharth Garg (ECE). “There’s a saying that rocket science is hard, but chip design is harder.” "

Why not Rockets for the rest of us first, if that's easier?

  • gsf_emergency_6 15 hours ago

    Rocket science is hard because you have to burn your own cash if you are not connected. Chip design is less risky for the individual, but it's been harder (so far) to signal your mastery to the funders.

    The difficulty is not (entirely) technical.

  • wlesieutre 18 hours ago

    There’s already lots of rockets for the rest of us, they’re just not as big

fleshmonad 20 hours ago

We have textual slop, visual slop, audio slop, so we asked: "What else do we want to sloppify?". And then it dawned on me. ICs. ICs haven't been slopped yet — sure, we could ask the machine to generate some VHDL, but that isn't the same. So we present: Silicon Slop.

I am actually astonished. Is this what happens when the NYU board of directors tells every department they have to use and create AI, or they will stop funding? What is going on?

  • carlCarlCarlCar 19 hours ago

    Ah, thanks; we definitely needed more artisanal, real human social media slop like this.

    Improving the lived experience, keeping it real! Feels so much more authentic.

    More people would love AI if it communicated like an emo *Nix elitist. Train it on Daria, Eeyore, and grunge lyrics! People will love it!

stogot 16 hours ago

> To address this challenge, Garg and colleagues scoured Verilog code on GitHub and excerpted content from 70 Verilog textbooks to amass the largest AI training dataset of Verilog ever assembled. The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

I expect this will become the norm in a number of fields. Perhaps COBOL is next?
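The corpus-assembly step described in the quoted passage (scraping GitHub and excerpting textbooks) necessarily involves heavy filtering and deduplication, since GitHub is full of forked copies of the same files. A minimal Python sketch of that step, with purely illustrative heuristics — this is not the VeriGen team's actual pipeline:

```python
# Sketch of one step in assembling a Verilog training corpus:
# drop exact-duplicate files and keep only ones that look like Verilog.
# Heuristics and file names here are invented for illustration.
import hashlib
import re

def looks_like_verilog(text: str) -> bool:
    """Cheap heuristic: real Verilog sources declare at least one module."""
    return bool(re.search(r"\bmodule\s+\w+", text)) and "endmodule" in text

def dedupe_corpus(files: dict[str, str]) -> dict[str, str]:
    """Drop exact-duplicate contents (common with forked GitHub repos)."""
    seen: set[str] = set()
    kept: dict[str, str] = {}
    for path, text in files.items():
        digest = hashlib.sha256(text.encode()).hexdigest()
        if digest in seen or not looks_like_verilog(text):
            continue
        seen.add(digest)
        kept[path] = text
    return kept

corpus = {
    "repo_a/counter.v": "module counter(input clk); endmodule",
    "repo_b/counter.v": "module counter(input clk); endmodule",  # fork duplicate
    "repo_c/notes.txt": "TODO: write the counter",
}
print(sorted(dedupe_corpus(corpus)))  # → ['repo_a/counter.v']
```

A real pipeline would also need near-duplicate detection and license filtering, which is much of the actual work.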

  • fancy_pantser 11 hours ago

    > The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

    Perhaps it's the first open one. I was an eng manager at a hyperscaler helping one of our clients, a large semiconductor design company, build models to use internally. These were trained on their extensive Verilog repos, tooling, and strict style guides. I see this being repeated across industries: since at least 2023, quite a few deep-pocketed S&P 500 orgs have been creating models from scratch or extensively finetuning existing ones to gain the unique advantages they require. They're rarely announced as such, but you can often infer from initial investment or partnership announcements that they're working on it.

alecco 20 hours ago

> Consequently, the NYU researchers’ goal is to make chip design more accessible, so nonengineers, whatever their background can create their own custom-made chips.

What?

  • Aloisius 17 hours ago

    "ChatGPT: Please design a chip for me."

    Basically.

    • kingstnap 8 hours ago

      Unironically what industry is trying to do. Saw a slide with basically exactly that written on it some time ago at a conference (MLCAD).

  • bigyabai 20 hours ago

    I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

    A lot of my questions went away when I got to this line though:

    > He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.

    This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs aren't iterating or testing on it because it sucks to do. It's a lot of work to bring that skill ceiling down, but as a programmer I could see how an LLVM-style intermediate representation layer could help designers get up-and-running faster.

    • charlie-83 18 hours ago

      Isn't HDL basically the intermediate representation you want? Plus, you can learn it with simulation or an FPGA dev board, which makes it reasonably accessible.
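The "LLVM-style intermediate layer" idea upthread, with HDL as the common representation, can be illustrated with a toy lowering pass: a higher-level description (here just a module name and bit width, both invented for this sketch) emitted as plain Verilog text. Real tools in this space (Chisel, Amaranth, HLS compilers) are of course far more involved:

```python
# Toy "lowering" from high-level parameters to Verilog text,
# illustrating HDL as a common intermediate representation.
# Names and structure here are invented for illustration.
def lower_counter(name: str, width: int) -> str:
    """Emit a free-running counter as synthesizable Verilog."""
    return (
        f"module {name}(input clk, input rst, output reg [{width - 1}:0] q);\n"
        f"  always @(posedge clk)\n"
        f"    if (rst) q <= {width}'d0;\n"
        f"    else     q <= q + {width}'d1;\n"
        "endmodule\n"
    )

print(lower_counter("count8", 8))
```

The emitted module can be run through a simulator such as Icarus Verilog, which is also how one can learn HDL without hardware, as the parent comment suggests.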

      • tormeh 18 hours ago

        All I remember from my experience with VHDL/Verilog is that they really truly suck.

    • bsder 18 hours ago

      > I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

      That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).

      The axes that we need more of involve analog and RF. Less power consumption, better RF speed/range, higher-speed PCI, etc. all require messy analog and RF design. And those are the expensive tools. They are also the complex tools that require genuine knowledge.

      Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and still hasn't pulled it off should tell you something.

      • tormeh 18 hours ago

        Would you really earn more money doing this than monopolizing online search advertising? Because I find that hard to believe. Hardware seems like a miserable business.

        • pesfandiar 17 hours ago

          That might change if geopolitical tensions fragment the global supply chains.

        • bsder 12 hours ago

          Being a fab is a garbage business.

          Being a software supplier to fabless semiconductor companies is a very profitable business.

          In the Gold Rush, the people who came out rich were selling the shovels and denim.

bgwalter 19 hours ago

Bootstrap framework for chips, Verilog stolen from books and from GitHub.