RE: [ng-spice-devel] save all
> This is not good enough. As circuits get bigger, it becomes
> necessary to exploit latency, duplication, and other factors. With
> old technology you get linear growth. With appropriate algorithms it
> can be very much sublinear.
Cool !! I'd like to see that.
> > ..... Also, it would be
> > reasonable to assume that the bigger the circuit, the more times
> > spice will have to go round the evaluate-solve loop. So for bigger
> > circuits, the matrix solution becomes more significant, and the
> > overall effort goes up more than linearly.
>
> Maybe. But, the bigger the circuit, the more tricks become
> available. I have a benchmark of cascaded NMOS buffers, that takes
> the same number of iterations regardless of the number of stages
> (above about 5 stages), with linear growth in time, and slightly
> sublinear growth in memory requirements. Spice fails to converge on
> it. Numeric overflow.
That also sounds great. Have you tried that circuit on the
latest ng-spice ? I thought I'd caught a lot of numeric
overflows. I don't mean that we've done anything to compete
with your stuff, but at least it might get there in the end.
> But what I usually want is lots of detail in a small section. Why
> save all the nodes when I don't need them? Then, what about the
> detail? Probes other than node voltages? What if I want to see how
> junction capacitance varies with signal? Spice only gives you this
> in steady state, but ACS has the extended probes in transient, too.
> ACS has 55 parameters on a MOSFET that you can probe in transient
> analysis, and this is not counting what is available by treating it
> as a subcircuit and probing its components (17 parameters on each of
> the 2 diodes, 26 on each of the 5 capacitors, 18 on each of 4
> resistors, 18 on the transfer element) That's 309 total per MOSFET,
> assuming I counted correctly. There is some duplication, so the
> useful number of somewhat less.
Maybe I work a different way from you. Usually when I run a big
simulation, it's towards the end of a project, and I'm putting lots
of things together to make sure they work as intended. When (er, I
mean if ;-) something goes wrong, it's impossible to predict where
I will need to look to track it down. It's incredibly frustrating
to have to do a first run (of, say, a switched mode power supply),
saving only the top level nodes, and then run it again saving a
different set of nodes to find out what the problem is. This is
the way you have to work with some simulators, and when each run
takes tens of hours, progress is slow.
I agree that all those extra parameters don't need to be saved,
just all voltages and currents. I need the voltages in order to
see which nodes are getting pulled in the wrong direction, and
I need the currents to find out what is pulling them ('cos in
an analogue circuit it's not always obvious). Once I've tracked
down the problem, then I make up a smaller simulation to take a
closer look, if necessary.
Your ideas about speeding up large simulations are intriguing,
and are obviously worth investigating in their own right,
but they are not a solution to this problem. Even if you bring
the simulation time down dramatically, I would still need a
second run to see the rest of the signals I need.
> Now, taking this to the future, with different models, it is
> impossible today to predict what info you will want to see. I will
> offer a few as food for thought:
>
> Grid probes in a finite element transistor model.
>
> Internal probes, at some point inside a transmission line.
> .
> .
> .
Yes, you're right, there's no reason you shouldn't be able to
ask for millions of different measurements and parameters, and
"save all" wouldn't want to save all of them. We would probably
want other ways of saving collections of parameters. For example:
.save ic(q*) vdsat(m*) i(d1-d11)
and things like that.
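A "save by pattern" feature like that could be little more than
glob-matching the patterns against the list of probes the simulator
knows about. A minimal sketch in Python (the probe names and the
expand_save function are hypothetical illustrations, not ng-spice
internals; the i(d1-d11) range syntax is not handled here):

```python
from fnmatch import fnmatch

# Hypothetical list of probes a simulator might expose.
available_probes = ["ic(q1)", "ic(q2)", "vdsat(m1)", "vdsat(m7)",
                    "i(d1)", "i(d5)", "v(out)"]

def expand_save(patterns, probes):
    """Return the probes matched by any of the .save patterns."""
    return [p for p in probes
            if any(fnmatch(p, pat) for pat in patterns)]

# .save ic(q*) vdsat(m*)
saved = expand_save(["ic(q*)", "vdsat(m*)"], available_probes)
print(saved)  # ['ic(q1)', 'ic(q2)', 'vdsat(m1)', 'vdsat(m7)']
```

The point is only that the matching itself is cheap; the real work
would be deciding which probe names a device exposes.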
Cheers,
Alan