I am currently building a simulator (time-domain model) that includes a System-on-Chip (SoC), a constant voltage source, and a series resistor.
Current solution: My initial intuition was to model the SoC's consumption as a constant-power sink: I assign each task I expect the SoC to execute a power value and an execution time.
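To make that concrete, here is a minimal sketch of the task-based constant-power model I have in mind (the task names and numbers are made up, not from my actual code):

```python
# Hypothetical task table: each task draws a fixed power for a fixed duration.
tasks = [
    {"name": "idle",     "power_w": 0.05, "duration_s": 0.100},
    {"name": "compute",  "power_w": 1.20, "duration_s": 0.020},
    {"name": "radio_tx", "power_w": 0.80, "duration_s": 0.005},
]

def soc_power_at(t: float) -> float:
    """Return the SoC's demanded power at time t, cycling through the task list."""
    period = sum(task["duration_s"] for task in tasks)
    t = t % period
    for task in tasks:
        if t < task["duration_s"]:
            return task["power_w"]
        t -= task["duration_s"]
    return tasks[-1]["power_w"]
```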
Problem: When I calculate the voltage seen by the SoC, I have to account for the voltage drop across the series resistor. At the next time step the current is then higher to compensate for the lower input voltage, which increases the drop again. This feedback continues and, especially with high series resistances, quickly ends up at voltages close to 0 V and extremely high currents (see the sketch below).
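A minimal sketch of the time-stepped update I am doing, with made-up numbers that reproduce the runaway:

```python
V_SRC = 3.3   # constant source voltage [V]
R_SER = 2.5   # series resistance [ohm] (deliberately large)
P_SOC = 1.2   # power demanded by the current task [W]

v_soc = V_SRC  # start with no drop across the resistor
for step in range(30):
    i = P_SOC / v_soc           # current needed to deliver P at last step's voltage
    v_soc = V_SRC - R_SER * i   # new SoC voltage after the resistive drop
    print(f"step {step:2d}: I = {i:6.3f} A, V_soc = {v_soc:6.3f} V")
    if v_soc <= 0:
        break  # the runaway I describe: voltage collapses, current explodes
```

As far as I can tell, a self-consistent operating point would have to satisfy V_soc = V_SRC - R_SER * P_SOC / V_soc, i.e. the quadratic V_soc^2 - V_SRC * V_soc + R_SER * P_SOC = 0, which has no real solution once V_SRC^2 < 4 * R_SER * P_SOC. So with a large series resistance there is no voltage at which the constant-power assumption can be met at all, which is presumably why my loop diverges.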
Is there a more reasonable way to model an SoC's power consumption?