Hello,
This is a very basic question, so pardon my naivete. It concerns real-life adjustable DC power supplies. In most texts, students are introduced to ideal voltage and current sources, and when either one is connected to a resistive load, Ohm's law determines the V-I relationship. Most of the adjustable DC power supplies I've seen (but never used) appear to double as both voltage and current sources, as in the attached photo (that's my assumption).

My question is: what happens if a resistor is connected across the output terminals of such a power supply and the voltage and current values are set so that they do not conform to Ohm's law? For example, if the resistor across the output terminals is 100 kΩ and the voltage is set to 10 V, then by Ohm's law the current through the resistor should be 0.1 mA. What happens if the power supply's current is instead set to 1 mA?
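To make my question concrete, here is the mental model I currently have, written out as a small Python sketch. The CV/CC crossover logic here is purely my guess (that the supply regulates whichever setting is reached first), not something I've read in a manual, so please correct it if it's wrong:

```python
def supply_output(v_set, i_limit, r_load):
    """Guess at how an ideal CV/CC bench supply drives a resistor.

    Assumption (mine, not from a datasheet): the supply stays in
    constant-voltage (CV) mode unless the load would draw more than
    the current limit, at which point it drops into constant-current
    (CC) mode and the terminal voltage falls to I * R.
    """
    i_cv = v_set / r_load              # current the load would draw at the set voltage
    if i_cv <= i_limit:
        return "CV", v_set, i_cv       # voltage setting governs; current setting is only a ceiling
    return "CC", i_limit * r_load, i_limit  # current limit governs; voltage sags below the setting

# The numbers from my question: 100 kΩ load, 10 V setting, 1 mA limit
mode, v, i = supply_output(v_set=10.0, i_limit=1e-3, r_load=100e3)
print(mode, v, i)  # prints: CV 10.0 0.0001  -> 0.1 mA, the 1 mA limit is never reached?
```

Is that the right picture, i.e. in my example the supply would simply sit in CV mode at 10 V and 0.1 mA, with the 1 mA setting acting only as a limit rather than forcing 1 mA through the resistor?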
