Max input current is 4A. (If the input is 5V at 4A, the output could be 10V at 2A, or a little less than 2A once conversion losses are counted.)
The output voltage must be higher than the input voltage; this is a boost (step-up) converter.
The "IN-" and the "OUT-" terminals are connected together.
You cannot wire the outputs of two units in series to get more voltage, because the shared negative (IN- tied to OUT-) would short one of them out.
The immutable rule of DC-DC conversion is that the output power will always be less than the input power. Sometimes it will be much less.
For a "back of the envelope" calculation, I would use an efficiency of 85%. How does that work, you ask? Excellent question; here's an example:
5V @ 4A = 20 watts of input power. At 85% efficiency you will have an output power of 20 × 0.85 = 17 watts. Let's say you want the output voltage at 25.6 volts (a number from thin air). So 17 watts at 25.6 volts allows 17 / 25.6 ≈ 664 mA of output current. The higher the voltage, the lower the current.
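The back-of-the-envelope math above can be sketched as a small function. This is just an illustration; the function name and the 85% default are my own, taken from the rule of thumb in this answer, and a real converter's efficiency varies with load and voltage ratio.

```python
def boost_output_current(v_in, i_in, v_out, efficiency=0.85):
    """Estimate the max output current of a boost converter.

    efficiency=0.85 is the rule-of-thumb figure used above;
    actual efficiency may be a little better or a little worse.
    """
    p_in = v_in * i_in          # input power in watts
    p_out = p_in * efficiency   # output power after conversion losses
    return p_out / v_out        # amps available at the output voltage

# The worked example: 5 V @ 4 A boosted to 25.6 V
i_out = boost_output_current(5.0, 4.0, 25.6)
print(round(i_out * 1000), "mA")  # ~664 mA
```

Raising `v_out` with the same inputs shrinks the result, which is the "higher the voltage, the lower the current" point in one line.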
The actual efficiency might be a little better or a little worse. If you know your requirements, then you can easily decide whether this will work for you, or you can keep looking for a better alternative. Keep the immutable rule in mind, because you can use it over, and over, and over.
As a rule of thumb I would always plan on having twice as much current as you think you will need. Watching the voltage collapse because a load draws too much current is the kind of sinking feeling you never want to experience.