Airguy,
Dropping voltage with a resistor is simple, but using a DC-to-DC converter for high-power LEDs is not at all rare. The choice between resistive, linear, and switching regulation comes down to several factors. If the LED is relatively high power (these are) and must operate at both 12V and 24V (these do), then a switching regulator is the easy winner.
You missed the key clue I listed in my post: For the product under discussion, the input current decreases as a function of increasing voltage.
Input Current: 1.25 A @ 12.8 V ....... 0.6 A @ 25.6 V
In fact, the supply apparently draws slightly less power at the higher voltage (16 W in vs. about 15.4 W in), so it may even be a bit more efficient there. I'm changing my assertion from "likely a switcher" to "definitely a switcher."
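You can check that claim directly from the two current figures quoted above (input power is just V times I):

```python
# Input power computed from the quoted figures: 1.25 A @ 12.8 V, 0.6 A @ 25.6 V
p_low = 12.8 * 1.25   # input power at 12.8 V -> 16.0 W
p_high = 25.6 * 0.6   # input power at 25.6 V -> 15.36 W

print(p_low, p_high)  # power actually drops slightly at the higher voltage
```

A resistor or linear scheme could never produce that trend; only a converter can.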
If it were done with a resistor, the current would roughly double from 12V to 24V, and the power consumed would roughly quadruple. It's actually a little worse than that, but either way it's not very attractive.
If it were done with a linear regulator (voltage or current), the input current would stay the same regardless of input voltage. Therefore, the power consumption would double.
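A quick sketch of those two cases, using made-up numbers (a 1 A load set up at 12 V; the 6 V LED forward drop is hypothetical, just to show the "a little worse" caveat):

```python
# Idealized resistor case (LED drop ~ 0, so the resistor sees the full Vin):
r = 12.0 / 1.0                     # sized for 1 A at 12 V -> 12 ohms
i_24 = 24.0 / r                    # 2 A: current doubles
p_12 = 12.0 * 1.0                  # 12 W
p_24 = 24.0 * i_24                 # 48 W: power quadruples

# With a real LED forward drop (say Vf = 6 V) it's worse than 4x:
vf = 6.0
r2 = (12.0 - vf) / 1.0             # resistor sized for 1 A at 12 V -> 6 ohms
i_24_real = (24.0 - vf) / r2       # 3 A, not 2 A

# Linear regulator case: input current is pinned at the load current,
# so input power simply scales with input voltage:
p_lin_12 = 12.0 * 1.0              # 12 W
p_lin_24 = 24.0 * 1.0              # 24 W: power doubles
```

Either way the extra power ends up as heat in the dropper, which is the point of the next paragraph.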
Resistors and linear regulators are virtually two sides of the same coin. They drop voltage or limit current by turning electrical energy into thermal energy.
DC-to-DC switch-mode power supplies are POWER CONVERTERS. They use reactive elements (inductors and capacitors) to store electrical energy supplied at one voltage, then release it to the output at another voltage level (either higher or lower). Except for imperfections in the switches, inductors, capacitors, etc., this is not a power-dissipating process. Power remains constant (minus losses, of course), so increasing the input voltage WILL result in a decrease in input current.
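That constant-power behavior is easy to model. The output power and efficiency below are hypothetical figures, roughly matched to the product's numbers, just to show the shape of the curve:

```python
def input_current(v_in, p_out=14.0, eta=0.9):
    """Input current of an idealized constant-power converter.

    p_out and eta are assumed (hypothetical) values: the converter
    delivers p_out watts to the LED at efficiency eta, so the input
    must supply p_out / eta watts no matter what v_in is.
    """
    return (p_out / eta) / v_in

i_low = input_current(12.8)    # higher current at the lower voltage
i_high = input_current(25.6)   # exactly half the current at double the voltage
```

Double the input voltage, halve the input current, same input power: exactly the trend in the quoted current figures.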