I have built drivers based on three different linear circuits, and they range from semi-efficient to totally inadequate. If you can drive an LED (or a series string of LEDs) without relying on your regulator to drop much voltage at significant current, a cheap linear circuit will work just fine. For example, if you run three 3.3V LEDs in series (9.9V total) from 12 volts, you are only dropping 2.1 volts at the regulator. However, try running one 3.6V LED at 2.2 amps from a 14.5V car electrical system and you are now dropping 10.9 volts x 2.2 amps, or about 24 watts. That is a totally unreasonable amount of heat to ask a small linear circuit to dissipate. So, my experience says: match the total LED forward voltage closely to your source voltage and run the LEDs at lower currents if you want to use a home-brew linear circuit. Otherwise, you're stuck buying a more expensive DC-DC switching regulator.
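The back-of-the-envelope math above can be captured in a few lines. Here's a rough sketch (the function name and example currents are my own, not from any particular datasheet) that computes the power a linear driver has to burn as heat: simply the voltage left over after the LED string, times the LED current.

```python
def linear_driver_dissipation(v_supply, v_led_total, i_led):
    """Watts burned as heat in a linear regulator: (Vsupply - Vled) * I."""
    v_drop = v_supply - v_led_total
    if v_drop < 0:
        raise ValueError("supply voltage must exceed total LED forward voltage")
    return v_drop * i_led

# Three 3.3 V LEDs in series from 12 V, at an assumed 350 mA:
print(linear_driver_dissipation(12.0, 9.9, 0.35))  # ~0.7 W -- easy to handle
# One 3.6 V LED from a 14.5 V car system at 2.2 A:
print(linear_driver_dissipation(14.5, 3.6, 2.2))   # ~24 W -- hopeless
```

The same numbers also show the efficiency gap: with the well-matched string, roughly 9.9/12 ≈ 83% of the input power reaches the LEDs, while the single-LED case delivers only 3.6/14.5 ≈ 25%, which is why a switching regulator is the only sane choice there.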