> They have almost zero efficiency loss until you start driving them past max current.
From the datasheets I've looked at, LEDs are always more efficient at lower current. I've never seen an exception, but if you know of one, I'm ready to learn.
For example, if you look at the graph of relative flux vs current on page 9 of the Cree XBD datasheet (PDF warning):
Reading approximate values from the graph of the White LED:
At ~165 mA: 50% relative luminous flux.
At 350 mA: 100%.
At 700 mA: maybe as high as 175%.
Note that if you want to double the relative flux, you have to more than double the current.
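To make the trend concrete, here is a quick sketch that turns those eyeballed readings into a "flux per milliamp" figure, normalised to the 350 mA reference point. The numbers are the approximate graph values above, not datasheet-exact figures.

```python
# Relative "flux per mA" at each operating point, normalised so that
# the 350 mA reference point equals 1.0. The (current, flux) pairs are
# approximate values read off the graph, not exact figures.
points = [
    (165, 50),    # ~165 mA -> ~50 % relative flux
    (350, 100),   # 350 mA  -> 100 % relative flux (reference)
    (700, 175),   # 700 mA  -> maybe ~175 % relative flux
]

reference = 100 / 350  # flux-per-mA at the 350 mA datum

for current_ma, flux_pct in points:
    efficacy = (flux_pct / current_ma) / reference
    print(f"{current_ma:4d} mA: {flux_pct:3d} % flux, relative efficacy ~{efficacy:.2f}")
```

Running it gives roughly 1.06 at 165 mA, 1.00 at 350 mA, and 0.88 at 700 mA: efficacy falls steadily as the drive current rises.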
That 700 mA is still well below the 1 A rating for these chips, yet the efficiency is already falling as the current rises, even at a constant junction temperature.
In practice the savings should be greater than the graph implies, for two reasons:
1. Lower current means a lower forward voltage, so two chips at 150 mA consume less power than one chip at 300 mA (power is volts times amps; see the sketch after this list).
2. Lower current produces a lower junction temperature for the same heatsink arrangement, and lower temperature produces higher radiant flux, as shown by the graphs on page 7. For blue and white diodes the effect isn't so extreme, but look at what happens to the efficiency of red diodes when the temperature rises: below 70% at 50 °C, and well below 50% at 75 °C.
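Here is the sketch mentioned in point 1, a rough power comparison under assumed forward voltages. The Vf values are illustrative, plausible numbers for a white LED, not figures from the XBD datasheet; a real calculation would read them off the forward-voltage-vs-current curve.

```python
# Rough power comparison for point 1: two chips at 150 mA versus one
# chip at 300 mA. The forward voltages below are assumptions for
# illustration, not datasheet values.
VF_150MA = 2.90   # volts, assumed forward voltage at 150 mA
VF_300MA = 3.05   # volts, assumed forward voltage at 300 mA

two_chips_w = 2 * VF_150MA * 0.150   # P = V * I for each of two chips
one_chip_w = VF_300MA * 0.300        # P = V * I for the single chip

print(f"Two chips at 150 mA: {two_chips_w:.2f} W")
print(f"One chip at 300 mA:  {one_chip_w:.2f} W")
```

Even before counting the flux advantage at lower current, the two-chip arrangement draws slightly less power for the same total current, simply because each chip sits at a lower forward voltage.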
I'm not saying you should run your diodes at a lower current (though I do, and if I want more light I add more diodes). I am saying that LEDs do lose efficiency at higher currents, as shown by tests run by the manufacturer.