SSGrower
Well-Known Member
Assuming we want light that is PAR: if you have to add a heater, the power it consumes must be less than the difference in watts between the HID and the LED at the same PAR output. And for it to make fiscal sense, that advantage has to be well in favor of the LED, given the LED's higher initial cost. Since we are in this discussion, I'd like to gauge what people think about the following:
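First, to put rough numbers on the heater point above, here's a minimal back-of-the-envelope sketch. The fixture wattages, PAR efficiencies, and heater draw are all assumed placeholder numbers, not measurements; the only point is that the heater watts have to stay below the wall-watt savings for the LED to come out ahead on the power bill.

```python
# Back-of-the-envelope comparison: HID vs LED + heater at equal PAR output.
# Every number below is an assumed placeholder, not a measured value.

HID_WATTS = 1000          # wall draw of the HID fixture (assumed)
HID_PAR_EFF = 0.35        # fraction of wall watts emitted as PAR (assumed)
LED_PAR_EFF = 0.55        # fraction of wall watts emitted as PAR (assumed)

# LED wall draw needed to match the HID's PAR output
target_par = HID_WATTS * HID_PAR_EFF
led_watts = target_par / LED_PAR_EFF

# Wall-watt savings from the swap; any heater must consume less than this
savings = HID_WATTS - led_watts
heater_watts = 200        # assumed heater needed to keep the room warm

total_led = led_watts + heater_watts

print(f"HID draw:          {HID_WATTS:.0f} W")
print(f"LED draw:          {led_watts:.0f} W")
print(f"Savings:           {savings:.0f} W")
print(f"LED + heater draw: {total_led:.0f} W")
print("LED still ahead" if heater_watts < savings else "heater ate the advantage")
```

Now the questions: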
As for the environment: chip temperature be damned, but how does the temperature of the heatsink affect the environment?
You could, for example, have a fixture that dissipates, say, 500 W of heat through a heatsink that rises in temperature, say, 45 degrees over ambient.
And an identical room and fixture, the only difference being that the heatsinks are larger and the 500 W only raises the heatsink temperature by 5 degrees.
All temps at steady-state operation.
Same watts in, but how would the environment react in each case? I don't think it would be the same, but I'd like to know what people think.
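To frame those two heatsinks numerically, here's a rough sketch using the numbers from the example above (ambient temperature is assumed). At steady state both fixtures dump the full 500 W into the room; the temperature rise only tells you the sink-to-ambient thermal resistance and how hot the surface runs, which I'd guess is where any difference in how the room reacts would come from.

```python
# Rough steady-state comparison of the two heatsinks described above.
# Values come from the example in the post; the ambient temp is assumed.

HEAT_W = 500.0                      # watts dissipated by the fixture
AMBIENT_C = 25.0                    # assumed room ambient temperature

for name, rise_c in [("small heatsink", 45.0), ("large heatsink", 5.0)]:
    r_theta = rise_c / HEAT_W       # sink-to-ambient thermal resistance, C per W
    surface_c = AMBIENT_C + rise_c  # steady-state heatsink surface temperature
    print(f"{name}: R_theta = {r_theta:.3f} C/W, "
          f"surface = {surface_c:.0f} C, heat into room = {HEAT_W:.0f} W")
```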
And since keeping your room warm enough is important when using LEDs, are there cases where you actually want your LEDs to run hot so you don't have to put in a 2000 W radiator?
Since most current operations are optimized for HID rather than for LED use, it's kind of like pounding a square peg into a round hole not to build a new environment for the LED.