For sure, it's a theoretical discussion if we're talking about 100% accuracy. But the amount of light escaping a decently sealed grow tent is so small I can't imagine it making a noticeable difference in real-world costs. If we look at the wattage of Bandit's light: 905 watts x 3.4121416 BTU/h per watt = ~3,088 BTU/h. It's rated at 3,065 BTU/h (I used the wrong number earlier; 3,412.1416 is per kilowatt), so I'd bet the remaining ~23 BTU/h leave as light. Higher-efficacy lights produce a higher ratio of light (which converts back to heat later when it's absorbed) to heat generated at the fixture itself. And again, as growers, yes, producing more photons vs. direct heat is definitely beneficial, but from a purely heat-load standpoint there's theoretically little to no difference. Scientifically, 1 watt = 3.4121416 BTU/h, and my understanding is that any electrical device follows this; LEDs are not immune.
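Just to show my math, here's a quick sketch of that arithmetic; the 905 W draw and 3,065 BTU/h rating are the figures quoted above, not measured values:

```python
# Sanity check of the watt-to-BTU/h conversion.
# 1 kW of electrical power = 3412.1416 BTU/h, so 1 W = 3.4121416 BTU/h.

BTU_PER_HOUR_PER_WATT = 3.4121416

def watts_to_btu_per_hour(watts: float) -> float:
    """Convert electrical power draw (W) to heat output (BTU/h)."""
    return watts * BTU_PER_HOUR_PER_WATT

fixture_watts = 905   # Bandit's light, figure from the post
rated_btu = 3065      # manufacturer's heat rating, figure from the post

total_btu = watts_to_btu_per_hour(fixture_watts)
light_btu = total_btu - rated_btu  # remainder, presumably leaving as light

print(f"Total heat load: {total_btu:.0f} BTU/h")       # ~3088
print(f"Unaccounted (light): {light_btu:.0f} BTU/h")   # ~23
```

Every watt the fixture draws ends up as heat somewhere; the only question is whether it's released at the fixture or after the light is absorbed.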
But this is just something I've always found interesting, and something we debated back in trade school: "does turning off lights in the winter save money vs. an electric furnace?"