With a magnetic ballast you can see up to 20% greater efficiency running at 240 volts, but digital ballasts should not be far off either way. Digital ballasts have power factors of 0.95 and greater, whereas magnetics can drop to around 0.8. For single-phase supply circuits, Amps x Volts gives you volt-amps (apparent power); true watts are Amps x Volts x power factor, so the two only line up when the power factor is near 1. With that math, your 240 V x 4.2 A = 1008 watts looks off, while your 120 V x 8.9 A = 1068 watts looks about right. Were you taking the amperage readings with a clamp-on meter?
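To make the distinction above concrete, here is a minimal sketch of the single-phase math. The voltage and amperage figures are the readings quoted in the post; the 0.95 power factor is just an illustrative digital-ballast value, not a measured one.

```python
def apparent_power(volts, amps):
    """Volt-amps (VA): what a plain Volts x Amps multiplication gives you."""
    return volts * amps

def real_power(volts, amps, power_factor):
    """True watts consumed: apparent power scaled by the power factor."""
    return volts * amps * power_factor

# The two clamp-meter readings from the post, as apparent power:
va_240 = apparent_power(240, 4.2)  # about 1008 VA
va_120 = apparent_power(120, 8.9)  # about 1068 VA

# With a digital ballast near unity power factor, VA and W are close;
# at a lower power factor the true watts fall well below the VA figure.
w_240 = real_power(240, 4.2, 0.95)
w_120 = real_power(120, 8.9, 0.95)
print(va_240, va_120, w_240, w_120)
```

The point is simply that a clamp-on meter reads current, and multiplying that by voltage overstates consumed watts unless the power factor is close to 1.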
The IG universal digital power supplies have a 0.99 power factor, and I can tell you from personal experience there is a negligible difference in consumed watts whether the lights run on 120 or 240 volt supply circuits. The only difference is that at 120 volts a 420 draws ~3.9 amps each, and at 240 volts ~1.9 amps each. For circuit layouts I'll run four 420's on one 20 amp, 120 volt circuit, or eight 420's on a 20 amp, 240 volt circuit. If the draw were 10% higher at 120 volts, I'd only be able to run three 420's on a 20 amp, 120 volt circuit, and that has never been the case.
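The circuit-loading counts above can be checked with a quick sketch. One assumption I'm adding that the post doesn't state: the circuits are loaded to 80% of breaker rating (the usual continuous-load derate, i.e. 16 A on a 20 A breaker), which is what makes the four/eight/three counts come out as described. The per-ballast amp draws are the figures quoted in the post.

```python
import math

def max_ballasts(breaker_amps, amps_per_ballast, continuous_factor=0.8):
    """Ballasts that fit on one circuit under a continuous-load derate.

    continuous_factor=0.8 assumes the common 80% rule for continuous
    loads; it is not stated in the original post.
    """
    return math.floor(breaker_amps * continuous_factor / amps_per_ballast)

print(max_ballasts(20, 3.9))        # 4 x 420's on a 20 A, 120 V circuit
print(max_ballasts(20, 1.9))        # 8 x 420's on a 20 A, 240 V circuit
print(max_ballasts(20, 3.9 * 1.1))  # a 10% higher draw would allow only 3
```

That last line is the author's sanity check: if 120 V operation really drew 10% more current, only three ballasts would fit per circuit, which contradicts his experience.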
GG should check in with AT, as they should have a simple explanation for the different wattage readings he's getting. It's not making sense to me.