I'd need someone clever like JBroll to pop in and give his thoughts, but the more I think about it... the more I think a 64-bit floating point mix engine makes this all null and void. I'm using Reaper for this test.

Put a sine tone on a track, peaking at 0 dBFS. Give it a +24 dB boost on the track's gain control, then a -24 dB cut on the master's gain control. Render this out as a mono wav and add it to a new track. Reset the master volume, reset the first track's volume, and invert the polarity of the rendered sine tone.

What I'm getting here is a peak when the file starts, then complete cancellation. That seems to indicate that even though the track was at +24 dB and the meters showed it clipping awfully, the reduction on the master prevented any clipping in the final output file. NOTE: you don't hear any clipping during playback either.

What does this mean in regards to the issue at hand? I'm not so sure. But it's interesting and might be relevant at some point.
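If I understand the floating-point argument right, the null test works because values above full scale are perfectly representable in a float mix bus, so boosting and then cutting by the same amount is (near-)lossless; a fixed-point engine would clamp the boosted signal before the master cut and the damage couldn't be undone. A minimal sketch of that idea (plain Python, hypothetical signal, not Reaper's actual internals):

```python
import math

def db_to_gain(db):
    """Convert decibels to a linear gain factor."""
    return 10 ** (db / 20)

# One cycle of a full-scale (0 dBFS) sine, 100 samples.
sine = [math.sin(2 * math.pi * n / 100) for n in range(100)]

boost = db_to_gain(24)   # track fader: +24 dB
cut = db_to_gain(-24)    # master fader: -24 dB

# Float path: no clamping between stages, so the gains simply multiply out.
float_out = [s * boost * cut for s in sine]

# Fixed-point path: hard-clip to [-1.0, 1.0] after the boost,
# then apply the master cut to the already-clipped signal.
clipped = [max(-1.0, min(1.0, s * boost)) for s in sine]
fixed_out = [s * cut for s in clipped]

# Null test: largest residual after subtracting the processed
# signal from the original (i.e. summing with inverted polarity).
float_residual = max(abs(a - b) for a, b in zip(sine, float_out))
fixed_residual = max(abs(a - b) for a, b in zip(sine, fixed_out))

print(float_residual)  # vanishingly small: complete cancellation
print(fixed_residual)  # large: the clipping was baked in
```

The float residual is down at rounding-error level, which matches the full cancellation in the test; the fixed-point version leaves a big residual because the clipped waveform can never null against the original.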