I think I'm confusing things when it comes to signal strength, especially with DI signals, say a guitar recorded directly into the interface.

First off, though I'm neither an audio engineer nor an electrical engineer (I'm a mechanical engineer ...), I remember that dB is actually a "reference" unit system: it's the ratio of a quantity over a reference value, converted to a logarithmic scale (badly explained, I know, but it should be close enough). Now, when talking about audio, people seem to use either dBV or dBu. The reference for dBV is 1 Vrms, while it's 0.7746 Vrms for dBu. So far so good?

First question: in a DAW, when looking at the faders, are those dBu, dBV, or something else?

Second question: take my DI signal as an example. I send my guitar signal directly into my interface, so I have a signal coming out of my guitar at "X Vrms", or if you prefer, X dBV or X dBu, whichever. I turn the input gain on my interface so that I get a signal in Reaper that peaks at a maximum of, say, -6 dB. Now, how do I know whether I "amplified" the guitar signal or "reduced" it? Where is unity gain on an interface? Is it at noon?

To better illustrate my point: if I use that recorded signal to reamp, I will get different results depending on how strong the recorded DI is. Suppose I recorded my signal at -30 dB (as seen on the Reaper fader); the reamped sound will be very soft (even with the Reamper at max gain), as if I had lowered the volume knob on my guitar. But if the signal is strong enough (say -1 dB in Reaper), then I won't have to max out the gain on my Reamper and can still reproduce my true guitar sound quite accurately, as if I were playing directly into the amp.

Maybe this is a complex topic, but I'm really interested in understanding it.
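To make the ratio idea concrete, here's a small Python sketch of the dB math described above (the helper names are just mine, and this only covers the ratio/reference arithmetic, not what any particular DAW meter shows):

```python
import math

# dB for voltage (amplitude) quantities: 20 * log10(V / V_ref).
DBV_REF = 1.0     # dBV reference: 1 Vrms
DBU_REF = 0.7746  # dBu reference: 0.7746 Vrms

def volts_to_dbv(v_rms: float) -> float:
    """Express an RMS voltage in dBV (re 1 Vrms)."""
    return 20 * math.log10(v_rms / DBV_REF)

def volts_to_dbu(v_rms: float) -> float:
    """Express an RMS voltage in dBu (re 0.7746 Vrms)."""
    return 20 * math.log10(v_rms / DBU_REF)

def db_difference_to_ratio(db_a: float, db_b: float) -> float:
    """Linear amplitude ratio corresponding to a difference in dB."""
    return 10 ** ((db_a - db_b) / 20)

# The same 1 Vrms signal in both units: dBu reads ~2.2 dB higher than dBV,
# because its reference voltage is smaller.
print(volts_to_dbv(1.0))  # 0.0 dBV
print(volts_to_dbu(1.0))  # ≈ +2.2 dBu

# The reamping example: a DI peaking at -1 dB vs. one at -30 dB (as read on
# the meter) differs by about 28x in linear amplitude, which is why the weak
# recording sounds like the guitar's volume knob was turned way down.
print(db_difference_to_ratio(-1, -30))  # ≈ 28.2
```

The last number is the point of the reamp comparison: a 29 dB difference on the meter is not a small tweak, it's nearly a 30-fold difference in signal amplitude hitting the amp.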