|
Post by supernaut on Feb 24, 2013 17:32:40 GMT -7
The Monza's master volume is called a "post phase inverter" master volume. How exactly is this circuitry different from a "regular" master volume (if there is such a thing) OR IS IT? I think I know that with a traditional master volume, the distortion is created in the preamp for lower level playing. Kicking up the volume starts unleashing the beast of power amp distortion. Is this also the case with the Monza's master? All I know is that when I have the Monza on whisper quiet, it still sounds great. Better than my other master volume amps. Any knowledge would be greatly appreciated.
|
|
|
Post by digiTED aka 'Ted' on Feb 24, 2013 17:59:03 GMT -7
I'm no amp tech, but my rudimentary understanding is the following:
A 'normal' MV lives before the phase inverter (PI) and only lets the preamp section get cranked up. The PI and power section then only respond to the lower signal, with a resulting lack of sag (hence the 'sterility' some amps have with the MV turned down).
A PPIMV also lets the PI get hit with the full brunt of the preamp signal, so it is allowed to distort as well. A distorted PI can sag the voltage (I think) going to the power section, which allows the feel of cooking power tubes without the full wattage and resulting volume.
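The difference in where the MV sits can be sketched as a toy model, treating each tube stage as a soft clipper (tanh is just a stand-in for tube saturation here, not a circuit simulation; all gains and the "crest factor" measure are illustrative assumptions, not anyone's actual schematic):

```python
import numpy as np

def stage(x, gain):
    """Toy tube stage: roughly linear at low levels, saturates at high levels."""
    return np.tanh(gain * x)

x = np.sin(np.linspace(0, 2 * np.pi, 1000))  # one cycle of a test tone
preamp = stage(x, 5.0)                        # cranked preamp, heavily clipped

mv = 0.1  # master volume set low

# 'Normal' MV: attenuate BEFORE the PI, so the PI and power stage stay clean.
pre_pi_mv = stage(stage(mv * preamp, 1.0), 1.0)

# PPIMV: the PI sees the full preamp signal and saturates;
# only the power stage is fed the attenuated signal.
ppimv = mv * stage(preamp, 3.0)

def crest(y):
    """Crest factor (peak / RMS): drops as a signal gets more compressed."""
    return np.max(np.abs(y)) / np.sqrt(np.mean(y ** 2))

print(f"pre-PI MV crest factor: {crest(pre_pi_mv):.3f}")
print(f"PPIMV    crest factor: {crest(ppimv):.3f}")
```

The PPIMV path comes out with a lower crest factor, i.e. more squashed, which matches the "PI is allowed to distort too" point above even at whisper volume.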
|
|
|
Post by supernaut on Feb 24, 2013 18:01:44 GMT -7
Is this a different approach than (London) power scaling?
|
|
|
Post by digiTED aka 'Ted' on Feb 24, 2013 18:15:01 GMT -7
Yeah. London Power Scaling is kind of like having an attenuator built in, but implemented in a non-tone-sucking manner (in some folks' opinions). Here's a blurb I nicked off the web that jibes with other stuff I've read: "It uses a MOSFET to shed the wattage as waste heat, thereby reducing the output of the tubes and the volume of the amp without greatly altering the tone. There is a 1 meg pot attached to a small circuit board that is used to vary the amount of B+ voltage reduction sent to the MOSFET.
Basically, it is a knob you turn to reduce the amount of B+ voltage being sent to the tubes by redirecting it to a heat sink instead. This in turn reduces the volume of the amp but keeps the tone."
Here's a link to the inventor's FAQ: www.londonpower.com/pscaling.htm
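To put the B+ reduction in rough numbers: a common rule of thumb is that a power stage's maximum output scales with roughly the square of its supply voltage. The amp wattage and voltages below are made-up illustration figures, not specs for any real amp:

```python
import math

def scaled_power(full_watts, full_b_plus, scaled_b_plus):
    """Estimated max output power after B+ reduction, using the
    rough approximation that power scales with B+ squared."""
    return full_watts * (scaled_b_plus / full_b_plus) ** 2

# Hypothetical 18 W amp with a 350 V B+ rail, scaled down to 175 V:
p = scaled_power(18, 350, 175)
print(p)                        # -> 4.5 (watts)
print(10 * math.log10(p / 18))  # about -6 dB of maximum level
```

So halving B+ takes the wattage down to a quarter, which is only about 6 dB quieter at full tilt; that's why the scaling knob covers such a wide range.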
|
|
|
Post by supernaut on Feb 24, 2013 18:51:56 GMT -7
AH, OK. Thanks, digited. You RULE!
|
|
|
Post by club327 (aka Ben) on Feb 24, 2013 19:50:59 GMT -7
I should mention that my Chihuahua amp has London Power Scaling in it. All of the above is true. However, my amp has two knobs added in back for the power scaling. Best I can tell, one does get rid of the watts but also slightly alters the break-up point of the amp, so a second knob was added that makes it a bit quieter but will not completely turn it off, so to speak. Together they work pretty well, but I don't know anything more about them.
|
|
|
Post by muZician on Feb 26, 2013 4:23:57 GMT -7
Yeah, digiTED's explanation is correct. The regular, or pre-phase-inverter, MV reduces the signal delivered to the phase inverter and the power stage. Those two stages work with no saturation, and the only distortion you have is created in the preamp section (or by external pedals). This MV has the charm that it keeps more of the amp's dynamic sensitivity.

If you place the MV after the phase inverter stage (before the power tubes), the phase inverter tube is allowed to saturate. Quite often, before you hear distortion you get strong signal compression, which is quite good for clean or almost-clean solos because it adds sustain. It's less good (to my ears) for rock rhythm, since you lose dynamic range (your guitar's volume knob will also be less able to clean up the amp as you turn it down). Two different MV concepts, both with pros and cons. Both have in common that the output tubes see a much lower grid signal and so stay far away from saturation.

Very different from London Power Scaling. That circuit is a power control feedback circuit inserted directly into the power stage. There the tubes are "pushed" into saturation but without producing full power, switching to another response curve by changing the load characteristic of the power tube. I guess they use a variable resistance (a power MOSFET) in series with the load current, so you get distortion/saturation but at much lower output power. But here I'm not sure, it's just a guess... but it would work ;-)
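muZician's series-resistance guess can be put into numbers with plain voltage-divider math. To be clear, this models the guess, not the actual London Power circuit (which, per the FAQ linked above, works by lowering B+); the drive voltage and resistances are made up:

```python
def load_power(v_drive, r_series, r_load):
    """Power delivered to the load when a variable series resistance
    drops part of the drive voltage (voltage-divider / Ohm's law)."""
    i = v_drive / (r_series + r_load)  # same current through both resistances
    return i ** 2 * r_load             # P = I^2 * R dissipated in the load

# Hypothetical 20 V drive into an 8-ohm load:
print(load_power(20, 0, 8))   # no series resistance -> 50.0 (watts)
print(load_power(20, 8, 8))   # series R equal to the load -> 12.5 (watts)
```

A series element equal to the load already cuts the delivered power by a factor of four while the driving stage keeps swinging (and saturating) just as hard, which is the effect muZician is describing.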
|
|
|
Post by supernaut on Feb 26, 2013 21:36:10 GMT -7
AHHH, Thank you both for your explanations. That helps lots.
|
|