Post by bradm on Mar 16, 2018 7:11:54 GMT -7
I've posted this in the Marshall forum as I frequent there a lot, but I'm also curious what some folks' opinions are on it here, too.
This morning I happened upon Aiken's article regarding notch biasing and he makes a valid point. If you haven't read it, you'll find it below. Make sure you have a read first then let the discussion begin.
Aiken & Notch biasing
I totally agree with his premise of voltage/current measurements for bias. Here's my added take on it from a global perspective.
I was raised in the discrete component era (late 70's - early 80's) of electronic engineering classes. When you look at pure hifi or any consumer audio amp (i.e. NOT guitar), the scope notch method was valid for a couple reasons.
1 - You set the bias, whether tube or transistor, to eliminate crossover distortion as much as possible
2 - However, given hifi enthusiasts where "no distortion" is ideal, you will never purposefully push any given amp to the point of clipping. For music reproduction in a pure listening environment, you're focused on clean throughput from entry point to speakers. And, the designer of any given amp is, too. (Or should be!)
Which brings us to,
3 - With guitar amps, we typically thrive on clipping in a variety of stages and, thus, we will push the design limits of any given tube in regard to play or performance. Given that very premise, Aiken's case about the likelihood of exceeding manufacturer's specs presents a very real hazard to tube life and amp stability if biased toward the hot side.
And, I believe that's true. One thing that was drilled into me in class was "don't exceed manufacturer's specs" and you're safe.
I have biased both ways and I've seen the errors scope-only measurements can make. I've never had a client with an issue when simply relying on voltage/current measurements and parking dissipation at around 62-65% of the manufacturer's specified max plate dissipation in regard to fixed-bias amps. (Cathode bias is another story as it tends to auto-regulate, and tubes are often biased near or at max. Let's stick to typical fixed bias in class AB to keep some uniformity.)
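Just to make the arithmetic behind that 62-65% figure concrete, here's a minimal sketch of the calculation I'm describing. The function and the example numbers (an EL34's 25 W rated max plate dissipation, a measured 450 V on the plate) are my own illustrative assumptions, not from any specific amp on my bench:

```python
# Rough sketch of the "percent of max dissipation" bias math for a
# fixed-bias class AB amp. All numbers below are illustrative only.

def target_bias_current_ma(plate_voltage_v, max_dissipation_w, fraction=0.65):
    """Per-tube idle current (mA) that puts plate dissipation at
    `fraction` of the tube's rated maximum (P = V * I, so I = P / V)."""
    target_w = max_dissipation_w * fraction
    return target_w / plate_voltage_v * 1000.0  # amps -> milliamps

# Example: EL34 (25 W max plate dissipation) with 450 V measured on the plate
current = target_bias_current_ma(450.0, 25.0)
print(f"Target idle current: {current:.1f} mA per tube")
```

With those numbers you'd land a bit over 36 mA per tube, which is in the ballpark most techs would call safely warm. Measure the actual plate voltage under load rather than trusting the schematic, since it sags with wall voltage and tube condition.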
Consider those reasons and tell me what you've experienced. Here's another reason I'll continue to use voltage/current to set bias: I'm not biasing just my own amp. I have a string of clients, and most bring me an amp that's usually in the $1200+ range of value. Even safety aside, I can't afford to set up an amp to push the edge because "it's where it sounds the absolute best". That edge is also where shortened tube life lives. And, in a world of PPIMVs and overdriven preamp stages, I don't find it necessary.
So, for longevity and customer satisfaction...I'll bias for some lasting and safe enjoyment. To date, I've never had a customer return with failing tubes or a complaint that the amp sounds awful. (I did have one get irate with multiple problems continually appearing, but he unfortunately bought an old lemon of an amp...I won't go there.)
Thoughts?