DanH

A friend recently asked an interesting question, for which I had no answer. It was "Have you ever done a mag check at full throttle in climb?" He had been doing it for many years, with many airplanes, and a recent one had been showing roughness.

At first I went "Huh?", but a little thought says the test makes a lot of sense. It is good to know if the engine operates in a satisfactory manner on one ignition at WOT, and I cannot think of any mechanical reason to recommend against it.

The PIREP? My IO-390 is currently firing Tempest massive electrode plugs with a 500+ hour non-impulse Slick, and Denso iridium plugs with an EDIS-based electronic ignition. Passing through 500 MSL at WOT, switching off either ignition offered no clue except immediately rising EGT. No roughness, and if there is a power loss, the prop governor covers it up. Without the EGT instrumentation, I could not tell it was running on just one.

Maybe it's just me, but I'd not heard of mag checks at WOT. Was I asleep when they passed out the instructions? What are y'all doing out there?
 
My first in-flight Lightspeed Ignition failure was at high-speed cruise. The only indication that I had lost the ignition was that my electronic RPM indication went away, which clued me to look at the Lightspeed breaker, which had popped open (and would not reset).

Carl
 
Where in the original post was there mention of doing this check at 500 ft AGL?

Actually I did state "passing through 500 MSL", but concern about cycling a switch depends on how much runway is out in front at the time.

Let's assume folks have a considered flight plan, and talk about ignitions instead.

BTW, density will make a difference, so I currently intend to repeat at 8000 or above.

So, where in the Lycoming manual does it say that this is a good idea? Where in any flight handbook is this advised?

It doesn't, thus the invitation for discussion.

Might we note that such things as leaned stress tests, GAMI spreads, and electronic ignitions are also not mentioned in Lycoming manuals.
 
Since it has been a while since I stayed at a Holiday Inn Express, this is only opinion. :)

WFO TO power includes extra gas for cooling, so the flame speed should be slower (peak ICP timing past the sweet spot) but still inside the A/F range that is pretty easy to light. Because of this, it is not surprising to me that a test of a good ignition system does not show any significant performance change that can be felt by the seat of the pants.

My gut feel is that the LOP ignition test may be a better indicator of ignition performance since the mixtures are theoretically harder to light. I suspect an LOP test of an ignition showing roughness at TO power would be worse. It would be handy to get feedback from Dan's acquaintance on an LOP test to see if the engine shakes off the mounts. :eek:

I look forward to comments from others on this thread with direct experience with engine or ignition development.
 
Low level, no - but I do cruise-altitude WOTLOP mag checks frequently, in both aircraft. It's an excellent ignition health indicator.

And no, you probably won't find that information in your Lycoming book - but it's not in their interest to cover that.
 
OK, I pruned out some posts that were way off the topic of testing ignition at WOT.

If someone wants to discuss the wisdom of turning off the ignition during climb---please start a new thread about that subject.

Now, as Dan has asked, back to discussing ignition please.
 
A technical point of interest:
The Mazda Rotary flight group has a term, SAG:
Sparkplug Attention Getter.
These engines are hard on spark plugs; eventually they begin to fail at full throttle, so you have to close it a bit to keep the plugs firing. Then you land and change the plugs.
The higher the true compression, the greater the dielectric resistance to firing the plug. So the test you suggest is valid. It should be done at the lowest possible altitude at full throttle. To me that means on the ground over a clean spot. I wouldn't worry about a fixed pitch prop only attaining 2200 RPM instead of the full rated 2700 RPM.
Heat/time could be a factor also, so maybe such a test could be done after a flight, just before shutdown, although there are reasons not to do that as well.
 
OK, I pruned out some posts that were way off the topic of testing ignition at WOT.

If someone wants to discuss the wisdom of turning off the ignition during climb---please start a new thread about that subject.

Now, as Dan has asked, back to discussing ignition please.
Seems to me you threw the baby out with the bathwater but, well... oh never mind. I guess I will remain a spectator on this one.
 
ignition test WOT

I seem to recall that the Australian guy who offers engine management classes all over the world almost always does his ignition checks before descending.
His data indicated that it is a much more realistic test of the system than the standard run-up.
Maybe some who are interested will search the postings. I have run my Lycoming WOT 98% of the time since it was new. I take off from 5000 ft at home and climb to 8500-13500 ft every day. When I check the P-mag and the Lightspeed back and forth... there is hardly a change at all. The prop is a fixed pitch Catto.
On rare, really cold, high altitude days... I will back the throttle off a bit to swing the throttle plate in the flow. I get a bit better mixture distribution that way.
This carb model will not run LOP smoothly most of the time. But gas is still cheaper than engine work... at least at my local shop.
 
A technical point of interest:
The Mazda Rotary flight group has a term, SAG:
Sparkplug Attention Getter.
These engines are hard on spark plugs; eventually they begin to fail at full throttle, so you have to close it a bit to keep the plugs firing. Then you land and change the plugs.
The higher the true compression, the greater the dielectric resistance to firing the plug. So the test you suggest is valid. It should be done at the lowest possible altitude at full throttle. To me that means on the ground over a clean spot. I wouldn't worry about a fixed pitch prop only attaining 2200 RPM instead of the full rated 2700 RPM.
Heat/time could be a factor also, so maybe such a test could be done after a flight, just before shutdown, although there are reasons not to do that as well.
I can't get the silly image out of my mind of an RV screaming through Death Valley on the deck with the pilot head down, fiddling with switches.

Scott, you bring up a good point about high compression spark. Hard to know which situation on our Lyc's is the worst case for ignition. WFO TO or LOP. Need data. Yum.
 
I can't get the silly image out of my mind of an RV screaming through Death Valley on the deck with the pilot head down, fiddling with switches.

The reality being 100 knots, over a runway, with the EFIS in recording mode.

Scott, you bring up a good point about high compression spark

Yep. It's what I assumed was being checked. Mixture aside, the "100% power" and "LOP at altitude" cases are different in that regard. Reduction of in-cylinder pressure would be proportional to density?

BTW, I should not have titled the thread "...at WOT", but rather, "at 100%"
 
A technical point of interest:
The Mazda Rotary flight group has a term, SAG:
Sparkplug Attention Getter.
These engines are hard on spark plugs; eventually they begin to fail at full throttle, so you have to close it a bit to keep the plugs firing. Then you land and change the plugs.
The higher the true compression, the greater the dielectric resistance to firing the plug. So the test you suggest is valid. It should be done at the lowest possible altitude at full throttle. To me that means on the ground over a clean spot. I wouldn't worry about a fixed pitch prop only attaining 2200 RPM instead of the full rated 2700 RPM.
Heat/time could be a factor also, so maybe such a test could be done after a flight, just before shutdown, although there are reasons not to do that as well.

Agreed. I run one mag, one elec ignition (Electroair) on my spam can. A couple of months ago, ground checks at runup were fine at about 1800-2000 rpm, but in cruise at full cruise power (2500) I felt some roughness at about 3000 MSL. Switched it to the mag, ran fine. Switched it to the EI, ran like **** (that's probably an understatement - I had to retard the throttle to about 2200 to get it to run without bucking). After diagnosis, I determined the Champion massive electrode plugs were shot, replaced them with a set of Tempest fat plugs, nothing else changed, and now it's running fine.

I asked Electroair when I was troubleshooting; they said yep, replace the plugs. Sure enough.

Plugs do go bad.
 
Ignition Test

An ignition test done under these conditions will determine that the engine can make climb power if either ignition system fails shortly after takeoff. Seems valid to me. Usual safety precautions apply.

Don Broussard

RV 9 Rebuild in Progress
 
Scott, you bring up a good point about high compression spark. Hard to know which situation on our Lyc's is the worst case for ignition. WFO TO or LOP. Need data. Yum.

It is true that high compression charges are harder to light; however, engines with static valve timing fundamentally produce the same pressure (i.e. compression) regardless of RPM or throttle position. There is a small reduction at low RPMs, as there is more time for pressure to escape via the ring gap during the compression stroke.

I believe that turbulence and swirl increase with RPM, and this can make it more difficult to light the charge, making the engine less tolerant of a weak spark.

It is also true that lower ambient air pressure increases the difficulty of lighting a charge, and therefore it is wise to test ignition systems at altitude, as they will show their weakness there before they will on ground runs.

Larry
 
Low level, no - but I do cruise-altitude WOTLOP mag checks frequently, in both aircraft. It's an excellent ignition health indicator.

+1

Have done this for years during low workload phases of cruise, helps to find problems early.

I also tend to check just before top of descent on cruise while traveling. That way I can identify problems earlier than the preflight/runup when I'm ready to come home.
 
It is true that high compression charges are harder to light; however, engines with static valve timing fundamentally produce the same pressure (i.e. compression) regardless of RPM or throttle position. There is a small reduction at low RPMs, as there is more time for pressure to escape via the ring gap during the compression stroke.

I believe that turbulence and swirl increase with RPM, and this can make it more difficult to light the charge, making the engine less tolerant of a weak spark.

It is also true that lower ambient air pressure increases the difficulty of lighting a charge, and therefore it is wise to test ignition systems at altitude, as they will show their weakness there before they will on ground runs.

Larry

I'd have to disagree with these points here. Throttle angle makes a massive difference in charge density, thus compression pressure. That's why throttles, throttle. Ring gap leakage at anything above 2000 rpm would make no measurable difference in compression pressure.

Increased turbulence with increased rpm likely makes the charge more homogeneous and INCREASES ignition probability.

Lower MAP at altitude results in lower compression pressure, making it EASIER to ignite the mixture.

Atmo Lycoming engines hardly present a difficult case to light off the mixture compared to forced-induction automotive engines, which may have 3 to 10 times the cylinder pressure and operate at 3 to 6 times the rpm, where the time available to saturate the coils is very short.

In the case of low revving engines like Lycomings, worst case spark scenario is torque peak rpm at sea level where there is maximum cylinder pressure.

If you have a problem firing your average Lycoming plugs, I'd look hard at a marginal ignition system.
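
As a rough illustration of the compression-pressure scaling discussed above (the compression ratio and polytropic exponent are assumed, illustrative values, not measured data):

Code:
# Rough estimate of end-of-compression pressure vs. manifold pressure.
# Assumes the cylinder fills to about manifold pressure at intake valve
# closing, then is compressed polytropically. CR and n are assumed,
# illustrative values, not any particular engine's measured numbers.

CR = 8.5     # assumed compression ratio
N  = 1.3     # assumed polytropic exponent for the compression stroke

def compression_pressure_psia(map_inhg, cr=CR, n=N):
    map_psia = map_inhg * 0.4912   # inHg to psia
    return map_psia * cr ** n

for map_inhg in (29.9, 24.0, 20.0):   # roughly SL WOT, ~5,000 ft WOT, ~10,000 ft WOT
    print(f"MAP {map_inhg:4.1f} inHg -> ~{compression_pressure_psia(map_inhg):3.0f} psia at the plug gap")

No valve timing, volumetric efficiency, or heat transfer in there, so treat the absolute numbers loosely; the point is that pressure at the plug scales close to proportionally with MAP, and part throttle or altitude both pull it down.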
 
I keep a good bookshelf, but it's surprisingly shallow regarding detailed ignition data. Can anyone recommend a really comprehensive text, preferably one that doesn't cost a small fortune?

Lower MAP at altitude results in lower compression pressure, making it EASIER to ignite the mixture.

That may be one of those subtle details. Available data says low pressure lessens the voltage requirement to ionize the gap and start the arc. Conversely, high in-cylinder pressure (like WOT at low altitude) should require more voltage to ionize. However, striking an arc and igniting a mixture are not necessarily the same thing. For example, I observe that I can smoothly operate much further LOP at a lower altitude than at a high altitude...so the mere fact of lower voltage requirement doesn't seem to directly equate with consistent ignition.
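
For a sense of the trend, here's a minimal sketch using the textbook uniform-field Paschen curve with commonly published constants for air. It is not spark plug test data; real plug gaps with sharp electrodes and hot mixture break down at lower voltages, so read it for the shape of the curve only.

Code:
import math

# Paschen's law estimate of the voltage needed to break down a gap in air.
# A, B and gamma are commonly quoted textbook values for air; this is a
# uniform-field approximation, so use it for trend only.

A = 15.0       # 1/(cm*Torr)
B = 365.0      # V/(cm*Torr)
GAMMA = 0.01   # assumed secondary electron emission coefficient

def breakdown_voltage(p_torr, gap_cm):
    pd = p_torr * gap_cm
    return B * pd / (math.log(A * pd) - math.log(math.log(1.0 + 1.0 / GAMMA)))

GAP_CM = 0.046 * 2.54   # ~0.046 in plug gap
for p_atm in (1.0, 5.0, 10.0, 15.0):   # ambient up to roughly compression pressure
    kv = breakdown_voltage(p_atm * 760.0, GAP_CM) / 1000.0
    print(f"{p_atm:4.1f} atm in the gap -> ~{kv:4.1f} kV to ionize")

Which lines up with the first half of the observation: the voltage demand climbs steeply with pressure in the gap, while whether the resulting arc actually lights a lean charge is a separate question.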

From John Schwaner's "The Magneto Ignition System":

 
Can anyone recommend a really comprehensive text, preferably one that doesn't cost a small fortune?

No, I don't think so; I have searched for years. The Obert, and a couple of other similar, newer books, are very good overall. The remainder get historical or narrow to specific categories, topics that would make good chapters for the previous texts.

To the OP, though: peak cylinder pressures (PCP) and compression pressures most definitely get lower with altitude and manifold pressure. Not precisely proportional to inlet pressure, but close. Other factors are at work if part throttle is used.

Why not go to a single ignition at WOT down low? It depends. On a hot day with an engine that is closer to detonation, it might move into detonation during the check. Higher, cooler, lower pressures, etc., and the detonation margin would certainly be increasing (vs the AF curve, so let's not get picky), rendering any concerns about that aspect moot. Would it hurt? Probably not, as long as the time was short (a handful of seconds) and the recorded data was to be used for comparison.

The need for higher voltage to jump the gap at higher pressures has long been known to be true. A high school experiment, applying voltage across a tube and evacuating it to get a longer discharge, comes to mind. And yes, striking the spark and igniting the mixture are different things. We do tend to mush things together.
 
I keep a good bookshelf, but it's surprisingly shallow regarding detailed ignition data. Can anyone recommend a really comprehensive text, preferably one that doesn't cost a small fortune?



That may be one of those subtle details. Available data says low pressure lessens the voltage requirement to ionize the gap and start the arc. Conversely, high in-cylinder pressure (like WOT at low altitude) should require more voltage to ionize. However, striking an arc and igniting a mixture are not necessarily the same thing. For example, I observe that I can smoothly operate much further LOP at a lower altitude than at a high altitude...so the mere fact of lower voltage requirement doesn't seem to directly equate with consistent ignition.

From John Schwaner's "The Magneto Ignition System":


We've seen this stuff for decades in turbocharged race engines where MAP may exceed 150 inches and outputs may reach 600 hp/L (that's 3600 hp in Lycoming 360 terms). Invariably at some point, the cylinder pressure is so high the spark won't jump any more, and this happens at torque peak. On either side of torque peak rpm, the engine may run without misfire.

We did our initial development years ago using a pressure chamber with a Lexan window to observe the spark at varying coil charge times. It was interesting to watch the spark weaken and die out with increasing pressure applied.

It's important to note that absolute EGT values may not be a consistent/reliable method to judge AFR across a range of rpms and MAPs since EGTs fall as these parameters decrease (as output falls). The amount LOP vs. AFR is best looked at as a percentage of peak EGT. In other words, 200 LOP at 20 inches MAP is probably not the same AFR as 200 LOP at 30 inches MAP, even at the same rpm. Throw variable ignition timing into the mix, which can substantially affect EGTs, and things change even more.
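
A trivial sketch of that normalization, with invented peak EGT numbers purely to show the comparison (they are not data from any engine):

Code:
# "200 LOP" is not one fixed mixture across power settings. Peak EGT values
# below are hypothetical, used only to illustrate normalizing the LOP drop
# against that setting's own peak.

peaks_f = {
    "30 inHg": 1480.0,   # hypothetical peak EGT, deg F
    "20 inHg": 1340.0,   # hypothetical peak EGT, deg F
}
DROP_F = 200.0           # same indicated "200 LOP" at both settings

for setting, peak in peaks_f.items():
    print(f"{setting}: 200F LOP is {DROP_F / peak:.1%} of a {peak:.0f}F peak")

Same 200 degree indication, but a different fraction of a different peak, so probably not the same AFR.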

With high spark energy systems, we're able to run over 250 LOP without misfire or roughness. Magnetos may not be able to do this and may have the additional problem of internal arcing at high altitudes which is why many turbocharged engines have pressurized magnetos. Distributorless systems have no additional gaps to jump like rotor and cap systems.

It would be interesting to vary the coil charge time in flight to observe the effects on misfire at different MAP/rpm/AFR combinations.
 
It would be interesting to vary the coil charge time in flight to observe the effects on misfire at different MAP/rpm/AFR combinations.

That would be interesting. Correct me if I'm wrong, but I think one of the key differences between SDS-engineered systems and some other players (notably EMagair) is the degree of coil current limiting? Brad argues that more power has diminishing returns and is generally unnecessary, while Robert (EFii) has made more power a big deal, defined as spark volts x amps x duration. It would be great to determine if it actually makes any difference in the context of low RPM engines with wildly variable mixtures.

http://www.emagair.com/home/design-principals/

http://www.flyefii.com/products/efii-systems/ignition-comparison/

Given the same coils (they're not, obviously, but humor me), increasing the charge time results in longer spark duration?
 
That would be interesting. Correct me if I'm wrong, but I think one of the key differences between SDS-engineered systems and some other players (notably EMagair) is the degree of coil current limiting? Brad argues that more power has diminishing returns and is generally unnecessary, while Robert (EFii) has made more power a big deal, defined as spark volts x amps x duration. It would be great to determine if it actually makes any difference in the context of our low RPM engines with wildly variable mixtures.

http://www.emagair.com/home/design-principals/

http://www.flyefii.com/products/efii-systems/ignition-comparison/

Given the same coils (they're not, obviously, but humor me), increasing the charge time results in longer spark duration?

I'm not sure I buy the theory that longer spark duration has any value though. It seems to me that once the initial spark is struck, and starts the combustion process, continued arcing across the gap serves no further purpose. Am I missing something there?
 
I'm not sure I buy the theory that longer spark duration has any value though. It seems to me that once the initial spark is struck, and starts the combustion process, continued arcing across the gap serves no further purpose. Am I missing something there?

That is the argument. The counter is that increased duration improves cycle-to-cycle variation by increasing the chance of igniting less than optimum mixtures.

Ross proposes to end the argument with some in-flight comparison.
 
This deep high tech discussion on ignition is interesting...

...but is it of any value beyond discussion's sake?

The original premise of two spark plugs in a Lycoming engine was twofold.

(1) Ignition reliability with two systems.
(2) Burn pattern in a large combustion chamber where the spark is not centered.

The reliability factor is obvious and needs no comment.

The inadequate burn pattern appears when checking the system for single operation at relatively low power, i.e., 1800 rpm. We see a drop of no more than 150 rpm on one system. This is because magneto voltage is dependent on rpm. Perform the same check at 2500 rpm and there is less rpm drop.

Along comes electronic ignition retrofitted to this early technology engine. Now the voltage is higher and the spark comes earlier. Supposedly it is so high that it does not matter if there is one spark plug or two. The duration is different (usually longer) and so is the timing of its beginning. Is this a good thing in an early technology engine?

Yes, it starts easier, runs smoother, and is more efficient, at least in the short term.

However, there are maintenance reports that indicate electronic ignition causes early valve failure due to excess heat. In the beginning of this early technology, unburned fuel moderated exhaust temperatures, and CHT as well. Some EI systems use a wasted spark to burn fuel before it leaves the combustion chamber, which must affect the temperature of the gas passing through the exhaust valve.

Is this good or not so good for valve life?

It is interesting that Lycoming continues to build engines with magnetos. After all these years and after much advancement in ignition technology, they still rely on magnetos to light the fire in their engines.

I wonder how come?
 
That would be interesting. Correct me if I'm wrong, but I think one of the key differences between SDS-engineered systems and some other players (notably EMagair) is the degree of coil current limiting? Brad argues that more power has diminishing returns and is generally unnecessary, while Robert (EFii) has made more power a big deal, defined as spark volts x amps x duration. It would be great to determine if it actually makes any difference in the context of low RPM engines with wildly variable mixtures.

http://www.emagair.com/home/design-principals/

http://www.flyefii.com/products/efii-systems/ignition-comparison/

Given the same coils (they're not, obviously, but humor me), increasing the charge time results in longer spark duration?

No, generally once the coil is fully saturated, there is no point in charging it longer; you just make more heat and waste power. Coil charge times vary widely with inductance, as does spark duration. Some early Subaru COP units only require 1.25 ms to fully saturate, some Ford coils up to 4 ms. Some coils with low inductance and a low turns ratio have a relatively weak spark (low voltage capability and short duration).

The point being that drivers must match coil characteristics.

Lycomings are at the low end of the spark energy requirement food chain. You don't need 55,000 volts and 5 ms of spark duration to fire one. More voltage and spark duration are useful at very lean or very rich mixtures, which Lycomings can run at; however, the usefulness of running at 10 or 20 AFR is debatable.
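
A minimal sketch of the dwell/saturation behavior described above, treating the primary as an idealized RL circuit. Supply voltage, inductance, resistance, and the current limit are assumed, illustrative values, not any particular coil or driver:

Code:
import math

# Idealized inductive-ignition dwell: primary current rises along an RL
# curve until it reaches the driver's current limit. Extra dwell beyond
# that point only makes heat. All values are assumed for illustration.

V_SUPPLY = 13.8   # volts, assumed bus voltage
L_PRI    = 0.004  # henries, assumed primary inductance
R_PRI    = 0.5    # ohms, assumed primary resistance
I_LIMIT  = 7.8    # amps, assumed driver current limit

def primary_current(t_ms):
    """Primary current (amps) t_ms after the driver turns on."""
    tau = L_PRI / R_PRI
    i = (V_SUPPLY / R_PRI) * (1.0 - math.exp(-(t_ms / 1000.0) / tau))
    return min(i, I_LIMIT)

for t in (0.5, 1.0, 2.0, 3.0, 4.0):
    print(f"dwell {t:3.1f} ms -> {primary_current(t):3.1f} A")

Once the current limit is reached, more dwell buys nothing; a coil with lower inductance simply gets there sooner, which is the kind of spread mentioned above.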
 
I'm not sure I buy the theory that longer spark duration has any value though. It seems to me that once the initial spark is struck, and starts the combustion process, continued arcing across the gap serves no further purpose. Am I missing something there?

The theory is that increased spark duration increases ignition probability. The exact alignment/distribution of air and fuel molecules changes with piston motion and swirl from one millisecond to the next at the plug gap. In one instant the mixture may not light off, while 1 ms later, it might.

I've seen opposing views on this, but scope traces with chamber pressure transducers seem to show that the typical probability is over 99% in twin plug engines with AFRs around best power mixture. I wish I could find an old printed article I clung to for years, written by a GM combustion engineer. It was excellent.
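
As a toy illustration of how twin plugs push the per-cycle probability that high (the single-plug number here is invented, and the calculation assumes the two plugs fail independently, which real swirl patterns may not honor):

Code:
# If each plug independently lights the charge with probability p on a given
# cycle, the charge fails to light only if both plugs miss. p is hypothetical.
p_single = 0.95
p_twin = 1.0 - (1.0 - p_single) ** 2
print(f"single plug {p_single:.1%} -> twin plug {p_twin:.2%}")   # 95.0% -> 99.75%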

In the end we have CD, magneto and inductive discharge ignitions all running some very high specific output engines. They all work and have their advantages and disadvantages.
 
No, I don't think so; I have searched for years. The Obert, and a couple of other similar, newer books, are very good overall.

Thanks Bill. Would that be Edward F. Obert, Internal Combustion Engines and Air Pollution?

Why not go to a single ignition at WOT down low? It depends. On a hot day with an engine that is closer to detonation, it might move into detonation during the check.

Many moons ago, a respected engine builder friend and I had the same thought, but George Braly said no. He had tested for it, and found (IIRC) that single ignition operation reduced peak pressure enough to widen the detonation margin.
 
Thanks Bill. Would that be Edward F. Obert, Internal Combustion Engines and Air Pollution?

Many moons ago, a respected engine builder friend and I had the same thought, but George Braly said no. He had tested for it, and found (IIRC) that single ignition operation reduced peak pressure enough to widen the detonation margin.

Yes, that Obert. Well, if the detonation margin is not reduced (and George would certainly know), it sounds like the only reason not to test it is ground rush. :eek:
 
No, generally once the coil is fully saturated, there is no point in charging it longer; you just make more heat and waste power.

That makes sense.

So why do you want to vary the coil charge time in flight to observe the effects on misfire at different MAP/rpm/AFR combinations? What will varying the coil charge time do for delivered spark? Are you typically not running the coil at full saturation?
 
...but is it of any value beyond discussion's sake?

The original premise of two spark plugs in a Lycoming engine was twofold.

(1) Ignition reliability with two systems.
(2) Burn pattern in a large combustion chamber where the spark is not centered.

The reliability factor is obvious and needs no comment.

The inadequate burn pattern appears when checking the system for single operation at relatively low power, i.e., 1800 rpm. We see a drop of no more than 150 rpm on one system. This is because magneto voltage is dependent on rpm. Perform the same check at 2500 rpm and there is less rpm drop.

Along comes electronic ignition retrofitted to this early technology engine. Now the voltage is higher and the spark comes earlier. Supposedly it is so high that it does not matter if there is one spark plug or two. The duration is different (usually longer) and so is the timing of its beginning. Is this a good thing in an early technology engine?

Yes, it starts easier, runs smoother, and is more efficient, at least in the short term.

However, there are maintenance reports that indicate electronic ignition causes early valve failure due to excess heat. In the beginning of this early technology, unburned fuel moderated exhaust temperatures, and CHT as well. Some EI systems use a wasted spark to burn fuel before it leaves the combustion chamber, which must affect the temperature of the gas passing through the exhaust valve.

Is this good or not so good for valve life?

It is interesting that Lycoming continues to build engines with magnetos. After all these years and after much advancement in ignition technology, they still rely on magnetos to light the fire in their engines.

I wonder how come?

I think Toobuilder's test with our CPI showed what we'd generally expect: minimal gains at SL WOT and best power AFRs, but significant gains at lower MAP and especially running LOP.

Flame speed may be reduced by as much as 40% running way LOP, so it only stands to reason that we must have more advance to achieve peak cylinder pressure at the same crank angle for best torque.
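
Rough arithmetic on that, with the 40% figure from above and a hypothetical burn-to-peak angle (the 30 degrees is an assumed number, not dyno data):

Code:
# If the charge burns ~40% slower, the burn occupies ~1/0.6 times as many
# crank degrees at the same rpm, so the spark must move earlier to keep
# peak pressure at about the same crank angle. Baseline angle is assumed.

FLAME_SPEED_DROP = 0.40    # from the paragraph above
SPARK_TO_PEAK_DEG = 30.0   # hypothetical crank degrees, spark to peak pressure

lop_deg = SPARK_TO_PEAK_DEG / (1.0 - FLAME_SPEED_DROP)
print(f"burn-to-peak grows {SPARK_TO_PEAK_DEG:.0f} -> {lop_deg:.0f} deg, "
      f"i.e. roughly {lop_deg - SPARK_TO_PEAK_DEG:.0f} deg more advance")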

Clearly too much timing is not a good thing at high power settings with high IATs and high CHTs from a detonation margin standpoint.

Wasted spark has no effect firing on the exhaust stroke since no mixture has reached the plug yet due to piston position BTDC. The residual gases are essentially inert at this point as well. I say no difference in valve temps or EGTs with waste spark. I've never seen any evidence of this on the dyno, switching from distributor type to waste spark ignitions.

The type certificate specifies magnetos for the hundreds of common Lycomings. It would cost more money to certify an EI. The iE2 no longer uses magnetos, so they have the technology available if they wanted to apply it to other engines in the Lycoming family.
 
That makes sense.

So why do you want to vary the coil charge time in flight to observe the effects on misfire at different MAP/rpm/AFR combinations? What will varying the coil charge time do for delivered spark? Are you typically not running the coil at full saturation?

Simply to learn how far we could go. On low rpm engines like Lycomings, the coils are always fully saturated. On some high rpm engines above, say, 9000 rpm, we must reduce charge times so there is some off time for the transistors. The CPI allows the user to play with the charge times, although it wasn't designed with the thought of doing it in flight.

Less saturation would drop max voltage capability as well as duration.

If it only takes 12,000 volts to ionize the gap and start the spark, nothing higher is required.
 
Simply to learn how far we could go. On low rpm engines like Lycomings, the coils are always fully saturated.

Less saturation would drop max voltage capability as well as duration.

Ok, to learn how far you can reduce coil output, as you're currently running them at full saturation, i.e. the maximum potential voltage and duration available from that particular coil.

Just to keep concepts straight, here's an illustration borrowed from the net. Total on-time is about 4 ms, with about 1.25 ms of that time being current limited to 7.8 amps.

If I understand correctly, a CPI user can vary total on-time. Does the system also incorporate coil current limiting?



If it only takes 12,000 volts to ionize the gap and start the spark, nothing higher is required.

Agree, but that's not really what we're talking about here. The secondary voltage of a monster coil, fully saturated, will only rise to 12,000 if that's all it takes to ionize the gap. We might reduce its coil-on time quite a lot before we dropped below the ability to ionize. However, the duration would get shorter and shorter, and duration is the supposed "more power" advantage...volts x amps x time.

How short will still produce satisfactory cycle-to-cycle variation? I'm as curious as you are.
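
One way to tie the "volts x amps x time" framing to something computable: the energy available to the spark is bounded by what's stored in the primary when the driver opens, roughly one half L times I squared. The inductance here is an assumed value; the 7.8 amps is the current-limit figure from the waveform description above:

Code:
# Energy stored in the coil primary at the end of dwell: E = 0.5 * L * I^2.
# Shorter dwell -> lower peak current -> less stored energy, which shows up
# largely as shorter spark duration. Primary inductance is assumed.

L_PRI = 0.004                     # henries, assumed
for i_peak in (7.8, 6.0, 4.0):    # 7.8 A per the waveform above; others assumed
    e_mj = 0.5 * L_PRI * i_peak ** 2 * 1000.0
    print(f"peak {i_peak:3.1f} A -> ~{e_mj:3.0f} mJ stored in the primary")

Delivered spark energy will be lower after losses, but the scaling is the useful part: cut the dwell enough to fall below the current limit and both the stored energy and the achievable duration fall with it. Where the cycle-to-cycle variation starts to suffer is exactly the open question.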
 