Transformer magnetic flux linkage over long distances

Thread Starter

wes

Joined Aug 24, 2007
242
I have recently been thinking about transformers and the magnetic fields they produce, and I have a couple of questions about them. They're pretty short and simple, just two.

What is the minimum flux strength produced by the primary that is needed to induce a voltage in the secondary?

How big can the distance between the primary and secondary be?

Basically, while I don't know why you would do this, the thought occurred to me to wonder how big the space between them could be. After thinking about it, I believe that as long as you can keep the fields from straying from the core, you could potentially make the distance quite big, maybe on the order of tens of feet or even bigger.

For the first one, I think that as long as the magnetic field that is produced links to the secondary, it doesn't really matter how strong it is. For example, could a primary with just 10 mA of current induce a voltage in a secondary that is 10 feet away, as long as the magnetic field was contained within the core, or at least 90 percent or so of it was?

I know the induced voltage will decrease as the field strays from the core, but as long as it stays contained, the voltage should be close to 100%.
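To put rough numbers on that, here is a small Python sketch of the idea (all the component values are made-up assumptions, not measurements from any real transformer): it applies Faraday's law, V = N * dPhi/dt, and scales the flux by an assumed coupling factor k to represent how much of it actually stays linked through the core.

import math

# Illustrative assumed values -- not from any real transformer
f = 60.0         # Hz, primary excitation frequency
I_peak = 0.010   # A, the 10 mA primary current from the question
L_p = 1.0        # H, assumed primary inductance
N_p = 1000       # assumed primary turns
N_s = 1000       # assumed secondary turns

# Peak core flux driven by the primary: Phi = L_p * I / N_p
phi_peak = L_p * I_peak / N_p  # Wb

# Faraday's law for sinusoidal flux: V_peak = N_s * 2*pi*f * (k * Phi_peak),
# where k is the fraction of the flux that still links the secondary
for k in (1.0, 0.9, 0.5):
    v_s_peak = N_s * 2 * math.pi * f * k * phi_peak
    print(f"k = {k:.1f}: peak secondary voltage ~ {v_s_peak:.2f} V")

The induced voltage scales directly with k, so if 90 percent of the flux stays linked you keep 90 percent of the voltage, no matter how long the core path is (ignoring losses along the way).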


note to admins:
I wasn't too sure if this should be in the physics or electronics section, so feel free to move it if needed.
 

davebee

Joined Oct 22, 2008
540
Theoretically, any primary current would produce a voltage in a secondary at any distance. In practice, though, once either the primary current drops enough or the distance between the primary and secondary grows enough, there will be a point where the resulting secondary voltage falls too low to be detectable, either because the measuring instrument isn't sensitive enough or because the signal falls below the noise level (and at really low signal levels, there is a lot of electrical noise in practically any circuit).
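To put a rough number on "below the noise level", here is a hedged Python sketch (the resistance and bandwidth are assumptions picked for illustration): it compares hypothetical induced voltages against the Johnson-Nyquist thermal noise of the secondary winding resistance, V_n = sqrt(4 * k_B * T * R * B).

import math

k_B = 1.380649e-23  # J/K, Boltzmann constant
T = 300.0           # K, room temperature
R = 100.0           # ohm, assumed secondary winding resistance
B = 1000.0          # Hz, assumed measurement bandwidth

# Johnson-Nyquist thermal noise of the winding resistance (~40 nV rms here)
v_noise = math.sqrt(4 * k_B * T * R * B)

# Hypothetical induced voltages as the coupling worsens with distance
for v_signal in (1e-3, 1e-6, 1e-9):
    verdict = "detectable" if v_signal > v_noise else "buried in noise"
    print(f"signal {v_signal:.0e} V vs noise {v_noise:.1e} V rms -> {verdict}")

With these assumed values the noise floor is around 40 nV rms, so a nanovolt-level induced signal is already unrecoverable without narrowing the bandwidth or averaging.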
 

bountyhunter

Joined Sep 7, 2009
2,512
I have recently been thinking about transformers and the magnetic fields they produce
The strength of the field at a given distance depends on the flux pattern. Differently shaped inductor cores produce different flux patterns. However, they all drop off in intensity very rapidly with distance. Common sense tells you why: for a given flux density in a given volume of space, as you move outward, the total volume occupied gets much larger. Since the flux cannot magically increase itself, there is a proportional loss in flux density as you move away from the source.
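As a rough Python illustration of that volume argument, treating the winding as a small magnetic dipole with an arbitrary assumed moment (an idealization, but it shows the trend): the on-axis field of a dipole falls off as 1/r^3, so every doubling of distance costs a factor of eight.

import math

mu_0 = 4e-7 * math.pi  # T*m/A, permeability of free space
m = 1.0                # A*m^2, assumed (arbitrary) magnetic dipole moment

# On-axis field of a small magnetic dipole: B = mu_0 * m / (2 * pi * r^3)
for r in (0.1, 0.2, 0.4, 0.8):  # metres
    B = mu_0 * m / (2 * math.pi * r**3)
    print(f"r = {r:.1f} m: B ~ {B:.2e} T")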
 

Thread Starter

wes

Joined Aug 24, 2007
242
Yeah, that is pretty much what I figured as well: theoretically, the primary could induce a voltage at any distance with any current. In the real world, the field would die out or leak away even with a highly permeable material and maybe some sort of leakage reducer, lol.

Leakage Reducer idea, lol

Would a diamagnetic inner core work to obstruct the field's path back to the other pole, so that it is easier, from the field's perspective, to just continue through the core? It would also be insanely expensive, as the diamagnetic material is a superconductor in the simulation program. According to the program it would help, and the air gaps were put in intentionally. I attached some pictures to help.
 


JDT

Joined Feb 12, 2009
657
What is the minimum flux strength produced by the primary that is needed to induce a voltage in the secondary?
Anything greater than zero (its rate of change also has to be greater than zero).

How big can the distance between the primary and secondary be?
Infinite.

Your primary magnetic field has to change in order to induce a voltage in the secondary. The rate at which it changes gives it a frequency spectrum. The magnetic field travels at the speed of light, and because you have a frequency travelling at the speed of light, you have a wavelength. For good long-distance operation, the primary and secondary devices need to be at least 1/4 to 1/2 of a wavelength in physical size.
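A quick Python sketch of that wavelength argument (the quarter-wave figure is the rule of thumb from above) shows why a mains-frequency field is a hopeless radiator:

c = 299_792_458.0  # m/s, speed of light

# wavelength = c / f; an efficient antenna is roughly a quarter wavelength long
for f, label in ((50.0, "mains transformer"), (1e6, "AM radio"), (100e6, "FM radio")):
    wavelength = c / f
    print(f"{label}: f = {f:g} Hz, wavelength ~ {wavelength:.3e} m, "
          f"quarter wave ~ {wavelength / 4:.3e} m")

At 50 Hz the quarter-wave size works out to roughly 1500 km, which is why a transformer behaves as a near-field coupled device rather than an antenna.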

What I am saying here is that there is no difference between the magnetic field from, say, a 50 Hz transformer and an interstellar radio signal.
 

Adjuster

Joined Dec 26, 2010
2,148
It occurs to me to ask what would be the point of making such an extended transformer. Since you propose to have a magnetic core connection, you have what is in effect a cable joining the source and the load, so why not have an electrical cable in the first place?
 

Thread Starter

wes

Joined Aug 24, 2007
242
Lol, I have no idea what the use for an extended transformer would be. I just had a thought and wanted to know if it was even possible.
 

t_n_k

Joined Mar 6, 2009
5,455
What I am saying here is that there is no difference between the magnetic field from, say, a 50 Hz transformer and an interstellar radio signal.
I think there is at least one subtle difference.

In the case of a transformer, in which the magnetic circuit couples the primary and secondary windings via the magnetic field, an observer can note a change in the primary power draw for a change in transformer secondary loading. In the case of a wave that propagates from an antenna into free space, an observer will be unable to discern any difference in the transmitter output power when a sufficiently remote load (such as a distant receiver antenna) is 'excited' by the propagated signal.

This probably boils down to a consideration of near-field and far-field phenomena. If I bring a loaded receiver antenna sufficiently close to a transmitting antenna, eventually I will note a change in the transmitter output power as the effective transmitting antenna load changes. This presumably 'happens' as I cross the far-field/near-field transition region.
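A small hedged sketch of where that transition sits, using the common rule of thumb that for an electrically small source the reactive near field extends out to roughly r = lambda / (2 * pi) (a standard approximation, not a figure from this thread):

import math

c = 299_792_458.0  # m/s, speed of light

# Rule-of-thumb reactive near-field boundary: r ~ lambda / (2 * pi)
for f, label in ((50.0, "50 Hz transformer"), (100e6, "100 MHz transmitter")):
    wavelength = c / f
    r_boundary = wavelength / (2 * math.pi)
    print(f"{label}: near/far transition ~ {r_boundary:.3e} m")

At 50 Hz that boundary is on the order of a thousand kilometres, so any practical secondary sits deep in the near field, where its loading is reflected back to the primary; at 100 MHz it is well under a metre.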
 

Thread Starter

wes

Joined Aug 24, 2007
242
I know this is really late, but I was going over threads and saw this one, and I just had to ask this.

Originally Posted by JDT
What I am saying here is that there is no difference between the magnetic field from, say, a 50 Hz transformer and an interstellar radio signal.

The thing I have always had a problem with is this: if they are essentially the same, which they are, then why doesn't the magnetic field from an electromagnet bend or affect a light beam?

Just had a thought, lol: is it because the magnetic field of the light is oscillating back and forth so fast, let's say at 400 THz (reddish), while the magnetic field from the electromagnet is basically DC, so the light's magnetic field just attracts and then repels constantly, causing no net change and obviously no bend?

So if you properly pulsed the electromagnet and tuned it to be in phase with the light, could you bend the light by having it on during the attract phase, let's say, and off during the repel phase, so that the electromagnet always attracts but never repels and hence bends the light?

So is that correct?
 