Hi,
This is my first post!
As a learning exercise, I am trying out a few small solar panels I have kicking around and, for convenience, I want to test them using an ordinary 60W light bulb and then extrapolate to sunlight conditions. I have tried to analyze the problem as follows but I am not sure I am doing this right. Can anyone check my work/assumptions? I am not all that good at math, so bear with me.
Problem description: A 0.1m x 0.1m solar panel of unknown properties is placed 0.1m from the filament of a 60W incandescent bulb and its output is measured. How can this output be used to estimate the panel's output under full sun?
Assumptions:
Assumption 1: A 60W bulb throws away 90% of its energy as heat. Therefore, only 10% of the emitted energy is in the form of usable light, so the actual usable light power output is roughly 6W.
Assumption 2: The remaining non-thermal energy given off by the bulb is roughly white light and, for the purposes of this exercise, is assumed to approximate a scaled-down version of full sunlight. (I know this is a terrible assumption - just bear with it for the remainder of the exercise.)
Assumption 3: For the purposes of simplifying this exercise, we assume the bulb is a point-source isotropic radiator (not perfect, but close enough for my purposes).
Calculations:
1. The bulb's intensity is 6W ÷ (4π sr) = 0.477 W/sr.
2. The panel's active area is 0.01 m².
3. The angle of illumination for the panel, in radians, is arctan(dist/panel length) = arctan(0.1m/0.1m) = 0.785 rad.
4. This works out to 0.785 ÷ (4π) = 0.0625 sr.
5. The amount of light power incident on the panel is 0.0625 sr × 0.477 W/sr = 0.0298 W.
6. The panel's irradiance works out to 0.0298 W ÷ 0.01 m² = 2.98 W/m².
7. One standard sun is 1000 W/m², so this light source is roughly equivalent to 0.3% of one standard sun. (Seems awfully small!)
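In case it helps anyone check the numbers, here is a short Python sketch that just re-runs my arithmetic above, including my (possibly wrong!) plane-angle-to-solid-angle step, plus the linear scaling I was planning to use to extrapolate to full sun. The measured_output_w value is a made-up placeholder, and the linear panel response is another assumption on my part.

```python
import math

# My assumptions, as numbers
bulb_power_w = 60.0        # rated bulb power
light_fraction = 0.10      # Assumption 1: only ~10% emerges as usable light
usable_light_w = bulb_power_w * light_fraction   # = 6 W

panel_side_m = 0.1                   # 0.1 m x 0.1 m panel
panel_area_m2 = panel_side_m ** 2    # = 0.01 m^2
distance_m = 0.1                     # filament-to-panel distance

# Step 1: intensity of an isotropic point source (Assumption 3)
intensity_w_per_sr = usable_light_w / (4 * math.pi)   # ~0.477 W/sr

# Steps 3-4: my conversion from a plane angle to a solid angle --
# this is the part I most want checked.
plane_angle_rad = math.atan(distance_m / panel_side_m)  # ~0.785 rad
solid_angle_sr = plane_angle_rad / (4 * math.pi)        # ~0.0625 sr?

# Steps 5-6: power landing on the panel, then the irradiance
incident_w = intensity_w_per_sr * solid_angle_sr        # ~0.0298 W
irradiance_w_per_m2 = incident_w / panel_area_m2        # ~2.98 W/m^2

# Step 7: compare with one standard sun (1000 W/m^2)
one_sun_w_per_m2 = 1000.0
fraction_of_sun = irradiance_w_per_m2 / one_sun_w_per_m2  # ~0.003, i.e. 0.3%
print(f"Irradiance: {irradiance_w_per_m2:.2f} W/m^2 "
      f"({fraction_of_sun:.1%} of one sun)")

# Extrapolation: ASSUMING the panel's output scales linearly with
# irradiance, a measured output under the bulb would scale up by
# 1/fraction_of_sun for full-sun conditions.
measured_output_w = 0.001  # hypothetical measured value, for illustration only
estimated_full_sun_w = measured_output_w / fraction_of_sun
print(f"Estimated full-sun output: {estimated_full_sun_w:.2f} W")
```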
So, am I out to lunch? Comments please (be gentle!)
Cheers,
Allan