My LEDs have a forward voltage range of 3.0-3.8 V, as stated on the packaging they arrived in. When wiring them up, what voltage should I assume for calculating voltage drops? I'm wiring 10 parallel strings of 9 series LEDs, and I'm currently assuming the typical Vf is the lower value, 3.0 V. Is that a reasonable assumption? I could measure 10 of them and take an average, but is that necessary for a series string? What happens if I assume 3.1 V or 3.2 V instead?
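For context, this is the calculation I'm doing for each string's current-limiting resistor. The supply voltage (30 V) and target current (20 mA) here are placeholder numbers, not my actual figures; I just want to show how sensitive the resistor value is to the assumed Vf:

```python
def series_resistor(v_supply, vf_per_led, n_series=9, i_target=0.020):
    """Current-limiting resistor for one string of n_series LEDs.

    v_supply  -- supply voltage in volts (assumed value here)
    vf_per_led -- assumed forward voltage per LED in volts
    i_target  -- desired string current in amps (assumed 20 mA)
    """
    v_resistor = v_supply - n_series * vf_per_led  # voltage left across the resistor
    return v_resistor / i_target

# How much does the assumed Vf change the answer?
for vf in (3.0, 3.1, 3.2):
    print(f"Vf = {vf} V  ->  R = {series_resistor(30.0, vf):.0f} ohm")
```

With these placeholder numbers, going from an assumed 3.0 V to 3.2 V per LED more than halves the computed resistor value (150 ohm down to 60 ohm), which is why I'm unsure which Vf to assume.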