As divers, used to the sometimes-dark environment of U.K. waters, we are all aware of the benefits of decent, high-power underwater lanterns.
Whilst High Intensity Discharge units are gaining popularity, they remain beyond the reach of many (myself included), leaving most of us to choose between the cheaper, bog-standard non-rechargeable lanterns and the somewhat more expensive rechargeable varieties - both using incandescent bulbs.
Some, on a tight budget, go for the cheapest option; then, realising the long-term cost benefits of rechargeables, they try retrofitting rechargeable cells - often with some disappointment.
Others, whose rechargeable cells "die" through neglect, abuse or inferior chargers (yes, there are a few about - even from reputable lantern manufacturers), may want to replace the originals with cheaper, locally sourced alternatives.
In this short article I'll try to explain some of the points to observe when choosing cells.
Firstly, let's get the terminology right.
There are several different types of "cells":
"Primary cells" are non-rechargeable (see note below); the two most common types are zinc-carbon (low energy - unsuitable for our purposes) and alkaline (high energy). Both types have a nominal cell voltage of 1.5V.
Note. In actual fact, some primary cells can be recharged but this isn't advertised by the manufacturers: to do so requires specialised chargers and the results can be variable; this subject is beyond the scope of this article.
"Secondary cells" are rechargeable; these come in a great variety of materials, characteristics and voltages. The two most common, easily obtained types are: nickel-cadmium (NiCd) and nickel-metal-hydride (NiMh). Cell voltage for both types is 1.2V.
"Batteries" are groups of identical "cells" (logical isn't it?) wired together to produce a higher voltage. Battery voltage is always a multiple of the cell voltage; a 9V radio (PP3) battery has six 1.5V cells.
"Capacity" is, in effect, an indication of the energy that the cell will provide, usually given in Ampere Hours (A.hr). This is the time for which the cell will supply a fixed, known current at its specified voltage before the cell voltage begins to fall. It is not a linear function: the higher the current drawn, the lower will be the A.hr available.
Size for size, non-rechargeable alkaline cells, will almost always have more energy capacity than the two mentioned types of re-chargeable cells.
Whilst the capacity is rarely shown on alkaline cells, it is almost always shown on rechargeables; if it isn't, be very suspicious (just what are the manufacturers trying to hide?).
Comparison of supposed "High-Power" rechargeable cells, typically on sale in the likes of Boots The Chemists and those available from electronics suppliers, should lead you to appreciate the following statement: "Not all cells are created equal".
The following table lists the capacities of the most common types.
[Table: typical capacities of the most common non-rechargeable (1.5V/cell) and rechargeable (1.2V/cell) types.]
It hardly needs saying that you should avoid the "Worst" types like the plague; IMO they aren't worth the cost.
If capacity isn't given, compare weights with alkaline types; a very general guideline is: heavier batteries mean better capacity! Decent rechargeables will be at least as heavy as similar-sized alkalines.
Theoretically, "Burn Time" = (A.Hr)*(No of cells)*(Cell voltage)/(Bulb wattage)
i.e. for four, 1.2A.Hr, 1.2V "D" cells driving a 10 watt bulb: Time = 4*1.2*1.2/10 = 0.57 hours = 34 mins.
But, using 4.0A.Hr "D" cells, Time = 4*4*1.2/10 = 1.9 hours = 1 hr 55 mins.
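The burn-time formula and both worked examples can be sketched in a few lines of Python (a simple illustration of the formula above, not allowing for the non-linear drop in available A.Hr at high currents):

```python
def burn_time_hours(capacity_ah, num_cells, cell_voltage, bulb_watts):
    """Theoretical burn time: stored energy (Wh) divided by bulb power (W)."""
    return capacity_ah * num_cells * cell_voltage / bulb_watts

# Four 1.2 A.Hr, 1.2V "D" cells driving a 10 watt bulb:
t1 = burn_time_hours(1.2, 4, 1.2, 10)
print(int(t1 * 60), "mins")   # 34 mins

# The same lantern with 4.0 A.Hr "D" cells:
t2 = burn_time_hours(4.0, 4, 1.2, 10)
print(int(t2 * 60), "mins")   # 115 mins = 1 hr 55 mins
```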
So you can see that the extra capacity of "best" cells really is a bonus!
I believe that the slight cost difference between NiMh and NiCd is far outweighed by the extra capacity of NiMh and the fact that they're much more tolerant to partial-discharge/charge cycles is an added benefit.
Of the two types, NiMh have much higher self-discharge rates. This means that if you charge the cells at the end of the dive season (which you should do anyway, to help prolong their life), you shouldn't expect them to remain fully charged until the following season! And you should never leave them in a discharged state for any extended period.
Chargers come in many flavours.
The most basic types require something like 14-16 hours to fully charge. If you pay a little more, you can have them charge in as little as 7 or even 4 hours - but be warned, unless it's a high quality charger, they may not really be "fully" charged - only 90% or so - and you may have to ensure you're around at the end of the charge period to switch them off otherwise they'll overheat. Like many things, with chargers, you invariably get what you pay for.
The most basic types are simply a constant current source. Re-chargeable cells are usually capable of being left on a continuous charge of about one tenth of their A.Hr rating, i.e. a 4A.Hr cell can usually be left to charge indefinitely with a charge current of 400mA, however, they will get quite warm and ventilation is important to prevent overheating and possible damage. This is the technique the most basic chargers use.
The better types charge at this rate for a predetermined time then reduce the charge current to a "trickle charge" until they're switched off.
The really good types are semi-intelligent (not unlike some divers I know) and continuously monitor the terminal voltage and temperature of the cell(s), this ensures the maximum charge in the minimum time - but obviously they cost quite a bit more.
Cells should always be charged "in series", this ensures they all have the same current passing through them.
Using higher capacity cells with lower capacity chargers.
Suppose you buy one of those economical combined "charger with included cells" packs from your local electrical store. They typically come with 1.3 A.Hr "AA" size cells and suggest something like 14 hours to "fully" charge the cells.
Now suppose you also buy a set of 1.7 A.Hr cells, will they also charge in 14 hours? The answer is, unless it's an intelligent charger, No.
So how much longer will they need to charge? This obviously depends on the charger, but if it's a basic charger that pumps out the maximum safe continuous charge current, you can safely increase the charge time by the ratio of cell capacities. i.e. 1.7 A.Hr/1.3 A.Hr = 1.3, so, 1.3 times 14 hours = 18 hours.
All-singing-and-dancing intelligent chargers should automatically compensate for the higher capacity; they might take a little longer to charge but it probably won't be anything like 30% longer.
Problems could arise with mid-price "fast" chargers. Unless we know exactly how they work it's difficult to say for certain what extra time should be allowed. In this case it's advisable to always retain any instructions that come with the charger. As a VERY rough guide, for constant current charging, Charge time = (Cell capacity+25%)/Charge current.
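The rough guide above, together with the capacity-ratio scaling, can be put into a short sketch. Note that the charge current here is back-derived from the 1.3 A.Hr/14-hour figures quoted earlier - it's an assumption for illustration, not a measured value:

```python
def charge_time_hours(capacity_ah, charge_current_a, margin=0.25):
    """Very rough constant-current charge time: capacity plus ~25% overhead."""
    return capacity_ah * (1 + margin) / charge_current_a

# Implied charge current of a basic charger that takes 14 hours
# to fill 1.3 A.Hr cells (an assumption derived from those figures):
i_charge = 1.3 * 1.25 / 14   # roughly 0.116 A

# Time for the same charger to fill 1.7 A.Hr cells:
print(round(charge_time_hours(1.7, i_charge)), "hours")   # 18 hours
```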
Rechargeables have a lower voltage than non-rechargeables - so initially, bulbs will not burn as intensely but, because they have a flatter discharge curve, they provide a more constant intensity throughout their discharge cycle.
As can be seen from the following graphs, alkaline cell voltage drops fairly linearly from about 1.4V to 1V (1V being the point at which the cell is considered "flat"), whilst NiCd and NiMh have much "flatter" curves.
Electrical power is proportional to the square of the voltage (or current), so replacing 1.5V alkalines with 1.2V NiCads will deliver only (1.2*1.2)/(1.5*1.5) = 64% of the power (into a constant/linear load)!
However, bulb filaments are made from resistance wire with a very high, positive "temperature coefficient of resistance", this means they're not linear, so when cold i.e. first switched on, they draw more current and as they heat up the current diminishes (you can see the effect with Christmas-tree lights; at switch-on they surge brightly for a moment then stabilise).
So, because our new battery voltage is lower, the bulb filament might never reach its normal operating temperature and the light intensity will be reduced.
I've made a few measurements to see what effect varying the voltages about the nominal rating has on the filament current (unfortunately I've no means of measuring light intensity); the results are presented graphically below.
As you can see, the relationship isn't directly proportional; changing the voltage by 10% causes the current to change by about 6%, whilst a 25% change in voltage produces about a 15% change in current. So, whilst the bulb will be less bright because of the reduced voltage, it won't be quite as dim as you'd expect.
In addition, the bulbs supplied with dive lanterns are often of a lower voltage rating than the battery (they're usually over-driven to burn brighter!) and so the degradation in intensity may be even less noticeable.
However, if you want to do a proper "upgrade", you really should get a suitable bulb.
A question I was asked recently (and also many years ago):
On a long-diving-weekend what can we do about recharging torches with no access to mains electricity (i.e. when camping)?
One option is to use photovoltaic panels as a source of charge, but because the cells would invariably need charging overnight, this is likely to be an expensive option: the panels would need to deliver considerably more than a trickle charge during the short time that daylight was available.
If you can find out the type, quantity, voltage and amp.hour rating of the cells in the battery pack, it's a simple matter to rig up a crude Heath-Robinson charger to work off your car's battery. All you need is a resistor to limit the current, wire, a fuse (and in-line holder) and crocodile clips.
This'll allow some energy to be pumped into the cells, although it won't give them a full charge unless you can charge them for at least 14 hours. I've successfully used this method on dive trips to Scotland in the distant past. See below.
Lead-acid car battery = 13.2V; lamp battery = 4 NiCads (@ 1.2V each) = 4.8V.
If the cells are "industrial grade" 4 Amp.hour types, they can usually be left on continuous charge at 350mA (some types can tolerate up to 400mA whilst other cheaper and/or smaller types only about 100mA - that's why it's important to establish the type of cells).
A resistor to limit the current when fully charged will be Vr/I = (13.2V-4.8V)/350mA = 24 ohms (22 ohms is a more commonly available value and is used in the calculations below; 24 ohms should also be obtainable).
Initially, if the cells were discharged to 1V each (you should never let them get lower than this) the battery pack would total 4V, Vr will be 13.2V-4V = 9.2V and the maximum charging current would be 9.2V/22R = 420mA.
If the torch battery (or output terminals) were short-circuited, the worst-case current would be 13.2V/22R = 600mA.
The fuse should therefore be a "quick-blow" type rated between these two currents (500mA is a standard value), fitted at the car battery end of the leads and properly insulated (don't forget to take some spares!).
Unfortunately, resistors dissipate electrical power in the form of heat and this heat needs to be got rid of safely.
Fault condition (short-circuit) power dissipation in the resistor will be 13.2V*600mA = 7.9 Watts but this will only be for as long as it takes the fuse to blow (resistors can tolerate several times their power rating for several seconds before suffering permanent damage).
Worst-case charging power will be Vr*I, i.e. 9.2V*420mA = 3.9 Watts.
When fully charged (to 4.8V), Vr falls to 8.4V, the current falls to 8.4V/22R = 380mA and the power drops to 8.4V*380mA = 3.2 Watts.
So in this example you'd need a 22R resistor capable of continuously dissipating at least 4 Watts.
Typically, a 6 watt high quality wirewound resistor dissipating this much will heat up to 200 degrees Celsius or more above ambient whilst a 12 Watt resistor will probably reach 120 degrees so whatever you use will need to be supported on something non-flammable and non-meltable.
However, there's no reason why this power couldn't be shared among several smaller (and much cheaper) resistors wired either in series (add the resistance values) or in parallel (divide one resistor's value by the number of resistors). E.g. ten 220R resistors in parallel give 22R, and each would only have to handle one tenth of the total power (0.39W); if they were rated at 0.5 Watts they'd only rise to about 80 degrees (small, film-type resistors run "cooler" than power resistors).
Maplins or Tandy should be able to supply something suitable.
You MUST ensure that both the battery and resistor(s) are located in free-air. All electrical specifications assume that generated heat is not allowed to build up in confined spaces.
You should also be aware that if the engine is run with the torch battery and "charger" attached, the car's battery voltage could rise significantly (due to the alternator charging it), as a result the current into the torch battery will also rise and the resistor could get quite a lot hotter!
As for car battery life: the car's battery will probably be >50 Amp.hour whilst something like a UK400R battery will be <4 Amp.hour - that's at least 12 charges. Running the engine for a few minutes (to the pub and back) with a modern alternator should easily replenish what you take out.
This covers most of the questions I've been asked in the past and whilst not comprehensive, I hope it helps in some way to answer the most common queries.
For even more answers about rechargeable cells, check out this extremely useful site: Green Batteries.