I know there are probably lots of groups and boards out there where a question like this would belong, but I’m not a member of any others…and most Wooters are pretty high-tech in general.
So, before I go digging through lots of electrical engineering webboards, maybe someone here can give me a clue.
On a scale of 1-10, 10 being a PhD in electrical engineering with hands-on practical circuit-building experience, and 1 being Cletus the SJY who doesn’t know what a battery is, figure I’m about a 3.
I can solder, I can follow a basic circuit diagram with switches, resistors, capacitors and diodes, and I understand the CONCEPTS of Ohm’s law, the differences between volts and amps (current) and such…
But, in practical stuff, I’m pretty clueless. I have what I assume is a very simple question, but I’ve no idea how to do this.
Simply put – I want to do some geeky projects with laser pointers and LED pens that usually run on three 1.5 V “watch batteries.”
I figured the minor difference between the 4.5 V they need and the 5 V coming from my computer power supply and/or USB ports wasn’t a big deal, and wired up some stuff to run off that supply.
After a few hours, the LED pens and laser pointers were dead, leading me to believe that 0.5 V is in fact pretty important. (This was junk from a dollar store, no big deal, but obviously I want things to last more than a few hours…)
What’s the simple way to drop a 5 VDC power source to 4.5 VDC? I don’t know how much current (amps) I’m drawing, but I figure it’s far, far under a USB port’s 500 mA limit…
Do I just wire in a resistor? If so, how does one compute that? Using Ohm’s law I’d need to know the “current,” and I’m not sure which current that means – the current the load (the laser pointer in this case) is going to draw, the maximum current over the wire, or what?
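For what it’s worth, here’s my back-of-envelope attempt at the resistor math, assuming (and this is a pure guess, I haven’t measured anything) that the pointer draws about 30 mA – the idea being that the resistor just has to eat the extra 0.5 V at whatever current the load actually pulls:

```python
# Back-of-envelope: size a series resistor to drop 5 V down to 4.5 V.
# ASSUMPTION: the laser pointer draws about 30 mA -- guessed, not measured!

v_supply = 5.0   # volts, from the USB port
v_load = 4.5     # volts, what the pointer wants
i_load = 0.030   # amps -- my guess

v_drop = v_supply - v_load   # the resistor must drop this much
r = v_drop / i_load          # Ohm's law: R = V / I
p = v_drop * i_load          # power the resistor dissipates

print(f"R = {r:.1f} ohms")       # -> R = 16.7 ohms
print(f"P = {p * 1000:.1f} mW")  # -> P = 15.0 mW
```

(Which, if I’m doing that right, is exactly my problem: R depends entirely on the load current, so a resistor sized for one pointer would be wrong for a different one, or for the same one at a different brightness.)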
Googling to this point has led me to using a simple “voltage regulator” to do this, but they seem to come in standard outputs of 5, 12, and 24 volts…which isn’t going to work, as I need to go from 5 VDC to 4.5 VDC.
I could probably get away with lower, such as tapping the 3.3 V wire off the power supply, but since that rail feeds much more sensitive components on the motherboard than whatever comes out of the USB ports or the drive Molex connectors, I’m leery of touching those wires.
Sorry for the huge post. I’m hoping it makes sense, and someone can kindly and without TOO much pedantry and/or scorn for my ignorance help me out here.
(Side question, which I thought I could figure out in 30 seconds on Google but can’t, as all the answers are either over my head or presume I already know this: how the heck does one use a multimeter to figure out how many amps a device will draw? I’ve NEVER been able to figure that one out. I’ve got a darn fancy multimeter here, but for such a simple thing I’m stuck: does the meter go inline with the power source (wired in series), or across the +/- terminals (in parallel) while the device is running, or is it hooked up in lieu of a power source, or what?!)
Thanks in advance!!
JD