72 hours

Thread Starter

monkeykg

Joined Apr 11, 2014
2
I was wondering how I would go about making a circuit that sends an output after 72 hours. It doesn't have to be accurate and could be anywhere within +/- 5 hours of that. It needs to be a circuit board, though; I can't use any microcontrollers. It only has to send the signal once.
 

crutschow

Joined Mar 14, 2008
34,464
Use two CD4060 14-bit counters with the oscillator of the first configured as an RC type with a 1 Hz frequency (use a pot for Rx to tweak to the desired frequency). Since there are ≈259k seconds in 72 hours you need to divide the 1 Hz signal by 259k. 18 bits is 262k, so that's within about 1% of the desired delay. You can adjust the oscillator to 1.0114 Hz (0.9888 s period) for a more exact 72 hours. Feed the Q14 output of the first counter to the clock input of the second. The Q4 output of the second counter should give the desired delay.

You can connect the Q4 output to the Reset input of the first counter so no further outputs are generated, if desired. Reset the second counter to restart the count.
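
Quick sanity check of those numbers, if anyone wants to run it (plain C++; this is just the arithmetic from above, nothing circuit-specific):

Code:
#include <cstdio>

int main() {
    const double target_s = 72.0 * 3600.0;  // 259,200 seconds in 72 hours
    const double counts   = 1L << 18;       // 2^14 (first counter) * 2^4 (Q4 of second) = 262,144
    printf("error at 1 Hz clock: %+.2f%%\n",
           100.0 * (counts - target_s) / target_s);   // about +1.14%
    printf("exact clock: %.4f Hz (%.4f s period)\n",
           counts / target_s, target_s / counts);     // 1.0114 Hz, 0.9888 s
    return 0;
}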
 

ErnieM

Joined Apr 24, 2011
8,377
72 ± 5 hours is ±7%

Standard resistors come in 1% tolerance. Standard caps are 10%.

Worst case that's 11% out of the box. So you may need to tweak this.
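
The worst case multiplies rather than adds, which is where the 11% comes from (a quick check, assuming both parts err in the same direction):

Code:
#include <cstdio>

int main() {
    // The RC period scales with the R*C product, so a 1% resistor
    // and a 10% capacitor stack multiplicatively in the worst case.
    const double worst = 1.01 * 1.10 - 1.0;
    printf("worst-case period error: %.1f%%\n", 100.0 * worst);  // ~11.1%
    return 0;
}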

A huge advantage of a digital divider is not having to wait a full 72 hours to test it.

Once.
 

AnalogKid

Joined Aug 1, 2013
11,055
Two 4060's in series gives you 28 bits, or 27 bits since you're using the leading edge of the last bit to inhibit further counting. That's over 500 times as many counts as the required 259K. Set the oscillator for 518 Hz (0.035% error), a much easier number to tune to, and one where accurate components are easier to get.
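
Checking that 518 Hz number (again, just the arithmetic):

Code:
#include <cstdio>

int main() {
    const double target_s = 72.0 * 3600.0;        // 259,200 s
    const double actual_s = (1L << 27) / 518.0;   // 134,217,728 counts at 518 Hz
    printf("delay: %.0f s, error: %+.3f%%\n",
           actual_s, 100.0 * (actual_s - target_s) / target_s);  // about -0.036%
    return 0;
}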

Or something like that.

ak
 

THE_RB

Joined Feb 11, 2008
5,438
Trivial with any microcontroller. And will give exact timing.

Why do people rule out the very best solution right at the start?

"I need to fly from London to New York, but I can't use a plane. I have to make something out of feathers and string..."

At least say WHY you can't use a micro.
 

ErnieM

Joined Apr 24, 2011
8,377
I would assume anyone who could use a micro would use a micro.

Anyone who has never used a micro would naturally and justifiably be apprehensive of the learning curve just to get a blinky LED "Hello World" project to work.
 

AnalogKid

Joined Aug 1, 2013
11,055
The question was about "making a circuit", not writing a program. Granted that a lot of the incoming questions are not fully baked, but I try to go with the intent of the question. And pretty much anyone on an electronics forum has to be aware of a uC's ability to count.

My favorite logic critter for something like this thread is a CPLD. No code to write, registers to set, or variables to define. Draw a schematic, select a part, click compile, done. Of course you have to wire it up, but that's a constant.

ak
 

Thread Starter

monkeykg

Joined Apr 11, 2014
2
My reason for avoiding microcontrollers is I have never used one before. Where could I learn about them and how to use them? Would it be a long learning curve?
 

WBahn

Joined Mar 31, 2012
30,075
My reason for avoiding microcontrollers is I have never used one before. Where could I learn about them and how to use them? Would it be a long learning curve?
Depends on how well prepared your background has made you for working with microcontrollers. The better your programming skills (in any language) and the better your understanding of beginning and intermediate digital logic concepts, the easier it will be to pick up microcontrollers. There are lots of options out there today that didn't use to be there, from the BASIC Stamp to the Arduino to the Raspberry Pi (I don't think I'd recommend the latter for what I am guessing is your background) and lots of others. Most microcontrollers have reasonably inexpensive development kits available. If this is truly a one-time project and you will never do another project that would lend itself to a microcontroller, then it might not be worth the upfront time, effort, and money to get into the game.
 

sirch2

Joined Jan 21, 2013
1,037
... Would it be a long learning curve?
As WBahn says, it depends on skills and experience, but in most cases, yes, it will be a significant learning curve, not least because you need to read and understand the datasheet (more like a data-book) for your processor of choice, and there are a lot of pitfalls. For example, most modern MCUs have a lot of configuration options and multi-use pins that need to be set up, and without ploughing through the datasheet it is easy to miss something that means your program will not work as expected, if at all.
 

djsfantasi

Joined Apr 11, 2010
9,163
For example, most modern MCUs have a lot of configuration options and multi-use pins that need to be set up, and without ploughing through the datasheet it is easy to miss something that means your program will not work as expected, if at all.
However, something like an Arduino Uno takes care of most of those concerns. It is about 10x more expensive, but if you have any programming skills, you can be up and running in a few hours. The Arduino site has a good programming reference page.
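
To give an idea of how little code it takes, here's a minimal sketch (assumptions: an Uno-class board, output on pin 13, active-high — adjust for your actual load):

Code:
// One-shot 72-hour timer. millis() wraps at about 49.7 days,
// so a single compare against 72 hours is safe here.
const unsigned long DELAY_MS = 72UL * 60UL * 60UL * 1000UL;  // 259,200,000 ms
const int OUT_PIN = 13;

void setup() {
  pinMode(OUT_PIN, OUTPUT);
  digitalWrite(OUT_PIN, LOW);
}

void loop() {
  if (millis() >= DELAY_MS) {
    digitalWrite(OUT_PIN, HIGH);  // send the one-time signal
    while (true) {}               // halt; it only needs to fire once
  }
}

Even the Uno's ceramic resonator (roughly 0.5% tolerance) lands well inside the ±5 hour window.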

I recently used this approach for the first time, and was completely successful. If this scenario is acceptable, don't let sirch2 completely discourage you. Learning microcontrollers is a good thing!
 

KMoffett

Joined Dec 19, 2007
2,918
You might also check out PICAXE microcontrollers: http://www.picaxe.com/ Originally designed for junior high and high school courses, they're very cheap, program in BASIC, come with free programming software and manuals, only need a serial cable to program, and there's a very helpful and civil forum.

Ken
 

sirch2

Joined Jan 21, 2013
1,037
... don't let sirch2 completely discourage you. Learning microcontrollers is a good thing!
I wasn't trying to discourage anyone, just being realistic. I have used Arduinos a lot and agree that it is a good way to get started but sooner or later you just want to embed the chip in your own project.
 