Most applications use timers to do periodic work every few seconds: maintenance tasks, network keep-alives, and similar. In most cases these timers do not need to be very precise.
I’m wondering why they are not synchronized at the OS level: all applications could do their maintenance work directly after one another, and the CPU could then drop into a lower-power state. As things stand, the CPU probably can’t enter a low-power state at all, because every application fires its timers at arbitrary times.
Does anyone know about such possibilities in the major operating systems?