I have two scripts that do basically the same thing. The first ('onDemand') uses a small burst of CPU whenever it's notified that a certain event occurred, and is completely idle the rest of the time. The second ('poller') uses very little CPU at any given moment, but it polls for its event, so it's active constantly (you can see it using CPU in the background in the screenshot).
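To make the difference concrete, the two designs look roughly like the sketch below. This isn't the real code (the post's scripts aren't shown, and I haven't said what language they're in); it's an illustrative Python sketch where the flag file, FIFO path, and poll interval are all made up stand-ins for the real event mechanism:

```python
import os
import time

def handle_event():
    """The actual work; identical in both designs."""
    pass

# 'poller' style: wake up on a fixed interval and check whether the
# event happened. The flag file and interval are invented for this sketch.
EVENT_FLAG = "/tmp/event.flag"

def poller(interval=0.5):
    while True:
        if os.path.exists(EVENT_FLAG):  # each wakeup and check costs some CPU,
            os.remove(EVENT_FLAG)       # whether or not the event occurred
            handle_event()
        time.sleep(interval)

# 'onDemand' style: block in the kernel until notified. The process uses
# no CPU at all while waiting; a FIFO stands in for the real notification
# mechanism here.
def on_demand(fifo_path="/tmp/event.fifo"):
    if not os.path.exists(fifo_path):
        os.mkfifo(fifo_path)
    with open(fifo_path) as fifo:       # open() blocks until a writer appears
        for _line in fifo:              # each read blocks until an event arrives
            handle_event()
```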
I started them at more or less the same time and left them running for a day to test which one had less system impact, and I'm confused by the results (see screenshot; onDemand is first, poller is second: http://imgur.com/a/7tF34).
onDemand used vastly more "Total CPU Time" but vastly fewer "Cycles". How can this be? 'Cycles' seems pretty solidly defined, but what exactly is "CPU Time" measuring here? Based on these numbers, how can I tell which process used fewer CPU resources, both in terms of slowing the system down and in total energy consumed by the CPU?
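My working assumption is that CPU time only accrues while a process is actually executing on a core (user + system time), not while it's blocked or sleeping. A quick Python test (again just illustrative, not my actual scripts) seems consistent with that, but I'd like confirmation that this is what the "Total CPU Time" column means:

```python
import time

start_wall = time.monotonic()
start_cpu = time.process_time()  # user + system CPU time of this process

time.sleep(2)                    # blocked in the kernel: wall time passes,
                                 # but (almost) no CPU time accrues

x = 0
for i in range(10_000_000):      # busy loop: CPU time now tracks wall time
    x += i

print(f"wall time: {time.monotonic() - start_wall:.2f}s")
print(f"CPU time:  {time.process_time() - start_cpu:.2f}s")
```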
Thanks!