[PATCH] Eliminate per-test timing calibration.
Peter Harris
pharris at opentext.com
Thu Jan 7 12:24:38 PST 2010
On 2010-01-07 14:54, Adam Jackson wrote:
> Instead, use setitimer() to receive a SIGALRM after the requested number
> of seconds elapses, and change the tests to report back the true number
> of iterations.
Nice.
> diff --git a/x11perf.c b/x11perf.c
> index 15eb04d..410967c 100644
> --- a/x11perf.c
> +++ b/x11perf.c
> @@ -38,12 +38,18 @@ SOFTWARE.
> #define Time_t time_t
> #include <stdlib.h>
>
> +#include <limits.h>
> +#include <sys/time.h>
> +#ifndef ITIMER_REAL
> +#define NEED_CALIBRATE
> +#endif
My platform doesn't have a <sys/time.h>. I suspect it should be
protected by an #ifdef ITIMER_REAL.
> @@ -138,6 +139,19 @@ extern int abortTest;
> extern void AbortTest (void);
>
> #define CheckAbort() if (abortTest) AbortTest ()
> +
> +#include <stdio.h>
> +static inline Bool CheckCancel(void)
> +{
> +    if (abortTest == SIGALRM) {
> +        abortTest = 0;
> +        return 1;
> +    } else
> +        return 0;
> +}
My platform (Win32) doesn't have SIGALRM either. This hunk should
probably be protected by #ifndef NEED_CALIBRATE or similar.
Also, "inline" is a C99-ism. Should that read _X_INLINE to support C89
compilers?
Peter Harris
--
Open Text Connectivity Solutions Group
Peter Harris http://connectivity.opentext.com/
Research and Development Phone: +1 905 762 6001
pharris at opentext.com Toll Free: 1 877 359 4866
More information about the xorg-devel mailing list