A high-precision timing and profiling library for Python with support for context managers, decorators, lap timing, and benchmarking.
- Multiple Timer Interfaces: Choose between `StopWatch`, `TimerContext`, or the registry-based `Timer` class
- Context Manager Support: Time code blocks with clean `with` statements
- Decorator Support: Automatically time function executions with `@timer.timed` and `@timer.timed_async`
- Lap Timing: Record intermediate times within a timing context
- Historical Tracking: All timing invocations are stored in lists for statistical analysis
- Benchmarking: Run functions multiple times with warmup support
- Async Support: Full support for async functions with `@timer.timed_async`
- Type Safe: Comprehensive type hints using modern Python typing features
- Zero Dependencies: Uses only the Python standard library
- High Precision: Uses `time.perf_counter()` for accurate timing
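All timing in the library is built on `time.perf_counter()`. As a standalone illustration of that core idea, here is a minimal context-manager timer sketch; the `tick` helper below is hypothetical and not part of this package:

```python
import time
from contextlib import contextmanager

@contextmanager
def tick(results: list[float]):
    """Append the elapsed wall-clock time of the block to `results`."""
    start = time.perf_counter()
    try:
        yield
    finally:
        results.append(time.perf_counter() - start)

durations: list[float] = []
with tick(durations):
    time.sleep(0.01)  # stand-in for real work
print(f"Block took {durations[0]:.4f}s")
```

`perf_counter()` is monotonic and high-resolution, which is why it is preferred over `time.time()` for measuring durations.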
Requirements: Python 3.12+
Install with uv (recommended):

```shell
uv pip install -e .
```

Or with pip:

```shell
pip install -e .
```

```python
from timekid.timer import Timer

timer = Timer(precision=3)

with timer['database_query']:
    # Your code here
    result = execute_query()

# Access timing (returns list of floats)
print(f"Query took {timer.times['database_query'][0]}s")
```
```python
from timekid.timer import Timer

timer = Timer(precision=2)

@timer.timed
def process_data(data):
    # Your processing logic
    return processed_data

# Call multiple times
for item in items:
    process_data(item)

# Analyze all invocations
times = timer.times['process_data']
print(f"Average: {sum(times) / len(times):.3f}s")
print(f"Min: {min(times):.3f}s, Max: {max(times):.3f}s")
```
```python
with timer['data_pipeline'] as t:
    load_data()
    t.lap()  # Record lap 1
    transform_data()
    t.lap()  # Record lap 2
    save_data()
    # Final lap recorded automatically on exit

# Access lap times
contexts = timer.get('data_pipeline')
print(f"Load: {contexts[0].laps[0]}s")
print(f"Transform: {contexts[0].laps[1]}s")
print(f"Save: {contexts[0].laps[2]}s")
```
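Under the hood, lap times are just deltas between successive clock readings. A self-contained sketch of that mechanism, independent of this library (`LapClock` is an illustrative name, not part of the package):

```python
import time

class LapClock:
    """Record elapsed time between successive lap() calls."""
    def __init__(self) -> None:
        self._last = time.perf_counter()
        self.laps: list[float] = []

    def lap(self) -> float:
        now = time.perf_counter()
        delta = now - self._last   # time since previous lap (or creation)
        self._last = now
        self.laps.append(delta)
        return delta

clock = LapClock()
time.sleep(0.01)   # stage 1
clock.lap()
time.sleep(0.02)   # stage 2
clock.lap()
print([f"{d:.3f}" for d in clock.laps])
```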
```python
import asyncio

import aiohttp  # third-party HTTP client, used only in this example
from timekid.timer import Timer

timer = Timer()

@timer.timed_async
async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.json()

# Use normally
asyncio.run(fetch_data('https://api.example.com/data'))
```
```python
from timekid.timer import Timer

timer = Timer()

# Benchmark a function with 1000 iterations
results = timer.benchmark(my_function, 1000, arg1, arg2)

# Analyze results
times = [r.elapsed_time for r in results]
avg_time = sum(times) / len(times)
print(f"Average: {avg_time:.6f}s")
```
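The warmup support listed in the features typically means running the function a few untimed iterations first, so caches and other one-time costs do not skew the measurements. A standalone sketch of the pattern (`bench` is a hypothetical helper, not this library's API):

```python
import time

def bench(func, *args, num_iter: int = 100, warmup: int = 5) -> list[float]:
    """Run func `warmup` times untimed, then `num_iter` timed iterations."""
    for _ in range(warmup):          # untimed warmup runs, results discarded
        func(*args)
    times: list[float] = []
    for _ in range(num_iter):
        start = time.perf_counter()
        func(*args)
        times.append(time.perf_counter() - start)
    return times

samples = bench(sum, range(1000), num_iter=50)
print(f"avg {sum(samples) / len(samples):.2e}s over {len(samples)} runs")
```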
```python
from timekid.timer import StopWatch

sw = StopWatch(precision=2)

sw.start()
# Your code here
do_something()
elapsed = sw.stop()
print(f"Elapsed: {elapsed}s")
```

All timer registry values are stored as lists of `TimerContext` objects. This enables:
- Historical tracking: Every invocation is preserved
- Statistical analysis: Calculate min, max, average, and standard deviation
- Performance trends: Track performance over time
- Consistent API: No mix of single values and lists

```python
# Multiple invocations create list entries
with timer['task']:
    do_work()
with timer['task']:
    do_work()

# Access all timings
all_times = timer.times['task']      # Returns list[float]
first_time = timer.times['task'][0]
latest_time = timer.times['task'][-1]
```
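Because every invocation is kept, the standard-library `statistics` module applies directly to a timer's list of floats. The list below is hard-coded for illustration, standing in for something like `timer.times['task']`:

```python
import statistics

# Stand-in for timer.times['task'] after several invocations
times = [0.121, 0.118, 0.125, 0.119, 0.122]

print(f"mean:  {statistics.mean(times):.4f}s")
print(f"stdev: {statistics.stdev(times):.4f}s")
print(f"min:   {min(times):.4f}s, max: {max(times):.4f}s")
```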
Decorated functions store all invocations under the function name (no numbered keys):

```python
@timer.timed
def process(item):
    return item * 2

# Call 3 times
process(1)
process(2)
process(3)

# All stored under the 'process' key
print(len(timer.times['process']))  # Output: 3
```

All timer types accept an optional precision parameter for rounding:
```python
timer = Timer(precision=3)  # Round to 3 decimal places

with timer['task']:
    time.sleep(0.123456)

print(timer.times['task'][0])  # Output: 0.123
```

Enable verbose logging to see timing events in real-time:
```python
import logging

logger = logging.getLogger(__name__)
timer = Timer(verbose=True, log_func=logger.info)

with timer['task']:
    # Logs start and stop events
    do_work()
```

Main registry-based interface for timing operations.
```python
Timer(precision: Optional[int] = None, verbose: bool = False, log_func: Callable[[str], None] = print)
```

Properties:
- `times: Dict[str, list[float]]` - All elapsed times for succeeded timers
- `contexts: Dict[str, list[TimerContext]]` - All timer contexts
- `precision: Optional[int]` - Configured precision for rounding

Methods:
- `timer['key']` - Create/access a timer context (creates a new context each time)
- `timed(func)` - Decorator for synchronous functions
- `timed_async(func)` - Decorator for async functions
- `get(key: str)` - Get all contexts matching a key
- `status(key: str)` - Get the list of statuses for a key
- `sorted(reverse: bool = False)` - Get timers sorted by elapsed time
- `timeit(func, *args, **kwargs)` - Time a single function call
- `benchmark(func, num_iter: int, *args, **kwargs)` - Benchmark a function over multiple iterations
- `anonymous(name, verbose, log_func)` - Create an anonymous timer context (not stored in the registry)
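A registry-backed `timed` decorator can be understood with a few lines of plain Python. This is a sketch of the general pattern with illustrative names (`MiniTimer`), not the package's actual implementation:

```python
import functools
import time

class MiniTimer:
    """Toy registry: maps a function name to a list of elapsed times."""
    def __init__(self) -> None:
        self.times: dict[str, list[float]] = {}

    def timed(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                # Record even when the call raises
                self.times.setdefault(func.__name__, []).append(
                    time.perf_counter() - start
                )
        return wrapper

mini = MiniTimer()

@mini.timed
def double(x):
    return x * 2

for n in (1, 2, 3):
    double(n)
print(len(mini.times['double']))  # each call appends one entry
```

The `try/finally` ensures a timing is recorded whether the wrapped call succeeds or fails, mirroring the SUCCEEDED/FAILED status distinction described below.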
Context manager for timing code blocks.

```python
TimerContext(precision: Optional[int], name: Optional[str] = None, verbose: bool = False, log_func: Callable[[str], None] = print)
```

Properties:
- `elapsed_time: float` - Total elapsed time
- `laps: list[float]` - List of lap times
- `status: Status` - Current status (PENDING/RUNNING/SUCCEEDED/FAILED)
- `name: str` - Timer name

Methods:
- `lap()` - Record an intermediate time
- `reset()` - Reset the timer (clears laps, starts from now)
- `rename(name: str)` - Change the timer name
Simple imperative timer with manual control.

```python
StopWatch(precision: Optional[int] = None)
```

Properties:
- `elapsed_time: float` - Elapsed time (raises an error if not started)
- `status: Status` - Current status

Methods:
- `start()` - Start timing
- `stop()` - Stop timing and return the elapsed time
- `reset()` - Reset to the initial state
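The behaviour described above can be mimicked in a few lines. This is a sketch of the documented semantics (`MiniStopWatch` is an illustrative name, not the shipped class):

```python
import time

class MiniStopWatch:
    """start/stop/reset, with an error when stopped before starting."""
    def __init__(self) -> None:
        self._start: float | None = None
        self._elapsed: float | None = None

    def start(self) -> None:
        self._start = time.perf_counter()

    def stop(self) -> float:
        if self._start is None:
            raise RuntimeError("stop() called before start()")
        self._elapsed = time.perf_counter() - self._start
        return self._elapsed

    def reset(self) -> None:
        self._start = self._elapsed = None

sw = MiniStopWatch()
sw.start()
time.sleep(0.01)  # stand-in for real work
print(f"{sw.stop():.3f}s")
```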
Timer lifecycle states:
- `Status.PENDING` - Created but not started
- `Status.RUNNING` - Currently timing
- `Status.STOPPED` - Manually stopped (StopWatch only)
- `Status.SUCCEEDED` - Context exited normally
- `Status.FAILED` - Context exited with an exception
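Such a lifecycle maps naturally onto a standard-library `enum.Enum`; a minimal sketch (not necessarily how the package defines it):

```python
from enum import Enum, auto

class Status(Enum):
    PENDING = auto()    # created but not started
    RUNNING = auto()    # currently timing
    STOPPED = auto()    # manually stopped (StopWatch only)
    SUCCEEDED = auto()  # context exited normally
    FAILED = auto()     # context exited with an exception

print([s.name for s in Status])
```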
Run tests with unittest:

```shell
python -m unittest tests._basic_test -v
```

This project uses uv for package management:

```shell
# Install in editable mode
uv pip install -e .

# Run tests
.venv/bin/python -m unittest tests._basic_test -v

# Run examples
python -m timekid.timer
```

The project includes a GitHub Actions workflow for automated testing:
- Runs on Python 3.12 and 3.13
- Tests on push/PR to main, master, or develop branches
- Uses `uv` for dependency management
If upgrading from a version with mixed single/list registry values:

```python
# Old way (single values)
elapsed = timer.times['my_task']      # Was a float

# New way (list values)
elapsed = timer.times['my_task'][0]   # First timing
elapsed = timer.times['my_task'][-1]  # Latest timing
```

Contributions are welcome! Please ensure:
- Python 3.12+ compatibility
- All tests pass
- Type hints for all functions
- Update documentation as needed
MIT License
Peter Vestereng Larsen ([email protected])