feat: auto-detect backend from environment variables #88
Conversation
📝 Walkthrough

DefaultBackendProvider now auto-detects the L2 backend from environment variables (CACHEKIT_* or REDIS_URL), can return None for L1-only mode, raises ConfigurationError on ambiguous configs, and changes wrapper behavior to fail fast for sync callers but degrade gracefully for async callers when no backend is available.
Sequence Diagram(s)

sequenceDiagram
participant Client as Client Code
participant Provider as DefaultBackendProvider
participant Env as Env Vars
participant Resolver as _resolve_provider()
participant Factory as Backend Factories
Client->>Provider: get_backend()
alt first call (_resolved = False)
Provider->>Resolver: _resolve_provider()
Resolver->>Env: check CACHEKIT_API_KEY
alt CACHEKIT_API_KEY set
Resolver->>Factory: create CachekitIOBackend
Factory-->>Resolver: CachekitIO instance
else check CACHEKIT_REDIS_URL / REDIS_URL
Resolver->>Env: check CACHEKIT_REDIS_URL / REDIS_URL
alt redis var set
Resolver->>Factory: create RedisBackend
Factory-->>Resolver: Redis instance
else check CACHEKIT_MEMCACHED_SERVERS
Resolver->>Env: check CACHEKIT_MEMCACHED_SERVERS
alt memcached set
Resolver->>Factory: create MemcachedBackend
Factory-->>Resolver: Memcached instance
else check CACHEKIT_FILE_CACHE_DIR
Resolver->>Env: check CACHEKIT_FILE_CACHE_DIR
alt file dir set
Resolver->>Factory: create FileBackend
Factory-->>Resolver: File instance
else no backend vars
Resolver-->>Provider: None
end
end
end
end
alt multiple conflicting vars
Resolver-->>Provider: raise ConfigurationError
end
Provider->>Provider: cache result, set _resolved = True
else cached (_resolved = True)
Provider->>Provider: return cached backend or None
end
Provider-->>Client: backend instance or None
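The resolution flow in the diagram above can be sketched in Python. This is a minimal illustration under stated assumptions, not cachekit's actual code: the factory functions and the `ConfigurationError` class here stand in for the library's real backend classes and exception type.

```python
import os

class ConfigurationError(Exception):
    """Stand-in for cachekit's error on ambiguous backend configuration."""

# Hypothetical factory placeholders for the real backend classes.
def make_cachekit_io(key): return ("cachekit-io", key)
def make_redis(url): return ("redis", url)
def make_memcached(servers): return ("memcached", servers)
def make_file(path): return ("file", path)

# Explicit CACHEKIT_* vars, checked in priority order.
_PRIORITY = [
    ("CACHEKIT_API_KEY", make_cachekit_io),
    ("CACHEKIT_REDIS_URL", make_redis),
    ("CACHEKIT_MEMCACHED_SERVERS", make_memcached),
    ("CACHEKIT_FILE_CACHE_DIR", make_file),
]

def resolve_provider(env=os.environ):
    # Ambiguity check: more than one explicit CACHEKIT_* var is an error.
    set_vars = [name for name, _ in _PRIORITY if env.get(name)]
    if len(set_vars) > 1:
        raise ConfigurationError(f"conflicting backend vars: {set_vars}")
    for name, factory in _PRIORITY:
        if env.get(name):
            return factory(env[name])
    # 12-factor fallback: plain REDIS_URL never conflicts with CACHEKIT_* vars.
    if env.get("REDIS_URL"):
        return make_redis(env["REDIS_URL"])
    return None  # L1-only mode
```

Note that `REDIS_URL` is excluded from the conflict check, matching the PR's rule that the non-prefixed fallback yields silently to any explicit `CACHEKIT_` var.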
sequenceDiagram
participant User as User Code
participant Sync as sync_wrapper
participant Async as async_wrapper
participant L1 as L1 Cache
participant L2 as L2 Backend / Provider
User->>Sync: call decorated function
Sync->>L2: resolve backend (get_backend)
alt backend is None (L1-only)
Sync->>Sync: raise BackendError(PERMANENT)
Sync-->>User: error
else backend resolved
Sync->>L1: check L1
alt L1 hit
L1-->>User: return value
else L1 miss
Sync->>L2: query L2
L2-->>Sync: value
Sync->>L1: store
Sync-->>User: return value
end
end
User->>Async: call decorated coroutine
Async->>L2: resolve backend (get_backend)
alt backend is None (L1-only)
Async->>L1: check L1
alt L1 hit
L1-->>User: return value
else L1 miss
Async->>User: execute original coroutine (skip L2)
User-->>Async: result
Async->>L1: store
Async-->>User: return value
end
else backend resolved
Async->>L1: check L1
alt L1 hit
L1-->>User: return value
else L1 miss
Async->>L2: query L2
L2-->>Async: value
Async->>L1: store
Async-->>User: return value
end
end
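The sync/async divergence in the diagram above can be sketched as follows. This is an illustrative reduction, assuming a dict-based L1 and a duck-typed backend with a `get()` method; `BackendError` and the function names stand in for cachekit's real types.

```python
import asyncio

class BackendError(Exception):
    """Stand-in for cachekit's BackendError (PERMANENT)."""

l1_cache: dict = {}

def sync_call(key, compute, backend):
    # Sync path fails fast: running without an L2 backend is treated
    # as a permanent configuration error.
    if backend is None:
        raise BackendError("no L2 backend configured (PERMANENT)")
    if key in l1_cache:
        return l1_cache[key]
    value = backend.get(key)
    if value is None:
        value = compute()
    l1_cache[key] = value
    return value

async def async_call(key, compute, backend):
    # Async path degrades gracefully: with no backend it skips L2,
    # runs the original coroutine, and still populates L1.
    if key in l1_cache:
        return l1_cache[key]
    value = backend.get(key) if backend is not None else None
    if value is None:
        value = await compute()
    l1_cache[key] = value
    return value
```

The asymmetry is deliberate per the walkthrough: a sync caller with no backend gets an immediate, unambiguous error, while an async caller falls back to L1-only caching.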
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
❌ 1 test failed: 1 ❄️ flaky test.
🧹 Nitpick comments (1)
src/cachekit/backends/provider.py (1)
149-164: Minor thread-safety consideration for concurrent initialization.

The lazy initialization pattern (`if not self._resolved`) is not thread-safe. Concurrent threads could race and both enter `_resolve_provider()`. Since `_resolve_provider()` is idempotent (it reads env vars and creates a provider), this is benign: at worst, duplicate providers are created momentarily. If strict single-initialization is required in the future, consider adding a lock:
♻️ Optional: thread-safe initialization
```diff
+import threading
+
 class DefaultBackendProvider(BackendProviderInterface):
+    _init_lock = threading.Lock()
+
     def get_backend(self):
-        if not self._resolved:
-            self._provider = self._resolve_provider()
-            self._resolved = True
+        if not self._resolved:
+            with self._init_lock:
+                if not self._resolved:  # double-check
+                    self._provider = self._resolve_provider()
+                    self._resolved = True
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/cachekit/backends/provider.py` around lines 149 - 164, The lazy init in get_backend() can race since it checks self._resolved and calls _resolve_provider() without synchronization; to make initialization thread-safe, add a lock (e.g., self._init_lock = threading.Lock()) and wrap the resolve sequence in get_backend() with the lock so only one thread runs self._resolve_provider() and sets self._provider/self._resolved, leaving other threads to read the already-set self._provider; update any constructor (or class init) to create the lock and ensure get_backend() uses it around the if not self._resolved / self._resolve_provider() block.
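The fix suggested in the review comment amounts to double-checked locking. A self-contained sketch, where the `resolver` callable is a placeholder for the real `_resolve_provider()`:

```python
import threading

class LazyProvider:
    """Sketch of double-checked locking for lazy backend resolution."""

    _init_lock = threading.Lock()

    def __init__(self, resolver):
        self._resolver = resolver   # stands in for _resolve_provider()
        self._provider = None
        self._resolved = False

    def get_backend(self):
        if not self._resolved:              # fast path once resolved, no lock
            with self._init_lock:
                if not self._resolved:      # re-check under the lock
                    self._provider = self._resolver()
                    self._resolved = True
        return self._provider
```

The outer check keeps the hot path lock-free after initialization; the inner check ensures only one thread ever runs the resolver, even if several race past the outer check.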
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 7641c536-c1e2-402b-a7a5-6a7313b97b12
📒 Files selected for processing (4)
- .secrets.baseline
- src/cachekit/backends/provider.py
- src/cachekit/decorators/wrapper.py
- tests/unit/backends/test_provider.py
b603be2 to 5dd48df
DefaultBackendProvider now resolves the cache backend from CACHEKIT_-prefixed environment variables instead of always defaulting to Redis. Raises ConfigurationError when multiple CACHEKIT_ backend vars are set concurrently. Priority: CACHEKIT_API_KEY > CACHEKIT_REDIS_URL > CACHEKIT_MEMCACHED_SERVERS > CACHEKIT_FILE_CACHE_DIR > REDIS_URL (12-factor fallback) > None (L1-only). Non-prefixed REDIS_URL never conflicts with CACHEKIT_ vars.
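The commit above covers env-var detection; the PR summary also notes that an explicit `backend=` argument and `set_default_backend()` always override it. A hedged sketch of that overall precedence (function and variable names here are illustrative, not cachekit's exact API):

```python
_default_backend = None

def set_default_backend(backend):
    """Illustrative module-level override, consulted before env detection."""
    global _default_backend
    _default_backend = backend

def pick_backend(explicit=None, env_detected=None):
    # Precedence: explicit backend= argument > set_default_backend()
    # > env-var auto-detection (which may itself be None for L1-only).
    if explicit is not None:
        return explicit
    if _default_backend is not None:
        return _default_backend
    return env_detected
```

Keeping explicit configuration ahead of detection means the new env-var behavior is purely additive: existing callers that pass a backend see no change.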
5dd48df to 31fc1c1
Summary
Closes #87
- `DefaultBackendProvider` now auto-detects the cache backend from `CACHEKIT_`-prefixed environment variables instead of always defaulting to Redis
- Raises `ConfigurationError` when multiple `CACHEKIT_`-prefixed backend vars are set concurrently (ambiguous config)
- `REDIS_URL` never conflicts: it's a 12-factor fallback that yields to any explicit `CACHEKIT_` var
- Priority chain: `CACHEKIT_API_KEY` → `CACHEKIT_REDIS_URL` → `CACHEKIT_MEMCACHED_SERVERS` → `CACHEKIT_FILE_CACHE_DIR` → `REDIS_URL` → None (L1-only)
- Unchanged: explicit `backend=` parameter and `set_default_backend()` always take precedence over env-var detection

Test plan