"Settings example" in docs with huge performance hit #12299
Example Code:

```python
@lru_cache
def get_settings():
    return Settings()


# This endpoint has a response time of 56k req/s because the sync function
# get_settings is used as a dependency.
@app.get("/info")
async def info(settings: Annotated[Settings, Depends(get_settings)]):
    return {
        "app_name": settings.app_name,
        "admin_email": settings.admin_email,
        "items_per_user": settings.items_per_user,
    }
```

```python
from core.utils import get_settings

settings = get_settings()


# This endpoint has a response time of 126k req/s; settings is imported
# directly, not used as a dependency.
@app.get("/test")
async def info():
    return {
        "app_name": settings.app_name,
        "admin_email": settings.admin_email,
        "items_per_user": settings.items_per_user,
    }
```

Description:

https://fastapi.tiangolo.com/advanced/settings/#__tabbed_4_1

As you can see in the settings example linked above, we have a sync dependency. I followed this example in my app, and the routes using `Annotated[Settings, Depends(get_settings)]` are all affected: 56k req/s instead of the 126k req/s I expect.

As per discussion #11654, I understand this is because the `get_settings` function is sync, so it runs in a threadpool. I tried making it async, but I got errors relating to "coroutines"; it probably doesn't work with the `@lru_cache` decorator.

In the end I just imported and called `get_settings` at the top of every file that needs settings, and now all the endpoints have response times of 126k req/s again.

So this pattern from the documentation is cutting users' performance by more than half, possibly without them knowing. Reading variables from env surely isn't a good reason to limit app performance for everyone. Thanks!

Operating System: macOS
Operating System Details: No response
FastAPI Version: 0.111.0
Pydantic Version: (not provided)
Python Version: 3.12
Additional Context: No response
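The "coroutines" error mentioned above comes from stacking `@lru_cache` directly on an `async def`: the cache stores the coroutine object produced by the first call, and a coroutine can only be awaited once. A minimal standalone reproduction (no FastAPI involved; the dict returned here is a toy stand-in for a real `Settings` object):

```python
import asyncio
from functools import lru_cache


@lru_cache
async def get_settings():
    # lru_cache stores the coroutine object returned by the first call,
    # not the result of awaiting it.
    return {"app_name": "demo"}


async def main():
    await get_settings()  # first call: the cached coroutine is awaited
    try:
        await get_settings()  # second call returns the *same* coroutine
    except RuntimeError as exc:
        return str(exc)


msg = asyncio.run(main())
print(msg)  # cannot reuse already awaited coroutine
```

This is why making the cached function async naively fails, and why the workarounds later in this thread either move the cache to an async-aware decorator or keep `lru_cache` on a sync inner function.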
Replies: 4 comments 3 replies
-
Without doing anything resembling an investigation of the following claim, I'm still going to make it because it intuitively makes some sense. The worst possible case is being tested in the original post that started this discussion thread (and similarly in the linked comment). The moment a route and the dependencies it calls start doing anything real, performance will drop a lot and those two measurements will get much closer together. Think of fetching data from a database or reading a file from disk: those efforts will swamp the performance hit of running a sync dependency.
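A rough way to put numbers on this claim: dispatching a trivial sync function to a thread pool costs on the order of tens of microseconds per call, which dominates a no-op endpoint but is negligible next to a real database query. The sketch below uses `loop.run_in_executor` as a stand-in for FastAPI's threadpool offloading (FastAPI actually uses AnyIO's worker threads, so this is an approximation, not the real code path):

```python
import asyncio
import time


def sync_dep():
    # Stand-in for a trivial sync dependency like get_settings()
    return 42


async def main():
    loop = asyncio.get_running_loop()
    n = 2_000

    # Calling directly, as an async dependency effectively would
    t0 = time.perf_counter()
    for _ in range(n):
        sync_dep()
    direct = time.perf_counter() - t0

    # Dispatching each call to the default thread pool, roughly what
    # happens to a sync dependency
    t0 = time.perf_counter()
    for _ in range(n):
        await loop.run_in_executor(None, sync_dep)
    pooled = time.perf_counter() - t0

    print(f"direct: {direct:.5f}s  threadpool: {pooled:.5f}s")
    return direct, pooled


direct, pooled = asyncio.run(main())
```

The relative gap looks huge on an empty function, but the absolute per-call threadpool cost is small enough to disappear into the noise once the endpoint does real I/O.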
-
Does someone have any input?
-
Bump?
-
To avoid overheads caused by using a `sync` dependency, you can either use an `async` alternative of `lru_cache` ([`alru_cache`](https://github.com/aio-libs/async-lru)) or wrap your cached function with an `async` function:

```python
@lru_cache
def get_settings_sync():
    return Settings()


async def get_settings():
    return get_settings_sync()


@app.get("/info")
async def info(settings: Annotated[Settings, Depends(get_settings)]):
    return {
        "app_name": settings.app_name,
        "admin_email": settings.admin_email,
        "items_per_user": settings.items_per_user,
    }
```
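One property worth verifying about this wrapper pattern: because `lru_cache` sits on the sync inner function, the `async` wrapper still hands out a single cached instance. A standalone check (with a toy `Settings` class standing in for the real Pydantic settings model):

```python
import asyncio
from functools import lru_cache


class Settings:
    # Toy stand-in for the Pydantic BaseSettings model
    app_name = "demo"


@lru_cache
def get_settings_sync():
    return Settings()


async def get_settings():
    # Async wrapper: resolved on the event loop with no threadpool
    # dispatch, while lru_cache on the inner function keeps the singleton.
    return get_settings_sync()


a = asyncio.run(get_settings())
b = asyncio.run(get_settings())
print(a is b)  # True: both calls return the same cached instance
```

Since the wrapper itself does no awaiting, it adds only the cost of one extra coroutine per request, which is far cheaper than a threadpool round trip.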