
Conversation


@lenguyenthanh lenguyenthanh commented Feb 22, 2025

Fix #280

```scala
postgres.use(_.execute(q)(f.argument))

def countFederationsSummary: IO[Long] =
  import skunk.implicits.*
```
Collaborator


Is it okay to introduce a dependency on skunk here? It seems the sql macros belong in the Sql object; otherwise I would just import it at the top of the Db object declaration.

Owner Author


Yeah, you're right, I was just lazy 😂. Moved the sql stuff to the Sql object.
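
For illustration, a minimal sketch of that split, assuming a hypothetical query text and an injected Session resource (neither is taken from the actual diff): the sql interpolator and codecs live on the Sql object, so the Db layer no longer needs skunk.implicits.

```scala
import cats.effect.{IO, Resource}
import skunk.*
import skunk.codec.all.int8
import skunk.implicits.*

// All statement definitions live here, so only Sql needs the skunk macro import.
object Sql:
  // Hypothetical query text; the real table name lives in the repo.
  val countFederationsSummary: Query[Void, Long] =
    sql"SELECT COUNT(*) FROM federations".query(int8)

// The Db side only touches the Session; no sql interpolator needed at this layer.
def countFederationsSummary(postgres: Resource[IO, Session[IO]]): IO[Long] =
  postgres.use(_.unique(Sql.countFederationsSummary))
```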

Collaborator

@Masynchin Masynchin left a comment


Looks cheesy to me 👍🏻

@lenguyenthanh lenguyenthanh merged commit cd3cca1 into main Feb 22, 2025
3 checks passed
@lenguyenthanh lenguyenthanh deleted the better-paging branch February 22, 2025 20:54
@Masynchin
Collaborator

@lenguyenthanh these two queries instantiate two connections rather than reusing one, is that acceptable? If not, we could add a new layer that combines query functions of [A] => Conn => IO[A] into Conn => IO[(A, B)] for these two queries, which could then be run on a single connection.
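
As a rough sketch of that combining layer (pageWithTotal and its parameter names are hypothetical, not code from this PR), both statements can run inside a single postgres.use block so they share one session:

```scala
import cats.effect.{IO, Resource}
import skunk.*

// Run the page query and the count query on the same connection.
// P and Row stand in for the real paging input and summary row types.
def pageWithTotal[P, Row](
    postgres: Resource[IO, Session[IO]],
    pageQuery: Query[P, Row],
    countQuery: Query[Void, Long],
    paging: P
): IO[(List[Row], Long)] =
  postgres.use: session =>
    for
      rows  <- session.execute(pageQuery)(paging) // one page of results
      total <- session.unique(countQuery)         // total row count
    yield (rows, total)
```

For just these two statements a shared use block is probably enough; a general combinator over [A] => Conn => IO[A] could be layered on top later if more call sites need it.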

@lenguyenthanh
Owner Author

Yeah, it would be ideal if it used only one connection. Another thing we could do to optimize this is to cache the count query instead of executing it for every page.

@Masynchin
Collaborator

Another thing we could do to optimize this is to cache the count query instead of executing it for every page.

I don't see how this cache can be properly invalidated 🤔

@lenguyenthanh
Owner Author

I don't see how this cache can be properly invalidated 🤔

We can use the crawler to invalidate the cache 🤔 or just use some arbitrary TTL.
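
A rough sketch of the TTL variant, assuming a cats-effect Ref and the monotonic clock (CachedCount and its shape are illustrative, not part of this PR); the crawler-based invalidation mentioned above could simply call invalidate after a run:

```scala
import scala.concurrent.duration.*
import cats.effect.{IO, Ref}
import cats.syntax.all.*

// Cache the last count together with the (monotonic) time it was computed,
// and recompute once the entry is older than the ttl.
final class CachedCount(
    count: IO[Long],
    ttl: FiniteDuration,
    state: Ref[IO, Option[(Long, FiniteDuration)]]
):
  def get: IO[Long] =
    for
      now    <- IO.monotonic
      cached <- state.get
      value  <- cached match
                  case Some((v, at)) if (now - at) < ttl => IO.pure(v)
                  case _ => count.flatTap(v => state.set(Some((v, now))))
    yield value

  // Hook for crawler-driven invalidation: drop the entry so the next get recomputes.
  def invalidate: IO[Unit] = state.set(None)

object CachedCount:
  def make(count: IO[Long], ttl: FiniteDuration): IO[CachedCount] =
    Ref.of[IO, Option[(Long, FiniteDuration)]](None)
      .map(ref => CachedCount(count, ttl, ref))
```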

@Masynchin
Collaborator

We can use the crawler to invalidate the cache

Seems workable. My concern is about consistency, so TTL seems disastrous 😁

@lenguyenthanh
Owner Author

My concern is about consistency, so TTL seems disastrous 😁

Yeah, if we're worried about consistency then TTL is tragic, but we're probably not too worried about it here 😛. Using the crawler to invalidate the cache is a much better idea anyway.



Development

Successfully merging this pull request may close these issues.

Show how many players were found in the search and how many pages there are in total to go through.
