
limit rows returned #44

Open · ablomeke opened this issue Feb 7, 2024 · 3 comments
ablomeke commented Feb 7, 2024

We use firebird_fdw to connect our postgres instance to one or more fairly large firebird instances. When one of our users runs a query that retrieves a very large dataset, the query spins and memory usage grows until oomkiller terminates the process, which crashes the database and puts it into recovery mode. We are investigating ways to mitigate this, including asking users to limit their queries to something that will actually return, but try as we might, people write queries they shouldn't, and doing so shouldn't result in a crash of the database.

We run similar queries every now and again through our postgres_fdw foreign servers, and they seem to handle themselves much better. I suspect part of the reason is that postgres_fdw has the fetch_size parameter, which limits the number of rows pulled in a single fetch. That seems to slow things down enough for us to catch the issue and deal with it properly before it spirals out of control and affects other users, or allows postgres to manage the memory more appropriately. Either way, I'm wondering whether that feature could be implemented in firebird_fdw, or whether some other means of managing memory usage could be added.
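For reference, in postgres_fdw the fetch_size option can be set at the server or foreign-table level; a sketch of how it is used there (option name per the postgres_fdw documentation; the server and table names below are placeholders):

```sql
-- fetch_size controls how many rows postgres_fdw retrieves per fetch,
-- bounding the memory needed for any single remote result set.
ALTER SERVER remote_server OPTIONS (ADD fetch_size '1000');

-- It can also be overridden for an individual foreign table:
ALTER FOREIGN TABLE big_remote_table OPTIONS (ADD fetch_size '100');
```

An equivalent option in firebird_fdw would presumably batch rows from Firebird the same way rather than materializing the whole result set at once.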

Let me know what, if any, log files or error messages I can provide.

ibarwick commented Feb 8, 2024

Thanks for the report. With this kind of issue, the first step to finding the right fix is being able to reproduce it, so what would be very helpful is some indication of the size of "a fairly large-sized firebird instance", or more specifically the approximate size and definition of the tables being queried (field names etc. anonymized if appropriate), and a sample query which triggers the issue.

Other information which would be helpful:

  • firebird_fdw version(s) being used (and how installed, e.g. from package or source)
  • PostgreSQL version(s) firebird_fdw is running on
  • Firebird version(s) being queried
  • sample PostgreSQL log file(s) output from around the time the issue occurs (redacted/anonymized if appropriate).

Feel free to send me anything you don't want to attach here to: barwick [a] gmail.com.

ibarwick self-assigned this Feb 8, 2024
ablomeke commented Feb 8, 2024 via email


surfcode commented Mar 30, 2024

The important thing is that the result set returned is larger than the total memory capacity of the server (plus swap space). Once it hits that limit, oomkiller terminates the process and brings the whole DB down.

I experienced the exact same problem. Today I had the need to query for an additional year of data which naturally results in more rows getting returned from firebird to postgres. This resulted in a crash of postgres and indeed, the database entered recovery mode.

Increasing the server memory (from 1 GB to 2 GB on a DO Droplet in this case) solved the problem, so I'm not bothered too much, but if I can help make this awesome wrapper more resilient, this is the info I gathered:

dmesg | grep -A2 Kill
Out of memory: Killed process 370357 (postgres) total-vm:2821968kB, anon-rss:690616kB, file-rss:324kB, shmem-rss:104kB, UID:113 pgtables:5008kB oom_score_adj:0
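As a host-level mitigation (independent of firebird_fdw, and described in the PostgreSQL documentation on Linux memory overcommit), disabling overcommit makes a runaway allocation fail with an out-of-memory error inside the backend, aborting that query, instead of inviting the OOM killer to terminate a process and take the whole cluster into recovery. A sketch, with values to be tuned per host:

```
# Disable memory overcommit so an oversized allocation returns an
# error to the requesting backend rather than triggering oomkiller.
sysctl -w vm.overcommit_memory=2

# Persist the setting across reboots:
echo 'vm.overcommit_memory = 2' >> /etc/sysctl.d/99-postgres.conf
```

This doesn't fix the underlying memory growth, but it turns a full database crash into a single failed query.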

  • firebird_fdw
    libfq-0.5.0 and firebird_fdw-1.3.0, installed from source
  • postgres version firebird_fdw is running on
    15.4
  • firebird version being queried
    2.5.9
  • sample postgres log file
    emailed to you :-)
