Why do I get 'GC overhead limit exceeded' errors with Redshift?

David Registro shared this question 7 years ago
Answered

I'm using Redshift as my data source, and built some views off fairly large tables. But I constantly get 'GC overhead limit exceeded' when running my reports.

Yes I am returning a lot of data, but I'm using a fairly beefy server with a lot of memory allocated to Java.

Replies (1)

This is something that may have impacted quite a few users in the past, but should no longer be much of a problem.

In the past there was no dedicated Redshift driver, so you would connect using the Postgres JDBC driver, which worked.

However, in this scenario the JDBC driver would fetch the entire result set from your database before passing it to Yellowfin, so if you were reporting off a large data set there was a good chance you would run out of memory and thrash your CPU. You would have noticed the same thing in any other reporting tool that used that same JDBC driver.
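To illustrate what is going on under the hood (a minimal standalone sketch, not Yellowfin's actual connection code; the endpoint, credentials and table name are placeholders): the PostgreSQL JDBC driver only streams rows through a server-side cursor when autocommit is off and a fetch size is set, otherwise it buffers the whole result set in the JVM heap before returning the first row, which is exactly what triggers the GC overhead errors.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical cluster endpoint and credentials, for illustration only.
        String url = "jdbc:postgresql://my-cluster.example.com:5439/mydb";
        try (Connection conn = DriverManager.getConnection(url, "reporting_user", "secret")) {
            // The Postgres driver only uses a server-side cursor when autocommit
            // is off AND a fetch size is set; without both, the entire result set
            // is pulled into the JVM heap before the first row is handed back.
            conn.setAutoCommit(false);
            try (Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(1000); // stream 1000 rows at a time
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM big_fact_table")) {
                    while (rs.next()) {
                        // process each row without holding the full set in memory
                    }
                }
            }
        }
    }
}
```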

However, Amazon has since released a dedicated Redshift JDBC driver, which fetches rows incrementally rather than all at once, sparing your Java memory and CPU.

So to resolve this, please download the latest Redshift 4.1 driver here and ensure that you select it on the data source connection page.
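For reference, here is a minimal sketch of connecting through the dedicated Redshift 4.1 driver class that ships in that jar (the cluster endpoint, database and credentials are placeholders; once the jar is uploaded and selected, Yellowfin handles the connection for you):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class RedshiftDriverCheck {
    public static void main(String[] args) throws Exception {
        // Load the Redshift JDBC 4.1 driver class from the RedshiftJDBC41 jar.
        Class.forName("com.amazon.redshift.jdbc41.Driver");

        // Hypothetical endpoint and credentials, for illustration only.
        Properties props = new Properties();
        props.setProperty("user", "reporting_user");
        props.setProperty("password", "secret");

        String url = "jdbc:redshift://my-cluster.abc123xyz.us-east-1.redshift.amazonaws.com:5439/mydb";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            // Confirms which driver actually handled the connection.
            System.out.println("Connected via: " + conn.getMetaData().getDriverName());
        }
    }
}
```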


If you're unsure how to upload a new driver into Yellowfin, please see the following article: How to add new JDBC drivers to Yellowfin
