Advanced Append Subquery Joins + Row Limits
Idea Logged
Hi,
My understanding is that basic subqueries do not respect the row limit on a report, so that every row in the main query can find its match in the join, i.e. the results of the subquery themselves are not limited.
However, we have seen that with advanced subqueries where the view is based on a different data source, this is not the case. I can imagine that since the application fires off a separate query, it might treat it as another report/master-level query and hence limit the result before joining the two result sets together in memory.
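To illustrate the effect I mean, here is a rough sketch in Python (made-up data and row limit, just to show the behaviour, not Yellowfin's actual internals):

```python
# Rough sketch of the effect (made-up data, not Yellowfin internals).
ROW_LIMIT = 100  # hypothetical report row limit

# Main report rows, and the subquery rows as a separate query might return them
# (in its own order, so its "first 100" rows aren't the same customers).
main_rows = [{"customer_id": i, "sales": i * 10} for i in range(1, 501)]
sub_rows = [{"customer_id": i, "region": "EMEA"} for i in range(500, 0, -1)]

def left_join(left, right):
    """Minimal in-memory left join on customer_id."""
    lookup = {r["customer_id"]: r for r in right}
    return [{**row, **lookup.get(row["customer_id"], {})} for row in left]

# Basic subquery (as I understand it): join against the full subquery result,
# then apply the row limit to the final report.
basic = left_join(main_rows, sub_rows)[:ROW_LIMIT]

# What we seem to see across data sources: the subquery result is limited
# before the in-memory join, so matches go missing.
advanced = left_join(main_rows, sub_rows[:ROW_LIMIT])[:ROW_LIMIT]

print(sum("region" in r for r in basic))     # 100 -> every report row found its match
print(sum("region" in r for r in advanced))  # 0   -> the truncated subquery missed them all
```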
Just wondering if this is intended behaviour or perhaps a bug?
Thanks
Dean
Hi Dean,
I hope you're well.
This is known behaviour for subqueries from different data sources and we have an enhancement open for it. If you like, I can add your support to it to help raise visibility with the developer team.
Let me know.
Kind regards,
Chris
Hi Chris,
Yes, please do.
In this case we managed to get around it by adding the tables from the other data source to the same database as the original source. This was only possible because they were both Snowflake databases. However, we will likely be looking to join up disparate databases in the future, so it's definitely something we would like looked at.
It might also be more important now with the CData connector integrations.
Thanks
Dean
Hi Dean,
Thanks for letting me know your reasoning. If that's everything, I'll mark this as enhancement logged.
Kind regards,
Chris
Sounds good.
Thanks
Hi Dean,
Just a small update on this one from the developer team. At the moment it is low priority, and they recommend continuing with your approach of adding the tables from the other data source to the same database as the original source.
Because of how Yellowfin works, we are unable to run a single query that joins data across two different databases, so we have to run the individual queries, bring all the data back onto the server, and then complete the join in memory. This method puts quite a lot of processing load on the server in a large-volume environment.
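To give a picture of what that looks like, here is a simplified sketch of the general pattern (illustrative only; the connections, SQL and column names are made up and this is not our actual code):

```python
def fetch_all(connection, sql):
    """Run a query and pull the entire result set back onto the server."""
    cur = connection.cursor()
    try:
        cur.execute(sql)
        columns = [col[0] for col in cur.description]
        return [dict(zip(columns, row)) for row in cur.fetchall()]
    finally:
        cur.close()

def cross_source_join(conn_a, conn_b):
    # Each source is queried separately; both full result sets are held in
    # memory before the join, which is where the load comes from on large volumes.
    orders = fetch_all(conn_a, "SELECT order_id, customer_id, amount FROM orders")
    customers = fetch_all(conn_b, "SELECT customer_id, region FROM customers")

    by_customer = {c["customer_id"]: c for c in customers}  # build side of a hash join
    return [
        {**o, **by_customer[o["customer_id"]]}
        for o in orders
        if o["customer_id"] in by_customer
    ]
```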
Our current recommendation is to try to solve this in the data pipeline rather than the BI layer. There are federated query / data virtualisation middleware solutions that can be used to run unified queries across data sources. You could then plug that into Yellowfin as a single data source.
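As an example of that kind of setup, here is a rough sketch using the Trino Python client (Trino is just one example of such middleware, and the host, catalogs, schemas and tables below are placeholders):

```python
# Illustrative only: one federated query across two catalogs via Trino,
# which Yellowfin could then treat as a single data source (e.g. via JDBC).
import trino

conn = trino.dbapi.connect(
    host="trino.example.com",   # placeholder middleware host
    port=8080,
    user="report_user",
    catalog="snowflake_sales",  # placeholder catalog for the first source
    schema="public",
)

sql = """
SELECT o.order_id, o.amount, c.region
FROM snowflake_sales.public.orders AS o
JOIN postgres_crm.public.customers AS c   -- second source, another catalog
  ON o.customer_id = c.customer_id
"""

cur = conn.cursor()
cur.execute(sql)
for row in cur.fetchall():
    print(row)
```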
Kind regards,
Chris