Remove the 'N' from WHERE clause when not NVARCHAR

Tal Mickel shared this idea 1 year ago
Awaiting Reply

Yellowfin treats string values as NVARCHAR when they are used in a WHERE clause. This can cause a performance slowdown.


Can Yellowfin please include the 'N' prefix only when querying NVARCHAR columns?


Here is an example of a query being run against a VARCHAR column.


SELECT DISTINCT
    "First Name", "Last Name"
FROM "Users"
WHERE "Region" IN (N'APAC', N'NA')
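
For comparison, here is a sketch of what the requested behaviour would generate for the same query (table and column names as above; this assumes SQL Server, where mixing an NVARCHAR literal with a VARCHAR column forces an implicit conversion of the column side):

```sql
-- Requested behaviour: plain VARCHAR literals when the column is VARCHAR.
-- With matching types, SQL Server can compare the values directly (and use
-- an index seek on "Region" if an index exists) instead of implicitly
-- converting the VARCHAR column to NVARCHAR to match the N'...' literals.
SELECT DISTINCT
    "First Name", "Last Name"
FROM "Users"
WHERE "Region" IN ('APAC', 'NA')
```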

Comments (10)

Thanks Tal/Alex, I've spoken to a few devs over here and we will go ahead and raise an enhancement request for this.

If there are any updates on this I'll let you know, though please let us know if there was anything else in the meantime.


Regards,

David

Hey David!

Is there any update on this idea?

Hi Dor,


Not as yet, sorry. It's still on the list, so it hasn't been forgotten; it just hasn't been scheduled yet.

Sorry for the vague info.


How many reports do you think are impacted by this, and in general how many slow reports do you have?


Thanks,

David

Hey Dave, sorry for the delay.


It's in all of our reports, about 2,500 of them...

Since it affects the performance of every report, there are no specific slow reports.

Thanks for confirming, Dor. I do know we have had a recent focus on performance improvements, so I would expect some news on this soonish.


Regards,

David

Awesome, David. Thank you!

Hi Dor,

I know it's been a while since this was last brought up, but I believe it was mentioned that this makes reports take a few more seconds to load. Could you provide an example of this occurring, via screenshots I suppose, where a query using the N prefix against a column that isn't actually NVARCHAR takes just a few seconds to execute in SQL Server but significantly longer in Yellowfin? Are we talking 5 seconds vs. 45 seconds, or more like 5 seconds vs. 10 seconds?

Thanks for your feedback.

Regards,

Mike
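
One way to produce the sort of side-by-side numbers being asked for here (a sketch, assuming SQL Server, a table like the one in the original post, and an index on the VARCHAR "Region" column) is to time the two variants directly:

```sql
-- Hypothetical repro: compare the same filter with and without the N prefix.
SET STATISTICS TIME ON;

-- Variant 1: NVARCHAR literals against a VARCHAR column.
-- The column side is implicitly converted to NVARCHAR, which can prevent
-- an index seek and force a scan.
SELECT DISTINCT "First Name", "Last Name"
FROM "Users"
WHERE "Region" IN (N'APAC', N'NA');

-- Variant 2: VARCHAR literals; the types match, so a seek is possible.
SELECT DISTINCT "First Name", "Last Name"
FROM "Users"
WHERE "Region" IN ('APAC', 'NA');

SET STATISTICS TIME OFF;
-- Compare the "SQL Server Execution Times" reported for each variant.
```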

Hey Mike,

I've contacted our department's dev team to provide an example of this. I'll update once we have one.

Hey Mike,

I see that our dev team contacted you at one point and presented the problem.

You can see it in ticket number 6013 (not shared here since this is a public topic).


Let me know if that's alright.

Hi Dor,


Yup, I can see that info, though it doesn't seem to outline any big problems.

In the test case you're using 200,000,000 rows of data (which I'm sure is not a normal amount of data).

The differences seem minimal.

With the NVARCHAR conversion: 4,738 ms.

Without the conversion: 783 ms.


So yes, it's roughly 6x slower, but we're talking a few seconds on a data set of that size.

I don't think this would have any real impact for your users unless they're constantly running reports over data sets that large, and even then we're talking a matter of seconds' difference.

Am I missing something here?

Thanks,

David
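
For anyone trying to gauge how widespread this is on their own system, a rough way to spot affected queries (a sketch using standard SQL Server DMVs; running it requires VIEW SERVER STATE permission) is to search cached plans for implicit conversions:

```sql
-- Sketch: list cached plans whose XML mentions CONVERT_IMPLICIT.
-- This is a coarse filter; inspect each plan to confirm the conversion is
-- on a VARCHAR column being compared against N'...' literals.
SELECT TOP (20)
    st.text       AS query_text,
    qp.query_plan AS plan_xml
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE CAST(qp.query_plan AS NVARCHAR(MAX)) LIKE N'%CONVERT_IMPLICIT%';
```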