I’m having performance issues applying a custom PaletteProvider to an XyScatterRenderableSeries.
It takes forever to show just 10,000 points! I am really surprised, given that without custom coloring I can render more than 50M points.
As far as I know, I’m not doing anything fancy. The PaletteProvider contains a simple transfer function that colors each PointMarker based on a Z value, which is stored in an indexed collection I can access directly from the PaletteProvider.
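For context, the transfer function is along these lines — a Python sketch of the logic only (the real code is C# inside the PaletteProvider, and the ramp endpoints and colour scheme here are made up):

```python
def z_to_argb(z, z_min, z_max):
    """Map a Z value onto a blue-to-red ramp, returned as a 32-bit ARGB int.

    Illustrative only: the real implementation lives inside a SciChart
    PaletteProvider; z_min/z_max and the ramp choice are hypothetical.
    """
    # Normalise z into [0, 1], clamping out-of-range values.
    t = min(max((z - z_min) / (z_max - z_min), 0.0), 1.0)
    r = int(255 * t)        # red channel grows with z
    b = int(255 * (1 - t))  # blue channel shrinks with z
    return (0xFF << 24) | (r << 16) | (0 << 8) | b  # opaque ARGB

# Endpoints of the ramp:
print(hex(z_to_argb(0.0, 0.0, 10.0)))   # pure blue
print(hex(z_to_argb(10.0, 0.0, 10.0)))  # pure red
```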
I’m using a SquarePointMarker as the marker, and I’ve also tried setting DirectXHelper.TryApplyDirectXRenderer to true.
Is this a known performance issue? Is there some way to bypass it?
We’re expecting to be able to draw about 10M colored points.
Update: Feb 2018 / SciChart v5.1
In SciChart v5.1 we have implemented a new scatter series type which can handle big data sets in scatter charts with a PaletteProvider.
Find out more on the ExtremeScatterRenderableSeries page.
It really depends on how many colour changes you have in your dataset. Every time the colour changes, we flush the render pipeline and create a new cached marker for drawing.
The worst-case scenario is changing to a new random colour at every point. This will yield very poor performance.
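To make that concrete, here is a small model — not SciChart internals, just an assumption that cost tracks the number of colour changes — counting colour changes along a point sequence as a proxy for pipeline flushes:

```python
import random


def count_flushes(colors):
    """Count colour *changes* along a point sequence.

    A stand-in for the render-pipeline flushes described above: each time
    the colour differs from the previous point's, one flush is counted.
    """
    flushes = 0
    prev = None
    for c in colors:
        if c != prev:
            flushes += 1
            prev = c
    return flushes


random.seed(0)
n = 10_000
random_colors = [random.randrange(1 << 24) for _ in range(n)]  # worst case
two_tone = [("red" if i % 2 else "green") for i in range(n)]   # alternating
smooth = [i // 1000 for i in range(n)]                         # 10 long runs

print(count_flushes(random_colors))  # nearly n: a flush per point
print(count_flushes(two_tone))       # also n: alternating still changes every point
print(count_flushes(smooth))         # 10: long same-colour runs are cheap
```

Note that alternating between just two colours is as expensive as random colours in this model — it is the *changes* that cost, not the number of distinct colours alone.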
If you are changing colour at every point but only between a few fixed values, e.g. Red/Green/Red/Green, then there may be something we can do for you, e.g. advise on a custom series.
Finally, what we’ve built is for the general case, and there are always edge cases where performance is poor. In most cases, however, it is usually possible to create a workaround for a specific case. Let me know more about your requirements and I’ll give you some ideas.
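One generic idea, sketched under the assumption above that cost tracks colour changes (this is a workaround concept, not a SciChart API): quantise the Z-to-colour ramp into a few discrete levels, so that smoothly varying data produces long runs of identical colour instead of changing at nearly every point.

```python
def quantize(z, z_min, z_max, levels=8):
    """Snap z onto one of `levels` discrete palette buckets.

    Fewer distinct colours along smooth data means fewer colour changes,
    and, in the model above, fewer pipeline flushes.
    """
    t = min(max((z - z_min) / (z_max - z_min), 0.0), 1.0)
    return min(int(t * levels), levels - 1)  # bucket index 0..levels-1


def runs(seq):
    """Number of same-value runs (= 1 + number of value changes)."""
    return 1 + sum(a != b for a, b in zip(seq, seq[1:]))


# Smoothly varying Z over 10,000 points:
zs = [i / 10_000 for i in range(10_000)]
full_res = [round(z * 255) for z in zs]       # 256 distinct colour values
coarse = [quantize(z, 0.0, 1.0) for z in zs]  # only 8 distinct buckets

print(runs(full_res))  # 256 runs: a colour change roughly every 40 points
print(runs(coarse))    # 8 runs: one change per bucket boundary
```

Whether 8 (or 16, or 64) levels is visually acceptable depends on the data, but it trades colour resolution directly for fewer flushes.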
Unfortunately, the data sets I’m receiving as examples don’t follow any pattern.
If I set a DataSeries with sorted X values but unsorted Z values, the application even crashes.
If I set a DataSeries with unsorted X values but sorted Z values, everything works fine and is pretty fast, but that is only 150,000 points, so I don’t think it will scale to more data.
Can you think of anything I could try for the first scenario?