I would like to stop the user from loading datasets so large that the chart becomes confusing when zoomed out. I thought we could use the None resampling mode, but that would make the chart load every data point, which I think could hurt performance. No matter the resampling mode, eventually the dataset will be so large that zooming out produces a confusing image.
What is the best practice for this type of situation? I was thinking we could limit the zoom based on data size. Is there a better approach?
SciChart ResamplingModes are intended to be lossless, which is why you get a cluttered image when zooming out on a really large dataset: it's drawing as if all the data is there!
For more information, please see How does SciChart handle the rendering of a high density of data points?
If you want to smooth the data you will need to filter it yourself to remove data-points. This isn’t something we do within SciChart itself I’m afraid.
I can recommend a technique such as listening to XAxis.VisibleRangeChanged events and loading new data as the user zooms in or out. A simple filtering algorithm such as Douglas–Peucker line simplification can give you a nice smooth line for use when zoomed out.
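To illustrate the simplification step, here is a minimal sketch of the Douglas–Peucker algorithm in Python (you would port this to whatever language your SciChart application uses; the function names and the `epsilon` tolerance parameter are my own, not part of SciChart):

```python
import math

def perpendicular_distance(point, start, end):
    """Distance from `point` to the line through `start` and `end`."""
    (x, y), (x1, y1), (x2, y2) = point, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:  # degenerate segment: fall back to point distance
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Simplify a polyline: drop points closer than `epsilon` to the
    chord joining the endpoints, recursing around the farthest point."""
    if len(points) < 3:
        return list(points)
    # Find the intermediate point farthest from the chord.
    max_dist, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > max_dist:
            max_dist, index = d, i
    if max_dist > epsilon:
        # Keep that point; recurse on both halves and merge
        # (drop the duplicated split point from the left half).
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right
    # Everything in between is within tolerance: keep only the endpoints.
    return [points[0], points[-1]]
```

A larger `epsilon` removes more points, so one approach is to tie `epsilon` to the current visible range width: zoomed far out, you simplify aggressively; zoomed in, you reload the raw points for that range.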
Hope this helps,