We are building an application that receives data at a rate of 264 samples per second. The volume of data we are trying to plot is too large to store as doubles, so we are using floats instead. As soon as a second or two of data is appended to the chart's data series, the application freezes and I get the following message in the debug output window of Visual Studio.
SciChartSurface didn’t render, because an exception was thrown:
Message: Array dimensions exceeded supported range.
Stack Trace:
at System.Collections.Generic.List`1.set_Capacity(Int32 value)
at System.Collections.Generic.List`1.EnsureCapacity(Int32 min)
at System.Collections.Generic.List`1.Add(T item)
at [obfuscated]([obfuscated]`1 tickRange, Double delta, Double majorDelta, IList`1 results)
at [obfuscated]([obfuscated]`1 tickRange, IAxisDelta`1 tickDelta)
at [obfuscated]`1.SciChart.Charting.Numerics.TickProviders.ITickProvider.GetMajorTicks(IAxisParams axis)
at SciChart.Charting.Visuals.Axes.AxisBase.OnDraw(IRenderContext2D renderContext, IRenderPassData renderPassData)
at vme.scn(ISciChartSurface cbn, IRenderContext2D cbo)
at vme.RenderLoop(IRenderContext2D renderContext)
How can we get this data to render on the chart without crashing? Ideally we would like to use the FIFO feature of SciChart as well, but at this point we would settle for just getting the chart to render.
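For scale, here is a back-of-the-envelope sketch of why a FIFO capacity bounds the problem. This is plain arithmetic, not SciChart API; the 8-byte X / 4-byte Y layout is an assumption based on the description above.

```csharp
using System;

class FifoFootprint
{
    static void Main()
    {
        const int samplesPerSecond = 264;
        const int bytesPerSample = 8 + 4;   // assumed: 8-byte double X + 4-byte float Y

        // Unbounded series: memory grows linearly with run time.
        long bytesPerHour = (long)samplesPerSecond * 3600 * bytesPerSample;
        Console.WriteLine($"Unbounded: {bytesPerHour / (1024.0 * 1024.0):F1} MB per hour");

        // FIFO series: memory is capped at the chosen capacity,
        // e.g. a 10-minute sliding window.
        int fifoCapacity = samplesPerSecond * 60 * 10;
        long fifoBytes = (long)fifoCapacity * bytesPerSample;
        Console.WriteLine($"FIFO ({fifoCapacity} points): {fifoBytes / 1024.0:F0} KB, fixed");
    }
}
```

The point of the sketch: at 264 Hz the raw data itself is tiny, so a FIFO window keeps the series size constant and the freeze is more likely caused by something else (such as the axis tick computation in the stack trace above) than by sheer data volume.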
I am using UniformHeatmapDataSeries to plot a heat map in WPF/C#. However, the data is very large, and as a result an OutOfMemoryException is thrown while populating the first parameter of this data series, which is a two-dimensional generic array, TZ[,].
Per the constructor definition:
public UniformHeatmapDataSeries(TZ[,] zValues, TX xStart, TX xStep, TY yStart, TY yStep, IPointMetadata[,] metadata = null);
Here I am filling this array with double[,] data.
The error occurs because the double[,] array holds so much data that it exceeds the maximum supported size of a two-dimensional double array. Please suggest whether I can replace double[,] with another data type that allows larger data.
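For reference, the hard ceiling here usually comes from .NET rather than SciChart: by default no single object, including a double[,], may exceed roughly 2 GB (on 64-bit targets this can be relaxed with the `<gcAllowVeryLargeObjects>` element in app.config). Since TZ is generic in the constructor shown above, switching to float[,] halves the per-cell cost. A sketch of the resulting limits:

```csharp
using System;

class HeatmapLimits
{
    static void Main()
    {
        // Default .NET limit: a single object may not exceed ~2 GB
        // (ignoring small per-object overheads for this estimate).
        const long maxObjectBytes = int.MaxValue;

        long maxDoubleCells = maxObjectBytes / sizeof(double);  // 8 bytes per cell
        long maxFloatCells  = maxObjectBytes / sizeof(float);   // 4 bytes per cell

        Console.WriteLine($"double[,]: ~{maxDoubleCells:N0} cells, " +
                          $"~{(long)Math.Sqrt(maxDoubleCells):N0} per side if square");
        Console.WriteLine($"float[,]:  ~{maxFloatCells:N0} cells, " +
                          $"~{(long)Math.Sqrt(maxFloatCells):N0} per side if square");
    }
}
```

In other words, float[,] roughly doubles the number of cells you can fit in one array; beyond that, the options are downsampling the heatmap before plotting or (64-bit only) enabling very large objects.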
The issue I’m having is that I seem to run out of memory for 3D charts beyond a certain data size. Either the chart never renders at all, or occasionally the entire application crashes (due to out-of-bounds memory access in the native DLLs).
I am using a SurfaceMeshRenderableSeries3D with a UniformGridDataSeries3D<int, float, int>. Smaller sizes such as 320×240 (X,Z) render fine, but larger sizes such as 640×480 (X,Z) do not.
This issue only seems to happen when compiling to 32-bit. I have seen the other questions regarding how AnyCPU compiles to 32-bit. Some of our customers have to use 32-bit, so I would like to know what the expected memory footprint is, to hopefully work around it.
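For what it's worth, here is a rough, hypothetical footprint estimate. The grid data itself is small; the real cost is the mesh the native renderer builds from it, and a 32-bit process has only ~2 GB of user address space that must also provide *contiguous* blocks for each buffer, so fragmentation bites long before the total is exhausted. The per-vertex layout below is an assumption for illustration, not SciChart's actual internal format.

```csharp
using System;

class MeshFootprint
{
    static void EstimateMesh(int xSize, int zSize)
    {
        long cells = (long)xSize * zSize;
        long heightBytes = cells * sizeof(float);   // raw UniformGridDataSeries3D heights

        // Assumption: one vertex per cell with position + normal + color
        // (~28 bytes), plus two triangles per quad at 3 four-byte indices each.
        long vertexBytes = cells * 28;
        long indexBytes  = (long)(xSize - 1) * (zSize - 1) * 6 * 4;

        Console.WriteLine($"{xSize}x{zSize}: heights {heightBytes / 1024} KB, " +
                          $"mesh >= {(vertexBytes + indexBytes) / (1024 * 1024.0):F1} MB");
    }

    static void Main()
    {
        EstimateMesh(320, 240);   // renders fine per the report
        EstimateMesh(640, 480);   // fails in a 32-bit process per the report
    }
}
```

Even under generous assumptions the 640×480 mesh is only tens of megabytes, which suggests the failure is about contiguous native allocations in a fragmented 32-bit address space rather than total memory used.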
We create a surface in code and use ExportToFile() to embed it in a printable document.
All worked well and fast.
But after changing the region settings from en-US to de-DE, this method takes all the memory on the machine, causes out-of-memory issues, and virtually halts the machine.
We tried our code on different machines and the behavior was always the same.
Has anybody had similar experiences? Or, even better, a solution? 🙂
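One hypothesis worth ruling out (not a confirmed cause): de-DE swaps the decimal and group separators relative to en-US, so any code path that formats a number with one culture and parses it back with another can silently produce a value off by orders of magnitude, e.g. an absurdly large export size. A minimal demonstration of the mismatch:

```csharp
using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        double size = 1234.5;

        string enUs = size.ToString(new CultureInfo("en-US"));  // "1234.5"
        string deDe = size.ToString(new CultureInfo("de-DE"));  // "1234,5"
        Console.WriteLine($"en-US: {enUs}, de-DE: {deDe}");

        // Parsing the de-DE string with the invariant culture treats the
        // comma as a group separator, turning 1234.5 into 12345.
        double misread = double.Parse(deDe, NumberStyles.Any, CultureInfo.InvariantCulture);
        Console.WriteLine($"Misread as: {misread}");
    }
}
```

If the export path computes a bitmap size this way, a 10× (or worse) inflation of one dimension would explain the memory exhaustion only appearing after the region change.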
I constantly get an OutOfMemoryException when I set IChartSeriesViewModel.DataSeries.FifoCapacity and there are four or more charts on a single SciChartSurface.
The line that throws the exception is:
viewModel.DataSeries.FifoCapacity = 1000000;
where viewModel is a newly created instance of the ChartSeriesViewModel class.
What could the cause be?
Exception stack trace is
at [obfuscated]`1..ctor(Int32 )
at Abt.Controls.SciChart.Model.DataSeries.XyDataSeries`2.ClearColumns()
at Abt.Controls.SciChart.Model.DataSeries.DataSeries`2.Clear()
at Abt.Controls.SciChart.Model.DataSeries.DataSeries`2.set_FifoCapacity(Nullable`1 value)
P.S. I have no option to target x64. My SciChart version is 3.1, but the same problem occurs with 3.22.
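For context, a rough size estimate, assuming an XyDataSeries of doubles (the exact internals vary by SciChart version): each FIFO series needs contiguous X and Y buffers, and in a .NET process any allocation over 85,000 bytes lands on the Large Object Heap, which is prone to fragmentation, especially in a 32-bit address space.

```csharp
using System;

class FifoEstimate
{
    static void Main()
    {
        const int fifoCapacity = 1_000_000;
        const int seriesCount = 4;
        const int bytesPerPoint = 8 + 8;   // assumed: double X + double Y

        long perSeries = (long)fifoCapacity * bytesPerPoint;
        long total = perSeries * seriesCount;

        Console.WriteLine($"Per series: {perSeries / (1024.0 * 1024):F0} MB");
        Console.WriteLine($"All {seriesCount} series: {total / (1024.0 * 1024):F0} MB in large contiguous buffers");
        // The absolute total is modest, but each buffer must be found as one
        // contiguous free block; repeated Clear()/resize cycles fragment the
        // 2 GB space of a 32-bit process quickly.
    }
}
```

So even though ~60 MB of data sounds harmless, four one-million-point FIFO allocations on the Large Object Heap of a 32-bit process can fail once the address space is fragmented, which matches the stack trace ending in set_FifoCapacity → Clear().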
I am adding charts with 10 million points.
With 40 million points the chart throws an OutOfMemoryException.
Is there a limit on the number of points in SciChart? Or should I configure something?
Why is there such a difference when the X data type is DateTime?
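For a rough sense of scale (plain arithmetic; SciChart's internal storage may differ): a DateTime occupies 8 bytes, the same as a double, so the raw X+Y data grows linearly with point count, and growing a list-backed buffer can transiently require both the old and the new backing arrays at once.

```csharp
using System;

class PointBudget
{
    static void Report(long points)
    {
        const int bytesPerPoint = 8 + 8;   // assumed: 8-byte DateTime/double X + 8-byte double Y
        long dataBytes = points * bytesPerPoint;

        // A growing List<T>-style buffer doubles its backing array, so a
        // resize can briefly need old + new buffers (up to ~2x) during the copy.
        long resizePeak = dataBytes * 2;

        Console.WriteLine($"{points:N0} points: {dataBytes / (1024 * 1024)} MB data, " +
                          $"up to ~{resizePeak / (1024 * 1024)} MB during a resize");
    }

    static void Main()
    {
        Report(10_000_000);   // renders per the report
        Report(40_000_000);   // throws OutOfMemoryException per the report
    }
}
```

At 40 million points the transient peak during a buffer resize approaches a gigabyte of contiguous allocations, which a 32-bit process in particular cannot satisfy; pre-sizing the series capacity (where the API allows it) or targeting x64 are the usual ways around this.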