FlexPro Forum

Low performance when computing large files

  • #12849
    Anonymous
    Inactive

    Hello,

    We need to process large data files (Dewesoft *.d7d, Nicolet Recording Files *.nrf).

    These files, about 2 GB in size, contain:
    – 6 to 8 channels
    – 2 hours of recording at 5000 values per second
    which gives:
    – 36 million values per channel
    – 288 million values per file
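
    (A quick back-of-the-envelope check of these figures, sketched in Python and assuming 8-byte double-precision samples, which is not stated in the post:)

        # Data volume described above, assuming 8-byte doubles per sample.
        values_per_channel = 5000 * 2 * 3600      # 5000 values/s for 2 h = 36,000,000
        values_per_file = values_per_channel * 8  # 8 channels = 288,000,000
        raw_size_gb = values_per_file * 8 / 1e9   # ~2.3 GB, consistent with the ~2 GB files
        print(values_per_channel, values_per_file, round(raw_size_gb, 1))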

    Currently, we use FlexPro successfully to process smaller files (less than 500 MB).

    With these large files we experience many performance problems: opening files, computing formulas (even simple ones that apply to the whole recording and combine 3 or 6 channels), and drawing charts are all very slow.

    We found many optimization options in the documentation and tried them: they are not sufficient.

    The tests were carried out with:
    – FlexPro 8 Professional
    – computer: Windows XP, 3 GB RAM, Intel Centrino 2 dual-core @ 2.4 GHz

    The most complex calculations (in particular FFT) were not tested.

    Some competing software performs much better than FlexPro on this same computer.

    We plan to test FlexPro 9 Professional on a more powerful machine running Windows 7 64-bit, to find out whether FlexPro is suitable for this task.

    Could you recommend a computer configuration (memory, number of processors/cores, …)?

    Are there other options to explore with FlexPro?

    Is FlexPro suitable?

    Thanks.

    #9345
    Bernhard Kantz
    Participant

    Reading large files takes some time, especially when using slow devices (network, hard drives, etc.). When datasets cannot be held in main memory, they are swapped out to the temporary folder, which lowers performance during processing. For example, if you work with signals of 36 million values and keep the default system setting of 10 megabytes for the maximum size of data sets in memory, the data will not reside in RAM but has to be loaded from disk. In your case this size should be at least 600 MB to avoid swapping. If you want to hold eight signals in memory, you should set the maximum memory allocation for data sets to at least 5 GB. That implies a computer with 64-bit Windows and not less than 8 GB of main memory.

    Starting with FlexPro 9.1 we offer a 64-bit release capable of using more than 2 GB of RAM. The overall performance of course also benefits from fast hard drives. This should boost the performance of computations once the data has been read, but the actual loading of the files may still take some time.
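
    The sizing behind these recommendations can be reproduced with a short sketch (illustrative Python only, not FlexPro code; the 8 bytes per value and the factor-of-two headroom for intermediate copies are assumptions, not documented figures):

        # Rough sizing of the two FlexPro memory settings discussed above.
        # Assumptions: 8-byte double-precision values, ~2x headroom for copies.
        BYTES_PER_VALUE = 8
        HEADROOM = 2.0

        values_per_channel = 2 * 3600 * 5000      # 36 million values per signal
        channels = 8

        per_signal_mb = values_per_channel * BYTES_PER_VALUE * HEADROOM / 1e6
        all_signals_gb = per_signal_mb * channels / 1e3

        print(f"Maximum size of data sets in memory: >= {per_signal_mb:.0f} MB")      # ~576 MB, rounded up to 600 MB
        print(f"Maximum memory allocation for data sets: >= {all_signals_gb:.1f} GB") # ~4.6 GB, rounded up to 5 GB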

    #9347
    HerveM1234
    Participant

    Hi,

    With 64-bit FlexPro 9 running on Windows 7 64-bit, I didn't notice any difference compared to the 32-bit version.
    On an 8-core system (i7-3770), FlexPro uses only 4 of the 8 cores and only 12% of the CPU!!!
    The memory used is always less than 4 GB, and it takes a while to calculate formulas.

    Thanks !

    #9350
    Bernhard Kantz
    Participant

    The main improvement of the 64-bit version is the capability to use more than 2 gigabytes of main memory for datasets. Adjusting the settings Maximum memory allocation for data sets and Maximum size of data sets in memory to the available amount of main memory allows FlexPro to hold large datasets in memory instead of swapping them out to disk. These settings can be modified on the System Settings tab of the Options dialog in the Tools menu.
    On the same page, the parallel update feature can be activated in the Professional edition. Currently, parallel update is used mainly for the construction of presentation objects such as diagrams, tables and documents. For FlexPro 10 it is planned to enhance the FPScript language, for example with a parallel for-loop, to improve the performance of formula evaluation.
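
    As a purely conceptual illustration (Python, not FPScript; the chunking scheme below is an assumption and not how FlexPro's parallel update works internally), a parallel loop over chunks of a large channel would look roughly like this:

        # Conceptual sketch: evaluate an element-wise formula on a large
        # channel in parallel chunks. Illustrative only - not FPScript.
        from concurrent.futures import ProcessPoolExecutor

        import numpy as np

        def evaluate_chunk(chunk: np.ndarray) -> np.ndarray:
            # Stand-in for a simple element-wise formula.
            return np.sqrt(np.abs(chunk)) * 2.0

        def evaluate_parallel(signal: np.ndarray, workers: int = 8) -> np.ndarray:
            # Split the channel into equal chunks and process them in parallel.
            chunks = np.array_split(signal, workers)
            with ProcessPoolExecutor(max_workers=workers) as pool:
                results = pool.map(evaluate_chunk, chunks)
            return np.concatenate(list(results))

        if __name__ == "__main__":
            signal = np.random.rand(36_000_000)   # one 36-million-value channel
            result = evaluate_parallel(signal)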

    #9352
    HerveM1234
    Participant

    OK for that.
    I still don't understand why the CPU is not fully used!
    Thanks

    Sorry: wrong picture above!

    #9353
    Bernhard Kantz
    Participant

    Since parallelization is currently used only for the creation of presentation objects, the largest gains will be seen when documents containing multiple diagrams with curves based on large datasets are created.
