I tried a .txt file and got around 460k rows imported. What about the maximum number of columns?
For importing data, there is no limit on the number of rows or columns.
The practical limit on the amount of data has more to do with how much your computer can hold in memory, which works out to roughly 2.7 billion values. A column with that many numbers would take quite a few gigabytes, so it will slow most computers down!
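To put the "quite a few gigabytes" in perspective, here is a back-of-the-envelope estimate, assuming each value is stored as a double-precision (64-bit) float; the storage format is an assumption, not something stated in the thread:

```python
# Rough memory estimate for a single numeric column of ~2.7 billion values.
ROWS = 2_700_000_000        # the in-memory ceiling mentioned above
BYTES_PER_VALUE = 8         # assuming 64-bit (double-precision) floats

gigabytes = ROWS * BYTES_PER_VALUE / 1024**3
print(f"{gigabytes:.1f} GB")  # about 20 GB for one full column
```

That is well beyond the 8 GB of RAM in the laptop tested below, which is why the practical limit is much lower than the hard one.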
For the Expression column, there is a limit of 200 million rows. This is something we could raise, but no one has asked yet 🙂
To get a sense of the practical limit for calculations, I tested a MacBook Air (13 inch), 1.6 GHz, and 8GB of memory. This computer easily performed calculations for 10 million rows of data, and plotted the results.
Going to 100 million was possible, but it took a few moments to compute and the program was a bit sluggish.
A lot of columns can also slow the program down, but again there is no limit.
If you have over 100 columns, try placing them in groups and closing each group when you are not using that data. That will help the program run faster. This is also one of the reasons we encourage a flattened format for data: the program tends to run faster and be more efficient with more rows and fewer columns.
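As a rough illustration of what "flattened" means here, this pandas sketch reshapes a wide table (one column per sensor) into a long one (more rows, fewer columns); pandas and the column names are my assumptions for the example, not part of DataGraph:

```python
import pandas as pd

# Wide format: one column per sensor, few rows.
wide = pd.DataFrame({
    "time":    [0, 1, 2],
    "sensorA": [1.0, 1.1, 1.2],
    "sensorB": [2.0, 2.1, 2.2],
    "sensorC": [3.0, 3.1, 3.2],
})

# Flattened (long) format: one value per row, identified by a "sensor" label.
flat = wide.melt(id_vars="time", var_name="sensor", value_name="value")
print(wide.shape, flat.shape)  # (3, 4) becomes (9, 3)
```

The same data goes from 4 columns down to 3, and the row count grows instead, which is the shape the program handles best.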
Thanks. I have tried a dataset with 1.3 million rows and 60 columns, with a very high time resolution, and only got around 460k rows imported. Do you have any idea why the file is only partially read?
What is the file format? We have an outstanding issue with netCDF files not importing completely.
A few weeks ago I raised the netCDF issue on the forum as well; sorry for repeating the work of surveying and reporting the DataGraph limits. The current file is a .txt file. I cannot import it into Excel either, since Excel can only handle about 1.04 million rows and its data connection feature is not easy to use. The file is too big to share here, but here is a screenshot of the imported file. I can upload it to a cloud drive and share the link with you if you like. Can you spare some time to check it?
That would be great if you could share a link with the file – here or on email.
If this is a .txt file, we should be able to handle it.
I have included you as a recipient. Did you receive the invitation?
Thanks. I shall send the text file later via a Google Drive link in a separate email. Its size is over 900 MB, which is too large to paste here.
I tried your large file using the normal import method and got the same result: only part of the file is imported.
The good news is that I had no problem using the Import Special method.
Here is a screenshot of an import of your data; I even made a couple of graphs to make sure the program was staying responsive.
Import Special has a more advanced parser than the default method. It is also a lot faster for big data sets. You can specify exactly which columns you want to import and even add conditions on importing.
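Import Special is a built-in DataGraph feature, but if it helps to see the idea, here is a rough pandas analogue of "pick only the columns you need and apply a condition while reading"; the file contents, column names, and threshold are all hypothetical:

```python
import io
import pandas as pd

# Tiny stand-in for a large whitespace-delimited .txt file.
raw = io.StringIO(
    "time temp pressure\n"
    "0 20.5 101.3\n"
    "1 21.0 101.1\n"
    "2 19.8 101.4\n"
)

# Read only the needed columns, in chunks, keeping rows that meet a condition.
frames = []
for chunk in pd.read_csv(raw, sep=r"\s+", usecols=["time", "temp"], chunksize=2):
    frames.append(chunk[chunk["temp"] > 20.0])
subset = pd.concat(frames, ignore_index=True)
print(subset.shape)  # (2, 2): two matching rows, two selected columns
```

Reading in chunks and dropping unneeded columns up front is also why this style of import stays fast on multi-hundred-megabyte files.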
FYI, here is a demo of Import Special: