DataGraph Forums › Technical Support › Support Desk › File size restrictions?
Hi there,
Should DataGraph be able to handle 1,660 columns of 5,300 rows each (MacBook Pro, 10.14, 32 GB RAM)? The file (.csv) is 93 MB.
Anyway, DataGraph hangs (spinning wheel of death, “not responding”), and after a force quit even clicking “do not reopen windows” does not have the desired effect – DataGraph starts loading the file again (even after a reinstall and a reboot).
Thanks for any ideas.
Best
Achim
Because each column gets its own UI element, the 1,660-column count could be the issue; 5,300 rows is not a problem at all. It shouldn’t crash, but it certainly won’t be fast. The recommended approach is to flatten the data (R calls this melting – https://www.r-bloggers.com/melt/); DataGraph can handle 9 million rows.
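For illustration, here is the same wide-to-long reshape sketched in pandas (toy column names of my choosing; the melt link above describes the R version):

```python
import pandas as pd

# Toy "wide" table: one row per time step, one column per measured series.
wide = pd.DataFrame({
    "time": [0, 1, 2],
    "sensor_a": [1.0, 1.1, 1.2],
    "sensor_b": [2.0, 2.1, 2.2],
})

# Flatten to "long" form: one row per (time, series) pair.
long_form = wide.melt(id_vars="time", var_name="series", value_name="value")
# 3 time steps x 2 series -> 6 rows, and always just 3 columns,
# no matter how many series columns the wide table had.
print(long_form.shape)  # (6, 3)
```

The trade-off is exactly what comes up later in this thread: the row count multiplies (and the id values are repeated on every row), but the column count collapses to a handful, which is the shape DataGraph handles well.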
To deal with the auto-save issue you need to dig into your Library folder.
In ~/Library/Autosave Information you will see any unsaved DataGraph files.
In ~/Library/Saved Application State/com.visualdatatools.datagraph.savedState
the system stores the application state that it tries to restore on launch. The Mac App Store version doesn’t have access to those locations, so the system creates a Container for the application and stores everything in
~/Library/Containers/com.visualdatatools.datagraph/Data
There you will see the part of your file system that is visible to the application, and inside it a Library folder with the same structure.
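If DataGraph keeps restoring the hung document on every launch, quitting the app and clearing that saved state usually breaks the loop. A minimal sketch for the direct-download build, assuming the default paths above (back up anything you still need first):

```shell
# Quit (or force quit) DataGraph before touching these files.
# Remove the window/restore state the system replays on launch:
rm -rf ~/Library/"Saved Application State"/com.visualdatatools.datagraph.savedState

# Check for unsaved documents that autosave may still be holding:
ls ~/Library/"Autosave Information" 2>/dev/null || echo "no autosave folder"
```

For the Mac App Store build, the same folders live inside the Container path mentioned above instead.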
If you can share the file, I would like to use it to stress-test DataGraph. I can certainly create a file with that many rows and columns, but maybe something else is causing the problems. Version 4.5 is able to display a table of that size without any difficulty; earlier versions will have problems with that many columns.
One more thing. DataGraph can flatten data once it has been imported – How to ‘flatten’ Data
Hi David,
it’s been a while! I can send you the file, no problem … via which channel?
I don’t seem to have the “Autosave” folder. I do have the Saved Application State/com.* folder, which contains four files, but removing these didn’t solve the problem. I don’t have the Mac App Store version, and I am using 4.5.
Probably it is best that I extract the columns of interest, for now.
Thanks.
Best
Achim
… I did
> library(readr)
> library(reshape2)  # melt() lives here, not in readr
> mov_wGSHP_lo_ofd_onn_onof_vo2_20_rt0_STD_as_is_210_test_win <- read_csv("mov_wGSHP_lo_ofd-onn_onof_vo2_20_rt0_STD_as-is_210_test_win.csv")
> melt_mov <- melt(mov_wGSHP_lo_ofd_onn_onof_vo2_20_rt0_STD_as_is_210_test_win)
> write.csv(melt_mov, file = "mov_wGSHP_lo_ofd_onn_onof_vo2_20_rt0_STD_as_is_210_test_win_melt")
which gives me a 555 MB file instead of 93. I haven’t tried importing it yet. The file has 8.36 million rows, it seems …
You should import it using the Import Special method, which is a lot faster than the standard import; a file of this size is not a problem.
The file is larger because the identifying data is replicated on every row. Import Special has no problem previewing the file.
For sharing the file, the easiest method is Dropbox or Google Drive.
The data file: https://drive.switch.ch/index.php/s/FNijXgOADNfXHpZ/download.