That's too much data for most desktop software to handle, not just ESB software. The computer resources used just to load all of that data each time you start the software will result in wonky behavior and will generally be a nightmare to deal with.
I put ESB 3 through some serious CSV import stress tests in the past and "millions" is not happening (even if each row has little data).
That said, it does a pretty good job of handling up to around 1 million rows in a single software build (even with several columns of data), but only with the data split across multiple grid objects: for 1 million rows of imported data, for example, you would need to create 10 or more grid objects, with a maximum of maybe 100,000 rows imported per grid object.
To import faster, you would need to split the "millions" CSV file into multiple smaller CSV files that you import one at a time. I've actually done this and have tried importing many different sizes of CSVs, and importing 3,000 - 5,000 rows at a time is the fastest - unless the rows themselves hold a crazy amount of data, in which case the import will still be slow.
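There's nothing ESB-specific about the splitting step - here's a rough sketch of the kind of script I mean, in Python. The file name "big_file.csv" and the 5,000-row chunk size are just placeholders to adjust for your own data.

```python
import csv

CHUNK_ROWS = 5000  # sweet spot in my tests was 3,000 - 5,000 rows

# Split one big CSV into numbered chunk files of at most CHUNK_ROWS
# data rows each, repeating the header row in every chunk so each
# file can be imported on its own.
with open("big_file.csv", newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)              # keep the header for every chunk
    out, writer, chunk_num = None, None, 0
    for i, row in enumerate(reader):
        if i % CHUNK_ROWS == 0:        # time to start a new chunk file
            if out:
                out.close()
            chunk_num += 1
            out = open(f"chunk_{chunk_num:03}.csv", "w",
                       newline="", encoding="utf-8")
            writer = csv.writer(out)
            writer.writerow(header)
        writer.writerow(row)
    if out:
        out.close()
```

Then you just import chunk_001.csv, chunk_002.csv, and so on, one at a time.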
For this to be set up correctly so that it's actually useful, you would need the CSV files split and categorized in some meaningful way, and then create your grid objects around those categories. After that, do all of your importing into each grid object and create the build.
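If your data has a column you can group on, the categorizing can be scripted the same way. A minimal sketch, assuming a made-up column named "category" (swap in whatever grouping your data actually has; note the category values are used as file names here, so they need to be file-name-safe):

```python
import csv

# Stream the big CSV once and route each row to a per-category file,
# so each grid object can get its own CSV. "big_file.csv" and the
# "category" column are placeholders for your real file and column.
writers = {}  # category -> (file handle, csv writer)

with open("big_file.csv", newline="", encoding="utf-8") as src:
    reader = csv.DictReader(src)
    for row in reader:
        cat = row["category"]
        if cat not in writers:
            out = open(f"{cat}.csv", "w", newline="", encoding="utf-8")
            w = csv.DictWriter(out, fieldnames=reader.fieldnames)
            w.writeheader()
            writers[cat] = (out, w)
        writers[cat][1].writerow(row)

for out, _ in writers.values():
    out.close()
```

If any one category still ends up too big, you can run it through the 5,000-row splitter above afterwards.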
Bottom line: what you want to do won't be easy. It will take a lot of effort to import the data, and you'll need to create the software build a certain way for it to work acceptably - and that's just for roughly 1 million rows of data maximum per software build, with no excessive data per row in the CSV files you're importing. Otherwise it will be even tougher to accomplish, and it will still be a strain on resources.
Mel