First of all, I am aware that Stack Overflow is probably not the ideal forum for this question. I asked my professor, he told me to ask Bing AI, and Bing AI told me to ask it here, so if there is some other site or forum that would be better, please let me know in a comment or an answer.
I am on a team of grad students working on the capstone project for our MS in Data Analytics Engineering program at George Mason University. We are required to use a team Amazon Web Services account provided for us, and since setting up a database instance with AWS's RDS service, I have experienced extremely slow execution speeds. When I say extremely, I mean hours for the Table Data Import Wizard to import a 639 KB CSV file!
I have already written the two SQL scripts we need using the local MySQL Server and Workbench installation on my laptop, so I know everything works. Now all I need is a connection between the MySQL Workbench application on my laptop and an RDS database instance that executes at a reasonable speed. The first instance I created was a Publicly Accessible single RDS DB instance running the MySQL Community engine, initially of size/class db.t3.micro. After connecting it to Workbench, I found the latency of using the Table Data Import Wizard to load our three source datasets, Airports.csv, Airport_Schedules.csv, and Runway.csv (all three, along with the two MySQL script files, can be found in our team's GitHub repository for this project), was CARTOONISHLY high. So I increased its processing capability by upgrading its class from db.t3.micro (which has only 1 GB of RAM) to db.t3.medium (4 GB of RAM), but this did not improve runtime at all. I also uploaded all of the logs from using this connection to our GitHub repository in a zip file called 'aws-mysql-rds-db-error-running.Logs'.
From there, I continued troubleshooting incrementally, making only one change at a time (this is very important) so that I tested each new change by itself before trying another. My next change was to create a new RDS DB instance, but this time as a Multi-AZ DB cluster instead of a single DB instance (still db.t3.medium, i.e. 4 GB of RAM for each of the three instances created). Unfortunately, once I connected it to MySQL Workbench, it was just as slow as the first DB instance, so I then increased the size of all three instances in the cluster to db.m5d.xlarge (16 GB of RAM), yet the massive latency remained.
At this point, I booted up my backup laptop, installed MySQL Server, Workbench, and the rest of the Developer installation option, then connected to both of these RDS MySQL DB instances from the fresh MySQL Workbench installation on that laptop, and the imports were still just as slow! I am out of ideas. If you don't know the answer yourself, please suggest who else I could ask; I have already asked ChatGPT and Bing's Chat AI (built on ChatGPT) as well, but none of their suggestions have worked either.
Should I instead try creating an Aurora MySQL database instance on RDS and establishing a connection between that and my local MySQL Workbench application?
Yes, MySQL Workbench is cartoonishly bad at importing CSV data. This has been a sore point for years. Bugs have been logged with MySQL (Oracle Corp), but nothing changes.
Forget about using MySQL Workbench for bulk data import. It doesn't work. I don't use MySQL Workbench, and I don't know any professional MySQL developers or DBAs who use it.
Not only is it orders of magnitude slower than it should be, but users have reported that it fails to import all the rows in the input CSV file. Effectively, MySQL Workbench has no viable import feature.
Instead, you can use the command-line tool mysqlimport or the equivalent SQL statement LOAD DATA LOCAL INFILE. These are mature, efficient, and much faster.
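For example, here is a minimal sketch of loading one of your CSV files with LOAD DATA LOCAL INFILE. The table name airports, its column order, and the file path are assumptions for illustration; adjust them, the delimiters, and the header handling to match your actual schema and files. Note that on RDS the server-side local_infile setting (in your DB parameter group) must be enabled, and the mysql client must be started with local_infile allowed (e.g. --local-infile=1).

```sql
-- Assumes the target table already exists and its columns
-- match the order of the columns in Airports.csv.
LOAD DATA LOCAL INFILE '/path/to/Airports.csv'
INTO TABLE airports
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;   -- skip the header row, if the CSV has one
```

mysqlimport is a command-line wrapper around the same LOAD DATA statement; it derives the table name from the file name (minus the extension), and options like --fields-terminated-by and --ignore-lines map to the clauses shown above. Either route loads the whole file in one statement on the server side, which is why it is so much faster than the Workbench wizard's row-by-row inserts.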
You might also like my presentation Load Data Fast! for MySQL.