Can you please tell me if there is a way to import data from a CSV file into a PostgreSQL database subject to conditions?
By a condition I mean the following situation: there are two tables in a Postgres database, and I have to validate each row before adding it to the second table. I need to check that a value in my file matches a value in the first table, and only if it matches should that row be inserted into the second table.
For example, I have two tables, "clothing brand" and "clothing size". The CSV file to be imported into Postgres contains both the clothing brand and its sizes. I need to validate against the tables already created in Postgres: if a clothing brand from the CSV file already exists in the brand table, then I need to add to or modify the "clothing size" table.
I tried searching the internet, but only found examples of importing a complete CSV file into a Postgres table, e.g. "any way to import a csv file to postgresql without creating new tables from scratch?" and "How to import CSV file data into a PostgreSQL table" https://www.youtube.com/watch?v=KKjcwBUxZXY. I couldn't find an example that first checks the existing table data and inserts a specific row only after that check succeeds.
The easiest solution is to import all the data into a temporary staging table with the COPY command, and then use ordinary SQL to filter and insert only the rows that pass your check. It would look something like this:
create temp table temp_sizes (...);
copy temp_sizes from 'sizes.csv' with (format csv);
insert into sizes select * from temp_sizes where brand_name in (select name from brands);
drop table temp_sizes;
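Since you also want to *modify* an existing row when the brand is already known, the final INSERT can be turned into an upsert with Postgres's `ON CONFLICT` clause. This is a sketch only: it assumes `sizes` has columns `brand_name` and `size` with a unique constraint on `brand_name`; adjust the names and the conflict target to your real schema.

```sql
-- Hypothetical schema: sizes(brand_name unique, size).
-- Insert rows whose brand exists in brands; if the brand is
-- already present in sizes, update its size instead.
insert into sizes (brand_name, size)
select brand_name, size
  from temp_sizes
 where brand_name in (select name from brands)
on conflict (brand_name)
do update set size = excluded.size;
```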
Alternatively, you could use any general-purpose programming language to parse the CSV file and issue INSERT statements according to your conditions.
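As a rough illustration of that second approach, here is a minimal Python sketch of the "validate, then insert" loop. The table and column names (`brands`, `sizes`, `brand_name`, `size`) and the sample CSV are made up; sqlite3 stands in for Postgres so the example is self-contained, but with a Postgres driver such as psycopg2 the queries are nearly identical (use `%s` placeholders instead of `?`).

```python
import csv
import io
import sqlite3

# Inline sample data standing in for sizes.csv (assumed layout).
CSV_DATA = """brand_name,size
Acme,M
Acme,L
Unknown,S
"""

# In-memory database standing in for the existing Postgres tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE brands (name TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE sizes (brand_name TEXT, size TEXT)")
conn.execute("INSERT INTO brands (name) VALUES ('Acme')")

# Load the known brands once, so each CSV row is a cheap set lookup
# instead of a round-trip query per row.
known_brands = {row[0] for row in conn.execute("SELECT name FROM brands")}

inserted, skipped = 0, 0
for row in csv.DictReader(io.StringIO(CSV_DATA)):
    if row["brand_name"] in known_brands:
        conn.execute(
            "INSERT INTO sizes (brand_name, size) VALUES (?, ?)",
            (row["brand_name"], row["size"]),
        )
        inserted += 1
    else:
        skipped += 1  # brand not found in brands: reject the row

conn.commit()
print(inserted, skipped)  # prints: 2 1
```

For large files the staging-table approach above will be much faster, since COPY is bulk-optimized; the row-by-row loop is mainly useful when you need per-row logic that is awkward to express in SQL.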