khajlk
I have a table `my_tbl` in my `PostgreSQL 11` database with the following definition:

```
CREATE TABLE my_tbl (
   id serial primary key,
   date date not null,
   hour integer,
   val numeric
);
```

The data from a CSV file `data.csv` in the directory `C:\temp` (Windows 10, x64 machine) was imported into `my_tbl` using the `COPY` command (Windows CLI); a sketch of that command is shown after the sample data. The sample data looks like this:

```
SELECT * FROM my_tbl ORDER BY id LIMIT 5;
```

```
id  date        hour  val
1   2012-01-01  1     12.5
2   2012-01-01  2     11.5
3   2012-01-01  3     9.5
4   2012-01-01  4     8.6
5   2012-01-01  5     7.7
```
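
The import command looked roughly like this (the column list and CSV options shown here are assumptions):

```
COPY my_tbl (date, hour, val) FROM 'C:\temp\data.csv' WITH (FORMAT csv, HEADER true);
```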


The CSV file `data.csv` gets updated daily (say, at 20:00) and is copied to the directory `C:\temp`. Instead of manually importing the updated CSV file, I would like to automate this process. How can I do that? For example, a procedure/trigger (or a pgAgent scheduled job) that keeps an eye on the directory and automatically imports the data into the Postgres database daily, say at 20:15. Can someone help me with this?
Top Answer
Truilus
Put your `COPY` command into a SQL script, then run `psql` from within a batch file. To avoid the password prompt, use [environment variables](https://www.postgresql.org/docs/current/libpq-envars.html) inside the batch file.
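
A minimal `import_data.sql` might look like this (the truncate-and-reload approach and the CSV options are assumptions; `\copy` is used instead of server-side `COPY` because it reads the file on the client side and needs no superuser rights):

```sql
-- import_data.sql
-- Reload my_tbl from the daily CSV. TRUNCATE assumes the file always
-- contains the complete data set; drop it if rows are only appended.
TRUNCATE my_tbl;

-- \copy must be written on a single line
\copy my_tbl (date, hour, val) FROM 'C:\temp\data.csv' WITH (FORMAT csv, HEADER true)
```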

```batch
REM PGPASSWORD is read by psql, so no password prompt appears
set PGPASSWORD=the_password
"c:\Program Files\PostgreSQL\12\bin\psql" -h localhost -U pg_user -d target_database -f c:\temp\import_data.sql
```

Then schedule that batch file through the Windows Task Scheduler.
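
From the command line, such a task can be created with `schtasks` (the task name and batch file path here are assumptions):

```batch
schtasks /create /tn "ImportDataCsv" /tr "C:\temp\import_data.bat" /sc daily /st 20:15
```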
