I am using Golang and Postgres to filter some financial data. I have a Postgres database with a single table containing data for one stock market (if that's the correct term). The table has columns for id, symbol, date, open, high, low, close and volume. The total number of rows is 6,610,598 and the number of distinct stocks (symbols) is 2,174.
Now what I want to do is filter the data from that table and save it to another table, so the first table contains the raw data and the second contains the cleaned data.
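In Go terms, a row of the raw table looks roughly like this (the struct name and exact field types are approximate, just to show the shape of the data):

    import "time"

    // StockBar mirrors one row of the raw table: id, symbol, date,
    // the OHLC prices and the traded volume.
    type StockBar struct {
        ID     int64
        Symbol string
        Date   time.Time
        Open   float64
        High   float64
        Low    float64
        Close  float64
        Volume int64
    }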
We have three parameters: a date (EVALDATE) and two integers (MINCTD and MINDP). First, we have to select only those stocks that pass our minimum calendar trading days parameter (MINCTD). That selection is done like this (NOTE: we use Golang for this), where symbols ([]string) got its value from SELECT DISTINCT symbol FROM table_name;

    // (variables for symbol, date, open, high, low, close and volume are
    // declared elsewhere for scanning full rows later)
    var filteredSymbols []string
    for _, symbol := range symbols {
        var count int
        query := fmt.Sprintf("SELECT count(*) FROM table_name WHERE symbol = '%s' AND date >= '%s';", symbol, EVALDATE)
        row := db.QueryRow(query)
        if err := row.Scan(&count); err != nil {
            // handle error
        }
        if count >= MINCTD {
            filteredSymbols = append(filteredSymbols, symbol)
        }
    }
Basically, the operation above keeps only those symbols that have enough rows from EVALDATE up to the current date (the latest date in the data) to satisfy MINCTD. This operation took 30 minutes.
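For comparison, the same first filter can be collapsed into a single grouped query, so Postgres counts all symbols in one pass instead of running ~2,174 separate counts. A minimal sketch, assuming database/sql with a Postgres driver that supports $1-style placeholders (e.g. lib/pq), and the same table and parameter names as above:

    // filterByMinCTD returns the symbols that have at least minCTD rows on or
    // after evalDate, using one grouped query instead of one count per symbol.
    func filterByMinCTD(db *sql.DB, evalDate string, minCTD int) ([]string, error) {
        rows, err := db.Query(
            `SELECT symbol
               FROM table_name
              WHERE date >= $1
              GROUP BY symbol
             HAVING count(*) >= $2`, evalDate, minCTD)
        if err != nil {
            return nil, err
        }
        defer rows.Close()

        var symbols []string
        for rows.Next() {
            var s string
            if err := rows.Scan(&s); err != nil {
                return nil, err
            }
            symbols = append(symbols, s)
        }
        return symbols, rows.Err()
    }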
If a symbol satisfies the first filter above, it goes through a second filter, which tests whether, within that period (EVALDATE to LATEST_DATE), it has enough rows with complete data (no OHLC columns left without values). The query below is run for each symbol that passed the first filter:
    SELECT count(*) FROM table_name
    WHERE symbol = 'symbol' AND date >= 'EVALDATE'
      AND open != 0 AND high != 0 AND low != 0 AND close != 0;
This query took 36 minutes.
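Both thresholds could also be checked in one grouped query, so the whole period is scanned once for all symbols. A sketch under two assumptions: that the complete-row count is compared against MINDP (the description above implies this but does not state it), and that the Postgres version is 9.4+ for the FILTER clause (older versions can use SUM(CASE WHEN ... THEN 1 ELSE 0 END) instead):

    // filterSymbols keeps the symbols that have at least minCTD rows since
    // evalDate and, among those rows, at least minDP with non-zero OHLC values.
    func filterSymbols(db *sql.DB, evalDate string, minCTD, minDP int) ([]string, error) {
        rows, err := db.Query(
            `SELECT symbol
               FROM table_name
              WHERE date >= $1
              GROUP BY symbol
             HAVING count(*) >= $2
                AND count(*) FILTER (WHERE open != 0 AND high != 0
                                       AND low != 0 AND close != 0) >= $3`,
            evalDate, minCTD, minDP)
        if err != nil {
            return nil, err
        }
        defer rows.Close()

        var symbols []string
        for rows.Next() {
            var s string
            if err := rows.Scan(&s); err != nil {
                return nil, err
            }
            symbols = append(symbols, s)
        }
        return symbols, rows.Err()
    }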
After getting the slice of symbols that passed both filters, I then grab their data again with a Postgres query and begin a bulk insert into the other table.
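If the cleaned table has the same columns, that last step could also stay entirely inside Postgres, so the passing rows never travel through the Go process. A sketch, with cleaned_table as a hypothetical name for the destination and assuming only rows from EVALDATE onward are wanted in it (drop the outer date condition to copy each passing symbol's full history):

    // copyCleaned inserts the rows of every passing symbol into the cleaned
    // table with a single INSERT ... SELECT, reusing the combined filter
    // above as a subquery.
    func copyCleaned(db *sql.DB, evalDate string, minCTD, minDP int) error {
        _, err := db.Exec(
            `INSERT INTO cleaned_table (id, symbol, date, open, high, low, close, volume)
             SELECT id, symbol, date, open, high, low, close, volume
               FROM table_name
              WHERE date >= $1
                AND symbol IN (
                    SELECT symbol
                      FROM table_name
                     WHERE date >= $1
                     GROUP BY symbol
                    HAVING count(*) >= $2
                       AND count(*) FILTER (WHERE open != 0 AND high != 0
                                              AND low != 0 AND close != 0) >= $3)`,
            evalDate, minCTD, minDP)
        return err
    }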
So 1 hour and 6 minutes is not very acceptable. What should I do then? Grab all data then filter using Golang in memory?
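For reference, the in-memory variant mentioned above might look roughly like this: one streamed query over the period, with per-symbol counters kept in Go maps. Whether it beats the grouped queries would depend on how quickly the several million rows can be pulled over the wire:

    // countsBySymbol streams every row since evalDate once and tallies, per
    // symbol, the total row count and the count of rows with complete
    // (non-zero) OHLC data.
    func countsBySymbol(db *sql.DB, evalDate string) (total, complete map[string]int, err error) {
        rows, err := db.Query(
            `SELECT symbol, open, high, low, close
               FROM table_name
              WHERE date >= $1`, evalDate)
        if err != nil {
            return nil, nil, err
        }
        defer rows.Close()

        total = make(map[string]int)
        complete = make(map[string]int)
        for rows.Next() {
            var sym string
            var o, h, l, c float64
            if err := rows.Scan(&sym, &o, &h, &l, &c); err != nil {
                return nil, nil, err
            }
            total[sym]++
            if o != 0 && h != 0 && l != 0 && c != 0 {
                complete[sym]++
            }
        }
        return total, complete, rows.Err()
    }

A symbol would then pass both filters when total[s] >= MINCTD and complete[s] >= MINDP (again assuming MINDP is the threshold for complete rows).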