I have to use bash to connect to our PostgreSQL 9.1 database server and execute various SQL statements. We have a performance issue caused by repeatedly opening and closing too many database connections (right now, we send each statement to a separate psql command). I am looking at the possibility of maintaining an open database connection for a block of SQL statements using named pipes.

The problem is that once I open a connection and execute a SQL statement, I don't know when to stop reading from psql. I've thought about parsing the output to look for a prompt, although I don't know whether that is safe, since the same character could appear inside SELECT output.

Does anyone have a suggestion? Here's a simplified example of what I have so far:

    #!/bin/bash
    PIPE_IN=/tmp/pipe.in
    PIPE_OUT=/tmp/pipe.out
    mkfifo $PIPE_IN $PIPE_OUT
    psql -A -t jkim_edr_md_xxx_db < $PIPE_IN > $PIPE_OUT &
    exec 5> $PIPE_IN; rm -f $PIPE_IN
    exec 4< $PIPE_OUT; rm -f $PIPE_OUT
    echo 'SELECT * FROM some_table' >&5
    # unfortunately, this loop blocks
    while read -u 4 LINE
    do
        echo LINE=$LINE
    done
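One way to know when a statement's output ends is to have psql print a sentinel line with its \echo meta-command after each statement, then read from the output pipe until that line appears. The sketch below assumes a made-up sentinel string (__DONE__) and helper name (run_sql), and stdbuf -oL is only a guess against stdio block buffering when psql writes to a pipe; treat it as an illustration, not a drop-in fix:

    #!/bin/bash
    PIPE_IN=/tmp/pipe.in
    PIPE_OUT=/tmp/pipe.out
    mkfifo $PIPE_IN $PIPE_OUT

    # Long-lived psql; -A -t keeps rows unaligned with no headers/footers.
    # stdbuf -oL (line buffering) may or may not be needed on your platform.
    stdbuf -oL psql -A -t jkim_edr_md_xxx_db < $PIPE_IN > $PIPE_OUT &

    exec 5> $PIPE_IN
    exec 4< $PIPE_OUT

    run_sql() {
        local line
        echo "$1"              >&5   # send the statement
        echo '\echo __DONE__'  >&5   # ask psql to print the sentinel
        while read -r -u 4 line; do
            [ "$line" = __DONE__ ] && break   # end of this statement's output
            echo "LINE=$line"                 # process one result row here
        done
    }

    run_sql 'SELECT * FROM some_table;'
    run_sql 'SELECT count(*) FROM some_table;'

    exec 5>&-   # closing psql's stdin lets it finish and exit
    wait
    rm -f $PIPE_IN $PIPE_OUT

This avoids parsing for a prompt entirely, since the sentinel is a full line you control, though it can still collide with data that happens to contain the identical line.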


PostgreSQL supports persistent connections. You don't need to open and close connections all the time.
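For instance, a single psql invocation holds one connection open for an entire batch of statements (a minimal sketch reusing the database and table names from the question):

    psql -A -t jkim_edr_md_xxx_db <<'SQL'
    SELECT * FROM some_table;
    SELECT count(*) FROM some_table;
    SQL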


I need to process the results from the SQL statements individually, so I don't think batch execution will work for me.


Thanks, I have updated my post with some of your recommendations.
