I need to update thousands of database records at a time. I am using a loop to generate the necessary PreparedStatement updates, but am hoping for a more efficient method.
Normally, I would try to use a batch operation, but since the number of fields I need to update differs for each record, I need a new PreparedStatement for each update.
As it is, updating around 2,000 records takes 3-4 minutes.
for (AgentUpdate update : updateList) {
    Agent thisAgent = update.getAgent();
    // Prefix for updating each agent
    StringBuilder sql = new StringBuilder(
            "UPDATE REP_ASSIST.AGENTS SET\n"
    );
    // Create a map of all changes
    Map<String, String> updateMap = new HashMap<>();
    for (AgentDiffs diff : update.getChanges()) {
        updateMap.put(diff.getAgentField().toString(), diff.getNewValue());
    }
    // Iterate through the map and build the SQL statement,
    // including all fields to be updated
    Iterator iterator = updateMap.entrySet().iterator();
    while (iterator.hasNext()) {
        Map.Entry pair = (Map.Entry) iterator.next();
        sql.append(pair.getKey()).append("=?");
        if (iterator.hasNext()) {
            sql.append(",\n");
        }
    }
    sql.append("\nWHERE AGENT_CODE=? AND UPN=?;");
    Utility.printTempMsg(sql.toString());
    // Create the PreparedStatement and fill each ? with the
    // appropriate new value.
    ps = connection.prepareStatement(sql.toString());
    int paramNum = 1;
    for (String fieldName : updateMap.keySet()) {
        String newValue = updateMap.get(fieldName);
        ps.setString(paramNum, newValue);
        // Increase the parameter count
        paramNum++;
    }
    // We are now to our last two parameters, fill them
    ps.setString(paramNum++, thisAgent.getAgentCode());
    ps.setInt(paramNum, thisAgent.getAgentUpn());
    ps.executeUpdate();
    count++;
    Utility.printTempMsg(sql.toString());
    // Update the progress based on total count so far
    updateProgress(count, totalOperations);
}
So how does one handle large dynamic updates of this sort?
2 Answers
Here are some ideas to try:

You are not closing your PreparedStatements. There might be some slowdown due to resource consumption or deadlock. Use the try-with-resources feature so they are closed automatically:

for (AgentUpdate update : updateList) {
    try (PreparedStatement ps = connection.prepareStatement(sql.toString())) {
        // set parameters
        ps.executeUpdate();
    }
}

Alternatively: you are creating a custom query and a new PreparedStatement for each record. Although you are optimizing individual queries by minimizing the number of fields you update, you are adding overhead by preparing a new statement for every record. If you update every field for every row, you could re-use the same PreparedStatement, which may end up being more efficient (depending on the number of fields and the length of the data you are actually updating). Just overwrite the values in the prepared statement and call executeUpdate() on it again:

try (PreparedStatement ps = connection.prepareStatement(sql.toString())) {
    for (AgentUpdate update : updateList) {
        // set parameters
        ps.executeUpdate();
    }
}
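You mention batch operations in the question; if many records happen to share the same set of changed columns, JDBC batching is another option worth measuring. The sketch below assumes the updates have already been grouped by identical column lists beforehand (the grouping step, the sameColumnsGroup variable, and the column names RANK/REGION are illustrative, not part of the original code):

String sql = "UPDATE REP_ASSIST.AGENTS SET RANK=?, REGION=?\n"   // illustrative columns
        + "WHERE AGENT_CODE=? AND UPN=?";
try (PreparedStatement ps = connection.prepareStatement(sql)) {
    for (AgentUpdate update : sameColumnsGroup) {                // hypothetical pre-grouped list
        Agent agent = update.getAgent();
        int paramNum = 1;
        for (AgentDiffs diff : update.getChanges()) {
            ps.setString(paramNum++, diff.getNewValue());
        }
        ps.setString(paramNum++, agent.getAgentCode());
        ps.setInt(paramNum, agent.getAgentUpn());
        ps.addBatch();                                           // queue the update
    }
    ps.executeBatch();                                           // send the whole group at once
}

Whether this actually helps depends on how often column sets repeat in your data, so it is worth profiling against the per-record approach.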
You don't even need to create a HashMap, because you already have an object, AgentDiffs, to hold the values for you. Creating a HashMap is just an unnecessary step here.

To create the UPDATE statement, you may use a Stream directly instead of iterating over a HashMap:

String fields = update.getChanges().stream()
        .map(diff -> diff.getAgentField().toString() + "=?")
        .collect(Collectors.joining(",\n"));

And after that, you may iterate over the AgentDiffs again to fill in the values of the parameters in the PreparedStatement.
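One way that second pass could look, reusing the getters already shown in the question (a sketch only; it relies on both passes iterating getChanges() in the same order so the ? placeholders line up):

String sqlText = "UPDATE REP_ASSIST.AGENTS SET\n" + fields
        + "\nWHERE AGENT_CODE=? AND UPN=?";
try (PreparedStatement ps = connection.prepareStatement(sqlText)) {
    int paramNum = 1;
    // Same iteration order as the stream above, so each ? gets the right value
    for (AgentDiffs diff : update.getChanges()) {
        ps.setString(paramNum++, diff.getNewValue());
    }
    ps.setString(paramNum++, thisAgent.getAgentCode());
    ps.setInt(paramNum, thisAgent.getAgentUpn());
    ps.executeUpdate();
}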
Try to break down your block of code into separate functions.
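As a rough sketch (the helper method names below are illustrative, not from the original code), the loop body could shrink to something like:

for (AgentUpdate update : updateList) {
    String sql = buildUpdateSql(update);        // builds the SET clause, e.g. with the stream above
    try (PreparedStatement ps = connection.prepareStatement(sql)) {
        bindParameters(ps, update);             // the second pass over getChanges()
        ps.executeUpdate();
    }
    updateProgress(++count, totalOperations);
}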
I hope someone provides a better way to improve the SQL performance you are looking for.
- Does using the stream ensure the field values are added in list order? If not, I have no way of knowing which parameter would need to hold each value. — Zephyr, Jul 31, 2018 at 11:38
- It will be sequential, maintaining the original order, unless you use parallel() or parallelStream(). — Ankit Soni, Jul 31, 2018 at 21:31