scala - How to catch exceptions when using DataFrameWriter to write a dataframe with JDBC


I am writing a dataframe to a MySQL database, and the database is throwing java.sql.BatchUpdateException: Duplicate entry 'qwer1234' for key 'transaction_id_unique'.

The code I am using to write to the database is:

txnsDF.write.mode("append").jdbc(connProp.getProperty("url"), query, connProp)

Now I can't just wrap a try/catch around this, because the failure isn't at the row level: looking at the org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils class, it inserts batches of 1000 transactions at a time, so a single duplicate fails the whole batch.
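For reference, a driver-side try/catch does catch the failure, but only after Spark has aborted the write: the executor-side BatchUpdateException arrives at the driver wrapped in a SparkException, so you have to walk the cause chain to find it. A minimal sketch, reusing the txnsDF, connProp and query names from the snippet above:

import java.sql.BatchUpdateException
import scala.annotation.tailrec

// Walk the cause chain; Spark wraps executor exceptions in a SparkException.
@tailrec
def findBatchUpdateException(t: Throwable): Option[BatchUpdateException] = t match {
  case null                    => None
  case b: BatchUpdateException => Some(b)
  case other                   => findBatchUpdateException(other.getCause)
}

try {
  txnsDF.write.mode("append").jdbc(connProp.getProperty("url"), query, connProp)
} catch {
  case e: Exception =>
    findBatchUpdateException(e) match {
      case Some(b) => println(s"Duplicate-key batch failure: ${b.getMessage}") // whole write failed
      case None    => throw e
    }
}

This only confirms the problem in the question: the catch fires once per write, not per row, so there is no way to recover the non-duplicate rows from here.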

I could remove the constraint on the database, but that doesn't solve the problem of handling the exceptions coming from this class.
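One way to keep the constraint and still avoid the exception is to filter the offending rows out before the write. A hedged sketch of that approach; the transactions table name and the transaction_id column are assumptions inferred from the error message, not from the question:

// Read back the keys that already exist, then anti-join them away.
// "transactions" and "transaction_id" are assumed names for illustration.
val existingIds = spark.read
  .jdbc(connProp.getProperty("url"), "transactions", connProp)
  .select("transaction_id")

val freshRows = txnsDF
  .dropDuplicates("transaction_id")                      // duplicates within this batch
  .join(existingIds, Seq("transaction_id"), "left_anti") // duplicates already in MySQL

freshRows.write.mode("append").jdbc(connProp.getProperty("url"), "transactions", connProp)

Note this is racy if anything else writes to the table concurrently, so it reduces rather than eliminates duplicate-key failures; the unique constraint stays in place as a backstop.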

My last option is to partially rewrite the savePartition method in JdbcUtils to handle the exception, and I don't want to do that. Any suggestions?
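One alternative that avoids patching JdbcUtils: bypass the DataFrameWriter for this table and write each partition yourself, letting MySQL resolve duplicates with INSERT IGNORE (or INSERT ... ON DUPLICATE KEY UPDATE). A sketch under assumed table and column names (transactions, transaction_id, amount):

import java.sql.DriverManager
import org.apache.spark.sql.Row

// Manual per-partition write; MySQL's INSERT IGNORE skips duplicate keys
// instead of failing the batch. The column list is illustrative only.
txnsDF.foreachPartition { rows: Iterator[Row] =>
  val conn = DriverManager.getConnection(connProp.getProperty("url"), connProp)
  try {
    val stmt = conn.prepareStatement(
      "INSERT IGNORE INTO transactions (transaction_id, amount) VALUES (?, ?)")
    rows.grouped(1000).foreach { batch => // mirror JdbcUtils' 1000-row batching
      batch.foreach { row =>
        stmt.setString(1, row.getAs[String]("transaction_id"))
        stmt.setBigDecimal(2, row.getAs[java.math.BigDecimal]("amount"))
        stmt.addBatch()
      }
      stmt.executeBatch()
    }
    stmt.close()
  } finally {
    conn.close()
  }
}

With IGNORE, duplicate keys become warnings rather than a BatchUpdateException, so the non-duplicate rows in each batch still get inserted; any other SQL error surfaces normally.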

