Deadlock when Spark SQL writes to MySQL

Source: 9-17 - Requirement 3: Writing Statistics Results to MySQL

慕前端5264115

2019-06-03

rdd.foreachPartition(rowIterator => {
    val list = new ListBuffer[TrackingBasic]
    var dbName = ""
    rowIterator.foreach(row => {
        if (dbName.isEmpty) {
            dbName = "serving_tracking_" + row._1
        }
        val dimensions = ArrayBuffer[Int]()
        for (dim <- value.split(",")) {
            dimensions.append(row._2.getAs[Int](dim))
        }
        val platform = row._2.getAs[Int]("platform")
        val dmp_recognition = row._2.getAs[Long]("dmp_recognition")
        val imp_pv = row._2.getAs[Long]("imp_pv")
        val imp_uv = row._2.getAs[Long]("imp_uv")
        val clk_pv = row._2.getAs[Long]("clk_pv")
        val clk_uv = row._2.getAs[Long]("clk_uv")
        list.append(TrackingBasic(platform, dmp_recognition, imp_pv, imp_uv, clk_pv, clk_uv, dimensions: _*))
    })

    // write the result data to MySQL
    TrackingDao.insertTrackingBasic(dbName, tableName, tableColumns, list)
})
As shown above, the list is assembled inside foreach, and then the DAO's insert is called:
try {
    conn = DbUtil.getConnection()
    conn.setAutoCommit(false) // switch to manual commit

    // build the insert SQL
    val sql = generateInsertSql(dbName, tableName, tableColumns)
    pstmt = conn.prepareStatement(sql)

    for (ele <- list) {
        val dimensions = ele.dimensions
        val dimLength = dimensions.length
        for (i <- Range(0, dimLength)) {
            pstmt.setInt(i + 1, dimensions(i))
        }
        pstmt.setInt(dimLength + 1, ele.platform)
        pstmt.setLong(dimLength + 2, ele.dmp_recognition)
        pstmt.setLong(dimLength + 3, ele.imp_pv)
        pstmt.setLong(dimLength + 4, ele.imp_uv)
        pstmt.setLong(dimLength + 5, ele.clk_pv)
        pstmt.setLong(dimLength + 6, ele.clk_uv)
        pstmt.addBatch()
    }
    pstmt.executeBatch()
    conn.commit()

} catch {
    case e2: BatchUpdateException => {
        println("BatchUpdateException-->:    " + dbName + "." + tableName + "===>time:" + System.currentTimeMillis() + " Length:" + e2.getUpdateCounts().length)
        println("BatchUpdateException-->Message:   " + e2.getMessage)
    }
    case e: Exception => {
        e.printStackTrace()
        println("jdbc-Exception->    ")
    }
} finally {
    DbUtil.release(conn, pstmt)
}

Below is the exception that was caught; a deadlock occurred:
BatchUpdateException-->: serving_tracking_917.tracking_basics_date_media_placement_ad_creative_province_city===>time:1559270145103 Length:665
BatchUpdateException-->Message: Deadlock found when trying to get lock; try restarting transaction
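The MySQL message itself points at the standard remedy: "try restarting transaction". A minimal retry sketch, assuming nothing beyond the plain JDBC API; the helper name, retry count, and backoff are illustrative, not part of the course code:

```scala
import java.sql.SQLException

object DeadlockRetry {
  // MySQL's ER_LOCK_DEADLOCK error code, carried on the SQLException
  private val MysqlDeadlockCode = 1213

  // Re-run `work` (e.g. the batch insert plus commit) when MySQL reports a
  // deadlock; give up after `maxRetries` additional attempts.
  def retryOnDeadlock[T](maxRetries: Int)(work: => T): T = {
    var attempt = 0
    while (true) {
      try {
        return work
      } catch {
        case e: SQLException if e.getErrorCode == MysqlDeadlockCode && attempt < maxRetries =>
          attempt += 1
          Thread.sleep(50L * attempt) // brief backoff before restarting the transaction
      }
    }
    throw new IllegalStateException("unreachable")
  }
}
```

With something like this, the body of the DAO's try block could be wrapped as retryOnDeadlock(3) { ... }: the commit granularity stays the same, and only batches that hit a deadlock are re-run after InnoDB has rolled them back.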


1 Answer

Michael_PK

2019-06-03

I haven't run into this deadlock myself, so I can't say for sure what's going on with your machines. Two directions to check: 1) a problem in the code; 2) a problem in the MySQL environment.
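On the code side, one classic trigger worth checking: several partitions batch-insert into the same table, and each batch acquires InnoDB index/record locks in whatever order its rows happen to arrive, so two batches locking the same index ranges in opposite orders can deadlock each other. Sorting every batch by the same key before inserting makes the lock-acquisition order consistent across tasks. A hedged sketch; the case class and key below are simplified stand-ins for the course's TrackingBasic, not its real definition:

```scala
// Simplified stand-in for TrackingBasic: only the fields needed for ordering.
case class Basic(dimensions: Seq[Int], impPv: Long)

object BatchOrdering {
  // Sort a batch by its dimension key so every task inserts rows in the
  // same index order, removing one common cause of InnoDB deadlocks.
  def sortBatch(batch: Seq[Basic]): Seq[Basic] =
    batch.sortBy(_.dimensions.mkString(","))
}
```

In the code above this would mean sorting `list` once, just before calling TrackingDao.insertTrackingBasic.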

Michael_PK
replied to
慕前端5264115
It is possible for two tasks to run at the same time, but speculative execution is disabled by default; unless you have enabled it, there should be no concurrent attempts of the same task.
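For reference, speculative execution is controlled by the Spark setting spark.speculation, which does default to false. It can be pinned explicitly at submit time; the class name and jar below are placeholders:

```shell
# spark.speculation defaults to false; setting it explicitly rules out a
# second attempt of the same task writing the same rows concurrently.
spark-submit \
  --conf spark.speculation=false \
  --class your.main.Class \
  your-app.jar
```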
2019-06-03
2 replies in total

Enter the World of Big Data Spark SQL, Using imooc Log Analysis as an Example
