主要研究領(lǐng)域?yàn)楦叻诌b感公安應(yīng)用和視頻圖像處理與分析。先后主持完成國(guó)家級(jí)項(xiàng)目2項(xiàng),省部級(jí)項(xiàng)目6項(xiàng),從實(shí)戰(zhàn)需求出發(fā),對(duì)敏感目標(biāo)識(shí)別、安保態(tài)勢(shì)分析等關(guān)鍵技術(shù)進(jìn)行攻關(guān),全面提升對(duì)各類(lèi)風(fēng)險(xiǎn)隱患的發(fā)現(xiàn)、識(shí)別、預(yù)警和處置能力;主持完成的局部遮擋人臉識(shí)別與偵測(cè)關(guān)鍵技術(shù)獲2018年度公安部科學(xué)技術(shù)三等獎(jiǎng),獲公安部?jī)?yōu)秀教學(xué)成果三等獎(jiǎng)1項(xiàng)。近5年,發(fā)表論文16篇,其中SCI收錄5篇,EI收錄8篇。目前主持在研國(guó)家級(jí)項(xiàng)目2項(xiàng),經(jīng)費(fèi)總計(jì)300余萬(wàn)元。 代表性成果 1.J. Li, H. Huo, C. Li, R. Wang, C. Sui and Z. Liu, "Multigrained Attention Network for Infrared and Visible Image Fusion," in IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1-12, 2021, Art no. 5002412, doi: 10.1109/TIM.2020.3029360. 2.J. Li, H. Huo, C. Li, R. Wang and Q. Feng, "AttentionFGAN: Infrared and Visible Image Fusion Using Attention-Based Generative Adversarial Networks," in IEEE Transactions on Multimedia, vol. 23, pp. 1383-1396, 2021, doi: 10.1109/TMM.2020.2997127. 3. Li J , Huo H T , Liu K , et al. Infrared and Visible Image Fusion Using Dual Discriminators Generative Adversarial Networks with Wasserstein Distance[J]. Information Sciences, 2020. Doi:10.1016/j.ins.2020.04.035 4. Jing Li, Hongtao Huo, Chenhong Sui, Chenchen Jiang, and Chang Li.Poisson Reconstruction-Based Fusion of Infrared and Visible Images via Saliency Detection, IEEE Access: 2019 ,7 ,20676-20688 5. Feiyan Li, Wei Li, Hongtao Huo and Qiong Ran, Decision Fusion Based on Joint Low Rank and Sparse Component for Hyperspectral Image Classification, 2019 IEEE International Geoscience and Remote Sensing Symposium(IGARSS 2019), 401-404. 6.蔣晨琛,霍宏濤,馮琦.一種基于PCA的面向?qū)ο蠖喑叨确指顑?yōu)化算法[J/OL].北京航空航天大學(xué)學(xué)報(bào):1-17[2019-12-24].https://doi.org/10.13700/j.bh.1001-5965.2019.0398. 7.李非燕,霍宏濤,李靜,白杰.基于多特征和改進(jìn)稀疏表示的高光譜圖像分類(lèi)[J].光學(xué)學(xué)報(bào),2019,39(05):351-359. 8. 李非燕,霍宏濤,白杰,王巍.基于稀疏表示和自適應(yīng)模型的高光譜目標(biāo)檢測(cè)[J].光學(xué)學(xué)報(bào),2018,38(12):379-385. 9. 
羅霄陽(yáng),霍宏濤,王夢(mèng)思,陳亞飛.基于多殘差馬爾科夫模型的圖像拼接檢測(cè)[J].計(jì)算機(jī)科學(xué),2018,45(04):173-177.