Journal of Liaoning Petrochemical University

Journal of Liaoning Petrochemical University ›› 2016, Vol. 36 ›› Issue (3): 67-70. DOI: 10.3969/j.issn.1672-6952.2016.03.016

• Computer and Control •

Research of Image Segmentation Algorithm Based on Chaos Particle Swarm and Fuzzy Clustering

Wang Yang

  1. School of Computer and Communication Engineering, Liaoning Shihua University, Fushun, Liaoning 113001, China
  • Received: 2015-03-02  Revised: 2015-05-13  Published: 2016-06-25  Online: 2016-07-01
  • About the author: Wang Yang (1978-), female, M.S., lecturer; research interests: pattern recognition and machine learning. E-mail: wangyang0531@163.com.

Abstract: The fuzzy C-means (FCM) clustering algorithm is sensitive to the initial cluster centers and the membership matrix and easily falls into a local optimum, so it often fails to reach the best clustering result. To address this, a new image segmentation algorithm based on chaos particle swarm optimization and FCM clustering is proposed. The algorithm initializes a uniformly distributed particle swarm with the logical self-mapping function; when the swarm falls into premature convergence, chaos optimization is started so that stalled particles can escape the local optimum. Experimental results show that the new algorithm segments images faster and with higher segmentation accuracy.

Key words: fuzzy C-means clustering, chaos particle swarm, image segmentation, logical self-mapping, premature convergence
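
The following is a minimal NumPy sketch of the scheme described in the abstract, not the paper's implementation. Candidate cluster centers are initialized with the logical self-mapping x_{k+1} = 1 - 2*x_k^2, a standard PSO then minimizes the FCM objective, and chaotic re-initialization is triggered when the swarm's fitness stagnates. The exact map formula, the swarm parameters (w, c1, c2), and the variance-based stall test are illustrative assumptions; the abstract does not specify them.

# Sketch of chaos-PSO-initialized fuzzy C-means segmentation of a gray-level
# image (illustrative only; parameters and the stall test are assumptions).
import numpy as np

def logical_self_map_init(n_particles, n_clusters, lo, hi, seed=0.263):
    """Spread candidate cluster centers over [lo, hi] with the chaotic map."""
    x = seed
    centers = np.empty((n_particles, n_clusters))
    for i in range(n_particles):
        for j in range(n_clusters):
            x = 1.0 - 2.0 * x * x                  # logical self-mapping, stays in (-1, 1)
            centers[i, j] = lo + (x + 1.0) / 2.0 * (hi - lo)
    return centers

def fcm_objective(pixels, centers, m=2.0):
    """FCM cost J_m and the membership matrix implied by a set of centers."""
    d = np.abs(pixels[:, None] - centers[None, :]) + 1e-12   # N x C distances
    u = 1.0 / d ** (2.0 / (m - 1.0))
    u /= u.sum(axis=1, keepdims=True)                        # memberships sum to 1 per pixel
    return float(np.sum(u ** m * d ** 2)), u

def chaos_pso_fcm(pixels, n_clusters=3, n_particles=20, iters=100,
                  w=0.7, c1=1.5, c2=1.5, stall_var=1e-3):
    """Minimize the FCM objective with PSO; re-seed chaotically on stagnation."""
    lo, hi = float(pixels.min()), float(pixels.max())
    pos = logical_self_map_init(n_particles, n_clusters, lo, hi)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([fcm_objective(pixels, p)[0] for p in pos])
    g = pbest_cost.argmin()
    gbest, gbest_cost = pbest[g].copy(), pbest_cost[g]
    for _ in range(iters):
        r1, r2 = np.random.rand(*pos.shape), np.random.rand(*pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([fcm_objective(pixels, p)[0] for p in pos])
        better = cost < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], cost[better]
        g = pbest_cost.argmin()
        if pbest_cost[g] < gbest_cost:
            gbest, gbest_cost = pbest[g].copy(), pbest_cost[g]
        # Premature convergence: if all particles report nearly the same cost,
        # replace the worse half of the swarm with a fresh chaotic sequence.
        if cost.var() < stall_var:
            worst = cost.argsort()[n_particles // 2:]
            pos[worst] = logical_self_map_init(len(worst), n_clusters, lo, hi,
                                               seed=float(np.random.uniform(-0.9, 0.9)))
            vel[worst] = 0.0
    _, u = fcm_objective(pixels, gbest)
    return gbest, u.argmax(axis=1)                 # final centers and per-pixel labels

For a gray image img loaded as a 2-D array, centers, labels = chaos_pso_fcm(img.astype(float).ravel()) followed by labels.reshape(img.shape) yields a label map; in practice the objective is often evaluated on the 256-bin gray-level histogram instead of every pixel, a common speed-up for FCM-based segmentation.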

Cite this article

Wang Yang. Research of Image Segmentation Algorithm Based on Chaos Particle Swarm and Fuzzy Clustering[J]. Journal of Liaoning Petrochemical University, 2016, 36(3): 67-70.


Link to this article: http://journal.lnpu.edu.cn/CN/10.3969/j.issn.1672-6952.2016.03.016
                      http://journal.lnpu.edu.cn/CN/Y2016/V36/I3/67