
Author Search Result

[Author] Pengfei SHI (3 hits)

  • 3D Face Landmarking Method under Pose and Expression Variations

    Yuan HU  Jingqi YAN  Wei LI  Pengfei SHI  

     
    LETTER-Image Recognition, Computer Vision

    Vol. E94-D, No. 3, pp. 729-733

    A robust method is presented for 3D face landmarking under facial pose and expression variations. The method is based on Multi-level Partition of Unity (MPU) Implicits and does not rely on texture, pose, orientation or expression information. The MPU Implicits reconstruct the 3D face surface hierarchically: from lower to higher reconstruction levels, local shapes are reconstructed gradually according to their significance. For 3D faces, three landmarks (the nose and the left and right eyeholes) can be detected uniquely by analyzing curvature features at the lower levels. Experimental results on the GavabDB database show that the method is invariant to pose, holes, noise and expression; an overall detection rate of 98.59% is achieved under pose and expression variations.
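
    The curvature analysis described above can be illustrated on a synthetic depth map. The sketch below (plain NumPy; the surface, grid size and detection rule are illustrative assumptions, not the authors' code) computes the mean and Gaussian curvature of a height field and selects the strongest convex peak, in the way a nose-tip candidate might be located:

```python
import numpy as np

# Synthetic depth map: a single Gaussian bump standing in for a nose tip.
ax = np.linspace(-2.0, 2.0, 81)
h = ax[1] - ax[0]
X, Y = np.meshgrid(ax, ax)
z = np.exp(-(X**2 + Y**2))

# First and second derivatives of the height field (rows = y, cols = x).
fy, fx = np.gradient(z, h)
fxy, fxx = np.gradient(fx, h)   # np.gradient returns [d/dy, d/dx]
fyy, _ = np.gradient(fy, h)

# Mean (H) and Gaussian (K) curvature of a Monge patch z = f(x, y).
g = 1.0 + fx**2 + fy**2
H = ((1 + fx**2) * fyy - 2 * fx * fy * fxy + (1 + fy**2) * fxx) / (2 * g**1.5)
K = (fxx * fyy - fxy**2) / g**2

# A convex peak has K > 0 and, with this sign convention, H < 0;
# take the strongest such point as the landmark candidate.
dome = (K > 0) & (H < 0)
peak = np.unravel_index(np.argmax(np.where(dome, K, -np.inf)), K.shape)
```

    On real scans the curvature maps would be computed per reconstruction level on the MPU Implicits surface rather than on a toy height field.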

  • Fusion of Multiple Facial Features for Age Estimation

    Li LU  Pengfei SHI  

     
    LETTER-Image Recognition, Computer Vision

    Vol. E92-D, No. 9, pp. 1815-1818

    A novel age estimation method is presented that improves performance by fusing complementary information from global and local facial features. Two-directional two-dimensional principal component analysis ((2D)2PCA) is used for dimensionality reduction and for constructing individual feature spaces. Each feature space contributes a confidence value computed by support vector machines (SVMs), and the confidence values of all the facial features are then fused for the final age estimate. Experimental results demonstrate that fusing multiple facial features achieves significant accuracy gains over any single feature. Finally, we propose a fusion method that further improves accuracy.
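
    (2D)2PCA projects an image matrix from both sides using the leading eigenvectors of the column- and row-direction covariance matrices. A minimal NumPy sketch (the sizes, names and toy data are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def two_directional_2dpca(images, d_rows, d_cols):
    """(2D)2PCA: project each n x m image A to Z.T @ A @ X."""
    A = np.asarray(images, dtype=float)                  # (N, n, m)
    C = A - A.mean(axis=0)                               # centered images

    col_cov = np.einsum('kij,kil->jl', C, C) / len(A)    # (m, m): sum C.T C
    row_cov = np.einsum('kij,klj->il', C, C) / len(A)    # (n, n): sum C C.T

    def top_eigvecs(S, d):
        _, V = np.linalg.eigh(S)                         # eigenvalues ascending
        return V[:, ::-1][:, :d]                         # keep the top d

    X = top_eigvecs(col_cov, d_cols)                     # (m, d_cols)
    Z = top_eigvecs(row_cov, d_rows)                     # (n, d_rows)
    return np.array([Z.T @ a @ X for a in A])            # (N, d_rows, d_cols)

rng = np.random.default_rng(0)
toy_faces = rng.normal(size=(20, 16, 12))                # 20 toy 16x12 "images"
feats = two_directional_2dpca(toy_faces, d_rows=4, d_cols=3)
```

    Each reduced matrix would then feed one SVM, whose confidence value enters the fusion step described in the abstract.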

  • Sexual Dimorphism Analysis and Gender Classification in 3D Human Face

    Yuan HU  Li LU  Jingqi YAN  Zhi LIU  Pengfei SHI  

     
    LETTER-Pattern Recognition

    Vol. E93-D, No. 9, pp. 2643-2646

    In this paper, we present a sexual dimorphism analysis of the 3D human face and perform gender classification based on its results. Four types of features are extracted from a 3D human-face image. Using statistical methods, the existence of sexual dimorphism in the 3D human face is demonstrated on these features, and the contribution of each feature to sexual dimorphism is quantified according to a novel criterion. The best gender classification rate, 94%, is obtained using SVMs and the Matcher Weighting fusion method. This research adds to the knowledge of sexual dimorphism in 3D faces and provides a foundation for distinguishing between male and female 3D faces.
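
    Matcher Weighting is a standard score-fusion rule in biometrics: each matcher's score is weighted inversely to its error rate, so the more reliable matcher dominates the fused score. A minimal sketch with made-up error rates and scores (nothing here is taken from the paper's implementation):

```python
import numpy as np

def matcher_weights(error_rates):
    """Weight w_i proportional to 1/e_i, normalized to sum to 1."""
    inv = 1.0 / np.asarray(error_rates, dtype=float)
    return inv / inv.sum()

def fuse(scores, weights):
    """Weighted-sum fusion of per-matcher score vectors."""
    return np.asarray(weights) @ np.asarray(scores, dtype=float)

w = matcher_weights([0.05, 0.10])            # matcher 1 is twice as reliable
fused = fuse([[0.9, 0.2], [0.4, 0.7]], w)    # per-class fused scores
```

    The fused vector is then thresholded or arg-maxed to make the final male/female decision.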