Hierarchical classification is critical to knowledge management and exploration, with applications such as gene function prediction and document categorization. In hierarchical classification, an input is classified according to a structured class hierarchy. In this setting, the central issue is how to effectively exploit the inter-class relationships to improve generalization over flat classification, which ignores such dependencies. In this article, we propose a novel large-margin method based on constraints characterizing a multi-path hierarchy, in which class membership can be non-exclusive. The proposed method accommodates various losses for hierarchical classification. For implementation, we focus on the symmetric difference loss and two large-margin classifiers: support vector machines and psi-learning. Finally, theoretical and numerical analyses are conducted, together with an application to gene function prediction; they suggest that the proposed method achieves the desired objective and outperforms strong competitors in the literature.
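To make the loss concrete, the short Python sketch below computes one common form of the symmetric difference loss: the number of classes on which the true and predicted label sets disagree, after each set is closed upward through a (possibly multi-path) hierarchy so that membership in a class implies membership in its ancestors. The hierarchy, class names, and the upward-closure convention are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only; the hierarchy and labels below are hypothetical.

    def symmetric_difference_loss(true_classes, predicted_classes):
        """Count classes present in exactly one of the two label sets."""
        return len(set(true_classes) ^ set(predicted_classes))

    def expand_with_ancestors(classes, parents):
        """Close a label set upward: membership in a class implies membership
        in all of its ancestors (multi-path: a class may have several parents)."""
        closed = set()
        stack = list(classes)
        while stack:
            c = stack.pop()
            if c not in closed:
                closed.add(c)
                stack.extend(parents.get(c, []))
        return closed

    if __name__ == "__main__":
        # Hypothetical hierarchy: each class maps to its parent classes.
        parents = {"metabolism": [], "transport": [],
                   "ion transport": ["transport"],
                   "energy": ["metabolism", "transport"]}  # multi-path node
        y_true = expand_with_ancestors({"energy"}, parents)
        y_pred = expand_with_ancestors({"ion transport"}, parents)
        # Disagreement on "energy", "metabolism", and "ion transport" gives loss 3.
        print(symmetric_difference_loss(y_true, y_pred))

In this reading, the loss reduces to a Hamming-type count over the nodes of the hierarchy, which is one way such a loss can be decomposed class by class; the paper's own formulation should be consulted for the exact definition used.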