Manifold learning applied to the structure of intra-class variation in hyperspectral data provides useful information for investigating the intrinsic coordinates corresponding to the quantitative properties inherent in a class. In the high-dimensional feature space, however, it is infeasible to acquire a statistically sufficient number of labeled data to estimate these coordinates. In this paper, we propose semi-supervised regression and dimensionality reduction methods for hyperspectral subspace learning that exploit abundant unlabeled data together with a small number of labeled data. The quantitative target variables for regression and the order constraints for dimensionality reduction are embedded in matrices representing data relations, namely a set of between-class scatter matrices, within-class scatter matrices, and supervised local attraction matrices. The optimal projection matrices are then estimated by solving generalized eigenvalue problems derived from these matrices. The proposed methods are applied to synthetic linear regression problems and to dimensionality reduction problems on a time series of hyperspectral data from a deciduous broad-leaved forest, extracting local coordinates related to phenological changes. The order consistency of the projections is assessed with an index based on the Mann-Kendall test statistic. The proposed methods achieve substantially better performance in both regression and dimensionality reduction than the alternative supervised and unsupervised methods.
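To make the projection-estimation step concrete, the following is a minimal illustrative sketch, not the paper's exact semi-supervised formulation: it builds between-class and within-class scatter matrices from toy labeled data and obtains a projection matrix by solving the generalized eigenvalue problem S_b w = λ S_w w, as in Fisher discriminant analysis. The data dimensions, class shifts, and two-axis projection are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled data: 3 classes of 20 samples in a 5-dimensional feature space.
X = rng.normal(size=(60, 5))
y = np.repeat([0, 1, 2], 20)
X[y == 1] += [2, 0, 0, 0, 0]  # shift class means so Sb is informative
X[y == 2] += [0, 3, 0, 0, 0]

mean_all = X.mean(axis=0)
Sb = np.zeros((5, 5))  # between-class scatter
Sw = np.zeros((5, 5))  # within-class scatter
for c in np.unique(y):
    Xc = X[y == c]
    diff = (Xc.mean(axis=0) - mean_all)[:, None]
    Sb += len(Xc) * diff @ diff.T
    Sw += (Xc - Xc.mean(axis=0)).T @ (Xc - Xc.mean(axis=0))

# Generalized eigenvalue problem Sb w = lambda Sw w, solved via
# Sw^{-1} Sb (Sw is assumed invertible here).
vals, vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
order = np.argsort(vals.real)[::-1]
W = vecs[:, order[:2]].real  # projection matrix onto 2 leading axes
Z = X @ W                    # low-dimensional embedding of the data
print(Z.shape)               # (60, 2)
```

The proposed methods differ in the matrices they assemble (they additionally encode regression targets, order constraints, and unlabeled-data relations), but the final estimation step has this same generalized-eigenproblem form.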