Spectral clustering is a leading and popular technique in unsupervised data analysis. Two of its major limitations are scalability and generalization of the spectral embedding (i.e., out-of-sample extension). In this paper, we introduce a deep learning approach to spectral clustering that overcomes these shortcomings. Our network, which we call SpectralNet, learns a map that embeds input data points into the eigenspace of their associated graph Laplacian matrix and subsequently clusters them. We train SpectralNet using a procedure that involves constrained stochastic optimization. Stochastic optimization allows it to scale to large datasets, while the constraints, which are implemented using a special-purpose output layer, keep the network output orthogonal. Moreover, the map learned by SpectralNet naturally generalizes the spectral embedding to unseen data points. To further improve the quality of the clustering, we replace the standard pairwise Gaussian affinities with affinities learned from unlabeled data using a Siamese network. Additional improvement can be achieved by applying the network to code representations produced, e.g., by standard autoencoders. Our end-to-end learning procedure is fully unsupervised. In addition, we apply VC dimension theory to derive a lower bound on the size of SpectralNet.
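The abstract mentions a special-purpose output layer that keeps the network output orthogonal. A minimal NumPy sketch of one way such an orthogonalization step can work on a mini-batch, via a Cholesky factor of the batch Gram matrix, is shown below (the function name and batch sizes are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def orthogonalize(Y):
    """Map a batch of network outputs Y (m x k) to outputs whose
    columns are orthonormal with respect to the batch, i.e.
    (1/m) * Yt^T Yt = I, using a Cholesky factor of the Gram matrix.
    This is a sketch of an orthogonality-enforcing output layer,
    not the paper's exact implementation."""
    m = Y.shape[0]
    # Batch Gram matrix: (1/m) Y^T Y = L L^T
    L = np.linalg.cholesky(Y.T @ Y / m)
    # Right-multiply by L^{-T} so the transformed Gram matrix is I
    return Y @ np.linalg.inv(L).T

rng = np.random.default_rng(0)
Y = rng.standard_normal((256, 4))        # a batch of 256 outputs in R^4
Y_tilde = orthogonalize(Y)
gram = Y_tilde.T @ Y_tilde / Y.shape[0]  # should be close to the identity
```

In a full training loop this transformation would be applied as the network's final layer on each mini-batch, so that the stochastic optimization respects the orthogonality constraint throughout.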