{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Alternating least squares for Tucker model\n", "\n", "```\n", "Copyright 2022 National Technology & Engineering Solutions of Sandia,\n", "LLC (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the\n", "U.S. Government retains certain rights in this software.\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The function `pyttb.tucker_als()` computes the best rank-$(R_1,R_2,\\ldots,R_n)$ approximation of a tensor $\\mathcal{X}$, where the target ranks are specified by the vector $(R_1,R_2,\\ldots,R_n)$. The input $\\mathcal{X}$ can be a `tensor`, `sptensor`, `ktensor`, or `ttensor`. The result is a `ttensor`.\n", "\n", "The method originates with Tucker (1966) and was later revisited by De Lathauwer et al. (2000).\n", "\n", "* L. R. Tucker, Some mathematical notes on three-mode factor analysis, Psychometrika, 31:279-311, 1966, http://dx.doi.org/10.1007/BF02289464\n", "* L. De Lathauwer, B. De Moor, J. Vandewalle, On the best rank-1 and rank-(R_1, R_2, ..., R_N) approximation of higher-order tensors, SIAM J. Matrix Analysis and Applications, 21:1324-1342, 2000, http://doi.org/10.1137/S0895479898346995\n", "\n", "Note: it is often better to use `pyttb.hosvd()` instead, which computes a quasi-optimal approximation in a single pass rather than iterating." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import pyttb as ttb\n", "import numpy as np" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create a data tensor of shape (5, 4, 3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "np.random.seed(0) # Set seed for reproducibility\n", "X = ttb.sptenrand(\n", " [5, 4, 3], nonzeros=10\n", ") # Create a sparse tensor with 10 nonzeros via the 'nonzeros' parameter.\n", "X" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create an approximation with all ranks equal to 2" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "T = ttb.tucker_als(X, 2) # best rank-(2,2,2) approximation\n", "T" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create an approximation with specific ranks of [2, 2, 1]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "T = ttb.tucker_als(X, [2, 2, 1]) # best rank-(2,2,1) approximation\n", "T" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Use a different ordering of the dimensions\n", "The `dimorder` argument sets the order in which the modes are updated within each ALS iteration." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "T = ttb.tucker_als(X, 2, dimorder=[2, 1, 0])\n", "T" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Use the `\"nvecs\"` initialization method\n", "This initializes each factor matrix with the leading left singular vectors of the corresponding mode-$n$ unfolding of $\\mathcal{X}$. It is more expensive than random initialization but generally works very well." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "T = ttb.tucker_als(X, 2, dimorder=[0, 1, 2], init=\"nvecs\")\n", "T" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Specify the initial guess manually" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# One factor matrix per mode, each of shape (X.shape[k], rank)\n", "U0 = [np.random.rand(5, 2), np.random.rand(4, 2), np.random.rand(3, 2)]\n", "T = ttb.tucker_als(X, 2, dimorder=[0, 1, 2], init=U0)\n", "T" ] } ], "metadata": {}, "nbformat": 4, "nbformat_minor": 2 }