To quickly demonstrate how the GraphLab framework can be used to solve a real-world machine learning problem, we provide a short tutorial on how to implement PageRank in GraphLab.
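Independently of GraphLab's own API (which the tutorial covers), the algorithm itself can be summarized as a power iteration over the graph. The sketch below is a minimal, framework-free illustration in plain Python; the graph, damping factor, and iteration count are illustrative choices, not values taken from the tutorial.

```python
# Minimal PageRank power iteration. The graph is an adjacency list:
# node -> list of out-neighbors.
def pagerank(graph, damping=0.85, iters=50):
    n = len(graph)
    rank = {v: 1.0 / n for v in graph}
    for _ in range(iters):
        new_rank = {v: (1.0 - damping) / n for v in graph}
        for v, out in graph.items():
            if out:
                share = damping * rank[v] / len(out)
                for u in out:
                    new_rank[u] += share
            else:
                # Dangling node: spread its rank uniformly.
                for u in graph:
                    new_rank[u] += damping * rank[v] / n
        rank = new_rank
    return rank

# Example: a three-page graph.
g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pr = pagerank(g)
```

In GraphLab, the per-vertex accumulation in the inner loop becomes an update function applied to each vertex's scope, which is what lets the framework parallelize the computation.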
To provide a detailed overview of how all of the pieces of GraphLab can be put together to build an interesting algorithm, we provide a lengthy tutorial on a toy "Coin Flipping" problem.
To provide an example of how to implement iterative linear algebra algorithms for solving systems of linear equations, the following example shows how to implement the Jacobi method.
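For reference, the Jacobi method itself is straightforward: each unknown is repeatedly re-solved from its own row while holding the others fixed. The plain-Python sketch below shows the iteration on a small strictly diagonally dominant system (a standard sufficient condition for convergence); it is an illustration of the method, not GraphLab code.

```python
# Jacobi iteration for A x = b. A is a list of rows; convergence is
# guaranteed here because A is strictly diagonally dominant.
def jacobi(A, b, iters=100):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        # Each component is updated from the previous iterate only.
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0], [2.0, 5.0]]
b = [9.0, 13.0]
x = jacobi(A, b)  # converges toward the exact solution (16/9, 17/9)
```

Because each component update reads only its row's neighbors, the method maps naturally onto a per-vertex update function in GraphLab.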
Below are the current application libraries built on top of GraphLab.
This library solves linear systems of equations using iterative solvers: Jacobi, Gaussian Belief Propagation (GaBP), conjugate gradient, inversion of a sparse symmetric matrix via GaBP, the Shotgun LASSO solver, and the Shotgun sparse logistic regression solver.
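To give a flavor of one of the listed solvers, here is a hedged, framework-free sketch of the conjugate-gradient method for a symmetric positive-definite system, in plain Python. The matrix and right-hand side are made up for illustration; the library's actual implementation operates on distributed sparse data.

```python
# Conjugate gradient for a symmetric positive-definite system A x = b.
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-10):
    x = [0.0] * len(b)
    r = b[:]          # residual b - A x, with x = 0 initially
    p = r[:]          # search direction
    rs = dot(r, r)
    for _ in range(len(b)):   # exact in at most n steps (in exact arithmetic)
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)  # exact solution is (1/11, 7/11)
```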
This library implements multiple algorithms for factorizing a 3D tensor or a 2D matrix into lower-rank matrices. The implemented algorithms are: PMF (probabilistic matrix factorization), BPTF (Bayesian probabilistic tensor factorization), ALS (alternating least squares), WALS (weighted alternating least squares), SGD (stochastic gradient descent), SVD (Lanczos algorithm), NMF (non-negative matrix factorization), and Koren's SVD++ algorithm.
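As a rough illustration of what these factorizations compute, the sketch below shows SGD matrix factorization (one of the listed algorithms) in plain Python: observed (user, item, rating) triples are fit by the product of two low-rank factor matrices. All hyperparameters and the toy data are illustrative assumptions, not the library's defaults.

```python
import random

# SGD matrix factorization: approximate ratings R[u][i] by the dot
# product of user factors U[u] and item factors V[i] of rank k.
def sgd_mf(ratings, n_users, n_items, k=2, lr=0.01, reg=0.02,
           epochs=200, seed=0):
    rng = random.Random(seed)
    U = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    V = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(U[u][f] * V[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                uf, vf = U[u][f], V[i][f]
                # Gradient step with L2 regularization on both factors.
                U[u][f] += lr * (err * vf - reg * uf)
                V[i][f] += lr * (err * uf - reg * vf)
    return U, V

# Observed (user, item, rating) triples; unobserved entries get predicted.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 1, 1.0)]
U, V = sgd_mf(ratings, n_users=2, n_items=2)
```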
This library implements multiple clustering methods: K-Means, Fuzzy K-Means, K-Means++, LDA (Latent Dirichlet Allocation), and K-Core decomposition.
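For context, here is a minimal plain-Python sketch of Lloyd's K-Means (the simplest of the listed methods) on toy 2-D points. It uses a deterministic initialization from the first k points purely for reproducibility of the example; smarter seeding is exactly what the library's K-Means++ variant provides.

```python
# Lloyd's K-Means on 2-D points: alternate assignment and mean-update steps.
def kmeans(points, k, iters=20):
    centers = points[:k]  # naive deterministic init (K-Means++ does better)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 +
                                  (p[1] - centers[c][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster.
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

# Two well-separated blobs around (0.1, 0.1) and (5.0, 5.0).
points = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
          (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
centers = kmeans(points, k=2)
```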
Code for running the Non-parametric Belief Propagation algorithm, for computing inference in a Gaussian mixture model.