Thursday, May 9, 2013

Learning linear regression with gradient descent

Last week I restarted an old and good habit (see A collection of algos and data structures published here). Every day I take a well-known algorithm and code it in C++ with Boost. Nothing else, just pure training and geeky fun. The only constraint is a time limit of 45 minutes, after a running session in St. James's Park.

Here you have the code for linear regression with gradient descent in C++ with Boost uBLAS. Linear regression is an approach to modeling the relationship between a scalar dependent variable y and one or more explanatory variables denoted X. uBLAS takes its name from BLAS, the classic set of routines for efficient matrix and vector computations; Boost::numeric::ublas provides an elegant, template-based C++ implementation of that functionality.
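In the usual notation, with the m training samples stacked as rows of X, targets y, and parameters theta, gradient descent minimizes the squared-error cost

    J(theta) = 1/(2m) * ||X*theta - y||^2

by repeatedly applying the update

    theta := theta - (alpha/m) * X^T * (X*theta - y)

where alpha is the learning rate.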

Here is the code.
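What follows is a minimal sketch of what such an implementation can look like: batch gradient descent over a dense design matrix with an explicit column of ones for the intercept. The function name gradient_descent, the learning rate alpha, the iteration count and the toy data set in main are illustrative choices, not the post's original snippet.

#include <cstddef>
#include <iostream>
#include <boost/numeric/ublas/matrix.hpp>
#include <boost/numeric/ublas/vector.hpp>
#include <boost/numeric/ublas/io.hpp>

namespace ublas = boost::numeric::ublas;

// Batch gradient descent for linear regression.
// X is the m x n design matrix (first column set to 1 for the intercept),
// y holds the m targets, alpha is the learning rate.
ublas::vector<double> gradient_descent(const ublas::matrix<double>& X,
                                       const ublas::vector<double>& y,
                                       double alpha,
                                       std::size_t iterations)
{
    const std::size_t m = X.size1();  // number of samples
    ublas::vector<double> theta = ublas::zero_vector<double>(X.size2());

    for (std::size_t it = 0; it < iterations; ++it) {
        // residuals = X*theta - y
        ublas::vector<double> residuals = ublas::prod(X, theta) - y;
        // gradient of the squared-error cost: (1/m) * X^T * residuals
        ublas::vector<double> gradient =
            ublas::prod(ublas::trans(X), residuals) / static_cast<double>(m);
        theta -= alpha * gradient;    // take one step downhill
    }
    return theta;
}

int main()
{
    // Tiny synthetic data set: y = 1 + 2*x
    ublas::matrix<double> X(4, 2);
    ublas::vector<double> y(4);
    for (std::size_t i = 0; i < 4; ++i) {
        X(i, 0) = 1.0;                     // intercept term
        X(i, 1) = static_cast<double>(i);  // single feature
        y(i)    = 1.0 + 2.0 * i;           // target
    }

    ublas::vector<double> theta = gradient_descent(X, y, 0.1, 5000);
    std::cout << "theta = " << theta << std::endl;  // roughly [1, 2]
    return 0;
}

With these settings the loop converges comfortably; a learning rate that is too large for the data would make the updates diverge instead.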
