Random commentary about Machine Learning, BigData, Spark, Deep Learning, C++, STL, Boost, Perl, Python, Algorithms, Problem Solving and Web Search
Sunday, May 12, 2013
See my old posting K-means in C++. Here you have a lightweight version of the code.
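The lightweight code itself was embedded in the original post and is not reproduced here; as a rough stand-in, this is a minimal sketch of Lloyd's k-means in plain C++/STL (not the posted code), with the seeding, number of clusters, and iteration count chosen purely for illustration.

#include <cstddef>
#include <iostream>
#include <limits>
#include <vector>

using Point = std::vector<double>;

double sq_dist(const Point& a, const Point& b) {
    double d = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
    return d;
}

// Assign each point to its nearest centroid, then recompute the centroids,
// repeating for a fixed number of iterations.
std::vector<std::size_t> kmeans(const std::vector<Point>& pts,
                                std::vector<Point>& centroids,
                                std::size_t iters) {
    std::vector<std::size_t> assign(pts.size(), 0);
    for (std::size_t it = 0; it < iters; ++it) {
        // assignment step: nearest centroid for every point
        for (std::size_t p = 0; p < pts.size(); ++p) {
            double best = std::numeric_limits<double>::max();
            for (std::size_t c = 0; c < centroids.size(); ++c) {
                double d = sq_dist(pts[p], centroids[c]);
                if (d < best) { best = d; assign[p] = c; }
            }
        }
        // update step: each centroid becomes the mean of its assigned points
        std::vector<Point> sums(centroids.size(), Point(pts[0].size(), 0.0));
        std::vector<std::size_t> counts(centroids.size(), 0);
        for (std::size_t p = 0; p < pts.size(); ++p) {
            ++counts[assign[p]];
            for (std::size_t i = 0; i < pts[p].size(); ++i) sums[assign[p]][i] += pts[p][i];
        }
        for (std::size_t c = 0; c < centroids.size(); ++c)
            if (counts[c] > 0)
                for (std::size_t i = 0; i < sums[c].size(); ++i)
                    centroids[c][i] = sums[c][i] / counts[c];
    }
    return assign;
}

int main() {
    std::vector<Point> pts = {{1.0, 1.0}, {1.5, 2.0}, {8.0, 8.0}, {9.0, 9.5}};
    std::vector<Point> centroids = {pts[0], pts[2]};   // naive seeding, illustration only
    std::vector<std::size_t> assign = kmeans(pts, centroids, 20);
    for (std::size_t p = 0; p < pts.size(); ++p)
        std::cout << "point " << p << " -> cluster " << assign[p] << "\n";
}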
Friday, May 10, 2013
Stochastic Gradient Descent
Added stochastic gradient descent to the linear regression code.
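The posted snippet is not reproduced here; the sketch below shows the idea in plain C++ (not the original code): one weight update per randomly drawn example under squared-error loss. The learning rate, step count, and toy data are chosen only for illustration.

#include <cstddef>
#include <iostream>
#include <random>
#include <vector>

int main() {
    // toy data: y = 1 + 2*x, where the first feature is a constant bias term
    std::vector<std::vector<double>> X = {{1, 1}, {1, 2}, {1, 3}, {1, 4}};
    std::vector<double> y = {3, 5, 7, 9};
    std::vector<double> w(2, 0.0);

    const double learning_rate = 0.05;   // illustrative values
    const std::size_t steps = 2000;
    std::mt19937 gen(42);
    std::uniform_int_distribution<std::size_t> pick(0, X.size() - 1);

    for (std::size_t s = 0; s < steps; ++s) {
        std::size_t i = pick(gen);                  // draw one example at random
        double pred = 0.0;
        for (std::size_t j = 0; j < w.size(); ++j) pred += w[j] * X[i][j];
        double err = pred - y[i];                   // residual for this single example
        for (std::size_t j = 0; j < w.size(); ++j)
            w[j] -= learning_rate * err * X[i][j];  // single-sample gradient step
    }
    std::cout << "w = [" << w[0] << ", " << w[1] << "]\n";  // expect roughly [1, 2]
}

Unlike the batch version below, each step uses the gradient of a single example, which makes every update cheap at the cost of a noisier path toward the minimum.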
Thursday, May 9, 2013
Learning linear regression with gradient descent
Last week I restarted an old and good habit (see A collection of algos and data structures published here). Every day I take a well-known algorithm and code it in C++ and Boost. Nothing else, just pure training and geeky fun. The only constraint is a time limit of 45 minutes, after a running session in St. James Park.
Here you have the code for linear regression with gradient descent in C++, Boost, and uBLAS. Linear regression is an approach to modeling the relationship between a scalar dependent variable y and one or more explanatory variables denoted X. uBLAS is a powerful library for efficient matrix and vector computations, and Boost::numeric::ublas provides an elegant, template-based C++ interface to it.
Here you have the code
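The code referred to above was embedded in the original post and is not reproduced here; the following is a minimal sketch of batch gradient descent for linear regression with Boost uBLAS (not the posted code), updating w with the full gradient X^T(Xw - y)/m at each step. The learning rate, iteration count, and toy data are illustrative.

#include <boost/numeric/ublas/matrix.hpp>
#include <boost/numeric/ublas/vector.hpp>
#include <boost/numeric/ublas/io.hpp>
#include <cstddef>
#include <iostream>

namespace ublas = boost::numeric::ublas;

int main() {
    // toy data: y = 1 + 2*x, first column is the bias feature
    ublas::matrix<double> X(4, 2);
    ublas::vector<double> y(4), w(2);
    const double xs[4] = {1, 2, 3, 4};
    for (std::size_t i = 0; i < 4; ++i) {
        X(i, 0) = 1.0; X(i, 1) = xs[i];
        y(i) = 1.0 + 2.0 * xs[i];
    }
    w(0) = 0.0; w(1) = 0.0;

    const double alpha = 0.05;            // learning rate, illustrative
    const std::size_t iterations = 2000;
    const double m = static_cast<double>(X.size1());

    for (std::size_t it = 0; it < iterations; ++it) {
        ublas::vector<double> residual = ublas::prod(X, w) - y;               // X w - y
        ublas::vector<double> grad = ublas::prod(ublas::trans(X), residual) / m;
        w -= alpha * grad;                // gradient step on the mean squared error
    }
    std::cout << "w = " << w << std::endl;  // expect roughly [1, 2]
}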