Wednesday, March 18, 2009

AdaBoost: improve your weak performance

AdaBoost is one of my favorite machine learning algorithms. The idea is quite intriguing: you start from a set of weak classifiers and learn how to combine them linearly so that the error is reduced. The result is a strong classifier built by boosting the weak classifiers.

Mathematically, given labels y_i in { +1, -1 } and a training dataset x_i, AdaBoost minimizes the exponential loss function sum_i exp(-y_i * f(x_i)). The function f = sum_t (alpha_t * h_t) is a linear combination of the weak classifiers h_t with weights alpha_t. For those who love optimization theory, AdaBoost is a classical application of gradient descent.
The algorithm is quite simple, and it was included among the top 10 data mining algorithms in 2007; its authors received the Gödel Prize in 2003. After AdaBoost, boosting became quite popular in the data mining community, with applications in ranking and clustering.

Here is the code for AdaBoost in C++ with the Boost libraries.


  1. M. Ahda Nasrullah, March 29, 2009 at 8:17 AM

    Interesting topic. I am also working on it in Java, but it does not work yet :D

    Thank you for sharing your knowledge, I appreciate it.

  2. Is there any other approach which solves a similar problem?

    Thank you for your code.
    I work in this field, but in MATLAB.
    I would like a multiclass solution.

  4. What about extending mine and sharing the code?

  5. Hi, I would rather say that it is an application of the coordinate descent optimization algorithm rather than the gradient descent.

  6. Great effort. But my compiler is missing two files, ptr_vector.hpp
    and matrix.hpp.

    How many files am I supposed to have when I decompress your archive?

    I got only 5 files, which are:

    adaboost.hpp, test.cpp,


  7. Make sure you have Boost (1.37) and that you update the -I path in your Makefile.

  8. Thanks for sharing this code. I am reusing it in my personal project. Is there any way to load the training dataset from a file in your code? How would I do that?

  9. Hello, does this code support n-dimensional feature vectors?
    In your test code you have used data.push_back(i);

    If I want to push an array instead of a single integer, do I need to change anything in your Adaboost class?

    Please help me.

  10. I can't get it to compile.
    I am using a Mac. I changed the -I to /opt/local/include and the -L to /opt/local/lib, but I get this error:

    g++ -L /opt/local/lib adaboost.o test.o -o ada_boosting
    ld: in /opt/local/lib, can't map file, errno=22
    collect2: ld returned 1 exit status

  11. Hi,
    Nice effort!
    However, I am not able to compile it.
    I use a Mac; I changed the -I to /opt/local/include and the -L to /opt/local/lib.
    However, I get the following error:
    g++ -L /opt/local/lib adaboost.o test.o -o ada_boosting
    ld: in /opt/local/lib, can't map file, errno=22
    collect2: ld returned 1 exit status

    Can you help me with that?

  12. Hey, nice post. I am looking for a document dealing with the practical aspects of AdaBoost, like scaling the data and how to tweak parameters such as the number of weak classifiers, the amount of training data, and so on. Let me know if you are aware of such a document.
    Jai Pillai

  13. Hello. How many random samples must be selected at each iteration in AdaBoost?

  14. Hi
    I have some problems with the details of AdaBoost. Would you please help me?

  15. Hello sir,
    I am a student from India, currently working on a project, "Automatic localization of backward collision of vehicles using a single camera." I need your help with some of the terms. I have also read your paper "Vehicle detection combining gradient analysis and
    AdaBoost classification". It is wonderful work, sir. I need your help in detecting objects (such as vehicles) at the rear end of a car. I am using Haar transforms and the AdaBoost algorithm for object detection. Is this approach correct, or do I need to make some changes? Kindly reply.

  16. Hi,
    Thanks very much!
    Is there a README file to help us use and learn your program?

  17. Hi,
    in the ada_boost() function in class ADA in your code, shouldn't it be
    "alpha.resize(label_size);" instead of "alpha.resize(classifiers_size);"?

    Thank you for sharing your source code.

  18. In the ada_boost() function in class ADA in your code:

    A : alpha.resize(label_size);
    B : alpha.resize(classifiers_size);

    It should be A instead of B, right?

    Thanks for sharing your code.

  19. Hi, nice post