Curse of Dimensionality in the context of Machine Learning and Pattern Recognition

I have written the following in the context of the "Curse of Dimensionality":

As we increase the number of variables (dimensions), the volume of the feature space grows exponentially, so a fixed amount of data covers it ever more sparsely. We then need more data and more computational power to test those features, and with more variables we also pick up more noise. This phenomenon is called the curse of dimensionality. So we have to reduce the dimensions, which may cause the loss of some information.
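One way to see this effect numerically is "distance concentration": in high dimensions, the nearest and farthest points from a query look almost equally far away, which is part of why many methods break down. Below is a small NumPy sketch (the function name `distance_spread` and the sample sizes are my own choices, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_spread(dim, n=500):
    """Relative spread (max - min) / min of distances from a random
    reference point to n uniform points in the unit hypercube.
    A large value means near and far points are easy to tell apart;
    the value shrinks as dim grows (distance concentration)."""
    pts = rng.random((n, dim))      # n points in [0, 1]^dim
    ref = rng.random(dim)           # one reference (query) point
    d = np.linalg.norm(pts - ref, axis=1)
    return (d.max() - d.min()) / d.min()

# Spread drops sharply as the dimension increases
for dim in (2, 10, 100, 1000):
    print(f"dim={dim:5d}  spread={distance_spread(dim):.3f}")
```

Running this, the spread in 2 dimensions is orders of magnitude larger than in 1000 dimensions, which is the sparsity you describe made concrete: with the same 500 points, the high-dimensional cube is almost empty around any query.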

Is the above correct? What else can I add to it in simple words?