I have recently been encountering explanations that define classification by discrete label values and regression by continuous label values. I have always conceived of the difference geometrically: in [imath]\mathbb{R}^2[/imath], for example, separating the datapoints with curves is classification, while best fitting the datapoints with a curve is regression.
Do these two characterizations of the difference between classification and regression follow from each other? The geometry of multiclass classification in [imath]\mathbb{R}^2[/imath] consists of adding further curves that subdivide the data into more classes. Since a continuous interval can be viewed as the limit of recursively subdividing it into discrete buckets, can curve fitting be thought of as the limiting case of adding more and more separating curves, thereby unifying the two concepts?
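To make the "limiting case" intuition concrete, here is a minimal numpy sketch (the synthetic sine data and the bin-center decoding are my own illustrative choices, not from any standard reference): the continuous target is quantized into k discrete class buckets, each point is "classified" into a bucket, and the bucket's center is decoded as the prediction. The worst-case error of this classification-style predictor is half a bin width, so it shrinks toward zero as k grows, recovering regression in the limit.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x)  # the continuous target a regressor would fit

def binned_fit(y, n_bins):
    """Quantize y into n_bins discrete classes, then decode each class
    back to its bin-center value: regression via ever-finer classification."""
    edges = np.linspace(y.min(), y.max(), n_bins + 1)
    # assign each point a discrete class label (its bucket index)
    labels = np.clip(np.digitize(y, edges) - 1, 0, n_bins - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[labels]  # the coarsened, classification-style prediction

# worst-case error is bounded by half a bin width, so doubling the number
# of classes roughly halves it -- the discrete labels converge on the curve
for k in (2, 8, 64, 512):
    err = np.max(np.abs(binned_fit(y, k) - y))
    print(f"{k:4d} classes -> max error {err:.4f}")
```

As k grows the class buckets become indistinguishable from a continuous label, which is one way to see the discrete-vs-continuous characterization and the geometric one pointing at the same underlying picture.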