A version of the K Nearest Neighbors algorithm that makes continuous-valued predictions suitable for regression by averaging the outcomes of the k nearest data points to an unknown sample.
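Framework aside, the core prediction rule is easy to sketch. The following is a minimal, hypothetical illustration in plain Python of the mean-of-k-neighbors idea described above, not the library's implementation:

```python
import math

def knn_regress(train, targets, sample, k=5):
    """Predict the mean target of the k training points nearest to sample."""
    # Pair each training point's Euclidean distance to the sample with its outcome.
    dists = [(math.dist(x, sample), y) for x, y in zip(train, targets)]

    # Keep the k nearest neighbors and average their continuous outcomes.
    nearest = sorted(dists, key=lambda d: d[0])[:k]

    return sum(y for _, y in nearest) / len(nearest)

train = [[1.0], [2.0], [3.0], [10.0]]
targets = [1.0, 2.0, 3.0, 10.0]

# The 3 nearest points to 2.1 have outcomes 1.0, 2.0, and 3.0.
print(knn_regress(train, targets, [2.1], k=3))  # → 2.0
```

Because all of the work happens at prediction time, the "training" step amounts to storing the dataset, which is what makes the learner lazy.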
Note: KNN is considered a *lazy* learner because it performs most of its computation at inference time. For a faster spatial tree-accelerated version, see KD Neighbors Regressor.
**Data Type Compatibility:** Depends on distance kernel
| # | Param | Default | Type | Description |
|---|---|---|---|---|
| 1 | k | 5 | int | The number of nearest neighbors to consider when making a prediction. |
| 2 | weighted | true | bool | Should we consider the distances of our nearest neighbors when making predictions? |
| 3 | kernel | Euclidean | Distance | The distance kernel used to compute the distance between sample points. |
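To illustrate the effect of the `weighted` hyper-parameter: when weighting is enabled, each neighbor typically contributes in proportion to the inverse of its distance, so closer points dominate the prediction. The sketch below assumes simple inverse-distance weighting with a small epsilon guard; the library's exact weighting scheme may differ:

```python
import math

def weighted_knn_regress(train, targets, sample, k=5, eps=1e-8):
    """Inverse-distance-weighted mean of the k nearest targets (illustrative)."""
    # Sort training points by distance to the sample and keep the k nearest.
    nearest = sorted(
        ((math.dist(x, sample), y) for x, y in zip(train, targets)),
        key=lambda d: d[0],
    )[:k]

    # Closer neighbors get larger weights; eps guards against division by zero
    # when a neighbor coincides exactly with the sample.
    weights = [1.0 / (d + eps) for d, _ in nearest]

    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

train = [[1.0], [2.0], [3.0]]
targets = [1.0, 2.0, 3.0]

# The neighbor at 2.0 is far closer to 2.05 than the others,
# so the weighted prediction hugs 2.0 rather than the plain mean.
print(weighted_knn_regress(train, targets, [2.05], k=3))
```

With weighting disabled, the same query would return the unweighted mean of all three targets (2.0 here by coincidence of the symmetric data); on skewed neighborhoods the two settings diverge noticeably.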
```php
use Rubix\ML\Regressors\KNNRegressor;
use Rubix\ML\Kernels\Distance\SafeEuclidean;

$estimator = new KNNRegressor(2, false, new SafeEuclidean());
```
This estimator does not have any additional methods.