Classification kriging
An example of classification kriging: a scikit-learn classifier predicts class probabilities from non-spatial features, and ordinary kriging of the probability residuals adds the spatial component.
Out:
========================================
classification model: SVC
Finished learning classification model
Finished kriging residuals
Classification Score: 0.212
CK score: 0.6566666666666666
========================================
classification model: RandomForestClassifier
Finished learning classification model
Finished kriging residuals
Classification Score: 0.5766666666666667
CK score: 0.5946666666666667
========================================
classification model: LogisticRegression
/home/docs/checkouts/readthedocs.org/user_builds/pykrige/envs/v1.6.1/lib/python3.7/site-packages/sklearn/linear_model/_logistic.py:765: ConvergenceWarning: lbfgs failed to converge (status=1):
STOP: TOTAL NO. of ITERATIONS REACHED LIMIT.
Increase the number of iterations (max_iter) or scale the data as shown in:
https://scikit-learn.org/stable/modules/preprocessing.html
Please also refer to the documentation for alternative solver options:
https://scikit-learn.org/stable/modules/linear_model.html#logistic-regression
extra_warning_msg=_LOGISTIC_SOLVER_CONVERGENCE_MSG)
Finished learning classification model
Finished kriging residuals
Classification Score: 0.5193333333333333
CK score: 0.6553333333333333
import sys

from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import fetch_california_housing
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.model_selection import train_test_split

from pykrige.ck import ClassificationKriging

svc_model = SVC(C=0.1, gamma="auto", probability=True)
rf_model = RandomForestClassifier(n_estimators=100)
lr_model = LogisticRegression(max_iter=10000)

models = [svc_model, rf_model, lr_model]

try:
    housing = fetch_california_housing()
except PermissionError:
    # this dataset can occasionally fail to download on Windows
    sys.exit(0)

# take the first 5000 samples, as kriging is memory intensive
p = housing["data"][:5000, :-2]  # non-spatial features
x = housing["data"][:5000, -2:]  # spatial coordinates (latitude, longitude)
target = housing["target"][:5000]

# discretize the continuous median house value into ordinal classes
discretizer = KBinsDiscretizer(encode="ordinal")
target = discretizer.fit_transform(target.reshape(-1, 1))

p_train, p_test, x_train, x_test, target_train, target_test = train_test_split(
    p, x, target, test_size=0.3, random_state=42
)
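The KBinsDiscretizer step above turns the continuous house-value target into ordinal class labels. A minimal, self-contained sketch of that transformation on synthetic values (not the housing data; `n_bins=5` and `strategy="quantile"` are the scikit-learn defaults that the example relies on implicitly):

```python
import numpy as np
from sklearn.preprocessing import KBinsDiscretizer

# ten evenly spaced values standing in for house prices
y = np.arange(1.0, 11.0).reshape(-1, 1)

# 5 quantile bins, ordinal labels 0..4 (one column in, one column out)
disc = KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile")
labels = disc.fit_transform(y).ravel()
print(labels)  # each quantile bin collects two consecutive values
```

With quantile binning, each of the five bins receives an equal share of the samples, which is why the example can feed the result straight into a classifier without worrying about empty classes.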
for m in models:
    print("=" * 40)
    print("classification model:", m.__class__.__name__)
    m_ck = ClassificationKriging(classification_model=m, n_closest_points=10)
    m_ck.fit(p_train, x_train, target_train)
    print(
        "Classification Score: ", m_ck.classification_model.score(p_test, target_test)
    )
    print("CK score: ", m_ck.score(p_test, x_test, target_test))
Total running time of the script: ( 0 minutes 28.788 seconds)