Commit 7d3bb6f1 authored by Dominique Benielli

compute alpha mumbo

parent cc9a556e
Pipeline #4204 passed
Showing 184 additions and 51 deletions
(Two binary files changed; no preview available.)
@@ -172,8 +172,8 @@ The following toy examples illustrate how the MVML algorithm
 .. _sphx_glr_tutorial_auto_examples_usecase:

-Use Case Examples
------------------
+Use Case Examples on Digit
+--------------------------
 The following toy examples illustrate how the multimodal as usecase on digit dataset of sklearn
@@ -262,7 +262,7 @@ The following toy examples illustrate how the multimodal as usecase on digit da
 .. raw:: html

-    <div class="sphx-glr-thumbcontainer" tooltip="Use Case MKL">
+    <div class="sphx-glr-thumbcontainer" tooltip="Use Case MKL on digit">
 .. only:: html
......
Regenerated gallery images (old size → new size; image diff viewer omitted):

doc/tutorial/auto_examples/usecase/images/sphx_glr_plot_usecase_exampleMKL_001.png: 73.4 KiB → 80.3 KiB
doc/tutorial/auto_examples/usecase/images/sphx_glr_plot_usecase_exampleMVML_001.png: 75.1 KiB → 80.3 KiB
doc/tutorial/auto_examples/usecase/images/sphx_glr_plot_usecase_exampleMuCuBo_001.png: 76 KiB → 79.3 KiB
doc/tutorial/auto_examples/usecase/images/sphx_glr_plot_usecase_exampleMumBo_001.png: 71.6 KiB → 75.9 KiB
doc/tutorial/auto_examples/usecase/images/thumb/sphx_glr_plot_usecase_exampleMKL_thumb.png: 23.4 KiB → 24.5 KiB
doc/tutorial/auto_examples/usecase/images/thumb/sphx_glr_plot_usecase_exampleMVML_thumb.png: 23.6 KiB → 24.5 KiB
doc/tutorial/auto_examples/usecase/images/thumb/sphx_glr_plot_usecase_exampleMuCuBo_thumb.png: 23.7 KiB → 24.6 KiB
doc/tutorial/auto_examples/usecase/images/thumb/sphx_glr_plot_usecase_exampleMumBo_thumb.png: 22.8 KiB → 23.6 KiB
%% Cell type:code id: tags:
``` python
%matplotlib inline
```
%% Cell type:markdown id: tags:
-# Use Case MKL
+# Use Case MKL on digit

Use case for all classifier of multimodallearn MKL
multi class digit from sklearn, multivue
 - vue 0 digit data (color of sklearn)
 - vue 1 gradiant of image in first direction
 - vue 2 gradiant of image in second direction
%% Cell type:code id: tags: (cell content after the change)
``` python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.multiclass import OneVsOneClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from multimodal.datasets.base import load_dict, save_dict
from multimodal.tests.data.get_dataset_path import get_dataset_path
from multimodal.datasets.data_sample import MultiModalArray
from multimodal.kernels.mvml import MVML
from multimodal.kernels.lpMKL import MKL

import numpy as np
import matplotlib.pyplot as plt
import matplotlib._color_data as mcd


def plot_subplot(X, Y, Y_pred, vue, subplot, title):
    cn = mcd.CSS4_COLORS
    classes = np.unique(Y)
    n_classes = len(np.unique(Y))
    axs = plt.subplot(subplot[0],subplot[1],subplot[2])
    axs.set_title(title)
    #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
    #            edgecolors=(0, 0, 0))
    for index, k in zip(range(n_classes), cn.keys()):
        Y_class, = np.where(Y==classes[index])
        Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
        plt.scatter(X._extract_view(vue)[Y_class],
                    X._extract_view(vue)[Y_class],
                    s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
        plt.scatter(X._extract_view(vue)[Y_class_pred],
                    X._extract_view(vue)[Y_class_pred],
                    s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))


if __name__ == '__main__':
    # file = get_dataset_path("digit_histogram.npy")
    file = get_dataset_path("digit_col_grad.npy")
    y = np.load(get_dataset_path("digit_y.npy"))
    base_estimator = DecisionTreeClassifier(max_depth=4)
    dic_digit = load_dict(file)
    XX =MultiModalArray(dic_digit)
    X_train, X_test, y_train, y_test = train_test_split(XX, y)

    est4 = OneVsOneClassifier(MKL(lmbda=0.1, nystrom_param=0.2)).fit(X_train, y_train)
    y_pred4 = est4.predict(X_test)
    y_pred44 = est4.predict(X_train)
    print("result of MKL on digit with oneversone")
    result4 = np.mean(y_pred4.ravel() == y_test.ravel()) * 100
    print(result4)

    fig = plt.figure(figsize=(12., 11.))
    fig.suptitle("MKL : result" + str(result4), fontsize=16)
    plot_subplot(X_train, y_train, y_pred44 ,0, (4, 1, 1), "train vue 0 color" )
    plot_subplot(X_test, y_test,y_pred4 , 0, (4, 1, 2), "test vue 0 color" )
    plot_subplot(X_test, y_test, y_pred4,1, (4, 1, 3), "test vue 1 gradiant 0" )
    plot_subplot(X_test, y_test,y_pred4, 2, (4, 1, 4), "test vue 2 gradiant 1" )
    # plt.legend()
    plt.show()
```
......
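The markdown cell above describes the three "vues" (views) of the sklearn digit data shared by the use-case examples: the raw pixel values plus the image gradient in each of the two directions, pre-computed and stored in digit_col_grad.npy. The sketch below shows, under stated assumptions, how such a three-view dictionary could be built and wrapped in a MultiModalArray; the np.gradient calls and the dictionary keys are illustrative guesses, not the script that produced the shipped .npy file.

``` python
# Hypothetical sketch only: rebuild the three views described in the markdown
# cell (raw digits + two gradient directions). Not the commit's data-generation code.
import numpy as np
from sklearn.datasets import load_digits
from multimodal.datasets.data_sample import MultiModalArray

digits = load_digits()
images = digits.images                        # (n_samples, 8, 8) grey-level images
grad_dir0 = np.gradient(images, axis=1)       # vue 1: gradient along the first image direction
grad_dir1 = np.gradient(images, axis=2)       # vue 2: gradient along the second image direction

views = {0: images.reshape(len(images), -1),  # vue 0: raw pixel data
         1: grad_dir0.reshape(len(images), -1),
         2: grad_dir1.reshape(len(images), -1)}
X = MultiModalArray(views)                    # multi-view container used by the examples
y = digits.target
```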
 # -*- coding: utf-8 -*-
 """
-============
-Use Case MKL
-============
+=====================
+Use Case MKL on digit
+=====================
 Use case for all classifier of multimodallearn MKL
 multi class digit from sklearn, multivue
 - vue 0 digit data (color of sklearn)
@@ -10,7 +10,7 @@ multi class digit from sklearn, multivue
 - vue 2 gradiant of image in second direction
 """
-from __future__ import absolute_import
 import numpy as np
 import matplotlib.pyplot as plt
 from sklearn.multiclass import OneVsOneClassifier
@@ -22,7 +22,29 @@ from multimodal.datasets.data_sample import MultiModalArray
 from multimodal.kernels.mvml import MVML
 from multimodal.kernels.lpMKL import MKL

-from usecase_function import plot_subplot
+import numpy as np
+import matplotlib.pyplot as plt
+import matplotlib._color_data as mcd
+
+
+def plot_subplot(X, Y, Y_pred, vue, subplot, title):
+    cn = mcd.CSS4_COLORS
+    classes = np.unique(Y)
+    n_classes = len(np.unique(Y))
+    axs = plt.subplot(subplot[0],subplot[1],subplot[2])
+    axs.set_title(title)
+    #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
+    #            edgecolors=(0, 0, 0))
+    for index, k in zip(range(n_classes), cn.keys()):
+        Y_class, = np.where(Y==classes[index])
+        Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
+        plt.scatter(X._extract_view(vue)[Y_class],
+                    X._extract_view(vue)[Y_class],
+                    s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
+        plt.scatter(X._extract_view(vue)[Y_class_pred],
+                    X._extract_view(vue)[Y_class_pred],
+                    s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))
+

 if __name__ == '__main__':
     # file = get_dataset_path("digit_histogram.npy")
@@ -42,10 +64,10 @@ if __name__ == '__main__':

     fig = plt.figure(figsize=(12., 11.))
     fig.suptitle("MKL : result" + str(result4), fontsize=16)
-    plot_subplot(X_train, y_train, y_pred44 ,0, (4, 1, 1), "train vue 0" )
-    plot_subplot(X_test, y_test,y_pred4 , 0, (4, 1, 2), "test vue 0" )
-    plot_subplot(X_test, y_test, y_pred4,1, (4, 1, 3), "test vue 1" )
-    plot_subplot(X_test, y_test,y_pred4, 2, (4, 1, 4), "test vue 2" )
+    plot_subplot(X_train, y_train, y_pred44 ,0, (4, 1, 1), "train vue 0 color" )
+    plot_subplot(X_test, y_test,y_pred4 , 0, (4, 1, 2), "test vue 0 color" )
+    plot_subplot(X_test, y_test, y_pred4,1, (4, 1, 3), "test vue 1 gradiant 0" )
+    plot_subplot(X_test, y_test,y_pred4, 2, (4, 1, 4), "test vue 2 gradiant 1" )
     # plt.legend()
     plt.show()
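Every scatter call in the inlined plot_subplot sets a label, yet plt.legend() stays commented out, presumably because ten classes times two scatter calls per class would flood the legend. If a legend is wanted, a common matplotlib idiom is to keep one handle per unique label before drawing it; this is only a suggestion, not part of the commit.

``` python
# Generic matplotlib idiom (not from the commit): one legend entry per unique label.
import matplotlib.pyplot as plt

def legend_unique(ax):
    handles, labels = ax.get_legend_handles_labels()
    by_label = dict(zip(labels, handles))    # deduplicate: keep one handle per label
    ax.legend(by_label.values(), by_label.keys(), fontsize="x-small", ncol=2)

# e.g. after the plot_subplot calls:
# legend_unique(plt.gca())
```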
-4f807359096f5f5b3a7ee6b3ea540b91
+f7b5c3f0fd24e4628f03aa7019eea376
\ No newline at end of file
@@ -7,9 +7,9 @@
 .. _sphx_glr_tutorial_auto_examples_usecase_plot_usecase_exampleMKL.py:

-============
-Use Case MKL
-============
+=====================
+Use Case MKL on digit
+=====================
 Use case for all classifier of multimodallearn MKL
 multi class digit from sklearn, multivue
 - vue 0 digit data (color of sklearn)
@@ -30,8 +30,8 @@ multi class digit from sklearn, multivue
 .. code-block:: none

     result of MKL on digit with oneversone
-    98.44444444444444
-    /home/dominique/projets/ANR-Lives/scikit-multimodallearn/examples/usecase/plot_usecase_exampleMKL.py:50: UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.
+    96.88888888888889
+    /home/dominique/projets/ANR-Lives/scikit-multimodallearn/examples/usecase/plot_usecase_exampleMKL.py:72: UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.
       plt.show()
@@ -44,7 +44,7 @@ multi class digit from sklearn, multivue
 .. code-block:: default

-    from __future__ import absolute_import
     import numpy as np
     import matplotlib.pyplot as plt
     from sklearn.multiclass import OneVsOneClassifier
@@ -56,7 +56,29 @@ multi class digit from sklearn, multivue
     from multimodal.kernels.mvml import MVML
     from multimodal.kernels.lpMKL import MKL

-    from usecase_function import plot_subplot
+    import numpy as np
+    import matplotlib.pyplot as plt
+    import matplotlib._color_data as mcd
+
+
+    def plot_subplot(X, Y, Y_pred, vue, subplot, title):
+        cn = mcd.CSS4_COLORS
+        classes = np.unique(Y)
+        n_classes = len(np.unique(Y))
+        axs = plt.subplot(subplot[0],subplot[1],subplot[2])
+        axs.set_title(title)
+        #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
+        #            edgecolors=(0, 0, 0))
+        for index, k in zip(range(n_classes), cn.keys()):
+            Y_class, = np.where(Y==classes[index])
+            Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
+            plt.scatter(X._extract_view(vue)[Y_class],
+                        X._extract_view(vue)[Y_class],
+                        s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
+            plt.scatter(X._extract_view(vue)[Y_class_pred],
+                        X._extract_view(vue)[Y_class_pred],
+                        s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))
+

     if __name__ == '__main__':
         # file = get_dataset_path("digit_histogram.npy")
@@ -76,10 +98,10 @@ multi class digit from sklearn, multivue

         fig = plt.figure(figsize=(12., 11.))
         fig.suptitle("MKL : result" + str(result4), fontsize=16)
-        plot_subplot(X_train, y_train, y_pred44 ,0, (4, 1, 1), "train vue 0" )
-        plot_subplot(X_test, y_test,y_pred4 , 0, (4, 1, 2), "test vue 0" )
-        plot_subplot(X_test, y_test, y_pred4,1, (4, 1, 3), "test vue 1" )
-        plot_subplot(X_test, y_test,y_pred4, 2, (4, 1, 4), "test vue 2" )
+        plot_subplot(X_train, y_train, y_pred44 ,0, (4, 1, 1), "train vue 0 color" )
+        plot_subplot(X_test, y_test,y_pred4 , 0, (4, 1, 2), "test vue 0 color" )
+        plot_subplot(X_test, y_test, y_pred4,1, (4, 1, 3), "test vue 1 gradiant 0" )
+        plot_subplot(X_test, y_test,y_pred4, 2, (4, 1, 4), "test vue 2 gradiant 1" )
         # plt.legend()
         plt.show()
@@ -87,7 +109,7 @@ multi class digit from sklearn, multivue

 .. rst-class:: sphx-glr-timing

-   **Total running time of the script:** ( 0 minutes 12.697 seconds)
+   **Total running time of the script:** ( 0 minutes 20.457 seconds)

 .. _sphx_glr_download_tutorial_auto_examples_usecase_plot_usecase_exampleMKL.py:
......
(Binary file changed; no preview available.)
%% Cell type:code id: tags:
``` python
%matplotlib inline
```
%% Cell type:markdown id: tags:
-# Use Case of MVML
+# Use Case of MVML on digit

Use case for all classifier of multimodallearn MVML

multi class digit from sklearn, multivue
 - vue 0 digit data (color of sklearn)
 - vue 1 gradiant of image in first direction
 - vue 2 gradiant of image in second direction
%% Cell type:code id: tags: (cell content after the change)
``` python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.multiclass import OneVsOneClassifier
from sklearn.model_selection import train_test_split
from multimodal.datasets.base import load_dict, save_dict
from multimodal.tests.data.get_dataset_path import get_dataset_path
from multimodal.datasets.data_sample import MultiModalArray
from multimodal.kernels.mvml import MVML
import numpy as np
import matplotlib.pyplot as plt
import matplotlib._color_data as mcd


def plot_subplot(X, Y, Y_pred, vue, subplot, title):
    cn = mcd.CSS4_COLORS
    classes = np.unique(Y)
    n_classes = len(np.unique(Y))
    axs = plt.subplot(subplot[0],subplot[1],subplot[2])
    axs.set_title(title)
    #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
    #            edgecolors=(0, 0, 0))
    for index, k in zip(range(n_classes), cn.keys()):
        Y_class, = np.where(Y==classes[index])
        Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
        plt.scatter(X._extract_view(vue)[Y_class],
                    X._extract_view(vue)[Y_class],
                    s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
        plt.scatter(X._extract_view(vue)[Y_class_pred],
                    X._extract_view(vue)[Y_class_pred],
                    s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))


if __name__ == '__main__':
    # file = get_dataset_path("digit_histogram.npy")
    file = get_dataset_path("digit_col_grad.npy")
    y = np.load(get_dataset_path("digit_y.npy"))
    dic_digit = load_dict(file)
    XX =MultiModalArray(dic_digit)
    X_train, X_test, y_train, y_test = train_test_split(XX, y)
    est1 = OneVsOneClassifier(MVML(lmbda=0.1, eta=1, nystrom_param=0.2)).fit(X_train, y_train)
    y_pred1 = est1.predict(X_test)
    y_pred11 = est1.predict(X_train)
    print("result of MVML on digit with oneversone")
    result1 = np.mean(y_pred1.ravel() == y_test.ravel()) * 100
    print(result1)

    fig = plt.figure(figsize=(12., 11.))
    fig.suptitle("MVML: result" + str(result1), fontsize=16)
    plot_subplot(X_train, y_train, y_pred11
                 , 0, (4, 1, 1), "train vue 0 color" )
    plot_subplot(X_test, y_test,y_pred1, 0, (4, 1, 2), "test vue 0 color" )
    plot_subplot(X_test, y_test, y_pred1, 1, (4, 1, 3), "test vue 1 gradiant 0" )
    plot_subplot(X_test, y_test,y_pred1, 2, (4, 1, 4), "test vue 2 gradiant 1" )
    #plt.legend()
    plt.show()
```
......
 # -*- coding: utf-8 -*-
 """
-================
-Use Case of MVML
-================
+=========================
+Use Case of MVML on digit
+========================
 Use case for all classifier of multimodallearn MVML

 multi class digit from sklearn, multivue
@@ -11,7 +11,7 @@ multi class digit from sklearn, multivue
 - vue 2 gradiant of image in second direction
 """
-from __future__ import absolute_import
 import numpy as np
 import matplotlib.pyplot as plt
 from sklearn.multiclass import OneVsOneClassifier
@@ -20,7 +20,29 @@ from multimodal.datasets.base import load_dict, save_dict
 from multimodal.tests.data.get_dataset_path import get_dataset_path
 from multimodal.datasets.data_sample import MultiModalArray
 from multimodal.kernels.mvml import MVML
-from usecase_function import plot_subplot
+import numpy as np
+import matplotlib.pyplot as plt
+import matplotlib._color_data as mcd
+
+
+def plot_subplot(X, Y, Y_pred, vue, subplot, title):
+    cn = mcd.CSS4_COLORS
+    classes = np.unique(Y)
+    n_classes = len(np.unique(Y))
+    axs = plt.subplot(subplot[0],subplot[1],subplot[2])
+    axs.set_title(title)
+    #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
+    #            edgecolors=(0, 0, 0))
+    for index, k in zip(range(n_classes), cn.keys()):
+        Y_class, = np.where(Y==classes[index])
+        Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
+        plt.scatter(X._extract_view(vue)[Y_class],
+                    X._extract_view(vue)[Y_class],
+                    s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
+        plt.scatter(X._extract_view(vue)[Y_class_pred],
+                    X._extract_view(vue)[Y_class_pred],
+                    s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))
+

 if __name__ == '__main__':
@@ -40,10 +62,10 @@ if __name__ == '__main__':

     fig = plt.figure(figsize=(12., 11.))
     fig.suptitle("MVML: result" + str(result1), fontsize=16)
     plot_subplot(X_train, y_train, y_pred11
-                 , 0, (4, 1, 1), "train vue 0" )
-    plot_subplot(X_test, y_test,y_pred1, 0, (4, 1, 2), "test vue 0" )
-    plot_subplot(X_test, y_test, y_pred1, 1, (4, 1, 3), "test vue 1" )
-    plot_subplot(X_test, y_test,y_pred1, 2, (4, 1, 4), "test vue 2" )
+                 , 0, (4, 1, 1), "train vue 0 color" )
+    plot_subplot(X_test, y_test,y_pred1, 0, (4, 1, 2), "test vue 0 color" )
+    plot_subplot(X_test, y_test, y_pred1, 1, (4, 1, 3), "test vue 1 gradiant 0" )
+    plot_subplot(X_test, y_test,y_pred1, 2, (4, 1, 4), "test vue 2 gradiant 1" )
     #plt.legend()
     plt.show()

-b4b4bb03418027ba62ce77c251085cf5
+c401fe6af938dc5fef9c977303a2fdcf
\ No newline at end of file
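The MVML example fixes lmbda=0.1, eta=1 and nystrom_param=0.2. Since the estimator is wrapped in OneVsOneClassifier and appears to follow the scikit-learn estimator API, those values could in principle be tuned with an ordinary grid search; the sketch below is hedged, its candidate values are arbitrary, and only the parameter names come from the diff.

``` python
# Hedged sketch: tune the MVML hyper-parameters used above. Candidate values are
# arbitrary examples; only lmbda, eta and nystrom_param are taken from the diff.
from sklearn.model_selection import GridSearchCV
from sklearn.multiclass import OneVsOneClassifier
from multimodal.kernels.mvml import MVML

param_grid = {"estimator__lmbda": [0.01, 0.1, 1.0],     # first regularization parameter
              "estimator__eta": [0.1, 1.0, 10.0],       # second regularization parameter
              "estimator__nystrom_param": [0.2, 0.5]}   # Nystrom approximation level
base = OneVsOneClassifier(MVML(lmbda=0.1, eta=1, nystrom_param=0.2))
search = GridSearchCV(base, param_grid, cv=3)
# search.fit(X_train, y_train); print(search.best_params_, search.best_score_)
```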
@@ -7,9 +7,9 @@
 .. _sphx_glr_tutorial_auto_examples_usecase_plot_usecase_exampleMVML.py:

-================
-Use Case of MVML
-================
+=========================
+Use Case of MVML on digit
+========================
 Use case for all classifier of multimodallearn MVML

 multi class digit from sklearn, multivue
@@ -30,9 +30,54 @@ multi class digit from sklearn, multivue
 .. code-block:: none

+    WARNING:root:warning appears during fit process{'precond_A': 4, 'precond_A_1': 5}
+    WARNING:root:warning appears during fit process{'precond_A': 4, 'precond_A_1': 2}
+    WARNING:root:warning appears during fit process{'precond_A': 4, 'precond_A_1': 5}
+    [... 42 further added "warning appears during fit process" lines, with 'precond_A' between 3 and 5 and 'precond_A_1' between 1 and 6 ...]
     result of MVML on digit with oneversone
-    98.88888888888889
-    /home/dominique/projets/ANR-Lives/scikit-multimodallearn/examples/usecase/plot_usecase_exampleMVML.py:48: UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.
+    96.88888888888889
+    /home/dominique/projets/ANR-Lives/scikit-multimodallearn/examples/usecase/plot_usecase_exampleMVML.py:70: UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.
       plt.show()
@@ -45,7 +90,7 @@ multi class digit from sklearn, multivue
 .. code-block:: default

-    from __future__ import absolute_import
     import numpy as np
     import matplotlib.pyplot as plt
     from sklearn.multiclass import OneVsOneClassifier
@@ -54,7 +99,29 @@ multi class digit from sklearn, multivue
     from multimodal.tests.data.get_dataset_path import get_dataset_path
     from multimodal.datasets.data_sample import MultiModalArray
     from multimodal.kernels.mvml import MVML
-    from usecase_function import plot_subplot
+    import numpy as np
+    import matplotlib.pyplot as plt
+    import matplotlib._color_data as mcd
+
+
+    def plot_subplot(X, Y, Y_pred, vue, subplot, title):
+        cn = mcd.CSS4_COLORS
+        classes = np.unique(Y)
+        n_classes = len(np.unique(Y))
+        axs = plt.subplot(subplot[0],subplot[1],subplot[2])
+        axs.set_title(title)
+        #plt.scatter(X._extract_view(vue), X._extract_view(vue), s=40, c='gray',
+        #            edgecolors=(0, 0, 0))
+        for index, k in zip(range(n_classes), cn.keys()):
+            Y_class, = np.where(Y==classes[index])
+            Y_class_pred = np.intersect1d(np.where(Y_pred==classes[index])[0], np.where(Y_pred==Y)[0])
+            plt.scatter(X._extract_view(vue)[Y_class],
+                        X._extract_view(vue)[Y_class],
+                        s=40, c=cn[k], edgecolors='blue', linewidths=2, label="class real class: "+str(index))
+            plt.scatter(X._extract_view(vue)[Y_class_pred],
+                        X._extract_view(vue)[Y_class_pred],
+                        s=160, edgecolors='orange', linewidths=2, label="class prediction: "+str(index))
+

     if __name__ == '__main__':
@@ -74,10 +141,10 @@ multi class digit from sklearn, multivue

         fig = plt.figure(figsize=(12., 11.))
         fig.suptitle("MVML: result" + str(result1), fontsize=16)
         plot_subplot(X_train, y_train, y_pred11
-                     , 0, (4, 1, 1), "train vue 0" )
-        plot_subplot(X_test, y_test,y_pred1, 0, (4, 1, 2), "test vue 0" )
-        plot_subplot(X_test, y_test, y_pred1, 1, (4, 1, 3), "test vue 1" )
-        plot_subplot(X_test, y_test,y_pred1, 2, (4, 1, 4), "test vue 2" )
+                     , 0, (4, 1, 1), "train vue 0 color" )
+        plot_subplot(X_test, y_test,y_pred1, 0, (4, 1, 2), "test vue 0 color" )
+        plot_subplot(X_test, y_test, y_pred1, 1, (4, 1, 3), "test vue 1 gradiant 0" )
+        plot_subplot(X_test, y_test,y_pred1, 2, (4, 1, 4), "test vue 2 gradiant 1" )
         #plt.legend()
         plt.show()
@@ -85,7 +152,7 @@ multi class digit from sklearn, multivue

 .. rst-class:: sphx-glr-timing

-   **Total running time of the script:** ( 0 minutes 39.921 seconds)
+   **Total running time of the script:** ( 1 minutes 14.485 seconds)

 .. _sphx_glr_download_tutorial_auto_examples_usecase_plot_usecase_exampleMVML.py:
......
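The regenerated MVML output above now includes a long run of WARNING:root messages emitted during fitting (the precond_A diagnostics condensed earlier). If those records should not end up in the rendered gallery page, one option is to raise the root logger's threshold around the fit call; this is a generic standard-library sketch, not something this commit does.

``` python
# Generic sketch (not part of the commit): hide root-logger WARNING records,
# such as "warning appears during fit process", while fitting.
import logging

logging.getLogger().setLevel(logging.ERROR)    # suppress WARNING-level output
# est1 = OneVsOneClassifier(MVML(lmbda=0.1, eta=1, nystrom_param=0.2)).fit(X_train, y_train)
logging.getLogger().setLevel(logging.WARNING)  # restore the default threshold
```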