Commit e236fcd2 authored by Balthazar Casale

Update 2 files

- /src/models/approx_based.py
- /README.md
parent 07bb4e21
README.md

@@ -24,12 +24,13 @@ We give a typical use case in the following snippet of code:
```python
from types import save_dmstack, load_dmstack
from pipeline import *
from samplers.mixed import RandomInduced
from models.criteria import PPT
from models.approx_based import DistToSep
from transformers.sep_approximation import FrankWolfe
states, infos = Pipeline([
-    ('sample', InducedMeasure(k_params=[25]).states), # induced measure of parameter 25
+    ('sample', RandomInduced(k_params=[25]).states), # induced measure of parameter 25
    ('ppt only', select(PPT.is_respected, True)), # respecting the PPT criterion
    ('fw', add(FrankWolfe(1000).approximation, key = 'approx')), # compute the sep approx.
    ('sel ent', select(DistToSep(0.01, sep_key = 'fw__approx').predict, Label.ENT))
@@ -55,6 +56,7 @@ def sampler(n_states : int, dims : list[int]) -> tuple[DMStack, dict]
```
The following samplers can be found in the library; a sketch of a custom sampler follows the list:
- samplers.utils.FromSet
- samplers.pure.RandomHaar
- samplers.mixed.RandomInduced
- samplers.mixed.RandomBures
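
Any function with this signature can serve as a sampling stage. As a hedged illustration (not part of the library), the sketch below draws random product states, which are separable by construction; the function name, the `separable` infos key, and the use of a plain NumPy array where a real sampler would return a `DMStack` are assumptions made for the example:

```python
import numpy as np

def random_product_sampler(n_states : int, dims : list[int]):
    """Draw random product states, which are separable by construction."""
    D = int(np.prod(dims))
    states = np.empty((n_states, D, D), dtype=complex)
    for i in range(n_states):
        # Haar-random pure state on each subsystem, combined by tensor product
        psi = np.ones(1, dtype=complex)
        for d in dims:
            v = np.random.randn(d) + 1j * np.random.randn(d)
            psi = np.kron(psi, v / np.linalg.norm(v))
        states[i] = np.outer(psi, psi.conj())
    infos = {'separable': np.ones(n_states, dtype=bool)}  # illustrative metadata
    return states, infos  # a real sampler would return DMStack(states), infos
```

Such a function could then be plugged into the first stage of a `Pipeline`, exactly like `RandomInduced(...).states` in the snippet above.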
@@ -71,8 +73,8 @@ def transformer(states : DMStack, infos : dict) -> tuple[DMStack, dict]
The following transformers can be found in the library; a sketch of a custom transformer follows the list:
- transformers.sep_approximations.FrankWolfe
-- transformers.real_representation.GellMann
-- transformer.real_representation.Measures
+- transformers.representations.GellMann
+- transformers.representations.Measures
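
The same contract makes it easy to add custom stages. Below is a hedged sketch of a transformer that leaves the states untouched and only enriches `infos`; the name `purity_transformer` and the `purity` key are illustrative assumptions, and a library transformer would keep returning a `DMStack`:

```python
import numpy as np

def purity_transformer(states, infos : dict):
    """Attach the purity Tr(rho^2) of every state to the infos dict."""
    purities = np.array([np.real(np.trace(rho @ rho)) for rho in states])
    new_infos = dict(infos, purity=purities)  # copy so the input dict is not mutated
    return states, new_infos
```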
### model
......
src/models/approx_based.py

@@ -7,6 +7,16 @@ from ..types import Label
import numpy as np

class MlModel :
    """
    Use a machine learning model (sklearn) as the model
    """

    def __init__(self, model) :
        self.model = model

    def predict(self, states, infos={}):
        # delegate to the wrapped sklearn estimator; no extra infos are produced
        return self.model.predict(states), {}

class DistToSep:
    """
    Distance from a separable approximation
......
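
For context on the new `MlModel` wrapper added above, here is a hedged usage sketch; the synthetic data, the flattening into real feature vectors, and the choice of `SVC` are illustrative assumptions rather than the library's intended workflow (in practice the feature vectors would typically come from a representation transformer such as `GellMann`):

```python
import numpy as np
from sklearn.svm import SVC

from models.approx_based import MlModel

# Synthetic stand-in data: random Hermitian matrices with random binary labels.
rng = np.random.default_rng(0)
mats = rng.normal(size=(100, 4, 4)) + 1j * rng.normal(size=(100, 4, 4))
dms = (mats + mats.conj().transpose(0, 2, 1)) / 2
labels = rng.integers(0, 2, size=100)
features = dms.reshape(100, -1).view(float)   # flatten to real feature vectors

clf = SVC().fit(features, labels)   # any trained sklearn estimator works
model = MlModel(clf)                # expose it through the common model interface
predicted, infos = model.predict(features)
```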