Advanced Classifiers

Module that contains classifiers that use two-step genetic optimization and rule mining based on the support of the candidate rules.

class ex_fuzzy.classifiers.FuzzyRulesClassifier(nRules: int = 30, nAnts: int = 4, fuzzy_type: FUZZY_SETS = None, tolerance: float = 0.0, verbose=False, n_class: int = None, runner: int = 1, expansion_factor: int = 1, linguistic_variables: list[fuzzyVariable] = None)[source]

Bases: ClassifierMixin

A classifier that works by performing a double optimization process. First, it creates a candidate rule base using genetic optimization, and then uses it as the basis to obtain a refined rule base that satisfies the constraints on the number of antecedents and the number of rules.

fit(X: array, y: array, n_gen: int = 30, pop_size: int = 50, checkpoints: int = 0, **kwargs) None[source]

Trains the model with the given data.

Parameters:
  • X – samples to train.

  • y – labels for each sample.

  • n_gen – number of generations to compute in the genetic optimization.

  • pop_size – number of subjects per generation.

  • checkpoints – if greater than 0, saves the best subject every checkpoints generations to a text file.

  • kwargs – additional parameters for the genetic optimization. See fit method in BaseRuleBaseClassifier.

predict(X: array) array[source]

Predict the corresponding class for each sample.

Parameters:

X – samples to predict.

Returns:

a class for each sample.
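
Example usage (a minimal sketch: the iris data, the train/test split, and the import path ex_fuzzy.fuzzy_sets with the t1 member of FUZZY_SETS are illustrative assumptions, not guaranteed by the signature above):

    >>> import ex_fuzzy.fuzzy_sets as fs          # assumed location of the FUZZY_SETS enum
    >>> from ex_fuzzy.classifiers import FuzzyRulesClassifier
    >>> from sklearn.datasets import load_iris
    >>> from sklearn.model_selection import train_test_split
    >>> X, y = load_iris(return_X_y=True)
    >>> X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)
    >>> # Double optimization: a candidate rule base is evolved first, then refined
    >>> clf = FuzzyRulesClassifier(nRules=15, nAnts=3, fuzzy_type=fs.FUZZY_SETS.t1, tolerance=0.01)
    >>> clf.fit(X_train, y_train, n_gen=30, pop_size=50)
    >>> y_pred = clf.predict(X_test)              # one class label per test sample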

class ex_fuzzy.classifiers.RuleFineTuneClassifier(nRules: int = 30, nAnts: int = 4, fuzzy_type: FUZZY_SETS = None, tolerance: float = 0.0, verbose=False, n_class: int = None, runner: int = 1, expansion_factor: int = 1, linguistic_variables: list[fuzzyVariable] = None)[source]

Bases: ClassifierMixin

A classifier that works by mining a set of candidate rules with a minimum support, and then using a two-step genetic optimization that chooses the optimal combination of those rules and fine-tunes them.

fit(X: array, y: array, n_gen: int = 30, pop_size: int = 50, checkpoints: int = 0, **kwargs) None[source]

Trains the model with the given data.

Parameters:
  • X – samples to train.

  • y – labels for each sample.

  • n_gen – number of generations to compute in the genetic optimization.

  • pop_size – number of subjects per generation.

  • checkpoints – if greater than 0, saves the best subject every checkpoints generations to a text file.

  • kwargs – additional parameters for the genetic optimization. See fit method in BaseRuleBaseClassifier.

predict(X: array) array[source]

Predict the corresponding class for each sample.

Parameters:

X – samples to predict.

Returns:

a class for each sample.
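
A hedged usage sketch: the synthetic data and the ex_fuzzy.fuzzy_sets import path are assumptions, and checkpoints=10 only illustrates the checkpointing parameter documented above:

    >>> import numpy as np
    >>> import ex_fuzzy.fuzzy_sets as fs          # assumed location of the FUZZY_SETS enum
    >>> from ex_fuzzy.classifiers import RuleFineTuneClassifier
    >>> rng = np.random.default_rng(0)
    >>> X = rng.random((200, 4))                  # synthetic samples, for illustration only
    >>> y = (X[:, 0] > 0.5).astype(int)
    >>> # Candidate rules are mined by support, then selected and fine-tuned genetically
    >>> clf = RuleFineTuneClassifier(nRules=20, nAnts=3, fuzzy_type=fs.FUZZY_SETS.t1, tolerance=0.05)
    >>> clf.fit(X, y, n_gen=30, pop_size=50, checkpoints=10)   # save best subject every 10 generations
    >>> y_pred = clf.predict(X)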

class ex_fuzzy.classifiers.RuleMineClassifier(nRules: int = 30, nAnts: int = 4, fuzzy_type: FUZZY_SETS = None, tolerance: float = 0.0, verbose=False, n_class: int = None, runner: int = 1, linguistic_variables: list[fuzzyVariable] = None)[source]

Bases: ClassifierMixin

A classifier that works by mining a set of candidate rules with a minimum support, confidence and lift, and then using a genetic algorithm that chooses the optimal combination of those rules.

fit(X: array, y: array, n_gen: int = 30, pop_size: int = 50, **kwargs) None[source]

Trains the model with the given data.

Parameters:
  • X – samples to train.

  • y – labels for each sample.

  • n_gen – number of generations to compute in the genetic optimization.

  • pop_size – number of subjects per generation.

  • kwargs – additional parameters for the genetic optimization. See fit method in BaseRuleBaseClassifier.

predict(X: array) array[source]

Predict for each sample the corresponding class.

Parameters:

X – samples to predict.

Returns:

a class for each sample.
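
Example usage (a minimal sketch: the wine dataset and the ex_fuzzy.fuzzy_sets import path are illustrative assumptions; only the constructor, fit and predict signatures above are taken from this module):

    >>> import numpy as np
    >>> import ex_fuzzy.fuzzy_sets as fs          # assumed location of the FUZZY_SETS enum
    >>> from ex_fuzzy.classifiers import RuleMineClassifier
    >>> from sklearn.datasets import load_wine
    >>> X, y = load_wine(return_X_y=True)
    >>> # Rules are mined by support, confidence and lift; a GA then picks the best subset
    >>> clf = RuleMineClassifier(nRules=30, nAnts=4, fuzzy_type=fs.FUZZY_SETS.t1, tolerance=0.01)
    >>> clf.fit(X, y, n_gen=30, pop_size=50)
    >>> acc = np.mean(clf.predict(X) == y)        # training accuracy, for illustration only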