Java Code Examples for weka.classifiers.lazy.IBk#setCrossValidate()

The following examples show how to use weka.classifiers.lazy.IBk#setCrossValidate(). Each example is taken from an open-source project; the source file, project and license are noted above the code.
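
Before the project examples, here is a minimal, self-contained sketch of what the option does (the class name and the ARFF path are placeholders for illustration): with setCrossValidate(true), IBk selects the best number of neighbours, up to the value given to setKNN(), by hold-one-out cross-validation on the training data; because IBk is a lazy classifier, that selection happens at the first prediction rather than in buildClassifier().

import weka.classifiers.lazy.IBk;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class IBkCrossValidateSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder data set; any ARFF file with a nominal class attribute works.
    Instances data = DataSource.read("iris.arff");
    data.setClassIndex(data.numAttributes() - 1);

    IBk knn = new IBk();
    knn.setKNN(20);              // upper bound on k
    knn.setCrossValidate(true);  // pick the best k <= 20 by hold-one-out cross-validation

    knn.buildClassifier(data);
    // IBk is lazy: the cross-validated k is only determined at the first prediction.
    knn.distributionForInstance(data.instance(0));
    System.out.println("Selected k = " + knn.getKNN());
  }
}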
Example 1
Source File: CollectiveIBk.java    From collective-classification-weka-package with GNU General Public License v3.0
/**
 * performs initialization of members
 */
@Override
protected void initializeMembers() {
  super.initializeMembers();

  m_KNNdetermined    = -1;
  m_NeighborsTestset = null;
  m_TrainsetNew      = null;
  m_TestsetNew       = null;
  m_UseNaiveSearch   = false;
  m_LabeledTestset   = null;
  m_Missing          = new ReplaceMissingValues();
  
  m_Classifier = new IBk();
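  // Up to 10 neighbours; with cross-validation enabled, IBk picks the best k <= 10
  // by hold-one-out cross-validation on the training data.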
  m_Classifier.setKNN(10);
  m_Classifier.setCrossValidate(true);
  m_Classifier.setWindowSize(0);
  m_Classifier.setMeanSquared(false);
  
  m_KNN = m_Classifier.getKNN();
  
  m_AdditionalMeasures.add("measureDeterminedKNN");
}
 
Example 2
Source File: kNN.java    From tsml with GNU General Public License v3.0
public static void testkNNvsIBk(boolean norm, boolean crossValidate){
        System.out.println("FIRST BASIC SANITY TEST FOR THIS WRAPPER");
        System.out.print("Compare 1-NN with IB1, normalisation turned");
        String str=norm?" on":" off";
        System.out.println(str);
        System.out.print("Cross validation turned");
        str=crossValidate?" on":" off";
        System.out.println(str);
        System.out.println("Compare on the UCI data sets");
        System.out.print("If normalisation is off, then there may be differences");
        kNN knn = new kNN(100);
        IBk ibk=new IBk(100);
        knn.normalise(norm);
        knn.setCrossValidate(crossValidate);
        ibk.setCrossValidate(crossValidate);
        int diff=0;
        DecimalFormat df = new DecimalFormat("####.###");
        for(String s:DatasetLists.uciFileNames){
            Instances train=DatasetLoading.loadDataNullable("Z:/ArchiveData/Uci_arff/"+s+"\\"+s+"-train");
            Instances test=DatasetLoading.loadDataNullable("Z:/ArchiveData/Uci_arff/"+s+"\\"+s+"-test");
            try{
                knn.buildClassifier(train);
//                ib1.buildClassifier(train);
                ibk.buildClassifier(train);
                double a1=ClassifierTools.accuracy(test, knn);
                double a2=ClassifierTools.accuracy(test, ibk);
                if(a1!=a2){
                    diff++;
                    System.out.println(s+": 1-NN ="+df.format(a1)+" ibk="+df.format(a2));
                }
            }catch(Exception e){
                System.out.println(" Exception builing a classifier");
                System.exit(0);
            }
        }
         System.out.println("Total problems ="+DatasetLists.uciFileNames.length+" different on "+diff);
    }
 
Example 3
Source File: EnsembleProvider.java    From AILibs with GNU Affero General Public License v3.0
/**
 * Initializes the CAWPE ensemble model consisting of five classifiers (SMO,
 * KNN, J48, Logistic and MLP) using a majority voting strategy. The ensemble
 * uses Weka classifiers. It refers to "Heterogeneous ensemble of standard
 * classification algorithms" (HESCA) as described in Lines, Jason & Taylor,
 * Sarah & Bagnall, Anthony. (2018). Time Series Classification with HIVE-COTE:
 * The Hierarchical Vote Collective of Transformation-Based Ensembles. ACM
 * Transactions on Knowledge Discovery from Data. 12. 1-35. 10.1145/3182382.
 *
 * @param seed
 *            Seed used within the classifiers and the majority confidence
 *            voting scheme
 * @param numFolds
 *            Number of folds used within the determination of the classifier
 *            weights for the {@link MajorityConfidenceVote}
 * @return Returns an initialized (but untrained) ensemble model.
 * @throws Exception
 *             Thrown when the initialization has failed
 */
public static Classifier provideCAWPEEnsembleModel(final int seed, final int numFolds) throws Exception {
	Classifier[] classifiers = new Classifier[5];

	Vote voter = new MajorityConfidenceVote(numFolds, seed);

	SMO smo = new SMO();
	smo.turnChecksOff();
	smo.setBuildCalibrationModels(true);
	PolyKernel kl = new PolyKernel();
	kl.setExponent(1);
	smo.setKernel(kl);
	smo.setRandomSeed(seed);
	classifiers[0] = smo;

	IBk k = new IBk(100);
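	// Up to 100 neighbours; hold-one-out cross-validation selects the best k,
	// and the search uses an unnormalised Euclidean distance.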
	k.setCrossValidate(true);
	EuclideanDistance ed = new EuclideanDistance();
	ed.setDontNormalize(true);
	k.getNearestNeighbourSearchAlgorithm().setDistanceFunction(ed);
	classifiers[1] = k;

	J48 c45 = new J48();
	c45.setSeed(seed);
	classifiers[2] = c45;

	classifiers[3] = new Logistic();

	classifiers[4] = new MultilayerPerceptron();

	voter.setClassifiers(classifiers);
	return voter;
}
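
A usage sketch for the factory above (the data set path, seed 42 and 10 folds are assumptions for illustration, not values from the project):

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class CawpeUsageSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder ARFF path; any nominal-class data set will do.
    Instances data = DataSource.read("some-dataset.arff");
    data.setClassIndex(data.numAttributes() - 1);

    // Build the untrained ensemble, then train it like any other Weka classifier.
    Classifier cawpe = EnsembleProvider.provideCAWPEEnsembleModel(42, 10);
    cawpe.buildClassifier(data);

    // Resubstitution evaluation, only to show the call sequence.
    Evaluation eval = new Evaluation(data);
    eval.evaluateModel(cawpe, data);
    System.out.println(eval.toSummaryString());
  }
}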