I happen to be using libsvm, so I am attempting to use option 2. From what I understand, SplitClassifier is a meta-classifier, so I can simply wrap my previous classifier in a SplitClassifier and feed that to CrossValidation; SplitClassifier then just provides a layer that keeps the per-fold results (and classifiers) around across the folds... I have a tenuous grasp, but hopefully this is basically correct. Could you glance at the couple of lines below to verify that I am using SplitClassifier correctly? Thanks for the help!

from mvpa2.suite import *   # LinearCSVMC, SVDMapper, SplitClassifier, etc.
import numpy as np

baseclf = LinearCSVMC()
svdmapper = SVDMapper()
get_SVD_sliced = lambda x: ChainMapper([svdmapper, StaticFeatureSelection(x)])
metaclf = MappedClassifier(baseclf, get_SVD_sliced(slice(0, 15)))

sc = SplitClassifier(metaclf, enable_ca=['stats'])
cv = CrossValidation(sc, NFoldPartitioner(), errorfx=mean_mismatch_error,
                     enable_ca=['stats', 'datasets'])

err = cv(ds)

# now to test the novel dataset on one of the per-fold classifiers
np.mean(sc.clfs[1].predict(ds2.samples) == ds2.targets)
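
Here is also how I plan to inspect the per-fold results and to test the novel dataset ds2 against every fold's classifier rather than just sc.clfs[1] -- a rough sketch assuming the objects above, so please correct me if any attribute access is off:

    print(cv.ca.stats)    # cross-validation confusion statistics
    print(sc.ca.stats)    # SplitClassifier's own stats, as you describe below
    for i, fold_clf in enumerate(sc.clfs):
        # each entry of sc.clfs should be a trained copy of metaclf for one fold
        acc = np.mean(fold_clf.predict(ds2.samples) == ds2.targets)
        print('fold %d: accuracy on ds2 = %.3f' % (i, acc))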

On Sun, Jan 8, 2012 at 4:14 PM, Yaroslav Halchenko <debian@onerussian.com> wrote:

there are 2 ways:

1. [available only in mvpa2]
   any RepeatedMeasure (including CrossValidation) takes the argument
   'callback':

       callback : functor
          Optional callback to extract information from inside the main loop of
          the measure. The callback is called with the input 'data', the 'node'
          instance that is evaluated repeatedly and the 'result' of a single
          evaluation -- passed as named arguments (see labels in quotes) for
          every iteration, directly after evaluating the node.

   so there you could access anything you care about in the 'node', which is
   the classifier in this case (see the sketch after this list).

   BUT because the same classifier instance gets reused through the iterations,
   you can't just "store" the classifier.  You can deepcopy some of them, but
   the ones relying on SWIG-ed APIs, such as libsvm, would not be deepcopy-able.

2. SplitClassifier

   That one behaves similarly to cross-validation (just access its .ca.stats to
   get the results of cross-validation), but it also operates on copies of the
   originally provided classifier, so you can access all of them via the .clfs
   attribute.

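A minimal sketch of the callback route (illustrative only: 'clf', 'ds',
'fold_clfs' and 'store_clf' are placeholder names, and the deepcopy caveat
above applies to libsvm-backed classifiers):

    from copy import deepcopy
    from mvpa2.suite import CrossValidation, NFoldPartitioner

    fold_clfs = []                      # one snapshot per iteration/fold

    def store_clf(data=None, node=None, result=None):
        # 'node' is what gets evaluated on every iteration -- the classifier here
        try:
            fold_clfs.append(deepcopy(node))
        except Exception:
            fold_clfs.append(None)      # e.g. SWIG-ed backends such as libsvm

    cv = CrossValidation(clf, NFoldPartitioner(), callback=store_clf)
    err = cv(ds)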

Helps?

On Sun, 08 Jan 2012, Tyson Aflalo wrote:

>    Is there a means of accessing each trained classifier that is generated as
>    part of a cross-validation analysis?

>    Thanks,

>    tyson

--
=------------------------------------------------------------------=
Keep in touch                                     www.onerussian.com
Yaroslav Halchenko                 www.ohloh.net/accounts/yarikoptic