From mbannert at tuebingen.mpg.de Tue Jan 23 10:51:34 2018
From: mbannert at tuebingen.mpg.de (Michael Bannert)
Date: Tue, 23 Jan 2018 11:51:34 +0100
Subject: [pymvpa] Shrinkage LDA for PyMVPA
Message-ID: <3b0b1e74-21b6-919e-7cba-3fc229e792df@tuebingen.mpg.de>

Dear PyMVPA users,

Since I couldn't find an implementation of "shrinkage LDA" (e.g., Pereira
& Botvinick, NeuroImage, 2011) in PyMVPA, I implemented my own based on
Gaussian Discriminant Analysis (gda.py). I figured it could be helpful
for other users as well.

It is considerably faster than using the scikit-learn implementation
through the SKLLearnerAdapter, which I have been using until now:

    SKLLearnerAdapter(LinearDiscriminantAnalysis(solver='lsqr',
                                                 shrinkage='auto'))

It is also a little more flexible because you can choose between the
Ledoit-Wolf estimator and the Oracle Approximating Shrinkage.

May it be useful.

Cheers,
Michael

-------------- next part --------------
A non-text attachment was scrubbed...
Name: gda.py
Type: text/x-python-script
Size: 12145 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: warehouse.py
Type: text/x-python-script
Size: 28845 bytes
Desc: not available
URL: 

From debian at onerussian.com Tue Jan 23 14:44:18 2018
From: debian at onerussian.com (Yaroslav Halchenko)
Date: Tue, 23 Jan 2018 09:44:18 -0500
Subject: [pymvpa] Shrinkage LDA for PyMVPA
In-Reply-To: <3b0b1e74-21b6-919e-7cba-3fc229e792df@tuebingen.mpg.de>
References: <3b0b1e74-21b6-919e-7cba-3fc229e792df@tuebingen.mpg.de>
Message-ID: <20180123144418.GL3296@hopa.kiewit.dartmouth.edu>

On Tue, 23 Jan 2018, Michael Bannert wrote:
> Dear PyMVPA users,

> Since I couldn't find an implementation of "shrinkage LDA" (e.g., Pereira &
> Botvinick, NeuroImage, 2011) in PyMVPA, I implemented my own based on
> Gaussian Discriminant Analysis (gda.py). I figured it could be helpful for
> other users as well.

> It is considerably faster than using the scikit-learn implementation through
> the SKLLearnerAdapter, which I have been using until now:

> SKLLearnerAdapter(LinearDiscriminantAnalysis(solver='lsqr',
> shrinkage='auto'))

> It is also a little more flexible because you can choose between the
> Ledoit-Wolf estimator and the Oracle Approximating Shrinkage.

> May it be useful.

Sounds great.  Would you mind submitting a PR so your contribution
authorship is properly recorded, and so we could make sure it works
correctly (by running at least the standard batch of tests after it gets
added to the warehouse, which I think you have already done)?

-- 
Yaroslav O. Halchenko
Center for Open Neuroscience     http://centerforopenneuroscience.org
Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
Phone: +1 (603) 646-9834                       Fax: +1 (603) 646-1419
WWW:   http://www.linkedin.com/in/yarik

From jj.jilek at gmail.com Tue Jan 30 14:21:32 2018
From: jj.jilek at gmail.com (Jakub Jílek)
Date: Tue, 30 Jan 2018 14:21:32 +0000
Subject: [pymvpa] PyMVPA workshops
Message-ID: 

Hello,

Do you know of any physically attended workshops / courses which teach
how to use PyMVPA? Any location is fine. Thank you.

All the best,
Jakub Jílek
UCL
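As a companion to the shrinkage-LDA thread above, a minimal sketch of the
generic scikit-learn route: the adapter call quoted in that thread, plus
the two shrinkage covariance estimators Michael mentions. This is not his
gda.py; the toy matrix X is an assumption for illustration only.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.covariance import LedoitWolf, OAS
    from mvpa2.suite import SKLLearnerAdapter, CrossValidation, NFoldPartitioner

    # Ledoit-Wolf-shrunk LDA wrapped as a PyMVPA learner; apply it with
    # cv(ds) to any Dataset `ds` that carries targets and chunks
    clf = SKLLearnerAdapter(LinearDiscriminantAnalysis(solver='lsqr',
                                                       shrinkage='auto'))
    cv = CrossValidation(clf, NFoldPartitioner())

    # Standalone shrinkage covariance estimates that a custom Gaussian
    # discriminant implementation could choose between
    X = np.random.randn(40, 100)                # toy: 40 samples, 100 features
    cov_lw = LedoitWolf().fit(X).covariance_    # Ledoit-Wolf estimator
    cov_oas = OAS().fit(X).covariance_          # Oracle Approximating Shrinkage

With shrinkage='auto' and the 'lsqr' solver, scikit-learn applies the
Ledoit-Wolf shrinkage internally; the standalone estimators are shown only
to make the choice between the two explicit.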
From debian at onerussian.com Tue Jan 30 14:28:25 2018
From: debian at onerussian.com (Yaroslav Halchenko)
Date: Tue, 30 Jan 2018 09:28:25 -0500
Subject: [pymvpa] PyMVPA workshops
In-Reply-To: 
References: 
Message-ID: <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>

On Tue, 30 Jan 2018, Jakub Jílek wrote:
> Hello,
> Do you know of any physically attended workshops / courses which teach how
> to use PyMVPA? Any location is fine. Thank you.

AFAIK none is planned ATM.

-- 
Yaroslav O. Halchenko
Center for Open Neuroscience     http://centerforopenneuroscience.org
Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
Phone: +1 (603) 646-9834                       Fax: +1 (603) 646-1419
WWW:   http://www.linkedin.com/in/yarik

From jj.jilek at gmail.com Tue Jan 30 14:43:10 2018
From: jj.jilek at gmail.com (Jakub Jílek)
Date: Tue, 30 Jan 2018 14:43:10 +0000
Subject: [pymvpa] PyMVPA workshops
In-Reply-To: <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>
References: <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>
Message-ID: 

Thank you!

On Tue, 30 Jan 2018 at 15:28 Yaroslav Halchenko wrote:
> On Tue, 30 Jan 2018, Jakub Jílek wrote:
> > Hello,
> > Do you know of any physically attended workshops / courses which
> > teach how to use PyMVPA? Any location is fine. Thank you.
>
> AFAIK none is planned ATM.

From jhevia at hotmail.com Tue Jan 30 21:42:22 2018
From: jhevia at hotmail.com (Jorge H)
Date: Tue, 30 Jan 2018 21:42:22 +0000
Subject: [pymvpa] PyMVPA temporal comprobation
In-Reply-To: 
References: , <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>, 
Message-ID: 

Sorry, I did not mention in the last email that I'm doing a Searchlight
Hyperalignment. I'm attaching my script. Thanks again!

Jorge

Dr. Jorge Carlos Hevia Orozco
Unidad de Analisis de Imagenes, Instituto de Neurobiologia,
UNAM campus Juriquilla

________________________________
From: Jorge H
Sent: Tuesday, 30 January 2018, 03:23 PM
To: Development and support of PyMVPA
Subject: Re: [pymvpa] PyMVPA temporal comprobation

Hi, I have a question and I hope you can help me. I have obtained the
accuracy for several regions of interest. Some of them apparently are
good classifiers but some others are not. However, what if I want to do
the temporal verification? If I shuffle the attributes (vectors) text
file, should I expect random (0.50) classification in each region of
interest? Thanks! Kind regards!

Jorge

Dr. Jorge Carlos Hevia Orozco
Unidad de Analisis de Imagenes, Instituto de Neurobiologia,
UNAM campus Juriquilla

-------------- next part --------------
A non-text attachment was scrubbed...
Name: Searchlight_Hyperalignment_Hevia_ene292017B.py
Type: text/x-python-script
Size: 5115 bytes
Desc: Searchlight_Hyperalignment_Hevia_ene292017B.py
URL: 
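A minimal sketch of the check Jorge describes, done with PyMVPA's own
permutation machinery rather than a hand-shuffled attributes file. The
pattern follows PyMVPA's significance-testing examples; the toy dataset
is an assumption standing in for a real ROI dataset, and with two
balanced classes the permuted accuracies should indeed center near 0.50.

    import numpy as np
    from mvpa2.suite import *

    # stand-in for a loaded ROI dataset with targets and chunks
    fds = normal_feature_dataset(perlabel=24, nlabels=2,
                                 nfeatures=100, nchunks=6)

    # shuffle targets (within chunks) many times to build a Monte Carlo
    # null distribution of the cross-validated error
    permutator = AttributePermutator('targets', count=100, limit='chunks')
    null_est = MCNullDist(permutator, tail='left', enable_ca=['dist_samples'])

    cv = CrossValidation(LinearCSVMC(), NFoldPartitioner(),
                         errorfx=mean_mismatch_error,
                         postproc=mean_sample(),
                         null_dist=null_est)

    err = cv(fds)
    print(1.0 - np.asarray(err))                 # observed accuracy
    print(np.asarray(cv.ca.null_prob))           # p-value under permutation
    # mean permuted error approximates the empirical chance level
    print(np.mean(np.asarray(null_est.ca.dist_samples)))

So yes, after shuffling the labels a well-behaved ROI should classify at
roughly chance on average, although any single shuffle can deviate from
0.50 by sampling noise; that is why a distribution over many permutations
is the safer check.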
From jhevia at hotmail.com Tue Jan 30 21:23:32 2018
From: jhevia at hotmail.com (Jorge H)
Date: Tue, 30 Jan 2018 21:23:32 +0000
Subject: [pymvpa] PyMVPA temporal comprobation
In-Reply-To: <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>
References: , <20180130142825.GT3296@hopa.kiewit.dartmouth.edu>
Message-ID: 

Hi, I have a question and I hope you can help me. I have obtained the
accuracy for several regions of interest. Some of them apparently are
good classifiers but some others are not. However, what if I want to do
the temporal verification? If I shuffle the attributes (vectors) text
file, should I expect random (0.50) classification in each region of
interest? Thanks! Kind regards!

Jorge

Dr. Jorge Carlos Hevia Orozco
Unidad de Analisis de Imagenes, Instituto de Neurobiologia,
UNAM campus Juriquilla

From jessi.goold at gmail.com Wed Jan 31 20:44:28 2018
From: jessi.goold at gmail.com (Jessica Goold)
Date: Wed, 31 Jan 2018 12:44:28 -0800
Subject: [pymvpa] Searchlight Permutation Testing with Monte Carlo Selection
Message-ID: 

Hi All,

Thanks in advance for the help. I am trying to create a Monte Carlo
permutation distribution to test my group searchlight against, but am
worried that I set it up wrong and am also not sure where to go from
here. Currently, I have 40 subjects with 15 permuted searchlights per
subject, as well as the actual searchlights, which I want to test against
the permuted distribution, all saved as NIfTI files. I created different
permuted attribute files and ran each permuted searchlight one by one
across 3 computers in the lab to speed up the process, but now I'm not
sure how to recombine everything and build the actual Monte Carlo
distribution.
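A minimal sketch of one way to pool already-saved permutation maps into a
voxelwise null distribution, with no re-running needed. The file names
here are assumptions modeled on the saving scheme in the script quoted
below, including a hypothetical name for the unpermuted map; the p-value
uses the usual add-one correction for permutation tests.

    import numpy as np
    import nibabel as nib

    sub, n_perm = '20', 15
    # stack the permuted accuracy maps into one (n_perm, x, y, z) array
    null = np.stack([nib.load('searchlight_allcond_sub%s_it%d.nii'
                              % (sub, it)).get_fdata()
                     for it in range(1, n_perm + 1)])
    actual_img = nib.load('searchlight_allcond_sub%s_actual.nii' % sub)
    actual = actual_img.get_fdata()
    # one-sided p: proportion of permutations reaching the observed accuracy
    p = (1.0 + (null >= actual).sum(axis=0)) / (1.0 + n_perm)
    nib.save(nib.Nifti1Image(p.astype(np.float32), actual_img.affine),
             'searchlight_pvals_sub%s.nii' % sub)

Note that with only 15 permutations per subject the smallest attainable
p-value is 1/16, so for group-level inference one would typically pool
permutation maps across subjects, e.g., with a bootstrap scheme along the
lines of Stelzer et al. (2013, NeuroImage), rather than threshold single
subjects this way.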
What I've found online in the PyMVPA manual has the permutator and
estimator inside the cross-validation script, so I'm worried I may need
to redo everything, which would take another month or longer on one
computer in my lab. Long story short: is there a way to create a Monte
Carlo distribution from searchlights that have already been computed, and
what is the best way to do this?

Here is the code I used for one permuted searchlight. I'm also worried I
didn't save all the information I needed; can someone let me know if that
is the case?

Thank you,
Jessica

#source /opt/rh/python27/enable
from mvpa2.suite import *
import numpy as np
import scipy.stats as sp
import scipy.io
import nibabel
import random

subjects = [.....'20','21','22','23','24','25','26','27','28','29','30',
            '31','32','33','34','36','37','38','39','40','41','42','43']
path = '/Users/Desktop/SearchlightPermutation/'

for sub in subjects:
    verbose(1, 'processing subject: %s' % sub)
    # load dataset with the (permuted) attributes and an anatomical mask
    attr = SampleAttributes(os.path.join(path, 'Attributes11thIt.txt'))
    fds = fmri_dataset(samples=os.path.join(path, 'Beta_allStim_sub%s.nii' % sub),
                       targets=attr.targets, chunks=attr.chunks,
                       mask=os.path.join(path, 'mask_anat.sub%s.nii' % sub))
    # zscore detrended data
    zscore(fds, chunks_attr=None)
    # set up classifier and cross-validated accuracy measure
    clf = LinearCSVMC()
    cvte = CrossValidation(clf, NFoldPartitioner(),
                           errorfx=lambda p, t: np.mean(p == t),
                           enable_ca=['stats'])
    sl = sphere_searchlight(cvte, radius=3, space='voxel_indices',
                            postproc=mean_sample(), nproc=8)
    res = sl(fds)
    # save searchlight accuracy map as NIfTI
    niftiresults = map2nifti(fds, res.samples).to_filename(
        'searchlight_allcond_sub%s_it11.nii' % sub)

From mbannert at tuebingen.mpg.de Tue Mar 6 10:12:16 2018
From: mbannert at tuebingen.mpg.de (Michael Bannert)
Date: Tue, 6 Mar 2018 11:12:16 +0100
Subject: [pymvpa] "SVD did not converge" - how can I make PyMVPA use lapack version?
Message-ID: <9accdd50-fad4-6d2f-3584-8bea0aa61bba@tuebingen.mpg.de>

Dear all,

I would like to use the lapack version of SVD since it is supposed to be
more robust than the numpy/scipy implementations (
http://lists.alioth.debian.org/pipermail/pkg-exppsy-pymvpa/2014q2/002754.html
).

I installed LAPACK on my macOS machine using Homebrew, which seemed to
work, but it gave me this message:

"""
This formula is keg-only, which means it was not symlinked into
/usr/local, because macOS already provides this software and installing
another version in parallel can cause all kinds of trouble.

For compilers to find this software you may need to set:
    LDFLAGS:  -L/usr/local/opt/lapack/lib
    CPPFLAGS: -I/usr/local/opt/lapack/include
For pkg-config to find this software you may need to set:
    PKG_CONFIG_PATH: /usr/local/opt/lapack/lib/pkgconfig
"""

I defined the environment variables as suggested, but PyMVPA's
externals.exists('liblapack.so') still returns False.

What can I do to make it work?

Best & thanks,
Michael

PS: Eventually, I will need to use it on a remote Linux system where
lapack is installed as /usr/lib64/liblapack.so.3, but PyMVPA fails to
find it via "externals.exists('liblapack.so')" too.
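A minimal sketch of the switch this thread is about, with parameter names
as described in the reply that follows and in the hyperalignment example
it links to; `datasets` stands for a list of per-subject Datasets and is
an assumption here.

    from mvpa2.suite import *

    # route the Procrustean transformation through LAPACK's dgesvd
    # instead of the default numpy SVD (requires a loadable liblapack,
    # see the externals checks discussed below)
    hyper = Hyperalignment(alignment=ProcrusteanMapper(svd='dgesvd',
                                                       space='commonspace'))
    # mappers = hyper(datasets)   # one trained mapper per subject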
From debian at onerussian.com Tue Mar 6 18:49:02 2018
From: debian at onerussian.com (Yaroslav Halchenko)
Date: Tue, 6 Mar 2018 13:49:02 -0500
Subject: [pymvpa] "SVD did not converge" - how can I make PyMVPA use lapack version?
In-Reply-To: <9accdd50-fad4-6d2f-3584-8bea0aa61bba@tuebingen.mpg.de>
References: <9accdd50-fad4-6d2f-3584-8bea0aa61bba@tuebingen.mpg.de>
Message-ID: <20180306184902.GS4352@hopa.kiewit.dartmouth.edu>

On Tue, 06 Mar 2018, Michael Bannert wrote:
> Dear all,

> I would like to use the lapack version of SVD since it is supposed to be
> more robust than the numpy/scipy implementations (
> http://lists.alioth.debian.org/pipermail/pkg-exppsy-pymvpa/2014q2/002754.html
> ).

I guess you are talking about the message coming from the searchlight
hyperalignment, which takes a hyperalignment argument, which in turn has
an argument for which transformation to use (ProcrusteanMapper by
default), and ProcrusteanMapper has a parameter 'svd' which defaults to
'numpy' but could be set to "dgesvd", which would take that one from the
lapack... see e.g.
http://github.com/PyMVPA/PyMVPA/blob/HEAD/doc/examples/hyperalignment.py

> I installed LAPACK on my macOS machine using Homebrew, which seemed to
> work, but it gave me this message:

> """
> This formula is keg-only, which means it was not symlinked into /usr/local,
> because macOS already provides this software and installing another version
> in parallel can cause all kinds of trouble.

> For compilers to find this software you may need to set:
>     LDFLAGS:  -L/usr/local/opt/lapack/lib
>     CPPFLAGS: -I/usr/local/opt/lapack/include
> For pkg-config to find this software you may need to set:
>     PKG_CONFIG_PATH: /usr/local/opt/lapack/lib/pkgconfig
> """

> I defined the environment variables as suggested, but PyMVPA's
> externals.exists('liblapack.so') still returns False.

> What can I do to make it work?

> Best & thanks,
> Michael

> PS: Eventually, I will need to use it on a remote Linux system where lapack
> is installed as /usr/lib64/liblapack.so.3, but PyMVPA fails to find it via
> "externals.exists('liblapack.so')" too.

note that you need to have ctypes available for that functionality

    print externals.exists('ctypes')

this is what is happening underneath:

    python -c "from ctypes import cdll; print(cdll.LoadLibrary('liblapack.so'))"

so you could try something like

    LD_PRELOAD=/usr/lib64/liblapack.so.3 python -c "from ctypes import cdll; print(cdll.LoadLibrary('liblapack.so'))"

maybe... and/or check if you have a /usr/lib64/liblapack.so symlink to
/usr/lib64/liblapack.so.3

-- 
Yaroslav O. Halchenko
Center for Open Neuroscience     http://centerforopenneuroscience.org
Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
Phone: +1 (603) 646-9834                       Fax: +1 (603) 646-1419
WWW:   http://www.linkedin.com/in/yarik

From mbannert at tuebingen.mpg.de Wed Mar 7 10:29:14 2018
From: mbannert at tuebingen.mpg.de (Michael Bannert)
Date: Wed, 7 Mar 2018 11:29:14 +0100
Subject: [pymvpa] "SVD did not converge" - how can I make PyMVPA use lapack version?
In-Reply-To: <20180306184902.GS4352@hopa.kiewit.dartmouth.edu>
References: <9accdd50-fad4-6d2f-3584-8bea0aa61bba@tuebingen.mpg.de> <20180306184902.GS4352@hopa.kiewit.dartmouth.edu>
Message-ID: <6a1ee47b-e265-baeb-9dc9-44484c52989b@tuebingen.mpg.de>

Dear Yaroslav,

Thanks for your reply! I'm starting to wonder if it would really make a
difference to use the 'dgesvd' or the 'numpy'/'scipy' implementations,
because numpy apparently also uses lapack, at least since 1.11:
http://devdocs.io/numpy~1.11/generated/numpy.linalg.svd

Do you still think dgesvd could be more robust, as claimed in previous
PyMVPA threads?
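One data point on that question: as far as I can tell, numpy.linalg.svd
is backed by LAPACK's divide-and-conquer driver (gesdd), whereas the
'dgesvd' option uses the classic gesvd routine, which is slower but tends
to converge on some ill-conditioned matrices where gesdd fails with "SVD
did not converge". So the two can behave differently even though both
ultimately call LAPACK. A small sketch for inspecting what is in play:

    import numpy as np
    np.show_config()          # which BLAS/LAPACK numpy was built against

    from mvpa2.base import externals
    print(externals.exists('liblapack.so'))   # required for svd='dgesvd'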
On 06.03.18 19:49, Yaroslav Halchenko wrote:
> I guess you are talking about the message coming from the searchlight
> hyperalignment, which takes a hyperalignment argument, which in turn has
> an argument for which transformation to use (ProcrusteanMapper by
> default), and ProcrusteanMapper has a parameter 'svd' which defaults to
> 'numpy' but could be set to "dgesvd", which would take that one from the
> lapack... see e.g.
> http://github.com/PyMVPA/PyMVPA/blob/HEAD/doc/examples/hyperalignment.py

Yes, it shows up in the context of hyperalignment (not the searchlight
version, though - haven't checked it yet).

> note that you need to have ctypes available for that functionality
>
>     print externals.exists('ctypes')

I have ctypes on both Mac and Linux systems.

> this is what is happening underneath:
>
>     python -c "from ctypes import cdll; print(cdll.LoadLibrary('liblapack.so'))"
>
> so you could try something like
>
>     LD_PRELOAD=/usr/lib64/liblapack.so.3 python -c "from ctypes import cdll; print(cdll.LoadLibrary('liblapack.so'))"
>
> maybe... and/or check if you have a /usr/lib64/liblapack.so symlink to
> /usr/lib64/liblapack.so.3

Yes, the symbolic link did the trick. Thank you - I learned something
today.

Best,
Michael

From torabiansajjad at gmail.com Fri Mar 30 21:15:55 2018
From: torabiansajjad at gmail.com (Sajjad Torabian)
Date: Fri, 30 Mar 2018 14:15:55 -0700
Subject: [pymvpa] Question on Event-related Data Analysis
Message-ID: 

Hi Everyone,

In the section "Event-related Data Analysis" the approach is mainly to
load masked data (only a few ROIs are loaded in the tutorial - I myself
use the fmri_dataset function with masking) and then do HRF modeling
using fit_event_hrf_model. There are other parts of the analysis as well,
but my question is basically whether doing HRF modeling on only a part of
the whole brain is reasonable. Shouldn't we load the whole dataset, do
the modeling, and then apply the mask? (Or maybe the functions handle
this?)

Also, in general, modeling takes a lot of time using other tools, but
when I use fit_event_hrf_model (even without masking first) it takes a
very short time, like a couple of minutes.

Thank you for your help in advance!

Best,
Sajjad
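A minimal sketch of the call in question, with hypothetical file names
and a toy event list; parameter names follow the PyMVPA event-related
tutorial, and the fit requires NiPy. Since the underlying GLM is
mass-univariate, i.e., one regression per voxel against a shared design
matrix, masking beforehand should only restrict which voxels get modeled,
not change the estimates inside the mask; modeling a small ROI instead of
the whole brain is also why it finishes in minutes.

    from mvpa2.suite import *

    # hypothetical files; fmri_dataset adds sa.time_coords from the TR
    ds = fmri_dataset('bold.nii.gz', mask='roi_mask.nii.gz')

    # toy events: one dict per trial with onset/duration in seconds
    events = [dict(onset=12.0, duration=2.0, targets='faces', chunks=0),
              dict(onset=30.0, duration=2.0, targets='houses', chunks=0)]

    # one beta sample per targets/chunks combination, for masked voxels only
    evds = fit_event_hrf_model(ds, events,
                               time_attr='time_coords',
                               condition_attr=('targets', 'chunks'))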