FreeSurfer Tutorial: Sample Data

The data for the tutorials consists of several data sets:

  • buckner_data-tutorial_subjs.tar.gz ~16GB uncompressed - data for the main 'recon-all' processing stream (md5sum: ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt)

  • long-tutorial.tar.gz ~16GB uncompressed - the longitudinal tutorial
  • fsfast-tutorial.subjects.tar.gz ~5.6GB uncompressed and fsfast-functional.tar.gz ~9.1GB uncompressed - the FS-FAST tutorial data set
  • diffusion_recons.tar.gz and diffusion_tutorial.tar.gz - the diffusion and CVS tutorial data sets
  • Tracula_tutorial_data.tar.gz - Tracula tutorial dataset
  • fbert-feat.tgz and bert.recon.tgz - tutorial on the integration of FreeSurfer and FSL/FEAT

The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. Mac OS NOTE: use curl -O in place of wget.
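
For example, on Mac OS the main data set could be fetched with curl instead of wget, using the same URL as in the wget commands below:

curl -O ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz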

Download using wget

Open a terminal and change to a directory where you know you have at least 100GB of free space. To download, type:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/Tracula_tutorial_data.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &

The wget application handles poor connections and will retry if it runs into problems. A failed download can be restarted by adding the -c flag to wget, which makes it continue from the point where the partial download stopped. Notice that the commands end with an ampersand, so you can run them all at once (although the combined wget output will be hard to decipher; an alternative is to run each command, without the ampersand, in a separate terminal). Go to the Installation section below once the files are downloaded (this will likely take several hours). If you want to verify a download, the md5sum for each of these files is found in a file named *.md5sum.txt at http://surfer.nmr.mgh.harvard.edu/pub/data/.
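
For example, to restart an interrupted download of the main data set and then check it against the published checksum, you could run something like the following (md5sum is standard on Linux; on Mac OS the equivalent command is md5):

# resume the partial download from the point where it stopped
wget -c ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz
# fetch the published checksum, then compute the local one and compare the two values
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt
md5sum buckner_data-tutorial_subjs.tar.gz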

Installation

Once the files are downloaded, move them to the $FREESURFER_HOME/subjects directory, then uncompress and install each one with the following command:

tar xzvf <filename>.tar.gz

Replace <filename> with the name of each downloaded file. The downloaded .tar.gz files can then be deleted.
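
As a concrete example, assuming the main data set was moved into $FREESURFER_HOME/subjects, it would be unpacked there with:

cd $FREESURFER_HOME/subjects
tar xzvf buckner_data-tutorial_subjs.tar.gz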

To set the SUBJECTS_DIR environment variable to point to the tutorial data, type the following commands or include them in your .cshrc or .tcshrc file:

setenv TUTORIAL_DATA $FREESURFER_HOME/subjects
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/

Note: If you are within the NMR Center, because the default $FREESURFER_HOME is shared, you will not be able to copy your data to $FREESURFER_HOME/subjects. Instead, copy the subject data to a location where you have space, and set the TUTORIAL_DATA and SUBJECTS_DIR environment variables to point to that. You may have to make adjustments throughout the tutorial wherever it refers to $FREESURFER_HOME/subjects (which is equivalent to your $SUBJECTS_DIR).
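
For example, a minimal sketch of that setup, assuming a hypothetical local directory /space/mydata with enough room for the unpacked data:

# /space/mydata is only an example; use any directory where you have space
setenv TUTORIAL_DATA /space/mydata
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/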

The tutorial will also instruct you to set SUBJECTS_DIR when appropriate. The tutorial references the TUTORIAL_DATA variable, which is the root directory containing the tutorial data (i.e. the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).

FSL-FEAT Tutorial Data

# unpack the FreeSurfer recon for subject bert into your SUBJECTS_DIR
cd $SUBJECTS_DIR
tar xvfz bert.recon.tgz
# unpack the FEAT functional data into a directory of your choice
cd /place/for/functional/data
tar xvfz fbert-feat.tgz
