Differences between revisions 57 and 79 (spanning 22 versions)
Revision 57 as of 2015-03-03 17:01:44
Size: 5426
Editor: ZekeKaufman
Comment:
Revision 79 as of 2016-08-04 12:04:00
Size: 7548
Comment: updated tutorial links to point to the Tutorials page that lists all tutorials instead of the FSTutorial page which only lists the tutorials used at the latest course
Deletions are marked like this. Additions are marked like this.
Line 4: Line 4:
The data set for the !FreeSurfer tutorials comes in two forms: a "Lightweight" version and a "Full" version.
Line 5: Line 6:
The data for the [[FsTutorial| tutorials]] consists of several data sets:
 * buckner_data-tutorial_subjs.tar.gz: the main 'recon-all' stream subject data processing (size: ~16GB uncompressed). [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt|Here is the md5sum]].
 * long-tutorial.tar.gz: the longitudinal tutorial (size: ~16GB uncompressed)
 * fsfast-tutorial.subjects.tar.gz & fsfast-functional.tar.gz: the FS-FAST tutorial data set (size: ~5.6GB uncompressed & ~9.1GB uncompressed, respectively)
 * diffusion_recons.tar.gz & diffusion_tutorial.tar.gz - the diffusion and Tracula tutorial data sets
 * fbert-feat.tgz & bert.recon.tgz - tutorial on the integration of !FreeSurfer and FSL/FEAT
 * The "Lightweight" version contains only the files required to run the commands of the !FreeSurfer tutorial. It has all the required input and output data, but lacks the other files that would normally be present when performing a recon. People who just want a quick, easy way to run the commands in the tutorial should download this data set.
Line 12: Line 8:
If you only want to get started with the basics of !FreeSurfer, you need only download the buckner_data set. This will allow you to do the following tutorials:
 *[[FsTutorial/OutputData_freeview|Intro to FreeSurfer Output]]
 *[[FsTutorial/TroubleshootingData|Troubleshooting FreeSurfer Output]]
 *[[FsTutorial/GroupAnalysis|Group Analysis via command-line]]
 *[[FsTutorial/QdecGroupAnalysis_freeview|Group Analysis via GUI]]
 *[[FsTutorial/AnatomicalROI|ROI Analysis]]
 *[[FsTutorial/MultiModal_freeview|Multimodal Analysis]]
 * The "Full" version contains all the files that would normally be present when performing recons. Because of the numerous subjects involved with the tutorials, the full data set is quite large (~60GB). People who want to go into more depth than is covered in the tutorials should download this data set.

== Lightweight version ==
Use the following link to download the lightweight version of the tutorial data:

 . [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/tutorial_data.tar.gz|tutorial_data.tar.gz]]

Once the download is complete, uncompress the data file - usually this can be done by simply double-clicking it. To run the tutorials, source !FreeSurfer, then define an environment variable called '''TUTORIAL_DATA''' which points to the uncompressed directory. For example:

{{{
$> <source freesurfer>
$> export TUTORIAL_DATA=/home/username/Downloads/tutorial_data
$> ls $TUTORIAL_DATA
buckner_data fsfast-functional
diffusion_recons fsfast-tutorial.subjects
diffusion_tutorial long-tutorial
}}}
You are now ready to start the [[Tutorials|Freesurfer tutorials]].

== Full version ==
The full data for the [[Tutorials|tutorials]] consists of several data sets:

 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz|buckner_data-tutorial_subjs.tar.gz]]: the main 'recon-all' stream subject data processing (size: ~16GB uncompressed).
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz|long-tutorial.tar.gz]]: the longitudinal tutorial (size: ~16GB uncompressed)
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz|fsfast-tutorial.subjects.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz|fsfast-functional.tar.gz]]: the FS-FAST tutorial data set (size: ~5.6GB uncompressed & ~9.1GB uncompressed, respectively)
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz|diffusion_recons.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz|diffusion_tutorial.tar.gz]] - the diffusion and Tracula tutorial data sets
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz|fbert-feat.tgz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz|bert.recon.tgz]] - tutorial on the integration of !FreeSurfer and FSL/FEAT

Also, if you only want to get started with the basics of !FreeSurfer, you need only download the buckner_data set. This will allow you to do the following tutorials:

 * [[FsTutorial/OutputData_freeview|Intro to FreeSurfer Output]]
 * [[FsTutorial/TroubleshootingData|Troubleshooting FreeSurfer Output]]
 * [[FsTutorial/GroupAnalysis|Group Analysis via command-line]]
 * [[FsTutorial/QdecGroupAnalysis_freeview|Group Analysis via GUI]]
 * [[FsTutorial/AnatomicalROI|ROI Analysis]]
 * [[FsTutorial/MultiModal_freeview|Multimodal Analysis]]
Line 21: Line 46:
The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. '''Mac OS NOTE:''' Use '''curl -O''' in place of '''wget'''.
Line 24: Line 49:
Line 34: Line 60:
Line 37: Line 62:
The wget application will handle poor connections, and retry if it is having problems. A failed download can be restarted by adding the -c flag to wget, which will cause it to continue from the point where a partial download stopped. Notice the commands have the ampersand, so you can run all these commands at once (although the wget output will be hard to decipher, so an alternative is to run each without the ampersand in a separate terminal). Go to the Installation section below once the files are downloaded.
Line 52: Line 77:
Line 57: Line 81:
Once the datasets have been downloaded, uncompress and install with the following command run from a terminal window.
Once the dataset(s) have been downloaded, uncompress and install with the following command run from a terminal window.
Line 62: Line 86:

Replacing <filename> with the name of each file downloaded. The downloaded .tar.gz files can then be deleted. 
Replacing <filename> with the name of the file downloaded. The downloaded .tar.gz files can then be deleted.
Line 67: Line 90:
Line 68: Line 92:
(bash)
export TUTORIAL_DATA=<absolute_path_to_tutorial_data_directory>

(csh)
Line 69: Line 97:
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/
Line 71: Line 98:


FreeSurfer Tutorial Datasets

The data set for the FreeSurfer tutorials comes in two forms: a "Lightweight" version and a "Full" version.

  • The "Lightweight" version contains only the files required to run the commands of the FreeSurfer tutorial. It has all the required input and output data, but lacks the other files that would normally be present when performing a recon. People who just want a quick, easy way to run the commands in the tutorial should download this data set.

  • The "Full" version contains all the files that would normally be present when performing recons. Because of the numerous subjects involved with the tutorials, the full data set is quite large (~60GB). People who want to go into more depth than is covered in the tutorials should download this data set.

Lightweight version

Use the following link to download the lightweight version of the tutorial data:

  • tutorial_data.tar.gz (ftp://surfer.nmr.mgh.harvard.edu/pub/data/tutorial_data.tar.gz)

Once the download is complete, uncompress the data file - usually this can be done by simply double-clicking it. To run the tutorials, source FreeSurfer, then define an environment variable called TUTORIAL_DATA which points to the uncompressed directory. For example:

$> <source freesurfer>
$> export TUTORIAL_DATA=/home/username/Downloads/tutorial_data
$> ls $TUTORIAL_DATA
buckner_data                    fsfast-functional
diffusion_recons                fsfast-tutorial.subjects
diffusion_tutorial              long-tutorial

You are now ready to start the FreeSurfer tutorials.

Full version

The full data for the tutorials consists of several data sets:

  • buckner_data-tutorial_subjs.tar.gz: the main 'recon-all' stream subject data processing (size: ~16GB uncompressed)
  • long-tutorial.tar.gz: the longitudinal tutorial (size: ~16GB uncompressed)
  • fsfast-tutorial.subjects.tar.gz & fsfast-functional.tar.gz: the FS-FAST tutorial data set (size: ~5.6GB & ~9.1GB uncompressed, respectively)
  • diffusion_recons.tar.gz & diffusion_tutorial.tar.gz: the diffusion and Tracula tutorial data sets
  • fbert-feat.tgz & bert.recon.tgz: tutorial data for the integration of FreeSurfer and FSL/FEAT

Also, if you only want to get started with the basics of FreeSurfer, you need only download the buckner_data set. This will allow you to do the following tutorials:

  • Intro to FreeSurfer Output
  • Troubleshooting FreeSurfer Output
  • Group Analysis via command-line
  • Group Analysis via GUI
  • ROI Analysis
  • Multimodal Analysis

Download using wget

The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. Mac OS NOTE: Use curl -O in place of wget.
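
For instance, on Mac OS the first download in the list below could be started with (the URL is taken from the wget commands that follow):

curl -O ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz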

Open a terminal, create a directory called tutorial_data where you know you have at least 100GB of space, and cd into that directory. To download, type:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &

This download will likely take several hours.

The wget application will handle poor connections, and retry if it is having problems. A failed download can be restarted by adding the -c flag to wget, which will cause it to continue from the point where a partial download stopped. Notice the commands have the ampersand, so you can run all these commands at once (although the wget output will be hard to decipher, so an alternative is to run each without the ampersand in a separate terminal). Go to the Installation section below once the files are downloaded.
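
For example, a partially downloaded buckner_data archive could be resumed, from the same directory as the partial file, with:

wget -c ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz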

Optional Verification Step

If you want to verify that the files transferred correctly using md5sum, the md5sum for each of these downloads can be found in files named *.md5sum.txt in the same FTP directory, or you can get them this way:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.md5sum.txt &

You will want to ensure that these md5sums match what you find when you run md5sum on your own local downloads. If they do not match, the file transfer may have been faulty or there may have been a disk error. Mac OS NOTE: Use md5 -r to get the same results as md5sum. More on md5sum can be found here.
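
For example, to check the buckner_data archive, one can compute the checksum of the local file and compare it by eye against the published value (on Mac OS, use md5 -r in place of md5sum):

md5sum buckner_data-tutorial_subjs.tar.gz
cat buckner_data-tutorial_subjs.md5sum.txt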

Installation

Uncompress the files

Once the dataset(s) have been downloaded, uncompress and install them with the following command, run from a terminal window:

tar xzvf <filename>.tar.gz

Replace <filename> with the name of each downloaded archive. The downloaded .tar.gz and .tgz files can then be deleted.
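
For example, to unpack the buckner_data archive (the .tgz archives unpack the same way):

tar xzvf buckner_data-tutorial_subjs.tar.gz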

Set variables to point FreeSurfer to data

To set up the environment variables TUTORIAL_DATA and SUBJECTS_DIR to point to the tutorial data, type the following commands every time you open a new terminal window. Alternatively, you can include the commands below in your shell startup file (.bashrc for bash; .cshrc or .tcshrc for csh/tcsh) so these variables are set automatically every time you open a new terminal window:

(bash)
export TUTORIAL_DATA=<absolute_path_to_tutorial_data_directory>
export SUBJECTS_DIR=$TUTORIAL_DATA/buckner_data/tutorial_subjs/

(csh)
setenv TUTORIAL_DATA <absolute_path_to_tutorial_data_directory>
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/

Note: The tutorial references the TUTORIAL_DATA variable, which points to the root directory containing the tutorial data (i.e., the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).
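
As a concrete sketch for bash users, assuming the archives were unpacked under /home/username/tutorial_data (a hypothetical path), the variables could be appended to ~/.bashrc and checked like this:

# append the variable definitions to the shell startup file, then reload it and check
echo 'export TUTORIAL_DATA=/home/username/tutorial_data' >> ~/.bashrc
echo 'export SUBJECTS_DIR=$TUTORIAL_DATA/buckner_data/tutorial_subjs' >> ~/.bashrc
source ~/.bashrc
ls $SUBJECTS_DIR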

FSL-FEAT Tutorial Data

# unpack the bert anatomical recon into your FreeSurfer subjects directory
cd $SUBJECTS_DIR
tar xvfz bert.recon.tgz
# unpack the FEAT functional data wherever you keep functional data
# (/place/for/functional/data is a placeholder path)
cd /place/for/functional/data
tar xvfz fbert-feat.tgz

FsTutorial/Data (last edited 2018-09-30 09:35:54 by AndrewHoopes)