'''Index'''

<<TableOfContents>>

== FreeSurfer Tutorial Datasets ==
The full data for the [[Tutorials|tutorials]] consists of several data sets:

 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz|buckner_data-tutorial_subjs.tar.gz]]: the main 'recon-all' stream subject data (~16GB uncompressed)
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz|long-tutorial.tar.gz]]: the longitudinal tutorial (~16GB uncompressed)
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz|fsfast-tutorial.subjects.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz|fsfast-functional.tar.gz]]: the FS-FAST tutorial data (~5.6GB & ~9.1GB uncompressed, respectively)
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz|diffusion_recons.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz|diffusion_tutorial.tar.gz]]: the diffusion and TRACULA tutorial data

If you only want to get started with the basics of !FreeSurfer, you need only download the buckner_data set. This will allow you to do the following tutorials:

 * [[FsTutorial/OutputData_freeview|Intro to FreeSurfer Output]]
 * [[FsTutorial/TroubleshootingData|Troubleshooting FreeSurfer Output]]
 * [[FsTutorial/GroupAnalysis|Group Analysis via command-line]]
 * [[FsTutorial/QdecGroupAnalysis_freeview|Group Analysis via GUI]]
 * [[FsTutorial/AnatomicalROI|ROI Analysis]]
 * [[FsTutorial/MultiModal_freeview|Multimodal Analysis]]
=== Download using wget ===

The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. '''Mac OS NOTE:''' use '''curl -O''' in place of '''wget'''.

Open a terminal, create a directory called {{{tutorial_data}}} where you know you have at least 100GB of space, and {{{cd}}} into that directory. To download, type:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &
}}}
This download will likely take several hours.
The wget application will handle poor connections and retry if it is having problems. A failed download can be restarted by adding the {{{-c}}} flag to wget, which continues from the point where the partial download stopped. Note that each command ends with an ampersand, so all of them run at once (the interleaved wget output is hard to decipher, so an alternative is to run each command without the ampersand in a separate terminal). Go to the Installation section below once the files are downloaded.
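Since a failed transfer can waste hours, it may be worth confirming up front that the download directory really has the recommended space free. A minimal sketch (the 100GB figure comes from the instructions above; the check itself is not part of FreeSurfer):

```shell
# Pre-flight check (a sketch, not a FreeSurfer tool): verify the current
# directory has at least the recommended amount of free space before
# starting the downloads.
need_gb=100
# df -P gives POSIX-stable output; column 4 is available space in KB.
avail_kb=$(df -Pk . | awk 'NR==2 {print $4}')
avail_gb=$((avail_kb / 1024 / 1024))
if [ "$avail_gb" -lt "$need_gb" ]; then
    echo "only ${avail_gb}GB free here; ${need_gb}GB recommended" >&2
else
    echo "sufficient space: ${avail_gb}GB free"
fi
```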
== Installation ==

==== Uncompress the files ====

Once the dataset(s) have been downloaded, uncompress and install them with the following command, run from a terminal window:
{{{
tar -xzvf <filename>.tar.gz
}}}

Replace {{{<filename>}}} with the name of each downloaded file. The downloaded .tar.gz files can then be deleted.
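When several archives sit in the same directory, the per-file tar command above can be wrapped in a small loop. A sketch, assuming the archives were downloaded into the current directory ({{{extract_all}}} is a hypothetical helper, not a FreeSurfer tool):

```shell
# extract_all: hypothetical helper that unpacks every .tar.gz archive in the
# current directory, deleting each archive only after a successful extraction.
extract_all() {
    for f in *.tar.gz; do
        [ -e "$f" ] || continue        # glob matched nothing; no archives here
        tar -xzf "$f" && rm -f "$f"    # keep the archive if tar fails
    done
}
```

Run it from inside the download directory, e.g. {{{cd tutorial_data && extract_all}}}.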
==== Set variables to point FreeSurfer to data ====

In order to do the tutorials, you must define an environment variable called '''TUTORIAL_DATA''' that is set to the location of the extracted data:

{{{
(bash) export TUTORIAL_DATA=<absolute_path_to_tutorial_data_directory>

(csh)  setenv TUTORIAL_DATA <absolute_path_to_tutorial_data_directory>
}}}

For example:

{{{
$> export TUTORIAL_DATA=/home/username/Downloads/tutorial_data
$> ls $TUTORIAL_DATA
buckner_data        fsfast-functional
diffusion_recons    fsfast-tutorial.subjects
diffusion_tutorial  long-tutorial
}}}

'''Note:''' The tutorials reference the {{{TUTORIAL_DATA}}} variable, which is the root directory containing the tutorial data (i.e. the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).

You are now ready to start the [[Tutorials|FreeSurfer tutorials]].
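Since every tutorial command depends on this variable, a quick sanity check before starting can catch a mistyped path early. A sketch ({{{check_tutorial_data}}} is a hypothetical helper, not part of FreeSurfer; the directory names it spot-checks are the ones listed above):

```shell
# check_tutorial_data: hypothetical helper that confirms TUTORIAL_DATA is set
# and points at a directory containing the expected tutorial datasets.
check_tutorial_data() {
    if [ -z "$TUTORIAL_DATA" ]; then
        echo "TUTORIAL_DATA is not set" >&2
        return 1
    fi
    for d in buckner_data long-tutorial; do   # spot-check two expected dirs
        if [ ! -d "$TUTORIAL_DATA/$d" ]; then
            echo "missing: $TUTORIAL_DATA/$d" >&2
            return 1
        fi
    done
    echo "TUTORIAL_DATA looks usable: $TUTORIAL_DATA"
}
```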
== FreeSurfer Tutorial Datasets 5.1/5.3 ==

 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs-5.3.tar.gz|buckner_data-tutorial_subjs-5.3.tar.gz]]
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutoriaL-5.3.tar.gz|long-tutorial-5.3.tar.gz]]
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects-5.3.tar.gz|fsfast-tutorial.subjects-5.3.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz|fsfast-functional-5.3.tar.gz]]
 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons-5.3.tar.gz|diffusion_recons-5.3.tar.gz]] & [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial-5.3.tar.gz|diffusion_tutorial-5.3.tar.gz]]