Differences between revisions 6 and 62 (spanning 56 versions)
Revision 6 as of 2005-10-25 18:44:43 (size: 4109)
Revision 62 as of 2015-10-08 14:53:44 (size: 7145, editor: ZekeKaufman)

[[FsTutorial|top]]

== FreeSurfer Tutorial Datasets ==

The data set for the FreeSurfer tutorials comes in two forms: a "lightweight" version and a "full" version.

  * The "lightweight" version contains only the files required to run the commands of the FreeSurfer tutorial. It has all the required input and output data, but lacks the other subject data that would normally be present when performing a recon. People who just want a quick, easy way to run the commands in the tutorial should download this data set.
  * The "full" version contains all the files that would normally be present when performing recons. Because of the numerous subjects involved in the tutorials, the full data set is quite large (~60GB). People who want to go into more depth than is covered in the course and the tutorials should download this data set.

== FreeSurfer Tutorial Datasets - lightweight version ==

Use the following link to download the lightweight version of the tutorial data:

 * [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/tutorial_data.tar.gz|tutorial_data.tar.gz]]

Or download using the following command:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/tutorial_data.tar.gz
}}}
Once the download is complete, uncompress the data file - usually this can be done by simply double-clicking. To run the tutorials, source FreeSurfer, then set an environment variable called '''TUTORIAL_DATA''':

{{{
(bash)
<source_freesurfer>
export TUTORIAL_DATA=<path_to_tutorial_data>

(csh)
<source_freesurfer>
setenv TUTORIAL_DATA <path_to_tutorial_data>
}}}

You are now ready to start the [[FsTutorial|FreeSurfer tutorials]].

== FreeSurfer Tutorial Datasets - full version ==

The full data for the [[FsTutorial| tutorials]] consists of several data sets:
 * buckner_data-tutorial_subjs.tar.gz: data for the main 'recon-all' subject-processing stream (size: ~16GB uncompressed). [[ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt|Here is the md5sum]].
 * long-tutorial.tar.gz: the longitudinal tutorial (size: ~16GB uncompressed)
 * fsfast-tutorial.subjects.tar.gz & fsfast-functional.tar.gz: the FS-FAST tutorial data set (size: ~5.6GB uncompressed & ~9.1GB uncompressed, respectively)
 * diffusion_recons.tar.gz & diffusion_tutorial.tar.gz: the diffusion and Tracula tutorial data sets
 * fbert-feat.tgz & bert.recon.tgz: tutorial data for the integration of !FreeSurfer and FSL/FEAT

If you only want to get started with the basics of !FreeSurfer, you need only download the buckner_data set. This will allow you to do the following tutorials:
 * [[FsTutorial/OutputData_freeview|Intro to FreeSurfer Output]]
 * [[FsTutorial/TroubleshootingData|Troubleshooting FreeSurfer Output]]
 * [[FsTutorial/GroupAnalysis|Group Analysis via command-line]]
 * [[FsTutorial/QdecGroupAnalysis_freeview|Group Analysis via GUI]]
 * [[FsTutorial/AnatomicalROI|ROI Analysis]]
 * [[FsTutorial/MultiModal_freeview|Multimodal Analysis]]

=== Download using wget ===
The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. '''Mac OS NOTE:''' Use '''curl -O''' in place of '''wget'''.

Open a terminal, create a directory called {{{tutorial_data}}} where you know you have at least 100GB of space, and {{{cd}}} into that directory. To download, type:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &
}}}

This download will likely take several hours.

The wget application handles poor connections and retries when it runs into problems. A failed download can be resumed by adding the -c flag to wget, which makes it continue from the point where the partial download stopped. Note that each command ends with an ampersand, so you can run all of the downloads at once (although the interleaved wget output will be hard to decipher; an alternative is to run each command, without the ampersand, in a separate terminal). Go to the Installation section below once the files are downloaded.
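
The ampersand pattern can be sketched as follows. This is an illustrative snippet, not part of the official instructions: '''sleep''' stands in for '''wget''' so the example runs without network access.

{{{
# Launch long-running transfers in the background with '&', then use
# 'wait' to block until every background job has finished.
# 'sleep 1' is a stand-in for a real command such as:
#   wget -c ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
sleep 1 &
sleep 1 &
wait
echo "all downloads finished"
}}}

Adding a '''wait''' after the last wget command is a convenient way to know when every background transfer has completed.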

==== Optional Verification Step ====
If you want to verify that the files transferred correctly using md5sum, the md5sum for each of these downloads can be found in files named *.md5sum.txt [[http://surfer.nmr.mgh.harvard.edu/pub/data/|in this directory]], or download them as follows:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.md5sum.txt &
}}}

You will want to ensure that these md5sums match what you get when you run md5sum on your own local downloads. If they do not match, the file transfer may have been faulty or there may have been a disk error. '''Mac OS NOTE:''' Use '''md5 -r''' to get the same output format as '''md5sum'''. More on md5sum can be found [[http://www.techradar.com/us/news/computing/pc/how-to-verify-your-files-in-linux-with-md5-641436|here]].
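
The comparison can be sketched like this, assuming GNU md5sum. The file names here are small local stand-ins created on the spot, so the example runs without the real multi-gigabyte downloads:

{{{
# demo.tar.gz stands in for a downloaded archive; demo.md5sum.txt plays
# the role of the distributed *.md5sum.txt file.
echo "example data" > demo.tar.gz
md5sum demo.tar.gz > demo.md5sum.txt
# 'md5sum -c' re-hashes each listed file and reports OK on a match.
md5sum -c demo.md5sum.txt    # prints: demo.tar.gz: OK
}}}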

=== Installation ===
==== Uncompress the files ====
Once the dataset(s) have been downloaded, uncompress and install them with the following command, run from a terminal window:

{{{
tar xzvf <filename>.tar.gz
}}}

Replace <filename> with the name of the downloaded file. The downloaded .tar.gz files can then be deleted.

==== Set variables to point FreeSurfer to data ====
To set the environment variable {{{TUTORIAL_DATA}}} to point to the tutorial data, type the following commands every time you open a new terminal window. Alternatively, you can include the commands below in your .bashrc (bash) or .cshrc/.tcshrc (csh) file so the variable is set automatically every time you open a new terminal window:
{{{
(bash)
export TUTORIAL_DATA=<absolute_path_to_tutorial_data_directory>
(csh)
setenv TUTORIAL_DATA <absolute_path_to_tutorial_data_directory>
}}}

'''Note:''' The tutorial references the {{{TUTORIAL_DATA}}} variable, which is the root directory containing the tutorial data (i.e. the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).
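
A quick bash sanity check that {{{TUTORIAL_DATA}}} points at an existing directory. The path used here is a hypothetical stand-in for your real download location:

{{{
# Hypothetical path for illustration only; substitute the directory
# where you actually unpacked the tutorial data.
export TUTORIAL_DATA=/tmp/tutorial_data_demo
mkdir -p "$TUTORIAL_DATA"    # stand-in for the real data directory
# Verify the variable names a directory before starting a tutorial.
if [ -d "$TUTORIAL_DATA" ]; then
    echo "TUTORIAL_DATA OK: $TUTORIAL_DATA"
else
    echo "TUTORIAL_DATA is not set correctly" >&2
fi
}}}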
=== FSL-FEAT Tutorial Data ===

Unpack the FSL-FEAT tutorial data as follows (replace /place/for/functional/data with a directory of your choice):

{{{
cd $SUBJECTS_DIR
tar xvfz bert.recon.tgz
cd /place/for/functional/data
tar xvfz fbert-feat.tgz
}}}
FsTutorial/Data (last edited 2018-09-30 09:35:54 by AndrewHoopes)