FreeSurfer Tutorial Datasets

The data for the tutorials consists of several data sets:

  • buckner_data-tutorial_subjs.tar.gz: the subject data for the main 'recon-all' processing stream (size: ~16GB uncompressed). The md5sum is available here: ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt

  • long-tutorial.tar.gz: the longitudinal tutorial (size: ~16GB uncompressed)
  • fsfast-tutorial.subjects.tar.gz & fsfast-functional.tar.gz: the FS-FAST tutorial data set (size: ~5.6GB & ~9.1GB uncompressed, respectively)
  • diffusion_recons.tar.gz & diffusion_tutorial.tar.gz: the diffusion and Tracula tutorial data sets
  • fbert-feat.tgz & bert.recon.tgz: tutorial on the integration of FreeSurfer and FSL/FEAT

If you only want to get started with the basics of FreeSurfer, you need only download the buckner_data set; it is sufficient for the introductory tutorials.

Download using wget

The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. Mac OS NOTE: Use curl -O in place of wget.

Open a terminal and change to a directory where you know you have at least 100GB of free space. To download, type:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &

This download will likely take several hours.

The wget application will handle poor connections and retry if it is having problems. A failed download can be restarted by adding the -c flag to wget, which will cause it to continue from the point where the partial download stopped. Notice that each command ends with an ampersand (&), which runs it in the background so that all the downloads can run at once (although the interleaved wget output will be hard to decipher, so an alternative is to run each command without the ampersand in a separate terminal). Go to the Installation section below once the files are downloaded.
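
For example, if the download of the main data set is interrupted partway through, it can be resumed from where it stopped:

wget -c ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz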

If you want to verify that the files transferred correctly, the md5sum for each of these downloads can be found in files named *.md5sum.txt at http://surfer.nmr.mgh.harvard.edu/pub/data/, or you can get them this way:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.md5sum.txt &

You will want to ensure that these md5sums match what you find when you run md5sum on your own local downloads. If they do not match, the file transfer may have been faulty or there may have been a disk error. Mac OS NOTE: Use md5 -r to get the same results as md5sum. More on md5sum can be found here: http://www.techradar.com/us/news/computing/pc/how-to-verify-your-files-in-linux-with-md5-641436
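
For example, a minimal check for the main data set is to compute its checksum and compare it by eye against the published one:

md5sum buckner_data-tutorial_subjs.tar.gz
cat buckner_data-tutorial_subjs.md5sum.txt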

Installation

Once the datasets have been downloaded, move them to the $FREESURFER_HOME/subjects directory, then uncompress and install them with the following command, run from a terminal window (you'll have to cd to $FREESURFER_HOME/subjects first if you haven't already):

tar xzvf <filename>.tar.gz

Replace <filename> with the name of each downloaded file. The downloaded .tar.gz files can then be deleted.
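
For example, to unpack the main data set (the .tgz archives unpack the same way):

cd $FREESURFER_HOME/subjects
tar xzvf buckner_data-tutorial_subjs.tar.gz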

To set up the environment variable SUBJECTS_DIR to point to the tutorial data, type the following commands every time you open a new terminal window. Alternatively, include them in your .cshrc or .tcshrc file so these variables are set automatically:

setenv TUTORIAL_DATA $FREESURFER_HOME/subjects
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/
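
The setenv syntax above is for the csh/tcsh shells (matching the .cshrc/.tcshrc files mentioned above). If your login shell is bash, the equivalent commands, which could go in your .bashrc instead, would be:

export TUTORIAL_DATA=$FREESURFER_HOME/subjects
export SUBJECTS_DIR=$TUTORIAL_DATA/buckner_data/tutorial_subjs/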

Martinos Center employees: Because the default $FREESURFER_HOME on Martinos Center workstations is shared, you will not be able to copy your data to $FREESURFER_HOME/subjects. Instead, copy the subject data to a location where you have space, and set the TUTORIAL_DATA and SUBJECTS_DIR environment variables to point to that location. You may have to make adjustments throughout the tutorial wherever it refers to $FREESURFER_HOME/subjects (which is equivalent to your $SUBJECTS_DIR).
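
For example, assuming the data were copied to /space/mydata/tutorial_data (a hypothetical path; substitute wherever you have space):

setenv TUTORIAL_DATA /space/mydata/tutorial_data
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs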

The tutorial will also instruct you to set SUBJECTS_DIR when appropriate. It references the TUTORIAL_DATA variable, which is the root directory containing the tutorial data (i.e. the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).
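
As a quick sanity check, assuming all of the data sets above were installed, listing TUTORIAL_DATA should show those directories:

ls $TUTORIAL_DATA
# should include, among others: buckner_data  long-tutorial  fsfast-functional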

FSL-FEAT Tutorial Data

# Unpack the anatomical (recon-all) data into your subjects directory
cd $SUBJECTS_DIR
tar xvfz bert.recon.tgz
# Unpack the functional data wherever you keep functional data
# (/place/for/functional/data is a placeholder; substitute your own path)
cd /place/for/functional/data
tar xvfz fbert-feat.tgz
