=== Download using wget ===
The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size.

'''Mac OS NOTE:''' Use '''curl -O''' in place of '''wget'''.

Open a terminal, create a directory called {{{tutorial_data}}} where you know you have at least 100GB of space, and {{{cd}}} into that directory. To download, type:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &
}}}

This download will likely take several hours. The wget application handles poor connections and retries if it runs into problems. A failed download can be restarted by adding the -c flag to wget, which causes it to continue from the point where the partial download stopped. Note that each command ends with an ampersand, so you can run them all at once (although the interleaved wget output will be hard to decipher; an alternative is to run each command without the ampersand in a separate terminal). Go to the Installation section below once the files are downloaded.
==== Optional Verification Step ====
If you want to verify that the files transferred correctly using md5sum, the md5sum for each of these downloads can be found in files named *.md5sum.txt [[http://surfer.nmr.mgh.harvard.edu/pub/data/|in this directory]], or get them this way:

{{{
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.md5sum.txt &
}}}

You will want to ensure that these md5sums match what you find when you run md5sum on your own local downloads. If they do not match, the file transfer may have been faulty or there may have been a disk error.

'''Mac OS NOTE:''' Use '''md5 -r''' to get the same results as '''md5sum'''.

More on md5sum can be found [[http://www.techradar.com/us/news/computing/pc/how-to-verify-your-files-in-linux-with-md5-641436|here]].
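The check described above can be sketched as a short script. The archive name is taken from this page, but a stand-in file is created here so the commands can be tried without the real multi-gigabyte download; with the real data, skip the setup lines and run {{{md5sum -c}}} directly against the downloaded .md5sum.txt file.

```shell
#!/bin/sh
# Sketch of the optional verification step (Linux; on macOS use
# `md5 -r file` and compare the hash against the .md5sum.txt entry).
set -e

workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for a downloaded archive (hypothetical content).
printf 'tutorial data stand-in\n' > buckner_data-tutorial_subjs.tar.gz

# Stand-in for the published checksum file ("<md5>  <filename>" lines).
md5sum buckner_data-tutorial_subjs.tar.gz > buckner_data-tutorial_subjs.md5sum.txt

# The actual check: md5sum -c re-hashes each listed file and reports OK.
if md5sum -c buckner_data-tutorial_subjs.md5sum.txt; then
    echo "VERIFIED"
else
    echo "MISMATCH: re-download with wget -c" >&2
fi
```

If the check reports a mismatch, re-download the affected archive (wget -c resumes a partial download) before extracting it.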
FreeSurfer Tutorial Datasets
The full data for the tutorials consists of several data sets:
buckner_data-tutorial_subjs.tar.gz: the main 'recon-all' stream subject data processing (size: ~16GB uncompressed).
long-tutorial.tar.gz: the longitudinal tutorial (size: ~16GB uncompressed)
fsfast-tutorial.subjects.tar.gz & fsfast-functional.tar.gz: the FS-FAST tutorial data set (size: ~5.6GB uncompressed & ~9.1GB uncompressed, respectively)
diffusion_recons.tar.gz & diffusion_tutorial.tar.gz: the diffusion and TRACULA tutorial data sets
If you only want to get started with the basics of FreeSurfer, you need only download the buckner_data set, which is enough for the introductory tutorials.
Installation
Uncompress the files
Once the dataset(s) have been downloaded, uncompress and install them by running the following command from a terminal window:

{{{
tar -xzvf <filename>.tar.gz
}}}

replacing <filename> with the name of the downloaded file. The downloaded .tar.gz files can then be deleted.
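Rather than typing the tar command once per archive, all of the downloaded archives can be extracted in one pass. This is an illustrative sketch, not part of FreeSurfer; a stand-in archive is created here so the loop can be tried without the real downloads.

```shell
#!/bin/sh
# Sketch: extract every .tar.gz in the current directory.
# Run this from the tutorial_data directory after downloading.
set -e

workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for one downloaded archive (hypothetical content).
mkdir buckner_data
printf 'placeholder\n' > buckner_data/README
tar -czf buckner_data-tutorial_subjs.tar.gz buckner_data
rm -r buckner_data

# The loop itself: extract each archive; afterwards the .tar.gz
# files can be deleted to reclaim disk space.
for f in *.tar.gz; do
    tar -xzvf "$f"
done
```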
Set variables to point FreeSurfer to data
In order to do the tutorials, you must define an environment variable called TUTORIAL_DATA, set to the location of the extracted data. For example:
{{{
export TUTORIAL_DATA=/home/username/Downloads/tutorial_data
ls $TUTORIAL_DATA
buckner_data        fsfast-functional
diffusion_recons    fsfast-tutorial.subjects
diffusion_tutorial  long-tutorial
}}}
You are now ready to start the FreeSurfer tutorials.
{{{
(bash) export TUTORIAL_DATA=<absolute_path_to_tutorial_data_directory>
(csh)  setenv TUTORIAL_DATA <absolute_path_to_tutorial_data_directory>
}}}
Note: The tutorial references the TUTORIAL_DATA variable, which is the root directory containing the tutorial data (i.e., the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).
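A quick way to confirm the setup is to check that TUTORIAL_DATA is set and that the expected directories exist under it. This helper is an illustrative sketch, not part of FreeSurfer; a stand-in layout is created here so it can be tried without the real data, and only two of the directory names from this page are checked.

```shell
#!/bin/sh
# Sketch: sanity-check the TUTORIAL_DATA environment variable.
set -e

# Stand-in layout so the check can be tried anywhere; with real data,
# skip these lines and export TUTORIAL_DATA as shown above.
TUTORIAL_DATA=$(mktemp -d)
export TUTORIAL_DATA
mkdir "$TUTORIAL_DATA/buckner_data" "$TUTORIAL_DATA/long-tutorial"

# The check: report any expected subdirectory that is missing.
missing=0
for d in buckner_data long-tutorial; do
    if [ ! -d "$TUTORIAL_DATA/$d" ]; then
        echo "missing: $TUTORIAL_DATA/$d" >&2
        missing=1
    fi
done

if [ "$missing" -eq 0 ]; then
    echo "TUTORIAL_DATA looks good: $TUTORIAL_DATA"
fi
```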