DACQ acquisition and data transfer
Data acquisition workflow
- Log into the DACQ workstation using the common username and password.
- Check the helium budget with the Helium Recycler Scheduler.
- Start Elekta Neuromag -> Acquisition.
- Load an existing project or insert new project details.
- Load existing subject information or insert a new subject. Do not insert real personal information (name or birthday) as subject details; use a unique code instead. Use the subject group “Volunteers”.
- Load or set up data acquisition information (channels, sampling rate, filters, ...).
- Define trigger setup (internal for DACQ-generated sequences, external for Stim-PC controlled stimuli).
- Set up stimulators, response devices, etc., and test that everything works as intended.
- Check that the gantry is in the intended position and correctly recognized by the DACQ software.
- Check the empty-room signal quality. A few bad channels are OK; otherwise, fix them (Tuning).
- Check signal quality with stimulation and the subject in place. If there are artefacts, check the subject for jewelry, clothing, dental work, make-up, dyes, tattoos, surgical leftovers, and implants.
- Prepare the subject. Bring the subject into the MEG, perform HPI, start stimulation if applicable, record raw & average data.
- When the experiment is complete, stop stimuli and recording, save the data, and bring out the subject.
- Thank the subject, help them clean up if needed, answer any questions, give the reward, and make sure they leave happy.
- Move the data to the CIBR server (see separate instructions).
- Log out of the DACQ workstation, but leave the workstation on.
- If you are the last user of the day, bring the gantry to the helium liquefaction position.
- Clean up your mess, leave everything as it was, and fill in the log book.
Data transfer to the CIBR servers
While MEG data are being collected, they are saved to a temporary file on the DACQ computer. If the measurement crashes for some reason, you can try recovering the data with the Neuromag utility “Rescue data”. When data acquisition finishes, the data (usually both evoked and raw) are saved to the external network drive attached to DACQ. This external drive sometimes takes a while to initialize or reports username/permission errors, but these are not fatal. The network drive is mirrored, so report any drive warnings to the staff to maintain data integrity.
For further analysis, the data are moved to the CIBR servers, where they can be accessed remotely. Moving your data from DACQ to a CIBR server takes three steps: checksum computation, data transfer, and checksum verification. We compute checksums to make sure the data are transferred uncorrupted (no lost bits). Using the command terminal is preferred because it makes the process fast and repeatable. In JYU Linux environments, the command terminal is called Konsole.
First you need to know your data paths. For a project called “test”, the data path is /projects/test/ both in the MEG lab and on the server. For a given measurement session, the path is extended with <subject>/<date>/ by default.
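For example, with a hypothetical subject code S01 and session date directory 241015 (the actual names are set during acquisition), the layout would look like this:

/projects/test/             # project root, same path in the lab and on the server
/projects/test/S01/241015/  # one measurement session: <subject>/<date>/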
Step 1. Checksum computation
Go to the data directory that you want to transfer (including its sub-directories):
cd <data_path>
Compute the checksum for each fif-file under the directory using a custom script:
sha256_recur.sh
There is no user feedback at this stage, except the appearance of the checksum files in the subdirectories. Do not try to transfer files that have already been transferred.
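For reference, here is a minimal sketch of what such a recursive checksum script could do; this is an illustrative assumption, not the actual DACQ script (the real sha256_recur.sh may use a different checksum file layout):

#!/bin/bash
# Sketch only: write a SHA-256 checksum file next to every .fif file
# under the current directory. The <file>.fif.sha256 naming convention
# is an assumption for this sketch.
find . -type f -name '*.fif' | while read -r f; do
    sha256sum "$f" > "$f.sha256"
done

The paths recorded inside the checksum files are relative to the directory where the script was run, so the same layout can be verified on the server from the corresponding directory.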
Step 2. Data transfer
Use the command scp (for “secure copy”) with the following syntax:
scp -pr <data_path> nobelist@cibr1:/projects/test/
Here, replace the username “nobelist” with your own username, and the target directory /projects/test/ with your own target directory in the project folder.
scp displays the progress of the transfer.
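For example, to copy the hypothetical subject directory from above (replace the paths and the username with your own):

scp -pr /projects/test/S01 nobelist@cibr1:/projects/test/

Here -r copies the directory recursively and -p preserves file modification times and modes, which keeps the transferred session consistent with the original.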
Step 3. Checksum check
For this final step, you need to log into the CIBR servers. You can do this on the DACQ workstation or from anywhere else. With ssh, the command looks like this:
ssh nobelist@cibr1.psy.jyu.fi
Again, replace the username.
Go to the directory of the transferred data, for example:
cd /projects/test/
To check the integrity of the transferred data, use a custom script on the server. It has the same name as the checksum script on the DACQ computer:
sha256_recur.sh
The script outputs the check result for each file whose checksum is recorded in the checksum files under the current directory. Once every transferred file is reported as “OK”, you are finished. Wait 24 hours before removing the transferred data from the DACQ computer, so that all server back-ups are up to date.
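As in step 1, here is a minimal sketch of what the server-side verification could look like, assuming the checksum file layout from the step 1 sketch; the actual server script may differ:

#!/bin/bash
# Sketch only: verify every recorded checksum under the current
# directory; sha256sum -c prints OK or FAILED for each listed file.
# Run it from the same directory level as the step 1 script so that
# the recorded relative paths resolve.
find . -type f -name '*.sha256' | while read -r c; do
    sha256sum -c "$c"
done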
If the sha256 check command complains “Command not found.”, first run the command “bash” and then retype your command.
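In that case, the sequence looks like this:

bash             # start a bash shell
sha256_recur.sh  # then retype the checksum command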