Scaffold Mapping Tools: Annotation
How to annotate SPARC datasets in the mapping tool
Navigate to the File menu (top left) and select the Open option. From the MAPClient-Workflows folder, open the sparc-dataset-curation-helper folder and select the map-client-workflow.proj file.
Note: The sparc-dataset-curation-helper directory is a sibling of the sparc-data-mapping directory, which should be the current directory when the file chooser dialog opens. If it is not, the sparc-dataset-curation-helper directory can be found at C:\Users\your username\AppData\Roaming\MusculoSkeletal\MAPClient-Workflows, where your username is replaced by your Windows username.
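If you prefer to build this path programmatically, a minimal Python sketch follows. The helper name is ours, and it assumes the default Windows install location described above:

```python
from pathlib import PureWindowsPath

def mapclient_workflows_dir(username: str) -> str:
    """Build the default MAP Client workflows path for a Windows user.

    Hypothetical helper; assumes the default install location noted above.
    """
    return str(PureWindowsPath("C:/Users", username, "AppData", "Roaming",
                               "MusculoSkeletal", "MAPClient-Workflows"))

print(mapclient_workflows_dir("alice"))
# C:\Users\alice\AppData\Roaming\MusculoSkeletal\MAPClient-Workflows
```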
To configure this workflow:

- Right-click on the DatasetDirectory icon and click on the Configure option.
- Select the Output directory (for the demonstration dataset this would be DATASET_ROOT).
- Click on the OK button.
- Save the workflow (Ctrl+S).
- Click on the Execute button to run this workflow.
Figure 1: MAP Client showing the configure dialog of the sparc-dataset-curation-helper workflow Directory Chooser step, highlighting the: Directory output, OK button, and Execute button.
The SPARC dataset curation helper step consists of the Scaffold Annotation, Plot Annotation, and Context Annotation tabs. To prepare the demonstration dataset we have been working on, we will only make use of the Scaffold Annotation and Context Annotation tabs.
The Scaffold Annotation tab has three main sections. The first is the Scaffold annotations section, which reports the currently available scaffold annotations. The second is the Manual annotation section, which helps to manually annotate the metadata files, view files, and thumbnail files (note that it only supports the isDerivedFrom and isSourceOf predicates). The third section, Errors, is where errors are reported and can be fixed.
Manual annotation should be used with extreme caution: it is very easy to make nonsensical statements that will undermine machine interpretation of the dataset. The automatic annotation tool in the Errors section works in the vast majority of cases and should remove the need for manual annotation.
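To make the risk concrete: isDerivedFrom and isSourceOf are inverse predicates, so every link should be stated consistently in both directions. The sketch below uses our own triple-based data model, not the tool's internal one, to show the kind of consistency a correct annotation set has:

```python
# Illustrative data model (ours, not the tool's): annotations as
# (subject, predicate, object) triples over files in the dataset.
INVERSE = {"isDerivedFrom": "isSourceOf", "isSourceOf": "isDerivedFrom"}

def missing_inverses(triples):
    """Return the inverse triples that would be needed for consistency."""
    return {(o, INVERSE[p], s) for s, p, o in triples
            if (o, INVERSE[p], s) not in triples}

annotations = {
    ("mouse_colon_default_view.json", "isDerivedFrom", "mouse_colon_metadata.json"),
    ("mouse_colon_metadata.json", "isSourceOf", "mouse_colon_default_view.json"),
    # A one-sided statement: the view's inverse link is missing.
    ("mouse_colon_default_thumbnail.jpeg", "isDerivedFrom", "mouse_colon_default_view.json"),
}

print(missing_inverses(annotations))
# {('mouse_colon_default_view.json', 'isSourceOf', 'mouse_colon_default_thumbnail.jpeg')}
```

A manual annotation that states only one direction of a link, or links the wrong pair of files, produces exactly this kind of inconsistency.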
Figure 2: MAP Client highlighting the three sections of the scaffold annotation step, namely, Scaffold annotations, Manual annotation, and Errors.
In the Errors section we can see a list of the scaffold annotation errors currently found in our demonstration dataset. We can fix these errors using the Fix All Errors button: click on it, then click the Yes button on the confirmation pop-up window to allow the magic tool to fix them.
Note: The curation helper tool works on one layer of errors at a time, so you may need to click the Fix All Errors button multiple times until the tool stops reporting new errors. If the reported errors stop changing, use the manual annotation tool to make the correct annotations.
Figure 3: Scaffold annotation tool showing the magic tool confirmation dialog, highlighting: Confirmation dialog, Yes button, Errors section, and Fix All Errors button.
After fixing all the errors we should see a tree structure in the Scaffold annotations section. For this demonstration, mouse_colon_metadata.json is the root of the tree. The root has two children, mouse_colon_default_view.json and mouse_colon_proximal_view.json. Each of these view files is associated with a single thumbnail, mouse_colon_default_thumbnail.jpeg and mouse_colon_proximal_thumbnail.jpeg respectively. This tree representation of the scaffold annotations shows us that the annotations have been made correctly.
Figure 4: Scaffold annotation tool showing the annotated files, namely, the mouse_colon_metadata.json, mouse_colon_default_view.json and the mouse_colon_proximal_view.json along with their respective thumbnail files: mouse_colon_default_thumbnail.jpeg and mouse_colon_proximal_thumbnail.jpeg.
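The expected structure can be sketched as plain data. The following Python snippet uses our own nested-dict representation of the tree from Figure 4 (not the tool's internal model) and renders it with indentation:

```python
# Our own representation of the Figure 4 tree, not the tool's internal model.
tree = {
    "mouse_colon_metadata.json": {
        "mouse_colon_default_view.json": {"mouse_colon_default_thumbnail.jpeg": {}},
        "mouse_colon_proximal_view.json": {"mouse_colon_proximal_thumbnail.jpeg": {}},
    }
}

def render(node, depth=0):
    """Flatten the nested dict into indented lines, one file per line."""
    lines = []
    for name, children in node.items():
        lines.append("  " * depth + name)
        lines.extend(render(children, depth + 1))
    return lines

print("\n".join(render(tree)))
```

Each view file hangs off the metadata root, and each thumbnail hangs off its view, mirroring the isDerivedFrom relationships the tool records.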
Next, we move to the Context Annotation tab to add additional contextual information to the visualization.
At the top of the Context Annotation tab there is a provision to manually select a scaffold annotation map file. However, when the dataset already contains a saved scaffold annotation map file, as is the case for our demonstration dataset, the scaffold annotation tool loads it automatically. If you want to add further annotations, you can load them using the select scaffold annotation map file functionality.
The Context Annotation tab is subdivided into three parts: Summary, Samples, and Views. Depending on the dataset, we should enter details in the Summary section and make additions in at least one of the Samples or Views parts.
For our demonstration, we will start by providing a summary. We need to provide a Heading for this dataset, e.g., Colon with neural tracings, and an appropriate description in the Description field, e.g., Visualization of 3D digital tracing of enteric plexus in mouse proximal colon scaffold.
Figure 5: Context Annotation tab highlighting the Summary section.
Next, go to the Samples tab. To add a sample:

- Click on the Add button.
- Enter the Heading, e.g., sample 1.
- Provide a DOI. For this demo, the DOI field is left empty because the sample is included in the dataset (you only need to enter a DOI if the sample comes from a different dataset).
- Enter the Path, i.e., DATASET_ROOT\primary, where the segmentation file is located.
- Add an annotation in the Annotation field.
- Associate the annotation with a view through the View option. (Note: The View chooser only lists the available views once they have been added in the Views section below.)
- Provide a Description.
Figure 6: Context Annotation tab highlighting the Samples section.
Following the Summary and Samples sections, go to the Views section. We can add information about a view as follows:

- Click on the Add button.
- Select the Path value, i.e., DATASET_ROOT\derivative\scaffold\mouse_colon_default_view.json.
- Provide a Thumbnail value, e.g., DATASET_ROOT\derivative\scaffold\mouse_colon_default_thumbnail.jpeg.
- The annotation chooser allows us to annotate the view. The annotation terms listed in the chooser are loaded from the scaffold annotation map file; hover the mouse over a term to see a more human-friendly form of it. Set the annotation to UBERON:0001155 (the Uberon identifier for colon).
- Enter a Description for the view, e.g., Digital tracings of enteric plexus mapped onto the proximal section of the mouse colon scaffold.
Figure 7: Context Annotation tab highlighting the Views section; showing the View 1 details.
Similarly, we will add another view and associate it with the mouse colon proximal view:

- Click on the Add button.
- Select the Path value, i.e., DATASET_ROOT\derivative\scaffold\mouse_colon_proximal_view.json.
- Provide a Thumbnail value, e.g., DATASET_ROOT\derivative\scaffold\mouse_colon_proximal_thumbnail.jpeg.
- Select UBERON:0008972 (the Uberon identifier for the proximal colon) for the annotation.
- We can associate this view with sample 1: use the sample chooser and select Sample 1. (Associating View 2 with Sample 1 here also associates Sample 1 with View 2 under the Samples tab.)
- Enter a Description.
Figure 8: Context Annotation tab highlighting the Views section; showing the View 2 details.
Figure 9: Context Annotation tab showing the association of Sample 1 with View 2.
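The annotation terms used above (UBERON:0001155 and UBERON:0008972) are UBERON CURIEs: the prefix UBERON followed by a seven-digit identifier. A quick Python sanity check for that pattern, written as our own hypothetical helper rather than part of the tool:

```python
import re

def looks_like_uberon(term: str) -> bool:
    """Check the UBERON CURIE pattern used in this guide, e.g. UBERON:0001155 (colon)."""
    return re.fullmatch(r"UBERON:\d{7}", term) is not None

print(looks_like_uberon("UBERON:0001155"))  # True
print(looks_like_uberon("colon"))           # False
```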
After entering all the required context annotation information (Summary, Samples, and Views details), click on the Write Annotation button followed by the Done button to write the contextual annotation to the dataset, making it ready for dataset curation.
Figure 10: Context Annotation tab highlighting the Write Annotation button and Done button.
Once the workflow has finished executing (returned to its initial state), navigate to the scaffold folder (DATASET_ROOT\derivative\scaffold) to view the manifest.xlsx Excel file. This file contains all of the scaffold annotation and contextual information and is ready for the SPARC portal.
Figure 11: manifest.xlsx file shown in Excel displaying the results of the scaffold annotation tool.
Move on to add provenance or return to the main Scaffold Mapping Tools page.