TR_NESCent_Visit_09FEB10_UI_Mockup
This is a summary of the user interface (UI) discussion that surrounded the activity of mocking up the prototype.
Current entry points for the user:
Search by Single BLAST
Discussion of results from a Single BLAST was tabled because the topic has dominated previous working group meetings, and Andrew should be able to come up w/ UI mockups on his own.
Search by gene term
The search box looks like Google, so we should expect users to treat it that way. The drop-down for specifying the type of term (GenBank identifier, genome sequencing project identifier, GO term, EMBL identifier, etc.) is unlikely to strike users as helpful on its own. Later discussion led to giving it a default value of "ALL" (or something else indicating 'anything'), with the user able to narrow the focus via the drop-down if they choose.
A "help/info" link should be present below the box indicating what terms are available to search on. Beyond that, if the user selects one of the terms, say "genome sequencing project ID", it would show AT##### or OS##### as a sample of the format. This would help users unfamiliar w/ the identifiers (like Andrew) find what they're looking for.
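A minimal sketch of how the help link could map the selected term type to a sample format, assuming a simple lookup table (the term names and sample formats below are illustrative, based on the examples above):

```python
# Hypothetical mapping from search-term type to a sample identifier
# format, used to populate the "help/info" text under the search box.
# Term names and sample formats here are assumptions, not a spec.
SAMPLE_FORMATS = {
    "genome sequencing project ID": "AT##### or OS##### (e.g. AT15601)",
    "GO term": "GO:####### (e.g. GO:0008150)",
}

def help_text(term_type):
    """Return a sample-format hint for the selected term type."""
    sample = SAMPLE_FORMATS.get(term_type)
    if sample is None:
        return "Enter any identifier or keyword."
    return f"Sample format: {sample}"
```

With the "ALL" default selected, the fallback message would apply; choosing a specific term type narrows both the search and the hint shown.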
The intent of this search is to funnel the user toward a gene family.
Results within the "Search by gene term" tab
A limited number of results could be shown within the "search by gene term" tab itself, without forcing the user to leave the page. If the search is serving its purpose, one of the first 5 or 10 hits will likely be what the user is looking for:
Detail view of the search results within the tab
Matching strategy was also discussed. For identifier searches, a "starts with" approach may be sufficient (in SQL, something LIKE 'AT156%'), but we also need to support the more expensive 'within' matching to get hits in values like the GenBank definition line (in SQL, something LIKE '%AT156%').
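The two strategies can be sketched with an in-memory SQLite table (table and column names here are illustrative, not the actual schema):

```python
import sqlite3

# Toy table standing in for the gene data; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genes (identifier TEXT, defline TEXT)")
conn.executemany(
    "INSERT INTO genes VALUES (?, ?)",
    [
        ("AT15601", "putative kinase [Arabidopsis thaliana] AT15601"),
        ("OS15602", "hypothetical protein w/ homology to AT15603"),
    ],
)

# Cheap "starts with" match on identifiers: a fixed prefix means an
# index on the column can still be used.
prefix_hits = conn.execute(
    "SELECT identifier FROM genes WHERE identifier LIKE 'AT156%'"
).fetchall()

# More expensive "within" match for free-text fields such as the
# GenBank definition line: the leading wildcard forces a full scan.
within_hits = conn.execute(
    "SELECT identifier FROM genes WHERE defline LIKE '%AT156%'"
).fetchall()
```

Note that the 'within' query also finds the OS15602 row, since "AT156" appears inside its definition line; that is exactly the extra recall (and extra cost) being traded off.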
Results from a "Search by gene term"
Continuing on from the notion of a search results page, where a user might go was discussed:
The search result page will fork users (re #1.) in two directions for the selected gene family: toward the reconciled tree, or toward a page showing ortholog sets. We have tabled discussion of the ortholog sets page at Todd's request (he needs to consult w/ someone more knowledgeable on the topic). The visualization will funnel the user to the 3-panel view. We discussed that the 3 separate panels are likely not going to work out: screen real estate is at a premium anyway, so dividing it into smaller portions will not help. Perhaps a tabbed view, or some panning/zooming mechanism, would help with the great amount of visual detail presented.
The search results (#2.) will show gene families and their members (in green). The individual genes should be hyperlinked out to external sources about the gene (linking back into iPlant Discovery Environment pages would be less optimal, or even "bad"). The checkboxes here could be used to build up a "gene context" acting like "points of interest" on a map, where the map is the 'fat tree' representation of the reconciled tree (so an image from PRIMETV). This could be shown in a panning/zooming view and let the user home in on areas of interest. The "gene context" could be added to, further narrowing the view shown in the 3-panel area, and the gene tree could also be used to refine the "gene context" further.
The search results would differ if the user came in via the "gene term search" and had been searching on a GO term, pathway, or domain. This needs further discussion in working group meetings.
Visualization Approach discussion re Reconciled Tree
If PRIMETV were used to produce reconciled tree images, might it be possible to create those images and also generate something like the markup file Google Maps uses, allowing points on the image to be plotted? Would a Google Maps approach be too much? Perhaps the zooming and panning used by MorphBank would be sufficient. MorphBank currently uses a commercial tool called FSI Viewer to provide panning & zooming on images, and is also looking at a tool called Zoomify, which has free & commercial versions. There is a SourceForge project for Zoomifying a given image for use with the tool.
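One way to think about the "markup file" idea: given pixel coordinates for genes of interest on a static PRIMETV image, emit marker entries a pan/zoom viewer could overlay, scaled per zoom level the way map viewers plot points on tiles. All names and coordinates below are hypothetical, a sketch under the assumption that each zoom step doubles the image dimensions:

```python
# Hypothetical sketch: scale image-space marker coordinates for a
# given zoom level so a viewer can overlay "points of interest"
# (the gene context) on a zoomable reconciled-tree image.
def scale_markers(points_of_interest, zoom):
    """Return overlay markers with coordinates scaled for a zoom level."""
    factor = 2 ** zoom  # assume each zoom step doubles the dimensions
    return [
        {"label": label, "x": x * factor, "y": y * factor}
        for label, (x, y) in points_of_interest.items()
    ]

# Example gene context with made-up base-image pixel coordinates.
gene_context = {"AT15601": (120, 340), "OS15602": (480, 90)}
markers = scale_markers(gene_context, zoom=2)
```

A Zoomify-style viewer would consume something like this alongside the image tiles; whether that viewer is FSI Viewer, Zoomify, or a Google Maps embed is exactly the open question above.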
reference: whiteboard