ISMB 2003 presentation
Extracting Synonymous Gene and Protein Terms from Biological Literature
Hong Yu and Eugene Agichtein
Dept. of Computer Science, Columbia University, New York, USA
{hongyu, eugene}@cs.columbia.edu
212-939-7028
Significance and Introduction
- Genes and proteins are often associated with multiple names
  - Apo3, DR3, TRAMP, LARD, and lymphocyte associated receptor of death
- Authors often use different synonyms
- Information extraction benefits from identifying those synonyms
- Synonym knowledge sources are not complete
- Goal: develop automated approaches for identifying gene/protein synonyms from the literature
Background-synonym identification
- Semantically related words
  - Distributional similarity [Lin 98] [Li and Abe 98] [Dagan et al. 95]
    - "beer" and "wine" share context words such as "drink", "people", "bottle", and "make"
- Mapping abbreviations to full forms
  - Map LARD to lymphocyte associated receptor of death
  - [Bowden et al. 98] [Hisamitsu and Niwa 98] [Liu and Friedman 03] [Pakhomov 02] [Park and Byrd 01] [Schwartz and Hearst 03] [Yoshida et al. 00] [Yu et al. 02]
- Methods for detecting biomedical multiword synonyms
  - Sharing a word(s) [Hole 00]
    - cerebrospinal fluid vs. cerebrospinal fluid protein assay
  - Information retrieval approach: trigram matching algorithm [Wilbur and Kim 01]
    - Vector space model over character trigrams (a sketch follows below)
      - cerebrospinal fluid → cer, ere, …, uid
      - cerebrospinal fluid protein assay → cer, ere, …, say
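As a concrete illustration of the trigram vector-space idea, here is a minimal sketch assuming cosine similarity over character-trigram counts; the function names and the scoring are illustrative, not the published algorithm of Wilbur and Kim.

```python
from collections import Counter
from math import sqrt

def trigrams(term: str) -> Counter:
    """Decompose a term into overlapping character trigrams, e.g. 'cerebrospinal' -> cer, ere, reb, ..."""
    s = term.lower().replace(" ", "")
    return Counter(s[i:i + 3] for i in range(len(s) - 2))

def trigram_cosine(a: str, b: str) -> float:
    """Cosine similarity between the trigram vectors of two terms (vector-space model)."""
    va, vb = trigrams(a), trigrams(b)
    dot = sum(va[t] * vb[t] for t in va)
    norm = sqrt(sum(v * v for v in va.values())) * sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

# Multiword terms that share a head string score high, as in the slide's example.
print(trigram_cosine("cerebrospinal fluid", "cerebrospinal fluid protein assay"))
```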
Background-synonym identification
- GPE [Yu et al. 02]: a rule-based approach for detecting synonymous gene/protein terms (a simplified rule sketch follows below)
  - Manually recognize patterns authors use to list synonyms
    - Apo3/TRAMP/WSL/DR3/LARD
  - Extract synonym candidates and apply heuristics to filter out unrelated terms
    - e.g., ng/kg/min
- Advantages and disadvantages
  - High precision (90%)
  - Recall might be low; expensive to build up
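A hedged sketch of a slash-list pattern plus a unit filter, in the spirit of GPE; the regular expression and the unit list are assumptions for illustration, not the published rules.

```python
import re

# Candidate pattern: several slash-separated tokens, e.g. "Apo3/TRAMP/WSL/DR3/LARD".
SLASH_LIST = re.compile(r"\b(?:[A-Za-z][\w-]*/){2,}[A-Za-z][\w-]*\b")

# Heuristic filter: discard slash-separated measurement units such as "ng/kg/min".
UNIT_TOKENS = {"ng", "kg", "min", "mg", "ml", "h", "mol"}

def extract_synonym_groups(sentence: str) -> list[list[str]]:
    groups = []
    for match in SLASH_LIST.finditer(sentence):
        tokens = match.group(0).split("/")
        if all(t.lower() in UNIT_TOKENS for t in tokens):
            continue  # looks like a unit expression, not a synonym list
        groups.append(tokens)
    return groups

print(extract_synonym_groups("Apo3/TRAMP/WSL/DR3/LARD induces apoptosis at 5 ng/kg/min."))
```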
Background—Machine-learning
- Machine-learning reduces manual effort by automatically acquiring rules from data
- Unsupervised and supervised
- Semi-supervised
  - Bootstrapping [Hearst 92] [Yarowsky 95] [Agichtein and Gravano 00]
    - Hyponym detection [Hearst 92] (a pattern sketch follows below)
      - "The bow lute, such as the Bambara ndang, is plucked and has an individual curved neck for each string."
      - → A Bambara ndang is a kind of bow lute
  - Co-training [Blum and Mitchell 98]
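A minimal sketch of a Hearst-style "such as" pattern for hyponym detection; the regular expression is an illustrative assumption and covers only this one pattern.

```python
import re

# "X, such as Y" -> Y is a kind of X (one of the Hearst patterns).
SUCH_AS = re.compile(r"(?P<hypernym>[\w ]+?), such as (?P<hyponym>[\w ]+?)[,.]")

def hyponyms(sentence: str) -> list[tuple[str, str]]:
    return [(m.group("hyponym").strip(), m.group("hypernym").strip())
            for m in SUCH_AS.finditer(sentence)]

sentence = ("The bow lute, such as the Bambara ndang, is plucked "
            "and has an individual curved neck for each string.")
print(hyponyms(sentence))  # [('the Bambara ndang', 'The bow lute')]
```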
Method-Outline
- Machine-learning
  - Unsupervised
    - Similarity [Dagan et al. 95]
  - Semi-supervised
    - Bootstrapping: SNOWBALL [Agichtein and Gravano 02]
  - Supervised
    - Support Vector Machine
- Comparison between machine-learning and GPE
- Combined approach
Method--Unsupervised
- Contextual similarity [Dagan et al. 95]
  - Hypothesis: synonyms have similar surrounding words
  - Mutual information (formula reconstructed below)
  - Similarity (formula reconstructed below)
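The slide's formulas did not survive extraction; the following is a hedged reconstruction using a standard pointwise mutual information and an MI-weighted context-overlap similarity, which may differ in detail from the talk's exact definitions.

```latex
% Pointwise mutual information between a term t and a context word w
I(t, w) = \log \frac{P(t, w)}{P(t)\,P(w)}

% Similarity of two terms via overlap of their informative context words C(t)
\mathrm{sim}(t_1, t_2) =
  \frac{\sum_{w \in C(t_1) \cap C(t_2)} \left( I(t_1, w) + I(t_2, w) \right)}
       {\sum_{w \in C(t_1)} I(t_1, w) \; + \; \sum_{w \in C(t_2)} I(t_2, w)}
```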
Methods—semi-supervised
- SNOWBALL [Agichtein and Gravano 02]
  - Bootstrapping: starts with a small set of user-provided seed tuples for the relation, then automatically generates and evaluates patterns for extracting new tuples (a simplified sketch of this loop follows below)
  - Seed tuples: {Apo3, DR3}, {LARD, Apo3}, {DR3, LARD}
  - Occurrences in text: "Apo3, also known as DR3…", "DR3, also called LARD…"
  - Learned patterns: "<GENE>, also called <GENE>", "<GENE>, also known as <GENE>"
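A highly simplified sketch of the bootstrapping loop, not the actual SNOWBALL system; the pattern representation (literal middle strings) and the tiny in-memory corpus are assumptions for illustration.

```python
# Simplified bootstrapping in the spirit of SNOWBALL: seeds -> patterns -> new tuples.
import re

corpus = [
    "Apo3, also known as DR3, is a death-domain receptor.",
    "DR3, also called LARD, binds TWEAK.",
    "TRAMP, also known as Apo3, was cloned independently.",
]
seeds = {("Apo3", "DR3"), ("DR3", "LARD")}
GENE = r"([A-Za-z][\w-]*)"

def learn_patterns(tuples):
    """Collect the text between known synonym pairs as candidate middle patterns."""
    patterns = set()
    for a, b in tuples:
        for sent in corpus:
            m = re.search(re.escape(a) + r"(.{1,30}?)" + re.escape(b), sent)
            if m:
                patterns.add(m.group(1))          # e.g. ", also known as "
    return patterns

def apply_patterns(patterns):
    """Use learned middles to extract new candidate tuples from the corpus."""
    found = set()
    for middle in patterns:
        rx = re.compile(GENE + re.escape(middle) + GENE)
        for sent in corpus:
            for a, b in rx.findall(sent):
                found.add((a, b))
    return found

for _ in range(2):                                # a couple of bootstrapping iterations
    seeds |= apply_patterns(learn_patterns(seeds))
print(seeds)
```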
Method--Supervised
- Support Vector Machine
  - State-of-the-art text classification method
  - SVMlight
- Training sets: the same sets of positive and negative tuples as SNOWBALL
- Features: the same terms and term weights used by SNOWBALL
- Kernel function: radial basis function (RBF) kernel (a rough sketch follows below)
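The talk used SVMlight; as a rough stand-in, here is a scikit-learn sketch of training an RBF-kernel SVM on term-weight feature vectors. The feature layout and the data are placeholder assumptions, not the study's features.

```python
# Sketch of RBF-kernel SVM classification of candidate synonym tuples.
import numpy as np
from sklearn.svm import SVC

# Placeholder features: weights of context words such as "also", "known", "called".
X_train = np.array([
    [0.53, 0.47, 0.0],   # "..., also known as ..."  -> positive
    [0.54, 0.0, 0.54],   # "..., also called ..."    -> positive
    [0.0, 0.0, 0.0],     # unrelated co-occurrence   -> negative
    [0.1, 0.0, 0.0],     # weak context              -> negative
])
y_train = np.array([1, 1, 0, 0])

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

candidate = np.array([[0.5, 0.45, 0.0]])
print(clf.predict(candidate), clf.decision_function(candidate))
```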
Methods—Combined
- Rationale
  - Machine-learning approaches increase recall
  - The manual rule-based approach GPE has high precision but lower recall
  - Combining them should boost both recall and precision
- Method
  - Assume each system is an independent predictor
  - Combined confidence = 1 - probability that all systems extracted the pair incorrectly (spelled out below)
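Spelling out the slide's statement under its independence assumption (the exact formula on the slide was not preserved): if system i assigns confidence p_i to a candidate pair, then

```latex
P_{\text{combined}} = 1 - \prod_{i} (1 - p_i)
```

For example, two systems reporting confidences 0.9 and 0.5 combine to 1 - (0.1)(0.5) = 0.95.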
Evaluation-data
- Data
  - GeneWays corpora [Friedman et al. 01]
    - 52,000 full-text journal articles
    - Science, Nature, Cell, EMBO, Cell Biology, PNAS, Journal of Biochemistry
- Preprocessing
  - Gene/protein named entity tagging: ABGene [Tanabe and Wilbur 02]
  - Segmentation: SentenceSplitter
- Training and testing
  - 20,000 articles for training
    - Tuning SNOWBALL parameters such as context window, etc.
  - 32,000 articles for testing
Evaluation-metrics
- Estimating precision
  - Randomly select 20 synonyms from each confidence-score range (0.0-0.1, 0.1-0.2, …, 0.9-1.0)
  - Biological experts judged the correctness of the synonym pairs
- Estimating recall (a sketch of the computation follows below)
  - SWISSPROT as the gold standard
    - 989 pairs of SWISSPROT synonyms co-appear in at least one sentence in the test set
    - Biological experts judged 588 of these pairs to be indeed synonyms
    - "…and cdc47, cdc21, and mis5 form another complex, which relatively weakly associates with mcm2…"
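A hedged sketch of the recall computation against the expert-validated SWISSPROT pairs; the data and numbers here are placeholders, not the study's data.

```python
# Recall against a gold standard of expert-validated SWISSPROT synonym pairs.
def normalize(pair):
    """Treat synonym pairs as unordered: (A, B) == (B, A)."""
    return tuple(sorted(pair))

gold_pairs = {normalize(p) for p in [("Apo3", "DR3"), ("DR3", "LARD"), ("cdc21", "mis5")]}
extracted  = {normalize(p) for p in [("DR3", "Apo3"), ("LARD", "DR3"), ("ng", "kg")]}

true_positives = extracted & gold_pairs
recall = len(true_positives) / len(gold_pairs)            # fraction of gold pairs recovered
precision = len(true_positives) / len(extracted)          # fraction of extracted pairs that are correct
print(f"recall={recall:.2f}, precision={precision:.2f}")  # recall=0.67, precision=0.67
```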
Results
- Patterns SNOWBALL found:

  Conf | Left | Middle                             | Right
  0.75 | -    | <( 0.55> <ALSO 0.53> <CALLED 0.53> | -
  0.54 | -    | <( 0.54> <ALSO 0.54> <TERMED 0.54> | -
  0.47 | -    | <ALSO 0.47> <KNOWN 0.47> <AS 0.47> | -

- Of 148 evaluated synonym pairs, 62 (42%) were not listed as synonyms in SWISSPROT
Results
- System performance (running time):

  System     | Time
  Tagging    | 40 mins
  Similarity | 7 hrs
  Snowball   | 2 hrs
  SVM        | 1.5 hrs
  GPE        | 35 mins
Conclusions
- Extraction techniques can be used as a valuable supplement to resources such as SWISSPROT
- Extraction of synonym relations can be automated through machine-learning approaches
- SNOWBALL can be applied successfully for recognizing the patterns