CS 460/626: Natural Language Processing / Language Technology for the Web


  • Number of slides: 95

CS 460/626: Natural Language Processing / Language Technology for the Web (Lecture 1 – Introduction)
Pushpak Bhattacharyya, CSE Dept., IIT Bombay

Persons involved
• Faculty instructors: Dr. Pushpak Bhattacharyya (www.cse.iitb.ac.in/~pb) and Dr. Om Damani (www.cse.iitb.ac.in/~damani)
• TAs: to be decided
• Course home page (to be created): www.cse.iitb.ac.in/~cs626-460-2009

Perspectivising NLP: Areas of AI and their inter-dependencies
• Search, Logic, Machine Learning, NLP, Vision, Knowledge Representation, Planning, Robotics, Expert Systems

Web brings new perspectives: the QSA Triangle
• Query – Search – Analytics

Web 2.0 tasks
• Business Intelligence on the Internet Platform
• Opinion Mining
• Reputation Management
• Sentiment Analysis (some observations at the end)
• NLP is thought to play a key role

Books etc.
• Main Text(s) / Other References:
  - NLP: A Paninian Perspective – Bharati, Chaitanya and Sangal
  - Natural Language Understanding – James Allen
  - Speech and NLP – Jurafsky and Martin
  - Foundations of Statistical NLP – Manning and Schutze
  - Statistical NLP – Charniak
• Journals: Computational Linguistics, Natural Language Engineering, AI Magazine, IEEE SMC
• Conferences: ACL, EACL, COLING, MT Summit, EMNLP, IJCNLP, HLT, ICON, SIGIR, WWW, ICML, ECML

Allied Disciplines
• Philosophy: Semantics, Meaning of "meaning", Logic (syllogism)
• Linguistics: Study of Syntax, Lexicon, Lexical Semantics etc.
• Probability and Statistics: Corpus Linguistics, Testing of Hypotheses, System Evaluation
• Cognitive Science: Computational Models of Language Processing, Language Acquisition
• Psychology: Behavioristic insights into Language Processing, Psychological Models
• Brain Science: Language Processing Areas in Brain
• Physics: Information Theory, Entropy, Random Fields
• Computer Sc. & Engg.: Systems for NLP

Topics proposed to be covered
• Shallow Processing
  - Part of Speech Tagging and Chunking using HMM, MEMM, CRF, and Rule Based Systems
  - EM Algorithm
• Language Modeling
  - N-grams
  - Probabilistic CFGs
• Basic Linguistics
  - Morphemes and Morphological Processing
  - Parse Trees and Syntactic Processing: Constituent Parsing and Dependency Parsing
• Deep Parsing
  - Classical Approaches: Top-Down, Bottom-Up and Hybrid Methods
  - Chart Parsing, Earley Parsing
  - Statistical Approach: Probabilistic Parsing, Tree Bank Corpora

Topics proposed to be covered (contd.)
• Knowledge Representation and NLP
  - Predicate Calculus, Semantic Net, Frames, Conceptual Dependency, Universal Networking Language (UNL)
• Lexical Semantics
  - Lexicons, Lexical Networks and Ontology
  - Word Sense Disambiguation
• Applications
  - Machine Translation
  - IR
  - Summarization
  - Question Answering

Grading
• Based on: Midsem, Endsem, Assignments, Seminar
• Except the first two, everything else is in groups of 4.
• Weightages will be revealed soon.

Definitions etc.

What is NLP
• Branch of AI
• Two goals:
  - Science Goal: Understand the way language operates
  - Engineering Goal: Build systems that analyse and generate language; reduce the man-machine gap

The famous Turing Test: Language Based Interaction
• Test conductor, Machine, Human
• Can the test conductor find out which is the machine and which is the human?

Inspired Eliza
• http://www.manifestation.com/neurotoys/eliza.php3

Inspired Eliza (another sample interaction)
• A Sample of Interaction (screenshot in the original slide)

"What is it" question: NLP is concerned with Grounding
• Ground the language into perceptual, motor and cognitive capacities.

Grounding
• Examples: Chair, Computer (images in the original slide)

Two Views of NLP and the Associated Challenges
1. Classical View
2. Statistical/Machine Learning View

Stages of processing
• Phonetics and phonology
• Morphology
• Lexical Analysis
• Syntactic Analysis
• Semantic Analysis
• Pragmatics
• Discourse

Phonetics
• Processing of speech
• Challenges:
  - Homophones: bank (finance) vs. bank (river bank)
  - Near homophones: maatraa vs. maatra (Hindi)
  - Word boundary: "aajaayenge" (aa jaayenge (will come) or aaj aayenge (will come today)); "I got [ua] plate"
  - Phrase boundary: e.g., "mtech1 students are especially exhorted to attend as such seminars are integral to one's post-graduate education"
  - Disfluency: ah, um, ahem etc.

Morphology
• Word formation rules from root words
• Nouns: Plural (boy-boys); Gender marking (czar-czarina)
• Verbs: Tense (stretch-stretched); Aspect (e.g. perfective sit-had sat); Modality (e.g. request khaanaa khaaiie)
• The first crucial step in NLP
• Languages rich in morphology: e.g., Dravidian, Hungarian, Turkish
• Languages poor in morphology: Chinese, English
• Languages with rich morphology have the advantage of easier processing at higher stages of processing
• A task of interest to computer science: Finite State Machines for Word Morphology (a toy sketch follows)
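As a toy illustration of the finite-state flavour of morphological analysis (a hypothetical three-word lexicon and two suffix rules; a real analyser would be a full finite-state transducer built from a much larger lexicon):

```python
# Minimal sketch: analyse English noun plurals by stripping a suffix and
# checking the remaining stem against a toy lexicon. Lexicon and rules are
# invented for illustration only.
LEXICON = {"boy", "czar", "mango"}                 # assumed toy stem list
SUFFIX_RULES = [("es", "PLURAL"), ("s", "PLURAL"), ("", "SINGULAR")]

def analyse_noun(surface: str):
    """Return all (stem, feature) analyses licensed by the toy rules."""
    analyses = []
    for suffix, feature in SUFFIX_RULES:
        if surface.endswith(suffix):
            stem = surface[: len(surface) - len(suffix)] if suffix else surface
            if stem in LEXICON:
                analyses.append((stem, feature))
    return analyses

print(analyse_noun("boys"))     # [('boy', 'PLURAL')]
print(analyse_noun("mangoes"))  # [('mango', 'PLURAL')]
print(analyse_noun("czar"))     # [('czar', 'SINGULAR')]
```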

Lexical Analysis
• Essentially refers to dictionary access and obtaining the properties of the word, e.g. dog:
  - noun (lexical property)
  - takes 's' in plural (morph property)
  - animate (semantic property)
  - 4-legged (semantic property)
  - carnivore (semantic property)
• Challenge: Lexical or word sense disambiguation

Lexical Disambiguation
• First step: Part of Speech disambiguation
  - Dog as a noun (animal)
  - Dog as a verb (to pursue)
• Sense disambiguation
  - Dog (as animal)
  - Dog (as a very detestable person)
• Needs word relationships in a context: "The chair emphasised the need for adult education"
• Very common in day-to-day communication
  - Satellite channel ad: "Watch what you want, when you want" (two senses of watch)
  - e.g., Ground-breaking ceremony/research

Technological developments bring in new terms and additional meanings/nuances for existing terms
• Justify, as in "justify the right margin" (word processing context)
• Xeroxed: a new verb
• Digital Trace: a new expression
• Communifaking: pretending to talk on a mobile when you are actually not
• Discomgooglation: anxiety/discomfort at not being able to access the internet
• Helicopter Parenting: over-parenting

Syntax Processing Stage: Structure Detection
• Parse tree for "I like mangoes": (S (NP I) (VP (V like) (NP mangoes)))

Parsing Strategy
• Driven by grammar (a parser sketch for this grammar follows):
  - S -> NP VP
  - NP -> N | PRON
  - VP -> V NP | V PP
  - N -> mangoes
  - PRON -> I
  - V -> like
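A minimal recursive-descent sketch of parsing "I like mangoes" with exactly this grammar (plain Python, no parsing library; backtracking is simplified and only adequate for a toy grammar like this one):

```python
# Toy grammar from the slide. The V PP alternative is omitted because the
# slide gives no PP rule; it is not needed for this sentence anyway.
GRAMMAR = {
    "S":    [["NP", "VP"]],
    "NP":   [["N"], ["PRON"]],
    "VP":   [["V", "NP"]],
    "N":    [["mangoes"]],
    "PRON": [["I"]],
    "V":    [["like"]],
}

def parse(symbol, tokens, pos):
    """Try to expand `symbol` starting at tokens[pos].
    Returns (tree, next_pos) or None."""
    for rhs in GRAMMAR[symbol]:
        children, cur, ok = [], pos, True
        for item in rhs:
            if item in GRAMMAR:                      # non-terminal: recurse
                result = parse(item, tokens, cur)
                if result is None:
                    ok = False
                    break
                subtree, cur = result
                children.append(subtree)
            elif cur < len(tokens) and tokens[cur] == item:   # terminal match
                children.append(item)
                cur += 1
            else:
                ok = False
                break
        if ok:
            return (symbol, children), cur
    return None

tree, end = parse("S", "I like mangoes".split(), 0)
assert end == 3
print(tree)
# ('S', [('NP', [('PRON', ['I'])]), ('VP', [('V', ['like']), ('NP', [('N', ['mangoes'])])])])
```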

Challenges in Syntactic Processing: Structural Ambiguity
• Scope
  1. "The old men and women were taken to safe locations": (old men and women) vs. ((old men) and women)
  2. "No smoking areas will allow hookahs inside"
• Preposition Phrase Attachment
  - "I saw the boy with a telescope" (who has the telescope?)
  - "I saw the mountain with a telescope" (world knowledge: a mountain cannot be an instrument of seeing)
  - "I saw the boy with the pony-tail" (world knowledge: a pony-tail cannot be an instrument of seeing)
• Very ubiquitous: newspaper headline "20 years later, BMC pays father 20 lakhs for causing son's death"

Structural Ambiguity (contd.)
• Overheard:
  - "I did not know my PDA had a phone for 3 months"
  - "The camera man shot the man with the gun when he was near Tendulkar"
• (P. G. Wodehouse, Ring for Jeeves): "Jill had rubbed ointment on Mike the Irish Terrier, taken a look at the goldfish belonging to the cook, which had caused anxiety in the kitchen by refusing its ant's eggs…"
• An actual sentence in the newspaper (Times of India, 26/2/08): "Aid for kins of cops killed in terrorist attacks"

Headache for Parsing: Garden Path Sentences
• Garden pathing:
  - "The horse raced past the garden fell."
  - "The old man the boat."
  - "Twin Bomb Strike in Baghdad kill 25" (Times of India, 05/09/07)

Semantic Analysis
• Representation in terms of Predicate Calculus / Semantic Nets / Frames / Conceptual Dependencies and Scripts
• "John gave a book to Mary"
  - Give action: Agent: John, Object: Book, Recipient: Mary
• Challenge: ambiguity in semantic role labeling
  - (Eng) "Visiting aunts can be a nuisance"
  - (Hin) "aapko mujhe mithaai khilaanii padegii" (ambiguous in Marathi and Bengali too; not in Dravidian languages)

Pragmatics
• Very hard problem
• Model user intention
  - Tourist (in a hurry, checking out of the hotel, motioning to the service boy): "Boy, go upstairs and see if my sandals are under the divan. Do not be late. I just have 15 minutes to catch the train."
  - Boy (running upstairs and coming back panting): "Yes sir, they are there."
• World knowledge
  - "WHY INDIA NEEDS A SECOND OCTOBER" (ToI, 2/10/07)

Discourse: Processing of a sequence of sentences
• Mother to John: "John, go to school. It is open today. Should you bunk? Father will be very angry."
• Ambiguity of "open"
• Bunk what?
• Why will the father be angry?
• Complex chain of reasoning and application of world knowledge
• Ambiguity of "father": as parent or as headmaster

Complexity of Connected Text
• "John was returning from school dejected – today was the math test."
• "He couldn't control the class."
• "Teacher shouldn't have made him responsible."
• "After all, he is just a janitor."

Giving a flavour of what is done: Structure Disambiguation
• Scope, Clause and Preposition/Postposition

Structure Disambiguation is as critical as Sense Disambiguation
• Scope (portion of text in the scope of a modifier)
  - "Old men and women will be taken to safe locations"
  - "No smoking areas allow hookahs inside"
• Clause
  - "I told the child that I liked that he came to the game on time"
• Preposition
  - "I saw the boy with a telescope"

Structure Disambiguation is as critical as Sense Disambiguation (contd.)
• Semantic role
  - "Visiting aunts can be a nuisance"
  - "Mujhe aapko mithaai khilaani padegii" ("I have to give you sweets" or "You have to give me sweets")
• Postposition
  - "unhone teji se bhaagte hue chor ko pakad liyaa" ("he caught the thief that was running fast" or "he ran fast and caught the thief")
• All these ambiguities lead to the construction of multiple parse trees for each sentence and need semantic, pragmatic and discourse cues for disambiguation

Higher level knowledge needed for disambiguation
• Semantics
  - "I saw the boy with a pony tail" (a pony tail cannot be an instrument of seeing)
• Pragmatics
  - ((old men) and women) as opposed to (old men and women) in "Old men and women were taken to safe locations", since women, both young and old, were very likely taken to safe locations
• Discourse
  - "No smoking areas allow hookahs inside, except the one in Hotel Grand."
  - "No smoking areas allow hookahs inside, but not cigars."

Preposition

Problem definition
• 4-tuples of the form V N1 P N2
  - saw (V) boys (N1) with (P) telescopes (N2)
• Attachment choice is between the matrix verb V and the object noun N1

Lexical Association Table (Hindle and Rooth, 1991 and 1993)
• From a large corpus of parsed text:
  - first find all noun phrase heads
  - then record the verb (if any) that precedes the head and the preposition (if any) that follows it, as well as some other syntactic information about the sentence
• Extract attachment information from this table of co-occurrences

Example: lexical association
• A table entry is considered a definite instance of the prepositional phrase attaching to the verb if the verb definitely licenses the prepositional phrase
• E.g. from PropBank, the frame for "absolve":
  - absolve.XX: NP-ARG0 NP-ARG2-of obj-ARG1
• "On Friday, the firms filed a suit *ICH*-1 against West Virginia in New York state court asking for [ARG0 a declaratory judgment] [rel absolving] [ARG1 them] of [ARG2-of liability]."

Core steps
• Seven different procedures for deciding whether a table entry is an instance of no attachment, sure noun attach, sure verb attach, or ambiguous attach
• Able to extract frequency information, counting the number of times a particular verb or noun attaches with a particular preposition

Core steps (contd.)
• These frequencies serve as the training data for the statistical model used to predict correct attachment
• To disambiguate a sentence, compute the likelihood of the particular preposition given the particular verb and contrast it with the likelihood of the preposition given the particular noun
  - i.e., compare P(with|saw) with P(with|telescope), as in "I saw the boy with a telescope" (see the sketch below)
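As a toy illustration of that comparison (the counts are made up, and the actual Hindle and Rooth model uses a smoothed log-likelihood ratio rather than raw relative frequencies):

```python
from collections import Counter

# Hypothetical co-occurrence counts of the kind the lexical association table
# provides: how often a preposition (or no preposition, None) follows a verb
# or a noun head. Numbers are invented purely for illustration.
verb_prep = Counter({("saw", "with"): 30, ("saw", "in"): 12, ("saw", None): 258})
noun_prep = Counter({("telescope", "with"): 2, ("telescope", None): 98})
verb_total = Counter({"saw": 300})
noun_total = Counter({"telescope": 100})

def p_prep_given_verb(p, v):
    return verb_prep[(v, p)] / verb_total[v]

def p_prep_given_noun(p, n):
    return noun_prep[(n, p)] / noun_total[n]

def attach(v, n1, p):
    """Attach to whichever head shows the stronger association with p."""
    return "verb" if p_prep_given_verb(p, v) > p_prep_given_noun(p, n1) else "noun"

# P(with|saw) = 30/300 = 0.10 vs. P(with|telescope) = 2/100 = 0.02
print(attach("saw", "telescope", "with"))   # 'verb' with these toy counts
```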

Critique
• Limited by the number of relationships in the training corpora
• Too large a parameter space
• The model acquired during training is represented in a huge table of probabilities, precluding any straightforward analysis of its workings

Approach based on Transformation-Based Error-Driven Learning (Brill and Resnik, COLING 1994)

Example Transformations
• Initial attachments by default are predominantly to N1.

Transformation rules with word classes
• WordNet synsets and semantic classes used

Accuracy values of the transformation-based approach: 12000 training and 500 test examples
• Hindle and Rooth (baseline): 70.4 to 75.8% accuracy; transformation rules: NA
• Transformations: 79.2% accuracy; 418 transformation rules
• Transformations (word classes): 81.8% accuracy; 266 transformation rules

Maximum Entropy Based Approach (Ratnaparkhi, Reynar, Roukos, 1994)
• Use more features than the (V, N1) bigram and (N1, P) bigram
• Apply the Maximum Entropy Principle

Core formulation
• We denote:
  - the partially parsed verb phrase, i.e., the verb phrase without the attachment decision, as a history h, and
  - the conditional probability of an attachment as P(d|h), where d corresponds to a noun or verb attachment (0 or 1, respectively).

Maximize the training data log likelihood (equations (1) and (2); see the reconstruction after the next slide)

Equating the model expected parameters and training data parameters (equations (3) and (4); reconstruction below)
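The four numbered equations on these two slides were images in the original deck and did not survive the transcript. A hedged reconstruction, in the standard conditional maximum-entropy form using the history h and decision d defined above, binary feature functions f_i (described on the following "Features" slides), and model parameters lambda_i:

```latex
% Hedged reconstruction of equations (1)-(4): standard conditional MaxEnt.
\begin{align}
  p(d \mid h) &= \frac{1}{Z(h)} \exp\Big(\sum_i \lambda_i f_i(h,d)\Big),
  \qquad Z(h) = \sum_{d' \in \{0,1\}} \exp\Big(\sum_i \lambda_i f_i(h,d')\Big) \tag{1}\\
  L(\lambda) &= \sum_{(h,d) \in \mathcal{T}} \log p(d \mid h)
  \qquad \text{($\mathcal{T}$ = training events)} \tag{2}\\
  E_{p}[f_i] &= \sum_{h,d} \tilde{p}(h)\, p(d \mid h)\, f_i(h,d) \tag{3}\\
  E_{\tilde{p}}[f_i] &= \sum_{h,d} \tilde{p}(h,d)\, f_i(h,d),
  \qquad E_{p}[f_i] = E_{\tilde{p}}[f_i] \ \ \forall i \tag{4}
\end{align}
```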

Features
• Two types of binary-valued questions:
  - Questions about the presence of any n-gram of the four head words, e.g., a bigram may be V == "is", P == "of"
  - Features comprised solely of questions on words are denoted as "word" features

Features (contd.)
• Questions that involve the class membership of a head word
• Binary hierarchy of classes derived by mutual information

Features (contd.)
• Given a binary class hierarchy, we can associate a bit string with every word in the vocabulary
• Then, by querying the value of certain bit positions, we can construct binary questions
• Features comprised solely of questions about class bits are denoted as "class" features, and features containing questions about both class bits and words are denoted as "mixed" features (a small sketch follows)
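A small sketch of what word, class and mixed features look like in code; the bit strings below are invented stand-ins for paths in a mutual-information-derived hierarchy, not real Brown clusters:

```python
# Illustrative binary features over a (V, N1, P, N2) history.
CLASS_BITS = {          # hypothetical class bit strings
    "is": "0101",
    "of": "1100",
    "director": "1001",
    "board": "1000",
}

def word_feature(history, slot, word):
    """Binary question on a head word itself, e.g. P == 'of'."""
    return history[slot] == word

def class_feature(history, slot, prefix):
    """Binary question on the first len(prefix) class bits of a head word."""
    return CLASS_BITS.get(history[slot], "").startswith(prefix)

def mixed_feature(history):
    """Example mixed feature: V == 'is' AND N2's class bits start with '10'."""
    return word_feature(history, "V", "is") and class_feature(history, "N2", "10")

h = {"V": "is", "N1": "chief", "P": "of", "N2": "board"}
print(word_feature(h, "P", "of"))     # True
print(class_feature(h, "N2", "10"))   # True: 'board' -> '1000'
print(mixed_feature(h))               # True
```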

Word classes (Brown et al., 1992)

Experimental data size

Performance of ME Model on Test Events

Examples of Features Chosen for Wall St. Journal Data

Average Performance of Human & ME Model on 300 Events of WSJ Data

Human and ME model performance on consensus set for WSJ

Average Performance of Human & ME Model on 200 Events of Computer Manuals Data

Back-off model based approach (Collins and Brooks, 1995)
• NP-attach: (joined ((the board) (as a non executive director)))
• VP-attach: ((joined (the board)) (as a non executive director))
• Correspondingly:
  - NP-attach: 1 joined board as director
  - VP-attach: 0 joined board as director
• Quintuple of (attachment A: 0/1, V, N1, P, N2): 5 random variables

Probabilistic formulation
• Or briefly: if the estimated probability of noun attachment given the four head words is at least 0.5, then the attachment is to the noun, else to the verb (see the reconstruction below)

Maximum Likelihood estimate
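The formulas on the "Probabilistic formulation" and "Maximum Likelihood estimate" slides were images. A hedged reconstruction in the usual Collins and Brooks (1995) notation, where f(.) counts occurrences over the training quintuples:

```latex
% Decision rule and maximum-likelihood estimate (reconstructed, not copied).
\[
  \text{attach to the noun} \iff \hat{p}\,(1 \mid v, n_1, p, n_2) \ge 0.5,
  \quad \text{otherwise to the verb}
\]
\[
  \hat{p}_{\mathrm{MLE}}(1 \mid v, n_1, p, n_2)
  = \frac{f(1, v, n_1, p, n_2)}{f(v, n_1, p, n_2)}
\]
```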

The Back-off estimate
• Inspired by speech recognition: prediction of the Nth word from the previous (N-1) words
• Data sparsity problem: f(w1, w2, w3, …, wn) will frequently be 0 for large values of n

Back-off estimate (contd.)
• The cut-off frequencies (c1, c2, …) are thresholds determining whether to back off or not at each level: counts lower than ci at stage i are deemed too low to give an accurate estimate, so in this case backing off continues.

Back-off for PP attachment
• Note: the back-off tuples always retain the preposition

The backoff algorithm
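The algorithm itself was shown as a figure. Below is a compact sketch under simplifying assumptions (counts pooled per back-off level, no cut-off thresholds beyond non-zero counts, noun-attachment default); the table layout and thresholds in the actual paper differ, so treat this as illustrative only:

```python
from collections import Counter

noun_counts = {k: Counter() for k in (4, 3, 2, 1)}   # counts with attachment = 1
all_counts  = {k: Counter() for k in (4, 3, 2, 1)}   # counts regardless of attachment

def backoff_tuples(v, n1, p, n2):
    """Head-word tuples at each level; the preposition is always retained."""
    return {
        4: [(v, n1, p, n2)],
        3: [(v, n1, p), (v, p, n2), (n1, p, n2)],
        2: [(v, p), (n1, p), (p, n2)],
        1: [(p,)],
    }

def add_example(a, v, n1, p, n2):
    """Update the tables with one training quintuple (a = 1 for noun attach)."""
    for level, keys in backoff_tuples(v, n1, p, n2).items():
        for key in keys:
            all_counts[level][key] += 1
            noun_counts[level][key] += a

def p_noun_attach(v, n1, p, n2):
    """Back off from the full 4-tuple down to the preposition alone."""
    tuples = backoff_tuples(v, n1, p, n2)
    for level in (4, 3, 2, 1):
        total = sum(all_counts[level][key] for key in tuples[level])
        if total > 0:
            return sum(noun_counts[level][key] for key in tuples[level]) / total
    return 1.0   # default: noun attachment (the more frequent class overall)

add_example(1, "joined", "board", "as", "director")
add_example(0, "saw", "boy", "with", "telescope")
print(p_noun_attach("joined", "board", "as", "director"))   # 1.0 -> attach to noun
print(p_noun_attach("ate", "pizza", "with", "fork"))        # 0.0, backed off to ('with',)
```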

Lower and upper bounds on performance
• Lower bound: most frequent attachment
• Upper bound: human experts looking at the 4 head words only

Results

Comparison with other systems
• MaxEnt: Ratnaparkhi et al.
• Transformation Learning: Brill et al.

Flexible Unsupervised PP Attachment using WSD and Data Sparsity Reduction (Medimi Srinivas and Pushpak Bhattacharyya, IJCAI 2007)
• Unsupervised approach (somewhat similar to Ratnaparkhi 1998): the training data is extracted from raw text
• The unambiguous training data of the form V-P-N and N1-P-N2 TEACH the system how to resolve PP-attachment in ambiguous test data V-N1-P-N2
• Refinement of extracted training data, and use of N2 in the PP-attachment resolution process

Flexible Unsupervised PP Attachment using WSD and Data Sparsity Reduction (contd.)
• PP-attachment is determined by the semantic property of lexical items in the context of the preposition, using WordNet
• An iterative graph-based unsupervised approach is used for Word Sense Disambiguation (similar to Mihalcea 2005)
• Use of a Data Sparsity Reduction (DSR) process which uses lemmatization, synset replacement and a form of inferencing; the DSR process (DSRP) uses WordNet
• Flexible use of the WSD and DSR processes for PP-attachment

Graph-based disambiguation: PageRank-based algorithm (Mihalcea 2005)

Experimental setup
• Training data:
  - Brown corpus (raw text). Corpus size is 6 MB, consisting of 51763 sentences, nearly 1,027,000 words.
  - Most frequent prepositions in the syntactic context N1-P-N2: of, in, for, to, with, on, at, from, by
  - Most frequent prepositions in the syntactic context V-P-N: in, to, by, with, on, for, from, at, of
  - Extracted unambiguous tuples: N1-P-N2: 54030 and V-P-N: 22362
• Test data:
  - Penn Treebank Wall Street Journal (WSJ) data extracted by Ratnaparkhi
  - It consists of V-N1-P-N2 tuples: 20801 (training), 4039 (development) and 3097 (test)

Experimental setup (contd.)
• Baseline: the unsupervised approach by Ratnaparkhi, 1998 (Base-RP)
• Preprocessing:
  - Upper case to lower case
  - Any four-digit number less than 2100 is treated as a year
  - Any other number or % sign is converted to "num"
• Experiments are performed using DSRP: with different stages of DSRP
• Experiments are performed using GuWSD and DSRP: with different senses

The process of extracting training data: Data Sparsity Reduction (tool/process and output at each step; a toy sketch of the extraction step follows)
• Raw Text: "The professional conduct of the doctors is guided by Indian Medical Association."
• POS Tagger: The_DT professional_JJ conduct_NN of_IN the_DT doctors_NNS is_VBZ guided_VBN by_IN Indian_NNP Medical_NNP Association_NNP ._.
• Chunker: [The_DT professional_JJ conduct_NN] of_IN [the_DT doctors_NNS] (is_VBZ guided_VBN) by_IN [Indian_NNP Medical_NNP Association_NNP]
• Extraction Heuristics: after replacing each chunk by its head word, this results in "conduct_NN of_IN doctors_NNS guided_VBN by_IN Association_NNP", i.e. N1PN2: "conduct of doctors" and VPN: "guided by Association"
• Morphing: N1PN2: "conduct of doctor" and VPN: "guide by association"
• DSRP (Synset Replacement): N1PN2: {conduct, behavior} of {doctor, physician} can result in 4 combinations with the same sense; similarly for VPN: {guide, direct} by {association}, which can result in 2 combinations with the same sense
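A toy rendering of the extraction-heuristics step on the chunked sentence above; the chunk notation ([...] for noun chunks, (...) for verb chunks) follows the slide, while everything else (the regex, taking the last word of a chunk as its head) is a simplifying assumption:

```python
import re

chunked = ("[The_DT professional_JJ conduct_NN ] of_IN [the_DT doctors_NNS ] "
           "(is_VBZ guided_VBN) by_IN [Indian_NNP Medical_NNP Association_NNP]")

def head_sequence(text):
    """Replace each chunk by its last word (the head); keep prepositions."""
    seq = []
    for chunk, prep in re.findall(r"[\[\(]([^\]\)]+)[\]\)]|(\w+)_IN", text):
        if chunk:
            word, tag = chunk.split()[-1].rsplit("_", 1)
            seq.append((word, tag))
        else:
            seq.append((prep, "IN"))
    return seq

def extract_tuples(seq):
    """Read off unambiguous N1-P-N2 and V-P-N patterns over adjacent heads."""
    tuples = {"N1PN2": [], "VPN": []}
    for (w1, t1), (w2, t2), (w3, t3) in zip(seq, seq[1:], seq[2:]):
        if t1.startswith("NN") and t2 == "IN" and t3.startswith("NN"):
            tuples["N1PN2"].append((w1, w2, w3))
        if t1.startswith("VB") and t2 == "IN" and t3.startswith("NN"):
            tuples["VPN"].append((w1, w2, w3))
    return tuples

heads = head_sequence(chunked)
print(extract_tuples(heads))
# {'N1PN2': [('conduct', 'of', 'doctors')], 'VPN': [('guided', 'by', 'Association')]}
```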

Data Sparsity Reduction: Inferencing
• If V1-P-N1 and V2-P-N1 exist, as also do V1-P-N2 and V2-P-N2, then if V3-P-Ni exists (i = 1, 2), we can infer the existence of V3-P-Nj (i ≠ j) with the frequency count of V3-P-Ni, which can be added to the corpus.

Example of DSR by inferencing
• V1-P-N1: "play in garden" and V2-P-N1: "sit in garden"
• V1-P-N2: "play in house" and V2-P-N2: "sit in house"
• V3-P-N2: "jump in house" exists
• Infer the existence of V3-P-N1: "jump in garden" (a small sketch follows)
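A minimal sketch of this inference rule over V-P-N counts (toy counts; requiring two shared verbs and pooling the evidence this way are assumptions made for illustration):

```python
from collections import Counter
from itertools import combinations

# If two nouns share at least two verbs under the same preposition, a third
# verb seen with one of the nouns is inferred for the other noun as well,
# inheriting its frequency count.
vpn = Counter({
    ("play", "in", "garden"): 3, ("sit", "in", "garden"): 2,
    ("play", "in", "house"): 4,  ("sit", "in", "house"): 1,
    ("jump", "in", "house"): 2,
})

def infer(vpn_counts):
    inferred = Counter()
    nouns = {n for (_, _, n) in vpn_counts}
    preps = {p for (_, p, _) in vpn_counts}
    for p in preps:
        for n1, n2 in combinations(sorted(nouns), 2):
            verbs_n1 = {v for (v, pp, n) in vpn_counts if pp == p and n == n1}
            verbs_n2 = {v for (v, pp, n) in vpn_counts if pp == p and n == n2}
            if len(verbs_n1 & verbs_n2) >= 2:          # shared context established
                for v in verbs_n1 ^ verbs_n2:          # verbs seen with only one noun
                    src, dst = (n1, n2) if v in verbs_n1 else (n2, n1)
                    inferred[(v, p, dst)] += vpn_counts[(v, p, src)]
    return inferred

print(infer(vpn))   # Counter({('jump', 'in', 'garden'): 2})
```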

Results

Effect of various processes on the FlexPPAttach algorithm

Precision vs. various processes

Is NLP Really Needed?

Post-1
• POST----5 TITLE: "Wants to invest in IPO? Think again" | Here's a sobering thought for those who believe in investing in IPOs. Listing gains – the return on the IPO scrip at the close of listing day over the allotment price – have been falling substantially in the past two years. Average listing gains have fallen from 38% in 2005 to as low as 2% in the first half of 2007. Of the 159 book-built initial public offerings (IPOs) in India between 2000 and 2007, two-thirds saw listing gains. However, these gains have eroded sharply in recent years. Experts say this trend can be attributed to the aggressive pricing strategy that investment bankers adopt before an IPO. "While the drop in average listing gains is not a good sign, it could be due to the fact that IPO issue managers are getting aggressive with pricing of the issues," says Anand Rathi, chief economist, Sujan Hajra. While the listing gain was 38% in 2005 over 34 issues, it fell to 30% in 2006 over 61 issues and to 2% in 2007 till mid-April over 34 issues. The overall listing gain for 159 issues listed since 2000 has been 23%, according to an analysis by Anand Rathi Securities. Aggressive pricing means the scrip has often been priced at the high end of the pricing range, which would restrict the upward movement of the stock, leading to reduced listing gains for the investor. It also tends to suggest investors should not indiscriminately pump in money into IPOs. But some market experts point out that India fares better than other countries. "Internationally, there have been periods of negative returns and low positive returns in India should not be considered a bad thing."

Post-2
• POST----7 TITLE: "[IIM-Jobs] ***** Bank: International Projects Group Manager" | Please send your CV & cover letter to anup.abraham@*****bank.com. ***** Bank, through its International Banking Group (IBG), is expanding beyond the Indian market with an intent to become a significant player in the global marketplace. The exciting growth in the overseas markets is driven not only by India linked opportunities, but also by opportunities of impact that we see as a local player in these overseas markets and/or as a bank with global footprint. IBG comprises of Retail banking, Corporate banking & Treasury in 17 overseas markets we are present in. Technology is seen as key part of the business strategy, and critical to business innovation & capability scale up. The International Projects Group in IBG takes ownership of defining & delivering business critical IT projects, and directly impact business growth. Role: Manager – International Projects Group. Purpose of the role: Define IT initiatives and manage IT projects to achieve business goals. The project domain will be retail, corporate & treasury. The incumbent will work with teams across functions (including internal technology teams & IT vendors for development/implementation) and locations to deliver significant & measurable impact to the business. Location: Mumbai (short travel to overseas locations may be needed). Key Deliverables: Conceptualize IT initiatives, define business requirements

Sentiment Classification
• Positive, negative, neutral – 3-class
• Sports, economics, literature – multi-class
• Create a representation for the document
• Classify the representation
• The most popular way of representing a document is the feature vector (indicator sequence).

Established Techniques
• Naïve Bayes Classifier (NBC)
• Support Vector Machines (SVM)
• Neural Networks
• K-nearest neighbour classifier
• Latent Semantic Indexing
• Decision Tree ID3
• Concept-based indexing

Successful Approaches
The following are successful approaches as reported in the literature:
• NBC – simple to understand and implement
• SVM – complex, requires foundations of perceptrons

Mathematical Setting
• We have a training set: A: Positive Sentiment Docs, B: Negative Sentiment Docs
• Indicator/feature vectors to be formed
• Let the classes of positive and negative documents be C+ and C-, respectively.
• Given a new document D, label it positive if P(C+|D) > P(C-|D)

Prior Probability

  Document   Vector   Classification
  D1         V1       +
  D2         V2       -
  D3         V3       +
  ...        ...      ...
  D4000      V4000    -

• Let T = total no. of documents, and let |+| = M, so |-| = T - M
• P(D being positive) = M/T
• Prior probability is calculated without considering any features of the new document.

Apply Bayes' Theorem
Steps followed for the NBC algorithm:
• Calculate the prior probability of the classes: P(C+) and P(C-)
• Calculate the feature probabilities of the new document: P(D|C+) and P(D|C-)
• The probability of a document D belonging to a class C can be calculated by Bayes' Theorem as follows:
  P(C|D) = P(C) * P(D|C) / P(D)
• The document belongs to C+ if P(C+) * P(D|C+) > P(C-) * P(D|C-)

Calculating P(D|C+)
• P(D|C+) is the probability of document D given class C+. This is calculated as follows:
• Identify a set of features/indicators to evaluate a document and generate a feature vector VD = (x1, x2, ..., xn)
• Hence, P(D|C+) = P(VD|C+) = P((x1, x2, ..., xn) | C+) = |(x1, x2, ..., xn), C+| / |C+|
• Based on the assumption that all features are Independently Identically Distributed (IID):
  P((x1, x2, ..., xn) | C+) = P(x1|C+) * P(x2|C+) * P(x3|C+) * ... * P(xn|C+) = ∏ i=1..n P(xi|C+)
• P(xi|C+) can now be calculated as |xi, C+| / |C+| (a sketch follows)
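A minimal end-to-end sketch of the NBC recipe on these slides, with bag-of-words features and log probabilities; the toy data and the add-one smoothing are extra assumptions (not on the slides) to keep the example self-contained and to stop unseen words from zeroing the product:

```python
import math
from collections import Counter

train = [
    ("the movie was great and i loved it", "+"),
    ("wonderful acting and a great story", "+"),
    ("the movie was terrible and boring",  "-"),
    ("i hated the boring story",           "-"),
]

class_docs = Counter(label for _, label in train)          # prior counts
word_counts = {"+": Counter(), "-": Counter()}             # feature counts
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counts in word_counts.values() for w in counts}

def log_posterior(doc, label):
    """log P(C) + sum_i log P(x_i | C), with add-one smoothing."""
    logp = math.log(class_docs[label] / sum(class_docs.values()))
    total = sum(word_counts[label].values())
    for w in doc.split():
        logp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return logp

def classify(doc):
    return max(("+", "-"), key=lambda c: log_posterior(doc, c))

print(classify("a great story"))        # '+'
print(classify("boring and terrible"))  # '-'
```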

Baseline Accuracy
• Just with tokens as features, 80% accuracy
• 20% probability of a document being misclassified
• On large sets this is significant

To improve accuracy…
• Clean corpora
• POS tag
• Concentrate on critical POS tags (e.g. adjective)
• Remove 'objective' sentences ('of' ones)
• Do aggregation
• Use minimal to sophisticated NLP