
0ac1767825bf1c22ac3e7edcbffa3c97.ppt
- Slide count: 51
Online Learning by Projecting: From Theory to Large Scale Web-spam Filtering. Yoram Singer. Based on joint work with: Koby Crammer (UPenn), Ofer Dekel (Google/HUJI), Vineet Gupta (Google), Joseph Keshet (HUJI), Andrew Ng (Stanford), Shai Shalev-Shwartz (HUJI). UT Austin AIML Seminar, Jan. 27, 2005
Online Binary Classification
• No animal eats bees: True
• Pearls melt in vinegar: False
• Dr. Seuss finished Dartmouth: True
• There are weapons of mass destruction in Iraq
Binary Classification • Instances (documents, signals): • Labels (true/false, good/bad): • Classification and Prediction: • Mistakes and losses:
Online Binary Classification
• Initialize your classifier
• For t = 1, 2, 3, …, T, …
  • Receive an instance
  • Predict label
  • Receive true label
  • Update classifier (suffer "loss"/error)
✓ Goal: suffer small losses while learning
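The protocol above can be sketched directly in a few lines. `Perceptron` here is only a placeholder classifier so the loop has something to drive (any object with `predict`/`update` methods fits); it is an illustrative helper, not part of the talk.

```python
class Perceptron:
    """Placeholder classifier: any predict/update pair fits the online loop."""
    def __init__(self, dim):
        self.w = [0.0] * dim

    def predict(self, x):
        s = sum(wi * xi for wi, xi in zip(self.w, x))
        return 1 if s >= 0 else -1

    def update(self, x, y):
        # Perceptron rule: move toward the example only on a mistake.
        if self.predict(x) != y:
            self.w = [wi + y * xi for wi, xi in zip(self.w, x)]

def online_loop(clf, stream):
    """Run the online protocol; return the cumulative number of mistakes."""
    mistakes = 0
    for x, y in stream:              # receive an instance
        y_hat = clf.predict(x)       # predict its label
        if y_hat != y:               # receive the true label, suffer loss
            mistakes += 1
        clf.update(x, y)             # update the classifier
    return mistakes
```

The goal stated on the slide is exactly that `mistakes` grows slowly as the stream gets longer.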
Why Online?
• Adaptive
• Simple to implement
• Fast, small memory footprint
• Can be converted to batch learning (O2B)
• Formal guarantees
• But: might not be as effective as a well-designed batch learning algorithm
Linear Classifiers & Margins
• The prediction is formed as follows:
• The margin of an example w.r.t. the classifier (Positive Margin / Negative Margin)
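In code, the margin of an example (x, y) with respect to a linear classifier w is the signed quantity below (a minimal sketch; `margin` is an illustrative helper name):

```python
def margin(w, x, y):
    """Signed margin y * (w . x): positive iff the prediction agrees with y."""
    return y * sum(wi * xi for wi, xi in zip(w, x))
```

A positive margin means the example is classified correctly, and its magnitude measures the classifier's confidence.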
Separability Assumption
Classifier Update - Passive Mode
Prediction & Margin Errors
Hinge Loss
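The hinge loss used throughout the talk penalizes any example whose margin falls below 1, linearly; a minimal sketch:

```python
def hinge_loss(w, x, y):
    """Hinge loss: zero when the margin is at least 1, 1 - margin otherwise."""
    m = y * sum(wi * xi for wi, xi in zip(w, x))
    return max(0.0, 1.0 - m)
```

Note the loss can be positive even when the prediction is correct (a margin error without a prediction mistake), which is the distinction drawn on the previous slide.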
Version Space
• In case of a prediction mistake, the updated classifier must reside in the version space (the set of classifiers consistent with the example)
Aggressive Mode
• On a mistake, the classifier is projected onto the feasible (dual) space
Passive-Aggressive Update
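The projection named above has a well-known closed form for binary classification (Crammer et al.): do nothing when the hinge loss is zero, otherwise step to the boundary of the feasible half-space. A minimal sketch, with `pa_update` as an illustrative name:

```python
def pa_update(w, x, y):
    """One Passive-Aggressive step: project w onto {v : y * (v . x) >= 1}."""
    m = y * sum(wi * xi for wi, xi in zip(w, x))
    loss = max(0.0, 1.0 - m)
    if loss == 0.0:
        return w                                 # passive: constraint holds
    tau = loss / sum(xi * xi for xi in x)        # aggressive: exact projection
    return [wi + tau * y * xi for wi, xi in zip(w, x)]
```

After an aggressive step the new vector attains margin exactly 1 on the example, which is what "projection onto the set of consistent hypotheses" means here.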
Three Decision Problems: A Unified View Classification Regression Uniclass
The Generalized PA Algorithm
• Each example induces a set of consistent hypotheses (half-space, hyper-slab, ball)
• The new vector is set to be the projection of the current vector onto the set of consistent hypotheses
• Classification / Regression / Uniclass
Loss Bound (Classification)
• If there exists such that
• Then where
✓ PA makes a bounded number of mistakes
Proof Sketch • Define: • Upper bound: • Lower bound: Lipschitz Condition
Proof Sketch (Cont.)
• Combining upper and lower bounds
• L = B for classification and regression
• L = 1 for uniclass
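The formulas on the two bound slides did not survive extraction. As a hedged reconstruction, the standard realizable-case PA mistake bound and its telescoping proof sketch, in the usual notation of the PA papers, read:

```latex
% Realizable case: suppose there exists $u$ with $y_t (u \cdot x_t) \ge 1$
% for all $t$, and $\|x_t\| \le R$. Then the number of prediction mistakes
% $M$ made by PA satisfies
M \;\le\; R^2 \, \|u\|^2 .
% Proof sketch: define $\Delta_t = \|w_t - u\|^2 - \|w_{t+1} - u\|^2$.
% Upper bound: the sum telescopes, $\sum_t \Delta_t \le \|w_1 - u\|^2 = \|u\|^2$.
% Lower bound: on a mistake round, $\Delta_t \ge \ell_t^2 / \|x_t\|^2 \ge 1/R^2$.
% Combining the two bounds gives $M / R^2 \le \|u\|^2$.
```

This reconstruction is consistent with the upper/lower-bound structure of the proof sketch above, but the exact constants on the original slides may differ.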
Unrealizable Case
Unrealizable Case (Classification) PA-II
(Not-really) Aggressive Updates
Mistake Bound for PA-I • Loss suffered by PA-I on round t: • Loss suffered by any fixed vector: • #Mistakes made by PA-I is at most:
Loss Bound for PA-II • Loss suffered by PA-II on round t: • Loss suffered by any fixed vector: • Cumulative loss ( ) of PA-II is at most:
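The step sizes distinguishing the three variants in the unrealizable case have simple closed forms (following the "Online Passive-Aggressive Algorithms" paper): PA-I caps the step at an aggressiveness parameter C, while PA-II smooths it with a 1/(2C) term. A hedged sketch; the function names are my own:

```python
def tau_pa(loss, x_sq):
    """PA: full step to the margin boundary."""
    return loss / x_sq

def tau_pa1(loss, x_sq, C):
    """PA-I: the PA step, capped at C."""
    return min(C, loss / x_sq)

def tau_pa2(loss, x_sq, C):
    """PA-II: the step is shrunk smoothly as C decreases."""
    return loss / (x_sq + 1.0 / (2.0 * C))
```

In all three variants the update itself is still w + tau * y * x; only tau changes, which is why the slides call PA-I/PA-II "(not-really) aggressive".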
Beyond Binary Decision Problems • Applications and generalizations of PA: • Multiclass categorization • Topic ranking and filtering • Hierarchical classification • Sequence learning (Markov Networks) • Segmentation of sequences • Learning of pseudo-metrics
Movie Recommendation System
Recommending by Projecting
• Project
• Apply thresholds (rank levels 1, 2, 3, 4)
PRank Update (figure slides): the weight vector w and a set of thresholds partition the score line into 5 rank levels; the correct rank defines an interval of admissible scores for the projection w · x; on a rank mistake, w and every violated threshold are moved toward that interval.
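The update pictured on the preceding slides can be sketched as follows, following "PRanking with Ranking" (Crammer & Singer). The function names and the 0-indexed rank convention are my own, not from the talk:

```python
def prank_predict(w, b, x):
    """Rank = number of thresholds in b that the score w . x clears."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    rank = 0
    for t in b:
        if s < t:
            break
        rank += 1
    return rank                      # a rank in {0, ..., len(b)}

def prank_update(w, b, x, y):
    """PRank step: move w and every violated threshold on a rank mistake."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    # Desired: s > b[r] for every r < y, and s < b[r] for every r >= y.
    taus = []
    for r, t in enumerate(b):
        yr = 1 if r < y else -1
        taus.append(yr if yr * (s - t) <= 0 else 0)
    w = [wi + sum(taus) * xi for wi, xi in zip(w, x)]
    b = [t - tau for t, tau in zip(b, taus)]
    return w, b
```

A key property (shown in the PRank paper) is that the thresholds stay correctly ordered throughout the run.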
EachMovie Database
• 74,424 registered viewers
• 1,648 listed movies
• Viewers rated subsets of movies
• Demo: online movie recommendation
PA@Google: Web Spam Filtering [With Vineet Gupta]
• Query: "hotels palo alto"
• Spammers:
  • Cardinal Hotel - Palo Alto - Reviews of Cardinal Hotel... Palo Alto, California 94301 United States. Deals on Palo Alto hotels... More Palo Alto hotels... Research other Palo Alto hotels. Is this hotel not right for you? ... www.tripadvisor.com/Hotel_Review-g32849-d79154-…
  • Palo Alto Hotels - Cheap Hotels - Palo Alto Hotels... Book Palo Alto Hotels Online or Call Toll Free 1-800-359-7234... Keywords: Palo Alto. Hotel Discounts - Cheap Hotels in Palo Alto. Hotels In Palo Alto... www.hotelsbycity.com/california/hotels-palo-alto-…
Enhancements for Web Spam
• Various "signals" features
• Design of special kernels
• Multi-tier feedback (label):
  • +2 navigational site (e.g. www.stanford.edu)
  • +1 on topic
  • -1 off topic
  • -2 nuke the spammer
• Loss is sensitive to site label
• Algorithmic modifications due to scale:
  • Online-to-batch conversions
  • Re-projections of old examples
• Part of a recent revision to search (Google 3)
Web Spam Filtering - Results
• Specific queries and domains are heavily spammed:
  • Over 50% of the returned URLs for travel search
  • Certain countries are more spam prone
• Training set size: over half a million domains
• Training time: 2 hours to 5 days
• Test set size: the entire web crawled by Google (over 100 million domains)
• A few hours to filter all domains on hundreds of CPUs
• Current reduction achieved (estimate): 50% of spammers
Summary
• Unified online framework for decision problems
• Simple and efficient algorithms ("kernelizable")
• Analyses for realizable and unrealizable cases
• Numerous applications
• Batch learning conversions & generalization
• Generalizations using general Bregman projections
• Approximate projections for large scale problems
• Applications of PA to other decision problems
Related Work • Projections Onto Convex Sets (POCS): • Y. Censor & S. A. Zenios, “Parallel Optimization” (Hildreth’s projection algorithm), Oxford UP, 1997 • H. H. Bauschke & J. M. Borwein, “On Projection Algorithms for Solving Convex Feasibility Problems”, SIAM Review, 1996 • Online Learning: • M. Herbster, “Learning additive models online with fast evaluating kernels”, COLT 2001 • J. Kivinen, A. Smola, and R. C. Williamson, “Online learning with kernels”, IEEE Trans. on SP, 2004
Relevant Publications
• Online Passive Aggressive Algorithms, CDSS'03, CSKSS'05
• Family of Additive Online Algorithms for Category Ranking, CS'03
• Ultraconservative Online Algorithms for Multiclass Problems, CS'02, CS'03
• On the Algorithmic Implementation of Multiclass SVM, CS'03
• PRanking with Ranking, CS'01, CS'04
• Large Margin Hierarchical Classification, DKS'04
• Learning to Align Polyphonic Music, SKS'04
• Online and Batch Learning of Pseudo-metrics, SSN'04
• The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees, DSS'04
• A Temporal Kernel-Based Model for Tracking Hand Movements from Neural Activities, SCPVS'04
Hierarchical Classification: Motivation
Phonetic transcription of DECEMBER:
• Gross error: T ix s eh m bcl b er
• Small errors: d AE s eh m bcl b er / d ix s eh NASAL bcl b er
Phonetic Hierarchy
PHONEMES
• Silences
• Sonorants
  • Nasals: n m ng
  • Liquids: l y w r
  • Vowels
    • Front: iy ih ey eh ae
    • Center: aa ao er aw ay
    • Back: oy ow uh uw
• Obstruents
  • Affricates: jh ch
  • Plosives: b g d k p t
  • Fricatives: f v sh s th dh zh z
Common Constructions
• Ignore the hierarchy: solve as a single flat multiclass problem
• A greedy approach: solve a multiclass problem at each node
Hierarchical Classifier
• Assume a tree hierarchy over the labels
• Associate a prototype with each label (the figure shows prototypes W0-W10 arranged in the tree)
• Classification rule:
Hierarchical Classifier (cont.)
• Define:
A Metric Over Labels
• A given hierarchy defines a metric over the set of labels via graph distance
From PA to Hieron
• Replace the simple margin constraint with a tree-based margin constraint (correct label vs. predicted label)
Hieron - Update (figure slides): only the prototypes along the paths to the correct and to the predicted label are modified; prototypes on the shared part of the two paths cancel and stay unchanged.
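The update pictured above can be sketched under the assumptions stated on the earlier slides: each tree node carries a prototype, and a label's score sums the prototypes on its root path. `parent`, `path`, and `hieron_update` are hypothetical names of my own:

```python
def path(parent, v):
    """Set of nodes on the root path of v; parent maps child -> parent."""
    nodes = set()
    while v is not None:
        nodes.add(v)
        v = parent[v]
    return nodes

def hieron_update(W, parent, x, correct, predicted, tau=1.0):
    """Move prototypes only where the correct and predicted paths differ."""
    for v in path(parent, correct) - path(parent, predicted):
        W[v] = [wi + tau * xi for wi, xi in zip(W[v], x)]   # pull toward x
    for v in path(parent, predicted) - path(parent, correct):
        W[v] = [wi - tau * xi for wi, xi in zip(W[v], x)]   # push away from x
```

The number of touched prototypes equals the graph distance between the two labels, which ties the update to the tree metric of the previous slide.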
Sample Run on Synthetic Data
• The hierarchy given to the algorithm
• An edge indicates that prototypes are "close"
Experiments with Hieron
Datasets used:
  Dataset             # train   # test   # labels   depth
  DMOZ (web pages)    8576      4-FCV    316        8
  Speech (phonemes)   80000     20000    40         4
  Synthetic data      12100     6050     121        4
• Compared two models:
  • Hieron with knowledge of the correct hierarchy
  • Hieron without knowledge of the correct hierarchy (flat)
Experimental Results
• Each graph shows the difference between the error histograms of the two models (DMOZ, Phoneme (TIMIT), Synthetic)
• Hieron makes fewer "gross" mistakes
• State-of-the-art results for frame-based phoneme classification