Number of slides: 40
Modeling Condition and Performance of Mining Equipment
Tad S. Golosinski and Hui Hu
Mining Engineering, University of Missouri-Rolla
Condition and Performance Monitoring Systems
• Machine health monitoring
• Payload and productivity
  - Allows for quick diagnostics of problems
  - Provides management with machine and fleet performance data
• Warning system
  - Alerts the operator of problems, reducing the risk of catastrophic failure
CAT's VIMS (Vital Information Management System)
• Collects and processes information on major machine components
  - Engine control
  - Transmission/chassis control
  - Braking control
  - Payload measurement system
• Installed on:
  - Off-highway trucks: 785, 789, 793, 797
  - Hydraulic shovels: 5130, 5230
  - Wheel loaders: 994, 992G (optional)
Other, Similar Systems
• Cummins: CENSE (engine module)
• Euclid-Hitachi: Contronics and Haultronics
• Komatsu: VHMS (Vehicle Health Monitoring System)
• LeTourneau: LINCS (LeTourneau Integrated Network Control System)
Round Mountain Gold Mine Truck Fleet
• 17 CAT 785 trucks (150 t)
• 11 CAT 789B trucks (190 t)
• PSA (Product Support Agreement): the CAT dealer guarantees 88% availability
VIMS in the RMG Mine
• Average availability of 93% over 70,000 operating hours
• VIMS used to help with preventive maintenance
  - Diagnostics after engine failure
  - Haul road condition assessment
  - Other
Source: Holmes Safety Association Bulletin, 1998
CAT MineStar
• CAT MineStar integrates:
  - Machine Tracking System (GPS)
  - Computer Aided Earthmoving System (CAES)
  - Fleet scheduling system (FleetCommander)
  - VIMS
Cummins MiningGateway (system diagram)
• Components: on-board CENSE engine module, RF receiver, base station, modem, MiningGateway.com, Cummins engine database
VIMS Data & Information Flow (diagram)
• VIMS legacy databases at mine sites 1, 2, and 3 feed a data extract, cleanup, and load step into the VIMS data warehouse
• Data mining tools are then applied to the warehouse for information extraction
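The extract-cleanup-load step above is a conventional ETL pipeline. A minimal sketch in Python/pandas follows; the file names (vims_site1.csv, ...), the column names, and the SQLite warehouse table are illustrative assumptions, not details taken from the slides.

    # Minimal ETL sketch for the VIMS data-warehouse flow shown above.
    # File, column, and table names are illustrative only.
    import sqlite3
    import pandas as pd

    def extract(path):
        """Read one mine site's VIMS export (assumed to be a CSV file)."""
        return pd.read_csv(path)

    def cleanup(df):
        """Drop duplicate records and rows missing key fields."""
        df = df.drop_duplicates()
        return df.dropna(subset=["truck_id", "timestamp"])

    def load(df, conn):
        """Append the cleaned records to the warehouse table."""
        df.to_sql("vims_statistics", conn, if_exists="append", index=False)

    conn = sqlite3.connect("vims_warehouse.db")
    for site_file in ["vims_site1.csv", "vims_site2.csv", "vims_site3.csv"]:
        load(cleanup(extract(site_file)), conn)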
Earlier Research: Data Mining of VIMS
• Kaan Ataman tried modeling using:
  - Major factor analysis
  - Linear regression analysis
  - All this on datalogger data
• Edwin Madiba tried modeling using:
  - Data formatting and transferring
  - VIMS events association
  - All this on datalogger and event data
Research Objectives
• Build the VIMS data warehouse to facilitate data mining
• Develop the data mining application for knowledge discovery
• Build predictive models of equipment condition and performance
Interactions (diagram)
• The data mining process cycles through data acquisition, data preparation, data mining, and result interpretation
VIMS Features (diagram)
• Sensors and controls feed the on-board monitor, which stores: event list, event recorder, data logger, trends, cumulative data, histograms, payloads
• Data reach maintenance and management through operator download or the VIMS wireless link
Data Source (figure)
VIMS Statistical Data Warehouse
• Statistical data computed over 1-3 minute intervals:
  - Minimum
  - Maximum
  - Average
  - Data range
  - Variance
  - Regression intercept
  - Regression slope
  - Regression SYY
  - Standard deviation
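A sketch of how these interval statistics could be computed from raw sensor samples in Python; the column names (timestamp, engine_speed) and the toy data are assumptions, since the slides do not give the raw VIMS field layout.

    # Per-interval statistics matching the list above, computed from raw
    # sensor samples grouped into one-minute windows. Names are assumed.
    import numpy as np
    import pandas as pd

    def window_stats(y):
        """Interval statistics for one window of raw samples."""
        t = np.arange(len(y))                     # sample index as time axis
        slope, intercept = np.polyfit(t, y, 1)    # least-squares line fit
        return {
            "minimum": y.min(),
            "maximum": y.max(),
            "average": y.mean(),
            "data_range": y.max() - y.min(),
            "variance": y.var(ddof=1),
            "std_dev": y.std(ddof=1),
            "regr_intercept": intercept,
            "regr_slope": slope,
            "regr_SYY": ((y - y.mean()) ** 2).sum(),  # total sum of squares of y
        }

    # Toy example: two minutes of one-second engine speed samples
    raw = pd.DataFrame({
        "timestamp": pd.date_range("2001-01-01", periods=120, freq="s"),
        "engine_speed": np.random.normal(1800, 50, 120),
    })
    per_minute = raw.groupby(pd.Grouper(key="timestamp", freq="1min"))["engine_speed"]
    print(per_minute.apply(lambda s: pd.Series(window_stats(s.to_numpy()))).unstack())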
VIMS Data Description
• Six CAT 789B trucks
• 300 MB of VIMS data
• 79 "High Engine Speed" events
• One-minute data statistics
SPRINT: A Decision Tree Algorithm (IBM Almaden Research Center)
• Gini index for the split point
• Strictly binary tree
• Built-in v-fold cross-validation
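SPRINT itself is IBM's implementation; the sketch below only illustrates the Gini-index criterion it uses to pick a binary split point, on toy data with assumed names (speeds, events).

    # Illustration of the Gini-index split criterion used by SPRINT.
    # Not IBM's implementation; just the criterion on toy data.
    import numpy as np

    def gini(labels):
        """Gini impurity of a set of class labels."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - np.sum(p ** 2)

    def best_split(values, labels):
        """Threshold on one attribute giving the lowest weighted Gini."""
        order = np.argsort(values)
        values, labels = values[order], labels[order]
        best_threshold, best_gini = None, float("inf")
        for i in range(1, len(values)):
            if values[i] == values[i - 1]:
                continue                          # no split between equal values
            left, right = labels[:i], labels[i:]
            g = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if g < best_gini:
                best_threshold, best_gini = (values[i - 1] + values[i]) / 2.0, g
        return best_threshold, best_gini

    speeds = np.array([1650, 1700, 1720, 1900, 1950, 2000])
    events = np.array(["Other", "Other", "Other", "Eng1", "Eng1", "Eng2"])
    print(best_split(speeds, events))             # (threshold, weighted Gini)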
VIMS Event Prediction (diagram)
• One-minute VIMS snapshots are labeled by their position relative to a "High Engine Speed" event: snapshots taken during normal engine speed are labeled Other (0), and the six snapshots preceding the event are labeled 1 through 6 (Eng1-Eng6)
• Example label sequence along the timeline: 0 0 0 1 2 3 4 5 6 [High Engine Speed event] 0 0 0
• Each snapshot carries an Event_ID (e.g. 767_1, 767_2) and receives a predicted label (Other, Eng_1, Eng_2, ...)
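A sketch of how such countdown labels might be assigned to one-minute snapshots in Python; the column names, the label direction (Eng1 = the minute nearest the event), and the example data are assumptions, since the slide only sketches the scheme.

    # Label the six one-minute snapshots before each "High Engine Speed"
    # event as Eng1..Eng6; everything else is "Other". Column names and
    # the countdown direction (Eng1 = minute nearest the event) are assumed.
    import pandas as pd

    def label_snapshots(df, event_times, horizon=6):
        df = df.sort_values("timestamp").reset_index(drop=True)
        df["label"] = "Other"
        for event_time in event_times:
            for k in range(1, horizon + 1):
                start = event_time - pd.Timedelta(minutes=k)
                in_window = (df["timestamp"] >= start) & (
                    df["timestamp"] < start + pd.Timedelta(minutes=1)
                )
                df.loc[in_window, "label"] = f"Eng{k}"
        return df

    snapshots = pd.DataFrame(
        {"timestamp": pd.date_range("2001-01-01 00:00", periods=12, freq="1min")}
    )
    events = [pd.Timestamp("2001-01-01 00:09")]
    print(label_snapshots(snapshots, events))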
The "One-Minute" Decision Tree
Decision Tree: Training on One-Minute Data
Total Errors = 120 (6.734%)

Actual \ Predicted |  Other |  Eng1 |  Eng3 |  Eng2 |  Eng4 |  Eng6 |  Eng5 |  Total
Other              |   1331 |    18 |     9 |     5 |    16 |     6 |     1 |   1386
Eng1               |      0 |    62 |     1 |     3 |     0 |     0 |     0 |     66
Eng3               |      0 |    11 |    51 |     2 |     2 |     1 |     0 |     67
Eng2               |      0 |    12 |     8 |    38 |     7 |     0 |     0 |     65
Eng4               |      0 |     3 |     7 |     2 |    55 |     0 |     1 |     68
Eng6               |      0 |     0 |     0 |     1 |     0 |    61 |     4 |     66
Eng5               |      0 |     0 |     0 |     0 |     0 |     0 |    64 |     64
Column total       |   1331 |   106 |    76 |    51 |    80 |    68 |    70 |   1782
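The "Total Errors" figure is simply the off-diagonal mass of the confusion matrix. A small check in Python, using the training matrix above (the class order is Other, Eng1, Eng3, Eng2, Eng4, Eng6, Eng5, as on the slide):

    # Error rate from a confusion matrix: off-diagonal count / total count.
    # Rows and columns follow the slide's class order.
    import numpy as np

    cm = np.array([
        [1331, 18,  9,  5, 16,  6,  1],   # Other
        [   0, 62,  1,  3,  0,  0,  0],   # Eng1
        [   0, 11, 51,  2,  2,  1,  0],   # Eng3
        [   0, 12,  8, 38,  7,  0,  0],   # Eng2
        [   0,  3,  7,  2, 55,  0,  1],   # Eng4
        [   0,  0,  0,  1,  0, 61,  4],   # Eng6
        [   0,  0,  0,  0,  0,  0, 64],   # Eng5
    ])
    errors = cm.sum() - np.trace(cm)
    print(errors, errors / cm.sum())      # 120 errors, about 6.734%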
Decision Tree: Test #1 on One-Minute Data
Total Errors = 24 (24%)

Predicted class: Other | Eng1 | Eng3 | Eng2 | Eng4 | Eng6 | Eng5
Other | 59 | 3 | 0 | 2 | 3 | 0 | 1 | total = 68
Eng1 | 4 | 1 | 0 | 0 | 0 | total = 6
Eng3 | 0 | 3 | 1 | 0 | 0 | total = 5
Eng2 | 1 | 1 | 0 | 0 | 0 | total = 4
Eng4 | 1 | 1 | 0 | 0 | total = 4
Eng6 | 0 | 0 | 0 | 7 | 0 | total = 7
Eng5 | 0 | 0 | 0 | 6 | total = 6
Column totals: 65 | 9 | 2 | 5 | 5 | 7 | 7 | total = 100
Decision Tree: Test #2 on One-Minute Data
Total Errors = 35 (17.86%)

Actual \ Predicted | Other | Eng1 | Eng3 | Eng2 | Eng4 | Eng6 | Eng5 | Total
Other              |   141 |    9 |    2 |    4 |    4 |    0 |    0 |   160
Eng1               |     2 |    2 |    1 |    1 |    0 |    0 |    0 |     6
Eng3               |     2 |    1 |    2 |    0 |    1 |    0 |    0 |     6
Eng2               |     2 |    1 |    2 |    1 |    0 |    0 |    0 |     6
Eng4               |     1 |    0 |    1 |    1 |    3 |    0 |    0 |     6
Eng6               |     0 |    0 |    0 |    0 |    0 |    6 |    0 |     6
Eng5               |     0 |    0 |    0 |    0 |    0 |    0 |    6 |     6
Column total       |   148 |   13 |    8 |    7 |    8 |    6 |    6 |   196
The "Two-Minute" Decision Tree
Decision Tree: Training on Two-Minute Data Sets
Total Errors = 51 (5.743%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
OTHER              |   657 |    6 |   19 |    3 |   685
ENG1               |     0 |   62 |   10 |    0 |    72
ENG2               |     0 |   13 |   54 |    0 |    67
ENG3               |     0 |    0 |    0 |   64 |    64
Column total       |   657 |   81 |   83 |   67 |   888
Decision Tree: Test #1 on Two-Minute Data
Total Errors = 14 (29.79%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
OTHER              |    28 |    5 |    4 |    1 |    38
ENG1               |     1 |    0 |    0 |    0 |     1
ENG2               |     2 |    1 |    1 |    0 |     4
ENG3               |     0 |    0 |    0 |    4 |     4
Column total       |    31 |    6 |    5 |    5 |    47
Decision Tree: Test #2 on Two-Minute Data
Total Errors = 15 (15.31%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
OTHER              |    71 |    8 |    1 |    0 |    80
ENG1               |     3 |    3 |    0 |    0 |     6
ENG2               |     0 |    3 |    3 |    0 |     6
ENG3               |     0 |    0 |    0 |    6 |     6
Column total       |    74 |   14 |    4 |    6 |    98
The "Three-Minute" Decision Tree
Decision Tree: Training on Three-Minute Data
Total Errors = 28 (4.878%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | Total
OTHER              |   411 |   23 |    4 |   438
ENG1               |     1 |   65 |    0 |    66
ENG2               |     0 |    0 |   70 |    70
Column total       |   412 |   88 |   74 |   574
Decision Tree: Test #1 on Three-Minute Data
Total Errors = 12 (19.05%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | Total
OTHER              |    42 |    9 |    0 |    51
ENG1               |     3 |    5 |    0 |     8
ENG2               |     0 |    0 |    4 |     4
Column total       |    45 |   14 |    4 |    63
Decision Tree: Test #2 on Three-Minute Data
Total Errors = 9 (14.06%)

Actual \ Predicted | OTHER | ENG1 | ENG2 | Total
OTHER              |    47 |    5 |    0 |    52
ENG1               |     4 |    2 |    0 |     6
ENG2               |     0 |    0 |    6 |     6
Column total       |    51 |    7 |    6 |    64
Decision Tree Summary
• The "one-minute" model needs a more complex tree structure
• The "one-minute" model gives low prediction accuracy
• The "three-minute" decision tree model gives reasonable prediction accuracy (based on test #1):
  - Other: 13% error rate
  - Eng1: 50% error rate
  - Eng2: 0% error rate
• Other approach?
Backpropagation: A Neural Network Classification Algorithm
• Characteristic: each output corresponds to a possible classification
• Network layout: input layer, hidden layer, output
• Node detail: inputs x1, x2, x3 with weights w1, w2, w3; the node computes z = Σᵢ wᵢxᵢ and outputs f(z)
• Some choices for f(z):
  - Sigmoid: f(z) = 1 / (1 + e^(-z))
  - Tanh: f(z) = (1 - e^(-2z)) / (1 + e^(-2z))
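A minimal sketch of the single node shown above in Python: the weighted sum z followed by either activation from the slide; the input and weight values are arbitrary examples.

    # One node of a backpropagation network: z = sum_i(w_i * x_i), out = f(z).
    # Both activation choices from the slide are shown.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def tanh_act(z):
        return (1.0 - np.exp(-2 * z)) / (1.0 + np.exp(-2 * z))   # equals np.tanh(z)

    def node_output(x, w, activation=sigmoid):
        z = np.dot(w, x)                  # weighted sum of the inputs
        return activation(z)

    x = np.array([0.5, -1.2, 0.3])        # x1, x2, x3
    w = np.array([0.4,  0.1, -0.7])       # w1, w2, w3
    print(node_output(x, w), node_output(x, w, tanh_act))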
Minimize the Sum-of-Squares (SSQ) Error Function
• E_p = 1/2 Σ_k (t_k - y_k)^2, where y_k (output) is a function of the weights w_jk and t_k is the true value
• In the graph:
  - E_p is the sum-of-squares error
  - ∇E_p is the gradient (direction of maximum function increase)
Source: Freeman & Skapura, Neural Networks, Addison-Wesley, 1992
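A sketch of one gradient-descent step on this error for a single sigmoid output node (the delta rule): for E = 1/2 (t - y)^2 the gradient with respect to weight w_j is -(t - y) y (1 - y) x_j. The learning rate and the training pair are illustrative values.

    # One gradient-descent step minimizing E = 0.5 * (t - y)**2 for a
    # single sigmoid output node (delta rule). Values are illustrative.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, -1.2, 0.3])      # inputs
    w = np.array([0.4,  0.1, -0.7])     # current weights
    t = 1.0                             # target (true) value
    lr = 0.1                            # learning rate

    y = sigmoid(np.dot(w, x))           # forward pass
    grad = -(t - y) * y * (1 - y) * x   # dE/dw for the sigmoid output node
    w = w - lr * grad                   # step against the gradient
    print(y, w)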
Neural Network Modeling Results: "Three-Minute" Training Set (figure)
Neural Network Modeling Results: "Three-Minute" Set, Test #1 and Test #2 (figures)
NN Summary
• Insufficient data for the one-minute and two-minute prediction models
• The three-minute network shows better performance than the decision tree model:
  - Other: 17% error rate
  - Eng1: 28% error rate
  - Eng2: 20% error rate
Conclusions
• A predictive model can be built
• The neural network model is more accurate than the decision tree one (based on all data)
• Overall accuracy is not sufficient for practical applications
• More data is needed to train and test the models
References
• Failure Pattern Recognition of a Mining Truck with a Decision Tree Algorithm. Tad Golosinski and Hui Hu, Mineral Resources Engineering, 2002 (?)
• Intelligent Miner: Data Mining Application for Modeling VIMS Condition Monitoring Data. Tad Golosinski and Hui Hu, ANNIE 2001, St. Louis
• Data Mining VIMS Data for Information on Truck Condition. Tad Golosinski and Hui Hu, APCOM 2001, Beijing, P.R. China


