Data mining offers methodologies and tools for data analysis, discovery of new knowledge, and autonomous process control. This paper introduces basic data mining algorithms. An approach based on rough set theory is used to derive associations between control parameters and product quality in the form of decision rules. The model presented in the paper produces control signatures leading to good-quality products in a metal forming process. The computational results reported in the paper indicate that data mining opens a new avenue for decision-making in the material forming industry.
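The rough-set step described above rests on computing indiscernibility classes over the condition attributes and taking the lower approximation of a decision class; rows in the lower approximation yield certain decision rules. The following is a minimal sketch of that idea, not the paper's implementation: the attribute names, values, and decision table are hypothetical illustrations.

```python
from collections import defaultdict

# Hypothetical decision table: each row maps control parameters
# (temperature, pressure) to an observed product quality (the decision).
rows = [
    {"temperature": "high", "pressure": "low",  "quality": "good"},
    {"temperature": "high", "pressure": "low",  "quality": "good"},
    {"temperature": "low",  "pressure": "low",  "quality": "bad"},
    {"temperature": "high", "pressure": "high", "quality": "bad"},
    {"temperature": "high", "pressure": "high", "quality": "good"},
]

def indiscernibility_classes(rows, attributes):
    """Group row indices that are indistinguishable on the given attributes."""
    classes = defaultdict(list)
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attributes)
        classes[key].append(i)
    return classes

def lower_approximation(rows, attributes, decision, value):
    """Indices of rows that certainly belong to the target decision class:
    every member of their indiscernibility class shares the decision value."""
    lower = set()
    for members in indiscernibility_classes(rows, attributes).values():
        if all(rows[i][decision] == value for i in members):
            lower.update(members)
    return lower

attrs = ["temperature", "pressure"]
certain_good = lower_approximation(rows, attrs, "quality", "good")
# Rows 0 and 1 (high temperature, low pressure) are certainly "good",
# giving the certain rule: temperature = high AND pressure = low -> good.
# Rows 3 and 4 conflict on the decision, so they fall in the boundary region.
```

Each indiscernibility class in the lower approximation translates directly into one certain decision rule, i.e. a control signature whose condition part always co-occurs with good quality in the data.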
