A tighter error bound for decision tree learning using PAC learnability

dc.contributor.author Pichuka, Chaithanya
dc.contributor.author Bapi, Raju S.
dc.contributor.author Bhagvati, Chakravarthy
dc.contributor.author Pujari, Arun K.
dc.contributor.author Deekshatulu, B. L.
dc.date.accessioned 2022-03-27T05:55:05Z
dc.date.available 2022-03-27T05:55:05Z
dc.date.issued 2007-12-01
dc.description.abstract Error bounds for decision trees are generally based on the depth or breadth of the tree. In this paper, we propose a bound on the error rate that depends on both the depth and the breadth of the specific decision tree constructed from the training samples. This bound is derived from a sample complexity estimate based on PAC learnability. The proposed bound is compared with other traditional error bounds on several machine learning benchmark data sets as well as on an image data set used in Content Based Image Retrieval (CBIR). Experimental results demonstrate that the proposed bound gives a tighter estimate of the empirical error.
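The abstract describes an error bound derived from a PAC sample complexity estimate. As a minimal sketch of the general idea (not the paper's actual bound), the standard PAC inequality epsilon <= (ln|H| + ln(1/delta)) / m can be inverted to give an error estimate once the hypothesis space size is bounded; here the depth and breadth of a tree are used as an illustrative proxy for ln|H|, and the node-count approximation, `n_features`, and all parameter values are assumptions for demonstration only.

```python
import math

def pac_error_bound(depth, breadth, n_samples, delta=0.05, n_features=10):
    """Illustrative PAC-style error bound for a decision tree.

    Uses the generic finite-hypothesis-class bound
        epsilon <= (ln|H| + ln(1/delta)) / m,
    with ln|H| roughly approximated (a hypothetical proxy, not the
    paper's derivation) as (depth * breadth) * ln(n_features),
    i.e. each of the ~depth*breadth internal nodes chooses one of
    n_features split attributes.
    """
    ln_hypothesis_space = depth * breadth * math.log(n_features)
    return (ln_hypothesis_space + math.log(1.0 / delta)) / n_samples

# A deeper or broader tree (larger |H|) yields a looser bound,
# while more training samples tighten it.
loose = pac_error_bound(depth=5, breadth=4, n_samples=500)
tight = pac_error_bound(depth=5, breadth=4, n_samples=5000)
```

This illustrates only the qualitative behavior the abstract relies on: the bound shrinks as the sample size grows and grows with tree size.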
dc.identifier.citation IJCAI International Joint Conference on Artificial Intelligence
dc.identifier.issn 10450823
dc.identifier.uri https://dspace.uohyd.ac.in/handle/1/8764
dc.title A tighter error bound for decision tree learning using PAC learnability
dc.type Conference Proceeding. Conference Paper