Figure (uploaded by Su Jia): Cost of Different Algorithms for α = 1.

Source publication
Preprint
A fundamental task in active learning involves performing a sequence of tests to identify an unknown hypothesis that is drawn from a known distribution. This problem, known as optimal decision tree induction, has been widely studied for decades and the asymptotically best-possible approximation algorithm has been devised for it. We study a generali...
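To make the setup concrete, below is a minimal sketch of the classical greedy "splitting" heuristic for identifying a hypothesis drawn from a known prior using binary tests: repeatedly run the test that splits the remaining prior mass most evenly. This is only an illustration of the basic identification problem, not the preprint's algorithm, and the hypothesis/test names and probabilities are made up.

```python
# Greedy "splitting" sketch: tests are given as the set of hypotheses that pass them.
def greedy_identify(prior, tests, truth):
    """prior: dict hypothesis -> probability (sums to 1).
    tests: dict test name -> set of hypotheses that pass the test.
    truth: the hidden true hypothesis, used only to answer tests."""
    alive = set(prior)
    history = []
    while len(alive) > 1:
        # Only consider tests that actually separate the surviving hypotheses.
        useful = [t for t, s in tests.items() if 0 < len(alive & s) < len(alive)]
        if not useful:
            break  # remaining hypotheses are indistinguishable by these tests
        def imbalance(t):
            yes_mass = sum(prior[h] for h in alive & tests[t])
            return abs(2 * yes_mass - sum(prior[h] for h in alive))
        t = min(useful, key=imbalance)  # most balanced split of prior mass
        history.append(t)
        alive = alive & tests[t] if truth in tests[t] else alive - tests[t]
    return alive, history

print(greedy_identify(
    {"h1": 0.5, "h2": 0.3, "h3": 0.2},
    {"t1": {"h1"}, "t2": {"h1", "h2"}},
    truth="h2"))
```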

Context in source publication

Context 1
... Tables 1, 2, and 3 show the unknowns per test; we have also included these parameters, together with their average values, in Table 4. Table 5 summarizes the results on WISER-ORG with the clique and neighborhood stopping criteria. ...

Similar publications

Preprint
Huffman coding is well known to be useful in certain decision problems that involve minimizing the average number of (freely chosen) queries needed to determine an unknown random variable. However, in problems where the queries are more constrained, the original Huffman coding no longer works. In this paper, we propose a general model to describe such probl...
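For reference, here is a minimal sketch of standard (unconstrained) Huffman coding, which builds the query/code tree by repeatedly merging the two least likely items; the symbol probabilities in the example are made up.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> bit string."""
    tiebreak = count()  # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, left = heapq.heappop(heap)   # two least likely subtrees
        p1, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}))
```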
Preprint
In the submodular ranking (SR) problem, the input consists of a set of submodular functions defined on a common ground set of elements. The goal is to order the elements so that, on average over the functions, each function's value exceeds a given threshold as early as possible, assuming one element is selected at a time. The problem is flexible enough to capture various applic...
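To illustrate the flavor of the problem, here is a hedged sketch of a commonly used greedy rule for submodular ranking: at each step, pick the element with the largest total marginal gain, normalized by each unsatisfied function's remaining deficit. This is an assumption about a typical approach, not necessarily the preprint's algorithm; the functions are assumed monotone with values in [0, 1] and threshold 1, and the example data are made up.

```python
def greedy_ranking(elements, functions):
    """elements: iterable of hashable items.
    functions: callables f(frozenset) -> value in [0, 1] (assumed monotone submodular)."""
    chosen, order = set(), []
    remaining = set(elements)
    while remaining and any(f(frozenset(chosen)) < 1.0 for f in functions):
        def score(e):
            base, ext = frozenset(chosen), frozenset(chosen | {e})
            total = 0.0
            for f in functions:
                fv = f(base)
                if fv < 1.0:  # only functions still below the threshold count
                    total += (f(ext) - fv) / (1.0 - fv)
            return total
        best = max(remaining, key=score)
        remaining.remove(best)
        chosen.add(best)
        order.append(best)
    order.extend(remaining)  # leftovers in arbitrary order
    return order

# Example: each "function" is coverage of a required subset, scaled to [0, 1].
def coverage(required):
    return lambda s: len(s & required) / len(required)

print(greedy_ranking({"a", "b", "c", "d"},
                     [coverage(frozenset({"a"})), coverage(frozenset({"b", "c"}))]))
```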
Article
Decision trees are popular classification models, providing high accuracy and intuitive explanations. However, as the tree size grows, model interpretability deteriorates. Traditional tree-induction algorithms, such as C4.5 and CART, rely on impurity-reduction functions that promote the discriminative power of each split. Thus, although these tr...
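As a concrete illustration of the impurity-reduction criterion mentioned above, the snippet below scores a candidate binary split by the weighted decrease in Gini impurity (the criterion used by CART); the labels and the split are made-up examples.

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def gini_reduction(parent_labels, left_labels, right_labels):
    """Weighted decrease in impurity achieved by splitting parent into left/right."""
    n = len(parent_labels)
    weighted = (len(left_labels) / n) * gini(left_labels) \
             + (len(right_labels) / n) * gini(right_labels)
    return gini(parent_labels) - weighted

parent = ["pos", "pos", "pos", "neg", "neg", "neg"]
left, right = ["pos", "pos", "pos"], ["neg", "neg", "neg"]
print(gini_reduction(parent, left, right))  # a perfect split recovers the full 0.5 impurity
```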