### Abstract

Generalized Binary Search (GBS) is a well-known greedy algorithm for identifying an unknown object while minimizing the number of "yes" or "no" questions posed about that object, and arises in problems such as active learning and active diagnosis. Here, we provide a coding-theoretic interpretation for GBS and show that GBS can be viewed as a top-down algorithm that greedily minimizes the expected number of queries required to identify an object. This interpretation is then used to extend GBS in two ways. First, we consider the case where the objects are partitioned into groups, and the objective is to identify only the group to which the object belongs. Then, we consider the case where the cost of identifying an object grows exponentially in the number of queries. In each case, we present an exact formula for the objective function involving Shannon or Rényi entropy, and develop a greedy algorithm for minimizing it.
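The greedy rule the abstract refers to has a standard form: at each step, GBS asks the binary query whose ±1 responses split the remaining probability mass over candidate objects most evenly, mirroring the coding-theoretic view of repeatedly halving the code space. A minimal NumPy sketch of that selection rule and the resulting identification loop (the function names and the ±1 response-matrix encoding are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gbs_select_query(priors, responses):
    """Greedy GBS step: pick the query whose +1/-1 responses split
    the remaining probability mass most evenly.

    priors:    shape (n_objects,), current distribution over objects
    responses: shape (n_queries, n_objects), entries in {-1, +1}
    Returns the index of the most balanced query."""
    # |sum_theta p(theta) * q(theta)| is 0 for a perfectly even split
    imbalance = np.abs(responses @ priors)
    return int(np.argmin(imbalance))

def gbs_identify(priors, responses, oracle):
    """Run GBS until a single object carries all remaining mass.

    oracle(q) returns the true object's response (+1 or -1) to query q."""
    p = np.array(priors, dtype=float)
    asked = set()
    while np.count_nonzero(p) > 1:
        q = gbs_select_query(p, responses)
        if q in asked:                     # no informative query remains
            break
        asked.add(q)
        ans = oracle(q)
        p[responses[q] != ans] = 0.0       # discard inconsistent objects
        p /= p.sum()                       # renormalize the posterior
    return int(np.argmax(p))
```

With uniform priors and a response matrix whose queries bisect the object set, this loop identifies any of 2^k objects in k queries, matching the expected-query-count objective the paper analyzes.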

| Original language | English (US) |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 |
| State | Published - 2010 |
| Event | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada. Duration: Dec 6 2010 → Dec 9 2010 |

### Other

| Other | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 |
|---|---|
| Country | Canada |
| City | Vancouver, BC |
| Period | 12/6/10 → 12/9/10 |

### ASJC Scopus subject areas

- Information Systems

### Cite this

Bellala, G., Bhavnani, S., & Scott, C. (2010). **Extensions of Generalized Binary Search to group identification and exponential costs.** In *Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010*. 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010, Vancouver, BC, Canada, 12/6/10.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Extensions of Generalized Binary Search to group identification and exponential costs

AU - Bellala, Gowtham

AU - Bhavnani, Suresh

AU - Scott, Clayton

PY - 2010

Y1 - 2010

N2 - Generalized Binary Search (GBS) is a well known greedy algorithm for identifying an unknown object while minimizing the number of "yes" or "no" questions posed about that object, and arises in problems such as active learning and active diagnosis. Here, we provide a coding-theoretic interpretation for GBS and show that GBS can be viewed as a top-down algorithm that greedily minimizes the expected number of queries required to identify an object. This interpretation is then used to extend GBS in two ways. First, we consider the case where the objects are partitioned into groups, and the objective is to identify only the group to which the object belongs. Then, we consider the case where the cost of identifying an object grows exponentially in the number of queries. In each case, we present an exact formula for the objective function involving Shannon or Rényi entropy, and develop a greedy algorithm for minimizing it.

AB - Generalized Binary Search (GBS) is a well known greedy algorithm for identifying an unknown object while minimizing the number of "yes" or "no" questions posed about that object, and arises in problems such as active learning and active diagnosis. Here, we provide a coding-theoretic interpretation for GBS and show that GBS can be viewed as a top-down algorithm that greedily minimizes the expected number of queries required to identify an object. This interpretation is then used to extend GBS in two ways. First, we consider the case where the objects are partitioned into groups, and the objective is to identify only the group to which the object belongs. Then, we consider the case where the cost of identifying an object grows exponentially in the number of queries. In each case, we present an exact formula for the objective function involving Shannon or Rényi entropy, and develop a greedy algorithm for minimizing it.

UR - http://www.scopus.com/inward/record.url?scp=84860617818&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84860617818&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781617823800

BT - Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

ER -