Exploring the Boundary Region of Tolerance Rough Sets for Feature Selection

Research output: Contribution to journal › Article › peer-review

113 Citations (SciVal)
204 Downloads (Pure)


Of all the challenges facing the effective application of computational intelligence technologies to pattern recognition, dataset dimensionality is undoubtedly one of the primary impediments. For pattern classifiers to be efficient, a dimensionality reduction stage is usually performed prior to classification. Much use has been made of rough set theory for this purpose, as it is completely data-driven and requires no additional information, whereas most other methods demand some extra knowledge. However, traditional rough set-based methods in the literature are restricted to discrete data and therefore cannot handle real-valued or noisy data directly. This is usually addressed by employing a discretisation method, which can result in information loss. This paper proposes a new approach based on the tolerance rough set model, which can deal with real-valued data whilst retaining dataset semantics. More significantly, this paper describes the underlying mechanism by which this new approach utilises the information contained within the boundary region, or region of uncertainty. Using this information can yield more compact feature subsets and improved classification accuracy. These results are supported by an experimental evaluation comparing the proposed approach with a number of existing feature selection techniques.
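The core ideas referred to in the abstract can be illustrated in a few lines. The sketch below is not the paper's algorithm: it is a minimal, assumed implementation of the standard tolerance rough set constructions it builds on. Objects are deemed similar on a feature if their values differ by at most a threshold `tau`; a tolerance class collects all objects similar to a given one on every selected feature; lower and upper approximations of each decision class then give the boundary (uncertainty) region. The toy data, `tau` value, and all names are illustrative assumptions.

```python
def tolerance_class(data, x, features, tau):
    """Indices of objects similar to x on every feature within threshold tau."""
    return {
        y for y in range(len(data))
        if all(abs(data[x][f] - data[y][f]) <= tau for f in features)
    }

def approximations(data, labels, features, tau):
    """Per decision class: (lower approximation, upper approximation, boundary)."""
    universe = range(len(data))
    result = {}
    for cls in set(labels):
        concept = {i for i in universe if labels[i] == cls}
        lower, upper = set(), set()
        for x in universe:
            tc = tolerance_class(data, x, features, tau)
            if tc <= concept:       # all similar objects share the class
                lower.add(x)
            if tc & concept:        # at least one similar object has the class
                upper.add(x)
        result[cls] = (lower, upper, upper - lower)  # boundary = upper - lower
    return result

# Toy real-valued dataset: two features, two decision classes.
# Object 4 sits near class 'a' objects but is labelled 'b',
# so a non-empty boundary region appears.
data = [(0.10, 0.20), (0.15, 0.25), (0.90, 0.80), (0.85, 0.75), (0.20, 0.30)]
labels = ['a', 'a', 'b', 'b', 'b']

for cls, (lo, up, bnd) in approximations(data, labels, [0, 1], tau=0.15).items():
    print(cls, sorted(lo), sorted(up), sorted(bnd))
```

Under this reading, the paper's contribution concerns how the contents of the boundary region (`upper - lower`), rather than the lower approximation alone, can guide the search for feature subsets.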
Original language: English
Pages (from-to): 655-667
Number of pages: 13
Journal: Pattern Recognition
Issue number: 5
Early online date: 09 Sept 2008
Publication status: Published - 01 May 2009


Keywords
  • Attribute reduction
  • Feature selection
  • Classification
  • Rough sets


