A growing literature reveals that brain mechanisms contribute to the symptoms of hearing loss. To study how hearing loss affects brain function, several groups including our own have developed light-based neuroimaging methods based on functional near-infrared spectroscopy (fNIRS). Using fNIRS, we recently confirmed that lateral frontal cortex (LFC) engages when normal-hearing listeners direct attention to words in background sound versus listening passively. We now test how sensory resolution affects LFC recruitment. As a control, we also record over auditory cortex (AC). Using simulated cochlear implant speech, we ask listeners to perform a word detection task amid competing background speech while we vary the amount of sensory detail in the cochlear implant simulation. One possibility is that impoverished sensory cues reduce AC or LFC engagement relative to high-fidelity cues, limiting the potential usefulness of auditory attention. Alternatively, poorly resolved sensory detail may increase AC or LFC engagement, causing that engagement to saturate and limiting overall speech intelligibility. Recruitment of AC and LFC will be discussed in the context of the saturation hypothesis.