Peer review is under pressure. Demand for reviews is outstripping supply, since reviewers tend to be busy people who contribute voluntarily. Authors value reviews highly, yet complain about the time it takes to receive feedback, to the point that the timeliness of research is at stake. Although parts of the review process have moved to the Web, the review itself is still often conducted with no more support than a yellow highlighter, whether physical or digital. This work seeks more capable highlighters that account for the specifics of reviewing. Peer review does not stop at spotting a manuscript's merits and demerits; it also strives for manuscript improvement and gatekeeping. These functions are conducted within an often tacit research-quality framework, and frequently in a discontinuous way. Unfortunately, when it comes to supporting review practices, current facilities fall short. This work introduces a set of requirements for review-dedicated highlighters. These requirements are instantiated and evaluated through Review&Go, a color-coding highlighter that generates a review draft from the reviewer's highlighting activities. The aim is to offer representational guidance that enhances contextual and cognitive awareness, so that reviewers can exert less effort while delivering valuable and timely reviews.