Quantum simulation, the study of strongly correlated quantum matter using synthetic quantum systems, has been the most successful application of quantum computers to date. It often requires determining observables with high precision, for example when studying critical phenomena near quantum phase transitions. Thus, readout errors must be carefully characterized and mitigated in data post-processing, using scalable and noise-model-agnostic protocols. We present a readout error mitigation protocol that uses only single-qubit Pauli measurements and avoids experimentally challenging randomized measurements. The proposed approach captures a very broad class of correlated noise models and is scalable to large qubit systems. It is based on a complete and efficient characterization of few-qubit correlated positive operator-valued measures (POVMs), using overlapping detector tomography. To assess the effectiveness of the protocol, we extract observables from simulations of systems with up to 100 qubits, employing readout errors obtained from experiments with superconducting qubits.
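To give a rough sense of the post-processing idea behind readout error mitigation, the following Python sketch assumes a deliberately simplified setting: purely classical, uncorrelated single-qubit readout noise described by a confusion matrix, which is estimated from calibration shots and then inverted to correct a measured distribution. The error rates `p01` and `p10`, the shot counts, and the helper functions are hypothetical illustrations; the protocol summarized above instead reconstructs general few-qubit correlated POVMs via overlapping detector tomography rather than assuming this classical noise model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-qubit assignment-error rates (illustration only; the
# protocol in the paper handles correlated few-qubit POVMs, not just this
# classical, uncorrelated model).
p01 = 0.02   # probability of reading 1 when the prepared state is |0>
p10 = 0.05   # probability of reading 0 when the prepared state is |1>

def confusion_matrix(p01, p10):
    """Column-stochastic assignment matrix A with A[read, true]."""
    return np.array([[1 - p01, p10],
                     [p01, 1 - p10]])

def estimate_confusion(shots=10_000):
    """Calibration step (sketch): prepare |0> and |1>, sample noisy readouts,
    and estimate the assignment matrix from the observed frequencies."""
    A_hat = np.zeros((2, 2))
    for true_bit in (0, 1):
        flip_prob = p01 if true_bit == 0 else p10
        flipped = rng.random(shots) < flip_prob
        outcomes = np.where(flipped, 1 - true_bit, true_bit)
        A_hat[:, true_bit] = np.bincount(outcomes, minlength=2) / shots
    return A_hat

# Mitigation step: apply the inverse of the estimated assignment matrix to
# the measured outcome distribution in post-processing.
A_hat = estimate_confusion()
p_ideal = np.array([0.7, 0.3])                  # ideal outcome distribution
p_noisy = confusion_matrix(p01, p10) @ p_ideal  # what the device would report
p_mitigated = np.linalg.solve(A_hat, p_noisy)

print("noisy     :", np.round(p_noisy, 4))
print("mitigated :", np.round(p_mitigated, 4))
```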