Background Cardiac perfusion MRI enables quantification of myocardial blood flow in ml/min/g. However, signal intensity (SI) saturation of the arterial input function (AIF) leads to underestimation of the AIF and errors in perfusion quantification. Dual bolus experiments avoid this by using an unsaturated dilute bolus to measure the AIF, and the technique has been shown to be accurate against gold-standard microsphere measurements. Despite the advantages of absolute quantification, dual bolus has seen limited adoption outside of a few large academic centers, in part because it is difficult to implement. Setup requires either two injectors or a complex preloading scheme, both of which add time and complexity to a procedure that is sensitive to experimental error. To calculate perfusion, the AIF from the dilute bolus is scaled to match the myocardial SI data from the full bolus using the ratio of contrast concentrations between the full and dilute boluses at the left ventricle. This ratio is typically assumed to equal the dilution ratio used to mix the dilute bolus, so any imprecision during mixing can translate into large errors in AIF amplitude and thus in calculated perfusion. Modifying the dual bolus technique so that it is more tolerant of experimental error might facilitate wider adoption outside of large research centers. Here we present a method for empirically determining contrast ratios that can be applied retrospectively to dual bolus perfusion data to reduce experimental error.
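To illustrate how mixing imprecision propagates into perfusion estimates, the following sketch scales a dilute-bolus AIF by an assumed dilution ratio and compares it against the ratio actually achieved. All values (the AIF shape, the ratios, the 10% mixing error) are hypothetical and chosen only for illustration; they are not from the study. Because quantified flow is inversely proportional to the AIF area, a ratio error maps directly onto a flow error.

```python
import numpy as np

# Hypothetical dilute-bolus AIF first-pass curve (arbitrary SI units over time).
t = np.linspace(0, 60, 121)                        # seconds
aif_dilute = 5.0 * np.exp(-((t - 15.0) / 5.0) ** 2)

nominal_ratio = 10.0  # intended dilution used to mix the dilute bolus (e.g. 1:10)
actual_ratio = 9.0    # ratio actually achieved after a 10% mixing error

# Scale the dilute AIF up to the full-bolus concentration scale.
aif_assumed = aif_dilute * nominal_ratio  # what the analysis uses
aif_true = aif_dilute * actual_ratio      # what the subject actually received

# Flow estimates scale with 1 / (AIF area), so the relative error in the
# scaled AIF area appears directly (with opposite sign) in the flow estimate.
area_assumed = aif_assumed.sum()
area_true = aif_true.sum()
flow_error_pct = (area_true / area_assumed - 1.0) * 100.0

print(f"Perfusion error from mixing imprecision: {flow_error_pct:.1f}%")
# → Perfusion error from mixing imprecision: -10.0%
```

The point of the sketch is that the error is purely multiplicative: a 10% overestimate of the dilution ratio yields a 10% underestimate of flow, which is the sensitivity that an empirically determined contrast ratio would remove.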