I have a question about the Qubit assay kit for measuring dsDNA concentration. The kit comes with two standard tubes, labeled 0 ng/µL and 10 ng/µL.
I calibrate the Qubit with these standards without issue. First I prepare the PicoGreen working mix, then I take 10 µL of each standard and dilute it into 190 µL of the mix, so each standard is diluted 1/20. After the Qubit is calibrated, I measure the standards themselves: they read as <0.50 ng/mL (for the 0 ng/µL tube) and >600 ng/mL (for the 10 ng/µL tube).
My question is: how does the Qubit calculate the 10 ng/µL standard as being greater than 600 ng/mL when it was diluted 20-fold in the first place? Can someone show me the calculation behind this? I only want to understand how it reads the standards the way it does.
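For reference, here is the dilution arithmetic as I understand it (a sketch of my own reasoning from the numbers above, not the instrument's actual algorithm), which is why the >600 ng/mL reading confuses me:

```python
# Dilution as described: 10 µL of the 10 ng/µL standard
# into 190 µL of PicoGreen working mix (200 µL total).
standard_conc_ng_per_ul = 10.0
standard_vol_ul = 10.0
total_vol_ul = 10.0 + 190.0

dilution_factor = total_vol_ul / standard_vol_ul          # 20-fold
in_tube_ng_per_ul = standard_conc_ng_per_ul / dilution_factor
in_tube_ng_per_ml = in_tube_ng_per_ul * 1000.0            # 1 µL = 1/1000 mL

print(dilution_factor)    # 20.0
print(in_tube_ng_per_ml)  # 500.0 ng/mL expected in the tube
```

By this arithmetic the concentration actually in the assay tube should be 500 ng/mL, yet the instrument reports >600 ng/mL for that tube.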