So, I am new to the US credit card system, and I have been reading up on the credit utilization ratio. Everyone describes it as the percentage of your current balance relative to your total credit limits across all your credit lines, but I have found the definition of "current" to be rather vague. A few resources say that what matters is the balance at the end of the billing cycle, since that is when credit card companies typically report your data to the credit bureaus, but I have found mixed information on this point.
Now, say I have one credit card with a $200 credit line. Whenever I make a purchase with the card, I pay off the corresponding balance as soon as it posts to my account (so, 1-2 business days after the purchase). Say I am 100% consistent with this. Because I pay each charge as soon as it appears, my balance never exceeds $65 at any moment, which is 32.5% of my credit line. Say that, in a typical month, I charge about $120 in total to the card (that is, $120 is the sum of all purchases made with it). At the end of each billing cycle, my statement always shows a balance of $0.00.
Supposing this scenario repeats unchanged every month, which of the following is the better estimate of my credit utilization ratio: 0%, the statement balance over the limit; 32.5%, the maximum balance-to-limit ratio I held at any moment on my single card; or 60%, i.e., $120/$200 (total amount charged divided by the credit limit)?
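To make the three candidate definitions explicit, here is a quick sketch of the arithmetic using the numbers from my example (the variable names are just mine, not any official terminology):

```python
# Three candidate definitions of "utilization" for the example month.
credit_limit = 200.00        # the card's credit line
statement_balance = 0.00     # balance shown on the statement at cycle close
peak_balance = 65.00         # highest balance held at any moment mid-cycle
total_charged = 120.00       # sum of all purchases during the month

# Candidate 1: statement balance / limit
util_statement = statement_balance / credit_limit * 100  # 0.0%

# Candidate 2: peak intra-cycle balance / limit
util_peak = peak_balance / credit_limit * 100            # 32.5%

# Candidate 3: total monthly spend / limit
util_total = total_charged / credit_limit * 100          # 60.0%

print(util_statement, util_peak, util_total)
```

These three numbers are what I mean by 0%, 32.5%, and 60% in the question above.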
Thank you in advance for your insights.