> the definition of a second wouldn't have an uncertainty when related to the transition of the Cs atom
The definition of the SI unit "second" does not refer to just any given sample of Cs atoms; specifically, it does not refer to transitions between the two hyperfine levels of the ground state of just any given sample of caesium 133 atoms.
Rather, it refers to an idealization: a caesium atom at rest, at a temperature of 0 K, and free of any perturbation.
As far as this idealization is unambiguously defined, such that for any given sample of caesium 133 atoms one can measure by how much it differs from the idealization, the SI unit "second" has no uncertainty.
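To make the "no uncertainty in the definition" point concrete, here is a minimal Python sketch. It relies only on the fact that the SI definition fixes the caesium 133 hyperfine transition frequency at exactly 9 192 631 770 Hz, so the corresponding period is an exact rational number of seconds; nothing here is a measurement.

```python
from fractions import Fraction

# The SI second is *defined* by fixing the caesium-133 hyperfine
# transition frequency at exactly 9 192 631 770 Hz. This integer is
# exact by definition, not a measured value.
CS_HYPERFINE_HZ = 9_192_631_770

# The period of one transition cycle is therefore an exact rational
# number of seconds -- the definition itself carries no uncertainty.
period = Fraction(1, CS_HYPERFINE_HZ)

# One second corresponds to exactly this many periods:
assert period * CS_HYPERFINE_HZ == 1
```

Uncertainty enters only when a real, perturbed sample of caesium atoms (a real clock) is compared against this idealization.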
But why doesn't the speed of light have uncertainty?
That is due to our definition of (how to measure) "speed";
and, in the first place, due to our definition of (how to measure) the "distance" between participants ("ends") who were and remained at rest with respect to each other, and therefore also due to our definition of (how to measure) whether a given pair of participants is "at rest" with respect to each other, or not.
Specifically, within the framework of (special) relativity, and thus of contemporary physics in general, we define the distance between two suitable participants (i.e. participants who were and remained at rest with respect to each other), say $A$ and $B$, through the ping duration between them: the duration measured by either participant from indicating a signal until indicating the reception of the corresponding reflection off the other participant. (By the definition of how to measure mutual rest, these mutual ping durations are equal to each other, trial by trial, and constant.)
The distance between $A$ and $B$ is then expressed as $$\ell[~A, B~] = \ell[~B, A~] := \frac{c}{2} ~ \tau A[~\text{signal}, \circledR B \circledR \text{signal}~] = \frac{c}{2} ~ \tau B[~\text{signal}, \circledR A \circledR \text{signal}~],$$
where "$c$" is (just) a distinctive symbol (for distinguishing ping durations between a suitable pair of participants from other durations) which is (evidently) not zero; and the factor $\frac{1}{2}$ is included by convention.
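A minimal numeric sketch of this distance definition: by definition, the distance between mutually resting participants $A$ and $B$ is $\frac{c}{2}$ times their ping duration. The numeric value of $c$ used below is the SI convention; the ping duration is a hypothetical example.

```python
C = 299_792_458  # m/s, exact by SI convention

def distance_from_ping(tau: float) -> float:
    """ell[A, B] assigned to a measured ping duration tau, per the definition."""
    return (C / 2) * tau

# A hypothetical ping duration of 2 s yields a distance of one light-second:
assert distance_from_ping(2.0) == 299_792_458.0
```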
Further, using the definition of the "average speed of a trip from $A$ to $B$" as the ratio of the "distance between start and finish" to the "duration taken for the trip",
the (average) signal front speed of a signal being exchanged between $A$ and $B$ is evaluated as the ratio of $\ell[~A, B~]$ to half of the ping duration between $A$ and $B$; explicitly therefore:
$$\ell[~A, B~] ~/~ \frac{\tau A[~\text{signal}, \circledR B \circledR \text{signal}~]}{2} = $$ $$\frac{c}{2} ~ \tau A[~\text{signal}, \circledR B \circledR \text{signal}~] ~/~ \frac{\tau A[~\text{signal}, \circledR B \circledR \text{signal}~]}{2} = c.$$
So the symbol "$c$", which had been formally introduced in the distance definition, is subsequently identified as the value of the (average) signal front speed (or, colloquially, the "speed of light in vacuum").
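The cancellation in the calculation above can be sketched numerically as well: since the distance is *defined* as $\frac{c}{2} \tau$ and the one-way duration is $\frac{\tau}{2}$, their ratio comes out as $c$ for every ping duration whatsoever. The $\tau$ values below are arbitrary examples (powers of two, so the floating-point arithmetic is exact).

```python
C = 299_792_458  # m/s, exact by SI convention

def signal_front_speed(tau: float) -> float:
    """Average signal front speed evaluated from the definitions above."""
    distance = (C / 2) * tau            # ell[A, B], by definition
    one_way_duration = tau / 2          # half the ping duration
    return distance / one_way_duration  # tau cancels: the result is always C

# The result is independent of the (hypothetical) ping duration:
for tau in (0.25, 1.0, 1024.0):
    assert signal_front_speed(tau) == C
```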
Isn't the speed of light something that's measured physically?
No: there is nothing genuinely to be measured; the result is necessarily "$c$", as sketched above, plainly and without any uncertainty. (Therefore, "$c$" also lends itself as an obvious, natural unit of speed. But, of course, speed values are independent of any particular choice of units in which they are expressed.)
What can and should be measured, trial by trial, is instead foremost whether the two particular "ends" under consideration were and remained indeed at rest with respect to each other (or, if not, to quantify by how much they failed to be).
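The operational check described here, that mutual ping durations must be equal and constant, trial by trial, can be sketched as follows. The function name, inputs, and tolerance are illustrative assumptions, not a prescribed procedure.

```python
def mutually_at_rest(pings_AB, pings_BA, tolerance=0.0):
    """True if all recorded ping durations (A->B->A and B->A->B) agree
    within `tolerance`, i.e. are equal to each other and constant
    across trials -- the criterion for A and B being mutually at rest."""
    all_pings = list(pings_AB) + list(pings_BA)
    reference = all_pings[0]
    return all(abs(p - reference) <= tolerance for p in all_pings)

# Equal, constant ping durations across trials: at rest.
assert mutually_at_rest([2.0, 2.0, 2.0], [2.0, 2.0])
# A drifting ping duration: not at rest.
assert not mutually_at_rest([2.0, 2.1, 2.2], [2.0, 2.1])
```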