As others have commented, simply saying that the probability the next (so initially, the first) ball is white is $p$, and $1-p$ for black, appears insufficient: you really need a prior distribution for the number of white balls.
Laplace's rule of succession does not seem to work here. His argument was essentially that the probability of the world ending tomorrow might be taken as having a Beta distribution. He started with a $\text{Beta}(1,1)$ distribution, so with mean $\frac12$, and after $k$ examples of the world not ending updated this to $\text{Beta}(1,k+1)$, with mean $\frac{1}{k+2}$ for the world ending the next day and so $\frac{k+1}{k+2}$ for it not ending. His estimate of the conditional probability of survival therefore tended to increase over time, rather like (2). (He could have started with a different prior and got a similar result.) But his argument does not fit the balls-in-urn model, in particular the possibility that the black ball might appear before some of the white balls; once the world has ended, it is unlikely to continue existing. So we need to consider something else.
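As a quick numerical aside, the Beta-posterior update behind the rule of succession is easy to check exactly (a sketch in Python; the function name is mine):

```python
from fractions import Fraction

def survival_prob(k):
    """Posterior mean of Beta(1, k+1): the probability the world does
    NOT end tomorrow after k observed days of survival, (k+1)/(k+2)."""
    return Fraction(k + 1, k + 2)

probs = [survival_prob(k) for k in range(5)]
print(probs)  # 1/2, 2/3, 3/4, 4/5, 5/6

# The estimate increases towards 1 as survivals accumulate, as in (2).
assert all(a < b for a, b in zip(probs, probs[1:]))
```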
As an example of (1), you may actually know the number of white balls is $n$ (perhaps you put the balls in yourself), and so say $p=\frac{n}{n+1}$ initially. Conditional on having seen $k$ white balls, you might then say the probability of the next ball being white is $\frac{n-k}{n+1-k}$. This probability decreases as $k$ increases. Many other priors would produce a similar result, such as a Poisson-distributed number of white balls: if it had parameter $\lambda$ then your initial estimate for $p$ would be $1- \frac{1-e^{-\lambda}}{\lambda}$ but, as you saw more white balls, the conditional probability of the next ball being white would fall towards $0$.
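Both claims can be checked by computing the posterior sum directly. The sketch below (the helper name is mine) uses the fact that, with one black ball and $N$ white, the chance the first $k$ draws are all white is $\frac{N+1-k}{N+1}$, and that with $m$ white balls remaining the next ball is white with probability $\frac{m}{m+1}$:

```python
import math

def next_white_prob(prior, k, nmax=400):
    """P(next ball white | first k draws all white) for an urn with one
    black ball and prior(n) = P(N = n) on the white-ball count.
    Posterior weight on 'm whites remain' (i.e. N = k+m) is
    prior(k+m) * (m+1)/(k+m+1), up to a constant that cancels."""
    num = den = 0.0
    for m in range(nmax):
        w = prior(k + m) * (m + 1) / (k + m + 1)
        num += w * m / (m + 1)
        den += w
    return num / den

# Known count n = 5: recovers (n-k)/(n+1-k), e.g. 3/4 after k = 2 draws.
known5 = lambda n: 1.0 if n == 5 else 0.0
print([round(next_white_prob(known5, k), 4) for k in range(5)])

# Poisson prior with parameter lam = 3 (pmf via logs to avoid overflow).
lam = 3.0
poisson = lambda n: math.exp(n * math.log(lam) - math.lgamma(n + 1) - lam)

# Initial value matches the closed form 1 - (1 - e^(-lam))/lam ...
print(next_white_prob(poisson, 0), 1 - (1 - math.exp(-lam)) / lam)
# ... and the conditional probability falls as k grows.
print([round(next_white_prob(poisson, k), 3) for k in (0, 2, 5, 10)])
```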
But other priors are possible, and you can find an example which works for (2). A natural example: suppose you believed that, in addition to the $1$ black ball, the number of white balls $N\ge 0$ followed a geometric distribution with $P(N=n)=(1-q) q^n$ for some value $q$. Your initial figure for $p$ would be $1-\frac{1-q}{q}\log_e\left(\frac1{1-q}\right)$ but, as you saw more white balls, the conditional probability of the next ball being white would increase towards $q$.
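Both the initial value and the increase towards $q$ can be checked numerically (a sketch with a hypothetical helper; again, with one black ball and $N$ white, the chance the first $k$ draws are all white is $\frac{N+1-k}{N+1}$, so the posterior weight on "$m$ whites remain" is proportional to $q^m\frac{m+1}{k+m+1}$ after the constant $(1-q)q^k$ cancels):

```python
import math

def next_white_prob(k, q, nmax=2000):
    """P(next ball white | first k draws all white) under the geometric
    prior P(N = n) = (1-q) q^n on the white count, with one black ball."""
    num = den = 0.0
    for m in range(nmax):
        w = q**m * (m + 1) / (k + m + 1)   # posterior weight on m remaining
        num += w * m / (m + 1)             # m remain -> next white w.p. m/(m+1)
        den += w
    return num / den

q = 0.8
# Initial value agrees with 1 - ((1-q)/q) * ln(1/(1-q)):
print(next_white_prob(0, q), 1 - (1 - q) / q * math.log(1 / (1 - q)))
# The conditional probability increases with k, towards q = 0.8:
print([round(next_white_prob(k, q), 3) for k in (0, 1, 5, 20, 100)])
```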
Having found examples which work for (1) and (2), it should be - and is - possible to find an intermediate prior which works for (3). Suppose you believed that in addition to the $1$ black ball, the number of white balls $N\ge 0$ followed a distribution with $P(N=n)=(n+1)(1-p)^2 p^n$ for some value $p$. Initially you would say the probability of the next ball being white would be $\sum\limits_{n=0}^{\infty} \frac{n}{n+1} (n+1)(1-p)^2 p^n= p$. After seeing $k$ white balls, your conditional distribution for the number of remaining white balls being $n$ would be unchanged - the trick in the construction is that $\frac{n}{n+1} (n+1)=n$ - and so the conditional probability of the next ball being white would remain $p$, as in (3). Lewis Carroll did something similar in one of his Pillow Problems, drawing without replacement from an urn filled with two colours selected from a binomial distribution.
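The cancellation can be exhibited exactly with rational arithmetic (a sketch; the function name is mine). The posterior weight on "$m$ white balls remain" after $k$ white draws is $(k+m+1)(1-p)^2p^{k+m}\cdot\frac{m+1}{k+m+1} = (1-p)^2(m+1)p^{k+m}$, which up to the constant $p^k$ does not depend on $k$ at all, so the conditional distribution of the remaining whites never moves:

```python
from fractions import Fraction

def posterior_weight(p, k, m):
    """Unnormalised posterior weight on 'm white balls remain' after k
    white draws, under the prior P(N=n) = (n+1)(1-p)^2 p^n with one
    black ball: P(N=k+m) * P(first k draws white | N=k+m)."""
    return (Fraction(k + m + 1) * (1 - p) ** 2 * p ** (k + m)
            * Fraction(m + 1, k + m + 1))

p = Fraction(1, 2)
for m in range(10):
    # Up to the overall constant p^k, the weight on m is identical for
    # every k: seeing white balls teaches you nothing about how many remain.
    assert posterior_weight(p, 5, m) / posterior_weight(p, 0, m) == p ** 5

# Hence P(next white) is the same after any number of white draws; a long
# truncated sum of m/(m+1) against these weights shows that value is p:
num = sum(m * p**m for m in range(200))        # weights (m+1)p^m times m/(m+1)
den = sum((m + 1) * p**m for m in range(200))
print(float(num / den))  # → 0.5 = p
```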
This shows that the prior distribution matters: simply stating a single number derived from the prior, without stating its shape, is not enough for the question to be answered. Each of your three arguments can be valid for some priors, while a different one dominates for others.