Following up on @mlk's suggestion to do Olbers' paradox in reverse, I'll try to estimate how much of the sky ends at a visible star (i.e., one emitting something in the visible range, so that it could literally be "seen" with sufficient light-gathering power, that is, a sufficiently large telescope).
- This question & answer estimates the total amount of light from the stars, integrated over the sky, at about -6.5 magnitudes.
- Assuming that the above figure is in the visual band, and using the standard reference fluxes on Wikipedia, this corresponds to about $1.5 \times 10^{6} \text{ Jy}$, or (dividing by 4π steradians) an average "surface brightness" of the sky of about $1.2 \times 10^{5} \text{ Jy/sr}$.
- A blackbody at 3000 K (typical of a red dwarf, which is the most common type of star) has a surface brightness at 550 nm (about 545 THz) of roughly $4 \times 10^{16} \text{ Jy/sr}$. (A quick numerical check of both figures is sketched below.)
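As a sanity check, here is a short Python sketch of the two numbers above (my own back-of-envelope, not the original calculation; it assumes the Johnson V-band zero point of about 3640 Jy and treats the stellar photosphere as a pure blackbody):

```python
import math

# 1. Integrated starlight: visual magnitude -6.5 converted to a flux density.
F_zero = 3640.0                             # Jy, Johnson V-band zero-point flux density
m_total = -6.5                              # integrated magnitude of all starlight
F_total = F_zero * 10 ** (-m_total / 2.5)   # ~1.5e6 Jy
I_sky = F_total / (4 * math.pi)             # sky-averaged intensity, ~1.2e5 Jy/sr

# 2. Surface brightness of a 3000 K blackbody at 550 nm (Planck function B_nu).
h, c, k = 6.626e-34, 2.998e8, 1.381e-23     # SI constants
T, lam = 3000.0, 550e-9
nu = c / lam                                # ~5.45e14 Hz (545 THz)
B_nu = (2 * h * nu**3 / c**2) / (math.exp(h * nu / (k * T)) - 1)
B_nu_Jy = B_nu / 1e-26                      # 1 Jy = 1e-26 W m^-2 Hz^-1, giving ~4e16 Jy/sr

print(f"integrated flux     ~ {F_total:.2e} Jy")
print(f"mean sky intensity  ~ {I_sky:.2e} Jy/sr")
print(f"stellar brightness  ~ {B_nu_Jy:.2e} Jy/sr")
```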
So the "fraction of the sky" in which there is a star is the ratio of these two numbers: if a fraction $f$ of the sky is covered by stellar disks of surface brightness $B_\nu$, the sky-averaged intensity is $f B_\nu$, which gives $f \approx 3 \times 10^{-12}$.
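Written out, with $\langle I_\nu \rangle$ the sky-averaged intensity from the first two bullets and $B_\nu$ the blackbody surface brightness from the third:

$$ f \;\approx\; \frac{\langle I_\nu \rangle}{B_\nu} \;\approx\; \frac{1.2 \times 10^{5}\ \text{Jy/sr}}{4 \times 10^{16}\ \text{Jy/sr}} \;\approx\; 3 \times 10^{-12}. $$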
This is a very crude estimate, and one could quibble about my choices & simplifications above. It also doesn't take into account cosmological effects, such as redshift or the focusing/defocusing of light rays by cosmological expansion. I wouldn't be entirely surprised if it's off by a factor of 100 in either direction.
But even so, the fraction of the sky that "has a star in it" is utterly negligible. The fraction of the sky that contains something visible will be dominated by the Sun and the Moon (which together cover about $10^{-5}$ of the sky), with maybe a small correction from the planets several decimal places further down.
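For what it's worth, that last figure can be checked with another short sketch (assuming both the Sun and the Moon have an angular radius of roughly 0.26°):

```python
import math

theta = math.radians(0.26)                  # angular radius of the Sun or Moon, ~0.26 deg
omega_disk = math.pi * theta**2             # solid angle of one disk, ~6.5e-5 sr
fraction = 2 * omega_disk / (4 * math.pi)   # Sun + Moon as a fraction of the 4*pi sr sky
print(f"Sun + Moon cover ~ {fraction:.1e} of the sky")   # ~1e-5
```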