If you have a voltage source and you want to charge a capacitor, you inevitably rely on circuit resistance (either an intentional resistor or unintentional cable resistance) to ensure that the peak current during charging is not stupidly high. And, if you do the math, the total energy acquired by the capacitor is: -
$$W = \dfrac{1}{2}\cdot CV^2$$
Whereas, the total energy delivered by the voltage source is this: -
$$W = CV^2$$
So, half the energy is lost in the cable resistance or intentional resistance and, importantly, that 50% figure doesn't depend on the value of the resistance; a bigger resistor just dissipates the same energy over a longer time.
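You can check this numerically. The sketch below (component values are arbitrary assumptions, not from the question) time-steps an RC charge from a fixed source and tallies where the source's energy goes; changing R moves the time scale but not the 50/50 split: -

```python
# Numerical check: charge C through R from a hard voltage source V and
# account for every joule the source delivers. Values are illustrative.
C = 100e-6   # 100 uF
V = 10.0     # 10 V source
R = 47.0     # series resistance; try other values -- the split stays 50/50
dt = R * C / 10000          # time step well below the R*C time constant
vc = 0.0                    # capacitor voltage
e_source = e_resistor = 0.0
for _ in range(200000):     # run for ~20 time constants
    i = (V - vc) / R            # charging current
    e_source += V * i * dt      # energy delivered by the source
    e_resistor += i * i * R * dt  # energy burned in the resistance
    vc += i / C * dt            # capacitor voltage update
e_cap = 0.5 * C * vc * vc
print(e_source, e_cap, e_resistor)
```

The printout shows e_source ≈ CV² (10 mJ here) with e_cap ≈ e_resistor ≈ ½CV², matching the equations above.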
As a comparison, if your voltage source were programmable and could ramp its voltage up slowly, then the energy delivered to the capacitor would approach the energy liberated from the voltage source.

In other words, when trying to charge a capacitor from a hard voltage source, you get a collision situation: the source insists on one voltage, the capacitor sits at another, and the resistance has to burn off the difference. This doesn't happen with inductors; apply a voltage across an inductor and the current rises linearly, so energy is stored very efficiently. This is why we use inductors in switch-mode power supplies.
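The ramp claim can be sketched the same way. Below, the same (assumed) R and C are driven by a source that ramps linearly to V over a time much longer than R·C; because the capacitor voltage now tracks the source closely, the I²R loss collapses and the efficiency approaches 100%: -

```python
# Same R and C as before, but the source ramps linearly from 0 to V over
# a duration T >> R*C. Values are illustrative assumptions.
C, V, R = 100e-6, 10.0, 47.0
T = 1000 * R * C            # ramp duration: 1000 time constants
steps = 200000
dt = T / steps
vc = 0.0                    # capacitor voltage
e_source = 0.0
for k in range(steps):
    vs = V * (k * dt) / T   # ramping source voltage
    i = (vs - vc) / R       # current driven by the small tracking error
    e_source += vs * i * dt # energy delivered by the source
    vc += i / C * dt
e_cap = 0.5 * C * vc * vc
print(e_cap / e_source)     # charging efficiency; near 1 for a slow ramp
```

With this ramp the efficiency comes out above 99%, versus the fixed 50% of the hard-source case; slowing the ramp further pushes it arbitrarily close to 100%.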