There are several things going on here; it isn't just a matter of applying Ohm's law.
First, that's not just a battery, it's a power supply. You're not dealing with the battery directly, but with the output of circuitry that runs off the battery. That circuitry manipulates the voltage and current and has its own limits, so in terms of how much current you can get out, you need to look at the specifications for the device.
One of the product pictures shows the back, where it lists the specs on the USB ports (5V, 2.4A for that model). It also has an AC output (suitable only for something very low-powered), and you would need to look at the specs for its limits; it will not exceed the power rating for the product. Of the numbers you listed in the question, the maximum power is what the unit can handle momentarily, like the surge when you first turn on whatever is plugged into it. The continuous rating is the 65W, and as you calculated, that works out to just under 0.3A at 230V. However, the power rating typically covers the whole device, so you probably couldn't demand that much AC power if you were also drawing power from the USB ports.
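The budget math above can be sketched in a few lines. The 65W continuous rating, 230V AC output, and 5V/2.4A USB spec come from the question and the product listing; the assumption that USB draw subtracts directly from the whole-device rating is a simplification.

```python
# Rough power budget for the supply described above.
# Figures (65 W continuous, 230 V AC, 5 V / 2.4 A USB) are from the
# question and product listing; check your own unit's label.

CONTINUOUS_W = 65.0
AC_VOLTS = 230.0
USB_VOLTS = 5.0
USB_AMPS = 2.4

# Maximum continuous AC current: P / V, just under 0.3 A.
max_ac_amps = CONTINUOUS_W / AC_VOLTS
print(f"Max continuous AC current: {max_ac_amps:.2f} A")  # ~0.28 A

# The wattage rating usually covers the whole device, so a fully
# loaded USB port eats into the AC budget (simplifying assumption):
usb_watts = USB_VOLTS * USB_AMPS
remaining_ac_watts = CONTINUOUS_W - usb_watts
print(f"AC power left with USB fully loaded: {remaining_ac_watts:.0f} W")
```

The same arithmetic works for any unit: divide the continuous wattage by the output voltage to get the current ceiling.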
Which brings us to whether it can power your laptop. Laptop power adapters are "universal" in the sense that they aren't manufactured specifically for your laptop. Yours is probably used with a whole range of laptops, so it is built to support the heaviest demands of any model it might be paired with. It is also designed to work over a wide range of input voltages. The output is therefore the only useful starting point for calculations, because the adapter produces that output from anything in its input range.
It will put out 19.5V at whatever current the laptop draws, up to 6.15A (about 120W maximum). That doesn't tell you much about what your laptop actually needs, though; you can get better information from the laptop's battery.
The battery voltage is likely to be a little lower than the adapter's output (to charge a battery, you need to apply a higher voltage than the battery puts out). The battery probably lists a mAh rating: milliamp-hours, i.e. thousandths of an amp times how long it can supply that current. The values are typically over 1,000, so divide by 1,000 to get amp-hours. The user manual may tell you how much run time to expect on a full charge, or just use the run time you got when the battery was new. Divide the amp-hours by the hours of run time to get the average amps, then multiply by the battery's voltage for a ballpark figure of the watts the laptop uses. There is a big margin of error, and you don't want to operate right at the limits of that external power supply, so leave some cushion between your laptop wattage estimate and the supply's rating.
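As a worked example of that procedure, here it is with made-up battery numbers (the 4800 mAh, 11.1 V, and 4-hour figures are hypothetical; substitute your own battery's label and observed run time):

```python
# Ballpark laptop power draw from its battery label.
# All three inputs are hypothetical examples, not real specs.

BATTERY_MAH = 4800      # mAh from the battery label (hypothetical)
BATTERY_VOLTS = 11.1    # typical 3-cell lithium pack voltage (hypothetical)
RUN_TIME_HOURS = 4.0    # run time on a full charge when new (hypothetical)

amp_hours = BATTERY_MAH / 1000            # mAh -> Ah
average_amps = amp_hours / RUN_TIME_HOURS # Ah / h = average current
laptop_watts = average_amps * BATTERY_VOLTS
print(f"Estimated laptop draw: {laptop_watts:.1f} W")

# Leave cushion below the supply's 65 W continuous rating;
# the 80% threshold here is an arbitrary safety margin.
SUPPLY_W = 65.0
if laptop_watts < 0.8 * SUPPLY_W:
    print("Fits with comfortable headroom")
else:
    print("Too close to the supply's limit")
```

With these example numbers the laptop draws around 13 W, well under the 65 W rating; a gaming laptop's battery would give very different figures.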
One more consideration: just as the laptop battery has a mAh rating, so does the battery inside the power supply. The supply's wattage rating only tells you how fast you can pull power from it without burning it out or tripping its protection; it doesn't tell you how long you can run. Inside the cabinet there is a lot of circuitry in addition to the battery, so the battery takes up only part of the space. Compare some fraction of the unit's size to the size of your laptop's battery: it may use a chemistry with higher energy density than your laptop's (like lithium vs. NiCd), but if it is a lot smaller, expect less run time. Also, run time isn't linear with load; you will get fewer amp-hours out of a battery drained at full load than out of one discharged gently. Finally, if you plug the laptop's AC adapter into this power supply, the adapter itself wastes some power in conversion, so the full rated power won't be available to the laptop.
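Those run-time caveats can be rolled into one rough estimate. Every number below is an assumption for illustration: the supply's internal energy (42 Wh), the laptop's draw (13.3 W), a typical switching-adapter efficiency (85%), and a crude derate for heavier discharge (90%).

```python
# Rough run-time sketch: usable battery energy divided by the power
# actually pulled from the supply. All figures are hypothetical.

SUPPLY_WH = 42.0            # internal battery energy in Wh (hypothetical)
LAPTOP_W = 13.3             # estimated laptop draw in W (hypothetical)
ADAPTER_EFFICIENCY = 0.85   # AC adapter conversion loss (assumption)
DISCHARGE_DERATE = 0.9      # capacity lost at heavier loads (rough assumption)

# Run time isn't linear with load, so derate the usable energy.
usable_wh = SUPPLY_WH * DISCHARGE_DERATE

# The AC adapter wastes power, so the supply delivers more than the laptop uses.
watts_from_supply = LAPTOP_W / ADAPTER_EFFICIENCY

hours = usable_wh / watts_from_supply
print(f"Estimated run time: {hours:.1f} h")
```

With these example figures you'd get roughly two and a half hours, and real-world results would likely be worse, which is exactly why the answer recommends leaving cushion.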