If any baud rate is potentially acceptable (assuming some means of programming it in, or autodetection), then simply putting enough filtering on both ends of the line will do. Cable type is irrelevant: multiconductor, twisted, or shielded.
More specifically, for a 3 m coupling length, noise frequencies up to about c0/(10·l), or 7 MHz or so (note that the speed of light c0 is slower in the dielectric insulating the cables), couple predominantly as lumped equivalent capacitive and inductive coupling between the power and data lines, with the coupling factor increasing roughly in proportion to frequency up to this point (for higher frequencies, see the following paragraph). For a 3 or 10 V signal level (typical outputs from these interfaces using modern ICs), even if the load is very badly behaved (tens of volts of ripple/noise), if that noise is dominant at frequencies a fraction of this (say, below 1 MHz), the interface will likely still read correctly, even if the baud rate is comparable to the noise frequency. If the noise is dominant at higher frequencies (tens of volts at >1 MHz), filtering will be required, and the maximum bitrate will be somewhat limited as a result.
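As a sanity check on that threshold, here's a quick back-of-envelope calculation; the velocity factor of 0.66 is an assumed typical value for PE/PVC-insulated cable, not a measured figure:

```python
# Estimate the frequency below which a 3 m coupled run behaves as a
# lumped (capacitive/inductive) coupler rather than a transmission line.
# Rule of thumb: lumped behavior holds while the run is shorter than
# ~1/10 wavelength, i.e. f < v / (10 * l).

C0 = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66   # assumed for a typical cable dielectric
LENGTH = 3.0             # coupled run length, m

v = C0 * VELOCITY_FACTOR          # propagation speed in the cable, m/s
f_lumped_max = v / (10 * LENGTH)  # upper edge of the lumped regime, Hz

print(f"{f_lumped_max / 1e6:.1f} MHz")  # -> 6.6 MHz, i.e. "7 MHz or so"
```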
For noise frequencies higher than 7 MHz, transmission line effects become increasingly dominant, until eventually the coupling factors turn over and the energy from a whole line can couple into another (directional coupling effects), plus resonant effects, CM/DM mode conversion, etc. Shielding would be mandatory up here.
Old serial protocols aren't generally practical at such rates anyway (RS-232 itself is lucky to reach the hundreds of kbaud), so the transmission-line regime likely isn't important. Frequencies that high should simply be filtered out, to avoid potential analog interface effects (such as input clamp/ESD diode conduction or input-stage rectification).
For noise amplitudes less than a fraction of the signal level (maybe 1 V or so, max), both interfaces are fine against noise of any frequency.
These figures are with respect to voltages measured at the cable and transceivers, by whatever suitable method can do so without disturbing the values. (Not the most trivial thing, as surrounding water will disturb the measurement too, so ideally this would be measured while immersed.) There could always be resonances or transformation effects in a pathologically designed system, but assuming the source end is well behaved (damped supply impedance, filtered comms), measurements should not differ much with position (at the transceiver(s), or anywhere along the cable/connectors).
Overall, some filtering is recommended, and a maximum baud rate of perhaps 50-100kbps is likely feasible.
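To put a rough number on "some filtering": a simple first-order RC low-pass on each data line, with a corner well above the baud rate but well below the MHz-range noise, is one option. A sketch, with component values that are illustrative assumptions rather than a design:

```python
import math

# First-order RC low-pass on the data line: pass a 100 kbps signal
# (fundamental ~50 kHz for alternating bits) while attenuating noise
# in the MHz range. Values are illustrative assumptions, not a design.
R = 330.0    # ohms, series resistor (small vs. receiver input impedance)
C = 1.0e-9   # farads, shunt capacitor at the receiver

f_c = 1 / (2 * math.pi * R * C)   # -3 dB cutoff
print(f"cutoff ~{f_c / 1e3:.0f} kHz")   # -> ~482 kHz

# Attenuation of a 7 MHz noise tone, in dB, for a first-order low-pass:
f_noise = 7e6
atten_db = 10 * math.log10(1 + (f_noise / f_c) ** 2)
print(f"~{atten_db:.0f} dB down at 7 MHz")   # -> ~23 dB
```

A ~480 kHz corner leaves a 50-100 kbps signal essentially untouched while knocking the worst of the HF noise down by an order of magnitude or more.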
It's noteworthy that RS-232 is more vulnerable to capacitive coupling, due to its high output impedance (a fraction of a kΩ to single kΩ); more or less for this reason, it compensates by using a higher signal level (nominally up to 15 V, but most transmitters these days only manage 6-10 V). RS-485, with matched-impedance terminations, is about even between inductive and capacitive coupling strengths; its differential connection, with a -7/+12 V minimum common-mode range, affords significant immunity.
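The impedance argument can be made concrete with a lumped capacitive-divider estimate. The ~50 pF/m of mutual capacitance between adjacent conductors is an assumed typical figure for unshielded multicore cable, and the victim impedances are nominal values, but the contrast is the point:

```python
import math

# Fraction of aggressor voltage coupled through mutual capacitance C_m
# into a victim line of impedance Z, as a capacitive divider:
# |Z / (Z + 1/(j*2*pi*f*C_m))|.  Illustrative assumed values only.
C_PER_M = 50e-12   # mutual capacitance, F/m (assumed for multicore cable)
LENGTH = 3.0       # m
F_NOISE = 1e6      # Hz, noise frequency

C_m = C_PER_M * LENGTH

def coupled_fraction(z_victim):
    z_cap = 1 / (2 * math.pi * F_NOISE * C_m)  # |1/(j*w*C_m)|, ~1.06 kOhm
    return z_victim / math.hypot(z_victim, z_cap)

# RS-232: receiver input ~5 kOhm (standard allows 3-7 kOhm).
# RS-485: terminated line, ~60 Ohm seen at any point (120 Ohm each end).
for name, z in [("RS-232", 5e3), ("RS-485", 60.0)]:
    print(f"{name}: {coupled_fraction(z) * 100:.1f}% of noise voltage")
```

Under these assumptions, the high-impedance RS-232 input passes nearly all of a 1 MHz noise tone, while the terminated RS-485 line sees only a few percent; the RS-485 figure is further reduced in practice because the coupling lands mostly as common mode on the differential pair.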
If twisted pairs are used (one for data, one for power), the inductive loop area between power and signal is significantly reduced for RS-485. The twist between the two signal conductors would worsen crosstalk for RS-232, but again, it's highly unlikely this would be a problem. There will also be a modest reduction in capacitance between the power and data pairs, which RS-232 may benefit from.
Both coupling factors (capacitive and inductive) are more or less worst-case for plain multiconductor cable, so it should be avoided if possible. Its effect would be to degrade baud rate by requiring additional filtering to ensure signal quality, but only modestly, probably by a factor of less than 2.
If shielding is an option, then individually shielded pairs could be used (at least the data pair), with the data pair's ground being tied to circuit ground, at both ends, as close to the connectors as possible. (Ideally this would be made on a dedicated pin, so that the signals can be received differentially with respect to that ground, at both ends. But I gather this isn't a full-system design scope, so the next best will have to do.) This would allow higher data rates and smaller signal levels; USB might be feasible, for example -- probably not High Speed or above, without the dedicated ground pin, but Full Speed or below would be fine.
Presumably, a badly-behaved device would contain sufficient filtering internally to deal with itself? Maybe it's possible that something could be so bad that not only can it not function by itself, but it knocks out other onboard functions as well; I have no idea.
This also assumes a badly-behaved device can be discarded otherwise harmlessly if it's found to cause malfunction when use is attempted -- malfunction of itself, of other devices, or of the whole system entirely. Presumably including during a mission. I certainly wouldn't want to get stranded at depth; I gather this is an ROV, not piloted, but if it's maneuvered into some inconvenient place, tugging on the umbilical might not be enough to extract it, and I'm guessing that would mean a significant capital loss plus a scrubbed mission. As an engineer, I would want to see far more consideration of the equipment involved here; it should start with a high-level scope, such as risk and cost of failure versus the effort required to identify further risk factors (such as this one).