Try it and see if YOU notice the difference.
The opinion here is irrelevant.
@hornetstinger is correct. In absolute terms there's no arguing that standard Bluetooth compression mangles the audio signal.
What you're talking about is whether the listener notices that difference. In a sense then, you're both right; it's just that one view is objective and the other subjective.
There's something more to this question, and so far (by post #7, as I write this) no-one has mentioned it. TTBOMK, Bluetooth doesn't support Dolby Digital 5.1 but optical does. That means whether or not the user notices the additional compression, the audio going into the sound bar will only ever be stereo at most, and so any surround effect that the sound bar can muster up will only ever be ProLogic/PLII (at best) or some pseudo-surround fudge-up if Vizio choose not to licence the DPL decoders.
In subjective terms, one could rightly argue that the user can't hear the difference because the sound bar is a stereo-only device, doesn't really do surround, or doesn't produce audio of high enough resolution to show the difference. In objective terms though, there's no avoiding the fact that optical supports DD and DTS multichannel bitstream audio whereas BT does not, or at least as far as I'm aware.