
Thanks Daniel, Tobias and Maren for your replies.
I see there is reasoning behind it, but I'm still not completely convinced. Data integrity should be guaranteed between the endpoints through a verification mechanism, not by relying on the robustness of the transmission channel (or on the absence of interference). Encrypting and decrypting can be one way to do that, but a separate checksum could be just as good, and it should always be used for distributed binaries anyway, because corruption can happen before encryption, or while saving the file to the receiving computer's hard disk, after the browser has decrypted the data. As for the point about drowning sensitive data in a sea of noise, it has a weakness: it gives attackers more ciphertext to work with, so the chance of breaking the encryption may even become higher.
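Just to illustrate what I mean by endpoint verification: something as simple as the following sketch on the receiving end would catch corruption regardless of how robust the channel is (the file name and the expected digest are placeholders, not a real release):

    # Minimal sketch: verify a downloaded binary against a published
    # SHA-256 digest. File name and digest below are made up.
    import hashlib

    EXPECTED = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"  # placeholder

    h = hashlib.sha256()
    with open("inkscape-installer.exe", "rb") as f:          # hypothetical file
        for chunk in iter(lambda: f.read(8192), b""):        # read in 8 KiB chunks
            h.update(chunk)

    print("OK" if h.hexdigest() == EXPECTED else "MISMATCH: do not run this file")

This checks the file as it actually landed on disk, so it covers corruption that happens before encryption or after decryption, which the channel alone cannot.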
Of course, if today's resources make the overhead of HTTPS negligible, one could say: why not? Even if it proved useless, the wasted effort would be minimal. That's a legitimate view, but it's not exactly the way I think.
In any case, I see this is quite a recent topic that's being discussed a lot all around. I'm not an expert, so I think I'll sit back and watch how it develops.
Luca