A useless way to (in)elegantly (not) hide text

by EEVa

I assume that when programming languages are another world to you, seeing code may look impressive; for instance, I could write \x57\x6f\x77\x2c\x20\x31\x33\x33\x37\x20\x75\x73\x65\x64\x20\x74\x6f\x20\x72\x6f\x63\x6b\x20\xc2\xb7\x2f\x3f\x23 and that wouldn’t mean anything to you.

But it is merely a string written as hexadecimal escape sequences, one \xNN escape per byte of its UTF-8 encoding! (see the article on Wikipedia)

A simple way to transform a string back and forth to these hexadecimal escapes follows (in Python 3; note that we encode to UTF-8 bytes first, so that non-ASCII characters like · come out right, and decode the bytes back at the end):
>>> t = ""
>>> for byte in "Wow, 1337 used to rock ·/?#".encode('utf-8'):
...     t += '\\x{:02x}'.format(byte)
...
>>> print(t)
\x57\x6f\x77\x2c\x20\x31\x33\x33\x37\x20\x75\x73\x65\x64\x20\x74\x6f\x20\x72\x6f\x63\x6b\x20\xc2\xb7\x2f\x3f\x23
>>> print(b"\x57\x6f\x77\x2c\x20\x31\x33\x33\x37\x20\x75\x73\x65\x64\x20\x74\x6f\x20\x72\x6f\x63\x6b\x20\xc2\xb7\x2f\x3f\x23".decode('utf-8'))
Wow, 1337 used to rock ·/?#

And yes, it could be more elegant.
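For instance, the round trip above can be sketched as a pair of small functions (a minimal sketch; the helper names to_escapes and from_escapes are my own, not from any library):

```python
# Encode a string into \xNN escape sequences, one per UTF-8 byte.
def to_escapes(s):
    return ''.join('\\x{:02x}'.format(b) for b in s.encode('utf-8'))

# Decode it back: parse each NN hex pair into a byte, then decode as UTF-8.
def from_escapes(esc):
    return bytes(int(h, 16) for h in esc.split('\\x')[1:]).decode('utf-8')

text = "Wow, 1337 used to rock ·/?#"
escaped = to_escapes(text)
print(escaped)                # the \x57\x6f\x77... blob from above
print(from_escapes(escaped))  # back to the readable string
```

Note that · becomes two escapes (\xc2\xb7) because its UTF-8 encoding takes two bytes, which is why the byte-level round trip matters.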