
I would like to know how to convert decimal values to Unicode characters and Unicode characters back to decimal values, as I am interested in cryptography.

Here's an example: unicode($) -> decimal(36)

Thank you

  • The functionality is implemented in System.Text.Encoding class and its descendants: https://docs.microsoft.com/en-us/dotnet/api/system.text.encoding?view=netframework-4.8 – rs232 Jun 04 '19 at 08:43
  • It sounds like either I haven't understood your question or you have a fundamental misunderstanding - especially given that this question is a paraphrased version of your previous one, which was already answered. Basically, there is no difference between a Unicode character and a decimal value in the internal bit representation of the value. It's nothing more than how you display it. If you treat it as an int, you display it as an int. If you treat it as a char, you display it as a char. You will need to increase your knowledge of this considerably before you can even begin doing cryptography. – DodgyCodeException Jun 04 '19 at 08:47
  • I am not a genius at coding, and yes, I do understand that, but I am trying to ask for code that changes the way it is displayed; sorry about the confusion. – RanOutOfQuestions Jun 04 '19 at 08:52
  • OK, so what is wrong with the answers to [your previous question](https://stackoverflow.com/q/55820939/1016716) that asked the same thing? – Mr Lister Jun 04 '19 at 11:58
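To illustrate the point made in the comment above: the character and its code point are the same underlying value, and only the type you display it as changes the output. A minimal C# sketch of that idea (assuming a console context with using System;):

char dollar = '$';
Console.WriteLine((int)dollar);  // 36 - the same value displayed as a base-10 integer
Console.WriteLine(dollar);       // $  - the same value displayed as a character
Console.WriteLine((char)36);     // $  - an integer treated as a character again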

1 Answer


A decimal can be cast to a char and back:

decimal d = 0x2C6F;
char c = (char)d; // c is now U+2C6F, an upside-down capital A: Ɐ


char cc = '$'; 
decimal dd = (decimal)cc; //dd is now: 36
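To see the output, the snippet can be dropped into a small console program; the wrapper below is only an illustrative sketch, and printing the Ɐ correctly depends on the console's font and encoding:

using System;

class Program
{
    static void Main()
    {
        decimal d = 0x2C6F;        // 11375 in base 10
        char c = (char)d;          // U+2C6F, an upside-down capital A: Ɐ
        Console.WriteLine(c);      // prints Ɐ (if the console can render it)

        char cc = '$';
        decimal dd = (decimal)cc;  // 36, the code point of '$'
        Console.WriteLine(dd);     // prints 36
    }
}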
– Caius Jard
    I suspect that by using the term "decimal", the OP means a decimal representation of a char (i.e. treat as an integral value and display it in base 10) rather than the specific C# data type called "decimal". The type casting you're doing here, by converting char to decimal, involves a non-trivial change in bit pattern. – DodgyCodeException Jun 04 '19 at 14:08
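If, as this comment suggests, the goal is the base-10 representation of a character's code point rather than the C# decimal data type, a plain int is the natural fit. A minimal sketch of the round trip (variable names are illustrative):

char c = '$';
int code = c;                           // implicit char-to-int conversion: 36
string asDecimalText = code.ToString(); // "36" - the code point as base-10 text

char back = (char)int.Parse(asDecimalText); // parse "36" and cast back to '$'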