ASCII and UNICODE in Java | Core Java Tutorials

This Core Java tutorial, “ASCII and UNICODE in Java”, clearly describes the difference between the ASCII character system and the UNICODE character system in Java.


 

ASCII vs. UNICODE:

  • C & C++ languages need to represent only one language character system at a time.
  • These languages used to develop only standalone applications.
  • ASCII can represent all the symbols of one language using 1 byte
  • Hence C & C++ character data type occupies 1 byte memory.

 

  • In the following diagram, the standalone application (OS) supports “n” languages, but it has to represent only one language at a time.
  • For example, on a mobile phone we can select only one language at a time (via a radio button), so a mobile OS is also standalone in this sense.
  • To represent one language's character set completely, 1 byte (256 values) is enough.

Diagram: ASCII represents one language

 

  • Java & .net applications need to represent more than one language character system at a time.
  • It can be accessed from number of systems at a time, hence they may use different languages from different machines.
  • Hence to represent more than one language character set, we need to use UNICODE character system
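
For example (a minimal sketch; the particular code points are arbitrary illustrations), Java's char is a 16-bit UNICODE unit, so one char can hold a letter from practically any language:

public class UnicodeDemo {
    public static void main(String[] args) {
        char english = 'A';      // Latin letter
        char greek = '\u03A9';   // Greek capital omega
        char hindi = '\u0915';   // Devanagari letter KA
        // Character.SIZE confirms a Java char is 16 bits (2 bytes), unlike the 1-byte char of C/C++
        System.out.println("Bits per char: " + Character.SIZE);   // prints 16
        System.out.println(english + " " + greek + " " + hindi);  // prints: A Ω क (console permitting)
    }
}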

 

In this diagram, a web application needs to represent more than one language's character set, because the web application can be run from different machines at the same time in different languages.

Diagram: UNICODE in Java
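
As a closing illustration (a hedged sketch; the sample text is arbitrary), a single Java String can carry several language character sets at once, because every character in it is UNICODE:

public class MultiLanguageDemo {
    public static void main(String[] args) {
        // English, Hindi and Japanese text living together in one String
        String greeting = "Hello, \u0928\u092E\u0938\u094D\u0924\u0947, \u3053\u3093\u306B\u3061\u306F";
        System.out.println(greeting); // prints: Hello, नमस्ते, こんにちは (console permitting)
    }
}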
