C# Tutorial: The char Data Type

C# uses a 16-bit character type, char, whose values are Unicode characters (UTF-16 code units).
Unicode defines a character set that is large enough to represent all of the characters found in all human languages.
There is no implicit conversion from an integer type to char; an explicit cast is required.
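For example, assigning an int to a char compiles only with an explicit cast. A minimal sketch (the class name CastDemo is illustrative):

using System;
class CastDemo
{
  static void Main(string[] args)
  {
    int code = 66;
    // char ch = code;     // compile-time error: no implicit int-to-char conversion
    char ch = (char)code;  // explicit cast is required
    Console.WriteLine(ch); // B
  }
}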
Char supplies methods that allow you to process and categorize characters.
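For instance, the static methods Char.IsDigit, Char.IsLetter, and Char.ToUpper categorize and transform individual characters. A short sketch (the class name is illustrative):

using System;
class CharMethodsDemo
{
  static void Main(string[] args)
  {
    char ch = '7';
    Console.WriteLine(Char.IsDigit(ch));  // True
    Console.WriteLine(Char.IsLetter(ch)); // False
    Console.WriteLine(Char.ToUpper('a')); // A
  }
}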
Char defines the following fields:
public const char MaxValue
public const char MinValue
These represent the largest and smallest values that a char variable can hold.
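Casting these fields to int reveals the numeric range of char. A minimal sketch (the class name is illustrative):

using System;
class CharRangeDemo
{
  static void Main(string[] args)
  {
    // char ranges over the UTF-16 code units 0 through 65535
    Console.WriteLine((int)Char.MinValue); // 0
    Console.WriteLine((int)Char.MaxValue); // 65535
  }
}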
Char implements the following interfaces: IComparable and IConvertible.
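IComparable supplies CompareTo for ordering characters, and IConvertible backs the conversions exposed through the Convert class. A short sketch (the class name is illustrative):

using System;
class CharInterfacesDemo
{
  static void Main(string[] args)
  {
    char ch = 'B';
    Console.WriteLine(ch.CompareTo('A') > 0); // True: 'B' sorts after 'A'
    Console.WriteLine(Convert.ToInt32(ch));   // 66
  }
}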

The following program declares a char variable, assigns it the character literal 'A', and writes it to the console:

using System;
class MainClass
{
  static void Main(string[] args)
  {
    // Initialize a char with a character literal and print it
    char myChar = 'A';
    Console.WriteLine(myChar);
  }
}
Output:
A