
I've searched all over for this, and all I can find are webfonts. I don't want a webfont — I mean a font that I can install on my PC so I can see emoji, instead of having to go to a site like iemoji.com to translate them.

I'm running Windows Vista so maybe that's why?

9 Guy

1 Answer


I'm going to keep this vague so that I don't say something that's wrong.

At the lowest level, your computer doesn't know what letters are. It only knows numbers. Because of this, it pretends that certain numbers are certain letters — simple enough. However, there are many different ways to do this, and lots of people have tried making their own mappings. ASCII and Unicode are two very popular ones, and you can look up exactly which number means which letter or character in those standards. ASCII is an 8-bit system, meaning it can recognize up to 256 unique characters. I believe Unicode can recognize up to 2147483647.

Essentially, your computer looks at characters using a numbering system that doesn't include emoji. While it might seem easy to get one that does, the particular system that your computer uses doesn't even have enough slots to fit all the emoji you would want to use!
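As a rough sketch of the numbering idea above (using Python 3 here purely for illustration — the same mapping exists in any language):

```python
# Every character is stored as a number (its "code point").
print(ord('A'))           # 65 -- fits comfortably in ASCII's range
print(ord('é'))           # 233 -- still fits in a single byte
print(ord('\U0001F600'))  # 128512 -- the grinning-face emoji, far beyond 256
print(chr(65))            # 'A' -- and numbers map back to characters
```

The emoji's number is way past anything an 8-bit, 256-slot scheme could hold, which is the "not enough slots" problem described above.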

So, there's no way you can add emoji to Windows at a system level. You would have to use applications that recognize them only within the scope of that application (which is what I presume a webfont is?).

Nealon
  • Actually, ASCII is a 7-bit code, allowing up to 128 characters, some of which are non-printable. – Ron Maupin Apr 19 '16 at 01:51
  • @RonMaupin ASCII is 8-bit. Only the first 128 characters are standardized. – Nealon Apr 19 '16 at 01:56
  • No, ASCII is 7-bit, and it always has been. Extensions, such as those by the ISO/IEC provide an 8-bit, character-compatible alternative to ASCII, but ASCII itself is, and always has been, a 7-bit code. – Ron Maupin Apr 19 '16 at 01:59
  • @RonMaupin Looks like I was misinformed. ASCII is technically 7-bit. But every notable use of it uses it as a 8-bit system, and an ASCII character is stored in 8 bits. – Nealon Apr 19 '16 at 02:02
  • Old people, like me, remember these things. I remember telecommunications with ASCII. You could transfer faster if you used ASCII with only seven bits. Adding start and stop bits added a significant percentage to the slow transfers. – Ron Maupin Apr 19 '16 at 02:06
  • @RonMaupin That helps explain why the last 128 characters are so non-standard. Very interesting. – Nealon Apr 19 '16 at 02:15
  • so, the only solution is a new pc? – 9 Guy Apr 19 '16 at 10:45
  • @9Guy No! The solution is a new operating system. However, I'm not aware of any desktop operating systems that support emoji. – Nealon Apr 19 '16 at 18:11
  • @Nealon I know windows 7+ support emoji, and Windows 10 allows you to enter them in yourself. – 9 Guy Apr 21 '16 at 13:05
  • And although one encoding (UCS-4) could go to 4294967295, Unicode is defined to never exceed the maximum set by UTF-16 which is 1114111 -- and currently includes 1624 emoji codepoints, see http://unicode.org/emoji/charts/full-emoji-list.html (WARNING: BIG!) But the issue is having a font (or fonts) with a glyph at the (well-defined) codepoint. Webfont is a font downloaded from the web so you don't need it already installed on your system, and could among other things be a font that includes emoji glyphs. – dave_thompson_085 Apr 21 '16 at 15:24
  • @dave_thompson_085 how does one install a webfont? – 9 Guy Apr 26 '16 at 23:02
  • @9Guy On the client/browser you don't. On the server I expect it depends on the server but I haven't looked in any detail. – dave_thompson_085 Apr 28 '16 at 03:48
  • Accepted this answer 4 years later because to be honest at the time of you making this post, I had 0 idea what you and the commenters were talking about. Looking back you were totally correct – 9 Guy Nov 13 '20 at 16:21
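The numbers from the comment thread can be checked directly. A small sketch (again Python 3, just for illustration) of the UTF-16-imposed ceiling dave_thompson_085 mentions, and why emoji still fit under it:

```python
# Unicode's upper limit is fixed by UTF-16 at U+10FFFF.
MAX_CODEPOINT = 0x10FFFF
print(MAX_CODEPOINT)               # 1114111, the number from the comment

grinning = '\U0001F600'            # an emoji code point, U+1F600
print(ord(grinning) <= MAX_CODEPOINT)  # True -- emoji fit within Unicode

# In UTF-16, code points above U+FFFF need a surrogate pair:
# two 16-bit units (4 bytes) instead of one.
print(len(grinning.encode('utf-16-le')) // 2)  # 2
```

So the problem was never that emoji don't fit in Unicode — they do — but, as the comment says, that the system needs a font with glyphs at those (well-defined) code points.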