UTF-8 was first implemented in 1992 in 'Plan 9 from Bell Labs' -- it was created by Ken Thompson (the father of Unix) and Rob Pike -- both of whom subsequently worked at Google, where they created the 'Go' programming language (Ken is basically retired now).
A formal RFC (Request for Comments) describing UTF-8 for use on the Internet was published in 1996 as RFC 2044 (hosted by the IETF -- Internet Engineering Task Force).
UTF-8 became the most commonly used encoding on the internet circa 2008 and now makes up about 80-90% of all web content.
Every operating system released in the last 20 years has supported some form of Unicode (be it UTF-16 and/or UTF-8). All of the fadas are part of the Basic Multilingual Plane in Unicode; it's not like we are dealing with the old poncanna (ḃ ċ ḋ ḟ ġ ṁ ṗ ṡ ṫ) or the ⁊ (Tironian et -- agus).
The fact is the Leap card should have been designed with full Unicode support from the very beginning. Depending on what platform they are using, it's not that hard to migrate from legacy US-ASCII to UTF-8; after all, US-ASCII (7-bit) forms the first 128 characters of the UTF-8 codespace.
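As a quick sketch in Python (a hypothetical demo, nothing to do with the Leap card's actual software), you can see that compatibility directly: a pure US-ASCII string produces identical bytes under both encodings, while a fada just becomes a multi-byte sequence.

```python
# US-ASCII is a strict subset of UTF-8: the same bytes come out.
ascii_name = "Sean"
assert ascii_name.encode("ascii") == ascii_name.encode("utf-8")

# A fada forces a two-byte UTF-8 sequence, but the surrounding
# ASCII characters are completely unchanged.
fada_name = "seán"
print(fada_name.encode("utf-8"))  # b'se\xc3\xa1n' -- 'á' is two bytes
```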
They could also use 'punycode' as a workaround; this is the system used by the global DNS (Domain Name System) to allow for IDNs (Internationalised Domain Names). Basically it's a system that converts a specifically formatted string in either direction between US-ASCII <-> UTF-8.
e.g. punycode: xn--sen-fla == seán (UTF-8)
As you can see the punycode has specific formatting (it starts with xn--).
So for example the domain éire.ie is recorded as xn--ire-9la.ie within the underlying zonefile for .ie -- software that does DNS resolution (that isn't ancient) knows to autoconvert éire.ie to xn--ire-9la.ie, and xn--ire-9la.ie back to éire.ie.
The reason for such a system is that the core DNS standards do not support non-US-ASCII characters.
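The round trip described above can be tried out with Python's built-in 'punycode' and 'idna' codecs (which implement RFC 3492 Punycode and IDNA 2003 respectively); this is just an illustration of the conversion, not how any particular resolver does it internally.

```python
# Raw punycode encoding of a single label -- note there is no
# xn-- prefix at this layer; that is added by the IDNA step.
print("seán".encode("punycode"))         # b'sen-fla'

# The 'idna' codec handles a whole domain name, encoding each
# label and prefixing xn-- only where non-ASCII was present.
print("éire.ie".encode("idna"))          # b'xn--ire-9la.ie'
print(b"xn--ire-9la.ie".decode("idna"))  # éire.ie
```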