New book – order details

Western European Languages – A Reference Guide is now available online in all markets.

“It occurs to me that there should be a copy of this book in every household.”

“An easy book to dip into as well as read more thoroughly.”

“An exploration of these languages not just as they are spoken in Europe but across the world.”

Average 5* reviews on Amazon, Google and Loot.

My new book, Western European Languages – A Reference Guide, is now available in all major markets online, including:

UK & Ireland

United States

South Africa

However, if you are in the UK, do contact me directly and I can save you the postage costs.


Is it “Yoo-“ruguay or “Oo-“ruguay…?

Uruguay has an arguably unparalleled record among men’s national football teams per head of population, and tonight is another decisive moment. Obviously, the issue to be decided is whether the BBC and ITV presenters will call them “Yoo-ruguay” or “Oo-ruguay”… what’s going on here?

Firstly, the name Uruguay comes from the Guarani language, and is in fact the name of a river (the probable origin of -guay). The country is specifically the República Oriental (the “Eastern Republic”), at the eastern end of the river. In Uruguay, Spanish has come to predominate (typically the Rioplatense variety similar to that of Buenos Aires and therefore to that of most speakers in neighbouring Argentina); this is not the case in nearby (but not neighbouring!) Paraguay, where Guarani remains in common use.

However, the issue here is the pronunciation of the first syllable in English, and that takes us right back to basics – how is each of the vowels actually pronounced? When you read “A, E, I, O and U”, how do you pronounce the last letter?

The letter u in the English language is itself pronounced with what is referred to in phonetics as a “yod”, or perhaps more obviously as a “y-glide”. In other words, the y is inherent to the pronunciation of the u.

We see this in the pronunciation of lots of words, from “cube” to “universe” to “few”. Here, the vowel sound is identical to the pronunciation of the letter u itself.

However, this becomes contested in some cases. The simplified tendency is for “dual” and “news” to retain the y-glide in British English but to lose it in American English, although this also differs between dialects (for example there is no y-glide at all in parts of East Anglia – even in words such as “few”). In other instances, the y-glide has now been lost: even in British English, maintaining it in words such as “suit” or “evolution” would be seen as archaic or characteristic of older speakers. One peculiarity here is the word “dude” which is, specifically, a borrowing into British English from American and thus does not retain the y-glide in either variety (even though typically “dune” does in British).

What on earth has this to do with Uruguay? One of the challenges for any language is what to do with imported names, and how to “nativise” them. In English, for example, the personal name Rasputin is typically nativised (so the y-glide is added), but Putin varies. With country names, the most obvious example is Cuba – there is no y-glide in Spanish (so natives of Cuba pronounce it koo-ba) but there is in all varieties of English (just as there is in “cube”).

It is this which brings us to “Uruguay” – the name is essentially borrowed into English via Spanish from Guarani, and is pronounced by natives of Uruguay itself without a y-glide. However, as with Cuba, Uruguay naturally takes the y-glide in English (just as in “universe”).

Here, consistency is key. There are some presenters who think it preferable to go without the y-glide because natives of the country itself do not – but would they say koo-ba for “Cuba”? Additionally, there is another issue with “Uruguay”, namely that Spanish speakers stress the final syllable rather than the first one – are they also doing that? Indeed, do they say Deutschland for “Germany”?

My own sense is that the correct English pronunciation (“correct” meaning both normal and consistent) is Yoo-ruguay. But we will see who wins later on…

Letter to Irish Times

My letter in the Irish Times this morning:

I would have to take issue with Rory Montgomery’s contention (26/11) that reforming and reinforcing the Agreement so that Northern Ireland’s institutions of government actually work constitutes “meddling” with the Agreement. On the contrary, in time of crisis in our economy and our health service and at a time when many of the basic assumptions which applied in 1998 apply no longer, it constitutes an urgent necessity.

Firstly, there is a tendency implicit in Mr Montgomery’s argument that the institutions are some sort of “optional extra”, whose operation may at any time reasonably be stopped provided it is by a party which is “Unionist” or “Nationalist”. On the contrary, the devolved institutions are Northern Ireland’s government – when they are brought down, Northern Ireland ceases to be governed. At best, this vacuum is filled by attempts at administration to paper over the cracks while leaving vital issues such as health reform and economic reinvigoration flailing; at worst, it is filled with the dark shadows of extremism as “politics” shifts from the debating chamber to the street. Such a vacuum may be theoretically justified, but it is practically unacceptable and it causes real harm to the public.

Secondly, things have changed. Northern Ireland has seen, no doubt partly as a consequence of Brexit moving the UK and Ireland apart but also because it is now inhabited mainly by a post-Agreement generation, a fundamental political re-alignment. It is unjustifiable, even in theory, to suggest that a “Unionist” or “Nationalist” party should be allowed to exercise a veto on the very operation of government in Northern Ireland, while the greatly increasing number who have disavowed Unionism or Nationalism at the ballot box are left utterly disenfranchised – subject to the whims of others’ vetoes but unable to wield their own. This failure to represent those who vote beyond the traditional dividing lines was a failing even of the original Agreement and was only made worse at St Andrews, but to ignore it now, when fully 20% of Assembly members are of the “other” designation, is fundamentally anti-democratic.

Thirdly, it should be noted that the Alliance Party, which was as insistent on reform of the institutions when it had six seats as it is now with 17, has put forward a perfectly reasonable reform option which constitutes not “meddling” but actually “reinforcing”. Exceptionally (by the standards of normal democracies), the DUP, purely by virtue of being the largest party within its designation and without being the largest party overall, would still on the basis of the last election be automatically entitled to Ministerial posts (including the joint highest). The difference is that should it choose not to take on the responsibility of government it would simply move into opposition, rather than being allowed to block entirely the operation of the institutions and of democracy itself.

Ultimately, the purpose of the Agreement was to enable Northern Ireland to be governed purely democratically, while reflecting its divisions. Those divisions are no longer what they were, and that should be regarded as a triumph to be celebrated, not a problem to be overcome.

25 years on, is it really too much to ask that the Agreement be reformed so that those who wish to shirk their responsibilities should no longer be rewarded for doing so, and so that the people of Northern Ireland are given the same basic right of democratic representation and scrutiny taken for granted on the rest of the island?

What’s going on with the laws of football?

I have written before on the offside law and on a comparison of the laws of football (which I once refereed, albeit at the lowest possible level) with the rules of hockey. Recent events in the Champions League and the (otherwise much tainted) World Cup have brought some of these issues back into focus. Four in particular are worthy of review.


Offside

The offside law is, of course, widely misunderstood (and, ahem, just as often by men). In practice, there are three separate aspects to it – the position, the assessment (i.e. the moment at which it is assessed), and the sanction (i.e. whether or not an indirect free-kick is awarded).

The offside position, as the average bloke down the pub probably couldn’t actually tell you, applies when a player is farther forward than the halfway line, the ball and the second-last defender. This latter phrasing is important – it is not the last outfielder, but rather the second-last player of any kind, goalkeeper included (a point which was relevant to the controversial VAR decision in the opening game of the World Cup between Qatar and Ecuador). Being “farther forward” is assessed by any part of the body which may be used by an outfielder to play the ball (i.e. hands and arms do not count, even for the goalkeeper).
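As a rough illustration only, the position test on its own can be sketched in a few lines of Python – the one-dimensional coordinates, the function name and the parameter names are all my own invention for the sketch, not anything taken from the Laws of the Game:

```python
def in_offside_position(attacker_x, ball_x, defender_xs, pitch_length=105.0):
    """Offside *position* test only – not assessment or sanction.

    Positions are metres from the defending team's goal line, so the
    attacker moves towards 0. The 1-D simplification and all names are
    illustrative, not taken from the Laws of the Game.
    """
    halfway = pitch_length / 2
    if attacker_x >= halfway:   # still in own half: never in offside position
        return False
    if attacker_x >= ball_x:    # level with or behind the ball: onside
        return False
    # Must be nearer the goal line than the second-last opponent of ANY
    # kind (the goalkeeper usually counts as one of the two).
    second_last = sorted(defender_xs)[1]
    return attacker_x < second_last
```

Note that being level – with the ball or with the second-last opponent – counts as onside, which is why the strict inequalities matter.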

The offside assessment takes place when the ball is played by an attacker. This does not have to be intentional, nor forward. Again, this latter point was relevant to the overturning of a late goal in the Champions League group game between Tottenham Hotspur and Sporting Club. Additionally, by interpretation, the whole assessment is abandoned when a defender plays the ball intentionally, other than for a “save” (which may in fact be a block by any defender, not just the goalkeeper) – once a defender has intentionally played the ball, opponents are no longer assessed for offside (i.e. whether they are in an offside position or not becomes irrelevant).

The offside sanction applies to a player assessed to be in an offside position and “gaining or seeking to gain an advantage” (in older parlance, “interfering with play”). A whole raft of interpretations is placed on this, including playing the ball, attempting to play the ball, stopping an opponent from playing the ball, obstructing, and so on.

Both the infamous VAR reversals at the Tottenham-Sporting and Qatar-Ecuador games were correct under the current law and interpretation. The problem was that they did not “feel” right in either case – in one instance, the Tottenham player had played the ball backwards but because it was then deflected forward by a defender, this became relevant to the assessment; in the other, the Ecuador player was assessed to be in an offside position but was well onside upon playing the ball. There are questions here about the offside law itself, rather than about VAR which was only doing what was asked of it. There is an argument that position should mean wholly in front of the halfway line, the ball and the second last defender (or perhaps even specifically the last outfield defender, although such a law would be difficult to write) – i.e. that if any part of the attacker’s body is level with any part of the defender’s body, the attacker is deemed to be in an onside position. There is an argument that assessment should only apply when the ball is specifically played forward by an attacker. And there is certainly an argument that the interpretation of “gaining an advantage” requires some further work – when I refereed, it was in fact playing (or attempting to play) the ball, impeding an opponent, or standing in line with the goal (as assessed as a triangle from either goalpost), and it strikes me that there was not much wrong with that.

Hockey got around all of this simply by abandoning offside, having first confined it to the attacking quarter of the field (something which has also been trialled in various football leagues). I think abandoning offside has worked well for hockey but would not work so well for football, where it is easier to play the ball long and in the air.


Handball

To put my own cards on the table, handball is where football has gone most wrong laws-wise. Interestingly, a Tottenham Hotspur Champions League match is involved again.

In an epic quarter-final in 2019, one of Tottenham’s goals in a 4-3 defeat at Manchester City (enough to win the overall tie on away goals) went in off an arm. This was the final straw for the International Football Association Board, which did not want goals scored in such a way.

Even if we accept this was a “problem” (and, well, was it?), the “solution” was surely overkill. Now, handball applies if intentional, or even if unintentional where it leads to a goal, a goal-scoring opportunity or a change of possession. This was seen as unfair on attackers, who could be penalised for accidental contact for which defenders would not be, and in effect it has been evened up by forcing defenders to run around like convicts with their hands behind their backs. An example of a penalty given on this basis was the one awarded to Canada against Belgium when a convict dared to release an arm to turn in the air and the ball struck it… it is a bit mad, frankly.

Here, hockey offers a much more obvious solution. You can simply deem that a goal can only be scored if the ball goes in off a defender or off a part of an attacker’s body with which they may legitimately intentionally play the ball (i.e. not the arm or hand). If it goes in off an attacker’s hand or arm, a goal-kick is simply awarded (just as is currently the case if the ball goes straight in from a throw-in, a dropped ball or an indirect free-kick). This is similar to hockey, where a shot from outside the circle (hockey’s penalty area) – or even from inside it at a penalty corner, if the ball has not travelled outside the circle since the corner was taken – cannot result in a goal; it is not uncommon even for keepers simply to let the ball pass into the goal, enabling the hit-out (the equivalent of a goal-kick). This would return the handball offence to being for intentional handball only, and remove the sense of having to even things up by treating defenders in their own penalty area like convicts…

Match timing

Arsenal’s epic late title win in 1989 is easily misjudged in retrospect, because not only was the winning goal scored in injury time, but there really wasn’t very much injury time (on that occasion two minutes were played, as was fairly typical at the time). 23 years later, when Manchester City also pulled off a late turnaround, they scored twice in injury time but had longer to do it (five minutes; again, as had become typical by then). Now, ten minutes or more is becoming, if not typical, far from unusual.

This is tied to a determination from the authorities that the ball should be in play for longer, and indeed in the professional game it was already in play for longer in 2012 than in 1989. The issue, however, is that the game itself now lasts a lot longer – playing havoc with TV schedules and in some ways detracting from the drama of “injury time” goals (which are now not much rarer than a goal in the last ten minutes was 30 or 40 years ago).

Here, on balance, the authorities probably have it right. Hockey times precisely – time is stopped for specific events (these vary depending on the level and the competition, but always include a penalty stroke and, almost always now, a card) and when time is up it is up (even if someone has just unleashed a shot or is about to put the ball into an empty net). This has its merits, but football’s tendency to end time when the ball is in midfield or out of play is probably more appropriate (at least for football).

This does bring us to a further point, however…

Rolling substitutions

The decision by the Iranian goalkeeper to remain on the field against England after suffering an evidently serious concussion was worrying. Concussion is not something, in any sport, which can be left to the player, nor even to the player’s team. The football authorities at the top level have tried to intervene by seeking independent medical advice based on TV pictures, but this is still not binding.

In sports such as hockey, any blow with even the potential to cause concussion requires a departure from the field. Of course, hockey has the advantage that it has rolling substitutes, so the team is not at the disadvantage of playing with one fewer during any period where a concussion is assessed; this would not currently be the case in football. (Formally, therefore, the game of hockey is now played between two teams of up to 16, or 18 in some circumstances, with 11 permitted on-field at any given time.)

At least, it would not be the case at top level. In fact, the laws of the game already permit rolling substitutes, but these have not been put into force in much of the professional game.

The route to rolling substitutes in hockey was not smooth, however. Initially, they resulted in the development of penalty corner specialists, who entered play only at attacking penalty corners and whose only role in the game was often to attempt to score from them. Subsequently, the rules changed to bar substitutions of outfielders by either team between the award and completion (carefully defined) of a penalty corner. In football, it is hard to know where this law of unintended consequences would fall – we can be fairly sure that few would regard “specialists” coming on only for corners or attacking free-kicks as within the spirit of the game.

Nevertheless, rolling substitutes – whether generally permitted or specific to cases of suspected concussion – are surely part of the future of the game. For player health reasons, some means of moving in this direction should be found soon.


I am not convinced huge changes are needed to the laws of football, but changes to interpretation would surely help.

Football is the world game because of its beguiling simplicity – those who make and guide its laws should be informed by that basic premise. Offside should be assessed when the ball is played by an attacker, handball should apply only if the arm has been placed intentionally to play or potentially block the ball, and rolling substitutes should become the norm at least in cases of potential concussion.

Most laws have moved in the direction of simplification in past decades. For example, a goal may now be scored directly from any kicked restart except a dropped ball or indirect free-kick, and any restart entering the player’s own goal directly results in a corner – this used to be far more complex. Kick-offs have been simplified; laws around defensive walls usefully clarified; and so on. The back-pass law was also in fact a simplification, ultimately allowing for easier comprehension of the laws around the goalkeeper’s “possession” of the ball (which used to be subject to all sorts of niggly rules, such as the “four step” rule, now thankfully long forgotten).

So football will probably get there. It would be good if it were sooner rather than later!

One thing you don’t go into refereeing or umpiring for is money; it is a vocation and, particularly with hockey, it remains an amateur pursuit rarely fully compensated. This is the way it should be – officials should do it for the love of the game. And so it is with pieces such as this one, put out there for information and debate. That is the way it will always be. However, if you particularly enjoy my pieces on officiating, feel free to buy a coffee to keep me more awake… plenty of players and coaches will tell you I need it…

Christmas Stocking Filler

My book, Western European Languages – A Reference Guide (link will provide purchase options in various markets), is the ideal Christmas stocking filler for the linguist or even the social historian in your life.

The aim of the book was to provide a quick reference for anyone visiting a place where a particular Western European language was spoken, and I hope it accomplishes that.

Feedback suggests, however, that most buyers have particularly enjoyed the historical aspects, as the book demonstrates, via sections on Latin (both Classical and Late), Gothic and Middle English, where our modern languages came from and (in terms, at least, of a basic overview) how they ended up as they are. As a result, even those with merely a passing interest in modern languages have found they enjoyed some of the historical points – a nice, if unintended, consequence of the project!

(I don’t believe in “Black Friday”, but the paperback price has always been frozen and the hardback just £1 extra.)

Also, if you like what I write on languages on this blog, it will always be free as it is a labour of love. However, if you feel like buying me a coffee (or maybe even a mince pie) for my efforts, you can do so here (which also provides another buying option for the book and for a future edition, likely 2024).

“Did they be…” and Gaelic substrates…

Someone I was speaking to the other day started a sentence with “Did they be…” before stopping to check, albeit of course light-heartedly, whether such a construction was in fact grammatical.

It is an incredibly simple question, yet the answer is surprisingly complex.

Prescription versus description

Firstly, we need to address the concept of “prescriptive” versus “descriptive” grammar; the former lays down “rules” that are to be obeyed in speech and writing if it is to be deemed “educated” or “formal”. The latter is a description of how people actually speak, which may be quite different from the prescriptive “rules” but will still invariably have its own rules.

For example, in colloquial Belfast English you may hear “Me and him is friends”. A prescriptive grammar would argue this “should” be “He and I are friends”, with the subject pronoun forms and plural verb form; a descriptive grammar would note that it is common in Belfast English to use object pronouns even as the subject of the sentence provided they do not appear alone (so you can have “me and him” as a subject but not “me” alone or “him” alone) and to use the third person singular verb form with multiple subjects (so again, you can have “me and him is” but not “I is” or “we is”). An important aspect of this is that the Belfast English phrase is not ungrammatical – it does follow grammatical rules. However, it is non-standard (in that it breaches the prescriptive “rules” laid down for Standard English). There is a place for both the prescriptive and the descriptive – the key is to get that place right.

Auxiliary verbs

Western European languages, and those of Germanic origin more than any, rely heavily on a concept known as the “auxiliary verb”. These are often common verbs in the language which take on an additional use as a “helping” verb to express a mood (suggestion, volition, obligation etc), an aspect (complete or incomplete action) or a tense (past, present or future). For example, “you and he should be friends” uses the conditional form of “shall” to express a mood (of suggestion, in this instance); “he and I have been friends for twenty years” uses the verb “have” as an auxiliary to express something which is ongoing and thus incomplete; and “he and I are going to be friends” uses the auxiliary “go” in a continuous form (which itself uses the auxiliary “be”, in this instance in its plural form “are”) to express future time.

Middle English

In the time of Chaucer, the verb “do” could only be used as a main verb – it had no auxiliary function.

To ask questions, you simply inverted the subject and the verb: “They follow the rules” simply switched to “Follow they the rules?”

To express negation, you generally simply added the particle “not”: “They follow not the rules”.

With some slight amendments to word order, this remains how questions and negation work in German and Dutch. However, English developed the verb “do” as an auxiliary, and this therefore changed somewhat.

“Do” auxiliary verb

By the time of Shakespeare or the King James Bible translation, “do” was appearing as an auxiliary verb, but its status remained inconsistent. Typically, it was used for emphasis, both in positive and negative sentences: you could say “they follow the rules” or emphasise it by saying “they do follow the rules”; and you could say “they follow not the rules” or emphasise it by saying “they do not follow the rules”; indeed you could ask “Follow they the rules?” or emphasise it by asking “Do they follow the rules?”

Ultimately, the trend settled on preferring the maintenance of the “do” auxiliary verb in questions and negation, but not in positive sentences (except for emphasis: “but if they do follow the rules…”)

Except… well, there are always exceptions…

Auxiliary verbs standing alone

The “do” auxiliary verb was not required with other auxiliary verbs: you cannot say “do(n’t) shall”, “do(n’t) have been” or “do(n’t) be going to” – you say “shall not”, “haven’t been” and “aren’t going to”. Indeed, it was also around the time of Shakespeare that English abandoned “double modals” (essentially disallowing two auxiliary verbs of any kind in the same clause); so you cannot now say “they will can be friends” (this is instead “they will be able to be friends”), whereas in German and Dutch (and actually Scots) this remains quite normal.

Therefore, all auxiliary verbs stand alone in Modern English. The question which then arises is, what happens when an auxiliary verb is used as a main verb?

Haven’t, don’t have, haven’t got

This is surprisingly complicated, and in fact evades any reasonable “rules” of prescriptive grammar.

In Modern Standard English, for example, the verb “to be” is always used (with one inevitable exception noted below) as if it is an auxiliary, even when it isn’t [see what I did there?] – in other words, you say “he is my friend”, “he is not my friend” and “Is he my friend?” without the “do” auxiliary just as you did in the olden days with every verb.

However, with “have” it gets a little more complicated, and rather depends on variety. You say “I have a friend” or perhaps “I have got a friend”; you can then say “I don’t have a friend” (definitely the preferred construction in General American) or “I haven’t got a friend” (more common on this side of the Atlantic) or even, at a push, “I haven’t a friend” (which perhaps sounds fine to an Irish or Scottish audience but probably seems a little odd, if not outright wrong, to most native speakers).

“Don’t be saying that…”

Non-standard English, particularly in areas with a Gaelic substrate (most obviously in Ireland and perhaps parts of Scotland) does allow the use of the auxiliary “do” with the verb “to be”, however.

“Do you be in Derry often?” was something I was asked directly some years ago; this is non-standard but completely grammatical, with the “do be” form conveying to Irish ears an element of continuity or habituality. To other native speakers of English, this may sound literally foreign – an outright mistake, even if an understandable one.

“Don’t be saying that now” is again more common in Ireland, but touches on the inevitable exception: the “do” auxiliary is used with “to be” in the negative imperative form. “Don’t be stupid”, for example, is common to native English speakers everywhere; that said, its extension to the progressive use (“Don’t be saying” rather than “Don’t say”) perhaps again betrays a Gaelic substrate.

“Did they be”… therefore, is not Standard English; but its instinctive use in any part of Ireland maybe provides a hint at a Gaelic substrate. Doesn’t it?

What do we mean by “Latin”?

My book Western European Languages – A Reference Guide is now available in hardback, and while it is on a relatively niche subject it has received considerable feedback. Most of that has focused on the historical linguistics side which, to be quite frank, was not the intention of the book at all! Nevertheless, I am glad people have enjoyed that aspect, and let us return to it briefly here.

Just a quick note first on Ko-Fi.

I write this blog primarily in an attempt to provide background to language learning and linguistic curiosities. I do this primarily as an enthusiast, although I do hold an honours degree in German with Spanish and a masters in Germanic Linguistics (as well as government qualifications of proficiency in Spanish, Italian and Portuguese and an A-Level in French). This is a passion, which readers can take or leave at their pleasure, and I will always do it for free.

However, it does take time, so if you do want to buy me a virtual coffee just to keep me alert, you can do so here. Many thanks!

Major languages such as Portuguese, Spanish, Catalan, French, Italian and Romanian [I exclude the latter from consideration both in the book and here for the simple reason that I have no competence in it beyond a few phrases learned and largely forgotten for a business trip nearly two decades ago] are, as most people know, descended from “Latin”.

I argue in the book, however, that we are inclined to make a comparison directly with “Classical Latin”, i.e. the language of Cicero and Caesar – in other words, the formal version of Latin spoken by upper-class Romans around 2,100 years ago. We do this not because it was any “better” than any other version (although “Classical” hints at its being regarded subsequently as “first class”), but because we have plenty of evidence of it still available to us, from great poems to graffiti.

We may then be vaguely aware that there is some contention over exactly how “Classical Latin” was pronounced. As we approach Christmas we will once again sing “Adeste Fideles” and suchlike in a version of Latin essentially based on modern Italian (specifically perhaps Tuscan) pronunciation, known as “Ecclesiastical Latin”. This version has been used for ecclesiastical and academic purposes, particularly in Italy itself, but was never in fact spoken as a daily language.

Let us take just the first line: Adeste fideles, laeti triumphantes. When sung, this is pronounced with a modern /s/ sound; with [ae] as a mid vowel similar to the ‘a’ in modern English “name” (and the vowels in fideles pronounced similarly); and with [ph] pronounced as /f/. These pronunciations are all developments which had occurred before medieval times, but not all at the same time, meaning “Latin” was probably never pronounced as it is sung here. Caesar and Cicero would have pronounced the [s] further up in the mouth (sometimes this sounds like a lisp, although it is not quite that either); they would have pronounced each [e] in fideles as long but still as an [e] (close to English “bet”, not “name”); the [ae] in laeti as a diphthong (more like the main vowel in modern English “nine”); and notably [ph] represented an aspirated /p/, typically in words borrowed from Greek (whose consonants were characteristically more aspirated than Latin’s – this is still the case in the daughter languages; notably, Germanic languages such as English or German behave much more like Greek in this regard).

This then raises the question: when we say the above languages all derive from “Latin”, what do we mean by “Latin”?

In the book, I emphasise the fairly obvious point that Latin continued to develop after Cicero and Caesar; indeed it was not until the time of another great ‘C’ of history, Charlemagne, that Latin began to break down into local vernaculars – a process which was complete by the end of the first millennium (when no one in Europe would have considered themselves to be speaking “Latin” any more). However, it is important to note that people until Charlemagne’s time did consider themselves to be speaking “Latin”, despite being well aware that the version they spoke had evolved considerably – in much the same way that modern English speakers are aware that the way they write is obviously at odds with the way they speak, because the written form is based on the pronunciation of Chaucer’s time, 650 years ago. In other words, we have some consciousness that the written form is conservative (hence “write”, “right” and “rite”) but it does not concern us in daily use.

In fact, the Latin of Charlemagne’s era, almost 900 years on from Caesar and Cicero, would have been even further removed. Changes in pronunciation had also rendered almost useless the case system, so that word order and prepositions had become much more important in speech; they had also caused some verb endings to align to the extent that some had to be recast for other purposes or dropped altogether (the future was completely reanalysed and other tenses and moods merged or disappeared, often variously in different dialects). However, there were significant variations in time and in geography; for example, the process whereby the westernmost four of the six above-mentioned languages adopted plural in -s (completely coincidentally with English, by the way) was incomplete even beyond Charlemagne’s time.

So is it correct, therefore, to say that those languages derive in fact from Vulgar Latin?

By “vulgar” we in fact mean “popular”, i.e. the Latin of the people (the negative connotation of the word “vulgar” came later; ultimately it is merely cognate with “folk”). This is reinforced by the use of the term “Vulgar Latin” also to refer to later forms of Latin, which are of course naturally closer to the modern languages – though I myself prefer the simpler term “Late Latin” to clarify that we are talking about a point in time, not a register of the language. Either way, technically, it is correct to say that those languages derive from Vulgar Latin, but only if we define it properly.

“Vulgar Latin” is often expressed as something separate from “Classical Latin”; in much the same way, perhaps, that we may distinguish “Estuary English” from “Received Pronunciation”. However, we do need to be careful not to create the idea that the “Vulgar” and “Classical” forms were entirely distinct, regardless of whether we use them to refer to register or to a point in time – “Vulgar” and “Classical” were in fact both “Latin”, and when we say that certain modern languages derive from “Latin” we do mean the language as a whole, incorporating ultimately both its “Vulgar” and “Classical” forms.

It is certainly true that modern Latin-derived languages have emerged from a form of Latin spoken much later than Cicero or Caesar, and spoken by a broader segment of the population than urban capital city dwellers. Nevertheless, they do derive from “Latin”, as a whole; following the history from one to the other makes little sense unless this point is catered for.

For all that, it is close to pointless to learn any form of Latin specifically in order to learn its daughter languages – unless the learner has a particular motivation to learn the language of antiquity, the quickest route to proficiency in modern languages derived from Latin is to start by learning the one which appears most interesting. Any of them will be closer to each other than any is to Latin. Nevertheless, some knowledge of the route from Classical and Vulgar Latin to the modern languages may come in useful on occasions, depending ultimately on the kind of learner you are and the interests you have.

What the hell is going on?

The latest collapse of devolution in Northern Ireland is actually fairly standard – so much so that I would need to look up all the times it has happened before. However, what is actually going on in the world, politically?

Global democratic chaos

In the United States, next week, yet again the electorate will choose “cohabitation”, making it effectively impossible for any legislation to pass with different parties in charge of each House (the Representatives and the Senate). This is a country in which, only last year, there was an attempted coup. Five people were killed, we tend to forget. The man responsible for that, the then sitting President, is still at large and openly threatening journalists, with sensitive documents at his home and the only consolation that there is no chance he would ever be bothered to read them. Fundamentally, madness and deadlock have become the norm.

In Italy, political collapses are also the norm. The bizarre decision to bring in Mario Draghi as a technocratic Premier (the man whose policies at the European Central Bank caused serious pain to southern European countries including his own) was duly punished by the electorate, who instead installed a new head of government from a party which, in the past, has openly colluded with fascism in a country which has suffered from it within some people’s living memory.

In France, there was relief when the far right populist candidate “only” received 42% of the vote in the second round of the Presidential election – before President Macron was denied a parliamentary majority and left with huge struggles to pass a budget.

In Germany, the Chancellor had to use a specific clause of the Basic Law to force the government to take a decision on the ongoing use of nuclear power stations in order to secure energy supply, contrary to a decision eleven years ago to phase out nuclear. That decision has led to chaos, but itself followed on from a corrupt deal under which a previous Chancellor, acting in an interim capacity having lost an election in 2005, signed a deal on a pipeline from Russia. That pipeline has suddenly begun “leaking” in the Baltic Sea.

In Sweden, an arrangement first seen in Austria in 1999 will see the third largest party in fact provide the State Minister (head of government) after far right populists came second.

Speaking of Austria, in 2019 it had its first ever successful vote of no confidence in its government, forcing a mass resignation which meant the President had to appoint an interim government made up entirely of independents from outside politics, headed by a jurist (who technically became the country’s first ever female head of government). The Chancellor (head of government) who lost the vote was, however, re-elected a few months later, only to have to resign in a scandal; his successor then decided he really was not too keen on the job (although he did last slightly longer than the UK’s last Prime Minister).

In Spain, one region (Catalonia) is still struggling with the aftermath of a “successful” independence vote (which was partially boycotted, so did not really count). In Denmark, an early election was forced by a mink scandal during Covid. Ireland, which looks like a beacon of stability in comparison, has a government reliant on an agreement to change Taoiseach (head of government) half way through, in six weeks’ time in fact. That is before we go into any detail about the neighbours…

What is going on?

Depression of the Liberal

Victor Lapuente puts forward an idea in his book Decálogo del buen ciudadano which, I am sure, is by now available in English. He argues many things, but one, in an appropriately titled section Depresión del Liberal, suggests that one reason for this chaos is that the comparative reduction of religion in society has led to a melding of religion and politics where once there was none (except where specifically introduced into politics, such as by the DUP in Northern Ireland). This leads to a “Liberal depression” because politics ceases to be about tax levels, welfare systems, health provision and so on, but rather about concepts and people who are treated, effectively, as religious icons – rendering meaningful rational debate impossible.

This seems counterintuitive, but currently much about the world is.


The late broadcaster Peter Jennings once produced a documentary arguing that the reason there are so many conspiracies around the Kennedy assassination is that it is just too uncomfortable to accept we live in a world chaotic and random enough that the President of the United States can be shot dead in broad daylight by a madman. Yet, in November 1963, that is exactly what happened.

As humans, we need to believe that in fact someone is in control. We cannot accept that a President can just be shot or indeed that a much loved Princess can be killed in a random car crash. We do not like to believe that we live in a world where terrorists can fly planes into skyscrapers or a virus can suddenly be unleashed forcing us all to stay at home while still killing millions. People who believe that someone is behind all of this (even if that person is simply referred to as “they”) in fact tend to be less likely to suffer from depression and anxiety, even if they believe that “they” (whoever “they” may be) is fundamentally evil.

It is a very human thing to seek to make sense of the world, and it is much easier to believe that someone – be they “God”, “Allah” or simply “they” – is actually out there planning it all. Generally, we crave the security that comes with such a belief over the insecurity of a random world of 7.5 billion people, some of whom are cruelly deprived of life at a young age for no good reason and some of whom become Prime Minister despite having no discernible ability. This randomness drives us mad – almost literally.


To be clear, even though I profoundly disagree with them, I understand why in June 2016 some people voted for the UK to leave the European Union simply because they felt the UK did not really fit into a common political and social zone with Continental Europe. However, some people did not vote ‘Leave’ because they wanted the UK to leave the European Union. Much of it had nothing to do with that. We can see this in the way public “debate” has proceeded since.

“Brexit” has in fact become a word which means something other than simply the withdrawal of the UK from the European Union. You will hear it in phrases such as “This would risk Brexit” or “Brexit would be endangered”. This is odd, because Brexit has occurred – the UK is no longer part of the European Union. So how can it be risked or endangered?

However “Brexit”, so expressed, does not really have anything to do with the European Union, nor indeed with politics at all. Rather, I am sure Lapuente would argue, it has become almost an argument of faith – “Brexit” is in fact, as it were, the way, the truth, the life. It has to do with a sense of detachment felt by many people – often for perfectly legitimate reasons particularly in post-industrial areas which are among the most marginalised in Western Europe – and an idea that “Brexit” is the route to overcoming that detachment. This detachment is, however, predominantly a feeling rather than an obvious economic reality – it can be expressed by arguments about the need for improved opportunities in left-behind areas (an argument typically associated with the “left”) or by arguments about negative experiences of immigration and multiculturalism (and perhaps even a “lost past”; arguments typically associated with the “right”). It has, therefore, taken on a quasi-religious meaning as a concept to be followed; and some of its followers follow it with a religious fervour, meaning that rational political argument is rendered ineffective and trying to determine whether it is of the “left” or the “right” is pointless.


“Boris”, oddly referred to almost always by that name only, was in some ways the son of Brexit.

Was he partying during Covid? Oh come on, who didn’t break the rules?

Did he miss a confidence vote? Oh, I’m sure he was “paired” or something!

Did he go on holiday during parliamentary time? Oh that’s harsh, are you saying he’s not allowed to go on holiday?

Should he be Prime Minister again? Oh yes – after all, he “saved” us from Covid!

Rationally, it is incredible just how many excuses people make for a man who is plainly a narcissist with no discernible ability to be Prime Minister whatsoever – a man who was finalising his divorce from a wife he cheated on while she had cancer rather than preparing for Covid in February 2020; a man who was partying and then proved too lazy to cover it up properly during Covid while even the Queen adhered to the rules even at funerals; a man who promoted a sex pest and sent colleagues out to lie about it; a man who cannot even tell us how many children he has but has definitely been sacked more than once for lying; a man who missed a three-line whip confidence vote because he was on holiday during work time and could not even be bothered to get a flight back for two days even to run his own leadership bid; a man who, as Foreign Secretary, met Russian people he had no business meeting and put other Russian people in the UK Parliament contrary to independent recommendation, and then laughably claimed to be a friend of Ukraine; a man who spends no time reading papers but a lot of time getting drunk.

Anyone looking at him even half rationally would see a nasty, selfish bigot. Yet look at him quasi-religiously – as the living embodiment of Brexit – and you see a quirky man who looks like he doesn’t take himself too seriously and may in fact be a bit of fun.

God and Caesar

Christians are in fact commanded not to mix politics and religion in this way. Christ himself said, in the gospel of Matthew, “Give back to Caesar the things which are Caesar’s, and to God those which are God’s”. Essentially, let God be your moral guide but not your political guide – the latter, at the time, was (Tiberius) Caesar, and for us (luckier as we are) is our democratically elected government. We should maintain our moral principles, but not at the expense of rational debate around practical issues; and indeed we should remember who is Christ, and who is not.

In a world in which religion were religion and politics were politics, the Trumps and Johnsons of the world would be seen as the cheats, liars and troublemakers that they are – by applying moral principles and rational debate this becomes indisputable. Mix the two, however, and you can end up presenting Trumps and Johnsons almost as if they are infallible religious figures – against whom any challenge is seen as that not of a thinker but of an infidel.

Why do they attain this status?


Unfortunately the “messiahs” of our new politics of religious fervour will almost invariably be narcissists because, psychologically, those are the people who care least about the suffering they inflict on others but also who exhibit a deep insecurity which means they have a constant thirst for adulation.

This is a lethal combination. Those who crave attention are also those likely to abuse it. Those who attain mass followings are also those most likely to seduce them. Those who attain positions of social responsibility are also those most likely to shirk it in the name of selfishness. The whole thing becomes a vicious circle of self-centredness and egotism.

Northern Ireland

Which brings us to what has just happened (again) in Northern Ireland.

I need write no more…


I write this blog primarily in an attempt to provide background, more long-term views of a world which has, unfortunately, been taken over in the rolling news era by a constant desire for short-term soap opera. This is a passion, which readers can take or leave at their pleasure, and I will always do it for free.

However, it does take time, so if you do want to buy me a coffee just to keep me alert, you can do so here. Many thanks!

Which accent should we learn?

One aspect of language learning is which accent we should learn. Languages are, fundamentally, spoken constructs in origin and if we are to learn them by “comprehensible input” (essentially, therefore, by imitation) we will have to learn a version. Surely?

Ultimately, it depends on the language and the circumstances, but there are some tips with regards to Western European languages at least. (I will take this opportunity cheekily to point out that my book on these is now available in hardback.)


French is probably fairly straightforward in this regard: it is almost certain, no matter where in the world you are, that the version you learn will be based on the educated speech of Paris, which is the predominant “standard”.

One exception can be for learners in Canada, who may learn the French of Quebec. This is quite distinct, with variations in vocabulary and, very notably, intonation. Nevertheless, it is very much the minority form. The standard French of Belgium, Luxembourg and Switzerland, for example, is essentially based on that of France with minor vocabulary differences (these are more common in informal usage); likewise, African French ultimately is an imitation of that Paris-based standard.


Spanish is a little trickier. As an official and widespread language of 21 countries and territories (and even those do not include the United States, where it is spoken natively by nearly 50 million people), which version to learn is by no means straightforward – although in fact it matters little.

The Spanish-speaking world is not easily divided between “European” and “Latin American” in the way commonly assumed (not least by English speakers used simply to “British” and “American”). The speech of Buenos Aires is as distinct from that of Havana as that of Madrid is.

All other things being equal, there is a good case for going for the Spanish of Madrid – essentially the “neutral” or even “prestigious” form of the language as spoken in Castile itself (i.e. central Spain). The reasons for this are twofold: firstly, it maintains a distinct informal second person plural set of pronouns and verbs (vosotros entendéis ‘you [all] understand’ is distinguished from ustedes entienden as informal and formal registers; across Latin America only the latter is in regular use, even informally); secondly, it retains phonological distinción, whereby z and c (before a front vowel e or i) are pronounced distinctly from s. These distinctions may make European Spanish seem more complex than Latin American, but overall it probably makes sense to learn them from the outset.

All other things being not quite equal – with priority consideration perhaps given to Latin America (where, after all, the overwhelming majority of speakers live) – Mexican Spanish is perhaps the version to look at. This is in part a pure numbers game: Mexico has close to three times as many Spanish speakers as any other country, and it has a significant influence over the Spanish used for dubbing. There are other reasons – while there are definitely differences in accent within Mexico (to my non-expert ear, particularly to the south, where the accent becomes markedly more Central American, close to Nicaraguan), the educated speech of Mexico City offers not just numbers but also a degree of phonological clarity perhaps not so evident elsewhere (in particular, the letter s is clearly and consistently pronounced in all positions; and ll and y generally merge, both pronounced as the latter).

All other things being not at all equal, the trick is to enjoy learning Spanish in all its varieties. It is far easier now than it was even a decade ago to access lots of accents, and even select one depending on which part of the Spanish-speaking world delivers most interest. Ultimately, the main thing is to be consistent.


The first choice with Portuguese is to decide on European or Brazilian; African varieties are closer to the former, but it remains the case that the vast majority of native speakers live in Brazil.

Within Brazil there is also significant variation. Rio de Janeiro and Sao Paulo, despite being not hugely distant from each other, have clearly distinct accents even to a basic level learner. The former is the cultural centre and a version heard in broadcasting perhaps disproportionately to its numbers, but the latter has the clear numerical advantage. In the end, which version you choose may depend on how you want to learn – you may, for example, have a tutor or exchange from a particular place.


Given Italian is spoken almost exclusively in Italy, you would think the choice of version to learn would be more obvious. Unfortunately, this is probably not true.

It is said that the best version is lingua toscana in bocca romana (that works a lot better in Italian – essentially a Tuscan tongue in a Roman mouth). However, it is increasingly evident that the Italian of Milan is becoming more predominant, as the city (or, at least, the metropolitan area) becomes by far the most prosperous part of the country and thus the one increasing numbers of people choose to move to. This may shift the commonly taught version decisively north – meaning perhaps most notably that the s between vowels in words such as inglese or casa becomes voiced in the version learned by foreigners (more like Portuguese than Spanish).

Nevertheless, there is also a “neutral” version of Italian often used by actors or broadcasters, based on the above mix of Tuscan and Roman. That perhaps remains the version to go for currently; it is after all the one which will be heard most often, and thus the one which is easiest to imitate.


Even with Catalan, the choice of which version to learn is not straightforward.

The main dialect divide in the so-called “Catalan Lands” is in fact within Catalonia – essentially, Valencian follows western Catalan and Balearic follows eastern. There are significant differences, even in verb forms (parlo/parle ‘I speak’) and core words (vuit/huit ‘eight’), as well as notably in pronunciation (the e in aprendre ‘to learn’ sounds much more like the Spanish e in western varieties, but is moved back to something close to a in eastern).

On balance, eastern Catalan will be the one to go for, all other things being equal, as that is the version spoken in Barcelona.


Despite the huge historical complications of how “Standard German” came to be “Standard German” and the fact that the spoken standard is based on a different geographical location than the written standard, in fact German is one of the easier languages in which to pick a version – German phonology is taught almost always based on speech around Hanover.

Hanover is fairly central within the part of Germany where High German displaced Low German, and thus where there are fewer variations within what we now call “German” (even though there is a separate language, Low German, which holds out to some extent in more rural areas). A “neutral” spoken form developed initially among actors for literary use, known as Bühnendeutsch ‘Stage German’.

Dialect variation is very marked further south, in Bavaria and Austria, where the accent becomes very different from that of Hanover. Even in those places, however, no one expects learners to learn their version.


Standard Dutch, peculiarly, was once based on the speech of Antwerp – in Belgium! However, the growth of Amsterdam to become the largest agglomeration in the Dutch-speaking area means that the educated speech of Amsterdam is the commonly taught form, and without doubt the version to learn for beginners.


What about English?!

English is generally thought to be split between “British English” (actually the “Received Pronunciation” or “BBC English” based on the Oxford-Cambridge area) and “American English” (or “General American”, broadly based on educated speech in the Midwest). Learners choose based on which version they are most likely to use; at school, most Europeans focus on the former with some reference also made to the latter. However, the predominance of Hollywood is also significant in determining which version is chosen by foreigners learning through “comprehensible input”, and the American version is unsurprisingly the one taught in most of the Americas.

Enjoy the variation!

Far from being a challenge, the variation within languages can be seen as beneficial to learning as it provides an extra point of interest and even added discussion around pronunciation to start with (which can do no harm).

In the end, consistency of use is more important than the version chosen. However, all things being equal, a “neutral” or “standard” form in a populous country, region or city is probably the best to go for, for the simple reason it will typically be the one most often used in the content through which the language is learned.


I write this blog primarily in an attempt to provide background to language learning and linguistic curiosities. I do this primarily as an enthusiast, although I do hold an honours degree in German with Spanish and a masters in Germanic Linguistics (as well as government qualifications of proficiency in Spanish, Italian and Portuguese and an A-Level in French). This is a passion, which readers can take or leave at their pleasure, and I will always do it for free.

However, it does take time, so if you do want to buy me a virtual coffee just to keep me alert, you can do so here. Many thanks!

Northern Ireland needs a government, not an election

Rishi Sunak’s arrival in 10 Downing Street is a remarkable moment, as he becomes the first minority ethnic head of government in any major European country. No one should forget the late John McCain’s extraordinarily gracious speech when the same occurred for the first time in the United States fourteen years ago; and no one should forget either what came later. It is essential that this moment of demonstration that anyone in the UK can rise to its highest political post should be matched by a similar determination to break down other divides, not least those of class.

It is also a moment of opportunity among troubling times here in Northern Ireland. But we should be clear – what Northern Ireland needs is a government, not an election.


We should be very clear about what a comprehensive failure the calling of another election would be.

First, it is a slap in the face for democracy itself and for the people who participate in it; there was a clear result of the Assembly Election last May, and those now seeking to usurp that result are damaging democracy itself. An Assembly must be formed on the basis of May’s election result – or democracy itself will be in dispute.

Second, the process of any election can only cause further division. What Northern Ireland needs currently is the precise opposite. For example, although it was outrageous to block the formation of an Executive based on issues around the Protocol (the stated reason) after the 2022 election, just as it was based on issues around the Irish language (the real reason) after the 2017 election, the fact is that the Protocol exactly as it stands is in nobody’s interests. Indeed, it is now plain that the UK as a whole will move closer to the EU, perhaps agreeing regulatory standards and such like, rendering the Protocol less necessary; there is also a clear landing zone, even in the short term, around checks occurring away from the Irish Sea ports. It should be possible to bring parties together for a compromise outcome, but also an improved one. An election does nothing to encourage such an effort, and everything to discourage one.

Third, there is no prospect of any outcome improving matters, and indeed some prospect that the outcome would cause further deterioration. If some United Unionist Front were to come out in front, it would be a Pyrrhic victory; Sinn Fein would then, with some justification, argue that it has no reason to serve as “deputy” under a ludicrous system when Unionists refused to (the real reason there is no functioning Executive now, let us be honest); conversely, if Sinn Fein were to do well (as is likely in the context just mentioned), it would use that mandate to seek a “Border Poll”, not any compromise on governing Northern Ireland. The likely further reinforcement of the growth of the Alliance Party will also, in turn, just demonstrate how ridiculous an outcome which discriminates against those not designating “Unionist” or “Nationalist” is in modern Northern Ireland.

Fourth, and most importantly of all, we are facing into an immediate economic and energy crisis which will have a significant and detrimental impact on our collective standard of living. Such uncertainty is no time for a divisive election, which would do nothing to encourage rational thinking towards a sensible compromise; more to the point, it is a time when the people of Northern Ireland deserve and need clear policy direction and financial support, not to have people knocking their doors spoiling the run-up to what would otherwise be the first Covid-free and election-free Christmas since 2018.

Put simply, the purpose of an election is to find out what the people say, and in Northern Ireland we know what they have said. We do not need to waste time and money asking them again. We need to spend time responding to what they said, and money supporting the most vulnerable among them through the choppy waters ahead.


Underlying all of this is a simple further question which I suspect is why the UK Government is pretending it has run out of ideas.

What if Sinn Fein and the DUP do not want to be in government because they have damaged public services and the economy so significantly during the period of refusing to govern that they are now very difficult to put right without taking unpopular decisions? What if they simply do not wish to face up to the task of clearing up the mess that four years of non-government (out of the last six) have inevitably caused? And indeed, why should the Alliance Party and others enter an Executive to take part of the hit for the clean-up operation, when they were not responsible for the mess?

In other words, the fact is Northern Ireland has to be governed – and the question then becomes how?

The answer, whether those who fill the relevant roles in the Northern Ireland Office by the end of the week are quite prepared to admit it or not, will involve reform of the institutions to remove the sectarian veto.

In other words, the only reason you would have an election is because you have run out of ideas and you are prepared to do further damage – or, put another way, because you are not prepared to put in the work necessary to deliver institutions which actually function.

Northern Ireland needs a government, not an election. And therefore it needs institutional reform, not more of the same dysfunction.

Who are the “Americans”, and what should we call them?

I saw a query on Twitter the other day about what we should call people from the United States of America. This, of course, raises all sorts of questions around the word “American”, in English and in different languages.

United States

In English, the “United States” refers to the “United States of America”, and “American” almost always refers to someone from those United States. Few English speakers would think much of this; the debate would more likely arise around the use of the word “America” to refer to that individual country.

Generally, the continent is referred to as “the Americas”. This does lead to some quirks: “South America” refers to the continent but the “American South” refers to the country (really to the “southeast” of the country as it is now formed, but let’s not concern ourselves with that element of precision!)


In English, an “Americano” is probably most often a coffee.

However, in Italian and, particularly, Spanish it refers to someone from the “Americas”. Both languages have a specific term for someone from the United States (statunitense and estadounidense respectively), which is used formally.

Nevertheless, Italian in particular rarely frowns too much at “americano” being used to refer specifically to the country, nor indeed even at “America” being used to refer to the “United States of America” (likewise américain and Amérique in French). It is in Spanish (and perhaps Portuguese) that precision can become a significant issue: in practice, “norteamericano” is often used in informal Spanish to specify someone or something from the United States (even though this is plainly still inaccurate), but “americano” would almost always refer to an inhabitant or native of the continent.

Spanish introduces another confusion too, because the full name of Mexico is “Estados Unidos Mexicanos“, literally “United Mexican States”. Are these not united states of America?!


German has another option for specifying someone from the United States, namely US-Amerikaner. This term is used for the avoidance of doubt if it is felt necessary to specify origin in the United States; the colloquial “Ami“, however, also specifically means of the United States and “Amerikaner” is more often than not also understood that way.


Many people from Latin America do express frustration at what they see as the clearly erroneous use of “American” in other languages.

Yet, unfortunately, this is just what happens in the world, whether right or wrong. On my own recent trip to Latin America our group was consistently referred to as “English”, even though in fact two of the eight were Northern Irish, two Welsh (indeed Welsh-speaking), and one even a New Zealander. Ultimately it did not really matter, so no one bothered with the precision. Ultimately, that is what people are doing when they use “Holland” for the “Netherlands” (which is the other way around – “Holland” is part of the Netherlands; the same used to apply to “Russia” and the “Soviet Union”) or “America” for the “United States”.

In linguistics, unfortunately or otherwise, the principle of least effort often applies.