Planned Obsolescence 0 – 3D Printer 4

Yet another win against planned obsolescence, checkmate by the 3D printer. If you haven’t seen the previous parts of the series, here they are:

  • Planned Obsolescence, 0 – 3D Printer, 1
  • Broken Stuff 0 – 3D Printing 1
  • Planned Obsolescence, 0 – 3D Printer, 2
  • Planned Obsolescence 0 – 3D Printer 3

The failing device was a UV sterilizer. Its cover is held by two little plastic notches, which broke. Plastic.

Analyzing the problem

The problem is that, without that little piece of plastic, the whole thing stops working. A small switch must be pressed when the lid is closed, and with the notch broken, that switch never gets activated.

The tricky part here is that the missing plastic part is very small: roughly 2 mm thick and a few millimeters wide. Besides, the hole sits higher than the plastic that is left, so there is no way to simply drive a screw into the remaining plastic.

Additionally, 3D printed pieces have only a limited level of detail. And the less plastic there is, the less robust it is.

The main issue here is that we have to have something go inside this hole to hold the whole cover. There is no way a piece of plastic will handle that. It has to be a screw or bolt.

I also didn’t want to use any glue, so the holding piece should be held by another screw.

Designing the piece

I came up with a very simple design: a screw to hold the printed piece in place, and another screw that acts as a sort of hinge.

And the printed piece comes to life; notice how close we are to the limit of achievable detail:

Putting the piece in place

I first needed to drill the existing door’s plastic to fit the holding screw. Low tech drill here:

Moment of truth: screwing the piece in place and testing it on the machine. Here you can see the little switch that needs to be activated for the machine to work.

Final result

And the final result: a fully functional door again, and a working sterilizer.

3D printer, 4th object fixed without needing to trash things and replace them with new ones. Yay!

Managing your gpg keys – the right way

When installing software from non-official repositories in Debian-based distributions, you might come across “key problems”, such as:

The following signatures couldn't be verified because the public key is not available: NO_PUBKEY <key>

When it appears, you might scratch your head for quite some time.

There is a simple way of dealing with those. However, as I recently experienced while upgrading a machine, most tutorials are incomplete or even sometimes totally misleading.

Why keys?

First, let’s see what these keys are for.

When installing software from non-official repositories, Linux needs to download packages from those external sources. However, hackers may introduce malware inside the files that are on the servers of those external sources. This type of hack is not an easy one, since web administrators are watching those sites closely. However, when it succeeds, the attackers can automatically distribute their malware to a lot of computers at once. Consequently, everything should be put in place to avoid spreading Trojan horses this way.

This is why any downloadable file is digitally signed by the actual provider of the source. If a hacker alters a file, the digital signature no longer matches the file's content. This way, your Linux distribution makes sure that everything it downloads is the unaltered original file, as published by the source.

To verify the signature, it only needs the public key of the source. And that is why your distribution needs to keep a list of public keys of all the non-official sources.
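The whole mechanism can be sketched with a throwaway keyring – everything below (the key name, the file names) is purely illustrative and not tied to any real repository:

```shell
# Create an isolated, disposable GnuPG home so nothing touches your real keyring.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# The "source" generates a key pair and signs its file with the private key.
gpg --batch --passphrase '' --quick-gen-key 'Example Repo <repo@example.org>' default default never
echo 'package data' > package.txt
gpg --batch --detach-sign --output package.txt.sig package.txt

# Anyone holding only the public key can verify the signature.
gpg --verify package.txt.sig package.txt && echo 'signature OK'

# If an attacker alters the file, verification fails.
echo 'malware' >> package.txt
gpg --verify package.txt.sig package.txt || echo 'file was tampered with'
```

The last verification fails precisely because the file no longer matches the signature – which is exactly the check apt performs on every package it downloads.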

The apt-key way (deprecated since Ubuntu 22.04)

Previously, one could import keys using a tool called “apt-key”. This method is still mentioned in many tutorials, in the form:

apt-key adv --keyserver hkp:// --recv-keys <KEY>

Transitioning to GPG (the new way)

Ubuntu and other distributions are switching to a different, more secure way of storing keys – though still not truly secure, but it is what it is.

Keys are now stored with GPG. To transition, it is possible to import keys from apt-key to gpg. This is done in two steps:

  • listing the existing keys with “apt-key list”, which gives the following type of result:
pub   rsa4096 2022-01-31 [SC] [expires: 2024-01-31]
      DF44 CF0E 1930 9195 C106  9AFE 6299 3C72 4218 647E
uid           [ unknown] Vivaldi Package Composer KEY08 <>
sub   rsa4096 2022-01-31 [E] [expires: 2024-01-31]
  • importing those keys to gpg, using a command of the form:
apt-key export 4218647E | sudo gpg --dearmor -o /etc/apt/trusted.gpg.d/vivaldi.gpg
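If you have several keys to migrate, the two steps can be combined into a rough loop – a sketch only: it extracts the short key IDs from the fingerprint lines that “apt-key list” prints, and you should review what it produces before trusting it:

```shell
# Sketch: export every key apt-key knows about into its own file under
# /etc/apt/trusted.gpg.d/. The 8-character short IDs are recovered from the
# last two groups of each fingerprint line.
for id in $(apt-key list 2>/dev/null \
              | grep -Eo '[0-9A-F]{4} [0-9A-F]{4}$' \
              | tr -d ' '); do
    apt-key export "$id" | sudo gpg --dearmor -o "/etc/apt/trusted.gpg.d/${id}.gpg"
done
```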

Importing directly into gpg

So, for new sources, rather than going through apt-key, you should use gpg directly. The commands take the form:

curl -fsSL | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

Matching package source and gpg file

Now, here comes the trick. In the previous command above to import a gpg key for docker, the target file for the key was:

/etc/apt/keyrings/docker.gpg

What actually happens when running “apt” is that it reads the package information from files located in the directory:

/etc/apt/sources.list.d/
For instance, you may have followed instructions to add the docker ppa using the following command:

echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

Note the “signed-by” part of the command, which specifically points to the following gpg file:

/usr/share/keyrings/docker-archive-keyring.gpg
So that’s exactly where your gpg file should be. The confusing part is that some tutorials / recommendations prefer /etc/apt/keyrings while others use /usr/share/keyrings.

Once your gpg key is in the correct place, your problem is solved.
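To double-check that everything matches, here is a small helper – the function name and its directory parameter are my own invention – that scans your source lists for “signed-by=” clauses and flags any referenced keyring file that doesn't exist:

```shell
# check_keyrings: scan *.list files for signed-by= clauses and verify that
# each referenced keyring file actually exists on disk.
check_keyrings() {
    dir="${1:-/etc/apt/sources.list.d}"
    grep -ho 'signed-by=[^] ]*' "$dir"/*.list 2>/dev/null \
        | cut -d= -f2 \
        | sort -u \
        | while read -r keyring; do
              if [ -f "$keyring" ]; then
                  echo "OK       $keyring"
              else
                  echo "MISSING  $keyring"
              fi
          done
}

# Usage: check_keyrings             # scans /etc/apt/sources.list.d by default
```

Any line flagged MISSING means apt expects a key at a path where you never put one – exactly the mismatch described above.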

I hope this will help prevent some of you from scratching your heads over this.

3D Printing – fixing roller blinds

Sometimes, things get in the way in our lives. And sometimes, it’s best to take advantage of otherwise displeasing situations. With creativity and a firm conviction that we can succeed, anything is always possible.

Our apartment building is almost 50 years old. After that much time, some things start falling apart. Like roller blinds.

Assessing the problem

The first time those things failed, almost 10 years ago, it was an easy fix. One central part of the mechanism is a plastic piece with a square hole, that receives the crank handle. That plastic piece got loose over the years until it burst open.

A simple sleeve clamp did the job of making the plastic hold the connector tightly, and had the whole thing running for another 10 years.

However, what came next was at a different level. The central piece of the mechanism, a piece with a ball bearing in it that rotates only when maneuvered by the crank handle, became stuck. For good. Of course, there are currently no replacements for this kind of piece.

So I had a shutter in perfect condition, except for that one piece. I knew that calling a repairman would mean replacing the whole shutter, “because it's not compatible” – and also to maximize profit, since spending time on custom jobs is not efficient money-wise (which is sad, because custom repair is exactly what we need at a civilization level right now). What a waste it would be!

Devising a solution with 3D printing

I figured that I could actually take advantage of the situation, and I could even put an electric engine in my roller. Many people laughed when I said I was planning to go electric on a failing manual mechanism.

I launched OpenScad and designed some pieces. I ordered an engine that I knew would handle the weight (100 kg, 50 Nm, less than 50 euros).

It so happens that today's standards are very different from those of that time. Back then, they used large cylinders almost 10 cm in diameter. Nowadays, we have octagonal rollers about 6 cm across. So today's engines surely don't fit a 10 cm cylinder – well, you can insert one, but keeping it secured inside is a challenge, especially as all the force needed to raise the shutter is transferred there. Besides, the roller itself is impossible to disassemble. And it is quite large and massive, difficult for one person to handle alone.

Physical Constraints

I also had to take into account the weight of the whole shutter: it is an old thing, and probably weighs around 50 kg, maybe a bit more. The roller itself probably weighs 20-30 kg on its own. Needless to say, my solution had better be strong. Fortunately, I already had experience with plastic.

Plastic is generally viewed as fragile. After all, when a mechanical part fails, it is a piece of plastic 95% of the time. However, plastic, even PLA which is my preference because of its low environmental footprint, can be quite sturdy. People have even printed propellers with it:

My experience is that combining a plastic structure with metallic backbones is quite indestructible. Very small pieces can hold a lot of weight if designed properly. For instance, these shelf supports can hold a lot of weight. And I mean, a lot:

Note how the weight rests mostly on the horizontal steel screw; the plastic is just a proxy in between, with some extra support from the two other screws on the shelf – which is also a cubic structure, so there is no angular stress on the support.

First Failure – Design

As I didn’t want to damage the roller, I first thought of having the engine outside the roller and connecting it with gears, but there were a couple of problems:

  • there is very, very little space for the engine when the shutter is totally rolled up,
  • tightening the engine securely for the kind of forces at play would be challenging,
  • I had doubts about the solidity of 3D-printed gears, especially in the long run,
  • the engine has two parts: one for traction, one for counting the number of turns in order to stop when the shutter is fully open or closed. Making that second part turn in sync with the first was not easy to do reliably with what I had in my toolbox.

After trying out several designs, I had to give up that path. Not that it was impossible – it would have been feasible to squeeze the engine outside and fit some gears. But it involved many pieces, and the risk of something failing sooner or later. I didn't want to take that chance.

Second solution – and failure

So I had to cut the roller open to insert my engine inside. Reluctantly, I cut the end part of the roller:

I know, it doesn’t look pretty. I probably could have used an angle grinder. I burnt a couple of blades in the process:

Unfortunately, bad news awaited: inside was a wall, very likely welded in, that I would certainly not be able to remove. I was stuck.

Getting back up

At that point, I figured I might as well cut the other end and see if I would be luckier… and I was! That side was definitely empty, so I could resume the project. The next step was to measure and design the necessary 3D pieces:

  • a holding piece on one end, with a ball bearing so the whole thing turns smoothly – and again, note how the plastic is just a proxy between metallic parts; you can see the ball bearing encased in the plastic:

  • on the engine’s side, fillers that would match the engine’s octagonal and somewhat weird shapes on the inside, and the cylinder on the outside,

  • drilling holes in the shaft to host the holding screws:

  • Fitting the pieces together:


Putting the external support in place:

This last piece has several functions.

  • first, to hold the whole structure – engine plus roller – vertically. This is achieved thanks to the 4 horizontal screws, which actually take all the weight.
  • second, to allow rotating the whole structure by gripping the hexagonal corners. This can be useful if the whole thing becomes stuck, for instance; otherwise, you would be in trouble with a non-rotating shutter and no room to unscrew it.
  • third, to let those 3 screws distribute the rotational load onto the 4 horizontal screws. Again, the plastic is just a proxy here; it doesn't actually hold anything by itself.

Final setup

Finally, everything looks good:

The last step is the electric setup with a switch (notice the little 3D-printed box under it to hide the electric wiring):

Upon testing, everything works as expected. Hurray!

Just a final note to those of you who do have shutters like this: you *have* to lubricate the sliding parts regularly (at least twice a year) with silicone lubricant. It is the best way of giving a long life to your shutters. Any other solution I tried ended up either in disaster or with very unsatisfactory results.

The horrible lyrics of French nursery rhymes


The importance of first contact

As parents, it seems important to try to instill a certain number of values in our children, from their very youngest age. That is when their brains build their first connections, their first behaviors. They might as well go in the right direction from the start, because losing a bad habit is much harder than picking one up. Habits of thought in particular.

Of course, values depend on each person's beliefs. Some will insist that, to survive in this world, you need to be tough and ready to throw every punch. For my part, when my son was born, I preferred for him to get to know empathy, perseverance, hope, and many other values dear to me. In any case, from my point of view, kindness is the greatest value of all. That does not mean “naivety”; but not being naive does not imply systematically crushing others either.

The importance of music

Naturally, since I am a musician, I also wanted to bathe him in music. Music is extremely powerful: it appeals to emotion. And we engrave our memories and experiences all the more intensely when they come with great emotions.

He got little instruments to play with, a xylophone, a tambourine, like most children his age. But of course I also wanted to sing songs to him. Not only for the music, but to share the heritage that is his, through typically French nursery rhymes. In that context, the lyrics mattered particularly to me.

Curiously, no matter how hard I searched my memory, I could not remember a single rhyme in full. Beginnings of songs and hummed melodies, but it stopped very quickly.

First surprises about the origins

Poking around the web a little, imagine my surprise at discovering right away that many, not to say most, nursery rhymes have hidden meanings. You would think our ancestors were quite the rascals…

Au clair de la lune, for example, turns out to be a very… adult song. The worst part is that even the adults who sing these rhymes no longer realize it, because the expressions have changed, or the context has been forgotten, and the original meaning is lost.

Fair enough. If even adults do not understand, children will be none the wiser, with their innocent little minds.

But no, it is not even that. It is much more explicit than that. I am fine with toughening up our little ones, but still…

Contact with reality

So I got myself a few little songbooks, because I wanted to sing the songs myself. When you play a recording, the contact with the baby is not the same at all; it is much more impersonal. Sung by a flesh-and-blood human right in front of them, it is much more powerful. As a result, I ended up looking closely at the lyrics, which I probably would not have done had I simply played recordings.

I remember one of the very first rhymes: Une souris verte. I remembered the melody, sung very often in my childhood. But a few verses in, shock: “trempez-la dans l'huile, ça fera un escargot tout chaud” (“dip it in oil, it will make a nice warm snail”). Wait, what? To turn a mouse into a snail by dipping it in boiling oil, you really have to mean it.

I turned the page very quickly, telling myself “right, I'll skip that one… next”. I no longer remember exactly which ones came next. What I do remember is a succession of letdowns. At each new beginning, I would think “ah yes, this one is fine!”. And a few moments later, shock, even horror. I went through every song in the first book, one by one. I did not find a single one with a message of kindness that deserved saving.

A little anthology…

La Mère Michel… well, knowing that her lost “chat” (tomcat) is in fact her “chatte” (a crude double entendre in French), but let's move on. Even taken at face value, we are dealing with a cat held hostage, and a very explicit example of blackmail. What a fine moral!

Alouette, gentille alouette… “je te plumerai” (“I will pluck you”)… no comment.

Il était une bergère, “ronron petit patapon”… she kills her little kitten – and as penance she must kiss the priest!… and finds it quite pleasant and will do it again!!! And to think I had fond memories of that rhyme!

Il était un petit navire… pondering which sauce to eat a human with – every possibility is given serious consideration. Yum!

Maman les p'tits bateaux: pestering your baby by taking him for a fool and telling him lies, nothing like it to educate him!

Au feu les pompiers… the house has burned down… “and I'm not the one who burned it”… (pyromania, denunciation…)

Le petit cordonnier, who beats his wife when he comes home drunk…

And while we're at it, a story of rape in À la pêche aux moules, not even disguised.

They want to marry Jeannette to a prince, but she wants to marry Pierre… who is in prison (what did he do to end up there?)… and who will be hanged… and Jeannette wants to be hanged with him… well then, they hang them both. Literally.

Not all nursery rhymes deserve the bin. In other books I found some that were acceptable and cute, though all of them rather uninteresting for a child's development. Pirouette, cacahuète. Frère Jacques. Il pleut, bergère. And some with an educational purpose, teaching vocabulary words. But for someone who had been so excited about singing songs, the least I can say is that I was sorely disappointed – and shocked.

Very quickly, we moved on to something else: La Fontaine's fables. Even if he didn't understand them at first, he never tired of them.

Existing “lingua francas” – some history – an international language – Part 3


In the previous part, we saw that existing languages can be fairly complex. Besides, we cannot speak every language spoken by others. Clearly, from the beginning of language, people have needed to communicate with “others who don't speak your own language”.

Many existing languages served as a common tongue between different civilizations. In fact, such languages emerged in history in every region that had enough trade going on, and where communication was necessary.

In the West, during the Middle Ages, traders around the Mediterranean spoke a language called “Lingua Franca”. The term now means “common language”: a language spoken by people who otherwise wouldn’t understand one another. All over the world, groups that had strong interactions used some variations of regional languages as “lingua franca”, such as Chinese, Sanskrit, Arabic, Swahili, Latin, Quechua, and many others.

Vulgar Latin served as the “lingua franca” for the whole European continent for centuries. It is actually still in use in the scientific world, where Latin is the root of botanical and anatomical names.

After the Renaissance in Europe, there were many attempts to devise simplified languages, mostly for scientists from different countries who couldn’t communicate across Europe because they spoke different languages.


Volapük

In the late 19th century, Schleyer, a German priest, created a language called Volapük, which he hoped would become an international language. It gained great popularity very quickly, with over a million enthusiasts, but it was short-lived.

Among Schleyer’s followers, some wanted to make changes to the language. However, he wanted to keep it “pure and unmodified” – and also didn’t want the credit for creating the language to slip from his hands. As a result, many “unofficial” variants of the language appeared. Different factions started arguing and claiming that their version was “the best”.

Power is a treacherous thing. The promoters of each version fought with the others. Many branches of the language appeared, and this is probably one of the main causes of its destruction. I insist on this part of the story because it explains what followed with Esperanto, and it is still visible to this day.


Esperanto

Ten years after the creation of Volapük, Zamenhof, a Polish ophthalmologist (born in what was then the Russian Empire), published a book under the pseudonym “Doktoro Esperanto”, describing what would become the language Esperanto. He had been working on an international language for more than 15 years. Indeed, he believed that humanity needed a common language to be at peace. Many Esperantists (people who speak and promote Esperanto) still have this same goal in mind: a unifying language for the world.

I do admire the goal, and I absolutely share the vision. Not being able to understand one another creates division. And certainly, the “divide and rule” tactic is working partly because we all speak different languages.

Fear and rigidity

However, like many other people, I find that the Esperanto language is lacking. In fact, Zamenhof himself thought that his language could be improved. However, many Esperantists feared that modifying the original language would lead to a “Volapük schism” which would eventually destroy Esperanto. A large majority rejected Zamenhof’s own reform when he proposed improvements to the language. This fear is still felt by many Esperantists today.

Although I do understand the fear, I feel it rests on the wrong grounds. Volapük did not disappear because it was modified by others. It disappeared because Schleyer would not allow anyone to modify it into a better version – thus provoking the splits that eventually led to its demise. The thing is: the inevitable obviously cannot be stopped. Trying to stop a natural evolution makes things worse, not better. From a shallow point of view, it looks like a schism destroyed Volapük. Looking deeper, though, the root cause was rigidity. I fear the same might happen with Esperanto.

Granted, some reforms did take place in Esperanto. But they are very minimal and insufficient, at least from my point of view. On the other hand, I do share the view of many Esperantists that an international language shouldn't go through a reform every month. I also agree that “better” is often the enemy of “good”. However, what is clearly broken and/or creates strong negative emotional reactions cannot be widely adopted, and creates resistance. Esperanto is the oldest of the major surviving conlangs, and it has benefited from several waves of hype over time. Yet it never really took off, and today we are very far from it being a common international language.


Ido

Very early in the life of Esperanto, many shared the view that it needed some extra touch. Branches of the language appeared despite all the efforts of Esperantists to stop anyone from modifying the language for fear of a schism. Given that intolerance, those branches split off forcefully from the main body of Esperanto speakers. One of them is called Ido, a “revised Esperanto” (“ido” means “offspring” in Esperanto).

Frankly, I find that Ido feels like a patch. It does solve some of the problems I see in Esperanto, but it is still lacking. Quite a lot. We will see that in more detail in the next posts. It is an honest attempt at fixing many aspects of Esperanto that many people judge negatively, but I believe it is not sufficient. It is just like fixing a broken house with tape.

And indeed, the result is here: despite being around for quite a long time, Ido is at a standstill. In fact, it is far behind Esperanto in number of speakers. If it were as good as it claims to be, I believe it would have overtaken Esperanto by now.


Glosa

In the middle of the Second World War, a scientist, Lancelot Hogben, devised the bases of an international language during his idle hours. He published a book called “Interglosa”, aimed mostly at language teachers, confident that people would pick up his language immediately. However, people had other things on their minds, with the war going on. His introduction manual for the language never attracted any attention.

Almost 20 years later, another scientist, Ron Clark, found Hogben's book in a second-hand bookshop. He read it and immediately found it fascinating. He was soon joined by Wendy Ashby, and they worked with Hogben, who was still alive, to improve the language. However, Hogben died in 1975. After some further modifications of the language, Clark and Ashby released it as Glosa.

I find Glosa much better than any previous attempt. After all, Hogben could learn from the many failures before him, so he definitely had an advantage. Here are some of the core features of Glosa:

  • it is fully phonetic, every single printed character is pronounced in one way, and vice versa,
  • unlike in Esperanto, words do not change; they can serve multiple purposes with the simple addition of prepositions. This eliminates the complex inflections that make many languages difficult to speak and understand – people generally don't think about what an adjective or an adverb is when they speak!
  • the vocabulary is limited to 1000 words,
  • the roots of the words are exclusively taken from Greek and Latin.

However, like Esperanto and Ido, Glosa is still not widespread. Granted, it is much younger and never benefited from the waves of hype that Esperanto did.

Toki Pona

There are many other constructed languages that aim to be international languages. It would be impossible to list them all. I’ll just present a last one, which is quite fun and intriguing.

Toki Pona was created by a Canadian, Sonja Lang, whose aim wasn't exactly to create an international language, but rather a “minimalist” language: a tool to help organize her thoughts. Indeed, the official Toki Pona vocabulary contains no more than 120 words. Yes, you read that correctly: a language with a total of one hundred and twenty words! In fact, the philosophy is close to Nordic minimalism, which, along with her pen name, led me to believe for quite some time that Sonja Lang was a Swede!

Similar to the founder of Esperanto, the inventor of Toki Pona had something in mind: doing good. While the word “Esperanto” means “the one who hopes”, “toki pona” means “the language of good”.

Besides, I particularly like some of Toki Pona’s features. For instance, the sounds have been selected so that basically all people in the world can read, understand and pronounce the language easily. This is nice! Besides, it can be written in plain ASCII without any diacritics.

Minimalism comes at a price

However, minimalism comes at a price. Although a few root words may be sufficient to express very simple things, it becomes very challenging when you need to express more complex thoughts. You need to become extremely descriptive, like a 2-year-old child who doesn’t know his vocabulary yet.

For instance,

The student learns history from the teacher

might be rendered as something like:

The one who studies learns the communicated chronology that passed from the person whose job is to instruct others.

As a primary tool for communication, this can become very tedious. Better than nothing, for sure. And after all, the creator's intention was to develop creativity and imagination.

However, if we want to be understood by other human beings, we need a consensus. Everyone must use the same metaphors to be understood. This in turn means that you have to learn vocabulary – or, in this case, idioms – just like in any other language. If you don't, you might scratch your head when someone mentions a “confident bird”. Could it possibly be one of the following?

Actually, it is the metaphor people generally use for an “eagle” in Toki Pona. But if you haven’t come across the expression yet, any one of those above could certainly qualify as a “confident bird”.

Besides, others would probably also scratch their heads when you mention a “confident bird from the Andes”, although that one could come very easily as a “condor” for someone who is familiar with the Andean culture.

Conlangs are also biased

Constructed languages, or “conlangs”, also suffer from the biases of their creator(s). Just because someone created a language with “goodwill” doesn't mean the language itself is good, easy, or even usable for communication. We'll examine how easy existing conlangs are to learn and communicate with in part 5.

Does a “lingua franca” replace local languages?

To conclude this post, I would like to address this very sensitive topic. Language is deeply connected to the culture of the people who speak it.

Languages politically forced on populations destroy dialects

This is a common fear, due to a very big misunderstanding of what a “lingua franca” should be. Since English has become the de facto international language, many believe that an international language always tries to force itself on people and aims to replace all languages.

In many countries where a common language was forcefully introduced as “the common lingo”, that has indeed been the case.

In my native France, the Republic has gone to great lengths over the last 250 years to kill all regional languages, forcing people to speak French and ditch their local language or dialect – all in the name of “unity”. This strong will to eradicate dialects is still very much alive within the Parisian administration today, and, sadly, it has succeeded quite well. In June 2021, it voted a new law restricting the teaching of regional languages at school. That is a political agenda, not “goodwill” to “help people communicate”. It is a tool for a centralized, authoritarian administration to control the population.

In the same way, some other languages did replace local dialects. But in most cases, that was the result of military conquest. Vulgar Latin did replace many local languages in Europe – but that was the result of the Roman colonization of those territories. Similarly, the Incas pushed Quechua onto the people they conquered. With the Spanish takeover of South America, it spread even further, because the Spaniards didn't want translators for every single local language. Again, a military invasion. And in the end, the language of the conquerors, Spanish, replaced Quechua to some extent.

English also comes with the conquering American mentality. Conquering through music, a huge film industry, fast food, and many other aspects. It is not the language itself that endangers others, but the associated culture.

This is not what an international language is for.

Lingua francas don’t kill other languages

As I have mentioned, many “lingua francas” emerged naturally in history, and rarely ever replaced or killed other languages. The trading language of the Mediterranean called “Lingua Franca” never replaced other local languages or dialects. Chinese outside of China, Malay, and Swahili, while being widely understood well beyond the borders of their native speakers, didn’t replace local dialects.

The goal needs to be precise: a “lingua franca” is an “alternative means of communication”. Not a new “unified world” thing. By the way, another name for such a language is “international auxiliary language”. Auxiliary.

I would even make the point that offering the world a constructed language as a lingua franca saves languages rather than endangering them. If you speak Cherokee today, you could learn an easy lingua franca alongside Cherokee, rather than having to study English. As things stand, some places – parts of Switzerland, as well as Finland – are switching fully to English, ditching their own languages, which I personally find catastrophic. To keep the diversity, we need very easy access to the lingua franca.

Enough said about existing languages. In the next part, we will focus on constructed languages: conlangs.

Overview of current electronics shortage in 3 minutes


Over the past year, there have been more and more reports of shortages in the supply of electronic components. What are the causes, and should we be concerned? Will it lead to higher prices, and will it last?

Let’s review the causes and what we can make of it.

Rising demand

Demand for electronics has been rising constantly for decades. One example, quite telling, is the demand for batteries:

Demand for batteries in the last decades

Obviously, this means that the supply chain has to grow accordingly, which is not always a given. And electronics are not the only ones suffering from supply chain problems: plastics are also in a strange condition right now.

The “work from home” drill

One of the first reasons for the shortage is a higher demand from consumers. With the pandemic and everyone switching to remote working last year, people had to buy extra computers (at least one for every member of the family) or upgrade existing ones (think about a better camera, a better processor or graphics card to deal with the video live streams, a larger screen on your desk at home since it has become your semi-permanent office, etc.).

This sudden demand created a spike in an already congested industry, hence a shortage. The problem is that such a spike should be only temporary, but it looks like the situation is not going to be resolved anytime soon. What is going on?

Toilet Paper

Remember the toilet paper shortages? Well, that’s pretty much what is happening with the electronics industry right now. Because people started being aware of the shortage and the potential for it to become long term, they have acted exactly as they did with toilet paper. Buy more. As soon as possible. Before it is too late.

So the initial hit on the demand is also worsened by panic buying. Of course, buying an extra computer is not as easy for many people as buying toilet paper, due to the price difference. So while the effect is felt within days for toilet paper, the time frame is counted in months for electronics.

The car industry

The car industry is one of the most demanding in terms of electronics: our cars are getting stuffed with more and more chips and gadgets, to the point where the car industry is hit very badly by the shortage. It is currently causing very heavy losses in sales in that sector. Going back to the first chart of this post, we can see that batteries for electric vehicles are mainly responsible for the almost exponential jump in demand. Any shortage in those immediately results in slower production.

Accumulation of incidents

There have also been two major fires in the industry (one in Japan, another one in Taiwan), which have worsened the shortage, especially for memory chips.

On top of that is the winter incident in Texas, which closed chip making factories for weeks.

In an already tense supply chain, any extra incident can bring a system to its knees. And the recovery is difficult since the supply was already not sufficient.

Note that the whole world depends on Taiwan for the supply of chips, which doesn’t make it very resilient.

Taiwan electronics – Source: TrendForce

Pandemic supply chain disruption

As I warned on my blog a year ago, at the beginning of the pandemic, Covid also disrupts supply chains, since productivity is impacted – when factories don’t close altogether. People now need to stay at home at the first sign of illness, whereas before, everyone would still go to work with a runny nose. Add to that mask-wearing, equipment disinfection, etc.

All this obviously slows down existing systems. And again, in a “just-in-time” production mode with rising demand, this can only cause shortages.

Raw material shortage

As we all know, our planet is not infinite. With such growth in demand, there must mathematically come a point where this never-ending trend exceeds the total resources of the planet.

An abandoned mine

Along with silicon, some rare metals are getting scarce, if not already close to exhaustion. Other metals and rare-earth elements will follow, without any doubt. There would be a lot to say on this topic, but I’m keeping it short for now. Recycling those rare metals is typically a very big challenge – some of them, once embedded in electronic components, can effectively never be recycled, since recovering them would require separation almost at the atomic level.

And the shortage of some metals is not so far away. Just look at “other industrial metals” in the following chart: there is a chance you’ll see some of them run short in your lifetime. And what then?

Shortages to come – what next? Source: Visual Capitalist


The current shortage has many causes. Some of them may be temporary, but others will undoubtedly be felt in the long term. Hopefully, as the prices of the rarest materials increase, alternative technological solutions will enable us to replace rare materials with more common ones. Or maybe we’ll find the missing germanium or palladium on the Moon or Mars…

An international language – living languages are BAD – Part 2

In this part, we will see why any living language makes a very poor candidate to be an international language. Yet, we do need a communication tool across the globe, so let’s see how existing languages can help – or actually create more problems.

In part 1, we saw that English is a very complex language. This complexity makes it very difficult to learn for many people on Earth. Besides, it is highly ambiguous. All these points make it a very bad “international language”. But it is not the only language with these flaws. In fact, I argue that, in general, no existing living language can or should become an international language.

You might wonder why we can’t safely pick an existing living language as the international language. After all, there are many alternative choices, some without many of English’s drawbacks, and already widely spoken across different populations: Spanish, Chinese, Hindi, Arabic, French, Swahili…

Obviously, the advantage of picking an existing language is the strong base of speakers who use it without any extra learning effort. This existing base helps the initial spread of any language, but it comes with strings attached.


The first point is a cultural and philosophical one. There is a symbolic meaning when the language of one culture is forced upon the world population, as is currently the case with English. Colonizers always imposed their language on their colonies. It clearly means: “You need to make the effort of learning my language, but I certainly won’t make any effort to learn yours. And that’s because I’m superior to you.”

Yup. That’s it. Racism at its worst.

So for this reason alone, any existing language as a “lingua franca” (a term I will use a lot in the future – meaning a common language) is simply a no-no. Should we still continue the discussion at this point? Maybe not. But just for the fun of it, there are other reasons why any existing “living” language is not a good candidate.

Languages are… a mess

There are many other practical reasons why living languages are totally inadequate to serve as an international language.

They have grown randomly

Language changes mostly through usage. There are many reasons why languages evolve. But they generally do when something is considered “inefficient” by the social group that uses it. And that fills the language with exceptions over time, which makes it more difficult to learn for people who were not born within that social group.


New words appear when new objects or concepts appear. This can be the case with technological, scientific, or philosophical advances, for instance.

Very often, languages also take “loanwords” from other languages. “Oh, this word doesn’t exist in my language, but it does in that other language. That’s fun! Let me borrow it!” In the meantime, another word from your language might have actually done the job quite well – but you didn’t think about it. Typically, in France, everyone speaks about a « week-end » while in Quebec it is a « fin de semaine ». On the other hand, what is « une job » in Quebec is « un travail » or « un emploi » in France.

English is full of loanwords from Latin and French, from the Middle Ages, especially nouns. For instance, almost all words ending in “ion” are French words (adoption, lion, explosion – etc.), spelled exactly the same, pronounced slightly differently. On the other hand, many verbs come from Germanic and Scandinavian languages: “to run” is “rennen” in German, “to drink” is “trinken” in German, etc. And the irregular verbs come directly from Germanic languages. In German, “to sing” is “singen”, also an irregular verb that becomes “singt – sang – gesungen”. Sounds familiar, doesn’t it?


The way syllables and morae are pronounced plays a huge role in the evolution of languages. Whenever something is lengthy, difficult to pronounce, or judged ambiguous, usage changes to correct this perceived fault.

“It is” becomes “it’s”. “Getted” becomes “got”. “Pronounciation” becomes “pronunciation”. “Logique” becomes “logic”. And “a apple” becomes “an apple”, because pronouncing two a’s in a row “breaks the flow” of speech. Almost all languages go through such changes, which have their own logic from the point of view of the social group using the language, and which occur most often on the words most used in daily life.

New meanings

Old words can also get new meanings, and before you know it, the vocabulary has changed quite a lot. Again, it is daily life and daily usage that affect the language most: common words are often transformed, whereas literary words change more rarely. An English “plate” is a French « assiette » (nowadays, the French word « plat » means “a dish”, but as an adjective it also means “flat”… which makes a lot of sense, doesn’t it?), and an English “tree” is a French « arbre » (whose root can be found in the English word “arborist”, for instance). But the English “sediment” is also the French « sédiment », and “vernacular” is also « vernaculaire ». Of course, there are many counter-examples, but the point is that, as a general rule, whatever is used most often is the most likely target for changes that “simplify” the language.

This phenomenon also causes exceptions to occur within the vocabulary and expressions that are mostly used daily by everyone.

They adapted locally

Languages also adapt to local circumstances. If you’re a tribe living in a hot climate and near the sea, you have very little use for the word “snow”. There is no snow in your environment, and you probably don’t even know what it is. So you don’t need a word for it. On the other hand, you certainly need to name very precisely every species of fish and sea animals, in order to know whether you’re speaking of a predator or prey, or whether that thing is edible or not. You probably also need specific vocabulary for water currents, tides, waves, wind, and other sea-related concepts, which can actually play a role in your survival.

However, when you live in the mountains up North, far from the sea, you don’t even know what the “horizon” looks like, you’ve got a mountain in front of you! However, you have plenty of snow, and it’s very critical and sometimes a matter of life and death that you describe precisely the type of snow that is on the ground today. You may need to describe accurately if it is icy, sticky, slippery, likely to cause avalanches, whether it covered animal tracks, etc.

If you’re a tribe of hunters, you don’t need the same vocabulary as farmers or breeders. A sedentary vs nomadic lifestyle also brings its own range of useful words. And an industrial world has other communication needs than a rural one.

Granted, this specialization and optimization for a specific environment make the language more practical and more precise for the people who use it. However, it is not at all adapted to others. Besides, it makes the language more difficult to learn, for no real advantage outside its original environment.

They also adapt culturally

Without going into too much detail, languages also adapt depending on the people’s culture and rituals. More religious or spiritual people will invent lots of words to describe their feelings, mystical events, and so on. A monotheist religion doesn’t bring the same vocabulary as an animist one (in which every being is suddenly brought to life, even a stone).

How much importance a culture gives to family, social ranking, and other relationships also makes the related vocabulary richer or poorer. Japanese notably has different forms of expression and vocabulary for women and men.

Fun fact: after WWII, many American soldiers learned Japanese from their girlfriends, and ended up speaking the language of women, which would trigger quite a laugh from Japanese men.

An “elder brother” is called differently in Korean depending on the speaker: a younger sister calls him Oppa, while a younger brother calls him Hyeong. European languages make no such distinction. Likewise, younger siblings get their own words in Korean, rather than combining “elder” or “younger” with “brother” as European languages do. And so on.

Ranges of sounds

Any given language has been living within a group of people over time. As a consequence, the sounds it uses have become specialized in such a way that they are very easy to distinguish within this social group. However, we are all different, and every society puts emphasis on different things. Because of that, every social group has come, generation after generation, to select different sounds as “different”. Linguists use various classifications to sort every sound into categories – labial, dorsal, plosive, fricative, and many others. Those categories indicate where and how the sound is produced, and with which organs (we don’t only use our vocal cords to produce sounds – the tongue, throat, lips, jaw, and larynx play huge roles as well).

Do you imagine an international language with tongue clicks, like in Xhosa (if you watch the video, notice how he pronounces Xhosa… can you do it?)?

If you’re not an African who speaks one of those languages with clicks (and there are many, especially in the south of Africa – Zulu has some too), probably not. If you’ve ever wondered why Japanese people can pronounce “sa”, “ki”, but not “si” (they say “shi” instead), the video above may have given you a clue. Did you notice how the white guy says: “I can make the click by itself, but I can’t do it with the vowel”.

What about a language that uses tones to change the meaning of words, like Mandarin and other Asian languages? If you are not a speaker of those languages, you simply can’t distinguish the different forms of the word ma: a mother (mā), a horse (mǎ), hemp (má), a grasshopper or “to scold/abuse” (mà), a question indicator or a pause (ma), among others. Because of this, the Chinese language allows for pretty cool tongue twisters, such as this one which tells a full story using only the syllable “shi” but with different tones:

As they have evolved within a closed community, living languages are simply too different for people raised in a different environment. This makes every single living language difficult for anyone who doesn’t speak it as a native tongue. It’s like asking a musician to learn a computer programming language – or a programmer to learn a musical instrument. I’m not saying it’s impossible, but it’s very difficult because there is so much to learn at once with skills that are hard to acquire as an adult.

Languages are rich – too rich?

So, definitely, the environment shapes languages. Does an international language need to go to such deep extremes? Certainly not. You can afford to be more descriptive when the situation requires it. This is not your daily tongue. It is an auxiliary one.

Consider the adjective “many” in English: it has a lot of synonyms – diverse, countless, copious, innumerable, manifold, myriad, numerous, plenty, several, various… What about “interesting”? Fascinating, engaging, intriguing, thought-provoking, inspiring, titillating, exciting, absorbing, enthralling, curious, captivating, enchanting, bewitching, appealing… and many others.

Granted, most of those adjectives have slightly different meanings; sometimes they are completely interchangeable. This variety helps us write better novels and poetry, avoiding repetition and conveying the exact concept we have in mind – that is, if the reader or listener knows the word… and associates the exact same nuances with it as the writer or speaker. Meanwhile, it adds a considerable amount of vocabulary to learn, for very little gain if you consider communication alone.

Languages move around

I just pointed out that communication only works “if the receiver understands the word the same way as the sender”. This becomes especially tricky when the same language moves from one location to another.

This can end up being extremely confusing. Consider some examples:

  • “I can easily jump out the window since I live on the first floor.” – In Britain, the first floor is one level above the ground (ouch); in America, it is the ground floor (no harm done).
  • “Let’s use a dummy to calm down the baby.” – A British dummy is a pacifier; an American dummy is an idiot, or a mannequin.
  • “Oh, you are a chemist?” – A British chemist runs a pharmacy; an American chemist works in a lab.
  • “Can you check the post, please?” – In Britain, the post is the mail; in America, a post is a pole – or something published online.
  • “‘Going to the bog?” – A British bog is the toilet; an American bog is a swamp.

Between the French spoken in Quebec and in France, we have quite a few false friends like these that can actually become extremely awkward. The same goes for Spanish spoken in South America vs Spain (think about “coger”, for instance, which is very normal in Spain but… well, don’t use it elsewhere, prefer “tomar” instead!).

An international language must be simple

If we want people to learn an extra language and be able to communicate, it has to be simple. Its vocabulary has to be limited so that we don’t need to learn tens of thousands of words to start communicating.

It also has to be as unambiguous as possible:

  • vocabulary must have a definite meaning for everyone, and it should avoid synonyms,
  • grammar must be clear and allow as few misunderstandings as possible,
  • sounds must be easy to pronounce and distinct for most people,
  • related to the previous point, it shouldn’t have homonyms: one sound, one meaning, and vice versa.

Besides, it has to offer all the flexibility needed to get as precise as the situation requires. Do I really want to convey “enthralling”? We can use a metaphor built from simple vocabulary: that is actually what dictionaries do to explain complex words, and it’s exactly what we do naturally when we struggle to find the exact word we mean.

However, “simple” is a relative concept. How many words should an international language have? 100? 1 000? 10 000? More? As we will see in the next parts, “too simple” can be very crippling. We need to find the correct balance between “simple but inconvenient” and “overly complex and hard to learn”.

We will see in the next part that many languages have served as “common languages”, and that new ones also appeared, especially in recent history.

An international language – English is BAD – Part 1

For a time, two centuries ago, the French language shone across the Western world and was spoken by most travelers and high society. During the 20th century, English gradually became the main international language. Yet this language is incredibly difficult to learn for many people on this Earth.

Of course, we do need a way of communicating across countries and cultures. Even more so since we can now communicate instantly with other people all over the world thanks to technology.

However, language can be a huge wall between different people in terms of communication. Not being able to communicate, not understanding someone fully, not understanding at all, or worse, a misunderstanding because of the language barrier, is extremely frustrating.

For those of you who have some time to spare while having a good laugh – and get my point in the meantime – you can watch this hilarious guy. Do we really need to learn all of that just to communicate? In any case, read on (this article is only part 1 of a series, so don’t miss the follow-up parts).

Is English really that difficult?

Although I was born in France, I’ve been lucky to have been exposed to English almost since birth, so I don’t mind speaking it. In fact, English serves me well, personally. However, not everyone is as lucky as I am.

Let’s face it: English is incredibly hard to learn, read, write, and speak for a large portion of the world’s human beings. It’s not even easy for natives!

Here is a story of a kid who is bilingual in Japanese and English, but who has been diagnosed with dyslexia, and has a very hard time with English. What if, instead of learning English, he had learned something less challenging? The fact is that, currently, everyone on Earth who wants to communicate with other parts of the world has to learn English. And English is a challenge for many.

English sounds

Pronouncing English sounds is very challenging for a large portion of non-native speakers (ever heard a French or a Chinese person struggling to get the sounds right?). Many speakers of other languages can’t hear the difference between some English sounds, especially the vowels – for instance, “sheet” and another word I will let you guess. 💩 Another one is “peace”. Yes, you got that one right too.


So, how do you expect people to pronounce things correctly when they can’t even hear and make a difference between the different sounds?

Besides, the pronunciation rules for written text are incredibly complex, to the point where, if you don’t know a word, you can’t know how to pronounce it. Think for instance of “thoroughly” and “through”, or the word “choir”. On top of that, the absence of strict rules for syllable stress makes things extremely challenging for non-natives: it’s “alias” but “akin”, “misnomer” but “mischievous”. And even a similarly stressed syllable doesn’t guarantee the same pronunciation: the stressed “a” of “alias” (/ˈeɪ.li.əs/) is different from that of “alibi” (/ˈæl.ə.baɪ/). For a learner of the language, these rules go on forever. You basically have to learn almost every single word.

Spelling rules

Accordingly, spelling is also a big challenge. If you don’t know a word, you’re often at a loss when it comes to writing something you’re hearing. What about “juggler” vs “jugular”, “able” and “abide”, etc.?

Do I also have to have the offence/offense of mentioning as an annex/annexe that the agonising/agonizing specter/spectre of the “z” (zed/zee) is always an unrivaled/unrivalled endeavor/endeavour when it comes to British vs American English?

And yes, the noun is “pronunciation”, even though the verb is “pronounced”.

And one little doubled consonant can make a whole difference.

Incidentally, a comma also changes everything.

Let’s eat, kids.

Let’s eat kids.

Grammar in general and exceptions

English grammar is incredibly complicated, and tenses are a mess. Who hasn’t struggled with “has had been”, even among natives?

There are exceptions everywhere. Verb conjugation of course. But more typically, prepositions are a headache to learn and can change the whole meaning of a sentence:

I took the statue in the garden. => it was in the garden, I’m taking it away

I took the statue into the garden. => I’m putting it in the garden

Exceptions always lurk around the corner:

The adjective for metal is metallic, but not so for iron.

Which is ironic.

Multiple meanings and homophones

A large proportion of English words have multiple meanings. And I do mean multiple! Think about the very common word “date”: a day in the calendar, a romantic encounter, a fruit, or “old-fashioned” as in “dated”. As a computer scientist working in AI, I have to note that this also has a terrible effect on the computerized processing of language: it is very hard to automatically translate the billions of texts written in English into other languages accurately.

Homophones are also all over the place. An ant is not an aunt, especially at a bizarre bazaar. “Wine and dine” sounds like “Whine and dine” but doesn’t mean precisely the same thing…

Word order

Of course, word order can be challenging for speakers of languages whose grammar orders words differently.

In English: I go to England.

In Japanese: I (the subject) Japan (destination) go.

In Irish: Go (I) to Ireland.

In Turkish: Turkey (to) go (I). Or: I Turkey (destination) go.

But no. I’m not speaking about those. Because no matter how you look at things, there will be differences, that’s the way languages are. And word order does matter, it is quite normal. As a game, you can put the word “only” anywhere in the next sentence, and get very different meanings:

She told him that she liked him.

Here are the results:

  • Only she told him that she liked him: nobody else told him that
  • She only told him that she liked him: she didn’t say anything else
  • She told only him that she liked him: he’s the only one to whom she said that
  • She told him only that she liked him: that’s all she said, and it sounds like she could have said more
  • She told him that only she liked him: she claimed she was the only one who liked him
  • She told him that she only liked him: that may be awkward. He may love her but she’s pointing out that he’s just a friend to her…
  • She told him that she liked only him: she doesn’t like anyone else
  • She told him that she liked him only: same meaning as the previous one

But English has more tricks that make far less sense.

Adjectives, for instance. Think about a woman who is: beautiful, tall, thin, young, black-haired, and Scottish. To speak correct English, you have to describe her with those adjectives in that exact order, and no other! She cannot be a young, tall woman. While this may come naturally to a native English speaker through lifelong exposure, it is a total headache for a non-native, who might very well think she is a “Scottish black-haired young thin tall beautiful woman”. It hurts, doesn’t it?

Word stress

Unfortunately, this also happens orally by stressing one word in particular. In that case, there is no real way of showing this when writing, except maybe by using italic or bold fonts. In the following sentence, stressing a particular word radically changes the global meaning and context:

I never said she stole my money.


  • I never said she stole my money: but someone else may have said it
  • I never said she stole my money: I would never do that!
  • I never said she stole my money: I just implied it, but never directly said it
  • I never said she stole my money: I didn’t point fingers at her as the culprit
  • I never said she stole my money: she may have borrowed it… or found my lost wallet
  • I never said she stole my money: but that she did steal someone else’s money
  • I never said she stole my money: she stole something else from me

Again, this is quite normal and exists in most languages, but it makes comprehension difficult: people expect you to pick up the difference in meaning between every single one of those sentences. Of course, context helps a lot here.


English can be highly ambiguous, and relies on context and/or “common sense” to interpret what is being said. But in an international context, you absolutely don’t want to rely on “common sense” since this can vary a lot depending on the culture. Consider:

My brother and I are getting married this summer.

What? Well, maybe not “to each other”.

What about:

The lady hit the man with an umbrella.

The only thing we can tell for sure is that it probably did hurt. But who actually had the umbrella in their hands remains unclear. “Their” in the previous sentence is actually singular.

I read the book.

Is this past or present?

English is a local language

English has many native speakers around the world. However, all those native speakers are speaking “their own version of English”. English is actually a local language. And it has its own dialects and cultural versions.

They actually have such different accents and vocabulary that some of them can’t even understand one another. Just picture a Scot and a Texan trying to communicate. That’s the challenge we inevitably face when reusing an existing language to make it an international one artificially.

Let’s not pretend English is mutually intelligible by anyone who learns it anywhere in the world.

To conclude

I think this incomplete list speaks for itself. Although it doesn’t technically speak, as it’s written text. Here is a last funny and well-known example and we’ll leave the subject at that:

Why is it that writers write, but fingers don’t fing, grocers don’t groce, and hammers don’t ham? If the plural of tooth is teeth, why isn’t the plural of booth beeth? One goose, 2 geese. So, one moose, 2 meese? One index, two indices? Is cheese the plural of choose?

So yes, let’s face it: English is a terrible international language. The most telling part is that people have invented “Simple English” or “Basic English” to try to reduce the difficulty. That is simply acknowledging that the language is too difficult in the first place – for no real benefit. Well, yes, it creates millions of jobs for English teachers and translators all around the world. But wouldn’t that energy be better spent on something other than trying to fit a square peg into a round hole?

Here’s a link for those of you who are not afraid to go down the rabbit hole.

In the next part, we’ll explore why any other living language is a bad candidate for an international language. Then, we’ll see alternative languages that have emerged as “common languages” between groups that spoke different languages but needed to communicate – and why none of them make a good international language. And then I’ll suggest something else.


Why you should never use the type “char” in Java

The post title may be blunt. But I think that after reading this article, you will never use the type “char” in Java again.

The origin of type “char”

At the beginning, everything was ASCII, and every character on a computer could be encoded with 7 bits. While this is fine for most English text, and can also suit most European languages if you strip the accents, it definitely has its limitations. So the extended character table came, adding a whole new range of characters to ASCII, including the infamous character 255, which looks like a space but is not one. And code pages defined how to display the characters between 128 and 255, allowing different scripts and languages to be printed.

Then, Unicode brought this to a brand new level by encoding characters on… 16 bits. This was around the time Java came out, in the mid-1990s. Java’s designers therefore decided to encode Strings with 16-bit characters. A Java char has always been, and still is, encoded on 16 bits.
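As a quick check, that 16-bit design is directly visible in the standard library (the class name here is just for illustration):

```java
public class CharIs16Bits {
    public static void main(String[] args) {
        // A Java char is a 16-bit UTF-16 code unit, by definition.
        System.out.println(Character.SIZE);   // 16 (bits)
        System.out.println(Character.BYTES);  // 2  (bytes)

        // Any character from Unicode's original 16-bit range fits in one char.
        char cedilla = 'ç';                   // U+00E7
        System.out.println((int) cedilla);    // 231
    }
}
```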

However, when integrating large numbers of characters, especially ideograms, the Unicode team understood 16 bits were not enough. So they added more bits and notified everyone: “starting now, we can encode a character with more than 16 bits”.

In order not to break compatibility with older programs, Java chars remained encoded with 16 bits. Rather than redefining a “char” as one full Unicode character, Java’s designers kept the 16-bit encoding and exposed the new Unicode concepts on top of it, such as “surrogate” chars, which indicate that a given char is not a character by itself, but one half of a pair encoding a character beyond the 16-bit range.

Character variations

In fact, some characters can be thought of in different ways. For instance, the letter “ç” can be considered:

  • either as a full character in its own right (this was the initial stance of Unicode),
  • or as the character “c” with a cedilla “¸” applied to it.

Both approaches have advantages and drawbacks. The first is generally the one used in linguistics; some languages even treat double characters as a single letter, such as the Spanish “ll”, which is considered a letter in its own right, separate from the single “l”.

However, this approach is obviously greedy with code points: every single possible variation of a character must be assigned its own number. To someone familiar only with English this might seem like a moot point, but Vietnamese, for instance, uses many variations of these appended “thingies”. The single letter “a” alone comes in all of these forms: aàáâãặẳẵằắăậẩẫầấạả. The same goes for the other vowels and for some consonants, and of course for the capital letters. And that is only Vietnamese.

The second approach has its virtues too, for instance when transliterating text into ASCII: transliteration becomes a simple matter of removing the diacritics. And when typing on a keyboard, you cannot possibly assign one key to every single variation of every character, so there the second approach is a must.
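As a small sketch of that second approach (the class and method names here are mine, for illustration): Java’s java.text.Normalizer can decompose each character into its base letter plus combining marks (NFD form), after which the marks can simply be stripped.

```java
import java.text.Normalizer;

public class StripDiacritics {
    // Decompose each character into its base letter plus combining marks (NFD),
    // then remove the marks; the regex \p{M} matches Unicode combining marks.
    static String strip(String s) {
        return Normalizer.normalize(s, Normalizer.Form.NFD).replaceAll("\\p{M}", "");
    }

    public static void main(String[] args) {
        System.out.println(strip("ça")); // prints "ca"
        System.out.println(strip("ặẳẵ")); // prints "aaa"
    }
}
```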

Special cases: ideograms

When it comes to ideograms, there is also a small set of “radicals” (roughly 200 for Chinese), which are combined to form the tens of thousands of ideograms we know.

Breakdown of a Chinese word into character, radical and stroke (credit: Nature).


It would be feasible to represent any Chinese character by a combination of radicals and their positions. However, it is more compact to list all possible Chinese characters and assign a number to each of them, which is what Unicode did.

Korean Hangul

Another interesting case is Hangul, which is used to write Korean. Every character is actually a combination of letters and represents a syllable:

Hangul characters are syllables that can be broken down into individual phonemes.


So, in some cases, it is easier to assign a number to each individual component and then combine them (which is what happens when typing Korean on a keyboard). There are only 24 letters (14 consonants and 10 vowels). However, the number of possible syllable combinations is very large: more than 11 000, although only about 3 000 of them form correct Korean syllables.

Funny characters

People, especially on social media, use an increasing number of special characters, emojis, and other funny stuff, from 𝄞 to 🐻. Those have made it into Unicode, thus making it possible to write ʇxǝʇ uʍop ǝpısdn, 𝔤𝔬𝔱𝔥𝔦𝔠 𝔱𝔢𝔵𝔱, or even u̳n̳d̳e̳r̳l̳i̳n̳e̳d̳ ̳t̳e̳x̳t̳ without the need for formatting or special fonts (everything above is made of standard Unicode characters, with no special fonts or images). Even the flag of every country in the world has made it into the Unicode standard, encoded as a pair of “regional indicator” characters.

This plethora of new characters, which arrived late into the standard, often requires more than 16 bits of encoding.

Using type “char” in Java

When using the type “char” in Java, you accept that things like lone surrogates and invalid characters will be thrown at you because, remember, a char is encoded with 16 bits. So when calling “𝄞”.toCharArray() or iterating through this String’s chars, Java will hand you two chars that do not exist as characters on their own:

  • \uD834
  • \uDD1E

Neither of those chars is a legal character on its own; they only exist as a pair (a “surrogate pair”).
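You can check this for yourself; here is a small sketch (class name mine) that prints the two chars Java hands you for this single character:

```java
public class SurrogatePair {
    public static void main(String[] args) {
        char[] chars = "𝄞".toCharArray(); // U+1D11E, MUSICAL SYMBOL G CLEF
        for (char c : chars) {
            // Each half of the surrogate pair, printed as \uXXXX
            System.out.printf("\\u%04X%n", (int) c); // \uD834 then \uDD1E
        }
        // Character.isSurrogate confirms neither char stands on its own
        System.out.println(Character.isSurrogate(chars[0])); // true
    }
}
```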

Bottom line: when it comes to text, chars shouldn’t be used. Ever. As a Java developer, you have probably learned that, unless you are doing byte-level operations, you should never use String.getBytes() and should use chars instead. Well, with the newer Unicode standards and the increasing use of characters above 0xFFFF, when it comes to Strings, using char is as bad as using byte.

Java type “char” will break your data

Consider this one:

System.out.println("𝄞".length());

What do you think this prints? 1? Nope. It prints 2.

Here is one of the consequences of this. Try out the following code:

System.out.println("𝄞1".substring(1));

This prints the lone low surrogate \uDD1E followed by “1” (typically rendered as a question mark or a broken glyph), which might have surprised you before reading this blog post. But after reading it, this makes sense. Sort of.

Because substring() actually works on chars and not code points, we are cutting through a String encoded this way:

\uD834 \uDD1E  \u0031
\___________/  \____/
      𝄞           1

It is amazing that a technology such as Java hasn’t addressed the issue in a better way than this.

Unicode “code points”

Actually, it is a direct consequence of what was done at the Unicode level. If you tried to break the character 𝄞 down into 16-bit chunks, you wouldn’t get valid characters. But this character is correctly encoded as U+1D11E. This is called a “code point”, and every entry in the Unicode character set has its own code point.

The downside is that a single visible character may be represented by several different code point sequences.

Indeed, the character “á” can be either of these:

  • the Unicode letter “á” on its own, encoded with U+00E1,
  • the Unicode combination of the letter “a” and its diacritic “◌́”, which results in the combination of U+0061 and U+0301.
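A small sketch of this (class name mine): the two encodings of “á” are not equal as Strings, but normalizing them, here to composed form (NFC) via java.text.Normalizer, reconciles them.

```java
import java.text.Normalizer;

public class TwoEncodings {
    public static void main(String[] args) {
        String single   = "\u00E1";   // "á" as one code point
        String combined = "a\u0301";  // "a" + combining acute accent
        System.out.println(single.equals(combined)); // false: different code points
        // Normalization Form C composes "a" + accent back into the single code point
        String recomposed = Normalizer.normalize(combined, Normalizer.Form.NFC);
        System.out.println(recomposed.equals(single)); // true
    }
}
```
This is why text should usually be normalized to one form before comparing or storing it.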

Java code points instead of char

A code point in Java is a simple “int”, which corresponds to the Unicode value assigned to the character.

So when dealing with text, you should never use “char”, but “code points” instead. Rather than

“a String”.toCharArray()

use

“a String”.codePoints()

Instead of iterating on chars, iterate on code points. Whenever you want to check for upper case characters, digits or anything else, never use the char-based methods of class Character or String. Always use the code point counterparts.

Note that this code will silently mishandle some Unicode characters:

for (int i = 0 ; i < string.length() ; i++)
   if (Character.isUpperCase(string.charAt(i)))
        ... do something

This will iterate through values that are NOT characters, but UTF-16 “code units”, which are possibly… garbage.
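The code-point-based equivalent is safe. A minimal sketch (class name and sample string mine):

```java
public class CountUpperCase {
    public static void main(String[] args) {
        String s = "Ab𝄞C";
        // codePoints() yields one int per Unicode character, so the two
        // surrogate halves of 𝄞 are never examined individually
        long uppers = s.codePoints()
                       .filter(Character::isUpperCase)
                       .count();
        System.out.println(uppers); // prints 2 (A and C)
    }
}
```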

Inserting data into a database

Consider a simple relational table to store unique characters:

id (primary key)        int(11)
c  (unique constraint)  varchar(4)

Now imagine your Java program inserts unique characters into the column “c” of this table. If it works with chars, the program will treat two lone surrogates as two different characters, since their codes differ; but those are not valid Unicode on their own, so the database will end up storing strange things. Sooner or later the unique constraint kicks in and crashes your program, and invalid Unicode may well have been pushed into the table along the way.
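One way to guard against this (a sketch; the helper name is mine) is to reject malformed Strings before they ever reach the database:

```java
public class WellFormedCheck {
    // Returns false if the String contains an unpaired surrogate,
    // i.e. data that is not valid Unicode text
    static boolean isWellFormed(String s) {
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (Character.isHighSurrogate(c)) {
                if (i + 1 == s.length() || !Character.isLowSurrogate(s.charAt(i + 1))) {
                    return false; // high surrogate without its partner
                }
                i++; // skip the low surrogate we just validated
            } else if (Character.isLowSurrogate(c)) {
                return false; // low surrogate with no preceding high surrogate
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isWellFormed("𝄞"));                 // true: a valid pair
        System.out.println(isWellFormed("𝄞".substring(0, 1))); // false: a lone half
    }
}
```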

Alternative replacements

String.toCharArray() → String.codePoints() (append .toArray() to get an int[])
String.charAt(pos) → String.codePointAt(pos)
String.indexOf(int/char) → String.indexOf(String)
iterating with String.length() → convert the String into an int[] of code points and iterate on that
String.substring() → make sure you never cut between the two halves of a surrogate pair, or work on an int[] of code points altogether
String.replace(char, char) → String.replaceAll(String, String) and the other String-based replace methods
new String(char[]) / new String(char[], offset, count) → new String(int[] codePoints, int offset, int count)
the Character methods taking a char → the Character methods taking an int code point
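For substring in particular, here is a sketch (helper name mine) of cutting by code point indices rather than char indices, so a surrogate pair can never be split:

```java
public class CodePointSubstring {
    // Substring expressed in code point indices; offsetByCodePoints
    // translates them into char indices without splitting surrogate pairs
    static String cut(String s, int beginCp, int endCp) {
        int begin = s.offsetByCodePoints(0, beginCp);
        int end   = s.offsetByCodePoints(begin, endCp - beginCp);
        return s.substring(begin, end);
    }

    public static void main(String[] args) {
        System.out.println(cut("𝄞1", 0, 1)); // prints the clef, not a lone surrogate
        System.out.println(cut("𝄞1", 1, 2)); // prints "1"
    }
}
```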

Covid in France – Update as of 10 December 2020


At a time when disinformation is everywhere and statistics are being manipulated in every direction, it is important to start from reliable sources. For those wondering where my analyses come from: I take the INSEE death records, which have this format, and turn them into curves and maps:

Full name - date and place of birth - date and place of death

See this initial post, and also this post for the more curious and to guarantee transparency about how the data is processed. Reproduce what I do at home to verify it; the software is available! This analysis is based on these files and nothing else. They have the advantage of being hard to falsify.

All curves are smoothed over 10 days, and each year is weighted by the total population of France, so as to avoid any bias from that side.
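The 10-day smoothing is a plain moving average. Here is a minimal sketch of the idea (an illustration only, not the author’s actual tool, which is linked above):

```java
import java.util.Arrays;

public class Smoothing {
    // Smooth a daily series with a moving average over `window` days
    static double[] smooth(double[] daily, int window) {
        double[] out = new double[daily.length];
        for (int i = 0; i < daily.length; i++) {
            int from = Math.max(0, i - window / 2);
            int to   = Math.min(daily.length, i + window / 2);
            out[i] = Arrays.stream(daily, from, to).average().orElse(daily[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        // A noisy made-up daily death count; the smoothed curve flattens the noise
        double[] deaths = {100, 300, 100, 300, 100, 300, 100, 300, 100, 300};
        System.out.println(Arrays.toString(smooth(deaths, 10)));
    }
}
```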

The maps presented here show the difference in mortality between 2020 and the years 2015-2019, over the interval 15 March-10 May for the first wave and 10 October-10 November for the second (the latter is not over, but the data after 10 November is too incomplete to be taken into account).

Where do we stand?

For those interested in the numbers, I retrieved the deaths file for November yesterday (all causes combined, therefore). It roughly covers mid-October to mid-November, but it stops being reliable from early November onwards, where data starts to go missing.

The “there-is-no-second-wave” crowd can pack it in. Today’s virus is no longer exactly the same as in March, but it is a close relative. Do we speak of a “January flu” and then a “March flu”? No, it is still the flu!

Here, we are clearly dealing with a virus that does not behave the way a flu does. More sudden. More localized. For while the flu strikes very wide areas (often covering at least an administrative region) over relatively long periods (several months), Covid seems to be far more localized and more sudden.

At the national scale

The curves at the national scale speak for themselves.

We do indeed observe a “second wave”, which for now does not really exceed the 2016-2017 flu (the deadliest epidemic in at least 20 years, before Covid) at the national scale.

A small reminder: the data is not complete, particularly at the end of the curve. We also observe non-negligible pressure on mortality between August and October compared to previous years. Note that this curve is weighted by the total population of each year, so this pressure is not due to population growth. Will the curve come back down without climbing much higher? Probably, if the data currently provided by the Government is correct.

And locally?

Looking at the national trend is all well and good, but locally, some curves show that this is no “little flu”. Here are a few examples.

In many départements, mortality clearly surges far more brutally than with a flu, compared to the previous years (2015-2019).

The epidemic in the départements hit hardest by the second wave. Click to zoom.


We observe that the départements hit hard by the second wave were, overall, largely spared during the first (apart from the Rhône). The Bouches-du-Rhône, home base of Pr. Raoult, is not particularly spared. Obviously, not all hospitals in that département follow the Professor’s protocol (while hospitals in other départements may very well follow it), so no conclusion can be drawn from this.

What is also interesting is that the départements hit hardest during the first wave seem, for the most part, to be spared by the second:

Départements hit by the first wave and spared by the second.


Here again, it is hard to say why; the reasons may be manifold. Some degree of collective immunity. More “responsible” behavior from people who were hit and lived through the first wave. Or simply the fact that the virus really does strike locally and therefore does not really return where it has already been… for now.

And on a map?

Seen as a map, excess mortality was very localized during the first wave:

First wave (15 March – 10 May). The greener, the higher the mortality in 2020 compared to 2015-2019.


So it was mainly the Paris region, Alsace and Lorraine that were hit, along with the Rhône to a lesser extent.

For the second wave:

Second wave (10 October – 10 November). Very localized in Auvergne-Rhône-Alpes, with a few clusters in other regions.


When we overlay the two maps, the difference in location is plain to see:

The two waves: the first in green, the second in red. In brown, the regions hit by both.


Overall, the west and the south-west were largely spared by both waves. Climate? Humidity? Wind? Temperatures? Luck? Less economic activity and therefore fewer exchanges with potential carriers? Hard to say.

And the toll for the whole year?

The year is not over, but its end is near, and a trend is emerging very clearly. Waves are all well and good, but would a “spread-out flu” happen to have the same overall effect as a “sudden Covid”? In other words, does Covid kill more or fewer people than the flu? A flu which, incidentally, has officially disappeared completely this year…

The number of deaths this year (up to mid-November 2020) stands at 595,940. That is the death toll over 10.5 months. Extrapolating, this should bring us to at least (the trend in winter, and thus in November-December, generally being upwards): 595,940 × 12 ÷ 10.5 ≃ 681,074.

That is still a minimum increase of 9.6% over 2019 (probably more, but we will only really know around mid-February next year), to be compared with increases of 0.4%, 0.5% and 2.2% in the preceding years (2.2% being the 2017 flu). How to explain this jump, if not by a virus more virulent than its predecessors? A “baby-boomer” age-pyramid effect? Maybe. What do you think?

Number of deaths per year over the last 10 years. 2020 is a highly plausible extrapolation from the 01/01/2020-15/11/2020 data.

Who is dying?

Here are the mortality curves by age group.

Very clearly, people over 70 are the most affected, and to a lesser extent the 50-70 age group too. Were they all bedridden when they died? You be the judge. Did they have little time left to live? For some, surely. All of them? I doubt it.

As for the youngest, they are visibly spared to a large extent (although these are only statistics; careful, the lottery has winners too…):

As a reminder, when an elderly person, your parents, your grandparents, catches Covid, it does not happen by spontaneous generation. It means they were in contact with someone who was infected.

A society exists to protect its most fragile members. Otherwise, we are back to the law of the jungle.

A terrifying pandemic?

An extra 10% mortality is far from negligible. Still, this is not the plague or cholera either.

The comparison with the Spanish flu is a good example all the same. There is one big difference though: the Spanish flu thrived on populations whose immune systems were exhausted by malnutrition in the wake of the First World War, whereas Covid feasts on diseases linked to another kind of malnutrition and to poor lifestyle habits. But the pattern seems to be the same.

Moreover, it is interesting to see that in well-managed countries there is little or even no excess mortality at all. Early testing, contact tracing, use of medicines to prevent severe cases (more on that below). None of this was implemented effectively here. So no, we are far from the “devastating virus” that was sold to us with the terrifying images from Wuhan, reminiscent of the worst apocalypse movies.

Not just mortality

That said, let us not forget that I am only analyzing mortality here. Covid is many other things besides.

In the absence of treatment, it also means overloaded hospitals, which means delayed or even cancelled care for many patients who needed it: transplants, heart surgery, cancers, accidents, and so on. And therefore more deaths in the long run.

But it also means (and this is not talked about enough) many people who “pull through”, but at what cost? Irreversible organ damage, sudden myocarditis, cognitive problems, lifelong loss of taste and/or smell or sometimes even hearing, not to mention those (rare, but they exist!) who struggle through being ill for months.

Even if they survive for now, all of these people are left behind by our sick society. And they are not counted anywhere.

So yes, “something” must be done; we cannot let the virus spread the way we usually let ordinary flu spread. Before Covid, how many times did you go to work with “a bit of a cold”, how many times did you sneeze or cough with nothing in front of your mouth in the middle of other people, how many times did you “catch some bug” after contact with obviously sick people? How many times a day did you wash your hands before Covid?

Lockdowns, masks, barrier measures and all that…

One may question the usefulness (or uselessness) of these measures. I know a school of more than 1,000 pupils very well where simple but strict rules were applied: masks, regular hand washing, routine disinfection, keeping people from close contact, limiting interactions between classes, immediate quarantine of the closest contacts whenever a Covid case was detected, opening the windows, and so on. All of this while still respecting individuals.

In that school, Covid cases did not spiral out of control and did not infect a large number of people, as could very well have happened in places where, whatever you do, a large number of people are in frequent contact. Luck? Maybe. I still tend to believe we are all the makers of our own luck. 🙂

Does that mean we should all isolate ourselves? No. The school in question is open and running at nearly full capacity. It is simply very well managed, and the people there have adopted a responsible attitude. Countries like South Korea, Norway, Germany and many others have shown that well-managed countries can perfectly well contain the virus without necessarily resorting to long-term forced lockdowns of their entire population.

On the other hand, when you see packed metro cars here while open-air ski slopes are forbidden, you have to wonder about the measures put in place and about the real intentions of those in charge…

The medical response

The intentions of those in charge… let us talk about those, precisely.

Not a flu…

We are dealing here with a disease that is clearly “nastier” than a simple flu. How do you treat a flu? Bed rest, chicken broth, rice, carrots, and aspirin or paracetamol depending on preference. And in case of serious respiratory problems, hospitalization. That is exactly what the Government did with Covid.

A serious mistake! That policy is what saturated the hospitals, with high excess mortality.

Yet there are ways to prevent Covid, or at least to avoid reaching hospitalization for a large share of patients. That said, it is not the only disease for which solutions exist and go unused. Hunger kills 9 million people a year worldwide; there is a very simple remedy, and relieving the richest of a tiny percentage of their wealth would suffice to provide it… but back to Covid, which is still far from 9 million deaths for now.


To begin with, as prevention, many doctors recommend exactly what I was advising as early as March (and I am not a doctor!): vitamin D (extremely effective against respiratory infections), vitamin C, and zinc, all easy to find and with next to no side effects. Cheap, low-risk and simple. Why not offer this to all French people, particularly those most at risk? Even before other treatments were known, it was a simple first step.

Then came other medicines, tested by many doctors. For example azithromycin, with very convincing results, for a near-nonexistent risk, since it is a drug commonly used to treat other infections. Cheap. Effective. What more could you ask for? The doctors who tried it (successfully!) were called to order… by their Order, precisely. Better to test remdesivir, spending millions on a drug that had not proven its effectiveness (we now know it is not effective) and that is toxic on top of it (kidney problems).

There is also chloroquine, used in many other countries, apparently with success, even if its effect is not miraculous. It at least has the advantage of having been known for a long time.

Other medicines also fight Covid very effectively, with double-blind clinical trials to back them up. That is the case of ivermectin. Cheap. Used for 40 years without problems. But no, nobody talks about it.

Hush. Better to test a vaccine that will cost billions… whose large-scale effectiveness is unknown, let alone its long-term effectiveness, which triggers allergic reactions, and whose medium- and long-term side effects are unknown. They are taking us for fools!

Economic impact

As I described back in March, Covid is causing an unprecedented economic crisis. In France, there is even talk of -20% growth over the year.

The Great Reset

Now, the financial system cannot survive without growth. So a “great reset” is being prepared for us. And this is not a “conspiracy theory”: it is announced and publicly supported by the World Economic Forum, which regularly gathers the powerful of this world. The culprit that makes this “reset” necessary? The virus, of course.

In reality, this “reset” is merely the logical outcome of a financial system built on growth… growth that has already been gone for a good decade. The financial system was already collapsing despite the desperate attempts of the European Central Bank to keep it afloat by massively injecting money into the banking system, efforts it is currently redoubling. It would have collapsed even without Covid. Are we going to let a new society be built by the very people who built the one that has already led to the destruction of all public infrastructure and to the growing inequality we know today?

Change: our move!

Yet other solutions exist: free currency (monnaie libre), participatory democracy. Now is the time for each of us to look into these initiatives, because those who claim to be “leaders” cannot impose a new model without our consent. Consent they will try to obtain at any price, through brainwashing via the media they own, and through intimidation.

Here, information and knowledge are power.

When society has to be rebuilt after a crisis, it is better to be informed and able to sit back and think. Yet today’s atmosphere is the exact opposite:

  • fear, a very bad advisor, in the form of the virus, but also of the masks, police repression, the infiltration of violent black blocs into peaceful demonstrations, the anxiety-inducing loop of media coverage, the intimidation by the Ordre des Médecins, and even the confinement of dissidents in psychiatric wards…
  • emotion, also a very bad advisor, through lockdown, massive job losses, the relentless passing of ever more freedom-killing laws that always favor the most privileged at the expense of everyone else…

As much as possible, we must collectively refuse to give in to these emotions… easier said than done.

To conclude

Is Covid the worst scourge humanity has ever known? Obviously not. Covid kills, moderately, more than a flu, particularly when it is not treated properly.

And yet simple treatments exist, and they are systematically pushed aside by governments. Mere incompetence? Maybe. Still, the whole thing reeks of corruption. It looks as though the strategy of fear is being used to muzzle and control a population that could not be controlled otherwise.

In any case, it is a dream opportunity for those governing to cool the ardor of protesters who keep growing in number all over the world. Have you heard about the hundreds of millions of Indians currently rising up? Shhh.

When the day comes, let us remember that there is an economic crisis only because the entire financial system is based on enriching a minority at the expense of the majority, through bank interest charged to repay money created out of nothing. Let us also remember the slogans of economic growth brandished for decades, and the neo-liberal doctrine of “privatize and deregulate everything”. These two factors create over-powerful multinationals, more powerful than many states, while infrastructure for the common good is neglected; and without hospitals, Covid wreaks havoc.

Conversely, let us be wary of believing that a (Chinese-style) Government will solve all our problems. As long as we have no control over the powers we delegate, to anyone, we will be taken advantage of. An obvious truth that bears repeating from time to time.