To the best of my knowledge, there is no such glyph in Unicode. Consequently, the solution is to "simulate" the look of the notation, and this is tricky.

Fonts in Math are selected according to the context of use. They are defined in `Format` > `Fonts…`. The "common" contexts are configured in the **Formula Fonts** part of the dialog. Leave those alone so you don't disturb normal Math operation (unless, of course, you want to customise them). The bottom part, **Custom Fonts**, offers three user-selectable contexts.

Choose one of them, say *Serif*, and set it to the font to be used for your normal distribution symbol (which context you pick does not matter by itself).

I suggest the use of some script font (I kind of remember *Zapf Chancery* might look like the picture in the question).

When back in edit mode, the markup for the formula above is `font serif {N} (%mu, %sigma^2)`.

The keywords for the other user contexts are `sans` and `fixed`.
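For reference, a sketch combining all three custom contexts in one formula (assuming you have assigned a script font to *Serif* as suggested above; the other two contexts keep whatever fonts you configured for them):

```
font serif {N} (%mu, %sigma^2) newline
font sans {A} newline
font fixed {B}
```

Each `font <context> {…}` group switches the font only for the text inside the braces, so the rest of the formula keeps the standard Math fonts.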

*To show the community your question has been answered, click the ✓ next to the correct answer, and "upvote" by clicking on the ^ arrow of any helpful answers. These are the mechanisms for communicating the quality of the Q&A on this site. Thanks!*

*In case you need clarification, **edit** your question (not an answer) or **comment** on the relevant answer.*

Eh, is the N character absent on your keyboard or what is the problem?

Eh, can't you see the difference between the look of the N in "N character" and the N in the formula, or what?

But it is still N, isn't it? If you want some fancy typeface, follow the answer given.

@gabix "But it is still N, isn't it?"

No, it isn't: when you do math, everything you write has meaning; you cannot drop in a "normal" N instead of a "calligraphic" N, as they mean completely different things.

The Wikipedia article referenced by the asker says that both a calligraphic N and a plain N are possible, as far as I can understand:

> The normal distribution is often referred to as N(μ, σ²) or 𝒩(μ, σ²).[6]

https://en.wikipedia.org/wiki/Normal_...

@gabix Often people use "normal" characters out of convenience. Sometimes it is widely accepted. In my field, the length unit μm is often written as um. It disturbs me; μm looks better. Similarly, I want my formulas to look better. As a user, I was wondering if there is any way to do this. Questioning this is meaningless. I don't even need a reason to ask for this. Maybe I will invent a reason. Maybe I am working on a notation system. It doesn't matter.