This is one habit I made sure my kids didn't pick up. I explained that putting your fingers in your nose introduces germs into your body, so you should use a tissue if you need to. I just hate seeing anyone dig in their nose; it makes me feel a little sick when I see it.
Is this one of those things you can ignore? Are you like me and can't stand it? Or do you fall somewhere in the middle?