I'm watching this "day in the life of..." Japan series, and Japan just looks superior. I live in New York City, and it is nothing like this. Nothing. No comfy udon shops. The service workers don't seem to care about you at all, and there's no sense of unity or tradition.
Give it to me straight: is Japan just better? Is life just BETTER in a calm, homogeneous place? People seem to enjoy themselves more there. Or is this grass-is-greener thinking?