In humans, we tend to call behaviour emotional when we observe certain facial expressions, such as smiling or looking scared. We can see those expressions and interpret them correctly; they are part of the nonverbal communication that people use to transmit relevant information to one another in a conversation.
From that point of view, the answer is yes: emotional robots could show emotions in a similar way to how we do, and we could interpret those emotions correctly, just as we usually do in our social lives.
Emotional robots may be able to communicate with us in ways we intuitively understand, for example by showing a sluggish walk when their battery needs recharging, instead of a confusing panel of lights and beeps. The ultimate goal is not necessarily to create robots that can fall in love or fulfill all our human emotional needs, but to build machines that can interact with us in a more human way, rather than requiring us to behave more like machines.
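The idea of expressing internal state through behaviour rather than indicator lights can be sketched very simply. The snippet below is a minimal, hypothetical illustration, not an actual robot API: the function name, thresholds, and behaviour labels are all assumptions made for the example.

```python
# Hypothetical sketch: map a robot's internal state (here, battery
# level) to an expressive behaviour a person can read intuitively,
# instead of a panel of lights and beeps. All names and thresholds
# are illustrative assumptions, not a real robot's API.

def expressive_behaviour(battery_level: float) -> str:
    """Return a gait label for a battery level in the range [0, 1]."""
    if battery_level < 0.15:
        return "sluggish_walk"  # reads as "tired": time to recharge
    elif battery_level < 0.40:
        return "slow_walk"      # reads as "getting tired"
    return "normal_walk"        # reads as "fine"

# A low battery produces the "tired" gait a human recognises at a glance.
print(expressive_behaviour(0.10))  # sluggish_walk
```

The point of the mapping is that the observer needs no manual: a sluggish walk already means "tired" to us, so the robot's state is communicated in human terms.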
Giving robots this capability would be useful in a wide variety of human situations. For example, emotional robots could help children with autism improve their quality of life, act as assistants for teachers in the classroom, or serve as companions for older people.
In this video, the emotional robot Aisoy1 shows its facial expressions for the following emotions: sadness, happiness, anger, surprise, disgust, relief, reproach, pride, admiration and fear.