Not all of these features are required for intelligent life.
Part of the difficulty with this question is defining intelligent life. I think Wikipedia offers a good starting definition of intelligence:
Intelligence has been defined in many different ways such as in terms of one's capacity for logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity and problem solving. It can also be more generally described as the ability to perceive and/or retain knowledge or information and apply it to itself or other instances of knowledge or information creating referable understanding models of any size, density, or complexity, due to any conscious or subconscious imposed will or instruction to do so.
It appears from this definition that at least emotion and self-awareness are required. I don't think that's true; it's a very human-centric view of intelligence.
Self-Awareness
I would argue that a hive mind, a collection of cooperative individuals without self-awareness, can also be called intelligent. Even if we have no current example of one, there is no reason it cannot exist. Intelligence itself seems to be an emergent quality: no single set of neurons in a human brain is intelligent; intelligence arises from the interaction of them all working together. We already see such emergence from groups of individuals, like an ant colony. Though a colony may not be considered intelligent, it's not difficult to imagine the effect being scaled up to something we would recognize as intelligence (then again, it may just be a Chinese Room, but that's a whole different discussion about intelligence). So a hive mind could have emotion, be non-religious, and lack self-awareness, yet still be intelligent. It could also lack emotion, though that seems less likely for a biological entity.
Emotion
I believe that in order to have emotion, a mind must have a body; this follows from the James-Lange theory. It's why I don't believe artificial intelligences will ever be the angry, homicidal beings that fiction makes them out to be. An artificial intelligence, then, would be self-aware and non-religious, would lack emotion, but could still be intelligent.
Religion
I don't see any case where religion is required at all (I think it also requires emotion, so I don't think it even belongs in a fundamental list), not even for the evolution of human intelligence. Its prevalence in human civilization seems more likely a remnant of the suspicion and superstition that helped us survive early in our evolution. Suspicion gave us the wariness to avoid predators, along with an overactive pattern recognition that faces almost no evolutionary counter-pressure. Superstition let us follow the wisdom of not eating the red berries without understanding why we shouldn't eat them. But belief in a higher power doesn't specifically provide an evolutionary advantage; all of its benefits can be had without that side effect. Humans are capable of morality, love, awe, wonder, and altruism without religion; secular humanism is a good example. I suppose that could be called a type of spirituality, but it should not be confused with belief in a higher power, which the term religion implies.