The human body has a remarkable ability to heal even serious injuries. Most minor injuries heal without leaving a scar, but some wounds are severe enough that, although they do heal, scarring remains. Scars can be unsightly, and depending on where they are located, they can have an impact on …