West Miami, Florida

West Miami is a city in Miami-Dade County, Florida, United States.