Where have all the good Doctors gone?

Is it just me, or are most of the doctors left in this country generally uncaring toward their patients? Is there one left who actually cares and doesn't just put on a face while you're in the room? The doctors I've gone to tell me they care while I'm there explaining my problem, but once they send me out for blood work, no one ever calls back. I know the common saying is "no news is good news," but just because the blood work came back fine doesn't mean the sharp stabbing pain I've had in my lower abdomen for over a year is gone. I feel like crying.