That's exactly what the median means. If you lined up all households in order of income, the household in the exact middle of the line would have the median income. So yes, you are making more than half of American households do.
What you have is a LOT of people on a low income. Let's look at what happens if you group this line into bands of $0-5k, $5-10k, $10-15k and so on. Which of these bands has the most households in it? That is the mode, and I bet it's significantly lower than the median. The final kind of average is the one you probably know - the arithmetic mean, what you get if you add up all the incomes and divide by the number of households. Most likely that will be HIGHER than the median, because the relatively few households with very large incomes drag the mean upwards, while the mode stays lowest of the three.
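To see the three averages come apart, here's a minimal sketch using made-up household incomes (in thousands of dollars) chosen to mimic a skewed distribution - the numbers are invented for illustration, not real census data:

```python
# Hypothetical household incomes in $1,000s, skewed towards the low end
# with a long tail of high earners.
from statistics import mean, median, mode

incomes = [12, 12, 12, 12, 18, 25, 25, 35, 50, 60, 80, 120, 250]

print(mode(incomes))    # 12   -> the most common value, lowest of the three
print(median(incomes))  # 25   -> the middle household in the line
print(mean(incomes))    # ~54.7 -> pulled up by the long tail of high incomes
```

The mean lands well above the median precisely because a handful of large values carry so much weight in the sum.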
What we have here is a skewed distribution. If a measurement clusters around the average, and values become equally less likely the further away you get from it in EITHER direction, then a graph of it would be bell-shaped: highest in the middle and tailing away equally on either side. This is called a normal distribution, and in it the mode, median and mean are all the same.
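For contrast, here's a tiny symmetric "bell-shaped" data set (again invented for illustration) where all three averages land on exactly the same value:

```python
# A small symmetric data set: one 1, two 2s, three 3s, two 4s, one 5.
# Because it's balanced around the middle, mean == median == mode.
from statistics import mean, median, mode

bell = [1, 2, 2, 3, 3, 3, 4, 4, 5]

print(mean(bell), median(bell), mode(bell))  # 3 3 3
```

Skew the data in either direction and the three values immediately separate, which is exactly what household incomes do.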
But with household incomes, you get a bell that looks pushed over towards the low end, with a long tail of higher incomes. The mode, median and mean are not the same. So which one is most meaningful? The median is usually picked because of what it tells you - half of American households make more than this and half make less. And it's as low as it is because SO many households make less.
Bear in mind there are single-person households, couples where only one person works, and households where nobody works because unemployment is high in their area... all of those pull the median down. In statistics, doing the calculation is the easy part; interpreting correctly what it means is the tricky bit!