I think what the TVs are doing is upconversion, not upscaling. These are two different technologies. You can read a detailed thread here:
http://www.hifivision.com/av-enhancers-room-acoustics/3017-upconversion-vs-upscaling.html
All digital TVs/displays upscale or decimate the incoming signal to their native resolution.
For example, if the TV's native resolution is 1080p, it will upscale/upconvert any signal that is not already digital 1080p. If you feed it 480p/720p/1080i through HDMI, it will be upscaled to 1080p. If you feed it 480i/480p/720p/1080i through component, it will first be converted to digital and then upscaled to 1080p. All digital displays have scaling, deinterlacing, and analog-to-digital conversion chips; sometimes a single chip does all three. The main reason is that a digital display has a fixed pixel grid, so to use the full panel it must show the picture at its native resolution.
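To make that concrete, here is a minimal Python sketch (assuming the Pillow library; the names are mine, not from any TV's firmware) of what the scaler stage effectively does: every incoming frame gets resampled to the panel's fixed grid before it can be shown.

```python
# A minimal sketch of a fixed-pixel display's scaler stage.
from PIL import Image

NATIVE = (1920, 1080)  # a 1080p panel's fixed pixel grid

def scale_to_native(frame: Image.Image) -> Image.Image:
    """Resample any incoming frame to the panel's native resolution."""
    if frame.size == NATIVE:
        return frame  # already 1080p, nothing to do
    # Real scaler chips use proprietary filters; bicubic stands in here.
    return frame.resize(NATIVE, Image.BICUBIC)

frame_720p = Image.new("RGB", (1280, 720))  # e.g. a 720p broadcast frame
print(scale_to_native(frame_720p).size)     # -> (1920, 1080)
```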
Now, that brings us to the next question. Is the scaling chip in the TV better than the one in the players? It depends on the chip used. Many times, TV manufacturers cut corners and use lower-quality chips. If they have used a high-quality scaler like a Reon, ABT, or Qdeo, they usually say so in the brochure.
Another question is whether we can even notice the quality of the scaling. That depends on the image size and seating distance. At a 40" screen size, watching from 12 ft, it's very hard to tell; at that distance, very few people will notice any difference between 1080p and 720p. Some will see it and some won't.
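You can sanity-check this with a quick back-of-envelope calculation: 20/20 vision resolves detail down to roughly 1 arcminute, so once a single pixel subtends less than that, extra resolution is invisible. A small Python sketch (the function name and the 16:9 aspect assumption are mine):

```python
# Angular size of one pixel vs. the ~1 arcminute limit of 20/20 vision.
import math

def pixel_arcmin(diagonal_in, horiz_pixels, distance_in, aspect=16 / 9):
    """Angular width of one pixel, in arcminutes."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horiz_pixels                       # one pixel's width
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

for res, cols in (("720p", 1280), ("1080p", 1920)):
    print(res, round(pixel_arcmin(40, cols, 12 * 12), 2), "arcmin")
# 720p  -> ~0.65 arcmin, 1080p -> ~0.43 arcmin
```

Both figures are already below the ~1 arcminute acuity limit, which is why the 720p vs 1080p difference is so hard to see on a 40" screen from 12 ft.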
Also, the scaling chips sometimes have problems of their own. The ABT chip, for example, is known to add ringing during scaling. So view the TV in the showroom with a 480i/480p connection, from the same distance as your seating position at home. If you don't notice any artifacts, deinterlacing issues, etc., you most likely won't feel the need for another scaling chip.
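For the curious, ringing is easy to reproduce: sharp scaling filters have negative lobes, so they overshoot at hard edges. The numpy sketch below uses a generic Lanczos-3 filter as a stand-in; it is not ABT's actual algorithm.

```python
# Demonstrate ringing: upscale a hard step edge with a Lanczos-3 filter.
import numpy as np

def lanczos_kernel(x, a=3):
    """Lanczos window: sinc(x) * sinc(x/a), zero outside |x| < a."""
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / a)
    out[np.abs(x) >= a] = 0.0
    return out

def upscale_1d(signal, factor, a=3):
    """Upscale a 1-D signal by an integer factor with Lanczos resampling."""
    n = len(signal)
    coords = (np.arange(n * factor) + 0.5) / factor - 0.5  # source positions
    out = np.empty(len(coords))
    for i, c in enumerate(coords):
        idx = np.arange(int(np.floor(c)) - a + 1, int(np.floor(c)) + a + 1)
        w = lanczos_kernel(c - idx)
        w /= w.sum()                                # normalize the taps
        out[i] = np.dot(w, signal[np.clip(idx, 0, n - 1)])  # edge-replicate
    return out

edge = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)  # a hard edge
up = upscale_1d(edge, 4)
print(up.min(), up.max())  # dips below 0 and peaks above 1: ringing
```

That overshoot/undershoot around the step is exactly the halo you would see around high-contrast edges on screen.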