It doesn't matter; everything is rendered internally at 720p.
Reportedly the load times are halved with the HDD on the 360, and I'm wondering whether they had to make other concessions in terms of what gets stored in RAM. If you have a fixed amount of RAM and have to account for two different load latencies, then either you always design for the slowest case (which means a smaller working set, e.g. generally lower-detail and hence smaller textures, because data loads less often when it loads slowly), or you maintain two versions (where the HDD version could stream things like higher-res textures and swap data in and out more aggressively because of the lower latency). A rough sketch of what I mean is below. Because the memory bandwidth isn't much use if you're forced to rely on slow optical loading rather than the faster HDD, is it?
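To make the trade-off concrete, here's a minimal sketch with completely made-up numbers (the rates, names, and the idea of a single "streaming budget" figure are my own illustration, not anything from a real engine): with one build you have to budget texture streaming for the slowest storage, whereas with per-device code paths the HDD version could afford a bigger budget.

```c
#include <stdio.h>

/* Hypothetical illustration only: fixed RAM, two possible storage
 * latencies (optical drive vs. hard drive). An engine either budgets
 * streaming for the worst case or branches per device. */

typedef enum { STORAGE_OPTICAL, STORAGE_HDD } storage_type;

/* Invented figures: how many MB of textures we can afford to stream
 * per second without stalling. */
static double stream_budget_mb_per_sec(storage_type s, int single_build)
{
    const double optical_rate = 8.0;   /* assumed sustained DVD read, MB/s */
    const double hdd_rate     = 30.0;  /* assumed sustained HDD read, MB/s */

    if (single_build)
        return optical_rate;           /* design for the slowest device */
    return (s == STORAGE_HDD) ? hdd_rate : optical_rate;
}

int main(void)
{
    /* Single build: HDD owners get little beyond shorter load screens. */
    printf("single build, HDD: %.1f MB/s\n",
           stream_budget_mb_per_sec(STORAGE_HDD, 1));
    /* Per-device paths: HDD owners could get more aggressive streaming,
     * e.g. higher-resolution mips swapped in and out more often. */
    printf("per-device, HDD:   %.1f MB/s\n",
           stream_budget_mb_per_sec(STORAGE_HDD, 0));
    printf("per-device, DVD:   %.1f MB/s\n",
           stream_budget_mb_per_sec(STORAGE_OPTICAL, 0));
    return 0;
}
```

The point being: the second approach effectively means maintaining two asset-streaming configurations, which is exactly the kind of concession I'm asking about.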
Care to actually answer the question, anyway? Core, or the other (whatever it was called) spec machine?
Although I note that all the palaver we had pre-release about 1080i graphics seems to have gone quiet now. Same goes for Sony; they've both settled on 720p as the base level, despite all those claims about super-high resolutions, etc. (granted, 1280x720 isn't bad, but we were promised 1920x1080 or thereabouts).