Sorry, I'm probably a bit gun-shy at this point! Let me back up and just try to answer your questions.
"Well then why did they have to change the numbers for that particular question when they delivered the final exit poll."
They don't pick and choose which questions they are going to change the numbers for; they reweight them all. They calculate a multiplier for every respondent so that the weighted sample matches the proportions of Bush and Kerry voters in the official count (as well as their estimates of the overall distribution of age, race, and sex among voters -- interviewers are supposed to record those variables for the people they aren't able to interview). Then they rerun every table using those weights.
Why do they do this? Because, assuming that the exit poll is off, using the weights should yield more accurate tables. (I sort of spelled this out, with a numerical example, in a post here:
http://www.democraticunderground.com/discuss/duboard.php?az=show_mesg&forum=132&topic_id=2490768&mesg_id=2499695 .)
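To make the reweighting concrete, here is a minimal sketch in Python. All the respondents and vote shares are made up, and the NEP's actual procedure is more elaborate (it also adjusts for age, race, and sex, not just the vote), but the core move is the same: scale each respondent's weight so the weighted 2004 vote shares match the official count, then rerun every crosstab -- including the past-vote table -- with those same weights.

```python
# Hypothetical exit-poll data: (reported 2004 vote, reported 2000 vote).
respondents = [
    ("Kerry", "Gore"), ("Kerry", "Gore"), ("Kerry", "Bush"),
    ("Bush", "Bush"), ("Bush", "Bush"), ("Bush", "Gore"),
    ("Kerry", "Gore"), ("Bush", "Bush"),
]

official = {"Kerry": 0.49, "Bush": 0.51}  # hypothetical official shares

n = len(respondents)
raw_share = {c: sum(1 for v, _ in respondents if v == c) / n for c in official}

# One multiplier per 2004-vote group: official share / raw exit-poll share.
multiplier = {c: official[c] / raw_share[c] for c in official}
weights = [multiplier[v] for v, _ in respondents]
total_w = sum(weights)

# The past-vote table, rerun with the new weights -- the same weights get
# applied to every other table in the poll as well.
past_vote = {}
for (_, past), w in zip(respondents, weights):
    past_vote[past] = past_vote.get(past, 0.0) + w
past_vote = {k: round(v / total_w, 3) for k, v in past_vote.items()}
print(past_vote)  # weighted shares of reported 2000 vote
```

Notice that the 2004 columns now match the official count exactly by construction, while every other table just shifts a little in whichever direction the weights push it.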
"Maybe that wasn't the point but when they re-weighed the poll to make it match the results, by increasing the number of people questioned who voted for *, they forgot about those numbers which look ridiculous."
I don't think they "forgot." I suppose they could have looked at the past-vote table and said, "Wow, some folks are going to be arguing from here to eternity that that table is evidence of fraud. Maybe we could, umm, jigger those numbers." But they didn't.
Now, here's the interesting thing I found out when I looked at a lot of old exit polls (and some other polls, too): generally, more people say they voted for the previous winner than actually voted for him. In fact, I wasn't able to find an exit poll where that percentage wasn't higher than it "should" be (assuming that the people who voted for the previous winner, and loser, turn out at similar rates). Also, I looked at a panel study where people were asked in 2000 whom they had just voted for, and then asked again in 2004 whom they had voted for. A bunch of people who said in 2000 that they had voted for Gore said by 2004 that they had actually voted for Bush. I think it is mostly because a certain proportion of people, who don't follow politics very closely, pretty much just forget the loser. I wrote this all up here:
http://inside.bard.edu/~lindeman/too-many.pdf

"Why did they even bother to make the final exit poll match the results the day after, to make it look like they had messed up less?"
Again, that's what that other post is about.
Assuming that the exit poll is off, the reweighting ought to yield more accurate results for all the tables (although even then, it's possible for the weighting to backfire).
"If one believes that the election was fraudulent one must understand that the exit poll unwittingly would reflect that."
Yes, if there was massive vote miscount, the exit polls should reflect it. The exit polls may also reflect other error sources. Vote suppression (such as registration purges) probably wouldn't show up in the exit polls except in the form of uncounted provisional ballots (and by affecting the turnout estimates -- that's an arcane topic).
"The fraud is sometimes illustrated by precincts with turnouts in the high 90s, third party candidates getting more votes than they should, candidates from the presidential candidate's party obtaining more votes than them and possibly exit polls."
Huge topic. In Ohio, those first three things all happened, but not very often. Smart, decent people are going to go to their graves disagreeing about what happened in Ohio.
"I mean even if the methodology was slightly off it seems that exit polling is more accurate than actually voting."
They could both be way off. I think the exit poll was way off in Ohio because it put Kerry up 6.5 points, and that's just too many points IMHO, especially considering all the people who never got to vote. (We didn't realize that on election night because the initial tabulations -- the CNN.com screen shots -- are based on an estimate that uses both the exit poll interviews and pre-election expectations. Now we know that while that estimate put Kerry up 3.4 points in Ohio, the interview-only estimate had him up almost twice as much.) I also think the exit poll was way off in New York because (as in Ohio) the official result was a lot closer to pre-election polls than the exit poll result was. Not that I think pre-election polls are perfect by any means, but I sure don't think Kerry won New York state by 30 percentage points, either.
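A quick illustration of that election-night estimate. The numbers and the 50/50 weighting here are hypothetical -- the actual NEP composite estimator is more complicated than this -- but it shows the basic point: blending the interview-only margin with a pre-election expectation pulls the on-screen number toward the prior, which is why the CNN.com screens showed a much smaller Kerry lead in Ohio than the interviews alone did.

```python
exit_margin = 6.5   # Kerry lead in the interview-only estimate (points)
prior_margin = 0.5  # hypothetical pre-election expectation (points)

def blended_margin(weight_on_exit):
    """Convex combination of the exit-poll and pre-election margins."""
    return weight_on_exit * exit_margin + (1 - weight_on_exit) * prior_margin

# With roughly equal weight on each source, the blend lands in the
# neighborhood of the figure shown on election night.
print(round(blended_margin(0.5), 2))
```

The larger the weight on the pre-election expectation, the more a big exit-poll discrepancy gets hidden in the initial tabulations -- which is why the interview-only numbers only became clear later.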
But the kicker is: even if the exit polls were wrong, it obviously doesn't prove that the count was right. (And even if the count was basically right, it doesn't prove that the election was fair, or that the machines can be trusted.)
I do think that the count was basically right (at least in most places). I get in a lot of trouble for saying it. But please reread the end of the previous paragraph.
Bear with me. It was a big election.