As they say, a NIST is as good as a Nile to a blind horse that can't swim, nudge nudge wink wink...
But in reality NIST is the closest thing to an in-depth study that we have at this point, and two years after the event it is still at the data-gathering stage. At about the half-way point of their proposed 24-month study, their May 2003 Progress Report (http://wtc.nist.gov/pubs/MediaUpdate%20_FINAL_ProgressReport051303.pdf) offers:
"{the} outline of an approach to assess the most probable structural collapse sequence that integrates impact damage, fire dynamics, thermal-structural response, and collapse initiation analyses
using a combination of physics-based mathematical modeling, statistical, and probabilistic methods..."
They are basically starting from scratch: looking at everything (except of course for explosives) that might have contributed to the collapse, and seeing what kind of model they can construct that will (roughly) fit the data. As they make clear in their mission statement, they do not take any of the collapse mechanisms proposed in the FEMA/ASCE report or elsewhere as a starting point.
As for the fire modelling, the only publication to date is the Initial Model for Fires in the World Trade Center Towers, published May 2002:
http://wtc.nist.gov/pubs/WTC_total__rept.pdf
It is little more than an initial model that tries to estimate the total heat output of the north tower fires from the behavior of the smoke plume, and it makes no attempt to estimate the temperatures reached at various points in the building. And that, after all, is what we must know before we can say that softening of the steel by the fires caused the collapses, a theory that remains unproven at this point.