So, someone riddle me this: if SOI is providing such a significant benefit, how come Intel isn't using it?
It's a simple matter of accounting.
Node development is a fixed cost: it doesn't matter whether you are a $50B/yr company that is going to use the node to process 500,000 wfrs/month or a $5B/yr company that is going to use it to process 5,000 wfrs/month.
Either way the node is going to cost you around $4B to develop.
SOI lowers the development cost for the node, makes it "easier" to hit the leakage spec, in exchange for making the node more costly to produce on a per-wafer basis.
If you are that $50B/yr company, you are not interested in making your 500,000 wfrs/month cost an extra $100 per wafer.
You'd much rather invest an extra $50m-$100m up-front during node development to develop a comparable node that is not dependent on SOI. It costs more to develop, but per-wafer production costs will be lower, and after you make a million wafers or so you will have recovered your extra R&D costs.
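To put rough numbers on that, here's a back-of-the-envelope sketch using the figures quoted above ($50m-$100m extra R&D, $100/wafer SOI penalty, 500,000 wfrs/month); treat it as an order-of-magnitude check, not exact accounting:

```python
# High-volume IDM: how long until skipping SOI pays for the extra R&D?
# Figures are from the post; taking the high end of the $50m-$100m R&D range.
extra_rd_cost = 100e6          # extra up-front R&D to develop the node without SOI ($)
soi_wafer_penalty = 100.0      # per-wafer cost penalty an SOI node would carry ($/wafer)
monthly_volume = 500_000       # wafers per month for the $50B/yr company

# Wafers needed before the avoided per-wafer penalty covers the extra R&D
break_even_wafers = extra_rd_cost / soi_wafer_penalty
months_to_recover = break_even_wafers / monthly_volume

print(f"break-even volume: {break_even_wafers:,.0f} wafers")   # 1,000,000 wafers
print(f"months to recover: {months_to_recover:.1f}")           # ~2 months at this volume
```

At that production rate the extra R&D is paid back within a few months, so bulk wins easily.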
If you are that $5B/yr company, you are not looking at making 500,000 wfrs/month. You are going to be more interested in saving that $50m-$100m of up-front R&D money and just taking the $100-per-wafer hit on your paltry low-volume 5,000 wfrs/month.
Even though your wafers cost more to produce, the added cost will never amount to $100m over the life of the node.
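Same arithmetic from the low-volume side; the per-wafer penalty and volume are from the post, but the node lifetime here is my assumption for illustration:

```python
# Low-volume IDM: does the lifetime SOI penalty ever approach the R&D savings?
soi_wafer_penalty = 100.0      # $/wafer (from the post)
monthly_volume = 5_000         # wafers per month for the $5B/yr company
node_life_years = 7            # assumed production life of the node

lifetime_penalty = soi_wafer_penalty * monthly_volume * 12 * node_life_years
print(f"lifetime SOI penalty: ${lifetime_penalty / 1e6:.0f}m")  # $42m, well under $100m
```

Even with a generously long node life, the accumulated per-wafer penalty stays below the up-front R&D you avoided spending.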
(PS - it's the same accounting principle that comes into play when IDMs decide to go immersion litho vs. dual-pattern litho... it's a trade-off that makes economic sense depending on your production volume versus your R&D budget for developing the node)