Understand the Target First Byte

On a website (HTTP, without SSL), I have the following results:

In the details page, for the first request:
First Byte: 0.966 s

  • DNS Lookup: 242 ms
  • Initial Connection: 89 ms
  • Time to First Byte: 634 ms

In the details page, for the second request:
First Byte: 0.611 s

  • DNS Lookup: 92 ms
  • Initial Connection: 83 ms
  • Time to First Byte: 435 ms

In the Performance Review page:
First Byte: 966 ms
Target First Byte: 267 ms

I definitely understand that my server time and my DNS lookup are bad, but I don’t understand where the Target First Byte time comes from.
From the definition on the Performance Review page, I was expecting it to be DNS time + connection time + 100 ms = 242 + 89 + 100 = 431 ms.
From the thread at http://www.webpagetest.org/forums/showthread.php?tid=11441&highlight=target+time, I was expecting it to be DNS time + (connection time × 2) = 242 + (89 × 2) = 420 ms.
Why is it 267 ms?

Many thanks!
Regards,

The allowed time for DNS as part of the budget for calculating the target is 1 RTT (the same as the connect time).

The logic is: SSL time + (connection time × 3).

With no SSL on this site, that is 89 × 3 = 267 ms, and then it gives you a buffer of 100 ms before dropping the grade.
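A minimal sketch of that calculation as described above (the function name and the grace constant are my own labels, not WebPageTest internals; the measured socket connect time stands in for one RTT, which is also the budget allowed for DNS):

```python
def target_first_byte_ms(connect_ms, ssl_ms=0):
    """Target First Byte per the explanation above:
    SSL time plus three round trips, using the measured
    connect time as the RTT estimate."""
    return ssl_ms + connect_ms * 3

GRACE_MS = 100  # buffer allowed before the grade starts dropping

# First request from the question: no SSL, 89 ms connect, 634 ms TTFB.
target = target_first_byte_ms(connect_ms=89)
measured_ttfb = 634
over_budget = measured_ttfb - (target + GRACE_MS)
print(target, over_budget)  # 267 ms target, 267 ms over budget
```

So the 267 ms target matches 89 × 3, and this server is well past the target even with the 100 ms grace applied.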

Thanks a lot for your explanation!