This is the question:
The temperature t of the stratosphere increases with height z above the ground. At 35 km we have the temperature −60 °C. Suppose we have a temperature equation t = t0*(1 − k*Δz), where Δz (delta z) stands for the difference in height. Determine the temperature at 50 km. The constant k = 0.033 km^−1.
Okay. So my thought is to first calculate t0 (I'm guessing this must be the temperature at ground level?) using t = −60, k = 0.033 and Δz = 35, which gives me t0 = 5.7 °C, and then use this t0 with Δz = 50 to get the temperature at 50 km.
Is this a correct solution?
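
To check the arithmetic, here is a minimal sketch of the approach described above, written in Python. It assumes Δz is measured from ground level (so t0 is the ground-level temperature, which is my interpretation) and that the sign in the bracket is a minus, matching the km^−1 unit of k; the variable names t0, k, dz are just for illustration.

# Sketch of the proposed approach: solve t = t0 * (1 - k * dz) for t0 using
# the known temperature at 35 km, then evaluate the same formula at 50 km.
# Assumes dz is the height above ground level (my interpretation).

k = 0.033        # km^-1
t_at_35 = -60.0  # degrees C at dz = 35 km

# Step 1: t_at_35 = t0 * (1 - k * 35)  =>  t0 = t_at_35 / (1 - k * 35)
t0 = t_at_35 / (1 - k * 35)
print(f"t0 (ground level) = {t0:.1f} C")

# Step 2: evaluate the same formula at dz = 50 km with the t0 found above
t_at_50 = t0 * (1 - k * 50)
print(f"t at 50 km = {t_at_50:.1f} C")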