
Question:

Let f be a real-valued function defined on the interval (0, ∞) by f(x) = ln x + ∫₀ˣ √(1 + sin t) dt, then which of the following statement(s) is(are) true?

f''(x) exists for all x ∈ (0, ∞)

There exists α > 1 such that |f'(x)| < |f(x)| for all x ∈ (α, ∞)

f'(x) exists for all x ∈ (0, ∞) and f' is continuous on (0, ∞), but not differentiable on (0, ∞)

There exists β > 0 such that |f(x)| + |f'(x)| ≤ β for all x ∈ (0, ∞)

Solution:

Option A: f(x) = ln x + ∫₀ˣ √(1 + sin t) dt
f'(x) = 1/x + √(1 + sin x)
f''(x) = -1/x² + cos x / (2√(1 + sin x))
f''(x) does not exist at the points where sin x = -1, i.e. x = 2nπ + 3π/2 (n = 0, 1, 2, …): near such a point x₀, √(1 + sin x) behaves like |x − x₀|/√2, so f' has a corner there. Hence option A is false.
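The corner in f' can be checked numerically. A minimal sketch (one-sided difference quotients of f' at x₀ = 3π/2, a point where sin x = -1; the step size h is an arbitrary choice):

```python
import math

def fprime(x):
    # f'(x) = 1/x + sqrt(1 + sin x); max(..., 0) guards against
    # floating-point noise making 1 + sin x slightly negative
    return 1.0 / x + math.sqrt(max(1.0 + math.sin(x), 0.0))

x0 = 3 * math.pi / 2          # a point where sin x = -1
h = 1e-5
left = (fprime(x0) - fprime(x0 - h)) / h    # slope of f' from the left
right = (fprime(x0 + h) - fprime(x0)) / h   # slope of f' from the right

# Near x0, sqrt(1 + sin x) ~ |x - x0| / sqrt(2), so the one-sided
# slopes of f' differ by about sqrt(2): f''(x0) does not exist
print(left, right, right - left)
```

The two one-sided slopes disagree (their difference is approximately √2), confirming that f' is not differentiable at x₀.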

Option C: From the definition, f'(x) exists for all x ∈ (0, ∞): ln x is differentiable, and since the integrand √(1 + sin t) is continuous, the integral term is differentiable by the Fundamental Theorem of Calculus. The resulting f'(x) = 1/x + √(1 + sin x) is continuous on (0, ∞). However, f' is not differentiable at the points where sin x = -1, since f''(x) does not exist there (as shown in Option A). Hence option C is true.

Option B: For x > 1,
|f'(x)| = 1/x + √(1 + sin x) ≤ 1 + √2
so f' is bounded on (1, ∞). On the other hand, the integrand √(1 + sin t) is non-negative, so f(x) ≥ ln x, which is unbounded: f(x) → ∞ as x → ∞. Since f' stays bounded while f increases without bound, any α > 1 with ln α > 1 + √2 (e.g. α = e³) gives |f'(x)| ≤ 1 + √2 < ln x ≤ f(x) = |f(x)| for all x ∈ (α, ∞). Hence option B is true.
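This dominance of f over f' for large x can be illustrated numerically. A minimal sketch (plain trapezoid-rule integration of the integral term; the sample points and step count are arbitrary choices):

```python
import math

def fprime(x):
    # f'(x) = 1/x + sqrt(1 + sin x)
    return 1.0 / x + math.sqrt(max(1.0 + math.sin(x), 0.0))

def f(x, n=50000):
    # f(x) = ln x + integral from 0 to x of sqrt(1 + sin t) dt,
    # with the integral approximated by the trapezoid rule
    h = x / n
    total = 0.5 * (1.0 + math.sqrt(max(1.0 + math.sin(x), 0.0)))
    for k in range(1, n):
        total += math.sqrt(max(1.0 + math.sin(k * h), 0.0))
    return math.log(x) + total * h

for x in (10.0, 50.0, 200.0):
    # f grows roughly linearly while f' stays below 1 + sqrt(2)
    print(x, round(f(x), 3), round(fprime(x), 3))
```

At each sample point f(x) already exceeds |f'(x)| by a wide margin, consistent with the bound |f'(x)| ≤ 1 + √2 and the unbounded growth of f.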

Option D: f(x) ≥ ln x is unbounded on (0, ∞) (and ln x → −∞ as x → 0⁺), so no β > 0 can bound |f(x)| + |f'(x)| for all x. Hence option D is false.

Hence, options B and C are correct.