wbrandi118
14.04.2020
Mathematics

Given f : R → R defined by f(x) = x·eˣ and g : R → R defined by g(x) = √(x² + 1).
(a) Find f ∘ g.
(b) Find g ∘ f.
(c) Find the range of g and state it in correct set notation.
(d) Let S = (−1, 1) = {x ∈ R | −1 < x < 1}. Find g(S) and express g(S) in correct set notation.
(e) Let T = {√2}. Find g⁻¹(T) and express g⁻¹(T) in correct set notation.
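
A worked sketch of parts (a)–(e), assuming the usual conventions (f ∘ g)(x) = f(g(x)) and g⁻¹(T) = {x ∈ R | g(x) ∈ T} (the preimage of T under g):

\[
\begin{aligned}
\text{(a)}\;& (f \circ g)(x) = f\!\bigl(\sqrt{x^2+1}\bigr) = \sqrt{x^2+1}\, e^{\sqrt{x^2+1}} \\
\text{(b)}\;& (g \circ f)(x) = g\!\bigl(x e^x\bigr) = \sqrt{(x e^x)^2 + 1} = \sqrt{x^2 e^{2x} + 1} \\
\text{(c)}\;& x^2 + 1 \text{ takes every value in } [1,\infty), \text{ so } \operatorname{ran}(g) = \{\, y \in \mathbb{R} \mid y \ge 1 \,\} = [1,\infty) \\
\text{(d)}\;& x \in (-1,1) \Rightarrow x^2 \in [0,1) \Rightarrow x^2 + 1 \in [1,2), \text{ so } g(S) = \{\, y \in \mathbb{R} \mid 1 \le y < \sqrt{2} \,\} = [1,\sqrt{2}\,) \\
\text{(e)}\;& \sqrt{x^2+1} = \sqrt{2} \iff x^2 = 1, \text{ so } g^{-1}(T) = \{-1,\, 1\}
\end{aligned}
\]

Note that 1 ∈ g(S) because g(0) = 1, while √2 is not attained on S since ±1 are excluded from the interval.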
