I think it's because a full proof covering existence and uniqueness would either be really long or require tools from outside the scope of the text. For example, there's a fairly concise proof using linear algebra, which I'll partially reproduce below. (I like this proof because the equation is derived from first principles rather than starting with an ansatz.)
---
Let x_n be a sequence defined by the recurrence relation:
x_{n+1} = a * x_{n-1} + b * x_n
Observe that if we define a sequence of two-element vectors of successive elements:
[x_0]  [x_1]  [x_2]
[x_1], [x_2], [x_3], ...
then we can form the relation in terms of matrix/vector multiplication:
[x_1]   [0  1]   [x_0]
[x_2] = [a  b] * [x_1]
Let's name the sequence of vectors as y_n and call the matrix M:
y_1 = M * y_0
We can get the next term in the sequence with another multiplication:
y_2 = M * y_1
    = M * (M * y_0)
    = M^2 * y_0
By induction we have:
y_n = M^n * y_0
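As a quick sanity check, here is a sketch in Python with numpy. The values a = b = 1 with x_0 = 0, x_1 = 1 (the Fibonacci numbers) are just an example choice, not something fixed by the derivation:

```python
import numpy as np

# Example instance of x_{n+1} = a*x_{n-1} + b*x_n:
# a = b = 1 with x_0 = 0, x_1 = 1 gives the Fibonacci numbers.
a, b = 1, 1
M = np.array([[0, 1],
              [a, b]])
y0 = np.array([0, 1])  # [x_0, x_1]

def x(n):
    # y_n = M^n * y_0; x_n is its first component.
    return (np.linalg.matrix_power(M, n) @ y0)[0]

print([int(x(n)) for n in range(10)])
# → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Repeated matrix multiplication reproduces exactly the terms the scalar recurrence would produce, which is the content of y_n = M^n * y_0.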
M has the characteristic equation:
r^2 - br - a = 0
with roots:
r_1 = (b - c)/2
r_2 = (b + c)/2
c = √(b^2 + 4a)
Therefore, assuming the eigenvalues are distinct (i.e. b^2 + 4a ≠ 0), we have by diagonalization:
y_n = S * [r_1^n    0  ] * S^(-1) * y_0
          [  0    r_2^n]
where S is the matrix of eigenvectors of M. From here, existence and uniqueness of the closed-form solution follow from the existence and uniqueness of the eigenvalues of M.
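To make the diagonalization step concrete, here is a numerical sketch (again using the Fibonacci instance a = b = 1 purely as an example) that recovers x_n from S, the diagonal matrix of eigenvalue powers, and S^(-1):

```python
import numpy as np

a, b = 1, 1                          # example: x_{n+1} = x_{n-1} + x_n
c = np.sqrt(b**2 + 4*a)
r1, r2 = (b - c) / 2, (b + c) / 2    # roots from the characteristic equation

M = np.array([[0, 1],
              [a, b]], dtype=float)
eigvals, S = np.linalg.eig(M)        # columns of S are eigenvectors of M
assert np.allclose(np.sort(eigvals), np.sort([r1, r2]))  # matches the formula

y0 = np.array([0.0, 1.0])            # [x_0, x_1]

def x(n):
    # y_n = S * diag(r_1^n, r_2^n) * S^(-1) * y_0; x_n is its first component.
    return (S @ np.diag(eigvals ** n) @ np.linalg.inv(S) @ y0)[0]

print([round(x(n)) for n in range(10)])
# → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

The rounding is only there because the eigendecomposition works in floating point; the diagonalized formula and the direct recurrence agree term by term.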
---