Skeptics always like to toss in 'if ever' as if it signals some enlightenment, as though they are aware of a fundamental limitation of the universe that only they are privy to.
Let’s say there are three options: {soon, later, not at all}. Ruling out only one to arrive at {later, not at all} implies less knowledge than ruling out two and asserting {later}.
Awareness of a fundamental limitation would eliminate possibilities to just {not at all}, and the phrasing would be “never”, rather than “not soon, if ever”.
But we know that no such fundamental limitation on intelligence exists: nature has already produced animal, and eventually human, intelligence via random walk. So 'AI will never exist' is lazy magical thinking. And because intelligence can be self-reinforcing, there is good reason to expect AI much sooner rather than later.
Of the universe, perhaps, but humans certainly are a limiting factor here. Assuming we get this technology someday, why would anyone buy your software when a mere description of its functionality lets them recreate it effortlessly?