Yes. Induction is a solid way of getting to know things. Never fear.
First of all, the laws of physics themselves appear not to change over time. This just plain seems to be a feature built into our particular universe (like, for instance, electrons, 2 plus 2 being 4, and the speed of light). Do note that assuming the laws of physics are the same tomorrow as they are today has made a lot of people very, very wealthy.
Second, there is an extensive field of applied mathematics, Machine Learning, built around a concept from statistics: the Bayesian update. In short, every time a learning system gets new information from its sensory channels, it updates its internal probability distribution over possible explanations of those inputs to fit. It is a mathematically well-founded way of building an internal model of the surrounding environment, just by looking. Google made literally all their money by using machine learning.
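To make the Bayesian update concrete, here is a minimal toy sketch (my own illustrative example, not any particular library's method): a learner with three hypotheses about a coin's bias, updating its beliefs after each flip.

```python
# Toy Bayesian update: the learner holds three hypotheses about a coin's
# probability of landing heads, starting from a uniform prior.
hypotheses = [0.25, 0.5, 0.75]
prior = {h: 1 / len(hypotheses) for h in hypotheses}

def update(beliefs, observation):
    """One Bayesian update: posterior is proportional to likelihood times prior."""
    def likelihood(h):
        return h if observation == "heads" else 1 - h
    unnormalized = {h: likelihood(h) * p for h, p in beliefs.items()}
    total = sum(unnormalized.values())
    return {h: u / total for h, u in unnormalized.items()}

beliefs = prior
for obs in ["heads", "heads", "tails", "heads"]:
    beliefs = update(beliefs, obs)

# After mostly-heads data, the heads-leaning hypothesis dominates.
print(max(beliefs, key=beliefs.get))  # → 0.75
```

Every new flip reshapes the distribution; with enough data the beliefs concentrate on whichever hypothesis best explains what was actually seen, which is the "building a model just by looking" part.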
Lastly, there is a brilliant formalization of Occam's Razor (the simplest explanation, all else being equal, should be favoured) called Solomonoff induction, where a hypothetical prediction machine runs all possible simulations of universes which would explain its current sensory data, and every time it gains new information it discards the ones that don't fit. Given that it is not computable, it has yet to make people rich, but it has spawned some serious research in AI.
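The discard-what-doesn't-fit step can be sketched in miniature. Real Solomonoff induction enumerates all computable universes (weighted by simplicity) and is uncomputable; this toy version, my own hand-picked stand-in, uses three tiny candidate "universes" instead.

```python
# Toy Solomonoff-style elimination: each hypothesis is a rule predicting the
# next bit of the sensory stream given the history so far.
hypotheses = {
    "all zeros": lambda history: 0,
    "all ones": lambda history: 1,
    "alternating": lambda history: len(history) % 2,
}

observed = [0, 1, 0, 1]
surviving = dict(hypotheses)
history = []
for bit in observed:
    # Discard every hypothesis whose prediction disagrees with the new data.
    surviving = {name: h for name, h in surviving.items() if h(history) == bit}
    history.append(bit)

print(sorted(surviving))  # → ['alternating']
```

Only the rule consistent with every observation survives; in the full theory the survivors are additionally weighted by their simplicity, which is where Occam's Razor comes in.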
These are just a few name-drops of what mathematicians work with when they talk about "inductive reasoning." And, again, there is money to be made with inductive reasoning, so ponder long and well before you argue against induction.
ETA: "Proof by induction" is something very different from inductive reasoning. Such a proof is typically used when proving a property of the natural numbers and works like this: if P is true for 0, and P being true for n implies that P is true for n+1, then P is true for all natural numbers. Prove P for 0; assume P for n and prove it for n+1, and you're done.
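For a concrete instance of that base-case-plus-inductive-step recipe, here is a sketch in Lean 4 (chosen just for illustration) proving that 0 + n = n for every natural number:

```lean
-- Proof by induction: base case for 0, then the step from k to k + 1.
theorem zero_add' (n : Nat) : 0 + n = n := by
  induction n with
  | zero => rfl                          -- base case: 0 + 0 = 0
  | succ k ih => rw [Nat.add_succ, ih]   -- step: use the hypothesis for k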