I'm not sure I fully understand your questions, but I'll attempt an answer biased by recent formal results in Proof Theory and my viewpoint as a computer scientist. So it won't be a very "philosophical" answer, in the sense that I don't rely much on philosophical references.
Note that the nature of proof is not at all clear and is still something we are investigating; we actually have several competing answers for it.
Do you think we as humans in general use certain proof strategies to prove things?
Toward a formalization of proof. Mathematicians were already making proofs long before formal logic appeared. Around the 19th and 20th centuries, several proof systems were established to capture our conception of proof.
Frege, with his Begriffsschrift, started from the conception of logic provided by Aristotle and the Stoics.
Hilbert designed a system where modus ponens is the only rule and proofs are derived from already-established axioms.
More interesting: Gentzen designed a system based on how logic is actually used in mathematical practice, called Natural Deduction (because it reproduces the empirical use of logic by humans). When proving things, we introduce connectives and eliminate them later:
- from A and B we can deduce A /\ B (definition of "and")
- from A /\ B we can deduce either A or B (use of "and")
Such a system restricts the space of proofs to the introduction and elimination of a finite set of connectives.
So yes, we use certain strategies to prove things. But further questions arise:
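Under the programs-as-proofs reading developed later in this answer, the introduction and elimination rules for "and" can be sketched as small typed programs. This is only an illustration; the function names are mine, not standard terminology.

```typescript
// Introduction rule for "and": from a proof of A and a proof of B,
// deduce A /\ B. A proof of a conjunction is a pair of proofs.
function andIntro<A, B>(a: A, b: B): [A, B] {
  return [a, b];
}

// Elimination rules for "and": from a proof of A /\ B,
// deduce A (first projection) or B (second projection).
function andElim1<A, B>(ab: [A, B]): A {
  return ab[0];
}
function andElim2<A, B>(ab: [A, B]): B {
  return ab[1];
}

// Introducing the connective and eliminating it right after is the
// unnecessary "detour" discussed below:
console.log(andElim1(andIntro(1, "x"))); // prints 1
```

Note how eliminating right after introducing simply recovers what we already had, which is exactly why such detours can be removed.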
- Where do these rules come from?
- Is logic more than that?
- Why do we use these rules and not other ones?
The first two questions can be answered; the last one may be a question of psychology or biology (there is actually some literature about it).
If the above is true, this means there are certain strategies that we need to follow; where do you think the creativity comes in?
A fine analysis of logic. Coming back to Gentzen: he observed that deductions in the Natural Deduction system could be reduced/simplified. When we introduce a connective and eliminate it right afterwards, we are making a "detour" which is not necessary.
He was unable to prove this directly, so he designed another system called the Sequent Calculus. That system wasn't based on the study of how proofs are used, but on an analysis of natural deduction and logic itself. The Sequent Calculus analyzes the deep structure of logic and its symmetries through a duality of hypotheses/conclusions rather than introduction/elimination or definition/use.
This system was expressive enough to make the concept of "detour" explicit: it is captured by the cut rule, and that's where creativity comes in (though Gentzen didn't know that at the time).
Creativity and the use of lemmas. The cut rule is expressed as follows: from Γ ⊢ Δ, A (left side = conjunction of hypotheses, right side = disjunction of conclusions; the Greek letters are multisets of formulas) and Σ, A ⊢ Π, we can deduce Γ, Σ ⊢ Δ, Π. Reading the rule in the reverse direction: if we want to prove Γ, Σ ⊢ Δ, Π (which can be anything), we may introduce an arbitrary formula A, provided that A can be proved from some hypotheses Γ we have, and that from A, possibly together with some other hypotheses Σ, we can prove what we wanted.
The cut rule actually captures the use of lemmas in mathematics: when you want to prove something, first prove a lemma A, then use it.
Following his goal, Gentzen proved that the uses of the cut rule (representing detours, now called "cuts") can always be removed. This means we can prove things without lemmas, by pure application of logical rules alone. There is just one problem: eliminating the cuts from a proof can lead to an explosion of complexity, producing proofs of enormous (sometimes absurd) size.
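Under the programs-as-proofs reading discussed later, the cut rule can be sketched as composing two programs, and cut elimination as inlining the lemma. A hypothetical illustration (the names `cut`, `lemma`, `use` are mine):

```typescript
// A "cut": compose a proof of a lemma A from hypotheses G with a
// proof of the goal P that uses A as a hypothesis.
function cut<G, A, P>(lemma: (g: G) => A, use: (a: A) => P): (g: G) => P {
  // Eliminating the cut corresponds to inlining the lemma's proof
  // at every point where it is used.
  return (g) => use(lemma(g));
}

const double = (n: number) => n * 2;          // proof of the "lemma"
const show = (n: number) => `result: ${n}`;   // proof using the lemma
const proof = cut(double, show);
console.log(proof(21)); // prints "result: 42"
```

Inlining a lemma everywhere it is used is also a fair intuition for why cut elimination can blow up the size of a proof: each use gets its own expanded copy.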
A concrete example of cuts. Say you want to prove that 5 + 3 is the same as 3 + 5. You can either use "pure application of rules", computing both sides down to 8, OR use creativity/a lemma/the cut rule: prove that addition is commutative (for all n and m, n + m = m + n) and note that the commutativity of addition gives what we want.
We make a detour that is not necessary, but that is more creative, more useful, more interesting...
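In a modern proof assistant the two styles can be written side by side; a sketch in Lean 4 (assuming the standard library's `Nat.add_comm` lemma):

```lean
-- Pure application of rules: both sides compute down to 8,
-- so the proof is just reflexivity.
example : 5 + 3 = 3 + 5 := rfl

-- The "detour"/cut: invoke the general lemma that addition is
-- commutative, instantiated at 5 and 3.
example : 5 + 3 = 3 + 5 := Nat.add_comm 5 3
```

The first proof only works for these particular numbers; the lemma in the second proof works for all of them, which is what makes the detour worthwhile.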
How does proving things in mathematics differ from the real world?
[...] In other words, to prove q in real life, we can easily find a p which results in q and it is not very challenging. What makes this challenging in mathematics though
Hum... I'm not sure I understand that one. In your example, the reason may be that the search space is finite, whereas in mathematics it can be infinite. But that's not a general principle if by "real world" you include results in physics, for instance.
Epilogue: Toward the deep structure of logic
The origin of rules. In the 20th century, logicians and computer scientists discovered something amazing called the Curry-Howard isomorphism. It states that:
- The proofs of some logical systems correspond exactly to the computer programs of some computational systems (Intuitionistic Natural Deduction corresponds to the Lambda Calculus).
- For such corresponding systems, the formula proved by a proof can be seen as the type of the corresponding program.
- The elimination of cuts can be seen as the execution of programs.
There is not much literature on this, but I think that correspondence gives a "natural" status to the logical rules, as arising from the natural concept of "computation".
This paradigm interprets logical rules as program constructions: logical implication is seen as a function type in a programming language, conjunction as a pair, disjunction as a sum/algebraic type...
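This dictionary can be sketched in any typed language; here is a hedged illustration (the type and function names are mine):

```typescript
// Conjunction A /\ B = a pair of proofs.
type And<A, B> = [A, B];
// Disjunction A \/ B = a tagged sum: a proof of one side, labelled.
type Or<A, B> = { tag: "left"; value: A } | { tag: "right"; value: B };

// A proof of (A \/ B) -> (B \/ A): a total program of this type.
function orComm<A, B>(x: Or<A, B>): Or<B, A> {
  return x.tag === "left"
    ? { tag: "right", value: x.value }
    : { tag: "left", value: x.value };
}

// Modus ponens is just function application:
// from A -> B and A, deduce B.
function modusPonens<A, B>(f: (a: A) => B, a: A): B {
  return f(a);
}
```

The key point is totality: the program must produce a result for every input, just as a proof must cover every case.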
An even finer analysis of logic. Recently, logicians like Jean-Yves Girard have looked even deeper into the structure of logic. Most proof systems have rules to "duplicate" and "erase" formulas: for instance, given A ⊢ A, the weakened sequent A, A ⊢ A still holds, and from A, A ⊢ A we can contract back to A ⊢ A. What if we remove these rules and see proofs as chemical reactions in which formulas are consumed? This led to Linear Logic, which eventually reintroduces the infinite/stable character of truth through the new operators !A and ?A.
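Seen as programs, the two structural rules are exactly the operations that a linear discipline forbids; a sketch under that framing (the names `contract` and `weaken` are mine):

```typescript
// Contraction: duplicate a hypothesis. From one A, produce two copies.
function contract<A>(a: A): [A, A] {
  return [a, a];
}

// Weakening: erase a hypothesis. Discard A without ever using it.
function weaken<A>(_a: A): void {
  return;
}
```

Linear logic removes precisely these two rules, so that each hypothesis must be consumed exactly once, like a reagent in a chemical reaction; !A then restores the right to duplicate and discard.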
Linear Logic in turn led to a new dynamic and interactive status for proofs, which is still being investigated today. This new approach relies entirely on the concept of cut elimination, now seen as interaction, a central idea of current research in Proof Theory.
Toward a fully internal logic. As you said, the status of axioms and theorems can be problematic: in logic we rely too much on an external world. Logicians have tried to build an internal logic called Ludics, which is also currently under investigation. Ludics has a notion of "partial" proofs, where two partial proofs interact, one trying to prove a formula A and the other its dual/negation, and only one wins.
The denial. Recent research in Proof Theory interacts a lot with computer science and tends to give preferential treatment to logics that can be explained by computation.
A quite controversial question is: what about the logical systems for which the cut-elimination theorem (cuts can be eliminated) does not hold, and which have no corresponding system in Computer Science? Are they really "valid"?
Some systems in "philosophical logic" do not enjoy cut elimination, but that's another debate.
Summary
- A proof can be seen as a program in some computational system from Computer Science.
- Computer Science and the cut rule suggest really interesting directions.
- The cut rule and the cut-elimination procedure may be the core of Logic.
- Without considering the human use of logic, we can study the formal structure of logic even deeper and finer.
- Formal results in Computer Science, Proof Theory and Mathematics should interact more with philosophy: philosophical ideas are unknown to most of the former communities, and computation and technical details to the latter.