The constraints of some languages make it easier to generate fast code (e.g. Fortran vs. C and pointer aliasing), which is a tradeoff between out-of-the-box performance and flexibility.
The language itself is not "optimized" for specific tasks; it is the implementation, the compilers, and the constraints that make the code easier for a compiler to analyze that do that. The real deal is the specific libraries and algorithms used to speed up the computation, switching between them depending on the problem size so the choice stays close to optimal.
For example, GMP's multiplication switches between several algorithms depending on operand size (see the GMP multiplication documentation).
When a language offers higher-level mathematical operations, it is the implementation that makes them efficient, and that implementation is not part of the language specification.
Take a look at matrix rank computation in Matlab, Mathematica and Maple (I cannot rerun all the tests myself right now, but the published timings are consistent with my own). All these languages (environments) implement the same high-level operation, but the implementation details differ, which yields different running times.
When a domain-specific task (and here also a domain-specific language) is oriented toward particular calculations, those get improved and optimized over the years for the target audience. But being optimal is not always the case: Perl, for example, has a long history of string handling, yet PCRE (here, simply Perl-style regular expressions) is not the fastest regex engine in existence (and uses a lot of memory); it is, however, extremely expressive and powerful.
The constraints of the language also make a difference in the compilation process: the pointer aliasing mentioned above prevents code reordering and forces variables to be reloaded from memory.