Mathematica 10 provides beautiful high-level machine learning functionality. Sadly, the learned functions, once created, are rather opaque objects. I need to use them in other projects, so here is my question:

Can I export the neural networks, random forest decision trees, or any other internals of the learned functions for external use? For example, can I export a neural net to the standard FANN file format?

P.S. My real hope is to use Mathematica V10 to generate machine-vision convolutional neural networks and then embed them in my Clojure, Python, and iOS projects.
6 Comments

  • I've been wondering the same thing. Also, I don't believe v10 machine learning supports convolutional nets, and exporting image-processing-related code is not easy. – M.R., Jul 31, 2014 at 15:47
  • This question was asked last year. Are there any worthy updates? – Levi, Sep 2, 2015 at 19:33
  • Any further development on this? I saw in recent videos that they discussed exporting models. – May 4, 2016 at 8:53
  • Any more updates? – M.R., Mar 9, 2020 at 4:21
  • If the OP's interest is in neural networks (convolutional or not) specifically: you can export trained networks in ONNX format, see reference.wolfram.com/language/ref/format/ONNX.html – Feb 20, 2021 at 0:14
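Following the last comment, the ONNX route is a one-liner in the Wolfram Language (a sketch; `trainedNet` is an illustrative name for any trained net object, and ONNX export requires a reasonably recent Mathematica version):

```
(* Export a trained net to the interoperable ONNX format... *)
Export["mynet.onnx", trainedNet]

(* ...which can then be loaded by ONNX runtimes in other languages,
   or re-imported into the Wolfram Language: *)
net = Import["mynet.onnx"]
```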

2 Answers

It's on our list of things to do, but there are many other areas we want to cover, such as custom feature functions, customizable feature selection, boosting, NLP, deep learning of neural networks, convolutional nets, GPU acceleration, and so on.

Until then, your only real solution is to deploy your trained classifier as an API function. Some simpler classifiers and predictors can output a pure function that is fairly easy to translate 'by hand':

p = Predict[{{1.3, "P"} -> 1, {1.8, "Q"} -> 2.5, {1.9, "Q"} -> 3,
             {0.2, "P"} -> 1, {-3.2, "P"} -> -4.2, {0.3, "Q"} -> 2}];

PredictorInformation[p, "Function"]
0.452129 - 0.530351 Boole[#2 === "P"] + 0.530351 Boole[#2 === "Q"] + 1.12488 #1 &
5 Comments

  • And Bayesian networks, with the ability to edit/provide your own nets, please! This is a showstopper for me. – user4860, Jul 31, 2014 at 22:13
  • Hi from 2017! Any news? – sereizam, Mar 9, 2017 at 23:10
  • @mazieres I encourage you to file this as a suggestion with technical support. I don't work on this feature. – Mar 19, 2017 at 11:59
  • @TaliesinBeynon Any update on this? – ngc1300, Nov 13, 2020 at 1:36
  • Are there any updates on this? Is there a way now to export or compile a ClassifierFunction in general? – sepehr78, Mar 31, 2021 at 17:25
TL;DR: I successfully re-implemented ("exported" + tested + verified) Mathematica's GradientBoostedTrees PredictorFunction[] model in Python.

By "exported", I mean I thoroughly matched each predicted value (Mathematica vs. Python).

Here is a simplified workflow:

  1. On the trained GradientBoostedTrees PredictorFunction (I name it p7mmv), run Information[p7mmv, "MethodOption"].

  2. Referring to the Wolfram docs, I found that it is an implementation of https://lightgbm.readthedocs.io/en/latest/ .

    a) With exactly the same training and validation sets, train the new model.

    b) For the settings, use all the info from step 1 and apply it via https://lightgbm.readthedocs.io/en/latest/Parameters.html

  3. Run the prediction output and compare (all of it).

Just wish to share a (small) successful Mathematica trained-model re-make story. Any clarifications/comments/improvements are welcome.
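The verification in step 3 of the workflow above can be sketched as a simple elementwise tolerance check. This is an illustrative helper of my own (the two prediction lists would come from the Mathematica model and the retrained LightGBM model respectively); exact bit-for-bit equality is not a realistic goal across two implementations, hence the tolerance.

```python
import math

def predictions_match(wl_preds, py_preds, tol=1e-6):
    """Return True if the two prediction sequences agree elementwise within tol."""
    if len(wl_preds) != len(py_preds):
        return False
    return all(math.isclose(a, b, abs_tol=tol) for a, b in zip(wl_preds, py_preds))
```

For example, `predictions_match(mathematica_preds, lightgbm_model.predict(X))` over both the training and validation sets is the "compare (all)" step.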
