This section describes how to invoke the remote computing service from a client program and how to view the results on the Web page.

6.4. Client-Server Interface Invocation

When MindOpt's remote computing service is used, the optimization run is split into two parts, model building on the client and solving on the computing server, with communication between them handled through the API. Users build the optimization model and set the relevant algorithm parameters with the standard MindOpt APIs (C/C++/Python; more language interfaces will be supported later), and then submit the job to the server for solving. After the server accepts the job request, it returns a unique job ID, job_id. The user can then use this job_id to query and monitor the status of the optimization job (Submitted, Solving, Failed, Finished). Once the job has been processed (its status is Finished or Failed), the client can fetch the result from the server.

Submitting jobs and retrieving results from the client mainly involves the following two kinds of interfaces:

  • Upload model: submits a job to the server and obtains the returned job ID.

  • Retrieve result: queries the server for the job status by the job ID and downloads the computation result.

Client interfaces:

Interface         C                     C++                                  Python
Upload model      Mdo_submitTask()      mindopt::MdoModel::submitTask()      mindoptpy.MdoModel.submit_task()
Retrieve result   Mdo_retrieveTask()    mindopt::MdoModel::retrieveTask()    mindoptpy.MdoModel.retrieve_task()
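
The fragment below sketches this submit-and-retrieve workflow in C (a minimal illustration only: error handling is omitted and model is assumed to have been created and populated elsewhere); the complete programs in the following sections show the full usage.

char      job_id[1024] = { "\0" };
char      status[1024] = { "\0" };
MdoStatus model_status;
MdoResult result;
MdoBool   has_soln;

/* Upload the serialized model and parameters; the server returns a job ID. */
Mdo_submitTask(model, job_id);

/* Later (possibly in a separate program), query the job by its ID and check
 * whether a solution is available for download. */
Mdo_retrieveTask(model, job_id, status, &model_status, &result, &has_soln);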

When uploading a model, the serialized model and parameter files need to be uploaded to the remote server for solving; when retrieving the result, the serialized model and parameters also need to be loaded. To cover different usage scenarios, there are two main ways to configure how these serialized files are stored and read.

With the first approach, the user chooses the file names explicitly. Taking C as an example, model_file and param_file specify the file names of the serialized model and parameter files, respectively (if a file name contains a path, the path must exist and be writable, otherwise an error is reported). The optimization program then instructs the solver to write these two files to the specified locations.

Mdo_writeTask(model, model_file, MDO_YES, MDO_NO, MDO_NO);
Mdo_writeTask(model, param_file, MDO_NO, MDO_YES, MDO_NO);

Then set the parameters so that the solver uploads the corresponding files to the remote server for solving.

Mdo_setStrParam(model, "Remote/File/Model", model_file);
Mdo_setStrParam(model, "Remote/File/Param", param_file);

In the result-retrieval stage, if the model is empty, the serialized model and parameters need to be loaded as follows.

Mdo_readTask(model, model_file, MDO_YES, MDO_NO, MDO_NO);
Mdo_readTask(model, param_file, MDO_NO, MDO_YES, MDO_NO);

In addition, because the computation result returned by the remote server has to be saved when retrieving the result, the storage path of the serialized solution file must also be specified (if the file name contains a path, the path must exist and be writable, otherwise an error is reported). The parameters "Remote/File/Model" and "Remote/File/Param" must also be set so that the serialized files can be read.

const char * soln_file = "./my_soln.mdo";

Mdo_setStrParam(model, "Remote/File/Soln", soln_file);
Mdo_setStrParam(model, "Remote/File/Model", model_file);
Mdo_setStrParam(model, "Remote/File/Param", param_file);

With the second approach, the user only specifies the folder in which the files will be saved, for example filepath (the path must exist and be writable). Then set the parameter "Remote/File/Path" directly, and the solver writes the serialized files automatically.

const char * filepath = "./tmp";

Mdo_setStrParam(model, "Remote/File/Path", filepath);

Similarly, in the result-retrieval stage, once this parameter is set, the solver uses filepath and job_id to locate the serialized files under that path and load them into the model and parameters. For example, if job_id is 10, the serialized model, parameter, and solution files are named m10.mdo, p10.mdo, and s10.mdo, respectively. A sketch of the retrieval-side setting follows.
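
As a sketch (assuming the same ./tmp folder as in the submission program), the retrieval program only needs to set this one parameter; the solver then locates and loads the serialized files by job_id on its own.

const char * filepath = "./tmp";

/* When retrieving, the solver looks for m<job_id>.mdo, p<job_id>.mdo and
 * s<job_id>.mdo under this folder and loads them automatically. */
Mdo_setStrParam(model, "Remote/File/Path", filepath);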

All of our APIs are built on top of the C API. The C API is responsible for building the internal data structures, invoking the optimization algorithms in MindOpt, and generating solution-related information. When MindOpt runs on a local machine, the C API builds these data structures in local memory. On the computing server, the MindOpt C API takes the received model and parameter data as input to the optimization algorithms and produces the solution data as output.

The following sections use concrete sample code to show how to use the remote computing service through the API in the different client SDKs (C/C++/Python).

6.5. Client C Program Examples

The following C programs show how to call the Mdo_submitTask and Mdo_retrieveTask APIs to upload a job and retrieve the result.

6.5.1. Uploading the Model

6.5.1.1. Uploading a Model Read from an MPS File

The program in this part shows how to read an MPS file, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide a prepared MPS file, the algorithm parameters, the server Token (obtain your own Token), and other required information.

/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
#include <stdio.h>
#include "Mindopt.h"

/* Macro to check the return code */
#define MDO_CHECK_CALL(MDO_CALL)                                    \
    code = MDO_CALL;                                                \
    if (code != MDO_OKAY)                                           \
    {                                                               \
        Mdo_explainResult(model, code, str);                        \
        Mdo_freeMdl(&model);                                        \
        fprintf(stderr, "===================================\n");   \
        fprintf(stderr, "Error   : code <%d>\n", code);             \
        fprintf(stderr, "Reason  : %s\n", str);                     \
        fprintf(stderr, "===================================\n");   \
        return (int)code;                                           \
    }

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    char         str[1024] = { "\0" };    
    /* Input mps file (optimization problem) */
    const char * input_file = "./afiro.mps";
    /* Output serialized files */
    const char * model_file = "./my_model.mdo";
    const char * param_file = "./my_param.mdo";
    MdoResult    code = MDO_OKAY;
    MdoMdl *     model = NULL;
    char         job_id[1024]= { "\0" };
     
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MDO_CHECK_CALL(Mdo_createMdl(&model));
    
    /*------------------------------------------------------------------*/
    /* Step 2. Input model and parameters.                              */
    /*------------------------------------------------------------------*/
    /* Read model from file. */
    MDO_CHECK_CALL(Mdo_readProb(model, input_file));
    
    /* Input parameters. */
    Mdo_setIntParam(model, "NumThreads", 4);
    Mdo_setRealParam(model, "MaxTime", 3600);

    /*------------------------------------------------------------------*/
    /* Step 3. Serialize the model and the parameters.                  */
    /*------------------------------------------------------------------*/
    MDO_CHECK_CALL(Mdo_writeTask(model, model_file, MDO_YES, MDO_NO, MDO_NO));
    MDO_CHECK_CALL(Mdo_writeTask(model, param_file, MDO_NO, MDO_YES, MDO_NO));
        
    /*------------------------------------------------------------------*/
    /* Step 4. Input parameters related to remote computing.            */
    /*------------------------------------------------------------------*/
    /* Input parameters related to remote computing. */
    Mdo_setStrParam(model, "Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
    Mdo_setStrParam(model, "Remote/Desc", "my model");
    Mdo_setStrParam(model, "Remote/Server", "127.0.0.1"); // Change to your server IP 
    Mdo_setStrParam(model, "Remote/File/Model", model_file);
    Mdo_setStrParam(model, "Remote/File/Param", param_file);

    /*------------------------------------------------------------------*/
    /* Step 5. Upload serialize model and parameter to server, and      */
    /*         then optimize the model.                                 */
    /*------------------------------------------------------------------*/
    MDO_CHECK_CALL(Mdo_submitTask(model, job_id));
    if (job_id[0] == '\0')
    {
        printf("ERROR: Invalid job ID: %s\n", job_id);
    }
    else
    {
        printf("Job was submitted to server successfully.\n");
        printf("User may query the optimization result with the following job ID: %s\n", job_id);
    }
     
    /*------------------------------------------------------------------*/
    /* Step 6. Free the model.                                          */
    /*------------------------------------------------------------------*/
    /* Free the model. */
    Mdo_freeMdl(&model);
    
    return (int)code;
}

6.5.1.2. Uploading a Custom Optimization Model

The program in this part shows how to build a custom optimization model, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide the algorithm parameters, the server Token (obtain your own Token), and other required information.

/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
#include <stdio.h>
#include "Mindopt.h"

/* Macro to check the return code */
#define MDO_CHECK_CALL(MDO_CALL)                                    \
    code = MDO_CALL;                                                \
    if (code != MDO_OKAY)                                           \
    {                                                               \
        Mdo_explainResult(model, code, str);                        \
        Mdo_freeMdl(&model);                                        \
        fprintf(stderr, "===================================\n");   \
        fprintf(stderr, "Error   : code <%d>\n", code);             \
        fprintf(stderr, "Reason  : %s\n", str);                     \
        fprintf(stderr, "===================================\n");   \
        return (int)code;                                           \
    }

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    char         str[1024] = { "\0" };    
    const char * model_file = "./my_model.mdo";
    const char * param_file = "./my_param.mdo";
    MdoResult    code = MDO_OKAY;
    MdoMdl *     model = NULL;
    char         job_id[1024]= { "\0" };
    
    const int    row1_idx[] = { 0,   1,   2,   3   };
    const double row1_val[] = { 1.0, 1.0, 2.0, 3.0 };
    const int    row2_idx[] = { 0,    2,   3   };
    const double row2_val[] = { 1.0, -1.0, 6.0 };
    
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MDO_CHECK_CALL(Mdo_createMdl(&model));
    
    /*------------------------------------------------------------------*/
    /* Step 2. Input model and parameters.                              */
    /*------------------------------------------------------------------*/
    /* Change to minimization problem. */
    Mdo_setIntAttr(model, "MinSense", MDO_YES);

    /* Add variables. */
    MDO_CHECK_CALL(Mdo_addCol(model, 0.0, 10.0,         1.0, 0, NULL, NULL, "x0", MDO_NO));
    MDO_CHECK_CALL(Mdo_addCol(model, 0.0, MDO_INFINITY, 1.0, 0, NULL, NULL, "x1", MDO_NO));
    MDO_CHECK_CALL(Mdo_addCol(model, 0.0, MDO_INFINITY, 1.0, 0, NULL, NULL, "x2", MDO_NO));
    MDO_CHECK_CALL(Mdo_addCol(model, 0.0, MDO_INFINITY, 1.0, 0, NULL, NULL, "x3", MDO_NO));

    /* Add constraints.
     * Note that the nonzero elements are inputted in a row-wise order here.
     */
    MDO_CHECK_CALL(Mdo_addRow(model, 1.0, MDO_INFINITY, 4, row1_idx, row1_val, "c0"));
    MDO_CHECK_CALL(Mdo_addRow(model, 1.0, 1.0,          3, row2_idx, row2_val, "c1"));
    
    /* Input parameters. */
    Mdo_setIntParam(model, "NumThreads", 4);
    Mdo_setRealParam(model, "MaxTime", 3600);

    /*------------------------------------------------------------------*/
    /* Step 3. Serialize the model and the parameters for later use.    */
    /*------------------------------------------------------------------*/
    MDO_CHECK_CALL(Mdo_writeTask(model, model_file, MDO_YES, MDO_NO, MDO_NO));
    MDO_CHECK_CALL(Mdo_writeTask(model, param_file, MDO_NO, MDO_YES, MDO_NO));
        
    /*------------------------------------------------------------------*/
    /* Step 4. Input parameters related to the remote computing server. */
    /*------------------------------------------------------------------*/
    /* Input parameters related to remote computing. */
    Mdo_setStrParam(model, "Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
    Mdo_setStrParam(model, "Remote/Desc", "my model");
    Mdo_setStrParam(model, "Remote/Server", "127.0.0.1"); // Change to your server IP
    Mdo_setStrParam(model, "Remote/File/Model", model_file);
    Mdo_setStrParam(model, "Remote/File/Param", param_file);

    /*------------------------------------------------------------------*/
    /* Step 5. Upload the serialized model and parameters to server, and*/
    /*         then optimize the model.                                 */
    /*------------------------------------------------------------------*/
    MDO_CHECK_CALL(Mdo_submitTask(model, job_id));
    if (job_id[0] == '\0')
    {
        printf("ERROR: Invalid job ID: %s\n", job_id);
    }
    else
    {
        printf("Job was submitted to server successfully.\n");
        printf("User may query the optimization result with the following job ID: %s\n", job_id);
    }
     
    /*------------------------------------------------------------------*/
    /* Step 6. Free the model.                                          */
    /*------------------------------------------------------------------*/
    /* Free the model. */
    Mdo_freeMdl(&model);
    
    return (int)code;
}

6.5.2. Retrieving the Result

After the computing job has been submitted, the following sample program can be used to check the run status and download the solution of the problem. It shows how to deserialize the model and parameters, query the solution status, fetch the solution, and print the result.

Note

You need to provide the server Token, the job_id, and other required information.

/**
 *  Description
 *  -----------
 *
 *  Check the solution status, download the solution, and populate the result.
 */
#include <stdio.h>
#include <string.h> /* for strcpy */
#if defined(__APPLE__) || defined(__linux__)
#   include <unistd.h>
#   define SLEEP_10_SEC sleep(10)
#else
#   include <windows.h>
#   define SLEEP_10_SEC Sleep(10000)
#endif
#include "Mindopt.h"

/* Macro to check the return code */
#define MDO_CHECK_CALL(MDO_CALL)                                    \
    code = MDO_CALL;                                                \
    if (code != MDO_OKAY)                                           \
    {                                                               \
        Mdo_explainResult(model, code, str);                        \
        Mdo_freeMdl(&model);                                        \
        fprintf(stderr, "===================================\n");   \
        fprintf(stderr, "Error   : code <%d>\n", code);             \
        fprintf(stderr, "Reason  : %s\n", str);                     \
        fprintf(stderr, "===================================\n");   \
        return (int)code;                                           \
    }

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    char         str[1024] = { "\0" };    
    char         status[1024] = { "\0" };    
    /* Serialized files, must be the same in the submission code */
    const char * model_file = "./my_model.mdo";
    const char * param_file = "./my_param.mdo";
    /* Output solution file */
    const char * soln_file = "./my_soln.mdo";
    const char * sol_file = "./my_soln.sol";
    MdoResult    code = MDO_OKAY;
    MdoStatus    model_status = MDO_UNKNOWN;    
    char         model_status_details[1024] = { "\0" };    
    MdoResult    result = MDO_OKAY;    
    char         result_details[1024] = { "\0" };    
    MdoBool      has_soln = MDO_NO;        
    MdoMdl *     model = NULL;
    char         job_id[1024] = { "\0" };
    double       val = 0.0;
         
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MDO_CHECK_CALL(Mdo_createMdl(&model));
    
    /*------------------------------------------------------------------*/
    /* Step 2. Read the serialized model and parameters -- this is      */
    /*         required while populating the optimization result.       */
    /*------------------------------------------------------------------*/
    MDO_CHECK_CALL(Mdo_readTask(model, model_file, MDO_YES, MDO_NO, MDO_NO));
    MDO_CHECK_CALL(Mdo_readTask(model, param_file, MDO_NO, MDO_YES, MDO_NO));
    
    /*------------------------------------------------------------------*/
    /* Step 3. Input parameters related to the remote computing server. */
    /*------------------------------------------------------------------*/
    Mdo_setStrParam(model, "Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
    Mdo_setStrParam(model, "Remote/Server", "127.0.0.1"); // Change to your server IP
    Mdo_setStrParam(model, "Remote/File/Model", model_file);
    Mdo_setStrParam(model, "Remote/File/Param", param_file);
    Mdo_setStrParam(model, "Remote/File/Soln", soln_file);
    strcpy(job_id, "1"); // Change to the jobID you received after submit the task
        
    /*------------------------------------------------------------------*/
    /* Step 4. Check the solution status periodically, and              */
    /*         download the solution upon availability.                 */
    /*------------------------------------------------------------------*/
    do 
    {        
        MDO_CHECK_CALL(Mdo_retrieveTask(model, job_id, status, &model_status, &result, &has_soln));
        
        /* Sleep for 10 seconds. */
        SLEEP_10_SEC;
    }
    while (status[0] == 'S'); /* Continue the loop if the status is in either Submitted status or Solving status. */
    
    /*------------------------------------------------------------------*/
    /* Step 5. Un-serialize the solution and then populate the result.  */
    /*------------------------------------------------------------------*/  
    printf("\nPopulating optimization results.\n");    
    Mdo_explainStatus(model, model_status, model_status_details);
    Mdo_explainResult(model, result, result_details);
    
    printf(" - Job status             : %s\n", status);    
    printf(" - Model status           : %s (%d)\n", model_status_details, model_status);
    printf(" - Optimization status    : %s (%d)\n", result_details, result);
    printf(" - Solution availability  : %s\n", has_soln ? "available" : "not available");
    if (has_soln)
    {
        printf("\nPopulating solution.\n");    
        MDO_CHECK_CALL(Mdo_readTask(model, soln_file, MDO_NO, MDO_NO, MDO_YES));
        Mdo_displayResults(model);
        Mdo_writeSoln(model, sol_file);

        Mdo_getRealAttr(model, "PrimalObjVal", &val);
        printf(" - Primal objective value : %e\n", val);
        Mdo_getRealAttr(model, "DualObjVal", &val);
        printf(" - Dual objective value   : %e\n", val);
        Mdo_getRealAttr(model, "SolutionTime", &val);
        printf(" - Solution time          : %e sec.\n", val);
    }
    
    /*------------------------------------------------------------------*/
    /* Step 6. Free the model.                                          */
    /*------------------------------------------------------------------*/
    /* Free the model. */
    Mdo_freeMdl(&model);
       
    return (int)code;
}

6.6. Client C++ Program Examples

The following C++ programs show how to call the submitTask and retrieveTask APIs to upload a job and retrieve the result.

6.6.1. Uploading the Model

6.6.1.1. Uploading a Model Read from an MPS File

The program in this part shows how to read an MPS file, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide a prepared MPS file, the algorithm parameters, the server Token (obtain your own Token), and other required information.

/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
#include <iostream>
#include <vector>
#include "MindoptCpp.h"

using namespace mindopt;

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    /* Input mps file (optimization problem) */
    std::string input_file("./afiro.mps");
    /* Output serialized files */
    std::string model_file("./my_model.mdo");
    std::string param_file("./my_param.mdo");
    std::string job_id;
     
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MdoModel model;
    /* Read model from mps file */
    model.readProb(input_file);
    
    
    try 
    {
        /*------------------------------------------------------------------*/
        /* Step 2. Input model and parameters.                              */
        /*------------------------------------------------------------------*/
        /* Input parameters. */
        model.setIntParam("NumThreads", 4);
        model.setRealParam("MaxTime", 3600);                
        
        /*------------------------------------------------------------------*/
        /* Step 3. Serialize the model and the parameters.                  */
        /*------------------------------------------------------------------*/
        model.writeTask(model_file, MDO_YES, MDO_NO, MDO_NO);
        model.writeTask(param_file, MDO_NO, MDO_YES, MDO_NO);
               
        /*------------------------------------------------------------------*/
        /* Step 4. Input parameters related to remote computing.            */
        /*------------------------------------------------------------------*/
        /* Input parameters related to remote computing. */
        model.setStrParam("Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
        model.setStrParam("Remote/Desc", "my model");
        model.setStrParam("Remote/Server", "127.0.0.1"); // Change to your server IP 
        model.setStrParam("Remote/File/Model", model_file);
        model.setStrParam("Remote/File/Param", param_file);
        
        /*------------------------------------------------------------------*/
        /* Step 5. Upload serialize model and parameter to server, and      */
        /*         then optimize the model.                                 */
        /*------------------------------------------------------------------*/
        job_id = model.submitTask();
        if (job_id == "")
        {
            std::cout << "ERROR: Empty job ID." << std::endl;
        }
        else
        {
            std::cout << "Job was submitted to server successfully." << std::endl;
            std::cout << "User may query the optimization result with the following job ID: " << job_id << std::endl;
        }
    }
    catch (MdoException & e)
    {
        std::cerr << "===================================" << std::endl;
        std::cerr << "Error   : code <" << e.getResult() << ">" << std::endl;
        std::cerr << "Reason  : " << model.explainResult(e.getResult()) << std::endl;
        std::cerr << "===================================" << std::endl;

        return static_cast<int>(e.getResult());
    }

    return static_cast<int>(MDO_OKAY);
}

6.6.1.2. Uploading a Custom Optimization Model

The program in this part shows how to build a custom optimization model, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide the algorithm parameters, the server Token (obtain your own Token), and other required information.

/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
#include <iostream>
#include <vector>
#include "MindoptCpp.h"

using namespace mindopt;

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    std::string model_file("./my_model.mdo");
    std::string param_file("./my_param.mdo");
    std::string job_id;
     
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MdoModel model;
    
    try 
    {
        /*------------------------------------------------------------------*/
        /* Step 2. Input model and parameters.                              */
        /*------------------------------------------------------------------*/
        /* Change to minimization problem. */
        model.setIntAttr("MinSense", MDO_YES);

        /* Add variables. */
        std::vector<MdoVar> x;
        x.push_back(model.addVar(0.0, 10.0,         1.0, "x0", MDO_NO));
        x.push_back(model.addVar(0.0, MDO_INFINITY, 1.0, "x1", MDO_NO));
        x.push_back(model.addVar(0.0, MDO_INFINITY, 1.0, "x2", MDO_NO));
        x.push_back(model.addVar(0.0, MDO_INFINITY, 1.0, "x3", MDO_NO));

        /* Add constraints. */
        model.addCons(1.0, MDO_INFINITY, 1.0 * x[0] + 1.0 * x[1] + 2.0 * x[2] + 3.0 * x[3], "c0");
        model.addCons(1.0, 1.0,          1.0 * x[0]              - 1.0 * x[2] + 6.0 * x[3], "c1");
        
        /* Input parameters. */
        model.setIntParam("NumThreads", 4);
        model.setRealParam("MaxTime", 3600);                
        
        /*------------------------------------------------------------------*/
        /* Step 3. Serialize the model and the parameters for later use.    */
        /*------------------------------------------------------------------*/
        model.writeTask(model_file, MDO_YES, MDO_NO, MDO_NO);
        model.writeTask(param_file, MDO_NO, MDO_YES, MDO_NO);
               
        /*------------------------------------------------------------------*/
        /* Step 4. Input parameters related to the remote computing server. */
        /*------------------------------------------------------------------*/
        /* Input parameters related to remote computing. */
        model.setStrParam("Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
        model.setStrParam("Remote/Desc", "my model");
        model.setStrParam("Remote/Server", "127.0.0.1"); // Change to your server IP 
        model.setStrParam("Remote/File/Model", model_file);
        model.setStrParam("Remote/File/Param", param_file);
        
        /*------------------------------------------------------------------*/
        /* Step 5. Upload the serialized model and parameters to server, and*/
        /*         then optimize the model.                                 */
        /*------------------------------------------------------------------*/
        job_id = model.submitTask();
        if (job_id == "")
        {
            std::cout << "ERROR: Empty job ID." << std::endl;
        }
        else
        {
            std::cout << "Job was submitted to server successfully." << std::endl;
            std::cout << "User may query the optimization result with the following job ID: " << job_id << std::endl;
        }
    }
    catch (MdoException & e)
    {
        std::cerr << "===================================" << std::endl;
        std::cerr << "Error   : code <" << e.getResult() << ">" << std::endl;
        std::cerr << "Reason  : " << model.explainResult(e.getResult()) << std::endl;
        std::cerr << "===================================" << std::endl;

        return static_cast<int>(e.getResult());
    }

    return static_cast<int>(MDO_OKAY);
}

6.6.2. Retrieving the Result

After the computing job has been submitted, the following sample program can be used to check the run status and download the solution of the problem. It shows how to deserialize the model and parameters, query the solution status, fetch the solution, and print the result.

Note

You need to provide the server Token, the job_id, and other required information.

/**
 *  Description
 *  -----------
 *
 *  Check the solution status, download the solution, and populate the result.
 */
#include <iostream>
#include <vector>
#include <stdio.h>
#if defined(__APPLE__) || defined(__linux__)
#   include <unistd.h>
#   define SLEEP_10_SEC sleep(10)
#else
#   include <windows.h>
#   define SLEEP_10_SEC Sleep(10000)
#endif
#include "MindoptCpp.h"

using namespace mindopt;

int main(
    int argc,
    char * argv[])
{
    /* Variables. */
    /* Serialized files, must be the same in the submission code */
    std::string  model_file("./my_model.mdo");
    std::string  param_file("./my_param.mdo");
    /* Output solution file */
    std::string  soln_file("./my_soln.mdo");
    std::string  sol_file("./my_soln.sol");
    MdoStatus    model_status = MDO_UNKNOWN;    
    std::string  model_status_details;    
    std::string  status;
    MdoResult    result = MDO_OKAY;    
    std::string  result_details;    
    MdoBool      has_soln = MDO_NO;        
    std::string  job_id;
    double       val = 0.0;
     
    /*------------------------------------------------------------------*/
    /* Step 1. Create a model and change the parameters.                */
    /*------------------------------------------------------------------*/
    /* Create an empty model. */
    MdoModel model;
    
    try 
    {
        /*------------------------------------------------------------------*/
        /* Step 2. Read the serialized model and parameters -- this is      */
        /*         required while populating the optimization result.       */
        /*------------------------------------------------------------------*/
        model.readTask(model_file, MDO_YES, MDO_NO, MDO_NO);
        model.readTask(param_file, MDO_NO, MDO_YES, MDO_NO);

        /*------------------------------------------------------------------*/
        /* Step 3. Input parameters related to the remote computing server. */
        /*------------------------------------------------------------------*/
        model.setStrParam("Remote/Token", "xxxxxxxxxtokenxxxxxxxx"); // Change to your token
        model.setStrParam("Remote/Server", "127.0.0.1"); // Change to your server IP
        model.setStrParam("Remote/File/Model", model_file);
        model.setStrParam("Remote/File/Param", param_file);
        model.setStrParam("Remote/File/Soln", soln_file);
        job_id = "1"; // Change to the jobID you received after submit the task

        /*------------------------------------------------------------------*/
        /* Step 4. Check the solution status periodically, and              */
        /*         download the solution upon availability.             */
        /*------------------------------------------------------------------*/
        do 
        {        
            status = model.retrieveTask(job_id, model_status, result, has_soln);
            /* Sleep for 10 seconds. */
            SLEEP_10_SEC;
        }
        while (status[0] == 'S'); /* Continue the loop if the status is in either Submitted status or Solving status. */

        /*------------------------------------------------------------------*/
        /* Step 5. Un-serialize the solution and then populate the result.  */
        /*------------------------------------------------------------------*/  
        std::cout << std::endl << "Populating optimization results." << std::endl;    
        model_status_details = model.explainStatus(model_status);
        result_details = model.explainResult(result);

        std::cout << " - Job status             : " << status << std::endl;    
        std::cout << " - Model status           : " << model_status_details << " (" << model_status << ")" << std::endl;
        std::cout << " - Optimization status    : " << result_details << " (" << result << ")" << std::endl;
        std::cout << " - Solution availability  : " << std::string(has_soln ? "available" : "not available") << std::endl;
        if (has_soln)
        {
            std::cout << "Populating solution." << std::endl;    
            model.readTask(soln_file, MDO_NO, MDO_NO, MDO_YES);
            model.displayResults();
            model.writeSoln(sol_file);

            val = model.getRealAttr("PrimalObjVal");
            std::cout << " - Primal objective value : " << val << std::endl;
            val = model.getRealAttr("DualObjVal");
            std::cout << " - Dual objective value   : " << val << std::endl;
            val = model.getRealAttr("SolutionTime");
            std::cout << " - Solution time          : " << val << std::endl;
        }
    }
    catch (MdoException & e)
    {
        std::cerr << "===================================" << std::endl;
        std::cerr << "Error   : code <" << e.getResult() << ">" << std::endl;
        std::cerr << "Reason  : " << model.explainResult(e.getResult()) << std::endl;
        std::cerr << "===================================" << std::endl;

        return static_cast<int>(e.getResult());
    }

    return static_cast<int>(MDO_OKAY);
}

6.7. Client Python Program Examples

The following Python programs show how to call the submit_task and retrieve_task APIs to upload a job and retrieve the result.

6.7.1. Uploading the Model

6.7.1.1. Uploading a Model Read from an MPS File

The program in this part shows how to read an MPS file, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide a prepared MPS file, the algorithm parameters, the server Token (obtain your own Token), and other required information.

"""
/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
"""
from mindoptpy import *


if __name__ == "__main__":

    # Input mps file (optimization problem)
    input_file = "./afiro.mps"
    # Output serialized files
    model_file = "./my_model.mdo"
    param_file = "./my_param.mdo"
    job_id = ""

    # Step 1. Create a model and change the parameters.
    model = MdoModel()
    model.read_prob(input_file)

    try:
        # Step 2. Input model.
        model.set_int_param("NumThreads", 4)
        model.set_real_param("MaxTime", 3600.0)
        
        # Step 3. Serialize the model and the parameters. 
        model.write_task(model_file, True, False, False)
        model.write_task(param_file, False, True, False)        

        # Step 4. Input parameters related to remote computing. 
        model.set_str_param("Remote/Token", "xxxxxxxxxtokenxxxxxxxx") # Change to your token
        model.set_str_param("Remote/Desc", "afiro model")
        model.set_str_param("Remote/Server", "127.0.0.1") # Change to your server IP
        model.set_str_param("Remote/File/Model", model_file)
        model.set_str_param("Remote/File/Param", param_file)

        # Step 5. Upload serialize model and parameter to server, and then optimize the model.
        job_id = model.submit_task()
        if job_id == "":
            print("ERROR: Empty job ID.")
        else:
            print("Job was submitted to server successfully.")
            print("User may query the optimization result with the following job ID: {}".format(job_id))

    except MdoError as e:
        print("Received Mindopt exception.")
        print(" - Code          : {}".format(e.code))
        print(" - Reason        : {}".format(e.message))
    except Exception as e:
        print("Received exception.")
        print(" - Reason        : {}".format(e))
    finally:
        # Step 6. Free the model.
        model.free_mdl()

6.7.1.2. Uploading a Custom Optimization Model

The program in this part shows how to build a custom optimization model, set the algorithm parameters, serialize the model and parameters, and submit the job. If the submission succeeds, the server returns a job_id, which we use to query the job status.

Note

You need to provide the algorithm parameters, the server Token (obtain your own Token), and other required information.

"""
/**
 *  Description
 *  -----------
 *
 *  Input model, specify parameters, serialize all the inputs, and then upload these data onto server.
 */
"""
from mindoptpy import *


if __name__ == "__main__":

    model_file = "./my_model.mdo"
    param_file = "./my_param.mdo"
    job_id = ""
    MDO_INFINITY = MdoModel.get_infinity()

    # Step 1. Create a model and change the parameters.
    model = MdoModel()

    try:
        # Step 2. Input model.
        # Change to minimization problem.
        model.set_int_attr("MinSense", 1)
        
        # Add variables.
        x = []
        x.append(model.add_var(0.0,         10.0, 1.0, None, "x0", False))
        x.append(model.add_var(0.0, MDO_INFINITY, 1.0, None, "x1", False))
        x.append(model.add_var(0.0, MDO_INFINITY, 1.0, None, "x2", False))
        x.append(model.add_var(0.0, MDO_INFINITY, 1.0, None, "x3", False))

        # Add constraints.
        # Note that the nonzero elements are inputted in a row-wise order here.
        model.add_cons(1.0, MDO_INFINITY, 1.0 * x[0] + 1.0 * x[1] + 2.0 * x[2] + 3.0 * x[3], "c0")
        model.add_cons(1.0,          1.0, 1.0 * x[0]              - 1.0 * x[2] + 6.0 * x[3], "c1")

        # Specify parameters.
        model.set_int_param("NumThreads", 4)
        model.set_real_param("MaxTime", 3600.0)
        
        # Step 3. Serialize the model and the parameters for later use.
        model.write_task(model_file, True, False, False)
        model.write_task(param_file, False, True, False)        

        # Step 4. Input parameters related to the remote computing server.
        model.set_str_param("Remote/Token", "xxxxxxxxxtokenxxxxxxxx") # Change to your token
        model.set_str_param("Remote/Desc", "my model")
        model.set_str_param("Remote/Server", "127.0.0.1") # Change to your server IP
        model.set_str_param("Remote/File/Model", model_file)
        model.set_str_param("Remote/File/Param", param_file)

        # Step 5. Upload the serialized model and parameters to server, and then optimize the model.   
        job_id = model.submit_task()
        if job_id == "":
            print("ERROR: Empty job ID.")
        else:
            print("Job was submitted to server successfully.")
            print("User may query the optimization result with the following job ID: {}".format(job_id))

    except MdoError as e:
        print("Received Mindopt exception.")
        print(" - Code          : {}".format(e.code))
        print(" - Reason        : {}".format(e.message))
    except Exception as e:
        print("Received exception.")
        print(" - Reason        : {}".format(e))
    finally:
        # Step 6. Free the model.
        model.free_mdl()

6.7.2. Retrieving the Result

After the computing job has been submitted, the following sample program can be used to check the run status and download the solution of the problem. It shows how to deserialize the model and parameters, query the solution status, fetch the solution, and print the result.

Note

You need to provide the server Token, the job_id, and other required information.

"""
/**
 *  Description
 *  -----------
 *
 *  Check the solution status, download the solution, and populate the result.
 */
"""
from mindoptpy import *
import time


if __name__ == "__main__":

    # Serialized files, must be the same in the submission code
    model_file = "./my_model.mdo"
    param_file = "./my_param.mdo"
    # Output solution file
    soln_file = "./my_soln.mdo"
    sol_file = "./my_soln.sol"
    job_id = "" # Change to the jobID you received after submit the task
    val = 0.0
    status = "Submitted"
    MDO_INFINITY = MdoModel.get_infinity()

    # Step 1. Create a model and change the parameters.
    model = MdoModel()

    try:        
        # Step 2. Read the serialized model and parameters -- this is 
        #         required while populating the optimization result. 
        model.read_task(model_file, True, False, False)
        model.read_task(param_file, False, True, False)        

        # Step 3. Input parameters related to the remote computing server.
        model.set_str_param("Remote/Token", "xxxxxxxxxtokenxxxxxxxx") # Change to your token
        model.set_str_param("Remote/Server", "127.0.0.1") # Change to your server IP
        model.set_str_param("Remote/File/Model", model_file)
        model.set_str_param("Remote/File/Param", param_file)
        model.set_str_param("Remote/File/Soln", soln_file)

        # Step 4. Check the solution status periodically, and             
        #         download the solution upon availability.
        while status == 'Submitted' or status == 'Solving': 
            status, model_status, result, has_soln = model.retrieve_task(job_id)
         
            # Sleep for 10 seconds.
            time.sleep(10)

        model_status_details = model.explain_status(model_status)
        result_details = model.explain_result(result)

        print(" - Job status             : {}".format(status))
        print(" - Model status           : {0} ({1})".format(model_status_details, model_status))
        print(" - Optimization status    : {0} ({1})".format(result_details, result))
        print(" - Solution availability  : {0}".format("available" if has_soln else "not available"))

        if has_soln:
            print("\nPopulating solution.")

            model.read_task(soln_file, False, False, True)
            model.display_results()
            model.write_soln(sol_file)
            
            print(" - Primal objective value : {}".format(model.get_real_attr("PrimalObjVal")))
            print(" - Dual objective value   : {}".format(model.get_real_attr("DualObjVal")))
            print(" - Solution time          : {} sec.".format(model.get_real_attr("SolutionTime")))

    except MdoError as e:
        print("Received Mindopt exception.")
        print(" - Code          : {}".format(e.code))
        print(" - Reason        : {}".format(e.message))
    except Exception as e:
        print("Received exception.")
        print(" - Reason        : {}".format(e))
    finally:
        # Step 5. Free the model.
        model.free_mdl()

6.8. Modeling with the Client Program

Modeling in the client works exactly as with the standalone solver; see 建模与优化 (Modeling and Optimization) for details on building optimization models. On 阿里云天池 MindOpt 优化求解器 (the Alibaba Cloud Tianchi MindOpt Optimization Solver platform) we provide a free remote computing service: after registering, users only need a simple online installation to try it out, and the platform also offers more tutorials and sample programs on using the remote computing service.

6.9. Server-Side Web Page Operations

6.9.1. Access and Login

Enter the address in your browser, for example:

Local installation: 127.0.0.1
Remote server installation: xx.xx.xx.xx

If the installation is correct, the following page is displayed:

../_images/web-login.jpg

Enter the account name and password:

Default account: admin
Default password: admin

After logging in, the overview welcome page is displayed:

../_images/web-portal.jpg

6.9.2. Workbench

Folders and Token usage

Folders help users organize problems from different sources. When problems are submitted through the client SDK, jobs submitted with the same Token are recorded in the same folder, which makes it easier to view, manage, and download the data later.

How to use the Token, taking Python as an example:

model.set_str_param("Remote/Token", "xxxxxxxxxtokenxxxxxxxx")

You can organize where jobs are placed by creating new folders. Selecting a folder node in the directory tree creates a subfolder under that directory.

../_images/web-folderToken.jpg

Other features

More ways of submitting jobs will be added to this module in the future; stay tuned.

6.9.3. Task List

All jobs received by the computing server are shown here, and stored data can be cleaned up as needed when it takes up disk space. The task list provides both a full-list view and a folder view. For each job, the user can view its log, view data statistics, and download the model, result, and log data individually. In the folder view, the user can see statistics over all jobs in a folder, or package and download the whole folder.

../_images/web-tasklist.jpg

6.9.4. Download List

This list records the jobs produced by the "Create download task" operation in the task list, shown below.

../_images/web-create-download.jpg

对于数据量大的下载,我们建议您采用”创建下载任务“的方式来下载数据,这样可以在后台打包数据,打包结束之后您可以在下载列表页进行快速下载。

../_images/web-download.jpg