diff --git a/official/nlp/transformer/infer/README.md b/official/nlp/transformer/infer/README.md
deleted file mode 100644
index 71c1f1078a4de5ae10b56d6750e9038574932212..0000000000000000000000000000000000000000
--- a/official/nlp/transformer/infer/README.md
+++ /dev/null
@@ -1,434 +0,0 @@
-# Transformer MindX Inference and mxBase Inference
-
-- [Script Description](#script-description)
-    - [Scripts and Sample Code](#scripts-and-sample-code)
-    - [Preparing Inference Data](#preparing-inference-data)
-    - [Model Conversion](#model-conversion)
-    - [mxBase Inference](#mxbase-inference)
-    - [MindX SDK Inference](#mindx-sdk-inference)
-
-## Script Description
-
-### Scripts and Sample Code
-
-```text
-├── infer                                  // inference code for the MindX high-performance pre-trained model (new)
-    ├── convert                            // scripts for converting to the OM model (AIPP)
-       ├── air_to_om.sh
-    ├── data                               // model files, model input datasets, and model configuration files
-       ├── config                          // configuration files
-           ├── transformer.pipeline
-       ├── data                            // datasets required for inference
-           ├── 00_source_eos_ids
-           ├── 01_source_eos_mask          // preprocessed dataset
-           ├── vocab.bpe.32000             // dataset used for accuracy calculation
-           ├── newstest2014.tok.de         // dataset used for accuracy calculation
-           ├── test.all                    // raw dataset
-           ├── newstest2014-l128-mindrecord
-           ├── newstest2014-l128-mindrecord.db
-      ├── model                            // AIR and OM model files
-           ├── transformer.air
-           ├── transformer.om
-   ├── mxbase                              // mxBase inference
-      ├── src
-           ├── transformer.cpp
-           ├── Transformer.h
-           ├── main.cpp
-      ├── build.sh
-      ├── CMakeLists.txt
-      ├── post_process.py
-   ├── sdk                                 // SDK inference
-      ├── main.py
-   ├── docker_start_infer.sh               // script for starting the container
-   ├── multi-bleu.perl                     // script for computing BLEU accuracy
-```
-
-### Preparing Inference Data
-
-Prepare the directories and data required for model conversion and model inference.
-
-1. Process the dataset.
-
-- Enter the container and run the commands below.
-
-To start the container, go to the Transformer/infer directory and run the following command.
-
-```Shell
-bash docker_start_infer.sh  docker_image  tag  model_dir
-```
-
-**Table 2**  Parameter description
-
-  | Parameter    | Description |
-  | ----------- | ----------- |
-  | docker_image | Name of the inference image; set it according to your environment. |
-  | tag          | Image tag; set it according to your environment, e.g., 21.0.2. |
-  | model_dir    | Path of the inference code. |
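-
-For example, with a hypothetical image name (the tag and code path follow the examples used elsewhere in this document):
-
-```Shell
-bash docker_start_infer.sh mindspore_transformer 21.0.2 /home/Transformer
-```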
-
-- 鍚姩瀹瑰櫒鏃朵細灏嗘帹鐞嗚姱鐗囧拰鏁版嵁璺緞鎸傝浇鍒板鍣ㄤ腑銆傚彲鏍规嵁闇€瑕侀€氳繃淇敼**docker\_start\_infer.sh**鐨刣evice鏉ユ寚瀹氭寕杞界殑鎺ㄧ悊鑺墖銆�
-
-```Shell
-docker run -it
---device=/dev/davinci0        # modify the mounted NPU device as needed
---device=/dev/davinci_manager
-```
-
->**Note:**  
->The MindX SDK development kit (mxManufacture) is already installed in the base image, at "/usr/local/sdk_home".
-
-2. Download the software packages.
-
-   Click "Download Model Script" and "Download Model" to download the required packages.
-
-3. Upload the model script and model package to any directory on the inference server and extract them (for example, "/home/Transformer").
-
-```shell
-# run on the host environment
-unzip Transformer_for_MindSpore_{version}_code.zip
-cd Transformer_for_MindSpore_{version}_code/infer && dos2unix `find .`
-unzip ../../Transformer_for_MindSpore_{version}_model.zip
-```
-
-4. 鍚姩瀹瑰櫒鍚庯紝杩涘叆鈥淭ransformer鈥滀唬鐮佺洰褰�
-
-鎵ц鍛戒护濡備笅:
-
-```Shell
-bash wmt16_en_de.sh
-```
-
-鍋囪鎮ㄥ凡鑾峰緱涓嬪垪鏂囦欢,灏嗕互涓嬫枃浠剁Щ鍏ュ埌浠g爜鐩綍鈥淭ransformer/infer/data/data/鈥滅洰褰曚笅
-
-```text
-├── wmt16_en_de
-    vocab.bpe.32000
-    newstest2014.tok.bpe.32000.en
-    newstest2014.tok.bpe.32000.de
-    newstest2014.tok.de
-```
-
-Go to the "Transformer/infer/data/data/" directory.
-
-Run the following command:
-
-```Shell
-paste newstest2014.tok.bpe.32000.en newstest2014.tok.bpe.32000.de > test.all
-```
-
-In default_config_large.yaml, change the bucket setting to bucket: [128]:
-
-```text
-# create_data.py
-input_file: ''
-num_splits: 16
-clip_to_max_len: False
-max_seq_length: 128
-bucket: [128]
-```
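-
-If you prefer to make this edit from the command line, a one-liner such as the following should work, assuming bucket appears as a single top-level line in default_config_large.yaml:
-
-```Shell
-sed -i 's/^bucket: .*/bucket: [128]/' default_config_large.yaml
-```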
-
-Go to the "Transformer/" directory.
-
-Run the following command:
-
-```Shell
-python3 create_data.py --input_file ./infer/data/data/test.all --vocab_file ./infer/data/data/vocab.bpe.32000 --output_file ./infer/data/data/newstest2014-l128-mindrecord --num_splits 1 --max_seq_length 128 --clip_to_max_len True
-```
-
-Update the following parameters in default_config_large.yaml:
-
-```text
-#eval_config/cfg edict
-data_file: './infer/data/data/newstest2014-l128-mindrecord'
-...
-#'preprocess / from eval_config'
-result_path: "./infer/data/data"
-```
-
-Then run:
-
-```Shell
-python3 preprocess.py
-```
-
-鎵ц鍚庡湪鈥淭ransformer/infer/data/data鈥滅洰褰曚腑寰楀埌鏂囦欢澶瑰涓�:
-
-```txt
-├── data
-   00_source_eos_ids
-   01_source_eos_mask
-```
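-
-As a quick sanity check, you can load one of the generated .bin files and confirm that it holds 128 int32 token IDs; the file name below assumes the transformer_bs_1_<index>.bin naming convention used by the SDK inference script later in this document:
-
-```python
-import numpy as np
-
-# one preprocessed source-id sample, expected shape: (128,)
-ids = np.fromfile("./infer/data/data/00_source_eos_ids/transformer_bs_1_0.bin", dtype=np.int32)
-print(ids.shape)
-```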
-
-### Model Conversion
-
-All of the following operations must be performed inside the container.
-
-1. Prepare the model file.
-
-- transformer.air
-
-- Place the file in Transformer/infer/data/model.
-
-2. Convert the model.
-
-    Go to the "Transformer/infer/convert" directory to convert the model. For conversion details, see the conversion script and the corresponding AIPP configuration file. Configure the relevant parameters in the **air_to_om.sh** script:
-
-```Shell
-model_path=$1
-output_model_name=$2
-
-atc --model=$model_path \                             # path of the model to convert
-    --framework=1 \                                   # 1 indicates MindSpore
-    --output=$output_model_name \                     # path of the output OM model
-    --input_format=NCHW \                             # input format
-    --soc_version=Ascend310 \                         # chip version targeted by the conversion
-    --op_select_implmode=high_precision \             # operator implementation mode
-    --precision_mode=allow_fp32_to_fp16               # conversion precision mode
-```
-
-杞崲鍛戒护濡備笅:
-
-```Shell
-bash air_to_om.sh  [input_path] [output_path]
-e.g.
-bash air_to_om.sh ../data/model/transformer.air ../data/model/transformer
-```
-
-**Table 3**  Parameter description
-
-| Parameter   | Description |
-| ----------- | ----------- |
-| input_path  | Path of the AIR file. |
-| output_path | Name of the generated OM file; the conversion script appends the .om suffix. |
-
-### mxBase Inference
-
-Make sure you have entered the inference container; for details, see "Preparing the Container Environment".
-
-1. Modify the path for saving inference results in the Transformer.h file as needed.
-
-```c
- private:
-    std::shared_ptr<MxBase::ModelInferenceProcessor> model_;
-    MxBase::ModelDesc modelDesc_ = {};
-    uint32_t deviceId_ = 0;
-    std::string outputDataPath_ = "./result/result.txt";  // path where inference results are saved; modify as needed
-};
-
-#endif
-```
-
-2. In the "infer/mxbase" directory, build the project:
-
-```shell
-bash build.sh
-```
-
-After compilation, the following new files are generated in the mxbase directory:
-
-```text
-├── mxbase
-    ├── build                               // build output
-    ├── result                              // empty folder for storing inference results
-    ├── Transformer                         // executable used to run inference
-```
-
- Run the inference service.
-
-   In the "infer/mxbase" directory, run the inference program:
-
-```shell
-./Transformer [model_path] [input_data_path/] [output_data_path]
-e.g.
-./Transformer ../data/model/transformer.om ../data/data ./result
-```
-
-**Table 4** Parameter description
-
-| Parameter        | Description  |
-| ---------------- | ------------ |
-| model_path       | Path of the model |
-| input_data_path  | Path of the preprocessed data |
-| output_data_path | Path for the inference results |
-
-3. Process the results.
-
-Modify the following parameters:
-
-```python
-path = "./result"                  # folder containing the inference results
-
-filenames = os.listdir(path)
-result = "./results.txt"           # path of the file produced after post-processing
-```
-
-Run the following in the "infer/mxbase" directory:
-
-```shell
-python3 post_process.py
-```
-
-results.txt is generated in the "infer/mxbase" directory.
-
-4. Inference accuracy.
-
-Go to the "Transformer/" directory and run:
-
-```shell
-bash scripts/process_output.sh REF_DATA EVAL_OUTPUT VOCAB_FILE
-e.g.
-bash scripts/process_output.sh ./infer/data/data/newstest2014.tok.de ./infer/mxbase/results.txt ./infer/data/data/vocab.bpe.32000
-```
-
-Go to the "Transformer/infer/" directory and run:
-
-```shell
-perl multi-bleu.perl REF_DATA.forbleu < EVAL_OUTPUT.forbleu
-e.g.
-perl multi-bleu.perl ./data/data/newstest2014.tok.de.forbleu < ./mxbase/results.txt.forbleu
-```
-
-The resulting BLEU score is 27.24.
-
-### MindX SDK Inference
-
-Make sure you have entered the inference container; for details, see "Preparing the Container Environment".
-
-1. Prepare the model inference files.
-
-    (1) Go to the Transformer/infer/data/config directory. In transformer.pipeline, "modelPath" specifies the location of the OM model ("../data/model/transformer.om" by default); update it as needed.
-
-```txt
-    {
-    "transformer": {
-        "stream_config": {
-            "deviceId": "0"
-        },
-        "appsrc0": {
-            "props": {
-                "blocksize": "409600"
-            },
-            "factory": "appsrc",
-            "next": "mxpi_tensorinfer0:0"
-        },
-        "appsrc1": {
-            "props": {
-                "blocksize": "409600"
-            },
-            "factory": "appsrc",
-            "next": "mxpi_tensorinfer0:1"
-        },
-        "mxpi_tensorinfer0": {
-            "props": {
-                "dataSource":"appsrc0,appsrc1",
-                "modelPath": "../data/model/transformer.om"
-            },
-            "factory": "mxpi_tensorinfer",
-            "next": "mxpi_dataserialize0"
-        },
-        "mxpi_dataserialize0": {
-            "props": {
-                "outputDataKeys": "mxpi_tensorinfer0"
-            },
-            "factory": "mxpi_dataserialize",
-            "next": "appsink0"
-        },
-        "appsink0": {
-            "props": {
-                "blocksize": "4096000"
-            },
-            "factory": "appsink"
-        }
-    }
-}
-```
-
-(2) Modify the **pipeline path**, **dataset paths**, and **inference result path** in the main.py file as needed.
-
-```python
-def run():
-    """
-    read pipeline and do infer
-    """
-    # init stream manager
-    stream_manager_api = StreamManagerApi()
-    ret = stream_manager_api.InitManager()
-    if ret != 0:
-        print("Failed to init Stream manager, ret=%s" % str(ret))
-        return
-
-    # create streams by pipeline config file
-    with open("../data/config/transformer.pipeline", 'rb') as f:                       # pipeline path
-        pipelineStr = f.read()
-    ret = stream_manager_api.CreateMultipleStreams(pipelineStr)
-
-    if ret != 0:
-        print("Failed to create Stream, ret=%s" % str(ret))
-        return
-    stream_name = b'transformer'
-    predictions = []
-    path = '../data/data/00_source_eos_ids'                                           # dataset path (source ids)
-    path1 = '../data/data/01_source_eos_mask'                                         # dataset path (source mask)
-    files = os.listdir(path)
-    for i in range(len(files)):
-        full_file_path = os.path.join(path, "transformer_bs_1_" + str(i) + ".bin")
-        full_file_path1 = os.path.join(path1, "transformer_bs_1_" + str(i) + ".bin")
-        source_ids = np.fromfile(full_file_path, dtype=np.int32)
-        source_mask = np.fromfile(full_file_path1, dtype=np.int32)
-        source_ids = np.expand_dims(source_ids, 0)
-        source_mask = np.expand_dims(source_mask, 0)
-        print(source_ids)
-        print(source_mask)
-        if not send_source_data(0, source_ids, stream_name, stream_manager_api):
-            return
-        if not send_source_data(1, source_mask, stream_name, stream_manager_api):
-            return
-        # Obtain the inference result by specifying streamName and uniqueId.
-        key_vec = StringVector()
-        key_vec.push_back(b'mxpi_tensorinfer0')
-        infer_result = stream_manager_api.GetProtobuf(stream_name, 0, key_vec)
-        if infer_result.size() == 0:
-            print("inferResult is null")
-            return
-        if infer_result[0].errorCode != 0:
-            print("GetProtobuf error. errorCode=%d" % (infer_result[0].errorCode))
-            return
-        result = MxpiDataType.MxpiTensorPackageList()
-        result.ParseFromString(infer_result[0].messageBuf)
-        res = np.frombuffer(result.tensorPackageVec[0].tensorVec[0].dataStr, dtype=np.int32)
-        print(res)
-        predictions.append(res.reshape(1, 1, 81))
-    # decode and write to file
-    f = open('./results', 'w')                                                        # inference result path
-    for batch_out in predictions:
-        token_ids = [str(x) for x in batch_out[0][0].tolist()]
-        f.write(" ".join(token_ids) + "\n")
-    f.close()
-    # destroy streams
-    stream_manager_api.DestroyAllStreams()
-```
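-
-The script above calls a send_source_data helper that is not shown in the excerpt. The following is a minimal sketch of what such a helper usually looks like in MindX SDK sample code; treat it as an illustrative assumption rather than the exact implementation shipped with this model:
-
-```python
-import MxpiDataType_pb2 as MxpiDataType
-from StreamManagerApi import InProtobufVector, MxProtobufIn
-
-
-def send_source_data(appsrc_id, tensor, stream_name, stream_manager):
-    """Pack a numpy tensor into an MxpiTensorPackageList and send it to the given appsrc plugin."""
-    tensor_package_list = MxpiDataType.MxpiTensorPackageList()
-    tensor_package = tensor_package_list.tensorPackageVec.add()
-    tensor_vec = tensor_package.tensorVec.add()
-    tensor_vec.deviceId = 0
-    tensor_vec.memType = 0
-    for dim in tensor.shape:
-        tensor_vec.tensorShape.append(dim)
-    array_bytes = tensor.tobytes()
-    tensor_vec.dataStr = array_bytes
-    tensor_vec.tensorDataSize = len(array_bytes)
-
-    protobuf = MxProtobufIn()
-    protobuf.key = "appsrc{}".format(appsrc_id).encode('utf-8')
-    protobuf.type = b'MxTools.MxpiTensorPackageList'
-    protobuf.protobuf = tensor_package_list.SerializeToString()
-    protobuf_vec = InProtobufVector()
-    protobuf_vec.push_back(protobuf)
-
-    # A return value below 0 indicates that sending the tensor failed.
-    ret = stream_manager.SendProtobuf(stream_name, appsrc_id, protobuf_vec)
-    if ret < 0:
-        print("Failed to send data to stream, ret=%s" % str(ret))
-        return False
-    return True
-```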
-
-2. Run the inference service: go to the "Transformer/infer/sdk" directory and run:
-
-```Shell
-python3 main.py
-```
-
-3. Compute the inference accuracy.
-
-Go to the "Transformer/" directory and run:
-
-```shell
-bash scripts/process_output.sh REF_DATA EVAL_OUTPUT VOCAB_FILE
-e.g.
-bash scripts/process_output.sh ./infer/data/data/newstest2014.tok.de ./infer/sdk/results ./infer/data/data/vocab.bpe.32000
-```
-
-Go to the "Transformer/infer/" directory and run:
-
-```shell
-perl multi-bleu.perl REF_DATA.forbleu < EVAL_OUTPUT.forbleu
-e.g.
-perl multi-bleu.perl ./data/data/newstest2014.tok.de.forbleu < ./sdk/results.forbleu
-```
-
-The resulting BLEU score is 27.24.
\ No newline at end of file
diff --git a/official/nlp/transformer/scripts/docker_start.sh b/official/nlp/transformer/scripts/docker_start.sh
new file mode 100644
index 0000000000000000000000000000000000000000..77ed32a46ddf603e17ae577f82eff0ccf0b0ad0c
--- /dev/null
+++ b/official/nlp/transformer/scripts/docker_start.sh
@@ -0,0 +1,40 @@
+#!/bin/bash
+# Copyright 2022 Huawei Technologies Co., Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ============================================================================
+
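+# Usage: bash docker_start.sh <docker_image> <data_dir> <model_dir>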
+docker_image=$1
+data_dir=$2
+model_dir=$3
+
+docker run -it --ipc=host \
+               --device=/dev/davinci0 \
+               --device=/dev/davinci1 \
+               --device=/dev/davinci2 \
+               --device=/dev/davinci3 \
+               --device=/dev/davinci4 \
+               --device=/dev/davinci5 \
+               --device=/dev/davinci6 \
+               --device=/dev/davinci7 \
+               --device=/dev/davinci_manager \
+               --device=/dev/devmm_svm \
+               --device=/dev/hisi_hdc \
+               --privileged \
+               -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
+               -v /usr/local/Ascend/add-ons/:/usr/local/Ascend/add-ons \
+               -v ${data_dir}:${data_dir} \
+               -v ${model_dir}:${model_dir} \
+               -v /var/log/npu/conf/slog/slog.conf:/var/log/npu/conf/slog/slog.conf \
+               -v /root/ascend/log:/root/ascend/log ${docker_image} \
+               /bin/bash