I. Saving/loading a PyTorch model

There are two ways to save/load a PyTorch model: 1) the file stores both the model structure and the weight parameters; 2) the file stores only the model weights.
1. File containing both the model structure and the weights

1) Saving the PyTorch model

import torch
torch.save(model, "save.pt")  # model is the trained nn.Module instance
2) Loading the PyTorch model

import torch
model = torch.load("save.pt")

Note: even though the structure is stored in the file, the model's class definition must still be importable when torch.load is called.
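Putting the full-model workflow together, here is a minimal round-trip sketch. The small Sequential network is a hypothetical stand-in for the real model; the `weights_only=False` argument is an assumption for recent PyTorch versions, where `torch.load` defaults to refusing pickled full models:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in network; any nn.Module can be saved the same way.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
torch.save(model, "full.pt")  # the file stores structure + weights

# weights_only=False is needed for full-model files in recent PyTorch versions
loaded = torch.load("full.pt", weights_only=False)
loaded.eval()

x = torch.randn(1, 4)
same = torch.allclose(model(x), loaded(x))  # loaded model reproduces the outputs
```

The convenience of not re-instantiating the class is paid for at load time: unpickling the full model requires the exact class to be importable in the loading environment.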
2. File containing only the model weights

1) Saving the PyTorch model

import torch
torch.save(model.state_dict(), "save.pt")
2) Loading the PyTorch model
import torch
model.load_state_dict(torch.load("save.pt"))  # model must be instantiated from its class first
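The weights-only workflow can be sketched end to end as follows (using a hypothetical stand-in network; the key point is that the architecture is re-created in code before the weights are loaded):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in network; any nn.Module works the same way.
model = nn.Sequential(nn.Linear(3, 6), nn.ReLU(), nn.Linear(6, 2))
torch.save(model.state_dict(), "save.pt")     # save weights only

# Re-create the same architecture, then load the weights into it.
restored = nn.Sequential(nn.Linear(3, 6), nn.ReLU(), nn.Linear(6, 2))
restored.load_state_dict(torch.load("save.pt"))
restored.eval()

x = torch.randn(1, 3)
same = torch.allclose(model(x), restored(x))  # restored model reproduces the outputs
```

Because only the state_dict is stored, the file is smaller and more portable, but the class definition must be available at load time.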
II. Converting a PyTorch model to an ONNX model
1. File containing both the model structure and the weights

import torch

torch_model = torch.load("save.pt")  # load the PyTorch model
batch_size = 1  # batch size
input_shape = (3, 244, 244)  # shape of the input data

# set the model to inference mode
torch_model.eval()

x = torch.randn(batch_size, *input_shape)  # generate a dummy input tensor
export_onnx_file = "test.onnx"  # target ONNX file name
torch.onnx.export(torch_model,
                  x,
                  export_onnx_file,
                  opset_version=10,
                  do_constant_folding=True,  # whether to apply constant-folding optimization
                  input_names=["input"],     # input name
                  output_names=["output"],   # output name
                  dynamic_axes={"input": {0: "batch_size"},    # variable batch dimension
                                "output": {0: "batch_size"}})
Note: the dynamic_axes argument makes the batch dimension variable. If you do not need batching, or want a fixed batch size, simply remove dynamic_axes.
2. File containing only the model weights

import torch

# MyModel is a hypothetical name for the model class defined in the researcher-provided .py file
torch_model = MyModel()
torch_model.load_state_dict(torch.load("save.pt"))  # load the saved weights into the instance
batch_size = 1  # batch size
input_shape = (3, 244, 244)  # shape of the input data

# set the model to inference mode
torch_model.eval()

x = torch.randn(batch_size, *input_shape)  # generate a dummy input tensor
export_onnx_file = "test.onnx"  # target ONNX file name
torch.onnx.export(torch_model,
                  x,
                  export_onnx_file,
                  opset_version=10,
                  do_constant_folding=True,  # whether to apply constant-folding optimization
                  input_names=["input"],     # input name
                  output_names=["output"],   # output name
                  dynamic_axes={"input": {0: "batch_size"},    # variable batch dimension
                                "output": {0: "batch_size"}})