
Managing containers with bayes

The bayes command line has the concept of a "working directory", which corresponds to the "output" directory inside a HyperAI container. When creating a container with the command-line tool, you first need to map a local directory, as the "working directory", to a HyperAI container. The steps are as follows:

  1. Switch to the directory containing the code to be executed: cd ~/hyperai-mnist-example
  2. Initialize a new container: bayes gear init mnist-example. The current directory is now mapped to the container mnist-example, and every "execution" created from it will appear under this container.
note

Use the bayes gear ls command to view all of your containers.

The bayes gear init command can initialize the current directory with an existing container name or container ID. If you initialize with a container name that does not exist, a new container is created.

With the preparation complete, the following sections introduce several ways to create executions.

Creating a "Python script execution" via command-line parameters

Running bayes gear run task -h shows a number of example hints on how to create a "Python script execution".

Let's first create a relatively simple version here:

$ bayes gear run task --env=tensorflow-1.15 -- python main.py

Sending upload request to server...
The server has responded
Reading file list, please wait a moment...
95 files in total
Checking the .HyperAIignore file...
Excluded the files and folders ignored in the .HyperAIignore list
Compressing code...
Compressed code completed
Initializing upload in progress...
Uploading compressed file. Total upload size: 49.31 MiB
49.31 MiB / 49.31 MiB [=================================] 100 % 4.21 MiB/s
Removing compressed files that have already been uploaded
Code uploaded successfully
Requesting the server to create a container...
Container created successfully
The container is starting up...
Open the webpage https://hyperai.com/console/username/containers/d1JKTyFQq1W/isqgy7idosrk to view the details of container mnist-example

The part after -- is the command to be executed. If it contains symbols such as &&, it must be protected with quotation marks: bayes gear run task -- 'echo 123 && python main.py'.
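The effect of the quoting can be illustrated with Python's `shlex` module, which follows shell-style word-splitting rules (an illustrative sketch; the bayes CLI itself is not involved):

```python
import shlex

# Quoting turns the compound command into a single argument, so the whole
# string reaches the container instead of being split by the local shell.
args = shlex.split("bayes gear run task -- 'echo 123 && python main.py'")
print(args[-1])  # echo 123 && python main.py
```

Without the quotes, the local shell would interpret `&&` itself and run `python main.py` locally instead of inside the container.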

You can see that bayes uploads the files in the current directory and creates a "Python script" task. Compared with uploading a zip package through the web interface, the experience is much improved.
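The upload step shown in the log above can be sketched roughly as follows. This is an assumed approximation of the behavior (walk the working directory, skip anything matching the patterns in `.HyperAIignore`, compress the rest), not the CLI's actual code:

```python
import fnmatch
import os

def collect_files(root, ignore_patterns):
    """Yield paths under root relative to it, skipping ignored patterns."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            if any(fnmatch.fnmatch(rel, pat) for pat in ignore_patterns):
                continue  # excluded by an .HyperAIignore-style pattern
            yield rel
```

The selected files would then be packed into a zip archive and uploaded, as the console output above shows.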

Next, let's use command-line parameters to create a more complex version:

$ bayes gear run task \
    --resource cpu \
    --env tensorflow-1.15 \
    --data hyperai/eBIQp4yPMtU/1:/input0 \
    --data hyperai/sTggKplxyT6/1:/input1 \
    --data hyperai/bbNaMvDNqO9/1:/input2 \
    --data username/jobs/3s55ypc33ptl/output:/output \
    --message "task message" \
    --open \
    --follow \
    -- sleep 60

Here is an introduction to a few of the available parameters:

  • -e or --env selects the image. Available images can be queried with the bayes gear env command
  • -r or --resource selects the compute resource. Available resources can be queried with the bayes gear resource command
  • -d or --data binds data. Data available for binding can be queried with the bayes gear bindings command
  • -m or --message sets the execution description. Can be left blank
  • -o or --open opens the corresponding web interface in the browser after the container starts running
  • -f or --follow tracks the status of the running container
info

Note that in --data hyperai/eBIQp4yPMtU/1:/input0, hyperai is the reserved name for public datasets. To use your own dataset, replace hyperai with your username; the bayes gear bindings command lists the most recently bindable data. eBIQp4yPMtU is the dataset ID, 1 is the dataset version number, and /input0 means the dataset is bound to input0.
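The structure of a dataset binding string can be shown with a small parser. This is an illustrative sketch following the format described above (owner/dataset-id/version:mount-path), not the CLI's own code, and it covers only the dataset form, not container-output bindings:

```python
def parse_binding(binding):
    """Split a dataset '--data' value like 'hyperai/eBIQp4yPMtU/1:/input0'."""
    source, _, mount = binding.partition(":")
    owner, dataset, version = source.split("/")
    return {"owner": owner, "dataset": dataset, "version": version, "mount": mount}

print(parse_binding("hyperai/eBIQp4yPMtU/1:/input0"))
# {'owner': 'hyperai', 'dataset': 'eBIQp4yPMtU', 'version': '1', 'mount': '/input0'}
```

Replacing the `owner` part with your username is exactly the substitution the note above describes.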

Creating a "Python script execution" with HyperAI.yaml

In addition, after bayes gear init binds the current directory to a container, a HyperAI.yaml file appears in the directory. Its initial content is as follows:

HyperAI.yaml
## For the latest explanation of the "HyperAI configuration file", please refer to https://hyperai.com/docs/cli/config-file/

## Data_bindings
# The data to bind. Both "container output" and "dataset" are supported; at most three can be bound at the same time
#
# A complete data_bindings example is as follows:
#
# data_bindings:
#   - data: hyperai/mnist/1
#     path: /input0
#   - data: hyperai/jobs/jfaqJeLMcPM/output
#     path: /output
#
# data_bindings can also be replaced with bindings, abbreviated as in the following example:
#
# bindings:
#   - hyperai/mnist/1:/input0
#   - hyperai/jobs/jfaqJeLMcPM/output:/output
#
data_bindings: []

## Resource
# Which compute resource the container uses. The supported resource types can be listed with the bayes gear resource command
#
resource: tiny-cpu

## Env
# Which runtime environment is used. The supported environments can be listed with the bayes gear env command
#
env: tensorflow-1.15

## Command
# Only needed when creating a "script execution": the entry command for the task
#
command: ""


## Parameters
# Parameters in key / value form. During container execution they are written to HyperAI_params.json and appended after the command
# An example is as follows:
#
# parameters:
#   input: /input0
#   epochs: 5
#
# During execution, a HyperAI_params.json with the content {"input": "/input0", "epochs": 5} is generated,
# and `--input=/input0 --epochs=5` is appended to the executed command
#
parameters: {}


## For the latest explanation of "HyperAI automatic hyperparameter tuning", please refer to https://hyperai.com/docs/hypertuning/
#
# A complete hyper_tuning example is as follows:
# hyper_tuning:
#   max_job_count: 3
#   hyperparameter_metric: precision
#   goal: MINIMIZE
#   algorithm: Bayesian
#   parameter_specs:
#     - name: regularization
#       type: DOUBLE
#       min_value: 0.001
#       max_value: 10.0
#       scale_type: UNIT_LOG_SCALE
#     - name: latent_factors
#       type: INTEGER
#       min_value: 5
#       max_value: 50
#       scale_type: UNIT_LINEAR_SCALE
#     - name: unobs_weight
#       type: DOUBLE
#       min_value: 0.001
#       max_value: 5.0
#       scale_type: UNIT_LOG_SCALE
#     - name: feature_wt_factor
#       type: DOUBLE
#       min_value: 1
#       max_value: 200
#       scale_type: UNIT_LOG_SCALE
#     - name: level
#       type: DISCRETE
#       discrete_values: [1, 2, 3, 4]
#     - name: category
#       type: CATEGORICAL
#       categorical_values: ["A", "B", "C"]
#
hyper_tuning:

  ## max_job_count
  # The number of trials in one automatic tuning run. Up to 100 trials are supported
  #
  max_job_count: 0

  ## parallel_count
  # The number of parallel trials, limited by the user's maximum parallelism for a single resource type. Usually 1 or 2
  #
  parallel_count: 1

  ## hyperparameter_metric
  # The target variable
  # For reporting the target variable, please refer to the "Report key metrics" section of https://hyperai.com/docs/hypertuning/
  hyperparameter_metric: ""

  ## goal
  # The direction of optimization (MAXIMIZE or MINIMIZE)
  #
  goal: ""

  ## algorithm
  # The algorithm to use. The supported algorithms are as follows:
  # Grid      Only for scenarios with DISCRETE and CATEGORICAL parameters; traverses all parameter combinations via GridSearch
  # Random    For INTEGER and DOUBLE types, randomly selects values between min_value and max_value based on the supported distributions;
  #           for DISCRETE and CATEGORICAL types, behaves similarly to Grid
  # Bayesian  When generating parameters, takes previous "parameter" - "target variable" results into account and draws parameters
  #           from an updated distribution that is expected to give better results
  #
  algorithm: ""

  ## parameter_specs
  # The specification of the input parameters
  # Please refer to the parameter spec definition in the "Automatic tuning configuration instructions" section of https://hyperai.com/docs/hypertuning/
  #
  parameter_specs: []

  ## side_metrics
  # Other reference metrics
  #
  side_metrics: []
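The `parameters` section of the file above says the key / value map is written to HyperAI_params.json and appended to the command as `--key=value` flags. A minimal Python sketch of that documented behavior (illustrative only, not the CLI's actual implementation):

```python
import json

def render_parameters(parameters):
    """Return the JSON payload and the CLI suffix for a parameters map."""
    payload = json.dumps(parameters)
    suffix = " ".join(f"--{key}={value}" for key, value in parameters.items())
    return payload, suffix

payload, suffix = render_parameters({"input": "/input0", "epochs": 5})
print(payload)  # {"input": "/input0", "epochs": 5}
print(suffix)   # --input=/input0 --epochs=5
```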

The hyper_tuning section will not be introduced here for now, but you can see that the other parameters are consistent with those of bayes gear run task. Configuring parameters in HyperAI.yaml avoids repeatedly typing them when using bayes gear run task. For example, with the following configuration:

data_bindings:
  - data: hyperai/mnist/1
    path: /input0
resource: t4
env: tensorflow-1.15
command: "python train.py -i /input0 -o ./model -e 2 -m model.h5 -l ./tf_dir"
info

Note that in data: hyperai/mnist/1, hyperai is the reserved name for public datasets. To use your own dataset, replace hyperai with your username. mnist is the name of the dataset and 1 is the dataset version number; path: /input0 binds the dataset to input0.

Simply running bayes gear run task then executes the task in the tensorflow-1.15 environment on t4 compute, with dataset hyperai/mnist/1 bound to /input0 and python train.py -i /input0 -o ./model -e 2 -m model.h5 -l ./tf_dir as the entry command.
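As a hedged illustration of the equivalence between the two styles, the following sketch maps a configuration like the one above onto the flags of bayes gear run task. The mapping logic is assumed for illustration (flag names are taken from the parameter list earlier on this page), not taken from the CLI source:

```python
def config_to_flags(config):
    """Build the flag list equivalent to a HyperAI.yaml configuration."""
    flags = []
    for binding in config.get("data_bindings", []):
        # data_bindings entries collapse to '--data data:path' flags
        flags += ["--data", f"{binding['data']}:{binding['path']}"]
    if config.get("resource"):
        flags += ["--resource", config["resource"]]
    if config.get("env"):
        flags += ["--env", config["env"]]
    if config.get("command"):
        # the entry command goes after the '--' separator
        flags += ["--", config["command"]]
    return flags

config = {
    "data_bindings": [{"data": "hyperai/mnist/1", "path": "/input0"}],
    "resource": "t4",
    "env": "tensorflow-1.15",
    "command": "python train.py -i /input0 -o ./model -e 2 -m model.h5 -l ./tf_dir",
}
print(config_to_flags(config))
```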

note

For more information on how to write configuration files, see HyperAI configuration file

Creating a "Jupyter workspace"

Similar to creating a "Python script" execution, creating a Jupyter workspace through the command line uploads the files in the current directory into the container's "output" by default.

  1. git clone https://github.com/practicalAI/practicalAI to download the practicalAI project
  2. cd practicalAI && bayes gear init practicalAI to initialize the container
  3. bayes gear run notebook to create the Jupyter workspace
$ bayes gear run notebook -o -f

Sending upload request to server...
The server has responded
Reading file list, please wait a moment...
Excluded the files and folders ignored in .HyperAIignore...
99 files in total
Compressing code...
Compressed code completed
Initializing upload in progress...
Uploading compressed file. Total upload size: 9.28 MiB
9.28 MiB / 9.28 MiB [=================================] 100 % 5.14 MiB/s
Cleaning work in progress
Code uploaded successfully
Requesting the server to create a container...
Container created successfully
Open the webpage https://hyperai.com/console/username/containers/CUEtgUJHidb/ri4nft12g18c to view the details of container practicalAI
⠴ In preparation
Container in operation
Jumping to browser...
Successfully opened browser
note

"Jupyter workspace" creation is similar to "Python script" creation: it can be done through command-line parameters or through a HyperAI.yaml file.

Re-running a container execution

  • Use the bayes gear status command to view all executions under the current container
  • Use the bayes gear restart command with a completed execution ID to run that execution again with the same parameters
$ bayes gear restart 52yaekv8nf91 -o -f

Container continues to execute...
Open the webpage https://hyperai.com/console/username/jobs/6q848lathbdp to view the details of container practicalAI
⠹ In preparation
Container in operation
Jumping to browser...
Successfully opened browser

You can also override some of the parameters to run the execution again with modifications.

note

The options of the restart command are consistent with those of the run command

$ bayes gear restart 52yaekv8nf91 \
--resource cpu \
--env tensorflow-1.15 \
--data hyperai/eBIQp4yPMtU/1:/input0 \
--data hyperai/sTggKplxyT6/1:/input1 \
--data hyperai/bbNaMvDNqO9/1:/input2 \
--data username/jobs/3s55ypc33ptl/output:/output \
--message "task message" \
--open \
--follow

Container continues to execute...
Open the webpage https://hyperai.com/console/username/jobs/6q848lathbdp to view the details of container practicalAI
⠹ In preparation
Container in operation
Jumping to browser...
Successfully opened browser
info

Note that in --data hyperai/eBIQp4yPMtU/1:/input0, hyperai is the reserved name for public datasets. To use your own dataset, replace hyperai with your username. eBIQp4yPMtU is the dataset ID and 1 is the dataset version number; :/input0 binds the dataset to input0.

Stopping a container execution

Use the bayes gear stop command with the ID of a running execution to stop the container.

$ bayes gear stop 52yaekv8nf91 -o -f

Synchronizing data and closing the container
Open the webpage https://hyperai.com/console/username/jobs/52yaekv8nf91 to view the details of container practicalAI
Jumping to browser...
Successfully opened browser
⠦ Closing
Container closed

Here is an introduction to a few of the available parameters:

  • -o or --open opens the corresponding web interface in the browser after the container starts closing
  • -f or --follow continuously tracks the container's status until it is completely closed

Downloading container output with the command-line tool

1. Download the container output directly by execution ID

Use the bayes gear download command with an execution ID to download the container's current output.

$ bayes gear download 5mx0ki1s5ej8 --target ~/Downloads/data-download-location --unarchive

Downloading, please wait a moment
Download completed. The file is saved in ~/Downloads/data-download-location/cli-29.output.zip
Decompressing, please wait a moment
Decompression completed. The file is saved in ~/Downloads/data-download-location
Compressed package ~/Downloads/data-download-location/cli-29.output.zip deleted

Here is an introduction to a few of the available parameters:

  • -f or --from specifies a sub-path to download. If omitted, the entire output is downloaded
  • -t or --target specifies the local save location. If omitted, the current path is used
  • -u or --unarchive automatically decompresses the archive and deletes the source file. If omitted, the archive is kept and not decompressed
note

When using the --unarchive parameter, the folder selected by --target must be empty
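The --unarchive behavior described above (extract the archive, delete the source file, require an empty target folder) can be sketched like this. This is an illustrative approximation, not the CLI's code:

```python
import os
import pathlib
import zipfile

def unarchive(zip_path, target):
    """Extract zip_path into an empty target directory and delete the archive."""
    target = pathlib.Path(target)
    target.mkdir(parents=True, exist_ok=True)
    if any(target.iterdir()):
        # mirrors the documented requirement that --target be empty
        raise RuntimeError(f"--unarchive requires an empty target folder: {target}")
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(target)
    os.remove(zip_path)  # drop the compressed source file, as --unarchive does
```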

2. Create a "Python script execution" and download the output after the container finishes

By combining the bayes gear run task and bayes gear download commands, the output is downloaded after the "Python script execution" completes.

$ bayes gear run task -f && bayes gear download -t ~/Downloads/data-download-location -u
Sending upload request to server...
The server has responded
Reading file list, please wait a moment...
Excluded the files and folders ignored in .HyperAIignore...
18 files in total
Compressing code...
Compressed code completed
Initializing upload in progress...
Uploading compressed file. Total upload size: 8.88 MiB
8.88 MiB / 8.88 MiB [==========================================================] 100 % 1.90 MiB/s
Cleaning work in progress
Code uploaded successfully
Requesting the server to create a container...
Container created successfully
Open the webpage https://hyperai.com/console/username/jobs/mktv6fo5drjy to view the details of container cli
Container in operation
Container log:
No logs available at the moment
⠦ Running
Container execution completed
Downloading, please wait a moment
Download completed. The file is saved in ~/Downloads/data-download-location/cli-49.output.zip
Decompressing, please wait a moment
Decompression completed. The file is saved in ~/Downloads/data-download-location
Compressed package ~/Downloads/data-download-location/cli-49.output.zip deleted

Opening a container's web interface with the command-line tool

You can open the web interface directly from the command line using the following command:

$ bayes gear open 6q848lathbdp
Opening container https://hyperai.com/console/username/jobs/6q848lathbdp
Jumping to browser...

It can also be opened by the container's name:

$ bayes gear open practicalAI
Opening container https://hyperai.com/console/username/jobs/6q848lathbdp
Jumping to browser...

Alternatively, append the -o parameter to the command that runs the container; the command-line tool will open the corresponding web interface immediately after the upload completes:

$ bayes gear run notebook -o -f

Sending upload request to server...
The server has responded
Reading file list, please wait a moment...
Excluded the files and folders ignored in .HyperAIignore...
102 files in total
Compressing code...
Compressed code completed
Initializing upload in progress...
Uploading compressed file. Total upload size: 877.42 KiB
877.42 KiB / 877.42 KiB [==========================================================] 100 % 969.85 KiB/s
Cleaning work in progress
Code uploaded successfully
Requesting the server to create a container...
Container created successfully
Open the webpage https://hyperai.com/console/username/jobs/1ekrvwi6uyac to view the details of container test111
⠴ In preparation
Container in operation
Jumping to browser...
Successfully opened browser
note

The run, restart, and stop commands of bayes gear all accept the -o option; after the container reaches the target state, the command line opens the corresponding web interface in the browser

Tracking container logs and status with the command-line tool

1. Log Tracking

The bayes gear logs command shows the logs of a running container; adding the -f or --follow parameter continuously tracks the container's log output

$ bayes gear logs 1ekrvwi6uyac -f

[I 14:41:01.149 LabApp] Writing notebook server cookie secret to /root/.local/share/jupyter/runtime/notebook_cookie_secret
[W 14:41:01.433 LabApp] All authentication is disabled. Anyone who can connect to this server will be able to run code.
[I 14:41:01.749 LabApp] JupyterLab extension loaded from /usr/local/lib/python3.6/site-packages/jupyterlab
[I 14:41:01.750 LabApp] JupyterLab application directory is /usr/local/share/jupyter/lab
[I 14:41:01.758 LabApp] Serving notebooks from local directory: /hyperai
[I 14:41:01.758 LabApp] Jupyter Notebook 6.1.4 is running at:
[I 14:41:01.758 LabApp] http://username-1ekrvwi6uyac-main:8888/jobs/username/jobs/1ekrvwi6uyac/
[I 14:41:01.758 LabApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
...
...
...

2. Status tracking

The bayes gear subcommands run, restart, and stop all support the -f or --follow parameter to track container status.

caution

Status tracking only takes effect for "Python script" and "Jupyter workspace" tasks; it does not work for "automatic hyperparameter tuning" tasks.

For the run and restart commands:

  • For "Python script" tasks, the entire task is tracked until execution completes
  • For "Jupyter workspace" tasks, the task is tracked until the Jupyter workspace finishes starting
  • For "automatic hyperparameter tuning" tasks, the --follow parameter has no effect

For the stop command:

  • For "Python script" tasks, the task is tracked until the container is completely closed
  • For "Jupyter workspace" tasks, the task is tracked until the container is completely closed
  • For "automatic hyperparameter tuning" tasks, the --follow parameter has no effect
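Conceptually, --follow style status tracking amounts to polling until the container reaches the target state. A minimal sketch, assuming a `get_status` callable that queries the server (the real CLI internals may differ):

```python
import time

def follow(get_status, target, interval=1.0, timeout=60.0):
    """Poll get_status() until it returns target, or raise on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status == target:
            return status
        time.sleep(interval)  # wait before polling the server again
    raise TimeoutError(f"container did not reach state {target!r}")
```

For stop, the target state would be the fully closed container; for run on a "Jupyter workspace" task, it would be the workspace having finished starting, matching the behaviors listed above.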