Data warehouse FAQ
How to modify the content in a dataset?
Datasets do not currently support deleting individual files from a specific version, but they do support partial modification. The supported operations are:
- Use the "Upload data to the current directory" function to add new files to the current version
- Use the "Upload data to the current directory" function to upload a file with the same name and overwrite the existing file's content

If you want to modify a dataset from within a workspace, save the modified content to the container's working directory. Then, after the container is closed and its data has been synchronized, copy the specified directory into the dataset, as sketched below.
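The in-container part of this workflow can be scripted. The following is a minimal sketch; the mount points /input0 (the bound dataset, read-only) and /output (the synchronized working directory) are placeholder assumptions, not guaranteed paths, so substitute whatever your container actually uses.

```python
# Minimal sketch of modifying dataset content from inside a container.
# /input0 and /output are assumed, illustrative mount points.
import shutil
from pathlib import Path

dataset = Path("/input0")   # hypothetical read-only dataset mount
workdir = Path("/output")   # hypothetical synchronized working directory

# Copy the part of the dataset you want to change into the working
# directory, then modify it there.
shutil.copytree(dataset / "labels", workdir / "labels")
(workdir / "labels" / "note.txt").write_text("updated annotations\n")

# After the container closes and the working directory has been
# synchronized, copy this directory into the dataset from the console.
```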
Upload succeeds immediately, but no new files appear on the page
In some cases HyperAI assumes that certain files already exist (even if they were not uploaded successfully). As a result, the upload dialog closes immediately after you click upload and reports that the file was uploaded successfully.
If this happens, open the "Upload a new version" or "Upload to current directory" dialog you just used, click "Clear upload cache" in the top-right corner, and then retry the upload.
The top-level directory is missing after automatic decompression
There are two ways to compress a folder:
- Compress the folder itself directly
- Select all the files inside the folder and package them together

With the first method the archive contains an extra directory layer, so a top-level directory appears after decompression. With the second method the archive contains a flat set of files, and no extra directory appears after decompression. The two layouts are reproduced in the sketch below.
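The difference between the two layouts can be demonstrated with Python's standard zipfile module; `train` is an example directory and the function names are illustrative:

```python
# Two ways to package a folder: with and without a top-level directory.
import os
import zipfile

def zip_with_top_dir(folder, archive):
    """Method 1: compress the folder itself -> entries like train/a.txt."""
    with zipfile.ZipFile(archive, "w") as zf:
        for root, _, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, arcname=path)  # keeps the "train/" prefix

def zip_flat(folder, archive):
    """Method 2: package the files inside the folder -> entries like a.txt."""
    with zipfile.ZipFile(archive, "w") as zf:
        for root, _, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, arcname=os.path.relpath(path, folder))

zip_with_top_dir("train", "train.zip")   # decompresses into train/...
zip_flat("train", "train_flat.zip")      # decompresses into flat files
```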
Most developers use the first method to compress folders without realizing that the decompressed result carries this extra directory layer. For convenience in the common case, when the archive's file name matches the name of its top-level directory, we automatically remove that directory layer. For example:
Suppose I create a directory `train` and compress it, producing an archive named `train.zip` by default. After the upload, HyperAI automatically removes the top-level `train` directory and keeps the files under it. If you want to keep the `train` directory, rename `train.zip` to something like `train_.zip` before uploading. After the upload, HyperAI finds that the archive name (`train_`) differs from the top-level directory name (`train`) and keeps that directory layer.
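You can check this rule locally before uploading. The sketch below mimics the rule as described above; it is illustrative and not HyperAI's actual implementation:

```python
# Predict whether the top-level directory of an archive would be
# stripped under the name-matching rule described above.
import os
import zipfile

def top_dir_will_be_stripped(archive_path):
    stem = os.path.splitext(os.path.basename(archive_path))[0]  # "train"
    with zipfile.ZipFile(archive_path) as zf:
        tops = {name.split("/", 1)[0] for name in zf.namelist()}
    # Stripped only when every entry sits under one directory whose
    # name matches the archive's file name.
    return len(tops) == 1 and tops == {stem}

print(top_dir_will_be_stripped("train.zip"))   # True  -> "train/" removed
print(top_dir_will_be_stripped("train_.zip"))  # False -> "train/" kept
```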
An upload failure is displayed after automatic decompression
If "Upload failed. Please confirm the format of the data package" is displayed after uploading an archive, HyperAI was completely unable to decompress the file. We suggest repackaging it in another format before uploading, for example as sketched below.
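Rebuilding the archive with Python's shutil is one way to replace an exotic packer's output with a standard format. Whether formats other than zip (such as tar.gz) are accepted by the upload dialog is an assumption to verify; `data` is an illustrative directory:

```python
# Rebuild the archive in a standard format before re-uploading.
import shutil

shutil.make_archive("data_repacked", "zip", root_dir="data")    # data_repacked.zip
# If tar.gz uploads are accepted (an assumption, verify first):
shutil.make_archive("data_repacked", "gztar", root_dir="data")  # data_repacked.tar.gz
```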
After automatic decompression, the actual number of files decreased
Several issues introduced when packaging an archive can cause decompression to fail completely or partially:
- Incompatible character encodings, such as the default encoding of Chinese file names on Windows
- Partial corruption of the archive caused by repeated transfers
- An incompatible archive format; see [Uploading large zip archives on macOS](#uploading-large-zip-archives-on-macos)
HyperAI tries several decompression tools on each archive and makes a best effort to recover its full contents, but even after multiple attempts some damaged data may still be lost. If the decompressed result does not match what you get when decompressing locally, try repackaging the locally decompressed data, or split it into several smaller archives and upload them separately, as sketched below.
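A sketch of the "partial packaging, multiple uploads" approach: package each top-level subdirectory as its own archive, so a damaged archive affects only part of the data. The `data` directory name is illustrative:

```python
# Package each top-level subdirectory as a separate archive.
import shutil
from pathlib import Path

data = Path("data")  # illustrative dataset root
for sub in sorted(p for p in data.iterdir() if p.is_dir()):
    # Creates data_<name>.zip containing the single directory <name>.
    shutil.make_archive(f"data_{sub.name}", "zip",
                        root_dir=data, base_dir=sub.name)

# Upload each archive separately, then compare the file counts shown
# on the page against the local counts to locate any damaged part.
```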
Uploading large zip archives on macOS
The original zip standard supports archives of at most 4 GB containing at most 65,535 files. The newer zip64 standard lifts these limits, supporting larger archives and more files. However, the default compression tool in macOS Sierra and later does not apply this standard to content beyond 4 GB, so zip archives larger than 4 GB created with it cannot be successfully decompressed by HyperAI after upload.

macOS users should therefore use Keka or another compression tool that supports the zip64 standard.
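If you prefer a script over a GUI tool, Python's standard zipfile module is one alternative that enables the zip64 extensions automatically (allowZip64 defaults to True), so archives beyond 4 GB stay readable; `dataset` is an illustrative directory name:

```python
# Build the archive with zip64 support enabled (the default).
import os
import zipfile

with zipfile.ZipFile("dataset.zip", "w",
                     compression=zipfile.ZIP_DEFLATED,
                     allowZip64=True) as zf:
    for root, _, files in os.walk("dataset"):
        for name in files:
            path = os.path.join(root, name)
            zf.write(path, arcname=path)
```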