Caper

Latest version: v2.3.2


0.8.0

Parameters

Deprecated parameters:
- `--use-netrc`: Autouri uses `~/.netrc` by default.
- `--http-user` and `--http-password`: Use `~/.netrc` to access private URLs.

Change of parameters:
- `--use-gsutil-over-aws-s3` -> `--use-gsutil-for-s3`: Autouri uses the `gsutil` CLI only for direct transfers between S3 and GCS buckets. Otherwise, it always uses Python libraries such as `google-cloud-storage` and `boto3`.

Added parameters:
- `--debug` and `--verbose`: For better logging.

New features

Localization and preventing repetitive file transfer
- When the new localization module copies a source file to the destination cache directory, it first compares the md5 hashes of source and destination if a file already exists at the destination. All bucket URIs (`s3://`, `gs://`) and most URLs provide md5 hash information in their headers. If the md5 hashes match, Caper skips the unnecessary file transfer. For local paths, Caper calculates the md5 hash and stores the hash string in a `.md5` file, since md5 calculation is expensive. This happens only when Caper writes to local storage (i.e. when localizing files into a local cache). A `.md5` file is not valid if its modification time (mtime) is older than that of the file itself.
- If the md5 comparison fails, Caper compares file sizes and mtimes instead. If the file sizes match and the destination's mtime is newer, Caper skips the file transfer.

File locking
- Caper uses a stable file-locking mechanism, tested with multiple threads (50 for local paths, 10 for cloud URIs) competing to write to the same file.

Automatic subworkflow zipping
- Fixed bugs in old auto-zipping module.
- Caper can automatically zip subworkflow WDLs imported by the main WDL. The zip file can also be defined manually with the command-line argument `--imports`; Caper skips auto-zipping if `--imports` is defined.
- Enabled for `caper submit` only; `caper run` does not use automatic subworkflow zipping, since all sub-WDLs are assumed to be already localized for `caper run`.

Womtool validation
- If `--imports` is defined or there is an auto-zipped subworkflow zip, Caper creates a temporary directory, puts the main WDL there, and unpacks the zip file into it. Then Caper runs Womtool to validate those WDLs.
- You can still skip Womtool validation with `--ignore-womtool`.
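The staging step can be sketched as below. This is a hypothetical helper, not Caper's code; it only stages the files and builds the Womtool command, leaving execution (e.g. via `subprocess`) and cleanup to the caller:

```python
import os
import shutil
import tempfile
import zipfile


def build_womtool_validation(main_wdl, womtool_jar, imports_zip=None):
    """Stage WDLs in a temporary directory and build a Womtool command.

    Copies the main WDL into a temp dir, unpacks the imports zip (if any)
    next to it, and returns (temp_dir, command).
    """
    tmp_dir = tempfile.mkdtemp(prefix='womtool_validate_')
    staged_wdl = os.path.join(tmp_dir, os.path.basename(main_wdl))
    shutil.copy(main_wdl, staged_wdl)
    if imports_zip:
        with zipfile.ZipFile(imports_zip) as zf:
            zf.extractall(tmp_dir)
    cmd = ['java', '-jar', womtool_jar, 'validate', staged_wdl]
    return tmp_dir, cmd
```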

0.7.0

New features
- `caper init` downloads Cromwell/Womtool JARs and adds them to Caper's default conf file `~/.caper/default.conf` (or whatever is defined with `caper -c`), so that Caper can work completely offline once those JARs are installed.
- Caper used to make a copy of outputs for every re-run workflow (task) on GCP. Added `--gcp-call-caching-dup-strat` to control this behavior. It now defaults to `reference` instead of `copy`. Define `--gcp-call-caching-dup-strat copy` to keep making copies for re-run (call-cached) tasks.
- Caper can soft-link globbed outputs instead of hard-linking them. This is useful on file systems where hard-linking is not allowed (e.g. BeeGFS). Added a flag `--soft-glob-output` for local backends (`local`, `slurm`, `sge` and `pbs`). This flag does not work with docker (`--docker`) or docker-based backends (`gcp` and `aws`).

Documentation
- Heartbeat file and how to run multiple `caper server` instances on a single machine.
- How to configure Caper for a custom backend.
- Important notes for storage choices on Sherlock cluster.

Bug fixes
- `metadata.json` in the output directory/bucket is now updated correctly both while a workflow is running and after it is done.
- `caper list` sent too many requests to get the labels of all workflows. It now sends a single query to retrieve all workflow information.

0.6.4

Improved job submission for SLURM backend (Sherlock, SCG, ...)
- Fix for the following submission error when the server is busy; Caper now retries `sbatch` up to 3 times:

  `sbatch: error: Batch job submission failed: Socket timed out on send/recv operation`
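A retry loop of this shape can be sketched as follows. This is a generic illustration, not Caper's SLURM backend code; the function name and delay are assumptions:

```python
import subprocess
import time


def submit_with_retry(cmd, max_tries=3, delay_sec=10):
    """Run a submission command, retrying on failure up to max_tries.

    Transient errors from a busy scheduler (e.g. 'Socket timed out on
    send/recv operation' from SLURM) are retried before giving up.
    """
    last_err = None
    for attempt in range(1, max_tries + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        last_err = result.stderr
        if attempt < max_tries:
            time.sleep(delay_sec)
    raise RuntimeError(
        'Submission failed after {} tries: {}'.format(max_tries, last_err))
```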


Added warning for Stanford Sherlock platform (SLURM backend)
- Do not install Caper, Conda, or any executables on `$OAK` or `$SCRATCH`. Install them on `$HOME` or `$PI_HOME`.

Bug fixes
- Fix for `w['submission']` error.

0.6.3

Added warning for parameter `tmp-dir`

Change in default parameters
- Increased default `java-heap-run` from 2G to 3G

Bug fixes
- Check the presence of the `metadata.json` file for troubleshooting
- Fixed the `submission = w['submission']` error for `caper list`

0.6.2

Bug fixes
- Remove leading/trailing quotes (`"` and `'`) from values when reading from the conf file (e.g. `~/.caper/default.conf`). Users can now use quoted strings in a conf file.
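One plausible implementation of this fix looks like the sketch below (an assumption about the approach, not Caper's actual code); it removes only a matching pair of quotes so a stray quote inside a value survives:

```python
def strip_quotes(value):
    """Remove one pair of matching leading/trailing quotes from a value.

    'foo' or "foo" in a conf file is read as foo. A mismatched or
    internal quote is left untouched.
    """
    value = value.strip()
    if len(value) >= 2 and value[0] == value[-1] and value[0] in ('"', "'"):
        return value[1:-1]
    return value
```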

0.6.1

Minor update for Croo's new feature (task graph)

Bug fixes
- `Permission denied` issue for MySQL shell script for docker.

Updated documentation
- MySQL docker
