4 Comments

Why does my job that publishes to S3 storage keep failing when I run it on a private network?

```
# bacalhau job run batch_job.yaml
Job successfully submitted. Job ID: j-fd557eaf-6561-4419-bd4c-e022114bcaa1

Checking job status... (Enter Ctrl+C to exit at any time, your job will continue running):
	Communicating with the network ................ err ❌ 0.0s

Error submitting job:

Job Results By Node:

To get more details about the run, execute:
	bacalhau job describe j-fd557eaf-6561-4419-bd4c-e022114bcaa1

To get more details about the run executions, execute:
	bacalhau job executions j-fd557eaf-6561-4419-bd4c-e022114bcaa1
```

Dec 18, 2023 (edited)

Is there support for a custom S3 endpoint?

author

Yes, it is supported, along with S3-compatible services such as MinIO and GCS. You can check the S3 Publisher spec for more info:

https://docs.bacalhau.org/references/other-specifications/publishers/s3
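For example, a job spec can point the publisher at a self-hosted S3-compatible endpoint along these lines (a minimal sketch based on the S3 Publisher spec linked above; the bucket name, key prefix, and MinIO endpoint URL are illustrative, not taken from the thread):

```yaml
# Hypothetical values: Bucket, Key, and Endpoint are placeholders for your own setup.
Publisher:
  Type: s3
  Params:
    Bucket: my-results-bucket        # destination bucket on the S3-compatible service
    Key: outputs/                    # key prefix for published results
    Endpoint: http://minio.internal:9000   # custom endpoint, e.g. a private MinIO server
    Region: us-east-1
```

The compute nodes publishing the results must be able to reach that endpoint and must have credentials configured for it, which is worth double-checking when running on a private network.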

author

Sure thing! What did you have in mind?
