
Commit e3d6934 (parent 8cbfd3f): update docs

1 file changed: +16, −5 lines

docs/en/sql-reference/table-functions/s3.md

@@ -276,15 +276,26 @@
 
 For more details on role assumption, read [AWS AssumeRole documentation](https://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRole.html).
 
-To assume a role, pass `roleARN` via the `extra_credentials` parameter to the `s3` function. Example:
+To enable role assumption, pass parameters via the `extra_credentials` argument in the `s3` function. The following keys are supported:
+
+* `role_arn` (required) — ARN of the IAM role to assume. **If this key is not provided, ClickHouse will not attempt to assume a role and will use the original credentials as-is.**
+* `role_session_name` (optional) — Custom session name to include in the AssumeRole request.
+* `sts_endpoint_override` (optional) — Overrides the default AWS STS endpoint (https://sts.amazonaws.com). Useful for testing with a mock or when using another STS-compatible service.
 
 ```sql
-SELECT count() FROM s3('<s3_bucket_uri>/*.tsv',access_key_id,secret_access_key,'CSVWithNames',extra_credentials(role_arn = 'arn:aws:iam::111111111111:role/BucketAccessRole-001'))
+SELECT count() FROM s3(
+    '<s3_bucket_uri>/*.csv',
+    access_key_id,
+    secret_access_key,
+    'CSVWithNames',
+    extra_credentials(
+        role_arn = 'arn:aws:iam::111111111111:role/BucketAccessRole-001',
+        role_session_name = 'ClickHouseSession',
+        sts_endpoint_override = 'http://mock-sts:8080'
+    )
+)
 ```
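
Since `role_arn` is the only required key, the minimal form of the new syntax might look like the following sketch (based on the keys listed in the diff; the optional `role_session_name` and `sts_endpoint_override` keys simply fall back to their defaults, and the bucket URI placeholder is illustrative):

```sql
-- Minimal role assumption: only the required role_arn key is supplied.
-- Without role_arn, ClickHouse uses the original credentials as-is.
SELECT count() FROM s3(
    '<s3_bucket_uri>/*.csv',
    access_key_id,
    secret_access_key,
    'CSVWithNames',
    extra_credentials(role_arn = 'arn:aws:iam::111111111111:role/BucketAccessRole-001')
)
```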
 
-It is also possible to pass `role_session_name` inside `extra_credentials` if needed.
-
 ## Working with archives
 
 Suppose that we have several archive files with following URIs on S3:
