You can connect to your object storage using an S3-compatible API, an SDK, or the AWS CLI. Below are some examples of commands you can use with these methods. For a full list of compatible commands, refer to S3 API compatibility.
If you currently have a bucket with AWS or GCP and would like to transfer your files from this bucket to Sevalla, please contact our Support Team.
AWS CLI
aws configure set aws_access_key_id YOUR_ACCESS_KEY
aws configure set aws_secret_access_key YOUR_SECRET_KEY
aws configure set default.s3.signature_version s3v4
aws configure set default.s3.endpoint_url YOUR_ENDPOINT_URL
Generate a pre-signed URL
aws s3 presign s3://YOUR_BUCKET_NAME/FILENAME --expires-in 3600 --endpoint-url YOUR_ENDPOINT_URL
Upload a file
aws s3 cp LOCAL_FILENAME s3://YOUR_BUCKET_NAME/ --endpoint-url YOUR_ENDPOINT_URL
Download a file
aws s3 cp s3://YOUR_BUCKET_NAME/REMOTE_FILENAME DOWNLOADED_FILENAME --endpoint-url YOUR_ENDPOINT_URL
List your bucket objects
aws s3 ls s3://YOUR_BUCKET_NAME/ --endpoint-url YOUR_ENDPOINT_URL
Boto3 (Python SDK)
Install Boto3 if you haven't already:
pip install boto3
Enter Python interactive mode:
python
Connect to your object storage
import boto3
from botocore.client import Config
# Set up the connection to your object storage
s3 = boto3.client(
"s3",
endpoint_url="YOUR_ENDPOINT_URL",
aws_access_key_id="YOUR_ACCESS_KEY",
aws_secret_access_key="YOUR_SECRET_KEY",
config=Config(signature_version="s3v4"),
)
Generate a pre-signed URL
presigned_url = s3.generate_presigned_url(
"get_object",
Params={"Bucket": "YOUR_BUCKET_NAME", "Key": "REMOTE_FILENAME"}, # Replace values
ExpiresIn=3600, # Expiration time in seconds (e.g., 1 hour)
)
print("Pre-signed URL:", presigned_url) # Anyone with this URL can fetch the object until it expires
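A SigV4 pre-signed URL carries its parameters in the query string, so you can inspect it without making any network call. A minimal standard-library sketch (`X-Amz-Expires` is part of the SigV4 query format; the sample URL below is a made-up illustration, not a real endpoint):

```python
from urllib.parse import urlparse, parse_qs

def presign_expiry_seconds(url):
    """Read the X-Amz-Expires query parameter from a SigV4 pre-signed URL."""
    params = parse_qs(urlparse(url).query)
    return int(params["X-Amz-Expires"][0])

# Hypothetical example URL, trimmed to the parameters used here:
sample = (
    "https://example-endpoint/YOUR_BUCKET_NAME/report.pdf"
    "?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Expires=3600"
    "&X-Amz-Signature=deadbeef"
)
print(presign_expiry_seconds(sample))  # 3600
```

Note that the expiry is relative to the `X-Amz-Date` signing timestamp, not to when the URL is first used.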
Upload a file
s3.upload_file("LOCAL_FILENAME", "YOUR_BUCKET_NAME", "UPLOADED_FILENAME")
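`upload_file` also accepts an optional `ExtraArgs` dict; setting `ContentType` there lets browsers render the object correctly when it is served later. A small sketch using the standard-library `mimetypes` module to pick the type (the helper name and filenames are illustrative):

```python
import mimetypes

def guess_content_type(filename):
    """Guess a Content-Type for an upload; fall back to generic binary."""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"

# Pass the guessed type through ExtraArgs on upload, e.g.:
# s3.upload_file("photo.png", "YOUR_BUCKET_NAME", "photo.png",
#                ExtraArgs={"ContentType": guess_content_type("photo.png")})
print(guess_content_type("photo.png"))      # image/png
print(guess_content_type("notes.unknown"))  # application/octet-stream
```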
Download a file
s3.download_file("YOUR_BUCKET_NAME", "REMOTE_FILENAME", "DOWNLOADED_FILENAME")
List your bucket objects
response = s3.list_objects_v2(Bucket="YOUR_BUCKET_NAME")
print(response) # See what files exist
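`list_objects_v2` returns a dict; the objects live under its `Contents` key, which is absent entirely when the bucket is empty. A minimal helper for pulling out just the keys (the function name and sample response are illustrative):

```python
def object_keys(response):
    """Return object keys from a list_objects_v2 response dict.

    An empty bucket yields no "Contents" key at all, so default to [].
    """
    return [obj["Key"] for obj in response.get("Contents", [])]

# Sample response, trimmed to the fields used here:
sample = {
    "KeyCount": 2,
    "Contents": [
        {"Key": "photos/cat.png", "Size": 1024},
        {"Key": "backup.tar.gz", "Size": 52428800},
    ],
}
print(object_keys(sample))           # ['photos/cat.png', 'backup.tar.gz']
print(object_keys({"KeyCount": 0}))  # []
```

Note that `list_objects_v2` returns at most 1,000 objects per call; for larger buckets, use `s3.get_paginator("list_objects_v2")` to iterate over all pages.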