
Tagging in PostPolicy upload does not enforce policy tags #21648

@datakurre

Description

Expected Behavior

Defining a tagging condition in a PostPolicy, as specified in this implemented feature request, should require the upload to carry the tags specified in the policy, not merely allow them. This is what makes it possible to enforce a lifecycle policy by tag for a fresh upload, as described in the original issue. An upload that does not match the condition should not be authorized.

Current Behavior

Instead, the "tagging" value in the policy conditions can be anything, even an empty string, and the final POST can apply whatever tags it likes. The tags in the policy are ignored and the object is stored with the tags from the POST form.
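For reference, exact-match conditions in a POST policy are supposed to be compared verbatim against the posted form fields. A minimal sketch of that check (`exact_match_conditions_hold` is a hypothetical helper written for illustration, not MinIO code), applied to the mismatch reported here:

```python
import base64
import json

def exact_match_conditions_hold(policy_b64, posted_fields):
    # The check a conforming server is expected to apply to every
    # {"field": "value"} policy condition, including "tagging".
    policy = json.loads(base64.b64decode(policy_b64))
    for condition in policy.get("conditions", []):
        if isinstance(condition, dict):
            for name, expected in condition.items():
                if posted_fields.get(name) != expected:
                    return False
    return True

# The policy demands retention=expire_fast ...
policy = {"conditions": [{"tagging":
    "<Tagging><TagSet><Tag><Key>retention</Key>"
    "<Value>expire_fast</Value></Tag></TagSet></Tagging>"}]}
encoded = base64.b64encode(json.dumps(policy).encode()).decode()

# ... but the form posts retention=never-expire, so a conforming
# server should refuse to authorize the upload.
posted = {"tagging":
    "<Tagging><TagSet><Tag><Key>retention</Key>"
    "<Value>never-expire</Value></Tag></TagSet></Tagging>"}
print(exact_match_conditions_hold(encoded, posted))  # False
```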

Steps to Reproduce (for bugs)

@orientalperil Since you submitted the original feature request for this, would you be able to confirm this issue?

import datetime

import boto3
import requests

RETENTION = 'retention'
EXPIRE_FAST = 'expire_fast'
TEMP_MARKER = 'temp'

service = 's3'
region = 'us-east-1'
t = datetime.datetime.utcnow()
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = '/'.join([t.strftime('%Y%m%d'), region, service, 'aws4_request'])

key = 'temp-22e063cd-4ed3-437c-87ac-2bc6e6f328d0/my_file.txt'


def get_tag_xml(key, value):
    return f"<Tagging><TagSet><Tag><Key>{key}</Key><Value>{value}</Value></Tag></TagSet></Tagging>"

def get_tag_xml_real(key, value):
    # Deliberately ignores ``value`` and substitutes a tag value the policy
    # does not allow, to demonstrate that the condition is not enforced.
    return f"<Tagging><TagSet><Tag><Key>{key}</Key><Value>never-expire</Value></Tag></TagSet></Tagging>"


conditions = [
    {"x-amz-algorithm": algorithm},
    {"x-amz-credential": credential_scope},
    {"x-amz-date": t.isoformat()},
    {"tagging": get_tag_xml(RETENTION, EXPIRE_FAST)},
    {"success_action_status": "201"},
    {"bucket": 'demo-bucket'},
    ["starts-with", "$key", TEMP_MARKER],
]

fields = {
    "x-amz-algorithm": algorithm,
    "x-amz-credential": credential_scope,
    "x-amz-date": t.isoformat(),
    "tagging": get_tag_xml_real(RETENTION, EXPIRE_FAST),
    "success_action_status": "201",
}

client = boto3.client(
    service_name='s3',
    aws_access_key_id='minioadmin',
    aws_secret_access_key='minioadmin',
    endpoint_url='http://localhost:9000',
)

presigned = client.generate_presigned_post(
    'demo-bucket',
    key,
    Fields=fields,
    Conditions=conditions,
    ExpiresIn=60*60,
)

# Demonstrate how another Python program can use the presigned URL to upload a file
object_name = 'filename.txt'
with open(object_name, 'rb') as f:
    files = {'file': (object_name, f)}
    http_response = requests.post(presigned['url'], data=presigned['fields'], files=files)
print(f'File upload HTTP status code: {http_response.status_code}')
print(http_response.content)
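To see what the server was actually handed, the signed policy in presigned['fields']['policy'] can be decoded: it is just base64-encoded JSON containing the conditions above. A minimal sketch using a hand-built stand-in for that value (the structure assumed here mirrors the Conditions passed to generate_presigned_post; against a live server you would decode the real field instead):

```python
import base64
import json

# Stand-in for presigned['fields']['policy'] from the script above.
policy_b64 = base64.b64encode(json.dumps({
    "expiration": "2025-09-08T12:00:00Z",
    "conditions": [
        {"tagging": "<Tagging><TagSet><Tag><Key>retention</Key>"
                    "<Value>expire_fast</Value></Tag></TagSet></Tagging>"},
        {"bucket": "demo-bucket"},
        ["starts-with", "$key", "temp"],
    ],
}).encode()).decode()

# Decode and pull out the "tagging" condition: the signed policy really
# does demand retention=expire_fast, yet the upload with never-expire
# succeeds.
decoded = json.loads(base64.b64decode(policy_b64))
tagging = next(c["tagging"] for c in decoded["conditions"]
               if isinstance(c, dict) and "tagging" in c)
print(tagging)
```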

Context

The same use case as in #19811

Regression

Your Environment

  • Version used (minio --version): RELEASE.2025-09-07T16-13-09Z (go1.24.6 linux/amd64)
  • Server setup and configuration: podman run --rm -ti --name minio -e "MINIO_ROOT_USER=minioadmin" -e "MINIO_ROOT_PASSWORD=minioadmin" --network host quay.io/minio/minio server /data --console-address ":9001"
