S3 asf fixes detected by Terraform test suite #7027
Conversation
localstack/services/s3/utils.py
Outdated
```
@@ -48,6 +48,8 @@
    ObjectCannedACL.bucket_owner_read,
    ObjectCannedACL.public_read,
    ObjectCannedACL.public_read_write,
    # according to the docs, 'log-delivery-write' is also valid: https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl
    "log-delivery-write",  # TODO add to parser? seems to be missing across services though
```
We could add it with a patch I guess, weird it's not part of it.
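For illustration, a minimal sketch of how the extended list could be used to validate a request's ACL; the set below follows the AWS canned-ACL documentation, and the helper name is hypothetical, not code from this PR:

```python
# Hypothetical helper: validate the x-amz-acl header against all canned ACLs
# documented by AWS, including the one missing from the generated enum.
VALID_CANNED_ACLS = {
    "private",
    "public-read",
    "public-read-write",
    "authenticated-read",
    "aws-exec-read",
    "bucket-owner-read",
    "bucket-owner-full-control",
    "log-delivery-write",  # documented as valid, but missing from the generated enum
}


def validate_canned_acl(acl: str) -> None:
    if acl not in VALID_CANNED_ACLS:
        raise ValueError(f"invalid canned ACL: {acl}")
```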
LGTM! i don't have a good suggestion on how to get the s3 hack out of the parser right now, so i'd be OK with merging it for now with the FIXME. i'll let alex make the final call though.
localstack/aws/protocol/parser.py
Outdated
```
if (
    self.service.service_name == "s3"
    and uri_param_name == "Key"
    and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
```
`base_url`: "Like url but without the query string."

Suggested change:
```diff
- and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
+ and request.base_url.endswith(f"{uri_params[uri_param_name]}/")
```
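A quick check of the equivalence, as a minimal sketch using werkzeug's test utilities (assumed here, since the request object behaves like a werkzeug request; path and query string are made up):

```python
from werkzeug.test import EnvironBuilder
from werkzeug.wrappers import Request

# build a request whose path ends with "/" and that carries a query string
env = EnvironBuilder(path="/my-bucket/my-key/", query_string="x-id=PutObject").get_environ()
request = Request(env)

# base_url is the url without the query string, so both expressions agree
assert request.url.split("?")[0] == request.base_url
assert request.base_url.endswith("my-key/")
```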
localstack/aws/protocol/parser.py
Outdated
```
# FIXME special case for s3 object-names (=key): a trailing '/' is valid and needs to be preserved;
# however, the url-matcher removes it from the key,
# so we check the request.url to verify the name
if (
    self.service.service_name == "s3"
    and uri_param_name == "Key"
    and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
```
predictably i'd prefer encapsulating this in the s3 parser somehow :D i don't have a good suggestion at the moment, will take a look after the review, maybe @alexrashed also has some ideas
To be honest, the `service_name` is something we desperately tried to avoid in these "generalized" parsers. Since there is already an `S3RequestParser`, we could check this case, modify the `uri_params`, and afterwards call the super method. Something like this:
```python
class S3RequestParser(RestXMLRequestParser):
    def _parse_shape(
        self, request: HttpRequest, shape: Shape, node: Any, uri_params: Mapping[str, Any] = None
    ) -> Any:
        if (
            shape is not None
            and uri_params is not None
            and shape.serialization.get("location") == "uri"
            and shape.serialization.get("name") == "Key"
            and request.url.split("?")[0].endswith(f"{uri_params['Key']}/")
        ):
            uri_params = dict(uri_params)
            uri_params["Key"] = uri_params["Key"] + "/"
        return super()._parse_shape(request, shape, node, uri_params)
```
Another option would be to externalize the location parsing to a separate function, but this doesn't seem to be worth the effort / complexity.
I'm happy to assist if I can be of service!
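To make the edge case concrete, a small standalone sketch (the URL and key values are made up for illustration):

```python
# An S3 object key may legitimately end with "/" (a "folder" marker), but a
# URL matcher that normalizes trailing slashes hands the parser the key
# without it. The check above restores the slash from the raw URL.
base_url = "http://s3.localhost.localstack.cloud:4566/my-bucket/my-folder/"
matched_key = "my-folder"  # what the URL matcher extracted

if base_url.endswith(f"{matched_key}/"):
    matched_key += "/"

assert matched_key == "my-folder/"
```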
```
if request.method == "HEAD":
    # for HEAD we have to keep the original content-length, but it will be re-calculated when creating
    # the final_response object
    final_response.content_length = response.headers.get("Content-Length", 0)
```
good catch!
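For context, a minimal sketch of why this line matters, using a werkzeug `Response` (assumed, since `final_response` behaves like one here; the header value is made up):

```python
from werkzeug.wrappers import Response

backend_headers = {"Content-Length": "1024"}  # headers from the proxied backend

# a HEAD response carries no body, so recomputing Content-Length from the
# (empty) body would yield 0 instead of the size the headers describe
final_response = Response(b"")
final_response.content_length = int(backend_headers.get("Content-Length", 0))

assert final_response.headers["Content-Length"] == "1024"
```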
Wow! This is great! The S3 ASF migration really goes to the limits (and beyond) of ASF. 🚀
It would be nice if we could externalize the S3 specific edge case in the parser, but this also shouldn't be a blocker (I could also take care of that another time).
…sposition' for parsing
Moved the specific s3-parsing (for the special case with a trailing "/" in the key name) to the `S3RequestParser` as suggested by @alexrashed (thank you! 🙂). Removing the commit with the excluded CORS tests and rebasing with master now (to check that the tests still work with the old s3 provider), before merging.
This PR contains several fixes for S3 ASF that were detected by the terraform test suite:

- fix `ReplaceKeyPrefixWith` for the redirect-routing rule (in that case it should remove the `KeyPrefixEquals` from the original url and then do the redirect; see the sketch after this list)
- fix `content-language`, `content-disposition` and `cache-control` to be picked up by the parser
- fix `x-amz-server-side-encryption-bucket-key-enabled`, which is returned as camel-case by moto, but should be lowercase to be picked up by the parser
- fix `content-length` for HEAD requests that are forwarded using the `Proxy` for vhost-style requests
- fix `delete-objects`: remove the `=` from the path (`/<bucket>/?delete=`, but expects `/<bucket>/?delete`)
- fix the `quiet` mode for `delete-objects`
- add `log-delivery-write`, which was missing from the `BucketCannedACL`
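As referenced in the first bullet, a hedged sketch of the `ReplaceKeyPrefixWith` behavior described there (the function is hypothetical, not the code from this PR):

```python
def apply_redirect_key(key: str, key_prefix_equals: str, replace_key_prefix_with: str) -> str:
    # the matched KeyPrefixEquals prefix must be removed from the original
    # key before the replacement prefix is prepended for the redirect
    if key.startswith(key_prefix_equals):
        key = key[len(key_prefix_equals):]
    return replace_key_prefix_with + key


# "docs/page.html" redirected with KeyPrefixEquals="docs/" and
# ReplaceKeyPrefixWith="documents/" becomes "documents/page.html"
assert apply_redirect_key("docs/page.html", "docs/", "documents/") == "documents/page.html"
```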
TODO before merge: