org.apache.hadoop.fs.s3a.S3AUtils Java Examples
The following examples show how to use org.apache.hadoop.fs.s3a.S3AUtils, a utility class from Hadoop's S3A filesystem connector. Each example notes its source file, originating project, and license.
Example #1
Source File: S3FileSystem.java, from dremio-oss (Apache License 2.0)
@Override
public AmazonS3 load(S3ClientKey clientKey) throws Exception {
  logger.debug("Opening S3 client connection for {}", clientKey);
  DefaultS3ClientFactory clientFactory = new DefaultS3ClientFactory();
  clientFactory.setConf(clientKey.s3Config);
  final AWSCredentialProviderList credentialsProvider =
      S3AUtils.createAWSCredentialProviderSet(S3_URI, clientKey.s3Config);
  final AmazonS3 s3Client = clientFactory.createS3Client(S3_URI, "", credentialsProvider);

  return registerReference(AmazonS3.class, s3Client, client -> {
    client.shutdown();
    try {
      // Note that the AWS SDKv1 client will NOT close the credentials provider
      // when it is shut down, so it has to be closed here. Because the client
      // still holds a reference to the credentials provider, the provider won't
      // be garbage collected until the client itself is.
      if (credentialsProvider instanceof AutoCloseable) {
        ((AutoCloseable) credentialsProvider).close();
      }
    } catch (Exception e) {
      logger.warn("Failed to close AWS credentials provider", e);
    }
  });
}
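The key call here is S3AUtils.createAWSCredentialProviderSet(URI, Configuration), which builds an AWSCredentialProviderList from the fs.s3a.aws.credentials.provider setting (falling back to Hadoop's default provider chain). Below is a minimal sketch of using it directly, outside a client factory; the bucket URI and credential values are placeholders, and the direct close() call assumes a Hadoop release where AWSCredentialProviderList implements Closeable:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.s3a.AWSCredentialProviderList;
import org.apache.hadoop.fs.s3a.S3AUtils;

public class CredentialProviderSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.s3a.access.key", "EXAMPLE_ACCESS_KEY"); // placeholder value
    conf.set("fs.s3a.secret.key", "EXAMPLE_SECRET_KEY"); // placeholder value

    // Builds the same provider chain S3A itself would use for this bucket.
    AWSCredentialProviderList providers =
        S3AUtils.createAWSCredentialProviderSet(new URI("s3a://example-bucket"), conf);
    try {
      System.out.println(providers.getCredentials().getAWSAccessKeyId());
    } finally {
      providers.close(); // assumes a Hadoop version where the list is Closeable
    }
  }
}

The instanceof AutoCloseable guard in the Dremio code above is presumably there because not every Hadoop release makes AWSCredentialProviderList closeable; checking defensively keeps the cleanup portable across versions.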
Example #2
Source File: STSCredentialProviderV1.java, from dremio-oss (Apache License 2.0)
public STSCredentialProviderV1(URI uri, Configuration conf) throws IOException {
  AWSCredentialsProvider awsCredentialsProvider = null;

  // TODO: Leverage S3AUtils.createAwsCredentialProvider
  if (S3StoragePlugin.ACCESS_KEY_PROVIDER.equals(conf.get(Constants.ASSUMED_ROLE_CREDENTIALS_PROVIDER))) {
    awsCredentialsProvider = new SimpleAWSCredentialsProvider(uri, conf);
  } else if (S3StoragePlugin.EC2_METADATA_PROVIDER.equals(conf.get(Constants.ASSUMED_ROLE_CREDENTIALS_PROVIDER))) {
    awsCredentialsProvider = InstanceProfileCredentialsProvider.getInstance();
  }

  final String region = S3FileSystem.getAWSRegionFromConfigurationOrDefault(conf).toString();
  final AWSSecurityTokenServiceClientBuilder builder = AWSSecurityTokenServiceClientBuilder.standard()
      .withCredentials(awsCredentialsProvider)
      .withClientConfiguration(S3AUtils.createAwsConf(conf, ""))
      .withRegion(region);
  S3FileSystem.getStsEndpoint(conf).ifPresent(e -> {
    builder.withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(e, region));
  });

  this.stsAssumeRoleSessionCredentialsProvider = new STSAssumeRoleSessionCredentialsProvider.Builder(
      conf.get(Constants.ASSUMED_ROLE_ARN), UUID.randomUUID().toString())
      .withStsClient(builder.build())
      .build();
}
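S3AUtils.createAwsConf(Configuration, String) is doing quiet but useful work in this example: it converts fs.s3a.* client settings (proxy, timeouts, connection limits, user agent) into an AWS SDK v1 ClientConfiguration, which can then be attached to any SDK client, not just S3. A minimal sketch, assuming Hadoop 3.x and the v1 SDK; the class name and the hard-coded region are illustrative only:

import java.io.IOException;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.s3a.S3AUtils;

public class StsClientSketch {
  public static AWSSecurityTokenService buildStsClient(Configuration conf) throws IOException {
    // Picks up fs.s3a.* proxy, timeout, and user-agent settings from the Hadoop config.
    ClientConfiguration awsConf = S3AUtils.createAwsConf(conf, "" /* no per-bucket overrides */);
    return AWSSecurityTokenServiceClientBuilder.standard()
        .withClientConfiguration(awsConf)
        .withRegion("us-east-1") // illustrative; derive from conf in real code
        .build();
  }
}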
Example #3
Source File: HadoopS3AccessHelper.java, from Flink-CEPplus (Apache License 2.0)
@Override
public ObjectMetadata getObjectMetadata(String key) throws IOException {
  try {
    return s3a.getObjectMetadata(new Path('/' + key));
  } catch (SdkBaseException e) {
    throw S3AUtils.translateException("getObjectMetadata", key, e);
  }
}
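This helper (repeated verbatim in the Flink example below) leans on S3AUtils.translateException(operation, path, exception), which maps AWS SDK exceptions onto the Java IOExceptions the S3A filesystem itself raises; for instance, an S3 404 becomes a FileNotFoundException and a 403 becomes an AccessDeniedException. A hedged sketch of the same pattern as a reusable wrapper; the class and method names here are invented for illustration:

import java.io.IOException;
import java.util.concurrent.Callable;
import com.amazonaws.SdkBaseException;
import org.apache.hadoop.fs.s3a.S3AUtils;

public final class S3Calls {
  private S3Calls() {}

  // Runs an S3 operation and rethrows SDK failures as IOExceptions,
  // e.g. 404 -> FileNotFoundException, 403 -> AccessDeniedException.
  public static <T> T once(String operation, String key, Callable<T> call) throws IOException {
    try {
      return call.call();
    } catch (SdkBaseException e) {
      throw S3AUtils.translateException(operation, key, e);
    } catch (Exception e) {
      throw new IOException(operation + " on " + key + " failed", e);
    }
  }
}

Newer Hadoop releases also ship org.apache.hadoop.fs.s3a.Invoker, which wraps exception translation (plus retries) in a similar way, so that may be the better entry point when it is available.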
Example #4
Source File: HadoopS3AccessHelper.java, from flink (Apache License 2.0)
@Override
public ObjectMetadata getObjectMetadata(String key) throws IOException {
  try {
    return s3a.getObjectMetadata(new Path('/' + key));
  } catch (SdkBaseException e) {
    throw S3AUtils.translateException("getObjectMetadata", key, e);
  }
}