Category: amazon-web-services

I am interested in whether one can import sagemaker packages in your own local Python environment, or whether they are restricted to AWS SageMaker:

    from sagemaker_automl import AutoMLInteractiveRunner, AutoMLLocalCandidate

For instance, can I somehow download sagemaker_automl? I know there are no sagemaker packages available in the conda repository. Perhaps there is some other way ..
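
For what it’s worth, the core SageMaker Python SDK itself installs from PyPI and imports fine outside AWS; a minimal sketch follows. The sagemaker_automl workaround in the comments is an assumption, since that module ships alongside the Autopilot-generated candidate notebook rather than on PyPI:

    # Core SDK: pip install sagemaker (also available via conda-forge,
    # though not in the default conda channels).
    import sagemaker
    print(sagemaker.__version__)

    # sagemaker_automl is generated next to the Autopilot candidate notebook;
    # one workaround (an assumption, not an official channel) is to download
    # that module's folder from the notebook instance onto your PYTHONPATH:
    # from sagemaker_automl import AutoMLInteractiveRunner, AutoMLLocalCandidate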


I am trying to mock the describe_instance_types call on an AWS EC2 client using Python’s Stubber, but have not been successful. I have a Lambda handler function that calls describe_instance_types() with instance_type as a parameter to get the hypervisor information for that type:

    describe_instance_types_response = boto3.client('ec2').describe_instance_types(InstanceTypes=[instance_type])
    hypervisor = describe_instance_types_response["InstanceTypes"][0]["Hypervisor"]

And I have a test case with a stubber as below ..
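
A minimal working stub using botocore’s Stubber looks like the sketch below. The usual pitfall is that the handler builds its own client with boto3.client('ec2'); Stubber only intercepts calls on the exact client instance it wraps, so the stubbed client has to be injected or patched into the handler:

    import boto3
    from botocore.stub import Stubber

    ec2 = boto3.client("ec2", region_name="us-east-1")
    stubber = Stubber(ec2)

    # Both the canned response and the expected params are validated
    # against the real API model.
    stubber.add_response(
        "describe_instance_types",
        {"InstanceTypes": [{"InstanceType": "t3.micro", "Hypervisor": "nitro"}]},
        {"InstanceTypes": ["t3.micro"]},
    )

    with stubber:
        resp = ec2.describe_instance_types(InstanceTypes=["t3.micro"])
        assert resp["InstanceTypes"][0]["Hypervisor"] == "nitro"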


Apologies for the very rudimentary question. I am running a Docker container, and it’s referencing a file called site.py which is supposed to display a message on a web page. When I run a curl command, this is the message I get: The view function did not return a valid response. The function either returned ..
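
That error message comes from Flask, and it almost always means the view returned None, for example by printing the message instead of returning it. A minimal sketch of a view that satisfies Flask (the route and message are assumptions, since site.py isn’t shown):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        # A view must return a str/bytes/Response/tuple; a bare print()
        # with no return yields None and triggers the "did not return a
        # valid response" error seen in the curl output.
        return "Hello from the container!"

    if __name__ == "__main__":
        # Bind to 0.0.0.0 so the app is reachable from outside the container.
        app.run(host="0.0.0.0", port=5000)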


This is my problem: I have to run a SageMaker processing job using custom code written in PySpark. I’ve used the SageMaker SDK by running these commands:

    spark_processor = sagemaker.spark.processing.PySparkProcessor(
        base_job_name="spark-preprocessor",
        framework_version="2.4",
        role=role_arn,
        instance_count=2,
        instance_type="ml.m5.xlarge",
        max_runtime_in_seconds=1800,
    )
    spark_processor.run(
        submit_app="processing.py",
        arguments=['s3_input_bucket', bucket_name,
                   's3_input_file_path', file_path]
    )

Now I have to automate the workflow by using Step ..
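
For the Step Functions side, there is an optimized SageMaker integration that can run the same job as a state machine task. A minimal sketch of the state definition as a Python dict; the image URI, role, and S3 values are placeholders, not values from the question:

    # The .sync suffix makes the state wait for the processing job to finish.
    definition = {
        "StartAt": "SparkPreprocess",
        "States": {
            "SparkPreprocess": {
                "Type": "Task",
                "Resource": "arn:aws:states:::sagemaker:createProcessingJob.sync",
                "Parameters": {
                    "ProcessingJobName.$": "$.job_name",
                    "RoleArn": "<role_arn>",
                    "AppSpecification": {
                        # Region-specific SageMaker Spark processing image
                        # (placeholder). For PySpark you would also stage
                        # processing.py via ProcessingInputs and set
                        # ContainerEntrypoint accordingly (omitted here).
                        "ImageUri": "<sagemaker-spark-processing-image-uri>",
                        "ContainerArguments": [
                            "s3_input_bucket", "<bucket_name>",
                            "s3_input_file_path", "<file_path>",
                        ],
                    },
                    "ProcessingResources": {
                        "ClusterConfig": {
                            "InstanceCount": 2,
                            "InstanceType": "ml.m5.xlarge",
                            "VolumeSizeInGB": 30,
                        }
                    },
                    "StoppingCondition": {"MaxRuntimeInSeconds": 1800},
                },
                "End": True,
            }
        },
    }

An alternative worth considering is SageMaker Pipelines, whose ProcessingStep can wrap the same PySparkProcessor directly instead of re-expressing the job in Step Functions.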


I am trying to run a COPY command that loads around 100 GB of data from S3 to Redshift. I am using a Lambda function to initiate this COPY command every day. This is my current code:

    from datetime import datetime, timedelta
    import dateutil.tz
    import psycopg2
    from config import *

    def lambda_handler(event, context):
        con = ..
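
A minimal sketch of the COPY itself, with placeholder table, bucket, and role names. One caveat worth flagging: a 100 GB COPY can outlast Lambda’s 15-minute limit while psycopg2 holds the connection open, so issuing the statement asynchronously (for example via the Redshift Data API) is a common alternative:

    import psycopg2

    def run_copy(con):
        # Placeholder names; IAM_ROLE must be a role attached to the
        # cluster with read access to the bucket.
        copy_sql = """
            COPY analytics.events
            FROM 's3://my-bucket/daily/'
            IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
            FORMAT AS CSV GZIP;
        """
        with con.cursor() as cur:
            cur.execute(copy_sql)
        con.commit()  # COPY runs in a transaction; commit makes the load durable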


I got the error "errorMessage": "An error occurred (AccessDenied) when calling the DescribeClusters operation: User: arn:aws:sts::XX:assumed-role/xx/axx is not authorized to perform: redshift:DescribeClusters on resource: arn:aws:xx:*". For RDS, below is the code:

    client = boto3.client('rds')
    cluster_list = client.describe_db_cluster_endpoints()
    print(cluster_list)

For Redshift, below is the code:

    client = boto3.client('redshift', 'us-east-2')
    cluster_list = client.describe_clusters()
    print(cluster_list)

My IAM role ..
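
The RDS call presumably succeeds because the role carries rds:Describe* permissions but nothing for Redshift. A minimal sketch of the missing statement, expressed as a Python dict (the Sid is made up, and Resource is left open for brevity; scope it to the cluster ARN in practice):

    import json

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowDescribeClusters",
                "Effect": "Allow",
                "Action": ["redshift:DescribeClusters"],
                # "*" for brevity; tighten to the cluster ARN in production.
                "Resource": "*",
            }
        ],
    }
    print(json.dumps(policy, indent=2))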
