Category: google-cloud-platform

I can set validation in my Google Sheet using gspread with the code below: gs = gspread.service_account("<file_name>.json") sh = gs.open_by_key(<Sheet-ID>) ws = sh.sheet1 body_validation = { "requests": [ { "setDataValidation": { "range": { "sheetId": 0, "startRowIndex": 11, "endRowIndex": 12, "startColumnIndex": 0, "endColumnIndex": 1 }, "rule": { "condition": { "type": "ONE_OF_LIST", "values": [ { "userEnteredValue": "Value A"}, ..

Read more
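For reference, a minimal sketch of that setDataValidation pattern sent through gspread's batch_update helper; the second dropdown value, the showCustomUi/strict flags, and the placeholder file and sheet IDs are assumptions, not from the original excerpt.

import gspread

gs = gspread.service_account(filename="<file_name>.json")
sh = gs.open_by_key("<Sheet-ID>")

body_validation = {
    "requests": [
        {
            "setDataValidation": {
                # Row 12, column A of the first sheet (indexes are zero-based, end-exclusive)
                "range": {
                    "sheetId": 0,
                    "startRowIndex": 11,
                    "endRowIndex": 12,
                    "startColumnIndex": 0,
                    "endColumnIndex": 1,
                },
                "rule": {
                    "condition": {
                        "type": "ONE_OF_LIST",
                        "values": [
                            {"userEnteredValue": "Value A"},
                            {"userEnteredValue": "Value B"},  # placeholder second option
                        ],
                    },
                    "showCustomUi": True,  # render the rule as an in-cell dropdown
                    "strict": True,        # reject values outside the list
                },
            }
        }
    ]
}

# gspread forwards this body to the Sheets API spreadsheets.batchUpdate endpoint
sh.batch_update(body_validation)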

I have a Python FastAPI app I’m trying to deploy to Google App Engine (GAE). I believe gunicorn is the preferred command for GAE to run. It seems that the GAE deploy environment can be configured through an app.yaml alone or in combination with a Dockerfile. I need Google Chrome installed in the ..

Read more
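The excerpt cuts off before the actual question, but as a rough sketch of the two configuration routes it mentions (the main:app import path, the file layout, and the chromium package are assumptions, not from the original): the standard environment can run FastAPI under gunicorn's uvicorn worker class via the entrypoint field, while installing a browser generally means switching to the flexible environment with a custom runtime and a Dockerfile.

# app.yaml — standard environment (no extra OS packages such as Chrome)
runtime: python311
entrypoint: gunicorn -b :$PORT -k uvicorn.workers.UvicornWorker main:app

# app.yaml — flexible environment with a custom runtime, built from a Dockerfile
runtime: custom
env: flex

# Dockerfile — assumed sketch; Debian's chromium stands in for Google Chrome
FROM python:3.11-slim
RUN apt-get update && apt-get install -y chromium && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
# App Engine flexible sends traffic to port 8080
CMD exec gunicorn -b :8080 -k uvicorn.workers.UvicornWorker main:app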

from googleapiclient.http import MediaFileUpload from Google import Create_Service CLIENT_SECRET_FILE = 'client-secret.json' API_NAME = 'drive' API_VERSION = 'v3' SCOPES = ['https://www.googleapis.com/auth/drive.file'] service = Create_Service(CLIENT_SECRET_FILE, API_NAME, API_VERSION, SCOPES) # Upload file file_metadata = { 'name': 'image.jpg', # Name to upload to Google Drive 'parents': ['insert parents here'] } media_content = MediaFileUpload('image.jpg', mimetype='image/jpg') # image from folder file = service.files().create( ..

Read more
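Assuming the service object is built exactly as in the excerpt (the Create_Service helper comes from the asker's own Google module), here is a hedged sketch of how the truncated files().create() call typically finishes; the parent folder ID is a placeholder.

from googleapiclient.http import MediaFileUpload

file_metadata = {
    "name": "image.jpg",          # name to give the file in Drive
    "parents": ["<folder-id>"],   # placeholder parent folder ID
}
media_content = MediaFileUpload("image.jpg", mimetype="image/jpeg")

file = (
    service.files()
    .create(body=file_metadata, media_body=media_content, fields="id")
    .execute()
)
print("Uploaded file ID:", file.get("id"))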

I’m trying to work with a bunch of files from a bucket in Google Cloud Platform. These files are mostly in XLSX format (the others are XLS), so this is how I access them: def read_file(bucket_name, file): if file[-4:] == 'xlsx': df_tmp = pd.read_excel(f'gs://{bucket_name}/{file}', engine='openpyxl', header=None) else: df_tmp = pd.read_excel(f'gs://{bucket_name}/{file}', ..

Read more
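A cleaned-up sketch of that read_file helper, under the assumption that pandas resolves the gs:// paths through gcsfs (which must be installed, alongside openpyxl for .xlsx and xlrd for legacy .xls):

import pandas as pd

def read_file(bucket_name, file):
    # pandas hands gs:// URLs to gcsfs, so credentials must be available,
    # e.g. via GOOGLE_APPLICATION_CREDENTIALS
    path = f"gs://{bucket_name}/{file}"
    if file.endswith("xlsx"):
        return pd.read_excel(path, engine="openpyxl", header=None)
    # older .xls workbooks need the xlrd engine instead
    return pd.read_excel(path, engine="xlrd", header=None)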

I am trying to create an external table in BigQuery for a Parquet file that is present in a GCS bucket. But when I run the code below in Airflow, I get the following error: ERROR – 400 POST https://bigquery.googleapis.com/bigquery/v2/projects/project_dev/datasets/dataset_dev/tables?prettyPrint=false: When defining a table with an ExternalDataConfiguration, a schema must be present on either the ..

Read more
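The error quoted above is BigQuery saying that an external table needs either an explicit schema or schema autodetection. A hedged sketch with the google-cloud-bigquery client; the table name and GCS URI are placeholders, and only the project and dataset names come from the excerpt.

from google.cloud import bigquery

client = bigquery.Client(project="project_dev")

external_config = bigquery.ExternalConfig("PARQUET")
external_config.source_uris = ["gs://<bucket>/<path>/*.parquet"]  # placeholder URI
external_config.autodetect = True  # or assign an explicit schema to the table instead

table = bigquery.Table("project_dev.dataset_dev.<table_name>")  # placeholder table name
table.external_data_configuration = external_config
client.create_table(table)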

I have created a PostgreSQL Cloud SQL instance in GCP, and I have created a user and a DB for it. I can connect to it via the cloud_sql_proxy tool: $ cloud_sql_proxy -instances=project_name:REGION:instance_name=tcp:5432 -credential_file=/path/to/key.json I can then successfully connect to the instance via psql and run queries, insert data, etc. on the command line: $ psql "host=127.0.0.1 ..

Read more
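Since the proxy in the excerpt is already listening on 127.0.0.1:5432, connecting from Python is usually just a matter of pointing a Postgres driver at that address. A sketch with psycopg2, where the database, user, and password values are placeholders:

import psycopg2

conn = psycopg2.connect(
    host="127.0.0.1",      # the local cloud_sql_proxy endpoint from the excerpt
    port=5432,
    dbname="<db-name>",
    user="<db-user>",
    password="<db-password>",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())
conn.close()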

I have the following Java code snippet which polls a GCS bucket for the arrival of new files. This is the code I am using for my streaming pipeline, which then applies some transformations and loads the data into a destination. PCollection<String> pcollection = pipeline.apply("Read From streaming source", TextIO.read().from("gs://abc/xyz") .watchForNewFiles(Duration.standardSeconds(10), Watch.Growth.never())); but for ..

Read more
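The snippet above uses the Java SDK's TextIO.watchForNewFiles; for comparison, here is a rough Python-SDK sketch of the same keep-watching-a-GCS-prefix pattern. The glob, the runner options, and the line-splitting step are assumptions; only the 10-second interval is carried over from the excerpt.

import apache_beam as beam
from apache_beam.io import fileio
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # unbounded, continuously matched source

with beam.Pipeline(options=options) as p:
    lines = (
        p
        | "Match new files" >> fileio.MatchContinuously("gs://abc/xyz*", interval=10)
        | "Read matches" >> fileio.ReadMatches()
        | "Split into lines" >> beam.FlatMap(lambda f: f.read_utf8().splitlines())
    )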