Given a point (latitude and longitude), I would like to calculate the distance between that point and the latitudes and longitudes in my file, and filter the data based on that distance. This is my file: <xarray.Dataset> Dimensions: (height: 1, heightv: 1, ff: 3, time: 49, x: 70, y: 61) Coordinates: Lambert_Conformal |S1 ..
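A minimal sketch of one common approach: compute the great-circle (haversine) distance from the target point to every grid cell, then mask with `where`. The dataset here is a small synthetic stand-in for the Lambert-Conformal file, and the names `lat2d`, `lon2d`, `t2m`, and the 300 km threshold are assumptions, not taken from the question.

```python
import numpy as np
import xarray as xr

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between (lat1, lon1) and arrays of points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

# synthetic 2D lat/lon grid standing in for the file's curvilinear coordinates
y = np.arange(5); x = np.arange(6)
lat2d, lon2d = np.meshgrid(50 + y, 10 + x, indexing="ij")
ds = xr.Dataset(
    {"t2m": (("y", "x"), np.random.rand(5, 6))},
    coords={"lat": (("y", "x"), lat2d), "lon": (("y", "x"), lon2d)},
)

dist = haversine_km(52.0, 12.0, ds["lat"], ds["lon"])  # km to every cell
ds_near = ds.where(dist < 300)  # keep only cells within 300 km (NaN elsewhere)
```

`where` keeps the grid shape and fills non-matching cells with NaN; use `.where(..., drop=True)` to shrink the bounding box instead.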
I’m trying to use a bias-correction function that is implemented in R on a gridded dataset in Python. I found an example online that loops over each gridpoint: import pickle import numpy as np import xarray as xr import pandas as pd import matplotlib.pyplot as plt import sys from rpy2.robjects.packages import importr import rpy2.robjects.numpy2ri ..
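A hedged sketch of the per-gridpoint pattern: `xr.apply_ufunc` with `vectorize=True` calls a 1-D function once per (lat, lon) cell, which is where an rpy2 call into the R bias-correction routine would go. `bias_correct` below is a simple stand-in (mean adjustment), not the R code, and all data are synthetic.

```python
import numpy as np
import xarray as xr

def bias_correct(model_1d, obs_1d):
    # stand-in for the real routine; an rpy2 call into the R package
    # would go here, receiving/returning plain 1-D numpy arrays
    return model_1d - model_1d.mean() + obs_1d.mean()

model = xr.DataArray(np.random.rand(10, 3, 4) + 2.0, dims=("time", "lat", "lon"))
obs = xr.DataArray(np.random.rand(10, 3, 4), dims=("time", "lat", "lon"))

# vectorize=True loops the 1-D function over every (lat, lon) cell
corrected = xr.apply_ufunc(
    bias_correct, model, obs,
    input_core_dims=[["time"], ["time"]],
    output_core_dims=[["time"]],
    vectorize=True,
)
```

This replaces an explicit double loop over gridpoints while keeping the per-cell call signature the R function expects.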
So I’m trying to parallelize the process using a Dask cluster. Here is my attempt. Getting the cluster ready: gateway = Gateway( address="http://traefik-pangeo-dask-gateway/services/dask-gateway", public_address="https://pangeo.aer-gitlab.com/services/dask-gateway", auth="jupyterhub", ) options = gateway.cluster_options() options cluster = gateway.new_cluster( cluster_options=options, ) cluster.adapt(minimum=90, maximum=100) client = cluster.get_client() cluster client Then I have a function that loads files from S3, processes them, and ..
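A minimal local sketch of the fan-out pattern, assuming the gateway/client setup above is in place: wrap the per-file function in `dask.delayed` and compute the tasks in one batch. `load_and_process` and `file_keys` are hypothetical stand-ins; on the real cluster the equivalent is submitting the same tasks through the client (e.g. `client.map` / `client.gather`).

```python
import dask

@dask.delayed
def load_and_process(key):
    # stand-in for: download the file from S3, open with xarray, process it
    return len(key)

file_keys = ["s3://bucket/a.nc", "s3://bucket/bb.nc"]  # hypothetical keys
tasks = [load_and_process(k) for k in file_keys]
results = dask.compute(*tasks)  # one task per file, run in parallel
```

With `cluster.adapt(minimum=90, maximum=100)` the scheduler spreads these tasks over the adaptive worker pool automatically.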
I convert a 3D xarray DataArray to an IRIS cube, add the coordinates and attributes, and save to a NetCDF file. The data I save are 3-dimensional (time, lat, lon) temperature or salinity of the ocean. I would like to add 2D (lat, lon) arrays of the mask and the bathymetry to the same NetCDF file ..
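A hedged sketch in plain xarray (rather than IRIS): variables of different dimensionality can share coordinates in one `Dataset` and are written to a single NetCDF file together. All data and names (`temperature`, `mask`, `bathymetry`, `ocean.nc`) are synthetic assumptions.

```python
import numpy as np
import xarray as xr

time = np.arange(4); lat = np.linspace(-5, 5, 3); lon = np.linspace(0, 10, 6)

# 3-D field plus two 2-D fields on the same lat/lon grid
temp = xr.DataArray(np.random.rand(4, 3, 6), dims=("time", "lat", "lon"),
                    coords={"time": time, "lat": lat, "lon": lon},
                    name="temperature")
mask = xr.DataArray(np.ones((3, 6)), dims=("lat", "lon"),
                    coords={"lat": lat, "lon": lon}, name="mask")
bathy = xr.DataArray(np.random.rand(3, 6) * 5000, dims=("lat", "lon"),
                     coords={"lat": lat, "lon": lon}, name="bathymetry")

ds = xr.merge([temp, mask, bathy])
# ds.to_netcdf("ocean.nc")  # all three variables land in the same file
```

The 2-D variables simply omit the `time` dimension; NetCDF has no problem mixing ranks in one file.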
I’m trying to download data from an OPeNDAP link by loading it through xarray, subsetting the data, then saving the files to NetCDF format using the code below: import xarray as xr import numpy as np if __name__ == "__main__": url = 'https://ds.nccs.nasa.gov/thredds/dodsC/bypass/NEX-GDDP/bcsd/rcp85/r1i1p1/tasmin/inmcm4.ncml' lat = 31.02781324 lon = -112.6473323 + 365 with xr.open_dataset(url) as ds: closest_lat ..
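A hedged sketch of the subsetting step: rather than computing `closest_lat` by hand, `ds.sel(..., method="nearest")` picks the closest grid cell directly. A synthetic dataset stands in for the OPeNDAP URL here. Note also that wrapping a negative longitude into the 0–360 convention uses +360; the +365 in the snippet looks like a typo.

```python
import numpy as np
import xarray as xr

# synthetic grid standing in for the remote NEX-GDDP file
lat = np.arange(25.0, 35.0, 0.25)
lon = np.arange(245.0, 255.0, 0.25)
ds = xr.Dataset({"tasmin": (("lat", "lon"), np.random.rand(lat.size, lon.size))},
                coords={"lat": lat, "lon": lon})

# nearest-neighbour selection at the target point (longitude wrapped with +360)
point = ds.sel(lat=31.02781324, lon=-112.6473323 + 360, method="nearest")
# point.to_netcdf("subset.nc")  # then write the subset out
```

`method="nearest"` works on any sorted 1-D coordinate, so the same call works against the real OPeNDAP dataset.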
I need help using xarray. I have a list of points (longitudes, latitudes and dates) for which I need to extract weather data. So far, I have weather_by_loc_time = pd.DataFrame() for i,j in zip(latitude,longitude): dsloc = ds.sel(latitude=i,longitude=j, method='nearest') dot = dsloc.to_dataframe() weather_by_loc_time = weather_by_loc_time.append(dot) which gives me data for the entire time series. If I ..
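A hedged sketch of one fix: passing the date into the same `sel` call (also with `method="nearest"`) returns a single record per point instead of the whole time series. The dataset, point lists, and variable name `t2m` are synthetic assumptions; note too that `DataFrame.append` was removed in pandas 2.x, so the results are collected into a list first.

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2020-01-01", periods=5)
ds = xr.Dataset({"t2m": (("time", "latitude", "longitude"),
                         np.random.rand(5, 4, 4))},
                coords={"time": times,
                        "latitude": np.linspace(40, 43, 4),
                        "longitude": np.linspace(-5, -2, 4)})

latitude = [40.2, 42.9]; longitude = [-4.8, -2.1]
dates = [pd.Timestamp("2020-01-02"), pd.Timestamp("2020-01-04")]

records = []
for i, j, t in zip(latitude, longitude, dates):
    # selecting time as well collapses the result to one record per point
    dsloc = ds.sel(latitude=i, longitude=j, time=t, method="nearest")
    records.append({"latitude": i, "longitude": j, "time": t,
                    "t2m": float(dsloc["t2m"])})
weather_by_loc_time = pd.DataFrame(records)
```

For many points, the vectorized `dims="points"` indexing (as in the next question) avoids the Python loop entirely.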
I have a set of points defined by longitude, latitude and date (YY.MM.DD.HH). I had the idea to use xarray to extract values from a NetCDF file at each point. Using the following… target_lon = xr.DataArray(longitude, dims="points") target_lat = xr.DataArray(latitude, dims="points") db = ds.sel(longitude=target_lon, latitude=target_lat, method="nearest") …I can collect the entire timeseries for each of the points. ..
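A hedged sketch of the natural extension: adding a matching time indexer with the same `"points"` dimension makes `.sel` pick one (lon, lat, time) triple per point instead of the full timeseries. The dataset and point values below are synthetic; only the `target_lon`/`target_lat` pattern comes from the question.

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2021-06-01", periods=8)  # daily steps
ds = xr.Dataset({"temp": (("time", "latitude", "longitude"),
                          np.random.rand(8, 5, 5))},
                coords={"time": times,
                        "latitude": np.linspace(0, 4, 5),
                        "longitude": np.linspace(10, 14, 5)})

longitude = [10.2, 13.7]; latitude = [0.9, 3.1]
dates = pd.to_datetime(["2021-06-02 11:00", "2021-06-05 01:00"])

target_lon = xr.DataArray(longitude, dims="points")
target_lat = xr.DataArray(latitude, dims="points")
target_time = xr.DataArray(dates, dims="points")  # same "points" dimension

db = ds.sel(longitude=target_lon, latitude=target_lat,
            time=target_time, method="nearest")  # one value per point
```

Because all three indexers share the `"points"` dimension, xarray pairs them element-wise (vectorized indexing) rather than taking the outer product.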
I am trying to calculate the correlation coefficient between two datasets along the time dimension. Suppose I have a 3D matrix A (time, lat, lon) and another matrix B (time, lat, lon). r = xr.corr(A, B, dim='time') gives me an output of size (0, lon). See below for the error information. Here are the descriptions of the two matrices: and the other matrix: My main aim ..
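A hedged sketch of the expected behaviour: `xr.corr` aligns its inputs on shared coordinates before reducing, so a (0, lon)-sized result usually means one of the non-time coordinates of A and B does not overlap. With matching coordinates (synthetic data below), the result has dims (lat, lon).

```python
import numpy as np
import xarray as xr

time = np.arange(20)
A = xr.DataArray(np.random.rand(20, 3, 4), dims=("time", "lat", "lon"),
                 coords={"time": time, "lat": np.arange(3), "lon": np.arange(4)})
# B is strongly correlated with A by construction
B = 0.5 * A + 0.1 * xr.DataArray(np.random.rand(20, 3, 4),
                                 dims=A.dims, coords=A.coords)

r = xr.corr(A, B, dim="time")  # Pearson r per cell, dims: (lat, lon)
# if A and B had disjoint coordinates, alignment (an inner join) would
# leave 0 samples along that dimension, giving an empty result like the
# (0, lon) one in the question
```

Checking `A.indexes["lat"].equals(B.indexes["lat"])` (and likewise for lon and time) is a quick way to find the offending coordinate.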
I have a monthly dataset with dimensions (time, depth, y, x), and I am hoping to regroup the data into months (Jan–Dec), number of years in that month, depth, y, and x. Apologies in advance for not having reproducible data… just think of this data set as monthly time steps and spatial ..
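A hedged sketch with synthetic data of two common shapes for this regrouping: `groupby("time.month")` collapses all years into 12 months, while a year/month `MultiIndex` plus `unstack` keeps the year axis, giving the (month, year, depth, y, x)-style layout the question describes.

```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=36, freq="MS")  # 3 years, monthly
da = xr.DataArray(np.random.rand(36, 2, 3, 3),
                  dims=("time", "depth", "y", "x"),
                  coords={"time": time})

# climatology: every year folded into 12 calendar months
clim = da.groupby("time.month").mean("time")

# keep the year axis: split time into (year, month) and unstack
reshaped = (da.assign_coords(year=("time", time.year),
                             month=("time", time.month))
              .set_index(time=["year", "month"])
              .unstack("time"))  # dims now include year and month
```

`reshaped` has separate `year` and `month` dimensions, so e.g. `reshaped.sel(month=1)` is all Januaries stacked by year.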
I have two .nc files, which have similar lat and lon dimensions, CRS and everything else. Dimensions: (lat: 62, lon: 94) Coordinates: * lat (lat) float64 58.46 58.79 59.12 59.44 … 77.45 77.78 78.11 78.44 * lon (lon) float64 -150.0 -149.4 -148.9 -148.3 … -98.56 -98.0 -97.43 When I then add these two files together, ..
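A hedged sketch of what usually goes wrong here: arithmetic between Datasets aligns on the lat/lon coordinates first (an inner join), so if the two files' float coordinates differ even at machine precision, the intersection is empty. Copying one file's coordinates onto the other (after checking they are close) restores exact alignment. The two datasets below are synthetic stand-ins for the .nc files.

```python
import numpy as np
import xarray as xr

lat = np.linspace(58.46, 78.44, 62)
lon = np.linspace(-150.0, -97.43, 94)
a = xr.Dataset({"v": (("lat", "lon"), np.ones((62, 94)))},
               coords={"lat": lat, "lon": lon})
b = xr.Dataset({"v": (("lat", "lon"), np.full((62, 94), 2.0))},
               coords={"lat": lat + 1e-9, "lon": lon})  # near-identical coords

broken = a + b  # inner-join alignment: zero overlapping latitudes

# after verifying np.allclose(a.lat, b.lat), force exact alignment
total = a + b.assign_coords(lat=a.lat, lon=a.lon)
```

If the coordinates are genuinely identical, `a + b` just works; the `assign_coords` step is only needed for round-off mismatches like the one simulated here.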