IO / Conversion#

Dataset methods#

load_dataset(filename_or_obj, **kwargs)

Open, load into memory, and close a Dataset from a file or file-like object.

open_dataset(filename_or_obj, *[, engine, ...])

Open and decode a dataset from a file or file-like object.

open_mfdataset(paths[, chunks, concat_dim, ...])

Open multiple files as a single dataset.

open_zarr(store[, group, synchronizer, ...])

Load and decode a dataset from a Zarr store.

save_mfdataset(datasets, paths[, mode, ...])

Write multiple datasets to disk as netCDF files simultaneously.
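The open/save functions above round-trip datasets through files. A minimal sketch, assuming the `scipy` backend is available (it only supports the classic netCDF3 format, so all variable names and values here are illustrative):

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Build a small in-memory dataset (names and values are illustrative).
ds = xr.Dataset(
    {"temperature": (("x",), np.array([10.0, 11.5, 12.0]))},
    coords={"x": [0.0, 1.0, 2.0]},
)

# Round-trip through a netCDF file; load_dataset opens, reads everything
# into memory, and closes the file handle in one call.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "example.nc")
    ds.to_netcdf(path, engine="scipy")
    loaded = xr.load_dataset(path, engine="scipy")

assert float(loaded["temperature"].sum()) == 33.5
```

`open_dataset` differs from `load_dataset` in that it keeps the file open and reads lazily, which matters for files larger than memory.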

Dataset.as_numpy()

Coerces wrapped data and coordinates into numpy arrays, returning a Dataset.

Dataset.from_dataframe(dataframe[, sparse])

Convert a pandas.DataFrame into an xarray.Dataset.

Dataset.from_dict(d)

Convert a dictionary into an xarray.Dataset.

Dataset.to_dataarray([dim, name])

Convert this dataset into an xarray.DataArray.

Dataset.to_dataframe([dim_order])

Convert this dataset into a pandas.DataFrame.

Dataset.to_dask_dataframe([dim_order, set_index])

Convert this dataset into a dask.dataframe.DataFrame.

Dataset.to_dict([data, encoding])

Convert this dataset to a dictionary following xarray naming conventions.

Dataset.to_netcdf([path, mode, format, ...])

Write dataset contents to a netCDF file.

Dataset.to_pandas()

Convert this dataset into a pandas object without changing the number of dimensions.

Dataset.to_zarr([store, chunk_store, mode, ...])

Write dataset contents to a zarr group.
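The `to_*`/`from_*` pairs above are designed to round-trip. A short sketch of the pandas and dict conversions (variable and coordinate names are illustrative):

```python
import numpy as np
import pandas as pd
import xarray as xr

# A small dataset to round-trip (names are illustrative).
ds = xr.Dataset(
    {"pressure": (("time",), np.array([1000.0, 998.5]))},
    coords={"time": pd.to_datetime(["2024-01-01", "2024-01-02"])},
)

# Dataset <-> pandas.DataFrame: the frame is indexed by the dataset's dims.
df = ds.to_dataframe()
roundtrip = xr.Dataset.from_dataframe(df)
assert roundtrip["pressure"].equals(ds["pressure"])

# Dataset <-> dict: to_dict/from_dict preserve dims, coords, and attrs,
# which makes the dict form suitable for JSON-style serialization.
assert xr.Dataset.from_dict(ds.to_dict()).identical(ds)
```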

Dataset.chunk([chunks, name_prefix, token, ...])

Coerce all arrays in this dataset into dask arrays with the given chunks.

Dataset.close()

Release any resources linked to this object.

Dataset.compute(**kwargs)

Manually trigger loading and/or computation of this dataset's data from disk or a remote source into memory and return a new dataset.

Dataset.filter_by_attrs(**kwargs)

Returns a Dataset with variables that match specific conditions.
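For example, `filter_by_attrs` selects variables by attribute value; the CF-style attribute names below are illustrative:

```python
import numpy as np
import xarray as xr

# Two variables tagged with attributes (names and values are illustrative).
ds = xr.Dataset(
    {
        "t2m": (("x",), np.zeros(3), {"standard_name": "air_temperature"}),
        "u10": (("x",), np.ones(3), {"standard_name": "eastward_wind"}),
    }
)

# Keep only the variables whose attributes match the given keyword.
temps = ds.filter_by_attrs(standard_name="air_temperature")
assert list(temps.data_vars) == ["t2m"]
```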

Dataset.info([buf])

Concise summary of a Dataset's variables and attributes.

Dataset.load(**kwargs)

Manually trigger loading and/or computation of this dataset's data from disk or a remote source into memory and return this dataset.

Dataset.persist(**kwargs)

Trigger computation, keeping data as chunked arrays.

Dataset.unify_chunks()

Unify chunk size along all chunked dimensions of this Dataset.

DataArray methods#

load_dataarray(filename_or_obj, **kwargs)

Open, load into memory, and close a DataArray from a file or file-like object containing a single data variable.

open_dataarray(filename_or_obj, *[, engine, ...])

Open a DataArray from a file or file-like object containing a single data variable.

DataArray.as_numpy()

Coerces wrapped data and coordinates into numpy arrays, returning a DataArray.

DataArray.from_dict(d)

Convert a dictionary into an xarray.DataArray.

DataArray.from_iris(cube)

Convert an iris.cube.Cube into an xarray.DataArray.

DataArray.from_series(series[, sparse])

Convert a pandas.Series into an xarray.DataArray.

DataArray.to_dask_dataframe([dim_order, ...])

Convert this array into a dask.dataframe.DataFrame.

DataArray.to_dataframe([name, dim_order])

Convert this array and its coordinates into a tidy pandas.DataFrame.

DataArray.to_dataset([dim, name, promote_attrs])

Convert a DataArray to a Dataset.

DataArray.to_dict([data, encoding])

Convert this xarray.DataArray into a dictionary following xarray naming conventions.

DataArray.to_index()

Convert this variable to a pandas.Index.

DataArray.to_iris()

Convert this array into an iris.cube.Cube.

DataArray.to_masked_array([copy])

Convert this array into a numpy.ma.MaskedArray.

DataArray.to_netcdf([path, mode, format, ...])

Write DataArray contents to a netCDF file.

DataArray.to_numpy()

Coerces wrapped data to numpy and returns a numpy.ndarray.

DataArray.to_pandas()

Convert this array into a pandas object with the same shape.

DataArray.to_series()

Convert this array into a pandas.Series.
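The pandas conversions on DataArray mirror those on Dataset. A minimal sketch of the `to_series`/`from_series` round-trip (names are illustrative):

```python
import numpy as np
import pandas as pd
import xarray as xr

# A labeled 1-D array (names are illustrative).
da = xr.DataArray(
    np.array([1.5, 2.5, 3.5]),
    dims="site",
    coords={"site": ["a", "b", "c"]},
    name="depth",
)

# DataArray <-> pandas.Series: the series index carries the coordinates.
s = da.to_series()
assert isinstance(s, pd.Series)
assert xr.DataArray.from_series(s).identical(da)

# to_index exposes a 1-D coordinate's labels as a pandas.Index.
assert list(da["site"].to_index()) == ["a", "b", "c"]
```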

DataArray.to_zarr([store, chunk_store, ...])

Write DataArray contents to a Zarr store.

DataArray.chunk([chunks, name_prefix, ...])

Coerce this array's data into a dask array with the given chunks.

DataArray.close()

Release any resources linked to this object.

DataArray.compute(**kwargs)

Manually trigger loading of this array's data from disk or a remote source into memory and return a new array.

DataArray.persist(**kwargs)

Trigger computation in constituent dask arrays.

DataArray.load(**kwargs)

Manually trigger loading of this array's data from disk or a remote source into memory and return this array.

DataArray.unify_chunks()

Unify chunk size along all chunked dimensions of this DataArray.

DataTree methods#

open_datatree(filename_or_obj, *[, engine, ...])

Open and decode a DataTree from a file or file-like object, creating one tree node for each group in the file.

open_groups(filename_or_obj, *[, engine, ...])

Open and decode a file or file-like object, creating a dictionary containing one xarray Dataset for each group in the file.

DataTree.to_dict([relative])

Create a dictionary mapping of paths to the data contained in those nodes.

DataTree.to_netcdf(filepath[, mode, ...])

Write datatree contents to a netCDF file.

DataTree.to_zarr(store[, mode, encoding, ...])

Write datatree contents to a Zarr store.

DataTree.chunk([chunks, name_prefix, token, ...])

Coerce all arrays in all groups in this tree into dask arrays with the given chunks.

DataTree.load(**kwargs)

Manually trigger loading and/or computation of this datatree's data from disk or a remote source into memory and return this datatree.

DataTree.compute(**kwargs)

Manually trigger loading and/or computation of this datatree's data from disk or a remote source into memory and return a new datatree.

DataTree.persist(**kwargs)

Trigger computation, keeping data as chunked arrays.