LoaderDumper
Read and write data in various formats.
This package provides IO modules that always contain three components:
- A dump() function that writes out the data and its corresponding Loader object.
- A Loader class that can load the data back into memory.
- A check() method that checks whether an object can be saved with this dumper (see the sketch after this list).
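A minimal sketch of such a dumper module is given below. The module name, the pickle-based storage, the obj.pickle file name, and the check() criterion are illustrative assumptions and do not correspond to any of the concrete modules listed further down.

# my_dumper.py - illustrative sketch of the three components of a dumper module.
# The pickle-based storage, the file name, and the check() criterion are assumptions.
import os
import pickle


def check(obj):
    """Return True if this dumper can save the object (assumed criterion)."""
    return isinstance(obj, dict)


def dump(obj, savedir):
    """Write the object to savedir in this dumper's format."""
    with open(os.path.join(savedir, "obj.pickle"), "wb") as f:
        pickle.dump(obj, f)


class Loader:
    """Reads the dumped data back into memory."""

    def get(self, savedir):
        # Assumed method name and file layout; real loaders define their own.
        with open(os.path.join(savedir, "obj.pickle"), "rb") as f:
            return pickle.load(f)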
To save an object, call the dump() function:
>>> import my_dumper
>>> my_dumper.dump(obj, savedir)
This saves the object in the format implemented by the respective dump() function.
In addition, a Loader.json file is saved alongside the data.
This file contains the specification of a Loader object,
which can then be initialized and provides all the mechanisms to load the object back into memory.
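As a round-trip illustration, loading typically goes through the package's standard load interface listed under Functions below; the function name and signature in this sketch are assumptions, not a confirmed API:

>>> import data_base.IO.LoaderDumper as LoaderDumper
>>> obj = LoaderDumper.load(savedir)  # assumed call; reads Loader.json and delegates to the Loader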
Functions

- Resolve a loader path to an absolute path.
- Standard interface to load data.
- Convert a dumper submodule to a string.
- Get the dumper string from a filepath.
Modules

- Save and load
- Save and load dask dataframes to msgpack with categorical columns.
- Save and load dask dataframes to msgpack.
- Save and load dask dataframes to and from Apache parquet format.
- Create a folder and return it as a ManagedFolder object.
- Create and load
- Read and write dataframe meta.
- Read and write numpy arrays to msgpack files.
- Read and write a numpy array to
- Read and write a numpy array to the compressed
- Read and write a numpy array to the
- Save and load pandas dataframes to msgpack files.
- Read and write a pandas DataFrame to the parquet format.
- Read and write a pandas DataFrame to the pickle format.
- Base class for child Loader classes.
- Read and write a
- Read and write numpy arrays to and from shared memory.
- Read and write an object to the cloudpickle format.
- Read and write objects to the msgpack format.
- Read and write objects to the pickle format.
- Convenience methods for data IO.