Commit

Merge pull request #94 from pastas/dev
Release v1.2.0
dbrakenhoff authored May 5, 2023
2 parents 28ea9f2 + 349bd6e commit b6ef41e
Showing 21 changed files with 512 additions and 205 deletions.
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -59,6 +59,7 @@ jobs:
[
"git+https://github.com/pastas/[email protected]",
"git+https://github.com/pastas/[email protected]",
"git+https://github.com/pastas/[email protected]",
"git+https://github.com/pastas/pastas.git@dev",
]

10 changes: 6 additions & 4 deletions .readthedocs.yml
@@ -5,6 +5,12 @@
# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.9"

# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py
@@ -14,12 +20,8 @@ formats: all

# Optionally set the version of Python and requirements required to build your docs
python:
  version: "3.8"
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs

build:
  image: latest
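For context, this change follows the Read the Docs migration from the deprecated `python.version` and `build.image` keys to the newer `build.os` plus `build.tools` scheme. A minimal config in the new style, assembled from the pieces shown in this diff, might look like the following sketch (not the complete file):

```yaml
version: 2

# Python version now lives under build.tools instead of python.version
build:
  os: ubuntu-22.04
  tools:
    python: "3.9"

sphinx:
  configuration: docs/conf.py

python:
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs
```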
22 changes: 22 additions & 0 deletions docs/connectors.rst
@@ -41,6 +41,28 @@ time series are stored as JSON files. Models are stored as JSON as well but
*do not* contain the time series themselves. These are picked up from
the other directories when the model is loaded from the database.

ArcticDB
--------
Note: this Connector uses ArcticDB, the next-generation version of Arctic, and requires the arcticdb Python package.

The :ref:`ArcticDBConnector` is an object that creates a local database,
or connects to an existing one. For each of the datasets a library is
created. These are named using the following convention:
`<database name>.<library name>`.

The ArcticDB implementation uses the following structure:

.. code-block::

    +-- database
    |   +-- libraries (i.e. oseries, stresses, models)
    |   |   +-- items... (i.e. individual time series or models)

The data is stored within these libraries. Observations and stresses time series
are stored as pandas.DataFrames. Models are stored as pickled dictionaries
and *do not* contain the time series themselves. These are picked up from
the other libraries when the model is loaded from the database.

Arctic
------
Note: this Connector is not actively tested!
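The `<database name>.<library name>` naming convention added to the docs above can be sketched in a few lines. This is a minimal illustration only; the `library_name` helper is a hypothetical name, not part of the pastastore API:

```python
def library_name(database: str, library: str) -> str:
    # ArcticDB libraries are namespaced per database:
    # "<database name>.<library name>"
    return ".".join([database, library])


# one library per dataset type, as in the ArcticDBConnector docs
names = [library_name("my_db", lib) for lib in ("oseries", "stresses", "models")]
print(names)  # ['my_db.oseries', 'my_db.stresses', 'my_db.models']
```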
26 changes: 21 additions & 5 deletions docs/examples.rst
@@ -22,7 +22,7 @@ is stored in-memory (in dictionaries)::
conn = pst.DictConnector("my_db")

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)
store = pst.PastaStore(conn)


Using Pastas
@@ -35,11 +35,27 @@ that writes data to disk as no external dependencies are required::
import pastastore as pst

# define pas connector
path = "./data/pas"
path = "./data/pastas_db"
conn = pst.PasConnector("my_db", path)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)
store = pst.PastaStore(conn)


Using ArcticDB
--------------

The following snippet shows how to create an `ArcticDBConnector` and initialize
a `PastaStore` object::

import pastastore as pst

# define arctic connector
uri = "lmdb://./my_path_here/"
conn = pst.ArcticDBConnector("my_db", uri)

# create project for managing Pastas data and models
store = pst.PastaStore(conn)


Using Arctic
@@ -56,7 +72,7 @@ this to work::
conn = pst.ArcticConnector("my_db", connstr)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)
store = pst.PastaStore(conn)


Using Pystore
@@ -71,7 +87,7 @@ connection string to a database::
conn = pst.PystoreConnector("my_db", path)

# create project for managing Pastas data and models
store = pst.PastasProject("my_project", conn)
store = pst.PastasProject(conn)


The PastaStore object
2 changes: 1 addition & 1 deletion docs/getting_started.rst
@@ -25,7 +25,7 @@ Start Python and import the module::

import pastastore as pst
conn = pst.DictConnector("my_connector")
store = pst.PastaStore("my_store", conn)
store = pst.PastaStore(conn)

See the :ref:`examples` section for some quick examples on how to get started.

9 changes: 9 additions & 0 deletions docs/modules.rst
@@ -32,6 +32,15 @@ PasConnector
:private-members:
:show-inheritance:

ArcticDBConnector
^^^^^^^^^^^^^^^^^

.. autoclass:: pastastore.ArcticDBConnector
:members:
:undoc-members:
:private-members:
:show-inheritance:

ArcticConnector
^^^^^^^^^^^^^^^

1 change: 1 addition & 0 deletions docs/utils.rst
@@ -12,6 +12,7 @@ all its contents or copying all data to a new database:
* :meth:`pastastore.util.delete_pas_connector`
* :meth:`pastastore.util.delete_pystore_connector`
* :meth:`pastastore.util.delete_arctic_connector`
* :meth:`pastastore.util.delete_arcticdb_connector`
* :meth:`pastastore.util.copy_database`


8 changes: 7 additions & 1 deletion pastastore/__init__.py
@@ -1,4 +1,10 @@
from . import connectors, util
from .connectors import ArcticConnector, DictConnector, PasConnector, PystoreConnector
from .connectors import (
    ArcticConnector,
    ArcticDBConnector,
    DictConnector,
    PasConnector,
    PystoreConnector,
)
from .store import PastaStore
from .version import __version__
181 changes: 181 additions & 0 deletions pastastore/connectors.py
@@ -198,6 +198,187 @@ def oseries_with_models(self):
        return self._get_library("oseries_models").list_symbols()


class ArcticDBConnector(BaseConnector, ConnectorUtil):
    conn_type = "arcticdb"

    def __init__(self, name: str, uri: str):
        """Create an ArcticDBConnector object using ArcticDB to store data.

        Parameters
        ----------
        name : str
            name of the database
        uri : str
            URI connection string (e.g. 'lmdb://<your path here>')
        """
        try:
            import arcticdb
        except ModuleNotFoundError as e:
            print("Please install arcticdb with `pip install arcticdb`!")
            raise e
        self.uri = uri
        self.name = name

        self.libs: dict = {}
        self.arc = arcticdb.Arctic(uri)
        self._initialize()
        self.models = ModelAccessor(self)
        # for older versions of PastaStore, if oseries_models library is empty
        # populate oseries - models database
        self._update_all_oseries_model_links()

    def _initialize(self) -> None:
        """Internal method to initialize the libraries."""
        for libname in self._default_library_names:
            if self._library_name(libname) not in self.arc.list_libraries():
                self.arc.create_library(self._library_name(libname))
            else:
                print(
                    f"ArcticDBConnector: library "
                    f"'{self._library_name(libname)}'"
                    " already exists. Linking to existing library."
                )
            self.libs[libname] = self._get_library(libname)

    def _library_name(self, libname: str) -> str:
        """Internal method to get full library name according to ArcticDB."""
        return ".".join([self.name, libname])

    def _get_library(self, libname: str):
        """Get ArcticDB library handle.

        Parameters
        ----------
        libname : str
            name of the library

        Returns
        -------
        lib : arcticdb.Library handle
            handle to the library
        """
        # get library handle
        lib = self.arc.get_library(self._library_name(libname))
        return lib

    def _add_item(
        self,
        libname: str,
        item: Union[FrameorSeriesUnion, Dict],
        name: str,
        metadata: Optional[Dict] = None,
        **_,
    ) -> None:
        """Internal method to add item to library (time series or model).

        Parameters
        ----------
        libname : str
            name of the library
        item : Union[FrameorSeriesUnion, Dict]
            item to add, either time series or pastas.Model as dictionary
        name : str
            name of the item
        metadata : Optional[Dict], optional
            dictionary containing metadata, by default None
        """
        lib = self._get_library(libname)
        # only normalizable datatypes can be written with write, else use
        # write_pickle; normalizable: Series, DataFrames, Numpy Arrays
        if isinstance(item, (dict, list)):
            lib.write_pickle(name, item, metadata=metadata)
        else:
            lib.write(name, item, metadata=metadata)

    def _get_item(self, libname: str, name: str) -> Union[FrameorSeriesUnion, Dict]:
        """Internal method to retrieve item from library.

        Parameters
        ----------
        libname : str
            name of the library
        name : str
            name of the item

        Returns
        -------
        item : Union[FrameorSeriesUnion, Dict]
            time series or model dictionary
        """
        lib = self._get_library(libname)
        return lib.read(name).data

    def _del_item(self, libname: str, name: str) -> None:
        """Internal method to delete items (series or models).

        Parameters
        ----------
        libname : str
            name of library to delete item from
        name : str
            name of item to delete
        """
        lib = self._get_library(libname)
        lib.delete(name)

    def _get_metadata(self, libname: str, name: str) -> dict:
        """Internal method to retrieve metadata for an item.

        Parameters
        ----------
        libname : str
            name of the library
        name : str
            name of the item

        Returns
        -------
        dict
            dictionary containing metadata
        """
        lib = self._get_library(libname)
        return lib.read_metadata(name).metadata

    @property
    def oseries_names(self):
        """List of oseries names.

        Returns
        -------
        list
            list of oseries in library
        """
        return self._get_library("oseries").list_symbols()

    @property
    def stresses_names(self):
        """List of stresses names.

        Returns
        -------
        list
            list of stresses in library
        """
        return self._get_library("stresses").list_symbols()

    @property
    def model_names(self):
        """List of model names.

        Returns
        -------
        list
            list of models in library
        """
        return self._get_library("models").list_symbols()

    @property
    def oseries_with_models(self):
        """List of oseries with models."""
        return self._get_library("oseries_models").list_symbols()


class PystoreConnector(BaseConnector, ConnectorUtil):  # pragma: no cover
    conn_type = "pystore"

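The `_add_item` dispatch added above (normalizable types via `write`, dicts and lists via `write_pickle`) can be illustrated with a small stand-in. `FakeLibrary` and `add_item` are hypothetical names for this sketch, not part of the pastastore or arcticdb APIs:

```python
class FakeLibrary:
    """Stand-in for an arcticdb Library, recording which write path was used."""

    def __init__(self):
        self.normalized = {}  # items stored via write()
        self.pickled = {}     # items stored via write_pickle()

    def write(self, name, item, metadata=None):
        self.normalized[name] = (item, metadata)

    def write_pickle(self, name, item, metadata=None):
        self.pickled[name] = (item, metadata)


def add_item(lib, name, item, metadata=None):
    # Mirrors the dispatch in _add_item: only normalizable types (Series,
    # DataFrames, numpy arrays) go through write(); dicts and lists
    # (e.g. pickled model dictionaries) must use write_pickle().
    if isinstance(item, (dict, list)):
        lib.write_pickle(name, item, metadata=metadata)
    else:
        lib.write(name, item, metadata=metadata)


lib = FakeLibrary()
add_item(lib, "ml", {"name": "ml", "parameters": {}})  # model dict -> pickled
add_item(lib, "obs", "a-normalizable-item")            # series-like -> write
```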
5 changes: 4 additions & 1 deletion pastastore/datasets.py
@@ -176,7 +176,7 @@ def _default_connector(conntype: str):
    ----------
    conntype : str
        name of connector (DictConnector, PasConnector,
        ArcticConnector or PystoreConnector)
        ArcticConnector, ArcticDBConnector or PystoreConnector)

    Returns
    -------
@@ -187,6 +187,9 @@ def _default_connector(conntype: str):
    if Conn.conn_type == "arctic":
        connstr = "mongodb://localhost:27017/"
        conn = Conn("my_db", connstr)
    elif Conn.conn_type == "arcticdb":
        uri = "lmdb://./arctic_db"
        conn = Conn("my_db", uri)
    elif Conn.conn_type == "pystore":
        conn = Conn("my_db", "./pystore_db")
    elif Conn.conn_type == "dict":
9 changes: 6 additions & 3 deletions pastastore/plotting.py
@@ -587,9 +587,12 @@ def stresses(

        mask0 = (stresses["x"] != 0.0) | (stresses["y"] != 0.0)

        c = stresses.loc[mask0, "kind"]
        kind_to_color = {k: f"C{i}" for i, k in enumerate(c.unique())}
        c = c.apply(lambda k: kind_to_color[k])
        if "c" in kwargs:
            c = kwargs.pop("c", None)
        else:
            c = stresses.loc[mask0, "kind"]
            kind_to_color = {k: f"C{i}" for i, k in enumerate(c.unique())}
            c = c.apply(lambda k: kind_to_color[k])

        r = self._plotmap_dataframe(stresses.loc[mask0], c=c, figsize=figsize, **kwargs)
        if "ax" in kwargs:
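The plotting change above lets a caller-supplied `c` keyword override the default color-per-kind mapping. The selection logic can be sketched in isolation; `resolve_colors` is a hypothetical helper name, and the real method works on a pandas Series rather than a plain list:

```python
def resolve_colors(kwargs, kinds):
    # Caller-supplied "c" wins; otherwise map each stress kind to a
    # matplotlib default-cycle color ("C0", "C1", ...), as in the new code.
    if "c" in kwargs:
        return kwargs.pop("c")
    # dict.fromkeys preserves first-seen order of the unique kinds
    kind_to_color = {k: f"C{i}" for i, k in enumerate(dict.fromkeys(kinds))}
    return [kind_to_color[k] for k in kinds]


print(resolve_colors({}, ["prec", "evap", "prec"]))    # ['C0', 'C1', 'C0']
print(resolve_colors({"c": "red"}, ["prec", "evap"]))  # red
```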