Merge with main branch

rcboufleur committed Jan 27, 2025
2 parents 5d2cfc7 + 323ad56 commit 181364d
Showing 27 changed files with 1,872 additions and 1,089 deletions.
39 changes: 25 additions & 14 deletions .github/workflows/build.yaml
@@ -15,22 +15,33 @@ jobs:
contents: read

steps:
- name: Checkout
uses: actions/checkout@v4
- name: Checkout
uses: actions/checkout@v4

- name: Build hinfo
uses: lsst-sqre/build-and-push-to-ghcr@v1
with:
image: ${{ github.repository }}-hinfo
github_token: ${{ secrets.GITHUB_TOKEN }}
dockerfile: Dockerfile.hinfo
- name: Extract Git Tag or Branch
id: git_info
run: |
if [[ "${{ github.ref_type }}" == "tag" ]]; then
echo "GITHUB_TAG=${{ github.ref_name }}" >> $GITHUB_ENV
else
echo "GITHUB_TAG=noversion" >> $GITHUB_ENV
fi
- name: Build pqserver
uses: lsst-sqre/build-and-push-to-ghcr@v1
with:
image: ${{ github.repository }}-pq
github_token: ${{ secrets.GITHUB_TOKEN }}
dockerfile: Dockerfile.pqserver
- name: Build hinfo
uses: lsst-sqre/build-and-push-to-ghcr@v1
with:
image: ${{ github.repository }}-hinfo
github_token: ${{ secrets.GITHUB_TOKEN }}
dockerfile: Dockerfile.hinfo

- name: Build pqserver
uses: lsst-sqre/build-and-push-to-ghcr@v1
with:
image: ${{ github.repository }}-pq
github_token: ${{ secrets.GITHUB_TOKEN }}
dockerfile: Dockerfile.pqserver
build-args: |
GITHUB_TAG=${{ env.GITHUB_TAG }}
- name: Build efdtransform
uses: lsst-sqre/build-and-push-to-ghcr@v1
20 changes: 19 additions & 1 deletion .github/workflows/pytest.yaml
@@ -20,5 +20,23 @@ jobs:
- name: Build docker image
run: docker build -f Dockerfile.pytest -t pytest_image .

- name: Create directory for coverage report
run: mkdir -p ${{ github.workspace }}/pytest_reports

- name: Set coverage report directory permissions
run: chmod -R 777 ${{ github.workspace }}/pytest_reports

- name: Run docker image
run: docker run --rm -e PYTEST_ADDOPTS="--color=yes" pytest_image
run: docker run --rm -v ${{ github.workspace }}/pytest_reports:/home/lsst/consdb/pytest_reports -e PYTEST_ADDOPTS="--color=yes" pytest_image

- name: Upload test report
uses: actions/upload-artifact@v4
with:
name: pytest_report
path: pytest_reports/pytest_report.html

- name: Upload coverage report
uses: actions/upload-artifact@v4
with:
name: coverage_report
path: pytest_reports/htmlcov
19 changes: 18 additions & 1 deletion Dockerfile.pqserver
@@ -1,7 +1,24 @@
FROM python:3.11

ARG GITHUB_TAG
ENV VERSION=${GITHUB_TAG}

RUN pip install fastapi safir astropy uvicorn gunicorn sqlalchemy psycopg2
WORKDIR /
COPY python/lsst/consdb/__init__.py python/lsst/consdb/pqserver.py python/lsst/consdb/utils.py /consdb_pq/
COPY \
python/lsst/consdb/__init__.py \
python/lsst/consdb/pqserver.py \
python/lsst/consdb/cdb_schema.py \
python/lsst/consdb/config.py \
python/lsst/consdb/dependencies.py \
python/lsst/consdb/exceptions.py \
python/lsst/consdb/models.py \
/consdb_pq/
COPY \
python/lsst/consdb/handlers/external.py \
python/lsst/consdb/handlers/internal.py \
/consdb_pq/handlers/

# Environment variables that must be set:
# DB_HOST DB_PASS DB_USER DB_NAME or POSTGRES_URL

8 changes: 6 additions & 2 deletions Dockerfile.pytest
@@ -7,12 +7,16 @@ RUN yum install -y postgresql-server postgresql && rmdir /usr/local/bin && ln -s

USER lsst
RUN source loadLSST.bash && mamba install -y aiokafka httpx
RUN source loadLSST.bash && pip install kafkit aiokafka httpx pytest-asyncio testing.postgresql lsst-felis
RUN source loadLSST.bash && pip install kafkit aiokafka httpx pytest-asyncio pytest-cov pytest-html testing.postgresql lsst-felis safir

WORKDIR /home/lsst/

COPY --chown=lsst . ./consdb/
WORKDIR /home/lsst/consdb/
RUN source /opt/lsst/software/stack/loadLSST.bash && pip install -e .

ENTRYPOINT [ "/bin/bash", "-c", "source /opt/lsst/software/stack/loadLSST.bash; setup obs_lsst; setup felis; pytest ." ]
USER root
RUN mkdir -p /home/lsst/consdb/pytest_reports && chown lsst:lsst /home/lsst/consdb/pytest_reports
USER lsst

ENTRYPOINT [ "/bin/bash", "-c", "source /opt/lsst/software/stack/loadLSST.bash; setup obs_lsst; setup felis; pytest --cov=./ --cov-report=html:/home/lsst/consdb/pytest_reports/htmlcov --html=pytest_reports/pytest_report.html --self-contained-html ." ]
15 changes: 10 additions & 5 deletions alembic-autogenerate.py
@@ -2,14 +2,19 @@

#
# How to use this script:
# 1. Load the LSST environment and setup sdm_schemas and felis.
# source loadLSST.bash
# setup felis
# setup -r /path/to/sdm_schemas
# 1. Install required packages and sdm_schemas, set environment variables:
# pip install lsst-felis testing.postgresql alembic sqlalchemy pyyaml \
# black psycopg2-binary
# git clone https://github.com/lsst/sdm_schemas
# cd sdm_schemas
#      export SDM_SCHEMAS_DIR=`pwd`
# 2. From the root of the consdb git repo, invoke the script. Supply a
# revision message as the command line argument:
# python alembic-autogenerate.py DM-12345
# python alembic-autogenerate.py this is my revision message "\n" \
# the message can span multiple lines "\n" \
# if desired
# 3. Revise your auto-generated code as needed.
# 4. Remove the autogenerated creation of sql views (visit1, ccdvisit1).
#

import os
2 changes: 2 additions & 0 deletions doc/.gitignore
@@ -8,3 +8,5 @@ doxygen.conf
# Sphinx products
_build
py-api

*.DS_Store
2 changes: 1 addition & 1 deletion doc/contributor-guide/adding-columns.rst
@@ -7,7 +7,7 @@ Structure

- ConsDB content must relate to exposures or visits or observations structured like exposures. General time series should go in the Engineering and Facilities Database (EFD).
- ConsDB content should generally be scalar values. Large amounts of data, especially arrays or images or cubes, should generally go into the Large File Annex (LFA).
- Avoid arrays expressed as individual columns (e.g. ``something0``, ``something1``, ``something2``) where possible, as this increases the number of columns drastically (and there is `a limit <https://www.postgresql.org/docs/current/limits.html>`_), makes it hard to query (``SELECT`` clauses need to list all of these individually, and ``WHERE`` clauses may need to include large ``OR`` or ``AND`` conditions), and potentially requires a lot of database storage space.
- Avoid arrays expressed as individual columns (e.g. ``something0``, ``something1``, ``something2``) where possible, as this increases the number of columns drastically (and there is `a limit <https://www.postgresql.org/docs/current/limits.html>`__), makes it hard to query (``SELECT`` clauses need to list all of these individually, and ``WHERE`` clauses may need to include large ``OR`` or ``AND`` conditions), and potentially requires a lot of database storage space.
- Columns should be named in all lowercase with underscore (``_``) separators, also known as "snake_case".

Data sources
2 changes: 1 addition & 1 deletion doc/developer-guide/consdbclient-summit-utils.rst
@@ -2,4 +2,4 @@
ConsDbClient in summit_utils
############################

How to write and test code in summit_utils for ConsDbClient
How to write and test code in summit_utils for ConsDbClient
4 changes: 2 additions & 2 deletions doc/index.rst
@@ -3,8 +3,8 @@
ConsDB
======

``lsst.consdb`` is developed at https://github.com/lsst-dm/consdb.
You can find Jira issues for this module under the `consdb <https://jira.lsstcorp.org/issues/?jql=project%20%3D%20DM%20AND%20component%20%3D%20consdb>`_ component.
``lsst.consdb`` is developed at `https://github.com/lsst-dm/consdb <https://github.com/lsst-dm/consdb>`__.
You can find Jira issues for this module under the `ConsDB <https://jira.lsstcorp.org/issues/?jql=project%20%3D%20DM%20AND%20component%20%3D%20consdb>`__ component.

#############
Documentation
86 changes: 84 additions & 2 deletions doc/operator-guide/deployment.rst
@@ -2,5 +2,87 @@
Deployment
###########

* Database
* REST API Server
Database
========

Deployments of the Consolidated Database are currently located at:

- Summit
- USDF (and USDF dev, which use the same underlying database, a replica of the Summit database)
- Base Test Stand (BTS)
- Tucson Test Stand (TTS)

Updates to these deployments may be needed when there are edits to the schema for any of the ``cdb_*`` tables defined in `sdm_schemas <https://github.com/lsst/sdm_schemas>`__.

Tools:
------

- Argo-CD
- LOVE
- Felis

Repositories:
-------------

- `phalanx <https://github.com/lsst-sqre/phalanx>`__
- `sdm_schemas <https://github.com/lsst/sdm_schemas>`__
- `consdb <https://github.com/lsst-dm/consdb>`__

Access needed:
--------------

- NOIRLab VPN
- Summit VPN
- USDF

Process:
--------


Deploy code to populate db at Summit and/or USDF
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Follow the testing steps above to test the Alembic migration and code at TTS/BTS before you consider deploying at the summit.

The steps to deploy at the summit mirror the test stand steps, but require coordination with, and permission from, the observers and site teams.
Access to Argo-CD deployments is available via the Summit OpenVPN.
To coordinate your deployment update on the summit, you must attend the Coordination Activities Planning (CAP) meeting on Tuesday mornings and announce your request.

Add your migration intentions to the `CAP SITCOM Confluence agenda <https://rubinobs.atlassian.net/wiki/spaces/LSSTCOM/pages/53765933/Agenda+Items+for+Future+CAP+Meetings>`__.

The CAP members may give you a time frame in which it is acceptable to perform these changes.

They may also name specific people to coordinate with, who can help you take images to test the LATISS and LSSTCOMCAMSIM tables. More tables will need testing eventually.

Channels to note: #rubinobs-test-planning, #summit-announce, and #summit-auxtel; see also the `channel usage guide <https://obs-ops.lsst.io/Communications/slack-channel-usage.html>`__.

When you have final approval and a designated time to perform the changes to ConsDB, announce the work on #summit-announce and follow steps similar to the test stand procedure above.

USDF Deployment Steps
^^^^^^^^^^^^^^^^^^^^^

These steps must happen in synchrony with a Summit migration.

1. Disable (pause) the replication SUBSCRIPTION at USDF (see the sketch below).
2. Perform the migration at the summit with the steps below.
3. Connect to the USDF database via psql and perform the Alembic migration.
4. Check or test as agreed upon with the ConsDB team.
5. Enable and refresh the SUBSCRIPTION at USDF.

If no impact on the Summit, or coordination with it, is needed: run the Alembic migration at USDF and test as appropriate.
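
The subscription handling in steps 1 and 5 uses standard PostgreSQL logical-replication commands. A minimal sketch, assuming the USDF subscription is named ``consdb_sub`` and the connection URL is in ``$POSTGRES_URL`` (both are placeholders, not the actual deployment values):

.. code-block:: bash

   # Pause replication from the Summit before the migration (step 1).
   psql "$POSTGRES_URL" -c "ALTER SUBSCRIPTION consdb_sub DISABLE;"

   # ... perform the Summit and USDF Alembic migrations (steps 2-4) ...

   # Resume and refresh the subscription after the migration (step 5).
   psql "$POSTGRES_URL" -c "ALTER SUBSCRIPTION consdb_sub ENABLE;"
   psql "$POSTGRES_URL" -c "ALTER SUBSCRIPTION consdb_sub REFRESH PUBLICATION;"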

Summit Deployment Steps
^^^^^^^^^^^^^^^^^^^^^^^

1. Use a branch in ``phalanx`` to point to the ConsDB tag for deployment.
2. Set the target revision of the ``consdb`` Argo-CD application to your ``phalanx`` branch.
3. Refresh the ConsDB application and review the pod logs.
4. Connect to the summit database via psql and perform the Alembic migration (see the sketch after this list).
5. Have an image taken with the observing team.
6. Verify in the RSP, with a Jupyter notebook or a SQL query, that the new image has been inserted into the database as expected.
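
For step 4, a minimal sketch of the commands, assuming shell access to a host that can reach the summit database, a checkout of the ``consdb`` repository, and the standard Alembic workflow (host, database, and user names are placeholders):

.. code-block:: bash

   # Sanity-check connectivity to the summit ConsDB instance.
   psql -h <summit-db-host> -d <consdb-database> -U <user> -c "SELECT 1;"

   # Apply the migration from the consdb repository checkout.
   cd consdb
   alembic upgrade head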

Once deployment succeeds, set the ``Target Revision`` in Argo-CD back to ``main`` and complete the ``phalanx`` PR for the tested ConsDB tag.


REST API Server
===============
32 changes: 30 additions & 2 deletions doc/operator-guide/monitoring.rst
@@ -2,5 +2,33 @@
Monitoring
###########

* Database
* REST API Server
Reporting channels
==================

Users of ConsDB and ConsDBClient (``pqserver``) should report issues via the #consolidated-database channel in rubin-obs.slack.com.

ConsDB operators monitor this channel, along with #ops-usdf and #ops-usdf-alerts, for reported issues and outages, and escalate verified database issues.

Database
========

When issues are reported, the ConsDB team is responsible for verifying whether the database is up.

They can try the access method reported by the user, check connectivity with ``psql`` or ``pgcli``, and look in the #ops-usdf Slack channel for currently reported issues.
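
As a quick connectivity check, something along these lines can be used, assuming the database connection URL is available in ``$POSTGRES_URL`` (a placeholder for the actual credentials):

.. code-block:: bash

   # A trivial query succeeds only if the server is up and accepting connections.
   psql "$POSTGRES_URL" -c "SELECT 1;"

   # pgcli accepts the same URL and drops into an interactive prompt.
   pgcli "$POSTGRES_URL"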

Once the ConsDB team has confirmed there is an issue with the database, they should notify the #ops-usdf Slack channel; the USDF DBAs are responsible for fixing or restarting the database.

REST API Server
===============

If the API server is suspected to be down, the ConsDB team is responsible for checking it and restarting it.

Use the appropriate Argo-CD deployment graph to check the deployment logs and, if necessary, restart the service.
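
If Argo-CD is not at hand, the pods can also be inspected and restarted directly with ``kubectl``; a sketch, assuming the application runs in a ``consdb`` namespace with a ``consdb-pq`` deployment (both names are placeholders):

.. code-block:: bash

   # Inspect pod status and recent logs for the REST API server.
   kubectl -n consdb get pods
   kubectl -n consdb logs deployment/consdb-pq --tail=100

   # Restart the deployment if it is unresponsive.
   kubectl -n consdb rollout restart deployment/consdb-pq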


Other issues
------------

If the K8s infrastructure is down, the ConsDB team can verify the problem, but wider issues are likely to be visible.

USDF or Summit K8s/IT support is responsible for the fix.