The aim of this repository is to demonstrate how to deploy a static website written in React to an Azure Storage blob. The hosted SPA allows users to upload CSV files containing demographic and financial data about individuals. Files are uploaded to blob storage by calling an HTTP-triggered Azure Function with the appropriate output bindings. Once a CSV has been uploaded, a second, blob-triggered Azure Function computes correlations between variables such as experience, state, gender, and income. The computed statistics are stored in a new blob container, from which the results are served back to the user. Both functions are defined in the Python script function_app.py, which is the main entry point of our Azure Function App instance.
The SPA is protected with the OAuth 2.0 authorization code flow with PKCE, combined with OIDC. The user is redirected to the Azure AD login page, where they must authenticate before being redirected back to the SPA.
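For reference, the PKCE part of the flow amounts to the client generating a random code verifier and deriving a SHA-256 challenge from it, per RFC 7636. A stdlib-only sketch of that derivation is shown below; the actual SPA presumably does this in JavaScript via a library such as MSAL.

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code verifier and its S256 code challenge."""
    # Code verifier: high-entropy string of unreserved characters (RFC 7636)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # Code challenge: BASE64URL(SHA-256(verifier)), without '=' padding
    challenge = base64.urlsafe_b64encode(
        hashlib.sha256(verifier.encode()).digest()
    ).rstrip(b"=").decode()
    return verifier, challenge
```

The challenge is sent with the initial authorization request, while the verifier is sent later when exchanging the authorization code for tokens, proving both requests came from the same client.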
The associated Azure infrastructure is deployed with a script (more on that below).
A branch-triggered pipeline has been set up to deploy our code to the respective Azure resources using a GitHub Actions workflow script. The two functions are deployed using the Function App's publish profile, whereas the static web app is deployed using a service principal configured with a federated credential. Note that the static website is served directly from blob storage, as the storage account has been configured for static website hosting in our resource provisioning script. Deploying the website is therefore simply a matter of uploading the static files to the designated blob container.
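A trimmed-down sketch of what such a workflow might contain is shown below. Job names, build paths, and the resource names hvalfangststorageaccount and hvalfangst-function-app are assumptions for illustration; the workflow file in this repository is the source of truth.

```yaml
on:
  push:
    branches: [main]

permissions:
  id-token: write   # required for federated credential (OIDC) login
  contents: read

jobs:
  deploy-spa:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
      - run: npm ci && npm run build
      - run: >
          az storage blob upload-batch --destination '$web'
          --source build --account-name hvalfangststorageaccount --overwrite

  deploy-functions:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/functions-action@v1
        with:
          app-name: hvalfangst-function-app
          publish-profile: ${{ secrets.PUBLISH_PROFILE }}
```

Note the id-token: write permission, which allows the azure/login action to request an OIDC token that Azure matches against the service principal's federated credential, so no client secret is needed.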
- Platform: x86-64, Linux/WSL
- Programming Languages/Frameworks: React, Python 3
- Cloud Account: Azure
- Resource provisioning: Azure CLI
The shell script allocate_resources creates Azure resources using the Azure CLI in conjunction with a Bicep template file.
It will create the following hierarchy of resources:
```mermaid
graph TD
    A[Subscription] --> B[Resource Group]
    B --> C[Storage Account]
    C --> D[Blob Container]
    D -->|Static Website Hosting| H[index.html]
    B --> E[App Service Plan]
    E -->|Hosts| G[Function App]
    G -->|Uses| F[Application Insights]
    B --> F
```
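In outline, the provisioning script boils down to a handful of Azure CLI calls along these lines. The resource names, location, and template file path are illustrative; see allocate_resources for the actual commands.

```shell
#!/bin/bash
set -euo pipefail

# Resource group to hold everything (names are assumptions)
az group create --name hvalfangst-rg --location westeurope

# Storage account, app service plan, function app etc. via the Bicep template
az deployment group create \
  --resource-group hvalfangst-rg \
  --template-file infrastructure.bicep

# Enable static website hosting, which creates the special $web container
az storage blob service-properties update \
  --account-name hvalfangststorageaccount \
  --static-website \
  --index-document index.html
```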
In addition to the resources listed above, the script will also create a service principal and two Microsoft Entra ID app registrations.
The service principal has been assigned the Contributor role on our resource group, which is sufficient to deploy the static web app to the storage blob. It has also been assigned a federated credential configured to work with this repository, as it is utilized in our CI/CD GitHub Actions workflow script.
- The API app registration exposes the scopes Csv.Writer and Csv.Reader under the URI api://hvalfangst-function-app.
- The client app registration has a redirect URI configured to the static web app's URL and is granted the Csv.Writer permission along with the standard OIDC ones.
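On the function side, authorization then amounts to checking that the bearer token carries the expected scope. Below is a stdlib-only sketch of that claim check. Note that real code must first validate the token's signature, issuer, audience, and expiry (e.g. with PyJWT against the tenant's JWKS); that part is deliberately elided here.

```python
import base64
import json


def has_scope(jwt_token: str, required: str) -> bool:
    """Check whether the 'scp' claim of a JWT contains a given scope.

    WARNING: this only decodes the payload; it does NOT validate the
    signature. A real function must verify signature, issuer, audience
    and expiry before trusting any claim.
    """
    payload_b64 = jwt_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    # Azure AD emits delegated scopes as a space-separated string in 'scp'
    return required in claims.get("scp", "").split()
```

The HTTP-triggered upload function would require Csv.Writer, while a read endpoint would require Csv.Reader.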
Four secrets are required for the GitHub Actions workflow script to deploy the code to the Azure resources. As can be seen in the script, these are:
- AZURE_CLIENT_ID: Client ID of the service principal used to deploy the static web app
- AZURE_SUBSCRIPTION_ID: ID of the subscription containing our resources
- AZURE_TENANT_ID: ID of the Microsoft Entra tenant used to authenticate the service principal
- PUBLISH_PROFILE: Publish profile used to deploy our two functions to the Azure Function App
After provisioning resources, setting up secrets, and pushing the code to the repository, one may access the static web app by navigating to https://hvalfangststorageaccount.z6.web.core.windows.net, which results in the following.
Click on Sign In to initiate the OIDC flow, which redirects to the Azure AD permission consent screen.
Click on Accept and tick the Consent on behalf of your organization box to be redirected back to the SPA, where you will be greeted with the following.
Proceed to click on Upload to choose a file to upload. Pick the CSV file named input, which has been provided for this purpose. The file name will then be displayed in the input field. Click on Upload to attempt to upload the file to the storage blob.