- Register to Boiling either online or by using the BDCLI.
- Compile and build the Boiling HTTP Gateway container image:
```shell
node --version # you need Node v18 or later
yarn install
yarn build
# Your BoilingData credentials are passed through environment variables
BD_USERNAME=myBdUser@example.com BD_PASSWORD=myBdSecretPw docker-compose up -d boilingdata_http_gw
# Ready to query Boiling!
curl -s -H 'Content-Type: application/json' localhost:3100 \
  -d "{\"statement\":\"SELECT * FROM parquet_scan('s3://boilingdata-demo/test.parquet');\"}"
```
The `curl` command returns 10 entries similar to this:
```json
[
  {
    "registration_dttm": "2016-02-03 07:55:29+00",
    "id": 1,
    "first_name": "Amanda",
    "last_name": "Jordan",
    "email": "ajordan0@com.com",
    "gender": "Female",
    "ip_address": "1.197.201.2",
    "cc": "6759521864920116",
    "country": "Indonesia",
    "birthdate": "3/8/1971",
    "salary": 49756.53,
    "title": "Internal Auditor",
    "comments": "1E+02"
  }
]
```
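If you prefer to issue the same request programmatically, here is a minimal sketch, assuming Node.js v18+ (for the built-in `fetch`), the gateway from the example above listening on `localhost:3100`, and that the response body is the JSON array shown above.

```typescript
// Minimal sketch: POST a SQL statement to the local Boiling HTTP Gateway.
// Assumes Node.js v18+ (built-in fetch) and the gateway on localhost:3100.
async function queryBoiling(statement: string): Promise<unknown[]> {
  const response = await fetch("http://localhost:3100", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ statement }),
  });
  if (!response.ok) {
    throw new Error(`Gateway returned HTTP ${response.status}`);
  }
  // Assumed: the body is the JSON array of result rows shown above.
  return (await response.json()) as unknown[];
}

// Usage: fetch the demo Parquet rows and print the first one.
queryBoiling("SELECT * FROM parquet_scan('s3://boilingdata-demo/test.parquet');")
  .then((rows) => console.log(rows[0]))
  .catch(console.error);
```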
NOTE: See also the standalone-with-Python guide.
If you want to run a BI Tool with a Presto connector:
- Check out the Buenavista Boiling Proxy and build it (creates the `buenavista` docker image).
- Start e.g. Metabase, Boiling Buenavista, and the Boiling HTTP GW locally and start querying:
```shell
BD_USERNAME=myBdUser@example.com BD_PASSWORD=myBdSecretPw docker-compose up -d
```
You can run queries both locally and remotely on Boiling from the same BI Tool interface, because the Buenavista Proxy embeds a DuckDB database. Your BI Tool does not need to know the difference; it's all SQL.
The Boiling Buenavista Proxy handles all the SQL queries, with the embedded DuckDB as the default target. By matching the SQL against keywords, we relay some queries to Boiling, fetch the results back into the local DuckDB, and rewrite the query to consume the results that now reside in the local DuckDB.
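As an illustration only, not the proxy's actual implementation, the sketch below shows how such keyword-based routing could look in Node.js: queries matching an assumed S3 keyword pattern are relayed to the Boiling HTTP Gateway, while everything else runs against an embedded DuckDB (using the `duckdb` npm package). The keyword pattern, URLs, and function names are illustrative assumptions.

```typescript
// Illustrative sketch only -- not the Buenavista Proxy's actual code.
// Assumes the `duckdb` npm package and a Boiling HTTP Gateway on localhost:3100.
import duckdb from "duckdb";

const db = new duckdb.Database(":memory:"); // local DuckDB target

// Hypothetical keyword rule: relay queries that reference S3 data to Boiling.
function shouldRelayToBoiling(sql: string): boolean {
  return /s3:\/\//i.test(sql);
}

async function runQuery(sql: string): Promise<unknown[]> {
  if (shouldRelayToBoiling(sql)) {
    // Relay to Boiling via the HTTP Gateway. The real proxy would also load
    // these rows into the local DuckDB and rewrite the original query to
    // read from that local copy, as described above.
    const res = await fetch("http://localhost:3100", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ statement: sql }),
    });
    return (await res.json()) as unknown[];
  }
  // Otherwise answer the query from the embedded DuckDB.
  return new Promise<unknown[]>((resolve, reject) =>
    db.all(sql, (err, rows) => (err ? reject(err) : resolve(rows)))
  );
}

// A local query stays in DuckDB; the S3 query gets relayed to Boiling.
runQuery("SELECT 42 AS answer").then(console.log).catch(console.error);
runQuery("SELECT * FROM parquet_scan('s3://boilingdata-demo/test.parquet');")
  .then((rows) => console.log(`relayed rows: ${rows.length}`))
  .catch(console.error);
```

In the actual proxy the relayed results are additionally loaded into the local DuckDB so the rewritten query can read them locally, as described above.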
See the `docker-compose.yml` file for running some BI Tools.
Example with Metabase:
NOTE! Do not set Schema or Password, since this is a local container. Also, the username can be anything.