diff --git a/.gitignore b/.gitignore index 81a312c..d69aac8 100644 --- a/.gitignore +++ b/.gitignore @@ -1,7 +1,3 @@ -# Misc -#Misc -store-example.js - # Logs logs *.log diff --git a/README.md b/README.md index 83399ef..369d39a 100644 --- a/README.md +++ b/README.md @@ -1,29 +1,98 @@ -# Gatsby Groq (WIP) +# gatsby-plugin-groq -**This is a WIP which for now includes a starter theme with local plugin for development purposes. Once ironed out it will be its own standalone plugin and these docs will be less abysmal** +**Gatsby plugin for using GROQ in place of GraphQL** -**View low quality demo here:** -https://drive.google.com/file/d/1FVch2HbAWk1TEXph1katiNb1uuaBLSHf/view?usp=sharing +The purpose of this plugin is to merge the power of GROQ and Gatsby by allowing developers to run GROQ queries against Gatsby's data layer for their page and static queries. For those of you who are familiar with GROQ, you are probably already in love and need no introduction. For everyone else, I highly suggest reading the [What is This](#introduction) and [Resources](#resources) sections below. -To do: Purpose of plugin (groq > graphql). Shout outs to Sanity and Gatsby teams because they're so awesome. +Included in this repository is a demo Gatsby starter with some data to play around with. You can find it under `packages/gatsby-groq-demo`. Just download the files and add this plugin within your `plugins` directory to start having fun. -Actual plugin can be found in `plugins/gatsby-plugin-groq`. `index.js` contains some functions used during build and runtime, whereas `gatsby-node.js` contains all the wizardry.
+**View low quality demo here:** +https://drive.google.com/file/d/1FVch2HbAWk1TEXph1katiNb1uuaBLSHf/view?usp=sharing ## 🎂 Features +- Works with any data pulled into Gatsby's data layer - Replicates Gatsby's beloved patterns -- Works with any data pulled into Gatsby's data layer **(Needs Testing)** - GROQ-based page queries with HMR - GROQ-based static queries with live reloads -- Leverages GROQ's native functionality for advanced querying, node/document projections, joins (limited), etc. -- String interpolation within queries, much more flexible than GraphQL fragments **(TO DO)** +- Leverages GROQ's native functionality for advanced querying, node/document projections, joins (limited), etc. +- String interpolation ("fragments") within queries, much more flexible than GraphQL fragments - GROQ explorer in browser during development at `localhost:8000/__groq` **(TO DO)** - Optimized for incremental builds on Cloud and OSS **(TO DO)** - -## 🧙 How it works +## 🚀 Get Started + +1. For now, download and install in your local `plugins` directory at the root of your Gatsby project. +2. In `gatsby-config.js`, add the plugin configuration to the `plugins` array: +``` +module.exports = { + //... + plugins: [ + { + resolve: 'gatsby-plugin-groq', + options: { + // Location of your project's fragments index file. + // Only required if you are implementing fragments. + fragmentsDir: './src/fragments' + } + } + ] +} +``` +3. To use a GROQ page query, simply add a named `groqQuery` export to the top of your component file as you would a Gatsby query: +``` +export const groqQuery = ` + ... +` +``` +4. To use a GROQ static query, use the `useGroqQuery` hook: +``` +import { useGroqQuery } from 'src/plugins/gatsby-plugin-groq'; + +export default function() { + + const data = useGroqQuery( ` + ... + ` ); + +} +``` +5. For more flexibility and advanced usage, check out [Fragments](#fragments). + +## 🤔 What is This? +Gatsby is an amazing tool that has helped advance modern web development in significant ways.
While many love it for its magical frontend concoction of static generation and rehydration via React, easy routing, smart prefetching, image rendering, etc., one of the key areas where it stands out from other similar tools is its GraphQL data layer. This feature is in large part the reason why some love Gatsby and why others choose to go in another direction. Being able to source data from multiple APIs, files, etc. and compile them all together into a queryable GraphQL layer is ***amazing***, but many developers don't enjoy working with GraphQL. This is where GROQ comes in. + +GROQ (**G**raph-**R**elational **O**bject **Q**ueries) is an incredibly robust and clear general query language designed by the folks at Sanity Inc. for filtering and projecting JSON data. In many ways it is very similar to GraphQL in that you can run multiple robust queries and specify the data you need all within a single request; with GROQ, however, you can accomplish much more in a clearer and more flexible way. It supports complex parameters and operators, functions, piping, advanced joins, slicing, ordering, projections, conditionals, pagination, etc., all with an intuitive syntax 😲. + +For example, take this somewhat simple GraphQL query: + +``` +{ + authors(where: { + debutedBefore_lt: "1900-01-01T00:00:00Z", + name_matches: "Edgar" + }) { + name, + debutYear, + } +} +``` + +Here is what it would look like using GROQ: + +``` +*[_type == "author" && name match "Edgar" && debutYear < 1900]{ + name, + debutYear +} +``` + +The more complex the queries, the smoother GROQ becomes. This is why some developers already familiar with GROQ bypass Gatsby's data layer so that they can leverage its power. + + +## 🧙 How it Works This plugin mimics Gatsby's own method of extracting queries from components by using a few Babel tools to parse files and traverse code to capture all queries found in files.
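The extraction idea can be sketched in a few lines. This is a simplified, regex-based approximation for illustration only (the plugin itself parses files into a full Babel AST, and `extractGroqQuery` is a hypothetical helper name, not part of the plugin's API):

```javascript
// Simplified sketch of query extraction. The real plugin walks a Babel AST;
// here a regex grabs the template literal assigned to the `groqQuery` export.
function extractGroqQuery( fileContents ) {
  const match = /export const groqQuery = `([\s\S]*?)`/.exec( fileContents );
  return match ? match[ 1 ].trim() : null;
}

// Example component source containing a page query.
const source = [
  "import React from 'react';",
  'export const groqQuery = `',
  '  *[ _type == "post" ]{ title }',
  '`;',
].join( '\n' );

console.log( extractGroqQuery( source ) ); // *[ _type == "post" ]{ title }
```

A regex like this breaks on queries that themselves contain backticks or nested template literals, which is one reason the plugin relies on `@babel/parser` and `@babel/traverse` instead.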
By leveraging Gatsby's Node APIs and helpers we are able to extract queries from files on demand and then run them against all GraphQL nodes found in Gatsby's redux store. After queries are run we can either feed results into a page's context (page queries), or cache them for later use within individual components (static queries). Everything was done to leverage available APIs and to avoid interacting with Gatsby's data store directly as much as possible. -For now, all cache related to groq queries can be found in `.cache/groq` during development and `public/static/groq` in production. **Note: I have made some changes since last testing builds so they might be buggy** +For now, all cache related to groq queries can be found in `.cache/groq` during development and `public/static/groq` in production. ### Page Queries All page-level components with `groqQuery` exports will have their queries (template literal) extracted and cached unprocessed as a hashed json file in the groq directory. The hash is based on the component file's full path so that the cached file can always be associated with the component. During bootstrap, whenever a page is created via `createPage` the plugin checks the cache to see if there is a page query related to the page's component. If there is, it replaces any variables in the query with values supplied in the page's context and then runs it. The result is stored in the `data` property within `pageContext`. @@ -37,28 +106,61 @@ All components using the hook `useGroqQuery` first have these queries extracted, Similar to page queries, all files are watched for changes and whenever there is a change to a file containing a static query, the above process runs again, the query results are cached, and the page refreshes with the static query now returning the updated content. +### Fragments +Playing off of GraphQL, "fragments" are strings of reusable portions of GROQ queries that can be interpolated within other queries.
For example, say you have a blog where you are showing post snippets throughout multiple page templates and for every post you need to retrieve its `id`, `title`, `summary`, and `category`, along with the category's `name` and `slug`. Instead of having to remember which fields you need and write this out every time, you could create a reusable fragment: +``` +exports.postSnippetFields = ` + id, + summary, + title, + "category": *[ type == "category" && id == ^.category ] { + name, + slug + } +`; +``` -**I think that covers most of it. Check the comments within code for more details...** +Then simply reuse the fragment wherever you need: +``` +import { postSnippetFields } from 'src/fragments'; + +const groqQuery = ` + *[ type == "post" ] { + ${postSnippetFields} + } +`; +``` +To use GROQ fragments with this plugin, for now all fragments must be exported from an `index.js` file using CommonJS syntax. You must also specify the directory where this file is found within the plugin options: `fragmentsDir: // Directory relative to project root`. + +**That should cover most of it. Check the comments within code for more details.** ## ⌛ TO DO (random order) - ~~Get rid of relative directories~~ - ~~Work on issues with joins~~ we might be limited here +- ~~Clean up spotty caching issues after running development~~ +- ~~Experiment with other data sources (WordPress)~~ - GROQ explorer -- Run fragment functions before interpolating into queries -- Experiment with other data sources (WordPress) +- Allow for fragment functions +- Set up page refreshing when fragments are changed +- Look into using esm for ES6 imports/exports - Set up an option to auto-resolve references? -- Clean up spotty caching issues after running development -- Error messaging (especially when there are Babel parsing errors -- Parsing options (i.e.
allow ECMAScript proposals) +- Error messaging (especially when there are Babel parsing errors) - Performance optimizations - Improve docs - Provide recipe docs with heavy use of fragments - Incremental builds? - Allow for variables within static queries? -- Use esm? - Helpers for real-time listening in client (Sanity only) - Tests -- Proselytize everyone from GraphQL to Groq. \ No newline at end of file +- Proselytize everyone from GraphQL to GROQ. + +## 📖 GROQ Resources +- [GROQ Intro Video](https://www.youtube.com/watch?v=Jcfubj2zRI0) +- [GROQ Docs](https://www.sanity.io/docs/overview-groq) +- [CSS Tricks - The Best (GraphQL) API is One You Write](https://css-tricks.com/the-best-graphql-api-is-one-you-write/) +- [Review of GROQ, A New JSON Query Language](https://nordicapis.com/review-of-groq-a-new-json-query-language/) + +## 🙇 Huge Thanks +Thanks to the awesome teams at [Gatsby](https://www.gatsbyjs.org/) and [Sanity](https://www.sanity.io/) for their absolutely amazing tools and developer support. If you haven't checked it out yet, I would **HIGHLY** recommend looking into Sanity's incredible CMS. It's hard to imagine how a headless CMS experience could be any better. diff --git a/gatsby-browser.js b/gatsby-browser.js deleted file mode 100644 index e69de29..0000000 diff --git a/gatsby-config.js b/gatsby-config.js deleted file mode 100644 index 077426f..0000000 --- a/gatsby-config.js +++ /dev/null @@ -1,26 +0,0 @@ -const path = require( 'path' ); - -require( 'dotenv' ).config( { - path: `.env`, -} ); - -module.exports = { - plugins: [ - { - resolve: 'gatsby-plugin-groq', - options: { - fragmentsDir: './src/fragments', - } - }, - { - resolve: 'gatsby-source-sanity', - options: { - projectId: process.env.SANITY_PROJECT, - dataset: process.env.SANITY_DATASET, - token: process.env.SANITY_TOKEN, - overlayDrafts: process.env.NODE_ENV === 'development' ? true : false, - watchMode: process.env.NODE_ENV === 'development' ?
true : false, - } - }, - ], -} diff --git a/gatsby-node.js b/gatsby-node.js index 506a85c..a064900 100644 --- a/gatsby-node.js +++ b/gatsby-node.js @@ -1,37 +1,518 @@ +const axios = require( 'axios' ); +const chalk = require( 'chalk' ); +const fs = require( 'fs' ); +const glob = require( 'glob' ); +const murmurhash = require( './murmur' ); +const normalizePath = require( 'normalize-path' ); +const parser = require( '@babel/parser' ); const path = require( 'path' ); -const slash = require( 'slash' ); -const { runQuery } = require( './plugins/gatsby-plugin-groq' ); +const gatsbyReporter = require( 'gatsby-cli/lib/reporter' ); +const traverse = require( '@babel/traverse' ).default; +const { watch } = require( 'chokidar' ); +const { runQuery } = require( './index' ); +const { reporter } = require( './utils' ); + +// TODO +const ROOT = process.env.INIT_CWD; +const GROQ_DIR = process.env.NODE_ENV === 'development' ? `${ROOT}/.cache/groq` : `${ROOT}/public/static/groq`; + /** - * Create demo pages. + * Here's where we extract and run all initial queries. + * Also sets up a watcher to re-run queries during dev when files change. + * + * Runs right after schema creation and before createPages. */ -exports.createPages = async ( { graphql, actions, cache, getNodes, reporter, traceId } ) => { +exports.resolvableExtensions = async ( { graphql, actions, cache, getNodes, traceId, store }, plugin ) => { + + if( !! fs.existsSync( GROQ_DIR ) ) { + fs.rmdirSync( GROQ_DIR, { recursive: true } ); + } + fs.mkdirSync( GROQ_DIR ); + + // Cache fragments. + const fragmentsDir = !! plugin.fragmentsDir ? path.join( ROOT, plugin.fragmentsDir ) : null; + + if( !! fragmentsDir ) { + cacheFragments( fragmentsDir, cache ); + } + + // Extract initial queries. + const initialNodes = getNodes(); + + extractQueries( { nodes: initialNodes, traceId, cache } ); + + + // For now watching all files to re-extract queries.
+ // Right now there doesn't seem to be a way to watch for build updates using Gatsby's public Node APIs. + // Created a ticket on GitHub to explore options here. + const watcher = watch( `${ROOT}/src/**/*.js` ); + + watcher.on( 'change', async ( filePath ) => { - const { createPage } = actions; - const nodes = getNodes(); - const postQuery = await runQuery( `*[ _type == "post" ]{ - _id, - slug { - current + // Recache if this was a change within fragments directory. + if( !! fragmentsDir && filePath.includes( fragmentsDir ) ) { + + await cacheFragments( fragmentsDir, cache ); + + // TODO: need to figure out a way to refresh page with new data. + reporter.info( `DON'T FORGET TO RESAVE FILES WITH UPDATED FRAGMENT!` ); + + } + + // Get info for file that was changed. + const fileContents = fs.readFileSync( filePath, 'utf8' ); + + // Check if file has either page or static queries. + const pageQueryMatch = /export const groqQuery = /.exec( fileContents ); + const staticQueryMatch = /useGroqQuery/.exec( fileContents ); + if( ! pageQueryMatch && ! staticQueryMatch ) { + return; } - }`, nodes ); - if( !! postQuery && postQuery.length ) { - for( let post of postQuery ) { + reporter.info( 'Re-processing groq queries...' ); + + // Get updated nodes to query against. + const nodes = getNodes(); + + // Run page queries. + if( pageQueryMatch ) { + + const { deletePage, createPage } = actions; + const { pages } = store.getState(); + + // First we need to reprocess the page query. + const processedPageQuery = await processFilePageQuery( filePath, nodes, cache ); + + if( !! processedPageQuery ) { + + const { fileHash: newHash, query: newQuery } = processedPageQuery; + const queryFile = `${GROQ_DIR}/${newHash}.json`; - const { id, _id } = post; - const postPath = `/${post.slug.current}`; - const template = path.resolve( `./src/templates/Page.js` ); + await cacheQueryResults( newHash, newQuery ); + + // Update all paths using this page component.
+ // Is this performant or should we try to leverage custom cache? + for( let [ path, page ] of pages ) { + + if( page.component !== filePath ) { + continue; + } + + reporter.info( `Updating path: ${page.path}` ); + + // Run query and inject into page context. + pageQueryToContext( { + actions, + cache, + file: queryFile, + nodes, + page + } ); - createPage( { - path: postPath, - component: slash( template ), - context: { - _id, } - } ); + + } + + } + + // Static queries. + if( ! staticQueryMatch ) { + return reporter.success( 'Finished re-processing page queries' ); + } + + // Run query and save to cache. + // Files using the static query will be automatically refreshed. + const staticQuery = await processFileStaticQuery( filePath, nodes, cache ); + + if( !! staticQuery ) { + + const { hash, json } = staticQuery; + await cacheQueryResults( hash, json ); + + return reporter.success( 'Finished re-processing queries' ); + + } + + reporter.warn( 'There was a problem processing one of your static queries' ); + + } ); + +}; + +/** + * Inject page query results into its page. + */ +exports.onCreatePage = async ( { actions, cache, getNodes, page, traceId } ) => { + + // Check for hashed page queries for this component. + const componentPath = page.component; + const hash = murmurhash( componentPath ); + const queryFile = `${GROQ_DIR}/${hash}.json`; + + if( ! fs.existsSync( queryFile ) ) { + return; + } + + // Run query and write to page context. + pageQueryToContext( { + actions, + cache, + file: queryFile, + getNodes, + page, + } ); + + +} + +/** + * Extract page and static queries from all files. + * Process and cache results. + * + * @param {Object} $0 Gatsby Node Helpers. + */ +async function extractQueries( { nodes, traceId, cache } ) { + + reporter.info( 'Getting files for groq extraction...'
); + + const filesRegex = `*.+(t|j)s?(x)`; + const pathRegex = `/{${filesRegex},!(node_modules)/**/${filesRegex}}`; + let hasErrors = false; + let files = [ + path.join( ROOT, 'src' ), + ].reduce( ( merged, folderPath ) => { + + merged.push( + ...glob.sync( path.join( folderPath, pathRegex ), { + nodir: true, + } ) + ); + + return merged; + + }, [] ); + + files = files.filter( d => ! d.match( /\.d\.ts$/ ) ); + files = files.map( normalizePath ); + + // Loop through files and look for queries to extract. + for( let file of files ) { + + const pageQuery = await processFilePageQuery( file, nodes, cache ); + const staticQuery = await processFileStaticQuery( file, nodes, cache ); + + // Cache page query. + // This will only contain a json file of unprocessed query. + if( !! pageQuery ) { + const { fileHash, query } = pageQuery; + cacheQueryResults( fileHash, query ); + } + + // Cache static query. + // This will contain actual results of the query. + if( !! staticQuery ) { + + if( staticQuery instanceof Error ) { + + // Flag the error but keep processing the remaining files. + hasErrors = true; + reporter.warn( 'There was an error processing a static query' ); + continue; + + } + const { hash, json } = staticQuery; + cacheQueryResults( hash, json, 'static' ); + } + + } + + reporter.success( 'Finished getting files for query extraction' ); + + +} + +/** + * Cache fragments. + * + * @param {string} fragmentsDir + * @param {Object} cache + * @return {bool} if successfully cached. + */ +async function cacheFragments( fragmentsDir, cache ) { + + const index = path.join( fragmentsDir, 'index.js' ); + + if( fs.existsSync( index ) ) { + + delete require.cache[ require.resolve( index ) ]; + + const fragments = require( index ); + + await cache.set( 'groq-fragments', fragments ); + + reporter.info( 'Cached fragments' ); + + return true; + + } + + return false; + +} + +/** +* Run page query and update the related page via createPage. +* +* @param {Object} $0 Gatsby Node Helpers.
+*/ +async function pageQueryToContext( { actions, cache, file, getNodes, nodes, page, } ) { + + const { createPage, deletePage } = actions; + + // Get query content. + const content = fs.readFileSync( file, 'utf8' ); + let { unprocessed: query } = JSON.parse( content ); + + // Replace any variables within query with context values. + if( page.context ) { + + for( let [ key, value ] of Object.entries( page.context ) ) { + + const search = `\\$${key}`; + const pattern = new RegExp( search, 'g' ); + query = query.replace( pattern, `"${value}"` ); + + } + } + + // Do the thing. + try { + + const allNodes = nodes || getNodes(); + const fragments = await cache.get( 'groq-fragments' ); + const { result } = await runQuery( query, allNodes, { file, fragments } ); + + page.context.data = result; + + deletePage( page ); + createPage( { + ...page, + } ); + } + catch( err ) { + console.error( page.component ); + reporter.error( `${err}` ); + reporter.error( query ); + } + + +} + +/** + * Custom webpack. + */ +exports.onCreateWebpackConfig = async ( { actions, cache, plugins, store } ) => { + + + // Make sure we have access to GROQ_DIR for useGroqQuery(). + actions.setWebpackConfig( { + plugins: [ + plugins.define( { + 'process.env.GROQ_DIR': JSON.stringify( GROQ_DIR ), + } ) + ] + } ); + +} + +/** + * Extracts page query from file and returns its hash and unprocessed string. + * + * @param {string} file + * @param {map} nodes + * @param {Object} cache + * @return {Object} fileHash and query + */ +async function processFilePageQuery( file, nodes, cache ) { + + const contents = fs.readFileSync( file, 'utf8' ); + const match = /export const groqQuery = /.exec( contents ); + if( ! match ) { + return; + } + + const ast = parse( file, contents ); + if( !
ast ) { + return null; + } + + let pageQuery = null; + let queryStart = null; + let queryEnd = null; + + traverse( ast, { + ExportNamedDeclaration: function( path ) { + + // Guard against exports without variable declarations, + // e.g. `export function Template() {}` or `export { foo }`. + if( ! path.node.declaration || ! path.node.declaration.declarations ) { + return; + } + + const declarator = path.node.declaration.declarations[0]; + + if( declarator.id.name === 'groqQuery' ) { + + queryStart = declarator.init.start; + queryEnd = declarator.init.end; + pageQuery = contents.substring( queryStart, queryEnd ); + + } + + } + } ); + + if( ! pageQuery ) { + return; + } + + const hash = hashQuery( file ); + + return { + fileHash: hash, + query: JSON.stringify( { unprocessed: pageQuery } ), + } + +} + +/** + * Extracts static query from file and returns its hash and result. + * + * @param {string} file + * @param {map} nodes + * @param {Object} cache + * @return {Object} hash and query + */ +async function processFileStaticQuery( file, nodes, cache ) { + + const contents = fs.readFileSync( file, 'utf8' ); + const match = /useGroqQuery/.exec( contents ); + if( ! match ) { + return; + } + + const ast = parse( file, contents ); + if( ! ast ) { + return null; + } + + let staticQuery = null; + let queryStart = null; + let queryEnd = null; + + traverse( ast, { + CallExpression: function( path ) { + + if( !! path.node.callee && path.node.callee.name === 'useGroqQuery' ) { + + queryStart = path.node.arguments[0].start + 1; + queryEnd = path.node.arguments[0].end - 1; + staticQuery = contents.substring( queryStart, queryEnd ); + + } } + } ); + + if( ! staticQuery ) { + return null; + } + const fragments = await cache.get( 'groq-fragments' ); + const { result, finalQuery } = await runQuery( staticQuery, nodes, { file, fragments } ); + if( result instanceof Error ) { + return result; } + + const hash = hashQuery( finalQuery ); + const json = JSON.stringify( result ); + + return { hash, json }; + +} + +/** + * Cache result from query extraction. + * For page queries this caches the query itself. + * For static queries this caches the results of the query.
+ * + * @param {number} hash Hash to the json file. + * @param {Object|string} data Data we are caching. + * @param {string} type Page or static query. Optional. Default 'page' + */ +async function cacheQueryResults( hash, data, type = 'page' ) { + + reporter.info( `Caching ${type} query: ${hash}` ); + + const json = typeof data !== 'string' ? JSON.stringify( data ) : data; + + // GROQ_DIR already resolves to .cache/groq in development and public/static/groq in production. + // Probably a more sophisticated Gatsby way of doing this. + // Note: writeFileSync throws on failure and takes no callback. + fs.writeFileSync( `${GROQ_DIR}/${hash}.json`, json ); + +} + +/** + * Parse file contents into a Babel AST. + * + * @param {string} filePath + * @param {string} content + * @param {Object} options + * @return {ast} + */ +function parse( filePath, content, options = {} ) { + + const { plugins: additionalPlugins, ...additionalOptions } = options; + let plugins = [ 'jsx' ]; + + if( !! additionalPlugins ) { + plugins = [ ...plugins, ...additionalPlugins ]; + } + + try { + + const ast = parser.parse( content, { + errorRecovery: true, + plugins, + sourceType: 'module', + ...additionalOptions, + } ); + + return ast; + + } + catch( err ) { + console.warn( `Error parsing file: ${filePath}` ); + console.warn( err ); + return null; + } + +} + +/** + * Generate a hash based on the query. + * + * @param {string} query + * @return {number} + */ +function hashQuery( query ) { + return murmurhash( query ); } diff --git a/gatsby-ssr.js b/gatsby-ssr.js deleted file mode 100644 index b17b8fc..0000000 --- a/gatsby-ssr.js +++ /dev/null @@ -1,7 +0,0 @@ -/** - * Implement Gatsby's SSR (Server Side Rendering) APIs in this file.
- * - * See: https://www.gatsbyjs.org/docs/ssr-apis/ - */ - -// You can delete this file if you're not using it diff --git a/plugins/gatsby-plugin-groq/index.js b/index.js similarity index 73% rename from plugins/gatsby-plugin-groq/index.js rename to index.js index 64c7d94..1c710a2 100644 --- a/plugins/gatsby-plugin-groq/index.js +++ b/index.js @@ -1,9 +1,7 @@ const groq = require( 'groq-js' ); const murmurhash = require( './murmur' ); -const parser = require( '@babel/parser' ); -const traverse = require( '@babel/traverse' ).default; +const { reporter } = require( './utils' ); -const ROOT = process.env.INIT_CWD; /** * Hook to mimic Gatsby's static query. @@ -18,30 +16,14 @@ exports.useGroqQuery = query => { const hash = murmurhash( query ); - if( process.env.NODE_ENV === 'development' ) { - - try { - const result = require( `${ROOT}/.cache/groq/${hash}.json` ); - return result; - } - catch( err ) { - console.warn( err ); - } - + try { + const result = require( `${process.env.GROQ_DIR}/${hash}.json` ); + return result; } - else { - - try { - const result = require( `${ROOT}/public/static/groq/${hash}.json` ); - return result; - } - catch( err ) { - console.warn( err ); - } - + catch( err ) { + console.warn( err ); } - } /** @@ -52,11 +34,12 @@ exports.useGroqQuery = query => { * @param {Object} options * @param {Object} options.fragments * @param {Object} options.params - * @return {array} + * @param {string} options.file For debugging. + * @return {Object} Array of results along with final query */ exports.runQuery = async ( rawQuery, dataset, options = {} ) => { - const { fragments, params } = options; + const { file, fragments, params } = options; let query = rawQuery; // Check if query has fragment. @@ -65,7 +48,7 @@ exports.runQuery = async ( rawQuery, dataset, options = {} ) => { if( hasFragment ) { if( ! fragments || ! Object.keys( fragments ).length ) { - console.warn( 'GROQ query contains fragments but no fragment index found.' 
); + reporter.warn( 'Query contains fragments but no index provided.' ); return; } @@ -107,20 +90,25 @@ exports.runQuery = async ( rawQuery, dataset, options = {} ) => { } } - query = query.replace( /`/g, '', ); - try { - const parsedQuery = groq.parse( query ); + const strippedQuery = query.replace( /`/g, '', ); + const parsedQuery = groq.parse( strippedQuery ); const value = await groq.evaluate( parsedQuery, { dataset } ); const result = await value.get(); - return result; + return { result, finalQuery: query } } catch( err ) { - console.error( err ); + console.error( file ); + reporter.error( `${err}` ); + reporter.error( query ); + + return err; + } + } diff --git a/murmur.js b/murmur.js new file mode 100644 index 0000000..153289c --- /dev/null +++ b/murmur.js @@ -0,0 +1,74 @@ +// murmurhash2 via https://gist.github.com/raycmorgan/588423 + +module.exports = ( str, seed = 'abc' ) => { + + if( ! str ) { + return null; + } + + let m = 0x5bd1e995 + let r = 24 + let h = seed ^ str.length + let length = str.length + let currentIndex = 0 + + while (length >= 4) { + let k = UInt32(str, currentIndex) + + k = Umul32(k, m) + k ^= k >>> r + k = Umul32(k, m) + + h = Umul32(h, m) + h ^= k + + currentIndex += 4 + length -= 4 + } + + switch (length) { + case 3: + h ^= UInt16(str, currentIndex) + h ^= str.charCodeAt(currentIndex + 2) << 16 + h = Umul32(h, m) + break + + case 2: + h ^= UInt16(str, currentIndex) + h = Umul32(h, m) + break + + case 1: + h ^= str.charCodeAt(currentIndex) + h = Umul32(h, m) + break + } + + h ^= h >>> 13 + h = Umul32(h, m) + h ^= h >>> 15 + + return h >>> 0 +} + +function UInt32(str, pos) { + return ( + str.charCodeAt(pos++) + + (str.charCodeAt(pos++) << 8) + + (str.charCodeAt(pos++) << 16) + + (str.charCodeAt(pos) << 24) + ) +} + +function UInt16(str, pos) { + return str.charCodeAt(pos++) + (str.charCodeAt(pos++) << 8) +} + +function Umul32(n, m) { + n = n | 0 + m = m | 0 + let nlo = n & 0xffff + let nhi = n >>> 16 + let res = (nlo * m + (((nhi * m) 
& 0xffff) << 16)) | 0 + return res +} \ No newline at end of file diff --git a/package.json b/package.json index 6c12b2b..83b4fb1 100644 --- a/package.json +++ b/package.json @@ -1,51 +1,26 @@ { - "name": "gatsby-groq", + "name": "gatsby-plugin-groq", "private": true, - "description": "A simple starter to get up and developing quickly with Gatsby", - "version": "0.1.0", + "description": "Gatsby plugin for using GROQ in place of GraphQL", + "version": "1.0.0-alpha.0", "author": "Kevin McAloon ", - "dependencies": { - "axios": "^0.19.2", - "chokidar": "^3.4.0", - "dotenv": "^8.2.0", - "gatsby": "^2.21.21", - "gatsby-image": "^2.4.0", - "gatsby-plugin-extract-schema": "^0.0.5", - "gatsby-plugin-manifest": "^2.4.0", - "gatsby-plugin-offline": "^3.2.0", - "gatsby-plugin-react-helmet": "^3.3.0", - "gatsby-plugin-sharp": "^2.6.0", - "gatsby-source-filesystem": "^2.3.0", - "gatsby-source-sanity": "^5.0.5", - "groq-js": "^0.1.3", - "prop-types": "^15.7.2", - "react": "^16.12.0", - "react-docgen": "^5.3.0", - "react-dom": "^16.12.0", - "react-helmet": "^6.0.0", - "slugify": "^1.4.0" - }, - "devDependencies": { - "prettier": "2.0.5" - }, "keywords": [ "gatsby" ], "license": "MIT", - "scripts": { - "build": "gatsby build", - "develop": "gatsby develop", - "format": "prettier --write \"**/*.{js,jsx,json,md}\"", - "start": "gatsby develop", - "serve": "gatsby serve", - "clean": "gatsby clean", - "test": "echo \"Write tests! 
-> https://gatsby.dev/unit-testing\" && exit 1" }, "repository": { "type": "git", - "url": "https://github.com/gatsbyjs/gatsby-starter-default" + "url": "https://github.com/kmcaloon/gatsby-plugin-groq" }, - "bugs": { - "url": "https://github.com/gatsbyjs/gatsby/issues" + "dependencies": { + "@babel/parser": "^7.9.6", + "@babel/traverse": "^7.9.6", + "axios": "^0.19.2", + "chokidar": "^3.4.0", + "gatsby-cli": "^2.12.16", + "glob": "^7.1.6", + "groq-js": "^0.1.5", + "normalize-path": "^3.0.0" } } diff --git a/.prettierignore b/packages/gatsby-groq-demo/.prettierignore similarity index 100% rename from .prettierignore rename to packages/gatsby-groq-demo/.prettierignore diff --git a/.prettierrc b/packages/gatsby-groq-demo/.prettierrc similarity index 100% rename from .prettierrc rename to packages/gatsby-groq-demo/.prettierrc diff --git a/packages/gatsby-groq-demo/LICENSE b/packages/gatsby-groq-demo/LICENSE new file mode 100644 index 0000000..5169a5e --- /dev/null +++ b/packages/gatsby-groq-demo/LICENSE @@ -0,0 +1,22 @@ +The MIT License (MIT) + +Copyright (c) 2015 gatsbyjs + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. + diff --git a/packages/gatsby-groq-demo/gatsby-config.js b/packages/gatsby-groq-demo/gatsby-config.js new file mode 100644 index 0000000..2cd8f1e --- /dev/null +++ b/packages/gatsby-groq-demo/gatsby-config.js @@ -0,0 +1,28 @@ + +/** + * Set up json transformers and + * configure groq plugin. + */ +module.exports = { + plugins: [ + { + resolve: 'gatsby-plugin-groq', + options: { + // Change this if you change the fragments index. + fragmentsDir: './src/fragments', + } + }, + { + resolve: 'gatsby-transformer-json', + options: { + typeName: ( { node, object, isArray } ) => node.relativeDirectory, + } + }, + { + resolve: 'gatsby-source-filesystem', + options: { + path: './src/data/', + } + }, + ], +} diff --git a/packages/gatsby-groq-demo/package.json b/packages/gatsby-groq-demo/package.json new file mode 100644 index 0000000..ea0cd3a --- /dev/null +++ b/packages/gatsby-groq-demo/package.json @@ -0,0 +1,39 @@ +{ + "name": "gatsby-groq-demo", + "private": true, + "description": "A demo to play around with gatsby-plugin-groq", + "version": "0.1.0", + "author": "Kevin McAloon ", + "dependencies": { + "@babel/parser": "^7.9.6", + "@babel/traverse": "^7.9.6", + "axios": "^0.19.2", + "chokidar": "^3.4.0", + "fs": "^0.0.1-security", + "gatsby": "^2.21.21", + "gatsby-source-filesystem": "^2.3.1", + "gatsby-transformer-json": "^2.4.1", + "glob": "^7.1.6", + "groq-js": "^0.1.5", + "normalize-path": "^3.0.0", + "prop-types": "^15.7.2", + "react": "^16.12.0", + "react-dom": "^16.13.1" + }, + "devDependencies": { + "prettier": "2.0.5" + }, + "keywords": [ + "gatsby" + ], + "license": "MIT", + "scripts": { + "build": "gatsby build", + "develop": "gatsby develop", + "format": "prettier --write \"**/*.{js,jsx,json,md}\"", + 
"start": "gatsby develop", + "serve": "gatsby serve", + "clean": "gatsby clean", + "test": "echo \"Write tests! -> https://gatsby.dev/unit-testing\" && exit 1" + } +} diff --git a/packages/gatsby-groq-demo/src/data/characters/inara.json b/packages/gatsby-groq-demo/src/data/characters/inara.json new file mode 100644 index 0000000..7c8b9a0 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/inara.json @@ -0,0 +1,9 @@ +{ + "id": "inaraId", + "firstname": "Inara", + "lastname": "Serra", + "position": "Companion", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/jayne.json b/packages/gatsby-groq-demo/src/data/characters/jayne.json new file mode 100644 index 0000000..efc1407 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/jayne.json @@ -0,0 +1,9 @@ +{ + "id": "jayneId", + "firstname": "Jayne", + "lastname": "Cobb", + "position": "Hired Gun", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/kaylee.json b/packages/gatsby-groq-demo/src/data/characters/kaylee.json new file mode 100644 index 0000000..c7b7aa7 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/kaylee.json @@ -0,0 +1,9 @@ +{ + "id": "kayleeId", + "firstname": "Kaylee", + "lastname": "Frye", + "position": "Mechanic", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/malcom.json b/packages/gatsby-groq-demo/src/data/characters/malcom.json new file mode 100644 index 0000000..f504d02 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/malcom.json @@ -0,0 +1,9 @@ +{ + "id": "malcomId", + "firstname": "Malcom", + "lastname": "Reynolds", + "position": "Captain", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/nandi.json 
b/packages/gatsby-groq-demo/src/data/characters/nandi.json new file mode 100644 index 0000000..422a84f --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/nandi.json @@ -0,0 +1,6 @@ +{ + "id": "nandiId", + "firstname": "Nandi", + "lastname": "", + "position": "Former Companion" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/niska.json b/packages/gatsby-groq-demo/src/data/characters/niska.json new file mode 100644 index 0000000..e593a06 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/niska.json @@ -0,0 +1,9 @@ +{ + "id": "niskaId", + "firstname": "Adelai", + "lastname": "Niska", + "position": "Mob Boss", + "ship": { + "_ref": "skyplexId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/river.json b/packages/gatsby-groq-demo/src/data/characters/river.json new file mode 100644 index 0000000..f291b01 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/river.json @@ -0,0 +1,9 @@ +{ + "id": "riverId", + "firstname": "River", + "lastname": "Tam", + "position": "Assassin", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/saffron.json b/packages/gatsby-groq-demo/src/data/characters/saffron.json new file mode 100644 index 0000000..bc560ac --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/saffron.json @@ -0,0 +1,6 @@ +{ + "id": "saffronId", + "firstname": "Saffron", + "lastname": "Unknown", + "position": "Sneaky Crook" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/shepherd.json b/packages/gatsby-groq-demo/src/data/characters/shepherd.json new file mode 100644 index 0000000..33d93e6 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/shepherd.json @@ -0,0 +1,9 @@ +{ + "id": "shepherdId", + "firstname": "Shepherd", + "lastname": "Book", + "position": "Shepherd", + "ship": { + "_ref":
"serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/simon.json b/packages/gatsby-groq-demo/src/data/characters/simon.json new file mode 100644 index 0000000..829028e --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/simon.json @@ -0,0 +1,9 @@ +{ + "id": "simonId", + "firstname": "Simon", + "lastname": "Tam", + "position": "Doctor", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/triumphElder.json b/packages/gatsby-groq-demo/src/data/characters/triumphElder.json new file mode 100644 index 0000000..da46788 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/triumphElder.json @@ -0,0 +1,4 @@ +{ + "id": "triumphElderId", + "position": "Village Elder" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/wash.json b/packages/gatsby-groq-demo/src/data/characters/wash.json new file mode 100644 index 0000000..93a8d43 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/wash.json @@ -0,0 +1,9 @@ +{ + "id": "washId", + "firstname": "Wash", + "lastname": "Washburne", + "position": "Pilot", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/characters/zoe.json b/packages/gatsby-groq-demo/src/data/characters/zoe.json new file mode 100644 index 0000000..a4329b1 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/characters/zoe.json @@ -0,0 +1,9 @@ +{ + "id": "zoeId", + "firstname": "Zoe", + "lastname": "Washburne", + "position": "First Mate", + "ship": { + "_ref": "serenityId" + } +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/cattle.json b/packages/gatsby-groq-demo/src/data/jobs/cattle.json new file mode 100644 index 0000000..1556efb --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/cattle.json @@ -0,0 +1,17 @@ +{ + "id": "cattleId", + "worlds": [ + { 
"_ref": "santoId" }, + { "_ref": "jiangyinId" } + ], + "client": { + "_ref": "badgerId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "jayneId" }, + { "_ref": "kayleeId" }, + { "_ref": "shepherdId" }, + { "_ref": "zoeId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/heartOfGold.json b/packages/gatsby-groq-demo/src/data/jobs/heartOfGold.json new file mode 100644 index 0000000..0df10a5 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/heartOfGold.json @@ -0,0 +1,15 @@ +{ + "id": "heartOfGoldId", + "client": { + "_ref": "nandiId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "zoeId" }, + { "_ref": "jayneId" }, + { "_ref": "washId" }, + { "_ref": "shepherdId" }, + { "_ref": "simonId" }, + { "_ref": "inaraId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/hospital.json b/packages/gatsby-groq-demo/src/data/jobs/hospital.json new file mode 100644 index 0000000..5874f95 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/hospital.json @@ -0,0 +1,15 @@ +{ + "id": "hospitalId", + "worlds": [ + { "_ref": "aerialId" } + ], + "client": { + "_ref": "simonId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "zoeId" }, + { "_ref": "jayneId" }, + { "_ref": "simonId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/jaynestown.json b/packages/gatsby-groq-demo/src/data/jobs/jaynestown.json new file mode 100644 index 0000000..810894e --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/jaynestown.json @@ -0,0 +1,16 @@ +{ + "id": "jaynestownId", + "worlds": [ + { "_ref": "higginsMoonId" } + ], + "client": { + "_ref": "?" 
+ }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "jayneId" }, + { "_ref": "kayleeId" }, + { "_ref": "washId" }, + { "_ref": "simonId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/lassiter.json b/packages/gatsby-groq-demo/src/data/jobs/lassiter.json new file mode 100644 index 0000000..2f09a80 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/lassiter.json @@ -0,0 +1,18 @@ +{ + "id": "lassiterId", + "worlds": [ + { "_ref": "bellerophonId" } + ], + "client": { + "_ref": "saffronId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "saffronId" }, + { "_ref": "kayleeId" }, + { "_ref": "zoeId" }, + { "_ref": "jayneId" }, + { "_ref": "washId" }, + { "_ref": "inaraId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/spaceSalvage.json b/packages/gatsby-groq-demo/src/data/jobs/spaceSalvage.json new file mode 100644 index 0000000..38159a5 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/spaceSalvage.json @@ -0,0 +1,12 @@ +{ + "id": "spaceSalvageId", + "worlds": [], + "client": { + "_ref": "badgerId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "jayneId" }, + { "_ref": "washId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/trainHeist.json b/packages/gatsby-groq-demo/src/data/jobs/trainHeist.json new file mode 100644 index 0000000..9b1e590 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/jobs/trainHeist.json @@ -0,0 +1,14 @@ +{ + "id": "trainHeistId", + "worlds": [ + { "_ref": "reginaId" } + ], + "client": { + "_ref": "niskaId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "jayneId" }, + { "_ref": "zoeId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/jobs/triumphBandits.json b/packages/gatsby-groq-demo/src/data/jobs/triumphBandits.json new file mode 100644 index 0000000..04c1b3a --- /dev/null +++ 
b/packages/gatsby-groq-demo/src/data/jobs/triumphBandits.json @@ -0,0 +1,14 @@ +{ + "id": "triumphBanditsId", + "worlds": [ + { "_ref": "triumphId" } + ], + "client": { + "_ref": "niskaId" + }, + "crew": [ + { "_ref": "malcomId" }, + { "_ref": "jayneId" }, + { "_ref": "zoeId" } + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/ships/serenity.json b/packages/gatsby-groq-demo/src/data/ships/serenity.json new file mode 100644 index 0000000..ee7219f --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/ships/serenity.json @@ -0,0 +1,12 @@ +{ + "id": "serenityId", + "name": "Serenity", + "type": "Firefly", + "strengths": [ + "Large cargo bay", + "Plenty of hiding spots" + ], + "weaknesses": [ + "Compression coils" + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/ships/skyplex.json b/packages/gatsby-groq-demo/src/data/ships/skyplex.json new file mode 100644 index 0000000..962a769 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/ships/skyplex.json @@ -0,0 +1,11 @@ +{ + "id": "skyplexId", + "name": "Niska's Skyplex", + "type": "Space Station", + "strengths": [ + "Heavily guarded" + ], + "weaknesses": [ + "Surprisingly easy to take over" + ] +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/ariel.json b/packages/gatsby-groq-demo/src/data/worlds/ariel.json new file mode 100644 index 0000000..0d7c580 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/ariel.json @@ -0,0 +1,5 @@ +{ + "id": "arielId", + "name": "Ariel", + "location": "Core" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/bellerophon.json b/packages/gatsby-groq-demo/src/data/worlds/bellerophon.json new file mode 100644 index 0000000..c7f255b --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/bellerophon.json @@ -0,0 +1,5 @@ +{ + "id": "bellerophonId", + "name": "Bellerophon", + "location": "Core" +} \ No newline at end of file diff --git
a/packages/gatsby-groq-demo/src/data/worlds/higgins.json b/packages/gatsby-groq-demo/src/data/worlds/higgins.json new file mode 100644 index 0000000..87d87c7 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/higgins.json @@ -0,0 +1,5 @@ +{ + "id": "higginsMoonId", + "name": "Higgins", + "location": "Border" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/jiangyin.json b/packages/gatsby-groq-demo/src/data/worlds/jiangyin.json new file mode 100644 index 0000000..48f3178 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/jiangyin.json @@ -0,0 +1,5 @@ +{ + "id": "jiangyinId", + "name": "Jiangyin", + "location": "Border" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/regina.json b/packages/gatsby-groq-demo/src/data/worlds/regina.json new file mode 100644 index 0000000..edb3da0 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/regina.json @@ -0,0 +1,5 @@ +{ + "id": "reginaId", + "name": "Regina", + "location": "Border" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/santo.json b/packages/gatsby-groq-demo/src/data/worlds/santo.json new file mode 100644 index 0000000..7922ba7 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/santo.json @@ -0,0 +1,5 @@ +{ + "id": "santoId", + "name": "Santo", + "location": "Core" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/data/worlds/triumph.json b/packages/gatsby-groq-demo/src/data/worlds/triumph.json new file mode 100644 index 0000000..11b27e9 --- /dev/null +++ b/packages/gatsby-groq-demo/src/data/worlds/triumph.json @@ -0,0 +1,5 @@ +{ + "id": "triumphId", + "name": "Triumph", + "location": "Border" +} \ No newline at end of file diff --git a/packages/gatsby-groq-demo/src/fragments/index.js b/packages/gatsby-groq-demo/src/fragments/index.js new file mode 100644 index 0000000..c53d473 --- /dev/null +++ 
b/packages/gatsby-groq-demo/src/fragments/index.js @@ -0,0 +1,11 @@ +/** + * Put all of your GROQ "fragments" here! + */ + +exports.getWorldsJobs = ` + * [ internal.type == "jobs" && ^.id in worlds[]._ref ] { + client, + crew, + worlds, + } +`; \ No newline at end of file diff --git a/src/images/gatsby-astronaut.png b/packages/gatsby-groq-demo/src/images/gatsby-astronaut.png similarity index 100% rename from src/images/gatsby-astronaut.png rename to packages/gatsby-groq-demo/src/images/gatsby-astronaut.png diff --git a/src/images/gatsby-icon.png b/packages/gatsby-groq-demo/src/images/gatsby-icon.png similarity index 100% rename from src/images/gatsby-icon.png rename to packages/gatsby-groq-demo/src/images/gatsby-icon.png diff --git a/src/pages/404.js b/packages/gatsby-groq-demo/src/pages/404.js similarity index 100% rename from src/pages/404.js rename to packages/gatsby-groq-demo/src/pages/404.js diff --git a/packages/gatsby-groq-demo/src/pages/index.js b/packages/gatsby-groq-demo/src/pages/index.js new file mode 100644 index 0000000..f11fefd --- /dev/null +++ b/packages/gatsby-groq-demo/src/pages/index.js @@ -0,0 +1,37 @@ +import React from 'react'; +import { useGroqQuery } from '../../plugins/gatsby-plugin-groq'; + +import { getWorldsJobs } from '../fragments'; + + + +export const groqQuery = `{ + "worlds": *[ internal.type == "worlds" ] { + _id, + name, + "jobs": ${getWorldsJobs} + } +}`; +const IndexPage = ( { pageContext } ) => { + + const { data } = pageContext; + const awesomeCharacters = useGroqQuery( ` + *[ internal.type == "characters" ] { + ... + } + ` ); + + console.log( 'Worlds', data.worlds ); + console.log( 'Awesome Characters', awesomeCharacters ); + + return( + +
+    <div>
+      Try adding a groqQuery export to this page!
+    </div>
+ + ) + +} + +export default IndexPage diff --git a/plugins/gatsby-plugin-groq/gatsby-node.js b/plugins/gatsby-plugin-groq/gatsby-node.js deleted file mode 100644 index 230ca9f..0000000 --- a/plugins/gatsby-plugin-groq/gatsby-node.js +++ /dev/null @@ -1,539 +0,0 @@ -const axios = require( 'axios' ); -const fs = require( 'fs' ); -const glob = require( 'glob' ); -const murmurhash = require( './murmur' ); -const normalizePath = require( 'normalize-path' ); -const parser = require( '@babel/parser' ); -const path = require( 'path' ); -const gatsbyReporter = require( 'gatsby-cli/lib/reporter' ); -const traverse = require( '@babel/traverse' ).default; -const { watch } = require( 'chokidar' ); -const { runQuery } = require( './index' ); - - -// Will make all this prettier once built out as plugin. -// Right now everything depends on specific directory structure. -const ROOT = process.env.INIT_CWD; -const GROQ_DIR = process.env.NODE_ENV === 'development' ? `${ROOT}/.cache/groq` : `${ROOT}/public/static/groq`; - - -/** - * Here's where we extract and run all initial queries. - * Also sets up a watcher to re-run queries during dev when files change.fcache - * - * Runs in right after schema creation and before createPages. - */ -exports.resolvableExtensions = async ( { graphql, actions, cache, getNodes, traceId, store }, plugin ) => { - - const reporter = new Reporter(); - - // Ugly. - if( ! fs.existsSync( GROQ_DIR ) ) { - fs.mkdirSync( GROQ_DIR ); - } - - // Cache fragments. - const fragmentsDir = !! plugin.fragmentsDir ? path.join( ROOT, plugin.fragmentsDir ) : null; - - if( !! fragmentsDir ) { - cacheFragments( fragmentsDir, cache ); - } - - // Extract initial queries. - const intitialNodes = getNodes(); - - extractQueries( { nodes: intitialNodes, traceId, cache } ); - - - // For now watching all files to re-extract queries. - // Right now there doesn't seem to be a way to watch for build updates using Gatsby's public node apis. 
- // Created a ticket in github to explore option here. - const watcher = watch( `${ROOT}/src/**/*.js` ); - - watcher.on( 'change', async ( filePath ) => { - - // Recache if this was a change within fragments directory. - if( !! fragmentsDir && filePath.includes( fragmentsDir ) ) { - - await cacheFragments( fragmentsDir, cache ); - - // TODO For now we need to force a refresh - axios.post( 'http://localhost:8000/__refresh' ); - - } - - // Get info for file that was changed. - const fileContents = fs.readFileSync( filePath, 'utf8' ); - - // Check if file has either page or static queries. - const pageQueryMatch = /export const groqQuery = /.exec( fileContents ); - const staticQueryMatch = /useGroqQuery/.exec( fileContents ); - if( ! pageQueryMatch && ! staticQueryMatch ) { - return; - } - - reporter.info( 'Re-processing groq queries...' ); - - // Get updated nodes to query against. - const nodes = getNodes(); - - // Run page queries. - if( pageQueryMatch ) { - - const { deletePage, createPage } = actions; - const { pages } = store.getState(); - - // First we need to reprocess the page query. - const processedPageQuery = await processFilePageQuery( filePath, nodes, cache ); - - if( !! processedPageQuery ) { - - const { fileHash: newHash, query: newQuery } = processedPageQuery; - const queryFile = `${GROQ_DIR}/${newHash}.json`; - - await cacheQueryResults( newHash, newQuery ); - - // Update all paths using this page component. - // Is this performant or should we try to leverage custom cache? - for( let [ path, page ] of pages ) { - - if( page.component !== filePath ) { - continue; - } - - reporter.info( `Updating path: ${page.path}` ); - - // Run query and inject into page context. - pageQueryToContext( { - actions, - cache, - file: queryFile, - nodes, - page - } ); - - } - - } - - } - - // Static queries. - if( ! staticQueryMatch ) { - return reporter.success( 'Finished re-processing page queries' ); - } - - try { - - // Run query and save to cache. 
- // Files using the static query will be automatically refreshed. - const { hash, json } = await processFileStaticQuery( filePath, nodes, plugin ); - await cacheQueryResults( hash, json ); - - return reporter.success( 'Finished re-processing queries' ) - - } - catch( err ) { - console.warn( err ); - } - - } ); - - -} - -/** - * Inject page query results its page. - */ -exports.onCreatePage = async ( { actions, cache, getNodes, page, traceId } ) => { - - // Check for hashed page queries for this component. - const componentPath = page.component; - const hash = murmurhash( componentPath ); - const queryFile = `${GROQ_DIR}/${hash}.json`; - - if( ! fs.existsSync( queryFile) ) { - return; - } - - // Run query and write to page context. - pageQueryToContext( { - actions, - cache, - file: queryFile, - getNodes, - page, - } ); - - -} - -/** - * Extract page and static queries from all files. - * Process and cache results. - * - * @param {Object} $0 Gatsby Node Helpers. - */ -async function extractQueries( { nodes, traceId, cache } ) { - - const reporter = new Reporter(); - - reporter.info( 'Getting files for groq extraction...' ); - - // Pattern that will be appended to searched directories. - // It will match any .js, .jsx, .ts, and .tsx files, that are not - // inside /node_modules. - const filesRegex = `*.+(t|j)s?(x)` - const pathRegex = `/{${filesRegex},!(node_modules)/**/${filesRegex}}`; - - let files = [ - path.join( ROOT, 'src' ), - ].reduce( ( merged, folderPath ) => { - - merged.push( - ...glob.sync( path.join( folderPath, pathRegex ), { - nodir: true, - } ) - ); - - return merged; - - }, [] ); - - files = files.filter( d => ! d.match( /\.d\.ts$/ ) ); - files = files.map( normalizePath ); - - // Loop through files and look for queries to extract. - for( let file of files ) { - - const pageQuery = await processFilePageQuery( file, nodes, cache ); - const staticQuery = await processFileStaticQuery( file, nodes, cache ); - - // Cache page query. 
- // This will only contain a json file of unprocessed query. - if( !! pageQuery ) { - const { fileHash, query } = pageQuery; - cacheQueryResults( fileHash, query ); - } - - // Cache static query. - // This will contain actual results of the query. - if( !! staticQuery ) { - const { hash, json } = staticQuery; - cacheQueryResults( hash, json, 'static', ); - } - - } - - reporter.info( 'Finished getting files for query extraction' ); - - -} - -/** - * Cache fragments. - * - * @param {string} fragmentsDir - * @param {Object} cache - * @return {bool} if succesfully cached. - */ -async function cacheFragments( fragmentsDir, cache ) { - - const reporter = new Reporter(); - const index = path.join( fragmentsDir, 'index.js' ); - - if( !! fs.readFileSync( index ) ) { - - delete require.cache[ require.resolve( index ) ]; - - fragments = require( index ); - - reporter.info( 'Caching fragments' ); - - await cache.set( 'groq-fragments', fragments ); - - return true; - - } - - return false; - -} - -/** - * Cache hash of query with fragments. - * - * @param {number} hash - * @param {Object} cache - */ -// async function cacheFragmentQueryHash( hash, cache ) { -// -// const hashes = await cache.get( 'groq-fragment-queries' ) || []; -// -// if( !! hashes[hash] ) { -// return; -// } -// -// hashes.push( hash ); -// -// await cache.set( 'groq-fragment-queries', hashes ); -// -// } - -/** -* Run page query and update the related page via createPage. -* -* @param {Object} $0 Gatsby Node Helpers. -*/ -async function pageQueryToContext( { actions, cache, file, getNodes, nodes, page, } ) { - - const { createPage, deletePage, setPageData } = actions; - - // Get query content. - const content = fs.readFileSync( file, 'utf8' ); - let { unprocessed: query } = JSON.parse( content ); - - // Replace any variables within query with context values. 
- if( page.context ){ - - for( let [ key, value ] of Object.entries( page.context ) ) { - - const search = `\\$${key}`; - const pattern = new RegExp( search, 'g' ); - query = query.replace( pattern, `"${value}"` ); - - } - } - - // Do the thing. - const allNodes = nodes || getNodes(); - const fragments = await cache.get( 'groq-fragments' ); - const results = await runQuery( query, allNodes, { fragments } ); - - page.context.data = results; - - deletePage( page ); - createPage( { - ...page, - } ); - -} - -/** - * Extracts page query from file and returns its hash and unprocessed string. - * - * @param {string} file - * @param {map} nodes - * @param {Object} cache - * @return {Object} fileHash and query - */ -async function processFilePageQuery( file, nodes, cache ) { - - const contents = fs.readFileSync( file, 'utf8' ); - const match = /export const groqQuery = /.exec( contents ); - if( ! match ) { - return; - } - - try { - - const ast = parser.parse( contents, { - errorRecovery: true, - plugins: [ 'jsx' ], - sourceFilename: file, - sourceType: 'module', - } ); - // const fragments = await cache.get( 'groq-fragments' ); - // let fragmentStrings = []; - // let fragmentFunctions = {}; - let pageQuery = null; - let queryStart = null; - let queryEnd = null; - - traverse( ast, { - ExportNamedDeclaration: function( path ) { - - const declarator = path.node.declaration.declarations[0]; - - if( declarator.id.name === 'groqQuery' ) { - - queryStart = declarator.init.start; - queryEnd = declarator.init.end; - pageQuery = contents.substring( queryStart, queryEnd ); - //pageQuery = declarator.init.quasis[0].value.raw; - - // if( declarator.init.expressions.length ) { - // for( let expression of declarator.init.expressions ) { - - // Process string variable - // if( expression.type === 'Identifier' ) { - // - // const variableName = expression.name; - // if( !! 
fragments[variableName] ) { - // fragmentStrings[variableName] = fragments[variableName]; - // } - // - // } - - // Process function variable. - // if( expression.type === 'CallExpression' ) { - // - // const { callee: { name }, arguments } = expression; - // let args = [] - // - // if( !! arguments.length ) { - // for( let { value } of arguments ) { - // args.push( value ); - // } - // } - - - // const functionName = fragments[name]; - // - // if( !! functionName ) { - // fragmentFunctions[name] = functionName( ...args ); - // } - // - // } - - // } - // } - } - - } - } ); - - if( ! pageQuery ) { - return; - } - - const hash = hashQuery( file ); - - return { - fileHash: hash, - query: JSON.stringify( { unprocessed: pageQuery } ), - } - } - catch( err ) { - console.warn( err ); - return null; - } - -} - -/** - * Extracts static query from file and returns its hash and result. - * - * @param {string} file - * @param {map} nodes - * @param {Options} plugin - * @return {Object} hash and query - */ -async function processFileStaticQuery( file, nodes, plugin ) { - - const contents = fs.readFileSync( file, 'utf8' ); - const match = /useGroqQuery/.exec( contents ); - - if( ! match ) { - return; - } - - try { - - const ast = parser.parse( contents, { - errorRecovery: true, - plugins: [ 'jsx' ], - sourceFilename: file, - sourceType: 'module', - } ); - let staticQuery = null; - - traverse( ast, { - CallExpression: function( path ) { - - if( !! path.node.callee && path.node.callee.name === 'useGroqQuery' ) { - - staticQuery = path.node.arguments[0].quasis[0].value.raw; - - } - } - } ); - if( ! staticQuery ) { - return null; - } - - const hash = hashQuery( staticQuery ); - const result = await runQuery( staticQuery, nodes ); - const json = JSON.stringify( result ); - - return { hash, json }; - - } - catch( err ) { - console.warn( err ); - return null; - } - - - -} - -/** - * Cache result from query extraction. - * For page queries this caches the query itself. 
- * For static queries this caches the results of the query. - * - * @param {number} hash Hash to the json file. - * @param {Object|string} data Data we are caching. - * @param {string} type Page or static query. Optional. Default 'page' - */ -async function cacheQueryResults( hash, data, type = 'page' ) { - - const reporter = new Reporter(); - reporter.info( `Caching ${type} query: ${hash}` ); - - const json = typeof data !== 'string' ? JSON.stringify( data ) : data; - - if( process.env.NODE_ENV === 'development' ) { - - // Probably a more sophisticated Gatsby way of doing this. - fs.writeFileSync( `${GROQ_DIR}/${hash}.json`, json, err => { - if( err ) { - throw new Error( err ); - } - } ); - - } - else { - - fs.writeFileSync( `${GROQ_DIR}/${hash}.json`, json, err => { - if( err ) { - throw new Error( err ); - } - } ); - - } - -} - -/** - * Generate a hash based on the query. - * - * @param {string} query - * @return {number} - */ -function hashQuery( query ) { - return murmurhash( query ); -} - -/** - * Custom reporter. - */ -function Reporter() { - - this.info = msg => gatsbyReporter.info( `[groq] ${msg}` ); - this.warning = msg => gatsbyReporter.warning( `[groq] ${msg}` ); - this.error = msg => gatsbyReporter.error( `[groq] ${msg}` ); - -} - diff --git a/plugins/gatsby-plugin-groq/murmur.js b/plugins/gatsby-plugin-groq/murmur.js deleted file mode 100644 index c5225d5..0000000 --- a/plugins/gatsby-plugin-groq/murmur.js +++ /dev/null @@ -1,82 +0,0 @@ -// murmurhash2 via https://gist.github.com/raycmorgan/588423 - -/** - * Murmurhash. 
- * - * @param {string} str - * @param {string} seed - * @return {number} - */ -module.exports = ( str, seed = 'abc' ) => { - - var m = 0x5bd1e995; - var r = 24; - var h = seed ^ str.length; - var length = str.length; - var currentIndex = 0; - - while (length >= 4) { - var k = UInt32(str, currentIndex); - - k = Umul32(k, m); - k ^= k >>> r; - k = Umul32(k, m); - - h = Umul32(h, m); - h ^= k; - - currentIndex += 4; - length -= 4; - } - - switch (length) { - case 3: - h ^= UInt16(str, currentIndex); - h ^= str.charCodeAt(currentIndex + 2) << 16; - h = Umul32(h, m); - break; - - case 2: - h ^= UInt16(str, currentIndex); - h = Umul32(h, m); - break; - - case 1: - h ^= str.charCodeAt(currentIndex); - h = Umul32(h, m); - break; - } - - h ^= h >>> 13; - h = Umul32(h, m); - h ^= h >>> 15; - - return h >>> 0; - } - - function UInt32(str, pos) { - return (str.charCodeAt(pos++)) + - (str.charCodeAt(pos++) << 8) + - (str.charCodeAt(pos++) << 16) + - (str.charCodeAt(pos) << 24); - } - - function UInt16(str, pos) { - return (str.charCodeAt(pos++)) + - (str.charCodeAt(pos++) << 8); - } - - function Umul32(n, m) { - n = n | 0; - m = m | 0; - var nlo = n & 0xffff; - var nhi = n >>> 16; - var res = ((nlo * m) + (((nhi * m) & 0xffff) << 16)) | 0; - return res; - } - - function getBucket(str, buckets) { - var hash = doHash(str, str.length); - var bucket = hash % buckets; - return bucket; - } \ No newline at end of file diff --git a/plugins/gatsby-plugin-groq/package.json b/plugins/gatsby-plugin-groq/package.json deleted file mode 100644 index ee2784e..0000000 --- a/plugins/gatsby-plugin-groq/package.json +++ /dev/null @@ -1,15 +0,0 @@ -{ - "name": "gatsby-plugin-groq", - "private": true, - "description": "Hi", - "version": "0.1.0", - "author": "Kevin McAloon ", - "keywords": [ - "gatsby" - ], - "license": "MIT", - "repository": { - "type": "git", - "url": "" - } -} diff --git a/src/fragments/index.js b/src/fragments/index.js deleted file mode 100644 index d12d3a7..0000000 --- 
a/src/fragments/index.js +++ /dev/null @@ -1,25 +0,0 @@ -/** - * Put all of your GROQ "fragments" here! - */ - -exports.demoString = ` - _id, - title, - content -`; - -exports.demoFunction = num => { - - if( num === 2 ) { - return(` - _id, - title - `); - } - else { - return(` - _id - `); - } - -} \ No newline at end of file diff --git a/src/pages/index.js b/src/pages/index.js deleted file mode 100644 index d588de5..0000000 --- a/src/pages/index.js +++ /dev/null @@ -1,17 +0,0 @@ -import React from 'react'; - - -const IndexPage = ( { pageContext } ) => { - - - return( - -
-    <div>
-      Go to a page
-    </div>
- - ) - -} - -export default IndexPage diff --git a/src/templates/Page.js b/src/templates/Page.js deleted file mode 100644 index c8519a5..0000000 --- a/src/templates/Page.js +++ /dev/null @@ -1,25 +0,0 @@ -import React from 'react'; - -import { demoFunction, demoString } from '../fragments'; - - -export const groqQuery = ` - *[ _type == "post" && _id == $_id ] { - ... - }[0] -`; -export const Page = ( { pageContext } ) => { - - const { data } = pageContext; - - console.log( data ); - - return( - -
-    <div>
-      Try to add a groqQuery export to this page!
-    </div>
- ) - -} -export default Page; \ No newline at end of file