chore: format everything (#87)
* chore(*): add prettier dev dep

* chore(*): run format on everything

* chore(*): regen READMEs
cabljac committed May 25, 2023
1 parent 4c5bfba commit 5a39de3
Showing 67 changed files with 653 additions and 456 deletions.
6 changes: 4 additions & 2 deletions .github/workflows/readmes-updated.yaml
@@ -28,12 +28,14 @@ jobs:
mkdir -p ~/.npm-global
npm config set prefix '~/.npm-global'
echo "::set-output name=dir::$(npm config get prefix)"
- name: Cache global dependencies
uses: actions/cache@v2
with:
path: ${{ steps.global-deps-setup.outputs.dir }}
key: ${{ runner.os }}-npm-global-deps-${{ hashFiles('**/package-lock.json') }}
key:
${{ runner.os }}-npm-global-deps-${{
hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-npm-global-deps-
1 change: 1 addition & 0 deletions .prettierignore
@@ -1,3 +1,4 @@
node_modules
lib
package-lock.json
**/*/README.md
2 changes: 1 addition & 1 deletion _emulator/dist/index.html
@@ -1 +1 @@
Testing
Testing
8 changes: 8 additions & 0 deletions _emulator/extensions/firestore-palm-chatbot.env
@@ -0,0 +1,8 @@
CANDIDATE_COUNT=1
CANDIDATES_FIELD=candidates
COLLECTION_NAME=messages
ENABLE_DISCUSSION_OPTION_OVERRIDES=no
LOCATION=us-west2
MODEL=models/chat-bison-001
PROMPT_FIELD=prompt
RESPONSE_FIELD=response
2 changes: 1 addition & 1 deletion _emulator/firebase.json
@@ -37,4 +37,4 @@
"rules": "firestore.rules",
"indexes": "firestore.indexes.json"
}
}
}
17 changes: 13 additions & 4 deletions bigquery-firestore-export/POSTINSTALL.md
@@ -2,7 +2,7 @@

1. Visit [this link](https://console.cloud.google.com/bigquery/transfers) to see Transfer Configs that have been created. A few moments after the extension has been installed and processing is complete, you should see a Transfer Config matching the Display Name that you created. Grab the Transfer Config ID.
2. View the Firebase document at `${param:COLLECTION_PATH}/{{TRANSFER_CONFIG_ID}}`. This is the metadata associated with the Transfer Config and will be updated if you change the Transfer Config-related extension parameters.
3. View the Firebase subcollection at `${param:COLLECTION_PATH}/{{TRANSFER_CONFIG_ID}}/runs`. When your first transfer run completes, you will see two documents, one with “latest” as the document ID, and another with the transfer run ID as the document ID.
3. View the Firebase subcollection at `${param:COLLECTION_PATH}/{{TRANSFER_CONFIG_ID}}/runs`. When your first transfer run completes, you will see two documents, one with “latest” as the document ID, and another with the transfer run ID as the document ID.
4. Click into `${param:COLLECTION_PATH}/{{TRANSFER_CONFIG_ID}}/runs/{{TRANSFER_RUN_ID}}`. You will see the run metadata stored in that document, and an “output” subcollection which contains the data stored in the destination table (i.e. the results of the scheduled query at that point in time).
5. Click into `${param:COLLECTION_PATH}/{{TRANSFER_CONFIG_ID}}/runs/latest`. You will see the latestRunId, runMetadata, failedRowCount, and totalRowCount fields. These are updated any time a transfer run completes, so you can attach a Firestore listener to this document to receive real-time updates.

@@ -26,19 +26,28 @@ The “latest” document for a Transfer Config will be updated every time a tra

```javascript
let latestRunId = null;
db.collection(`transferConfigs/${transferConfigId}/runs`).doc("latest").onSnapshot(doc => { if (!!doc.data()) { latestRunId = doc.data().latestRunId } });
db.collection(`transferConfigs/${transferConfigId}/runs`)
.doc('latest')
.onSnapshot(doc => {
if (!!doc.data()) {
latestRunId = doc.data().latestRunId;
}
});
```

Whenever the “latest” document updates, these fields are changed: “failedRowCount”, “totalRowCount”, “runMetadata”, and “latestRunId”. The extension uses parallel individual writes to Firestore to maximize write throughput; if some writes fail due to intermittent Firestore issues, they are counted in failedRowCount. Depending on the application, you may want to refresh the query results only if there are no write failures.
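That write-failure check can be sketched as a small guard function. This is a hypothetical helper, not part of the extension; the field names follow the “latest” document shape documented above:

```javascript
// Hypothetical guard: only refresh query results when the latest
// transfer run wrote every row to Firestore without failures.
// `latestDoc` is the plain data of the "latest" document, whose
// `failedRowCount` and `totalRowCount` fields are described above.
function shouldRefreshResults(latestDoc) {
  if (!latestDoc) return false; // "latest" document not written yet
  return latestDoc.failedRowCount === 0 && latestDoc.totalRowCount > 0;
}

// Example usage with the documented field shape:
const latest = {
  failedRowCount: 0,
  totalRowCount: 779,
  latestRunId: "648762e0-0000-28ef-9109-001a11446b2a",
};
console.log(shouldRefreshResults(latest)); // true
```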

Once you have the latestRunId, you can query the results of the query within Firestore:

```javascript
const q = db.collection(`transferConfigs/${transferConfigId}/runs/${latestRunId}/output`)
const q = db.collection(
`transferConfigs/${transferConfigId}/runs/${latestRunId}/output`
);
const results = await q.get();
```
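To turn those results into plain row objects for rendering, a helper along these lines can be used. This is a sketch assuming the Firestore JS SDK's `QuerySnapshot.docs` shape, where each doc exposes `id` and `data()`:

```javascript
// Hypothetical helper: flatten a Firestore query snapshot's docs into
// plain objects, assuming each doc exposes `id` and `data()` as in the
// Firestore JS SDK.
function rowsFromDocs(docs) {
  return docs.map((doc) => ({ id: doc.id, ...doc.data() }));
}

// Example with stand-in docs mimicking the SDK shape:
const fakeDocs = [
  { id: "row1", data: () => ({ total: 42 }) },
  { id: "row2", data: () => ({ total: 7 }) },
];
console.log(rowsFromDocs(fakeDocs));
```

With the real SDK this would be called as `rowsFromDocs(results.docs)` after awaiting the query above.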

## Uninstalling the Extension
The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.

The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.

BigQuery charges by data processed, so your project will continue to incur costs until you manually delete the scheduled query. You can manage your scheduled queries directly in [Cloud Console](https://console.cloud.google.com/bigquery/scheduled-queries).
22 changes: 11 additions & 11 deletions bigquery-firestore-export/PREINSTALL.md
@@ -2,13 +2,13 @@ This extension helps developers to set up their frontend clients to subscribe to

To use the extension, developers configure, for each query, a specific Firestore document for their frontends to listen to, along with the BigQuery table/query to execute. In the background, BigQuery will run the query on a schedule, and the extension will write the result back to the specified document. Schedules are managed as Transfer Configs using the [Data Transfer Service](https://cloud.google.com/bigquery/docs/scheduling-queries).

Upon installation, a Transfer Config is created for you via the Data Transfer Service API. This Transfer Config will be updated if you update the extension parameters for the instance.
Upon installation, a Transfer Config is created for you via the Data Transfer Service API. This Transfer Config will be updated if you update the extension parameters for the instance.

If you would like to specify multiple queries at different intervals, you can create multiple instances of the extension.

The extension will provide a Pub/Sub trigger that listens to new messages written to the specified topic, representing transfer run completion events.

The extension will parse the message to identify the correct destination table based on the runtime. It will then run a “SELECT *” query from the destination table and write the results (as JSON) to Firestore.
The extension will parse the message to identify the correct destination table based on the runtime. It will then run a “SELECT \*” query from the destination table and write the results (as JSON) to Firestore.

Each run will write to a document with ID “latest”:

@@ -19,7 +19,7 @@ DOCUMENT: {
totalRowCount: 779,
failedRowCount: 0,
latestRunId: 648762e0-0000-28ef-9109-001a11446b2a
}
}
```

Each run will also write to a “runs” subcollection with runID as the document ID, to preserve history:
@@ -37,7 +37,7 @@ Query results will be stored as individual documents in a subcollection under th

**Additional Setup**

Make sure that you've set up a [Cloud Firestore database](https://firebase.google.com/docs/firestore/quickstart) in your Firebase project.
Make sure that you've set up a [Cloud Firestore database](https://firebase.google.com/docs/firestore/quickstart) in your Firebase project.

You will also need a BigQuery instance with a dataset that contains at least one table.

@@ -49,11 +49,11 @@ You will be charged a small amount (typically around $0.01/month) for the Fireba

This extension uses other Firebase and Google Cloud Platform services, which have associated charges if you exceed the service’s no-cost tier:

* Cloud Pub/Sub
* Cloud Firestore
* BigQuery
* Cloud Functions (See [FAQs](https://firebase.google.com/support/faq#extensions-pricing))
> ⚠️ Note: The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.
>
- Cloud Pub/Sub
- Cloud Firestore
- BigQuery
- Cloud Functions (See [FAQs](https://firebase.google.com/support/faq#extensions-pricing))

> ⚠️ Note: The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.
>
> BigQuery charges by data processed, so your project will continue to incur costs until you manually delete the scheduled query. You can manage your scheduled queries directly in [Cloud Console](https://console.cloud.google.com/bigquery/scheduled-queries).
22 changes: 11 additions & 11 deletions bigquery-firestore-export/README.md
@@ -10,13 +10,13 @@

To use the extension, developers configure, for each query, a specific Firestore document for their frontends to listen to, along with the BigQuery table/query to execute. In the background, BigQuery will run the query on a schedule, and the extension will write the result back to the specified document. Schedules are managed as Transfer Configs using the [Data Transfer Service](https://cloud.google.com/bigquery/docs/scheduling-queries).

Upon installation, a Transfer Config is created for you via the Data Transfer Service API. This Transfer Config will be updated if you update the extension parameters for the instance.
Upon installation, a Transfer Config is created for you via the Data Transfer Service API. This Transfer Config will be updated if you update the extension parameters for the instance.

If you would like to specify multiple queries at different intervals, you can create multiple instances of the extension.

The extension will provide a Pub/Sub trigger that listens to new messages written to the specified topic, representing transfer run completion events.

The extension will parse the message to identify the correct destination table based on the runtime. It will then run a “SELECT *” query from the destination table and write the results (as JSON) to Firestore.
The extension will parse the message to identify the correct destination table based on the runtime. It will then run a “SELECT \*” query from the destination table and write the results (as JSON) to Firestore.

Each run will write to a document with ID “latest”:

@@ -27,7 +27,7 @@ DOCUMENT: {
totalRowCount: 779,
failedRowCount: 0,
latestRunId: 648762e0-0000-28ef-9109-001a11446b2a
}
}
```

Each run will also write to a “runs” subcollection with runID as the document ID, to preserve history:
@@ -45,7 +45,7 @@ Query results will be stored as individual documents in a subcollection under th

**Additional Setup**

Make sure that you've set up a [Cloud Firestore database](https://firebase.google.com/docs/firestore/quickstart) in your Firebase project.
Make sure that you've set up a [Cloud Firestore database](https://firebase.google.com/docs/firestore/quickstart) in your Firebase project.

You will also need a BigQuery instance with a dataset that contains at least one table.

@@ -57,13 +57,13 @@ You will be charged a small amount (typically around $0.01/month) for the Fireba

This extension uses other Firebase and Google Cloud Platform services, which have associated charges if you exceed the service’s no-cost tier:

* Cloud Pub/Sub
* Cloud Firestore
* BigQuery
* Cloud Functions (See [FAQs](https://firebase.google.com/support/faq#extensions-pricing))
> ⚠️ Note: The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.
>
- Cloud Pub/Sub
- Cloud Firestore
- BigQuery
- Cloud Functions (See [FAQs](https://firebase.google.com/support/faq#extensions-pricing))

> ⚠️ Note: The extension does not delete the BigQuery Transfer Config (scheduled query) automatically when you uninstall the extension.
>
> BigQuery charges by data processed, so your project will continue to incur costs until you manually delete the scheduled query. You can manage your scheduled queries directly in [Cloud Console](https://console.cloud.google.com/bigquery/scheduled-queries).

37 changes: 25 additions & 12 deletions bigquery-firestore-export/extension.yaml
@@ -47,11 +47,17 @@ apis:

roles:
- role: datastore.user
reason: Allows this extension to access Cloud Firestore to write query results from BQ.
reason:
Allows this extension to access Cloud Firestore to write query results
from BQ.
- role: bigquery.admin
reason: Allows this extension to create transfer configs in BQ, and query BQ destination tables.
reason:
Allows this extension to create transfer configs in BQ, and query BQ
destination tables.
- role: pubsub.admin
reason: Allows DTS to grant DTS service account permission to send notifications to Pub/Sub topic
reason:
Allows DTS to grant DTS service account permission to send notifications
to Pub/Sub topic

resources:
- name: processMessages
@@ -76,8 +82,8 @@ params:
- param: LOCATION
label: Cloud Functions location
description: >-
Where do you want to deploy the functions created for this extension?
You usually want a location close to your database. For help selecting a
Where do you want to deploy the functions created for this extension? You
usually want a location close to your database. For help selecting a
location, refer to the [location selection
guide](https://firebase.google.com/docs/functions/locations).
type: select
@@ -221,15 +227,17 @@ params:
- param: DATASET_ID
label: Dataset ID
description: >-
What's the BigQuery destination dataset you'd like to use? Each transfer run will write to a table in this destination dataset.
What's the BigQuery destination dataset you'd like to use? Each transfer
run will write to a table in this destination dataset.
type: string
example: customer_data
required: true

- param: TABLE_NAME
label: Destination Table Name
description: >-
What's the destination table name prefix you'd like to use? Each transfer run will write to the table with this name, postfixed with the runtime.
What's the destination table name prefix you'd like to use? Each transfer
run will write to the table with this name, postfixed with the runtime.
type: string
example: transactions
required: true
@@ -245,7 +253,8 @@
- param: PARTITIONING_FIELD
label: Partitioning Field
description: >-
What's the partitioning field on the destination table ID? Leave empty if not using a partitioning field.
What's the partitioning field on the destination table ID? Leave empty if
not using a partitioning field.
type: string
example: timestamp
required: false
@@ -261,7 +270,8 @@
- param: PUB_SUB_TOPIC
label: Pub Sub Topic
description: >-
What's the Pub Sub topic to write messages to when the scheduled query finishes executing?
What's the Pub Sub topic to write messages to when the scheduled query
finishes executing?
type: string
example: test
required: true
@@ -270,7 +280,8 @@
- param: COLLECTION_PATH
label: Firestore Collection
description: >-
What's the top-level Firestore Collection to store transfer configs, run metadata, and query output?
What's the top-level Firestore Collection to store transfer configs, run
metadata, and query output?
type: string
example: transferConfigs
default: transferConfigs
@@ -282,7 +293,9 @@ lifecycleEvents:
processingMessage: Creating a new transfer config and scheduling the query.
onUpdate:
function: upsertTransferConfig
processingMessage: Creating or updating the transfer config associated with this extension.
processingMessage:
Creating or updating the transfer config associated with this extension.
onConfigure:
function: upsertTransferConfig
processingMessage: Creating or updating the transfer config associated with this extension.
processingMessage:
Creating or updating the transfer config associated with this extension.
4 changes: 1 addition & 3 deletions bigquery-firestore-export/functions/tsconfig.dev.json
@@ -1,5 +1,3 @@
{
"include": [
".eslintrc.js"
]
"include": [".eslintrc.js"]
}
2 changes: 1 addition & 1 deletion firestore-palm-chatbot/POSTINSTALL.md
@@ -41,7 +41,7 @@ If the extension encounters an error, it will write an error message to the docu

If the error message is `The project or service account likely does not have access to the PaLM API`, please ensure that you have already signed up for the [waitlist](https://makersuite.google.com/waitlist) and have been approved before installing the extension.

Then, you need to add the PaLM API to your project. You can do this by going to the [PaLM API page](https://console.cloud.google.com/apis/library/language.googleapis.com) in the Google Cloud Console and clicking "Enable".
Then, you need to add the PaLM API to your project. You can do this by going to the [PaLM API page](https://console.cloud.google.com/apis/library/language.googleapis.com) in the Google Cloud Console and clicking "Enable".

## Monitoring
