
Automatic re-indexing on deploy #125

Open · wants to merge 3 commits into main
45 changes: 41 additions & 4 deletions .github/workflows/deploy.yml
@@ -11,12 +11,11 @@ permissions:
   pages: write
   id-token: write
 
-concurrency:
-  group: "pages"
-  cancel-in-progress: false
-
 jobs:
   deploy:
+    concurrency:
+      group: "pages"
+      cancel-in-progress: false
     environment:
       name: github-pages
       url: ${{ steps.deployment.outputs.page_url }}
@@ -55,3 +54,41 @@ jobs:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v2

  # NB: We use a separate job so we don't hog the single spot dedicated to building/deploying
  # (since we set `concurrency` on that job). As more things merge, they can start building/deploying
  # while this spins.
  reindex:
    runs-on: ubuntu-22.04
    needs: deploy
    steps:
      - name: Wait for deployment to propagate
        run: |
          TIMEOUT_S=300
          SLEEP_S=5

          while [ $TIMEOUT_S -gt 0 ]; do
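            # The deployed site's footer commit link embeds the full SHA (see the
            # docusaurus.config.js change below), so the SHA appearing in the HTML
            # means the new deploy is live.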
            if curl -s http://www.pantsbuild.org | grep -q "$GITHUB_SHA"; then
              echo "Found ref! Continuing on."
              break
            fi

            echo "Ref not found yet, sleeping for $SLEEP_S seconds"
            sleep $SLEEP_S
            TIMEOUT_S=$((TIMEOUT_S-SLEEP_S))
          done

          if [ $TIMEOUT_S -le 0 ]; then
            echo "TIMEOUT_S reached, failing!"
            echo "::error::Timeout waiting for deploy"
            exit 1
          fi

      # See https://www.algolia.com/doc/rest-api/crawler/#reindex-with-a-crawler
      - name: Kickoff a crawl
        run: |
          curl \
            -H "Content-Type: application/json" \
            -X POST \
            --user ${{ secrets.THEJCANNON_ALGOLIA_CRAWLER_USER_ID }}:${{ secrets.THEJCANNON_ALGOLIA_CRAWLER_API_KEY }} \
            https://crawler.algolia.com/api/1/crawlers/7ae90af1-f627-4806-a2cc-89e7157daa44/reindex

Contributor:
Btw, does someone else have access to this account? E.g. something in the 1password vault that supposedly exists / associated with @WorkerPants's email?

Member (Author):
I invited Benjy. If you want, I can invite you as well.

Inviting WorkerPants might also work in addition. I'll try it out.

Contributor:
Did it work?

Member (Author):
Benjy says he didn't see the email 😓

Member (Author):
It says the invitation was sent. Let's see if Benjy can get logged in as WorkerPants via GitHub SSO and see the app.

Member (Author):
OK, Benjy got logged in, but oddly the crawler(s) seem to be per-user. Will keep banging on it.
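For reference, the two steps above can be exercised by hand with a standalone script like the following sketch. The polling loop and the crawler URL are taken verbatim from the diff; `CRAWLER_USER_ID` and `CRAWLER_API_KEY` are stand-in variable names for the repository secrets, and `GITHUB_SHA` must be set to the full 40-character SHA of the commit you expect to be live.

```bash
#!/usr/bin/env bash
# Sketch: run the "wait for propagation" + "kick off a crawl" steps locally.
# CRAWLER_USER_ID / CRAWLER_API_KEY stand in for the repo's Algolia secrets;
# GITHUB_SHA is the full SHA of the commit the deploy should contain.
set -eu

TIMEOUT_S=300
SLEEP_S=5

while [ "$TIMEOUT_S" -gt 0 ]; do
  if curl -s http://www.pantsbuild.org | grep -q "$GITHUB_SHA"; then
    echo "Found ref! Continuing on."
    break
  fi
  echo "Ref not found yet, sleeping for $SLEEP_S seconds"
  sleep "$SLEEP_S"
  TIMEOUT_S=$((TIMEOUT_S - SLEEP_S))
done

if [ "$TIMEOUT_S" -le 0 ]; then
  echo "Timed out waiting for the deploy to go live" >&2
  exit 1
fi

# See https://www.algolia.com/doc/rest-api/crawler/#reindex-with-a-crawler
curl \
  -H "Content-Type: application/json" \
  -X POST \
  --user "$CRAWLER_USER_ID:$CRAWLER_API_KEY" \
  https://crawler.algolia.com/api/1/crawlers/7ae90af1-f627-4806-a2cc-89e7157daa44/reindex
```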
2 changes: 2 additions & 0 deletions docusaurus.config.js
@@ -40,6 +40,8 @@ const formatCopyright = () => {
 
   // Only set by CI, so fallback to just `local` for local dev
   const docsCommit = process.env.GITHUB_SHA;
+  // NB: The full SHA is grepped by our deployment script to know when the site has been updated "live"
+  // so it can trigger a reindex of the crawler.
   const commitLink = docsCommit
     ? makeLink(`${repoUrl}/commit/${docsCommit}`, docsCommit.slice(0, 6))
     : "local";
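Since the footer's commit link href is `${repoUrl}/commit/<full SHA>`, a quick way to see which commit the live site is currently serving is a one-liner like the following (a sketch; it assumes the href survives verbatim into the rendered HTML):

```bash
# Print the commit SHA the live site currently embeds in its footer link.
# Assumes the rendered HTML contains a literal "/commit/<40-hex-sha>" href.
curl -s http://www.pantsbuild.org | grep -oE '/commit/[0-9a-f]{40}' | head -n1
```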