Currently, the CI/CD system works as follows:

- Every night, a cron job runs on an AWS machine managed by the Northeastern team that pulls the latest Bioregistry Docker container from Docker Hub.
- Once per week, the Bioregistry's CI runs an "update" job, which also mints a new release.
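The nightly refresh step might look roughly like the following crontab entry and script. This is an illustrative config sketch only; the image name, container name, port, and paths are assumptions, not the actual deployment.

```shell
# Crontab entry (assumed schedule and paths):
# 0 3 * * * /opt/bioregistry/refresh.sh >> /var/log/bioregistry-refresh.log 2>&1

# refresh.sh: pull the latest image and restart the container.
docker pull biopragmatics/bioregistry:latest   # assumed Docker Hub image name
docker rm -f bioregistry || true               # stop the old container if it exists
docker run -d --name bioregistry -p 8766:8766 biopragmatics/bioregistry:latest
```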
This has the benefit that https://bioregistry.io always corresponds to a released version, but the drawback that people expect their changes to be reflected on the site more quickly (e.g., #1411 (comment)).
I think it would be a good idea to set up a more typical CI/CD pipeline where, any time the `bioregistry.json` file is updated, we build a new Docker container from the git repository and deploy it directly.
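A minimal sketch of such a workflow could trigger on pushes that touch the registry file, build an image tagged with the commit hash, and tell the host to redeploy. All names, paths, and secrets below are assumptions for illustration, not the actual Bioregistry setup:

```yaml
# .github/workflows/deploy.yml -- illustrative sketch only
name: Build and deploy on registry change
on:
  push:
    branches: [main]
    paths:
      - src/bioregistry/data/bioregistry.json  # assumed path to the registry file
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build image tagged with the short commit hash
        run: docker build -t biopragmatics/bioregistry:${GITHUB_SHA::8} .
      - name: Push image
        run: |
          echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USER }}" --password-stdin
          docker push biopragmatics/bioregistry:${GITHUB_SHA::8}
      - name: Trigger redeploy on the host
        run: |
          # assumes SSH access to the AWS machine; a webhook would also work
          ssh ubuntu@${{ secrets.DEPLOY_HOST }} \
            "docker pull biopragmatics/bioregistry:${GITHUB_SHA::8} && sudo systemctl restart bioregistry"
```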
This would require adding the commit hash to the version for tracking purposes. It would also require somewhat more clever use of GitHub Actions to communicate with the machines hosting the Bioregistry.
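For the commit-hash tracking, one option is a PEP 440 local version label (`base+hash`). A minimal sketch, where the base version and the fallback behavior are assumptions rather than the Bioregistry's actual versioning code:

```python
import subprocess


def get_version(base: str) -> str:
    """Return the base version plus the short git commit hash as a PEP 440
    local version label (e.g. "1.2.3+abc1234").

    Falls back to the bare base version when git metadata is unavailable,
    e.g. when running from an sdist instead of a checkout.
    """
    try:
        commit = subprocess.check_output(
            ["git", "rev-parse", "--short", "HEAD"],
            text=True,
            stderr=subprocess.DEVNULL,
        ).strip()
    except (OSError, subprocess.CalledProcessError):
        return base
    return f"{base}+{commit}"
```

The `+hash` form survives `pip install` metadata, so the running site can report exactly which commit it was built from.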
@bgyori, do you think you/your team can set this up for AWS? Alternatively, I've had good experiences with Fly.io for automated CI/CD, but it's more expensive than hosted machines on AWS.
I think a nightly release is reasonably frequent, so as a first step, finding a way to go from weekly to nightly PyPI releases would be the easiest path forward. As I understand it, the only limitation is that PyPI has a project size limit. Looking at https://github.com/pypi/support/issues?q=is%3Aissue%20state%3Aclosed%20%22project%20limit%22, it seems easy to request an increase to, e.g., 30 GB; past requests seem to have been approved in 1-2 days.