<aside> ⭐ We are constantly improving our product with new features and bug fixes. Join our community Slack and follow us on Twitter or LinkedIn to stay updated. For questions and feedback, feel free to reach out at support@vessl.ai.

</aside>

Feb 24, 2023 (v0.17.5)

🔀 GitLab integration

We’ve completed the integration with the Git trinity: you can now add GitLab integration in addition to GitHub and Bitbucket.


🏢 VESSL for Enterprise

We’ve added a new section to our landing page. Take a peek at how our Enterprise customers like Hyundai Motors use VESSL.


🔧 Improvements & fixes


Feb 3, 2023 (v0.17.4)

🔀 Connect Public Git repos

You can now connect public GitHub, GitLab, and Bitbucket repositories to VESSL. It’s as simple as copying and pasting the repository’s .git URL into your experiment launch page. Try it out yourself with our public GitHub repo containing example code: 🔗  VESSL GitHub
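For illustration, a public .git URL looks like the one below. The examples repository path shown is an assumption about where our example code currently lives; any public GitHub, GitLab, or Bitbucket .git URL works the same way.

# The repository path below is an assumption; substitute any public .git URL.
git clone https://github.com/vessl-ai/examples.git

Pasting the same .git URL into the experiment launch page connects the repository to your experiment.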


🐳 Improved managed Docker images

We are making sure that our managed Docker images stay updated with the latest dependencies. This means you have one less thing to worry about when you set up your runtime & dev environment for your latest model.


🎈 Community update

The latest on our blog: read how VESSL is helping researchers at Seoul National University pursue the latest multidisciplinary, AI-enabled research like fastMRI.

Seoul National University accelerates ML for MRI research with an open competition using VESSL

Did you get our January newsletter? Sign up for an account on VESSL to receive a monthly newsletter with tips on machine learning, MLOps, and more.

VESSL January Newsletter: On-prem integrations & Hybrid cloud for ML


Nov 25, 2022 (v0.17.4)

🔌 Improved on-premise GPU cluster integration

Integrating on-premise GPU clusters is easier than ever with our latest update. All it takes is a single one-line command. The command below checks for and installs all the dependencies, such as Docker, Helm, and Kubernetes, and makes connecting your machines a breeze.

curl -sSLf https://install.dev.vssl.ai | sudo bash -s -- --role=controller

If you’d like to try out the integration on your laptop, we’ve prepared an even more intuitive command. It lets you use your personal Linux device as a single-node machine, so you can run workloads more easily with all metadata tracked by VESSL.

vessl cluster create --name '[CLUSTER_NAME]' --mode single

First, explore what you can achieve by integrating your personal laptop. Then expand your integration by running our single-line curl command on your GPU cluster. Refer to our tutorial for more details.


Hybrid Cloud for ML, Simplified