More Formats
Artifact Keeper supports a wide variety of specialized package formats beyond traditional software dependencies. This guide covers ML/AI models, editor extensions, Git LFS, and generic file storage.
ML / AI
HuggingFace
Host private machine learning models using the HuggingFace Hub protocol.
Endpoint Format: /huggingface/{repo}
Repository Key: huggingface
Supported Clients: huggingface_hub, transformers
Python SDK Configuration
Install the HuggingFace Hub library:
```bash
pip install huggingface_hub
```
Configure authentication:
```python
from huggingface_hub import login

# Login with token
login(token="your_artifact_keeper_token")
```
Or set environment variables:
```bash
export HUGGING_FACE_HUB_TOKEN=your_artifact_keeper_token
export HF_ENDPOINT=https://artifacts.example.com/huggingface/main
```
Uploading Models
Upload a model to Artifact Keeper:
```python
from huggingface_hub import HfApi

api = HfApi(endpoint="https://artifacts.example.com/huggingface/main")

# Upload entire model directory
api.upload_folder(
    folder_path="./my-model",
    repo_id="myorg/my-model",
    repo_type="model",
)

# Upload single file
api.upload_file(
    path_or_fileobj="./model.safetensors",
    path_in_repo="model.safetensors",
    repo_id="myorg/my-model",
)
```
Downloading Models
Download models from your private repository:
```python
from huggingface_hub import hf_hub_download
from transformers import AutoModel

# Download specific file
model_path = hf_hub_download(
    repo_id="myorg/my-model",
    filename="model.safetensors",
    endpoint="https://artifacts.example.com/huggingface/main",
)

# Load model with transformers
model = AutoModel.from_pretrained(
    "myorg/my-model",
    endpoint="https://artifacts.example.com/huggingface/main",
)
```
Snapshots and Revisions
Work with specific model versions:
```python
from huggingface_hub import snapshot_download

# Download entire model snapshot
snapshot_download(
    repo_id="myorg/my-model",
    revision="v1.0.0",
    endpoint="https://artifacts.example.com/huggingface/main",
)
```
ML Model (Generic)
For models not using HuggingFace format, use the generic ML model endpoint.
Endpoint Format: /mlmodel/{repo}
Repository Key: mlmodel
Upload models via HTTP API:
```bash
# Upload model file
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@model.onnx" \
  -F "metadata={\"framework\":\"onnx\",\"version\":\"1.0.0\"}" \
  https://artifacts.example.com/mlmodel/main/models/my-model/1.0.0

# Download model
curl -H "Authorization: Bearer $TOKEN" \
  https://artifacts.example.com/mlmodel/main/models/my-model/1.0.0/model.onnx \
  -o model.onnx
```
Python example:
```python
import requests

# Upload
with open("model.onnx", "rb") as f:
    response = requests.post(
        "https://artifacts.example.com/mlmodel/main/models/my-model/1.0.0",
        headers={"Authorization": f"Bearer {token}"},
        files={"file": f},
        data={"metadata": '{"framework":"onnx","version":"1.0.0"}'},
    )

# Download
response = requests.get(
    "https://artifacts.example.com/mlmodel/main/models/my-model/1.0.0/model.onnx",
    headers={"Authorization": f"Bearer {token}"},
)
with open("model.onnx", "wb") as f:
    f.write(response.content)
```
Editor Extensions
VS Code Extensions
Host private VS Code extensions (also compatible with Cursor, Windsurf, and Kiro).
Endpoint Format: /vscode/{repo}
Repository Key: vscode
Aliases: cursor, windsurf, kiro
Private Marketplace Configuration
Configure VS Code to use your private extension marketplace:
{ "extensions.gallery": { "serviceUrl": "https://artifacts.example.com/vscode/main", "itemUrl": "https://artifacts.example.com/vscode/main/item" }}For Cursor, Windsurf, or Kiro, use the same configuration in their respective settings files.
Publishing Extensions
Package and publish your extension:
```bash
# Install vsce
npm install -g @vscode/vsce

# Package extension
vsce package

# Publish to Artifact Keeper
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@my-extension-1.0.0.vsix" \
  https://artifacts.example.com/vscode/main/extensions
```
Or use the VS Code publish command with custom registry:
```bash
vsce publish \
  --registry https://artifacts.example.com/vscode/main \
  --pat $TOKEN
```
Installing Private Extensions
Install extensions from your private marketplace:
```bash
# Via command line
code --install-extension myorg.my-extension

# Or search in Extensions view (if marketplace is configured)
```
JetBrains Plugins
Host private IntelliJ IDEA, PyCharm, WebStorm, and other JetBrains IDE plugins.
Endpoint Format: /jetbrains/{repo}
Repository Key: jetbrains
Custom Plugin Repository Configuration
Add your custom repository in JetBrains IDE:
- Open Settings/Preferences
- Go to Plugins
- Click the gear icon → Manage Plugin Repositories
- Add: https://artifacts.example.com/jetbrains/main/updatePlugins.xml
Or configure via idea.properties:
```properties
idea.plugins.host=https://artifacts.example.com/jetbrains/main
```
Publishing Plugins
Build and publish your plugin:
```bash
# Build plugin
./gradlew buildPlugin

# Upload to Artifact Keeper
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@build/distributions/my-plugin-1.0.0.zip" \
  https://artifacts.example.com/jetbrains/main/plugins
```
Or use the Gradle plugin:
```kotlin
publishPlugin {
    host.set("https://artifacts.example.com/jetbrains/main")
    token.set(System.getenv("ARTIFACT_KEEPER_TOKEN"))
}
```
```bash
./gradlew publishPlugin
```
Git LFS
Store large files using Git Large File Storage protocol.
Endpoint Format: /lfs/{repo}
Repository Key: gitlfs
.lfsconfig Setup
Configure Git LFS to use Artifact Keeper:
```ini
# .lfsconfig (committed to repository)
[lfs]
    url = https://artifacts.example.com/lfs/main
```
Or configure globally:
```bash
git config --global lfs.url https://artifacts.example.com/lfs/main
```
Git LFS Configuration
Install and configure Git LFS:
```bash
# Install Git LFS
git lfs install

# Set credentials
git config lfs.https://artifacts.example.com/lfs/main.access token
# Enter your Artifact Keeper token when prompted
```
Or use a credential helper:
```bash
git config --global credential.helper store
echo "https://username:token@artifacts.example.com" >> ~/.git-credentials
```
Tracking and Pushing Large Files
Track file patterns with Git LFS:
```bash
# Track large file types
git lfs track "*.psd"
git lfs track "*.zip"
git lfs track "*.bin"
git lfs track "models/*.h5"

# Verify tracked patterns
git lfs track

# Commit .gitattributes
git add .gitattributes
git commit -m "Configure Git LFS tracking"
```
Add and push large files:
```bash
# Add large file
git add large-file.bin

# Commit
git commit -m "Add large binary file"

# Push (LFS files uploaded automatically)
git push origin main
```
Cloning Repositories with LFS
Clone repositories with large files:
```bash
# Clone with LFS files
git clone https://github.com/myorg/myrepo.git

# Clone without downloading LFS files initially
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/myorg/myrepo.git

# Pull LFS files later
git lfs pull
```
Managing LFS Files
```bash
# List LFS files
git lfs ls-files

# Fetch specific files
git lfs fetch --include="*.psd"

# Prune old LFS files
git lfs prune
```
Eclipse P2
Host Eclipse IDE plugins and update sites.
Endpoint Format: /p2/{repo}
Repository Key: p2
Eclipse IDE Update Site Configuration
Add your P2 repository in Eclipse:
- Help → Install New Software
- Click “Add…”
- Enter:
  - Name: My Private Plugins
  - Location: https://artifacts.example.com/p2/main
Or configure via p2.inf:
```properties
# p2.inf
instructions.configure=\
  addRepository(type:0,location:https${#58}//artifacts.example.com/p2/main);\
  addRepository(type:1,location:https${#58}//artifacts.example.com/p2/main);
```
Publishing P2 Content
Upload update site content:
```bash
# Build P2 repository with Tycho
mvn clean package

# Upload content
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@target/repository/content.xml" \
  https://artifacts.example.com/p2/main/

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@target/repository/artifacts.xml" \
  https://artifacts.example.com/p2/main/
```
Generic
Store any file type with custom metadata using the generic storage endpoint.
Endpoint Format: /generic/{repo}
Repository Key: generic
Upload Examples
Upload files via curl:
```bash
# Upload single file
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@build-output.tar.gz" \
  https://artifacts.example.com/generic/main/builds/v1.0.0/output.tar.gz

# Upload with custom metadata
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@docs.zip" \
  -F "metadata={\"type\":\"documentation\",\"version\":\"1.0.0\",\"commit\":\"abc123\"}" \
  https://artifacts.example.com/generic/main/docs/v1.0.0/docs.zip
```
Python example:
```python
import json

import requests

# Upload file
with open("artifact.bin", "rb") as f:
    response = requests.post(
        "https://artifacts.example.com/generic/main/artifacts/v1.0.0/artifact.bin",
        headers={"Authorization": f"Bearer {token}"},
        files={"file": f},
        data={
            "metadata": json.dumps({
                "build_id": "12345",
                "platform": "linux-x64",
                "checksum": "sha256:abc123...",
            })
        },
    )

print(response.json())
```
Download Examples
Download files via curl:
```bash
# Download file
curl -H "Authorization: Bearer $TOKEN" \
  https://artifacts.example.com/generic/main/builds/v1.0.0/output.tar.gz \
  -o output.tar.gz

# Download with metadata
curl -H "Authorization: Bearer $TOKEN" \
  "https://artifacts.example.com/generic/main/builds/v1.0.0/output.tar.gz?metadata=true"
```
Python example:
```python
import requests

# Download file
response = requests.get(
    "https://artifacts.example.com/generic/main/artifacts/v1.0.0/artifact.bin",
    headers={"Authorization": f"Bearer {token}"},
)
with open("artifact.bin", "wb") as f:
    f.write(response.content)

# Get metadata
response = requests.get(
    "https://artifacts.example.com/generic/main/artifacts/v1.0.0/artifact.bin",
    headers={"Authorization": f"Bearer {token}"},
    params={"metadata": "true"},
)
metadata = response.json()
print(f"Build ID: {metadata['build_id']}")
```
Use Cases
Build Outputs:
```bash
# Store compiled binaries
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@app-linux-x64" \
  https://artifacts.example.com/generic/main/releases/v1.0.0/linux/app

# Store build artifacts
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@build-artifacts.zip" \
  https://artifacts.example.com/generic/main/builds/${BUILD_ID}/artifacts.zip
```
Documentation Bundles:
```bash
# Upload generated docs
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@docs-bundle.tar.gz" \
  https://artifacts.example.com/generic/main/docs/v1.0.0/bundle.tar.gz
```
Proprietary Formats:
```bash
# Store custom file formats
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@project.proprietary" \
  -F "metadata={\"type\":\"project\",\"tool\":\"CustomTool\",\"version\":\"3.0\"}" \
  https://artifacts.example.com/generic/main/projects/project-1/data.proprietary
```
Configuration Files:
```bash
# Versioned configuration
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@app-config.yaml" \
  https://artifacts.example.com/generic/main/configs/prod/v2/config.yaml
```
CI/CD Integration
GitHub Actions Example
```yaml
name: Upload Artifacts

on:
  push:
    tags:
      - 'v*'

jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Build
        run: make build

      - name: Upload to Generic Storage
        run: |
          curl -X POST \
            -H "Authorization: Bearer ${{ secrets.ARTIFACT_KEEPER_TOKEN }}" \
            -F "file=@dist/output.tar.gz" \
            https://artifacts.example.com/generic/main/releases/${{ github.ref_name }}/output.tar.gz

      - name: Upload ML Model
        run: |
          pip install huggingface_hub
          python -c "
          from huggingface_hub import HfApi
          api = HfApi(endpoint='https://artifacts.example.com/huggingface/main')
          api.upload_folder(folder_path='./models', repo_id='myorg/model')
          "
        env:
          HUGGING_FACE_HUB_TOKEN: ${{ secrets.ARTIFACT_KEEPER_TOKEN }}
```
Best Practices
Versioning
- Use semantic versioning for all artifacts
- Include version in file paths for immutability (see the sketch below)
- Tag releases appropriately
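As a sketch of these versioning conventions applied to the generic endpoint from the examples above (the CI_TAG variable and the semver check are illustrative assumptions, not Artifact Keeper features):
```python
import os
import re

import requests

# Hypothetical CI variable holding the release tag, e.g. "v1.2.3".
version = os.environ["CI_TAG"].lstrip("v")

# Reject anything that is not plain semantic versioning before publishing.
if not re.fullmatch(r"\d+\.\d+\.\d+", version):
    raise SystemExit(f"not a semantic version: {version}")

# Putting the version in the path gives each release its own immutable URL.
url = f"https://artifacts.example.com/generic/main/releases/v{version}/output.tar.gz"

with open("dist/output.tar.gz", "rb") as f:
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {os.environ['ARTIFACT_KEEPER_TOKEN']}"},
        files={"file": f},
    )
response.raise_for_status()
```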
Metadata
- Always include descriptive metadata
- Add build/commit information
- Include checksums for verification (see the sketch below)
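A minimal sketch of uploading with that kind of metadata, reusing the generic endpoint shown earlier; the specific metadata keys are assumptions for illustration, not fields required by Artifact Keeper:
```python
import hashlib
import json
import os
import subprocess

import requests

path = "dist/output.tar.gz"

# Checksum lets consumers verify the download later.
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Record which commit produced the artifact.
commit = subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()

metadata = {
    "type": "build-output",
    "commit": commit,
    "checksum": f"sha256:{digest}",
}

with open(path, "rb") as f:
    response = requests.post(
        "https://artifacts.example.com/generic/main/builds/v1.0.0/output.tar.gz",
        headers={"Authorization": f"Bearer {os.environ['ARTIFACT_KEEPER_TOKEN']}"},
        files={"file": f},
        data={"metadata": json.dumps(metadata)},
    )
response.raise_for_status()
```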
Organization
- Use consistent path structures
- Group related artifacts
- Separate environments (dev/staging/prod)
Security
- Use token authentication
- Restrict access by repository
- Rotate tokens regularly
- Never commit tokens to source control (see the sketch below)
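For the token guidance above, a minimal sketch that reads the token from the environment at runtime (ARTIFACT_KEEPER_TOKEN is simply the variable name used in this guide's CI example, not a required name):
```python
import os

import requests

# Read the token from the environment so it never appears in source control.
token = os.environ.get("ARTIFACT_KEEPER_TOKEN")
if not token:
    raise SystemExit("ARTIFACT_KEEPER_TOKEN is not set")

response = requests.get(
    "https://artifacts.example.com/generic/main/builds/v1.0.0/output.tar.gz",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()
```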
Next Steps
- Learn about repository management
- Configure access control
- Set up monitoring and metrics
- Explore API documentation