# Install

Install this specific skill from the multi-skill repository:

`npx skills add yugasun/skills --skill "s3"`
# Description
Uploads generated static sites or files (like slides) to AWS S3 or S3-compatible services (MinIO, Aliyun OSS, etc.) for public access. Use this skill when the user wants to publish or deploy their content to the cloud.
# SKILL.md
```yaml
name: s3
description: Uploads generated static sites or files (like slides) to AWS S3 or S3-compatible services (MinIO, Aliyun OSS, etc.) for public access. Use this skill when the user wants to publish or deploy their content to the cloud.
metadata:
  author: Yuga Sun
  version: "2026.01.29"
  moltbot:
    emoji: "☁️"
    requires:
      bins: ["uv"]
      env: ["S3_ACCESS_KEY_ID", "S3_SECRET_ACCESS_KEY", "S3_BUCKET", "S3_CUSTOM_DOMAIN"]
    settings:
      endpointUrl:
        type: string
        label: Endpoint URL
        description: S3-compatible endpoint URL
      region:
        type: string
        label: Region
        description: S3 region
      bucket:
        type: string
        label: Default Bucket
        description: Default target bucket
      customDomain:
        type: string
        label: Custom Domain
        description: Custom domain for generated link (e.g., https://cdn.example.com)
      addressingStyle:
        type: string
        label: Addressing Style
        description: auto, virtual, or path (use virtual for Aliyun)
      signatureVersion:
        type: string
        label: Signature Version
        description: e.g. s3 or s3v4 (use s3 for Aliyun)
    primaryEnv: S3_ACCESS_KEY_ID
    install:
      - id: uv-brew
        kind: brew
        formula: uv
        bins: ["uv"]
        label: Install uv (brew)
```
## S3 Deployment Skill

### Instructions
This skill handles uploading a local directory (e.g., generated slides) to an AWS S3 bucket or any S3-compatible object storage (e.g., MinIO, Aliyun OSS, Tencent COS) for static website hosting.
- Prerequisites:
  - Ensure the user has valid credentials in their environment:
    - `S3_ACCESS_KEY_ID`
    - `S3_SECRET_ACCESS_KEY`
    - `S3_BUCKET` (optional if provided as an argument)
    - `S3_REGION` (optional)
    - `S3_ENDPOINT_URL` (required for non-AWS services like MinIO or Aliyun OSS)
    - `S3_CUSTOM_DOMAIN` (optional, overrides the generated URL)
  - Ensure `boto3` is installed in the environment. If not, install it.
  - Identify the source directory (e.g., `slides/<ppt-name>/dist/`) and the target S3 bucket name.
- Implementation:
  - Create a Python script using `boto3` to walk through the source directory.
  - Configure the S3 client with `endpoint_url` if provided, to support S3-compatible providers.
  - Upload each file, maintaining the relative path structure.
  - Crucial: a specific `ContentType` must be set for each file based on its extension (e.g., `text/html` for `.html`, `text/css` for `.css`, `image/png` for `.png`) so that the browser renders it correctly.
  - Make the objects public if the bucket policy allows or requires ACL (optional, based on bucket configuration).
  - Print the final website URL.
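The implementation steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the bundled `upload_to_s3.py` itself; the function names and the lazy `boto3` import are choices made for this sketch:

```python
import mimetypes
import os


def guess_content_type(path):
    # Map the file extension to a MIME type so the browser renders the file
    # correctly; fall back to a generic binary type for unknown extensions.
    ctype, _ = mimetypes.guess_type(path)
    return ctype or "application/octet-stream"


def upload_dir(src, bucket, prefix="", endpoint_url=None):
    import boto3  # third-party dependency; install it if missing

    # endpoint_url switches the client to an S3-compatible provider
    # (MinIO, Aliyun OSS, Tencent COS, ...); omit it for AWS.
    s3 = boto3.client("s3", endpoint_url=endpoint_url)
    for root, _, files in os.walk(src):
        for name in files:
            local = os.path.join(root, name)
            # Preserve the relative path structure under the key prefix.
            rel = os.path.relpath(local, src).replace(os.sep, "/")
            key = f"{prefix}/{rel}" if prefix else rel
            s3.upload_file(
                local, bucket, key,
                ExtraArgs={"ContentType": guess_content_type(local)},
            )
    print(f"Uploaded to s3://{bucket}/{prefix}")
```

Setting `ContentType` per file is the step that matters most here: without it, many providers default every object to `binary/octet-stream` and the browser downloads pages instead of rendering them.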
### Usage

Use the bundled script to upload files:

```shell
# Explicit bucket name
uv run {baseDir}/scripts/upload_to_s3.py "slides/my-presentation/dist" "my-bucket" --prefix "presentations/my-presentation"

# Using S3_BUCKET from environment
uv run {baseDir}/scripts/upload_to_s3.py "slides/my-presentation/dist" --prefix "presentations/my-presentation"
```
### Usage Guidelines

- Bucket Name: Check if `S3_BUCKET` is set in the environment. If not, ask the user for the S3 bucket name.
- Endpoint URL: Check whether the user is on a non-AWS provider (like MinIO or Aliyun). If so, request or verify the `endpoint_url`.
- Source Path: Default to looking for `dist` folders in `slides/` if not specified.
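The bucket-name guideline above amounts to a simple fallback chain, which could look like this minimal sketch (the function name and error message are illustrative, not taken from the bundled script):

```python
import os
import sys


def resolve_bucket(cli_bucket=None):
    # Prefer an explicit argument; otherwise fall back to the S3_BUCKET
    # environment variable, mirroring the guideline above.
    bucket = cli_bucket or os.environ.get("S3_BUCKET")
    if not bucket:
        # Neither source is set: stop and ask the user for a bucket name.
        sys.exit("No bucket specified: pass one explicitly or set S3_BUCKET")
    return bucket
```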
# Supported AI Coding Agents
This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.