❄️
Snowflake
Data Warehouse · Database & Storage

Install Command
npx clawhub@latest install snowflake
Installation Guide
1. Check Environment
Make sure Node.js 22+ and OpenClaw are installed. Run openclaw --version in your terminal to verify.
2. Run Installation
Run the install command above in your terminal. ClawHub will automatically download and install Snowflake into the ~/.openclaw/skills/ directory.
3. Verify Installation
Run openclaw skills list and confirm that Snowflake appears in the list of installed skills.
4. Configure (Optional)
Follow the configuration instructions in the description below to add skill settings to ~/.config/openclaw/openclaw.json5.
Manual Installation: Copy the Skill folder to ~/.openclaw/skills/ or the skills/ directory in your project root. Make sure the folder contains a SKILL.md file.
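The manual copy can also be scripted. A minimal sketch, assuming the Skill folder has already been unpacked locally — the source path "snowflake-skill" is a placeholder, and the fabricated SKILL.md below is only there to make the sketch self-contained:

```python
import pathlib
import shutil

# Placeholder source: wherever you unpacked the Snowflake Skill folder.
src = pathlib.Path("snowflake-skill")
dest = pathlib.Path.home() / ".openclaw" / "skills" / "snowflake"

# For illustration only: fabricate a minimal Skill folder to copy.
src.mkdir(exist_ok=True)
(src / "SKILL.md").write_text("# Snowflake\n")

# Copy the whole folder into the user-level skills directory.
dest.parent.mkdir(parents=True, exist_ok=True)
shutil.copytree(src, dest, dirs_exist_ok=True)

# A valid Skill folder must contain a SKILL.md file.
assert (dest / "SKILL.md").exists()
print("installed:", dest)
```

In a real install you would skip the fabrication step and point src at the actual downloaded Skill folder.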
- SQL data warehouse querying
- Database and schema browsing
- Cortex AI intelligent analytics
Detailed Description
The Snowflake MCP server lets an AI assistant connect directly to Snowflake data warehouses, execute SQL queries and data analysis, and leverage Snowflake Cortex AI features for intelligent data exploration.
Core Features
- SQL Query (run_query): Execute SQL queries in Snowflake, supporting complex analytical queries, CTEs, and window functions
- Schema Browsing: List available databases, schemas, and tables, view table structures and column definitions
- Data Exploration: Get table row counts, sample data, and basic statistics
- Cortex Integration: Leverage Snowflake Cortex AI for data analysis, text summarization, and semantic search
- Stage Management: Browse and manage files in Snowflake Stages
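For illustration, here is the kind of analytical SQL the run_query feature is described as supporting — a CTE feeding a window function. The table and column names (orders, customer_id, amount, order_date) are hypothetical examples, not part of this skill:

```python
# Hypothetical analytical query: rank each customer's recent orders by amount.
# Shown as a plain string; run_query would execute it against Snowflake.
QUERY = """
WITH recent AS (
    SELECT customer_id, amount, order_date
    FROM orders
    WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
)
SELECT customer_id,
       amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank
FROM recent
"""
print(QUERY)
```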
Configuration
{
  "mcpServers": {
    "snowflake": {
      "command": "npx",
      "args": ["-y", "@snowflake/mcp-server"],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PASSWORD": "your-password",
        "SNOWFLAKE_WAREHOUSE": "COMPUTE_WH"
      }
    }
  }
}
Use Cases
- Data analysis: Describe analysis needs in natural language; the AI generates and executes the SQL queries
- Report generation: Query data and generate business reports and insights
- Data quality checks: Check tables for null values, duplicates, and anomalous data
- Data modeling: Explore table relationships and optimize data model designs
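The data quality use case above can be sketched as simple query generators. A minimal, hypothetical sketch — the helper names and the example table/column names are illustrative, not part of this skill's API:

```python
def null_check_sql(table: str, column: str) -> str:
    """Build a query counting NULLs in one column (data quality check)."""
    return f"SELECT COUNT(*) AS null_count FROM {table} WHERE {column} IS NULL"

def duplicate_check_sql(table: str, key: str) -> str:
    """Build a query listing key values that appear more than once."""
    return (
        f"SELECT {key}, COUNT(*) AS n FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1"
    )

# Example table/column names are placeholders.
print(null_check_sql("orders", "customer_id"))
print(duplicate_check_sql("orders", "order_id"))
```

An assistant driving run_query could issue queries like these against each table it is asked to audit.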