Compare commits

...

10 Commits

Author SHA1 Message Date
e984bf233e ci: add cross-platform builds for Linux, macOS (Intel + ARM), and Windows
Some checks failed
CI / lint-and-test (push) Failing after 27s
CI / build (, tusk-linux-x64, ubuntu-22.04, ) (push) Has been skipped
CI / build (, tusk-windows-x64, windows-latest, ) (push) Has been skipped
CI / build (--target aarch64-apple-darwin, tusk-macos-arm64, macos-latest, aarch64-apple-darwin) (push) Has been skipped
CI / build (--target x86_64-apple-darwin, tusk-macos-x64, macos-13, x86_64-apple-darwin) (push) Has been skipped
- Add macOS Intel build (macos-13 / x86_64-apple-darwin) to CI matrix
- Add artifact upload step to CI build job
- Add release workflow triggered by v* tags with draft GitHub Release
- Add AppImage to Linux bundle targets

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 17:21:09 +03:00
34c80809f1 fix: prevent snapshot hang caused by u8 overflow in progress calculation
When creating/restoring snapshots with 4+ tables, the progress percentage
calculation overflowed u8 (e.g. 4*80=320 > 255), causing a panic that left
the IPC call unresolved. Also deduplicate fetch_foreign_keys_raw call.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 17:45:58 +03:00
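The overflow described above can be sketched as follows: compute the percentage in a wide integer type and clamp, instead of multiplying inside `u8`. Function and parameter names here are illustrative, not the app's actual identifiers.

```rust
// Hypothetical sketch of the fix: never multiply inside u8.
// With the old code, 4 tables * 80 = 320 overflowed u8 and panicked,
// leaving the IPC call unresolved.
fn progress_percent(done: u32, total: u32) -> u8 {
    if total == 0 {
        return 100; // nothing to do counts as complete
    }
    // u64 arithmetic cannot overflow for any realistic table count
    let pct = (done as u64 * 100) / total as u64;
    pct.min(100) as u8
}
```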
a3b05b0328 feat: add AI data validation, test data generator, index advisor, and snapshots
Four new killer features leveraging AI (Ollama) and PostgreSQL internals:

- Data Validation: describe quality rules in natural language, AI generates
  SQL to find violations, run with pass/fail results and sample violations
- Test Data Generator: right-click table to generate realistic FK-aware test
  data with AI, preview before inserting in a transaction
- Index Advisor: analyze pg_stat tables + AI recommendations for CREATE/DROP
  INDEX with one-click apply
- Data Snapshots: export selected tables to JSON (FK-ordered), restore from
  file with optional truncate in a transaction

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 13:27:41 +03:00
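The "FK-ordered" export order above can be sketched with a topological sort: tables that are referenced by foreign keys are emitted before the tables that point at them. This is a minimal sketch under assumed names, not the app's actual code.

```rust
use std::collections::{HashMap, VecDeque};

// Hypothetical sketch of FK ordering via Kahn's algorithm.
// `deps` maps each table to the tables it references through FKs;
// parents must be restored before their children.
fn fk_order(tables: &[&str], deps: &HashMap<&str, Vec<&str>>) -> Vec<String> {
    // indegree = how many FK parents of this table are not yet emitted
    let mut indegree: HashMap<&str, usize> =
        tables.iter().map(|t| (*t, 0)).collect();
    let mut children: HashMap<&str, Vec<&str>> = HashMap::new();
    for (&table, parents) in deps {
        for &p in parents {
            *indegree.get_mut(table).unwrap() += 1;
            children.entry(p).or_default().push(table);
        }
    }
    let mut queue: VecDeque<&str> = tables
        .iter().copied().filter(|t| indegree[t] == 0).collect();
    let mut order = Vec::new();
    while let Some(t) = queue.pop_front() {
        order.push(t.to_string());
        for &c in children.get(t).into_iter().flatten() {
            let d = indegree.get_mut(c).unwrap();
            *d -= 1;
            if *d == 0 {
                queue.push_back(c);
            }
        }
    }
    order // cyclic/self-referencing FKs would need extra handling
}
```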
d507162377 fix: harden security, reduce duplication, and improve robustness
- Fix SQL injection in data.rs by wrapping get_table_data in READ ONLY transaction
- Fix SQL injection in docker.rs CREATE DATABASE via escape_ident
- Fix command injection in docker.rs by validating pg_version/container_name
  and escaping shell-interpolated values
- Fix UTF-8 panic on stderr truncation with char_indices
- Wrap delete_rows in a transaction for atomicity
- Replace .expect() with proper error propagation in lib.rs
- Cache AI settings in AppState to avoid repeated disk reads
- Cap JSONB column discovery at 50 to prevent unbounded queries
- Fix ERD colorMode to respect system theme via useTheme()
- Extract AppState::get_pool() replacing ~19 inline pool patterns
- Extract shared AiSettingsFields component (DRY popover + sheet)
- Make get_connections_path pub(crate) and reuse from docker.rs
- Deduplicate check_docker by delegating to check_docker_internal

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 11:41:14 +03:00
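The `escape_ident` fix can be illustrated with PostgreSQL's standard identifier-quoting rule (wrap in double quotes, double any embedded quote) — useful for statements like CREATE DATABASE that cannot take bind parameters. A minimal sketch, assuming a signature of this shape:

```rust
// Hypothetical sketch of identifier escaping: quote the name and
// double embedded quotes, per PostgreSQL's quoting rules, so a
// malicious name cannot break out of the identifier position.
fn escape_ident(name: &str) -> String {
    format!("\"{}\"", name.replace('"', "\"\""))
}
```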
baa794b66a feat: fallback to ctid for editing tables without primary key
When a table has no PRIMARY KEY, use PostgreSQL's ctid (physical row ID)
to identify rows for UPDATE/DELETE operations instead of blocking edits.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 16:14:26 +03:00
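The ctid fallback can be sketched as SQL generation: with no primary key, address the row by its physical id. Note that ctid is only stable until VACUUM or an UPDATE moves the row, so it should be read and used together. Names here are illustrative, not the commit's actual code.

```rust
// Hypothetical sketch: build an UPDATE keyed on ctid when the table
// has no primary key. Identifiers are assumed to be escaped already;
// the new value and the captured ctid arrive as bind parameters.
fn update_by_ctid_sql(table: &str, column: &str) -> String {
    format!("UPDATE {table} SET {column} = $1 WHERE ctid = $2::tid")
}
```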
e76a96deb8 feat: add unified Settings sheet, MCP indicator, and Docker host config
- Add AppSettingsSheet (gear icon in Toolbar) with MCP, Docker, and AI sections
- MCP Server: toggle on/off, port config, status badge, endpoint URL with copy
- Docker: local/remote daemon selector with remote URL input
- AI: moved Ollama settings into the unified sheet
- MCP status probes actual TCP port for reliable running detection
- Docker commands respect configurable docker host (-H flag) for remote daemons
- MCP server supports graceful shutdown via tokio watch channel
- Settings persisted to app_settings.json alongside existing config files
- StatusBar shows MCP indicator (green/gray dot) with tooltip

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 09:04:12 +03:00
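The "probes actual TCP port" status check can be sketched with a plain connect attempt: if nothing accepts on the port, the MCP server is reported as stopped, which is more reliable than trusting a stored "enabled" flag. The timeout value is an assumption.

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

// Hypothetical sketch of the MCP status probe: the server counts as
// running only if its port actually accepts a TCP connection.
fn port_is_listening(port: u16) -> bool {
    let addr: SocketAddr = ([127, 0, 0, 1], port).into();
    TcpStream::connect_timeout(&addr, Duration::from_millis(200)).is_ok()
}
```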
20b00e55b0 fix: improve Docker clone reliability and log display
- Use bash with pipefail instead of sh to detect pg_dump failures in pipes
- Switch full clone from binary format (pg_dump -Fc | pg_restore) to plain
  text (pg_dump | psql) for reliable transfer through docker exec
- Add --no-owner --no-acl flags to avoid errors from missing roles
- Extract shared run_pipe_cmd helper with proper error handling
- Remove shell commands from progress events to prevent credential leaks
- Fix process log layout overflow with break-all and block-level details

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 19:41:59 +03:00
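The pipefail fix matters because `sh -c "pg_dump … | psql …"` returns only the last command's exit status, so a failing pg_dump looked like success. A sketch of invoking a pipeline through bash with pipefail (the helper name is illustrative; requires bash on the host):

```rust
use std::process::Command;

// Hypothetical sketch: run a shell pipeline so that a failure in ANY
// stage (e.g. pg_dump) fails the whole pipeline, not just the last one.
fn run_pipe(pipeline: &str) -> std::io::Result<bool> {
    let status = Command::new("bash")
        .args(["-o", "pipefail", "-c", pipeline])
        .status()?;
    Ok(status.success())
}
```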
1ce5f78de8 feat: add Clone Database to Docker functionality
Clone any database to a local Docker PostgreSQL container with schema
and/or data transfer via pg_dump. Supports three modes: schema only,
full clone, and sample data. Includes container lifecycle management
(start/stop/remove) in the Admin panel, progress tracking with
collapsible process log, and automatic connection creation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 19:27:16 +03:00
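The three clone modes could map to pg_dump flags roughly as below; this is a guess at the shape, since the actual flag set (and how "sample data" is implemented) is not shown in the commit message. The `--no-owner`/`--no-acl` flags come from the follow-up reliability fix.

```rust
// Hypothetical sketch of mode-to-flags mapping for the clone feature.
#[allow(dead_code)]
#[derive(Clone, Copy)]
enum CloneMode {
    SchemaOnly,
    Full,
    SampleData,
}

fn dump_args(mode: CloneMode) -> Vec<&'static str> {
    // --no-owner/--no-acl avoid errors when the source's roles don't
    // exist in the fresh Docker container.
    let mut args = vec!["--no-owner", "--no-acl"];
    if matches!(mode, CloneMode::SchemaOnly) {
        args.push("--schema-only");
    }
    args
}
```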
f68057beef fix: remove duplicate app name from toolbar
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-15 16:38:57 +03:00
94df94db7c feat: add ER diagram and enhance TableStructure with FK details, triggers, comments
- Add interactive ER diagram with ReactFlow + dagre auto-layout, accessible
  via right-click context menu on schema nodes in the sidebar
- Enhance TableStructure: column comments, FK referenced table/columns,
  ON UPDATE/DELETE rules, new Triggers tab
- Backend: rewrite get_table_constraints using pg_constraint for proper
  composite FK support, add get_table_triggers and get_schema_erd commands

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-15 16:37:38 +03:00
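A pg_constraint query with proper composite-FK support might look like the following: unnesting `conkey`/`confkey` in parallel yields one row per column pair, in key order, which per-column lookups on `conkey[1]` alone would miss. This is a sketch, not the commit's actual SQL.

```rust
// Hypothetical sketch of a composite-FK-aware constraint query.
// unnest(conkey, confkey) WITH ORDINALITY pairs local and referenced
// column numbers positionally.
const FK_COLUMNS_SQL: &str = r#"
SELECT c.conname,
       a.attname                   AS column_name,
       c.confrelid::regclass::text AS referenced_table,
       fa.attname                  AS referenced_column,
       c.confupdtype, c.confdeltype
FROM pg_constraint c
CROSS JOIN LATERAL unnest(c.conkey, c.confkey)
     WITH ORDINALITY AS k(attnum, fattnum, ord)
JOIN pg_attribute a  ON a.attrelid  = c.conrelid  AND a.attnum  = k.attnum
JOIN pg_attribute fa ON fa.attrelid = c.confrelid AND fa.attnum = k.fattnum
WHERE c.contype = 'f' AND c.conrelid = $1::regclass
ORDER BY c.conname, k.ord
"#;
```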
60 changed files with 8203 additions and 634 deletions

.github/workflows/build.yml vendored Normal file

@@ -0,0 +1,123 @@
name: CI
on:
  push:
    branches: [main, master]
  pull_request:
    branches: [main, master]
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
jobs:
  lint-and-test:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - name: Install Linux dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev
      - uses: dtolnay/rust-toolchain@stable
        with:
          components: clippy, rustfmt
      - uses: Swatinem/rust-cache@v2
        with:
          workspaces: src-tauri
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: npm
      - name: Install frontend dependencies
        run: npm ci
      - name: ESLint
        run: npm run lint
      - name: Rust fmt check
        run: cd src-tauri && cargo fmt --check
      - name: Rust clippy
        run: cd src-tauri && cargo clippy -- -D warnings
      - name: Rust tests
        run: cd src-tauri && cargo test
      - name: Frontend tests
        run: npm test
  build:
    needs: lint-and-test
    strategy:
      fail-fast: false
      matrix:
        include:
          - platform: macos-latest
            args: --target aarch64-apple-darwin
            rust-target: aarch64-apple-darwin
            artifact-name: tusk-macos-arm64
          - platform: macos-13
            args: --target x86_64-apple-darwin
            rust-target: x86_64-apple-darwin
            artifact-name: tusk-macos-x64
          - platform: windows-latest
            args: ""
            rust-target: ""
            artifact-name: tusk-windows-x64
          - platform: ubuntu-22.04
            args: ""
            rust-target: ""
            artifact-name: tusk-linux-x64
    runs-on: ${{ matrix.platform }}
    steps:
      - uses: actions/checkout@v4
      - name: Install Linux dependencies
        if: matrix.platform == 'ubuntu-22.04'
        run: |
          sudo apt-get update
          sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev
      - uses: dtolnay/rust-toolchain@stable
        with:
          targets: ${{ matrix.rust-target }}
      - uses: Swatinem/rust-cache@v2
        with:
          workspaces: src-tauri
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: npm
      - name: Install frontend dependencies
        run: npm ci
      - name: Build Tauri app
        uses: tauri-apps/tauri-action@v0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          args: ${{ matrix.args }}
      - name: Upload artifacts
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.artifact-name }}
          path: |
            src-tauri/target/**/release/bundle/deb/*.deb
            src-tauri/target/**/release/bundle/rpm/*.rpm
            src-tauri/target/**/release/bundle/appimage/*.AppImage
            src-tauri/target/**/release/bundle/dmg/*.dmg
            src-tauri/target/**/release/bundle/nsis/*.exe
            src-tauri/target/**/release/bundle/msi/*.msi
          if-no-files-found: ignore

.github/workflows/release.yml vendored Normal file

@@ -0,0 +1,67 @@
name: Release
on:
  push:
    tags:
      - "v*"
permissions:
  contents: write
jobs:
  release:
    strategy:
      fail-fast: false
      matrix:
        include:
          - platform: macos-latest
            args: --target aarch64-apple-darwin
            rust-target: aarch64-apple-darwin
          - platform: macos-13
            args: --target x86_64-apple-darwin
            rust-target: x86_64-apple-darwin
          - platform: windows-latest
            args: ""
            rust-target: ""
          - platform: ubuntu-22.04
            args: ""
            rust-target: ""
    runs-on: ${{ matrix.platform }}
    steps:
      - uses: actions/checkout@v4
      - name: Install Linux dependencies
        if: matrix.platform == 'ubuntu-22.04'
        run: |
          sudo apt-get update
          sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev
      - uses: dtolnay/rust-toolchain@stable
        with:
          targets: ${{ matrix.rust-target }}
      - uses: Swatinem/rust-cache@v2
        with:
          workspaces: src-tauri
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: npm
      - name: Install frontend dependencies
        run: npm ci
      - name: Build and release
        uses: tauri-apps/tauri-action@v0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          tagName: ${{ github.ref_name }}
          releaseName: "Tusk ${{ github.ref_name }}"
          releaseBody: "See the assets below to download Tusk for your platform."
          releaseDraft: true
          prerelease: false
          args: ${{ matrix.args }}

package-lock.json generated

@@ -15,10 +15,13 @@
"@tauri-apps/api": "^2.10.1",
"@tauri-apps/plugin-dialog": "^2.6.0",
"@tauri-apps/plugin-shell": "^2.3.5",
"@types/dagre": "^0.7.53",
"@uiw/react-codemirror": "^4.25.4",
"@xyflow/react": "^12.10.0",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"dagre": "^0.8.5",
"lucide-react": "^0.563.0",
"next-themes": "^0.4.6",
"radix-ui": "^1.4.3",
@@ -4469,6 +4472,61 @@
"@babel/types": "^7.28.2"
}
},
"node_modules/@types/d3-color": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/@types/d3-color/-/d3-color-3.1.3.tgz",
"integrity": "sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A==",
"license": "MIT"
},
"node_modules/@types/d3-drag": {
"version": "3.0.7",
"resolved": "https://registry.npmjs.org/@types/d3-drag/-/d3-drag-3.0.7.tgz",
"integrity": "sha512-HE3jVKlzU9AaMazNufooRJ5ZpWmLIoc90A37WU2JMmeq28w1FQqCZswHZ3xR+SuxYftzHq6WU6KJHvqxKzTxxQ==",
"license": "MIT",
"dependencies": {
"@types/d3-selection": "*"
}
},
"node_modules/@types/d3-interpolate": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz",
"integrity": "sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA==",
"license": "MIT",
"dependencies": {
"@types/d3-color": "*"
}
},
"node_modules/@types/d3-selection": {
"version": "3.0.11",
"resolved": "https://registry.npmjs.org/@types/d3-selection/-/d3-selection-3.0.11.tgz",
"integrity": "sha512-bhAXu23DJWsrI45xafYpkQ4NtcKMwWnAC/vKrd2l+nxMFuvOT3XMYTIj2opv8vq8AO5Yh7Qac/nSeP/3zjTK0w==",
"license": "MIT"
},
"node_modules/@types/d3-transition": {
"version": "3.0.9",
"resolved": "https://registry.npmjs.org/@types/d3-transition/-/d3-transition-3.0.9.tgz",
"integrity": "sha512-uZS5shfxzO3rGlu0cC3bjmMFKsXv+SmZZcgp0KD22ts4uGXp5EVYGzu/0YdwZeKmddhcAccYtREJKkPfXkZuCg==",
"license": "MIT",
"dependencies": {
"@types/d3-selection": "*"
}
},
"node_modules/@types/d3-zoom": {
"version": "3.0.8",
"resolved": "https://registry.npmjs.org/@types/d3-zoom/-/d3-zoom-3.0.8.tgz",
"integrity": "sha512-iqMC4/YlFCSlO8+2Ii1GGGliCAY4XdeG748w5vQUbevlbDu0zSjH/+jojorQVBK/se0j6DUFNPBGSqD3YWYnDw==",
"license": "MIT",
"dependencies": {
"@types/d3-interpolate": "*",
"@types/d3-selection": "*"
}
},
"node_modules/@types/dagre": {
"version": "0.7.53",
"resolved": "https://registry.npmjs.org/@types/dagre/-/dagre-0.7.53.tgz",
"integrity": "sha512-f4gkWqzPZvYmKhOsDnhq/R8mO4UMcKdxZo+i5SCkOU1wvGeHJeUXGIHeE9pnwGyPMDof1Vx5ZQo4nxpeg2TTVQ==",
"license": "MIT"
},
"node_modules/@types/estree": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz",
@@ -4874,6 +4932,66 @@
"vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
}
},
"node_modules/@xyflow/react": {
"version": "12.10.0",
"resolved": "https://registry.npmjs.org/@xyflow/react/-/react-12.10.0.tgz",
"integrity": "sha512-eOtz3whDMWrB4KWVatIBrKuxECHqip6PfA8fTpaS2RUGVpiEAe+nqDKsLqkViVWxDGreq0lWX71Xth/SPAzXiw==",
"license": "MIT",
"dependencies": {
"@xyflow/system": "0.0.74",
"classcat": "^5.0.3",
"zustand": "^4.4.0"
},
"peerDependencies": {
"react": ">=17",
"react-dom": ">=17"
}
},
"node_modules/@xyflow/react/node_modules/zustand": {
"version": "4.5.7",
"resolved": "https://registry.npmjs.org/zustand/-/zustand-4.5.7.tgz",
"integrity": "sha512-CHOUy7mu3lbD6o6LJLfllpjkzhHXSBlX8B9+qPddUsIfeF5S/UZ5q0kmCsnRqT1UHFQZchNFDDzMbQsuesHWlw==",
"license": "MIT",
"dependencies": {
"use-sync-external-store": "^1.2.2"
},
"engines": {
"node": ">=12.7.0"
},
"peerDependencies": {
"@types/react": ">=16.8",
"immer": ">=9.0.6",
"react": ">=16.8"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"immer": {
"optional": true
},
"react": {
"optional": true
}
}
},
"node_modules/@xyflow/system": {
"version": "0.0.74",
"resolved": "https://registry.npmjs.org/@xyflow/system/-/system-0.0.74.tgz",
"integrity": "sha512-7v7B/PkiVrkdZzSbL+inGAo6tkR/WQHHG0/jhSvLQToCsfa8YubOGmBYd1s08tpKpihdHDZFwzQZeR69QSBb4Q==",
"license": "MIT",
"dependencies": {
"@types/d3-drag": "^3.0.7",
"@types/d3-interpolate": "^3.0.4",
"@types/d3-selection": "^3.0.10",
"@types/d3-transition": "^3.0.8",
"@types/d3-zoom": "^3.0.8",
"d3-drag": "^3.0.0",
"d3-interpolate": "^3.0.1",
"d3-selection": "^3.0.0",
"d3-zoom": "^3.0.0"
}
},
"node_modules/accepts": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/accepts/-/accepts-2.0.0.tgz",
@@ -5269,6 +5387,12 @@
"url": "https://polar.sh/cva"
}
},
"node_modules/classcat": {
"version": "5.0.5",
"resolved": "https://registry.npmjs.org/classcat/-/classcat-5.0.5.tgz",
"integrity": "sha512-JhZUT7JFcQy/EzW605k/ktHtncoo9vnyW/2GspNYwFlN1C/WmjuV/xtS04e9SOkL2sTdw0VAZ2UGCcQ9lR6p6w==",
"license": "MIT"
},
"node_modules/cli-cursor": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/cli-cursor/-/cli-cursor-5.0.0.tgz",
@@ -5607,6 +5731,122 @@
"devOptional": true,
"license": "MIT"
},
"node_modules/d3-color": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/d3-color/-/d3-color-3.1.0.tgz",
"integrity": "sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-dispatch": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-dispatch/-/d3-dispatch-3.0.1.tgz",
"integrity": "sha512-rzUyPU/S7rwUflMyLc1ETDeBj0NRuHKKAcvukozwhshr6g6c5d8zh4c2gQjY2bZ0dXeGLWc1PF174P2tVvKhfg==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-drag": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/d3-drag/-/d3-drag-3.0.0.tgz",
"integrity": "sha512-pWbUJLdETVA8lQNJecMxoXfH6x+mO2UQo8rSmZ+QqxcbyA3hfeprFgIT//HW2nlHChWeIIMwS2Fq+gEARkhTkg==",
"license": "ISC",
"dependencies": {
"d3-dispatch": "1 - 3",
"d3-selection": "3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-ease": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-ease/-/d3-ease-3.0.1.tgz",
"integrity": "sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w==",
"license": "BSD-3-Clause",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-interpolate": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-interpolate/-/d3-interpolate-3.0.1.tgz",
"integrity": "sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g==",
"license": "ISC",
"dependencies": {
"d3-color": "1 - 3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-selection": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/d3-selection/-/d3-selection-3.0.0.tgz",
"integrity": "sha512-fmTRWbNMmsmWq6xJV8D19U/gw/bwrHfNXxrIN+HfZgnzqTHp9jOmKMhsTUjXOJnZOdZY9Q28y4yebKzqDKlxlQ==",
"license": "ISC",
"peer": true,
"engines": {
"node": ">=12"
}
},
"node_modules/d3-timer": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-timer/-/d3-timer-3.0.1.tgz",
"integrity": "sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-transition": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-transition/-/d3-transition-3.0.1.tgz",
"integrity": "sha512-ApKvfjsSR6tg06xrL434C0WydLr7JewBB3V+/39RMHsaXTOG0zmt/OAXeng5M5LBm0ojmxJrpomQVZ1aPvBL4w==",
"license": "ISC",
"dependencies": {
"d3-color": "1 - 3",
"d3-dispatch": "1 - 3",
"d3-ease": "1 - 3",
"d3-interpolate": "1 - 3",
"d3-timer": "1 - 3"
},
"engines": {
"node": ">=12"
},
"peerDependencies": {
"d3-selection": "2 - 3"
}
},
"node_modules/d3-zoom": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/d3-zoom/-/d3-zoom-3.0.0.tgz",
"integrity": "sha512-b8AmV3kfQaqWAuacbPuNbL6vahnOJflOhexLzMMNLga62+/nh0JzvJ0aO/5a5MVgUFGS7Hu1P9P03o3fJkDCyw==",
"license": "ISC",
"dependencies": {
"d3-dispatch": "1 - 3",
"d3-drag": "2 - 3",
"d3-interpolate": "1 - 3",
"d3-selection": "2 - 3",
"d3-transition": "2 - 3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/dagre": {
"version": "0.8.5",
"resolved": "https://registry.npmjs.org/dagre/-/dagre-0.8.5.tgz",
"integrity": "sha512-/aTqmnRta7x7MCCpExk7HQL2O4owCT2h8NT//9I1OQ9vt29Pa0BzSAkR5lwFUcQ7491yVi/3CXU9jQ5o0Mn2Sw==",
"license": "MIT",
"dependencies": {
"graphlib": "^2.1.8",
"lodash": "^4.17.15"
}
},
"node_modules/data-uri-to-buffer": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz",
@@ -6757,6 +6997,15 @@
"dev": true,
"license": "ISC"
},
"node_modules/graphlib": {
"version": "2.1.8",
"resolved": "https://registry.npmjs.org/graphlib/-/graphlib-2.1.8.tgz",
"integrity": "sha512-jcLLfkpoVGmH7/InMC/1hIvOPSUh38oJtGhvrOFGzioE1DZ+0YW16RgmOJhHiuWTvGiJQ9Z1Ik43JvkRPRvE+A==",
"license": "MIT",
"dependencies": {
"lodash": "^4.17.15"
}
},
"node_modules/graphql": {
"version": "16.12.0",
"resolved": "https://registry.npmjs.org/graphql/-/graphql-16.12.0.tgz",
@@ -7609,6 +7858,12 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/lodash": {
"version": "4.17.23",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.23.tgz",
"integrity": "sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w==",
"license": "MIT"
},
"node_modules/lodash.merge": {
"version": "4.6.2",
"resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz",

package.json

@@ -18,10 +18,13 @@
"@tauri-apps/api": "^2.10.1",
"@tauri-apps/plugin-dialog": "^2.6.0",
"@tauri-apps/plugin-shell": "^2.3.5",
"@types/dagre": "^0.7.53",
"@uiw/react-codemirror": "^4.25.4",
"@xyflow/react": "^12.10.0",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"dagre": "^0.8.5",
"lucide-react": "^0.563.0",
"next-themes": "^0.4.6",
"radix-ui": "^1.4.3",

src-tauri/Cargo.lock generated

@@ -383,6 +383,12 @@ version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9330f8b2ff13f34540b44e946ef35111825727b38d33286ef986142615121801"

[[package]]
name = "cfg_aliases"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "613afe47fcd5fac7ccf1db93babcb082c5994d996f20b8b159f2ad1658eb5724"

[[package]]
name = "chrono"
version = "0.4.43"
@@ -438,16 +444,6 @@ dependencies = [
 "version_check",
]

[[package]]
name = "core-foundation"
version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "91e195e091a93c46f7102ec7818a2aa394e1e1771c3ab4825963fa03e45afb8f"
dependencies = [
 "core-foundation-sys",
 "libc",
]

[[package]]
name = "core-foundation"
version = "0.10.1"
@@ -471,9 +467,9 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fa95a34622365fa5bbf40b20b75dba8dfa8c94c734aea8ac9a5ca38af14316f1"
dependencies = [
 "bitflags 2.10.0",
 "core-foundation 0.10.1",
 "core-foundation",
 "core-graphics-types",
 "foreign-types 0.5.0",
 "foreign-types",
 "libc",
]
@@ -484,7 +480,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3d44a101f213f6c4cdc1853d4b78aef6db6bdfa3468798cc1d9912f4735013eb"
dependencies = [
 "bitflags 2.10.0",
 "core-foundation 0.10.1",
 "core-foundation",
 "libc",
]
@@ -930,12 +926,6 @@ dependencies = [
 "pin-project-lite",
]

[[package]]
name = "fastrand"
version = "2.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "37909eebbb50d72f9059c3b6d82c0463f2ff062c9e95845c43a6c9c0355411be"

[[package]]
name = "fdeflate"
version = "0.3.7"
@@ -994,15 +984,6 @@ version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"

[[package]]
name = "foreign-types"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
dependencies = [
 "foreign-types-shared 0.1.1",
]

[[package]]
name = "foreign-types"
version = "0.5.0"
@@ -1010,7 +991,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d737d9aa519fb7b749cbc3b962edcf310a8dd1f4b67c91c4f83975dbdd17d965"
dependencies = [
 "foreign-types-macros",
 "foreign-types-shared 0.3.1",
 "foreign-types-shared",
]

[[package]]
@@ -1024,12 +1005,6 @@ dependencies = [
 "syn 2.0.114",
]

[[package]]
name = "foreign-types-shared"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"

[[package]]
name = "foreign-types-shared"
version = "0.3.1"
@@ -1291,8 +1266,10 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ff2abc00be7fca6ebc474524697ae276ad847ad0a6b3faa4bcb027e9a4614ad0"
dependencies = [
 "cfg-if",
 "js-sys",
 "libc",
 "wasi 0.11.1+wasi-snapshot-preview1",
 "wasm-bindgen",
]

[[package]]
@@ -1302,9 +1279,11 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "899def5c37c4fd7b2664648c28120ecec138e4d395b459e5ca34f9cce2dd77fd"
dependencies = [
 "cfg-if",
 "js-sys",
 "libc",
 "r-efi",
 "wasip2",
 "wasm-bindgen",
]

[[package]]
@@ -1455,25 +1434,6 @@ dependencies = [
 "syn 2.0.114",
]

[[package]]
name = "h2"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2f44da3a8150a6703ed5d34e164b875fd14c2cdab9af1252a9a1020bde2bdc54"
dependencies = [
 "atomic-waker",
 "bytes",
 "fnv",
 "futures-core",
 "futures-sink",
 "http",
 "indexmap 2.13.0",
 "slab",
 "tokio",
 "tokio-util",
 "tracing",
]

[[package]]
name = "hashbrown"
version = "0.12.3"
@@ -1618,7 +1578,6 @@ dependencies = [
 "bytes",
 "futures-channel",
 "futures-core",
 "h2",
 "http",
 "http-body",
 "httparse",
@@ -1645,22 +1604,7 @@ dependencies = [
 "tokio",
 "tokio-rustls",
 "tower-service",
 "webpki-roots 1.0.6",
]

[[package]]
name = "hyper-tls"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "70206fc6890eaca9fde8a0bf71caa2ddfc9fe045ac9e5c70df101a7dbde866e0"
dependencies = [
 "bytes",
 "http-body-util",
 "hyper",
 "hyper-util",
 "native-tls",
 "tokio",
 "tokio-native-tls",
 "tower-service",
]

[[package]]
@@ -1681,11 +1625,9 @@ dependencies = [
 "percent-encoding",
 "pin-project-lite",
 "socket2",
 "system-configuration",
 "tokio",
 "tower-service",
 "tracing",
 "windows-registry",
]

[[package]]
@@ -2079,12 +2021,6 @@ dependencies = [
 "vcpkg",
]

[[package]]
name = "linux-raw-sys"
version = "0.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df1d3c3b53da64cf5760482273a98e575c651a67eec7f77df96b5b642de8f039"

[[package]]
name = "litemap"
version = "0.8.1"
@@ -2106,6 +2042,12 @@ version = "0.4.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"

[[package]]
name = "lru-slab"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "112b39cec0b298b6c1999fee3e31427f74f676e4cb9879ed1a121b43661a4154"

[[package]]
name = "mac"
version = "0.1.1"
@@ -2222,23 +2164,6 @@ dependencies = [
 "windows-sys 0.60.2",
]

[[package]]
name = "native-tls"
version = "0.2.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6cdede44f9a69cab2899a2049e2c3bd49bf911a157f6a3353d4a91c61abbce44"
dependencies = [
 "libc",
 "log",
 "openssl",
 "openssl-probe",
 "openssl-sys",
 "schannel",
 "security-framework",
 "security-framework-sys",
 "tempfile",
]

[[package]]
name = "ndk"
version = "0.9.0"
@@ -2595,50 +2520,6 @@ dependencies = [
 "pathdiff",
]

[[package]]
name = "openssl"
version = "0.10.75"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
dependencies = [
 "bitflags 2.10.0",
 "cfg-if",
 "foreign-types 0.3.2",
 "libc",
 "once_cell",
 "openssl-macros",
 "openssl-sys",
]

[[package]]
name = "openssl-macros"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
dependencies = [
 "proc-macro2",
 "quote",
 "syn 2.0.114",
]

[[package]]
name = "openssl-probe"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d05e27ee213611ffe7d6348b942e8f942b37114c00cc03cec254295a4a17852e"

[[package]]
name = "openssl-sys"
version = "0.9.111"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
dependencies = [
 "cc",
 "libc",
 "pkg-config",
 "vcpkg",
]

[[package]]
name = "option-ext"
version = "0.2.0"
@@ -3042,6 +2923,61 @@ dependencies = [
 "memchr",
]

[[package]]
name = "quinn"
version = "0.11.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b9e20a958963c291dc322d98411f541009df2ced7b5a4f2bd52337638cfccf20"
dependencies = [
 "bytes",
 "cfg_aliases",
 "pin-project-lite",
 "quinn-proto",
 "quinn-udp",
 "rustc-hash",
 "rustls",
 "socket2",
 "thiserror 2.0.18",
 "tokio",
 "tracing",
 "web-time",
]

[[package]]
name = "quinn-proto"
version = "0.11.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f1906b49b0c3bc04b5fe5d86a77925ae6524a19b816ae38ce1e426255f1d8a31"
dependencies = [
 "bytes",
 "getrandom 0.3.4",
 "lru-slab",
 "rand 0.9.2",
 "ring",
 "rustc-hash",
 "rustls",
 "rustls-pki-types",
 "slab",
 "thiserror 2.0.18",
 "tinyvec",
 "tracing",
 "web-time",
]

[[package]]
name = "quinn-udp"
version = "0.5.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "addec6a0dcad8a8d96a771f815f0eaf55f9d1805756410b39f5fa81332574cbd"
dependencies = [
 "cfg_aliases",
 "libc",
 "once_cell",
 "socket2",
 "tracing",
 "windows-sys 0.52.0",
]

[[package]]
name = "quote"
version = "1.0.44"
@@ -3259,29 +3195,26 @@ checksum = "eddd3ca559203180a307f12d114c268abf583f59b03cb906fd0b3ff8646c1147"
dependencies = [
 "base64 0.22.1",
 "bytes",
 "encoding_rs",
 "futures-core",
 "h2",
 "http",
 "http-body",
 "http-body-util",
 "hyper",
 "hyper-rustls",
 "hyper-tls",
 "hyper-util",
 "js-sys",
 "log",
 "mime",
 "native-tls",
 "percent-encoding",
 "pin-project-lite",
 "quinn",
 "rustls",
 "rustls-pki-types",
 "serde",
 "serde_json",
 "serde_urlencoded",
 "sync_wrapper",
 "tokio",
 "tokio-native-tls",
 "tokio-rustls",
 "tower",
 "tower-http",
 "tower-service",
@@ -3289,6 +3222,7 @@ dependencies = [
"wasm-bindgen", "wasm-bindgen",
"wasm-bindgen-futures", "wasm-bindgen-futures",
"web-sys", "web-sys",
"webpki-roots 1.0.6",
] ]
[[package]] [[package]]
@@ -3428,6 +3362,12 @@ dependencies = [
"zeroize", "zeroize",
] ]
[[package]]
name = "rustc-hash"
version = "2.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "357703d41365b4b27c590e3ed91eabb1b663f07c4c084095e60cbed4362dff0d"
[[package]] [[package]]
name = "rustc_version" name = "rustc_version"
version = "0.4.1" version = "0.4.1"
@@ -3437,19 +3377,6 @@ dependencies = [
"semver", "semver",
] ]
[[package]]
name = "rustix"
version = "1.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "146c9e247ccc180c1f61615433868c99f3de3ae256a30a43b49f67c2d9171f34"
dependencies = [
"bitflags 2.10.0",
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.61.2",
]
[[package]] [[package]]
name = "rustls" name = "rustls"
version = "0.23.36" version = "0.23.36"
@@ -3470,6 +3397,7 @@ version = "1.14.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "be040f8b0a225e40375822a563fa9524378b9d63112f53e19ffff34df5d33fdd" checksum = "be040f8b0a225e40375822a563fa9524378b9d63112f53e19ffff34df5d33fdd"
dependencies = [ dependencies = [
"web-time",
"zeroize", "zeroize",
] ]
@@ -3505,15 +3433,6 @@ dependencies = [
"winapi-util", "winapi-util",
] ]
[[package]]
name = "schannel"
version = "0.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
dependencies = [
"windows-sys 0.61.2",
]
[[package]] [[package]]
name = "schemars" name = "schemars"
version = "0.8.22" version = "0.8.22"
@@ -3585,29 +3504,6 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49" checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "security-framework"
version = "2.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "897b2245f0b511c87893af39b033e5ca9cce68824c4d7e7630b5a1d339658d02"
dependencies = [
"bitflags 2.10.0",
"core-foundation 0.9.4",
"core-foundation-sys",
"libc",
"security-framework-sys",
]
[[package]]
name = "security-framework-sys"
version = "2.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cc1f0cbffaac4852523ce30d8bd3c5cdc873501d96ff467ca09b6767bb8cd5c0"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]] [[package]]
name = "selectors" name = "selectors"
version = "0.24.0" version = "0.24.0"
@@ -4329,27 +4225,6 @@ dependencies = [
"syn 2.0.114", "syn 2.0.114",
] ]
[[package]]
name = "system-configuration"
version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a13f3d0daba03132c0aa9767f98351b3488edc2c100cda2d2ec2b04f3d8d3c8b"
dependencies = [
"bitflags 2.10.0",
"core-foundation 0.9.4",
"system-configuration-sys",
]
[[package]]
name = "system-configuration-sys"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e1d1b10ced5ca923a1fcb8d03e96b8d3268065d724548c0211415ff6ac6bac4"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]] [[package]]
name = "system-deps" name = "system-deps"
version = "6.2.2" version = "6.2.2"
@@ -4371,7 +4246,7 @@ checksum = "f3a753bdc39c07b192151523a3f77cd0394aa75413802c883a0f6f6a0e5ee2e7"
dependencies = [ dependencies = [
"bitflags 2.10.0", "bitflags 2.10.0",
"block2", "block2",
"core-foundation 0.10.1", "core-foundation",
"core-graphics", "core-graphics",
"crossbeam-channel", "crossbeam-channel",
"dispatch", "dispatch",
@@ -4713,19 +4588,6 @@ dependencies = [
"toml 0.9.12+spec-1.1.0", "toml 0.9.12+spec-1.1.0",
] ]
[[package]]
name = "tempfile"
version = "3.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0136791f7c95b1f6dd99f9cc786b91bb81c3800b639b3478e561ddb7be95e5f1"
dependencies = [
"fastrand",
"getrandom 0.3.4",
"once_cell",
"rustix",
"windows-sys 0.61.2",
]
[[package]] [[package]]
name = "tendril" name = "tendril"
version = "0.4.3" version = "0.4.3"
@@ -4861,16 +4723,6 @@ dependencies = [
"syn 2.0.114", "syn 2.0.114",
] ]
[[package]]
name = "tokio-native-tls"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
dependencies = [
"native-tls",
"tokio",
]
[[package]] [[package]]
name = "tokio-rustls" name = "tokio-rustls"
version = "0.26.4" version = "0.26.4"
@@ -5440,6 +5292,16 @@ dependencies = [
"wasm-bindgen", "wasm-bindgen",
] ]
[[package]]
name = "web-time"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a6580f308b1fad9207618087a65c04e7a10bc77e02c8e84e9b00dd4b12fa0bb"
dependencies = [
"js-sys",
"wasm-bindgen",
]
[[package]] [[package]]
name = "webkit2gtk" name = "webkit2gtk"
version = "2.0.2" version = "2.0.2"
@@ -5697,17 +5559,6 @@ dependencies = [
"windows-link 0.1.3", "windows-link 0.1.3",
] ]
[[package]]
name = "windows-registry"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "02752bf7fbdcce7f2a27a742f798510f3e5ad88dbe84871e5168e2120c3d5720"
dependencies = [
"windows-link 0.2.1",
"windows-result 0.4.1",
"windows-strings 0.5.1",
]
[[package]] [[package]]
name = "windows-result" name = "windows-result"
version = "0.3.4" version = "0.3.4"

View File

@@ -30,7 +30,7 @@ csv = "1"
 log = "0.4"
 hex = "0.4"
 bigdecimal = { version = "0.4", features = ["serde"] }
-reqwest = { version = "0.12", features = ["json"] }
+reqwest = { version = "0.12", default-features = false, features = ["json", "rustls-tls"] }
 rmcp = { version = "0.15", features = ["server", "macros", "transport-streamable-http-server"] }
 axum = "0.8"
 schemars = "1"

File diff suppressed because it is too large

View File

@@ -14,7 +14,7 @@ pub struct ConnectResult {
     pub flavor: DbFlavor,
 }
-fn get_connections_path(app: &AppHandle) -> TuskResult<std::path::PathBuf> {
+pub(crate) fn get_connections_path(app: &AppHandle) -> TuskResult<std::path::PathBuf> {
     let dir = app
         .path()
         .app_data_dir()

View File

@@ -21,10 +21,7 @@ pub async fn get_table_data(
     sort_direction: Option<String>,
     filter: Option<String>,
 ) -> TuskResult<PaginatedQueryResult> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let qualified = format!("{}.{}", escape_ident(&schema), escape_ident(&table));
@@ -49,35 +46,71 @@ pub async fn get_table_data(
     let offset = (page.saturating_sub(1)) * page_size;
     let data_sql = format!(
-        "SELECT * FROM {}{}{} LIMIT {} OFFSET {}",
+        "SELECT *, ctid::text FROM {}{}{} LIMIT {} OFFSET {}",
         qualified, where_clause, order_clause, page_size, offset
     );
     let count_sql = format!("SELECT COUNT(*) FROM {}{}", qualified, where_clause);
     let start = Instant::now();
-    let (rows, count_row) = tokio::try_join!(
-        sqlx::query(&data_sql).fetch_all(pool),
-        sqlx::query(&count_sql).fetch_one(pool),
-    )
-    .map_err(TuskError::Database)?;
+    // Always run table data queries in a read-only transaction to prevent
+    // writable CTEs or other mutation via the raw filter parameter.
+    let mut tx = (&pool).begin().await.map_err(TuskError::Database)?;
+    sqlx::query("SET TRANSACTION READ ONLY")
+        .execute(&mut *tx)
+        .await
+        .map_err(TuskError::Database)?;
+    let rows = sqlx::query(&data_sql)
+        .fetch_all(&mut *tx)
+        .await
+        .map_err(TuskError::Database)?;
+    let count_row = sqlx::query(&count_sql)
+        .fetch_one(&mut *tx)
+        .await
+        .map_err(TuskError::Database)?;
+    tx.rollback().await.map_err(TuskError::Database)?;
     let execution_time_ms = start.elapsed().as_millis();
     let total_rows: i64 = count_row.get(0);
-    let mut columns = Vec::new();
-    let mut types = Vec::new();
+    let mut all_columns = Vec::new();
+    let mut all_types = Vec::new();
     if let Some(first_row) = rows.first() {
         for col in first_row.columns() {
-            columns.push(col.name().to_string());
-            types.push(col.type_info().name().to_string());
+            all_columns.push(col.name().to_string());
+            all_types.push(col.type_info().name().to_string());
         }
     }
+    // Find and strip the trailing ctid column
+    let ctid_idx = all_columns.iter().rposition(|c| c == "ctid");
+    let mut ctids: Vec<String> = Vec::new();
+    let (columns, types) = if let Some(idx) = ctid_idx {
+        let mut cols = all_columns.clone();
+        let mut tps = all_types.clone();
+        cols.remove(idx);
+        tps.remove(idx);
+        (cols, tps)
+    } else {
+        (all_columns.clone(), all_types.clone())
+    };
     let result_rows: Vec<Vec<Value>> = rows
         .iter()
-        .map(|row| (0..columns.len()).map(|i| pg_value_to_json(row, i)).collect())
+        .map(|row| {
+            if let Some(idx) = ctid_idx {
+                let ctid_val: String = row.get(idx);
+                ctids.push(ctid_val);
+            }
+            (0..all_columns.len())
+                .filter(|i| Some(*i) != ctid_idx)
+                .map(|i| pg_value_to_json(row, i))
+                .collect()
+        })
        .collect();
     let row_count = result_rows.len();
@@ -91,6 +124,7 @@ pub async fn get_table_data(
         total_rows,
         page,
         page_size,
+        ctids,
     })
 }
@@ -104,40 +138,52 @@ pub async fn update_row(
     pk_values: Vec<Value>,
     column: String,
     value: Value,
+    ctid: Option<String>,
 ) -> TuskResult<()> {
     if state.is_read_only(&connection_id).await {
         return Err(TuskError::ReadOnly);
     }
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let qualified = format!("{}.{}", escape_ident(&schema), escape_ident(&table));
     let set_clause = format!("{} = $1", escape_ident(&column));
-    let where_parts: Vec<String> = pk_columns
-        .iter()
-        .enumerate()
-        .map(|(i, col)| format!("{} = ${}", escape_ident(col), i + 2))
-        .collect();
-    let where_clause = where_parts.join(" AND ");
-    let sql = format!(
-        "UPDATE {} SET {} WHERE {}",
-        qualified, set_clause, where_clause
-    );
-    let mut query = sqlx::query(&sql);
-    query = bind_json_value(query, &value);
-    for pk_val in &pk_values {
-        query = bind_json_value(query, pk_val);
-    }
-    query.execute(pool).await.map_err(TuskError::Database)?;
+    if pk_columns.is_empty() {
+        // Fallback: use ctid for row identification
+        let ctid_val = ctid.ok_or_else(|| {
+            TuskError::Custom("Cannot update: no primary key and no ctid provided".into())
+        })?;
+        let sql = format!(
+            "UPDATE {} SET {} WHERE ctid = $2::tid",
+            qualified, set_clause
+        );
+        let mut query = sqlx::query(&sql);
+        query = bind_json_value(query, &value);
+        query = query.bind(ctid_val);
+        query.execute(&pool).await.map_err(TuskError::Database)?;
+    } else {
+        let where_parts: Vec<String> = pk_columns
+            .iter()
+            .enumerate()
+            .map(|(i, col)| format!("{} = ${}", escape_ident(col), i + 2))
+            .collect();
+        let where_clause = where_parts.join(" AND ");
+        let sql = format!(
+            "UPDATE {} SET {} WHERE {}",
+            qualified, set_clause, where_clause
+        );
+        let mut query = sqlx::query(&sql);
+        query = bind_json_value(query, &value);
+        for pk_val in &pk_values {
+            query = bind_json_value(query, pk_val);
+        }
+        query.execute(&pool).await.map_err(TuskError::Database)?;
+    }
     Ok(())
 }
@@ -154,10 +200,7 @@ pub async fn insert_row(
         return Err(TuskError::ReadOnly);
     }
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let qualified = format!("{}.{}", escape_ident(&schema), escape_ident(&table));
@@ -176,7 +219,7 @@ pub async fn insert_row(
         query = bind_json_value(query, val);
     }
-    query.execute(pool).await.map_err(TuskError::Database)?;
+    query.execute(&pool).await.map_err(TuskError::Database)?;
     Ok(())
 }
@@ -189,42 +232,58 @@ pub async fn delete_rows(
     table: String,
     pk_columns: Vec<String>,
     pk_values_list: Vec<Vec<Value>>,
+    ctids: Option<Vec<String>>,
 ) -> TuskResult<u64> {
     if state.is_read_only(&connection_id).await {
         return Err(TuskError::ReadOnly);
     }
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let qualified = format!("{}.{}", escape_ident(&schema), escape_ident(&table));
     let mut total_affected: u64 = 0;
-    for pk_values in &pk_values_list {
-        let where_parts: Vec<String> = pk_columns
-            .iter()
-            .enumerate()
-            .map(|(i, col)| format!("{} = ${}", escape_ident(col), i + 1))
-            .collect();
-        let where_clause = where_parts.join(" AND ");
-        let sql = format!("DELETE FROM {} WHERE {}", qualified, where_clause);
-        let mut query = sqlx::query(&sql);
-        for val in pk_values {
-            query = bind_json_value(query, val);
-        }
-        let result = query.execute(pool).await.map_err(TuskError::Database)?;
-        total_affected += result.rows_affected();
-    }
+    // Wrap all deletes in a transaction for atomicity
+    let mut tx = (&pool).begin().await.map_err(TuskError::Database)?;
+    if pk_columns.is_empty() {
+        // Fallback: use ctids for row identification
+        let ctid_list = ctids.ok_or_else(|| {
+            TuskError::Custom("Cannot delete: no primary key and no ctids provided".into())
+        })?;
+        for ctid_val in &ctid_list {
+            let sql = format!("DELETE FROM {} WHERE ctid = $1::tid", qualified);
+            let query = sqlx::query(&sql).bind(ctid_val);
+            let result = query.execute(&mut *tx).await.map_err(TuskError::Database)?;
+            total_affected += result.rows_affected();
+        }
+    } else {
+        for pk_values in &pk_values_list {
+            let where_parts: Vec<String> = pk_columns
+                .iter()
+                .enumerate()
+                .map(|(i, col)| format!("{} = ${}", escape_ident(col), i + 1))
+                .collect();
+            let where_clause = where_parts.join(" AND ");
+            let sql = format!("DELETE FROM {} WHERE {}", qualified, where_clause);
+            let mut query = sqlx::query(&sql);
+            for val in pk_values {
+                query = bind_json_value(query, val);
+            }
+            let result = query.execute(&mut *tx).await.map_err(TuskError::Database)?;
+            total_affected += result.rows_affected();
+        }
+    }
+    tx.commit().await.map_err(TuskError::Database)?;
    Ok(total_affected)
 }
-fn bind_json_value<'q>(
+pub(crate) fn bind_json_value<'q>(
     query: sqlx::query::Query<'q, sqlx::Postgres, sqlx::postgres::PgArguments>,
     value: &'q Value,
 ) -> sqlx::query::Query<'q, sqlx::Postgres, sqlx::postgres::PgArguments> {
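The update and delete paths above fall back to PostgreSQL's `ctid` system column when a table has no primary key. A self-contained sketch of that WHERE-clause choice (with a simplified stand-in for the real `escape_ident` helper; exact quoting rules may differ):

```rust
// Hypothetical simplified stand-in for the project's escape_ident helper.
fn escape_ident(ident: &str) -> String {
    format!("\"{}\"", ident.replace('"', "\"\""))
}

// Choose the row identifier: pk columns when available, ctid otherwise.
fn build_where(pk_columns: &[&str]) -> String {
    if pk_columns.is_empty() {
        "ctid = $1::tid".to_string()
    } else {
        pk_columns
            .iter()
            .enumerate()
            .map(|(i, col)| format!("{} = ${}", escape_ident(col), i + 1))
            .collect::<Vec<_>>()
            .join(" AND ")
    }
}

fn main() {
    assert_eq!(build_where(&[]), "ctid = $1::tid");
    assert_eq!(build_where(&["id"]), "\"id\" = $1");
    assert_eq!(build_where(&["a", "b"]), "\"a\" = $1 AND \"b\" = $2");
    println!("ok");
}
```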

View File

@@ -0,0 +1,783 @@
use crate::error::{TuskError, TuskResult};
use crate::models::connection::ConnectionConfig;
use crate::models::docker::{
CloneMode, CloneProgress, CloneResult, CloneToDockerParams, DockerStatus, TuskContainer,
};
use crate::state::AppState;
use crate::utils::escape_ident;
use std::fs;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, State};
use tokio::process::Command;
async fn docker_cmd(state: &AppState) -> Command {
let host = state.docker_host.read().await;
let mut cmd = Command::new("docker");
if let Some(ref h) = *host {
cmd.args(["-H", h]);
}
cmd
}
fn docker_err(msg: impl Into<String>) -> TuskError {
TuskError::Docker(msg.into())
}
fn emit_progress(
app: &AppHandle,
clone_id: &str,
stage: &str,
percent: u8,
message: &str,
detail: Option<&str>,
) {
let _ = app.emit(
"clone-progress",
CloneProgress {
clone_id: clone_id.to_string(),
stage: stage.to_string(),
percent,
message: message.to_string(),
detail: detail.map(|s| s.to_string()),
},
);
}
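`emit_progress` takes `percent: u8`, and one of the commits in this range fixes a panic from a `u8` overflow when scaling per-table progress (e.g. `4 * 80 = 320 > 255`). A minimal sketch of overflow-safe scaling, assuming a hypothetical helper (`table_progress` is not in the codebase), is:

```rust
// Hypothetical helper: map `done` of `total` tables into the [start, end]
// percent range without overflowing u8. The arithmetic is done in u32 and
// clamped, so 4 tables * 80 percent can never wrap past 255.
fn table_progress(done: usize, total: usize, start: u8, end: u8) -> u8 {
    if total == 0 {
        return end;
    }
    let span = u32::from(end.saturating_sub(start));
    let scaled = (done as u32).min(total as u32) * span / total as u32;
    start.saturating_add(scaled.min(u32::from(u8::MAX)) as u8)
}

fn main() {
    // 4 tables mapped into the 40..=90 range stay within bounds.
    assert_eq!(table_progress(0, 4, 40, 90), 40);
    assert_eq!(table_progress(2, 4, 40, 90), 65);
    assert_eq!(table_progress(4, 4, 40, 90), 90);
    println!("ok");
}
```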
fn load_connection_config(app: &AppHandle, connection_id: &str) -> TuskResult<ConnectionConfig> {
let path = super::connections::get_connections_path(app)?;
if !path.exists() {
return Err(TuskError::ConnectionNotFound(connection_id.to_string()));
}
let data = fs::read_to_string(&path)?;
let connections: Vec<ConnectionConfig> = serde_json::from_str(&data)?;
connections
.into_iter()
.find(|c| c.id == connection_id)
.ok_or_else(|| TuskError::ConnectionNotFound(connection_id.to_string()))
}
/// Shell-escape a string for use in single quotes
fn shell_escape(s: &str) -> String {
s.replace('\'', "'\\''")
}
/// Validate pg_version matches a safe pattern (e.g. "16", "16.2", "17.1")
fn validate_pg_version(version: &str) -> TuskResult<()> {
let is_valid = !version.is_empty()
&& version
.chars()
.all(|c| c.is_ascii_digit() || c == '.');
if !is_valid {
return Err(docker_err(format!(
"Invalid pg_version '{}': must contain only digits and dots (e.g. '16', '16.2')",
version
)));
}
Ok(())
}
/// Validate container name matches Docker naming rules: [a-zA-Z0-9][a-zA-Z0-9_.-]*
fn validate_container_name(name: &str) -> TuskResult<()> {
if name.is_empty() {
return Err(docker_err("Container name cannot be empty"));
}
let first = name.chars().next().unwrap();
if !first.is_ascii_alphanumeric() {
return Err(docker_err(format!(
"Invalid container name '{}': must start with an alphanumeric character",
name
)));
}
let is_valid = name
.chars()
.all(|c| c.is_ascii_alphanumeric() || c == '_' || c == '.' || c == '-');
if !is_valid {
return Err(docker_err(format!(
"Invalid container name '{}': only [a-zA-Z0-9_.-] characters are allowed",
name
)));
}
Ok(())
}
/// Shell-escape a string for use inside double-quoted shell contexts
fn shell_escape_double(s: &str) -> String {
s.replace('\\', "\\\\")
.replace('"', "\\\"")
.replace('$', "\\$")
.replace('`', "\\`")
.replace('!', "\\!")
}
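The intent of `shell_escape` above can be checked with a plain string assertion: the `'\''` sequence closes the caller's single quote, emits a literal quote, and reopens quoting, so the shell sees one concatenated word. A small sketch (duplicating the one-line helper for a runnable demo):

```rust
// Same replacement as the shell_escape above: ' becomes '\''
fn shell_escape(s: &str) -> String {
    s.replace('\'', "'\\''")
}

fn main() {
    // A value like p'ass, once the caller wraps it in single quotes,
    // becomes 'p'\''ass': three pieces the shell joins back together.
    assert_eq!(shell_escape("p'ass"), "p'\\''ass");
    assert_eq!(format!("'{}'", shell_escape("p'ass")), "'p'\\''ass'");
    println!("ok");
}
```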
#[tauri::command]
pub async fn check_docker(state: State<'_, Arc<AppState>>) -> TuskResult<DockerStatus> {
let docker_host = state.docker_host.read().await.clone();
check_docker_internal(&docker_host).await
}
#[tauri::command]
pub async fn list_tusk_containers(state: State<'_, Arc<AppState>>) -> TuskResult<Vec<TuskContainer>> {
let output = docker_cmd(&state)
.await
.args([
"ps",
"-a",
"--filter",
"label=tusk.managed=true",
"--format",
"{{.ID}}\t{{.Names}}\t{{.Status}}\t{{.Label \"tusk.pg-version\"}}\t{{.Label \"tusk.source-db\"}}\t{{.Label \"tusk.source-connection\"}}\t{{.CreatedAt}}\t{{.Ports}}",
])
.output()
.await
.map_err(|e| docker_err(format!("Failed to run docker ps: {}", e)))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(docker_err(format!("docker ps failed: {}", stderr)));
}
let stdout = String::from_utf8_lossy(&output.stdout);
let mut containers = Vec::new();
for line in stdout.lines() {
if line.trim().is_empty() {
continue;
}
let parts: Vec<&str> = line.split('\t').collect();
if parts.len() < 8 {
continue;
}
let host_port = parse_host_port(parts[7]);
containers.push(TuskContainer {
container_id: parts[0].to_string(),
name: parts[1].to_string(),
status: parts[2].to_string(),
host_port,
pg_version: parts[3].to_string(),
source_database: if parts[4].is_empty() {
None
} else {
Some(parts[4].to_string())
},
source_connection: if parts[5].is_empty() {
None
} else {
Some(parts[5].to_string())
},
created_at: if parts[6].is_empty() {
None
} else {
Some(parts[6].to_string())
},
});
}
Ok(containers)
}
fn parse_host_port(ports_str: &str) -> u16 {
for part in ports_str.split(',') {
let part = part.trim();
if let Some(arrow_pos) = part.find("->") {
let before = &part[..arrow_pos];
if let Some(colon_pos) = before.rfind(':') {
if let Ok(port) = before[colon_pos + 1..].parse::<u16>() {
return port;
}
}
}
}
0
}
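The parsing above can be exercised against typical `docker ps` Ports strings (the sample strings below are illustrative, not taken from the repo):

```rust
// Same logic as parse_host_port above, duplicated so the demo is runnable.
fn parse_host_port(ports_str: &str) -> u16 {
    for part in ports_str.split(',') {
        let part = part.trim();
        if let Some(arrow_pos) = part.find("->") {
            let before = &part[..arrow_pos];
            if let Some(colon_pos) = before.rfind(':') {
                if let Ok(port) = before[colon_pos + 1..].parse::<u16>() {
                    return port;
                }
            }
        }
    }
    0
}

fn main() {
    assert_eq!(parse_host_port("0.0.0.0:54321->5432/tcp"), 54321);
    // IPv6 bindings also parse because of the rfind(':').
    assert_eq!(parse_host_port(":::54321->5432/tcp"), 54321);
    // No published port yields the sentinel 0.
    assert_eq!(parse_host_port(""), 0);
    println!("ok");
}
```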
#[tauri::command]
pub async fn clone_to_docker(
app: AppHandle,
state: State<'_, Arc<AppState>>,
params: CloneToDockerParams,
clone_id: String,
) -> TuskResult<CloneResult> {
let state = state.inner().clone();
let app_clone = app.clone();
tokio::spawn(async move { do_clone(&app_clone, &state, &params, &clone_id).await })
.await
.map_err(|e| docker_err(format!("Clone task panicked: {}", e)))?
}
/// Build a docker Command respecting the remote host setting
fn docker_cmd_sync(docker_host: &Option<String>) -> Command {
let mut cmd = Command::new("docker");
if let Some(ref h) = docker_host {
cmd.args(["-H", h]);
}
cmd
}
async fn check_docker_internal(docker_host: &Option<String>) -> TuskResult<DockerStatus> {
let output = docker_cmd_sync(docker_host)
.args(["version", "--format", "{{.Server.Version}}"])
.output()
.await;
match output {
Ok(out) => {
if out.status.success() {
let version = String::from_utf8_lossy(&out.stdout).trim().to_string();
Ok(DockerStatus {
installed: true,
daemon_running: true,
version: Some(version),
error: None,
})
} else {
let stderr = String::from_utf8_lossy(&out.stderr).trim().to_string();
let daemon_running = !stderr.contains("Cannot connect")
&& !stderr.contains("connection refused");
Ok(DockerStatus {
installed: true,
daemon_running,
version: None,
error: Some(stderr),
})
}
}
Err(_) => Ok(DockerStatus {
installed: false,
daemon_running: false,
version: None,
error: Some("Docker CLI not found. Please install Docker.".to_string()),
}),
}
}
async fn do_clone(
app: &AppHandle,
state: &Arc<AppState>,
params: &CloneToDockerParams,
clone_id: &str,
) -> TuskResult<CloneResult> {
// Validate user inputs before any operations
validate_pg_version(&params.pg_version)?;
validate_container_name(&params.container_name)?;
let docker_host = state.docker_host.read().await.clone();
// Step 1: Check Docker
emit_progress(app, clone_id, "checking", 5, "Checking Docker availability...", None);
let status = check_docker_internal(&docker_host).await?;
if !status.installed || !status.daemon_running {
let msg = status
.error
.unwrap_or_else(|| "Docker is not available".to_string());
emit_progress(app, clone_id, "error", 5, &msg, None);
return Err(docker_err(msg));
}
// Step 2: Find available port
emit_progress(app, clone_id, "port", 10, "Finding available port...", None);
let host_port = match params.host_port {
Some(p) => p,
None => find_free_port().await?,
};
emit_progress(app, clone_id, "port", 10, &format!("Using port {}", host_port), None);
// Step 3: Create container
emit_progress(app, clone_id, "container", 20, "Creating PostgreSQL container...", None);
let pg_password = params.postgres_password.as_deref().unwrap_or("tusk");
let image = format!("postgres:{}", params.pg_version);
let create_output = docker_cmd_sync(&docker_host)
.args([
"run", "-d",
"--name", &params.container_name,
"-p", &format!("{}:5432", host_port),
"-e", &format!("POSTGRES_PASSWORD={}", pg_password),
"-l", "tusk.managed=true",
"-l", &format!("tusk.source-db={}", params.source_database),
"-l", &format!("tusk.source-connection={}", params.source_connection_id),
"-l", &format!("tusk.pg-version={}", params.pg_version),
&image,
])
.output()
.await
.map_err(|e| docker_err(format!("Failed to create container: {}", e)))?;
if !create_output.status.success() {
let stderr = String::from_utf8_lossy(&create_output.stderr).trim().to_string();
emit_progress(app, clone_id, "error", 20, &format!("Failed to create container: {}", stderr), None);
return Err(docker_err(format!("Failed to create container: {}", stderr)));
}
let container_id = String::from_utf8_lossy(&create_output.stdout).trim().to_string();
// Step 4: Wait for PostgreSQL to be ready
emit_progress(app, clone_id, "waiting", 30, "Waiting for PostgreSQL to be ready...", None);
wait_for_pg_ready(&docker_host, &params.container_name, 30).await?;
emit_progress(app, clone_id, "waiting", 35, "PostgreSQL is ready", None);
// Step 5: Create target database
emit_progress(app, clone_id, "database", 35, &format!("Creating database '{}'...", params.source_database), None);
let create_db_output = docker_cmd_sync(&docker_host)
.args([
"exec", &params.container_name,
"psql", "-U", "postgres", "-c",
&format!("CREATE DATABASE {}", escape_ident(&params.source_database)),
])
.output()
.await
.map_err(|e| docker_err(format!("Failed to create database: {}", e)))?;
if !create_db_output.status.success() {
let stderr = String::from_utf8_lossy(&create_db_output.stderr).trim().to_string();
if !stderr.contains("already exists") {
emit_progress(app, clone_id, "error", 35, &format!("Failed to create database: {}", stderr), None);
return Err(docker_err(format!("Failed to create database: {}", stderr)));
}
}
// Step 6: Get source connection URL (using the specific database to clone)
emit_progress(app, clone_id, "dump", 40, "Preparing data transfer...", None);
let source_config = load_connection_config(app, &params.source_connection_id)?;
let source_url = source_config.connection_url_for_db(&params.source_database);
emit_progress(
app, clone_id, "dump", 40,
&format!("Source: {}@{}:{}/{}", source_config.user, source_config.host, source_config.port, params.source_database),
None,
);
// Step 7: Transfer data based on clone mode
match params.clone_mode {
CloneMode::SchemaOnly => {
emit_progress(app, clone_id, "transfer", 45, "Dumping schema...", None);
transfer_schema_only(app, clone_id, &source_url, &params.container_name, &params.source_database, &params.pg_version, &docker_host).await?;
}
CloneMode::FullClone => {
emit_progress(app, clone_id, "transfer", 45, "Performing full database clone...", None);
transfer_full_clone(app, clone_id, &source_url, &params.container_name, &params.source_database, &params.pg_version, &docker_host).await?;
}
CloneMode::SampleData => {
emit_progress(app, clone_id, "transfer", 45, "Dumping schema...", None);
transfer_schema_only(app, clone_id, &source_url, &params.container_name, &params.source_database, &params.pg_version, &docker_host).await?;
emit_progress(app, clone_id, "transfer", 65, "Copying sample data...", None);
let sample_rows = params.sample_rows.unwrap_or(1000);
transfer_sample_data(app, clone_id, &source_url, &params.container_name, &params.source_database, &params.pg_version, sample_rows, &docker_host).await?;
}
}
// Step 8: Save connection in Tusk
emit_progress(app, clone_id, "connection", 90, "Saving connection...", None);
let connection_id = uuid::Uuid::new_v4().to_string();
let new_config = ConnectionConfig {
id: connection_id.clone(),
name: format!("{} (Docker clone)", params.source_database),
host: "localhost".to_string(),
port: host_port,
user: "postgres".to_string(),
password: pg_password.to_string(),
database: params.source_database.clone(),
ssl_mode: Some("disable".to_string()),
color: Some("#06b6d4".to_string()),
environment: Some("local".to_string()),
};
save_connection_config(app, &new_config)?;
let connection_url = format!(
"postgres://postgres:{}@localhost:{}/{}",
pg_password, host_port, params.source_database
);
let container = TuskContainer {
container_id: container_id[..12.min(container_id.len())].to_string(),
name: params.container_name.clone(),
status: "Up".to_string(),
host_port,
pg_version: params.pg_version.clone(),
source_database: Some(params.source_database.clone()),
source_connection: Some(params.source_connection_id.clone()),
created_at: Some(chrono::Local::now().format("%Y-%m-%d %H:%M:%S").to_string()),
};
let result = CloneResult {
container,
connection_id,
connection_url,
};
emit_progress(app, clone_id, "done", 100, "Clone completed successfully!", None);
Ok(result)
}
async fn find_free_port() -> TuskResult<u16> {
let listener = tokio::net::TcpListener::bind("127.0.0.1:0")
.await
.map_err(|e| docker_err(format!("Failed to find free port: {}", e)))?;
let port = listener
.local_addr()
.map_err(|e| docker_err(format!("Failed to get port: {}", e)))?
.port();
drop(listener);
Ok(port)
}
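The ephemeral-port trick above (bind port 0, read back the OS-assigned port, drop the listener) can be sketched synchronously with std alone. Note the inherent small race: another process could claim the port between the drop and the container binding it.

```rust
use std::net::TcpListener;

// Synchronous sketch of find_free_port: the OS picks a free port for us.
fn find_free_port_sync() -> std::io::Result<u16> {
    let listener = TcpListener::bind("127.0.0.1:0")?;
    Ok(listener.local_addr()?.port())
    // listener is dropped here, releasing the port for the container.
}

fn main() {
    let port = find_free_port_sync().expect("bind failed");
    assert!(port > 0);
    println!("got port {}", port);
}
```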
async fn wait_for_pg_ready(docker_host: &Option<String>, container_name: &str, timeout_secs: u64) -> TuskResult<()> {
let start = std::time::Instant::now();
let timeout = std::time::Duration::from_secs(timeout_secs);
loop {
if start.elapsed() > timeout {
return Err(docker_err("PostgreSQL did not become ready in time"));
}
let output = docker_cmd_sync(docker_host)
.args(["exec", container_name, "pg_isready", "-U", "postgres"])
.output()
.await;
if let Ok(out) = output {
if out.status.success() {
return Ok(());
}
}
tokio::time::sleep(std::time::Duration::from_millis(500)).await;
}
}
async fn try_local_pg_dump() -> bool {
Command::new("pg_dump")
.arg("--version")
.output()
.await
.map(|o| o.status.success())
.unwrap_or(false)
}
/// Build the docker host flag string for shell commands
fn docker_host_flag(docker_host: &Option<String>) -> String {
match docker_host {
Some(h) => format!("-H '{}'", shell_escape(h)),
None => String::new(),
}
}
/// Build the pg_dump portion of a shell command
fn pg_dump_shell_cmd(has_local: bool, pg_version: &str, extra_args: &str, source_url: &str, docker_host: &Option<String>) -> String {
let escaped_url = shell_escape(source_url);
if has_local {
format!("pg_dump {} '{}'", extra_args, escaped_url)
} else {
let host_flag = docker_host_flag(docker_host);
format!(
"docker {} run --rm --network=host postgres:{} pg_dump {} '{}'",
host_flag, pg_version, extra_args, escaped_url
)
}
}
async fn run_pipe_cmd(
app: &AppHandle,
clone_id: &str,
pipe_cmd: &str,
label: &str,
) -> TuskResult<std::process::Output> {
// Use bash with pipefail so pg_dump failures are not swallowed
let wrapped = format!("set -o pipefail; {}", pipe_cmd);
emit_progress(app, clone_id, "transfer", 50, label, None);
let output = Command::new("bash")
.args(["-c", &wrapped])
.output()
.await
.map_err(|e| docker_err(format!("{} failed to start: {}", label, e)))?;
let stderr = String::from_utf8_lossy(&output.stderr).trim().to_string();
let stdout = String::from_utf8_lossy(&output.stdout).trim().to_string();
// Always log stderr if present
if !stderr.is_empty() {
// Truncate for progress display (full log can be long)
let short = if stderr.len() > 500 {
let truncated = stderr.char_indices()
.nth(500)
.map(|(i, _)| &stderr[..i])
.unwrap_or(&stderr);
format!("{}...", truncated)
} else {
stderr.clone()
};
emit_progress(app, clone_id, "transfer", 55, &format!("{}: stderr output", label), Some(&short));
}
// Count DDL statements in stdout for feedback
if !stdout.is_empty() {
let creates = stdout.lines()
.filter(|l| l.starts_with("CREATE") || l.starts_with("ALTER") || l.starts_with("SET"))
.count();
if creates > 0 {
emit_progress(app, clone_id, "transfer", 58, &format!("Applied {} SQL statements", creates), None);
}
}
if !output.status.success() {
let code = output.status.code().unwrap_or(-1);
emit_progress(
app, clone_id, "transfer", 55,
&format!("{} exited with code {}", label, code),
Some(&stderr),
);
// Only hard-fail on connection / fatal errors
if stderr.contains("FATAL") || stderr.contains("could not connect")
|| stderr.contains("No such file") || stderr.contains("password authentication failed")
|| stderr.contains("does not exist") || (stdout.is_empty() && stderr.is_empty())
{
return Err(docker_err(format!("{} failed (exit {}): {}", label, code, stderr)));
}
}
Ok(output)
}
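The `set -o pipefail` wrapper above is load-bearing: without it, bash reports the exit status of the last command in the pipeline (psql), so a failed pg_dump producer is silently swallowed. A standalone sketch of the difference, using the blocking `std::process` API for brevity (the app itself uses tokio's async `Command`):

```rust
use std::process::Command;

/// Run a shell snippet under bash and report whether it exited successfully.
/// (Blocking std sketch; run_pipe_cmd does the same via tokio::process.)
fn shell_succeeds(snippet: &str) -> bool {
    Command::new("bash")
        .args(["-c", snippet])
        .status()
        .map(|s| s.success())
        .unwrap_or(false)
}

fn main() {
    // Without pipefail, `false | cat` exits 0: bash reports cat's status.
    assert!(shell_succeeds("false | cat"));
    // With pipefail prepended, the failing producer's status surfaces.
    assert!(!shell_succeeds("set -o pipefail; false | cat"));
}
```

This is why run_pipe_cmd prepends `set -o pipefail;` to every pg_dump pipeline before handing it to `bash -c`.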
async fn transfer_schema_only(
app: &AppHandle,
clone_id: &str,
source_url: &str,
container_name: &str,
database: &str,
pg_version: &str,
docker_host: &Option<String>,
) -> TuskResult<()> {
let has_local = try_local_pg_dump().await;
let label = if has_local { "local pg_dump" } else { "Docker-based pg_dump" };
emit_progress(app, clone_id, "transfer", 48, &format!("Using {} for schema...", label), None);
let dump_cmd = pg_dump_shell_cmd(has_local, pg_version, "--schema-only --no-owner --no-acl", source_url, docker_host);
let escaped_db = shell_escape(database);
let host_flag = docker_host_flag(docker_host);
let pipe_cmd = format!(
"{} | docker {} exec -i '{}' psql -U postgres -d '{}'",
dump_cmd, host_flag, shell_escape(container_name), escaped_db
);
run_pipe_cmd(app, clone_id, &pipe_cmd, "Schema transfer").await?;
emit_progress(app, clone_id, "transfer", 60, "Schema transferred successfully", None);
Ok(())
}
async fn transfer_full_clone(
app: &AppHandle,
clone_id: &str,
source_url: &str,
container_name: &str,
database: &str,
pg_version: &str,
docker_host: &Option<String>,
) -> TuskResult<()> {
let has_local = try_local_pg_dump().await;
let label = if has_local { "local pg_dump" } else { "Docker-based pg_dump" };
emit_progress(app, clone_id, "transfer", 48, &format!("Using {} for full clone...", label), None);
// Use plain text format piped to psql (more reliable than -Fc | pg_restore through docker exec)
let dump_cmd = pg_dump_shell_cmd(has_local, pg_version, "--no-owner --no-acl", source_url, docker_host);
let escaped_db = shell_escape(database);
let host_flag = docker_host_flag(docker_host);
let pipe_cmd = format!(
"{} | docker {} exec -i '{}' psql -U postgres -d '{}'",
dump_cmd, host_flag, shell_escape(container_name), escaped_db
);
run_pipe_cmd(app, clone_id, &pipe_cmd, "Full clone").await?;
emit_progress(app, clone_id, "transfer", 85, "Full clone completed", None);
Ok(())
}
async fn transfer_sample_data(
app: &AppHandle,
clone_id: &str,
source_url: &str,
container_name: &str,
database: &str,
pg_version: &str,
sample_rows: u32,
docker_host: &Option<String>,
) -> TuskResult<()> {
// List tables from the target (schema already transferred)
let target_output = docker_cmd_sync(docker_host)
.args([
"exec", container_name,
"psql", "-U", "postgres", "-d", database,
"-t", "-A", "-c",
"SELECT schemaname || '.' || tablename FROM pg_tables WHERE schemaname NOT IN ('pg_catalog', 'information_schema') ORDER BY schemaname, tablename",
])
.output()
.await
.map_err(|e| docker_err(format!("Failed to list tables: {}", e)))?;
let tables_str = String::from_utf8_lossy(&target_output.stdout);
let tables: Vec<&str> = tables_str.lines().filter(|l| !l.trim().is_empty()).collect();
let total = tables.len();
if total == 0 {
emit_progress(app, clone_id, "transfer", 85, "No tables to copy data for", None);
return Ok(());
}
let has_local = try_local_pg_dump().await;
for (i, qualified_table) in tables.iter().enumerate() {
let pct = 65 + ((i * 20) / total.max(1)).min(20) as u8;
emit_progress(
app, clone_id, "transfer", pct,
&format!("Copying sample data: {} ({}/{})", qualified_table, i + 1, total),
None,
);
let parts: Vec<&str> = qualified_table.splitn(2, '.').collect();
if parts.len() != 2 {
continue;
}
let schema = parts[0];
let table = parts[1];
// Use COPY (SELECT ... LIMIT N) TO STDOUT piped to COPY ... FROM STDIN
// Escape schema/table for use inside double-quoted shell strings
let escaped_schema = shell_escape_double(schema);
let escaped_table = shell_escape_double(table);
let copy_out_sql = format!(
"\\copy (SELECT * FROM \\\"{}\\\".\\\"{}\\\" LIMIT {}) TO STDOUT",
escaped_schema, escaped_table, sample_rows
);
let copy_in_sql = format!(
"\\copy \\\"{}\\\".\\\"{}\\\" FROM STDIN",
escaped_schema, escaped_table
);
let escaped_url = shell_escape(source_url);
let escaped_container = shell_escape(container_name);
let escaped_db = shell_escape(database);
let host_flag = docker_host_flag(docker_host);
let source_cmd = if has_local {
format!("psql '{}' -c \"{}\"", escaped_url, copy_out_sql)
} else {
let image = format!("postgres:{}", pg_version);
format!(
"docker {} run --rm --network=host {} psql '{}' -c \"{}\"",
host_flag, image, escaped_url, copy_out_sql
)
};
let pipe_cmd = format!(
"set -o pipefail; {} | docker {} exec -i '{}' psql -U postgres -d '{}' -c \"{}\"",
source_cmd, host_flag, escaped_container, escaped_db, copy_in_sql
);
let output = Command::new("bash")
.args(["-c", &pipe_cmd])
.output()
.await;
match output {
Ok(out) => {
let stderr = String::from_utf8_lossy(&out.stderr).trim().to_string();
if !stderr.is_empty() && (stderr.contains("ERROR") || stderr.contains("FATAL")) {
emit_progress(
app, clone_id, "transfer", pct,
&format!("Warning: {}", qualified_table),
Some(&stderr),
);
}
}
Err(e) => {
emit_progress(
app, clone_id, "transfer", pct,
&format!("Warning: failed to copy {}: {}", qualified_table, e),
None,
);
}
}
}
emit_progress(app, clone_id, "transfer", 85, "Sample data transfer completed", None);
Ok(())
}
fn save_connection_config(app: &AppHandle, config: &ConnectionConfig) -> TuskResult<()> {
let path = super::connections::get_connections_path(app)?;
let mut connections = if path.exists() {
let data = fs::read_to_string(&path)?;
serde_json::from_str::<Vec<ConnectionConfig>>(&data)?
} else {
vec![]
};
// Upsert by ID to avoid duplicate entries on retry
if let Some(pos) = connections.iter().position(|c| c.id == config.id) {
connections[pos] = config.clone();
} else {
connections.push(config.clone());
}
let data = serde_json::to_string_pretty(&connections)?;
fs::write(&path, data)?;
Ok(())
}
#[tauri::command]
pub async fn start_container(state: State<'_, Arc<AppState>>, name: String) -> TuskResult<()> {
let output = docker_cmd(&state)
.await
.args(["start", &name])
.output()
.await
.map_err(|e| docker_err(format!("Failed to start container: {}", e)))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(docker_err(format!("Failed to start container: {}", stderr)));
}
Ok(())
}
#[tauri::command]
pub async fn stop_container(state: State<'_, Arc<AppState>>, name: String) -> TuskResult<()> {
let output = docker_cmd(&state)
.await
.args(["stop", &name])
.output()
.await
.map_err(|e| docker_err(format!("Failed to stop container: {}", e)))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(docker_err(format!("Failed to stop container: {}", stderr)));
}
Ok(())
}
#[tauri::command]
pub async fn remove_container(state: State<'_, Arc<AppState>>, name: String) -> TuskResult<()> {
let output = docker_cmd(&state)
.await
.args(["rm", "-f", &name])
.output()
.await
.map_err(|e| docker_err(format!("Failed to remove container: {}", e)))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(docker_err(format!("Failed to remove container: {}", stderr)));
}
Ok(())
}

@@ -1,6 +1,7 @@
 pub mod ai;
 pub mod connections;
 pub mod data;
+pub mod docker;
 pub mod export;
 pub mod history;
 pub mod lookup;
@@ -8,3 +9,5 @@ pub mod management;
 pub mod queries;
 pub mod saved_queries;
 pub mod schema;
+pub mod snapshot;
+pub mod settings;

@@ -1,5 +1,8 @@
 use crate::error::{TuskError, TuskResult};
-use crate::models::schema::{ColumnDetail, ColumnInfo, ConstraintInfo, IndexInfo, SchemaObject};
+use crate::models::schema::{
+    ColumnDetail, ColumnInfo, ConstraintInfo, ErdColumn, ErdData, ErdRelationship, ErdTable,
+    IndexInfo, SchemaObject, TriggerInfo,
+};
 use crate::state::{AppState, DbFlavor};
 use sqlx::Row;
 use std::collections::HashMap;
@@ -11,17 +14,14 @@ pub async fn list_databases(
     state: State<'_, Arc<AppState>>,
     connection_id: String,
 ) -> TuskResult<Vec<String>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT datname FROM pg_database \
          WHERE datistemplate = false \
          ORDER BY datname",
     )
-    .fetch_all(pool)
+    .fetch_all(&pool)
     .await
     .map_err(TuskError::Database)?;
@@ -32,10 +32,7 @@ pub async fn list_schemas_core(
     state: &AppState,
     connection_id: &str,
 ) -> TuskResult<Vec<String>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(connection_id)
-        .ok_or_else(|| TuskError::NotConnected(connection_id.to_string()))?;
+    let pool = state.get_pool(connection_id).await?;
     let flavor = state.get_flavor(connection_id).await;

     let sql = if flavor == DbFlavor::Greenplum {
@@ -49,7 +46,7 @@ pub async fn list_schemas_core(
     };

     let rows = sqlx::query(sql)
-        .fetch_all(pool)
+        .fetch_all(&pool)
         .await
         .map_err(TuskError::Database)?;
@@ -69,10 +66,7 @@ pub async fn list_tables_core(
     connection_id: &str,
     schema: &str,
 ) -> TuskResult<Vec<SchemaObject>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(connection_id)
-        .ok_or_else(|| TuskError::NotConnected(connection_id.to_string()))?;
+    let pool = state.get_pool(connection_id).await?;
     let rows = sqlx::query(
         "SELECT t.table_name, \
@@ -85,7 +79,7 @@ pub async fn list_tables_core(
ORDER BY t.table_name", ORDER BY t.table_name",
) )
.bind(schema) .bind(schema)
.fetch_all(pool) .fetch_all(&pool)
.await .await
.map_err(TuskError::Database)?; .map_err(TuskError::Database)?;
@@ -116,10 +110,7 @@ pub async fn list_views(
     connection_id: String,
     schema: String,
 ) -> TuskResult<Vec<SchemaObject>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT table_name FROM information_schema.views \
@@ -127,7 +118,7 @@ pub async fn list_views(
ORDER BY table_name", ORDER BY table_name",
) )
.bind(&schema) .bind(&schema)
.fetch_all(pool) .fetch_all(&pool)
.await .await
.map_err(TuskError::Database)?; .map_err(TuskError::Database)?;
@@ -149,10 +140,7 @@ pub async fn list_functions(
     connection_id: String,
     schema: String,
 ) -> TuskResult<Vec<SchemaObject>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT routine_name FROM information_schema.routines \
@@ -160,7 +148,7 @@ pub async fn list_functions(
ORDER BY routine_name", ORDER BY routine_name",
) )
.bind(&schema) .bind(&schema)
.fetch_all(pool) .fetch_all(&pool)
.await .await
.map_err(TuskError::Database)?; .map_err(TuskError::Database)?;
@@ -182,10 +170,7 @@ pub async fn list_indexes(
     connection_id: String,
     schema: String,
 ) -> TuskResult<Vec<SchemaObject>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT indexname FROM pg_indexes \
@@ -193,7 +178,7 @@ pub async fn list_indexes(
ORDER BY indexname", ORDER BY indexname",
) )
.bind(&schema) .bind(&schema)
.fetch_all(pool) .fetch_all(&pool)
.await .await
.map_err(TuskError::Database)?; .map_err(TuskError::Database)?;
@@ -215,10 +200,7 @@ pub async fn list_sequences(
     connection_id: String,
     schema: String,
 ) -> TuskResult<Vec<SchemaObject>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT sequence_name FROM information_schema.sequences \
@@ -226,7 +208,7 @@ pub async fn list_sequences(
ORDER BY sequence_name", ORDER BY sequence_name",
) )
.bind(&schema) .bind(&schema)
.fetch_all(pool) .fetch_all(&pool)
.await .await
.map_err(TuskError::Database)?; .map_err(TuskError::Database)?;
@@ -248,10 +230,7 @@ pub async fn get_table_columns_core(
     schema: &str,
     table: &str,
 ) -> TuskResult<Vec<ColumnInfo>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(connection_id)
-        .ok_or_else(|| TuskError::NotConnected(connection_id.to_string()))?;
+    let pool = state.get_pool(connection_id).await?;
     let rows = sqlx::query(
         "SELECT \
@@ -271,14 +250,20 @@ pub async fn get_table_columns_core(
              AND tc.table_name = $2 \
              AND kcu.column_name = c.column_name \
              LIMIT 1 \
-         ), false) as is_pk \
+         ), false) as is_pk, \
+         col_description( \
+             (SELECT oid FROM pg_class \
+              JOIN pg_namespace ON pg_namespace.oid = pg_class.relnamespace \
+              WHERE relname = $2 AND nspname = $1), \
+             c.ordinal_position \
+         ) as col_comment \
          FROM information_schema.columns c \
          WHERE c.table_schema = $1 AND c.table_name = $2 \
          ORDER BY c.ordinal_position",
     )
     .bind(schema)
     .bind(table)
-    .fetch_all(pool)
+    .fetch_all(&pool)
     .await
     .map_err(TuskError::Database)?;
@@ -292,6 +277,7 @@ pub async fn get_table_columns_core(
             ordinal_position: r.get::<i32, _>(4),
             character_maximum_length: r.get::<Option<i32>, _>(5),
             is_primary_key: r.get::<bool, _>(6),
+            comment: r.get::<Option<String>, _>(7),
         })
         .collect())
 }
@@ -313,27 +299,57 @@ pub async fn get_table_constraints(
     schema: String,
     table: String,
 ) -> TuskResult<Vec<ConstraintInfo>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT \
-         tc.constraint_name, \
-         tc.constraint_type, \
-         array_agg(kcu.column_name ORDER BY kcu.ordinal_position)::text[] as columns \
-         FROM information_schema.table_constraints tc \
-         JOIN information_schema.key_column_usage kcu \
-           ON tc.constraint_name = kcu.constraint_name \
-          AND tc.table_schema = kcu.table_schema \
-         WHERE tc.table_schema = $1 AND tc.table_name = $2 \
-         GROUP BY tc.constraint_name, tc.constraint_type \
-         ORDER BY tc.constraint_type, tc.constraint_name",
+         c.conname AS constraint_name, \
+         CASE c.contype \
+             WHEN 'p' THEN 'PRIMARY KEY' \
+             WHEN 'f' THEN 'FOREIGN KEY' \
+             WHEN 'u' THEN 'UNIQUE' \
+             WHEN 'c' THEN 'CHECK' \
+             WHEN 'x' THEN 'EXCLUDE' \
+         END AS constraint_type, \
+         ARRAY( \
+             SELECT a.attname FROM unnest(c.conkey) WITH ORDINALITY AS k(attnum, ord) \
+             JOIN pg_attribute a ON a.attrelid = c.conrelid AND a.attnum = k.attnum \
+             ORDER BY k.ord \
+         )::text[] AS columns, \
+         ref_ns.nspname AS referenced_schema, \
+         ref_cl.relname AS referenced_table, \
+         CASE WHEN c.confrelid > 0 THEN ARRAY( \
+             SELECT a.attname FROM unnest(c.confkey) WITH ORDINALITY AS k(attnum, ord) \
+             JOIN pg_attribute a ON a.attrelid = c.confrelid AND a.attnum = k.attnum \
+             ORDER BY k.ord \
+         )::text[] ELSE NULL END AS referenced_columns, \
+         CASE c.confupdtype \
+             WHEN 'a' THEN 'NO ACTION' \
+             WHEN 'r' THEN 'RESTRICT' \
+             WHEN 'c' THEN 'CASCADE' \
+             WHEN 'n' THEN 'SET NULL' \
+             WHEN 'd' THEN 'SET DEFAULT' \
+             ELSE NULL \
+         END AS update_rule, \
+         CASE c.confdeltype \
+             WHEN 'a' THEN 'NO ACTION' \
+             WHEN 'r' THEN 'RESTRICT' \
+             WHEN 'c' THEN 'CASCADE' \
+             WHEN 'n' THEN 'SET NULL' \
+             WHEN 'd' THEN 'SET DEFAULT' \
+             ELSE NULL \
+         END AS delete_rule \
+         FROM pg_constraint c \
+         JOIN pg_class cl ON cl.oid = c.conrelid \
+         JOIN pg_namespace ns ON ns.oid = cl.relnamespace \
+         LEFT JOIN pg_class ref_cl ON ref_cl.oid = c.confrelid \
+         LEFT JOIN pg_namespace ref_ns ON ref_ns.oid = ref_cl.relnamespace \
+         WHERE ns.nspname = $1 AND cl.relname = $2 \
+         ORDER BY c.contype, c.conname",
     )
     .bind(&schema)
     .bind(&table)
-    .fetch_all(pool)
+    .fetch_all(&pool)
     .await
     .map_err(TuskError::Database)?;
@@ -343,6 +359,11 @@ pub async fn get_table_constraints(
             name: r.get::<String, _>(0),
             constraint_type: r.get::<String, _>(1),
             columns: r.get::<Vec<String>, _>(2),
+            referenced_schema: r.get::<Option<String>, _>(3),
+            referenced_table: r.get::<Option<String>, _>(4),
+            referenced_columns: r.get::<Option<Vec<String>>, _>(5),
+            update_rule: r.get::<Option<String>, _>(6),
+            delete_rule: r.get::<Option<String>, _>(7),
         })
         .collect())
 }
@@ -354,10 +375,7 @@ pub async fn get_table_indexes(
     schema: String,
     table: String,
 ) -> TuskResult<Vec<IndexInfo>> {
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let rows = sqlx::query(
         "SELECT \
@@ -374,7 +392,7 @@ pub async fn get_table_indexes(
     )
     .bind(&schema)
     .bind(&table)
-    .fetch_all(pool)
+    .fetch_all(&pool)
     .await
     .map_err(TuskError::Database)?;
@@ -395,10 +413,7 @@ pub async fn get_completion_schema(
     connection_id: String,
 ) -> TuskResult<HashMap<String, HashMap<String, Vec<String>>>> {
     let flavor = state.get_flavor(&connection_id).await;
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let sql = if flavor == DbFlavor::Greenplum {
         "SELECT table_schema, table_name, column_name \
@@ -413,7 +428,7 @@ pub async fn get_completion_schema(
     };
     let rows = sqlx::query(sql)
-        .fetch_all(pool)
+        .fetch_all(&pool)
         .await
         .map_err(TuskError::Database)?;
@@ -442,10 +457,7 @@ pub async fn get_column_details(
     table: String,
 ) -> TuskResult<Vec<ColumnDetail>> {
     let flavor = state.get_flavor(&connection_id).await;
-    let pools = state.pools.read().await;
-    let pool = pools
-        .get(&connection_id)
-        .ok_or(TuskError::NotConnected(connection_id))?;
+    let pool = state.get_pool(&connection_id).await?;
     let sql = if flavor == DbFlavor::Greenplum {
         "SELECT c.column_name, c.data_type, \
@@ -468,7 +480,7 @@ pub async fn get_column_details(
     let rows = sqlx::query(sql)
         .bind(&schema)
         .bind(&table)
-        .fetch_all(pool)
+        .fetch_all(&pool)
         .await
         .map_err(TuskError::Database)?;
@@ -483,3 +495,180 @@ pub async fn get_column_details(
         })
         .collect())
 }
#[tauri::command]
pub async fn get_table_triggers(
state: State<'_, Arc<AppState>>,
connection_id: String,
schema: String,
table: String,
) -> TuskResult<Vec<TriggerInfo>> {
let pool = state.get_pool(&connection_id).await?;
let rows = sqlx::query(
"SELECT \
t.tgname AS trigger_name, \
CASE \
WHEN t.tgtype::int & 2 = 2 THEN 'BEFORE' \
WHEN t.tgtype::int & 2 = 0 AND t.tgtype::int & 64 = 64 THEN 'INSTEAD OF' \
ELSE 'AFTER' \
END AS timing, \
array_to_string(ARRAY[ \
CASE WHEN t.tgtype::int & 4 = 4 THEN 'INSERT' ELSE NULL END, \
CASE WHEN t.tgtype::int & 8 = 8 THEN 'DELETE' ELSE NULL END, \
CASE WHEN t.tgtype::int & 16 = 16 THEN 'UPDATE' ELSE NULL END, \
CASE WHEN t.tgtype::int & 32 = 32 THEN 'TRUNCATE' ELSE NULL END \
], ' OR ') AS event, \
CASE WHEN t.tgtype::int & 1 = 1 THEN 'ROW' ELSE 'STATEMENT' END AS orientation, \
p.proname AS function_name, \
t.tgenabled != 'D' AS is_enabled, \
pg_get_triggerdef(t.oid) AS definition \
FROM pg_trigger t \
JOIN pg_class c ON c.oid = t.tgrelid \
JOIN pg_namespace n ON n.oid = c.relnamespace \
JOIN pg_proc p ON p.oid = t.tgfoid \
WHERE n.nspname = $1 AND c.relname = $2 AND NOT t.tgisinternal \
ORDER BY t.tgname",
)
.bind(&schema)
.bind(&table)
.fetch_all(&pool)
.await
.map_err(TuskError::Database)?;
Ok(rows
.iter()
.map(|r| TriggerInfo {
name: r.get::<String, _>(0),
timing: r.get::<String, _>(1),
event: r.get::<String, _>(2),
orientation: r.get::<String, _>(3),
function_name: r.get::<String, _>(4),
is_enabled: r.get::<bool, _>(5),
definition: r.get::<String, _>(6),
})
.collect())
}
#[tauri::command]
pub async fn get_schema_erd(
state: State<'_, Arc<AppState>>,
connection_id: String,
schema: String,
) -> TuskResult<ErdData> {
let pool = state.get_pool(&connection_id).await?;
// Get all tables with columns
let col_rows = sqlx::query(
"SELECT \
c.table_name, \
c.column_name, \
c.data_type, \
c.is_nullable = 'YES' AS is_nullable, \
COALESCE(( \
SELECT true FROM pg_constraint con \
JOIN pg_class cl ON cl.oid = con.conrelid \
JOIN pg_namespace ns ON ns.oid = cl.relnamespace \
WHERE con.contype = 'p' \
AND ns.nspname = $1 AND cl.relname = c.table_name \
AND EXISTS ( \
SELECT 1 FROM unnest(con.conkey) k \
JOIN pg_attribute a ON a.attrelid = con.conrelid AND a.attnum = k \
WHERE a.attname = c.column_name \
) \
LIMIT 1 \
), false) AS is_pk \
FROM information_schema.columns c \
JOIN information_schema.tables t \
ON t.table_schema = c.table_schema AND t.table_name = c.table_name \
WHERE c.table_schema = $1 AND t.table_type = 'BASE TABLE' \
ORDER BY c.table_name, c.ordinal_position",
)
.bind(&schema)
.fetch_all(&pool)
.await
.map_err(TuskError::Database)?;
// Build tables map
let mut tables_map: HashMap<String, ErdTable> = HashMap::new();
for row in &col_rows {
let table_name: String = row.get(0);
let entry = tables_map.entry(table_name.clone()).or_insert_with(|| ErdTable {
schema: schema.clone(),
name: table_name,
columns: Vec::new(),
});
entry.columns.push(ErdColumn {
name: row.get(1),
data_type: row.get(2),
is_nullable: row.get(3),
is_primary_key: row.get(4),
});
}
let tables: Vec<ErdTable> = tables_map.into_values().collect();
// Get all FK relationships
let fk_rows = sqlx::query(
"SELECT \
c.conname AS constraint_name, \
src_ns.nspname AS source_schema, \
src_cl.relname AS source_table, \
ARRAY( \
SELECT a.attname FROM unnest(c.conkey) WITH ORDINALITY AS k(attnum, ord) \
JOIN pg_attribute a ON a.attrelid = c.conrelid AND a.attnum = k.attnum \
ORDER BY k.ord \
)::text[] AS source_columns, \
ref_ns.nspname AS target_schema, \
ref_cl.relname AS target_table, \
ARRAY( \
SELECT a.attname FROM unnest(c.confkey) WITH ORDINALITY AS k(attnum, ord) \
JOIN pg_attribute a ON a.attrelid = c.confrelid AND a.attnum = k.attnum \
ORDER BY k.ord \
)::text[] AS target_columns, \
CASE c.confupdtype \
WHEN 'a' THEN 'NO ACTION' \
WHEN 'r' THEN 'RESTRICT' \
WHEN 'c' THEN 'CASCADE' \
WHEN 'n' THEN 'SET NULL' \
WHEN 'd' THEN 'SET DEFAULT' \
END AS update_rule, \
CASE c.confdeltype \
WHEN 'a' THEN 'NO ACTION' \
WHEN 'r' THEN 'RESTRICT' \
WHEN 'c' THEN 'CASCADE' \
WHEN 'n' THEN 'SET NULL' \
WHEN 'd' THEN 'SET DEFAULT' \
END AS delete_rule \
FROM pg_constraint c \
JOIN pg_class src_cl ON src_cl.oid = c.conrelid \
JOIN pg_namespace src_ns ON src_ns.oid = src_cl.relnamespace \
JOIN pg_class ref_cl ON ref_cl.oid = c.confrelid \
JOIN pg_namespace ref_ns ON ref_ns.oid = ref_cl.relnamespace \
WHERE c.contype = 'f' AND src_ns.nspname = $1 \
ORDER BY c.conname",
)
.bind(&schema)
.fetch_all(&pool)
.await
.map_err(TuskError::Database)?;
let relationships: Vec<ErdRelationship> = fk_rows
.iter()
.map(|r| ErdRelationship {
constraint_name: r.get(0),
source_schema: r.get(1),
source_table: r.get(2),
source_columns: r.get(3),
target_schema: r.get(4),
target_table: r.get(5),
target_columns: r.get(6),
update_rule: r.get(7),
delete_rule: r.get(8),
})
.collect();
Ok(ErdData {
tables,
relationships,
})
}
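The bit tests in get_table_triggers decode pg_trigger.tgtype, whose layout (taken from the SQL above) is: 1 = ROW-level, 2 = BEFORE, 64 = INSTEAD OF, and 4/8/16/32 = INSERT/DELETE/UPDATE/TRUNCATE. The same decoding in plain Rust, as a sketch for sanity-checking the SQL's CASE expressions:

```rust
/// Decode pg_trigger.tgtype into (timing, events, orientation).
/// Bit layout assumed from the SQL in get_table_triggers.
fn decode_tgtype(tgtype: i32) -> (&'static str, Vec<&'static str>, &'static str) {
    // BEFORE wins over INSTEAD OF, matching the SQL's CASE order.
    let timing = if tgtype & 2 != 0 {
        "BEFORE"
    } else if tgtype & 64 != 0 {
        "INSTEAD OF"
    } else {
        "AFTER"
    };
    let mut events = Vec::new();
    for (bit, name) in [(4, "INSERT"), (8, "DELETE"), (16, "UPDATE"), (32, "TRUNCATE")] {
        if tgtype & bit != 0 {
            events.push(name);
        }
    }
    let orientation = if tgtype & 1 != 0 { "ROW" } else { "STATEMENT" };
    (timing, events, orientation)
}
```

For example, a classic `BEFORE INSERT ... FOR EACH ROW` trigger has tgtype 7 (1 + 2 + 4).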

@@ -0,0 +1,116 @@
use crate::error::{TuskError, TuskResult};
use crate::mcp;
use crate::models::settings::{AppSettings, DockerHost, McpStatus};
use crate::state::AppState;
use std::fs;
use std::sync::Arc;
use tauri::{AppHandle, Manager, State};
fn get_settings_path(app: &AppHandle) -> TuskResult<std::path::PathBuf> {
let dir = app
.path()
.app_data_dir()
.map_err(|e| TuskError::Custom(e.to_string()))?;
fs::create_dir_all(&dir)?;
Ok(dir.join("app_settings.json"))
}
#[tauri::command]
pub async fn get_app_settings(app: AppHandle) -> TuskResult<AppSettings> {
let path = get_settings_path(&app)?;
if !path.exists() {
return Ok(AppSettings::default());
}
let data = fs::read_to_string(&path)?;
let settings: AppSettings = serde_json::from_str(&data)?;
Ok(settings)
}
#[tauri::command]
pub async fn save_app_settings(
app: AppHandle,
state: State<'_, Arc<AppState>>,
settings: AppSettings,
) -> TuskResult<()> {
let path = get_settings_path(&app)?;
let data = serde_json::to_string_pretty(&settings)?;
fs::write(&path, data)?;
// Apply docker host setting
{
let mut docker_host = state.docker_host.write().await;
*docker_host = match settings.docker.host {
DockerHost::Remote => settings.docker.remote_url.clone(),
DockerHost::Local => None,
};
}
// Apply MCP setting: restart or stop
let is_running = *state.mcp_running.read().await;
if settings.mcp.enabled {
if is_running {
// Stop existing MCP server first
let _ = state.mcp_shutdown_tx.send(true);
// Give it a moment to shut down
tokio::time::sleep(std::time::Duration::from_millis(200)).await;
*state.mcp_running.write().await = false;
}
// Start new MCP server
let connections_path = app
.path()
.app_data_dir()
.map_err(|e| TuskError::Custom(e.to_string()))?
.join("connections.json");
let mcp_state = state.inner().clone();
let port = settings.mcp.port;
let shutdown_rx = state.mcp_shutdown_tx.subscribe();
tokio::spawn(async move {
*mcp_state.mcp_running.write().await = true;
if let Err(e) =
mcp::start_mcp_server(mcp_state.clone(), connections_path, port, shutdown_rx).await
{
log::error!("MCP server error: {}", e);
}
*mcp_state.mcp_running.write().await = false;
});
} else if is_running {
// Stop MCP server
let _ = state.mcp_shutdown_tx.send(true);
*state.mcp_running.write().await = false;
}
Ok(())
}
#[tauri::command]
pub async fn get_mcp_status(app: AppHandle) -> TuskResult<McpStatus> {
// Read settings from file for enabled/port
let settings = {
let path = get_settings_path(&app)?;
if path.exists() {
let data = fs::read_to_string(&path)?;
serde_json::from_str::<AppSettings>(&data).unwrap_or_default()
} else {
AppSettings::default()
}
};
// Probe the actual port to determine if MCP is running
let running = tokio::time::timeout(
std::time::Duration::from_millis(500),
tokio::net::TcpStream::connect(format!("127.0.0.1:{}", settings.mcp.port)),
)
.await
.map(|r| r.is_ok())
.unwrap_or(false);
Ok(McpStatus {
enabled: settings.mcp.enabled,
port: settings.mcp.port,
running,
})
}
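get_mcp_status deliberately probes the port instead of trusting the `mcp_running` flag, since the spawned server task may have died. A blocking `std::net` sketch of the same probe (the command uses tokio's async TcpStream); note that a successful connect only proves *something* is listening on that port, not that it is the MCP server:

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

/// Heuristic liveness check: can we open a TCP connection to localhost:port
/// within 500 ms? Mirrors the tokio-based probe in get_mcp_status.
fn port_in_use(port: u16) -> bool {
    let addr: SocketAddr = format!("127.0.0.1:{}", port)
        .parse()
        .expect("loopback address always parses");
    TcpStream::connect_timeout(&addr, Duration::from_millis(500)).is_ok()
}
```

A refused connection fails fast (well under the timeout), so the probe only costs the full 500 ms when packets are being dropped rather than rejected.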

@@ -0,0 +1,350 @@
use crate::commands::ai::fetch_foreign_keys_raw;
use crate::commands::data::bind_json_value;
use crate::commands::queries::pg_value_to_json;
use crate::error::{TuskError, TuskResult};
use crate::models::snapshot::{
CreateSnapshotParams, RestoreSnapshotParams, Snapshot, SnapshotMetadata, SnapshotProgress,
SnapshotTableData, SnapshotTableMeta,
};
use crate::state::AppState;
use crate::utils::{escape_ident, topological_sort_tables};
use serde_json::Value;
use sqlx::{Column, Row, TypeInfo};
use std::fs;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Manager, State};
#[tauri::command]
pub async fn create_snapshot(
app: AppHandle,
state: State<'_, Arc<AppState>>,
params: CreateSnapshotParams,
snapshot_id: String,
file_path: String,
) -> TuskResult<SnapshotMetadata> {
let pool = state.get_pool(&params.connection_id).await?;
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "preparing".to_string(),
percent: 5,
message: "Preparing snapshot...".to_string(),
detail: None,
},
);
let mut target_tables: Vec<(String, String)> = params
.tables
.iter()
.map(|t| (t.schema.clone(), t.table.clone()))
.collect();
// Fetch FK info once — used for both dependency expansion and topological sort
let fk_rows = fetch_foreign_keys_raw(&pool).await?;
if params.include_dependencies {
// Collect the selected tables once instead of rebuilding the Vec on every FK iteration
let selected: Vec<(String, String)> = params
.tables
.iter()
.map(|t| (t.schema.clone(), t.table.clone()))
.collect();
for fk in &fk_rows {
for (schema, table) in &selected {
if &fk.schema == schema && &fk.table == table {
let parent = (fk.ref_schema.clone(), fk.ref_table.clone());
if !target_tables.contains(&parent) {
target_tables.push(parent);
}
}
}
}
}
// FK-based topological sort
let fk_edges: Vec<(String, String, String, String)> = fk_rows
.iter()
.map(|fk| (fk.schema.clone(), fk.table.clone(), fk.ref_schema.clone(), fk.ref_table.clone()))
.collect();
let sorted_tables = topological_sort_tables(&fk_edges, &target_tables);
let mut tx = pool.begin().await.map_err(TuskError::Database)?;
sqlx::query("SET TRANSACTION READ ONLY")
.execute(&mut *tx)
.await
.map_err(TuskError::Database)?;
let total_tables = sorted_tables.len();
let mut snapshot_tables: Vec<SnapshotTableData> = Vec::new();
let mut table_metas: Vec<SnapshotTableMeta> = Vec::new();
let mut total_rows: u64 = 0;
for (i, (schema, table)) in sorted_tables.iter().enumerate() {
let percent = (10 + (i * 80 / total_tables.max(1))).min(90) as u8;
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "exporting".to_string(),
percent,
message: format!("Exporting {}.{}...", schema, table),
detail: None,
},
);
let qualified = format!("{}.{}", escape_ident(schema), escape_ident(table));
let sql = format!("SELECT * FROM {}", qualified);
let rows = sqlx::query(&sql)
.fetch_all(&mut *tx)
.await
.map_err(TuskError::Database)?;
let mut columns = Vec::new();
let mut column_types = Vec::new();
if let Some(first) = rows.first() {
for col in first.columns() {
columns.push(col.name().to_string());
column_types.push(col.type_info().name().to_string());
}
}
let data_rows: Vec<Vec<Value>> = rows
.iter()
.map(|row| (0..columns.len()).map(|i| pg_value_to_json(row, i)).collect())
.collect();
let row_count = data_rows.len() as u64;
total_rows += row_count;
table_metas.push(SnapshotTableMeta {
schema: schema.clone(),
table: table.clone(),
row_count,
columns: columns.clone(),
column_types: column_types.clone(),
});
snapshot_tables.push(SnapshotTableData {
schema: schema.clone(),
table: table.clone(),
columns,
column_types,
rows: data_rows,
});
}
tx.rollback().await.map_err(TuskError::Database)?;
let metadata = SnapshotMetadata {
id: snapshot_id.clone(),
name: params.name.clone(),
created_at: chrono::Utc::now().to_rfc3339(),
connection_name: String::new(),
database: String::new(),
tables: table_metas,
total_rows,
file_size_bytes: 0,
version: 1,
};
let snapshot = Snapshot {
metadata: metadata.clone(),
tables: snapshot_tables,
};
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "saving".to_string(),
percent: 95,
message: "Saving snapshot file...".to_string(),
detail: None,
},
);
let json = serde_json::to_string_pretty(&snapshot)?;
let file_size = json.len() as u64;
fs::write(&file_path, json)?;
let mut final_metadata = metadata;
final_metadata.file_size_bytes = file_size;
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "done".to_string(),
percent: 100,
message: "Snapshot created successfully".to_string(),
detail: Some(format!("{} rows, {} tables", total_rows, total_tables)),
},
);
Ok(final_metadata)
}
#[tauri::command]
pub async fn restore_snapshot(
app: AppHandle,
state: State<'_, Arc<AppState>>,
params: RestoreSnapshotParams,
snapshot_id: String,
) -> TuskResult<u64> {
if state.is_read_only(&params.connection_id).await {
return Err(TuskError::ReadOnly);
}
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "reading".to_string(),
percent: 5,
message: "Reading snapshot file...".to_string(),
detail: None,
},
);
let data = fs::read_to_string(&params.file_path)?;
let snapshot: Snapshot = serde_json::from_str(&data)?;
let pool = state.get_pool(&params.connection_id).await?;
let mut tx = pool.begin().await.map_err(TuskError::Database)?;
sqlx::query("SET CONSTRAINTS ALL DEFERRED")
.execute(&mut *tx)
.await
.map_err(TuskError::Database)?;
// TRUNCATE in reverse order (children first)
if params.truncate_before_restore {
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "truncating".to_string(),
percent: 15,
message: "Truncating existing data...".to_string(),
detail: None,
},
);
for table_data in snapshot.tables.iter().rev() {
let qualified = format!(
"{}.{}",
escape_ident(&table_data.schema),
escape_ident(&table_data.table)
);
let truncate_sql = format!("TRUNCATE {} CASCADE", qualified);
sqlx::query(&truncate_sql)
.execute(&mut *tx)
.await
.map_err(TuskError::Database)?;
}
}
// INSERT in forward order (parents first)
let total_tables = snapshot.tables.len();
let mut total_inserted: u64 = 0;
for (i, table_data) in snapshot.tables.iter().enumerate() {
if table_data.columns.is_empty() || table_data.rows.is_empty() {
continue;
}
let percent = (20 + (i * 75 / total_tables.max(1))).min(95) as u8;
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "inserting".to_string(),
percent,
message: format!("Restoring {}.{}...", table_data.schema, table_data.table),
detail: Some(format!("{} rows", table_data.rows.len())),
},
);
let qualified = format!(
"{}.{}",
escape_ident(&table_data.schema),
escape_ident(&table_data.table)
);
let col_list: Vec<String> = table_data.columns.iter().map(|c| escape_ident(c)).collect();
let placeholders: Vec<String> = (1..=table_data.columns.len())
.map(|i| format!("${}", i))
.collect();
let sql = format!(
"INSERT INTO {} ({}) VALUES ({})",
qualified,
col_list.join(", "),
placeholders.join(", ")
);
// Row-by-row insert inside the transaction (one bind set per INSERT)
for row in &table_data.rows {
let mut query = sqlx::query(&sql);
for val in row {
query = bind_json_value(query, val);
}
query.execute(&mut *tx).await.map_err(TuskError::Database)?;
total_inserted += 1;
}
}
tx.commit().await.map_err(TuskError::Database)?;
let _ = app.emit(
"snapshot-progress",
SnapshotProgress {
snapshot_id: snapshot_id.clone(),
stage: "done".to_string(),
percent: 100,
message: "Restore completed successfully".to_string(),
detail: Some(format!("{} rows restored", total_inserted)),
},
);
state.invalidate_schema_cache(&params.connection_id).await;
Ok(total_inserted)
}
#[tauri::command]
pub async fn list_snapshots(app: AppHandle) -> TuskResult<Vec<SnapshotMetadata>> {
let dir = app
.path()
.app_data_dir()
.map_err(|e| TuskError::Custom(e.to_string()))?
.join("snapshots");
if !dir.exists() {
return Ok(Vec::new());
}
let mut snapshots = Vec::new();
for entry in fs::read_dir(&dir)? {
let entry = entry?;
let path = entry.path();
if path.extension().map(|e| e == "json").unwrap_or(false) {
if let Ok(data) = fs::read_to_string(&path) {
if let Ok(snapshot) = serde_json::from_str::<Snapshot>(&data) {
let mut meta = snapshot.metadata;
meta.file_size_bytes = entry.metadata().map(|m| m.len()).unwrap_or(0);
snapshots.push(meta);
}
}
}
}
snapshots.sort_by(|a, b| b.created_at.cmp(&a.created_at));
Ok(snapshots)
}
#[tauri::command]
pub async fn read_snapshot_metadata(file_path: String) -> TuskResult<SnapshotMetadata> {
let data = fs::read_to_string(&file_path)?;
let snapshot: Snapshot = serde_json::from_str(&data)?;
let mut meta = snapshot.metadata;
meta.file_size_bytes = fs::metadata(&file_path).map(|m| m.len()).unwrap_or(0);
Ok(meta)
}
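The progress calculation above does the percentage math in `usize` and only casts to `u8` at the end. A standalone sketch (not part of the diff) of why that ordering matters for the 4-table overflow mentioned in the commit message:

```rust
fn main() {
    // In u8 arithmetic, 4 tables overflows: 4 * 80 = 320 > u8::MAX (255).
    assert_eq!(4u8.checked_mul(80), None);

    // The fixed code computes in usize first, clamps, then casts.
    let total_tables: usize = 4;
    let i: usize = 3; // last table
    let percent = (10 + (i * 80 / total_tables.max(1))).min(90) as u8;
    assert_eq!(percent, 70);
}
```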

View File

@@ -23,6 +23,9 @@ pub enum TuskError {
    #[error("AI error: {0}")]
    Ai(String),
+   #[error("Docker error: {0}")]
+   Docker(String),
    #[error("{0}")]
    Custom(String),
}

View File

@@ -5,6 +5,7 @@ mod models;
mod state;
mod utils;
+use models::settings::{AppSettings, DockerHost};
use state::AppState;
use std::sync::Arc;
use tauri::Manager;
@@ -12,24 +13,60 @@ use tauri::Manager;
pub fn run() {
    let shared_state = Arc::new(AppState::new());
-   tauri::Builder::default()
+   let _ = tauri::Builder::default()
        .plugin(tauri_plugin_shell::init())
        .plugin(tauri_plugin_dialog::init())
        .manage(shared_state)
        .setup(|app| {
            let state = app.state::<Arc<AppState>>().inner().clone();
-           let connections_path = app
+           let data_dir = app
                .path()
                .app_data_dir()
-               .expect("failed to resolve app data dir")
-               .join("connections.json");
-           tauri::async_runtime::spawn(async move {
-               if let Err(e) = mcp::start_mcp_server(state, connections_path, 9427).await {
-                   log::error!("MCP server error: {}", e);
-               }
-           });
+               .map_err(|e| Box::new(e) as Box<dyn std::error::Error>)?;
+           let connections_path = data_dir.join("connections.json");
+           // Read app settings
+           let settings_path = data_dir.join("app_settings.json");
+           let settings = if settings_path.exists() {
+               std::fs::read_to_string(&settings_path)
+                   .ok()
+                   .and_then(|data| serde_json::from_str::<AppSettings>(&data).ok())
+                   .unwrap_or_default()
+           } else {
+               AppSettings::default()
+           };
+           // Apply docker host from settings
+           let docker_host = match settings.docker.host {
+               DockerHost::Remote => settings.docker.remote_url.clone(),
+               DockerHost::Local => None,
+           };
+           let mcp_enabled = settings.mcp.enabled;
+           let mcp_port = settings.mcp.port;
+           // Set docker host synchronously (state is fresh, no contention)
+           let state_for_setup = state.clone();
+           tauri::async_runtime::block_on(async {
+               *state_for_setup.docker_host.write().await = docker_host;
+           });
+           if mcp_enabled {
+               let shutdown_rx = state.mcp_shutdown_tx.subscribe();
+               let mcp_state = state.clone();
+               tauri::async_runtime::spawn(async move {
+                   *mcp_state.mcp_running.write().await = true;
+                   if let Err(e) =
+                       mcp::start_mcp_server(mcp_state.clone(), connections_path, mcp_port, shutdown_rx)
+                           .await
+                   {
+                       log::error!("MCP server error: {}", e);
+                   }
+                   *mcp_state.mcp_running.write().await = false;
+               });
+           }
            Ok(())
        })
        .invoke_handler(tauri::generate_handler![
@@ -59,6 +96,8 @@ pub fn run() {
            commands::schema::get_table_indexes,
            commands::schema::get_completion_schema,
            commands::schema::get_column_details,
+           commands::schema::get_table_triggers,
+           commands::schema::get_schema_erd,
            // data
            commands::data::get_table_data,
            commands::data::update_row,
@@ -96,9 +135,34 @@ pub fn run() {
            commands::ai::generate_sql,
            commands::ai::explain_sql,
            commands::ai::fix_sql_error,
+           commands::ai::generate_validation_sql,
+           commands::ai::run_validation_rule,
+           commands::ai::suggest_validation_rules,
+           commands::ai::generate_test_data_preview,
+           commands::ai::insert_generated_data,
+           commands::ai::get_index_advisor_report,
+           commands::ai::apply_index_recommendation,
+           // snapshot
+           commands::snapshot::create_snapshot,
+           commands::snapshot::restore_snapshot,
+           commands::snapshot::list_snapshots,
+           commands::snapshot::read_snapshot_metadata,
            // lookup
            commands::lookup::entity_lookup,
+           // docker
+           commands::docker::check_docker,
+           commands::docker::list_tusk_containers,
+           commands::docker::clone_to_docker,
+           commands::docker::start_container,
+           commands::docker::stop_container,
+           commands::docker::remove_container,
+           // settings
+           commands::settings::get_app_settings,
+           commands::settings::save_app_settings,
+           commands::settings::get_mcp_status,
        ])
        .run(tauri::generate_context!())
-       .expect("error while running tauri application");
+       .inspect_err(|e| {
+           log::error!("Tauri application error: {}", e);
+       });
}

View File

@@ -13,6 +13,7 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::sync::Arc;
+use tokio::sync::watch;
// --- Tool parameter types ---
@@ -217,6 +218,7 @@ pub async fn start_mcp_server(
    state: Arc<AppState>,
    connections_path: PathBuf,
    port: u16,
+   mut shutdown_rx: watch::Receiver<bool>,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let service = StreamableHttpService::new(
        move || Ok(TuskMcpServer::new(state.clone(), connections_path.clone())),
@@ -230,7 +232,14 @@ pub async fn start_mcp_server(
    log::info!("MCP server listening on http://{}/mcp", addr);
-   axum::serve(listener, router).await?;
+   tokio::select! {
+       res = axum::serve(listener, router) => {
+           res?;
+       }
+       _ = shutdown_rx.changed() => {
+           log::info!("MCP server stopped by shutdown signal");
+       }
+   }
    Ok(())
}

View File

@@ -1,27 +1,42 @@
use serde::{Deserialize, Serialize};
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize, Default)]
+#[serde(rename_all = "lowercase")]
+pub enum AiProvider {
+   #[default]
+   Ollama,
+   OpenAi,
+   Anthropic,
+}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AiSettings {
+   pub provider: AiProvider,
    pub ollama_url: String,
+   pub openai_api_key: Option<String>,
+   pub anthropic_api_key: Option<String>,
    pub model: String,
}
impl Default for AiSettings {
    fn default() -> Self {
        Self {
+           provider: AiProvider::Ollama,
            ollama_url: "http://localhost:11434".to_string(),
+           openai_api_key: None,
+           anthropic_api_key: None,
            model: String::new(),
        }
    }
}
-#[derive(Debug, Serialize, Deserialize)]
+#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OllamaChatMessage {
    pub role: String,
    pub content: String,
}
-#[derive(Debug, Serialize)]
+#[derive(Debug, Clone, Serialize)]
pub struct OllamaChatRequest {
    pub model: String,
    pub messages: Vec<OllamaChatMessage>,
@@ -42,3 +57,137 @@ pub struct OllamaTagsResponse {
pub struct OllamaModel {
    pub name: String,
}
// --- Wave 1: Validation ---
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum ValidationStatus {
Pending,
Generating,
Running,
Passed,
Failed,
Error,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ValidationRule {
pub id: String,
pub description: String,
pub generated_sql: String,
pub status: ValidationStatus,
pub violation_count: u64,
pub sample_violations: Vec<Vec<serde_json::Value>>,
pub violation_columns: Vec<String>,
pub error: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ValidationReport {
pub rules: Vec<ValidationRule>,
pub total_rules: usize,
pub passed: usize,
pub failed: usize,
pub errors: usize,
pub execution_time_ms: u128,
}
// --- Wave 2: Data Generator ---
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GenerateDataParams {
pub connection_id: String,
pub schema: String,
pub table: String,
pub row_count: u32,
pub include_related: bool,
pub custom_instructions: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GeneratedDataPreview {
pub tables: Vec<GeneratedTableData>,
pub insert_order: Vec<String>,
pub total_rows: u32,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GeneratedTableData {
pub schema: String,
pub table: String,
pub columns: Vec<String>,
pub rows: Vec<Vec<serde_json::Value>>,
pub row_count: u32,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DataGenProgress {
pub gen_id: String,
pub stage: String,
pub percent: u8,
pub message: String,
pub detail: Option<String>,
}
// --- Wave 3A: Index Advisor ---
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TableStats {
pub schema: String,
pub table: String,
pub seq_scan: i64,
pub idx_scan: i64,
pub n_live_tup: i64,
pub table_size: String,
pub index_size: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct IndexStats {
pub schema: String,
pub table: String,
pub index_name: String,
pub idx_scan: i64,
pub index_size: String,
pub definition: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SlowQuery {
pub query: String,
pub calls: i64,
pub total_time_ms: f64,
pub mean_time_ms: f64,
pub rows: i64,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum IndexRecommendationType {
CreateIndex,
DropIndex,
ReplaceIndex,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct IndexRecommendation {
pub id: String,
pub recommendation_type: IndexRecommendationType,
pub table_schema: String,
pub table_name: String,
pub index_name: Option<String>,
pub ddl: String,
pub rationale: String,
pub estimated_impact: String,
pub priority: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct IndexAdvisorReport {
pub table_stats: Vec<TableStats>,
pub index_stats: Vec<IndexStats>,
pub slow_queries: Vec<SlowQuery>,
pub recommendations: Vec<IndexRecommendation>,
pub has_pg_stat_statements: bool,
}

View File

@@ -16,6 +16,10 @@ pub struct ConnectionConfig {
impl ConnectionConfig {
    pub fn connection_url(&self) -> String {
+       self.connection_url_for_db(&self.database)
+   }
+   pub fn connection_url_for_db(&self, database: &str) -> String {
        let ssl = self.ssl_mode.as_deref().unwrap_or("prefer");
        format!(
            "postgres://{}:{}@{}:{}/{}?sslmode={}",
@@ -23,7 +27,7 @@ impl ConnectionConfig {
            urlencoded(&self.password),
            self.host,
            self.port,
-           urlencoded(&self.database),
+           urlencoded(database),
            ssl
        )
    }

View File

@@ -0,0 +1,57 @@
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DockerStatus {
pub installed: bool,
pub daemon_running: bool,
pub version: Option<String>,
pub error: Option<String>,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum CloneMode {
SchemaOnly,
FullClone,
SampleData,
}
#[derive(Debug, Clone, Deserialize)]
pub struct CloneToDockerParams {
pub source_connection_id: String,
pub source_database: String,
pub container_name: String,
pub pg_version: String,
pub host_port: Option<u16>,
pub clone_mode: CloneMode,
pub sample_rows: Option<u32>,
pub postgres_password: Option<String>,
}
#[derive(Debug, Clone, Serialize)]
pub struct CloneProgress {
pub clone_id: String,
pub stage: String,
pub percent: u8,
pub message: String,
pub detail: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TuskContainer {
pub container_id: String,
pub name: String,
pub status: String,
pub host_port: u16,
pub pg_version: String,
pub source_database: Option<String>,
pub source_connection: Option<String>,
pub created_at: Option<String>,
}
#[derive(Debug, Clone, Serialize)]
pub struct CloneResult {
pub container: TuskContainer,
pub connection_id: String,
pub connection_url: String,
}

View File

@@ -1,8 +1,11 @@
pub mod ai;
pub mod connection;
+pub mod docker;
pub mod history;
pub mod lookup;
pub mod management;
pub mod query_result;
pub mod saved_queries;
pub mod schema;
+pub mod settings;
+pub mod snapshot;

View File

@@ -20,4 +20,5 @@ pub struct PaginatedQueryResult {
    pub total_rows: i64,
    pub page: u32,
    pub page_size: u32,
+   pub ctids: Vec<String>,
}

View File

@@ -18,6 +18,7 @@ pub struct ColumnInfo {
    pub ordinal_position: i32,
    pub character_maximum_length: Option<i32>,
    pub is_primary_key: bool,
+   pub comment: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
@@ -34,6 +35,11 @@ pub struct ConstraintInfo {
    pub name: String,
    pub constraint_type: String,
    pub columns: Vec<String>,
+   pub referenced_schema: Option<String>,
+   pub referenced_table: Option<String>,
+   pub referenced_columns: Option<Vec<String>>,
+   pub update_rule: Option<String>,
+   pub delete_rule: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
@@ -43,3 +49,48 @@ pub struct IndexInfo {
    pub is_unique: bool,
    pub is_primary: bool,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TriggerInfo {
pub name: String,
pub event: String,
pub timing: String,
pub orientation: String,
pub function_name: String,
pub is_enabled: bool,
pub definition: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ErdColumn {
pub name: String,
pub data_type: String,
pub is_nullable: bool,
pub is_primary_key: bool,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ErdTable {
pub schema: String,
pub name: String,
pub columns: Vec<ErdColumn>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ErdRelationship {
pub constraint_name: String,
pub source_schema: String,
pub source_table: String,
pub source_columns: Vec<String>,
pub target_schema: String,
pub target_table: String,
pub target_columns: Vec<String>,
pub update_rule: String,
pub delete_rule: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ErdData {
pub tables: Vec<ErdTable>,
pub relationships: Vec<ErdRelationship>,
}

View File

@@ -0,0 +1,60 @@
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AppSettings {
pub mcp: McpSettings,
pub docker: DockerSettings,
}
impl Default for AppSettings {
fn default() -> Self {
Self {
mcp: McpSettings::default(),
docker: DockerSettings::default(),
}
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct McpSettings {
pub enabled: bool,
pub port: u16,
}
impl Default for McpSettings {
fn default() -> Self {
Self {
enabled: true,
port: 9427,
}
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DockerSettings {
pub host: DockerHost,
pub remote_url: Option<String>,
}
impl Default for DockerSettings {
fn default() -> Self {
Self {
host: DockerHost::Local,
remote_url: None,
}
}
}
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
pub enum DockerHost {
Local,
Remote,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct McpStatus {
pub enabled: bool,
pub port: u16,
pub running: bool,
}
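With the defaults above, the settings file the setup code reads from the app data directory would serialize (via serde's default field naming, with `DockerHost` lowercased by `rename_all`) roughly as:

```json
{
  "mcp": { "enabled": true, "port": 9427 },
  "docker": { "host": "local", "remote_url": null }
}
```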

View File

@@ -0,0 +1,68 @@
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SnapshotMetadata {
pub id: String,
pub name: String,
pub created_at: String,
pub connection_name: String,
pub database: String,
pub tables: Vec<SnapshotTableMeta>,
pub total_rows: u64,
pub file_size_bytes: u64,
pub version: u32,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SnapshotTableMeta {
pub schema: String,
pub table: String,
pub row_count: u64,
pub columns: Vec<String>,
pub column_types: Vec<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Snapshot {
pub metadata: SnapshotMetadata,
pub tables: Vec<SnapshotTableData>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SnapshotTableData {
pub schema: String,
pub table: String,
pub columns: Vec<String>,
pub column_types: Vec<String>,
pub rows: Vec<Vec<serde_json::Value>>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SnapshotProgress {
pub snapshot_id: String,
pub stage: String,
pub percent: u8,
pub message: String,
pub detail: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CreateSnapshotParams {
pub connection_id: String,
pub tables: Vec<TableRef>,
pub name: String,
pub include_dependencies: bool,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TableRef {
pub schema: String,
pub table: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RestoreSnapshotParams {
pub connection_id: String,
pub file_path: String,
pub truncate_before_restore: bool,
}

View File

@@ -1,8 +1,11 @@
+use crate::error::{TuskError, TuskResult};
+use crate::models::ai::AiSettings;
use serde::{Deserialize, Serialize};
use sqlx::PgPool;
use std::collections::HashMap;
use std::path::PathBuf;
-use tokio::sync::RwLock;
+use std::time::{Duration, Instant};
+use tokio::sync::{watch, RwLock};
#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
@@ -11,23 +14,50 @@ pub enum DbFlavor {
    Greenplum,
}
+#[derive(Clone)]
+pub struct SchemaCacheEntry {
+   pub schema_text: String,
+   pub cached_at: Instant,
+}
pub struct AppState {
    pub pools: RwLock<HashMap<String, PgPool>>,
    pub config_path: RwLock<Option<PathBuf>>,
    pub read_only: RwLock<HashMap<String, bool>>,
    pub db_flavors: RwLock<HashMap<String, DbFlavor>>,
+   pub schema_cache: RwLock<HashMap<String, SchemaCacheEntry>>,
+   pub mcp_shutdown_tx: watch::Sender<bool>,
+   pub mcp_running: RwLock<bool>,
+   pub docker_host: RwLock<Option<String>>,
+   pub ai_settings: RwLock<Option<AiSettings>>,
}
+const SCHEMA_CACHE_TTL: Duration = Duration::from_secs(300); // 5 minutes
impl AppState {
    pub fn new() -> Self {
+       let (mcp_shutdown_tx, _) = watch::channel(false);
        Self {
            pools: RwLock::new(HashMap::new()),
            config_path: RwLock::new(None),
            read_only: RwLock::new(HashMap::new()),
            db_flavors: RwLock::new(HashMap::new()),
+           schema_cache: RwLock::new(HashMap::new()),
+           mcp_shutdown_tx,
+           mcp_running: RwLock::new(false),
+           docker_host: RwLock::new(None),
+           ai_settings: RwLock::new(None),
        }
    }
+   pub async fn get_pool(&self, connection_id: &str) -> TuskResult<PgPool> {
+       let pools = self.pools.read().await;
+       pools
+           .get(connection_id)
+           .cloned()
+           .ok_or_else(|| TuskError::NotConnected(connection_id.to_string()))
+   }
    pub async fn is_read_only(&self, id: &str) -> bool {
        let map = self.read_only.read().await;
        map.get(id).copied().unwrap_or(true)
@@ -37,4 +67,31 @@ impl AppState {
        let map = self.db_flavors.read().await;
        map.get(id).copied().unwrap_or(DbFlavor::PostgreSQL)
    }
+   pub async fn get_schema_cache(&self, connection_id: &str) -> Option<String> {
+       let cache = self.schema_cache.read().await;
+       cache.get(connection_id).and_then(|entry| {
+           if entry.cached_at.elapsed() < SCHEMA_CACHE_TTL {
+               Some(entry.schema_text.clone())
+           } else {
+               None
+           }
+       })
+   }
+   pub async fn set_schema_cache(&self, connection_id: String, schema_text: String) {
+       let mut cache = self.schema_cache.write().await;
+       cache.insert(
+           connection_id,
+           SchemaCacheEntry {
+               schema_text,
+               cached_at: Instant::now(),
+           },
+       );
+   }
+   pub async fn invalidate_schema_cache(&self, connection_id: &str) {
+       let mut cache = self.schema_cache.write().await;
+       cache.remove(connection_id);
+   }
}
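The 5-minute TTL check in `get_schema_cache` can be exercised in isolation. A minimal sketch (the `is_fresh` helper is hypothetical, mirroring the `cached_at.elapsed() < SCHEMA_CACHE_TTL` test above):

```rust
use std::time::Duration;

// Mirrors the freshness test in get_schema_cache: an entry is served only
// while its age (cached_at.elapsed()) is below the 5-minute TTL.
const SCHEMA_CACHE_TTL: Duration = Duration::from_secs(300);

fn is_fresh(age: Duration) -> bool {
    age < SCHEMA_CACHE_TTL
}

fn main() {
    assert!(is_fresh(Duration::from_secs(0)));    // brand-new entry
    assert!(is_fresh(Duration::from_secs(299)));  // just under the TTL
    assert!(!is_fresh(Duration::from_secs(300))); // exactly at the TTL is stale
    println!("ok");
}
```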

View File

@@ -1,3 +1,75 @@
+use std::collections::{HashMap, HashSet};
pub fn escape_ident(name: &str) -> String {
    format!("\"{}\"", name.replace('"', "\"\""))
}
/// Topological sort of tables based on foreign key dependencies.
/// Returns tables in insertion order: parents before children.
pub fn topological_sort_tables(
fk_edges: &[(String, String, String, String)], // (schema, table, ref_schema, ref_table)
target_tables: &[(String, String)],
) -> Vec<(String, String)> {
let mut graph: HashMap<(String, String), HashSet<(String, String)>> = HashMap::new();
let mut in_degree: HashMap<(String, String), usize> = HashMap::new();
// Initialize all target tables
for t in target_tables {
graph.entry(t.clone()).or_default();
in_degree.entry(t.clone()).or_insert(0);
}
let target_set: HashSet<(String, String)> = target_tables.iter().cloned().collect();
// Build edges: parent -> child (child depends on parent)
for (schema, table, ref_schema, ref_table) in fk_edges {
let child = (schema.clone(), table.clone());
let parent = (ref_schema.clone(), ref_table.clone());
if child == parent {
continue; // self-referencing
}
if !target_set.contains(&child) || !target_set.contains(&parent) {
continue;
}
if graph.entry(parent.clone()).or_default().insert(child.clone()) {
*in_degree.entry(child).or_insert(0) += 1;
}
}
// Kahn's algorithm
let mut queue: Vec<(String, String)> = in_degree
.iter()
.filter(|(_, &deg)| deg == 0)
.map(|(k, _)| k.clone())
.collect();
queue.sort(); // deterministic order
let mut result = Vec::new();
while let Some(node) = queue.pop() {
result.push(node.clone());
if let Some(neighbors) = graph.get(&node) {
for neighbor in neighbors {
if let Some(deg) = in_degree.get_mut(neighbor) {
*deg -= 1;
if *deg == 0 {
queue.push(neighbor.clone());
queue.sort();
}
}
}
}
}
// Add any remaining tables (cycles) at the end
for t in target_tables {
if !result.contains(t) {
result.push(t.clone());
}
}
result
}
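A condensed, standalone sketch of the Kahn's-algorithm ordering used above. The `topo` helper here is a hypothetical mini-version of `topological_sort_tables` (string keys instead of `(schema, table)` pairs, cycle handling omitted), showing that parents sort before children:

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical mini-version of topological_sort_tables: each edge is
// (child, parent), i.e. the child table has a FK referencing the parent.
fn topo(edges: &[(&str, &str)], nodes: &[&str]) -> Vec<String> {
    let mut graph: HashMap<&str, HashSet<&str>> = HashMap::new();
    let mut in_degree: HashMap<&str, usize> = HashMap::new();
    for &n in nodes {
        graph.entry(n).or_default();
        in_degree.entry(n).or_insert(0);
    }
    for &(child, parent) in edges {
        // parent -> child: the child cannot be inserted before the parent
        if graph.entry(parent).or_default().insert(child) {
            *in_degree.entry(child).or_insert(0) += 1;
        }
    }
    let mut queue: Vec<&str> = in_degree
        .iter()
        .filter(|(_, &d)| d == 0)
        .map(|(&k, _)| k)
        .collect();
    queue.sort(); // deterministic order, as in the original
    let mut result = Vec::new();
    while let Some(n) = queue.pop() {
        result.push(n.to_string());
        if let Some(children) = graph.get(n) {
            for &c in children {
                let d = in_degree.get_mut(c).unwrap();
                *d -= 1;
                if *d == 0 {
                    queue.push(c);
                    queue.sort();
                }
            }
        }
    }
    result
}

fn main() {
    // order_items references orders and products; orders references users.
    let edges = [
        ("orders", "users"),
        ("order_items", "orders"),
        ("order_items", "products"),
    ];
    let nodes = ["order_items", "orders", "products", "users"];
    let sorted = topo(&edges, &nodes);
    let pos = |t: &str| sorted.iter().position(|x| x.as_str() == t).unwrap();
    assert!(pos("users") < pos("orders"));
    assert!(pos("orders") < pos("order_items"));
    assert!(pos("products") < pos("order_items"));
    println!("{:?}", sorted);
}
```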

View File

@@ -2,7 +2,7 @@
"$schema": "https://schema.tauri.app/config/2",
"productName": "Tusk",
"version": "0.1.0",
-"identifier": "com.tusk.app",
+"identifier": "com.tusk.dbm",
"build": {
    "frontendDist": "../dist",
    "devUrl": "http://localhost:5173",
@@ -27,7 +27,7 @@
},
"bundle": {
    "active": true,
-   "targets": "all",
+   "targets": ["deb", "rpm", "appimage", "dmg", "nsis"],
    "icon": [
        "icons/32x32.png",
        "icons/128x128.png",

View File

@@ -0,0 +1,82 @@
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import { useOllamaModels } from "@/hooks/use-ai";
import { RefreshCw, Loader2 } from "lucide-react";
interface Props {
ollamaUrl: string;
onOllamaUrlChange: (url: string) => void;
model: string;
onModelChange: (model: string) => void;
}
export function AiSettingsFields({
ollamaUrl,
onOllamaUrlChange,
model,
onModelChange,
}: Props) {
const {
data: models,
isLoading: modelsLoading,
isError: modelsError,
refetch: refetchModels,
} = useOllamaModels(ollamaUrl);
return (
<>
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Ollama URL</label>
<Input
value={ollamaUrl}
onChange={(e) => onOllamaUrlChange(e.target.value)}
placeholder="http://localhost:11434"
className="h-8 text-xs"
/>
</div>
<div className="flex flex-col gap-1.5">
<div className="flex items-center justify-between">
<label className="text-xs text-muted-foreground">Model</label>
<Button
size="sm"
variant="ghost"
className="h-5 w-5 p-0"
onClick={() => refetchModels()}
disabled={modelsLoading}
title="Refresh models"
>
{modelsLoading ? (
<Loader2 className="h-3 w-3 animate-spin" />
) : (
<RefreshCw className="h-3 w-3" />
)}
</Button>
</div>
{modelsError ? (
<p className="text-xs text-destructive">Cannot connect to Ollama</p>
) : (
<Select value={model} onValueChange={onModelChange}>
<SelectTrigger className="h-8 w-full text-xs">
<SelectValue placeholder="Select a model" />
</SelectTrigger>
<SelectContent>
{models?.map((m) => (
<SelectItem key={m.name} value={m.name}>
{m.name}
</SelectItem>
))}
</SelectContent>
</Select>
)}
</div>
</>
);
}

View File

@@ -5,17 +5,10 @@ import {
PopoverTrigger, PopoverTrigger,
} from "@/components/ui/popover"; } from "@/components/ui/popover";
import { Button } from "@/components/ui/button"; import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input"; import { useAiSettings, useSaveAiSettings } from "@/hooks/use-ai";
import { import { Settings } from "lucide-react";
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import { useAiSettings, useSaveAiSettings, useOllamaModels } from "@/hooks/use-ai";
import { Settings, RefreshCw, Loader2 } from "lucide-react";
import { toast } from "sonner"; import { toast } from "sonner";
import { AiSettingsFields } from "./AiSettingsFields";
export function AiSettingsPopover() { export function AiSettingsPopover() {
const { data: settings } = useAiSettings(); const { data: settings } = useAiSettings();
@@ -27,16 +20,9 @@ export function AiSettingsPopover() {
   const currentUrl = url ?? settings?.ollama_url ?? "http://localhost:11434";
   const currentModel = model ?? settings?.model ?? "";

-  const {
-    data: models,
-    isLoading: modelsLoading,
-    isError: modelsError,
-    refetch: refetchModels,
-  } = useOllamaModels(currentUrl);

   const handleSave = () => {
     saveMutation.mutate(
-      { ollama_url: currentUrl, model: currentModel },
+      { provider: "ollama", ollama_url: currentUrl, model: currentModel },
       {
         onSuccess: () => toast.success("AI settings saved"),
         onError: (err) =>
@@ -63,53 +49,12 @@ export function AiSettingsPopover() {
         <div className="flex flex-col gap-3">
           <h4 className="text-sm font-medium">Ollama Settings</h4>
-          <div className="flex flex-col gap-1.5">
-            <label className="text-xs text-muted-foreground">Ollama URL</label>
-            <Input
-              value={currentUrl}
-              onChange={(e) => setUrl(e.target.value)}
-              placeholder="http://localhost:11434"
-              className="h-8 text-xs"
-            />
-          </div>
-          <div className="flex flex-col gap-1.5">
-            <div className="flex items-center justify-between">
-              <label className="text-xs text-muted-foreground">Model</label>
-              <Button
-                size="sm"
-                variant="ghost"
-                className="h-5 w-5 p-0"
-                onClick={() => refetchModels()}
-                disabled={modelsLoading}
-                title="Refresh models"
-              >
-                {modelsLoading ? (
-                  <Loader2 className="h-3 w-3 animate-spin" />
-                ) : (
-                  <RefreshCw className="h-3 w-3" />
-                )}
-              </Button>
-            </div>
-            {modelsError ? (
-              <p className="text-xs text-destructive">
-                Cannot connect to Ollama
-              </p>
-            ) : (
-              <Select value={currentModel} onValueChange={setModel}>
-                <SelectTrigger className="h-8 w-full text-xs">
-                  <SelectValue placeholder="Select a model" />
-                </SelectTrigger>
-                <SelectContent>
-                  {models?.map((m) => (
-                    <SelectItem key={m.name} value={m.name}>
-                      {m.name}
-                    </SelectItem>
-                  ))}
-                </SelectContent>
-              </Select>
-            )}
-          </div>
+          <AiSettingsFields
+            ollamaUrl={currentUrl}
+            onOllamaUrlChange={setUrl}
+            model={currentModel}
+            onModelChange={setModel}
+          />
           <Button size="sm" className="h-7 text-xs" onClick={handleSave}>
             Save


@@ -0,0 +1,295 @@
import { useState, useEffect } from "react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogFooter,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Badge } from "@/components/ui/badge";
import { useDataGenerator } from "@/hooks/use-data-generator";
import { toast } from "sonner";
import {
Loader2,
CheckCircle2,
XCircle,
Wand2,
Table2,
} from "lucide-react";
interface Props {
open: boolean;
onOpenChange: (open: boolean) => void;
connectionId: string;
schema: string;
table: string;
}
type Step = "config" | "preview" | "done";
export function GenerateDataDialog({
open,
onOpenChange,
connectionId,
schema,
table,
}: Props) {
const [step, setStep] = useState<Step>("config");
const [rowCount, setRowCount] = useState(10);
const [includeRelated, setIncludeRelated] = useState(true);
const [customInstructions, setCustomInstructions] = useState("");
const {
generatePreview,
preview,
isGenerating,
generateError,
insertData,
insertedRows,
isInserting,
insertError,
progress,
reset,
} = useDataGenerator();
useEffect(() => {
if (open) {
setStep("config");
setRowCount(10);
setIncludeRelated(true);
setCustomInstructions("");
reset();
}
}, [open, reset]);
const handleGenerate = () => {
const genId = crypto.randomUUID();
generatePreview(
{
params: {
connection_id: connectionId,
schema,
table,
row_count: rowCount,
include_related: includeRelated,
custom_instructions: customInstructions || undefined,
},
genId,
},
{
onSuccess: () => setStep("preview"),
onError: (err) => toast.error("Generation failed", { description: String(err) }),
}
);
};
const handleInsert = () => {
if (!preview) return;
insertData(
{ connectionId, preview },
{
onSuccess: (rows) => {
setStep("done");
toast.success(`Inserted ${rows} rows`);
},
onError: (err) => toast.error("Insert failed", { description: String(err) }),
}
);
};
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-[600px] max-h-[80vh] overflow-y-auto">
<DialogHeader>
<DialogTitle className="flex items-center gap-2">
<Wand2 className="h-5 w-5" />
Generate Test Data
</DialogTitle>
</DialogHeader>
{step === "config" && (
<>
<div className="grid gap-3 py-2">
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Table</label>
<div className="col-span-3">
<Badge variant="secondary">{schema}.{table}</Badge>
</div>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Row Count</label>
<Input
className="col-span-3"
type="number"
value={rowCount}
onChange={(e) => setRowCount(Math.min(1000, Math.max(1, parseInt(e.target.value) || 1)))}
min={1}
max={1000}
/>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Related Tables</label>
<div className="col-span-3 flex items-center gap-2">
<input
type="checkbox"
checked={includeRelated}
onChange={(e) => setIncludeRelated(e.target.checked)}
className="rounded"
/>
<span className="text-sm text-muted-foreground">
Include parent tables (via foreign keys)
</span>
</div>
</div>
<div className="grid grid-cols-4 items-start gap-3">
<label className="text-right text-sm text-muted-foreground pt-2">Instructions</label>
<Input
className="col-span-3"
placeholder="Optional: specific data requirements..."
value={customInstructions}
onChange={(e) => setCustomInstructions(e.target.value)}
/>
</div>
</div>
{isGenerating && progress && (
<div className="space-y-2">
<div className="flex items-center justify-between text-sm">
<span>{progress.message}</span>
<span className="text-muted-foreground">{progress.percent}%</span>
</div>
<div className="h-2 rounded-full bg-secondary overflow-hidden">
<div
className="h-full rounded-full bg-primary transition-all duration-300"
style={{ width: `${progress.percent}%` }}
/>
</div>
</div>
)}
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Cancel</Button>
<Button onClick={handleGenerate} disabled={isGenerating}>
{isGenerating ? (
<><Loader2 className="h-4 w-4 animate-spin mr-1" />Generating...</>
) : (
"Generate Preview"
)}
</Button>
</DialogFooter>
</>
)}
{step === "preview" && preview && (
<>
<div className="space-y-3">
<div className="flex items-center gap-2 text-sm">
<span className="text-muted-foreground">Preview:</span>
<Badge variant="secondary">{preview.total_rows} rows across {preview.tables.length} tables</Badge>
</div>
{preview.tables.map((tbl) => (
<div key={`${tbl.schema}.${tbl.table}`} className="rounded-md border">
<div className="flex items-center gap-2 px-3 py-2 bg-muted/50 text-sm font-medium border-b">
<Table2 className="h-3.5 w-3.5" />
{tbl.schema}.{tbl.table}
<Badge variant="secondary" className="ml-auto text-[10px]">{tbl.row_count} rows</Badge>
</div>
<div className="overflow-x-auto max-h-48">
<table className="w-full text-xs">
<thead>
<tr className="border-b">
{tbl.columns.map((col) => (
<th key={col} className="px-2 py-1 text-left font-medium text-muted-foreground whitespace-nowrap">
{col}
</th>
))}
</tr>
</thead>
<tbody>
{tbl.rows.slice(0, 5).map((row, i) => (
<tr key={i} className="border-b last:border-0">
{(row as unknown[]).map((val, j) => (
<td key={j} className="px-2 py-1 font-mono whitespace-nowrap">
{val === null ? (
<span className="text-muted-foreground">NULL</span>
) : (
String(val).substring(0, 50)
)}
</td>
))}
</tr>
))}
{tbl.rows.length > 5 && (
<tr>
<td colSpan={tbl.columns.length} className="px-2 py-1 text-center text-muted-foreground">
...and {tbl.rows.length - 5} more rows
</td>
</tr>
)}
</tbody>
</table>
</div>
</div>
))}
</div>
<DialogFooter>
<Button variant="outline" onClick={() => setStep("config")}>Back</Button>
<Button onClick={handleInsert} disabled={isInserting}>
{isInserting ? (
<><Loader2 className="h-4 w-4 animate-spin mr-1" />Inserting...</>
) : (
`Insert ${preview.total_rows} Rows`
)}
</Button>
</DialogFooter>
</>
)}
{step === "done" && (
<div className="py-4 space-y-4">
{insertError ? (
<div className="flex items-start gap-3 rounded-md border border-destructive/50 bg-destructive/10 p-4">
<XCircle className="h-5 w-5 text-destructive shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium text-destructive">Insert Failed</p>
<p className="text-xs text-muted-foreground">{insertError}</p>
</div>
</div>
) : (
<div className="flex items-start gap-3 rounded-md border border-green-500/50 bg-green-500/10 p-4">
<CheckCircle2 className="h-5 w-5 text-green-500 shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium">Data Generated Successfully</p>
<p className="text-xs text-muted-foreground">
{insertedRows} rows inserted across {preview?.tables.length ?? 0} tables.
</p>
</div>
</div>
)}
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Close</Button>
{insertError && (
<Button onClick={() => setStep("preview")}>Retry</Button>
)}
</DialogFooter>
</div>
)}
{generateError && step === "config" && (
<div className="flex items-start gap-3 rounded-md border border-destructive/50 bg-destructive/10 p-4">
<XCircle className="h-5 w-5 text-destructive shrink-0 mt-0.5" />
<p className="text-xs text-muted-foreground">{generateError}</p>
</div>
)}
</DialogContent>
</Dialog>
);
}
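The Row Count input above clamps whatever the user types into the range [1, 1000], falling back to 1 when the field is empty or non-numeric. That inline expression can be sketched as a standalone helper (the name `clampRowCount` is illustrative, not part of the component):

```typescript
// Mirrors the Row Count input's onChange logic: parse the raw string,
// fall back to 1 when parseInt yields NaN, then clamp to [1, 1000].
function clampRowCount(raw: string): number {
  return Math.min(1000, Math.max(1, parseInt(raw, 10) || 1));
}
```

Because `NaN || 1` and `0 || 1` both evaluate to 1, clearing the field or typing letters resets the count to the minimum rather than propagating `NaN` into state.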


@@ -0,0 +1,494 @@
import { useState, useEffect, useRef } from "react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogFooter,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import { Badge } from "@/components/ui/badge";
import { useDockerStatus, useCloneToDocker } from "@/hooks/use-docker";
import { toast } from "sonner";
import {
Loader2,
CheckCircle2,
XCircle,
Container,
Copy,
ChevronDown,
ChevronRight,
} from "lucide-react";
import type { CloneMode, CloneProgress } from "@/types";
interface Props {
open: boolean;
onOpenChange: (open: boolean) => void;
connectionId: string;
database: string;
onConnect?: (connectionId: string) => void;
}
type Step = "config" | "progress" | "done";
function ProcessLog({
entries,
open: logOpen,
onToggle,
endRef,
}: {
entries: CloneProgress[];
open: boolean;
onToggle: () => void;
endRef: React.RefObject<HTMLDivElement | null>;
}) {
if (entries.length === 0) return null;
return (
<div>
<button
type="button"
className="flex items-center gap-1 text-xs text-muted-foreground hover:text-foreground transition-colors"
onClick={onToggle}
>
{logOpen ? (
<ChevronDown className="h-3 w-3" />
) : (
<ChevronRight className="h-3 w-3" />
)}
Process Log ({entries.length})
</button>
{logOpen && (
<div className="mt-1.5 rounded-md bg-muted p-3 text-xs font-mono max-h-40 overflow-auto">
{entries.map((entry, i) => (
<div key={i} className="leading-5 min-w-0">
<span className="text-muted-foreground">
{entry.percent}%
</span>{" "}
<span>{entry.message}</span>
{entry.detail && (
<div className="text-muted-foreground break-all pl-6">
{entry.detail}
</div>
)}
</div>
))}
<div ref={endRef} />
</div>
)}
</div>
);
}
export function CloneDatabaseDialog({
open,
onOpenChange,
connectionId,
database,
onConnect,
}: Props) {
const [step, setStep] = useState<Step>("config");
const [containerName, setContainerName] = useState("");
const [pgVersion, setPgVersion] = useState("16");
const [portMode, setPortMode] = useState<"auto" | "manual">("auto");
const [manualPort, setManualPort] = useState(5433);
const [cloneMode, setCloneMode] = useState<CloneMode>("schema_only");
const [sampleRows, setSampleRows] = useState(1000);
const [logEntries, setLogEntries] = useState<CloneProgress[]>([]);
const [logOpen, setLogOpen] = useState(false);
const logEndRef = useRef<HTMLDivElement>(null);
const { data: dockerStatus } = useDockerStatus();
const { clone, result, error, isCloning, progress, reset } =
useCloneToDocker();
// Reset state when dialog opens
useEffect(() => {
if (open) {
setStep("config");
setContainerName(
`tusk-${database.replace(/[^a-zA-Z0-9_-]/g, "-")}-${Date.now().toString(36)}`
);
setPgVersion("16");
setPortMode("auto");
setManualPort(5433);
setCloneMode("schema_only");
setSampleRows(1000);
setLogEntries([]);
setLogOpen(false);
reset();
}
}, [open, database, reset]);
// Accumulate progress events into log
useEffect(() => {
if (progress) {
setLogEntries((prev) => {
const last = prev[prev.length - 1];
if (last && last.stage === progress.stage && last.message === progress.message) {
return prev;
}
return [...prev, progress];
});
if (progress.stage === "done" || progress.stage === "error") {
setStep("done");
}
}
}, [progress]);
// Auto-scroll log to bottom
useEffect(() => {
if (logOpen && logEndRef.current) {
logEndRef.current.scrollIntoView({ behavior: "smooth" });
}
}, [logEntries, logOpen]);
const handleClone = () => {
if (!containerName.trim()) {
toast.error("Container name is required");
return;
}
setStep("progress");
const cloneId = crypto.randomUUID();
clone({
params: {
source_connection_id: connectionId,
source_database: database,
container_name: containerName.trim(),
pg_version: pgVersion,
host_port: portMode === "manual" ? manualPort : null,
clone_mode: cloneMode,
sample_rows: cloneMode === "sample_data" ? sampleRows : null,
postgres_password: null,
},
cloneId,
});
};
const handleConnect = () => {
if (result?.connection_id && onConnect) {
onConnect(result.connection_id);
}
onOpenChange(false);
};
const dockerReady =
dockerStatus?.installed && dockerStatus?.daemon_running;
const logSection = (
<ProcessLog
entries={logEntries}
open={logOpen}
onToggle={() => setLogOpen(!logOpen)}
endRef={logEndRef}
/>
);
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-[520px]">
<DialogHeader>
<DialogTitle className="flex items-center gap-2">
<Container className="h-5 w-5" />
Clone to Docker
</DialogTitle>
</DialogHeader>
{step === "config" && (
<>
<div className="flex items-center gap-2 rounded-md border px-3 py-2 text-sm">
{dockerStatus === undefined ? (
<>
<Loader2 className="h-4 w-4 animate-spin text-muted-foreground" />
<span className="text-muted-foreground">
Checking Docker...
</span>
</>
) : dockerReady ? (
<>
<CheckCircle2 className="h-4 w-4 text-green-500" />
<span>Docker {dockerStatus.version}</span>
</>
) : (
<>
<XCircle className="h-4 w-4 text-destructive" />
<span className="text-destructive">
{dockerStatus?.error || "Docker not available"}
</span>
</>
)}
</div>
<div className="grid gap-3 py-2">
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
Database
</label>
<div className="col-span-3">
<Badge variant="secondary">{database}</Badge>
</div>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
Container
</label>
<Input
className="col-span-3"
value={containerName}
onChange={(e) => setContainerName(e.target.value)}
placeholder="tusk-mydb-clone"
/>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
PG Version
</label>
<Select value={pgVersion} onValueChange={setPgVersion}>
<SelectTrigger className="col-span-3">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="17">PostgreSQL 17</SelectItem>
<SelectItem value="16">PostgreSQL 16</SelectItem>
<SelectItem value="15">PostgreSQL 15</SelectItem>
<SelectItem value="14">PostgreSQL 14</SelectItem>
</SelectContent>
</Select>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
Port
</label>
<div className="col-span-3 flex items-center gap-2">
<Select
value={portMode}
onValueChange={(v) =>
setPortMode(v as "auto" | "manual")
}
>
<SelectTrigger className="w-24">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="auto">Auto</SelectItem>
<SelectItem value="manual">Manual</SelectItem>
</SelectContent>
</Select>
{portMode === "manual" && (
<Input
type="number"
className="flex-1"
value={manualPort}
onChange={(e) =>
setManualPort(parseInt(e.target.value) || 5433)
}
min={1024}
max={65535}
/>
)}
</div>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
Clone Mode
</label>
<Select
value={cloneMode}
onValueChange={(v) => setCloneMode(v as CloneMode)}
>
<SelectTrigger className="col-span-3">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="schema_only">
Schema Only
</SelectItem>
<SelectItem value="full_clone">Full Clone</SelectItem>
<SelectItem value="sample_data">
Sample Data
</SelectItem>
</SelectContent>
</Select>
</div>
{cloneMode === "sample_data" && (
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">
Sample Rows
</label>
<Input
className="col-span-3"
type="number"
value={sampleRows}
onChange={(e) =>
setSampleRows(parseInt(e.target.value) || 1000)
}
min={1}
max={100000}
/>
</div>
)}
</div>
<DialogFooter>
<Button
variant="outline"
onClick={() => onOpenChange(false)}
>
Cancel
</Button>
<Button onClick={handleClone} disabled={!dockerReady}>
Clone
</Button>
</DialogFooter>
</>
)}
{step === "progress" && (
<div className="py-4 space-y-4">
<div className="space-y-2">
<div className="flex items-center justify-between text-sm">
<span>{progress?.message || "Starting..."}</span>
<span className="text-muted-foreground">
{progress?.percent ?? 0}%
</span>
</div>
<div className="h-2 rounded-full bg-secondary overflow-hidden">
<div
className="h-full rounded-full bg-primary transition-all duration-300"
style={{ width: `${progress?.percent ?? 0}%` }}
/>
</div>
</div>
{isCloning && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" />
{progress?.stage || "Initializing..."}
</div>
)}
{logSection}
</div>
)}
{step === "done" && (
<div className="py-4 space-y-4">
{error ? (
<div className="flex items-start gap-3 rounded-md border border-destructive/50 bg-destructive/10 p-4">
<XCircle className="h-5 w-5 text-destructive shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium text-destructive">
Clone Failed
</p>
<p className="text-xs text-muted-foreground">{error}</p>
</div>
</div>
) : (
<div className="space-y-3">
<div className="flex items-start gap-3 rounded-md border border-green-500/50 bg-green-500/10 p-4">
<CheckCircle2 className="h-5 w-5 text-green-500 shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium">
Clone Completed
</p>
<p className="text-xs text-muted-foreground">
Database cloned to Docker container successfully.
</p>
</div>
</div>
{result && (
<div className="rounded-md border p-3 space-y-2 text-sm">
<div className="flex items-center justify-between">
<span className="text-muted-foreground">
Container
</span>
<span className="font-mono">
{result.container.name}
</span>
</div>
<div className="flex items-center justify-between">
<span className="text-muted-foreground">Port</span>
<span className="font-mono">
{result.container.host_port}
</span>
</div>
<div className="flex items-center justify-between gap-2">
<span className="text-muted-foreground">URL</span>
<div className="flex items-center gap-1">
<span className="font-mono text-xs truncate max-w-[250px]">
{result.connection_url}
</span>
<Button
variant="ghost"
size="sm"
className="h-6 w-6 p-0"
onClick={() => {
navigator.clipboard.writeText(
result.connection_url
);
toast.success("URL copied");
}}
>
<Copy className="h-3 w-3" />
</Button>
</div>
</div>
</div>
)}
</div>
)}
{logSection}
<DialogFooter>
{error ? (
<>
<Button
variant="outline"
onClick={() => onOpenChange(false)}
>
Close
</Button>
<Button
onClick={() => setStep("config")}
>
Retry
</Button>
</>
) : (
<>
<Button
variant="outline"
onClick={() => onOpenChange(false)}
>
Close
</Button>
{onConnect && result && (
<Button onClick={handleConnect}>Connect</Button>
)}
</>
)}
</DialogFooter>
</div>
)}
</DialogContent>
</Dialog>
);
}
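The progress effect above appends each `CloneProgress` event to the log but skips consecutive duplicates (same `stage` and same `message`), so a backend that re-emits the latest event does not flood the Process Log. The same accumulation step, extracted as a pure function (the name `appendProgress` is illustrative):

```typescript
interface ProgressEvent {
  stage: string;
  message: string;
}

// Pure version of the log-accumulation effect: append an event only when
// it differs from the last logged entry in stage or message.
function appendProgress<T extends ProgressEvent>(log: T[], event: T): T[] {
  const last = log[log.length - 1];
  if (last && last.stage === event.stage && last.message === event.message) {
    return log; // unchanged reference, so React skips a re-render
  }
  return [...log, event];
}
```

Returning the previous array unchanged for duplicates also means the `setLogEntries` updater bails out cheaply instead of allocating a new array per event.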


@@ -0,0 +1,179 @@
import { useState } from "react";
import {
useTuskContainers,
useStartContainer,
useStopContainer,
useRemoveContainer,
useDockerStatus,
} from "@/hooks/use-docker";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { toast } from "sonner";
import {
ChevronDown,
ChevronRight,
Container,
Play,
Square,
Trash2,
Loader2,
} from "lucide-react";
export function DockerContainersList() {
const [expanded, setExpanded] = useState(true);
const { data: dockerStatus } = useDockerStatus();
const { data: containers, isLoading } = useTuskContainers();
const startMutation = useStartContainer();
const stopMutation = useStopContainer();
const removeMutation = useRemoveContainer();
const dockerAvailable =
dockerStatus?.installed && dockerStatus?.daemon_running;
if (!dockerAvailable) {
return null;
}
const handleStart = (name: string) => {
startMutation.mutate(name, {
onSuccess: () => toast.success(`Container "${name}" started`),
onError: (err) =>
toast.error("Failed to start container", {
description: String(err),
}),
});
};
const handleStop = (name: string) => {
stopMutation.mutate(name, {
onSuccess: () => toast.success(`Container "${name}" stopped`),
onError: (err) =>
toast.error("Failed to stop container", {
description: String(err),
}),
});
};
const handleRemove = (name: string) => {
if (
!confirm(
`Remove container "${name}"? This will delete the container and all its data.`
)
) {
return;
}
removeMutation.mutate(name, {
onSuccess: () => toast.success(`Container "${name}" removed`),
onError: (err) =>
toast.error("Failed to remove container", {
description: String(err),
}),
});
};
const isRunning = (status: string) =>
status.toLowerCase().startsWith("up");
return (
<div className="border-b">
<div
className="flex items-center gap-1 px-3 py-2 cursor-pointer select-none hover:bg-accent/50"
onClick={() => setExpanded(!expanded)}
>
{expanded ? (
<ChevronDown className="h-3.5 w-3.5 text-muted-foreground" />
) : (
<ChevronRight className="h-3.5 w-3.5 text-muted-foreground" />
)}
<Container className="h-3.5 w-3.5 text-muted-foreground" />
<span className="text-xs font-semibold flex-1">Docker Clones</span>
{containers && containers.length > 0 && (
<Badge
variant="secondary"
className="text-[9px] px-1 py-0"
>
{containers.length}
</Badge>
)}
</div>
{expanded && (
<div className="pb-1">
{isLoading && (
<div className="px-3 py-2 text-xs text-muted-foreground flex items-center gap-1">
<Loader2 className="h-3 w-3 animate-spin" /> Loading...
</div>
)}
{containers && containers.length === 0 && (
<div className="px-6 pb-2 text-xs text-muted-foreground">
No Docker clones yet. Right-click a database to clone it.
</div>
)}
{containers?.map((container) => (
<div
key={container.container_id}
className="group flex items-center gap-1.5 px-6 py-1 text-xs hover:bg-accent/50"
>
<span className="truncate flex-1 font-medium">
{container.name}
</span>
{container.source_database && (
<span className="text-[10px] text-muted-foreground shrink-0">
{container.source_database}
</span>
)}
<span className="text-[10px] text-muted-foreground shrink-0">
:{container.host_port}
</span>
<Badge
variant={isRunning(container.status) ? "default" : "secondary"}
className={`text-[9px] px-1 py-0 shrink-0 ${
isRunning(container.status)
? "bg-green-600 hover:bg-green-600"
: ""
}`}
>
{isRunning(container.status) ? "running" : "stopped"}
</Badge>
<div className="flex gap-0.5 opacity-0 group-hover:opacity-100 shrink-0">
{isRunning(container.status) ? (
<Button
variant="ghost"
size="sm"
className="h-5 w-5 p-0"
onClick={() => handleStop(container.name)}
title="Stop"
disabled={stopMutation.isPending}
>
<Square className="h-3 w-3" />
</Button>
) : (
<Button
variant="ghost"
size="sm"
className="h-5 w-5 p-0"
onClick={() => handleStart(container.name)}
title="Start"
disabled={startMutation.isPending}
>
<Play className="h-3 w-3" />
</Button>
)}
<Button
variant="ghost"
size="sm"
className="h-5 w-5 p-0 text-destructive hover:text-destructive"
onClick={() => handleRemove(container.name)}
title="Remove"
disabled={removeMutation.isPending}
>
<Trash2 className="h-3 w-3" />
</Button>
</div>
</div>
))}
</div>
)}
</div>
);
}
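Docker reports container state as a human-readable status string (e.g. `Up 3 minutes`, `Exited (0) 2 hours ago`), and the `isRunning` helper above treats any status beginning with "Up" as running. In isolation:

```typescript
// Docker's STATUS column starts with "Up" for running containers and
// "Exited"/"Created"/"Paused" otherwise; lowercase first for safety.
const isRunning = (status: string): boolean =>
  status.toLowerCase().startsWith("up");
```

This is a prefix check on display text rather than a structured state field, so it is deliberately permissive; a paused container ("Up 5 minutes (Paused)") would also match.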


@@ -0,0 +1,184 @@
import { useMemo, useCallback, useEffect, useState } from "react";
import { useTheme } from "next-themes";
import {
ReactFlow,
Background,
Controls,
MiniMap,
MarkerType,
PanOnScrollMode,
applyNodeChanges,
applyEdgeChanges,
type Node,
type Edge,
type NodeTypes,
type NodeChange,
type EdgeChange,
} from "@xyflow/react";
import dagre from "dagre";
import "@xyflow/react/dist/style.css";
import { useSchemaErd } from "@/hooks/use-schema";
import { ErdTableNode, type ErdTableNodeData } from "./ErdTableNode";
import type { ErdData } from "@/types";
const nodeTypes: NodeTypes = {
erdTable: ErdTableNode,
};
const NODE_WIDTH = 250;
const NODE_ROW_HEIGHT = 24;
const NODE_HEADER_HEIGHT = 36;
function buildLayout(data: ErdData): { nodes: Node[]; edges: Edge[] } {
const g = new dagre.graphlib.Graph();
g.setDefaultEdgeLabel(() => ({}));
g.setGraph({ rankdir: "LR", nodesep: 60, ranksep: 150 });
// Build list of FK column names per table for icon display
const fkColumnsPerTable = new Map<string, string[]>();
for (const rel of data.relationships) {
const key = `${rel.source_schema}.${rel.source_table}`;
if (!fkColumnsPerTable.has(key)) fkColumnsPerTable.set(key, []);
for (const col of rel.source_columns) {
const arr = fkColumnsPerTable.get(key)!;
if (!arr.includes(col)) arr.push(col);
}
}
for (const table of data.tables) {
const height = NODE_HEADER_HEIGHT + table.columns.length * NODE_ROW_HEIGHT;
g.setNode(table.name, { width: NODE_WIDTH, height });
}
for (const rel of data.relationships) {
g.setEdge(rel.source_table, rel.target_table);
}
dagre.layout(g);
const nodes: Node[] = data.tables.map((table) => {
const pos = g.node(table.name);
const tableKey = `${table.schema}.${table.name}`;
return {
id: table.name,
type: "erdTable",
position: { x: pos.x - NODE_WIDTH / 2, y: pos.y - pos.height / 2 },
data: {
label: table.name,
schema: table.schema,
columns: table.columns,
fkColumnNames: fkColumnsPerTable.get(tableKey) ?? [],
} satisfies ErdTableNodeData,
};
});
const edges: Edge[] = data.relationships.map((rel) => ({
id: rel.constraint_name,
source: rel.source_table,
target: rel.target_table,
type: "smoothstep",
label: rel.constraint_name,
labelStyle: { fontSize: 10, fill: "var(--muted-foreground)" },
labelBgStyle: { fill: "var(--card)", fillOpacity: 0.8 },
labelBgPadding: [4, 2] as [number, number],
markerEnd: {
type: MarkerType.ArrowClosed,
width: 16,
height: 16,
color: "var(--muted-foreground)",
},
style: { stroke: "var(--muted-foreground)", strokeWidth: 1.5 },
}));
return { nodes, edges };
}
interface Props {
connectionId: string;
schema: string;
}
export function ErdDiagram({ connectionId, schema }: Props) {
const { data: erdData, isLoading, error } = useSchemaErd(connectionId, schema);
const { resolvedTheme } = useTheme();
const layout = useMemo(() => {
if (!erdData) return null;
return buildLayout(erdData);
}, [erdData]);
const [nodes, setNodes] = useState<Node[]>([]);
const [edges, setEdges] = useState<Edge[]>([]);
useEffect(() => {
if (layout) {
setNodes(layout.nodes);
setEdges(layout.edges);
}
}, [layout]);
const onNodesChange = useCallback(
(changes: NodeChange[]) => setNodes((nds) => applyNodeChanges(changes, nds)),
[],
);
const onEdgesChange = useCallback(
(changes: EdgeChange[]) => setEdges((eds) => applyEdgeChanges(changes, eds)),
[],
);
if (isLoading) {
return (
<div className="flex h-full items-center justify-center text-sm text-muted-foreground">
Loading ER diagram...
</div>
);
}
if (error) {
return (
<div className="flex h-full items-center justify-center text-sm text-destructive">
Error loading ER diagram: {String(error)}
</div>
);
}
if (!erdData || erdData.tables.length === 0) {
return (
<div className="flex h-full items-center justify-center text-sm text-muted-foreground">
No tables found in schema &quot;{schema}&quot;.
</div>
);
}
return (
<div className="h-full w-full">
<ReactFlow
nodes={nodes}
edges={edges}
onNodesChange={onNodesChange}
onEdgesChange={onEdgesChange}
nodeTypes={nodeTypes}
fitView
colorMode={resolvedTheme === "dark" ? "dark" : "light"}
minZoom={0.05}
maxZoom={3}
zoomOnScroll
zoomOnPinch
panOnScroll
panOnScrollMode={PanOnScrollMode.Free}
proOptions={{ hideAttribution: true }}
>
<Background gap={16} size={1} />
<Controls className="!bg-card !border !shadow-sm [&>button]:!bg-card [&>button]:!border-border [&>button]:!text-foreground" />
<MiniMap
className="!bg-card !border"
nodeColor="var(--muted)"
maskColor="rgba(0, 0, 0, 0.7)"
/>
</ReactFlow>
</div>
);
}
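Two pieces of arithmetic drive `buildLayout`: each node's height grows linearly with its column count, and dagre returns *center* coordinates while React Flow positions nodes by their *top-left* corner, so the layout shifts by half the width and height. The math in isolation (constants copied from the component; function names are illustrative):

```typescript
const NODE_WIDTH = 250;
const NODE_ROW_HEIGHT = 24;
const NODE_HEADER_HEIGHT = 36;

// A table node is one header band plus one row per column.
function nodeHeight(columnCount: number): number {
  return NODE_HEADER_HEIGHT + columnCount * NODE_ROW_HEIGHT;
}

// dagre reports node centers; React Flow expects top-left positions.
function toTopLeft(
  center: { x: number; y: number },
  width: number,
  height: number
): { x: number; y: number } {
  return { x: center.x - width / 2, y: center.y - height / 2 };
}
```

Registering the correct height with `g.setNode` before calling `dagre.layout` is what keeps tall tables from overlapping their neighbors in the left-to-right rank layout.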


@@ -0,0 +1,54 @@
import { memo } from "react";
import { Handle, Position, type NodeProps } from "@xyflow/react";
import type { ErdColumn } from "@/types";
import { KeyRound, Link } from "lucide-react";
export interface ErdTableNodeData {
label: string;
schema: string;
columns: ErdColumn[];
fkColumnNames: string[];
[key: string]: unknown;
}
function ErdTableNodeComponent({ data }: NodeProps) {
const { label, columns, fkColumnNames } = data as unknown as ErdTableNodeData;
return (
<div className="min-w-[220px] rounded-lg border border-border bg-card text-card-foreground shadow-md">
<div className="rounded-t-lg border-b bg-primary/10 px-3 py-2 text-xs font-bold tracking-wide text-primary">
{label}
</div>
<div className="divide-y divide-border/50">
{(columns as ErdColumn[]).map((col, i) => (
<div key={i} className="flex items-center gap-1.5 px-3 py-1 text-[11px]">
{col.is_primary_key ? (
<KeyRound className="h-3 w-3 shrink-0 text-amber-500" />
) : (fkColumnNames as string[]).includes(col.name) ? (
<Link className="h-3 w-3 shrink-0 text-blue-400" />
) : (
<span className="h-3 w-3 shrink-0" />
)}
<span className="font-medium">{col.name}</span>
<span className="ml-auto text-muted-foreground">{col.data_type}</span>
{col.is_nullable && (
<span className="text-muted-foreground/60">?</span>
)}
</div>
))}
</div>
<Handle
type="target"
position={Position.Left}
className="!w-1.5 !h-1.5 !bg-primary !opacity-0 hover:!opacity-100 !border-none !min-w-0 !min-h-0"
/>
<Handle
type="source"
position={Position.Right}
className="!w-1.5 !h-1.5 !bg-primary !opacity-0 hover:!opacity-100 !border-none !min-w-0 !min-h-0"
/>
</div>
);
}
export const ErdTableNode = memo(ErdTableNodeComponent);


@@ -0,0 +1,232 @@
import { useState } from "react";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { useIndexAdvisorReport, useApplyIndexRecommendation } from "@/hooks/use-index-advisor";
import { RecommendationCard } from "./RecommendationCard";
import { toast } from "sonner";
import { Loader2, Gauge, Search, AlertTriangle } from "lucide-react";
import type { IndexAdvisorReport } from "@/types";
interface Props {
connectionId: string;
}
export function IndexAdvisorPanel({ connectionId }: Props) {
const [report, setReport] = useState<IndexAdvisorReport | null>(null);
const [appliedDdls, setAppliedDdls] = useState<Set<string>>(new Set());
const [applyingDdl, setApplyingDdl] = useState<string | null>(null);
const reportMutation = useIndexAdvisorReport();
const applyMutation = useApplyIndexRecommendation();
const handleAnalyze = () => {
reportMutation.mutate(connectionId, {
onSuccess: (data) => {
setReport(data);
setAppliedDdls(new Set());
},
onError: (err) => toast.error("Analysis failed", { description: String(err) }),
});
};
const handleApply = async (ddl: string) => {
if (!confirm("Apply this index change? This will modify the database schema.")) return;
setApplyingDdl(ddl);
try {
await applyMutation.mutateAsync({ connectionId, ddl });
setAppliedDdls((prev) => new Set(prev).add(ddl));
toast.success("Index change applied");
} catch (err) {
toast.error("Failed to apply", { description: String(err) });
} finally {
setApplyingDdl(null);
}
};
return (
<div className="flex h-full flex-col">
{/* Header */}
<div className="border-b px-4 py-3 flex items-center justify-between">
<div className="flex items-center gap-2">
<Gauge className="h-5 w-5 text-primary" />
<h2 className="text-sm font-medium">Index Advisor</h2>
</div>
<Button
size="sm"
onClick={handleAnalyze}
disabled={reportMutation.isPending}
>
{reportMutation.isPending ? (
<><Loader2 className="h-3.5 w-3.5 animate-spin mr-1" />Analyzing...</>
) : (
<><Search className="h-3.5 w-3.5 mr-1" />Analyze</>
)}
</Button>
</div>
{/* Content */}
<div className="flex-1 overflow-auto">
{!report ? (
<div className="flex h-full items-center justify-center text-sm text-muted-foreground">
Click Analyze to scan your database for index optimization opportunities.
</div>
) : (
<Tabs defaultValue="recommendations" className="h-full flex flex-col">
<div className="border-b px-4">
<TabsList className="h-9">
<TabsTrigger value="recommendations" className="text-xs">
Recommendations
{report.recommendations.length > 0 && (
<Badge variant="secondary" className="ml-1 text-[10px]">{report.recommendations.length}</Badge>
)}
</TabsTrigger>
<TabsTrigger value="table-stats" className="text-xs">Table Stats</TabsTrigger>
<TabsTrigger value="index-stats" className="text-xs">Index Stats</TabsTrigger>
<TabsTrigger value="slow-queries" className="text-xs">
Slow Queries
{!report.has_pg_stat_statements && (
<AlertTriangle className="h-3 w-3 ml-1 text-yellow-500" />
)}
</TabsTrigger>
</TabsList>
</div>
<TabsContent value="recommendations" className="flex-1 overflow-auto p-4 space-y-2 mt-0">
{report.recommendations.length === 0 ? (
<div className="text-sm text-muted-foreground text-center py-8">
No recommendations found. Your indexes look good!
</div>
) : (
report.recommendations.map((rec, i) => (
<RecommendationCard
key={rec.id || i}
recommendation={rec}
onApply={handleApply}
isApplying={applyingDdl === rec.ddl}
applied={appliedDdls.has(rec.ddl)}
/>
))
)}
</TabsContent>
<TabsContent value="table-stats" className="flex-1 overflow-auto mt-0">
<div className="overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b bg-muted/50">
<th className="px-3 py-2 text-left font-medium">Table</th>
<th className="px-3 py-2 text-right font-medium">Seq Scans</th>
<th className="px-3 py-2 text-right font-medium">Idx Scans</th>
<th className="px-3 py-2 text-right font-medium">Rows</th>
<th className="px-3 py-2 text-right font-medium">Table Size</th>
<th className="px-3 py-2 text-right font-medium">Index Size</th>
</tr>
</thead>
<tbody>
{report.table_stats.map((ts) => {
const ratio = ts.seq_scan + ts.idx_scan > 0
? ts.seq_scan / (ts.seq_scan + ts.idx_scan)
: 0;
return (
<tr key={`${ts.schema}.${ts.table}`} className="border-b">
<td className="px-3 py-2 font-mono">{ts.schema}.{ts.table}</td>
<td className={`px-3 py-2 text-right ${ratio > 0.8 && ts.n_live_tup > 1000 ? "text-destructive font-medium" : ""}`}>
{ts.seq_scan.toLocaleString()}
</td>
<td className="px-3 py-2 text-right">{ts.idx_scan.toLocaleString()}</td>
<td className="px-3 py-2 text-right">{ts.n_live_tup.toLocaleString()}</td>
<td className="px-3 py-2 text-right">{ts.table_size}</td>
<td className="px-3 py-2 text-right">{ts.index_size}</td>
</tr>
);
})}
</tbody>
</table>
</div>
</TabsContent>
<TabsContent value="index-stats" className="flex-1 overflow-auto mt-0">
<div className="overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b bg-muted/50">
<th className="px-3 py-2 text-left font-medium">Index</th>
<th className="px-3 py-2 text-left font-medium">Table</th>
<th className="px-3 py-2 text-right font-medium">Scans</th>
<th className="px-3 py-2 text-right font-medium">Size</th>
<th className="px-3 py-2 text-left font-medium">Definition</th>
</tr>
</thead>
<tbody>
{report.index_stats.map((is) => (
<tr key={`${is.schema}.${is.index_name}`} className="border-b">
<td className={`px-3 py-2 font-mono ${is.idx_scan === 0 ? "text-yellow-600" : ""}`}>
{is.index_name}
</td>
<td className="px-3 py-2">{is.schema}.{is.table}</td>
<td className={`px-3 py-2 text-right ${is.idx_scan === 0 ? "text-yellow-600 font-medium" : ""}`}>
{is.idx_scan.toLocaleString()}
</td>
<td className="px-3 py-2 text-right">{is.index_size}</td>
<td className="px-3 py-2 font-mono text-muted-foreground max-w-xs truncate">
{is.definition}
</td>
</tr>
))}
</tbody>
</table>
</div>
</TabsContent>
<TabsContent value="slow-queries" className="flex-1 overflow-auto mt-0">
{!report.has_pg_stat_statements ? (
<div className="p-4 text-sm text-muted-foreground">
<div className="flex items-center gap-2 mb-2">
<AlertTriangle className="h-4 w-4 text-yellow-500" />
pg_stat_statements extension is not installed
</div>
<p className="text-xs">
Enable it with: CREATE EXTENSION pg_stat_statements;
</p>
</div>
) : report.slow_queries.length === 0 ? (
<div className="p-4 text-sm text-muted-foreground text-center">
No slow queries found.
</div>
) : (
<div className="overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b bg-muted/50">
<th className="px-3 py-2 text-left font-medium">Query</th>
<th className="px-3 py-2 text-right font-medium">Calls</th>
<th className="px-3 py-2 text-right font-medium">Mean (ms)</th>
<th className="px-3 py-2 text-right font-medium">Total (ms)</th>
<th className="px-3 py-2 text-right font-medium">Rows</th>
</tr>
</thead>
<tbody>
{report.slow_queries.map((sq, i) => (
<tr key={i} className="border-b">
<td className="px-3 py-2 font-mono max-w-md truncate" title={sq.query}>
{sq.query.substring(0, 150)}
</td>
<td className="px-3 py-2 text-right">{sq.calls.toLocaleString()}</td>
<td className="px-3 py-2 text-right">{sq.mean_time_ms.toFixed(1)}</td>
<td className="px-3 py-2 text-right">{sq.total_time_ms.toFixed(0)}</td>
<td className="px-3 py-2 text-right">{sq.rows.toLocaleString()}</td>
</tr>
))}
</tbody>
</table>
</div>
)}
</TabsContent>
</Tabs>
)}
</div>
</div>
);
}
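The table-stats tab highlights a row in red when sequential scans dominate (ratio above 0.8) and the table holds more than 1,000 live tuples. That heuristic can be isolated as a pure function for reuse or testing — a sketch with illustrative names (`ScanStats`, `needsIndexAttention` are not part of the codebase), thresholds copied from the component:

```typescript
// Missing-index heuristic mirrored from the table-stats view:
// a table is flagged when sequential scans dominate index scans
// AND the table is large enough for the scan cost to matter.
interface ScanStats {
  seq_scan: number;   // pg_stat_user_tables.seq_scan
  idx_scan: number;   // pg_stat_user_tables.idx_scan
  n_live_tup: number; // estimated live rows
}

function seqScanRatio(s: ScanStats): number {
  const total = s.seq_scan + s.idx_scan;
  // Avoid 0/0 for tables that have never been scanned.
  return total > 0 ? s.seq_scan / total : 0;
}

function needsIndexAttention(s: ScanStats): boolean {
  return seqScanRatio(s) > 0.8 && s.n_live_tup > 1000;
}
```

The row-count guard matters: tiny tables are often cheapest to read with a sequential scan, so a high ratio alone is not a signal.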

View File

@@ -0,0 +1,86 @@
import { useState } from "react";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Loader2, Play } from "lucide-react";
import type { IndexRecommendation } from "@/types";
interface Props {
recommendation: IndexRecommendation;
onApply: (ddl: string) => void;
isApplying: boolean;
applied: boolean;
}
function priorityBadge(priority: string) {
switch (priority.toLowerCase()) {
case "high":
return <Badge variant="destructive">{priority}</Badge>;
case "medium":
return <Badge className="bg-yellow-600 text-white">{priority}</Badge>;
default:
return <Badge variant="secondary">{priority}</Badge>;
}
}
function typeBadge(type: string) {
switch (type) {
case "create_index":
return <Badge className="bg-green-600 text-white">CREATE</Badge>;
case "drop_index":
return <Badge variant="destructive">DROP</Badge>;
case "replace_index":
return <Badge className="bg-blue-600 text-white">REPLACE</Badge>;
default:
return <Badge variant="secondary">{type}</Badge>;
}
}
export function RecommendationCard({ recommendation, onApply, isApplying, applied }: Props) {
const [showDdl] = useState(true);
return (
<div className="rounded-md border p-3 space-y-2">
<div className="flex items-start justify-between gap-2">
<div className="flex items-center gap-2 flex-wrap">
{typeBadge(recommendation.recommendation_type)}
{priorityBadge(recommendation.priority)}
<span className="text-xs text-muted-foreground">
{recommendation.table_schema}.{recommendation.table_name}
</span>
{recommendation.index_name && (
<span className="text-xs font-mono text-muted-foreground">
{recommendation.index_name}
</span>
)}
</div>
<Button
size="sm"
variant={applied ? "outline" : "default"}
onClick={() => onApply(recommendation.ddl)}
disabled={isApplying || applied}
className="shrink-0"
>
{isApplying ? (
<Loader2 className="h-3.5 w-3.5 animate-spin mr-1" />
) : applied ? (
"Applied"
) : (
<><Play className="h-3.5 w-3.5 mr-1" />Apply</>
)}
</Button>
</div>
<p className="text-sm">{recommendation.rationale}</p>
<div className="flex items-center gap-2 text-xs text-muted-foreground">
<span>Impact: {recommendation.estimated_impact}</span>
</div>
{showDdl && (
<pre className="rounded bg-muted p-2 text-xs font-mono overflow-x-auto">
{recommendation.ddl}
</pre>
)}
</div>
);
}
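`RecommendationCard` renders badges for whatever priority string the report carries, and the panel shows recommendations in report order. If the caller wants high-priority cards first, a stable sort over the same `"high" | "medium" | "low"` strings that `priorityBadge` switches on is enough — a sketch, not part of the component; unknown priorities sort last:

```typescript
// Rank map over the priority strings priorityBadge recognizes.
const PRIORITY_ORDER: Record<string, number> = { high: 0, medium: 1, low: 2 };

// Returns a new array sorted high -> medium -> low; Array.prototype.sort
// is stable, so report order is preserved within each priority level.
function byPriority<T extends { priority: string }>(recs: T[]): T[] {
  return [...recs].sort(
    (a, b) =>
      (PRIORITY_ORDER[a.priority.toLowerCase()] ?? 3) -
      (PRIORITY_ORDER[b.priority.toLowerCase()] ?? 3)
  );
}
```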

View File

@@ -1,7 +1,9 @@
import { useAppStore } from "@/stores/app-store";
import { useConnections } from "@/hooks/use-connections";
import { useMcpStatus } from "@/hooks/use-settings";
import { Circle } from "lucide-react";
import { EnvironmentBadge } from "@/components/connections/EnvironmentBadge";
import { Tooltip, TooltipContent, TooltipTrigger } from "@/components/ui/tooltip";
function formatDbVersion(version: string): string {
const gpMatch = version.match(/Greenplum Database ([\d.]+)/i);
@@ -21,6 +23,7 @@ interface Props {
export function StatusBar({ rowCount, executionTime }: Props) {
const { activeConnectionId, connectedIds, readOnlyMap, pgVersion } = useAppStore();
const { data: connections } = useConnections();
const { data: mcpStatus } = useMcpStatus();
const activeConn = connections?.find((c) => c.id === activeConnectionId);
const isConnected = activeConnectionId
@@ -62,6 +65,25 @@ export function StatusBar({ rowCount, executionTime }: Props) {
<div className="flex items-center gap-3">
{rowCount != null && <span>{rowCount.toLocaleString()} rows</span>}
{executionTime != null && <span>{executionTime} ms</span>}
<Tooltip>
<TooltipTrigger asChild>
<span className="flex items-center gap-1 cursor-default">
<span
className={`inline-block h-1.5 w-1.5 rounded-full ${
mcpStatus?.running
? "bg-green-500"
: "bg-muted-foreground/30"
}`}
/>
<span>MCP</span>
</span>
</TooltipTrigger>
<TooltipContent side="top">
<p className="text-xs">
MCP Server {mcpStatus?.running ? `running on :${mcpStatus.port}` : "stopped"}
</p>
</TooltipContent>
</Tooltip>
</div>
</div>
);

View File

@@ -1,7 +1,7 @@
import { useAppStore } from "@/stores/app-store";
import { useConnections } from "@/hooks/use-connections";
import { ScrollArea } from "@/components/ui/scroll-area";
import { X, Table2, Code, Columns, Users, Activity, Search, GitFork, ShieldCheck, Gauge, Camera } from "lucide-react";
export function TabBar() {
const { tabs, activeTabId, setActiveTabId, closeTab } = useAppStore();
@@ -16,6 +16,10 @@ export function TabBar() {
roles: <Users className="h-3 w-3" />,
sessions: <Activity className="h-3 w-3" />,
lookup: <Search className="h-3 w-3" />,
erd: <GitFork className="h-3 w-3" />,
validation: <ShieldCheck className="h-3 w-3" />,
"index-advisor": <Gauge className="h-3 w-3" />,
snapshots: <Camera className="h-3 w-3" />,
};
return (

View File

@@ -8,13 +8,15 @@ import { ReadOnlyToggle } from "@/components/layout/ReadOnlyToggle";
import { useAppStore } from "@/stores/app-store";
import { useConnections, useReconnect } from "@/hooks/use-connections";
import { toast } from "sonner";
import { Database, Plus, RefreshCw, Search, Settings } from "lucide-react";
import type { ConnectionConfig, Tab } from "@/types";
import { getEnvironment } from "@/lib/environment";
import { AppSettingsSheet } from "@/components/settings/AppSettingsSheet";
export function Toolbar() {
const [listOpen, setListOpen] = useState(false);
const [dialogOpen, setDialogOpen] = useState(false);
const [settingsOpen, setSettingsOpen] = useState(false);
const [editingConn, setEditingConn] = useState<ConnectionConfig | null>(null);
const { activeConnectionId, currentDatabase, addTab } = useAppStore();
const { data: connections } = useConnections();
@@ -117,9 +119,15 @@ export function Toolbar() {
<div className="flex-1" />
<Button
variant="ghost"
size="sm"
className="h-7 w-7 p-0"
onClick={() => setSettingsOpen(true)}
title="Settings"
>
<Settings className="h-3.5 w-3.5" />
</Button>
</div>
<ConnectionList
@@ -140,6 +148,11 @@ export function Toolbar() {
onOpenChange={setDialogOpen}
connection={editingConn}
/>
<AppSettingsSheet
open={settingsOpen}
onOpenChange={setSettingsOpen}
/>
</>
);
}

View File

@@ -23,6 +23,7 @@ import {
Activity,
Loader2,
} from "lucide-react";
import { DockerContainersList } from "@/components/docker/DockerContainersList";
import type { Tab, RoleInfo } from "@/types";
export function AdminPanel() {
@@ -72,6 +73,7 @@ export function AdminPanel() {
addTab(tab);
}}
/>
<DockerContainersList />
</div>
);
}

View File

@@ -31,6 +31,8 @@ import {
ContextMenuTrigger,
} from "@/components/ui/context-menu";
import { GrantRevokeDialog } from "@/components/management/GrantRevokeDialog";
import { CloneDatabaseDialog } from "@/components/docker/CloneDatabaseDialog";
import { GenerateDataDialog } from "@/components/data-generator/GenerateDataDialog";
import type { Tab, SchemaObject } from "@/types";
function formatSize(bytes: number): string {
@@ -65,6 +67,7 @@ export function SchemaTree() {
const { data: databases } = useDatabases(activeConnectionId);
const { data: connections } = useConnections();
const switchDbMutation = useSwitchDatabase();
const [cloneTarget, setCloneTarget] = useState<string | null>(null);
if (!activeConnectionId) {
return (
@@ -112,6 +115,7 @@ export function SchemaTree() {
connectionId={activeConnectionId}
onSwitch={() => handleSwitchDb(db)}
isSwitching={switchDbMutation.isPending}
onCloneToDocker={(dbName) => setCloneTarget(dbName)}
onOpenTable={(schema, table) => {
const tab: Tab = {
id: crypto.randomUUID(),
@@ -136,8 +140,25 @@ export function SchemaTree() {
};
addTab(tab);
}}
onViewErd={(schema) => {
const tab: Tab = {
id: crypto.randomUUID(),
type: "erd",
title: `${schema} (ER Diagram)`,
connectionId: activeConnectionId,
database: currentDatabase ?? undefined,
schema,
};
addTab(tab);
}}
/> />
))} ))}
<CloneDatabaseDialog
open={cloneTarget !== null}
onOpenChange={(open) => { if (!open) setCloneTarget(null); }}
connectionId={activeConnectionId}
database={cloneTarget ?? ""}
/>
</div>
);
}
@@ -148,16 +169,20 @@ function DatabaseNode({
connectionId,
onSwitch,
isSwitching,
onCloneToDocker,
onOpenTable,
onViewStructure,
onViewErd,
}: {
name: string;
isActive: boolean;
connectionId: string;
onSwitch: () => void;
isSwitching: boolean;
onCloneToDocker: (dbName: string) => void;
onOpenTable: (schema: string, table: string) => void;
onViewStructure: (schema: string, table: string) => void;
onViewErd: (schema: string) => void;
}) {
const [expanded, setExpanded] = useState(false);
const readOnlyMap = useAppStore((s) => s.readOnlyMap);
@@ -218,6 +243,58 @@ function DatabaseNode({
>
Properties
</ContextMenuItem>
<ContextMenuItem onClick={() => onCloneToDocker(name)}>
Clone to Docker
</ContextMenuItem>
<ContextMenuSeparator />
<ContextMenuItem
disabled={!isActive}
onClick={() => {
if (!isActive) return;
const tab: Tab = {
id: crypto.randomUUID(),
type: "validation",
title: "Data Validation",
connectionId,
database: name,
};
useAppStore.getState().addTab(tab);
}}
>
Data Validation
</ContextMenuItem>
<ContextMenuItem
disabled={!isActive}
onClick={() => {
if (!isActive) return;
const tab: Tab = {
id: crypto.randomUUID(),
type: "index-advisor",
title: "Index Advisor",
connectionId,
database: name,
};
useAppStore.getState().addTab(tab);
}}
>
Index Advisor
</ContextMenuItem>
<ContextMenuItem
disabled={!isActive}
onClick={() => {
if (!isActive) return;
const tab: Tab = {
id: crypto.randomUUID(),
type: "snapshots",
title: "Data Snapshots",
connectionId,
database: name,
};
useAppStore.getState().addTab(tab);
}}
>
Data Snapshots
</ContextMenuItem>
<ContextMenuSeparator />
<ContextMenuItem
disabled={isActive || isReadOnly}
@@ -234,6 +311,7 @@ function DatabaseNode({
connectionId={connectionId}
onOpenTable={onOpenTable}
onViewStructure={onViewStructure}
onViewErd={onViewErd}
/>
</div>
)}
@@ -250,10 +328,12 @@ function SchemasForCurrentDb({
connectionId,
onOpenTable,
onViewStructure,
onViewErd,
}: {
connectionId: string;
onOpenTable: (schema: string, table: string) => void;
onViewStructure: (schema: string, table: string) => void;
onViewErd: (schema: string) => void;
}) {
const { data: schemas } = useSchemas(connectionId);
@@ -272,6 +352,7 @@ function SchemasForCurrentDb({
connectionId={connectionId}
onOpenTable={(table) => onOpenTable(schema, table)}
onViewStructure={(table) => onViewStructure(schema, table)}
onViewErd={() => onViewErd(schema)}
/>
))}
</>
@@ -283,32 +364,43 @@ function SchemaNode({
connectionId,
onOpenTable,
onViewStructure,
onViewErd,
}: {
schema: string;
connectionId: string;
onOpenTable: (table: string) => void;
onViewStructure: (table: string) => void;
onViewErd: () => void;
}) {
const [expanded, setExpanded] = useState(false);
return (
<div>
<ContextMenu>
<ContextMenuTrigger>
<div
className="flex items-center gap-1 rounded-sm px-1 py-0.5 text-sm hover:bg-accent cursor-pointer select-none font-medium"
onClick={() => setExpanded(!expanded)}
>
{expanded ? (
<ChevronDown className="h-3.5 w-3.5 text-muted-foreground" />
) : (
<ChevronRight className="h-3.5 w-3.5 text-muted-foreground" />
)}
{expanded ? (
<FolderOpen className="h-3.5 w-3.5 text-muted-foreground" />
) : (
<Database className="h-3.5 w-3.5 text-muted-foreground" />
)}
<span>{schema}</span>
</div>
</ContextMenuTrigger>
<ContextMenuContent>
<ContextMenuItem onClick={onViewErd}>
View ER Diagram
</ContextMenuItem>
</ContextMenuContent>
</ContextMenu>
{expanded && (
<div className="ml-4">
<CategoryNode
@@ -373,6 +465,7 @@ function CategoryNode({
}) {
const [expanded, setExpanded] = useState(false);
const [privilegesTarget, setPrivilegesTarget] = useState<string | null>(null);
const [dataGenTarget, setDataGenTarget] = useState<string | null>(null);
const tablesQuery = useTables(
expanded && category === "tables" ? connectionId : null,
@@ -447,6 +540,13 @@ function CategoryNode({
>
View Structure
</ContextMenuItem>
{category === "tables" && (
<ContextMenuItem
onClick={() => setDataGenTarget(item.name)}
>
Generate Test Data
</ContextMenuItem>
)}
<ContextMenuSeparator />
<ContextMenuItem
onClick={() => setPrivilegesTarget(item.name)}
@@ -482,6 +582,15 @@ function CategoryNode({
table={privilegesTarget}
/>
)}
{dataGenTarget && (
<GenerateDataDialog
open={!!dataGenTarget}
onOpenChange={(open) => !open && setDataGenTarget(null)}
connectionId={connectionId}
schema={schema}
table={dataGenTarget}
/>
)}
</div>
);
}

View File

@@ -0,0 +1,254 @@
import { useState, useEffect } from "react";
import {
Sheet,
SheetContent,
SheetHeader,
SheetTitle,
SheetDescription,
SheetFooter,
} from "@/components/ui/sheet";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import { Separator } from "@/components/ui/separator";
import { useAppSettings, useSaveAppSettings, useMcpStatus } from "@/hooks/use-settings";
import { useAiSettings, useSaveAiSettings } from "@/hooks/use-ai";
import { AiSettingsFields } from "@/components/ai/AiSettingsFields";
import { Loader2, Copy, Check } from "lucide-react";
import { toast } from "sonner";
import type { AppSettings, DockerHost } from "@/types";
interface Props {
open: boolean;
onOpenChange: (open: boolean) => void;
}
export function AppSettingsSheet({ open, onOpenChange }: Props) {
const { data: appSettings } = useAppSettings();
const { data: mcpStatus } = useMcpStatus();
const saveAppMutation = useSaveAppSettings();
const { data: aiSettings } = useAiSettings();
const saveAiMutation = useSaveAiSettings();
// MCP state
const [mcpEnabled, setMcpEnabled] = useState(true);
const [mcpPort, setMcpPort] = useState(9427);
// Docker state
const [dockerHost, setDockerHost] = useState<DockerHost>("local");
const [dockerRemoteUrl, setDockerRemoteUrl] = useState("");
// AI state
const [ollamaUrl, setOllamaUrl] = useState("http://localhost:11434");
const [aiModel, setAiModel] = useState("");
const [copied, setCopied] = useState(false);
// Sync form with loaded settings
useEffect(() => {
if (appSettings) {
setMcpEnabled(appSettings.mcp.enabled);
setMcpPort(appSettings.mcp.port);
setDockerHost(appSettings.docker.host);
setDockerRemoteUrl(appSettings.docker.remote_url ?? "");
}
}, [appSettings]);
useEffect(() => {
if (aiSettings) {
setOllamaUrl(aiSettings.ollama_url);
setAiModel(aiSettings.model);
}
}, [aiSettings]);
const mcpEndpoint = `http://127.0.0.1:${mcpPort}/mcp`;
const handleCopy = async () => {
await navigator.clipboard.writeText(mcpEndpoint);
setCopied(true);
setTimeout(() => setCopied(false), 2000);
};
const handleSave = () => {
const settings: AppSettings = {
mcp: { enabled: mcpEnabled, port: mcpPort },
docker: {
host: dockerHost,
remote_url: dockerHost === "remote" ? dockerRemoteUrl || undefined : undefined,
},
};
saveAppMutation.mutate(settings, {
onSuccess: () => {
toast.success("Settings saved");
},
onError: (err) =>
toast.error("Failed to save settings", { description: String(err) }),
});
// Save AI settings separately
saveAiMutation.mutate(
{ provider: "ollama", ollama_url: ollamaUrl, model: aiModel },
{
onError: (err) =>
toast.error("Failed to save AI settings", { description: String(err) }),
}
);
};
return (
<Sheet open={open} onOpenChange={onOpenChange}>
<SheetContent side="right" className="w-[400px] sm:max-w-[400px] overflow-y-auto">
<SheetHeader>
<SheetTitle>Settings</SheetTitle>
<SheetDescription>Application configuration</SheetDescription>
</SheetHeader>
<div className="flex flex-col gap-6 px-4">
{/* MCP Server */}
<section className="flex flex-col gap-3">
<h3 className="text-sm font-medium">MCP Server</h3>
<div className="flex items-center justify-between">
<span className="text-xs text-muted-foreground">Enabled</span>
<Button
size="sm"
variant={mcpEnabled ? "default" : "outline"}
className="h-6 text-xs px-3"
onClick={() => setMcpEnabled(!mcpEnabled)}
>
{mcpEnabled ? "On" : "Off"}
</Button>
</div>
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Port</label>
<Input
type="number"
value={mcpPort}
onChange={(e) => setMcpPort(Number(e.target.value))}
className="h-8 text-xs"
min={1}
max={65535}
/>
</div>
<div className="flex items-center gap-2">
<span className="text-xs text-muted-foreground">Status:</span>
<span className="flex items-center gap-1.5 text-xs">
<span
className={`inline-block h-2 w-2 rounded-full ${
mcpStatus?.running
? "bg-green-500"
: "bg-muted-foreground/30"
}`}
/>
{mcpStatus?.running ? "Running" : "Stopped"}
</span>
</div>
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Endpoint</label>
<div className="flex items-center gap-1">
<code className="flex-1 rounded bg-muted px-2 py-1 text-xs font-mono truncate">
{mcpEndpoint}
</code>
<Button
size="sm"
variant="ghost"
className="h-7 w-7 p-0"
onClick={handleCopy}
title="Copy endpoint URL"
>
{copied ? (
<Check className="h-3 w-3 text-green-500" />
) : (
<Copy className="h-3 w-3" />
)}
</Button>
</div>
</div>
</section>
<Separator />
{/* Docker */}
<section className="flex flex-col gap-3">
<h3 className="text-sm font-medium">Docker</h3>
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Docker Host</label>
<Select value={dockerHost} onValueChange={(v) => setDockerHost(v as DockerHost)}>
<SelectTrigger className="h-8 text-xs">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="local">Local</SelectItem>
<SelectItem value="remote">Remote</SelectItem>
</SelectContent>
</Select>
</div>
{dockerHost === "remote" && (
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Remote URL</label>
<Input
value={dockerRemoteUrl}
onChange={(e) => setDockerRemoteUrl(e.target.value)}
placeholder="tcp://192.168.1.100:2375"
className="h-8 text-xs"
/>
</div>
)}
</section>
<Separator />
{/* AI */}
<section className="flex flex-col gap-3">
<h3 className="text-sm font-medium">AI</h3>
<div className="flex flex-col gap-1.5">
<label className="text-xs text-muted-foreground">Provider</label>
<Select value="ollama" disabled>
<SelectTrigger className="h-8 text-xs">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="ollama">Ollama</SelectItem>
</SelectContent>
</Select>
</div>
<AiSettingsFields
ollamaUrl={ollamaUrl}
onOllamaUrlChange={setOllamaUrl}
model={aiModel}
onModelChange={setAiModel}
/>
</section>
</div>
<SheetFooter>
<Button
className="w-full"
onClick={handleSave}
disabled={saveAppMutation.isPending}
>
{saveAppMutation.isPending ? (
<Loader2 className="h-4 w-4 animate-spin mr-2" />
) : null}
Save
</Button>
</SheetFooter>
</SheetContent>
</Sheet>
);
}
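One caveat in the port field above: `Number(e.target.value)` can store out-of-range or non-numeric values in state, because the `min`/`max` attributes on a number input only constrain the spinner, not typed text (`Number("")` is `0`, `Number("abc")` is `NaN`). A clamp helper, as a sketch (`clampPort` is a hypothetical name, not in the codebase), would keep state valid:

```typescript
// Normalize raw <input type="number"> text to a valid TCP port.
// Non-integer input (including NaN from e.g. "abc") falls back to the
// previous value; integers are clamped into the 1..65535 range.
function clampPort(raw: string, fallback: number): number {
  const n = Number(raw);
  if (!Number.isInteger(n)) return fallback;
  return Math.min(65535, Math.max(1, n));
}
```

Usage would be `onChange={(e) => setMcpPort(clampPort(e.target.value, mcpPort))}`.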

View File

@@ -0,0 +1,271 @@
import { useState, useEffect } from "react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogFooter,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { useSchemas, useTables } from "@/hooks/use-schema";
import { useCreateSnapshot } from "@/hooks/use-snapshots";
import { toast } from "sonner";
import { save } from "@tauri-apps/plugin-dialog";
import {
Loader2,
CheckCircle2,
XCircle,
Camera,
} from "lucide-react";
import type { TableRef } from "@/types";
interface Props {
open: boolean;
onOpenChange: (open: boolean) => void;
connectionId: string;
}
type Step = "config" | "progress" | "done";
export function CreateSnapshotDialog({ open, onOpenChange, connectionId }: Props) {
const [step, setStep] = useState<Step>("config");
const [name, setName] = useState("");
const [selectedSchema, setSelectedSchema] = useState<string>("");
const [selectedTables, setSelectedTables] = useState<Set<string>>(new Set());
const [includeDeps, setIncludeDeps] = useState(true);
const { data: schemas } = useSchemas(connectionId);
const { data: tables } = useTables(
selectedSchema ? connectionId : null,
selectedSchema
);
const { create, result, error, isCreating, progress, reset } = useCreateSnapshot();
useEffect(() => {
if (open) {
setStep("config");
setName(`snapshot-${new Date().toISOString().slice(0, 10)}`);
setSelectedTables(new Set());
setIncludeDeps(true);
reset();
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [open, reset]);
useEffect(() => {
if (schemas && schemas.length > 0 && !selectedSchema) {
setSelectedSchema(schemas.find((s) => s === "public") || schemas[0]);
}
}, [schemas, selectedSchema]);
useEffect(() => {
if (progress?.stage === "done" || progress?.stage === "error") {
setStep("done");
}
}, [progress]);
const handleToggleTable = (tableName: string) => {
setSelectedTables((prev) => {
const next = new Set(prev);
if (next.has(tableName)) {
next.delete(tableName);
} else {
next.add(tableName);
}
return next;
});
};
const handleSelectAll = () => {
if (tables) {
if (selectedTables.size === tables.length) {
setSelectedTables(new Set());
} else {
setSelectedTables(new Set(tables.map((t) => t.name)));
}
}
};
const handleCreate = async () => {
if (!name.trim() || selectedTables.size === 0) {
toast.error("Please enter a name and select at least one table");
return;
}
const filePath = await save({
defaultPath: `${name}.json`,
filters: [{ name: "JSON", extensions: ["json"] }],
});
if (!filePath) return;
setStep("progress");
const tableRefs: TableRef[] = Array.from(selectedTables).map((t) => ({
schema: selectedSchema,
table: t,
}));
const snapshotId = crypto.randomUUID();
create({
params: {
connection_id: connectionId,
tables: tableRefs,
name: name.trim(),
include_dependencies: includeDeps,
},
snapshotId,
filePath,
});
};
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-[520px] max-h-[80vh] overflow-y-auto">
<DialogHeader>
<DialogTitle className="flex items-center gap-2">
<Camera className="h-5 w-5" />
Create Snapshot
</DialogTitle>
</DialogHeader>
{step === "config" && (
<>
<div className="grid gap-3 py-2">
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Name</label>
<Input
className="col-span-3"
value={name}
onChange={(e) => setName(e.target.value)}
placeholder="snapshot-name"
/>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Schema</label>
<select
className="col-span-3 rounded-md border bg-background px-3 py-2 text-sm"
value={selectedSchema}
onChange={(e) => {
setSelectedSchema(e.target.value);
setSelectedTables(new Set());
}}
>
{schemas?.map((s) => (
<option key={s} value={s}>{s}</option>
))}
</select>
</div>
<div className="grid grid-cols-4 items-start gap-3">
<label className="text-right text-sm text-muted-foreground pt-1">Tables</label>
<div className="col-span-3 space-y-1">
{tables && tables.length > 0 && (
<button
className="text-xs text-primary hover:underline"
onClick={handleSelectAll}
>
{selectedTables.size === tables.length ? "Deselect all" : "Select all"}
</button>
)}
<div className="max-h-48 overflow-y-auto rounded-md border p-2 space-y-1">
{tables?.map((t) => (
<label key={t.name} className="flex items-center gap-2 text-sm cursor-pointer hover:bg-accent rounded px-1">
<input
type="checkbox"
checked={selectedTables.has(t.name)}
onChange={() => handleToggleTable(t.name)}
className="rounded"
/>
{t.name}
</label>
)) ?? (
<p className="text-xs text-muted-foreground">Select a schema first</p>
)}
</div>
<p className="text-xs text-muted-foreground">{selectedTables.size} table{selectedTables.size === 1 ? "" : "s"} selected</p>
</div>
</div>
<div className="grid grid-cols-4 items-center gap-3">
<label className="text-right text-sm text-muted-foreground">Dependencies</label>
<div className="col-span-3 flex items-center gap-2">
<input
type="checkbox"
checked={includeDeps}
onChange={(e) => setIncludeDeps(e.target.checked)}
className="rounded"
/>
<span className="text-sm text-muted-foreground">
Include referenced tables (foreign keys)
</span>
</div>
</div>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Cancel</Button>
<Button onClick={handleCreate} disabled={selectedTables.size === 0}>
Create Snapshot
</Button>
</DialogFooter>
</>
)}
{step === "progress" && (
<div className="py-4 space-y-4">
<div className="space-y-2">
<div className="flex items-center justify-between text-sm">
<span>{progress?.message || "Starting..."}</span>
<span className="text-muted-foreground">{progress?.percent ?? 0}%</span>
</div>
<div className="h-2 rounded-full bg-secondary overflow-hidden">
<div
className="h-full rounded-full bg-primary transition-all duration-300"
style={{ width: `${progress?.percent ?? 0}%` }}
/>
</div>
</div>
{isCreating && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" />
{progress?.stage || "Initializing..."}
</div>
)}
</div>
)}
{step === "done" && (
<div className="py-4 space-y-4">
{error ? (
<div className="flex items-start gap-3 rounded-md border border-destructive/50 bg-destructive/10 p-4">
<XCircle className="h-5 w-5 text-destructive shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium text-destructive">Snapshot Failed</p>
<p className="text-xs text-muted-foreground">{error}</p>
</div>
</div>
) : (
<div className="flex items-start gap-3 rounded-md border border-green-500/50 bg-green-500/10 p-4">
<CheckCircle2 className="h-5 w-5 text-green-500 shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium">Snapshot Created</p>
<p className="text-xs text-muted-foreground">
{result?.total_rows} rows from {result?.tables.length} tables saved.
</p>
</div>
</div>
)}
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Close</Button>
{error && <Button onClick={() => setStep("config")}>Retry</Button>}
</DialogFooter>
</div>
)}
</DialogContent>
</Dialog>
);
}

View File

@@ -0,0 +1,244 @@
import { useState, useEffect } from "react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogFooter,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { useRestoreSnapshot, useReadSnapshotMetadata } from "@/hooks/use-snapshots";
import { toast } from "sonner";
import { open as openFile } from "@tauri-apps/plugin-dialog";
import {
Loader2,
CheckCircle2,
XCircle,
Upload,
AlertTriangle,
FileJson,
} from "lucide-react";
import type { SnapshotMetadata } from "@/types";
interface Props {
open: boolean;
onOpenChange: (open: boolean) => void;
connectionId: string;
}
type Step = "select" | "confirm" | "progress" | "done";
export function RestoreSnapshotDialog({ open, onOpenChange, connectionId }: Props) {
const [step, setStep] = useState<Step>("select");
const [filePath, setFilePath] = useState<string | null>(null);
const [metadata, setMetadata] = useState<SnapshotMetadata | null>(null);
const [truncate, setTruncate] = useState(false);
const readMeta = useReadSnapshotMetadata();
const { restore, rowsRestored, error, isRestoring, progress, reset } = useRestoreSnapshot();
useEffect(() => {
if (open) {
setStep("select");
setFilePath(null);
setMetadata(null);
setTruncate(false);
reset();
}
}, [open, reset]);
useEffect(() => {
if (progress?.stage === "done" || progress?.stage === "error") {
setStep("done");
}
}, [progress]);
const handleSelectFile = async () => {
const selected = await openFile({
filters: [{ name: "JSON Snapshot", extensions: ["json"] }],
multiple: false,
});
if (!selected) return;
const path = typeof selected === "string" ? selected : (selected as { path: string }).path;
setFilePath(path);
readMeta.mutate(path, {
onSuccess: (meta) => {
setMetadata(meta);
setStep("confirm");
},
onError: (err) => toast.error("Invalid snapshot file", { description: String(err) }),
});
};
const handleRestore = () => {
if (!filePath) return;
setStep("progress");
const snapshotId = crypto.randomUUID();
restore({
params: {
connection_id: connectionId,
file_path: filePath,
truncate_before_restore: truncate,
},
snapshotId,
});
};
function formatBytes(bytes: number): string {
if (bytes < 1024) return `${bytes} B`;
if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(0)} KB`;
return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
}
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-[520px]">
<DialogHeader>
<DialogTitle className="flex items-center gap-2">
<Upload className="h-5 w-5" />
Restore Snapshot
</DialogTitle>
</DialogHeader>
{step === "select" && (
<>
<div className="py-8 flex flex-col items-center gap-3">
<FileJson className="h-12 w-12 text-muted-foreground" />
<p className="text-sm text-muted-foreground">Select a snapshot file to restore</p>
<Button onClick={handleSelectFile} disabled={readMeta.isPending}>
{readMeta.isPending ? (
<><Loader2 className="h-4 w-4 animate-spin mr-1" />Reading...</>
) : (
"Choose File"
)}
</Button>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Cancel</Button>
</DialogFooter>
</>
)}
{step === "confirm" && metadata && (
<>
<div className="space-y-3 py-2">
<div className="rounded-md border p-3 space-y-2 text-sm">
<div className="flex justify-between">
<span className="text-muted-foreground">Name</span>
<span className="font-medium">{metadata.name}</span>
</div>
<div className="flex justify-between">
<span className="text-muted-foreground">Created</span>
<span>{new Date(metadata.created_at).toLocaleString()}</span>
</div>
<div className="flex justify-between">
<span className="text-muted-foreground">Tables</span>
<span>{metadata.tables.length}</span>
</div>
<div className="flex justify-between">
<span className="text-muted-foreground">Total Rows</span>
<span>{metadata.total_rows.toLocaleString()}</span>
</div>
<div className="flex justify-between">
<span className="text-muted-foreground">File Size</span>
<span>{formatBytes(metadata.file_size_bytes)}</span>
</div>
</div>
<div className="space-y-1">
<p className="text-xs font-medium text-muted-foreground">Tables included:</p>
<div className="flex flex-wrap gap-1">
{metadata.tables.map((t) => (
<Badge key={`${t.schema}.${t.table}`} variant="secondary" className="text-[10px]">
{t.schema}.{t.table} ({t.row_count})
</Badge>
))}
</div>
</div>
<div className="flex items-start gap-2 rounded-md border border-yellow-500/50 bg-yellow-500/10 p-3">
<AlertTriangle className="h-4 w-4 text-yellow-600 shrink-0 mt-0.5" />
<div className="space-y-2">
<label className="flex items-center gap-2 text-sm cursor-pointer">
<input
type="checkbox"
checked={truncate}
onChange={(e) => setTruncate(e.target.checked)}
className="rounded"
/>
Truncate existing data before restore
</label>
{truncate && (
<p className="text-xs text-yellow-700 dark:text-yellow-400">
This will DELETE all existing data in the affected tables before restoring.
</p>
)}
</div>
</div>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => setStep("select")}>Back</Button>
<Button onClick={handleRestore}>Restore</Button>
</DialogFooter>
</>
)}
{step === "progress" && (
<div className="py-4 space-y-4">
<div className="space-y-2">
<div className="flex items-center justify-between text-sm">
<span>{progress?.message || "Starting..."}</span>
<span className="text-muted-foreground">{progress?.percent ?? 0}%</span>
</div>
<div className="h-2 rounded-full bg-secondary overflow-hidden">
<div
className="h-full rounded-full bg-primary transition-all duration-300"
style={{ width: `${progress?.percent ?? 0}%` }}
/>
</div>
</div>
{isRestoring && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" />
{progress?.detail || progress?.stage || "Restoring..."}
</div>
)}
</div>
)}
{step === "done" && (
<div className="py-4 space-y-4">
{error ? (
<div className="flex items-start gap-3 rounded-md border border-destructive/50 bg-destructive/10 p-4">
<XCircle className="h-5 w-5 text-destructive shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium text-destructive">Restore Failed</p>
<p className="text-xs text-muted-foreground">{error}</p>
</div>
</div>
) : (
<div className="flex items-start gap-3 rounded-md border border-green-500/50 bg-green-500/10 p-4">
<CheckCircle2 className="h-5 w-5 text-green-500 shrink-0 mt-0.5" />
<div className="space-y-1">
<p className="text-sm font-medium">Restore Completed</p>
<p className="text-xs text-muted-foreground">
{rowsRestored?.toLocaleString()} rows restored successfully.
</p>
</div>
</div>
)}
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>Close</Button>
{error && <Button onClick={() => setStep("confirm")}>Retry</Button>}
</DialogFooter>
</div>
)}
</DialogContent>
</Dialog>
);
}
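The confirm step above reads a handful of fields off the parsed snapshot file. As a reference sketch, the metadata shape those reads imply looks roughly like the following; this is inferred from the fields `RestoreSnapshotDialog` and `SnapshotPanel` actually touch, not the canonical `SnapshotMetadata` definition in `@/types`, and the sample values are hypothetical:

```typescript
// Sketch of the snapshot metadata shape, inferred from the fields read by
// RestoreSnapshotDialog and SnapshotPanel. The canonical type lives in
// "@/types"; only the fields these components touch are listed here.
interface SnapshotTableMeta {
  schema: string;
  table: string;
  row_count: number;
}

interface SnapshotMetadata {
  id: string; // used as the React key in SnapshotPanel
  name: string;
  version: number; // rendered as the "v{version}" badge
  created_at: string; // ISO timestamp; passed to `new Date(...)`
  tables: SnapshotTableMeta[];
  total_rows: number;
  file_size_bytes: number;
}

// Hypothetical sample conforming to the sketch above.
const sample: SnapshotMetadata = {
  id: "00000000-0000-0000-0000-000000000000",
  name: "snapshot-2026-02-21",
  version: 1,
  created_at: "2026-02-21T14:45:58.000Z",
  tables: [{ schema: "public", table: "orders", row_count: 42 }],
  total_rows: 42,
  file_size_bytes: 2048,
};

console.log(`${sample.tables[0].schema}.${sample.tables[0].table}`);
```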

View File

@@ -0,0 +1,122 @@
import { useState } from "react";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { useListSnapshots } from "@/hooks/use-snapshots";
import { CreateSnapshotDialog } from "./CreateSnapshotDialog";
import { RestoreSnapshotDialog } from "./RestoreSnapshotDialog";
import {
Camera,
Upload,
Plus,
FileJson,
Calendar,
Table2,
HardDrive,
} from "lucide-react";
import type { SnapshotMetadata } from "@/types";
interface Props {
connectionId: string;
}
function formatBytes(bytes: number): string {
if (bytes < 1024) return `${bytes} B`;
if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(0)} KB`;
return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
}
function SnapshotCard({ snapshot }: { snapshot: SnapshotMetadata }) {
return (
<div className="rounded-md border p-3 space-y-2">
<div className="flex items-start justify-between">
<div className="flex items-center gap-2">
<FileJson className="h-4 w-4 text-primary" />
<span className="text-sm font-medium">{snapshot.name}</span>
</div>
<Badge variant="secondary" className="text-[10px]">v{snapshot.version}</Badge>
</div>
<div className="grid grid-cols-3 gap-2 text-xs text-muted-foreground">
<div className="flex items-center gap-1">
<Calendar className="h-3 w-3" />
{new Date(snapshot.created_at).toLocaleDateString()}
</div>
<div className="flex items-center gap-1">
<Table2 className="h-3 w-3" />
{snapshot.tables.length} tables
</div>
<div className="flex items-center gap-1">
<HardDrive className="h-3 w-3" />
{formatBytes(snapshot.file_size_bytes)}
</div>
</div>
<div className="flex flex-wrap gap-1">
{snapshot.tables.map((t) => (
<Badge key={`${t.schema}.${t.table}`} variant="outline" className="text-[10px]">
{t.schema}.{t.table}
<span className="ml-1 text-muted-foreground">({t.row_count})</span>
</Badge>
))}
</div>
<div className="text-xs text-muted-foreground">
{snapshot.total_rows.toLocaleString()} total rows
</div>
</div>
);
}
export function SnapshotPanel({ connectionId }: Props) {
const [showCreate, setShowCreate] = useState(false);
const [showRestore, setShowRestore] = useState(false);
const { data: snapshots } = useListSnapshots();
return (
<div className="flex h-full flex-col">
{/* Header */}
<div className="border-b px-4 py-3 flex items-center justify-between">
<div className="flex items-center gap-2">
<Camera className="h-5 w-5 text-primary" />
<h2 className="text-sm font-medium">Data Snapshots</h2>
</div>
<div className="flex items-center gap-2">
<Button variant="outline" size="sm" onClick={() => setShowRestore(true)}>
<Upload className="h-3.5 w-3.5 mr-1" />
Restore
</Button>
<Button size="sm" onClick={() => setShowCreate(true)}>
<Plus className="h-3.5 w-3.5 mr-1" />
Create
</Button>
</div>
</div>
{/* Content */}
<div className="flex-1 overflow-auto p-4 space-y-2">
{!snapshots || snapshots.length === 0 ? (
<div className="flex h-full flex-col items-center justify-center gap-3 text-muted-foreground">
<Camera className="h-12 w-12" />
<p className="text-sm">No snapshots yet</p>
<p className="text-xs">Create a snapshot to save table data for later restoration.</p>
</div>
) : (
snapshots.map((snap) => (
<SnapshotCard key={snap.id} snapshot={snap} />
))
)}
</div>
<CreateSnapshotDialog
open={showCreate}
onOpenChange={setShowCreate}
connectionId={connectionId}
/>
<RestoreSnapshotDialog
open={showRestore}
onOpenChange={setShowRestore}
connectionId={connectionId}
/>
</div>
);
}
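Both `RestoreSnapshotDialog` and `SnapshotPanel` carry an identical `formatBytes` helper. Reproduced standalone here to make its rounding thresholds explicit:

```typescript
// Byte formatter as duplicated in RestoreSnapshotDialog and SnapshotPanel:
// whole bytes below 1 KiB, integer KB below 1 MiB, one decimal of MB above.
function formatBytes(bytes: number): string {
  if (bytes < 1024) return `${bytes} B`;
  if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(0)} KB`;
  return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
}

console.log(formatBytes(512)); // "512 B"
console.log(formatBytes(2048)); // "2 KB"
console.log(formatBytes(1_572_864)); // "1.5 MB"
```

Since the helper appears verbatim in two files, hoisting it into a shared module would be a natural follow-up.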

View File

@@ -94,30 +94,46 @@ export function TableDataView({ connectionId, schema, table }: Props) {
    [pendingChanges, isReadOnly]
  );
  const usesCtid = pkColumns.length === 0;
  const handleCommit = async () => {
    if (!data) return;
    if (pkColumns.length === 0 && (!data.ctids || data.ctids.length === 0)) {
      toast.error("Cannot save: no primary key and no ctid available");
      return;
    }
    setIsSaving(true);
    try {
      for (const [_key, change] of pendingChanges) {
        const row = data.rows[change.rowIndex];
        const colName = data.columns[change.colIndex];
        if (usesCtid) {
          await updateRowApi({
            connectionId,
            schema,
            table,
            pkColumns: [],
            pkValues: [],
            column: colName,
            value: change.value,
            ctid: data.ctids[change.rowIndex],
          });
        } else {
          const pkValues = pkColumns.map((pkCol) => {
            const idx = data.columns.indexOf(pkCol);
            return row[idx];
          });
          await updateRowApi({
            connectionId,
            schema,
            table,
            pkColumns,
            pkValues: pkValues as unknown[],
            column: colName,
            value: change.value,
          });
        }
      }
      setPendingChanges(new Map());
      queryClient.invalidateQueries({
@@ -182,6 +198,14 @@ export function TableDataView({ connectionId, schema, table }: Props) {
              Read-Only
            </span>
          )}
          {!isReadOnly && usesCtid && (
            <span
              className="rounded bg-orange-500/10 px-1.5 py-0.5 text-[10px] font-medium text-orange-600 dark:text-orange-400"
              title="This table has no primary key. Edits use physical row ID (ctid), which may change after VACUUM or concurrent writes."
            >
              No PK using ctid
            </span>
          )}
          <Filter className="h-3.5 w-3.5 text-muted-foreground" />
          <Input
            placeholder="WHERE clause (e.g. id > 10)"

View File

@@ -3,6 +3,7 @@ import {
  getTableColumns,
  getTableConstraints,
  getTableIndexes,
  getTableTriggers,
} from "@/lib/tauri";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import {
@@ -38,6 +39,11 @@ export function TableStructure({ connectionId, schema, table }: Props) {
    queryFn: () => getTableIndexes(connectionId, schema, table),
  });
  const { data: triggers } = useQuery({
    queryKey: ["table-triggers", connectionId, schema, table],
    queryFn: () => getTableTriggers(connectionId, schema, table),
  });
  return (
    <Tabs defaultValue="columns" className="flex h-full flex-col">
      <TabsList className="mx-2 mt-2 w-fit">
@@ -50,6 +56,9 @@ export function TableStructure({ connectionId, schema, table }: Props) {
        <TabsTrigger value="indexes" className="text-xs">
          Indexes{indexes ? ` (${indexes.length})` : ""}
        </TabsTrigger>
        <TabsTrigger value="triggers" className="text-xs">
          Triggers{triggers ? ` (${triggers.length})` : ""}
        </TabsTrigger>
      </TabsList>
      <TabsContent value="columns" className="flex-1 overflow-hidden mt-0">
@@ -63,6 +72,7 @@ export function TableStructure({ connectionId, schema, table }: Props) {
              <TableHead className="text-xs">Nullable</TableHead>
              <TableHead className="text-xs">Default</TableHead>
              <TableHead className="text-xs">Key</TableHead>
              <TableHead className="text-xs">Comment</TableHead>
            </TableRow>
          </TableHeader>
          <TableBody>
@@ -84,7 +94,7 @@ export function TableStructure({ connectionId, schema, table }: Props) {
                  {col.is_nullable ? "YES" : "NO"}
                </TableCell>
                <TableCell className="max-w-[200px] truncate text-xs text-muted-foreground">
                  {col.column_default ?? "\u2014"}
                </TableCell>
                <TableCell>
                  {col.is_primary_key && (
@@ -93,6 +103,9 @@ export function TableStructure({ connectionId, schema, table }: Props) {
                    </Badge>
                  )}
                </TableCell>
                <TableCell className="max-w-[200px] truncate text-xs text-muted-foreground">
                  {col.comment ?? "\u2014"}
                </TableCell>
              </TableRow>
            ))}
          </TableBody>
@@ -108,6 +121,9 @@ export function TableStructure({ connectionId, schema, table }: Props) {
              <TableHead className="text-xs">Name</TableHead>
              <TableHead className="text-xs">Type</TableHead>
              <TableHead className="text-xs">Columns</TableHead>
              <TableHead className="text-xs">References</TableHead>
              <TableHead className="text-xs">On Update</TableHead>
              <TableHead className="text-xs">On Delete</TableHead>
            </TableRow>
          </TableHeader>
          <TableBody>
@@ -124,6 +140,17 @@ export function TableStructure({ connectionId, schema, table }: Props) {
                <TableCell className="text-xs">
                  {c.columns.join(", ")}
                </TableCell>
                <TableCell className="text-xs text-muted-foreground">
                  {c.referenced_table
                    ? `${c.referenced_schema}.${c.referenced_table}(${c.referenced_columns?.join(", ")})`
                    : "\u2014"}
                </TableCell>
                <TableCell className="text-xs text-muted-foreground">
                  {c.update_rule ?? "\u2014"}
                </TableCell>
                <TableCell className="text-xs text-muted-foreground">
                  {c.delete_rule ?? "\u2014"}
                </TableCell>
              </TableRow>
            ))}
          </TableBody>
@@ -163,6 +190,49 @@ export function TableStructure({ connectionId, schema, table }: Props) {
          </Table>
        </ScrollArea>
      </TabsContent>
<TabsContent value="triggers" className="flex-1 overflow-hidden mt-0">
<ScrollArea className="h-full">
<Table>
<TableHeader>
<TableRow>
<TableHead className="text-xs">Name</TableHead>
<TableHead className="text-xs">Timing</TableHead>
<TableHead className="text-xs">Event</TableHead>
<TableHead className="text-xs">Level</TableHead>
<TableHead className="text-xs">Function</TableHead>
<TableHead className="text-xs">Enabled</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{triggers?.map((t) => (
<TableRow key={t.name}>
<TableCell className="text-xs font-medium">
{t.name}
</TableCell>
<TableCell>
<Badge variant="outline" className="text-[10px]">
{t.timing}
</Badge>
</TableCell>
<TableCell className="text-xs">
{t.event}
</TableCell>
<TableCell className="text-xs">
{t.orientation}
</TableCell>
<TableCell className="text-xs text-muted-foreground">
{t.function_name}
</TableCell>
<TableCell className="text-xs">
{t.is_enabled ? "YES" : "NO"}
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</ScrollArea>
</TabsContent>
    </Tabs>
  );
}

View File

@@ -0,0 +1,216 @@
import { useState, useCallback } from "react";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Badge } from "@/components/ui/badge";
import {
useGenerateValidationSql,
useRunValidationRule,
useSuggestValidationRules,
} from "@/hooks/use-validation";
import { ValidationRuleCard } from "./ValidationRuleCard";
import { toast } from "sonner";
import { Plus, Sparkles, PlayCircle, Loader2, ShieldCheck } from "lucide-react";
import type { ValidationRule, ValidationStatus } from "@/types";
interface Props {
connectionId: string;
}
export function ValidationPanel({ connectionId }: Props) {
const [rules, setRules] = useState<ValidationRule[]>([]);
const [ruleInput, setRuleInput] = useState("");
const [runningIds, setRunningIds] = useState<Set<string>>(new Set());
const generateSql = useGenerateValidationSql();
const runRule = useRunValidationRule();
const suggestRules = useSuggestValidationRules();
const updateRule = useCallback(
(id: string, updates: Partial<ValidationRule>) => {
setRules((prev) =>
prev.map((r) => (r.id === id ? { ...r, ...updates } : r))
);
},
[]
);
const addRule = useCallback(
async (description: string) => {
const id = crypto.randomUUID();
const newRule: ValidationRule = {
id,
description,
generated_sql: "",
status: "generating" as ValidationStatus,
violation_count: 0,
sample_violations: [],
violation_columns: [],
error: null,
};
setRules((prev) => [...prev, newRule]);
try {
const sql = await generateSql.mutateAsync({
connectionId,
ruleDescription: description,
});
updateRule(id, { generated_sql: sql, status: "pending" });
} catch (err) {
updateRule(id, {
status: "error",
error: String(err),
});
}
},
[connectionId, generateSql, updateRule]
);
const handleAddRule = () => {
if (!ruleInput.trim()) return;
addRule(ruleInput.trim());
setRuleInput("");
};
const handleRunRule = useCallback(
async (id: string) => {
const rule = rules.find((r) => r.id === id);
if (!rule || !rule.generated_sql) return;
setRunningIds((prev) => new Set(prev).add(id));
updateRule(id, { status: "running" });
try {
const result = await runRule.mutateAsync({
connectionId,
sql: rule.generated_sql,
});
updateRule(id, {
status: result.status,
violation_count: result.violation_count,
sample_violations: result.sample_violations,
violation_columns: result.violation_columns,
error: result.error,
});
} catch (err) {
updateRule(id, { status: "error", error: String(err) });
} finally {
setRunningIds((prev) => {
const next = new Set(prev);
next.delete(id);
return next;
});
}
},
[rules, connectionId, runRule, updateRule]
);
const handleRemoveRule = useCallback((id: string) => {
setRules((prev) => prev.filter((r) => r.id !== id));
}, []);
const handleRunAll = async () => {
const runnableRules = rules.filter(
(r) => r.generated_sql && r.status !== "generating"
);
for (const rule of runnableRules) {
await handleRunRule(rule.id);
}
};
const handleSuggest = async () => {
try {
const suggestions = await suggestRules.mutateAsync(connectionId);
for (const desc of suggestions) {
await addRule(desc);
}
toast.success(`Added ${suggestions.length} suggested rules`);
} catch (err) {
toast.error("Failed to suggest rules", { description: String(err) });
}
};
const passed = rules.filter((r) => r.status === "passed").length;
const failed = rules.filter((r) => r.status === "failed").length;
const errors = rules.filter((r) => r.status === "error").length;
return (
<div className="flex h-full flex-col">
{/* Header */}
<div className="border-b px-4 py-3 space-y-3">
<div className="flex items-center justify-between">
<div className="flex items-center gap-2">
<ShieldCheck className="h-5 w-5 text-primary" />
<h2 className="text-sm font-medium">Data Validation</h2>
</div>
<div className="flex items-center gap-2">
<Button
variant="outline"
size="sm"
onClick={handleSuggest}
disabled={suggestRules.isPending}
>
{suggestRules.isPending ? (
<Loader2 className="h-3.5 w-3.5 animate-spin mr-1" />
) : (
<Sparkles className="h-3.5 w-3.5 mr-1" />
)}
Auto-suggest
</Button>
<Button
variant="outline"
size="sm"
onClick={handleRunAll}
disabled={rules.length === 0 || runningIds.size > 0}
>
<PlayCircle className="h-3.5 w-3.5 mr-1" />
Run All
</Button>
</div>
</div>
<div className="flex items-center gap-2">
<Input
placeholder="Describe a data quality rule (e.g., 'All orders must have a positive total')"
value={ruleInput}
onChange={(e) => setRuleInput(e.target.value)}
onKeyDown={(e) => e.key === "Enter" && handleAddRule()}
className="flex-1"
/>
<Button size="sm" onClick={handleAddRule} disabled={!ruleInput.trim()}>
<Plus className="h-3.5 w-3.5 mr-1" />
Add
</Button>
</div>
{rules.length > 0 && (
<div className="flex items-center gap-2 text-xs">
<span className="text-muted-foreground">{rules.length} rules</span>
{passed > 0 && <Badge className="bg-green-600 text-white text-[10px]">{passed} passed</Badge>}
{failed > 0 && <Badge variant="destructive" className="text-[10px]">{failed} failed</Badge>}
{errors > 0 && <Badge variant="outline" className="text-[10px]">{errors} errors</Badge>}
</div>
)}
</div>
{/* Rules List */}
<div className="flex-1 overflow-auto p-4 space-y-2">
{rules.length === 0 ? (
<div className="flex h-full items-center justify-center text-sm text-muted-foreground">
Add a validation rule or click Auto-suggest to get started.
</div>
) : (
rules.map((rule) => (
<ValidationRuleCard
key={rule.id}
rule={rule}
onRun={() => handleRunRule(rule.id)}
onRemove={() => handleRemoveRule(rule.id)}
isRunning={runningIds.has(rule.id)}
/>
))
)}
</div>
</div>
);
}
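`ValidationPanel` keeps its rules entirely in local component state; each rule flows generating → pending → running → passed/failed/error. A sketch of that state shape and the header-summary counting, inferred from this file (the canonical `ValidationRule` type lives in `@/types`):

```typescript
// Status values observed in ValidationPanel / ValidationRuleCard.
type ValidationStatus =
  | "generating" | "pending" | "running" | "passed" | "failed" | "error";

// Rule shape as constructed in ValidationPanel's addRule; a sketch inferred
// from this diff, not the canonical definition in "@/types".
interface ValidationRule {
  id: string;
  description: string;
  generated_sql: string;
  status: ValidationStatus;
  violation_count: number;
  sample_violations: unknown[][]; // rows, rendered by ValidationRuleCard
  violation_columns: string[];
  error: string | null;
}

// Mirrors the three filter passes the panel header uses for its badges.
function summarize(rules: ValidationRule[]) {
  return {
    passed: rules.filter((r) => r.status === "passed").length,
    failed: rules.filter((r) => r.status === "failed").length,
    errors: rules.filter((r) => r.status === "error").length,
  };
}
```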

View File

@@ -0,0 +1,138 @@
import { useState } from "react";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import {
ChevronDown,
ChevronRight,
Play,
Trash2,
Loader2,
} from "lucide-react";
import type { ValidationRule } from "@/types";
interface Props {
rule: ValidationRule;
onRun: () => void;
onRemove: () => void;
isRunning: boolean;
}
function statusBadge(status: string) {
switch (status) {
case "passed":
return <Badge className="bg-green-600 text-white">Passed</Badge>;
case "failed":
return <Badge variant="destructive">Failed</Badge>;
case "error":
return <Badge variant="outline" className="text-destructive border-destructive">Error</Badge>;
case "generating":
case "running":
return <Badge variant="secondary"><Loader2 className="h-3 w-3 animate-spin mr-1" />Running</Badge>;
default:
return <Badge variant="secondary">Pending</Badge>;
}
}
export function ValidationRuleCard({ rule, onRun, onRemove, isRunning }: Props) {
const [showSql, setShowSql] = useState(false);
const [showViolations, setShowViolations] = useState(false);
return (
<div className="rounded-md border p-3 space-y-2">
<div className="flex items-start justify-between gap-2">
<div className="flex-1 min-w-0">
<p className="text-sm">{rule.description}</p>
</div>
<div className="flex items-center gap-1 shrink-0">
{statusBadge(rule.status)}
<Button
variant="ghost"
size="sm"
className="h-7 w-7 p-0"
onClick={onRun}
disabled={isRunning}
>
{isRunning ? (
<Loader2 className="h-3.5 w-3.5 animate-spin" />
) : (
<Play className="h-3.5 w-3.5" />
)}
</Button>
<Button
variant="ghost"
size="sm"
className="h-7 w-7 p-0 text-muted-foreground hover:text-destructive"
onClick={onRemove}
>
<Trash2 className="h-3.5 w-3.5" />
</Button>
</div>
</div>
{rule.status === "failed" && (
<p className="text-xs text-destructive">
{rule.violation_count} violation{rule.violation_count !== 1 ? "s" : ""} found
</p>
)}
{rule.error && (
<p className="text-xs text-destructive">{rule.error}</p>
)}
{rule.generated_sql && (
<div>
<button
className="flex items-center gap-1 text-xs text-muted-foreground hover:text-foreground"
onClick={() => setShowSql(!showSql)}
>
{showSql ? <ChevronDown className="h-3 w-3" /> : <ChevronRight className="h-3 w-3" />}
SQL
</button>
{showSql && (
<pre className="mt-1 rounded bg-muted p-2 text-xs font-mono overflow-x-auto max-h-32 overflow-y-auto">
{rule.generated_sql}
</pre>
)}
</div>
)}
{rule.status === "failed" && rule.sample_violations.length > 0 && (
<div>
<button
className="flex items-center gap-1 text-xs text-muted-foreground hover:text-foreground"
onClick={() => setShowViolations(!showViolations)}
>
{showViolations ? <ChevronDown className="h-3 w-3" /> : <ChevronRight className="h-3 w-3" />}
Sample Violations ({rule.sample_violations.length})
</button>
{showViolations && (
<div className="mt-1 overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b">
{rule.violation_columns.map((col) => (
<th key={col} className="px-2 py-1 text-left font-medium text-muted-foreground">
{col}
</th>
))}
</tr>
</thead>
<tbody>
{rule.sample_violations.map((row, i) => (
<tr key={i} className="border-b last:border-0">
{(row as unknown[]).map((val, j) => (
<td key={j} className="px-2 py-1 font-mono">
{val === null ? <span className="text-muted-foreground">NULL</span> : String(val)}
</td>
))}
</tr>
))}
</tbody>
</table>
</div>
)}
</div>
)}
</div>
);
}

View File

@@ -5,6 +5,10 @@ import { TableStructure } from "@/components/table-viewer/TableStructure";
import { RoleManagerView } from "@/components/management/RoleManagerView"; import { RoleManagerView } from "@/components/management/RoleManagerView";
import { SessionsView } from "@/components/management/SessionsView"; import { SessionsView } from "@/components/management/SessionsView";
import { EntityLookupPanel } from "@/components/lookup/EntityLookupPanel"; import { EntityLookupPanel } from "@/components/lookup/EntityLookupPanel";
import { ErdDiagram } from "@/components/erd/ErdDiagram";
import { ValidationPanel } from "@/components/validation/ValidationPanel";
import { IndexAdvisorPanel } from "@/components/index-advisor/IndexAdvisorPanel";
import { SnapshotPanel } from "@/components/snapshots/SnapshotPanel";
export function TabContent() {
const { tabs, activeTabId, updateTab } = useAppStore();
@@ -72,6 +76,35 @@ export function TabContent() {
/>
);
break;
case "erd":
content = (
<ErdDiagram
connectionId={tab.connectionId}
schema={tab.schema!}
/>
);
break;
case "validation":
content = (
<ValidationPanel
connectionId={tab.connectionId}
/>
);
break;
case "index-advisor":
content = (
<IndexAdvisorPanel
connectionId={tab.connectionId}
/>
);
break;
case "snapshots":
content = (
<SnapshotPanel
connectionId={tab.connectionId}
/>
);
break;
default:
content = null;
}


@@ -0,0 +1,73 @@
import { useState, useEffect, useCallback, useRef } from "react";
import { useMutation } from "@tanstack/react-query";
import {
generateTestDataPreview,
insertGeneratedData,
onDataGenProgress,
} from "@/lib/tauri";
import type { GenerateDataParams, DataGenProgress, GeneratedDataPreview } from "@/types";
export function useDataGenerator() {
const [progress, setProgress] = useState<DataGenProgress | null>(null);
const genIdRef = useRef<string>("");
const previewMutation = useMutation({
mutationFn: ({
params,
genId,
}: {
params: GenerateDataParams;
genId: string;
}) => {
genIdRef.current = genId;
setProgress(null);
return generateTestDataPreview(params, genId);
},
});
const insertMutation = useMutation({
mutationFn: ({
connectionId,
preview,
}: {
connectionId: string;
preview: GeneratedDataPreview;
}) => insertGeneratedData(connectionId, preview),
});
useEffect(() => {
const unlistenPromise = onDataGenProgress((p) => {
if (p.gen_id === genIdRef.current) {
setProgress(p);
}
});
return () => {
unlistenPromise.then((unlisten) => unlisten());
};
}, []);
const previewRef = useRef(previewMutation);
previewRef.current = previewMutation;
const insertRef = useRef(insertMutation);
insertRef.current = insertMutation;
const reset = useCallback(() => {
previewRef.current.reset();
insertRef.current.reset();
setProgress(null);
genIdRef.current = "";
}, []);
return {
generatePreview: previewMutation.mutate,
preview: previewMutation.data as GeneratedDataPreview | undefined,
isGenerating: previewMutation.isPending,
generateError: previewMutation.error ? String(previewMutation.error) : null,
insertData: insertMutation.mutate,
insertedRows: insertMutation.data as number | undefined,
isInserting: insertMutation.isPending,
insertError: insertMutation.error ? String(insertMutation.error) : null,
progress,
reset,
};
}

src/hooks/use-docker.ts (new file)

@@ -0,0 +1,111 @@
import { useState, useEffect, useCallback, useRef } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import {
checkDocker,
listTuskContainers,
cloneToDocker,
startContainer,
stopContainer,
removeContainer,
onCloneProgress,
} from "@/lib/tauri";
import type { CloneToDockerParams, CloneProgress, CloneResult } from "@/types";
export function useDockerStatus() {
return useQuery({
queryKey: ["docker-status"],
queryFn: checkDocker,
staleTime: 30_000,
});
}
export function useTuskContainers() {
return useQuery({
queryKey: ["tusk-containers"],
queryFn: listTuskContainers,
refetchInterval: 10_000,
});
}
export function useCloneToDocker() {
const [progress, setProgress] = useState<CloneProgress | null>(null);
const cloneIdRef = useRef<string>("");
const queryClient = useQueryClient();
const mutation = useMutation({
mutationFn: ({
params,
cloneId,
}: {
params: CloneToDockerParams;
cloneId: string;
}) => {
cloneIdRef.current = cloneId;
setProgress(null);
return cloneToDocker(params, cloneId);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["connections"] });
queryClient.invalidateQueries({ queryKey: ["tusk-containers"] });
},
});
useEffect(() => {
const unlistenPromise = onCloneProgress((p) => {
if (p.clone_id === cloneIdRef.current) {
setProgress(p);
}
});
return () => {
unlistenPromise.then((unlisten) => unlisten());
};
}, []);
const mutationRef = useRef(mutation);
mutationRef.current = mutation;
const reset = useCallback(() => {
mutationRef.current.reset();
setProgress(null);
cloneIdRef.current = "";
}, []);
return {
clone: mutation.mutate,
result: mutation.data as CloneResult | undefined,
error: mutation.error ? String(mutation.error) : null,
isCloning: mutation.isPending,
progress,
reset,
};
}
export function useStartContainer() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: (name: string) => startContainer(name),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["tusk-containers"] });
},
});
}
export function useStopContainer() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: (name: string) => stopContainer(name),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["tusk-containers"] });
},
});
}
export function useRemoveContainer() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: (name: string) => removeContainer(name),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["tusk-containers"] });
},
});
}


@@ -0,0 +1,20 @@
import { useMutation } from "@tanstack/react-query";
import { getIndexAdvisorReport, applyIndexRecommendation } from "@/lib/tauri";
export function useIndexAdvisorReport() {
return useMutation({
mutationFn: (connectionId: string) => getIndexAdvisorReport(connectionId),
});
}
export function useApplyIndexRecommendation() {
return useMutation({
mutationFn: ({
connectionId,
ddl,
}: {
connectionId: string;
ddl: string;
}) => applyIndexRecommendation(connectionId, ddl),
});
}


@@ -8,6 +8,7 @@ import {
listSequences,
switchDatabase,
getColumnDetails,
getSchemaErd,
} from "@/lib/tauri"; } from "@/lib/tauri";
import type { ConnectionConfig } from "@/types"; import type { ConnectionConfig } from "@/types";
@@ -88,3 +89,12 @@ export function useColumnDetails(connectionId: string | null, schema: string | n
staleTime: 5 * 60 * 1000,
});
}
export function useSchemaErd(connectionId: string | null, schema: string | null) {
return useQuery({
queryKey: ["schema-erd", connectionId, schema],
queryFn: () => getSchemaErd(connectionId!, schema!),
enabled: !!connectionId && !!schema,
staleTime: 5 * 60 * 1000,
});
}

src/hooks/use-settings.ts (new file)

@@ -0,0 +1,30 @@
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { getAppSettings, saveAppSettings, getMcpStatus } from "@/lib/tauri";
import type { AppSettings } from "@/types";
export function useAppSettings() {
return useQuery({
queryKey: ["app-settings"],
queryFn: getAppSettings,
staleTime: Infinity,
});
}
export function useSaveAppSettings() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: (settings: AppSettings) => saveAppSettings(settings),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["app-settings"] });
queryClient.invalidateQueries({ queryKey: ["mcp-status"] });
},
});
}
export function useMcpStatus() {
return useQuery({
queryKey: ["mcp-status"],
queryFn: getMcpStatus,
refetchInterval: 5000,
});
}

src/hooks/use-snapshots.ts (new file)

@@ -0,0 +1,131 @@
import { useState, useEffect, useCallback, useRef } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import {
createSnapshot,
restoreSnapshot,
listSnapshots,
readSnapshotMetadata,
onSnapshotProgress,
} from "@/lib/tauri";
import type {
CreateSnapshotParams,
RestoreSnapshotParams,
SnapshotProgress,
SnapshotMetadata,
} from "@/types";
export function useListSnapshots() {
return useQuery({
queryKey: ["snapshots"],
queryFn: listSnapshots,
staleTime: 30_000,
});
}
export function useReadSnapshotMetadata() {
return useMutation({
mutationFn: (filePath: string) => readSnapshotMetadata(filePath),
});
}
export function useCreateSnapshot() {
const [progress, setProgress] = useState<SnapshotProgress | null>(null);
const snapshotIdRef = useRef<string>("");
const queryClient = useQueryClient();
const mutation = useMutation({
mutationFn: ({
params,
snapshotId,
filePath,
}: {
params: CreateSnapshotParams;
snapshotId: string;
filePath: string;
}) => {
snapshotIdRef.current = snapshotId;
setProgress(null);
return createSnapshot(params, snapshotId, filePath);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["snapshots"] });
},
});
useEffect(() => {
const unlistenPromise = onSnapshotProgress((p) => {
if (p.snapshot_id === snapshotIdRef.current) {
setProgress(p);
}
});
return () => {
unlistenPromise.then((unlisten) => unlisten());
};
}, []);
const mutationRef = useRef(mutation);
mutationRef.current = mutation;
const reset = useCallback(() => {
mutationRef.current.reset();
setProgress(null);
snapshotIdRef.current = "";
}, []);
return {
create: mutation.mutate,
result: mutation.data as SnapshotMetadata | undefined,
error: mutation.error ? String(mutation.error) : null,
isCreating: mutation.isPending,
progress,
reset,
};
}
export function useRestoreSnapshot() {
const [progress, setProgress] = useState<SnapshotProgress | null>(null);
const snapshotIdRef = useRef<string>("");
const mutation = useMutation({
mutationFn: ({
params,
snapshotId,
}: {
params: RestoreSnapshotParams;
snapshotId: string;
}) => {
snapshotIdRef.current = snapshotId;
setProgress(null);
return restoreSnapshot(params, snapshotId);
},
});
useEffect(() => {
const unlistenPromise = onSnapshotProgress((p) => {
if (p.snapshot_id === snapshotIdRef.current) {
setProgress(p);
}
});
return () => {
unlistenPromise.then((unlisten) => unlisten());
};
}, []);
const mutationRef = useRef(mutation);
mutationRef.current = mutation;
const reset = useCallback(() => {
mutationRef.current.reset();
setProgress(null);
snapshotIdRef.current = "";
}, []);
return {
restore: mutation.mutate,
rowsRestored: mutation.data as number | undefined,
error: mutation.error ? String(mutation.error) : null,
isRestoring: mutation.isPending,
progress,
reset,
};
}


@@ -0,0 +1,38 @@
import { useMutation } from "@tanstack/react-query";
import {
generateValidationSql,
runValidationRule,
suggestValidationRules,
} from "@/lib/tauri";
export function useGenerateValidationSql() {
return useMutation({
mutationFn: ({
connectionId,
ruleDescription,
}: {
connectionId: string;
ruleDescription: string;
}) => generateValidationSql(connectionId, ruleDescription),
});
}
export function useRunValidationRule() {
return useMutation({
mutationFn: ({
connectionId,
sql,
sampleLimit,
}: {
connectionId: string;
sql: string;
sampleLimit?: number;
}) => runValidationRule(connectionId, sql, sampleLimit),
});
}
export function useSuggestValidationRules() {
return useMutation({
mutationFn: (connectionId: string) => suggestValidationRules(connectionId),
});
}


@@ -11,6 +11,8 @@ import type {
ColumnInfo,
ConstraintInfo,
IndexInfo,
TriggerInfo,
ErdData,
HistoryEntry,
SavedQuery,
SessionInfo,
@@ -26,6 +28,22 @@ import type {
OllamaModel,
EntityLookupResult,
LookupProgress,
DockerStatus,
CloneToDockerParams,
CloneProgress,
CloneResult,
TuskContainer,
AppSettings,
McpStatus,
ValidationRule,
GenerateDataParams,
GeneratedDataPreview,
DataGenProgress,
IndexAdvisorReport,
SnapshotMetadata,
CreateSnapshotParams,
RestoreSnapshotParams,
SnapshotProgress,
} from "@/types"; } from "@/types";
// Connections // Connections
@@ -115,6 +133,15 @@ export const getTableIndexes = (
table: string
) => invoke<IndexInfo[]>("get_table_indexes", { connectionId, schema, table });
export const getTableTriggers = (
connectionId: string,
schema: string,
table: string
) => invoke<TriggerInfo[]>("get_table_triggers", { connectionId, schema, table });
export const getSchemaErd = (connectionId: string, schema: string) =>
invoke<ErdData>("get_schema_erd", { connectionId, schema });
// Data
export const getTableData = (params: {
connectionId: string; connectionId: string;
@@ -135,6 +162,7 @@ export const updateRow = (params: {
pkValues: unknown[];
column: string;
value: unknown;
ctid?: string;
}) => invoke<void>("update_row", params); }) => invoke<void>("update_row", params);
export const insertRow = (params: { export const insertRow = (params: {
@@ -151,6 +179,7 @@ export const deleteRows = (params: {
table: string;
pkColumns: string[];
pkValuesList: unknown[][];
ctids?: string[];
}) => invoke<number>("delete_rows", params);
// History
@@ -280,3 +309,84 @@ export const onLookupProgress = (
callback: (p: LookupProgress) => void
): Promise<UnlistenFn> =>
listen<LookupProgress>("lookup-progress", (e) => callback(e.payload));
// Docker
export const checkDocker = () =>
invoke<DockerStatus>("check_docker");
export const listTuskContainers = () =>
invoke<TuskContainer[]>("list_tusk_containers");
export const cloneToDocker = (params: CloneToDockerParams, cloneId: string) =>
invoke<CloneResult>("clone_to_docker", { params, cloneId });
export const startContainer = (name: string) =>
invoke<void>("start_container", { name });
export const stopContainer = (name: string) =>
invoke<void>("stop_container", { name });
export const removeContainer = (name: string) =>
invoke<void>("remove_container", { name });
export const onCloneProgress = (
callback: (p: CloneProgress) => void
): Promise<UnlistenFn> =>
listen<CloneProgress>("clone-progress", (e) => callback(e.payload));
// App Settings
export const getAppSettings = () =>
invoke<AppSettings>("get_app_settings");
export const saveAppSettings = (settings: AppSettings) =>
invoke<void>("save_app_settings", { settings });
export const getMcpStatus = () =>
invoke<McpStatus>("get_mcp_status");
// Validation (Wave 1)
export const generateValidationSql = (connectionId: string, ruleDescription: string) =>
invoke<string>("generate_validation_sql", { connectionId, ruleDescription });
export const runValidationRule = (connectionId: string, sql: string, sampleLimit?: number) =>
invoke<ValidationRule>("run_validation_rule", { connectionId, sql, sampleLimit });
export const suggestValidationRules = (connectionId: string) =>
invoke<string[]>("suggest_validation_rules", { connectionId });
// Data Generator (Wave 2)
export const generateTestDataPreview = (params: GenerateDataParams, genId: string) =>
invoke<GeneratedDataPreview>("generate_test_data_preview", { params, genId });
export const insertGeneratedData = (connectionId: string, preview: GeneratedDataPreview) =>
invoke<number>("insert_generated_data", { connectionId, preview });
export const onDataGenProgress = (
callback: (p: DataGenProgress) => void
): Promise<UnlistenFn> =>
listen<DataGenProgress>("datagen-progress", (e) => callback(e.payload));
// Index Advisor (Wave 3A)
export const getIndexAdvisorReport = (connectionId: string) =>
invoke<IndexAdvisorReport>("get_index_advisor_report", { connectionId });
export const applyIndexRecommendation = (connectionId: string, ddl: string) =>
invoke<void>("apply_index_recommendation", { connectionId, ddl });
// Snapshots (Wave 3B)
export const createSnapshot = (params: CreateSnapshotParams, snapshotId: string, filePath: string) =>
invoke<SnapshotMetadata>("create_snapshot", { params, snapshotId, filePath });
export const restoreSnapshot = (params: RestoreSnapshotParams, snapshotId: string) =>
invoke<number>("restore_snapshot", { params, snapshotId });
export const listSnapshots = () =>
invoke<SnapshotMetadata[]>("list_snapshots");
export const readSnapshotMetadata = (filePath: string) =>
invoke<SnapshotMetadata>("read_snapshot_metadata", { filePath });
export const onSnapshotProgress = (
callback: (p: SnapshotProgress) => void
): Promise<UnlistenFn> =>
listen<SnapshotProgress>("snapshot-progress", (e) => callback(e.payload));


@@ -30,6 +30,7 @@ export interface PaginatedQueryResult extends QueryResult {
total_rows: number;
page: number;
page_size: number;
ctids: string[];
}
export interface SchemaObject {
@@ -56,12 +57,18 @@ export interface ColumnInfo {
ordinal_position: number;
character_maximum_length: number | null;
is_primary_key: boolean;
comment: string | null;
}
export interface ConstraintInfo {
name: string;
constraint_type: string;
columns: string[];
referenced_schema: string | null;
referenced_table: string | null;
referenced_columns: string[] | null;
update_rule: string | null;
delete_rule: string | null;
}
export interface IndexInfo {
@@ -223,8 +230,13 @@ export interface SavedQuery {
created_at: string;
}
export type AiProvider = "ollama" | "openai" | "anthropic";
export interface AiSettings {
provider: AiProvider;
ollama_url: string;
openai_api_key?: string;
anthropic_api_key?: string;
model: string;
}
@@ -272,7 +284,117 @@ export interface LookupProgress {
total: number;
}
export type TabType = "query" | "table" | "structure" | "roles" | "sessions" | "lookup"; export interface TriggerInfo {
name: string;
event: string;
timing: string;
orientation: string;
function_name: string;
is_enabled: boolean;
definition: string;
}
export interface ErdColumn {
name: string;
data_type: string;
is_nullable: boolean;
is_primary_key: boolean;
}
export interface ErdTable {
schema: string;
name: string;
columns: ErdColumn[];
}
export interface ErdRelationship {
constraint_name: string;
source_schema: string;
source_table: string;
source_columns: string[];
target_schema: string;
target_table: string;
target_columns: string[];
update_rule: string;
delete_rule: string;
}
export interface ErdData {
tables: ErdTable[];
relationships: ErdRelationship[];
}
// App Settings
export type DockerHost = "local" | "remote";
export interface McpSettings {
enabled: boolean;
port: number;
}
export interface DockerSettings {
host: DockerHost;
remote_url?: string;
}
export interface AppSettings {
mcp: McpSettings;
docker: DockerSettings;
}
export interface McpStatus {
enabled: boolean;
port: number;
running: boolean;
}
// Docker
export interface DockerStatus {
installed: boolean;
daemon_running: boolean;
version: string | null;
error: string | null;
}
export type CloneMode = "schema_only" | "full_clone" | "sample_data";
export interface CloneToDockerParams {
source_connection_id: string;
source_database: string;
container_name: string;
pg_version: string;
host_port: number | null;
clone_mode: CloneMode;
sample_rows: number | null;
postgres_password: string | null;
}
export interface CloneProgress {
clone_id: string;
stage: string;
percent: number;
message: string;
detail: string | null;
}
export interface TuskContainer {
container_id: string;
name: string;
status: string;
host_port: number;
pg_version: string;
source_database: string | null;
source_connection: string | null;
created_at: string | null;
}
export interface CloneResult {
container: TuskContainer;
connection_id: string;
connection_url: string;
}
export type TabType = "query" | "table" | "structure" | "roles" | "sessions" | "lookup" | "erd" | "validation" | "index-advisor" | "snapshots";
export interface Tab {
id: string;
@@ -287,3 +409,159 @@ export interface Tab {
lookupColumn?: string;
lookupValue?: string;
}
// --- Wave 1: Validation ---
export type ValidationStatus = "pending" | "generating" | "running" | "passed" | "failed" | "error";
export interface ValidationRule {
id: string;
description: string;
generated_sql: string;
status: ValidationStatus;
violation_count: number;
sample_violations: unknown[][];
violation_columns: string[];
error: string | null;
}
export interface ValidationReport {
rules: ValidationRule[];
total_rules: number;
passed: number;
failed: number;
errors: number;
execution_time_ms: number;
}
// --- Wave 2: Data Generator ---
export interface GenerateDataParams {
connection_id: string;
schema: string;
table: string;
row_count: number;
include_related: boolean;
custom_instructions?: string;
}
export interface GeneratedDataPreview {
tables: GeneratedTableData[];
insert_order: string[];
total_rows: number;
}
export interface GeneratedTableData {
schema: string;
table: string;
columns: string[];
rows: unknown[][];
row_count: number;
}
export interface DataGenProgress {
gen_id: string;
stage: string;
percent: number;
message: string;
detail: string | null;
}
// --- Wave 3A: Index Advisor ---
export interface TableStats {
schema: string;
table: string;
seq_scan: number;
idx_scan: number;
n_live_tup: number;
table_size: string;
index_size: string;
}
export interface IndexStatsInfo {
schema: string;
table: string;
index_name: string;
idx_scan: number;
index_size: string;
definition: string;
}
export interface SlowQuery {
query: string;
calls: number;
total_time_ms: number;
mean_time_ms: number;
rows: number;
}
export type IndexRecommendationType = "create_index" | "drop_index" | "replace_index";
export interface IndexRecommendation {
id: string;
recommendation_type: IndexRecommendationType;
table_schema: string;
table_name: string;
index_name: string | null;
ddl: string;
rationale: string;
estimated_impact: string;
priority: string;
}
export interface IndexAdvisorReport {
table_stats: TableStats[];
index_stats: IndexStatsInfo[];
slow_queries: SlowQuery[];
recommendations: IndexRecommendation[];
has_pg_stat_statements: boolean;
}
// --- Wave 3B: Snapshots ---
export interface SnapshotMetadata {
id: string;
name: string;
created_at: string;
connection_name: string;
database: string;
tables: SnapshotTableMeta[];
total_rows: number;
file_size_bytes: number;
version: number;
}
export interface SnapshotTableMeta {
schema: string;
table: string;
row_count: number;
columns: string[];
column_types: string[];
}
export interface SnapshotProgress {
snapshot_id: string;
stage: string;
percent: number;
message: string;
detail: string | null;
}
export interface CreateSnapshotParams {
connection_id: string;
tables: TableRef[];
name: string;
include_dependencies: boolean;
}
export interface TableRef {
schema: string;
table: string;
}
export interface RestoreSnapshotParams {
connection_id: string;
file_path: string;
truncate_before_restore: boolean;
}