diff --git a/.planning/1-CONTEXT.md b/.planning/1-CONTEXT.md new file mode 100644 index 0000000..36fd572 --- /dev/null +++ b/.planning/1-CONTEXT.md @@ -0,0 +1,119 @@ +--- +phase: 1 +title: Foundation +status: ready-for-planning +created: 2026-04-02 +--- + +# Phase 1 Context: Foundation + +## Decided Areas (from prior research + STATE.md) + +These are locked — do not re-litigate during planning or execution. + +| Decision | Value | +|---|---| +| Runtime | .NET 10 LTS + WPF | +| MVVM framework | CommunityToolkit.Mvvm 8.4.2 | +| SharePoint library | PnP.Framework 1.18.0 | +| Auth | MSAL.NET 4.83.1 + Extensions.Msal 4.83.3 + Desktop 4.82.1 | +| Token cache | MsalCacheHelper — one `IPublicClientApplication` per ClientId | +| DI host | Microsoft.Extensions.Hosting 10.x | +| Logging | Serilog 4.3.1 + rolling file sink → `%AppData%\SharepointToolbox\logs\` | +| JSON | System.Text.Json (built-in) | +| JSON persistence | Write-then-replace (`file.tmp` → validate → `File.Move`) + `SemaphoreSlim(1)` per file | +| Async pattern | `AsyncRelayCommand` everywhere — zero `async void` handlers | +| Trimming | `PublishTrimmed=false` — accept ~150–200 MB EXE | +| Architecture | 4-layer MVVM: View → ViewModel → Service → Infrastructure | +| Cross-VM messaging | `WeakReferenceMessenger` for tenant-switched events | +| Session holder | Singleton `SessionManager` — only class that holds `ClientContext` objects | +| Localization | .resx resource files (EN default, FR overlay) | + +## Gray Areas — Defaults Applied (user skipped discussion) + +### 1. Shell Layout + +**Default:** Mirror the existing tool's spatial contract — users are already trained on it. + +- **Window structure:** `MainWindow` with a top `ToolBar`, a center `TabControl` (feature tabs), and a bottom docked log panel. +- **Log panel:** Always visible, 150 px tall, not collapsible in Phase 1 (collapsibility is cosmetic — defer to a later phase). 
Uses the WPF `RichTextBox` control with color-coded entries. +- **Tab strip:** `TabControl` with one `TabItem` per feature area. Phase 1 delivers a shell with placeholder tabs for all features so navigation is wired from day one. +- **Tabs to stub out:** Permissions, Storage, File Search, Duplicates, Templates, Bulk Operations, Folder Structure, Settings — all stubbed with a `"Coming soon"` placeholder `TextBlock` except Settings (partially functional in Phase 1 for profile management and language switching). +- **Status bar:** `StatusBar` at the very bottom (below the log panel) showing: current tenant display name | operation status text | progress percentage. + +### 2. Tenant Selector Placement + +**Default:** Prominent top-toolbar presence — tenant context is the most critical runtime state. + +- **Toolbar layout (left to right):** `ComboBox` (tenant display name list, ~220 px wide) → `Button "Connect"` → `Button "Manage Profiles..."` → separator → `Button "Clear Session"`. +- **ComboBox:** Bound to `MainWindowViewModel.TenantProfiles` ObservableCollection. Selecting a different item triggers a tenant-switch command (WeakReferenceMessenger broadcast to reset all feature VMs). +- **"Manage Profiles..." button:** Opens a modal `ProfileManagementDialog` (separate Window) for CRUD — create, rename, delete profiles. Inline editing in the toolbar would be too cramped. +- **"Clear Session" button:** Clears the MSAL token cache for the currently selected tenant and resets connection state. Lives in the toolbar (not buried in settings) because MSP users need quick access when switching client accounts mid-session. +- **Profile fields:** Name (display label), Tenant URL, Client ID — matches existing `{ name, tenantUrl, clientId }` JSON schema exactly. + +### 3. Progress + Cancel UX + +**Default:** Per-tab pattern — each feature tab owns its progress state. No global progress bar. 
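To make the default concrete: a minimal, hypothetical sketch of the per-tab command pattern, assuming CommunityToolkit.Mvvm's source generators and the shared `OperationProgress` record from the Core layer. `ScanViewModel`, `StatusText`, and `DoWorkAsync` are illustrative names only, not part of the plan.

```csharp
// Illustrative sketch — assumes CommunityToolkit.Mvvm 8.x source generators.
using System;
using System.Threading;
using System.Threading.Tasks;
using CommunityToolkit.Mvvm.ComponentModel;
using CommunityToolkit.Mvvm.Input;

public record OperationProgress(int Current, int Total, string Message);

public partial class ScanViewModel : ObservableObject
{
    [ObservableProperty] private bool isRunning;        // drives ProgressBar/Cancel visibility
    [ObservableProperty] private string statusText = "";

    private CancellationTokenSource? _cts;

    [RelayCommand]
    private async Task RunScanAsync()                   // generates an AsyncRelayCommand
    {
        _cts = new CancellationTokenSource();           // recreated per operation
        IsRunning = true;
        // Progress<T> posts callbacks to the SynchronizationContext it was created on
        var progress = new Progress<OperationProgress>(p =>
            StatusText = $"{p.Message} ({p.Current}/{p.Total})");
        try
        {
            await DoWorkAsync(progress, _cts.Token);    // real feature-service call goes here
        }
        catch (OperationCanceledException)
        {
            StatusText = "Cancelled";                   // log-and-surface, never swallow
        }
        finally
        {
            IsRunning = false;
            _cts.Dispose();
            _cts = null;
        }
    }

    [RelayCommand]
    private void CancelScan() => _cts?.Cancel();

    private static Task DoWorkAsync(IProgress<OperationProgress> progress, CancellationToken ct)
        => Task.CompletedTask;                          // placeholder for a feature service
}
```

Under this sketch, the per-tab Cancel button would bind to the generated `CancelScanCommand`, and the progress row's `Visibility` to `IsRunning`.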
+ +- **Per-tab layout (bottom of each tab's content area):** `ProgressBar` (indeterminate or 0–100) + `TextBlock` (operation description, e.g. "Scanning site 3 of 12…") + `Button "Cancel"` — shown only when an operation is running (`Visibility` bound to `IsRunning`). +- **`CancellationTokenSource`:** Owned by each ViewModel, recreated per operation. Cancel button calls `_cts.Cancel()`. +- **`IProgress<T>`:** `OperationProgress` is a shared record `{ int Current, int Total, string Message }` — defined in the `Core/` layer and used by all feature services. Concrete implementation uses `Progress<OperationProgress>`, which marshals callbacks to the UI thread automatically. +- **Log panel as secondary channel:** Every progress step that produces a meaningful event also writes a timestamped line to the log panel. The per-tab progress bar is the live indicator; the log is the audit trail. +- **Status bar:** `StatusBar` at the bottom updates its operation text from the active tab's progress events via WeakReferenceMessenger — so the user sees progress even if they switch away from the running tab. + +### 4. Error Surface UX + +**Default:** Log panel as primary surface; modal dialog only for blocking errors. + +- **Non-fatal errors** (an operation failed, a SharePoint call returned an error): Written to log panel in red. The per-tab status area shows a brief summary (e.g. "Completed with 2 errors — see log"). No modal. +- **Fatal/blocking errors** (auth failure, unhandled exception): `MessageBox.Show` modal with the error message and a "Copy to Clipboard" button for diagnostics. Keep it simple — no custom dialog in Phase 1. +- **No toasts in Phase 1:** Toast/notification infrastructure is a cosmetic feature — defer. The log panel is always visible and sufficient. +- **Log entry format:** `HH:mm:ss [LEVEL] Message` — color coded: green = info/success, orange = warning, red = error. `LEVEL` maps to Serilog severity. 
+- **Global exception handler:** `Application.DispatcherUnhandledException` and `TaskScheduler.UnobservedTaskException` both funnel to the log panel + a fatal modal. Neither swallows the exception. +- **Empty catch block policy:** Any `catch` block must do exactly one of: log-and-recover, log-and-rethrow, or log-and-surface. Empty catch = build defect. Enforce via code review on every PR in Phase 1. + +## JSON Compatibility + +Existing file names and schema must be preserved exactly — users have live data in these files. + +| File | Schema | +|---|---| +| `Sharepoint_Export_profiles.json` | `{ "profiles": [{ "name": "...", "tenantUrl": "...", "clientId": "..." }] }` | +| `Sharepoint_Settings.json` | `{ "dataFolder": "...", "lang": "en" }` | + +The C# `SettingsService` must read these files without migration — the field names are the contract. + +## Localization + +- **EN strings are the default `.resx`** — `Strings.resx` (neutral/EN). FR is `Strings.fr.resx`. +- **Key naming:** Mirror existing PowerShell key convention (`tab.perms`, `btn.run.scan`, `menu.language`, etc.) so the EN default content is easily auditable against the existing app. +- **Dynamic switching:** `CultureInfo.CurrentUICulture` swap + `WeakReferenceMessenger` broadcast triggers all bound `LocalizedString` markup extensions to re-evaluate. No app restart needed. +- **FR completeness:** FR strings will be stubbed with EN fallback in Phase 1 — FR completeness is a Phase 5 concern. + +## Infrastructure Patterns (Phase 1 Deliverables) + +These are shared helpers that all feature phases reuse. They must be built and tested in Phase 1 before any feature work begins. + +1. **`SharePointPaginationHelper`** — static helper that wraps `CamlQuery` with `RowLimit ≤ 2,000` and `ListItemCollectionPosition` looping. All list enumeration in the codebase must call this — never raw `ExecuteQuery` on a list. +2. 
**`AsyncRelayCommand` pattern** — a thin base or example `FeatureViewModel` that demonstrates the canonical async command pattern: create `CancellationTokenSource`, bind `IsRunning`, bind `IProgress<T>`, handle `OperationCanceledException` gracefully. +3. **`ObservableCollection` threading rule** — results are accumulated in a `List<T>` on a background thread, then assigned as `new ObservableCollection<T>(list)` via `Dispatcher.InvokeAsync`. Never modify an `ObservableCollection` from `Task.Run`. +4. **`ExecuteQueryRetryAsync` wrapper** — wraps PnP Framework's retry logic. All CSOM calls use this; surface retry events as log + progress messages ("Throttled — retrying in 30s…"). +5. **`ClientContext` disposal** — always wrap in `using` (CSOM's `ClientContext` implements `IDisposable`, not `IAsyncDisposable`). Unit tests verify `Dispose()` is called on cancellation. + +## Deferred Ideas (out of scope for Phase 1) + +- Log panel collapsibility (cosmetic, Phase 3+) +- Dark/light theme toggle (cosmetic, post-v1) +- Toast/notification system (Phase 3+) +- FR locale completeness (Phase 5) +- User access export, storage charts, simplified permissions view (v1.x features, Phase 5) + +## code_context + +| Asset | Path | Notes | +|---|---|---| +| Existing profile JSON schema | `Sharepoint_ToolBox.ps1:68–72` | `Save-Profiles` shows exact field names | +| Existing settings JSON schema | `Sharepoint_ToolBox.ps1:147–152` | `Save-Settings` shows `dataFolder` + `lang` | +| Existing localization keys (EN) | `Sharepoint_ToolBox.ps1:2795–2870` (approx) | Full EN key set for `.resx` migration | +| Existing tab names | `Sharepoint_ToolBox.ps1:3824` | 9 tabs: Perms, Storage, Templates, Search, Dupes, Transfer, Bulk, Struct, Versions | +| Log panel pattern | `Sharepoint_ToolBox.ps1:6–17` | Color + timestamp format to mirror | diff --git a/.planning/ROADMAP.md b/.planning/ROADMAP.md index 847a0c2..7299dd2 100644 --- a/.planning/ROADMAP.md +++ b/.planning/ROADMAP.md @@ -22,7 +22,7 @@ Decimal phases appear between their surrounding integers in numeric order. 
- [x] **Phase 1: Foundation** - WPF shell, multi-tenant auth, DI, async patterns, error handling, logging, localization, JSON persistence (completed 2026-04-02) - [x] **Phase 2: Permissions** - Permissions scan (single and multi-site), CSV and HTML report export -- [ ] **Phase 3: Storage and File Operations** - Storage metrics, file search, and duplicate detection +- [x] **Phase 3: Storage and File Operations** - Storage metrics, file search, and duplicate detection (completed 2026-04-02) - [ ] **Phase 4: Bulk Operations and Provisioning** - Bulk member/site/transfer operations, site templates, folder structure provisioning - [ ] **Phase 5: Distribution and Hardening** - Self-contained EXE packaging, end-to-end validation, FR locale completeness @@ -125,6 +125,6 @@ Phases execute in numeric order: 1 → 2 → 3 → 4 → 5 |-------|----------------|--------|-----------| | 1. Foundation | 8/8 | Complete | 2026-04-02 | | 2. Permissions | 7/7 | Complete | 2026-04-02 | -| 3. Storage and File Operations | 7/8 | In Progress| | +| 3. Storage and File Operations | 8/8 | Complete | 2026-04-02 | | 4. Bulk Operations and Provisioning | 0/? | Not started | - | | 5. Distribution and Hardening | 0/? 
| Not started | - | diff --git a/.planning/STATE.md b/.planning/STATE.md index b8c29c5..1a6dee9 100644 --- a/.planning/STATE.md +++ b/.planning/STATE.md @@ -3,14 +3,14 @@ gsd_state_version: 1.0 milestone: v1.0 milestone_name: milestone status: executing -stopped_at: Completed 03-05-PLAN.md — Search and Duplicate Export Services -last_updated: "2026-04-02T13:40:11.479Z" -last_activity: 2026-04-02 — Plan 03-02 complete — StorageService CSOM scan engine implemented +stopped_at: Completed 03-08-PLAN.md — Phase 3 Storage complete — visual checkpoint approved by user +last_updated: "2026-04-02T15:50:00.000Z" +last_activity: 2026-04-02 — Plan 03-08 complete — SearchViewModel + DuplicatesViewModel + Views + DI wiring, visual checkpoint approved progress: total_phases: 5 - completed_phases: 2 + completed_phases: 3 total_plans: 23 - completed_plans: 22 + completed_plans: 23 percent: 65 --- @@ -21,14 +21,14 @@ progress: See: .planning/PROJECT.md (updated 2026-04-02) **Core value:** Administrators can audit and manage SharePoint/Teams permissions and storage across multiple client tenants from a single, reliable desktop application. 
-**Current focus:** Phase 3 — Storage and File Operations (planned, ready to execute) +**Current focus:** Phase 4 — Bulk Operations and Provisioning (not yet planned) ## Current Position -Phase: 3 of 5 (Storage and File Operations) — EXECUTING -Plan: 2 of 8 in phase 03 — completed 03-02, ready for 03-03 -Status: Executing — StorageService complete, proceeding to Wave 2 (exports + SearchService) -Last activity: 2026-04-02 — Plan 03-02 complete — StorageService CSOM scan engine implemented +Phase: 3 of 5 (Storage and File Operations) — COMPLETE +Plan: 8 of 8 in phase 03 — all plans complete, visual checkpoint approved +Status: Ready for Phase 4 planning +Last activity: 2026-04-02 — Plan 03-08 complete — SearchViewModel + DuplicatesViewModel + Views visual checkpoint approved Progress: [██████░░░░] 65% @@ -82,6 +82,7 @@ Progress: [██████░░░░] 65% | Phase 03-storage P04 | 2min | 2 tasks | 2 files | | Phase 03-storage P07 | 4min | 2 tasks | 10 files | | Phase 03-storage P05 | 4min | 2 tasks | 3 files | +| Phase 03 P08 | 4min | 3 tasks | 9 files | ## Accumulated Context @@ -148,6 +149,8 @@ Recent decisions affecting current work: - [Phase 03-storage]: IndentConverter/BytesConverter/InverseBoolConverter registered in App.xaml Application.Resources — accessible to all views without per-UserControl declaration - [Phase 03-storage]: SearchCsvExportService uses UTF-8 BOM for Excel compatibility — consistent with Phase 2 CsvExportService pattern - [Phase 03-storage]: DuplicatesHtmlExportService always uses badge-dup (red) for all groups — ok/diff distinction removed from final DUPL-03 spec +- [Phase 03]: SearchViewModel and DuplicatesViewModel use TenantProfile site URL override pattern — ctx.Url is read-only in CSOM (established pattern from StorageViewModel) +- [Phase 03]: DuplicateRow flat DTO wraps DuplicateItem with GroupName and GroupSize for DataGrid display ### Pending Todos @@ -160,6 +163,6 @@ None yet. 
## Session Continuity -Last session: 2026-04-02T13:40:11.476Z -Stopped at: Completed 03-05-PLAN.md — Search and Duplicate Export Services +Last session: 2026-04-02T13:46:30.499Z +Stopped at: Completed 03-08-PLAN.md — SearchViewModel + DuplicatesViewModel + Views + DI wiring (visual checkpoint pending) Resume file: None diff --git a/.planning/phases/03-storage/03-01-PLAN.md b/.planning/phases/03-storage/03-01-PLAN.md new file mode 100644 index 0000000..b336d88 --- /dev/null +++ b/.planning/phases/03-storage/03-01-PLAN.md @@ -0,0 +1,815 @@ +--- +phase: 03 +plan: 01 +title: Wave 0 — Test Scaffolds, Stub Interfaces, and Core Models +status: pending +wave: 0 +depends_on: [] +files_modified: + - SharepointToolbox/Core/Models/StorageNode.cs + - SharepointToolbox/Core/Models/StorageScanOptions.cs + - SharepointToolbox/Core/Models/SearchResult.cs + - SharepointToolbox/Core/Models/SearchOptions.cs + - SharepointToolbox/Core/Models/DuplicateGroup.cs + - SharepointToolbox/Core/Models/DuplicateItem.cs + - SharepointToolbox/Core/Models/DuplicateScanOptions.cs + - SharepointToolbox/Services/IStorageService.cs + - SharepointToolbox/Services/ISearchService.cs + - SharepointToolbox/Services/IDuplicatesService.cs + - SharepointToolbox/Services/Export/StorageCsvExportService.cs + - SharepointToolbox/Services/Export/StorageHtmlExportService.cs + - SharepointToolbox/Services/Export/SearchCsvExportService.cs + - SharepointToolbox/Services/Export/SearchHtmlExportService.cs + - SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs + - SharepointToolbox.Tests/Services/StorageServiceTests.cs + - SharepointToolbox.Tests/Services/SearchServiceTests.cs + - SharepointToolbox.Tests/Services/DuplicatesServiceTests.cs + - SharepointToolbox.Tests/Services/Export/StorageCsvExportServiceTests.cs + - SharepointToolbox.Tests/Services/Export/StorageHtmlExportServiceTests.cs + - SharepointToolbox.Tests/Services/Export/SearchExportServiceTests.cs + - 
SharepointToolbox.Tests/Services/Export/DuplicatesHtmlExportServiceTests.cs +autonomous: true +requirements: + - STOR-01 + - STOR-02 + - STOR-03 + - STOR-04 + - STOR-05 + - SRCH-01 + - SRCH-02 + - SRCH-03 + - SRCH-04 + - DUPL-01 + - DUPL-02 + - DUPL-03 + +must_haves: + truths: + - "dotnet build produces 0 errors after all 7 models, 3 interfaces, and 5 stub export classes are created" + - "All 7 test files exist and are discovered by dotnet test (test count > 0)" + - "StorageServiceTests, SearchServiceTests, DuplicatesServiceTests compile but skip (stubs referencing types that exist after this plan)" + - "The pure-logic tests in DuplicatesServiceTests (MakeKey composite key) are real [Fact] tests — not skipped — and pass" + - "Export service tests compile but fail (types exist as stubs with no real implementation yet) — expected until Plans 03/05" + artifacts: + - path: "SharepointToolbox/Core/Models/StorageNode.cs" + provides: "Tree node model for storage metrics display" + - path: "SharepointToolbox/Core/Models/SearchResult.cs" + provides: "Flat result record for file search output" + - path: "SharepointToolbox/Core/Models/DuplicateGroup.cs" + provides: "Group record for duplicate detection output" + - path: "SharepointToolbox/Services/IStorageService.cs" + provides: "Interface enabling ViewModel mocking for storage" + - path: "SharepointToolbox/Services/ISearchService.cs" + provides: "Interface enabling ViewModel mocking for search" + - path: "SharepointToolbox/Services/IDuplicatesService.cs" + provides: "Interface enabling ViewModel mocking for duplicates" + key_links: + - from: "StorageServiceTests.cs" + to: "IStorageService" + via: "mock interface" + pattern: "IStorageService" + - from: "SearchServiceTests.cs" + to: "ISearchService" + via: "mock interface" + pattern: "ISearchService" + - from: "DuplicatesServiceTests.cs" + to: "MakeKey" + via: "static pure function" + pattern: "MakeKey" +--- + +# Plan 03-01: Wave 0 — Test Scaffolds, Stub Interfaces, and Core 
Models + +## Goal + +Create all data models, service interfaces, export service stubs, and test scaffolds needed so every subsequent plan has a working `dotnet test --filter` verify command pointing at a real test class. Interfaces and models define the contracts; implementation plans (03-02 through 03-05) fill them in. One set of pure-logic tests — covering the `MakeKey` composite key function for duplicate detection — consists of real `[Fact]` tests that pass immediately, since the logic is pure and has no CSOM dependencies. + +## Context + +Phase 2 created `PermissionEntry`, `ScanOptions`, `IPermissionsService`, and test scaffolds in exactly this pattern. Phase 3 follows the same Wave 0 approach: models + interfaces first, implementation in subsequent plans. The test project at `SharepointToolbox.Tests/SharepointToolbox.Tests.csproj` already has xUnit 2.9.3 + Moq. The export service stubs must compile (the test files reference them) even though their `BuildCsv`/`BuildHtml` methods return empty strings until implemented. + +## Tasks + +### Task 1: Create all 7 core models and 3 service interfaces + +**Files:** +- `SharepointToolbox/Core/Models/StorageNode.cs` +- `SharepointToolbox/Core/Models/StorageScanOptions.cs` +- `SharepointToolbox/Core/Models/SearchResult.cs` +- `SharepointToolbox/Core/Models/SearchOptions.cs` +- `SharepointToolbox/Core/Models/DuplicateGroup.cs` +- `SharepointToolbox/Core/Models/DuplicateItem.cs` +- `SharepointToolbox/Core/Models/DuplicateScanOptions.cs` +- `SharepointToolbox/Services/IStorageService.cs` +- `SharepointToolbox/Services/ISearchService.cs` +- `SharepointToolbox/Services/IDuplicatesService.cs` + +**Action:** Create | Write + +**Why:** All subsequent plans depend on these contracts. Tests must compile against them. Interfaces enable Moq-based unit tests. 
+ +```csharp +// SharepointToolbox/Core/Models/StorageNode.cs +namespace SharepointToolbox.Core.Models; + +public class StorageNode +{ + public string Name { get; set; } = string.Empty; + public string Url { get; set; } = string.Empty; + public string SiteTitle { get; set; } = string.Empty; + public string Library { get; set; } = string.Empty; + public long TotalSizeBytes { get; set; } + public long FileStreamSizeBytes { get; set; } + public long VersionSizeBytes => Math.Max(0L, TotalSizeBytes - FileStreamSizeBytes); + public long TotalFileCount { get; set; } + public DateTime? LastModified { get; set; } + public int IndentLevel { get; set; } + public List<StorageNode> Children { get; set; } = new(); +} +``` + +```csharp +// SharepointToolbox/Core/Models/StorageScanOptions.cs +namespace SharepointToolbox.Core.Models; + +public record StorageScanOptions( + bool PerLibrary = true, + bool IncludeSubsites = false, + int FolderDepth = 0 // 0 = library root only; >0 = recurse N levels +); +``` + +```csharp +// SharepointToolbox/Core/Models/SearchResult.cs +namespace SharepointToolbox.Core.Models; + +public class SearchResult +{ + public string Title { get; set; } = string.Empty; + public string Path { get; set; } = string.Empty; + public string FileExtension { get; set; } = string.Empty; + public DateTime? Created { get; set; } + public DateTime? LastModified { get; set; } + public string Author { get; set; } = string.Empty; + public string ModifiedBy { get; set; } = string.Empty; + public long SizeBytes { get; set; } +} +``` + +```csharp +// SharepointToolbox/Core/Models/SearchOptions.cs +namespace SharepointToolbox.Core.Models; + +public record SearchOptions( + string[] Extensions, + string? Regex, + DateTime? CreatedAfter, + DateTime? CreatedBefore, + DateTime? ModifiedAfter, + DateTime? ModifiedBefore, + string? CreatedBy, + string? ModifiedBy, + string? 
Library, + int MaxResults, + string SiteUrl +); +``` + +```csharp +// SharepointToolbox/Core/Models/DuplicateItem.cs +namespace SharepointToolbox.Core.Models; + +public class DuplicateItem +{ + public string Name { get; set; } = string.Empty; + public string Path { get; set; } = string.Empty; + public string Library { get; set; } = string.Empty; + public long? SizeBytes { get; set; } + public DateTime? Created { get; set; } + public DateTime? Modified { get; set; } + public int? FolderCount { get; set; } + public int? FileCount { get; set; } +} +``` + +```csharp +// SharepointToolbox/Core/Models/DuplicateGroup.cs +namespace SharepointToolbox.Core.Models; + +public class DuplicateGroup +{ + public string GroupKey { get; set; } = string.Empty; + public string Name { get; set; } = string.Empty; + public List<DuplicateItem> Items { get; set; } = new(); +} +``` + +```csharp +// SharepointToolbox/Core/Models/DuplicateScanOptions.cs +namespace SharepointToolbox.Core.Models; + +public record DuplicateScanOptions( + string Mode = "Files", // "Files" or "Folders" + bool MatchSize = true, + bool MatchCreated = false, + bool MatchModified = false, + bool MatchSubfolderCount = false, + bool MatchFileCount = false, + bool IncludeSubsites = false, + string? 
Library = null +); +``` + +```csharp +// SharepointToolbox/Services/IStorageService.cs +using Microsoft.SharePoint.Client; +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services; + +public interface IStorageService +{ + Task<IReadOnlyList<StorageNode>> CollectStorageAsync( + ClientContext ctx, + StorageScanOptions options, + IProgress<OperationProgress> progress, + CancellationToken ct); +} +``` + +```csharp +// SharepointToolbox/Services/ISearchService.cs +using Microsoft.SharePoint.Client; +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services; + +public interface ISearchService +{ + Task<IReadOnlyList<SearchResult>> SearchFilesAsync( + ClientContext ctx, + SearchOptions options, + IProgress<OperationProgress> progress, + CancellationToken ct); +} +``` + +```csharp +// SharepointToolbox/Services/IDuplicatesService.cs +using Microsoft.SharePoint.Client; +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services; + +public interface IDuplicatesService +{ + Task<IReadOnlyList<DuplicateGroup>> ScanDuplicatesAsync( + ClientContext ctx, + DuplicateScanOptions options, + IProgress<OperationProgress> progress, + CancellationToken ct); +} +``` + +**Verification:** + +```bash +dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx +``` + +Expected: 0 errors + +### Task 2: Create 5 export service stubs and 7 test scaffold files + +**Files:** +- `SharepointToolbox/Services/Export/StorageCsvExportService.cs` +- `SharepointToolbox/Services/Export/StorageHtmlExportService.cs` +- `SharepointToolbox/Services/Export/SearchCsvExportService.cs` +- `SharepointToolbox/Services/Export/SearchHtmlExportService.cs` +- `SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs` +- `SharepointToolbox.Tests/Services/StorageServiceTests.cs` +- `SharepointToolbox.Tests/Services/SearchServiceTests.cs` +- `SharepointToolbox.Tests/Services/DuplicatesServiceTests.cs` +- `SharepointToolbox.Tests/Services/Export/StorageCsvExportServiceTests.cs` +- `SharepointToolbox.Tests/Services/Export/StorageHtmlExportServiceTests.cs` +- 
`SharepointToolbox.Tests/Services/Export/SearchExportServiceTests.cs` +- `SharepointToolbox.Tests/Services/Export/DuplicatesHtmlExportServiceTests.cs` + +**Action:** Create | Write + +**Why:** Stubs enable test files to compile. The `MakeKey` helper and `VersionSizeBytes` derived property can be unit tested immediately without any CSOM. Export service tests will fail until plans 03-03 and 03-05 implement the real logic — that is the expected state. + +```csharp +// SharepointToolbox/Services/Export/StorageCsvExportService.cs +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services.Export; + +public class StorageCsvExportService +{ + public string BuildCsv(IReadOnlyList<StorageNode> nodes) => string.Empty; // implemented in Plan 03-03 + + public async Task WriteAsync(IReadOnlyList<StorageNode> nodes, string filePath, CancellationToken ct) + { + var csv = BuildCsv(nodes); + await System.IO.File.WriteAllTextAsync(filePath, csv, new System.Text.UTF8Encoding(true), ct); + } +} +``` + +```csharp +// SharepointToolbox/Services/Export/StorageHtmlExportService.cs +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services.Export; + +public class StorageHtmlExportService +{ + public string BuildHtml(IReadOnlyList<StorageNode> nodes) => string.Empty; // implemented in Plan 03-03 + + public async Task WriteAsync(IReadOnlyList<StorageNode> nodes, string filePath, CancellationToken ct) + { + var html = BuildHtml(nodes); + await System.IO.File.WriteAllTextAsync(filePath, html, System.Text.Encoding.UTF8, ct); + } +} +``` + +```csharp +// SharepointToolbox/Services/Export/SearchCsvExportService.cs +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services.Export; + +public class SearchCsvExportService +{ + public string BuildCsv(IReadOnlyList<SearchResult> results) => string.Empty; // implemented in Plan 03-05 + + public async Task WriteAsync(IReadOnlyList<SearchResult> results, string filePath, CancellationToken ct) + { + var csv = BuildCsv(results); + await System.IO.File.WriteAllTextAsync(filePath, 
csv, new System.Text.UTF8Encoding(true), ct); + } +} +``` + +```csharp +// SharepointToolbox/Services/Export/SearchHtmlExportService.cs +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services.Export; + +public class SearchHtmlExportService +{ + public string BuildHtml(IReadOnlyList<SearchResult> results) => string.Empty; // implemented in Plan 03-05 + + public async Task WriteAsync(IReadOnlyList<SearchResult> results, string filePath, CancellationToken ct) + { + var html = BuildHtml(results); + await System.IO.File.WriteAllTextAsync(filePath, html, System.Text.Encoding.UTF8, ct); + } +} +``` + +```csharp +// SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs +using SharepointToolbox.Core.Models; + +namespace SharepointToolbox.Services.Export; + +public class DuplicatesHtmlExportService +{ + public string BuildHtml(IReadOnlyList<DuplicateGroup> groups) => string.Empty; // implemented in Plan 03-05 + + public async Task WriteAsync(IReadOnlyList<DuplicateGroup> groups, string filePath, CancellationToken ct) + { + var html = BuildHtml(groups); + await System.IO.File.WriteAllTextAsync(filePath, html, System.Text.Encoding.UTF8, ct); + } +} +``` + +Now the test scaffold files. 
The `DuplicatesServiceTests` includes a real pure-logic test for `MakeKey` — define the helper class inline in the same file so it compiles without depending on `DuplicatesService`: + +```csharp +// SharepointToolbox.Tests/Services/StorageServiceTests.cs +using SharepointToolbox.Core.Models; +using SharepointToolbox.Services; +using Xunit; + +namespace SharepointToolbox.Tests.Services; + +public class StorageServiceTests +{ + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-02 implementation")] + public Task CollectStorageAsync_ReturnsLibraryNodes_ForDocumentLibraries() + => Task.CompletedTask; + + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-02 implementation")] + public Task CollectStorageAsync_WithFolderDepth1_ReturnsSubfolderNodes() + => Task.CompletedTask; + + [Fact] + public void StorageNode_VersionSizeBytes_IsNonNegative() + { + // VersionSizeBytes = TotalSizeBytes - FileStreamSizeBytes (never negative) + var node = new StorageNode { TotalSizeBytes = 1000L, FileStreamSizeBytes = 1200L }; + Assert.Equal(0L, node.VersionSizeBytes); // Math.Max(0, -200) = 0 + } + + [Fact] + public void StorageNode_VersionSizeBytes_IsCorrectWhenPositive() + { + var node = new StorageNode { TotalSizeBytes = 5000L, FileStreamSizeBytes = 3000L }; + Assert.Equal(2000L, node.VersionSizeBytes); + } +} +``` + +```csharp +// SharepointToolbox.Tests/Services/SearchServiceTests.cs +using SharepointToolbox.Core.Models; +using SharepointToolbox.Services; +using Xunit; + +namespace SharepointToolbox.Tests.Services; + +public class SearchServiceTests +{ + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-04 implementation")] + public Task SearchFilesAsync_WithExtensionFilter_BuildsCorrectKql() + => Task.CompletedTask; + + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-04 implementation")] + public Task SearchFilesAsync_PaginationStopsAt50000() + => Task.CompletedTask; + + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-04 
implementation")] + public Task SearchFilesAsync_FiltersVersionHistoryPaths() + => Task.CompletedTask; +} +``` + +```csharp +// SharepointToolbox.Tests/Services/DuplicatesServiceTests.cs +using SharepointToolbox.Core.Models; +using SharepointToolbox.Services; +using Xunit; + +namespace SharepointToolbox.Tests.Services; + +/// <summary> +/// Pure-logic tests for the MakeKey composite key function (no CSOM needed). +/// Inline helper matches the implementation DuplicatesService will produce in Plan 03-04. +/// </summary> +public class DuplicatesServiceTests +{ + // Inline copy of MakeKey to test logic before Plan 03-04 creates the real class + private static string MakeKey(DuplicateItem item, DuplicateScanOptions opts) + { + var parts = new System.Collections.Generic.List<string> { item.Name.ToLowerInvariant() }; + if (opts.MatchSize && item.SizeBytes.HasValue) parts.Add(item.SizeBytes.Value.ToString()); + if (opts.MatchCreated && item.Created.HasValue) parts.Add(item.Created.Value.Date.ToString("yyyy-MM-dd")); + if (opts.MatchModified && item.Modified.HasValue) parts.Add(item.Modified.Value.Date.ToString("yyyy-MM-dd")); + if (opts.MatchSubfolderCount && item.FolderCount.HasValue) parts.Add(item.FolderCount.Value.ToString()); + if (opts.MatchFileCount && item.FileCount.HasValue) parts.Add(item.FileCount.Value.ToString()); + return string.Join("|", parts); + } + + [Fact] + public void MakeKey_NameOnly_ReturnsLowercaseName() + { + var item = new DuplicateItem { Name = "Report.docx", SizeBytes = 1000 }; + var opts = new DuplicateScanOptions(MatchSize: false); + Assert.Equal("report.docx", MakeKey(item, opts)); + } + + [Fact] + public void MakeKey_WithSizeMatch_AppendsSizeToKey() + { + var item = new DuplicateItem { Name = "Report.docx", SizeBytes = 1024 }; + var opts = new DuplicateScanOptions(MatchSize: true); + Assert.Equal("report.docx|1024", MakeKey(item, opts)); + } + + [Fact] + public void MakeKey_WithCreatedAndModified_AppendsDateStrings() + { + var item = new DuplicateItem + { + Name = 
"file.pdf", + SizeBytes = 500, + Created = new DateTime(2024, 3, 15), + Modified = new DateTime(2024, 6, 1) + }; + var opts = new DuplicateScanOptions(MatchSize: false, MatchCreated: true, MatchModified: true); + Assert.Equal("file.pdf|2024-03-15|2024-06-01", MakeKey(item, opts)); + } + + [Fact] + public void MakeKey_SameKeyForSameItems_GroupsCorrectly() + { + var opts = new DuplicateScanOptions(MatchSize: true); + var item1 = new DuplicateItem { Name = "Budget.xlsx", SizeBytes = 2048 }; + var item2 = new DuplicateItem { Name = "BUDGET.xlsx", SizeBytes = 2048 }; + Assert.Equal(MakeKey(item1, opts), MakeKey(item2, opts)); + } + + [Fact] + public void MakeKey_DifferentSize_ProducesDifferentKeys() + { + var opts = new DuplicateScanOptions(MatchSize: true); + var item1 = new DuplicateItem { Name = "file.docx", SizeBytes = 100 }; + var item2 = new DuplicateItem { Name = "file.docx", SizeBytes = 200 }; + Assert.NotEqual(MakeKey(item1, opts), MakeKey(item2, opts)); + } + + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-04 implementation")] + public Task ScanDuplicatesAsync_Files_GroupsByCompositeKey() + => Task.CompletedTask; + + [Fact(Skip = "Requires live CSOM context — covered by Plan 03-04 implementation")] + public Task ScanDuplicatesAsync_Folders_UsesCamlFSObjType1() + => Task.CompletedTask; +} +``` + +```csharp +// SharepointToolbox.Tests/Services/Export/StorageCsvExportServiceTests.cs +using SharepointToolbox.Core.Models; +using SharepointToolbox.Services.Export; +using Xunit; + +namespace SharepointToolbox.Tests.Services.Export; + +public class StorageCsvExportServiceTests +{ + [Fact] + public void BuildCsv_WithKnownNodes_ProducesHeaderRow() + { + var svc = new StorageCsvExportService(); + var nodes = new List<StorageNode> + { + new() { Name = "Shared Documents", Library = "Shared Documents", SiteTitle = "MySite", + TotalSizeBytes = 1024, FileStreamSizeBytes = 800, TotalFileCount = 5, + LastModified = new DateTime(2024, 1, 15) } + }; + var csv = 
svc.BuildCsv(nodes);
+        Assert.Contains("Library", csv);
+        Assert.Contains("Site", csv);
+        Assert.Contains("Files", csv);
+        Assert.Contains("Total Size", csv);
+        Assert.Contains("Version Size", csv);
+        Assert.Contains("Last Modified", csv);
+    }
+
+    [Fact]
+    public void BuildCsv_WithEmptyList_ReturnsHeaderOnly()
+    {
+        var svc = new StorageCsvExportService();
+        var csv = svc.BuildCsv(new List<StorageNode>());
+        Assert.NotEmpty(csv); // must have at least the header row
+        var lines = csv.Split('\n', StringSplitOptions.RemoveEmptyEntries);
+        Assert.Single(lines); // only header, no data rows
+    }
+
+    [Fact]
+    public void BuildCsv_NodeValues_AppearInOutput()
+    {
+        var svc = new StorageCsvExportService();
+        var nodes = new List<StorageNode>
+        {
+            new() { Name = "Reports", Library = "Reports", SiteTitle = "ProjectSite",
+                    TotalSizeBytes = 2048, FileStreamSizeBytes = 1024, TotalFileCount = 10 }
+        };
+        var csv = svc.BuildCsv(nodes);
+        Assert.Contains("Reports", csv);
+        Assert.Contains("ProjectSite", csv);
+        Assert.Contains("10", csv);
+    }
+}
+```
+
+```csharp
+// SharepointToolbox.Tests/Services/Export/StorageHtmlExportServiceTests.cs
+using SharepointToolbox.Core.Models;
+using SharepointToolbox.Services.Export;
+using Xunit;
+
+namespace SharepointToolbox.Tests.Services.Export;
+
+public class StorageHtmlExportServiceTests
+{
+    [Fact]
+    public void BuildHtml_WithNodes_ContainsToggleJs()
+    {
+        var svc = new StorageHtmlExportService();
+        var nodes = new List<StorageNode>
+        {
+            new() { Name = "Shared Documents", Library = "Shared Documents", SiteTitle = "Site1",
+                    TotalSizeBytes = 5000, FileStreamSizeBytes = 4000, TotalFileCount = 20,
+                    Children = new List<StorageNode>
+                    {
+                        new() { Name = "Archive", Library = "Shared Documents", SiteTitle = "Site1",
+                                TotalSizeBytes = 1000, FileStreamSizeBytes = 800, TotalFileCount = 5 }
+                    } }
+        };
+        var html = svc.BuildHtml(nodes);
+        Assert.Contains("toggle(", html);
+        Assert.Contains("<table", html);
+        Assert.Contains("Shared Documents", html);
+    }
+
+    [Fact]
+    public void
BuildHtml_WithEmptyList_ReturnsValidHtml()
+    {
+        var svc = new StorageHtmlExportService();
+        var html = svc.BuildHtml(new List<StorageNode>());
+        Assert.Contains("<html", html);
+        Assert.Contains("</html>", html);
+    }
+
+    [Fact]
+    public void BuildHtml_NodeNames_AppearInOutput()
+    {
+        var svc = new StorageHtmlExportService();
+        var nodes = new List<StorageNode>
+        {
+            new() { Name = "Documents", Library = "Documents", SiteTitle = "Site1", TotalSizeBytes = 1000 },
+            new() { Name = "Images", Library = "Images", SiteTitle = "Site1", TotalSizeBytes = 2000 }
+        };
+        var html = svc.BuildHtml(nodes);
+        Assert.Contains("Documents", html);
+        Assert.Contains("Images", html);
+    }
+}
+```
+
+```csharp
+// SharepointToolbox.Tests/Services/Export/SearchExportServiceTests.cs
+using SharepointToolbox.Core.Models;
+using SharepointToolbox.Services.Export;
+using Xunit;
+
+namespace SharepointToolbox.Tests.Services.Export;
+
+public class SearchExportServiceTests
+{
+    private static SearchResult MakeSample() => new()
+    {
+        Title = "Q1 Budget.xlsx",
+        Path = "https://contoso.sharepoint.com/sites/Finance/Shared Documents/Q1 Budget.xlsx",
+        FileExtension = "xlsx",
+        Created = new DateTime(2024, 1, 10),
+        LastModified = new DateTime(2024, 3, 20),
+        Author = "Alice Smith",
+        ModifiedBy = "Bob Jones",
+        SizeBytes = 48_000
+    };
+
+    // ── CSV tests ──────────────────────────────────────────────────────────────
+
+    [Fact]
+    public void BuildCsv_WithKnownResults_ContainsExpectedHeader()
+    {
+        var svc = new SearchCsvExportService();
+        var csv = svc.BuildCsv(new List<SearchResult> { MakeSample() });
+        Assert.Contains("File Name", csv);
+        Assert.Contains("Extension", csv);
+        Assert.Contains("Created", csv);
+        Assert.Contains("Created By", csv);
+        Assert.Contains("Modified By", csv);
+        Assert.Contains("Size", csv);
+    }
+
+    [Fact]
+    public void BuildCsv_WithEmptyList_ReturnsHeaderOnly()
+    {
+        var svc = new SearchCsvExportService();
+        var csv = svc.BuildCsv(new List<SearchResult>());
+        Assert.NotEmpty(csv);
+        var lines = csv.Split('\n', StringSplitOptions.RemoveEmptyEntries);
+        Assert.Single(lines);
+    }
+
+    [Fact]
+    public void BuildCsv_ResultValues_AppearInOutput()
+    {
+        var svc = new
SearchCsvExportService();
+        var csv = svc.BuildCsv(new List<SearchResult> { MakeSample() });
+        Assert.Contains("Alice Smith", csv);
+        Assert.Contains("xlsx", csv);
+    }
+
+    // ── HTML tests ─────────────────────────────────────────────────────────────
+
+    [Fact]
+    public void BuildHtml_WithResults_ContainsSortableColumnScript()
+    {
+        var svc = new SearchHtmlExportService();
+        var html = svc.BuildHtml(new List<SearchResult> { MakeSample() });
+        Assert.Contains("<table", html);
+        Assert.Contains("sort", html); // sortable columns JS
+        Assert.Contains("Q1 Budget.xlsx", html);
+    }
+
+    [Fact]
+    public void BuildHtml_WithResults_ContainsFilterInput()
+    {
+        var svc = new SearchHtmlExportService();
+        var html = svc.BuildHtml(new List<SearchResult> { MakeSample() });
+        Assert.Contains("filter", html); // filter input element
+    }
+
+    [Fact]
+    public void BuildHtml_WithEmptyList_ReturnsValidHtml()
+    {
+        var svc = new SearchHtmlExportService();
+        var html = svc.BuildHtml(new List<SearchResult>());
+        Assert.Contains("<html", html);
+    }
+}
+```
+
+```csharp
+// SharepointToolbox.Tests/Services/Export/DuplicatesHtmlExportServiceTests.cs
+using SharepointToolbox.Core.Models;
+using SharepointToolbox.Services.Export;
+using Xunit;
+
+namespace SharepointToolbox.Tests.Services.Export;
+
+public class DuplicatesHtmlExportServiceTests
+{
+    private static DuplicateGroup MakeGroup(string name, int count) => new()
+    {
+        GroupKey = $"{name}|1024",
+        Name = name,
+        Items = Enumerable.Range(1, count).Select(i => new DuplicateItem
+        {
+            Name = name,
+            Path = $"https://contoso.sharepoint.com/sites/Site{i}/{name}",
+            Library = "Shared Documents",
+            SizeBytes = 1024
+        }).ToList()
+    };
+
+    [Fact]
+    public void BuildHtml_WithGroups_ContainsGroupCards()
+    {
+        var svc = new DuplicatesHtmlExportService();
+        var groups = new List<DuplicateGroup> { MakeGroup("report.docx", 3) };
+        var html = svc.BuildHtml(groups);
+        Assert.Contains("<div", html);
+        Assert.Contains("report.docx", html);
+    }
+
+    [Fact]
+    public void BuildHtml_WithMultipleGroups_AllGroupNamesPresent()
+    {
+        var svc = new
DuplicatesHtmlExportService();
+        var groups = new List<DuplicateGroup>
+        {
+            MakeGroup("budget.xlsx", 2),
+            MakeGroup("photo.jpg", 4)
+        };
+        var html = svc.BuildHtml(groups);
+        Assert.Contains("budget.xlsx", html);
+        Assert.Contains("photo.jpg", html);
+    }
+
+    [Fact]
+    public void BuildHtml_WithEmptyList_ReturnsValidHtml()
+    {
+        var svc = new DuplicatesHtmlExportService();
+        var html = svc.BuildHtml(new List<DuplicateGroup>());
+        Assert.Contains("<html", html);
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~DuplicatesServiceTests" -x
+```
+
+Expected: 5 real tests pass (MakeKey logic tests), CSOM stubs skip
+
+## Verification
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~DuplicatesServiceTests|FullyQualifiedName~StorageServiceTests" -x
+```
+
+Expected: 0 build errors; the 7 DuplicatesServiceTests and StorageServiceTests pass or skip with no CS errors
+
+> **Note on unfiltered test run at Wave 0:** Running `dotnet test` without a filter at this stage will show approximately 15 failing tests across `StorageCsvExportServiceTests`, `StorageHtmlExportServiceTests`, `SearchExportServiceTests`, and `DuplicatesHtmlExportServiceTests`. This is expected — all 5 export service stubs return `string.Empty` until Plans 03-03 and 03-05 implement the real logic. Do not treat these failures as a blocker for Wave 0 completion.
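Read together, the MakeKey tests above pin down the grouping step Plan 03-04 must implement: compute the composite key per item, group on it, and keep only groups with at least two members. A minimal LINQ sketch of that step, using plain strings in place of `DuplicateItem` (the names here are illustrative, not the real service API):

```csharp
using System;
using System.Linq;

class GroupingSketch
{
    static void Main()
    {
        // Keys as MakeKey would produce them: lowercased name, then "|"-joined criteria
        var keys = new[] { "report.docx|1024", "report.docx|1024", "photo.jpg|500" };

        // Keep only composite keys that occur at least twice -> duplicate groups
        var duplicateGroups = keys
            .GroupBy(k => k)
            .Where(g => g.Count() >= 2)
            .ToList();

        Console.WriteLine(duplicateGroups.Count);    // 1
        Console.WriteLine(duplicateGroups[0].Key);   // report.docx|1024
    }
}
```

Case-insensitivity falls out of `MakeKey` lowercasing the name before the key is built, so the grouping itself needs no custom comparer.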
+ +## Commit Message +feat(03-01): create Phase 3 models, interfaces, export stubs, and test scaffolds + +## Output + +After completion, create `.planning/phases/03-storage/03-01-SUMMARY.md` diff --git a/.planning/phases/03-storage/03-02-PLAN.md b/.planning/phases/03-storage/03-02-PLAN.md new file mode 100644 index 0000000..cba5725 --- /dev/null +++ b/.planning/phases/03-storage/03-02-PLAN.md @@ -0,0 +1,246 @@ +--- +phase: 03 +plan: 02 +title: StorageService — CSOM StorageMetrics Scan Engine +status: pending +wave: 1 +depends_on: + - 03-01 +files_modified: + - SharepointToolbox/Services/StorageService.cs +autonomous: true +requirements: + - STOR-01 + - STOR-02 + - STOR-03 + +must_haves: + truths: + - "StorageService implements IStorageService and is registered in DI (added in Plan 03-07)" + - "CollectStorageAsync returns one StorageNode per document library at IndentLevel=0, with correct TotalSizeBytes, FileStreamSizeBytes, VersionSizeBytes, TotalFileCount, and LastModified" + - "With FolderDepth>0, child StorageNodes are recursively populated and appear at IndentLevel=1+" + - "VersionSizeBytes = TotalSizeBytes - FileStreamSizeBytes (never negative)" + - "All CSOM round-trips use ExecuteQueryRetryHelper.ExecuteQueryRetryAsync — no direct ctx.ExecuteQueryAsync calls" + - "System/hidden lists are skipped (Hidden=true or BaseType != DocumentLibrary)" + - "ct.ThrowIfCancellationRequested() is called at the top of every recursive step" + artifacts: + - path: "SharepointToolbox/Services/StorageService.cs" + provides: "CSOM scan engine — IStorageService implementation" + exports: ["StorageService"] + key_links: + - from: "StorageService.cs" + to: "ExecuteQueryRetryHelper.ExecuteQueryRetryAsync" + via: "every CSOM load" + pattern: "ExecuteQueryRetryHelper\\.ExecuteQueryRetryAsync" + - from: "StorageService.cs" + to: "folder.StorageMetrics" + via: "ctx.Load include expression" + pattern: "StorageMetrics" +--- + +# Plan 03-02: StorageService — CSOM StorageMetrics Scan Engine 
+
+## Goal
+
+Implement `StorageService` — the C# port of the PowerShell `Get-PnPFolderStorageMetric` / `Collect-FolderStorage` pattern. It loads `Folder.StorageMetrics` for each document library on a site (and optionally recurses into subfolders up to a configurable depth), returning a flat list of `StorageNode` objects that the ViewModel will display in a `DataGrid`.
+
+## Context
+
+Plan 03-01 created `StorageNode`, `StorageScanOptions`, and `IStorageService`. This plan creates the only concrete implementation. The service receives an already-authenticated `ClientContext` from the ViewModel (obtained via `ISessionManager.GetOrCreateContextAsync`) — it never calls SessionManager itself.
+
+Critical loading pattern: `ctx.Load(folder, f => f.StorageMetrics, f => f.TimeLastModified, f => f.Name, f => f.ServerRelativeUrl)` — if `StorageMetrics` is not in the Load expression, `folder.StorageMetrics.TotalSize` throws `PropertyOrFieldNotInitializedException`.
+
+The `VersionSizeBytes` derived property is already on `StorageNode` (`TotalSizeBytes - FileStreamSizeBytes`). StorageService only needs to populate `TotalSizeBytes` and `FileStreamSizeBytes`.
+
+## Tasks
+
+### Task 1: Implement StorageService
+
+**File:** `SharepointToolbox/Services/StorageService.cs`
+
+**Action:** Create
+
+**Why:** Implements STOR-01, STOR-02, STOR-03. Single file, single concern — no helper changes needed.
+
+```csharp
+using Microsoft.SharePoint.Client;
+using SharepointToolbox.Core.Helpers;
+using SharepointToolbox.Core.Models;
+
+namespace SharepointToolbox.Services;
+
+/// <summary>
+/// CSOM-based storage metrics scanner.
+/// Port of PowerShell Collect-FolderStorage / Get-PnPFolderStorageMetric pattern.
+/// </summary>
+public class StorageService : IStorageService
+{
+    public async Task<IReadOnlyList<StorageNode>> CollectStorageAsync(
+        ClientContext ctx,
+        StorageScanOptions options,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        ct.ThrowIfCancellationRequested();
+
+        // Load web-level metadata in one round-trip
+        ctx.Load(ctx.Web,
+            w => w.Title,
+            w => w.Url,
+            w => w.ServerRelativeUrl,
+            w => w.Lists.Include(
+                l => l.Title,
+                l => l.Hidden,
+                l => l.BaseType,
+                l => l.RootFolder.ServerRelativeUrl));
+        await ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+        string webSrl = ctx.Web.ServerRelativeUrl.TrimEnd('/');
+        string siteTitle = ctx.Web.Title;
+
+        var result = new List<StorageNode>();
+        var libs = ctx.Web.Lists
+            .Where(l => !l.Hidden && l.BaseType == BaseType.DocumentLibrary)
+            .ToList();
+
+        int idx = 0;
+        foreach (var lib in libs)
+        {
+            ct.ThrowIfCancellationRequested();
+            idx++;
+            progress.Report(new OperationProgress(idx, libs.Count,
+                $"Loading storage metrics: {lib.Title} ({idx}/{libs.Count})"));
+
+            var libNode = await LoadFolderNodeAsync(
+                ctx, lib.RootFolder.ServerRelativeUrl, lib.Title,
+                siteTitle, lib.Title, 0, progress, ct);
+
+            if (options.FolderDepth > 0)
+            {
+                await CollectSubfoldersAsync(
+                    ctx, lib.RootFolder.ServerRelativeUrl,
+                    libNode, 1, options.FolderDepth,
+                    siteTitle, lib.Title, progress, ct);
+            }
+
+            result.Add(libNode);
+        }
+
+        return result;
+    }
+
+    // ── Private helpers ──────────────────────────────────────────────────────
+
+    private static async Task<StorageNode> LoadFolderNodeAsync(
+        ClientContext ctx,
+        string serverRelativeUrl,
+        string name,
+        string siteTitle,
+        string library,
+        int indentLevel,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        ct.ThrowIfCancellationRequested();
+
+        Folder folder = ctx.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
+        ctx.Load(folder,
+            f => f.StorageMetrics,
+            f => f.TimeLastModified,
+            f => f.ServerRelativeUrl,
+            f => f.Name);
+        await ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+        DateTime? lastMod = folder.StorageMetrics.LastModified > DateTime.MinValue
+            ? folder.StorageMetrics.LastModified
+            : folder.TimeLastModified > DateTime.MinValue
+                ? folder.TimeLastModified
+                : (DateTime?)null;
+
+        return new StorageNode
+        {
+            Name = name,
+            Url = ctx.Url.TrimEnd('/') + serverRelativeUrl,
+            SiteTitle = siteTitle,
+            Library = library,
+            TotalSizeBytes = folder.StorageMetrics.TotalSize,
+            FileStreamSizeBytes = folder.StorageMetrics.TotalFileStreamSize,
+            TotalFileCount = folder.StorageMetrics.TotalFileCount,
+            LastModified = lastMod,
+            IndentLevel = indentLevel,
+            Children = new List<StorageNode>()
+        };
+    }
+
+    private static async Task CollectSubfoldersAsync(
+        ClientContext ctx,
+        string parentServerRelativeUrl,
+        StorageNode parentNode,
+        int currentDepth,
+        int maxDepth,
+        string siteTitle,
+        string library,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        if (currentDepth > maxDepth) return;
+        ct.ThrowIfCancellationRequested();
+
+        // Load direct child folders of this folder
+        Folder parentFolder = ctx.Web.GetFolderByServerRelativeUrl(parentServerRelativeUrl);
+        ctx.Load(parentFolder,
+            f => f.Folders.Include(
+                sf => sf.Name,
+                sf => sf.ServerRelativeUrl));
+        await ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+        foreach (Folder subFolder in parentFolder.Folders)
+        {
+            ct.ThrowIfCancellationRequested();
+
+            // Skip SharePoint system folders
+            if (subFolder.Name.Equals("Forms", StringComparison.OrdinalIgnoreCase) ||
+                subFolder.Name.StartsWith("_", StringComparison.Ordinal))
+                continue;
+
+            var childNode = await LoadFolderNodeAsync(
+                ctx, subFolder.ServerRelativeUrl, subFolder.Name,
+                siteTitle, library, currentDepth, progress, ct);
+
+            if (currentDepth < maxDepth)
+            {
+                await CollectSubfoldersAsync(
+                    ctx, subFolder.ServerRelativeUrl, childNode,
+                    currentDepth + 1, maxDepth,
+                    siteTitle, library, progress, ct);
+            }
+
+            parentNode.Children.Add(childNode);
+        }
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~StorageServiceTests" -x
+```
+
+Expected: 0 build errors; 2 pure-logic tests pass (VersionSizeBytes), 2 CSOM stubs skip
+
+## Verification
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+```
+
+Expected: 0 errors. `StorageService` implements `IStorageService` (grep: `class StorageService : IStorageService`). `ExecuteQueryRetryHelper.ExecuteQueryRetryAsync` is called for every folder load (grep verifiable).
+
+## Commit Message
+feat(03-02): implement StorageService CSOM StorageMetrics scan engine
+
+## Output
+
+After completion, create `.planning/phases/03-storage/03-02-SUMMARY.md`
diff --git a/.planning/phases/03-storage/03-03-PLAN.md b/.planning/phases/03-storage/03-03-PLAN.md
new file mode 100644
index 0000000..8bb75c0
--- /dev/null
+++ b/.planning/phases/03-storage/03-03-PLAN.md
@@ -0,0 +1,340 @@
+---
+phase: 03
+plan: 03
+title: Storage Export Services — CSV and Collapsible-Tree HTML
+status: pending
+wave: 2
+depends_on:
+  - 03-02
+files_modified:
+  - SharepointToolbox/Services/Export/StorageCsvExportService.cs
+  - SharepointToolbox/Services/Export/StorageHtmlExportService.cs
+autonomous: true
+requirements:
+  - STOR-04
+  - STOR-05
+
+must_haves:
+  truths:
+    - "StorageCsvExportService.BuildCsv produces a UTF-8 BOM CSV with header: Library, Site, Files, Total Size (MB), Version Size (MB), Last Modified"
+    - "StorageCsvExportService.BuildCsv includes one row per StorageNode (flattened, respects IndentLevel for Library name prefix)"
+    - "StorageHtmlExportService.BuildHtml produces a self-contained HTML file with inline CSS and JS — no external dependencies"
+    - "StorageHtmlExportService.BuildHtml includes toggle(i) JS and collapsible subfolder rows (sf-{i} IDs)"
+    - "StorageCsvExportServiceTests: all 3 tests pass"
+    -
"StorageHtmlExportServiceTests: all 3 tests pass" + artifacts: + - path: "SharepointToolbox/Services/Export/StorageCsvExportService.cs" + provides: "CSV exporter for StorageNode list (STOR-04)" + exports: ["StorageCsvExportService"] + - path: "SharepointToolbox/Services/Export/StorageHtmlExportService.cs" + provides: "Collapsible-tree HTML exporter for StorageNode list (STOR-05)" + exports: ["StorageHtmlExportService"] + key_links: + - from: "StorageCsvExportService.cs" + to: "StorageNode.VersionSizeBytes" + via: "computed property" + pattern: "VersionSizeBytes" + - from: "StorageHtmlExportService.cs" + to: "toggle(i) JS" + via: "inline script" + pattern: "toggle\\(" +--- + +# Plan 03-03: Storage Export Services — CSV and Collapsible-Tree HTML + +## Goal + +Replace the stub implementations in `StorageCsvExportService` and `StorageHtmlExportService` with real implementations. The CSV export produces a flat UTF-8 BOM CSV compatible with Excel. The HTML export ports the PowerShell `Export-StorageToHTML` function (PS lines 1621-1780), producing a self-contained HTML file with a collapsible tree view driven by an inline `toggle(i)` JavaScript function. + +## Context + +Plan 03-01 created stub `BuildCsv`/`BuildHtml` methods returning `string.Empty`. This plan fills them in. The test files `StorageCsvExportServiceTests.cs` and `StorageHtmlExportServiceTests.cs` already exist and define the expected output — they currently fail because of the stubs. + +Pattern reference: Phase 2 `CsvExportService` uses UTF-8 BOM + RFC 4180 quoting. The same `Csv()` helper pattern is applied here. `StorageHtmlExportService` uses a `_togIdx` counter reset at the start of each `BuildHtml` call (per the PS pattern) to generate unique IDs for collapsible rows. 
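The BOM requirement inherited from the Phase 2 `CsvExportService` pattern can be checked in isolation: only a `UTF8Encoding` constructed with `encoderShouldEmitUTF8Identifier: true` emits the EF BB BF preamble Excel relies on to detect UTF-8. A standalone sketch (illustrative only, not project code):

```csharp
using System;
using System.Text;

class BomSketch
{
    static void Main()
    {
        // encoderShouldEmitUTF8Identifier: true -> writers prepend the BOM
        var withBom = new UTF8Encoding(encoderShouldEmitUTF8Identifier: true);
        byte[] preamble = withBom.GetPreamble();
        Console.WriteLine(BitConverter.ToString(preamble)); // EF-BB-BF

        // The parameterless UTF8Encoding reports no preamble, which is why
        // File.WriteAllTextAsync must receive the BOM-emitting encoding explicitly.
        Console.WriteLine(new UTF8Encoding(false).GetPreamble().Length); // 0
    }
}
```

This is why the CSV `WriteAsync` in Task 1 passes the encoding rather than relying on a default.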
+
+## Tasks
+
+### Task 1: Implement StorageCsvExportService
+
+**File:** `SharepointToolbox/Services/Export/StorageCsvExportService.cs`
+
+**Action:** Modify (replace stub with full implementation)
+
+**Why:** STOR-04 — user can export storage metrics to CSV.
+
+```csharp
+using SharepointToolbox.Core.Models;
+using System.Text;
+
+namespace SharepointToolbox.Services.Export;
+
+/// <summary>
+/// Exports a flat list of StorageNode objects to a UTF-8 BOM CSV.
+/// Compatible with Microsoft Excel (BOM signals UTF-8 encoding).
+/// </summary>
+public class StorageCsvExportService
+{
+    public string BuildCsv(IReadOnlyList<StorageNode> nodes)
+    {
+        var sb = new StringBuilder();
+
+        // Header
+        sb.AppendLine("Library,Site,Files,Total Size (MB),Version Size (MB),Last Modified");
+
+        foreach (var node in nodes)
+        {
+            sb.AppendLine(string.Join(",",
+                Csv(node.Name),
+                Csv(node.SiteTitle),
+                node.TotalFileCount.ToString(),
+                FormatMb(node.TotalSizeBytes),
+                FormatMb(node.VersionSizeBytes),
+                node.LastModified.HasValue
+                    ? Csv(node.LastModified.Value.ToString("yyyy-MM-dd"))
+                    : string.Empty));
+        }
+
+        return sb.ToString();
+    }
+
+    public async Task WriteAsync(IReadOnlyList<StorageNode> nodes, string filePath, CancellationToken ct)
+    {
+        var csv = BuildCsv(nodes);
+        // UTF-8 with BOM for Excel compatibility
+        await File.WriteAllTextAsync(filePath, csv, new UTF8Encoding(encoderShouldEmitUTF8Identifier: true), ct);
+    }
+
+    // ── Helpers ──────────────────────────────────────────────────────────────
+
+    private static string FormatMb(long bytes)
+        => (bytes / (1024.0 * 1024.0)).ToString("F2");
+
+    /// <summary>RFC 4180 CSV field quoting.</summary>
+    private static string Csv(string value)
+    {
+        if (string.IsNullOrEmpty(value)) return string.Empty;
+        if (value.Contains(',') || value.Contains('"') || value.Contains('\n'))
+            return $"\"{value.Replace("\"", "\"\"")}\"";
+        return value;
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~StorageCsvExportServiceTests" -x
+```
+
+Expected: 3 tests pass
+
+### Task 2: Implement StorageHtmlExportService
+
+**File:** `SharepointToolbox/Services/Export/StorageHtmlExportService.cs`
+
+**Action:** Modify (replace stub with full implementation)
+
+**Why:** STOR-05 — user can export storage metrics to interactive HTML with collapsible tree view.
+
+```csharp
+using SharepointToolbox.Core.Models;
+using System.Text;
+
+namespace SharepointToolbox.Services.Export;
+
+/// <summary>
+/// Exports StorageNode tree to a self-contained HTML file with collapsible subfolder rows.
+/// Port of PS Export-StorageToHTML (PS lines 1621-1780).
+/// Uses a toggle(i) JS pattern where each collapsible row has id="sf-{i}".
+/// </summary>
+public class StorageHtmlExportService
+{
+    private int _togIdx;
+
+    public string BuildHtml(IReadOnlyList<StorageNode> nodes)
+    {
+        _togIdx = 0;
+        var sb = new StringBuilder();
+
+        sb.AppendLine("""
+            <!DOCTYPE html>
+            <html>
+            <head>
+            <meta charset="utf-8">
+            <title>SharePoint Storage Metrics</title>
+            <style>
+                body { font-family: 'Segoe UI', sans-serif; margin: 16px; }
+                table { border-collapse: collapse; width: 100%; }
+                th, td { border: 1px solid #ddd; padding: 4px 8px; text-align: left; }
+                th { background: #f0f0f0; }
+                .sub { display: none; }
+            </style>
+            <script>
+                function toggle(i) {
+                    var row = document.getElementById('sf-' + i);
+                    row.style.display = row.style.display === 'table-row' ? 'none' : 'table-row';
+                }
+            </script>
+            </head>
+            <body>
+            <h1>SharePoint Storage Metrics</h1>
+            """);
+
+        sb.AppendLine("""
+            <table>
+            <thead>
+            <tr>
+                <th>Library / Folder</th>
+                <th>Site</th>
+                <th>Files</th>
+                <th>Total Size</th>
+                <th>Version Size</th>
+                <th>Last Modified</th>
+            </tr>
+            </thead>
+            <tbody>
+            """);
+
+        foreach (var node in nodes)
+        {
+            RenderNode(sb, node);
+        }
+
+        sb.AppendLine("""
+            </tbody>
+            </table>
+            """);
+
+        sb.AppendLine($"<p>Generated: {DateTime.Now:yyyy-MM-dd HH:mm}</p>");
+        sb.AppendLine("</body></html>");
+
+        return sb.ToString();
+    }
+
+    public async Task WriteAsync(IReadOnlyList<StorageNode> nodes, string filePath, CancellationToken ct)
+    {
+        var html = BuildHtml(nodes);
+        await File.WriteAllTextAsync(filePath, html, Encoding.UTF8, ct);
+    }
+
+    // ── Private rendering ────────────────────────────────────────────────────
+
+    private void RenderNode(StringBuilder sb, StorageNode node)
+    {
+        bool hasChildren = node.Children.Count > 0;
+        int myIdx = hasChildren ? ++_togIdx : 0;
+
+        string nameCell = hasChildren
+            ? $"<a href=\"javascript:toggle({myIdx})\">{HtmlEncode(node.Name)}</a>"
+            : HtmlEncode(node.Name);
+
+        string lastMod = node.LastModified.HasValue
+            ? node.LastModified.Value.ToString("yyyy-MM-dd")
+            : string.Empty;
+
+        sb.AppendLine($"""
+            <tr>
+                <td>{nameCell}</td>
+                <td>{HtmlEncode(node.SiteTitle)}</td>
+                <td>{node.TotalFileCount:N0}</td>
+                <td>{FormatSize(node.TotalSizeBytes)}</td>
+                <td>{FormatSize(node.VersionSizeBytes)}</td>
+                <td>{lastMod}</td>
+            </tr>
+            """);
+
+        if (hasChildren)
+        {
+            sb.AppendLine($"<tr id=\"sf-{myIdx}\" class=\"sub\"><td colspan=\"6\">");
+            sb.AppendLine("<table>");
+            foreach (var child in node.Children)
+            {
+                RenderChildNode(sb, child);
+            }
+            sb.AppendLine("</table>");
+            sb.AppendLine("</td></tr>");
+        }
+    }
+
+    private void RenderChildNode(StringBuilder sb, StorageNode node)
+    {
+        bool hasChildren = node.Children.Count > 0;
+        int myIdx = hasChildren ? ++_togIdx : 0;
+
+        string indent = $"margin-left:{(node.IndentLevel + 1) * 16}px";
+        string nameCell = hasChildren
+            ? $"<a style=\"{indent}\" href=\"javascript:toggle({myIdx})\">{HtmlEncode(node.Name)}</a>"
+            : $"<span style=\"{indent}\">{HtmlEncode(node.Name)}</span>";
+
+        string lastMod = node.LastModified.HasValue
+            ? node.LastModified.Value.ToString("yyyy-MM-dd")
+            : string.Empty;
+
+        sb.AppendLine($"""
+            <tr>
+                <td>{nameCell}</td>
+                <td>{HtmlEncode(node.SiteTitle)}</td>
+                <td>{node.TotalFileCount:N0}</td>
+                <td>{FormatSize(node.TotalSizeBytes)}</td>
+                <td>{FormatSize(node.VersionSizeBytes)}</td>
+                <td>{lastMod}</td>
+            </tr>
+            """);
+
+        if (hasChildren)
+        {
+            sb.AppendLine($"<tr id=\"sf-{myIdx}\" class=\"sub\"><td colspan=\"6\">");
+            sb.AppendLine("<table>");
+            foreach (var child in node.Children)
+            {
+                RenderChildNode(sb, child);
+            }
+            sb.AppendLine("</table>");
+            sb.AppendLine("</td></tr>");
+        }
+    }
+
+    private static string FormatSize(long bytes)
+    {
+        if (bytes >= 1_073_741_824L) return $"{bytes / 1_073_741_824.0:F2} GB";
+        if (bytes >= 1_048_576L) return $"{bytes / 1_048_576.0:F2} MB";
+        if (bytes >= 1024L) return $"{bytes / 1024.0:F2} KB";
+        return $"{bytes} B";
+    }
+
+    private static string HtmlEncode(string value)
+        => System.Net.WebUtility.HtmlEncode(value ?? string.Empty);
+}
+```
+
+**Verification:**
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~StorageHtmlExportServiceTests" -x
+```
+
+Expected: 3 tests pass
+
+## Verification
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~StorageCsvExportServiceTests|FullyQualifiedName~StorageHtmlExportServiceTests" -x
+```
+
+Expected: 6 tests pass, 0 fail
+
+## Commit Message
+feat(03-03): implement StorageCsvExportService and StorageHtmlExportService
+
+## Output
+
+After completion, create `.planning/phases/03-storage/03-03-SUMMARY.md`
diff --git a/.planning/phases/03-storage/03-04-PLAN.md b/.planning/phases/03-storage/03-04-PLAN.md
new file mode 100644
index 0000000..0dc61bd
--- /dev/null
+++ b/.planning/phases/03-storage/03-04-PLAN.md
@@ -0,0 +1,572 @@
+---
+phase: 03
+plan: 04
+title: SearchService and DuplicatesService — KQL Pagination and Duplicate Grouping
+status: pending
+wave: 2
+depends_on:
+  - 03-01
+files_modified:
+  - SharepointToolbox/Services/SearchService.cs
+  - SharepointToolbox/Services/DuplicatesService.cs
+autonomous: true
+requirements:
+  - SRCH-01
+  - SRCH-02
+  - DUPL-01
+  - DUPL-02
+
+must_haves:
+  truths:
+    - "SearchService implements ISearchService and builds KQL from all SearchOptions fields (extension, dates, creator, editor, library)"
+    - "SearchService paginates StartRow += 500 and stops when StartRow > 50,000 (platform cap) or MaxResults reached"
+ - "SearchService filters out _vti_history/ paths from results" + - "SearchService applies client-side Regex filter when SearchOptions.Regex is non-empty" + - "DuplicatesService implements IDuplicatesService for both Mode=Files (Search API) and Mode=Folders (CAML FSObjType=1)" + - "DuplicatesService groups items by MakeKey composite key and returns only groups with count >= 2" + - "All CSOM round-trips use ExecuteQueryRetryHelper.ExecuteQueryRetryAsync" + - "Folder enumeration uses SharePointPaginationHelper.GetAllItemsAsync with FSObjType=1 CAML" + artifacts: + - path: "SharepointToolbox/Services/SearchService.cs" + provides: "KQL search engine with pagination (SRCH-01/02)" + exports: ["SearchService"] + - path: "SharepointToolbox/Services/DuplicatesService.cs" + provides: "Duplicate detection for files and folders (DUPL-01/02)" + exports: ["DuplicatesService"] + key_links: + - from: "SearchService.cs" + to: "KeywordQuery + SearchExecutor" + via: "Microsoft.SharePoint.Client.Search.Query" + pattern: "KeywordQuery" + - from: "DuplicatesService.cs" + to: "SharePointPaginationHelper.GetAllItemsAsync" + via: "folder enumeration" + pattern: "SharePointPaginationHelper\\.GetAllItemsAsync" + - from: "DuplicatesService.cs" + to: "MakeKey" + via: "composite key grouping" + pattern: "MakeKey" +--- + +# Plan 03-04: SearchService and DuplicatesService — KQL Pagination and Duplicate Grouping + +## Goal + +Implement `SearchService` (KQL-based file search with 500-row pagination and 50,000 hard cap) and `DuplicatesService` (file duplicates via Search API + folder duplicates via CAML `FSObjType=1`). Both services are wave 2 — they depend only on the models and interfaces from Plan 03-01, not on StorageService. + +## Context + +`Microsoft.SharePoint.Client.Search.dll` is available as a transitive dependency of PnP.Framework 1.18.0. The namespace is `Microsoft.SharePoint.Client.Search.Query`. 
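The pagination truths above fix the arithmetic: 500-row batches under a 50,000-row StartRow cap yield at most 100 full pages. A standalone sketch of that loop bound (illustrative only, not the service code):

```csharp
using System;

class PagingSketch
{
    static void Main()
    {
        const int BatchSize = 500;
        const int MaxStartRow = 50_000;

        // StartRow values issued by the loop: 0, 500, 1000, ... while a full
        // batch still fits under the cap
        int pages = 0;
        for (int startRow = 0; startRow + BatchSize <= MaxStartRow; startRow += BatchSize)
            pages++;

        Console.WriteLine(pages); // 100 pages x 500 rows = 50,000 results max
    }
}
```

This is the same bound the DocId-cursor extension point in Task 1 exists to lift.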
The search pattern requires calling `executor.ExecuteQuery(kq)` to register the query, then `ExecuteQueryRetryHelper.ExecuteQueryRetryAsync` to execute it — calling `ctx.ExecuteQuery()` directly afterward is incorrect and must be avoided.
+
+`DuplicatesService` for folders uses `SharePointPaginationHelper.GetAllItemsAsync` with `FSObjType=1` CAML. The CAML field name is `FSObjType` (not `FileSystemObjectType`) — using the wrong name silently returns zero results.
+
+The `MakeKey` composite key logic tested in Plan 03-01 `DuplicatesServiceTests` must match exactly what `DuplicatesService` implements.
+
+## Tasks
+
+### Task 1: Implement SearchService
+
+**File:** `SharepointToolbox/Services/SearchService.cs`
+
+**Action:** Create
+
+**Why:** SRCH-01 (multi-criteria search) and SRCH-02 (configurable max results up to 50,000).
+
+```csharp
+using Microsoft.SharePoint.Client;
+using Microsoft.SharePoint.Client.Search.Query;
+using SharepointToolbox.Core.Helpers;
+using SharepointToolbox.Core.Models;
+using System.Text.RegularExpressions;
+
+namespace SharepointToolbox.Services;
+
+/// <summary>
+/// File search using SharePoint KQL Search API.
+/// Port of PS Search-SPOFiles pattern (PS lines 4747-4987).
+/// Pagination: 500 rows per batch, hard cap StartRow=50,000 (SharePoint Search boundary).
+/// </summary>
+public class SearchService : ISearchService
+{
+    private const int BatchSize = 500;
+    private const int MaxStartRow = 50_000;
+
+    public async Task<IReadOnlyList<SearchResult>> SearchFilesAsync(
+        ClientContext ctx,
+        SearchOptions options,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        ct.ThrowIfCancellationRequested();
+
+        string kql = BuildKql(options);
+        ValidateKqlLength(kql);
+
+        Regex?
regexFilter = null;
+        if (!string.IsNullOrWhiteSpace(options.Regex))
+        {
+            regexFilter = new Regex(options.Regex,
+                RegexOptions.IgnoreCase | RegexOptions.Compiled,
+                TimeSpan.FromSeconds(2));
+        }
+
+        var allResults = new List<SearchResult>();
+        int startRow = 0;
+        int maxResults = Math.Min(options.MaxResults, MaxStartRow);
+
+        do
+        {
+            ct.ThrowIfCancellationRequested();
+
+            var kq = new KeywordQuery(ctx)
+            {
+                QueryText = kql,
+                StartRow = startRow,
+                RowLimit = BatchSize,
+                TrimDuplicates = false
+            };
+            foreach (var prop in new[]
+            {
+                "Title", "Path", "Author", "LastModifiedTime",
+                "FileExtension", "Created", "ModifiedBy", "Size"
+            })
+            {
+                kq.SelectProperties.Add(prop);
+            }
+
+            var executor = new SearchExecutor(ctx);
+            ClientResult<ResultTableCollection> clientResult = executor.ExecuteQuery(kq);
+            await ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+            var table = clientResult.Value
+                .FirstOrDefault(t => t.TableType == KnownTableTypes.RelevantResults.ToString());
+            if (table == null || table.RowCount == 0) break;
+
+            // CSOM returns each result row as IDictionary<string, object>
+            foreach (IDictionary<string, object> row in table.ResultRows)
+            {
+                var dict = row.ToDictionary(e => e.Key, e => e.Value ??
(object)string.Empty); + + // Skip SharePoint version history paths + string path = Str(dict, "Path"); + if (path.Contains("/_vti_history/", StringComparison.OrdinalIgnoreCase)) + continue; + + var result = ParseRow(dict); + + // Client-side Regex filter on file name + if (regexFilter != null) + { + string fileName = System.IO.Path.GetFileName(result.Path); + if (!regexFilter.IsMatch(fileName) && !regexFilter.IsMatch(result.Title)) + continue; + } + + allResults.Add(result); + if (allResults.Count >= maxResults) goto done; + } + + progress.Report(new OperationProgress(allResults.Count, maxResults, + $"Retrieved {allResults.Count:N0} results…")); + + startRow += BatchSize; + } + while (startRow <= MaxStartRow && allResults.Count < maxResults); + + done: + return allResults; + } + + // ── Extension point: bypassing the 50,000-item cap ─────────────────────── + // + // The StartRow approach has a hard ceiling at 50,000 (SharePoint Search boundary). + // To go beyond it, replace the StartRow loop with a DocId cursor: + // + // 1. Add "DocId" to SelectProperties. + // 2. Add query.SortList.Add("DocId", SortDirection.Ascending). + // 3. First page KQL: unchanged. + // Subsequent pages: append "AND DocId>{lastDocId}" to the KQL (StartRow stays 0). + // 4. Track lastDocId = Convert.ToInt64(lastRow["DocId"]) after each batch. + // 5. Stop when batch.RowCount < BatchSize. + // + // Caveats: + // - DocId is per-site-collection; for multi-site searches, maintain a separate + // cursor per ClientContext (site URL). + // - The search index can shift between batches (new items indexed mid-scan); + // the DocId cursor is safer than StartRow but cannot guarantee zero drift. + // - DocId is not returned by default — it must be in SelectProperties. + // + // This is deliberately not implemented here because SRCH-02 caps results at 50,000, + // which the StartRow approach already covers exactly (100 pages × 500 rows). 
+    // Implement the DocId cursor if the cap needs to be lifted in a future version.
+
+    // ── KQL builder ───────────────────────────────────────────────────────────
+
+    internal static string BuildKql(SearchOptions opts)
+    {
+        var parts = new List<string> { "ContentType:Document" };
+
+        if (opts.Extensions.Length > 0)
+        {
+            var extParts = opts.Extensions
+                .Select(e => $"FileExtension:{e.TrimStart('.').ToLowerInvariant()}");
+            parts.Add($"({string.Join(" OR ", extParts)})");
+        }
+        if (opts.CreatedAfter.HasValue)
+            parts.Add($"Created>={opts.CreatedAfter.Value:yyyy-MM-dd}");
+        if (opts.CreatedBefore.HasValue)
+            parts.Add($"Created<={opts.CreatedBefore.Value:yyyy-MM-dd}");
+        if (opts.ModifiedAfter.HasValue)
+            parts.Add($"Write>={opts.ModifiedAfter.Value:yyyy-MM-dd}");
+        if (opts.ModifiedBefore.HasValue)
+            parts.Add($"Write<={opts.ModifiedBefore.Value:yyyy-MM-dd}");
+        if (!string.IsNullOrEmpty(opts.CreatedBy))
+            parts.Add($"Author:\"{opts.CreatedBy}\"");
+        if (!string.IsNullOrEmpty(opts.ModifiedBy))
+            parts.Add($"ModifiedBy:\"{opts.ModifiedBy}\"");
+        if (!string.IsNullOrEmpty(opts.Library) && !string.IsNullOrEmpty(opts.SiteUrl))
+            parts.Add($"Path:\"{opts.SiteUrl.TrimEnd('/')}/{opts.Library.TrimStart('/')}*\"");
+
+        string kqlText = string.Join(" AND ", parts);
+        ValidateKqlLength(kqlText);
+        return kqlText;
+    }
+
+    private static void ValidateKqlLength(string kql)
+    {
+        // SharePoint Search KQL text hard cap is 4096 characters
+        if (kql.Length > 4096)
+            throw new InvalidOperationException(
+                $"KQL query exceeds 4096-character SharePoint Search limit ({kql.Length} chars). " +
+                "Reduce the number of extension filters.");
+    }
+
+    // ── Row parser ────────────────────────────────────────────────────────────
+
+    private static SearchResult ParseRow(IDictionary<string, object> row)
+    {
+        static string Str(IDictionary<string, object> r, string key) =>
+            r.TryGetValue(key, out var v) ? v?.ToString() ?? string.Empty : string.Empty;
+
+        static DateTime? Date(IDictionary<string, object> r, string key)
+        {
+            var s = Str(r, key);
+            return DateTime.TryParse(s, out var dt) ? 
dt : (DateTime?)null;
+        }
+
+        static long ParseSize(IDictionary<string, object> r, string key)
+        {
+            var raw = Str(r, key);
+            var digits = Regex.Replace(raw, "[^0-9]", "");
+            return long.TryParse(digits, out var v) ? v : 0L;
+        }
+
+        return new SearchResult
+        {
+            Title = Str(row, "Title"),
+            Path = Str(row, "Path"),
+            FileExtension = Str(row, "FileExtension"),
+            Created = Date(row, "Created"),
+            LastModified = Date(row, "LastModifiedTime"),
+            Author = Str(row, "Author"),
+            ModifiedBy = Str(row, "ModifiedBy"),
+            SizeBytes = ParseSize(row, "Size")
+        };
+    }
+
+    private static string Str(IDictionary<string, object> r, string key) =>
+        r.TryGetValue(key, out var v) ? v?.ToString() ?? string.Empty : string.Empty;
+}
+```
+
+**Verification:**
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~SearchServiceTests" -x
+```
+
+Expected: 0 build errors; CSOM-dependent tests skip
+
+### Task 2: Implement DuplicatesService
+
+**File:** `SharepointToolbox/Services/DuplicatesService.cs`
+
+**Action:** Create
+
+**Why:** DUPL-01 (file duplicates via Search API) and DUPL-02 (folder duplicates via CAML pagination).
+
+```csharp
+using Microsoft.SharePoint.Client;
+using Microsoft.SharePoint.Client.Search.Query;
+using SharepointToolbox.Core.Helpers;
+using SharepointToolbox.Core.Models;
+
+namespace SharepointToolbox.Services;
+
+/// 
+/// Duplicate file and folder detection.
+/// Files: Search API (same KQL engine as SearchService) + client-side composite key grouping.
+/// Folders: CSOM CAML FSObjType=1 via SharePointPaginationHelper + composite key grouping.
+/// Port of PS Find-DuplicateFiles / Find-DuplicateFolders (PS lines 4942-5036). 
+/// 
+public class DuplicatesService : IDuplicatesService
+{
+    private const int BatchSize = 500;
+    private const int MaxStartRow = 50_000;
+
+    public async Task<List<DuplicateGroup>> ScanDuplicatesAsync(
+        ClientContext ctx,
+        DuplicateScanOptions options,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        ct.ThrowIfCancellationRequested();
+
+        List<DuplicateItem> allItems;
+
+        if (options.Mode == "Folders")
+            allItems = await CollectFolderItemsAsync(ctx, options, progress, ct);
+        else
+            allItems = await CollectFileItemsAsync(ctx, options, progress, ct);
+
+        progress.Report(OperationProgress.Indeterminate($"Grouping {allItems.Count:N0} items by duplicate key…"));
+
+        var groups = allItems
+            .GroupBy(item => MakeKey(item, options))
+            .Where(g => g.Count() >= 2)
+            .Select(g => new DuplicateGroup
+            {
+                GroupKey = g.Key,
+                Name = g.First().Name,
+                Items = g.ToList()
+            })
+            .OrderByDescending(g => g.Items.Count)
+            .ThenBy(g => g.Name)
+            .ToList();
+
+        return groups;
+    }
+
+    // ── File collection via Search API ────────────────────────────────────────
+
+    private static async Task<List<DuplicateItem>> CollectFileItemsAsync(
+        ClientContext ctx,
+        DuplicateScanOptions options,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        // KQL: all documents, optionally scoped to a library
+        var kqlParts = new List<string> { "ContentType:Document" };
+        if (!string.IsNullOrEmpty(options.Library))
+            kqlParts.Add($"Path:\"{ctx.Url.TrimEnd('/')}/{options.Library.TrimStart('/')}*\"");
+        string kql = string.Join(" AND ", kqlParts);
+
+        var allItems = new List<DuplicateItem>();
+        int startRow = 0;
+
+        do
+        {
+            ct.ThrowIfCancellationRequested();
+
+            var kq = new KeywordQuery(ctx)
+            {
+                QueryText = kql,
+                StartRow = startRow,
+                RowLimit = BatchSize,
+                TrimDuplicates = false
+            };
+            kq.SelectProperties.AddRange(new[]
+            {
+                "Title", "Path", "FileExtension", "Created",
+                "LastModifiedTime", "Size", "ParentLink"
+            });
+
+            var executor = new SearchExecutor(ctx);
+            ClientResult<ResultTableCollection> clientResult = executor.ExecuteQuery(kq);
+            await 
ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+            var table = clientResult.Value
+                .FirstOrDefault(t => t.TableType == KnownTableTypes.RelevantResults);
+            if (table == null || table.RowCount == 0) break;
+
+            foreach (IDictionary<string, object> row in table.ResultRows)
+            {
+                var dict = row.ToDictionary(e => e.Key, e => e.Value ?? (object)string.Empty);
+
+                string path = GetStr(dict, "Path");
+                if (path.Contains("/_vti_history/", StringComparison.OrdinalIgnoreCase))
+                    continue;
+
+                string name = System.IO.Path.GetFileName(path);
+                if (string.IsNullOrEmpty(name))
+                    name = GetStr(dict, "Title");
+
+                string raw = GetStr(dict, "Size");
+                string digits = System.Text.RegularExpressions.Regex.Replace(raw, "[^0-9]", "");
+                long size = long.TryParse(digits, out var sv) ? sv : 0L;
+
+                DateTime? created = ParseDate(GetStr(dict, "Created"));
+                DateTime? modified = ParseDate(GetStr(dict, "LastModifiedTime"));
+
+                // Derive the library name from the first path segment after the site URL
+                string library = ExtractLibraryFromPath(path, ctx.Url);
+
+                allItems.Add(new DuplicateItem
+                {
+                    Name = name,
+                    Path = path,
+                    Library = library,
+                    SizeBytes = size,
+                    Created = created,
+                    Modified = modified
+                });
+            }
+
+            progress.Report(new OperationProgress(allItems.Count, MaxStartRow,
+                $"Collected {allItems.Count:N0} files…"));
+
+            startRow += BatchSize;
+        }
+        while (startRow <= MaxStartRow);
+
+        return allItems;
+    }
+
+    // ── Folder collection via CAML ────────────────────────────────────────────
+
+    private static async Task<List<DuplicateItem>> CollectFolderItemsAsync(
+        ClientContext ctx,
+        DuplicateScanOptions options,
+        IProgress<OperationProgress> progress,
+        CancellationToken ct)
+    {
+        // Load all document libraries on the site
+        ctx.Load(ctx.Web,
+            w => w.Lists.Include(
+                l => l.Title, l => l.Hidden, l => l.BaseType));
+        await ExecuteQueryRetryHelper.ExecuteQueryRetryAsync(ctx, progress, ct);
+
+        var libs = ctx.Web.Lists
+            .Where(l => !l.Hidden && 
l.BaseType == BaseType.DocumentLibrary)
+            .ToList();
+
+        // Filter to specific library if requested
+        if (!string.IsNullOrEmpty(options.Library))
+        {
+            libs = libs
+                .Where(l => l.Title.Equals(options.Library, StringComparison.OrdinalIgnoreCase))
+                .ToList();
+        }
+
+        var camlQuery = new CamlQuery
+        {
+            ViewXml = """
+                <View Scope="RecursiveAll">
+                  <Query>
+                    <Where>
+                      <Eq>
+                        <FieldRef Name="FSObjType" />
+                        <Value Type="Integer">1</Value>
+                      </Eq>
+                    </Where>
+                  </Query>
+                  <RowLimit Paged="TRUE">2000</RowLimit>
+                </View>
+                """
+        };
+
+        var allItems = new List<DuplicateItem>();
+
+        foreach (var lib in libs)
+        {
+            ct.ThrowIfCancellationRequested();
+            progress.Report(OperationProgress.Indeterminate($"Scanning folders in {lib.Title}…"));
+
+            await foreach (var item in SharePointPaginationHelper.GetAllItemsAsync(ctx, lib, camlQuery, ct))
+            {
+                ct.ThrowIfCancellationRequested();
+
+                var fv = item.FieldValues;
+                string name = fv["FileLeafRef"]?.ToString() ?? string.Empty;
+                string fileRef = fv["FileRef"]?.ToString() ?? string.Empty;
+                int subCount = Convert.ToInt32(fv["FolderChildCount"] ?? 0);
+                int childCount = Convert.ToInt32(fv["ItemChildCount"] ?? 0);
+                int fileCount = Math.Max(0, childCount - subCount);
+                DateTime? created = fv["Created"] is DateTime cr ? cr : (DateTime?)null;
+                DateTime? modified = fv["Modified"] is DateTime md ? 
md : (DateTime?)null;
+
+                allItems.Add(new DuplicateItem
+                {
+                    Name = name,
+                    Path = fileRef,
+                    Library = lib.Title,
+                    FolderCount = subCount,
+                    FileCount = fileCount,
+                    Created = created,
+                    Modified = modified
+                });
+            }
+        }
+
+        return allItems;
+    }
+
+    // ── Composite key builder (matches test scaffold in DuplicatesServiceTests) ──
+
+    internal static string MakeKey(DuplicateItem item, DuplicateScanOptions opts)
+    {
+        var parts = new List<string> { item.Name.ToLowerInvariant() };
+        if (opts.MatchSize && item.SizeBytes.HasValue) parts.Add(item.SizeBytes.Value.ToString());
+        if (opts.MatchCreated && item.Created.HasValue) parts.Add(item.Created.Value.Date.ToString("yyyy-MM-dd"));
+        if (opts.MatchModified && item.Modified.HasValue) parts.Add(item.Modified.Value.Date.ToString("yyyy-MM-dd"));
+        if (opts.MatchSubfolderCount && item.FolderCount.HasValue) parts.Add(item.FolderCount.Value.ToString());
+        if (opts.MatchFileCount && item.FileCount.HasValue) parts.Add(item.FileCount.Value.ToString());
+        return string.Join("|", parts);
+    }
+
+    // ── Private utilities ─────────────────────────────────────────────────────
+
+    private static string GetStr(IDictionary<string, object> r, string key) =>
+        r.TryGetValue(key, out var v) ? v?.ToString() ?? string.Empty : string.Empty;
+
+    private static DateTime? ParseDate(string s) =>
+        DateTime.TryParse(s, out var dt) ? dt : (DateTime?)null;
+
+    private static string ExtractLibraryFromPath(string path, string siteUrl)
+    {
+        // Extract first path segment after the site URL as library name
+        // e.g. https://tenant.sharepoint.com/sites/MySite/Shared Documents/file.docx -> "Shared Documents"
+        if (string.IsNullOrEmpty(path) || string.IsNullOrEmpty(siteUrl))
+            return string.Empty;
+
+        string relative = path.StartsWith(siteUrl.TrimEnd('/'), StringComparison.OrdinalIgnoreCase)
+            ? path.Substring(siteUrl.TrimEnd('/').Length).TrimStart('/')
+            : path;
+
+        int slash = relative.IndexOf('/');
+        return slash > 0 ? 
relative.Substring(0, slash) : relative; + } +} +``` + +**Verification:** + +```bash +dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~DuplicatesServiceTests" -x +``` + +Expected: 5 pure-logic tests pass (MakeKey), 2 CSOM stubs skip + +## Verification + +```bash +dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx +dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~SearchServiceTests|FullyQualifiedName~DuplicatesServiceTests" -x +``` + +Expected: 0 build errors; 5 MakeKey tests pass; CSOM stub tests skip; no compile errors + +## Commit Message +feat(03-04): implement SearchService KQL pagination and DuplicatesService composite key grouping + +## Output + +After completion, create `.planning/phases/03-storage/03-04-SUMMARY.md` diff --git a/.planning/phases/03-storage/03-05-PLAN.md b/.planning/phases/03-storage/03-05-PLAN.md new file mode 100644 index 0000000..cc3f911 --- /dev/null +++ b/.planning/phases/03-storage/03-05-PLAN.md @@ -0,0 +1,459 @@ +--- +phase: 03 +plan: 05 +title: Search and Duplicate Export Services — CSV, Sortable HTML, and Grouped HTML +status: pending +wave: 3 +depends_on: + - 03-04 +files_modified: + - SharepointToolbox/Services/Export/SearchCsvExportService.cs + - SharepointToolbox/Services/Export/SearchHtmlExportService.cs + - SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs +autonomous: true +requirements: + - SRCH-03 + - SRCH-04 + - DUPL-03 + +must_haves: + truths: + - "SearchCsvExportService.BuildCsv produces a UTF-8 BOM CSV with header: File Name, Extension, Path, Created, Created By, Modified, Modified By, Size (bytes)" + - "SearchHtmlExportService.BuildHtml produces a self-contained HTML with sortable columns (click-to-sort JS) and a filter/search input" + - "DuplicatesHtmlExportService.BuildHtml produces a self-contained 
HTML with one card per group, showing item paths, and an ok/diff badge indicating group size" + - "SearchExportServiceTests: all 6 tests pass" + - "DuplicatesHtmlExportServiceTests: all 3 tests pass" + artifacts: + - path: "SharepointToolbox/Services/Export/SearchCsvExportService.cs" + provides: "CSV exporter for SearchResult list (SRCH-03)" + exports: ["SearchCsvExportService"] + - path: "SharepointToolbox/Services/Export/SearchHtmlExportService.cs" + provides: "Sortable/filterable HTML exporter for SearchResult list (SRCH-04)" + exports: ["SearchHtmlExportService"] + - path: "SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs" + provides: "Grouped HTML exporter for DuplicateGroup list (DUPL-03)" + exports: ["DuplicatesHtmlExportService"] + key_links: + - from: "SearchHtmlExportService.cs" + to: "sortTable JS" + via: "inline script" + pattern: "sort" + - from: "DuplicatesHtmlExportService.cs" + to: "group card HTML" + via: "per-DuplicateGroup rendering" + pattern: "group" +--- + +# Plan 03-05: Search and Duplicate Export Services — CSV, Sortable HTML, and Grouped HTML + +## Goal + +Replace the three stub export implementations created in Plan 03-01 with real ones. `SearchCsvExportService` produces a UTF-8 BOM CSV. `SearchHtmlExportService` ports the PS `Export-SearchToHTML` pattern (PS lines 2112-2233) with sortable columns and a live filter input. `DuplicatesHtmlExportService` ports the PS `Export-DuplicatesToHTML` pattern (PS lines 2235-2406) with grouped cards and ok/diff badges. + +## Context + +Test files `SearchExportServiceTests.cs` and `DuplicatesHtmlExportServiceTests.cs` already exist from Plan 03-01 and currently fail because stubs return `string.Empty`. This plan makes them pass. + +All HTML exports are self-contained (no external CDN or CSS links) using the same `Segoe UI` font stack and `#0078d4` color palette established in Phase 2. 
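+
+As a sanity check for the BOM truth above, a minimal sketch of a test shape (assuming the xUnit test project and the `SearchCsvExportService`/`SearchResult` names listed in the artifacts; the real `SearchExportServiceTests` may assert differently):
+
+```csharp
+using System.Text;
+using Xunit;
+
+public class SearchCsvExportBomSketch
+{
+    [Fact]
+    public async Task WrittenCsv_StartsWithUtf8Bom_AndExpectedHeader()
+    {
+        var service = new SearchCsvExportService();
+        string path = Path.GetTempFileName();
+        await service.WriteAsync(Array.Empty<SearchResult>(), path, CancellationToken.None);
+
+        byte[] bytes = File.ReadAllBytes(path);
+        // UTF-8 BOM is the fixed 3-byte sequence EF BB BF
+        Assert.True(bytes.Length >= 3 && bytes[0] == 0xEF && bytes[1] == 0xBB && bytes[2] == 0xBF);
+
+        // StreamReader-based APIs strip the BOM, so the first line is the bare header
+        string header = File.ReadLines(path).First();
+        Assert.Equal("File Name,Extension,Path,Created,Created By,Modified,Modified By,Size (bytes)", header);
+    }
+}
+```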
+
+## Tasks
+
+### Task 1: Implement SearchCsvExportService and SearchHtmlExportService
+
+**Files:**
+- `SharepointToolbox/Services/Export/SearchCsvExportService.cs`
+- `SharepointToolbox/Services/Export/SearchHtmlExportService.cs`
+
+**Action:** Modify (replace stubs with full implementation)
+
+**Why:** SRCH-03 (CSV export) and SRCH-04 (sortable/filterable HTML export).
+
+```csharp
+// SharepointToolbox/Services/Export/SearchCsvExportService.cs
+using SharepointToolbox.Core.Models;
+using System.Text;
+
+namespace SharepointToolbox.Services.Export;
+
+/// <summary>
+/// Exports SearchResult list to a UTF-8 BOM CSV file.
+/// Header matches the column order in SearchHtmlExportService for consistency.
+/// </summary>
+public class SearchCsvExportService
+{
+    public string BuildCsv(IReadOnlyList<SearchResult> results)
+    {
+        var sb = new StringBuilder();
+
+        // Header
+        sb.AppendLine("File Name,Extension,Path,Created,Created By,Modified,Modified By,Size (bytes)");
+
+        foreach (var r in results)
+        {
+            sb.AppendLine(string.Join(",",
+                Csv(IfEmpty(System.IO.Path.GetFileName(r.Path), r.Title)),
+                Csv(r.FileExtension),
+                Csv(r.Path),
+                r.Created.HasValue ? Csv(r.Created.Value.ToString("yyyy-MM-dd")) : string.Empty,
+                Csv(r.Author),
+                r.LastModified.HasValue ? Csv(r.LastModified.Value.ToString("yyyy-MM-dd")) : string.Empty,
+                Csv(r.ModifiedBy),
+                r.SizeBytes.ToString()));
+        }
+
+        return sb.ToString();
+    }
+
+    public async Task WriteAsync(IReadOnlyList<SearchResult> results, string filePath, CancellationToken ct)
+    {
+        var csv = BuildCsv(results);
+        await File.WriteAllTextAsync(filePath, csv, new UTF8Encoding(encoderShouldEmitUTF8Identifier: true), ct);
+    }
+
+    private static string Csv(string value)
+    {
+        if (string.IsNullOrEmpty(value)) return string.Empty;
+        if (value.Contains(',') || value.Contains('"') || value.Contains('\n'))
+            return $"\"{value.Replace("\"", "\"\"")}\"";
+        return value;
+    }
+
+    private static string IfEmpty(string? value, string fallback = "")
+        => string.IsNullOrEmpty(value) ? 
fallback : value!;
+}
+```
+
+```csharp
+// SharepointToolbox/Services/Export/SearchHtmlExportService.cs
+using SharepointToolbox.Core.Models;
+using System.Text;
+
+namespace SharepointToolbox.Services.Export;
+
+/// <summary>
+/// Exports SearchResult list to a self-contained sortable/filterable HTML report.
+/// Port of PS Export-SearchToHTML (PS lines 2112-2233).
+/// Columns are sortable by clicking the header. A filter input narrows rows by text match.
+/// </summary>
+public class SearchHtmlExportService
+{
+    public string BuildHtml(IReadOnlyList<SearchResult> results)
+    {
+        var sb = new StringBuilder();
+
+        sb.AppendLine("""
+            <!DOCTYPE html>
+            <html lang="en">
+            <head>
+            <meta charset="utf-8">
+            <title>SharePoint File Search Results</title>
+            <style>
+                body { font-family: 'Segoe UI', sans-serif; margin: 20px; }
+                h1 { color: #0078d4; }
+                table { border-collapse: collapse; width: 100%; }
+                th { background: #0078d4; color: #fff; padding: 6px 10px; cursor: pointer; }
+                td { border-bottom: 1px solid #ddd; padding: 4px 10px; }
+                tr:hover { background: #f3f3f3; }
+                #filter { margin: 10px 0; padding: 6px; width: 320px; }
+                .footer { margin-top: 14px; color: #666; font-size: 12px; }
+            </style>
+            </head>
+            <body>
+            <h1>File Search Results</h1>
+            <input id="filter" type="text" placeholder="Filter results…" onkeyup="filterRows()">
+            """);
+
+        sb.AppendLine("""
+            <table id="results">
+            <thead>
+            <tr>
+                <th onclick="sortTable(0)">File Name</th>
+                <th onclick="sortTable(1)">Extension</th>
+                <th onclick="sortTable(2)">Path</th>
+                <th onclick="sortTable(3)">Created</th>
+                <th onclick="sortTable(4)">Created By</th>
+                <th onclick="sortTable(5)">Modified</th>
+                <th onclick="sortTable(6)">Modified By</th>
+                <th onclick="sortTable(7)">Size</th>
+            </tr>
+            </thead>
+            <tbody>
+            """);
+
+        foreach (var r in results)
+        {
+            string fileName = System.IO.Path.GetFileName(r.Path);
+            if (string.IsNullOrEmpty(fileName)) fileName = r.Title;
+
+            sb.AppendLine($"""
+                <tr>
+                    <td>{H(fileName)}</td>
+                    <td>{H(r.FileExtension)}</td>
+                    <td>{H(r.Path)}</td>
+                    <td>{(r.Created.HasValue ? r.Created.Value.ToString("yyyy-MM-dd") : string.Empty)}</td>
+                    <td>{H(r.Author)}</td>
+                    <td>{(r.LastModified.HasValue ? r.LastModified.Value.ToString("yyyy-MM-dd") : string.Empty)}</td>
+                    <td>{H(r.ModifiedBy)}</td>
+                    <td>{FormatSize(r.SizeBytes)}</td>
+                </tr>
+                """);
+        }
+
+        sb.AppendLine("  </tbody>\n</table>");
+
+        // Inline sort + filter JS
+        sb.AppendLine($$"""
+            <div class="footer">Generated: {{DateTime.Now:yyyy-MM-dd HH:mm}} — {{results.Count:N0}} result(s)</div>
+            <script>
+            function filterRows() {
+                var q = document.getElementById('filter').value.toLowerCase();
+                document.querySelectorAll('#results tbody tr').forEach(function (tr) {
+                    tr.style.display = tr.textContent.toLowerCase().indexOf(q) >= 0 ? '' : 'none';
+                });
+            }
+            function sortTable(col) {
+                var tbody = document.querySelector('#results tbody');
+                var rows = Array.from(tbody.rows);
+                var asc = tbody.dataset.sortCol == col ? tbody.dataset.sortDir != 'asc' : true;
+                rows.sort(function (a, b) {
+                    var x = a.cells[col].textContent, y = b.cells[col].textContent;
+                    var nx = parseFloat(x), ny = parseFloat(y);
+                    var cmp = (!isNaN(nx) && !isNaN(ny)) ? nx - ny : x.localeCompare(y);
+                    return asc ? cmp : -cmp;
+                });
+                rows.forEach(function (tr) { tbody.appendChild(tr); });
+                tbody.dataset.sortCol = col;
+                tbody.dataset.sortDir = asc ? 'asc' : 'desc';
+            }
+            </script>
+            </body>
+            </html>
+            """);
+
+        return sb.ToString();
+    }
+
+    public async Task WriteAsync(IReadOnlyList<SearchResult> results, string filePath, CancellationToken ct)
+    {
+        var html = BuildHtml(results);
+        await File.WriteAllTextAsync(filePath, html, Encoding.UTF8, ct);
+    }
+
+    private static string H(string value) =>
+        System.Net.WebUtility.HtmlEncode(value ?? string.Empty);
+
+    private static string FormatSize(long bytes)
+    {
+        if (bytes >= 1_073_741_824L) return $"{bytes / 1_073_741_824.0:F2} GB";
+        if (bytes >= 1_048_576L) return $"{bytes / 1_048_576.0:F2} MB";
+        if (bytes >= 1024L) return $"{bytes / 1024.0:F2} KB";
+        return $"{bytes} B";
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~SearchExportServiceTests" -x
+```
+
+Expected: 6 tests pass
+
+### Task 2: Implement DuplicatesHtmlExportService
+
+**File:** `SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs`
+
+**Action:** Modify (replace stub with full implementation)
+
+**Why:** DUPL-03 — user can export duplicate report to HTML with grouped display and visual indicators.
+
+```csharp
+// SharepointToolbox/Services/Export/DuplicatesHtmlExportService.cs
+using SharepointToolbox.Core.Models;
+using System.Text;
+
+namespace SharepointToolbox.Services.Export;
+
+/// <summary>
+/// Exports DuplicateGroup list to a self-contained HTML with collapsible group cards.
+/// Port of PS Export-DuplicatesToHTML (PS lines 2235-2406).
+/// Each group gets a card showing item count badge and a table of paths.
+/// </summary>
+public class DuplicatesHtmlExportService
+{
+    public string BuildHtml(IReadOnlyList<DuplicateGroup> groups)
+    {
+        var sb = new StringBuilder();
+
+        sb.AppendLine("""
+            <!DOCTYPE html>
+            <html lang="en">
+            <head>
+            <meta charset="utf-8">
+            <title>SharePoint Duplicate Detection Report</title>
+            <style>
+                body { font-family: 'Segoe UI', sans-serif; margin: 20px; }
+                h1 { color: #0078d4; }
+                .card { border: 1px solid #ddd; border-radius: 6px; margin: 12px 0; }
+                .card-header { background: #f3f3f3; padding: 8px 12px; font-weight: 600; }
+                .badge-dup { background: #d83b01; color: #fff; border-radius: 10px; padding: 2px 10px; font-size: 12px; margin-left: 8px; }
+                table { border-collapse: collapse; width: 100%; }
+                th { text-align: left; background: #0078d4; color: #fff; padding: 4px 10px; }
+                td { border-bottom: 1px solid #eee; padding: 4px 10px; }
+                .footer { margin-top: 14px; color: #666; font-size: 12px; }
+            </style>
+            </head>
+            <body>
+            <h1>Duplicate Detection Report</h1>
+            """);
+
+        sb.AppendLine($"<p>{groups.Count:N0} duplicate group(s) found.</p>");
+
+        for (int i = 0; i < groups.Count; i++)
+        {
+            var g = groups[i];
+            int count = g.Items.Count;
+            string badgeClass = "badge-dup";
+
+            sb.AppendLine($"""
+                <div class="card">
+                  <div class="card-header">
+                    {H(g.Name)}
+                    <span class="{badgeClass}">{count} copies</span>
+                  </div>
+                  <table>
+                  <thead>
+                  <tr><th>#</th><th>Library</th><th>Path</th><th>Size</th><th>Created</th><th>Modified</th></tr>
+                  </thead>
+                  <tbody>
+                """);
+
+            for (int j = 0; j < g.Items.Count; j++)
+            {
+                var item = g.Items[j];
+                string size = item.SizeBytes.HasValue ? FormatSize(item.SizeBytes.Value) : string.Empty;
+                string created = item.Created.HasValue ? item.Created.Value.ToString("yyyy-MM-dd") : string.Empty;
+                string modified = item.Modified.HasValue ? item.Modified.Value.ToString("yyyy-MM-dd") : string.Empty;
+
+                sb.AppendLine($"""
+                    <tr>
+                      <td>{j + 1}</td>
+                      <td>{H(item.Library)}</td>
+                      <td>{H(item.Path)}</td>
+                      <td>{size}</td>
+                      <td>{created}</td>
+                      <td>{modified}</td>
+                    </tr>
+                    """);
+            }
+
+            sb.AppendLine("""
+                  </tbody>
+                  </table>
+                </div>
+                """);
+        }
+
+        sb.AppendLine($"<div class=\"footer\">Generated: {DateTime.Now:yyyy-MM-dd HH:mm}</div>");
+        sb.AppendLine("</body></html>");
+
+        return sb.ToString();
+    }
+
+    public async Task WriteAsync(IReadOnlyList<DuplicateGroup> groups, string filePath, CancellationToken ct)
+    {
+        var html = BuildHtml(groups);
+        await File.WriteAllTextAsync(filePath, html, Encoding.UTF8, ct);
+    }
+
+    private static string H(string value) =>
+        System.Net.WebUtility.HtmlEncode(value ?? string.Empty);
+
+    private static string FormatSize(long bytes)
+    {
+        if (bytes >= 1_073_741_824L) return $"{bytes / 1_073_741_824.0:F2} GB";
+        if (bytes >= 1_048_576L) return $"{bytes / 1_048_576.0:F2} MB";
+        if (bytes >= 1024L) return $"{bytes / 1024.0:F2} KB";
+        return $"{bytes} B";
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~DuplicatesHtmlExportServiceTests" -x
+```
+
+Expected: 3 tests pass
+
+## Verification
+
+```bash
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj --filter "FullyQualifiedName~SearchExportServiceTests|FullyQualifiedName~DuplicatesHtmlExportServiceTests" -x
+```
+
+Expected: 9 tests pass, 0 fail
+
+## Commit Message
+feat(03-05): implement SearchCsvExportService, SearchHtmlExportService, DuplicatesHtmlExportService
+
+## Output
+
+After completion, create `.planning/phases/03-storage/03-05-SUMMARY.md`
diff --git a/.planning/phases/03-storage/03-06-PLAN.md b/.planning/phases/03-storage/03-06-PLAN.md
new file mode 100644
index 0000000..0fe91af
--- /dev/null
+++ b/.planning/phases/03-storage/03-06-PLAN.md
@@ -0,0 +1,301 @@
+---
+phase: 03
+plan: 06
+title: Localization — Phase 3 EN and FR Keys
+status: pending
+wave: 2
+depends_on:
+  - 03-01
+files_modified:
+  - SharepointToolbox/Localization/Strings.resx
+  - SharepointToolbox/Localization/Strings.fr.resx
+  - SharepointToolbox/Localization/Strings.Designer.cs
+autonomous: true
+requirements:
+  - STOR-01
+  - STOR-02
+  - STOR-04
+  - STOR-05
+  - SRCH-01
+  - 
SRCH-02
+  - SRCH-03
+  - SRCH-04
+  - DUPL-01
+  - DUPL-02
+  - DUPL-03
+
+must_haves:
+  truths:
+    - "All Phase 3 EN keys exist in Strings.resx"
+    - "All Phase 3 FR keys exist in Strings.fr.resx with non-empty French values"
+    - "Strings.Designer.cs has one static property per new key (dot-to-underscore naming: chk.per.lib -> chk_per_lib)"
+    - "dotnet build produces 0 errors after localization changes"
+    - "No existing Phase 2 or Phase 1 keys are modified or removed"
+  artifacts:
+    - path: "SharepointToolbox/Localization/Strings.resx"
+      provides: "English localization for Phase 3 tabs"
+    - path: "SharepointToolbox/Localization/Strings.fr.resx"
+      provides: "French localization for Phase 3 tabs"
+    - path: "SharepointToolbox/Localization/Strings.Designer.cs"
+      provides: "Strongly-typed accessors for new keys"
+  key_links:
+    - from: "Strings.Designer.cs"
+      to: "Strings.resx"
+      via: "ResourceManager.GetString"
+      pattern: "ResourceManager\\.GetString"
+---
+
+# Plan 03-06: Localization — Phase 3 EN and FR Keys
+
+## Goal
+
+Add all EN and FR localization keys needed by the Storage, File Search, and Duplicates tabs. Views in plans 03-07 and 03-08 reference these keys via `TranslationSource.Instance["key"]` XAML bindings. Keys must exist before the Views compile.
+
+## Context
+
+Strings.resx uses a manually maintained `Strings.Designer.cs` (no ResXFileCodeGenerator — confirmed in Phase 1 decisions). The naming convention converts dots to underscores: key `chk.per.lib` becomes accessor `Strings.chk_per_lib`. Both `.resx` files use `xml:space="preserve"` on each `<data>` element. The following keys already exist and must NOT be duplicated: `tab.storage`, `tab.search`, `tab.duplicates`, `lbl.folder.depth`, `chk.max.depth`. 
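+
+The dot-to-underscore mapping is mechanical. A minimal sketch of the convention (`MakeAccessorName` is illustrative only, not an existing project method):
+
+```csharp
+// resx key -> Designer.cs accessor name, e.g. "chk.per.lib" -> "chk_per_lib"
+static string MakeAccessorName(string resxKey) => resxKey.Replace('.', '_');
+
+// Each accessor then follows the existing Designer.cs pattern:
+// public static string chk_per_lib =>
+//     ResourceManager.GetString("chk.per.lib", resourceCulture) ?? string.Empty;
+```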
+
+> **Pre-existing keys — do not add:** The following keys are confirmed present in `Strings.resx` from Phase 2 and must be skipped when editing both `.resx` files and `Strings.Designer.cs`:
+> - `grp.scan.opts` (value: "Scan Options") — already exists
+> - `grp.export.fmt` (value: "Export Format") — already exists
+> - `btn.cancel` (value: "Cancel") — already exists
+>
+> Before appending, verify with: `grep -n "grp.scan.opts\|grp.export.fmt\|btn.cancel" SharepointToolbox/Localization/Strings.resx`
+> Do not add designer properties for these keys if they already exist in `Strings.Designer.cs`.
+
+## Tasks
+
+### Task 1: Add Phase 3 keys to Strings.resx, Strings.fr.resx, and Strings.Designer.cs
+
+**Files:**
+- `SharepointToolbox/Localization/Strings.resx`
+- `SharepointToolbox/Localization/Strings.fr.resx`
+- `SharepointToolbox/Localization/Strings.Designer.cs`
+
+**Action:** Modify — append new `<data>` elements before `</root>` in both .resx files; append new properties before the closing `}` in Strings.Designer.cs
+
+**Why:** Views in plans 03-07 and 03-08 bind to these keys. Missing keys produce empty strings at runtime.
+
+Add these entries immediately before the closing `</root>` tag in `Strings.resx`:
+
+```xml
+  <data name="chk.per.lib" xml:space="preserve"><value>Per-Library Breakdown</value></data>
+  <data name="chk.subsites" xml:space="preserve"><value>Include Subsites</value></data>
+  <data name="stor.note" xml:space="preserve"><value>Note: deeper folder scans on large sites may take several minutes.</value></data>
+  <data name="btn.gen.storage" xml:space="preserve"><value>Generate Metrics</value></data>
+  <data name="btn.open.storage" xml:space="preserve"><value>Open Report</value></data>
+  <data name="stor.col.library" xml:space="preserve"><value>Library</value></data>
+  <data name="stor.col.site" xml:space="preserve"><value>Site</value></data>
+  <data name="stor.col.files" xml:space="preserve"><value>Files</value></data>
+  <data name="stor.col.size" xml:space="preserve"><value>Total Size</value></data>
+  <data name="stor.col.versions" xml:space="preserve"><value>Version Size</value></data>
+  <data name="stor.col.lastmod" xml:space="preserve"><value>Last Modified</value></data>
+  <data name="stor.col.share" xml:space="preserve"><value>Share of Total</value></data>
+  <data name="stor.rad.csv" xml:space="preserve"><value>CSV</value></data>
+  <data name="stor.rad.html" xml:space="preserve"><value>HTML</value></data>
+
+  <data name="grp.search.filters" xml:space="preserve"><value>Search Filters</value></data>
+  <data name="lbl.extensions" xml:space="preserve"><value>Extension(s):</value></data>
+  <data name="ph.extensions" xml:space="preserve"><value>docx pdf xlsx</value></data>
+  <data name="lbl.regex" xml:space="preserve"><value>Name / Regex:</value></data>
+  <data name="ph.regex" xml:space="preserve"><value>Ex: report.* or \.bak$</value></data>
+  <data name="chk.created.after" xml:space="preserve"><value>Created after:</value></data>
+  <data name="chk.created.before" xml:space="preserve"><value>Created before:</value></data>
+  <data name="chk.modified.after" xml:space="preserve"><value>Modified after:</value></data>
+  <data name="chk.modified.before" xml:space="preserve"><value>Modified before:</value></data>
+  <data name="lbl.created.by" xml:space="preserve"><value>Created by:</value></data>
+  <data name="ph.created.by" xml:space="preserve"><value>First Last or email</value></data>
+  <data name="lbl.modified.by" xml:space="preserve"><value>Modified by:</value></data>
+  <data name="ph.modified.by" xml:space="preserve"><value>First Last or email</value></data>
+  <data name="lbl.library" xml:space="preserve"><value>Library:</value></data>
+  <data name="ph.library" xml:space="preserve"><value>Optional relative path e.g. Shared Documents</value></data>
+  <data name="lbl.max.results" xml:space="preserve"><value>Max results:</value></data>
+  <data name="lbl.site.url" xml:space="preserve"><value>Site URL:</value></data>
+  <data name="ph.site.url" xml:space="preserve"><value>https://tenant.sharepoint.com/sites/MySite</value></data>
+  <data name="btn.run.search" xml:space="preserve"><value>Run Search</value></data>
+  <data name="btn.open.search" xml:space="preserve"><value>Open Results</value></data>
+  <data name="srch.col.name" xml:space="preserve"><value>File Name</value></data>
+  <data name="srch.col.ext" xml:space="preserve"><value>Extension</value></data>
+  <data name="srch.col.created" xml:space="preserve"><value>Created</value></data>
+  <data name="srch.col.modified" xml:space="preserve"><value>Modified</value></data>
+  <data name="srch.col.author" xml:space="preserve"><value>Created By</value></data>
+  <data name="srch.col.modby" xml:space="preserve"><value>Modified By</value></data>
+  <data name="srch.col.size" xml:space="preserve"><value>Size</value></data>
+  <data name="srch.col.path" xml:space="preserve"><value>Path</value></data>
+  <data name="srch.rad.csv" xml:space="preserve"><value>CSV</value></data>
+  <data name="srch.rad.html" xml:space="preserve"><value>HTML</value></data>
+
+  <data name="grp.dup.type" xml:space="preserve"><value>Duplicate Type</value></data>
+  <data name="rad.dup.files" xml:space="preserve"><value>Duplicate files</value></data>
+  <data name="rad.dup.folders" xml:space="preserve"><value>Duplicate folders</value></data>
+  <data name="grp.dup.criteria" xml:space="preserve"><value>Comparison Criteria</value></data>
+  <data name="lbl.dup.note" xml:space="preserve"><value>Name is always the primary criterion. Check additional criteria:</value></data>
+  <data name="chk.dup.size" xml:space="preserve"><value>Same size</value></data>
+  <data name="chk.dup.created" xml:space="preserve"><value>Same creation date</value></data>
+  <data name="chk.dup.modified" xml:space="preserve"><value>Same modification date</value></data>
+  <data name="chk.dup.subfolders" xml:space="preserve"><value>Same subfolder count</value></data>
+  <data name="chk.dup.filecount" xml:space="preserve"><value>Same file count</value></data>
+  <data name="chk.include.subsites" xml:space="preserve"><value>Include subsites</value></data>
+  <data name="ph.dup.lib" xml:space="preserve"><value>All (leave empty)</value></data>
+  <data name="btn.run.scan" xml:space="preserve"><value>Run Scan</value></data>
+  <data name="btn.open.results" xml:space="preserve"><value>Open Results</value></data>
+```
+
+Add these entries immediately before the closing `</root>` tag in `Strings.fr.resx`:
+
+```xml
+  <data name="chk.per.lib" xml:space="preserve"><value>Détail par bibliothèque</value></data>
+  <data name="chk.subsites" xml:space="preserve"><value>Inclure les sous-sites</value></data>
+  <data name="stor.note" xml:space="preserve"><value>Remarque : les analyses de dossiers profondes sur les grands sites peuvent prendre plusieurs minutes.</value></data>
+  <data name="btn.gen.storage" xml:space="preserve"><value>Générer les métriques</value></data>
+  <data name="btn.open.storage" xml:space="preserve"><value>Ouvrir le rapport</value></data>
+  <data name="stor.col.library" xml:space="preserve"><value>Bibliothèque</value></data>
+  <data name="stor.col.site" xml:space="preserve"><value>Site</value></data>
+  <data name="stor.col.files" xml:space="preserve"><value>Fichiers</value></data>
+  <data name="stor.col.size" xml:space="preserve"><value>Taille totale</value></data>
+  <data name="stor.col.versions" xml:space="preserve"><value>Taille des versions</value></data>
+  <data name="stor.col.lastmod" xml:space="preserve"><value>Dernière modification</value></data>
+  <data name="stor.col.share" xml:space="preserve"><value>Part du total</value></data>
+  <data name="stor.rad.csv" xml:space="preserve"><value>CSV</value></data>
+  <data name="stor.rad.html" xml:space="preserve"><value>HTML</value></data>
+
+  <data name="grp.search.filters" xml:space="preserve"><value>Filtres de recherche</value></data>
+  <data name="lbl.extensions" xml:space="preserve"><value>Extension(s) :</value></data>
+  <data name="ph.extensions" xml:space="preserve"><value>docx pdf xlsx</value></data>
+  <data name="lbl.regex" xml:space="preserve"><value>Nom / Regex :</value></data>
+  <data name="ph.regex" xml:space="preserve"><value>Ex : rapport.* ou \.bak$</value></data>
+  <data name="chk.created.after" xml:space="preserve"><value>Créé après :</value></data>
+  <data name="chk.created.before" xml:space="preserve"><value>Créé avant :</value></data>
+  <data name="chk.modified.after" xml:space="preserve"><value>Modifié après :</value></data>
+  <data name="chk.modified.before" xml:space="preserve"><value>Modifié avant :</value></data>
+  <data name="lbl.created.by" xml:space="preserve"><value>Créé par :</value></data>
+  <data name="ph.created.by" xml:space="preserve"><value>Prénom Nom ou courriel</value></data>
+  <data name="lbl.modified.by" xml:space="preserve"><value>Modifié par :</value></data>
+  <data name="ph.modified.by" xml:space="preserve"><value>Prénom Nom ou courriel</value></data>
+  <data name="lbl.library" xml:space="preserve"><value>Bibliothèque :</value></data>
+  <data name="ph.library" xml:space="preserve"><value>Chemin relatif optionnel, ex. Documents partagés</value></data>
+  <data name="lbl.max.results" xml:space="preserve"><value>Max résultats :</value></data>
+  <data name="lbl.site.url" xml:space="preserve"><value>URL du site :</value></data>
+  <data name="ph.site.url" xml:space="preserve"><value>https://tenant.sharepoint.com/sites/MonSite</value></data>
+  <data name="btn.run.search" xml:space="preserve"><value>Lancer la recherche</value></data>
+  <data name="btn.open.search" xml:space="preserve"><value>Ouvrir les résultats</value></data>
+  <data name="srch.col.name" xml:space="preserve"><value>Nom du fichier</value></data>
+  <data name="srch.col.ext" xml:space="preserve"><value>Extension</value></data>
+  <data name="srch.col.created" xml:space="preserve"><value>Créé</value></data>
+  <data name="srch.col.modified" xml:space="preserve"><value>Modifié</value></data>
+  <data name="srch.col.author" xml:space="preserve"><value>Créé par</value></data>
+  <data name="srch.col.modby" xml:space="preserve"><value>Modifié par</value></data>
+  <data name="srch.col.size" xml:space="preserve"><value>Taille</value></data>
+  <data name="srch.col.path" xml:space="preserve"><value>Chemin</value></data>
+  <data name="srch.rad.csv" xml:space="preserve"><value>CSV</value></data>
+  <data name="srch.rad.html" xml:space="preserve"><value>HTML</value></data>
+
+  <data name="grp.dup.type" xml:space="preserve"><value>Type de doublon</value></data>
+  <data name="rad.dup.files" xml:space="preserve"><value>Fichiers en doublon</value></data>
+  <data name="rad.dup.folders" xml:space="preserve"><value>Dossiers en doublon</value></data>
+  <data name="grp.dup.criteria" xml:space="preserve"><value>Critères de comparaison</value></data>
+  <data name="lbl.dup.note" xml:space="preserve"><value>Le nom est toujours le critère principal. Cochez des critères supplémentaires :</value></data>
+  <data name="chk.dup.size" xml:space="preserve"><value>Même taille</value></data>
+  <data name="chk.dup.created" xml:space="preserve"><value>Même date de création</value></data>
+  <data name="chk.dup.modified" xml:space="preserve"><value>Même date de modification</value></data>
+  <data name="chk.dup.subfolders" xml:space="preserve"><value>Même nombre de sous-dossiers</value></data>
+  <data name="chk.dup.filecount" xml:space="preserve"><value>Même nombre de fichiers</value></data>
+  <data name="chk.include.subsites" xml:space="preserve"><value>Inclure les sous-sites</value></data>
+  <data name="ph.dup.lib" xml:space="preserve"><value>Tous (laisser vide)</value></data>
+  <data name="btn.run.scan" xml:space="preserve"><value>Lancer l'analyse</value></data>
+  <data name="btn.open.results" xml:space="preserve"><value>Ouvrir les résultats</value></data>
+```
+
+Add these properties inside the `Strings` class in `Strings.Designer.cs` (before the closing `}`):
+
+```csharp
+    // Phase 3: Storage Tab
+    public static string chk_per_lib => ResourceManager.GetString("chk.per.lib", resourceCulture) ?? string.Empty;
+    public static string chk_subsites => ResourceManager.GetString("chk.subsites", resourceCulture) ?? string.Empty;
+    public static string stor_note => ResourceManager.GetString("stor.note", resourceCulture) ?? string.Empty;
+    public static string btn_gen_storage => ResourceManager.GetString("btn.gen.storage", resourceCulture) ?? string.Empty;
+    public static string btn_open_storage => ResourceManager.GetString("btn.open.storage", resourceCulture) ?? string.Empty;
+    public static string stor_col_library => ResourceManager.GetString("stor.col.library", resourceCulture) ?? string.Empty;
+    public static string stor_col_site => ResourceManager.GetString("stor.col.site", resourceCulture) ?? string.Empty;
+    public static string stor_col_files => ResourceManager.GetString("stor.col.files", resourceCulture) ?? string.Empty;
+    public static string stor_col_size => ResourceManager.GetString("stor.col.size", resourceCulture) ?? string.Empty;
+    public static string stor_col_versions => ResourceManager.GetString("stor.col.versions", resourceCulture) ?? string.Empty;
+    public static string stor_col_lastmod => ResourceManager.GetString("stor.col.lastmod", resourceCulture) ?? string.Empty;
+    public static string stor_col_share => ResourceManager.GetString("stor.col.share", resourceCulture) ?? string.Empty;
+    public static string stor_rad_csv => ResourceManager.GetString("stor.rad.csv", resourceCulture) ?? 
string.Empty;
+    public static string stor_rad_html => ResourceManager.GetString("stor.rad.html", resourceCulture) ?? string.Empty;
+
+    // Phase 3: File Search Tab
+    public static string grp_search_filters => ResourceManager.GetString("grp.search.filters", resourceCulture) ?? string.Empty;
+    public static string lbl_extensions => ResourceManager.GetString("lbl.extensions", resourceCulture) ?? string.Empty;
+    public static string ph_extensions => ResourceManager.GetString("ph.extensions", resourceCulture) ?? string.Empty;
+    public static string lbl_regex => ResourceManager.GetString("lbl.regex", resourceCulture) ?? string.Empty;
+    public static string ph_regex => ResourceManager.GetString("ph.regex", resourceCulture) ?? string.Empty;
+    public static string chk_created_after => ResourceManager.GetString("chk.created.after", resourceCulture) ?? string.Empty;
+    public static string chk_created_before => ResourceManager.GetString("chk.created.before", resourceCulture) ?? string.Empty;
+    public static string chk_modified_after => ResourceManager.GetString("chk.modified.after", resourceCulture) ?? string.Empty;
+    public static string chk_modified_before => ResourceManager.GetString("chk.modified.before", resourceCulture) ?? string.Empty;
+    public static string lbl_created_by => ResourceManager.GetString("lbl.created.by", resourceCulture) ?? string.Empty;
+    public static string ph_created_by => ResourceManager.GetString("ph.created.by", resourceCulture) ?? string.Empty;
+    public static string lbl_modified_by => ResourceManager.GetString("lbl.modified.by", resourceCulture) ?? string.Empty;
+    public static string ph_modified_by => ResourceManager.GetString("ph.modified.by", resourceCulture) ?? string.Empty;
+    public static string lbl_library => ResourceManager.GetString("lbl.library", resourceCulture) ?? string.Empty;
+    public static string ph_library => ResourceManager.GetString("ph.library", resourceCulture) ?? string.Empty;
+    public static string lbl_max_results => ResourceManager.GetString("lbl.max.results", resourceCulture) ?? string.Empty;
+    public static string lbl_site_url => ResourceManager.GetString("lbl.site.url", resourceCulture) ?? string.Empty;
+    public static string ph_site_url => ResourceManager.GetString("ph.site.url", resourceCulture) ?? string.Empty;
+    public static string btn_run_search => ResourceManager.GetString("btn.run.search", resourceCulture) ?? string.Empty;
+    public static string btn_open_search => ResourceManager.GetString("btn.open.search", resourceCulture) ?? string.Empty;
+    public static string srch_col_name => ResourceManager.GetString("srch.col.name", resourceCulture) ?? string.Empty;
+    public static string srch_col_ext => ResourceManager.GetString("srch.col.ext", resourceCulture) ?? string.Empty;
+    public static string srch_col_created => ResourceManager.GetString("srch.col.created", resourceCulture) ?? string.Empty;
+    public static string srch_col_modified => ResourceManager.GetString("srch.col.modified", resourceCulture) ?? string.Empty;
+    public static string srch_col_author => ResourceManager.GetString("srch.col.author", resourceCulture) ?? string.Empty;
+    public static string srch_col_modby => ResourceManager.GetString("srch.col.modby", resourceCulture) ?? string.Empty;
+    public static string srch_col_size => ResourceManager.GetString("srch.col.size", resourceCulture) ?? string.Empty;
+    public static string srch_col_path => ResourceManager.GetString("srch.col.path", resourceCulture) ?? string.Empty;
+    public static string srch_rad_csv => ResourceManager.GetString("srch.rad.csv", resourceCulture) ?? string.Empty;
+    public static string srch_rad_html => ResourceManager.GetString("srch.rad.html", resourceCulture) ?? string.Empty;
+
+    // Phase 3: Duplicates Tab
+    public static string grp_dup_type => ResourceManager.GetString("grp.dup.type", resourceCulture) ?? string.Empty;
+    public static string rad_dup_files => ResourceManager.GetString("rad.dup.files", resourceCulture) ?? string.Empty;
+    public static string rad_dup_folders => ResourceManager.GetString("rad.dup.folders", resourceCulture) ?? string.Empty;
+    public static string grp_dup_criteria => ResourceManager.GetString("grp.dup.criteria", resourceCulture) ?? string.Empty;
+    public static string lbl_dup_note => ResourceManager.GetString("lbl.dup.note", resourceCulture) ?? string.Empty;
+    public static string chk_dup_size => ResourceManager.GetString("chk.dup.size", resourceCulture) ?? string.Empty;
+    public static string chk_dup_created => ResourceManager.GetString("chk.dup.created", resourceCulture) ?? string.Empty;
+    public static string chk_dup_modified => ResourceManager.GetString("chk.dup.modified", resourceCulture) ?? string.Empty;
+    public static string chk_dup_subfolders => ResourceManager.GetString("chk.dup.subfolders", resourceCulture) ?? string.Empty;
+    public static string chk_dup_filecount => ResourceManager.GetString("chk.dup.filecount", resourceCulture) ?? string.Empty;
+    public static string chk_include_subsites => ResourceManager.GetString("chk.include.subsites", resourceCulture) ?? string.Empty;
+    public static string ph_dup_lib => ResourceManager.GetString("ph.dup.lib", resourceCulture) ?? string.Empty;
+    public static string btn_run_scan => ResourceManager.GetString("btn.run.scan", resourceCulture) ?? string.Empty;
+    public static string btn_open_results => ResourceManager.GetString("btn.open.results", resourceCulture) ?? string.Empty;
+```
+
+**Verification:**
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+```
+
+Expected: 0 errors
+
+## Verification
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+dotnet test C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.Tests/SharepointToolbox.Tests.csproj -x 2>&1 | tail -5
+```
+
+Expected: 0 build errors; all previously passing tests still pass; no new failures
+
+## Commit Message
+feat(03-06): add Phase 3 EN/FR localization keys for Storage, Search, and Duplicates tabs
+
+## Output
+
+After completion, create `.planning/phases/03-storage/03-06-SUMMARY.md`
diff --git a/.planning/phases/03-storage/03-07-PLAN.md b/.planning/phases/03-storage/03-07-PLAN.md
new file mode 100644
index 0000000..dd2ee03
--- /dev/null
+++ b/.planning/phases/03-storage/03-07-PLAN.md
@@ -0,0 +1,577 @@
+---
+phase: 03
+plan: 07
+title: StorageViewModel + StorageView XAML + DI Wiring
+status: pending
+wave: 3
+depends_on:
+  - 03-03
+  - 03-06
+files_modified:
+  - SharepointToolbox/ViewModels/Tabs/StorageViewModel.cs
+  - SharepointToolbox/Views/Tabs/StorageView.xaml
+  - SharepointToolbox/Views/Tabs/StorageView.xaml.cs
+  - SharepointToolbox/App.xaml.cs
+  - SharepointToolbox/MainWindow.xaml
+  - SharepointToolbox/MainWindow.xaml.cs
+autonomous: true
+requirements:
+  - STOR-01
+  - STOR-02
+  - STOR-03
+  - STOR-04
+  - STOR-05
+
+must_haves:
+  truths:
+    - "StorageView appears in the Storage tab (replaces FeatureTabBase stub) when the app runs"
+    - "User can enter a site URL, set folder depth (0 = library root, or N levels), check per-library breakdown, and click Generate Metrics"
+    - "DataGrid displays StorageNode rows with library name indented by IndentLevel, file count, total size, version size, last modified"
+    - "Export buttons are enabled after a successful scan and disabled when Results is empty"
+    - "Never modify ObservableCollection from a background thread — accumulate in
List<StorageNode> on background, then Dispatcher.InvokeAsync"
+    - "StorageViewModel never stores ClientContext — it calls ISessionManager.GetOrCreateContextAsync at operation start"
+  artifacts:
+    - path: "SharepointToolbox/ViewModels/Tabs/StorageViewModel.cs"
+      provides: "Storage tab ViewModel (IStorageService orchestration)"
+      exports: ["StorageViewModel"]
+    - path: "SharepointToolbox/Views/Tabs/StorageView.xaml"
+      provides: "Storage tab XAML (DataGrid + controls)"
+    - path: "SharepointToolbox/Views/Tabs/StorageView.xaml.cs"
+      provides: "StorageView code-behind"
+  key_links:
+    - from: "StorageViewModel.cs"
+      to: "IStorageService.CollectStorageAsync"
+      via: "RunOperationAsync override"
+      pattern: "CollectStorageAsync"
+    - from: "StorageViewModel.cs"
+      to: "ISessionManager.GetOrCreateContextAsync"
+      via: "context acquisition"
+      pattern: "GetOrCreateContextAsync"
+    - from: "StorageView.xaml"
+      to: "StorageViewModel.Results"
+      via: "DataGrid ItemsSource binding"
+      pattern: "Results"
+---
+
+# Plan 03-07: StorageViewModel + StorageView XAML + DI Wiring
+
+## Goal
+
+Create the `StorageViewModel` (orchestrates `IStorageService` and the export commands) and the `StorageView` XAML (DataGrid with IndentLevel-based name indentation). Wire the Storage tab in `MainWindow` to replace the `FeatureTabBase` stub, and register all dependencies in `App.xaml.cs`.
+
+## Context
+
+Plans 03-02 (StorageService), 03-03 (export services), and 03-06 (localization) must complete before this plan. The ViewModel follows the exact pattern from `PermissionsViewModel`: `FeatureViewModelBase` base class, `AsyncRelayCommand` for exports, and an `ObservableCollection` updated via `Dispatcher.InvokeAsync` from the background thread.
+
+`MainWindow.xaml` currently has the Storage tab as:
+```xml
+
+
+
+```
+This plan adds `x:Name="StorageTabItem"` to that TabItem and wires `StorageTabItem.Content` in `MainWindow.xaml.cs`.
+
+The `IndentConverter` value converter maps `IndentLevel` (int) → `Thickness(IndentLevel * 16, 0, 0, 0)`.
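+
+A minimal sketch of that converter, assuming the standard WPF `IValueConverter` contract (the class name and namespace follow the file list in Task 2; the exact implementation is decided at execution time):
+
+```csharp
+using System;
+using System.Globalization;
+using System.Windows;
+using System.Windows.Data;
+
+namespace SharepointToolbox.Views.Converters;
+
+/// <summary>Maps an int IndentLevel to a left margin of 16 px per level.</summary>
+public sealed class IndentConverter : IValueConverter
+{
+    public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
+        => new Thickness(value is int level ? level * 16 : 0, 0, 0, 0);
+
+    // One-way binding only — the DataGrid never writes the margin back.
+    public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
+        => throw new NotSupportedException();
+}
+```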
It must be defined in the View or a shared Resources file.
+
+## Tasks
+
+### Task 1: Create StorageViewModel
+
+**File:** `SharepointToolbox/ViewModels/Tabs/StorageViewModel.cs`
+
+**Action:** Create
+
+**Why:** Storage tab business logic — orchestrates the StorageService scan, holds results, and triggers exports.
+
+```csharp
+using System.Collections.ObjectModel;
+using System.Diagnostics;
+using System.Windows;
+using CommunityToolkit.Mvvm.ComponentModel;
+using CommunityToolkit.Mvvm.Input;
+using CommunityToolkit.Mvvm.Messaging;
+using Microsoft.Extensions.Logging;
+using Microsoft.SharePoint.Client;
+using Microsoft.Win32;
+using SharepointToolbox.Core.Messages;
+using SharepointToolbox.Core.Models;
+using SharepointToolbox.Services;
+using SharepointToolbox.Services.Export;
+
+namespace SharepointToolbox.ViewModels.Tabs;
+
+public partial class StorageViewModel : FeatureViewModelBase
+{
+    private readonly IStorageService _storageService;
+    private readonly ISessionManager _sessionManager;
+    private readonly StorageCsvExportService _csvExportService;
+    private readonly StorageHtmlExportService _htmlExportService;
+    private readonly ILogger<StorageViewModel> _logger;
+    private TenantProfile? _currentProfile;
+
+    [ObservableProperty]
+    private string _siteUrl = string.Empty;
+
+    [ObservableProperty]
+    private bool _perLibrary = true;
+
+    [ObservableProperty]
+    private bool _includeSubsites;
+
+    [ObservableProperty]
+    private int _folderDepth;
+
+    public bool IsMaxDepth
+    {
+        get => FolderDepth >= 999;
+        set
+        {
+            if (value) FolderDepth = 999;
+            else if (FolderDepth >= 999) FolderDepth = 0;
+            OnPropertyChanged();
+        }
+    }
+
+    // Keep IsMaxDepth in sync when FolderDepth changes through its own binding.
+    partial void OnFolderDepthChanged(int value) => OnPropertyChanged(nameof(IsMaxDepth));
+
+    private ObservableCollection<StorageNode> _results = new();
+    public ObservableCollection<StorageNode> Results
+    {
+        get => _results;
+        private set
+        {
+            _results = value;
+            OnPropertyChanged();
+            ExportCsvCommand.NotifyCanExecuteChanged();
+            ExportHtmlCommand.NotifyCanExecuteChanged();
+        }
+    }
+
+    public IAsyncRelayCommand ExportCsvCommand { get; }
+    public IAsyncRelayCommand ExportHtmlCommand { get; }
+
+    public TenantProfile? CurrentProfile => _currentProfile;
+
+    public StorageViewModel(
+        IStorageService storageService,
+        ISessionManager sessionManager,
+        StorageCsvExportService csvExportService,
+        StorageHtmlExportService htmlExportService,
+        ILogger<StorageViewModel> logger)
+        : base(logger)
+    {
+        _storageService = storageService;
+        _sessionManager = sessionManager;
+        _csvExportService = csvExportService;
+        _htmlExportService = htmlExportService;
+        _logger = logger;
+
+        ExportCsvCommand = new AsyncRelayCommand(ExportCsvAsync, CanExport);
+        ExportHtmlCommand = new AsyncRelayCommand(ExportHtmlAsync, CanExport);
+    }
+
+    /// <summary>Test constructor — omits export services.</summary>
+    internal StorageViewModel(
+        IStorageService storageService,
+        ISessionManager sessionManager,
+        ILogger<StorageViewModel> logger)
+        : base(logger)
+    {
+        _storageService = storageService;
+        _sessionManager = sessionManager;
+        _csvExportService = null!;
+        _htmlExportService = null!;
+        _logger = logger;
+
+        ExportCsvCommand = new AsyncRelayCommand(ExportCsvAsync, CanExport);
+        ExportHtmlCommand = new AsyncRelayCommand(ExportHtmlAsync, CanExport);
+    }
+
+    protected override async Task RunOperationAsync(CancellationToken ct, IProgress<double> progress)
+    {
+        if (_currentProfile == null)
+        {
+            StatusMessage = "No tenant selected. Please connect to a tenant first.";
+            return;
+        }
+        if (string.IsNullOrWhiteSpace(SiteUrl))
+        {
+            StatusMessage = "Please enter a site URL.";
+            return;
+        }
+
+        var ctx = await _sessionManager.GetOrCreateContextAsync(_currentProfile, ct);
+        // The cached context targets the tenant root and ClientContext.Url is read-only,
+        // so clone the context (PnP.Framework extension) for the site URL the user entered.
+        using var siteCtx = ctx.Clone(SiteUrl.TrimEnd('/'));
+
+        var options = new StorageScanOptions(
+            PerLibrary: PerLibrary,
+            IncludeSubsites: IncludeSubsites,
+            FolderDepth: FolderDepth);
+
+        var nodes = await _storageService.CollectStorageAsync(siteCtx, options, progress, ct);
+
+        // Flatten tree to one level for DataGrid display (assign IndentLevel during flatten)
+        var flat = new List<StorageNode>();
+        foreach (var node in nodes)
+            FlattenNode(node, 0, flat);
+
+        if (Application.Current?.Dispatcher is { } dispatcher)
+        {
+            await dispatcher.InvokeAsync(() =>
+            {
+                Results = new ObservableCollection<StorageNode>(flat);
+            });
+        }
+        else
+        {
+            Results = new ObservableCollection<StorageNode>(flat);
+        }
+    }
+
+    protected override void OnTenantSwitched(TenantProfile profile)
+    {
+        _currentProfile = profile;
+        Results = new ObservableCollection<StorageNode>();
+        SiteUrl = string.Empty;
+        OnPropertyChanged(nameof(CurrentProfile));
+        ExportCsvCommand.NotifyCanExecuteChanged();
+        ExportHtmlCommand.NotifyCanExecuteChanged();
+    }
+
+    internal void SetCurrentProfile(TenantProfile profile) => _currentProfile = profile;
+
+    internal Task TestRunOperationAsync(CancellationToken ct, IProgress<double> progress)
+        => RunOperationAsync(ct, progress);
+
+    private bool CanExport() => Results.Count > 0;
+
+    private async Task ExportCsvAsync()
+    {
+        if (Results.Count == 0) return;
+        var dialog = new SaveFileDialog
+        {
+            Title = "Export storage metrics to CSV",
+            Filter = "CSV files (*.csv)|*.csv|All files (*.*)|*.*",
+            DefaultExt = "csv",
+            FileName = "storage_metrics"
+        };
+        if (dialog.ShowDialog() != true) return;
+        try
+        {
+            await _csvExportService.WriteAsync(Results, dialog.FileName, CancellationToken.None);
+            OpenFile(dialog.FileName);
+        }
+        catch (Exception ex)
+        {
+            StatusMessage = $"Export failed: {ex.Message}";
+            _logger.LogError(ex, "CSV export failed.");
+        }
+    }
+
+    private async Task ExportHtmlAsync()
+    {
+        if (Results.Count == 0) return;
+        var dialog = new SaveFileDialog
+        {
+            Title = "Export storage metrics to HTML",
+            Filter = "HTML files (*.html)|*.html|All files (*.*)|*.*",
+            DefaultExt = "html",
+            FileName = "storage_metrics"
+        };
+        if (dialog.ShowDialog() != true) return;
+        try
+        {
+            await _htmlExportService.WriteAsync(Results, dialog.FileName, CancellationToken.None);
+            OpenFile(dialog.FileName);
+        }
+        catch (Exception ex)
+        {
+            StatusMessage = $"Export failed: {ex.Message}";
+            _logger.LogError(ex, "HTML export failed.");
+        }
+    }
+
+    private static void FlattenNode(StorageNode node, int level, List<StorageNode> result)
+    {
+        node.IndentLevel = level;
+        result.Add(node);
+        foreach (var child in node.Children)
+            FlattenNode(child, level + 1, result);
+    }
+
+    private static void OpenFile(string filePath)
+    {
+        try { Process.Start(new ProcessStartInfo(filePath) { UseShellExecute = true }); }
+        catch { /* ignore — opening the exported file is best-effort */ }
+    }
+}
+```
+
+**Verification:**
+
+```bash
+dotnet build C:/Users/dev/Documents/projets/Sharepoint/SharepointToolbox.slnx
+```
+
+Expected: 0 errors
+
+### Task 2: Create StorageView XAML + code-behind, update DI and MainWindow wiring
+
+**Files:**
+- `SharepointToolbox/Views/Tabs/StorageView.xaml`
+- `SharepointToolbox/Views/Tabs/StorageView.xaml.cs`
+- `SharepointToolbox/Views/Converters/IndentConverter.cs` (create — also adds BytesConverter and InverseBoolConverter)
+- `SharepointToolbox/App.xaml` (modify — register converters as Application.Resources)
+- `SharepointToolbox/App.xaml.cs` (modify — add Storage registrations)
+- `SharepointToolbox/MainWindow.xaml` (modify — add x:Name to Storage TabItem)
+- `SharepointToolbox/MainWindow.xaml.cs` (modify — wire StorageTabItem.Content)
+
+**Action:** Create / Modify
+
+**Why:** STOR-01/02/03/04/05 — the UI that ties the storage service to user interaction.
+
+```xml
+
+
+
+
+
+
+
+
+
+