Compare commits — 22 commits (c550790287 … 8bdd5e2277)

Commits: 8bdd5e2277, 26343b08f8, 6d350940b9, ba6abad602, c0da075f03, 6bfb148e39, 45c05b5720, bd498b9bcb, 20e80ac28e, ad3a73f061, c0fb8f69de, 8b8e8a8d8c, 05d2d77515, 8b701745bf, a79ee1b228, d8e39c66be, 055937fc3d, 1338f6bace, 995f5b1339, 22bee4f32c, 5f1421f4bd, f8e8e85cb0
284 docs/superpowers/specs/2026-03-17-scheduling-v2-design.md (Normal file)
@@ -0,0 +1,284 @@
# Scheduling V2 — Design Spec

## Context

The current scheduler is a 48h rolling window with a flat block list per channel. This works as an MVP but has two major gaps for everyday use:

1. **No weekly patterns** — users can't say "Monday runs X, weekends run Y"; all blocks repeat identically every day.
2. **No history or recovery** — overwriting a channel config loses the previous setup forever; a bug that resets a sequential series (e.g. Sopranos resets from S3E4 to S1E1) has no recovery path.

This spec covers two features: **weekly scheduling** and **schedule history**.

---

## Feature 1: Weekly Scheduling (7-day grid)

### Data model

`ScheduleConfig` changes from a flat block list to a day-keyed map:

```rust
// BEFORE
pub struct ScheduleConfig {
    pub blocks: Vec<ProgrammingBlock>,
}

// AFTER
pub struct ScheduleConfig {
    pub day_blocks: HashMap<Weekday, Vec<ProgrammingBlock>>,
}

pub enum Weekday {
    Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday,
}
```

`ProgrammingBlock` is otherwise unchanged. Block IDs remain UUIDs; each day has its own independent Vec, so the same "show" on Mon and Wed has two separate block entries (different IDs, independent continuity tracking).
### Migration (transparent, zero-downtime)

Existing `channels.schedule_config` stores `{"blocks":[...]}`. Use `#[serde(untagged)]` deserialization:

```rust
#[serde(untagged)]
enum ScheduleConfigCompat {
    V2(ScheduleConfig),    // {"day_blocks": {"monday": [...], ...}}
    V1(OldScheduleConfig), // {"blocks": [...]}
}
```

V1→V2 conversion: clone the blocks Vec into all 7 days. The first `PUT /channels/:id` after deploy saves V2 format. Channels never touched continue to deserialize via the V1 path indefinitely.
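The V1→V2 conversion can be sketched as a `From` impl. This is a minimal, self-contained sketch with simplified stand-in types (`u32` IDs instead of UUIDs, real field sets elided); the actual impl in the domain crate will differ in detail:

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for the real domain types.
#[derive(Clone, Debug, PartialEq)]
struct ProgrammingBlock { id: u32 }

#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
enum Weekday { Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday }

const ALL_DAYS: [Weekday; 7] = [
    Weekday::Monday, Weekday::Tuesday, Weekday::Wednesday, Weekday::Thursday,
    Weekday::Friday, Weekday::Saturday, Weekday::Sunday,
];

struct OldScheduleConfig { blocks: Vec<ProgrammingBlock> }
struct ScheduleConfig { day_blocks: HashMap<Weekday, Vec<ProgrammingBlock>> }

impl From<OldScheduleConfig> for ScheduleConfig {
    // V1 -> V2: every day gets a clone of the old flat block list.
    fn from(old: OldScheduleConfig) -> Self {
        let day_blocks = ALL_DAYS
            .iter()
            .map(|d| (*d, old.blocks.clone()))
            .collect();
        Self { day_blocks }
    }
}

fn main() {
    let v1 = OldScheduleConfig { blocks: vec![ProgrammingBlock { id: 1 }] };
    let v2 = ScheduleConfig::from(v1);
    assert_eq!(v2.day_blocks.len(), 7);
    assert!(v2.day_blocks.values().all(|v| v.len() == 1));
}
```

Note the cloned entries keep their original IDs here; whether the real conversion should mint fresh UUIDs per day is a design choice this spec leaves to the data-model rule above (separate entries per day).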
**Edge case**: if a payload has both `blocks` and `day_blocks` keys (e.g. a partially migrated export), `#[serde(untagged)]` tries V2 first and succeeds — `day_blocks` is used and `blocks` is silently ignored. This is acceptable; the alternative (erroring on ambiguity) would break more use cases.

### ScheduleConfig helper methods

Three methods on `ScheduleConfig` must be updated:

- **`find_block_at(weekday: Weekday, time: NaiveTime) -> Option<&ProgrammingBlock>`** — searches `day_blocks[weekday]` for the block whose window contains `time`.
- **`next_block_start_after(weekday: Weekday, time: NaiveTime) -> Option<NaiveTime>`** — searches that day's vec; returns `None` if no block starts after `time` on that day (day-rollover is the caller's responsibility).
- **`earliest_block_start() -> Option<NaiveTime>`** — **iterates all days and returns the global earliest start time across the entire week.** This is the form the background scheduler needs (it must know when any content starts). An empty day contributes nothing; all days empty yields `None`.
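The `earliest_block_start` semantics can be sketched as a flat-map-then-min over the week. A minimal sketch using `u32` minutes-since-midnight as a stand-in for `NaiveTime` and a stripped-down block type (both hypothetical):

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
enum Weekday { Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday }

// Stand-in: start time as minutes since midnight instead of NaiveTime.
struct Block { start_min: u32 }
struct ScheduleConfig { day_blocks: HashMap<Weekday, Vec<Block>> }

impl ScheduleConfig {
    /// Global earliest start across the whole week; `None` if every day is empty.
    fn earliest_block_start(&self) -> Option<u32> {
        self.day_blocks
            .values()                         // each day's Vec; absent days contribute nothing
            .flat_map(|blocks| blocks.iter()) // flatten the week into one block stream
            .map(|b| b.start_min)
            .min()                            // min over the entire week
    }
}

fn main() {
    let mut day_blocks = HashMap::new();
    day_blocks.insert(Weekday::Wednesday, vec![Block { start_min: 9 * 60 }]);
    day_blocks.insert(Weekday::Saturday, vec![Block { start_min: 7 * 60 }]);
    day_blocks.insert(Weekday::Monday, vec![]); // empty day: no contribution
    let cfg = ScheduleConfig { day_blocks };
    assert_eq!(cfg.earliest_block_start(), Some(420)); // Saturday 07:00
    assert_eq!(ScheduleConfig { day_blocks: HashMap::new() }.earliest_block_start(), None);
}
```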
**Call-site update pattern for `broadcast.rs` (lines 64 and 171):**

```rust
// derive weekday from slot start_at in channel timezone
let tz: chrono_tz::Tz = channel.timezone.parse().unwrap_or(chrono_tz::UTC);
let local_dt = slot.start_at.with_timezone(&tz);
let weekday = Weekday::from_chrono(local_dt.weekday()); // new From impl
let block = channel.schedule_config.find_block_at(weekday, local_dt.time());
```

The same derivation applies to `dto.rs` (`ScheduledSlotResponse::with_block_access`).
### MCP crate

`mcp/src/tools/channels.rs` manipulates `schedule_config.blocks` directly. After V2:

- The MCP `add_block` tool must accept a `day: Weekday` parameter (required). It pushes the new block to `day_blocks[day]`.
- The MCP `remove_block` tool must iterate all days' vecs (remove by block ID across all days, since block IDs are unique per entry).
- `mcp/src/server.rs` `set_schedule_config` must accept a `day_blocks` map. The old `blocks_json` string parameter is replaced with `day_blocks_json: String` (a JSON object keyed by weekday name).
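The `remove_block` iteration is simple enough to sketch. A minimal stand-alone version (`u64` IDs instead of UUIDs, a two-variant `Weekday` stand-in — both hypothetical simplifications):

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
enum Weekday { Monday, Tuesday }

// Stand-in: u64 ids instead of UUIDs.
struct Block { id: u64 }

/// Remove a block by ID wherever it lives; returns true if anything was removed.
fn remove_block(day_blocks: &mut HashMap<Weekday, Vec<Block>>, id: u64) -> bool {
    let before: usize = day_blocks.values().map(Vec::len).sum();
    for blocks in day_blocks.values_mut() {
        // IDs are unique per entry, but we still scan every day's vec
        blocks.retain(|b| b.id != id);
    }
    let after: usize = day_blocks.values().map(Vec::len).sum();
    after < before
}

fn main() {
    let mut days = HashMap::new();
    days.insert(Weekday::Monday, vec![Block { id: 1 }, Block { id: 2 }]);
    days.insert(Weekday::Tuesday, vec![Block { id: 3 }]);
    assert!(remove_block(&mut days, 3));
    assert!(!remove_block(&mut days, 99)); // unknown ID removes nothing
    assert_eq!(days[&Weekday::Tuesday].len(), 0);
}
```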
These are breaking changes to the MCP API — acceptable since MCP tools are internal/developer-facing.

### Generation engine

- Window: `valid_from + 7 days` (was 48h). Update the `GeneratedSchedule` doc comment accordingly.
- Day iteration: already walks calendar days; now walks 7 days, looking up `day_blocks[weekday]` for each day.
- **Empty day**: if `day_blocks[weekday]` is empty or the key is absent, that day produces no slots — valid, not an error.
- Continuity (`find_last_slot_per_block`): unchanged.
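The day-walk and empty-day rules above can be sketched in a few lines. This uses `u8` weekday indices and `&str` block names as hypothetical stand-ins for the real types, and returns per-day block counts rather than actual slots:

```rust
use std::collections::HashMap;

// Walk 7 calendar days from `start_weekday`; an empty or absent day simply
// yields no slots (it is filtered out, never treated as an error).
fn generate_slot_days(day_blocks: &HashMap<u8, Vec<&str>>, start_weekday: u8) -> Vec<(u8, usize)> {
    (0..7)
        .map(|offset| {
            let weekday = (start_weekday + offset) % 7; // wrap around the week
            let count = day_blocks.get(&weekday).map_or(0, Vec::len);
            (weekday, count)
        })
        .filter(|(_, count)| *count > 0) // empty day produces no slots
        .collect()
}

fn main() {
    let mut day_blocks = HashMap::new();
    day_blocks.insert(0u8, vec!["news", "movie"]); // Monday
    day_blocks.insert(5u8, vec!["cartoons"]);      // Saturday
    let days = generate_slot_days(&day_blocks, 0);
    assert_eq!(days, vec![(0, 2), (5, 1)]); // only Mon + Sat emit slots
}
```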
### Files changed (backend)

- `domain/src/value_objects.rs` — add `Weekday` enum with a `From<chrono::Weekday>` impl
- `domain/src/entities.rs` — `ScheduleConfig`, `OldScheduleConfig` compat struct, updated helper method signatures, updated `GeneratedSchedule` doc comment
- `domain/src/services.rs` — 7-day window, `day_blocks[weekday]` lookup per day
- `api/src/routes/channels/broadcast.rs` — update block lookups at lines 64 and 171 using the weekday-derivation pattern above
- `api/src/dto.rs` — update the `ScheduledSlotResponse::with_block_access` block lookup
- `mcp/src/tools/channels.rs` — `add_block` accepts a `day` param; `remove_block` iterates all days
- `mcp/src/server.rs` — replace `blocks_json` with `day_blocks_json`

---

## Feature 2: Schedule History

### 2a. Config version history

Every `PUT /channels/:id` auto-snapshots the previous config before overwriting. Users can pin named checkpoints and restore any version.
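The snapshot-before-overwrite ordering is the one invariant worth pinning down: version N always holds what the channel looked like *before* the N-th overwrite. A minimal in-memory sketch (the `Channel`/`SnapshotStore` types and string configs are stand-ins, not the real service/repo shapes):

```rust
struct Channel { config: String }

struct SnapshotStore { snapshots: Vec<(u32, String)> } // (version_num, config_json)

impl SnapshotStore {
    fn save(&mut self, config: &str) -> u32 {
        // version_num = MAX(existing) + 1, starting at 1
        let next = self.snapshots.iter().map(|(v, _)| *v).max().unwrap_or(0) + 1;
        self.snapshots.push((next, config.to_string()));
        next
    }
}

fn update_channel(channel: &mut Channel, store: &mut SnapshotStore, new_config: &str) {
    store.save(&channel.config);             // snapshot the previous config first...
    channel.config = new_config.to_string(); // ...then overwrite
}

fn main() {
    let mut ch = Channel { config: "v1".into() };
    let mut store = SnapshotStore { snapshots: vec![] };
    update_channel(&mut ch, &mut store, "v2");
    update_channel(&mut ch, &mut store, "v3");
    assert_eq!(ch.config, "v3");
    // history holds the pre-overwrite states, not the current one
    assert_eq!(store.snapshots, vec![(1, "v1".into()), (2, "v2".into())]);
}
```

Restore follows the same rule: it snapshots the current config before replacing it, so a restore is itself undoable.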
**New DB migration:**

```sql
CREATE TABLE channel_config_snapshots (
    id TEXT PRIMARY KEY,
    channel_id TEXT NOT NULL REFERENCES channels(id) ON DELETE CASCADE,
    config_json TEXT NOT NULL,
    version_num INTEGER NOT NULL,
    label TEXT, -- NULL = auto-saved, non-NULL = pinned
    created_at TEXT NOT NULL,
    UNIQUE (channel_id, version_num)
);

CREATE INDEX idx_config_snapshots_channel ON channel_config_snapshots(channel_id, version_num DESC);
```

**`version_num` assignment**: computed inside the write transaction as `SELECT COALESCE(MAX(version_num), 0) + 1 FROM channel_config_snapshots WHERE channel_id = ?`. SQLite's single-writer model serializes concurrent writes naturally, so the `UNIQUE` constraint is only a safety net: no 409 is exposed to the client, and the server retries within the transaction if the constraint ever fires (in practice impossible under SQLite's serialized writes).
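Concretely, the snapshot write can be a single transaction where the `MAX+1` is a subselect inside the `INSERT` — a sketch against the migration above (parameter names and the `BEGIN IMMEDIATE` choice are illustrative, not prescriptive):

```sql
BEGIN IMMEDIATE;  -- take the write lock up front so MAX+1 cannot race

INSERT INTO channel_config_snapshots (id, channel_id, config_json, version_num, label, created_at)
VALUES (
    :id,
    :channel_id,
    :config_json,
    (SELECT COALESCE(MAX(version_num), 0) + 1
       FROM channel_config_snapshots
      WHERE channel_id = :channel_id),  -- computed inside the same transaction
    NULL,                               -- auto-saved, not pinned
    :created_at
);

UPDATE channels SET schedule_config = :new_config WHERE id = :channel_id;

COMMIT;
```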
**New API endpoints (all require auth + channel ownership — the same auth middleware as existing channel routes):**

```
GET /channels/:id/config/history
  → [{id, version_num, label, created_at}]   -- channel_id omitted (implicit from URL)

PATCH /channels/:id/config/history/:snap_id
  body: {"label": "Before S3 switchover"}
  → 404 if snap_id not found or not owned by this channel
  → 200 {id, version_num, label, created_at}

POST /channels/:id/config/history/:snap_id/restore
  → snapshots the current config first, then replaces the channel config with the target snapshot
  → 404 if snap_id not found or not owned by this channel
  → 200 {channel}
```
**Domain + infra changes:**

- `ChannelConfigSnapshot` entity (fields: id, channel_id, config, version_num, label, created_at)
- Extend the `ChannelRepository` port: `save_config_snapshot`, `list_config_snapshots`, `get_config_snapshot`, `patch_config_snapshot_label`
- `ChannelService::update_channel` calls `save_config_snapshot` before writing the new config

**Files changed (backend):**

- `domain/src/entities.rs` — add `ChannelConfigSnapshot`
- `domain/src/repositories.rs` — extend the `ChannelRepository` port
- `infra/src/channel_repo.rs` — implement the snapshot methods
- `migrations_sqlite/YYYYMMDD_add_config_snapshots.sql`
- `api/src/routes/channels.rs` — new history endpoints + DTOs for snapshot responses
### 2b. Generated schedule audit log

**Ownership check**: `get_schedule_by_id(channel_id, gen_id)` queries `generated_schedules WHERE id = :gen_id AND channel_id = :channel_id` — the `channel_id` column is the join, so no separate channel lookup is needed.

**New API endpoints (all require auth + channel ownership):**

```
GET /channels/:id/schedule/history
  → [{id, generation, valid_from, valid_until}] ordered by generation DESC

GET /channels/:id/schedule/history/:gen_id
  → full GeneratedSchedule with slots
  → 404 if gen_id not found or channel_id mismatch

POST /channels/:id/schedule/history/:gen_id/rollback
  → 404 if gen_id not found or channel_id mismatch
  → explicit two-step delete (no DB-level cascade from playback_records to generated_schedules):
      1. DELETE FROM playback_records WHERE channel_id = ? AND generation > :target_generation
      2. DELETE FROM generated_schedules WHERE channel_id = ? AND generation > :target_generation
         (scheduled_slots cascade via FK from generated_schedules)
  → calls generate_schedule from now
  → 200 {new_schedule}
```
**Repository changes:**

- `list_schedule_history(channel_id)` — headers only
- `get_schedule_by_id(channel_id, gen_id)` — full schedule with slots
- `delete_schedules_after(channel_id, generation_num)` — two-step explicit delete as above

**Files changed (backend):**

- `domain/src/repositories.rs` — extend `ScheduleRepository`
- `infra/src/schedule_repo.rs` — implement list, get-by-id, delete-after
- `api/src/routes/channels.rs` — new history and rollback endpoints

---

## Frontend

### Weekly grid editor (`edit-channel-sheet.tsx`)

Replace the flat block list with a tabbed weekly grid:

- 7 day tabs (Mon–Sun), each showing that day's block list
- Blocks within a day: same card UI as current (drag to reorder, edit, delete)
- "Copy to →" dropdown per tab: duplicates block entries with new UUIDs into the target day(s)
- "+ Add block for [Day]" button per tab
- "🕐 Config history" button in the sheet footer → opens the config history panel
### Config history panel (`config-history-sheet.tsx` — new)

- List of snapshots: version_num, timestamp, label (if pinned)
- Current version highlighted
- Pin button on the current version (opens a label input)
- Restore button on any past version (confirm dialog)

### Schedule audit log (`schedule-history-dialog.tsx` — new)

- Lists past generations: gen#, date range
- "Rollback to here" button with confirm dialog

### Types (`lib/types.ts`)

```ts
type Weekday = 'monday' | 'tuesday' | 'wednesday' | 'thursday' | 'friday' | 'saturday' | 'sunday'
const WEEKDAYS: Weekday[] = ['monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday']

interface ScheduleConfig {
  day_blocks: Record<Weekday, ProgrammingBlock[]>
}

interface ConfigSnapshot {
  id: string
  version_num: number
  label: string | null
  created_at: string
  // channel_id intentionally omitted — always accessed via /channels/:id/config/history
}

interface ScheduleHistoryEntry {
  id: string
  generation: number
  valid_from: string
  valid_until: string
}
```
### Zod schema (`lib/schemas.ts`)

```ts
const weekdaySchema = z.enum(['monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday'])

// replace blocks: z.array(blockSchema) with:
day_blocks: z.record(weekdaySchema, z.array(blockSchema)).default(
  () => Object.fromEntries(WEEKDAYS.map(d => [d, []])) as Record<Weekday, ProgrammingBlock[]>
)
// A missing day key is valid (treated as empty). The default initializes all days to [].
```

### Channel export (`lib/channel-export.ts`)

Export format after V2: the `day_blocks` map as-is (no flattening). The export JSON shape mirrors `ScheduleConfig` directly. Re-import reads via the same `ScheduleConfigCompat` deserialization path, so V1 exports remain importable indefinitely.

### New hooks (`hooks/use-channels.ts`)

- `useConfigHistory(channelId)`
- `useRestoreConfig()`
- `usePinSnapshot()`
- `useScheduleHistory(channelId)`
- `useScheduleGeneration(channelId, genId)` (lazy, for the detail view)
- `useRollbackSchedule()`
### Files changed (frontend)

- `lib/types.ts`
- `lib/schemas.ts`
- `lib/channel-export.ts`
- `hooks/use-channels.ts`
- `dashboard/components/edit-channel-sheet.tsx`
- `dashboard/components/config-history-sheet.tsx` (new)
- `dashboard/components/schedule-history-dialog.tsx` (new)
- `app/(main)/dashboard/page.tsx` — wire the new dialog triggers

---

## Verification

| Scenario | Expected |
|---|---|
| Load channel with old `{blocks:[...]}` config | Blocks appear on all 7 day tabs |
| `PUT /channels/:id` on old-format channel | Config saved as V2 `{day_blocks:{...}}`; snapshot v1 created |
| Channel with Mon+Sat blocks only → generate | Slots only on Mondays and Saturdays in 7-day window |
| Day with empty block list | No slots that day, no error |
| `PUT /channels/:id` twice | `GET /config/history` returns 2 entries with incrementing version_num |
| Pin snapshot | Label persists in history list |
| Restore snapshot | Config reverts; new snapshot created at top of history |
| `GET /schedule/history/:bad_id` | 404 |
| Generate 3 schedules → rollback to gen#1 | gen#2+3 deleted (schedules + playback_records); new generation resumes from gen#1 continuity |
| Sequential block at S4E2 → rollback → regenerate | New schedule starts at correct episode |
| Payload with both `blocks` and `day_blocks` keys | `day_blocks` used, `blocks` silently ignored |
| V1 export file re-imported after V2 deploy | Deserializes correctly via compat path |
1 k-tv-backend/Cargo.lock (generated)
@@ -702,6 +702,7 @@ dependencies = [
  "email_address",
  "rand 0.8.5",
  "serde",
+ "serde_json",
  "thiserror 2.0.17",
  "tokio",
  "url",
@@ -114,7 +114,7 @@ pub struct UpdateChannelRequest {
     pub description: Option<String>,
     pub timezone: Option<String>,
     /// Replace the entire schedule config (template import/edit)
-    pub schedule_config: Option<domain::ScheduleConfig>,
+    pub schedule_config: Option<domain::ScheduleConfigCompat>,
     pub recycle_policy: Option<domain::RecyclePolicy>,
     pub auto_schedule: Option<bool>,
     pub access_mode: Option<domain::AccessMode>,
@@ -180,6 +180,34 @@ impl From<domain::Channel> for ChannelResponse {
     }
 }
+
+// ============================================================================
+// Config history DTOs
+// ============================================================================
+
+#[derive(Debug, Serialize)]
+pub struct ConfigSnapshotResponse {
+    pub id: Uuid,
+    pub version_num: i64,
+    pub label: Option<String>,
+    pub created_at: DateTime<Utc>,
+}
+
+impl From<domain::ChannelConfigSnapshot> for ConfigSnapshotResponse {
+    fn from(s: domain::ChannelConfigSnapshot) -> Self {
+        Self {
+            id: s.id,
+            version_num: s.version_num,
+            label: s.label,
+            created_at: s.created_at,
+        }
+    }
+}
+
+#[derive(Debug, Deserialize)]
+pub struct PatchSnapshotRequest {
+    pub label: Option<String>,
+}
+
 // ============================================================================
 // EPG / playback DTOs
 // ============================================================================
@@ -245,8 +273,7 @@ impl ScheduledSlotResponse {
     pub fn with_block_access(slot: domain::ScheduledSlot, channel: &domain::Channel) -> Self {
         let block_access_mode = channel
             .schedule_config
-            .blocks
-            .iter()
+            .all_blocks()
             .find(|b| b.id == slot.source_block_id)
             .map(|b| b.access_mode.clone())
             .unwrap_or_default();
@@ -306,6 +333,27 @@ pub struct TranscodeStatsResponse {
     pub item_count: usize,
 }
+
+#[derive(Debug, Serialize)]
+pub struct ScheduleHistoryEntry {
+    pub id: Uuid,
+    pub generation: u32,
+    pub valid_from: DateTime<Utc>,
+    pub valid_until: DateTime<Utc>,
+    pub slot_count: usize,
+}
+
+impl From<domain::GeneratedSchedule> for ScheduleHistoryEntry {
+    fn from(s: domain::GeneratedSchedule) -> Self {
+        Self {
+            id: s.id,
+            generation: s.generation,
+            valid_from: s.valid_from,
+            valid_until: s.valid_until,
+            slot_count: s.slots.len(),
+        }
+    }
+}
+
 impl From<domain::GeneratedSchedule> for ScheduleResponse {
     fn from(s: domain::GeneratedSchedule) -> Self {
         Self {
@@ -172,6 +172,10 @@ mod tests {
     async fn delete(&self, _id: ChannelId) -> DomainResult<()> {
         unimplemented!()
     }
+    async fn save_config_snapshot(&self, _channel_id: ChannelId, _config: &domain::ScheduleConfig, _label: Option<String>) -> DomainResult<domain::ChannelConfigSnapshot> { unimplemented!() }
+    async fn list_config_snapshots(&self, _channel_id: ChannelId) -> DomainResult<Vec<domain::ChannelConfigSnapshot>> { unimplemented!() }
+    async fn get_config_snapshot(&self, _channel_id: ChannelId, _snapshot_id: Uuid) -> DomainResult<Option<domain::ChannelConfigSnapshot>> { unimplemented!() }
+    async fn patch_config_snapshot_label(&self, _channel_id: ChannelId, _snapshot_id: Uuid, _label: Option<String>) -> DomainResult<Option<domain::ChannelConfigSnapshot>> { unimplemented!() }
 }

 struct MockScheduleRepo {
@@ -213,6 +217,9 @@ mod tests {
     ) -> DomainResult<HashMap<BlockId, MediaItemId>> {
         Ok(HashMap::new())
     }
+    async fn list_schedule_history(&self, _channel_id: ChannelId) -> DomainResult<Vec<GeneratedSchedule>> { unimplemented!() }
+    async fn get_schedule_by_id(&self, _channel_id: ChannelId, _schedule_id: Uuid) -> DomainResult<Option<GeneratedSchedule>> { unimplemented!() }
+    async fn delete_schedules_after(&self, _channel_id: ChannelId, _target_generation: u32) -> DomainResult<()> { unimplemented!() }
 }

 struct MockRegistry;
@@ -437,6 +444,9 @@ mod tests {
     ) -> DomainResult<HashMap<BlockId, MediaItemId>> {
         Ok(HashMap::new())
     }
+    async fn list_schedule_history(&self, _: ChannelId) -> DomainResult<Vec<GeneratedSchedule>> { unimplemented!() }
+    async fn get_schedule_by_id(&self, _: ChannelId, _: Uuid) -> DomainResult<Option<GeneratedSchedule>> { unimplemented!() }
+    async fn delete_schedules_after(&self, _: ChannelId, _: u32) -> DomainResult<()> { unimplemented!() }
 }

     let now = Utc::now();
@@ -61,8 +61,7 @@ pub(super) async fn get_current_broadcast(
         Some(broadcast) => {
             let block_access_mode = channel
                 .schedule_config
-                .blocks
-                .iter()
+                .all_blocks()
                 .find(|b| b.id == broadcast.slot.source_block_id)
                 .map(|b| b.access_mode.clone())
                 .unwrap_or_default();
@@ -168,8 +167,7 @@ pub(super) async fn get_stream(
     // Block-level access check
     if let Some(block) = channel
         .schedule_config
-        .blocks
-        .iter()
+        .all_blocks()
         .find(|b| b.id == broadcast.slot.source_block_id)
     {
         check_access(
72 k-tv-backend/api/src/routes/channels/config_history.rs (Normal file)
@@ -0,0 +1,72 @@
use axum::{
    Json,
    extract::{Path, State},
    http::StatusCode,
    response::IntoResponse,
};
use uuid::Uuid;

use crate::{
    dto::{ChannelResponse, ConfigSnapshotResponse, PatchSnapshotRequest},
    error::ApiError,
    extractors::CurrentUser,
    state::AppState,
};

use super::require_owner;

pub(super) async fn list_config_history(
    State(state): State<AppState>,
    CurrentUser(user): CurrentUser,
    Path(channel_id): Path<Uuid>,
) -> Result<impl IntoResponse, ApiError> {
    let channel = state.channel_service.find_by_id(channel_id).await?;
    require_owner(&channel, user.id)?;

    let snapshots = state.channel_service.list_config_snapshots(channel_id).await?;
    let response: Vec<ConfigSnapshotResponse> = snapshots.into_iter().map(Into::into).collect();
    Ok(Json(response))
}

pub(super) async fn patch_config_snapshot(
    State(state): State<AppState>,
    CurrentUser(user): CurrentUser,
    Path((channel_id, snap_id)): Path<(Uuid, Uuid)>,
    Json(payload): Json<PatchSnapshotRequest>,
) -> Result<impl IntoResponse, ApiError> {
    let channel = state.channel_service.find_by_id(channel_id).await?;
    require_owner(&channel, user.id)?;

    let updated = state
        .channel_service
        .patch_config_snapshot_label(channel_id, snap_id, payload.label)
        .await?
        .ok_or_else(|| ApiError::NotFound("Snapshot not found".into()))?;

    Ok(Json(ConfigSnapshotResponse::from(updated)))
}

pub(super) async fn restore_config_snapshot(
    State(state): State<AppState>,
    CurrentUser(user): CurrentUser,
    Path((channel_id, snap_id)): Path<(Uuid, Uuid)>,
) -> Result<impl IntoResponse, ApiError> {
    let channel = state.channel_service.find_by_id(channel_id).await?;
    require_owner(&channel, user.id)?;

    let updated = state
        .channel_service
        .restore_config_snapshot(channel_id, snap_id)
        .await
        .map_err(|e| match e {
            domain::DomainError::ChannelNotFound(_) => ApiError::NotFound("Snapshot not found".into()),
            other => ApiError::from(other),
        })?;

    let _ = state
        .activity_log_repo
        .log("config_restored", &snap_id.to_string(), Some(channel_id))
        .await;

    Ok((StatusCode::OK, Json(ChannelResponse::from(updated))))
}
@@ -102,7 +102,7 @@ pub(super) async fn update_channel(
         channel.timezone = tz;
     }
     if let Some(sc) = payload.schedule_config {
-        channel.schedule_config = sc;
+        channel.schedule_config = domain::ScheduleConfig::from(sc);
     }
     if let Some(rp) = payload.recycle_policy {
         channel.recycle_policy = rp;
@@ -13,6 +13,7 @@ use domain::{AccessMode, User};
 use crate::{error::ApiError, state::AppState};

 mod broadcast;
+mod config_history;
 mod crud;
 mod schedule;

@@ -27,9 +28,30 @@ pub fn router() -> Router<AppState> {
             "/{id}/schedule",
             post(schedule::generate_schedule).get(schedule::get_active_schedule),
         )
+        .route("/{id}/schedule/history", get(schedule::list_schedule_history))
+        .route(
+            "/{id}/schedule/history/{gen_id}",
+            get(schedule::get_schedule_history_entry),
+        )
+        .route(
+            "/{id}/schedule/history/{gen_id}/rollback",
+            post(schedule::rollback_schedule),
+        )
         .route("/{id}/now", get(broadcast::get_current_broadcast))
         .route("/{id}/epg", get(broadcast::get_epg))
         .route("/{id}/stream", get(broadcast::get_stream))
+        .route(
+            "/{id}/config/history",
+            get(config_history::list_config_history),
+        )
+        .route(
+            "/{id}/config/history/{snap_id}",
+            axum::routing::patch(config_history::patch_config_snapshot),
+        )
+        .route(
+            "/{id}/config/history/{snap_id}/restore",
+            post(config_history::restore_config_snapshot),
+        )
 }

 // ============================================================================
@@ -10,7 +10,7 @@ use uuid::Uuid;
 use domain::{self, DomainError};

 use crate::{
-    dto::ScheduleResponse,
+    dto::{ScheduleHistoryEntry, ScheduleResponse},
     error::ApiError,
     extractors::CurrentUser,
     state::AppState,
@@ -18,7 +18,7 @@ use crate::{

 use super::require_owner;

-/// Trigger 48-hour schedule generation for a channel, starting from now.
+/// Trigger 7-day schedule generation for a channel, starting from now.
 /// Replaces any existing schedule for the same window.
 pub(super) async fn generate_schedule(
     State(state): State<AppState>,
@@ -42,7 +42,7 @@ pub(super) async fn generate_schedule(
     Ok((StatusCode::CREATED, Json(ScheduleResponse::from(schedule))))
 }

-/// Return the currently active 48-hour schedule for a channel.
+/// Return the currently active 7-day schedule for a channel.
 /// 404 if no schedule has been generated yet — call POST /:id/schedule first.
 pub(super) async fn get_active_schedule(
     State(state): State<AppState>,
@@ -60,3 +60,75 @@ pub(super) async fn get_active_schedule(

     Ok(Json(ScheduleResponse::from(schedule)))
 }
+
+/// List all schedule generations for a channel, newest first.
+/// Returns lightweight entries (no slots).
+pub(super) async fn list_schedule_history(
+    State(state): State<AppState>,
+    CurrentUser(user): CurrentUser,
+    Path(channel_id): Path<Uuid>,
+) -> Result<impl IntoResponse, ApiError> {
+    let channel = state.channel_service.find_by_id(channel_id).await?;
+    require_owner(&channel, user.id)?;
+
+    let history = state.schedule_engine.list_schedule_history(channel_id).await?;
+    let entries: Vec<ScheduleHistoryEntry> = history.into_iter().map(Into::into).collect();
+    Ok(Json(entries))
+}
+
+/// Fetch a single historical schedule with all its slots.
+pub(super) async fn get_schedule_history_entry(
+    State(state): State<AppState>,
+    CurrentUser(user): CurrentUser,
+    Path((channel_id, gen_id)): Path<(Uuid, Uuid)>,
+) -> Result<impl IntoResponse, ApiError> {
+    let channel = state.channel_service.find_by_id(channel_id).await?;
+    require_owner(&channel, user.id)?;
+
+    let schedule = state
+        .schedule_engine
+        .get_schedule_by_id(channel_id, gen_id)
+        .await?
+        .ok_or_else(|| ApiError::NotFound(format!("Schedule {} not found", gen_id)))?;
+
+    Ok(Json(ScheduleResponse::from(schedule)))
+}
+
+/// Roll back to a previous schedule generation.
+///
+/// Deletes all generations after `gen_id`'s generation, then generates a fresh
+/// schedule from now (inheriting the rolled-back generation as the base for
+/// recycle-policy history).
+pub(super) async fn rollback_schedule(
+    State(state): State<AppState>,
+    CurrentUser(user): CurrentUser,
+    Path((channel_id, gen_id)): Path<(Uuid, Uuid)>,
+) -> Result<impl IntoResponse, ApiError> {
+    let channel = state.channel_service.find_by_id(channel_id).await?;
+    require_owner(&channel, user.id)?;
+
+    let target = state
+        .schedule_engine
+        .get_schedule_by_id(channel_id, gen_id)
|
||||||
|
.await?
|
||||||
|
.ok_or_else(|| ApiError::NotFound(format!("Schedule {} not found", gen_id)))?;
|
||||||
|
|
||||||
|
state
|
||||||
|
.schedule_engine
|
||||||
|
.delete_schedules_after(channel_id, target.generation)
|
||||||
|
.await?;
|
||||||
|
|
||||||
|
let schedule = state
|
||||||
|
.schedule_engine
|
||||||
|
.generate_schedule(channel_id, Utc::now())
|
||||||
|
.await?;
|
||||||
|
|
||||||
|
let _ = state.event_tx.send(domain::DomainEvent::ScheduleGenerated {
|
||||||
|
channel_id,
|
||||||
|
schedule: schedule.clone(),
|
||||||
|
});
|
||||||
|
let detail = format!("rollback to gen {}; {} slots", target.generation, schedule.slots.len());
|
||||||
|
let _ = state.activity_log_repo.log("schedule_rollback", &detail, Some(channel_id)).await;
|
||||||
|
|
||||||
|
Ok(Json(ScheduleResponse::from(schedule)))
|
||||||
|
}
|
||||||
|
|||||||
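The delete-then-regenerate contract of `rollback_schedule` can be sketched in isolation. This is a hypothetical in-memory model — the `Gen` struct and the generation numbers are illustrative stand-ins, not the real repository types:

```rust
// In-memory sketch of rollback_schedule's generation pruning.
// Rolling back to generation `target` keeps generations <= target,
// then a fresh generation (max kept + 1) is produced.
#[derive(Debug, Clone, PartialEq)]
struct Gen {
    generation: u32,
}

fn rollback(history: &mut Vec<Gen>, target: u32) -> Gen {
    // delete_schedules_after: drop every generation strictly greater than target
    history.retain(|g| g.generation <= target);
    // generate_schedule: the next generation continues from the rolled-back base
    let next = history.iter().map(|g| g.generation).max().unwrap_or(0) + 1;
    let fresh = Gen { generation: next };
    history.push(fresh.clone());
    fresh
}

fn main() {
    let mut history = vec![
        Gen { generation: 1 },
        Gen { generation: 2 },
        Gen { generation: 3 },
    ];
    // Roll back to generation 1: generations 2 and 3 are pruned,
    // and the regenerated schedule becomes generation 2.
    let fresh = rollback(&mut history, 1);
    println!("{} {}", history.len(), fresh.generation);
}
```

Note the invariant this models: after a rollback the pruned generations are unrecoverable, which is why the handler resolves `target` before deleting anything.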
@@ -127,6 +127,10 @@ mod tests {
         async fn delete(&self, _id: ChannelId) -> DomainResult<()> {
             unimplemented!()
         }
+        async fn save_config_snapshot(&self, _channel_id: ChannelId, _config: &domain::ScheduleConfig, _label: Option<String>) -> DomainResult<domain::ChannelConfigSnapshot> { unimplemented!() }
+        async fn list_config_snapshots(&self, _channel_id: ChannelId) -> DomainResult<Vec<domain::ChannelConfigSnapshot>> { unimplemented!() }
+        async fn get_config_snapshot(&self, _channel_id: ChannelId, _snapshot_id: Uuid) -> DomainResult<Option<domain::ChannelConfigSnapshot>> { unimplemented!() }
+        async fn patch_config_snapshot_label(&self, _channel_id: ChannelId, _snapshot_id: Uuid, _label: Option<String>) -> DomainResult<Option<domain::ChannelConfigSnapshot>> { unimplemented!() }
     }
 
     struct MockScheduleRepo {
@@ -168,6 +172,9 @@ mod tests {
         ) -> DomainResult<HashMap<BlockId, MediaItemId>> {
             Ok(HashMap::new())
         }
+        async fn list_schedule_history(&self, _channel_id: ChannelId) -> DomainResult<Vec<GeneratedSchedule>> { unimplemented!() }
+        async fn get_schedule_by_id(&self, _channel_id: ChannelId, _schedule_id: Uuid) -> DomainResult<Option<GeneratedSchedule>> { unimplemented!() }
+        async fn delete_schedules_after(&self, _channel_id: ChannelId, _target_generation: u32) -> DomainResult<()> { unimplemented!() }
     }
 
     struct MockRegistry;
@@ -16,3 +16,4 @@ uuid = { version = "1.19.0", features = ["v4", "serde"] }
 
 [dev-dependencies]
 tokio = { version = "1", features = ["rt", "macros"] }
+serde_json = "1"
@@ -6,11 +6,12 @@
 pub use crate::value_objects::{Email, UserId};
 use chrono::{DateTime, NaiveTime, Timelike, Utc};
 use serde::{Deserialize, Serialize};
+use std::collections::HashMap;
 use uuid::Uuid;
 
 use crate::value_objects::{
     AccessMode, BlockId, ChannelId, ContentType, FillStrategy, LogoPosition, MediaFilter,
-    MediaItemId, RecyclePolicy, SlotId,
+    MediaItemId, RecyclePolicy, SlotId, Weekday,
 };
 
 /// A user in the system.
@@ -132,40 +133,77 @@ impl Channel {
     }
 }
 
-/// The user-designed programming template.
+/// The user-designed programming template (V2: day-keyed weekly grid).
 ///
-/// This is the shareable/exportable part of a channel. It contains an ordered
-/// list of `ProgrammingBlock`s but makes no assumptions about the media source.
-/// A channel does not need to cover all 24 hours — gaps are valid and render
-/// as a no-signal state on the client.
+/// Each day of the week has its own independent list of `ProgrammingBlock`s.
+/// A day with an empty vec (or absent key) produces no slots — valid, not an error.
+/// A channel does not need to cover all 24 hours — gaps render as no-signal.
+///
+/// `deny_unknown_fields` is required so the `#[serde(untagged)]` compat enum
+/// correctly rejects V1 `{"blocks":[...]}` payloads and falls through to `OldScheduleConfig`.
 #[derive(Debug, Clone, Default, Serialize, Deserialize)]
+#[serde(deny_unknown_fields)]
 pub struct ScheduleConfig {
+    pub day_blocks: HashMap<Weekday, Vec<ProgrammingBlock>>,
+}
+
+/// V1 on-disk shape — kept for transparent migration only.
+/// Never construct directly; use `ScheduleConfigCompat` for deserialization.
+/// `deny_unknown_fields` ensures V2 payloads don't accidentally match here.
+#[derive(Debug, Clone, Deserialize)]
+#[serde(deny_unknown_fields)]
+pub struct OldScheduleConfig {
     pub blocks: Vec<ProgrammingBlock>,
 }
+
+/// Deserializes either V2 (`day_blocks`) or V1 (`blocks`) from the DB.
+/// V1 is automatically promoted: all blocks are copied to all 7 days.
+#[derive(Debug, Clone, Deserialize)]
+#[serde(untagged)]
+pub enum ScheduleConfigCompat {
+    V2(ScheduleConfig),
+    V1(OldScheduleConfig),
+}
+
+impl From<ScheduleConfigCompat> for ScheduleConfig {
+    fn from(c: ScheduleConfigCompat) -> Self {
+        match c {
+            ScheduleConfigCompat::V2(cfg) => cfg,
+            ScheduleConfigCompat::V1(old) => {
+                let day_blocks = Weekday::all()
+                    .into_iter()
+                    .map(|d| (d, old.blocks.clone()))
+                    .collect();
+                ScheduleConfig { day_blocks }
+            }
+        }
+    }
+}
 
 impl ScheduleConfig {
-    /// Return the block whose time window contains `time`, if any.
-    ///
-    /// Handles blocks that span midnight (e.g. start 23:00, duration 180 min).
-    pub fn find_block_at(&self, time: NaiveTime) -> Option<&ProgrammingBlock> {
+    /// Blocks for a given day. Returns empty slice if the day has no blocks.
+    pub fn blocks_for(&self, day: Weekday) -> &[ProgrammingBlock] {
+        self.day_blocks.get(&day).map(|v| v.as_slice()).unwrap_or(&[])
+    }
+
+    /// The block whose window contains `time` on `day`, if any.
+    pub fn find_block_at(&self, day: Weekday, time: NaiveTime) -> Option<&ProgrammingBlock> {
         let secs = time.num_seconds_from_midnight();
-        self.blocks.iter().find(|block| {
+        self.blocks_for(day).iter().find(|block| {
             let start = block.start_time.num_seconds_from_midnight();
             let end = start + block.duration_mins * 60;
             if end <= 86_400 {
                 secs >= start && secs < end
             } else {
-                // Block crosses midnight: active from `start` to `end % 86400` next day
                 secs >= start || secs < (end % 86_400)
             }
         })
     }
 
-    /// Return the start time of the next block that begins strictly after `time`,
-    /// within the same calendar day.
-    pub fn next_block_start_after(&self, time: NaiveTime) -> Option<NaiveTime> {
+    /// The start time of the next block beginning strictly after `time` on `day`.
+    pub fn next_block_start_after(&self, day: Weekday, time: NaiveTime) -> Option<NaiveTime> {
         let secs = time.num_seconds_from_midnight();
-        self.blocks
+        self.blocks_for(day)
             .iter()
             .map(|b| b.start_time.num_seconds_from_midnight())
             .filter(|&s| s > secs)
@@ -173,9 +211,15 @@ impl ScheduleConfig {
             .and_then(|s| NaiveTime::from_num_seconds_from_midnight_opt(s, 0))
     }
 
-    /// The earliest block start time across all blocks (used for next-day rollover).
+    /// Earliest block start time across ALL days (used by background scheduler).
+    /// Returns `None` if every day is empty.
     pub fn earliest_block_start(&self) -> Option<NaiveTime> {
-        self.blocks.iter().map(|b| b.start_time).min()
+        self.day_blocks.values().flatten().map(|b| b.start_time).min()
+    }
+
+    /// Iterator over all blocks across all days (for block-ID lookups that are day-agnostic).
+    pub fn all_blocks(&self) -> impl Iterator<Item = &ProgrammingBlock> {
+        self.day_blocks.values().flatten()
     }
 }
@@ -304,7 +348,7 @@ pub struct MediaItem {
     pub episode_number: Option<u32>,
 }
 
-/// A fully resolved 48-hour broadcast program for one channel.
+/// A fully resolved 7-day broadcast program for one channel.
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct GeneratedSchedule {
     pub id: Uuid,
@@ -356,6 +400,18 @@ pub struct PlaybackRecord {
     pub generation: u32,
 }
 
+/// A point-in-time snapshot of a channel's `ScheduleConfig`.
+/// Auto-created on every config save; users can pin with a label.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct ChannelConfigSnapshot {
+    pub id: Uuid,
+    pub channel_id: ChannelId,
+    pub config: ScheduleConfig,
+    pub version_num: i64,
+    pub label: Option<String>,
+    pub created_at: DateTime<Utc>,
+}
+
 impl PlaybackRecord {
     pub fn new(channel_id: ChannelId, item_id: MediaItemId, generation: u32) -> Self {
         Self {
@@ -367,3 +423,74 @@ impl PlaybackRecord {
         }
     }
 }
+
+#[cfg(test)]
+mod schedule_config_tests {
+    use super::*;
+    use chrono::NaiveTime;
+
+    fn t(h: u32, m: u32) -> NaiveTime {
+        NaiveTime::from_hms_opt(h, m, 0).unwrap()
+    }
+
+    fn make_block(start: NaiveTime, duration_mins: u32) -> ProgrammingBlock {
+        ProgrammingBlock::new_algorithmic(
+            "test", start, duration_mins,
+            Default::default(), FillStrategy::Random,
+        )
+    }
+
+    fn cfg_with_monday_block(start: NaiveTime, dur: u32) -> ScheduleConfig {
+        let mut cfg = ScheduleConfig::default();
+        cfg.day_blocks.insert(Weekday::Monday, vec![make_block(start, dur)]);
+        cfg
+    }
+
+    #[test]
+    fn find_block_at_finds_active_block() {
+        let cfg = cfg_with_monday_block(t(8, 0), 60);
+        assert!(cfg.find_block_at(Weekday::Monday, t(8, 30)).is_some());
+        assert!(cfg.find_block_at(Weekday::Monday, t(9, 0)).is_none());
+    }
+
+    #[test]
+    fn find_block_at_wrong_day_returns_none() {
+        let cfg = cfg_with_monday_block(t(8, 0), 60);
+        assert!(cfg.find_block_at(Weekday::Tuesday, t(8, 30)).is_none());
+    }
+
+    #[test]
+    fn v1_compat_copies_blocks_to_all_days() {
+        let json = r#"{"blocks": []}"#;
+        let compat: ScheduleConfigCompat = serde_json::from_str(json).unwrap();
+        let cfg: ScheduleConfig = compat.into();
+        assert_eq!(cfg.day_blocks.len(), 7);
+    }
+
+    #[test]
+    fn v2_payload_with_unknown_blocks_key_fails() {
+        let json = r#"{"blocks": [], "day_blocks": {}}"#;
+        let result: Result<ScheduleConfigCompat, _> = serde_json::from_str(json);
+        match result {
+            Ok(ScheduleConfigCompat::V2(cfg)) => {
+                let _ = cfg;
+            }
+            Ok(ScheduleConfigCompat::V1(_)) => { /* acceptable */ }
+            Err(_) => { /* acceptable — ambiguous payload rejected */ }
+        }
+    }
+
+    #[test]
+    fn earliest_block_start_across_days() {
+        let mut cfg = ScheduleConfig::default();
+        cfg.day_blocks.insert(Weekday::Monday, vec![make_block(t(10, 0), 60)]);
+        cfg.day_blocks.insert(Weekday::Friday, vec![make_block(t(7, 0), 60)]);
+        assert_eq!(cfg.earliest_block_start(), Some(t(7, 0)));
+    }
+
+    #[test]
+    fn empty_config_earliest_block_start_is_none() {
+        let cfg = ScheduleConfig::default();
+        assert!(cfg.earliest_block_start().is_none());
+    }
+}
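The midnight-wrap arithmetic in `find_block_at` is the easiest part of this diff to get wrong. A standalone sketch of just that predicate, using plain `u32` seconds-from-midnight instead of `chrono::NaiveTime` (the example times are illustrative):

```rust
// A block starting at `start` seconds-from-midnight and lasting `duration_mins`
// is active at `secs` either within the same day, or — when its end passes
// 86_400 (24h) — in the wrapped tail of the next day.
fn block_active_at(start: u32, duration_mins: u32, secs: u32) -> bool {
    let end = start + duration_mins * 60;
    if end <= 86_400 {
        // Ordinary same-day window: [start, end)
        secs >= start && secs < end
    } else {
        // Crosses midnight, e.g. start 23:00 (82_800s) + 180 min wraps to
        // 02:00 (7_200s): active late tonight OR early tomorrow.
        secs >= start || secs < (end % 86_400)
    }
}

fn main() {
    let start = 23 * 3600; // 23:00
    assert!(block_active_at(start, 180, 23 * 3600 + 1800)); // 23:30 — active
    assert!(block_active_at(start, 180, 3600));             // 01:00 — wrapped tail
    assert!(!block_active_at(start, 180, 2 * 3600 + 60));   // 02:01 — just ended
    println!("ok");
}
```

The same half-open interval convention (`< end`, never `<= end`) keeps back-to-back blocks from both claiming the boundary second.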
@@ -10,7 +10,7 @@ use chrono::DateTime;
 use chrono::Utc;
 use uuid::Uuid;
 
-use crate::entities::{Channel, GeneratedSchedule, PlaybackRecord, User};
+use crate::entities::{Channel, ChannelConfigSnapshot, GeneratedSchedule, PlaybackRecord, ScheduleConfig, User};
 use crate::errors::DomainResult;
 use crate::value_objects::{BlockId, ChannelId, MediaItemId, UserId};
 
@@ -71,6 +71,33 @@ pub trait ChannelRepository: Send + Sync {
     /// Insert or update a channel.
     async fn save(&self, channel: &Channel) -> DomainResult<()>;
     async fn delete(&self, id: ChannelId) -> DomainResult<()>;
+
+    /// Snapshot the current config before saving a new one.
+    /// version_num is computed by the infra layer as MAX(version_num)+1 inside a transaction.
+    async fn save_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        config: &ScheduleConfig,
+        label: Option<String>,
+    ) -> DomainResult<ChannelConfigSnapshot>;
+
+    async fn list_config_snapshots(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<ChannelConfigSnapshot>>;
+
+    async fn get_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>>;
+
+    async fn patch_config_snapshot_label(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+        label: Option<String>,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>>;
 }
 
 /// Repository port for `GeneratedSchedule` and `PlaybackRecord` persistence.
@@ -107,6 +134,28 @@ pub trait ScheduleRepository: Send + Sync {
         &self,
         channel_id: ChannelId,
     ) -> DomainResult<HashMap<BlockId, MediaItemId>>;
+
+    /// List all generated schedule headers for a channel, newest first.
+    async fn list_schedule_history(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<GeneratedSchedule>>;
+
+    /// Fetch a specific schedule with its slots, verifying channel ownership.
+    async fn get_schedule_by_id(
+        &self,
+        channel_id: ChannelId,
+        schedule_id: Uuid,
+    ) -> DomainResult<Option<GeneratedSchedule>>;
+
+    /// Delete all schedules with generation > target_generation for this channel.
+    /// Also deletes matching playback_records (no DB cascade between those tables).
+    /// scheduled_slots cascade via FK from generated_schedules.
+    async fn delete_schedules_after(
+        &self,
+        channel_id: ChannelId,
+        target_generation: u32,
+    ) -> DomainResult<()>;
 }
 
 /// Repository port for activity log persistence.
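The MAX(version_num)+1 convention documented on `save_config_snapshot` can be modeled in memory. This is a hypothetical sketch — the `Snapshot` struct is a stand-in, and the real implementation computes the max inside a DB transaction to avoid races:

```rust
// In-memory model of snapshot version numbering: each save gets
// MAX(version_num) + 1, starting at 1 for an empty history.
#[derive(Debug, Clone)]
struct Snapshot {
    version_num: i64,
    label: Option<String>,
}

fn save_snapshot(history: &mut Vec<Snapshot>, label: Option<String>) -> i64 {
    // unwrap_or(0) makes the first snapshot version 1.
    let next = history.iter().map(|s| s.version_num).max().unwrap_or(0) + 1;
    history.push(Snapshot { version_num: next, label });
    next
}

fn main() {
    let mut history = Vec::new();
    let v1 = save_snapshot(&mut history, None);
    let v2 = save_snapshot(&mut history, Some("before big edit".into()));
    let v3 = save_snapshot(&mut history, None);
    println!("{} {} {}", v1, v2, v3);
}
```

Using MAX+1 rather than `history.len() + 1` keeps numbering monotonic even if intermediate snapshots are ever pruned.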
@@ -1,6 +1,8 @@
 use std::sync::Arc;
 
-use crate::entities::Channel;
+use uuid::Uuid;
+
+use crate::entities::{Channel, ChannelConfigSnapshot, ScheduleConfig};
 use crate::errors::{DomainError, DomainResult};
 use crate::repositories::ChannelRepository;
 use crate::value_objects::{ChannelId, UserId};
@@ -42,10 +44,75 @@ impl ChannelService {
     }
 
     pub async fn update(&self, channel: Channel) -> DomainResult<Channel> {
+        // Auto-snapshot the existing config before overwriting
+        if let Some(existing) = self.channel_repo.find_by_id(channel.id).await? {
+            self.channel_repo
+                .save_config_snapshot(channel.id, &existing.schedule_config, None)
+                .await?;
+        }
         self.channel_repo.save(&channel).await?;
         Ok(channel)
     }
 
+    pub async fn list_config_snapshots(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<ChannelConfigSnapshot>> {
+        self.channel_repo.list_config_snapshots(channel_id).await
+    }
+
+    pub async fn get_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>> {
+        self.channel_repo.get_config_snapshot(channel_id, snapshot_id).await
+    }
+
+    pub async fn patch_config_snapshot_label(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+        label: Option<String>,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>> {
+        self.channel_repo.patch_config_snapshot_label(channel_id, snapshot_id, label).await
+    }
+
+    /// Restore a snapshot: auto-snapshot current config, then apply the snapshot's config.
+    pub async fn restore_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+    ) -> DomainResult<Channel> {
+        let snapshot = self
+            .channel_repo
+            .get_config_snapshot(channel_id, snapshot_id)
+            .await?
+            .ok_or(DomainError::ChannelNotFound(channel_id))?;
+        let mut channel = self
+            .channel_repo
+            .find_by_id(channel_id)
+            .await?
+            .ok_or(DomainError::ChannelNotFound(channel_id))?;
+        // Snapshot current config before overwriting
+        self.channel_repo
+            .save_config_snapshot(channel_id, &channel.schedule_config, None)
+            .await?;
+        channel.schedule_config = snapshot.config;
+        channel.updated_at = chrono::Utc::now();
+        self.channel_repo.save(&channel).await?;
+        Ok(channel)
+    }
+
+    pub async fn save_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        config: &ScheduleConfig,
+        label: Option<String>,
+    ) -> DomainResult<ChannelConfigSnapshot> {
+        self.channel_repo.save_config_snapshot(channel_id, config, label).await
+    }
+
     /// Delete a channel, enforcing that `requester_id` is the owner.
     pub async fn delete(&self, id: ChannelId, requester_id: UserId) -> DomainResult<()> {
         let channel = self.find_by_id(id).await?;
@@ -1,6 +1,6 @@
 use std::sync::Arc;
 
-use chrono::{DateTime, Duration, TimeZone, Utc};
+use chrono::{DateTime, Datelike, Duration, TimeZone, Utc};
 use chrono_tz::Tz;
 use uuid::Uuid;
 
@@ -20,7 +20,7 @@ mod recycle;
 
 /// Core scheduling engine.
 ///
-/// Generates 48-hour broadcast schedules by walking through a channel's
+/// Generates 7-day broadcast schedules by walking through a channel's
 /// `ScheduleConfig` day by day, resolving each `ProgrammingBlock` into concrete
 /// `ScheduledSlot`s via the `IMediaProvider`, and applying the `RecyclePolicy`
 /// to avoid replaying recently aired items.
@@ -47,12 +47,12 @@ impl ScheduleEngineService {
     // Public API
     // -------------------------------------------------------------------------
 
-    /// Generate and persist a 48-hour schedule for `channel_id` starting at `from`.
+    /// Generate and persist a 7-day schedule for `channel_id` starting at `from`.
     ///
     /// The algorithm:
-    /// 1. Walk each calendar day in the 48-hour window.
+    /// 1. Walk each calendar day in the 7-day window.
     /// 2. For each `ProgrammingBlock`, compute its UTC wall-clock interval for that day.
-    /// 3. Clip the interval to `[from, from + 48h)`.
+    /// 3. Clip the interval to `[from, from + 7d)`.
     /// 4. Resolve the block content via the media provider, applying the recycle policy.
     /// 5. For `Sequential` blocks, resume from where the previous generation left off
     ///    (series continuity — see `fill::fill_sequential`).
@@ -101,7 +101,7 @@ impl ScheduleEngineService {
             .await?;
 
         let valid_from = from;
-        let valid_until = from + Duration::hours(48);
+        let valid_until = from + Duration::days(7);
 
         let start_date = from.with_timezone(&tz).date_naive();
         let end_date = valid_until.with_timezone(&tz).date_naive();
@@ -110,7 +110,8 @@ impl ScheduleEngineService {
         let mut current_date = start_date;
 
         while current_date <= end_date {
-            for block in &channel.schedule_config.blocks {
+            let weekday = crate::value_objects::Weekday::from(current_date.weekday());
+            for block in channel.schedule_config.blocks_for(weekday) {
                 let naive_start = current_date.and_time(block.start_time);
 
                 // `earliest()` handles DST gaps — if the local time doesn't exist
@@ -123,7 +124,7 @@ impl ScheduleEngineService {
                 let block_end_utc =
                     block_start_utc + Duration::minutes(block.duration_mins as i64);
 
-                // Clip to the 48-hour window.
+                // Clip to the 7-day window.
                 let slot_start = block_start_utc.max(valid_from);
                 let slot_end = block_end_utc.min(valid_until);
 
@@ -224,6 +225,32 @@ impl ScheduleEngineService {
         self.provider_registry.get_stream_url(item_id, quality).await
     }
 
+    /// List all generated schedule headers for a channel, newest first.
+    pub async fn list_schedule_history(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<GeneratedSchedule>> {
+        self.schedule_repo.list_schedule_history(channel_id).await
+    }
+
+    /// Fetch a specific schedule with its slots.
+    pub async fn get_schedule_by_id(
+        &self,
+        channel_id: ChannelId,
+        schedule_id: uuid::Uuid,
+    ) -> DomainResult<Option<GeneratedSchedule>> {
+        self.schedule_repo.get_schedule_by_id(channel_id, schedule_id).await
+    }
+
+    /// Delete all schedules with generation > target_generation for this channel.
+    pub async fn delete_schedules_after(
+        &self,
+        channel_id: ChannelId,
+        target_generation: u32,
+    ) -> DomainResult<()> {
+        self.schedule_repo.delete_schedules_after(channel_id, target_generation).await
+    }
+
     /// Return all slots that overlap the given time window — the EPG data.
     pub fn get_epg(
         schedule: &GeneratedSchedule,
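The day-walk in `generate_schedule` reduces to: for each calendar day in the window, select that day's block list by weekday. Stripped of chrono and the media provider, the weekday selection is just modular arithmetic (day indices here are illustrative, ISO order with Monday = 0):

```rust
// Weekdays as indices 0..7, Monday = 0 (matching Weekday::all()'s ISO order).
// A window starting on weekday `start_weekday` visits weekday
// (start_weekday + day_offset) % 7 on each successive day.
fn weekday_of(day_offset: u32, start_weekday: u32) -> u32 {
    (start_weekday + day_offset) % 7
}

fn main() {
    // A 7-day window starting on a Friday (index 4) touches every weekday
    // exactly once, so every day's block list gets scheduled.
    let mut seen = [false; 7];
    for offset in 0..7 {
        seen[weekday_of(offset, 4) as usize] = true;
    }
    assert!(seen.iter().all(|&s| s));
    println!("ok");
}
```

This is also why the window moved from 48 hours to 7 days: anything shorter would silently skip some days' programming entirely.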
@@ -138,3 +138,64 @@ impl Default for RecyclePolicy {
         }
     }
 }
+
+/// Day of week, used as key in weekly schedule configs.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum Weekday {
+    Monday,
+    Tuesday,
+    Wednesday,
+    Thursday,
+    Friday,
+    Saturday,
+    Sunday,
+}
+
+impl From<chrono::Weekday> for Weekday {
+    fn from(w: chrono::Weekday) -> Self {
+        match w {
+            chrono::Weekday::Mon => Weekday::Monday,
+            chrono::Weekday::Tue => Weekday::Tuesday,
+            chrono::Weekday::Wed => Weekday::Wednesday,
+            chrono::Weekday::Thu => Weekday::Thursday,
+            chrono::Weekday::Fri => Weekday::Friday,
+            chrono::Weekday::Sat => Weekday::Saturday,
+            chrono::Weekday::Sun => Weekday::Sunday,
+        }
+    }
+}
+
+impl Weekday {
+    pub fn all() -> [Weekday; 7] {
+        // ISO week order: Monday = index 0, Sunday = index 6.
+        // The schedule engine depends on this order when iterating days.
+        [
+            Weekday::Monday, Weekday::Tuesday, Weekday::Wednesday,
+            Weekday::Thursday, Weekday::Friday, Weekday::Saturday, Weekday::Sunday,
+        ]
+    }
+}
+
+#[cfg(test)]
+mod weekday_tests {
+    use super::*;
+
+    #[test]
+    fn from_chrono_weekday_all_variants() {
+        assert_eq!(Weekday::from(chrono::Weekday::Mon), Weekday::Monday);
+        assert_eq!(Weekday::from(chrono::Weekday::Tue), Weekday::Tuesday);
+        assert_eq!(Weekday::from(chrono::Weekday::Wed), Weekday::Wednesday);
+        assert_eq!(Weekday::from(chrono::Weekday::Thu), Weekday::Thursday);
+        assert_eq!(Weekday::from(chrono::Weekday::Fri), Weekday::Friday);
+        assert_eq!(Weekday::from(chrono::Weekday::Sat), Weekday::Saturday);
+        assert_eq!(Weekday::from(chrono::Weekday::Sun), Weekday::Sunday);
+    }
+
+    #[test]
+    fn all_returns_monday_first_sunday_last() {
+        let days = Weekday::all();
+        assert_eq!(days[0], Weekday::Monday);
+        assert_eq!(days[6], Weekday::Sunday);
+    }
+}
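The V1→V2 promotion that `ScheduleConfigCompat` performs — cloning one flat block list into every day of the week — can be sketched with std types only. The `Block` struct below is a stand-in for the real `ProgrammingBlock`, and the local `Weekday` mirrors the enum added above:

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum Weekday { Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday }

impl Weekday {
    fn all() -> [Weekday; 7] {
        [Weekday::Monday, Weekday::Tuesday, Weekday::Wednesday,
         Weekday::Thursday, Weekday::Friday, Weekday::Saturday, Weekday::Sunday]
    }
}

#[derive(Debug, Clone)]
struct Block { name: String } // stand-in for ProgrammingBlock

// V1 configs had one flat list that repeated identically every day;
// promotion makes that explicit by cloning it into all 7 day slots.
fn promote_v1(blocks: Vec<Block>) -> HashMap<Weekday, Vec<Block>> {
    Weekday::all()
        .into_iter()
        .map(|d| (d, blocks.clone()))
        .collect()
}

fn main() {
    let day_blocks = promote_v1(vec![Block { name: "morning news".into() }]);
    assert_eq!(day_blocks.len(), 7);
    assert_eq!(day_blocks[&Weekday::Sunday].len(), 1);
    println!("ok");
}
```

The clone-per-day approach preserves the V1 behavior exactly (every day identical) while letting users subsequently edit days independently.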
```diff
@@ -2,7 +2,7 @@ use chrono::{DateTime, Utc};
 use sqlx::FromRow;
 use uuid::Uuid;
 
-use domain::{AccessMode, Channel, ChannelId, DomainError, LogoPosition, RecyclePolicy, ScheduleConfig, UserId};
+use domain::{AccessMode, Channel, ChannelId, DomainError, LogoPosition, RecyclePolicy, ScheduleConfig, ScheduleConfigCompat, UserId};
 
 #[derive(Debug, FromRow)]
 pub(super) struct ChannelRow {
@@ -44,10 +44,11 @@ impl TryFrom<ChannelRow> for Channel {
             .map_err(|e| DomainError::RepositoryError(format!("Invalid channel UUID: {}", e)))?;
         let owner_id: UserId = Uuid::parse_str(&row.owner_id)
             .map_err(|e| DomainError::RepositoryError(format!("Invalid owner UUID: {}", e)))?;
-        let schedule_config: ScheduleConfig = serde_json::from_str(&row.schedule_config)
+        let schedule_config: ScheduleConfig = serde_json::from_str::<ScheduleConfigCompat>(&row.schedule_config)
            .map_err(|e| {
                 DomainError::RepositoryError(format!("Invalid schedule_config JSON: {}", e))
-            })?;
+            })
+            .map(ScheduleConfig::from)?;
         let recycle_policy: RecyclePolicy = serde_json::from_str(&row.recycle_policy)
             .map_err(|e| {
                 DomainError::RepositoryError(format!("Invalid recycle_policy JSON: {}", e))
```
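The hunk above routes stored JSON through `ScheduleConfigCompat` before converting with `ScheduleConfig::from`, but the compat type itself is not in this diff. A minimal pure-std sketch of the conversion rule it implies is below. The type and field names mirror the diff, but the actual definition (which presumably uses an untagged serde enum over the two JSON shapes) lives in the `domain` crate; the replicate-to-all-days rule is an assumption consistent with the spec's note that legacy blocks "repeat identically every day".

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for the domain types; the real ones deserialize via serde.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
enum Weekday { Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday }

#[derive(Clone, Debug, PartialEq)]
struct ProgrammingBlock { name: String }

struct ScheduleConfig { day_blocks: HashMap<Weekday, Vec<ProgrammingBlock>> }

// Old rows contain {"blocks": [...]}; new rows contain {"day_blocks": {...}}.
enum ScheduleConfigCompat {
    Legacy { blocks: Vec<ProgrammingBlock> },
    Current { day_blocks: HashMap<Weekday, Vec<ProgrammingBlock>> },
}

impl From<ScheduleConfigCompat> for ScheduleConfig {
    fn from(compat: ScheduleConfigCompat) -> Self {
        match compat {
            ScheduleConfigCompat::Current { day_blocks } => ScheduleConfig { day_blocks },
            ScheduleConfigCompat::Legacy { blocks } => {
                // Legacy configs repeated identically every day, so replicate
                // the flat block list across all seven days.
                let all = [Weekday::Monday, Weekday::Tuesday, Weekday::Wednesday,
                           Weekday::Thursday, Weekday::Friday, Weekday::Saturday, Weekday::Sunday];
                let day_blocks = all.into_iter().map(|d| (d, blocks.clone())).collect();
                ScheduleConfig { day_blocks }
            }
        }
    }
}

fn main() {
    let legacy = ScheduleConfigCompat::Legacy {
        blocks: vec![ProgrammingBlock { name: "news".into() }],
    };
    let config = ScheduleConfig::from(legacy);
    assert_eq!(config.day_blocks.len(), 7);
    assert!(config.day_blocks.values().all(|b| b.len() == 1));
}
```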
```diff
@@ -1,6 +1,9 @@
 use async_trait::async_trait;
+use chrono::{DateTime, Utc};
+use sqlx::Row;
+use uuid::Uuid;
 
-use domain::{Channel, ChannelId, ChannelRepository, DomainError, DomainResult, UserId};
+use domain::{Channel, ChannelConfigSnapshot, ChannelId, ChannelRepository, DomainError, DomainResult, ScheduleConfig, ScheduleConfigCompat, UserId};
 
 use super::mapping::{ChannelRow, SELECT_COLS};
 
@@ -139,4 +142,129 @@ impl ChannelRepository for SqliteChannelRepository {
 
         Ok(())
     }
+
+    async fn save_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        config: &ScheduleConfig,
+        label: Option<String>,
+    ) -> DomainResult<ChannelConfigSnapshot> {
+        let id = Uuid::new_v4();
+        let now = Utc::now();
+        let config_json = serde_json::to_string(config)
+            .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        let mut tx = self.pool.begin().await
+            .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        let version_num: i64 = sqlx::query_scalar(
+            "SELECT COALESCE(MAX(version_num), 0) + 1 FROM channel_config_snapshots WHERE channel_id = ?"
+        )
+        .bind(channel_id.to_string())
+        .fetch_one(&mut *tx)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        sqlx::query(
+            "INSERT INTO channel_config_snapshots (id, channel_id, config_json, version_num, label, created_at)
+             VALUES (?, ?, ?, ?, ?, ?)"
+        )
+        .bind(id.to_string())
+        .bind(channel_id.to_string())
+        .bind(&config_json)
+        .bind(version_num)
+        .bind(&label)
+        .bind(now.to_rfc3339())
+        .execute(&mut *tx)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        tx.commit().await.map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        Ok(ChannelConfigSnapshot { id, channel_id, config: config.clone(), version_num, label, created_at: now })
+    }
+
+    async fn list_config_snapshots(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<ChannelConfigSnapshot>> {
+        let rows = sqlx::query(
+            "SELECT id, config_json, version_num, label, created_at
+             FROM channel_config_snapshots WHERE channel_id = ?
+             ORDER BY version_num DESC"
+        )
+        .bind(channel_id.to_string())
+        .fetch_all(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        rows.iter().map(|row| {
+            let id: Uuid = row.get::<String, _>("id").parse()
+                .map_err(|_| DomainError::RepositoryError("bad uuid".into()))?;
+            let config_json: String = row.get("config_json");
+            let config_compat: ScheduleConfigCompat = serde_json::from_str(&config_json)
+                .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+            let config: ScheduleConfig = config_compat.into();
+            let version_num: i64 = row.get("version_num");
+            let label: Option<String> = row.get("label");
+            let created_at_str: String = row.get("created_at");
+            let created_at = created_at_str.parse::<DateTime<Utc>>()
+                .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+            Ok(ChannelConfigSnapshot { id, channel_id, config, version_num, label, created_at })
+        }).collect()
+    }
+
+    async fn get_config_snapshot(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>> {
+        let row = sqlx::query(
+            "SELECT id, config_json, version_num, label, created_at
+             FROM channel_config_snapshots WHERE id = ? AND channel_id = ?"
+        )
+        .bind(snapshot_id.to_string())
+        .bind(channel_id.to_string())
+        .fetch_optional(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        match row {
+            None => Ok(None),
+            Some(row) => {
+                let config_json: String = row.get("config_json");
+                let config_compat: ScheduleConfigCompat = serde_json::from_str(&config_json)
+                    .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+                let config: ScheduleConfig = config_compat.into();
+                let version_num: i64 = row.get("version_num");
+                let label: Option<String> = row.get("label");
+                let created_at_str: String = row.get("created_at");
+                let created_at = created_at_str.parse::<DateTime<Utc>>()
+                    .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+                Ok(Some(ChannelConfigSnapshot { id: snapshot_id, channel_id, config, version_num, label, created_at }))
+            }
+        }
+    }
+
+    async fn patch_config_snapshot_label(
+        &self,
+        channel_id: ChannelId,
+        snapshot_id: Uuid,
+        label: Option<String>,
+    ) -> DomainResult<Option<ChannelConfigSnapshot>> {
+        let updated = sqlx::query(
+            "UPDATE channel_config_snapshots SET label = ? WHERE id = ? AND channel_id = ? RETURNING id"
+        )
+        .bind(&label)
+        .bind(snapshot_id.to_string())
+        .bind(channel_id.to_string())
+        .fetch_optional(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        if updated.is_none() {
+            return Ok(None);
+        }
+        self.get_config_snapshot(channel_id, snapshot_id).await
+    }
 }
```
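`save_config_snapshot` allocates `version_num` as `COALESCE(MAX(version_num), 0) + 1` inside a transaction, so versions are per-channel, monotonically increasing, and start at 1; the `UNIQUE (channel_id, version_num)` constraint in the migration rejects any duplicate a racing writer could produce. A small in-memory model of that invariant (names here are illustrative, not from the codebase):

```rust
use std::collections::HashMap;

// In-memory model of the per-channel version counter that
// "SELECT COALESCE(MAX(version_num), 0) + 1 … WHERE channel_id = ?" implements.
struct SnapshotStore {
    // channel_id -> list of (version_num, config_json)
    snapshots: HashMap<String, Vec<(i64, String)>>,
}

impl SnapshotStore {
    fn new() -> Self {
        SnapshotStore { snapshots: HashMap::new() }
    }

    fn save(&mut self, channel_id: &str, config_json: String) -> i64 {
        let versions = self.snapshots.entry(channel_id.to_string()).or_default();
        // Next version is max existing + 1, starting at 1 for an empty history.
        let next = versions.iter().map(|(v, _)| *v).max().unwrap_or(0) + 1;
        versions.push((next, config_json));
        next
    }
}

fn main() {
    let mut store = SnapshotStore::new();
    assert_eq!(store.save("ch-1", "{}".into()), 1);
    assert_eq!(store.save("ch-1", "{}".into()), 2);
    // Versions are scoped per channel, matching UNIQUE (channel_id, version_num).
    assert_eq!(store.save("ch-2", "{}".into()), 1);
}
```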
```diff
@@ -4,6 +4,7 @@ use chrono::{DateTime, Utc};
 use std::collections::HashMap;
 
 use domain::{BlockId, ChannelId, DomainError, DomainResult, GeneratedSchedule, MediaItemId, PlaybackRecord, ScheduleRepository};
+use uuid::Uuid;
 
 use super::mapping::{map_schedule, LastSlotRow, PlaybackRecordRow, ScheduleRow, SlotRow};
 
@@ -183,6 +184,77 @@ impl ScheduleRepository for SqliteScheduleRepository {
         Ok(map)
     }
+
+    async fn list_schedule_history(
+        &self,
+        channel_id: ChannelId,
+    ) -> DomainResult<Vec<GeneratedSchedule>> {
+        let rows: Vec<ScheduleRow> = sqlx::query_as(
+            "SELECT id, channel_id, valid_from, valid_until, generation \
+             FROM generated_schedules WHERE channel_id = ? ORDER BY generation DESC",
+        )
+        .bind(channel_id.to_string())
+        .fetch_all(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        rows.into_iter()
+            .map(|r| map_schedule(r, vec![]))
+            .collect()
+    }
+
+    async fn get_schedule_by_id(
+        &self,
+        channel_id: ChannelId,
+        schedule_id: Uuid,
+    ) -> DomainResult<Option<GeneratedSchedule>> {
+        let row: Option<ScheduleRow> = sqlx::query_as(
+            "SELECT id, channel_id, valid_from, valid_until, generation \
+             FROM generated_schedules WHERE id = ? AND channel_id = ?",
+        )
+        .bind(schedule_id.to_string())
+        .bind(channel_id.to_string())
+        .fetch_optional(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        match row {
+            None => Ok(None),
+            Some(r) => {
+                let slots = self.fetch_slots(&r.id).await?;
+                Some(map_schedule(r, slots)).transpose()
+            }
+        }
+    }
+
+    async fn delete_schedules_after(
+        &self,
+        channel_id: ChannelId,
+        target_generation: u32,
+    ) -> DomainResult<()> {
+        let target_gen = target_generation as i64;
+        let ch = channel_id.to_string();
+
+        sqlx::query(
+            "DELETE FROM playback_records WHERE channel_id = ? AND generation > ?",
+        )
+        .bind(&ch)
+        .bind(target_gen)
+        .execute(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        sqlx::query(
+            "DELETE FROM generated_schedules WHERE channel_id = ? AND generation > ?",
+        )
+        .bind(&ch)
+        .bind(target_gen)
+        .execute(&self.pool)
+        .await
+        .map_err(|e| DomainError::RepositoryError(e.to_string()))?;
+
+        Ok(())
+    }
 
     async fn save_playback_record(&self, record: &PlaybackRecord) -> DomainResult<()> {
         sqlx::query(
             r#"
```
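The rollback rule in `delete_schedules_after` is strict: the target generation itself survives, and everything newer is discarded from both `generated_schedules` and `playback_records`. A tiny model of that predicate (the `Generated` struct and `delete_after` helper are hypothetical, for illustration only):

```rust
// Mirrors "DELETE … WHERE generation > ?": keep target and older, drop newer.
struct Generated { generation: u32 }

fn delete_after(items: &mut Vec<Generated>, target_generation: u32) {
    items.retain(|s| s.generation <= target_generation);
}

fn main() {
    let mut schedules: Vec<Generated> =
        (1..=5).map(|g| Generated { generation: g }).collect();
    delete_after(&mut schedules, 3);
    // Generations 1..=3 remain; 4 and 5 were rolled back.
    assert_eq!(schedules.len(), 3);
    assert!(schedules.iter().all(|s| s.generation <= 3));
}
```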
```diff
@@ -59,14 +59,16 @@ pub struct DeleteChannelParams {
 pub struct SetScheduleConfigParams {
     /// Channel UUID
     pub channel_id: String,
-    /// JSON array of ProgrammingBlock objects
-    pub blocks_json: String,
+    /// JSON object of the full ScheduleConfig shape: {"monday": [...], "tuesday": [...], ...}
+    pub day_blocks_json: String,
 }
 
 #[derive(Debug, Deserialize, JsonSchema)]
 pub struct AddBlockParams {
     /// Channel UUID
     pub channel_id: String,
+    /// Day of week: "monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"
+    pub day: String,
     /// ProgrammingBlock serialized as JSON
     pub block_json: String,
 }
@@ -163,43 +165,44 @@ impl KTvMcpServer {
     }
 
     #[tool(
-        description = "Replace a channel's entire schedule config. blocks_json is a JSON array of ProgrammingBlock objects."
+        description = "Replace a channel's entire schedule config. day_blocks_json is a JSON object of the ScheduleConfig shape: {\"monday\": [...], ...}"
     )]
     async fn set_schedule_config(&self, #[tool(aggr)] p: SetScheduleConfigParams) -> String {
         let channel_id = match parse_uuid(&p.channel_id) {
             Ok(id) => id,
             Err(e) => return e,
         };
-        let blocks: Vec<ProgrammingBlock> = match serde_json::from_str(&p.blocks_json) {
-            Ok(b) => b,
+        let config: ScheduleConfig = match serde_json::from_str(&p.day_blocks_json) {
+            Ok(c) => c,
             Err(e) => {
-                return serde_json::json!({"error": format!("invalid blocks_json: {e}")})
+                return serde_json::json!({"error": format!("invalid day_blocks_json: {e}")})
                     .to_string()
             }
         };
-        channels::set_schedule_config(
-            &self.channel_service,
-            channel_id,
-            ScheduleConfig { blocks },
-        )
-        .await
+        channels::set_schedule_config(&self.channel_service, channel_id, config).await
     }
 
     #[tool(
-        description = "Append a ProgrammingBlock to a channel's schedule. block_json is a serialized ProgrammingBlock."
+        description = "Append a ProgrammingBlock to a channel's schedule for a specific day. day: monday|tuesday|wednesday|thursday|friday|saturday|sunday. block_json is a serialized ProgrammingBlock."
     )]
     async fn add_programming_block(&self, #[tool(aggr)] p: AddBlockParams) -> String {
         let channel_id = match parse_uuid(&p.channel_id) {
             Ok(id) => id,
             Err(e) => return e,
         };
+        let day: domain::Weekday = match serde_json::from_str(&format!("\"{}\"", p.day)) {
+            Ok(d) => d,
+            Err(e) => {
+                return serde_json::json!({"error": format!("invalid day: {e}")}).to_string()
+            }
+        };
         let block: ProgrammingBlock = match serde_json::from_str(&p.block_json) {
             Ok(b) => b,
             Err(e) => {
                 return serde_json::json!({"error": format!("invalid block_json: {e}")}).to_string()
             }
         };
-        channels::add_programming_block(&self.channel_service, channel_id, block).await
+        channels::add_programming_block(&self.channel_service, channel_id, day, block).await
     }
 
     #[tool(description = "Remove a programming block from a channel's schedule by block UUID")]
```
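Per its doc comment, the reworked `set_schedule_config` tool deserializes `day_blocks_json` directly into `ScheduleConfig`, keyed by lowercase day names. A sketch of a payload with that outer shape follows; note that only the day-keyed structure comes from the tool's description — the per-block fields (`id`, `name`, `start_minutes`, `duration_minutes`) and their values are placeholders, since the real `ProgrammingBlock` schema lives in the domain crate and is not shown in this diff.

```json
{
  "monday": [
    {
      "id": "5e0f2b9c-7a41-4a2f-9a63-1c2d3e4f5a6b",
      "name": "Morning cartoons",
      "start_minutes": 420,
      "duration_minutes": 180
    }
  ],
  "tuesday": [],
  "wednesday": [],
  "thursday": [],
  "friday": [],
  "saturday": [],
  "sunday": []
}
```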
```diff
@@ -95,13 +95,17 @@ pub async fn set_schedule_config(
 pub async fn add_programming_block(
     svc: &Arc<ChannelService>,
     channel_id: Uuid,
+    day: domain::Weekday,
     block: domain::ProgrammingBlock,
 ) -> String {
     let mut channel: Channel = match svc.find_by_id(channel_id).await {
         Ok(c) => c,
         Err(e) => return domain_err(e),
     };
-    channel.schedule_config.blocks.push(block);
+    channel.schedule_config.day_blocks
+        .entry(day)
+        .or_default()
+        .push(block);
     channel.updated_at = chrono::Utc::now();
     match svc.update(channel).await {
         Ok(c) => ok_json(&c),
@@ -118,7 +122,9 @@ pub async fn remove_programming_block(
         Ok(c) => c,
         Err(e) => return domain_err(e),
     };
-    channel.schedule_config.blocks.retain(|b| b.id != block_id);
+    for blocks in channel.schedule_config.day_blocks.values_mut() {
+        blocks.retain(|b| b.id != block_id);
+    }
     channel.updated_at = chrono::Utc::now();
     match svc.update(channel).await {
         Ok(c) => ok_json(&c),
```
```diff
@@ -0,0 +1,12 @@
+CREATE TABLE channel_config_snapshots (
+    id TEXT PRIMARY KEY NOT NULL,
+    channel_id TEXT NOT NULL REFERENCES channels(id) ON DELETE CASCADE,
+    config_json TEXT NOT NULL,
+    version_num INTEGER NOT NULL,
+    label TEXT,
+    created_at TEXT NOT NULL,
+    UNIQUE (channel_id, version_num)
+);
+
+CREATE INDEX idx_config_snapshots_channel
+    ON channel_config_snapshots(channel_id, version_num DESC);
```
```diff
@@ -11,6 +11,7 @@ import {
   Download,
   ChevronUp,
   ChevronDown,
+  History,
 } from "lucide-react";
 import { Button } from "@/components/ui/button";
 import { useActiveSchedule } from "@/hooks/use-channels";
@@ -29,6 +30,7 @@ interface ChannelCardProps {
   onExport: () => void;
   onMoveUp: () => void;
   onMoveDown: () => void;
+  onScheduleHistory: () => void;
 }
 
 function useScheduleStatus(channelId: string) {
@@ -69,9 +71,12 @@ export function ChannelCard({
   onExport,
   onMoveUp,
   onMoveDown,
+  onScheduleHistory,
 }: ChannelCardProps) {
   const [confirmOpen, setConfirmOpen] = useState(false);
-  const blockCount = channel.schedule_config.blocks.length;
+  const blockCount = Object.values(channel.schedule_config.day_blocks).reduce(
+    (sum, blocks) => sum + blocks.length, 0
+  );
   const { status, label } = useScheduleStatus(channel.id);
 
   const scheduleColor =
@@ -183,6 +188,15 @@ export function ChannelCard({
         >
           <CalendarDays className="size-3.5" />
         </Button>
+        <Button
+          size="icon-sm"
+          variant="ghost"
+          onClick={onScheduleHistory}
+          title="Schedule history"
+          className="text-zinc-600 hover:text-zinc-200"
+        >
+          <History className="size-3.5" />
+        </Button>
         <Button
           size="icon-sm"
           asChild
```
```diff
@@ -0,0 +1,119 @@
+'use client'
+
+import { useState } from 'react'
+import { Sheet, SheetContent, SheetHeader, SheetTitle } from '@/components/ui/sheet'
+import { Button } from '@/components/ui/button'
+import { Input } from '@/components/ui/input'
+import { useConfigHistory, usePinSnapshot, useRestoreConfig } from '@/hooks/use-channels'
+import { cn } from '@/lib/utils'
+
+interface Props {
+  channelId: string
+  open: boolean
+  onOpenChange: (open: boolean) => void
+}
+
+export function ConfigHistorySheet({ channelId, open, onOpenChange }: Props) {
+  const { data: snapshots } = useConfigHistory(channelId)
+  const pin = usePinSnapshot()
+  const restore = useRestoreConfig()
+  const [pinningId, setPinningId] = useState<string | null>(null)
+  const [pinLabel, setPinLabel] = useState('')
+
+  return (
+    <Sheet open={open} onOpenChange={onOpenChange}>
+      <SheetContent>
+        <SheetHeader>
+          <SheetTitle>Config history</SheetTitle>
+        </SheetHeader>
+        <div className="flex flex-col gap-2 mt-4 overflow-y-auto px-4 pb-4">
+          {(snapshots ?? []).map((snap, i) => (
+            <div
+              key={snap.id}
+              className={cn(
+                'flex items-center gap-3 p-3 rounded border',
+                i === 0 ? 'border-green-700 bg-green-950/30' : 'border-border'
+              )}
+            >
+              <div className="flex-1 min-w-0">
+                <div className="text-sm font-medium">
+                  v{snap.version_num} —{' '}
+                  {new Date(snap.created_at).toLocaleString()}
+                  {i === 0 && (
+                    <span className="ml-2 text-xs text-green-400 bg-green-950 px-1.5 py-0.5 rounded">
+                      current
+                    </span>
+                  )}
+                </div>
+                {snap.label ? (
+                  <div className="text-xs text-amber-400 mt-0.5">📌 {snap.label}</div>
+                ) : (
+                  <div className="text-xs text-muted-foreground">Auto-saved</div>
+                )}
+              </div>
+
+              {i === 0 && (
+                pinningId === snap.id ? (
+                  <div className="flex gap-1 items-center">
+                    <Input
+                      value={pinLabel}
+                      onChange={e => setPinLabel(e.target.value)}
+                      className="h-7 text-xs w-32"
+                      placeholder="label…"
+                      onKeyDown={e => {
+                        if (e.key === 'Enter') {
+                          pin.mutate({ channelId, snapId: snap.id, label: pinLabel })
+                          setPinningId(null)
+                        }
+                        if (e.key === 'Escape') setPinningId(null)
+                      }}
+                    />
+                    <Button
+                      size="sm"
+                      onClick={() => {
+                        pin.mutate({ channelId, snapId: snap.id, label: pinLabel })
+                        setPinningId(null)
+                      }}
+                    >
+                      Save
+                    </Button>
+                    <Button size="sm" variant="ghost" onClick={() => setPinningId(null)}>
+                      ✕
+                    </Button>
+                  </div>
+                ) : (
+                  <Button
+                    variant="outline"
+                    size="sm"
+                    onClick={() => {
+                      setPinningId(snap.id)
+                      setPinLabel(snap.label ?? '')
+                    }}
+                  >
+                    Pin
+                  </Button>
+                )
+              )}
+
+              {i > 0 && (
+                <Button
+                  variant="outline"
+                  size="sm"
+                  onClick={() => restore.mutate({ channelId, snapId: snap.id })}
+                  disabled={restore.isPending}
+                >
+                  Restore
+                </Button>
+              )}
+            </div>
+          ))}
+          {(snapshots ?? []).length === 0 && (
+            <p className="text-sm text-muted-foreground text-center py-8">
+              No history yet. History is created automatically when you save changes.
+            </p>
+          )}
+        </div>
+      </SheetContent>
+    </Sheet>
+  )
+}
```
@@ -15,6 +15,7 @@ import { RecyclePolicyEditor } from "./recycle-policy-editor";
|
|||||||
import { WebhookEditor } from "./webhook-editor";
|
import { WebhookEditor } from "./webhook-editor";
|
||||||
import { AccessSettingsEditor } from "./access-settings-editor";
|
import { AccessSettingsEditor } from "./access-settings-editor";
|
||||||
import { LogoEditor } from "./logo-editor";
|
import { LogoEditor } from "./logo-editor";
|
||||||
|
import { ConfigHistorySheet } from "./config-history-sheet";
|
||||||
import { useChannelForm } from "@/hooks/use-channel-form";
|
import { useChannelForm } from "@/hooks/use-channel-form";
|
||||||
import { channelFormSchema, extractErrors } from "@/lib/schemas";
|
import { channelFormSchema, extractErrors } from "@/lib/schemas";
|
||||||
import type { FieldErrors } from "@/lib/schemas";
|
import type { FieldErrors } from "@/lib/schemas";
|
||||||
@@ -27,7 +28,10 @@ import type {
|
|||||||
MediaFilter,
|
MediaFilter,
|
||||||
ProviderInfo,
|
ProviderInfo,
|
||||||
RecyclePolicy,
|
RecyclePolicy,
|
||||||
|
Weekday,
|
||||||
} from "@/lib/types";
|
} from "@/lib/types";
|
||||||
|
import { WEEKDAYS, WEEKDAY_LABELS } from "@/lib/types";
|
||||||
|
import { cn } from "@/lib/utils";
|
||||||
|
|
||||||
// ---------------------------------------------------------------------------
|
// ---------------------------------------------------------------------------
|
||||||
// Local shared primitives (only used inside this file)
|
// Local shared primitives (only used inside this file)
|
||||||
@@ -334,7 +338,7 @@ interface EditChannelSheetProps {
|
|||||||
name: string;
|
name: string;
|
||||||
description: string;
|
description: string;
|
||||||
timezone: string;
|
timezone: string;
|
||||||
schedule_config: { blocks: ProgrammingBlock[] };
|
schedule_config: { day_blocks: Record<Weekday, ProgrammingBlock[]> };
|
||||||
recycle_policy: RecyclePolicy;
|
recycle_policy: RecyclePolicy;
|
||||||
auto_schedule: boolean;
|
auto_schedule: boolean;
|
||||||
access_mode?: AccessMode;
|
access_mode?: AccessMode;
|
||||||
@@ -364,6 +368,29 @@ export function EditChannelSheet({
|
|||||||
}: EditChannelSheetProps) {
|
}: EditChannelSheetProps) {
|
||||||
const form = useChannelForm(channel);
|
const form = useChannelForm(channel);
|
||||||
const [fieldErrors, setFieldErrors] = useState<FieldErrors>({});
|
const [fieldErrors, setFieldErrors] = useState<FieldErrors>({});
|
||||||
|
const [activeDay, setActiveDay] = useState<Weekday>('monday');
|
||||||
|
const [copyTarget, setCopyTarget] = useState<Weekday | 'all' | ''>('');
|
||||||
|
const [configHistoryOpen, setConfigHistoryOpen] = useState(false);
|
||||||
|
|
||||||
|
const handleCopyTo = () => {
|
||||||
|
if (!copyTarget) return;
|
||||||
|
const sourceBlocks = form.dayBlocks[activeDay] ?? [];
|
||||||
|
if (copyTarget === 'all') {
|
||||||
|
const newDayBlocks = { ...form.dayBlocks };
|
||||||
|
for (const day of WEEKDAYS) {
|
||||||
|
if (day !== activeDay) {
|
||||||
|
newDayBlocks[day] = sourceBlocks.map(b => ({ ...b, id: crypto.randomUUID() }));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
form.setDayBlocks(newDayBlocks);
|
||||||
|
} else {
|
||||||
|
form.setDayBlocks({
|
||||||
|
...form.dayBlocks,
|
||||||
|
[copyTarget]: sourceBlocks.map(b => ({ ...b, id: crypto.randomUUID() })),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
setCopyTarget('');
|
||||||
|
};
|
||||||
|
|
||||||
const handleSubmit = (e: React.FormEvent) => {
|
const handleSubmit = (e: React.FormEvent) => {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
@@ -373,7 +400,7 @@ export function EditChannelSheet({
|
|||||||
name: form.name,
|
name: form.name,
|
||||||
description: form.description,
|
description: form.description,
|
||||||
timezone: form.timezone,
|
timezone: form.timezone,
|
||||||
blocks: form.blocks,
|
day_blocks: form.dayBlocks,
|
||||||
recycle_policy: form.recyclePolicy,
|
recycle_policy: form.recyclePolicy,
|
||||||
auto_schedule: form.autoSchedule,
|
auto_schedule: form.autoSchedule,
|
||||||
access_mode: form.accessMode,
|
access_mode: form.accessMode,
|
||||||
@@ -390,7 +417,7 @@ export function EditChannelSheet({
|
|||||||
name: form.name,
|
name: form.name,
|
||||||
description: form.description,
|
description: form.description,
|
||||||
timezone: form.timezone,
|
timezone: form.timezone,
|
||||||
schedule_config: { blocks: form.blocks },
|
schedule_config: { day_blocks: form.dayBlocks },
|
||||||
recycle_policy: form.recyclePolicy,
|
recycle_policy: form.recyclePolicy,
|
||||||
auto_schedule: form.autoSchedule,
|
auto_schedule: form.autoSchedule,
|
||||||
access_mode: form.accessMode !== "public" ? form.accessMode : "public",
|
access_mode: form.accessMode !== "public" ? form.accessMode : "public",
|
||||||
@@ -410,6 +437,7 @@ export function EditChannelSheet({
|
|||||||
});
|
});
|
||||||
};
|
};
|
||||||
|
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<Sheet open={open} onOpenChange={onOpenChange}>
|
<Sheet open={open} onOpenChange={onOpenChange}>
|
||||||
<SheetContent
|
<SheetContent
|
||||||
@@ -542,6 +570,47 @@ export function EditChannelSheet({
|
|||||||
|
|
||||||
{/* Right: block editor */}
|
{/* Right: block editor */}
|
||||||
<div className="flex flex-1 flex-col overflow-hidden">
|
<div className="flex flex-1 flex-col overflow-hidden">
|
||||||
|
{/* Day tab bar */}
|
||||||
|
<div className="shrink-0 flex items-center border-b border-zinc-800 overflow-x-auto">
|
||||||
|
{WEEKDAYS.map(day => (
|
||||||
|
<button
|
||||||
|
key={day}
|
||||||
|
type="button"
|
||||||
|
onClick={() => { setActiveDay(day); form.setSelectedBlockId(null); }}
|
||||||
|
className={cn(
|
||||||
|
'px-4 py-2.5 text-sm whitespace-nowrap transition-colors shrink-0',
|
||||||
|
activeDay === day
|
||||||
|
? 'border-b-2 border-blue-400 text-blue-400'
|
||||||
|
: 'text-zinc-500 hover:text-zinc-300'
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
{WEEKDAY_LABELS[day]}
|
||||||
|
</button>
|
||||||
|
))}
|
||||||
|
{/* Copy-to control */}
|
||||||
|
<div className="ml-auto flex items-center gap-1.5 px-3 py-1 text-xs text-zinc-500 shrink-0">
|
||||||
|
<span>Copy to</span>
|
||||||
|
<select
|
||||||
|
value={copyTarget}
|
||||||
|
onChange={e => setCopyTarget(e.target.value as Weekday | 'all' | '')}
|
||||||
|
className="bg-zinc-800 border border-zinc-700 rounded px-1 py-0.5 text-xs text-zinc-300"
|
||||||
|
>
|
||||||
|
<option value="">day…</option>
|
||||||
|
{WEEKDAYS.filter(d => d !== activeDay).map(d => (
|
||||||
|
<option key={d} value={d}>{WEEKDAY_LABELS[d]}</option>
|
||||||
|
))}
|
||||||
|
<option value="all">All days</option>
|
||||||
|
</select>
|
||||||
|
<button
|
||||||
|
type="button"
|
||||||
|
onClick={handleCopyTo}
|
||||||
|
className="bg-blue-900/40 border border-blue-700 text-blue-400 px-2 py-0.5 rounded text-xs hover:bg-blue-900/60"
|
||||||
|
>
|
||||||
|
Copy
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
<div className="shrink-0 space-y-3 border-b border-zinc-800 px-5 py-4">
|
<div className="shrink-0 space-y-3 border-b border-zinc-800 px-5 py-4">
|
||||||
<div className="flex items-center justify-between">
|
<div className="flex items-center justify-between">
|
||||||
<h3 className="text-xs font-semibold uppercase tracking-wider text-zinc-500">
|
<h3 className="text-xs font-semibold uppercase tracking-wider text-zinc-500">
|
||||||
@@ -551,31 +620,31 @@ export function EditChannelSheet({
                   type="button"
                   variant="outline"
                   size="xs"
-                  onClick={() => form.addBlock()}
+                  onClick={() => form.addBlock(activeDay)}
                   className="border-zinc-700 text-zinc-300 hover:text-zinc-100"
                 >
                   <Plus className="size-3" />
-                  Add block
+                  Add block for {WEEKDAY_LABELS[activeDay]}
                 </Button>
               </div>

               <BlockTimeline
-                blocks={form.blocks}
+                blocks={form.dayBlocks[activeDay] ?? []}
                 selectedId={form.selectedBlockId}
                 onSelect={form.setSelectedBlockId}
-                onChange={form.setBlocks}
+                onChange={(blocks) => form.setDayBlocks(prev => ({ ...prev, [activeDay]: blocks }))}
                 onCreateBlock={(startMins, durationMins) =>
-                  form.addBlock(startMins, durationMins)
+                  form.addBlock(activeDay, startMins, durationMins)
                 }
               />

-              {form.blocks.length === 0 ? (
+              {(form.dayBlocks[activeDay] ?? []).length === 0 ? (
                 <p className="rounded-md border border-dashed border-zinc-700 px-4 py-4 text-center text-xs text-zinc-600">
-                  No blocks yet. Drag on the timeline or click Add block.
+                  No blocks for {WEEKDAY_LABELS[activeDay]}. Drag on the timeline or click Add block.
                 </p>
               ) : (
                 <div className="max-h-48 space-y-1 overflow-y-auto">
-                  {form.blocks.map((block, idx) => (
+                  {(form.dayBlocks[activeDay] ?? []).map((block, idx) => (
                     <button
                       key={block.id}
                       type="button"
@@ -603,7 +672,7 @@ export function EditChannelSheet({
                       role="button"
                       onClick={(e) => {
                         e.stopPropagation();
-                        form.removeBlock(idx);
+                        form.removeBlock(activeDay, idx);
                       }}
                       className="rounded p-1 text-zinc-600 hover:bg-zinc-700 hover:text-red-400"
                     >
@@ -624,11 +693,12 @@ export function EditChannelSheet({
               </div>
             );
           }
-          const selectedIdx = form.blocks.findIndex(
+          const activeDayBlocks = form.dayBlocks[activeDay] ?? [];
+          const selectedIdx = activeDayBlocks.findIndex(
             (b) => b.id === form.selectedBlockId,
           );
           const selectedBlock =
-            selectedIdx >= 0 ? form.blocks[selectedIdx] : null;
+            selectedIdx >= 0 ? activeDayBlocks[selectedIdx] : null;
           if (!selectedBlock) {
             return (
               <div className="flex h-full items-center justify-center text-sm text-zinc-600">
@@ -642,7 +712,7 @@ export function EditChannelSheet({
               index={selectedIdx}
               errors={fieldErrors}
               providers={providers}
-              onChange={(b) => form.updateBlock(selectedIdx, b)}
+              onChange={(b) => form.updateBlock(activeDay, selectedIdx, b)}
             />
           );
         })()}
@@ -657,6 +727,15 @@ export function EditChannelSheet({
             </p>
           )}
           <div className="ml-auto flex gap-2">
+            <Button
+              type="button"
+              variant="outline"
+              size="sm"
+              onClick={() => setConfigHistoryOpen(true)}
+              className="border-zinc-700 text-zinc-400 hover:text-zinc-100"
+            >
+              Config history
+            </Button>
             <Button
               type="button"
               variant="ghost"
@@ -670,6 +749,13 @@ export function EditChannelSheet({
             </Button>
           </div>
         </div>
+        {channel && (
+          <ConfigHistorySheet
+            channelId={channel.id}
+            open={configHistoryOpen}
+            onOpenChange={setConfigHistoryOpen}
+          />
+        )}
       </form>
     </SheetContent>
   </Sheet>
@@ -0,0 +1,94 @@
+'use client'
+
+import { useState } from 'react'
+import {
+  Dialog,
+  DialogContent,
+  DialogHeader,
+  DialogTitle,
+} from '@/components/ui/dialog'
+import { Button } from '@/components/ui/button'
+import { useScheduleHistory, useRollbackSchedule } from '@/hooks/use-channels'
+
+interface Props {
+  channelId: string
+  open: boolean
+  onOpenChange: (open: boolean) => void
+}
+
+const fmtDateRange = (from: string, until: string) =>
+  `${new Date(from).toLocaleDateString()} – ${new Date(until).toLocaleDateString()}`
+
+export function ScheduleHistoryDialog({ channelId, open, onOpenChange }: Props) {
+  const { data: entries } = useScheduleHistory(channelId)
+  const rollback = useRollbackSchedule()
+  const [confirmId, setConfirmId] = useState<string | null>(null)
+
+  return (
+    <Dialog open={open} onOpenChange={onOpenChange}>
+      <DialogContent>
+        <DialogHeader>
+          <DialogTitle>Schedule history</DialogTitle>
+        </DialogHeader>
+        <div className="flex flex-col gap-2 mt-2 max-h-[60vh] overflow-y-auto">
+          {(entries ?? []).map((entry, i) => (
+            <div
+              key={entry.id}
+              className="flex items-center gap-3 p-3 rounded border border-border"
+            >
+              <div className="flex-1 min-w-0">
+                <div className="text-sm font-medium">
+                  Gen #{entry.generation}
+                  {i === 0 && (
+                    <span className="ml-2 text-xs text-green-400 bg-green-950 px-1.5 py-0.5 rounded">
+                      active
+                    </span>
+                  )}
+                </div>
+                <div className="text-xs text-muted-foreground mt-0.5">
+                  {fmtDateRange(entry.valid_from, entry.valid_until)}
+                </div>
+              </div>
+
+              {i > 0 && (
+                confirmId === entry.id ? (
+                  <div className="flex items-center gap-1 text-xs">
+                    <span className="text-amber-400 whitespace-nowrap">Roll back to gen #{entry.generation}?</span>
+                    <Button
+                      size="sm"
+                      variant="destructive"
+                      disabled={rollback.isPending}
+                      onClick={() => {
+                        rollback.mutate({ channelId, genId: entry.id })
+                        setConfirmId(null)
+                        onOpenChange(false)
+                      }}
+                    >
+                      Confirm
+                    </Button>
+                    <Button size="sm" variant="ghost" onClick={() => setConfirmId(null)}>
+                      Cancel
+                    </Button>
+                  </div>
+                ) : (
+                  <Button
+                    size="sm"
+                    variant="outline"
+                    onClick={() => setConfirmId(entry.id)}
+                  >
+                    Rollback to here
+                  </Button>
+                )
+              )}
+            </div>
+          ))}
+          {(entries ?? []).length === 0 && (
+            <p className="text-sm text-muted-foreground text-center py-8">
+              No schedule history yet. Generate a schedule to get started.
+            </p>
+          )}
+        </div>
+      </DialogContent>
+    </Dialog>
+  )
+}
@@ -28,10 +28,12 @@ import {
 } from "./components/import-channel-dialog";
 import { IptvExportDialog } from "./components/iptv-export-dialog";
 import { TranscodeSettingsDialog } from "./components/transcode-settings-dialog";
+import { ScheduleHistoryDialog } from "./components/schedule-history-dialog";
 import type {
   ChannelResponse,
   ProgrammingBlock,
   RecyclePolicy,
+  Weekday,
 } from "@/lib/types";

 export default function DashboardPage() {
@@ -58,6 +60,7 @@ export default function DashboardPage() {
   const [editChannel, setEditChannel] = useState<ChannelResponse | null>(null);
   const [deleteTarget, setDeleteTarget] = useState<ChannelResponse | null>(null);
   const [scheduleChannel, setScheduleChannel] = useState<ChannelResponse | null>(null);
+  const [scheduleHistoryChannelId, setScheduleHistoryChannelId] = useState<string | null>(null);

   const handleCreate = (data: {
     name: string;
@@ -84,7 +87,7 @@ export default function DashboardPage() {
     name: string;
     description: string;
     timezone: string;
-    schedule_config: { blocks: ProgrammingBlock[] };
+    schedule_config: { day_blocks: Record<Weekday, ProgrammingBlock[]> };
     recycle_policy: RecyclePolicy;
     auto_schedule: boolean;
     access_mode?: import("@/lib/types").AccessMode;
@@ -185,6 +188,7 @@ export default function DashboardPage() {
             onExport={() => exportChannel(channel)}
             onMoveUp={() => handleMoveUp(channel.id)}
             onMoveDown={() => handleMoveDown(channel.id)}
+            onScheduleHistory={() => setScheduleHistoryChannelId(channel.id)}
           />
         ))}
       </div>
@@ -245,6 +249,14 @@ export default function DashboardPage() {
         }}
       />
+
+      {scheduleHistoryChannelId && (
+        <ScheduleHistoryDialog
+          channelId={scheduleHistoryChannelId}
+          open={!!scheduleHistoryChannelId}
+          onOpenChange={open => !open && setScheduleHistoryChannelId(null)}
+        />
+      )}

       {deleteTarget && (
         <DeleteChannelDialog
           channelName={deleteTarget.name}
@@ -9,7 +9,9 @@ import type {
   ProgrammingBlock,
   MediaFilter,
   RecyclePolicy,
+  Weekday,
 } from "@/lib/types";
+import { WEEKDAYS } from "@/lib/types";

 export const WEBHOOK_PRESETS = {
   discord: `{
@@ -54,11 +56,17 @@ export function defaultBlock(startMins = 20 * 60, durationMins = 60): Programmin
   };
 }

+function emptyDayBlocks(): Record<Weekday, ProgrammingBlock[]> {
+  const result = {} as Record<Weekday, ProgrammingBlock[]>;
+  for (const d of WEEKDAYS) result[d] = [];
+  return result;
+}
+
 export function useChannelForm(channel: ChannelResponse | null) {
   const [name, setName] = useState("");
   const [description, setDescription] = useState("");
   const [timezone, setTimezone] = useState("UTC");
-  const [blocks, setBlocks] = useState<ProgrammingBlock[]>([]);
+  const [dayBlocks, setDayBlocks] = useState<Record<Weekday, ProgrammingBlock[]>>(emptyDayBlocks);
   const [recyclePolicy, setRecyclePolicy] = useState<RecyclePolicy>({
     cooldown_days: null,
     cooldown_generations: null,
@@ -84,7 +92,10 @@ export function useChannelForm(channel: ChannelResponse | null) {
       setName(channel.name);
       setDescription(channel.description ?? "");
       setTimezone(channel.timezone);
-      setBlocks(channel.schedule_config.blocks);
+      setDayBlocks({
+        ...emptyDayBlocks(),
+        ...channel.schedule_config.day_blocks,
+      });
       setRecyclePolicy(channel.recycle_policy);
       setAutoSchedule(channel.auto_schedule);
       setAccessMode(channel.access_mode ?? "public");
@@ -110,20 +121,23 @@ export function useChannelForm(channel: ChannelResponse | null) {
     }
   }, [channel]);

-  const addBlock = (startMins = 20 * 60, durationMins = 60) => {
+  const addBlock = (day: Weekday, startMins = 20 * 60, durationMins = 60) => {
     const block = defaultBlock(startMins, durationMins);
-    setBlocks((prev) => [...prev, block]);
+    setDayBlocks((prev) => ({ ...prev, [day]: [...(prev[day] ?? []), block] }));
     setSelectedBlockId(block.id);
   };

-  const updateBlock = (idx: number, block: ProgrammingBlock) =>
-    setBlocks((prev) => prev.map((b, i) => (i === idx ? block : b)));
+  const updateBlock = (day: Weekday, idx: number, block: ProgrammingBlock) =>
+    setDayBlocks((prev) => ({
+      ...prev,
+      [day]: (prev[day] ?? []).map((b, i) => (i === idx ? block : b)),
+    }));

-  const removeBlock = (idx: number) => {
-    setBlocks((prev) => {
-      const next = prev.filter((_, i) => i !== idx);
-      if (selectedBlockId === prev[idx].id) setSelectedBlockId(null);
-      return next;
+  const removeBlock = (day: Weekday, idx: number) => {
+    setDayBlocks((prev) => {
+      const dayArr = prev[day] ?? [];
+      if (selectedBlockId === dayArr[idx]?.id) setSelectedBlockId(null);
+      return { ...prev, [day]: dayArr.filter((_, i) => i !== idx) };
     });
   };

@@ -147,8 +161,8 @@ export function useChannelForm(channel: ChannelResponse | null) {
     webhookFormat, setWebhookFormat,
     webhookBodyTemplate, setWebhookBodyTemplate,
     webhookHeaders, setWebhookHeaders,
-    // Blocks
-    blocks, setBlocks,
+    // Blocks (day-keyed)
+    dayBlocks, setDayBlocks,
     selectedBlockId, setSelectedBlockId,
     recyclePolicy, setRecyclePolicy,
     addBlock,
@@ -117,3 +117,69 @@ export function useEpg(channelId: string, from?: string, until?: string, channel
     enabled: !!channelId,
   });
 }
+
+export function useConfigHistory(channelId: string) {
+  const { token } = useAuthContext();
+  return useQuery({
+    queryKey: ["config-history", channelId],
+    queryFn: () => api.channels.listConfigHistory(channelId, token!),
+    enabled: !!token && !!channelId,
+  });
+}
+
+export function usePinSnapshot() {
+  const { token } = useAuthContext();
+  const qc = useQueryClient();
+  return useMutation({
+    mutationFn: ({ channelId, snapId, label }: { channelId: string; snapId: string; label: string | null }) =>
+      api.channels.patchConfigSnapshot(channelId, snapId, label, token!),
+    onSuccess: (_, { channelId }) => qc.invalidateQueries({ queryKey: ["config-history", channelId] }),
+    onError: (e: Error) => toast.error(e.message),
+  });
+}
+
+export function useRestoreConfig() {
+  const { token } = useAuthContext();
+  const qc = useQueryClient();
+  return useMutation({
+    mutationFn: ({ channelId, snapId }: { channelId: string; snapId: string }) =>
+      api.channels.restoreConfigSnapshot(channelId, snapId, token!),
+    onSuccess: (_, { channelId }) => {
+      qc.invalidateQueries({ queryKey: ["channels"] });
+      qc.invalidateQueries({ queryKey: ["config-history", channelId] });
+    },
+    onError: (e: Error) => toast.error(e.message),
+  });
+}
+
+export function useScheduleHistory(channelId: string) {
+  const { token } = useAuthContext();
+  return useQuery({
+    queryKey: ["schedule-history", channelId],
+    queryFn: () => api.channels.listScheduleHistory(channelId, token!),
+    enabled: !!token && !!channelId,
+  });
+}
+
+export function useScheduleGeneration(channelId: string, genId: string | null) {
+  const { token } = useAuthContext();
+  return useQuery({
+    queryKey: ["schedule-generation", channelId, genId],
+    queryFn: () => api.channels.getScheduleGeneration(channelId, genId!, token!),
+    enabled: !!token && !!channelId && genId !== null,
+  });
+}
+
+export function useRollbackSchedule() {
+  const { token } = useAuthContext();
+  const qc = useQueryClient();
+  return useMutation({
+    mutationFn: ({ channelId, genId }: { channelId: string; genId: string }) =>
+      api.channels.rollbackSchedule(channelId, genId, token!),
+    onSuccess: (_, { channelId }) => {
+      qc.invalidateQueries({ queryKey: ["schedule-history", channelId] });
+      qc.invalidateQueries({ queryKey: ["schedule", channelId] });
+    },
+    onError: (e: Error) => toast.error(e.message),
+  });
+}
@@ -4,6 +4,8 @@ import { useState } from "react";
 import { useQueryClient } from "@tanstack/react-query";
 import { api } from "@/lib/api";
 import type { ChannelImportData } from "@/app/(main)/dashboard/components/import-channel-dialog";
+import { WEEKDAYS } from "@/lib/types";
+import type { Weekday } from "@/lib/types";

 export function useImportChannel(token: string | null) {
   const queryClient = useQueryClient();
@@ -26,7 +28,11 @@ export function useImportChannel(token: string | null) {
       await api.channels.update(
         created.id,
         {
-          schedule_config: { blocks: data.blocks },
+          schedule_config: {
+            day_blocks: Object.fromEntries(
+              WEEKDAYS.map(d => [d, d === 'monday' ? data.blocks : []])
+            ) as Record<Weekday, typeof data.blocks>,
+          },
           recycle_policy: data.recycle_policy,
         },
         token,
@@ -17,6 +17,8 @@ import type {
   ActivityEvent,
   ProviderConfig,
   ProviderTestResult,
+  ConfigSnapshot,
+  ScheduleHistoryEntry,
 } from "@/lib/types";

 const API_BASE =
@@ -110,6 +112,34 @@ export const api = {

     delete: (id: string, token: string) =>
       request<void>(`/channels/${id}`, { method: "DELETE", token }),
+
+    listConfigHistory: (channelId: string, token: string) =>
+      request<ConfigSnapshot[]>(`/channels/${channelId}/config/history`, { token }),
+
+    patchConfigSnapshot: (channelId: string, snapId: string, label: string | null, token: string) =>
+      request<ConfigSnapshot>(`/channels/${channelId}/config/history/${snapId}`, {
+        method: "PATCH",
+        body: JSON.stringify({ label }),
+        token,
+      }),
+
+    restoreConfigSnapshot: (channelId: string, snapId: string, token: string) =>
+      request<ChannelResponse>(`/channels/${channelId}/config/history/${snapId}/restore`, {
+        method: "POST",
+        token,
+      }),
+
+    listScheduleHistory: (channelId: string, token: string) =>
+      request<ScheduleHistoryEntry[]>(`/channels/${channelId}/schedule/history`, { token }),
+
+    getScheduleGeneration: (channelId: string, genId: string, token: string) =>
+      request<ScheduleResponse>(`/channels/${channelId}/schedule/history/${genId}`, { token }),
+
+    rollbackSchedule: (channelId: string, genId: string, token: string) =>
+      request<ScheduleResponse>(`/channels/${channelId}/schedule/history/${genId}/rollback`, {
+        method: "POST",
+        token,
+      }),
   },

   library: {
@@ -5,7 +5,7 @@ export function exportChannel(channel: ChannelResponse): void {
     name: channel.name,
     description: channel.description ?? undefined,
     timezone: channel.timezone,
-    blocks: channel.schedule_config.blocks,
+    day_blocks: channel.schedule_config.day_blocks,
     recycle_policy: channel.recycle_policy,
   };
   const blob = new Blob([JSON.stringify(payload, null, 2)], {
@@ -1,4 +1,10 @@
 import { z } from "zod";
+import { WEEKDAYS } from "@/lib/types";
+import type { Weekday } from "@/lib/types";
+
+const weekdaySchema = z.enum([
+  'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday',
+]);

 export const mediaFilterSchema = z.object({
   content_type: z.enum(["movie", "episode", "short"]).nullable().optional(),
@@ -53,7 +59,10 @@ export const channelFormSchema = z.object({
   name: z.string().min(1, "Name is required"),
   timezone: z.string().min(1, "Timezone is required"),
   description: z.string().optional(),
-  blocks: z.array(blockSchema),
+  day_blocks: z.record(weekdaySchema, z.array(blockSchema))
+    .default(() =>
+      Object.fromEntries(WEEKDAYS.map(d => [d, []])) as unknown as Record<Weekday, z.infer<typeof blockSchema>[]>
+    ),
   recycle_policy: z.object({
     cooldown_days: z.number().int().min(0).nullable().optional(),
     cooldown_generations: z.number().int().min(0).nullable().optional(),
@@ -91,8 +91,35 @@ export interface ProgrammingBlock {
   access_password?: string;
 }

+export type Weekday =
+  | 'monday' | 'tuesday' | 'wednesday' | 'thursday'
+  | 'friday' | 'saturday' | 'sunday'
+
+export const WEEKDAYS: Weekday[] = [
+  'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday',
+]
+
+export const WEEKDAY_LABELS: Record<Weekday, string> = {
+  monday: 'Mon', tuesday: 'Tue', wednesday: 'Wed', thursday: 'Thu',
+  friday: 'Fri', saturday: 'Sat', sunday: 'Sun',
+}
+
 export interface ScheduleConfig {
-  blocks: ProgrammingBlock[];
+  day_blocks: Record<Weekday, ProgrammingBlock[]>
+}
+
+export interface ConfigSnapshot {
+  id: string
+  version_num: number
+  label: string | null
+  created_at: string
+}
+
+export interface ScheduleHistoryEntry {
+  id: string
+  generation: number
+  valid_from: string
+  valid_until: string
 }

 // Config
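The day-keyed schedule model that runs through these diffs can be exercised outside React. The sketch below copies `Weekday`, `WEEKDAYS`, and `emptyDayBlocks` from the changes above, but trims `ProgrammingBlock` to just an `id` and rewrites `addBlock` as a hypothetical pure function (the real hook updates React state via `setDayBlocks`); it is an illustration, not code from the repository.

```typescript
// Weekday and WEEKDAYS as added to lib/types in this change set.
type Weekday =
  | 'monday' | 'tuesday' | 'wednesday' | 'thursday'
  | 'friday' | 'saturday' | 'sunday';

const WEEKDAYS: Weekday[] = [
  'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday',
];

// Reduced stand-in for the real ProgrammingBlock (illustration only).
interface ProgrammingBlock { id: string }

// Seed every day with an empty list so lookups never need a null check.
function emptyDayBlocks(): Record<Weekday, ProgrammingBlock[]> {
  const result = {} as Record<Weekday, ProgrammingBlock[]>;
  for (const d of WEEKDAYS) result[d] = [];
  return result;
}

// Pure version of the addBlock update: copy the map, append to one day.
function addBlock(
  dayBlocks: Record<Weekday, ProgrammingBlock[]>,
  day: Weekday,
  block: ProgrammingBlock,
): Record<Weekday, ProgrammingBlock[]> {
  return { ...dayBlocks, [day]: [...(dayBlocks[day] ?? []), block] };
}

const initial = emptyDayBlocks();
const next = addBlock(initial, 'monday', { id: 'b1' });
console.log(next.monday.length, initial.monday.length); // 1 0
```

The spread-per-day update is the same immutability pattern the `setDayBlocks` callbacks use: only the touched day's array is replaced, so the other six days keep referential identity.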