Media Posting And Processing Guide
Purpose
This wiki explains how media-backed posting works in Loreax, what is implemented today, and how each post type is expected to move through posting and processing. It also preserves the package-readiness notes for the installed media stack.
Installed Packages
- `spatie/laravel-medialibrary`
- `pbmedia/laravel-ffmpeg`
Current Status
Both packages are installed via Composer and registered through Laravel package discovery.
At the application level:
- text posts are implemented
- link posts are implemented
- image posts are implemented as draft + media registration + queued processor + publish flow
- video and audio posts now have queued processor jobs and execution logging foundations
- media processing can be toggled off through platform settings for controlled bypass
- AI prescreen moderation can queue post review and move a post to `published`, `pending_review`, or `quarantined`
- poll posts are implemented as a lightweight voting subtype; the livestream post type is still enum/model groundwork only
Current Post Creation Surface
- `POST /api/v1/content/posts`
- `PATCH /api/v1/content/posts/{post}`
- `POST /api/v1/content/posts/{post}/media`
- `POST /api/v1/content/posts/{post}/publish`
- `POST /api/v1/content/posts/{post}/unpublish`
- `DELETE /api/v1/content/posts/{post}`
- `GET /api/v1/content/posts/{post}`
Shared Posting Model
Every current post type starts with the same draft-first model.
```mermaid
flowchart TD
A["Creator POST /api/v1/content/posts"] --> B["CreatePostRequest validates payload"]
B --> C["CreatePostAction creates posts row status=draft"]
C --> D["Creator may PATCH draft"]
D --> E{"Needs uploaded media?"}
E -- "No" --> F["Creator can publish directly"]
E -- "Yes" --> G["Creator POST /api/v1/content/posts/{id}/media"]
G --> H["AttachPostMediaAction creates media row"]
H --> I{"content.post_processing_enabled?"}
I -- "No" --> J["media.processing_status=ready with bypass metadata"]
I -- "Yes" --> K["media.processing_status=pending for image, video, or audio"]
K --> L["Primary media pushes post.status=processing"]
L --> M["DB::afterCommit dispatches ProcessImageMediaJob or ProcessVideoMediaJob or ProcessAudioMediaJob"]
M --> N["PostMediaProcessingRunner marks media.processing_status=processing"]
N --> O["IPostProcessor executes logged processing steps"]
O --> P{"Processor outcome"}
P -- "Succeeded" --> Q["media.processing_status=ready"]
P -- "Failed" --> R["media.processing_status=failed"]
J --> S["Creator publishes when media requirements are satisfied"]
Q --> S
R --> T["Creator retries or replaces failed media before publish"]
```
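The status transitions in this flow can be sketched as a small state machine. The following is a minimal Python sketch under stated assumptions: the `Media` class, the bypass metadata key, and the helper names are illustrative stand-ins, not the real Laravel models or actions.

```python
from dataclasses import dataclass, field

# Kinds that have a queued processor job today
PROCESSABLE_KINDS = {"image", "video", "audio"}

@dataclass
class Media:
    # Illustrative stand-in for the real media row
    kind: str
    processing_status: str = "pending"
    metadata: dict = field(default_factory=dict)

def register_media(media, processing_enabled):
    # Mirrors the AttachPostMediaAction branch: the bypass marks the media
    # ready immediately; otherwise a processor job is queued after commit.
    if not processing_enabled:
        media.processing_status = "ready"
        media.metadata["bypassed"] = True  # illustrative bypass metadata
    elif media.kind in PROCESSABLE_KINDS:
        media.processing_status = "pending"
    return media

def run_processor(media, succeeded):
    # Mirrors PostMediaProcessingRunner: processing, then ready or failed.
    media.processing_status = "processing"
    media.processing_status = "ready" if succeeded else "failed"
    return media

assert register_media(Media(kind="image"), True).processing_status == "pending"
assert run_processor(Media(kind="image"), succeeded=False).processing_status == "failed"
```

Publish eligibility then reduces to every primary media row reading `ready`.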
Text Post Flow
Text posts are the simplest implemented path because they do not require attached media.
```mermaid
flowchart TD
A["Create text draft"] --> B["posts.type=text"]
B --> C["Optional edits via PATCH"]
C --> D["Publish request"]
D --> E["PublishPostAction marks published_at and status=published"]
E --> F["Public viewers can read post"]
```
Text Post Notes
- `body` is required at creation time
- No media rows are needed
- Publish is immediate in the current implementation
Link Post Flow
Link posts behave like text posts, except that they also require `linkUrl`.
```mermaid
flowchart TD
A["Create link draft"] --> B["posts.type=link"]
B --> C["Store title + optional body + link_url"]
C --> D["Optional edits via PATCH"]
D --> E["Publish request"]
E --> F["PublishPostAction marks status=published"]
F --> G["Public viewers can read link post"]
```
Link Post Notes
- `linkUrl` is required at creation time
- OG scraping / preview enrichment is not implemented yet
- There is no background processing job today for link posts
Image Post Flow
Image posts are the main media-backed flow currently implemented.
```mermaid
flowchart TD
A["Create image draft"] --> B["posts.type=image status=draft"]
B --> C["Upload file to storage outside the app flow"]
C --> D["POST /api/v1/content/posts/{id}/media with file metadata"]
D --> E["AttachPostMediaAction creates media row"]
E --> F["media.kind=image processing_status=pending"]
F --> G["post.status=processing"]
G --> H{"content.post_processing_enabled?"}
H -- "No" --> I["media.processing_status=ready with bypass metadata"]
H -- "Yes" --> J["DB::afterCommit dispatches ProcessImageMediaJob"]
J --> K["PostMediaProcessingRunner marks processing"]
K --> L["ImagePostProcessor records steps and builds variant manifest"]
L --> M{"Processor outcome"}
M -- "Succeeded" --> N["media.processing_status=ready and variants saved"]
M -- "Failed" --> O["media.processing_status=failed and processing_error saved"]
I --> P["Creator POST /publish"]
N --> P
P --> Q["PublishPostAction requires all primary media to be ready"]
Q --> R{"moderation.mode"}
R -- "reactive_only" --> S["post.status=published"]
R -- "ai_prescreen" --> T["post.status=pending_review and ProcessPostAiReviewJob dispatched"]
```
Image Post Notes
- The current API registers already-uploaded file metadata; it does not issue presigned upload URLs yet
- Primary image attachment updates post metadata with `mediaCount`
- Image processor jobs now create variant manifests and execution logs
- Real binary resizing through an image library is still pending
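The publish gate described above reduces to a readiness check plus a moderation branch. A hedged Python sketch follows; the function and key names are hypothetical, and the real logic lives in PublishPostAction.

```python
def can_publish(primary_media_statuses):
    # PublishPostAction requires every primary media item to be "ready"
    return all(status == "ready" for status in primary_media_statuses)

def publish(post, moderation_mode):
    # Illustrative outcome mapping; post is a plain dict stand-in
    if not can_publish(post["primary_media_statuses"]):
        raise ValueError("all primary media must be ready before publish")
    if moderation_mode == "ai_prescreen":
        post["status"] = "pending_review"  # ProcessPostAiReviewJob is queued
    else:  # "reactive_only"
        post["status"] = "published"
    return post

draft = {"status": "draft", "primary_media_statuses": ["ready", "ready"]}
assert publish(draft, "reactive_only")["status"] == "published"
```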
Video Post Flow
Video posts now have a queued processor and execution-log foundation, but not true FFmpeg-backed transcoding yet.
```mermaid
flowchart TD
A["Create video draft"] --> B["Upload original video"]
B --> C["Register media as kind=video"]
C --> D["AttachPostMediaAction stores media.processing_status=pending"]
D --> E["DB::afterCommit dispatches ProcessVideoMediaJob"]
E --> F["PostMediaProcessingRunner marks processing"]
F --> G["VideoPostProcessor records execution steps"]
G --> H["Video manifest and variants metadata are prepared"]
H --> I{"Processor outcome"}
I -- "Succeeded" --> J["media.processing_status=ready"]
I -- "Failed" --> K["media.processing_status=failed"]
J --> L["Creator publishes once all primary media are ready"]
```
Video Post Notes
- `ProcessVideoMediaJob` and `VideoPostProcessor` now exist
- Execution steps and durations are logged through the processor execution logger
- Real FFmpeg-backed transcode work is still pending
Audio Post Flow
Audio posts now have a queued processor and execution-log foundation, but not true FFmpeg-backed normalization yet.
```mermaid
flowchart TD
A["Create audio draft"] --> B["Upload original audio"]
B --> C["Register media as kind=audio"]
C --> D["AttachPostMediaAction stores media.processing_status=pending"]
D --> E["DB::afterCommit dispatches ProcessAudioMediaJob"]
E --> F["PostMediaProcessingRunner marks processing"]
F --> G["AudioPostProcessor records execution steps"]
G --> H["Audio manifest metadata is prepared"]
H --> I{"Processor outcome"}
I -- "Succeeded" --> J["media.processing_status=ready"]
I -- "Failed" --> K["media.processing_status=failed"]
J --> L["Creator publishes once all primary media are ready"]
```
Audio Post Notes
- `ProcessAudioMediaJob` and `AudioPostProcessor` now exist
- Execution steps and durations are logged through the processor execution logger
- Real FFmpeg-backed normalization is still pending
AI Prescreen Flow
When `moderation.mode` is set to `ai_prescreen`, publishing becomes an asynchronous review step.
```mermaid
flowchart TD
A["Creator POST /api/v1/content/posts/{id}/publish"] --> B["PublishPostAction validates media readiness"]
B --> C["Post.status=pending_review moderation_state=under_review"]
C --> D["DB::afterCommit dispatches ProcessPostAiReviewJob"]
D --> E["PostAiReviewRunner calls IModerationAiProvider"]
E --> F["Score compared with moderation.ai_review_threshold and moderation.ai_revoke_threshold"]
F --> G{"Decision"}
G -- "Below review threshold" --> H["post.status=published moderation_state=clean"]
G -- "Above review threshold" --> I["post.status=pending_review moderation_state=under_review"]
G -- "Above revoke threshold" --> J["post.status=quarantined moderation_state=flagged"]
H --> K["Execution logged to post_processor_runs"]
I --> K
J --> K
```
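Assuming a higher score means higher risk and inclusive threshold comparisons (the actual semantics live in PostAiReviewRunner), the decision logic can be sketched as:

```python
def ai_prescreen_decision(score, review_threshold, revoke_threshold):
    # Illustrative mapping of an AI risk score to (post.status, moderation_state).
    # Assumes revoke_threshold >= review_threshold.
    if score >= revoke_threshold:
        return ("quarantined", "flagged")
    if score >= review_threshold:
        return ("pending_review", "under_review")
    return ("published", "clean")

assert ai_prescreen_decision(0.1, 0.5, 0.9) == ("published", "clean")
assert ai_prescreen_decision(0.6, 0.5, 0.9) == ("pending_review", "under_review")
```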
Processor Execution Logging
Every image, video, or audio processor job can emit a Mongo document describing the run.
```mermaid
flowchart TD
A["ProcessImageMediaJob or ProcessVideoMediaJob or ProcessAudioMediaJob"] --> B["PostMediaProcessingRunner loads media and related post"]
B --> C["Runner creates executionId and ProcessingStepRecorder"]
C --> D["Runner sets media.processing_status=processing"]
D --> E["IPostProcessor::process executes named steps"]
E --> F["ProcessingStepRecorder captures duration and outcome for each step"]
F --> G{"Processor outcome"}
G -- "Succeeded" --> H["Runner saves variants, metadata, providerAssetId, and status=ready"]
G -- "Failed" --> I["Runner saves status=failed and processing_error"]
H --> J["MongoProcessorLogger writes post_processor_runs document"]
I --> J
```
What Gets Logged
- One Mongo document is written per processor run into `post_processor_runs`
- `processorKey` stores the logical processor identifier such as `image`, `video`, or `audio`
- `processorName` stores the processor class name used for the run
- `jobClass`, `queueConnection`, `queueName`, and `attempt` make queue behavior traceable
- `mediaUuid`, `mediaKind`, `mediaCollection`, `storageDisk`, and `storagePath` tie the run back to the uploaded asset
- `mediaStatusBefore` and `mediaStatusAfter` show the state transition through the pipeline
- `postUuid`, `postType`, `creatorUuid`, and `userUuid` let us group runs by post, owner, and actor/subject
- `result`, `durationMs`, `stepCount`, `memoryUsageBytes`, and `peakMemoryUsageBytes` support performance analysis
- `resultPayload`, `providerAssetId`, `errorClass`, and `errorMessage` capture the final processor outcome
- `executionSteps` stores the per-step timeline with `name`, `status`, `durationMs`, `input`, `output`, `metadata`, and error details
- `startedAt`, `completedAt`, and `occurredAt` provide the run timestamps
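Put together, a single run document might look like the sketch below. Only the field names come from this page; every value, including the step names, is invented for illustration.

```python
# Illustrative post_processor_runs document; all values below are made up.
run = {
    "processorKey": "image",
    "processorName": "ImagePostProcessor",
    "jobClass": "ProcessImageMediaJob",
    "queueConnection": "redis",
    "queueName": "media",
    "attempt": 1,
    "mediaUuid": "00000000-0000-0000-0000-000000000001",
    "mediaKind": "image",
    "mediaStatusBefore": "pending",
    "mediaStatusAfter": "ready",
    "postUuid": "00000000-0000-0000-0000-000000000002",
    "postType": "image",
    "result": "succeeded",
    "durationMs": 142,
    "stepCount": 3,
    "executionSteps": [
        {"name": "load_original", "status": "succeeded", "durationMs": 12},
        {"name": "build_variant_manifest", "status": "succeeded", "durationMs": 95},
        {"name": "persist_metadata", "status": "succeeded", "durationMs": 35},
    ],
}

# stepCount should agree with the recorded per-step timeline
assert run["stepCount"] == len(run["executionSteps"])
```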
Why Step Logging Matters
- One post can have more than one attached media asset
- One media asset can require more than one processing step
- Failed runs are diagnosable without replaying the whole job
- Slow steps can be isolated before real FFmpeg and binary image transforms are added
Poll Post Flow
Polls do not depend on uploaded media and are now implemented as a lightweight post subtype.
```mermaid
flowchart TD
A["Create poll draft"] --> B["Store poll question and options on poll + poll_options"]
B --> C["Optional edits before voting starts"]
C --> D["Publish poll post"]
D --> E["Users vote through POST /api/v1/content/posts/{id}/poll/vote"]
E --> F["Existing vote is replaced for that user"]
F --> G["Poll option counts and total voter count are recalculated"]
```
Poll Post Notes
- Poll draft creation is part of `POST /api/v1/content/posts`
- Single-choice and multiple-choice voting are both supported
- Re-voting updates the caller's selection instead of creating duplicate votes
- Access-entitlement checks beyond current post visibility remain later Access-domain work
Livestream Post Flow
Livestreams are still planned rather than implemented.
```mermaid
flowchart TD
A["Create livestream draft"] --> B["Provision external livestream asset"]
B --> C["Store ingest/playback metadata"]
C --> D["Publish livestream post"]
D --> E["Optionally attach recording later"]
```
Livestream Post Notes
- This is the intended Phase 5.11 path
- `ILivestreamProvider` and provider-backed provisioning are still pending
Read Flow For Published Versus Private Posts
```mermaid
flowchart TD
A["Client GET /api/v1/content/posts/{id}"] --> B["ShowPostAction loads post through visibleTo(viewer)"]
B --> C{"Viewer is owner?"}
C -- "Yes" --> D["Owner can read editable non-public states"]
C -- "No" --> E{"Post is published?"}
E -- "Yes" --> F["Return post"]
E -- "No" --> G["Return 404"]
```
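The visibility rule reduces to a two-branch check. An illustrative sketch of the `visibleTo(viewer)` scope (the dict shape is hypothetical):

```python
def visible_to(post, viewer_id):
    # Owners may read their own non-public states; everyone else only
    # sees published posts, and anything else surfaces as a 404.
    if viewer_id is not None and viewer_id == post["owner_id"]:
        return True
    return post["status"] == "published"

assert visible_to({"owner_id": "u1", "status": "draft"}, "u1")
assert not visible_to({"owner_id": "u1", "status": "draft"}, "u2")
```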
What "Ready For Use" Means In This Repo
- No additional Composer work is required
- No manual service-provider registration is required
- FFmpeg runtime path placeholders already exist in `.env.example`: `FFMPEG_BINARY_PATH` and `FFPROBE_BINARY_PATH`
- The broader technical design already expects local FFmpeg-backed video/audio providers and a media-attachment lifecycle in the Content domain
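For example, a local setup might point those placeholders at system binaries. The paths below are illustrative; your install location may differ.

```
# Illustrative local values for the .env placeholders
FFMPEG_BINARY_PATH=/usr/bin/ffmpeg
FFPROBE_BINARY_PATH=/usr/bin/ffprobe
```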
Package Roles
spatie/laravel-medialibrary
- Attach uploaded media to content models
- Track original files and derived assets
- Support later media conversions and variant management in the Content domain
pbmedia/laravel-ffmpeg
- Drive queued video and audio processing through local FFmpeg/FFprobe
- Back the future `FFmpegLocalVideoProvider` and `FFmpegLocalAudioProvider`
- Support HLS generation, normalization, thumbnails, and other derived media operations
Processor Scheduling and Recovery
All media processors support two independent trigger modes, controlled by env flags in the Processor Scheduling and Processor Event Triggers sections of .env.
Event-Based Trigger (default: enabled)
`PROCESSOR_EVENT_TRIGGER_MEDIA_ENABLED=true` (default). When `AttachPostMediaAction` creates a media row, it dispatches the matching job inside `DB::afterCommit`:
| Kind | Job |
|---|---|
| image | `ProcessImageMediaJob` |
| audio | `ProcessAudioMediaJob` |
| video | `ProcessVideoMediaJob` |
Setting this to false suspends all new upload-triggered processing without affecting already-queued jobs. Use this only for planned maintenance.
Schedule-Based Trigger (default: disabled)
`PROCESSOR_SCHEDULE_MEDIA_SWEEP_ENABLED=true` activates a periodic sweep. It requires `php artisan schedule:run` to be wired into the OS cron scheduler (`* * * * * php artisan schedule:run >> /dev/null 2>&1`).
The `processor:sweep-pending-media` command runs on `PROCESSOR_SCHEDULE_MEDIA_SWEEP_CRON` (default `*/15 * * * *`) and re-queues media items where:
- `processing_status` is in `PROCESSOR_SCHEDULE_MEDIA_SWEEP_STATUSES` (default `pending,failed`)
- or `processing_status = processing` and `updated_at ≤ now - PROCESSOR_SCHEDULE_MEDIA_SWEEP_STUCK_MINUTES` (default 30 min)
The sweep skips the `thumbnail`, `transcript`, and `livestream_recording` kinds (no corresponding processor exists). It uses `withoutOverlapping()` and `runInBackground()` so overlapping sweeps cannot stack.
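The selection rule above can be expressed as a predicate. This is a Python sketch; the real query is a database query inside the sweep command, and the default values mirror the env defaults stated here.

```python
from datetime import datetime, timedelta

# Kinds the sweep skips because no processor exists for them
EXCLUDED_KINDS = {"thumbnail", "transcript", "livestream_recording"}

def needs_requeue(kind, status, updated_at, now,
                  statuses=("pending", "failed"), stuck_minutes=30):
    # True when the sweep should re-dispatch this media item
    if kind in EXCLUDED_KINDS:
        return False
    if status in statuses:
        return True
    # "processing" rows count as stuck once they stop moving for too long
    return (status == "processing"
            and updated_at <= now - timedelta(minutes=stuck_minutes))

now = datetime(2025, 1, 1, 12, 0)
assert needs_requeue("image", "failed", now, now)
assert not needs_requeue("thumbnail", "failed", now, now)
```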
Manual Recovery
```shell
# Dry-run — list matched items without dispatching
php artisan processor:sweep-pending-media --dry-run

# Re-queue only failed items
php artisan processor:sweep-pending-media --statuses=failed

# Re-queue pending + failed + items stuck >60 min
php artisan processor:sweep-pending-media --statuses=pending,failed --stuck-minutes=60
```
Current Boundaries
- The packages are installed and repo-ready, but the full processing layer is only partially built
- The current implementation stores reusable `media` rows and returns them through `PostResource`
- No custom package config has been published into `config/` at this stage because the current repo does not yet need project-specific overrides
- Future processing should continue through domain actions/jobs rather than using package APIs ad hoc from controllers