> Full Neon documentation index: https://neon.com/docs/llms.txt

# Creating a Content Moderation System with Laravel, OpenAI API, and Neon Postgres

Build an automated content moderation system for your application using Laravel Livewire, OpenAI's moderation API, and Neon Postgres

Content moderation is essential for maintaining healthy online communities and platforms. In this guide, we'll create a content moderation system that uses OpenAI's moderation API to automatically analyze and flag potentially problematic content before it reaches your users. We will use Laravel, OpenAI's moderation API, and Neon's serverless Postgres database to build a system that can moderate comments, forum posts, product reviews, or any other user-generated content.

## What You'll Build

In this guide, you'll build a content moderation system with the following features:

1. A form for users to submit content to our Neon database
2. Automatic content analysis using [OpenAI's moderation API](https://platform.openai.com/docs/guides/moderation)
3. A moderation queue for reviewing flagged content
4. A dashboard for viewing moderation statistics
5. Settings management for different content types

## Prerequisites

To follow the steps in this guide, you will need:

- PHP 8.2 or higher
- [Composer](https://getcomposer.org/) installed
- A [Neon](https://console.neon.tech/signup) account
- An [OpenAI](https://platform.openai.com/signup) account with API access
- Basic familiarity with Laravel and PHP

## Create a Neon Project

Neon provides a serverless Postgres database that automatically scales as your application grows. Let's set up a Neon database for our content moderation system:

1. Navigate to the [Projects](https://console.neon.tech/app/projects) page in the Neon Console.
2. Click "New Project" and select your preferred settings.
3. Once your project is created, you'll see the connection details. Save the connection string for later use.
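The connection string you copied contains every value you will need for Laravel's database configuration in the next step. As a sanity check, here is how its pieces map onto the `.env` keys Laravel uses (the hostname and credentials below are placeholders, not real values):

```python
from urllib.parse import urlparse, parse_qs

# Placeholder Neon connection string -- substitute your own
conn_string = (
    "postgresql://alex:AbC123dEf"
    "@ep-cool-darkness-123456.us-east-2.aws.neon.tech/neondb?sslmode=require"
)

parsed = urlparse(conn_string)
env_values = {
    "DB_CONNECTION": "pgsql",
    "DB_HOST": parsed.hostname,
    "DB_PORT": parsed.port or 5432,  # Neon uses the default Postgres port
    "DB_DATABASE": parsed.path.lstrip("/"),
    "DB_USERNAME": parsed.username,
    "DB_PASSWORD": parsed.password,
    "DB_SSLMODE": parse_qs(parsed.query).get("sslmode", ["require"])[0],
}

for key, value in env_values.items():
    print(f"{key}={value}")
```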
## Set up a Laravel Project

Now, let's create a new Laravel project and set it up to work with our Neon database:

```bash
composer create-project laravel/laravel moderation-system
cd moderation-system
```

This creates a new Laravel 11 project in a directory called `moderation-system` and moves you into that directory.

## Configure Environment Variables

To configure your Laravel application to connect to Neon Postgres and OpenAI, you need to set up your environment variables.

1. Open the `.env` file in your Laravel project directory.

2. Update your database configuration with the Neon connection details:

   ```
   DB_CONNECTION=pgsql
   DB_HOST=your-neon-hostname.neon.tech
   DB_PORT=5432
   DB_DATABASE=neondb
   DB_USERNAME=your-username
   DB_PASSWORD=your-password
   DB_SSLMODE=require
   ```

3. Add your OpenAI API key:

   ```
   OPENAI_API_KEY=your-openai-api-key
   ```

The `OPENAI_API_KEY` will be used by our moderation service to communicate with [OpenAI's moderation API](https://platform.openai.com/docs/guides/moderation).

## Install Livewire and Other Required Packages

Let's install the necessary packages for our project:

```bash
composer require livewire/livewire openai-php/laravel
```

This installs:

- [Livewire](https://livewire.laravel.com/): A Laravel package that makes building dynamic web apps simple, without writing JavaScript
- [OpenAI Laravel Client](https://github.com/openai-php/laravel): A library for interacting with OpenAI's API within Laravel

Next, let's install Laravel Breeze with Livewire for authentication and UI scaffolding:

```bash
composer require laravel/breeze --dev
php artisan breeze:install livewire
```

After installing Breeze, follow the instructions to complete the setup:

```bash
npm install
npm run build
```

This will install the necessary NPM packages and build your static assets.
Let's also run the migrations to create the default Laravel tables:

```bash
php artisan migrate
```

**Important:** Neon supports both direct and pooled database connection strings, which can be copied from the **Connection Details** widget on your Neon Project Dashboard. A pooled connection string connects your application to the database via a PgBouncer connection pool, allowing for a higher number of concurrent connections. However, using a pooled connection string for migrations can be prone to errors. For this reason, we recommend using a direct (non-pooled) connection when performing migrations. For more information about direct and pooled connections, see [Connection pooling](https://neon.com/docs/connect/connection-pooling).

## Create Database Schema

Now we'll create the database schema for our content moderation system. We need to track three main types of data:

1. Content items that need moderation
2. Moderation results from the OpenAI API
3. Moderation settings for different content types

Let's create the migrations:

```bash
php artisan make:migration create_content_items_table
php artisan make:migration create_moderation_results_table
php artisan make:migration create_moderation_settings_table
```

This will create three migration files in the `database/migrations` directory. Now, let's define the schema for each table:

### 1. Content Items Table

This table stores the actual content that needs moderation:

```php
// database/migrations/xxxx_xx_xx_create_content_items_table.php
public function up(): void
{
    Schema::create('content_items', function (Blueprint $table) {
        // Primary key
        $table->id();

        // Foreign key to the user who created the content (optional)
        $table->foreignId('user_id')->nullable()->constrained()->onDelete('set null');

        // Type of content (e.g., 'comment', 'post', 'review')
        $table->string('content_type');

        // The actual content text
        $table->text('content');

        // Current moderation status ('pending', 'approved', 'rejected')
        $table->string('status')->default('pending');

        // Created/updated timestamps
        $table->timestamps();
    });
}
```

### 2. Moderation Results Table

This table stores the results returned by the OpenAI moderation API:

```php
// database/migrations/xxxx_xx_xx_create_moderation_results_table.php
public function up(): void
{
    Schema::create('moderation_results', function (Blueprint $table) {
        // Primary key
        $table->id();

        // Foreign key to the content item being moderated
        $table->foreignId('content_item_id')->constrained()->onDelete('cascade');

        // Whether the content was flagged by the moderation API
        $table->boolean('flagged');

        // Categories that were flagged (stored as JSON)
        $table->json('categories')->nullable();

        // Scores for each category (stored as JSON)
        $table->json('category_scores')->nullable();

        // Highest confidence score among all categories
        $table->decimal('confidence', 8, 6)->nullable();

        // Created/updated timestamps
        $table->timestamps();
    });
}
```

### 3. Moderation Settings Table

This table stores moderation settings for different content types:

```php
// database/migrations/xxxx_xx_xx_create_moderation_settings_table.php
public function up(): void
{
    Schema::create('moderation_settings', function (Blueprint $table) {
        // Primary key
        $table->id();

        // Type of content these settings apply to
        $table->string('content_type');

        // Categories to flag (stored as JSON)
        $table->json('flagged_categories')->nullable();

        // Threshold for auto-rejection (0-1)
        $table->decimal('confidence_threshold', 8, 6)->default(0.5);

        // Whether to auto-approve content that passes moderation
        $table->boolean('auto_approve')->default(false);

        // Created/updated timestamps
        $table->timestamps();
    });
}
```

Now run the migrations to create the tables in your Neon database:

```bash
php artisan migrate
```

After completing your migrations, you can switch to a pooled connection for better performance in your application.

## Create Models

Now let's create the Eloquent models for our database tables. These models will help us interact with the database using Laravel's ORM:

```bash
php artisan make:model ContentItem
php artisan make:model ModerationResult
php artisan make:model ModerationSetting
```

This will create three model files in the `app/Models` directory. Let's define each model with their relationships and attributes:

### 1. ContentItem Model

```php
// app/Models/ContentItem.php
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\BelongsTo;
use Illuminate\Database\Eloquent\Relations\HasOne;

class ContentItem extends Model
{
    /**
     * The attributes that are mass assignable.
     */
    protected $fillable = [
        'user_id',
        'content_type',
        'content',
        'status',
    ];

    /**
     * Get the moderation result for this content item.
     * This establishes a one-to-one relationship with ModerationResult.
     */
    public function moderationResult(): HasOne
    {
        return $this->hasOne(ModerationResult::class);
    }

    /**
     * Get the user who created this content item.
     * This establishes a many-to-one relationship with User.
     */
    public function user(): BelongsTo
    {
        return $this->belongsTo(User::class);
    }
}
```

In the `ContentItem` model, we define the `$fillable` array to specify which fields can be mass-assigned. We also define relationships with the `ModerationResult` and `User` models, which will allow us to retrieve related data without writing complex SQL queries.

### 2. ModerationResult Model

```php
// app/Models/ModerationResult.php
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\BelongsTo;

class ModerationResult extends Model
{
    /**
     * The attributes that are mass assignable.
     */
    protected $fillable = [
        'content_item_id',
        'flagged',
        'categories',
        'category_scores',
        'confidence',
    ];

    /**
     * The attributes that should be cast.
     */
    protected $casts = [
        'flagged' => 'boolean',          // Convert to PHP boolean
        'categories' => 'array',         // Convert JSON to PHP array
        'category_scores' => 'array',    // Convert JSON to PHP array
        'confidence' => 'float',         // Convert to PHP float
    ];

    /**
     * Get the content item associated with this moderation result.
     */
    public function contentItem(): BelongsTo
    {
        return $this->belongsTo(ContentItem::class);
    }
}
```

Here again, we define the `$fillable` array to specify which fields can be mass-assigned. We also define a relationship with the `ContentItem` model to retrieve the content item associated with this moderation result.

### 3. ModerationSetting Model

```php
// app/Models/ModerationSetting.php
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class ModerationSetting extends Model
{
    /**
     * The attributes that are mass assignable.
     */
    protected $fillable = [
        'content_type',
        'flagged_categories',
        'confidence_threshold',
        'auto_approve',
    ];

    /**
     * The attributes that should be cast.
     */
    protected $casts = [
        'flagged_categories' => 'array',     // Convert JSON to PHP array
        'confidence_threshold' => 'float',   // Convert to PHP float
        'auto_approve' => 'boolean',         // Convert to PHP boolean
    ];
}
```

Similar to the other models, we define the structure of our data and the relationships between them. The `ModerationSetting` model will store the moderation settings for different content types.

## Build Moderation Service

Now, let's create a service class that will handle the content moderation logic. This service will use the OpenAI API to analyze content and store the results. First, create a new directory for services:

```bash
mkdir -p app/Services
```

Now, create the moderation service file:

```php
// app/Services/ModerationService.php
<?php

namespace App\Services;

use App\Models\ContentItem;
use App\Models\ModerationResult;
use App\Models\ModerationSetting;
use Exception;
use Illuminate\Support\Facades\Log;
use OpenAI;

class ModerationService
{
    protected $client;

    public function __construct()
    {
        // Initialize the OpenAI client with your API key
        $this->client = OpenAI::client(env('OPENAI_API_KEY'));
    }

    /**
     * Moderate a content item using OpenAI's moderation API.
     *
     * @param ContentItem $contentItem The content item to moderate
     * @return ModerationResult The result of the moderation
     * @throws Exception If the moderation API request fails
     */
    public function moderateContent(ContentItem $contentItem)
    {
        try {
            // Get the content and settings
            $content = $contentItem->content;

            // Find or create settings for this content type
            $settings = ModerationSetting::where('content_type', $contentItem->content_type)->first();

            if (!$settings) {
                // Create default settings if none exist
                $settings = ModerationSetting::create([
                    'content_type' => $contentItem->content_type,
                    'flagged_categories' => null,  // Consider all categories
                    'confidence_threshold' => 0.5, // Medium threshold
                    'auto_approve' => false,       // Don't auto-approve
                ]);
            }

            // Call OpenAI moderation API
            $response = $this->client->moderations()->create([
                'input' => $content,
            ]);

            // Process response
            $result = $response->results[0];
            $flagged = $result->flagged;

            // Extract categories and scores
            $categories = [];
            $categoryScores = [];

            // Loop through each category in the response
            foreach ($result->categories as $key => $category) {
                $categoryScores[$key] = $category->score;

                if ($category->violated) {
                    $categories[] = $key;
                }
            }

            // Determine highest score as overall confidence
            $confidence = !empty($categoryScores) ? max($categoryScores) : 0;

            // Save moderation result to database
            $moderationResult = ModerationResult::create([
                'content_item_id' => $contentItem->id,
                'flagged' => $flagged,
                'categories' => $categories,
                'category_scores' => $categoryScores,
                'confidence' => $confidence,
            ]);

            // Auto-approve or auto-reject based on settings
            if (!$flagged && $settings->auto_approve) {
                // Content is clean and auto-approve is enabled
                $contentItem->update(['status' => 'approved']);
            } elseif ($flagged && $confidence >= $settings->confidence_threshold) {
                // Content is flagged with confidence above threshold
                $contentItem->update(['status' => 'rejected']);
            }

            return $moderationResult;
        } catch (Exception $e) {
            // Log the error and rethrow
            Log::error('Moderation API error: ' . $e->getMessage());
            throw $e;
        }
    }

    /**
     * Approve a content item.
     *
     * @param ContentItem $contentItem The content item to approve
     * @return bool Whether the update was successful
     */
    public function approveContent(ContentItem $contentItem)
    {
        return $contentItem->update(['status' => 'approved']);
    }

    /**
     * Reject a content item.
     *
     * @param ContentItem $contentItem The content item to reject
     * @return bool Whether the update was successful
     */
    public function rejectContent(ContentItem $contentItem)
    {
        return $contentItem->update(['status' => 'rejected']);
    }
}
```

The `ModerationService` class does a few key things; let's break it down:

1. It initializes an OpenAI client using your API key.
2. The `moderateContent` method:
   - Finds or creates settings for the content type
   - Calls the OpenAI moderation API
   - Processes the response to extract flagged categories and scores
   - Saves the moderation result to the database
   - Auto-approves or auto-rejects content based on settings
3. It provides methods to manually approve or reject content.

A service provider in Laravel is a class that binds services to the Laravel service container.
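Before wiring the service into the container, it's worth seeing the auto-approve/auto-reject rules in isolation. This standalone sketch (Python, for illustration only; the function name and defaults are hypothetical) mirrors the decision logic in `moderateContent`:

```python
def decide_status(flagged: bool, category_scores: dict,
                  confidence_threshold: float = 0.5,
                  auto_approve: bool = False) -> str:
    """Mirror of the auto-approve/auto-reject rules in ModerationService."""
    # Overall confidence is the highest score across all categories
    confidence = max(category_scores.values()) if category_scores else 0.0

    if not flagged and auto_approve:
        return "approved"   # clean content, auto-approve enabled
    if flagged and confidence >= confidence_threshold:
        return "rejected"   # flagged with confidence above the threshold
    return "pending"        # everything else waits for human review

# Clean content with auto-approve enabled is published immediately
print(decide_status(False, {"harassment": 0.01}, auto_approve=True))  # approved
# Flagged content above the threshold is rejected automatically
print(decide_status(True, {"harassment": 0.93}))                      # rejected
# Flagged content below the threshold stays in the moderation queue
print(decide_status(True, {"harassment": 0.31}))                      # pending
```

Note that a flagged item with low confidence is neither approved nor rejected: it simply stays `pending`, which is what feeds the moderation queue.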
Registering the service with a provider allows us to use dependency injection to access it in our controllers, models, or other classes. Let's register this service in the Laravel service container by creating a new service provider:

```bash
php artisan make:provider ModerationServiceProvider
```

The new service provider will be created in the `app/Providers` directory. Now, configure the service provider:

```php
// app/Providers/ModerationServiceProvider.php
<?php

namespace App\Providers;

use App\Services\ModerationService;
use Illuminate\Support\ServiceProvider;

class ModerationServiceProvider extends ServiceProvider
{
    /**
     * Register services.
     */
    public function register(): void
    {
        // Bind ModerationService as a singleton so the same
        // instance is shared across the application
        $this->app->singleton(ModerationService::class, function ($app) {
            return new ModerationService();
        });
    }

    /**
     * Bootstrap services.
     */
    public function boot(): void
    {
        //
    }
}
```

Make sure the new service provider is listed in `bootstrap/providers.php` (Laravel 11 adds it automatically when you run `make:provider`; if it's missing, add it yourself):

```php
// bootstrap/providers.php
return [
    App\Providers\AppServiceProvider::class,
    App\Providers\ModerationServiceProvider::class,
];
```

With the service provider in place, along with the models and migrations, we can now move on to creating the Livewire components.

## Create Livewire Components

Now, let's create Livewire components for our content moderation system. Livewire allows us to create interactive UI components without writing JavaScript. We'll create three components:

1. `ContentSubmission` component - for users to submit content
2. `ModerationQueue` component - for moderators to review content
3. `DashboardStats` component - to display moderation statistics

Let's create these components:

```bash
php artisan livewire:make ContentSubmission
php artisan livewire:make ModerationQueue
php artisan livewire:make DashboardStats
```

This will create three new Livewire components in the `app/Livewire` directory along with their corresponding views in the `resources/views/livewire` directory.

### 1. `ContentSubmission` Component

First, let's implement the component class:

```php
// app/Livewire/ContentSubmission.php
<?php

namespace App\Livewire;

use App\Models\ContentItem;
use App\Services\ModerationService;
use Illuminate\Support\Facades\Auth;
use Livewire\Component;

class ContentSubmission extends Component
{
    // Form fields
    public $content = '';
    public $contentType = 'comment';

    // Feedback shown to the user after submission
    public $message = '';
    public $status = '';

    // Validation rules for the form
    protected $rules = [
        'content' => 'required|string|min:5',
        'contentType' => 'required|string',
    ];

    /**
     * Handle form submission.
     */
    public function submitContent()
    {
        // Validate form input
        $this->validate();

        // Create content item in the database
        $contentItem = ContentItem::create([
            'user_id' => Auth::id(), // Current logged-in user
            'content_type' => $this->contentType,
            'content' => $this->content,
            'status' => 'pending', // Initial status is pending
        ]);

        // Moderate the content immediately
        try {
            // Get the moderation service from the container
            $moderationService = app(ModerationService::class);

            // Send the content to OpenAI for moderation
            $moderationService->moderateContent($contentItem);

            // Update the message based on moderation status
            $this->message = 'Content submitted for review';
            $this->status = $contentItem->status;

            if ($contentItem->status === 'approved') {
                $this->message = 'Content approved and published';
            } elseif ($contentItem->status === 'rejected') {
                $this->message = 'Content rejected due to policy violations';
            }
        } catch (\Exception $e) {
            // Handle moderation API errors
            $this->message = 'Content submitted for review, but moderation service is currently unavailable.';
        }

        // Clear form after submission
        $this->reset('content');
    }

    /**
     * Render the component.
     */
    public function render()
    {
        return view('livewire.content-submission');
    }
}
```

The `ContentSubmission` component class handles form submission, content validation, and moderation using the `ModerationService`. It also updates the message based on the moderation status. Now, let's create the view for this component:

```php
// resources/views/livewire/content-submission.blade.php
<div>
    <form wire:submit="submitContent">
        {{-- Content type selector --}}
        <select wire:model="contentType">
            <option value="comment">Comment</option>
            <option value="post">Post</option>
            <option value="review">Review</option>
        </select>

        {{-- Content input --}}
        <textarea wire:model="content" placeholder="Write your content..."></textarea>
        @error('content')
            <span>{{ $message }}</span>
        @enderror

        <button type="submit">Submit</button>
    </form>

    {{-- Feedback message after moderation --}}
    @if ($message)
        <p>{{ $message }}</p>
    @endif
</div>
```

Styling classes are omitted from the views in this guide for brevity; add your own Tailwind classes as needed.

### 2. `ModerationQueue` Component

The moderation queue view lists content items in a table with their moderation status, any flagged categories, and action buttons. The `approve`, `reject`, and `moderate` actions correspond to methods on the `ModerationQueue` component class, which should also supply the `$contentItems` collection with the `moderationResult` relationship eager-loaded:

```php
// resources/views/livewire/moderation-queue.blade.php
<div>
    <table>
        <thead>
            <tr>
                <th>ID</th>
                <th>Type</th>
                <th>Content</th>
                <th>Status</th>
                <th>Flags</th>
                <th>Actions</th>
            </tr>
        </thead>
        <tbody>
            @forelse ($contentItems as $item)
                <tr>
                    <td>{{ $item->id }}</td>
                    <td>{{ $item->content_type }}</td>
                    <td>{{ $item->content }}</td>
                    <td>
                        {{-- Moderation status badge --}}
                        @if ($item->status === 'pending')
                            <span>Pending</span>
                        @elseif ($item->status === 'approved')
                            <span>Approved</span>
                        @elseif ($item->status === 'rejected')
                            <span>Rejected</span>
                        @endif
                    </td>
                    <td>
                        {{-- Flagged categories from the moderation result --}}
                        @if ($item->moderationResult)
                            @if ($item->moderationResult->flagged)
                                <span>Flagged</span>
                                @foreach ($item->moderationResult->categories as $category)
                                    <span>{{ $category }}</span>
                                @endforeach
                            @else
                                <span>Clean</span>
                            @endif
                        @else
                            <span>Not Checked</span>
                        @endif
                    </td>
                    <td>
                        @if ($item->status !== 'approved')
                            <button wire:click="approve({{ $item->id }})">Approve</button>
                        @endif
                        @if ($item->status !== 'rejected')
                            <button wire:click="reject({{ $item->id }})">Reject</button>
                        @endif
                        @if (!$item->moderationResult)
                            <button wire:click="moderate({{ $item->id }})">Moderate</button>
                        @endif
                    </td>
                </tr>
            @empty
                <tr>
                    <td colspan="6">No content items found</td>
                </tr>
            @endforelse
        </tbody>
    </table>
</div>
```

### 3. `DashboardStats` Component

The dashboard view displays the number of content items in each moderation state, using counts supplied by the component class:

```php
// resources/views/livewire/dashboard-stats.blade.php
<div>
    <div>
        <h3>Pending Review</h3>
        <p>{{ $pendingCount }}</p>
    </div>
    <div>
        <h3>Approved</h3>
        <p>{{ $approvedCount }}</p>
    </div>
    <div>
        <h3>Rejected</h3>
        <p>{{ $rejectedCount }}</p>
    </div>
    <div>
        <h3>Flagged</h3>
        <p>{{ $flaggedCount }}</p>
    </div>
</div>
```
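The four counts shown on the dashboard are simple aggregations over `content_items` and `moderation_results`. In the component class they would be Eloquent `count()` queries; as a language-neutral sketch (Python, illustrative only, with hypothetical sample data), the logic amounts to:

```python
from collections import Counter

# Hypothetical sample: (status, flagged_by_moderation) per content item
items = [
    ("pending", True), ("pending", False),
    ("approved", False), ("rejected", True), ("rejected", True),
]

# Count items per moderation status (pending / approved / rejected)
status_counts = Counter(status for status, _ in items)

# Count items the moderation API flagged, regardless of status
flagged_count = sum(1 for _, flagged in items if flagged)

print(status_counts["pending"], status_counts["approved"],
      status_counts["rejected"], flagged_count)  # 2 1 2 3
```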