Initial commit: Complete threaded conversation system with inline replies

**Major Features Added:**
- **Inline Reply System**: Replace compose screen with inline reply boxes
- **Thread Navigation**: Parent/child navigation with jump functionality
- **Chain Flow UI**: Reply counts, expand/collapse animations, visual hierarchy
- **Enhanced Animations**: Smooth transitions, hover effects, micro-interactions

**Frontend Changes:**
- **ThreadedCommentWidget**: Complete rewrite with animations and navigation
- **ThreadNode Model**: Added parent references and descendant counting
- **ThreadedConversationScreen**: Integrated navigation handlers
- **PostDetailScreen**: Replaced with threaded conversation view
- **ComposeScreen**: Added reply indicators and context
- **PostActions**: Fixed visibility checks for chain buttons

**Backend Changes:**
- **API Route**: Added `/posts/:id/thread` endpoint
- **Post Repository**: Include `allow_chain` and `visibility` fields in feed
- **Thread Handler**: Support for fetching post chains

**UI/UX Improvements:**
- **Reply Context**: Clear indication when replying to specific posts
- **Character Counting**: 500-character limit with live counter
- **Visual Hierarchy**: Depth-based indentation and styling
- **Smooth Animations**: SizeTransition, FadeTransition, hover states
- **Chain Navigation**: Parent/child buttons with visual feedback

**Technical Enhancements:**
- **Animation Controllers**: Proper lifecycle management
- **State Management**: Clean separation of concerns
- **Navigation Callbacks**: Reusable navigation system
- **Error Handling**: Graceful fallbacks and user feedback

This creates a Reddit-style threaded conversation experience with smooth animations, inline replies, and intuitive navigation between posts in a chain.
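The commit message mentions parent references and descendant counting in the `ThreadNode` model (implemented in Dart in the app). As an illustrative sketch only, with hypothetical names and using Go rather than the app's Dart, the idea can be shown as a tree where each node keeps a parent link for upward navigation and a recursive count of all replies beneath it:

```go
package main

import "fmt"

// ThreadNode is a hypothetical stand-in for the app's Dart model:
// each post keeps a reference to its parent and a list of replies.
type ThreadNode struct {
	ID       string
	Parent   *ThreadNode
	Children []*ThreadNode
}

// AddReply links a child to its parent, keeping the parent reference
// in sync so upward (parent) navigation works.
func (n *ThreadNode) AddReply(child *ThreadNode) {
	child.Parent = n
	n.Children = append(n.Children, child)
}

// Descendants counts all replies beneath this node, at any depth —
// the number shown next to a post's "replies" indicator.
func (n *ThreadNode) Descendants() int {
	total := 0
	for _, c := range n.Children {
		total += 1 + c.Descendants()
	}
	return total
}

// buildSample builds a small thread: root -> {a -> {a1}, b}.
func buildSample() *ThreadNode {
	root := &ThreadNode{ID: "root"}
	a := &ThreadNode{ID: "a"}
	root.AddReply(a)
	root.AddReply(&ThreadNode{ID: "b"})
	a.AddReply(&ThreadNode{ID: "a1"})
	return root
}

func main() {
	fmt.Println(buildSample().Descendants()) // 3 descendants: a, b, a1
}
```

Caching these counts per node (rather than recomputing) is the usual trade-off when threads get deep; the sketch recomputes for simplicity.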
This commit is contained in commit `3c4680bdd7`.
---

**New file:** `.firebaserc` (+5 lines)

```json
{
  "projects": {
    "default": "your-firebase-project-id"
  }
}
```
---

**New file:** `.gitignore` (vendored, +94 lines)

```gitignore
# Environment variables
*.env
.env.local
.env.*.local

# Supabase
.branches
.temp

# OS
.DS_Store
Thumbs.db

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# Large build artifacts and debug files
*.zip
*.tar.gz
*.tar
*.gz
*.exe
*.bin
*.db
*.sqlite
*.sqlite3

# HAR files
*.har
localhost.har

# Claude AI
.claude/

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Dependencies
node_modules/
.pnp
.pnp.js

# Testing
coverage/
.nyc_output

# Build
dist/
build/
out/
sources/

# Flutter
.dart_tool/
.flutter-plugins
.flutter-plugins-dependencies
.packages
.pub-cache/
.pub/
build/
*.g.dart
*.freezed.dart

# Secrets
*.pem
*.key
secrets/
credentials/

# Documentation (keep locally, not on GitHub)
docs/

# Temporary files
tmp/
temp/
*.tmp

# Documentation and SQL
# Remove broad exclusion to allow go-backend files
# !go-backend/**/*.md
# !go-backend/**/*.sql

logo.ai
sojorn_app/analysis_results_final.txt
go-backend/.env
go-backend/bin/
```
---

**New file:** `ANDROID_FCM_TROUBLESHOOTING.md` (+358 lines)

---

# Android FCM Notifications - Troubleshooting Guide

## Issue: Chat notifications work on Web but not Android

### Current Status
- ✅ Web notifications working
- ✅ Android has `google-services.json` configured
- ✅ Android has FCM plugin in `build.gradle.kts`
- ✅ Android has notification permissions in `AndroidManifest.xml`
- ❓ Android FCM token registration status unknown

---

## Diagnostic Steps

### Step 1: Check Android Logs for FCM Token

Run the Android app with logging:
```bash
cd c:\Webs\Sojorn
.\run_dev.ps1
```

**In Android Studio or terminal, check logcat:**
```bash
adb logcat | findstr "FCM"
```

**Look for these log messages:**
```
[FCM] Initializing for platform: android
[FCM] Permission status: AuthorizationStatus.authorized
[FCM] Requesting token...
[FCM] Token registered (android): eXaMpLe...
[FCM] Syncing token with backend...
[FCM] Token synced with Go Backend successfully
[FCM] Initialization complete
```

### Step 2: Check for Common Errors

#### Error: "Token is null after getToken()"
**Cause:** Firebase not properly initialized or `google-services.json` mismatch

**Fix:**
1. Verify the `google-services.json` package name matches:
   ```json
   "package_name": "com.gosojorn.app"
   ```
2. Check `build.gradle.kts` has:
   ```kotlin
   applicationId = "com.gosojorn.app"
   ```
3. Rebuild: `flutter clean && flutter pub get && flutter run`

#### Error: "Permission denied"
**Cause:** User denied notification permission, or the Android 13+ runtime permission was never requested

**Fix:**
1. Check `AndroidManifest.xml` has:
   ```xml
   <uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
   ```
2. On Android 13+, the permission must be requested at runtime
3. Uninstall and reinstall the app to re-trigger the permission prompt

#### Error: "Failed to initialize notifications"
**Cause:** Firebase plugin not properly initialized

**Fix:**
1. Check `android/build.gradle` (project level) has:
   ```gradle
   dependencies {
       classpath 'com.google.gms:google-services:4.4.0'
   }
   ```
2. Check `android/app/build.gradle.kts` has:
   ```kotlin
   plugins {
       id("com.google.gms.google-services")
   }
   ```

---

## Step 3: Verify Backend Receives Token

### Check Database for Android Tokens

SSH to server:
```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
```

Query database:
```bash
sudo -u postgres psql sojorn
```

```sql
-- Check for Android tokens
SELECT
    user_id,
    platform,
    LEFT(fcm_token, 30) as token_preview,
    created_at
FROM public.fcm_tokens
WHERE platform = 'android'
ORDER BY created_at DESC
LIMIT 5;
```

**Expected output:**
```
user_id                              | platform | token_preview                  | created_at
-------------------------------------+----------+--------------------------------+-------------------
5568b545-5215-4734-875f-84b3106cd170 | android  | eXaMpLe_android_token_here...  | 2026-01-29 06:00
```

**If no Android tokens:**
- Token registration failed
- Token sync to backend failed
- Check Android logs for errors

---

## Step 4: Test Push Notification Manually

### Send Test Notification from Server

```bash
# Get an Android FCM token from database
sudo -u postgres psql sojorn -c "SELECT fcm_token FROM public.fcm_tokens WHERE platform = 'android' LIMIT 1;"
```

The Go backend automatically sends notifications when:
- Someone sends you a chat message
- Someone follows you
- Someone accepts your follow request

**Test by sending a chat message:**
1. Open the app on an Android device
2. Have another user (or a web browser) send you a message
3. Check Android logs for: `[FCM] Foreground message received`

---

## Step 5: Check Notification Channel (Android 8+)

Android 8+ requires notification channels. Check `strings.xml`:

**File:** `android/app/src/main/res/values/strings.xml`
```xml
<resources>
    <string name="default_notification_channel_id">chat_messages</string>
    <string name="default_notification_channel_name">Chat messages</string>
</resources>
```

**Referenced in AndroidManifest.xml:**
```xml
<meta-data
    android:name="com.google.firebase.messaging.default_notification_channel_id"
    android:value="@string/default_notification_channel_id" />
```

---

## Common Issues & Solutions

### Issue 1: "google-services.json not found"

**Symptoms:**
- Build fails with "File google-services.json is missing"
- FCM token is null

**Solution:**
```bash
# Verify file exists
ls sojorn_app/android/app/google-services.json

# If missing, download from Firebase Console:
# https://console.firebase.google.com/project/sojorn-a7a78/settings/general
# Click "Add app" > Android > Download google-services.json
```

### Issue 2: Package name mismatch

**Symptoms:**
- FCM token is null
- No errors in logs

**Solution:**
Verify all package names match:
1. `google-services.json`: `"package_name": "com.gosojorn.app"`
2. `build.gradle.kts`: `applicationId = "com.gosojorn.app"`
3. `AndroidManifest.xml`: `<manifest xmlns:android="...">` (no package attribute needed)

### Issue 3: Notification permission not granted

**Symptoms:**
- Log shows: `[FCM] Permission status: AuthorizationStatus.denied`
- No token generated

**Solution:**
```bash
# Uninstall app
adb uninstall com.gosojorn.app

# Reinstall and allow notification permission when prompted
flutter run
```

### Issue 4: Token generated but not synced to backend

**Symptoms:**
- Log shows: `[FCM] Token registered (android): ...`
- Log shows: `[FCM] Sync failed: ...`
- No token in database

**Solution:**
Check the API endpoint exists:
```bash
# On server
sudo journalctl -u sojorn-api -f | grep "notifications/device"
```

Verify the Go backend has the endpoint:
```go
// Should be in cmd/api/main.go
authorized.POST("/notifications/device", settingsHandler.RegisterFCMToken)
```
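The registration endpoint stores one token per user and platform, so a device that refreshes its token replaces the old row rather than piling up stale ones. A minimal in-memory sketch of that upsert semantics (hypothetical types; the real handler lives in `settings_handler.go` and writes to the `fcm_tokens` table):

```go
package main

import "fmt"

// tokenKey identifies one device registration: a user may hold
// one token per platform (web, android, ...).
type tokenKey struct{ UserID, Platform string }

// tokenStore is an in-memory stand-in for the fcm_tokens table.
type tokenStore map[tokenKey]string

// Register upserts: re-registering the same user+platform replaces
// the previous token instead of accumulating stale entries.
func (s tokenStore) Register(userID, platform, token string) {
	s[tokenKey{userID, platform}] = token
}

func main() {
	store := tokenStore{}
	store.Register("u1", "android", "tok-old")
	store.Register("u1", "android", "tok-new") // token refresh replaces
	store.Register("u1", "web", "tok-web")
	fmt.Println(len(store), store[tokenKey{"u1", "android"}]) // 2 tok-new
}
```

In SQL this is typically an `INSERT ... ON CONFLICT (user_id, platform) DO UPDATE`; the sketch only shows the intended behavior.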
---

## Comparison: Web vs Android

### Web (Working ✅)
- Uses VAPID key for authentication
- Service worker handles background messages
- Token format: `d2n2ELGKel7yzPL3wZLGSe:APA91b...`

### Android (Troubleshooting ❓)
- Uses `google-services.json` for authentication
- Native Android handles background messages
- Token format: different from web, longer
- Requires runtime permission on Android 13+

---

## Debug Checklist

Run through this checklist:

- [ ] `google-services.json` exists in `android/app/`
- [ ] Package name matches in all files
- [ ] `build.gradle.kts` has `google-services` plugin
- [ ] `AndroidManifest.xml` has `POST_NOTIFICATIONS` permission
- [ ] App has notification permission granted
- [ ] Android logs show FCM initialization
- [ ] Android logs show token generated
- [ ] Token appears in database `fcm_tokens` table
- [ ] Backend logs show notification being sent
- [ ] Android logs show notification received

---

## Next Steps

1. **Run the app with enhanced logging:**
   ```bash
   cd c:\Webs\Sojorn
   .\run_dev.ps1
   ```

2. **Monitor Android logs:**
   ```bash
   adb logcat | findstr "FCM"
   ```

3. **Look for the specific log messages:**
   - `[FCM] Initializing for platform: android`
   - `[FCM] Token registered (android): ...`
   - `[FCM] Token synced with Go Backend successfully`

4. **If the token is null:**
   - Check `google-services.json` is correct
   - Verify the package name matches
   - Rebuild: `flutter clean && flutter pub get && flutter run`

5. **If the token is generated but notifications are not received:**
   - Check the database has the token
   - Send a test message
   - Check backend logs for the push notification being sent
   - Verify the Android device has an internet connection

---

## Files to Check

### Android Configuration
- `sojorn_app/android/app/google-services.json` - Firebase config
- `sojorn_app/android/app/build.gradle.kts` - Build configuration
- `sojorn_app/android/app/src/main/AndroidManifest.xml` - Permissions
- `sojorn_app/android/app/src/main/res/values/strings.xml` - Notification channel

### Flutter Code
- `sojorn_app/lib/services/notification_service.dart` - FCM initialization (now with enhanced logging)
- `sojorn_app/lib/main.dart` - App initialization

### Backend
- `go-backend/internal/services/push_service.go` - Push notification sender
- `go-backend/internal/handlers/settings_handler.go` - FCM token registration endpoint

---

## Quick Commands

```bash
# Check Android logs
adb logcat | findstr "FCM"

# Check if app is installed
adb shell pm list packages | findstr gosojorn

# Uninstall app
adb uninstall com.gosojorn.app

# Check notification settings
adb shell dumpsys notification | findstr gosojorn

# Check database for tokens
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
sudo -u postgres psql sojorn -c "SELECT platform, COUNT(*) FROM fcm_tokens GROUP BY platform;"
```

---

## Expected Behavior

**When working correctly:**

1. App starts → `[FCM] Initializing for platform: android`
2. Permission requested → User grants → `[FCM] Permission status: AuthorizationStatus.authorized`
3. Token generated → `[FCM] Token registered (android): eXaMpLe...`
4. Token synced → `[FCM] Token synced with Go Backend successfully`
5. Message sent → Backend sends push → `[FCM] Foreground message received`
6. Notification appears in the Android notification tray

---

## Contact & Support

If issues persist after following this guide:
1. Share Android logcat output (filtered for FCM)
2. Share database query results for the `fcm_tokens` table
3. Share backend logs when sending a notification
4. Verify Firebase Console shows the Android app is active
---

**New file:** `CHAT_DELETE_DEPLOYMENT.md` (+167 lines)

---

# Chat Deletion Feature - Deployment Guide

## Summary

Fixed the chat deletion functionality to make it **permanent**, with proper warnings. Users now get a clear warning dialog before deletion, and the system removes data from both the server database and local IndexedDB storage.

---

## Changes Made

### 1. Backend (Go) - DELETE Endpoints Added

**Files Modified:**
- `go-backend/internal/repository/chat_repository.go` - Added `DeleteConversation()` and `DeleteMessage()` methods
- `go-backend/internal/handlers/chat_handler.go` - Added DELETE handler endpoints
- `go-backend/cmd/api/main.go` - Registered DELETE routes

**New API Endpoints:**
- `DELETE /api/v1/conversations/:id` - Permanently deletes a conversation and all its messages
- `DELETE /api/v1/messages/:id` - Permanently deletes a single message

**Security:**
- Verifies the user is a participant before allowing deletion
- Returns 401 Unauthorized if the user doesn't have permission
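The participant check behind those endpoints is a plain membership test. A hedged sketch of the rule (hypothetical helper; the real check runs in the Go repository/handler layer before the DELETE is issued):

```go
package main

import "fmt"

// canDeleteConversation reports whether userID may delete the
// conversation, i.e. whether they are one of its participants.
// Non-participants are rejected before any rows are touched.
func canDeleteConversation(userID string, participants []string) bool {
	for _, p := range participants {
		if p == userID {
			return true
		}
	}
	return false
}

func main() {
	participants := []string{"alice", "bob"}
	fmt.Println(canDeleteConversation("alice", participants))   // true
	fmt.Println(canDeleteConversation("mallory", participants)) // false -> handler responds 401
}
```

The production query would express the same test in SQL (e.g. a `WHERE` clause joining the participants table); the sketch only isolates the predicate.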
### 2. Flutter App - Permanent Deletion

**Files Modified:**
- `sojorn_app/lib/services/api_service.dart` - Added `deleteConversation()` and `deleteMessage()` API methods
- `sojorn_app/lib/services/secure_chat_service.dart` - Updated to call the backend DELETE API
- `sojorn_app/lib/screens/secure_chat/secure_chat_screen.dart` - Enhanced warning dialog

**Key Changes:**
- Delete now removes data from **both** the server database and local IndexedDB
- New warning dialog with:
  - ⚠️ Red warning icon and "PERMANENT DELETION" title
  - Bullet points showing what will be deleted
  - Red warning box stating "THIS ACTION CANNOT BE UNDONE"
  - Red "DELETE PERMANENTLY" button
  - Cannot be dismissed by tapping outside
- Loading indicator during deletion
- Success/error feedback

---

## Deployment Steps

### Step 1: Delete All Existing Chats from Database

**SSH into your server:**
```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
```

**Password:** `P22k154ever!`

**Run the SQL script:**
```bash
sudo -u postgres psql sojorn
```

Then paste this SQL:
```sql
BEGIN;

-- Delete all messages first (foreign key dependency)
DELETE FROM public.secure_messages;

-- Delete all conversations
DELETE FROM public.encrypted_conversations;

COMMIT;

-- Verify deletion
SELECT COUNT(*) as message_count FROM public.secure_messages;
SELECT COUNT(*) as conversation_count FROM public.encrypted_conversations;
```

Expected output:
```
message_count: 0
conversation_count: 0
```

Type `\q` to exit psql.

### Step 2: Deploy Go Backend

**From your local machine, deploy the updated backend:**
```powershell
cd c:\Webs\Sojorn\go-backend
.\scripts\deploy.sh
```

Or manually:
```bash
# On server
cd /home/patrick/sojorn-backend
git pull
go build -o sojorn-api ./cmd/api
sudo systemctl restart sojorn-api
sudo systemctl status sojorn-api
```

### Step 3: Hot Restart Flutter App

**No deployment needed** - just hot restart the Flutter web app:
1. In your browser, press `R` or refresh the page
2. Or run: `flutter run -d chrome --web-port 8080`

---

## Testing the Fix

1. **Start a new conversation** with another user
2. **Send a few test messages**
3. **Click the 3-dot menu** in the chat screen
4. **Select "Delete Chat"**
5. **Verify the warning dialog shows:**
   - Red warning icon
   - "PERMANENT DELETION" title
   - List of what will be deleted
   - Red warning box
   - "DELETE PERMANENTLY" button
6. **Click "DELETE PERMANENTLY"**
7. **Verify:**
   - Loading indicator appears
   - Success message shows
   - Chat screen closes
   - Conversation is removed from the list
   - Messages are gone from the database (check with SQL)
   - Messages are gone from IndexedDB (check browser DevTools > Application > IndexedDB)

---

## What's Fixed

### Before:
- ❌ Delete only removed local IndexedDB data
- ❌ Server data remained (encrypted messages still in DB)
- ❌ Weak warning dialog
- ❌ Deletion wasn't permanent
- ❌ The other user could still see messages

### After:
- ✅ Deletes from **both** the server database and local storage
- ✅ Strong warning dialog with multiple warnings
- ✅ **PERMANENT** deletion - cannot be undone
- ✅ Both users lose all messages
- ✅ Proper loading and error handling
- ✅ Authorization checks (only participants can delete)

---

## SQL Script Location

The SQL script to delete all chats is saved at:
`c:\Webs\Sojorn\migrations_archive\delete_all_chats.sql`

---

## Notes

- The E2EE key fixes from earlier are still in place
- Users will need to hot restart to get the new OTK fixes
- After deleting all chats, users can start fresh with properly working E2EE
- The delete function now works correctly for future conversations
---

**New file:** `FCM_DEPLOYMENT.md` (+334 lines)

---

# FCM Notifications - Complete Deployment Guide

## Quick Start (TL;DR)

1. Get VAPID key from Firebase Console
2. Download Firebase service account JSON
3. Update Flutter app with VAPID key
4. Upload JSON to server at `/opt/sojorn/firebase-service-account.json`
5. Add to `/opt/sojorn/.env`: `FIREBASE_CREDENTIALS_FILE=/opt/sojorn/firebase-service-account.json`
6. Restart Go backend
7. Test notifications

---

## Detailed Steps

### 1. Get Firebase Credentials

#### A. Get VAPID Key (for Web Push)

1. Go to https://console.firebase.google.com/project/sojorn-a7a78/settings/cloudmessaging
2. Scroll to the **Web configuration** section
3. Under **Web Push certificates**, copy the **Key pair**
4. It should look like: `BNxS7_very_long_string_of_characters...`

#### B. Download Service Account JSON (for Server)

1. Go to https://console.firebase.google.com/project/sojorn-a7a78/settings/serviceaccounts
2. Click **Generate new private key**
3. Click **Generate key** - this downloads a JSON file
4. Save it somewhere safe (you'll upload it to the server)

---

### 2. Update Flutter App with VAPID Key

**File:** `sojorn_app/lib/config/firebase_web_config.dart`

Replace line 24:
```dart
static const String _vapidKey = 'YOUR_VAPID_KEY_HERE';
```

With your actual VAPID key:
```dart
static const String _vapidKey = 'BNxS7_your_actual_vapid_key_from_firebase_console';
```

**Commit and push:**
```bash
cd c:\Webs\Sojorn
git add sojorn_app/lib/config/firebase_web_config.dart
git commit -m "Add FCM VAPID key for web push notifications"
git push
```

---

### 3. Upload Firebase Service Account JSON to Server

**From Windows PowerShell:**
```powershell
scp -i "C:\Users\Patrick\.ssh\mpls.pem" "C:\path\to\sojorn-a7a78-firebase-adminsdk-xxxxx.json" patrick@194.238.28.122:/tmp/firebase-service-account.json
```

Replace `C:\path\to\...` with the actual path to your downloaded JSON file.

---

### 4. Configure Server

**SSH to server:**
```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
```

**Run the setup script:**
```bash
cd /home/patrick
curl -O https://raw.githubusercontent.com/your-repo/sojorn/main/setup_fcm_server.sh
chmod +x setup_fcm_server.sh
./setup_fcm_server.sh
```

**Or manually:**

```bash
# Move JSON file
sudo mv /tmp/firebase-service-account.json /opt/sojorn/firebase-service-account.json
sudo chmod 600 /opt/sojorn/firebase-service-account.json
sudo chown patrick:patrick /opt/sojorn/firebase-service-account.json

# Edit .env
sudo nano /opt/sojorn/.env
```

Add these lines to `.env`:
```bash
# Firebase Cloud Messaging
FIREBASE_CREDENTIALS_FILE=/opt/sojorn/firebase-service-account.json
FIREBASE_WEB_VAPID_KEY=BNxS7_your_actual_vapid_key_here
```

Save and exit (Ctrl+X, Y, Enter).
---

### 5. Restart Go Backend

```bash
cd /home/patrick/sojorn-backend
sudo systemctl restart sojorn-api
sudo systemctl status sojorn-api
```

**Check logs for successful initialization:**
```bash
sudo journalctl -u sojorn-api -f --since "1 minute ago"
```

Look for:
```
[INFO] PushService initialized successfully
```

If you see errors, check:
- JSON file exists: `ls -la /opt/sojorn/firebase-service-account.json`
- `.env` has the correct path: `sudo cat /opt/sojorn/.env | grep FIREBASE`
- JSON is valid: `cat /opt/sojorn/firebase-service-account.json | jq .`

---

### 6. Deploy Flutter App

**Hot restart (no build needed):**
Just refresh your browser or press `R` in the Flutter dev console.

**Or rebuild and deploy:**
```bash
cd c:\Webs\Sojorn\sojorn_app
flutter build web --release
# Deploy to your hosting
```

---

### 7. Test FCM Notifications

#### Test 1: Check Token Registration

1. Open the Sojorn web app in a browser
2. Open DevTools (F12) > Console
3. Look for: `FCM token registered (web): d2n2ELGKel7yzPL3wZLGSe...`
4. If you see "Web push is missing FIREBASE_WEB_VAPID_KEY", the VAPID key is not set correctly

#### Test 2: Check Database

```bash
sudo -u postgres psql sojorn
```

```sql
-- Check FCM tokens are being stored
SELECT user_id, platform, LEFT(fcm_token, 30) as token_preview, created_at
FROM public.fcm_tokens
ORDER BY created_at DESC
LIMIT 5;
```

Expected output:
```
user_id                              | platform | token_preview                  | created_at
-------------------------------------+----------+--------------------------------+-------------------
5568b545-5215-4734-875f-84b3106cd170 | web      | d2n2ELGKel7yzPL3wZLGSe:APA91b  | 2026-01-29 05:50
```

#### Test 3: Send Test Message

1. Open two browser windows (or use two different users)
2. User A sends a chat message to User B
3. User B should receive a push notification (if the browser is in the background)

**Check server logs:**
```bash
sudo journalctl -u sojorn-api -f | grep -i push
```

You should see:
```
[INFO] Sending push notification to user 5568b545...
[INFO] Push notification sent successfully
```

---

## Troubleshooting

### Issue: "Web push is missing FIREBASE_WEB_VAPID_KEY"

**Cause:** VAPID key not set in the Flutter app

**Fix:**
1. Update `firebase_web_config.dart` with the actual VAPID key
2. Hot restart the Flutter app
3. Check the console again

### Issue: "Failed to initialize PushService"

**Cause:** Firebase service account JSON not found or invalid

**Fix:**
```bash
# Check file exists
ls -la /opt/sojorn/firebase-service-account.json

# Check .env has correct path
sudo cat /opt/sojorn/.env | grep FIREBASE_CREDENTIALS_FILE

# Validate JSON
cat /opt/sojorn/firebase-service-account.json | jq .

# Check permissions
ls -la /opt/sojorn/firebase-service-account.json
# Should show: -rw------- 1 patrick patrick
```

### Issue: Notifications not received

**Checklist:**
- [ ] Browser notification permissions granted
- [ ] FCM token registered (check console)
- [ ] Token stored in database (check SQL)
- [ ] Go backend logs show push being sent
- [ ] Service worker registered (check DevTools > Application > Service Workers)

**Check service worker:**
1. Open DevTools > Application > Service Workers
2. You should see `firebase-messaging-sw.js` registered
3. If not, check that `sojorn_app/web/firebase-messaging-sw.js` exists

---

## Current Configuration

**Firebase Project:**
- Project ID: `sojorn-a7a78`
- Sender ID: `486753572104`
- Console: https://console.firebase.google.com/project/sojorn-a7a78

**Server Paths:**
- .env: `/opt/sojorn/.env`
- Service Account: `/opt/sojorn/firebase-service-account.json`
- Backend: `/home/patrick/sojorn-backend`

**Flutter Files:**
- Config: `sojorn_app/lib/config/firebase_web_config.dart`
- Service Worker: `sojorn_app/web/firebase-messaging-sw.js`
- Notification Service: `sojorn_app/lib/services/notification_service.dart`

---

## How FCM Works in Sojorn

1. **User opens app** → Flutter requests notification permission
2. **Permission granted** → Firebase generates FCM token
3. **Token sent to backend** → Stored in `fcm_tokens` table
4. **Event occurs** (new message, follow, etc.) → Go backend calls `PushService.SendPush()`
5. **FCM sends notification** → User's device/browser receives it
6. **User clicks notification** → App opens to the relevant screen

**Notification Triggers:**
- New chat message (`chat_handler.go:156`)
- New follower (`user_handler.go:141`)
- Follow request accepted (`user_handler.go:319`)
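Each trigger maps a backend event to a title/body pair plus routing data before anything is handed to FCM. An illustrative sketch of that mapping, with hypothetical event names and payload shape (the real logic lives in `push_service.go`):

```go
package main

import "fmt"

// Push is the minimal shape a notification needs: visible text plus
// data fields the app can use to route a tap to the right screen.
type Push struct {
	Title, Body string
	Data        map[string]string
}

// buildPush maps a backend event to a notification payload.
// Event names here are illustrative, not the production constants.
func buildPush(event, actor string) Push {
	switch event {
	case "chat_message":
		return Push{Title: "New message", Body: actor + " sent you a message",
			Data: map[string]string{"type": "chat"}}
	case "new_follower":
		return Push{Title: "New follower", Body: actor + " followed you",
			Data: map[string]string{"type": "follow"}}
	case "follow_accepted":
		return Push{Title: "Request accepted", Body: actor + " accepted your follow request",
			Data: map[string]string{"type": "follow"}}
	}
	return Push{}
}

func main() {
	p := buildPush("chat_message", "alice")
	fmt.Println(p.Title, "|", p.Body)
}
```

Keeping the event-to-payload mapping in one place makes it easy to add a trigger (e.g. a new reply in a thread) without touching the FCM send path itself.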
---

## Quick Reference Commands

```bash
# SSH to server
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122

# Check .env
sudo cat /opt/sojorn/.env | grep FIREBASE

# Check service account file
ls -la /opt/sojorn/firebase-service-account.json
cat /opt/sojorn/firebase-service-account.json | jq .project_id

# Restart backend
sudo systemctl restart sojorn-api

# View logs
sudo journalctl -u sojorn-api -f

# Check FCM tokens in DB
sudo -u postgres psql sojorn -c "SELECT COUNT(*) as token_count FROM public.fcm_tokens;"

# View recent tokens
sudo -u postgres psql sojorn -c "SELECT user_id, platform, created_at FROM public.fcm_tokens ORDER BY created_at DESC LIMIT 5;"
```

---

## Files Modified

1. `sojorn_app/lib/config/firebase_web_config.dart` - Added VAPID key placeholder
2. `go-backend/.env.example` - Updated FCM configuration format
3. Created `FCM_SETUP_GUIDE.md` - Detailed setup instructions
4. Created `setup_fcm_server.sh` - Automated server setup script

---

## Next Steps After Deployment

1. Monitor logs for FCM errors
2. Test notifications with real users
3. Check that the FCM token count grows as users log in
4. Verify push notifications work on:
   - Chrome (desktop & mobile)
   - Firefox (desktop & mobile)
   - Safari (if supported)
   - Edge

---

## Support

If you encounter issues:
1. Check logs: `sudo journalctl -u sojorn-api -f`
2. Verify configuration: `sudo cat /opt/sojorn/.env | grep FIREBASE`
3. Test JSON validity: `cat /opt/sojorn/firebase-service-account.json | jq .`
4. Check Firebase Console for errors: https://console.firebase.google.com/project/sojorn-a7a78/notification
236
FCM_SETUP_GUIDE.md
Normal file
@@ -0,0 +1,236 @@
# Firebase Cloud Messaging (FCM) Setup Guide

## Overview

This guide will help you configure FCM push notifications for the Sojorn app. You need:

1. **VAPID Key** - For web push notifications
2. **Firebase Service Account JSON** - For server-side FCM API access

---

## Step 1: Get Your VAPID Key from Firebase Console

1. Go to [Firebase Console](https://console.firebase.google.com/)
2. Select your project: **sojorn-a7a78**
3. Click the gear icon ⚙️ > **Project Settings**
4. Go to the **Cloud Messaging** tab
5. Scroll down to **Web configuration**
6. Under **Web Push certificates**, you'll see your VAPID key pair
7. If you don't have one, click **Generate key pair**
8. Copy the **Key pair** (starts with `B...`)

**Example VAPID key format:**
```
BNxS7_example_vapid_key_here_very_long_string_of_characters
```

---
## Step 2: Get Your Firebase Service Account JSON

1. Still in Firebase Console > **Project Settings**
2. Go to the **Service Accounts** tab
3. Click **Generate new private key**
4. Click **Generate key** - this downloads a JSON file
5. The file will be named something like: `sojorn-a7a78-firebase-adminsdk-xxxxx-xxxxxxxxxx.json`

**Example JSON structure:**
```json
{
  "type": "service_account",
  "project_id": "sojorn-a7a78",
  "private_key_id": "abc123...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "firebase-adminsdk-xxxxx@sojorn-a7a78.iam.gserviceaccount.com",
  "client_id": "123456789...",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
}
```

---

## Step 3: Update Server Configuration (/opt/sojorn/.env)

SSH into your server:
```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
```

Edit the .env file:
```bash
sudo nano /opt/sojorn/.env
```

Add these lines (replace with your actual values):
```bash
# Firebase Cloud Messaging
FIREBASE_CREDENTIALS_FILE=/opt/sojorn/firebase-service-account.json
FIREBASE_WEB_VAPID_KEY=BNxS7_YOUR_ACTUAL_VAPID_KEY_HERE
```

Save and exit (Ctrl+X, Y, Enter).

---
## Step 4: Upload Firebase Service Account JSON to Server

From your local machine, upload the JSON file:
```powershell
scp -i "C:\Users\Patrick\.ssh\mpls.pem" "C:\path\to\sojorn-a7a78-firebase-adminsdk-xxxxx.json" patrick@194.238.28.122:/tmp/firebase-service-account.json
```

Then, on the server, move it to the correct location:
```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
sudo mv /tmp/firebase-service-account.json /opt/sojorn/firebase-service-account.json
sudo chmod 600 /opt/sojorn/firebase-service-account.json
sudo chown patrick:patrick /opt/sojorn/firebase-service-account.json
```

Verify the file exists:
```bash
ls -la /opt/sojorn/firebase-service-account.json
```

---

## Step 5: Update Flutter App with VAPID Key

The VAPID key needs to be hardcoded in the Flutter app (already done in the code changes below).

---

## Step 6: Restart Go Backend

```bash
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122
cd /home/patrick/sojorn-backend
sudo systemctl restart sojorn-api
sudo systemctl status sojorn-api
```

Check the logs for FCM initialization:
```bash
sudo journalctl -u sojorn-api -f --since "5 minutes ago"
```

You should see:
```
[INFO] PushService initialized successfully
```

If you see:
```
[WARN] Failed to initialize PushService
```

Check that:
- The JSON file exists at `/opt/sojorn/firebase-service-account.json`
- The file has correct permissions (600)
- The JSON is valid (not corrupted)

---
## Step 7: Test FCM Notifications

### Test 1: Register FCM Token

1. Open the Sojorn web app
2. Open browser DevTools (F12) > Console
3. Look for: `FCM token registered (web): ...`
4. If you see "Web push is missing FIREBASE_WEB_VAPID_KEY", the VAPID key is not set

### Test 2: Send a Test Notification

From your server, you can test sending a notification:

```bash
# Get a user's FCM token from the database
sudo -u postgres psql sojorn -c "SELECT fcm_token FROM public.fcm_tokens LIMIT 1;"

# The Go backend will automatically send push notifications when:
# - Someone sends you a chat message
# - Someone follows you
# - Someone accepts your follow request
```

### Test 3: Verify in Database

Check that FCM tokens are being stored:
```bash
sudo -u postgres psql sojorn
SELECT user_id, platform, LEFT(fcm_token, 20) AS token_preview, created_at
FROM public.fcm_tokens
ORDER BY created_at DESC
LIMIT 5;
```

---
## Troubleshooting

### Issue: "Web push is missing FIREBASE_WEB_VAPID_KEY"

**Solution:** The VAPID key is not configured in the Flutter app. Make sure the code changes below are deployed.

### Issue: "Failed to initialize PushService"

**Possible causes:**

1. Firebase service account JSON file not found
2. Invalid JSON file
3. Wrong file path in .env

**Check:**
```bash
cat /opt/sojorn/.env | grep FIREBASE
ls -la /opt/sojorn/firebase-service-account.json
cat /opt/sojorn/firebase-service-account.json | jq .
```

### Issue: Notifications not received

**Check:**

1. Browser notification permissions granted
2. FCM token registered (check console logs)
3. Go backend logs show push being sent
4. Database has an FCM token for the user

---

## Current Configuration

**Firebase Project:** sojorn-a7a78
**Project ID:** sojorn-a7a78
**Sender ID:** 486753572104

**Server Paths:**

- `.env`: `/opt/sojorn/.env`
- Service Account JSON: `/opt/sojorn/firebase-service-account.json`
- Go Backend: `/home/patrick/sojorn-backend`

---
## Quick Reference Commands

```bash
# SSH to server
ssh -i "C:\Users\Patrick\.ssh\mpls.pem" patrick@194.238.28.122

# View .env
sudo cat /opt/sojorn/.env | grep FIREBASE

# Check service account file
ls -la /opt/sojorn/firebase-service-account.json

# Restart backend
sudo systemctl restart sojorn-api

# View logs
sudo journalctl -u sojorn-api -f

# Check FCM tokens in DB
sudo -u postgres psql sojorn -c "SELECT COUNT(*) FROM public.fcm_tokens;"
```
17
README.md
Normal file
@@ -0,0 +1,17 @@
# Sojorn

Sojorn is a calm, consent-first social platform.

## Architecture

This project has been migrated from a Supabase backend to a custom Go backend.

- **Client**: `sojorn_app/` (Flutter)
- **Backend**: `go-backend/` (Go/Gin + PostgreSQL)
- **Docs**: `sojorn_docs/`
- **Legacy**: `_legacy/` (contains old Supabase functions and migrations)
- **Migrations**: `go-backend/internal/database/migrations` (active)

## Getting Started

See `go-backend/README.md` for backend setup and `sojorn_app/README.md` for client setup.
13
SVG/Artboard 4.svg
Normal file
@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="LEAF" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
  <defs>
    <style>
      .cls-1 {
        fill: #fff;
        stroke-width: 0px;
      }
    </style>
  </defs>
  <path class="cls-1" d="M15.46,13.32c.09-.32.17-.65.26-.97.01-.05.04-.18.04-.18,0,0,.1-.04.15-.07.9-.5,1.9-.78,2.91-.89s2.06-.13,3.04.07c.65.13,1.28.38,1.89.64,0,0-.44,1.04-.76,1.54-.5.76-1,1.5-1.63,2.15-1.41,1.44-3.29,2.47-5.25,2.58-1.32.08-2.65-.25-3.77-1.01.04-.03.57-.47.59-.45.77.6,1.78.71,2.73.64,1.07-.08,2.12-.37,3.1-.84,1.31-.63,2.44-1.68,3.26-2.94.25-.4.49-.81.73-1.22.02-.04.1-.14.1-.14,0,0-.1-.02-.16-.03-.48-.1-.95-.22-1.43-.29-1.97-.29-3.92.04-5.57,1.27-.07.05-.14.1-.21.16Z"/>
  <path class="cls-1" d="M8.95,12.78s-1.01-.55-1.01-.55c-.07-.06,0-.64,0-.75.01-.27.03-.53.06-.8.06-.52.16-1.04.31-1.54.31-1.01.84-1.95,1.43-2.8.46-.66.98-1.28,1.52-1.87.09-.1,1.02-.98,1.06-.94,1.23,1.35,2.54,2.91,3.1,4.73.48,1.56.56,3.52-.18,5.02-.19.47-.4.94-.69,1.35-.52.73-1.09,1.75-3.09,3-.64.32-1.31.52-2.01.58-1.02.08-2.01.09-3.01-.16-1.73-.43-3.27-1.53-4.44-2.95-.76-.92-1.36-1.99-1.75-3.15,2.06-.9,4.43-.84,6.53-.06s3.95,2.25,5.42,4.04l-.67.48c-1.49-1.56-3.05-3.16-5.01-3.91-1.67-.64-3.53-.6-5.24-.07.89,1.83,2.3,3.39,4.05,4.25,1.73.86,3.9,1.04,5.67.19.32-.15.62-.34.9-.55.06-.04.24-.18.46-.39.26-.25.51-.53.87-1.01.33-.44.6-.83.88-1.29.97-1.62,1.19-3.54.59-5.35-.24-.73-.6-1.41-1.02-2.04-.2-.3-.42-.62-.67-.88-.05-.05-.68-.72-.69-.71-.42.37-.85.76-1.2,1.2-.66.83-1.16,1.81-1.56,2.81-.52,1.33-.79,2.69-.62,4.12Z"/>
</svg>

After: Size 1.5 KiB
174
_legacy/supabase/BEACON_SYSTEM_EXPLAINED.md
Normal file
@@ -0,0 +1,174 @@
# Beacon System Architecture

## Overview

Sojorn has **two separate posting systems** that share the same database but create slightly different content:

1. **Regular Posts** (compose_screen.dart) - Standard social media posts
2. **Beacon Posts** (create-beacon_sheet.dart) - GPS-tagged safety alerts

## How It Works

### Database Structure

Both systems create records in the `posts` table, but with different flags:

| Field | Regular Post | Beacon Post |
|-------|-------------|-------------|
| `is_beacon` | `FALSE` | `TRUE` |
| `beacon_type` | `NULL` | `'police'`, `'checkpoint'`, `'taskForce'`, `'hazard'`, `'safety'`, or `'community'` |
| `location` | `NULL` | GPS coordinates (PostGIS POINT) |
| `confidence_score` | `NULL` | 0.5 - 1.0 (starts at 50-80% based on user trust) |
| `is_active_beacon` | `NULL` | `TRUE` (becomes `FALSE` when pruned) |
| `allow_chain` | User choice | Always `FALSE` |
### User Opt-In System

**Critical Feature**: Beacon posts are **OPT-IN ONLY** for feeds.

#### `profiles.beacon_enabled` Column

- **Default**: `FALSE` (opted out)
- **When FALSE**: User NEVER sees beacon posts in their Following or Sojorn feeds
- **When TRUE**: User sees beacon posts mixed in with regular posts
- **Beacon Map**: ALWAYS visible regardless of this setting

#### Why Opt-In?

Some users don't want safety alerts mixed into their social feed. The opt-in system allows:

- **Casual users**: Just social content
- **Community safety advocates**: Social content + beacons
- **Everyone**: Can still view the Beacon Network map anytime

### Feed Filtering Logic

#### feed-personal (Following Feed)

```typescript
// Check the user's beacon preference
const { data: profile } = await supabase
  .from("profiles")
  .select("beacon_enabled")
  .eq("id", user.id)
  .single();

const beaconEnabled = profile?.beacon_enabled || false;

// Build the query
let postsQuery = supabase.from("posts").select(...);

// Filter out beacons if the user has NOT opted in
if (!beaconEnabled) {
  postsQuery = postsQuery.eq("is_beacon", false);
}
```

#### feed-sojorn (Algorithmic Feed)

Same logic - beacons are filtered unless `beacon_enabled = TRUE`.
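The rule shared by both feeds reduces to a single predicate. This is a pure TypeScript restatement of the query logic above for clarity, not the deployed code:

```typescript
interface FeedPost {
  is_beacon: boolean;
}

// A post survives feed filtering unless it is a beacon and the viewer
// has not opted in via profiles.beacon_enabled (default FALSE).
function shouldIncludePost(post: FeedPost, beaconEnabled: boolean): boolean {
  return !post.is_beacon || beaconEnabled;
}
```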
### Beacon Creation Flow

1. User opens the Beacon Network tab
2. Taps the map to drop a beacon
3. Fills out CreateBeaconSheet:
   - Type (police, checkpoint, etc.)
   - Title
   - Description
   - Optional photo
4. Submits → edge function `create-beacon`
5. **Creates a POST in the `posts` table** with:
   - `is_beacon = TRUE`
   - `beacon_type = <selected_type>`
   - `location = GPS point`
   - `category_id = "Beacon Alerts"` category
   - `confidence_score` based on the user's trust score
   - `allow_chain = FALSE`

### Regular Post Creation Flow

1. User taps the "New Post" button
2. Fills out ComposeScreen:
   - Community selection
   - Body text
   - Optional photo
   - Toggle for chain responses
3. Submits → edge function `publish-post`
4. **Creates a POST in the `posts` table** with:
   - `is_beacon = FALSE`
   - No GPS data
   - User-selected category
   - User's chain preference

## Key Differences

| Feature | Regular Post | Beacon Post |
|---------|-------------|-------------|
| **Purpose** | Social sharing | Safety alerts |
| **GPS Data** | No | Required |
| **Visible On** | Feeds (if user follows author) | Beacon map + feeds (if user opted in) |
| **Category** | User selects | Always "Beacon Alerts" |
| **Chaining** | User choice | Disabled |
| **Confidence Score** | No | Yes (trust-based) |
| **Voting** | No | Yes (vouch/report) |
| **Auto-Pruning** | No | Yes (low confidence + old = disabled) |

## User Experience Scenarios

### Scenario 1: User With Beacons Disabled (Default)
```
Following Feed: ✓ Regular posts from people they follow
Sojorn Feed:    ✓ Algorithmic regular posts
Beacon Map:     ✓ All active beacons in area
```

### Scenario 2: User With Beacons Enabled
```
Following Feed: ✓ Regular posts + beacons from people they follow
Sojorn Feed:    ✓ Algorithmic regular posts + beacons
Beacon Map:     ✓ All active beacons in area
```

### Scenario 3: User Creates a Beacon

1. Beacon appears on the map IMMEDIATELY for ALL users
2. Beacon appears in the creator's feed (if they have beacons enabled)
3. Beacon appears in OTHER users' feeds (if they follow the creator AND have beacons enabled)

## Migration Required

To enable this system, run:

```sql
-- Add beacon_enabled column to profiles
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS beacon_enabled BOOLEAN NOT NULL DEFAULT FALSE;

-- Add index for fast filtering
CREATE INDEX IF NOT EXISTS idx_profiles_beacon_enabled ON profiles(beacon_enabled) WHERE beacon_enabled = TRUE;
```

Or apply the migration file:
```bash
# Via Supabase Dashboard SQL Editor
# Paste contents of: supabase/migrations/add_beacon_opt_in.sql
```

## Edge Functions Updated

1. ✅ **feed-personal** - Now filters beacons based on user preference
2. ✅ **feed-sojorn** - Now filters beacons based on user preference
3. ✅ **create-beacon** - Creates beacon posts correctly
4. ✅ **publish-post** - Creates regular posts correctly

## Frontend Components

- ✅ **ComposeScreen** - Regular post composer
- ✅ **CreateBeaconSheet** - Beacon post composer
- 🔲 **Settings Screen** - TODO: Add toggle for `beacon_enabled` preference
- ✅ **BeaconScreen** - Shows beacons on map (always visible)
- ✅ **FeedPersonalScreen** - Filtered feed
- ✅ **FeedSojornScreen** - Filtered feed

## Next Steps

1. Apply the database migration (`add_beacon_opt_in.sql`)
2. Deploy the updated edge functions
3. Add a UI toggle in user settings for beacon opt-in
4. Test both posting flows
5. Verify feed filtering works correctly
61
_legacy/supabase/CREATE_SEARCH_VIEW.md
Normal file
@@ -0,0 +1,61 @@
# Create Search Tags View

The search function requires a database view called `view_searchable_tags` for efficient tag searching.

## Why This Is Needed

Without this view, the search function would need to download ALL posts from the database just to count tags, which would:

- Time out with 1000+ posts
- Crash the Edge Function
- Cause poor performance

The view pre-aggregates tag counts at the database level, making searches instant.

## How to Create the View

### Option 1: Via Supabase Dashboard (Recommended)

1. Go to your Supabase project's SQL Editor:
   https://supabase.com/dashboard/project/zwkihedetedlatyvplyz/sql

2. Paste and run this SQL:

```sql
CREATE OR REPLACE VIEW view_searchable_tags AS
SELECT
  unnest(tags) AS tag,
  COUNT(*) AS count
FROM posts
WHERE
  deleted_at IS NULL
  AND tags IS NOT NULL
  AND array_length(tags, 1) > 0
GROUP BY unnest(tags)
ORDER BY count DESC;
```
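To make the view's semantics concrete, here is the same aggregation as an in-memory TypeScript reduction over a posts array — an illustrative sketch with assumed row shapes, not code from the repo:

```typescript
interface PostRow {
  tags: string[] | null;
  deleted_at: string | null;
}

// Mirrors view_searchable_tags: unnest tags from non-deleted posts,
// count occurrences, and sort by count descending.
function searchableTags(posts: PostRow[]): { tag: string; count: number }[] {
  const counts = new Map<string, number>();
  for (const p of posts) {
    if (p.deleted_at !== null || !p.tags || p.tags.length === 0) continue;
    for (const tag of p.tags) {
      counts.set(tag, (counts.get(tag) ?? 0) + 1);
    }
  }
  return Array.from(counts.entries())
    .map(([tag, count]) => ({ tag, count }))
    .sort((a, b) => b.count - a.count);
}
```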
3. Click "RUN" to execute

### Option 2: Via PowerShell (if you have psql installed)

Run this from the project root:

```powershell
Get-Content supabase\migrations\create_searchable_tags_view.sql | psql $DATABASE_URL
```

Replace `$DATABASE_URL` with your Supabase database connection string.

## Verifying It Works

After creating the view, test it with:

```sql
SELECT * FROM view_searchable_tags LIMIT 10;
```

You should see a list of tags with their counts.

## What Happens If You Don't Create It?

The search function will return an error when searching for tags. User and post search will still work fine, but tag search will fail until this view is created.
70
_legacy/supabase/MIGRATION_INSTRUCTIONS.md
Normal file
@@ -0,0 +1,70 @@
# Database Migration: Enhanced Search Function

This migration updates the `search_sojorn()` function to enable full-text search across posts, users, and hashtags.

## What Changed

The search function now searches:

- **Users**: by handle AND display name (previously only handle)
- **Tags**: hashtags from the `posts.tags` array (unchanged)
- **Posts**: NEW - searches post body content for any word, matching hashtags
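The new matching behavior can be restated as plain predicates — an illustrative sketch of the `ILIKE`-style matching described above, not the SQL itself:

```typescript
// Case-insensitive substring match, the TypeScript analogue of SQL ILIKE '%q%'.
function ilike(haystack: string, query: string): boolean {
  return haystack.toLowerCase().includes(query.toLowerCase());
}

// Users now match on handle OR display name; posts match on body text.
function userMatches(u: { handle: string; display_name: string }, q: string): boolean {
  return ilike(u.handle, q) || ilike(u.display_name, q);
}

function postMatches(p: { body: string }, q: string): boolean {
  return ilike(p.body, q);
}
```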
## Option 1: Apply via Supabase Dashboard (Recommended)

1. Go to your Supabase Dashboard: https://app.supabase.com/project/zwkihedetedlatyvplyz
2. Navigate to **SQL Editor** in the left sidebar
3. Click **"New Query"**
4. Copy and paste the SQL from `supabase/migrations/update_search_function.sql`
5. Click **"Run"** to execute the migration
6. Verify success - you should see "Success. No rows returned"

## Option 2: Apply via Supabase CLI

If you have the Supabase CLI configured with your project:

```bash
# Link to your project (if not already linked)
supabase link --project-ref zwkihedetedlatyvplyz

# Push the migration
supabase db push --include-all
```

## Verification

After applying the migration, test the search:

1. Open the app and navigate to Search
2. Try searching for:
   - A username (e.g., "john")
   - A hashtag (e.g., "#nature")
   - Any word from a post body (e.g., "wellness")
3. Click on a hashtag in a post - it should navigate to search with results

## Rollback (if needed)

If you need to revert, run this SQL:

```sql
CREATE OR REPLACE FUNCTION search_sojorn(p_query TEXT, limit_count INTEGER DEFAULT 10)
RETURNS JSON LANGUAGE plpgsql STABLE AS $$
DECLARE result JSON;
BEGIN
  SELECT json_build_object(
    'users', (SELECT json_agg(json_build_object('id', p.id, 'username', p.handle, 'display_name', p.display_name, 'avatar_url', p.avatar_url, 'harmony_tier', COALESCE(ts.tier, 'new')))
              FROM profiles p LEFT JOIN trust_state ts ON p.id = ts.user_id WHERE p.handle ILIKE '%' || p_query || '%' LIMIT limit_count),
    'tags', (SELECT json_agg(json_build_object('tag', tag, 'count', cnt)) FROM (
              SELECT LOWER(UNNEST(tags)) AS tag, COUNT(*) AS cnt FROM posts WHERE tags IS NOT NULL AND deleted_at IS NULL
              GROUP BY tag HAVING LOWER(tag) LIKE '%' || LOWER(p_query) || '%' ORDER BY cnt DESC LIMIT limit_count) t)
  ) INTO result;
  RETURN result;
END;
$$;
```

Note: This removes the posts search capability.
76
_legacy/supabase/apply-migration.ps1
Normal file
@@ -0,0 +1,76 @@
# PowerShell script to apply database migration via Supabase CLI
# This connects to your remote Supabase project and applies the search function update

$PROJECT_REF = "zwkihedetedlatyvplyz"
$MIGRATION_FILE = "migrations/update_search_function.sql"

Write-Host "=====================================" -ForegroundColor Cyan
Write-Host "Sojorn Database Migration" -ForegroundColor Cyan
Write-Host "=====================================" -ForegroundColor Cyan
Write-Host ""
Write-Host "This will update the search_sojorn() function to enable:" -ForegroundColor Yellow
Write-Host " - Full-text search in post bodies" -ForegroundColor White
Write-Host " - User search by display name AND handle" -ForegroundColor White
Write-Host " - Hashtag search with post results" -ForegroundColor White
Write-Host ""

# Check if Supabase CLI is installed
if (-not (Get-Command supabase -ErrorAction SilentlyContinue)) {
    Write-Host "ERROR: Supabase CLI is not installed or not in PATH" -ForegroundColor Red
    Write-Host "Install via: scoop install supabase" -ForegroundColor Yellow
    Write-Host "Or visit: https://supabase.com/docs/guides/cli" -ForegroundColor Yellow
    exit 1
}

Write-Host "Step 1: Linking to Supabase project..." -ForegroundColor Green
Write-Host "Project Reference: $PROJECT_REF" -ForegroundColor White

# Link to project (will prompt for database password if needed)
$linkResult = supabase link --project-ref $PROJECT_REF 2>&1

if ($LASTEXITCODE -ne 0) {
    Write-Host ""
    Write-Host "ERROR: Failed to link to Supabase project" -ForegroundColor Red
    Write-Host "Please ensure you have the correct project reference and database password" -ForegroundColor Yellow
    Write-Host ""
    Write-Host "Alternative: Apply via Supabase Dashboard" -ForegroundColor Cyan
    Write-Host "1. Go to: https://app.supabase.com/project/$PROJECT_REF/sql" -ForegroundColor White
    Write-Host "2. Copy contents of: $MIGRATION_FILE" -ForegroundColor White
    Write-Host "3. Paste and run in SQL Editor" -ForegroundColor White
    exit 1
}

Write-Host ""
Write-Host "Step 2: Applying migration..." -ForegroundColor Green
Write-Host "Reading: $MIGRATION_FILE" -ForegroundColor White

# Read the migration SQL
if (-not (Test-Path $MIGRATION_FILE)) {
    Write-Host "ERROR: Migration file not found: $MIGRATION_FILE" -ForegroundColor Red
    exit 1
}

$sql = Get-Content $MIGRATION_FILE -Raw

# Apply the migration
Write-Host "Executing SQL..." -ForegroundColor White
$result = $sql | supabase db execute 2>&1

if ($LASTEXITCODE -eq 0) {
    Write-Host ""
    Write-Host "SUCCESS! Migration applied successfully" -ForegroundColor Green
    Write-Host ""
    Write-Host "Next steps:" -ForegroundColor Cyan
    Write-Host "1. Test the search functionality in your app" -ForegroundColor White
    Write-Host "2. Search for users, hashtags, and words in posts" -ForegroundColor White
    Write-Host "3. Click hashtags in posts to navigate to search" -ForegroundColor White
} else {
    Write-Host ""
    Write-Host "ERROR: Migration failed" -ForegroundColor Red
    Write-Host "Error output:" -ForegroundColor Yellow
    Write-Host $result
    Write-Host ""
    Write-Host "Try applying manually via Supabase Dashboard:" -ForegroundColor Cyan
    Write-Host "https://app.supabase.com/project/$PROJECT_REF/sql" -ForegroundColor White
    exit 1
}
103
_legacy/supabase/apply_e2ee_migration.sql
Normal file
@@ -0,0 +1,103 @@
-- ============================================================================
-- APPLY E2EE CHAT MIGRATION MANUALLY
-- ============================================================================
-- Run this script in your Supabase SQL Editor to apply the E2EE chat migration
-- ============================================================================

-- ============================================================================
-- 1. Update profiles table to store identity key and registration ID
-- ============================================================================

-- Add Signal Protocol identity key and registration ID to profiles
ALTER TABLE profiles
  ADD COLUMN IF NOT EXISTS identity_key TEXT,
  ADD COLUMN IF NOT EXISTS registration_id INTEGER;

-- ============================================================================
-- 2. Create separate one_time_prekeys table
-- ============================================================================

-- Separate table for one-time pre-keys (consumed on use)
CREATE TABLE IF NOT EXISTS one_time_prekeys (
  id SERIAL PRIMARY KEY,
  user_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
  key_id INTEGER NOT NULL,
  public_key TEXT NOT NULL,
  created_at TIMESTAMPTZ DEFAULT NOW(),

  -- Ensure unique key_id per user
  UNIQUE(user_id, key_id)
);

-- Index for efficient key consumption
CREATE INDEX IF NOT EXISTS idx_one_time_prekeys_user_id ON one_time_prekeys(user_id);

-- ============================================================================
-- 3. Update signal_keys table structure
-- ============================================================================

-- Remove one_time_prekeys from signal_keys (now a separate table)
ALTER TABLE signal_keys
  DROP COLUMN IF EXISTS one_time_prekeys;

-- Add registration_id to signal_keys if not already present
ALTER TABLE signal_keys
  ADD COLUMN IF NOT EXISTS registration_id INTEGER;

-- ============================================================================
-- 4. Update consume_one_time_prekey function
-- ============================================================================

-- Drop the existing function if it exists (different return type)
DROP FUNCTION IF EXISTS consume_one_time_prekey(UUID);

-- Create the new function to work with the separate table
CREATE FUNCTION consume_one_time_prekey(target_user_id UUID)
RETURNS TABLE(key_id INTEGER, public_key TEXT) AS $$
DECLARE
  selected_key_id INTEGER;
  selected_public_key TEXT;
BEGIN
  -- First, find the oldest key
  SELECT otpk.key_id, otpk.public_key
  INTO selected_key_id, selected_public_key
  FROM one_time_prekeys otpk
  WHERE otpk.user_id = target_user_id
  ORDER BY otpk.created_at ASC
  LIMIT 1;

  -- If we found a key, delete it and return it
  IF selected_key_id IS NOT NULL THEN
    DELETE FROM one_time_prekeys
    WHERE user_id = target_user_id AND key_id = selected_key_id;

    RETURN QUERY SELECT selected_key_id, selected_public_key;
  END IF;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
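The consume-oldest semantics of the function above can be modeled against an in-memory key list. This TypeScript sketch uses assumed shapes for illustration and is not the database API:

```typescript
interface OneTimePrekey {
  key_id: number;
  public_key: string;
  created_at: number; // epoch millis
}

// Pick the oldest prekey, remove it from the store (consumed on use),
// and return it; null when the user has no keys left.
function consumeOneTimePrekey(store: OneTimePrekey[]): OneTimePrekey | null {
  if (store.length === 0) return null;
  let oldestIdx = 0;
  for (let i = 1; i < store.length; i++) {
    if (store[i].created_at < store[oldestIdx].created_at) oldestIdx = i;
  }
  return store.splice(oldestIdx, 1)[0];
}
```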

-- ============================================================================
-- 5. Update RLS policies for one_time_prekeys
-- ============================================================================

-- Enable RLS
ALTER TABLE one_time_prekeys ENABLE ROW LEVEL SECURITY;

-- Users can read their own pre-keys (for management)
CREATE POLICY one_time_prekeys_select_own ON one_time_prekeys
  FOR SELECT USING (auth.uid() = user_id);

-- Users can insert their own pre-keys
CREATE POLICY one_time_prekeys_insert_own ON one_time_prekeys
  FOR INSERT WITH CHECK (auth.uid() = user_id);

-- Users can delete their own pre-keys (when consumed)
CREATE POLICY one_time_prekeys_delete_own ON one_time_prekeys
  FOR DELETE USING (auth.uid() = user_id);

-- ============================================================================
-- SUCCESS MESSAGE
-- ============================================================================

-- If you see this message, the migration completed successfully!
SELECT 'E2EE Chat Migration Applied Successfully!' AS status;
2
_legacy/supabase/config.toml
Normal file
@@ -0,0 +1,2 @@
[functions.cleanup-expired-content]
verify_jwt = false
193
_legacy/supabase/functions/_shared/harmony.ts
Normal file
@@ -0,0 +1,193 @@
/**
 * Harmony Score Calculation
 *
 * Design intent:
 * - Influence adapts; people are not judged.
 * - Guidance replaces punishment.
 * - Fit emerges naturally.
 *
 * Philosophy:
 * - Score is private, decays over time, and is reversible.
 * - Never bans or removes beliefs.
 * - Shapes distribution width, not access.
 *
 * Inputs:
 * - Blocks received (pattern-based, not single incidents)
 * - Trusted reports (from high-harmony users)
 * - Category friction (posting to sensitive categories with low CIS)
 * - Posting cadence (erratic spikes vs steady participation)
 * - Rewrite prompts triggered (content rejected for tone)
 * - False reports made (reports that were dismissed)
 *
 * Effects:
 * - Shapes distribution width (reach)
 * - Adds gentle posting friction if low
 * - Limits Trending eligibility
 */

export interface HarmonyInputs {
  user_id: string;
  blocks_received_7d: number;
  blocks_received_30d: number;
  trusted_reports_against: number;
  total_reports_against: number;
  posts_rejected_7d: number; // rewrite prompts triggered
  posts_created_7d: number;
  false_reports_filed: number; // reports dismissed after review
  validated_reports_filed: number; // reports confirmed after review
  days_since_signup: number;
  current_harmony_score: number;
  current_tier: string;
}

export interface HarmonyAdjustment {
  new_score: number;
  delta: number;
  reason: string;
  new_tier: string;
}

/**
 * Calculate harmony score adjustment based on recent behavior
 */
export function calculateHarmonyAdjustment(inputs: HarmonyInputs): HarmonyAdjustment {
  let delta = 0;
  const reasons: string[] = [];

  // 1. Blocks received (pattern-based)
  // Single block = minor signal. Pattern of blocks = strong negative signal.
  if (inputs.blocks_received_7d >= 3) {
    delta -= 10;
    reasons.push('Multiple blocks received recently');
  } else if (inputs.blocks_received_30d >= 5) {
    delta -= 5;
    reasons.push('Pattern of blocks over time');
  }

  // 2. Trusted reports
  // Reports from high-harmony users are strong negative signals
  if (inputs.trusted_reports_against >= 2) {
    delta -= 8;
    reasons.push('Multiple reports from trusted users');
|
||||
} else if (inputs.trusted_reports_against === 1) {
|
||||
delta -= 3;
|
||||
reasons.push('Report from trusted user');
|
||||
}
|
||||
|
||||
// 3. Content rejection rate (rewrite prompts)
|
||||
// High rejection rate indicates persistent tone issues
|
||||
const rejectionRate =
|
||||
inputs.posts_created_7d > 0 ? inputs.posts_rejected_7d / inputs.posts_created_7d : 0;
|
||||
|
||||
if (rejectionRate > 0.3) {
|
||||
delta -= 6;
|
||||
reasons.push('High content rejection rate');
|
||||
} else if (rejectionRate > 0.1) {
|
||||
delta -= 2;
|
||||
reasons.push('Some content rejected for tone');
|
||||
}
|
||||
|
||||
// 4. False reports filed
|
||||
// Filing false reports is harmful behavior
|
||||
if (inputs.false_reports_filed >= 3) {
|
||||
delta -= 7;
|
||||
reasons.push('Multiple false reports filed');
|
||||
} else if (inputs.false_reports_filed >= 1) {
|
||||
delta -= 3;
|
||||
reasons.push('False report filed');
|
||||
}
|
||||
|
||||
// 5. Positive signals: Validated reports
|
||||
// Accurate reporting helps the community
|
||||
if (inputs.validated_reports_filed >= 3) {
|
||||
delta += 5;
|
||||
reasons.push('Helpful reporting behavior');
|
||||
} else if (inputs.validated_reports_filed >= 1) {
|
||||
delta += 2;
|
||||
reasons.push('Validated report filed');
|
||||
}
|
||||
|
||||
// 6. Time-based trust growth
|
||||
// Steady participation without issues slowly builds trust
|
||||
if (inputs.days_since_signup > 90 && delta >= 0) {
|
||||
delta += 2;
|
||||
reasons.push('Sustained positive participation');
|
||||
} else if (inputs.days_since_signup > 30 && delta >= 0) {
|
||||
delta += 1;
|
||||
reasons.push('Consistent participation');
|
||||
}
|
||||
|
||||
// 7. Natural decay toward equilibrium (50)
|
||||
// Scores gradually drift back toward 50 over time
|
||||
// This ensures old negative signals fade
|
||||
if (inputs.current_harmony_score < 45) {
|
||||
delta += 1;
|
||||
reasons.push('Natural recovery over time');
|
||||
} else if (inputs.current_harmony_score > 60) {
|
||||
delta -= 1;
|
||||
reasons.push('Natural equilibrium adjustment');
|
||||
}
|
||||
|
||||
// 8. Calculate new score with bounds [0, 100]
|
||||
const new_score = Math.max(0, Math.min(100, inputs.current_harmony_score + delta));
|
||||
|
||||
// 9. Determine tier based on new score
|
||||
let new_tier: string;
|
||||
if (new_score >= 75) {
|
||||
new_tier = 'established';
|
||||
} else if (new_score >= 50) {
|
||||
new_tier = 'trusted';
|
||||
} else if (new_score >= 25) {
|
||||
new_tier = 'new';
|
||||
} else {
|
||||
new_tier = 'restricted';
|
||||
}
|
||||
|
||||
return {
|
||||
new_score,
|
||||
delta,
|
||||
reason: reasons.join('; ') || 'No significant changes',
|
||||
new_tier,
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Get user-facing explanation of harmony score (without revealing the number)
|
||||
*/
|
||||
export function getHarmonyExplanation(tier: string, score: number): string {
|
||||
if (tier === 'restricted') {
|
||||
return 'Your posts currently have limited reach. This happens when content patterns trigger community concerns. Your reach will naturally restore over time with calm participation.';
|
||||
}
|
||||
|
||||
if (tier === 'new') {
|
||||
return 'Your posts reach a modest audience while you build trust. Steady participation and helpful contributions will gradually expand your reach.';
|
||||
}
|
||||
|
||||
if (tier === 'trusted') {
|
||||
return 'Your posts reach a good audience. You have shown consistent, calm participation.';
|
||||
}
|
||||
|
||||
if (tier === 'established') {
|
||||
return 'Your posts reach a wide audience. You have built strong trust through sustained positive contributions.';
|
||||
}
|
||||
|
||||
return 'Your reach is determined by your participation patterns and community response.';
|
||||
}
|
||||
|
||||
/**
|
||||
* Determine reach multiplier for feed algorithms
|
||||
*/
|
||||
export function getReachMultiplier(tier: string, score: number): number {
|
||||
const baseMultiplier: Record<string, number> = {
|
||||
restricted: 0.2,
|
||||
new: 0.6,
|
||||
trusted: 1.0,
|
||||
established: 1.4,
|
||||
};
|
||||
|
||||
// Fine-tune based on score within tier
|
||||
const tierBase = baseMultiplier[tier] || 1.0;
|
||||
const scoreAdjustment = (score - 50) / 200; // -0.25 to +0.25
|
||||
|
||||
return Math.max(0.1, tierBase + scoreAdjustment);
|
||||
}
|
||||
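The reach-multiplier arithmetic in `getReachMultiplier` can be illustrated with a standalone sketch (not part of the committed file; the tier table and formula are copied here so the snippet runs on its own):

```typescript
// Standalone copy of the reach-multiplier logic from harmony.ts, for illustration.
const baseMultiplier: Record<string, number> = {
  restricted: 0.2,
  new: 0.6,
  trusted: 1.0,
  established: 1.4,
};

function reach(tier: string, score: number): number {
  const tierBase = baseMultiplier[tier] ?? 1.0;          // unknown tiers fall back to 1.0
  const scoreAdjustment = (score - 50) / 200;            // -0.25 to +0.25
  return Math.max(0.1, tierBase + scoreAdjustment);      // floor at 0.1, never zero
}

console.log(reach('trusted', 50));     // baseline: 1.0
console.log(reach('established', 90)); // wide reach: ~1.6
console.log(reach('restricted', 10));  // floored at 0.1
```

Note the floor: even a heavily restricted account keeps a 0.1 multiplier, consistent with the "shapes distribution width, not access" philosophy in the header comment.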
108  _legacy/supabase/functions/_shared/r2_signer.ts  Normal file
@@ -0,0 +1,108 @@
import { AwsClient } from 'https://esm.sh/aws4fetch@1.0.17'

const CUSTOM_MEDIA_DOMAIN = (Deno.env.get("CUSTOM_MEDIA_DOMAIN") ?? "https://img.gosojorn.com").trim();
const CUSTOM_VIDEO_DOMAIN = (Deno.env.get("CUSTOM_VIDEO_DOMAIN") ?? "https://quips.gosojorn.com").trim();

const DEFAULT_BUCKET_NAME = "sojorn-media";
const RESOLVED_BUCKET = (Deno.env.get("R2_BUCKET_NAME") ?? DEFAULT_BUCKET_NAME).trim();

function normalizeKey(key: string): string {
  let normalized = key.replace(/^\/+/, "");
  if (RESOLVED_BUCKET && normalized.startsWith(`${RESOLVED_BUCKET}/`)) {
    normalized = normalized.slice(RESOLVED_BUCKET.length + 1);
  }
  return normalized;
}

function extractObjectKey(input: string): string {
  const trimmed = input.trim();
  if (!trimmed) {
    throw new Error("Missing file key");
  }

  try {
    const url = new URL(trimmed);
    const key = decodeURIComponent(url.pathname);
    return normalizeKey(key);
  } catch {
    return normalizeKey(trimmed);
  }
}

export function transformLegacyMediaUrl(input: string): string | null {
  const trimmed = input.trim();
  if (!trimmed) return null;

  try {
    const url = new URL(trimmed);

    // Handle legacy media.gosojorn.com URLs
    if (url.hostname === 'media.gosojorn.com') {
      const key = decodeURIComponent(url.pathname);
      return key;
    }

    return null;
  } catch {
    return null;
  }
}

// Deprecated: thin wrapper retained for compatibility. Delegates to trySignR2Url
// and falls back to the raw key if signing fails.
export async function signR2Url(fileKey: string, expiresIn: number = 3600): Promise<string> {
  return await trySignR2Url(fileKey, undefined, expiresIn) ?? fileKey;
}

export async function trySignR2Url(fileKey: string, bucket?: string, expiresIn: number = 3600): Promise<string | null> {
  try {
    const key = normalizeKey(extractObjectKey(fileKey));

    // Check if we have credentials to sign. If not, fall back to the public URL.
    const ACCOUNT_ID = Deno.env.get('R2_ACCOUNT_ID');
    const ACCESS_KEY = Deno.env.get('R2_ACCESS_KEY');
    const SECRET_KEY = Deno.env.get('R2_SECRET_KEY');

    const isVideo = key.toLowerCase().endsWith('.mp4') ||
      key.toLowerCase().endsWith('.mov') ||
      key.toLowerCase().endsWith('.webm') ||
      bucket === 'sojorn-videos';

    if (!ACCOUNT_ID || !ACCESS_KEY || !SECRET_KEY) {
      console.warn("Missing R2 credentials for signing. Falling back to public domain.");
      const domain = isVideo ? CUSTOM_VIDEO_DOMAIN : CUSTOM_MEDIA_DOMAIN;
      if (domain && domain.startsWith("http")) {
        return `${domain.replace(/\/+$/, "")}/${key}`;
      }
      return fileKey;
    }

    const r2 = new AwsClient({
      accessKeyId: ACCESS_KEY,
      secretAccessKey: SECRET_KEY,
      region: 'auto',
      service: 's3',
    });

    const targetBucket = bucket || (isVideo ? 'sojorn-videos' : 'sojorn-media');

    // We sign against the actual R2 endpoint to ensure auth works,
    // but the SignedMediaImage can handle redirect/proxying if needed.
    const url = new URL(`https://${ACCOUNT_ID}.r2.cloudflarestorage.com/${targetBucket}/${key}`);

    // Add expiration
    url.searchParams.set('X-Amz-Expires', expiresIn.toString());

    const signedRequest = await r2.sign(url, {
      method: "GET",
      aws: { signQuery: true, allHeaders: false },
    });

    return signedRequest.url;
  } catch (error) {
    console.error("R2 signing failed", {
      fileKey,
      error: error instanceof Error ? error.message : String(error),
    });
    return null;
  }
}
133  _legacy/supabase/functions/_shared/ranking.ts  Normal file
@@ -0,0 +1,133 @@
/**
 * Ranking Algorithm for sojorn Feed
 *
 * Design intent:
 * - Attention moves slowly.
 * - Nothing competes for dominance.
 * - Clean content is protected from suppression.
 *
 * Principles:
 * - Saves > likes (intentional curation over quick reaction)
 * - Steady appreciation over time > viral spikes
 * - Low block rate = content is not harmful
 * - Low trusted report rate = content is genuinely clean
 * - Ignore comment count (we don't reward arguments)
 * - Ignore report spikes on high-CIS posts (brigading protection)
 */

export interface PostForRanking {
  id: string;
  created_at: string;
  cis_score: number;
  tone_label: string;
  save_count: number;
  like_count: number;
  view_count: number;
  author_harmony_score: number;
  author_tier: string;
  blocks_received_24h: number;
  trusted_reports: number;
  total_reports: number;
}

/**
 * Calculate calm velocity score
 * Measures steady appreciation rather than viral spikes
 */
function calculateCalmVelocity(post: PostForRanking): number {
  const ageInHours = (Date.now() - new Date(post.created_at).getTime()) / (1000 * 60 * 60);

  if (ageInHours === 0 || post.view_count === 0) return 0;

  // Saves are weighted 3x more than likes
  const engagementScore = post.save_count * 3 + post.like_count;

  // Engagement rate (relative to views)
  const engagementRate = engagementScore / Math.max(post.view_count, 1);

  // Calm velocity = steady engagement over time (not spiky)
  // Using logarithmic scaling to prevent runaway viral effects
  const velocity = Math.log1p(engagementRate * 100) / Math.log1p(ageInHours + 1);

  return velocity;
}

/**
 * Calculate safety score
 * Lower score = content has triggered negative signals
 */
function calculateSafetyScore(post: PostForRanking): number {
  let score = 1.0;

  // Penalize if blocks received in last 24h
  if (post.blocks_received_24h > 0) {
    score -= post.blocks_received_24h * 0.2;
  }

  // Penalize trusted reports heavily
  if (post.trusted_reports > 0) {
    score -= post.trusted_reports * 0.3;
  }

  // Ignore report spikes if CIS is high (brigading protection)
  if (post.cis_score < 0.7 && post.total_reports > 2) {
    score -= 0.15;
  }

  return Math.max(score, 0);
}

/**
 * Calculate author influence multiplier
 * Based on harmony score and tier
 */
function calculateAuthorInfluence(post: PostForRanking): number {
  const harmonyMultiplier = post.author_harmony_score / 100; // 0-1 range

  const tierMultiplier: Record<string, number> = {
    new: 0.5,
    trusted: 1.0,
    established: 1.3,
    restricted: 0.2,
  };

  return harmonyMultiplier * (tierMultiplier[post.author_tier] || 1.0);
}

/**
 * Calculate final ranking score for sojorn feed
 */
export function calculateRankingScore(post: PostForRanking): number {
  // Base score from content integrity
  const cisBonus = post.cis_score;

  // Tone eligibility (handled in feed query, but can boost here)
  const toneBonus = post.tone_label === 'positive' ? 1.2 : post.tone_label === 'neutral' ? 1.0 : 0.8;

  // Calm velocity (steady appreciation)
  const velocity = calculateCalmVelocity(post);

  // Safety (no blocks or trusted reports)
  const safety = calculateSafetyScore(post);

  // Author influence
  const influence = calculateAuthorInfluence(post);

  // Final score
  const score = cisBonus * toneBonus * velocity * safety * influence;

  return score;
}

/**
 * Rank posts for feed
 * Returns sorted array with scores attached
 */
export function rankPosts(posts: PostForRanking[]): Array<PostForRanking & { rank_score: number }> {
  return posts
    .map((post) => ({
      ...post,
      rank_score: calculateRankingScore(post),
    }))
    .sort((a, b) => b.rank_score - a.rank_score);
}
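The safety-score penalties compose multiplicatively with the rest of the ranking, so driving the safety factor to zero removes a post from contention entirely. A minimal standalone sketch of just the penalty arithmetic (constants copied from `calculateSafetyScore`; not part of the committed file):

```typescript
// Standalone copy of the safety-score penalty logic from ranking.ts.
function safety(blocks24h: number, trustedReports: number, cis: number, totalReports: number): number {
  let score = 1.0;
  if (blocks24h > 0) score -= blocks24h * 0.2;           // each recent block: -0.2
  if (trustedReports > 0) score -= trustedReports * 0.3; // each trusted report: -0.3
  if (cis < 0.7 && totalReports > 2) score -= 0.15;      // report pile-up counts only on low-CIS posts
  return Math.max(score, 0);
}

console.log(safety(0, 0, 0.9, 5)); // 1.0 — high CIS ignores the report spike (brigading protection)
console.log(safety(2, 1, 0.5, 4)); // ~0.15 — blocks + trusted report + low-CIS report spike
```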
27  _legacy/supabase/functions/_shared/supabase-client.ts  Normal file
@@ -0,0 +1,27 @@
/**
 * Shared Supabase client configuration for Edge Functions
 */

import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';

export function createSupabaseClient(authHeader: string) {
  return createClient(
    Deno.env.get('SUPABASE_URL') ?? '',
    Deno.env.get('SUPABASE_ANON_KEY') ?? '',
    {
      global: {
        headers: {
          Authorization: authHeader,
          apikey: Deno.env.get('SUPABASE_ANON_KEY') ?? '',
        },
      },
    }
  );
}

export function createServiceClient() {
  return createClient(
    Deno.env.get('SUPABASE_URL') ?? '',
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY') ?? ''
  );
}
173  _legacy/supabase/functions/_shared/tone-detection.ts  Normal file
@@ -0,0 +1,173 @@
/**
 * Content Filtering with OpenAI Moderation API
 *
 * Philosophy:
 * 1. Block slurs immediately (zero tolerance)
 * 2. Send to OpenAI Moderation API for additional checking
 * 3. Everything else is allowed
 */

export type ToneLabel = 'positive' | 'neutral' | 'mixed' | 'negative' | 'hostile' | 'hate';

export interface ToneAnalysis {
  tone: ToneLabel;
  cis: number; // content integrity score (0-1)
  flags: string[]; // detected patterns
  shouldReject: boolean;
  rejectReason?: string;
}

// Slurs - zero tolerance (block immediately)
const SLURS = [
  // Racial slurs
  'nigger', 'nigga', 'negro', 'chink', 'gook', 'spic', 'wetback', 'raghead',
  'sandnigger', 'coon', 'darkie', 'jap', 'zipperhead', 'mex',
  // Homophobic slurs
  'faggot', 'fag', 'fags', 'dyke', 'tranny', 'trannie', 'homo', 'lez', 'lesbo', 'queer',
  // Other
  'kike', 'spook', 'simian', 'groids', 'currymuncher', 'paki', 'cunt',
];

const OPENAI_MODERATION_URL = 'https://api.openai.com/v1/moderations';

/**
 * Analyze text - first check slurs, then send to OpenAI Moderation API
 */
export async function analyzeTone(text: string): Promise<ToneAnalysis> {
  const flags: string[] = [];
  const lowerText = text.toLowerCase();

  // Check for slurs (zero tolerance - block immediately)
  const foundSlurs = SLURS.filter(slur => lowerText.includes(slur));
  if (foundSlurs.length > 0) {
    return {
      tone: 'hate',
      cis: 0.0,
      flags: foundSlurs,
      shouldReject: true,
      rejectReason: 'This content contains slurs which are not allowed.',
    };
  }

  // Send to OpenAI Moderation API for additional checking
  const openAiKey = Deno.env.get('OPEN_AI');
  console.log('OPEN_AI key exists:', !!openAiKey);

  if (openAiKey) {
    try {
      console.log('Sending to OpenAI Moderation API, text:', text);

      const response = await fetch(OPENAI_MODERATION_URL, {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${openAiKey}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ model: 'omni-moderation-latest', input: text }),
      });

      if (response.ok) {
        const data = await response.json();
        const result = data.results[0];

        // Check various categories (using correct OpenAI category names)
        const categories = result.categories;

        if (categories['hate'] || categories['hate/threatening']) {
          return {
            tone: 'hate',
            cis: 0.0,
            flags: ['openai_hate'],
            shouldReject: true,
            rejectReason: 'This content was flagged by moderation.',
          };
        }

        if (categories['harassment'] || categories['harassment/threatening']) {
          return {
            tone: 'hostile',
            cis: 0.1,
            flags: ['openai_harassment'],
            shouldReject: true,
            rejectReason: 'This content contains harassment.',
          };
        }

        if (categories['sexual'] || categories['sexual/minors']) {
          return {
            tone: 'hostile',
            cis: 0.1,
            flags: ['openai_sexual'],
            shouldReject: true,
            rejectReason: 'This content is not appropriate.',
          };
        }

        if (categories['violence'] || categories['violence/graphic']) {
          return {
            tone: 'hostile',
            cis: 0.1,
            flags: ['openai_violence'],
            shouldReject: true,
            rejectReason: 'This content contains violence.',
          };
        }

        if (categories['self-harm'] || categories['self-harm/intent'] || categories['self-harm/instructions']) {
          return {
            tone: 'hostile',
            cis: 0.1,
            flags: ['openai_self_harm'],
            shouldReject: true,
            rejectReason: 'This content contains self-harm references.',
          };
        }
      }
    } catch (e) {
      console.error('OpenAI moderation error:', e);
      // Continue with basic analysis if moderation API fails
    }
  }

  // Determine tone based on basic sentiment
  const hasProfanity = /fuck|shit|damn|ass|bitch|dick|cock|pussy|cunt|hell|bastard/i.test(text);
  const isPositive = /love|thank|grateful|appreciate|happy|joy|peace|calm|beautiful|wonderful|amazing|great/i.test(text);
  const isNegative = /hate|angry|furious|enraged|upset|sad|depressed|hopeless|worthless|terrible/i.test(text);

  let tone: ToneLabel;
  let cis: number;

  if (isPositive && !isNegative) {
    tone = 'positive';
    cis = 0.9;
  } else if (isNegative && !isPositive) {
    tone = 'negative';
    cis = 0.5;
    flags.push('negative_tone');
  } else if (hasProfanity) {
    tone = 'neutral';
    cis = 0.7;
    flags.push('profanity');
  } else {
    tone = 'neutral';
    cis = 0.8;
  }

  return { tone, cis, flags, shouldReject: false };
}

/**
 * Generate user-facing feedback for rejected content
 */
export function getRewriteSuggestion(analysis: ToneAnalysis): string {
  if (analysis.tone === 'hate') {
    return 'Slurs are not allowed on sojorn.';
  }
  if (analysis.tone === 'hostile') {
    return 'Sharp speech does not travel here. Consider softening your words.';
  }
  if (analysis.tone === 'negative') {
    return 'This reads as negative. If you want it to reach others, try reframing.';
  }
  return 'Consider adjusting your tone for better engagement.';
}
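When the moderation call is unavailable or errors out, `analyzeTone` falls through to the keyword heuristic above. A standalone sketch of just that fallback path (regexes copied verbatim from the file; the profanity branch is omitted for brevity, and this snippet is illustrative, not part of the commit):

```typescript
// Standalone copy of analyzeTone's keyword-sentiment fallback from tone-detection.ts.
function fallbackTone(text: string): { tone: string; cis: number } {
  const isPositive = /love|thank|grateful|appreciate|happy|joy|peace|calm|beautiful|wonderful|amazing|great/i.test(text);
  const isNegative = /hate|angry|furious|enraged|upset|sad|depressed|hopeless|worthless|terrible/i.test(text);
  if (isPositive && !isNegative) return { tone: 'positive', cis: 0.9 };
  if (isNegative && !isPositive) return { tone: 'negative', cis: 0.5 };
  return { tone: 'neutral', cis: 0.8 }; // mixed or no signal
}

console.log(fallbackTone('So grateful for this calm morning').tone); // positive
console.log(fallbackTone('This is terrible and hopeless').tone);     // negative
```

Because these are plain substring regexes, mixed signals (both lists matching) deliberately resolve to neutral rather than guessing.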
53  _legacy/supabase/functions/_shared/validation.ts  Normal file
@@ -0,0 +1,53 @@
/**
 * Shared validation utilities
 */

export class ValidationError extends Error {
  constructor(message: string, public field?: string) {
    super(message);
    this.name = 'ValidationError';
  }
}

export function validatePostBody(body: string): void {
  const trimmed = body.trim();

  if (trimmed.length === 0) {
    throw new ValidationError('Post cannot be empty.', 'body');
  }

  if (body.length > 500) {
    throw new ValidationError('Post is too long (max 500 characters).', 'body');
  }
}

export function validateCommentBody(body: string): void {
  const trimmed = body.trim();

  if (trimmed.length === 0) {
    throw new ValidationError('Comment cannot be empty.', 'body');
  }

  if (body.length > 300) {
    throw new ValidationError('Comment is too long (max 300 characters).', 'body');
  }
}

export function validateReportReason(reason: string): void {
  const trimmed = reason.trim();

  if (trimmed.length < 10) {
    throw new ValidationError('Report reason must be at least 10 characters.', 'reason');
  }

  if (reason.length > 500) {
    throw new ValidationError('Report reason is too long (max 500 characters).', 'reason');
  }
}

export function validateUUID(value: string, fieldName: string): void {
  const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;
  if (!uuidRegex.test(value)) {
    throw new ValidationError(`Invalid ${fieldName}.`, fieldName);
  }
}
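`validateUUID` accepts only the canonical 8-4-4-4-12 hex form. A quick standalone check (regex copied verbatim; illustrative only, not part of the commit):

```typescript
// The UUID shape accepted by validateUUID in validation.ts.
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

console.log(uuidRegex.test('123e4567-e89b-12d3-a456-426614174000')); // true
console.log(uuidRegex.test('not-a-uuid'));                           // false
console.log(uuidRegex.test('123E4567-E89B-12D3-A456-426614174000')); // true (case-insensitive)
```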
1  _legacy/supabase/functions/appreciate/config.toml  Normal file
@@ -0,0 +1 @@
verify_jwt = false
202  _legacy/supabase/functions/appreciate/index.ts  Normal file
@@ -0,0 +1,202 @@
/**
 * POST /appreciate - Appreciate a post (boost it)
 * DELETE /appreciate - Remove appreciation
 *
 * Design intent:
 * - "Appreciate" instead of "like" - more intentional
 * - Quiet appreciation matters
 * - Boost-only, no downvotes
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
import { validateUUID, ValidationError } from '../_shared/validation.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || '*';
const CORS_HEADERS = {
  'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
  'Access-Control-Allow-Methods': 'POST, DELETE',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

interface AppreciateRequest {
  post_id: string;
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, { headers: CORS_HEADERS });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    console.log('Auth header present:', !!authHeader, 'Length:', authHeader?.length ?? 0);

    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const adminClient = createServiceClient();
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    console.log('Auth result - user:', user?.id ?? 'null', 'error:', authError?.message ?? 'none');

    if (authError || !user) {
      return new Response(JSON.stringify({
        error: 'Unauthorized',
        message: authError?.message ?? 'No user found'
      }), {
        status: 401,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      });
    }

    const { post_id } = (await req.json()) as AppreciateRequest;
    validateUUID(post_id, 'post_id');

    // Use admin client to check post existence - RLS was causing issues for some users
    // The post_likes insert will still enforce that only valid posts can be liked
    const { data: postRow, error: postError } = await adminClient
      .from('posts')
      .select('id, visibility, author_id, status')
      .eq('id', post_id)
      .maybeSingle();

    if (postError || !postRow) {
      console.error('Post lookup failed:', { post_id, error: postError?.message });
      return new Response(
        JSON.stringify({ error: 'Post not found' }),
        { status: 404, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
      );
    }

    // Check if post is active (published)
    // Note: posts use 'active' status for published posts
    if (postRow.status !== 'active') {
      return new Response(
        JSON.stringify({ error: 'Post is not available' }),
        { status: 404, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
      );
    }

    // For private posts, verify the user has access
    if (postRow.visibility === 'private' && postRow.author_id !== user.id) {
      return new Response(
        JSON.stringify({ error: 'Post not accessible' }),
        { status: 403, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
      );
    }

    // For followers-only posts, verify the user follows the author
    if (postRow.visibility === 'followers' && postRow.author_id !== user.id) {
      const { data: followRow } = await adminClient
        .from('follows')
        .select('status')
        .eq('follower_id', user.id)
        .eq('following_id', postRow.author_id)
        .eq('status', 'accepted')
        .maybeSingle();

      if (!followRow) {
        return new Response(
          JSON.stringify({ error: 'You must follow this user to appreciate their posts' }),
          { status: 403, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
        );
      }
    }

    // Handle remove appreciation (DELETE)
    if (req.method === 'DELETE') {
      const { error: deleteError } = await adminClient
        .from('post_likes')
        .delete()
        .eq('user_id', user.id)
        .eq('post_id', post_id);

      if (deleteError) {
        console.error('Error removing appreciation:', deleteError);
        return new Response(JSON.stringify({ error: 'Failed to remove appreciation' }), {
          status: 500,
          headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
        });
      }

      return new Response(JSON.stringify({ success: true }), {
        status: 200,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      });
    }

    // Handle appreciate (POST)
    const { error: likeError } = await adminClient
      .from('post_likes')
      .insert({
        user_id: user.id,
        post_id,
      });

    if (likeError) {
      console.error('Like error details:', JSON.stringify({
        code: likeError.code,
        message: likeError.message,
        details: likeError.details,
        hint: likeError.hint,
        user_id: user.id,
        post_id,
      }));

      // Already appreciated (duplicate key)
      if (likeError.code === '23505') {
        return new Response(
          JSON.stringify({ error: 'You have already appreciated this post' }),
          { status: 400, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
        );
      }

      // Post not visible (RLS blocked it)
      if (likeError.message?.includes('violates row-level security')) {
        return new Response(
          JSON.stringify({ error: 'Post not found or not accessible', code: likeError.code }),
          { status: 404, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
        );
      }

      console.error('Error appreciating post:', likeError);
      return new Response(JSON.stringify({ error: 'Failed to appreciate post', details: likeError.message }), {
        status: 500,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      });
    }

    return new Response(
      JSON.stringify({
        success: true,
        message: 'Appreciation noted. Quiet signals matter.',
      }),
      {
        status: 200,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({ error: 'Validation error', message: error.message }),
        { status: 400, headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' } }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
    });
  }
});
1  _legacy/supabase/functions/block/config.toml  Normal file
@@ -0,0 +1 @@
verify_jwt = false
183  _legacy/supabase/functions/block/index.ts  Normal file
@@ -0,0 +1,183 @@
/**
 * POST /block
 *
 * Design intent:
 * - One-tap, immediate, silent.
 * - Blocking removes all visibility both ways.
 * - No drama, no notification, complete separation.
 *
 * Flow:
 * 1. Validate auth
 * 2. Create block record
 * 3. Remove existing follows (if any)
 * 4. Log audit event
 * 5. Return success
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
import { validateUUID, ValidationError } from '../_shared/validation.ts';

interface BlockRequest {
  user_id: string; // the user to block
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, DELETE',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    // 1. Validate auth
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Handle unblock (DELETE method)
    if (req.method === 'DELETE') {
      const { user_id } = (await req.json()) as BlockRequest;
      validateUUID(user_id, 'user_id');

      const { error: deleteError } = await supabase
        .from('blocks')
        .delete()
        .eq('blocker_id', user.id)
        .eq('blocked_id', user_id);

      if (deleteError) {
        console.error('Error removing block:', deleteError);
        return new Response(JSON.stringify({ error: 'Failed to remove block' }), {
          status: 500,
          headers: { 'Content-Type': 'application/json' },
        });
      }

      const serviceClient = createServiceClient();
      await serviceClient.rpc('log_audit_event', {
        p_actor_id: user.id,
        p_event_type: 'user_unblocked',
        p_payload: { blocked_id: user_id },
      });

      return new Response(JSON.stringify({ success: true }), {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 2. Parse request (POST method)
    const { user_id: blocked_id } = (await req.json()) as BlockRequest;

    // 3. Validate input
    validateUUID(blocked_id, 'user_id');

    if (blocked_id === user.id) {
      return new Response(
        JSON.stringify({
          error: 'Invalid block',
          message: 'You cannot block yourself.',
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 4. Create block (idempotent - duplicate key will be ignored)
    const { error: blockError } = await supabase.from('blocks').insert({
      blocker_id: user.id,
      blocked_id,
    });

    if (blockError && !blockError.message.includes('duplicate')) {
      console.error('Error creating block:', blockError);
      return new Response(JSON.stringify({ error: 'Failed to create block' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 5. Remove any existing follows (both directions)
    // This ensures complete separation
    const { error: unfollowError1 } = await supabase
      .from('follows')
      .delete()
      .eq('follower_id', user.id)
      .eq('following_id', blocked_id);

    const { error: unfollowError2 } = await supabase
      .from('follows')
      .delete()
      .eq('follower_id', blocked_id)
      .eq('following_id', user.id);

    if (unfollowError1 || unfollowError2) {
      console.warn('Error removing follows during block:', unfollowError1 || unfollowError2);
      // Continue anyway - block is more important
    }

    // 6. Log audit event
    const serviceClient = createServiceClient();
    await serviceClient.rpc('log_audit_event', {
      p_actor_id: user.id,
      p_event_type: 'user_blocked',
      p_payload: { blocked_id },
    });

    // 7. Return success
    return new Response(
      JSON.stringify({
        success: true,
        message: 'Block applied. You will no longer see each other.',
      }),
      {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({
          error: 'Validation error',
          message: error.message,
          field: error.field,
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
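The POST branch above tolerates duplicate-key failures so that repeated block requests stay idempotent. A minimal sketch of that check in isolation (`isFatalInsertError` is an illustrative name, not a helper from this repo):

```typescript
// Mirrors `blockError && !blockError.message.includes('duplicate')` above:
// a duplicate-key violation means the block row already exists, which the
// endpoint treats as success rather than failure.
function isFatalInsertError(error: { message: string } | null): boolean {
  // No error: the insert succeeded outright.
  if (!error) return false;
  // Duplicate-key violations are expected on repeated blocks; not fatal.
  return !error.message.includes('duplicate');
}
```

Matching on the error message string is fragile; checking the Postgres error code for unique violations (`23505`) would be more robust if the client surfaces it.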
1  _legacy/supabase/functions/calculate-harmony/config.toml  Normal file
@@ -0,0 +1 @@
verify_jwt = false
215  _legacy/supabase/functions/calculate-harmony/index.ts  Normal file
@@ -0,0 +1,215 @@
/**
 * Harmony Score Calculation (Cron Job)
 *
 * This function runs periodically (e.g., daily) to recalculate harmony scores
 * for all users based on their recent behavior patterns.
 *
 * Design intent:
 * - Influence adapts automatically based on behavior.
 * - Scores decay over time (old issues fade).
 * - Changes are gradual, not sudden.
 *
 * Trigger: Scheduled via Supabase cron or external scheduler
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createServiceClient } from '../_shared/supabase-client.ts';
import { calculateHarmonyAdjustment, type HarmonyInputs } from '../_shared/harmony.ts';

serve(async (req) => {
  try {
    // Verify this is a scheduled/cron request
    const authHeader = req.headers.get('Authorization');
    const cronSecret = Deno.env.get('CRON_SECRET');

    if (!authHeader || authHeader !== `Bearer ${cronSecret}`) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const serviceClient = createServiceClient();

    // 1. Get all users with their current trust state
    const { data: users, error: usersError } = await serviceClient
      .from('trust_state')
      .select('user_id, harmony_score, tier, counters')
      .order('user_id');

    if (usersError) {
      console.error('Error fetching users:', usersError);
      return new Response(JSON.stringify({ error: 'Failed to fetch users' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const sevenDaysAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();
    const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString();

    let updatedCount = 0;
    let errorCount = 0;

    // 2. Process each user
    for (const user of users) {
      try {
        // Gather behavior metrics for this user

        // Blocks received
        const { count: blocks7d } = await serviceClient
          .from('blocks')
          .select('*', { count: 'exact', head: true })
          .eq('blocked_id', user.user_id)
          .gte('created_at', sevenDaysAgo);

        const { count: blocks30d } = await serviceClient
          .from('blocks')
          .select('*', { count: 'exact', head: true })
          .eq('blocked_id', user.user_id)
          .gte('created_at', thirtyDaysAgo);

        // Reports against this user.
        // Fetch the user's post ids first: .in() needs a concrete array,
        // not a pending query builder.
        const { data: authoredPosts } = await serviceClient
          .from('posts')
          .select('id')
          .eq('author_id', user.user_id);
        const authoredPostIds = authoredPosts?.map((p) => p.id) ?? [];

        const { data: reportsAgainst } = await serviceClient
          .from('reports')
          .select('reporter_id, status')
          .or('target_type.eq.post,target_type.eq.comment')
          .in('target_id', authoredPostIds);

        // Count trusted reports (from high-harmony reporters)
        let trustedReportsCount = 0;
        if (reportsAgainst) {
          for (const report of reportsAgainst) {
            const { data: reporterTrust } = await serviceClient
              .from('trust_state')
              .select('harmony_score')
              .eq('user_id', report.reporter_id)
              .single();

            if (reporterTrust && reporterTrust.harmony_score >= 70) {
              trustedReportsCount++;
            }
          }
        }

        // Posts rejected in last 7 days (from audit log)
        const { data: rejectedPosts } = await serviceClient
          .from('audit_log')
          .select('id')
          .eq('actor_id', user.user_id)
          .eq('event_type', 'post_rejected')
          .gte('created_at', sevenDaysAgo);

        // Posts created in last 7 days
        const { count: postsCreated7d } = await serviceClient
          .from('posts')
          .select('*', { count: 'exact', head: true })
          .eq('author_id', user.user_id)
          .gte('created_at', sevenDaysAgo);

        // Reports filed by user
        const { data: reportsFiled } = await serviceClient
          .from('reports')
          .select('id, status')
          .eq('reporter_id', user.user_id);

        const falseReports = reportsFiled?.filter((r) => r.status === 'dismissed').length || 0;
        const validatedReports = reportsFiled?.filter((r) => r.status === 'resolved').length || 0;

        // Days since signup
        const { data: profile } = await serviceClient
          .from('profiles')
          .select('created_at')
          .eq('id', user.user_id)
          .single();

        const daysSinceSignup = profile
          ? Math.floor((Date.now() - new Date(profile.created_at).getTime()) / (1000 * 60 * 60 * 24))
          : 0;

        // 3. Calculate harmony adjustment
        const inputs: HarmonyInputs = {
          user_id: user.user_id,
          blocks_received_7d: blocks7d || 0,
          blocks_received_30d: blocks30d || 0,
          trusted_reports_against: trustedReportsCount,
          total_reports_against: reportsAgainst?.length || 0,
          posts_rejected_7d: rejectedPosts?.length || 0,
          posts_created_7d: postsCreated7d || 0,
          false_reports_filed: falseReports,
          validated_reports_filed: validatedReports,
          days_since_signup: daysSinceSignup,
          current_harmony_score: user.harmony_score,
          current_tier: user.tier,
        };

        const adjustment = calculateHarmonyAdjustment(inputs);

        // 4. Update trust state if score changed
        if (adjustment.delta !== 0) {
          const { error: updateError } = await serviceClient
            .from('trust_state')
            .update({
              harmony_score: adjustment.new_score,
              tier: adjustment.new_tier,
              updated_at: new Date().toISOString(),
            })
            .eq('user_id', user.user_id);

          if (updateError) {
            console.error(`Error updating trust state for ${user.user_id}:`, updateError);
            errorCount++;
            continue;
          }

          // 5. Log the adjustment
          await serviceClient.rpc('log_audit_event', {
            p_actor_id: null, // system action
            p_event_type: 'harmony_recalculated',
            p_payload: {
              user_id: user.user_id,
              old_score: user.harmony_score,
              new_score: adjustment.new_score,
              delta: adjustment.delta,
              old_tier: user.tier,
              new_tier: adjustment.new_tier,
              reason: adjustment.reason,
            },
          });

          updatedCount++;
        }
      } catch (userError) {
        console.error(`Error processing user ${user.user_id}:`, userError);
        errorCount++;
      }
    }

    return new Response(
      JSON.stringify({
        success: true,
        total_users: users.length,
        updated: updatedCount,
        errors: errorCount,
        message: 'Harmony score recalculation complete',
      }),
      {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
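The cron job above leans on two small date computations: rolling ISO cutoffs for the 7-day/30-day windows, and whole-day account age. A sketch of both in isolation (the helper names are illustrative, not from the shared module):

```typescript
// Rolling ISO-8601 cutoff, e.g. isoCutoff(7) is "seven days ago" — the value
// compared against created_at via .gte() in the queries above.
function isoCutoff(daysBack: number, now: number = Date.now()): string {
  return new Date(now - daysBack * 24 * 60 * 60 * 1000).toISOString();
}

// Whole days since an ISO timestamp, matching the days_since_signup
// computation above (Math.floor, so a partial day counts as 0).
function daysSinceSignup(createdAt: string, now: number = Date.now()): number {
  return Math.floor((now - new Date(createdAt).getTime()) / (1000 * 60 * 60 * 24));
}
```

Both treat a day as exactly 24 hours, so results can shift by an hour around DST transitions; for coarse trust windows that is usually acceptable.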
@@ -0,0 +1 @@
verify_jwt = false
157  _legacy/supabase/functions/cleanup-expired-content/index.ts  Normal file
@@ -0,0 +1,157 @@
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { S3Client, DeleteObjectCommand } from 'https://esm.sh/@aws-sdk/client-s3@3.470.0';
import { createServiceClient } from '../_shared/supabase-client.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com';

const corsHeaders = {
  'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

const R2_ENDPOINT = (Deno.env.get('R2_ENDPOINT') ?? '').trim();
const R2_ACCOUNT_ID = (Deno.env.get('R2_ACCOUNT_ID') ?? '').trim();
const R2_ACCESS_KEY_ID = (Deno.env.get('R2_ACCESS_KEY_ID') ?? Deno.env.get('R2_ACCESS_KEY') ?? '').trim();
const R2_SECRET_ACCESS_KEY = (Deno.env.get('R2_SECRET_ACCESS_KEY') ?? Deno.env.get('R2_SECRET_KEY') ?? '').trim();
const R2_BUCKET_NAME = (Deno.env.get('R2_BUCKET_NAME') ?? '').trim();
const DEFAULT_BUCKET_NAME = 'sojorn-media';

const RESOLVED_ENDPOINT = R2_ENDPOINT || (R2_ACCOUNT_ID ? `https://${R2_ACCOUNT_ID}.r2.cloudflarestorage.com` : '');
const RESOLVED_BUCKET = R2_BUCKET_NAME || DEFAULT_BUCKET_NAME;

const supabase = createServiceClient();

function extractObjectKey(imageUrl: string, bucketName: string): string | null {
  try {
    const url = new URL(imageUrl);
    let key = url.pathname.replace(/^\/+/, '');
    if (!key) return null;
    if (bucketName && key.startsWith(`${bucketName}/`)) {
      key = key.slice(bucketName.length + 1);
    }
    return decodeURIComponent(key);
  } catch (_) {
    return null;
  }
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: { ...corsHeaders, 'Access-Control-Allow-Methods': 'POST, OPTIONS' } });
  }

  if (req.method !== 'POST') {
    return new Response(JSON.stringify({ error: 'Method not allowed' }), {
      status: 405,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }

  if (!RESOLVED_ENDPOINT || !R2_ACCESS_KEY_ID || !R2_SECRET_ACCESS_KEY || !RESOLVED_BUCKET) {
    return new Response(JSON.stringify({ error: 'Missing R2 configuration' }), {
      status: 500,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }

  const r2 = new S3Client({
    region: 'auto',
    endpoint: RESOLVED_ENDPOINT,
    credentials: {
      accessKeyId: R2_ACCESS_KEY_ID,
      secretAccessKey: R2_SECRET_ACCESS_KEY,
    },
    forcePathStyle: true,
  });

  try {
    const maxBatches = 50;
    const batchSize = 100;
    const maxRuntimeMs = 25000;
    const startTime = Date.now();

    let processedCount = 0;
    let deletedCount = 0;
    let skippedCount = 0;
    let batches = 0;

    while (batches < maxBatches && Date.now() - startTime < maxRuntimeMs) {
      const { data: posts, error } = await supabase
        .from('posts')
        .select('id, image_url, expires_at')
        .lt('expires_at', new Date().toISOString())
        .order('expires_at', { ascending: true })
        .limit(batchSize);

      if (error) {
        console.error('Error fetching expired posts:', error);
        return new Response(JSON.stringify({ error: 'Failed to fetch expired posts' }), {
          status: 500,
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      const expiredPosts = posts ?? [];
      if (expiredPosts.length === 0) break;

      processedCount += expiredPosts.length;
      batches += 1;

      for (const post of expiredPosts) {
        if (post.image_url) {
          const key = extractObjectKey(post.image_url, RESOLVED_BUCKET);
          if (!key) {
            console.error('Could not parse image key:', { post_id: post.id, image_url: post.image_url });
            skippedCount += 1;
            continue;
          }

          try {
            await r2.send(
              new DeleteObjectCommand({
                Bucket: RESOLVED_BUCKET,
                Key: key,
              })
            );
          } catch (error) {
            console.error('R2 deletion failed:', { post_id: post.id, error });
            skippedCount += 1;
            continue;
          }
        }

        const { error: deleteError } = await supabase
          .from('posts')
          .delete()
          .eq('id', post.id);

        if (deleteError) {
          console.error('Failed to delete post row:', { post_id: post.id, error: deleteError });
          skippedCount += 1;
          continue;
        }

        deletedCount += 1;
      }
    }

    return new Response(
      JSON.stringify({
        processed: processedCount,
        deleted: deletedCount,
        skipped: skippedCount,
        batches,
      }),
      {
        status: 200,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    console.error('Unexpected cleanup error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }
});
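The key-extraction logic in the cleanup function handles both path-style URLs (bucket in the path) and virtual-host/CDN-style URLs (bucket in the hostname). Reproduced here so it can be exercised in isolation; the URLs and bucket name in the usage notes are illustrative samples, not values from the repo:

```typescript
// Same function as extractObjectKey above: strips leading slashes and an
// optional "<bucket>/" prefix, decodes percent-escapes, and returns null for
// anything that does not parse as a URL.
function extractObjectKey(imageUrl: string, bucketName: string): string | null {
  try {
    const url = new URL(imageUrl);
    let key = url.pathname.replace(/^\/+/, '');
    if (!key) return null;
    if (bucketName && key.startsWith(`${bucketName}/`)) {
      // Path-style URL: drop the "bucket/" prefix so only the object key remains.
      key = key.slice(bucketName.length + 1);
    }
    return decodeURIComponent(key);
  } catch (_) {
    return null; // not a parseable URL
  }
}
```

Note that for virtual-host-style URLs the bucket check is simply a no-op, so the full pathname becomes the key; that matches the `forcePathStyle: true` client configuration above.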
@@ -0,0 +1 @@
verify_jwt = false
98  _legacy/supabase/functions/consume_one_time_prekey/index.ts  Normal file
@@ -0,0 +1,98 @@
import { serve } from "https://deno.land/std@0.168.0/http/server.ts"
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

interface ConsumeOneTimePrekeyRequest {
  target_user_id: string
}

serve(async (req) => {
  // Handle CORS
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  try {
    // Create a Supabase client with the Auth context of the logged in user.
    const supabaseClient = createClient(
      Deno.env.get('SUPABASE_URL') ?? '',
      Deno.env.get('SUPABASE_ANON_KEY') ?? '',
      {
        global: {
          headers: { Authorization: req.headers.get('Authorization')! },
        },
      }
    )

    // Get the current user
    const {
      data: { user },
    } = await supabaseClient.auth.getUser()

    if (!user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    // Parse request body
    const { target_user_id }: ConsumeOneTimePrekeyRequest = await req.json()

    if (!target_user_id) {
      return new Response(JSON.stringify({ error: 'target_user_id is required' }), {
        status: 400,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    console.log(`Consuming one-time pre-key for user: ${target_user_id}`)

    // Get and consume (delete) the oldest one-time pre-key for the target user.
    // maybeSingle() so "no pre-keys left" yields a null row instead of an error.
    const { data: prekey, error } = await supabaseClient
      .from('one_time_prekeys')
      .delete()
      .eq('user_id', target_user_id)
      .order('created_at')
      .limit(1)
      .select('id, public_key')
      .maybeSingle()

    if (error) {
      console.error('Error consuming one-time pre-key:', error)
      return new Response(JSON.stringify({ error: 'Failed to consume one-time pre-key' }), {
        status: 500,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    if (!prekey) {
      console.log(`No one-time pre-keys available for user: ${target_user_id}`)
      // Return null to indicate no pre-key was available
      return new Response(JSON.stringify(null), {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    console.log(`Successfully consumed one-time pre-key: ${prekey.id} for user: ${target_user_id}`)

    // Return the consumed pre-key data
    return new Response(JSON.stringify({
      key_id: prekey.id,
      public_key: prekey.public_key,
    }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    })
  } catch (error) {
    console.error('Unexpected error:', error)
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    })
  }
})
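The prekey endpoint has two success shapes: a consumed-key object, or a JSON `null` body when the target user has no one-time pre-keys left. A sketch of how a caller might distinguish them (`parsePrekeyResponse` is an illustrative helper name, not part of the repo):

```typescript
// Shape returned by the function above when a pre-key was consumed.
type ConsumedPrekey = { key_id: string; public_key: string };

// Parses the response body text: JSON `null` means "no pre-key available"
// (the caller would then fall back to a session without a one-time key),
// otherwise the consumed key is returned.
function parsePrekeyResponse(body: string): ConsumedPrekey | null {
  const parsed = JSON.parse(body);
  if (parsed === null) return null;
  return parsed as ConsumedPrekey;
}
```

Signaling "exhausted" with a 200 + `null` body rather than a 404 keeps the happy-path client code branch-free on status codes, at the cost of requiring this null check.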
1  _legacy/supabase/functions/create-beacon/config.toml  Normal file
@@ -0,0 +1 @@
verify_jwt = false
181  _legacy/supabase/functions/create-beacon/index.ts  Normal file
@@ -0,0 +1,181 @@
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts'
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts'
import { trySignR2Url } from '../_shared/r2_signer.ts'

interface BeaconRequest {
  lat: number
  long: number
  title: string
  description: string
  type: 'police' | 'checkpoint' | 'taskForce' | 'hazard' | 'safety' | 'community'
  image_url?: string
}

interface ResponseData {
  beacon?: Record<string, unknown>
  error?: string
}

serve(async (req: Request) => {
  try {
    // Get auth header
    const authHeader = req.headers.get('Authorization')
    if (!authHeader) {
      return new Response(
        JSON.stringify({ error: 'Missing Authorization header' } as ResponseData),
        { status: 401, headers: { 'Content-Type': 'application/json' } }
      )
    }

    const supabase = createSupabaseClient(authHeader)
    const { data: { user }, error: userError } = await supabase.auth.getUser()

    if (userError || !user) {
      console.error('Auth error:', userError)
      return new Response(
        JSON.stringify({ error: 'Unauthorized' } as ResponseData),
        { status: 401, headers: { 'Content-Type': 'application/json' } }
      )
    }

    // Use service role for DB operations
    const supabaseAdmin = createServiceClient()

    // Parse request body
    const body = await req.json()

    // Convert lat/long to numbers (handles both int and double from client)
    const beaconReq: BeaconRequest = {
      lat: Number(body.lat),
      long: Number(body.long),
      title: body.title,
      description: body.description,
      type: body.type,
      image_url: body.image_url
    }

    // Validate required fields (Number.isFinite, so 0 is accepted as a valid coordinate)
    if (!Number.isFinite(beaconReq.lat) || !Number.isFinite(beaconReq.long) || !beaconReq.title || !beaconReq.description || !beaconReq.type) {
      return new Response(
        JSON.stringify({ error: 'Missing required fields: lat, long, title, description, type' } as ResponseData),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      )
    }

    // Validate beacon type
    const validTypes = ['police', 'checkpoint', 'taskForce', 'hazard', 'safety', 'community']
    if (!validTypes.includes(beaconReq.type)) {
      return new Response(
        JSON.stringify({ error: 'Invalid beacon type. Must be: police, checkpoint, taskForce, hazard, safety, or community' } as ResponseData),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      )
    }

    // Get user's profile and trust score (use admin client to bypass RLS)
    const { data: profile, error: profileError } = await supabaseAdmin
      .from('profiles')
      .select('id, trust_state(harmony_score)')
      .eq('id', user.id)
      .single()

    if (profileError || !profile) {
      return new Response(
        JSON.stringify({ error: 'Profile not found' } as ResponseData),
        { status: 404, headers: { 'Content-Type': 'application/json' } }
      )
    }

    // Get a default category for beacons (search by slug to match seed data)
    const { data: category } = await supabaseAdmin
      .from('categories')
      .select('id')
      .eq('slug', 'beacon_alerts')
      .single()

    let categoryId = category?.id

    if (!categoryId) {
      // Create the beacon category if it doesn't exist (with service role bypass)
      const { data: newCategory, error: insertError } = await supabaseAdmin
        .from('categories')
        .insert({ slug: 'beacon_alerts', name: 'Beacon Alerts', description: 'Community safety and alert posts' })
        .select('id')
        .single()

      if (insertError || !newCategory) {
        console.error('Failed to create beacon category:', insertError)
        return new Response(
          JSON.stringify({ error: 'Failed to create beacon category' } as ResponseData),
          { status: 500, headers: { 'Content-Type': 'application/json' } }
        )
      }

      categoryId = newCategory.id
    }

    // Get user's trust score for initial confidence
    const trustScore = profile.trust_state?.harmony_score ?? 0.5
    const initialConfidence = 0.5 + (trustScore * 0.3) // Start at 50-80% based on trust

    // Create the beacon post
    const { data: beacon, error: beaconError } = await supabaseAdmin
      .from('posts')
      .insert({
        author_id: user.id,
        category_id: categoryId,
        body: beaconReq.description,
        is_beacon: true,
        beacon_type: beaconReq.type,
        location: `SRID=4326;POINT(${beaconReq.long} ${beaconReq.lat})`,
        confidence_score: Math.min(1.0, Math.max(0.0, initialConfidence)),
        is_active_beacon: true,
        image_url: beaconReq.image_url,
        status: 'active',
        tone_label: 'neutral',
        cis_score: 0.8,
        allow_chain: false // Beacons don't allow chaining
      })
      .select()
      .single()

    if (beaconError) {
      console.error('Error creating beacon:', beaconError)
      return new Response(
        JSON.stringify({ error: `Failed to create beacon: ${beaconError.message}` } as ResponseData),
        { status: 500, headers: { 'Content-Type': 'application/json' } }
      )
    }

    // Get full beacon data with author info
    const { data: fullBeacon } = await supabaseAdmin
      .from('posts')
      .select(`
        *,
        author:profiles!posts_author_id_fkey (
          id,
          handle,
          display_name,
          avatar_url
        )
      `)
      .eq('id', beacon.id)
      .single()

    let signedBeacon = fullBeacon
    if (fullBeacon?.image_url) {
      signedBeacon = { ...fullBeacon, image_url: await trySignR2Url(fullBeacon.image_url) }
    }

    return new Response(
      JSON.stringify({ beacon: signedBeacon } as ResponseData),
      { status: 201, headers: { 'Content-Type': 'application/json' } }
    )
  } catch (error) {
    console.error('Unexpected error:', error)
    return new Response(
      JSON.stringify({ error: 'Internal server error' } as ResponseData),
      { status: 500, headers: { 'Content-Type': 'application/json' } }
    )
  }
})
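The initial-confidence formula above maps a trust score in [0, 1] linearly onto a 50-80% starting confidence, then clamps into [0, 1]. Isolated for clarity (`initialBeaconConfidence` is an illustrative name, not a repo helper):

```typescript
// confidence = clamp(0.5 + trustScore * 0.3, 0, 1), as in the insert above.
// A trustScore of 0 gives the 50% floor; 1 gives the 80% ceiling; the clamp
// only matters if trustScore ever falls outside [0, 1].
function initialBeaconConfidence(trustScore: number): number {
  const raw = 0.5 + trustScore * 0.3;
  return Math.min(1.0, Math.max(0.0, raw));
}
```

Note the scale assumption: this formula expects harmony_score in [0, 1], while the calculate-harmony cron compares scores against 70 (a 0-100 scale); whichever scale the shared trust_state table actually uses, the two call sites should agree.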
@@ -0,0 +1 @@
verify_jwt = false
168  _legacy/supabase/functions/deactivate-account/index.ts  Normal file
@@ -0,0 +1,168 @@
/**
 * POST /deactivate-account - Deactivate user account (reactivatable within 30 days)
 * POST /deactivate-account/reactivate - Reactivate a deactivated account
 *
 * Design intent:
 * - Allows users to temporarily deactivate their account
 * - Account can be reactivated by logging in
 * - Profile and posts are hidden while deactivated
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient } from '../_shared/supabase-client.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com';

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        'Access-Control-Allow-Methods': 'POST',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    if (req.method !== 'POST') {
      return new Response(JSON.stringify({ error: 'Method not allowed' }), {
        status: 405,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    const url = new URL(req.url);
    const isReactivate = url.pathname.endsWith('/reactivate');

    if (isReactivate) {
      // Reactivate account
      const { data, error } = await supabase
        .rpc('reactivate_account', { p_user_id: user.id });

      if (error) {
        console.error('Error reactivating account:', error);
        return new Response(JSON.stringify({
          error: 'Failed to reactivate account',
          details: error.message
        }), {
          status: 500,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      if (!data || !data.success) {
        return new Response(JSON.stringify({
          error: data?.error || 'Account is not deactivated'
        }), {
          status: 400,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: 'Account reactivated successfully',
        }),
        {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        }
      );
    } else {
      // Deactivate account
      const { data, error } = await supabase
        .rpc('deactivate_account', { p_user_id: user.id });

      if (error) {
        console.error('Error deactivating account:', error);
        return new Response(JSON.stringify({
          error: 'Failed to deactivate account',
          details: error.message
        }), {
          status: 500,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      if (!data || !data.success) {
        return new Response(JSON.stringify({
          error: data?.error || 'Account already deactivated'
        }), {
          status: 400,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: 'Account deactivated successfully. You can reactivate it anytime by logging in.',
          deactivated_at: data.deactivated_at,
        }),
        {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        }
      );
    }
  } catch (error) {
    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: {
        'Content-Type': 'application/json',
'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
|
||||
},
|
||||
});
|
||||
}
|
||||
});
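Every branch in the handler above rebuilds the same JSON body plus CORS header object by hand. A small helper (hypothetical — `jsonResponse` is not part of this commit) could centralize that construction; a minimal sketch:

```typescript
// Hypothetical helper, not in the original function: builds the JSON + CORS
// response that each branch of the handler above constructs inline.
function jsonResponse(
  body: Record<string, unknown>,
  status: number,
  allowedOrigin: string,
): Response {
  return new Response(JSON.stringify(body), {
    status,
    headers: {
      'Content-Type': 'application/json',
      'Access-Control-Allow-Origin': allowedOrigin,
    },
  });
}
```

With this in place, a branch like the 401 check would collapse to `return jsonResponse({ error: 'Unauthorized' }, 401, ALLOWED_ORIGIN);`.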
1 _legacy/supabase/functions/delete-account/config.toml Normal file
@@ -0,0 +1 @@
verify_jwt = false
170 _legacy/supabase/functions/delete-account/index.ts Normal file
@@ -0,0 +1,170 @@
/**
 * POST /delete-account - Request permanent account deletion (30-day waiting period)
 * POST /delete-account/cancel - Cancel a pending deletion request
 *
 * Design intent:
 * - Allows users to request permanent account deletion
 * - 30-day waiting period before actual deletion
 * - Users can cancel the request within 30 days
 * - After 30 days, account is permanently deleted by a scheduled job
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient } from '../_shared/supabase-client.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com';

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        'Access-Control-Allow-Methods': 'POST',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    if (req.method !== 'POST') {
      return new Response(JSON.stringify({ error: 'Method not allowed' }), {
        status: 405,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
        },
      });
    }

    const url = new URL(req.url);
    const isCancel = url.pathname.endsWith('/cancel');

    if (isCancel) {
      // Cancel deletion request
      const { data, error } = await supabase
        .rpc('cancel_account_deletion', { p_user_id: user.id });

      if (error) {
        console.error('Error cancelling deletion:', error);
        return new Response(JSON.stringify({
          error: 'Failed to cancel deletion request',
          details: error.message
        }), {
          status: 500,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      if (!data || !data.success) {
        return new Response(JSON.stringify({
          error: data?.error || 'No pending deletion request found'
        }), {
          status: 400,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: 'Account deletion request cancelled successfully',
        }),
        {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        }
      );
    } else {
      // Request account deletion
      const { data, error } = await supabase
        .rpc('request_account_deletion', { p_user_id: user.id });

      if (error) {
        console.error('Error requesting deletion:', error);
        return new Response(JSON.stringify({
          error: 'Failed to request account deletion',
          details: error.message
        }), {
          status: 500,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      if (!data || !data.success) {
        return new Response(JSON.stringify({
          error: data?.error || 'Account deletion already requested'
        }), {
          status: 400,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: 'Account deletion requested. Your account will be permanently deleted in 30 days. You can cancel this request anytime by logging in.',
          deletion_date: data.deletion_date,
          deletion_requested_at: data.deletion_requested_at,
        }),
        {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
          },
        }
      );
    }
  } catch (error) {
    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: {
        'Content-Type': 'application/json',
        'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
      },
    });
  }
});
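The `request_account_deletion` RPC itself is not part of this diff. Assuming it computes `deletion_date` as the request time plus the 30-day waiting period described in the header comment, the arithmetic can be sketched as (the function name and default are illustrative):

```typescript
// Sketch of the 30-day window the backing RPC presumably computes; the actual
// SQL for `request_account_deletion` is not shown in this commit.
function deletionDate(requestedAt: Date, waitingDays = 30): Date {
  return new Date(requestedAt.getTime() + waitingDays * 24 * 60 * 60 * 1000);
}
```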
5 _legacy/supabase/functions/deno.jsonc Normal file
@@ -0,0 +1,5 @@
{
  "compilerOptions": {
    "lib": ["deno.ns", "dom"]
  }
}

@@ -0,0 +1 @@
verify_jwt = false
686 _legacy/supabase/functions/e2ee_session_manager/index.ts Normal file
@@ -0,0 +1,686 @@
import { serve } from "https://deno.land/std@0.168.0/http/server.ts";
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

/**
 * E2EE Session Manager Edge Function
 *
 * This function handles session management, cleanup, and recovery for the
 * end-to-end encryption system. It operates on metadata only and cannot
 * access the actual encrypted message content.
 *
 * Security Properties:
 * - Blind to message content (only handles metadata)
 * - Enforces proper session protocols
 * - Handles cleanup and recovery scenarios
 * - Maintains perfect forward secrecy
 */

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'POST, OPTIONS',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
  'Content-Type': 'application/json',
};

serve(async (req) => {
  // Handle CORS preflight
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    // Verify authorization header exists
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: corsHeaders,
      });
    }

    // Create auth client to verify the user's JWT
    const authClient = createClient(
      Deno.env.get('SUPABASE_URL') ?? '',
      Deno.env.get('SUPABASE_ANON_KEY') ?? '',
      {
        global: {
          headers: {
            Authorization: authHeader,
            apikey: Deno.env.get('SUPABASE_ANON_KEY') ?? '',
          },
        },
      }
    );

    // Verify the JWT and get the authenticated user
    const { data: { user }, error: authError } = await authClient.auth.getUser();
    if (authError || !user) {
      console.error('[E2EE Session Manager] Auth error:', authError);
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: corsHeaders,
      });
    }

    const serviceRoleKey = Deno.env.get('SUPABASE_SERVICE_ROLE_KEY');
    if (!serviceRoleKey) {
      console.error('[E2EE Session Manager] Missing SUPABASE_SERVICE_ROLE_KEY');
      return new Response(JSON.stringify({ error: 'Server misconfiguration: missing service role key' }), {
        status: 500,
        headers: corsHeaders,
      });
    }

    // Create service client for database operations (bypasses RLS)
    const supabase = createClient(
      Deno.env.get('SUPABASE_URL') ?? '',
      serviceRoleKey
    );

    // Parse request body
    const { action, userId, recipientId, conversationId, hasSession } = await req.json();

    if (!action) {
      return new Response(JSON.stringify({ error: 'Action is required' }), {
        status: 400,
        headers: corsHeaders,
      });
    }

    // Use the authenticated user's ID, ignoring any userId in the request body
    // This prevents users from impersonating others
    const authenticatedUserId = user.id;

    console.log(`[E2EE Session Manager] ${action} requested by ${authenticatedUserId}`);

    switch (action) {
      case 'reset_session':
        if (!recipientId) {
          return new Response(JSON.stringify({ error: 'recipientId is required' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        if (recipientId === authenticatedUserId) {
          return new Response(JSON.stringify({ error: 'recipientId must be different from userId' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        {
          const profileCheck = await ensureProfilesExist(supabase, [authenticatedUserId, recipientId]);
          if (profileCheck) return profileCheck;
        }
        return handleResetSession(supabase, authenticatedUserId, recipientId, corsHeaders);
      case 'cleanup_conversation':
        if (!conversationId) {
          return new Response(JSON.stringify({ error: 'conversationId is required' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        return handleCleanupConversation(supabase, authenticatedUserId, conversationId, corsHeaders);
      case 'verify_session':
        if (!recipientId) {
          return new Response(JSON.stringify({ error: 'recipientId is required' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        if (recipientId === authenticatedUserId) {
          return new Response(JSON.stringify({ error: 'recipientId must be different from userId' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        {
          const profileCheck = await ensureProfilesExist(supabase, [authenticatedUserId, recipientId]);
          if (profileCheck) return profileCheck;
        }
        return handleVerifySession(supabase, authenticatedUserId, recipientId, corsHeaders);
      case 'force_key_refresh':
        return handleForceKeyRefresh(supabase, authenticatedUserId, corsHeaders);
      case 'sync_session_state':
        if (!recipientId) {
          return new Response(JSON.stringify({ error: 'recipientId is required' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        if (recipientId === authenticatedUserId) {
          return new Response(JSON.stringify({ error: 'recipientId must be different from userId' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        if (typeof hasSession !== 'boolean') {
          return new Response(JSON.stringify({ error: 'hasSession must be a boolean' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        {
          const profileCheck = await ensureProfilesExist(supabase, [authenticatedUserId, recipientId]);
          if (profileCheck) return profileCheck;
        }
        return handleSyncSessionState(supabase, authenticatedUserId, recipientId, hasSession, corsHeaders);
      case 'get_session_state':
        if (!recipientId) {
          return new Response(JSON.stringify({ error: 'recipientId is required' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        if (recipientId === authenticatedUserId) {
          return new Response(JSON.stringify({ error: 'recipientId must be different from userId' }), {
            status: 400,
            headers: corsHeaders,
          });
        }
        {
          const profileCheck = await ensureProfilesExist(supabase, [authenticatedUserId, recipientId]);
          if (profileCheck) return profileCheck;
        }
        return handleGetSessionState(supabase, authenticatedUserId, recipientId, corsHeaders);
      default:
        return new Response(JSON.stringify({ error: 'Invalid action' }), {
          status: 400,
          headers: corsHeaders,
        });
    }
  } catch (error) {
    console.error('[E2EE Session Manager] Error:', error);
    return new Response(JSON.stringify({ error: error.message }), {
      status: 500,
      headers: corsHeaders,
    });
  }
});

async function ensureProfilesExist(
  supabase: any,
  userIds: string[]
): Promise<Response | null> {
  const uniqueIds = [...new Set(userIds)].filter(Boolean);
  if (uniqueIds.length === 0) {
    return new Response(JSON.stringify({ error: 'Invalid user IDs' }), {
      status: 400,
      headers: corsHeaders,
    });
  }

  const { data, error } = await supabase
    .from('profiles')
    .select('id')
    .in('id', uniqueIds);

  if (error) {
    console.error('[E2EE Session Manager] Failed to verify profiles:', error);
    return new Response(JSON.stringify({ error: 'Failed to verify profiles' }), {
      status: 500,
      headers: corsHeaders,
    });
  }

  const found = new Set((data ?? []).map((row: { id: string }) => row.id));
  const missing = uniqueIds.filter((id) => !found.has(id));
  if (missing.length > 0) {
    return new Response(JSON.stringify({ error: 'Profile not found', missingIds: missing }), {
      status: 404,
      headers: corsHeaders,
    });
  }

  return null;
}

/**
 * Reset session between two users
 * - Clears session keys for both parties
 * - Forces re-establishment of encryption session
 */
async function handleResetSession(supabase: any, userId: string, recipientId: string, headers: Record<string, string>) {
  try {
    console.log(`[E2EE Session Manager] Resetting session between ${userId} and ${recipientId}`);

    // Note: We can't directly clear flutter_secure_storage from the server,
    // but we can send commands that the client will process

    // Store a session reset command in the database
    const { error } = await supabase
      .from('e2ee_session_commands')
      .insert({
        user_id: userId,
        recipient_id: recipientId,
        command_type: 'session_reset',
        status: 'pending',
        created_at: new Date().toISOString(),
      });

    if (error) {
      console.error('[E2EE Session Manager] Failed to store session reset command:', error);
      return new Response(JSON.stringify({ success: false, error: error.message }), {
        status: 500,
        headers,
      });
    }

    // Send a realtime notification to both parties
    // The client will pick this up and clear their local session keys
    await supabase
      .from('e2ee_session_events')
      .insert({
        user_id: userId,
        event_type: 'session_reset',
        recipient_id: recipientId,
        timestamp: new Date().toISOString(),
      });

    await supabase
      .from('e2ee_session_events')
      .insert({
        user_id: recipientId,
        event_type: 'session_reset',
        recipient_id: userId,
        timestamp: new Date().toISOString(),
      });

    return new Response(JSON.stringify({
      success: true,
      message: 'Session reset initiated for both parties'
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Session reset failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}

/**
 * Clean up conversation and related data
 * - Deletes messages (if allowed by RLS)
 * - Clears session keys
 * - Handles cleanup of related metadata
 */
async function handleCleanupConversation(supabase: any, userId: string, conversationId: string, headers: Record<string, string>) {
  try {
    console.log(`[E2EE Session Manager] Cleaning up conversation ${conversationId} for ${userId}`);

    // Get conversation details
    const { data: conversation, error: convError } = await supabase
      .from('encrypted_conversations')
      .select('participant_a, participant_b')
      .eq('id', conversationId)
      .single();

    if (convError || !conversation) {
      console.error('[E2EE Session Manager] Conversation not found:', convError);
      return new Response(JSON.stringify({ success: false, error: 'Conversation not found' }), {
        status: 404,
        headers,
      });
    }

    // Determine the other participant
    const otherUserId = conversation.participant_a === userId
      ? conversation.participant_b
      : conversation.participant_a;

    // Delete messages sent by the current user
    const { error: deleteError } = await supabase
      .from('encrypted_messages')
      .delete()
      .eq('conversation_id', conversationId)
      .eq('sender_id', userId);

    if (deleteError) {
      console.warn('[E2EE Session Manager] Could not delete all messages:', deleteError);
      // This is acceptable - RLS might prevent deletion of some messages
    }

    // Store cleanup command for both parties
    await supabase
      .from('e2ee_session_commands')
      .insert({
        user_id: userId,
        conversation_id: conversationId,
        command_type: 'conversation_cleanup',
        status: 'pending',
        created_at: new Date().toISOString(),
      });

    await supabase
      .from('e2ee_session_commands')
      .insert({
        user_id: otherUserId,
        conversation_id: conversationId,
        command_type: 'conversation_cleanup',
        status: 'pending',
        created_at: new Date().toISOString(),
      });

    // Send realtime notifications
    await supabase
      .from('e2ee_session_events')
      .insert({
        user_id: userId,
        event_type: 'conversation_cleanup',
        conversation_id: conversationId,
        timestamp: new Date().toISOString(),
      });

    await supabase
      .from('e2ee_session_events')
      .insert({
        user_id: otherUserId,
        event_type: 'conversation_cleanup',
        conversation_id: conversationId,
        timestamp: new Date().toISOString(),
      });

    return new Response(JSON.stringify({
      success: true,
      message: 'Conversation cleanup initiated',
      otherUserId: otherUserId
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Conversation cleanup failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}

/**
 * Verify if a session exists between two users
 * - Checks for existing session metadata
 * - Returns session status without exposing keys
 */
async function handleVerifySession(supabase: any, userId: string, recipientId: string, headers: Record<string, string>) {
  try {
    console.log(`[E2EE Session Manager] Verifying session between ${userId} and ${recipientId}`);

    // Check if both users have signal keys (indicates they can establish sessions)
    const { data: userKeys, error: userError } = await supabase
      .from('signal_keys')
      .select('user_id')
      .eq('user_id', userId)
      .single();

    const { data: recipientKeys, error: recipientError } = await supabase
      .from('signal_keys')
      .select('user_id')
      .eq('user_id', recipientId)
      .single();

    if (userError || !userKeys) {
      return new Response(JSON.stringify({
        success: false,
        error: 'User has no encryption keys',
        userHasKeys: false,
        recipientHasKeys: !!recipientKeys
      }), {
        status: 400,
        headers,
      });
    }

    if (recipientError || !recipientKeys) {
      return new Response(JSON.stringify({
        success: false,
        error: 'Recipient has no encryption keys',
        userHasKeys: true,
        recipientHasKeys: false
      }), {
        status: 400,
        headers,
      });
    }

    // Check if there's an existing conversation (indicates potential session)
    const { data: conversation, error: convError } = await supabase
      .from('encrypted_conversations')
      .select('id, last_message_at')
      .or(`participant_a.eq.${userId},participant_b.eq.${userId}`)
      .or(`participant_a.eq.${recipientId},participant_b.eq.${recipientId}`)
      .single();

    return new Response(JSON.stringify({
      success: true,
      userHasKeys: true,
      recipientHasKeys: true,
      hasConversation: !!conversation,
      conversationId: conversation?.id,
      lastMessageAt: conversation?.last_message_at
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Session verification failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}

/**
 * Force key refresh for a user
 * - Triggers rotation of encryption keys
 * - Handles key upload and cleanup
 */
async function handleForceKeyRefresh(supabase: any, userId: string, headers: Record<string, string>) {
  try {
    console.log(`[E2EE Session Manager] Forcing key refresh for ${userId}`);

    // Store a key refresh command
    const { error } = await supabase
      .from('e2ee_session_commands')
      .insert({
        user_id: userId,
        command_type: 'key_refresh',
        status: 'pending',
        created_at: new Date().toISOString(),
      });

    if (error) {
      console.error('[E2EE Session Manager] Failed to store key refresh command:', error);
      return new Response(JSON.stringify({ success: false, error: error.message }), {
        status: 500,
        headers,
      });
    }

    // Send realtime notification
    await supabase
      .from('e2ee_session_events')
      .insert({
        user_id: userId,
        event_type: 'key_refresh',
        timestamp: new Date().toISOString(),
      });

    return new Response(JSON.stringify({
      success: true,
      message: 'Key refresh initiated',
      note: 'Client should generate new keys and upload them'
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Key refresh failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}

/**
 * Sync session state between two users
 * - Updates the server-side session state tracking
 * - Detects mismatches and triggers recovery if needed
 */
async function handleSyncSessionState(
  supabase: any,
  userId: string,
  recipientId: string,
  hasSession: boolean,
  headers: Record<string, string>
) {
  try {
    console.log(`[E2EE Session Manager] Syncing session state: ${userId} -> ${recipientId}, hasSession: ${hasSession}`);

    if (!recipientId) {
      return new Response(JSON.stringify({ error: 'recipientId is required' }), {
        status: 400,
        headers,
      });
    }

    // Call the database function to update session state
    const { data, error } = await supabase.rpc('update_e2ee_session_state', {
      p_user_id: userId,
      p_peer_id: recipientId,
      p_has_session: hasSession,
    });

    if (error) {
      console.error('[E2EE Session Manager] Failed to update session state:', error);
      return new Response(JSON.stringify({ success: false, error: error.message }), {
        status: 500,
        headers,
      });
    }

    const result = data as {
      success: boolean;
      user_has_session: boolean;
      peer_has_session: boolean;
      session_mismatch: boolean;
      peer_session_version: number;
    };

    // If there's a mismatch, notify both parties
    if (result.session_mismatch) {
      console.log(`[E2EE Session Manager] Session mismatch detected between ${userId} and ${recipientId}`);

      // Insert session_mismatch event for both users
      await supabase.from('e2ee_session_events').insert([
        {
          user_id: userId,
          event_type: 'session_mismatch',
          recipient_id: recipientId,
          error_details: {
            user_has_session: result.user_has_session,
            peer_has_session: result.peer_has_session,
          },
          timestamp: new Date().toISOString(),
        },
        {
          user_id: recipientId,
          event_type: 'session_mismatch',
          recipient_id: userId,
          error_details: {
            user_has_session: result.peer_has_session,
            peer_has_session: result.user_has_session,
          },
          timestamp: new Date().toISOString(),
        },
      ]);
    }

    return new Response(JSON.stringify({
      success: true,
      userHasSession: result.user_has_session,
      peerHasSession: result.peer_has_session,
      sessionMismatch: result.session_mismatch,
      peerSessionVersion: result.peer_session_version,
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Sync session state failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}
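The `session_mismatch` flag returned by `update_e2ee_session_state` (a database function not included in this diff) presumably encodes the case where exactly one side believes an established session exists. A minimal sketch of that check, with an illustrative function name:

```typescript
// Illustrative only: a mismatch exists when the two sides disagree about
// whether a session is established (exactly one of them has one).
function sessionMismatch(userHasSession: boolean, peerHasSession: boolean): boolean {
  return userHasSession !== peerHasSession;
}
```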

/**
 * Get current session state between two users
 * - Returns session state without modifying it
 * - Used to check for mismatches before sending messages
 */
async function handleGetSessionState(
  supabase: any,
  userId: string,
  recipientId: string,
  headers: Record<string, string>
) {
  try {
    console.log(`[E2EE Session Manager] Getting session state: ${userId} <-> ${recipientId}`);

    if (!recipientId) {
      return new Response(JSON.stringify({ error: 'recipientId is required' }), {
        status: 400,
        headers,
      });
    }

    // Call the database function to get session state
    const { data, error } = await supabase.rpc('get_e2ee_session_state', {
      p_user_id: userId,
      p_peer_id: recipientId,
    });

    if (error) {
      console.error('[E2EE Session Manager] Failed to get session state:', error);
      return new Response(JSON.stringify({ success: false, error: error.message }), {
        status: 500,
        headers,
      });
    }

    const result = data as {
      exists: boolean;
      user_has_session: boolean;
      peer_has_session: boolean;
      session_mismatch: boolean;
      user_session_version?: number;
      peer_session_version?: number;
    };

    return new Response(JSON.stringify({
      success: true,
      exists: result.exists,
      userHasSession: result.user_has_session,
      peerHasSession: result.peer_has_session,
      sessionMismatch: result.session_mismatch,
      userSessionVersion: result.user_session_version,
      peerSessionVersion: result.peer_session_version,
    }), {
      status: 200,
      headers,
    });
  } catch (error) {
    console.error('[E2EE Session Manager] Get session state failed:', error);
    return new Response(JSON.stringify({ success: false, error: error.message }), {
      status: 500,
      headers,
    });
  }
}
1 _legacy/supabase/functions/feed-personal/config.toml Normal file
@@ -0,0 +1 @@
verify_jwt = false
149 _legacy/supabase/functions/feed-personal/index.ts Normal file
@@ -0,0 +1,149 @@
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createSupabaseClient, createServiceClient } from "../_shared/supabase-client.ts";
import { trySignR2Url } from "../_shared/r2_signer.ts";

const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "GET, OPTIONS",
  "Access-Control-Allow-Headers": "authorization, x-client-info, apikey, content-type",
  "Vary": "Origin",
};

serve(async (req: Request) => {
  if (req.method === "OPTIONS") {
    return new Response(null, { headers: corsHeaders });
  }

  try {
    const authHeader = req.headers.get("Authorization");
    if (!authHeader) {
      console.error("Missing authorization header");
      return new Response(JSON.stringify({ error: "Missing authorization header" }), { status: 401, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    const supabase = createSupabaseClient(authHeader);

    // Don't pass JWT explicitly - let the SDK validate using its internal session
    const { data: { user }, error: authError } = await supabase.auth.getUser();

    if (authError || !user) {
      console.error("Auth error in feed-personal:", authError);
      console.error("User object:", user);
      return new Response(JSON.stringify({ error: "Unauthorized", details: authError?.message || "No user returned" }), { status: 401, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    const serviceKey = Deno.env.get("SUPABASE_SERVICE_ROLE_KEY");
    if (!serviceKey) {
      console.error("Missing SUPABASE_SERVICE_ROLE_KEY in function environment");
      return new Response(JSON.stringify({ error: "Server misconfigured: service key missing" }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    // Use service client for database queries to bypass RLS
    const serviceClient = createServiceClient();

    const url = new URL(req.url);
    const limit = Math.min(parseInt(url.searchParams.get("limit") || "50"), 100);
    const offset = parseInt(url.searchParams.get("offset") || "0");

    // Check if user has opted into beacon posts
    const { data: profile } = await serviceClient
      .from("profiles")
      .select("beacon_enabled")
      .eq("id", user.id)
      .single();

    const beaconEnabled = profile?.beacon_enabled || false;

    // Get list of users this person follows (with accepted status)
    const { data: followingData } = await serviceClient
      .from("follows")
      .select("following_id")
      .eq("follower_id", user.id)
      .eq("status", "accepted");

    const followingIds = (followingData || []).map((f: any) => f.following_id);

    // Include user's own posts in their feed + posts from people they follow
    const authorIds = [user.id, ...followingIds];

    // Debug: First try a simple query to see if the basic setup works
    console.log("Debug: About to query posts for user:", user.id);
    console.log("Debug: Author IDs:", authorIds);
    console.log("Debug: Beacon enabled:", beaconEnabled);

    // Fetch posts from followed users and self.
    // NOTE: the mapping below also reads the scalar columns added to this select;
    // the joined relations it references (category, metrics, user_liked, user_saved)
    // are still not selected here and resolve to undefined.
    let postsQuery = serviceClient
      .from("posts")
      .select(`id, type, body, body_format, background_id, tone_label, allow_chain, chain_parent_id, image_url, created_at, visibility, author_id,
        author:profiles!posts_author_id_fkey (id, handle, display_name, avatar_url)`)
      .in("author_id", authorIds);
    // .eq("status", "active"); // Temporarily remove status filter to debug

    // Filter visibility: user can see their own posts (any visibility),
    // or public/followers posts from people they follow
    postsQuery = postsQuery.or(`author_id.eq.${user.id},visibility.in.(public,followers)`);

    // Only filter out beacons if user has NOT opted in
    if (!beaconEnabled) {
      postsQuery = postsQuery.eq("is_beacon", false);
    }

    const { data: posts, error: postsError } = await postsQuery
      .order("created_at", { ascending: false })
      .range(offset, offset + limit - 1);

    if (postsError) {
      console.error("Error fetching posts:", postsError);
      return new Response(JSON.stringify({ error: "Failed to fetch feed" }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    // Get chain parent posts separately (self-referential relationship not set up)
    const postsWithChains = posts || [];
    const chainParentIds = postsWithChains
      .filter((p: any) => p.chain_parent_id)
      .map((p: any) => p.chain_parent_id);

    const chainParentMap = new Map<string, any>();
    if (chainParentIds.length > 0) {
      const { data: chainParents } = await serviceClient
        .from("posts")
        .select(`id, body, created_at,
          author:profiles!posts_author_id_fkey (id, handle, display_name, avatar_url)`)
        .in("id", [...new Set(chainParentIds)]);

      chainParents?.forEach((cp: any) => {
        chainParentMap.set(cp.id, {
          id: cp.id,
          body: cp.body,
          created_at: cp.created_at,
          author: cp.author,
        });
      });
    }

    const feedItems = postsWithChains.map((post: any) => ({
      id: post.id, body: post.body, body_format: post.body_format, background_id: post.background_id, created_at: post.created_at, tone_label: post.tone_label,
      allow_chain: post.allow_chain, chain_parent_id: post.chain_parent_id,
      image_url: post.image_url,
      visibility: post.visibility,
      chain_parent: post.chain_parent_id ? chainParentMap.get(post.chain_parent_id) : null,
      author: post.author, category: post.category, metrics: post.metrics,
      user_liked: post.user_liked?.some((l: any) => l.user_id === user.id) || false,
      user_saved: post.user_saved?.some((s: any) => s.user_id === user.id) || false,
    }));

    const signedItems = await Promise.all(
      feedItems.map(async (post) => {
        if (!post.image_url) {
          return post;
        }
        return { ...post, image_url: await trySignR2Url(post.image_url) };
      })
    );

    return new Response(JSON.stringify({ posts: signedItems, pagination: { limit, offset, returned: signedItems.length } }), { status: 200, headers: { ...corsHeaders, "Content-Type": "application/json" } });
  } catch (error) {
    console.error("Unexpected error:", error);
    return new Response(JSON.stringify({ error: "Internal server error" }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
  }
});
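The PostgREST `.or()` filter in feed-personal encodes one rule: a viewer always sees their own posts, and only public/followers posts from others in the author set. A minimal sketch of that rule as a pure predicate (names are illustrative; this is not part of the deployed function):

```typescript
type Visibility = "public" | "followers" | "private";

interface VisiblePost {
  author_id: string;
  visibility: Visibility;
}

// Mirrors `author_id.eq.${user.id},visibility.in.(public,followers)`:
// own posts pass at any visibility; others must be public or followers.
function isVisibleTo(viewerId: string, post: VisiblePost): boolean {
  if (post.author_id === viewerId) return true;
  return post.visibility === "public" || post.visibility === "followers";
}
```

Note this predicate covers only the visibility clause; the real query additionally restricts `author_id` to the viewer plus accepted follows.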
1	_legacy/supabase/functions/feed-sojorn/config.toml	(new file)
@@ -0,0 +1 @@
verify_jwt = false
363	_legacy/supabase/functions/feed-sojorn/index.ts	(new file)
@@ -0,0 +1,363 @@
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createSupabaseClient, createServiceClient } from "../_shared/supabase-client.ts";
import { rankPosts, type PostForRanking } from "../_shared/ranking.ts";
import { trySignR2Url } from "../_shared/r2_signer.ts";

const ALLOWED_ORIGIN = Deno.env.get("ALLOWED_ORIGIN") || "*";
const corsHeaders = {
  "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
  "Access-Control-Allow-Methods": "GET, OPTIONS",
  "Access-Control-Allow-Headers": "authorization, x-client-info, apikey, content-type",
};

interface Profile { id: string; handle: string; display_name: string; }
interface Category { id: string; slug: string; name: string; }
interface PostMetrics { like_count: number; save_count: number; view_count: number; }
interface Post {
  id: string; body: string; body_format?: string; background_id?: string; created_at: string; category_id: string | null;
  tone_label: "positive" | "neutral" | "mixed" | "negative";
  cis_score: number; author_id: string; author: Profile; category: Category;
  metrics: PostMetrics | null; allow_chain: boolean; chain_parent_id: string | null;
  image_url: string | null; tags: string[] | null; visibility?: string;
  user_liked: { user_id: string }[]; user_saved: { user_id: string }[];
  // Sponsored ad fields
  is_sponsored?: boolean;
  advertiser_name?: string;
  advertiser_cta_link?: string;
  advertiser_cta_text?: string;
  advertiser_body?: string;
  advertiser_image_url?: string;
}
interface TrustState { user_id: string; harmony_score: number; tier: "new" | "standard" | "trusted"; }
interface Block { blocked_id: string; }
interface Report { target_id: string; reporter_id: string; status: "pending" | "resolved"; }
interface PostLike { user_id: string; }
interface PostSave { user_id: string; }
interface SponsoredPost {
  id: string;
  advertiser_name: string;
  body: string;
  image_url: string | null;
  cta_link: string;
  cta_text: string;
}
serve(async (req: Request) => {
  if (req.method === "OPTIONS") {
    return new Response(null, { headers: corsHeaders });
  }

  try {
    const authHeader = req.headers.get("Authorization");
    if (!authHeader) {
      console.error("Missing authorization header");
      return new Response(JSON.stringify({ error: "Missing authorization header" }), { status: 401, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    console.log("Auth header present, length:", authHeader.length);

    // Extract JWT from header
    const jwt = authHeader.replace("Bearer ", "");
    console.log("JWT extracted, length:", jwt.length);

    const supabase = createSupabaseClient(authHeader);
    console.log("Supabase client created");

    const serviceClient = createServiceClient();
    console.log("Service client created");

    // Don't pass JWT explicitly - let the SDK validate using its internal session
    // The auth header was already used to create the client
    const { data: { user }, error: authError } = await supabase.auth.getUser();
    console.log("getUser result:", { userId: user?.id, error: authError });

    if (authError || !user) {
      console.error("Auth error in feed-sojorn:", authError);
      console.error("User object:", user);
      return new Response(JSON.stringify({ error: "Unauthorized", details: authError?.message || "No user returned" }), { status: 401, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }

    const url = new URL(req.url);
    const limit = Math.min(parseInt(url.searchParams.get("limit") || "50"), 100);
    const offset = parseInt(url.searchParams.get("offset") || "0");

    // Check if user has opted into beacon posts
    const { data: profile } = await serviceClient
      .from("profiles")
      .select("beacon_enabled")
      .eq("id", user.id)
      .single();

    const beaconEnabled = profile?.beacon_enabled || false;

    // Get user's enabled category IDs for ad targeting
    const { data: userCategorySettings, error: userCategoryError } = await serviceClient
      .from("user_category_settings")
      .select("category_id")
      .eq("user_id", user.id)
      .eq("enabled", true);

    if (userCategoryError) {
      console.error("Error fetching user category settings:", userCategoryError);
    }

    const userCategoryIds = (userCategorySettings || [])
      .map((uc) => uc.category_id)
      .filter(Boolean);

    // Map categories to their slugs for tag matching (normalized lowercase)
    let userCategorySlugs: string[] = [];
    if (userCategoryIds.length > 0) {
      const { data: categories } = await serviceClient
        .from("categories")
        .select("id, slug")
        .in("id", userCategoryIds);

      userCategorySlugs = (categories || [])
        .map((c: Category) => (c.slug || "").toLowerCase())
        .filter((slug) => slug.length > 0);
    }

    const sevenDaysAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();

    // Fetch posts prioritizing those with images first, then by recency
    // This ensures visual content gets featured prominently
    // Exclude beacon posts unless user has opted in
    let postsQuery = serviceClient
      .from("posts")
      .select(`id, body, body_format, created_at, category_id, tone_label, cis_score, author_id, image_url, tags, visibility,
        author:profiles!posts_author_id_fkey (id, handle, display_name, avatar_url),
        category:categories!posts_category_id_fkey (id, slug, name),
        metrics:post_metrics (like_count, save_count)`)
      .in("tone_label", ["positive", "neutral", "mixed"])
      .gte("created_at", sevenDaysAgo);

    // Hybrid matching: legacy categories OR new hashtag tags
    if (userCategoryIds.length > 0 || userCategorySlugs.length > 0) {
      const orConditions: string[] = [];
      if (userCategoryIds.length > 0) {
        orConditions.push(`category_id.in.(${userCategoryIds.join(",")})`);
      }
      if (userCategorySlugs.length > 0) {
        orConditions.push(`tags.ov.{${userCategorySlugs.join(",")}}`);
      }
      if (orConditions.length > 0) {
        postsQuery = postsQuery.or(orConditions.join(","));
      }
    }

    // Only filter out beacons if user has NOT opted in
    if (!beaconEnabled) {
      postsQuery = postsQuery.eq("is_beacon", false);
    }

    const { data: posts, error: postsError } = await postsQuery
      .order("image_url", { ascending: false }) // Posts WITH images first
      .order("created_at", { ascending: false })
      .limit(1000); // Fetch more to rank, then paginate

    if (postsError) {
      console.error("Error fetching posts:", postsError);
      return new Response(JSON.stringify({
        error: "Failed to fetch feed",
        details: postsError.message,
        code: postsError.code,
        hint: postsError.hint,
      }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
    }
    const safePosts = posts || [];
    const authorIds = [...new Set(safePosts.map((p: Post) => p.author_id))];
    const trustStates = authorIds.length > 0
      ? (await serviceClient.from("trust_state").select("user_id, harmony_score, tier").in("user_id", authorIds)).data
      : [];
    const trustMap = new Map<string, TrustState>(trustStates?.map((t: TrustState) => [t.user_id, t]) || []);

    const oneDayAgo = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
    const recentBlocks = authorIds.length > 0
      ? (await serviceClient.from("blocks").select("blocked_id").in("blocked_id", authorIds).gte("created_at", oneDayAgo)).data
      : [];
    const blocksMap = new Map<string, number>();
    recentBlocks?.forEach((block: Block) => { blocksMap.set(block.blocked_id, (blocksMap.get(block.blocked_id) || 0) + 1); });

    const postIds = safePosts.map((p: Post) => p.id);
    const reports = postIds.length > 0
      ? (await serviceClient.from("reports").select("target_id, reporter_id, status").eq("target_type", "post").in("target_id", postIds)).data
      : [];
    const trustedReportMap = new Map<string, number>();
    const totalReportMap = new Map<string, number>();
    for (const report of reports || []) {
      totalReportMap.set(report.target_id, (totalReportMap.get(report.target_id) || 0) + 1);
      const reporterTrust = trustMap.get(report.reporter_id);
      if (reporterTrust && reporterTrust.harmony_score >= 70) {
        trustedReportMap.set(report.target_id, (trustedReportMap.get(report.target_id) || 0) + 1);
      }
    }

    // Calculate has_image bonus for ranking
    const postsForRanking: PostForRanking[] = safePosts.map((post: Post) => {
      const authorTrust = trustMap.get(post.author_id);
      return {
        id: post.id,
        created_at: post.created_at,
        cis_score: post.cis_score || 0.8,
        tone_label: post.tone_label || "neutral",
        save_count: post.metrics?.save_count || 0,
        like_count: post.metrics?.like_count || 0,
        view_count: post.metrics?.view_count || 0,
        author_harmony_score: authorTrust?.harmony_score || 50,
        author_tier: authorTrust?.tier || "new",
        blocks_received_24h: blocksMap.get(post.author_id) || 0,
        trusted_reports: trustedReportMap.get(post.id) || 0,
        total_reports: totalReportMap.get(post.id) || 0,
        has_image: post.image_url != null && post.image_url.length > 0,
      };
    });

    // Use ranking algorithm
    const rankedPosts = rankPosts(postsForRanking);
    const paginatedPosts = rankedPosts.slice(offset, offset + limit);
    const resultIds = paginatedPosts.map((p) => p.id);

    let finalPosts: Post[] = [];
    if (resultIds.length > 0) {
      const { data, error: finalError } = await serviceClient
        .from("posts")
        .select(`id, body, body_format, background_id, created_at, tone_label, allow_chain, chain_parent_id, image_url, tags, visibility,
          author:profiles!posts_author_id_fkey (id, handle, display_name, avatar_url),
          category:categories!posts_category_id_fkey (id, slug, name),
          metrics:post_metrics (like_count, save_count),
          user_liked:post_likes!left (user_id),
          user_saved:post_saves!left (user_id)`)
        .in("id", resultIds);

      if (finalError) {
        console.error("Error fetching final posts:", finalError);
        return new Response(JSON.stringify({
          error: "Failed to fetch feed",
          details: finalError.message,
          code: finalError.code,
          hint: finalError.hint,
        }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
      }

      finalPosts = data || [];
    }

    let orderedPosts = resultIds.map((id) => finalPosts.find((p: Post) => p.id === id)).filter(Boolean).map((post: Post) => ({
      id: post.id, body: post.body, body_format: post.body_format, background_id: post.background_id, created_at: post.created_at, tone_label: post.tone_label, allow_chain: post.allow_chain,
      chain_parent_id: post.chain_parent_id, image_url: post.image_url, tags: post.tags, visibility: post.visibility, author: post.author, category: post.category, metrics: post.metrics,
      user_liked: post.user_liked?.some((l: PostLike) => l.user_id === user.id) || false,
      user_saved: post.user_saved?.some((s: PostSave) => s.user_id === user.id) || false,
    }));
    // =========================================================================
    // SILENT AD INJECTION - Check for sponsored content
    // =========================================================================

    // Only inject if user has categories and we have posts to inject into
    if (userCategoryIds.length > 0 && orderedPosts.length > 0) {
      try {
        // Fetch a random active ad that matches user's subscribed categories
        const { data: sponsoredPosts } = await serviceClient
          .from("sponsored_posts")
          .select("id, advertiser_name, body, image_url, cta_link, cta_text")
          .eq("active", true)
          .or(
            `target_categories.cs.{*},target_categories.ov.{${userCategoryIds.join(",")}}`,
          )
          // NOTE: supabase-js has no .raw(); this column-to-column comparison
          // throws at query-build time and is swallowed by the catch below.
          .lt("current_impressions", serviceClient.raw("impression_goal")) // Only show if under goal
          .limit(1);

        if (sponsoredPosts && sponsoredPosts.length > 0) {
          const ad = sponsoredPosts[0] as SponsoredPost;

          // Create a fake post object that looks like a real post but is marked as sponsored
          const sponsoredPost: Post = {
            id: ad.id,
            body: ad.body,
            body_format: "markdown",
            background_id: null,
            created_at: new Date().toISOString(),
            tone_label: "neutral",
            cis_score: 1.0,
            author_id: "sponsored",
            author: {
              id: "sponsored",
              handle: "sponsored",
              display_name: ad.advertiser_name,
            },
            category: {
              id: "sponsored",
              slug: "sponsored",
              name: "Sponsored",
            },
            metrics: { like_count: 0, save_count: 0, view_count: 0 },
            allow_chain: false,
            chain_parent_id: null,
            image_url: ad.image_url,
            tags: null,
            user_liked: [],
            user_saved: [],
            // Sponsored ad markers
            is_sponsored: true,
            advertiser_name: ad.advertiser_name,
            advertiser_cta_link: ad.cta_link,
            advertiser_cta_text: ad.cta_text,
            advertiser_body: ad.body,
            advertiser_image_url: ad.image_url,
          };

          // Inject at position 4 (5th slot) if we have enough posts
          const adInjectionIndex = 4;
          if (orderedPosts.length > adInjectionIndex) {
            orderedPosts = [
              ...orderedPosts.slice(0, adInjectionIndex),
              sponsoredPost,
              ...orderedPosts.slice(adInjectionIndex),
            ];
            console.log("Sponsored ad injected at index", adInjectionIndex, "advertiser:", ad.advertiser_name);
          } else {
            // If not enough posts, inject at the end
            orderedPosts = [...orderedPosts, sponsoredPost];
            console.log("Sponsored ad injected at end, advertiser:", ad.advertiser_name);
          }
        }
      } catch (error) {
        console.error("Sponsored ad injection failed:", error);
      }
    }
    // =========================================================================
    // END AD INJECTION
    // =========================================================================

    const signedPosts = await Promise.all(
      orderedPosts.map(async (post) => {
        if (!post.image_url && !post.advertiser_image_url) {
          return post;
        }

        const nextPost = { ...post };
        if (post.image_url) {
          nextPost.image_url = await trySignR2Url(post.image_url);
        }
        if (post.advertiser_image_url) {
          nextPost.advertiser_image_url = await trySignR2Url(post.advertiser_image_url);
        }
        return nextPost;
      })
    );

    return new Response(JSON.stringify({
      posts: signedPosts,
      pagination: { limit, offset, returned: orderedPosts.length },
      ranking_explanation: "Posts are ranked by: 1) Has image bonus, 2) Author harmony score, 3) Steady appreciation (saves > likes), 4) Recency. Sponsored content may appear periodically.",
    }), { status: 200, headers: { ...corsHeaders, "Content-Type": "application/json" } });
  } catch (error) {
    console.error("Unexpected error:", error);
    return new Response(JSON.stringify({
      error: "Internal server error",
      details: error instanceof Error ? error.message : String(error),
    }), { status: 500, headers: { ...corsHeaders, "Content-Type": "application/json" } });
  }
});
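The ad-placement branch in feed-sojorn reduces to a small array operation: insert the sponsored item at a fixed index when the feed is long enough, otherwise append. A minimal standalone sketch of that logic (the `injectAt` helper is illustrative, not part of the deployed function):

```typescript
// Insert `item` at `index` when the list is long enough to host it there;
// otherwise append, matching the fallback branch in the handler above.
function injectAt<T>(items: T[], item: T, index: number): T[] {
  return items.length > index
    ? [...items.slice(0, index), item, ...items.slice(index)]
    : [...items, item];
}
```

This keeps the slice arithmetic testable in isolation, away from the Supabase query and the try/catch that currently hide failures.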
1	_legacy/supabase/functions/follow/config.toml	(new file)
@@ -0,0 +1 @@
verify_jwt = false
135	_legacy/supabase/functions/follow/index.ts	(new file)
@@ -0,0 +1,135 @@
/**
 * POST /follow - Follow a user
 * DELETE /follow - Unfollow a user
 *
 * Design intent:
 * - Following is explicit and intentional
 * - Mutual follow enables conversation
 * - Cannot follow if blocked
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient } from '../_shared/supabase-client.ts';
import { validateUUID, ValidationError } from '../_shared/validation.ts';

interface FollowRequest {
  user_id: string; // the user to follow/unfollow
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, DELETE',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    let { user_id: following_id } = (await req.json()) as FollowRequest;
    validateUUID(following_id, 'user_id');
    following_id = following_id.toLowerCase();

    if (following_id === user.id.toLowerCase()) {
      return new Response(
        JSON.stringify({ error: 'Cannot follow yourself' }),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // Handle unfollow
    if (req.method === 'DELETE') {
      const { error: deleteError } = await supabase
        .from('follows')
        .delete()
        .eq('follower_id', user.id)
        .eq('following_id', following_id);

      if (deleteError) {
        console.error('Error unfollowing:', deleteError);
        return new Response(JSON.stringify({ error: 'Failed to unfollow' }), {
          status: 500,
          headers: { 'Content-Type': 'application/json' },
        });
      }

      return new Response(JSON.stringify({ success: true, message: 'Unfollowed' }), {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Handle follow (POST) via request_follow
    const { data: status, error: followError } = await supabase
      .rpc('request_follow', { target_id: following_id });

    if (followError) {
      if (followError.message?.includes('Cannot follow') || followError.code === '23514') {
        return new Response(
          JSON.stringify({ error: 'Cannot follow this user' }),
          { status: 403, headers: { 'Content-Type': 'application/json' } }
        );
      }

      console.error('Error following:', followError);
      return new Response(JSON.stringify({ error: 'Failed to follow' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const followStatus = status as string | null;
    const message =
      followStatus === 'pending'
        ? 'Request sent.'
        : 'Followed. Mutual follow enables conversation.';

    return new Response(
      JSON.stringify({
        success: true,
        status: followStatus,
        message,
      }),
      {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({ error: 'Validation error', message: error.message }),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
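The success message in the follow handler depends only on the status string returned by the `request_follow` RPC; pulled out as a pure function it documents the two outcomes directly (a sketch, mirroring the ternary above):

```typescript
// 'pending' means the target requires approval; any other status is treated
// as an immediate follow, matching the handler's ternary.
function followMessage(status: string | null): string {
  return status === 'pending'
    ? 'Request sent.'
    : 'Followed. Mutual follow enables conversation.';
}
```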
1	_legacy/supabase/functions/manage-post/config.toml	(new file)
@@ -0,0 +1 @@
verify_jwt = false
318	_legacy/supabase/functions/manage-post/index.ts	(new file)
@@ -0,0 +1,318 @@
/// <reference types="https://deno.land/x/deno@v1.28.0/cli/dts/lib.deno.ts" />
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com';

const corsHeaders = {
  'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

// ============================================================================
// TONE CHECK (inline version to avoid external call)
// ============================================================================

interface ToneResult {
  tone: 'positive' | 'neutral' | 'mixed' | 'negative' | 'hostile' | 'hate';
  cis: number;
  flags: string[];
  reason: string;
}

function basicModeration(text: string): ToneResult {
  const lowerText = text.toLowerCase();

  // Slur patterns
  const slurPatterns = [/\bn+[i1]+g+[aegr]+/i, /\bf+[a4]+g+[s$o0]+t/i, /\br+[e3]+t+[a4]+r+d/i];
  for (const pattern of slurPatterns) {
    if (pattern.test(text)) {
      return { tone: 'hate', cis: 0.0, flags: ['hate-speech'], reason: 'Hate speech detected.' };
    }
  }

  // Attack patterns
  const attackPatterns = [/\b(fuck|screw|damn)\s+(you|u|your|ur)/i, /\b(kill|hurt|attack)\s+(you|yourself)/i];
  for (const pattern of attackPatterns) {
    if (pattern.test(text)) {
      return { tone: 'hostile', cis: 0.2, flags: ['hostile'], reason: 'Personal attack detected.' };
    }
  }

  // Positive indicators
  const positiveWords = ['thank', 'appreciate', 'love', 'support', 'grateful'];
  if (positiveWords.some(word => lowerText.includes(word))) {
    return { tone: 'positive', cis: 1.0, flags: [], reason: 'Positive content' };
  }

  return { tone: 'neutral', cis: 0.8, flags: [], reason: 'Content approved' };
}

async function checkTone(text: string): Promise<ToneResult> {
  const openAiKey = Deno.env.get('OPEN_AI');

  if (openAiKey) {
    try {
      const response = await fetch('https://api.openai.com/v1/moderations', {
        method: 'POST',
        headers: { 'Authorization': `Bearer ${openAiKey}`, 'Content-Type': 'application/json' },
        body: JSON.stringify({ input: text, model: 'text-moderation-latest' }),
      });

      if (response.ok) {
        const data = await response.json();
        const results = data.results?.[0];
        if (results) {
          if (results.flagged) {
            if (results.categories?.hate) return { tone: 'hate', cis: 0.0, flags: ['hate'], reason: 'Hate speech detected.' };
            if (results.categories?.harassment) return { tone: 'hostile', cis: 0.2, flags: ['harassment'], reason: 'Harassment detected.' };
            return { tone: 'mixed', cis: 0.5, flags: ['flagged'], reason: 'Content flagged.' };
          }
          // Not flagged - return neutral with CIS based on scores
          const maxScore = Math.max(results.category_scores?.harassment || 0, results.category_scores?.hate || 0);
          if (maxScore > 0.5) return { tone: 'mixed', cis: 0.6, flags: [], reason: 'Content approved (caution)' };
          return { tone: 'neutral', cis: 0.8, flags: [], reason: 'Content approved' };
        }
      }
    } catch (error) {
      console.error('OpenAI moderation error:', error);
    }
  }

  return basicModeration(text);
}

// ============================================================================
// MAIN HANDLER
// ============================================================================

interface RequestBody {
  action: 'edit' | 'delete' | 'update_privacy' | 'bulk_update_privacy' | 'pin' | 'unpin';
  post_id?: string;
  content?: string; // For edit action
  visibility?: 'public' | 'followers' | 'private';
}

serve(async (req: Request) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: { ...corsHeaders, 'Access-Control-Allow-Methods': 'POST, OPTIONS' } });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization' }), { status: 401, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
    }

    const supabase = createSupabaseClient(authHeader);
    const serviceClient = createServiceClient();

    const { data: { user }, error: authError } = await supabase.auth.getUser();
    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), { status: 401, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
    }

    const { action, post_id, content, visibility } = await req.json() as RequestBody;

    if (!action) {
      return new Response(JSON.stringify({ error: 'Missing required fields' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
    }

    const allowedVisibilities = ['public', 'followers', 'private'];
    if ((action === 'update_privacy' || action === 'bulk_update_privacy') && (!visibility || !allowedVisibilities.includes(visibility))) {
      return new Response(JSON.stringify({ error: 'Invalid visibility' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
    }

    if (action === 'bulk_update_privacy') {
      const { error: bulkError } = await serviceClient
        .from('posts')
        .update({ visibility })
        .eq('author_id', user.id);

      if (bulkError) {
        console.error('Bulk privacy update error:', bulkError);
        return new Response(JSON.stringify({ error: 'Failed to update post privacy' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ success: true, message: 'Post privacy updated' }), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
if (!post_id) {
|
||||
return new Response(JSON.stringify({ error: 'Missing post_id' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// Fetch the post
|
||||
const { data: post, error: postError } = await serviceClient
|
||||
.from('posts')
|
||||
.select('id, author_id, body, tone_label, cis_score, created_at, is_edited')
|
||||
.eq('id', post_id)
|
||||
.single();
|
||||
|
||||
if (postError || !post) {
|
||||
return new Response(JSON.stringify({ error: 'Post not found' }), { status: 404, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// Verify ownership
|
||||
if (post.author_id !== user.id) {
|
||||
return new Response(JSON.stringify({ error: 'You can only edit or delete your own posts' }), { status: 403, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// ========================================================================
|
||||
// ACTION: PIN
|
||||
// ========================================================================
|
||||
if (action === 'pin') {
|
||||
const pinnedAt = new Date().toISOString();
|
||||
|
||||
const { error: clearError } = await serviceClient
|
||||
.from('posts')
|
||||
.update({ pinned_at: null })
|
||||
.eq('author_id', user.id);
|
||||
|
||||
if (clearError) {
|
||||
console.error('Clear pinned post error:', clearError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to pin post' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
const { error: pinError } = await serviceClient
|
||||
.from('posts')
|
||||
.update({ pinned_at: pinnedAt })
|
||||
.eq('id', post_id);
|
||||
|
||||
if (pinError) {
|
||||
console.error('Pin post error:', pinError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to pin post' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ success: true, message: 'Post pinned' }), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// ========================================================================
|
||||
// ACTION: UNPIN
|
||||
// ========================================================================
|
||||
if (action === 'unpin') {
|
||||
const { error: unpinError } = await serviceClient
|
||||
.from('posts')
|
||||
.update({ pinned_at: null })
|
||||
.eq('id', post_id);
|
||||
|
||||
if (unpinError) {
|
||||
console.error('Unpin post error:', unpinError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to unpin post' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ success: true, message: 'Post unpinned' }), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// ========================================================================
|
||||
// ACTION: UPDATE PRIVACY
|
||||
// ========================================================================
|
||||
if (action === 'update_privacy') {
|
||||
const { error: privacyError } = await serviceClient
|
||||
.from('posts')
|
||||
.update({ visibility })
|
||||
.eq('id', post_id);
|
||||
|
||||
if (privacyError) {
|
||||
console.error('Privacy update error:', privacyError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to update post privacy' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ success: true, message: 'Post privacy updated' }), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// ========================================================================
|
||||
// ACTION: DELETE (Hard Delete)
|
||||
// ========================================================================
|
||||
if (action === 'delete') {
|
||||
const { error: deleteError } = await serviceClient
|
||||
.from('posts')
|
||||
.delete()
|
||||
.eq('id', post_id);
|
||||
|
||||
if (deleteError) {
|
||||
console.error('Delete error:', deleteError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to delete post' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ success: true, message: 'Post deleted successfully' }), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// ========================================================================
|
||||
// ACTION: EDIT
|
||||
// ========================================================================
|
||||
if (action === 'edit') {
|
||||
if (!content || content.trim().length === 0) {
|
||||
return new Response(JSON.stringify({ error: 'Content is required for edits' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// 1. Time Check: 2-minute window
|
||||
const createdAt = new Date(post.created_at);
|
||||
const now = new Date();
|
||||
const twoMinutesAgo = new Date(now.getTime() - 2 * 60 * 1000);
|
||||
|
||||
if (createdAt < twoMinutesAgo) {
|
||||
return new Response(JSON.stringify({ error: 'Edit window expired. Posts can only be edited within 2 minutes of creation.' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// 2. Check if already edited (one edit only)
|
||||
if (post.is_edited) {
|
||||
return new Response(JSON.stringify({ error: 'Post has already been edited' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// 3. Moderation Check: Run new content through tone check
|
||||
const moderation = await checkTone(content);
|
||||
|
||||
if (moderation.tone === 'hate' || moderation.tone === 'hostile') {
|
||||
return new Response(JSON.stringify({
|
||||
error: 'Edit rejected: Content does not meet community guidelines.',
|
||||
details: moderation.reason
|
||||
}), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// 4. Archive: Save current content to post_versions
|
||||
const { error: archiveError } = await serviceClient
|
||||
.from('post_versions')
|
||||
.insert({
|
||||
post_id: post_id,
|
||||
content: post.body,
|
||||
tone_label: post.tone_label,
|
||||
cis_score: post.cis_score,
|
||||
});
|
||||
|
||||
if (archiveError) {
|
||||
console.error('Archive error:', archiveError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to save edit history' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
// 5. Update: Apply new content
|
||||
const { error: updateError } = await serviceClient
|
||||
.from('posts')
|
||||
.update({
|
||||
body: content,
|
||||
tone_label: moderation.tone,
|
||||
cis_score: moderation.cis,
|
||||
is_edited: true,
|
||||
edited_at: now.toISOString(),
|
||||
})
|
||||
.eq('id', post_id);
|
||||
|
||||
if (updateError) {
|
||||
console.error('Update error:', updateError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to update post' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({
|
||||
success: true,
|
||||
message: 'Post updated successfully',
|
||||
moderation: { tone: moderation.tone, cis: moderation.cis }
|
||||
}), { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ error: 'Invalid action' }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
|
||||
} catch (error) {
|
||||
console.error('Unexpected error:', error);
|
||||
return new Response(JSON.stringify({ error: 'Internal server error' }), { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } });
|
||||
}
|
||||
});
|
||||
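The 2-minute edit window used by the `edit` action can be exercised in isolation; a minimal sketch of that check (the helper name `withinEditWindow` is illustrative, not part of the function above):

```typescript
// Returns true when a post created at `createdAtIso` is still inside the
// 2-minute edit window, mirroring the time check in the 'edit' action.
function withinEditWindow(createdAtIso: string, now: Date = new Date()): boolean {
  const createdAt = new Date(createdAtIso);
  const twoMinutesAgo = new Date(now.getTime() - 2 * 60 * 1000);
  return createdAt >= twoMinutesAgo;
}

const now = new Date('2024-01-01T00:10:00Z');
console.log(withinEditWindow('2024-01-01T00:09:00Z', now)); // true (1 minute old)
console.log(withinEditWindow('2024-01-01T00:05:00Z', now)); // false (past the window)
```

Note the boundary: a post created exactly two minutes ago still passes, because the handler rejects only when `createdAt < twoMinutesAgo`.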
1
_legacy/supabase/functions/notifications/config.toml
Normal file
@@ -0,0 +1 @@
verify_jwt = false
244
_legacy/supabase/functions/notifications/index.ts
Normal file
@@ -0,0 +1,244 @@
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2.39.3';

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

interface NotificationRow {
  id: string;
  user_id: string;
  type: 'follow' | 'appreciate' | 'comment' | 'chain' | 'mention' | 'follow_request' | 'new_follower' | 'request_accepted';
  actor_id: string;
  post_id: string | null;
  comment_id: string | null;
  metadata: Record<string, unknown>;
  is_read: boolean;
  created_at: string;
}

interface NotificationResponse extends NotificationRow {
  actor: {
    id: string;
    handle: string;
    display_name: string;
    avatar_url: string | null;
  };
  post?: {
    id: string;
    body: string;
  } | null;
}

Deno.serve(async (req) => {
  // Handle CORS preflight
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  try {
    const supabaseUrl = Deno.env.get('SUPABASE_URL') ?? '';
    const anonKey = Deno.env.get('SUPABASE_ANON_KEY') ?? '';
    const serviceRoleKey = Deno.env.get('SUPABASE_SERVICE_ROLE_KEY') ?? '';

    const supabaseClient = createClient(supabaseUrl, anonKey, {
      global: {
        headers: {
          Authorization: req.headers.get('Authorization') ?? '',
          apikey: anonKey,
        },
      },
    });

    const adminClient = createClient(supabaseUrl, serviceRoleKey);

    // Get authenticated user (explicit token from header)
    const authHeader =
      req.headers.get('Authorization') ?? req.headers.get('authorization') ?? '';
    const token = authHeader.startsWith('Bearer ')
      ? authHeader.slice('Bearer '.length)
      : authHeader;
    const {
      data: { user },
      error: authError,
    } = await supabaseClient.auth.getUser(token || undefined);

    if (authError || !user) {
      console.error('Auth error:', authError?.message ?? 'No user found');
      console.error('Auth header present:', !!authHeader, 'Length:', authHeader?.length ?? 0);
      return new Response(JSON.stringify({
        error: 'Unauthorized',
        code: 401,
        message: authError?.message ?? 'Invalid JWT'
      }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const url = new URL(req.url);
    const limit = parseInt(url.searchParams.get('limit') || '20');
    const offset = parseInt(url.searchParams.get('offset') || '0');
    const unreadOnly = url.searchParams.get('unread_only') === 'true';
    const includeArchived = url.searchParams.get('include_archived') === 'true';

    // Handle different HTTP methods
    if (req.method === 'GET') {
      // Build query
      let query = adminClient
        .from('notifications')
        .select(`
          id,
          user_id,
          type,
          actor_id,
          post_id,
          comment_id,
          metadata,
          is_read,
          archived_at,
          created_at,
          actor:actor_id (
            id,
            handle,
            display_name,
            avatar_url
          ),
          post:post_id (
            id,
            body
          )
        `)
        .eq('user_id', user.id)
        .order('created_at', { ascending: false })
        .range(offset, offset + limit - 1);

      if (!includeArchived) {
        query = query.is('archived_at', null);
      }

      if (unreadOnly) {
        query = query.eq('is_read', false);
      }

      const { data: notifications, error } = await query;

      if (error) {
        return new Response(JSON.stringify({ error: error.message }), {
          status: 400,
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      return new Response(JSON.stringify({ notifications }), {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    if (req.method === 'PATCH' || req.method === 'POST') {
      // Mark notifications as read
      const body = await req.json();
      const { notification_ids, mark_all_read, archive_ids, archive_all } = body;
      const archiveAt = new Date().toISOString();
      let didAction = false;

      if (archive_all) {
        didAction = true;
        const { error } = await adminClient
          .from('notifications')
          .update({ archived_at: archiveAt, is_read: true })
          .eq('user_id', user.id)
          .is('archived_at', null);

        if (error) {
          return new Response(JSON.stringify({ error: error.message }), {
            status: 400,
            headers: { ...corsHeaders, 'Content-Type': 'application/json' },
          });
        }
      }

      if (archive_ids && Array.isArray(archive_ids)) {
        didAction = true;
        const { error } = await adminClient
          .from('notifications')
          .update({ archived_at: archiveAt, is_read: true })
          .in('id', archive_ids)
          .eq('user_id', user.id);

        if (error) {
          return new Response(JSON.stringify({ error: error.message }), {
            status: 400,
            headers: { ...corsHeaders, 'Content-Type': 'application/json' },
          });
        }
      }

      if (mark_all_read) {
        didAction = true;
        // Mark all notifications as read
        const { error } = await adminClient
          .from('notifications')
          .update({ is_read: true })
          .eq('user_id', user.id)
          .eq('is_read', false)
          .is('archived_at', null);

        if (error) {
          return new Response(JSON.stringify({ error: error.message }), {
            status: 400,
            headers: { ...corsHeaders, 'Content-Type': 'application/json' },
          });
        }

        return new Response(JSON.stringify({ success: true }), {
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      if (notification_ids && Array.isArray(notification_ids)) {
        didAction = true;
        // Mark specific notifications as read
        const { error } = await adminClient
          .from('notifications')
          .update({ is_read: true })
          .in('id', notification_ids)
          .eq('user_id', user.id)
          .is('archived_at', null);

        if (error) {
          return new Response(JSON.stringify({ error: error.message }), {
            status: 400,
            headers: { ...corsHeaders, 'Content-Type': 'application/json' },
          });
        }

        return new Response(JSON.stringify({ success: true }), {
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      if (!didAction) {
        return new Response(JSON.stringify({ error: 'Invalid request body' }), {
          status: 400,
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      return new Response(JSON.stringify({ success: true }), {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    // Method not allowed
    return new Response(JSON.stringify({ error: 'Method not allowed' }), {
      status: 405,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: error instanceof Error ? error.message : 'Internal server error' }), {
      status: 500,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }
});
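The `limit`/`offset` query parameters above map onto supabase-js's inclusive `.range(from, to)`; a minimal sketch of that mapping (the helper name `toRange` is illustrative):

```typescript
// Converts limit/offset paging into the inclusive [from, to] row range
// expected by supabase-js .range(), as used in the GET branch above.
function toRange(limit: number, offset: number): [number, number] {
  return [offset, offset + limit - 1];
}

console.log(toRange(20, 0));  // first page: rows 0..19
console.log(toRange(20, 40)); // third page: rows 40..59
```

Because `.range()` is inclusive on both ends, the `- 1` is required; passing `offset + limit` would fetch one extra row per page.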
1
_legacy/supabase/functions/profile-posts/config.toml
Normal file
@@ -0,0 +1 @@
verify_jwt = false
193
_legacy/supabase/functions/profile-posts/index.ts
Normal file
@@ -0,0 +1,193 @@
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createSupabaseClient, createServiceClient } from "../_shared/supabase-client.ts";
import { trySignR2Url } from "../_shared/r2_signer.ts";

const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "GET, OPTIONS",
  "Access-Control-Allow-Headers": "authorization, x-client-info, apikey, content-type",
};

const postSelect = `
  id,
  body,
  author_id,
  category_id,
  tags,
  body_format,
  background_id,
  tone_label,
  cis_score,
  status,
  created_at,
  edited_at,
  expires_at,
  is_edited,
  allow_chain,
  chain_parent_id,
  image_url,
  video_url,
  thumbnail_url,
  duration_ms,
  type,
  visibility,
  pinned_at,
  chain_parent:posts (
    id,
    body,
    created_at,
    author:profiles!posts_author_id_fkey (
      id,
      handle,
      display_name
    )
  ),
  metrics:post_metrics!post_metrics_post_id_fkey (
    like_count,
    save_count,
    view_count
  ),
  author:profiles!posts_author_id_fkey (
    id,
    handle,
    display_name,
    trust_state (
      user_id,
      harmony_score,
      tier,
      posts_today
    )
  )
`;

serve(async (req: Request) => {
  if (req.method === "OPTIONS") {
    return new Response(null, { headers: corsHeaders });
  }

  try {
    const authHeader = req.headers.get("Authorization");
    if (!authHeader) {
      return new Response(JSON.stringify({ error: "Missing authorization header" }), {
        status: 401,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: "Unauthorized" }), {
        status: 401,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const url = new URL(req.url);
    const authorId = url.searchParams.get("author_id");
    if (!authorId) {
      return new Response(JSON.stringify({ error: "Missing author_id" }), {
        status: 400,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const limit = Math.min(parseInt(url.searchParams.get("limit") || "20"), 100);
    const offset = parseInt(url.searchParams.get("offset") || "0");

    // Use service client to bypass RLS issues
    const serviceClient = createServiceClient();

    // Check visibility rules manually:
    // - User can always see their own posts
    // - Public posts are visible to everyone
    // - Followers-only posts require an accepted follow
    // - Private posts are visible only to the author
    const isOwnProfile = authorId === user.id;

    // Get author's profile to check privacy settings
    const { data: authorProfile } = await serviceClient
      .from("profiles")
      .select("is_private, is_official")
      .eq("id", authorId)
      .single();

    // Check if viewer follows the author (for followers-only posts)
    let hasAcceptedFollow = false;
    if (!isOwnProfile) {
      const { data: followRow } = await serviceClient
        .from("follows")
        .select("status")
        .eq("follower_id", user.id)
        .eq("following_id", authorId)
        .eq("status", "accepted")
        .maybeSingle();
      hasAcceptedFollow = !!followRow;
    }

    // Build query with appropriate visibility filter
    // Note: posts use 'active' status for published posts
    let query = serviceClient
      .from("posts")
      .select(postSelect)
      .eq("author_id", authorId)
      .eq("status", "active");

    // Apply visibility filters based on relationship
    if (isOwnProfile) {
      // User can see all their own posts (no visibility filter)
    } else if (hasAcceptedFollow) {
      // Follower can see public and followers-only posts
      query = query.in("visibility", ["public", "followers"]);
    } else if (authorProfile?.is_official || authorProfile?.is_private === false) {
      // Public/official profiles - only public posts
      query = query.eq("visibility", "public");
    } else {
      // Private profile without follow - only public posts
      query = query.eq("visibility", "public");
    }

    const { data: posts, error } = await query
      .order("pinned_at", { ascending: false, nullsFirst: false })
      .order("created_at", { ascending: false })
      .range(offset, offset + limit - 1);

    if (error) {
      console.error("Profile posts fetch error:", error);
      return new Response(JSON.stringify({ error: "Failed to fetch posts" }), {
        status: 500,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const signedPosts = await Promise.all(
      (posts || []).map(async (post: any) => {
        const imageUrl = post.image_url ? await trySignR2Url(post.image_url) : null;
        const thumbUrl = post.thumbnail_url ? await trySignR2Url(post.thumbnail_url) : null;
        return {
          ...post,
          image_url: imageUrl,
          thumbnail_url: thumbUrl,
        };
      })
    );

    return new Response(
      JSON.stringify({
        posts: signedPosts,
        pagination: { limit, offset, returned: signedPosts.length },
      }),
      { status: 200, headers: { ...corsHeaders, "Content-Type": "application/json" } }
    );
  } catch (error) {
    console.error("Unexpected profile-posts error:", error);
    return new Response(JSON.stringify({ error: "Internal server error" }), {
      status: 500,
      headers: { ...corsHeaders, "Content-Type": "application/json" },
    });
  }
});
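The visibility branching in profile-posts reduces to a small pure function over the viewer relationship; a sketch of the same rules (the `visibleTiers` helper and its types are illustrative, not part of the function above):

```typescript
type Visibility = 'public' | 'followers' | 'private';

// Mirrors the profile-posts visibility rules: owners see every tier,
// accepted followers see public + followers-only, everyone else sees public only.
function visibleTiers(isOwnProfile: boolean, hasAcceptedFollow: boolean): Visibility[] {
  if (isOwnProfile) return ['public', 'followers', 'private'];
  if (hasAcceptedFollow) return ['public', 'followers'];
  return ['public'];
}

console.log(visibleTiers(true, false));  // owner: all three tiers
console.log(visibleTiers(false, true));  // accepted follower: public + followers
console.log(visibleTiers(false, false)); // stranger: public only
```

Keeping the rule table in one place like this makes it easier to keep the feed, profile, and thread endpoints consistent as visibility options grow.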
1
_legacy/supabase/functions/profile/config.toml
Normal file
@@ -0,0 +1 @@
verify_jwt = false
441
_legacy/supabase/functions/profile/index.ts
Normal file
@@ -0,0 +1,441 @@
/**
 * GET /profile/:handle - Get user profile by handle
 * GET /profile/me - Get own profile
 * PATCH /profile - Update own profile
 *
 * Design intent:
 * - Profiles are public (unless blocked)
 * - Shows harmony tier (not score)
 * - Minimal public metrics
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient } from '../_shared/supabase-client.ts';
import { ValidationError } from '../_shared/validation.ts';
import { trySignR2Url } from '../_shared/r2_signer.ts';

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || '*';
const corsHeaders = {
  'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
  'Access-Control-Allow-Methods': 'GET, PATCH',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: corsHeaders,
    });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const url = new URL(req.url);
    const handle = url.searchParams.get('handle');

    // GET /profile/me or /profile?handle=username
    if (req.method === 'GET') {
      let profileQuery;
      let isOwnProfile = false;

      if (handle) {
        // Get profile by handle
        profileQuery = supabase
          .from('profiles')
          .select(
            `
            id,
            handle,
            display_name,
            bio,
            location,
            website,
            interests,
            avatar_url,
            cover_url,
            origin_country,
            is_private,
            is_official,
            created_at,
            trust_state (
              tier
            )
          `
          )
          .eq('handle', handle)
          .single();
      } else {
        // Get own profile
        isOwnProfile = true;
        profileQuery = supabase
          .from('profiles')
          .select(
            `
            id,
            handle,
            display_name,
            bio,
            location,
            website,
            interests,
            avatar_url,
            cover_url,
            origin_country,
            is_private,
            is_official,
            created_at,
            updated_at,
            trust_state (
              harmony_score,
              tier,
              posts_today
            )
          `
          )
          .eq('id', user.id)
          .single();
      }

      const { data: profile, error: profileError } = await profileQuery;

      if (profileError || !profile) {
        return new Response(JSON.stringify({ error: 'Profile not found' }), {
          status: 404,
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        });
      }

      // Check if viewing own profile
      if (profile.id === user.id) {
        isOwnProfile = true;
      }

      // Get privacy settings for this profile
      const { data: privacySettings } = await supabase
        .from('profile_privacy_settings')
        .select('*')
        .eq('user_id', profile.id)
        .maybeSingle();

      const profileVisibility = privacySettings?.profile_visibility || 'public';

      // Apply privacy filtering based on viewer relationship
      if (!isOwnProfile && profileVisibility !== 'public') {
        // Check if viewer is following the profile
        const { data: followData } = await supabase
          .from('follows')
          .select('status')
          .eq('follower_id', user.id)
          .eq('following_id', profile.id)
          .maybeSingle();

        const followStatus = followData?.status as string | null;
        const isFollowing = followStatus === 'accepted';
        let isFollowedBy = false;
        if (user.id !== profile.id) {
          const { data: reverseFollow } = await supabase
            .from('follows')
            .select('status')
            .eq('follower_id', profile.id)
            .eq('following_id', user.id)
            .maybeSingle();
          isFollowedBy = reverseFollow?.status === 'accepted';
        }
        const isFriend = isFollowing && isFollowedBy;

        // Check privacy visibility
        if (profile.is_private || profileVisibility === 'private') {
          // Private profiles show minimal info to non-followers
          if (!isFollowing) {
            return new Response(
              JSON.stringify({
                profile: {
                  id: profile.id,
                  handle: profile.handle ?? 'unknown',
                  display_name: profile.display_name ?? 'Anonymous',
                  avatar_url: profile.avatar_url,
                  created_at: profile.created_at,
                  trust_state: profile.trust_state,
                },
                stats: {
                  posts: 0,
                  followers: 0,
                  following: 0,
                },
                is_following: false,
                is_followed_by: isFollowedBy,
                is_friend: isFriend,
                follow_status: followStatus,
                is_private: true,
              }),
              {
                status: 200,
                headers: { ...corsHeaders, 'Content-Type': 'application/json' },
              }
            );
          }
        } else if (profileVisibility === 'followers') {
          // Followers-only profiles hide details from non-followers
          if (!isFollowing) {
            profile.bio = null;
            profile.location = null;
            profile.website = null;
            profile.interests = null;
          }
        }
      }

      // Coalesce null values for older clients
      const safeProfile = {
        ...profile,
        handle: profile.handle ?? 'unknown',
        display_name: profile.display_name ?? 'Anonymous',
      };

      if (safeProfile.avatar_url) {
        safeProfile.avatar_url = await trySignR2Url(safeProfile.avatar_url);
      }
      if (safeProfile.cover_url) {
        safeProfile.cover_url = await trySignR2Url(safeProfile.cover_url);
      }

      // Get post count
      const { count: postCount } = await supabase
        .from('posts')
        .select('*', { count: 'exact', head: true })
        .eq('author_id', safeProfile.id);

      // Get follower/following counts
      const { count: followerCount } = await supabase
        .from('follows')
        .select('*', { count: 'exact', head: true })
        .eq('following_id', safeProfile.id)
        .eq('status', 'accepted');

      const { count: followingCount } = await supabase
        .from('follows')
        .select('*', { count: 'exact', head: true })
        .eq('follower_id', safeProfile.id)
        .eq('status', 'accepted');

      // Check if current user is following this profile
      let isFollowing = false;
      let isFollowedBy = false;
      let isFriend = false;
      let followStatus: string | null = null;
      if (user && user.id !== safeProfile.id) {
        const { data: followData } = await supabase
          .from('follows')
          .select('status')
          .eq('follower_id', user.id)
          .eq('following_id', safeProfile.id)
          .maybeSingle();

        followStatus = followData?.status as string | null;
        isFollowing = followStatus === 'accepted';

        const { data: reverseFollow } = await supabase
          .from('follows')
          .select('status')
          .eq('follower_id', safeProfile.id)
          .eq('following_id', user.id)
          .maybeSingle();
        isFollowedBy = reverseFollow?.status === 'accepted';
        isFriend = isFollowing && isFollowedBy;
      }

      const isPrivateForViewer = (profile.is_private ?? false) && !isFollowing && !isOwnProfile;

      return new Response(
        JSON.stringify({
          profile: safeProfile,
          stats: {
            posts: postCount || 0,
            followers: followerCount || 0,
            following: followingCount || 0,
          },
          is_following: isFollowing,
          is_followed_by: isFollowedBy,
          is_friend: isFriend,
          follow_status: followStatus,
          is_private: isPrivateForViewer,
        }),
        {
          status: 200,
          headers: { ...corsHeaders, 'Content-Type': 'application/json' },
        }
      );
|
||||
}
|
||||
|
||||
// PATCH /profile - Update own profile
|
||||
if (req.method === 'PATCH') {
|
||||
const { handle, display_name, bio, location, website, interests, avatar_url, cover_url } = await req.json();
|
||||
|
||||
const updates: any = {};
|
||||
|
||||
// Handle username changes with 30-day limit
|
||||
if (handle !== undefined) {
|
||||
if (handle.trim().length < 3 || handle.length > 20) {
|
||||
throw new ValidationError('Username must be 3-20 characters', 'handle');
|
||||
}
|
||||
if (!/^[a-zA-Z0-9_]+$/.test(handle)) {
|
||||
throw new ValidationError('Username can only contain letters, numbers, and underscores', 'handle');
|
||||
}
|
||||
|
||||
// Check if handle is already taken
|
||||
const { data: existingProfile } = await supabase
|
||||
.from('profiles')
|
||||
.select('id')
|
||||
.eq('handle', handle)
|
||||
.neq('id', user.id)
|
||||
.maybeSingle();
|
||||
|
||||
if (existingProfile) {
|
||||
throw new ValidationError('Username is already taken', 'handle');
|
||||
}
|
||||
|
||||
// Check 30-day limit
|
||||
const { data: canChange, error: canChangeError } = await supabase
|
||||
.rpc('can_change_handle', { p_user_id: user.id });
|
||||
|
||||
if (canChangeError || !canChange) {
|
||||
throw new ValidationError('You can only change your username once every 30 days', 'handle');
|
||||
}
|
||||
|
||||
updates.handle = handle;
|
||||
}
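The handle rules above can be pre-checked client-side before hitting the API; a minimal sketch (the function name `handleError` is hypothetical, and uniqueness plus the 30-day limit remain server-side concerns):

```typescript
// Hypothetical standalone version of the handle format rules above.
// Returns the same error message the endpoint would raise, or null if
// the format checks pass (uniqueness is still validated server-side).
function handleError(handle: string): string | null {
  if (handle.trim().length < 3 || handle.length > 20) {
    return 'Username must be 3-20 characters';
  }
  if (!/^[a-zA-Z0-9_]+$/.test(handle)) {
    return 'Username can only contain letters, numbers, and underscores';
  }
  return null;
}

console.log(handleError('ab'));        // "Username must be 3-20 characters"
console.log(handleError('good_name')); // null
console.log(handleError('bad name'));  // rejected: contains a space
```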
|
||||
|
||||
if (display_name !== undefined) {
|
||||
if (display_name.trim().length === 0 || display_name.length > 50) {
|
||||
throw new ValidationError('Display name must be 1-50 characters', 'display_name');
|
||||
}
|
||||
updates.display_name = display_name;
|
||||
}
|
||||
|
||||
if (bio !== undefined) {
|
||||
if (bio && bio.length > 300) {
|
||||
throw new ValidationError('Bio must be 300 characters or less', 'bio');
|
||||
}
|
||||
updates.bio = bio || null;
|
||||
}
|
||||
|
||||
if (location !== undefined) {
|
||||
if (location && location.length > 100) {
|
||||
throw new ValidationError('Location must be 100 characters or less', 'location');
|
||||
}
|
||||
updates.location = location || null;
|
||||
}
|
||||
|
||||
if (website !== undefined) {
|
||||
if (website) {
|
||||
if (website.length > 200) {
|
||||
throw new ValidationError('Website must be 200 characters or less', 'website');
|
||||
}
|
||||
// Validate URL format and scheme
|
||||
try {
|
||||
const url = new URL(website.startsWith('http') ? website : `https://${website}`);
|
||||
if (!['http:', 'https:'].includes(url.protocol)) {
|
||||
throw new ValidationError('Website must be a valid HTTP or HTTPS URL', 'website');
|
||||
}
|
||||
} catch (error) {
|
||||
throw new ValidationError('Website must be a valid URL', 'website');
|
||||
}
|
||||
}
|
||||
updates.website = website || null;
|
||||
}
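The URL handling above can be factored into a pure helper; a sketch assuming the same `URL`-based checks (`normalizeWebsite` is a hypothetical name, not part of the endpoint):

```typescript
// Mirrors the normalization above: bare domains get an https:// prefix,
// then the URL constructor plus a scheme allowlist filters the rest.
// Returns the canonical URL string, or null if validation fails.
function normalizeWebsite(website: string): string | null {
  try {
    const url = new URL(website.startsWith('http') ? website : `https://${website}`);
    if (!['http:', 'https:'].includes(url.protocol)) return null;
    return url.toString();
  } catch {
    return null;
  }
}

console.log(normalizeWebsite('example.com'));        // "https://example.com/"
console.log(normalizeWebsite('httpx://example.com')); // null (scheme not http/https)
```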
|
||||
|
||||
if (interests !== undefined) {
|
||||
if (Array.isArray(interests)) {
|
||||
updates.interests = interests;
|
||||
} else {
|
||||
throw new ValidationError('Interests must be an array', 'interests');
|
||||
}
|
||||
}
|
||||
|
||||
if (avatar_url !== undefined) {
|
||||
updates.avatar_url = avatar_url || null;
|
||||
}
|
||||
|
||||
if (cover_url !== undefined) {
|
||||
updates.cover_url = cover_url || null;
|
||||
}
|
||||
|
||||
if (Object.keys(updates).length === 0) {
|
||||
return new Response(JSON.stringify({ error: 'No fields to update' }), {
|
||||
status: 400,
|
||||
headers: { ...corsHeaders, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
updates.updated_at = new Date().toISOString();
|
||||
|
||||
const { data: profile, error: updateError } = await supabase
|
||||
.from('profiles')
|
||||
.update(updates)
|
||||
.eq('id', user.id)
|
||||
.select()
|
||||
.single();
|
||||
|
||||
if (updateError) {
|
||||
console.error('Error updating profile:', updateError);
|
||||
return new Response(JSON.stringify({
|
||||
error: 'Failed to update profile',
|
||||
details: updateError.message,
|
||||
code: updateError.code
|
||||
}), {
|
||||
status: 500,
|
||||
headers: { ...corsHeaders, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
profile,
|
||||
message: 'Profile updated',
|
||||
}),
|
||||
{
|
||||
status: 200,
|
||||
headers: { ...corsHeaders, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
return new Response(JSON.stringify({ error: 'Method not allowed' }), {
|
||||
status: 405,
|
||||
headers: { ...corsHeaders, 'Content-Type': 'application/json' },
|
||||
});
|
||||
} catch (error) {
|
||||
if (error instanceof ValidationError) {
|
||||
return new Response(
|
||||
JSON.stringify({ error: 'Validation error', message: error.message }),
|
||||
{ status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
|
||||
);
|
||||
}
|
||||
|
||||
console.error('Unexpected error:', error);
|
||||
return new Response(JSON.stringify({ error: 'Internal server error' }), {
|
||||
status: 500,
|
||||
headers: { ...corsHeaders, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
});
|
||||
1
_legacy/supabase/functions/publish-comment/config.toml
Normal file

|
|
@@ -0,0 +1 @@
|
|||
verify_jwt = false
|
||||
206
_legacy/supabase/functions/publish-comment/index.ts
Normal file
|
|
@@ -0,0 +1,206 @@
|
|||
/**
|
||||
* POST /publish-comment
|
||||
*
|
||||
* Design intent:
|
||||
* - Conversation requires consent (mutual follow).
|
||||
* - Sharp speech is rejected quietly.
|
||||
* - Comments never affect post reach.
|
||||
*
|
||||
* Flow:
|
||||
* 1. Validate auth and inputs
|
||||
* 2. Verify mutual follow with post author
|
||||
* 3. Reject profanity or hostility
|
||||
* 4. Store comment with tone metadata
|
||||
* 5. Log audit event
|
||||
*/
|
||||
|
||||
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
|
||||
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
|
||||
import { analyzeTone, getRewriteSuggestion } from '../_shared/tone-detection.ts';
|
||||
import { validateCommentBody, validateUUID, ValidationError } from '../_shared/validation.ts';
|
||||
|
||||
interface PublishCommentRequest {
|
||||
post_id: string;
|
||||
body: string;
|
||||
}
|
||||
|
||||
serve(async (req) => {
|
||||
if (req.method === 'OPTIONS') {
|
||||
return new Response(null, {
|
||||
headers: {
|
||||
'Access-Control-Allow-Origin': '*',
|
||||
'Access-Control-Allow-Methods': 'POST',
|
||||
'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
try {
|
||||
// 1. Validate auth
|
||||
const authHeader = req.headers.get('Authorization');
|
||||
if (!authHeader) {
|
||||
return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
|
||||
status: 401,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
const supabase = createSupabaseClient(authHeader);
|
||||
const {
|
||||
data: { user },
|
||||
error: authError,
|
||||
} = await supabase.auth.getUser();
|
||||
|
||||
if (authError || !user) {
|
||||
return new Response(JSON.stringify({ error: 'Unauthorized' }), {
|
||||
status: 401,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
// 2. Parse request
|
||||
const { post_id, body } = (await req.json()) as PublishCommentRequest;
|
||||
|
||||
// 3. Validate inputs
|
||||
validateUUID(post_id, 'post_id');
|
||||
validateCommentBody(body);
|
||||
|
||||
// 4. Get post author
|
||||
const { data: post, error: postError } = await supabase
|
||||
.from('posts')
|
||||
.select('author_id')
|
||||
.eq('id', post_id)
|
||||
.single();
|
||||
|
||||
if (postError || !post) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Post not found',
|
||||
message: 'This post does not exist or you cannot see it.',
|
||||
}),
|
||||
{
|
||||
status: 404,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 5. Verify mutual follow
|
||||
const serviceClient = createServiceClient();
|
||||
const { data: isMutual, error: followError } = await serviceClient.rpc('is_mutual_follow', {
|
||||
user_a: user.id,
|
||||
user_b: post.author_id,
|
||||
});
|
||||
|
||||
if (followError) {
|
||||
console.error('Error checking mutual follow:', followError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to verify follow relationship' }), {
|
||||
status: 500,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
if (!isMutual) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Mutual follow required',
|
||||
message: 'You can only comment on posts from people you mutually follow.',
|
||||
}),
|
||||
{
|
||||
status: 403,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
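The consent rule enforced by the `is_mutual_follow` RPC can be illustrated with an in-memory sketch (the `Follow` shape here is hypothetical, not the actual table schema):

```typescript
// The consent rule above in miniature: commenting requires an
// accepted follow in BOTH directions between the two users.
type Follow = { follower: string; following: string; status: string };

function isMutualFollow(follows: Follow[], a: string, b: string): boolean {
  const accepted = (x: string, y: string) =>
    follows.some((f) => f.follower === x && f.following === y && f.status === 'accepted');
  return accepted(a, b) && accepted(b, a);
}

const follows: Follow[] = [
  { follower: 'alice', following: 'bob', status: 'accepted' },
  { follower: 'bob', following: 'alice', status: 'accepted' },
  { follower: 'carol', following: 'bob', status: 'pending' },
];
console.log(isMutualFollow(follows, 'alice', 'bob')); // true
console.log(isMutualFollow(follows, 'carol', 'bob')); // false (pending only)
```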
|
||||
|
||||
// 6. Analyze tone
|
||||
const analysis = analyzeTone(body);
|
||||
|
||||
// 7. Reject hostile or profane content
|
||||
if (analysis.shouldReject) {
|
||||
await serviceClient.rpc('log_audit_event', {
|
||||
p_actor_id: user.id,
|
||||
p_event_type: 'comment_rejected',
|
||||
p_payload: {
|
||||
post_id,
|
||||
tone: analysis.tone,
|
||||
cis: analysis.cis,
|
||||
flags: analysis.flags,
|
||||
reason: analysis.rejectReason,
|
||||
},
|
||||
});
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Comment rejected',
|
||||
message: analysis.rejectReason,
|
||||
suggestion: getRewriteSuggestion(analysis),
|
||||
tone: analysis.tone,
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 8. Create comment
|
||||
const { data: comment, error: commentError } = await supabase
|
||||
.from('comments')
|
||||
.insert({
|
||||
post_id,
|
||||
author_id: user.id,
|
||||
body,
|
||||
tone_label: analysis.tone,
|
||||
status: 'active',
|
||||
})
|
||||
.select()
|
||||
.single();
|
||||
|
||||
if (commentError) {
|
||||
console.error('Error creating comment:', commentError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to create comment' }), {
|
||||
status: 500,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
// 9. Log successful comment
|
||||
await serviceClient.rpc('log_audit_event', {
|
||||
p_actor_id: user.id,
|
||||
p_event_type: 'comment_created',
|
||||
p_payload: {
|
||||
comment_id: comment.id,
|
||||
post_id,
|
||||
tone: analysis.tone,
|
||||
cis: analysis.cis,
|
||||
},
|
||||
});
|
||||
|
||||
// 10. Return comment
|
||||
return new Response(JSON.stringify({ comment }), {
|
||||
status: 201,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
} catch (error) {
|
||||
if (error instanceof ValidationError) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Validation error',
|
||||
message: error.message,
|
||||
field: error.field,
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
console.error('Unexpected error:', error);
|
||||
return new Response(JSON.stringify({ error: 'Internal server error' }), {
|
||||
status: 500,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
});
|
||||
1
_legacy/supabase/functions/publish-post/config.toml
Normal file
|
|
@@ -0,0 +1 @@
|
|||
verify_jwt = false
|
||||
629
_legacy/supabase/functions/publish-post/index.ts
Normal file
|
|
@@ -0,0 +1,629 @@
|
|||
/**
|
||||
* POST /publish-post
|
||||
*
|
||||
* Features:
|
||||
* - Hashtag extraction and storage
|
||||
* - AI tone analysis
|
||||
* - Rate limiting via trust state
|
||||
* - Beacon support (location-based alerts)
|
||||
*/
|
||||
|
||||
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
|
||||
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
|
||||
import { validatePostBody, validateUUID, ValidationError } from '../_shared/validation.ts';
|
||||
import { trySignR2Url } from '../_shared/r2_signer.ts';
|
||||
|
||||
interface PublishPostRequest {
|
||||
category_id?: string | null;
|
||||
body: string;
|
||||
body_format?: 'plain' | 'markdown';
|
||||
allow_chain?: boolean;
|
||||
chain_parent_id?: string | null;
|
||||
image_url?: string | null;
|
||||
thumbnail_url?: string | null;
|
||||
ttl_hours?: number | null;
|
||||
user_warned?: boolean;
|
||||
|
||||
// Beacon fields (optional)
|
||||
is_beacon?: boolean;
|
||||
beacon_type?: 'police' | 'checkpoint' | 'taskForce' | 'hazard' | 'safety' | 'community';
|
||||
beacon_lat?: number;
|
||||
beacon_long?: number;
|
||||
}
|
||||
|
||||
interface AnalysisResult {
|
||||
flagged: boolean;
|
||||
category?: 'bigotry' | 'nsfw' | 'violence';
|
||||
flags: string[];
|
||||
rejectReason?: string;
|
||||
}
|
||||
|
||||
const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || '*';
|
||||
const CORS_HEADERS = {
|
||||
'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
|
||||
'Access-Control-Allow-Methods': 'POST',
|
||||
'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
|
||||
};
|
||||
|
||||
type ModerationStatus = 'approved' | 'flagged_bigotry' | 'flagged_nsfw' | 'rejected';
|
||||
|
||||
function getModerationStatus(category?: AnalysisResult['category']): ModerationStatus {
|
||||
switch (category) {
|
||||
case 'bigotry':
|
||||
return 'flagged_bigotry';
|
||||
case 'nsfw':
|
||||
return 'flagged_nsfw';
|
||||
case 'violence':
|
||||
return 'rejected';
|
||||
default:
|
||||
return 'approved';
|
||||
}
|
||||
}
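The category-to-status mapping can be exercised standalone (a copy of the switch above, for illustration):

```typescript
// Standalone copy of the moderation mapping above: bigotry and nsfw
// are flagged for review, violence is rejected outright, and anything
// uncategorized passes as approved.
type ModerationStatus = 'approved' | 'flagged_bigotry' | 'flagged_nsfw' | 'rejected';

function getModerationStatus(category?: 'bigotry' | 'nsfw' | 'violence'): ModerationStatus {
  switch (category) {
    case 'bigotry':
      return 'flagged_bigotry';
    case 'nsfw':
      return 'flagged_nsfw';
    case 'violence':
      return 'rejected'; // violence is never published
    default:
      return 'approved';
  }
}

console.log(getModerationStatus('violence')); // "rejected"
console.log(getModerationStatus());           // "approved"
```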
|
||||
|
||||
/**
|
||||
* Extract hashtags from post body using regex
|
||||
* Returns array of lowercase tags without the # prefix
|
||||
*/
|
||||
function extractHashtags(body: string): string[] {
|
||||
const hashtagRegex = /#\w+/g;
|
||||
const matches = body.match(hashtagRegex);
|
||||
if (!matches) return [];
|
||||
|
||||
// Remove # prefix and lowercase all tags
|
||||
return [...new Set(matches.map(tag => tag.substring(1).toLowerCase()))];
|
||||
}
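A standalone copy of the extractor above shows the dedup and lowercasing behavior:

```typescript
// Standalone copy of the hashtag extractor for illustration: matches
// #word tokens, strips the prefix, lowercases, and dedupes via Set.
function extractHashtags(body: string): string[] {
  const matches = body.match(/#\w+/g);
  if (!matches) return [];
  return [...new Set(matches.map((tag) => tag.substring(1).toLowerCase()))];
}

// "#Deno" and "#deno" collapse to one tag; order follows first appearance.
const tags = extractHashtags('Shipping with #Deno, #deno and #Supabase!');
console.log(tags); // ["deno", "supabase"]
```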
|
||||
|
||||
serve(async (req: Request) => {
|
||||
// CORS preflight
|
||||
if (req.method === 'OPTIONS') {
|
||||
return new Response(null, { headers: CORS_HEADERS });
|
||||
}
|
||||
|
||||
try {
|
||||
// 1. Validate auth
|
||||
const authHeader = req.headers.get('Authorization');
|
||||
if (!authHeader) {
|
||||
return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
|
||||
status: 401,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
const supabase = createSupabaseClient(authHeader);
|
||||
const {
|
||||
data: { user },
|
||||
error: authError,
|
||||
} = await supabase.auth.getUser();
|
||||
|
||||
if (authError || !user) {
|
||||
console.error('Auth error details:', {
|
||||
error: authError,
|
||||
errorMessage: authError?.message,
|
||||
errorStatus: authError?.status,
|
||||
user: user,
|
||||
authHeader: authHeader ? 'present' : 'missing',
|
||||
});
|
||||
return new Response(JSON.stringify({
|
||||
error: 'Unauthorized',
|
||||
details: authError?.message,
|
||||
}), {
|
||||
status: 401,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
const { data: profileRows, error: profileError } = await supabase
|
||||
.from('profiles')
|
||||
.select('id')
|
||||
.eq('id', user.id)
|
||||
.limit(1);
|
||||
|
||||
if (profileError) {
|
||||
console.error('Error checking profile:', profileError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to verify profile' }), {
|
||||
status: 500,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
if (!profileRows || profileRows.length === 0) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Profile not found',
|
||||
message: 'Please complete your profile before posting.',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 2. Parse request
|
||||
const {
|
||||
category_id,
|
||||
body,
|
||||
body_format,
|
||||
allow_chain,
|
||||
chain_parent_id,
|
||||
image_url,
|
||||
thumbnail_url,
|
||||
ttl_hours,
|
||||
user_warned,
|
||||
is_beacon,
|
||||
beacon_type,
|
||||
beacon_lat,
|
||||
beacon_long,
|
||||
} = (await req.json()) as PublishPostRequest;
|
||||
const requestedCategoryId = category_id ?? null;
|
||||
|
||||
// 3. Validate inputs
|
||||
// For beacons, category_id is ignored and replaced by "Beacon Alerts" internally
|
||||
validatePostBody(body);
|
||||
if (is_beacon !== true && category_id) {
|
||||
validateUUID(category_id, 'category_id');
|
||||
}
|
||||
if (chain_parent_id) {
|
||||
validateUUID(chain_parent_id, 'chain_parent_id');
|
||||
}
|
||||
|
||||
let ttlHours: number | null | undefined = undefined;
|
||||
if (ttl_hours !== undefined && ttl_hours !== null) {
|
||||
const parsedTtl = Number(ttl_hours);
|
||||
if (!Number.isFinite(parsedTtl) || !Number.isInteger(parsedTtl) || parsedTtl < 0) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Validation error',
|
||||
message: 'ttl_hours must be a non-negative integer',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
ttlHours = parsedTtl;
|
||||
}
|
||||
|
||||
// Validate beacon fields if provided
|
||||
if (is_beacon === true) {
|
||||
const latMissing = beacon_lat === undefined || beacon_lat === null || Number.isNaN(beacon_lat);
|
||||
const longMissing = beacon_long === undefined || beacon_long === null || Number.isNaN(beacon_long);
|
||||
|
||||
if (latMissing || longMissing) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Validation error',
|
||||
message: 'beacon_lat and beacon_long are required for beacon posts',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
const validBeaconTypes = ['police', 'checkpoint', 'taskForce', 'hazard', 'safety', 'community'];
|
||||
if (beacon_type && !validBeaconTypes.includes(beacon_type)) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Validation error',
|
||||
message: 'Invalid beacon_type. Must be: police, checkpoint, taskForce, hazard, safety, or community',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// 4. Check if user can post (rate limiting via trust state)
|
||||
const serviceClient = createServiceClient();
|
||||
const { data: canPostData, error: canPostError } = await serviceClient.rpc('can_post', {
|
||||
p_user_id: user.id,
|
||||
});
|
||||
|
||||
if (canPostError) {
|
||||
console.error('Error checking post eligibility:', canPostError);
|
||||
return new Response(JSON.stringify({ error: 'Failed to check posting eligibility' }), {
|
||||
status: 500,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
});
|
||||
}
|
||||
|
||||
if (!canPostData) {
|
||||
const { data: limitData } = await serviceClient.rpc('get_post_rate_limit', {
|
||||
p_user_id: user.id,
|
||||
});
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Rate limit reached',
|
||||
message: `You have reached your posting limit for today (${limitData} posts).`,
|
||||
suggestion: 'Take a moment. Your influence grows with patience.',
|
||||
}),
|
||||
{
|
||||
status: 429,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 5. Validate chain parent (if any)
|
||||
if (chain_parent_id) {
|
||||
const { data: parentPost, error: parentError } = await supabase
|
||||
.from('posts')
|
||||
.select('id, allow_chain, status')
|
||||
.eq('id', chain_parent_id)
|
||||
.single();
|
||||
|
||||
if (parentError || !parentPost) {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Chain unavailable',
|
||||
message: 'This post is not available for chaining.',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
if (!parentPost.allow_chain || parentPost.status !== 'active') {
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Chain unavailable',
|
||||
message: 'Chaining has been disabled for this post.',
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// 6. Call tone-check function for AI moderation
|
||||
let analysis: AnalysisResult = {
|
||||
flagged: false,
|
||||
category: undefined,
|
||||
flags: [],
|
||||
rejectReason: undefined,
|
||||
};
|
||||
|
||||
try {
|
||||
const supabaseUrl = Deno.env.get('SUPABASE_URL') || '';
|
||||
const supabaseKey = Deno.env.get('SUPABASE_ANON_KEY') || '';
|
||||
|
||||
console.log('Calling tone-check function...');
|
||||
|
||||
const moderationResponse = await fetch(
|
||||
`${supabaseUrl}/functions/v1/tone-check`,
|
||||
{
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': authHeader, // header already contains "Bearer <token>"
|
||||
'Content-Type': 'application/json',
|
||||
'apikey': supabaseKey,
|
||||
},
|
||||
body: JSON.stringify({
|
||||
text: body,
|
||||
imageUrl: image_url || undefined,
|
||||
}),
|
||||
}
|
||||
);
|
||||
|
||||
console.log('tone-check response status:', moderationResponse.status);
|
||||
|
||||
if (moderationResponse.ok) {
|
||||
const data = await moderationResponse.json();
|
||||
console.log('tone-check response:', JSON.stringify(data));
|
||||
|
||||
const flagged = Boolean(data.flagged);
|
||||
const category = data.category as AnalysisResult['category'] | undefined;
|
||||
|
||||
analysis = {
|
||||
flagged,
|
||||
category,
|
||||
flags: data.flags || [],
|
||||
rejectReason: data.reason,
|
||||
};
|
||||
|
||||
console.log('Analysis result:', JSON.stringify(analysis));
|
||||
} else {
|
||||
console.error('tone-check failed:', await moderationResponse.text());
|
||||
}
|
||||
} catch (e) {
|
||||
console.error('Tone check error:', e);
|
||||
// Fail CLOSED: Reject post if moderation is unavailable
|
||||
await serviceClient.rpc('log_audit_event', {
|
||||
p_actor_id: user.id,
|
||||
p_event_type: 'post_rejected_moderation_error',
|
||||
p_payload: {
|
||||
category_id: requestedCategoryId,
|
||||
error: String(e),
|
||||
},
|
||||
});
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Moderation unavailable',
|
||||
message: 'Content moderation is temporarily unavailable. Please try again later.',
|
||||
}),
|
||||
{
|
||||
status: 503,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 7. Reject hostile or hateful content
|
||||
if (analysis.flagged) {
|
||||
const moderationStatus = getModerationStatus(analysis.category);
|
||||
if (user_warned === true) {
|
||||
const { data: profileRow } = await serviceClient
|
||||
.from('profiles')
|
||||
.select('strikes')
|
||||
.eq('id', user.id)
|
||||
.maybeSingle();
|
||||
const currentStrikes = typeof profileRow?.strikes === 'number' ? profileRow!.strikes : 0;
|
||||
|
||||
await serviceClient
|
||||
.from('profiles')
|
||||
.update({ strikes: currentStrikes + 1 })
|
||||
.eq('id', user.id);
|
||||
}
|
||||
|
||||
await serviceClient.rpc('log_audit_event', {
|
||||
p_actor_id: user.id,
|
||||
p_event_type: 'post_rejected',
|
||||
p_payload: {
|
||||
category_id: requestedCategoryId,
|
||||
moderation_category: analysis.category,
|
||||
flags: analysis.flags,
|
||||
reason: analysis.rejectReason,
|
||||
moderation_status: moderationStatus,
|
||||
user_warned: user_warned === true,
|
||||
},
|
||||
});
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: 'Content rejected',
|
||||
message: analysis.rejectReason || 'This content was rejected by moderation.',
|
||||
category: analysis.category,
|
||||
moderation_status: moderationStatus,
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// 8. Moderation passed: publish immediately with default tone metadata
|
||||
const status = 'active';
|
||||
const moderationStatus: ModerationStatus = 'approved';
|
||||
const toneLabel = 'neutral';
|
||||
const cisScore = 0.8;
|
||||
|
||||
// 9. Extract hashtags from body (skip for beacons - they use description as body)
|
||||
const tags = is_beacon === true ? [] : extractHashtags(body);
|
||||
console.log(`Extracted ${tags.length} tags:`, tags);
|
||||
|
||||
// 10. Handle beacon category and data
|
||||
let postCategoryId = requestedCategoryId;
|
||||
let beaconData: any = null;
|
||||
|
||||
if (is_beacon === true) {
|
||||
// Get or create "Beacon Alerts" category for beacons
|
||||
const { data: beaconCategory } = await serviceClient
|
||||
.from('categories')
|
||||
.select('id')
|
||||
.eq('name', 'Beacon Alerts')
|
||||
.single();
|
||||
|
||||
if (beaconCategory) {
|
||||
postCategoryId = beaconCategory.id;
|
||||
} else {
|
||||
// Create the beacon category if it doesn't exist
|
||||
const { data: newCategory } = await serviceClient
|
||||
.from('categories')
|
||||
.insert({ name: 'Beacon Alerts', description: 'Community safety and alert posts' })
|
||||
.select('id')
|
||||
.single();
|
||||
|
||||
if (newCategory) {
|
||||
postCategoryId = newCategory.id;
|
||||
}
|
||||
}
|
||||
|
||||
// Get user's trust score for initial confidence
|
||||
const { data: profile } = await serviceClient
|
||||
.from('profiles')
|
||||
.select('trust_state(harmony_score)')
|
||||
.eq('id', user.id)
|
||||
.single();
|
||||
|
||||
const trustScore = profile?.trust_state?.harmony_score ?? 0.5;
|
||||
const initialConfidence = 0.5 + (trustScore * 0.3);
|
||||
|
||||
// Store beacon data to be included in post
|
||||
beaconData = {
|
||||
is_beacon: true,
|
||||
beacon_type: beacon_type ?? 'community',
|
||||
location: `SRID=4326;POINT(${beacon_long} ${beacon_lat})`,
|
||||
confidence_score: Math.min(1.0, Math.max(0.0, initialConfidence)),
|
||||
is_active_beacon: true,
|
||||
allow_chain: false, // Beacons don't allow chaining
|
||||
};
|
||||
}
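The beacon derivation above reduces to two pure pieces: an EWKT point for PostGIS and a clamped confidence score. A sketch (helper names are hypothetical):

```typescript
// Sketch of the beacon fields derived above. Note that EWKT POINT takes
// longitude first, then latitude.
function beaconLocation(lat: number, long: number): string {
  return `SRID=4326;POINT(${long} ${lat})`;
}

// Confidence starts at 0.5, trust (harmony score) adds up to 0.3,
// and the result is clamped to [0, 1].
function beaconConfidence(harmonyScore: number): number {
  return Math.min(1.0, Math.max(0.0, 0.5 + harmonyScore * 0.3));
}

console.log(beaconLocation(40.7, -74.0)); // "SRID=4326;POINT(-74 40.7)"
console.log(beaconConfidence(1.0));       // 0.8
console.log(beaconConfidence(0.5));       // 0.65
```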
|
||||
|
||||
// 11. Resolve post expiration
|
||||
let expiresAt: string | null = null;
|
||||
if (ttlHours !== undefined) {
|
||||
if (ttlHours > 0) {
|
||||
expiresAt = new Date(Date.now() + ttlHours * 60 * 60 * 1000).toISOString();
|
||||
} else {
|
||||
expiresAt = null;
|
||||
}
|
||||
} else {
|
||||
const { data: settingsRow, error: settingsError } = await supabase
|
||||
.from('user_settings')
|
||||
.select('default_post_ttl')
|
||||
.eq('user_id', user.id)
|
||||
.maybeSingle();
|
||||
|
||||
if (settingsError) {
|
||||
console.error('Error fetching user settings:', settingsError);
|
||||
} else {
|
||||
const defaultTtl = settingsRow?.default_post_ttl;
|
||||
if (typeof defaultTtl === 'number' && defaultTtl > 0) {
|
||||
expiresAt = new Date(Date.now() + defaultTtl * 60 * 60 * 1000).toISOString();
|
||||
}
|
||||
}
|
||||
}
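The expiry resolution above can be expressed as a pure function; a sketch (`resolveExpiresAt` is a hypothetical name, with an injected clock for determinism):

```typescript
// Pure sketch of the expiry resolution above: an explicit ttl_hours wins
// (0 meaning "never expires"); a null/absent ttl falls back to the user's
// default TTL; no default means the post never expires.
function resolveExpiresAt(
  ttlHours: number | null | undefined,
  defaultTtl: number | null | undefined,
  now: number = Date.now(),
): string | null {
  if (ttlHours !== undefined && ttlHours !== null) {
    return ttlHours > 0 ? new Date(now + ttlHours * 3_600_000).toISOString() : null;
  }
  if (typeof defaultTtl === 'number' && defaultTtl > 0) {
    return new Date(now + defaultTtl * 3_600_000).toISOString();
  }
  return null;
}

const t0 = Date.UTC(2024, 0, 1); // fixed clock for a reproducible example
console.log(resolveExpiresAt(24, 72, t0));   // "2024-01-02T00:00:00.000Z"
console.log(resolveExpiresAt(0, 72, t0));    // null (explicit "never expire")
console.log(resolveExpiresAt(null, 72, t0)); // "2024-01-04T00:00:00.000Z"
```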
|
||||
|
||||
// 12. Create post with tags and beacon data
|
||||
let postVisibility = 'public';
|
||||
const { data: privacyRow, error: privacyError } = await supabase
|
||||
.from('profile_privacy_settings')
|
||||
.select('posts_visibility')
|
||||
.eq('user_id', user.id)
|
||||
.maybeSingle();
|
||||
|
||||
if (privacyError) {
|
||||
console.error('Error fetching privacy settings:', privacyError);
|
||||
    } else if (privacyRow?.posts_visibility) {
      postVisibility = privacyRow.posts_visibility;
    }

    const insertData: any = {
      author_id: user.id,
      category_id: postCategoryId ?? null,
      body,
      body_format: body_format ?? 'plain',
      tone_label: toneLabel,
      cis_score: cisScore,
      status,
      moderation_status: moderationStatus,
      allow_chain: beaconData?.allow_chain ?? (allow_chain ?? true),
      chain_parent_id: chain_parent_id ?? null,
      image_url: image_url ?? null,
      thumbnail_url: thumbnail_url ?? null,
      tags: tags,
      expires_at: expiresAt,
      visibility: postVisibility,
    };

    // Add beacon fields if this is a beacon
    if (beaconData) {
      insertData.is_beacon = beaconData.is_beacon;
      insertData.beacon_type = beaconData.beacon_type;
      insertData.location = beaconData.location;
      insertData.confidence_score = beaconData.confidence_score;
      insertData.is_active_beacon = beaconData.is_active_beacon;
    }

    // Use service client for INSERT to bypass RLS issues for private users.
    // The user authentication has already been validated above.
    const { data: post, error: postError } = await serviceClient
      .from('posts')
      .insert(insertData)
      .select()
      .single();

    if (postError) {
      console.error('Error creating post:', JSON.stringify({
        code: postError.code,
        message: postError.message,
        details: postError.details,
        hint: postError.hint,
        user_id: user.id,
      }));
      return new Response(JSON.stringify({ error: 'Failed to create post', details: postError.message }), {
        status: 500,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      });
    }

    // 13. Log successful post
    await serviceClient.rpc('log_audit_event', {
      p_actor_id: user.id,
      p_event_type: 'post_created',
      p_payload: {
        post_id: post.id,
        category_id: postCategoryId,
        tone: toneLabel,
        cis: cisScore,
        chain_parent_id: chain_parent_id ?? null,
        tags_count: tags.length,
        is_beacon: is_beacon ?? false,
      },
    });

    // 14. Return post with metadata - FLATTENED STRUCTURE
    let message = 'Your post is ready.';
    if (toneLabel === 'negative' || toneLabel === 'mixed') {
      message = 'This post may have limited reach based on its tone.';
    }

    // Create a flattened post object that merges post data with location data
    const flattenedPost: any = {
      ...post,
      // Add location fields at the top level for beacons
      ...(beaconData && {
        latitude: beacon_lat,
        longitude: beacon_long,
      }),
    };

    if (flattenedPost.image_url) {
      flattenedPost.image_url = await trySignR2Url(flattenedPost.image_url);
    }
    if (flattenedPost.thumbnail_url) {
      flattenedPost.thumbnail_url = await trySignR2Url(flattenedPost.thumbnail_url);
    }

    const response: any = {
      post: flattenedPost,
      tone_analysis: {
        tone: toneLabel,
        cis: cisScore,
        message,
      },
      tags,
    };

    return new Response(
      JSON.stringify(response),
      {
        status: 201,
        headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({
          error: 'Validation error',
          message: error.message,
          field: error.field,
        }),
        {
          status: 400,
          headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
        }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { ...CORS_HEADERS, 'Content-Type': 'application/json' },
    });
  }
});
1	_legacy/supabase/functions/push-notification/config.toml	Normal file
@@ -0,0 +1 @@
verify_jwt = false

238	_legacy/supabase/functions/push-notification/index.ts	Normal file
@@ -0,0 +1,238 @@
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2.39.3';
import { cert, getApps, initializeApp } from 'https://esm.sh/firebase-admin@11.10.1/app?target=deno&bundle';
import { getMessaging } from 'https://esm.sh/firebase-admin@11.10.1/messaging?target=deno&bundle';

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
};

let supabase: ReturnType<typeof createClient> | null = null;

function getSupabase() {
  if (supabase) return supabase;
  const url = Deno.env.get('SUPABASE_URL') ?? '';
  const key = Deno.env.get('SUPABASE_SERVICE_ROLE_KEY') ?? '';
  if (!url || !key) {
    throw new Error('Missing SUPABASE_URL or SUPABASE_SERVICE_ROLE_KEY');
  }
  supabase = createClient(url, key);
  return supabase;
}

function initFirebase() {
  if (getApps().length > 0) return;
  const raw = Deno.env.get('FIREBASE_SERVICE_ACCOUNT');
  if (!raw) {
    throw new Error('Missing FIREBASE_SERVICE_ACCOUNT environment variable');
  }
  const serviceAccount = JSON.parse(raw);
  if (serviceAccount.private_key && typeof serviceAccount.private_key === 'string') {
    serviceAccount.private_key = serviceAccount.private_key.replace(/\\n/g, '\n');
  }
  initializeApp({
    credential: cert(serviceAccount),
  });
}

// Clean up invalid FCM tokens from the database
async function cleanupInvalidTokens(
  supabaseClient: ReturnType<typeof createClient>,
  tokens: string[],
  responses: { success: boolean; error?: { code: string } }[]
) {
  const invalidTokens: string[] = [];

  responses.forEach((response, index) => {
    if (!response.success && response.error) {
      const errorCode = response.error.code;
      // These error codes indicate the token is no longer valid
      if (
        errorCode === 'messaging/invalid-registration-token' ||
        errorCode === 'messaging/registration-token-not-registered' ||
        errorCode === 'messaging/invalid-argument'
      ) {
        invalidTokens.push(tokens[index]);
      }
    }
  });

  if (invalidTokens.length > 0) {
    await supabaseClient
      .from('user_fcm_tokens')
      .delete()
      .in('token', invalidTokens);
    console.log(`Cleaned up ${invalidTokens.length} invalid FCM tokens`);
  }

  return invalidTokens.length;
}

Deno.serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders });
  }

  if (req.method === 'GET') {
    const vapidKey = Deno.env.get('FIREBASE_WEB_VAPID_KEY') || '';
    return new Response(JSON.stringify({ firebase_web_vapid_key: vapidKey }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }

  try {
    initFirebase();
    const supabaseClient = getSupabase();

    const payload = await req.json();
    console.log('Received payload:', JSON.stringify(payload));

    // Handle different payload formats:
    // - Database webhook: { type: 'INSERT', table: '...', record: {...} }
    // - Direct call: { conversation_id: '...', sender_id: '...' }
    // - Alternative: { new: {...} }
    const record = payload?.record ?? payload?.new ?? payload;
    console.log('Extracted record:', JSON.stringify(record));

    const conversationId = record?.conversation_id as string | undefined;
    const senderId = record?.sender_id as string | undefined;
    const messageType = record?.message_type != null
      ? Number(record.message_type)
      : undefined;

    console.log(`Processing: conversation=${conversationId}, sender=${senderId}, type=${messageType}`);

    if (!conversationId || !senderId) {
      console.error('Missing required fields in payload');
      return new Response(JSON.stringify({ error: 'Missing conversation_id or sender_id', receivedPayload: payload }), {
        status: 400,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    if (messageType === 2) {
      return new Response(JSON.stringify({ skipped: true, reason: 'command_message' }), {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const { data: conversation, error: conversationError } = await supabaseClient
      .from('encrypted_conversations')
      .select('participant_a, participant_b')
      .eq('id', conversationId)
      .single();

    if (conversationError || !conversation) {
      return new Response(JSON.stringify({ error: 'Conversation not found' }), {
        status: 404,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const receiverId =
      conversation.participant_a === senderId
        ? conversation.participant_b
        : conversation.participant_a;

    if (!receiverId) {
      return new Response(JSON.stringify({ error: 'Receiver not resolved' }), {
        status: 400,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    const { data: senderProfile } = await supabaseClient
      .from('profiles')
      .select('handle, display_name')
      .eq('id', senderId)
      .single();

    const senderName =
      senderProfile?.display_name?.trim() ||
      (senderProfile?.handle ? `@${senderProfile.handle}` : 'Someone');

    const { data: tokens } = await supabaseClient
      .from('user_fcm_tokens')
      .select('token')
      .eq('user_id', receiverId);

    const tokenList = (tokens ?? [])
      .map((row) => row.token as string)
      .filter((token) => !!token);

    if (tokenList.length === 0) {
      console.log(`No FCM tokens found for receiver ${receiverId}`);
      return new Response(JSON.stringify({ skipped: true, reason: 'no_tokens', receiverId }), {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      });
    }

    console.log(`Sending to ${tokenList.length} token(s) for receiver ${receiverId}`);

    const messaging = getMessaging();
    const response = await messaging.sendEachForMulticast({
      tokens: tokenList,
      notification: {
        title: `New Message from ${senderName}`,
        body: '🔒 [Encrypted Message]',
      },
      data: {
        conversation_id: conversationId,
        type: 'chat',
      },
      // Android-specific options
      android: {
        priority: 'high',
        notification: {
          channelId: 'chat_messages',
          priority: 'high',
        },
      },
      // iOS-specific options
      apns: {
        payload: {
          aps: {
            sound: 'default',
            badge: 1,
            contentAvailable: true,
          },
        },
      },
    });

    // Clean up any invalid tokens
    const cleanedUp = await cleanupInvalidTokens(
      supabaseClient,
      tokenList,
      response.responses
    );

    console.log(`Push notification sent: ${response.successCount} success, ${response.failureCount} failed, ${cleanedUp} tokens cleaned up`);

    // Log individual failures for debugging
    response.responses.forEach((resp, index) => {
      if (!resp.success && resp.error) {
        console.error(`Failed to send to token ${index}: ${resp.error.code} - ${resp.error.message}`);
      }
    });

    return new Response(
      JSON.stringify({
        success: true,
        sent: response.successCount,
        failed: response.failureCount,
        tokensCleanedUp: cleanedUp,
      }),
      {
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    console.error('Push notification error:', error);
    // The caught value is `unknown`; narrow it before reading `.message`
    const message = error instanceof Error ? error.message : 'Unknown error';
    return new Response(JSON.stringify({ error: message }), {
      status: 500,
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    });
  }
});
1	_legacy/supabase/functions/report/config.toml	Normal file
@@ -0,0 +1 @@
verify_jwt = false

250	_legacy/supabase/functions/report/index.ts	Normal file
@@ -0,0 +1,250 @@
/**
 * POST /report
 *
 * Design intent:
 * - Strict reasons only.
 * - Reports never auto-remove content.
 * - Reporting accuracy affects reporter trust.
 *
 * Flow:
 * 1. Validate auth and inputs
 * 2. Ensure target exists and is visible
 * 3. Create report record
 * 4. Log audit event
 * 5. Queue for moderation review
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
import { validateReportReason, validateUUID, ValidationError } from '../_shared/validation.ts';

type TargetType = 'post' | 'comment' | 'profile';

interface ReportRequest {
  target_type: TargetType;
  target_id: string;
  reason: string;
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    // 1. Validate auth
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 2. Parse request
    const { target_type, target_id, reason } = (await req.json()) as ReportRequest;

    // 3. Validate inputs
    if (!['post', 'comment', 'profile'].includes(target_type)) {
      throw new ValidationError('Invalid target type', 'target_type');
    }

    validateUUID(target_id, 'target_id');
    validateReportReason(reason);

    // 4. Verify target exists and is visible to reporter
    let targetExists = false;
    let targetAuthorId: string | null = null;

    if (target_type === 'post') {
      const { data: post } = await supabase
        .from('posts')
        .select('author_id')
        .eq('id', target_id)
        .single();

      if (post) {
        targetExists = true;
        targetAuthorId = post.author_id;
      }
    } else if (target_type === 'comment') {
      const { data: comment } = await supabase
        .from('comments')
        .select('author_id')
        .eq('id', target_id)
        .single();

      if (comment) {
        targetExists = true;
        targetAuthorId = comment.author_id;
      }
    } else if (target_type === 'profile') {
      const { data: profile } = await supabase
        .from('profiles')
        .select('id')
        .eq('id', target_id)
        .single();

      if (profile) {
        targetExists = true;
        targetAuthorId = target_id;
      }
    }

    if (!targetExists) {
      return new Response(
        JSON.stringify({
          error: 'Target not found',
          message: 'The content you are trying to report does not exist or you cannot see it.',
        }),
        {
          status: 404,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 5. Prevent self-reporting
    if (targetAuthorId === user.id) {
      return new Response(
        JSON.stringify({
          error: 'Invalid report',
          message: 'You cannot report your own content.',
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 6. Check for duplicate reports (constraint will prevent, but give better message)
    const { data: existingReport } = await supabase
      .from('reports')
      .select('id')
      .eq('reporter_id', user.id)
      .eq('target_type', target_type)
      .eq('target_id', target_id)
      .single();

    if (existingReport) {
      return new Response(
        JSON.stringify({
          error: 'Duplicate report',
          message: 'You have already reported this.',
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 7. Create report
    const { data: report, error: reportError } = await supabase
      .from('reports')
      .insert({
        reporter_id: user.id,
        target_type,
        target_id,
        reason,
        status: 'pending',
      })
      .select()
      .single();

    if (reportError) {
      console.error('Error creating report:', reportError);
      return new Response(JSON.stringify({ error: 'Failed to create report' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 8. Update reporter's trust counters
    const serviceClient = createServiceClient();
    await serviceClient.rpc('log_audit_event', {
      p_actor_id: user.id,
      p_event_type: 'report_filed',
      p_payload: {
        report_id: report.id,
        target_type,
        target_id,
        target_author_id: targetAuthorId,
      },
    });

    // Increment the reports_filed counter via read-modify-write.
    // (The previous inline supabase.rpc('jsonb_set', ...) chain was not
    // valid supabase-js usage: rpc() returns a query builder, not a value.)
    const { data: trustRow, error: counterError } = await serviceClient
      .from('trust_state')
      .select('counters')
      .eq('user_id', user.id)
      .single();

    if (counterError || !trustRow) {
      console.warn('Failed to update report counter:', counterError);
      // Continue anyway - report was created
    } else {
      const counters = (trustRow.counters ?? {}) as Record<string, number>;
      counters.reports_filed = (counters.reports_filed ?? 0) + 1;
      await serviceClient
        .from('trust_state')
        .update({
          counters,
          updated_at: new Date().toISOString(),
        })
        .eq('user_id', user.id);
    }

    // 9. Return success
    return new Response(
      JSON.stringify({
        success: true,
        report_id: report.id,
        message:
          'Report received. All reports are reviewed. False reports may affect your account standing.',
      }),
      {
        status: 201,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({
          error: 'Validation error',
          message: error.message,
          field: error.field,
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
1	_legacy/supabase/functions/save/config.toml	Normal file
@@ -0,0 +1 @@
verify_jwt = false

177	_legacy/supabase/functions/save/index.ts	Normal file
@@ -0,0 +1,177 @@
/**
 * POST /save - Save a post (private bookmark)
 * DELETE /save - Remove from saved
 *
 * Design intent:
 * - Saves are private, for personal curation
 * - More intentional than appreciation
 * - Saves > likes in ranking algorithm
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
import { validateUUID, ValidationError } from '../_shared/validation.ts';

interface SaveRequest {
  post_id: string;
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, DELETE',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const { post_id } = (await req.json()) as SaveRequest;
    validateUUID(post_id, 'post_id');

    // Use admin client to bypass RLS issues
    const adminClient = createServiceClient();

    // Verify post exists and check visibility
    const { data: postRow, error: postError } = await adminClient
      .from('posts')
      .select('id, visibility, author_id, status')
      .eq('id', post_id)
      .maybeSingle();

    if (postError || !postRow) {
      console.error('Post lookup failed:', { post_id, error: postError?.message });
      return new Response(
        JSON.stringify({ error: 'Post not found' }),
        { status: 404, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // Check if post is active
    if (postRow.status !== 'active') {
      return new Response(
        JSON.stringify({ error: 'Post is not available' }),
        { status: 404, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // For private posts, verify the user has access (must be author)
    if (postRow.visibility === 'private' && postRow.author_id !== user.id) {
      return new Response(
        JSON.stringify({ error: 'Post not accessible' }),
        { status: 403, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // For followers-only posts, verify the user follows the author
    if (postRow.visibility === 'followers' && postRow.author_id !== user.id) {
      const { data: followRow } = await adminClient
        .from('follows')
        .select('status')
        .eq('follower_id', user.id)
        .eq('following_id', postRow.author_id)
        .eq('status', 'accepted')
        .maybeSingle();

      if (!followRow) {
        return new Response(
          JSON.stringify({ error: 'You must follow this user to save their posts' }),
          { status: 403, headers: { 'Content-Type': 'application/json' } }
        );
      }
    }

    // Handle unsave (DELETE)
    if (req.method === 'DELETE') {
      const { error: deleteError } = await adminClient
        .from('post_saves')
        .delete()
        .eq('user_id', user.id)
        .eq('post_id', post_id);

      if (deleteError) {
        console.error('Error removing save:', deleteError);
        return new Response(JSON.stringify({ error: 'Failed to remove from saved' }), {
          status: 500,
          headers: { 'Content-Type': 'application/json' },
        });
      }

      return new Response(JSON.stringify({ success: true }), {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Handle save (POST)
    const { error: saveError } = await adminClient
      .from('post_saves')
      .insert({
        user_id: user.id,
        post_id,
      });

    if (saveError) {
      // Already saved (duplicate key)
      if (saveError.code === '23505') {
        return new Response(
          JSON.stringify({ error: 'Post already saved' }),
          { status: 400, headers: { 'Content-Type': 'application/json' } }
        );
      }

      console.error('Error saving post:', saveError);
      return new Response(JSON.stringify({ error: 'Failed to save post' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    return new Response(
      JSON.stringify({
        success: true,
        message: 'Saved. You can find this in your collection.',
      }),
      {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({ error: 'Validation error', message: error.message }),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
1	_legacy/supabase/functions/search/config.toml	Normal file
@@ -0,0 +1 @@
verify_jwt = false

287	_legacy/supabase/functions/search/index.ts	Normal file
@@ -0,0 +1,287 @@
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createSupabaseClient, createServiceClient } from "../_shared/supabase-client.ts";
import { trySignR2Url } from "../_shared/r2_signer.ts";

interface Profile {
  id: string;
  handle: string;
  display_name: string;
  avatar_url: string | null;
  harmony_tier: string | null;
}

interface SearchUser {
  id: string;
  username: string;
  display_name: string;
  avatar_url: string | null;
  harmony_tier: string;
}

interface SearchTag {
  tag: string;
  count: number;
}

interface SearchPost {
  id: string;
  body: string;
  author_id: string;
  author_handle: string;
  author_display_name: string;
  created_at: string;
  image_url: string | null;
  visibility?: string;
}

function extractHashtags(text: string): string[] {
  const matches = text.match(/#\w+/g) || [];
  return [...new Set(matches.map((t) => t.replace("#", "").toLowerCase()))].filter((t) => t.length > 0);
}

function stripHashtags(text: string): string {
  return text.replace(/#\w+/g, " ").replace(/\s+/g, " ").trim();
}

serve(async (req: Request) => {
  // 1. Handle CORS preflight
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods": "GET, POST",
        "Access-Control-Allow-Headers": "authorization, x-client-info, apikey, content-type",
      },
    });
  }

  try {
    // 2. Auth & input parsing
    const authHeader = req.headers.get("Authorization");
    if (!authHeader) throw new Error("Missing authorization header");

    let query: string | null = null;
    if (req.method === "POST") {
      try {
        const body = await req.json();
        query = body.query;
      } catch { /* ignore parsing error */ }
    }
    if (!query) {
      const url = new URL(req.url);
      query = url.searchParams.get("query");
    }

    // Return empty if no query
    if (!query || query.trim().length === 0) {
      return new Response(
        JSON.stringify({ users: [], tags: [], posts: [] }),
        { status: 200, headers: { "Content-Type": "application/json" } }
      );
    }

    const rawQuery = query.trim();
    const isHashtagSearch = rawQuery.startsWith("#");
    const cleanTag = isHashtagSearch ? rawQuery.replace("#", "").toLowerCase() : "";
    const hashtagFilters = extractHashtags(rawQuery);
    const ftsQuery = stripHashtags(rawQuery).replace(/[|&!]/g, " ").trim();
    const hasFts = ftsQuery.length > 0;
    const safeQuery = rawQuery.toLowerCase().replace(/[,()]/g, "");

    console.log("Search query:", rawQuery, "isHashtagSearch:", isHashtagSearch, "cleanTag:", cleanTag);

    const supabase = createSupabaseClient(authHeader);
    const serviceClient = createServiceClient();

    const { data: { user }, error: authError } = await supabase.auth.getUser();
    if (authError || !user) throw new Error("Unauthorized");

    // 3. Prepare exclusion lists (blocked users)
    const { data: blockedUsers } = await serviceClient
      .from("blocks")
      .select("blocked_id")
      .eq("blocker_id", user.id);

    const blockedIds = blockedUsers?.map((b) => b.blocked_id) || [];
    const excludeIds = [...blockedIds, user.id]; // Exclude self from user search
    const postExcludeIds = blockedIds; // Allow own posts in search

    // 4. Parallel execution: users + tags first so tag matches can inform post search
    const [usersResult, tagsResult] = await Promise.all([

      // A. Search users
      (async () => {
        let dbQuery = serviceClient
          .from("profiles")
          .select("id, handle, display_name, avatar_url")
          .or(`handle.ilike.%${safeQuery}%,display_name.ilike.%${safeQuery}%`)
          .limit(5);

        if (excludeIds.length > 0) {
          dbQuery = dbQuery.not("id", "in", `(${excludeIds.join(",")})`);
        }
        return await dbQuery;
      })(),

      // B. Search tags (using the view for performance)
      (async () => {
        // NOTE: Ensure you have run the SQL to create 'view_searchable_tags'.
        // If the view is missing, this returns an error; handled gracefully below.
        return await serviceClient
          .from("view_searchable_tags")
          .select("tag, count")
          .ilike("tag", `%${isHashtagSearch ? cleanTag : safeQuery}%`)
          .order("count", { ascending: false })
          .limit(5);
      })(),

    ]);

    const matchedTags = (tagsResult.data || [])
      .map((t: any) => String(t.tag).toLowerCase())
      .filter((t: string) => t.length > 0);

    const tagCandidates = matchedTags.length > 0
      ? matchedTags
      : (isHashtagSearch && cleanTag.length > 0 ? [cleanTag] : []);

    // C. Search posts (tag-first for hashtag queries; hybrid for others)
    const postsResult = await (async () => {
      const postsMap = new Map<string, any>();

      if (isHashtagSearch) {
        if (cleanTag.length > 0) {
          let exactTagQuery = serviceClient
            .from("posts")
            .select("id, body, tags, created_at, author_id, image_url, visibility, profiles!posts_author_id_fkey(handle, display_name)")
            .contains("tags", [cleanTag])
            .order("created_at", { ascending: false })
            .limit(20);
          if (excludeIds.length > 0) {
            exactTagQuery = exactTagQuery.not("author_id", "in", `(${excludeIds.join(",")})`);
          }
          const exactTagResult = await exactTagQuery;
          (exactTagResult.data || []).forEach((p: any) => postsMap.set(p.id, p));
        }

        if (tagCandidates.length > 0) {
          let tagQuery = serviceClient
            .from("posts")
            .select("id, body, tags, created_at, author_id, image_url, visibility, profiles!posts_author_id_fkey(handle, display_name)")
            .overlaps("tags", tagCandidates)
            .order("created_at", { ascending: false })
            .limit(20);
          if (postExcludeIds.length > 0) {
            tagQuery = tagQuery.not("author_id", "in", `(${postExcludeIds.join(",")})`);
          }
          const tagResult = await tagQuery;
          (tagResult.data || []).forEach((p: any) => postsMap.set(p.id, p));
        }

        return { data: Array.from(postsMap.values()).slice(0, 20), error: null };
      }

      if (hasFts) {
        let ftsDbQuery = serviceClient
          .from("posts")
          .select("id, body, tags, created_at, author_id, image_url, visibility, profiles!posts_author_id_fkey(handle, display_name)")
          .order("created_at", { ascending: false })
          .limit(20);
        if (postExcludeIds.length > 0) {
          ftsDbQuery = ftsDbQuery.not("author_id", "in", `(${postExcludeIds.join(",")})`);
        }
        ftsDbQuery = ftsDbQuery.textSearch("fts", ftsQuery, {
          type: "websearch",
          config: "english",
        });
        const ftsResult = await ftsDbQuery;
        (ftsResult.data || []).forEach((p: any) => postsMap.set(p.id, p));
      }

      if (tagCandidates.length > 0) {
        let tagOverlapQuery = serviceClient
          .from("posts")
          .select("id, body, tags, created_at, author_id, image_url, visibility, profiles!posts_author_id_fkey(handle, display_name)")
          .overlaps("tags", tagCandidates)
          .order("created_at", { ascending: false })
          .limit(20);
        if (postExcludeIds.length > 0) {
          tagOverlapQuery = tagOverlapQuery.not("author_id", "in", `(${postExcludeIds.join(",")})`);
        }
        const tagOverlapResult = await tagOverlapQuery;
        (tagOverlapResult.data || []).forEach((p: any) => postsMap.set(p.id, p));
      }

      if (hashtagFilters.length > 0) {
        let tagOverlapQuery = serviceClient
          .from("posts")
          .select("id, body, tags, created_at, author_id, image_url, visibility, profiles!posts_author_id_fkey(handle, display_name)")
          .overlaps("tags", hashtagFilters)
          .order("created_at", { ascending: false })
          .limit(20);
        if (postExcludeIds.length > 0) {
          tagOverlapQuery = tagOverlapQuery.not("author_id", "in", `(${postExcludeIds.join(",")})`);
        }
        const tagOverlapResult = await tagOverlapQuery;
        (tagOverlapResult.data || []).forEach((p: any) => postsMap.set(p.id, p));
      }

      return { data: Array.from(postsMap.values()).slice(0, 20), error: null };
    })();

    // 5. Process users (get harmony tiers)
    const profiles = usersResult.data || [];
    let users: SearchUser[] = [];

    if (profiles.length > 0) {
      const { data: trustStates } = await serviceClient
        .from("trust_state")
        .select("user_id, tier")
        .in("user_id", profiles.map(p => p.id));

      const trustMap = new Map(trustStates?.map(t => [t.user_id, t.tier]) || []);
|
||||
|
||||
users = profiles.map((p: any) => ({
|
||||
id: p.id,
|
||||
username: p.handle,
|
||||
display_name: p.display_name || p.handle,
|
||||
avatar_url: p.avatar_url,
|
||||
harmony_tier: trustMap.get(p.id) || "new",
|
||||
}));
|
||||
}
|
||||
|
||||
// 6. Process Tags
|
||||
const tags: SearchTag[] = (tagsResult.data || []).map((t: any) => ({
|
||||
tag: t.tag,
|
||||
count: t.count
|
||||
}));
|
||||
|
||||
// 7. Process Posts
|
||||
const searchPosts: SearchPost[] = await Promise.all(
|
||||
(postsResult.data || []).map(async (p: any) => ({
|
||||
id: p.id,
|
||||
body: p.body,
|
||||
author_id: p.author_id,
|
||||
author_handle: p.profiles?.handle || "unknown",
|
||||
author_display_name: p.profiles?.display_name || "Unknown User",
|
||||
created_at: p.created_at,
|
||||
image_url: p.image_url ? await trySignR2Url(p.image_url) : null,
|
||||
visibility: p.visibility,
|
||||
}))
|
||||
);
|
||||
|
||||
// 8. Return Result
|
||||
return new Response(JSON.stringify({ users, tags, posts: searchPosts }), {
|
||||
status: 200,
|
||||
headers: { "Content-Type": "application/json" },
|
||||
});
|
||||
|
||||
} catch (error: any) {
|
||||
console.error("Search error:", error);
|
||||
return new Response(
|
||||
JSON.stringify({ error: error.message || "Internal server error" }),
|
||||
{ status: 500, headers: { "Content-Type": "application/json" } }
|
||||
);
|
||||
}
|
||||
});
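The search handler above merges results from several queries into a single `Map` keyed by post id, which deduplicates while preserving the order of first insertion, then caps the result at 20. A minimal standalone sketch of that merge strategy (the function name is illustrative, not part of the handler):

```typescript
// Merge several result batches, deduplicating by id in first-seen order,
// then cap the combined list (mirrors the postsMap pattern above).
function mergeResults<T extends { id: string }>(batches: T[][], cap = 20): T[] {
  const map = new Map<string, T>();
  for (const batch of batches) {
    // Map.set on an existing key updates the value but keeps its position,
    // so earlier (higher-priority) queries determine ordering.
    for (const item of batch) map.set(item.id, item);
  }
  return Array.from(map.values()).slice(0, cap);
}
```

This is why the exact-tag query runs before the overlap query: its hits occupy the front of the merged list.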
1  _legacy/supabase/functions/sign-media/config.toml  Normal file
@ -0,0 +1 @@
verify_jwt = false

80  _legacy/supabase/functions/sign-media/index.ts  Normal file
@ -0,0 +1,80 @@
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createSupabaseClient } from "../_shared/supabase-client.ts";
import { trySignR2Url, transformLegacyMediaUrl } from "../_shared/r2_signer.ts";

const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "POST, OPTIONS",
  "Access-Control-Allow-Headers": "authorization, x-client-info, apikey, content-type",
};

serve(async (req) => {
  if (req.method === "OPTIONS") {
    return new Response("ok", { headers: corsHeaders });
  }

  if (req.method !== "POST") {
    return new Response(JSON.stringify({ error: "Method not allowed" }), {
      status: 405,
      headers: { ...corsHeaders, "Content-Type": "application/json" },
    });
  }

  try {
    const authHeader = req.headers.get("Authorization");
    if (!authHeader) {
      return new Response(JSON.stringify({ error: "Missing authorization header" }), {
        status: 401,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: "Unauthorized" }), {
        status: 401,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    const body = await req.json().catch(() => ({}));
    const url = body?.url as string | undefined;
    const key = body?.key as string | undefined;
    const expiresIn = Number.isFinite(body?.expiresIn) ? Number(body.expiresIn) : 3600;

    const target = key || url;
    if (!target) {
      return new Response(JSON.stringify({ error: "Missing url or key" }), {
        status: 400,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    // Transform legacy media.gosojorn.com URLs to their object key
    const transformedTarget = transformLegacyMediaUrl(target) ?? target;

    const signedUrl = await trySignR2Url(transformedTarget, expiresIn);
    if (!signedUrl) {
      return new Response(JSON.stringify({ error: "Failed to sign media URL" }), {
        status: 400,
        headers: { ...corsHeaders, "Content-Type": "application/json" },
      });
    }

    return new Response(JSON.stringify({ signedUrl, signed_url: signedUrl }), {
      status: 200,
      headers: { ...corsHeaders, "Content-Type": "application/json" },
    });
  } catch (error) {
    const message = error instanceof Error ? error.message : "Internal server error";
    return new Response(JSON.stringify({ error: message }), {
      status: 500,
      headers: { ...corsHeaders, "Content-Type": "application/json" },
    });
  }
});
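The sign-media handler accepts either an object `key` or a full `url` (preferring the key) and falls back to a one-hour expiry when `expiresIn` is absent or non-numeric. A pure sketch of that request handling, isolated for illustration (the function name is hypothetical, not part of the handler):

```typescript
interface SignRequestBody {
  url?: string;
  key?: string;
  expiresIn?: unknown;
}

// Resolve the signing target (prefer an explicit object key, fall back to
// a full URL) and sanitize the expiry, defaulting to 3600 seconds.
function resolveSignRequest(body: SignRequestBody): { target: string | null; expiresIn: number } {
  const expiresIn =
    typeof body?.expiresIn === "number" && Number.isFinite(body.expiresIn)
      ? body.expiresIn
      : 3600;
  return { target: body?.key || body?.url || null, expiresIn };
}
```

A `null` target corresponds to the handler's 400 "Missing url or key" branch.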
1  _legacy/supabase/functions/signup/config.toml  Normal file
@ -0,0 +1 @@
verify_jwt = false

195  _legacy/supabase/functions/signup/index.ts  Normal file
@ -0,0 +1,195 @@
/**
 * POST /signup
 *
 * User registration and profile creation
 * Creates profile + initializes trust_state
 *
 * Flow:
 * 1. User signs up via Supabase Auth (handled by client)
 * 2. Client calls this function with profile details
 * 3. Create profile record
 * 4. Trust state is auto-initialized by trigger
 * 5. Return profile data
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient } from '../_shared/supabase-client.ts';
import { ValidationError } from '../_shared/validation.ts';

interface SignupRequest {
  handle: string;
  display_name: string;
  bio?: string;
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com',
        'Access-Control-Allow-Methods': 'POST',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    // 1. Validate auth
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 2. Parse request
    const { handle, display_name, bio } = (await req.json()) as SignupRequest;

    // 3. Validate inputs
    if (!handle || !handle.match(/^[a-z0-9_]{3,20}$/)) {
      throw new ValidationError(
        'Handle must be 3-20 characters, lowercase letters, numbers, and underscores only',
        'handle'
      );
    }

    if (!display_name || display_name.trim().length === 0 || display_name.length > 50) {
      throw new ValidationError('Display name must be 1-50 characters', 'display_name');
    }

    if (bio && bio.length > 300) {
      throw new ValidationError('Bio must be 300 characters or less', 'bio');
    }

    // 3b. Get origin country from IP geolocation
    // Uses ipinfo.io to look up country from client IP address
    let originCountry: string | null = null;
    const clientIp = req.headers.get('x-forwarded-for')?.split(',')[0]?.trim();
    const ipinfoToken = Deno.env.get('IPINFO_TOKEN');

    if (clientIp && ipinfoToken) {
      try {
        const geoRes = await fetch(`https://api.ipinfo.io/lite/${clientIp}?token=${ipinfoToken}`);
        if (geoRes.ok) {
          const geoData = await geoRes.json();
          // ipinfo.io returns country as ISO 3166-1 alpha-2 code (e.g., 'US', 'GB')
          if (geoData.country && /^[A-Z]{2}$/.test(geoData.country)) {
            originCountry = geoData.country;
          }
        }
      } catch (geoError) {
        // Geolocation is optional - don't fail signup if it errors
        console.warn('Geolocation lookup failed:', geoError);
      }
    }

    // 4. Check if profile already exists
    const { data: existingProfile } = await supabase
      .from('profiles')
      .select('id')
      .eq('id', user.id)
      .single();

    if (existingProfile) {
      return new Response(
        JSON.stringify({
          error: 'Profile already exists',
          message: 'You have already completed signup',
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 5. Create profile (trust_state will be auto-created by trigger)
    const { data: profile, error: profileError } = await supabase
      .from('profiles')
      .insert({
        id: user.id,
        handle,
        display_name,
        bio: bio || null,
        origin_country: originCountry,
      })
      .select()
      .single();

    if (profileError) {
      // Check for duplicate handle
      if (profileError.code === '23505') {
        return new Response(
          JSON.stringify({
            error: 'Handle taken',
            message: 'This handle is already in use. Please choose another.',
          }),
          {
            status: 400,
            headers: { 'Content-Type': 'application/json' },
          }
        );
      }

      console.error('Error creating profile:', profileError);
      return new Response(JSON.stringify({ error: 'Failed to create profile' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 6. Get the auto-created trust state
    const { data: trustState } = await supabase
      .from('trust_state')
      .select('harmony_score, tier')
      .eq('user_id', user.id)
      .single();

    // 7. Return profile data
    return new Response(
      JSON.stringify({
        profile,
        trust_state: trustState,
        message: 'Welcome to sojorn. Your journey begins quietly.',
      }),
      {
        status: 201,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    if (error instanceof ValidationError) {
      return new Response(
        JSON.stringify({
          error: 'Validation error',
          message: error.message,
          field: error.field,
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
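The signup handler enforces three input rules: handles must match `/^[a-z0-9_]{3,20}$/`, display names must be 1-50 characters after trimming, and bios are capped at 300 characters. Those rules can be exercised in isolation; a minimal sketch (function names are illustrative, not exports of the handler):

```typescript
// Handles: 3-20 chars of lowercase letters, digits, and underscores.
const HANDLE_RE = /^[a-z0-9_]{3,20}$/;

function isValidHandle(handle: string): boolean {
  return HANDLE_RE.test(handle);
}

// Display names: non-blank after trimming, at most 50 characters.
function isValidDisplayName(name: string): boolean {
  return name.trim().length > 0 && name.length <= 50;
}
```

Note the length check uses the untrimmed string, matching the handler: a 49-character name padded with spaces past 50 would be rejected.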
1  _legacy/supabase/functions/tone-check/config.toml  Normal file
@ -0,0 +1 @@
verify_jwt = false

202  _legacy/supabase/functions/tone-check/index.ts  Normal file
@ -0,0 +1,202 @@
/// <reference types="https://deno.land/x/deno@v1.28.0/cli/dts/lib.deno.d.ts" />
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'

const OPENAI_MODERATION_URL = 'https://api.openai.com/v1/moderations'

const ALLOWED_ORIGIN = Deno.env.get('ALLOWED_ORIGIN') || 'https://gosojorn.com';

const corsHeaders = {
  'Access-Control-Allow-Origin': ALLOWED_ORIGIN,
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

type ModerationCategory = 'bigotry' | 'nsfw' | 'violence'

interface ModerationResult {
  flagged: boolean
  category?: ModerationCategory
  flags: string[]
  reason: string
}

// Basic keyword-based fallback (when OpenAI is unavailable)
function basicModeration(text: string): ModerationResult {
  const lowerText = text.toLowerCase()
  const flags: string[] = []

  // Slurs and hate speech patterns (basic detection)
  const slurPatterns = [
    /\bn+[i1]+g+[aegr]+/i,
    /\bf+[a4]+g+[s$o0]+t/i,
    /\br+[e3]+t+[a4]+r+d/i,
    // Add more patterns as needed
  ]

  for (const pattern of slurPatterns) {
    if (pattern.test(text)) {
      return {
        flagged: true,
        category: 'bigotry',
        flags: ['hate-speech'],
        reason: 'This content contains hate speech or slurs.',
      }
    }
  }

  // Targeted profanity/attacks
  const attackPatterns = [
    /\b(fuck|screw|damn)\s+(you|u|your|ur)\b/i,
    /\byou('re| are)\s+(a |an )?(fucking |damn |stupid |idiot|moron|dumb)/i,
    /\b(kill|hurt|attack|destroy)\s+(you|yourself)\b/i,
    /\byou\s+should\s+(die|kill|hurt)/i,
  ]

  for (const pattern of attackPatterns) {
    if (pattern.test(text)) {
      flags.push('harassment')
      return {
        flagged: true,
        category: 'bigotry',
        flags,
        reason: 'This content appears to be harassing or attacking someone.',
      }
    }
  }

  // Positive indicators
  const positiveWords = ['thank', 'appreciate', 'love', 'support', 'grateful', 'amazing', 'wonderful']
  const hasPositive = positiveWords.some(word => lowerText.includes(word))

  if (hasPositive) {
    return {
      flagged: false,
      flags: [],
      reason: 'Content approved',
    }
  }

  // Default: Allow
  return {
    flagged: false,
    flags: [],
    reason: 'Content approved',
  }
}

serve(async (req: Request) => {
  // Handle CORS
  if (req.method === 'OPTIONS') {
    return new Response('ok', {
      headers: {
        ...corsHeaders,
        'Access-Control-Allow-Methods': 'POST, OPTIONS',
      },
    })
  }

  try {
    const { text, imageUrl } = await req.json() as { text: string; imageUrl?: string }

    if (!text || text.trim().length === 0) {
      return new Response(
        JSON.stringify({ error: 'Text is required' }),
        { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
      )
    }

    const openAiKey = Deno.env.get('OPEN_AI')

    // Try OpenAI Moderation API first (if key available)
    if (openAiKey) {
      try {
        console.log('Attempting OpenAI moderation check...')
        const input: Array<{ type: 'text'; text: string } | { type: 'image_url'; image_url: { url: string } }> = [
          { type: 'text', text },
        ]

        if (imageUrl) {
          input.push({
            type: 'image_url',
            image_url: { url: imageUrl },
          })
        }

        const moderationResponse = await fetch(OPENAI_MODERATION_URL, {
          method: 'POST',
          headers: {
            'Authorization': `Bearer ${openAiKey}`,
            'Content-Type': 'application/json',
          },
          body: JSON.stringify({
            input,
            model: 'omni-moderation-latest',
          }),
        })

        if (moderationResponse.ok) {
          const moderationData = await moderationResponse.json()
          const results = moderationData.results?.[0]

          if (results) {
            const categories = results.categories || {}
            const flags = Object.entries(categories)
              .filter(([, value]) => value === true)
              .map(([key]) => key)

            const isHate = categories.hate || categories['hate/threatening']
            const isHarassment = categories.harassment || categories['harassment/threatening']
            const isSexual = categories.sexual || categories['sexual/minors']
            const isViolence = categories.violence || categories['violence/graphic']

            let category: ModerationCategory | undefined
            let reason = 'Content approved'
            const flagged = Boolean(isHate || isHarassment || isSexual || isViolence)

            if (flagged) {
              if (isHate || isHarassment) {
                category = 'bigotry'
                reason = 'Potential hate or harassment detected.'
              } else if (isSexual) {
                category = 'nsfw'
                reason = 'Potential sexual content detected.'
              } else if (isViolence) {
                category = 'violence'
                reason = 'Potential violent content detected.'
              }
            }

            console.log('OpenAI moderation successful:', { flagged, category })
            return new Response(JSON.stringify({ flagged, category, flags, reason }), {
              headers: { ...corsHeaders, 'Content-Type': 'application/json' },
            })
          }
        } else {
          const errorText = await moderationResponse.text()
          console.error('OpenAI API error:', moderationResponse.status, errorText)
        }
      } catch (error) {
        console.error('OpenAI moderation failed:', error)
      }
    }

    // Fallback to basic keyword moderation
    console.log('Using basic keyword moderation (OpenAI unavailable)')
    const result = basicModeration(text)

    return new Response(JSON.stringify(result), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    })
  } catch (e) {
    console.error('Error in tone-check function:', e)
    // Fail CLOSED: Reject content when moderation fails
    return new Response(
      JSON.stringify({
        flagged: true,
        category: null,
        flags: [],
        reason: 'Content moderation is temporarily unavailable. Please try again later.',
      }),
      { status: 503, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )
  }
})
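The tone-check handler collapses OpenAI moderation category flags into the app's three buckets, with hate/harassment taking precedence over sexual content, which takes precedence over violence. A pure sketch of that mapping, isolated for illustration (the function name is hypothetical):

```typescript
type AppCategory = "bigotry" | "nsfw" | "violence";

// Collapse OpenAI-style moderation category flags into the app's buckets.
// Precedence mirrors the handler: bigotry > nsfw > violence.
function mapCategories(categories: Record<string, boolean>): AppCategory | undefined {
  const isHate = categories["hate"] || categories["hate/threatening"];
  const isHarassment = categories["harassment"] || categories["harassment/threatening"];
  const isSexual = categories["sexual"] || categories["sexual/minors"];
  const isViolence = categories["violence"] || categories["violence/graphic"];
  if (isHate || isHarassment) return "bigotry";
  if (isSexual) return "nsfw";
  if (isViolence) return "violence";
  return undefined; // not flagged
}
```

A post flagged for both harassment and violence therefore surfaces as "bigotry", matching the if/else-if chain in the handler.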
1  _legacy/supabase/functions/trending/config.toml  Normal file
@ -0,0 +1 @@
verify_jwt = false

345  _legacy/supabase/functions/trending/index.ts  Normal file
@ -0,0 +1,345 @@
/**
 * GET /trending
 *
 * Design intent:
 * - Trending reflects calm resonance, not excitement.
 * - Nothing trends forever.
 * - Categories do not compete.
 *
 * Implementation:
 * - Category-scoped lists only (no global trending)
 * - Eligibility: Positive or Neutral tone, High CIS, Low block/report rate
 * - Rank by calm velocity (steady appreciation > spikes)
 * - Allow admin editorial override with expiration
 */

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';
import { createSupabaseClient, createServiceClient } from '../_shared/supabase-client.ts';
import { rankPosts, type PostForRanking } from '../_shared/ranking.ts';

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'GET',
        'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
      },
    });
  }

  try {
    // 1. Validate auth
    const authHeader = req.headers.get('Authorization');
    if (!authHeader) {
      return new Response(JSON.stringify({ error: 'Missing authorization header' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    const supabase = createSupabaseClient(authHeader);
    const serviceClient = createServiceClient();

    const {
      data: { user },
      error: authError,
    } = await supabase.auth.getUser();

    if (authError || !user) {
      return new Response(JSON.stringify({ error: 'Unauthorized' }), {
        status: 401,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 2. Parse query params
    const url = new URL(req.url);
    const categorySlug = url.searchParams.get('category');
    const limit = Math.min(parseInt(url.searchParams.get('limit') || '20'), 50);

    if (!categorySlug) {
      return new Response(
        JSON.stringify({
          error: 'Missing category',
          message: 'Trending is category-scoped. Provide a category slug.',
        }),
        {
          status: 400,
          headers: { 'Content-Type': 'application/json' },
        }
      );
    }

    // 3. Get category ID
    const { data: category, error: categoryError } = await supabase
      .from('categories')
      .select('id, name, slug')
      .eq('slug', categorySlug)
      .single();

    if (categoryError || !category) {
      return new Response(JSON.stringify({ error: 'Category not found' }), {
        status: 404,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 4. Check for editorial overrides (unexpired)
    const { data: overrides } = await supabase
      .from('trending_overrides')
      .select(
        `
        post_id,
        reason,
        posts (
          id,
          body,
          created_at,
          tone_label,
          cis_score,
          allow_chain,
          chain_parent_id,
          chain_parent:posts!posts_chain_parent_id_fkey (
            id,
            body,
            created_at,
            author:profiles!posts_author_id_fkey (
              id,
              handle,
              display_name,
              avatar_url
            )
          ),
          author:profiles!posts_author_id_fkey (
            id,
            handle,
            display_name,
            avatar_url
          ),
          category:categories!posts_category_id_fkey (
            id,
            slug,
            name
          ),
          metrics:post_metrics (
            like_count,
            save_count
          )
        )
        `
      )
      .eq('category_id', category.id)
      .gt('expires_at', new Date().toISOString())
      .order('created_at', { ascending: false });

    const overridePosts =
      overrides?.map((o: any) => ({
        ...o.posts,
        is_editorial: true,
        editorial_reason: o.reason,
      })) || [];

    // 5. Fetch candidate posts for algorithmic trending
    // Eligibility:
    // - Positive or Neutral tone only
    // - CIS >= 0.8 (high content integrity)
    // - Created in last 48 hours (trending is recent)
    // - Active status
    const twoDaysAgo = new Date(Date.now() - 48 * 60 * 60 * 1000).toISOString();

    const { data: posts, error: postsError } = await serviceClient
      .from('posts')
      .select(
        `
        id,
        body,
        created_at,
        category_id,
        tone_label,
        cis_score,
        author_id,
        author:profiles!posts_author_id_fkey (
          id,
          handle,
          display_name,
          avatar_url
        ),
        category:categories!posts_category_id_fkey (
          id,
          slug,
          name
        ),
        metrics:post_metrics (
          like_count,
          save_count,
          view_count
        )
        `
      )
      .eq('category_id', category.id)
      .in('tone_label', ['positive', 'neutral'])
      .gte('cis_score', 0.8)
      .gte('created_at', twoDaysAgo)
      .eq('status', 'active')
      .limit(100); // Candidate pool

    if (postsError) {
      console.error('Error fetching trending posts:', postsError);
      return new Response(JSON.stringify({ error: 'Failed to fetch trending posts' }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // 6. Enrich posts with safety metrics
    const authorIds = [...new Set(posts.map((p) => p.author_id))];

    const { data: trustStates } = await serviceClient
      .from('trust_state')
      .select('user_id, harmony_score, tier')
      .in('user_id', authorIds);

    const trustMap = new Map(trustStates?.map((t) => [t.user_id, t]) || []);

    const oneDayAgo = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();

    const { data: recentBlocks } = await serviceClient
      .from('blocks')
      .select('blocked_id')
      .in('blocked_id', authorIds)
      .gte('created_at', oneDayAgo);

    const blocksMap = new Map<string, number>();
    recentBlocks?.forEach((block) => {
      blocksMap.set(block.blocked_id, (blocksMap.get(block.blocked_id) || 0) + 1);
    });

    const postIds = posts.map((p) => p.id);

    const { data: reports } = await serviceClient
      .from('reports')
      .select('target_id, reporter_id')
      .eq('target_type', 'post')
      .in('target_id', postIds);

    const trustedReportMap = new Map<string, number>();
    const totalReportMap = new Map<string, number>();

    for (const report of reports || []) {
      totalReportMap.set(report.target_id, (totalReportMap.get(report.target_id) || 0) + 1);

      const reporterTrust = trustMap.get(report.reporter_id);
      if (reporterTrust && reporterTrust.harmony_score >= 70) {
        trustedReportMap.set(report.target_id, (trustedReportMap.get(report.target_id) || 0) + 1);
      }
    }

    // 7. Filter out posts with safety issues
    const safePosts = posts.filter((post) => {
      const blocksReceived = blocksMap.get(post.author_id) || 0;
      const trustedReports = trustedReportMap.get(post.id) || 0;

      // Exclude if:
      // - Author received 2+ blocks in 24h
      // - Post has any trusted reports
      return blocksReceived < 2 && trustedReports === 0;
    });

    // 8. Transform and rank
    const postsForRanking: PostForRanking[] = safePosts.map((post) => {
      const authorTrust = trustMap.get(post.author_id);

      return {
        id: post.id,
        created_at: post.created_at,
        cis_score: post.cis_score || 0.8,
        tone_label: post.tone_label || 'neutral',
        save_count: post.metrics?.save_count || 0,
        like_count: post.metrics?.like_count || 0,
        view_count: post.metrics?.view_count || 0,
        author_harmony_score: authorTrust?.harmony_score || 50,
        author_tier: authorTrust?.tier || 'new',
        blocks_received_24h: blocksMap.get(post.author_id) || 0,
        trusted_reports: trustedReportMap.get(post.id) || 0,
        total_reports: totalReportMap.get(post.id) || 0,
      };
    });

    const rankedPosts = rankPosts(postsForRanking);

    // 9. Take top N algorithmic posts
    const topPosts = rankedPosts.slice(0, limit - overridePosts.length);

    // 10. Fetch full data for algorithmic posts
    const algorithmicIds = topPosts.map((p) => p.id);

    const { data: algorithmicPosts } = await supabase
      .from('posts')
      .select(
        `
        id,
        body,
        created_at,
        tone_label,
        allow_chain,
        chain_parent_id,
        chain_parent:posts!posts_chain_parent_id_fkey (
          id,
          body,
          created_at,
          author:profiles!posts_author_id_fkey (
            id,
            handle,
            display_name,
            avatar_url
          )
        ),
        author:profiles!posts_author_id_fkey (
          id,
          handle,
          display_name,
          avatar_url
        ),
        category:categories!posts_category_id_fkey (
          id,
          slug,
          name
        ),
        metrics:post_metrics (
          like_count,
          save_count
        )
        `
      )
      .in('id', algorithmicIds);

    const algorithmicWithFlag = algorithmicPosts?.map((p) => ({ ...p, is_editorial: false })) || [];

    // 11. Merge editorial overrides first, then algorithmic
    const trendingPosts = [...overridePosts, ...algorithmicWithFlag];

    return new Response(
      JSON.stringify({
        category: {
          id: category.id,
          slug: category.slug,
          name: category.name,
        },
        posts: trendingPosts,
        explanation:
          'Trending shows calm resonance: steady saves and appreciation from trusted accounts. Editorial picks are marked. Nothing trends forever.',
      }),
      {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
    );
  } catch (error) {
    console.error('Unexpected error:', error);
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
});
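Step 7 of the trending handler drops candidates on two safety signals: the author received 2 or more blocks in the last 24 hours, or the post has at least one report from a trusted (harmony_score >= 70) account. Isolated as a standalone predicate (the function name is illustrative):

```typescript
// A post stays in the trending candidate pool only if its author received
// fewer than 2 blocks in the last 24h AND it has zero trusted reports.
function isSafeForTrending(blocksReceived24h: number, trustedReports: number): boolean {
  return blocksReceived24h < 2 && trustedReports === 0;
}
```

Untrusted reports only increment `total_reports` for the ranking stage; a single trusted report is enough to exclude a post entirely.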
20  _legacy/supabase/functions/tsconfig.json  Normal file
@ -0,0 +1,20 @@
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": ["ES2020"],
    "strict": true,
    "skipLibCheck": true,
    "noEmit": true,
    "esModuleInterop": true,
    "resolveJsonModule": true,
    "allowJs": true,
    "moduleResolution": "node",
    "types": ["node"],
    "forceConsistentCasingInFileNames": true,
    "isolatedModules": true,
    "allowSyntheticDefaultImports": true
  },
  "include": ["**/*.ts"],
  "exclude": ["node_modules"]
}
1  _legacy/supabase/functions/upload-image/config.toml  Normal file
@ -0,0 +1 @@
verify_jwt = false

150  _legacy/supabase/functions/upload-image/index.ts  Normal file
@ -0,0 +1,150 @@
import { serve } from "https://deno.land/std@0.168.0/http/server.ts"
import { AwsClient } from 'https://esm.sh/aws4fetch@1.0.17'
import { trySignR2Url } from "../_shared/r2_signer.ts";

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

serve(async (req) => {
  if (req.method === 'OPTIONS') return new Response('ok', { headers: corsHeaders })

  try {
    // 1. AUTH CHECK
    const authHeader = req.headers.get('Authorization')
    if (!authHeader) {
      return new Response(JSON.stringify({ code: 401, message: 'Missing authorization header' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    // Extract user ID from JWT without full validation
    // The JWT is already validated by Supabase's edge runtime
    let userId: string
    try {
      const token = authHeader.replace('Bearer ', '')
      const payload = JSON.parse(atob(token.split('.')[1]))
      userId = payload.sub
      if (!userId) throw new Error('No user ID in token')
      console.log('Authenticated user:', userId)
    } catch (e) {
      console.error('Failed to parse JWT:', e)
      return new Response(JSON.stringify({ code: 401, message: 'Invalid JWT' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    // 2. CONFIGURATION
    const R2_BUCKET = 'sojorn-media'
    const ACCOUNT_ID = (Deno.env.get('R2_ACCOUNT_ID') ?? '').trim()
    const ACCESS_KEY = (Deno.env.get('R2_ACCESS_KEY') ?? '').trim()
    const SECRET_KEY = (Deno.env.get('R2_SECRET_KEY') ?? '').trim()
    if (!ACCOUNT_ID || !ACCESS_KEY || !SECRET_KEY) throw new Error('Missing R2 Secrets')

    // 3. PARSE MULTIPART FORM DATA (image + metadata)
    const contentType = req.headers.get('content-type') || ''

    if (!contentType.includes('multipart/form-data')) {
|
||||
throw new Error('Request must be multipart/form-data')
|
||||
}
|
||||
|
||||
const formData = await req.formData()
|
||||
const imageFile = formData.get('image') as File
|
||||
const fileName = formData.get('fileName') as string
|
||||
|
||||
if (!imageFile) {
|
||||
throw new Error('No image file provided')
|
||||
}
|
||||
|
||||
// Extract and sanitize extension from filename
|
||||
let extension = 'jpg';
|
||||
if (fileName) {
|
||||
const parts = fileName.split('.');
|
||||
if (parts.length > 1) {
|
||||
const ext = parts[parts.length - 1].toLowerCase();
|
||||
// Only allow safe image extensions
|
||||
if (['jpg', 'jpeg', 'png', 'gif', 'webp'].includes(ext)) {
|
||||
extension = ext;
|
||||
}
|
||||
}
|
||||
}
|
||||
const safeFileName = `${crypto.randomUUID()}.${extension}`
|
||||
const imageContentType = imageFile.type || 'application/octet-stream'
|
||||
|
||||
console.log(`Direct upload: fileName=${fileName}, contentType=${imageContentType}, size=${imageFile.size}`)
|
||||
|
||||
// 4. INIT R2 CLIENT
|
||||
const r2 = new AwsClient({
|
||||
accessKeyId: ACCESS_KEY,
|
||||
secretAccessKey: SECRET_KEY,
|
||||
region: 'auto',
|
||||
service: 's3',
|
||||
})
|
||||
|
||||
// 5. UPLOAD DIRECTLY TO R2 FROM EDGE FUNCTION
|
||||
const url = `https://${ACCOUNT_ID}.r2.cloudflarestorage.com/${R2_BUCKET}/${safeFileName}`
|
||||
const imageBytes = await imageFile.arrayBuffer()
|
||||
|
||||
const uploadResponse = await r2.fetch(url, {
|
||||
method: 'PUT',
|
||||
body: imageBytes,
|
||||
headers: {
|
||||
'Content-Type': imageContentType,
|
||||
'Content-Length': imageBytes.byteLength.toString(),
|
||||
},
|
||||
})
|
||||
|
||||
if (!uploadResponse.ok) {
|
||||
const errorText = await uploadResponse.text()
|
||||
console.error('R2 upload failed:', errorText)
|
||||
throw new Error(`R2 upload failed: ${uploadResponse.status} ${errorText}`)
|
||||
}
|
||||
|
||||
console.log('Successfully uploaded to R2:', safeFileName)
|
||||
|
||||
// 6. RETURN SUCCESS RESPONSE
|
||||
// Always return a signed URL to avoid public bucket access.
|
||||
const signedUrl = await trySignR2Url(safeFileName)
|
||||
if (!signedUrl) {
|
||||
throw new Error('Failed to generate signed URL')
|
||||
}
|
||||
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
publicUrl: signedUrl,
|
||||
signedUrl,
|
||||
signed_url: signedUrl,
|
||||
fileKey: safeFileName,
|
||||
fileName: safeFileName,
|
||||
fileSize: imageFile.size,
|
||||
contentType: imageContentType,
|
||||
}),
|
||||
{
|
||||
headers: {
|
||||
...corsHeaders,
|
||||
'Content-Type': 'application/json',
|
||||
}
|
||||
}
|
||||
)
|
||||
|
||||
} catch (error: unknown) {
|
||||
const errorMessage = error instanceof Error ? error.message : 'Unknown error'
|
||||
console.error('Upload function error:', errorMessage)
|
||||
return new Response(
|
||||
JSON.stringify({
|
||||
error: errorMessage,
|
||||
hint: 'Make sure you are logged in and R2 credentials are configured.'
|
||||
}),
|
||||
{
|
||||
status: 400,
|
||||
headers: {
|
||||
...corsHeaders,
|
||||
'Content-Type': 'application/json'
|
||||
}
|
||||
}
|
||||
)
|
||||
}
|
||||
})
|
||||
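The extension-sanitization step in the function above (allow-list the extension, fall back to `jpg`) can be sketched as a standalone helper. This is illustrative only; `sanitizeImageExtension` is a hypothetical name, and the committed function inlines this logic:

```typescript
// Allow-list of extensions the upload function accepts (from the code above).
const SAFE_IMAGE_EXTENSIONS = ['jpg', 'jpeg', 'png', 'gif', 'webp'];

// Hypothetical helper mirroring the inline sanitization step.
function sanitizeImageExtension(fileName: string | null): string {
  // Default to 'jpg' when the name is missing or the extension is unsafe.
  let extension = 'jpg';
  if (fileName) {
    const parts = fileName.split('.');
    if (parts.length > 1) {
      const ext = parts[parts.length - 1].toLowerCase();
      if (SAFE_IMAGE_EXTENSIONS.includes(ext)) {
        extension = ext;
      }
    }
  }
  return extension;
}

console.log(sanitizeImageExtension('photo.PNG')); // png
console.log(sanitizeImageExtension('evil.php'));  // jpg
console.log(sanitizeImageExtension(null));        // jpg
```

Because the stored key is always `crypto.randomUUID()` plus this sanitized extension, a client-supplied filename can never influence the object path.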
161
_legacy/supabase/functions/upload-media/index.ts
Normal file
@@ -0,0 +1,161 @@
import { serve } from "https://deno.land/std@0.168.0/http/server.ts"
import { AwsClient } from 'https://esm.sh/aws4fetch@1.0.17'
import { trySignR2Url } from "../_shared/r2_signer.ts";

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

serve(async (req) => {
  if (req.method === 'OPTIONS') return new Response('ok', { headers: corsHeaders })

  try {
    // 1. AUTH CHECK
    const authHeader = req.headers.get('Authorization')
    if (!authHeader) {
      return new Response(JSON.stringify({ code: 401, message: 'Missing authorization header' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    // Extract user ID from JWT without full validation
    // The JWT is already validated by Supabase's edge runtime
    let userId: string
    try {
      const token = authHeader.replace('Bearer ', '')
      const payload = JSON.parse(atob(token.split('.')[1]))
      userId = payload.sub
      if (!userId) throw new Error('No user ID in token')
      console.log('Authenticated user:', userId)
    } catch (e) {
      console.error('Failed to parse JWT:', e)
      return new Response(JSON.stringify({ code: 401, message: 'Invalid JWT' }), {
        status: 401,
        headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      })
    }

    // 2. CONFIGURATION
    const R2_BUCKET_IMAGES = 'sojorn-media'
    const R2_BUCKET_VIDEOS = 'sojorn-videos'
    const ACCOUNT_ID = (Deno.env.get('R2_ACCOUNT_ID') ?? '').trim()
    const ACCESS_KEY = (Deno.env.get('R2_ACCESS_KEY') ?? '').trim()
    const SECRET_KEY = (Deno.env.get('R2_SECRET_KEY') ?? '').trim()
    if (!ACCOUNT_ID || !ACCESS_KEY || !SECRET_KEY) throw new Error('Missing R2 Secrets')

    // 3. PARSE MULTIPART FORM DATA
    const contentType = req.headers.get('content-type') || ''

    if (!contentType.includes('multipart/form-data')) {
      throw new Error('Request must be multipart/form-data')
    }

    const formData = await req.formData()
    const mediaFile = formData.get('media') as File
    const fileName = formData.get('fileName') as string
    const mediaType = formData.get('type') as string

    if (!mediaFile) {
      throw new Error('No media file provided')
    }

    if (!mediaType || (mediaType !== 'image' && mediaType !== 'video')) {
      throw new Error('Invalid or missing type parameter. Must be "image" or "video"')
    }

    // Extract and sanitize extension from filename
    let extension = mediaType === 'image' ? 'jpg' : 'mp4'
    if (fileName) {
      const parts = fileName.split('.')
      if (parts.length > 1) {
        const ext = parts[parts.length - 1].toLowerCase()
        // Only allow safe extensions
        if (mediaType === 'image' && ['jpg', 'jpeg', 'png', 'gif', 'webp'].includes(ext)) {
          extension = ext
        } else if (mediaType === 'video' && ['mp4', 'mov', 'webm'].includes(ext)) {
          extension = ext
        }
      }
    }

    const safeFileName = `${crypto.randomUUID()}.${extension}`
    const mediaContentType = mediaFile.type || (mediaType === 'image' ? 'image/jpeg' : 'video/mp4')

    console.log(`Direct upload: type=${mediaType}, fileName=${fileName}, contentType=${mediaContentType}, size=${mediaFile.size}`)

    // 4. INIT R2 CLIENT
    const r2 = new AwsClient({
      accessKeyId: ACCESS_KEY,
      secretAccessKey: SECRET_KEY,
      region: 'auto',
      service: 's3',
    })

    // 5. UPLOAD DIRECTLY TO R2 FROM EDGE FUNCTION
    const bucket = mediaType === 'image' ? R2_BUCKET_IMAGES : R2_BUCKET_VIDEOS
    const url = `https://${ACCOUNT_ID}.r2.cloudflarestorage.com/${bucket}/${safeFileName}`
    const mediaBytes = await mediaFile.arrayBuffer()

    const uploadResponse = await r2.fetch(url, {
      method: 'PUT',
      body: mediaBytes,
      headers: {
        'Content-Type': mediaContentType,
        'Content-Length': mediaBytes.byteLength.toString(),
      },
    })

    if (!uploadResponse.ok) {
      const errorText = await uploadResponse.text()
      console.error('R2 upload failed:', errorText)
      throw new Error(`R2 upload failed: ${uploadResponse.status} ${errorText}`)
    }

    console.log('Successfully uploaded to R2:', safeFileName)

    // 6. RETURN SUCCESS RESPONSE
    // Always return a signed URL to avoid public bucket access.
    const signedUrl = await trySignR2Url(safeFileName, bucket)
    if (!signedUrl) {
      throw new Error('Failed to generate signed URL')
    }

    return new Response(
      JSON.stringify({
        publicUrl: signedUrl,
        signedUrl,
        signed_url: signedUrl,
        fileKey: safeFileName,
        fileName: safeFileName,
        fileSize: mediaFile.size,
        contentType: mediaContentType,
        type: mediaType,
      }),
      {
        headers: {
          ...corsHeaders,
          'Content-Type': 'application/json',
        }
      }
    )

  } catch (error: unknown) {
    const errorMessage = error instanceof Error ? error.message : 'Unknown error'
    console.error('Upload function error:', errorMessage)
    return new Response(
      JSON.stringify({
        error: errorMessage,
        hint: 'Make sure you are logged in and R2 credentials are configured.'
      }),
      {
        status: 400,
        headers: {
          ...corsHeaders,
          'Content-Type': 'application/json'
        }
      }
    )
  }
})
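The media variant above routes images and videos to different buckets and falls back to a per-type content type. That routing can be sketched as a small pure function (names are hypothetical; the committed code inlines this):

```typescript
type MediaType = 'image' | 'video';

// Hypothetical helper mirroring the bucket / fallback selection above.
function pickBucketAndFallback(mediaType: MediaType): { bucket: string; fallbackContentType: string } {
  return mediaType === 'image'
    ? { bucket: 'sojorn-media', fallbackContentType: 'image/jpeg' }
    : { bucket: 'sojorn-videos', fallbackContentType: 'video/mp4' };
}

console.log(pickBucketAndFallback('video').bucket);              // sojorn-videos
console.log(pickBucketAndFallback('image').fallbackContentType); // image/jpeg
```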
@@ -0,0 +1,17 @@
-- Migration: Add hashtag + full-text search support to posts
-- Created: 2026-01-12
-- Purpose: Make category optional while enabling tag storage and search vectors

-- 1) Make category optional to remove posting friction
alter table posts
  alter column category_id drop not null;

-- 2) Store extracted hashtags and full-text search vector
alter table posts
  add column if not exists tags text[] default '{}'::text[],
  add column if not exists fts tsvector
    generated always as (to_tsvector('english', coalesce(body, '')))
    stored;

-- 3) Index for fast full-text search lookups
create index if not exists idx_posts_fts on posts using gin (fts);
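The migration only stores the `tags` array; the commit does not show the extractor that fills it. A plausible client-side extraction, assuming hashtags are `#` followed by word characters (this helper and its rules are an assumption, not part of the commit), could look like:

```typescript
// Hypothetical hashtag extractor feeding the posts.tags column added above.
// Assumes tags are '#' + letters/digits/underscores; adjust to taste.
function extractHashtags(body: string): string[] {
  const matches = body.match(/#([a-z0-9_]+)/gi) ?? [];
  // Strip the '#', lowercase, and de-duplicate while preserving order.
  return [...new Set(matches.map((m) => m.slice(1).toLowerCase()))];
}

console.log(extractHashtags('Sunset walk #Calm #calm #nature_1'));
```

Note the `fts` column is generated from `body` alone, so hashtags are searchable both as tags and as ordinary words in the text.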
11
_legacy/supabase/migrations/20260113_post_ttl.sql
Normal file
@@ -0,0 +1,11 @@
-- Privacy-first TTL support for posts
-- default_post_ttl is in hours; NULL means keep forever.

alter table if exists user_settings
  add column if not exists default_post_ttl integer;

alter table if exists posts
  add column if not exists expires_at timestamptz;

create index if not exists posts_expires_at_idx
  on posts (expires_at);
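The TTL contract above (hours in `default_post_ttl`, NULL keeps the post forever) maps to `expires_at` like this; the helper is a hypothetical sketch of how a writer might compute the timestamp, not code from this commit:

```typescript
// NULL/absent TTL means the post never expires; otherwise expiry is
// creation time plus the TTL in hours.
function computeExpiresAt(createdAt: Date, ttlHours: number | null): Date | null {
  if (ttlHours === null) return null;
  return new Date(createdAt.getTime() + ttlHours * 60 * 60 * 1000);
}

const created = new Date('2026-01-13T00:00:00Z');
console.log(computeExpiresAt(created, 24)?.toISOString()); // 2026-01-14T00:00:00.000Z
console.log(computeExpiresAt(created, null));              // null
```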
@@ -0,0 +1,34 @@
-- Security lint remediations
-- 1) Make view_searchable_tags SECURITY INVOKER (avoid definer semantics)
create or replace view view_searchable_tags
  with (security_invoker = true) as
select
  unnest(tags) as tag,
  count(*) as count
from posts
where deleted_at is null
  and tags is not null
  and array_length(tags, 1) > 0
group by unnest(tags)
order by count desc;

-- 2) Enable RLS on notifications with per-user visibility
alter table if exists notifications enable row level security;
drop policy if exists "Users can view own notifications" on notifications;
create policy "Users can view own notifications" on notifications
  for select
  using (user_id = auth.uid());

-- Allow inserts/updates/deletes via service role (if your functions need it)
drop policy if exists "Service role manages notifications" on notifications;
create policy "Service role manages notifications" on notifications
  for all
  using (auth.role() = 'service_role')
  with check (auth.role() = 'service_role');

-- 3) Enforce RLS on spatial_ref_sys unconditionally (run as owner/superuser)
alter table spatial_ref_sys enable row level security;
drop policy if exists "Public read spatial_ref_sys" on spatial_ref_sys;
create policy "Public read spatial_ref_sys" on spatial_ref_sys
  for select
  using (true);
15
_legacy/supabase/migrations/20260114_add_profile_role.sql
Normal file
@@ -0,0 +1,15 @@
alter table profiles
  add column if not exists role text not null default 'user';

do $$
begin
  if not exists (
    select 1
    from pg_constraint
    where conname = 'profiles_role_check'
  ) then
    alter table profiles
      add constraint profiles_role_check
      check (role in ('user', 'moderator', 'admin', 'banned'));
  end if;
end $$;
18
_legacy/supabase/migrations/20260114_admin_rls_policy.sql
Normal file
@@ -0,0 +1,18 @@
do $$
begin
  if not exists (
    select 1
    from pg_policies
    where schemaname = 'public'
      and tablename = 'posts'
      and policyname = 'Admins can see everything'
  ) then
    create policy "Admins can see everything"
      on posts
      for select
      to authenticated
      using (
        (select role from profiles where id = auth.uid()) in ('admin', 'moderator')
      );
  end if;
end $$;
59
_legacy/supabase/migrations/20260114_beacon_vouch_ttl.sql
Normal file
@@ -0,0 +1,59 @@
-- Auto-expire beacons 12 hours after the most recent vouch

create or replace function update_beacon_expires_at(p_beacon_id uuid)
returns void
language plpgsql
security definer
as $$
declare
  last_vouch_at timestamptz;
begin
  select max(created_at)
  into last_vouch_at
  from beacon_votes
  where beacon_id = p_beacon_id
    and vote_type = 'vouch';

  update posts
  set expires_at = case
    when last_vouch_at is null then null
    else last_vouch_at + interval '12 hours'
  end
  where id = p_beacon_id
    and is_beacon = true;
end;
$$;

create or replace function handle_beacon_vote_ttl()
returns trigger
language plpgsql
security definer
as $$
declare
  target_beacon_id uuid;
begin
  target_beacon_id := coalesce(new.beacon_id, old.beacon_id);
  if target_beacon_id is not null then
    perform update_beacon_expires_at(target_beacon_id);
  end if;
  return null;
end;
$$;

drop trigger if exists beacon_vote_ttl_trigger on beacon_votes;
create trigger beacon_vote_ttl_trigger
  after insert or update or delete on beacon_votes
  for each row
  execute function handle_beacon_vote_ttl();

-- Backfill existing beacons with vouches
update posts p
set expires_at = v.last_vouch_at + interval '12 hours'
from (
  select beacon_id, max(created_at) as last_vouch_at
  from beacon_votes
  where vote_type = 'vouch'
  group by beacon_id
) v
where p.id = v.beacon_id
  and p.is_beacon = true;
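The expiry rule implemented by `update_beacon_expires_at` above is simple enough to state as a pure function: twelve hours after the latest vouch, or no expiry when the beacon has never been vouched. A hypothetical TypeScript mirror (not part of the commit) for client-side display:

```typescript
// Mirrors update_beacon_expires_at: NULL when there are no vouches,
// otherwise last vouch time + 12 hours.
function beaconExpiresAt(lastVouchAt: Date | null): Date | null {
  if (lastVouchAt === null) return null;
  return new Date(lastVouchAt.getTime() + 12 * 60 * 60 * 1000);
}

console.log(beaconExpiresAt(new Date('2026-01-14T06:00:00Z'))?.toISOString()); // 2026-01-14T18:00:00.000Z
console.log(beaconExpiresAt(null));                                            // null
```

Because the trigger fires on insert, update, and delete of `beacon_votes`, removing the last vouch clears `expires_at` back to NULL.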
@@ -0,0 +1,18 @@
alter table profiles
  add column if not exists strikes integer not null default 0;

alter table posts
  add column if not exists moderation_status text not null default 'approved';

do $$
begin
  if not exists (
    select 1
    from pg_constraint
    where conname = 'posts_moderation_status_check'
  ) then
    alter table posts
      add constraint posts_moderation_status_check
      check (moderation_status in ('approved', 'flagged_bigotry', 'flagged_nsfw', 'rejected'));
  end if;
end $$;
14
_legacy/supabase/migrations/20260117_add_origin_country.sql
Normal file
@@ -0,0 +1,14 @@
-- Add origin_country column to profiles table
-- Stores ISO 3166-1 alpha-2 country code (e.g., 'US', 'GB', 'CA')
-- Captured automatically at signup via ipinfo.io geolocation lookup

ALTER TABLE profiles
  ADD COLUMN IF NOT EXISTS origin_country TEXT;

-- Add constraint for valid ISO 2-letter code format
ALTER TABLE profiles
  ADD CONSTRAINT profiles_origin_country_format
  CHECK (origin_country IS NULL OR origin_country ~ '^[A-Z]{2}$');

-- Add comment for documentation
COMMENT ON COLUMN profiles.origin_country IS 'ISO 3166-1 alpha-2 country code captured at signup via ipinfo.io geolocation';
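The `profiles_origin_country_format` check above accepts NULL or exactly two uppercase ASCII letters. A client could pre-validate with the same pattern before writing (hypothetical helper, not in this commit):

```typescript
// Same predicate as the profiles_origin_country_format constraint:
// NULL, or exactly two uppercase ASCII letters.
function isValidOriginCountry(code: string | null): boolean {
  return code === null || /^[A-Z]{2}$/.test(code);
}

console.log(isValidOriginCountry('US'));  // true
console.log(isValidOriginCountry('usa')); // false
console.log(isValidOriginCountry(null));  // true
```

Note the constraint checks format only; 'XX' passes even though it is not an assigned ISO 3166-1 code.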
174
_legacy/supabase/migrations/20260117_private_follow_model.sql
Normal file
@@ -0,0 +1,174 @@
-- Private-by-default follow model + mutuals enforcement

-- 1) Profiles: add privacy/official flags
alter table if exists profiles
  add column if not exists is_private boolean not null default true,
  add column if not exists is_official boolean not null default false;

-- 2) Follows: add status and constraint
alter table if exists follows
  add column if not exists status text not null default 'accepted';

do $$
begin
  if not exists (
    select 1
    from pg_constraint
    where conname = 'follows_status_check'
  ) then
    alter table follows
      add constraint follows_status_check
      check (status in ('pending', 'accepted'));
  end if;
end $$;

-- 3) Request follow function (privacy-aware)
create or replace function request_follow(target_id uuid)
returns text
language plpgsql
security definer
as $$
declare
  existing_status text;
  target_private boolean;
  target_official boolean;
  new_status text;
begin
  if auth.uid() is null then
    raise exception 'Not authenticated';
  end if;

  select status into existing_status
  from follows
  where follower_id = auth.uid()
    and following_id = target_id;

  if existing_status is not null then
    return existing_status;
  end if;

  select is_private, is_official
  into target_private, target_official
  from profiles
  where id = target_id;

  if target_private is null then
    raise exception 'Target profile not found';
  end if;

  if target_official or target_private = false then
    new_status := 'accepted';
  else
    new_status := 'pending';
  end if;

  insert into follows (follower_id, following_id, status)
  values (auth.uid(), target_id, new_status);

  return new_status;
end;
$$;

-- 4) Mutual follow must be accepted on both sides
create or replace function is_mutual_follow(user_a uuid, user_b uuid)
returns boolean
language plpgsql
security definer
as $$
begin
  return exists (
    select 1
    from follows f1
    where f1.follower_id = user_a
      and f1.following_id = user_b
      and f1.status = 'accepted'
  ) and exists (
    select 1
    from follows f2
    where f2.follower_id = user_b
      and f2.following_id = user_a
      and f2.status = 'accepted'
  );
end;
$$;

-- 5) Follow request management helpers
create or replace function accept_follow_request(requester_id uuid)
returns void
language plpgsql
security definer
as $$
begin
  if auth.uid() is null then
    raise exception 'Not authenticated';
  end if;

  update follows
  set status = 'accepted'
  where follower_id = requester_id
    and following_id = auth.uid();
end;
$$;

create or replace function reject_follow_request(requester_id uuid)
returns void
language plpgsql
security definer
as $$
begin
  if auth.uid() is null then
    raise exception 'Not authenticated';
  end if;

  delete from follows
  where follower_id = requester_id
    and following_id = auth.uid();
end;
$$;

create or replace function get_follow_requests()
returns table (
  follower_id uuid,
  handle text,
  display_name text,
  avatar_url text,
  requested_at timestamptz
)
language sql
security definer
as $$
  select
    f.follower_id,
    p.handle,
    p.display_name,
    p.avatar_url,
    f.created_at as requested_at
  from follows f
  join profiles p on p.id = f.follower_id
  where f.following_id = auth.uid()
    and f.status = 'pending'
  order by f.created_at desc;
$$;

-- 6) Posts RLS: allow self, public, or accepted follow
alter table if exists posts enable row level security;

drop policy if exists posts_select_private_model on posts;
create policy posts_select_private_model on posts
  for select
  using (
    auth.uid() = author_id
    or exists (
      select 1
      from profiles p
      where p.id = author_id
        and p.is_private = false
    )
    or exists (
      select 1
      from follows f
      where f.follower_id = auth.uid()
        and f.following_id = author_id
        and f.status = 'accepted'
    )
  );
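The branch inside `request_follow` above reduces to one decision: official or public targets auto-accept, private targets get a pending request. Stated as a pure function (hypothetical helper for client-side optimistic UI, not part of the commit):

```typescript
// Mirrors the status decision in request_follow: official accounts and
// public profiles accept immediately; private profiles queue a request.
function followStatus(target: { isPrivate: boolean; isOfficial: boolean }): 'accepted' | 'pending' {
  return target.isOfficial || !target.isPrivate ? 'accepted' : 'pending';
}

console.log(followStatus({ isPrivate: true, isOfficial: false }));  // pending
console.log(followStatus({ isPrivate: true, isOfficial: true }));   // accepted
console.log(followStatus({ isPrivate: false, isOfficial: false })); // accepted
```

Since `profiles.is_private` defaults to `true`, every follow of a normal account starts as `pending` until accepted, which is what the posts RLS policy then enforces.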
305
_legacy/supabase/migrations/20260117_secure_e2ee_chat.sql
Normal file
305
_legacy/supabase/migrations/20260117_secure_e2ee_chat.sql
Normal file
|
|
@ -0,0 +1,305 @@
|
|||
-- ============================================================================
|
||||
-- Secure E2EE Chat System for Mutual Follows
|
||||
-- ============================================================================
|
||||
-- This migration creates the infrastructure for end-to-end encrypted messaging
|
||||
-- using Signal Protocol concepts. Only mutual follows can exchange messages.
|
||||
-- The server never sees plaintext - only encrypted blobs.
|
||||
-- ============================================================================
|
||||
|
||||
-- ============================================================================
|
||||
-- 1. Signal Protocol Key Storage
|
||||
-- ============================================================================
|
||||
|
||||
-- Identity and pre-keys for Signal Protocol key exchange
|
||||
CREATE TABLE IF NOT EXISTS signal_keys (
|
||||
user_id UUID PRIMARY KEY REFERENCES profiles(id) ON DELETE CASCADE,
|
||||
|
||||
-- Identity Key (long-term, base64 encoded public key)
|
||||
identity_key_public TEXT NOT NULL,
|
||||
|
||||
-- Signed Pre-Key (medium-term, rotated periodically)
|
||||
signed_prekey_public TEXT NOT NULL,
|
||||
signed_prekey_id INTEGER NOT NULL DEFAULT 1,
|
||||
signed_prekey_signature TEXT NOT NULL,
|
||||
|
||||
-- One-Time Pre-Keys (for perfect forward secrecy, consumed on use)
|
||||
-- Stored as JSONB array: [{"id": 1, "key": "base64..."}, ...]
|
||||
one_time_prekeys JSONB DEFAULT '[]'::JSONB,
|
||||
|
||||
-- Metadata
|
||||
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||
updated_at TIMESTAMPTZ DEFAULT NOW()
|
||||
);
|
||||
|
||||
-- Index for fast key lookups
|
||||
CREATE INDEX IF NOT EXISTS idx_signal_keys_user_id ON signal_keys(user_id);
|
||||
|
||||
-- ============================================================================
|
||||
-- 2. Encrypted Conversations Metadata
|
||||
-- ============================================================================
|
||||
|
||||
-- Conversation metadata (no content, just participants and state)
|
||||
CREATE TABLE IF NOT EXISTS encrypted_conversations (
|
||||
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
|
||||
|
||||
-- Participants (always 2 for DM)
|
||||
participant_a UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||
participant_b UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||
|
||||
-- Conversation state
|
||||
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||
last_message_at TIMESTAMPTZ DEFAULT NOW(),
|
||||
|
||||
-- Ensure ordered participant storage (smaller UUID first)
|
||||
-- This prevents duplicate conversations
|
||||
CONSTRAINT ordered_participants CHECK (participant_a < participant_b),
|
||||
CONSTRAINT unique_conversation UNIQUE (participant_a, participant_b)
|
||||
);
|
||||
|
||||
-- Indexes for conversation lookups
|
||||
CREATE INDEX IF NOT EXISTS idx_conversations_participant_a ON encrypted_conversations(participant_a);
|
||||
CREATE INDEX IF NOT EXISTS idx_conversations_participant_b ON encrypted_conversations(participant_b);
|
||||
CREATE INDEX IF NOT EXISTS idx_conversations_last_message ON encrypted_conversations(last_message_at DESC);
|
||||
|
||||
-- ============================================================================
|
||||
-- 3. Encrypted Messages
|
||||
-- ============================================================================
|
||||
|
||||
-- Encrypted message storage - server sees ONLY ciphertext
|
||||
CREATE TABLE IF NOT EXISTS encrypted_messages (
|
||||
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
|
||||
|
||||
-- Conversation reference
|
||||
conversation_id UUID NOT NULL REFERENCES encrypted_conversations(id) ON DELETE CASCADE,
|
||||
|
||||
-- Sender (for routing, not content attribution)
|
||||
sender_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||
|
||||
-- Encrypted payload (what the server stores)
|
||||
-- This is the Signal Protocol message, completely opaque to server
|
||||
ciphertext BYTEA NOT NULL,
|
||||
|
||||
-- Signal Protocol header (needed for decryption, but reveals nothing)
|
||||
-- Contains ephemeral key, previous chain length, message number
|
||||
message_header TEXT NOT NULL,
|
||||
|
||||
-- Message type (for protocol handling)
|
||||
-- 1 = PreKeyWhisperMessage (initial message establishing session)
|
||||
-- 2 = WhisperMessage (subsequent messages in established session)
|
||||
message_type INTEGER NOT NULL DEFAULT 2,
|
||||
|
||||
-- Delivery metadata
|
||||
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||
delivered_at TIMESTAMPTZ,
|
||||
read_at TIMESTAMPTZ,
|
||||
|
||||
-- Expiration (optional ephemeral messaging)
|
||||
expires_at TIMESTAMPTZ
|
||||
);
|
||||
|
||||
-- Indexes for message retrieval
|
||||
CREATE INDEX IF NOT EXISTS idx_messages_conversation ON encrypted_messages(conversation_id, created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_messages_sender ON encrypted_messages(sender_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_messages_unread ON encrypted_messages(conversation_id, read_at) WHERE read_at IS NULL;
|
||||
CREATE INDEX IF NOT EXISTS idx_messages_expiring ON encrypted_messages(expires_at) WHERE expires_at IS NOT NULL;
|
||||
|
||||
-- ============================================================================
|
||||
-- 4. Helper Functions
|
||||
-- ============================================================================
|
||||
|
||||
-- Check if two users have a mutual follow relationship
|
||||
CREATE OR REPLACE FUNCTION is_mutual_follow(user_a UUID, user_b UUID)
|
||||
RETURNS BOOLEAN AS $$
|
||||
BEGIN
|
||||
RETURN EXISTS (
|
||||
SELECT 1 FROM follows f1
|
||||
WHERE f1.follower_id = user_a
|
||||
AND f1.following_id = user_b
|
||||
) AND EXISTS (
|
||||
SELECT 1 FROM follows f2
|
||||
WHERE f2.follower_id = user_b
|
||||
AND f2.following_id = user_a
|
||||
);
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;
|
||||
|
||||
-- Get or create a conversation between two mutual follows
|
||||
CREATE OR REPLACE FUNCTION get_or_create_conversation(user_a UUID, user_b UUID)
|
||||
RETURNS UUID AS $$
|
||||
DECLARE
|
||||
conv_id UUID;
|
||||
ordered_a UUID;
|
||||
ordered_b UUID;
|
||||
BEGIN
|
||||
-- Verify mutual follow
|
||||
IF NOT is_mutual_follow(user_a, user_b) THEN
|
||||
RAISE EXCEPTION 'Users must have mutual follow to start conversation';
|
||||
END IF;
|
||||
|
||||
-- Order participants for consistent storage
|
||||
IF user_a < user_b THEN
|
||||
ordered_a := user_a;
|
||||
ordered_b := user_b;
|
||||
ELSE
|
||||
ordered_a := user_b;
|
||||
ordered_b := user_a;
|
||||
END IF;
|
||||
|
||||
-- Try to get existing conversation
|
||||
SELECT id INTO conv_id
|
||||
FROM encrypted_conversations
|
||||
WHERE participant_a = ordered_a AND participant_b = ordered_b;
|
||||
|
||||
-- Create if doesn't exist
|
||||
IF conv_id IS NULL THEN
|
||||
INSERT INTO encrypted_conversations (participant_a, participant_b)
|
||||
VALUES (ordered_a, ordered_b)
|
||||
RETURNING id INTO conv_id;
|
||||
END IF;
|
||||
|
||||
RETURN conv_id;
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;
|
||||
|
||||
-- Consume a one-time pre-key (returns and removes it atomically)
|
||||
CREATE OR REPLACE FUNCTION consume_one_time_prekey(target_user_id UUID)
|
||||
RETURNS JSONB AS $$
|
||||
DECLARE
|
||||
prekey JSONB;
|
||||
remaining JSONB;
|
||||
BEGIN
|
||||
-- Get the first prekey
|
||||
SELECT one_time_prekeys->0 INTO prekey
|
||||
FROM signal_keys
|
||||
WHERE user_id = target_user_id
|
||||
AND jsonb_array_length(one_time_prekeys) > 0
|
||||
FOR UPDATE;
|
||||
|
||||
IF prekey IS NULL THEN
|
||||
RETURN NULL;
|
||||
END IF;
|
||||
|
||||
-- Remove it from the array
|
||||
UPDATE signal_keys
|
||||
SET one_time_prekeys = one_time_prekeys - 0,
|
||||
updated_at = NOW()
|
||||
WHERE user_id = target_user_id;
|
||||
|
||||
RETURN prekey;
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- ============================================================================
-- 5. Row Level Security Policies
-- ============================================================================

-- Enable RLS on all tables
ALTER TABLE signal_keys ENABLE ROW LEVEL SECURITY;
ALTER TABLE encrypted_conversations ENABLE ROW LEVEL SECURITY;
ALTER TABLE encrypted_messages ENABLE ROW LEVEL SECURITY;

-- Signal Keys: Users can only manage their own keys
CREATE POLICY signal_keys_select ON signal_keys
    FOR SELECT USING (true); -- Anyone can read public keys

CREATE POLICY signal_keys_insert ON signal_keys
    FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY signal_keys_update ON signal_keys
    FOR UPDATE USING (auth.uid() = user_id);

CREATE POLICY signal_keys_delete ON signal_keys
    FOR DELETE USING (auth.uid() = user_id);

-- Conversations: Only participants can see their conversations
CREATE POLICY conversations_select ON encrypted_conversations
    FOR SELECT USING (
        auth.uid() = participant_a OR auth.uid() = participant_b
    );

-- Conversations are created via the get_or_create_conversation function
-- which enforces mutual follow, so we allow insert if user is a participant
CREATE POLICY conversations_insert ON encrypted_conversations
    FOR INSERT WITH CHECK (
        (auth.uid() = participant_a OR auth.uid() = participant_b)
        AND is_mutual_follow(participant_a, participant_b)
    );

-- Messages: Only conversation participants can see/send messages
CREATE POLICY messages_select ON encrypted_messages
    FOR SELECT USING (
        EXISTS (
            SELECT 1 FROM encrypted_conversations c
            WHERE c.id = conversation_id
              AND (c.participant_a = auth.uid() OR c.participant_b = auth.uid())
        )
    );

-- Critical: Messages can only be inserted by sender who is in a mutual follow
CREATE POLICY messages_insert ON encrypted_messages
    FOR INSERT WITH CHECK (
        auth.uid() = sender_id
        AND EXISTS (
            SELECT 1 FROM encrypted_conversations c
            WHERE c.id = conversation_id
              AND (c.participant_a = auth.uid() OR c.participant_b = auth.uid())
              AND is_mutual_follow(c.participant_a, c.participant_b)
        )
    );

-- Either participant can update messages in their conversation
-- (e.g. the recipient marking a received message as read)
CREATE POLICY messages_update ON encrypted_messages
    FOR UPDATE USING (
        EXISTS (
            SELECT 1 FROM encrypted_conversations c
            WHERE c.id = conversation_id
              AND (c.participant_a = auth.uid() OR c.participant_b = auth.uid())
        )
    );

-- ============================================================================
-- 6. Triggers for Metadata Updates
-- ============================================================================

-- Update conversation last_message_at when new message is inserted
CREATE OR REPLACE FUNCTION update_conversation_timestamp()
RETURNS TRIGGER AS $$
BEGIN
    UPDATE encrypted_conversations
    SET last_message_at = NEW.created_at
    WHERE id = NEW.conversation_id;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_update_conversation_timestamp
    AFTER INSERT ON encrypted_messages
    FOR EACH ROW
    EXECUTE FUNCTION update_conversation_timestamp();

-- Auto-expire old messages (for ephemeral messaging)
CREATE OR REPLACE FUNCTION cleanup_expired_messages()
RETURNS void AS $$
BEGIN
    DELETE FROM encrypted_messages
    WHERE expires_at IS NOT NULL AND expires_at < NOW();
END;
$$ LANGUAGE plpgsql;
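
-- cleanup_expired_messages() is not scheduled here. If the pg_cron extension
-- is available, a periodic sweep could be registered like this (sketch):
--   SELECT cron.schedule('cleanup-expired-messages', '*/5 * * * *',
--     $cron$SELECT cleanup_expired_messages()$cron$);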

-- ============================================================================
-- 7. Realtime Subscriptions
-- ============================================================================

-- Enable realtime for messages (participants will subscribe to their conversations)
ALTER PUBLICATION supabase_realtime ADD TABLE encrypted_messages;

-- ============================================================================
-- 8. Comments for Documentation
-- ============================================================================

COMMENT ON TABLE signal_keys IS 'Public cryptographic keys for Signal Protocol key exchange. Private keys stored only on device.';
COMMENT ON TABLE encrypted_conversations IS 'Metadata for E2EE conversations. No message content stored here.';
COMMENT ON TABLE encrypted_messages IS 'Encrypted message blobs. Server cannot decrypt - only route.';
COMMENT ON FUNCTION is_mutual_follow IS 'Returns true if both users follow each other.';
COMMENT ON FUNCTION get_or_create_conversation IS 'Creates or retrieves a conversation, enforcing mutual follow requirement.';
COMMENT ON FUNCTION consume_one_time_prekey IS 'Atomically retrieves and removes a one-time pre-key for forward secrecy.';

@@ -0,0 +1,26 @@
-- Fix notifications type check to allow new follow-related types

do $$
begin
  if exists (
    select 1
    from pg_constraint
    where conname = 'notifications_type_check'
  ) then
    alter table notifications
      drop constraint notifications_type_check;
  end if;
end $$;

alter table notifications
  add constraint notifications_type_check
  check (type in (
    'appreciate',
    'chain',
    'follow',
    'comment',
    'mention',
    'follow_request',
    'new_follower',
    'request_accepted'
  ));

65
_legacy/supabase/migrations/20260118_fix_secure_chat_rls.sql
Normal file
@@ -0,0 +1,65 @@
-- Fix RLS policies for secure chat messages to ensure participants can access them.

-- 1. Ensure RLS is enabled on the encrypted_messages table.
ALTER TABLE public.encrypted_messages ENABLE ROW LEVEL SECURITY;

-- 2. Policy for SELECTing messages.
-- Allows a user to read messages in conversations they are a participant in.
DROP POLICY IF EXISTS "Allow participants to read messages" ON public.encrypted_messages;
CREATE POLICY "Allow participants to read messages"
ON public.encrypted_messages FOR SELECT USING (
  exists (
    select 1
    from public.encrypted_conversations
    where id = encrypted_messages.conversation_id
      and (
        participant_a = auth.uid() or
        participant_b = auth.uid()
      )
  )
);

-- 3. Policy for INSERTing new messages.
-- Allows a user to insert a message if they are the sender and a participant
-- in the conversation.
DROP POLICY IF EXISTS "Allow participants to send messages" ON public.encrypted_messages;
CREATE POLICY "Allow participants to send messages"
ON public.encrypted_messages FOR INSERT WITH CHECK (
  sender_id = auth.uid() AND
  exists (
    select 1
    from public.encrypted_conversations
    where id = encrypted_messages.conversation_id
      and (
        participant_a = auth.uid() or
        participant_b = auth.uid()
      )
  )
);

-- 4. Policy for UPDATing messages (e.g., marking as read).
-- Allows a participant to update a message in their conversation.
-- The client-side code should enforce that only the recipient can mark as read.
DROP POLICY IF EXISTS "Allow participants to update messages" ON public.encrypted_messages;
CREATE POLICY "Allow participants to update messages"
ON public.encrypted_messages FOR UPDATE USING (
  exists (
    select 1
    from public.encrypted_conversations
    where id = encrypted_messages.conversation_id
      and (
        participant_a = auth.uid() or
        participant_b = auth.uid()
      )
  )
) WITH CHECK (
  exists (
    select 1
    from public.encrypted_conversations
    where id = encrypted_messages.conversation_id
      and (
        participant_a = auth.uid() or
        participant_b = auth.uid()
      )
  )
);

55
_legacy/supabase/migrations/20260118_follow_guardrails.sql
Normal file
@@ -0,0 +1,55 @@
-- Follow guardrails: prevent self-follow and block-based follows

create or replace function request_follow(target_id uuid)
returns text
language plpgsql
security definer
as $$
declare
  existing_status text;
  target_private boolean;
  target_official boolean;
  new_status text;
begin
  if auth.uid() is null then
    raise exception 'Not authenticated';
  end if;

  if target_id is null then
    raise exception 'Target profile not found';
  end if;

  if auth.uid() = target_id then
    raise exception 'Cannot follow yourself';
  end if;

  select status into existing_status
  from follows
  where follower_id = auth.uid()
    and following_id = target_id;

  if existing_status is not null then
    return existing_status;
  end if;

  select is_private, is_official
  into target_private, target_official
  from profiles
  where id = target_id;

  if target_private is null then
    raise exception 'Target profile not found';
  end if;

  -- official and public accounts accept immediately; private ones need
  -- approval (coalesce guards against a null is_official)
  if coalesce(target_official, false) or target_private = false then
    new_status := 'accepted';
  else
    new_status := 'pending';
  end if;

  insert into follows (follower_id, following_id, status)
  values (auth.uid(), target_id, new_status);

  return new_status;
end;
$$;
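
-- Example usage (sketch, hypothetical UUID): returns 'accepted' for public
-- or official targets, 'pending' for private ones, or any pre-existing status:
--   select request_follow('33333333-3333-3333-3333-333333333333'::uuid);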

@@ -0,0 +1,67 @@
-- Follow notifications: trigger + metadata

alter table if exists notifications
  add column if not exists metadata jsonb not null default '{}'::jsonb;

-- note: ALTER TYPE ... ADD VALUE inside a DO block is only supported on
-- newer PostgreSQL (12+)
do $$
begin
  alter type notification_type add value if not exists 'follow_request';
  alter type notification_type add value if not exists 'new_follower';
  alter type notification_type add value if not exists 'request_accepted';
end $$;

create or replace function handle_follow_notification()
returns trigger
language plpgsql
as $$
begin
  if tg_op = 'INSERT' then
    if new.status = 'pending' then
      insert into notifications (user_id, type, actor_id, metadata)
      values (
        new.following_id,
        'follow_request',
        new.follower_id,
        jsonb_build_object(
          'follower_id', new.follower_id,
          'following_id', new.following_id,
          'status', new.status
        )
      );
    elsif new.status = 'accepted' then
      insert into notifications (user_id, type, actor_id, metadata)
      values (
        new.following_id,
        'new_follower',
        new.follower_id,
        jsonb_build_object(
          'follower_id', new.follower_id,
          'following_id', new.following_id,
          'status', new.status
        )
      );
    end if;
  elsif tg_op = 'UPDATE' then
    if old.status = 'pending' and new.status = 'accepted' then
      insert into notifications (user_id, type, actor_id, metadata)
      values (
        new.follower_id,
        'request_accepted',
        new.following_id,
        jsonb_build_object(
          'follower_id', new.follower_id,
          'following_id', new.following_id,
          'status', new.status
        )
      );
    end if;
  end if;

  return new;
end;
$$;

drop trigger if exists follow_notification_trigger on follows;
create trigger follow_notification_trigger
  after insert or update on follows
  for each row execute function handle_follow_notification();

8
_legacy/supabase/migrations/20260118_pinned_posts.sql
Normal file
@@ -0,0 +1,8 @@
-- Allow one pinned post per author for profile feeds

alter table if exists posts
  add column if not exists pinned_at timestamptz;

create unique index if not exists posts_author_pinned_unique
  on posts (author_id)
  where pinned_at is not null;
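
-- The partial unique index allows at most one pinned post per author, so
-- pinning is a two-step swap (sketch, hypothetical post id):
--   update posts set pinned_at = null where author_id = auth.uid();
--   update posts set pinned_at = now()
--   where id = '44444444-4444-4444-4444-444444444444' and author_id = auth.uid();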

39
_legacy/supabase/migrations/20260118_post_visibility.sql
Normal file
@@ -0,0 +1,39 @@
-- Post-level visibility controls

alter table if exists posts
  add column if not exists visibility text not null default 'public';

-- defensive backfill in case the column pre-existed as nullable
update posts
set visibility = 'public'
where visibility is null;

do $$
begin
  if not exists (
    select 1
    from pg_constraint
    where conname = 'posts_visibility_check'
  ) then
    alter table posts
      add constraint posts_visibility_check
      check (visibility in ('public', 'followers', 'private'));
  end if;
end $$;

drop policy if exists posts_select_private_model on posts;
create policy posts_select_private_model on posts
for select
using (
  auth.uid() = author_id
  or visibility = 'public'
  or (
    visibility = 'followers'
    and exists (
      select 1
      from follows f
      where f.follower_id = auth.uid()
        and f.following_id = author_id
        and f.status = 'accepted'
    )
  )
);

@@ -0,0 +1,109 @@
-- ============================================================================
-- Signal Protocol Schema Updates
-- ============================================================================
-- Updates to support proper Signal Protocol implementation with separate
-- one_time_prekeys table and profiles identity key storage.
-- ============================================================================

-- ============================================================================
-- 1. Update profiles table to store identity key and registration ID
-- ============================================================================

-- Add Signal Protocol identity key and registration ID to profiles
ALTER TABLE profiles
    ADD COLUMN IF NOT EXISTS identity_key TEXT,
    ADD COLUMN IF NOT EXISTS registration_id INTEGER;

-- ============================================================================
-- 2. Create separate one_time_prekeys table
-- ============================================================================

-- Separate table for one-time pre-keys (consumed on use)
CREATE TABLE IF NOT EXISTS one_time_prekeys (
    id SERIAL PRIMARY KEY,
    user_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    key_id INTEGER NOT NULL,
    public_key TEXT NOT NULL,
    created_at TIMESTAMPTZ DEFAULT NOW(),

    -- Ensure unique key_id per user
    UNIQUE(user_id, key_id)
);

-- Index for efficient key consumption
CREATE INDEX IF NOT EXISTS idx_one_time_prekeys_user_id ON one_time_prekeys(user_id);

-- ============================================================================
-- 3. Update signal_keys table structure
-- ============================================================================

-- The one_time_prekeys column now lives in its own table. Dropping it here
-- would break the data migration in section 6, which still reads it, so the
-- DROP is deferred to the commented-out statement after that copy.

-- Add registration_id to signal_keys if not already present
ALTER TABLE signal_keys
    ADD COLUMN IF NOT EXISTS registration_id INTEGER;

-- ============================================================================
-- 4. Update consume_one_time_prekey function
-- ============================================================================

-- Recreate the function against the separate table. The return type changes
-- (JSONB -> TABLE), so the old function must be dropped first; CREATE OR
-- REPLACE cannot change a return type. Postgres DELETE also has no
-- ORDER BY/LIMIT, so the oldest key is picked in a subquery
-- (FOR UPDATE SKIP LOCKED keeps concurrent consumers from racing for it).
DROP FUNCTION IF EXISTS consume_one_time_prekey(UUID);
CREATE FUNCTION consume_one_time_prekey(target_user_id UUID)
RETURNS TABLE(key_id INTEGER, public_key TEXT) AS $$
BEGIN
    RETURN QUERY
    DELETE FROM one_time_prekeys otp
    WHERE otp.id = (
        SELECT p.id
        FROM one_time_prekeys p
        WHERE p.user_id = target_user_id
        ORDER BY p.created_at ASC
        LIMIT 1
        FOR UPDATE SKIP LOCKED
    )
    RETURNING otp.key_id, otp.public_key;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
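
-- Example usage (sketch, hypothetical UUID): returns zero rows when the user
-- has no pre-keys left, otherwise one (key_id, public_key) row:
--   SELECT * FROM consume_one_time_prekey('11111111-1111-1111-1111-111111111111'::uuid);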

-- ============================================================================
-- 5. Update RLS policies for one_time_prekeys
-- ============================================================================

-- Enable RLS
ALTER TABLE one_time_prekeys ENABLE ROW LEVEL SECURITY;

-- Users can read their own pre-keys (for management)
CREATE POLICY one_time_prekeys_select_own ON one_time_prekeys
    FOR SELECT USING (auth.uid() = user_id);

-- Users can insert their own pre-keys
CREATE POLICY one_time_prekeys_insert_own ON one_time_prekeys
    FOR INSERT WITH CHECK (auth.uid() = user_id);

-- Users can delete their own pre-keys (when consumed)
CREATE POLICY one_time_prekeys_delete_own ON one_time_prekeys
    FOR DELETE USING (auth.uid() = user_id);

-- ============================================================================
-- 6. Migration helper: Move existing one_time_prekeys to new table
-- ============================================================================

-- Insert existing one_time_prekeys from signal_keys into the new table
INSERT INTO one_time_prekeys (user_id, key_id, public_key)
SELECT
    sk.user_id,
    (prekey->>'id')::INTEGER,
    prekey->>'key'
FROM signal_keys sk,
     jsonb_array_elements(sk.one_time_prekeys) AS prekey
ON CONFLICT (user_id, key_id) DO NOTHING;

-- Remove the old column after migration
-- (Commented out to prevent accidental data loss - run manually after verification)
-- ALTER TABLE signal_keys DROP COLUMN IF EXISTS one_time_prekeys;

-- ============================================================================
-- 7. Comments for documentation
-- ============================================================================

COMMENT ON TABLE one_time_prekeys IS 'One-time pre-keys for Signal Protocol. Each key is consumed after first use.';
COMMENT ON COLUMN profiles.identity_key IS 'Signal Protocol identity key public part (base64 encoded)';
COMMENT ON COLUMN profiles.registration_id IS 'Signal Protocol registration ID for this user';
COMMENT ON FUNCTION consume_one_time_prekey IS 'Atomically consumes and returns the oldest one-time pre-key for a user';

111
_legacy/supabase/migrations/20260119_e2ee_fix_policies.sql
Normal file
@@ -0,0 +1,111 @@
-- ============================================================================
-- E2EE Policy Fix Migration
-- ============================================================================
-- This migration safely recreates policies that may have failed on initial run.
-- Uses DROP IF EXISTS before CREATE to be idempotent.
-- ============================================================================

-- ============================================================================
-- 1. Fix e2ee_session_commands policies
-- ============================================================================

DROP POLICY IF EXISTS session_commands_select_own ON e2ee_session_commands;
DROP POLICY IF EXISTS session_commands_insert_own ON e2ee_session_commands;
DROP POLICY IF EXISTS session_commands_update_own ON e2ee_session_commands;

CREATE POLICY session_commands_select_own ON e2ee_session_commands
    FOR SELECT USING (auth.uid() = user_id);

CREATE POLICY session_commands_insert_own ON e2ee_session_commands
    FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY session_commands_update_own ON e2ee_session_commands
    FOR UPDATE USING (auth.uid() = user_id);

-- ============================================================================
-- 2. Fix e2ee_session_events policies
-- ============================================================================

DROP POLICY IF EXISTS session_events_select_own ON e2ee_session_events;
DROP POLICY IF EXISTS session_events_insert_own ON e2ee_session_events;
DROP POLICY IF EXISTS session_events_update_own ON e2ee_session_events;

CREATE POLICY session_events_select_own ON e2ee_session_events
    FOR SELECT USING (auth.uid() = user_id);

CREATE POLICY session_events_insert_own ON e2ee_session_events
    FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY session_events_update_own ON e2ee_session_events
    FOR UPDATE USING (auth.uid() = user_id);

-- ============================================================================
-- 3. Fix e2ee_decryption_failures policies
-- ============================================================================

DROP POLICY IF EXISTS decryption_failures_select_own ON e2ee_decryption_failures;
DROP POLICY IF EXISTS decryption_failures_insert_own ON e2ee_decryption_failures;
DROP POLICY IF EXISTS decryption_failures_update_own ON e2ee_decryption_failures;

CREATE POLICY decryption_failures_select_own ON e2ee_decryption_failures
    FOR SELECT USING (auth.uid() = recipient_id);

CREATE POLICY decryption_failures_insert_own ON e2ee_decryption_failures
    FOR INSERT WITH CHECK (auth.uid() = recipient_id);

CREATE POLICY decryption_failures_update_own ON e2ee_decryption_failures
    FOR UPDATE USING (auth.uid() = recipient_id);

-- ============================================================================
-- 4. Fix e2ee_session_state policies
-- ============================================================================

DROP POLICY IF EXISTS session_state_select_own ON e2ee_session_state;

CREATE POLICY session_state_select_own ON e2ee_session_state
    FOR SELECT USING (auth.uid() = user_id OR auth.uid() = peer_id);

-- ============================================================================
-- 5. Safely add tables to realtime publication (ignore if already added)
-- ============================================================================

DO $$
BEGIN
    -- Add e2ee_session_events if not already in publication
    IF NOT EXISTS (
        SELECT 1 FROM pg_publication_tables
        WHERE pubname = 'supabase_realtime'
          AND tablename = 'e2ee_session_events'
    ) THEN
        ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_events;
    END IF;

    -- Add e2ee_session_commands if not already in publication
    IF NOT EXISTS (
        SELECT 1 FROM pg_publication_tables
        WHERE pubname = 'supabase_realtime'
          AND tablename = 'e2ee_session_commands'
    ) THEN
        ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_commands;
    END IF;

    -- Add e2ee_session_state if not already in publication
    IF NOT EXISTS (
        SELECT 1 FROM pg_publication_tables
        WHERE pubname = 'supabase_realtime'
          AND tablename = 'e2ee_session_state'
    ) THEN
        ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_state;
    END IF;
END $$;

-- ============================================================================
-- 6. Ensure event type constraint includes all types
-- ============================================================================

ALTER TABLE e2ee_session_events
    DROP CONSTRAINT IF EXISTS e2ee_session_events_event_type_check;

ALTER TABLE e2ee_session_events
    ADD CONSTRAINT e2ee_session_events_event_type_check
    CHECK (event_type IN ('session_reset', 'conversation_cleanup', 'key_refresh', 'decryption_failure', 'session_mismatch', 'session_established'));

@@ -0,0 +1,232 @@
-- ============================================================================
-- E2EE Session Manager Tables
-- ============================================================================
-- Tables to support the E2EE session management edge function
-- These tables handle session commands, events, and cleanup operations
-- ============================================================================

-- ============================================================================
-- 1. Session Commands Table
-- ============================================================================

-- Table to store session management commands
CREATE TABLE IF NOT EXISTS e2ee_session_commands (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),

    -- Command details
    user_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    recipient_id UUID REFERENCES profiles(id) ON DELETE CASCADE,
    conversation_id UUID REFERENCES encrypted_conversations(id) ON DELETE CASCADE,
    command_type TEXT NOT NULL CHECK (command_type IN ('session_reset', 'conversation_cleanup', 'key_refresh')),
    status TEXT NOT NULL DEFAULT 'pending' CHECK (status IN ('pending', 'completed', 'failed')),

    -- Metadata
    created_at TIMESTAMPTZ DEFAULT NOW(),
    completed_at TIMESTAMPTZ,
    error_message TEXT
);

-- Indexes for efficient command processing
CREATE INDEX IF NOT EXISTS idx_session_commands_user ON e2ee_session_commands(user_id);
CREATE INDEX IF NOT EXISTS idx_session_commands_status ON e2ee_session_commands(status);
CREATE INDEX IF NOT EXISTS idx_session_commands_type ON e2ee_session_commands(command_type);
CREATE INDEX IF NOT EXISTS idx_session_commands_conversation ON e2ee_session_commands(conversation_id);

-- ============================================================================
-- 2. Session Events Table
-- ============================================================================

-- Table to store session management events for realtime notifications
CREATE TABLE IF NOT EXISTS e2ee_session_events (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),

    -- Event details
    user_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    event_type TEXT NOT NULL CHECK (event_type IN ('session_reset', 'conversation_cleanup', 'key_refresh', 'decryption_failure')),
    recipient_id UUID REFERENCES profiles(id) ON DELETE CASCADE,
    conversation_id UUID REFERENCES encrypted_conversations(id) ON DELETE CASCADE,

    -- Additional context
    message_id UUID REFERENCES encrypted_messages(id) ON DELETE CASCADE,
    error_details JSONB,

    -- Metadata
    timestamp TIMESTAMPTZ DEFAULT NOW(),
    processed_at TIMESTAMPTZ,
    processed_by TEXT
);

-- Indexes for efficient event processing
CREATE INDEX IF NOT EXISTS idx_session_events_user ON e2ee_session_events(user_id);
CREATE INDEX IF NOT EXISTS idx_session_events_type ON e2ee_session_events(event_type);
CREATE INDEX IF NOT EXISTS idx_session_events_timestamp ON e2ee_session_events(timestamp DESC);

-- ============================================================================
-- 3. Decryption Failure Logs
-- ============================================================================

-- Table to log decryption failures for debugging and recovery
CREATE TABLE IF NOT EXISTS e2ee_decryption_failures (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),

    -- Failure details
    message_id UUID NOT NULL REFERENCES encrypted_messages(id) ON DELETE CASCADE,
    sender_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    recipient_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    conversation_id UUID NOT NULL REFERENCES encrypted_conversations(id) ON DELETE CASCADE,

    -- Error information
    error_type TEXT NOT NULL,
    error_message TEXT,
    stack_trace TEXT,
    ciphertext_length INTEGER,
    message_type INTEGER,

    -- Context for debugging
    session_key_exists BOOLEAN,
    ephemeral_key_in_header BOOLEAN,
    header_data JSONB,

    -- Recovery status
    recovery_attempted BOOLEAN DEFAULT FALSE,
    recovery_success BOOLEAN,
    recovery_method TEXT,

    -- Metadata
    created_at TIMESTAMPTZ DEFAULT NOW(),
    last_updated TIMESTAMPTZ DEFAULT NOW()
);

-- Indexes for failure analysis
CREATE INDEX IF NOT EXISTS idx_decryption_failures_message ON e2ee_decryption_failures(message_id);
CREATE INDEX IF NOT EXISTS idx_decryption_failures_conversation ON e2ee_decryption_failures(conversation_id);
CREATE INDEX IF NOT EXISTS idx_decryption_failures_recipient ON e2ee_decryption_failures(recipient_id);
CREATE INDEX IF NOT EXISTS idx_decryption_failures_timestamp ON e2ee_decryption_failures(created_at DESC);

-- ============================================================================
-- 4. RLS Policies
-- ============================================================================

-- Enable RLS on all new tables
ALTER TABLE e2ee_session_commands ENABLE ROW LEVEL SECURITY;
ALTER TABLE e2ee_session_events ENABLE ROW LEVEL SECURITY;
ALTER TABLE e2ee_decryption_failures ENABLE ROW LEVEL SECURITY;

-- Session Commands: Users can only see their own commands
CREATE POLICY session_commands_select_own ON e2ee_session_commands
    FOR SELECT USING (auth.uid() = user_id);

CREATE POLICY session_commands_insert_own ON e2ee_session_commands
    FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY session_commands_update_own ON e2ee_session_commands
    FOR UPDATE USING (auth.uid() = user_id);

-- Session Events: Users can only see their own events
CREATE POLICY session_events_select_own ON e2ee_session_events
    FOR SELECT USING (auth.uid() = user_id);

CREATE POLICY session_events_insert_own ON e2ee_session_events
    FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY session_events_update_own ON e2ee_session_events
    FOR UPDATE USING (auth.uid() = user_id);

-- Decryption Failures: Users can only see failures for their own messages
CREATE POLICY decryption_failures_select_own ON e2ee_decryption_failures
    FOR SELECT USING (auth.uid() = recipient_id);

CREATE POLICY decryption_failures_insert_own ON e2ee_decryption_failures
    FOR INSERT WITH CHECK (auth.uid() = recipient_id);

CREATE POLICY decryption_failures_update_own ON e2ee_decryption_failures
    FOR UPDATE USING (auth.uid() = recipient_id);

-- ============================================================================
-- 5. Realtime Subscriptions
-- ============================================================================

-- Enable realtime for session events
ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_events;
ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_commands;

-- ============================================================================
-- 6. Comments for Documentation
-- ============================================================================

COMMENT ON TABLE e2ee_session_commands IS 'Commands for E2EE session management (reset, cleanup, key refresh). Processed by client and edge functions.';
COMMENT ON TABLE e2ee_session_events IS 'Realtime events for E2EE session management. Triggers client-side actions.';
COMMENT ON TABLE e2ee_decryption_failures IS 'Logs of decryption failures for debugging and automatic recovery.';
COMMENT ON COLUMN e2ee_decryption_failures.error_type IS 'Type of decryption error (e.g., "mac_failure", "session_mismatch")';
COMMENT ON COLUMN e2ee_decryption_failures.recovery_method IS 'Method used for recovery (e.g., "session_reset", "key_refresh")';

-- ============================================================================
-- 7. Helper Functions
-- ============================================================================

-- Function to log decryption failures
CREATE OR REPLACE FUNCTION log_decryption_failure(
    p_message_id UUID,
    p_sender_id UUID,
    p_recipient_id UUID,
    p_conversation_id UUID,
    p_error_type TEXT,
    p_error_message TEXT,
    p_stack_trace TEXT,
    p_ciphertext_length INTEGER,
    p_message_type INTEGER,
    p_session_key_exists BOOLEAN,
    p_ephemeral_key_in_header BOOLEAN,
    p_header_data JSONB
) RETURNS VOID AS $$
BEGIN
    INSERT INTO e2ee_decryption_failures (
        message_id, sender_id, recipient_id, conversation_id,
        error_type, error_message, stack_trace, ciphertext_length, message_type,
|
||||
session_key_exists, ephemeral_key_in_header, header_data
|
||||
) VALUES (
|
||||
p_message_id, p_sender_id, p_recipient_id, p_conversation_id,
|
||||
p_error_type, p_error_message, p_stack_trace, p_ciphertext_length, p_message_type,
|
||||
p_session_key_exists, p_ephemeral_key_in_header, p_header_data
|
||||
);
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;
|
||||
|
||||
-- Function to mark recovery attempts
|
||||
CREATE OR REPLACE FUNCTION mark_recovery_attempt(
|
||||
p_failure_id UUID,
|
||||
p_recovery_method TEXT,
|
||||
p_success BOOLEAN
|
||||
) RETURNS VOID AS $$
|
||||
BEGIN
|
||||
UPDATE e2ee_decryption_failures
|
||||
SET
|
||||
recovery_attempted = TRUE,
|
||||
recovery_success = p_success,
|
||||
recovery_method = p_recovery_method,
|
||||
last_updated = NOW()
|
||||
WHERE id = p_failure_id;
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;
|
||||
|
||||
-- ============================================================================
|
||||
-- 8. Cleanup Function
|
||||
-- ============================================================================
|
||||
|
||||
-- Function to cleanup old session data
|
||||
CREATE OR REPLACE FUNCTION cleanup_old_e2ee_data()
|
||||
RETURNS VOID AS $$
|
||||
BEGIN
|
||||
-- Delete old commands (older than 30 days)
|
||||
DELETE FROM e2ee_session_commands
|
||||
WHERE created_at < NOW() - INTERVAL '30 days';
|
||||
|
||||
-- Delete old events (older than 7 days)
|
||||
DELETE FROM e2ee_session_events
|
||||
WHERE timestamp < NOW() - INTERVAL '7 days';
|
||||
|
||||
-- Delete old failure logs (older than 90 days)
|
||||
DELETE FROM e2ee_decryption_failures
|
||||
WHERE created_at < NOW() - INTERVAL '90 days';
|
||||
END;
|
||||
$$ LANGUAGE plpgsql SECURITY DEFINER;
|
||||
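Nothing in this migration actually schedules `cleanup_old_e2ee_data()`. A minimal scheduling sketch, assuming the `pg_cron` extension is enabled on the instance (the job name and time are placeholders, not part of the commit):

```sql
-- Hypothetical schedule: run the retention sweep nightly at 03:00 (requires pg_cron)
SELECT cron.schedule(
  'cleanup-old-e2ee-data',          -- placeholder job name
  '0 3 * * *',                      -- cron expression: daily at 03:00
  $$SELECT cleanup_old_e2ee_data()$$
);
```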
255 _legacy/supabase/migrations/20260119_e2ee_session_state.sql Normal file
@@ -0,0 +1,255 @@
-- ============================================================================
-- E2EE Session State Tracking
-- ============================================================================
-- Server-side session state tracking to detect and recover from session
-- mismatches between parties. The server cannot see session keys, only
-- metadata about whether sessions exist.
-- ============================================================================

-- ============================================================================
-- 1. Session State Table
-- ============================================================================

-- Track which users have established sessions with each other
-- This allows detection of asymmetric session states (one party has session, other doesn't)
CREATE TABLE IF NOT EXISTS e2ee_session_state (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),

  -- Session participants (always stored with user_id < peer_id for consistency)
  user_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
  peer_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

  -- Session state flags
  user_has_session BOOLEAN NOT NULL DEFAULT FALSE,
  peer_has_session BOOLEAN NOT NULL DEFAULT FALSE,

  -- Session metadata (no actual keys stored!)
  user_session_created_at TIMESTAMPTZ,
  peer_session_created_at TIMESTAMPTZ,

  -- Version tracking for conflict resolution
  user_session_version INTEGER NOT NULL DEFAULT 0,
  peer_session_version INTEGER NOT NULL DEFAULT 0,

  -- Timestamps
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW(),

  -- Ensure unique pair (user_id must always be < peer_id)
  CONSTRAINT e2ee_session_state_pair_unique UNIQUE (user_id, peer_id),
  CONSTRAINT e2ee_session_state_ordering CHECK (user_id < peer_id)
);

-- Indexes for efficient lookups
CREATE INDEX IF NOT EXISTS idx_session_state_user ON e2ee_session_state(user_id);
CREATE INDEX IF NOT EXISTS idx_session_state_peer ON e2ee_session_state(peer_id);
CREATE INDEX IF NOT EXISTS idx_session_state_mismatch ON e2ee_session_state(user_has_session, peer_has_session)
  WHERE user_has_session != peer_has_session;
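The partial index above is what makes a mismatch sweep cheap; a minimal monitoring query, sketched directly from the schema as defined here:

```sql
-- Pairs where exactly one side reports an established session
-- (matches the predicate of idx_session_state_mismatch)
SELECT user_id, peer_id, user_has_session, peer_has_session, updated_at
FROM e2ee_session_state
WHERE user_has_session != peer_has_session;
```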
-- ============================================================================
-- 2. RLS Policies
-- ============================================================================

ALTER TABLE e2ee_session_state ENABLE ROW LEVEL SECURITY;

-- Users can see session state for their own sessions
CREATE POLICY session_state_select_own ON e2ee_session_state
  FOR SELECT USING (auth.uid() = user_id OR auth.uid() = peer_id);

-- Users can only insert/update their own side of the session
-- (handled via function to enforce ordering)

-- ============================================================================
-- 3. Helper Functions
-- ============================================================================

-- Function to update session state (handles ordering automatically)
CREATE OR REPLACE FUNCTION update_e2ee_session_state(
  p_user_id UUID,
  p_peer_id UUID,
  p_has_session BOOLEAN
) RETURNS JSONB AS $$
DECLARE
  v_lower_id UUID;
  v_higher_id UUID;
  v_is_user_lower BOOLEAN;
  v_result JSONB;
  v_session_state RECORD;
BEGIN
  -- Determine ordering (user_id < peer_id constraint)
  IF p_user_id < p_peer_id THEN
    v_lower_id := p_user_id;
    v_higher_id := p_peer_id;
    v_is_user_lower := TRUE;
  ELSE
    v_lower_id := p_peer_id;
    v_higher_id := p_user_id;
    v_is_user_lower := FALSE;
  END IF;

  -- Insert or update
  INSERT INTO e2ee_session_state (user_id, peer_id, user_has_session, peer_has_session, user_session_created_at, peer_session_created_at, user_session_version, peer_session_version)
  VALUES (
    v_lower_id,
    v_higher_id,
    CASE WHEN v_is_user_lower THEN p_has_session ELSE FALSE END,
    CASE WHEN NOT v_is_user_lower THEN p_has_session ELSE FALSE END,
    CASE WHEN v_is_user_lower AND p_has_session THEN NOW() ELSE NULL END,
    CASE WHEN NOT v_is_user_lower AND p_has_session THEN NOW() ELSE NULL END,
    CASE WHEN v_is_user_lower THEN 1 ELSE 0 END,
    CASE WHEN NOT v_is_user_lower THEN 1 ELSE 0 END
  )
  ON CONFLICT (user_id, peer_id) DO UPDATE SET
    user_has_session = CASE
      WHEN v_is_user_lower THEN p_has_session
      ELSE e2ee_session_state.user_has_session
    END,
    peer_has_session = CASE
      WHEN NOT v_is_user_lower THEN p_has_session
      ELSE e2ee_session_state.peer_has_session
    END,
    user_session_created_at = CASE
      WHEN v_is_user_lower AND p_has_session AND e2ee_session_state.user_session_created_at IS NULL THEN NOW()
      WHEN v_is_user_lower AND NOT p_has_session THEN NULL
      ELSE e2ee_session_state.user_session_created_at
    END,
    peer_session_created_at = CASE
      WHEN NOT v_is_user_lower AND p_has_session AND e2ee_session_state.peer_session_created_at IS NULL THEN NOW()
      WHEN NOT v_is_user_lower AND NOT p_has_session THEN NULL
      ELSE e2ee_session_state.peer_session_created_at
    END,
    user_session_version = CASE
      WHEN v_is_user_lower THEN e2ee_session_state.user_session_version + 1
      ELSE e2ee_session_state.user_session_version
    END,
    peer_session_version = CASE
      WHEN NOT v_is_user_lower THEN e2ee_session_state.peer_session_version + 1
      ELSE e2ee_session_state.peer_session_version
    END,
    updated_at = NOW()
  RETURNING * INTO v_session_state;

  -- Build result with mismatch detection
  v_result := jsonb_build_object(
    'success', TRUE,
    'user_has_session', CASE WHEN v_is_user_lower THEN v_session_state.user_has_session ELSE v_session_state.peer_has_session END,
    'peer_has_session', CASE WHEN v_is_user_lower THEN v_session_state.peer_has_session ELSE v_session_state.user_has_session END,
    'session_mismatch', v_session_state.user_has_session != v_session_state.peer_has_session,
    'peer_session_version', CASE WHEN v_is_user_lower THEN v_session_state.peer_session_version ELSE v_session_state.user_session_version END
  );

  RETURN v_result;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
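A usage sketch for the function above; the peer UUID is a placeholder, and argument order does not matter because the function normalizes the pair to satisfy the `user_id < peer_id` constraint:

```sql
-- Mark the caller's side of the session as established and inspect the result
SELECT update_e2ee_session_state(
  auth.uid(),                                      -- caller
  '00000000-0000-0000-0000-000000000000'::uuid,    -- placeholder peer id
  TRUE
) AS state;  -- JSONB including 'session_mismatch' for recovery decisions
```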
-- Function to get session state between two users
CREATE OR REPLACE FUNCTION get_e2ee_session_state(
  p_user_id UUID,
  p_peer_id UUID
) RETURNS JSONB AS $$
DECLARE
  v_lower_id UUID;
  v_higher_id UUID;
  v_is_user_lower BOOLEAN;
  v_session_state RECORD;
BEGIN
  -- Determine ordering
  IF p_user_id < p_peer_id THEN
    v_lower_id := p_user_id;
    v_higher_id := p_peer_id;
    v_is_user_lower := TRUE;
  ELSE
    v_lower_id := p_peer_id;
    v_higher_id := p_user_id;
    v_is_user_lower := FALSE;
  END IF;

  SELECT * INTO v_session_state
  FROM e2ee_session_state
  WHERE user_id = v_lower_id AND peer_id = v_higher_id;

  IF NOT FOUND THEN
    RETURN jsonb_build_object(
      'exists', FALSE,
      'user_has_session', FALSE,
      'peer_has_session', FALSE,
      'session_mismatch', FALSE
    );
  END IF;

  RETURN jsonb_build_object(
    'exists', TRUE,
    'user_has_session', CASE WHEN v_is_user_lower THEN v_session_state.user_has_session ELSE v_session_state.peer_has_session END,
    'peer_has_session', CASE WHEN v_is_user_lower THEN v_session_state.peer_has_session ELSE v_session_state.user_has_session END,
    'session_mismatch', v_session_state.user_has_session != v_session_state.peer_has_session,
    'user_session_version', CASE WHEN v_is_user_lower THEN v_session_state.user_session_version ELSE v_session_state.peer_session_version END,
    'peer_session_version', CASE WHEN v_is_user_lower THEN v_session_state.peer_session_version ELSE v_session_state.user_session_version END
  );
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- Function to clear session state (when resetting)
CREATE OR REPLACE FUNCTION clear_e2ee_session_state(
  p_user_id UUID,
  p_peer_id UUID
) RETURNS JSONB AS $$
DECLARE
  v_lower_id UUID;
  v_higher_id UUID;
  v_is_user_lower BOOLEAN;
BEGIN
  -- Determine ordering
  IF p_user_id < p_peer_id THEN
    v_lower_id := p_user_id;
    v_higher_id := p_peer_id;
    v_is_user_lower := TRUE;
  ELSE
    v_lower_id := p_peer_id;
    v_higher_id := p_user_id;
    v_is_user_lower := FALSE;
  END IF;

  -- Update only the caller's side of the session
  UPDATE e2ee_session_state SET
    user_has_session = CASE WHEN v_is_user_lower THEN FALSE ELSE user_has_session END,
    peer_has_session = CASE WHEN NOT v_is_user_lower THEN FALSE ELSE peer_has_session END,
    user_session_created_at = CASE WHEN v_is_user_lower THEN NULL ELSE user_session_created_at END,
    peer_session_created_at = CASE WHEN NOT v_is_user_lower THEN NULL ELSE peer_session_created_at END,
    user_session_version = CASE WHEN v_is_user_lower THEN user_session_version + 1 ELSE user_session_version END,
    peer_session_version = CASE WHEN NOT v_is_user_lower THEN peer_session_version + 1 ELSE peer_session_version END,
    updated_at = NOW()
  WHERE user_id = v_lower_id AND peer_id = v_higher_id;

  RETURN jsonb_build_object('success', TRUE);
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- ============================================================================
-- 4. Add decryption_failure event type if not exists
-- ============================================================================

-- Update the check constraint to include session_mismatch event type
ALTER TABLE e2ee_session_events
  DROP CONSTRAINT IF EXISTS e2ee_session_events_event_type_check;

ALTER TABLE e2ee_session_events
  ADD CONSTRAINT e2ee_session_events_event_type_check
  CHECK (event_type IN ('session_reset', 'conversation_cleanup', 'key_refresh', 'decryption_failure', 'session_mismatch', 'session_established'));

-- ============================================================================
-- 5. Realtime for session state changes
-- ============================================================================

ALTER PUBLICATION supabase_realtime ADD TABLE e2ee_session_state;

-- ============================================================================
-- 6. Comments
-- ============================================================================

COMMENT ON TABLE e2ee_session_state IS 'Server-side tracking of E2EE session existence between user pairs. Does NOT store actual keys.';
COMMENT ON COLUMN e2ee_session_state.user_has_session IS 'Whether the user with smaller UUID has an active session';
COMMENT ON COLUMN e2ee_session_state.peer_has_session IS 'Whether the user with larger UUID has an active session';
COMMENT ON COLUMN e2ee_session_state.user_session_version IS 'Incremented each time user updates their session state';
COMMENT ON FUNCTION update_e2ee_session_state IS 'Update session state for a user-peer pair. Handles UUID ordering automatically.';
COMMENT ON FUNCTION get_e2ee_session_state IS 'Get session state between two users. Returns mismatch detection.';
COMMENT ON FUNCTION clear_e2ee_session_state IS 'Clear session state for a user (used during session reset).';
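For example, a client could call `get_e2ee_session_state` before sending a message to decide whether session recovery is needed (the peer UUID is a placeholder):

```sql
-- 'true' here would indicate asymmetric state, i.e. recovery is warranted
SELECT get_e2ee_session_state(
  auth.uid(),
  '00000000-0000-0000-0000-000000000000'::uuid  -- placeholder peer id
) ->> 'session_mismatch' AS needs_recovery;
```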
135 _legacy/supabase/migrations/20260120_fix_user_post_access.sql Normal file
@@ -0,0 +1,135 @@
-- Fix for recurring post/appreciate access issues for specific users
-- This migration ensures:
-- 1. All posts have valid visibility values
-- 2. Profile privacy settings are consistent
-- 3. RLS policies don't conflict
-- 4. Users can always see public posts regardless of their own privacy settings

-- Step 1: Ensure all posts have visibility set (fix any nulls)
UPDATE posts
SET visibility = 'public'
WHERE visibility IS NULL;

-- Step 2: Ensure profiles have consistent privacy settings
UPDATE profiles
SET is_private = false
WHERE is_private IS NULL;

-- Step 2b: Official accounts should always be public (is_private = false),
-- because official/verified accounts are meant to be publicly accessible
UPDATE profiles
SET is_private = false
WHERE is_official = true AND is_private = true;

-- Step 3: Drop ALL conflicting policies and create a unified one
DROP POLICY IF EXISTS posts_select_private_model ON posts;
DROP POLICY IF EXISTS posts_select_policy ON posts;
DROP POLICY IF EXISTS posts_visibility_policy ON posts;
DROP POLICY IF EXISTS posts_select_unified ON posts;

-- Unified SELECT policy that combines both models:
-- ANY authenticated user can see a post if:
-- a) They are the author (always see own posts)
-- b) Post visibility is 'public' (ANYONE can see public posts)
-- c) Post visibility is 'followers' AND the user has an accepted follow to the author
-- d) The author's profile is NOT private (legacy backward compat - treat as public)
--
-- IMPORTANT: The viewer's own is_private setting does NOT affect what they can see.
-- is_private only affects whether OTHERS can see YOUR posts without following you.
CREATE POLICY posts_select_unified ON posts
  FOR SELECT
  USING (
    -- Author can always see their own posts
    auth.uid() = author_id
    -- Public posts visible to ALL authenticated users
    OR visibility = 'public'
    -- Followers-only posts visible to accepted followers
    OR (
      visibility = 'followers'
      AND EXISTS (
        SELECT 1
        FROM follows f
        WHERE f.follower_id = auth.uid()
          AND f.following_id = posts.author_id
          AND f.status = 'accepted'
      )
    )
    -- Legacy: If the author's profile is NOT private, treat their non-private posts as visible
    -- This handles posts created before the visibility column existed
    OR (
      visibility IS DISTINCT FROM 'private'
      AND EXISTS (
        SELECT 1
        FROM profiles p
        WHERE p.id = posts.author_id
          AND p.is_private = false
      )
    )
  );

-- Step 4: Ensure INSERT policy exists for posts
DROP POLICY IF EXISTS posts_insert_policy ON posts;
CREATE POLICY posts_insert_policy ON posts
  FOR INSERT
  WITH CHECK (auth.uid() = author_id);

-- Step 5: Ensure UPDATE policy exists for posts
DROP POLICY IF EXISTS posts_update_policy ON posts;
CREATE POLICY posts_update_policy ON posts
  FOR UPDATE
  USING (auth.uid() = author_id)
  WITH CHECK (auth.uid() = author_id);

-- Step 6: Ensure DELETE policy exists for posts
DROP POLICY IF EXISTS posts_delete_policy ON posts;
CREATE POLICY posts_delete_policy ON posts
  FOR DELETE
  USING (auth.uid() = author_id);

-- Step 7: Fix post_likes table RLS
ALTER TABLE IF EXISTS post_likes ENABLE ROW LEVEL SECURITY;

DROP POLICY IF EXISTS post_likes_select_policy ON post_likes;
CREATE POLICY post_likes_select_policy ON post_likes
  FOR SELECT
  USING (true); -- Anyone can see likes (needed for like counts)

DROP POLICY IF EXISTS post_likes_insert_policy ON post_likes;
CREATE POLICY post_likes_insert_policy ON post_likes
  FOR INSERT
  WITH CHECK (auth.uid() = user_id);

DROP POLICY IF EXISTS post_likes_delete_policy ON post_likes;
CREATE POLICY post_likes_delete_policy ON post_likes
  FOR DELETE
  USING (auth.uid() = user_id);

-- Step 8: Create indexes for faster RLS checks
CREATE INDEX IF NOT EXISTS idx_follows_follower_following_status
  ON follows(follower_id, following_id, status);

CREATE INDEX IF NOT EXISTS idx_posts_author_visibility
  ON posts(author_id, visibility);

-- Step 7b: Fix post_saves table RLS (same pattern as post_likes)
ALTER TABLE IF EXISTS post_saves ENABLE ROW LEVEL SECURITY;

DROP POLICY IF EXISTS post_saves_select_policy ON post_saves;
CREATE POLICY post_saves_select_policy ON post_saves
  FOR SELECT
  USING (auth.uid() = user_id); -- Users can only see their own saves

DROP POLICY IF EXISTS post_saves_insert_policy ON post_saves;
CREATE POLICY post_saves_insert_policy ON post_saves
  FOR INSERT
  WITH CHECK (auth.uid() = user_id);

DROP POLICY IF EXISTS post_saves_delete_policy ON post_saves;
CREATE POLICY post_saves_delete_policy ON post_saves
  FOR DELETE
  USING (auth.uid() = user_id);

-- Step 9: Grant necessary permissions
GRANT SELECT, INSERT, UPDATE, DELETE ON posts TO authenticated;
GRANT SELECT, INSERT, DELETE ON post_likes TO authenticated;
GRANT SELECT, INSERT, DELETE ON post_saves TO authenticated;
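A quick sanity check for the unified policy, sketched under the assumption it is run as an ordinary authenticated user who follows no one: only rows admitted by branches (b) and (d) above should come back.

```sql
-- Expect only public or legacy-public posts from other authors
SELECT id, author_id, visibility
FROM posts
WHERE author_id != auth.uid();
```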
@@ -0,0 +1,24 @@
-- Add archiving for notifications
alter table notifications
  add column if not exists archived_at timestamptz;

create index if not exists idx_notifications_user_archived
  on notifications (user_id, archived_at, created_at desc);

-- Update unread count to ignore archived notifications
create or replace function get_unread_notification_count(p_user_id uuid)
returns integer
language plpgsql
stable
security definer
as $$
begin
  return (
    select count(*)::integer
    from notifications
    where user_id = p_user_id
      and is_read = false
      and archived_at is null
  );
end;
$$;
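A usage sketch for the archiving flow above; the notification id is a placeholder:

```sql
-- Archive one notification, then re-read the unread badge count
update notifications
set archived_at = now()
where id = '00000000-0000-0000-0000-000000000000'  -- placeholder id
  and user_id = auth.uid();

select get_unread_notification_count(auth.uid());
```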