# Indexer Guide
Complete guide for setting up and operating the Prophyt indexer.
## Overview
The Prophyt indexer is a TypeScript/Node.js service that:
- Monitors Sui blockchain events in real-time
- Processes and stores market, bet, and event data
- Provides REST API for frontend applications
- Generates NFT portfolio images
- Integrates with Nautilus for market resolution
## Quick Start
### Prerequisites
- Node.js >= 18.0.0
- PostgreSQL database
- pnpm package manager
- Sui network access
### Installation
```bash
cd indexer
pnpm install
```
### Configuration
Create a `.env` file:
```bash
# Database
DATABASE_URL=postgresql://user:password@localhost:5432/prophyt_indexer

# Sui Network
NETWORK=mainnet
PROPHYT_PACKAGE_ID=0x...

# Walrus Storage
WALRUS_CLI_PATH=/path/to/walrus
WALRUS_CONFIG_PATH=~/.config/walrus/client_config.yaml
WALRUS_EPOCHS=5
WALRUS_PUBLISHER_URL=https://publisher.walrus-testnet.walrus.space

# External APIs
ADJACENT_API_KEY=your_key_here

# Nautilus
NAUTILUS_SERVER_URL=http://localhost:8080
NAUTILUS_ENABLED=true

# Server
PORT=8000
```
### Database Setup
```bash
# Using Docker Compose
docker-compose up -d

# Run migrations
pnpm db:setup:dev
pnpm db:generate
```
### Start Services
```bash
# Start API server
pnpm dev

# Start indexer (separate terminal)
pnpm indexer
```
## Architecture
### Event Processing
The indexer uses cursor-based event tracking:
1. **Poll Events**: Query Sui events every second
2. **Filter Events**: Keep only Prophyt contract events
3. **Process Events**: Handle each event type appropriately
4. **Store Data**: Save results to the PostgreSQL database
5. **Update Cursor**: Record the last processed event
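The polling loop above can be sketched as follows. This is an illustrative model, not the indexer's actual code: `EventSource` and the in-memory `store` stand in for the Sui RPC client and the PostgreSQL tables, and the package ID is a placeholder.

```typescript
// Sketch of one iteration of the cursor-based event loop.
type RawEvent = { id: string; packageId: string; type: string };

interface EventSource {
  // Returns events strictly after `cursor` (null = from the beginning).
  queryEvents(cursor: string | null): RawEvent[];
}

const PROPHYT_PACKAGE_ID = "0xprophyt"; // placeholder package ID

function pollOnce(
  source: EventSource,
  cursor: string | null,
  store: RawEvent[],
): string | null {
  const events = source.queryEvents(cursor);                             // 1. poll
  const ours = events.filter((e) => e.packageId === PROPHYT_PACKAGE_ID); // 2. filter
  for (const e of ours) store.push(e);                                   // 3-4. process & store
  // 5. advance the cursor past everything fetched, filtered or not,
  // so already-seen events are never re-queried after a restart.
  return events.length > 0 ? events[events.length - 1].id : cursor;
}
```

Advancing the cursor past *all* fetched events, not just the ones that matched the filter, is what keeps restarts idempotent.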
### Event Types
- **MarketCreated**: New market creation
- **BetPlaced**: User bet placement
- **MarketResolved**: Market resolution
- **WinningsClaimed**: User claim events
- **BetProofNFTMinted**: Bet proof NFT minting
- **WinningProofNFTMinted**: Winning proof NFT minting
- **YieldDeposited**: Yield protocol deposits
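"Handles each event type appropriately" typically means a dispatch table keyed on the event type. A minimal sketch (the handler bodies and return values are placeholders, not the indexer's real handlers):

```typescript
// Hypothetical dispatch table: each on-chain event type maps to a handler.
type Handler = (payload: unknown) => string;

const handlers: Record<string, Handler> = {
  MarketCreated:         () => "market row inserted",
  BetPlaced:             () => "bet row inserted",
  MarketResolved:        () => "market marked resolved",
  WinningsClaimed:       () => "claim recorded",
  BetProofNFTMinted:     () => "bet NFT recorded",
  WinningProofNFTMinted: () => "winning NFT recorded",
  YieldDeposited:        () => "yield deposit recorded",
};

function dispatch(type: string, payload: unknown): string {
  const h = handlers[type];
  if (!h) throw new Error(`unknown event type: ${type}`);
  return h(payload);
}
```

Throwing on unknown types (rather than silently skipping) makes it obvious when a new contract event is added without a matching handler.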
### Database Schema
Key tables:
- `Market`: Market data and statistics
- `Bet`: User bets and positions
- `Event`: Blockchain event history
- `User`: User portfolio data
- `Protocol`: DeFi protocol metadata
- `Price`: CoinGecko price data
- `Cursor`: Event processing state
## API Endpoints
See API Reference for complete endpoint documentation.
### Market Endpoints
- `GET /api/markets` - List markets
- `GET /api/markets/:id` - Get market details
- `POST /api/markets/seed` - Seed markets
### Bet Endpoints
- `GET /api/bets/:id` - Get bet details
- `GET /api/bets/user/:address` - Get user bets
- `GET /api/bets/market/:id` - Get market bets
- `POST /api/bets/generate-bet-image` - Generate bet NFT
- `POST /api/bets/generate-winning-image` - Generate winning NFT
### User Endpoints
- `GET /api/users/:address/bets` - User portfolio
- `GET /api/users/:address/stats` - User statistics
### Nautilus Endpoints
- `GET /api/nautilus/health` - Health check
- `GET /api/nautilus/pending-markets` - Pending markets
- `POST /api/nautilus/resolve/:id` - Resolve market
- `GET /api/nautilus/resolutions` - Resolution history
## Background Services
### Price Updater
Automatically fetches the SUI/USD price from CoinGecko:
- Updates every hour (configurable)
- Caches prices in database
- Provides latest price via API
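The updater's cadence can be sketched like this; `fetchSuiUsdPrice` and `savePrice` are hypothetical stand-ins for the CoinGecko call and the `Price`-table write, and only the staleness check is concrete:

```typescript
// TTL-based staleness check behind the hourly price refresh.
const PRICE_TTL_MS = 60 * 60 * 1000; // 1 hour, configurable

function isStale(fetchedAtMs: number, nowMs: number, ttlMs = PRICE_TTL_MS): boolean {
  return nowMs - fetchedAtMs >= ttlMs;
}

// In the service, something along these lines runs on an interval:
// setInterval(async () => {
//   if (isStale(lastFetchMs, Date.now())) {
//     const price = await fetchSuiUsdPrice(); // hypothetical CoinGecko helper
//     await savePrice(price);                 // cache in the Price table
//     lastFetchMs = Date.now();
//   }
// }, 60_000);
```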
### Market Resolution Scheduler
Automatically resolves expired markets:
- Checks every 60 seconds
- Triggers Nautilus resolution
- Submits to blockchain
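The expiry check the scheduler runs each cycle reduces to a filter over market rows. A minimal sketch, assuming an `ACTIVE`/`RESOLVED` status field (the actual column names and status values may differ):

```typescript
// Select markets whose end date has passed but are not yet resolved.
interface MarketRow {
  marketId: string;
  status: "ACTIVE" | "RESOLVED"; // illustrative status values
  endDateMs: number;
}

function expiredMarkets(markets: MarketRow[], nowMs: number): MarketRow[] {
  return markets.filter((m) => m.status === "ACTIVE" && m.endDateMs <= nowMs);
}
```

Each market this returns would then be handed to Nautilus for resolution and the result submitted on-chain.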
## Image Generation
The indexer generates NFT images for bets and winnings.
### Bet Proof Images
- Market question
- Bet position (Yes/No)
- Bet amount
- Timestamp
### Winning Proof Images
- Market outcome
- Original bet amount
- Winning amount
- Yield share
- Profit percentage
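The profit percentage can be derived from the other fields on the image. This is an assumed formula (counting the yield share toward profit), not necessarily the one the indexer uses:

```typescript
// Assumed profit formula: (winnings + yield share - stake) / stake * 100.
function profitPercentage(
  betAmount: number,
  winningAmount: number,
  yieldShare: number,
): number {
  if (betAmount <= 0) throw new Error("bet amount must be positive");
  return ((winningAmount + yieldShare - betAmount) / betAmount) * 100;
}
```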
### Walrus Integration
Images are uploaded to Walrus decentralized storage:
- Upload via the HTTP API or the Walrus CLI
- Returns a blob address and blob ID
- The blob ID is referenced in NFT metadata
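An HTTP upload to the publisher configured by `WALRUS_PUBLISHER_URL` and `WALRUS_EPOCHS` might look like the sketch below. The `/v1/blobs` path and the response shape are assumptions about the Walrus publisher API; verify them against the Walrus version you run.

```typescript
// Build the publisher upload URL (assumed endpoint path).
function buildWalrusUploadUrl(publisherUrl: string, epochs: number): string {
  return `${publisherUrl.replace(/\/$/, "")}/v1/blobs?epochs=${epochs}`;
}

// Hypothetical upload helper (response field names are assumptions):
// async function uploadImage(png: Uint8Array): Promise<string> {
//   const url = buildWalrusUploadUrl(process.env.WALRUS_PUBLISHER_URL!, 5);
//   const res = await fetch(url, { method: "PUT", body: png });
//   const json = await res.json();
//   return json.newlyCreated?.blobObject?.blobId ?? json.alreadyCertified?.blobId;
// }
```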
## Monitoring
### Health Checks
```bash
# API health
curl http://localhost:8000/

# Database health
psql $DATABASE_URL -c "SELECT 1;"
```
### Metrics to Monitor
- Event processing rate
- API response times
- Database connection pool
- Image generation success rate
- Walrus upload success rate
- Market resolution success rate
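A rate metric like "events processed per second" can be tracked with a small sliding-window counter; this is an illustrative utility, not part of the indexer itself:

```typescript
// Sliding-window rate counter, e.g. for event processing rate.
class RateCounter {
  private stamps: number[] = [];
  constructor(private windowMs: number) {}

  record(nowMs: number): void {
    this.stamps.push(nowMs);
  }

  // Events per second over the trailing window.
  rate(nowMs: number): number {
    this.stamps = this.stamps.filter((t) => nowMs - t <= this.windowMs);
    return this.stamps.length / (this.windowMs / 1000);
  }
}
```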
### Logging
Logs are written to the console. For production:
```bash
# Using PM2
pm2 logs prophyt-indexer

# Using Docker
docker logs prophyt-indexer
```
## Maintenance
### Database Backups
```bash
# Backup
pg_dump $DATABASE_URL > backup.sql

# Restore
psql $DATABASE_URL < backup.sql
```
### Cursor Management
Cursors are managed automatically. If issues occur:
```bash
# View cursors
psql $DATABASE_URL -c "SELECT * FROM \"Cursor\";"

# Reset a cursor (if needed)
psql $DATABASE_URL -c "DELETE FROM \"Cursor\" WHERE id = 'event_type';"
```
### Data Synchronization
Sync markets from the blockchain:
```bash
# Dry run
pnpm sync:markets:dry-run

# Actual sync
pnpm sync:markets
```
### Backfill Events
Backfill historical events:
```bash
# Dry run
pnpm backfill:dry-run

# Actual backfill
pnpm backfill:live
```
## Performance Tuning
### Database Optimization
```sql
-- Add indexes
CREATE INDEX idx_market_status ON "Market"(status);
CREATE INDEX idx_market_end_date ON "Market"("endDate");
CREATE INDEX idx_bet_user ON "Bet"("userId");
CREATE INDEX idx_bet_market ON "Bet"("marketId");
```
### Connection Pooling
Prisma handles connection pooling automatically. Configure it in `db.ts`:
```typescript
const db = new PrismaClient({
  datasources: {
    db: {
      url: process.env.DATABASE_URL,
    },
  },
});
```
### Caching
Implement caching for frequently accessed data:
```typescript
// Example with Redis
import Redis from 'ioredis';

const redis = new Redis();

async function getMarketCached(marketId: string) {
  const cached = await redis.get(`market:${marketId}`);
  if (cached) return JSON.parse(cached);

  const market = await prisma.market.findUnique({ where: { marketId } });
  await redis.setex(`market:${marketId}`, 60, JSON.stringify(market));
  return market;
}
```
## Troubleshooting
See the Troubleshooting Guide for common issues.
### Event Processing Issues
- Check Sui network connection
- Verify package ID is correct
- Review cursor state
- Check event filter configuration
### API Issues
- Verify database connection
- Check environment variables
- Review error logs
- Test endpoints individually
### Image Generation Issues
- Verify canvas dependencies
- Check font files exist
- Test Walrus configuration
- Review image generation logs
## Production Deployment
See the Deployment Guide for production setup.
### Using PM2
```bash
pm2 start dist/server.js --name prophyt-api
pm2 start dist/indexer.js --name prophyt-indexer
pm2 save
```
### Using Docker
```bash
docker build -t prophyt-indexer .
docker run -d --env-file .env prophyt-indexer
```
## Security
- Use HTTPS in production
- Secure database connections
- Implement API authentication
- Store secrets in environment variables
- Validate all inputs
- Use parameterized queries
## Support
- GitHub Issues
- Documentation
- X - Latest updates
