streamware

Data Pipeline Examples

Examples of ETL, data transformation, and stream processing built around the sq CLI.

📁 Examples

File                  Description
api_to_database.py    Fetch API → Transform → Save to DB (sketched below)
csv_processor.py      CSV transformation pipeline
kafka_consumer.py     Kafka stream processing
etl_with_ai.py        AI-powered data extraction
file_watcher.py       Watch files and process changes
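
The fetch → transform → load pattern behind api_to_database.py can be sketched in plain Python. The endpoint, table schema, and the use of requests plus sqlite3 below are illustrative assumptions, not the streamware API:

# Minimal fetch -> transform -> load sketch (illustrative; not the streamware API).
# Assumes the `requests` package and a local SQLite file as the target database.
import sqlite3

import requests

API_URL = "https://api.example.com/users"  # placeholder endpoint from the Quick Start

def fetch_users():
    # Fetch raw records from the API
    resp = requests.get(API_URL, timeout=10)
    resp.raise_for_status()
    return resp.json()

def transform(records):
    # Keep only the fields the target table needs
    return [(r["id"], r["name"]) for r in records]

def load(rows):
    # Save the transformed rows into a database table
    conn = sqlite3.connect("users.db")
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?)", rows)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(transform(fetch_users()))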

🚀 Quick Start

# API to JSON
sq get https://api.example.com/users --json

# Transform CSV
sq file data.csv | sq transform --csv --delimiter ";"
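
A csv_processor.py-style re-delimiting step can also be done with the standard library alone; the file names below are placeholders, and the CLI pipeline above remains the intended interface:

# Re-delimit a semicolon-separated CSV to commas (illustrative companion to the CLI above).
import csv

with open("data.csv", newline="") as src, open("data_out.csv", "w", newline="") as dst:
    reader = csv.reader(src, delimiter=";")   # input uses ";" as in the --delimiter flag
    writer = csv.writer(dst)                  # output uses the default ","
    for row in reader:
        writer.writerow(row)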

# Kafka consume
sq kafka consume --topic events --group my-app
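
kafka_consumer.py presumably does the same thing programmatically; a minimal sketch using the kafka-python package (an assumption, not necessarily what the example uses):

# Consume the "events" topic in consumer group "my-app" (assumes the kafka-python package).
import json
import os

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers=os.environ.get("KAFKA_BROKERS", "localhost:9092"),
    group_id="my-app",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # message.value is the decoded JSON payload of one event
    print(message.topic, message.offset, message.value)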

# PostgreSQL query
sq postgres "SELECT * FROM users" --json
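
The equivalent query in Python, assuming psycopg2 and the POSTGRES_URL variable from the Configuration section:

# Run the same query with psycopg2 and print the rows as JSON (illustrative assumption).
import json
import os

import psycopg2

conn = psycopg2.connect(os.environ["POSTGRES_URL"])  # e.g. postgresql://user:pass@host/db
cur = conn.cursor()
cur.execute("SELECT * FROM users")
columns = [col[0] for col in cur.description]
rows = [dict(zip(columns, row)) for row in cur.fetchall()]
print(json.dumps(rows, default=str))
cur.close()
conn.close()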

# File pipeline
sq file input.json | sq transform --jsonpath "$.items[*].name" > output.txt
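
The $.items[*].name expression in the last pipeline pulls one field out of every element of items; a plain-Python equivalent without a JSONPath library (file names taken from the command above) looks like this:

# Extract items[*].name from input.json and write one name per line
# (plain-dict equivalent of the --jsonpath expression above; illustrative only).
import json

with open("input.json") as f:
    data = json.load(f)

names = [item["name"] for item in data.get("items", [])]

with open("output.txt", "w") as out:
    out.write("\n".join(names) + "\n")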

🔧 Configuration

# PostgreSQL
export POSTGRES_URL=postgresql://user:pass@host/db

# Kafka
export KAFKA_BROKERS=localhost:9092

# RabbitMQ
export RABBITMQ_URL=amqp://user:pass@host:5672
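
The example scripts would typically read these settings via os.environ; a small sketch, where the fallback values are assumptions for local development:

# Read the connection settings from the environment (defaults are illustrative assumptions).
import os

POSTGRES_URL = os.environ.get("POSTGRES_URL", "postgresql://user:pass@localhost/db")
KAFKA_BROKERS = os.environ.get("KAFKA_BROKERS", "localhost:9092")
RABBITMQ_URL = os.environ.get("RABBITMQ_URL", "amqp://guest:guest@localhost:5672")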

🔗 Source Code