Location
Remote EU, Worldwide
Employment Type
Full time
Location Type
Remote
Department
P2P, Technology, Data Team
P2P.org is the largest institutional staking provider, with a TVL of over $10B and a market share exceeding 20% in restaking. We focus on researching and improving our infrastructure to extract maximum APR while enhancing security. We also launch new networks and yield products, and work with clients including BitGo, Copper, Crypto.com, Ledger, ByBit, Bitget, OKX, HTX, Bitvavo, SBI, and others. We are expanding our product line into RWA, data, yield, and service products for banks, exchanges, custodians, and wallets.
P2P.org unites talented individuals globally and maintains a client-centric approach, with an extensive product line spanning a unified API, widgets, and custom dApps. We are a distributed team that shares a passion for decentralized finance and is focused on building the future of finance. P2P.org has a strong reputation and network, prioritizing customer satisfaction and developing innovative solutions to strengthen our brand.
About Us
P2P.org is the DeFi Intelligence Platform. Our goal is to provide a single source of truth for on-chain financial data, enabling investors and institutions to:
- Track token balances and DeFi positions across multiple chains
- Analyze historical and real-time rewards
- Accurately calculate PnL and uncover hidden costs (e.g., slippage, rebalancing, fees)
- Compare strategies and pools across protocols with confidence
Our mission is to make crypto data transparent, reliable, and actionable, reducing the time to generate accurate performance reports from weeks to hours. We’re a fast-moving startup with a strong technical culture, building the backbone of crypto data infrastructure.
What You’ll Do
- Design, maintain, and scale streaming ETL pipelines for blockchain data
- Build and optimize ClickHouse data models and materialized views for high-performance analytics
- Develop and maintain data exporters using orchestration tools
- Implement data transformations and decoding logic
- Establish and improve testing, monitoring, automation, and migration processes for pipelines
- Ensure timely delivery of new data features in alignment with product goals
- Combine multiple data sources, such as indexers and third-party Kafka topics, and aggregate them into tables for our API (see the sketch after this list)
- Build automation tools that keep data analyst inputs, such as dictionaries, up to date
- Collaborate within the team to deliver accurate, reliable, and scalable data services that power the Lambda app
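For flavor, here is a minimal, hypothetical sketch of the kind of work this involves: draining a third-party Kafka topic of decoded events and batch-inserting the rows into a ClickHouse table. Every name in it (broker, topic, table, columns) is a placeholder, and the production pipelines described above run on managed Flink rather than a hand-rolled consumer.

    # Sketch only: consume a (hypothetical) Kafka topic of decoded transfer
    # events and batch-insert them into ClickHouse. All names are placeholders.
    import json

    import clickhouse_connect
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "kafka:9092",    # placeholder broker
        "group.id": "transfers-loader",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["decoded-transfers"])  # hypothetical third-party topic

    ch = clickhouse_connect.get_client(host="localhost")  # placeholder host

    batch = []
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            event = json.loads(msg.value())
            batch.append((event["block_number"], event["tx_hash"], event["value"]))
            if len(batch) >= 1000:  # flush in batches to keep inserts cheap
                ch.insert("raw_transfers", batch,
                          column_names=["block_number", "tx_hash", "value"])
                batch.clear()
                consumer.commit()  # commit offsets only after a successful flush
    finally:
        consumer.close()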
Tech Stack
- Streaming & ETL: managed Flink-based pipelines (real-time event and transaction processing), Apache Kafka
- Data warehouse: ClickHouse (Cloud)
- Workflow orchestration: Airflow (illustrated in the sketch below)
- Programming: Python (data processing, services, automation)
- Domain: multi-chain crypto data (EVM and non-EVM ecosystems)
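As an equally hypothetical illustration of the orchestration layer, a daily exporter in Airflow could look like the sketch below; the DAG id, schedule, and task body are assumptions for illustration, not the team's actual jobs.

    # Sketch only: a minimal daily Airflow exporter DAG with placeholder names.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def export_chain_snapshots():
        @task
        def export_balances(ds=None):
            # `ds` is Airflow's logical date; a real task would query the
            # warehouse and write a partitioned snapshot for that day.
            print(f"exporting balance snapshot for {ds}")

        export_balances()


    export_chain_snapshots()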
What We're Looking For
- 4+ years in data engineering (ETL/ELT, data pipelines, streaming systems)
- Strong SQL skills with columnar databases (ClickHouse, Druid, BigQuery, etc.)
- Hands-on experience with streaming frameworks (Flink, Kafka, or similar)
- Solid Python skills for data engineering and backend services
- Proven track record of delivering pipelines and features to production on schedule
- Strong focus on automation, reliability, maintainability, and documentation
- Startup mindset: able to balance speed and quality
Nice to Have
- Experience operating ClickHouse at scale (performance tuning, partitioning, materialized views)
- Experience with CI/CD and automated testing for data pipelines (e.g., GitHub Actions, dbt)
- Knowledge of multi-chain ecosystems (EVM and non-EVM)
- Familiarity with blockchain/crypto data structures (transactions, logs, ABI decoding)
- Contributions to open-source or blockchain data infrastructure projects

P2P.org is committed to providing equal opportunities. All applicants will be considered without regard to race, color, national origin, religion, sex, sexual orientation, gender identity, veteran status, or disability.