Handling high-cardinality telemetry data efficiently is crucial for modern observability systems. In this session, we will explore strategies for designing a scalable telemetry pipeline that can process large volumes of diverse metrics without performance bottlenecks. We’ll cover best practices for data ingestion, storage optimization, and query performance, along with real-world techniques to mitigate challenges like high dimensionality and resource constraints. Attendees will gain insights into building a robust telemetry pipeline that balances accuracy, efficiency, and cost-effectiveness.
I have spent my career building and supporting Internet infrastructure, first as an engineer and now as a leader. I organize teams around executing technical visions, solving problems by defining processes where needed and automating where possible, and ensuring individuals get what they seek from their careers. I foster a culture of accountability and results through trust, transparency, and empathy.