This Mezmo Log Analysis Ingestion Source takes the data you send to our Log Management endpoint, logs.mezmo.com, and re-routes it into a Pipeline Source, redirecting the data flow from Mezmo Log Management to your Pipeline. For more information, check out our documentation.
Mezmo Flow provides an easy onboarding experience focused on helping you understand your data, and then recommends Processors based on common patterns and message types. With Mezmo Flow, you're four steps away from creating a telemetry data Pipeline that substantially reduces the volume of telemetry data sent to your storage locations and observability tools, saving both costs and the mental toil of optimizing your data for your observability requirements. For more information, check out the Mezmo Flow topic in our product documentation.
Mezmo is pleased to announce a new feature for the Mezmo Platform: the ability to search product documentation within the Mezmo Web App.
The in-app search is powered by Intercom. It returns results from all the Mezmo product guides, as well as product announcements.
Mezmo is pleased to announce the release of a new Destination for Mezmo Telemetry Pipelines, AWS SQS. Check out our docs for more information.
Mezmo is pleased to announce the release of a new Destination for Mezmo Telemetry Pipelines, AWS Kinesis Streams. Check out our docs for more information.
Mezmo is pleased to announce the release of a new Destination for Mezmo Telemetry Pipelines, AWS Kinesis Firehose. Check out our docs for more information.
Mezmo is pleased to announce the release of a new Destination for Mezmo Telemetry Pipelines, AWS CloudWatch Logs. Check out our docs for more information.
Mezmo is pleased to announce the release of a new Destination for Mezmo Telemetry Pipelines, AWS CloudWatch Metrics. Check out our docs for more information.
Mezmo is pleased to announce the release of Processor Modules for Mezmo Telemetry Pipelines. With this feature, you can create modules for sets of Processors that provide specific data optimization functionality, and then share them with other members of your organization. When used with features like Shared Sources, you can create reusable, purpose-built Pipeline components that help your teams quickly build Telemetry Pipelines for every purpose. For more information, check out the Create Processor Module topic at docs.mezmo.com.
Mezmo is pleased to announce the release of a new Pipeline Destination for OpenTelemetry. With this Destination, you can send your logs, metrics, and traces to any destination that accepts data using the OpenTelemetry protocol, such as Grafana, TelemetryHub, or New Relic. For more information, check out the documentation at docs.mezmo.com.