Alibaba Cloud Logstash supports more than 100 system plug-ins, including open source plug-ins and self-developed plug-ins. The plug-ins improve cluster capabilities such as data transmission, data processing, and log debugging. This topic describes the system plug-ins that are supported by Alibaba Cloud Logstash.
Alibaba Cloud Logstash supports only system plug-ins, which are built-in plug-ins that you can install or remove based on your business requirements. For more information, see Install or remove a Logstash plug-in. The Alibaba Cloud Logstash clusters on which the plug-ins run belong to customers. Do not use the clusters to perform illegal operations.
The following tables describe the system plug-ins that are supported by Alibaba Cloud Logstash.
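For context, the following minimal pipeline configuration is a sketch that shows where the four plug-in categories (input, filter, output, and codec) fit in a Logstash pipeline. The plug-ins and option values are only illustrative examples; replace them with the plug-ins from the tables that match your scenario.

```
input {                              # input plug-ins collect data
  stdin {
    codec => json_lines              # codec plug-ins decode data in inputs and encode data in outputs
  }
}

filter {                             # filter plug-ins transform events in flight
  mutate {
    add_field => { "pipeline" => "demo" }
  }
}

output {                             # output plug-ins write the processed events
  stdout {
    codec => rubydebug               # print events for debugging
  }
}
```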
Self-developed plug-ins

| Category | Plug-in | Description |
| --- | --- | --- |
| input | logstash-input-maxcompute | Reads data from MaxCompute. |
| input | logstash-input-oss | Reads data from Object Storage Service (OSS). |
| input | logstash-input-sls | Consumes logs from Simple Log Service. |
| output | logstash-output-file_extend | Pushes the output of a pipeline to the Elasticsearch console. |
| output | logstash-output-oss | Writes data to OSS in batches. |
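As a sketch of how a self-developed plug-in is referenced in a pipeline, the following configuration reads objects from an OSS bucket and writes them to an Elasticsearch cluster. The option names inside the oss block (endpoint, bucket, and the AccessKey settings) are assumptions made only for illustration, not the documented parameters; see the reference for logstash-input-oss for the exact options. The Elasticsearch endpoint and index name are placeholders.

```
input {
  # logstash-input-oss; the option names below are assumed for illustration only
  oss {
    endpoint          => "oss-cn-hangzhou-internal.aliyuncs.com"   # assumed option name
    bucket            => "my-log-bucket"                           # assumed option name
    access_key_id     => "${OSS_ACCESS_KEY_ID}"                    # assumed option name
    access_key_secret => "${OSS_ACCESS_KEY_SECRET}"                # assumed option name
  }
}

output {
  # logstash-output-elasticsearch (open source plug-in, described in the next table)
  elasticsearch {
    hosts => ["http://localhost:9200"]          # placeholder endpoint
    index => "oss-logs-%{+YYYY.MM.dd}"
  }
}
```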
Open source plug-ins

| Category | Plug-in | Description |
| --- | --- | --- |
| input | logstash-input-azure_event_hubs | Consumes events from Azure Event Hubs. |
| input | logstash-input-beats | Receives events from the Elastic Beats framework. |
| input | logstash-input-dead_letter_queue | Reads events from the dead-letter queue of Logstash. |
| input | logstash-input-elasticsearch | Reads data from an Elasticsearch cluster. |
| input | logstash-input-exec | Runs a shell command periodically and captures the output of the command as an event. |
| input | logstash-input-ganglia | Reads Ganglia packets over User Datagram Protocol (UDP). |
| input | logstash-input-gelf | Reads Graylog Extended Log Format (GELF) messages from networks as events. |
| input | logstash-input-generator | Generates random log events. |
| input | logstash-input-graphite | Reads metrics from Graphite. |
| input | logstash-input-heartbeat | Generates heartbeat messages. |
| input | logstash-input-http | Receives single-line or multiline events over HTTP or HTTPS. |
| input | logstash-input-http_poller | Calls an HTTP API periodically and decodes the output into Logstash events. |
| input | logstash-input-imap | Reads emails from an Internet Message Access Protocol (IMAP) server. |
| input | logstash-input-jdbc | Reads data from a database with a Java Database Connectivity (JDBC) interface. |
| input | logstash-input-kafka | Reads events from a Kafka topic. |
| input | logstash-input-pipe | Streams events from a long-running command pipe. |
| input | logstash-input-rabbitmq | Reads events from a RabbitMQ queue. |
| input | logstash-input-redis | Reads events from a Redis instance. |
| input | logstash-input-s3 | Streams events from the objects in an Amazon Simple Storage Service (Amazon S3) bucket. |
| input | logstash-input-snmp | Polls network devices by using Simple Network Management Protocol (SNMP) to obtain the statuses of the devices. |
| input | logstash-input-snmptrap | Reads SNMP trap messages as events. |
| input | logstash-input-sqs | Reads events from an Amazon Simple Queue Service (SQS) queue. |
| input | logstash-input-stdin | Reads events from standard input. |
| input | logstash-input-syslog | Reads Syslog messages from networks as events. |
| input | logstash-input-tcp | Reads events over a TCP socket. |
| input | logstash-input-twitter | Reads events from the Twitter Streaming API. |
| input | logstash-input-udp | Reads messages from networks over UDP as events. |
| input | logstash-input-unix | Reads events over a UNIX socket. |
| output | logstash-output-elasticsearch | Writes data to an Elasticsearch cluster. |
| output | logstash-output-kafka | Writes events to a Kafka topic. |
| output | logstash-output-lumberjack | Sends events over the lumberjack protocol. |
| output | logstash-output-nagios | Sends passive check results to Nagios by using Nagios command files. |
| output | logstash-output-pagerduty | Sends notifications based on preconfigured services and escalation policies. |
| output | logstash-output-pipe | Pushes events to the standard input of another program by using pipes. |
| output | logstash-output-rabbitmq | Pushes events to a RabbitMQ exchange. |
| output | logstash-output-redis | Sends events to a Redis queue by using the RPUSH command. |
| output | logstash-output-s3 | Uploads Logstash events to Amazon S3 in batches. |
| output | logstash-output-sns | Sends events to Amazon Simple Notification Service (SNS). Amazon SNS is a fully managed pub/sub messaging service. |
| output | logstash-output-sqs | Pushes events to an Amazon SQS queue. |
| output | logstash-output-stdout | Prints events to the standard output of the shell that runs Logstash. |
| output | logstash-output-tcp | Writes events over a TCP socket. |
| output | logstash-output-udp | Sends events over UDP. |
| output | logstash-output-webhdfs | Sends Logstash events to files in Hadoop Distributed File System (HDFS) by calling the WebHDFS RESTful API. |
| output | logstash-output-cloudwatch | Aggregates and sends metrics to Amazon CloudWatch. |
| output | logstash-output-csv | Writes events to disk in CSV or another delimited format. This plug-in is based on the file output, and many configuration values are shared. The plug-in uses the Ruby CSV library internally. |
| output | logstash-output-elastic_app_search | Sends events to Elastic App Search. |
| output | logstash-output-email | Sends an email when output is received. You can use conditions to control when the email output is executed. |
| output | logstash-output-file | Writes events to files on disk. You can use a field in an event as part of the file name or path. |
| output | logstash-output-graphite | Reads metrics from logs and sends the metrics to Graphite. Graphite is an open source tool that allows you to store and graph metrics. |
| output | logstash-output-http | Sends events to a generic HTTP or HTTPS endpoint. |
| filter | logstash-filter-aggregate | Aggregates information from several events, typically log records, that belong to the same task, and pushes the aggregated information into the final task event. |
| filter | logstash-filter-anonymize | Anonymizes fields by replacing field values with a consistent hash. |
| filter | logstash-filter-cidr | Checks IP addresses in events against a list of network blocks. |
| filter | logstash-filter-prune | Prunes event data based on a whitelist or blacklist of field names. |
| filter | logstash-filter-clone | Duplicates events. This plug-in creates a clone for each type in the clone list. |
| filter | logstash-filter-csv | Parses an event field that contains CSV data and stores each value as an individual field. You can also specify the field names. This plug-in can also parse data that uses delimiters other than commas (,). |
| filter | logstash-filter-date | Parses a date from fields and uses the date or timestamp as the Logstash timestamp for the event. |
| filter | logstash-filter-de_dot | Replaces dots (.) in field names with a different delimiter to rename the fields. This plug-in is computationally expensive because it copies the content of a source field to a destination field whose name no longer contains dots, and then removes the source field. |
| filter | logstash-filter-dissect | Extracts unstructured event data into fields by using delimiters. This is a kind of split operation. |
| filter | logstash-filter-dns | Performs a DNS lookup, which is either a standard A or CNAME record lookup or a reverse lookup on PTR records, on records specified under the reverse and resolve arrays. |
| filter | logstash-filter-drop | Drops all events that meet the filter conditions. |
| filter | logstash-filter-elasticsearch | Searches Elasticsearch for a historical log event and copies some fields from the event to the current event. |
| filter | logstash-filter-fingerprint | Creates consistent hashes as fingerprints for one or more fields and stores the results in a new field. |
| filter | logstash-filter-geoip | Adds the geographical locations of IP addresses based on data in MaxMind GeoLite2 databases. |
| filter | logstash-filter-grok | Parses arbitrary unstructured text into structured data. |
| filter | logstash-filter-http | Provides integration with external web services or RESTful APIs. |
| filter | logstash-filter-jdbc_static | Enriches events with data that is preloaded from a remote database. |
| filter | logstash-filter-jdbc_streaming | Executes an SQL query and stores the result set in the destination field. The results are cached in a local Least Recently Used (LRU) cache with an expiration period. |
| filter | logstash-filter-json | Expands an existing field that contains JSON data into an actual data structure within a Logstash event. This plug-in is a JSON parsing filter. |
| filter | logstash-filter-kv | Automatically parses messages or specific event fields that are in the key-value format, such as foo=bar. |
| filter | logstash-filter-memcached | Provides integration with external data in Memcached. |
| filter | logstash-filter-metrics | Aggregates metrics. |
| filter | logstash-filter-mutate | Performs mutations on fields. You can rename, remove, replace, and modify fields in your events. |
| filter | logstash-filter-ruby | Executes Ruby code. This plug-in accepts inline Ruby code or a Ruby file. The two options are mutually exclusive and work in slightly different ways. |
| filter | logstash-filter-sleep | Sleeps for a specified amount of time. This causes Logstash to stall for that period, which is useful for throttling. |
| filter | logstash-filter-split | Clones an event by splitting one of its fields and placing each value that results from the split into a clone of the original event. The field to split can be either a string or an array. |
| filter | logstash-filter-syslog_pri | Parses the PRI field of a Syslog message. For more information about Syslog messages, see RFC 3164. If no priority is specified, the default value 13 is used, as defined in the RFC. |
| filter | logstash-filter-throttle | Limits the number of events. |
| filter | logstash-filter-translate | Uses a configured hash or a file to determine replacement values. This plug-in is a general search and replace tool. |
| filter | logstash-filter-truncate | Truncates fields that exceed a specified length. |
| filter | logstash-filter-urldecode | Decodes URL-encoded fields. |
| filter | logstash-filter-useragent | Parses user agent strings into structured data based on BrowserScope data. |
| filter | logstash-filter-xml | Expands a field that contains XML data into an actual data structure. This plug-in is an XML filter. |
| codec | logstash-codec-cef | Reads data in ArcSight Common Event Format (CEF). This plug-in is an implementation of a Logstash codec and is based on Revision 20 of Implementing ArcSight CEF, dated June 5, 2013. |
| codec | logstash-codec-collectd | Reads events from the collectd binary protocol over networks by using UDP. |
| codec | logstash-codec-dots | Generates a dot (.) to represent each event that is processed by this plug-in. |
| codec | logstash-codec-edn | Reads and produces data in the Extensible Data Notation (EDN) format. |
| codec | logstash-codec-edn_lines | Reads and produces EDN-formatted data that is delimited by line breaks. |
| codec | logstash-codec-es_bulk | Decodes the Elasticsearch bulk format into individual events and decodes the metadata into the @metadata field. |
| codec | logstash-codec-fluent | Handles the MessagePack schema for Fluentd. |
| codec | logstash-codec-graphite | Encodes and decodes Graphite-formatted lines. |
| codec | logstash-codec-json | Decodes and encodes full JSON messages. Decoding is applied to inputs, and encoding is applied to outputs. |
| codec | logstash-codec-json_lines | Decodes streamed JSON data that is delimited by line breaks. |
| codec | logstash-codec-line | Reads line-oriented text data. |
| codec | logstash-codec-msgpack | Reads and produces MessagePack-encoded content. |
| codec | logstash-codec-multiline | Merges multiline messages into a single event. |
| codec | logstash-codec-netflow | Decodes NetFlow v5, v9, and v10 (IPFIX) flows. |
| codec | logstash-codec-plain | Handles plain text with no delimiting between events. |
| codec | logstash-codec-rubydebug | Outputs Logstash event data by using the Ruby Awesome Print library. |
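To show how the open source plug-ins in the preceding table are typically combined, the following is a minimal sketch of a pipeline that receives log lines from Beats, parses them with the grok and date filters, enriches them with geoip, and writes the results to Elasticsearch. The port, endpoint, index name, and grok pattern are placeholder values for a standard Apache access log scenario.

```
input {
  beats {
    port => 5044                               # listen for events from Filebeat or other Beats shippers
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }   # parse an Apache access log line into fields
  }
  date {
    match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]   # use the request time as the event timestamp
  }
  geoip {
    source => "clientip"                       # add geographical fields for the client IP address
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]         # placeholder Elasticsearch endpoint
    index => "weblogs-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug                         # also print events to standard output for debugging
  }
}
```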