This tool puts records into a specified Kinesis stream, reading them from an S3 object.
It uses the AWS SDK for Go for both Kinesis and S3, downloading parts in parallel and sending records to Kinesis with batched PutRecords calls.
It has some limits built in to avoid errors, e.g. at most 500 records are sent per second per partition.
Usage can be printed with the `-h` option:

```
kinesis-tool -h
Usage of kinesis-tool:
  -compressed
        Should be true if file is compressed
  -debug
        Enable debug logs
  -file string
        Local File Path
  -kinesis-key string
        Kinesis Partition Key
  -kinesis-partitions int
        Kinesis Number of Partitions (default 1)
  -kinesis-stream string
        Kinesis Stream Name
  -profile string
        Profile to be used for AWS (default "default")
  -region string
        Region to be used for AWS (default "us-east-1")
  -s3-bucket string
        S3 Bucket Name
  -s3-key string
        S3 Key Name
  -version
        Show Version
```
- `debug`: Enables debug logs.
- `profile`: AWS profile to use.
- `region`: AWS region to use.
- `compressed`: Accepts GZIP-compressed files as well; the file is decompressed before its records are put.
- `version`: Displays the version; enable the `debug` flag as well to also display the commit ID.
To send a file from the local machine instead of S3, use the `file` flag:
- `file`: Path to a local file, which is read line by line and sent to Kinesis.
- `s3-bucket`: Name of the S3 bucket containing the file.
- `s3-key`: Key (object path, including any prefix) of the S3 object.
- `kinesis-stream`: Name of the Kinesis stream to put the records into.
- `kinesis-key`: Partition key used when sending the records.