
Monday, August 15, 2016

AWS VPC Flow Logs for Traffic Monitoring

VPC Flow Logs is a feature that enables you to capture information about the IP traffic going to and from network interfaces in your VPC. Flow log data is stored using Amazon CloudWatch Logs. After you've created a flow log, you can view and retrieve its data in Amazon CloudWatch Logs.

Flow logs can help you with a number of tasks; for example, to troubleshoot why specific traffic is not reaching an instance, which in turn can help you diagnose overly restrictive security group rules. You can also use flow logs as a security tool to monitor the traffic that is reaching your instance.

To create a flow log for your VPC or subnet, we followed these steps:
1. Create a log group in CloudWatch:
  - We created a new log group in CloudWatch to receive the flow log entries.
  - Please remember that you can use the same log group for multiple flow logs.
  - To create a log group: AWS Management Console -> CloudWatch -> Logs -> Create new log group (CLI equivalents for both steps are sketched after this list).

2. Create a flow log for the VPC:
  - Open the AWS Management Console and go to the VPC service.
  - In the navigation pane, choose Your VPCs, then select your VPC.
  - From the Actions menu, select Create Flow Log.
Please refer to this link for more information: http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/flow-logs.html#create-flow-log
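
If you prefer the AWS CLI, the two steps above can be done with the commands below (a minimal sketch; the log group name, VPC ID, and IAM role ARN are placeholders to replace with your own values, and the role passed via --deliver-logs-permission-arn must allow VPC Flow Logs to publish to CloudWatch Logs):

$ aws logs create-log-group --log-group-name VPCFlowLogs
$ aws ec2 create-flow-logs --resource-type VPC --resource-ids vpc-11aa22bb \
    --traffic-type ALL --log-group-name VPCFlowLogs \
    --deliver-logs-permission-arn arn:aws:iam::123456789012:role/flowlogsRole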

Note that a VPC flow log record[1] is a space-separated string with the format outlined in the referenced documentation: 14 fields are available, and fields #11 (start) and #12 (end) record the time in Unix seconds.

According to the Filter & Pattern Syntax[2], we can filter log events that match our conditions in space-delimited logs.

Example filter (since we don't care about the first 10 fields in this case, we use ...):

[..., start, end, action, status]

Say we need to capture the VPC flow log entries between Sat, 06 Aug 2016 04:35:56 GMT and Sun, 07 Aug 2016 04:35:56 GMT.

Using an epoch time converter (http://www.epochconverter.com/, for example), we get the Unix times in seconds: 1470458156 and 1470544556.
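
If you prefer the command line, the same conversion can be done with GNU date, for example:

$ date -u -d "Sat, 06 Aug 2016 04:35:56 GMT" +%s
1470458156
$ date -u -d "Sun, 07 Aug 2016 04:35:56 GMT" +%s
1470544556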

so the filter we will be using becomes:

[..., start>1470458156, end<1470544556, action, status]

You can then follow link [3] to search all log entries after a given start time using the Amazon CloudWatch console:

Go to the AWS CloudWatch console -> Logs -> select the VPC flow log log group -> above the "Log Streams List", click "Search Event"

and use the [..., start>1470458156, end<1470544556, action, status] in the filter field, then press Enter.

You can modify the filter accordingly for more conditions.
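
The same filter also works from the AWS CLI with filter-log-events; the log group name below is the example one used above:

$ aws logs filter-log-events --log-group-name VPCFlowLogs \
    --filter-pattern '[..., start>1470458156, end<1470544556, action, status]'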


Resource Links:
[1] AWS - VPC - VPC Flow Logs - Flow Log Records: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/flow-logs.html#flow-log-records
[2] AWS - CloudWatch - Searching and Filtering Log Data - Filter and Pattern Syntax - Using Metric Filters to Extract Values from Space-Delimited Log Events: https://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/FilterAndPatternSyntax.html#d0e26783
[3] AWS - CloudWatch - To search all log entries after a given start time using the Amazon CloudWatch console: https://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/SearchDataFilterPattern.html

Wednesday, August 3, 2016

Extract public/private key from a PKCS#12

You can use the following commands to extract the public/private key from a PKCS#12 container:
  • Private key:
    openssl pkcs12 -in yourP12File.pfx -nocerts -out privateKey.pem
  • Certificates:
    openssl pkcs12 -in yourP12File.pfx -clcerts -nokeys -out publicCert.pem
    You can add -nocerts to only output the private key or add -nokeys to only output the certificates.
  • Both, in one unencrypted PEM file (private key and certificates):
    openssl pkcs12 -in Sample.pfx -out Sample.pem -nodes
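
As a quick sanity check (assuming an RSA key pair), you can confirm that the extracted private key matches the certificate by comparing their moduli; the two hashes should be identical:

    openssl x509 -noout -modulus -in publicCert.pem | openssl md5
    openssl rsa -noout -modulus -in privateKey.pem | openssl md5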

Monday, July 25, 2016

Import an OVA to Amazon AWS

VM Import/Export enables you to easily import virtual machine images from your existing environment to Amazon EC2 instances and export them back to your on-premises environment. This offering allows you to leverage your existing investments in the virtual machines that you have built to meet your IT security, configuration management, and compliance requirements by bringing those virtual machines into Amazon EC2 as ready-to-use instances. You can also export imported instances back to your on-premises virtualization infrastructure, allowing you to deploy workloads across your IT infrastructure.


Step 1: Install the AWS CLI.
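
One common way to install it at the time of writing (assuming Python and pip are already available) is:

$ pip install awscli
$ aws --version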

Step 2: Configure the AWS CLI. You can get the Access Key ID and Secret Access Key from the AWS IAM service, under the specific user.
aws configure
AWS Access Key ID [None]:
AWS Secret Access Key [None]:
Default region name [None]: us-west-2
Default output format [None]: ENTER

Step 3: Create the vmimport service role.
Now create two files, trust-policy.json and role-policy.json; in the second file you'll need to replace "$bucketname" with your bucket name.

trust-policy.json:
===============
{
   "Version":"2012-10-17",
   "Statement":[
      {
         "Sid":"",
         "Effect":"Allow",
         "Principal":{
            "Service":"vmie.amazonaws.com"
         },
         "Action":"sts:AssumeRole",
         "Condition":{
            "StringEquals":{
               "sts:ExternalId":"vmimport"
            }
         }
      }
   ]
}
===============

role-policy.json:
=================
{
   "Version":"2012-10-17",
   "Statement":[
      {
         "Effect":"Allow",
         "Action":[
            "s3:ListBucket",
            "s3:GetBucketLocation"
         ],
         "Resource":[
            "arn:aws:s3:::$bucketname"
         ]
      },
      {
         "Effect":"Allow",
         "Action":[
            "s3:GetObject"
         ],
         "Resource":[
            "arn:aws:s3:::$bucketname/*"
         ]
      },
      {
         "Effect":"Allow",
         "Action":[
            "ec2:ModifySnapshotAttribute",
            "ec2:CopySnapshot",
            "ec2:RegisterImage",
            "ec2:Describe*"
         ],
         "Resource":"*"
      }
   ]
}
===================

Now, use the AWS CLI to create the role and apply the policy:
$ aws iam create-role --role-name vmimport --assume-role-policy-document file://trust-policy.json
$ aws iam put-role-policy --role-name vmimport --policy-name vmimport --policy-document file://role-policy.json
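
Optionally, you can confirm that the role and its inline policy are in place, for example:

$ aws iam get-role --role-name vmimport
$ aws iam get-role-policy --role-name vmimport --policy-name vmimport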

Step 4: Check VM prerequisites before exporting as OVA
========================
Regarding the VM: before exporting it from vSphere and importing it into the AWS cloud, please make sure that all prerequisites for import have been fulfilled.
Compare your VM with this checklist:
- all unnecessary services are disabled,
- no unnecessary applications are placed in Windows Startup,
- there are no pending reboots (reboot flag set by Windows Update or by any other software),
- VM volumes are defragmented and each disk is resized to what is actually needed (a bigger disk means a longer conversion time),
- the VM uses a single network interface configured to use DHCP (this should be done prior to import),
- no ISO is attached to the VM,
- Microsoft .NET Framework 3.5 Service Pack 1 or later is installed (required to support EC2Config),
- the VM's root volume uses an MBR partition table,
- anti-virus and anti-spyware software and firewalls are disabled,
- only one partition is bootable,
- RDP access is enabled,
- the administrator account and all other user accounts use secure passwords (all accounts must have passwords or the import might fail),
- VMware Tools have been uninstalled from the VM,
- the language of the OS is EN-US,
- the required hotfixes (according to OS version) are installed, and the latest EC2Config is installed:
https://aws.amazon.com/developertools/5562082477397515
=================

Step 5: Uploading the OVA to S3 and Creating the AMI
You can upload your VM in OVA format to your Amazon S3 bucket using the upload tool of your choice. After you upload your VM to Amazon S3, you can use the AWS CLI to import your OVA image. The import tools accept either a URL (a public Amazon S3 file, or a signed GET URL for private Amazon S3 files) or the Amazon S3 bucket and path to the disk file.
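
For example, with the AWS CLI the upload can be as simple as the following (the bucket and file names are the placeholder values used in containers.json below):

$ aws s3 cp my-windows-2008-vm.ova s3://my-import-bucket/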

Use aws ec2 import-image to create a new import image task.
The syntax of the command is as follows:

$ aws ec2 import-image --description "Windows 2008 OVA" --disk-containers file://containers.json
The file containers.json is a JSON document that contains information about the image: S3Bucket is the bucket you uploaded the OVA to, and S3Key is the name of the image file within that bucket.

[{
    "Description": "First CLI task",
    "Format": "ova",
    "UserBucket": {
        "S3Bucket": "my-import-bucket",
        "S3Key": "my-windows-2008-vm.ova"
    }
}]

Step 6: Checking the Status


Use the “aws ec2 describe-import-image-tasks” command to return the status of the task. The syntax of the command is as follows:
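
A minimal example, where the task ID is a placeholder for the ImportTaskId value returned by the import-image call:

$ aws ec2 describe-import-image-tasks --import-task-ids import-ami-abcd1234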

Regarding licensing, within the "aws ec2 import-image" API call we can define a "--license-type" value.
Based on this option your VM will either use your own license (BYOL) or will activate itself against the AWS-provided KMS[4]. The option should be set to "AWS" or "BYOL".
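
For example, a BYOL import using the same containers.json as above might look like this:

$ aws ec2 import-image --description "Windows 2008 OVA" --license-type BYOL --disk-containers file://containers.json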