Monday, July 30, 2018

Mocking Frameworks



Scenarios where mocking frameworks are required:

1)  Developers can test their code without depending on real backend systems by using mocking frameworks.
2)  The main objective of a mocking framework is to let developers test their code even when the backend it depends on is not yet available.



Mocking frameworks (F/W) include:

1.  Mockito
2.  PowerMockito
3.  JMock
4.  EasyMock
...
...
..



Mockito :

1.  Introduction.
2.  Software Installation
3.  Annotations.
4.  Steps to implement Mockito
5.  Examples of different use cases (a minimal sketch follows this list)
6.  Implementing Mockito in a project
7.  Drawbacks of Mockito
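
As a preview of item 5, a minimal hedged sketch of the core idea: stubbing a backend dependency so the unit test never touches a real system. The class names (PaymentGateway, OrderService) are hypothetical, and the sketch assumes JUnit 4 plus Mockito on the classpath.

    import static org.junit.Assert.assertTrue;
    import static org.mockito.Mockito.*;
    import org.junit.Test;

    public class OrderServiceTest {

        // The "backend" dependency we do not want to call for real.
        interface PaymentGateway {
            boolean charge(String account, double amount);
        }

        // The code under test, which depends on the backend.
        static class OrderService {
            private final PaymentGateway gateway;
            OrderService(PaymentGateway gateway) { this.gateway = gateway; }
            boolean placeOrder(String account, double amount) {
                return gateway.charge(account, amount);
            }
        }

        @Test
        public void placeOrderChargesTheGateway() {
            PaymentGateway gateway = mock(PaymentGateway.class);   // no backend needed
            when(gateway.charge("acct-1", 99.0)).thenReturn(true); // stub behaviour

            OrderService service = new OrderService(gateway);
            assertTrue(service.placeOrder("acct-1", 99.0));

            verify(gateway).charge("acct-1", 99.0);                // interaction check
        }
    }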


Wednesday, July 25, 2018

Splunk










  1. Architecture

  2. About Splunk - some key responsibilities:

  • What is forwarder configuration, search heads, and indexing. 
  • How to create dashboards, reports, scheduled searches, and alerts. 
  • Creating a vulnerability dashboard that aggregates data across multiple services to identify critical threats and proactively mitigate risks. 
  • How to write Splunk search strings and operational strings. 
  • How to analyze security-based events, risks, and reporting instances. 
  • Develop, step by step, custom web application solutions for internal ticket-metrics reporting. 
  • Provide regular support guidance to Splunk project teams on complex solutions and issue resolution, with the objective of ensuring best fit and high quality. 
  • Interact with the data warehousing team regarding extracting the data, and suggest a standard data format so that Splunk will identify most of the fields. 
  • Onboard new log sources with log analysis and parsing to enable SIEM correlation. 
  • Performed field extraction using IFX in an event action. 
  • Involved in setting up alerts for different types of errors. 
  • Performed Splunk administration tasks such as installing, configuring, monitoring, and tuning. 
  • Install and maintain Splunk add-ons, including DB Connect and Active Directory LDAP, to work with directory services and SQL databases. 
  • Installed and configured Splunk DB Connect in single and distributed server environments. 
  • Configure the SSO integration add-on app for user authentication and single sign-on in Splunk Web. 
  • Automating in Splunk using Perl with ServiceNow for event triggering. 
  • Deployed Splunk updates and license distribution over multiple servers using a deployment server. 
  • Create dashboard views, reports, and alerts for events, and configure alert mail. 
  • Monitor the Splunk infrastructure for capacity planning and optimization. 
  • Server monitoring using tools like Splunk, SolarWinds Orion, HP BSM, and HP OpenView. 
  • Integrated ServiceNow with Splunk to generate incidents from Splunk. 
  • Active monitoring of jobs through alert tools, responding with appropriate action logs, analyzing the logs, and escalating critical issues to higher-level teams. 
  • Configured and administered Tomcat JDBC, JMS, and JNDI services. 
  • Configured Node Manager to remotely administer managed servers. 
  • Experience in handling network resources and protocols such as TCP/IP, Ethernet, and DNS. 
  • Splunk configuration involving different web applications and batch jobs; created saved searches, summary searches, and summary indexes. 
  • Splunk search construction, with the ability to create well-structured search queries that minimize performance impact. 
  • Scaling up ELK (Elasticsearch/Logstash/Kibana) to index 90 GB a day of raw data (tested as an open-source alternative to Splunk). 
  • Monitored the database (data tables and error tables), WebLogic error log files, and application error log files to track and fix bugs. 
  • Ensuring that the application website is up and available to the users. 
  • Continuous monitoring of the alerts received through mail to check that all the application servers and web servers are up. 
  • Worked on DB Connect configuration for Oracle, MySQL, and MSSQL. 
  • Supporting migration from a Splunk on-premise data center to Amazon AWS. 
  • Launching, configuring, and supporting large-scale instances on AWS. 
  • Headed proof-of-concepts (POCs) on Splunk ES implementation; mentored and guided other team members on understanding the use case of Splunk. 
  • Expertise in customizing Splunk for monitoring, application management, and security as per customer requirements and industry best practice. 
  • Expertise in installation, configuration, migration, troubleshooting, and maintenance of Splunk; passionate about machine data and operational intelligence. 
  • Implemented workflow actions to drive troubleshooting across multiple event types in Splunk. 
  • Expert in installing and configuring Splunk forwarders on Linux, Unix, and Windows. 
  • Expert in installing and using Splunk apps for Unix and Linux (Splunk *nix). 
  • Knowledge of configuration files in Splunk (props.conf, transforms.conf, outputs.conf). 
  • Worked on large datasets to generate insights using Splunk. 
  • Production error monitoring and root cause analysis using Splunk. 
  • Install, configure, and administer Splunk Cloud Environment 6.5.0 and Splunk Forwarder 6.x.x on Windows servers. 
  • Supported Splunk Cloud with 4 indexers and 80 forwarders, generating 700 GB of data per day. 
  • Involved in standardizing Splunk forwarder deployment, configuration, and maintenance across Windows servers. 
  • Configured inputs.conf and outputs.conf to pull XML-based events to the Splunk Cloud indexer. 
  • Debug Splunk-related and integration issues. 
  • Installed Splunk on *nix and Splunk SOS (Splunk on Splunk), and maintained Splunk instances for monitoring the health of the clusters. 
  • Integrate the Splunk web console with the Splunk Mobile App using the Mobile Access Server add-on. 
  • Build, customize, and deploy Splunk apps per internal customers' requirements. 
  • Splunk UI experience; able to debug expensive search queries. 
  • Configured clusters for load balancing and failover solutions. 
  • Implemented a log viewer dashboard as a replacement for an existing tool to view logs across multiple applications hosted on a PaaS setup. 
  • Create Splunk Search Processing Language (SPL) queries, reports, alerts, and dashboards. 
  • Ability to provide engineering expertise and assistance to the Splunk user community; advanced Search Processing Language (SPL) skills. 
  • Extensively used various extraction keywords and search commands such as stats, chart, timechart, transaction, strptime, strftime, eval, where, xyseries, and table (an illustrative query follows this list). 
  • Good knowledge of Splunk Searching and Reporting modules, knowledge objects, administration, add-ons, dashboards, clustering, and forwarder management. 
  • Created and managed Splunk DB Connect identities, database connections, database inputs, outputs, lookups, and access controls. 
  • Strong experience with web/application servers like Apache Tomcat, Jetty, JBoss, IBM WebSphere, and WebLogic. 
  • Strong experience using SQL, PL/SQL procedures/functions, triggers, and packages. 
  • Creating accurate reports, dashboards, visualizations, Elasticsearch views, and pivot tables for the business users. 
  • Well versed with the Dynatrace monitoring tool; expert in the architecture of application monitoring and UE analytics; experience with configuration and infrastructure support of monitoring, alerting, and reporting tools through the Dynatrace interface.
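
  To illustrate the search commands listed above (eval, stats, timechart, and the rest), here is a hedged example of a well-structured SPL query; the index and sourcetype names are hypothetical:

      index=web sourcetype=access_combined status>=500 earliest=-24h
      | timechart span=1h count by host

  This counts server errors per host in hourly buckets over the last day; restricting by index, sourcetype, and time range up front is what keeps the performance impact low, as the bullet on search construction recommends.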
  Docker in Production Using Amazon Web Services







    1)  Pre-requisites

        Setup
        Installing Docker
        Installing Brew
        Installing Java
        Installing Ansible
        Other Recommended Tools
        Setting up Required Services
        Setting up a Course Root Folder

    2)  Creating the Sample Application

        Application Architecture
        Installing the Application
        Building the Application
        Running the Application
        Testing the Application

    3)  Creating Docker Release Images

         Release Pipeline Workflow
         Workflow Specifications
         Understanding the Test Stage
         Running the Test Stage
         Understanding the Base Image
         Understanding the Release Stage
         Running the Release Stage
         Tagging and Publishing Release Images
     
    5) Setting up AWS Access
         Setting up AWS Identity and Access Management
         Creating an Account Alias
         Creating IAM Roles
         Creating Groups
         Creating Users
         Setting up an EC2 Key Pair
         Setting up AWS CLI Access

    6) Running Docker Applications Using the EC2 Container Service

         EC2 Container Service Overview
         Publishing Images to EC2 Container Registry
         Publishing Images Using the Release Pipeline
         Creating an ECS Cluster
         Verifying an ECS Container Instance
         Creating an ECS Task Definition
         Creating an ECS Service
         ECS Service Deployments
         Creating an ECS Task


    7) Customizing ECS Container Instances

         Custom Amazon Machine Image Design
         Understanding EC2 Instance Initialization
         Using Packer to Build Amazon Machine Images
         Creating a Packer Template
         Adding Packer Provisioning Tasks
         Configuring the ECS Agent
         Customizing Docker
         CloudWatch Logs Integration
         HTTP Proxy Support
         ECS Container Instance Health Checks
         Post Build Cleanup
         Building and Publishing the Image

    8) Deploying AWS Infrastructure Using Ansible and CloudFormation

         AWS Deployment Strategy
         AWS Account Infrastructure Architecture
         Ansible Playbooks
         Ansible Playbook Structure
         Creating an Ansible Playbook
         Configuring the AWS STS Role
         Configuring the AWS CloudFormation Role
         Understanding the AWS CloudFormation Role
         Using AWS CloudFormation Role Templates
         Creating the CloudFormation Resources Stack
         Creating the EC2 Container Registry Stack
         Creating the Network Stack
         Publishing the Docker Squid Image
         Creating the HTTP Proxy Stack
         Deploying the HTTP Proxy Stack
       
    9)  Architecting and Preparing Applications for ECS

          Microtrader AWS Architecture
          Docker and AWS Challenges
          Controlling Cluster Auto Discovery
          Configuring Cluster Auto Discovery Using Confd
         
    10) Defining ECS Applications Using Ansible and CloudFormation

          Creating the Microtrader Deployment Playbook
          Configuring EC2 Autoscaling Groups
          Configuring Autoscaling Launch Configurations
          Configuring CloudFormation Init Metadata
          Configuring EC2 Autoscaling Security Groups
          Configuring Autoscaling EC2 Instance Profiles
          Configuring a Public Load Balancer
          Configuring an Internal Load Balancer
          Configuring DNS Records
          Configuring the Relational Database Service
          Configuring CloudWatch Log Groups

    11)  Deploying ECS Applications Using Ansible and CloudFormation

          ECS System Resources
          Understanding ECS Memory Allocation
          Configuring ECS Task Definitions Part 1
          Configuring ECS Task Definitions Part 2
          Understanding ECS Service Deployment
          Configuring ECS Services
          Deploying the Microtrader Stack
          Troubleshooting the Microtrader Application

    12) Creating CloudFormation Custom Resources Using AWS Lambda

          CloudFormation Custom Resources Overview
          Creating a Lambda Function
          Creating a Custom Resource
          Creating a Custom Resources Project
          Handling CloudFormation Requests
          Validating Input Data
          Integrating with the ECS Service
          Polling ECS Task Status
          Building Lambda Functions
          Publishing Lambda Functions
          Creating Lambda Functions in CloudFormation
          Deploying CloudFormation Custom Resources
          Handling Update and Delete Requests
          Handling Long Running ECS Tasks


    13) Managing Secrets in AWS

           Introducing the EC2 Systems Manager
           Secrets Management Solution Overview
           Creating the Secrets Provisioner
           Creating Secrets Using CloudFormation
           Injecting Secrets at Container Startup
           Consuming Secrets In CloudFormation
           Configuring IAM Roles for Accessing Secrets
           Deploying Secrets Using CloudFormation
         
    14) Managing ECS Infrastructure Lifecycle
       
            Understanding Auto Scaling Lifecycle Hooks
            Creating a Lifecycle Hook Lambda Function
            Configuring Lifecycle Hooks in CloudFormation
            Configuring Auto Scaling Rolling Updates
            Deploying and Testing Lifecycle Hooks
           
    15) Auto Scaling ECS Applications
       
            ECS Auto Scaling Challenges
            Understanding ECS Resources
            ECS Capacity Management
            Scaling out ECS Clusters
            Scaling in ECS Clusters
            Auto Scaling Solution Overview
            Understanding ECS Events
            Creating a Capacity Manager Lambda Function
            Calculating ECS Container Instance Capacity
            Testing the ECS Capacity Manager
            Publishing CloudWatch Custom Metrics
            Creating CloudWatch Alarms
            Creating EC2 Auto Scaling Policies
            Managing ECS Capacity Using CloudFormation
            EC2 Auto Scaling Using CloudFormation
            Configuring ECS Auto Scaling
            ECS Auto Scaling Using CloudFormation

    16) Continuous Delivery Using CodePipeline

         Solution Overview
         Creating a CodeBuild Image
         Adding CodeBuild Support for Build Actions
         Creating a Pipeline
         Creating an Application Version Artifact
         Adding CodeBuild Support for Deploy Tasks
         Creating Deployment Templates
         Creating a Deployment Action
         Ensuring Immutable Artifacts
         Deploying to Production
         Creating Pipelines in CloudFormation
         Defining the Source Stage in CloudFormation
         Defining the Build Stage in CloudFormation
         Defining Deploy Stages in CloudFormation
         Deploying a Pipeline Using CloudFormation




    Tuesday, July 24, 2018

    Ansible


    Key tasks and responsibilities that can be handled through Ansible:


    • Automated various infrastructure activities like continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.
    • Used various AWS services for this infrastructure; used EC2 virtual servers to host Git, Jenkins, and a configuration management tool like Ansible. Converted slow, manual procedures to dynamic, API-generated procedures. 
    • Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python. Ran Ansible scripts to provision dev servers. 
    • Configured generic data sources and multi data sources using a configuration management process with Ansible. 
    • Configured JMS modules, SAF agents, and SAF modules using a configuration management process with Ansible. 
    • Developed Ansible scripts for automated server provisioning and Docker images for isolation, reducing the time between provisioning and deployment from over 3 hours to less than 10 minutes.

    ================================================================
    • Before & After Configuration Management
    • What is Ansible?
    • Features of Ansible
    • Ansible Case Study :  NASA
    • Ansible for Orchestration
    • Ansible for Provisioning
    • Ansible for Configuration Management
    • Ansible for Application Deployment
    • Ansible for Security
    • Write Ansible Playbooks to install LAMP Stack and Host a website

    ================================================================


    How Ansible Works?
    • There are many similar automation tools available, like Puppet, Capistrano, Chef, Salt, Spacewalk, etc., but Ansible divides machines into two categories: controlling machines and nodes.
    • The controlling machine is where Ansible is installed; nodes are managed by this controlling machine over SSH (Secure Shell). The controlling machine finds nodes through its inventory, which specifies their locations.
    • Using the SSH protocol, the controlling machine deploys modules to nodes. These modules are stored temporarily on the remote nodes and communicate with the Ansible machine through a JSON protocol over standard output.
    • There is no need for any agent installation on remote nodes because Ansible is agent-less: no background daemons or programs execute for Ansible when it is not managing any nodes.
    • Ansible can handle hundreds of nodes from a single system over an SSH connection, and an entire operation can be executed with the single command 'ansible'. In cases where you need to execute multiple commands for a deployment, you can build playbooks.
    • Playbooks are collections of commands that can perform multiple tasks, and each playbook is written in YAML format (a minimal example follows below).
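
    As promised above, a minimal illustrative playbook in YAML. It is a sketch, not from the original notes: the webservers group, the apache2 package, and the file name site.yml are all hypothetical.

    ---
    - name: Install and start Apache
      hosts: webservers          # hypothetical inventory group
      become: yes                # escalate privileges for package installation
      tasks:
        - name: Install the Apache package
          apt:
            name: apache2
            state: present
        - name: Ensure Apache is running and enabled on boot
          service:
            name: apache2
            state: started
            enabled: yes

    Run it with: ansible-playbook -i inventory site.yml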
    What’s the Use of Ansible?
    Ansible can be used in IT infrastructure to manage and deploy software applications to remote nodes. For example, let’s say you need to deploy a single software or multiple software to 100’s of nodes by a single command, here ansible comes into picture, with the help of Ansible you can deploy as many as applications to many nodes with one single command, but you must have a little programming knowledge for understanding the ansible scripts.
    Is there a web interface / REST API / etc?
    Yes. Ansible, Inc. makes a great product that makes Ansible even more powerful and easy to use: see Ansible Tower.
    How do I submit a change to the documentation?
    Documentation for Ansible is kept in the main project git repository, and complete instructions for contributing can be found in the docs.
    When should I use {{ }}? Also, how do I interpolate variables or dynamic variable names?
    1. A steadfast rule is "always use {{ }} except with when:". Conditionals are always run through Jinja2 to resolve the expression, so when:, failed_when: and changed_when: are already templates and you should avoid adding {{ }} around them.
    2. In most other cases you should always use the brackets, even where bare variable names used to be accepted (as in with_ clauses), since the bare form made it hard to distinguish between an undefined variable and a string.
    3. Another rule is "moustaches don't stack". We often see this, and it DOES NOT WORK:
    {{ somevar_{{other_var}} }}
    If you need a dynamic variable name, use the hostvars or vars dictionary as appropriate:
    {{ hostvars[inventory_hostname]['somevar_' + other_var] }}
    How to install Ansible
    Installing Ansible on Ubuntu 14.04
    The best way to get Ansible for Ubuntu is to add the project’s PPA (personal package archive) to your system.
    To do this effectively, we need to install the software-properties-common package, which will give us the ability to work with PPAs easily. (This package was called python-software-properties on older versions of Ubuntu.)
    sudo apt-get update
    sudo apt-get install software-properties-common
    Once the package is installed, we can add the Ansible PPA by typing the following command:
    sudo apt-add-repository ppa:ansible/ansible
    Press ENTER to accept the PPA addition.
    After that, refresh the system's package index so the packages from the new PPA become visible, then install Ansible:
    sudo apt-get update
    sudo apt-get install ansible
    We now have the software required to administer our servers through Ansible.
    How do I generate crypted passwords for the user module?
    The mkpasswd utility that is available on most Linux systems is a great option:
    mkpasswd --method=sha-512
    If this utility is not installed on your system (e.g. you are using OS X) then you can still easily generate these passwords using Python. First, ensure that the Passlib password hashing library is installed.
    pip install passlib
    Once the library is ready, SHA512 password values can then be generated as follows:
    python -c "from passlib.hash import sha512_crypt; import getpass; print(sha512_crypt.hash(getpass.getpass()))"
    Use the integrated Hashing filters to generate a hashed version of a password. You shouldn’t put plaintext passwords in your playbook or host_vars; instead, use Vault to encrypt sensitive data.
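    As a hedged alternative inside a playbook, the built-in password_hash filter produces the same SHA-512 crypt format; the variable name and plaintext here are illustrative, and the plaintext itself should come from Vault:
    user_password: "{{ 'SomePlaintext' | password_hash('sha512') }}"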
    How do I get Ansible to reuse connections and enable Kerberized SSH?
    1. Switch your default connection type in the configuration file to 'ssh' (or use '-c ssh' on the command line) to use native OpenSSH for connections instead of the Python paramiko library. In Ansible 1.2.1 and later, 'ssh' is used by default if OpenSSH is new enough to support ControlPersist as an option.
    2. Paramiko is great for starting out, but the OpenSSH type offers many advanced options. If you are using this connection type, run Ansible from a machine new enough to support ControlPersist; you can still manage older clients. If you are using RHEL 6, CentOS 6, SLES 10 or SLES 11, the bundled version of OpenSSH is still a bit old, so consider managing from a Fedora or openSUSE client even though you are managing older nodes, or just use paramiko.
    3. Paramiko is kept as the default because, if you are first installing Ansible on an EL (Enterprise Linux) box, it offers a better experience for new users.
    What is the best way to make content reusable/redistributable?
    If you have not done so already, read all about "Roles" in the playbooks documentation. This helps you make playbook content self-contained and works well with things like git submodules for sharing content with others.
    If some of these plugin types look strange to you, see the API documentation for more details about ways Ansible can be extended.
    How do I see all the inventory vars defined for my host?
    You can see the resulting vars you define in inventory running the following command:
    ansible -m debug -a "var=hostvars['hostname']" localhost
    How do I copy files recursively onto a target host?
    The “copy” module has a recursive parameter, though if you want to do something more efficient for many files, look at the “synchronize” module instead, which wraps rsync. See the module index for info on both modules.
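    A hedged sketch of both options as playbook tasks; the src and dest paths are illustrative:

    # copy recurses when src is a directory; fine for small trees
    - name: Copy a directory tree onto the target hosts
      copy:
        src: files/
        dest: /opt/app/

    # synchronize wraps rsync and is more efficient for many files
    - name: Push the same tree with rsync semantics
      synchronize:
        src: files/
        dest: /opt/app/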
    What is an Ansible Role?
    A role is Ansible's unit of reusable, self-contained automation. Ansible can interact with configured clients from the command line with the ansible command, and you can automate configuration with playbooks run through the ansible-playbook command; roles bundle the tasks, handlers, variables, files, and templates that those playbooks share.
    The first step in creating a role is creating its directory structure. To create the base directory structure, we’re going to use a tool bundled with Ansible called ansible-galaxy:
    $ ansible-galaxy init azavea.packer
    azavea.packer was created successfully
    That command will create an azavea.packer directory with the following structure:
    azavea.packer/
    ├── README.md
    ├── defaults
    │   └── main.yml
    ├── files
    ├── handlers
    │   └── main.yml
    ├── meta
    │   └── main.yml
    ├── tasks
    │   └── main.yml
    ├── templates
    └── vars
        └── main.yml
    Difference between variable names and environment variables:
    • Variable names can be built dynamically by concatenating strings, for example:
      {{ hostvars[inventory_hostname]['ansible_' + which_interface]['ipv4']['address'] }}
      If you want to set variables this way, see the advanced playbooks section.
    • Environment variables are read from existing variables. On the controller, use a lookup, e.g. vars: local_home: "{{ lookup('env','HOME') }}"; for remote environment variables, use {{ ansible_env.SOME_VARIABLE }}.







    Jenkins

    Key tasks and responsibilities that can be performed using Jenkins:



    •  Used the Jenkins AWS CodeDeploy plugin to deploy to AWS.
    •  Involved in setting up the backup server for Jenkins and prepared disaster recovery plans for Jenkins and Bamboo. 
    •  Used Maven to build RPMs from source code checked out from Git and Subversion repositories, with Jenkins as the continuous integration server and Artifactory as the repository manager. 
    •  Installed and configured Jenkins plugins to support project-specific tasks. 
    •  Developed Docker-based microservices and deployment modules with Jenkins, Kubernetes, and Ansible-based pipelines/frameworks. 
       




    Please download the latest Windows Jenkins installer from https://jenkins.io/download/thank-you-downloading-windows-installer-stable/






    Creating a Pipeline





    Sample Pipeline script (GitHub + Maven)


    node {
       def mvnHome
       stage('Preparation') { // for display purposes
          // Get some code from a GitHub repository
          git 'https://github.com/jglick/simple-maven-project-with-tests.git'
          // Get the Maven tool.
          // ** NOTE: This 'M3' Maven tool must be configured
          // **       in the global configuration.           
          mvnHome = tool 'M3'
       }
       stage('Build') {
          // Run the maven build
          if (isUnix()) {
             sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
          } else {
             bat(/"${mvnHome}\bin\mvn" -Dmaven.test.failure.ignore clean package/)
          }
       }
       stage('Results') {
          junit '**/target/surefire-reports/TEST-*.xml'
      archiveArtifacts artifacts: 'target/*.jar'
       }
    }



    Sunday, July 22, 2018

    JSON : JavaScript Object Notation.

    WADL : Web Application Description Language.

    HATEOAS : Hypermedia As The Engine Of Application State.
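
    To make the HATEOAS idea concrete, a hedged illustration: the response embeds links that tell the client which state transitions are available next. The resource and link names are hypothetical:

    {
      "orderId": 42,
      "status": "PROCESSING",
      "links": [
        { "rel": "self",   "href": "/orders/42" },
        { "rel": "cancel", "href": "/orders/42/cancel" }
      ]
    }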








    YAML




    YAML (YAML Ain't Markup Language) is a human-readable data serialization language. It is commonly used for configuration files, but can be used in many applications where data is stored (e.g. debugging output) or transmitted (e.g. document headers). YAML targets many of the same communication applications as XML but has a minimal syntax which intentionally breaks compatibility with SGML. It uses Python-style indentation to indicate nesting, and also supports a more compact format that uses [] for lists and {} for maps, making YAML 1.2 a superset of JSON.
    Custom data types are allowed, but YAML natively encodes scalars (such as strings, integers, and floats), lists, and associative arrays (also known as hashes, maps, or dictionaries). These data types are based on the Perl programming language, though all commonly used high-level programming languages share very similar concepts. The colon-centered syntax used to express key-value pairs is inspired by electronic mail headers as defined in RFC 822, and the document separator "---" is borrowed from MIME (RFC 2045). Escape sequences are reused from C, and whitespace wrapping for multi-line strings is inspired by HTML. Lists and hashes can contain nested lists and hashes, forming a tree structure; arbitrary graphs can be represented using YAML aliases (similar to XML in SOAP). YAML is intended to be read and written in streams, a feature inspired by SAX.
    Support for reading and writing YAML is available for many programming languages. Some source code editors such as Emacs and various integrated development environments have features that make editing YAML easier, such as folding up nested structures or automatically highlighting syntax errors.
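
    A small illustrative document showing both styles described above: indentation for nesting, plus the JSON-compatible [] and {} flow forms. All keys and values are hypothetical:

    server:
      host: example.com
      ports: [80, 443]         # flow-style list, also valid JSON
      tls: {enabled: true}     # flow-style map, also valid JSON
    environments:              # block style, nesting by indentation
      - dev
      - prod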


    https://codebeautify.org/


    Custom Content Handler 


    It is used to handle any kind of format. JAX-RS provides two interfaces:

    1.  MessageBodyReader -- 1. readFrom(InputStream is)
                             2. isReadable()

    2.  MessageBodyWriter -- 1. isWriteable()
                             2. getSize()
                             3. writeTo(OutputStream os)


    Note :  Before the resource method executes, MessageBodyReader is called; it reads the input data from an InputStream.

            After the resource method executes, MessageBodyWriter is called; it writes the output data to an OutputStream.

    • YAML documents require no braces; nesting comes from indentation.
    • Custom formats are handled by pairing a MessageBodyReader with a MessageBodyWriter (a Java sketch follows below).
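
    As referenced above, a hedged Java sketch of a custom MessageBodyWriter for a hypothetical Note bean; the class names are illustrative, and the javax.ws.rs imports assume a JAX-RS 2.x runtime:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.lang.annotation.Annotation;
    import java.lang.reflect.Type;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.MultivaluedMap;
    import javax.ws.rs.ext.MessageBodyWriter;
    import javax.ws.rs.ext.Provider;

    // Hypothetical entity returned by a resource method.
    class Note {
        final String title, body;
        Note(String title, String body) { this.title = title; this.body = body; }
        String render() { return title + ": " + body; }
    }

    @Provider
    @Produces(MediaType.TEXT_PLAIN)
    public class NoteBodyWriter implements MessageBodyWriter<Note> {

        @Override
        public boolean isWriteable(Class<?> type, Type genericType,
                                   Annotation[] annotations, MediaType mediaType) {
            return Note.class.isAssignableFrom(type);   // only handle Note entities
        }

        @Override
        public long getSize(Note note, Class<?> type, Type genericType,
                            Annotation[] annotations, MediaType mediaType) {
            return -1;   // -1 tells the runtime to compute Content-Length itself
        }

        @Override
        public void writeTo(Note note, Class<?> type, Type genericType,
                            Annotation[] annotations, MediaType mediaType,
                            MultivaluedMap<String, Object> httpHeaders,
                            OutputStream entityStream) throws IOException {
            // Runs after the resource method, as the note above describes.
            entityStream.write(note.render().getBytes("UTF-8"));
        }
    }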
