Monday, November 23, 2015

TTA in Discuss Agile Day Pune

I spoke at Discuss Agile Day, Pune, on 22nd Nov on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

Below are the details of the talk:

Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over time to help you make better decisions and improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.

However, in an organization with multiple products in its portfolio, how can you get collated quality / health information from all the products, quickly and in real time? Or, for a large program of work, with various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say, between the quality of the last release vs. the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present, this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open-source product.

TTA gives you real-time, visual insights into the health of your product portfolio using Test Automation results. This allows teams to decide on deploying the product to the next environment based on actual data points, instead of 'gut feel'.
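
To make this concrete, below is a minimal, illustrative sketch (in Python) of the kind of trend analysis TTA automates. This is not TTA's actual code or API - the results/<build>/*.xml layout of JUnit-style result files is an assumption made up for the example.

import glob
import os
import xml.etree.ElementTree as ET

def pass_rate(result_dir):
    total = failed = 0
    for xml_file in glob.glob(os.path.join(result_dir, "*.xml")):
        # JUnit-style XML carries counts as attributes on each <testsuite>.
        for suite in ET.parse(xml_file).getroot().iter("testsuite"):
            total += int(suite.get("tests", 0))
            failed += int(suite.get("failures", 0)) + int(suite.get("errors", 0))
    return 100.0 * (total - failed) / total if total else 0.0

# Pass rate per build - a falling trend is a 'do not deploy' signal.
for build in sorted(os.listdir("results")):
    print(build, "->", round(pass_rate(os.path.join("results", build)), 1), "%")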

Slides from the talk


Wednesday, October 14, 2015

Good Trends for TTA in DevOps Summit

I spoke at the DevOps Summit in Bangalore on 8th Oct on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

The conversations during and after this talk with various veterans of the Software Industry, across many different domains, reiterated my belief that I need to spend more time taking TTA to the next level and making it a more robust, feature-rich product.

Below are the details of the talk:


Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over time to help you make better decisions and improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.

However, in an organization with multiple products in its portfolio, how can you get collated quality / health information from all the products, quickly and in real time? Or, for a large program of work, with various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say, between the quality of the last release vs. the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present, this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open-source product.

TTA gives you real-time, visual insights into the health of your product portfolio using Test Automation results. This allows teams to decide on deploying the product to the next environment based on actual data points, instead of 'gut feel'.
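
As a companion to the trend sketch in the previous post, below is an equally illustrative (and equally hypothetical - not TTA's real implementation) sketch of quick Failure Analysis: grouping failed tests by failure message, so the most common breakage gets fixed first. The results/latest/*.xml location is again an assumed layout of JUnit-style result files.

import glob
import xml.etree.ElementTree as ET
from collections import Counter

failures = Counter()
for xml_file in glob.glob("results/latest/*.xml"):
    root = ET.parse(xml_file).getroot()
    for case in root.iter("testcase"):
        for failure in case.findall("failure"):
            # Group identical failure messages across tests.
            failures[failure.get("message", "unknown")] += 1

# Most frequent failure first - fixing it gives the biggest win.
for message, count in failures.most_common(10):
    print(count, "x", message)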

Slides from the talk


Video from the talk

 

 


Wednesday, September 23, 2015

Selenium Conference 2015 - it simply came and went, so fast

It's been a crazy summer - the 2nd week of September 2015 just amplified that…

A good few months ago, we - the Selenium Conference Planning Committee - started on the journey of planning this year's Selenium Conference 2015. We began by debating where to have this year's conference, till Portland magically came up on the radar and became a reality. We met over Google Hangouts every 2 weeks initially, and then, as we got closer to the date, every week.

Can't believe that as I write this post, the conference is already over (it ended a couple of weeks ago)…

The team put in a lot of hard work - me doing the least of it… and the turnout (approx. 500 people), the interactions, and the quality of the talks prove that the hard work paid dividends.

I traveled from Pune, India on 5th Sept at around 6pm, headed to Portland, Oregon. The journey from home to the hotel took approximately 35 hours.

After 4 crazy days, a total of around 25-30 hours of sleep in 5 nights (thanks to the jet lag), and having delivered 3 talks as well, it was another 35-hour trip back home... the only good thing about this hectic trip: I never got adjusted to the US time zone, which meant no jet lag when I came back home :) This was a first for me :)

Slides & Videos from Selenium Conference 2015:

All the slides and videos for all the talks are available here.

Below is the list of my talks:

To Deploy or Not-to-Deploy - decide using TTA's Trend & Failure Analysis

I got a lot of very good feedback on this talk, and quite a few people expressed interest in trying TTA out! Looking forward to feedback from their experiences!


Video of the talk is available on YouTube here:


Slides are available here:


Automate across Platform, OS, Technologies with TaaS

This topic is so relevant to anyone working in a large enterprise, or anywhere working on a common test automation framework is being 'mandated'.

Video of the talk is available here:

Slides are available here:


Say ‘No’ to (more) Selenium Tests

I paired with Bhumika on this talk. We were very agile in preparing for it - a day in advance, to be precise. It was also a very bold topic to have at a Selenium Conference - standing in front of 200+ Selenium enthusiasts and telling them: do NOT write more Selenium tests. But it went pretty well... given that we were able to walk out of the room on our own feet, and that people got the message we were trying to deliver :D

Video of the talk is available here:


Slides are available here:


Friday, August 14, 2015

Client-side Performance Testing workshop video

As mentioned here, I conducted a Client-side Performance Testing workshop at TechJam.

It was a full house, and it almost turned into a flop show because there was no wifi available - an essential requirement for the workshop. Two things saved me:
1. The attendees, thankfully (in this case), did not read the prerequisites well - most of them came without a laptop.
2. Because of that, I could get by using a 3G USB connection and just demo the tools I wanted to show.

At the end of the day, all was good. I got good feedback from the participants - they really enjoyed the workshop and found it very informative and useful. (Thank you all again for the kind words!)

Below is the video from the workshop.


The slides are available here:


Tuesday, July 14, 2015

Client-side Performance Testing Workshop in TechJam, 13th August 2015

I am conducting a Client-side Performance Testing workshop at TechJam on Thursday, 13th August 2015.

You can register for it on the TechJam page.


Abstract

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing any Client-side Performance Testing activities, we will first understand how to look at client-side performance and put it in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with tools like WebPageTest and PageSpeed to understand how to measure client-side performance.
Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate this testing using WebPageTest (private instance setup), and experiment with YSlow as a low-cost, programmatic alternative to WebPageTest.
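
To give a flavour of the automation piece, below is a minimal sketch of driving WebPageTest over its documented HTTP API (runtest.php / jsonResult.php) from Python. The server URL and API key are placeholders - point them at your own (private) instance - and the 'requests' library is assumed to be installed.

import time
import requests

WPT_SERVER = "http://your-wpt-instance"  # placeholder: your private instance
API_KEY = "your-api-key"                 # only needed if the instance requires one

def run_test(url):
    # Submit the test; f=json asks WebPageTest for a JSON response.
    resp = requests.get(WPT_SERVER + "/runtest.php",
                        params={"url": url, "k": API_KEY, "f": "json"})
    resp.raise_for_status()
    return resp.json()["data"]["testId"]

def wait_for_result(test_id):
    # Poll until the run completes; statusCode 200 means the test is done.
    while True:
        result = requests.get(WPT_SERVER + "/jsonResult.php",
                              params={"test": test_id}).json()
        if result["statusCode"] == 200:
            return result["data"]
        time.sleep(10)

median = wait_for_result(run_test("http://www.example.com"))["median"]["firstView"]
print("Load time (ms):", median["loadTime"])
print("Time to first byte (ms):", median["TTFB"])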

Expected Learnings

  1. What Performance Testing and Performance Engineering are.
  2. Hands-on experience with some open-source tools to monitor, measure, and automate Client-side Performance Testing.
  3. Examples / code walk-throughs of some ways to automate Client-side Performance Testing.

Prerequisites

  1. Participants are required to bring their own laptop for this workshop.
  2. Please also install PhantomJS on your machine (http://phantomjs.org/download.html); it is used to run YSlow from the command line (see the sketch below).
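
Once PhantomJS is installed, a small Python sketch of the YSlow automation we will experiment with could look like the one below. It assumes you have downloaded yslow.js (the PhantomJS runner for YSlow) into the working directory; the threshold of 80 is an arbitrary example of a build-failing check.

import json
import subprocess

def yslow_score(url):
    # --info basic keeps the output small; --format json makes it parseable.
    output = subprocess.check_output(
        ["phantomjs", "yslow.js", "--info", "basic", "--format", "json", url])
    return json.loads(output)["o"]  # 'o' is YSlow's overall score (0-100)

score = yslow_score("http://www.example.com")
print("YSlow score:", score)
assert score >= 80, "Client-side performance dropped below the agreed threshold!"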