
Monday, May 14, 2012

Monitoring User Experience of the Cloud

In this post, we are going to talk about the metrics to build into service level agreements and how to track the quality of service that users of SaaS and cloud applications actually experience.

Q&A With Vik Chaudhary, VP of Product Management and Corporate Development, Keynote Systems

Phil Wainewright: I’m glad to have you with us because your company — Keynote — is a SaaS provider itself, but you also actually work with SaaS and cloud companies who make use of your services, don’t you?

Vik Chaudhary: That’s right. In fact we do both. We started out as a cloud company and a SaaS company, well before those words were even invented, back 14 years ago. And today, we work with about 2800 different companies all over the world; SaaS companies are among them.

Phil Wainewright: Right, okay. So this part of the business is serving traditional enterprise businesses, but part — a growing part of the business, I suppose — is serving the cloud vendor community of one type or another.

Vik Chaudhary: As it turns out, the cloud vendor community — especially in the SaaS world — is growing to include businesses that are using SaaS vendors very effectively. And because businesses typically care about their online performance and customer experience, they look to us to help moderate the conversation between them and the SaaS providers, so we can ensure that the performance and reliability of their applications are really top-notch.
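To make the quality-of-service metrics mentioned at the top of this post a little more concrete, here is a minimal sketch in Python of how availability and a 95th-percentile response time might be computed from synthetic monitoring samples. The sample data and the choice of metrics are illustrative assumptions, not Keynote's actual methodology.

# Illustrative sketch: compute SLA-style metrics (availability and a
# 95th-percentile response time) from synthetic monitoring samples.
# The data below is made up for the example.
from statistics import quantiles

# Each sample: (check succeeded, response time in seconds)
samples = [
    (True, 1.2), (True, 0.9), (False, 30.0), (True, 1.4),
    (True, 2.1), (True, 1.1), (True, 0.8), (True, 1.6),
]

availability = 100.0 * sum(ok for ok, _ in samples) / len(samples)
p95 = quantiles([t for ok, t in samples if ok], n=20)[-1]

print(f"Availability: {availability:.1f}%")
print(f"95th-percentile response time: {p95:.2f}s")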

Continue reading

Tuesday, December 13, 2011

Internal Testing Program Using Real Devices


Creating an internal test program using real devices is a challenge. The devices have to be bought, carrier contracts have to be established, test scripts created, users trained, results compiled and analyzed — for each geographic market. And then the devices and contracts have to be dealt with after the testing is done, or maintained for ongoing monitoring.
This complexity is why so many enterprises use an outside test partner that offers an established infrastructure with hundreds or thousands of devices deployed over a broad geography. The test provider works with the client company to develop the necessary scripts, and then leases time to them on its network to run the tests. This is testing in the "public cloud," which means that many clients utilize the same devices and infrastructure to conduct their tests. It's an ideal solution for most companies — there's no upfront capital expenditure and tests can be quickly executed on demand, on a budget-friendly pay-per-use basis.
But when a company has particular security concerns and is reluctant to expose its data on a shared infrastructure, or if it has ongoing testing needs, a "private cloud" solution is in order. In this case, the test provider procures devices and contracts to create a private test network exclusively for the use of a particular client company.
In either case, the client team has ready online access to the entire process, from scripting to running the tests to reading results.
Cloud-based app testing is an automated process. There's no human being working the phones out in the field; rather, the devices are remotely operated by machine, performing the interactions specified in the test scripts and providing feedback on the devices' responses.
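For a sense of what such a test script can look like, here is a minimal sketch in Python using the open-source Appium client to drive a remotely hosted Android device. The endpoint URL, device name, app identifiers, and element IDs are hypothetical, and a real test provider's API may differ.

# Illustrative sketch: remotely drive a real device through an Appium-compatible
# endpoint, perform a scripted interaction, and record the response time.
import time
from appium import webdriver
from appium.options.android import UiAutomator2Options
from appium.webdriver.common.appiumby import AppiumBy

options = UiAutomator2Options()
options.device_name = "RemoteDevice-NYC-01"   # device leased in the provider's lab (hypothetical)
options.app_package = "com.example.shop"       # app under test (hypothetical)
options.app_activity = ".MainActivity"

driver = webdriver.Remote("http://devicelab.example.com:4723", options=options)
driver.implicitly_wait(10)  # wait up to 10s for elements to appear
try:
    start = time.monotonic()
    # Perform the interaction specified in the test script: search for a product.
    driver.find_element(AppiumBy.ID, "com.example.shop:id/search").send_keys("headphones")
    driver.find_element(AppiumBy.ID, "com.example.shop:id/go").click()
    # Wait for the results list and record how long the device took to respond.
    driver.find_element(AppiumBy.ID, "com.example.shop:id/results")
    print(f"Search results rendered in {time.monotonic() - start:.2f}s")
finally:
    driver.quit()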
Read More at http://keynote.com/benchmark/mobile_wireless/article_mobile_app_performance.shtml

Monday, November 28, 2011

Internet Retailers and Website Performance


How many seconds does it take to lose a shopper to a competitor’s site? How long will a business user wait for JavaScript to execute so she can see the data she’s searching for? How many times will a user tolerate delays in downloading a bank transaction, or registering a bid, or completing a form, before abandoning the site?

The cost of poor site performance is not just lost visitors, it’s lost money. In a recent survey, nearly three-quarters of Internet retailers correlate poor site performance with lost revenue, and more than half with lost traffic.

Just a few short years ago, evaluating website performance was a fairly simple affair. “How fast did the page load?” was often the first and last question that needed to be asked. User expectations were far lower, and patience much higher, when the experience of accessing information or making a purchase online was new and different and amazingly convenient.

Today, however, user expectations are stratospherically higher. With the Internet now tightly woven into the fabric of everyday life, and a multitude of Web sites available to satisfy any given need or desire, users expect not only virtually instant page-loads, but fast and flawless execution of transactions and enhanced functionality that delivers a “rich” site experience.  In the intense competition to attract and keep site visitors, web performance is now a critical business driver for site success.
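As a back-of-the-envelope illustration of that original "how fast did the page load?" question, the short Python sketch below times the fetch of a page's base HTML. The URL is a placeholder, and the measurement is only a rough proxy: a real user's page-load also includes scripts, images, and third-party content, which is exactly why richer measurements matter today.

# Illustrative sketch: time the fetch of a page's base HTML as a crude
# stand-in for "how fast did the page load?"
import time
import requests

url = "https://www.example.com/"   # placeholder URL
start = time.monotonic()
response = requests.get(url, timeout=30)
elapsed = time.monotonic() - start

print(f"HTTP {response.status_code}, {len(response.content)} bytes in {elapsed:.2f}s")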

Read More at http://www.keynote.com/benchmark/index.shtml

Wednesday, September 28, 2011

The Impact of Web Load Testing on Performance

What drives the financial impact of a site outage or performance issues is abandonment. It’s important to understand that visitors who experience issues on a website don’t, per se, represent lost revenue. Revenue is lost only when those visitors don’t return and/or go somewhere else. A shopper’s tolerance for errors is called “tenacity” in Web load testing parlance. Low-tenacity shoppers bail out of slow searches and hanging shopping carts in a flash.

If a load test had been run that adequately modeled the impact of the planned launch, the damage would have been done outside business hours, and Target management would have been able to decide whether to make changes to their systems and retest, postpone or restructure the launch, or just "risk it" and see what happened. We don't really think they ever had the chance to make those decisions. Surely they didn't conduct a load test that predicted such an epic fail. But should they have?

Realistic Web load tests model site usage and shopper behavior. Systems are deployed to simulate high levels of demand from multiple geographically dispersed areas. Once the load is generated, the responses of the infrastructure and the application are watched carefully to identify bottlenecks and breaking points as the entire mesh of the Website’s interconnecting parts is stressed. Only this level of testing can accurately tell e-commerce teams whether their preparations are adequate.
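As an illustration of how shopper behavior such as tenacity can be modeled, here is a minimal sketch written for the open-source Locust tool (not the specific product discussed in this post). It simulates a low-tenacity shopper who abandons the site when a search takes too long; the URLs and the three-second threshold are assumptions for the example.

# Illustrative Locust scenario: a low-tenacity shopper who abandons the site
# when a search takes longer than an assumed three-second tolerance.
from locust import HttpUser, task, between
from locust.exception import StopUser

ABANDON_AFTER_SECONDS = 3.0  # assumed tolerance of a low-tenacity shopper

class Shopper(HttpUser):
    wait_time = between(1, 5)  # think time between actions

    @task(3)
    def browse_home(self):
        self.client.get("/")

    @task(1)
    def search_and_maybe_abandon(self):
        with self.client.get("/search?q=shoes", catch_response=True) as resp:
            if resp.elapsed.total_seconds() > ABANDON_AFTER_SECONDS:
                resp.failure("search too slow -- shopper abandoned")
                raise StopUser()  # this simulated visitor leaves the site

A scenario like this would typically be run against a staging environment (for example, locust -f shopper.py --host https://staging.example-store.com, where the file name and host are placeholders), with the simulated load scaled out across distributed worker machines to approximate geographically dispersed demand.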

Tuesday, March 1, 2011

Improve Performance Of Third-Party Components


A slick-looking Web site is ultimately of limited effectiveness if all of its bells and whistles are an impediment to system performance. Hence, monitoring system performance at the end-user/UI level is extremely important to ensuring a consistently excellent user experience. By using the proper website performance tools in a targeted manner, both business managers and developers can effectively monitor the impact of third-party performance, both at the individual component level and in the aggregate.

The more complex your Web site is, the more likely it is that its performance – and possibly your profit margin – is in the hands of third parties and their components running on your site. Ideally, we want to improve the performance of these third-party components (aka “widgets”), so that a page loads just as fast with these widgets as without them.

If you are the owner of the Web site, there are at least two ways to improve overall site performance. One is to optimize the Web site itself for each widget so that those widgets running on the page will be more efficient, a method that is not cost-efficient. The other, more cost-effective option is to use continuous and focused website performance monitoring. Breaking down performance by time and by component category allows you to pinpoint the components that adversely impact Web site performance.
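One way to do that kind of breakdown yourself is to aggregate per-request timings by host, for example from a HAR file exported from a browser's developer tools or a monitoring tool. The Python sketch below is illustrative; the file name and the first-party host are assumptions.

# Illustrative sketch: break down page-load time by host using a HAR export,
# so that slow third-party components stand out from first-party requests.
import json
from collections import defaultdict
from urllib.parse import urlparse

FIRST_PARTY = "www.example.com"          # the site's own host (assumed)

with open("homepage.har") as f:          # HAR file exported from dev tools (assumed)
    entries = json.load(f)["log"]["entries"]

time_by_host = defaultdict(float)
for entry in entries:
    host = urlparse(entry["request"]["url"]).hostname or "unknown"
    time_by_host[host] += entry["time"]  # total time for this request, in ms

# Slowest contributors at the top, labeled first- or third-party.
for host, ms in sorted(time_by_host.items(), key=lambda kv: kv[1], reverse=True):
    tag = "first-party" if host == FIRST_PARTY else "third-party"
    print(f"{ms:8.0f} ms  {tag:11s}  {host}")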