
Wednesday, March 21, 2012

Measuring site performance over an actual cellular network


As fundamental as it may seem, many site owners don’t alter their approach at all when tackling a mobile site project. Call it “desktop thinking,” or terrestrial, landline, wireline thinking — by any name, it ignores the fundamental reality of the cellular network, which is inherently slower and riddled with opportunities for performance degradation.

One common desktop tactic that causes issues in mobile is the URL redirect, which instructs the browser to follow a different URL than the one originally requested. There are a number of legitimate reasons to employ this technique — to direct users to your third-party site host; to offer nicknames that provide multiple paths to the main site; or to send users to a site designed specifically for the detected browser.

This is generally a fine practice in the desktop browser world, where redirects usually happen in the blink of an eye and are virtually undetectable to the user. Use the same technique on a mobile site, though, where the big “L” — latency — colors the entire experience, and you end up with users staring and staring at a screen where nothing’s happening.

Surprisingly, even some of the biggest retailers have mobile sites bogged down with URL redirects. The problems become apparent when measuring site performance over an actual cellular network (as opposed to a WiFi connection).
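One quick way to see the damage is to time each hop of a redirect chain yourself. Below is a minimal sketch in Python using the requests library, with a placeholder URL standing in for the site you want to measure; run it over a tethered cellular connection rather than WiFi and the per-hop cost becomes obvious.

```python
import time
from urllib.parse import urljoin

import requests

def time_redirect_chain(url):
    """Follow a redirect chain one hop at a time, timing each hop."""
    total = 0.0
    while url:
        start = time.monotonic()
        # allow_redirects=False lets us observe every hop individually
        resp = requests.get(url, allow_redirects=False, timeout=30)
        elapsed = time.monotonic() - start
        total += elapsed
        print(f"{resp.status_code}  {elapsed * 1000:6.0f} ms  {url}")
        nxt = resp.headers.get("Location")
        url = urljoin(url, nxt) if resp.is_redirect and nxt else None
    print(f"Total: {total * 1000:.0f} ms")

# Placeholder URL; substitute the site you want to measure.
time_redirect_chain("http://www.example.com/")
```

Each 301 or 302 in the output is a full round trip. On a high-latency cellular link, two or three of them can eat a user’s entire patience budget before a single byte of content arrives.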

At what point does the user conclude that the site’s not working, or that it’s not worth the wait? If they’ve just navigated from a well-built mobile site that loaded quickly, there’s a good chance they’re not going to wait eight seconds. How likely is it that they’ll come back again? How likely is it that they’ll tell their friends about the experience? Forget what that means in terms of a lost sale. What does it mean for retailer X’s brand image?

Wednesday, March 7, 2012

App, Website, Or Both?


There’s no silver bullet for getting content successfully onto three screens, no switch that can be thrown to make content fast and usable. On smartphones and now tablets, site owners are juggling some combination of mobile site optimization, Web app, and native app. The exception is those doing nothing at all, who are effectively writing off what is soon to be online’s biggest audience.
Whatever the approach, any serious mobile strategy requires effort and investment. Native apps need to be developed separately for multiple platforms, at minimum iOS and Android, for both smartphone and tablet — that’s virtually four apps, for starters.
HTML5 Web apps theoretically simplify the development task, inasmuch as one app should function acceptably on multiple phones. But a big tweak is still required for tablets, to take advantage of the larger format and deliver a native-like experience.
Still another approach is to build separate, mobile-optimized sites for smartphones and tablets, streamlined for fast loading and usability, but not offering the capability of offline viewing.

Wednesday, February 22, 2012

"Performance" to cloud providers typically means only availability


One reason cloud performance monitoring is so critical is that cloud provider service is so nebulous. "Performance" to cloud providers typically means only availability, and even availability is only loosely guaranteed. For Amazon Web Services, as an example, unavailability means no connectivity at all during a five-minute period; if your user has a lousy, erratic, miserably slow connection, as far as Amazon is concerned, they've delivered. And availability means availability when it leaves Amazon's door; whether anything actually reaches your user is not Amazon's problem, regardless of Amazon's own choices of ISP and connectivity. Oh, and the burden of proof is on you, the customer. For all intents and purposes, Amazon is not even checking to see whether you have service.

This is not to dump solely on Amazon. The same guarantees, or lack thereof, are typical of many cloud providers. In addition to the caveats above, scheduled and emergency downtime is excluded from availability guarantees; penalties for unavailability are minimal, and certainly not commensurate with potential business damages; and any other kind of performance is not included in the service level agreement.

An ideal cloud-client working relationship includes substantial SLAs, external monitoring of SLA parameters that is visible to both provider and client, and meaningful recourse if the service falls short. In the absence of that ideal, however, the onus is on the enterprise to put cloud monitoring and measurement in place and to hold its provider accountable – so it can either get the service level it needs, or switch to a better provider.
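As a starting point, even a simple external probe, run from outside the provider's network, gives you your own record to hold against the SLA. Here is a minimal sketch assuming a hypothetical health-check URL; note that it logs response time alongside reachability, since a connection can be "available" by the provider's definition and still be miserably slow.

```python
import datetime
import time

import requests

ENDPOINT = "https://your-app.example.com/health"  # hypothetical health-check URL
WINDOW = 300  # five-minute windows, mirroring the AWS-style availability definition

def probe():
    """Return (reachable, response_time); 'reachable' mirrors the loose
    'any connectivity at all' bar that the SLA actually guarantees."""
    try:
        start = time.monotonic()
        resp = requests.get(ENDPOINT, timeout=10)
        return resp.ok, time.monotonic() - start
    except requests.RequestException:
        return False, None

while True:
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    reachable, latency = probe()
    # Keep your own log: the burden of proof is on you, the customer.
    print(f"{stamp}  reachable={reachable}  response_time={latency}")
    time.sleep(WINDOW)
```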


Wednesday, February 8, 2012

The Elephant In The Cloud: Performance

For all of its tremendous potential, there’s one thing that will make or break the SaaS model for both vendors and users: Web performance. No longer is technology contained within the walls of the enterprise, running on its own proven network, controlled closely by its own IT department. Now the nerve center of the enterprise, the productivity of its workers, and the integrity of its information assets are controlled by an outside entity; everything flows through various pipes from a remote data center, over the Internet, into the enterprise’s network and, ultimately, into a browser. And that presents challenges for both performance and user experience.


The problem with the browser being the front-end for SaaS applications is that users have very clear expectations of a browser experience, based on their use of the Web. Users go to a site and expect it to load fast — in two seconds or less. They don’t like to wait, and won’t. Google says that for every additional 500ms of delay, the site loses 20 percent of its traffic. With fast broadband and wireless connections everywhere, users expect blazing speed when they fire up their browser.
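It’s worth doing the arithmetic on that figure. Assuming the 20-percent-per-500ms loss compounds per increment (an extrapolation on our part; the cited figure is a single data point), the falloff is brutal:

```python
def traffic_retained(extra_delay_ms, loss_per_step=0.20, step_ms=500):
    """Fraction of traffic retained, compounding the cited 20% loss per 500 ms."""
    return (1 - loss_per_step) ** (extra_delay_ms / step_ms)

for delay in (500, 1000, 2000):
    print(f"+{delay} ms of delay -> {traffic_retained(delay):.0%} of traffic retained")
# +500 ms of delay -> 80% of traffic retained
# +1000 ms of delay -> 64% of traffic retained
# +2000 ms of delay -> 41% of traffic retained
```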


Read More: http://keynote.com/benchmark/SaaS/article_industry_focus_cloudy_applications.shtml

Tablet Problem? Or Tablet Opportunity?


Yes, it’s another form factor, more operating system permutations, still more network connection considerations, and additional development expense. But tablets could actually be a bright spot for content owners looking to attract audience and advertising dollars.

This past holiday season, iPad users showed themselves to be valuable customers, buying more and spending more than other mobile shoppers. And a number of studies indicate that tablet users are avid content consumers as well.

More than three-quarters of tablet owners use their device every day, on average for 90 minutes. More than half use it daily to get their news, and many now spend more time on news than they did pre-tablet. In fact, the only thing they’re more likely to use their tablets for than news is to browse the Web.

These numbers are likely skewed, as most current tablet owners have pricey iPads; they tend to be higher-income, better-educated, often middle-aged consumers. As the market expands to lower-priced Kindle Fires and other devices, tablet demographics will shift toward a broader, mass-market consumer profile.

Read More: http://keynote.com/benchmark/SaaS/article_three_screen_optimization.shtml

Tuesday, January 24, 2012

Mobile Web: Load Testing and Ongoing Performance Monitoring


Many retailers are making efforts to enable genuine mobile commerce, as opposed to trying to drive traffic from mobile to other channels to complete the transaction. And many are bringing their mobile development efforts in-house, after testing the waters with a third-party solution. Both of these scenarios involve significant risk, making it all the more imperative to perform robust load testing and ongoing performance monitoring.

The big challenge mobile sites face that’s different from the desktop Web — in addition to the inherent slowness of over-the-air (OTA) signals — is the plethora of devices, operating systems, and carriers. A mobile site or app has to be vetted in all the major configurations that ultimately control what is displayed in the user’s hands.

Any outage or excessive slowness during peak holiday periods can inflict serious pain on a retailer’s bottom line, whether the retailer is big or small, online-only or hybrid. The only way to make sure a website can handle an extraordinary rush of traffic is to thoroughly test it well in advance and every time changes are made. The most solid test regimens subject a site to traffic that’s multiples of the highest demand projected by the sales and marketing department.
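At its simplest, such a regimen can begin as a smoke test like the sketch below (Python, with a hypothetical URL and hypothetical traffic numbers). A real test program needs geographically distributed load generators and realistic user transactions, but even this will expose a site that folds under a modest multiple of projected peak.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

TARGET = "https://m.example-retailer.com/"  # hypothetical mobile storefront
PROJECTED_PEAK = 50  # concurrent requests at projected peak (assumed figure)
MULTIPLE = 3         # test at a multiple of the highest projected demand

def fetch(_):
    """Issue one request and record its status and response time."""
    start = time.monotonic()
    try:
        status = requests.get(TARGET, timeout=30).status_code
    except requests.RequestException:
        status = None
    return status, time.monotonic() - start

n = PROJECTED_PEAK * MULTIPLE
with ThreadPoolExecutor(max_workers=n) as pool:
    results = list(pool.map(fetch, range(n)))

ok = [t for status, t in results if status == 200]
print(f"failures: {len(results) - len(ok)}/{len(results)}")
if ok:
    print(f"median: {statistics.median(ok):.2f}s  "
          f"p95: {sorted(ok)[int(0.95 * len(ok))]:.2f}s")
```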


Wednesday, January 11, 2012

Real tests use real devices


It's not that testing with real devices is the gold standard, though that's one way to look at it. The fact is, there is simply no way to emulate the behavior of a mobile app in the field unless the device is literally in the field. System overhead, memory usage, CPU speed — a host of variables impact app functionality in ways that just can't be reproduced in a lab. To get real test results, you have to deploy real devices.
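Those variables can at least be recorded once a real device is in hand. Here is a rough sketch of the idea in Python using psutil, which runs on general-purpose systems; on actual handsets you would capture the same signals through the platform's own profiling hooks.

```python
import time

import psutil  # general-purpose sketch; real handsets need platform profilers

def snapshot(label):
    """Record the system-level variables that lab emulation can't reproduce."""
    cpu = psutil.cpu_percent(interval=0.1)   # instantaneous CPU load
    mem = psutil.virtual_memory().percent    # memory pressure
    print(f"{label}: cpu={cpu:.0f}%  mem={mem:.0f}%  t={time.monotonic():.2f}s")

snapshot("before app transaction")
# ... drive the app step under test here ...
snapshot("after app transaction")
```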

But creating an internal test program using real devices is a challenge. The devices have to be bought, carrier contracts have to be established, test scripts created, users trained, results compiled and analyzed — for each geographic market. And then the devices and contracts have to be dealt with after the testing is done, or maintained for ongoing monitoring.

This complexity is why so many enterprises use an outside test partner that offers an established infrastructure with hundreds or thousands of devices deployed over a broad geography. The test provider works with the client company to develop the necessary scripts, and then leases time to them on its network to run the tests. This is testing in the "public cloud," which means that many clients utilize the same devices and infrastructure to conduct their tests. It's an ideal solution for most companies — there's no upfront capital expenditure and tests can be quickly executed on demand, on a budget-friendly pay-per-use basis.

Source: http://keynote.com/benchmark/mobile_wireless/article_mobile_app_performance.shtml