Tag Archives: QA

AppThwack Takes On Android Fragmentation With New Automated Testing Service


Smaller developers without their own in-house QA departments often outsource their testing to services like Testdroid, for example, which tests their Android apps on physical devices. But today, Testdroid and the like will have some new competition from a company called AppThwack, which plans not only to match Testdroid’s capabilities, but to go even further in the number of automation frameworks it supports.

Currently, Testdroid supports more devices than AppThwack, which has 60 to Testdroid’s 100+. But, points out AppThwack co-founder Trent Peterson, they’ve managed to go from zero to sixty in just three months. And, he adds, you don’t need every device to cover the majority of the market. That said, AppThwack is still adding devices to the service at a rate of about five per week.

Peterson and his co-founder Pawel Wojnarowicz formerly worked at Intel, where, for nine years, they focused on automating distributed systems for Wi-Fi, WiMAX and Bluetooth. In March 2012, they decided to quit and begin building AppThwack. Originally, the idea was to build a distributed automation platform and market that as the company’s flagship product.

“But it quickly became apparent that two guys with an unproven automated platform that’s fairly generic is nearly impossible to market to enterprise,” says Peterson. So they shifted into Android testing instead, using the automation platform as the base and building AppThwack on top of it. “The goal is to allow developers to see how their apps are performing on devices before they ever hit an actual end user,” he says of the new product.

To be clear, AppThwack is not a beta testing suite, where apps are distributed to people who then run the apps on their devices and give individualized feedback (Applover, for example). “From our background in automation and general QA, I don’t think [beta testing is] really a solution in and of itself,” says Peterson. “First of all, these people have no ties to your app and don’t know what it should look like and how it should behave, plus, you’re sending out your IP to random people. I don’t really compare us to that entire market.”

Instead, AppThwack’s closest competitors will be the services that automate the testing process on actual hardware. Peterson identified his closest competitor as the above-mentioned Testdroid, but notes that Testdroid is focused on Robotium, the Selenium-like testing framework designed for Android. AppThwack will support Robotium, but it will also support Exerciser Monkey, which randomly exercises the UI, and it captures screenshots in both portrait and landscape modes.
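The idea behind Exerciser Monkey-style testing is a seeded pseudo-random stream of UI events: the same seed reproduces the same stream, so a crash found during fuzzing can be replayed. A minimal sketch of that concept in Python (the event types, screen dimensions, and function names here are illustrative, not AppThwack's or Android's actual implementation):

```python
import random

EVENT_TYPES = ["tap", "swipe", "key", "rotate"]  # hypothetical event set

def generate_events(seed, count=500, width=480, height=800):
    """Produce a reproducible pseudo-random UI event stream, Monkey-style."""
    rng = random.Random(seed)  # same seed -> same stream -> replayable failures
    events = []
    for _ in range(count):
        kind = rng.choice(EVENT_TYPES)
        # each event targets a pseudo-random screen coordinate
        events.append((kind, rng.randrange(width), rng.randrange(height)))
    return events

# Identical seeds reproduce the identical event stream:
assert generate_events(42) == generate_events(42)
```

The determinism is the point: a test service can report the seed alongside a failure so the developer can rerun exactly the sequence that triggered it.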

“We didn’t design around Robotium or design around Exerciser Monkey, everything is very modular,” says Peterson. “And as we get new requests there are a couple of other automation frameworks for testing Android devices, and we’re adding these in as we go.” He mentions monkeyrunner, MonkeyTalk, and Calabash as those under consideration; implementation will be based on demand. The company is adding support for web testing, too. Right now, it loads URLs and takes screenshots in a variety of browsers, but it will become a more robust service in time.

After tests are run, developers are provided with easy-to-read reports, like these examples: Android, web.

During its private beta, the bootstrapped company had 200 developers who ran over 200,000 tests on the service. The former director of QA at Swype, Michael Tu, and a co-founder of OpenSignalMaps, Sina Khanifer, are current users of the service.

AppThwack will be a freemium service, but until pricing is worked out, it’s free. There’s also an option for bigger shops to install the framework in-house, if they choose. Sign-up is here.


uTest’s AppGrader Scores Mobile Apps, Helps Developers Squash Bugs


uTest, a company known for providing a variety of testing solutions for desktop, web and mobile, is launching a new solution designed to grade mobile apps’ performance under real-world conditions, and then compare the app’s rating with that of its competition. The solution, for obvious reasons (i.e., desperate need), is arriving first on Android, with an iOS version to follow soon. The app testing process takes just a few minutes, the company claims, and will then return a report grading the app on a scale of 1 to 100.

In addition to the score, the report also details any issues discovered during app download, installation and basic usage. To provide more context, the AppGrader report, as it’s called, also compares the app’s grade to those of the most popular applications in the same category on Google Play (formerly the Android Market).
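Comparing one app's grade against its category peers amounts to a simple percentile ranking. A hedged sketch of how such a comparison could be computed, with made-up scores; uTest has not published how AppGrader actually does this:

```python
def percentile_rank(score, peer_scores):
    """Percentage of peer apps this score meets or beats (0-100 scale)."""
    if not peer_scores:
        return 100.0
    beaten = sum(1 for s in peer_scores if s <= score)
    return 100.0 * beaten / len(peer_scores)

# Hypothetical category: the top apps scored 92, 88, 85, 77, 70
peers = [92, 88, 85, 77, 70]
print(percentile_rank(80, peers))  # meets or beats 2 of 5 peers -> 40.0
```

Framing a raw 1-to-100 grade this way tells a developer not just "how good is my app" but "how good is it relative to what users in this category already expect."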

The system isn’t designed to replace the testing and QA work developers already do, of course, but is meant to function as more of a final step that can give more insight into how the app will run when actually put into the hands of users. uTest CMO Matt Johnson explains that this “in-the-wild testing provides ‘last mile’ assurance that the apps work on real devices, under real-world conditions, in a wide variety of locations.” However, there’s no reason why developers couldn’t continue to run AppGrader after the app’s launch, if need be, or as they push out minor updates and tweaks to the app in question.

To use the service, developers just upload the Android APK file to get started, and AppGrader will send out an email notification within a few minutes after the testing is complete. For apps that crash, developers will also be given the device-specific crash log for additional diagnostic details.
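On Android, a device crash log of this sort is typically the fatal-exception stack trace that the runtime writes to logcat under the AndroidRuntime tag at error level. A minimal sketch of pulling that trace out of raw logcat text; the sample log below is fabricated for illustration, and this is a generic parsing approach, not uTest's pipeline:

```python
def extract_crash(logcat_text):
    """Collect the E/AndroidRuntime lines that make up a fatal-exception trace."""
    crash_lines = []
    for line in logcat_text.splitlines():
        if line.startswith("E/AndroidRuntime"):
            # strip the "E/AndroidRuntime(<pid>): " prefix, keep the trace itself
            crash_lines.append(line.split("): ", 1)[-1])
    return "\n".join(crash_lines)

sample = """\
D/dalvikvm( 311): GC_CONCURRENT freed 1024K
E/AndroidRuntime( 311): FATAL EXCEPTION: main
E/AndroidRuntime( 311): java.lang.NullPointerException
E/AndroidRuntime( 311):     at com.example.app.MainActivity.onCreate(MainActivity.java:42)
I/ActivityManager( 120): Process com.example.app has died
"""
print(extract_crash(sample))
```

Having the device-specific trace matters because the same APK can crash on one handset's OS build and run cleanly on another, which is exactly the fragmentation problem these services target.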

For now, the service tests apps on top Android devices, including the Samsung Galaxy Nexus, Samsung Galaxy S II, Google’s Nexus S, LG Nitro HD, Samsung Galaxy Tab, HTC Thunderbolt, Sony Ericsson Xperia, Motorola Droid X2 and the T-Mobile myTouch. Apps are also tested on U.S. carriers AT&T, Verizon, and Sprint.

Although all mobile developers could benefit from more testing tools, there’s more of a need to address the Android developer base first. On Android, developers don’t just have to deal with an incredible number of device types in the wild; they’re also constantly challenged by OS fragmentation. According to Google’s own statistics, only 4.9% of users are running the latest version of Android (Ice Cream Sandwich), 3.3% are stuck on the version just prior (Honeycomb), while 64.4% are on Gingerbread, which was first released back in December 2010. The remaining 27.4% are running versions that are even older, if you can believe it.

To put this in perspective, iOS users update to the latest version remarkably fast. (One report shows 38% hit iOS 5 within five days of its release, for example.) It’s not entirely fair to make judgments about the users on either platform, however: iOS users have access to upgrades, while Android users, whether due to carrier or OEM restrictions, often do not. But it does showcase the greater challenge Android developers face when building apps for a range of handsets and software versions.

uTest’s AppGrader is available now, from here. The service is free, as the company expects it might entice users to try its other mobile testing products.
