Automating UI testing
- 3. Road map: What are automated UI tests and why do we need them; Apple's UIAutomation framework; FoneMonkey (this is where you should wake up)
- 4. Why do we need automated tests? Hopefully, this is a rhetorical question
- 5. We want a 5-star app A great design and marketing will get you a lot of users, but if your app is buggy you will see a lot of bad reviews
- 6. There are always more scenarios The number of test scenarios is always growing: each new feature = new scenarios; each new view = new scenarios; each bug discovered = new scenarios. Impossible to test them all manually!
- 7. Refactoring is always dangerous Most of the time, fixing bugs introduces new bugs. The sooner we find a bug, the easier it is to fix
- 9. The solution - Automation We need a way to test all those scenarios automatically. Unit tests are great and important, but they focus on a specific feature - it is very hard to test how parts of the app work together
- 11. UIAutomation framework A JavaScript library introduced in iOS SDK 4.0, integrated into Instruments. Can be run on the iPhone and in the Simulator. Touch based
- 14. UIAutomation elements Target Application - access and control of application-level UI elements (windows, status bar, keyboard, etc.): UIATarget.localTarget().frontMostApp()
- 15. UIAutomation elements Target Application > Main window - control of the application's window elements: UIATarget.localTarget().frontMostApp().mainWindow()
- 17. UIAutomation elements Target Application > Main window > View element: UIATarget.localTarget().frontMostApp().mainWindow().tableViews()[0].cells()[0]
- 18. UIAutomation elements Target Application > Main window > View element > Child element: UIATarget.localTarget().frontMostApp().mainWindow().tableViews()[0].cells()[0].elements()["Chocolate Cake"]
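The chained accessors above walk a tree of UI elements, where each collection can be indexed by position or by name. A minimal sketch in plain JavaScript of how that access pattern behaves - the objects here are mocks for illustration only; inside Instruments, UIATarget and the element hierarchy are supplied by the UIAutomation runtime:

```javascript
// Mock of the UIAutomation element hierarchy, for illustration only.
function makeElement(name, children) {
  return {
    name: name,
    elements: function () { return byName(children || []); }
  };
}

// UIAutomation element arrays can be indexed by number or by name;
// this helper mimics that by aliasing each element under its name.
function byName(list) {
  list.forEach(function (el) { list[el.name] = el; });
  return list;
}

var cell0 = makeElement("cell 0", [makeElement("Chocolate Cake")]);
var mainWindow = {
  tableViews: function () {
    return byName([
      { name: "table 0", cells: function () { return byName([cell0]); } }
    ]);
  }
};
var UIATarget = {
  localTarget: function () {
    return { frontMostApp: function () {
      return { mainWindow: function () { return mainWindow; } };
    } };
  }
};

// The same chain as on the slide:
var el = UIATarget.localTarget().frontMostApp().mainWindow()
  .tableViews()[0].cells()[0].elements()["Chocolate Cake"];
console.log(el.name); // "Chocolate Cake"
```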
- 19. Logging Since UIAutomation is based on the view hierarchy, we need a simple way to examine that hierarchy. We do this by calling: UIATarget.localTarget().logElementTree()
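Conceptually, logElementTree() dumps an indented listing of every element under the target so you can read off the accessor chain for the element you want. A rough sketch of that idea in plain JavaScript, over a made-up tree (the real output format and element types come from Instruments):

```javascript
// Recursively render an element tree as indented lines, roughly what
// logElementTree() produces in the Instruments log. Mock data only.
function dumpElementTree(element, depth) {
  depth = depth || 0;
  var indent = new Array(depth + 1).join("  ");
  var lines = [indent + element.type + ' "' + element.name + '"'];
  (element.children || []).forEach(function (child) {
    lines = lines.concat(dumpElementTree(child, depth + 1));
  });
  return lines;
}

// Hypothetical hierarchy matching the earlier slides.
var tree = {
  type: "UIAWindow", name: "main", children: [
    { type: "UIATableView", name: "table 0", children: [
      { type: "UIATableCell", name: "Chocolate Cake" }
    ] }
  ]
};

var lines = dumpElementTree(tree);
lines.forEach(function (l) { console.log(l); });
```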
- 25. UIAutomation commands Tapping buttons: UIATarget.localTarget().frontMostApp().navigationBar().buttons()["Add"].tap(); Text input: UIATarget.localTarget().frontMostApp().mainWindow().textFields()[0].setValue("new");
- 26. UIAutomation commands Scrolling: UIATarget.localTarget().frontMostApp().mainWindow().tableViews()[0].scrollToElementWithPredicate("name beginsWith 'Adam'"); Sliding a UIASlider: UIATarget.localTarget().frontMostApp().mainWindow().sliders()[0].dragToValue(0.5);
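Putting the commands together: a typical UIAutomation script drives the UI and then validates state, reporting results through UIALogger (logStart/logPass/logFail are part of the real API). The sketch below stubs the Instruments-provided globals so it can be followed outside Instruments; the "Add" button and one-cell expectation are made-up app details:

```javascript
// Stub of the Instruments-provided globals, so the sketch is
// self-contained. Inside Instruments you would delete this block;
// UIATarget and UIALogger are supplied by the UIAutomation runtime.
var log = [];
var UIALogger = {
  logStart: function (m) { log.push("START " + m); },
  logPass:  function (m) { log.push("PASS " + m); },
  logFail:  function (m) { log.push("FAIL " + m); }
};
var added = [];
var app = {
  navigationBar: function () {
    return { buttons: function () {
      return { "Add": { tap: function () { added.push("row"); } } };
    } };
  },
  mainWindow: function () {
    return { tableViews: function () {
      return [ { cells: function () { return added; } } ];
    } };
  }
};
var UIATarget = { localTarget: function () {
  return { frontMostApp: function () { return app; } };
} };

// The test itself - this is the part you would run in Instruments.
// "Add" is a hypothetical button name; adapt it to your app.
UIALogger.logStart("Add a row");
var target = UIATarget.localTarget();
target.frontMostApp().navigationBar().buttons()["Add"].tap();
var cellCount = target.frontMostApp().mainWindow()
  .tableViews()[0].cells().length;
if (cellCount === 1) {
  UIALogger.logPass("Add a row");
} else {
  UIALogger.logFail("Add a row");
}
```

Note the validation is just visibility/existence-style checking plus logging - exactly the weakness called out in the summary slide below.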
- 27. UIAutomation elements A complete list of elements and commands can be found in Apple's UI Automation Reference Collection. More info in the WWDC 2010 session "Automating User Interface Testing with Instruments"
- 29. UIAutomation - Summary Pros: a "native" Apple framework; runs on both device and simulator; pretty good gesture support; supports most UIKit components. Cons: impossible to add functionality; very fragile - small changes in the UI cause massive changes in tests; no recording abilities; you must be a developer to add tests; weak validation - only visibility and existence; hard to integrate into CI
- 31. FoneMonkey An open source project released by GorillaLogic. Records and plays back "higher-level" events. Flexible and extensible
- 37. Using FoneMonkey When running the new target, the application is launched with the FoneMonkey console. The console provides controls for recording, playback, validation, and script management
- 38. Using FoneMonkey Start recording by tapping the Record button. Tap the More button to view recorded commands
- 39. Using FoneMonkey Edit specific commands. Argument values are per command, e.g.: Touch UIButton "myButton"; InputText UITextField "myTextField" "new text"
- 42. Extending FoneMonkey There are many cases where we need to extend the basic functionality with custom gestures: double tap, long press, drag gestures. FoneMonkey can be extended for both recording and playback
- 43. Adding touch events

#import "FoneMonkeyAPI.h"
#import "MyView+FoneMonkey.h"

@implementation MyView (FoneMonkey)

- (NSString *)monkeyID {
    return self.myViewId ? self.myViewId : [super monkeyID];
}

- (void)handleMonkeyTouchEvent:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 2) {
        [FoneMonkeyAPI record:self command:@"DoubleTouch" args:nil];
    } else {
        [super handleMonkeyTouchEvent:touches withEvent:event];
    }
}

@end
- 44. Adding control events

- (UIControlEvents)monkeyEventsToHandle {
    return UIControlEventEditingDidEnd;
}

- (void)handleMonkeyEventFromSender:(id)sender forEvent:(UIEvent *)event {
    if (!self.editing && self.text != nil) {
        [FoneMonkeyAPI record:self
                      command:FMCommandInputText
                         args:[NSArray arrayWithObject:[self.text copy]]];
    } else {
        [FoneMonkeyAPI continueRecording];
    }
}
- 45. Extending playback

- (void)playbackMonkeyEvent:(FMCommandEvent *)event {
    if ([event.command isEqualToString:@"Drag"]) {
        [self performDrag];
    } else {
        [super playbackMonkeyEvent:event];
    }
}
- 46. What about gestures Unfortunately, FoneMonkey has no built-in support for UIGestureRecognizers. Taps are OK, but every other recognized gesture won't be caught in handleMonkeyTouchEvent. A very big problem, since UIGestureRecognizers are widely used. Luckily, it can be solved easily with a little method swizzling
- 47. Extending gestures with method swizzling Method swizzling - swapping method implementations at run time. The main difference from categories is that method swizzling lets you still call the original implementation:

#import "FoneMonkeyAPI.h"
#import <objc/runtime.h>

@implementation MyView (FoneMonkey)

+ (void)load {
    Method originalMethod = class_getInstanceMethod(self, @selector(handleSwipe:));
    Method replacedMethod = class_getInstanceMethod(self, @selector(fm_handleSwipe:));
    method_exchangeImplementations(originalMethod, replacedMethod);
}

- (void)fm_handleSwipe:(UIGestureRecognizer *)recognizer {
    // After swizzling, this call invokes the original handleSwipe:
    [self fm_handleSwipe:recognizer];
    [FoneMonkeyAPI record:self command:@"Swipe" args:nil];
}

@end
- 50. FoneMonkey script formats Every test recorded in FoneMonkey is saved in three formats: scriptName.fm - native Mac plist file used by the FoneMonkey console; scriptName.m - drop-in Objective-C OCUnit test to be integrated with Xcode, which can be extended easily (loops etc.) and run in a Continuous Integration environment (beyond the scope of this presentation); scriptName.js - UIAutomation JavaScript for the Instruments tool
- 52. FoneMonkey vs. UIAutomation UIAutomation is a good framework with support for almost every UIKit component and gesture, but it has 4 major downsides that make it very hard to use in the long term: impossible to extend; very vulnerable to changes; takes a long time to write tests; can't be integrated into a CI environment. FoneMonkey supports fewer components and gestures, but it addresses all of the points above, so in the long run it is the best solution in my opinion
- 53. Other alternatives FoneMonkey is my favorite UI testing framework. However, there are many other solutions; the most famous are: Selenium - http://seleniumhq.org and Frank - testingwithfrank.com. You can try them out and see if they fit you better. Everyone is entitled to his own wrong opinion
- 54. Where to go from here Try it out! If you don't need to extend functionality, integrating FoneMonkey takes less than a day. Teach your QA team to record test scenarios. If you already have a CI environment, integrating FoneMonkey tests takes about 2-3 days
- 55. But what if I do need extended functionality? Extending FoneMonkey does take some work - this is probably how GorillaLogic actually makes money. Try to estimate the effort to integrate FoneMonkey vs. the effort you spend on manual testing. If you need help or advice - my mailbox is always open: [email_address]
- 56. Resources WWDC 2010 videos - Session 306; David Reidy's presentation - http://www.slideshare.net/DavidReidy/automated-ui-testing; Alex Vollmer's blog - http://alexvollmer.com/posts/2010/07/03/working-with-uiautomation/; FoneMonkey docs - http://www.gorillalogic.com/books/fonemonkey-docs/home; FoneMonkey iOS tutorial from iOS DevCon 2011 - http://www.gorillalogic.com/userfiles/fonemonkey/FoneMonkeyDevCon.pptx.pdf; Dr. Dobb's blog - http://drdobbs.com/open-source/231903414?pgno=1