Pittsburgh Tech Fest: iOS Best Practices slides & code

Pittsburgh Tech Fest was great this year! It was a perfect opportunity to learn about some different technologies and techniques.  I’d like to give a special thanks to Dave and Eric for doing an awesome job organizing the event and the speakers.

For those who are interested, below are the links for the talk I gave, iOS Best Practices: Avoid the Bloat. Feel free to comment or ask any questions on this post!

Code before refactoring: https://github.com/JAgostoni/iOS-Best-Practices/tree/master/UglyApp
Code after refactoring: https://github.com/JAgostoni/iOS-Best-Practices/tree/master/NotSoUglyApp
Code as presented at Pittsburgh Tech Fest: https://github.com/JAgostoni/iOS-Best-Practices/tree/master/PghTechFest

PowerPoint Slides: iOS Best Practices – Pittsburgh Tech Fest
PDF Slides: iOS Best Practices (PDF) – Pittsburgh Tech Fest

Thanks again to those that attended my talk!

Creating the Sports Schedules App for the iPhone

I finally released my first real iPhone application in the app store that I did not create specifically for a client. The app is called Sports Schedules (great name, eh?) and is available via the Apple App Store here: bit.ly/SportsAppStore.

First, a quick background

I really do enjoy watching Pittsburgh sports, and it is pretty easy to remember to watch a Steelers football game.  But when it came to college hoops or hockey, I frequently forgot about the games and missed them. Being pretty busy with work as well as (the more full-time job of) entertaining my 5-year-old didn’t leave much mental capacity for remembering there was a game that night.

Well … I have this nice smartphone, so perhaps I’ll put it to work for me, right? So I launched the App Store app and searched for Sports Schedules. I was faced with the very standard problem of finding a useful app in a sea of apps that didn’t do what I wanted. To top it off, there wasn’t an app simply called Sports Schedules, so I figured I would just go ahead and create it.

Other apps

There are certainly other apps out there which have schedules for sports in them. First, each sport and even many teams have apps specific to them, but I really didn’t want a bunch of one-off apps. Next, there were apps like ESPN, etc. While they have schedules in them … I really wasn’t impressed with the way the information was organized. Lastly, there are apps which simply put events on your calendars, and I wasn’t really pleased with that scenario either.

Design

The first thing I was concerned with was creating a nice, simple design for the application.  I started with some wireframe mockups using Balsamiq (great tool) so that I could initially focus on the information flow within the application.  Since it isn’t (initially) a large app, this didn’t take much time: I wanted a dashboard showing the week at-a-glance, and the ability to add and remove teams.

Next, I wanted to spend some time skinning this wireframe, as I wanted to focus on creating a visually appealing app.  This reinforced the fact that I am a TERRIBLE designer.  I spent hours creating what I thought was an acceptable design and showed it to my wife.  Trying to be nice, she said: “It looks like you did it yourself … can’t you buy a design on the Internet or something?”  Thankfully … you can … and I did!

Data

Now that I had proven my lack of design skills, I decided it wasn’t worth putting too much time into the app until I was able to locate a source of the data for the schedules.  Initially, I had thought about just entering it myself or paying someone a few bucks to enter the data into a database manually and keep it updated.  The problem is, that does not scale and is in no way reliable.  Instead, I started the search for a source of the data as an online service.  After searching for well over a week and contacting many, many sales people (still gettin’ those emails) I finally found a source for the data … data that is not free and not cheap, as it turns out.  Before fronting the cash, I decided I’d better actually create the app first.

Developing the app

Once committed to creating the app, the problem became finding time to actually sit down and develop it.  I set a self-imposed deadline of “May” to try to keep motivated. With a full-time job and a family, this was indeed the most challenging part.  However, I managed to push through it and get the code written, so I was able to move on to the next phase: the services.

Services

Next I needed to decide how to get the data updated on the app.  I didn’t want to spend a lot of money on servers but also I wanted to be able to scale up easily.  Elastic cloud computing seemed to make sense but last I had checked cloud services were a little pricey.  Lucky for me, Amazon started their deal where new customers get a pretty good share of EC2 instances and EBS storage free for a year so my decision was pretty easily made.

Next, I needed to decide whether to use a Windows instance or a Linux instance.  In a previous life, I did a fair amount of work on Linux, but in recent years I was far more productive in Windows, so that was my original choice.  The thought was that if I wanted to expand with some web applications I would have a nice .NET and SQL instance set up.  However, I quickly realized that Windows instances in Amazon are not nearly as economical as Linux instances, so I scrapped that idea and set up an Amazon Linux AMI instance.  Frankly, the fact that the Windows instance uses a minimum of 30 GB of EBS storage and the Linux instance only 6 GB solidified that plan.

For better scalability and performance, I decided to generate static XML data files for the iPhone app to download.  This would mean fewer resources used on the Linux instance and would enable me to leverage Amazon’s CDN if necessary to better distribute the content.  As a result, I actually have no real web services serving up the content.
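For illustration, one of those static files might look something like the snippet below. The actual feed format isn’t shown in this post, so the element names, attributes, and values here are all made up:

```xml
<!-- Hypothetical static schedule file; the app would download and parse one per team. -->
<schedule team="pittsburgh-hockey" updated="2012-06-01T06:00:00Z">
  <game date="2012-06-02" time="19:00 ET" opponent="Philadelphia" home="true"  tv="NBCSN" />
  <game date="2012-06-05" time="20:00 ET" opponent="New York"    home="false" tv="ROOT"  />
</schedule>
```

Because the files are static, they can be served straight from the web server (or a CDN) with simple HTTP caching, with no per-request server work.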

Integration

With the plan for services underway, I now needed to figure out how to get the data from my provider into a format suitable for the app.  As an Integration Architect on many projects, I learned a lot of patterns and techniques as well as best practices for this sort of problem.  The primary takeaway is to decouple the data source from the apps in the event I need to replace it or even aggregate the data from multiple sources.  Also, having worked extensively with BizTalk, I saw great value in leveraging an integration platform, but BizTalk is far too pricey for my problem.  I decided to investigate free/open source integration solutions.

There are several FOSS integration platforms out there: WSO2, Mule, JitterBit, etc.  JitterBit caught my attention for a few reasons: it was simple, it had just the features I needed, and the installation was relatively lightweight.  It also came with a decent development environment.  JitterBit allowed me to very quickly integrate the XML data source with my database (PostgreSQL) and then to output the static XML files.

More app development

Now that I had a steady stream of data coming in from the provider, I needed to finish the coding in the application.  Hooking up to the XML content was a pretty standard academic exercise as was adding in some small features and fixing bugs.

App Store

The part I dreaded most was submitting to the App Store.  One can spend a lot of time and a lot of money just to have Apple reject the app for some completely unforeseen reason.  Nearly every time I submit to the App Store, I get rejected at least once.

I submitted the app fully expecting to get rejected … what I didn’t plan on was that I would be the one doing the rejecting.  On two occasions while waiting for approval, I found bugs and needed to reject my own app.  After re-submitting it … twice … I was very pleasantly surprised when my app was approved after almost exactly one week.

After all the effort, my app was finally in the App Store and working well … maybe a few bugs that need to be squashed … but overall a pretty smooth launch.

TFS Xcode Build – v1.0 Released

Xcode, TFS and the ALM …

Many organizations have been faced with centralizing all of their ALM tools in order to enable better integration across the tools for each role in the application lifecycle.  Team Foundation Server (TFS) provides an excellent integration environment for Microsoft .NET projects and even applications developed in Eclipse (Java, Android, etc.).  There have been many recent advances into the mobile space, especially in iOS applications, and my work is certainly no exception.  Since CEI has adopted TFS as our ALM platform, I have been keeping all of my Xcode projects in TFS via the Subversion bridge.

But what about builds?

While storing Xcode projects in TFS works quite well (including the ability to associate with Work Items), one of the primary features of TFS (and any integrated ALM platform) is Build Automation.  Since Xcode projects can ONLY be built on Mac OS X, there simply was no way to trigger a build using Team Build in TFS.  Alternatives exist, for sure … there are other CI platforms that can be triggered via SVN (svnbridge in TFS), but that requires more investment in software.

What I wanted was a solution leveraging Team Build as much as possible, with a Mac used only where it was needed to compile the Xcode project.  The thought of implementing a Team Build Agent on the Mac was … frightening 😉  So, instead, I decided to automate copying the source code from the Team Build server to the Mac (using SCP), remotely triggering xcodebuild (via SSH), and finally retrieving the results (again, via SCP).  It turns out this was pretty straightforward and reliable.  To share this, I created a Codeplex project to host the source code and binaries.
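The copy/build/retrieve sequence can be sketched as a plain shell script. The host name, remote directory, and project name below are hypothetical placeholders (the actual project drives this from Team Build), and the script defaults to a dry run that just prints the commands:

```shell
#!/bin/sh
# Sketch of the Team Build <-> Mac round trip described above.
# BUILD_HOST, REMOTE_DIR, and MyApp are hypothetical placeholders.
set -e

remote_build() {
    build_host="${BUILD_HOST:-builder@mac-mini.local}"
    remote_dir="${REMOTE_DIR:-/Users/builder/builds/MyApp}"

    # With DRY_RUN set, print the commands instead of executing them.
    if [ -n "${DRY_RUN:-}" ]; then run="echo"; else run=""; fi

    # 1. Push the sources checked out by Team Build over to the Mac.
    $run scp -r ./Sources "$build_host:$remote_dir"

    # 2. Trigger xcodebuild remotely via SSH.
    $run ssh "$build_host" \
        "cd '$remote_dir/Sources' && xcodebuild -project MyApp.xcodeproj -configuration Release build"

    # 3. Pull the build products back for Team Build to publish.
    $run scp -r "$build_host:$remote_dir/Sources/build/Release" ./Drop
}

# Dry run by default so the sketch is safe to run as-is.
DRY_RUN="${DRY_RUN:-1}"
remote_build
```

Clear DRY_RUN (and point the variables at a real Mac with SSH key authentication set up) to execute the commands for real; the actual CodePlex project wraps the same idea in Team Build workflow activities.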

TFS Xcode Build v1.0

Check out the project hosted on Codeplex here: http://tfsxcodebuild.codeplex.com/.  There you can find the latest source code, binary release and documentation.

iOS Best Practices – Singletons

Problem

Many examples found online utilize the AppDelegate instance for global storage/variables.  While this is a quick way of sharing data and methods between views and classes it can (and usually does) lead to several problems:

No control over global variables/storage

Each referencing class assumes direct control over this variable and won’t necessarily respect how another class is expecting to use it.  With a singleton, the data has been fully encapsulated and controlled in one place.

Repeated business logic

If there is any business logic on how this global storage is to be used it has to be repeated throughout the application.  While some may “encapsulate” this by using accessor methods the logic is in the wrong place.

Big Ball Of Mud

Very quickly, the AppDelegate class will become a big ball of mud and VERY difficult to maintain.  This is compounded over time as the app goes through revisions and different developers add more and more code to the ball of mud.

Fixing the problem: Singleton

One way of fixing the “I need to put all my global variables in the AppDelegate” problem is to use singletons.  A singleton is a design pattern (and implementation) ensuring that a given class exists with one and only one instance.  The developer can now store related variables and implementations together with the confidence that the same data will be retained throughout the application.  In fact, the AppDelegate itself is reached through a singleton: [[UIApplication sharedApplication] delegate].

The developer must also take care not to repeat the same “big ball of mud” anti-pattern by simply moving all the code from the AppDelegate into one singleton class.  The concept of single-purpose classes will be covered in a future post.

Implementation

The implementation is pretty straightforward, based on Apple’s fundamentals, and is made even simpler using ARC in iOS 5.  The trick is ensuring all code that references this class uses the exact same instance.

Steps/tips for a Singleton in Objective-C:

1. Implement a “shared instance” static method to lazily create the instance and return the same one each time.

static SingletonSample *sharedObject;

// Note: this lazy initialization is not thread-safe; if multiple threads may
// call it, wrap the allocation in dispatch_once or @synchronized.
+ (SingletonSample *)sharedInstance
{
    if (sharedObject == nil) {
        sharedObject = [[super allocWithZone:NULL] init];
    }
    return sharedObject;
}

2. Provide public “shared” class methods as a convenience to encourage use of the singleton.

+(NSString *) getSomeData {
    // Ensure we are using the shared instance
    SingletonSample *shared = [SingletonSample sharedInstance];
    return shared.someData;
}

3. Create and use instance variables and methods as you normally would

@interface SingletonSample : NSObject {
    // Instance variables:
    //   - Declare as usual; they are initialized when sharedInstance
    //     first allocates and initializes the object.
    NSString *someData;
}

// Properties as usual (under ARC, use strong instead of retain)
@property (nonatomic, retain) NSString *someData;

4. Use the class via the shared methods and/or instance

- (IBAction)singletonTouched:(id)sender {
    // Using the convenience method simplifies the code even more
    self.singletonLabel.text = [SingletonSample getSomeData];
}

The full source code with sample application is available here: https://github.com/JAgostoni/iOS-Best-Practices/tree/master/BigBallOfMud

iOS Best Practices – Introduction

As my work gets more and more into mobile development (primarily iOS), I find our typical adherence to best practices (from .NET and Java, for example) is not as strong. Whatever the reason, these practices are just as important on mobile platforms as they are on web and desktop platforms. The fact that a mobile device has more constrained resources or fewer technology choices should have no impact on proper coding and design practices.

There are several books (Apple and otherwise) that I can recommend that cover basic coding conventions and UI design guidelines, so I’ll try not to re-hash much of that here. As I encounter references and resources such as these, I will link to them in a resources section.

The following series of posts is meant to document iOS design and development best practices as I have encountered them both in practice and as I have found them researching across the Internet. I encourage anyone following along to not only share the practices but to critique and contribute as well.

This introductory post will serve to index all the posted best practices:
1. Avoiding the big ball of mud (part 1) – Singletons – http://wp.me/p15S8e-3d

WCF Web Service Latency on BizTalk 2010

I was recently working with a client to roll out some WCF web services on BizTalk 2010 which access an Oracle database to facilitate the query (using the WCF-OracleDb adapter).  As expected, we were getting some pretty serious latency (1-2 seconds per request), as BizTalk is tuned for throughput, not low latency.  That, however, is not really acceptable for web services.

To fix this, we needed to adjust some BizTalk tuning parameters to improve the latency.  Specifically, we looked at the polling interval for messages on a given host.  By default, it is set at 500 ms, which means we’ll most likely have a MINIMUM of 0.5 seconds to process the request, and most likely more as the message is sent to the orchestration, to the WCF-OracleDb adapter, and then back to the orchestration.

We decided to create a new BizTalk low-latency in-process host to accommodate our web services.  Fortunately for us, BizTalk 2010 makes it even easier to set some of the previously registry-based or inaccessible tuning parameters.  After creating our host and host instances, it was pretty straightforward:

  1. Open the Admin console, right-click on the BizTalk group and select settings
  2. Under the Hosts node, we selected our low latency host from the drop-down
  3. Now, under the Polling intervals we set both Messaging and Orchestration to 50 ms instead of 500 ms
    1. A better practice would be to have split receive, process, and send hosts and set the polling more specifically.   This was not required in our case.
  4. After selecting the host in the bindings (orchestration and send ports; not needed for the isolated receive port) and restarting, we got the average response time down to 300 ms, which puts us at the mercy of the Oracle database’s response time.

We also adjusted the internal message queue size for the host to ensure more messages can be kept in memory for even lower latency.

Take a look at this article for more specific details.


TFS 2010 Configuration: TfsJobAgent Won’t Start – Access Denied

While setting up and configuring TFS for a client the other day, we ran into a strange error during configuration.  The TFS 2010 configuration failed to complete because the TfsJobAgent service could not start.  The error was simple and straightforward: Access Denied.

Usually there are several items to check here:

  1. Are the credentials for the service account correct? (The error would have told us Logon Failure anyway)
  2. Does the service account have the Log on as service policy right?
  3. Is the service account NOT in the Deny log on as service policy?
  4. Are these policies being locked/overridden via an AD policy, etc.?

Well … we exhausted all of these options and still could not determine the cause.  We decided to get a fresh set of eyes on the issue, and he pointed out the brutally obvious to us by asking: Does the service account have file system rights to the EXE?  Check the ACLs.

Brilliant!  Turns out this client restricts folder ACLs on their servers to aid in security and the TFS service account didn’t have access to this folder. 

So now I can add this to my “pre-flight” checklist.

Getting off the ground with the WCF-OracleDB adapter in BizTalk

Time and time again I walk into a client which uses Oracle and I need to connect BizTalk to it.  And … time and time again I run into issues getting the correct Oracle client installed, then getting Visual Studio to pick it up, and so forth.  Finally, I think I have arrived at a formula for getting things working.

Step 1: Install the WCF LOB Adapter pack

Ensure that you have installed at least the 32-bit version on a development workstation as well as the 64-bit version for any 64-bit environments.  The BizTalk 2010 installation program does a nice job of walking you through installing the SDK and then the LOB packs so I won’t go into any more details.

Step 2: Obtain and install the ODAC 11.1.0.7.20 client

Technically, the adapter pack is compatible with an older 11g client, but this will be taken care of via a slew of publisher policies that “redirect” your client to the installed version.  To get the client, you can search for ODAC1110720 or go to Oracle’s Windows download page, locate the ODAC client links, and then the specific version listed above.  I honestly have not tested with the newer 11g clients, but perhaps the same/similar process will work with them.

Step 3: Update the adapter pack’s assembly binding

Locate the Microsoft.Adapters.OracleDB.Config file in the Adapter Pack’s install directory (c:\Program Files\Microsoft BizTalk Adapter Pack\bin) and add the following XML snippet in the <assemblyBinding…> element:

<dependentAssembly>
  <assemblyIdentity name="Oracle.DataAccess" publicKeyToken="89b483f429c47342" culture="neutral" />
  <bindingRedirect oldVersion="2.111.7.00" newVersion="2.111.7.20" />
</dependentAssembly>

This will redirect all calls against the older 2.111.7.00 client assembly version to the installed 2.111.7.20 version.
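For reference, the dependentAssembly element belongs inside the file’s existing assemblyBinding section. Assuming the config file follows the standard .NET configuration layout, the surrounding structure looks roughly like this:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Oracle.DataAccess" publicKeyToken="89b483f429c47342" culture="neutral" />
        <bindingRedirect oldVersion="2.111.7.00" newVersion="2.111.7.20" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```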

At this point, the Add Generated Items->Consume Adapter Service wizard should work to connect to your Oracle server.  Depending on your standards, you may need to get a tnsnames.ora file in the correct location OR skip TNS resolution and use the server’s direct settings in the binding configuration.
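If you do go the tnsnames.ora route, a hypothetical entry looks like the following (the alias, host, and service name are placeholders for your environment):

```
# Hypothetical tnsnames.ora entry; alias, host, and service name are placeholders.
BTSORA =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = oradb01.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = orcl.example.com)
    )
  )
```

Skipping TNS resolution instead means supplying the same host, port, and service name directly in the WCF-OracleDB binding configuration.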

Please leave a comment if this does (or does not) work for you!

New TFS Build Extensions

Per Brian Harry:

Mike Fourie just published a bunch of workflow activities/actions for TFS builds.  It’s a great set of extensions that makes TFS builds even more powerful with less work.

While there doesn’t seem to be any true documentation on the actual extensions themselves, looking at the bundled/generated CHM it looks like we have some new extensions in the following categories:

  • IIS7 Integration – looks like creating components in IIS to roll out a web application (application, site, app pool, etc.)
  • VB6 Builds
  • Hyper-V/VirtualPC Integration – Tools to manage Virtual PCs and to interact with Hyper-V
  • SQL Server command execution
  • WMI script execution
  • PowerShell script execution
  • ZIP Integration (one that is frequently asked of me by our customers)
  • Sending emails
  • Code Metrics integration
  • StyleCop integration
  • NUnit integration
  • File system, assembly info/update, RoboCopy and more

Seems like a pretty decent list of enhancements for free!  Grab them here:

http://tfsbuildextensions.codeplex.com/releases/view/67138

BizTalk Schema Inheritance Practices / Examples

Frequently I find myself at clients explaining BizTalk best practices and one of the items I always try and push forward is proper schema inheritance practices.  Much like object-oriented design, schema inheritance can help create a nice canonical domain model.  However, there are always questions as to which type of inheritance to use.  While these practices go beyond just BizTalk, I wanted to focus this on implementation within the BizTalk tools.

Microsoft (and others) do put out documentation on this very subject but it doesn’t really go into any good examples of each type so I plan to plug that gap here.  The first few sections go into some background and other practices … if you are just looking for examples and practices for the given types feel free to skip ahead.

Types of Schema Inheritance

To summarize, there are three ways to re-use schemas within your BizTalk solution:

  • XSD Import – Importing types from another schema/namespace
  • XSD Include – Importing types from another schema within the SAME namespace
  • XSD Redefine – Importing types from another schema with the intent of extending/overriding the types

These all sound very similar in definition but there are some key usage scenarios for each in practice.

Organization of Schemas

Before going into the re-use of schemas, it is important to think of re-use beyond just your solution, extending into the greater collection of BizTalk applications at a given organization.  This becomes even MORE crucial in the realm of BizTalk when dealing with dependency, deployment and maintenance issues.  Without getting into too much detail, here are some quick pointers:

  • When creating canonical schemas to be shared with multiple BizTalk applications they MUST be isolated in their own project/assembly.
  • Treat yourself like a 3rd party vendor when consuming these schemas: add a reference to a well-known, strongly versioned DLL of these schemas
  • Remember that BizTalk only allows a given schema message type to be defined once in the entire BizTalk group so sharing these at the source code level won’t work
  • Track these central schemas as a separate project/lifecycle so that there is a clean separation between your consuming application and these soon-to-be highly shared schemas
  • Enforce schema versioning (especially in the namespace) from the beginning.  When you roll out an update you’ll not only have to version the .NET DLL but also the schema namespace

Creating ComplexTypes

The first step in re-using other schemas is to define the reusable schema itself (obviously) and then to create actual schema types (ComplexTypes) out of the bits to re-use.  This is the simple part: simply select the “record” node which you want to re-use/share and set the Data Structure Type property to the name you wish to use for the type.  The following screen shot shows the Data Structure Type property.

BizTalk - Schema Data Structure Type

It would be a good idea to define a naming standard/practice for these types … I generally use something like the node name plus the word “Type”.  For example, if I were creating an Order schema I would create a complex type named simply OrderType.

The following sections define a typical usage scenario for each type of schema inheritance.

XSD Import – Composition

Frequently, you have the same type of information repeated throughout your projects and solutions.  A classic example is the Address.  When defining a schema like Order, you can have several addresses all containing the same fields but representing different locations.  For example, you may want to have a shipping and billing address in your order and ensure the addresses are consistent. This is very simple to accomplish:

  1. Create a schema which represents your address, and create a ComplexType out of it as in the above example.
  2. Create a schema called Order and ensure it has a different target namespace than the Address schema
  3. Click on the <Schema> node, then open the Imports collection in the properties window
  4. In this window you’ll now be able to “XSD Import” the Address schema into the Order schema

The following screenshots depict the XSD Import of the Address schema:

BizTalk - XSD Import

Next, create a couple of Child Record nodes for holding a Billing and a Shipping address. In the properties window for each record, set the Base Data Type to the complex type created above.

BizTalk - Base Data Type

Now as the address type is changed, it will be reflected in your Order schema as well.
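Under the covers, the designer is emitting standard XSD. A rough sketch of what the import produces is shown below; the namespaces, file names, and element names are made-up placeholders, not what BizTalk generates verbatim:

```xml
<!-- Order.xsd: hypothetical sketch of the generated schema after the XSD Import -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:addr="http://example.com/schemas/address"
           targetNamespace="http://example.com/schemas/order">
  <xs:import namespace="http://example.com/schemas/address" schemaLocation="Address.xsd" />
  <xs:element name="Order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="BillingAddress"  type="addr:AddressType" />
        <xs:element name="ShippingAddress" type="addr:AddressType" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Note that both address records reference the same AddressType, which is what keeps them consistent when the Address schema changes.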

Another, perhaps more useful, example is adding additional context details to a schema.  Take, for example, the Order schema created above.  If you wanted to add additional process context items, you could wrap the Order schema in another schema (call it MyProcessOrder) where a child record node represents the Order and additional items are added as sibling nodes.  Using this method isolates your Order schema as the “Order entity” without polluting it with non-Order-related attributes.

XSD Include – Batch Schemas

XSD Includes are slightly different from Imports in that the target namespaces of the schema and the included schema must match.  The primary use case I have here is creating envelope schemas used for debatching.  Continuing the above Order example, convert the Order schema to a ComplexType (using the Data Structure Type property) so it can be included in another schema.

Use the following steps to create an Orders schema containing one or more Order records:

  1. Create a new schema called Orders
  2. Ensure the target namespace of the Order and Orders schema are IDENTICAL
  3. Open the Imports collection (same as above) and now you can “XSD Include” the Order schema
  4. Now, add a child record and set its type to the OrderType created above

The following screen shots show including the Order schema into the Orders envelope schema:

XSD Include

And then creating the Order record …

Order Record

Since this will be used for debatching, a few more properties need to be set:

  • Envelope: Yes
  • Root Reference: Orders
  • On the Orders node, set the Body Xpath property to point to the Order node
  • On the Order node, set the maxOccurs to unbounded
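With those settings in place, a batch instance like the hypothetical one below (element names and namespace are placeholders) would be debatched so that each Order node, the Body XPath target, becomes its own message:

```xml
<!-- Hypothetical Orders batch; each Order is split out as an individual message. -->
<Orders xmlns="http://example.com/schemas/order">
  <Order>
    <OrderId>1001</OrderId>
  </Order>
  <Order>
    <OrderId>1002</OrderId>
  </Order>
</Orders>
```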

XSD Redefine – Embrace and Extend

The XSD Redefine is probably the least-used type of schema reuse for me, as I generally find myself using it to “hack” around another schema or to over-simplify something … and actually making things more cumbersome in the end.

An easy example of this is to take the AddressType above and create a new type from it to include some phone numbers:

  1. Create a new schema called AddressPhoneType
  2. Ensure the namespace on AddressPhoneType matches that of AddressType above
  3. As above, use the Imports property to “XSD Redefine” the Address schema
  4. Add a new child record and set its Data Structure Type property to AddressType (this is different from earlier where the Base Data Type property was used)
  5. Now, add a new Sequence Group to the new child record and customize!

The following screen shot depicts the AddressPhoneType schema:

XSD Redefine

Now you can change the Address schema without affecting other “consumers” of it.  This is very useful if you want to embrace the re-usability of the Address type but not break a bunch of other schemas which inherit it as well.

With the above examples, BizTalk developers can embrace re-usability in their BizTalk schemas along with other best practices!