Alexander Stone – Mobile Developer Portfolio
Hi, my name is Alexander Stone, and I’m an independent mobile developer. I have made apps for iOS, Android and earlier J2ME phones. This is a portfolio of my personal apps and the apps I have worked on since March 2011.
From April 2011 until March 2013:
- I have 7 iOS apps in the iOS App Store
- I have 2 Android apps on Google Play
- I give away about 100 free iOS apps a day
- Sales of my iOS apps buy me lunch every day (~$4,000 in 2012)
- One Android app broke 500,000 downloads
- My app support/marketing website receives over 20,000 visits per month
iOS App sales
While my Android apps were a labor of love, I actually tried to make money on the iOS App Store. Below are the screenshots of my efforts. It appears that, just as with websites, the older an app is, the more successful it is. Maybe this is due to the app sales page propagating to various app aggregating websites (where good keyword selection matters). All of my apps seem to have a very even distribution of sales, with the exception of the following events:
- The initial release generates a splash of interest (Augmented Reality: Glamour hit 3,400 downloads on the first day, diminishing after a week)
- Posting apps to appropriate Reddit pages generated up to 100 paid app sales a day for 3 days
Personal projects, most recent first
What it is: Better Mood Tracker is an iOS app designed for iPhone. The app is built around the Experience Sampling Method (ESM).
The app is a comprehensive lifestyle tracker; its purpose is to collect rich data about the user’s lifestyle to be analyzed at a later point. The app has a built-in events log, sleep tracker, weather and a biological clock. I really like this app and have been using it daily for over 4 months now to keep track of what I do and when I do it.
I had been doing sleep tracking with a similar tool for over 9 months before building Better Mood Tracker. I found that sleep data alone is not enough to make judgments about the user’s lifestyle, which is why I extended the app with event logging and other features I liked.
Better Mood Tracker was built over 6 weeks; I used it to integrate all of my existing iOS knowledge in a single app. It is the prime example of what I can do as a developer.
While most of my apps are complex, I will use Better Mood Tracker to highlight some of the frameworks and concepts that I’m proficient with:
- A customized GMGridView to allow for drag/drop/rearrange and delete functionality
- Backed by Core Data entities, acting as a template for each event
- User is able to customize each event template
- Has a photo gallery to display all pictures stored with events
- Use of tap, pan gesture recognizers
- Customized FGallery for photo browsing
- Adapted to work with Core Data-stored images
- Images are compressed for storage
- Option to save photos to camera roll
- Custom adaptation of Pick-A-Mood instrument (with permission of copyright owner)
- X-Y color map algorithm mapping HSV color to touch location
- Custom interface layout
- Tap and Pan gestures
- Mood compass – CGAffineTransforms are used to achieve arrow rotation and resizing
- Mood information is saved to Core Data
- A heavily customized UITableView
- Custom queries to Core Data pull multiple entities (events, moods and sleep) to display in a single table
- Uses a UIScrollView and UITableView to achieve side-scrolling, as well as vertical movement
- Implemented Icon caching for quicker scrolling
- Each row and each marker is tappable to bring up the corresponding view of data
- “Classic” interface layout, designed to work with iPhone 4 dimensions
- 3rd party algorithm to calculate sunrise/sunset from GPS location and date
- 3rd party weather service using AFNetworking and JSON deserialization
- Image cache for fast scrolling
- Animation of clock hand rotations
- The screen can be populated with past events (tapping on a row in history brings the user here)
- A customized UITableView pulling a single entity from Core Data
- Caches event icons
- Uses predicates to dynamically split events into table view sections by day
- Uses Core Motion framework for activity sampling
- Algorithm processes motion events and stores results in Core Data
- Calculates multiple sleep metrics
- Crash-proof in the event of the app terminating or the phone running out of battery
- Live clock with custom font provided with the app
- Screen dimming, screen saver using a thread-safe timer
- Built-in help, instructions and a link to online help
- Custom UITableView backed by queries to Core Data
- Sleep records are broken down by day
- A graph view to display sleep cycles
- Animated tooltip, responding to gestures
- Algorithms for motion analysis using Core Motion
- Delegation used to make actigraph work with multiple parts of the app (logging, display, analysis)
- Bar graph, scrolls to show any data in the app
- Live line graph to show sensor filtering in action
- Calibration screen, using statistics to filter out sensor noise
- Two donate options implemented with in-app purchasing
- Check for network availability
- Uses specific app identifier for app store
- Can retrieve previous purchases
- Use of local notifications to schedule reminders from awakening until bedtime
- Algorithm schedules reminders to cover the entire user day with just a few reminders
- A number of options and features are implemented
- Passcode Lock
- Different Pick-A-Mood characters
- Reminder Management
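The reminder-scheduling algorithm mentioned above boils down to spacing a few prompts evenly between wake time and bedtime. A minimal sketch in Python (illustrative only – the app itself schedules local notifications in Objective-C; the function name and example values are my own):

```python
from datetime import datetime

def schedule_reminders(wake, bedtime, count):
    # Place `count` reminders at even intervals strictly between wake
    # and bedtime, so a handful of prompts covers the whole waking day.
    if count < 1 or bedtime <= wake:
        return []
    step = (bedtime - wake) / (count + 1)
    return [wake + step * i for i in range(1, count + 1)]

wake = datetime(2013, 3, 1, 8, 0)
bed = datetime(2013, 3, 1, 23, 0)
reminders = schedule_reminders(wake, bed, 4)  # 11:00, 14:00, 17:00, 20:00
```

Each reminder time would then be handed to a local notification, so four prompts cover a 15-hour day without clustering.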
- The biggest lesson I learned is to limit the scope of my projects. Time and time again I saw features that I built in a day require multiple days of polishing, debugging and documentation before they were “customer ready”.
- I used minimalistic graphics – some gradients and border colors – but the app could benefit from a dedicated graphics designer
- I got pretty good with Core Data queries – unlike traditional table views backed by an NSFetchedResultsController fetching a single entity, I use date queries to pull multiple entities out of Core Data and display them in a single table view. This requires more work, but is a lot more flexible.
- I took the opportunity to look into new things – the network weather fetch and in-app purchasing are things I had never done before.
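The multi-entity table idea can be illustrated outside of Core Data: merge differently typed records into one date-sorted timeline, then bucket it into sections by day. A Python sketch with made-up records standing in for the Core Data entities:

```python
from datetime import datetime
from itertools import groupby

# Hypothetical records standing in for Core Data entities.
events = [("event", datetime(2013, 2, 1, 9), "coffee"),
          ("event", datetime(2013, 2, 2, 13), "lunch")]
moods = [("mood", datetime(2013, 2, 1, 12), "happy")]
sleeps = [("sleep", datetime(2013, 2, 1, 23), "8h")]

# Merge all entity types, newest first, then bucket into sections by
# day -- the same shape a multi-entity history table needs.
timeline = sorted(events + moods + sleeps, key=lambda r: r[1], reverse=True)
sections = [(day, list(rows)) for day, rows in
            groupby(timeline, key=lambda r: r[1].date())]
```

In the app, each section becomes a table view section and each merged record picks its own cell type.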
Blue Light Therapy is an iOS app built for iPhone and iPad. The app is an experimental attempt at influencing circadian rhythms and the human biological clock with light. This is my simplest app, and also the one with the highest user ratings. Surprisingly, it also has the most downloads per day, and is the only one of my apps that appears as the #1 search result for multiple related keywords.
- Simple design – only 3 screens, 3 options
- Uses OpenGLES to create ripple effect with gesture recognizers
- Plays a single music track on a loop
- Wallpaper mode – iOS idle timer disabled
Icon Maker++ is an app that uses an iPhone to build icons for Xcode projects. Each app requires multiple iPhone and iPad icons, and with the help of Icon Maker++, I can quickly lay out, preview and create an icon. Unlike all other apps on the market, this one supports layers, which are manipulated with gestures.
- Uses Core Graphics and affine transforms for rotation and reflection
- Scroll Views for scaling and translation of images
- Dropbox-to-Core Data sync
- Custom table view and GMGridView for layer and frame management
- Imitation of the iOS 5.1 App Store look and feel to preview the icon, name and company name
- Core Data for persistent icon storage
- Usability – an icon can be made in minutes, right on the device
- I worked with a graphics designer acquaintance for this project
Augmented Reality: Glamour is an iOS app that blends two facial images together. Another name for the concept is “overlay camera”: the user selects an image from the web and positions a live camera feed over it. With a bit of transparency, the user’s brain merges the two images into a single one. The wrinkles, facial contours and emotions of a (celebrity) face are seen on one’s own reflection.
I developed this app with a female audience in mind, hence the “glamour” in the name. There were not many apps featuring the keyword “glamour”, so I included the word to capture a part of the search results for it.
I really like the concept and enjoyed playing with it for many hours. This is my 2nd most downloaded app.
- Uses Core Video, AVFoundation to configure a live camera feed within a UIView (not a preview layer)
- Uses gestures to pan, scale and rotate the live camera feed
- Uses a UIWebView to fetch an image from the web
- Screenshot functionality – WYSIWYG
- iPhone and iPad app
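The overlay effect itself is plain alpha compositing: each displayed pixel is a mix of the chosen photo and the live camera frame. A minimal grayscale Python sketch of that blend (illustrative values; the app achieves the same result with view transparency rather than per-pixel math):

```python
def blend(base, overlay, alpha):
    # Per-pixel alpha compositing of two same-size grayscale "images"
    # (lists of rows): out = (1 - alpha) * base + alpha * overlay.
    return [[round((1 - alpha) * b + alpha * o) for b, o in zip(brow, orow)]
            for brow, orow in zip(base, overlay)]

photo = [[0, 100], [200, 250]]    # the chosen web image
camera = [[250, 100], [0, 50]]    # the live camera frame
merged = blend(photo, camera, 0.5)  # 50% transparency
```

At alpha 0 the photo is untouched; at alpha 1 only the camera feed shows. The midpoint is where the brain does the merging.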
5 in 1 Journal was my first attempt at an iPhone journal-like app that recorded events over the course of the day. It is fairly simple, and is currently one of the apps I sell on the iOS App Store.
With its long, keyword-rich name, 5 in 1 Journal was my first attempt at figuring out how the App Store ranks apps by keywords. “Journal” is an excellent keyword, and many journal apps are just glorified notepads. I improved on the concept by making entry creation a one-tap operation. The app was my first to use iOS 5 storyboards and Core Data. A solid app, it repackages my earlier work in a different commercial package. The app has positive user reviews but has not had sustained marketing effort. Over the past year, it has slowly grown in popularity.
Singularity Experience is the reason why I’m an iPhone developer today. I learned iOS to be able to build this app. What is the app? It was my first commercial attempt at building a lucid dream induction device using an iPhone. I put an unbelievable amount of work into this app, with 2 months of long, daily development sessions.
The app set out to accomplish a very ambitious task: to be the tool that helps the user experience dreams consciously. Dreams do not work the way the real world does, and text is prone to changing arbitrarily, so I spent a lot of time designing a custom user interface that would be unique to the app.
With this app, I was able to predict when the user is dreaming – in Rapid Eye Movement (REM) sleep – something that, to my knowledge, had never been accomplished with actigraphy before.
- Motion processing algorithms using Core Motion
- Custom interface layout with XIB files and programmatic layout
- Use of audio library to select and play music tracks
- Persistent data storage with Core Data
- Custom sleep graph using UIViews
- UIView animations with blocks
- I worked with a volunteer tester for this project
Bio2Real time was my first iOS app published in the App Store. This small app was intended to help people find appropriate bedtimes and rise times based on how long they wanted to sleep. The app uses GPS and the day of the year to calculate sunrise and sunset, then provides bedtime and rise time suggestions to the user.
The main reason I built this app was to gain experience with the process of publishing apps to the App Store. The entire app was built from existing components and published in a single day. While the app was mentioned as “New and Noteworthy” by Apple staff, the original submission was rejected due to its GPS use; I had to explain that GPS is used for astronomical calculations and nothing else.
Other iOS apps I have worked on during my employment
I spent some time working for DataViz, a small mobile-centric software company, and at a mobile center of excellence at Pfizer. I had the opportunity to work on commercially successful iPhone apps.
HemMobile at Pfizer – I worked with a small development team and a graphics artist to implement the vision of a mobile tracker app for hemophilia patients. I did a large amount of work on custom interface layout, Core Data queries, networking and the MapKit framework. Additionally, I had an opportunity to look into Dropbox and iCloud syncing of Core Data objects. The experience of working with pre-planned wireframes and a graphics designer was new to me.
I did a small amount of work on Passwords+, a passwords app for iPhone. This included laying out a couple of view controllers and doing some behind-the-scenes bug fixing.
I worked primarily on SpreadsheetToGo, a mobile version of Excel, with some work on the Word and PowerPoint side as well. I understand how those apps work and how they are built. Most of my work involved debugging hardcore Objective-C++ code, and I resolved around 20 bugs in the short time I was at DataViz. This taught me to step lightly around Objective-C++, and influenced all of my subsequent development to rely on open source Objective-C code. I became pretty good with the Xcode debugger, learned how to break on exceptions, and learned to work with build settings. I was briefly exposed to networking and sync (SugarSync).
Personal prototype projects
I have a small number of interesting projects that are not consumer-ready yet.
NeuroSky MindWave Mobile electroencephalogram (EEG) display
I’ve long wanted to look at my own brainwaves with EEG, but until now the cost has been prohibitive. With the NeuroSky MindWave Mobile headband, I was finally able to make that dream a reality. The product comes with a series of pre-built apps, which I found rather lacking or random. I wanted to look at the raw data, and adapted one of my earlier devices – the Actigraph – to analyze and display raw EEG data.
- Uses custom real-time graph widget, updating at 200 times per second
- Analyzes EEG data, creating a single metric of brain activity
- Plays audio in response to the change in brain activity metric (the user no longer has to stare at the graph to observe changes, one can listen for them while doing something else)
- Uses Core Data to store EEG band information in scrollable, persistent bar graphs
- The project influenced two other hackers to purchase their own MindWave Mobile devices.
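The single activity metric can be as simple as a windowed RMS amplitude of the raw signal. A Python sketch under that assumption (the metric the app actually uses may differ; the window size here is arbitrary):

```python
import math

def activity_metric(samples, window=4):
    # Collapse a raw signal into one number per window: the RMS
    # amplitude, a simple stand-in "brain activity" metric.
    out = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        out.append(math.sqrt(sum(x * x for x in chunk) / window))
    return out

raw = [0, 3, -4, 0, 6, -8, 0, 0]  # made-up raw EEG samples
metric = activity_metric(raw)     # one value per 4-sample window
```

A rising metric can then drive audio feedback, which is what lets the user listen for changes instead of staring at the graph.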
Tracked Platform using iPhone and WiRC module
I wanted to make an iPhone able to move and interact with its environment by itself. Normally, an iPhone has all these wonderful sensors, cameras and displays, but it cannot move. So I figured out how an iPhone app can control a chassis and servos, and assembled a prototype that I can drive remotely: I get a camera image from the platform over WiFi and control the platform using the phone’s orientation. I can rotate the camera and look around remotely. The platform is steered and accelerated using Core Motion. It works. The next step is to mount the iPhone on the platform and have it choose which way to go by itself.
- Uses an RC tracked platform with two independent tracks
- Uses a wi-fi module (WiRC) to send control signals to servos and electronic speed controllers
- Has a Pan-tilt camera turret with live camera feed
- Move the phone to steer using CoreMotion
- Tap on a camera image to make the camera rotate and look at that point
- GPS location updates integrated with MapKit to show platform path
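Steering by phone orientation is a standard differential-drive mix: pitch sets the common forward speed, roll skews the two tracks in opposite directions. A Python sketch assuming tilt is already normalized to [-1, 1] (names and ranges are my own; the app reads attitude from Core Motion):

```python
def track_speeds(pitch, roll, max_speed=100):
    # Differential-drive mix: pitch drives both tracks forward or back,
    # roll adds opposite offsets to the tracks to turn. Outputs are
    # clamped to the range the speed controllers accept.
    left = max(-max_speed, min(max_speed, (pitch + roll) * max_speed))
    right = max(-max_speed, min(max_speed, (pitch - roll) * max_speed))
    return round(left), round(right)

ahead = track_speeds(1, 0)  # full tilt forward: both tracks equal
spin = track_speeds(0, 1)   # pure roll: tracks opposed, spin in place
```

The resulting pair of speeds is what gets sent over WiFi to the WiRC module as servo/ESC commands.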
iPhone-based spectrograph
I’ve assembled a simple device that breaks up light into its monochrome color components to test my household lights. The device works, and demonstrated that light filtering indeed eliminated the wavelengths of light I was interested in. It was a very simple project that nonetheless fulfilled its purpose: I analyzed all of my household lights and found that none of them are good to use at night.