
Alexander Stone Mobile Portfolio


Hi, my name is Alexander Stone, and I’m an independent mobile developer. I’ve made apps for iOS, Android, and earlier J2ME phones. This is a portfolio of my personal apps and the apps I have worked on since March 2011.

From April 2011 to March 2013:

  • I published 7 iOS apps to the iOS App Store
  • I published 2 Android apps to Google Play
  • I give away about 100 free iOS apps a day
  • Sales of my iOS apps buy me lunch every day (~$4,000 in 2012)
  • One Android app passed 500,000 downloads
  • My app support/marketing website receives over 20,000 visits per month


iOS App sales

While my Android apps were a labor of love, I actually tried to make money on the iOS App Store. Below are screenshots of my efforts. It appears that, just as with websites, the “older” an app is, the more successful it tends to be. This may be due to the app’s sales page propagating to various app-aggregator websites (where good keyword selection matters). All of my apps show a fairly even distribution of sales, with the exception of the following events:

  • The initial release generates a splash of interest (Augmented Reality: Glamour hit 3,400 downloads on its first day, diminishing after a week)
  • Posting apps to appropriate Reddit pages generated up to 100 paid app sales a day for 3 days

My paid app sales per week. On most weeks, the sales are enough to buy me lunch every day. Recently they seem to have picked up slightly.

My free app downloads have stayed consistent over the last few months, with very little marketing effort.

Personal projects, most recent first

Better Mood Tracker

What it is: Better Mood Tracker is an iOS app designed for the iPhone. The app is built around the Experience Sampling Method (ESM).

Mood selector

The app is a comprehensive lifestyle tracker whose purpose is to collect rich data about the user’s lifestyle for later analysis. The app has a built-in event log, sleep tracker, weather, and a biological clock. I use this app day to day to keep track of what I do and when I do it. I really like this app and have been using it daily for over 4 months now.

I had been doing sleep tracking with a similar tool for over 9 months before building Better Mood Tracker. I found that sleep data alone is not enough to make judgements about a user’s lifestyle, which is why I extended the app with event logging and other features I liked.

Better Mood Tracker was built over 6 weeks; I used this app to integrate all of my existing iOS knowledge into a single app. It is the prime example of what I can do as a developer.

While most of my apps are complex, I will use Better Mood Tracker to highlight some of the frameworks and concepts that I’m proficient with:

Events logger

  • A customized GMGridView to allow for drag/drop/rearrange and delete functionality
  • Backed by Core Data entities, acting as a template for each event
  • User is able to customize each event template
  • Has a photo gallery to display all pictures stored with events
  • Uses tap and pan gesture recognizers

Photo Gallery

  • Customized FGallery for photo browsing
  • Adapted to work with images stored in Core Data
  • Images are compressed for storage
  • Option to save photos to camera roll
Mood Logging Component
  • Custom adaptation of Pick-A-Mood instrument (with permission of copyright owner)
  • An X-Y color map algorithm that maps a touch location to an HSV color (see the sketch after this list)
  • Custom interface layout
  • Tap and Pan gestures 
  • Mood compass – CGAffineTransforms are used to achieve arrow rotation and resizing
  • Mood information is saved to Core Data
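A minimal sketch of the two pieces above: mapping a touch location on the mood map to an HSV color, and rotating/resizing the compass arrow with a CGAffineTransform. The view names and the exact hue/brightness mapping are illustrative, not the shipping Pick-A-Mood mapping.

```objc
#import <UIKit/UIKit.h>

// Map a touch location inside the mood map view to an HSV color.
- (UIColor *)colorForTouchAtPoint:(CGPoint)point inView:(UIView *)mapView {
    CGFloat hue = point.x / CGRectGetWidth(mapView.bounds);                 // 0..1, left to right
    CGFloat brightness = 1.0f - point.y / CGRectGetHeight(mapView.bounds); // brightest at the top
    return [UIColor colorWithHue:hue saturation:1.0f brightness:brightness alpha:1.0f];
}

// Rotate the compass arrow toward the touch point and resize it in one transform.
- (void)pointArrow:(UIView *)arrowView towardPoint:(CGPoint)point scale:(CGFloat)scale {
    CGFloat angle = atan2f(point.y - arrowView.center.y, point.x - arrowView.center.x);
    CGAffineTransform rotation = CGAffineTransformMakeRotation(angle);
    arrowView.transform = CGAffineTransformScale(rotation, scale, scale);
}
```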
History
History is a component that I’ve worked on for a very long time. It uses color-coded circles with icons to display event times from day to day.
  • A heavily customized UITableView
  • Custom Core Data queries pull multiple entities (events, moods, and sleep) to display in a single table
  • Uses a UIScrollView and UITableView to achieve side-scrolling, as well as vertical movement
  • Implemented Icon caching for quicker scrolling
  • Each row and each marker is tappable to bring up the corresponding view of data
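A sketch of the multi-entity date query behind the History table. The entity and attribute names (Event, Mood, SleepRecord, timestamp) are illustrative stand-ins for the app’s actual data model.

```objc
#import <CoreData/CoreData.h>

// Fetch every record of one entity that falls on a given day.
- (NSArray *)recordsOfEntity:(NSString *)entityName
                       onDay:(NSDate *)dayStart
                   inContext:(NSManagedObjectContext *)context {
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:entityName];
    NSDate *dayEnd = [dayStart dateByAddingTimeInterval:24 * 60 * 60];
    request.predicate = [NSPredicate predicateWithFormat:@"timestamp >= %@ AND timestamp < %@",
                                                          dayStart, dayEnd];
    request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"timestamp" ascending:YES]];
    return [context executeFetchRequest:request error:NULL];
}

// Merge several entities into the markers for a single History row (one day).
- (NSArray *)historyMarkersForDay:(NSDate *)dayStart
                        inContext:(NSManagedObjectContext *)context {
    NSMutableArray *markers = [NSMutableArray array];
    for (NSString *entityName in @[@"Event", @"Mood", @"SleepRecord"]) {
        [markers addObjectsFromArray:[self recordsOfEntity:entityName onDay:dayStart inContext:context]];
    }
    return markers;
}
```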
My Day component
My Day is a biological clock with weather. It plots sunrise/sunset, weather, and user events on a single 24-hour view.
  • “Classic” interface layout, designed to work with iPhone 4 dimensions
  • 3rd party algorithm to calculate sunrise/sunset from GPS location and date
  • 3rd-party weather service using AFNetworking and JSON deserialization (see the sketch after this list)
  • Image cache for fast scrolling
  • Animation of clock hand rotations
  • The screen can be populated with past events (tapping on a row in history brings the user here)
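The weather fetch is a plain asynchronous request-then-deserialize step. The app uses AFNetworking for this; the Foundation-only sketch below shows the equivalent flow, and the URL is a placeholder rather than a real service endpoint.

```objc
#import <Foundation/Foundation.h>

// Fetch current weather as JSON and hand the parsed dictionary to a completion block.
- (void)fetchWeatherForLatitude:(double)latitude
                      longitude:(double)longitude
                     completion:(void (^)(NSDictionary *weather, NSError *error))completion {
    NSString *urlString = [NSString stringWithFormat:
        @"http://weather.example.com/current?lat=%f&lon=%f", latitude, longitude];
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (!data) { completion(nil, error); return; }
        NSError *jsonError = nil;
        NSDictionary *weather = [NSJSONSerialization JSONObjectWithData:data options:0 error:&jsonError];
        completion(weather, jsonError);
    }];
}
```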
Events Log
  • A customized UITableView pulling a single entity from Core Data
  • Caches event icons
  • Uses predicates to dynamically split events into table view sections by day (see the sketch below)
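A sketch of that day-splitting step: bucket the fetched events by calendar day so each table section holds one day. It assumes each event object exposes a timestamp attribute.

```objc
#import <CoreData/CoreData.h>

// Group fetched events into per-day sections for the table view.
- (NSArray *)sectionsForEvents:(NSArray *)events {
    NSCalendar *calendar = [NSCalendar currentCalendar];
    NSMutableDictionary *buckets = [NSMutableDictionary dictionary];
    for (NSManagedObject *event in events) {
        NSDate *timestamp = [event valueForKey:@"timestamp"];
        NSDateComponents *components = [calendar components:(NSYearCalendarUnit |
                                                             NSMonthCalendarUnit |
                                                             NSDayCalendarUnit)
                                                   fromDate:timestamp];
        NSDate *day = [calendar dateFromComponents:components]; // midnight of that day
        NSMutableArray *bucket = buckets[day];
        if (!bucket) { bucket = [NSMutableArray array]; buckets[day] = bucket; }
        [bucket addObject:event];
    }
    // Newest day first, matching the log's presentation.
    NSArray *sortedDays = [[buckets allKeys] sortedArrayUsingSelector:@selector(compare:)];
    NSMutableArray *sections = [NSMutableArray array];
    for (NSDate *day in [[sortedDays reverseObjectEnumerator] allObjects]) {
        [sections addObject:buckets[day]];
    }
    return sections;
}
```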
Sleep Tracker
Uses actigraphy algorithms to analyze the user’s sleep and calculate sleep metrics. It can work in contact with the body or with the mattress.
  • Uses Core Motion framework for activity sampling
  • Algorithm processes motion events and stores results in Core Data
  • Calculates multiple sleep metrics
  • Crash-proof: recording survives the app terminating or the phone running out of battery
  • Live clock with custom font provided with the app
  • Screen dimming, screen saver using a thread-safe timer
  • Built-in help, instructions and a link to online help
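A minimal sketch of the Core Motion sampling loop the sleep tracker is built on. The 50 Hz rate, the magnitude metric, and recordMotionSample:atTime: (the hook into the actigraphy algorithm and Core Data) are illustrative assumptions.

```objc
#import <CoreMotion/CoreMotion.h>

// Sample the accelerometer and feed a simple motion magnitude to the actigraphy code.
// Assumes a CMMotionManager property named motionManager.
- (void)startActivitySampling {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.accelerometerUpdateInterval = 1.0 / 50.0; // 50 Hz
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMAccelerometerData *data, NSError *error) {
        if (!data) return;
        CMAcceleration a = data.acceleration;
        // Deviation of total acceleration from 1 g approximates movement intensity.
        double magnitude = fabs(sqrt(a.x * a.x + a.y * a.y + a.z * a.z) - 1.0);
        [self recordMotionSample:magnitude atTime:data.timestamp]; // hypothetical hook
    }];
}
```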
Sleep Data
  • Custom UITableView backed by queries to Core Data
  • Sleep records are broken down by day
  • A graph view to display sleep cycles
  • Animated tooltip, responding to gestures
Actigraph
Actigraph is a tool used to study human motion. When used, it analyzes human motion and displays the results as a custom bar graph.
  • Algorithms for motion analysis using Core Motion
  • Delegation used to make actigraph work with multiple parts of the app (logging, display, analysis)
  • Bar graph, scrolls to show any data in the app
  • Live line graph to show sensor filtering in action
  • Calibration screen, using statistics to filter out sensor noise
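A sketch of the delegation idea that lets one actigraph instance serve logging, display, and analysis. The protocol and method names are illustrative.

```objc
#import <Foundation/Foundation.h>

// Any component interested in actigraph output adopts this protocol.
@protocol ActigraphDelegate <NSObject>
- (void)actigraphDidMeasureActivity:(double)activityLevel atTime:(NSDate *)time;
@end

@interface Actigraph : NSObject
@property (nonatomic, weak) id<ActigraphDelegate> delegate;
- (void)reportActivity:(double)activityLevel;
@end

@implementation Actigraph
// Called by the motion-processing code whenever a new sample has been scored.
- (void)reportActivity:(double)activityLevel {
    [self.delegate actigraphDidMeasureActivity:activityLevel atTime:[NSDate date]];
}
@end
```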
Donate functionality
Some people have asked for this, so I took this opportunity to implement in-app purchasing for the first time.
  • Two donate options implemented with in-app purchasing
  • Check for network availability
  • Uses app-specific product identifiers for the App Store
  • Can retrieve previous purchases
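A minimal StoreKit sketch of the donate flow. The product identifiers and the donateProducts property are placeholders, and the class is assumed to conform to SKProductsRequestDelegate.

```objc
#import <StoreKit/StoreKit.h>

// Ask the App Store for the donate products.
- (void)requestDonateProducts {
    NSSet *identifiers = [NSSet setWithObjects:@"com.example.donate.small",
                                               @"com.example.donate.large", nil];
    SKProductsRequest *request = [[SKProductsRequest alloc] initWithProductIdentifiers:identifiers];
    request.delegate = self;
    [request start];
}

// SKProductsRequestDelegate: store what the App Store returned.
- (void)productsRequest:(SKProductsRequest *)request didReceiveResponse:(SKProductsResponse *)response {
    self.donateProducts = response.products; // hypothetical property
}

// Queue a payment for the option the user tapped.
- (void)purchaseProduct:(SKProduct *)product {
    [[SKPaymentQueue defaultQueue] addPayment:[SKPayment paymentWithProduct:product]];
}

// "Can retrieve previous purchases" boils down to one call.
- (void)restorePreviousPurchases {
    [[SKPaymentQueue defaultQueue] restoreCompletedTransactions];
}
```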
Reminders
The app can periodically prompt the user to interact with it.
  • Use of local notifications to schedule reminders from awakening until bedtime
  • An algorithm schedules reminders to cover the user’s entire waking day with just a few notifications (sketched below)
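A sketch of that scheduling: spread a handful of local notifications evenly between awakening and bedtime. The even-spacing policy and the alert text are illustrative.

```objc
#import <UIKit/UIKit.h>

// Schedule `count` evenly spaced reminders between awakening and bedtime.
- (void)scheduleRemindersFrom:(NSDate *)wakeTime to:(NSDate *)bedTime count:(NSUInteger)count {
    [[UIApplication sharedApplication] cancelAllLocalNotifications]; // start from a clean slate
    NSTimeInterval wakingDay = [bedTime timeIntervalSinceDate:wakeTime];
    NSTimeInterval spacing = wakingDay / (count + 1); // keeps reminders off the day's edges
    for (NSUInteger i = 1; i <= count; i++) {
        UILocalNotification *reminder = [[UILocalNotification alloc] init];
        reminder.fireDate = [wakeTime dateByAddingTimeInterval:spacing * i];
        reminder.alertBody = @"How are you feeling? Tap to log your mood.";
        reminder.soundName = UILocalNotificationDefaultSoundName;
        [[UIApplication sharedApplication] scheduleLocalNotification:reminder];
    }
}
```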
In-app Settings
  • A number of options and features are implemented
  • Passcode Lock
  • Different Pick-A-Mood characters
  • Reminder Management
What I’ve learned from building the Better Mood Tracker
  • The biggest lesson was to limit the scope of my projects. Time and time again, I saw features that I built in a day require multiple days of polishing, debugging, and documentation before they were “customer ready”.
  • I used minimalistic graphics (some gradients and border colors), but the app could benefit from a dedicated graphics designer
  • I got pretty good with Core Data queries. Unlike traditional table views backed by an NSFetchedResultsController fetching a single entity, I use date queries to pull multiple entities out of Core Data and display them in one table view. This requires more work, but is a lot more flexible.
  • I took the opportunity to look into new things: the network weather fetch and in-app purchasing are things I had never done before.
Unfortunately, Better Mood Tracker was published after the iOS 6 revamp of the App Store’s look and feel. While the app did appear on the first page of search results in the iOS 5.1 App Store, it is virtually invisible in the new store, which severely hinders its discoverability. To help with the issue, I created BetterMoodTracker.com, a mobile-optimized site, but it may take time for it to rise in visibility.

Blue Light Therapy

Blue Light Therapy is an iOS app built for iPhone and iPad. The app is an experimental attempt at influencing circadian rhythms and the human biological clock with light. This is my simplest app, and also the one with the highest user ratings. Surprisingly, it also has the most downloads per day, and it is the only one of my apps that appears as the #1 search result for multiple related keywords.

  • Simple design – only 3 screens, 3 options
  • Uses OpenGLES to create ripple effect with gesture recognizers
  • Plays a single music track on a loop
  • Wallpaper mode – iOS idle timer disabled
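The wallpaper mode and the music loop come down to two small pieces of API. A minimal sketch; the track name and the player property are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Wallpaper mode: keep the screen on while the therapy view is visible.
- (void)enterWallpaperMode {
    [UIApplication sharedApplication].idleTimerDisabled = YES;
}

// Loop a single bundled music track indefinitely.
// Assumes an AVAudioPlayer property named player.
- (void)startMusicLoop {
    NSURL *trackURL = [[NSBundle mainBundle] URLForResource:@"therapy_track" withExtension:@"mp3"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:trackURL error:NULL];
    self.player.numberOfLoops = -1; // -1 loops forever
    [self.player play];
}
```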
What I’ve learned from this app:
I enjoyed building this app: the original was built in a day, polished over the next 3 days, and published in the App Store. I intentionally kept the number of options to a minimum; no extra colors. Even with so few options, there were hidden challenges. For example, the iPhone idle timer would shut off the screen after 1 minute if left enabled. Generating a texture from an image was painful: hidden image orientations created a need for rotations and translations, and I was frequently staring at a blank screen. It took a very long time to debug.
I’m not an OpenGL ES expert, but I knew enough to change the shaders and replace the background texture. It is still not perfect: the spiral is supposed to be circular, but generating a texture from an image somehow stretches it.
While this app is tiny and simple, the marketing around it is not. I crafted a 4,000-character, keyword-rich sales copy that not only points out the benefits of the app to the user, but also helps with SEO. As a result, the app ranks very high in both the App Store and Google. Additionally, I promoted the app on Reddit.com.

Icon Maker++

Icon Maker++ is an app that uses an iPhone to build icons for Xcode projects. Each app requires multiple iPhone and iPad icons, and with the help of Icon Maker++, I can quickly lay out, preview, and create an icon. Unlike all other apps on the market, this one supports layers, which are manipulated with gestures.

  • Uses Core Graphics and affine transforms for rotation and reflection (see the rendering sketch after this list)
  • Scroll views for scaling and translation of images
  • Dropbox-to-Core Data sync
  • Custom table view and GMGridView for layer and frame management
  • Imitation of the iOS 5.1 App Store look and feel to preview the icon, name, and company name
  • Core Data for persistent icon storage
  • Usability: an icon can be made in minutes, right on the device
  • I worked with an acquaintance graphics designer for this project
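A sketch of the final render step: composite the layers once, then draw the result at each required pixel size. The sizes are the iOS 5-era icon dimensions, and the composition input stands in for the app’s layer-flattening code.

```objc
#import <UIKit/UIKit.h>

// Render the flattened icon composition at each required pixel size.
- (NSArray *)iconImagesFromComposition:(UIImage *)composition {
    NSArray *pixelSizes = @[@57, @114, @72, @144]; // iPhone, iPhone retina, iPad, iPad retina
    NSMutableArray *icons = [NSMutableArray array];
    for (NSNumber *side in pixelSizes) {
        CGSize size = CGSizeMake(side.floatValue, side.floatValue);
        UIGraphicsBeginImageContextWithOptions(size, YES, 1.0); // scale 1.0 = exact pixels
        [composition drawInRect:CGRectMake(0, 0, size.width, size.height)];
        [icons addObject:UIGraphicsGetImageFromCurrentImageContext()];
        UIGraphicsEndImageContext();
    }
    return icons;
}
```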
What I’ve learned from the IconMaker++
I spent a very long time, over 40 programming sessions, just building this app, and I learned a lot about iOS in general. First, the iOS graphics capabilities are amazing. In some sense, I created a mini version of Photoshop, and as I played with filters, transforms, bezier curves, and so on, I realized that this could probably be scaled much further.
The scope of this project kept creeping up as I added more features. For example, I serialized the icon file, with all its metadata and output images, and sent it to Dropbox using the Dropbox API. Another copy of the app, connected to the same Dropbox account, can download, inflate, and recreate the Core Data icon objects, so two devices end up with the same icon files. Doing this was incredibly difficult, and I spent a very long time debugging the component.
Unfortunately, the project failed commercially, partly due to the lack of a full-time graphics artist and a sustained marketing effort. The programmer market proved difficult to reach.

Augmented Reality: Glamour

Augmented Reality: Glamour is an iOS app that blends two facial images together. Another name for the concept is “overlay camera”: the user selects an image from the web and positions a live camera feed over it. With a bit of transparency, the user’s brain merges the two images into a single one. Wrinkles, facial contours, and emotions of a (celebrity) face are seen and realized on one’s own reflection.

I developed this app with a female audience in mind, hence the “Glamour” in the name. Not many apps featured the keyword “glamour”, so I wanted to include the word to capture a share of the search results for it.

I really like the concept and enjoyed playing with it for many hours. This is my 2nd most downloaded app.

  • Uses Core Video and AVFoundation to configure a live camera feed inside a UIView, not a preview layer (see the sketch after this list)
  • Uses gestures to pan, scale and rotate the live camera feed
  • Uses a UIWebView to fetch an image from the web
  • Screenshot functionality (WYSIWYG)
  • iPhone and iPad app
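A sketch of that camera pipeline: an AVCaptureSession with a video data output delivers frames to a delegate, so they can be drawn into an ordinary UIView that gestures can pan, scale, and rotate. The session property and the queue name are illustrative; the class is assumed to conform to AVCaptureVideoDataOutputSampleBufferDelegate.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Configure a capture session that delivers BGRA frames to our delegate.
- (void)startCameraFeed {
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("camera.frames", NULL)];
    [self.session addOutput:output];
    [self.session startRunning];
}

// Each frame arrives here; convert it for drawing into the on-screen UIView.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Rendering the CIImage into the view's layer is omitted for brevity.
}
```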
What I’ve learned from building Augmented Reality: Glamour app:
Augmented Reality: Glamour has been a painful lesson in user experience. Without good instructions, users were confused about how to switch between the camera and web views, which led to some negative user reviews. After a remake of the iPhone UI to resemble the native camera app, the situation improved.
I learned that a great idea can be rapidly taken to market with an iPhone. The app prototype was built and published in just under a week and was featured in “New and Noteworthy”, resulting in over 3,000 downloads on the first day.

5 in 1 Journal: Food, Sleep, Exercise, Work and Mood Tracker with event history

5 in 1 Journal was my first attempt at an iPhone journal-like app that records events over the course of the day. It is fairly simple, and it is currently one of the apps I sell on the iOS App Store.

With such a long name, 5 in 1 Journal was my first attempt at figuring out how the App Store ranks apps by keyword. “Journal” is an excellent keyword, and many journal apps are just glorified notepads. I improved on the concept by making entry creation a one-tap operation. This was my first app to use iOS 5 storyboards and Core Data. A solid app, it repackages my earlier work in a different commercial form. The app has positive user reviews, but has not had a sustained marketing effort. Over the past year, it has slowly grown in popularity.


Singularity Experience: The Lucid Dreaming App

Singularity Experience is the reason I’m an iPhone developer today: I learned iOS to be able to build this app. What is the app? It was my first commercial attempt at building a lucid dream induction device using an iPhone. I put an unbelievable amount of work into it, with 2 months of long, daily development sessions.

Biological clock for the Singularity Experience

The app was trying to accomplish a very ambitious task: to be the tool that helps the user experience dreams consciously. Dreams do not work the way the real world does (text, for instance, is prone to changing arbitrarily), so I spent a lot of time designing a custom user interface that would be unique to the app.

With this app, I was able to predict when the user is dreaming, i.e. in Rapid Eye Movement (REM) sleep; something that, to my knowledge, had never been accomplished with actigraphy before.

  • Motion processing algorithms using Core Motion
  • Custom interface layout with XIB files and programmatic layout
  • Use of audio library to select and play music tracks
  • Persistent data storage with Core Data
  • Custom sleep graph using UIViews
  • UIView animations with blocks (see the sketch after this list)
  • I worked with a volunteer tester for this project
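A minimal sketch of the block-based UIView animation style used throughout the interface; the view name and the specific motion are illustrative.

```objc
#import <UIKit/UIKit.h>

// Fade in the sleep graph while sliding it up from the bottom of the screen.
- (void)presentSleepGraph:(UIView *)graphView {
    graphView.alpha = 0.0f;
    graphView.transform = CGAffineTransformMakeTranslation(0, 40);
    [UIView animateWithDuration:0.4
                     animations:^{
                         graphView.alpha = 1.0f;
                         graphView.transform = CGAffineTransformIdentity;
                     }
                     completion:^(BOOL finished) {
                         // A follow-up animation could be chained here.
                     }];
}
```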
What I’ve learned from building Singularity Experience:
iOS itself. With expert advice from stackoverflow.com and various open source examples, I was able to port my existing Android app to iOS. It was a very exciting time, and I still use many of the components that I conceived while building that app.
One of the things I quickly became aware of is the incredible amount of complexity involved in such large projects. At the end of 2 months, I had over 60 class files and many more interfaces and prototypes. While building those was fast, polishing and debugging them took so long that I was forced to cut features and abandon prototypes. I used Xcode’s simple built-in source control and sometimes lost code. Still, shortly after release, the app had all of the planned functionality and was fairly bug-free.
“If you build it, they will come” turned out not to apply: even with a significant amount of marketing effort, I was unable to make the app commercially successful. I tried various price points and promotions, but the app never reached its sales target.


Bio2RealTime

Bio2RealTime was my first iOS app published in the App Store. This small app was intended to help people find appropriate bedtimes and rise times based on how long they wanted to sleep. The app uses GPS and the day of the year to calculate sunrise and sunset, then offers bedtime and rise time suggestions to the user.

The main reason I built this app was to gain experience with the process of publishing apps to the App Store. The entire app was built from existing components and published in a single day. While the app got mentioned as “New and Noteworthy” by Apple staff, the original submission was rejected due to its GPS use; I had to explain that GPS is used for astronomical calculations and nothing else.


Other iOS apps I have worked on during my employment

I spent some time working for DataViz, a small mobile-centric software company, and at a mobile center of excellence at Pfizer. I had the opportunity to work on commercially successful iPhone apps.

HemMobile at Pfizer – I worked with a small development team and a graphics artist to implement the vision of a mobile tracker app for hemophilia patients. I did a large amount of work with custom interface layout, Core Data queries, networking, and the MapKit framework. Additionally, I had an opportunity to look into Dropbox and iCloud syncing of Core Data objects. The experience of working with pre-planned wireframes and a graphics designer was new to me.

Passwords+ at DataViz

I did a small amount of work on Passwords+, a password app for the iPhone. This included laying out a couple of view controllers and doing some behind-the-scenes bug fixing.

DocsToGo at DataViz

I was working primarily on SpreadsheetToGo, the mobile version of Excel, with some work on the Word and PowerPoint sides as well. I understand how those apps work and how they are built. Most of my work involved debugging hardcore Objective-C++ code, and I resolved around 20 bugs in the short time I was at DataViz. This taught me to step lightly around Objective-C++ and influenced all of my subsequent development to rely on open source Objective-C code. I became pretty good at using the Xcode debugger, learned how to break on exceptions, and learned how to work with build settings. I was also briefly exposed to networking and sync (SugarSync).


Personal prototype projects

I have a small number of interesting projects that are not consumer-ready yet.

Neurosky MindWave Mobile ElectroEncephaloGram (EEG) display

I’ve long wanted to look at my own brainwaves with an EEG, but until now the cost has been prohibitive. With the NeuroSky MindWave Mobile headband, I was finally able to make that dream a reality. The product comes with a series of pre-built apps, which I found rather lacking or random. I wanted to look at the raw data, so I adapted one of my earlier components, the Actigraph, to analyze and display raw EEG data.

Early prototype takes a raw EEG waveform and “scores” it, as well as displaying the changes in score over 30 seconds

  • Uses custom real-time graph widget, updating at 200 times per second
  • Analyzes EEG data, creating a single metric of brain activity (see the scoring sketch after this list)
  • Plays audio in response to the change in brain activity metric (the user no longer has to stare at the graph to observe changes, one can listen for them while doing something else)
  • Uses Core Data to store EEG band information in scrollable, persistent bar graphs
  • The project influenced two other hackers to purchase their own MindWave Mobile devices
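A sketch of the “scoring” idea, assuming raw samples arrive one at a time from the headset SDK: keep a short rolling window and report its mean absolute amplitude. The 512-sample window matches the headset’s nominal raw rate, but the metric itself is illustrative, not the prototype’s exact algorithm.

```objc
#import <Foundation/Foundation.h>

// Keep the most recent raw EEG samples and score the window by mean |amplitude|.
@interface EEGScorer : NSObject
@property (nonatomic, strong) NSMutableArray *window;
- (double)scoreWithNewSample:(double)sample;
@end

@implementation EEGScorer
static const NSUInteger kWindowSize = 512; // about 1 second of raw output

- (double)scoreWithNewSample:(double)sample {
    if (!self.window) self.window = [NSMutableArray array];
    [self.window addObject:@(sample)];
    if (self.window.count > kWindowSize) [self.window removeObjectAtIndex:0];

    double sum = 0;
    for (NSNumber *value in self.window) sum += fabs(value.doubleValue);
    return sum / self.window.count;
}
@end
```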

Tracked Platform using iPhone and WiRC module

I wanted an iPhone to be able to move around and interact with the environment by itself. Normally, an iPhone has all these wonderful sensors, cameras, and displays, but it cannot move. So I found out how an iPhone app can control a chassis and servos, and I assembled a prototype that I can drive remotely: I get a camera image from the platform over Wi-Fi and control the platform using the phone’s orientation. I can rotate the camera and look around remotely. Steering and acceleration are controlled using Core Motion. It works. The next step is to mount the iPhone on the platform and have it choose which way to go by itself.

  • Uses an RC tracked platform with two independent tracks
  • Uses a Wi-Fi module (WiRC) to send control signals to servos and electronic speed controllers
  • Has a pan-tilt camera turret with a live camera feed
  • Move the phone to steer, using Core Motion (see the sketch after this list)
  • Tap on the camera image to make the camera rotate and look at that point
  • GPS location updates integrated with MapKit to show the platform’s path
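A sketch of the steering input: Core Motion device-motion updates map roll to steering and pitch to throttle. The sendSteering:throttle: call stands in for the WiRC control packet, whose wire format is not reproduced here; the motionManager property and the tilt range are also assumptions.

```objc
#import <CoreMotion/CoreMotion.h>

// Map the phone's tilt to steering (roll) and throttle (pitch).
- (void)startDrivingControls {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 20.0; // 20 Hz control loop
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (!motion) return;
        // Clamp tilt angles to +/- 45 degrees and normalize to -1..1.
        double steering = MAX(-1.0, MIN(1.0, motion.attitude.roll  / (M_PI / 4)));
        double throttle = MAX(-1.0, MIN(1.0, motion.attitude.pitch / (M_PI / 4)));
        [self sendSteering:steering throttle:throttle]; // hypothetical WiRC packet sender
    }];
}
```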
What I’ve learned from this platform:
Great ideas are often hard to implement. There are currently 2 other consumer-grade toys with similar functionality (a remote-controlled Wi-Fi platform with a camera), which suggests there is demand for such an item. In my experiments, I found that truly autonomous operation requires a lot more work; for example, integrating proximity sensors for various distances. During sharp turns (e.g., when stuck), the platform may throw one of its tracks and lose its mobility completely.

iPhone based spectrograph

I assembled a simple device that breaks light up into its monochromatic components to test my household lights. The device works, and it demonstrated that light filtering indeed eliminates the wavelengths of light I was interested in. It was a very simple project that nonetheless fulfilled its purpose: I analyzed all of my household lights and figured out that none of them are good to use at night:

Household lights spectrum check. Right side of each image is filtered using BluBlocker glasses
