HiCon is now offered in Pro and Lite Versions

Weburban is now offering HiCon Lite as a free download to all iPhone users. The Lite version lets a user take a picture immediately after launching the application; the image is then processed and saved to the user’s Camera Roll. There is no limit to the number of images a user can take, and there is no difference in quality or resolution between the Pro and Lite versions.

HiCon Pro offers a great deal more functionality. The introduction of a super-quality color mode allows HiCon users to make high-intensity color images, and pictures can be taken from either the Camera Roll or directly from the user’s camera. Users can save a processed image or discard it. Applying the HiCon effect repeatedly to a single image also intensifies the result, producing striking images. HiCon Pro is offered at an introductory price of $0.99 (US), making it a formidable competitor in the iPhone apps market.

HiCon Lite on the iTunes App Store

HiCon Pro on the iTunes App Store

Search for photos made by HiCon users on Flickr

HiCon Addict Flickr Group

Leaf next to Subway

Light Leak iPhone application


Adding to Weburban’s current lineup of applications is a photographic simulation of the plastic cameras people loved to hate in the 1960s. Light Leak 1.0 for the iPhone produces both color and black & white simulated film images with edges that look like light has leaked onto the film, producing jagged borders.

In reality, the pictures are put through an algorithm that produces randomly generated, pixel-perfect bleeds, punching up contrast by multiplying pixel intensity, especially along the borders of the image. Pictures with a large amount of contrast come out looking more jagged and bled out, while images with little or no contrast look faded and torn.
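
The shipping code is Weburban’s own and is not reproduced here, but the core idea can be sketched in a few lines of Python using the Pillow imaging library. Everything below (file names, band width, and tuning constants) is a hypothetical illustration, not the actual Light Leak implementation:

import random
from PIL import Image

def light_leak(path, strength=1.6, falloff=0.25):
    # brighten pixels by a randomly perturbed factor that grows toward the borders
    img = Image.open(path).convert("RGB")
    w, h = img.size
    px = img.load()
    band = int(min(w, h) * falloff)  # width of the border band affected by the bleed
    for y in range(h):
        for x in range(w):
            d = min(x, y, w - 1 - x, h - 1 - y)  # distance to the nearest border
            if d >= band:
                continue
            # factor ramps from `strength` at the border down to 1.0, with a little
            # random noise so the bleed looks organic rather than mechanical
            f = (1.0 + (strength - 1.0) * (1.0 - d / band)) * random.uniform(0.9, 1.1)
            r, g, b = px[x, y]
            px[x, y] = (min(int(r * f), 255), min(int(g * f), 255), min(int(b * f), 255))
    return img

light_leak("snapshot.jpg").save("snapshot_leak.jpg")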

A great deal of research went into the production of Light Leak, including techniques involving film emulsion transfer, film development using plastic cameras, pin-hole film development, and infrared film development. Careful study of the effects of light and its gradual attenuation was applied to prevent the images from looking fake, over-processed, or digital. This was particularly challenging since the iPhone’s camera does not allow for exposure manipulation during a snapshot, so all of the processing must account for over- or under-exposed images.

The latest update to the iPhone lineup introduces multi-threading to improve stability and overall processing speed. Since Light Leak is the most processing-intensive application to date, multi-threading was critical to keeping it fast and stable, and it has now been added to every application in Weburban’s iPhone lineup.

Multi-touch Flash Application using touchlib

The Natural History Science Museum of Los Angeles wanted to create a large kiosk that many guests could operate at the same time. A conventional mouse-based kiosk was out of the question because a mouse has a single point of control. Ideally, the designers at the museum wanted a large screen whose objects could be manipulated by multiple control points (i.e. multiple “mouse” pointers), allowing for a shared experience. Cost was also a factor, so 8-foot touch screens were not an option.

An open source movement exists around software libraries designed to manage multiple touch events on an ordinary computer. It works by registering the x and y coordinates of blobs seen via a webcam on a translucent glass screen. These blobs are then sent to the event model as custom touch events and handled by the application. Touchlib is the software we used for our project, and the experience was overwhelmingly positive.

The software is released under the open source New BSD license, which allows you to use and modify it freely. It can be found here:
http://code.google.com/p/touchlib/

One of the main quality attributes of the application was that it had to be easily extensible. Flash ActionScript 3.0 was the language of choice since staff at the museum already knew it, and it was the most platform-agnostic option. This became critical during deployment, when the target machines changed from PCs to Mac Mini computers.

Flash integration was easier than I had anticipated. Once the touchlib library captures the x and y coordinates of blobs registered from the webcam, it sends them via UDP to a port registered by a lightweight Java server running in the background. Flash can listen on that port for UDP events and handle them as needed. All of this is done in about two lines of Flash code:

First, import the touchlib Flash library — import app.core.action.RotatableScalable;
Then, start listening to events over UDP — TUIO.init( this, 'localhost', 3000, '', true );

There is a substantial number of Flash commands and events already written out and stashed in the utility library, ready for use. We overloaded some of the behaviors to prevent runaway scaling of images and added custom behavior to prevent text from being scaled too small. Most of the additional functionality had to do with interface animation and user-controlled interface design. Since the Flash app is constantly listening for UDP events, it is wise not to overload it with other events that are not crucial — be sure to clean up Timer or EnterFrame events that are not needed.

Finding a web client’s physical location

The iPhone has the ability to triangulate its physical location using a combination of GPS and cell tower IDs. Some people have been asking me whether it is possible to find the location of a user’s computer when they view a web page. Since every computer uses a unique IP number, and those numbers are registered in blocks to internet service providers, it is theoretically possible to reverse-look-up an IP number and get an approximation of where the user is. You cannot pinpoint a person as accurately as GPS can; you get a general approximation instead. This could have some interesting uses, especially since the cost of this technology is relatively small.

Physical proximity and distance can be calculated using the haversine formula, which gives the distance between two points on a sphere. The earth is not flat, of course, so a typical straight-line calculation is useless here. Thankfully, the formula is quite simple:

First we get the radius of the earth in either miles or kilometers:
R = 6371 km (radius of the earth)

Next, we calculate the delta or change between the latitude and longitude points:
Δlat = lat2 − lat1
Δlong = long2 − long1

Finally, we plug those values (with latitudes and longitudes converted to radians) into the formula:
a = sin²(Δlat/2) + cos(lat1) * cos(lat2) * sin²(Δlong/2)
c = 2 * atan2(√a, √(1−a))
d = R * c
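
To make the arithmetic concrete, here is a minimal sketch of the same calculation in Python. The only wrinkle is that the trigonometric functions expect radians, so the degree values are converted first; the sample coordinates are just an example:

from math import radians, sin, cos, atan2, sqrt

R = 6371.0  # radius of the earth in kilometers

def haversine_km(lat1, long1, lat2, long2):
    # great-circle distance between two latitude/longitude points given in degrees
    dlat = radians(lat2 - lat1)
    dlong = radians(long2 - long1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlong / 2) ** 2
    c = 2 * atan2(sqrt(a), sqrt(1 - a))
    return R * c

print(haversine_km(37.77, -122.42, 34.05, -118.24))  # San Francisco to Los Angeles, roughly 560 km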

For the purpose of this experiment, we also need a data table that maps IP numbers to latitude/longitude points around the world. MaxMind has generously provided a free data set of city points for this very purpose. After loading the data points and creating a script that pulls the browser’s IP number, the page gets back a list of location data. The detail and quality of the result depend on the data set’s coverage; so far in testing, though, the results are pretty good for the minimal effort put into the coding.
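
The lookup script itself is not shown here. As one possible sketch, MaxMind’s free city database can be queried from Python with their geoip2 library; the database file name and the hard-coded IP below are placeholders, and in a real deployment the IP would come from the web server’s request:

import geoip2.database

reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # path to the MaxMind city database

def locate(ip):
    # returns the approximate city and latitude/longitude registered for an IP block
    rec = reader.city(ip)
    return rec.city.name, rec.location.latitude, rec.location.longitude

city, lat, long_ = locate("203.0.113.42")  # placeholder IP; use the client IP in practice
print(city, lat, long_)
# feeding lat/long pairs into haversine_km() above gives distances to any reference point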

Uses for something like this could include:
– Add immediate locational feedback on web site analytics without log processing
– Provide web site browsers with geographically specific services
– Provide geographically tailored content on front pages
– Restrict content based on geographic location

Weburban LLC is an iPhone and Android development studio based in the San Francisco Bay Area, California.

EdgePix adds arty borders to iPhone

Over the span of a week, Weburban staff experimented with real and computer-generated inks to come up with a compelling range of borders for images in the new application EdgePix, available exclusively on the iPhone App Store. Singular streaks, photographic edges, painted brush strokes, and all kinds of smears were used to mask the images.

We wanted people to be able to quickly add clean or gritty edges to their newly captured pictures. Even a simple border will help a Facebook image look more interesting. We wanted to keep the range of edges useful enough to have a broad range of applications without watering them down so much that they become boring.

The next iteration of EdgePix is being worked on right now at the Weburban offices. It will use the Weburban Pix 2.0 framework and will allow users to pick the image first and then apply the edge progressively. Also, a new pack of edges will be included free of charge to all current users of EdgePix 1.0.

ButtonPix creates Aqua style images easily

Apple has made the shiny glass button into an icon for the sleek user interface of tomorrow. Now, with ButtonPix, you can turn any image in your Camera Roll into a shiny glass button with one easy selection.

The button processing algorithm intelligently detects whether the image is horizontal or vertical, and then applies a series of filters and effects to make the image look shiny, rounded, and glass-like. Processing does not alter the original image, so there is no risk of losing pictures during experimentation.
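
ButtonPix’s actual filter chain is not published. As a hypothetical illustration of the two steps described above, the Python/Pillow sketch below checks the image’s orientation, rounds the corners, and composites a soft highlight over a copy, leaving the original file untouched; the radius and opacity values are made up:

from PIL import Image, ImageDraw

def glass_button(path):
    img = Image.open(path).convert("RGBA")
    w, h = img.size
    orientation = "horizontal" if w >= h else "vertical"  # step 1: detect orientation
    radius = min(w, h) // 6

    # step 2a: rounded-corner mask
    mask = Image.new("L", (w, h), 0)
    ImageDraw.Draw(mask).rounded_rectangle((0, 0, w - 1, h - 1), radius=radius, fill=255)

    # step 2b: glossy highlight, a translucent white ellipse over the top half
    gloss = Image.new("RGBA", (w, h), (0, 0, 0, 0))
    ImageDraw.Draw(gloss).ellipse((-w // 4, -h // 2, w + w // 4, h // 2), fill=(255, 255, 255, 70))

    out = Image.alpha_composite(img, gloss)
    out.putalpha(mask)
    return out, orientation

button, _ = glass_button("photo.jpg")  # the source image on disk is never modified
button.save("photo_button.png")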