1980k 1980k

Beyond Emotion: Designing for Visceral Allure

Article: Source

When we think something is cute, why do we feel that way? What is “cute”? What is it inside of all of us that is inexorably drawn to puppies, stuffed animals, and garden gnomes? We all know that we think things are cute, but few of us can explain why, or for what purpose.

Before I get much further, I’d love to reference and give credit to Aarron Walter’s excellent post on Emotional Interface Design. To summarize, Aarron discusses how our digital lives have become more public, and that “as users let their humanity show online, frontiers of communication are opening for web designers.” He also speaks to the deeper needs of users, advising us to not simply be satisfied with “usable,” but to provide an experience that touches our users on an emotional level. While I agree wholeheartedly with Aarron, I think it goes even deeper than emotion to a place that we as humans cannot control. It’s in our DNA, and understanding why that connection affects and influences us is the key to successfully and meaningfully reaching our users.

I recently stumbled across the answer to my initial question: Disney. Yes, that Disney. More specifically, something Disney animator Preston Blair discovered long before I was born: we can’t deny cute.

You may have seen this before, but I think many underestimate just how profound this illustration is:

Illustration of the elements of drawing cute characters

We are magnetized by anything that meets these characteristics—we simply cannot help ourselves. It’s in our very makeup as humans.

This mysterious force is why so many formulas and principles such as the Golden Ratio, the Fibonacci numbers, the rule of thirds, and grid systems attract us to well-designed products. Throughout history, we’ve even seen ancient cultures use these principles, without modern tools, to carve history itself into stone. These principles have been described as the “mathematical formulas for beauty.” They’re proven, meta-emotional blueprints for visual captivation.

It’s also not just about cuteness. Advertisers have worn the sole thin on the adjective “sexy” when it comes to describing products. Why have so many car commercials aired featuring this word? Why are products described as “gorgeous,” “beautiful,” and “hot”? The most effective tactic marketing and sales professionals wield is to play on human emotions. This includes (non-sexual) physical attraction, even to products. People often become so infatuated with and connected to these items that they begin to refer to them as if they were living things. They genuinely love these products.

Aesthetic pleasure is in fact an emotional need that is met by appealing to what Don Norman calls the “visceral” layer of emotional design. I’d add that beauty is a function. Fun is a function. Something that is attractive or enjoyable to us makes us want to find any excuse we can to use it or be around it. This is why we want to use beautiful products, and why we love “cute” things. We’re actually fulfilling our own emotional needs. As experience designers, this concept is the foundation of understanding our own behavior and, more importantly, the needs of the people who use our products.

We tend to label the people we’re designing for as “users,” but I fear the term is distancing and a little too clinical. It removes us from them, creating a subconscious, indirect “us vs. them” attitude. The truth is that we’re all human, and we need to admit that we’re all at least a little bit guilty of judging on appearance. While this can have varying consequences in social situations, designers can take advantage of this to make compelling first impressions with their products.

This aesthetic layer is the first point of emotional contact with our users, like a first date. We don’t dress for function when we’re meeting someone for the first time. Otherwise, we’d have napkins built into our pants and cushions sewn on for optimal movie-watching comfort. We want to look as attractive as possible.

To be truthful, I typically won’t use a poorly designed product unless it works for a very specific task (think FileMerge). It’s purely utilitarian and based on whether or not it can get the job done. I will close an ugly app as soon as I’m done with it, but I’ll leave an attractive one such as TextMate open for days, even when I’m not using it.

I love TextMate because of its simple but powerful UI and its remarkable level of readability. I want to find any excuse to open it. It’s beautiful, but I don’t love how it functions in the contexts in which I want to use it. That doesn’t stop me from launching it when I want to view a readme file. It’s unassuming, yet elegant. I’d rather deal with its shortcomings and be able to experience its design.

Good design makes people happy, which is an emotion. Bad design frustrates people, which is also an emotion. Good visual design can make us overlook flaws in the way attractive things behave, much as we do with people. Aarron Walter mentions this in his article:

Babies create bonds with their parents through an interesting feedback loop. When they cry their parents respond by soothing them, which releases calming neurotransmitters in their brains. As this cycle repeats, the baby begins to trust that their parents will respond when they need them.
A similar feedback loop happens in interface design. Positive emotional stimuli can build a sense of trust and engagement with your users. People will forgive your site or application’s shortcomings, follow your lead, and sing your praises if you reward them with positive emotion.

The lesson here is not that we should ignore usability in favor of cosmetic benefits when we design experiences. Rather, it’s that beauty promotes usability, but not necessarily by making your product more useful. It promotes usability by influencing users to want to interact with it.

Don Norman also reminds us that the design of products serves as the vehicle through which we emotionally connect and communicate with those in our lives:

Our attachment to those objects is entirely shaped by memory. Because past experiences are no longer recoverable except through recollection, we value objects by the emotions they provide rather than their intrinsic worth. It’s why the memories surrounding them often transcend everything else about them.
But creating a product with emotional resonance does not require Jonathan Ive and his band of merry pranksters, or a team of German automotive engineers. It is not about technology or elaborate styling. Our love of objects is not even about the objects themselves. It is always about us. We grow to love the objects that connect us to other people, create meaning, and remind us that we’re alive.

That last sentence is what drives this concept home. On a genetic level, we strive to enjoy as much beauty and cuteness as we can in our lifetimes because they “remind us that we’re alive.” To me, being human is about making connections with people, learning, and loving. These are all things that make us alive. When we create emotional touchpoints in our products, we are providing people with moments in which they can feel alive, connected, and reminded of the beauty in their own lives and those closest to them. They may not even realize why they feel the way they do about your product, but they will remember it.

Comprehending these principles of functional aesthetics, and understanding why the “faces” of the things we make need to appeal to, and even go beyond, human emotions, is how truly great products are born. Beauty and cosmetic interest serve the purposes of visual enticement and emotional draw.


5 Tips For Designing iPhone Apps In Photoshop

Article: Source

Designing a user interface for iPhone is totally different from designing for the web: small screen, fixed width, users on the move, and so on.

I’ve been designing a couple of iPhone apps recently, and while doing so came across some useful techniques that I thought I would share for any designers who have yet to move from web to iOS design. Good luck designing your mobile apps for iPhone!

1. Set Up Photoshop

If you’re designing for iPhone 4, you’re designing for a high-res Retina screen: 2-pixel lines appear as 1 point when shown on screen.

Set up your Photoshop grid so the smallest subdivisions are 2 pixels.

Photoshop > Preferences > Guides, Grid & Slices

I set the gridline to every 20 pixels, and subdivisions to every 2 pixels.

grid.png

Now ensure snapping is set to Snap to All, so any objects you move will snap to the 2px grid.

snap.gif

Also ensure your shape tools are set to Snap to Pixels. This should keep everything nice and sharp.

snapshapes.gif

Colours look different on different screens, so you may also want to adjust your colour settings.

There’s a good article by bjango on how to set up colour preferences for iPhone in Photoshop.

2. Have A PSD Template

If you’re a web designer you more than likely have a PSD template for designing a web site in Photoshop or a framework for marking it up in HTML/CSS.

Do the same for designing an iPhone app. Have the ‘static’ elements in place (e.g. status bar), a grid to work to and typical guide lines you might want to use (e.g. a guide line for the tab bar).

You may even want to have several elements to hand like buttons, text etc.

psd.gif

3. Download The iPhone 4 GUI PSD (Retina Display)

Save yourself some time and effort by using Teehan+Lax’s iPhone 4 GUI PSD retina display elements.

Even if you’re designing your own style of buttons, tab bars etc. these will come in very handy.

gui.jpg

4. Use LiveView To Preview Your User Interface

It’s impossible to get a good idea of how an iPhone interface will actually look on an iPhone when you’re designing on a monitor.

Thanks to Nicholas Zambetti, there’s a free app to help you live preview your UI on iPhone.

  1. Window > Arrange > New Window for [documentName]
  2. Move the new window to a part of the screen you’re not using (a separate monitor works best)
  3. Open LiveView on your Mac and place the ‘frame’ over that window in Photoshop.
  4. Open LiveView on your iPhone, connect to your computer, and voilà: your app appears on your iPhone, giving you a live preview of how it looks.
liveview.gif

5. Know The Apple iOS Human Interface Guidelines And Best Practices

Apple have documentation on designing interfaces for iOS that you must read if you’re designing any iOS app.

There are a lot of enforced guidelines in here along with best practices for mobile devices.

Examples:

  • 44px is the recommended minimum size for a tappable area on screen
  • Users expect iPhone apps to launch in portrait orientation
  • Branding should be unobtrusive
  • Each app submission requires several icon dimensions
  • Controls should look tappable – make use of contours and gradients
  • Modal windows/tasks interrupt so use sparingly
  • etc.

November 6th, 2012 was an important day for the United States of America: the country voted for its leader for the next four years. Because it was such an important day, I wanted to show through screenshots how all the major news companies handled the announcement last night. When confirmation came through that Barack Obama would be our leader for another four years, I opened all of the news companies’ websites and screenshotted them at the same time. The post above displays those simultaneous screenshots of the election news.


HTML5 vs. Apps: Why The Debate Matters, And Who Will Win

Article: Source

Many think HTML5 will save the web, rendering native, platform-dependent apps obsolete. 

So, which will win? Native apps or HTML5? 

A recent report from BI Intelligence explains why we think HTML5 will win out, and what an HTML5 future will look like for consumers, developers, and brands.


Here’s why the Apps-vs-HTML5 debate matters:

  • Distribution: Native apps are distributed through app stores and markets controlled by the owners of the platforms. HTML5 is distributed through the rules of the open web: the link economy.
  • Monetization: Native apps come with one-click purchase options built into mobile platforms. HTML5 apps will tend to be monetized more through advertising, because payments will be less user-friendly.
  • Platform power and network effects: Developers have to conform to Apple’s rules. Apple’s market share, meanwhile, creates network effects and lock-in. If and when developers can build excellent iPhone and iPad functionality on the web using HTML5, they can cut Apple out of the loop, which will reduce the network effects of Apple’s platform.
  • Functionality: Right now, native apps can do a lot more than HTML5 apps. HTML5 apps will get better, but not as fast as some HTML5 advocates think. 

In full, the special report analyzes:

  • What HTML5 is, giving an overview of how it is a technology designed by committee.


Read more: http://www.businessinsider.com/html5-vs-apps-why-the-debate-matters-and-who-will-win-2012-10


iOS 6 for HTML5 developers, a big step forward

Article: Source

The new major version of Apple’s iOS is with us, along with the new iPhone 5 and the fifth-generation iPod Touch. As with every big change, lots of new stuff is available for HTML5 developers and, as always, not much official information is available.

QUICK REVIEW

I’m going to divide this post into two parts: the new iPhone 5 stuff and the new iOS 6 stuff.

On iPhone 5:

  • New screen size
  • New simulator
  • What you need to do
  • Problems

New features on iOS 6:

  • File uploads and camera access with Media Capture and File API
  • Web Audio API
  • Smart App Banners for native app integration
  • CSS 3 Filters
  • CSS 3 Cross Fade
  • CSS Partial Image support
  • Full screen support
  • Animation Timing API
  • Multi-resolution image support
  • Passbook coupons and passes delivery
  • Storage APIs and web app changes
  • Web View changes for native web apps
  • Debugging with Remote Web Inspector
  • Faster JavaScript engine and other news

IPHONE 5

The new iPhone 5, along with the fifth-generation iPod Touch, has only one big change in terms of web development: screen resolution. These devices have a wide 4″ screen, WDVGA (Wide Double VGA), 640×1136 pixels at 326 DPI, the Retina display as Apple calls it. They have the same width as the iPhone 4/4S but 176 more pixels of height in portrait mode.

NEW SIMULATOR

iOS Simulator on Xcode 4 includes iPhone 5 emulation

The new Xcode 4 (available on the Mac App Store) includes the updated iOS Simulator. The new version has three options for iPhone simulation:

  • iPhone: iPhone 3GS, iPod Touch 1st-3rd generation
  • iPhone Retina 3.5″: iPhone 4, iPhone 4S, iPod Touch 4th generation
  • iPhone Retina 4″: iPhone 5, iPod Touch 5th generation

The new simulator also includes the new Maps application replacing Google Maps by default and Passbook.

WHAT YOU NEED TO DO FOR THE NEW DEVICES

Usually, if your website/app is optimized for vertical scrolling, you should not have any problems. The same viewport, icons, and techniques used for iPhone 4/4S should work properly. Remember that when updating iOS, you are also updating the Web View: that means all native web apps, such as PhoneGap/Apache Cordova apps, and pseudo-browsers such as Google Chrome for iOS are also updated. However, if your solution is height-dependent, you may have a problem. Just look at the following example of the Google Maps website on iPhone 4 and iPhone 5. Because it takes the height as a constant, the status bar is not hidden and there is a white bar at the bottom.

Be careful if you’ve designed for a specific height, as Google Maps has. As you can see (the right capture is from an iPhone 5), there is a white bar at the bottom, and the URL bar can’t be hidden because there is not enough content.

If you are using Responsive Web Design you should not have too much trouble, as RWD techniques usually key their conditionals off the width, not the height.

DEVICE DETECTION

At the time of this writing there are no iPhone 5s on the street yet. However, in every test I could run, there is no way to detect the iPhone 5 server-side. The user agent only specifies an iPhone with iOS 6, and the exact same user agent is used for an iPhone 4S with iOS 6 and an iPhone 5.

Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) 
AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A403 Safari/8536.25

Therefore, the only way to detect the presence of a 4″ iPhone device is to use JavaScript and/or media queries, client-side. If you need to know server-side, you can plant a cookie from the client side for the next load. Remember that these devices have 1136 pixels of height, but in terms of CSS pixels (device-independent pixels) we are talking about 568 pixels of height, as these devices have a pixel ratio of 2.

var isPhone4inches = (window.screen.height == 568);
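The cookie-planting idea mentioned above can be sketched as follows; the cookie name is an assumption of mine, not anything iOS defines:

```javascript
// Hypothetical sketch: detect the 4" screen client-side, then persist the
// result in a cookie so the server can read it on the next request.
// The "is4inch" cookie name is purely illustrative.
function deviceCookie(screenHeight) {
  var isTall = (screenHeight === 568); // 568 CSS pixels = 4" iPhone/iPod Touch
  return "is4inch=" + (isTall ? "1" : "0") + "; path=/; max-age=31536000";
}

// In the browser you would run:
// document.cookie = deviceCookie(window.screen.height);
```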

Using CSS Media Queries and Responsive Web Design techniques, we can detect the iPhone 5 using:

@media (device-height: 568px) and (-webkit-min-device-pixel-ratio: 2) {
  /* iPhone 5 or iPod Touch 5th generation */
}

HOME SCREEN WEBAPPS

For Home Screen webapps the problem seems more serious. I reported the problem while under NDA, without any answer from Apple yet.

Basically, when you add a website that supports the apple-mobile-web-app-capable meta tag to the Home Screen, your webapp works only in iPhone 3.5″ emulation mode (it doesn’t take the whole height), as you can see in the following example from the Financial Times webapp.

The letterbox black bars you see here are not a problem with the image; that is how a full-screen webapp is launched by default on the iPhone 5 and the new iPod Touch.

While it’s a good idea not to add more height to a webapp if the OS is not sure about its compatibility with a wider screen, as far as I could test there is no way to declare that our webapp is 4″-compatible. I’ve tried as many combinations as I could think of, and if you provide an apple-touch-startup-image of 640×1096, the iPhone 5 takes your splash screen but resizes it to 640×920, at least in the Simulator for the GM build (almost the final version).


UPDATE 9/20: Solution found. Thanks to some readers who pointed to possible solutions, I’ve found the trick. As weird as it sounds, you need to forget about a viewport with width=device-width or width=320. If you don’t provide a viewport at all, it will work properly. The same goes if you use properties other than width; if you don’t want your viewport to be the default 980px, the way to do it is:

<meta name="viewport" content="initial-scale=1.0">

Even if you use a viewport for a specific size different than 320, the letterbox will not be present.

<meta name="viewport" content="width=320.1">

Instead of changing all your viewports right now, the following script will do the trick, changing the viewport dynamically:

if (window.screen.height == 568) { // iPhone 4"
  document.querySelector("meta[name=viewport]").content = "width=320.1";
}

The startup image has nothing to do with the letterbox, as some developers were reporting. Of course, if you want to provide your own launch startup image it has to be 640×1096, and you can use media queries to serve different images on different devices. Some reports said that you need to name the launch image as in native apps, “Default-568h@2x.png”, but that’s not true. You can name it however you want. The sizes attribute is completely ignored.

You can use media queries to provide different startup images:

<link href="startup-568h.png" rel="apple-touch-startup-image" media="(device-height: 568px)">
<link href="startup.png" rel="apple-touch-startup-image" sizes="640x920" media="(device-height: 480px)">

If you want to provide an alternative version for low-resolution devices, you can use the -webkit-device-pixel-ratio conditional too. If you are wondering why 568px and not 1136px, remember that we are using CSS pixels, and on these devices the pixel ratio is 2.

The trick is the viewport. Why? I don’t really know. For me, it’s just a bug. But it’s the only solution I’ve found so far.

The other problem is with Home Screen icons that you already have before buying your new device. iTunes will install the shortcut icon again from your backup, and it’s not clear whether we will get a way to upgrade its compatibility. Even if you change the viewport, if the icon was installed before the change you will still get the letterbox.

IOS 6 AND HTML5 DEVELOPMENT

iOS 6 is available as a free update for every iOS 5 device except the first-generation iPad, so we will see this version browsing the web really soon, and the iPad market is fragmented for the first time. The following findings apply to all iOS devices taking the iOS 6 upgrade. As always (and unfortunately) Apple gives us only partial and incomplete updates on what’s new in Safari, so I (as always) take on the hard work of digging into the DOM and other tricks to find new compatibility.

FILE MANAGEMENT

Finally! Safari for iOS 6 supports the file upload input type, with partial HTML Media Capture support.
A simple file upload like the following will ask the user for a file from the Camera or the Gallery, as you can see in the figure. I really like how Safari shows you a thumbnail instead of a temporary filename after you select your image.

<label>Single file</label>
<input type="file">

We can also request multiple files using the new HTML5 multiple boolean attribute. In this case, the user can’t use the camera as a source.
<label>Multiple files</label>
<input type="file" multiple>

We can access the camera and gallery using file uploads

There is no way to force the camera, such as by using capture="camcorder". However, we can specify whether we want to capture only images or only videos, using the accept attribute.

<input type="file" accept="video/*">
<input type="file" accept="image/*">

There is no support for other kinds of files, such as audio, Pages documents, or PDFs. There is no support for getUserMedia for live camera streaming.

What can you do with the image or video once it’s selected?

  • Send it using a multipart POST form action (the old-fashioned upload mechanism)
  • Use XMLHttpRequest 2 to upload it via AJAX (even with progress support)
  • Use the File API, available on iOS 6, which allows JavaScript to read the bytes directly and manipulate the file client-side. There is a good example of this API in action on HTML5Rocks.
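As a minimal sketch of the File API option (the selector and handler wiring are my assumptions, not from the original article):

```javascript
// Sketch: summarize the selected file's metadata, then read its bytes
// client-side with FileReader. The helper below is purely illustrative.
function describeFile(file) {
  // Works on any object exposing the File interface's name/type/size
  return file.name + " (" + file.type + ", " + file.size + " bytes)";
}

// Browser wiring:
// document.querySelector("input[type=file]").addEventListener("change", function (e) {
//   var file = e.target.files[0];
//   console.log(describeFile(file));
//   var reader = new FileReader();
//   reader.onload = function () { /* reader.result is a data: URL of the image */ };
//   reader.readAsDataURL(file);
// });
```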

WEB AUDIO API

HTML5 game developers should be happy! The Web Audio API appears in a mobile browser for the first time. This API allows us to process and synthesize audio in JavaScript. If you have never played with low-level audio, the API may seem a little weird, but after a while it’s not so hard to understand. Again, HTML5Rocks has a great article for getting started with the Audio API.

More information and news on the API on http://www.html5audio.org
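As a minimal sketch of the API, assuming the webkit-prefixed context and the noteOn/noteOff method names WebKit used at the time:

```javascript
// Sketch: play a sine tone through the Web Audio API. On iOS 6 the
// context is window.webkitAudioContext, and playback should be started
// from a user gesture (e.g. a touch handler).
function playTone(ctx, freq, seconds) {
  var osc = ctx.createOscillator();
  osc.frequency.value = freq;     // e.g. 440 Hz
  osc.connect(ctx.destination);   // wire the oscillator to the speakers
  osc.noteOn(0);                  // WebKit's pre-standard name for start()
  osc.noteOff(ctx.currentTime + seconds);
  return osc;
}

// Browser usage:
// var AudioCtx = window.AudioContext || window.webkitAudioContext;
// playTone(new AudioCtx(), 440, 1);
```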

SMART APP BANNERS

Website or native app? If we have both, we can now join efforts and connect our website with our native app. With Smart App Banners, Safari can show a banner when the current website has an associated native app. The banner shows an “INSTALL” button if the user doesn’t have the app installed, or a “VIEW” button to open it if they do. We can also pass arguments from the web to the native app; the use case is to open the native app at the same content the user was viewing on the website.

To define a Smart App Banner we need to create a meta tag with name="apple-itunes-app". First, search for your app on iTunes Link Maker and take the app ID from there.

<meta name="apple-itunes-app" content="app-id=9999999">

We can provide a string value for arguments using app-argument and if we participate in the iTunes Affiliate program we can also add affiliate-data in the same meta tag.

<meta name="apple-itunes-app" content="app-id=9999999, app-argument=xxxxxx">
<meta name="apple-itunes-app" content="app-id=9999999, app-argument=xxxxxx, affiliate-data=partnerId=99&siteID=XXXX">

The banner takes 156 pixels (312 on high-DPI devices) at the top until the user taps the banner or its close button, after which your website gets the full height back. It acts like a DOM object at the top of your HTML, but it isn’t really in the DOM. On iPad, and especially in landscape, it seems a little space-wasting.

With Smart App Banners, the browser will automatically invite the user to install or open a native app

For a few seconds, the banner shows a loading animation while the system verifies that the suggested app is valid for the current user’s device and App Store. If it’s not valid, the banner hides automatically; for example, if it’s an iPad-only app and you are browsing with an iPhone, or the app is available only in the German App Store and your account is in the US.

CSS 3 FILTERS

CSS 3 Filters are a set of image operations (filters) that we can apply using CSS functions, such as grayscale, blur, drop-shadow, brightness, and other effects. These functions are applied before the content is rendered on screen. We can chain multiple filters separated by spaces (similar to transforms).

You can try a nice demo here. A quick example of how it looks like:

-webkit-filter: blur(5px) grayscale(.5) opacity(0.66) hue-rotate(100deg);

CSS 3 CROSS-FADE

iOS 6 starts supporting some of the new CSS Image Values standard, including the cross-fade function. With this function, we can layer two images in the same place with different levels of opacity, and it can even be part of a transition or animation.

Quick example:
background-image: -webkit-cross-fade(url("logo1.png"), url("logo2.png"), 50%);

FULL SCREEN IN SAFARI

Besides the chrome-less Home Screen meta tag, the iPhone and iPod Touch (not the iPad) now support a full-screen mode in landscape. This is perfect for immersive experiences such as games or multimedia apps. There is no way to force full-screen mode; it must be launched by the user (the last icon on the toolbar). However, we can invite the user to rotate to landscape first and press the full-screen icon to activate our app. If we mix this with some touch event handling, we can hide the URL bar and provide a good interface until the user exits full-screen.

Fullscreen navigation on iPhone and iPod Touch

You will always find two or three overlay buttons at the bottom that your design should account for: the back button, the optional forward button, and the cancel full-screen button.

You can use the onresize event to detect whether the user is switching to full screen while in landscape.
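One way to sketch that detection; the heuristic and its threshold are my assumptions, based on iOS reporting screen.width portrait-relative even in landscape:

```javascript
// Hypothetical heuristic: in landscape, iOS keeps screen.width at the
// portrait value (320 CSS px), and full-screen mode hands almost all of
// it back as window.innerHeight.
function isLikelyFullScreen(innerHeight, screenWidth) {
  return innerHeight >= screenWidth - 5; // small tolerance for overlays
}

// Browser wiring:
// window.onresize = function () {
//   if (isLikelyFullScreen(window.innerHeight, window.screen.width)) {
//     // enlarge the canvas, hide our own toolbar, etc.
//   }
// };
```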

ANIMATION TIMING API

Game developers, again, you are in luck. iOS 6 supports the Animation Timing API, also known as requestAnimationFrame, a new way to manage JavaScript-based animations. It’s webkit-prefixed; for a nice demo and a more detailed explanation, check this post from Paul Irish.
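A small prefix-aware shim illustrates the idea; the ~60 fps setTimeout fallback is a common convention, not something the API mandates:

```javascript
// Sketch: resolve requestAnimationFrame across prefixes, falling back to
// a ~60 fps timer where the API is missing entirely.
function getRAF(global) {
  return global.requestAnimationFrame ||
         global.webkitRequestAnimationFrame ||
         function (cb) { return setTimeout(function () { cb(Date.now()); }, 1000 / 60); };
}

// Browser usage:
// var raf = getRAF(window);
// (function loop() {
//   // ...draw one frame...
//   raf(loop);
// })();
```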

CSS IMAGE SET

This is not part of any standards group yet. It’s a new image function, called image-set, receiving a group of images with conditions under which each should be applied. The only compatible conditions right now seem to be 1x and 2x, for low-density and high-density devices. With this new function we don’t need to use media queries to define different images for different resolutions. The working syntax is:

-webkit-image-set(url(low.png) 1x, url(hi.jpg) 2x)

It works in CSS, for example as a background-image. I couldn’t make it work on the HTML side, for the src attribute of an img element or the newly proposed picture element. With this syntax we get a clearer multi-resolution image definition, as we don’t need to use media queries and background-size values.

PASSBOOK COUPONS AND PASSES DELIVERY

Passbook is a new app in iOS that works as a virtual container for all your passes, tickets, discount coupons, loyalty cards, and gift cards. As a web developer, you may want to deliver to the user a discount coupon, a ticket to an event, an e-ticket for their next flight, or a loyalty card.

Apple allows websites to deliver these kinds of passes without the need for a native app.

To deliver a pass from your website, you just need to serve it with the MIME type application/vnd.apple.pkpass, or send it through email.
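For example, with a plain Node.js server this amounts to one header; the file name and port below are illustrative:

```javascript
// Hypothetical sketch: delivering an already-signed pass only requires
// serving it with the Passbook MIME type.
function passHeaders() {
  return { "Content-Type": "application/vnd.apple.pkpass" };
}

// var http = require("http"), fs = require("fs");
// http.createServer(function (req, res) {
//   res.writeHead(200, passHeaders());
//   fs.createReadStream("coupon.pkpass").pipe(res);
// }).listen(8080);
```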

Apple provides a tool that you can install on your server to package and sign customized passes on the fly, which may include the current user’s information.

The pass file is just a JSON metadata file and a couple of images. We need to package the file and sign it. Unfortunately, to sign the pass we need a certificate from Apple, which means the web developer needs an iOS Developer Program account ($99/year). If you receive the pass already signed, you can just serve it from your own site.

One of the great features of passes is that once a pass is installed, you can provide web services on your end and, through the Push Notification Service, the operating system will call your web services to update the information on the pass.

More information at developer.apple.com/passbook

STORAGE APIS AND WEBAPP UPDATES

No, there is no new storage API available, and there is still no support for IndexedDB. However, there are some changes you should take into consideration:

  • Application Cache limit was increased to 25 MB.
  • Chrome-less webapps (using the apple-mobile-web-app-capable meta tag) now have their own storage sandbox. That means that even if they are served from the same domain, the web app launched from the Home Screen will have its own persistent Local and SQL Storage. Even if you install the icon many times, every icon will have its own sandbox. While this is good news for apps, it may also be a problem for apps that pass information from the website to the Home Screen widget through storage.
    Credit for this finding goes to George Henne in his post.
  • There is a new undocumented meta tag that can be used on any website (whether it has the apple-mobile-web-app-capable meta tag or not) that allows us to define a different title for the Home Screen icon. As you may know, by default Safari takes the document’s title and crops it to 13 characters. Now we can define an alternative title for the Home Screen using:

<meta name="apple-mobile-web-app-title" content="My App Name">

I’ve also found a meta tag called apple-mobile-web-app-orientations accepting the possible values portrait, portrait-upside-down, landscape-right, landscape-left, and portrait-any. Unfortunately, I couldn’t make it work. If you have any luck, feel free to comment here.

WEB VIEW UPDATES

In a Web View (pseudo-browsers, PhoneGap/Cordova apps, embedded browsers), JavaScript now runs 3.3x slower (or let’s say that the Nitro engine in Safari and web apps is 3.3x faster). Be careful with that 3.3x: it is just the difference when running SunSpider on the same device in a Web View versus Safari. SunSpider doesn’t cover every kind of app, and your total rendering time is not just JavaScript, so this doesn’t mean your app runs 3.3x slower overall.

We can find some other good news:

  • Remote Web Inspector for webapp debugging
  • A new suppressesIncrementalRendering Boolean attribute that can disable the partial-rendering mechanism. I believe this feature is useful to reduce the perception of loading a web page rather than an app.
  • A new WebKitStoreWebDataForBackup Info.plist Boolean feature with which we can specify that localStorage and Web SQL databases should be stored in a location that gets backed up, such as to iCloud. This problem appeared in iOS 5.0.1; now it’s solved.
  • Changes in the developer agreement: it seems the restriction of using only the native Web View to parse HTML and JS is gone. It would be good if someone from Apple could confirm this. The only mention of the internal WebKit engine is that it is the only engine allowed to download and execute new code; that’s the anti-Chrome statement. You can use your own engine, but only if you are not downloading the code from the web. This may be opening a door, such as delivering our own engine, for example with WebGL support.

REMOTE DEBUGGING

I’m keeping this topic for the end because it’s a huge change for web developers. For the first time, Safari on iOS includes an official Remote Web Inspector. Therefore, tools such as iWebInspector or Weinre become obsolete as of this version. The Remote Debugger works with the Simulator and with real devices via USB connection only.

To start a remote inspection session you need to use Safari 6 for desktop. Here comes the bad news: you can only debug your webapp from a Mac desktop computer. It was a silent change, but Safari for Windows is not available anymore, so it’s stuck at 5.x. Therefore, only with a Mac OS computer can you run a web debugging session on your iOS devices (at least officially, for now).

For security reasons, you need to first enable the Web Inspector from Settings > Safari > Advanced. The new Inspector means that the old JavaScript console is not available anymore.

You can start a debugging session with:

  • A Safari window on your iOS device or simulator
  • A chrome-less webapp installed on your iOS device or simulator
  • A native app using a Web View, such as Apache Cordova/PhoneGap apps.

When talking about native apps, you can only inspect apps that were installed on the device by Xcode (your own apps). Therefore, there is no way to inspect websites in Google Chrome on iOS, for example.

If you are used to the WebKit Inspector (Safari 5 or Chrome), you are going to see a completely redesigned version of the inspector in Safari 6, based on Xcode’s native development UI. You may be lost for a while as you learn the new UI. With the inspector session, you can:

  • See and make live changes on your HTML and CSS
  • Access your storage: cookies, local storage, session storage and SQL databases
  • Profile your webapp, including performance reports for Network requests, Layout & Rendering and JavaScript and events. This is a big step in terms of performance tools.
  • Search on your DOM
  • See all the warnings and errors in one place
  • Manage your workers (threads)
  • Manage JavaScript breakpoints, and define an uncaught-exception breakpoint
  • Access the console and execute JavaScript
  • Debug your JavaScript code
  • Touch to inspect: There is a little hand icon inside the inspector that allows you to touch on your device and find that DOM element on the inspector.

Well done Apple, we’ve been waiting for this on iOS for a long time. Apache Cordova users should also be happy with this feature.

OTHER SMALLER UPDATES

  • Apple claims to have a faster JavaScript engine. And it seems to be true: on the SunSpider test I’m seeing a 20% improvement in JavaScript performance on the same device between iOS 5.1 and iOS 6.
  • Google Maps is not available anymore on iOS 6; now http://maps.google.com redirects to the Google Maps website and not the native app. Therefore there is a new URL scheme, maps, that will open the new native Maps application. The syntax is maps:?q=<query>, and the query can be just a search term or a latitude and longitude separated by a comma. To initiate route navigation, the parameters are: maps:?saddr=<source>&daddr=<destination>.
  • XHR2: Now the XMLHttpRequestProgressEvent is supported
  • The autocomplete attribute of the input is officially in the DOM
  • Mutation Observers from DOM4 are now implemented. You can catch a change in the DOM using the WebKitMutationObserver constructor
  • Safari no longer always creates hardware-accelerated layers for elements with the -webkit-transform: preserve-3d option. We should stop using it for performance techniques.
  • Selection API through window.selection
  • <keygen> element
  • Canvas update: createImageData now accepts one parameter, and there are two new functions whose names suggest support for high-resolution images: webkitGetImageDataHD and webkitPutImageDataHD.
  • Updates to SVG processor and event constructors
  • New CSS viewport-related units: vh (viewport height), vw (viewport width) and vmin (the smaller of vw and vh)
  • CSS3 Exclusions and CSS Regions were available in beta 1 but were removed from the final version. It’s a shame, although they were too new and not mature enough.
  • iCloud tabs. You can synchronize your tabs between all your devices, including Macs, iPhones and iPads, so the same URL will be distributed across all devices. Be careful with your mobile web architecture!
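The maps URL scheme mentioned in the list above is easy to generate from JavaScript. A minimal sketch (the helper function names are mine; only the maps:?q= and maps:?saddr=…&daddr=… formats come from the list):

```javascript
// Build URLs for the new iOS 6 "maps:" scheme.
// Helper names are hypothetical; only the URL format comes from the post.
function mapsSearchUrl(query) {
  // query can be a search term or "latitude,longitude"
  return 'maps:?q=' + encodeURIComponent(query);
}

function mapsRouteUrl(source, destination) {
  return 'maps:?saddr=' + encodeURIComponent(source) +
         '&daddr=' + encodeURIComponent(destination);
}

var search = mapsSearchUrl('coffee');
var pin = mapsSearchUrl('37.33,-122.03');
var route = mapsRouteUrl('Cupertino', 'San Francisco');
```

Navigating to one of these URLs (for example by setting window.location.href) should hand off to the native Maps app.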

WHAT WE ARE STILL WAITING FOR

There are a few things that we’ll still have to wait for until a future version, such as:

  • IndexedDB
  • FileSystem API
  • Performance Timing API
  • WebRTC and getUserMedia
  • WebGL (still there and still disabled)
  • Orientation Lock API for gaming/immersive apps
  • Integration with the native Facebook and Twitter Accounts APIs, so we can use the current OS user’s credentials automatically instead of forcing the user to log in again.

FINAL THOUGHTS

Safari on iOS 6 is a big step for HTML5 developers: debugging tools, new APIs, better JavaScript performance. However, I must say that Apple is still forgetting about documentation updates and proper communication with web developers. There are almost no answers on the discussion forum and no updates to the Safari documentation (some docs are really too old right now).


App Design Paradigms

Article: Source

Have you ever wondered what it is about an application that makes the user experience feel familiar and intuitive? There are many underlying factors that affect how we use an app, as well as the way we connect with it. A large portion of the apps we use share a basic framework that we seem to connect and interact with more intuitively; otherwise we wouldn’t be using them. These app design paradigms come into play in the way we navigate apps, create content in applications, and organize the assets we have within those apps.

Navigation

Within the apps that have the most widespread use, there seems to be a consistency in how the navigation is implemented in the UI, and not only on hand-held devices either. The consistency (or lack thereof) goes unnoticed for the most part. As designers, though, it’s our job to figure out what conditions us as humans to interact with the various devices and interfaces that we come across each day, so that we may better our future designs, products, and apps.

Google+ Navigation App Design Paradigms

Left-Oriented Navigation

Facebook, Google+, Path, Mail.app, and dozens of other services and apps have left-oriented navigation systems. While there could be many factors behind this design choice, there’s one reason that makes the most sense, at least in my opinion.

Since the invention of paper, (most) humans have been writing their languages from left to right. This was originally intended to keep the ink from being smeared across the page as it was written onto the papyrus. While an iPhone display isn’t exactly a piece of Indo-European papyrus, it’s in our culture to teach and be taught to read and organize from left to right.

From an early age this has been instilled in us, and through that, our brains have learned to comprehend much, if not all, of what we do in a left-to-right manner. By placing an application’s organization system on the left-hand side, we naturally view it as part of the interaction process. There are exceptions to this, as not all apps are designed like the examples discussed above. However, this does present a valid argument for left-oriented navigation within apps.

Left Hand Navigation App Design Paradigms

Bottom-Oriented Navigation

Bottom-oriented navigation is yet another possibility that seems to shine within certain mobile apps. Apps such as Tweetbot, Dropbox, and Instagram all have predominantly bottom-oriented navigation. The best argument for bottom-oriented navigation is that when using our mobile devices, the placement of our hands allows our thumbs to easily glide along the bottom of the screen.

This is a welcome concept to our subconscious, as it is the quickest solution to a problem. This placement also helps because moving our thumbs along the bottom keeps us from blocking any other content on the screen. It’s a logical solution to the development of the mobile platform.

Bottom Oriented Navigation App Design Paradigms

Top-Oriented Navigation

Top-oriented navigation exists, although it seems to be a bit more prominent in desktop apps. I’ve yet to come up with a valid argument for that paradigm, with the exception of productivity apps such as Numbers and Pages, where, for organization’s sake, you start at the top. Pocket, though, is one exception: by having the navigation keys up top on a slightly darker background color, it puts more emphasis on the saved content. If you have any argument for this paradigm, I’d love to hear feedback in the comments below.

Publishing/Creation

When going to share, publish, reply, or create something within a mobile application, there is a recurring theme in the vast amount of designs. While the y-axis of this particular paradigm shifts from app to app, the x-axis seems to stay consistent. In a variety of mobile app genres, the button to add, create, or publish content is right-oriented.

Tumblr, Apple’s Clock app, and Trip Cubby all share this UI choice. To make a new Tumblr post, select the button in the lower-right-hand corner; to create a new alarm in the Clock app, select the ‘+’ button in the upper-right-hand corner; the same can be said for creating a new log in Trip Cubby. I believe the reason for this is the same as mentioned above: the fact that a majority of individuals are right-handed, and our thumbs naturally fall along the right side of the x-axis.

As with the paradigms mentioned above, there are exceptions. Facebook, in both its native mobile apps and browser-based versions, has the button for a new post located across the top of the display. Path takes the complete opposite approach, with the post button located in the lower-left-hand corner. If there are any apps that use a different method, I’d love to see how they implement it in their design.

Right Oriented Creation App Design Paradigms

Overview

With designers and developers coming up with new apps each day, there will always be a variety of UIs within their respective platforms. It’s interesting to note, however, that a large portion of the most downloaded and used apps share the paradigm of designing with the human subconscious in mind. Many developers may not even consciously know why they make these UI choices, but that only goes to show that this paradigm may simply be a product of that which is unseen. The subliminal, if you will.

What design paradigms have you noticed within various UIs? Share them, below!


Four Internet of Things trends.

Article: Source

The Internet of Things is composed of networked objects with sensors and actuators. These objects observe their environment and share the data they collect with each other, with Internet servers and with people. This data is analyzed, and the results are used to make decisions and effect change. Change may come from a connected object making adjustments in the environment, or it may come after the collected information is analyzed further by a person.

Odopod has several clients involved in the Internet of Things space and we’ve worked with them in a variety of ways including brand and marketing work, product and service development and connected object prototyping.

We recently led a workshop with one of these clients, exploring ways that their household products could benefit from being connected to the Internet. Several of their products are already connected to each other and the Internet; we helped them uncover new opportunities to push these products beyond pure utility and to find ways to do and say something new.

To get things started we reviewed four themes that come up most often in Odopod’s work around the Internet of Things.

1. The quantified self.
image



At this year’s Planningness Conference, Guthrie (Director of Brand and Strategy at Odopod) and I led a session on Connected Personal Objects, where we explored how the Internet of Things can drive a virtuous cycle of learning and change based on the collection and analysis of data.

Tracking performance as a guide for change is not a new idea. Companies use data to improve business processes as well as product marketing. Athletes and medical professionals collect biometric data to optimize performance and patient treatment. What’s more, an increasing number of non-professionals are collecting information about themselves, looking for patterns in order to positively impact their lives. In all cases, the mechanisms employed range from pen and paper to high-tech devices coupled with data mining.

There is no question that the Internet of Things makes it easier and easier for us to learn from our actions. Many products provide customers with direct access to the information from which they can draw their own conclusions. Increasingly, these products will be bundled with services to perform more detailed analysis and deliver simple, actionable recommendations.

For example, most services that track athletic performance such as running collect data and report extensive information about current and past runs. Future services will take things further. Based on deeper analysis, these services will be able to set optimal diet and workout plans as well as provide real-time coaching based on your individual training goals and performance history.

2. From computers to things.
image



As sensors, actuators and the technologies that let them talk to each other become smaller and less expensive, more and more objects will be networked. This progression is the nature of the Internet of Things and it’s changing our relationships to computers and information.

Today, smartphones are the most pervasive objects in the Internet of Things landscape. They provide a wide array of sensors and radios for communication in a single, portable package. They run robust operating systems that allow an endless number of applications to take advantage of these sensors and radios in different ways. As miniaturized general-purpose computers, smartphones bear more of a resemblance to PCs than they do to the future of connected objects.

We’re already seeing an increase in the number of connected objects dedicated to one purpose. These objects are custom fit to do a specific thing better and more conveniently than a smartphone app. These items are easily recognizable as digital devices. Not only do their buttons and screens betray their heritage, but they also tend to be dependent on smartphones, PCs and chargers.

As technology continues to advance, everyday objects unrecognizable as hi-tech gadgets will be equipped with sensors and internet connectivity. Objects such as lamps will inconspicuously monitor the conditions of their surroundings, communicate with other objects within their network and act based on their collective knowledge.

Finally, in environments where robust sensors are pervasive, it’s no longer necessary for objects to contain their own electronics or power. These objects are virtually linked to the Internet by other objects that act as their agents. Technologies like RFID and computer vision allow connected objects to identify these objects and display data, information and user interfaces on their behalf. A connected kitchen counter could identify groceries by sight and display nutritional information and recipes that match your tastes.

3. Ambient information.
image



As the components of the Internet of Things disperse, it becomes possible for displays to become more integrated into our environment. Ambient displays such as the three projects pictured above are dedicated, real-time displays for a small set of dynamic data.

As with this weather clock or this map of bicycle availability, an ambient display may work exclusively with a specific type of information. Others, like the Ugle are designed to display data of your choice in a coded manner, significant only to those in the know.

In either case, these objects can be made with simpler technologies than more complex devices that combine sensing, control and display. They can be made in smaller batches and in a variety of styles to match the environment in which they’re placed. It seems inevitable that stores like Target, Restoration Hardware and Bloomingdale’s will carry a range of ambient displays in their housewares departments. Furthermore, small boutique shops and furniture designers will sell uniquely designed and custom-made displays. Consumers will shop for displays that match their personal styles and connect them to the information services of their choice.

With ambient displays, people won’t need to grab their phones, launch an app and wait for data to be fetched. At a glance, they will know the weather forecast, if a spouse has left work, or if the doors of their house are locked.

4. Interoperability adds value. 
This last trend is relatively simple, but critical: things on the Internet must work together.

Currently there is no broadly used, open standard for how objects on the Internet of Things share information and communicate with one another. Except in the case of peripherals, objects made by one company seldom interact directly with those from another company. When they can interact directly, the objects tend to be part of a closed, proprietary system with a licensor acting as gatekeeper, preventing a truly open standard.

In response, web-based Application Programming Interfaces (APIs) are being used to encourage interoperability between different systems. Implementing an API does not require that companies expose all of their data or give up competitive advantages. Anyone who integrates their product or service with Facebook wishes that they had more control over what they read or publish through the API; but to give that control to third parties could jeopardize Facebook’s livelihood.

The Health Graph by RunKeeper is one example of a company publishing a subset of their data to an open database. In exchange for providing this information to potentially competitive products, RunKeeper benefits from complementary devices such as the Fitbit Ultra, the Withings Scale and the Zeo Sleep Manager publishing data to the same system. Having one place for all of these products and services to come together provides additional value to the customers of all participating companies.

In cases where a manufacturer does not directly support interoperability with specific products or features, third party systems such as If This Then That will step in and provide an easy way to connect the outputs of one API to the inputs of another. Theoretically, a service like this could allow a house’s lighting system to benefit from knowing that the alarm system has been armed and that everyone has left the house. This sort of interaction between systems assumes that those features have been exposed via APIs and that the homeowner has given each system permission to talk to one another. This scenario doesn’t require that the two manufacturers plan for or invest in this particular integration.
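The If This Then That pattern described above can be sketched as a tiny rule engine that wires one system’s output events to another system’s input actions. Everything below is hypothetical: the event names, device objects and API shape are illustrations, not any real product’s API.

```javascript
// A minimal "if this then that" sketch: rules connect the output
// of one system's API to the input of another. All names are hypothetical.
function createRuleEngine() {
  var rules = [];
  return {
    // Register a rule, e.g. when('alarm.armed', turnOffLights)
    when: function (eventName, action) {
      rules.push({ eventName: eventName, action: action });
    },
    // Called by any connected system that exposes its events via an API.
    emit: function (eventName, payload) {
      rules
        .filter(function (r) { return r.eventName === eventName; })
        .forEach(function (r) { r.action(payload); });
    }
  };
}

// Usage: the alarm system's "armed" event drives the lighting system,
// without either manufacturer planning for this specific integration.
var engine = createRuleEngine();
var lights = { on: true };
engine.when('alarm.armed', function () { lights.on = false; });
engine.emit('alarm.armed', { house: 'home' });
// lights.on is now false
```

In practice the emit calls would be driven by webhooks or polled APIs, and each rule would require the homeowner’s permission, as described above.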

As with popular web-services, we’ll find that Internet of Things products and services that offer APIs and encourage third parties to develop integrated applications will enjoy broader adoption than those that are closed.

Forward-looking statements. 
These four themes by no means cover all of the trends influencing the development of the Internet of Things. The work we do at Odopod largely focuses on personal connected objects and those found within homes and autos. Within this purview, these trends are influencing the design of Internet of Things products and services and the way we interact with them.

What trends related to the Internet of Things interest you? Let us know in the comments below.


Spectacular Floating Jellyfish Aquarium at Portland Airport

Article: Source

image

Multi-disciplinary artist Sayuri Sasaki Hemann brings underwater jellyfish worlds to the surface in this ongoing project entitled Urban Aquarium. Growing up, the artist and her family relocated many times between Japan, Australia, and Romania. Often feeling displaced within these new environments and new languages, Hemann says “I naturally found refuge in using my hands and making art.”

In her adult life, Hemann continues to investigate these feelings of being out of context and displaced in life. Specifically, she draws on a fascination with sea life, recreating floating jellyfish aquariums where you might least expect them, in front of the public eye. Urban Aquarium consists of a variety of installations, and Underwater Flight, located in Portland International Airport, is just one of many located across the city of Portland, Oregon.

The artist enjoys using a diverse range of media, and she says, “I am most inspired by the way light reflects on each medium, threads and fibers, ever-changing underwater sea creatures, and vivid colors of life.” These animals are made of colored organza and are suspended from the ceiling to create a wild interpretation of a jellyfish world. As travelers quickly zip by to catch a flight, they are unexpectedly greeted by a sea of floating creatures.

image
