Category Archives: Cross Platform

Future Cities :: Future of Planning

We are delighted to confirm that we have been selected as one of the 10 shortlisted projects in the Future Cities Catapult “Future of Planning” initiative.

https://smartcitiesworld.net/governance/smes-help-to-overhaul-uk-planning-1473

Our solution, B4itsBuilt, mobilises citizen engagement through true visualisation.

This project will extend professional, accurate, real-world AR experiences into the public domain for the first time. It will deliver the template for a consumer solution to allow users to see the visual effect of a planning application. It will be possible to hold up a phone or tablet and see the impact of an application in order to enable informed engagement and increase community influence in decision making.

If you would like to be involved as a data partner and have a project or development you would like to see, please get in touch at hello@linknode.co.uk or call us on 0141 559 6170.


Staff Spotlight – George Banfill

Our latest team spotlight is our Technical Director George Banfill.


Who are you and what is your role at Linknode?
I’m George Banfill – husband, father, sailor, cyclist, geek and tired. (That comes of having a 3-year-old and a 4-month-old at home.)

 What is your role at Linknode?
I oversee all the product software development and software releases we do as a company. I was employee number two back in August 2011 and have been working on building cross-platform mobile apps and supporting systems ever since. I’ve been involved in building and overseeing Windows Phone, Windows 8, Xamarin.Android and Xamarin.iOS development, as well as creating ASP.NET websites to support them.

Staff Spotlight – Rufus Mall

Our latest team spotlight is our iOS guru, senior software engineer, Rufus Mall.

Who are you and what is your role at Linknode?

My name is Rufus Mall. I’ve been working as a Software Engineer at Linknode for around three and a half years now. The main skills I use at Linknode are graphics programming and iOS development. I spent the first three years of my time at Linknode working on VentusAR but have now moved on to working on our upcoming product UrbanPlanAR.


My favourite person in the world (alongside Steve Wozniak)


Hackdays!

Last week, as a company, we had a hack day. Everyone (well, all the developers) stopped their normal work and had two days to see what they could produce.

The Brief

Produce an immersive Augmented or Virtual Reality experience.

The Prizes

  • A bluetooth wireless speaker
  • A mini drone
  • Pie Face (the kids game with skooshy cream)
  • Eternal respect and bragging rights in Linknode Towers

The Technology

Our standard developer kit uses Xamarin to write cross-platform C# for Android and iOS, so for this hack day I wanted to give everyone the option to use whatever tools (software or hardware) they wanted:

  • An Oculus Rift DK2
  • Google Cardboard
  • A DSLR camera and a Go Pro
  • iPhones (various different types)
  • Android phones (various different types)
  • Unity / MonoGame

The Starting Gun

The Entries

VentusAR Fly Through using WebGL on Google Cardboard / Oculus Rift


Experimenting with WebGL to create a Google Cardboard experience. This provides a view for each eye; when viewed together these produce a stereoscopic effect. In theory the same technique could be used to show the VentusAR Fly Through on the Oculus Rift.
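
The hack-day entry itself was WebGL, but the two-viewport idea is easy to sketch in the C#/MonoGame stack we normally use. This is an illustrative sketch only (DrawScene and the other names are hypothetical), not the hack-day code:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class StereoRenderSketch
{
  const float EyeSeparation = 0.065f; // roughly 65mm between the eyes

  public void DrawStereo(GraphicsDevice device, Vector3 cameraPosition, Vector3 lookAt)
  {
    int halfWidth = device.PresentationParameters.BackBufferWidth / 2;
    int height = device.PresentationParameters.BackBufferHeight;

    // left eye rendered into the left half of the screen, right eye into the right half
    DrawEye(device, new Viewport(0, 0, halfWidth, height), cameraPosition - Vector3.Right * EyeSeparation / 2, lookAt);
    DrawEye(device, new Viewport(halfWidth, 0, halfWidth, height), cameraPosition + Vector3.Right * EyeSeparation / 2, lookAt);
  }

  void DrawEye(GraphicsDevice device, Viewport viewport, Vector3 eye, Vector3 lookAt)
  {
    device.Viewport = viewport;
    Matrix view = Matrix.CreateLookAt(eye, lookAt, Vector3.Up);
    Matrix projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, viewport.AspectRatio, 0.1f, 10000f);
    // DrawScene(view, projection); // hypothetical: render the scene with these matrices
  }
}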

Google Street View on Google Cardboard

Can we provide an outdoor virtual reality experience using imagery from Google Street View and view it while on site? This takes the GPS location from the device and contacts Google Street View to show photography from that location.

 

Surviving the blob onslaught (with Unity for Google Cardboard)

A simple survival game in which users, wearing the Google Cardboard headset, have to explore a virtual world. They are under sustained attack from blobs, which they must destroy using the magnetic switch on the side of the device. Built in Unity, so it is cross-platform, the game runs well on Android and iPhone devices.

Exploring panoramic photographs with the Oculus Rift

Could we take a 360-degree panoramic photograph and explore our way around it using the Oculus Rift? This would anchor the photograph in a real-world position and, as you move your head, show only the part within the current field of view. However, getting the Oculus Rift running in two days proved hard.

The Results

Ryan won the best experience prize with his ‘Surviving the blob onslaught’ game for Google Cardboard. He won the drone and eternal bragging rights within Linknode Towers (or at least until we do it all again sometime in 2016). Congratulations, Ryan.

Minh and Rufus shared the most commercial possibility prize for their efforts in bringing existing VentusAR functionality to Google Cardboard and the Oculus Rift.

Conclusions

We took three overriding thoughts out of this hack day:

  • The pixel resolution of a phone’s screen is visible when viewed through the Google Cardboard lenses.
  • Google Cardboard is crying out for other methods of input.
  • Getting anything to work with the Oculus Rift is hard

In all, there are some interesting additions we could make to VentusAR to provide more immersive virtual reality experiences. Watch this space…

Minecraft Makes Wind Farm Development Projects Accessible

Press Release

Valid 00:01 April 1 2015 to 11:59am April 1 2015

New guidance for the visualisation of wind farm developments understands that “stakeholder engagement is extremely important” and recognises that new developments have “considerable scope” for use as techniques are developed and presented.

In support of this guidance, Linknode today announced that development of a Minecraft version of its interactive visualisation and communication tools for wind farm development has reached preview stage.


Minecraft Visualisation over Fort William to Ben Nevis (using Ordnance Survey Open Data – OS Terrain 50)

Minecraft is the most popular 3D world creation and exploration tool. Its use, up to now, has been primarily for gaming. However, its capabilities for planning assessment and visual impact assessment can now be readily exploited.

Crispin Hoult, CEO of Linknode, explained: “This new development makes the visual impact of a project accessible on a PC, mobile phone or even a games console in the living room”. Hoult added, “With familiar devices and controls we can communicate information better and to a wide audience of interested stakeholders”.


Minecraft Visualisation of a Sunrise Over Ben Nevis (using Ordnance Survey Open Data – OS Terrain 50)

Linknode’s business is the integration of real-world data for visualisation – typically on tablets for live photomontage visualisation with the flagship VentusAR software. The company realised that the underlying data and services could be made available to a far wider audience, one otherwise less engaged with the planning and development process.

In the future, SNH guidance for visualisation plans to embrace and incorporate all types of digital media, including real-time visuals and personalised access to data (after suitable testing and scrutiny).

The commercialisation and integrated product development is scheduled to take place over the next year to be released with a version of VentusAR on 1 April 2016.

Notes to Editors

Linknode is the creator of VentusAR – the tablet-based software that uses augmented reality to allow developers, planners and communities to see, in real-time, what a planned wind farm will look like in the landscape, from any location.

Minecraft: Available across multiple platforms, “Minecraft” is one of the most popular video games in history, with more than 100 million downloads, on PC alone, by players since its launch in 2009. “Minecraft” is one of the top PC games of all time, the most popular online game on Xbox, and the top paid app for iOS and Android in the US.

In September 2014 Microsoft Corp. announced it had reached an agreement to acquire Mojang, the celebrated Stockholm-based game developer, and the company’s iconic “Minecraft” franchise.

Viewing and Purchasing Cumulative Datasets

In our previous blog we introduced the VentusAR cumulative data service and described just how big the dataset is. Now I guess you’re wondering how you get access to it?

It’s simple, and can save you time, effort and money in your everyday work.

As a VentusAR customer, you already have the tools to allow you to view a summary of the cumulative projects within the National Dataset. Full access to the details and a snapshot update are available at a small cost.

Now, non-VentusAR app customers can also access the data by signing up as a Cumulative Data User (portal-only, no app); please contact us for more information.


So, where to start? Log in to the Portal as usual, select the project you want to assess the cumulative impact on, and navigate to the ‘Cumulative’ tab.

Here you can:

  • View any ‘Organisational’ Cumulative Projects you have set up within the portal
  • View the Free Summary from the National Dataset
  • Request a Snapshot of the Project Area (remember, a Snapshot is a time-stamped update of the cumulative projects following our search and refresh process)
  • View your Snapshot Data

Free Summary

The free summary shows the number of cumulative projects found within the range set by the user (e.g. 30 km), organised by project stage.


You can change the search range by navigating to the Edit Project page and entering the specific radius you require in the Cumulative Search Radius field. This enables different searches to be quickly and easily previewed for summary counts before ordering a Snapshot. The National Database is refreshed every two months.

Requesting a Snapshot

To ensure the data is up-to-date for a particular project, each Snapshot delivery is validated and refreshed individually for all the Cumulative Data within your project radius.

The charges for this service are worked out from the number of projects found within your search area. The Portal will show you how much you will be charged, and you can review this before committing. You must agree to our data usage and pricing terms.

To request the Snapshot, click the Request Snapshot button.


Once you have agreed to terms and ordered a Snapshot, instant access is provided to full details within the National Dataset for the project area.  This includes project name and location, turbine information (locations, sizes and models) and planning application number.

(Remember, the national dataset extract may be up to two months out of date, so it should not be used for planning submissions.)

An up-to-date Snapshot refresh will be provided within 5–7 working days of ordering. The Snapshot will be delivered directly to your project within the Portal and will replace the extract from the National Dataset. The Snapshot provides up-to-date details of nearby projects, so it may be used for planning submissions.

Additional metadata is included in the Snapshot to show the source and date retrieved, providing an auditable trail of cumulative data.

Viewing the Cumulative Data within the App

You can view the Free Summary in the app too. The details will be locked, however, and will only show the project locations on the map summary.


After you have requested a Snapshot, the Cumulative Data will be available in the app, on the map summary, the Fly Through and on My View. Until the Snapshot is delivered, this will be the latest National Dataset details available.


Exporting Cumulative Data

As well as being viewable in the app, the Snapshot Cumulative Dataset you purchase can also be downloaded for your own use as a .CSV (comma separated, plain text) file.
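
As a rough illustration, the exported file can be read with a few lines of C#. The file name and column names below are hypothetical – check the header row of your own export for the actual fields:

using System;
using System.IO;
using System.Linq;

class SnapshotCsvSketch
{
  static void Main()
  {
    string[] lines = File.ReadAllLines("cumulative_snapshot.csv"); // hypothetical file name
    string[] header = lines[0].Split(',');                         // e.g. ProjectName, Easting, Northing, ...
    Console.WriteLine("Columns: " + string.Join(", ", header));

    foreach (string row in lines.Skip(1))
    {
      string[] fields = row.Split(','); // fine for simple values; use a CSV library if fields are quoted
      Console.WriteLine(string.Join(" | ", fields));
    }
  }
}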

Upon purchasing the dataset, you agree to only use your data for provision of the VentusAR system and services, or in relation to your use for that project only and not for data resale.

If you have any questions regarding access to our Cumulative Data, please feel free to call us on 0141 559 6170 or email hello@ventusar.com

Your Phone has Attitude!

The axis on a mobile device

Sorry, this post isn’t about your phone or tablet’s bad attitude and the way it doesn’t let you do what you want – that’s just what working with Android does. Instead, this post is about how we at Linknode use the sensors built into your device to understand the direction it is oriented in and how that can be used to do interesting things.

This is a core piece of technology we use within VentusAR. We have spent a lot of time and effort interfacing with the sensors within your devices. This experience and skill goes into several of our mobile apps to provide a more intuitive and useful mobile experience.

In this post, we’ll talk about attitude (or geospatial orientation), sensors and sensor fusion, then show some example code of how to get this attitude information on each of the major platforms. I’ll write a follow up post that will dig more deeply into what sensor fusion is and how we have customised it in VentusAR, to provide a better user experience in our augmented reality applications.

Attitude

To allow the device to present useful information about its surroundings, we need to know the direction the device is looking. This is key information for doing any proper augmented reality. The direction your device is looking is called ‘the attitude’ (or geographic orientation) of the device. In essence, this is a value that represents the rotation of the device in real-world coordinates. In mathematics, this rotation can be represented in a number of ways: as a quaternion, as a rotation matrix or as three separate values for yaw, pitch and roll. We use a quaternion to represent this rotation because it is smaller, involves simpler maths to work with and avoids known problems with rotation matrices – I’ll cover that in a separate blog post some time.
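
As a small illustration of those representations (a sketch, not VentusAR code), the same rotation can be built from yaw, pitch and roll angles and then carried around as a single quaternion:

using System;
using System.Numerics;

class AttitudeRepresentationSketch
{
  static void Main()
  {
    // 45° yaw (heading), 10° pitch, 0° roll – illustrative values only
    float yaw = ToRadians(45), pitch = ToRadians(10), roll = ToRadians(0);

    // one compact value that encodes the whole rotation
    Quaternion attitude = Quaternion.CreateFromYawPitchRoll(yaw, pitch, roll);
    Console.WriteLine(attitude);
  }

  static float ToRadians(float degrees)
  {
    return degrees * (float)Math.PI / 180f;
  }
}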

Sensors

Modern phones and tablets have lots of sensors in them – they allow app developers to get an insight into the world around them. In terms of attitude, the ones we are interested in for this post are:

  • Compass – gives the direction of magnetic north in 3D space
  • Gyroscope – this measures angular rotation – how far you have rotated the device
  • Accelerometer – measures the direction of gravity in 3D space

There are a couple of limitations of these sensors that are worth knowing about:

  • Digital compasses are very noisy and susceptible to interference, so they often jump around during real-world use. This is down to the characteristics of the sensor – as an app developer, there is not much you can do about it.
  • Gyroscopes tend to drift. There is no real-world reference for the gyroscope; it is just measuring rotation. If you turned through a complete 360°, you would expect the gyroscope to report the same result as when you started. Unfortunately it doesn’t: after running for a while it tends to drift.

For these reasons, some very clever people came up with the concept of sensor fusion.

 Sensor Fusion

These sensors can be merged in software into a single “virtual” sensor using a process called Sensor Fusion. Many people have written in-depth articles about what Sensor Fusion is and how it works – but you may need a PhD to understand them. I think it is easiest to see it as a mathematical process that takes input from the three physical sensors (compass, gyroscope and accelerometer) and provides one unified quaternion representing the attitude of the device.
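
As a rough, deliberately simplified illustration of the principle (real implementations fuse full 3D rotations, usually with Kalman-style filters), a single-axis complementary filter blends the smooth-but-drifting gyroscope with the noisy-but-absolute compass:

public class ComplementaryFilterSketch
{
  double _heading;           // fused heading in degrees
  const double Alpha = 0.98; // how much we trust the integrated gyroscope

  public double Update(double gyroRateDegPerSec, double compassHeadingDeg, double dtSeconds)
  {
    // integrate the gyroscope (smooth, but drifts) ...
    double gyroEstimate = _heading + gyroRateDegPerSec * dtSeconds;

    // ... then pull it gently towards the compass (noisy, but doesn't drift)
    _heading = Alpha * gyroEstimate + (1 - Alpha) * compassHeadingDeg;
    return _heading;         // note: ignores wrap-around at 0°/360° for brevity
  }
}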

Sensor Fusion block diagram

To provide a more detailed example, if you were standing in the northern hemisphere with the device perpendicular to the ground (i.e. level on a tripod) and turned it to face headings of 0, 45, 90 and 180 degrees, the device’s attitude at each heading would be:

     0 degrees   45 degrees   90 degrees   180 degrees
x    0           0            0            0
y    -1          -0.9238795   -0.7071068   0
z    0           0            0            0
w    0           0.3826834    0.7071068    -1
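
As a rough check on where those numbers come from (a sketch, not VentusAR code; the exact axis convention depends on the platform’s reference frame), the values above match a rotation of (heading − 180°) about the device’s Y axis. Remember that q and −q represent the same rotation, which accounts for the −1 rather than +1 in the 180° column:

using System;
using System.Numerics;

class HeadingQuaternionSketch
{
  static void Main()
  {
    foreach (float heading in new[] { 0f, 45f, 90f, 180f })
    {
      // rotation of (heading - 180°) about the Y axis, expressed as a quaternion
      float radians = (heading - 180f) * (float)Math.PI / 180f;
      Quaternion q = Quaternion.CreateFromAxisAngle(Vector3.UnitY, radians);
      Console.WriteLine($"{heading,3}°  x={q.X:0.#######}  y={q.Y:0.#######}  z={q.Z:0.#######}  w={q.W:0.#######}");
    }
  }
}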

How does it help?

As I said at the start, integration with the sensors is at the core of what we do at Linknode. We have several apps that read data from the sensors and provide a real-time view across a 3D world. We can pull in a real-world terrain model and show what the terrain looks like in a particular direction.

Implementations

Each device manufacturer / OS vendor provides their own implementation of sensor fusion within their devices. These are usually good enough for general or gaming purposes – they tend to emphasise speed of response over absolute accuracy. Below is some code that gets a quaternion out of the API provided by each OS.

All the code below is C#, as all the code we write is C#. For more information on running C# on iOS or Android, have a look at what Xamarin are up to.

Apple (iOS)

Apple provides the CMMotionManager class that can be used on iOS.

public class IOSSensorFusionExample
{
  // keep references as fields so they are not garbage collected while updates are running
  private CMMotionManager _motionManager;
  private NSOperationQueue _backgroundQueue;

  public void Start()
  {
    _motionManager = new CMMotionManager();
    _backgroundQueue = new NSOperationQueue();
    _motionManager.DeviceMotionUpdateInterval = 1.0 / 60.0; // request 60 updates a second (1/60 would truncate to 0)
    _motionManager.StartDeviceMotionUpdates(
      CMAttitudeReferenceFrame.XMagneticNorthZVertical,
      _backgroundQueue,
      delegate (CMDeviceMotion motionData, NSError error)
      {
        CMQuaternion cMQuatAttitude = motionData.Attitude.Quaternion;
        // do something useful with the quaternion here
      });
  }
}

(See the Xamarin API for more details.)

Android

Android provides the RotationVector sensor type accessible from their SensorManager class:

public class AndroidSensorFusionExample : Java.Lang.Object, ISensorEventListener
{
  private readonly Context _context; // needed to reach GetSystemService

  public AndroidSensorFusionExample(Context context)
  {
    _context = context;
  }

  public void Start()
  {
    var sensorManager = (SensorManager)_context.GetSystemService(Context.SensorService);
    var rotationVectorSensor = sensorManager.GetDefaultSensor(SensorType.RotationVector);
    sensorManager.RegisterListener(this, rotationVectorSensor, SensorDelay.Game);
  }

  public void OnSensorChanged(SensorEvent e)
  {
    float[] q = new float[4];
    SensorManager.GetQuaternionFromVector(q, e.Values.ToArray()); // q is [w, x, y, z]
    Quaternion quaternion = new Quaternion(q[1], q[2], q[3], q[0]);
    // do something useful with the quaternion here
  }

  public void OnAccuracyChanged(Sensor sensor, SensorStatus accuracy)
  {
    // required by ISensorEventListener; nothing to do for this example
  }
}

(See the Xamarin Android API and Android docs for more information.)

Windows Phone

Windows Phone provides the Motion class:

Motion sensor = new Microsoft.Devices.Sensors.Motion();
sensor.CurrentValueChanged += (sender, args) =>
{
    var quaternion = args.SensorReading.Attitude.Quaternion;
    //do something useful with the quaternion here
};
sensor.Start();

(see MSDN for more details)

Windows 8

Windows 8 uses the OrientationSensor class:

var sensor = Windows.Devices.Sensors.OrientationSensor.GetDefault();
sensor.ReadingChanged += (sender, args) =>
{
    var quaternion = args.Reading.Quaternion;
    //do something useful with the quaternion here
};
sensor.ReportInterval = 16;

(see MSDN for more details)
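
One way to keep the shared code out of those four platform-specific APIs is to hide them behind a common interface and have each example above push its quaternion through it. A minimal sketch (illustrative names, not our actual API):

using System;

public interface IAttitudeProvider
{
  // raised whenever the OS reports a new attitude; Quaternion is whichever
  // quaternion type the shared code already uses
  event Action<Quaternion> AttitudeChanged;

  void Start();
  void Stop();
}

// Each platform class (the iOS, Android, Windows Phone and Windows 8 examples above)
// would implement IAttitudeProvider and raise AttitudeChanged from its callback,
// so the shared rendering and AR code never touches a platform API directly.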

Mobile Application Development

Our History
Linknode are an app development company. We have a portfolio of apps available across different platforms and their stores. We focus on cross-platform, sensor-driven applications that fit within our Mobile Geography domain. This post is a tour around our app portfolio and what we learnt along the way.

How it started: EmergencySMS

Linknode’s early apps were all for the Windows Phone market. It all started with the launch of EmergencySMS in June 2011 (the company was two months old at the time). This application takes the device’s location, looks up the address and creates an SMS message that can be sent to the UK’s national Emergency SMS service.

EmergencySMS is still available in the Windows Phone marketplace, and is still used by customers: it gets 10-20 downloads a day. I don’t consider it a masterpiece of design or implementation but it is a good place to start.

  • Start small, your apps may last longer than you think.

MegaTile – free version and premium features

Another consumer app (and our most successful in terms of revenue generated) is called MegaTile; it allows the user to customise their home screen on Windows Phone 7 and 7.5. The user can choose an image to make a MegaTile from and set an action (launch the phone dialler, send an SMS, open a web page, etc.) for each tile.

  • Have a free version: it makes users more likely to download the app and gets them to convert to the full version. 7% of our users purchased after running a trial.
  • Test the upgrade process of your app on each release. We didn’t and caused ourselves a lot of negative comments from users who were unable to upgrade between v1.2 and v1.3.

Album Flow – Use a Design Pattern

We had an idea to bring the iOS-style cover flow to Windows Phone; this resulted in the Album Flow application. The initial version of this app was ready in three days. We are just polishing our 9th release and have over 100,000 users. Album Flow was featured by Appvertise (Nokia and Microsoft) and won a CreativeBloq App Generator Top Ten App award.

  • Use a design pattern to make your code consistent and understandable. We use the MVVM pattern and the MVVMLite library to support it (a minimal sketch follows below).
  • Respond to user demand and comments to get better reviews and more satisfied users.
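
Here is the minimal sketch promised above: a plain INotifyPropertyChanged view model (illustrative names; MVVM Light adds helpers such as ViewModelBase and RelayCommand on top of this idea):

using System.ComponentModel;

public class AlbumViewModel : INotifyPropertyChanged
{
  string _title;

  public string Title
  {
    get { return _title; }
    set
    {
      if (_title == value) return;
      _title = value;
      OnPropertyChanged("Title"); // the XAML view binds to Title and updates automatically
    }
  }

  public event PropertyChangedEventHandler PropertyChanged;

  void OnPropertyChanged(string propertyName)
  {
    var handler = PropertyChanged;
    if (handler != null) handler(this, new PropertyChangedEventArgs(propertyName));
  }
}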

Starting out in Augmented Reality

Where On Earth, and its cut-down version Heads Up Compass, are two apps that started to define the GIality concept and are the first examples of the use of GIality on a mobile device.

Heads Up Compass takes the data from the device’s motion sensor, uses it to work out which direction you are looking and superimposes that on the camera feed. Calibrating the compass and getting accuracy from the device sensors is hard. Use the combined Motion API on Windows Phone, or its equivalent on Android or iOS, for better results than a single sensor alone.

Where On Earth goes a step further and looks up features around you (hill tops and towns currently) to help identify what you can see in the camera view. Where On Earth was customised for a team entering the Oxfam Trail Trekker to provide them with a heads-up display of checkpoint locations and distances.


  • Review your maths textbook: quaternions are good when working in augmented reality situations, but complicated to get your head round.
  • Mobile apps should work when out of mobile signal. Design to manage offline access and data syncing. There’s a whole other blog post in this topic.

Windows Phone Development Summary

All of the apps covered above are only for Windows Phone. I love the process of developing for Windows Phone.

  • The tools are excellent – I can use Visual Studio (which I have plenty of experience with and is set up exactly the way I want it).
  • I can switch to Expression Blend if I want to do some more complicated user interaction.
  • I can use C#, a popular language that many developers who have worked in enterprise understand.
  • Having come from a web background, I would much rather do layouts in XAML than in HTML / CSS.

My biggest complaints so far have been to do with the store and submission process (not being able to cancel an app while it is being certified) and understanding some of the SDK design decisions to do with the back stack and navigation.

Evolving Beyond Windows Phone

In mid-2012 we decided that Windows Phone was not a big enough market for some of our mobile geography plans (although it may be getting there according to the latest stats from Kantar). Specifically, we wanted to expand into running our GIality solutions on a tablet. Whilst Windows 8 was coming (it launched in October 2012), we wanted to start running GIality apps on tablets right away. We chose Android as our first tablet platform: specifically the Nexus 7 and, when it was released, the Nexus 10.

VentusAR: Support for Android

Our first cross-platform app was VentusAR. This is a business-to-business application (so it’s not available in the Play Store) that allows you to visualise what a wind farm would look like if it were built as planned. There are many different parts to VentusAR that all need to work together and need to work across multiple platforms. The main application logic needs to be compiled by the Windows Phone and Mono for Android compilers (we’ve since added the MonoTouch for iOS and Windows RT compilers too).

Coming from a Windows Phone background, Android development (even using Mono for Android and Xamarin tools) feels a bit of a disjointed mess. I guess it comes of saying anyone can customise the OS to run on any hardware (the Android approach), rather than the OS will only run on our hardware (Apple) or the OS will only run on hardware that meets strict guidelines (Microsoft’s approach).

  • Think about making the app cross-platform-able early on. It’s easier to start that way than to try to change an existing app to be cross-platform.
  • The Android emulator is too slow to use for real development; buy a proper device to develop on – it’s much quicker.
  • Android resource qualifiers (screen resolution, pixel density, language, etc.) make designing and testing Android UIs very hard. The Xamarin Designer helps a lot.
  • Run an automated build server, as changes in the Android app can have unintended effects on the other builds.

3DTry.it: Available for Windows Phone, Android and iPad

Our first public cross-platform GIality app to launch is 3DTry.it (available for Android, iPad and Windows Phone). This is an app that allows the user to view published 3D models on their tablet on top of the camera view. The models rotate as though they are in the real world. Download it from the app store if you want to see it in action.

  • The motion sensor frame of reference is different on different devices (the Nexus 10 is 90° different from the Nexus 7).
  • Xamarin tools allow iOS apps to be developed on a Windows PC, which speeds up development for Windows users.

Summary

Having tried developing apps for all three platforms, I enjoy developing for Windows Phone the most (maybe that’s because it was the first one I started developing for). Android is frustrating due to the number of different devices, screen resolutions and versions available. iOS development in C# is made easy using Xamarin tools, which have done the job for us so far.

Building Cross Platform Applications

Cross Platform Applications
Having an app is a very effective way of getting information out to your users on the move. However, there are several device platforms (iOS, Android, Windows Phone, BlackBerry, etc.) to build apps for. Given the market share of each platform, we decided to be inclusive and support as many platforms as practical.

IDC produce reports of market share; the data for 2011 and 2012 is shown below.

From this data, we concluded that support for iOS and Android is essential, as together they make up 90% of the market. We also support Windows Phone, as it is the only other platform with an expanding market share. Full details are in the IDC report.

There are several broad approaches to building cross platform apps for mobile platforms. Each has its own advantages and disadvantages as described below.

Cross-platform HTML5 app

HTML5 with CSS3, Device Access, Graphics, 3D, Multimedia, Performance, Semantics, and Offline Storage

An HTML5 app uses web technology and the device’s built-in browser to present data to the user. It allows apps to be created quickly, in a similar way to creating a website. This can then be wrapped in a framework that allows the app to be packaged into something that can be sold in the app stores. The technologies and frameworks that make this approach possible include PhoneGap, Application Craft, Appcelerator and many more.

The advantages of this approach are:

  • The speed with which an app can be created.
  • The consistency of the look and feel of the end result (the app on Android looks the same as the app on iOS).
  • The reuse of the existing skills a web developer has to create the apps.

HTML5 allows access to some of the more standard device features: 3D graphics, geolocation, multimedia, etc. It does not (yet!) allow access to some of the low-level sensors (accelerometer, gyroscope, compass) we make use of in our GIality solutions.

Native applications

Building native apps is the process of building an app for one platform at a time. It uses the built-in presentation framework of the platform you are working on (Cocoa Touch on iOS, XAML on Windows Phone and XML layouts on Android). This type of app allows full use of the power of the device: it can make use of all the sensors and can produce richer user interfaces that are consistent with the rest of the device.

The disadvantage is that you must make separate applications (probably in separate languages, probably maintained by different teams) for each platform you want to support.

Our approach: Xamarin Studio

As a small company with a small development team, we looked at what we could do using the different approaches described above and decided we wanted the rich user interface and speed of native applications (we do a lot of 3D work), but we didn’t want to maintain separate code bases for each project.


Our approach has been to use Xamarin technology to create native applications using a common language and a shared code base. We use C# (a language that many developers are already skilled in) to create code that can be run natively on Android, iOS and Windows Phone. We write a native user interface for that application in whatever technology is appropriate for each device.


Working alongside Xamarin technology is the MonoGame project. This provides a 3D environment based on the API of Microsoft XNA. The MonoGame team describe it as:

MonoGame is an Open Source implementation of the Microsoft XNA 4 Framework. Our goal is to allow XNA developers on Xbox 360, Windows & Windows Phone to port their games to the iOS, Android, Mac OS X, Linux and Windows 8 Metro.

This allows us to create our 3D visualisations in C# by writing XNA code, and the ‘game’ can be ported from platform to platform with very little extra work required.
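
As a sketch of what that looks like (illustrative, not VentusAR code), the same Game class compiles unchanged on each platform; only a thin per-platform bootstrap project that creates and runs it differs:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class VisualisationGame : Game
{
  GraphicsDeviceManager _graphics;

  public VisualisationGame()
  {
    _graphics = new GraphicsDeviceManager(this);
  }

  protected override void Update(GameTime gameTime)
  {
    // read the sensors / user input and move the camera here
    base.Update(gameTime);
  }

  protected override void Draw(GameTime gameTime)
  {
    GraphicsDevice.Clear(Color.CornflowerBlue);
    // draw the 3D terrain and turbines here
    base.Draw(gameTime);
  }
}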

Final thought: native experiences driving innovation

Scott Hanselman, in his article ‘Apps Are Too Much Like 1990s CDROMs And Not Enough Like The Web’, points out (amongst other things) that new experiences currently only possible in native apps, like the 3D experiences of GIality apps, will eventually become web standards and become possible in a standard web app.

Scott Hanselman’s Web Experience Cycle.

I’m looking forward to writing our 3D visualisations according to a published web standard, but not until working with HTML5, JavaScript and CSS is as easy and rewarding as working in C# using Xamarin Studio.