So the main objective for mobile apps is:
Minimize technical context switches in an environment of continuously changing user contexts.
User Centric Mobile Apps is our concept and the Ad-hoc Mobile Ecosystem is our implementation.
Our next step is the seamless integration of a Tactical (2D) map-based view and a First Person View (3D) into a single User Centric Augmented Reality app with the following requirements:
- Minimize distraction from context switches with a unified AR view
- Deeply situated navigation through decision-relevant information
- Natural support for concurrent events (calls, chat, navigation, search)
We do not limit AR to the visual sense: tactile feedback (vibration) and hearing are part of it as well.
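To illustrate the multimodal idea, here is a minimal sketch of how a non-visual navigation cue could be raised on the Android platform. The MultimodalCue class is a hypothetical placeholder of ours, not part of any existing library:

```java
import java.util.Locale;

import android.content.Context;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

// Hypothetical helper: raises a navigation cue on the tactile and
// auditory channel so the user does not have to look at the screen.
public class MultimodalCue implements TextToSpeech.OnInitListener {

    private final Vibrator vibrator;
    private final TextToSpeech tts;
    private boolean ttsReady = false;

    public MultimodalCue(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        tts = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            ttsReady = true;
        }
    }

    // A short buzz plus a spoken hint, e.g. cue("Turn left in 50 meters").
    public void cue(String message) {
        if (vibrator != null) {
            vibrator.vibrate(200);           // 200 ms tactile pulse
        }
        if (ttsReady) {
            tts.speak(message, TextToSpeech.QUEUE_ADD, null);
        }
    }
}
```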
First Person View (3D)
- It is set up as a library to build upon, not a project tied to one use case.
- It contains a rich set of working demos to experience the library's versatility.
- Screencast series on YouTube.
- Support for visualizing OpenGL 3D models, pattern recognition and building 3D Head-Up Displays (HUD); see the sketch below.
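To make the HUD part concrete, the sketch below shows the common OpenGL ES 1.x pattern for a head-up display: render the 3D scene under a perspective projection, then switch to an orthographic projection and draw the HUD elements in screen coordinates with depth testing disabled. This is a generic example, not the actual API of our library:

```java
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;
import android.opengl.GLU;

// Generic HUD pattern for OpenGL ES 1.x: render the 3D world first,
// then overlay 2D HUD elements in an orthographic projection.
public class HudRenderer implements GLSurfaceView.Renderer {

    private int width, height;

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glClearColor(0f, 0f, 0f, 1f);
        gl.glEnable(GL10.GL_DEPTH_TEST);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int w, int h) {
        width = w;
        height = h;
        gl.glViewport(0, 0, w, h);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

        // 1) Perspective projection for the 3D scene.
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        GLU.gluPerspective(gl, 45f, (float) width / height, 0.1f, 100f);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
        drawScene(gl);   // placeholder for the 3D models

        // 2) Orthographic projection for the HUD overlay.
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        gl.glOrthof(0, width, 0, height, -1, 1);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
        gl.glDisable(GL10.GL_DEPTH_TEST);    // HUD always on top
        drawHud(gl);     // placeholder for compass, labels, markers
        gl.glEnable(GL10.GL_DEPTH_TEST);
    }

    private void drawScene(GL10 gl) { /* scene geometry goes here */ }
    private void drawHud(GL10 gl)   { /* HUD geometry goes here */ }
}
```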
Tactical View (2D)
One of the main features of the Ad-hoc Mobile Ecosystem project is to provide and share location-based information. Maps are made available through an Open Source WMS server.
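Technically, a WMS client only needs to assemble HTTP GetMap requests against such a server. The sketch below does this for WMS 1.1.1; the server URL and layer name are placeholders, not our actual endpoint:

```java
import java.io.InputStream;
import java.net.URL;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class WmsClient {

    // Placeholder endpoint; any WMS 1.1.1 server works the same way.
    private static final String SERVER = "http://example.org/wms";

    // Build a standard WMS 1.1.1 GetMap request for one map image.
    // In 1.1.1 with EPSG:4326 the BBOX axis order is lon,lat.
    public static String getMapUrl(String layer, double minLon, double minLat,
                                   double maxLon, double maxLat,
                                   int width, int height) {
        return SERVER
                + "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
                + "&LAYERS=" + layer
                + "&STYLES="
                + "&SRS=EPSG:4326"
                + "&BBOX=" + minLon + "," + minLat + "," + maxLon + "," + maxLat
                + "&WIDTH=" + width + "&HEIGHT=" + height
                + "&FORMAT=image/png";
    }

    // Fetch the rendered map as a bitmap, e.g. for a tactical map view.
    public static Bitmap fetchMap(String layer, double minLon, double minLat,
                                  double maxLon, double maxLat,
                                  int width, int height) throws Exception {
        URL url = new URL(getMapUrl(layer, minLon, minLat, maxLon, maxLat,
                                    width, height));
        InputStream in = url.openStream();
        try {
            return BitmapFactory.decodeStream(in);
        } finally {
            in.close();
        }
    }
}
```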
gvSIG Mini is a suitable WMS client for the Android platform, and we have already used it successfully in the German Bundeswehr funded APP-6 Maker project (more on the EC Joinup Portal).
Breaking news
The 4th of April gave us an important boost: the formerly *secret* smart glasses project of the Google[x] Labs went public as Project Glass. Read more and follow them on their G+ page. Alongside all of our concepts and software, this hardware project seems to be the ideal fit for our vision.