Vision of Webizing



Webizing Research



WEBIZING Mixed Augmented and virtual reality

We present a content structure for building mobile augmented reality (AR) applications in HTML5 that achieves a clean separation of mobile AR content from application logic, so that applications scale as on the Web. We propose that the content structure represent the physical world as well as the virtual assets of mobile AR applications as document object model (DOM) elements, identifying objects and places with uniform resource identifiers (URIs), and that their behaviour and user interactions be controlled through DOM events. This content structure enables mobile AR applications to be developed seamlessly as ordinary HTML documents within the current Web ecosystem.
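A minimal sketch of this idea is shown below, assuming hypothetical names: the ar-place and ar-model elements, the uri attribute, and the ar:detected event are illustrative only and not part of any standard or of the proposed structure itself.

// Sketch: represent a physical place as a DOM element identified by a URI,
// and attach virtual assets and behaviour to it through ordinary DOM events.
// Element names "ar-place"/"ar-model" and event type "ar:detected" are
// hypothetical names used for illustration only.
const place = document.createElement("ar-place");
place.setAttribute("uri", "http://example.org/places/room-101");

const asset = document.createElement("ar-model");          // a virtual asset
asset.setAttribute("src", "http://example.org/models/chair.gltf");
place.appendChild(asset);                                   // asset anchored to the place

// Behaviour is wired up with standard DOM event handling.
place.addEventListener("ar:detected", () => {
  // e.g. reveal the virtual asset once the physical place is recognised
  asset.style.visibility = "visible";
});

document.body.appendChild(place);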

 

WEBIZING asynchronous collaborative work

Most mixed and augmented reality (MAR) applications require the target to be specified a priori and require cooperating users to run the same MAR application. As a result, most existing MAR applications cannot share MAR scenes or content without prior agreement among the collaborators. To address this problem, we propose a webizing method that redesigns the component relationships of MAR content, using an episode as a container of user interactions for sharing life experiences and activities. A MAR scene can be developed gradually by the multiple participants of an episode. We present a MAR system that exploits the ecological growth model of the Web and the cooperative augmentation of experience through shared MAR scenes. An example demonstrates MAR scene descriptions emerging from multiple augmentations, following the Web referencing model, with MAR content acting as a medium for cooperative experience augmentation: the sender's message carries the user's context of augmentation, so the receiver can share that context and easily understand the sender's viewpoint. Our examples show that MAR content can be cooperatively produced, accumulated, and consumed in the manner of Web 2.0, a digital prosumption that has proven effective in the dramatic expansion of Web content.
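As a rough illustration of the episode-as-container idea, the sketch below models an episode whose augmentations reference targets and assets by URI so that later participants can extend the scene. The Episode and Augmentation shapes and their fields are assumptions made for this sketch, not the system's actual schema.

// Sketch of an episode as a container of user interactions.
// Each augmentation references physical targets and virtual assets by URI,
// so later participants can extend the scene much as Web pages link to each other.
interface Augmentation {
  author: string;        // who contributed this augmentation
  targetUri: string;     // URI of the physical object or place being augmented
  assetUri: string;      // URI of the virtual asset attached to the target
  context: string;       // the sender's context (viewpoint, note, time, ...)
  createdAt: Date;
}

interface Episode {
  uri: string;                    // the episode itself is addressable on the Web
  participants: string[];
  augmentations: Augmentation[];  // grows as participants cooperatively add to the scene
}

// A later participant extends an existing episode rather than starting over.
function contribute(episode: Episode, a: Augmentation): Episode {
  return { ...episode, augmentations: [...episode.augmentations, a] };
}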

 

Webizing super multiview display

With the development of optical technologies, various forms of stereoscopic 3D displays, such as 3D TVs, head-mounted displays, multi-view autostereoscopic displays, and super multi-view autostereoscopic displays, have emerged. However, these displays have different characteristics and implementations, and the supply of content and rendering applications cannot keep up with the pace of display development. Content production varies with, and depends on, the characteristics of a specific display, so to reuse content across autostereoscopic 3D displays with different characteristics, the content or the rendering application must be modified for each display. To resolve these problems, we propose a webizing method for the content and display of super multi-view autostereoscopic systems in the Web environment. The proposed method renders not only autostereoscopic 3D models via WebGL but also existing web pages described in HTML as super multi-view autostereoscopic 3D content on the Web. In addition, we propose a JavaScript interface for describing displays that allows developers to access display profiles and functionalities through web browsers. This interface helps rendering applications generate monoscopic 3D or N-view autostereoscopic 3D content from the same source based on the display profile, also known as one-source multi-use.
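A hypothetical sketch of how a rendering application might consume such a display description follows; the DisplayProfile shape and renderScene helper are assumptions for illustration, not the actual proposed JavaScript interface.

// Sketch of a display profile exposed to web rendering applications.
// Field and function names are illustrative assumptions.
interface DisplayProfile {
  type: "monoscopic" | "stereoscopic" | "multiview" | "super-multiview";
  viewCount: number;            // N views for (super) multi-view displays, 1 otherwise
  resolution: { width: number; height: number };
  viewingDistanceMm?: number;   // optional optical characteristic
}

// One-source multi-use: the same scene is rendered once or N times
// depending on the profile of the attached display.
function renderScene(profile: DisplayProfile, drawView: (viewIndex: number) => void): void {
  if (profile.type === "monoscopic") {
    drawView(0);                              // single pass, e.g. a normal WebGL render
  } else {
    for (let i = 0; i < profile.viewCount; i++) {
      drawView(i);                            // one render pass per viewpoint
    }
  }
}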

 

WEBIZING HUMAN INTERFACE DEVICES

Recently, virtual reality (VR) technology has become widely available, but the VR interaction devices supported in web environments are limited compared with those in the traditional VR environment. In the traditional VR environment, the Virtual-Reality Peripheral Network (VRPN) provides a device-independent and network-transparent interface. To promote the development of WebVR applications with various interaction devices, a method like VRPN is needed in the web environment as well. We propose a webizing method for human interface devices and their events, delivering interaction events as either VRPN messages or HTML DOM events. The method uses an event negotiation mechanism to provide different abstraction levels of user interaction events, so developers can choose a familiar way to implement VR web applications and reuse resources and libraries on the Web. We expect the proposed method to promote the development of VR web applications with various interaction devices by both traditional VR and web developers.
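A minimal sketch of the DOM-event side of this idea, assuming illustrative names: the WebizedHID class and the "webvr-tracker" event type are made up for this example and are not the paper's actual API or a browser standard.

// Sketch: deliver input from a webized human interface device as DOM events.
class WebizedHID extends EventTarget {
  // A web developer subscribes with ordinary DOM-style event handling.
  onSample(listener: (ev: CustomEvent<{ position: number[]; orientation: number[] }>) => void) {
    this.addEventListener("webvr-tracker", listener as EventListener);
  }

  // The device layer (e.g. a bridge receiving VRPN-style messages) pushes samples here.
  pushSample(position: number[], orientation: number[]) {
    this.dispatchEvent(new CustomEvent("webvr-tracker", {
      detail: { position, orientation },
    }));
  }
}

// Usage: a VR web application reacts to tracker input without device-specific code.
const tracker = new WebizedHID();
tracker.onSample((ev) => {
  console.log("tracker pose", ev.detail.position, ev.detail.orientation);
});
tracker.pushSample([0, 1.6, 0], [0, 0, 0, 1]);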