Presentation slides from an A-Frame workshop. See the git repo for demo files: https://github.com/rdub80/Ferguson-VR-Hackathon
A talk from the Developer Track at AWE Europe 2017, the largest conference for AR+VR, held in Munich, Germany, October 19-20, 2017. Casper Fabricius (Cimmerse): WebVR with A-Frame, React and Redux. Web-based virtual reality allows VR experiences at the click of a link, and it enables millions of web developers to dive straight into VR and AR. This workshop teaches developers with some web and JavaScript experience to use the popular React, Redux and A-Frame libraries to build advanced, interactive WebVR sites step by step. Participants will also learn how to add WebVR to existing websites, leveraging and sharing existing code between 2D, 3D and VR.
"The next frontier: WebGL and WebVR" by Martin Naumann. The browser is a window into a vast, unlimited world. But what if we didn't just peek into a flat, page-based world, but into a space with depth? What could we build with that? And now that our browser is a portal to a space, how can we immerse ourselves in it, rather than stare at it from the outside? And maybe we can link this world inside our browser to our real world as well? This talk explores the possibilities of technologies such as augmented reality and virtual reality, using WebGL and JavaScript to extend and enhance the web beyond the flat browser window.
Speaker: Satoshi Chiba (Kudan). Recommended for: those who want to develop mobile AR/MR apps in Unity; those who want to apply AR/MR beyond games. What attendees will learn: new possibilities for applying mobile AR/MR; how easy AR/MR app development is; the latest trends in computer vision technology and global case studies.
Speaker: Rus Scammell (Unity Technologies). Recommended for: 2D game developers; anyone interested in 2D game development. What attendees will learn: Unity's new 2D features; upcoming 2D features still to be implemented. Talk video: https://youtu.be/H9H2MxZwzY0
An introduction to what multiplayer games are, what makes them different from normal games, how to approach building them and specifically how to begin building them with the Unity game engine. Talk given at the GameIS & Dragonplay mobile multiplayer hackathon, 30/7/2015
The document discusses Stage3D and Starling frameworks. It describes how Stage3D allows managing texture memory, vertex and pixel shading, and mesh rendering using OpenGL and DirectX. It also explains that Starling uses a quad batching approach to group quads with the same state for efficient rendering and provides a vertex buffer description to store quad vertex attributes. The document recommends profiling tools for optimizing 3D rendering and references books for further reading.
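The quad-batching approach the summary describes can be sketched in plain JavaScript (a hypothetical illustration of the idea, not Starling's actual API): quads are walked in draw order and grouped for as long as their render state matches, so each batch can be submitted in a single draw call.

```javascript
// Group consecutive quads that share the same render state
// (texture + blend mode). A state change breaks the batch,
// just as a texture switch would in a real renderer.
function batchQuads(quads) {
  const batches = [];
  let current = null;
  for (const quad of quads) {
    const state = `${quad.texture}|${quad.blendMode}`;
    if (!current || current.state !== state) {
      current = { state, quads: [] };
      batches.push(current);
    }
    current.quads.push(quad);
  }
  return batches;
}

const quads = [
  { texture: 'atlas1', blendMode: 'normal' },
  { texture: 'atlas1', blendMode: 'normal' },
  { texture: 'atlas2', blendMode: 'normal' },
  { texture: 'atlas1', blendMode: 'normal' },
];
const batches = batchQuads(quads);
console.log(batches.length); // 3 draw calls instead of 4
```

This is also why sorting or interleaving sprites by texture matters: the fourth quad above reuses `atlas1` but still starts a new batch because a different texture was drawn in between.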
Valentin Simonov is a field engineer at Unity Technologies who helps game studios optimize their games. He teaches developers how to use Unity efficiently through conferences, training sessions, blog posts, and book translations, and maintains an open source project on GitHub. The document provides tips on using tools like the Unity Profiler to identify optimization opportunities and make informed decisions about optimizations. It highlights the importance of platform-specific testing and of avoiding outdated information found online.
Slides from my presentation at Autodesk Forge 2016 http://forge.autodesk.com/tracks-and-speakers/#track-2
Speaker: Arturo Núñez (Unity Technologies). Recommended for: 2D artists who want to bring their 2D animation pipeline into Unity; programmers interested in extending Anima2D's animation workflow; game developers who want more features and animations in their game without increasing build size. What attendees will learn: Anima2D's authoring pipeline and how to create skeletons, animations, and variants; how to override skeleton-based animation in code; how to integrate various Unity components with Anima2D animations.
Svetlin Denkov gives a presentation on building interactive visualizations using Axure. He demonstrates techniques for creating basic charts like bar charts and donut charts from scratch in Axure. He then shows how to add interactivity through animations like resizing and rotating elements. Finally, he shares more advanced examples from others and discusses limitations of Axure as well as finding the right tool based on project needs. The presentation includes live demos at each stage.
We'll build a game with WebVR, then bring it across multiple platforms with Progressive Web Applications. We'll show how to import assets, use controllers, and keep the code portable across all VR devices.
A high-level introduction to Phaser.js. https://github.com/sH4rk0/meetupRush https://github.com/sH4rk0/xmas2016 Thanks to Michel Wacker (@starnut) for some input.
Full-day training on mobile game development with JavaScript using the Phaser library. To learn more about our online and on-site training on game, web and mobile app development visit https://zenva.com
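As a taste of what such a training typically starts from, here is a minimal Phaser 3 game configuration; the asset key and path are illustrative placeholders, and the snippet assumes the Phaser script is already loaded on the page.

```javascript
// Minimal Phaser 3 setup: a config object and a single inline scene.
const config = {
  type: Phaser.AUTO,   // WebGL if available, otherwise Canvas
  width: 800,
  height: 600,
  scene: {
    preload() {
      // hypothetical asset path
      this.load.image('player', 'assets/player.png');
    },
    create() {
      this.add.image(400, 300, 'player');
    }
  }
};
new Phaser.Game(config);
```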
Video: https://www.youtube.com/watch?v=klDeljOKDjU On how I wrote a game over the course of 10 days using Phaser.js. The game's story is based on the Dragon Ball anime. I made the game's graphics myself, as you can see for yourself at www.dragonballplay.com, where v1.0 of the project is available. From September 1 to 10, I spent 5-6 hours each day after work to build a complete web game.
For today's mobile apps, it is quite important to provide interaction among end users. Consider a game application with users on both Android and iOS whom you want to play together. What about scalability? Low latency? User state management? There has certainly been a lot to deal with so far. In this session, I will try to simplify things and prepare sample apps for iOS and Android that talk to each other.
Learn about Cross Platform Mobile Game Development with CoronaSDK from Corona Labs. I discuss some of the benefits I've found with Corona and why I chose this platform for our game development.
This document provides an overview and introduction to building virtual reality (VR) experiences using WebVR and the A-Frame framework. It discusses:
- What WebVR is and how it allows creating VR tools, standards, and experiences for the open web.
- What A-Frame is and its features for building VR scenes in HTML, such as being easy to learn, cross-platform support, performance optimizations, and a visual inspector.
- Examples of VR experiences built with A-Frame, and Mozilla's work in mixed reality and VR, including Firefox Reality, Spoke for creating 3D environments, and Unity WebVR assets.
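The "VR scenes in HTML" point can be illustrated with A-Frame's canonical hello-world scene; the version number in the CDN URL below is illustrative.

```html
<!-- A minimal A-Frame scene: primitives are plain HTML elements. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this page in a WebVR-capable browser renders the scene and offers an enter-VR button with no build step.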
AR/VR in JavaScript Apps discusses emerging technologies for building augmented and virtual reality experiences using web technologies. It introduces key concepts like AR, VR, and XR. It also outlines several JavaScript libraries and frameworks that can be used to create 3D content and immersive experiences, such as Three.js, WebXR, A-Frame, and React 360. The document recommends resources for continued learning and provides examples of how to get started with these technologies.
Tony Parisi discusses the foundations of the immersive web and how it can reach billions of users by 2020. Key points include:
- WebVR allows rendering of 3D graphics and VR content directly in browsers using standards like WebGL and a new VR API.
- This eliminates friction compared to native apps by allowing instant access to VR content through web links on any device with a compatible browser.
- Current browsers like Chrome and Firefox are adding initial WebVR support, and content can already be experienced on mobile in Cardboard viewers.
- The immersive web is being built on open standards and has the potential to scale to the billions of users accessible through the existing web ecosystem.
Save 10% off ANY FITC event with discount code 'slideshare'. See our upcoming events at www.fitc.ca
Virtual Reality development has become very active recently, with the availability of low cost and high quality headsets, motion tracking equipment, and sensors. However, most VR app development is happening natively — users are stuck in the days of needing to download the right binary, trust a third party that their code isn't malicious, and fix compatibility issues. Developers need to target multiple platforms, thus often ignoring those with fewer users. Instead, wouldn't it be great if high quality VR content could be delivered through the Web? In this session, Vladimir Vukicevic will address additions to HTML, CSS, and WebGL that Mozilla is experimenting with which allow Web developers to create immersive VR experiences. Everything from pure VR WebGL content to responsive HTML and CSS that can shift from mobile to tablet to desktop to VR will be covered. Additionally, Vladimir will discuss delivering VR video via the Web, as well as how to mix WebGL and CSS content in a true 3D space.
OBJECTIVE
To show how VR and the Web work together, and the techniques for bringing VR content to the Web.
TARGET AUDIENCE
Web developers and designers
ASSUMED AUDIENCE KNOWLEDGE
Some knowledge of at least one of WebGL, CSS 3D Transforms, or modern 3D graphics would be helpful.
FIVE THINGS AUDIENCE MEMBERS WILL LEARN
- An overview of current VR devices, their capabilities and how they can interface with the Web.
- How to render WebGL content to a VR device.
- How to create documents using HTML and CSS that can be projected in VR.
- How to create responsive documents that can shift in and out of VR based on user choice.
- How WebGL and CSS content can be mixed, providing interactive 3D graphics but with the full power of HTML for non-3D elements.
Learn about XR, how it works on the web, and how it can leverage the power of additional Web APIs to create immersive experiences.
The document summarizes a meetup about developing graphics with WebGL. It discusses WebGL and how it allows 3D rendering in browsers using OpenGL ES and JavaScript. It then provides information about the company ThreeDee Media and their WebGL framework and tools, with examples of what can be done with WebGL. Finally, it outlines how WebGL works, integrating 3D graphics with HTML5 pages using WebGL contexts on canvas elements, and discusses why one would use WebGL for rich internet experiences with hardware-accelerated 3D.
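The canvas integration described above can be sketched as follows; this is an illustrative fragment, not code from the slides.

```html
<!-- A WebGL rendering context obtained from an HTML5 canvas element. -->
<canvas id="scene" width="640" height="480"></canvas>
<script>
  const canvas = document.getElementById('scene');
  // Fall back to the experimental prefix used by some older browsers.
  const gl = canvas.getContext('webgl') ||
             canvas.getContext('experimental-webgl');
  if (gl) {
    gl.clearColor(0.0, 0.0, 0.0, 1.0); // opaque black
    gl.clear(gl.COLOR_BUFFER_BIT);
  }
</script>
```

Everything a WebGL page draws goes through such a context; the rest of the HTML5 page composites around the canvas as usual.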
The document discusses the Gear VR Framework (GVRf), an open source Java library for developing VR applications for Android on the Gear VR. It provides an overview of GVRf's features like loading 3D models and textures, scene management, and handling head tracking. It demonstrates how to create a basic "hello world" app and discusses new features in development like standard shapes, scene objects for video/camera/webviews, scripting support, and improved 3D modeling capabilities. A key focus is the new 3D Cursor interaction model and support for various VR input devices through a common API to standardize interaction across apps. Live demos of hand/eye tracking were shown to illustrate cursor behaviors.
This document discusses getting started with WebGL. It begins with an introduction to WebGL, explaining that it allows 3D graphics in browsers similarly to OpenGL. It then provides examples of what can be done with WebGL, such as data visualization, games, 3D modeling, and more. The document proceeds to explain the basic graphics pipeline and JavaScript API used in WebGL. It concludes by discussing how to set up a basic 3D scene and choose a WebGL library like Three.js or PhiloGL to get started creating WebGL applications.
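To make the pipeline mentioned above concrete, here is an illustrative sketch (not from the document) of the vertex-transform stage in plain JavaScript: multiply the point by a 4x4 matrix, then apply the perspective divide to reach normalized device coordinates.

```javascript
// Transform a 3D point by a 4x4 matrix (row-major), then
// perspective-divide from clip space to normalized device coordinates.
function transformPoint(m, p) {
  const [x, y, z] = p;
  const out = [];
  for (let row = 0; row < 4; row++) {
    out[row] = m[row * 4] * x +
               m[row * 4 + 1] * y +
               m[row * 4 + 2] * z +
               m[row * 4 + 3]; // implicit w = 1 on the input point
  }
  const w = out[3];
  return [out[0] / w, out[1] / w, out[2] / w]; // perspective divide → NDC
}

// The identity matrix leaves the point unchanged (w stays 1).
const identity = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  0, 0, 0, 1,
];
console.log(transformPoint(identity, [0.5, -0.25, 0.1]));
```

In a real WebGL program this multiplication happens in the vertex shader on the GPU; libraries like Three.js build the matrix for you from camera and object transforms.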
A quick starter for prototyping in A-Frame. Designers often find it difficult to communicate their VR design ideas to fellow developers. This is the presentation prepared for a VR design meetup in Bangalore, where I gave a hands-on workshop on prototyping VR in A-Frame.
Windows Phone 7 and Windows Azure are a good match: both provide easy and familiar development environments, connectivity through the cloud, and scalability. The document discusses how Windows Phone 7 and Windows Azure can be used together through features like data storage in Windows Azure tables and blobs, push notifications, and identity management with Access Control Services. It provides examples of how to integrate the platforms for storing, retrieving, and displaying data stored in the cloud.
This document discusses developing augmented reality (AR) and virtual reality (VR) applications using React Native. It provides an overview of AR and VR history and technologies. It then discusses using the Viro React library to build cross-platform AR, VR, and XR applications in React Native. It covers components for 3D objects, lighting, and particle effects. Examples are provided for basic component layout and using 3D models, lighting, and particle effects in a Viro application. Considerations for when to use Viro React are discussed.
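A Viro React scene of the kind described might look roughly like this; this is a sketch from memory of the react-viro API, and the component names, props, and asset path should be checked against the library's documentation.

```jsx
import React from 'react';
import {
  ViroARScene,
  ViroAmbientLight,
  ViroText,
  Viro3DObject,
} from 'react-viro';

// Hypothetical AR scene: ambient lighting plus a text label and a 3D model.
export default function HelloARScene() {
  return (
    <ViroARScene>
      <ViroAmbientLight color="#ffffff" />
      <ViroText
        text="Hello AR"
        position={[0, 0.1, -1]}
        scale={[0.5, 0.5, 0.5]}
      />
      <Viro3DObject
        source={require('./res/model.vrx')} // illustrative asset path
        type="VRX"
        position={[0, -0.5, -1]}
        scale={[0.2, 0.2, 0.2]}
      />
    </ViroARScene>
  );
}
```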