Hello ARCore
Giovanni Laquidara
CODEMOTION MILAN - SPECIAL EDITION
10 – 11 NOVEMBER 2017
Who
Giovanni Laquidara
Software Engineer
Mobile Developer
XR Engineer
Why
AR and VR make computing more intuitive and natural. When computers
work more like we do, they’re easier to use and more accessible.
What
01 What's this AR?
02 ARCore explained
03 How to develop with ARCore?
04 Show me some code!
05 Advanced tools
@joaolaq
What is AR?
AR and VR are points on a spectrum of immersive computing. When digital imagery completely replaces what you
see, you have VR. And when you add digital objects to what you’re already seeing, you have augmented reality.
Real world ←→ Computer-generated
Reality · Augmented Reality · Virtual Reality
AR + VR = Immersive Computing
@joaolaq
AR can *bring* anything to you. It adds computer-generated
information and objects to your everyday world.
@joaolaq
History of Augmented Reality
The Sword of Damocles (1968)
Ivan Sutherland
@joaolaq
AR Frameworks
@joaolaq
ARCore Explained
@joaolaq
ARCore runs on qualified devices running Android 7.0 Nougat and above.
Today, that's Pixel and Pixel XL, Pixel 2, Pixel 2 XL, and the Samsung Galaxy S8.
By the end of the preview phase, ARCore will run on 100 million Android devices.
Google is working with a number of Android manufacturers to bring ARCore to as many devices as possible in 2017, 2018, and beyond.
MOTION TRACKING · ENVIRONMENTAL UNDERSTANDING · LIGHT ESTIMATION
Specs
MOTION TRACKING
As your mobile device moves through the world, ARCore combines visual data from the device's camera and inertial measurements from the device's IMU to estimate the pose (position and orientation) of the camera relative to the world over time. This process, called visual inertial odometry (VIO), lets ARCore know where the device is relative to the world around it.
By aligning the pose of the virtual camera that renders your 3D content with the pose of the device's camera provided by ARCore, developers are able to render virtual content from the correct perspective.
The rendered virtual image can be overlaid on top of the image obtained from the device's camera, making it appear as if the virtual content is part of the real world.
@joaolaq
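To make that alignment concrete, here is a minimal sketch of how the sample code shown later in this deck obtains the matrices it renders with. It is based on the ARCore developer-preview API of late 2017; the near/far plane values are only illustrative, and mSession, frame, viewmtx and projmtx are the same names the sample uses.

// Projection matrix derived from the tracked camera (near/far planes are illustrative).
float[] projmtx = new float[16];
mSession.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);

// View matrix from the current camera pose estimated by VIO.
float[] viewmtx = new float[16];
frame.getViewMatrix(viewmtx, 0);

// Anything drawn with viewmtx/projmtx now shares the physical camera's perspective.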
Specs
ENVIRONMENTAL UNDERSTANDING
ARCore is constantly improving its understanding of the real-world environment by detecting feature points and planes. Feature points are visually distinct features in the captured camera image that ARCore can recognize even when the camera's position changes slightly.
Planes
ARCore looks for clusters of feature points that appear to lie on common horizontal surfaces, like tables and desks, and makes these surfaces available to your app as planes.
Anchors
An anchor is a fixed location and orientation in the real world.
@joaolaq
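As a rough sketch of how planes and anchors appear in the preview API: the plane loop mirrors the sample code quoted later in this deck, while the anchor line assumes a hit result from a hit test (also shown later), so treat it as approximate.

// Planes: surfaces ARCore has detected from clusters of feature points.
for (Plane plane : mSession.getAllPlanes()) {
    if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING
            && plane.getTrackingState() == Plane.TrackingState.TRACKING) {
        // A tracked horizontal surface you can place content on.
    }
}

// Anchors: a fixed real-world pose that ARCore keeps updated as tracking improves.
// 'hit' comes from frame.hitTest(tap), shown in the code section below.
Anchor anchor = mSession.addAnchor(hit.getHitPose());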
Specs
LIGHT ESTIMATION
ARCore can detect information about the lighting of its environment and provide you with the average intensity of a given camera image. This information lets you light your virtual objects under the same conditions as the environment around them, increasing the sense of realism.
@joaolaq
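A minimal sketch of how this looks in the preview API: getLightEstimate()/getPixelIntensity() are recalled from the sample of that era, and mVirtualObject is the sample's renderer for the placed model, so treat the exact names as approximate.

// Average pixel intensity of the current camera image, used to scale virtual lighting.
final float lightIntensity = frame.getLightEstimate().getPixelIntensity();

// Draw the virtual object under roughly the same lighting as its surroundings.
mVirtualObject.draw(viewmtx, projmtx, lightIntensity);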
How to develop with ARCore?
Augmented reality on the web
The power and scale of the web will help make augmented reality accessible to everyone. Google released prototype browsers for web developers so they can start building AR experiences.
Use your favourite environment
ARCore provides SDKs for many of the most popular development environments. These SDKs provide native APIs for all of the essential AR features like motion tracking, environmental understanding, and light estimation. With these capabilities you can build entirely new AR experiences or enhance existing apps with AR features.
@joaolaq
Show me some code!
$ git clone https://github.com/google-ar/arcore-android-sdk.git
You will need a basic understanding of Android
development with OpenGL.
@joaolaq
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    mSurfaceView = (GLSurfaceView) findViewById(R.id.surfaceview);

    // Create the ARCore session.
    mSession = new Session(/*context=*/ this);

    // Create default config, check it is supported, create session from that config.
    mDefaultConfig = Config.createDefaultConfig();
    if (!mSession.isSupported(mDefaultConfig)) {
        Toast.makeText(this, "This device does not support AR", Toast.LENGTH_LONG).show();
        finish();
        return;
    }
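The slide stops here; in the sample project, onCreate goes on to configure the GLSurfaceView as the renderer. A sketch of that continuation, using standard Android GLSurfaceView calls as recalled from the preview sample (treat the exact sequence as approximate):

    // Set up the GL surface that will draw the camera background and virtual content.
    mSurfaceView.setPreserveEGLContextOnPause(true);
    mSurfaceView.setEGLContextClientVersion(2);
    mSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // alpha for blending with the camera image
    mSurfaceView.setRenderer(this);
    mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
}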
@Override
public void onDrawFrame(GL10 gl) {
    // Clear screen to notify driver it should not load any pixels from previous frame.
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

    try {
        // Obtain the current frame from ARSession. When the configuration is set to
        // UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
        // camera framerate.
        Frame frame = mSession.update();

        // Visualize tracked points.
        mPointCloud.update(frame.getPointCloud());
        mPointCloud.draw(frame.getPointCloudPose(), viewmtx, projmtx);

        // Check if we detected at least one plane. If so, hide the loading message.
        if (mLoadingMessageSnackbar != null) {
            for (Plane plane : mSession.getAllPlanes()) {
                if (plane.getType() == com.google.ar.core.Plane.Type.HORIZONTAL_UPWARD_FACING
                        && plane.getTrackingState() == Plane.TrackingState.TRACKING) {
                    hideLoadingMessage();
                    break;
                }
            }
        }

        // Handle taps. Handling only one tap per frame, as taps are usually low frequency
        // compared to frame rate.
        MotionEvent tap = mQueuedSingleTaps.poll();
        if (tap != null && frame.getTrackingState() == TrackingState.TRACKING) {
            for (HitResult hit : frame.hitTest(tap)) {
                // Check if any plane was hit, and if it was hit inside the plane polygon.
                if (hit instanceof PlaneHitResult && ((PlaneHitResult) hit).isHitInPolygon()) {
                    // Cap the number of objects created. This avoids overloading both the
                    // rendering system and ARCore.
                    if (mTouches.size() >= 16) {
                        mSession.removeAnchors(Arrays.asList(mTouches.get(0).getAnchor()));
                        mTouches.remove(0);
                    }
                    // Adding an Anchor tells ARCore that it should track this position in
                    // space. This anchor will be used in PlaneAttachment to place the 3d model
                    // in the correct position relative both to the world and to the plane.
                    mTouches.add(new PlaneAttachment(
                            ((PlaneHitResult) hit).getPlane(),
                            mSession.addAnchor(hit.getHitPose())));
                    // Hits are sorted by depth. Consider only closest hit on a plane.
                    break;
                }
            }
        }
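The snippet ends before the actual drawing. In the sample, the frame finishes by rendering each placed object at its anchored pose, using the viewmtx/projmtx and lightIntensity sketched earlier. Roughly, with PlaneAttachment and mVirtualObject being helper classes from the sample project (exact calls approximate):

        // Draw each object the user has placed, anchored to its plane.
        float[] anchorMatrix = new float[16];
        for (PlaneAttachment planeAttachment : mTouches) {
            if (!planeAttachment.isTracking()) {
                continue;
            }
            // Model matrix from the anchor's current pose (ARCore may refine it over time).
            planeAttachment.getPose().toMatrix(anchorMatrix, 0);
            mVirtualObject.updateModelMatrix(anchorMatrix, 1.0f);
            mVirtualObject.draw(viewmtx, projmtx, lightIntensity);
        }
    } catch (Throwable t) {
        // Avoid crashing the application due to unhandled exceptions, as the sample does.
        Log.e(TAG, "Exception on the OpenGL thread", t);
    }
}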
Unity
https://github.com/joaobiriba/ARCore-Kittens
Look for “ARCore 101” on Google
@joaolaq
Advanced Tools
Unity gives us some cool tools.
AR Editor / AR Remote tool
Install the AR Remote on your device
Connect the Editor to your device
Start your project and you will see in the Editor what your smartphone is showing
AR Interface
A unified way to develop with ARKit & ARCore
@joaolaq
ARCore & ARKit multiplayer shared experience
@joaolaq
Experimenting with multiplayer ARCore and ARKit: jump in with sample code
Resonance Audio: https://developers.google.com/resonance-audio/
Blocks and Tilt Brush
Easily create beautiful assets in VR for use in AR apps.
https://poly.google.com/
Thanks!
glaquidara@gmail.com
@joaolaq
https://thisisarcore.com/
