java - Detecting planes and tap events with ARCore


I'm trying to understand Google's ARCore API and have pushed their sample project (java_arcore_hello_ar) to GitHub.

In the example, when you deploy the app to an Android device, horizontal surfaces/planes are detected. If you tap on a detected plane, an "Andy" Android robot is rendered at the location of the tap. Pretty cool.

I'm trying to find, in the code:

  1. where the horizontal surface/plane gets detected; and
  2. where the logic lives that resizes & re-orients Andy correctly (I assume that if the point you tap is further away from the camera, he is rendered smaller, etc.)

I believe that when planes are detected, the Android framework calls the onSurfaceCreated method:

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);

    // Create the texture and pass it to ARCore session to be filled during update().
    mBackgroundRenderer.createOnGlThread(/*context=*/this);
    mSession.setCameraTextureName(mBackgroundRenderer.getTextureId());

    // Prepare the other rendering objects.
    try {
        mVirtualObject.createOnGlThread(/*context=*/this, "andy.obj", "andy.png");
        mVirtualObject.setMaterialProperties(0.0f, 3.5f, 1.0f, 6.0f);

        mVirtualObjectShadow.createOnGlThread(/*context=*/this,
            "andy_shadow.obj", "andy_shadow.png");
        mVirtualObjectShadow.setBlendMode(BlendMode.Shadow);
        mVirtualObjectShadow.setMaterialProperties(1.0f, 0.0f, 0.0f, 1.0f);
    } catch (IOException e) {
        Log.e(TAG, "Failed to read obj file");
    }
    try {
        mPlaneRenderer.createOnGlThread(/*context=*/this, "trigrid.png");
    } catch (IOException e) {
        Log.e(TAG, "Failed to read plane texture");
    }
    mPointCloud.createOnGlThread(/*context=*/this);
}

However, this code looks like it assumes the user has already tapped on a surface. I'm not seeing the if-conditional that says "only render Andy if the user has tapped on a detected plane/surface". Can anyone spot where that might be happening?

The tap detection is done in mGestureDetector:

mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        onSingleTap(e);
        return true;
    }

    @Override
    public boolean onDown(MotionEvent e) {
        return true;
    }
});

which is linked to the surface view:

mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return mGestureDetector.onTouchEvent(event);
    }
});

Both things happen in onCreate(), so every time you tap the surface view (the "main" view in the Activity),

private void onSingleTap(MotionEvent e) {
    // Queue tap if there is space. Tap is lost if queue is full.
    mQueuedSingleTaps.offer(e);
}
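(mQueuedSingleTaps is just a small, thread-safe queue declared as a field of the Activity. The declaration below is paraphrased from the sample rather than copied, and the capacity value is only illustrative.)

import java.util.concurrent.ArrayBlockingQueue;

// Taps arrive on the UI thread but are consumed on the GL thread in onDrawFrame(),
// so the sample hands them over through a bounded, thread-safe queue.
// Paraphrased from the sample; the capacity shown here is illustrative.
private final ArrayBlockingQueue<MotionEvent> mQueuedSingleTaps =
    new ArrayBlockingQueue<>(16);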

is called and the tap is stored. This queue is processed on every frame drawing (which is in turn issued by the system's UI drawing cycle), here:

MotionEvent tap = mQueuedSingleTaps.poll();
if (tap != null && frame.getTrackingState() == TrackingState.TRACKING) {
    for (HitResult hit : frame.hitTest(tap)) {
       ...

This adds a new anchor (i.e. a point "locked" into the physical world) at which the Android object is rendered (cf. this line).
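For completeness, the part elided by the "..." above is where the check the question asks about lives. Paraphrasing from memory of the developer-preview HelloArActivity (so treat this as a sketch rather than the exact code), the loop only does something when the hit is a PlaneHitResult lying inside the detected plane's polygon, and in that case it creates the anchor:

// Sketch of the elided loop body, paraphrased from the developer-preview sample;
// mTouches and PlaneAttachment are fields/helpers defined in HelloArActivity.
for (HitResult hit : frame.hitTest(tap)) {
    // Only react when a detected plane was hit, inside its polygon.
    if (hit instanceof PlaneHitResult && ((PlaneHitResult) hit).isHitInPolygon()) {
        // "Lock" the tapped point into the world: create an anchor at the hit pose
        // and remember it so Andy can be drawn there on every subsequent frame.
        mTouches.add(new PlaneAttachment(
            ((PlaneHitResult) hit).getPlane(),
            mSession.addAnchor(hit.getHitPose())));
        break; // handle at most one hit per tap
    }
}

As for the second question (resizing/re-orienting Andy): the sample does not explicitly scale the model by distance. On every frame the anchor's pose is turned into a model matrix and the object is drawn with the camera's view and projection matrices, so the perspective projection alone makes Andy appear smaller when the tapped point is further from the camera. Roughly (again a paraphrased sketch of onDrawFrame(), not the exact sample code):

// Sketch of the per-frame drawing of the placed objects, paraphrased from the sample.
float scaleFactor = 1.0f; // fixed scale; apparent size comes from the projection
for (PlaneAttachment planeAttachment : mTouches) {
    if (!planeAttachment.isTracking()) {
        continue; // skip objects whose plane/anchor is currently not tracked
    }
    // Anchor pose -> model matrix; view/projection matrices come from the ARCore camera.
    planeAttachment.getPose().toMatrix(mAnchorMatrix, 0);
    mVirtualObject.updateModelMatrix(mAnchorMatrix, scaleFactor);
    mVirtualObject.draw(viewmtx, projmtx, lightIntensity);
}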

