Package apple.arkit

Class ARWorldTrackingConfiguration

  • All Implemented Interfaces:
    NSCopying, NSObject

    public class ARWorldTrackingConfiguration
    extends ARConfiguration
    A configuration for running world tracking. World tracking provides six degrees of freedom (6DOF) tracking of the device. By finding feature points in the scene, world tracking enables performing hit tests against the frame. Tracking can no longer be resumed once the session is paused.
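A minimal sketch of starting a world-tracking session through these bindings. It assumes the standard MOE allocation pattern (`alloc()`/`init()`) and that `ARSession` exposes a `runWithConfiguration` method bound from the native `-[ARSession runWithConfiguration:]` selector; this is illustrative only and is not verified on device.

```java
import apple.arkit.ARSession;
import apple.arkit.ARWorldTrackingConfiguration;

public class WorldTrackingSketch {
    // Returns a running session, or null when world tracking is unavailable.
    public static ARSession startWorldTracking() {
        if (!ARWorldTrackingConfiguration.isSupported()) {
            return null; // world tracking is not supported on all devices
        }
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        ARSession session = ARSession.alloc().init();
        session.runWithConfiguration(config); // begins capturing and tracking
        return session;
    }
}
```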
    • Constructor Detail

      • ARWorldTrackingConfiguration

        protected ARWorldTrackingConfiguration​(org.moe.natj.general.Pointer peer)
    • Method Detail

      • accessInstanceVariablesDirectly

        public static boolean accessInstanceVariablesDirectly()
      • allocWithZone

        public static java.lang.Object allocWithZone​(org.moe.natj.general.ptr.VoidPtr zone)
      • automaticallyNotifiesObserversForKey

        public static boolean automaticallyNotifiesObserversForKey​(java.lang.String key)
      • cancelPreviousPerformRequestsWithTarget

        public static void cancelPreviousPerformRequestsWithTarget​(java.lang.Object aTarget)
      • cancelPreviousPerformRequestsWithTargetSelectorObject

        public static void cancelPreviousPerformRequestsWithTargetSelectorObject​(java.lang.Object aTarget,
                                                                                 org.moe.natj.objc.SEL aSelector,
                                                                                 java.lang.Object anArgument)
      • classFallbacksForKeyedArchiver

        public static NSArray<java.lang.String> classFallbacksForKeyedArchiver()
      • classForKeyedUnarchiver

        public static org.moe.natj.objc.Class classForKeyedUnarchiver()
      • debugDescription_static

        public static java.lang.String debugDescription_static()
      • description_static

        public static java.lang.String description_static()
      • hash_static

        public static long hash_static()
      • instanceMethodSignatureForSelector

        public static NSMethodSignature instanceMethodSignatureForSelector​(org.moe.natj.objc.SEL aSelector)
      • instancesRespondToSelector

        public static boolean instancesRespondToSelector​(org.moe.natj.objc.SEL aSelector)
      • isSubclassOfClass

        public static boolean isSubclassOfClass​(org.moe.natj.objc.Class aClass)
      • isSupported

        public static boolean isSupported()
      • keyPathsForValuesAffectingValueForKey

        public static NSSet<java.lang.String> keyPathsForValuesAffectingValueForKey​(java.lang.String key)
      • new_objc

        public static java.lang.Object new_objc()
      • planeDetection

        public long planeDetection()
        Type of planes to detect in the scene. If set, new planes will continue to be detected and updated over time. Detected planes will be added to the session as ARPlaneAnchor objects. In the event that two planes are merged, the newer plane will be removed. Defaults to ARPlaneDetectionNone.
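The value passed to `setPlaneDetection(long)` is the raw ARPlaneDetection option mask. A pure-Java sketch of composing it; the constants below mirror the native header values (`ARPlaneDetectionHorizontal = 1 << 0`, `ARPlaneDetectionVertical = 1 << 1`), though the name of the generated constants class in these bindings may differ:

```java
public class PlaneDetectionMask {
    // Raw ARPlaneDetection values from the native ARKit headers.
    public static final long NONE = 0;
    public static final long HORIZONTAL = 1L << 0;
    public static final long VERTICAL = 1L << 1;

    // Combine options with bitwise OR; pass the result to setPlaneDetection(long).
    public static long combine(long... options) {
        long mask = NONE;
        for (long option : options) {
            mask |= option;
        }
        return mask;
    }

    public static void main(String[] args) {
        System.out.println(combine(HORIZONTAL, VERTICAL)); // prints 3
    }
}
```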
      • resolveClassMethod

        public static boolean resolveClassMethod​(org.moe.natj.objc.SEL sel)
      • resolveInstanceMethod

        public static boolean resolveInstanceMethod​(org.moe.natj.objc.SEL sel)
      • setPlaneDetection

        public void setPlaneDetection​(long value)
        Type of planes to detect in the scene. If set, new planes will continue to be detected and updated over time. Detected planes will be added to the session as ARPlaneAnchor objects. In the event that two planes are merged, the newer plane will be removed. Defaults to ARPlaneDetectionNone.
      • setVersion_static

        public static void setVersion_static​(long aVersion)
      • superclass_static

        public static org.moe.natj.objc.Class superclass_static()
      • version_static

        public static long version_static()
      • automaticImageScaleEstimationEnabled

        public boolean automaticImageScaleEstimationEnabled()
        Enables estimation of a scale factor that may be used to correct the physical size of an image. If set to true, ARKit will attempt to use the computed camera positions to determine the scale by which the given physical size differs from the estimated one. The estimated scale is available as the estimatedScaleFactor property on the ARImageAnchor. Note: when set to true, the transform of a returned ARImageAnchor will use the estimated scale factor to correct the translation. Default value is false.
      • detectionImages

        public NSSet<? extends ARReferenceImage> detectionImages()
        Images to detect in the scene. If set, the session will attempt to detect the specified images. When an image is detected, an ARImageAnchor will be added to the session.
      • detectionObjects

        public NSSet<? extends ARReferenceObject> detectionObjects()
        Objects to detect in the scene. If set, the session will attempt to detect the specified objects. When an object is detected, an ARObjectAnchor will be added to the session.
      • environmentTexturing

        public long environmentTexturing()
        The mode of environment texturing to run. If set, texture information will be accumulated and updated over time. Adding an AREnvironmentProbeAnchor to the session makes the current environment texture from that probe's perspective available, which can be used for lighting virtual objects in the scene. Defaults to AREnvironmentTexturingNone.
      • initialWorldMap

        public ARWorldMap initialWorldMap()
        The initial map of the physical space that world tracking will localize to and track. If set, the session will attempt to localize to the provided map with a limited tracking state until localization is successful or run is called again with a different (or no) initial map specified. Once localized, the map will be extended and can again be saved using the `getCurrentWorldMap` method on the session.
      • isAutoFocusEnabled

        public boolean isAutoFocusEnabled()
        Enable or disable continuous auto focus. Enabled by default.
      • isCollaborationEnabled

        public boolean isCollaborationEnabled()
        Enable or disable a collaborative session. Disabled by default. When enabled, the ARSession will output collaboration data for other participants via its delegate method didOutputCollaborationData. It is the caller's responsibility to send the data to each participant. When data is received by a participant, it should be passed to that participant's ARSession by calling updateWithCollaborationData.
      • maximumNumberOfTrackedImages

        public long maximumNumberOfTrackedImages()
        Maximum number of images to track simultaneously. Setting this limits the number of images that can be tracked in a given frame. If more than the maximum number of images are visible, only the images already being tracked will continue to be tracked until tracking is lost or another image is removed. Images will continue to be detected regardless of how many are being tracked. Default value is zero.
      • setAutoFocusEnabled

        public void setAutoFocusEnabled​(boolean value)
        Enable or disable continuous auto focus. Enabled by default.
      • setAutomaticImageScaleEstimationEnabled

        public void setAutomaticImageScaleEstimationEnabled​(boolean value)
        Enables estimation of a scale factor that may be used to correct the physical size of an image. If set to true, ARKit will attempt to use the computed camera positions to determine the scale by which the given physical size differs from the estimated one. The estimated scale is available as the estimatedScaleFactor property on the ARImageAnchor. Note: when set to true, the transform of a returned ARImageAnchor will use the estimated scale factor to correct the translation. Default value is false.
      • setCollaborationEnabled

        public void setCollaborationEnabled​(boolean value)
        Enable or disable a collaborative session. Disabled by default. When enabled, the ARSession will output collaboration data for other participants via its delegate method didOutputCollaborationData. It is the caller's responsibility to send the data to each participant. When data is received by a participant, it should be passed to that participant's ARSession by calling updateWithCollaborationData.
      • setDetectionImages

        public void setDetectionImages​(NSSet<? extends ARReferenceImage> value)
        Images to detect in the scene. If set, the session will attempt to detect the specified images. When an image is detected, an ARImageAnchor will be added to the session.
      • setDetectionObjects

        public void setDetectionObjects​(NSSet<? extends ARReferenceObject> value)
        Objects to detect in the scene. If set, the session will attempt to detect the specified objects. When an object is detected, an ARObjectAnchor will be added to the session.
      • setEnvironmentTexturing

        public void setEnvironmentTexturing​(long value)
        The mode of environment texturing to run. If set, texture information will be accumulated and updated over time. Adding an AREnvironmentProbeAnchor to the session makes the current environment texture from that probe's perspective available, which can be used for lighting virtual objects in the scene. Defaults to AREnvironmentTexturingNone.
      • setInitialWorldMap

        public void setInitialWorldMap​(ARWorldMap value)
        The initial map of the physical space that world tracking will localize to and track. If set, the session will attempt to localize to the provided map with a limited tracking state until localization is successful or run is called again with a different (or no) initial map specified. Once localized, the map will be extended and can again be saved using the `getCurrentWorldMap` method on the session.
      • setMaximumNumberOfTrackedImages

        public void setMaximumNumberOfTrackedImages​(long value)
        Maximum number of images to track simultaneously. Setting this limits the number of images that can be tracked in a given frame. If more than the maximum number of images are visible, only the images already being tracked will continue to be tracked until tracking is lost or another image is removed. Images will continue to be detected regardless of how many are being tracked. Default value is zero.
      • setUserFaceTrackingEnabled

        public void setUserFaceTrackingEnabled​(boolean value)
        Enable or disable face tracking using the front-facing camera. Disabled by default. When enabled, the ARSession detects faces (if visible in the front-facing camera image) and adds an ARFaceAnchor object representing each face to its list of anchors. The transforms of the ARFaceAnchor objects are in world coordinate space.
        See Also:
        ARFaceAnchor
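Because user face tracking is hardware-dependent, the setter should be guarded with the corresponding capability check. A hedged sketch (assumes the standard MOE `alloc()`/`init()` allocation pattern; not verified on device):

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class FaceTrackingGuard {
    // Enables simultaneous face tracking only where the device supports it.
    public static ARWorldTrackingConfiguration makeConfiguration() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        if (ARWorldTrackingConfiguration.supportsUserFaceTracking()) {
            config.setUserFaceTrackingEnabled(true);
        }
        return config;
    }
}
```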
      • setWantsHDREnvironmentTextures

        public void setWantsHDREnvironmentTextures​(boolean value)
        Determines whether environment textures will be provided with high dynamic range. Enabled by default.
      • supportsFrameSemantics

        public static boolean supportsFrameSemantics​(long frameSemantics)
      • supportsUserFaceTracking

        public static boolean supportsUserFaceTracking()
        Indicates whether user face tracking using the front facing camera can be enabled on this device.
      • userFaceTrackingEnabled

        public boolean userFaceTrackingEnabled()
        Enable or disable face tracking using the front-facing camera. Disabled by default. When enabled, the ARSession detects faces (if visible in the front-facing camera image) and adds an ARFaceAnchor object representing each face to its list of anchors. The transforms of the ARFaceAnchor objects are in world coordinate space.
        See Also:
        ARFaceAnchor
      • wantsHDREnvironmentTextures

        public boolean wantsHDREnvironmentTextures()
        Determines whether environment textures will be provided with high dynamic range. Enabled by default.
      • sceneReconstruction

        public long sceneReconstruction()
        Type of scene reconstruction to run. Defaults to ARSceneReconstructionNone. If set to a value other than ARSceneReconstructionNone, output of scene reconstruction will be added to the session as ARMeshAnchor objects.
        See Also:
        ARMeshAnchor
      • setSceneReconstruction

        public void setSceneReconstruction​(long value)
        Type of scene reconstruction to run. Defaults to ARSceneReconstructionNone. If set to a value other than ARSceneReconstructionNone, output of scene reconstruction will be added to the session as ARMeshAnchor objects.
        See Also:
        ARMeshAnchor
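As with plane detection, the value passed to `setSceneReconstruction(long)` is a raw option constant. The native headers define `ARSceneReconstructionNone = 0`, `ARSceneReconstructionMesh = 1 << 0`, and `ARSceneReconstructionMeshWithClassification = ARSceneReconstructionMesh | (1 << 1)`. A pure-Java sketch of these values (on a real device, guard the setter with `supportsSceneReconstruction`):

```java
public class SceneReconstructionMode {
    // Raw ARSceneReconstruction values from the native ARKit headers.
    public static final long NONE = 0;
    public static final long MESH = 1L << 0;
    public static final long MESH_WITH_CLASSIFICATION = MESH | (1L << 1);

    // True when the mode also requests per-face mesh classification.
    public static boolean includesClassification(long mode) {
        return (mode & (1L << 1)) != 0;
    }

    public static void main(String[] args) {
        System.out.println(includesClassification(MESH_WITH_CLASSIFICATION)); // prints true
        System.out.println(includesClassification(MESH)); // prints false
    }
}
```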
      • supportsSceneReconstruction

        public static boolean supportsSceneReconstruction​(long sceneReconstruction)
        Indicates whether the scene reconstruction type is supported for the configuration on this device.