Package apple.arkit
Class ARWorldTrackingConfiguration
- java.lang.Object
-
- org.moe.natj.general.NativeObject
-
- org.moe.natj.objc.ObjCObject
-
- apple.NSObject
-
- apple.arkit.ARConfiguration
-
- apple.arkit.ARWorldTrackingConfiguration
-
public class ARWorldTrackingConfiguration extends ARConfiguration
A configuration for running world tracking. World tracking provides six degrees of freedom (6DOF) tracking of the device. By finding feature points in the scene, world tracking enables performing hit-tests against the frame. Tracking can no longer be resumed once the session is paused.
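As a minimal usage sketch (assuming the MOE ARSession binding exposes a runWithConfiguration method, which is not shown on this page):

```java
import apple.arkit.ARSession;
import apple.arkit.ARWorldTrackingConfiguration;

public class WorldTrackingSetup {
    // `session` is assumed to come from elsewhere, e.g. an ARSCNView.
    public static void start(ARSession session) {
        // Not all devices support world tracking; check before configuring.
        if (!ARWorldTrackingConfiguration.isSupported()) {
            return;
        }
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        // Pausing the session ends tracking permanently; a new run is required afterwards.
        session.runWithConfiguration(config);
    }
}
```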
-
-
Nested Class Summary
-
Nested classes/interfaces inherited from class apple.NSObject
NSObject.Function_instanceMethodForSelector_ret, NSObject.Function_methodForSelector_ret
-
-
Constructor Summary
Constructors
protected ARWorldTrackingConfiguration(org.moe.natj.general.Pointer peer)
-
Method Summary
static boolean accessInstanceVariablesDirectly()
static ARWorldTrackingConfiguration alloc()
static java.lang.Object allocWithZone(org.moe.natj.general.ptr.VoidPtr zone)
static boolean automaticallyNotifiesObserversForKey(java.lang.String key)
boolean automaticImageScaleEstimationEnabled() - Enables the estimation of a scale factor which may be used to correct the physical size of an image.
static void cancelPreviousPerformRequestsWithTarget(java.lang.Object aTarget)
static void cancelPreviousPerformRequestsWithTargetSelectorObject(java.lang.Object aTarget, org.moe.natj.objc.SEL aSelector, java.lang.Object anArgument)
static NSArray<java.lang.String> classFallbacksForKeyedArchiver()
static org.moe.natj.objc.Class classForKeyedUnarchiver()
static java.lang.String debugDescription_static()
static java.lang.String description_static()
NSSet<? extends ARReferenceImage> detectionImages() - Images to detect in the scene.
NSSet<? extends ARReferenceObject> detectionObjects() - Objects to detect in the scene.
long environmentTexturing() - The mode of environment texturing to run.
static long hash_static()
ARWorldTrackingConfiguration init()
ARWorldMap initialWorldMap() - The initial map of the physical space that world tracking will localize to and track.
static NSObject.Function_instanceMethodForSelector_ret instanceMethodForSelector(org.moe.natj.objc.SEL aSelector)
static NSMethodSignature instanceMethodSignatureForSelector(org.moe.natj.objc.SEL aSelector)
static boolean instancesRespondToSelector(org.moe.natj.objc.SEL aSelector)
boolean isAutoFocusEnabled() - Enable or disable continuous auto focus.
boolean isCollaborationEnabled() - Enable/disable a collaborative session.
static boolean isSubclassOfClass(org.moe.natj.objc.Class aClass)
static boolean isSupported()
static NSSet<java.lang.String> keyPathsForValuesAffectingValueForKey(java.lang.String key)
long maximumNumberOfTrackedImages() - Maximum number of images to track simultaneously.
static java.lang.Object new_objc()
long planeDetection() - Type of planes to detect in the scene.
static boolean resolveClassMethod(org.moe.natj.objc.SEL sel)
static boolean resolveInstanceMethod(org.moe.natj.objc.SEL sel)
long sceneReconstruction() - Type of scene reconstruction to run.
void setAutoFocusEnabled(boolean value) - Enable or disable continuous auto focus.
void setAutomaticImageScaleEstimationEnabled(boolean value) - Enables the estimation of a scale factor which may be used to correct the physical size of an image.
void setCollaborationEnabled(boolean value) - Enable/disable a collaborative session.
void setDetectionImages(NSSet<? extends ARReferenceImage> value) - Images to detect in the scene.
void setDetectionObjects(NSSet<? extends ARReferenceObject> value) - Objects to detect in the scene.
void setEnvironmentTexturing(long value) - The mode of environment texturing to run.
void setInitialWorldMap(ARWorldMap value) - The initial map of the physical space that world tracking will localize to and track.
void setMaximumNumberOfTrackedImages(long value) - Maximum number of images to track simultaneously.
void setPlaneDetection(long value) - Type of planes to detect in the scene.
void setSceneReconstruction(long value) - Type of scene reconstruction to run.
void setUserFaceTrackingEnabled(boolean value) - Enable or disable running face tracking using the front-facing camera.
static void setVersion_static(long aVersion)
void setWantsHDREnvironmentTextures(boolean value) - Determines whether environment textures will be provided with high dynamic range.
static org.moe.natj.objc.Class superclass_static()
static NSArray<? extends ARVideoFormat> supportedVideoFormats()
static boolean supportsFrameSemantics(long frameSemantics)
static boolean supportsSceneReconstruction(long sceneReconstruction) - Indicates whether the scene reconstruction type is supported for the configuration on this device.
static boolean supportsUserFaceTracking() - Indicates whether user face tracking using the front-facing camera can be enabled on this device.
boolean userFaceTrackingEnabled() - Enable or disable running face tracking using the front-facing camera.
static long version_static()
boolean wantsHDREnvironmentTextures() - Determines whether environment textures will be provided with high dynamic range.
-
Methods inherited from class apple.arkit.ARConfiguration
copyWithZone, frameSemantics, isLightEstimationEnabled, providesAudioData, setFrameSemantics, setLightEstimationEnabled, setProvidesAudioData, setVideoFormat, setWorldAlignment, videoFormat, worldAlignment
-
Methods inherited from class apple.NSObject
accessibilityActivate, accessibilityActivationPoint, accessibilityAssistiveTechnologyFocusedIdentifiers, accessibilityAttributedHint, accessibilityAttributedLabel, accessibilityAttributedUserInputLabels, accessibilityAttributedValue, accessibilityContainerType, accessibilityCustomActions, accessibilityCustomRotors, accessibilityDecrement, accessibilityDragSourceDescriptors, accessibilityDropPointDescriptors, accessibilityElementAtIndex, accessibilityElementCount, accessibilityElementDidBecomeFocused, accessibilityElementDidLoseFocus, accessibilityElementIsFocused, accessibilityElements, accessibilityElementsHidden, accessibilityFrame, accessibilityHint, accessibilityIncrement, accessibilityLabel, accessibilityLanguage, accessibilityNavigationStyle, accessibilityPath, accessibilityPerformEscape, accessibilityPerformMagicTap, accessibilityRespondsToUserInteraction, accessibilityScroll, accessibilityTextualContext, accessibilityTraits, accessibilityUserInputLabels, accessibilityValue, accessibilityViewIsModal, addObserverForKeyPathOptionsContext, attemptRecoveryFromErrorOptionIndex, attemptRecoveryFromErrorOptionIndexDelegateDidRecoverSelectorContextInfo, autoContentAccessingProxy, awakeAfterUsingCoder, awakeFromNib, class_objc, classForCoder, classForKeyedArchiver, copy, dealloc, debugDescription, description, dictionaryWithValuesForKeys, didChangeValueForKey, didChangeValueForKeyWithSetMutationUsingObjects, didChangeValuesAtIndexesForKey, doesNotRecognizeSelector, fileManagerShouldProceedAfterError, fileManagerWillProcessPath, finalize_objc, forwardingTargetForSelector, forwardInvocation, hash, indexOfAccessibilityElement, isAccessibilityElement, isEqual, isKindOfClass, isMemberOfClass, isProxy, methodForSelector, methodSignatureForSelector, mutableArrayValueForKey, mutableArrayValueForKeyPath, mutableCopy, mutableOrderedSetValueForKey, mutableOrderedSetValueForKeyPath, mutableSetValueForKey, mutableSetValueForKeyPath, observationInfo, 
observeValueForKeyPathOfObjectChangeContext, performSelector, performSelectorInBackgroundWithObject, performSelectorOnMainThreadWithObjectWaitUntilDone, performSelectorOnMainThreadWithObjectWaitUntilDoneModes, performSelectorOnThreadWithObjectWaitUntilDone, performSelectorOnThreadWithObjectWaitUntilDoneModes, performSelectorWithObject, performSelectorWithObjectAfterDelay, performSelectorWithObjectAfterDelayInModes, performSelectorWithObjectWithObject, prepareForInterfaceBuilder, provideImageDataBytesPerRowOrigin_Size_UserInfo, removeObserverForKeyPath, removeObserverForKeyPathContext, replacementObjectForCoder, replacementObjectForKeyedArchiver, respondsToSelector, self, setAccessibilityActivationPoint, setAccessibilityAttributedHint, setAccessibilityAttributedLabel, setAccessibilityAttributedUserInputLabels, setAccessibilityAttributedValue, setAccessibilityContainerType, setAccessibilityCustomActions, setAccessibilityCustomRotors, setAccessibilityDragSourceDescriptors, setAccessibilityDropPointDescriptors, setAccessibilityElements, setAccessibilityElementsHidden, setAccessibilityFrame, setAccessibilityHint, setAccessibilityLabel, setAccessibilityLanguage, setAccessibilityNavigationStyle, setAccessibilityPath, setAccessibilityRespondsToUserInteraction, setAccessibilityTextualContext, setAccessibilityTraits, setAccessibilityUserInputLabels, setAccessibilityValue, setAccessibilityViewIsModal, setIsAccessibilityElement, setNilValueForKey, setObservationInfo, setShouldGroupAccessibilityChildren, setValueForKey, setValueForKeyPath, setValueForUndefinedKey, setValuesForKeysWithDictionary, shouldGroupAccessibilityChildren, superclass, validateValueForKeyError, validateValueForKeyPathError, valueForKey, valueForKeyPath, valueForUndefinedKey, willChangeValueForKey, willChangeValueForKeyWithSetMutationUsingObjects, willChangeValuesAtIndexesForKey
-
-
-
-
Method Detail
-
accessInstanceVariablesDirectly
public static boolean accessInstanceVariablesDirectly()
-
alloc
public static ARWorldTrackingConfiguration alloc()
-
allocWithZone
public static java.lang.Object allocWithZone(org.moe.natj.general.ptr.VoidPtr zone)
-
automaticallyNotifiesObserversForKey
public static boolean automaticallyNotifiesObserversForKey(java.lang.String key)
-
cancelPreviousPerformRequestsWithTarget
public static void cancelPreviousPerformRequestsWithTarget(java.lang.Object aTarget)
-
cancelPreviousPerformRequestsWithTargetSelectorObject
public static void cancelPreviousPerformRequestsWithTargetSelectorObject(java.lang.Object aTarget, org.moe.natj.objc.SEL aSelector, java.lang.Object anArgument)
-
classFallbacksForKeyedArchiver
public static NSArray<java.lang.String> classFallbacksForKeyedArchiver()
-
classForKeyedUnarchiver
public static org.moe.natj.objc.Class classForKeyedUnarchiver()
-
debugDescription_static
public static java.lang.String debugDescription_static()
-
description_static
public static java.lang.String description_static()
-
hash_static
public static long hash_static()
-
init
public ARWorldTrackingConfiguration init()
- Overrides:
init in class ARConfiguration
-
instanceMethodForSelector
public static NSObject.Function_instanceMethodForSelector_ret instanceMethodForSelector(org.moe.natj.objc.SEL aSelector)
-
instanceMethodSignatureForSelector
public static NSMethodSignature instanceMethodSignatureForSelector(org.moe.natj.objc.SEL aSelector)
-
instancesRespondToSelector
public static boolean instancesRespondToSelector(org.moe.natj.objc.SEL aSelector)
-
isSubclassOfClass
public static boolean isSubclassOfClass(org.moe.natj.objc.Class aClass)
-
isSupported
public static boolean isSupported()
-
keyPathsForValuesAffectingValueForKey
public static NSSet<java.lang.String> keyPathsForValuesAffectingValueForKey(java.lang.String key)
-
new_objc
public static java.lang.Object new_objc()
-
planeDetection
public long planeDetection()
Type of planes to detect in the scene. If set, new planes will continue to be detected and updated over time. Detected planes will be added to the session as ARPlaneAnchor objects. In the event that two planes are merged, the newer plane will be removed. Defaults to ARPlaneDetectionNone.
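The plane-detection value is a bit mask. A sketch, assuming the raw ARPlaneDetection values from the ARKit headers (None = 0, Horizontal = 1 << 0, Vertical = 1 << 1); the MOE enum class that would normally supply these constants is not shown on this page:

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class PlaneDetectionSetup {
    // Raw ARPlaneDetection bit-mask values, per the ARKit headers (an assumption here).
    static final long HORIZONTAL = 1L << 0;
    static final long VERTICAL = 1L << 1;

    public static ARWorldTrackingConfiguration configure() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        // Detect both orientations; detected planes arrive as ARPlaneAnchor objects.
        config.setPlaneDetection(HORIZONTAL | VERTICAL);
        return config;
    }
}
```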
-
resolveClassMethod
public static boolean resolveClassMethod(org.moe.natj.objc.SEL sel)
-
resolveInstanceMethod
public static boolean resolveInstanceMethod(org.moe.natj.objc.SEL sel)
-
setPlaneDetection
public void setPlaneDetection(long value)
Type of planes to detect in the scene. If set, new planes will continue to be detected and updated over time. Detected planes will be added to the session as ARPlaneAnchor objects. In the event that two planes are merged, the newer plane will be removed. Defaults to ARPlaneDetectionNone.
-
setVersion_static
public static void setVersion_static(long aVersion)
-
superclass_static
public static org.moe.natj.objc.Class superclass_static()
-
version_static
public static long version_static()
-
automaticImageScaleEstimationEnabled
public boolean automaticImageScaleEstimationEnabled()
Enables the estimation of a scale factor which may be used to correct the physical size of an image. If set to true, ARKit will attempt to use the computed camera positions to compute the scale by which the given physical size differs from the estimated one. The estimated scale is exposed as the estimatedScaleFactor property on the ARImageAnchor. Note: when set to true, the transform of a returned ARImageAnchor will use the estimated scale factor to correct the translation. The default value is false.
-
detectionImages
public NSSet<? extends ARReferenceImage> detectionImages()
Images to detect in the scene. If set the session will attempt to detect the specified images. When an image is detected an ARImageAnchor will be added to the session.
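A wiring sketch; loading the reference images themselves (e.g. from an AR resource group in the asset catalog) is outside the scope of this page and treated as given:

```java
import apple.arkit.ARReferenceImage;
import apple.arkit.ARWorldTrackingConfiguration;
import apple.foundation.NSSet;

public class ImageDetectionSetup {
    // `images` is assumed to be loaded elsewhere; only the configuration wiring is shown.
    public static ARWorldTrackingConfiguration configure(NSSet<? extends ARReferenceImage> images) {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        config.setDetectionImages(images); // each detection adds an ARImageAnchor to the session
        return config;
    }
}
```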
-
detectionObjects
public NSSet<? extends ARReferenceObject> detectionObjects()
Objects to detect in the scene. If set the session will attempt to detect the specified objects. When an object is detected an ARObjectAnchor will be added to the session.
-
environmentTexturing
public long environmentTexturing()
The mode of environment texturing to run. If set, texture information will be accumulated and updated. Adding an AREnvironmentProbeAnchor to the session will get the current environment texture available from that probe's perspective which can be used for lighting virtual objects in the scene. Defaults to AREnvironmentTexturingNone.
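A sketch of enabling automatic environment texturing, assuming the raw AREnvironmentTexturing values from the ARKit headers (None = 0, Manual = 1, Automatic = 2):

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class EnvironmentTexturingSetup {
    // Raw AREnvironmentTexturing value, per the ARKit headers (an assumption here).
    static final long AUTOMATIC = 2L;

    public static ARWorldTrackingConfiguration configure() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        config.setEnvironmentTexturing(AUTOMATIC);   // probes placed automatically by ARKit
        config.setWantsHDREnvironmentTextures(true); // HDR textures (the default)
        return config;
    }
}
```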
-
initialWorldMap
public ARWorldMap initialWorldMap()
The initial map of the physical space that world tracking will localize to and track. If set, the session will attempt to localize to the provided map with a limited tracking state until localization is successful or run is called again with a different (or no) initial map specified. Once localized, the map will be extended and can again be saved using the `getCurrentWorldMap` method on the session.
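A relocalization sketch, assuming the MOE ARSession binding exposes runWithConfiguration; capturing and persisting the map via the session's getCurrentWorldMap callback is not shown:

```java
import apple.arkit.ARSession;
import apple.arkit.ARWorldMap;
import apple.arkit.ARWorldTrackingConfiguration;

public class RelocalizationSetup {
    // `savedMap` is assumed to have been captured in an earlier session and unarchived.
    public static void resume(ARSession session, ARWorldMap savedMap) {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        config.setInitialWorldMap(savedMap);
        // Tracking remains limited until relocalization against the map succeeds.
        session.runWithConfiguration(config);
    }
}
```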
-
isAutoFocusEnabled
public boolean isAutoFocusEnabled()
Enable or disable continuous auto focus. Enabled by default.
-
isCollaborationEnabled
public boolean isCollaborationEnabled()
Enable/disable a collaborative session. Disabled by default. When enabled, ARSession will output collaboration data for other participants using its delegate didOutputCollaborationData. It is the responsibility of the caller to send the data to each participant. When data is received by a participant, it should be passed to the ARSession by calling updateWithCollaborationData.
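A configuration sketch; the transport between peers (e.g. MultipeerConnectivity) and the session-delegate plumbing are the caller's responsibility and are only noted in comments:

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class CollaborationSetup {
    public static ARWorldTrackingConfiguration configure() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        // Outgoing collaboration data arrives via the session delegate's
        // didOutputCollaborationData callback; the app must send it to each peer
        // and feed received data back with updateWithCollaborationData on the ARSession.
        config.setCollaborationEnabled(true);
        return config;
    }
}
```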
-
maximumNumberOfTrackedImages
public long maximumNumberOfTrackedImages()
Maximum number of images to track simultaneously. Setting the maximum number of tracked images will limit the number of images that can be tracked in a given frame. If more than the maximum is visible, only the images already being tracked will continue to track until tracking is lost or another image is removed. Images will continue to be detected regardless of how many are being tracked. Default value is zero.
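A sketch showing the interplay between detection and the tracking limit:

```java
import apple.arkit.ARReferenceImage;
import apple.arkit.ARWorldTrackingConfiguration;
import apple.foundation.NSSet;

public class TrackedImagesSetup {
    // `images` is assumed to be provided by the caller.
    public static ARWorldTrackingConfiguration configure(NSSet<? extends ARReferenceImage> images) {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        config.setDetectionImages(images);
        // At most 4 of the detected images are position-tracked in any given frame;
        // detection itself is unaffected by this limit.
        config.setMaximumNumberOfTrackedImages(4);
        return config;
    }
}
```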
-
setAutoFocusEnabled
public void setAutoFocusEnabled(boolean value)
Enable or disable continuous auto focus. Enabled by default.
-
setAutomaticImageScaleEstimationEnabled
public void setAutomaticImageScaleEstimationEnabled(boolean value)
Enables the estimation of a scale factor which may be used to correct the physical size of an image. If set to true, ARKit will attempt to use the computed camera positions to compute the scale by which the given physical size differs from the estimated one. The estimated scale is exposed as the estimatedScaleFactor property on the ARImageAnchor. Note: when set to true, the transform of a returned ARImageAnchor will use the estimated scale factor to correct the translation. The default value is false.
-
setCollaborationEnabled
public void setCollaborationEnabled(boolean value)
Enable/disable a collaborative session. Disabled by default. When enabled, ARSession will output collaboration data for other participants using its delegate didOutputCollaborationData. It is the responsibility of the caller to send the data to each participant. When data is received by a participant, it should be passed to the ARSession by calling updateWithCollaborationData.
-
setDetectionImages
public void setDetectionImages(NSSet<? extends ARReferenceImage> value)
Images to detect in the scene. If set the session will attempt to detect the specified images. When an image is detected an ARImageAnchor will be added to the session.
-
setDetectionObjects
public void setDetectionObjects(NSSet<? extends ARReferenceObject> value)
Objects to detect in the scene. If set the session will attempt to detect the specified objects. When an object is detected an ARObjectAnchor will be added to the session.
-
setEnvironmentTexturing
public void setEnvironmentTexturing(long value)
The mode of environment texturing to run. If set, texture information will be accumulated and updated. Adding an AREnvironmentProbeAnchor to the session will get the current environment texture available from that probe's perspective which can be used for lighting virtual objects in the scene. Defaults to AREnvironmentTexturingNone.
-
setInitialWorldMap
public void setInitialWorldMap(ARWorldMap value)
The initial map of the physical space that world tracking will localize to and track. If set, the session will attempt to localize to the provided map with a limited tracking state until localization is successful or run is called again with a different (or no) initial map specified. Once localized, the map will be extended and can again be saved using the `getCurrentWorldMap` method on the session.
-
setMaximumNumberOfTrackedImages
public void setMaximumNumberOfTrackedImages(long value)
Maximum number of images to track simultaneously. Setting the maximum number of tracked images will limit the number of images that can be tracked in a given frame. If more than the maximum is visible, only the images already being tracked will continue to track until tracking is lost or another image is removed. Images will continue to be detected regardless of how many are being tracked. Default value is zero.
-
setUserFaceTrackingEnabled
public void setUserFaceTrackingEnabled(boolean value)
Enable or disable running face tracking using the front-facing camera. Disabled by default. When enabled, the ARSession detects faces (if visible in the front-facing camera image) and adds an ARFaceAnchor object representing each face to its list of anchors. The transforms of the ARFaceAnchor objects are in world coordinate space.
- See Also:
ARFaceAnchor
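A sketch that guards the feature with its support check, since simultaneous world and user face tracking is hardware-dependent:

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class FaceTrackingSetup {
    public static ARWorldTrackingConfiguration configure() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        if (ARWorldTrackingConfiguration.supportsUserFaceTracking()) {
            config.setUserFaceTrackingEnabled(true); // faces arrive as ARFaceAnchor objects
        }
        return config;
    }
}
```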
-
setWantsHDREnvironmentTextures
public void setWantsHDREnvironmentTextures(boolean value)
Determines whether environment textures will be provided with high dynamic range. Enabled by default.
-
supportedVideoFormats
public static NSArray<? extends ARVideoFormat> supportedVideoFormats()
-
supportsFrameSemantics
public static boolean supportsFrameSemantics(long frameSemantics)
-
supportsUserFaceTracking
public static boolean supportsUserFaceTracking()
Indicates whether user face tracking using the front facing camera can be enabled on this device.
-
userFaceTrackingEnabled
public boolean userFaceTrackingEnabled()
Enable or disable running face tracking using the front-facing camera. Disabled by default. When enabled, the ARSession detects faces (if visible in the front-facing camera image) and adds an ARFaceAnchor object representing each face to its list of anchors. The transforms of the ARFaceAnchor objects are in world coordinate space.
- See Also:
ARFaceAnchor
-
wantsHDREnvironmentTextures
public boolean wantsHDREnvironmentTextures()
Determines whether environment textures will be provided with high dynamic range. Enabled by default.
-
sceneReconstruction
public long sceneReconstruction()
Type of scene reconstruction to run. Defaults to ARSceneReconstructionNone. If set to a value other than ARSceneReconstructionNone, the output of scene reconstruction will be added to the session as ARMeshAnchor objects.
- See Also:
ARMeshAnchor
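A sketch, assuming the raw ARSceneReconstruction values from the ARKit headers (None = 0, Mesh = 1 << 0, MeshWithClassification = (1 << 1) | (1 << 0)); mesh reconstruction requires LiDAR-class hardware, so the support check belongs in front:

```java
import apple.arkit.ARWorldTrackingConfiguration;

public class SceneReconstructionSetup {
    // Raw ARSceneReconstruction value, per the ARKit headers (an assumption here).
    static final long MESH = 1L << 0;

    public static ARWorldTrackingConfiguration configure() {
        ARWorldTrackingConfiguration config = ARWorldTrackingConfiguration.alloc().init();
        if (ARWorldTrackingConfiguration.supportsSceneReconstruction(MESH)) {
            config.setSceneReconstruction(MESH); // output arrives as ARMeshAnchor objects
        }
        return config;
    }
}
```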
-
setSceneReconstruction
public void setSceneReconstruction(long value)
Type of scene reconstruction to run. Defaults to ARSceneReconstructionNone. If set to a value other than ARSceneReconstructionNone, the output of scene reconstruction will be added to the session as ARMeshAnchor objects.
- See Also:
ARMeshAnchor
-
supportsSceneReconstruction
public static boolean supportsSceneReconstruction(long sceneReconstruction)
Indicates whether the scene reconstruction type is supported for the configuration on this device.
-
-