Package apple.avfoundation
Class AVSampleBufferAudioRenderer
java.lang.Object
  org.moe.natj.general.NativeObject
    org.moe.natj.objc.ObjCObject
      apple.NSObject
        apple.avfoundation.AVSampleBufferAudioRenderer
- All Implemented Interfaces:
AVQueuedSampleBufferRendering, NSObject
public class AVSampleBufferAudioRenderer extends NSObject implements AVQueuedSampleBufferRendering
AVSampleBufferAudioRenderer can decompress and play compressed or uncompressed audio. An instance of AVSampleBufferAudioRenderer must be added to an AVSampleBufferRenderSynchronizer before the first sample buffer is enqueued.
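A minimal MOE/Java setup might look like the sketch below. The addRenderer call is assumed from the corresponding AVSampleBufferRenderSynchronizer API; the exact generated binding name may differ.

```java
// Sketch: a renderer must join a synchronizer before any buffer is enqueued.
AVSampleBufferAudioRenderer renderer = AVSampleBufferAudioRenderer.alloc().init();
AVSampleBufferRenderSynchronizer sync = AVSampleBufferRenderSynchronizer.alloc().init();
sync.addRenderer(renderer);  // assumed binding of -[AVSampleBufferRenderSynchronizer addRenderer:]
// Only now is it valid to call renderer.enqueueSampleBuffer(...).
```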
-
-
Nested Class Summary
Nested Classes
static interface  AVSampleBufferAudioRenderer.Block_flushFromSourceTimeCompletionHandler
Nested classes/interfaces inherited from class apple.NSObject
NSObject.Function_instanceMethodForSelector_ret, NSObject.Function_methodForSelector_ret
-
Nested classes/interfaces inherited from interface apple.avfoundation.protocol.AVQueuedSampleBufferRendering
AVQueuedSampleBufferRendering.Block_requestMediaDataWhenReadyOnQueueUsingBlock
-
-
Constructor Summary
Constructors
protected  AVSampleBufferAudioRenderer(org.moe.natj.general.Pointer peer)
-
Method Summary
static boolean  accessInstanceVariablesDirectly()
static AVSampleBufferAudioRenderer  alloc()
static java.lang.Object  allocWithZone(org.moe.natj.general.ptr.VoidPtr zone)
java.lang.String  audioTimePitchAlgorithm()
    [@property] audioTimePitchAlgorithm Indicates the processing algorithm used to manage audio pitch at varying rates.
static boolean  automaticallyNotifiesObserversForKey(java.lang.String key)
static void  cancelPreviousPerformRequestsWithTarget(java.lang.Object aTarget)
static void  cancelPreviousPerformRequestsWithTargetSelectorObject(java.lang.Object aTarget, org.moe.natj.objc.SEL aSelector, java.lang.Object anArgument)
static NSArray<java.lang.String>  classFallbacksForKeyedArchiver()
static org.moe.natj.objc.Class  classForKeyedUnarchiver()
static java.lang.String  debugDescription_static()
static java.lang.String  description_static()
void  enqueueSampleBuffer(CMSampleBufferRef sampleBuffer)
    enqueueSampleBuffer: Sends a sample buffer in order to render its contents.
NSError  error()
    [@property] error If the renderer's status is AVQueuedSampleBufferRenderingStatusFailed, this describes the error that caused the failure.
void  flush()
    flush Instructs the receiver to discard pending enqueued sample buffers.
void  flushFromSourceTimeCompletionHandler(CMTime time, AVSampleBufferAudioRenderer.Block_flushFromSourceTimeCompletionHandler completionHandler)
    flushFromSourceTime:completionHandler: Flushes enqueued sample buffers with presentation time stamps later than or equal to the specified time.
static long  hash_static()
AVSampleBufferAudioRenderer  init()
static NSObject.Function_instanceMethodForSelector_ret  instanceMethodForSelector(org.moe.natj.objc.SEL aSelector)
static NSMethodSignature  instanceMethodSignatureForSelector(org.moe.natj.objc.SEL aSelector)
static boolean  instancesRespondToSelector(org.moe.natj.objc.SEL aSelector)
boolean  isMuted()
    [@property] muted Indicates whether or not audio output of the AVSampleBufferAudioRenderer is muted.
boolean  isReadyForMoreMediaData()
    [@property] readyForMoreMediaData Indicates the readiness of the receiver to accept more sample buffers.
static boolean  isSubclassOfClass(org.moe.natj.objc.Class aClass)
static NSSet<java.lang.String>  keyPathsForValuesAffectingValueForKey(java.lang.String key)
static java.lang.Object  new_objc()
void  requestMediaDataWhenReadyOnQueueUsingBlock(NSObject queue, AVQueuedSampleBufferRendering.Block_requestMediaDataWhenReadyOnQueueUsingBlock block)
    requestMediaDataWhenReadyOnQueue:usingBlock: Instructs the target to invoke a client-supplied block repeatedly, at its convenience, in order to gather sample buffers for playback.
static boolean  resolveClassMethod(org.moe.natj.objc.SEL sel)
static boolean  resolveInstanceMethod(org.moe.natj.objc.SEL sel)
void  setAudioTimePitchAlgorithm(java.lang.String value)
    [@property] audioTimePitchAlgorithm Indicates the processing algorithm used to manage audio pitch at varying rates.
void  setMuted(boolean value)
    [@property] muted Indicates whether or not audio output of the AVSampleBufferAudioRenderer is muted.
static void  setVersion_static(long aVersion)
void  setVolume(float value)
    [@property] volume Indicates the current audio volume of the AVSampleBufferAudioRenderer.
long  status()
    [@property] status Indicates the status of the audio renderer.
void  stopRequestingMediaData()
    stopRequestingMediaData Cancels any current requestMediaDataWhenReadyOnQueue:usingBlock: call.
static org.moe.natj.objc.Class  superclass_static()
CMTimebaseRef  timebase()
    [@property] timebase The renderer's timebase, which governs how time stamps are interpreted.
static long  version_static()
float  volume()
    [@property] volume Indicates the current audio volume of the AVSampleBufferAudioRenderer.
Methods inherited from class apple.NSObject
accessibilityActivate, accessibilityActivationPoint, accessibilityAssistiveTechnologyFocusedIdentifiers, accessibilityAttributedHint, accessibilityAttributedLabel, accessibilityAttributedUserInputLabels, accessibilityAttributedValue, accessibilityContainerType, accessibilityCustomActions, accessibilityCustomRotors, accessibilityDecrement, accessibilityDragSourceDescriptors, accessibilityDropPointDescriptors, accessibilityElementAtIndex, accessibilityElementCount, accessibilityElementDidBecomeFocused, accessibilityElementDidLoseFocus, accessibilityElementIsFocused, accessibilityElements, accessibilityElementsHidden, accessibilityFrame, accessibilityHint, accessibilityIncrement, accessibilityLabel, accessibilityLanguage, accessibilityNavigationStyle, accessibilityPath, accessibilityPerformEscape, accessibilityPerformMagicTap, accessibilityRespondsToUserInteraction, accessibilityScroll, accessibilityTextualContext, accessibilityTraits, accessibilityUserInputLabels, accessibilityValue, accessibilityViewIsModal, addObserverForKeyPathOptionsContext, attemptRecoveryFromErrorOptionIndex, attemptRecoveryFromErrorOptionIndexDelegateDidRecoverSelectorContextInfo, autoContentAccessingProxy, awakeAfterUsingCoder, awakeFromNib, class_objc, classForCoder, classForKeyedArchiver, copy, dealloc, debugDescription, description, dictionaryWithValuesForKeys, didChangeValueForKey, didChangeValueForKeyWithSetMutationUsingObjects, didChangeValuesAtIndexesForKey, doesNotRecognizeSelector, fileManagerShouldProceedAfterError, fileManagerWillProcessPath, finalize_objc, forwardingTargetForSelector, forwardInvocation, hash, indexOfAccessibilityElement, isAccessibilityElement, isEqual, isKindOfClass, isMemberOfClass, isProxy, methodForSelector, methodSignatureForSelector, mutableArrayValueForKey, mutableArrayValueForKeyPath, mutableCopy, mutableOrderedSetValueForKey, mutableOrderedSetValueForKeyPath, mutableSetValueForKey, mutableSetValueForKeyPath, observationInfo, 
observeValueForKeyPathOfObjectChangeContext, performSelector, performSelectorInBackgroundWithObject, performSelectorOnMainThreadWithObjectWaitUntilDone, performSelectorOnMainThreadWithObjectWaitUntilDoneModes, performSelectorOnThreadWithObjectWaitUntilDone, performSelectorOnThreadWithObjectWaitUntilDoneModes, performSelectorWithObject, performSelectorWithObjectAfterDelay, performSelectorWithObjectAfterDelayInModes, performSelectorWithObjectWithObject, prepareForInterfaceBuilder, provideImageDataBytesPerRowOrigin_Size_UserInfo, removeObserverForKeyPath, removeObserverForKeyPathContext, replacementObjectForCoder, replacementObjectForKeyedArchiver, respondsToSelector, self, setAccessibilityActivationPoint, setAccessibilityAttributedHint, setAccessibilityAttributedLabel, setAccessibilityAttributedUserInputLabels, setAccessibilityAttributedValue, setAccessibilityContainerType, setAccessibilityCustomActions, setAccessibilityCustomRotors, setAccessibilityDragSourceDescriptors, setAccessibilityDropPointDescriptors, setAccessibilityElements, setAccessibilityElementsHidden, setAccessibilityFrame, setAccessibilityHint, setAccessibilityLabel, setAccessibilityLanguage, setAccessibilityNavigationStyle, setAccessibilityPath, setAccessibilityRespondsToUserInteraction, setAccessibilityTextualContext, setAccessibilityTraits, setAccessibilityUserInputLabels, setAccessibilityValue, setAccessibilityViewIsModal, setIsAccessibilityElement, setNilValueForKey, setObservationInfo, setShouldGroupAccessibilityChildren, setValueForKey, setValueForKeyPath, setValueForUndefinedKey, setValuesForKeysWithDictionary, shouldGroupAccessibilityChildren, superclass, validateValueForKeyError, validateValueForKeyPathError, valueForKey, valueForKeyPath, valueForUndefinedKey, willChangeValueForKey, willChangeValueForKeyWithSetMutationUsingObjects, willChangeValuesAtIndexesForKey
-
Method Detail
-
accessInstanceVariablesDirectly
public static boolean accessInstanceVariablesDirectly()
-
alloc
public static AVSampleBufferAudioRenderer alloc()
-
allocWithZone
public static java.lang.Object allocWithZone(org.moe.natj.general.ptr.VoidPtr zone)
-
audioTimePitchAlgorithm
public java.lang.String audioTimePitchAlgorithm()
[@property] audioTimePitchAlgorithm Indicates the processing algorithm used to manage audio pitch at varying rates. Constants for the various time pitch algorithms, e.g. AVAudioTimePitchAlgorithmSpectral, are defined in AVAudioProcessingSettings.h. The default value on iOS is AVAudioTimePitchAlgorithmLowQualityZeroLatency, and on macOS it is AVAudioTimePitchAlgorithmTimeDomain. If the timebase's rate is not supported by the audioTimePitchAlgorithm, audio will be muted. Modifying this property while the timebase's rate is not 0.0 may cause the rate to briefly change to 0.0.
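For illustration, a caller might pin the algorithm before varying the playback rate. The static accessor for the constant is an assumption about how the MOE bindings expose AVFoundation globals:

```java
// Sketch: select a time-pitch algorithm up front (constant accessor assumed
// to live on the generated AVFoundation globals class).
renderer.setAudioTimePitchAlgorithm(AVFoundation.AVAudioTimePitchAlgorithmTimeDomain());
// If the timebase's rate is unsupported by this algorithm, audio is muted.
```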
-
automaticallyNotifiesObserversForKey
public static boolean automaticallyNotifiesObserversForKey(java.lang.String key)
-
cancelPreviousPerformRequestsWithTarget
public static void cancelPreviousPerformRequestsWithTarget(java.lang.Object aTarget)
-
cancelPreviousPerformRequestsWithTargetSelectorObject
public static void cancelPreviousPerformRequestsWithTargetSelectorObject(java.lang.Object aTarget, org.moe.natj.objc.SEL aSelector, java.lang.Object anArgument)
-
classFallbacksForKeyedArchiver
public static NSArray<java.lang.String> classFallbacksForKeyedArchiver()
-
classForKeyedUnarchiver
public static org.moe.natj.objc.Class classForKeyedUnarchiver()
-
debugDescription_static
public static java.lang.String debugDescription_static()
-
description_static
public static java.lang.String description_static()
-
enqueueSampleBuffer
public void enqueueSampleBuffer(CMSampleBufferRef sampleBuffer)
Description copied from interface: AVQueuedSampleBufferRendering
enqueueSampleBuffer: Sends a sample buffer in order to render its contents. Video-specific notes: If sampleBuffer has the kCMSampleAttachmentKey_DoNotDisplay attachment set to kCFBooleanTrue, the frame will be decoded but not displayed. Otherwise, if sampleBuffer has the kCMSampleAttachmentKey_DisplayImmediately attachment set to kCFBooleanTrue, the decoded image will be displayed as soon as possible, replacing all previously enqueued images regardless of their timestamps. Otherwise, the decoded image will be displayed at sampleBuffer's output presentation timestamp, as interpreted by the timebase. To schedule the removal of previous images at a specific timestamp, enqueue a marker sample buffer containing no samples, with the kCMSampleBufferAttachmentKey_EmptyMedia attachment set to kCFBooleanTrue. IMPORTANT NOTE: attachments with the kCMSampleAttachmentKey_ prefix must be set via CMSampleBufferGetSampleAttachmentsArray and CFDictionarySetValue. Attachments with the kCMSampleBufferAttachmentKey_ prefix must be set via CMSetAttachment.
- Specified by:
enqueueSampleBuffer in interface AVQueuedSampleBufferRendering
-
error
public NSError error()
[@property] error If the renderer's status is AVQueuedSampleBufferRenderingStatusFailed, this describes the error that caused the failure. The value of this property is an NSError that describes what caused the renderer to no longer be able to render sample buffers. The value of this property is nil unless the value of status is AVQueuedSampleBufferRenderingStatusFailed.
-
flush
public void flush()
Description copied from interface: AVQueuedSampleBufferRendering
flush Instructs the receiver to discard pending enqueued sample buffers. Additional sample buffers can be appended after -flush. Video-specific notes: It is not possible to determine which sample buffers have been decoded, so the next frame passed to enqueueSampleBuffer: should be an IDR frame (also known as a key frame or sync sample).
- Specified by:
flush in interface AVQueuedSampleBufferRendering
-
flushFromSourceTimeCompletionHandler
public void flushFromSourceTimeCompletionHandler(CMTime time, AVSampleBufferAudioRenderer.Block_flushFromSourceTimeCompletionHandler completionHandler)
flushFromSourceTime:completionHandler: Flushes enqueued sample buffers with presentation time stamps later than or equal to the specified time. This method can be used to replace media data scheduled to be rendered in the future, without interrupting playback. One example of this is when the data that has already been enqueued is from a sequence of two songs and the second song is swapped for a new song. In this case, this method would be called with the time stamp of the first sample buffer from the second song. After the completion handler is executed with a YES parameter, media data may again be enqueued with timestamps at the specified time. If NO is provided to the completion handler, the flush did not succeed and the set of enqueued sample buffers remains unchanged. A flush can fail because the source time was too close to (or earlier than) the current time, or because the current configuration of the receiver does not support flushing at a particular time. In these cases, the caller can choose to flush all enqueued media data by invoking the -flush method.
- Parameters:
completionHandler - A block that is invoked, possibly asynchronously, after the flush operation completes or fails.
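The song-replacement scenario above might be sketched as follows in MOE/Java. The lambda form assumes the generated block interface has a single abstract method, and secondSongStartTime is a hypothetical CMTime taken from the first buffer of the media being replaced:

```java
// Sketch: replace already-enqueued media from a given timestamp onward.
renderer.flushFromSourceTimeCompletionHandler(secondSongStartTime, didSucceed -> {
    if (didSucceed) {
        // Safe to enqueue replacement buffers from secondSongStartTime on.
    } else {
        // Source time was too close to "now" (or flushing there is
        // unsupported); fall back to discarding everything.
        renderer.flush();
    }
});
```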
-
hash_static
public static long hash_static()
-
init
public AVSampleBufferAudioRenderer init()
-
instanceMethodForSelector
public static NSObject.Function_instanceMethodForSelector_ret instanceMethodForSelector(org.moe.natj.objc.SEL aSelector)
-
instanceMethodSignatureForSelector
public static NSMethodSignature instanceMethodSignatureForSelector(org.moe.natj.objc.SEL aSelector)
-
instancesRespondToSelector
public static boolean instancesRespondToSelector(org.moe.natj.objc.SEL aSelector)
-
isMuted
public boolean isMuted()
[@property] muted Indicates whether or not audio output of the AVSampleBufferAudioRenderer is muted. Setting this property only affects audio muting for the renderer instance and not for the device.
-
isReadyForMoreMediaData
public boolean isReadyForMoreMediaData()
Description copied from interface: AVQueuedSampleBufferRendering
[@property] readyForMoreMediaData Indicates the readiness of the receiver to accept more sample buffers. An object conforming to AVQueuedSampleBufferRendering keeps track of the occupancy levels of its internal queues for the benefit of clients that enqueue sample buffers from non-real-time sources -- i.e., clients that can supply sample buffers faster than they are consumed, and so need to decide when to hold back. Clients enqueueing sample buffers from non-real-time sources may hold off from generating or obtaining more sample buffers to enqueue when the value of readyForMoreMediaData is NO. It is safe to call enqueueSampleBuffer: when readyForMoreMediaData is NO, but it is a bad idea to enqueue sample buffers without bound. To help with control of the non-real-time supply of sample buffers, such clients can use -requestMediaDataWhenReadyOnQueue:usingBlock: in order to specify a block that the receiver should invoke whenever it's ready for sample buffers to be appended. The value of readyForMoreMediaData will often change from NO to YES asynchronously, as previously supplied sample buffers are decoded and rendered. This property is not key value observable.
- Specified by:
isReadyForMoreMediaData in interface AVQueuedSampleBufferRendering
-
isSubclassOfClass
public static boolean isSubclassOfClass(org.moe.natj.objc.Class aClass)
-
keyPathsForValuesAffectingValueForKey
public static NSSet<java.lang.String> keyPathsForValuesAffectingValueForKey(java.lang.String key)
-
new_objc
public static java.lang.Object new_objc()
-
requestMediaDataWhenReadyOnQueueUsingBlock
public void requestMediaDataWhenReadyOnQueueUsingBlock(NSObject queue, AVQueuedSampleBufferRendering.Block_requestMediaDataWhenReadyOnQueueUsingBlock block)
Description copied from interface: AVQueuedSampleBufferRendering
requestMediaDataWhenReadyOnQueue:usingBlock: Instructs the target to invoke a client-supplied block repeatedly, at its convenience, in order to gather sample buffers for playback. The block should enqueue sample buffers to the receiver either until the receiver's readyForMoreMediaData property becomes NO or until there is no more data to supply. When the receiver has decoded enough of the media data it has received that it becomes ready for more media data again, it will invoke the block again in order to obtain more. If this method is called multiple times, only the last call is effective. Call stopRequestingMediaData to cancel this request. Each call to requestMediaDataWhenReadyOnQueue:usingBlock: should be paired with a corresponding call to stopRequestingMediaData. Releasing the AVQueuedSampleBufferRendering object without a call to stopRequestingMediaData will result in undefined behavior.
- Specified by:
requestMediaDataWhenReadyOnQueueUsingBlock in interface AVQueuedSampleBufferRendering
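The pull model described above might be sketched as follows. The lambda assumes the generated block interface has a single abstract method; serialQueue is a hypothetical dispatch queue object and nextSampleBuffer() a hypothetical source of buffers:

```java
// Sketch: enqueue while the renderer is ready, stop when the source is drained.
renderer.requestMediaDataWhenReadyOnQueueUsingBlock(serialQueue, () -> {
    while (renderer.isReadyForMoreMediaData()) {
        CMSampleBufferRef buffer = nextSampleBuffer();  // hypothetical source
        if (buffer == null) {
            // No more data to supply; always pair the request with a stop.
            renderer.stopRequestingMediaData();
            return;
        }
        renderer.enqueueSampleBuffer(buffer);
    }
    // The renderer re-invokes this block once it is ready for more data.
});
```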
-
resolveClassMethod
public static boolean resolveClassMethod(org.moe.natj.objc.SEL sel)
-
resolveInstanceMethod
public static boolean resolveInstanceMethod(org.moe.natj.objc.SEL sel)
-
setAudioTimePitchAlgorithm
public void setAudioTimePitchAlgorithm(java.lang.String value)
[@property] audioTimePitchAlgorithm Indicates the processing algorithm used to manage audio pitch at varying rates. Constants for the various time pitch algorithms, e.g. AVAudioTimePitchAlgorithmSpectral, are defined in AVAudioProcessingSettings.h. The default value on iOS is AVAudioTimePitchAlgorithmLowQualityZeroLatency, and on macOS it is AVAudioTimePitchAlgorithmTimeDomain. If the timebase's rate is not supported by the audioTimePitchAlgorithm, audio will be muted. Modifying this property while the timebase's rate is not 0.0 may cause the rate to briefly change to 0.0.
-
setMuted
public void setMuted(boolean value)
[@property] muted Indicates whether or not audio output of the AVSampleBufferAudioRenderer is muted. Setting this property only affects audio muting for the renderer instance and not for the device.
-
setVersion_static
public static void setVersion_static(long aVersion)
-
setVolume
public void setVolume(float value)
[@property] volume Indicates the current audio volume of the AVSampleBufferAudioRenderer. A value of 0.0 means "silence all audio", while 1.0 means "play at the full volume of the audio media". This property should be used for frequent volume changes, for example via a volume knob or fader. This property is most useful on iOS to control the volume of the AVSampleBufferAudioRenderer relative to other audio output, not for setting absolute volume.
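A brief usage sketch, assuming an existing renderer instance:

```java
// Sketch: per-renderer volume and muting (neither affects device volume).
renderer.setVolume(0.5f);  // half volume, relative to other audio output
renderer.setMuted(true);   // silences only this renderer instance
```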
-
status
public long status()
[@property] status Indicates the status of the audio renderer. A renderer begins with status AVQueuedSampleBufferRenderingStatusUnknown. As sample buffers are enqueued for rendering using -enqueueSampleBuffer:, the renderer will transition to either AVQueuedSampleBufferRenderingStatusRendering or AVQueuedSampleBufferRenderingStatusFailed. If the status is AVQueuedSampleBufferRenderingStatusFailed, check the value of the renderer's error property for information on the error encountered. This is a terminal status from which recovery is not always possible. This property is key value observable.
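A failure check might look like the sketch below. The Java spelling of the status constant is an assumption about how the bindings expose the AVQueuedSampleBufferRenderingStatus enum:

```java
// Sketch: inspect a failed renderer (constant's Java spelling assumed).
if (renderer.status() == AVQueuedSampleBufferRenderingStatus.Failed) {
    NSError error = renderer.error();  // non-nil only in the failed state
    // Failed is terminal; recreate the renderer rather than retry on it.
}
```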
-
stopRequestingMediaData
public void stopRequestingMediaData()
Description copied from interface: AVQueuedSampleBufferRendering
stopRequestingMediaData Cancels any current requestMediaDataWhenReadyOnQueue:usingBlock: call. This method may be called from outside the block or from within the block.
- Specified by:
stopRequestingMediaData in interface AVQueuedSampleBufferRendering
-
superclass_static
public static org.moe.natj.objc.Class superclass_static()
-
timebase
public CMTimebaseRef timebase()
Description copied from interface: AVQueuedSampleBufferRendering
[@property] timebase The renderer's timebase, which governs how time stamps are interpreted. The timebase is read-only. Use the AVSampleBufferRenderSynchronizer to set the rate or time.
- Specified by:
timebase in interface AVQueuedSampleBufferRendering
-
version_static
public static long version_static()
-
volume
public float volume()
[@property] volume Indicates the current audio volume of the AVSampleBufferAudioRenderer. A value of 0.0 means "silence all audio", while 1.0 means "play at the full volume of the audio media". This property should be used for frequent volume changes, for example via a volume knob or fader. This property is most useful on iOS to control the volume of the AVSampleBufferAudioRenderer relative to other audio output, not for setting absolute volume.