
I'm using Xcode Version 9.2 and AudioKit Version 4.0.4.


I've written some code (see below) that should be able to

  • play a specific sound (frequency: 500.0 Hz)
  • "listen" to the microphone input and calculate the frequency in real time

If I call playSound() or receiveSound() separately, everything works exactly as I expect. But calling playSound() and then receiveSound() afterwards is where I run into big issues.

This is how I'd like to get the code working:

SystemClass.playSound() // play sound
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 3.0) {
    SystemClass.receiveSound() // get microphone input 3 seconds later
}

let SystemClass: System = System()

class System {
    public init() { }

    func playSound() {
        let sound = AKOscillator()
        AudioKit.output = sound
        AudioKit.start()
        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()
        DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 2.0) {
            sound.stop()
        }
    }

    var tracker: AKFrequencyTracker!

    func receiveSound() {
        AudioKit.stop()
        AKSettings.audioInputEnabled = true
        let mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        let silence = AKBooster(tracker, gain: 0)
        AudioKit.output = silence
        AudioKit.start()
        Timer.scheduledTimer(timeInterval: 0.1, target: self,
                             selector: #selector(SystemClass.outputFrequency),
                             userInfo: nil, repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}

These are some of the error messages I get at runtime every time I run the code (calling playSound(), then calling receiveSound() 3 seconds later):

AVAEInternal.h:103:_AVAE_CheckNoErr: [AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875
AVAudioEngine.mm:149:-[AVAudioEngine prepare]: Engine@0x1c401bff0: could not initialize, error = -10875
[MediaRemote] [AVOutputContext] WARNING: AVF context unavailable for sharedSystemAudioContext
[AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875
Fatal error: AudioKit: Could not start engine. error: Error 
Domain=com.apple.coreaudio.avfaudio Code=-10875 "(null)" UserInfo={failed call=err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)}.: file /Users/megastep/src/ak/AudioKit/AudioKit/Common/Internals/AudioKit.swift, line 243
asked Jan 6, 2018 at 18:21

1 Answer


I believe the lion's share of your problems is due to the local declaration of AKNodes within the functions that use them:

 let sound = AKOscillator()
 let mic = AKMicrophone() 
 let silence = AKBooster(tracker, gain: 0)

Declare these as instance variables instead, as described here.
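For reference, here is a minimal sketch of what that might look like, keeping the same AudioKit 4 calls the question already uses (AKOscillator, AKMicrophone, AKFrequencyTracker, AKBooster, AKSettings.audioInputEnabled, AudioKit.output, AudioKit.start()). Combining both signal chains into a single AKMixer output so the engine is configured and started only once is my own assumption, not something the original answer specifies:

import AudioKit

class System {
    // Nodes are instance properties, so they stay alive for the lifetime
    // of the object instead of being deallocated when a function returns.
    var sound: AKOscillator!
    var mic: AKMicrophone!
    var tracker: AKFrequencyTracker!
    var silence: AKBooster!

    public init() {
        AKSettings.audioInputEnabled = true
        sound = AKOscillator()
        mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0)
        // Assumption: mix both chains into one output and start the engine
        // once here, rather than reconfiguring it in each function.
        AudioKit.output = AKMixer(sound, silence)
        AudioKit.start()
    }

    func playSound() {
        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()
        DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 2.0) {
            self.sound.stop()
        }
    }

    func receiveSound() {
        Timer.scheduledTimer(timeInterval: 0.1, target: self,
                             selector: #selector(outputFrequency),
                             userInfo: nil, repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}

With the nodes and the engine graph set up once in init, the calling sequence from the question (playSound(), then receiveSound() three seconds later) no longer needs to stop and restart the engine between the two calls.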

answered Jan 6, 2018 at 23:53

1 Comment

Were you able to accomplish what you wrote in the question (playing sounds and receiving sounds)? I want to do the same thing, but it's just crashing.
