Getting PCM audio for visualization via the Spotify iOS SDK
We’re currently looking at bringing our music visualization software, which has been around for many years, to an iOS app that plays music via the new Spotify iOS SDK (see http://soundspectrum.com for our visuals, such as G-Force and Aeon).
Anyway, we have the demo projects in the Spotify iOS SDK up and running and things look good, but the major step forward is to get access to the PCM audio so we can send it into our visual engines, etc.
Could a Spotify dev, or someone in the know, kindly suggest what possibilities are available for getting hold of the PCM audio? The PCM block can be as simple as a circular buffer of a few thousand of the latest samples (which we would FFT, etc.).
Thanks in advance!
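For reference, the kind of circular buffer described above can be sketched in plain C, which drops straight into an Objective-C audio callback. The names (`pcm_ring_write`, `pcm_ring_latest`, `kRingSize`) are ours for illustration, not part of any SDK, and the sketch assumes mono Float32 samples:

```objc
#include <string.h>

// Minimal single-writer ring buffer holding the most recent kRingSize
// mono Float32 samples; the audio callback writes, the visualizer reads.
#define kRingSize 4096u  // a few thousand samples; power of two for cheap wrap

static float g_ring[kRingSize];
static volatile unsigned g_writePos = 0;  // total samples written so far

// Called from the audio render callback with freshly rendered PCM.
static void pcm_ring_write(const float *samples, unsigned count) {
    for (unsigned i = 0; i < count; i++) {
        g_ring[(g_writePos + i) % kRingSize] = samples[i];
    }
    g_writePos += count;
}

// Copies the latest `count` samples (oldest first) into `out`, ready for an FFT.
static void pcm_ring_latest(float *out, unsigned count) {
    unsigned start = g_writePos - count;
    for (unsigned i = 0; i < count; i++) {
        out[i] = g_ring[(start + i) % kRingSize];
    }
}
```

This version is deliberately simple; a production implementation should use atomics or a proper single-producer/single-consumer FIFO, since the writer runs on the real-time audio thread.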
Answer:
Subclass SPTCoreAudioController and do one of two things:
1. Use AudioUnitAddRenderNotify() to add a render callback to destinationNode’s audio unit. The callback will be called as the output node is rendered and will give you access to the audio as it’s leaving for the speakers. Once you’ve done that, make sure you call super’s implementation so the Spotify iOS SDK’s audio pipeline continues to work correctly.
2. Override attemptToDeliverAudioFrames:ofCount:streamDescription:. This gives you access to the PCM data as it’s produced by the library. However, there’s some buffering going on in the default pipeline, so the data given in this callback might be up to half a second behind what’s going out to the speakers; I’d recommend suggestion 1 over this. Call super here to continue with the default pipeline.
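The two suggestions above can be sketched roughly as follows. This is an illustrative sketch, not SDK-verified code: the override points shown (connectOutputBus:ofNode:toInputBus:ofNode:inGraph:error: and attemptToDeliverAudioFrames:ofCount:streamDescription:) are from the beta SDK headers as we remember them, so check the exact signatures against your version before relying on them:

```objc
#import <AudioToolbox/AudioToolbox.h>
// #import "SPTCoreAudioController.h"  // from the Spotify iOS SDK

@interface MyAudioController : SPTCoreAudioController
@end

// Render-notify callback: fires each time the output unit renders, i.e.
// as audio leaves for the speakers. Post-render, ioData holds the PCM.
static OSStatus RenderTap(void *inRefCon,
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber,
                          UInt32 inNumberFrames,
                          AudioBufferList *ioData) {
    if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) && ioData != NULL) {
        // Feed each buffer's samples into your circular buffer / FFT here.
    }
    return noErr;
}

@implementation MyAudioController

// Suggestion 1: tap the destination node's audio unit as the graph is wired up.
- (BOOL)connectOutputBus:(UInt32)sourceOutputBusNumber
                  ofNode:(AUNode)sourceNode
              toInputBus:(UInt32)destinationInputBusNumber
                  ofNode:(AUNode)destinationNode
                 inGraph:(AUGraph)graph
                   error:(NSError **)error {
    AudioUnit destinationUnit = NULL;
    if (AUGraphNodeInfo(graph, destinationNode, NULL, &destinationUnit) == noErr) {
        AudioUnitAddRenderNotify(destinationUnit, RenderTap, (__bridge void *)self);
    }
    // Calling super keeps the SDK's own pipeline wiring intact.
    return [super connectOutputBus:sourceOutputBusNumber ofNode:sourceNode
                        toInputBus:destinationInputBusNumber ofNode:destinationNode
                           inGraph:graph error:error];
}

// Suggestion 2: intercept PCM as the library produces it. Note this can lag
// the speakers by up to ~0.5 s because of downstream buffering.
- (NSInteger)attemptToDeliverAudioFrames:(const void *)audioFrames
                                 ofCount:(NSInteger)frameCount
                       streamDescription:(AudioStreamBasicDescription)audioDescription {
    // audioFrames is interleaved PCM matching audioDescription; copy what
    // you need here, then hand off to super for normal playback.
    return [super attemptToDeliverAudioFrames:audioFrames
                                      ofCount:frameCount
                            streamDescription:audioDescription];
}

@end
```

The AudioUnitAddRenderNotify() / AUGraphNodeInfo() calls are standard Core Audio; the subclass hooks are the Spotify-specific part to verify.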
Once you have your custom audio controller, initialise an SPTAudioStreamingController with it and you should be good to go.
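Wiring it up might look like this. We’re assuming the initialiser that takes an audio controller is named initWithClientId:audioController: (that’s what the SDK builds we’ve seen expose; verify against your headers), and `MyAudioController` stands in for whatever you named your SPTCoreAudioController subclass:

```objc
// `MyAudioController` is your hypothetical SPTCoreAudioController subclass;
// the initialiser name is from the beta SDK headers and may differ in yours.
MyAudioController *audioController = [MyAudioController new];
SPTAudioStreamingController *player =
    [[SPTAudioStreamingController alloc] initWithClientId:clientId
                                          audioController:audioController];
```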
I actually used suggestion 1 to implement iTunes’ visualiser API in my Mac OS X Spotify client that was built with CocoaLibSpotify. It’s not working 100% smoothly (I think I’m doing something wrong with runloops and stuff), but it drives G-Force and Whitecap pretty well. You can find the project here, and the visualiser stuff is in VivaCoreAudioController.m. The audio controller class in CocoaLibSpotify and that project is essentially the same as the one in the new iOS SDK.