What is Linear PCM

July 7, 2009 § Leave a comment

Linear PCM, also known as LPCM, is an uncompressed audio format. It can carry up to 8 channels of audio at a 48 kHz or 96 kHz sampling frequency with 16, 20, or 24 bits per sample, and it has a maximum bitrate of 6.144 Mbit/s.
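
That ceiling follows directly from the stream parameters: bitrate = sampling frequency x bits per sample x channel count. A quick sanity check in code, using the maximum channel count at 48 kHz:

// Linear PCM bitrate = sampling frequency x bits per sample x channels.
// 8 channels of 16-bit audio at 48 kHz hits the ceiling exactly.
unsigned long bitrate = 48000UL * 16UL * 8UL;   // = 6,144,000 bit/s
NSLog(@"%lu bit/s = %.3f Mbit/s", bitrate, bitrate / 1000000.0);  // 6.144 Mbit/s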


Supported Audio Formats in Playback & Recording

July 7, 2009 § 2 Comments

Supported Audio Playback Formats:

  • AAC
  • HE-AAC
  • AMR (Adaptive Multi-Rate, a format for speech)
  • ALAC (Apple Lossless)
  • iLBC (internet Low Bitrate Codec, another format for speech)
  • IMA4 (IMA/ADPCM)
  • linear PCM (uncompressed)
  • µ-law and a-law
  • MP3 (MPEG-1 Audio Layer 3)

Supported Audio Recording Formats:

  • ALAC (Apple Lossless)
  • iLBC (internet Low Bitrate Codec, for speech)
  • IMA/ADPCM (IMA4)
  • linear PCM
  • µ-law and a-law

AAC, MP3, and ALAC (Apple Lossless): playback of AAC, MP3, and ALAC sounds can use efficient hardware-based decoding on iPhone OS-based devices, but these codecs all share a single hardware path, so the device can play only a single instance of one of these formats at a time through hardware. If the user is playing a sound in one of these three formats in the iPod application, then your application, to play along over that audio, will have to employ software decoding.

Linear PCM and IMA4 (IMA/ADPCM): you can play multiple linear PCM or IMA4 sounds simultaneously in iPhone OS without incurring CPU resource problems. The same is true of the AMR and iLBC speech-quality formats, and of the µ-law and a-law compressed formats.

The device can play only a single instance of one of these formats at a time through hardware. For example, if you are playing a stereo MP3 sound, a second simultaneous MP3 sound will use software decoding. Similarly, you cannot simultaneously play an AAC and an ALAC sound using hardware. If the iPod application is playing an AAC sound in the background, your application plays AAC, ALAC, and MP3 audio using software decoding.

To play multiple sounds with best performance, or to efficiently play sounds while the iPod is playing in the background, use linear PCM (uncompressed) or IMA4 (compressed) audio.
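
Before committing to a compressed format, you can also ask the audio session at run time whether other audio (such as the iPod application) is already playing. A minimal sketch using the Audio Session C API; it assumes AudioSessionInitialize has already been called once, and error handling is omitted:

#import <AudioToolbox/AudioToolbox.h>

// YES if another application (e.g. the iPod) is playing audio, in which
// case the shared hardware decoder is unavailable to this application.
BOOL otherAudioIsPlaying(void) {
    UInt32 playing = 0;
    UInt32 size = sizeof(playing);
    // assumes AudioSessionInitialize() was called earlier in app startup
    AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying,
                            &size, &playing);
    return playing != 0;
}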

Measurement of Audio Latency

July 7, 2009 § Leave a comment

Latency = buffer duration + inherent hardware latency

In other words, latency is the time it takes to get the last sample of audio in the buffer out of the headphones or speaker.
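
For example (hypothetical numbers), a 1,024-frame buffer at 44.1 kHz contributes about 23 ms before any hardware latency is added:

double frames = 1024.0;          // buffer size in sample frames (assumed)
double sampleRate = 44100.0;     // Hz
double hardwareLatency = 0.005;  // seconds; device-dependent, assumed here
double latency = (frames / sampleRate) + hardwareLatency;
// 1024 / 44100 ≈ 0.0232 s, so total latency ≈ 0.028 s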

Reference here.

Audio References:
Subfruther.com
Michael Tyson

Audio Framework OpenAL

July 7, 2009 § Leave a comment

OpenAL is similar to OpenGL. It is straightforward, with three main entities: the Listener, the Source, and the Buffer.

The Listener is you. OpenAL allows you to specify where the listener is in relation to the sources. For a minimal setup with non-positional sound, you can simply leave the listener static.

The Source is similar to a speaker: it generates sound that the listener can hear. You can move sources around to get positional effects, or you can keep things simple.

The Buffer is the sound that will be played; it holds the raw audio data.

The device is the hardware that will play the sound, and the context is the current session for audio.

Minimum Process

* get the device
* make a context with the device
* put data into the buffer
* attach the buffer to a source
* play the source
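
Those five steps map directly onto OpenAL calls. A minimal sketch, with a placeholder buffer of silence standing in for real 16-bit mono PCM data and error checks omitted:

#import <OpenAL/al.h>
#import <OpenAL/alc.h>

static short pcmData[44100];   // placeholder: one second of 16-bit mono silence

void PlayMinimalSound(void) {
    // get the device (NULL asks for the default device)
    ALCdevice *device = alcOpenDevice(NULL);
    // make a context with the device and make it current
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);
    // put data into the buffer
    ALuint buffer;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcmData, sizeof(pcmData), 44100);
    // attach the buffer to a source
    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    // play the source
    alSourcePlay(source);
}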

Why should you use CAF instead of AIF & WAV?

July 7, 2009 § 2 Comments

iPhone OS has a native audio file format, the Core Audio Format (CAF), available in iPhone OS 2.0 and later. It can contain any audio data format supported on the platform. CAF files have no size restrictions, unlike .aif and .wav files, and can support a wide range of metadata, such as channel information and text annotations.

This command line converts audio into little-endian, 16-bit, 44,100 Hz .caf format:

/usr/bin/afconvert -f caff -d LEI16@44100 inputSoundFile.aiff outputSoundFile.caf

Audio Toolbox methods can handle multiple formats, but if you supply the audio in the right format, the iPhone won't have to convert it at play time.

Audio Toolbox is fine if you have a button and simple UI interaction, but it doesn't play immediately. You can't match a specific frame with a specific sound effect: the sound is late by many frames, or the entire app pauses while Audio Toolbox loads the sound into the buffer. It is just not good for a game or music app.
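
For the button-press case where that tradeoff is acceptable, System Sound Services in Audio Toolbox is about as simple as it gets. A sketch, assuming a hypothetical tap.caf in the app bundle:

#import <AudioToolbox/AudioToolbox.h>

// Register a short .caf with System Sound Services and play it.
// Fine for UI feedback; offers no timing guarantees for games.
NSString *path = [[NSBundle mainBundle] pathForResource:@"tap" ofType:@"caf"];
SystemSoundID soundID;
AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:path], &soundID);
AudioServicesPlaySystemSound(soundID);
// later, when finished: AudioServicesDisposeSystemSoundID(soundID);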

SDK 3.0 Upgrades to the AVFoundation Framework

July 7, 2009 § Leave a comment

SDK 3.0 has added the following APIs to the AVFoundation framework:

  • AVAudioRecorder.h
  • AVAudioSession.h
  • AVAudioSettings.h

Existing APIs in SDK 2.2.1:

  • AVAudioPlayer.h
  • AVFoundation.h
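
The three new headers work together for recording. A minimal sketch, assuming recordURL points at a writable file (for example, in the Documents directory) and omitting error handling:

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>   // for kAudioFormatLinearPCM

// Configure the session for recording (AVAudioSession.h).
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryRecord error:NULL];
[session setActive:YES error:NULL];

// Describe the output format with keys from AVAudioSettings.h:
// linear PCM, 44.1 kHz, mono.
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    nil];

// Record to recordURL (AVAudioRecorder.h).
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordURL
                                                        settings:settings
                                                           error:NULL];
[recorder record];
// ... later: [recorder stop]; [recorder release];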

Upgrading the Utility App from SDK 2.2.1 to SDK 3.0

July 7, 2009 § 1 Comment

In SDK 2.2.1, infoButton is just a UIButton controlled by the toggleView method declared in RootViewController.h.

SDK 3.0 reveals a new architecture. The RootViewController.h and .m files are eliminated from the "Application Delegate" group, and the <FlipsideViewControllerDelegate> protocol and a showInfo method are added to MainViewController.h.

The new architecture in SDK 3.0 is a better implementation because it eliminates the complexity of RootViewController: in 2.2.1, it was complicated to pass data between MainViewController and FlipsideViewController via that middleman.

In SDK 2.2.1, RootViewController.h:

#import <UIKit/UIKit.h>

@class MainViewController;
@class FlipsideViewController;

@interface RootViewController : UIViewController {
    UIButton *infoButton;
    MainViewController *mainViewController;
    FlipsideViewController *flipsideViewController;
    UINavigationBar *flipsideNavigationBar;
}
@property (nonatomic, retain) IBOutlet UIButton *infoButton;
@property (nonatomic, retain) MainViewController *mainViewController;
@property (nonatomic, retain) UINavigationBar *flipsideNavigationBar;
@property (nonatomic, retain) FlipsideViewController *flipsideViewController;
- (IBAction)toggleView;
@end

In RootViewController.m:

- (IBAction)toggleView {
    /*
     This method is called when the info or Done button is pressed.
     It flips the displayed view from the main view to the flipside view and vice-versa.
     */
    if (flipsideViewController == nil) {
        [self loadFlipsideViewController];
    }
    UIView *mainView = mainViewController.view;
    UIView *flipsideView = flipsideViewController.view;
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:1];
    [UIView setAnimationTransition:([mainView superview] ? UIViewAnimationTransitionFlipFromRight : UIViewAnimationTransitionFlipFromLeft) forView:self.view cache:YES];
    if ([mainView superview] != nil) {
        [flipsideViewController viewWillAppear:YES];
        [mainViewController viewWillDisappear:YES];
        [mainView removeFromSuperview];
        [infoButton removeFromSuperview];
        [self.view addSubview:flipsideView];
        [self.view insertSubview:flipsideNavigationBar aboveSubview:flipsideView];
        [mainViewController viewDidDisappear:YES];
        [flipsideViewController viewDidAppear:YES];
    } else {
        [mainViewController viewWillAppear:YES];
        [flipsideViewController viewWillDisappear:YES];
        [flipsideView removeFromSuperview];
        [flipsideNavigationBar removeFromSuperview];
        [self.view addSubview:mainView];
        [self.view insertSubview:infoButton aboveSubview:mainViewController.view];
        [flipsideViewController viewDidDisappear:YES];
        [mainViewController viewDidAppear:YES];
    }
    [UIView commitAnimations];
}

In SDK 3.0 MainViewController.h:

#import "FlipsideViewController.h"

@interface MainViewController : UIViewController <FlipsideViewControllerDelegate> {
}
- (IBAction)showInfo;
@end

In MainViewController.m:

- (void)flipsideViewControllerDidFinish:(FlipsideViewController *)controller {
    [self dismissModalViewControllerAnimated:YES];
}

- (IBAction)showInfo {
    FlipsideViewController *controller = [[FlipsideViewController alloc] initWithNibName:@"FlipsideView" bundle:nil];
    controller.delegate = self;
    controller.modalTransitionStyle = UIModalTransitionStyleFlipHorizontal;
    [self presentModalViewController:controller animated:YES];
    [controller release];
}
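
For reference, the delegate protocol that MainViewController adopts is declared in FlipsideViewController.h; in the 3.0 template it looks roughly like this:

@protocol FlipsideViewControllerDelegate;

@interface FlipsideViewController : UIViewController {
    id <FlipsideViewControllerDelegate> delegate;
}
@property (nonatomic, assign) id <FlipsideViewControllerDelegate> delegate;
- (IBAction)done;
@end

@protocol FlipsideViewControllerDelegate
- (void)flipsideViewControllerDidFinish:(FlipsideViewController *)controller;
@end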
