Friday, May 23, 2014

LKDB Helper Sqlite ORM

LKDBHelper is a SQLite ORM (automatic database operations): thread-safe, and not afraid of recursive deadlocks.
In the new version, adding a field only requires defining the property; there is no longer any need to call methods such as [self tableUpdateAddColumnWithPN:@"color"];

v1.1

  • Supports binding between column names and properties.
  • You can also set the attributes of a column.
  • When a column mapping uses the LKSQLUserCalculate value, override the following two methods; you decide what data is inserted into the database:
    -(id)userGetValueForModel:(LKDBProperty *)property
    -(void)userSetValueForModel:(LKDBProperty *)property value:(id)value
  • Added two methods for adding columns, convenient to call when upgrading the table version.
  • To support multiple databases, the shareDBHelper method was removed in favor of [modelClass getUsingDBHelper], which each model can override to choose the database it uses. See the methods in NSObject+LKDBHelper.
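For example, if the address column is mapped to LKSQLUserCalculate (as in the table mapping example below), these overrides decide what is stored and how the model is rebuilt. A minimal sketch; serializing the foreign object with NSKeyedArchiver is an illustrative assumption (it requires LKTestForeign to adopt NSCoding), not the library's prescribed approach:

-(id)userGetValueForModel:(LKDBProperty *)property
{
    if ([property.propertyName isEqualToString:@"address"]) {
        // Decide what actually gets written to the database column
        return [NSKeyedArchiver archivedDataWithRootObject:self.address];
    }
    return nil;
}

-(void)userSetValueForModel:(LKDBProperty *)property value:(id)value
{
    if ([property.propertyName isEqualToString:@"address"] && value) {
        // Rebuild the model property from the value read out of the database
        self.address = [NSKeyedUnarchiver unarchiveObjectWithData:value];
    }
}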

Requirements

  • iOS 4.3+
  • ARC only
  • FMDB(https://github.com/ccgus/fmdb)

Adding to your project

If you are using CocoaPods, just add this line to your Podfile:
pod 'LKDBHelper', :head

Basic usage

1 . Create a new Objective-C class for your data model
@interface LKTest : NSObject
@property(copy,nonatomic)NSString* name;
@property int  age;
@property BOOL isGirl;

@property(strong,nonatomic)LKTestForeign* address;

@property char like;
@property(strong,nonatomic) UIImage* img;
@property(strong,nonatomic) NSDate* date;

@property(copy,nonatomic)NSString* error;
@property(copy,nonatomic)UIColor* color;
@end
2 . In the *.m file, override the getTableName function
+(NSString *)getTableName
{
    return @"LKTestTable";
}
3 . In your app's startup code
    LKDBHelper* globalHelper = [LKDBHelper getUsingLKDBHelper];

    //creating the table must be called manually! it will check the version number of the table
    [globalHelper createTableWithModelClass:[LKTest class]];
4 . Initialize your model with data and insert it into the database
    LKTest* test = [[LKTest alloc]init];
    test.name = @"zhan san";
    test.age = 16;

    test.address = foreign; // 'foreign' is an LKTestForeign instance created beforehand

    test.isGirl = YES;
    test.like = 'I';
    test.img = [UIImage imageNamed:@"41.png"];
    test.date = [NSDate date];
    test.color = [UIColor orangeColor];

    [globalHelper insertToDB:test];

5 . select, delete, update, isExists, rowCount ...
    select:

        NSMutableArray* array = [globalHelper search:[LKTest class] where:nil orderBy:nil offset:0 count:100];
        for (NSObject* obj in array) {
            addText(@"%@",[obj printAllPropertys]); // addText is a logging helper from the demo
        }

    delete:

        [globalHelper deleteToDB:test];

    update:

        test.name = "rename";
        [globalHelper updateToDB:test where:nil];

    isExists:

        [globalHelper isExistsModel:test];

    rowCount:

        [globalHelper rowCount:[LKTest class] where:nil];


6 . Description of the "where" parameter
 For example: 
        single:  @"rowid = 1"                         or      @{@"rowid":@1}

        more:    @"rowid = 1 and sex = 0"             or      @{@"rowid":@1,@"sex":@0}

                    when the where clause uses "or", such as @"rowid = 1 or sex = 0",
                    you can only use NSString

        array:   @"rowid in (1,2,3)"                  or      @{@"rowid":@[@1,@2,@3]}

        composite:  @"rowid in (1,2,3) and sex=0 "      or      @{@"rowid":@[@1,@2,@3],@"sex":@0}

        If you need comparison operators, use the NSString form only.
        For example: @"date >= '2013-04-01 00:00:00'"

table mapping

Override the getTableMapping function:
+(NSDictionary *)getTableMapping
{
    // return nil to map every property automatically
    return @{@"name":LKSQLInherit,           // column "name" uses the property of the same name
             @"MyAge":@"age",                // column "MyAge" maps to the "age" property
             @"img":LKSQLInherit,
             @"MyDate":@"date",              // column "MyDate" maps to the "date" property
             @"color":LKSQLInherit,
             @"address":LKSQLUserCalculate}; // value computed via userGet/userSetValueForModel:
}

table update

+(LKTableUpdateType)tableUpdateForOldVersion:(int)oldVersion newVersion:(int)newVersion
{
    switch (oldVersion) {
        case 1:
        {
            // upgrading from version 1: add the new "color" column
            [self tableUpdateAddColumnWithPN:@"color"];
        }
        // deliberate fall-through: a version-1 table also needs the version-2 changes
        case 2:
        {
            // upgrading from version 2: add the "address" column as TEXT
            [self tableUpdateAddColumnWithName:@"address" sqliteType:LKSQLText];
        }
            break;
    }
    return LKTableUpdateTypeCustom;
}

set column attribute

+(void)columnAttributeWithProperty:(LKDBProperty *)property
{
    if([property.sqlColumnName isEqualToString:@"MyAge"])
    {
        property.defaultValue = @"15";
    }
    if([property.propertyName isEqualToString:@"date"])
    {
        property.isUnique = YES;
        property.checkValue = @"MyDate > '2000-01-01 00:00:00'";
        property.length = 30;
    }
}

demo screenshot

table test data

foreign key data


Change-log

Version 1.1 @ 2013-6-20
  • automatic table mapping
  • support optional columns
  • support column attribute settings
  • you can return column content
Version 1.0 @ 2013-5-19
  • rewrote and renamed LKDBHelper
  • property type support: UIColor, NSDate, UIImage, NSData, CGRect, CGSize, CGPoint, int, float, double, NSString, short, char, bool, NSInteger, ...
  • fixed a recursive deadlock
  • rewrote the asynchronous operations
  • thread-safe
  • various bug fixes; optimized the cache to improve performance
  • tests and demos
  • bug fixes, speed improvements
Version 0.0.1 @ 2012-10-1
  • Initial release with LKDAOBase
Download: https://github.com/li6185377/LKDBHelper-SQLite-ORM/archive/master.zip

Wednesday, April 23, 2014

Particle Button

ParticleButton



Download: https://github.com/uiue/ParticleButton/archive/master.zip

Thursday, March 20, 2014

EZ Audio code4app


EZAudio

A simple, intuitive audio framework for iOS and OSX.

Update

Thank you everyone for using EZAudio! Just an update - I'm working on a 1.0.0 production version of EZAudio that will contain a bunch of improvements in the API, feature an EZAudioPlayer, and hooks for the DOUAudioStreamer for visualizing remote streaming audio. To make the next version of EZAudio even better I encourage you all to email me your feedback, feature requests, and experiences using the framework. Thanks!

Features

Awesome Components
I've designed six core components to allow you to immediately get your hands dirty recording, playing, and visualizing audio data. These components simply plug into each other and build on top of the high-performance, low-latency AudioUnits API and give you an easy to use API written in Objective-C instead of pure C.
EZMicrophone
A microphone class that provides its delegate audio data from the default device microphone with one line of code.
EZRecorder
A recorder class that provides a quick and easy way to write audio files from any datasource.
EZAudioFile
An audio file class that reads/seeks through audio files and provides useful delegate callbacks.
EZOutput
An output class that will playback any audio it is provided by its datasource.
EZAudioPlot
A CoreGraphics-based audio waveform plot capable of visualizing any float array as a buffer or rolling plot.
EZAudioPlotGL
An OpenGL-based, GPU-accelerated audio waveform plot capable of visualizing any float array as a buffer or rolling plot.
Cross Platform
EZAudio was designed to work transparently across all iOS and OSX devices. This means one universal API whether you're building for Mac or iOS. For instance, under the hood an EZAudioPlot knows that it will subclass a UIView for iOS or an NSView for OSX and the EZMicrophone knows to build on top of the RemoteIO AudioUnit for iOS, but defaults to the system defaults for input and output for OSX.

Examples & Docs

Within this repo you'll find examples for iOS and OSX to get you up to speed using each component and plugging them into each other. With just a few lines of code you'll be recording from the microphone, generating audio waveforms, and playing audio files like a boss. See the full Getting Started guide for an interactive look into each of the components.

Example Projects

EZAudioCoreGraphicsWaveformExample
Shows how to use the EZMicrophone and EZAudioPlot to visualize the audio data from the microphone in real-time. The waveform can be displayed as a buffer or a rolling waveform plot (traditional waveform look).
EZAudioOpenGLWaveformExample
Shows how to use the EZMicrophone and EZAudioPlotGL to visualize the audio data from the microphone in real-time. The drawing is using OpenGL so it is much faster and like the first example can display a buffer or rolling waveform.
EZAudioPlayFileExample
Shows how to use the EZAudioFile, EZOutput, and EZAudioPlotGL to playback, pause, and seek through an audio file while displaying its waveform as a buffer or a rolling waveform plot.
EZAudioRecordWaveformExample
Shows how to use the EZMicrophone, EZRecorder, and EZAudioPlotGL to record the audio from the microphone input to a file while displaying the audio waveform of the incoming data. You can then playback the newly recorded audio file using AVFoundation and keep adding more audio data to the tail of the file.
EZAudioWaveformFromFileExample
Shows how to use the EZAudioFile and EZAudioPlot to display the audio waveform of an entire audio file.
EZAudioPassThroughExample
Shows how to use the EZMicrophone, EZOutput, and the EZAudioPlotGL to pass the microphone input to the output for playback while displaying the audio waveform (as a buffer or rolling plot) in real-time.
EZAudioFFTExample
Shows how to calculate the real-time FFT of the audio data coming from the EZMicrophone and the Accelerate framework. The audio data is plotted using the EZAudioPlotGL for the time domain plot and the EZAudioPlot for the frequency domain plot.

Documentation

The official documentation for EZAudio can be found here: http://cocoadocs.org/docsets/EZAudio/0.0.3/
You can also generate the docset yourself using appledoc by running appledoc on the EZAudio source folder.

Getting Started

To see the full project page, interactive Getting Started guide, and Documentation go here: http://syedharisali.com/projects/EZAudio/getting-started
To begin using EZAudio you must first make sure you have the proper build requirements and frameworks. Below you'll find explanations of each component and code snippets to show how to use each to perform common tasks like getting microphone data, updating audio waveform plots, reading/seeking through audio files, and performing playback.

Build Requirements

iOS
  • 6.0+
OSX
  • 10.8+

Frameworks

iOS
  • AudioToolbox
  • AVFoundation
  • GLKit
OSX
  • AudioToolbox
  • AudioUnit
  • CoreAudio
  • QuartzCore
  • OpenGL
  • GLKit

Adding To Project

You can add EZAudio to your project in a few ways:

1.) The easiest way to use EZAudio is via Cocoapods. Simply add EZAudio to your Podfile like so:
pod 'EZAudio', '~> 0.0.4'
2.) Alternatively, you could clone or fork this repo and just drag and drop the source into your project.
For more information see main project page: http://syedharisali.com/projects/EZAudio/getting-started

Core Components

EZAudio currently offers four components that encompass a wide range of audio functionality. In addition to the functional aspects of these components, such as pulling audio data, reading/writing from files, and performing playback, they also take special care to hook into the interface components to allow developers to display visual feedback (see the Interface Components below).

EZAudioFile

Provides simple read/seek operations, pulls waveform amplitude data, and provides the EZAudioFileDelegate to notify of any read/seek action occurring on the EZAudioFile.
Relevant Example Projects
  • EZAudioPlayFileExample (iOS)
  • EZAudioPlayFileExample (OSX)
  • EZAudioWaveformFromFileExample (iOS)
  • EZAudioWaveformFromFileExample (OSX)

Opening An Audio File

To open an audio file create a new instance of the EZAudioFile class.
// Declare the EZAudioFile as a strong property
@property (nonatomic,strong) EZAudioFile *audioFile;

...

// Initialize the EZAudioFile instance and assign it a delegate to receive the read/seek callbacks
self.audioFile = [EZAudioFile audioFileWithURL:[NSURL fileURLWithPath:@"/path/to/your/file"] 
                                   andDelegate:self];

Getting Waveform Data

There is a getWaveformDataWithCompletionBlock: method to allow you to easily and asynchronously get the waveform amplitude data that will best represent the whole audio file (it will calculate the best fit, constrained to ~2048 data points).
// Get the waveform data from the audio file asynchronously 
[audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
  // Update the audio plot with the waveform data (use the EZPlotTypeBuffer in this case)
  self.audioPlot.plotType = EZPlotTypeBuffer;
  [self.audioPlot updateBuffer:waveformData withBufferSize:length];
}];

Reading From An Audio File

Reading audio data from a file requires you to create an AudioBufferList to hold the data. The EZAudio utility function, audioBufferList, provides a convenient way to get an allocated AudioBufferList to use. There is also a utility function, freeBufferList:, to use to free (or release) the AudioBufferList when you are done using that audio data.
Note: You have to free the AudioBufferList, even in ARC.
// Allocate a buffer list to hold the file's data
UInt32          frames      = 512;
AudioBufferList *bufferList = [EZAudio audioBufferList];
UInt32          bufferSize; // Read function will populate this value
BOOL            eof;        // Read function will populate this value
// Reads 512 frames from the audio file
[audioFile readFrames:frames
      audioBufferList:bufferList
           bufferSize:&bufferSize
                  eof:&eof];
// Cleanup when done working with audio data (yes, even in ARC)
[EZAudio freeBufferList:bufferList];
When a read occurs the EZAudioFileDelegate receives two events.
An event notifying the delegate of the read audio data as float arrays:
// The EZAudioFile method `readFrames:audioBufferList:bufferSize:eof:` triggers an event notifying the delegate of the read audio data as float arrays
-(void)     audioFile:(EZAudioFile *)audioFile
            readAudio:(float **)buffer
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
  // The audio data from the read as a float buffer. You can feed this into an audio plot!
  dispatch_async(dispatch_get_main_queue(), ^{
    // Update that audio plot!
    [self.audioPlot updateBuffer:buffer[0] withBufferSize:bufferSize];
  });
}
and an event notifying the delegate of the new frame position within the EZAudioFile:
// The EZAudioFile method `readFrames:audioBufferList:bufferSize:eof:` triggers an event notifying the delegate of the new frame position within the file.
-(void)audioFile:(EZAudioFile *)audioFile updatedPosition:(SInt64)framePosition {
  dispatch_async(dispatch_get_main_queue(), ^{
    // Move that slider to this new position!
  });
}

Seeking Through An Audio File

You can seek very easily through an audio file using the EZAudioFile's seekToFrame: method. The EZAudioFile provides a totalFrames method to provide you the total amount of frames in an audio file so you can calculate a proper offset.
// Get the total number of frames for the audio file
SInt64 totalFrames = [self.audioFile totalFrames];
// Seeks halfway through the audio file
[self.audioFile seekToFrame:(totalFrames/2)];
When a seek occurs the EZAudioFileDelegate receives the seek event:
// The EZAudioFile method `seekToFrame:` triggers an event notifying the delegate of the new frame position within the file.
-(void)audioFile:(EZAudioFile *)audioFile updatedPosition:(SInt64)framePosition {
  dispatch_async(dispatch_get_main_queue(), ^{
    // Move that slider to this new position!
  });
}

EZMicrophone

Provides access to the default device microphone in one line of code and provides delegate callbacks to receive the audio data as an AudioBufferList and float arrays.
Relevant Example Projects
  • EZAudioCoreGraphicsWaveformExample (iOS)
  • EZAudioCoreGraphicsWaveformExample (OSX)
  • EZAudioOpenGLWaveformExample (iOS)
  • EZAudioOpenGLWaveformExample (OSX)
  • EZAudioRecordExample (iOS)
  • EZAudioRecordExample (OSX)

Creating A Microphone

Create an EZMicrophone instance by declaring a property and initializing it like so:
// Declare the EZMicrophone as a strong property
@property (nonatomic,strong) EZMicrophone *microphone;

...

// Initialize the microphone instance and assign it a delegate to receive the audio data callbacks
self.microphone = [EZMicrophone microphoneWithDelegate:self];
Alternatively, you could also use the shared EZMicrophone instance and just assign its EZMicrophoneDelegate.
// Assign a delegate to the shared instance of the microphone to receive the audio data callbacks
[EZMicrophone sharedMicrophone].microphoneDelegate = self;

Getting Microphone Data

To tell the microphone to start fetching audio use the startFetchingAudio function.
// Starts fetching audio from the default device microphone and sends data to EZMicrophoneDelegate
[self.microphone startFetchingAudio];
Once the EZMicrophone has started it will send the EZMicrophoneDelegate the audio back in a few ways. An array of float arrays:
/**
 The microphone data represented as float arrays useful for:
    - Creating real-time waveforms using EZAudioPlot or EZAudioPlotGL
    - Creating any number of custom visualizations that utilize audio!
 */
-(void)   microphone:(EZMicrophone *)microphone
    hasAudioReceived:(float **)buffer
      withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels {
  // Getting audio data as an array of float buffer arrays that can be fed into the EZAudioPlot, EZAudioPlotGL, or whatever visualization you would like to do with the microphone data.
  dispatch_async(dispatch_get_main_queue(),^{
    // Visualize this data brah, buffer[0] = left channel, buffer[1] = right channel
    [self.audioPlot updateBuffer:buffer[0] withBufferSize:bufferSize];
  });
}
or the AudioBufferList representation:
/**
 The microphone data represented as CoreAudio's AudioBufferList useful for:
    - Appending data to an audio file via the EZRecorder
    - Playback via the EZOutput

 */
-(void)    microphone:(EZMicrophone *)microphone
        hasBufferList:(AudioBufferList *)bufferList
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
    // Getting audio data as an AudioBufferList that can be directly fed into the EZRecorder or EZOutput. Say whattt...
}

Pausing/Resuming The Microphone

Pause or resume fetching audio at any time like so:
// Stop fetching audio
[self.microphone stopFetchingAudio];

// Resume fetching audio
[self.microphone startFetchingAudio];
Alternatively, you could also toggle the microphoneOn property (safe to use with Cocoa Bindings)
// Stop fetching audio
self.microphone.microphoneOn = NO;

// Start fetching audio
self.microphone.microphoneOn = YES;

EZOutput

Provides flexible playback to the default output device by asking the EZOutputDataSource for audio data to play. Doesn't care where the buffers come from (microphone, audio file, streaming audio, etc). The EZOutputDataSource has three functions that can provide audio data for the output callback. You should implement only ONE of these functions:
// Full override of the audio callback 
-(void)           output:(EZOutput*)output
 callbackWithActionFlags:(AudioUnitRenderActionFlags*)ioActionFlags
             inTimeStamp:(const AudioTimeStamp*)inTimeStamp
             inBusNumber:(UInt32)inBusNumber
          inNumberFrames:(UInt32)inNumberFrames
                  ioData:(AudioBufferList*)ioData;

// Provides the audio callback with a circular buffer holding the audio data
-(TPCircularBuffer*)outputShouldUseCircularBuffer:(EZOutput *)output;

// Provides the audio callback with a buffer list, number of frames, and buffer size to use
-(void)             output:(EZOutput *)output
 shouldFillAudioBufferList:(AudioBufferList *)audioBufferList
        withNumberOfFrames:(UInt32)frames;
Relevant Example Projects
  • EZAudioPlayFileExample (iOS)
  • EZAudioPlayFileExample (OSX)
  • EZAudioPassThroughExample (iOS)
  • EZAudioPassThroughExample (OSX)

Creating An Output

Create an EZOutput by declaring a property and initializing it like so:
// Declare the EZOutput as a strong property
@property (nonatomic,strong) EZOutput *output;

...

// Initialize the EZOutput instance and assign it a delegate to provide the output audio data
self.output = [EZOutput outputWithDataSource:self];
Alternatively, you could also use the shared output instance and just assign it an EZOutputDataSource. This is the preferred way to use the EZOutput (usually just have one per app).
// Assign a delegate to the shared instance of the output to provide the output audio data
[EZOutput sharedOutput].outputDataSource = self;

Playback Using An AudioBufferList

One method to play back audio is to provide an AudioBufferList (for instance, reading from an EZAudioFile):
// Use the AudioBufferList datasource method to read from an EZAudioFile
-(void)             output:(EZOutput *)output
 shouldFillAudioBufferList:(AudioBufferList *)audioBufferList
        withNumberOfFrames:(UInt32)frames
{
  if( self.audioFile )
  {
    UInt32 bufferSize;
    [self.audioFile readFrames:frames
               audioBufferList:audioBufferList
                    bufferSize:&bufferSize
                           eof:&_eof];
    if( _eof )
    {
      [self seekToFrame:0];
    }
  }
}

Playback Using A Circular Buffer

Another method is to provide a circular buffer via Michael Tyson's (who, btw is a serious badass and also wrote the Amazing Audio Engine for iOS) TPCircularBuffer containing the data. For instance, for passing the microphone input to the output for a basic passthrough:
// Declare circular buffer as global
TPCircularBuffer circularBuffer;
...
// Using an EZMicrophone, append the AudioBufferList from the microphone callback to the global circular buffer
-(void)    microphone:(EZMicrophone *)microphone
        hasBufferList:(AudioBufferList *)bufferList
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
  /**
   Append the audio data to a circular buffer
   */
  [EZAudio appendDataToCircularBuffer:&circularBuffer
                  fromAudioBufferList:bufferList];
}
// Pass the circular buffer to the EZOutputDataSource using the circular buffer callback
-(TPCircularBuffer *)outputShouldUseCircularBuffer:(EZOutput *)output {
  return &circularBuffer;
}

Playback By Manual Override

And the last method is to completely override the output callback method and populate the AudioBufferList however you can imagine:
// Completely override the output callback function
-(void)           output:(EZOutput *)output
 callbackWithActionFlags:(AudioUnitRenderActionFlags *)ioActionFlags
             inTimeStamp:(const AudioTimeStamp *)inTimeStamp
             inBusNumber:(UInt32)inBusNumber
          inNumberFrames:(UInt32)inNumberFrames
                  ioData:(AudioBufferList *)ioData {
 // Fill the ioData with your audio data from anywhere
}

EZRecorder

Provides a way to record any audio source to an audio file. This hooks into the other components quite nicely to do something like plot the audio waveform while recording to give visual feedback as to what is happening.
Relevant Example Projects
  • EZAudioRecordExample (iOS)
  • EZAudioRecordExample (OSX)

Creating A Recorder

To create an EZRecorder you must start with an AudioStreamBasicDescription, which is just a CoreAudio structure representing the audio format of a file. The EZMicrophone and EZAudioFile both provide the AudioStreamBasicDescription as properties (for the EZAudioFile use the clientFormat property) that you can use when initializing the EZRecorder.
// Declare the EZRecorder as a strong property
@property (nonatomic,strong) EZRecorder *recorder;

...

// Here's how we would initialize the recorder for an EZMicrophone instance
self.recorder = [EZRecorder recorderWithDestinationURL:[NSURL fileURLWithPath:@"path/to/file.caf"]
                                       andSourceFormat:microphone.audioStreamBasicDescription];

// Here's how we would initialize the recorder for an EZAudioFile instance
self.recorder = [EZRecorder recorderWithDestinationURL:[NSURL fileURLWithPath:@"path/to/file.caf"]
                                       andSourceFormat:audioFile.clientFormat];

Recording Some Audio

Once you've initialized your EZRecorder you can append data by passing in an AudioBufferList and its buffer size like so:
// Append the microphone data coming as a AudioBufferList with the specified buffer size to the recorder
-(void)    microphone:(EZMicrophone *)microphone
        hasBufferList:(AudioBufferList *)bufferList
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
  // Getting audio data as a buffer list that can be directly fed into the EZRecorder. This is happening on the audio thread - any UI updating needs a GCD main queue block.
  if( self.isRecording ){
    [self.recorder appendDataFromBufferList:bufferList
                             withBufferSize:bufferSize];
  } 
}

Interface Components

EZAudio currently offers two drop-in audio waveform components that help simplify the process of visualizing audio.

EZAudioPlot

Provides an audio waveform plot that uses CoreGraphics to perform the drawing. On iOS this is a subclass of UIView while on OSX this is a subclass of NSView. Best used on OSX as the drawing falls on the CPU and needs to redisplay after every audio data update, but useful in iOS apps for displaying full, static waveforms.
Relevant Example Projects
  • EZAudioCoreGraphicsWaveformExample (iOS)
  • EZAudioCoreGraphicsWaveformExample (OSX)

Creating An Audio Plot

You can create an audio plot in the interface builder by dragging a UIView on iOS or an NSView on OSX onto your content area. Then change the custom class of the UIView/NSView to EZAudioPlot. See the full Getting Started page for how to: http://syedharisali.com/projects/EZAudio/getting-started
Alternatively, you could create the audio plot programmatically:
// Programmatically create an audio plot
EZAudioPlot *audioPlot = [[EZAudioPlot alloc] initWithFrame:self.view.frame];
[self.view addSubview:audioPlot];

Customizing The Audio Plot

All plots offer the ability to change the background color, waveform color, plot type (buffer or rolling), toggle between filled and stroked, and toggle between mirrored and unmirrored (about the x-axis). For iOS colors are of the type UIColor while on OSX colors are of the type NSColor.
// Background color (use UIColor for iOS)
audioPlot.backgroundColor = [NSColor colorWithCalibratedRed:0.816 
                                                      green:0.349 
                                                       blue:0.255 
                                                      alpha:1];
// Waveform color (use UIColor for iOS)
audioPlot.color = [NSColor colorWithCalibratedRed:1.000 
                                            green:1.000 
                                             blue:1.000
                                            alpha:1];
// Plot type
audioPlot.plotType     = EZPlotTypeBuffer;
// Fill
audioPlot.shouldFill   = YES;
// Mirror
audioPlot.shouldMirror = YES;

Updating The Audio Plot

All plots have only one update function, updateBuffer:withBufferSize:, which expects a float array and its length.
// The microphone component provides audio data to its delegate as an array of float buffer arrays.
-(void)    microphone:(EZMicrophone *)microphone
     hasAudioReceived:(float **)buffer
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
  /** 
   Update the audio plot using the float array provided by the microphone:
     buffer[0] = left channel
     buffer[1] = right channel
   Note: Audio updates happen asynchronously, so we need to make
         sure to update the plot on the main thread
   */
  dispatch_async(dispatch_get_main_queue(),^{
    [self.audioPlot updateBuffer:buffer[0] withBufferSize:bufferSize];
  });
}

EZAudioPlotGL

Provides an audio waveform plot that uses OpenGL to perform the drawing. The API for this class is exactly the same as that of the EZAudioPlot above. On iOS this is a subclass of the EZPlot and uses an embedded GLKViewController to perform the OpenGL drawing, while on OSX this is a subclass of the NSOpenGLView. In most cases this is the plot you want to use: it's GPU-accelerated, has a low memory footprint, and performs amazingly on all devices.
Relevant Example Projects
  • EZAudioOpenGLWaveformExample (iOS)
  • EZAudioOpenGLWaveformExample (OSX)

Creating An OpenGL Audio Plot

You can create an audio plot in the interface builder by dragging a UIView on iOS or an NSOpenGLView on OSX onto your content area. Then change the custom class of the UIView/NSOpenGLView to EZAudioPlotGL. See the full Getting Started page for how to: http://syedharisali.com/projects/EZAudio/getting-started
Alternatively, you could create the EZAudioPlotGL programmatically:
// Programmatically create an audio plot
EZAudioPlotGL *audioPlotGL = [[EZAudioPlotGL alloc] initWithFrame:self.view.frame];
[self.view addSubview:audioPlotGL];

Customizing The OpenGL Audio Plot

All plots offer the ability to change the background color, waveform color, plot type (buffer or rolling), toggle between filled and stroked, and toggle between mirrored and unmirrored (about the x-axis). For iOS colors are of the type UIColor while on OSX colors are of the type NSColor.
// Background color (use UIColor for iOS)
audioPlotGL.backgroundColor = [NSColor colorWithCalibratedRed:0.816 
                                                        green:0.349 
                                                         blue:0.255 
                                                        alpha:1];
// Waveform color (use UIColor for iOS)
audioPlotGL.color = [NSColor colorWithCalibratedRed:1.000 
                                              green:1.000 
                                               blue:1.000
                                              alpha:1];
// Plot type
audioPlotGL.plotType     = EZPlotTypeBuffer;
// Fill
audioPlotGL.shouldFill   = YES;
// Mirror
audioPlotGL.shouldMirror = YES;

Updating The OpenGL Audio Plot

All plots have only one update function, updateBuffer:withBufferSize:, which expects a float array and its length.
// The microphone component provides audio data to its delegate as an array of float buffer arrays.
-(void)    microphone:(EZMicrophone *)microphone
     hasAudioReceived:(float **)buffer
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
  /** 
   Update the audio plot using the float array provided by the microphone:
     buffer[0] = left channel
     buffer[1] = right channel
   Note: Audio updates happen asynchronously, so we need to make
         sure to update the plot on the main thread
   */
  dispatch_async(dispatch_get_main_queue(),^{
    [self.audioPlotGL updateBuffer:buffer[0] withBufferSize:bufferSize];
  });
}
Download: https://github.com/syedhali/EZAudio/archive/master.zip

Wednesday, March 19, 2014

if eng News Order Demo

Introduction:
     Implements the channel-subscription interface and behavior of Phoenix News (ifeng): tap a channel to move it into or out of the subscription bar, and the remaining channels automatically rearrange. The Model array is archived to the app's local Library folder. The code is written without ARC.
Test environment:
     Compiled and tested by [Code4App]. Test environment: Xcode 5.0, iOS 6.0 and above.
Renderings:


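The archiving mentioned above can be done with the standard NSKeyedArchiver API. A minimal sketch; the array name and file name are illustrative assumptions, and the model class must adopt NSCoding:

// Archive the Model array to the app's Library folder
NSString *libraryDir = [NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [libraryDir stringByAppendingPathComponent:@"channels.archive"]; // hypothetical file name
[NSKeyedArchiver archiveRootObject:channelModels toFile:path]; // channelModels: the NSArray of Models

// Restore it on the next launch
NSArray *restored = [NSKeyedUnarchiver unarchiveObjectWithFile:path];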
Download: https://github.com/ppt04025/ifengNewsOrderDemo/archive/New.zip

Sunday, March 16, 2014

Web View Javascript Bridge

An iOS/OSX bridge for sending messages between Obj-C and JavaScript in UIWebViews/WebViews.
If you like WebViewJavascriptBridge you may also want to check out WebViewProxy.

In the Wild

WebViewJavascriptBridge is used by a range of companies and projects. This list is incomplete, but feel free to add yours and send a PR.
  • Facebook Messenger
  • Facebook Paper
  • Yardsale
  • EverTrue
  • Game Insight
  • Altralogica
  • Sush.io
  • Flutterby Labs
  • JD Media's 鼎盛中华
  • Dojo4's Imbed

Setup & Examples (iOS & OSX)

Start with the Example Apps/ folder. Open either the iOS or OSX project and hit run to see it in action.
To use a WebViewJavascriptBridge in your own project:
1) Drag the WebViewJavascriptBridge folder into your project.
  • In the dialog that appears, uncheck "Copy items into destination group's folder" and select "Create groups for any folders"
2) Import the header file:
#import "WebViewJavascriptBridge.h"
3) Instantiate WebViewJavascriptBridge with a UIWebView (iOS) or WebView (OSX):
WebViewJavascriptBridge* bridge = [WebViewJavascriptBridge bridgeForWebView:webView handler:^(id data, WVJBResponseCallback responseCallback) {
    NSLog(@"Received message from javascript: %@", data);
    responseCallback(@"Right back atcha");
}];
4) Go ahead and send some messages from ObjC to javascript:
[bridge send:@"Well hello there"];
[bridge send:[NSDictionary dictionaryWithObject:@"Foo" forKey:@"Bar"]];
[bridge send:@"Give me a response, will you?" responseCallback:^(id responseData) {
    NSLog(@"ObjC got its response! %@ %@", responseData);
}];
5) Finally, set up the javascript side:
function connectWebViewJavascriptBridge(callback) {
    if (window.WebViewJavascriptBridge) {
        callback(WebViewJavascriptBridge)
    } else {
        document.addEventListener('WebViewJavascriptBridgeReady', function() {
            callback(WebViewJavascriptBridge)
        }, false)
    }
}

connectWebViewJavascriptBridge(function(bridge) {

    /* Init your app here */

    bridge.init(function(message, responseCallback) {
        alert('Received message: ' + message)   
        if (responseCallback) {
            responseCallback("Right back atcha")
        }
    })
    bridge.send('Hello from the javascript')
    bridge.send('Please respond to this', function responseCallback(responseData) {
        console.log("Javascript got its response", responseData)
    })
})

Contributors & Forks

Contributors: https://github.com/marcuswestin/WebViewJavascriptBridge/graphs/contributors
Forks: https://github.com/marcuswestin/WebViewJavascriptBridge/network/members

API Reference

ObjC API

[WebViewJavascriptBridge bridgeForWebView:(UIWebView/WebView*)webview handler:(WVJBHandler)handler]
[WebViewJavascriptBridge bridgeForWebView:(UIWebView/WebView*)webview webViewDelegate:(UIWebViewDelegate*)webViewDelegate handler:(WVJBHandler)handler]
Create a javascript bridge for the given web view.
The WVJBResponseCallback will not be nil if the javascript expects a response.
Optionally, pass in webViewDelegate:(UIWebViewDelegate*)webViewDelegate if you need to respond to the web view's lifecycle events.
Example:
[WebViewJavascriptBridge bridgeForWebView:webView handler:^(id data, WVJBResponseCallback responseCallback) {
    NSLog(@"Received message from javascript: %@", data);
    if (responseCallback) {
        responseCallback(@"Right back atcha");
    }
}]

[WebViewJavascriptBridge bridgeForWebView:webView webViewDelegate:self handler:^(id data, WVJBResponseCallback responseCallback) { /* ... */ }];
[bridge send:(id)data]
[bridge send:(id)data responseCallback:(WVJBResponseCallback)responseCallback]
Send a message to javascript. Optionally expect a response by giving a responseCallback block.
Example:
[bridge send:@"Hi"];
[bridge send:[NSDictionary dictionaryWithObject:@"Foo" forKey:@"Bar"]];
[bridge send:@"I expect a response!" responseCallback:^(id responseData) {
    NSLog(@"Got response! %@", responseData);
}];
[bridge registerHandler:(NSString*)handlerName handler:(WVJBHandler)handler]
Register a handler called handlerName. The javascript can then call this handler with WebViewJavascriptBridge.callHandler("handlerName").
Example:
[bridge registerHandler:@"getScreenHeight" handler:^(id data, WVJBResponseCallback responseCallback) {
    responseCallback([NSNumber numberWithInt:[UIScreen mainScreen].bounds.size.height]);
}];
[bridge callHandler:(NSString*)handlerName data:(id)data]
[bridge callHandler:(NSString*)handlerName data:(id)data responseCallback:(WVJBResponseCallback)callback]
Call the javascript handler called handlerName. Optionally expect a response by giving a responseCallback block.
Example:
[bridge callHandler:@"showAlert" data:@"Hi from ObjC to JS!"];
[bridge callHandler:@"getCurrentPageUrl" data:nil responseCallback:^(id responseData) {
    NSLog(@"Current UIWebView page URL is: %@", responseData);
}];

Javascript API

document.addEventListener('WebViewJavascriptBridgeReady', function onBridgeReady(event) { ... }, false)
Always wait for the WebViewJavascriptBridgeReady DOM event.
Example:
document.addEventListener('WebViewJavascriptBridgeReady', function(event) {
    var bridge = event.bridge
    // Start using the bridge
}, false)
bridge.init(function messageHandler(data, response) { ... })
Initialize the bridge. This should be called inside of the 'WebViewJavascriptBridgeReady' event handler.
The messageHandler function will receive all messages sent from ObjC via [bridge send:(id)data] and [bridge send:(id)data responseCallback:(WVJBResponseCallback)responseCallback].
The response object will be defined if ObjC sent the message with a WVJBResponseCallback block.
Example:
bridge.init(function(data, responseCallback) {
    alert("Got data " + JSON.stringify(data))
    if (responseCallback) {
        responseCallback("Right back atcha!")
    }
})
bridge.send("Hi there!")
bridge.send({ Foo:"Bar" })
bridge.send(data, function responseCallback(responseData) { ... })
Send a message to ObjC. Optionally expect a response by giving a responseCallback function.
Example:
bridge.send("Hi there!")
bridge.send("Hi there!", function(responseData) {
    alert("I got a response! "+JSON.stringify(responseData))
})
bridge.registerHandler("handlerName", function(responseData) { ... })
Register a handler called handlerName. The ObjC can then call this handler with [bridge callHandler:"handlerName" data:@"Foo"] and [bridge callHandler:"handlerName" data:@"Foo" responseCallback:^(id responseData) { ... }]
Example:
bridge.registerHandler("showAlert", function(data) { alert(data) })
bridge.registerHandler("getCurrentPageUrl", function(data, responseCallback) {
    responseCallback(document.location.toString())
})

iOS4 support (with JSONKit)

Note: iOS4 support has not yet been tested in v2+.
WebViewJavascriptBridge uses NSJSONSerialization by default. If you need iOS 4 support then you can use JSONKit, and add USE_JSONKIT to the preprocessor macros for your project.

Download: https://github.com/marcuswestin/WebViewJavascriptBridge/archive/master.zip

Tuesday, March 4, 2014

UI Text Field-Shake

A UITextField category that adds a shake animation like the password field of the OS X login screen.

Screenshot

UITextField+Shake

Setup with Cocoapods

  • Add pod 'UITextField+Shake' to your Podfile
  • Run pod install
  • Run open App.xcworkspace
  • Import UITextField+Shake.h in your controller's header file

Usage

// Shake with the default speed
[self.textField shake:10   // 10 times
            withDelta:5    // 5 points wide
];

// Shake with a custom speed
[self.textField shake:10   // 10 times
            withDelta:5    // 5 points wide
             andSpeed:0.03 // 30ms per shake
];

// Shake with a custom speed and direction
[self.textField shake:10   // 10 times
            withDelta:5    // 5 points wide
             andSpeed:0.03 // 30ms per shake
       shakeDirection:ShakeDirectionVertical
];
Download: https://github.com/andreamazz/UITextField-Shake/archive/master.zip

Sunday, March 2, 2014

SKS Table View

SKSTableView is a custom table view class extended from UITableView. It provides a single-level hierarchical structure (an expandable table view) for your content. To keep the table view efficient, the default insertion and removal mechanisms of UITableView (insertRowsAtIndexPaths:withRowAnimation: and deleteRowsAtIndexPaths:withRowAnimation:) are used. The main rows of your table view, whether expandable or not, must be instances of the SKSTableViewCell class. Subrows can be instances of any class extended from UITableViewCell, or of UITableViewCell itself.
To use SKSTableView, just drag & drop the 'SKSTableViewImp' folder into your project folder.
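Based on the description above, main rows are served from the usual data source callback as SKSTableViewCell instances, while subrows come from a separate callback. A hypothetical sketch; the subrow method name below is an assumption inferred from the description, so check SKSTableView.h for the exact protocol:

// Main rows, whether expandable or not, must be SKSTableViewCell instances
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    SKSTableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"MainCell"];
    if (!cell) {
        cell = [[SKSTableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                       reuseIdentifier:@"MainCell"];
    }
    cell.textLabel.text = [NSString stringWithFormat:@"Row %d", (int)indexPath.row + 1];
    return cell;
}

// Subrows can be plain UITableViewCells (this delegate method name is an assumption)
- (UITableViewCell *)tableView:(SKSTableView *)tableView cellForSubRowAtIndexPath:(NSIndexPath *)indexPath
{
    UITableViewCell *cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                                   reuseIdentifier:nil];
    cell.textLabel.text = @"Subrow";
    return cell;
}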
Sample screenshot:

Download: https://github.com/sakkaras/SKSTableView/archive/master.zip

Friday, February 28, 2014

RK Tab View

RKTabView - an easily applicable toolbar/tab bar component for iOS

RKTabView provides the opportunity to create toolbars with customizable behavior, functions, and appearance. Standard iOS components such as UITabBar and UIToolbar are not always as customizable as needed and always behave in a fixed way. Unlike them, this component can be installed anywhere, anytime, and, most importantly, the appearance and behavior of each element can differ. Imagine that part of a bar should work as a UITabBar (one is on, the others are off), the elements of a second part should work as switches (any element can be turned on or off independently), and a third part should be ordinary buttons. All of this can be done with RKTabView.

Features

Elements and behavior

  • element creation and behavior customization
    Create elements of 3 types:
    • Excludable element (same as UITabBar elements: one is on, the others are off)
    • Unexcludable element (can be turned on or off independently)
    • Button element (works as an ordinary button)
      To do this, use one of the 3 initialization methods (see the sketch below this list). Each initialization method requires element images for the "enabled" and "disabled" states; the "button element" initialization method additionally requires a target and selector.
  • put elements into the tab view easily:
    No need to care about element size, location, or other appearance details. Passing an array of elements to the tab view is enough.
  • tab view delegate:
    Interaction is implemented via a delegate. A delegate should be specified; the delegate methods tell which element (by index) changed state.
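A quick sketch combining all three element types in one bar, using the initialization methods documented in the Usage section below (the image names and the selector are illustrative assumptions):

RKTabItem *tab = [RKTabItem createUsualItemWithImageEnabled:[UIImage imageNamed:@"tab_on"]
                                              imageDisabled:[UIImage imageNamed:@"tab_off"]];
RKTabItem *toggle = [RKTabItem createUnexcludableItemWithImageEnabled:[UIImage imageNamed:@"toggle_on"]
                                                        imageDisabled:[UIImage imageNamed:@"toggle_off"]];
RKTabItem *button = [RKTabItem createButtonItemWithImage:[UIImage imageNamed:@"action"]
                                                  target:self
                                                selector:@selector(actionTapped)]; // hypothetical action

// The bar lays the three items out automatically; no per-element frames needed
RKTabView *tabView = [[RKTabView alloc] initWithFrame:CGRectMake(0, 0, 320, 44)
                                          andTabItems:@[tab, toggle, button]];
tabView.delegate = self; // see "Step 3 - Implement delegate methods"
[self.view addSubview:tabView];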

Appearance

  • customizable view (RKTabView is subclassed from UIView)
  • bar items automatic location:
    The bar area is divided equally between elements. All elements have the same height as the tab view.
  • autoresize:
    All elements are set up to adjust their sizes.
  • separator lines:
    An option draws top and bottom separator lines to separate the content.
  • customizable horizontal insets:
    The distance between the edges and the content can be specified.
  • marking elements:
    An option marks "enabled" elements with a darker background.
  • selected background color:
    An option specifies the background color applied to an element when it enters the "enabled" state. It can be set for the whole tab view or for separate elements.
  • elements background color:
    Any element can have its own background color.
  • elements content:
    The element image is usually centered and keeps its original size.

Titles

  • titled elements:
    A title can be specified for any element.
  • title font:
    The title font can be specified for the whole tab view or for separate elements.
  • title color:
    The title font color can be specified for the whole tab view or for separate elements.

Adding to project

Adding as pod

CocoaPods is the recommended way to use RKTabView in your project.
  • Simply add this line to your Podfile: pod 'RKTabView', '~> 1.0.0'
  • Run pod install.
  • Include with #import <RKTabView.h> to use it wherever you need.

Adding manually

  • Add RKTabView and RKTabItem .h .m files to your project (4 files total).
  • Include RKTabView.h (#import "RKTabView.h").

Usage

Step 1 - Create tab items

There are 3 initialization methods for tab items. Each method used for specialized tab item type.
Create standard element (one is on - others are off) with 'createUsualItemWithImageEnabled:imageDisabled:' class method:
RKTabItem *tabItem = [RKTabItem createUsualItemWithImageEnabled:(UIImage *) imageDisabled:(UIImage *)];
Create independent element (can be turned on or off independently) with 'createUnexcludableItemWithImageEnabled:imageDisabled:' class method:
RKTabItem *tabItem = [RKTabItem createUnexcludableItemWithImageEnabled:(UIImage *) imageDisabled:(UIImage *)];
Disabled and enabled images should be specified. If the enabled image is nil, the disabled image will be used instead.
To make an item enabled from the start, set its tabState property to TabStateEnabled.
Create button element with createButtonItemWithImage:target:selector: class method:
RKTabItem *tabItem = [RKTabItem createButtonItemWithImage:(UIImage *) target:(id) selector:(SEL)];
A target and selector should be specified. This item works as an ordinary UIButton.
Items should be collected into an array and passed to the tabItems property of RKTabView. See below.

Step 2 - Create tab view

Initialize RKTabView using 'initWithFrame:andTabItems:' method:
RKTabView *tabView = [[RKTabView alloc] initWithFrame:(CGRect) andTabItems:(NSArray *)];
You can use standard 'initWithFrame:' method and pass tabItems array later.
RKTabView *tabView = [[RKTabView alloc] initWithFrame:(CGRect)];
tabView.tabItems = @[item1, item2, item3, ...];
To display the tab view, add it to your view as a subview:
[self.view addSubview:tabView];
The tab view can also be created in Interface Builder; then you just have to pass the items array to it.

Step 3 - Implement delegate methods

Your delegate should conform to 'RKTabViewDelegate' protocol and have these methods implemented:
- (void)tabView:(RKTabView *)tabView tabBecameEnabledAtIndex:(int)index tab:(RKTabItem *)tabItem; 
- (void)tabView:(RKTabView *)tabView tabBecameDisabledAtIndex:(int)index tab:(RKTabItem *)tabItem;
Delegate methods do not apply to "button" items. The first method is called for standard and unexcludable items; the second only for unexcludable items.
After the tab view is created, specify the delegate:
RKTabView *tabView = [[RKTabView alloc] initWithFrame:CGRectMake(0,0,320,44)];  
tabView.delegate = self;

Customization

  • customize horizontal insets
    To set the distance between the tab view content and its edges, set the horizontalInsets property of RKTabView:
    tabView.horizontalInsets = HorizontalEdgeInsetsMake(70, 70);
  • draw separator lines at top and bottom
    Set the drawSeparators property to YES to enable this option:
    tabView.drawSeparators = YES;
  • mark enabled elements with darker background
    Set the darkensBackgroundForEnabledTabs property to YES to enable this option:
    tabView.darkensBackgroundForEnabledTabs = YES;
  • enabled background color
    The background color for the "enabled" state can be specified for the whole tab view or for any separate element.
    If you want the same "enabled" color for all elements, set the enabledTabBackgrondColor property of the tab view:
    tabView.enabledTabBackgrondColor = [UIColor redColor];
    
    If you want an element to have an individual "enabled" background color, set the enabledBackgroundColor property of the tab item:
    tabItem.enabledBackgroundColor = [UIColor blueColor];
    
    Note: a tabItem ignores the tabView's "enabled" background color and uses its own color if one is specified.
  • elements background color
    To set an element's background color, set its backgroundColor property:
    tabItem.backgroundColor = [UIColor redColor];
    
  • title
    To set a title for an element, set the titleString property of the tab item:
    tabItem.titleString = @"Title";
    
    The title font and title font color can be specified both for the tab view and for a tab item, similarly to the "enabled" background color. If the tab item's properties are specified, the tabView's properties are ignored:
    tabView.titlesFont = [UIFont systemFontOfSize:9];
    
    tabView.titlesFontColor = [UIColor darkTextColor];
    
    or
    tabItem.titleFont = [UIFont systemFontOfSize:9];
    
    tabItem.titleFontColor = [UIColor darkTextColor];  
Download: https://github.com/RafaelKayumov/RKTabView/archive/master.zip