aregee

Member Since 06 Oct 2009
Online Last Active Today, 05:15 AM
-----

Topics I've Started

Trying to reduce the massive amount of variables in my struct

16 July 2014 - 09:42 AM

I am adding more music formats to my audio player.  This time I am adding support for the ancient Amiga module format, which was used by many music trackers under a variety of names.  While I have not yet added every effect used by the "format", it is playing my old music surprisingly well.

 

My question is about my structs.

 

I have organised the module (the song) in structs in a similar way to this:

#include <stdint.h>

//ModuleSample and ModuleChannel are defined elsewhere
typedef struct ModuleSample ModuleSample;
typedef struct ModuleChannel ModuleChannel;

typedef struct AmigaModule {
    uint8_t *ModuleName;
    uint8_t SampleCount;
    ModuleSample *SampleList;
    uint8_t SongPositions;
    uint8_t RestartPosition;
    uint8_t PatternTable[128];
    uint8_t MaxPattern;
    uint8_t ChannelCount;
    ModuleChannel *ChannelList;
} AmigaModule;

Where ModuleSample is a struct for each sample, and ModuleChannel is a struct for each channel in the song, with more structs to describe the pattern data for each channel, but this is not really relevant to my question.

 

ModuleChannel has grown massively big, with lots of arbitrary variables for each effect that track the state of the song while it is playing, so I have made an effort to refactor my code into a better structure.  I have now moved all the state variables into their own struct.

 

It looks something like this:

typedef enum ModuleChannelEffect {
    EFFECT_NONE,
    EFFECT_ARPEGGIO,
    EFFECT_PORTAMENTO,
    EFFECT_VIBRATO,
    EFFECT_TREMOLO,
    EFFECT_VOLUME_SLIDE,
    EFFECT_RETRIGGER_SAMPLE
} ModuleChannelEffect;

typedef enum ModuleChannelEffectSpecifier {
    EFFECT_SPECIFIER_NONE,
    EFFECT_SPECIFIER_PORTAMENTO_TO_NOTE,
    EFFECT_SPECIFIER_PORTAMENTO_UP,
    EFFECT_SPECIFIER_PORTAMENTO_DOWN,
    EFFECT_SPECIFIER_VOLUME_SLIDE_UP,
    EFFECT_SPECIFIER_VOLUME_SLIDE_DOWN
    //And more coming later...
} ModuleChannelEffectSpecifier;

typedef struct ModuleEffect {
    ModuleChannelEffect primaryEffect;
    ModuleChannelEffectSpecifier primaryEffectSpecifier;
    uint16_t parameter1;
    uint16_t parameter2;
    uint16_t parameter3;
} ModuleEffect;

And here is my issue:

See the three lines 'parameter1', 'parameter2', 'parameter3'?

 

In code, it looks like this:

currentChannel->effect.parameter1 = periodsTable[translatedPeriod];
currentChannel->effect.parameter2 = periodsTable[arpNoteIndex1];
currentChannel->effect.parameter3 = periodsTable[arpNoteIndex2];

And when I use those values:

currentChannel->period = currentChannel->effect.parameter1;

I like two things:

 

1. Code that is self-explanatory.

2. Generalisation, as long as it is not over-generalisation.

 

My problem is that 'parameter1' doesn't tell me anything, really.  In the portamento effect, is 'parameter1' the speed or the note I am sliding towards?  I could name the values for what they are, but then I would be back to the problem of too many variables inside the struct, because I would need lots of variables I don't always use.

 

Then it came to me that I could do something like this:

#define ArpeggioValue1 parameter1
#define ArpeggioValue2 parameter2
#define ArpeggioValue3 parameter3

Now I can reference the variable names in a more sensible and descriptive way:

currentChannel->effect.ArpeggioValue1 = periodsTable[translatedPeriod];
currentChannel->effect.ArpeggioValue2 = periodsTable[arpNoteIndex1];
currentChannel->effect.ArpeggioValue3 = periodsTable[arpNoteIndex2];
And:
 
currentChannel->period = currentChannel->effect.ArpeggioValue1;

My question is whether this is an acceptable coding practice, or is there a better way of doing this?  Do you have any other suggestions?  The point here is really that I want to learn and get better.  What I am doing now works, but how would you go about it?  Massive objects to handle this?  Something else?  What is the "best" way?
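For comparison, one common alternative to the #define aliases is a tagged union: each effect gets its own small struct with descriptive field names, and all of them share the same storage.  This is only a sketch under my own naming assumptions (the per-effect structs and field names here are hypothetical, not from your code):

```c
#include <assert.h>
#include <stdint.h>

typedef enum ModuleChannelEffect {
    EFFECT_NONE,
    EFFECT_ARPEGGIO,
    EFFECT_PORTAMENTO
} ModuleChannelEffect;

/* The effect tag says which union member is currently valid. */
typedef struct ModuleEffect {
    ModuleChannelEffect primaryEffect;
    union {
        struct {            /* valid when primaryEffect == EFFECT_ARPEGGIO */
            uint16_t basePeriod;
            uint16_t notePeriod1;
            uint16_t notePeriod2;
        } arpeggio;
        struct {            /* valid when primaryEffect == EFFECT_PORTAMENTO */
            uint16_t targetPeriod;
            uint16_t speed;
        } portamento;
    } params;
} ModuleEffect;
```

The writes then become self-documenting, e.g. effect.params.arpeggio.basePeriod = periodsTable[translatedPeriod], and the struct stays exactly as large as its biggest effect instead of growing one field per new effect.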

 


Endianness

06 April 2014 - 07:18 PM

Coming from a big endian platform in the past, the good old Amiga computers with the Motorola 680x0 processor series, I always found the seemingly backward little endian of Intel x86 weird.  I always thought that the choice of endianness was an arbitrary choice, a weird one, and according to Wikipedia it indeed is, but today I was thinking about some code I was writing, and wondering "why does this actually work?"
 
First, let me show you the code I am talking about:
 
NSInputStream *iStream = [[NSInputStream alloc] initWithFileAtPath:@"<somefile>"];
[iStream open];

uint64_t value = 0;

[iStream read:(uint8_t *)&value maxLength:1];
uint64_t myValue1 = value;

[iStream read:(uint8_t *)&value maxLength:2];
uint64_t myValue2 = value;

[iStream read:(uint8_t *)&value maxLength:4];
uint64_t myValue3 = value;

[iStream read:(uint8_t *)&value maxLength:8];
uint64_t myValue4 = value;
This works well on little endian platforms.
 
Hint: Look at where in the uint64_t I am reading the smaller values into and how I am storing the values afterwards.
 
If you were to do this on a big endian platform, you would need to write something like this (to my understanding).  Note that the cast to uint8_t * must come before the pointer arithmetic, so the offset is counted in bytes rather than in uint64_t units:
 
[iStream read:((uint8_t *)&value + 7) maxLength:1];
//store the value
[iStream read:((uint8_t *)&value + 6) maxLength:2];
//store the value
[iStream read:((uint8_t *)&value + 4) maxLength:4];
//store the value
[iStream read:(uint8_t *)&value maxLength:8];
//store the value
 
Unless there is some magic voodoo going on here that I don't understand, I have found one good reason for choosing little endian on an architecture, and to me it is not so arbitrary any more, even though it might really be...
 
Little endian, I understand your existence a little bit better now... ;)
 
EDIT:
 
Oh yes, wikipedia also mentions this realisation:
 
"The little-endian system has the property that the same value can be read from memory at different lengths without using different addresses (even when alignment restrictions are imposed). For example, a 32-bit memory location with content 4A 00 00 00 can be read at the same address as either 8-bit (value = 4A), 16-bit (004A), 24-bit (00004A), or 32-bit (0000004A), all of which retain the same numeric value. Although this little-endian property is rarely used directly by high-level programmers, it is often employed by code optimizers as well as by assembly language programmers."
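Incidentally, the usual way to sidestep host endianness entirely is to assemble values arithmetically from individual bytes instead of overlaying memory.  A minimal C sketch, assuming a plain byte buffer rather than NSInputStream:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Assemble a little-endian value of 'length' bytes (1..8) from a buffer.
   This gives the same result on little- and big-endian hosts, because it
   never reinterprets memory - it builds the value with shifts. */
static uint64_t read_le(const uint8_t *buf, size_t length) {
    uint64_t value = 0;
    for (size_t i = 0; i < length; i++) {
        value |= (uint64_t)buf[i] << (8 * i);
    }
    return value;
}
```

With the Wikipedia example above, the bytes 4A 00 00 00 read as 0x4A at any of the four lengths.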

Optimisation of my audio player application (Mac)

03 April 2014 - 09:12 PM

I am working on an audio player application for Mac, and have written play routines for both wav and flac files from scratch; soon I will start working on an mp3 decoder too.  While this is working great, I am trying to squeeze as much responsiveness and performance as I can out of the application.

 

My goal is to make a super responsive audio player that will later become part of a (semi-)automated audio transcription tool.

 

Now I am looking to optimise my audio player before I move on to add more functionality.

 

My first optimisation was a qualified guess:

 

I have a bit file reader that I made, just thrown together to be able to read a file some number of bits at a time.  It would simply read the file one byte at a time and shift the data into the integer type I wanted, bit by bit.

 

The optimisation I did was to create a double buffer and pre-read a chunk of data into it, instead of requesting one byte at a time from the file like the version I threw together did.  The reason I made the buffer "double buffering" is that I would always have one buffer ready, and could increase responsiveness by filling the unused buffer from another thread.  I am not sure if this really is a good idea, but I will test it later.  For now I have just prepared the code so it will be trivial to add that functionality.
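The core of that optimisation can be sketched in plain C like this.  The refill callback is a hypothetical stand-in for the actual file read, and the double-buffer/thread part is left out:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define BUF_SIZE 4096

/* Buffered byte source: refill a block at a time instead of issuing one
   read call per byte, then hand out bytes from memory. */
typedef struct ByteBuffer {
    uint8_t data[BUF_SIZE];
    size_t  fill;  /* bytes currently in the buffer */
    size_t  pos;   /* next byte to hand out */
    size_t (*refill)(uint8_t *dst, size_t max, void *ctx); /* returns 0 on EOF */
    void   *ctx;
} ByteBuffer;

static int buffer_next_byte(ByteBuffer *b, uint8_t *out) {
    if (b->pos == b->fill) {        /* buffer exhausted: refill it */
        b->fill = b->refill(b->data, BUF_SIZE, b->ctx);
        b->pos  = 0;
        if (b->fill == 0) return 0; /* end of stream */
    }
    *out = b->data[b->pos++];
    return 1;
}

/* Example refill source that serves bytes from an in-memory array. */
typedef struct MemCtx { const uint8_t *src; size_t len, off; } MemCtx;

static size_t mem_refill(uint8_t *dst, size_t max, void *ctx) {
    MemCtx *m = (MemCtx *)ctx;
    size_t n = m->len - m->off;
    if (n > max) n = max;
    memcpy(dst, m->src + m->off, n);
    m->off += n;
    return n;
}
```

The double-buffer variant you describe would swap a pre-filled second buffer in at the refill point instead of reading synchronously.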

 

This initial optimisation made the CPU usage drop from about 30% down to about 6%.  I'd say that is a good improvement.

 

Now I want to try to reduce the CPU Energy Impact to land in the "low" area instead of constantly "high".  I wonder if this is at all possible in an application like this, where there are constant reads from disk, Audio Queues constantly requesting more data to play, and graphics that are updated on a regular basis.  (Probably not more than 4 times per second, but still...)

 

I also notice something called 'wakes' per second.  This number varies between 4 and 11, but mostly stays at 4.  I don't really know what kind of impact this has on my application, and I am not even sure I can avoid it.  I am thinking that maybe I can make the wakes happen in "bulk" somehow, but I need to learn more about this topic.  It is hard to find any good information on it.

 

I also wonder how you might go about trying to reduce the energy impact in such an application where you constantly stream audio from file to an audio device.

 

Lastly, a weird bug that I am not even sure of the cause of.  Sometimes I have a drop-out in the audio.  It is not a long dropout; the progress bar keeps moving and the song seems to be playing, but there is no sound for the short duration of the dropout.  The stream is not reporting any errors, and I have the same problem whether I play wav or flac.  Both play routines are written independently of each other and from scratch, with TONS of sanity and error checking and buffer overflow protection, so I would see it if the play routines did something weird.

I am suspecting latency in the system, but I have no idea how to pursue that.  I was hoping that buffering the bit file reader would solve the problem, and in a sense it did - the short drop-outs are a bit shorter now...  When the audio comes back, it turns out playback did indeed continue with no sound, since the song does not resume from where it stopped.  To combat this, I have tried to increase buffer sizes, but the dropouts are still of the same length.  It really seems to relate to how the system either manages sleep on the hard drive OR how it coalesces threading by delaying threads a little to reduce energy impact.  Yes, OS X does this to some extent, but I don't know how to even start debugging this problem.

 

I have one idea I am planning to try, though: to time different sections of the code to see if there are any major deviations when this happens.

 

I am happy for any feedback or suggestions.  Thank you! :)

 

 

EDIT 2014 06 10:

 

Instead of waking this old thread, I will just make an edit: 

 

I found a likely reason for why I am getting audio drop-outs.  It is called App Nap on OS X Mavericks.  It is not supposed to affect apps that are playing audio, but I see that it has been an issue on Mavericks:

 

http://smallthingsfloat.com/2013/11/09/airfoil-cutting-out-on-os-x-mavericks/

 

I also see similar behaviour in other apps that I never had problems with before Mavericks.


Beach ball when adding a massive amount of rows to NSTableView using NSArrayController

29 March 2014 - 11:59 AM

In my audio player, I am adding a ton of items to an NSTableView through an NSArrayController.

 

By moving the process to an NSOperation on an NSOperationQueue, I hoped to get rid of the long wait for my window to open, and to make my application immediately responsive.

 

Here is my NSOperation (.h-file):

#import <Foundation/Foundation.h>
#import "STASearchForMediaProtocol.h"

@interface STASearchForMediaOperation : NSOperation {
    NSString *pathToSearch;
    BOOL searchRecursive;
    NSOperationQueue *ownOperationQueue;
    id <STASearchForMediaProtocol> ourDelegate;
}

- (id)initWithFilePath:(NSString *)path recursiveSearch:(BOOL)recursive usingOperationQueue:(NSOperationQueue *)operationQueue withDelegate:(id)delegate;

@end

Here is my NSOperation (.m-file):

 
#import "STASearchForMediaOperation.h"

@implementation STASearchForMediaOperation

- (id)init {
    self = [super init];
    
    if (self) {
        pathToSearch = nil;
        searchRecursive = NO;
        ownOperationQueue = nil;
        ourDelegate = nil;
    }
    
    return self;
}

- (id)initWithFilePath:(NSString *)path recursiveSearch:(BOOL)recursive usingOperationQueue:(NSOperationQueue *)operationQueue withDelegate:(id)delegate {
    self = [self init];
    
    if (self) {
        pathToSearch = path;
        searchRecursive = recursive;
        ownOperationQueue = operationQueue;
        ourDelegate = delegate;
    }
    
    return self;
}

- (void)searchPathForSong:(NSString *)path {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSError *error = nil;
    NSArray *paths = [fileManager contentsOfDirectoryAtPath:path error:&error];
    if (error != nil) {
        printf("Error reading path '%s'.", [path UTF8String]);
        return;
    }
    BOOL isDirectory = NO;
    for (NSString *item in paths) {
        NSString *itemPath = [NSString stringWithFormat:@"%@/%@", path, item];
        BOOL fileExistsAtPath = [[NSFileManager defaultManager] fileExistsAtPath:itemPath isDirectory:&isDirectory];
        if (fileExistsAtPath) {
            if (isDirectory)
            {
                //We need our own operation queue to post further searches
                if (searchRecursive && (ownOperationQueue != nil)) {
                    NSOperation *newOperation = [[STASearchForMediaOperation alloc] initWithFilePath:itemPath recursiveSearch:searchRecursive usingOperationQueue:ownOperationQueue withDelegate:ourDelegate];
                    [newOperation addDependency:self];
                    [ownOperationQueue addOperation:newOperation];
                }
            }
        }
        if ([[item pathExtension] isEqualToString:@"mp3"]) {
            NSMutableArray *array = [[NSMutableArray alloc] initWithCapacity:2];
            [array addObject:item];
            [array addObject:itemPath];
            if (ourDelegate != nil) {
                [(NSObject *)ourDelegate performSelectorOnMainThread:(@selector(signalMp3FileFound:)) withObject:array waitUntilDone:YES];
            }
        }
        else if ([[item pathExtension] isEqualToString:@"flac"]) {
            NSMutableArray *array = [[NSMutableArray alloc] initWithCapacity:2];
            [array addObject:item];
            [array addObject:itemPath];
            if (ourDelegate != nil) {
                [(NSObject *)ourDelegate performSelectorOnMainThread:(@selector(signalFlacFileFound:)) withObject:array waitUntilDone:YES];
            }
        }
        else if ([[item pathExtension] isEqualToString:@"wav"]) {
            NSMutableArray *array = [[NSMutableArray alloc] initWithCapacity:2];
            [array addObject:item];
            [array addObject:itemPath];
            if (ourDelegate != nil) {
                [(NSObject *)ourDelegate performSelectorOnMainThread:(@selector(signalWavFileFound:)) withObject:array waitUntilDone:YES];
            }
        }
        else if ([[item pathExtension] isEqualToString:@"aiff"]) {
            NSMutableArray *array = [[NSMutableArray alloc] initWithCapacity:2];
            [array addObject:item];
            [array addObject:itemPath];
            if (ourDelegate != nil) {
                [(NSObject *)ourDelegate performSelectorOnMainThread:(@selector(signalAiffFileFound:)) withObject:array waitUntilDone:YES];
            }
        }
        
        if ([self isCancelled]) {
            break;
        }
    }
}

- (void)main {
    @autoreleasepool {
        if (ourDelegate != nil)
        {
            [ourDelegate signalBeginUpdate];
        }
        
        [self searchPathForSong:pathToSearch];
        
        if (ourDelegate != nil)
        {
            [ourDelegate signalEndUpdate];
        }
    }
}

@end

My protocol for the delegate:

@protocol STASearchForMediaProtocol <NSObject>

@required
- (void)signalBeginUpdate;
- (void)signalEndUpdate;
- (void)signalMp3FileFound:(id)object;
- (void)signalFlacFileFound:(id)object;
- (void)signalWavFileFound:(id)object;
- (void)signalAiffFileFound:(id)object;

@end

My app delegate methods that are being used for this purpose:

- (void)signalBeginUpdate {
    if (updateCount == 0) {
        [tableViewRef beginUpdates];
    }
    
    updateCount++;
}

- (void)signalEndUpdate {
    updateCount--;
    
    if (updateCount == 0) {
        [tableViewRef endUpdates];
    }
}

- (void)signalMp3FileFound:(id)object {
    @autoreleasepool {
        NSArray *items = (NSArray *)object;
        [self addSongWithArtist:@"Music" album:@"MP3" song:(NSString *)[items objectAtIndex:0] fileName:(NSString *)[items objectAtIndex:1]];
    }
}

- (void)signalFlacFileFound:(id)object {
    @autoreleasepool {
        NSArray *items = (NSArray *)object;
        [self addSongWithArtist:@"Music" album:@"FLAC" song:(NSString *)[items objectAtIndex:0] fileName:(NSString *)[items objectAtIndex:1]];
    }
}

- (void)signalWavFileFound:(id)object {
    @autoreleasepool {
        NSArray *items = (NSArray *)object;
        [self addSongWithArtist:@"Music" album:@"WAV" song:(NSString *)[items objectAtIndex:0] fileName:(NSString *)[items objectAtIndex:1]];
    }
}

- (void)signalAiffFileFound:(id)object {
    @autoreleasepool {
        NSArray *items = (NSArray *)object;
        [self addSongWithArtist:@"Music" album:@"AIFF" song:(NSString *)[items objectAtIndex:0] fileName:(NSString *)[items objectAtIndex:1]];
    }
}

The 'addSongWithArtist:album:song:fileName:' method:

- (void)addSongWithArtist:(NSString *)artist album:(NSString *)album song:(NSString *)song fileName:(NSString *)fileName {
    internalIDCounter++;

    [ArrayControllerRef addObject:[NSMutableDictionary dictionaryWithObjectsAndKeys:artist, @"Artist", album, @"Album", song, @"Song", fileName, @"fileName", [NSNumber numberWithUnsignedLong:internalIDCounter], @"iID", nil]];
}

How I initiate the NSOperation:

    updateCount = 0;
    fileSearchQueue = [[NSOperationQueue alloc] init];
    [fileSearchQueue setName:@"File search operation queue"];
    
    STASearchForMediaOperation *searchForMedia = [[STASearchForMediaOperation alloc] initWithFilePath:@"/Volumes/Music/UNSORT" recursiveSearch:YES usingOperationQueue:fileSearchQueue withDelegate:self];
    [fileSearchQueue addOperation:searchForMedia];

My problem is...

 

As the code is right now, I still have to wait for the process to complete before the window opens.  With beach ball...  It takes an awfully long time...  I never waited long enough to see the app actually start.

 

If I change 'waitUntilDone' to 'NO', I get the window straight away as I want, but with a little period of 'hiccups' with beach ball and waiting.  I also get a 'NSLog'-message:

 

"CoreAnimation: warning, deleted thread with uncommitted CATransaction; set CA_DEBUG_TRANSACTIONS=1 in environment to log backtraces."

 

It is also random how many songs will actually be in my NSTableView.  Often it has a lot of empty rows.

 

I guess this is because the thread gets deleted before my table view manages to retrieve the data, since I don't 'waitUntilDone'.

 

Does anyone have a suggestion:

1. How to fix my solution?

2. A better solution to solve my problem?
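One thing that often helps with this pattern, regardless of the threading fix, is to stop paying one cross-thread round trip per file.  Here is a language-agnostic sketch of the batching idea in plain C; the flush callback stands in for a single main-thread notification carrying a whole batch (in your code, one performSelectorOnMainThread: call per batch rather than per file):

```c
#include <assert.h>
#include <stddef.h>

#define BATCH_SIZE 64

/* Collect items and deliver them in bulk via one callback per batch,
   instead of one cross-thread notification per item. */
typedef struct Batcher {
    const char *items[BATCH_SIZE];
    size_t count;
    void (*flush)(const char **items, size_t count, void *ctx);
    void *ctx;
} Batcher;

static void batcher_add(Batcher *b, const char *item) {
    b->items[b->count++] = item;
    if (b->count == BATCH_SIZE) {   /* batch full: deliver it */
        b->flush(b->items, b->count, b->ctx);
        b->count = 0;
    }
}

static void batcher_finish(Batcher *b) {
    if (b->count > 0) {             /* deliver the remainder */
        b->flush(b->items, b->count, b->ctx);
        b->count = 0;
    }
}

/* Example flush that just counts deliveries (stands in for the UI update). */
static size_t g_flushes, g_delivered;
static void counting_flush(const char **items, size_t count, void *ctx) {
    (void)items; (void)ctx;
    g_flushes++;
    g_delivered += count;
}
```

In your code the flush would hand the main thread one array of found files, which may also remove the need for waitUntilDone:YES, since each batch is an independent object the background operation never touches again.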

 


Hard to track bugs...

26 March 2014 - 09:34 AM

I have just spent two days tracking down a mysterious bug, only to find something really stupid...

 

I have made a flac (Free Lossless Audio Codec) decoder from scratch, and it exhibited some weird behaviour.  Some songs played completely through without any errors at all.  I even compared side by side with the original wav file while playing.  Other songs erred all the way through.  There were no in-betweens: either the song played correctly all the way, or it didn't play at all.  The errors I got when the play routine failed were 'missing sync' errors, indicating a bug that caused the wrong number of bits to be read from a block.  The mystery was why and where it happened.  I was almost confident my decoding routine was OK, since it went through the same stages with the songs that worked as with the ones that didn't.

 

In pure desperation I tracked down a flac frame that seemed broken, and started to decode the frame manually to see what was supposed to happen.  What I found was that the frame was supposed to be decoded by the LPC decoder stage, but my audio player was trying to squeeze the data through the FIXED decoder stage.  Finally, a hint of what the problem could be.  Look at the bit patterns that choose which stage decodes a subframe:

 

001xxx = FIXED (xxx = order)

1xxxxx = LPC (xxxxx - 1 = order)

 

Looking at my code, I found something like this:

            if ((subFrameType & 0x08) == 0x08) {
                [frameInfo setCurrentSampleSize:[self sampleSizeForChannel:audioChannel]];
                isOK = [self subFrameFixed:audioChannel predictorOrder:(subFrameType & 0x07) fullyDecode:YES];
            }
            else if ((subFrameType & 0x20) == 0x20) {
                [frameInfo setCurrentSampleSize:[self sampleSizeForChannel:audioChannel]];
                isOK = [self subFrameLPC:audioChannel lpcOrder:((subFrameType & 0x1f) + 1) fullyDecode:YES];
            }
Say we have an LPC subframe with order 9, which codes to bits 101000.  It will match the FIXED decoding stage because I am testing in the wrong order.  If I swap the if-tests above, everything works fine.  A simple, stupid error - hard to track down.
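The fix can be captured in a tiny classifier, sketched here with only the two subframe types discussed (real FLAC also has CONSTANT and VERBATIM types).  Testing LPC first resolves the overlap; using the stricter FIXED mask 0x38 instead of 0x08, as below, would also fix it independently of the test order:

```c
#include <assert.h>
#include <stdint.h>

typedef enum { STAGE_OTHER, STAGE_FIXED, STAGE_LPC } DecodeStage;

/* Classify a 6-bit FLAC subframe type field.
   1xxxxx = LPC (order = xxxxx + 1), 001xxx = FIXED (order = xxx). */
static DecodeStage classify_subframe(uint8_t subFrameType, int *order) {
    if ((subFrameType & 0x20) == 0x20) {   /* test LPC first! */
        *order = (subFrameType & 0x1f) + 1;
        return STAGE_LPC;
    }
    if ((subFrameType & 0x38) == 0x08) {   /* strict 001xxx match */
        *order = subFrameType & 0x07;
        return STAGE_FIXED;
    }
    *order = 0;
    return STAGE_OTHER;                    /* CONSTANT, VERBATIM, reserved... */
}
```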
 
This made me wonder, what kind of simple, but hard to find bugs have you been struggling with in the past?
 
EDIT: It is also a story about not postponing the dirty work when it really is needed...  If I had done the manual decoding two days earlier, instead of deciding it was too much work, I would probably have solved this problem much sooner.
