Category: Computing

In this recently released version of Winamp, the packaged AAC encoder has been changed!  Now the app comes with Fraunhofer encoding for HE-AAC.  The dialogs have also been changed so that the flavour of AAC you end up with (HE, HEv2, or LC) is set for you.  This all makes for a much nicer user experience – no need to read my previous blog for a start!  So how well do the new interface and encoder work?  As before, we look at a set of 20 tracks and clips and check out their sizes and some frequency graphs.

An improved encoder, an improved interface

In the Encoder settings for Winamp, we have a new MPEG-4 AAC Encoder!  Whoop!

Unlike the old encoder, we can pick from Variable, as well as Constant bitrate encoding, and the quality slider for each automatically changes the specific AAC encoding version that is used:


This automatic AAC type selection means that even YOU, humble reader, can get the benefits of HE-AACv2, and you don’t even know it.

There are only a few disadvantages here, but you do need to consider them, and even plan ahead to do some hair-dye-style spot checks before you start any massive re-encoding / re-synching process:

  • Not all players understand HE-AAC – as was the problem with an old build of an Android media player I was using, where all the sound came out dodgy

Nope, that’s it!


As you will see from previous posts, we have used spectrum plots and file sizes to measure, in a “what do you reckon” type way, the performance and success of AAC and MP3 conversion.

In this post, let’s see how the encoding options here compare to options for other converters in that previous blog, and then what kind of file sizes and spectrum plots we get.  You will recall that there are 20 tracks and 20 track clips in our test block, across a range of genres.

To carry out the conversion, Winamp’s “Send” context menu is used to pass the tracks and clips to the format converter.  Here we can see that the HE-AAC encoding (but presumably not AAC-LC encoding) is done by Fraunhofer.


Variable Encoding

Theoretically, variable encoding is preferred (as discussed last time) as the computer can decide what bitrate is most appropriate throughout a given song and allocate as much as is necessary.  The downside is that you can’t calculate the file size beforehand – it fluctuates from track to track.
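That “calculable beforehand” point is just arithmetic: for CBR, the file size follows directly from bitrate and duration.  A quick sketch (treating 1kbps as 1,000 bits per second; some tools use 1,024):

```python
def cbr_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Predicted file size in MB for a constant-bitrate encode."""
    # bits per second * seconds, then /8 for bytes and /1,000,000 for MB
    return bitrate_kbps * 1000 * duration_s / 8 / 1_000_000

# a four-minute track at 160kbps comes out at 4.8MB, every time
print(cbr_size_mb(160, 240))  # 4.8
```

With VBR there is no such formula – hence the fluctuating sizes.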

The quality slider on the dialog uses different measures of Quality to our previous test, so it’s not easy to carry out a fair, objective comparison of this encoder against the previous or other encoders.

For example, after running all tracks through the encoder, the results show that “1” in Winamp’s new encoder reduces tracks to 2.6% of their original “pure” WAV size, “3” to around 7%, and “5” to around 15%.  Looking at our old VBR results, this puts 5 just in between the old Winamp and Nero “H” settings results, and 4 at the old “L” levels.

“1” has been completely excluded from any comparisons because we didn’t look at HE-AACv2 in any detail last time.  But I have to say, HE-AACv2 in this encoder is awesome…  For some tracks (especially where stereo wasn’t that important) it proved very difficult for friends to tell, in a blind test, which was the Q1 and which was the Q5.  Yet the file sizes are the difference between holding 10,000 songs on an 8GB phone memory card (Q1) and just 1,500 (Q5).
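The songs-per-memory-card claim is easy to ballpark.  This sketch assumes roughly 42MB of WAV per four-minute track (my figure, not a measured one) and uses the compression percentages from the tests above:

```python
def tracks_per_card(card_gb: float, wav_mb_per_track: float,
                    compression_pct: float) -> int:
    """Roughly how many encoded tracks fit on a memory card."""
    encoded_mb = wav_mb_per_track * compression_pct / 100
    return int(card_gb * 1000 / encoded_mb)

q1 = tracks_per_card(8, 42, 2.6)  # HE-AACv2 at Q1: thousands of tracks
q5 = tracks_per_card(8, 42, 15)   # Q5: far fewer
```

The exact counts depend on average track length, but the order-of-magnitude gap between Q1 and Q5 holds either way.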


As in the previous tests, here is a table that shows, for each encoder’s own quality setting, the variance from that quality setting’s average compression rate across all the tracks.


“HON” here is used to represent the new Winamp encoder, while AAC is the old one.  As before, we do need to take care not to compare across columns, since each resulting compression average is different between encoders and encoder settings.  Some things do stick out:

  • The encoder does well when the others don’t: e.g. tracks 4, 7–9, and 16.
  • The encoder does badly when the others don’t: e.g. tracks 2, 3, 11, 15, and 18.

Of course, one might argue that this could be down to the quality setting, were it not that these conclusions hold across the quality settings; take track 3 for example, where it sticks out like a sore thumb at Q5 (~14% larger) and Q4 (~14.25% larger).
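For what it’s worth, the “variance from the average” columns in that table can be reproduced with nothing more than a mean and a subtraction (the ratios below are made-up illustrations, not the real test data):

```python
def deviation_from_mean(ratios):
    """Each track's compression ratio minus the column average --
    positive means the track compressed worse than its peers."""
    mean = sum(ratios) / len(ratios)
    return [r - mean for r in ratios]

# hypothetical encoded/original size ratios at one quality setting
devs = deviation_from_mean([0.15, 0.14, 0.17, 0.15])
# the deviations always sum to zero; the third track sticks out here
```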


Here are some spectrum plots for some good performers (4, 7, 16) and some bad ones (2, 11, 18).

4 image
7 image
16 image
2 image
11 image
18 image

Remember: we can’t compare directly and say Oooooh well clearly Nero does better than Fraunhofer, we can only make generalisations.  And generally, the Nero AAC Encoder from our previous tests gives good reproduction at high quality levels (e.g. Tracks 2 and 18) and the Fraunhofer does well (Track 11) too.  Generally the encoders still cut off high frequencies to save space (as we would expect).

But there are a few things that interest me: look at Track 2.  Fraunhofer’s 8.5MB seems to be spent just as well as Nero’s 10.5MB – in fact more upper frequencies are kept by Fraunhofer.  It’s not the same story for Track 18 though, where a comparable amount of space is used by each, but the results look very different.

Well, we could wax lyrical about this, but given my complete lack of any type of degree in physics or Marten Coltrane Supremes I’m not going to.  As a customer, pure and simple, the Fraunhofer encoder does just as well as Nero.  And, importantly, it works inside Winamp.  I can use it for ripping, and it works.  INSIDE WINAMP.  Whoot!  Finally, VBR encoding, and with a good encoder as well – whereas Nero was fiddly to get going in any other media player (I had to do all the conversions for the last blog on AAC using a command line interface…)

Constant Rate Encoding

I’ll be honest, we’re both bored by now.  We know CBR will give us a fixed file size, and we’re pretty much going to get a set amount of quality across the board.  What interests me at this point is whether it’s worth re-ripping anything from the fixed 160Kbps rate that I used last time.

Album | Old Size | New Size | Old Spectrum | New Spectrum
The Prodigy – The Fat of the Land | 65.4MB | 98.9MB | image | image

And hey presto, the spectrum plot for track 1, “Smack My Very Naughty Girlfriend in a Tasteful Way”, looks exactly the same.  And the sound test?  Exactly the same.  Some tracks grew by around 2MB (which isn’t bad, considering the computer now had another 60kbps to play with), but a couple grew in size by over 50%.

Here’s Narayan, which grew from 10,775 to 16,743.  The left is the lossless, FLAC version of the track.  Next, the 160 CBR version, and finally our Q5 VBR version:


The Q5 version seems to keep a better upper frequency representation, especially around 15,000Hz.
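That Narayan jump works out at just over 55% (units as given above):

```python
def growth_pct(old_size: float, new_size: float) -> float:
    """Percentage growth going from the old encode to the new one."""
    return (new_size - old_size) / old_size * 100

print(round(growth_pct(10_775, 16_743), 1))  # 55.4
```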

And here’s Climbatize, where the same is true: better upper frequencies.  And note how in both of these, a dip in the plot on the original at around 17,500Hz is “smoothed over” in the 160 but not the Q5 encodings.



Sure, Fraunhofer in Winamp is a good way to rip CDs with Gracenote and album art, and the interface is much better now.  And yes, on the downside, you are going to pay for this higher quality with disk space.  I can’t prove Q5 is better than 160Kbps, even though my brain says that it is: it’s not like you can’t hear people singing at 160kbps but you can at Q5.

And some people’s equipment, let alone their ears, won’t be good enough to reproduce upper frequencies with any gusto or accuracy… would they?  WELL!  These new Dr Whatsisface Headphones are spreading and multiplying like subtle mice.  Fashionable, yes, but good too.  Not only Dr’s, but also Monster headphones, and little Japanese ones that cost £50.  There may be a revolution coming.

Perhaps we are to witness the death of MP3 after all? Perhaps people will shun 128kbps MP3 downloads once they hear the glory, the majesty of Honfernenfrauer Q5 VBR AAC?  And yes, together we shall declare, “YES!  You may pay for it in disk space, but by golly, the upper frequencies really do look better on a graph than MP3 EVER COULD!”

But for me, who already has 160kbps, who already has Monster headphones and isn’t DEAD because of a lack of upper frequency plots… well… why would I bother?

Oh yes: because I’m an early-adopting neurotic that must ensure all I am and all I own is better than you and yours…



I’ve done the dastardly deed and upgraded to 3D for PC gaming: but is it worth all the fuss and nonsense?  Is it worth spending a month’s wages on it?

The Quality and the Performance

Let’s get one thing straight: take what nVidia or ATI or Samsung or Sony tell you with a pinch of salt.  Did I say pinch?  I meant a MOUNTAIN of salt.

Well maybe that’s a bit dramatic.  Maybe a modicum.  Yes, a modicum.  What a great word.


The point of products made by nVidia, iZ3D and DDD is to make money – as well as to make the games and videos that you own somehow take on a 3D appearance.  They all achieve results in different ways, and they all achieve them with varying degrees of success.  nVidia have a bit of software that only works with their graphics cards, and it’s an add-on to your graphics driver.

iZ3D and DDD are less tied to a particular vendor’s hardware: but let’s face it – if your car manufacturer gave you the supercharger kit for free when you bought your car, why would you go and buy something made by someone else as well?  For a laugh?  I DON’T THINK SO.  So, nVidia owners get it without thinking and don’t bother looking at alternatives.  If your graphics card is made by ATI or Intel or Shamookamookamooka you’ll have to look at iZ3D and DDD.

So let’s get this straight, oh beautiful reader.  Different manufacturers for the graphics card, different people writing software to turn stuff into 3D, different companies making the monitor, TV, or projector, and different people sitting in front of them.

And let’s not even start on the source material itself: some early 3D films just had one or two tiny little segments in 3D that, if you were lucky, made someone’s fingernail appear to touch you inappropriately.  Nowadays, 3D production on films like The Green Lantern is PROPER WICKED with the entire film, even the boring “I LOVE YOU!!” scenes, with 3D on it.

For gaming, that means a game from 1983 will work less well, no matter how much you spend or how much you drink / smoke / inject, than a game made last year with 3D in mind.  Knowing this, the manufacturers have published nice pretty lists.  nVidia’s list is really extensive, and when you start up a game it (sometimes irritatingly) tells you exactly what to expect from the game.

Darksiders Intro Trailer

Let’s take a game like Darksiders, which is a brutally ridiculous game that I, for some reason, can’t stop playing.

The nVidia rating for this is “Excellent”, which means it should be excellent, right?

What they mean by excellent is that it’s playable, and that any glitches won’t make your eyes explode.

iZ3D doesn’t even list Darksiders as compatible.  So good luck with that.

But to confuse matters, that doesn’t mean it *won’t* work, it means that they’re not sure if their system, and the hardware their system works with, will be any good.  I.e. : suck it and see.

And it’s supposed to do what exactly?

It’s SUPPOSED to make your genitals implode with their lack of usefulness.  It’s SUPPOSED to be brilliant.  But with the experience being so inconsistent, you can go from playing a brilliant 3D title to a game that’s so appalling in 3D that it will give you a migraine.

In every title I’ve played, using a 3D solution will, definitely, give you Depth of Field.  They do not emulate perspective.  WHAT?  WHAT IS THIS NONSENSE you cry.  Sorry my sexy readers (phwoar!  God you’re gorgeous!) but I can’t explain it any other way.

Think of it like this, if you will, while I get undressed.  When you’re on a train looking out the window, the things in the background move at a different rate to the things in the foreground.  There is perspective.  And things are specifically smaller because they really, definitely, absolutely are further away.

An explanation of what I mean from Father Ted

So you’re looking at a flat surface, and your head is TRICKED into thinking things are far away, when they actually aren’t.  They’re smaller because the computer made them smaller when it drew them onto the screen.

Depth of Field, however, means your head sees smaller things behind the bigger things.

Why the HELL is this so important?  Well, because it’s a trick.  It’s not 3D, it’s depth of vision, and if the game you want to play doesn’t know that some other manufacturer is going to try and make the player put some things “behind” others, all kinds of things go wrong.  In Darksiders, the little tiny yellow marker that hovers above enemies when you’re about to slash their guts into 1722 pieces is in front of everything else on the screen, but it’s the right size.  In Grand Theft Auto 4, shadows are all over the place, and don’t correlate *at all* to the place they really should be – so you have to turn them off.  In Crysis, the gun sight (or crosshair) has no depth at all, which (back to the perspective versus depth-of-field distinction) means you’ll be firing at completely the wrong angle to hit something – so you have to use one in the 3D software instead.

My poor brain.

Now I know, you’re all sitting there in your naked states, quietly wondering whether I look good naked.  And if you’re not, you’re thinking I’m slating 3D.

But wait: this DoF trick is AWESOME.

Let me say that again in bigger letters:





Something else can go wrong?

Oh yes, oh yes.

Although it’s not so bad since I rearranged the furniture in my office and changed all the lighting, you can get this nasty thing called Ghosting.  It’s like, well, you know: deja vu.  But worse.  Like, you’re constantly getting deja vu.  On repeat.  You’ve seen Deja Vu right?  Well that film is awful.  And let me tell you, watching that over and over again once in each eye but like so fast that you see the whole film in its entirety every 60th of a second is nausea inducing.

If you want to know about it, go look at the 3D Vision Blog where he writes stuff about it whenever he writes a review.

If you can’t be arsed lemme just tell you right now sonny Jim, avoid it.

Worst of all, you can just end up with the wrong kit.  The wrong software behind the HDMI socket on the back could leave it incompatible with your PS3 or Xbox, or the monitor itself just might not work with your chosen manufacturer’s 3D software.  Or you could be one of those unlucky people who read the Daily Mail, but that’s just life.

What’s it like to play?

OK yes back to the games while you lube up.

So: split-screen co-op play obviously fails.  It’ll require you to get extra glasses, and when you do start playing you’ll be fighting for screen space with your co-op player and everything will just look odd.  In fact, just don’t bother.

Getting help from a friend who’s already played this boss and knows what to do, or which ledge to grapple onto, or just to laugh at you uncontrollably as Lara Croft yet again fails to roll when you CLEARLY pressed [X] you BITCH – FAIL.  They can’t see what you’re looking at.  To them, it just looks like Esther Rantzen.

Fog and smoke hanging around on battlefields, lasers that really do go from the front to the back of a room, knowing how far away a corner is on a track, mountains and hills that are genuinely “further away” than the background, and being able to tell how close you are to environmental objects and targets: yeah, that rules.  Totally utterly rules.


In the shop when I bought the glasses and the 3D monitor (for thou canst not use any old monitor, forsooth) the man said, “your frame rate will drop”.  I thought maybe he was talking about the number of glasses I get through in a year, which is just ridiculous because I don’t even WEAR glasses, so I just scowled at him and walked away.

Now, I have to confess.  I have spoilt myself somewhat this year.

First, I sped up my PC for (almost) free by a super-duper 50% by turning up (overclocking) my old Quad Core processor, which would have melted were it not for a Scythe Mugen 2 cooler that I fitted.  I had to buy that specific one just because it looks like it could be a weapon.

Second, I went a bit mad and bought an ASUS GTX 580 – one of the latest graphics cards for PCs – to replace my old one.  Then, just today, I plugged my old one back in again so that it could do nothing but calculate what happens when the Frak Cannon ammunition hits the wall instead of my CPU having to do it.

Of course, I still want a new case for it, a new processor and motherboard… ah the list goes on.

Needless to say, my PC is now SCHMOKIN.  3D Mark 11 (a gaming benchmark programme) rates my PC as 4820 3DMarks.  Which is nice.  Apparently that’s still slow for a PC of my stature and if I pay money they will show me a web page that tells me where to spend more money.  So I haven’t.  Cos you wouldn’t, would you.  Not really.  No.  Period.

Of course, what really matters is how games work.  So let’s look at some benchmarks from within the games, with 3D off and 3D on.  The settings will be the same for both 3D and non-3D modes cos this is just about showing whether the frame rate drops and by how much.  Anything below 20 frames per second will make you annoyed.

Just Cause 2

These are from the Concrete Jungle benchmark within the game.  I always crank everything up for no reason other than to show off.  Sadly, this affects the fluidity of the game, especially in 3D.  Check it out: 23fps without 3D, 14fps (YUK!) with 3D.


Obviously, I could be less of a show off and turn down my settings from Very High High On and YES PLEASE, and that would help, e.g. 32x CSAA.  I mean really, why would I want that.  Tsk.

Anyway the game plays great at 23fps.  OH ACTUALLY sorry it’s 24fps if you round up.



Here’s the other in-game benchmark, Desert Bumrise, again off on the left, on on the right.  This time it’s a drop from 40fps to 24fps.
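Putting numbers on those two benchmarks: the 3D penalty is roughly 40% in both, and only the desert run stays above the 20fps annoyance threshold mentioned earlier.  A trivial sketch:

```python
def fps_drop_pct(fps_off: float, fps_on: float) -> float:
    """Percentage of frame rate lost when 3D is switched on."""
    return (fps_off - fps_on) / fps_off * 100

concrete = fps_drop_pct(23, 14)  # roughly 39%
desert = fps_drop_pct(40, 24)    # 40%
still_playable = [fps >= 20 for fps in (14, 24)]  # [False, True]
```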


The solution here would be to raid my bank account and get some more fps, which means, stupidly, that I need to spend another £800 on a new motherboard, CPU, and another GTX 580 so I can run two graphics cards together at the same time.  Oh and another £100 on a new power supply.

I could also use the ASUS Overclocking Utility that came with my graphics card…  But that scares me right now.

Mafia II

OK so one of the only other games I have that runs benchmarks of its own content: Mafia 2.  First we have the bog-standard, on-install video settings.  At the top, 3D is off, at the bottom, 3D is on:



Yep you read right, a drop of almost 1/3 of my framerate (which mathematically speaking isn’t so bad since, in theory, the computer is having to do twice the work).  But then we go to mad town, with all the settings way up, and Physics Effects turned on to the max:



Now the rate drops from an almost acceptable 40fps to under 24fps.  The game complains; the World ends.

Be serious…

Who in their right mind would spend a total of £1600 on that kit just so that smaller things appear in front of bigger things.  Nobody, that’s who.  No-one.  NOBODY.  Look out of the bloomin’ window.

3D movies, though – well that’s cool.  Especially the naughty ones.  And pictures, they’re good too.  Yeah, I might get a 3D camera – 3D pictures are cool.  And don’t say “ew those specs” because I used to love those red 3D picture binoculars and I never threw a wobbly about those.  But then, I wasn’t buying them.

Oh I’m losing track.  Let’s instead concentrate on AMILLIONBYTES’S’S SUGGESTIONS TO 3D LAND on how to get this working.

aMillionBytes Suggestions to 3D Land on How To Get This Working

Number One: Cheaper, transparent costs, re-usable kit.  Yes, you heard it – work together bitches.

Number Two: Set an industry standard measure for 3D games support, and then publish it.  Together.  As one.

Number Three: Get your connectors and software standards sorted out.  Go with the best one and stop doing stupid dumb-ass things to confuse consumers because you want to make money.  You know it’s Displayport.

Number Four: Stream 3D content.  Stop making me use one vendor’s product to watch videos in 3D on my PC (i.e. Trampburn a.k.a. Arseslap a.k.a. Fanflange a.k.a. Firefarx).  I.e. standardise bitches.

Number Five and Number Six are really important so they appear together: give your next version of the kit to me so that I can show off and kiss it.  I can be your tipping point!

Will I send it back?



Do I look at other people’s PCs and go – ew – that looks odd?  Yes I do.  Yes I do wonder why the depth perception is wrong.  Yes I point and laugh.  HO HO you silly 2D people.

But seriously – it’s really, really good.  Let me rephrase that:

It was expensive, hideously expensive, but the effect is good.

And the technology shows promise – like an X-Factor contestant you like even though you hate her because she’s on the X-Factor and her teeth are where her eyes should be, and if you spend too long looking at her you get a migraine.





And I ❤ it.  And you will too, or hell I’ll put on a Simon Cowell costume and ruin your dreams.

Now put your clothes back on and stop staring at my crotch.

So rude.

From MP3 to AAC

In a previous post on the Digital Home, we saw that AAC is the new MP3.  But is it really worth re-ripping all your CDs to AAC, and switching your purchases to AAC?  With my tiny bit of PC and hi-fi knowledge I go on a detailed journey to understand the differences between the two formats.  Come with me and see if you agree…

What’s the theory?

According to the sales pitches, AAC can give you the same quality of sound from much smaller file sizes: use up the same space on your computer and get better sound – quicker downloads, more music on your portable player…

If you’re like me and ripped those CDs you own to digital form in the ’90s, chances are you had so little disk space available you had to choose a low quality MP3 so that your CDs would fit on the 20GB hard disk that had just cost you a month’s wages.

Now comes Mr AAC, with his better clothes and cleaner teeth.  I ask myself: is there any point re-ripping all those CDs into a new format?

What sounds better?

A typical CD is 650MB in size – that’s almost three-quarters of a gigabyte!  Ripping software, like iTunes, Winamp, or Windows Media Player, squashes a CD using a “codec”.  Codecs strip out sound beyond the hearing range of most listeners, or remove sounds that cancel each other out.

We can tell codecs either to be very strict about the size of the file (bitrate compression) or we can say, hey, I don’t care about the file size – just make it sound “medium” (like video codecs).  The lower the quality or the bitrate, the smaller the file and the worse the reproduction.

Bitrate restriction is like pushing water down a pipe of a fixed diameter, restricting the amount that can be pushed down it.  Like Bill here: he only just fits in the pipe.

The theory goes that AAC squishes in a superior way to bog-standard MP3, resulting in smaller file sizes (when we aim for the same quality) or better quality for the same file size (when we’re strict about the bitrate).
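To put the pipe analogy in numbers: raw CD audio flows at about 1,411kbps, so a 128kbps pipe only passes around 9% of the original data rate – everything else must be thrown away or cleverly faked.  A quick sketch:

```python
# raw CD audio: 44.1kHz sample rate, 16 bits per sample, 2 channels
CD_PCM_KBPS = 44_100 * 16 * 2 / 1000  # 1411.2 kbps

def pipe_ratio_pct(bitrate_kbps: float) -> float:
    """How much of the raw CD data rate fits down the pipe."""
    return bitrate_kbps / CD_PCM_KBPS * 100

print(round(pipe_ratio_pct(128), 1))  # 9.1
```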

The tools to do it

Different companies produce different AAC codecs; as with most things, some manufacturers are better than others.  Experts with a lot more experience have already extended their extremely expressive extremities and compiled a list of different codecs, rating each one using various measures. Nero (for AAC) and LAME (for MP3) seem to come out best.

To drive a codec you need software that co-ordinates things.  It tells your computer to read music from the CD (still illegal in the UK!!) and then throw the 0s and 1s to the codec which produces an MP3 or AAC file that is then saved to your hard disk.

Here are some pieces of software, together with the codec they use for AAC or MP3.

Ripper | AAC | MP3 | HE-AAC (see later)
Fre:AC | FAAC Codec | LAME MP3 Encoder | –
Foobar 2000 | Nero AAC | LAME MP3 Encoder | –
Winamp | Coding Technologies LC-AAC Encoder | N/A | Coding Technologies aacPlus (up to 128kbps)
Winamp Pro (fast ripping) | Coding Technologies LC-AAC Encoder | Winamp MP3 Converter | Coding Technologies aacPlus High Bitrate (over 128kbps)
MediaMonkey | FAAC Codec | LAME MP3 Encoder | –

Seeing the change

Audacity is a great little audio tool; one of its capabilities is to generate spectrum graphs so we can visually compare ripped tracks against the original.

Armed with 20 songs from a range of genres I created 20 second clips, converting each clip and each track into AAC and MP3.  I used three bitrates (128, 192, and 256) and three “qualities” – that’s six different spectrum graphs for every track and six for every clip.
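For the curious, the spectrum graph Audacity produces is essentially the magnitude of a Fourier transform.  A minimal numpy sketch of the same idea, run here on a synthetic 1kHz test tone rather than a real track:

```python
import numpy as np

def peak_frequency(samples: np.ndarray, rate: int) -> float:
    """Dominant frequency in Hz, via an FFT magnitude spectrum --
    the quantity Audacity's Plot Spectrum window graphs."""
    magnitudes = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1 / rate)
    return float(freqs[np.argmax(magnitudes)])

rate = 44_100                        # CD sample rate
t = np.arange(rate) / rate           # one second of timestamps
tone = np.sin(2 * np.pi * 1000 * t)  # a pure 1kHz sine
print(peak_frequency(tone, rate))    # ~1000.0
```

Comparing a ripped file against the original is then a matter of overlaying two of these spectra.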

Here are the spectrum plots for just a few of them.  You can click on an image to make it bigger.

Setting | CD | MP3 | AAC
128kbps | image | image | image
128kbps | image | image | image
192kbps | image | image | image
192kbps | image | image | image
256kbps | image | image | image
256kbps | image | image | image

So what do the plots tell us?

According to forum posts, the quality-based approach should always be used because it follows the ebbs and flows of the music, resulting in fewer wasted bits in your files.  We should expect some much better frequency plots here than from the fixed bitrate versions.

So let’s look at the Quality based encodes then:

Setting | CD | MP3 | AAC
Low | image | image | image
Low | image | image | image
Medium | image | image | image
Medium | image | image | image
High | image | image | image
High | image | image | image

Yup, High Quality AAC is better than High Quality MP3.

A techie adventure into Q

I converted three tracks and took the average of the average bitrates for different VBR and Q settings to come up with VBR / Q settings for Quality as follows: Low (5 / 0.4); Medium (3 / 0.5); High (0 / 0.65).  It was roughly what Foobar with NeroAAC suggested anyway, so I did feel a bit of a pleb afterwards.

I created spectrum plots at a range of quality settings for AAC so we could see how the sound differs at each setting when varied by 0.01 points:

Q | Track 1 (file size / avg kbps) | Track 2 (file size / avg kbps) | Plots
.58 | 9170 / 220 | 7268 / 211 | image image
.57 | 8962 / 215 | 7268 / 206 | image image
.56 | 8749 / 209 | 6944 / 202 | image image
.55 | 8541 / 204 | 6776 / 197 | image image

I switched to another codec (Winamp).  Here are the outputs for a different track at .56 and .57:


…and for a hi-fi calibration noise:


The rule seems to be that encoding at a quality setting of .57 or above gives a nicer representation of the original than one just .01 lower.  Which is odd.  Anyway, I can’t explain it, but I left it in here to show that I am awesome when it comes to getting obsessed with codec settings…!

Back to reality: save that disk space!

Giving the computer the power to adapt by using Quality Encoding instead of fixed bitrate encoding means you could save disk space.  With fixed bitrate, one minute is the same number of bits no matter what; with quality, the computer decides where to use all those bits up.

The spreadsheet below lists all 20 tracks I encoded.  It shows how big each song was when quality encoded, as well as how big it was in the bitrate version.  The bars then show how much better or worse it was in terms of file size.


Corpse Pose (Divination); Porcelain (Moby); Dancin’ (Aaron Smith); and Previous Love (Blaze / Barbara Tucker) compress better (according to the averages for each quality method) regardless of whether it’s AAC or MP3, while Ghost Hardware (Burial) and I Love New York Live (Madonna) fare badly.

And which codec is more consistent when it comes to behaviour?


Of the six quality-based encoding options, MP3 compression (with this codec) comes out worse – very slightly. Notice too that the differences using AAC as opposed to MP3 are much less severe.

We’ve learnt:

  • some songs (around 20% of the tracks in the sample) are better encoded using quality settings rather than fixed bitrates, giving you smaller file sizes.  MP3 has the potential to be much worse or much better, though, while AAC is just a little worse or a little better.
  • AAC at bitrates over around 200kbps has the capacity to be better than MP3.  (Whether the environment you listen to your music in (tube, front room, toilet) and the equipment you have will give your ears and brain the chance to take advantage of that capacity is a different matter.)

BUT WAIT!  What this chart could be showing is the effect the Quality setting has – i.e., codecs will take advantage of our flexibility by using more space to produce better sound.  And if that is the case, AAC is able to do this more efficiently than MP3 can.  I.e., the same quality but for less cost.  Which is what Matalan apparently does, but I’ve never been in there because it smells funny and my brother says everything is CHEAP.  CHEAP!!!

Is it worth it?

It’s not a *bad* thing to want to make the stored version of music closer to the original form from the CD.  And it’s certainly convinced me to choose AAC of at least 256kbps when I buy music from a shop.  But if all that matters is that it *sounds* the same, well, none of the analysis above will tell us that.  Next then, to be sure, it’s worth comparing the experience AAC gives versus MP3 in the environment and on the equipment that will be used to listen to it.

I roped in flatmates and took 20 second samples from each track and played them back to back in a completely random order, like this one:

[Original] [MP3 High] [AAC 128] [AAC High] [MP3 128]

I’m not testing the tester’s ability to distinguish good from bad – I’m testing whether there is a noticeable experiential difference.  To that end, the listener scored the encoded tracks as either the same, better, or worse.  Then I checked patterns in the results:

Pattern One: The listener thinks AAC is better than MP3

Original >> [Worse] [Better] [Better or Same] [Worse]

Pattern Two: The listener thinks higher bitrates are better

Original >> [Worse] [Worse] [Better] [Worse]

Pattern Three: AAC at a lower bitrate is the same or better than MP3

Original >> [Worse] [Same] [Better] [Worse]
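Mechanising the pattern check is straightforward.  Here’s one hypothetical encoding (the tuple in Pattern One stands for the “Better or Same” slot; the representation is mine, not from the original test sheets):

```python
# verdicts in play order: [MP3 High] [AAC 128] [AAC High] [MP3 128]
PATTERNS = {
    1: ("worse", "better", ("better", "same"), "worse"),  # AAC beats MP3
    2: ("worse", "worse", "better", "worse"),             # higher bitrates win
    3: ("worse", "same", "better", "worse"),              # low-rate AAC >= MP3
}

def matches(responses, pattern):
    """True if every verdict is among the allowed values for its slot."""
    return all(
        r == slot or (isinstance(slot, tuple) and r in slot)
        for r, slot in zip(responses, pattern)
    )

print(matches(["worse", "same", "better", "worse"], PATTERNS[3]))  # True
```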

The first listener’s responses (on a 5.1 Fatal1ty Headset) gave up Pattern 3 in 48 of the 80 possible responses.  But they only thought AAC was better in 37 of the samples, and that higher bitrates were better in 34.  I.e. they believed what they heard was better than MP3 if it was in AAC and the previous clip was either (a) an MP3 at a lower bitrate / quality or (b) AAC at the same rate.  Looked at another way, the listener failed to notice a rate change in 33 of the possible 60 “true test” clips, and a codec change in 36.  We’re not really getting this blinding light – this eureka moment – that we might expect.

Only 3 of the matches to Pattern 3 were pure and exclusive – the other 45 occurrences could just be down to them correctly detecting a rate or codec change.

(I’ve got this data tucked away if anyone wants it!)

The second listener’s results were equally as pants.  They picked up a change in Codec or Bitrate more than 50% of the time (30 for bitrate, 32 for codec), but Pattern 3 emerged only 20 times.

Importantly, these two flatmates of mine couldn’t consistently tell the difference between AAC and MP3 – not in the way frequency plots imply.  There just is not the evidence to support a carte blanche move to AAC just because it has prettier frequency spectrum graphs.  To quote one of the listeners:

It just sounded like someone had tweaked the bass and the treble between tracks – like sometimes when there was a lot of bass, the treble wasn’t so good.  Otherwise they all just sounded the same.

AAC: The Next Generation

This entire blog entry has focused on AAC-LC – the first born.  The evolved standard, formalised some six years after AAC-LC, is HE-AAC.

There are three key enhancements in HE-AAC that are not part of AAC-LC:

SBR (Spectral Band Replication) gets the high frequencies back into our recorded versions at much lower bitrates than either the original form of AAC-LC or MP3 ever could.

Perceptual Noise Substitution (PNS) recognises that the sound of the sea is much the same as wall-sanding.  Even though they are different sound sources, PNS tells your media player to go “shhhhh” by generating the white noise itself.  This frees up even more space for the other, important sounds.

Finally, Parametric Stereo cleverly turns stereo music into mono, and then includes data on how to bring the stereo back. This only kicks in at ultra-low bit-rates, say 24kbps – so in our world it’s not really doing anything – but it’s great for digital radio and telephone conversations.
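Parametric Stereo is the easiest of the three to sketch.  Here’s a grossly simplified, purely illustrative Python sketch of the idea – downmix to mono, keep one level parameter per block, then rebuild an approximate stereo image.  (Real Parametric Stereo works per frequency band and also stores phase and correlation cues; all the names here are made up for illustration.)

```python
# Toy sketch of the Parametric Stereo idea: store a mono downmix plus one
# tiny "how loud was each side" parameter per block, instead of two channels.

def ps_encode(left, right, block=4):
    mono, params = [], []
    for i in range(0, len(left), block):
        l, r = left[i:i + block], right[i:i + block]
        mono.extend((a + b) / 2 for a, b in zip(l, r))
        el = sum(a * a for a in l) or 1e-12   # block energy, left
        er = sum(b * b for b in r) or 1e-12   # block energy, right
        params.append(el / (el + er))          # one number per block
    return mono, params

def ps_decode(mono, params, block=4):
    left, right = [], []
    for i, p in enumerate(params):
        for m in mono[i * block:(i + 1) * block]:
            # split the mono energy back out according to the stored ratio
            left.append(m * (2 * p) ** 0.5)
            right.append(m * (2 * (1 - p)) ** 0.5)
    return left, right
```

The point is the bit-saving: one mono signal plus a trickle of parameters, instead of two full channels – which is why it only makes sense at ultra-low bitrates.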

So, take a look at these frequency plots.  The left hand side is a standard MP3 at 128kbps – on the right is HE-AAC with SBR at just 96kbps:


And the AAC-HE file is 2MB smaller than the MP3 on the left – a whole 50% smaller.

Think back to the plots at the beginning of this blog.  We noted that AAC-LE only re-introduced the upper frequencies at around 192kbps.  Given the efficiencies of PNS and SBR, will HE-AAC bring those frequencies back earlier than both AAC-LE and MP3?  Let’s look at Kelis / 4th Of July (Fireworks).  First, AAC-LE (top row), then AAC-HE (bottom row).

From left to right it’s the original, 128kbps, and then 256kbps.



Wow! At 128kbps, AAC-LE drops a huge block of frequencies at the top end, while HE-AAC has near enough finished its beer and is on the way home.  Notice too how AAC-LE replicates the original at 256kbps, while HE-AAC is still processing and optimising it.

Just for a laugh, here are the MP3 equivalents for your pleasure:


It goes without saying that there is a significant disk space saving to be had, and with it less need for the latest and greatest network connection.  In other words, AAC-HE ought to save you money and perform better.

Let’s get into looking at bitrates, then.

Bitrate (kbps)   Lovely Lines   File Size (KB)
96               image          4044
112              image          4707
128              image          5370
160              image          6697
192              image          8025
224              image          9352
256              image          10679
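As a sanity check, those file sizes line up almost exactly with the simple size = bitrate × duration formula.  A quick back-of-envelope Python sketch (the ~332-second track length is inferred from the table itself, not measured):

```python
# Rough sanity check: an AAC file should weigh about bitrate * duration / 8.
# duration_s=332 is back-calculated from the table, not a measured value.

def estimated_size_kb(bitrate_kbps, duration_s=332):
    return bitrate_kbps * duration_s / 8

for kbps, actual_kb in [(96, 4044), (128, 5370), (192, 8025), (256, 10679)]:
    est = estimated_size_kb(kbps)
    # each estimate lands within a couple of percent of the real file
    print(f"{kbps}kbps: estimated {est:.0f}KB vs actual {actual_kb}KB")
```

In other words, file size here is almost purely a function of the bitrate you pick – the interesting differences are all in the Lovely Lines.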

There are three groupings for this track – one including 96 and 112, one including 128, 160, and 192, and one including 224 and 256.

The last group has a fuzzier plot above 20kHz than the middle group, the middle group has frequencies above 20kHz, and the lower group has nothing above 20kHz.  At this point, you’re probably thinking, “nyeah”, and would plump for 128 for this song.  Thinking ahead and beyond this song (knowing that some songs fare better than others), I’m plumping for 160 to give that all-important leg-room and a sick-bag for the unexpected.

By re-encoding my CDs to AAC-HE at 160kbps, I can expect disk space usage to drop by about 15% – two more albums on my phone for every dozen I have currently.
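That “two more albums per dozen” figure checks out: a 15% drop in per-album size means 1 / 0.85 ≈ 1.18× as many albums fit in the same space.  A quick sketch:

```python
# A 15% space saving per album means the same disk holds 1 / (1 - 0.15)
# times as many albums as before.

def extra_albums(current_albums, space_saving=0.15):
    return current_albums / (1 - space_saving) - current_albums

print(round(extra_albums(12), 1))  # just over two extra albums per dozen
```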

Let us rise up, fair readers!

In conclusion then:

  • MP3 drops upper frequencies, even at its highest settings, for every song thrown at it
  • AAC-HE beats AAC-LE at bitrates over 128kbps
  • AAC-HE does little, if anything, to the sound at bitrates over 160kbps
  • AAC beats MP3 hands down in professional and beer-powered testing

In other words, I can store more and hear the same thing by switching to AAC-HE at 160kbps from MP3.  The only question now is: can I be bothered to re-rip everything?

Getting movies to play on DLNA-compatible devices isn’t easy.  Welcome to my first blog of 2011: where the middle classes commit crime, get confused, and have to go back to playing Doom for a count-to-10-fest.

In the Beginning

After some suspicious law-breaking, caused by not wanting to spend £200 on Ikea shelves, many of the DVDs I own suddenly, and quite by surprise, found themselves stored on two PCs in my house.  Magically catalogued in a Danish “My Movies” system, I can access these movies through a fab-looking graphical interface across my network.



Many LCD / LED televisions allow us to watch movies stored on a computer across our home networks.  Radios supporting the “digital home” mean we can be in our kitchens or bathrooms listening to the music we’ve bought and put onto home PCs.  Even mobile phones now support DLNA!

The pretty Danish thing is not an example of a DLNA Digital Home.  It’s a computer database that tells other computers which file to fetch to play a movie.  Maybe you’ve got something similar already running in your home – a shared folder with all your music in, for example.

DLNA, on the other hand, works more like this:

A digital home server reads original media (it could be a video from your camcorder) and stores it in some suitable format.  It then converts the media as needed before broadcasting it over your network.

The device you’re using (like an iPad) then receives the media, and might convert it internally into a usable format.  The player on the device then turns that into a format that encourages the dunking of biscuits.
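Under the bonnet, the first step in that chain is the device finding your media servers at all – DLNA borrows UPnP’s SSDP discovery for this.  As a rough illustration (a minimal sketch, not a full UPnP stack), this is approximately the multicast search message a DLNA client sends out:

```python
# Minimal sketch of the SSDP M-SEARCH message a DLNA client multicasts
# (UDP, 239.255.255.250:1900) to discover media servers on the network.
# A real client would send this over a socket and parse the HTTP-style replies.

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="urn:schemas-upnp-org:device:MediaServer:1"):
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        "MX: 2",                 # seconds devices may wait before replying
        f"ST: {search_target}",  # "search target": what kind of device we want
        "", "",
    ]
    return "\r\n".join(lines).encode("ascii")

# To actually send it (left commented so the sketch stays self-contained):
# import socket
# s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# s.sendto(build_msearch(), (SSDP_ADDR, SSDP_PORT))
```

Every DLNA server on the network that matches the search target replies with its location, and the conversation about formats and streaming starts from there.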

Read on to find out how I investigated and decided on a digital media home set-up for myself…

Continue reading