Difference between 192 kbps and 256 kbps?

  • Thread starter: arninn
Of course it's audible. Maybe between 256 and 320 not that much, but at 192 you can definitely hear the lack of freshness in the high frequencies.

The typical mp3 "blur" is there.
 
I'd go as far as to say there are noticeable differences across the entire range of kbps settings for mp3. I think there's a huge difference between 256 kbps and 320 kbps if you A/B the audio. It's just that once you get to 160 kbps, the sound quality should be tolerable.

That's not to say higher rates won't have a noticeable difference when A/Bing; I'm just saying you shouldn't care much if all you've got is a 160 kbps or better mp3.
 
Definitely an audible difference, even to the average listener.

If you want to get technical, the difference is how much data per second (kilobits per second, or kbps) is used to represent the original audio waveform. If there were no compression, the signal would logically be identical to the original.
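To put rough numbers on that (a back-of-the-envelope sketch assuming a standard CD source: 44.1 kHz, 16-bit, stereo):

```python
# Rough comparison of the uncompressed CD data rate vs common MP3 bitrates.
# Assumes standard CD parameters: 44.1 kHz sample rate, 16-bit samples, stereo.
SAMPLE_RATE = 44_100   # samples per second
BIT_DEPTH = 16         # bits per sample
CHANNELS = 2           # stereo

cd_kbps = SAMPLE_RATE * BIT_DEPTH * CHANNELS / 1000   # ~1411.2 kbps

for mp3_kbps in (128, 192, 256, 320):
    print(f"{mp3_kbps} kbps MP3 keeps roughly 1/{cd_kbps / mp3_kbps:.1f} of the raw data rate")
```

So a 192 kbps MP3 has to describe each second of audio with about a seventh of the data a CD uses; the encoder decides what gets thrown away.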

It's actually better to encode files with a variable bit rate (VBR) rather than a constant bit rate (CBR); the encoder can then spend bits where they're needed instead of being forced to "over-sample," which helps prevent artifacts.

The newest LAME codec is good enough though, and I think a 320 CBR rip sounds pretty damn good.
 
Right. An average song is about 40-50 MB uncompressed. The same song as a 192 kbps MP3 is about 5 MB. The quality is exactly 10 times worse.
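For what it's worth, here's where that rough order-of-magnitude size difference comes from (a sketch assuming a hypothetical 4-minute, CD-quality track; the exact ratio for 192 kbps works out closer to 7-8x):

```python
# Back-of-the-envelope file sizes for a hypothetical 4-minute (240 s) track.
# Assumes a CD-quality source: 44.1 kHz, 16-bit, stereo.
DURATION_S = 240

uncompressed_bytes = 44_100 * 2 * 2 * DURATION_S   # sample rate * bytes/sample * channels * seconds
mp3_192_bytes = 192_000 / 8 * DURATION_S           # 192 kilobits per second -> bytes per second

print(f"uncompressed: {uncompressed_bytes / 1e6:.1f} MB")        # ~42.3 MB
print(f"192 kbps MP3: {mp3_192_bytes / 1e6:.1f} MB")             # ~5.8 MB
print(f"ratio:        {uncompressed_bytes / mp3_192_bytes:.1f}x smaller")
```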
 
Right. An average song is about 40-50 MB uncompressed. The same song as a 192 kbps MP3 is about 5 MB. The quality is exactly 10 times worse.

Quality is a very subjective thing. The file size is 10x less, but it doesn't sound 10x worse. MP3 compression works on the principle that it removes frequencies your ear can't hear. Obviously, the lower the bitrate of the compression, the less true this is.

If that were the case, then what would be the point of MP3 compression in the first place? 10x smaller = 10x worse quality? Definitely not. That would be a really poor (and useless) compression algorithm.

The value of mp3 compression is that the small files it creates sound pretty damn good (especially when you compare a WAV file to a 320 mp3).

From a visual standpoint, you can have a 40 MB TIFF image and a 4 MB JPEG image. The JPEG won't necessarily look "10x worse."
 
Quality is a very subjective thing. The file size is 10x less, but it doesn't sound 10x worse. MP3 compression works on the principle that it removes frequencies your ear can't hear.


What frequencies? At a 44.1 kHz sampling frequency, the highest frequency you can have is 22.05 kHz. The human ear can hear up to about 20 kHz already, so what can you remove without audible loss?
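(A quick sanity check on that ceiling; this is just the Nyquist relation, nothing encoder-specific:)

```python
# Nyquist limit: a sampled signal can only represent frequencies
# up to half its sample rate.
sample_rate_hz = 44_100
nyquist_hz = sample_rate_hz / 2
print(nyquist_hz)   # 22050.0, i.e. 22.05 kHz -- just above the ~20 kHz limit of human hearing
```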

Plus, even frequencies you can't hear can affect the way you perceive the sound.

There's nothing to even talk about. If you say a 320 kbps MP3 is okay, that's a different thing. Still not CD quality, but all good.

But someone who can't hear how 192 kbps fucks up audio shouldn't produce at all.
 
kbps in mp3 is how many kilobits per second are used to store the audio. Basically, the higher that number, the more data from the original audio is kept in the resulting mp3 track.


Basically it results in higher quality audio.
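If you want to check what a given file was actually encoded at, something like this works (a sketch assuming the third-party mutagen library and a hypothetical file path):

```python
# Print the bitrate and sample rate of an MP3 file.
# Assumes the third-party "mutagen" library (pip install mutagen);
# "song.mp3" is a placeholder path.
from mutagen.mp3 import MP3

info = MP3("song.mp3").info
print(f"bitrate:     {info.bitrate // 1000} kbps")
print(f"sample rate: {info.sample_rate} Hz")
```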
 
What frequencies? At a 44.1 kHz sampling frequency, the highest frequency you can have is 22.05 kHz. The human ear can hear up to about 20 kHz already, so what can you remove without audible loss?

Plus, even frequencies you can't hear can affect the way you perceive the sound.

There's nothing to even talk about. If you say a 320 kbps MP3 is okay, that's a different thing. Still not CD quality, but all good.

But someone who can't hear how 192 kbps fucks up audio shouldn't produce at all.

The important take-away from my earlier post is that 10x decreased file size doesn't equate to 10x worse quality.

I don't understand the specifics of the psychoacoustics thing, but that's the general principle of how the compression algorithm works.
 
The important take-away from my earlier post is that 10x decreased file size doesn't equate to 10x worse quality.

I don't understand the specifics of the psychoacoustics thing, but that's the general principle of how the compression algorithm works.

Ten times fewer bytes means ten times less information. Of course, there is information that isn't really lost, just stored more efficiently (lossless compression), and there is information that's actually thrown away (lossy compression). MP3 basically does both.
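A tiny illustration of that lossless/lossy split (just a sketch on plain bytes, not actual MP3 encoding):

```python
# Lossless vs lossy compression, illustrated on plain bytes (not real MP3 encoding).
import zlib

data = b"the same riff repeated " * 100

# Lossless: smaller on disk, but decompression recovers every byte exactly.
packed = zlib.compress(data)
assert zlib.decompress(packed) == data
print(f"lossless: {len(data)} -> {len(packed)} bytes, perfectly recoverable")

# Lossy: here we crudely throw away every second byte; the original can never
# be reconstructed exactly. MP3 is far smarter about *what* it discards,
# but the discarded information is gone for good in the same way.
lossy = data[::2]
print(f"lossy:    {len(data)} -> {len(lossy)} bytes, original is unrecoverable")
```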

All I was trying to say is that 192 kbps is f'd up.
 
At the radio station I play at, the MINIMUM bitrate you can play is 192. I guess most people can't hear the difference.

I don't think you could put on an MP3 for me and I could say "oh no, this is 192 kbps, take it back!" like a lot of you act like you can do, but you can definitely hear when a song is 128. I think 192 is kind of the cutoff where they start removing a lot of the clearly audible frequencies of a song.

Doesn't matter to me; I still spin vinyl, which doesn't have a bitrate, just lots of dust.
 
The difference is (256 - 192) kbps = 64 kbps
SO EASY!


Real world = analog (voice)
Computers = digital

In order to represent an analog sine wave (voice) in a way computers can understand, we need to sample it. The higher the sampling rate, the closer the digital representation gets to the original analog sine wave.
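Roughly what that looks like in numbers (a sketch sampling a hypothetical 1 kHz test tone at two different rates; the printed values are the discrete snapshots a computer would store):

```python
# Sample a 1 kHz sine wave at two different rates and compare how many
# snapshots per cycle each rate captures.
import math

FREQ_HZ = 1_000   # a 1 kHz test tone

for sample_rate in (8_000, 44_100):
    samples_per_cycle = sample_rate / FREQ_HZ
    first_samples = [round(math.sin(2 * math.pi * FREQ_HZ * n / sample_rate), 3)
                     for n in range(5)]
    print(f"{sample_rate} Hz: {samples_per_cycle:.1f} samples per cycle, e.g. {first_samples}")
# More samples per cycle means the stored points trace the original wave more closely.
```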

There comes a point where sampling is "good enough" for some people but not for others, depending on how good your ears are. For my ears, 320 kbps is the best quality for MP3.
 
But then again, being able to tell the difference also depends on your monitoring system. For example, it would be easy to tell the difference with an Apogee D/A and a set of Genelec monitors... but would you be able to on some $15 mp3 player speakers?

I suppose that's really just splitting hairs though.
 
At the radio station I play at, the MINIMUM bitrate you can play is 192. I guess most people can't hear the difference.

I don't think you could put on an MP3 for me and I could say "oh no, this is 192 kbps, take it back!" like a lot of you act like you can do, but you can definitely hear when a song is 128. I think 192 is kind of the cutoff where they start removing a lot of the clearly audible frequencies of a song.

Doesn't matter to me; I still spin vinyl, which doesn't have a bitrate, just lots of dust.

You have to broadcast at 192 because of transmission losses and signal degradation.

If I'm listening to the radio in my truck, I probably would not be able to tell the difference between a 192 kbps broadcast signal and one broadcast at a higher bitrate. But if I have my "BEATS" headphones on and I'm listening to an mp3 on my computer, I can definitely tell the difference, and so can a lot of other people.

It's a bit like those tone deaf people on American Idol during auditions lol. The trained ear can tell that those people can't sing, but the tone deaf people swear that they're the next Celine Dion.
 
I think it makes a difference because the more kbps, the more frequencies you're going to get, and you want a full range of high frequencies to get that air in your mix, but not too much though.
 