What bit are we on now?

Like, we had the 8-bit era and the 16-bit era and the 32-bit era but then, aside from the obvious N64, people seemed to stop counting bits. Did they cease to be important? What are we on now?

#2
I was thinking about the same thing the other day. I don't remember anything beyond 128-bit.

#3
Not fully sure but I think the processors used in the 360 and PS3 are 64-bit technically.

I don't think they matter so much any more.

PS2's CPU was described (by Sony) as being 128-bit but was technically only 64-bit too.

#4
It's a bit (lol) irrelevant with GPU vs CPU now isn't it?

#5
Didn't we have this thread a few months back?

#6
We did. The answer was terribly convoluted and dull.

Basically the "bit" has got something to do with the bus speed or number of processes and the architecture is altogether more complicated now so the term is redundant. Or something.

#7
Especially so when you only need 16, tops

#8
Yeah, it used to define the number of bits of information that a CPU could read in a single clock cycle.

8-bit machines could only process 8 bits (one byte) at a time, which limited them to using relatively small numbers in calculations. To perform more complex calculations, they needed to combine the results of many calculations ... which slowed things down.

So as bits increased, computing got more efficient, faster, and sexier.

However, these days we tend to have CPUs that are performing several calculations concurrently (multiple cores, etc) and other co-processors doing donkey work in the background, and all manner of other wizardry, so declaring this to be the "n-bit era" is a bit pointless.
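
To make that "combine the results of many calculations" point concrete, here is a simplified C sketch (illustrative only, not actual 8-bit machine code; the function name and example numbers are made up) of adding two 16-bit numbers when you can only work on 8 bits at a time: add the low bytes first, then the high bytes plus the carry.

    #include <stdint.h>
    #include <stdio.h>

    /* Add two 16-bit numbers using only 8-bit operations, the way an
       8-bit CPU has to: low bytes first, then high bytes plus the carry. */
    static uint16_t add16_with_8bit_ops(uint16_t a, uint16_t b)
    {
        uint8_t a_lo = a & 0xFF, a_hi = a >> 8;
        uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

        uint8_t lo    = (uint8_t)(a_lo + b_lo);         /* first 8-bit add        */
        uint8_t carry = (lo < a_lo) ? 1 : 0;            /* did the low byte wrap? */
        uint8_t hi    = (uint8_t)(a_hi + b_hi + carry); /* second 8-bit add       */

        return (uint16_t)((hi << 8) | lo);
    }

    int main(void)
    {
        /* 1000 + 30000 = 31000, but it takes two 8-bit additions to get there. */
        printf("%u\n", add16_with_8bit_ops(1000, 30000));
        return 0;
    }

A 16-bit CPU does the same sum in a single operation, which is the whole appeal of a wider word size.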

#9
Originally posted by peeveen
and all manner of other wizardry
Excellent technical description. I'll be using that.

#10
Pah, I remember when we talked about K, never mind bits! 128K, those were the days! How many K are we at now then?

#11
As peeveen said, the term referred to the console's word size, which long ago became irrelevant. It certainly was not the size of the address bus!

The K you refer to was the amount of memory the machine had (the same as your PC's RAM but with all kinds of extra usage limitations on various portions) and was used as a kind of slang for kB, which is kilobytes (so 1024 * 8 bits). This has nothing to do with processing power or speed but affects the amount of game data that is loaded at any one time and the workspace for game state (the type of memory used can affect performance speed though).

The 360 has 512MB of main memory, not including that on the graphics card (I know little about the 360's internals), so you are looking at 524288K.
Last edited by averybluemonkey; 03-12-2008, 20:34.
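
The conversion at the end is just powers of two; here is a trivial C sketch using the 512MB figure from the post (graphics memory ignored, as above):

    #include <stdio.h>

    int main(void)
    {
        /* 1K here means one kilobyte: 1024 bytes, i.e. 1024 * 8 bits. */
        unsigned long long megabytes = 512;              /* 360 main memory, per the post */
        unsigned long long kilobytes = megabytes * 1024; /* 524288K */
        unsigned long long bits      = kilobytes * 1024 * 8;

        printf("%lluMB = %lluK = %llu bits\n", megabytes, kilobytes, bits);
        return 0;
    }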

#12
Darwick was having fun, saying he's so old he predates the great 8-bit+ generation.

Though I have to say - 128k! Wow! When I started it was 1k! You had to have a C128 or CPC128 for a whole 128k!!

#13
Oops!

#14
Dreamcast was advertised as 128-bit.

#15
Originally posted by Dogg Thang
Like, we had the 8-bit era and the 16-bit era and the 32-bit era but then, aside from the obvious N64, people seemed to stop counting bits. Did they cease to be important? What are we on now?
The Dreamcast has a Hitachi SuperH-4 processor which should be 64-bit; the PS2's Emotion Engine has a 64-bit ALU and a 32-bit FPU; the GameCube's Gekko is a modified PowerPC G3 with a 32-bit ALU and a 64-bit FPU. (*)
The Xbox's central processing unit is a modified Intel Celeron, 32-bit.

Describing current-gen CPUs is even harder: the PS3's Cell processor is composed of PPEs and SPEs, each with registers of either 32, 64 or 128 bits.
Xenon (the X360's CPU) is a three-core CPU derived from the PowerPC architecture, capable of 128-bit calculations when using SIMD instructions.
The Wii's CPU, Broadway, is supposedly derived from the GC's Gekko, but is smaller, runs at a higher speed and draws less power - nothing much is known (just like the Wii graphics chip) as neither Nintendo nor IBM has released any official papers.

(*) A very, very, very simplified explanation: the ALU is the most basic CPU component, which handles additions and subtractions. FPUs normally calculate divisions, multiplications and square roots.
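
To give a rough feel for what "128-bit calculations when using SIMD instructions" means: a single 128-bit register holds four 32-bit values, and one instruction operates on all four at once. The C sketch below uses x86 SSE intrinsics purely as an illustration; Xenon itself uses VMX/AltiVec-style instructions rather than SSE, so this is the idea, not the console's actual instruction set.

    #include <stdio.h>
    #include <xmmintrin.h>  /* x86 SSE intrinsics, used here only for illustration */

    int main(void)
    {
        /* One 128-bit register holds four 32-bit floats; a single
           instruction adds all four pairs at once. */
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
        __m128 sum = _mm_add_ps(a, b);

        float out[4];
        _mm_storeu_ps(out, sum);
        printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }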
