Mar 12, 17 / Ari 15, 01 15:03 UTC

Historical background of computer technology has its own importance

Today the computer is of great importance for human development. We should know its historical background. Here I request all Asgardians who know about the history of computer science to share their knowledge.

Mar 12, 17 / Ari 15, 01 16:28 UTC

Hello, Miss Zahlee George. Thanks for your understanding, and I am very grateful to you for appreciating my profile. Please guide me on how to improve it. I live in India with my family and am happy to receive your e-mail. Thanks.

Mar 12, 17 / Ari 15, 01 16:50 UTC

Comment deleted

  Updated  on Jun 15, 17 / Can 26, 01 16:10 UTC, Total number of edits: 1 time
Reason: "This user no longer wishes to be associated with a tin pot banana republic"

Mar 12, 17 / Ari 15, 01 17:02 UTC

The aishagada (Zahlee George) account is a spam account; the user has been banned and the comments deleted.

  Last edited by:  Kristin Marie Underhaug (Asgardian)  on Mar 12, 17 / Ari 15, 01 17:02 UTC, Total number of edits: 1 time

Mar 12, 17 / Ari 15, 01 17:59 UTC

Anything directing your traffic outside of these services should be met with general distrust, especially with such poor bait as "something important to discuss" or the offer of a "picture" that's almost certainly embedded with some exploit.

With regards to the history of computing - that's slightly arguable. You can argue it didn't occur until the invention of the microprocessor, or you could argue Babbage kicked off the whole affair with his Analytical Engine. Or you could sit between them with the likes of Turing's work and the tabulating system IBM set up for the Nazis to track and catalogue Jews. Archimedes supposedly had a device that could be argued to be a mechanical computer, apparently used to predict various repetitive astronomical occurrences.

I personally focus on the transition from valve to transistor technology and the reduction in scale of devices that became silicon semiconductors. For a history of what is likely under your fingers, assuming x86_64, it goes back to about 1978 and the introduction of the Intel 8086, an early 16-bit CPU clocked at 5 MHz (some at 10 MHz). By about 1982 this had evolved into the 80186, still 16-bit and clocking between 6 MHz and 25 MHz, then rapidly into the 80286, another 16-bit chip that absorbed many system functions like memory management and came in steps between 4 MHz and 12.5 MHz for the later models.

1985 saw the 80386, more absorption of system functions, and the introduction of the 32-bit platform that still shapes x64 software today. Clocking between 12 MHz and 40 MHz for the later models, it also delivered significant improvements in speed. There was an 80387 co-processor available that made things like floating-point math more efficient. This evolved into the i486 - the 80xxx notation eventually being dropped due to legal battles over trying to own a number.

The i486 was still 32-bit but faster, at 16 MHz to 50 MHz. It also contained an FPU, so it could do floating point faster than an 80386+80387, along with other upgrades like an on-chip data cache. The legal battle I mention resulted in the i586 being labelled "Pentium" - as a name was easier to own than a number - the P5 still being x86 but bringing yet more improvements and clocking faster, at between 60 MHz and 233 MHz. This stage also introduced "MMX" (MultiMedia eXtensions) to handle processing of media more effectively.

By about 1995 the evolution reaches the P6 / i686 - which covers quite a lot of chips. All still x86, with more improvements arriving with each chip fabrication generation and steadily increasing clock speeds, as the Pentium Pro became the Pentium II, then the Pentium III, changing package completely, then changing again for the Pentium 4, before becoming the Celeron/Celeron M and, by about 2006, Core/Core 2, with clock speeds of up to 4 GHz. About here is when the x86_64 extensions started to creep in: chips gained 64 bits of addressable space, along with the Xeon line still used in heavy server platforms. After that came the i3/i5/i7/Xeon platforms you're possibly already comfortable with.
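
If you want to see which of those milestones survive in whatever is under your fingers right now, the CPUID instruction will tell you. A minimal sketch, assuming GCC or Clang on an x86/x86_64 machine (the bit positions come from Intel's standard leaf 1 and extended leaf 0x80000001; the program itself is just an illustration, not anything from this thread):

    /* Query the CPU for a few of the features mentioned above:
       on-chip FPU (i486 era), MMX (Pentium era), 64-bit long mode. */
    #include <stdio.h>
    #include <cpuid.h>            /* __get_cpuid(), GCC/Clang built-in header */

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        /* Standard leaf 1: EDX bit 0 = FPU on chip, EDX bit 23 = MMX. */
        if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("FPU on chip: %s\n", (edx & (1u << 0))  ? "yes" : "no");
            printf("MMX:         %s\n", (edx & (1u << 23)) ? "yes" : "no");
        }

        /* Extended leaf 0x80000001: EDX bit 29 = long mode, i.e. the x86_64 extensions. */
        if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
            printf("64-bit mode: %s\n", (edx & (1u << 29)) ? "yes" : "no");
        }
        return 0;
    }

On any i3/i5/i7/Xeon all three should report "yes"; on the older parts above they appeared one generation at a time, which is really the whole story in three bits.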

There are obviously other manufacturers, and their efforts have shaped this landscape too. There are other chip types, like ARM, which traces its history back to the RISC processors in the old Acorn Archimedes systems, and there are chips that predate the x86 line.

  Updated  on Mar 12, 17 / Ari 15, 01 18:05 UTC, Total number of edits: 2 times
Reason: typo, formatting fail

Mar 12, 17 / Ari 15, 01 19:16 UTC

Comment deleted

  Updated  on Jun 15, 17 / Can 26, 01 16:09 UTC, Total number of edits: 1 time
Reason: "This user no longer wishes to be associated with a tin pot banana republic"

Mar 12, 17 / Ari 15, 01 19:24 UTC

I'm not entirely certain that would qualify as a "computer" - although it's almost certainly the inspiration behind the 1940s input systems employed by IBM, which held strong until the invention of teletypewriter machines (and why I still have TTY sessions), and consequently the keyboard.