
When did PCs stop being PCs?


Keatah


Yes.

 

Both terms, "clone" and "compatible," were used. "Compatible" was the more formal, professional term. Advertising and box labeling usually said "compatible" because it sounded richer and less cheap than "clone." "Clone" painted the picture of a cheap knock-off, something offshore built for the lowest bid, something not 100%. A copy. "Compatible" created the image of a quality system built to the standards a given company had instituted. Compatible means "works with." Clone means "copy."


On 12/10/2022 at 1:26 PM, Keatah said:

Yes.

 

Both terms, "clone" and "compatible," were used. "Compatible" was the more formal, professional term. Advertising and box labeling usually said "compatible" because it sounded richer and less cheap than "clone." "Clone" painted the picture of a cheap knock-off, something offshore built for the lowest bid, something not 100%. A copy. "Compatible" created the image of a quality system built to the standards a given company had instituted. Compatible means "works with." Clone means "copy."

I seem to recall that the early clones/PC compatibles were not 100% compatible, though. That may have influenced the terminology.


1 hour ago, Gemintronic said:

This year with Windows 11.  You can no longer install Windows without connecting to the Internet and signing up for an account.

 

YES, there are workarounds... but that's classic M$, allowing workarounds at first to ease in some evil.

 

Correct.  I have seen this on some installations of Windows 10, too.

 

https://pureinfotech.com/bypass-internet-connection-install-windows-11/
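For reference, the workaround the link above describes boils down to one command at setup time. A minimal sketch, assuming a Windows 11 build (circa 22H2) where the bypass still exists; Microsoft has been known to change or remove it in later builds:

    rem At the Windows 11 out-of-box-experience (OOBE) screen,
    rem press Shift+F10 to open a command prompt, then run:
    oobe\bypassnro
    rem Setup reboots and then offers an "I don't have internet" option.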


With all the psychological marketing and hype surrounding PC building, and the peer pressure from people and the internet making you feel you need to keep up or you're missing out... has anyone here ever won the race to the top? The overclocking trophy? The bestest, most fastest rig ever in the annals of personal computing? And if you did, how long did you stay at the top? A few days? Weeks? And by whose standards, by which organization's benchmarks, were you judged? And if you partook in such futility, when did you escape the suffocating illness and stop?


Like the 'great' Ivan Drago once said: "I fight to win for Me!  FOR ME!!!"

 

Meaning: build whatever-the-hell satisfies YOU, within your budget and/or your fantasy of what your 'dream PC' should be!

 

In my case, who the hell wants Windows 11? 😛


Has there been any recent push for overclocking? I know there is a segment that gets giddy over having an absurd* number of cores, but clock-speed chasing seems to have slowed. Thirty years ago, even the fastest hardware one could get was often slower than desirable. Now, cores get turned off and the remaining cores spend most of their time at a minimally clocked idle.

 

* If the memory subsystem cannot feed all the cores, the system has cores that cannot be used, and money has been wasted.


8 hours ago, Keatah said:

With all the psychological marketing and hype surrounding PC building, and the peer pressure from people and the internet making you feel you need to keep up or you're missing out... has anyone here ever won the race to the top? The overclocking trophy? The bestest, most fastest rig ever in the annals of personal computing? And if you did, how long did you stay at the top? A few days? Weeks? And by whose standards, by which organization's benchmarks, were you judged? And if you partook in such futility, when did you escape the suffocating illness and stop?

Some people just feel they have to have the best; they can't stand that someone else somewhere is getting 2 fps more than them.

 

It's also not as widespread as you'd think. If you look at the Steam hardware survey, maybe 5% at most have really beefy PCs, and the vast majority have average or even somewhat dated ones.


38 minutes ago, Krebizfan said:

Has there been any recent push for overclocking? I know there is a segment that gets giddy over having an absurd* number of cores, but clock-speed chasing seems to have slowed. Thirty years ago, even the fastest hardware one could get was often slower than desirable. Now, cores get turned off and the remaining cores spend most of their time at a minimally clocked idle.

I love how overclocking has become a marketing thing. My GPU advertises that it's overclock-capable, has enough cooling to handle it, and even comes with overclocking tools. It used to be that overclockers were kind of rogue: you were daring to run things faster than they were meant to run, and if heat was a problem, it was on you to figure out how to cool it.


1 hour ago, zzip said:

It's also not as widespread as you'd think. If you look at the Steam hardware survey, maybe 5% at most have really beefy PCs, and the vast majority have average or even somewhat dated ones.

Perhaps in recent times the urge has lessened, what with all the speed changes that happen inside a modern chip. This ain't your grandfather's oscillator.

 

SpeedStep and Turbo Boost do it for you. Today it seems the chips will run as fast as you can cool them; it's all about thermal headroom. And thankfully air cooling has become highly capable, because in an ordinary rig that means it can run silently.

 

My recent Intel builds have base clock speeds 200-600MHz slower than the Pentium 4 Northwood and Prescott of 20 years ago, while of course being vastly more efficient and quicker.

 

1 hour ago, zzip said:

I love how overclocking has become a marketing thing.

The industry needs something. Core count isn't as sexy as clock speed, and it seems easy to add cores willy-nilly. E-cores and P-cores and thread-arbitration circuits just don't lend themselves to wow moments on paper anymore, especially as software is still more single-threaded than not!


2 hours ago, Krebizfan said:

Has there been any recent push for overclocking? I know there is a segment that gets giddy over having an absurd* number of cores, but clock-speed chasing seems to have slowed.

I don't know much about the core-chasing crowd. I'm still impressed by anything more than 6 or 8 cores, and that's mainstream today.

 

I wonder if the speed chasing hasn't slowed because it's all automatic now, based on that thermal-headroom factor. Just give a system enough cooling to keep the CPU below 100°C and it will run as fast as it can. Nothing to it!

 

2 hours ago, Krebizfan said:

Thirty years ago, even the fastest hardware one could get was often slower than desirable. Now, cores get turned off and the remaining cores spend most of their time at a minimally clocked idle.

It's a stark illustration of how we use our systems today. Most tasks don't require 8 or 16 cores running full-tilt at breakneck speeds.

 

Like right now, I'm sitting here organizing my emulator stuff. That doesn't need anything more than 800MHz or so. Same with word processing. And video playback has its own dedicated instructions and circuitry, so it, too, can idle along stress-free.

 

I suppose I permanently dropped out of the MHz race once I saw how well modern parts ramp their speed and just plow through everything. Instead of tweaking everything manually, just feed the system cooling and let it auto-adjust to load as it sees fit.

 

2 hours ago, Krebizfan said:

* If the memory subsystem cannot feed all the cores, the system has cores that cannot be used, and money has been wasted.

This is why I'm so big on cache: L1, L2, and L3. A mainstream i9 KF part has more than 64MB of on-die cache. That's about the entire memory complement of an early-to-mid-'90s computer.

 

As a trivia question: did any classic 8-bit system have a cache, other than the Apple //c+?


On 12/14/2022 at 7:03 PM, Keatah said:

The industry needs something. Core count isn't as sexy as clock speed, and it seems easy to add cores willy-nilly. E-cores and P-cores and thread-arbitration circuits just don't lend themselves to wow moments on paper anymore, especially as software is still more single-threaded than not!

Today, from what I can see, the race is more about having full blinkenlights and the lowest temperatures. Of course, low temps are always good for the hardware, but I've seen people who consider your computer wholly unoptimized if your CPU goes over 50°C at full load...

 

Overclocking is still a thing, but not as much, since many motherboards include it; and, well, gaining 200MHz on top of 3GHz is more symbolic than anything else.

 


4 hours ago, CatPix said:

Of course, low temps are always good for the hardware, but I've seen people who consider your computer wholly unoptimized if your CPU goes over 50°C at full load...

True enough. Efficiency is always good, and higher temps tend to limit the self-overclocking mechanisms.

 

IMHO the best coolers are large passive-style air radiators, not water cooling. Water cooling is fraught with durability and longevity issues: it has to be serviced every couple of years, and even in the interim, problems can develop. Not so with a large Mugen 5 air radiator. Silent, and it just works.


Efficiency is good, yes, but not even 10 years ago, 50°C was a regular idle temperature for a desktop CPU.

Now we can get the same temperature from a CPU under load? That is basically excellent.

High-performance laptops still hit 80°C or more under load.

Sure, it is better for the longevity of your hardware, but those people are the same ones who replace their CPU or whole computer every 2 years or so.

Bickering about CPU temps under load is the equivalent of the '90s race to squeeze 10 more MHz out of a Pentium III.

I have seen people whose goal is to get their CPU basically down to room temperature. It's just about all most people have left to do on their PCs, now that the MHz race is dead, or trivially easy.

 

And you're right: usually those people are all about water cooling.

 


I am a firm believer in reading the manuals for maximum temperature. Different companies place the sensors differently, yielding different readings. I remember a few years ago AMD listed lower maximum temps than Intel, but the AMD sensor would also register a lower temperature.

 

I don't like running fans at maximum speed; the noise bothers me. If the system proves fast enough running a bit slow and using only minimal passive cooling, that is perfect for me.


Speak for yourself. :)

 

I do enjoy tinkering, but I don't like having shit break because of my being an edgelord. I very much like reliability.

 

When I plan builds these days, it's with a '10-year realistic service life' baked in. I rode the 5-year-plan version in the '90s and 2000s, and don't fancy riding it again.

 

 


28 minutes ago, wierd_w said:

When I plan builds these days, it's with a '10-year realistic service life' baked in. I rode the 5-year-plan version in the '90s and 2000s, and don't fancy riding it again.

My customers do the five-year maximum to keep critical machines under warranty. Far too many of them like to push that out and make me responsible for keeping things running, which is why I have so many S3420 and S1200 motherboards lying around, though the last of those machines has been taken out of service so I no longer need them.

 

Me, however... my laptop is seven years old, and while my i7 desktop is only about four years old, it replaced a Core2Quad which was about ten years old (time flies when you're having fun). I have one server which is 20 years old running an Athlon XP 1900+, another running on almost 30-year-old hardware with an AMD K6-III+, and a Sun SS10 clone with dual Ross 100s which is old enough to rent a car. None of these runs a critical role, but it is some kind of sick and sadistic fun to keep them running.


Continuing to run a 20-year-old Pentium M, undervolted and with idle set to 50-598MHz for R-Pi-class power consumption. Of course it ramps up to 1.7GHz and dissipates 21 watts on demand, but it's great for office work and light emulation duty these days.


3 hours ago, OLD CS1 said:

My customers do the five-year maximum to keep critical machines under warranty. Far too many of them like to push that out and make me responsible for keeping things running, which is why I have so many S3420 and S1200 motherboards lying around, though the last of those machines has been taken out of service so I no longer need them.

 

Me, however... my laptop is seven years old, and while my i7 desktop is only about four years old, it replaced a Core2Quad which was about ten years old (time flies when you're having fun). I have one server which is 20 years old running an Athlon XP 1900+, another running on almost 30-year-old hardware with an AMD K6-III+, and a Sun SS10 clone with dual Ross 100s which is old enough to rent a car. None of these runs a critical role, but it is some kind of sick and sadistic fun to keep them running.

 

I like sick and sadistic things also, but I don't have the real estate to run vintage equipment. Instead, I have done "nobody sane would EVER try to do this, so let's do it anyway!" type things. Just because I can.

 

One such foray was setting up a NetWare 5 server inside DOSBox (with a hosted DOS disk image), running on a NAS, with a VNC virtual X server. It ran about as well as you would expect. :P

It's not like I actually NEEDED to test that, or even needed a NetWare server at all, really. It was one of those "I wonder if" moments; the rough recipe is sketched below.
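For the curious, the usual DOSBox recipe for booting an operating system from a hosted disk image looks roughly like this. A sketch only: the image name and CHS geometry are illustrative, and stock DOSBox's networking is limited to IPX-over-UDP tunneling (conveniently, the protocol NetWare speaks):

    rem In DOSBox's [autoexec] section (or typed at its prompt):
    rem mount the raw hard-disk image as the first BIOS hard drive...
    imgmount 2 netware.img -size 512,63,16,520 -t hdd -fs none
    rem ...then boot from the mounted image.
    boot -l c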


2 months later...

Replying to an almost-dead thread, but...

 

I really think the PC started to become something different during the Win95 and 98SE era, when Microsoft added DirectX (it's at DX6 through DX12 by now, I think). Each DX release had its own set of .LIB files, which made graphics more a matter of the API than of the actual hardware; games that used newer graphical effects would be built on the next release of DX. A good example of this is Gothic 3 versus Hellgate: London, DX9 versus DX10. Both DX9 and DX10 could do depth of field, yet you needed DX10 to get the effect in Hellgate: London; not because DX9 couldn't handle it, but because it was easier under DX10, since it was part of the DX10 .LIB files and simpler for developers to program. Basically, they stopped developing games for the hardware and instead had to make games compatible across ATI and NVIDIA.

 

One more reason I like OpenGL is that it's open source, and developers don't have to pay Microsoft a fee to use DX.

 

Anyway... what depth of field adds as a graphic effect is a blurry background against a crisp foreground.
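To make that concrete, here is a minimal sketch of a depth-of-field post-process as a fragment shader. It's written in GLSL (since OpenGL came up) rather than HLSL/DirectX, and every name in it (colorTex, depthTex, focusDepth, blurScale) is illustrative, not taken from either game:

    #version 330 core
    // Depth-of-field sketch: blur grows with distance from the focal plane.
    in vec2 uv;                  // screen-space texture coordinate
    out vec4 fragColor;

    uniform sampler2D colorTex;  // sharply rendered scene
    uniform sampler2D depthTex;  // scene depth in [0,1]
    uniform float focusDepth;    // depth of the plane that stays crisp
    uniform float blurScale;     // how fast blur ramps up away from focus

    void main() {
        float depth = texture(depthTex, uv).r;
        // "Circle of confusion": 0 at the focal plane, larger farther away.
        float coc = clamp(abs(depth - focusDepth) * blurScale, 0.0, 1.0);

        // Cheap 5x5 box blur whose sample spread scales with the CoC.
        vec2 texel = 1.0 / vec2(textureSize(colorTex, 0));
        vec3 acc = vec3(0.0);
        for (int x = -2; x <= 2; ++x)
            for (int y = -2; y <= 2; ++y)
                acc += texture(colorTex, uv + vec2(x, y) * texel * (coc * 4.0)).rgb;

        fragColor = vec4(acc / 25.0, 1.0);
    }

Pixels near focusDepth sample a tight cluster and stay crisp; pixels far from it sample a wide cluster and smear, which is exactly the blurry-background, crisp-foreground look.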

 

 

 

[Screenshots: "Gothic 3 DX 9.0" and "DX 10"]

 

 

PS: I'm drunk, so don't get on me too much about things LOL!!

