# The problems between Intel and AMD



## The red spirit (Sep 29, 2015)

It's no surprise that Intel and AMD are always competing in the CPU market and that they are its two leaders. Recently I found some articles that greatly illustrate some problems that we consumers may have never known about. We often care about which company offers the faster, more practical product, but these articles are more about management, finances and the great story of undying competitors.


Links:
https://jolt.law.harvard.edu/digest/intel-and-the-x86-architecture-a-legal-perspective
https://arstechnica.com/information...all-of-amd-how-an-underdog-stuck-it-to-intel/


At least to me it became a bit clearer why AMD is always the underdog after reading these. Also why people call Intel evil. Many things just became obvious in my head. lol I never had any idea that AMD once had a crazy CEO.


----------



## tanstaafl28 (Sep 10, 2012)

@The red spirit

I've almost always been an AMD fan. I've always gotten reliable performance from them at a more affordable price.


----------



## The red spirit (Sep 29, 2015)

tanstaafl28 said:


> @The red spirit
> 
> I've almost always been an AMD fan. I've always gotten reliable performance from them at a more affordable price.


lol. I never had any Intel device myself, only used them via relatives or for short stretches, like at school or in a store. Their chips are fine, but I just can't buy them, not because of money, but because AMD offers more for less and Intel just seems like an illogical choice. Also, AMD has always been such a dramatic and romantic company. Intel is just like a monster that eats everyone alive if they come too close, meanwhile AMD is always that Don Quixote you end up falling in love with; the only problem is that he ends up too close to the big monster Intel. This is probably a laughable visualization, but to me AMD feels more human and closer to the heart. I grew to prefer AMD over Intel. 


BTW I will do yet another AMD project really soon and post it in perC. The current codename is "Mild velocity"; other candidates were "Overdrive" (it sort of sounds too Intel), "OverGHzing" (it misses the point) and "Gentle gust" ("Mild velocity" sounds better). I will let your imagination run wild about what it's actually gonna be. I can say that it will likely be something not really expected and out of the ordinary.


----------



## SgtPepper (Nov 22, 2016)

i worked with intel for a year before i learned about how they teamed up with microsoft to keep people in the dark about how to create their own OS. fuck em all tbh.


----------



## The red spirit (Sep 29, 2015)

SgtPepper said:


> i worked with intel for a year before i learned about how they teamed up with microsoft to keep people in the dark about how to create their own OS. fuck em all tbh.


Seriously, the more I know about Intel, the more I think I will get nightmares. Just out of curiosity, are you still working in a computer-related field?


----------



## SgtPepper (Nov 22, 2016)

The red spirit said:


> Seriously, the more I know about Intel, the more I think I will get nightmares. Just out of curiosity, are you still working in a computer-related field?


Not seriously anymore, only as a hobby. I used to work as a Network Engineer, then tried to get into the kernel programming world, which I still love, but not with companies who want to lock down the game and maintain their monopolies.

i changed to the art and music field now.


----------



## The red spirit (Sep 29, 2015)

@ae1905 what do you think about Intel and AMD?


I swear I will not hate Intel, their management is generally good. AMD is a pure mess. I still don't like some of Intel's aggressive and criminal actions, but they can make great products. I will try to keep the conversation civil this time


----------



## Maybe (Sep 10, 2016)

I just bought an older laptop off eBay.

Dell Latitude E6530
Intel Core i7-3540M 3.00GHz, 2 Core(s), 4 Logical Processor(s)

With integrated Intel HD 4000 graphics.

It's not bad considering what I paid for it. I'm pretty happy with what it can do. The graphics are crap, but that's what I expected.


----------



## The red spirit (Sep 29, 2015)

Maybe said:


> I just bought an older laptop off eBay.
> 
> Dell Latitude E6530
> Intel Core i7-3540M 3.00GHz, 2 Core(s), 4 Logical Processor(s)
> ...


Don't want to disappoint you, but that's a bit off-topic. This thread is mostly about AMD's and Intel's management, finances, strategies, history and what they achieved.

BTW yeah, Intel graphics are the worst. I remember they were too slow for NFS Underground unless you ran it at the lowest settings and 640x480, and they fared about the same as an FX 5200 in 3DMark 2001 SE. My dad's brand new laptop feels slow, because the graphics can't render PDF page scrolling fast enough. Even with Nvidia Optimus this problem persists. I had a motherboard with ATI integrated graphics, the ATI 3000 series IGP, and it wasn't so bad. Nor was the integrated Nvidia 6150 atrocious. And now, considering how many things are accelerated by the GPU, Intel should definitely do something about it, or they will become uncompetitive without a graphics card, which would mean a whole Intel PC becomes more expensive and people just won't buy it. I also read that Intel graphics often have various graphical glitches, poor rendering quality and other problems that were mostly AGP-era things. Intel graphics are the longest-surviving graphics decelerators.


----------



## ae1905 (Jun 7, 2014)

The red spirit said:


> @*ae1905* what do you think about Intel and AMD?
> 
> 
> I swear I will not hate Intel, their management is generally good. AMD is a pure mess. I still don't like some of Intel's aggressive and criminal actions, but they can make great products. I will try to keep the conversation civil this time
> ...



your arstechnica article--and this thread--is curiously dated






also, just in time for this thread:

*Intel CEO Brian Krzanich Resigns Over Relationship With Employee*


----------



## The red spirit (Sep 29, 2015)

ae1905 said:


> your arstechnica article--and this thread--is curiously dated


Not sure what you mean here. This is a thread about the history, actions and tactics of these companies. It's not like Intel's suing attempts will disappear from history, nor will their criminal actions be deleted from the records. Nor will AMD's poor management choices or "real men have fabs". Things that happened will remain and stand as signs of how those companies operate. These may be scars, but it's still interesting to know more about the deeper level of how those companies worked. 





ae1905 said:


>


So you are saying that AMD is now stronger? I thought I posted this thread to show why AMD struggled despite offering really closely performing processors for a long time. This video says something about the current situation, and it's true. Well, okay. Nice addition, it's just that the thread was more about what happened.





ae1905 said:


> also, just in time for this thread:
> 
> *Intel CEO Brian Krzanich Resigns Over Relationship With Employee*


lol, what a pain to read it in such a format. I read that, but I didn't understand what happened well, so I read these two instead:
https://www.theverge.com/2018/6/21/17488070/intel-ceo-brian-krzanich-resigns-employee-relationship
Intel CEO Brian Krzanich resigns after investigation finds past relationship with employee - Business Insider

So yeah, it was a mess. It looks like a tyrannical decision that was being covered up. Totally not nice, but Brian shouldn't have broken that policy if everyone has to obey it. Well, they knew before applying for the job that they would have to do what the rules say, but it's still a bit cold. It feels wrong to do such a cruel thing, but as history shows, and as this action shows, Intel doesn't care about that. Intel does what they think will be successful and they are really strict about that. There's a lack of morality in their upper management, I guess. I find that really wrong.


----------



## contradictionary (Apr 1, 2018)

The red spirit said:


> Links:
> https://jolt.law.harvard.edu/digest/intel-and-the-x86-architecture-a-legal-perspective
> ]


Ahhh the infamous Am386 chip, my first taste ever of a CPU product, which left me with a lifetime impression.

This kind of flashback brought back the memory of when AMD last sat on the performance throne, with their Athlon X2 lineup, for quite some time, until Chipzilla reclaimed it in 2006 with their new Core 2 architecture.

Which brings to my attention that the most recently revealed vulnerabilities, in the form of Meltdown and Spectre (and derivatives), which have haunted Intel since the new year, reveal some of their secret recipe going way back to early Core 2, i.e. unprivileged access to the branch prediction unit. Disregarding standard security measures for the most possible performance, at all cost. 

Security by obscurity, until someone found out that by cache priming and probing they can create a covert communication channel into the OS kernel, fully undetected.
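For anyone curious how "priming and probing" leaks data, here's a toy model of the idea. This is purely illustrative, not exploit code: a real prime+probe attack fills physical cache sets and measures access latency with a cycle counter, while here the "cache" is just a dict and eviction is simulated.

```python
# Toy simulation of a prime+probe cache side channel.
# Real attacks time actual cache accesses (e.g. via rdtsc); here
# "slow" simply means the attacker's line was evicted.

N_SETS = 16  # pretend cache with 16 sets

def victim(cache, secret):
    # The victim's memory access pattern depends on a secret value:
    # it touches one cache set, evicting whatever the attacker put there.
    cache[secret % N_SETS] = "victim"

def prime_and_probe(secret):
    # PRIME: attacker fills every cache set with its own data.
    cache = {s: "attacker" for s in range(N_SETS)}
    victim(cache, secret)  # victim runs once
    # PROBE: re-access our data; an evicted set would be a "slow"
    # access, and the slow set's index leaks the secret.
    slow_sets = [s for s in range(N_SETS) if cache[s] != "attacker"]
    return slow_sets[0]

print(prime_and_probe(13))  # recovers 13, the victim's secret mod 16
```

The whole point of the real attack is that nothing here touches the secret directly; only the timing side effect of the victim's access reveals it.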

Sent using Tapatalk


----------



## The red spirit (Sep 29, 2015)

contradictionary said:


> Ahhh the infamous Am386 chip, my first taste ever of a CPU product, which left me with a lifetime impression.


To me such a thing was and always will be the AMD K8. It was an absolute monster for years, even the first dual-core architecture, and they are still alive and kicking if paired with a graphics card for acceleration. They last forever. That's the amazing "hammer" that will forever be a sign of a lasting, simple and almost perfect design. From what I read about AMD's 386, it was legendary, but I will never feel it, I guess.




contradictionary said:


> This kind of flashback brought back the memory of when AMD last sat on the performance throne, with their Athlon X2 lineup, for quite some time, until Chipzilla reclaimed it in 2006 with their new Core 2 architecture.


It wasn't only the Athlon X2, it was the Athlon 64 FX, Athlon 64, Semprons, Athlon XPs and even the original Athlons. AMD beat Intel for years then. It was an extremely successful K7 and K8 killing spree.





contradictionary said:


> Which brings to my attention that the most recently revealed vulnerabilities, in the form of Meltdown and Spectre (and derivatives), which have haunted Intel since the new year, reveal some of their secret recipe going way back to early Core 2, i.e. unprivileged access to the branch prediction unit. Disregarding standard security measures for the most possible performance, at all cost.
> 
> Security by obscurity, until someone found out that by cache priming and probing they can create a covert communication channel into the OS kernel, fully undetected.


Wow, that leaves a bad aftertaste on the Core 2 Duos and Quads, which were faster, but now it seems like they were desperate too.


----------



## contradictionary (Apr 1, 2018)

The red spirit said:


> It wasn't only the Athlon X2, it was the Athlon 64 FX, Athlon 64, Semprons, Athlon XPs and even the original Athlons. AMD beat Intel for years then. It was an extremely successful K7 and K8 killing spree.


Ah yes, my keyboard tricked me. I meant that the last ones crowned in early 2006 were the Athlon 64 X2. And the 64 FX. Right before the release of the Core 2 Duo and Core 2 Extreme.

Nehalem just sealed Chipzilla's clamp on the throne even further. But even then I ended up picking my X4 955 BE roud:

Sent using Tapatalk


----------



## The red spirit (Sep 29, 2015)

contradictionary said:


> Ah yes, my keyboard tricked me. I meant that the last ones crowned in early 2006 were the Athlon 64 X2. And the 64 FX. Right before the release of the Core 2 Duo and Core 2 Extreme.


I think it was 2005, but really that doesn't matter much. AMD had reigned ever since the K7 Athlon, and that's at least 3 years, if not 4 or 5. That's a very long time in the tech industry.




contradictionary said:


> Nehalem just sealed Chipzilla's clamp on the throne even further. But even then I ended up picking my X4 955 BE roud:


X4 955 BE was overclocker's dream, yet another very memorable CPU.


----------



## The red spirit (Sep 29, 2015)

Not much new to learn, but the media is talking about all of Intel's criminal behaviour.


----------



## The red spirit (Sep 29, 2015)

Finally a video about AMD.

Conclusion: they are a good company and they are liked by people, because their products are interesting and promising. Too bad AMD often makes bad decisions, does poor work on the software support side, or focuses on less important things. Therefore they are lovable fools who all too often shoot themselves in the foot, but sometimes they trade blows with giant tech companies too, when they magically get their priorities straight and do things right.

Potential is high, but stability is low. Bonus points for not being semi-criminal company like Intel is.


----------



## The red spirit (Sep 29, 2015)

Barely related comment:
AMD's GPU division tactics are often really scummy. Lots of broken promises, features that don't exactly work well, etc.

Examples:
Vega disaster
RX 560D mass robbery
Mantle API (what happened?)


I personally have problems with my RX 560. It doesn't sustain its base clock speed, it overheats in Furmark, it has random artifacting issues, unstable performance due to unstable clock speed, and power consumption way higher than advertised. AMD's drivers compared to Nvidia's are a pain in the butt. They aren't as stable. Plain and simple. Releases are also rarer. FineWine wasn't a myth, though. It is real, performance does increase, but I would rather have stable and functioning hardware the whole time.

I haven't had any of these problems when ATI was ATI. AMD ruined ATI. The fun thing is that some ATI leftovers can still be found in internal names or even in GPU-Z. I have an ATI graphics logo, but the card is identified as AMD. 

Forgot to mention that AMD's Crimson control panel is a huge mess. Meanwhile Catalyst looked old, but at least it was mostly straightforward and easy to use. The new control panel is nearly useless. A perfect case of trying to fix something good that doesn't need to be fixed, and it turns out to be shit.

The nVidia experience was mostly better. Issues weren't hidden and they soon got fixed. The card just aged, not too well, but I hold no such thought as "oh, it could still be competitive if not for those lazy driver writers".

I honestly feel robbed by AMD and their RX 560. Should have gone with the 1050.

Unrelated to this post:
AMD also lied about Bulldozer cores:
https://www.engadget.com/2015/11/07/amd-processor-core-class-action-lawsuit/

There was a lawsuit and AMD lost. So my FX 6300 is really a triple-core CPU with 6 threads.


----------



## contradictionary (Apr 1, 2018)

Yeah, the Radeon group performed so badly that Intel had to kidnap Raja Koduri from them. And Jim Keller too! Hahahaha

Sent using Tapatalk


----------



## The red spirit (Sep 29, 2015)

contradictionary said:


> Yeah, the Radeon group performed so badly that Intel had to kidnap Raja Koduri from them. And Jim Keller too! Hahahaha
> 
> Sent using Tapatalk


I'm not sure. Really. Now that I have calmed down, I think that manufacturers may acknowledge that specifications are rather useless, so instead the model number only means performing similarly and should define a tier. You see, the problem is that CPUs and graphics cards now have lots of clock speeds, or states if you want to be more technical. There are low-power states, turbo states, and there are so many of them. On a CPU you can mostly turn these off, but on a GPU you can't, at least not without BIOS flashing. Gigabyte's website says that my graphics card doesn't have a base clock speed; instead it has a "clock speed up to", therefore making it harder to tell if the graphics card is throttling or not. 

The problem I have is that the speed is always fluctuating, but this is kind of alright after further realizations. I tested my graphics card in the Unigine Heaven benchmark at stock settings and with raised power limit settings, custom preset, 1440p. Stock got 628 points, meanwhile with a more stable clock speed it got 657. There are gains, but they are pretty small. I tried to overclock my card too and reached 1350 core and 2000 memory clocks, and then I got 711. Still a small improvement and poor scaling. I have to increase clock speed much more to get much less of an increase in performance. So, I think manufacturers are trying to find the most efficient balance between clock speed and power consumption. I can only say maybe here.
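To put that scaling into numbers, here's a quick back-of-the-envelope calculation using just the three Heaven scores from this post (the clocks themselves aren't compared, since the card's real sustained stock clock is unknown):

```python
# Percentage gain of each tweaked run over the stock Heaven score.
def gain_pct(stock, tweaked):
    return round((tweaked - stock) / stock * 100, 1)

stock = 628        # stock settings
fixed_clock = 657  # raised power limit, steadier clock
overclocked = 711  # 1350 MHz core / 2000 MHz memory

print(gain_pct(stock, fixed_clock))  # steadier clock alone: 4.6 (%)
print(gain_pct(stock, overclocked))  # full overclock: 13.2 (%)
```

So even the full overclock buys only about a 13% higher score, which is why the effort hardly seems worth it.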

What is seriously infuriating is that in Furmark the graphics card dips to 980MHz, which really is throttling, even though the temperature was only around 72C. I tried to increase the power limit and it gave me higher clock speeds, in the 1100s, but the card got very hot, 85C, and then Furmark crashed. Honestly, I sort of wanted to melt it, so that I would have an excuse for warranty. Anyway, this sort of thing is totally not acceptable. No power saving shit should be so intrusive, and a card with a dual-fan design and one of the bigger heatsinks just shouldn't get that hot. It's seriously bad, but I have to remember that it only happens in a stress test like that, not in any real-life scenario. Still sucks to know that it can't sustain its maximum wattage. My nVidia GTX 650 Ti from Palit was the single-fan version and it was loud, but it worked fine at stock clock speeds; it didn't have a turbo state. It survives Furmark as well as Heaven, and can even handle overclocking.

Another thing is drivers. AMD drivers are inferior. I never ever had to use DDU with nVidia cards, but with ATI/AMD I have. They just don't uninstall properly. 

I tried some beta drivers on an APU and it was a huge train wreck. The drivers worked acceptably well, but the residue left behind was atrocious. After "uninstalling" (several times), there were like 3GB of useless files accumulated that could be safely removed.

I have a few words about the Crimson panel. It's nice to see that AMD cares about looks, but the Crimson panel is inferior to some Catalyst panels. The ATI X800 series had one of the best control panels I have ever seen, and the RX 560 has one of the worst. I can barely find anything and could barely configure anything. It's not straightforward and tends to get slightly sluggish when switching tabs over time. With nVidia I had a sluggishness problem which was never resolved. Minus points for both, as both the ATI X800 series card and the FX 5200 have lightning-fast panels. The GTX 650 Ti has a slightly less functional panel than the FX 5200, but only slightly. The RX 560's control panel is far inferior to the ATI X800 series'. The AMD A4's Catalyst was a garbage can too; it was functional, but so overloaded and messy. 

I have another problem with the RX 560. It sometimes artifacts on the lower half of the screen. A simple computer restart or monitor restart solves the issue. Not sure if it's hardware or software, but the GTX 650 Ti only had this for a short time, until one driver release. Also, when my Gigabyte motherboard started to die I saw the same thing. It makes me feel a bit uneasy, knowing that my PC may start to malfunction soon.

Still, there are some good things about the RX 560, or AMD graphics cards generally. FineWine is real. Basically, graphics card specifications still matter. More (shader) cores are better, as simple as that. The GTX 1050 may have won at launch, but now I see the RX 560 overtaking it in almost any game. Personally I saw my own RX 560 improve dramatically in GTA 5. I got maybe a 10 fps increase from 45 fps. That's a lot. The same happened with the R7 260X compared to the 750 Ti. Old Radeons like the 7970 and R9 290 still hold up well, because AMD can extract more performance with driver releases if the graphics card has a huge core count. With nVidia you just don't get that. AMD cards are an investment. 

The RX 560 would still make more sense than the GTX 1050 even if it were slower. The answer is the 1050's mere 2GB of VRAM; today that's very low. You may not be able to fit higher-quality textures, and as I concluded in the Athlonium 64 thread, textures are one of the most important settings in terms of visual quality. As people say, "RAM size doesn't matter, as long as you have enough". That's true for graphics cards too: if you run out of it, you get a performance hit, maybe some nasty stuttering or not completely rendered textures. That's precisely why my GTX 650 Ti aged so badly: it had only 1GB of VRAM. It couldn't even play Doom at 640x480 well; it just demanded too much VRAM. Meanwhile I frequently see GTX 650 Tis with 2GB of VRAM performing much better.

One of the main selling points of Radeon for me was better retro software support, and I didn't get that at all. Juiced didn't render textures well and had hiccups. 3DMark 2001 just refuses to run. NFS Porsche Unleashed doesn't work well. UT99 has lots of issues and for some reason worked best with the Metal API, which wasn't AMD's nor ATI's thing at all. On DirectX it's either way too dark, crashes, or I get low fps. All those tasks are executed much better on the GTX 650 Ti. Sure, AMD never deliberately advertised this as a feature, but the community said the words of wisdom. Too bad they only applied up to the R9 series; RX is a middle finger. The GTX 650 Ti does retro stuff better, as nVidia has 3dfx roots and some abilities were integrated, and if not, then the community made something like the Glide wrapper nGlide, which works on nVidia hardware.

I can say that the APUs I tried, the A4 6300 and A6 7400K, had somewhat fewer of those problems. The ATI X800 cards were totally amazing and made me fall for AMD. Little did I know that ATI and AMD have different philosophies and should never have been confused with each other. 

Meanwhile, all AMD GPUs have a software problem with the Crimson control panel. If you check for updates and click on update drivers, there's like a 60-70% chance that it will fail. That was the main reason why I had to use DDU so often. That's not acceptable.

I can say that I was disappointed with my experience. FineWine shouldn't be considered a feature by people; it literally shows that AMD sucks at driver optimization for their cards and always lags behind nVidia, just because their driver team is garbage. They lose many early benchmarks, which are very important for the early perception of a card. Even now that the RX 560 is faster than the GTX 1050, people still recommend the GTX 1050. Mostly because once they read it, it got stuck, and AMD successfully creates an inferior brand image.

I can say that one annoying thing exists outside AMD's control. It's Windows 10 with driver auto-install. It's complete garbage. In 99% of cases the drivers either work slowly, don't work correctly, or don't install control panels like Catalyst, etc. It's a huge problem I have with this "feature". It applies to almost any piece of hardware: motherboard, graphics card, etc. The days when driver installation had to be done manually are gone, but this is one of the worst solutions to that and creates more mess than there initially was.

I forgot to mention another thing: power consumption. I have the RX 560 without a 6-pin connector. It should eat 75 watts at most. The card is reported in GPU-Z as a 47-watt card. That was never the reality. It was almost always closer to 100 watts.

So here we have it: hot, poor software, high power consumption, promising hardware with lots of broken expectations. Looks like the classical Radeon illness. They even got as shady as Intel with their RX 560D controversy. Sure, we can excuse that specs are meaningless and that the model name only defines a tier. Too bad it failed so hard that it puts the card in a lower tier, which is misleading and could almost be called mass robbery, money swindling or scamming. I'm also still mad that they ditched Ruby from their graphics demos; those looked so good. They could have made her the official AMD heroine, sort of like a hero, but no, all good thoughts went to trash, because AMD is AMD. nVidia's demos were very often creepy AF, like faces, an ogre, some other creepy horror animations, ugh...

Here's a random demo:






One good thing that somehow still survived is ATI's perfect BIOS modding tools. AMD got them updated and they're still functional. That's nice to see, but it's nowhere near ATI's X800 days, when pencil mods, volt mods, aftermarket cooling solutions, lots of creative cooler designs, BIOS modding, and overclocking with dry ice or LN2 were very popular (considering extreme cooling, it was relatively very popular to overclock those cards like that). 2004 was the best AMD/ATI year. The Athlon 64 and ATI X800 series were some of the best things those companies ever made. The Athlon 64 was a dominator; the ATI X800 series was a modder's dream. Forgot to mention unlocking pipelines to turn your model into a more expensive one, as well as the sometimes still present VIVO functionality. There ain't nothing like that nowadays. Ryzen isn't a clear dominator like the Athlon 64 was, and Radeons now are just seriously un-Radeony. The R9 290X and R9 Fury X were the last true 'muscle' class GPUs. Those didn't care about "if we lower clock speeds we could save power"; they were literally "fuck that" and tried to push as many cores as possible. They ran hot, they ate lots of power, but they also offered lots of power and even now still do. The R9 290X specifically was AMD's last card that was awesome and just felt like AMD or ATI. A hugely inefficient beast.


----------



## Lucan1010 (Jul 23, 2018)

Intel is generally going to outperform current generation AMD (though AMD is more cost effective), but AMD has really caught up and started to bridge the gap between the two companies. Nowadays it really comes down to preference and your specific budget/needs.


----------



## The red spirit (Sep 29, 2015)

Lucan1010 said:


> Intel is generally going to outperform current generation AMD (though AMD is more cost effective), but AMD has really caught up and started to bridge the gap between the two companies.


What? How so?

Both companies at this moment are selling CPUs that perform very close to each other, but then there's no single way to define that performance. In single-threaded tasks Intel wins, but AMD has many more threads, which are just slightly slower than Intel's. In terms of performance per watt, which some reviewers call an important measure, AMD has a slight lead with their 65-watt CPUs.

Right now AMD Ryzen isn't very cost effective. The 1200 doesn't make sense at all; it costs too much, almost 90 euros. As a budget part it's not passable. On the low end the Pentium G5400 is dominating at a price of around 50 euros. The best value from AMD is only the 1600, 1600X and 1700. Everything else is relatively poor value.

On Intel's side the Pentium G5400 makes the most sense for budget buyers. Then we have a problem: i3s are generally just not worth it. i5s are where it's at, more precisely the i5 8400. Then i7s are mostly poor value.

Anyway, both companies mostly offer poor value right now, but here's one thing. At least in my country, older AMD parts like FX CPUs and some FM2+ or FM2 APUs are dirt cheap, and knowing that DDR3 is also almost twice as cheap, those parts still make sense for budget builders as they have great value.
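"Value" here can be made concrete as euros per unit of benchmark performance. A minimal sketch using the two prices quoted above; the benchmark scores are made-up placeholders purely to show the calculation, not real results:

```python
# Value = euros per benchmark point (lower is better).
# Prices are the ones quoted above; scores are hypothetical placeholders.
cpus = {
    "Ryzen 3 1200":  {"price_eur": 90, "score": 580},  # score is made up
    "Pentium G5400": {"price_eur": 50, "score": 410},  # score is made up
}

for name, c in cpus.items():
    eur_per_point = c["price_eur"] / c["score"]
    print(f"{name}: {eur_per_point:.3f} EUR per point")
```

With real benchmark numbers plugged in, this kind of metric is what "the G5400 dominates on the low end" boils down to.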

"Intel is going to outperform current generation AMD" is just a pure blanket statement. Open your eyes and look: Intel may have faster individual threads, while AMD's are only slightly slower. That means CPUs with the same core and thread count will perform really closely. But here's the key: AMD just gives you more cores and more threads. They are literally offering the most processing power to home users, and on the enthusiast side their Threadrippers are literally ripping Intel apart, mostly due to more cores. Intel's enthusiast-grade CPUs often have a lower clock speed than a home-user i7, for example, while AMD offers lots of cores without such compromises. There are many very similarly performing chips now, but in the purely high-end market Intel is the worse choice, unless not all threads are used; then Intel may be slightly ahead. But we are talking about the high end here, and we have to think about everything that comes with the CPU too. AMD packs a good cooler with some hefty thermal headroom left. Intel is skimping on cooling, and the result is way too thin a heatsink, which can't do the job properly. So higher-end CPUs with those coolers may not sustain their maximum base clock speed under high load (when it is needed most), and we aren't even talking about turbo clock speeds here. Then Intel's theoretical speed advantage is ruined by reality. It's impossible to cool a very hot small spot without either increasing the radiating area or dramatically increasing airflow (which results in unbearable noise). And if you buy a K-series CPU, which is poor value by itself, you get no cooler at all, meaning you will have to invest in cooling separately.

Cooling has been Intel's problem for almost decade:





(Context: back then lots of aftermarket cooling products were really low quality, didn't provide much surface area, had poor installation mechanisms, were sometimes unfinished, and spilt blood due to really sharp corners. It was rare to actually see a decent aftermarket solution. So that's why they still recommended not changing the cooler even if the stock one was obviously not enough. Anyway, the motherboard recommendation is a load of bull; those were in no way outstanding. Back then Intel's Pentium 4 was a huge failure, truly inferior to AMD's Athlon 64, so if you were an enthusiast back then, you should have gotten AMD and not even bothered with Intel.)

If we compare some side features like overclocking, without caring about value, then Intel makes capable overclocking chips. It's possible to reach 5GHz on air. AMD's Zen CPUs are horrible at overclocking; pretty much, if you are not a hardcore enthusiast ready to shell out money for a phase-change cooler, it's just better not to invest in CPU cooling and either be fine with stock clock speeds or just push the CPU slightly with the stock cooler. This is the one thing where AMD's Zen is seriously inferior to AMD's FX series and AMD's APUs up to the 7000 series. The 7000 series APUs are terrible overclockers too, not to mention how hot they run. So Intel can make sense for an enthusiast who is ready to overclock his computer and wants the latest parts.

Really, on the high end it makes more sense to get AMD, even more so if we think about value. If I had to make a choice now for myself, I would get a K-series i5 and clock it to 5GHz, but I would invest in decent cooling, knowing that AMD will outlast it. I think it should be more fun to have Intel for my personal needs. My choice of cooling would be an air cooler in a case with good airflow.



Lucan1010 said:


> Nowadays it really comes down to preference and your specific budget/needs.


Only sometimes.


----------



## Lucan1010 (Jul 23, 2018)

The red spirit said:


> What? How so?
> 
> Both companies at this moment are selling CPUs that perform very close to each other, but then there's no single way to define that performance. In single-threaded tasks Intel wins, but AMD has many more threads, which are just slightly slower than Intel's. In terms of performance per watt, which some reviewers call an important measure, AMD has a slight lead with their 65-watt CPUs.
> 
> ...


I did say "generally" outperforms, and I will specify that the latest Intel model USUALLY (but not always) outperforms AMD on most tasks. However, the difference is becoming more and more negligible as time goes on. The exact model and brand a user should get depends on their budget and needs; they should get the best processor they can for their money, and there are a few rare situations where a specific brand works better for a specific task (this is a GPU example, so it's slightly unrelated: AMD renders HDR better, NVIDIA renders textures better). I personally prefer Intel, but AMD also makes great products and some of the best computers I've used had AMD.


----------



## The red spirit (Sep 29, 2015)

Lucan1010 said:


> I did say "generally" outperforms, and I will specify that the latest Intel model USUALLY (but not always) outperforms AMD on most tasks. However, the difference is becoming more and more negligible as time goes on.


That "usually" is starting to lose meaning. In recent years AMD has been making Zen CPUs and now plans to increase core and thread counts further. For some reason the media is hyped to see those new monstrosities. In reality many tasks scale better with good single-threaded performance than with more cores, but as I see it now, developers will soon have very little choice and will have to optimize software for lots of cores. Right now AMD is reigning here and is ambitious to push boundaries as far as they can, while Intel is losing relevance with their scandals, overpriced and stupid products, and, as we can see at the moment, a struggle to release anything new.

I think your belief is more historical, from when FX was a very questionable competitor to the Core i series, or when Phenom was just good enough to compete with Intel's stuff. That's a very short time frame, to be honest. AMD totally destroyed Intel in the Athlon, Athlon XP, Athlon 64 (FX) and Athlon 64 X2 days. That's like 5 years in total, and it was a lot. Even earlier, when AMD could make socket-compatible processors, Intel would release something and AMD would release a faster and cheaper version. Hell, even the AMD 486 was almost equal to Intel's Pentium. So saying that there is "something that generally outperforms" is at best completely subjective and a very narrow belief.

In the pre-2000 years (until '95, I guess) AMD was almost expected to release CPUs faster than Intel's. In 2001-2005 AMD was untouchable, as Pentium 4 and Pentium D failed horribly while AMD ran cooler, faster and cheaper at way lower clock speeds (better IPC). From 2005 all the way up to AMD's Zen era, Intel was the better choice. Maybe Phenoms were competitive, but FXs totally weren't. If we think about the majority of tasks, then Intel's single-thread speed advantage can often be seen, but then again very demanding tasks are often optimized to use lots of threads, and AMD shines there.
Even in single-threaded tasks AMD isn't very far from Intel. There's no such thing as "generally" outperforming, unless we are talking about short time periods.
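The scaling argument above is basically Amdahl's law: if any fraction of a task is inherently serial, adding cores runs into a wall fast, which is why single-threaded speed keeps mattering. A minimal sketch; the 10% serial fraction is just an assumed illustrative number, not from any benchmark:

```python
# Amdahl's law: overall speedup from N cores when part of the work is serial.
def amdahl_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assume 10% of the work can't be parallelized.
for cores in (1, 2, 4, 8, 16, 64):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
# With a 10% serial fraction, even 64 cores give under a 9x speedup.
```

The takeaway matches the post: past a certain core count, only better per-core (single-threaded) performance moves the needle, unless developers shrink the serial part of their software.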




Lucan1010 said:


> The exact model and brand a user should get depends on their budget and needs--they should get the best processor they can for their money, and there are a few rare situations where a specific brand works better for a specific task (this is a GPU example, so it's slightly unrelated: AMD renders HDR better, NVIDIA renders textures better).


Whoa, wait. How does nVidia render textures better? That doesn't make sense, as a texture is a texture: it either gets rendered or it doesn't, and it's going to look exactly the same on any hardware (I'm not talking about ancient AGP stuff). If you meant texture filtering, it may be possible that nVidia offers more filtering options, but I never saw it. I have a GTX 650 Ti and an RX 560 and haven't noticed any difference in textures or their settings. Even 16X filtering makes very little difference over 8X, almost indistinguishable, and anything beyond has no benefit. Maybe you meant better compression, but compression doesn't affect quality, or affects it negatively if it's lossy. Compression only helps slightly increase effective memory bandwidth, and even then it's not a lot. Maybe there was once a time when nVidia offered more TMUs, but that time is long gone. Also, AMD has HBM, which is much faster than anything GDDR, and it may help with everything, including textures.

I dunno about HDR and honestly don't really care, as it brings very little difference. Resolution and textures (filtering too) are the most important; all other shading, lighting and even AA come second. This is what I learned while experimenting with an FX 5200 and an ATI X800 XT PE (I was honestly disappointed to see the ATI not bring much visual difference over the lowest of the low FX 5200, though I happened to have the fastest model of it). Those settings make the most perceivable difference, but I want to add polygon count, nowadays known as geometry or tessellation, too. In many modern games it becomes really visible. I run Far Cry 3: Blood Dragon at 1440p on low settings, except textures left at the highest setting. The RX 560 can push around 60-70 fps for a buttery smooth experience. I would like some AA and more geometry, but in an action-heavy game I would rather have more fps, because in fast-paced games low fps can ruin the experience much more than, let's say, flat or blocky rocks, walls and trees.

Meanwhile, in games like Civilization I wouldn't mind 30-40 fps if I could have more AA and overall better visuals. Then again, those games aren't very graphically demanding, so it's easier to run them on higher settings. They need a faster CPU, or rather faster single-threaded performance, given that there are enough threads.




Lucan1010 said:


> I personally prefer Intel, but AMD also makes great products, and some of the best computers I've used had AMD.


I never had an Intel computer myself, but I have used others'. My grandpa's laptop has a Pentium P6200 and, despite being like 8 years old, it's still surprisingly fast. The integrated graphics are worse than horrible, though: they can only run NFS Underground at 640x480 and really low settings. That's almost as bad as the integrated nVidia GeForce 6150 Go that I tried, and that's from 2004. An FX 5200 would probably outperform both, despite being from 2003, not supporting many modern features and having abysmal specifications overall. In my previous school I used some Pentium 4 era Intel running at 3.06GHz. It was really slow; I think it was a late Pentium 4 without HT, as Celerons probably didn't have such a model. It was worse than an Athlon 64 3200+. In my current school (now finished) I had a first-series Pentium G CPU; I can't remember the full model, but it was a Gx40, a dual core. It felt snappy, rather fast and overall good. Then I used an i5 7400 and it blew my mind. It was literally the fastest thing I had ever used: in a web browser, page rendering was literally instantaneous. It was the first thing that was faster than my overclocked FX 6300. Then my dad bought a laptop with an i7 7700HQ, which is supposedly faster than my FX 6300. It felt slow, really slow. The Intel graphics in it are totally atrocious as always, but it has a GTX 1050 Ti in there. Too bad the graphics switching is sluggish and not enjoyable at all.

Now I will talk about the AMDs, which are all mine. The Athlon 64 3200+ was rather fast in its prime and it's still kicking even now. The Athlon 64 3400+, which has 200 MHz more but less cache, feels the same as the 3200+. Not really snappy now, but still usable. The FX 6300 was faster with Windows 7 and 8.1; for some reason it's very sluggish with 10, but it works and is still a multitasking monster (still, it should have been called a triple-core CPU with 6 threads instead of a hexacore). The AMD A4 6300, as I bought it, wasn't very fast but could handle lots of tasks well. Then I upgraded it to an A6 7400K and it felt much faster, yet real performance was very similar, and it's a furnace. My other grandpa upgraded his old laptop to a modern one and I inherited the old one. It had a mobile AMD Sempron 64 3400+, a 1.8GHz single-core CPU, which was surprisingly still really snappy. I upgraded it to a mobile AMD Turion 64 X2 TL-60, a dual core running at 2GHz, and it made very little perceivable difference compared to that Sempron, which was very surprising to me. It honestly didn't feel snappy.

I never owned an Intel myself, but I have used my fair share of them. Since I only had AMDs, I had to troubleshoot their problems and curse only at their faults, never at Intel's. Actually, it was usually the motherboards malfunctioning while the CPUs worked fine. Because of motherboards I had lots of bad experiences, which makes me want to associate AMD with negative things, but I know the CPUs have nothing to do with those faults. I have never seen an Intel-based PC stop working or malfunction, but that's down to never having owned one and never fully exploring them. I know that they fail too, and that's no more unusual than a failure on an AMD-based system.

I honestly wouldn't care what the best thing you have used was in terms of brand, as it doesn't matter. What matters is picking the best available in the current era. Brand loyalty doesn't make much sense, unless some brand has terrible management or other atrocious issues, at which point raw performance becomes less meaningful. Intel does have a rather long criminal record and it's nothing good. AMD doesn't seem to have it that bad, but their CPU division made some seriously stupid choices in the past, and AMD's graphics division is just as bad as Intel in terms of honesty. So there are no saints or truly evil corporations, only beliefs and biases. I would now just go with whatever is faster and better overall. Both companies have done many things wrong, some crimes, etc. I would say it's a tie.

I personally know what I want and know my needs rather well. For sure I would invest in good air cooling and lots of fans, because I'm an enthusiast and absolutely hate systems that run really hot. I'm also concerned about VRMs, which the AMD FX era taught me to take seriously instead of just being happy with what I had. I know that I like to overclock and am no fan of locked systems, except in those cases where things just work better locked. I know I will push the CPU to the max once, which is probably 5GHz, as that number is magical. Then I will soon settle on a mild overclock for the rest of the time, or, if I'm happy with the thermals, just keep it at 5GHz. I also don't need lots of cores; 6 fast cores are totally fine for me. That makes the i5 8600K the best choice for me. Ryzens don't deliver great overclockability and I don't really want AMD. I'm still not happy about AMD choosing such a stupid name for their CPUs and killing Athlon entirely instead of keeping it as a low-budget line of CPUs.

I get the whole brand game and the love for the underdog, but it all disappears for me when I have to deal with sluggish crap. So "always pick the best available regardless of the brand" is the best rule I've come up with. I really hate when people start to totally hate brands just for being brands. To me they are both equally capable brands with some future.


----------



## Lucan1010 (Jul 23, 2018)

The red spirit said:


> That "usually" is starting to lose meaning. In recent years AMD has been making Zen CPUs and now plans to increase core and thread counts further. For some reason the media is hyped to see those new monstrosities. In reality many tasks scale better with good single-threaded performance than with more cores, but as I see it now, developers will soon have very little choice and will have to optimize software for lots of cores.
> 
> 
> 
> ...


I don't really have a brand preference when it comes to hardware, and I agree with "pick the best despite the brand". Here's what I meant about HDR: https://segmentnext.com/2018/07/23/nvidia-gpu-performance-suffer-hdr/ HDR is nice for watching movies. I don't personally own a 4K screen, so I don't know how it affects gaming.


----------



## contradictionary (Apr 1, 2018)

Heh, freesync and the new freesync2 ftw!

https://www.techradar.com/news/nvid...c-2-the-race-for-high-dynamic-range-pc-gaming

Sent using Tapatalk


----------



## The red spirit (Sep 29, 2015)

Lucan1010 said:


> I don't really have brand preference when it comes to hardware, and I agree with "pick the best despite the brand".


I sensed some real preference for Intel here. I see no problem with having preferences, but they had better stay at the preference level instead of turning into "facts". It's fine to be informed and have preferences, but as I see on the net, many people have very little knowledge about hardware in general, and there exists a strong belief that Intel is better. Their arguments are sometimes totally irrational, like "Intel CPUs will last longer" or "Intel CPUs are more stable". Both are ridiculous: longevity can't be predicted well and there is very little research about CPUs lasting longer, so at best this is a myth, at worst a delusion. Intel being more stable is complete nonsense too. Sure, there were periods in the 80s and 90s when Intel was the first player and AMD had to make CPUs for the same sockets and chipsets; Intel had the advantage of being in the lead, and there truly were problems with AMD sometimes being less compatible, but nowadays that's very far from the truth. Sometimes, if you wander into the wrong crowd of Intel fanboys, you get the response "Enjoy your AMD trash". I'm sure AMD fanboyism exists too, but the quantity of AMD fanboys is much lower in general.




Lucan1010 said:


> And here's what I meant about HDR: https://segmentnext.com/2018/07/23/nvidia-gpu-performance-suffer-hdr/


10% is a small difference, really. And I don't like that test at all: only one game, only one resolution. It looks to me like some part of the GPU was already close to its limit in SDR, and since HDR is slightly more demanding, there was a lack of execution units or something like that. For some reason this doesn't apply to the Radeon card. Maybe nVidia lacks part of the hardware altogether while AMD has it? I dunno. But really, that article looks to me like it was written more for flaming purposes than to show anything worthwhile.

Oh shit, sorry. I didn't see that there are other pics.

Well, I see. I still think it's a lack of hardware or some other hardware-related deficiency. Then it's a valid point that HDR is worse on nVidia. I would still like to see more cards tested, maybe older models and at lower resolutions. Besides that, I remember Linus's Vega 64 review. It honestly sucked, but now I see AMD's classic FineWine kicking in: Vega 64 is now generally faster than the GTX 1080. My RX 560 started to perform better in GTA 5 after some updates. I got around a 10 fps increase. That's a lot, considering I played at 1440p on medium-high settings with a rather unstable 45 fps; now it drops much less and on average feels much higher.

Anyway, all I can say is that HDR is still a mostly useless feature. I tried to look at comparisons, and at best there is a slight difference in shades; overall it's probably barely noticeable to many people. I remember there were other issues with it on some monitors, so it's a problem not only at the graphics card level. I'm not sure if my monitor supports it; I have a BenQ BL2420PT, which is an entry-level professional monitor: 1440p, 24", 60Hz, IPS, 5ms, 100% sRGB. Everything looks great on it and I have no problems with colors. Knowing how many people have almost zero knowledge of how to calibrate TVs or monitors, I think HDR may become more of a masking tool for their lack of knowledge than anything worthwhile. I honestly paid a lot for this BenQ, but I have the privilege of seeing things as accurately as cameras could capture them. Sure, there are much better monitors than this one, but at this level it's decent. For the same price I could have gotten 1440p, 240Hz, TN, 1ms, but after a TN I didn't want another TN. Viewing angles were crucial to me.
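On the "slight difference in shades" point, the raw numbers back that up for most content: SDR is typically 8 bits per color channel while HDR10 uses 10 bits, so the gain is in the number of distinct steps per channel, which mostly shows up in smooth gradients and very bright highlights rather than everyday scenes. A quick sketch of the arithmetic:

```python
# Distinct levels per color channel at each bit depth.
sdr_levels = 2 ** 8    # 256 levels (typical 8-bit SDR)
hdr_levels = 2 ** 10   # 1024 levels (10-bit HDR10)

# HDR10 has 4x the steps per channel, so gradient banding
# shrinks, but flat mid-tone areas look nearly identical.
print(sdr_levels, hdr_levels, hdr_levels // sdr_levels)  # -> 256 1024 4
```

This is only the bit-depth half of HDR; the wider brightness range and color gamut also depend on the panel, which is why calibration and monitor quality matter as much as the signal.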




Lucan1010 said:


> HDR is nice for watching movies. I don't personally own a 4K screen so I don't know how it effects gaming.


I can tell you the difference between 1080p and 1440p. Games will look slightly clearer. Photos can look much more lifelike and more detailed. In movies you should see more detail. That's it. 1440p is somewhat more taxing on the graphics card than 1080p, but the difference isn't huge. 1080p is still decent.
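To put rough numbers on the "somewhat more taxing" part, the raw pixel counts tell most of the story (back-of-the-envelope only; real GPU load also depends on shader cost per pixel):

```python
# Pixels rendered per frame at each resolution.
p1080 = 1920 * 1080   # 2,073,600 pixels
p1440 = 2560 * 1440   # 3,686,400 pixels

# 1440p pushes roughly 1.78x the pixels of 1080p per frame.
print(round(p1440 / p1080, 2))  # -> 1.78
```

So the GPU does under twice the pixel work, which lines up with 1440p being noticeably but not brutally heavier than 1080p.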


----------



## Lucan1010 (Jul 23, 2018)

The red spirit said:


> I sensed some real preference for Intel here. I see no problem with having preferences, but they had better stay at the preference level instead of turning into "facts". It's fine to be informed and have preferences, but as I see on the net, many people have very little knowledge about hardware in general, and there exists a strong belief that Intel is better.
> 
> 
> 
> ...


I only have a 1080p screen (albeit a very nice one with high color accuracy), but I can notice a difference with HDR on 4K. The colors really pop and it adds a lot of depth. Whether you want it is a matter of opinion though; not everyone does. One of my friends only buys movies and TV shows in standard definition because they claim they can't notice the difference between that and HD (then again, they have kinda bad eyesight and wear glasses, so that's probably why).


----------



## The red spirit (Sep 29, 2015)

Lucan1010 said:


> I only have a 1080p screen (albeit a very nice one with high color accuracy), but I can notice a difference with HDR on 4K. The colors really pop and it adds a lot of depth. Whether you want it is a matter of opinion though; not everyone does. One of my friends only buys movies and TV shows in standard definition because they claim they can't notice the difference between that and HD (then again, they have kinda bad eyesight and wear glasses, so that's probably why).


If things pop, it could just be more contrast.

I personally looked back at some DVD movies a few weeks ago and the quality is good. Not perfect, but good. Sure, Blu-ray is better, but if a Blu-ray movie costs several times more than the DVD, it barely makes sense to buy it.

It's almost always better to have an inferior product than not to have it at all.


----------



## Lucan1010 (Jul 23, 2018)

The red spirit said:


> If things pop, it could just be more contrast.
> 
> I personally looked back at some DVD movies a few weeks ago and the quality is good. Not perfect, but good. Sure, Blu-ray is better, but if a Blu-ray movie costs several times more than the DVD, it barely makes sense to buy it.
> 
> It's almost always better to have an inferior product than not to have it at all.


Fair enough. DVD is how I end up watching most movies even though I have a Blu-ray player. Blu-rays have gone down in price though: a new movie on Blu-ray used to cost around $30, and now it's rare to see one above $20. I don't usually rent or buy movies though; most of the time I either see them in theaters or borrow them from the local library.


----------



## The red spirit (Sep 29, 2015)

Lucan1010 said:


> Fair enough. DVD is how I end up watching most movies even though I have a Blu-ray player. Blu-rays have gone down in price though: a new movie on Blu-ray used to cost around $30, and now it's rare to see one above $20. I don't usually rent or buy movies though; most of the time I either see them in theaters or borrow them from the local library.


I just use The Pirate Bay, totally free for me.


----------

