# The future of the internet



## Electra (Oct 24, 2014)

What do you think will be the future of the internet? (Earth moving closer to the sun, solar outbursts, moving to other planets, etc.)


----------



## Grandmaster Yoda (Jan 18, 2014)

What do you mean? We're going to get close to the sun and everything will melt?

The Internet is going to be more widespread than ever. We already have it on our refrigerators. Communications will become more unified as we continue to switch our older technologies to the Internet.

We also might get cancer and global warming from all of the wifi and radios.


----------



## Electra (Oct 24, 2014)

Well, according to some theory at least, the earth is supposed to melt after some time. Then there are the solar outbursts, which do things with electromagnetism. And if we move to another planet, I have no idea how it would work at all when it comes to the cables and servers, etc.


----------



## Grandmaster Yoda (Jan 18, 2014)

Blizzard said:


> Well, according to some theory at least, the earth is supposed to melt after some time. Then there are the solar outbursts, which do things with electromagnetism. And if we move to another planet, I have no idea how it would work at all when it comes to the cables and servers, etc.


Electromagnetic interference is a big problem. On Earth, when it is quite severe, we use shielded twisted pair cables that block out the interference; I believe nuclear power plants use those. Fiber optics, which rely on just LEDs or lasers to operate, are virtually immune to EMI as we know it. Our computers aren't completely light-operated like some of our cables, though. I have a feeling that by the time we move to a different planet, we will have made fiber optics cheaper and more capable than ever, even though they are already much better than what we normally use in local environments. To my understanding, we are at the tip of the iceberg with a lot of technologies, even though they have been around for decades. There is still a lot of progress to be made.


----------



## HAL (May 10, 2014)

Blizzard said:


> Well, according to some theory at least, the earth is supposed to melt after some time. Then there are the solar outbursts, which do things with electromagnetism. And if we move to another planet, I have no idea how it would work at all when it comes to the cables and servers, etc.


lolwut

Earth won't melt; it will be engulfed by the sun as the sun expands into a red giant near the end of its life, when its fuel runs out.

But that's so far in the future, about 5 billion years from now, that the planet will probably have gone through several new eras of species dominance (first dinosaurs, second mammals, then third, fourth, etc. being who knows what). Human society will be long gone.

In the event that humans actually are able to find a place and plonk themselves there instead, 'the internet' would still be the same as it is now. Do you mean people from the new colony might want to get information from the old Earth internet? Well, that's a question of transferring data, which would need hard drives, not cables and servers.


----------



## Electra (Oct 24, 2014)

HAL said:


> lolwut
> 
> Earth won't melt, it will be engulfed by the sun as it expands due to its ever-reducing density as its fuel burns out.
> 
> ...


You have an interesting answer, but you need to think more optimistically (while staying realistic).
I don't know that much about how transferring the internet works, so thanks for sharing!
I assume we will need the internet for a long time, and if people live on different planets, I believe we will still want to communicate with each other! Much of our history and knowledge is online, and the internet is a powerful tool.


----------



## HAL (May 10, 2014)

Blizzard said:


> You have an interesting answer, but you need to think more optimistically (while staying realistic).
> I don't know that much about how transferring the internet works, so thanks for sharing!
> I assume we will need the internet for a long time, and if people live on different planets, I believe we will still want to communicate with each other! Much of our history and knowledge is online, and the internet is a powerful tool.


I guess it would be possible to use some kind of deep space wifi system. You'd have to ask an electronic engineer about the complexities of it though!


----------



## zynthaxx (Aug 12, 2009)

Blizzard said:


> You have an interesting answer, but you need to think more optimistically (while staying realistic).
> I don't know that much about how transferring the internet works, so thanks for sharing!
> I assume we will need the internet for a long time, and if people live on different planets, I believe we will still want to communicate with each other! Much of our history and knowledge is online, and the internet is a powerful tool.


The issue is one of practicality. Already from Mars, the signal delay between your computer and a computer on Earth would be between 3 and 21 minutes, depending on where in their orbits the planets were located at the time. Then the response would take the same time back. Add to that the issue of bandwidth: How much data could you transfer per second across such a link?
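As a rough sanity check (a minimal sketch; the distance figures are assumed round numbers), the one-way delay is just distance divided by the speed of light:

```python
C = 299_792_458  # speed of light, m/s

def one_way_delay_minutes(distance_km: float) -> float:
    # Light-travel time for a radio signal over the given distance.
    return distance_km * 1000 / C / 60

# Approximate Earth-Mars distances (assumed figures):
closest_km = 54.6e6   # closest approach
farthest_km = 401e6   # planets on opposite sides of the Sun

print(f"closest:  {one_way_delay_minutes(closest_km):.1f} min")   # roughly 3 min
print(f"farthest: {one_way_delay_minutes(farthest_km):.1f} min")  # roughly 22 min
```

Double it for a round trip, and that's before any bandwidth or error-correction concerns.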

In that sense HAL's response is extremely valid: You'd need to pack hard drives containing the information you need to have available on another planet and bring them along; creating a new, separate Internet on the other planet. Communication between the two Internets would be a trickle in comparison to what's possible between computers with mere thousands of kilometers of ground-based infrastructure between them.


----------



## Electra (Oct 24, 2014)

zynthaxx said:


> The issue is one of practicality. Already from Mars, the signal delay between your computer and a computer on Earth would be between 3 and 21 minutes, depending on where in their orbits the planets were located at the time. Then the response would take the same time back. Add to that the issue of bandwidth: How much data could you transfer per second across such a link?
> 
> In that sense HAL's response is extremely valid: You'd need to pack hard drives containing the information you need to have available on another planet and bring them along; creating a new, separate Internet on the other planet. Communication between the two Internets would be a trickle in comparison to what's possible between computers with mere thousands of kilometers of ground-based infrastructure between them.


Interesting. So if aliens who were able to communicate like us were to contact us, maybe they would not be _that likely_ to primarily use our internet unless they landed on one of these planets, Earth, first...? Idk


----------



## zynthaxx (Aug 12, 2009)

Blizzard said:


> Interesting. So if aliens who were able to communicate like us were to contact us, maybe they would not be _that likely_ to primarily use our internet unless they landed on one of these planets, Earth, first...? Idk


There's a difference between someone emitting signals indicating conscious intent ("There's intelligent life here at the point in time when this message is being sent") and feasible two-way communication. To give another example, Voyager 1 has barely left our Solar system, and rather than a signal requiring twentyish minutes to arrive (worst-case from Mars), the signal requires over nineteen and a half HOURS for a one-way trip. Say hello, and you'll get a reply almost two days later - and far longer if you have anything interesting to say.
And guess what: That "light year" way of calculating distance to other stars? That's also the way to calculate the fastest one-way communication time to them. So if you forgot your hard drive and want to browse Earth porn from Proxima Centauri, you'd better practice that patience...
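The same distance-over-c arithmetic scales up (a sketch; the Voyager 1 distance is an assumed figure of about 141 AU, roughly right for the time of writing):

```python
C = 299_792_458          # speed of light, m/s
AU = 1.495978707e11      # metres per astronomical unit
LIGHT_YEAR = 9.4607e15   # metres per light year

def one_way_hours(distance_m: float) -> float:
    # Light-travel time in hours.
    return distance_m / C / 3600

voyager1_m = 141 * AU           # assumed Voyager 1 distance (~141 AU)
proxima_m = 4.24 * LIGHT_YEAR   # Proxima Centauri, about 4.24 light years

print(f"Voyager 1: {one_way_hours(voyager1_m):.1f} hours one way")               # ~19.5 hours
print(f"Proxima:   {one_way_hours(proxima_m) / 24 / 365.25:.2f} years one way")  # ~4.24 years
```

By construction a star N light years away is N years of one-way signal time, which is the point about Proxima Centauri.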


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> The Internet is going to be more widespread than ever.


We likely don't want that. We already have problems with NEETs, weebs, and an increasingly antisocial society.



Grandmaster Yoda said:


> We already have it on our refrigerators.


Yeah right, my 18-year-old Daewoo probably has it too.



Grandmaster Yoda said:


> Communications will become more unified as we continue to switch our older technologies to the Internet.


Aren't they unified already?


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> We likely don't want that. We already have problems with NEETs, weebs, and an increasingly antisocial society.
> 
> 
> Yeah right, my 18 year old Daewoo probably has it too.
> ...


Unified Communications refers to putting "realtime services" and "non-realtime services" onto the same network infrastructure - for example, voice-over-IP phones and IP teleconferencing. We certainly are not completely unified already, and I don't know if that is necessarily a giant vision for us, but it is cheaper in some sense. There is also some benefit to keeping networks separate: a company could have a dial-up connection to its equipment so that if its internet access fails, it can still dial in and fix things over telephone wiring.

It's cheaper to make a long-distance phone call over the Internet than over the regular telephone system. My school just recently switched to mostly using VoIP phones. There are still regular telephone jacks in our rooms for some reason, not that anyone uses them. Obviously in a lot of houses people aren't using VoIP phones; we still have a landline here. I know the US is the most corrupt and backward country when it comes to this stuff though.

A lot of people are getting video services from Netflix and such now instead of using cable. The discussion is really more relevant for businesses, since people at home don't necessarily "maintain networks" in that sense. There is still backward compatibility and conversion between Internet phones and regular phones, so it's not like we are completely unified.

I'm pretty sure IoT is expanding. I doubt this applies to every country in the universe, because richer countries have more money to work with. But as I said before, the US is pretty garbage because of a lack of investment. Imagine getting a notification that your toast is ready. That sounds kind of dumb, but it seems we are moving in that direction.


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> Unified Communications refers to putting "realtime services" and "non-realtime services" onto the same network infrastructure. For example, having voice over IP phones and IP teleconferencing. We certainly are not already completely unified and I don't know if that is necessarily a giant vision for us, but is cheaper in some sense. There is also some benefit to the separation of networks. A company could have a dial-up connection going to their equipment so that if their internet access fails, they can still dial in and fix things using telephone wiring.


So that's what you meant; I thought you meant all those smart devices.



Grandmaster Yoda said:


> It's cheaper to make a phone call over the Internet than it is to make a phone call over the regular telephone system in terms of long distance. My school just recently switched to mostly use of VoIP phones. There are still regular telephone jacks in our rooms for some reason, not like anyone uses them. Obviously in a lot of houses people aren't using VoIP phones. We still have landline here. I know the US is the most corrupt and backward country when it comes to this stuff though.


US as most corrupt and backwards? Haha XD no. Cute attempt tho.




Grandmaster Yoda said:


> A lot of people are getting video services from Netflix and such now instead of using cable. The discussion is really more relevant for businesses since people at homes don't necessarily "maintain networks" in that sense. There is still backward compatibility and conversions between Internet Phones and Regular Phones so it's not like we are completely unified.


OK then, but you seem to be touching on stuff that's irrelevant for many people instead of something more relevant. I bet more people are interested in Netflix than in your suggested internet phones.




Grandmaster Yoda said:


> I'm pretty sure IoT is expanding. I doubt this applies to every country in the universe because richer countries have more money to work with. But as I said before the US is pretty garbage because of a lack of investment. Imagine getting a notification that your toast is ready. That sounds kind of dumb, but we are moving in that direction it seems.


That doesn't seem like it, but it is dumb. I have never seen tech like that in my country at all. The US is likely one of the leading countries in tech, and your complaining looks really meh to someone who lives in a way less advanced country. I couldn't even find an RX 560 4GB model in like 8 shops.


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> So that's what you meant, I thought you mean all those smart devices.
> 
> 
> US as most corrupt and backwards? Haha XD no. Cute attempt tho.
> ...


In terms of network infrastructure like fiber optics and cabling, developing countries get newer technology while the US still uses the same stuff we've been using for decades. There are plenty of examples in the US where fiber optics have been laid down but left completely unused. ISPs here are generally not considered businesses to look up to. Part of the anti-net-neutrality argument was that fewer regulations would mean ISPs would invest more in infrastructure, but many people scoff at that claim because the CEOs of those companies pocket the money instead of investing it. Not really worth getting into, but I think there is somewhat of a consensus that it's a messed-up situation.

GPUs, game systems and all those consumer products are likely a different story; I know that a lot of stuff costs more to buy in other countries. I should have separated those two things, because I have heard multiple times that US computer network infrastructure doesn't get replaced, while other countries get new builds with better technology. One of my professors joked that hurricanes represent great job opportunities for us, because once the old infrastructure is destroyed we get to lay down new stuff.

I mean, those are just random examples. People switching from cable to Netflix are, in a way, unifying their communications onto the internet alone. Internet phones also aren't limited to hard phones; that includes things like calling people over Facebook or Skype, because those don't typically rely on your telephone network. I've seen a few times that people prefer Skype video calls to their distant families instead of regular telephone calls. But in the business world it just makes a lot more sense to use VoIP over long distance because it is cheaper.

The really irrelevant thing was the toaster thing. I don't know why anyone would want to push notification for their toast. It's possible though. I haven't really read too much about how it is progressing in the news or anything. But the term for that is IoT (Internet of Things) which brings to the table plenty of security concerns and stuff.

It seems like the wealthy get the technology first and then it trickles down to the less wealthy afterward. One of my classroom topics was SCADA systems, which is basically how we network computers to monitor factory equipment. That has probably been usable by manufacturers for longer than the average person has had personal measuring devices like heartbeat sensors and Internet toasters. If you think about it, so was the Internet itself: before regular people started using computer networking, it was used for commercial purposes between businesses.

I can now confirm that smart toasters exist for almost $200, alongside smart TVs and everything else. I've never seen anyone with a smart toaster, and I don't think most people have this type of stuff. But once you start looking at newer appliances, you might notice some form of computer connectivity. It's there, but I haven't really seen it much either.


----------



## Electra (Oct 24, 2014)

Grandmaster Yoda said:


> The really irrelevant thing was the toaster thing. I don't know why anyone would want to push notification for their toast. It's possible though. I haven't really read too much about how it is progressing in the news or anything. But the term for that is IoT (Internet of Things) which brings to the table plenty of security concerns and stuff.


My grandfather's house got a new owner after he died, and that guy updated the house with lots of cameras to make it super secure. But someone hired by a TV company hacked his house for a program about the issue, just to test it, and they opened his curtains from a car outside.




Grandmaster Yoda said:


> I do now confirm that SmartToasters exist for almost $200. Smart TVs and everything. I've never seen anyone with a smart toaster. I don't think most people have this type of stuff. But once you start looking at newer appliances, you might notice some form of connectivity to computers. It's there, but I haven't really seen it much either.


I have ADHD, and if I had that toaster I might not forget about the toast altogether. The same goes for people with dementia, Alzheimer's, depression, etc. who struggle with memory and concentration/distraction.
:happy:


----------



## Electra (Oct 24, 2014)

If one doesn't have problems with these issues, the smart toaster seems kind of like this:
https://www.isitdarkoutside.com/


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> In terms of the network infrastructure between fiber-optics and cabling, the developing countries get newer technology while the US stills uses the same stuff we've been using for decades. There are plenty of examples in the US where we have had fiber-optics laid down, but completely unused. ISPs here are generally not considered to be the businesses to look up to here. Part of the anti-net neutrality argument was that fewer regulations would mean ISPs would invest more in infrastructure, but many people scoff at that claim due to the fact that these CEOs of those companies pocket money for themselves instead of investing and so forth with the political jargon. Not really worth getting into, but I think there is somewhat of a consensus that there is a messed up situation going on there.


That sucks for the US. I live in states too - the Baltic states. At one point Lithuania had the fastest net in the whole world. I guess we are advanced in that field, but we can't solve other problems at all. Poverty is a normal thing, vandals and thugs too.




Grandmaster Yoda said:


> GPUs, game systems and all those consumer products are likely a different story. I know that a lot of stuff costs more to buy in other countries. I should have separated those two things. Because I have heard multiple times that the US computer network infrastructure doesn't get changed and new ones get built in other countries which provide better technology. One of my professors joked that the Hurricanes represent great job opportunities for us because now that their old infrastructure is destroyed we will be able to lay down new stuff.


But on the other hand they had overpriced GTX 1050 Tis everywhere. I was only able to find an RX 560 2GB, which is already obsolete, so I ordered what I wanted; now I have a dual-fan version with 4 gigs of VRAM. My point is that consumers shouldn't be treated like trash or be at the mercy of unfair product availability. Even though Nvidia and Intel have the worst value, they're the ones being pushed, while AMD is something you have to search hard for in physical stores, only to find gimped variants. In the USA you at least have some stores with great product availability. There's some hope there, but not in the rest of the world. If your prof lived here and a hurricane happened here too, he would be a funny man with wise words. Too bad I don't know the usual situation in the US, so I won't comment on that much, but he is a funny guy.




Grandmaster Yoda said:


> I mean those are just random examples. People switching from cable to Netflix is in a way unifying their communications to only internet. Internet phones also aren't limited to hard phones, but that also includes things like calling people over Facebook or Skype because those don't typically rely on using your telephone network. I've seen a few times people prefer to use Skype to make video calls to their distant families instead of regular telephone calls. But in the business world it just makes a lot more sense to use VoIP over long distance because it is cheaper.


You make sense here. That's very true, but some services are fucking evil and I want to smack their CEOs in da face. You know why? Nah, you probably don't: some of them are not available in many countries. Little did I know, after purchasing my first Xbox Live Gold, that some stuff simply won't work because I live in a bad region. I thought that living in Europe meant living in a civilized place; too bad it wasn't true. Nowadays it's much less of a problem, but it still happens at rare times.



Grandmaster Yoda said:


> The really irrelevant thing was the toaster thing. I don't know why anyone would want to push notification for their toast. It's possible though. I haven't really read too much about how it is progressing in the news or anything. But the term for that is IoT (Internet of Things) which brings to the table plenty of security concerns and stuff.


Before the potato mania in computing memes, toasters were the poor victims of that role. Maybe the smart toaster somehow derived from that, and someone thought it was a good idea lmao.




Grandmaster Yoda said:


> It seems like the wealthy get the technology first and then it comes down to the less wealthy afterward.


It's nothing new. My parents were using an Athlon 64 system until 2017. I was suggesting they sell it around 2013 so they could still get some money for it. If it weren't for my knowledge of older hardware, it would have been sold for maybe 20 or 30 euros (a euro is like 1.2 dollars). Anyway, I turned it into a badass relic from the past, the "Athlonium 64". Back in 2004, before upgrades, it cost around 1200 euros. That's a lot, considering it didn't have decent graphics power, had a crappy 300W power supply, and some other low-end bits.



Grandmaster Yoda said:


> One of my classroom topics was SCADA systems, which is basically how we network computers to monitor factory equipment. That has probably been useable by manufacturers longer than the average person has had their own personal computer measuring type devices like heartbeat sensors and Internet toasters. If you think about so was the Internet itself. Before regular people started using computer networking, it was used for very commercial purposes between businesses.


And now they're like "look, my new PC has lots of RGB!". We have honestly degraded. Benchmarks back then measured how well business software worked - almost never games, only the important things. I have read some older computer and tech magazines, like Maximum PC or Computer Bild (the Lithuanian version of the German magazine).




Grandmaster Yoda said:


> I do now confirm that SmartToasters exist for almost $200. Smart TVs and everything. I've never seen anyone with a smart toaster. I don't think most people have this type of stuff. But once you start looking at newer appliances, you might notice some form of connectivity to computers. It's there, but I haven't really seen it much either.


I would honestly ditch that Smart TV shit and buy a TV with a 120Hz IPS display that outputs 100% sRGB and has a sturdy construction. Many of those Smart TV features become non-functional so soon that they are totally not worth it. On top of that, many of them have horrible controls too. Having a keyboard for a TV is a bit ridiculous, and voice controls are inaccurate. LG's smart remote was a sort of nice thing for the Smart functionality, but it sucked for normal TV stuff. They can't even make that stuff decently; they just want to put Smart stickers everywhere.


Another problem is people themselves. Many of them don't care much about technology, and if they saw this video:

[embedded video]

I bet that at least 20% in my country would think it's legit. Not because they are dum-dums, just because they know nothing about tech at all, effectively rendering lots of tech advancements straight into the trash can. But ya know, consumers are always right. My music teacher is right to hit her poor Samsung monitor just because it turns off every 15 minutes - obviously someone set a turn-off timer, but she thinks it's malfunctioning. She likes to hit or smack other electronics too, then rant a bit lmao.


----------



## pwowq (Aug 7, 2016)

Toasters have USB-ports.


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> That sucks for US. I live in states too. Mine are Baltic states. Once Lithuania had the fastest net in the whole world. I guess we are advanced in that field, but we can't solve other problems at all. Poverty is normal thing, vandals and thugs too.
> 
> 
> 
> ...


https://www.cnet.com/news/fcc-us-slow-internet-speed-americans-lagging/
It's a year old but kind of discusses the problem.

I'm sure you could come to a local electronics store here, pick a GTX 1060 off the shelf, and they would have more to spare. This is the wealthiest country in the world - of course, it's also full of wealth inequality. Going beyond electronics, the US probably has great product availability compared to poorer nations. Even clothing is cheaper here; my "stepmother" is from Ecuador, and she literally buys clothes here and gives them to her family, who can sell them. Ah, but they have cheaper medicine than we do.

I remember there was backlash during the Xbox One announcement because they said you had to connect to the Internet to activate games or something. Some countries don't even have internet access, so it would have been useless there.

I don't know how much computers used to cost. I think I looked up our family's original computer from 2000: it had an AMD Duron, which turns out to be a low-end processor, but it did have 8MB of graphics memory. I think it was at least $700, but I don't remember. If you want low-end now, you can get something for maybe $300 - low-powered computers, like mini-ITX boards inside lightweight cases. I had one of those in my room before we gave it away; completely non-upgradable except for the HDD and RAM. We also gave my cousin an all-in-one PC with similarly low specs. It had 6GB of RAM, not that that matters when it was a dual-core 1.4GHz AMD APU. Weak. It seems like people more interested in computers will buy something more powerful. That AMD thing works for basic web browsing, or for giving to grandma to take the Internet out for a spin, but not much else.

I was watching some educational training videos on YouTube about networking, and the guy recommended paying for business internet, because companies treat businesses with respect while homes get bad service.

Kids at my school have small smart TVs or game consoles, and it's perfect for them to watch Netflix or something instead of the lackluster basic cable the school gives us. A lot of people I know hate TV now; "I don't watch TV" is a common phrase. But in my house we have a giant smart TV and nobody bothers to use the smart part of it - they just watch cable. I occasionally use it to watch YouTube. If you want a big TV, go for it, but unused smart functionality is a waste of money. It also is not very intuitive to use, though not the worst thing in the world.

I don't know about 120Hz. The soap opera effect annoys me a lot; it makes everything look fake. Plus, the whole point of it is to overcome the inherent weaknesses of LCD. LCD has a slow response time, which makes it horrid for games. I'm not talking about input lag (I've never noticed input lag; I'm not a hardcore gamer) but about pixel response time: as the picture moves, everything leaves a nasty trail of blurriness. On top of that, LCDs have very poor black levels and low contrast ratios. Another crutch is "dynamic contrast ratio", which is just a marketing term. LCD is the most marketable technology - cheap, bright, and energy efficient. Also, don't forget high resolution. I'll tell you, when your entire screen gets blurry from moving objects, that high resolution doesn't mean a thing. I guess people enjoy it for some reason.

I am lucky enough to have a DLP TV in my room, which generally has a tiny response time, and black actually looks like black instead of bluish black. Or, if you have the backlight set all the way up as many people do, there is no black at all; you can only get over that by turning the lights on, which cancels it out a little.

LCD makes perfect sense on a smartphone, though: really bright for really bright environments, and it uses a steady amount of power no matter what color is on the screen. Bad black levels are remedied by bright environments like outdoors, and the Internet is mostly white pages. Perfect. Yet they made OLED phones mainstream. OLEDs use less power than LED-backlit LCDs on darker or mixed-brightness images, but more power when displaying white screens, which in my opinion is worse for a phone, since the Internet is white and making it black just makes it harder to see in the sun. OLED should be mainstream for televisions and computer monitors instead, because it has much better picture quality in terms of color, contrast and everything.
The only disadvantage is that the display degrades unevenly, and probably faster than a regular LED-backlit panel. I think LCD is best for viewing text, hands down, since it has no screen burn-in potential at all. Good for computer monitors and smartphones; not for movies or gaming.


----------



## Electra (Oct 24, 2014)

pwowq said:


> Toasters have USB-ports.


Hm, interesting. So this method _might be_ how the guy hacked the curtains to open from outside my late grandfather's old house (like I mentioned above somewhere).

Still, I assume he would first have to crack the house owner's password to get access to his network? It shouldn't be that easy. Even if they used a password-generating tool, I assume the system would lock itself up for a while after a certain number of tries before they could try again, so they would need a lot of patience and have to sit there for hours. It's kinda creepy to think that someone can just connect to your network like that and have fun with your gadgets.
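That intuition can be put into rough numbers. This is a sketch with assumed policy values (real devices vary): if a device locks out for a while after a handful of failed tries, brute-forcing even a short PIN takes ages.

```python
def brute_force_hours(n_guesses: int, tries_per_lockout: int = 5,
                      lockout_minutes: float = 15) -> float:
    # Rough lower bound on the time to try n_guesses passwords against a
    # device that locks out for lockout_minutes after every
    # tries_per_lockout failed attempts. (Policy numbers are assumed.)
    lockouts = n_guesses // tries_per_lockout
    return lockouts * lockout_minutes / 60

# A 4-digit PIN has 10,000 combinations:
print(f"{brute_force_hours(10_000):.0f} hours")  # 500 hours, about 3 weeks
```

Which is part of why attackers tend to look for weak default passwords or unpatched firmware rather than guessing.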


----------



## gte (Mar 4, 2017)

The future of the Internet? That's easy - Skynet!


----------



## Electra (Oct 24, 2014)

gte said:


> The future of the Internet? That's easy - Skynet!


:encouragement: Why did nobody here think of that yet :facepalm: :winky:


----------



## Electra (Oct 24, 2014)

What if someone decided to gather all the open, or even locked, info from people's social network profiles in a country during a war?


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> https://www.cnet.com/news/fcc-us-slow-internet-speed-americans-lagging/
> It's a year old but kind of discusses the problem.


Lmao, those speeds haha XD. My parents have the cheapest net in this city and it reaches 100 Mbps download pretty easily. My phone's wireless network in the city reaches 22 Mbps download. The fastest net service I have seen is 3 Gbps (maybe 1, I'm writing from memory) and it doesn't cost a fortune either. The situation in smaller cities and villages isn't that good; they pay the same money for dreadful net as we do for the fastest.





Grandmaster Yoda said:


> I'm sure that you can come to the local electronics store and pick a GTX 1060 off the shelf here and they would have more to spare.


The RX 560 4GB is the last not-overpriced GPU on the market. The 1050 Ti isn't worth it, and anything higher-end isn't either. NV's 1050 is alright now, but I think it's gonna be left gimped soon. It's also more expensive than the RX 560 and doesn't justify it well.



Grandmaster Yoda said:


> This is the wealthiest country in the world. Of course it's also full of wealth inequality as well. Go beyond electronics, the US probably has a lot of availability of products compared to poorer nations. Even clothing is cheaper here. My "stepmother" is from Ecuador, she literally buys clothes here and then gives them to her family and they can sell them. Ah, but they have cheaper medicine than we do.


Used clothes in Lithuania are extremely cheap. You can be fully dressed for 15 euros. New clothes are pretty cheap in some places too, but the quality is very meh.




Grandmaster Yoda said:


> I remember there was backlash during the Xbox One announcement because they said you had to connect to the Internet to activate games or something. Some countries don't even have internet so it would have been useless.


Nerds were raging and finally got what they wanted, but that wasn't the only problem. The Xbox One was nicknamed a VHS player, and it had some other flaws too.




Grandmaster Yoda said:


> I don't know how much computers used to be. I think I looked up our family's original computer from 2000. It had an AMD Duron which turns out to be a low-end processor but it did have 8MB graphics. I think it was at least $700, but I don't remember. If you want low-end now, you can get something for maybe $300. Low-powered computers that are like mini-itx inside lightweight cases. I had one of those in my room before we gave it away. Completely non-upgradable except for HDD and RAM. We also have my cousin an all in one PC with the same incredibly low specs. It had 6GB of RAM, but not like that matters when it was dual-core 1.4GHz AMD APU. Weak. It seems like people more interested in computers will buy something more powerful. That AMD thing works for basic web browsing or giving it to grandma to use take Internet out for a spin. But not much else.


But Athlons cost like 300 bucks, according to Maximum PC magazines from back then. FX series chips weren't too expensive either.




Grandmaster Yoda said:


> I was watching some educational or training videos by a guy on YouTube talking about networking and he recommended that you pay for business internet because companies treat businesses with respect but homes get bad service.


I think it depends much more on how much you pay.




Grandmaster Yoda said:


> Kids at my school have small smart TVs or game consoles and it's perfect for them to watch Netflix or something instead of the lackluster basic cable the school gives us.


Seriously, at schools?




Grandmaster Yoda said:


> A lot of people I know hate TV now. I don't watch TV is a common phrase. But in my house, we have a giant smart TV, but nobody bothers to use the smart part of it. They just watch cable. I occasionally use it to watch YouTube. If you want a big tv go for it, but unused smart functionality is a waste of money. It also is not very intuitive to use, but not the worst thing in the world.


I used it extensively in 2013 and it was very inconvenient and often slow. The situation should be better now. Sony removed the YouTube app on older TVs...





Grandmaster Yoda said:


> I don't know about 120Hz. The Soap Opera Effect annoys me a lot.


I was talking about real refresh rate, not about super-fast motion-interpolation hertz.




Grandmaster Yoda said:


> Makes everything look fake.


I tried it myself; it felt a bit weird, but not unnatural. I just wasn't used to more than 30 fps TV broadcast.




Grandmaster Yoda said:


> Plus, the whole point of it is to overcome the inherent weaknesses of LCD. LCD has a slow response time which makes it horrid to play games on. I'm not talking about input lag either (I've never noticed input lag, I'm not a hardcore gamer so I don't notice that) but low response time because that as the picture moves everything will leave a nasty trail of blurriness.


That's mostly visible on low-quality panels or slow IPSs. I have a 60 Hz IPS monitor now and I don't notice trailing, only the low refresh rate. My mind is still blown by those colors. They make everything look better.





Grandmaster Yoda said:


> On top of that, they have very poor black levels and low contrast ratios.


I wouldn't agree here. VA has better contrast and blacks. On IPS displays it's often hard to see incorrectly rendered blacks. My AMOLED phone beats them all, but it's not really LCD.




Grandmaster Yoda said:


> Another crutch is given called "dynamic contrast ratio" which is just another marketing term. It is the most marketable technology, it is cheap, bright and energy efficient.


I fail to see any efficiency in it; the term means nothing.




Grandmaster Yoda said:


> Also, don't forget high resolution. I'll tell you, when your entire screen gets blurry from moving objects that high resolution doesn't a mean a thing.


After my monitor upgrade, I don't see stuff like that. I upgraded from 1080p to 1440p. Only the low refresh rate is visible. Again, I bought a content-creator-class thing, so I wasn't cheaping out.



Grandmaster Yoda said:


> I guess people enjoy it for some reason.


I'm pretty sure they don't. Most of them don't even know about such a thing.

In GTA 3 there was an option to turn on light trails, lol. I know that some Windows versions have an option to turn on mouse-pointer trails.




Grandmaster Yoda said:


> I am lucky enough to have DLP TV in my room, which generally has a tiny response time and black almost actually looks like the black instead of bluish black. Or if you have the backlight set all the way up as many people do, there is no black at all.


I would dare to differ. I've only used calibration guides and content on my TV. Nothing is maxed out. Finding sweet spots is the key; maxing things out is generally a bad idea for any accuracy goals.




Grandmaster Yoda said:


> You can only get over that but turning lights on which cancel it out a little bit. It makes perfect sense on a smartphone though. Really bright for really bright environments and it uses a steady amount of power no matter what color is on the screen.


But don't AMOLEDs save power on lower-brightness areas, since they have independently lit pixels?



Grandmaster Yoda said:


> Bad black levels are remedied by bright environments like outdoors.


But it's rare for screens to be outside, generally speaking. Also, I notice that too. I'm picky, but mostly because I fucking hate low-quality screens.



Grandmaster Yoda said:


> The Internet is mostly white pages. Perfect.


Washed out letters, perfect.




Grandmaster Yoda said:


> Yet, they made OLED phones mainstream. They use less power than LED on darker or mixed brightness images. But they consume more power when displaying white screens. Which is worse in my opinion for a phone since the Internet is white and making it black just makes it harder to see in the sun. It should be mainstream of television instead and computer monitors because it has much better picture quality in terms of color, contrast and everything. Only disadvantage is the degradation of the display is uneven and probably faster than regular LED. I think LCD is best for viewing text hands down, it has no screen burn-in potential at all. Good for computer monitors and smartphones. Not for movies or gaming.


Honestly, do you miss CRTs? I would buy one if they were still being made.

OLEDs have disadvantages too: they can't display whites correctly (they often have a yellowish tint), plus input lag, trailing and blurring. Totally not okay for monitors, nor for more specific users such as photo editors or color-critical workers; for them, OLED monitors wouldn't have a reason to exist. On the other hand, as I remember, they are very thin and flexible, with almost perfect contrast and nice colors (Sammy seems to oversaturate their screens, and I dislike that; for me it defeats the purpose of having good colors).


----------



## The red spirit (Sep 29, 2015)

Electra said:


> :encouragement: Why did nobody here think of that yet :facepalm: :winky:


Because the Matrix is going to happen sooner.


----------



## pwowq (Aug 7, 2016)

gte said:


> The future of the Internet? That's easy - Skynet!


I am fascinated by how Skynet does what it can to destroy itself in that series.


----------



## pwowq (Aug 7, 2016)

Electra said:


> What if someone decided to gain all open- or even locked- info of peoples social network profiles in a country during a war.


That reminds me of a separatist fighter in the early stages of the Russian invasion of Ukraine, 2014. He tweeted a picture of himself (revealing his group's position in a building). Later the same day, a civilian filmed a Mi-24 firing rockets at a building that looked exactly like the one posted on that Twitter account. Lesson of the day: your enemy reads your Twitter account (and probably all your other social media accounts).


----------



## Electra (Oct 24, 2014)

pwowq said:


> That reminds me of a separatist warrior in the early stages of the Russian invasion of Ukraine, 2014. He Tweeted a picture of himself (uncovering his groups position on a building). Later same day some civilian filmed a Mi-24 firing rockets at a building which looked exactly like the one posted on that twitter account. Lesson of the day: Your enemy reads your Twitter-account (and probably all your other social media accounts).


Exactly!
So it's interesting that governments still allow it to be so open...
I know the info can be useful for cops, but it seems like such a bait.


----------



## Electra (Oct 24, 2014)

The red spirit said:


> Because matrix is going to happen faster.


What if we had all been programmed to believe we were self-aware?


----------



## The red spirit (Sep 29, 2015)

Electra said:


> What if we had all been programmed to believe we were self aware


Whatever, just enjoy the scenery.


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> Lmao those speeds haha XD. My parents have the cheapest net in this city and it reaches 100 mbps download pretty easily. My phone's wireless network in city reaches 22 mbps download. The fastest net service I have seen is 3gbps (maybe 1, I' writing from memory) and it doesn't cost a fortune either. Situation in smaller cities or villages isn't that good. They pay for the dreadful net same money as we do for the fastest.
> 
> 
> 
> ...


It was only a couple of years ago that any cable company near me started offering gigabit downloads. 100-300 Mbps used to be the fastest in my area, and I'm on Long Island, not far from one of the largest cities in the world. That's the fastest we have. I don't know what we have in my house, but the highest download speed I've ever seen was 2 MB/s in my room, which doesn't really say much about our plan. But realistically, I haven't seen higher. A speed test is giving me 27 Mbps, but I still don't know what our plan is.
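Side note on the units getting mixed above: ISPs advertise megabits per second (Mbps) while download dialogs usually show megabytes per second (MB/s), and there are 8 bits in a byte. A tiny sketch of the conversion, using only the figures quoted in this post:

```python
# Megabits per second (Mbps, what ISPs advertise) vs. megabytes per
# second (MB/s, what download dialogs show): 8 bits per byte.

def mbps_to_mbytes(mbps):
    """Convert an advertised Mbps figure to an ideal MB/s download rate."""
    return mbps / 8

# A 27 Mbps speed-test result tops out around 3.4 MB/s,
# so a 2 MB/s download corresponds to roughly a 16 Mbps link:
print(mbps_to_mbytes(27))   # 3.375
print(2 * 8)                # 16
```

So the "2 MB/s" observed download and the "27 Mbps" speed test aren't wildly inconsistent; they're within a factor of two once the units are aligned.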

The Xbox One is my least favorite right now. They recently updated the interface, as if it wasn't hard enough to navigate already. I don't own one, but my friend does and I occasionally play on it.

Quality does matter with response time. It varies a lot between panels. I've had a TN panel and an IPS panel. The IPS had worse blacks, but I think it just had a much better-looking picture overall. I don't recall being bothered by motion blur on it. The TN panel is a Samsung and I just hate it. It superficially looks like a good TV, but the colors are weak and the contrast is bad. The motion blur also kills my eyes. I also had a 40" TV that didn't really fit either category, but it had nicer black levels, so I want to assume it was a VA panel. But it was the blurriest thing in the world for gaming. Lol, it was kind of big in my room when I first bought it.

I don't do an in-depth calibration, but I am aware of certain settings that cause very bad quality. I set my sharpness to 0 and my color temperature to warm. It makes the whole picture softer and easier to look at, and much more natural as well. I use the iPhone Night Shift or f.lux to warm the screen a bit, just to make it nicer to look at. Unfortunately, switching to more natural color settings does not fix motion blur and bad panels. I also always have my backlight set to 0 for two reasons. The first is that the glowing blacks are a lot less annoying; in fact, many of the problems are a lot less noticeable when the screen isn't shining brightly in my face. The other reason is that I usually view in relatively dim environments, so it doesn't affect my ability to see. Really bright LCDs kind of hurt my eyes and make me want to look away. I haven't really had that problem in a while though.

My iPhone is also at zero percent brightness right now for the aforementioned reasons, and it saves battery life. I'm actually using my 5s right now because I found out the other day, after so many months away from the news about it, that I could jailbreak it. The battery has gone through over 500 charge cycles and has only degraded about 10-11%. Apple claims that after two years, or somewhere around 400 to 600 cycles, your iPhone should be at 80% of the original battery capacity. Mine is doing perfectly fine, and I think I got it mid-2014? I actually don't remember. But it's been well over 2 years, and as you may know, I use it a lot. People say that I'm blind because I read it up close without my glasses on, but I told that to my eye doctor and he said that's completely normal behavior for someone closer to being near-sighted.
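For scale, the battery numbers above can be compared directly against Apple's stated target. A back-of-the-envelope sketch assuming a simple linear fade; the rates are derived purely from the figures quoted in this post, not from any measured discharge curve:

```python
# Hypothetical linear-fade sketch: compare the observed wear on this 5s
# (about 10.5% capacity lost over 500 cycles) with Apple's "80% after
# ~500 cycles" design target.

def capacity_after(cycles, loss_per_cycle):
    """Remaining capacity fraction, assuming simple linear fade."""
    return max(0.0, 1.0 - loss_per_cycle * cycles)

observed_rate = 0.105 / 500       # ~0.021% lost per cycle (this phone)
spec_rate = 0.20 / 500            # Apple's worst-case target

print(round(capacity_after(500, observed_rate), 3))  # 0.895
print(round(capacity_after(500, spec_rate), 2))      # 0.8
```

By that crude measure, the phone is wearing at roughly half the rate of the spec's worst case, which matches the "doing perfectly fine" impression.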

I wish I had a CRT; I miss it that badly. I liked it a lot more. I used to have one in my room, standard definition. I preferred playing music on it over the PS3 because of that Earth animation; it looked really good in the dark, and of course the speakers were better than a flat panel's speakers. I would much prefer it for playing dark-area games or watching movies. But it's not that bad of an urge anymore. It's gone and it's done. I still think my Uncle's HD Sony Trinitron was the best picture I've ever seen. It just seemed to pull out the details better. Since no CRT was bound to a native resolution, it also didn't look like complete filth when we played Wii. It would be an interesting scenario to push out more performance in games by dropping the resolution a little bit. I never had a specific problem with low resolution so much as non-native resolution. Below a certain level it does get bad, but 720p on a computer is fine for me. 1024x768 is fine, minus less screen area to put things in.

I really admire the black levels on OLEDs but that's about it. There's nothing wrong with anything else, but that's the only thing that "Wows" me. My Android phone has an OLED, but it's one of those earlier ones when companies didn't bother calibrating their displays. So it looks horrible to me. I like the LEDs, not CCFLs. I think the CCFLs hurt my eyes. A well-built, well-calibrated LCD is fine, but I much prefer watching movies or playing games on my DLP projection TV.

I think reference monitors have become very high quality LCD screens as opposed to any other technology. They used to all be CRTs.


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> It was only a couple of years ago that any cable company near me starting offering gigabit downloads. 100-300 Mbps used to be the fastest in my area and I'm on Long Island not far away from the largest city in the world. That's the fastest we have. I don't know what we have in my house, but the highest download speed I've ever seen was 2MB/s in my room. Which doesn't really say much about our plan. But realistically for me, I haven't seen higher. Speed test is giving me 27Mbps but I still don't know what our plan is.


That's pretty sad, to be honest.



Grandmaster Yoda said:


> The Xbox One is my least favorite right now. They recently updated the interface as if it wasn't hard enough to navigate. I don't own one but my friend does and occasionally play on it.


I love my RGHed Xbox 360. Now it's a perfect console with huge emulation capabilities and some other extras. Yet the stock 360 is good too. The 360 was Microsoft's best console ever made, and I think it will remain so for a long time. Meanwhile, the One is just an unnecessary product. It doesn't have many exclusives, and it just doesn't feel like a console; it aims to be some sort of dedicated PC for games and entertainment, perhaps an HTPC. Nowadays, buying a PC is a far better investment as it's more functional, so the One is, like I said, a useless product. Overall, this generation of consoles is a big letdown.



Grandmaster Yoda said:


> Quality does matter with the response time. It varies a lot between panels. I had a TN panel and IPS panel. The IPS had worse blacks, but I think it just a much better looking picture overall. I don't recall being bothered by motion blur on it.


My IPS monitor is a BenQ BL2420PT. If you want, look up the specs. Anyway, it's really good, except for its refresh rate. I only wish it was 120 Hz, or at least 80 Hz. I feel like the high-CPI capabilities of my SteelSeries Rival 300 are limited by only having 60 Hz. Anyway, it's not pure IPS, but only an IPS-type display; it's AHVA. Surprisingly, input lag is not noticeable, and I really expected to see that bad side of an IPS-type display. The colors are stunning, and the PPI is too. The contrast amazed me as well, as it outputs much more of it than the same-specced Samsung TN I had before. Nice extra. The build quality is perfect. The very adjustable stand is good and has very thick metal inside. To me it feels like all manufacturers should make adjustable stands for monitors. Height adjustment is essential, especially on TNs. I tried to overclock this monitor, but it didn't reach anything at all, lol. Meanwhile, my Samsung TN reached a respectable 75 Hz at 1080p. Overall, it's a quality product with a unique feature set for around 300 euros. No gaming BS or low-quality aesthetic elements; this is a decent AHVA monitor capable of some serious stuff.

My TV is an LG IPS. I can see input lag and trailing. It only reaches 60 Hz. From my subjective experience and calibration attempts, I still can't say that it's similar to the BenQ. The BenQ has 100% sRGB coverage; I wouldn't say that about the LG. The exact model is the 42LM640S. Its overall build quality is only alright, nothing too good. The 3D stuff is what sold it to me at the time (2014, I guess). It's not too bad, but it's not really high quality either. I would say it's acceptable. It has the common IPS issues and TV issues. It was one of the cheaper 42-inchers of the time, so it would have been stupid to expect miracles or something very good out of it. Yet it was a good value.




Grandmaster Yoda said:


> The TN panel is a Samsung and I just hate it. It superficially looks like a good TV, but the colors are weak and the contrast is bad. The motion blur also kills my eyes. I also had a 40" TV and it didn't really fit either category but it had nicer black levels, so I want to assume that it was a VA panel. But it was the blurriest thing in the world in gaming. Lol it was kind of big in my room when I first bought it.


My TN monitor is a Sammy too. It's an S22C300, a cheap TN with 60 Hz. The colors are fine, the contrast is acceptable, the brightness is passable, and the viewing angles are good, except when looking from the bottom, which was a big issue for me because I sit incorrectly and am in the weirdest sitting or lying positions all the time. When overclocked to 75 Hz, it's pretty good for gaming. It's acceptable for very light photo editing needs. It was bought when I got the PC; my dad just bought it because it was available at the store, and I knew nothing about screens. Until I knew better, it was alright, but only until then; same with pretty much everything I have, lol. For what it had, it was expensive, and it has almost nothing premium. Maybe the touch buttons at the front, good aesthetics and better-than-horrendous build quality are why it cost considerably more than other ones.

After all my experiences with different screens, I would look at quality panels and wouldn't want to cheap out. It's what you look at all the time, so it would be unwise to cut corners on it.




Grandmaster Yoda said:


> I don't do an in-depth calibration but I am aware of certain settings that cause very bad quality. I set my sharpness to 0 and my color temperate to warm. It makes the whole picture softer and easier to look at. Much more natural as well.


0 sharpness doesn't mean a natural picture on all displays at all. Color temperature on displays should be calibrated to 6500K for natural, lifelike color accuracy and correct display of everything you see on screen. Sometimes panels are set way too cold, so in such a case your approach would help, but I would still look at you like a madman of settings. Sorry for that; I just know a thing or two about displays and am obsessed with the best possible picture quality. My standards are very high, and that's my problem. I don't seem to be able to rehab from it at all. I guess it's not because I'm evil.




Grandmaster Yoda said:


> I use the iPhone night shift or f.lux to warm the screen a bit just to make it nicer to look at. Unfortunately, switching to more natural color settings does not fix motion blur and bad panels.


Oh my Galaxy Note 3 Neo's AMOLED blurs a bit too.




Grandmaster Yoda said:


> I also always have my backlight set to 0 for two reasons. The first reason being that the highlighted blacks are a lot less annoying, in fact many of the problems are a lot less noticeable when they are shining brightly in my face. The other reason is I using view in relatively dim environments so it doesn't affect my ability to see. Really bright LCDs kind of hurt my eyes and make me want to look away. I haven't really had the problem in a while though.


Well, I can't do that at all. I just want to cry inside for all the lost shades of colors and greys. That's too much for me to sacrifice. Only when the battery is close to dying do I do that, but only then. Also, if PWM flickering is noticeable, then it's bad to set a low brightness.




Grandmaster Yoda said:


> My iPhone is also at zero percent brightness right now for the aforementioned reasons and it saves battery life. I'm actually using my 5s right now because I found out that I could jailbreak it the other day after so many months away from the news about it. The battery has gone through over 500 charge cycles and the battery has only degraded about 10-11%. Apple claims that after two years or I think somewhere either 400 or 600 cycles, your iPhone should be at 80% of the original battery capacity. Mine is doing perfectly fine and I think I got it mid-2014? I actually don't remember. But it's been well over 2 years and as you may know I use it the stuff a lot. People say that I'm blind because I read it without my glasses on up close, but I told that to my eye doctor and said that's completely normal behavior for someone closer to being near-sighted.


I have lots of charge cycles on the Note too, and I don't notice big degradation. The phone is from early 2014. Meanwhile, on the Galaxy Ace 2 from 2012, I can really see it, but mostly because it didn't have very good battery life in the first place. On the Nokia 3210 that I've had since 2005, I can see the effects of degradation far better, but it still lasts a long time, and I played games on that thing a lot.



Grandmaster Yoda said:


> I wish I had a CRT. That too badly. But I like it a lot more. I used to have one in my room, standard definition. I preferred playing music on it over the PS3 because of that Earth animation it looked really good in the dark and of course the speakers were better than a flat panel's speaker. I would much prefer it playing dark-area games or watching movies. But it's not that bad of an urge anymore. It's gone and it's done. I always do think that my Uncle's HD Sony Trinitron was the best picture I've ever seen. I just seemed to pull out the details better.


You know, Trinitrons were among the better CRTs. They must be really good.




Grandmaster Yoda said:


> Since no CRT was bound to a native resolution, it also didn't look like complete filth when we played Wii. It would be an interesting scenarios to push out more performance on games by dropping resolution a little bit. I never had a specific problem with low-resolution so much as non-native resolution. Below a certain level it does get bad, but 720p on a computer is fine for me. 1024x768 is fine minus less screen area to put things.


I never understood that concept of CRTs and low resolution. It looked bad on them too, not much different from LCDs. I had a maybe-360p LG CRT TV, and its low-res display was the biggest problem for me. Reading X360 text on it was just painful. Yet the zero input lag of CRTs must be great.




Grandmaster Yoda said:


> I really admire the black levels on OLEDs but that's about it. There's nothing wrong with anything else, but that's the only thing that "Wows" me. My Android phone has an OLED, but it's one of those earlier ones when companies didn't bother calibrating their displays. So it looks horrible to me. I like the LEDs, not CCFLs. I think the CCFLs hurt my eyes. A well-built, well-calibrated LCD is fine, but I much prefer watching movies or playing games on my DLP projection TV.


CCFLs also don't last long; they degrade pretty fast. On an HP DV6000 from 2005, the display is practically dead. Meanwhile, a Samsung PC monitor from 2005 held up better, but the backlight died. Both are CCFLs.




Grandmaster Yoda said:


> I think reference monitors have become very high quality LCD screens as opposed to any other technology. They used to all be CRTs.


CRTs are still used for that, from what I have heard. For LCDs, maybe they use something like a mixture of IPS and VA, or just OLED with the flaws ironed out. Who knows? But it's very satisfying to dream about it.


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> That's pretty sad to be honest.
> 
> 
> I love my Xbox 360 RGHed. Now it's a perfect console with huge emulation capabilities and some other extras. Yet stock 360 is good too. 360 was Microsoft's best console ever made and I think it will remain as such for a long time. Meanwhile One is just unnecessary product. It doesn't have many exclusives, it just doesn't feel like a console, it aims to be some sort of dedicated PC for games and entertainment, perhaps a HTPC. Now buying PC is far better investment as it's more functional, so One is just like I said, useless product. Overall this gen consoles is a big let down.
> ...


The Xbox One and PS4 just went to opposite extremes. I bought the PS4 when it was released and it couldn't play DVDs. You had to sign up for their online service. I was really annoyed, because I could plug an old iPod into my PS3 and download all of its unprotected music files with no issue. I gave my PS4 to my stepdad because he wanted one for some reason. So it's basically downstairs, possibly still usable by me; I don't think he would care. He gave me something in return, of course. The interface on the PS4 is fine, except that turning vibration off was completely inconsistent. I hate vibration. I'm not a big fan of either console. Now they are updating them with the Xbox One X and PS4 Pro to compete with PCs.

I never had the opportunity to hack or jailbreak an Xbox 360. I would play Halo 3 and spawn enemies and make them fight each other. I liked the Xbox 360 and PS3. I really liked the friends I found on PS3; I never liked the online players on the 360. Because of the more integrated chat functionality, it was basically a more competitive gaming console. On PS3, if you had a microphone, you could talk in-game but not out of game. There seemed to be a whole lot fewer 12-year-olds on PS3. But I liked the original Halo series a lot more than anything they had on PlayStation. Killzone 3 had great music, but it was not synchronized to anything the characters were saying. If you ever listen to it, you will know what I mean.

Sharpness depends on the display. Usually, I find that 0 is ideal. Certain weird computer monitors will somehow ruin the image by making it softer than it should be. That case, and when you are using a projector, are the only times I would suggest some sharpness. The old point of the sharpness control was to get around some of the limitations of analog television and VHS. Otherwise, what it really does is draw defined, contrasting outlines around things on the screen. The result is that the original picture is modified and you actually lose detail. Putting those outlines over the picture erases part of it, and obviously, when it's set really high, you get to see white halo effects. That happens at all levels of sharpness, assuming your monitor isn't one of those weird ones that blur the picture when set to low sharpness. Basically, you're looking for an unmodified picture that isn't unnecessarily sharpened or softened either way. This control is really a vestigial organ from the analog TV days, and so is the tint control.

Yeah, I don't actually take a camera into a pitch-black room to measure things like the actual color temperature. But as a rule of thumb, both the cool and the neutral color temperature settings on an average TV put out color temperatures cooler than 6500K (which is ironic to say, because 9300K is a higher temperature than 6500K, but we associate red with warm colors and blue with cool colors, even though with fire the blue part of the flame is the hottest). I always go with warm on a television. Another thing is that even though software might say 6500K, that doesn't mean it actually is. For example, the iPhone has closer to a 7000K color temperature, but the software is obviously configured to use 6500K. I think I've referred you to displaymate.com. Good website for some of the tests.
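The "contrasting outlines" and "white halos" described above are overshoot: a sharpening filter pushes pixel values past the original range on either side of an edge. A minimal one-dimensional sketch, using a common 3-tap sharpening kernel (the kernel choice and pixel values are illustrative, not any TV's actual filter):

```python
# The "white halos" from a sharpness control are overshoot from a
# sharpening kernel at an edge. Minimal 1-D sketch with the 3-tap
# kernel [-1, 3, -1] (it sums to 1, so flat areas are unchanged).

def sharpen(signal, k=(-1, 3, -1)):
    """Convolve a row of pixels with a 3-tap kernel, clamping at the borders."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        out.append(k[0] * left + k[1] * signal[i] + k[2] * right)
    return out

edge = [0, 0, 0, 100, 100, 100]   # a clean dark-to-bright edge
print(sharpen(edge))              # [0, 0, -100, 200, 100, 100]
```

The flat regions pass through untouched, but right at the edge the output dips below 0 and spikes above 100: after clipping to the displayable range, that spike is exactly the bright halo, and the detail the outline replaced is gone.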

Trinitrons used a different type of technology (an aperture grille) than typical shadow-mask CRTs, so they were brighter than usual ones. But there's something about them. My Uncle had one that did 1080i, and when I watched people on screen, it just pulled out the details and looked great. The colors looked right. Cheaper LCD displays have trouble making skin tones look normal.

CRTs were never really that good for text anyway; LCDs are sharper. CRTs have distorted shapes if you look closely. The widescreen bars on the side of the screen were not straight on my Uncle's TV, for example, but it's not noticeable in normal content. CRTs are not fixed-pixel displays. Sometimes you could sacrifice refresh rate for a higher resolution, but they all had their limits.

The difference is that when LCDs display a non-native resolution, the picture gets really distorted because they have to use video scaling, which tries to map the lower resolution onto the native resolution. For example, viewing 720p on a 1080p panel: your monitor is always displaying 1080p, it can't do 720p. Before the image reaches your screen, it is scaled up or down to fit the 1080p grid. Alternatively, there is one-to-one pixel mapping, where the 720p image sits within its own space and there are black bars around the whole image. It looks just as sharp as at native resolution, but it doesn't take up the full screen. A CRT doesn't have to do any of this. It just takes the signal and displays it, which means it doesn't lose the quality that comes purely from video scaling. There are fewer pixels, so it loses quality in that sense, but a 1024x768 CRT would display 800x600 just as well as an 800x600 CRT would. A 1080p LCD will have significantly degraded quality when trying to display 800x600, and an 800x600-native LCD will probably look sharper. Just lower the resolution on your monitor and all of your text will become blurry. That's the video scaler's fault, because it's basically interpolating the smaller image up to the bigger size the monitor supports. The monitor obviously has its own video scaler, but you can also scale using the GPU, which is sometimes a better-looking solution.
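The scaling blur described above is easy to see in one dimension. Here 4 source pixels are stretched onto 6 display pixels, the same 1.5x ratio as 720p on a 1080p panel, using linear interpolation (a deliberately simplified stand-in for a monitor's real scaler):

```python
# Sketch of why non-native resolutions blur on an LCD: stretching 4
# source pixels onto 6 display pixels (the same 1.5x ratio as
# 720p -> 1080p) with linear interpolation smears a hard edge.

def upscale_linear(src, out_len):
    """Linearly interpolate a 1-D row of pixels to `out_len` samples."""
    out = []
    for i in range(out_len):
        pos = i * (len(src) - 1) / (out_len - 1)  # position in source space
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

row = [0, 0, 100, 100]   # a hard edge at native resolution
print([round(v) for v in upscale_linear(row, 6)])
# [0, 0, 20, 80, 100, 100] -> the crisp 0-to-100 edge is now a gradient
```

With one-to-one pixel mapping there is no interpolation step at all, which is exactly why the image stays sharp but only fills part of the screen.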


----------



## The red spirit (Sep 29, 2015)

Grandmaster Yoda said:


> Xbox One and PS4 just went to opposite extremes. I bought the PS4 when it was released and it couldn't play DVDs. You had to sign up for their online service. I was really annoyed, because I could plug an old iPod into my PS3 and download all of the unprotected music files with no issue. I gave my PS4 to my stepdad because he wanted one for some reason. So it's basically downstairs, possibly still usable by me; I don't think he would care. He gave me something in return, of course.


It sounds as dreadful as your story makes it.




Grandmaster Yoda said:


> The interface on PS4 is fine, except that turning vibration off was completely inconsistent. I hate vibration. I'm not a big fan of either console.


Vibration is a must for me in racing games.



Grandmaster Yoda said:


> Now they are updating them with the Xbox One X and PS4 Pro to compete with PCs.


And there was a time when you could just put in a diskette and play your games; now you have a multi-functional device with x86 architecture that hides its PC genes and tries to pass as a console.



Grandmaster Yoda said:


> I never had the opportunity to hack or jailbreak an Xbox 360.


In my country you can buy them new and hacked, either with just a flashed DVD drive or RGHed (reset glitch hacked). Hacked ones are only a bit more expensive. Police and the authorities don't give a shit about that, or about software hacking at all.




Grandmaster Yoda said:


> I would play Halo 3 and spawn enemies and made them fight each other.


Yet with all those capabilities, I still play Forza 3 from the original disc. Believe me, you simply wouldn't want to play many games when they are all free and available. It's not very interesting. Quality of software matters the most; only the best will catch your attention.

Same situation with PS2s: lots of them are hacked. PS1s are very rarely hacked. Xboxes are rare. PS3s are rarely hacked. Almost no one has a Wii.




Grandmaster Yoda said:


> I liked the Xbox 360 and PS3. I really liked the friends I found on PS3. I never liked the online players on 360. Because of the more integrated chat functionality, it was basically a more competitive gaming console. On PS3, if you had a microphone, you could talk in-game but not out of game. There seemed to be a whole lot fewer 12-year-olds on PS3.


DDoS too.



Grandmaster Yoda said:


> But I liked the original Halo Series a lot more than anything they had on playstation.


That game was just boring to me...




Grandmaster Yoda said:


> Killzone 3 had great music, but it was not synchronized to anything that the characters were saying. If you ever listen to it you will know what I'm saying.


Very unlikely; I'm not a fan of FPSs on consoles, nor a fan of the PS3. Only the Gran Turismo series looks interesting.





Grandmaster Yoda said:


> Sharpness depends on the display. Usually, I find that 0 is ideal. Certain weird computer monitors will somehow ruin the image by making it softer than it should be.


60 on my Sammy TN looks okay and how it should be. 0 is very blurry. 100 is oversharpened as hell. On my LG TV it varies a lot with input type; anywhere from 6 to 23 was alright in different situations.




Grandmaster Yoda said:


> That case, and if you are using a projector, are the only times I would suggest some sharpness. Also with analog: the old point of the sharpness control was to get around some of the limitations of analog television and VHS.


TV broadcasts look very oversharpened, and the resolution is only 576p. One channel is 1080p.




Grandmaster Yoda said:


> Otherwise, what it really does is create defined, contrasting outlines around things on the screen. The result is that the original picture is modified and you actually lose detail. Putting those outlines over the picture erases part of the picture, and obviously when it's set really high you get to see the white halo effects. That really happens at all levels of sharpness, assuming your monitor isn't one of those weird ones that blur the picture when set to low sharpness. Basically you're looking for an unmodified picture that isn't unnecessarily sharpened or softened either way.


I have never seen a monitor that displays well at 0 sharpness.
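For reference, the "contrasting outlines" described in the quote are essentially unsharp masking: blur the picture, subtract the blur to isolate edges, then add the edges back amplified. A minimal 1-D sketch (an assumed illustration, not any TV's actual processing pipeline):

```python
# Unsharp masking in 1-D: sharpened = original + amount * (original - blurred).
# The overshoot past the neighbors is what shows up as white/dark halos.

def blur3(signal):
    """Simple 3-tap box blur with edge clamping."""
    n = len(signal)
    return [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

def unsharp(signal, amount):
    """Add back 'amount' times the detail the blur removed."""
    return [s + amount * (s - b) for s, b in zip(signal, blur3(signal))]

edge = [0, 0, 0, 100, 100, 100]   # a clean black-to-white edge
print(unsharp(edge, 1.5))          # values dip below 0 and overshoot 100 at
                                   # the edge: that overshoot is the halo
```

This also shows why high sharpness "erases" detail: the pixels near an edge get replaced by the exaggerated outline rather than their original values.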




Grandmaster Yoda said:


> This is really a vestigial organ from the analog TV days, and so is the tint control. Yeah, I don't actually take a camera into a pitch-black room to measure things like the actual color temperature. But as a rule of thumb, both the cool and the neutral color temperature settings on an average TV are putting out color temperatures cooler than 6500K (which is ironic to say, because 9300K is a higher temperature than 6500K, but we associate red with warm colors and blue with cool colors, even though with fire we know the blue part of the flame is the hottest). I always go with warm on a television. Another thing is that even though software might say 6500K, that doesn't mean it actually is. For example, the iPhone has closer to a 7000K color temperature, but the software is obviously configured to assume 6500K. I think I've referred you to displaymate.com. Good website for some of the tests.


Nah you didn't, but you just did, so yay. Yet I'm aware of temperature inaccuracies on displays.




Grandmaster Yoda said:


> Trinitrons used a different type of technology than typical shadow-mask CRTs, so they were brighter than usual ones. But there's something about them. My Uncle had one that did 1080i, and when I watched people on screen, it just pulled out the details and looked great. Colors looked right. Cheaper LCD displays have trouble making skin tones look normal.


Not only cheaper ones, but lots of them. 




Grandmaster Yoda said:


> CRTs were never really that good for text anyway; LCDs are sharper. CRTs can also have distorted shapes if you look closely.


Mine was a flatscreen CRT. The late CRT era.



Grandmaster Yoda said:


> Like the widescreen bars on the side of the screen were not straight on my Uncle's TV, but it's not noticeable in normal content. CRTs are not fixed-pixel displays.


But there still was a grid of pixels in them... so I'm confused about what you meant by that...



Grandmaster Yoda said:


> Sometimes you could sacrifice refresh rate for a higher resolution. But they all had their limits. The difference is that when LCDs use a non-native resolution, the picture gets really distorted because they have to use video scaling which tries to map the lower resolution to the native resolution.


That still seems like the same thing as on CRTs, just that on them it would be very blurry.




Grandmaster Yoda said:


> For example, viewing 720p on 1080p. Your monitor is always displaying 1080p, it can't do 720p. Before it reaches your screen it is scaled up or down to fit the 1080p. Alternatively there is one-to-one pixel mapping where you put the 720p image within its own space and there are black bars around the whole image.


My new monitor has that function.




Grandmaster Yoda said:


> So it looks just as sharp as if it was at native resolution, but it doesn't take up the full screen. A CRT doesn't have to do this. It just takes the signal and displays it, which means it doesn't lose the quality that comes purely from video scaling. There are fewer pixels, so it loses quality in that sense, but a 1024x768 CRT would display 800x600 just as well as an 800x600 CRT. A 1080p LCD will have significantly degraded quality when trying to display 800x600, and an 800x600 native LCD will probably look sharper.


I wouldn't agree here about magical degradation on LCDs.



Grandmaster Yoda said:


> Just lower the resolution on your monitor and all your text will become blurry. That's the video scaler's fault, because it's basically interpolating the smaller image up to the bigger size the monitor supports. The monitor obviously has its own video scaler, but you can also scale using the GPU, which sometimes looks better.


Maybe.

When I bought my last phone, I looked at phones in an acceptable price range. 2K screens were an obvious disadvantage for me. 1080p didn't intrigue me, while 720p looked perfect. Watching content at max resolution is always better, CRT or LCD doesn't matter. Weird thinking, but 720p was a good decision. Now a 1080p screen on a phone might be a better choice, but a 720p AMOLED is very good by itself. I don't feel like I'm missing anything worthwhile.

The 2K phone (LG G3) was very laggy and unable to cope with its stupidly high resolution; maybe software updates resolved that later, but it doesn't seem like they did anything. The Galaxy S4 was and still is a good phone, but the extra note functions of the Note 3 Neo were its only advantage over the S4, plus a possibly somewhat longer-lasting battery. I still think the choice was very well thought out. The Note 3 Neo possibly also has slightly better performance in games, due to the lower resolution and almost identically performing internals as the S4. An unexpected extra in the Note 3 Neo was a good audio chip, much better than the ones paired with Snapdragon CPUs (or APUs, to be more correct): better than the Snapdragon version of the Note 3 Neo and the S4. Simply a perfect decision; no regrets.


----------



## Grandmaster Yoda (Jan 18, 2014)

The red spirit said:


> It's so dreadful as it sounds from your story.
> 
> 
> 
> ...






There used to be a weird thing on the Internet about the PS3's untapped potential and all this talk about squeezing the most out of old console hardware. Obviously, they aren't even close to being as powerful as modern machines. But they didn't have that same talk with the new consoles. Everyone basically said, "That's a laptop processor, sad." Then they built new consoles with better graphics and higher-clocked CPUs. Realistically, what do people expect from a console? To be better than PCs for years? Just an absurd expectation.

I don't see what's wrong with the media center being a PlayStation or Xbox, though. Getting your TV through it doesn't make sense to me when you'll have a cable box anyway, if you even get cable TV. They have a Blu-ray player inside; they might as well play movies.

Now there's a lot of updates and overhead, though. Older people don't like them. They miss the old days of just popping in a disc and playing. That's why some people like the Nintendo Switch: it's simple to configure.

FPSs used to be the main games I played on console. But I also liked platformers. A few of them, anyway. I understand that a controller is less accurate than a keyboard and mouse. But I like to have a laid-back posture when I play a game, so I always plug in a controller when I play an FPS on PC. It's kind of uncomfortable to play with a mouse and keyboard, and playing games that I've already played on a console makes using a controller much easier.

I'm surprised that you would be thinking about Gran Turismo. I used to play it as a kid, the 4th one on PS2. I had the 5th one on PS3, but it took so long to load. Plus, I was never interested in 95% of the game; I just wanted to play a simple quick race once in a while, if ever. I didn't care for tournaments or anything. When my friend got Forza on Xbox One, I was so bad at playing it because I never play racing games. But the PS game doesn't include any car destruction or anything. Maybe there is a sixth game that does? I don't really care for it anyway.

I don't really play video games anymore. I just enjoy learning and being with my friends occasionally. Back in the ps2 days I used to jump up and down on my bed when I was able to play a video game by myself. Then I had friends on PS3 for a while, then people stopped playing and I stopped playing. It wouldn't be very pleasurable to play a game right now by myself. Most of the fun comes from having a friend physically present now, but even then I don't feel like playing the same game with them over and over.

My older Star Wars computer games were fun to me because I could use console commands and create scenarios for myself. I remember Star Wars Galactic Battlegrounds was one of my favorite games. I eventually fell down the sinkhole of the modding tools. I didn't actually create new content or graphics, but I changed things around so that, for example, my units could be purchased for free. I noticed that I spent more time changing configurations than I did actually playing. After a while, changing configurations is a pain, especially if you didn't back anything up properly. So I don't really play much anymore. I remember using god mode in Call of Duty zombies on PC; cheating is fun at first, but then it demolishes any future for playing the game. But that's the advantage of PC: you can use the console to change things in weird ways. Eventually it becomes boring, though.

Here's a very stubby article on fixed pixel. I suggest doing a Google search for more. 
https://en.wikipedia.org/wiki/Fixed_pixel_display

I also sometimes reduced my 1440p Android phone to 720p. It made the animations faster.
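For anyone curious, on many Android builds this can be done from a computer with the stock `wm` window-manager commands over adb (assuming USB debugging is enabled; exact support varies by device and Android version):

```shell
# Render a 1440p-class phone at 720p instead (portrait width x height).
adb shell wm size 720x1280      # override the reported display resolution
adb shell wm density 240        # optionally lower the DPI to match
adb shell wm size reset         # restore the panel's native resolution
adb shell wm density reset
```

Fewer pixels to render means the GPU finishes each frame sooner, which is why animations feel faster.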


----------



## Electra (Oct 24, 2014)

I got an SMS that can't be blocked, with an ever-changing message and a link attached.
I ignore it, but suddenly there is a new one. Some of the many mail addresses appear to be spoofed.


----------



## Selena Grey (Jul 21, 2016)

I think internet security has become an important issue today. With all these new technologies, the risk of being hacked is too high. Not long ago I started to use a proxy server at [link removed] to protect my anonymity. It helps me stay calm knowing that my personal data is secure from online snoopers.


----------



## Electra (Oct 24, 2014)

Selena Grey said:


> I think today internet security became an important issue. With all these new technologies the risks of being hacked is too high. Not so long ago I started to use proxy server at https://buy.fineproxy.org/eng/ to protect my anonymity. It helps me to stay calm that my personal data is safe from online snoopers.


A proxy is an excellent idea.
I thought your internet company provided proxies automatically, though?
Can your packets still be sniffed with programs such as Wireshark?
I also guess you could get malicious links with site redirectors through your mail. What OP do you use?


----------



## Cal (Sep 29, 2017)

The future of the internet:

More memes for all!


----------



## Cal (Sep 29, 2017)

.


----------

