
Description

Just a photo of my desktop, including my gaming rig and my collection of ponies and other memorabilia.

Tags: safe, artist:crystal-eclair, applejack, fluttershy, pinkie pie, rainbow dash, rarity, twilight sparkle, pikachu, equestria girls, g4, brushable, collections, controller, crimson omen, gaming pc, gears of war, high res, irl, mane six, model, photo, pokémon, rms lusitania, thunderbirds, toy, xbox 360 controller, xbox controller, xbox one

Comments


Durabiznik

@Durabiznik
my 1070 isn’t a standard 1070, it’s an MSI 1070, a gaming brand edition
MSI GeForce 1070 Gaming X
 
That just means it runs at marginally higher clocks than reference cards.
In your case it’s +0 MHz base core clock, +114 MHz boost core clock, and +100 MHz memory clock, which has pretty much no impact on real-life performance.
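
To put those offsets in perspective, here’s a quick back-of-the-envelope sketch. The reference clocks (1506 MHz base / 1683 MHz boost / 8008 MHz effective memory) are my assumption of the usual GTX 1070 figures, not something read off your card:

```python
# Rough scale of the factory overclock relative to reference clocks.
# The reference GTX 1070 clocks (MHz) are assumed, not measured;
# the offsets are the ones quoted above.
reference = {"base": 1506, "boost": 1683, "memory": 8008}
offsets = {"base": 0, "boost": 114, "memory": 100}

for domain, ref in reference.items():
    uplift = offsets[domain] / ref * 100
    print(f"{domain}: +{offsets[domain]} MHz on {ref} MHz = +{uplift:.1f}%")
# base: +0.0%, boost: +6.8%, memory: +1.2%
```

A ~7% boost clock bump at the very best, and frame rates typically scale less than linearly with clock speed, so in practice that’s a couple of FPS.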
 
The main benefit of non-reference cards (aside from better pricing) is generally a better cooling solution. That means less risk of thermal throttling and better overclocking potential if you decide to overclock yourself. But no matter what you do, you will only be able to squeeze an extra 5-10% out of your 1070 compared to the reference Founders Edition, depending purely on luck (no two chips are made equal; yours might still be stable at 2200 MHz, or it might start crashing in the low 1900s no matter what you do, and there’s no way to know until you try).
Durabiznik

I’m not making fun of you, just saying you should have consulted someone who knows what they’re doing before spending a lot of money on hardware more or less at random.
 
You bought a FreeSync monitor but a GPU that only supports G-Sync.
You got an extra 16 GB of RAM towards a total of 32, which if anything slightly hurts performance (you don’t utilize the full 16 GB to begin with, and you don’t have a motherboard that supports quad channel, so the two extra sticks do nothing except put unnecessary strain on the memory controller). I hope you at least checked what brand and model HP used.
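
For the curious, the bandwidth arithmetic behind that (a sketch assuming plain DDR4-2133, since I don’t know what HP actually shipped; each DDR4 channel is 64 bits, i.e. 8 bytes, wide):

```python
# Theoretical bandwidth = channels * transfers per second * 8 bytes per channel.
# Sticks beyond one per channel do NOT add channels on a dual-channel board
# like yours, so four sticks move no more data per second than two.
def bandwidth_gb_s(channels: int, mega_transfers: int) -> float:
    return channels * mega_transfers * 8 / 1000

for sticks in (2, 4):
    print(f"{sticks} sticks, dual channel: {bandwidth_gb_s(2, 2133):.1f} GB/s")
# 2 sticks, dual channel: 34.1 GB/s
# 4 sticks, dual channel: 34.1 GB/s
```

Same bandwidth either way; the extra capacity would only matter if you were actually filling the first 16 GB.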
 
But first and foremost, you fell for the 4K gaming hype, which simply isn’t viable at this point unless you have an SLI/Crossfire setup (a dead end, as developers are supporting multi-GPU less and less as time goes on), or are one of the very few people who bought the $1200 Pascal Titan X housefire. You have a 1070, which is a great GPU (I have one myself), but 4K absolutely cripples it. At 1440p you can enjoy decent performance with a 1070, and there is a plethora of good monitors, including high-refresh-rate IPS ones.
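
The resolution math alone shows why (nothing 1070-specific here, just pixel counts):

```python
# Pixels per frame at common resolutions; GPU workload grows roughly
# in proportion to this count, which is why 4K is so much heavier.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

4K pushes 2.25× the pixels of 1440p and 4× the pixels of 1080p, and the frame rate drops roughly accordingly.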
 
@Durabiznik
This computer plays Gears of War 4 perfectly on ultra just fine
 
The GTX 1080 paired with an overclocked i7-5960X fails to reach a stable 60+ FPS in GoW4 at 4K ultra, but your 1070 paired with an i5-6600K at stock clocks doesn’t? Interesting.
I hope you like your games running at 20-40 FPS, because at 4K that’s what your 1070 is going to get you in AAA titles unless you lower the graphics settings considerably. And it’s only going to get worse for your poor 1070 from here on.
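
If 20-40 FPS sounds abstract, frame times make it concrete (just the arithmetic, not measurements from your machine):

```python
# Frame time is the inverse of frame rate: 60 FPS leaves 16.7 ms per frame.
for fps in (20, 30, 40, 60):
    print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")
# 20 FPS = 50.0 ms per frame
# 30 FPS = 33.3 ms per frame
# 40 FPS = 25.0 ms per frame
# 60 FPS = 16.7 ms per frame
```

At 20 FPS you wait three times as long for every frame as you would at 60.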
Appletart

@Durabiznik  
it’s not blue LED, it changes colours (apart from the keyboard), and you don’t know a thing. This computer plays Gears of War 4 perfectly on ultra just fine, and the i5-6600K came with the computer, I didn’t build this from scratch.
 
If you must know, it’s an HP Envy Phoenix; it originally had the i5-6600K, 16 GB of DDR4 RAM, and an AMD Radeon 4 GB graphics card.