Everything posted by Derouiche™
-
Gaming monitors are designed to let high-end hardware deliver realistic graphics; without one, powerful components cannot show their full potential. Built for the best possible image quality in games, these monitors pack advanced technologies to produce lifelike images, and that brings us to the ViewSonic XG2703-GS: a potent LCD display with a SuperClear IPS-type panel, a 165Hz refresh rate and a 4ms response time. The XG2703-GS has a 27-inch anti-glare TFT LCD panel with NVIDIA G-Sync and Quad HD resolution. We see it as an optimal display for playing games on ultra-high settings without glitches. To get the best possible output you need a compatible GPU that supports NVIDIA G-Sync; an AMD FreeSync version is also available at a lower price. We were quite delighted to test this model, and in the early stages it performed well in our games and test runs. We will share the details ahead, but first let's look at the major features and full specification of the model.

Features:
Designed for gamers
Adaptive Contrast Control
Flicker-Free and Blue Light Filter technology
Dual speakers
Versatile connectivity
WQHD 2560 x 1440 resolution
165Hz on DisplayPort 1.2a
SuperClear IPS-type panel technology
NVIDIA Ultra Low Motion Blur, Dark Color Enhancement and G-Sync technology
6-axis color enhancement
Ultra-fast 4ms response time

Package: The box contains:
Monitor with attached stand
2x power cords with AC/DC power adapter
DisplayPort cable
USB cable to power the monitor's USB ports
Quick start guide and manual
Note: it does not come with an HDMI cable.

Design: The ViewSonic XG2703-GS sits in the premium segment of the 2K monitor category. It is powered by NVIDIA G-Sync, which gives you tear-free, lag-free gaming, but remember it is a power-hungry beast, as its energy rating suggests.
Front: A 27-inch wide-angle anti-glare screen covers most of the front, framed by a thick black border, with a green opening at the bottom for cable management and a large stand base to keep it stable. Mostly covered in plastic, the monitor weighs roughly 7kg with the stand. The XG2703-GS is a large monitor that requires a good amount of space for proper placement. We suggest keeping sufficient distance between yourself and the monitor to reduce eye strain, and adjusting brightness and contrast accordingly for the best viewing. You'll notice the green cable-management opening right above the base, next to the power indicator light. The green is a clear nod to NVIDIA G-Sync, so it surely matches the team-green theme.

Sides: There are two USB ports on the right, plus two USB 3.0 ports facing down at the back alongside a port to power them (a cable is included in the package) and the DC power input. Also at the back are two video inputs: an HDMI port that supports a maximum of 2560 x 1440 @ 60Hz, and a DisplayPort 1.2 connector that delivers up to 2560 x 1440 @ 165Hz. Further along is an audio jack for headphones.

Back: The body features a VESA-mount stand that lets you rotate the display into portrait mode and offers a good degree of forward and backward tilt as your situation demands. The stand has an easy adjustment mechanism: you can move the screen up and down to a comfortable height, and twist it easily from landscape to portrait. The stand is removable, so you can also wall-mount the monitor with a compatible VESA mounting panel. Putting a headphone hook on the back of the stand is a creative touch; I feel it would be more useful on the side of the display, but you may find it handy for various other reasons. All the keys for the OSD menu are accessible from the back.
They can be used for easy navigation, and adjusting monitor settings is comfortable, though I think touch-sensitive buttons are a lot easier to use than these clickable keys. Overall the design is remarkable; it looks like a huge gaming display with a number of sober features.

Speakers: We are not content with the speaker output. The sound is not as loud as promised for an immersive multimedia experience. The dual 2W stereo speakers don't deliver on their promise at all; they are very quiet, and you will need headphones if sound is one of the things that makes your gaming complete.

Display Quality: The light matte anti-glare coating has good anti-reflection characteristics. The wide viewing angle maintains color quality, so don't worry about losing visual quality from a distance or at different angles. We measured 100% sRGB coverage in our color gamut test. The monitor has an HDMI port, but to get the most out of it you need to use DisplayPort, which unlocks the 165Hz refresh rate; over HDMI the gaming experience is not as pleasant. ViewSonic didn't ship an HDMI cable in the box, another indication of its DisplayPort preference.

For gaming, the G-Sync feature is a real game changer. If you have a G-Sync-compatible card, this monitor is a gem; otherwise you might not be pleased with the results. There is also a FreeSync version for AMD graphics card users; we can't say anything about that one yet, but hopefully it works as well for AMD users as this one does for NVIDIA users. The price varies by model, and the G-Sync version is the costliest.

OSD Menu: The ViewSonic XG2703-GS has four preset view modes in the OSD menu along with all the basic and advanced settings. They are Standard, Game, Movie and Web, each with pre-configured settings that adjust the screen quality. Another six sub-preset modes are available for gaming, under Game in the OSD menu: Gamer1, Gamer2, FPS1, FPS2, RTS and MOBA.
Selecting any one of them gives you different saturation levels, gamma, color and other image corrections. We recommend sticking with the preset modes to get the best possible results. The first button lets you select those six gaming sub-presets. The second key opens the main monitor menu, along with manual contrast/brightness adjustments and input selection. The main menu has plenty of settings, including speaker volume control, view mode, color adjust, manual image adjust and OSD setup. View Mode offers the four modes mentioned above: Standard, Game, Movie and Web. Standard mode further offers access to ULMB (NVIDIA Ultra Low Motion Blur), Dark Boost, Adaptive Contrast, Blue Light Filter, Response Time and Recall. These settings can help you optimize the monitor's output. For example, Dark Boost offers different levels to brighten the darker areas of the screen; use Adaptive Contrast for a brighter picture; and if the screen seems too harsh, use the Blue Light Filter to reduce eye strain. The monitor has three response-time modes: Standard, Advanced and Ultra Fast. Per the specs, 4ms is its fastest response time, a common standard among gaming displays. The Color Adjust section offers settings such as Contrast, Brightness, Saturation, 6-Axis Color, Color Temperature, Input Range and Gamma, ideal for expert users who want to play with custom settings for the best output. Finally, in Manual Image Adjust you can try the overclocking feature. By default it is set to the maximum of 165Hz, or you can set it to native, which reduces the refresh rate depending on the input source you are using, for example to 144Hz. The XG2703-GS gives you utmost control over screen optimization. The high refresh rate is possible only over DisplayPort; HDMI is restricted to a maximum of 60Hz. So if you want the most detailed image quality, go with the DisplayPort cable that comes in the package. HDMI is fine for consoles or home theater systems.
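To see why DisplayPort is needed for the high refresh rate, a rough bandwidth estimate helps. The sketch below uses our own back-of-envelope figures (not ViewSonic's) to compare the raw pixel data rate at 2560 x 1440 with 24-bit color at 165Hz versus 60Hz; it ignores blanking overhead, so real link requirements are somewhat higher:

```python
# Back-of-envelope pixel data rate for 2560x1440 at 24-bit color.
# Blanking intervals are ignored, so actual link bandwidth
# requirements are somewhat higher than these figures.

def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Raw uncompressed video data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

qhd_165 = raw_gbps(2560, 1440, 165)  # needs DisplayPort 1.2
qhd_60 = raw_gbps(2560, 1440, 60)    # fits older HDMI limits

print(f"2560x1440 @ 165Hz: {qhd_165:.1f} Gbit/s")
print(f"2560x1440 @ 60Hz:  {qhd_60:.1f} Gbit/s")
```

At roughly 14.6 Gbit/s, 165Hz QHD fits within DisplayPort 1.2's roughly 17.28 Gbit/s data payload but far exceeds what the monitor's HDMI input carries, hence the 60Hz cap there.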
Performance Test: The ViewSonic XG2703-GS runs games at its native WQHD resolution. 2K output is possible over HDMI as well, but the low refresh rate there causes issues with image quality. We noticed screen tearing and pixelated darker areas, caused by a low refresh rate over both HDMI and DisplayPort; we assume this was down to our Radeon GPU, which lacks G-Sync support. A FreeSync version is also available, so before buying, do check your graphics card's compatibility against the monitor's specifications. Color reproduction looks good even from a distance; our color gamut test shows the monitor is capable of 90% coverage of AdobeRGB and 100% of sRGB at 100% saturation. As far as our quality testing is concerned, we are happy with the output. Without going deep into color detail, the XG2703-GS delivers the good output it promises. The different profiles and customization in the OSD menu are something you need to work on if your use demands the best adjustments for gaming, so spend some time with them and you will see the results.

Verdict: If it fits your budget and 2K gaming is your thing, you should go with this model; otherwise look at a 4K alternative. The price varies by model, and the G-Sync version is the costliest. If you own an AMD-based gaming system, your preference should be the FreeSync model; but if you have an NVIDIA GPU in your system, a G-Sync monitor is the one to get, and obviously a higher price tag follows. The XG2703-GS works seamlessly, with a fast response time, an IPS-type panel, good viewing angles, an adjustable stand and more. For us it is an excellent fit for a gaming setup. It features all the new technology you need for high-end output, though only at 2K resolution. Our tests gave satisfactory results, and we recommend it for consoles as well.
-
v2 text blur
-
correct chose the next one =)
-
this one ? =)
-
welcome to csblackdevil have fun / good luck
-
Welcome to CsBlackDevil, have fun while you're here
-
battle ✘ PsychO™✘ Vs The Ga[M]er.[Winner ✗-HaMoNa-✗]
Derouiche™ replied to ✘ PsychO™✘'s topic in GFX Battles
v3 texture , lighting -
Battel (City VS Farouk_Messi) (Farouk Messi Winner)
Derouiche™ replied to #Farouk Messi's topic in GFX Battles
v1 text -
v1 C4D
-
battle Hamona vs Maniac vs Paradox [Winner Hamona]
Derouiche™ replied to -Skrillexx™'s topic in GFX Battles
v1 text -
Battle [The Ga[M]er Vs ViRuS] winner The Ga[M]er.
Derouiche™ replied to The Ga[M]er.'s topic in GFX Battles
v2,text,brush -
Samsung has dominated the solid-state storage scene for the past few years, and they recently updated their SSD lineup with two of the fastest drives we've ever tested: the MLC-based SSD 960 Pro and the more affordable TLC-based 960 Evo, both reviewed and proven to be extremely fast NVMe SSDs. That's a tough act to follow for the competition, even for the likes of Intel. Corsair announced the Force MP500 SSD shortly after the release of the Samsung 960 series. The Force MP500 is a high-speed NVMe SSD targeting power users, and the drive takes advantage of the more desirable MLC-type NAND flash. However, unlike the 960 Pro series, capacities are available below half a terabyte: Corsair is offering this new series with a maximum capacity of 480GB, while it is also possible to purchase a 240GB version and even a piddly 120GB model.

Availability of the new Samsung drives hasn't been great; however, given the performance on tap at $330 for the 512GB 960 Pro and $250 for the 500GB 960 Evo, they've been able to single-handedly hold the competition at bay. Intel's SSD 750 series and Toshiba's OCZ RD400A simply can't compete at the same level, so we're looking forward to seeing how Corsair's top offering holds up. We have the MP500 480GB model on hand for review, and at $325 it's priced to match the 960 Pro. Actually, given the slight variation in capacity, the Corsair drive ends up a few cents more expensive per gigabyte, so the MP500 series isn't exactly affordable. However, on paper Corsair's new flagship SSD series looks mighty impressive, boasting sequential read/write speeds of 3GB/s and 2.4GB/s. Before metaphorically seeing what the MP500 is made of, let's take that phrase literally...

What Makes the MP500 Tick? Based on the M.2 2280 form factor, all MP500 models use the PCIe 3.0 x4 interface for a blisteringly fast sequential read speed of 3GB/s, coupled with an equally impressive write throughput of 2.4GB/s.
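The per-gigabyte pricing comparison above is easy to verify. A quick sketch using the prices quoted in this review (our own arithmetic):

```python
# Cost per gigabyte for the two flagship drives priced in this review.
def cost_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

mp500_480 = cost_per_gb(325, 480)    # Corsair Force MP500 480GB
s960pro_512 = cost_per_gb(330, 512)  # Samsung 960 Pro 512GB

print(f"MP500 480GB:   ${mp500_480:.3f}/GB")
print(f"960 Pro 512GB: ${s960pro_512:.3f}/GB")
```

The MP500 works out to roughly $0.68 per gigabyte against about $0.64 for the 960 Pro, which is the "few cents more per gigabyte" noted above.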
What's interesting is that Corsair claims the same sequential read and write performance for all three models (120GB, 240GB and 480GB), though there is a large discrepancy in IOPS, where the 120GB version lags well behind the larger 240GB and 480GB models. Whereas the 240GB and 480GB models are rated for 250,000 random read IOPS and 210,000 random write IOPS, the 120GB model is good for just 150,000 read and 90,000 write IOPS. So, as you might expect, performance will be down on the 120GB model; just be aware of that. Again, for testing we have the 480GB model on hand, and as the fastest model in the series it will show best-case performance.

Also interesting about the new Force MP500 series, and rather important to note, is that it uses MLC NAND memory, not the cheaper and less durable TLC found in the 960 Evo series. Powering the MP500 series is a Phison controller, specifically the PS5007-E7. So far there are very few SSDs in the wild using this controller, and I have personally yet to see how it handles. Connected to the controller is Toshiba's 15nm MLC NAND along with a 512MB DRAM buffer; note that the 240GB model features a smaller 256MB buffer and the 120GB model comes with a 128MB buffer. Currently AES 256-bit encryption isn't supported, but Corsair hopes to deliver a firmware update soon to add support. Finally, all three models are backed by a three-year warranty, which isn't bad, though we would have liked to see them match Samsung's five-year warranty. Still, Corsair is claiming a 698TB written-data endurance rating, though of course you only have three years to prove them wrong.

How's the Software? The Force MP500 series is backed by Corsair's tacky-looking SSD Toolbox. It has to be said I'm used to seeing much more polished software from Corsair, and I expect the SSD Toolbox to see some major upgrades in 2017.
Most of the Toolbox's functions aren't compatible with NVMe drives at this point, so MP500 owners can't view the drive's S.M.A.R.T. status or total host writes, run a secure wipe, change the overprovisioning or adjust the TRIM command settings. Technically, the MP500 doesn't require any special drivers to work. That said, it does if you want to extract maximum performance from the drive, and right now Corsair doesn't offer an NVMe driver. Samsung recently released a new NVM Express driver that greatly improved performance under Windows 10 and 8.1. The problem is that these operating systems by default enable a write-cache setting known as the Force Unit Access command, which drastically reduces performance. This command is a conservative approach taken by Microsoft to ensure data integrity in case of sudden power loss. However, from Windows 8 onwards, Microsoft incorporated an automatic FLUSH command to ensure data integrity, yet simultaneously kept the much older Force Unit Access command in the standard drive settings. This redundancy means that write speeds are significantly inhibited by unnecessary write-verification work. By manually disabling the command, write performance reached the expected levels. However, manually manipulating drive properties isn't user-friendly, which is obviously why Samsung developed NVMe Driver 2.0 to do this automatically. Hopefully Corsair will follow suit.
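For perspective on the 698TB endurance rating mentioned earlier, here is a rough calculation (our own arithmetic, not Corsair's) of how much you could write per day before exhausting the rating within the three-year warranty:

```python
# Rough endurance math for the MP500 480GB: sustained daily writes
# needed to burn through the 698TB rating inside the 3-year warranty.
ENDURANCE_TB = 698
WARRANTY_DAYS = 3 * 365
CAPACITY_GB = 480

gb_per_day = ENDURANCE_TB * 1000 / WARRANTY_DAYS  # decimal TB -> GB
dwpd = gb_per_day / CAPACITY_GB  # full drive writes per day

print(f"Sustained writes: {gb_per_day:.0f} GB/day")
print(f"Drive writes per day: {dwpd:.2f}")
```

That's around 637GB per day, or about 1.3 full-drive writes every day for three years, which is far beyond typical consumer workloads.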
-
Nielsen has released its list of the year's most popular US smartphone apps and operating systems. Again, the top ten was dominated by Google and Facebook products, with the latter's social media app taking the top spot in 2016. Despite Facebook already being installed on the vast majority of smartphones, the app saw a year-over-year increase of 14 percent. Thanks to its 146 million average unique users per month, Mark Zuckerberg's platform was the most popular mobile application of 2016. Messenger was the year's second most popular app with 129.6 million average unique users, representing YoY growth of 28 percent.

But while Facebook may have taken the top two spots, it was Google's applications that made up most of the list. Places three through seven were taken by YouTube, Google Maps, Google Search, Google Play, and Gmail, in that order. Maps was Google's best performer when it came to YoY growth, up 22 percent from 2015, though YouTube wasn't far behind; the video streaming service was up 20 percent compared to the previous year.

While Nielsen's 2016 top ten was made up of the same apps found in 2015's list, there was one exception: Amazon. The retail giant's app knocked Apple Maps out of the chart, stealing the number 10 position. It also boasted the largest YoY increase on the list, 43 percent. The eighth and ninth positions were taken by Instagram and Apple Music, respectively. The Facebook-owned photo/video sharing application had the second-highest YoY growth in 2016, up 36 percent from the year before.

Nielsen found that 88 percent of US mobile subscribers now use a smartphone, up from 86 percent at the beginning of 2016. Android (53 percent) is still ahead of Apple (45 percent) when it comes to OS market share, but iOS had the slightly larger yearly increase of 2.3 percent. Not surprisingly, Windows phones made up just 2 percent of the market, while BlackBerry's share somehow increased by 0.3 percent to 1 percent.
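From the figures above you can back out approximate 2015 audience sizes. A quick sketch (our own arithmetic based on the reported 2016 user counts and growth rates):

```python
# Back out approximate 2015 average monthly unique users from the
# reported 2016 figures and year-over-year growth percentages.
def prior_year_users(current_millions, yoy_percent):
    return current_millions / (1 + yoy_percent / 100)

facebook_2015 = prior_year_users(146.0, 14)   # Facebook: 146M, +14% YoY
messenger_2015 = prior_year_users(129.6, 28)  # Messenger: 129.6M, +28% YoY

print(f"Facebook 2015:  ~{facebook_2015:.0f}M users/month")
print(f"Messenger 2015: ~{messenger_2015:.0f}M users/month")
```

So Facebook's app grew from roughly 128 million to 146 million monthly users, and Messenger from roughly 101 million to 129.6 million.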
-
Apart from his apparent love of Twitter, President-elect Donald Trump isn't a huge fan of technology. Speaking last week about the Russian hacking incidents, the Republican blamed computers for complicating people's lives. Now, he has suggested a method of keeping sensitive government information safe: stop using computers and send messages via human couriers.

Speaking to reporters at his Mar-a-Lago estate in southern Florida, Trump made the comments during his annual New Year's Eve party on Saturday. Admittedly, his belief that "no computer is safe" is well founded, but using couriers instead of electronic means to communicate is excessive and comes with its own security issues. The method would mean that humans, often the weakest link in the cyber security chain, would be even more at risk from nefarious parties. Instead of upgrading government networks and security, Trump said he'd rather use couriers for those "really important" messages, though it's hard to imagine that physically passing on notes would be any safer than the computerized equivalent.

Hours after his statements, Trump's press secretary, Sean Spicer, said the next president would continue to embrace Twitter as a platform for making major policy announcements. "There's a new sheriff in town, and he's going to do things first and foremost for the American people … Absolutely you're going to see Twitter," Spicer said on ABC's This Week.

As punishment for its alleged hacking activities, the Obama administration placed new sanctions on Russia last week, which included the expulsion of 35 Russian diplomats. When asked what he thought about sanctioning Russia, Trump said: "I think we ought to get on with our lives."
-
Facebook bases its entire business on collecting data about its users so it can make money through targeted ads. That's the price you pay in exchange for access to the largest platform for connecting with friends and family. But what you may not be aware of is that the company also works with several data brokers to gather information about users' offline lives. This can include things like the places you frequent, how much money you make and the number of credit cards you have.

The fact that Facebook is buying data from third-party data brokers isn't new, but that this includes data about users' offline lives isn't widely known, and a report from ProPublica is shining a light on the practice. The research found about 29,000 different categories Facebook provides to ad buyers, and almost 600 were "provided by a third party," most of them related to users' financial history. However, unlike Facebook's native data collection, users cannot see what information these third-party sources have on them directly from the social network's website.

When asked about the lack of disclosure, Facebook responded that it doesn't tell users about the third-party data because it's widely available and is not collected by Facebook itself. Steve Satterfield, a Facebook manager of privacy and public policy, says that users who don't want that information to be available to Facebook should contact the data brokers directly. He also points to a help center page with links to the opt-outs for six data brokers (Acxiom, Epsilon, Experian, Oracle Data Cloud, TransUnion and WPP) that sell personal data to Facebook. ProPublica, of course, decided to try this procedure and found that it was extremely complicated, in some cases requiring a written request sent by mail along with government-issued identification. Asking data brokers to provide the information they hold on a person is also a cumbersome process.
Since the report went online, Facebook has defended itself, saying that ProPublica fails to mention that a person can click on the upper right corner of any ad on Facebook to learn why they're seeing it; if it's because they're in a data provider's audience, Facebook discloses this and links to the data provider's opt-out. It's not clear if this still involves the complicated process ProPublica reported on. "Furthermore, we think when people choose not to see ads based on certain information, they don't want to see those ads anywhere. When a person makes changes to her Ad Preferences (which apply to Facebook's ad categories), we do our best to apply those choices wherever we show ads to that person using Facebook data. We wanted controls for data provider categories to work similarly, so we required the data providers to provide opt-outs that work across all the services that use their data for ads."
-
Earlier this month, the Obama administration promised that Russia would face consequences for interfering with the US election. Yesterday, a new set of sanctions was announced against the country, including the expulsion of 35 Russian diplomats. The actions coincide with the release of a declassified joint report from the FBI and Department of Homeland Security that reveals the technical details of Russia's hacking campaigns.

The 13-page document states that two different Russian civilian and military intelligence services (RIS) "participated in the intrusion into a US political party," a clear reference to the Democratic National Committee hacks. The first group, known as Advanced Persistent Threat (APT) 29, aka Cozy Bear, compromised the party's systems in summer 2015. The second group, APT28, aka Fancy Bear, broke into the DNC's network during spring 2016.

The report links APT29 to a spearphishing campaign that saw emails containing malicious links sent to over 1,000 recipients, including multiple government officials, in mid-2015. At least one of the targets activated links that delivered malware to the DNC's systems, giving APT29 access to sensitive information. APT28 used the same targeted spearphishing technique in summer 2016 to once again infiltrate the DNC and other organizations. In this case, the emails tricked recipients into changing their passwords through fake webmail domains. The government agencies believe the data stolen in this instance was leaked to the press and publicly disclosed, thereby influencing November's election.

The report refers to the Russian operations using the codename "Grizzly Steppe," and includes a diagram that gives a visual representation of how the attacks took place. Some security experts have criticized the report for being overly basic and arriving too late.
Obama has previously talked about responding to Russia's cybercrimes "at a time and place of our choosing." "I have issued an executive order that provides additional authority for responding to certain cyber activity that seeks to interfere with or undermine our election processes and institutions, or those of our allies or partners," said the President. "Using this new authority, I have sanctioned nine entities and individuals: the GRU and the FSB, two Russian intelligence services; four individual officers of the GRU; and three companies that provided material support to the GRU's cyber operations. In addition, the secretary of the treasury is designating two Russian individuals for using cyber-enabled means to cause misappropriation of funds and personal identifying information."

Additionally, Russia will no longer have access to compounds in Maryland and New York that have been used for intelligence purposes. More actions against the country are likely to be taken, though not all of them will be publicized. A spokesperson for Russian President Vladimir Putin said Russia regretted the new sanctions and would consider retaliatory measures. The Russian embassy in the UK sent out a tweet calling Obama's administration a "lame duck."

In response to the sanctions, Russia has ordered the closure of the Anglo-American School of Moscow, which was attended by the children of Western embassy personnel from the US, the UK, and Canada. It has also ordered the closure of a US embassy vacation house located just outside of Moscow. It will be interesting to see how incoming president Donald Trump deals with the situation. When asked yesterday about the Russian hacking situation, he blamed computers for making people's lives much more complex. When pushed to comment on the new sanctions, the President-elect said: "I think we ought to get on with our lives."
-
Microsoft’s iconic Blue Screen of Death (BSoD) is the universal sign that things have taken a turn for the worse. In the near future, however, some Windows users who happen upon error screens will be seeing a new color: green. In stark contrast to what we’re all familiar with, a leaked preview version of Windows 10 (build 14997) has been shown to feature a Green Screen of Death (GSoD).

Matthijs Hoekstra, a senior program manager for the Windows Enterprise Developer Platform, confirmed the authenticity of the GSoD on Twitter. Days earlier, Hoekstra said it was cool to read about all the new features people were discovering in the leaked build but hinted that nobody had yet found a “big change.” A day later, he dropped the simple hint: “Green!”

Contrary to what you might initially be thinking, Microsoft isn’t doing away with the iconic BSoD entirely. Instead, the GSoD is being reserved for preview builds seeded to Windows Insiders as a simple but effective way for Microsoft engineers to differentiate between stop screens originating from preview builds and those from public releases. Other features discovered in the leaked build include a blue-light reduction mode to reduce eye strain before bed, Start Menu folders and a performance-enhancing Game Mode that some believe could boost the PC gaming experience by freeing up additional resources from apps running in the background while gaming.
-
both are incredible and equal vote v2 text,texture,C4D position
-
Welcome back
-
Microsoft sort of neglected Windows as a gaming platform for a few years by prioritizing its Xbox gaming console instead. Lately, however, the company has taken steps to converge both ecosystems and bring PC gaming back into the limelight. More improvements could be on the way, if a recently leaked DLL file is any indication, suggesting a new “Game Mode” might arrive in the upcoming Creators Update.

The file was spotted in a recently leaked build of Windows 10 (14997). It isn’t functional yet, but according to Windows Central, it will enhance the PC gaming experience by minimizing the resources used by running apps to almost nothing and allocating the freed-up resources to a game when it launches. The report notes that this is similar to how the Xbox One handles resources when running a game. Windows Central also speculates on whether Game Mode will work exclusively with games from the Windows Store, or with any Windows title from third-party sources like Steam or Origin.

Microsoft has yet to confirm the existence of the new Game Mode, but with the Creators Update due in the early part of 2017, Windows Insiders could get an early peek in a few weeks. Other gaming-centric features arriving with this update include support for game broadcasting and an Arena tournament feature that lets players create their own tournaments to compete with friends.
-
Concept and prototype vehicles are notoriously ugly, oftentimes for no good reason other than to look so out of place that you can’t help but notice them (pro tip to automakers: that’s an idiotic idea). Occasionally, however, such vehicles are hard on the eyes out of necessity. Take the Ford Fusion Hybrid, for example. Early iterations of Ford’s self-driving test vehicle were equipped with roof-mounted assemblies that housed the various cameras and sensors needed for autonomous driving.

With more than three years of research under its belt, Ford has taken everything it learned and rolled it into a brand new autonomous development vehicle that’s ready to hit the roads. In unveiling the next-gen Ford Fusion Hybrid on Medium, Chris Brewer, Chief Program Engineer, Ford Autonomous Vehicle Development, said the vehicle uses Ford’s current autonomous vehicle platform, albeit with new computer hardware for improved processing performance. What’s more, the electrical controls are closer to production-ready, while new LiDAR sensors with a sleeker design and a more targeted field of vision enabled Ford to get away with using just two sensors instead of four. As a whole, the system has a sensing range of roughly the length of two football fields in every direction and generates a terabyte of data per hour.

The result is a roof-mounted system that somewhat resembles the luggage racks commonly found atop many vehicles. From a distance, the only thing that really gives away the fact that it’s a self-driving car is the LiDAR sensor mounted on the driver’s side A-pillar. Ford says it aims to produce a fully autonomous vehicle, with no steering wheel or pedals, by 2021 for ride-sharing and ride-hailing services. The next-gen Fusion Hybrid will make an appearance at CES in Las Vegas next week.
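The terabyte-per-hour figure quoted above translates into a substantial sustained data rate. A back-of-envelope conversion (our own arithmetic, assuming a decimal terabyte):

```python
# Convert the quoted 1 TB/hour of sensor data into a sustained rate.
TB_PER_HOUR = 1
bytes_per_hour = TB_PER_HOUR * 1e12       # decimal terabyte
mb_per_s = bytes_per_hour / 3600 / 1e6    # megabytes per second

print(f"Sustained sensor data rate: ~{mb_per_s:.0f} MB/s")
```

That works out to roughly 278 MB/s, continuously, for as long as the car is driving, which explains the need for beefier onboard computer hardware.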
-
As far as consumer-grade desktop monitors are concerned, Dell's UltraSharp line is consistently among the most highly revered. Back in January 2014 we reviewed Dell's first flagship 4K monitor, the massive 32-inch UP3214Q. Despite being the most impressive monitor TechSpot had looked at to date, the UP3214Q landed squarely in the early-adopter category, as evidenced by its tiled design (essentially two screens stitched together), limited 60Hz connectivity options, and a wallet-weeping $3,500 initial MSRP. It was also met with a general lack of 4K compatibility, support, and content, none of which was really Dell's fault, but facts nevertheless. All things considered, we advised the "average" reader to wait it out, as substantial improvements were inevitably in store for its successor. Indeed, the successor, the UltraSharp 4K UP3216Q, hit the scene not all that long ago, and with more than two and a half years of industry maturation in tow, we revisit Dell's 32-inch 4K Ultra HD UltraSharp offering.

Out of the Box: The UP3216Q, like its predecessor, is packaged and shipped in environmentally friendly cardboard; no foam here. Tucked neatly inside are the monitor itself, an aluminum monitor stand with cable pass-through, a plastic cable cover that goes over the rear I/O connections for a tidier look, a power cable, a miniDisplayPort-to-DisplayPort cable, an HDMI cable, a USB 3.0 upstream cable for connecting the monitor to your computer, an optical disc that includes drivers and documentation, a quick setup guide, obligatory safety and regulatory information and a factory calibration report. Every UP3216Q leaves the factory with an average Delta-E < 2 calibration, but more on that in a bit. If you're familiar with the first-generation monitor, the UP3216Q will feel like an old friend. Sitting side by side, you'd be hard-pressed to spot visual differences between the two, as they look virtually identical (at least when powered off).
The IPS display is of the matte variety with a mild anti-glare coating, framed by an inch-thick black plastic bezel lined by a silver strip as you round the corner. A single backlit power button is positioned in the bottom right corner of the bezel. Just above it are five unlabeled capacitive touch buttons that light up when activated, launching the on-screen menu beside them. Along the bottom bezel in the very center is a shiny Dell nameplate that I could do without, as the reflections it picks up can be distracting at times. Sitting discreetly on the left edge of the monitor is a 6-in-1 card reader and around back, a bank of connectivity ports comprised of – from left to right – the power connector, an HDMI port, a DisplayPort, a miniDisplayPort, a 3.5mm audio jack, a USB 3.0 upstream port and three USB 3.0 ports. A fourth USB 3.0 port is distinctly positioned outside of the group on the right side as a charging port for your smartphone or other wireless device. On the opposite side, you'll find a security lock slot, although a cable lock isn't included. Unlike some monitors, there aren't any integrated speakers. Dell sells an optional soundbar (model AC511) for about $20 that attaches to the bottom of the display to provide basic audio if that's all you're looking for. Getting started with the UP3216Q is about as easy an experience as you can imagine. With the monitor out of the box, you can elect to install it on a wall or dedicated arm assembly via VESA mount, or do as I did and use the included aluminum stand. The easiest way to do this is to lay the monitor face down, insert the top portion of the stand's mount first, then snap the bottom into place.

Initial Use, First Impressions

More than a decade ago, when flat panel televisions were just starting to trickle into the market, one of my best friends purchased a 50-inch plasma TV (I recall he paid several thousand for a Zenith set).
At a time when the largest CRTs were in the 32- to 36-inch range, a 50-inch plasma was like heaven on Earth. Its sheer size and HD resolution ensured that you'd need a bib to catch all the drool. My first week with the UP3216Q afforded a very similar experience. Up to that point, I had been using a triple-monitor configuration comprised of two 22-inch and one 24-inch 1080p monitors so, needless to say, 32 inches was a big step up. I've been using the UP3216Q for close to two months now, primarily paired with my desktop workstation running on a mundane Radeon R7 250 graphics card that cost less than $100 back in the day, using DisplayPort through a miniDisplayPort-to-DisplayPort cable which affords me 60Hz action. In other words, no need for anything special to get proper 4K resolution support.

Dell UltraSharp 4K UP3216Q - $1,270
Diagonal Viewing Size: 31.5 inches (16:9)
Panel: In-Plane Switching
Native Resolution: 3,840 x 2,160 @ 60Hz
Contrast Ratio: 1,000:1 (typical) / 2M:1 (dynamic)
Brightness: 300 cd/m2 (typical)
Response Time: 6ms (gray to gray) fast mode
Viewing Angle: 178° vertical / 178° horizontal
Adjustability: Tilt, Swivel, Height
Color Support: 1.07 billion colors
Pixel Pitch: 0.182 mm
Backlight Technology: LED light bar system
Display Screen Coating: Anti-Glare with 3H hardness
Color Gamut: 99.5% Adobe RGB, 100% sRGB, 100% REC709 and 87% DCI-P3
Warranty: Three-year limited hardware warranty
Connectivity: 1x DisplayPort, 1x miniDisplayPort, 1x HDMI (MHL), 4x USB 3.0 ports, 1x USB 3.0 upstream, 1x media card reader
Dimensions (With Stand): Height: 482.6 mm (19.0 inches) / 572.4 mm (22.5 inches), Width: 749.9 mm (29.5 inches), Depth: 214.0 mm (8.4 inches)
Dimensions (Without Stand): Height: 444.6 mm (17.5 inches), Width: 749.9 mm (29.5 inches), Depth: 51.5 mm (2.0 inches)
Weight (panel only, for VESA mount): 8.6 kg (18.92 lbs) / with packaging: 15.2 kg (33.44 lbs)
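The quoted pixel pitch follows directly from the native resolution and diagonal size, so it's easy to sanity-check the spec sheet. A minimal sketch using the standard pitch formula (not anything from Dell's documentation):

```python
import math

def pixel_pitch_mm(width_px, height_px, diagonal_in):
    """Pixel pitch in mm, derived from resolution and diagonal screen size."""
    diag_px = math.hypot(width_px, height_px)  # screen diagonal in pixels
    ppi = diag_px / diagonal_in                # pixels per inch
    return 25.4 / ppi                          # 25.4 mm per inch

# UP3216Q: 3,840 x 2,160 across a 31.5-inch diagonal
pitch = pixel_pitch_mm(3840, 2160, 31.5)
print(f"{pitch:.3f} mm")  # ~0.182 mm, matching the quoted spec
```

Running the numbers lands on about 140 PPI, i.e. a 0.182 mm pitch, so the spec sheet checks out.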
-
In the two months since Intel unveiled Broadwell-E, I've been going back and forth on my decision to invest in one. We received the 10-core Core i7-6950X for review and while it was an attractive chip in terms of performance, it came at a seriously ugly price. At $1,650 we recommend taking a hard pass on the 6950X. Frankly, the older 8-core 5960X was difficult to justify at $1,050, so the slightly updated 6900K for $1,100 doesn't exactly have us whipping our wallets out. Spending over $600 on the 6-core 6850K isn't too appealing either... So, what's an enthusiast to do if they require more than the 4 cores in Intel's mainstream desktop Core i7 processors? One solution would be building our beastly 16-core/32-thread Xeon E5-2670 workstation featured back in April. For under $1,000 we picked up core components including two 8-core E5-2670 processors, a new dual-socket LGA2011 motherboard and 64GB of DDR3 memory. Throw in a case, power supply, graphics card and some storage and you have a seriously capable machine for the price of a Core i7-5960X. In terms of performance, our affordable Xeon build really stuck it to the 5960X by a rather large margin in more than one test. When the uber-expensive 6950X appeared, we made sure to pit it against the dual-CPU system and to our surprise the Xeons stood strong, even coming out on top in a few tests. It was interesting to find that in many of the application and encoding tests, this older Sandy Bridge-EP build was able to put up a real fight. In terms of performance versus price it tends to come out well on top, with the only blemish being its power consumption. The dual-Xeon system pulled 300 watts in our Hybrid x265 test while the Core i7-6950X setup needed only half that amount. Of course, we were comparing two 8-core processors to a single 10-core chip, but the main issue was the four-generation-old Sandy Bridge architecture.
This put us on the hunt for affordable Xeon processors based on the Haswell-EP or perhaps even Broadwell-EP architectures -- it certainly seemed mere wishful thinking that we would come across a relatively inexpensive Broadwell-EP Xeon. Our search put us on the trail of Intel's Xeon E5-2630 v4, a 10-core Broadwell-EP part that runs at a base clock of 2.2GHz but can boost up to 3.1GHz depending on the workload. Typically, you'd spend something like $700 for this processor -- substantially more than the $70 we paid for each of our E5-2670 v1 processors -- however, it's possible to purchase the E5-2630 v4 for as little as $200 on eBay. The only catch is that they are engineering samples (ES), not retail chips. The examples we've come across are based on the release stepping (SR2R7), so motherboard compatibility won't be an issue, provided the BIOS has been updated to support Broadwell-EP processors. Once upon a time it was rare to find Intel engineering samples, but today they appear online in huge volumes. Looking only at eBay, for instance, thousands of these E5-2630 v4 ES chips have been sold with countless more still in stock. Typically, we suggest avoiding ES chips when possible, but $200 for a 10-core/20-thread Broadwell-EP processor really is too hard to refuse. With so many of you asking over the past few weeks how these chips perform, we've decided to find out.

The Build

The previous build using the Xeon E5-2670 v1 processors was put together on a pretty tight budget, so we went with one of the most affordable Dual Socket R (LGA2011) motherboards we could find. Since we are spending more than twice as much on the processors this time ($400), we decided to go with a more capable motherboard. Having been so impressed with the previous Asrock Rack motherboard, we picked up the EP2C612D16-2L2T on Newegg for $580 (which is now $100 cheaper if you needed further temptation).
This is a Dual Socket LGA2011 R3 motherboard that adheres to the SSI EEB form factor, measuring 12'' x 13'' (30.5 cm x 33 cm). Announced way back in September 2014, the EP2C612D16-2L2T gained Broadwell-EP support in March via BIOS version 2.10. At the heart of the EP2C612D16-2L2T we find the Intel C612 chipset, a 7W part built on the 32nm process that offers Gen 2 PCIe support for up to 8 lanes, six USB 3.0 ports as well as 10 SATA 6Gb/s ports. Asrock Rack has expanded SATA support to a dozen ports with the inclusion of a single Marvell 9172 6Gb/s controller. Given that this is a two-year-old motherboard, you won't find fancy storage options such as M.2. High speed SSDs will need to be integrated using PCI Express adapter cards. There are a total of 16 DIMM slots with support for NVDIMM (Non-Volatile Dual In-line Memory Module). Each processor is connected to eight DIMMs and, of course, quad-channel memory is supported. Both RDIMM and LRDIMM modules are supported at speeds of DDR4-2133, 1866 and 1600. Onboard we find three PCIe 3.0 x16 expansion slots along with a further three PCIe 3.0 x8 slots. That means there are 72 PCIe 3.0 lanes on tap -- impressive stuff. One of the key highlights of the EP2C612D16-2L2T is network support. Out of the box you get a pair of 10G network connections courtesy of the Intel X540 controller. In addition, a pair of Intel i210 controllers provide two Gigabit Ethernet connections. Finally, there is also a single dedicated IPMI LAN port. The ECC memory this board supports is generally meant for servers, where any data corruption is unacceptable. Since this isn't really a concern for most of our readers, we went with standard UDIMM modules from G.Skill rather than equip the board with ECC memory. Ideally we wanted to populate every DIMM slot with DDR4-2133 memory, so we reached out to our good friends over at G.Skill.
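The 72-lane figure is simply the sum over the slot layout described above (three x16 plus three x8 slots); a trivial sketch of that arithmetic:

```python
# EP2C612D16-2L2T expansion slots as described in the text:
# three PCIe 3.0 x16 slots and three PCIe 3.0 x8 slots.
slots = {"x16": 3, "x8": 3}
lanes = slots["x16"] * 16 + slots["x8"] * 8  # 48 + 24
print(lanes)  # 72 PCIe 3.0 lanes
```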
Happy to oblige, they served up 16 4GB sticks of Ripjaws V DDR4-2133 memory for a total capacity of 64GB, which will allow both Xeon E5-2630 v4 chips to enjoy quad-channel memory support. G.Skill sells this memory in 16GB quad-channel kits for just $74 each, taking the total cost for this build to just shy of $300. For those wondering, the memory operates at CL 15-15-15-35 timings using 1.2 volts. The modules are available with either red or black heat spreaders and we went with red. As with our previous dual-Xeon build, we equipped the processors with Noctua NH-U12DX i4 coolers. Noctua's DX line has become a popular choice in high-performance quiet cooling solutions for Intel Xeon CPUs. The latest i4 revision supports the LGA2011 platform (both Square ILM and Narrow ILM) and comes with a 120mm NF-F12 'Focused Flow' fan. Thanks to its slim design with a fin depth of 45mm, the NH-U12DX i4 ensures easy access to the RAM slots. When installed parallel to the slots, it will not overhang the memory even with two fans installed. For those concerned about space, the NH-D9DX i4 is an even more compact option. At $60, both the NH-U12DX i4 and NH-D9DX i4 are well priced and come backed by a six-year manufacturer's warranty.
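The "just shy of $300" memory cost checks out from the kit pricing quoted above (four 16GB kits at $74 each; just the article's numbers, nothing from G.Skill's price list):

```python
# Memory cost for the build, from the figures quoted in the text.
total_gb = 64   # 16 sticks of 4GB
kit_gb = 16     # sold as 16GB (4x4GB) quad-channel kits
kit_price = 74  # dollars per kit

kits = total_gb // kit_gb
cost = kits * kit_price
print(kits, "kits,", f"${cost}")  # 4 kits, $296 -- "just shy of $300"
```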
-
Intel has published the technical product specifications for its upcoming "Arches Canyon" NUCs powered by its own Celeron J3455 CPU, a 10W chip with integrated HD Graphics 500 that's soldered to the motherboard. Intel will offer two variations of the NUC – one with 2GB of DDR3 RAM, 32GB of eMMC storage and Windows 10 installed (model NUC6CAYS) and a barebones kit (NUC6CAYH) in which the buyer must supply their own RAM, storage and operating system. Those who opt for the barebones configuration should know that the system supports a maximum of only 8GB of 1600/1866MHz memory. Elsewhere, you'll find two USB 3.0 ports up front (one with fast charging) and two on the rear (sorry, no USB Type-C), a single 2.5-inch slot for a SATA SSD or HDD, an SDXC card reader, a full-size HDMI 2.0 port with CEC that supports up to 4K/60fps, a VGA connector capable of 1,920 x 1,200 resolution at 60Hz, a front-mounted 3.5mm audio jack, a rear-mounted mini-TOSLink connector, a Gigabit Ethernet jack, dual-band 802.11ac Wi-Fi and Bluetooth 4.0, as well as an infrared receiver up front. The specifications match up perfectly with what was listed on the consumer NUC roadmap that leaked online over the summer. No word yet on pricing, nor do we know when exactly Intel plans to launch its Apollo Lake NUCs. With CES now less than a month away, I suspect we'll hear more about Intel's plans in early January (if not sooner).