Everything posted by Derouiche™
-
The Nintendo Switch's power is limited in portable mode, according to a highly technical Nintendo patent that surfaced on the Internet in the last few hours. It seems the lower resolution on the tablet compared with docked mode is not just a matter of the screen, but also a limit imposed by the console's reduced power in portable mode. Lowering CPU usage will apparently be necessary in order to extend battery life: "…in the portable mode, the image rendering capacity (which can also be referred to as the image generating capacity) is limited. Specifically, in the portable mode, the resolution (in other words, the number of pixels) of images generated by the main unit is lower than that in the console mode." Nintendo has yet to confirm the information coming out of the patent, and we don't know whether it ever will, but similar details will likely emerge from press outlets getting an early try of the Nintendo Switch at the January 12/13 hands-on events. From that point on we will learn more, but honestly, none of this should come as a surprise at this point.
-
Recording high-quality gameplay videos from a console requires quite a bit of work when it involves using a PC and compiling footage in video editing software. Your job gets a lot easier if you go for a capture card that does everything for you except the editing. Epiphan offers a 4K external video capture card that works with consoles, PCs, cameras, smartphones, and more, and it's designed to record 4K video without disruption. The AV.io HDMI to USB 4K capture card works at the hardware layer: once it's connected to your console and the storage is detected, it starts recording. On one side you need the video source, which can be your console, and on the other a PC as the streaming/capture destination. You can select custom resolutions up to 4096x2160, with 4K UHD video streaming at 30fps and 1080p at 60fps. It also records high-quality audio up to 24-bit/96kHz. The capture card is costly, at $400 on Amazon. By comparison, there are plenty of models under $100, but none of them offers 4K video recording. On the other hand, the card also works with various other devices, which is a plus. It's an ideal accessory for new consoles like the Xbox Scorpio and PS4 Pro: they support 4K games, but getting the full video output with proper sound isn't possible unless the capture hardware itself can handle the content at its highest resolution. If 4K recording is your primary objective in capturing gameplay, this card is worth considering.
-
Time in Albania > 19:30, time in India > 1:30 AM, time in England > 18:30. If you need to know the time for any other zone, go to Google and type, for example, "Romania time".
-
Christmas is a great time for finding deals on various bits of equipment, and Newegg is offering an AMD Fury X at rock-bottom pricing. There's an XFX version of the GPU for $319 at Newegg right now, and while we don't normally recommend cards based on rebates, you can pull the total price down to $299 if you're willing to submit one. That's a pretty insane deal by any measure. When we compared the Radeon Fury against the GTX 1060 earlier this year, we noted that the Fury was a compelling challenger to the GTX 1060. The Fury X is roughly 10% faster than the Fury, which makes this about as good a deal as you can get on a $300 GPU. There are a few caveats to keep in mind if you're eyeing a Fury X. First, it's an older GPU and draws a fair bit of power as a result, so check the power supply requirements and make certain your own hardware is within spec. Second, the Fury X is limited to a 4GB memory buffer, while other cards in this category now regularly offer more. Whether that's an issue will depend on what kind of games you play and how often you upgrade. Our 4GB memory testing last year demonstrated that 2015-era GPUs tend to run out of steam before the memory limit becomes a problem, and that's still true of the other $300 GPUs competing in this territory. The 4GB limit should be fine for current and future titles, provided you're playing at 1080p or 1440p, but 4K isn't a great resolution for the Fury X, even though the card can technically drive it. The bottom line is this: A $299-$319 Fury X is a crazy deal. It's easily the fastest GPU you can buy in this price bracket, albeit temporarily. The radiator and blower help keep things cool, and the GPU supports features like AMD's FreeSync for smoother gaming on supported monitors.
Even the 4GB memory buffer shouldn't be a major problem: while GPU buffers have grown of late, a number of the most popular cards on the market, including the GTX 970 and GTX 980, were 4GB cards. Nvidia's Pascal has been lighting up the charts, but Maxwell and GCN-based hardware will still be supported in-market for some time to come.
-
The best-selling model from Volkswagen's Audi division emitted about double the legal European limit for nitrogen oxide (NOx), Reuters says, citing laboratory tests overseen by the European Commission's Joint Research Center (JRC) in August. The Audi A3 was found in two tests to emit about double the legal limit of NOx, though in one of the tests the A3 stayed within the limits when the engine was cold. An Audi spokesman told Reuters that the A3 had been independently tested to have emissions within the legal limit and that he wasn't aware of the JRC test results. Still, the findings are another example of how Volkswagen, Europe's largest automaker, cannot seem to shed the issues surrounding the diesel-emissions scandal that broke last September. VW has been fined about $19 billion for equipping diesel cars with software that cheats emissions-testing systems. About 11 million cars were affected, including about a half-million vehicles in the US. In addition to reaching a $15 billion settlement with US regulatory bodies such as the Environmental Protection Agency (EPA) and the California Air Resources Board (CARB) earlier this year, VW has been fined $15 million by the South Korean government, which may impose more penalties over allegations of false advertising. Audi is not the only VW unit to face further scrutiny. Germany's Transport Ministry and Federal Motor Transport Authority are taking a closer look at VW's Porsche division for potential emissions-cheating efforts, Bloomberg News recently reported. Additionally, the European Union says at least seven of its member nations failed to provide sufficient oversight of the automobile emissions-testing process, and it may take legal action against Germany, Spain, Luxembourg, the Czech Republic, Lithuania, Greece, and Great Britain, according to a separate Reuters article.
-
Song uploaded.
Can anyone guess the name of the song?
-
For years, the conventional wisdom was that Mars existed as little more than a cold, barren dust ball in space. The idea that it once supported life was considered unlikely. But then we started sending probes to the Red Planet, and more recently rovers like Curiosity. Since its arrival in 2012, Curiosity has covered more ground than all previous rovers, and now mission scientists are comfortable saying that Mars would have been capable of harboring life for hundreds of millions of years in the past. Curiosity landed in a region known as Yellowknife in Gale Crater, and has been making its way up to higher elevations around Mount Sharp, which sits in the middle of the crater. This gives it a chance to investigate the strata as it ascends, essentially scanning back through the Martian past. The new proclamation of Mars as a potential long-term home to ancient life comes from Curiosity science team member John Grotzinger, who spoke on the topic at a recent meeting of the American Geophysical Union (AGU). We've known from samples all the way back to Yellowknife that Gale Crater most likely played host to a vast lake and stream system, but it was not present continuously. That doesn't necessarily mean everything living there disappeared with the water, though. According to Grotzinger, analysis of Curiosity data from various levels in Gale Crater paints a picture of fresh, neutral-pH water that got more acidic and salty over time. The lakes also completely dried up and refilled repeatedly over the course of millions of years. Despite this, simple microorganisms could have persisted in the groundwater, ready to take advantage when standing water again flooded the surface. Curiosity has also identified a great diversity of minerals on Mars, which points to a complex chemical history — just the sort of thing life requires. The rover has detected many of the same minerals we have on Earth, including clays, magnetite, and boron.
There’s even silica, which scientists are particularly happy about. On Earth, silica has been good at preserving microscopic fossils. If life did exist on Mars in the past, we might find strong evidence for it in silica deposits. This is all assuming alien life on Mars operates by the same rules as life on Earth. That’s certainly not a given. Even life on Earth can seem almost alien at times. Single-celled extremophiles can survive (and even thrive) in conditions too hot, acidic, or salty for any other organism. Maybe something like that lived (or lives?) on Mars. We might find more clues when NASA’s 2020 rover project heads to the Red Planet.
-
Which one is better to use as a signature? :3
1st
2nd
3rd
-
[Battle] -Skrillexx Vs Heeroiik vs The Gamer [Closed]
Derouiche™ replied to -Skrillexx™'s topic in GFX Battles
accept -
New Steam >> http://steamcommunity.com/id/Heeroiik_csbd/
-
Well, this is just dandy. Earlier this year, Yahoo announced that it had suffered one of the largest hacks in history, with up to 500 million user accounts affected. Now, the company has come clean about an even bigger hack that happened a year earlier and exposed sensitive information on approximately one billion accounts. The most surprising thing about this, of course, is that one billion people had Yahoo accounts to start with. Here's where things take a further detour into the ridiculous. In September 2016 we found out about a series of hacks that hit Yahoo back in 2014. Now, at the tail end of the year, we're hearing that an even larger attack in 2013 not only captured more information, it captured vastly more sensitive data, including plaintext security answers to identity questions. Yahoo is now requiring everyone to change their passwords and is invalidating all of its old security questions, but this isn't just a case of locking the barn after the horse has escaped — the horse has already died of old age. Yahoo apparently only found evidence of these attacks after analyzing log files provided to it by law enforcement. Said files came from a third party who claimed they held information on Yahoo, which means the company didn't even find this independently — it had to be handed the evidence others had gathered. Verizon is still expected to buy Yahoo, but the company talked publicly about potentially seeking a lower price in the wake of the earlier hack, and now Yahoo has a problem literally twice as big on its hands. This time, the hack actually involved personal information and could be easily mined for additional information on how users tend to select passwords. Hacks and security breaches are far more useful to black hats than just a list of passwords and logins. By creating dictionaries based on passwords people actually use, black hats can accelerate how quickly and effectively they are able to breach future accounts.
In theory, users should create a different login and password for every site, but very few people do so. Most of us use a handful of passwords, at best, or a single common password that’s rotated out over time. Meanwhile, Yahoo took a short view on security for several years, possibly out of fear of losing users, possibly because the company had ideas for monetizing mass surveillance in ways the old East German Stasi would’ve envied. But more than anything, this just highlights how little society — or businesses — care about online security. Breaches are treated as non-events, even when critical information is exposed, and even when that data could be used to target individuals for theft. If you have access to someone’s email, you may well have access to information about their ongoing medical care, their bank accounts, billing statements, or other personally identifiable information. To coin an analogy: If the USPS announced that it had lost over a billion pieces of physical mail, people would be up in arms about it — but a hack of sensitive user information that may have exposed tens or hundreds of billions of pieces of mail (depending on which information was stolen and how it was used) stirs scarcely a ripple. If you’ve still got a Yahoo account, it’s probably time to dump it. Use Outlook.com, or Gmail, or any other third-party provider you like, but don’t keep using a company that plainly cares so little for your own privacy and security — unless, of course, you don’t care either.
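The "dictionary" idea mentioned above is simple enough to illustrate in a few lines. This is a hypothetical sketch only (the breach data and function name are invented for illustration): it ranks passwords from a leaked dump by frequency, which is roughly the order an attacker would try them against other accounts.

```python
from collections import Counter

def build_guess_list(leaked_passwords):
    """Rank passwords by how often they appear in a breach dump.

    The more often a password shows up across accounts, the earlier
    an attacker will try it when targeting new accounts.
    """
    counts = Counter(leaked_passwords)
    return [pw for pw, _ in counts.most_common()]

# Hypothetical breach data for illustration only.
leak = ["123456", "password", "123456", "qwerty", "123456", "password"]
print(build_guess_list(leak))  # most common password first
```

This is why a password that has never appeared in a breach is worth far more than a "strong-looking" one that millions of people also chose.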
-
Google has been an early leader in the self-driving car business, with millions of miles logged in its vehicles and an ambitious plan to create vehicles without steering wheels or pedal controls. Now, the company has announced a new direction: Google is moving its self-driving car unit out of Google X, the company's skunkworks division, and establishing it as an independent subsidiary of Alphabet, Google's parent corporation. Its new name? Waymo. The new CEO of Waymo, John Krafcik, previously headed Hyundai Motor America. Scuttlebutt suggests that Alphabet wants to shift Waymo from researching the self-driving car problem to a partnership model in which the company allies with other automotive manufacturers to create self-driving cars. While this is a significant shift in how the unit has operated to date, it's not a surprising one. If you want to work in the self-driving car business as a technology company, it makes much more sense to partner with established auto manufacturers and work together than to try to branch out into vehicle manufacturing yourself. Waymo will reportedly partner with Fiat Chrysler to launch a self-driving car fleet by the end of this year and hopes to have the vehicles in testing in the near future. "We've talked a lot about the two million miles we've driven on public roads," Krafcik told the audience at Google's press event announcing Waymo. "Now we've driven another million miles on public roads… And we have taken over 10,000 trips with Googlers and guests in places like Mountain View, Austin and Phoenix." Google sees a wide range of uses for its self-driving technology, including ride sharing, trucking, and personal vehicle use. It's not clear what will happen to the company's vehicle prototypes, which lacked both steering wheels and pedals. Krafcik has stated that Waymo remains fully committed to developing Level 4 and 5 self-driving technology.
Level 4 requires computer control that can handle the vehicle in all but a handful of environments, like severe weather, while Level 5 vehicles can pilot themselves under any conditions or weather and can reach any destination where it is legal to drive. Krafcik, however, also forcefully made the point that Waymo “is not a car company, there’s been some confusion on that point. We’re not in business of making better cars, we’re in the business of making better drivers.” Google had originally planned to commercialize its self-driving car technology by 2020, but allying with manufacturers rather than doing the manufacturing itself may help it beat that goal. The company did not announce any major new partnerships or concrete plans for the market beyond the spin-off and focus shifts discussed above. A number of manufacturers and companies are now tackling the self-driving market, from Intel, Delphi, and Mobileye to Tesla, Volvo, and Nvidia.
-
Last week, AMD took the wraps off its annual major driver refresh, dubbed Radeon ReLive this time around. While the company releases periodic driver updates with support for new games and various bug fixes, it launches a significant platform update with new features and capabilities roughly once a year. The new release brings the usual assortment of bug fixes and performance improvements, but it also promises a new power-efficiency feature, dubbed Radeon Chill. Radeon Chill is designed to adjust the game's frame rate to correspond to what's happening on-screen. Imagine a situation where you're playing a game and either alt-tab out to do something else, or simply have to get up and leave the keyboard. Alternately, you might be crafting, playing a mini-game, or sorting through your inventory. Either way, there's not much going on at that particular point. Ordinarily, your GPU will simply render the highest frame rate it can, regardless of whether those frames are actually being put to any kind of use. Radeon Chill is, at least in theory, a way to get back some of the power you're otherwise wasting on rendering 100+ frames per second of what amounts to a mostly still life. It's fully compatible with AMD's FreeSync technology, and AMD claims it can improve frame rate responsiveness, not just power consumption. There are some caveats right now. Radeon Chill is currently only DX9 / DX11 compatible, and supported games are manually whitelisted (AMD doesn't enable this feature by default, in other words). That's probably for the best given how new it is, but the list of supported titles is at least fairly inclusive, with a solid number of top-tier titles from the past few years. Chill targets a 40 FPS frame rate when there's not much going on in-game and a 60 FPS target otherwise, but how does performance shake out, and can Radeon Chill actually improve GPU responsiveness?
Both Tom's Hardware and Tech Report have examined different aspects of this question, and the answers look pretty solid. First, Tech Report checked AMD's claims of responsiveness and overall performance, to see if Radeon Chill could actually reduce frame-time latencies below their average, non-Chill level. Initial results are positive. While Radeon Chill does result in lower frame rates, Tech Report highlights the difference between how fluid their test run of CS:GO felt versus how responsive the game actually was. Critically, there are even some places where the game's frame times were lower than their non-Chill equivalents. So yes, in at least some cases, Radeon Chill can boost frame timings over and above what users saw before (whether this is enough to ever make a practical difference in any given title is a different question). Over at Tom's Hardware, they checked both overall performance and the impact on power consumption. Exactly how much power Radeon Chill saves you depends on what's going on in-game, with the benefits ranging from dramatic to marginal; this is a situational improvement rather than a universal one. If you tend to leave games alt-tabbed or just want to reduce your overall power consumption over time, there are clear gains from Radeon Chill. Tech Report also indicates that it results in a GPU that runs noticeably quieter, which could make late-night gaming sessions a bit more tolerable to significant others and roommates.
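AMD hasn't published Chill's internals, but the 40/60 FPS behavior described above amounts to a frame pacer that picks a time budget based on player activity. Here is a minimal illustrative sketch, assuming only the two targets quoted in the article; the function names and the activity flag are invented for illustration:

```python
IDLE_TARGET_FPS = 40    # Chill's quoted target when little is happening
ACTIVE_TARGET_FPS = 60  # Chill's quoted target during active play

def frame_budget(player_active: bool) -> float:
    """Per-frame time budget in seconds for the current activity level."""
    target = ACTIVE_TARGET_FPS if player_active else IDLE_TARGET_FPS
    return 1.0 / target

def idle_after_render(render_time_s: float, player_active: bool) -> float:
    """How long the GPU could sit idle after finishing a frame.

    If rendering already took longer than the budget, there is no
    idle time (and no power to reclaim) for that frame.
    """
    return max(0.0, frame_budget(player_active) - render_time_s)
```

The power savings come entirely from that idle slice: a frame that renders in 5ms against a 25ms idle-mode budget leaves the GPU doing nothing (and drawing less power) 80% of the time.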
-
Hard drives may not demand the enthusiast attention they once did, thanks to the vastly improved performance of SSDs. But they still offer much more storage at a lower cost per gigabyte than your typical solid state drive. Western Digital was the first company to bring helium-filled hard drives to market, and the company is doubling down on its technology with the introduction of two new drives: a 12TB model shipping now and a 14TB model coming in the near future. The 12TB HGST Ultrastar He12 (HGST is a subsidiary of Western Digital) is an eight-platter drive that packs 864Gbits/sq. inch and spins at 7200 RPM. It's available in both SATA and SAS, with a maximum burst transfer rate of 6Gb/s or 12Gb/s depending on the interface (the sustained transfer rate is identical between the two versions, at 243MiB/s - 255MiB/s). Both drives have a 256MB cache and an 8ms seek time for reads (8.6ms for writes). According to Western Digital, this eight-platter configuration packs in a full two platters more than the highest-density air-filled drives. For that, you can thank helium. Because helium is so much less dense than air, drive platters can be packed more tightly, and there's less vibration and friction to contend with inside the drive itself. In theory, Western Digital could use helium drives to spin a smaller number of platters more quickly, but the advent of SSDs has likely put an end to such endeavors. Even if an enterprise hard drive could be spun up to, say, 25,000 RPM (1.66x faster than the fastest 15K drives), it still wouldn't hold a candle to the seek times or transfer rates of modern solid state storage. There is a difference between the 12TB and upcoming 14TB drives, however, even though both are based on the same eight-platter design and rely on helium: The 12TB drive uses PMR (perpendicular magnetic recording) while the 14TB drive will use SMR (shingled magnetic recording).
SMR can reach higher areal densities than PMR because the drive tracks are overlaid on top of one another, as shown in the image below (the image is originally from a Seagate presentation, but the concept applies to all SMR drives). Because the tracks now overlap, the HDD's write head will overwrite more than one track at the same time. This requires the drive to read and then rewrite data on multiple tracks at once to avoid erasing vital information, and it slows down overall drive performance. As a result, SMR drives typically aren't recommended for consumer use — they're intended for enterprise deployments where their drawbacks are outweighed by the increased storage capacity they offer. SMR drives can pack up to 25% more data on the same platter as a conventional PMR drive, so it wouldn't surprise us to see Western Digital roll out a 16TB SMR drive at some point in the not-too-distant future. There are two kinds of SMR drives: host-managed and device-managed. Host-managed drives require the operating system and/or applications to manage their idiosyncrasies and ensure that track rewriting is kept to a minimum, while device-managed drives present to the operating system like standard hard drives. The upcoming Ultrastar He12 14TB is a host-managed device, which makes it a poor fit for consumer hardware. Seagate and other companies are expected to introduce their own 12TB drives in 2017, with 16TB drives expected in the next few years (our tea leaves are mum on whether this points to 2018 or 2019-2020 for new drive introductions).
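Some quick back-of-the-envelope math on the specs above puts those numbers in perspective (this assumes decimal terabytes, as drive makers use, and takes the top of the quoted 243-255 MiB/s sustained range):

```python
CAPACITY_TB = 12       # He12 capacity, decimal terabytes
SUSTAINED_MIB_S = 255  # upper end of the quoted sustained transfer range

# 12 TB in decimal bytes, converted to MiB, then divided by throughput.
capacity_mib = CAPACITY_TB * 10**12 / 2**20
fill_hours = capacity_mib / SUSTAINED_MIB_S / 3600
print(f"Filling the drive end-to-end takes about {fill_hours:.1f} hours")

# SMR packs "up to 25% more data" onto the same platters.
print(f"Same eight platters with SMR: up to {CAPACITY_TB * 1.25:.1f} TB")
```

Roughly half a day to write the drive end-to-end at its best sustained rate, which is part of why these capacities target bulk enterprise storage rather than workloads that rewrite data constantly.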
-
Last week, Microsoft pushed an update to Windows 10 that broke DHCP and knocked some users offline until they rebooted their systems. The update is believed to have been part of cumulative update KB 3201845, which was released on December 9. After it was released, multiple European users reported being kicked offline. It's not clear if the problem was isolated to Europe, but Microsoft is displaying a global banner declaring that all users with Internet connectivity problems should restart (not shut down) their hardware. Yesterday, Microsoft released KB3206632, which Ars Technica believes might have fixed the issue. The new patch contains the following note: "Addressed a service crash in CDPSVC [Connected Devices Platform Service] that in some situations could lead to the machine not being able to acquire an IP address." If you look up CDPSVC, it's described as follows: "This service is used for Connected Devices and Universal Glass scenarios." Connected devices is self-explanatory, but we haven't been able to find a definition for what "Universal Glass" is. Either way, the update broke Windows 10's ability to configure DHCP (Dynamic Host Configuration Protocol). DHCP is the protocol that distributes network configuration data to all the relevant devices on the network and handles automatically assigning IP addresses, for example. You don't need a DHCP server to access the Internet, but most home networks are configured to expect one, and the average user probably isn't comfortable mapping out static IPs for each device on the network. In this case, the problem can be solved with a simple "ipconfig /release" command, followed by "ipconfig /renew". Some users report that this fix is insufficient and that a separate set of commands is also needed, specifically "netsh int ip reset" followed by "ipconfig /flushdns".
Combined, these should resolve any issues you experience and allow an affected system to reconnect to the Internet and download the appropriate patch. The larger issue here, of course, is that these kinds of mistakes have become a regular part of the Windows 10 update process. In the past 12 months, we've seen multiple updates that variously bricked systems, broke Internet connectivity, or caused random crashes when ordinary USB devices (Kindles, in this case) were plugged into the system. That's not even counting the malware-like activity of the last few months of the "Get Windows 10" campaign and the ill will it generated toward Microsoft. Every operating system has these kinds of problems from time to time, including previous versions of Windows. This isn't the first time Microsoft has had to push a patch to resolve issues it caused for itself with a previous update, and this kind of problem occasionally hits Linux and Apple users as well. But even after allowing for all of those factors, Windows 10 seems to have had more problems with weird corner cases, random bugs, and issues cropping up that the company's Fast Ring / Slow Ring early adopter update system simply hasn't been able to resolve. One potential reason for this is the type of OS testing Microsoft encourages its early adopters to engage in. If you're in the Fast Ring, Microsoft recommends you not test on your primary system and that you test within a virtual machine when possible. There are a lot of things that can be checked that way, but certain issues — like USB device verification, for instance — probably don't happen when users are running within a VM. To date, Microsoft has yet to announce any substantive changes to its policies that would close these gaps.
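For convenience, here are the recovery commands from the article collected in one place. They must be run in an elevated Command Prompt on the affected Windows 10 machine; this is simply the sequence described above, not an official Microsoft script:

```shell
:: Run from an elevated Command Prompt (cmd.exe as Administrator).
:: Release and renew the DHCP lease:
ipconfig /release
ipconfig /renew
:: If connectivity still doesn't return, reset the TCP/IP stack
:: and clear the DNS cache:
netsh int ip reset
ipconfig /flushdns
:: Reboot afterward so the stack reset takes effect.
```

Once the machine is back online, let Windows Update pull down KB3206632 so the underlying CDPSVC crash is actually patched.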
-
name changed
-
Watch the video and follow the steps!
-
[Battle] Revo vs Hamona vs The GaMer [winner Hamona]
Derouiche™ replied to Derouiche™'s topic in GFX Battles
HaMona win T/C -
Hello, there are areas that require approval before you can post. You either need access to the area you are posting in, or you need approval from the MODERATORS. Good luck!
-
Hello, you're posting in the wrong section; next time, if you have a problem, make a topic HERE. Using attachments to upload your images is not ideal; upload them to imgur.com or postimage.org instead. Members are allowed to have only one signature; other grades, such as VIP, can break this rule. I see you are new to the CsBlackDevil community, so I advise you to read the community RULES to avoid warning points. Good luck!
-
[Battle] Revo vs Hamona vs The GaMer [winner Hamona]
Derouiche™ replied to Derouiche™'s topic in GFX Battles
v1 v2 v3 start vote -
Name of the opponent: @Revo-™ vs @The Ga[M]er. vs あ ĦáMóИλ あ
Theme of work: here
Type of work (signature, banner, avatar, userbar, logo, large piece): Avatar
Size: 150x250
*Text: Dog
Watermark: csbd/CsBlackDevil.com
Working time: 1 hour
-
[Battle] The GaMeR Vs HaMoNa Vs Arcadionn [Winner HaMoNa]
Derouiche™ replied to Flenn.'s topic in GFX Battles
v3=text,brush -
welcome