XAMI

Ex-Staff
  • Posts: 2,777
  • Joined
  • Last visited
  • Days Won: 9
  • Country: Colombia

Everything posted by XAMI

  1. Apple last week unveiled a pair of new smartphones in the iPhone 7 and iPhone 7 Plus. The latest handsets both feature multiple camera-related enhancements although in keeping with tradition, it’s the larger of the two that sees the most significant update. This year, the advantage comes in the form of a dual camera system.

If you recall, both new iPhones feature a 28mm, 12-megapixel camera with optical image stabilization and an f/1.8 aperture, six-element lens mated to a new image signal processor. The larger iPhone 7 Plus adds a second camera, a 56mm telephoto lens with f/2.8 aperture that affords true 2x optical zoom.

That all sounds impressive if you’re a photography enthusiast but in reality, people are going to judge the cameras based solely on image quality (as they should). Fortunately, we now have some early samples ready for evaluation. On Sunday, Sports Illustrated photographer David E. Klutho tested out the iPhone 7 Plus during the Tennessee Titans and Minnesota Vikings football game. Apple also gave one of its new handsets to ESPN photographer Landon Nordeman to cover the US Open. Results from both shoots have been embedded in this article and of course, you can check out both sites for additional images.

So, what do these samples tell us about the image quality of the iPhone 7 Plus? Not a whole lot, unfortunately. At first glance (and with the right display), they look great. The problem is that the images have all been resized so we don’t get to see what they look like in their native resolution. It’s also not known if the photos were captured using the basic Apple camera app or in a third-party app that affords much more control over things like aperture, shutter speed and ISO. Furthermore, were they retouched at all or are we seeing exactly what they look like straight out of the camera?

With iPhone 7 and iPhone 7 Plus reviews scheduled to hit the web in the next day or so, we should get a much better idea of exactly what the new cameras are - and aren’t - capable of.
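For the photography-minded, the "2x optical zoom" claim is just the ratio of the two focal lengths, and standard optics says light gathered scales with the inverse square of the f-number. A quick back-of-the-envelope check using only the figures quoted above:

```python
# Focal lengths and apertures quoted for the iPhone 7 Plus cameras above.
wide_focal, tele_focal = 28.0, 56.0   # mm (35mm-equivalent)
wide_fstop, tele_fstop = 1.8, 2.8     # f-numbers

# Optical zoom factor is simply the ratio of the focal lengths.
zoom = tele_focal / wide_focal        # -> 2.0, i.e. "true 2x optical zoom"

# Light gathered scales with 1/f-number^2, so the telephoto collects
# roughly (1.8/2.8)^2 ~ 41% as much light as the wide camera.
light_ratio = (wide_fstop / tele_fstop) ** 2

print(f"zoom: {zoom:.1f}x, telephoto light vs wide: {light_ratio:.0%}")
```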
  2. The RX 480 may not be as drool inducing as a Pascal Titan X or even the GeForce GTX 1070, but it doesn't cost nearly as much either, which means this is what most people will end up buying. Whereas the GTX 1070 will set you back some $400+, the 4GB RX 480 should eventually sell for just $200. As history has shown us, once supply improves you can expect the RX 480 4GB models to hit and possibly even dip below that official $200 MSRP.

This got us thinking: what has $200 bought you previously from the red team? We're talking release day MSRPs here, so discounts applied over the lifetime of the product weren't considered, such as the Radeon R9 280 which launched at $250 but eventually sold for as little as $200.

So keeping that in mind, last generation's $200 option was the Radeon R9 380, a card we crowned as the best mainstream option for 2015. Before that was the R9 270X which also landed at exactly $200. In 2012, the HD 7000 series didn't feature a $200 option, instead you had the HD 7790 at $150 or the HD 7850 at $250. Therefore, we have selected the more expensive Radeon HD 7850 for this comparison. A similar situation is found when looking at the HD 6000 series, which offered the HD 6850 for $180 or the 6870 for $240, so again we went with the more expensive option. The pricing strategy wasn't much different for the HD 5000 cards, though it was the HD 5830 that went for $240.

Before we jump to the benchmarks, the table above will give you a good overall perspective of GPU specs, launch price, and the release date itself. The HD 5000 series was first to deliver DirectX 11 support, which is why we didn't go back further than this generation. This series and its successor were built using the aging TeraScale microarchitecture. TeraScale was a VLIW SIMD architecture, while Nvidia used a RISC SIMD architecture, similar to TeraScale's successor, Graphics Core Next (GCN). Both 2012 and 2013's mid-range HD 7850 and R9 270X featured first-generation GCN, then mid-2015 brought the third-gen GCN R9 380 and then this year we received the fourth-gen GCN RX 480.

With the exception of the HD 7850, we have seen a steady increase in core count from AMD's mid-range GPUs over the years. Memory bandwidth has also increased steadily and the biggest jump can be seen most recently from the R9 380 to the RX 480.

Test System Specs
  • Intel Core i7-6700K @ 4.50 GHz (Skylake)
  • ASRock Z170 Extreme7+
  • 32GB (4x8GB) DDR4-3000
  • Samsung SSD 850 Pro 2TB
  • Silverstone Strider Series ST1000-G Evolution
  • AMD Crimson Edition 16.8.2 Hotfix
  • Windows 10 Pro 64-bit

Benchmarks: Tom Clancy's The Division, Overwatch

First up, let's take a look at the performance of these mid-range GPUs using Tom Clancy's The Division with the medium quality preset in play. The Radeon HD 5830 was good for just 19fps while the 6870 was 26% faster, which was still only 24fps. From the 6870 to the 7850 we see a rather large 38% boost in performance to 33fps. Beyond that we start to see very playable performance as the R9 270X achieved 54fps. That is a massive 64% boost over the 7850 and frankly we didn't expect that. We believe the massive difference is down to the fact that the 7850 only has a 1GB memory buffer while the 270X has twice that. The R9 380 was 33% faster than the 270X, which is another nice gain, and this pushed the average frame rate to 72fps. Despite that, one of the biggest gains we see is the move from the 380 to the RX 480, as AMD's latest mid-range offering was an impressive 47% faster.
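All the "X% faster" figures in this article are plain relative gains between average frame rates. As a sanity check, here is that arithmetic applied to The Division numbers quoted above (a minimal sketch, using only the fps values from the text):

```python
# Average fps in The Division (medium preset) as quoted above, oldest first.
results = {
    "HD 5830": 19, "HD 6870": 24, "HD 7850": 33,
    "R9 270X": 54, "R9 380": 72,
}

# Percent gain of each card over its predecessor: (new - old) / old * 100.
cards = list(results)
for prev, curr in zip(cards, cards[1:]):
    gain = (results[curr] - results[prev]) / results[prev] * 100
    print(f"{curr} vs {prev}: {gain:.0f}% faster")
# -> 26%, 38%, 64%, 33%, matching the figures quoted in the text.
```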
You have to hand it to Blizzard for making highly enjoyable games that run on just about anything. Overwatch might not have the visuals of Rise of the Tomb Raider or Battlefield 1, but it still looks remarkable in my opinion and it only makes it all the more impressive to see the HD 5830 hitting an average of 49fps at 1080p. Even with the 5830 able to deliver playable results, the 6870 went on to deliver a 45% boost for an average of 71fps. The 7850 was 35% faster still with an average of 96fps and didn't appear to be hindered by its 1GB frame buffer in this title. With the 7850 performing as it should, the R9 270X was just 17% faster this time. The R9 380 provided a further 35% performance jump, reaching an average of 151fps. The RX 480 was then good for almost 40% more performance again at 210fps -- impressive stuff from AMD's latest mid-range offering.

Benchmarks: The Witcher 3, Star Wars Battlefront

The Witcher 3: Wild Hunt has been tested using the medium quality settings while HairWorks has been disabled. These mild quality settings proved far too much for the HD 5830 as it averaged just 15fps. Pushing that result forward by 40%, the 6870 still only delivered 21fps on average. The 7850 was 38% faster again, though it too failed to deliver playable performance. Finally, with another 38% performance boost using the Radeon R9 270X we achieve playable performance with an average of 40fps. The R9 380 was able to provide a further 35% performance boost, hitting a 54fps average. However, it was AMD's latest mid-range GPU that provided the biggest step forward, boosting the frame rate by 48% with an 80fps average.

Star Wars Battlefront not only looks great but is also relatively hardware-friendly. The old HD 5830 might have come up short, but we were still surprised to find a 26fps average using the high quality preset at 1080p. The 6870 was 27% faster at 33fps, but it was the huge 42% boost seen when moving to the 7850 that landed us playable performance. Things got better from there. The R9 270X averaged 63fps, or 34% faster than the 7850. The R9 380 was just 22% faster than the R9 270X this time and we saw a massive 42% performance increase for the RX 480 over the R9 380 as AMD's latest-generation GPU breaks the 100fps barrier.

Benchmarks: Rise of the Tomb Raider, Doom

Rise of the Tomb Raider was tested using the high quality preset with anti-aliasing disabled, which allowed the HD 5830 to average 23fps, though it did dip as low as 6fps. The 6870 was 35% faster with an average of 31fps, the 7850 brought that figure forward another 39% to 43fps, and the jump from the 7850 to the 270X brought a big 49% boost. Given how demanding this game is on VRAM, we suspect the 7850 has been crippled by its 1GB frame buffer. Another 32% gain is seen when moving from the R9 270X to the R9 380, as the average frame rate climbs to 85fps. From there we hit 126fps with the RX 480, which is a mighty impressive 48% increase.

Note that Doom was tested using the Vulkan API for GPUs that support it. The HD 5830 and 6870 don't support Vulkan and thus were tested with OpenGL. As a result, both bombed out, delivering well below the acceptable minimum frame rate for playable performance. The HD 7850 on the other hand sailed along nicely with a 63fps average. This, however, didn't stop the R9 380 from producing 40% more performance with a 121fps average. If you thought that was good, then the RX 480 will impress once again, being 48% faster than the R9 380 with a mighty 179fps.
Power Consumption & Conclusion

For testing power consumption we decided to use Overwatch, where all GPUs performed reasonably well. Interestingly, the Radeon HD 5830, 6870, 7850 and R9 270X all pushed the total system power consumption to a similar figure with no more than 13 watts separating them. It's also interesting to note that we see a rather large jump in power consumption for the R9 380 and RX 480. Total system consumption increases by around 30% with these graphics cards. But of course, given how much faster they are, this is an excellent improvement in efficiency. As you will see in the graph below, the increased power consumption of these newer cards is offset by the increased performance.

Taking the test system with the HD 5830 as an example, the total system consumption here is only 182 watts, however each frame came at a cost of ~7.9 watts when looking at the average performance in the games we tested. The new RX 480 on the other hand pushed total system power consumption higher to 243 watts, but because it averaged 135fps in the gaming sessions, each frame came at a cost of just 1.8 watts. This is a much larger improvement than we saw from the R9 380 (third-gen GCN) over the R9 270X (first-gen GCN).

Looking exclusively at the performance gains, we find that the Radeon RX 480 provides one of the most significant steps forward in the $200 price range in recent times. Based on the games tested, the RX 480 was 45% faster (!) than the R9 380 at 1080p. The last time we saw a gain this large in the AMD camp was back in 2012 when the mid-range HD 7850 became the best value for mainstream buyers. Comparing the RX 480 and R9 380 here has been somewhat skewed by Doom's results. Excluding that game, there's a smaller but still impressive 38% jump between the two.

So while it seems as though GPU pricing has been getting out of hand in recent years, we find that the mid-range offering still provides serious bang for your buck, and as AMD works hard to recover market share, you can rest assured that mainstream pricing is only going to become more competitive.

Notice By: TechSpot
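The "cost per frame" metric above is just total system power divided by average frame rate. Reproducing the two quoted data points (the ~23 fps average for the HD 5830 is implied by the 182 W and ~7.9 W figures rather than stated outright):

```python
# Total system power draw and average fps across the tested games, as quoted.
systems = {
    "HD 5830": (182, 23),   # ~23 fps average is implied by the 7.9 W figure
    "RX 480":  (243, 135),  # 243 W total, 135 fps average
}

for card, (watts, fps) in systems.items():
    print(f"{card}: {watts / fps:.1f} W per frame of average performance")
# -> HD 5830: ~7.9 W per frame, RX 480: ~1.8 W per frame
```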
  3. Today’s top VR platforms, namely the Oculus Rift and HTC Vive, require users be tethered to a powerful computer to drive the experience. In the absence of a wireless configuration, companies like MSI are seeking to solve the dilemma not by cutting the cord but by bringing the PC closer to gamers.

MSI unveiled a backpack-style computer this past May designed specifically for virtual reality gaming. The company was short on details at the time but with the Tokyo Game Show 2016 just a few days away, we’ve got some new details on the curious backpack PC which MSI is now calling the VR One.

Tipping the scales at just under eight pounds (with batteries), the VR One is said to be powered by an Nvidia GeForce GTX 10-series graphics card and an overclocked CPU. MSI says it’s equipped with an HDMI port, a miniDisplayPort, a Thunderbolt 3 port (USB Type-C) and four standard USB 3.0 ports. A pair of 90mm fans and nine heatpipes keep the backpack system cool with a maximum noise output of 41dB. The PC comes with dual hot-swappable batteries that are good for a little over an hour and a half of full-on gameplay. All things considered, the VR One is reportedly capable of delivering more than 90 FPS on any high-level device.

I’ve reached out to MSI for more detailed information about the system’s hardware, pricing and release date and will update this story when I hear back. At this time, all we know is that it is already in mass production and should be available sometime next month.
  4. When Sony announced its faster PlayStation 4 Pro and slimmer original console last week, the company revealed it would also be bringing HDR capabilities to all consoles courtesy of an upcoming firmware update. We now know that the update in question will arrive tomorrow.

John Koller, Vice President, PlayStation Brand Marketing, SIEA, said in a post on the official PlayStation blog earlier today that the update – codenamed Shingen – brings a refreshed user interface, folder organization, library improvements, the aforementioned HDR capabilities and more.

Sony has also made improvements to the Quick Menu, making it easier to access without having to leave a game and adding customization options. They’ve also added several new items to the menu like an enhanced music section that lets gamers play, discover and control Spotify without having to open the app – functionality that wasn’t featured in the beta program.

Update 4.0 will also include some prep for the PlayStation 4 Pro. Specifically, it adds a number of features that support the system’s ability to output high-resolution content. Of course, those won’t come into play until November 10 when the PS4 Pro launches but it’s good to know they’re in place ahead of time.

Those interested in learning more can check out today’s blog post as well as this one published last month that goes into a bit more detail on what to expect.
  5. Tesla Motors on Sunday outlined a number of enhancements that’ll arrive in version eight of its Autopilot software, including a more prominent role for the radar system that was added to all vehicles in October 2014. Up to this point, the radar system has been used solely as a supplementary sensor to the primary camera and image processing system. After careful consideration, however, Tesla believes radar can be used as a primary control sensor alongside the camera rather than simply supplementing it.

Because of how strange the world looks in radar, the challenge in using it as a primary detection system is to avoid false alarms. A discarded soda can on the road with its concave bottom facing the vehicle, for example, can appear to radar as a much larger (and potentially dangerous) object. That’s because metallic objects look like a mirror to radar and those with a dish shape can amplify the signal to many times its actual size. Having the car hit the brakes every time it sees a non-threatening object such as this would be annoying at best and at worst, could cause injury.

To solve the issue, Tesla said version 8.0 of its software unlocks access to six times as many radar objects with a lot more information per object. The system will also assemble radar snapshots – captured every tenth of a second – into a 3D “picture” of the world. This will help the system determine if an object is moving and help determine the probability of a collision.

Tesla described another situation that’s difficult for radar to handle and how they plan to tackle it. A vehicle approaching an overhead highway road sign positioned on a rise in the road, or a bridge where the road dips underneath, can appear to the system as a potential obstruction. Navigation and GPS data isn’t enough to determine whether or not the car will pass under the sign / bridge and by the time the pitch in the road changes, it’s too late. In such situations, fleet learning (in which vehicles report to a Tesla database the position of road signs, bridges and other stationary objects) comes in handy. With other vehicles having already mapped out the area, fleet data can be used to help other vehicles navigating the same stretch of road determine what is safe and thus cut back on excessive, unnecessary braking.

Other quick-hit changes in the coming update include the ability for a vehicle in Autopilot mode to offset its position in a lane when overtaking a vehicle driving close to the lane edge and the use of amplified braking in an emergency. Tesla also said that after further data gathering, a car will activate Autosteer to help avoid a collision when the probability of impact is ~100 percent. What’s more, a Tesla vehicle will not be able to reengage Autopilot until it has been put into park if a user has ignored repeated alertness warnings.

Tesla owners can expect the new software to arrive in the coming weeks.
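Tesla hasn't published its algorithm, but the core idea of turning 0.1-second radar snapshots into a moving-versus-stationary call can be sketched in a few lines. Everything below (the function, the numbers, the threshold) is an illustrative assumption, not Tesla's code:

```python
# Illustrative sketch only -- not Tesla's code. Estimate whether a radar
# return is moving by comparing its range across successive snapshots.
SNAPSHOT_INTERVAL_S = 0.1  # per the article: snapshots every tenth of a second

def closing_speed(ranges_m):
    """Average closing speed (m/s) from successive range readings."""
    deltas = [a - b for a, b in zip(ranges_m, ranges_m[1:])]
    return sum(deltas) / len(deltas) / SNAPSHOT_INTERVAL_S

own_speed = 25.0                    # m/s, hypothetical vehicle speed
sign = [100.0, 97.5, 95.0, 92.5]    # ranges to an overhead sign, shrinking

# An object that closes at exactly the car's own speed isn't moving itself;
# fleet data flagging known signs/bridges then suppresses the false alarm.
if abs(closing_speed(sign) - own_speed) < 0.5:
    print("closing at own speed -> stationary object, check fleet whitelist")
```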
  6. From video games to operating systems, the C programming language is one of the most widely-used languages in the industry. With the Complete C Family Bundle, you can dive into these popular programming tools for only $39. The course bundle features over 40 hours of training in six comprehensive courses. Here are a few highlights:

  • C#: C Sharp Comprehensive Course: Go from C# beginner to expert as you learn how to build fast and secure applications on the .NET framework.
  • C Programming Course: Discover how to express common coding ideas in an accessible fashion using one of the oldest coding languages out there.
  • Comprehensive C++ Training: Design programs that are clear and easy to understand using C++. Learn how to run your own C++ program and make it resistant to bugging threats.
  • C# Programming: Advanced Optimization Techniques: Refine your C# knowledge as you learn how to boost application performance and responsiveness via asynchronous programming.

The Complete C Programming Bundle retails for $1,000+, but with the current deal you can save over 90% and get it for just $39.
  7. Nvidia recently gifted laptop gamers with fully functional GeForce 10 series GPUs. Our coverage of the launch and details on Nvidia's new mobile lineup can be found here, but for those who missed the announcement, the big news is that Pascal brings GPUs with near exact specifications to laptops as their desktop counterparts. This is in stark contrast to essentially every other mobile GPU ever released. In other words, on a GeForce GTX 1080-powered laptop you can expect 1080-like desktop performance, or thereabouts.

To put those claims to the test we received a prototype of Asus' ROG GX800. Those of you familiar with the previous-gen ROG GX700 will know it's an over the top liquid cooled laptop. Well, brace yourself because the GX800 is even more extreme. The G-Sync panel on the new model is larger, the Core i7 processor is clocked higher, there are now three SSDs of the NVMe variety, the keyboard is mechanical, and not just the GPU but also the CPU is liquid cooled.

We totally expect to see more svelte laptops powered by the GTX 1060 and possibly the 1070, however the GX800 is not about balance, it's about performance, and it's the first laptop we've held that carries two GTX 1080 GPUs, which will let us test mobile Pascal in both single and dual GPU scenarios. The clunky old GTX 980 has been dropped for GTX 1080 SLI. In combination with all the other enhancements, it sees the laptop's thickness increase by almost 30% and weight by 46% to a back breaking 5.7kg.

This all sounds crazy and wildly impractical, but Asus doesn’t care. That is kind of the point. Asus is well aware this laptop isn’t for everyone and it doesn’t make sense for most, but the challenge here is to push the bounds of current PC gaming technology, to create something out of curiosity, out of pride. For that we commend them.

As keen as we are to show you the GX800 in detail, the unit we received is still a prototype, so our official review will come at a later time. For now, we'll be content to see just how well the GTX 1080 performs in a laptop and how comparable the performance is to the desktop. While we are at it, it won’t hurt to look into the SLI performance as well. So with that we'll get into benchmarks. In total we have tested eight AAA titles at 4K, so that should give us a clear idea of how Pascal performs when condensed down to notebook form.

Can Laptop Gaming Be Just as Good?

All results we gathered were using the Asus ROG GX800 docked to the liquid cooling unit, which also supplies the laptop with more power. This is required to power both GPUs. There is the option to undock and the laptop will still run normally, but then the two GPUs are throttled down, delivering less than desired performance. In such a scenario, you can use the ROG Gaming Center to disable one of the GPUs and still get pretty good performance.

Asus' ROG software can also be used to enable XMP mode, which enables a 100MHz higher CPU boost clock and operates the DDR4 memory at 2800MHz. Asus recommends users run with this mode as it boosts memory performance by almost 20%, and the combined CPU and memory performance ensures higher minimum frame rates when gaming. With XMP mode enabled but SLI disabled it is possible to use the GX800 on air and get about the same performance you would when docked with only one GPU enabled. So the single GTX 1080 figures you are about to see are achievable using just air cooling, while the SLI results require the dock, not so much for its cooling abilities but for that extra 330 watts of power.
First up we compare our Core i7-6700K desktop test system to the Asus ROG GX800. Using a single GTX 1080 we find the same 49fps on both platforms. Enabling SLI boosted the GX800's frame rate by 90% to a seriously smooth 93fps at 4K. This is again comparable to a fully-fledged ATX desktop system running a pair of GTX 1080 Founders Edition graphics cards.

This time we find exactly the same SLI performance on both platforms, while the GX800 laptop managed to come out slightly ahead in the single card comparison. The fact that the card is being liquid cooled is likely helping it maintain higher clock speeds.

No doubt someone was going to ask “but can it play Crysis?” Well, the answer is yes, at 4K. Granted we aren’t exceeding 60fps, but the game is being played with 4xTXAA enabled, so the 50fps average with SLI enabled is still very impressive.

Tom Clancy’s The Division was very smooth at around 60fps on the GX800 with SLI enabled, and with frame dips no lower than 47fps the 4K performance was exceptional.

Review By: TechSpot
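The 90% figure is simple arithmetic, and SLI scaling is usually judged against the theoretical doubling from a second card. A quick sketch using the frame rates quoted above:

```python
# Frame rates quoted above for the GX800 at 4K: one GTX 1080 vs two in SLI.
single_fps, sli_fps = 49, 93

boost = (sli_fps - single_fps) / single_fps   # -> 0.90, i.e. +90%
efficiency = sli_fps / (2 * single_fps)       # fraction of a perfect 2x

print(f"SLI boost: {boost:.0%}, scaling efficiency: {efficiency:.0%} of ideal")
# -> SLI boost: 90%, scaling efficiency: 95% of ideal
```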
  8. Intel last year said in a job posting that it was looking for a CPU architect / researcher to spearhead the company’s research and development of processor cores and graphics processors to be built on its 7nm manufacturing process. The post, first spotted by Ashraf Eassa from fool.com, said the products in question would arrive in the “2020 and beyond” timeframe. Eassa recently noticed, however, that Intel has published a revised version of the job listing that suggests Intel may be looking to milk its upcoming 10nm process longer than most initially expected. In the updated listing, Intel says it is looking for someone to spearhead the research and advanced development of microprocessor cores in the 2022 and beyond timeframe. The microarchitecture and design of these advanced CPUs, the company says, will be aggressively co-optimized with Intel's sub-10nm technology nodes deep into the next decade. On the surface, it indeed sounds like Intel may be pushing its 7nm chips out by a couple of years, from 2020 to 2022. As Eassa notes, it’s possible that Intel could instead be talking about 5nm technology in the new listing versus the 7nm it explicitly referenced in the original posting. If that were the case, it seems unlikely that Intel would use the phrase “sub-10nm technology nodes” rather than something more fitting like “sub-7nm technology nodes.” Notice By: TechSpot
  9. Welcome Djak
  10. Nairo Quintana is set to win the Vuelta a Espana after holding off Briton Chris Froome's challenge on the final mountain-top finish of the race. The Colombian, 26, leads Froome by one minute 23 seconds with only Sunday's processional stage into Madrid to come. Team Sky's Froome, 31, attacked Quintana repeatedly as the pair climbed Alto de Aitana, but the Movistar rider beat Froome by two seconds. The stage was won by Frenchman Pierre-Roger Latour of Ag2r-La Mondiale. The 22-year-old outsprinted Colombia's Darwin Atapuma at the summit.

Stage 20 was Froome's last chance to attack Quintana after trimming his lead by more than two minutes in the individual time trial on Friday. He was attempting to become the first man to win the Tour de France and Vuelta in the same season since Bernard Hinault in 1978. The win will be Quintana's second in a Grand Tour - his maiden success came in the 2014 Giro d'Italia. Quintana's compatriot Esteban Chaves will complete the podium places in general classification after overhauling Spain's Alberto Contador, a three-time winner of the Vuelta. Britain's Simon Yates finished 14th on the stage to remain sixth overall.

Stage 20 result:
1. Pierre-Roger Latour (Fra/Ag2r) 5hrs 19mins 41secs
2. Darwin Atapuma (Col/BMC) +2secs
3. Fabio Felline (Ita/Trek-Segafredo) +17secs
4. Mathias Frank (Swi/IAM Cycling) +40secs
5. Robert Gesink (Ned/LottoNL-Jumbo) +1min 03secs
6. Bart de Clercq (Bel/Lotto Soudal) +1min 28secs
7. Rudy Molard (Fra/Cofidis) +2mins 02secs
8. Lilian Calmejane (Fra/Direct Energie) +3mins 01sec
9. Esteban Chaves (Col/Orica-BikeExchange) +3mins 17secs
10. Nairo Quintana (Col/Movistar) +4mins 03secs
11. Chris Froome (GB/Team Sky) +4mins 05secs
12. Andrew Talansky (USA/Cannondale) +4mins 34secs
13. Alberto Contador (Spa/Tinkoff) +4mins 41secs
14. Simon Yates (GB/Orica-BikeExchange) +5mins 04secs
15. David de la Cruz (Spa/Etixx - Quick-Step) +5mins 10secs

General classification after stage 20:
1. Nairo Quintana (Col/Movistar) 80hrs 42mins 36secs
2. Chris Froome (GB/Team Sky) +1min 23secs
3. Esteban Chaves (Col/Orica) +4mins 08secs
4. Alberto Contador (Spa/Tinkoff) +4mins 21secs
5. Andrew Talansky (US/Cannondale) +7mins 43secs
6. Simon Yates (GB/Orica) +8mins 33secs
7. David de la Cruz (Spa/Etixx - Quick-Step) +11mins 18secs
8. Daniel Fernandez-Moreno (Spa/Movistar) +13mins 04secs
9. Davide Formolo (Ita/Cannondale) +13mins 17secs
10. George Bennett (NZ/LottoNL-Jumbo) +14mins 07secs
  11. At yesterday’s Apple event, the company confirmed rumors that have been circulating for months: the iPhone’s headphone jack is no more. Quite a few people are unhappy about this, so in an interview with Buzzfeed, Apple’s VP of worldwide marketing Phil Schiller, VP of hardware engineering Dan Riccio, VP of iOS, iPad and iPhone marketing Greg Joswiak, and CEO Tim Cook defended the decision.

The foursome said there were several reasons why Apple ditched the 3.5mm connection, the main ones being the amount of space it takes up, the way it hinders water resistance, and the fact it’s antiquated. “The audio connector is more than 100 years old,” said Joswiak. “It had its last big innovation about 50 years ago. You know what that was? They made it smaller. It hasn’t been touched since then. It’s a dinosaur. It’s time to move on.”

Riccio, who called the jack a mere “hole filled with air,” said its inclusion had held Apple back when it came to adding new features to previous iPhones. “It was fighting for space with camera technologies and processors and battery life. And frankly, when there’s a better, modern solution available, it’s crazy to keep it around.”

Some have claimed that Apple removed the connector so it could introduce a new DRM platform for audio consumption, which Schiller called “pure, paranoid conspiracy theory.” At the iPhone event, Apple said it was ditching the jack because of “courage,” a term that seems to have annoyed people even more. Maybe the company should have listened to its co-founder, Steve Wozniak, who last month said that the decision would “tick off a lot of people.”

Intel is another company to have praised the advantages of removing the headphone jack. At its recent Developer Forum in San Francisco, architects Brad Saunders and Rahman Ismail claimed replacing the old connection with USB-C would improve both audio quality and smartphones in general.

Despite the furor over Apple’s move, it’s unlikely to have much negative effect on iPhone 7 sales. “Remember, we’ve been through this many times before,” says Schiller. “We got rid of parallel ports, the serial bus, floppy drives, physical keyboards on phones — do you miss the physical keyboards on your phone? … At some point — some point soon, I think — we’re all going to look back at the furor over the headphone jack and wonder what the big deal was.”
  12. It’s no secret that Sony’s smartphone business is struggling. Competition has become fierce over the last couple of years, with continued innovation and lower prices causing trouble for market incumbents. Rather than doubling down on their flagship products, like HTC did with their excellent HTC 10, Sony seems happy to trundle along with minor iterations year after year. The new Xperia X series may not carry the same name as its predecessor, the Xperia Z, but there’s no mistaking these products for a revolutionary change.

I've been testing the top-end Xperia X Performance for a few weeks now and I've found it to be remarkably similar to the Xperia Z5 that preceded it. Considering the Z5 was the sixth iteration of the 3-year-old Xperia Z, it’s easy to feel this design is getting stale.

The similarities between the Xperia X Performance and the Xperia Z1 in particular are striking. Released in 2013, the Xperia Z1 came with a 5.0-inch 1080p display; in 2016 we’re still seeing a 5.0-inch 1080p display on the X Performance. Both phones are water resistant. Both phones have cameras with more than 20 megapixels, and both have dedicated shutter buttons.

So what has Sony managed to achieve in three years? Well, the Xperia X Performance ditches the Snapdragon 800 SoC for the modern Snapdragon 820, bringing along better connectivity, more storage, and more RAM. The front camera is up to 13 megapixels now, while Sony claims the rear camera is their fastest ever. Unbelievably, we’re getting a smaller battery in a phone just as thick. Plus we’re getting some new software features, too.

When I first picked up the Xperia X Performance at MWC 2016, and then again for this review, I couldn’t help but feel that the design is bland. I’ve seen this exact sort of smartphone style from Sony more than three years ago, and it seems that as competitors have got slimmer and sexier, Sony’s designers have gone backwards. There are very few aspects of the Xperia X Performance that are visually more appealing than my favorite Sony smartphone design: the Xperia Z3. Since that phone's release in 2014, Sony’s flagship device has gained a millimeter of thickness, lost the comfortable and stylish curved edges, and swapped premium materials for plastic in some areas.

On Sony’s product page, the company is quick to highlight the sleek metal back panel, which I will admit looks great thanks to a subtle brushed finish and minimal distractions, but the plastic edges let this handset down; they look cheap and don’t carry the same textural pleasure. Sony has tried to color these edges to look the same as the rear panel, but it hasn’t worked: the visible seam that joins the back panel to the sides is unsightly, and the difference in luster between metal and plastic is obvious.

Placed next to the similar Huawei P9, it’s clear what Sony should have done in designing the Xperia X Performance. The P9’s metal unibody looks and feels fantastic, as the premium materials curve seamlessly around the edges. The Xperia X Performance’s transition from metal to plastic looks ugly in comparison, and gives the impression that Sony’s creation is the cheaper device. It’s not.

The metal back does have its advantages. There are no antenna lines, which are usually necessary on a unibody design and can detract from the style. The Xperia X Performance is also easier to grip than its Xperia Z series predecessors, which used fragile glass backs that were prone to cracks, as well as being slippery fingerprint magnets.
The front panel is constructed from a slab of glass protecting the display, which subtly curves into the edges, creating a swooshable feel. The bezels are average in size for a 5.0-inch smartphone, and are colored to match the edges and back panel. I’m not a fan of this single-tone design – it looks a bit boring – and the rose gold model I received to review is not something I’d normally choose to buy. That said, the X Performance is also available in white, black and “lime gold.”

The Xperia X Performance is one of the very few remaining flagships to pack dual front-facing speakers, which provide a stereo experience when watching video and playing games. I love stereo speakers on the front of smartphones, and it’s sad to see companies like HTC move away from this. The quality of the X Performance’s speakers isn’t particularly great, but their volume is decent and I don’t expect amazing sound from such small drivers.

Along the bottom edge of the Xperia X Performance is a micro-USB port, which is disappointing considering most high-end devices have transitioned to the more versatile USB-C port. Samsung is the other outlier here, as they have their Gear VR system which still uses micro-USB. Sony doesn’t have anything like that, so I’m puzzled as to why they didn’t update their charging and data port to the modern standard.

The top edge features a 3.5mm headphone jack, while the left edge has a tray for either two nano-SIMs, or a nano-SIM and microSD card slot. I appreciate seeing dual-SIM functionality here, as it isn’t often found on high-end smartphones.

On the right edge is the fingerprint sensor, which doubles as the power button. I was critical of the Xperia Z5’s fingerprint sensor in this location as it didn’t seem to work very well, but these issues have been resolved on the Xperia X Performance. This sensor is ludicrously fast to operate, much more accurate than its previous implementation, and the positioning is just about perfect. My only concern is that the tactile feedback from the button isn’t great, and it might've made the phone thicker than necessary.

Below the fingerprint sensor is the volume rocker, which is in an awkward position due to its low placement on the right-hand edge. A more comfortable location would be above the fingerprint sensor, and there would be less chance of accidental presses as well. Below the volume rocker is the dedicated two-stage camera button.

The Xperia X Performance is water resistant, carrying IP65 and IP68 ratings that allow immersion in fresh water up to 1.5m for up to 30 minutes. It’s also dust tight and resistant to low pressure water jets. It’s always handy to have a water resistant phone for the times it accidentally gets dropped in the toilet or splashed with coffee, but under no circumstances should the X Performance be taken into salt water.

As far as usability is concerned, the Xperia X Performance’s 5.0-inch display makes this smartphone easy to use with one hand. At 8.7mm thick it’s chunkier than I would have liked to see, especially as the phone doesn’t pack in a large battery, though it remains more portable than handsets with larger displays.

Review By: TechSpot
  13. There are plenty of large phones to choose from these days: the iPhone 7 Plus, the Galaxy Note 7, the OnePlus 3, and many others that feature displays at least 5.5 inches in size, which can make them a bit cumbersome to use in one hand. But nothing compares to the Xiaomi Mi Max, a gigantic 6.4-inch smartphone/phablet that dwarfs what most people carry with them. This phone is, quite simply, ridiculously large.

While this handset's size will suit only a small fraction of phone buyers, there’s actually a lot to like about the hardware inside. For just under $250 from Gearbest, who helped us acquire this phone for testing, you’re getting a massive display, a metal body with a fingerprint sensor, a 16-megapixel rear camera, and a powerful Qualcomm Snapdragon 650 SoC. The battery inside is just as enormous as the handset, at 4,850 mAh.

Having now used (or, rather, struggled to use) the Xiaomi Mi Max for the last couple of weeks, I’ve learned a lot about what is physically required to operate a 6.4-inch handset on a daily basis. At $250 the Mi Max's price point is very attractive, but does the whole package deliver to earn our recommendation? Let’s find out.

Clearly the 6.44-inch display makes this device significantly larger than anything else I’ve used for years. It’s roughly as thick as the iPhone 6s at 7.5mm, but it’s 15mm taller and a whopping 10.4mm wider. Width is the key dimension for determining one-handed usability, and any increase over the already-wide iPhone 6s Plus can make a phone hard to operate. The Mi Max compares even less favorably to the Samsung Galaxy Note 7, which packs a 5.7-inch display in a near bezel-free body. Here the Mi Max is nearly 20mm taller and 15mm wider, leading to a phone that’s 35% larger than an already-large device. Even with a screen-to-body ratio of approximately 75%, the size of this display introduces a number of complications during regular usage. In short, the Xiaomi Mi Max requires two hands to use.

It’s impossible to use the Mi Max in one hand. I don’t have particularly large hands, but even with some serious acrobatic work my fingers are at least an inch from reaching the upper left corner of the display. Reaching the same corner on my 5.7-inch Nexus 6P is relatively easy in comparison, and that’s a phone that most people think is reasonably large. The extra screen real estate effectively works against the Mi Max in this regard.

Being able to reach each corner of the display is critical as many smartphone applications place navigation elements or other buttons in these locations. In Gmail, for example, I am simply unable to hit the hamburger menu icon in the top left without performing a gymnastics routine with my fingers to appropriately position and grip the smartphone. While typing, attempting to hit the q, w or a keys gives my thumb a workout with all the stretching that is required. To make matters worse, the capacitive navigation buttons below the display are hard to hit with one hand, especially the app switching button on the far left side. Alternatively, left-handers will struggle to hit the important back button, which normally I’d criticize for being in the wrong location, but on this phone I’m glad it’s the closest to my hand.

The two-handed requirement sounds relatively trivial, but it’s not until you’re trying to use the phone on a regular basis that you discover how annoying it is. Carrying some shopping bags home from the store in one hand? Forget using the Mi Max in your other hand.
Want to quickly send a message to a co-worker while sipping your morning coffee? No chance.

If you’re comfortable using such a massive handset and don’t mind being restricted to two-handed use, there’s actually a lot to like about the Mi Max's design. The phone features an excellent metal back panel that curves around the left and right edges, which makes the handset look just as good as some flagships of the past few years. The metal build also feels great in the hand, and provides good durability: for such a large phone there is barely any flex in the body.

The design is hampered somewhat by plastic sections above and below the metal. These sections attempt to imitate the color and finish of the metal but don’t look nearly as good. Luckily they are the only downside to an otherwise great mid-range smartphone design.

I appreciate the precision of the Mi Max’s design, which includes properly aligned elements such as the front camera and sensor array, and the two speaker grills along the bottom edge. I was disappointed, however, that the Mi Max doesn’t include front-facing stereo speakers, which would complement the large, media-friendly display. The speakers on this phone are okay, but their positioning could be better.

Review By: TechSpot
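To make the size talk concrete, the 35% figure follows from simple footprint arithmetic. The dimensions below are the phones' published specs; treat them as assumptions here, since the review itself only quotes the differences:

```python
# Published footprint dimensions in mm (height x width); assumptions here,
# since the review quotes only the deltas (~20 mm taller, ~15 mm wider).
mi_max = (173.1, 88.3)   # Xiaomi Mi Max
note7 = (153.5, 73.9)    # Samsung Galaxy Note 7

def area(dims):
    height, width = dims
    return height * width

ratio = area(mi_max) / area(note7)
print(f"Mi Max footprint is {ratio - 1:.0%} larger than the Note 7")
# -> roughly 35% larger, matching the figure in the review
```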
  14. Welcome Dragos
  15. For those unwilling or unable to purchase Microsoft Office, the open source project OpenOffice has long been an excellent alternative. But it now looks as if the free productivity suite could be shut down unless more volunteer developers come on board.

Ars Technica reports that Dennis Hamilton, volunteer vice president of Apache OpenOffice, sent out an email thread stating: "It is my considered opinion that there is no ready supply of developers who have the capacity, capability, and will to supplement the roughly half-dozen volunteers holding the project together." He added that no decisions had yet been made, but "retirement of the project is a serious possibility."

Many of OpenOffice’s volunteers have left to work on LibreOffice - a fork of OpenOffice that launched in 2011. Its updates arrive more frequently than OpenOffice's: 14 in 2015 alone, which is a lot more than the single update OpenOffice received across the whole of last year.

The dearth of volunteers has meant that dealing with security vulnerabilities has posed a problem. Apache informed users of a vulnerability in June that could let attackers craft denial-of-service attacks and execute arbitrary code. The company suggested users switch to Microsoft Office or LibreOffice as a solution. A patch that needed to be manually installed was released a month later, but security problems remain.

Another issue faced by OpenOffice is that the few developers still working there are “aging,” and that working there isn’t “much of a resume builder.”

Despite the lack of updates, OpenOffice was downloaded more than 29 million times on Windows and Mac last year, making a cumulative total of 160 million downloads since May 2012, according to project statistics. While there are plenty of people who want OpenOffice to continue by finding other ways of attracting new contributors, the signs aren’t looking good for the open source software.
  16. With the release of Deus Ex: Mankind Divided's DirectX 12 patch, at least in a beta version, AMD has unleashed a new set of Radeon Software Crimson Edition drivers that further optimize the game in its DirectX 12 mode. The Radeon Software 16.9.1 drivers also include a new CrossFire profile for Dota 2's DirectX 11 mode. On top of this, AMD has included a range of bug fixes, including those that address flickering issues with 144 Hz displays, as well as crashes and flickering in games like GTA V, Dirt Rally, and Doom.

Known issues in this driver include a problem with the AMD Gaming Evolved overlay that causes crashes in some games. Users should also be aware that upgrading to this new 16.9.1 driver may reset user settings in Radeon Settings to their default values, so it could be a good idea to take note of what you've changed in case this happens to you.

As always, you can download the latest Radeon Software drivers through Radeon Settings automatically, or you can head over to our driver download section and grab a manual installer.
  17. Sony on Wednesday introduced the world to its new family of PlayStation 4 consoles consisting of a slimmer version of the original and a more powerful variant called the PlayStation 4 Pro. The slimmer and lighter PlayStation 4 will replace the original console in the lineup. It’s functionally identical to the first-generation console, we’re told, although it is a bit more energy efficient and comes with a slightly different DualShock 4 controller. Codenamed Neo, the new PlayStation 4 Pro with 1TB hard drive builds on the success of the original with a faster processor, better graphics and support for glorious 4K resolution and HDR. Although it’s designed to get the most out of 4K televisions that support HDR, such sets aren’t a requirement. As PlayStation lead system architect Mark Cerny explained, even games on a 1080p TV will look better through the use of technologies like super-sampling and advanced anti-aliasing. Games on standard HD sets will also feature brighter colors and better reflections although it sounds like high-fidelity patches will be needed to bring out the best in older titles. PlayStation 4 Pro will also afford a much better overall experience when using PlayStation VR, the company’s upcoming virtual reality headset. Netflix is even developing an app for the console that’ll allow for 4K video streaming. One has to wonder if this will expedite Microsoft’s plans to bring Project Scorpio to market. It’s worth mentioning that HDR capabilities will be coming to all PlayStation 4 consoles via firmware update. The slimmer PlayStation 4 goes on sale September 15 priced at $299. If it’s the PlayStation 4 Pro you’re after, be prepared to wait until November 10 to get one for $399.
  18. I spent my first hour with Origin PC’s GeForce GTX 1080-powered, 34-inch ultra-wide curved screen Omni all-in-one gaming PC just staring at my desk. There’s a lot more desk now. I didn’t know I had so much desk.

Technically I spent the first hour looking for my power screwdriver to extricate the system from the obligatory wooden crate, all part of the Origin PC purchasing process. I imagine someone purchasing one of these, instead of having to store a large wooden crate in their cramped apartment home for several weeks while they review the system, would be tickled pink. We’ve been using the crate as a side table.

But then, so much desk. I’d removed my normal work tower from the surface in order to make room, but I really didn’t have to. Even now, as I sit in front of the Omni writing this review, my eyes continuously shift left and right, anxious about the lack of clutter. Much better. This must be what a Mac user feels like all the time.

An all-in-one computer combines a monitor and computer into a single unit, generally with a smaller footprint. Apple’s always been a big proponent of the design, and when I started seeing computers in school (I’m so old) they were mainly boxy Macs. When I got my first real computer, I was surprised a monitor was not attached to it. Technology has changed significantly since those days. We’ve got cellular telephones, color televisions, fire and the ability to stuff a powerful PC behind a top-of-the-line monitor, leaving plenty of room on my desk for toys.

Enter the Origin Omni. Alone it looks like a strangely-thick 34-inch curved screen ultra-wide (3440 x 1440) quad HD monitor, which it is. It’s just got some junk in the trunk, and by junk I mean high-end PC parts. Strip away the back panel of the monitor, and you’ve got a full-sized motherboard, full-sized graphics card, memory, wires, fans, some cooling bits. You know, the stuff inside a computer, only Tetris’d into a much flatter form factor. Adding only a small amount of thickness to the monitor (and a bunch of heft) transforms it into a relatively low-compromise PC enclosure.

It’s got a full-size ASUS Z170I Pro Gaming motherboard fitted with a watercooled Intel Core i7 6700K Quad-Core 4.0GHz CPU, a GeForce GTX 1080 graphics card, 16GB of Kingston DDR4 memory and two hard drives, a 250GB Samsung SSD and a 2TB Seagate SATA drive. The only real compromise here is the power supply, which is an Origin-branded 450 watt unit, right on the edge of what the GeForce GTX 1080 card supports. But hey, it’s worked like a charm so far, so I won’t argue with results.

Here’s the parts list, as configured at the Origin PC website:

Case: Origin Omni
Display Type: Omni 34" 3440 x 1440 Curved Ultra-Wide 60Hz Matte Display
Power Supply: Built-In 450 Watt
Motherboard: ASUS Z170I Pro Gaming
System Cooling: Built-In Closed Loop Liquid Cooling Solution for 1151
Processors: Intel Core i5 6500 Quad-Core 3.2GHz (3.6GHz TurboBoost)
Graphic Cards: Single 8GB NVIDIA GeForce GTX 1080 Founders Edition
Memory: 16GB Origin PC DDR4 Powered by Kingston 2666MHz (2 X 8GB)
Operating System: MS Windows 10 Home
Hard Drive One (Operating System Drive #1): FREE 250GB Samsung 750 EVO Series
Hard Drive Two: 2TB Seagate 5400RPM 2.5" Hard Drive
Audio: On Board High Definition 8-Channel Audio
Networking: Onboard Network Port
Warranty: Lifetime 24/7 U.S. Based Support and Lifetime Free Labor.
1 Year Part Replacement & 45 Day Shipping Warranty
Webcam: Omni Webcam (Included with Omni)

Price as configured: $3,101

The price looks high, but considering the price of a 34-inch curved QHD monitor standalone, the individual components and the lifetime support and free labor warranty, it all adds up to about right. Plus you get a free t-shirt, and a lovely wooden side table with built-in storage.

What’s Good

Performance: This is a gaming PC, and it games quite well, thanks in no small part to the GeForce GTX 1080 Founders Edition that comes packed inside of it. I reviewed the card and found it very impressive, delivering more than playable performance (at the very least 30 FPS) on most modern games at 4K resolution. But this system isn’t 4K. It’s ultra-wide 3440 x 1440. That means it’s not pushing as many pixels as a 4K screen, but it’s delivering an enhanced experience in supported games (see the quick sketch at the end of this review). We’re talking 60 frames per second easy at highest settings on most games I played, including World of Warcraft, Deus Ex: Human Revolution, World of Warcraft, Hitman, a nifty little Steam futuristic racing game called Redout, Rise of the Tomb Raider and World of Warcraft.

Let’s take Rise of the Tomb Raider as an example. In my 1080 review I clocked it at an average of 44 frames per second on high settings at 4K. The same settings running at wide QHD delivered 66 frames per second on average. Plus the views are so much more expansive.

Profile: It’s a monitor, at least that’s what it looks like sitting on your desk. I must admit that there’s a part of me that’s having an issue with not having a big box with random LED lighting sitting on top of my desk, but that part of me is silly. This is an elegant computer solution that makes me feel bad for making fun of all the folks in the office typing directly into their iMac monitors.

The Panel: Curved monitors are stupid, until you spend an hour in front of one. It’s not exactly immersion so much as it is a deeper intimacy between you and your display, like it’s constantly trying to embrace you but it just can’t. I’ve fiddled with ultra-wide monitors that weren’t curved, and there was always something just a little bit off. The curve makes a ton of difference when you’re working with the extra real estate, bringing the far edges of the screen a little closer. Note the monitor also has HDMI in and picture-in-picture capabilities, so it doesn’t just have to be your PC. I only wish the panel had a better refresh rate than 60Hz. Otherwise it’s bright, inviting and—best of all—you can’t really tell there’s a PC behind it unless you’re looking for it.

What’s Not So Good

The Noise: When the fans of the Origin Omni all-in-one kick in, you know it. I didn’t notice it much at first, when I had the system set up in my always-noisy living room, but once I made the move to my office it’s hard to ignore. The noise is low but constant during everyday use. When a game gets going it can rise to a low roar, with a bit of a rattle while ramping up. Not quite deal-breaking levels of noise, but loud enough to bear mentioning.

Port Access: The ports for the motherboard and graphics card are located on the underside of the monitor, and they can be a bitch to get to. While there are a pair of USB ports on the back and one up top (for the webcam), the rest of the Omni’s plugs and ports are on the underside, accessed either by tilting the system back on its stand or turning it around and removing a strip of back plate.
Either way it’s a bit of hassle—I’d suggest getting a USB hub and forgetting the underports entirely.

All-In-One Gaming Works

Thank goodness for the flat panel display, so thin that we can fit all sorts of lovely things behind it. Between Apple systems and consumer-aimed IBM compatibles (there’s me aging myself), all-in-one systems have been enjoying a renaissance over the past decade. Until recently however, I’ve been looking at them as either work machines or easy-to-use, less-powerful PCs of the sort I might buy a parent for Christmas. The Origin Omni (and systems like it) has shown me that all-in-one systems can be powerful gaming systems as well, as long as they’ve got a big enough monitor up front to hide all the goodies.

Now to get this beast back in its crate. Gonna miss that side table.

Review By: TechSpot
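As promised above, here's the quick sketch behind the pixel-count point: ultra-wide 3440 x 1440 pushes roughly 60% as many pixels as 4K, which lines up with the Rise of the Tomb Raider jump from 44 fps to 66 fps (a back-of-the-envelope illustration, not a measurement):

```python
# Pixel counts for the two resolutions discussed in the review.
uwqhd = 3440 * 1440    # ~4.95 million pixels (the Omni's panel)
uhd_4k = 3840 * 2160   # ~8.29 million pixels

print(f"UWQHD pushes {uwqhd / uhd_4k:.0%} of 4K's pixels")
# -> ~60%; fewer pixels per frame is roughly why the same GTX 1080 that
#    averaged 44 fps at 4K in Rise of the Tomb Raider manages 66 fps here.
```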
  19. It sounds like something out of a B-grade Hollywood plot — a flash drive that you plug into a computer that is capable of destroying it within seconds. Last year, hacker Dark Purple disclosed a USB flash drive designed to fry a modern system as soon as you plug it in. The drive works by discharging -220V through the USB port. The exact details on how the drive functioned weren’t immediately released. But there’s now a Hong Kong-based company selling a USB Kill Drive 2.0 for just $50. Here’s how the company describes the product:

The USB Kill 2.0 is a testing device created to test USB ports against power surge attacks. The USB Kill 2.0 tests your device’s resistance against this attack. The USB Kill collects power from the USB power lines (5V, 1 – 3A) until it reaches ~ -240V, upon which it discharges the stored voltage into the USB data lines. This charge / discharge cycle is very rapid and happens multiple times per second. The process of rapid discharging will continue while the device is plugged in, or the device can no longer discharge – that is, the circuit in the host machine is broken.

The integrated nature of modern SoCs means that blasting the USB controller with -200V the way this drive does will typically cause severe damage, up to and including destroying the SoC. While modern motherboards include overcurrent protection, this typically protects against positive voltage. (The difference between positive and negative voltage is a reference to the voltage relative to ground: if the voltage source is connected to ground by its “-” terminal, the voltage source is positive; if it connects via the “+” terminal, the voltage source is negative.)

The company also plans to sell a USB Kill Tester Shield, which it claims will prevent the USB Kill device from functioning and protect user data from certain kinds of snooping or intrusion if you hook up to an unknown charging station or other device. This kind of intrusion is known as “juice jacking,” though it’s not clear if this attack vector has been widely used in the real world. There’s not much to say about the Kill Tester Shield at the moment — all of the links on the website to the actual product are non-functional as of this writing. Caveat emptor is good advice in a situation like this.

The larger question, I think, is whether devices like this pose a threat to the average consumer. Right now, I think they don’t. At $5, it’s easy to imagine someone ordering these in bulk and scattering them just to screw with people in general. At $50 each, you probably aren’t going to stumble over a tiny block of death. At the same time, however, studies have shown that up to 50% of people will cheerfully plug in a USB drive they found on the ground without taking precautions for what kind of data or malware might be on the drive. If the USB Kill 2.0 is actually shipping in volume, it’s probably a good idea to revisit that tendency — or at least keep an old computer around for testing.
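For a rough feel of the energy in each discharge cycle, the standard capacitor-energy formula E = C x V^2 / 2 applies. The article gives the voltage; the capacitance below is purely an assumed illustrative value, since the vendor publishes none:

```python
# Back-of-the-envelope energy per discharge cycle, E = C * V^2 / 2.
# The voltage comes from the article (~ -240 V); the capacitance is an
# assumed illustrative value, not a published USB Kill 2.0 spec.
voltage = 240.0        # volts (magnitude of the ~ -240 V charge)
capacitance = 1e-6     # farads (assumed ~1 uF for illustration)

energy_joules = 0.5 * capacitance * voltage ** 2
print(f"~{energy_joules * 1000:.0f} mJ per discharge cycle")
# -> ~29 mJ; small in absolute terms, but dumped into data lines designed
#    for low-voltage signalling, many times per second.
```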
  20. Welcome killer
  21. Welcome Amarjeet
  22. Custom silicon vendor Movidius has attracted a lot of attention for its high-performance, low-power chips that have powered vision applications like Google Tango, as well as making machine learning possible on mobile devices. Now it has received the ultimate compliment. Chip giant Intel has acquired it to help accelerate its RealSense project and other efforts to provide computer vision and deep learning solutions. Intel is expecting to see Movidius technology deployed in drones, robots, and VR headsets — in addition to more traditional mobile devices such as smartphones and tablets.

The Movidius advantage

Power requirements are the traditional Achilles heel of mobile solutions that require substantial computation, with vision and machine learning being two of the most extreme cases. By creating optimized, custom silicon — its Myriad chip family — Movidius has reduced the power needed to run machine learning and vision libraries by well over an order of magnitude compared to a more general-purpose GPU.

RealSense

After a lot of initial excitement, Intel’s first-generation RealSense products — designed to provide devices with a 3D view of their surroundings to support mapping, navigation, and gesture recognition — faltered due to technical shortcomings. However, Intel has more than redoubled its efforts, and is aiming to make RealSense the eyes and ears of the Internet of Things, which Intel believes will comprise over 50 billion devices by 2020. Intel Senior VP Josh Walden likens vision processors such as Movidius’s Myriad to the “visual cortex” of IoT devices.

Intel taking aim at Nvidia’s GPU strategy

This move takes Intel further into Nvidia’s home turf. Nvidia has bet big on high-performance computing for AI, self-driving cars, vision, and VR — the exact markets Intel is trying to move into with its RealSense platform, and now the Movidius acquisition. This pits Nvidia’s strategy of providing the most possible general computing power per watt against Intel’s custom silicon. On paper, the advantages of each are fairly straightforward. General purpose GPU (GPGPU) computing provides the most flexibility and adaptability, while custom silicon can be more efficient when running a specific task or library — once it has been developed. In the market, expect to see plenty of design wins for both Intel and Nvidia, and some leapfrogging of each other as subsequent product generations roll out from each.
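The "order of magnitude" claim is about performance per watt rather than raw throughput. A toy comparison, with every number below hypothetical and chosen only to illustrate the metric:

```python
# Hypothetical numbers purely to illustrate the performance-per-watt metric
# behind the "order of magnitude" claim -- not measured Myriad or GPU figures.
chips = {
    "custom vision chip": {"gflops": 100, "watts": 1.0},
    "general-purpose GPU": {"gflops": 1000, "watts": 150.0},
}

for name, spec in chips.items():
    print(f"{name}: {spec['gflops'] / spec['watts']:.0f} GFLOPS per watt")
# The custom part can win on efficiency even while losing on raw throughput,
# which is exactly the trade-off described above.
```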
23. Is it a little late for me to say happy birthday, DarkJesus? I wish you much success in your life, and keep moving forward, buddy!
24. Android 7.0 Nougat has finally reached Google's Nexus devices after more than five months of developer preview testing. The final version is more stable and has a ton of new features. Most phones won't get a Nougat update for a few months at least, and even then only if the carrier and OEM consider it a high priority. So what can you expect when that glorious day finally arrives? I've been using Nougat on both a Nexus 6P and a Pixel C for the last few days. I won't bother reciting all the features, a simple list of which you can find on Google's site. Instead, let's talk about what it's actually like to use Android 7.0 Nougat as a daily driver.

Doze mode gets better

In my opinion, Doze Mode in Android 6.0 was one of the most important features ever to show up in Android. It addressed an ongoing issue that could cause problems even for the most experienced users: sometimes you'd install an app, and it would misbehave in the background, draining your battery while the phone sat idle. Doze Mode shuts all that down. In Marshmallow, leaving the phone sitting for about 30 minutes would activate Doze mode. Almost all apps would be put to sleep and not permitted by the system to wake up until a regular update window was reached or you picked up the phone; important push notifications would still arrive. In Nougat, Doze Mode does the same thing, but it turns on in more situations. It's no longer limited to times when the device is stationary, so even if the phone is in your pocket, Doze Mode can still kick in. This has made a noticeable positive impact on my battery life over the last few days.

New notifications will take some getting used to

Notifications are one of the stock Android features most OEMs implement without a ton of changes, so a lot of users will be getting the new bundled Nougat notifications in the coming months. They're much more powerful than notifications in Marshmallow, but also a little overwhelming at first. Nougat bundles all the active notifications from an app into a single expandable item. For example, if you've got several Gmail notifications, you can expand the Gmail notification to see each one individually right in the notification shade. You get the full snippet preview of each one as if it were the only active notification, and you can expand them individually to see more text and get action buttons for each notification in the bundle. If you get a lot of email, this becomes a rather ungainly list.

Few messaging apps support Nougat's new direct reply feature in notifications right now, but I can't wait for more of them to show up. This is quick reply done right: just expand the notification and tap in the embedded text box to type a reply. (A sketch of the API behind this feature appears at the end of this post.)

Multi-window has potential

One of the headlining features in Android 7.0 is support for multi-window apps in split-screen. This was quite exciting when Google announced it, but in practice it's a little disappointing. The basic functionality makes sense: you can long-press the overview button to shrink your current app down to the top half of the screen (left half in landscape) and bring up a list of open apps to choose one for the other half. The problem, though, is that developers don't have to support this feature at all. During the developer preview, any app would go into split-screen, even if it didn't work correctly. Now, if an app targets a very old API level or the developer specifically opts to block multi-window, it won't work.
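For the curious, here's a minimal sketch of what the multi-window hooks look like from the developer's side. The activity name is hypothetical, but isInMultiWindowMode() and onMultiWindowModeChanged() are the Activity methods Nougat adds at API level 24, and blocking the feature is done by setting android:resizeableActivity="false" on the activity in the manifest.

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.Toast;

// Minimal sketch of Nougat's multi-window hooks. The activity name is
// hypothetical; isInMultiWindowMode() and onMultiWindowModeChanged()
// are real Activity methods added in API level 24.
public class ExampleActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // At runtime, an app can check whether it is currently split-screened.
        if (isInMultiWindowMode()) {
            Toast.makeText(this, "Running in split-screen", Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void onMultiWindowModeChanged(boolean isInMultiWindowMode) {
        super.onMultiWindowModeChanged(isInMultiWindowMode);
        // React when the user drags the divider or enters/leaves split-screen,
        // e.g. by swapping to a more compact layout.
    }
}
```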
Most apps don’t have official support for multi-window, so you get a toast message pointing out that it might not work correctly. I haven’t seen any major issues, though. One of the apps that won’t work in split screen at all is Netflix, which would have been nice to have. Apps that don’t have full support for the new split-screen API won’t operate in the background when you’re interacting with the other app you have up. YouTube is a good example of what’s possible with split-screen apps. It continues playing in split-screen while you poke around in another app. It’s a cool experience. For multi-window to be truly useful, apps need to add full support. The double-tap shortcut for switching apps is fantastic, though. That’s going to be useful immediately. Customizable quick settings, finally! The quick settings UI has gotten a complete revamp in Nougat. Those are the settings toggles that show up at the top of the expanded notification pane. In Android 7.0, they are finally customizable. There’s an edit button that lets you add, remove, and rearrange the items. The first few buttons are also visible at the top of the un-expanded quick settings UI. This has already made a huge difference in my daily use. This might not have a significant impact on Android users at large, though. Most OEMs have had some form of customizable quick settings for a while. Still, this is a huge thing for those of us who prefer stock Android. However, if OEMs choose to adopt Google’s stock code for this feature (even if they re-skin it) we could have something great. There’s an API for tiles that developers can plug into. That means you can download more quick settings tiles from the Play Store. There are already a few that add things like VPN toggles and a live weather tile. Nougat, the kitty collector Those are all the things that will make an immediate impact on your smartphone experience when Nougat rolls out. That’s just scratching the surface, though. There are also seamless updates, the Vulkan graphics API, and an improved code compiler. You’ll also be able to play a built-in version of Neko Atsume, a weird kitty collector game. You leave treats in your quick settings to lure in unique virtual cats, which you can then share as images. That’s the Easter Egg in this version of Android. It’s more amusing and less frustrating than the Flappy Bird Easter Egg in Marshmallow. Nexus devices from the Nexus 6 onward are getting the Nougat OTA right now. That process should be complete in a week or two. The first devices from OEMs like Motorola and Samsung should get updates out by late fall.