Everything posted by XAMI
-
At yesterday’s Apple event, the company confirmed rumors that have been circulating for months: the iPhone’s headphone jack is no more. Quite a few people are unhappy about this, so in an interview with Buzzfeed, Apple’s VP of worldwide marketing Phil Schiller, VP of hardware engineering Dan Riccio, VP of iOS, iPad and iPhone marketing Greg Joswiak, and CEO Tim Cook defended the decision. The foursome said there were several reasons why Apple ditched the 3.5mm connection, the main ones being the amount of space it takes up, the way it hinders water resistance, and the fact it’s antiquated. “The audio connector is more than 100 years old,” said Joswiak. “It had its last big innovation about 50 years ago. You know what that was? They made it smaller. It hasn’t been touched since then. It’s a dinosaur. It’s time to move on.” Riccio, who called the jack a mere “hole filled with air,” said its inclusion had held Apple back when it came to adding new features to previous iPhones. “It was fighting for space with camera technologies and processors and battery life. And frankly, when there’s a better, modern solution available, it’s crazy to keep it around.” Some have claimed that Apple removed the connector so it could introduce a new DRM platform for audio consumption, which Schiller called “pure, paranoid conspiracy theory.” At the iPhone event, Apple said it was ditching the jack because of “courage,” a term that seems to have annoyed people even more. Maybe the company should have listened to its co-founder, Steve Wozniak, who last month said that the decision would “tick off a lot of people.” Intel is another company to have praised the advantages of removing the headphone jack. At its recent Developer Forum in San Francisco, architects Brad Saunders and Rahman Ismail claimed replacing the old connection with USB-C would improve both audio quality and smartphones in general. Despite the furor over Apple’s move, it’s unlikely to have much negative effect on iPhone 7 sales.
“Remember, we’ve been through this many times before,” says Schiller. “We got rid of parallel ports, the serial bus, floppy drives, physical keyboards on phones — do you miss the physical keyboards on your phone? … At some point — some point soon, I think — we’re all going to look back at the furor over the headphone jack and wonder what the big deal was.”
-
It’s no secret that Sony’s smartphone business is struggling. Competition has become fierce over the last couple of years, with continued innovation and lower prices causing trouble for market incumbents. Rather than doubling down on their flagship products, like HTC did with their excellent HTC 10, Sony seems happy to trundle along with minor iterations year after year. The new Xperia X series may not carry the same name as its predecessor, the Xperia Z, but there’s no mistaking these products for a revolutionary change. I've been testing the top-end Xperia X Performance for a few weeks now and I've found it to be remarkably similar to the Xperia Z5 that preceded it. Considering the Z5 was the sixth iteration of the 3-year-old Xperia Z, it’s easy to feel this design is getting stale. The similarities between the Xperia X Performance and the Xperia Z1 in particular are striking. Released in 2013, the Xperia Z1 came with a 5.0-inch 1080p display; in 2016 we’re still seeing a 5.0-inch 1080p display on the X Performance. Both phones are water resistant. Both phones have cameras with more than 20 megapixels, and both have dedicated shutter buttons. So what has Sony managed to achieve in three years? Well, the Xperia X Performance ditches the Snapdragon 800 SoC for the modern Snapdragon 820, bringing along better connectivity, more storage, and more RAM. The front camera is up to 13 megapixels now, while Sony claims the rear camera is their fastest ever. Unbelievably, we’re getting a smaller battery in a phone just as thick. Plus we’re getting some new software features, too. When I first picked up the Xperia X Performance at MWC 2016, and then again for this review, I couldn’t help but feel that the design is bland. I saw this exact sort of smartphone style from Sony more than three years ago, and it seems that as competitors have become slimmer and sexier, Sony’s designers have gone backwards.
There are very few aspects to the Xperia X Performance that are visually more appealing than my favorite Sony smartphone design: the Xperia Z3. Since that phone's release in 2014, Sony’s flagship device has gained a millimeter of thickness, lost the comfortable and stylish curved edges, and swapped premium materials for plastic in some areas. On Sony’s product page, the company is quick to highlight the sleek metal back panel, which I will admit looks great thanks to a subtle brushed finish and minimal distractions, but the plastic edges let this handset down; they look cheap and don’t carry the same textural pleasure. Sony has tried to color these edges to look the same as the rear panel, but it hasn’t worked: the visible seam that joins the back panel to the sides is unsightly, and the difference in luster between metal and plastic is obvious. Placed next to the similar Huawei P9, it’s clear what Sony should have done in designing the Xperia X Performance. The P9’s metal unibody looks and feels fantastic, as the premium materials curve seamlessly around the edges. The Xperia X Performance’s transition from metal to plastic looks ugly in comparison, and gives the impression that Sony’s creation is the cheaper device. It’s not. The metal back does have its advantages. There are no antenna lines which are usually necessary on a unibody design and can detract from the style. The Xperia X Performance is also easier to grip than its Xperia Z series predecessors, which used fragile glass backs that were prone to cracks, as well as being slippery fingerprint magnets. The front panel is constructed from a slab of glass protecting the display, which subtly curves into the edges, creating a swooshable feel. The bezels are average in size for a 5.0-inch smartphone, and are colored to match the edges and back panel. I’m not a fan of this single-tone design – it looks a bit boring – and the rose gold model I received to review is not something I’d normally choose to buy. 
That said, the X Performance is also available in white, black and “lime gold.” The Xperia X Performance is one of the very few remaining flagships to pack dual front-facing speakers, which provide a stereo experience when watching video and playing games. I love stereo speakers on the front of smartphones, and it’s sad to see companies like HTC move away from this. The quality of the X Performance’s speakers isn’t particularly great, but their volume is decent and I don’t expect amazing sound from such small drivers. Along the bottom edge of the Xperia X Performance is a micro-USB port, which is disappointing considering most high-end devices have transitioned to the more versatile USB-C port. Samsung is the other outlier here, as they have their Gear VR system which still uses micro-USB. Sony doesn’t have anything like that, so I’m puzzled as to why they didn’t update their charging and data port to the modern standard. The top edge features a 3.5mm headphone jack, while the left edge has a tray for either two nano-SIMs, or a nano-SIM and microSD card slot. I appreciate seeing dual-SIM functionality here, as it isn’t often found on high-end smartphones. On the right edge is the fingerprint sensor, which doubles as the power button. I was critical of the Xperia Z5’s fingerprint sensor in this location as it didn’t seem to work very well, but these issues have been resolved on the Xperia X Performance. This sensor is ludicrously fast to operate, much more accurate than its previous implementation, and the positioning is just about perfect. My only concern is that the tactile feedback from the button isn’t great, and it might've made the phone thicker than necessary. Below the fingerprint sensor is the volume rocker, which sits awkwardly low on the right-hand edge. A more comfortable location here would be above the fingerprint sensor, and there would be less chance of accidental presses as well.
Below the volume rocker is the dedicated two-stage camera button. The Xperia X Performance is water resistant, carrying IP65 and IP68 ratings that allow immersion in fresh water up to 1.5m for up to 30 minutes. It’s also dust tight and resistant to low pressure water jets. It’s always handy to have a water resistant phone for the times it accidentally gets dropped in the toilet or splashed with coffee, but under no circumstances should the X Performance be taken into salt water. As far as usability is concerned, the Xperia X Performance’s 5.0-inch display makes this smartphone easy to use with one hand. At 8.7mm thick it’s chunkier than I would have liked to see, especially as the phone doesn’t pack in a large battery, though it remains more portable than handsets with larger displays. Review By: TechSpot
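An aside on the IP65 and IP68 ratings mentioned above: the two digits grade ingress protection separately (first digit solids, second liquids). A minimal decoding sketch; the level descriptions are informal summaries, not the wording of the IEC 60529 standard:

```python
# Decode the two digits of an "IPxy" rating (IEC 60529).
# Only the levels relevant to this review are included, and the
# descriptions are informal summaries, not the standard's wording.

SOLIDS = {5: "dust protected", 6: "dust tight"}
LIQUIDS = {
    5: "protected against low-pressure water jets",
    8: "protected against continuous immersion beyond 1 m",
}

def decode_ip(rating: str) -> tuple[str, str]:
    """Return (solid protection, liquid protection) for e.g. 'IP68'."""
    solid, liquid = int(rating[2]), int(rating[3])
    return SOLIDS[solid], LIQUIDS[liquid]

print(decode_ip("IP65"))  # ('dust tight', 'protected against low-pressure water jets')
print(decode_ip("IP68"))  # ('dust tight', 'protected against continuous immersion beyond 1 m')
```

Carrying both ratings means the phone is certified for jets and for immersion, which is why reviews list the pair rather than IP68 alone.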
-
There are plenty of large phones to choose from these days: the iPhone 7 Plus, the Galaxy Note 7, the OnePlus 3, and many others that feature displays at least 5.5-inches in size, which can make them a bit cumbersome to use in one hand. But nothing compares to the Xiaomi Mi Max, a gigantic 6.4-inch smartphone/phablet that dwarfs what most people carry with them. This phone is, quite simply, ridiculously large. While this handset's size will suit only a small fraction of phone buyers, there’s actually a lot to like about the hardware inside. For just under $250 from Gearbest, who helped us acquire this phone for testing, you’re getting a massive display, a metal body with a fingerprint sensor, a 16-megapixel rear camera, and a powerful Qualcomm Snapdragon 650 SoC. The battery inside is just as enormous as the handset, at 4,850 mAh. Having now used (or, rather, struggled to use) the Xiaomi Mi Max for the last couple of weeks, I’ve learned a lot about what is physically required to operate a 6.4-inch handset on a daily basis. At $250 the Mi Max's price point is very attractive, but does the whole package deliver to earn our recommendation? Let’s find out. Clearly the 6.44-inch display makes this device significantly larger than anything else I’ve used for years. It’s roughly as thick as the iPhone 6s at 7.5mm, but it’s 15mm taller and a whopping 10.4mm wider. Width is the key dimension for determining one-handed usability, and any increases on the already-wide iPhone 6s Plus can make a phone hard to operate. The Mi Max compares even less favorably to the Samsung Galaxy Note 7, which packs a 5.7-inch display in a near bezel-free body. Here the Mi Max is nearly 20mm taller and 15mm wider, leading to a phone that’s 35% larger than an already-large device. Even with a screen-to-body ratio of approximately 75%, the size of this display introduces a number of complications during regular usage. In short, the Xiaomi Mi Max requires two hands to use. 
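The "35% larger" figure is easy to sanity-check from front-face area. A quick sketch; the millimetre dimensions are the manufacturers' published spec-sheet numbers, which are an assumption here rather than figures quoted in the review:

```python
# Sanity-check of the "35% larger than the Note 7" footprint claim.
# Height/width in mm are published spec-sheet dimensions (assumed,
# not taken from the review itself).

PHONES = {
    "Xiaomi Mi Max": (173.1, 88.3),
    "Galaxy Note 7": (153.5, 73.9),
}

def footprint_mm2(name: str) -> float:
    height, width = PHONES[name]
    return height * width  # front-face area in square millimetres

ratio = footprint_mm2("Xiaomi Mi Max") / footprint_mm2("Galaxy Note 7")
print(f"{ratio:.2f}")  # ~1.35, i.e. about 35% more front-face area
```

Roughly 20mm of extra height and 15mm of extra width compound into a third more slab to hold, which is exactly what the one-handed-use complaints below come down to.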
It’s impossible to use the Mi Max in one hand. I don’t have particularly large hands, but even with some serious acrobatic work my fingers are at least an inch from reaching the upper left corner of the display. Reaching the same corner on my 5.7-inch Nexus 6P is relatively easy in comparison, and that’s a phone that most people think is reasonably large. The extra screen real estate effectively works against the Mi Max in this regard. Being able to reach each corner of the display is critical as many smartphone applications place navigation elements or other buttons in these locations. In Gmail, for example, I am simply unable to hit the hamburger menu icon in the top left without performing a gymnastics routine with my fingers to appropriately position and grip the smartphone. While typing, attempting to hit the q, w or a keys gives my thumb a workout with all the stretching that is required. To make matters worse, the capacitive navigation buttons below the display are hard to hit with one hand, especially the app switching button on the far left side. Conversely, left-handers will struggle to hit the important back button, which normally I’d criticize for being in the wrong location, but on this phone I’m glad it’s the closest to my hand. In short, the Xiaomi Mi Max requires two hands to use. This sounds relatively trivial, but it’s not until you’re trying to use the phone on a regular basis that you discover how annoying it is. Carrying some shopping bags home from the store in one hand? Forget using the Mi Max in your other hand. Want to quickly send a message to a co-worker while sipping your morning coffee? No chance. If you’re comfortable using such a massive handset and don’t mind being restricted to two-handed use, there’s actually a lot to like about the Mi Max's design. The phone features an excellent metal back panel that curves around the left and right edges, which makes the handset look just as good as some flagships of the past few years.
The metal build also feels great in the hand, and provides good durability: for such a large phone there is barely any flex in the body. The design is hampered somewhat by plastic sections above and below the metal. These sections attempt to imitate the color and finish of the metal but don’t look nearly as good. Luckily they are the only downside to an otherwise great mid-range smartphone design. I appreciate the precision of the Mi Max’s design, which includes properly aligned elements such as the front camera and sensor array, and the two speaker grilles along the bottom edge. I was disappointed, however, that the Mi Max doesn’t include front-facing stereo speakers, which would complement the large, media-friendly display. The speakers on this phone are okay, but their positioning could be better. Review By: TechSpot
-
Welcome Karim
-
For those unwilling or unable to purchase Microsoft Office, the open source project OpenOffice has long been an excellent alternative. But it now looks as if the free productivity suite could be shut down unless more volunteer developers come on board. Ars Technica reports that Dennis Hamilton, volunteer vice president of Apache OpenOffice, sent out an email thread stating: "It is my considered opinion that there is no ready supply of developers who have the capacity, capability, and will to supplement the roughly half-dozen volunteers holding the project together." He added that no decisions had yet been made, but "retirement of the project is a serious possibility." Many of OpenOffice’s volunteers have left to work on LibreOffice - a fork of OpenOffice that launched in 2011. Its updates arrive more frequently than OpenOffice's: 14 in 2015 alone, which is a lot more than the single update OpenOffice received across the whole of last year. The dearth of volunteers has meant that dealing with security vulnerabilities has posed a problem. Apache informed users of a vulnerability in June that could let attackers craft denial-of-service attacks and execute arbitrary code. The company suggested users switch to Microsoft Office or LibreOffice as a solution. A patch that needed to be manually installed was released a month later, but security problems remain. Another issue faced by OpenOffice is that the few developers still working there are “aging,” and that working there isn’t “much of a resume builder.” Despite the lack of updates, OpenOffice was downloaded more than 29 million times on Windows and Mac last year, making a cumulative total of 160 million downloads since May 2012, according to project statistics. While there are plenty of people who want OpenOffice to continue by finding other ways of attracting new contributors, the signs aren’t looking good for the open source software.
-
With the release of Deus Ex: Mankind Divided's DirectX 12 patch, at least in a beta version, AMD has unleashed a new set of Radeon Software Crimson Edition drivers that further optimize the game in its DirectX 12 mode. The Radeon Software 16.9.1 drivers also include a new CrossFire profile for Dota 2's DirectX 11 mode. On top of this, AMD has included a range of bug fixes including those that fix flickering issues with 144 Hz displays, as well as crashes and flickering in games like GTA V, Dirt Rally, and Doom. Known issues in this driver include a problem with the AMD Gaming Evolved overlay that causes crashes in some games. Users should also be aware that upgrading to this new 16.9.1 driver may reset user settings in Radeon Settings to their default values, so it could be a good idea to take note of what you've changed in case this happens to you. As always, you can download the latest Radeon Software drivers through Radeon Settings automatically, or you can head over to our driver download section and grab a manual installer.
-
Sony on Wednesday introduced the world to its new family of PlayStation 4 consoles consisting of a slimmer version of the original and a more powerful variant called the PlayStation 4 Pro. The slimmer and lighter PlayStation 4 will replace the original console in the lineup. It’s functionally identical to the first-generation console, we’re told, although it is a bit more energy efficient and comes with a slightly different DualShock 4 controller. Codenamed Neo, the new PlayStation 4 Pro with 1TB hard drive builds on the success of the original with a faster processor, better graphics and support for glorious 4K resolution and HDR. Although it’s designed to get the most out of 4K televisions that support HDR, such sets aren’t a requirement. As PlayStation lead system architect Mark Cerny explained, even games on a 1080p TV will look better through the use of technologies like super-sampling and advanced anti-aliasing. Games on standard HD sets will also feature brighter colors and better reflections although it sounds like high-fidelity patches will be needed to bring out the best in older titles. PlayStation 4 Pro will also afford a much better overall experience when using PlayStation VR, the company’s upcoming virtual reality headset. Netflix is even developing an app for the console that’ll allow for 4K video streaming. One has to wonder if this will expedite Microsoft’s plans to bring Project Scorpio to market. It’s worth mentioning that HDR capabilities will be coming to all PlayStation 4 consoles via firmware update. The slimmer PlayStation 4 goes on sale September 15 priced at $299. If it’s the PlayStation 4 Pro you’re after, be prepared to wait until November 10 to get one for $399.
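On the super-sampling Cerny mentions: the idea is to render more pixels than the display has, then filter them down to the panel's native grid, which smooths edges even on a 1080p set. A toy illustration using an invented 4x4 grayscale "frame" and a simple 2x2 box filter; the console's real pipeline is far more sophisticated than this:

```python
# Minimal picture of super-sampling: render at a higher resolution,
# then average 2x2 blocks down to the display's grid. The "frame"
# values are invented purely for illustration.

def downsample_2x2(frame):
    """Average each 2x2 block of a 2D list into one output pixel."""
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[0]), 2):
            block = (frame[y][x] + frame[y][x+1] +
                     frame[y+1][x] + frame[y+1][x+1])
            row.append(block / 4)
        out.append(row)
    return out

hi_res = [[0, 4, 8, 8],
          [4, 0, 8, 8],
          [2, 2, 6, 6],
          [2, 2, 6, 6]]
print(downsample_2x2(hi_res))  # [[2.0, 8.0], [2.0, 6.0]]
```

Note how the checkerboarded top-left block collapses to an intermediate value: that averaging is what turns jagged high-contrast edges into smoother gradients on the lower-resolution output.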
-
I spent my first hour with Origin PC’s GeForce GTX 1080-powered, 34-inch ultra-wide curved screen Omni all-in-one gaming PC just staring at my desk. There’s a lot more of desk now. I didn’t know I had so much desk. Technically I spent the first hour looking for my power screwdriver to extricate the system from the obligatory wooden crate, all part of the Origin PC purchasing process. I imagine someone purchasing one of these (someone who doesn’t have to store a large wooden crate in their cramped apartment for several weeks while reviewing the system) would be tickled pink. We’ve been using the crate as a side table. But then, so much desk. I’d removed my normal work tower from the surface in order to make room, but I really didn’t have to. Even now, as I sit in front of the Omni writing this review, my eyes continuously shift left and right, anxious about the lack of clutter. Much better. This must be what a Mac user feels like all the time. An all-in-one computer combines a monitor and computer into a single unit, generally with a smaller footprint. Apple’s always been a big proponent of the design, and when I started seeing computers in school (I’m so old) they were mainly boxy Macs. When I got my first real computer, I was surprised a monitor was not attached to it. Technology has changed significantly since those days. We’ve got cellular telephones, color televisions, fire and the ability to stuff a powerful PC behind a top-of-the-line monitor, leaving plenty of room on my desk for toys. Enter the Origin Omni. Alone it looks like a strangely-thick 34-inch curved screen ultra-wide (3440 x 1440) quad HD monitor, which it is. It’s just got some junk in the trunk, and by junk I mean high-end PC parts. Strip away the back panel of the monitor, and you’ve got a full-sized motherboard, full-sized graphics card, memory, wires, fans, some cooling bits. You know, the stuff inside a computer, only Tetris’d into a much flatter form factor.
Adding only a small amount of thickness to the monitor (and a bunch of heft) transforms it into a relatively low-compromise PC enclosure. It’s got a full-size ASUS Z170I Pro Gaming motherboard fitted with a watercooled Intel Core i7 6700K Quad-Core 4.0GHz CPU, a Geforce GTX 1080 graphics card, 16GB of Kingston DDR4 memory and two hard drives, a 250GB Samsung SSD and a 2TB Seagate Sata drive. The only real compromise here is the power supply, an Origin-branded 450 watt unit, which is right on the edge of what the Geforce GTX 1080 card requires. But hey, it’s worked like a charm so far, so I won’t argue with results. Here’s the parts list, as configured at the Origin PC website:

Case: Origin Omni
Display Type: Omni 34" 3440 x 1440 Curved Ultra-Wide 60Hz Matte Display
Power Supply: Built-In 450 Watt
Motherboard: ASUS Z170I Pro Gaming
System Cooling: Built-In Closed Loop Liquid Cooling Solution for 1151
Processors: Intel Core i5 6500 Quad-Core 3.2GHz (3.6GHz TurboBoost)
Graphic Cards: Single 8GB NVIDIA GeForce GTX 1080 Founders Edition
Memory: 16GB Origin PC DDR4 Powered by Kingston 2666MHz (2 X 8GB)
Operating System: MS Windows 10 Home
Hard Drive One (Operating System Drive #1): FREE 250GB Samsung 750 EVO Series
Hard Drive Two: 2TB Seagate 5400RPM 2.5" Hard Drive
Audio: On Board High Definition 8-Channel Audio
Networking: Onboard Network Port
Warranty: Lifetime 24/7 U.S. Based Support and Lifetime Free Labor. 1 Year Part Replacement & 45 Day Shipping Warranty
Webcam: Omni Webcam (Included with Omni)
Price as configured: $3,101

The price looks high, but considering the price of a 34-inch curved QHD monitor standalone, the individual components and the lifetime support and free labor warranty, it all adds up to about right. Plus you get a free t-shirt, and a lovely wooden side table with built-in storage.
What’s Good

Performance: This is a gaming PC, and it games quite well, thanks in no small part to the Geforce GTX 1080 Founder’s Edition that comes packed inside of it. I reviewed the card and found it very impressive, delivering more than playable performance (at the very least 30 FPS) on most modern games at 4K resolution. But this system isn’t 4K. It’s Ultra Wide 3440 x 1440. That means it’s not pushing as many pixels as a 4K screen, but it’s delivering an enhanced experience in supported games. We’re talking 60 frames per second easy at highest settings on most games I played, including World of Warcraft, Deus Ex: Human Revolution, World of Warcraft, Hitman, a nifty little Steam futuristic racing game called Redout, Rise of the Tomb Raider and World of Warcraft. Let’s take Rise of the Tomb Raider as an example. In my 1080 review I clocked it at an average of 44 frames per second on high settings at 4K. The same settings running at wide QHD delivered 66 frames per second on average. Plus the views are so much more expansive.

Profile: It’s a monitor, at least that’s what it looks like sitting on your desk. I must admit that there’s a part of me that’s having an issue with not having a big box with random LED lighting sitting on top of my desk, but that part of me is silly. This is an elegant computer solution that makes me feel bad for making fun of all the folks in the office typing directly into their iMac monitors.

The Panel: Curved monitors are stupid, until you spend an hour in front of one. It’s not exactly immersion so much as it is a deeper intimacy between you and your display, like it’s constantly trying to embrace you but it just can’t. I’ve fiddled with ultra-wide monitors that weren’t curved, and there was always something just a little bit off. The curve makes a ton of difference when you’re working with the extra real estate, bringing the far edges of the screen a little closer.
Note the monitor also has HDMI in and picture-in-picture capabilities, so it doesn’t just have to be your PC. I only wish the panel had a better refresh rate than 60Hz. Otherwise it’s bright, inviting and—best of all—you can’t really tell there’s a PC behind it unless you’re looking for it.

What’s Not So Good

The Noise: When the fans of the Origin Omni all-in-one kick in, you know it. I didn’t notice it much at first, when I had the system set up in my always-noisy living room, but once I made the move to my office it’s hard to ignore. The noise is low but constant during everyday use. When a game gets going it can rise to a low roar, with a bit of a rattle while ramping up. Not quite deal-breaking levels of noise, but loud enough to bear mentioning.

Port Access: The ports for the motherboard and graphics card are located on the underside of the monitor, and they can be a bitch to get to. While there are a pair of USB ports on the back and one up top (for the webcam), the rest of the Omni’s plugs and ports are on the underside, accessed either by tilting the system back on its stand or turning it around and removing a strip of back plate. Either way it’s a bit of a hassle—I’d suggest getting a USB hub and forgetting the underports entirely.

All-In-One Gaming Works

Thank goodness for the flat panel display, so thin that we can hide all sorts of lovely things behind it. Between Apple systems and consumer-aimed IBM compatibles (there’s me aging myself), all-in-one systems have been enjoying a renaissance over the past decade. Until recently however, I’ve been looking at them as either work machines or easy-to-use, less-powerful PCs of the sort I might buy a parent for Christmas. The Origin Omni (and systems like it) has shown me that all-in-one systems can be powerful gaming systems as well, as long as they’ve got a big enough monitor up front to hide all the goodies. Now to get this beast back in its crate. Gonna miss that side table. Review By: TechSpot
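A footnote on the Rise of the Tomb Raider numbers in this review: the gap between 4K and ultra-wide QHD performance is mostly raw pixel count, which is easy to verify:

```python
# Raw pixel counts behind the 44 fps (4K) vs 66 fps (ultra-wide QHD)
# Rise of the Tomb Raider results quoted in the review.

pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_uwqhd = 3440 * 1440   # 4,953,600 pixels

print(f"{pixels_uwqhd / pixels_4k:.0%} of 4K's pixels")  # 60% of 4K's pixels
print(f"{66 / 44:.2f}x the frame rate")                  # 1.50x the frame rate
```

The GPU pushes about 40% fewer pixels per frame at 3440 x 1440, which lines up neatly with the observed 1.5x jump in frame rate.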
-
It sounds like something out of a B-grade Hollywood plot — a flash drive that you plug into a computer and is capable of destroying it within seconds. Last year, hacker Dark Purple disclosed a USB flash drive designed to fry a modern system as soon as you plug it in. The drive works by discharging -220V through the USB port. The exact details on how the drive functioned weren’t immediately released. But there’s now a Hong Kong-based company selling a USB Kill Drive 2.0 for just $50. Here’s how the company describes the product: The USB Kill 2.0 is a testing device created to test USB ports against power surge attacks. The USB Kill 2.0 tests your device’s resistance against this attack. The USB Kill collects power from the USB power lines (5V, 1 – 3A) until it reaches ~ -240V, upon which it discharges the stored voltage into the USB data lines. This charge / discharge cycle is very rapid and happens multiple times per second. The process of rapid discharging will continue while the device is plugged in, or the device can no longer discharge – that is, the circuit in the host machine is broken. The integrated nature of modern SoCs means that blasting the USB controller with -200V the way this drive does will typically cause severe damage, up to and including destroying the SoC. While modern motherboards include overcurrent protection, this typically protects against positive voltage. (The difference between positive and negative voltage is a reference to the voltage relative to the ground). If the voltage source is connected to ground by a “-” terminal, the voltage source is positive. If it connects via the “+” terminal, the voltage source is negative. The company also plans to sell a USB Kill Tester Shield, which it claims will prevent both the USB Kill device from functioning and protect user data from certain kinds of snooping or intrusion if you hook up to an unknown charging station or other device. 
This kind of intrusion is known as “juice jacking,” though it’s not clear if this attack vector has been widely used in the real world. There’s not much to say about the Kill Tester Shield at the moment — all of the links on the website to the actual product are non-functional as of this writing. Caveat emptor is good advice in a situation like this. The larger question, I think, is whether devices like this pose a threat to the average consumer. Right now, I think they don’t. If these cost $5, it’d be easy to imagine someone ordering them in bulk and scattering them around just to screw with people in general. At $50 each, you probably aren’t going to stumble over a tiny block of death. At the same time, however, studies have shown that up to 50% of people will cheerfully plug in a USB drive they found on the ground without taking precautions for what kind of data or malware might be on the drive. If the USB Kill 2.0 is actually shipping in volume, it’s probably a good idea to revisit that tendency — or at least keep an old computer around for testing.
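As a closing aside, the charge/discharge cycle the vendor describes can be pictured as a simple loop. Only the 5V input and the roughly -240V trigger threshold come from the product description; the per-cycle voltage step is invented here purely for illustration:

```python
# Toy model of the USB Kill charge/discharge cycle described above.
# SUPPLY_V and TRIGGER_V come from the vendor's description; STEP_V
# is an invented figure for illustration only.

SUPPLY_V = 5.0       # drawn from the USB power lines (5V, 1-3A)
TRIGGER_V = -240.0   # discharge threshold claimed by the vendor
STEP_V = -12.0       # assumed voltage pumped per converter cycle

def cycles_until_discharge() -> int:
    """Count converter cycles before one discharge pulse fires."""
    stored = 0.0
    cycles = 0
    while stored > TRIGGER_V:
        stored += STEP_V   # inverting converter drives the charge negative
        cycles += 1
    return cycles

print(cycles_until_discharge())  # 20 cycles at -12 V per step
```

The vendor claims this whole loop repeats multiple times per second, which is why the damage happens almost instantly after the stick is plugged in.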
-
Welcome killer
-
Custom silicon vendor Movidius has attracted a lot of attention for its high-performance, low-power chips that have powered vision applications like Google Tango, as well as making machine learning possible on mobile devices. Now it has received the ultimate compliment. Chip giant Intel has acquired it to help accelerate its RealSense project and other efforts to provide computer vision and deep learning solutions. Intel is expecting to see Movidius technology deployed in drones, robots, and VR headsets — in addition to more traditional mobile devices such as smartphones and tablets.

The Movidius advantage

Power requirements are the traditional Achilles heel of mobile solutions that require substantial computation, with vision and machine learning being two of the most extreme cases. By creating optimized, custom silicon — its Myriad chip family — Movidius has reduced the power needed to run machine learning and vision libraries by well over an order of magnitude compared to a more-general-purpose GPU.

RealSense

After a lot of initial excitement, Intel’s first-generation RealSense products — designed to provide devices with a 3D view of their surroundings to support mapping, navigation, and gesture recognition — faltered due to technical shortcomings. However, Intel has more than redoubled its efforts, and is aiming to make RealSense the eyes and ears of the Internet of Things, which Intel believes will comprise over 50 billion devices by 2020. Intel Senior VP Josh Walden likens vision processors such as Movidius’s Myriad to the “visual cortex” of IoT devices.

Intel taking aim at Nvidia’s GPU strategy

This move takes Intel further into Nvidia’s home turf. Nvidia has bet big on high-performance computing for AI, self-driving cars, vision, and VR — the exact markets Intel is trying to move into with its RealSense platform, and now the Movidius acquisition.
This pits Nvidia’s strategy of providing the most possible general computing power per watt against Intel’s custom silicon. On paper, the advantages of each are fairly straightforward. General purpose GPU (GPGPU) computing provides the most flexibility and adaptability, while custom silicon can be more efficient when running a specific task or library — once it has been developed. In the market, expect to see plenty of design wins for both Intel and Nvidia, and some leapfrogging of each other as subsequent product generations roll out from each.
-
A little late for me to say happy birthday, DarkJesus? I wish you lots of success in life, and keep moving forward, buddy!
-
Android 7.0 Nougat has finally reached Google’s Nexus devices after more than five months of developer preview testing. The final version is more stable and has a ton of new features. Most phones won’t get a Nougat update for a few months at least, and that’ll only happen if the carrier and OEM consider it a high priority. What can you expect when that glorious day finally arrives? I’ve been using Nougat on both a Nexus 6P and a Pixel C for the last few days. I won’t bother reciting all the features, a simple list of which you can find on Google’s site. Let’s talk about what it’s actually like to use Android 7.0 Nougat as a daily driver.

Doze mode gets better

In my opinion, Doze Mode in Android 6.0 was one of the most important features to ever show up in Android. It addressed an ongoing issue with phones that could cause problems even for the most experienced users. Sometimes you’d install an app, and it would misbehave in the background, draining your battery while the phone sat idle. Doze Mode shuts all that down. In Marshmallow, leaving the phone sitting for about 30 minutes would activate Doze mode. Almost all apps would be put to sleep and not permitted by the system to wake up until a regular update window was reached or you picked up the phone. Important push notifications would still arrive while in Doze. In Nougat, Doze Mode does the same thing, but it turns on in more situations. It’s no longer limited only to when a device is stationary. Thus, if the phone is in your pocket, Doze Mode can still be activated. This has made a noticeable positive impact on my battery life over the last few days.

New notifications will take some getting used to

Notifications are one of the stock Android features most OEMs implement without a ton of changes. So, a lot of users will be getting the new bundled Nougat notifications in the coming months. They’re much more powerful than notifications in Marshmallow, but also a little overwhelming at first.
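Conceptually, the bundling boils down to grouping active notifications by their source app. A toy sketch of that idea, with app names and message snippets made up for illustration:

```python
# Sketch of Nougat-style bundling: collapse each app's active
# notifications into one expandable group. The app names and
# snippets here are invented examples.
from collections import defaultdict

notifications = [
    ("Gmail", "Re: invoice"),
    ("Gmail", "Build failed"),
    ("Messages", "See you at 6?"),
    ("Gmail", "Weekly digest"),
]

def bundle(items):
    """Group (app, snippet) pairs into app -> [snippets]."""
    groups = defaultdict(list)
    for app, snippet in items:
        groups[app].append(snippet)
    return dict(groups)

bundled = bundle(notifications)
print(bundled["Gmail"])  # three snippets under one expandable item
print(len(bundled))      # 2 apps -> 2 top-level notifications
```

Each top-level item then expands back into its individual entries, which is exactly the behavior described next.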
Nougat bundles all the active notifications from an app together in a single expandable item. For example, if you’ve got several Gmail notifications, you can expand the Gmail notification to see each one individually right in the notification shade. You get the full snippet preview of each one as if it were the only active notification, and you can expand them individually to see more text and get action buttons for each notification in the bundle. If you get a lot of email, this becomes a rather ungainly list. Messaging apps that support Nougat’s new direct reply feature in notifications are limited right now, but I can’t wait for more of them to show up. This is quick reply done right: just expand the notification and tap the embedded text box to type a reply.

Multi-window has potential

One of the headlining features in Android 7.0 is support for multi-window apps in split-screen. This was quite exciting when Google announced it, but in practice it’s a little disappointing. The basic functionality makes sense: you can long-press the overview button to shrink your current app down to the top half of the screen (the left half in landscape) and bring up a list of open apps to choose one for the other half. The problem, though, is that developers don’t have to support this feature at all. During the developer preview, any app would go into split-screen, even if it didn’t work correctly. Now, if an app is on a very old API level or the developer specifically opts to block multi-window, it won’t work. Most apps don’t have official support for multi-window, so you get a toast message pointing out that they might not work correctly. I haven’t seen any major issues, though. One of the apps that won’t work in split-screen at all is Netflix, which would have been nice to have. Apps that don’t have full support for the new split-screen API won’t operate in the background while you’re interacting with the other app you have up.
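The bundling behavior can be modeled with a small standalone sketch. This is illustrative grouping logic only — not Android’s actual notification API — and all names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only: groups active notifications per app, the way
// Nougat collapses them into one expandable bundle per app.
class NotificationBundler {
    // Each entry is {appName, snippet}; arrival order is preserved per app.
    static Map<String, List<String>> bundle(List<String[]> active) {
        Map<String, List<String>> bundles = new LinkedHashMap<>();
        for (String[] n : active) {
            bundles.computeIfAbsent(n[0], k -> new ArrayList<>()).add(n[1]);
        }
        return bundles;
    }

    // Collapsed view: one line per app with a count, e.g. "Gmail (3)";
    // expanding the bundle reveals the individual snippets.
    static String collapsedSummary(String app, Map<String, List<String>> bundles) {
        return app + " (" + bundles.get(app).size() + ")";
    }
}
```

Three Gmail notifications and one SMS would collapse to "Gmail (3)" plus a separate Messages item, and expanding the Gmail bundle surfaces each snippet individually — the behavior described above.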
YouTube is a good example of what’s possible with split-screen apps: it continues playing in split-screen while you poke around in another app. It’s a cool experience, but for multi-window to be truly useful, apps need to add full support. The double-tap shortcut for switching apps is fantastic, though. That’s going to be useful immediately.

Customizable quick settings, finally!

The quick settings UI has gotten a complete revamp in Nougat. Those are the settings toggles that show up at the top of the expanded notification pane, and in Android 7.0 they are finally customizable. There’s an edit button that lets you add, remove, and rearrange the items, and the first few buttons are also visible at the top of the un-expanded quick settings UI. This has already made a huge difference in my daily use. It might not have a significant impact on Android users at large, though, since most OEMs have had some form of customizable quick settings for a while. Still, this is a huge thing for those of us who prefer stock Android, and if OEMs choose to adopt Google’s stock code for this feature (even if they re-skin it), we could have something great. There’s an API for tiles that developers can plug into, which means you can download more quick settings tiles from the Play Store. There are already a few that add things like VPN toggles and a live weather tile.

Nougat, the kitty collector

Those are all the things that will make an immediate impact on your smartphone experience when Nougat rolls out, but they just scratch the surface. There are also seamless updates, the Vulkan graphics API, and an improved code compiler. You’ll also be able to play a built-in take on Neko Atsume, a weird kitty-collector game: you leave treats in your quick settings to lure in unique virtual cats, which you can then share as images. That’s the Easter egg in this version of Android, and it’s more amusing and less frustrating than the Flappy Bird Easter egg in Marshmallow.
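The quick settings customization model described above — an ordered list of tiles whose first few slots surface in the collapsed shade — can be sketched in a few lines. This is a hypothetical model for illustration, not Android’s real tile API, and the slot count is an assumption:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative model of customizable quick settings: an ordered list of
// tiles where the first few slots are shown in the collapsed shade.
class QuickSettings {
    private final List<String> tiles = new ArrayList<>();
    private static final int COLLAPSED_SLOTS = 5; // assumed slot count

    void add(String tile) { tiles.add(tile); }

    void remove(String tile) { tiles.remove(tile); }

    // Rearranging: move a tile to the front so it shows even when collapsed.
    void moveToFront(String tile) {
        if (tiles.remove(tile)) tiles.add(0, tile);
    }

    // Tiles visible at the top of the un-expanded quick settings UI.
    List<String> collapsedRow() {
        return new ArrayList<>(tiles.subList(0, Math.min(COLLAPSED_SLOTS, tiles.size())));
    }
}
```

A downloaded VPN toggle, for instance, would start at the end of the list and only appear in the collapsed row after the user drags it forward.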
Nexus devices from the Nexus 6 onward are getting the Nougat OTA right now. That process should be complete in a week or two. The first devices from OEMs like Motorola and Samsung should get updates out by late fall.
-
Over the past few years, Google has been working on a modular smartphone concept that would allow users to swap certain components and customize the device. Potential add-ons ran the gamut from improved cameras to larger batteries. Initially, Google implied that core blocks like the CPU, display, and sensors might be swappable, but the company changed the design spec earlier this year to a much more limited platform in which the CPU, GPU, sensors, battery, and display were all locked down, leaving little reason to buy an Ara at all. Google seems to have read the writing on the wall and killed the project altogether. While the company hasn’t made a formal announcement yet, Reuters has reported that the cancellation reflects an ongoing effort to streamline Google’s hardware projects and bring them under a single division. The company has also just killed its Pixel 2 Chromebook without announcing a replacement. Project Ara’s design was interesting, but it was never clear how much of a market the device would actually have, or whether it would ever offer replacement CPUs or displays. The problems were significant: while desktop PC hardware is extremely modular, it is also big. Mobile devices are designed for tight integration, which Project Ara would have intrinsically lacked. In a conventional SoC, the CPU, GPU, and I/O hardware all live on the same slice of silicon, so replacing one of these components means replacing all of them. Android would have had to adopt a more Windows-like model of shipping with a large library of drivers, and would have had to teach people how to download new drivers for advanced hardware modules.
The theory behind Project Ara was sound: a modular smartphone would generate much less waste, allow users to repair broken devices by replacing modules, and create a vibrant ecosystem of third-party components that would extend the usefulness and capability of smartphones in new ways (possibly even culminating in a tricorder-like device out of Star Trek). The problem is, it doesn’t play nice with the laws of physics. Building modular devices with separate interconnects and magnetic locks introduces wear-and-tear issues that can damage those components and leave them non-functional in the long term. This was a problem Google reportedly struggled to solve: for the interconnects to function properly, the contacts had to mate tightly, then maintain that contact over hundreds or thousands of removals and replacements. Gold is often used for these types of contacts, but gold is also comparatively easy to scratch or abrade. And while it’s not clear how much power the interconnects in Project Ara consumed, it would have been over and above that of current smartphones. These issues, plus Google’s desire to consolidate its hardware efforts, appear to have killed Project Ara. It’s not clear what comes next for the hardware Google designed. There are rumors that it might license the technology to other partners, but with Google’s own efforts having failed, few companies may be interested.
-
Apple announced the iPhone 7 today, along with its A10 microprocessor. We’ll have to wait a bit to see how the new CPU compares with the A9 in terms of overall performance, but Apple shared some significant details of what to expect from the new chip. Past rumors have suggested a second-generation 14nm SoC built at TSMC, but Apple said nothing about either its process node or its foundry of choice. What we do know is that the A10 is a quad-core, 3.3-billion-transistor CPU (the A9 contained more than two billion transistors) with an estimated 40% performance advantage over the A9, according to Apple.

Big.LITTLE-ish

Ever since it debuted the iPhone 4s, Apple has stuck to a dual-core policy for its iPhones, even as its Android competitors steadily boosted their core counts. It’s become common for flagship Android devices to field four to eight cores: either a single cluster of high-end CPUs, as in the Snapdragon 820, or a combination of high-end and low-power CPUs in a unified cluster via ARM’s big.LITTLE. When ARM debuted big.LITTLE, it was by no means certain that the idea would take off. As we covered at the time, making the large and small CPU clusters talk to each other effectively required some heavy lifting from both SoC manufacturers and Android itself. Intel, meanwhile, believed it could use dynamic voltage and frequency scaling to address the market with a single CPU cluster, rather than relying on separate clusters of high-performance and high-efficiency cores that share data and workloads amongst themselves. Apple isn’t calling its own implementation big.LITTLE, but it appears to be the same concept: the A10 combines two high-performance cores, claimed to be 40% faster than the A9, with two high-efficiency cores that draw one-fifth the power of the high-performance pair. The controller that manages these workloads is custom Apple silicon, so we don’t know how well Apple’s solution will compare with ARM’s.
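The core idea behind both big.LITTLE and Apple’s approach can be illustrated with a toy placement rule. This is purely a sketch — Apple’s performance controller is proprietary, and the threshold value here is invented:

```java
// Toy illustration of big.LITTLE-style task placement: light loads stay on
// the efficiency cluster, heavy loads migrate to the performance cluster.
class ClusterScheduler {
    enum Cluster { EFFICIENCY, PERFORMANCE }

    // Invented threshold: fraction of an efficiency core's capacity above
    // which a task should migrate to a performance core.
    static final double MIGRATE_THRESHOLD = 0.8;

    static Cluster place(double load) { // load in [0.0, 1.0]
        return load > MIGRATE_THRESHOLD ? Cluster.PERFORMANCE : Cluster.EFFICIENCY;
    }
}
```

The battery win comes from the left side of that branch: background work that would otherwise keep a big core awake runs on a core drawing a fraction of the power.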
Apple could be facing a learning curve here: the first chips to implement big.LITTLE didn’t actually do so properly, and it took several silicon and Android iterations before the standard was fully supported in both hardware and software. Alternatively, Apple may have been developing the technology through several internal revisions and only rolled it out once it had everything nailed down. Apple is also claiming that the new GPU inside the iPhone 7 delivers 50% more graphics performance than the A9 and 240x the performance of the original iPhone. These gains appear to have been plotted against the iPhone 6s rather than the 6s Plus. The iPhone 7’s battery life is said to have improved as a result of these changes, with more than two hours of additional run time compared with the iPhone 6s and one hour more than the iPhone 6s Plus. Apple’s decision to use a technology similar to big.LITTLE should pay dividends in battery performance, but we’ll have to wait for more details before we can compare the two approaches. The latest revisions of big.LITTLE allow workloads to be shared across all the CPU cores in a device, provided its thermal budget allows it; Apple may have duplicated this functionality with the A10 Fusion. Apple also claims its screens are now 25% brighter than before, and these kinds of changes can have a significant impact on battery life. Heading into the event there were rumors that the iPhone would use an Intel modem rather than a Qualcomm chip, but if that’s true, Tim Cook didn’t mention it. The Fusion technology Apple is debuting here is a significant shift in its device strategy, and it’ll be interesting to see how the new hardware affects overall SoC performance. The GPU is presumably from Imagination Technologies, which has built the graphics hardware for every iPhone, and there’s no sign Apple changed suppliers for the iPhone 7.
-
Welcome Rahul
-
We love Android, but rooting your phone can give you the opportunity to do so much more than your phone can do out of the box, whether it’s wireless tethering, speeding it up with overclocking, or customizing the look of your phone with themes. Rooting, for those of you who don’t know, means giving yourself root permissions on your phone. It’s similar to running programs as an administrator in Windows, or running a command with sudo in Linux. With a rooted phone, you can run apps that require access to certain system settings, as well as flash custom ROMs to your phone, which add all sorts of extra features. There are a ton of different Android phones out there, and while some rooting methods might work for multiple phones, there is no one-size-fits-all guide for rooting every phone. As you learn more about the rooting process, you’ll probably run into a bunch of terms that can be confusing. Here are some of the most important ones and what they mean.

Root: Rooting means you have root access to your device; that is, it can run the sudo command and has enhanced privileges that let it run apps like Wireless Tether or SetCPU. You can root either by installing the Superuser application or by flashing a custom ROM that includes root access.

ROM: A ROM is a modified version of Android. It may contain extra features, a different look, speed enhancements, or even a version of Android that hasn’t been released for your phone yet.

Stock: “Stock” refers to a few different things, depending on the context. When we refer to “stock Android,” we mean the Google-built version you’d find on Nexus devices, with no extra UI changes like HTC Sense or Samsung TouchWiz. Many ROMs, like CyanogenMod, are based on stock Android with some additions, while others are based on the version that came with your phone.
In other cases, “stock” can also mean the version of Android that came with your phone; e.g., if you want to get rid of your ROM and return your phone to factory settings, you might say you’re “going back to stock.”

Kernel: A kernel is the component of your operating system that manages communications between your software and hardware. There are a lot of custom kernels out there for most phones, many of which can speed up your phone and increase your battery life, among other things. Be careful with kernels, though, as a bad one can cause serious problems with your phone and possibly even brick it.

Bootloader: Your bootloader is the lowest level of software on your phone, running all the code that’s necessary to start your operating system. Most bootloaders come locked, meaning you can’t flash custom recoveries or ROMs. Unlocking your bootloader doesn’t root your phone directly, but it does allow you to root and/or flash custom ROMs if you so desire.

Recovery: Your recovery is the software on your phone that lets you make backups, flash ROMs, and perform other system-level tasks. The default recovery on your phone can’t do much, but you can flash a custom recovery, like ClockworkMod or TWRP, after you’ve unlocked your bootloader, which will give you much more control over your device. This is often an integral part of the rooting process.

Brick: To brick your phone is to break it during flashing or other acts. There is always a small risk with flashing, and if your phone becomes unable to function (that is, it basically becomes a brick), you’ve bricked your phone. The risk is very small, however, and more often than not people say “brick” when they really mean “it turns on but doesn’t boot properly,” which is a very fixable problem.

What’s the difference between rooting, unlocking, and flashing a ROM?

This can be confusing, since the three practices are often performed at the same time.
We’ve detailed some of this above, but briefly: unlocking your bootloader is usually the first step in the process and allows you to flash a custom recovery. From there, you can give yourself root access or flash a ROM. Root access isn’t required to flash a ROM, but almost all custom ROMs come with root access built in.

Is rooting illegal?

No. Technically it once was, but exemptions to the DMCA have made it legal for most phones (though not necessarily tablets). Either way, it’s hard to imagine anyone actually enforcing this rule.

Will rooting void my warranty?

Yes. Unlocking your bootloader will void the warranty on your phone, even if your manufacturer provides a way for you to do it. That said, if you need warranty service for a hardware issue, you can sometimes unroot your phone and take it in for service with no one the wiser. However, some phones have a digital “switch” that flips when you unlock the phone and that is very difficult or impossible to revert, so do your research before unlocking if you want to preserve your warranty.

Could rooting brick my phone?

It’s possible, but pretty unlikely. As long as you follow instructions carefully, you probably won’t brick anything (but we’re not responsible, yadda yadda yadda). Flashing custom kernels and radios is a little riskier than just rooting or flashing ROMs, but again, if you follow directions you should be okay. Keep in mind that bricking your phone means it won’t turn on or function at all; if you’re stuck in a boot loop or boot straight to recovery, your phone is not bricked, and it can be fixed.

To be honest, I prefer being a root user, because when you buy a phone it comes with pre-installed apps that you can’t delete, and they take up space on the phone; as a root user you can uninstall those apps and do much more.
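The ordering described above can be captured as a simple dependency check. This is a conceptual sketch, not a real flashing tool — the actual commands and steps vary by device:

```java
// Conceptual dependency check for the typical flow:
// unlock bootloader -> flash custom recovery -> root and/or flash a ROM.
class RootingFlow {
    // A custom recovery can only be flashed once the bootloader is unlocked.
    static boolean canFlashRecovery(boolean bootloaderUnlocked) {
        return bootloaderUnlocked;
    }

    // Root access is NOT required to flash a ROM; a custom recovery is.
    static boolean canFlashRom(boolean customRecoveryInstalled) {
        return customRecoveryInstalled;
    }
}
```

In other words, the only hard prerequisite chain is bootloader, then recovery; rooting and ROM flashing both hang off that chain independently.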
-
This year has been difficult for smartphones, which is a bit of a paradox when you consider just how much better things have gotten compared to last year. With the Snapdragon 820, 650, 652, and 625, we’ve finally moved past the shadow of the Snapdragon 810, 808, and 617/615. And while there were Android devices that shipped with the Exynos 7420, they were often paired with a modem that was not necessarily the most power efficient. Despite all of this, there seems to be a general disappointment with smartphones. People are increasingly finding it hard to justify phones like the HTC 10 or Galaxy S7 given competition from OnePlus, Xiaomi, and even Apple with the iPhone SE. In this context the Galaxy Note7 brings much of the flavor of the Galaxy S7 edge, but blends it with the S-Pen of the Note line and a few new features like the iris scanner. If you were paying attention to the industry around the launch of the Galaxy S6 and Galaxy Note5, this is very much more of the same, rather than the major redesign we saw moving from the Galaxy S5 to the Galaxy Note 4. The spec sheet illustrates what I mean: the Galaxy Note7 is almost identical to the Galaxy S7 edge, but sees a minor bump in size and the addition of an S-Pen. Of course, the Galaxy Note7 is a big step up from the Note5, but for perspective it’s generally more interesting to look at recent smartphone launches to contextualize the device under test. For the first time we’re really starting to see the impact the S-Pen has on internal volume: the Galaxy S7 edge is slightly smaller than the Galaxy Note7 but actually has a larger battery, which wasn’t the case when comparing the Galaxy S6 edge+ and Galaxy Note5. Of course, the S-Pen does provide functional value if regularly used, so it’s a trade-off the end user has to consider. While we’re still on the subject of the S-Pen, it no longer breaks the phone if inserted backwards.
It also has a thinner 0.7mm tip and a bit of extra precision for pressure sensing, but we’ll take a closer look at that later on. Other than the addition of the S-Pen and a slightly larger display, the Galaxy Note7 also gains a USB-C port relative to the Galaxy S7 edge, which makes the connector reversible. It supports USB 3.1 Gen 1, but the cable in the box is USB 2 only, which seems to be a popular trend among recent flagships. There’s also the addition of the iris scanner, which supports iris recognition for a single pair of eyes. Other than these changes, the Galaxy Note7 at a high level is rather difficult to tell apart from the Galaxy S7 edge.