Everything posted by Clayboy™
-
happy birthday bro!
-
welcome man
-
Google Pixel Fold and Pixel 7a smartphones are expected to make their debut soon. There have been multiple reports and leaks surrounding these handsets over the past few months. The Pixel Fold, expected to debut as Google's first ever foldable smartphone, is a highly anticipated handset. New reports have leaked the probable launch date of the Google Pixel 7a and Pixel Fold. The purported prices of these phones and the colour options they are expected to arrive in have also been leaked online.

According to a 9to5Google report citing retail sources, the Google Pixel 7a is expected to be priced at $499 (roughly Rs. 40,900) in the US. This price is $50 (roughly Rs. 4,100) more than that of its predecessor, the Google Pixel 6a, which was launched last year. The jump in price can reportedly be attributed to an upgrade in the quality of raw materials used. The primary camera of the Pixel 7a is likely to feature a 64-megapixel sensor, accompanied by a 13-megapixel ultrawide lens. The phone is also expected to be powered by the Tensor G2 chipset, which has also been used in the Pixel 7 and Pixel 7 Pro smartphones. The upcoming Pixel handset is also expected to sport a new 90Hz display and wireless charging support.

Previous reports suggested that the Google Pixel Fold will cost between $1,300 and $1,500 (roughly Rs. 1,07,400 to Rs. 1,23,935). These prices are considerably higher than those of the other models in the Pixel lineup. The upcoming Pixel Fold is still expected to be cheaper than the Samsung Galaxy Z Fold 4, which is priced at $1,799.99 (roughly Rs. 1,42,700) in the US. A tweet by tipster Jon Prosser (Twitter: @jon_prosser) suggests that the Google Pixel 7a will likely launch in Charcoal, Snow, Sea (light blue), and Coral colour options — the last colourway will only be available at the Google Store, according to Prosser. The phone will be launched on May 10 during the Google I/O event and will also reportedly be immediately available for purchase in the US.

In another post on Twitter, Prosser shared design renders of the upcoming Pixel Fold handset and claims that Google's first foldable smartphone is also likely to be launched on May 10 alongside the Pixel 7a at the Google I/O event. The tipster claims that the phone will be available for purchase in the US starting June 27, with pre-orders from the Google Store beginning on May 10 and pre-orders from other partners or carriers starting May 30. The phone is seen in two colour options — black and white — in the renders shared by the tipster. https://www.gadgets360.com/mobiles/news/google-pixel-7a-fold-price-leak-launch-date-may-10-expected-3957918
-
AMD's custom EPYC 9V84 CPU, part of the Genoa lineup, has leaked on the Chinese third-party seller platform Goofish. Custom AMD EPYC 9V84 Genoa CPU Makes Its Way To Chinese 3rd Party Seller, Features 96 Zen 4 Cores & Massive Cinebench Performance

Yesterday, the same platform leaked the first AMD Genoa-X CPU, the EPYC 9684X, and today, we get to see another unreleased Genoa chip, the EPYC 9V84. Just like the chip that leaked yesterday, the EPYC 9V84 is an engineering sample, which is to be expected from Goofish sellers since they mostly source these chips from various warehouses & companies where such samples are sent out. Mostly, these units are supplied in trays for early evaluation & testing, and that's the reason why they are not tagged as final retail units.

Coming to the specifications, the EPYC 9V84 CPU seems to be a custom design by AMD for one of its many customers. Previously, Microsoft's Azure cloud services have featured EPYC CPUs with the "V" identifier, though they aren't the only ones to do so. The CPU packs 96 Zen 4 cores based on the TSMC 5nm process node. There are 192 threads, 384 MB of L3, and 96 MB of L2 cache, which is also what we find on the EPYC 9654 CPU, the current flagship. However, since the AMD EPYC 9V84 Genoa chip is an ES CPU, it comes with much lower clocks, rated at 2.0 GHz base and 2.4 GHz boost. For comparison, the EPYC 9654 features a base clock of 2.4 GHz and a boost clock of 3.7 GHz. The memory support and PCIe lanes are the same, at 128 PCIe Gen 5.0 lanes and 12-channel DDR5 support. The seller also posted a Cinebench R23 benchmark of the AMD EPYC 9V84 Genoa custom CPU, which scores an impressive 111,383 points at stock. The AMD EPYC 9V84 Genoa custom ES CPU is listed for 21,000 RMB or $3,000 US, which is a quarter of the price of the EPYC 9654 that retails close to $12,000 US.
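As a quick sanity check on the listing math above, the numbers do work out to roughly a quarter of retail; note that the RMB-to-USD exchange rate used below is an assumption for illustration, not a figure from the listing:

```python
# Rough check of the listing price quoted above: 21,000 RMB vs the
# EPYC 9654's ~$12,000 US retail price. The exchange rate is an
# assumed round figure, not something stated in the article.

RMB_PER_USD = 7.0          # assumed rough exchange rate
listing_rmb = 21_000
epyc_9654_usd = 12_000

listing_usd = listing_rmb / RMB_PER_USD
fraction_of_retail = listing_usd / epyc_9654_usd

print(f"Listing: ~${listing_usd:,.0f} US")              # ~$3,000, as quoted
print(f"Fraction of retail: {fraction_of_retail:.0%}")  # 25%, i.e. a quarter
```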
But we have to remember that running ES CPUs like these can be a major hassle, and although the seller states that the chip should run fine on an SP5 motherboard, there's no proper way to verify this. It likely runs on an older BIOS on one of these boards, but updating to newer firmware might simply break that compatibility. https://wccftech.com/amd-custom-epyc-9v84-genoa-cpu-leak-96-zen-4-cores-2-4-ghz-clocks-over-110k-points-cinebench/
-
Today, CAPCOM announced Monster Hunter Now, a mobile game created by Niantic for iOS and Android devices. Monster Hunter Now will take advantage of Niantic's proven location-based technology to usher the hunt into the real world. The press release is extremely scant on details, though it does say it'll be possible to join other players for cooperative hunts, and it also clarifies that monsters encountered while the app was closed will still be able to be hunted at a later time. Either you or your companion will be able to 'catch' a monster with an item called Paintball so that it can be hunted alone or in co-op at home. Monster Hunter Now is targeting a September launch, while a closed beta test begins on April 25th. Signups are available now on the official website.

Monster Hunter Now is far from the first franchise game released for mobile devices. Monster Hunter Dynamic Hunting launched in 2011 on iOS, followed by Monster Hunter Freedom Unite (an enhanced version of Monster Hunter Freedom 2) in 2014, Monster Hunter Explore in 2015 (on both iOS and Android), and more recently by 2020's Monster Hunter Riders (again available on both iOS and Android).

Clearly, CAPCOM is seeking to capitalize on its newest golden goose (Monster Hunter World became by far the best-selling game ever released by the Japanese company, and Monster Hunter Rise is no slouch, either) in the mobile market with the help of Niantic's proven location-based technology. That said, while Pokémon Go was an incredible success, Niantic's subsequent releases didn't garner the same excitement. Harry Potter: Wizards Unite similarly offered players the chance to encounter the fantastic beasts imagined by J.K. Rowling in the real world, but the game failed to catch on and was shut down last year. A similar fate befell Catan: World Explorers, while Transformers: Heavy Metal was cancelled before even launching. Pikmin Bloom didn't do great, either.

This year, Niantic launched NBA All-World, and they've also got the virtual pet augmented reality game Peridot and the location-based MARVEL World of Heroes coming out later in 2023. https://wccftech.com/niantics-monster-hunter-now-brings-the-hunt-to-the-real-world/
-
Xiaomi 13 Ultra is scheduled to launch in China on April 18. Ahead of the launch, the Chinese manufacturer has started teasing several aspects of the phone. Most recently, the company revealed some details about the camera sensors used in the smartphone. Now, the company has given us a first official look at the design of the Xiaomi 13 Ultra, and it looks more like a camera than a smartphone. The upcoming flagship is expected to succeed the Xiaomi 12S Ultra and will feature Leica-tuned cameras.

In a recent Weibo post, Xiaomi CEO Lei Jun shared a couple of promotional images of the Xiaomi 13 Ultra that tease the design of the upcoming handset. It is a silhouette image of the smartphone with a camera-like design. Jun wrote in the post, “With the blessing of this suit, the phone turns into a camera! Mi 13 Ultra has a full range of professional photography. There are so many fun things in this set,” and added that more information will be divulged and discussed at the press conference. Even though it is not entirely clear from the post, it can be assumed that the camera-like bump and grip seen in the promotional image are part of an external case that can be attached to and detached from the Xiaomi 13 Ultra. It is important to note that the Xiaomi 12S Ultra Concept allowed users to attach a Leica M-series lens module to its body, so it is possible that Xiaomi will implement a similar mechanism this time.

The company says that the Xiaomi 13 Ultra “is a professional imaging device, not a camera phone” (translated). The official post also adds that at the April 18 launch press conference, a unique model of the phone will be given away for “re-evaluation.” Earlier today, the handset was confirmed to feature Leica-tuned cameras with special Summicron lenses and Sony IMX989 and Sony IMX858 sensors. The quad camera setup will include a 50-megapixel Sony IMX989 sensor and three 50-megapixel Sony IMX858 sensors that are said to offer improved noise reduction technologies and HDR features.

According to previous reports, the Xiaomi 13 Ultra could be powered by a Qualcomm Snapdragon 8 Gen 2 SoC with up to 16GB of RAM and up to 512GB of internal storage. The handset will reportedly include a 6.7-inch WQHD+ AMOLED LTPO display with a refresh rate of 120Hz. The smartphone is also tipped to come with a 32-megapixel front-facing camera. The Xiaomi 13 Ultra is expected to ship with Android 13-based MIUI 14. It could also house a 4,900mAh battery with 90W fast charging capabilities, as per leaked reports. https://www.gadgets360.com/mobiles/news/xiaomi-13-ultra-design-teaser-leica-camera-body-launch-april-18-3948880
-
Intel has released its latest performance-boosting driver for Arc graphics, which delivers one hell of an improvement across various AAA games. Intel Arc and Iris Graphics 31.0.101.4311 driver makes its way online with support for several games and performance upgrades with DirectX 12

The latest Game On Driver, version 31.0.101.4311, which is still in beta, matches the releases of Dead Island 2, Minecraft Legends, and Boundary to offer day-one support. Additionally, the blue team prepared the driver for Total War: Warhammer III's newest downloadable content, Mirror of Madness. You can find the latest drivers here for download.

As per recent Game On Drivers from Intel, the company wants to assure gamers that the performance of its graphics cards, their ray tracing capabilities, and their PPD, or Performance Per Dollar, are superior to NVIDIA's GeForce RTX 3060 12GB GPU. But why is Intel pushing Total War: Warhammer III's DLC today? It turns out that Mirror of Madness was co-developed by the company. Let's look at the numbers to see the performance gains from going with Intel instead of NVIDIA.

Comparing the Intel Arc A750 against the NVIDIA GeForce RTX 3060 12 GB on the previous driver and the current driver, Intel increased the average frames per second in the newly updated Dead Space game, jumping up an additional 30 fps in 1080p Ultra modes and 20 fps in 1440p High modes. NVIDIA's card outshined Intel's previous driver, version 4257, but is almost even with the newest update, if not losing by a skinny margin of two to five frames. Performance per dollar shows that in the Intel-optimized Total War: Warhammer III DLC, Mirror of Madness, the company claims a 62 to 72 percent advantage over NVIDIA's GPU. Remember that the Intel Arc A750 GPU is currently $249, while NVIDIA's RTX 3060 12 GB is averaging $388.

The same cards are compared in the new releases Boundary, Dead Island 2, and Minecraft Legends, showing anywhere from 37% to as much as 63% better performance per dollar. Ray tracing tells a similar story in Intel's slides. Older games like DIRT 5, Metro Exodus: Enhanced Edition, F1 22, and Deathloop show minor increases from Intel's previous Game On driver to today's release. And, looking at the two system configurations, Intel did not hide any driver versions or extra components, giving users full transparency into the two test systems. Below is the new changelog for Intel Game On Driver version 31.0.101.4311, which can be downloaded here:

Intel Game On Driver support on Intel Arc A-series Graphics for:
- Boundary
- Minecraft Legends
- Total War: Warhammer III – Mirror of Madness
- Dead Island 2

https://wccftech.com/intel-arc-gpus-get-another-major-performance-boosting-driver-up-to-63-improvement-across-latest-aaa-games/
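For readers curious how a figure like "37% to 63% better performance per dollar" is derived, here is a minimal sketch: fps divided by price for each card, then the ratio of the two. The fps values below are hypothetical placeholders; only the card prices ($249 for the Arc A750, $388 for the RTX 3060 12 GB) come from the article.

```python
# Sketch of a performance-per-dollar (PPD) comparison of the kind shown in
# Intel's slides. The fps values passed in are hypothetical placeholders;
# only the prices ($249 Arc A750, $388 RTX 3060 12 GB) are from the article.

def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second delivered per dollar of GPU cost."""
    return avg_fps / price_usd

def ppd_advantage_pct(fps_a: float, price_a: float,
                      fps_b: float, price_b: float) -> float:
    """Percent PPD advantage of card A over card B."""
    return (perf_per_dollar(fps_a, price_a) /
            perf_per_dollar(fps_b, price_b) - 1) * 100

# Even at identical fps, the price gap alone gives the $249 card a ~56% edge:
print(round(ppd_advantage_pct(100, 249, 100, 388)))  # 56
```

Any measured fps lead on top of the price gap pushes the advantage higher, which is how the larger figures in Intel's slides arise.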
-
A few hours ago, Naughty Dog and Iron Galaxy released another meaty (8.6 GB in size) patch for The Last of Us Part I PC. The new update, version 1.0.3, comes with a lot of visual, audio, and user interface fixes. First of all, the game features two new audio compatibility options. The first one, called Audio Output, lets users select which sounds are played through Windows' spatial sound driver, if there is one. This setting is meant to improve audio clarity in The Last of Us Part I for those who may hear muffled or excessively quiet dialogue and sounds. The developers recommend selecting Spatial mode in those cases. The second new audio setting is called Latency. As you can easily guess, it allows tweaking the brief delay between an audio request and audio playback. Higher latency values are recommended for those using low-end CPUs or experiencing any sound distortion issues. You'll have to restart The Last of Us Part I before any changes to this setting take effect.

There's a whole lot of visual fixes, too. NVIDIA users can now expect the DLSS sharpness slider to actually affect the game's sharpness, while AMD users won't have to deal with the corrupted environment bug that sometimes happened with the player character's flashlight when FSR 2 was enabled. Similarly, a bug that showed worse texture quality than the one selected in the settings has been fixed. Here's the (near) full changelog for The Last of Us Part I PC patch 1.0.3.
- Updated keyboard and mouse (KBM) controls to allow players to reassign arrow keys
- Updated the 'Building Shaders %' user interface (UI) so progress is tracked more evenly
- Restored audio in the End Credits that are accessed via the main game, Left Behind, or Extras Menu
- Fixed a crash that may occur when opening a collectible in the backpack UI then attempting to restart or quit the game
- Fixed a crash that may occur while sitting (for extended times) or entering into combat areas
- Fixed an issue where texture quality in-game appears lower than the targeted quality setting
- Fixed an issue where the player's backpack UI could fail to render after altering Render Scale (Options > Display > Resolution Scaling > Scaling Mode > Render Scale)
- Fixed an issue where an enemy NPC may T-pose if Joel performs a single input quickturn while holding said enemy
- Fixed an issue where toggling the player character's flashlight may cause the environment to visibly shift momentarily
- Fixed an issue where using the flashlight in darker areas may make the lighting appear corrupted
- Fixed an issue where lighting and fog may appear lower resolution on Ultra settings
- Fixed an issue where the VRAM usage UI did not properly update when lowering the display resolution
- Fixed an issue where water reflections may appear corrupted or pixelated
- Fixed an issue where the Quicktime Events UI prompts were not rendering on Minimum spec setups
- Fixed an issue where rapidly moving left and right while aiming may cause unintended camera shifts
- Fixed an issue where the Steam and Epic clients’ collectible tracking did not match the in-game collectible tracking, preventing achievements from unlocking
- Fixed an issue where, if playing at a higher FPS, player animations may not play correctly

The Last of Us Part I PC was released in late March, sporting a series of technical issues.
However, Naughty Dog and Iron Galaxy have been rather quick to patch most of them out. In other news, there's a first-person mod in the works, though it is unclear if and when it will be released publicly. https://wccftech.com/the-last-of-us-part-i-pc-patch-1-0-3-packs-a-ton-of-visual-audio-and-ui-fixes/
-
Samsung Galaxy F54 5G is tipped to launch later this month in India. The company is yet to officially confirm the launch date of the upcoming Galaxy F-series handset. However, a tipster suggests that the Samsung Galaxy F54 5G will launch in India in the last week of April. The handset is likely to be available via Flipkart, much like other Galaxy F-series smartphones. The tipster has also leaked some key specifications of the phone.

Tipster Abhishek Yadav claims that the Samsung Galaxy F54 5G will sport a 6.7-inch sAMOLED display with a full-HD+ resolution. The display is said to have a layer of Corning Gorilla Glass 5 on top for added protection against scratches and accidental drops. Samsung is likely to use its Exynos 1380 SoC under the hood, as per the tipster. Yadav claims that the phone will feature LPDDR4x RAM and UFS 2.2 storage. He had not revealed the RAM and storage options at the time of writing. However, we can expect the Galaxy F54 5G to launch with up to 8GB of RAM and at least 128GB of storage.

On the back, the phone is tipped to sport a triple camera setup. There could be a 108-megapixel main camera sensor with optical image stabilisation (OIS) support. The handset is also said to feature an 8-megapixel ultra-wide camera and a 2-megapixel tertiary sensor. For selfies, the phone is tipped to get a 32-megapixel front camera.

The phone is said to launch as a rebadged version of the Samsung Galaxy M54 5G, which was launched earlier this year in the Middle East. If true, the Galaxy F54 5G will have a hole-punch display. Yadav claims that the phone will also pack a 6,000mAh battery and support 25W fast charging. Further, the tipster suggests that the Galaxy M54 5G is likely to weigh about 199g and measure 8.4mm in thickness. It will support Wi-Fi 6 and Bluetooth 5.3, and feature a USB Type-C port. Lastly, the phone is tipped to boot Android 13-based One UI 5.1 out of the box. https://www.gadgets360.com/mobiles/news/samsung-f54-5g-specifications-india-launch-timeline-leaked-galaxy-3947841
-
AMD has officially introduced its first workstation graphics cards based on RDNA 3 GPUs, the Radeon Pro W7900 & W7800. AMD RDNA 3-Powered Radeon Pro W7900 & W7800 GPUs Official: Up To 48 GB VRAM At Half The Price of NVIDIA's RTX 6000 Ada

The AMD Radeon Pro W7900 & Radeon Pro W7800 graphics cards are the first workstation parts to feature the Navi 31 "RDNA 3" GPU. The workstation cards are said to offer exceptional performance per dollar versus the competition and to accelerate workstation workloads such as content creation and rendering at unbelievable speeds versus the prior generation. The AMD Radeon PRO W7900 and AMD Radeon PRO W7800 graphics cards are built on the groundbreaking AMD RDNA 3 architecture, delivering significantly higher performance than the previous generation and exceptional performance per dollar compared to the competitive offering. The new graphics cards are designed for professionals to create and work with high-polygon-count models seamlessly, deliver incredible image fidelity and color accuracy, and run graphics and compute-based applications concurrently without disruption to workflows.

AMD Radeon PRO W7000 Series graphics cards feature the world’s first workstation GPU architecture based on AMD’s advanced chiplet design, providing real-world multi-tasking performance and incredible power efficiency. The new graphics cards are also the first professional workstation GPUs to offer the new AMD Radiance Display Engine featuring DisplayPort 2.1, which delivers a superior visual experience, higher resolutions, and more available colors than ever before. Following are some of the features of the Radeon Pro W7000 series graphics cards:

- AMD RDNA 3 Architecture – New compute units share resources between rendering, AI, and raytracing to make the most effective use of each transistor, offering approximately 50% more raytracing performance per compute unit than the previous generation. AMD RDNA 3 architecture also features optimizations for AEC, D&M, and M&E workflows for rendering, video editing, and multitasking.
- Advanced Chiplet Design – The world’s first workstation GPUs with a chiplet design provide higher performance and greater efficiency than the previous generation. The design includes the new 5nm Graphics Compute Die (GCD) that provides the core GPU functionality, along with six new 6nm Memory Cache Dies (MCDs), each with second-generation AMD Infinity Cache technology.
- Dedicated AI Acceleration and Second-Generation Raytracing – New AI instructions and increased AI throughput deliver over 2X more performance than the previous AMD RDNA 2 architecture, while second-generation raytracing technology delivers significantly higher performance than the previous generation.
- Up To 48 GB of GDDR6 Memory – Allows professionals and creators to work with the largest 3D models and environments, edit and layer complex timelines using the latest digital cinema camera formats, and render photorealistic, raytraced images with unparalleled quality. Professional applications that can take advantage of the larger framebuffer include Adobe Premiere Pro & After Effects, Autodesk 3ds Max & Maya, Blender, Boris FX Sapphire, Dassault Systèmes SOLIDWORKS Visualize, DaVinci Resolve, Lumion, Maxon Redshift, and many more.
- AMD Radiance Display Engine with DisplayPort 2.1 – Supports the highest resolutions and over 68 billion colors, and offers support for higher-refresh-rate displays compared to the AMD RDNA 2 architecture and current competitive offerings. Display outputs support next-generation displays and multi-monitor configuration options, creating an ultra-immersive visual environment.
- AV1 Encode/Decode – Dual encode/decode media engines unlock new multi-media experiences with full AV1 encode/decode support designed for high resolutions, wide color gamut, and high-dynamic-range enhancements.
- Optimized Driver Performance – All AMD Radeon PRO workstation graphics cards are supported by AMD Software: PRO Edition, which provides a modern and intuitive user interface. Radeon PRO Image Boost renders visuals higher than a display’s native resolution to optimize image quality and resolution, while Radeon PRO Viewport Boost dynamically adjusts viewport resolution, boosting framerates and navigation performance in select applications.

The W7900 is rated at a peak TDP of 260W and delivers peak compute performance of 45.2 TFLOPs in single-precision (FP32) workloads. As for the design of this card, it makes use of a dual-slot cooling solution that houses a blower-style fan. The card offers 3 DisplayPort 2.1 outputs and a single Mini DisplayPort 2.1 output. Power input comes from dual 8-pin connectors, and the card is based on the PCIe 4.0 x16 interface, though it's also backward compatible.

Now in terms of pricing, the AMD Radeon Pro W7900 will retail at $3999 US, around half the price of NVIDIA's flagship workstation graphics card, the RTX 6000 Ada. Compared to the last-gen RTX A6000, which costs around $5000 US, the W7900 delivers better I/O and video capabilities while also beating it handily in SPECviewperf (Geomean). The Radeon Pro W7800, on the other hand, is capable of achieving the same feat against the RTX A6000 at an even lower price of $2499 US, though it comes with 32 GB of VRAM. https://wccftech.com/amd-radeon-pro-w7900-48-gb-w7800-32-gb-rdna-3-workstation-gpus-official-half-the-cost-of-nvidia-rtx-6000-ada/
-
We here at Wccftech have been following the development of zero-gravity shooter Boundary for quite some time (the game has suffered several delays), and now the game has finally arrived, or at least an early version of it has. Boundary launches into Steam Early Access today, and Chinese indie developer Studio Surgical Scalpels has released a new teaser trailer for the game, which you can check out below.

As is often the case with early access titles, Boundary will be offering various “founders packs” for those who support the game early. Here’s a rundown of the various packs…

Basic Edition - $25
- One copy of Boundary
- Unicorn head charm
- Frontrunner badge
- Sun weapon decoration

Deluxe Supporter Edition - $35
- Everything in Tier 1, PLUS:
- Red Operator Skin
- Red Weapon Skin
- “Nanshan Security Group” Badge
- Season 1 Battle Pass (when released)

Ultimate Supporter Edition - $45
- Everything in Tier 1 and Tier 2, PLUS:
- Master Badge
- Master Head Charm
- Master Weapon Skin
- Master Operator Skin
- Master Weapon Decoration

In addition, the Ultimate Supporter Edition comes with the Self-Reactive torque screwdriver (melee weapon), one new operator (when released), a digital artbook, and a digital soundtrack.

One of the main reasons Boundary initially made headlines was its then-groundbreaking use of ray tracing, but partway through the game's development, RTX and DLSS were dropped in favor of AMD’s FSR 2 and Intel’s XeSS. In an interview with Wccftech, Boundary’s devs tried to explain the move…

“Unfortunately, we need to remove Ray Tracing and DLSS from the EA version. The main reason is that our development resources cannot support multiple technical features, especially pure technical features, which means that this feature will not bring substantial improvements to gameplay. Therefore, we lowered the priority of this feature over the past year. After struggling for a long time, we finally decided to drop it from the launch version. This decision was not easy, as we are a team of technology-driven game developers, especially since we spent a lot of time doing ray tracing benchmarks for Boundary. Both [FSR 2 and XeSS] will be supported in the game. And specific thanks to AMD, who very much provided us with great technical and resource support to make sure FSR 2 performs extremely well in Boundary. AMD has been a wonderful partner these last few months.”

Boundary is available on PC via Steam Early Access now. The full version of the game is also coming to PS5, and the devs have hinted an Xbox version may come later. https://wccftech.com/boundary-steam-early-access-launch-new-trailer/
-
Oppo A1 5G launched in China on Wednesday, and it is available for purchase in a single storage configuration. The phone was recently spotted on certification sites like Geekbench, TENAA, and China Telecom, suggesting its impending launch. The A1 5G smartphone is powered by Qualcomm's Snapdragon chipset and is backed by a 5,000mAh battery with 67W wired SuperVOOC fast charging support. The company is also reportedly working on a 300W SuperVOOC wired fast charging system paired with a 4,450mAh battery (4,600mAh typical), which is expected to launch within the year.

Oppo A1 5G price

Available for purchase in a single configuration of 12GB of RAM and 256GB of internal storage, the Oppo A1 5G is priced at CNY 1,999 (roughly Rs. 23,800). The company has not yet confirmed plans for a global or Indian variant of the model. The handset is offered in Caberia Orange, Ocean Blue, and Sandstone Black colour options. The Caberia Orange variant comes with a lychee-patterned leather back panel and textured car stitching. The phone is up for pre-order on Oppo's website.

Oppo A1 5G specifications, features

The dual nano SIM-supported Oppo A1 5G features a 6.71-inch full-HD+ (2400 x 1080) LCD display with a refresh rate of up to 120Hz, a touch sampling rate of 360Hz, and a pixel density of 391ppi. The display also offers a peak local brightness of 680 nits. Powered by a Qualcomm Snapdragon 695 chipset, the Oppo A1 5G comes with up to 12GB of LPDDR4X RAM and 256GB of UFS 2.2 internal storage. The device ships with Android 13-based ColorOS 13.

The dual rear camera unit of the Oppo A1 5G includes a 50-megapixel primary sensor and a 2-megapixel sensor, accompanied by an LED flash. The lenses are housed in two vertically aligned circular modules on the top-left of the back panel. The 8-megapixel front camera is placed inside a centred hole-punch slot at the top of the display. The Oppo A1 5G packs a 5,000mAh battery with 67W SuperVOOC wired fast charging support. The phone comes with a side-mounted fingerprint scanner for security, a USB Type-C port, and a 3.5mm audio jack. It also offers 5G, Wi-Fi 802.11a/b/g/n, Bluetooth 5.1, GPS, GLONASS, and BeiDou connectivity. The handset weighs 193 grams and measures 165.6mm x 76.1mm x 8.25mm. https://www.gadgets360.com/mobiles/news/oppo-a1-5g-price-cny-1999-launch-specifications-features-3942209
-
NVIDIA has officially unveiled its brand-new GeForce RTX 4070 graphics card, which aims to offer 100+ FPS in 1440p gaming at $599 US. NVIDIA Unleashes Its GeForce RTX 4070 Graphics Card, Ada Lineup Now Starting At $599 US

NVIDIA's GeForce RTX 4070 is the latest addition to the Ada Lovelace gaming lineup, and the green team is positioning it as a tremendous upgrade for existing GeForce GTX 1080 and GeForce RTX 2070 users. The graphics card also brings the starting price of the Ada gaming lineup down to $599 US, making it the lowest-priced next-gen card yet. Besides adding more performance and efficiency, it brings a range of new features, including:

- New Streaming Multiprocessors (SM) – The new SM delivers up to 2x performance and power efficiency
- 4th Generation Tensor Cores and Optical Flow – Enable and accelerate transformative AI technologies, including the new frame-rate-multiplying NVIDIA DLSS 3
- 3rd Generation RT Cores – Up to 2x ray tracing performance, delivering incredibly detailed virtual worlds like never before
- Shader Execution Reordering (SER) – SER improves ray tracing operations by 2x, boosting FPS up to 44% in Cyberpunk with RT: Overdrive Mode
- DLSS 3 – A revolutionary breakthrough in AI-powered graphics that massively boosts performance using AI to generate additional high-quality frames
- NVIDIA Studio – Unmatched performance in 3D rendering, video editing, and live streaming
- AV1 Encoders – The 8th generation NVIDIA Encoder (NVENC) with AV1 is 40% more efficient than H.264, enabling new possibilities for streamers, broadcasters, and video callers

The NVIDIA GeForce RTX 4070 uses a cut-down AD104 GPU configuration with 46 SMs for a total of 5888 CUDA cores. The GPU comes packed with 36 MB of L2 cache, 184 Texture Mapping Units, and a total of 64 ROPs. The clock speeds for the graphics card are rated at 1920 MHz base and 2475 MHz boost. The card offers 29 TFLOPs of FP32, 67.4 TFLOPs of RT, and 466 TOPs of INT8 compute output.

NVIDIA states that increasing the L2 cache by 9 times on the RTX 4070 versus the RTX 3070 (36 MB vs 4 MB) improves performance, reduces latency, and increases power efficiency, as data access can remain on-chip (rather than having to rely on memory bandwidth).

As for memory specs, the GeForce RTX 4070 features 12 GB of GDDR6X memory clocked at 21.0 Gbps across a 192-bit bus interface. This provides up to 504 GB/s of bandwidth. This is the same memory configuration as used by the RTX 4070 Ti, and 4 GB more memory versus the RTX 3070 series graphics cards.

NVIDIA GeForce RTX 4070 12 GB "Official" TBP - 200W
NVIDIA GeForce RTX 3070 8 GB "Official" TBP - 220W

As far as power consumption is concerned, the TBP is rated at 200W. The card will be powered by a single 16-pin connector, which can theoretically deliver up to 600W of power. Custom models will offer higher TBP targets but are also open to using standard 8-pin connectors. Just for comparison's sake:

NVIDIA GeForce RTX 4090: 83 TFLOPs (FP32) (2.5 GHz boost clock)
NVIDIA GeForce RTX 4080: 49 TFLOPs (FP32) (2.5 GHz boost clock)
NVIDIA GeForce RTX 3090 Ti: 40 TFLOPs (FP32) (1.86 GHz boost clock)
NVIDIA GeForce RTX 4070 Ti: 40 TFLOPs (FP32) (2.6 GHz boost clock)
NVIDIA GeForce RTX 3090: 36 TFLOPs (FP32) (1.69 GHz boost clock)
NVIDIA GeForce RTX 4070: 29 TFLOPs (FP32) (2.48 GHz boost clock)
NVIDIA GeForce RTX 3080: 29 TFLOPs (FP32) (1.71 GHz boost clock)
NVIDIA GeForce RTX 3070 Ti: 21 TFLOPs (FP32) (1.77 GHz boost clock)
NVIDIA GeForce RTX 3070: 20 TFLOPs (FP32) (1.75 GHz boost clock)

Based on a boost clock speed of 2.48 GHz, you get up to 29 TFLOPs of compute performance, and you can definitely squeeze out a lot more with an overclock, as we demonstrated with the RTX 4090. One should remember that compute performance doesn't necessarily indicate overall gaming performance. https://wccftech.com/nvidia-geforce-rtx-4070-official-5888-cores-12-gb-g6x-memory-599-us-1440p-gaming-at-100-fps/
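The compute and bandwidth figures quoted above follow from simple arithmetic on the card's specs: peak FP32 is CUDA cores x 2 FMA ops per clock x boost clock, and bandwidth is the per-pin data rate x bus width / 8. A quick check against the article's numbers:

```python
# Verifying the RTX 4070 figures quoted above from its core specs:
# peak FP32 = CUDA cores x 2 ops/clock (fused multiply-add) x boost clock,
# bandwidth = per-pin data rate x bus width / 8 bits per byte.

def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput in TFLOPs."""
    return cuda_cores * 2 * boost_clock_ghz / 1000

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 4070: 5888 CUDA cores at a 2475 MHz boost clock,
# 21.0 Gbps GDDR6X on a 192-bit bus.
print(f"{peak_fp32_tflops(5888, 2.475):.1f} TFLOPs")  # 29.1 -> the quoted ~29 TFLOPs
print(f"{memory_bandwidth_gbs(21.0, 192):.0f} GB/s")  # 504 GB/s, as quoted
```

The same two formulas reproduce the rest of the comparison list (e.g. the RTX 3070's 5888 cores at 1.73 GHz land at roughly 20 TFLOPs), which is why FP32 throughput scales with both core count and clock rather than tracking gaming performance directly.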
-
Since the launch of the Xbox Series X/S and PS5, console players have become used to most major games offering multiple visual modes – usually a Quality/Resolution mode and a Performance mode, at the very least. Well, despite being a major first-party Xbox title, it seems the upcoming Redfall won't be offering any such options. According to an official tweet, Redfall will only offer a single "Quality" visual mode on console at launch. On Xbox Series X this will be 4K/30fps, and on Xbox Series S it will be 1440p/30fps.

Redfall is launching on Xbox consoles with Quality mode only:
- Xbox Series X: 4K 30 FPS
- Xbox Series S: 1440p 30 FPS
A 60 FPS Performance mode will be added via game update at a later date.

Now, I'm actually not one of those people who think anything below 60fps is some sort of death-knell for a game. 30fps can feel just fine if the game is designed around it. That said, major first-party titles like Redfall exist to show off the publisher's hardware, or at least that's the way it ought to be. A first-party title not offering multiple visual modes at launch hints at a lack of polish or technical issues to me. Also, as Digital Foundry's John Linneman has pointed out, Arkane has had consistent issues with frame-pacing at 30fps in the past, so hopefully that's not a problem here.

Haven't been keeping up with Redfall? Here's everything you need to know about the game. Wccftech's Alessio Palumbo was able to go hands-on with a preview version of the game and found it to be one of Arkane's more atmospheric titles to date: "I could definitely see that the Arkane DNA is clear in this game, albeit with a few differences. First of all, the atmosphere is top-notch, and it's one of the reasons why I believe this game will be best enjoyed solo for the first playthrough." ...although even on PC, the game was no technical champ: "The preview session took place on PC (the specs of which were not shared by Bethesda).
The build didn't exactly shine on the optimization front, but there's no telling how old it was, so it's hard to predict how the game will fare at launch."

Redfall launches on PC and Xbox Series X/S on May 2.

https://wccftech.com/redfall-only-30fps-quality-mode-on-xbox-series-x-s/
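Frame pacing is about how evenly frames are delivered within a fixed budget: at 30 fps every frame should arrive in roughly 33.3 ms, and a game that alternates between, say, 16.7 ms and 50 ms frames still averages 30 fps but feels juddery. A hypothetical sketch of that check (the frame times below are made-up illustrative values, not Redfall measurements):

```python
def frame_budget_ms(fps: int) -> float:
    """Time available per frame at a given target frame rate."""
    return 1000.0 / fps

def is_evenly_paced(frame_times_ms, target_fps=30, tolerance_ms=2.0):
    """Evenly paced if every frame time lands near the target budget."""
    budget = frame_budget_ms(target_fps)
    return all(abs(t - budget) <= tolerance_ms for t in frame_times_ms)

print(round(frame_budget_ms(30), 1))                    # 33.3 ms per frame
print(is_evenly_paced([33.4, 33.2, 33.3, 33.5]))        # True: smooth 30 fps
print(is_evenly_paced([16.7, 50.0, 16.7, 50.0]))        # False: averages 30 fps but judders
```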
-
v2, text and effect
-
Xiaomi 13 Ultra design renders have leaked online. The upcoming Xiaomi flagship smartphone is confirmed to debut globally soon. Xiaomi revealed that the Xiaomi 13 Ultra will launch in China later this month. However, other than confirming that the Xiaomi 13 Ultra's camera features Leica-tuned lenses, the company did not reveal any other details or a launch date. Xiaomi is expected to share more details soon. Meanwhile, a new report has leaked alleged Xiaomi 13 Ultra design renders ahead of the official launch. The images suggest that the Xiaomi 13 Ultra will get some major design changes.

According to the images shared by SmartPrix in collaboration with tipster OnLeaks, the Xiaomi 13 Ultra appears to have a huge circular module on the rear panel. The module appears to house a quad-camera setup and a dual-LED flash system. While the camera module itself is thick, it rests on top of a raised rear panel, giving the upper portion a dual-step look. The protrusion is reduced towards the bottom half of the rear panel.

The Xiaomi 13 Ultra could have a flat frame design with rounded corners, going by the leaked renders. The rear panel is also seen curving towards the edges, which should help offer a better grip and in-hand feel. Also, the white colour variant of the phone seems to have a faux leather back panel, according to the report. On the front, the Xiaomi 13 Ultra is seen with a curved AMOLED display that has a hole-punch cutout at the top. The report states that the phone has a 6.7-inch 120Hz panel. On the right side, the power and volume buttons are visible. The bottom edge houses the primary speaker grille, a SIM slot, and the USB Type-C port.

The report further suggests that the Xiaomi 13 Ultra will measure 163.18×74.64×9.57mm. Including the camera module protrusion, the phone is said to be about 15.61mm thick. Some other details about the phone have leaked recently as well.
The Xiaomi 13 Ultra is said to feature a Snapdragon 8 Gen 2 SoC with up to 16GB of LPDDR5X RAM and 512GB of UFS 4.0 storage. The phone is also said to pack a 4,900mAh battery. There could also be support for 50W wireless charging and 90W wired fast charging.

The rear camera module is likely to feature a 50-megapixel Sony IMX989 1-inch main camera, which is also found in the Xiaomi 13 Pro (Review). It is also tipped to feature a 50-megapixel ultra-wide lens and two 50-megapixel sensors for telephoto shooting. For selfies, the phone is said to get a 32-megapixel front camera.

Xiaomi's latest flagship is tipped to feature a WQHD+ AMOLED display with support for dynamic refresh rate switching between 1Hz and 120Hz. Lastly, the Xiaomi 13 Ultra could boot Android 13-based MIUI 14 out of the box.

https://www.gadgets360.com/mobiles/news/xiaomi-13-ultra-design-renders-leaked-camera-launch-april-china-global-3938435
-
One of the most reliable leakers has done it again: the official benchmarks for NVIDIA's upcoming RTX 4070 are out in the open thanks to Videocardz. Considering these are first-party benchmarks, a grain of salt never hurt anyone, but they are incredibly exciting, as NVIDIA is stating that the upcoming RTX 4070 will be able to match the RTX 3080 in DLSS performance without Frame Generation. This is a huge deal because Frame Generation is quite a controversial technology, and so-called 'fake frames' have divided gamers. With the RTX 4070, however, even without Frame Generation, you are looking at RTX 3080 performance levels with standard DLSS.

NVIDIA RTX 4070 is a very impressive 40% faster than an RTX 3080 with Frame Generation in official benchmarks

We had already revealed a while back that the NVIDIA RTX 4070 is going to launch for $599, and now WhyCry has revealed the official performance figures ahead of time, as always. The RTX 4070 appears to have a far greater value proposition once you factor in DLSS 3 and/or Frame Generation. Without Frame Generation and with just good old DLSS 3 upscaling, it performs more or less identically to an RTX 3080. If you include Frame Generation, however, it suddenly performs up to 40% faster than an RTX 3080, or 80% faster than an RTX 3070 – which is what generational upgrades should always be like. If you are someone who believes in raster performance only (although in today's era, I would add that it is a highly obsolete metric), then the RTX 4070 performs 15% faster than an RTX 3070. It is worth noting that since both have exactly the same number of CUDA cores, the only real difference lies in the clock speeds and the IPC gains of the architectural revision. NVIDIA is targeting the graphics card dead center at the 1440p 120 fps gaming market, as it can hit 100+ FPS with presumably maxed-out settings, so 120 should also be easily doable by tweaking a few settings.
Of course, if the card can deliver 120 fps at 1440p, it can also run games quite adequately at 4K 60 – which is starting to become the more commonly adopted resolution and standard. The NVIDIA RTX 4070 is going to be announced on April 13th and will have a single power connector.

The RTX 4070 will be a cut-down AD104 die with board IDs PG141-SKU336 and PG141-SKU337. The exact GPU variant name is going to be AD104-250-A1. The GPU will have the exact same transistor count as the RTX 4070 Ti, but only 5888 CUDA cores will be activated across 46 SMs. It will feature the same VRAM as the 4070 Ti at 12 GB of GDDR6X memory and the same 192-bit bus width. The memory will also be clocked at the same rate of 21 Gbps, bringing the bandwidth to a solid 504 GB/s. This is going to be the first card from NVIDIA to have a default TGP of 250W. The die size is going to be the same as the RTX 4070 Ti at 295mm².

https://wccftech.com/official-benchmarks-for-nvidia-rtx-4070-leak-online-matches-rtx-3080-performance-without-frame-generation/
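Since the RTX 4070 and RTX 3070 carry the same 5888 CUDA cores, their peak FP32 ratio reduces to the ratio of their boost clocks (plus any IPC gains). A quick sketch using the boost clocks from the comparison list above:

```python
def fp32_ratio_same_cores(clock_a_ghz: float, clock_b_ghz: float) -> float:
    """With identical core counts, peak FP32 throughput scales directly with clock speed."""
    return clock_a_ghz / clock_b_ghz

# RTX 4070 (2.48 GHz boost) vs RTX 3070 (1.75 GHz boost)
print(round(fp32_ratio_same_cores(2.48, 1.75), 2))  # ~1.42x on paper; actual raster gains in games are smaller
```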
-
The Star Wars Jedi series, which is about to be expanded with its second entry later this month, almost didn't feature Jedi at all, as Lucasfilm is very protective of them and didn't originally want to give Respawn permission to have one star in the franchise. Speaking with The Guardian, Survivor game director Stig Asmussen talked about how the series came to be and how the team managed to develop a game that could speak to all sorts of Star Wars fans worldwide. In the beginning, things weren't exactly smooth sailing, as they had to get approval for almost every element from Lucasfilm, and the company was initially not very keen on letting Respawn cast a Jedi as the star of the game. According to Asmussen, Lucasfilm is incredibly protective of the iconic order and wanted Respawn to make a Force-using smuggler or bounty hunter the main character instead.

Star Wars Jedi: Survivor is setting out to be a much bigger game than its predecessor, not just in terms of story and gameplay but also in file size, as it was confirmed last week that the game will require 155GB of free storage space on PC. Hopefully, the performance issues will not be as huge as the game's file size. Star Wars Jedi: Survivor launches on PC, PlayStation 5, Xbox Series X, and Xbox Series S on April 28th worldwide. You can learn more about the game by checking out Kai's preview as well as this page, which details all of the game's editions, gameplay features, and more.

https://wccftech.com/star-wars-jedi-dev-had-to-earn-the-right-to-use-the-jedi-for-their-series/
-
Vivo T2 5G is all set to launch in India on April 11, and the brand is actively teasing the design of the smartphone via Flipkart. Now, ahead of its official unveiling, a Vivo handset thought to be the Vivo T2 5G has appeared on the Geekbench benchmarking site. The listing suggests a Snapdragon 695 SoC and 6GB of RAM on the upcoming device. The Vivo T2 5G is expected to run Android 13. It is confirmed to carry a 64-megapixel dual rear camera unit and a full-HD+ AMOLED display with 1,300 nits of peak brightness.

A Vivo smartphone has been spotted on the Geekbench website with model number V2240. The listing, dated April 6, is thought to be of the Vivo T2 5G. It has scored 678 points in single-core testing and 1,933 points in multi-core testing. The listing suggests 5.33GB of RAM, which could translate to 6GB on paper. The handset is shown to run the Android 13 operating system as well. As per the listing, an octa-core chipset codenamed "Holi" will power the phone. It shows two CPU cores with a maximum clock speed of 2.21GHz and six cores capped at 1.80GHz. These CPU speeds suggest the presence of a Snapdragon 695 SoC on the Vivo T2 5G.

Vivo T2 5G is scheduled to launch in India on April 11. The launch event will begin at 12pm IST. The Vivo T2x 5G will also debut alongside it. They will go on sale through Flipkart, the official Vivo website, and select retail stores across the country.

Both the Vivo T2 5G and Vivo T2x 5G are teased to feature full-HD+ AMOLED displays with 1,300 nits of peak brightness and a 360Hz touch sampling rate. They are confirmed to come with a dual rear camera unit comprising a 64-megapixel primary sensor and a 2-megapixel bokeh sensor. They are expected to be priced under Rs. 20,000.

https://www.gadgets360.com/mobiles/news/vivo-t2-5g-specifications-features-snapdragon-695-geekbench-listing-3931259
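The jump from the reported 5.33GB to "6GB on paper" is typical of benchmark listings: the tool reports the memory actually usable by the OS after firmware and hardware reservations, which is then mapped to the nearest marketed capacity. A hypothetical sketch of that mapping (the tier list and helper name are illustrative assumptions, not part of Geekbench):

```python
def marketed_ram_gb(reported_gb: float, tiers=(4, 6, 8, 12, 16)):
    """Map a benchmark-reported usable-RAM figure (assumed to be the
    post-reservation value) to the nearest marketed capacity at or above it."""
    return next(t for t in tiers if t >= reported_gb)

print(marketed_ram_gb(5.33))  # 6, matching the 6GB the listing implies
```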