  • Nintendo Switch Announced - Powered by NVIDIA Tegra


    Brian

Today Nintendo announced its brand-new gaming system, dubbed the Switch, which will go on sale in March 2017. The Switch features a tablet-sized screen with detachable controllers called Joy-Con: it can be docked to a unit at home to serve as a traditional console, or the controllers can attach to the sides of the screen so gamers can take the system with them on the go.

SwitchMobile.jpg (image: Polygon)

     

In addition, the Switch will use mini cartridges as the medium for its games, although it remains to be seen whether it will have built-in storage to accommodate downloadable games and/or content. There are no details yet about online gaming, but given that it's 2016 and the trailer (see below) puts its emphasis on social gaming, it would be a huge oversight to leave it out.

     

SwitchCartridge.jpg (image: Polygon)

     

Unlike Nintendo's previous console, the Wii U, which saw poor partner support, Nintendo today announced a fairly healthy list of developers for its new console, including EA, Activision, Bethesda, Capcom, Sega, Ubisoft, Square Enix, From Software, Warner Bros., Codemasters and Atlus.

     

    SwitchPartners.jpg

     

Finally, NVIDIA had long been rumored to be behind the Switch, and today it was confirmed that the company did indeed engineer the Switch in conjunction with Nintendo; NVIDIA's full press release is available at the end of this article.

     

    The Nintendo Switch in action:

     

    NVIDIA press release regarding Nintendo Switch:

    Quote

    A Console Architecture for the Living Room and Beyond

    Nintendo Switch is powered by the performance of the custom Tegra processor. The high-efficiency scalable processor includes an NVIDIA GPU based on the same architecture as the world’s top-performing GeForce gaming graphics cards.

The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries, advanced game tools and libraries. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.

    Gameplay is further enhanced by hardware-accelerated video playback and custom software for audio effects and rendering.

    We’ve optimized the full suite of hardware and software for gaming and mobile use cases. This includes custom operating system integration with the GPU to increase both performance and efficiency.

    NVIDIA gaming technology is integrated into all aspects of the new Nintendo Switch home gaming system, which promises to deliver a great experience to gamers.

The Nintendo Switch will be available in March 2017. More information is available at https://www.nintendo.com/switch.

    Nintendo Switch is a trademark of Nintendo.

     

    Edited by Brian



    User Feedback


    Great! More dumbed-down disposable kid toy garbage to give game developers another excuse to suck at PC game development. Just what we needed, LOL.


Would have loved the concept if it were any company other than Nintendo, given the absolutely miserable graphics quality of their last generation of consoles. The 3DS has an extremely small screen and everything is still pixelated and super low-res, so I'm not expecting the Switch to be much different.


    crap again, nothing more to say :)

     

One more thing: if they did a remake of the Nintendo 64, it would sell even better than this.


I think it'd be rather convenient. I've always found Nintendo's first-party controllers exceptionally comfortable to hold, their first-party games very well done, the graphics in their games pretty despite the low resolution/power, and their dedication to local play fantastic. I definitely feel that this type of product has a market, especially among those who want an experience that takes little effort to get going. Then again, these are just the usual arguments in favor of consoles. Despite that, I feel this steps far enough away from the PC gaming use case to justify a purchase.


I'm very excited about all of this. I've been waiting years for this thing, ever since it was called the NX, and now I've finally placed a pre-order and can't wait to play it on March 3rd.

However, I wonder whether the screen will scratch over time from being put into the dock?




  • Similar Content

    • By Sparkhunter
      Hi there guys,
       
I've been looking into getting a new GPU for my laptop (4-lane PCIe) and enclosure (HP Omen Enclosure), but I cannot decide between the RX 5700 XT and the RTX 2070 (non-SUPER).
       
I did some price-to-performance calculations (higher is better) with some of the best prices I can currently get in the UK; a rough sketch of that kind of calculation is included at the end of this post.
       

       
The RX 5700 XT performs better and is cheaper than the RTX 2070 (although the 2070 does come with two games), but my main concern is drivers, and from what I've heard NVIDIA cards work a lot better with eGPUs than AMD cards.
       
If anyone has experience using both red and green, or has any other input, I'd love to hear your experiences.
       
      Thanks.
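As an illustration of that kind of price-to-performance comparison, here is a minimal sketch. The prices and relative-performance figures in it are hypothetical placeholders, not the actual UK prices or benchmark numbers referenced in the post above:

```python
# Rough price-to-performance comparison (higher is better).
# Prices and performance indices below are hypothetical placeholders,
# not the poster's actual UK prices or benchmark results.

cards = {
    # name: (price in GBP, relative performance index)
    "RX 5700 XT": (380.0, 100.0),
    "RTX 2070":   (420.0, 97.0),
}

for name, (price, perf) in cards.items():
    score = perf / price  # performance per pound spent
    print(f"{name:12s} perf/price = {score:.3f}")
```

Bundled games, driver maturity and eGPU compatibility obviously don't show up in a ratio like this, which is why the driver question above still matters.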
       
    • By Solaire
      Hello everyone,
Some time ago I flashed a custom vBIOS (thank you Klem) on an Acer Predator 15 G9-591, and I started testing how far I could overclock the GTX 970M without running too hot. With the voltage raised to 1150 mV and roughly 300 MHz added to the core clock, under the heaviest stress conditions for both GPU and CPU (i7-6700HQ, undervolted), the GPU reaches a maximum temperature of 85°C (with some 86°C spikes, but never more than that).
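As an aside, a simple way to keep an eye on core clock and temperature during stress runs like this is NVIDIA's NVML bindings. The snippet below is a generic monitoring sketch, assuming the nvidia-ml-py ("pynvml") package is installed; it is not the tooling used in the original post:

```python
# Log GPU core clock and temperature once per second during a stress test.
# Generic monitoring sketch using NVML via the pynvml package.

import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetTemperature, nvmlDeviceGetClockInfo,
    NVML_TEMPERATURE_GPU, NVML_CLOCK_GRAPHICS,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(60):  # one reading per second for a minute
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
        core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)
        print(f"core {core} MHz, {temp} C")
        time.sleep(1)
finally:
    nvmlShutdown()
```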
Considering that I'm soon going to re-lid both the CPU and GPU and clean out the internals (the laptop is 4 years old), I think I'm going to have some additional thermal headroom.
I've already tested some further overclocking, and I noticed that even if the temperature stays under 93°C (that limit is just for testing purposes; after re-lidding, temps should be nicer), graphical glitches occur and after some time most games crash. My question is: could this be due to a lack of power supply? The laptop charger provides 180 W.
Could there be an increase in overclocking margins with a 230 W PSU or something like that (obviously with the same 19.5 V output)? A rough power-budget sketch follows below.
If anybody has tried something like that on any laptop model, or knows about the matter, I'd like to know.
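On the power question, a back-of-the-envelope budget makes the difference between a 180 W and a 230 W adapter concrete. All the figures below are assumptions (CPU package power and platform overhead vary by machine and load), so treat this as a sketch rather than a measurement:

```python
# Rough adapter power budget. CPU package power and platform overhead are
# assumed values, not measurements from the Predator 15 in question.

def gpu_headroom_w(adapter_w, cpu_w=45.0, platform_w=35.0):
    """Watts left for the GPU once the CPU, screen, fans, etc. are fed."""
    return adapter_w - cpu_w - platform_w

for adapter in (180, 230):
    print(f"{adapter} W adapter -> roughly {gpu_headroom_w(adapter):.0f} W left for the GPU")
```

That said, glitches and crashes at otherwise safe temperatures are at least as often a sign that the core offset itself is unstable as they are of a starved power supply.
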
    • By ounces
After spending significant time and effort to obtain a "DC" (DreamColor) screen for the 8770w (which is essentially a regular IPS panel with a fancy board that converts 8bpc LVDS to 10bpc DP), I have finally got one and installed it. All works great, except for one problem...

It has pretty bad banding/posterization in the lower shadows. I have tried profiling it in different modes (full range, sRGB, Rec. 709) and the issue persists; it indeed shows only in the lowest part of the characteristic curve. Mids and highlights are reproduced fine and show low deviation from reference values.
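A quick way to make banding like this obvious, independent of any particular photo or game, is to display a smooth grey ramp and look at the dark end. The sketch below (assuming NumPy and Pillow are installed) just generates such a test image; it is not the profiling workflow described above:

```python
# Generate a full-range horizontal grey ramp for eyeballing banding.
# On a display path that truncates bit depth, the dark end of the ramp
# shows distinct steps instead of a smooth gradient.

import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1024, 256
ramp = np.linspace(0, 255, WIDTH, dtype=np.uint8)   # 0..255 left to right
img = np.tile(ramp, (HEIGHT, 1))                    # repeat the ramp vertically
Image.fromarray(img, mode="L").save("grey_ramp.png")
```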

The GPU is an HP K4000M with Nvidia drivers installed as-is; the video card is identified without a hitch.
      Banding was not present with the original TN panel using the same GPU.
       
While checking the software side, I noticed that Win10 has the bit depth set to 8-bit...
       

       
My initial reaction was, "Easy, let's change it in `nvidia-settings` and we're all set":

...but that would be too easy, right? After selecting 10bpc and clicking "Apply", the screen went off and back on, only to show that the depth had stayed at 8bpc. Repeating the above a few times yielded exactly the same result, and I'm not in a hurry to meet the clichéd (and layman's) definition of insanity.
       
      Let's check GPU-Z. So far so good, nothing unusual. Notice the highlighted BIOS version and subvendor string:
       
Time to delve into the other tabs. We are running WDDM v2.4, which supports GPU dithering, but hey... the BIOS version has changed!
       
      Briefly back to `nvidia-settings` to check what is reported by vendor's own utility:

       
So far, we have two strings for the BIOS version:
80.04.5A.00.02 (let's call it "A")
80.4.33.0.37 (let's call it "B")
Notice how the second one does not seem to follow hexadecimal notation. Lastly, the "NVIDIA BIOS" drop-down reports the "A" version:
       
      ...and monitor section which confirms that rig is indeed capable of 10bpc, but currently running at mere 8bpc:

       
      Windows "Adapter settings", reports version "B". It's 2019, diversity is a must.

       
      "NVidia inspector" is of the same opinion:

       
Now, let's use some seriously legit tools and check the exported BIOS file in `nvflash`:

       
Here we have three interesting findings:
1. The reported vendor is Dell, not HP. See this link for details.
2. The BIOS version is back to "A". Have I already mentioned diversity?
3. The MXM module uses MX25L2005 flash storage in a WSON-8 package. If things go really nasty, we should be able to rescue the patient via a Pomona clip and an external programmer.
      Loading the same file in "Kepler BIOS tweaker" confirms the facts:

       
EDID settings, courtesy of the NVidia Control Panel. The hex dump can be found at the bottom of this post.
...Shall I be worried about the "60.02 Hz" refresh rate?
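For what it's worth, an EDID-reported refresh rate is simply the detailed timing's pixel clock divided by the total (active plus blanking) horizontal and vertical dimensions, so it rarely lands on a perfectly round 60. The numbers below are hypothetical, not taken from this panel's actual EDID dump:

```python
# Refresh rate as derived from an EDID detailed timing descriptor:
# pixel clock / (horizontal total * vertical total).
# Timing values here are hypothetical, not the DreamColor panel's.

pixel_clock_hz = 138_700_000   # assumed pixel clock
h_total = 2_080                # assumed horizontal total (active + blanking)
v_total = 1_111                # assumed vertical total (active + blanking)

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"{refresh_hz:.2f} Hz")  # ~60.02 Hz; a slightly off-round value like this is normal
```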
       
      To summarize:
1. Why are two different BIOS versions reported? Anything to do with UEFI (e.g. HP sideloading its own during boot)?
2. Why are two different vendors reported? As far as I remember, this is a branded HP GPU.
3. Where can I get a "clean" K4000M BIOS for future experiments? Ideally from an 8770w equipped with a "DreamColor" panel from the factory.
      Link to the dumps, BIOS ROM and monitor EDID: https://mega.nz/#F!zGgRmQIL!9q2QFZtHuK2RQ-WHXMA4Mg (also attached to this post)
      K4000M.zip
    • By Blocker35
Hi guys, I'm a bit of a rookie to the whole eGPU scene. Currently I have:
       
      - MacBook Pro 2015 13inch (3.1GHz Core i7 16GB RAM)
      - Razer X Core 
      - Apple TB3 to TB2 Adapter
      -  TB2 Cable (Cable Matters)
      - 23inch AOC External Monitor
       
I am wondering what graphics card to get to run X-Plane 11 with high graphics settings.
      I have purgewrangler set up and ready to use with an AMD graphics card, but am also open to the idea of an Nvidia graphics card.
      Any advice appreciated. I did not buy a Windows PC as I need a Mac for various other things and wanted an all-in-one laptop.
    • By Kel
I got the EXP GDC Beast 8.5 and got all the connections right... the laptop boots up, the fans on the GPU turn on, and Device Manager detects the GTX 1060 with a "!" in a yellow triangle. Now what to do? I tried to do the BIOS mod, but realised that I have version 3.07.
Questions:
1. Is there any way to get this done without the mod?
2. If the mod is the only way, kindly help me with step-by-step instructions in layman's terms. I'm not a techie, so I might not understand technical jargon, although I am interested and still learning. Please help.