2nd HDMI DISPLAY PORT NOT WORKING?
Intel i7-9700F 3GHz 16GB RAM NVIDIA GTX 1660 Ti
I have one monitor hooked to one HDMI port (works fine)
I plugged another monitor into the 2nd HDMI port on the back and NOTHING happens
Every single piece of help tells me to use the "Windows key"
I don't have that on my keyboard. When I finally got to the "Troubleshoot" page, it said to find "Hardware and Devices"
THIS OPTION IS NOT PRESENT
Only options I have to troubleshoot are as follows
Program Compatibility Troubleshooter
Search and Indexing
Windows Store Apps
All my versions are current (GPU and Windows are fully updated)
I have verified BOTH HDMI cables and BOTH MONITORS work fine
- 1 month ago
Windows 10 can't detect your second monitor? Use this guide to troubleshoot and resolve the most common issues with an external display.
Using a second monitor is an easy and convenient way to create a larger canvas to work with multiple apps, edit videos, and even play games on your desktop or tablet running Windows 10.
Using multiple monitors can be hugely beneficial, whether you're working at home or anywhere else. Connecting an external display is usually a plug-and-play process, but occasionally Windows 10 may not be able to detect the second monitor as a result of hardware or driver-related issues. If you're having problems, there are a number of troubleshooting steps you can take (such as checking the connection and installing the correct drivers) to fix the issue in no time.
How to fix external monitor connection issues using hardware troubleshooting
When a device isn't detecting a second monitor, chances are that you're experiencing a software-related issue, but it could also be a problem with the physical connection.
If Windows 10 can't detect the second monitor, before modifying any settings (and assuming it's connected to a power source), you should try these troubleshooting steps:
Restart your computer to re-establish the connection.
Using the built-in controls on your monitor, make sure the correct input is selected.
Check the cable connection between the monitor and the graphics card.
Disconnect the cable from both ends, wait a few seconds, and reconnect it again.
Use a different cable, as the issue could be bent pins in the connector or a bad cable. Also, if you're connecting a new display, make sure you're using a cable that meets its specification.
If the monitor and graphics card include multiple ports, try switching ports.
Connect the monitor to another computer to rule out a problem with the monitor.
Connect another working monitor to rule out an issue with the graphics card.
If you're using a Surface Pro 7 with a docking station, disconnect it, and try connecting the monitor straight to the device to determine whether the Surface Dock is causing the issue.
Depending on the type of display you're trying to set up as a second monitor, you may need to update its firmware to fix bugs or compatibility issues. (Check your display manufacturer's support website to find out how to apply the latest firmware update.)
After trying the basic troubleshooting steps outlined above, if the second monitor still isn't detected, you can start troubleshooting for software-related problems.
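Before digging into settings, it can help to see what Windows itself has enumerated. A minimal sketch, assuming Windows 10 (where the `wmic` tool is deprecated but still ships); on other systems it simply reports that the tool is unavailable:

```python
import platform
import shutil
import subprocess

def detected_monitors():
    """Return wmic's monitor listing on Windows, or None elsewhere.

    A missing second entry in this output means Windows itself is not
    seeing the monitor, pointing to a driver or connection problem.
    """
    if platform.system() != "Windows" or shutil.which("wmic") is None:
        return None
    result = subprocess.run(
        ["wmic", "desktopmonitor", "get", "name,status"],
        capture_output=True, text=True,
    )
    return result.stdout

print(detected_monitors() or "wmic not available on this system")
```

If only one monitor appears in the listing while both are plugged in, the software-side steps below are the right place to continue.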
- StarryskyLv 71 month ago
"on the back" of what computer? I am guessing that you might have a desktop box with a motherboard video port. And you have a plug in video card by some maker that has the Nvidia GTX 1660 TI video chip system? Right?
Most video cards have up to 5 ports, but not all ports can be used at the same time on some cards. You'd have to know the brand and model of the card.
You could get a DVI or DisplayPort adapter or cable that has an HDMI end.
The others are right about using the motherboard port if a card is present. Some BIOS settings allow it, some don't.
There are quite a few ways to use more than one monitor:
---A correct combination of ports on a single video card
---A USB video adapter and its software.
---A second video card plugged into the motherboard. And yes, it will work.
---A USB docking station that has a video chip and port or ports on it.
---A Matrox DualHead2Go, TripleHead2Go, or QuadHead2Go box that makes multiple screens from one cord out of most any computer. Best if the screens are identical in size and resolution.
---Another computer's screen, or a tablet, phone, or smart TV that can be used as a "projected" image receiver. Some video systems do that straight away; sometimes a software program can be used, like "Splashtop". Wireless or networking cable will transmit. There are wireless video adapters for HDMI to "dumb" TVs.
"Splitter" boxes or cables are mostly useless because they make the same image on two screens.Source(s): Using many monitors for flight sim cockpit views.
- AdrianLv 71 month ago
Most GTX 1660 Ti cards I've seen have only one HDMI port on them. The rest are DVI or DisplayPort connections.
So, tell us what model you have that happens to have 2 HDMI ports on the video card?
If you are trying to use an HDMI port on the motherboard, forget it; it's not likely to work unless you have some special BIOS option. In most cases, an add-on video card makes the BIOS disable the onboard video.
You can get DisplayPort to HDMI adapters; try one of those to add a second HDMI port to your video card.
- Spock (rhp)Lv 71 month ago
Recommend the simple tests first ... swap the HDMI connections at the graphics card and see that both monitors and cables work fine. If one doesn't, then you know where to zero in for troubleshooting. [Most common cause ... the monitor needs to be manually set to the HDMI input .. monitors are plenty dumb and do not always do this automatically.] -- grampa
- VPLv 71 month ago
You need to have a video card with TWO separate monitor outputs so that you can connect 2 monitors at the same time.
You won't be able to use 2 video cards at the same time.
- KnightSaber2000Lv 61 month ago
two methods.. the first method is by using Windows' Display settings.. minimize everything on your screen and then RIGHT click on an empty spot of your desktop.. in the context menu, click on 'Display settings'.. if Windows was able to detect more than one display connected to the PC, it will show a number of boxes that represent the connected monitors.. but if for some reason Windows can only detect one monitor, these boxes will not be there..
instead, you can manually detect monitors that escaped detection the first time.. just look for a button labelled "Detect" - and you may need to scroll down the page to find that "Detect" button..
another easier and faster way is to press and HOLD the Windows key on the keyboard and then press P while holding it.. a small menu should appear on the right side of the screen, from which you simply choose 'Duplicate' or 'Extend'..
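For a keyboard without a Windows key, the Win+P menu has a command-line equivalent: DisplaySwitch.exe, which ships with Windows 10 and accepts /internal, /clone, /extend, and /external. A hedged sketch that invokes it on Windows and does nothing elsewhere:

```python
import platform
import shutil
import subprocess

def extend_displays():
    """Switch Windows to 'Extend' mode via DisplaySwitch.exe.

    Returns True if the tool was found and invoked, False otherwise
    (e.g. on non-Windows systems), so callers can react gracefully.
    """
    exe = shutil.which("DisplaySwitch.exe")
    if platform.system() != "Windows" or exe is None:
        return False
    # Equivalent to picking 'Extend' from the Win+P overlay.
    subprocess.run([exe, "/extend"])
    return True

print("extend requested" if extend_displays() else "DisplaySwitch.exe not found")
```

Running `DisplaySwitch.exe /extend` directly from a Command Prompt works just as well; the wrapper only adds a safe fallback.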
the second method which i prefer, is to go through NVIDIA's Display Settings.. so
go to the system tray - at the lower right corner of the screen next to the clock - and look for NVIDIA's black&green icon.. the NVIDIA icon should be in the system tray, and you may need to click a small arrow/triangle to expand/show the hidden icons in the system tray..
if the icon is not there, you may need to restart Windows or install the latest NVIDIA drivers..
RIGHT CLICK on the NVIDIA icon and click on 'NVIDIA Control Panel'.. then click on the 'Set up multiple displays' tab (on the left), and the supported monitors should populate the list..
but if the second monitor does not show up in the list of monitors, there should be a small blue link 'My display is not shown..', click on it.. this leads to a new window to manually detect older monitors that somehow do not support the latest detection or plug&play protocols..
if that fails, you may need to update the NVIDIA drivers and install all Windows updates, restart the PC, and borrow HDMI cables from friends and family to see whether the original cables were somehow faulty.. and if all that fails, i see no other way to avoid sending the PC in for repairs..
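To confirm which driver version is actually installed before updating, the NVIDIA driver package includes the nvidia-smi utility (on recent Windows drivers it is on the PATH). A minimal sketch that prints its summary if present and degrades gracefully otherwise:

```python
import shutil
import subprocess

def nvidia_driver_info():
    """Return nvidia-smi's summary (GPU model, driver version), or None.

    nvidia-smi ships with the NVIDIA driver package; if it is missing,
    the driver itself is likely missing or very old.
    """
    exe = shutil.which("nvidia-smi")
    if exe is None:
        return None
    return subprocess.run([exe], capture_output=True, text=True).stdout

print(nvidia_driver_info() or "nvidia-smi not found; NVIDIA driver may be missing")
```

If the reported driver version is far behind the current release on nvidia.com, updating it is the first thing to try before assuming a hardware fault.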