I hope I'm posting on the right forum for this!
Recently I started programming with the DirectX SDK (June 2010) on VC++ 2010, on a Dell laptop with an NVIDIA GeForce GT 630M GPU and an Intel HD 4000 chip.
One of the first things you do is try to enumerate the available adapters, outputs, and so on. Here's an example:
IDXGIFactory1 *factory;
CreateDXGIFactory1(__uuidof(IDXGIFactory1), (LPVOID *)&factory);
IDXGIAdapter *adapter;
factory->EnumAdapters(0, &adapter); // ask for the first enumerated adapter
DXGI_ADAPTER_DESC desc;
adapter->GetDesc(&desc);
When I run this, the desc structure contains the information for my Intel HD chip, and NOT the information for my GPU!
Now, when I open my NVIDIA Control Panel, select the GPU as the preferred processor, and re-run the sample, I get the info for my GPU in desc, which is right! Also, when I then try to enumerate the outputs for this adapter, I find that there is at least one.
My question is: is there a way to accomplish this programmatically, as in the DirectX 11 SDK samples, so that I don't have to change the setting in my NVIDIA Control Panel?
I went through the SDK code (for EmptyProject11), and somehow it "grabs" the GPU instead of the Intel chip. I commented out all the code in the WinMain function, inserted the above code, and it still grabbed the GPU! Is it something to do with the project setup, environment variables, command-line arguments, or...? I mean, how do they do it?
I would appreciate any insight into this matter.
Thanks
You can run through all of the adapters present and get information on them by looping through all possible adapters, using the same function that you're already using:
HRESULT r = S_OK;
unsigned int adapterindex = 0;
IDXGIAdapter *dxgiadapter = nullptr;
// Save the result of EnumAdapters to r, and then check whether it's S_OK
while ( ( r = factory->EnumAdapters( adapterindex, &dxgiadapter ) ) == S_OK ) {
    ++adapterindex;
    /* Work with your adapter here, particularly DXGI_ADAPTER_DESC */
    dxgiadapter->Release(); // done with this adapter
}
Usually, you will either have to pick the default adapter automatically, or enumerate all of them and let the user choose in some kind of settings panel. The other way to do it is to pick the adapter whose DXGI_ADAPTER_DESC reports the most dedicated video memory. It's not a foolproof heuristic, but it's the one I use to get the "best" (used liberally) video card for the system. More often than not, however, the default is the default for a reason.
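A minimal sketch of that most-video-memory heuristic (the function name is my own invention; error handling is omitted for brevity):

IDXGIAdapter *PickAdapterWithMostMemory(IDXGIFactory1 *factory)
{
    IDXGIAdapter *best = nullptr;
    SIZE_T bestMemory = 0;
    IDXGIAdapter *adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) == S_OK; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        if (desc.DedicatedVideoMemory > bestMemory)
        {
            if (best) best->Release();   // drop the previous candidate
            best = adapter;              // keep the new winner's reference
            bestMemory = desc.DedicatedVideoMemory;
        }
        else
        {
            adapter->Release();          // not the biggest; release it
        }
    }
    return best; // caller must Release(); may be null if no adapters were found
}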
Good luck!
I'm working on a VC++ app, and my job is to correct some bugs in it; the code was mainly written by a student last year.
It is a graphical application that uses both SFML and TGUI.
It targets 32-bit architectures, but actually it only works well on x64 computers.
On x64, absolutely everything works perfectly, but on 32-bit systems the text displayed using sf::Text and sf::Font just shows black blocks after the first call to the function that sets the sf::Text's string.
I know that code targeting 32-bit systems should work fine on x64, and that's why I am lost. I checked which system architecture I am targeting.
I am using compatible, up-to-date 32-bit SFML and TGUI libraries.
I have checked a billion times that I have the right DLLs and the right include and lib paths. The linker input settings list all the .libs that both SFML and TGUI rely on.
I know that the font is loaded, but it doesn't stay set on the text, even though the variable containing the font isn't destroyed and I set the font on the text earlier.
This is how the font is loaded and associated with the text:
MainWindow.cpp:
sf::Font& MainWindow::m_fontDisplayer = m_data.getFont_1();
// init displayer
m_data.getDisplayer().setFont(m_fontDisplayer);
m_data.setDisplayer("");
m_data.getDisplayer().setCharacterSize(54);
m_data.getDisplayer().setFillColor(sf::Color(0, 0, 0, 255));
DataManager (the class of m_data):
sf::Font DataManager::m_font_1; // static definition
if (!m_font_1.loadFromFile("Ressources/font_1.ttf"))
{
    printf("\nCould not find font_1.ttf font.");
}
void DataManager::setDisplayer(std::string s) {
    if (m_displayer.getFont() == NULL) {
        //m_displayer.setFont(m_font_1);
    }
    m_displayer.setString(s);
    m_displayer.setPosition(sf::Vector2f(320 - m_displayer.getGlobalBounds().width, 82));
}
This function is the one that made me say the font isn't set on the text after some time (m_displayer is an sf::Text): I put a breakpoint at the commented line, and it is hit even though setting the font is the first thing I do. That "if" statement was only there to host the breakpoint; it doesn't fix the problem (it makes my app crash on 32-bit), and it would be too slow anyway, because sf::Font operations are really heavy.
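For reference, in SFML 2.x an sf::Text stores only a pointer to its font, so the sf::Font object must stay alive (and at the same address) for as long as the text is drawn; here is a minimal self-contained sketch of that pattern (the font file name is a placeholder):

#include <SFML/Graphics.hpp>

int main()
{
    sf::RenderWindow window(sf::VideoMode(640, 480), "Font lifetime demo");

    sf::Font font; // must outlive every sf::Text that uses it
    if (!font.loadFromFile("font_1.ttf"))
        return 1;

    sf::Text text("Hello", font, 54); // the text keeps a pointer to 'font'
    text.setFillColor(sf::Color::Black);

    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
            if (event.type == sf::Event::Closed)
                window.close();

        window.clear(sf::Color::White);
        window.draw(text);
        window.display();
    }
    return 0;
}

If the font were a local that goes out of scope, or an object that gets copied or moved after setFont(), the text would render as blank or garbage blocks, which matches the symptom described above.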
What could possibly cause that compatibility issue between the two architectures?
To add to what I said: I'm sure that the only difference between the systems I'm testing on is the architecture, not the Windows version. I tried on multiple computers, and again, the problem only appears on 32-bit computers.
Thanks to anyone who can explain and help me understand the problem.
EDIT: I removed the text about the TGUI issues, as I have solved them.
I'm trying to get my game to automatically set the window size to the correct resolution for the monitor.
For example, my desktop PC is at 1920x1080 resolution, so I want my game to run at 1920x1080 on here, however my laptop is at 1366x768 so I want my game to run at 1366x768 on there, etc.
I've tried so many different things, such as GraphicsDevice.Adapter.CurrentDisplayMode.Width/Height, and I even printed out the list of GraphicsDevice.Adapter.SupportedDisplayModes; they all tell me that the only display mode supported for me is 800x600. This is surely not the case, because I'm running Windows 7 at 1920x1080.
So what on earth am I doing wrong? I tried putting this code in the Game1 constructor and in the initialiser; I can't figure out why it isn't working properly!
Okay, I fixed it. I just realised I was being a little bit stupid, in that I forgot to mention this is a "MonoGame" application, not a straightforward XNA project... (I didn't think it would make a difference, but oh, I was wrong).
As it turns out, MonoGame has a massive bug to do with graphics devices, and there is supposedly a way to solve it (build from the latest source or something?), but what I did was install the XNA 4.0 Refresh for Visual Studio 2013 and copy all my source code across to a new XNA project as opposed to a MonoGame project.
And hey presto, GraphicsDevice.DisplayMode.Width and Height are now correctly registering as 1920 and 1080 pixels. So now I can carry on with my game FINALLY.
Thanks to all the people that tried to help me solve this issue!
You can set the resolution of your game in the constructor by adjusting the graphics device manager's PreferredBackBufferWidth and PreferredBackBufferHeight.
For example, this will produce a game window that's 480x320:
public Game1()
{
graphics = new GraphicsDeviceManager(this);
Content.RootDirectory = "Content";
graphics.PreferredBackBufferHeight = 320;
graphics.PreferredBackBufferWidth = 480;
}
Keep in mind that when in windowed mode your game will (by default) have a title bar which prevents the game window from being as big as your full screen.
This is my method for getting your maximum supported resolution (and setting it, as an example to clarify):
// in the Initialize method
graphics.PreferredBackBufferWidth = GraphicsDevice.DisplayMode.Width;
graphics.PreferredBackBufferHeight = GraphicsDevice.DisplayMode.Height;
graphics.IsFullScreen = false;
graphics.ApplyChanges(); // <-- not needed in the Game constructor
However, I don't know what you're doing wrong.
I use OpenJDK 1.6 on a Linux platform (that's a requirement).
I need to play streaming audio, based on this example: http://www.java2s.com/Code/Java/Development-Class/PlayingStreamingSampledAudio.htm
I wrote something like this:
DataLine.Info info = new DataLine.Info(....);
SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
line.open(stream.getFormat());
line.start();
But the problem is with the line
line.open(stream.getFormat());
The format used is supported by the system, yet I get a LineUnavailableException.
But the strange thing is that I get this exception only under Eclipse. When I create an executable jar and run it, everything is OK: no exceptions.
As far as I understand from googling and some experiments, the problem lies in security restrictions. The executable jar runs under the current user and has access to the sound device; Eclipse somehow does not. I tried adding all system users to the audio group, and I tried running Eclipse as root... Nothing helped.
I'm not a guru in Linux or Eclipse. Does anybody know how to solve this problem, or at least how to change the security restrictions for Eclipse?
Any ideas would be highly appreciated!!
Here is the exact example I have in code.
int sampleRate = 40000;
int sampleBits = 16;
SourceDataLine _line;
AudioFormat format = new AudioFormat(sampleRate, sampleBits, 1, true, true);
DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
_line = (SourceDataLine) AudioSystem.getLine(info);
_line.open(format);
Since my print of the properties was flagged as badly formatted code, here is a link with the JRE properties list (under Eclipse):
https://docs.google.com/document/d/1ef4f8GvEprhWWrrA57B0p2w6h5qOio4Wjz0ycY9-_0Q/edit?usp=sharing
and here when using the executable jar:
https://docs.google.com/document/d/1lxbPIa4YUwURCdUZsCNL1Iehq--OT2eu35Q0QHKonEg/edit?usp=sharing
To be honest, I didn't find any incriminating differences...
var m_FilterGraph = (IFilterGraph2)new FilterGraph();
int hr = m_FilterGraph.AddSourceFilter(file, "Ds.NET FileFilter", out capFilter);
When my project is x64, it throws System.Runtime.InteropServices.COMException (0x80040241): "The source filter for this file could not be loaded." With x86, everything is OK.
It's C# code, but the problem is present in every x64 app I've got that builds DirectShow graphs.
examples:
https://code.google.com/p/graph-studio-next/
http://www.codeproject.com/Articles/21105/DSGraphEdit-A-Reasonable-Facsimile-of-Microsoft-s
When I build the graph by hand (e.g. File Source (Async.) -> LAV Splitter -> some decoder -> Enhanced Video Renderer), it works.
So it's a system problem rather than a code problem; but the x64 video players I've got work fine, so I don't know... Maybe somebody has an idea what could be wrong?
32-bit and 64-bit environments have their own sets of filters. They start with similar stock filters, and then you install additional filters for 32-bit, 64-bit, or both. When you have a 32-bit filter installed and there is no corresponding 64-bit filter, you get the situation described in the question: Win32 works fine, x64 does not. Install the missing 64-bit filters to resolve it.
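If you want to check from code whether a particular filter is registered for the bitness your process runs as, a quick CoCreateInstance probe is enough. A minimal C++ sketch (the GUID below is a placeholder, substitute the CLSID of the filter you need; link with strmiids.lib for IID_IBaseFilter):

#include <windows.h>
#include <dshow.h>
#include <cstdio>

int main()
{
    CoInitialize(nullptr);

    CLSID clsid;
    // Placeholder GUID; replace with your filter's CLSID.
    CLSIDFromString(L"{00000000-0000-0000-0000-000000000000}", &clsid);

    IBaseFilter *filter = nullptr;
    HRESULT hr = CoCreateInstance(clsid, nullptr, CLSCTX_INPROC_SERVER,
                                  IID_IBaseFilter, (void **)&filter);
    if (SUCCEEDED(hr))
        printf("Filter is registered for this process's bitness.\n");
    else
        printf("Filter is missing for this bitness (hr = 0x%08lX).\n", (unsigned long)hr);

    if (filter) filter->Release();
    CoUninitialize();
    return 0;
}

A 32-bit build and a 64-bit build of this probe can give different answers on the same machine, which is exactly the situation described above.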
I compiled some examples from svgalib, and the console shows:
Using EGA driver
svgalib 1.4.3
Nothing more; it's like it's drawing somewhere, but I cannot see it.
This could be a very, very noob question about svgalib, but it could also be a configuration problem.
I also checked the virtual console that it says it is drawing to (if I run from X); running from the console, it just stays there. I also put a sleep in the code.
Example code:
#include <stdlib.h>
#include <unistd.h> /* for sleep() */
#include <vga.h>

int main(void)
{
    vga_init();
    vga_setmode(G320x200x256); /* 320x200, 256 colors */
    vga_setcolor(4);
    vga_drawpixel(10, 10);
    sleep(5);
    vga_setmode(TEXT); /* back to text mode */
    return EXIT_SUCCESS;
}
Compile with:
gcc -o tut tut.c -lvga
So, do you have other SVGAlib applications working on your system? Such as svgatest, which may be in a separate distribution package (svgalib-bin or similar).
Have you configured svgalib for your system? The common location of the config file is /etc/vga/libvga.config, and reading man svgalib should give you more details.
I suspect that once you have SVGAlib working in general, the tutorial example program will work.
Install all the SVGA libraries through your software manager.
Set the mode to match your graphics screen's resolution, e.g. G1024x768x256, and set the pixel color to white (15).
On my Linux Mint (MATE) 17.1 install on hard disk it works fine.
Good luck!
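If you are not sure which modes your setup supports, svgalib provides vga_hasmode() to test a mode before switching to it. A minimal sketch along the lines of the tutorial program (assuming svgalib 1.4.x):

#include <stdlib.h>
#include <unistd.h>
#include <vga.h>

int main(void)
{
    if (vga_init() != 0) /* needs access to the console/SVGA device */
        return EXIT_FAILURE;

    int mode = G1024x768x256;
    if (!vga_hasmode(mode)) /* fall back if the driver cannot do this mode */
        mode = G320x200x256;

    vga_setmode(mode);
    vga_setcolor(15); /* white */
    vga_drawpixel(10, 10);
    sleep(5);
    vga_setmode(TEXT);
    return EXIT_SUCCESS;
}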