Results 1 to 8 of 8

Thread: Getting the temp. of a component

  1. #1
    Join Date
    Dec 2011
    Location
    Toronto, Ontario
    Posts
    6,424
    Mentioned
    84 Post(s)
    Quoted
    863 Post(s)

    Default Getting the temp. of a component

    Is it possible (in C++ or Java, since those are the only languages I'm proficient in) to retrieve the temperature of a PC component, such as the motherboard/CPU/HDD?
    I'm aware that programs such as Speccy and SpeedFan do this, but how exactly? This'd be invaluable for me :>

    @Brandon

    Edit: also, is it possible to stop the GPU from rendering/displaying anything until I hit a key?

  2. #2
    Join Date
    Jun 2013
    Posts
    115
    Mentioned
    3 Post(s)
    Quoted
    54 Post(s)

    Default

    To get the temperature of components in Windows, this might help you:
    http://msdn.microsoft.com/en-us/libr...8VS.85%29.aspx
    Keep in mind that I know close to nothing about C++.


    And yes, it's possible to make the GPU stop rendering at the press of a key, though the only way I know of is to overvolt the GPU or, the safer option (I guess), to undervolt it.
    *RuneScape Helper(Open Source)
    -Upcoming features
    --Price Checker(Fully Working)
    ---A system that will calculate how long until you level up in the selected skill(s).
    ---Better GUI(Partly Done)

  3. #3
    Join Date
    Feb 2011
    Location
    The Future.
    Posts
    5,600
    Mentioned
    396 Post(s)
    Quoted
    1598 Post(s)

    Default

    @Sin;

    Yes, it's possible in C/C++, but not natively in Java. If you want to do this in Java, you'd be wasting your time because you'd end up writing the C++ code into a .dll anyway and loading that from Java.. lol.


    What he linked above will do it via the Windows COM DLLs. Specifically, you want to include:

    C++ Code:
    #include <Objbase.h>

    and also link against Ole32.lib or Ole32.a. There are other ways to do it without the COM DLLs, but they'd require rolling your own. There are actually a ton of undocumented functions that do this sort of thing (but of course no documentation on usage). The temperature is easy enough though, and it's documented, so stick with COM.


    The third way I can think of is to write a kernel-mode driver to do it for you. That's going to take practice and solid knowledge, which I have no doubt you have; just lots of reading will be required. <-- That I doubt you want to do (try almost a whole book's worth)


    As for making the GPU wait, yeah.. I did it via the following. If you want something specific to OpenGL, you can use gDEBugger to stall specific apps. The following can be used for anything, though; it's an out-of-context snippet.


    C++ Code:
    // Needs <windows.h>, <thread>, and <chrono>.
    // Stall the calling thread until F1 is held down (poll every 100 ms).
    while (!(GetAsyncKeyState(VK_F1) & 0x8000))
    {
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }

    I inserted that into any function I wanted the GPU to pause on. You can simply insert it into a dummy function and have the GPU run that. Bam, GPU stall. Another way is to do a system hook: http://msdn.microsoft.com/en-us/libr...(v=vs.85).aspx

    and hook all running services and processes on the system. Then if they call any GDI functions, you simply stall them using the above code.

    Another idea is to subclass every window and stall again in WM_PAINT using the above. Then again, you can always use assembly.


    Any reason you want to do this? Stalling the GPU system-wide is highly NOT recommended; stalling specific applications is the usual approach.
    Last edited by Brandon; 06-27-2013 at 04:29 AM.
    I am Ggzz..
    Hackintosher

  4. #4
    Join Date
    May 2012
    Location
    Somewhere in, PA
    Posts
    1,810
    Mentioned
    9 Post(s)
    Quoted
    226 Post(s)

    Default

    Does the Best Solution here help?

    http://www.tomshardware.com/forum/28...n-temp-sensors

    If I understand SpeedFan correctly, it monitors data provided by a potentially wide variety of sensor chips attached to various buses, including the SMB and ISA bus. The SpeedFan FAQ states that the data from those sensors are frequently labelled with less than helpful terms, so it suggests one compare “labels and readings” in BIOS with those reported by SpeedFan.

    It also recommends a bit of experimentation. From the FAQ:

    To find your CPU's temperature sensor you can leave your system idle for a few minutes, to let temperatures drop, and then go to 100% usage for a while. The temperature that rises faster is the one you're searching for. Other available temperature readings usually come from your sensor chip itself, from the southbridge, the voltage regulator, or even from an additional probe placed under the processor.... As a final note, please remember that not all available temperature sensors are actually connected to something. If you happen to read unusually high or low temps, they are likely to be from a disconnected (unused) temperature sensor.

    There’s a fair amount of info at these two pages:
    http://www.almico.com/sffaq.php
    http://www.almico.com/sfarticles.php
    And may I ask why you want to stop/start your GPU at will?
    My First Build!, Selling Downloadable Games
    -------------------------------------

  5. #5
    Join Date
    Dec 2011
    Location
    Toronto, Ontario
    Posts
    6,424
    Mentioned
    84 Post(s)
    Quoted
    863 Post(s)

    Default

    Quote Originally Posted by Brandon View Post
    @Sin;

    Yes, it's possible in C/C++, but not natively in Java. If you want to do this in Java, you'd be wasting your time because you'd end up writing the C++ code into a .dll anyway and loading that from Java.. lol.


    What he linked above will do it via the Windows COM DLLs. Specifically, you want to include:

    C++ Code:
    #include <Objbase.h>

    and also link against Ole32.lib or Ole32.a. There are other ways to do it without the COM DLLs, but they'd require rolling your own. There are actually a ton of undocumented functions that do this sort of thing (but of course no documentation on usage). The temperature is easy enough though, and it's documented, so stick with COM.


    The third way I can think of is to write a kernel-mode driver to do it for you. That's going to take practice and solid knowledge, which I have no doubt you have; just lots of reading will be required. <-- That I doubt you want to do (try almost a whole book's worth)


    As for making the GPU wait, yeah.. I did it via the following. If you want something specific to OpenGL, you can use gDEBugger to stall specific apps. The following can be used for anything, though; it's an out-of-context snippet.


    C++ Code:
    // Needs <windows.h>, <thread>, and <chrono>.
    // Stall the calling thread until F1 is held down (poll every 100 ms).
    while (!(GetAsyncKeyState(VK_F1) & 0x8000))
    {
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }

    I inserted that into any function I wanted the GPU to pause on. You can simply insert it into a dummy function and have the GPU run that. Bam, GPU stall. Another way is to do a system hook: http://msdn.microsoft.com/en-us/libr...(v=vs.85).aspx

    and hook all running services and processes on the system. Then if they call any GDI functions, you simply stall them using the above code.

    Another idea is to subclass every window and stall again in WM_PAINT using the above. Then again, you can always use assembly.


    Any reason you want to do this? Stalling the GPU system-wide is highly NOT recommended; stalling specific applications is the usual approach.
    Gonna read this in depth once I'm a bit more awake, thanks

    Quote Originally Posted by Austin View Post
    Does the Best Solution here help?:

    http://www.tomshardware.com/forum/28...n-temp-sensors



    And may I ask why you want to stop/start your GPU at will?
    I basically don't want my GPU to render anything, but I still want to be able to bot overnight with Simba; I don't want my GPU to overheat during the night. Also, I want to read the temperature of components so that if any of them exceeds a certain threshold, I can shut the computer off.

  6. #6
    Join Date
    Feb 2011
    Location
    The Future.
    Posts
    5,600
    Mentioned
    396 Post(s)
    Quoted
    1598 Post(s)

    Default

    Quote Originally Posted by Sin View Post
    Gonna read this in depth once I'm a bit more awake, thanks

    I basically don't want my GPU to render anything, but I still want to be able to bot overnight with Simba; I don't want my GPU to overheat during the night. Also, I want to read the temperature of components so that if any of them exceeds a certain threshold, I can shut the computer off.


    LOL.. That's impossible. Render nothing and still bot? The only way is with injection/reflection. What you can do is go to the power settings and have everything turn off except the NIC and the processor. Then when you close the lid on the laptop or put it to sleep, it will still be botting, but nothing will render to the screen. The bot will still read from the graphics card without drawing anything to the screen; it'll use a buffer. SMART uses a buffer when minimized anyway and doesn't draw to the screen. The temperature one is a good idea, but having the GPU not render at all :S? How would you stop your program if you can't see anything?

  7. #7
    Join Date
    Jun 2013
    Posts
    115
    Mentioned
    3 Post(s)
    Quoted
    54 Post(s)

    Default

    Quote Originally Posted by Brandon View Post
    LOL.. That's impossible. Render nothing and still bot? The only way is with injection/reflection. What you can do is go to the power settings and have everything turn off except the NIC and the processor. Then when you close the lid on the laptop or put it to sleep, it will still be botting, but nothing will render to the screen. The bot will still read from the graphics card without drawing anything to the screen; it'll use a buffer. SMART uses a buffer when minimized anyway and doesn't draw to the screen. The temperature one is a good idea, but having the GPU not render at all :S? How would you stop your program if you can't see anything?
    I think what he means is making the video card not render anything while the integrated GPU in the CPU still renders as normal.

  8. #8
    Join Date
    Dec 2011
    Location
    Toronto, Ontario
    Posts
    6,424
    Mentioned
    84 Post(s)
    Quoted
    863 Post(s)

    Default

    It's a computer, and the GPU is an AMD Radeon HD 7770, not integrated :P
    I basically don't want the GPU to do anything; just make the processor do all the work.
