[RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
Hello
What should I put in DGIndexNV.ini to use the full potential of my GTX 1070 XTREME GPU?
I'm using BD Rebuilder and see that it mostly uses the CPU, while the GPU is barely used.
I would like it to use the GPU with its 1920 CUDA cores, not the CPU.
Thank you
1b81 GP104 [GeForce GTX 1070]
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
Testing on my GT 620 shows CUVID and CUDA to be indistinguishable, and they give the following performance for a 2160p AVC stream (the GT 620 is pretty crappy nowadays):
GPU 47%
VPU 95%
CPU 9%
FPS 32
DXVA mode gives:
GPU 72%
VPU 95%
CPU 5%
FPS 25
CPU 5% is pretty good for this high-resolution stream. You'll have lots of CPU to spare for encoding.
Your decision is whether you want to trade CPU for frame rate. If your bottleneck is not the decoding frame rate, then you might consider DXVA mode. DXVA mode, however, limits you to one instance of DGSource() in your scripts, but that shouldn't be a big deal because multi-instance capability is rarely used in practice. Beware, though: not all video cards may behave the same, so you really should do your own testing.
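If you want to run that comparison yourself, a minimal sketch would be an Avisynth script containing nothing but the source filter (the plugin path and file names below are placeholders, not anything from your setup), timed with AVSMeter:

# bench.avs -- decode-only benchmark (hypothetical paths)
LoadPlugin("C:\DGDecNV\DGDecodeNV.dll")
DGSource("C:\video\movie.dgi")

Running AVSMeter bench.avs reports the achievable decode FPS; repeat after switching decode modes and compare the numbers.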
I see that the List GPUs dialog is not displaying the GPU type correctly. Are you using build 2052 with the updated gpus.txt file included therein? If so, I am probably not taking case into account for the a-f hex digits. I'll check and slipstream if needed.
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
Yes, I use DGDecNV build 2052, but I have to put something in DGIndexNV.ini to force it; GP104 appears as 1b81 [GeForce GTX 1070].
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
pacoor wrote: Yes, I use DGDecNV build 2052, but I have to put something in DGIndexNV.ini to force it; GP104 appears as 1b81 [GeForce GTX 1070].
I don't understand that. What are you putting in the INI file? I don't know of anything in there that could affect this, other than the GPU selection, but that should be correct by default because you have only one GPU. Did you mean you edited the gpus.txt file to make it uppercase (1B81)? If so, that's a hack and won't fix all the other entries with hex digits.
Anyway, I made a slipstream to do the comparison correctly. Please download it and try it.
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: I don't understand that. What are you putting in the INI file? I don't know of anything in there that could affect this, other than the GPU selection, but that should be correct by default because you have only one GPU. Anyway, I made a slipstream to do the comparison correctly. Please download it and try it.
I tried to force the GTX 1070 in the .ini file.
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
What is the name of the file you changed, and what did you change in there? In any case, your screenshot is not showing the GPU core correctly, which is what I am talking about.
With CUDA_Device=255 or CUDA_Device=0 in DGIndexNV.ini and the slipstreamed DGIndexNV.exe, everything should display correctly. Please let me know if it doesn't.
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: What is the name of the file you changed, and what did you change in there? In any case, your screenshot is not showing the GPU core correctly, which is what I am talking about. With CUDA_Device=255 or CUDA_Device=0 in DGIndexNV.ini and the slipstreamed gpus.txt file, everything should display correctly. Please let me know if it doesn't.
I don't understand. Should I download a different dgdecnv2052.zip?
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
Get new build 2052 again from my site and use the new DGIndexNV.exe in there. Then show a screenshot of the List GPUs dialog. Thanks!
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: Get new build 2052 again from my site and use the new DGIndexNV.exe in there. Then show a screenshot of the List GPUs dialog. Thanks!
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
Working great now! Thanks for your testing, Sir.
Re: DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: Working great now! Thanks for your testing, Sir.
Thanks!
Could you help me get more of the processing onto the GPU instead of the CPU?
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
I already explained it. Didn't you read it?
Use DXVA mode as described in the manual. You'll lose some FPS, however. You'll also be limited to one instance of DGSource() in your script.
Set these in the DGIndexNV.ini file:
UseD3D=1
Decode_Modes=2,2,2,2
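To make the one-instance limitation concrete, here is a minimal sketch (file names are hypothetical): in DXVA mode a script may contain only a single DGSource() call, so this is fine:

# Allowed in DXVA mode: exactly one DGSource() instance.
LoadPlugin("C:\DGDecNV\DGDecodeNV.dll")
a = DGSource("C:\video\clip.dgi")
return a
# A second instance, e.g. b = DGSource("C:\video\other.dgi")
# spliced as a ++ b, would exceed the one-instance limit in DXVA mode.

In CUVID/CUDA mode the same script could open several sources.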
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: I already explained it. Didn't you read it? Use DXVA mode as described in the manual. You'll lose some FPS, however. You'll also be limited to one instance of DGSource() in your script. Set these in the DGIndexNV.ini file: UseD3D=1, Decode_Modes=2,2,2,2
I have no idea how to handle your program, sorry.
OK, I will try this configuration in the .ini file.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
pacoor wrote: I have no idea how to handle your program, sorry.
It's all explained in my brilliantly written user manuals.
I suppose English is an obstacle for you. OK, no problem, just post here and we'll all help you. You seem like a great guy.
Thank you for your interest in my tools.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: It's all explained in my brilliantly written user manuals. I suppose English is an obstacle for you. OK, no problem, just post here and we'll all help you. You seem like a great guy. Thank you for your interest in my tools.
Yes, English...
I do not understand it; either the nVidia drivers are not very compatible with the GTX 1070, or I am not following the explanation.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
Forgot to tell you that DXVA mode works only on Win 7 and Win 8/8.1. Sorry but it is an nVidia limitation. 9% CPU for 2160p is pretty damn good, IMHO, so don't be sad.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: DXVA mode works only on Win 7 and Win 8. Sorry but it is an nVidia limitation.
I will try to reinstall the previous version.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
It works fine on 7/8/8.1. On my main development machine I use 8.1. I hope to be able to ditch my GWX defenses any day now.
Before going to all the trouble of reverting your OS, however, I would run AVSMeter tests to see if you can live with CUVID/CUDA mode on your system. Tell us about your use case; maybe we can have other suggestions for you. Maybe some multi-threading, etc.
Good luck, my friend.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
admin wrote: It works fine on 7/8/8.1. On my main development machine I use 8.1. I hope to be able to ditch my GWX defenses any day now. Before going to all the trouble of reverting your OS, however, I would run AVSMeter tests to see if you can live with CUVID/CUDA mode on your system. Tell us about your use case; maybe we can have other suggestions for you. Maybe some multi-threading, etc. Good luck, my friend.
I tried to download AVSMeter, but I have problems with x64 Avisynth.
Thanks; tomorrow I will try other things.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
Yes, I am new to Avisynth processing.
I will try to reinstall Avisynth tomorrow. Thanks, Donald!
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
I have installed the latest package, Anon17's r2085 binaries (https://www.dropbox.com/s/t6yg2cc900tkc ... 85.7z?dl=0), and AVSMeter 2.3.4 from http://forum.doom9.org/showthread.php?t=165528.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
Hello
I tried MediaCoder and saw the GPU working very well, accelerating the process almost 16x.
With BD Rebuilder and DGIndexNV 2052 the GPU load is only 6%.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
The first thing you need to understand is that the first picture shows MediaCoder using NVENC, the built-in hardware encoder, so yes, it will be faster...
The best way to find out if it is using the GPU to the fullest is to load a video directly into DGIndexNV itself, go to Video > Disable Display, then click Play. If your FPS is around ~700, it's all fine, since that shows how fast your 1070 can decode. However, the pictures above show you are encoding at only 32 FPS, which is why you see only ~6% video usage.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
aceado is correct. DGDecNV accelerates only the decoding, not the encoding. Try his suggestion while monitoring GPU usage.
What I typically do is run my Avisynth script with AVSMeter. Either way will give you useful statistics.
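For example, a hedged sketch of such a script (the file names and filter choice are placeholders, not your actual job) could be:

# process.avs -- GPU decode followed by a software filter stage
LoadPlugin("C:\DGDecNV\DGDecodeNV.dll")
DGSource("C:\video\movie.dgi")
Spline36Resize(1280, 720)  # runs on the CPU, not the GPU

Timing it with AVSMeter process.avs and comparing against the same script without the resize shows how much of the run time goes to software stages rather than GPU decoding.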
Perhaps BDRebuilder can also be configured to use NVENC, I don't know. If it uses only a SW encoder then it is expected that the GPU utilization will be much lower, because most of the real time is spent doing software things, not GPU things.
It seems to me that you should focus on maximizing the transcoding frame rate (which determines how long the job will take) rather than the GPU usage. They may be related but it is not a straightforward relation.
Re: [RESOLVED] DGIndexNV.ini GTX 1070 XTREME best cuda use
Thank you, admin! Both an Avisynth script with AVSMeter and this method should yield the same result, depending on how the user wants to go about it... For me this way is more to the point (quicker, lol); the latter would give the true decoding speed before any filters are added.
Here's a quick small video showing that it does work just fine. The first clip is H.264 full HD (1920x1080) and the second is MPEG-2, also full HD (1920x1080). You'll see that when I play them, I also see about ~6% video usage.
The decoding speeds in the video show it can decode H.264 at ~756 FPS and MPEG-2 at ~854 FPS in this example.
http://www29.zippyshare.com/v/ykcaJml6/file.html