Is NVD API very slow for dependency check today? #6531
From https://nvd.nist.gov:
However, it seems the process always gets stuck at
I'm playing with nvdApiDelay and nvdMaxRetryCount without any success.
Seeing the same error on 20th March too.
Would it be crazy to package the latest CVE data with each release and automate a weekly release or something? The random failures, slowdowns and memory issues are driving me nuts.
If you want to make your installation more stable, consider keeping the database around between executions. You should only have to download the full NVD data set once. See https://jeremylong.github.io/DependencyCheck/data/cacheh2.html
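In CI, one way to follow this advice is to keep the data directory on a path the pipeline caches between runs. A minimal sketch for the CLI, assuming `dependency-check.sh` is on the PATH and that `--data` is the flag that sets the database directory in your version (the cache path and environment variable names here are illustrative, not from this thread):

```shell
# Keep the H2 database in a directory your CI caches between jobs,
# so the full NVD data set only has to be downloaded once.
DATA_DIR="$HOME/.cache/dependency-check"   # illustrative path
mkdir -p "$DATA_DIR"

dependency-check.sh \
  --data "$DATA_DIR" \
  --nvdApiKey "$NVD_API_KEY" \
  --scan ./ \
  --format HTML
```

With the directory cached, subsequent runs only fetch incremental NVD updates rather than rebuilding the database from scratch.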
Seeing the same issue FWIW. I had to flush the database because a couple of my systems hit this error when trying to run the scan:
I did the purge and now I am unable to rebuild the DB, hitting the described error at 95% consistently.
I'm caching/restoring the entire
FWIW I use the
Disabled dependencyCheck during Github build since NVD key update times out: jeremylong/DependencyCheck#6531 # Updated 3rd party dependencies: - postgresql 42.7.3 - spring-boot 3.2.3 - awaitility 4.2.1 - spring-data-mongodb 4.2.4 - spring-data-jpa 3.2.4 - log4j-to-slf4j 2.23.1 - json-path 2.9.0 - micrometer 1.12.4 - micrometer-tracing 1.2.4 - netty 4.1.107.Final - testcontainers 1.19.7 - jdbi3 3.45.1 - jackson 2.17.0 - reactor 2023.0.4 - spring 6.1.5 - mockito 5.11.0 - logback 1.5.3 - commons-compress 1.26.1 # Updated maven plugins: - dependency-check-maven 9.0.10 - maven-compiler-plugin 3.13.0 - maven-gpg-plugin 3.2.1
Reverting to version 8.4.3 has helped. NVD updates are downloaded
Even with the high failure rate, which I don't have a lot of control over, following either of these should help:
Perhaps it was working a bit better earlier, when these folks managed to get 95% of the way through building a DB, but right now the failure rate for me seems to be 100% of requests. So unless one has an already existing populated cache or mirror to work off of
@jeremylong what about having an NVD mirror, similar to the Retire JS repository (i.e. https://raw.githubusercontent.com/Retirejs/retire.js/master/repository/jsrepository.json), as mentioned on https://jeremylong.github.io/DependencyCheck/data/mirrornvd.html
@chadlwilson I've had the vulnz CLI updating nightly for a while now and haven't seen much of an issue: https://github.com/dependency-check/DependencyCheck_Builder/actions/workflows/cache.yml
@chadlwilson but yes - for a brand new user this is problematic.
@jeremylong yeah, but the 100% failure rate is for the last 6 hours or so (not 24 hours ago, when your job last ran successfully). Your job that kicked off 45m ago is stuck/failing like everyone else's: https://github.com/dependency-check/DependencyCheck_Builder/actions/runs/8374001401/job/22928273185
Same problem here; it even seems to be worse today than yesterday.
Agree with @irineuruiz, today is worse. All our ADO pipelines are failing today. We use the ADO extension OWASP Dependency Check and run pipelines on self-hosted agents with an Azure VM scale set.
Just some context for people who are maybe not following infosec news (no affiliation whatsoever).
Well, there is a data feed here: https://dependency-check.github.io/DependencyCheck_Builder/nvd_cache/ As @chadlwilson pointed out, the update is currently failing, but the data was refreshed last night so it is fairly current.
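For those who want to point ODC at that feed instead of the live API, the CLI can be configured to use a mirrored data feed. A sketch, assuming `--nvdDatafeed` is the relevant flag in your version (check `--help`; the Maven/Gradle plugins use an equivalent property):

```shell
# Pull NVD data from the pre-built cache rather than the live 2.0 API,
# sidestepping the current API outage.
dependency-check.sh \
  --nvdDatafeed "https://dependency-check.github.io/DependencyCheck_Builder/nvd_cache/" \
  --scan ./ \
  --format HTML
```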
@jeremylong I'm caching the db between runs using versions 9.0.2 and 9.0.9 (CLI). Today all our GitHub Actions runs are stuck in a loop in the retry request:
After a while it hits an out-of-memory error. The counter looks off; it should stop after a while, no? It's so persistent that I cannot even cancel the GitHub workflows. :-D (I will set a timeout for sure.)
Ran into this today too; it looks like a bug. The documentation says the default should be 10, but it loops infinitely.
Reverting to version 8.4.3 works.
Hi @jeremylong, I have not yet familiarized myself with the download client and the NVD API. But I see via
Using 2 workarounds with the latest build:
Same here (version 9.0.9), using local database. Stuck in "update" forever. Only workaround (pom.xml):
If autoUpdate is set to false, the database won't be updated, but at least the build job runs...
I am seeing the same behavior as @Reamer with multiple retries of the same requests. With 9.0.10, I'm also not seeing the expected date range on the request URLs: With
With
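For context on what a correctly-windowed request looks like: incremental updates against the NVD 2.0 API are scoped with `lastModStartDate`/`lastModEndDate` query parameters. A hypothetical manual check (the timestamps are placeholders; consult the NVD API docs for the exact ISO-8601 format it accepts):

```shell
# Illustrative request against the NVD 2.0 CVE API with an explicit
# last-modified window; a request missing these parameters would pull
# the full data set instead of an incremental update.
curl -s -H "apiKey: $NVD_API_KEY" \
  "https://services.nvd.nist.gov/rest/json/cves/2.0?lastModStartDate=2024-03-19T00:00:00Z&lastModEndDate=2024-03-20T00:00:00Z"
```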
Email from NIST... "Good afternoon, We are aware of availability issues with the 2.0 API endpoints and are currently investigating. We apologize for the inconvenience at this time. V/r"
The retry limit is definitely broken. This was with it set to 2:
And when it was set to 0:
What about an alternate data feed, like https://osv.dev/? Obviously, this would be a major change for this project.
Contrary to what some people have indicated, reverting to version 8.4.3 does NOT actually work for us. However, if anyone has experienced otherwise recently, we could use that as an alternative to partially unblock ourselves.
@arnabcse28 We switched back to version 8.4.3 two hours ago and it worked, but we've only run it twice so far.
Maybe change the code so that if it fails to connect to the NVD, it falls back to what's cached instead of failing. For example: it tries 10 times (a configurable setting) and gets a 503 each time (the HTTP error codes also configurable, defaulting to 503); instead of failing, it displays a WARNING that the NVD couldn't be contacted because of a 503 error 10 times and that a local cached copy of the NVD database dated X is being used instead.
The user should be able to decide, via a parameter, whether failover to a cached copy of the NVD database is allowed (the default is debatable, but I'd say false). This won't help users who still need to get a cached copy, but for those who have one, things will keep working until the NVD comes back online.
Lately, the NVD's instability has made me want to build a local cache of the NVD DB. I hadn't done this before only because the NVD had been fairly reliable.
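The bounded-retry-then-fallback behavior suggested above (which, per the comments below, is exactly what the current retry loop fails to do) can be sketched generically in shell. This is not ODC's actual implementation; the function name, retry count, and warning text are all illustrative:

```shell
# Retry a command up to a fixed number of times, then fall back
# instead of failing the build outright.
run_with_retries() {
  local max_retries=$1; shift
  local attempt=1
  while [ "$attempt" -le "$max_retries" ]; do
    if "$@"; then
      return 0
    fi
    echo "attempt $attempt/$max_retries failed; retrying" >&2
    attempt=$((attempt + 1))
  done
  # Bounded: give up after max_retries instead of looping forever.
  echo "WARNING: NVD unreachable after $max_retries attempts; consider using the cached copy" >&2
  return 1
}

run_with_retries 3 false || echo "fell back to cache"    # prints: fell back to cache
run_with_retries 3 true && echo "update succeeded"       # prints: update succeeded
```

The key point is the hard upper bound on attempts; whether the fallback to a stale local database is then permitted would be governed by a user-facing parameter as described above.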
It seems the NVD is broadly back up now (at least sufficiently to work off a cache or mirror), so perhaps we can close this and move the discussion to more focused issues about improving the caching, addressing the retry-strategy challenges, and/or the logging of retries. The irony is that, as @marcelstoer noted above in #6531 (comment) (thank you!), the NVD is currently not feeding much (any?) useful new data to projects anyway: few CVEs are being mapped to CPEs, which leaves ODC unable to do much with the updates from the NVD.
I second closing this issue.
People should look at sorting out a token for the OSSIndex feed if they haven't already.
Hmm, that's interesting, because ODC wasn't highlighting https://ossindex.sonatype.org/vulnerability/CVE-2024-22257 to me via either NVD or OSSIndex (when I observed, by contrast, that Trivy-via-GHSA was), so I started to fear they may be relying on NVD for more than we might have anticipated. But I do note it is mapped to products in OSSIndex now; I think it was just a bit delayed, which is somewhat reassuring about Sonatype's support/commitment.
ODC told me about that on Wednesday, but I don't run it every day.
Appears to be resolved on the NVD end.
Unfortunately, NVD is acting up again after a few days of being stable...
It seems like the NVD service is down again when running the dependencyCheckAnalyze task via the Gradle plugin's dependencyCheck { } configuration. Implementing a local cache would be a fail-safe for any outages of the NVD service.
Yes, facing this issue once again at my end... I think the following could be options to get through the dependency-check stage while NVD is down, if a hosted DB is not available:
cc: @jeremylong
We've been using -DnvdValidForHours=10000 during the previous downtime, so as to skip only the NVD updates and not, e.g., the suppression list.
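For the Maven plugin, that workaround looks like the following (the hours value is just what happened to be used here; pick anything comfortably longer than the expected outage):

```shell
# Treat already-downloaded NVD data as fresh for ~10000 hours, so the
# NVD update step is skipped while the API is down; other downloads
# such as the suppression list still happen as usual.
mvn org.owasp:dependency-check-maven:check -DnvdValidForHours=10000
```

Note this only helps if a previously populated database is already present; on a clean environment there is nothing to mark as "still valid".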
It seems that the NVD API is returning HTTP 503 errors today.
There is one, but #6535 prevents it being used.
It seems there were some issues last weekend as well; we see a lot of failed pipelines.
Different root cause. Please look at open issues before jumping to conclusions: #6746
No conclusions were made; I just stated we ran into issues.
Well, posting on an old ticket before checking for newer ones that might explain your issues is not particularly helpful, especially when people are opening many duplicates as it is.
I am consistently getting errors while trying to get updates for my builds from the NVD at: dependency-check-maven:9.0.10:check
It started late morning today (March 20th, 2024), and the issue occurs with or without an NVD API key. Before today, the same setup was working seamlessly on the same infrastructure (Jenkins as well as individual workstations).
Below is a sample error that I got after the dependency update had run for 59 minutes: