
Decrypting encrypted files from Akira ransomware using a bunch of GPUs

257 points | 1 month ago | tinyhack.com
__alexander 1 month ago

Note: Someone commented on the “limited shelf-life” of ransomware and why this doesn’t hurt other victims. They deleted their comment but I’m posting my response.

You are incorrect. What is limited is the number of attacks that can be used for victims to recover their files. If you think the author is the only person using this attack to recover files, you are incorrect again. I'd recommend checking out the book The Ransomware Hunting Team. It's an interesting book about what happens behind the scenes when helping victims recover their files.

PUSH_AX 1 month ago

What use is a counterattack if it's inaccessible, whether because of its cost or because it's only known to a few experts?

This feels like a net win.

stavros 1 month ago

What use is a counterattack if it's immediately fixed? Then absolutely nobody can use it, not even a few experts.

PUSH_AX 1 month ago

You're making a lot of assumptions about its capability to reconnect and patch/update itself. Preface the fix with "keep your machine offline from here on out" and we're back to fixing it for everyone infected before that point.

w-ll 1 month ago

I'm confused. Are you saying that you think building a method for anyone to break/brute-force the ransomware is bad?

comex 1 month ago

They're saying that publicly disclosing the vulnerability is bad because now it will be fixed.

hassleblad23 1 month ago

This is a game of cat and mouse, like it has always been. You cannot rely on security through obscurity.


bawolff 1 month ago

Anyone know why they are using timestamps instead of /dev/random?

Don't get me wrong, I'm glad they didn't; it's just kind of surprising, as it seems like such a rookie mistake. Is there something I'm missing here, or is it more a case of people who know what they're doing not choosing a life of crime?
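(For context on why timestamp seeding is considered a rookie mistake: the seed space is small enough to enumerate. A toy illustration — the seed derivation below is made up, not Akira's actual scheme:)

```python
import hashlib

def key_from_timestamp(ts_ns: int) -> bytes:
    """Toy key derivation: hash a nanosecond timestamp.
    Hypothetical; only shows how small the search space is."""
    return hashlib.sha256(ts_ns.to_bytes(8, "little")).digest()

# Suppose an attacker knows encryption happened within ~1 ms of a
# file's mtime, at (say) 100 ns timer granularity: only 10,000 seeds.
mtime_ns = 1_700_000_000_000_000_000
true_key = key_from_timestamp(mtime_ns + 4_200)  # the "secret" seed

candidates = (mtime_ns + step * 100 for step in range(10_000))
recovered = next(ts for ts in candidates
                 if key_from_timestamp(ts) == true_key)
```

A /dev/random-seeded 256-bit key has no such enumerable window, which is why falling back to timestamps reads as an unforced error.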

Retr0id 1 month ago

afaik the majority of ransomware does manage to use cryptography securely, so we only hear about decrypting like this when they fuck up. I don't think there's any good reason beyond the fact that they evidently don't know what they're doing.

ulrikrasmussen 1 month ago

My unqualified hunch: if they did that, then a mitigation against such malware could be for the OS to serve completely deterministic data from /dev/random for all but a select few processes which are a priori defined.

zerd 1 month ago

You can do the same with time though, just return a predefined sequence of timestamps.
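(Sketching that point: if the "clock" a process sees is deterministic, timestamp seeding collapses the same way deterministic /dev/random output would. Illustrative Python only, not an actual OS mechanism:)

```python
import itertools

class DeterministicClock:
    """Serve a predefined, repeatable timestamp sequence to an
    untrusted process (conceptual sketch only)."""
    def __init__(self, start_ns: int = 0, step_ns: int = 1):
        self._ticks = itertools.count(start_ns, step_ns)

    def monotonic_ns(self) -> int:
        return next(self._ticks)

clock = DeterministicClock(start_ns=1_000, step_ns=10)
seeds = [clock.monotonic_ns() for _ in range(3)]
# A defender replaying this clock knows every "timestamp" the
# malware saw, so any key derived from them is recoverable.
```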

Retr0id 1 month ago

And from a "defensive" perspective, if you don't trust any single entropy source, the paranoid solution is to combine multiple sources together rather than to switch to another source.

If it were me, I'd combine urandom (or equivalent), high-res timestamps, and clock jitter (sample the LSB of a fast clock at "fixed" intervals where the interval is a few orders of magnitude slower than the clock resolution), and hash them all together.
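(That mixing scheme can be sketched as follows; the spin interval and sample count are arbitrary choices for illustration, not a vetted design:)

```python
import hashlib
import os
import time

def jitter_bits(samples: int = 256) -> bytes:
    """Sample the LSB of a fast clock at comparatively slow, 'fixed'
    intervals; drift between the two clocks supplies the jitter."""
    bits = bytearray()
    for _ in range(samples):
        t0 = time.perf_counter_ns()
        while time.perf_counter_ns() - t0 < 10_000:  # ~10 us busy-wait
            pass
        bits.append(time.perf_counter_ns() & 1)
    return bytes(bits)

def mixed_seed() -> bytes:
    """Hash several entropy sources together; an attacker must
    predict ALL of them to predict the output."""
    h = hashlib.sha256()
    h.update(os.urandom(32))                        # OS CSPRNG
    h.update(time.time_ns().to_bytes(8, "little"))  # high-res timestamp
    h.update(jitter_bits())                         # clock jitter
    return h.digest()
```

The design point is that hashing sources together can only add entropy: even if two of the three inputs are fully predictable, the output is no weaker than the remaining one.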

0cf8612b2e1e 1 month ago

Even if the attackers used encryption that has been fully broken since the 1980s, how many organizations have the expertise to dissect it?

I assume that threat-detection vendors maintain big fingerprint databases of tools associated with malware. Rolling your own tooling, rather than importing a known library, gives one less heuristic to trip detection.

int0x29 1 month ago

They used libgcrypt, with the IVs mucked with: https://www.gnupg.org/software/libgcrypt/index.html

dherls 1 month ago

Charitably: use of system-level randomness primitives can be audited by antivirus/EDR.

hassleblad23 1 month ago

I wonder at what point the antivirus would kick in. Encrypting files doesn't require reading /dev/urandom for very long.

__alexander 1 month ago

Rolling your own crypto is still a thing.

mschuster91 1 month ago

If it works (reasonably well), it works, and it throws wrenches into the gears of security researchers when the code isn't the usual, immediately recognizable S-boxes and other patterns or library calls.

emmelaich 1 month ago

Might be a bit of paranoia about backdoors in official crypto libs, too.

econ 1 month ago

In case the tool is used against them.

Ameo 1 month ago

This was a great read and had just the right amount of detail to satisfy my curiosity about the process without being annoying to read.

Huge props to the author for coming up with this whole process and providing such fascinating details

throwaway48476 1 month ago

Ransomware would be less of a problem if applications were sandboxed by default.

XorNot 1 month ago

Sandboxed how? Applications generally are used to edit files, and those are the valuable files to a user.

Ransomware wouldn't be a problem at all if copy-on-write snapshotting filesystems were the default.

fennecbutt 1 month ago

Sandbox where the user specifies access to certain files (like limiting access to certain gallery items on Android).

Then changes made to files should be stored as deltas to the original.

But realistically a good readonly/write new backup solution is needed, you never know when something bad might happen.

cyberpunk 1 month ago

Okay so you give the sandboxed app access to ~/Documents and those get encrypted…

I think most people care less about their system directories than about their data?

Backups and onedrive for enterprises, yes. :)

stavros 1 month ago

Obviously if you give all sandboxed processes access to /, that doesn't improve anything.

The idea is that you'd notice that your new git binary is trying to get access to /var/postgres, and you'd deny it, because it has no reason to want that.
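(A minimal sketch of that per-process allowlist idea; the policy format and paths are invented for illustration:)

```python
from pathlib import Path

# Hypothetical policy: each program gets an explicit prefix allowlist.
POLICY = {
    "git":  ["/home/user/projects"],
    "curl": ["/tmp"],
}

def access_allowed(program: str, target: str) -> bool:
    """Grant access only if the target falls under one of the
    program's allowed directory prefixes; everything else would
    trigger a deny (or a user prompt)."""
    allowed = POLICY.get(program, [])
    t = Path(target).resolve()
    return any(t.is_relative_to(p) for p in allowed)
```

Under this model a trojaned git binary reaching for /var/postgres would surface as a deniable prompt rather than silently succeeding.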

edoceo 1 month ago

Feels like a case where ZFS would help mitigate?

charcircuit 1 month ago

Like Android and iOS. The user has to manually grant access to files.

XorNot 1 month ago

Which doesn't scale to office workstations or workplaces with network drives, where users needing to search and update hundreds of files at a time is the norm.

Developers with 1 project open have potentially hundreds to thousands of open, quite valuable files.

Now of course, we generally expect developers to have backups via VCS but that's exactly the point: snapshotting filesystems with append semantics for common use cases is an actual, practical defense.

TacticalCoder 1 month ago

Of course, there are many things a company can do to be more assured it can access its data: CoW snapshots, backups on read-only media (e.g. DVD or Blu-ray discs), HDDs/SSDs kept offline on shelves, and certainly many others.

That's not incompatible with sandboxing applications to limit the damage a malware can do.

Even on a regular user's "workstation" there's no need for every single app to have rw access to every single directory and every single network drive, etc.

P.S.: FWIW, the backup procedure I put in place doesn't just encrypt/compress/deduplicate the backups; it also compares each backup to previous backups (comparing sizes gives an idea, for example), then verifies that the backup can be decrypted, using a variety of checks. For example, if a Git repo is found after decrypting and decompressing the backup, it'll run "git fsck" on it; if a file with a checksum is found, it'll verify that file's checksum; etc.

It already helped us catch not malware but a... bitflip! I figured that if a procedure can help detect a single bitflip, it can probably help detect malware-encrypted data too. I'm not saying it's 100% foolproof; all I'm saying is there's a difference between "we're sandboxing stuff and running some checks" and "we allow every single application to access everything on all our machines because we need users to access files".
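(The comparison step described above can be sketched in a few lines; a content-digest diff against the previous run flags both a corrupted file and a malware-encrypted one, since either changes the hash. A minimal illustration, not the actual procedure:)

```python
import hashlib

def digest(data: bytes) -> str:
    """Content hash used to compare two backup generations."""
    return hashlib.sha256(data).hexdigest()

def changed_files(prev: dict, curr: dict) -> set:
    """prev/curr map file path -> content digest for two backup runs;
    return paths whose content changed (or disappeared)."""
    return {p for p in prev if curr.get(p) != prev[p]}

prev = {"a.txt": digest(b"hello"), "b.txt": digest(b"world")}
corrupted = b"worle"  # a single damaged byte, or ciphertext after malware
curr = {"a.txt": digest(b"hello"), "b.txt": digest(corrupted)}
```

An unexpectedly large `changed_files` result between two runs is exactly the kind of signal that distinguishes normal churn from mass encryption.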

gosub100 1 month ago

Or if non-trusted/unsigned apps only had CoW disk access.

gblargg 1 month ago

Or if people backed up more often.

1vuio0pswjnm7 1 month ago

"On my mini PC CPU, I estimated a processing speed of 100,000 timestamp to random bytes calculations per second (utilizing all cores)."

Would like more details on the mini PC: processor, RAM, price. Is it fanless?
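(For scale, some back-of-the-envelope arithmetic on that 100,000/sec figure; the uncertainty window here is an assumption for illustration, not from the article:)

```python
cpu_rate = 100_000      # timestamp-to-bytes checks/sec, from the quote
window_ns = 5 * 10**9   # assumed 5-second uncertainty at ns resolution

# Exhausting one nanosecond-resolution window on the CPU alone:
seconds = window_ns / cpu_rate
hours = seconds / 3600  # roughly 14 CPU-hours per timestamp window
```

And if the key material depends on several timestamps, the windows multiply rather than add, which is what pushes the search onto GPUs.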

heavensteeth 1 month ago

What could explain encrypting the first 65k with KCipher-2 and the rest with something else? Seems odd.

fragmede 1 month ago

[flagged]

martinsnow 1 month ago

Why don't you do the legwork instead of asking rhetorical questions?

charcircuit 1 month ago

Legwork of what? Companies already have done the legwork to make it easy for strangers to send you money.

technion 1 month ago

Companies that "do the legwork" of decrypting ransomware for the most part just pay the ransom on your behalf.

cannonpalms 1 month ago

> why publish this?

New versions of Akira and any other ransomware are constantly being developed. This code is specific to a certain version of the malware.

As noted in the article, it also requires:

1. An extremely capable sysadmin
2. A bunch of GPU capacity
3. That the timestamps be brute-forced separately

So it's not exactly a turn-key defeat of Akira.

dylan604 1 month ago

Once your files are encrypted by ransomware, does the encryption change if the malware gets updated? If not, then anyone currently infected with this version can now possibly recover.

If they don't release their code, then what's the point of having it? They accomplished their task, and now here you go, for anyone else who might have the same need. Otherwise: don't get infected by a newer version.

IncreasePosts 1 month ago

How would it be better, unless it's widely known to be breakable? And at that point, wouldn't the hackers know that too?