
Cuchaz

Members
  • Posts

    27
  • Joined

  • Last visited

Everything posted by Cuchaz

  1. Thanks for the suggestions. The limitations on core mods seem to be a philosophical decision from cpw; there's no technical reason why core mods can't do these things. That's disappointing, but there doesn't seem to be anything I can do about it. Your first suggestion is what I'll have to do: split my mod into two mods.
  2. cpw says core mods should never do these things. But he didn't give a reason why. =(
  3. I take it not many people write coremods around here... Is there any way to get a dev response on this? What's the usual procedure for lobbying for patches to the forge codebase?
  4. And while I'm thinking of it, the event dispatcher for core mods seems to swallow exceptions. That makes debugging slightly more difficult, but it's not a huge problem. i.e., if my mod container has a method like this:

        @Subscribe
        public void construct( FMLConstructionEvent event ) {
            throw new RuntimeException( "where does this go?" );
        }

     the exception appears nowhere in the log. =( The easy workaround for now is to just remember to put a try/catch in all my event handlers.
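The swallowing behavior and the workaround can be simulated without FML at all. Below is a toy reflective dispatcher that, like the real event bus apparently does, discards whatever a handler throws; all names here are illustrative, this is not FML's actual code:

```java
import java.lang.reflect.Method;

public class SwallowDemo {

    // A toy dispatcher that, like FML's event bus, invokes handlers
    // reflectively and silently discards anything they throw.
    static void dispatch(Object handler, String methodName) {
        try {
            Method m = handler.getClass().getMethod(methodName);
            m.invoke(handler);
        } catch (Exception swallowed) {
            // the handler's exception (wrapped in an
            // InvocationTargetException) vanishes here
        }
    }

    static String lastLog = "";

    // Unwrapped handler: its exception is swallowed by dispatch().
    public void badHandler() {
        throw new RuntimeException("where does this go?");
    }

    // Workaround: the handler catches and logs before the dispatcher
    // gets a chance to swallow the exception.
    public void safeHandler() {
        try {
            throw new RuntimeException("where does this go?");
        } catch (RuntimeException ex) {
            lastLog = "handler failed: " + ex.getMessage();
        }
    }

    public static void main(String[] args) {
        SwallowDemo demo = new SwallowDemo();
        dispatch(demo, "badHandler");   // nothing appears anywhere
        dispatch(demo, "safeHandler");  // the failure is at least logged
        System.out.println(lastLog);    // prints: handler failed: where does this go?
    }
}
```

The try/catch inside the handler is the only place you can intercept the failure, because by the time the dispatcher sees it, it's already on its way to being dropped.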
  5. If for some reason it's not taboo to have core mods registering things like entities and guis, this small patch seems to resolve the issue, in cpw.mods.fml.common.FMLCommonHandler:

        public ModContainer findContainerFor( Object mod ) {
            if( mod instanceof ModContainer ) {
                return (ModContainer)mod;
            }
            return Loader.instance().getReversedModObjectList().get( mod );
        }
  6. Hello, I'm writing a core mod now. For some reason, FMLCommonHandler.instance().findContainerFor(mod) doesn't work on core mods. It can't find the container and just returns null. After glancing at the source, it seems this is by design: InjectedModContainer goes to great lengths to "override" the isImmutable() method and disable inclusion of the mod in the usual lookup list that findContainerFor() uses.

     So my question is: how can my core mod use findContainerFor()? Or, more importantly, how can my core mod use Forge functions that rely on findContainerFor(), like entity registration, gui registration, etc.? Are core mods not supposed to register these things for some reason? Am I required to split my mod into two mods, one FML mod and one core mod? It would be nice to keep everything in one mod. Thanks, Cuchaz
  7. I agree. It would be slow. That's why it's easier to develop in a deobfuscated environment. You could probably skip the slow MCP compile script by just passing Eclipse's bin folder to the MCP reobfuscation script; incremental compiling will save you a lot of time. Your launch script could also pipe the log file to stdout (or redirect Minecraft's stdout) if you want to see it immediately. Or you could use a log file viewer.
  8. Yeah, that seems the most straightforward way to do it.
  9. Or just distribute Javassist with your library. Up to you.
  10. I'm deeply disturbed this isn't considered a flaw. Nevertheless, you've made your position clear and I won't trouble you about it anymore. However, I will ask if anyone else sees this as a problem and wants to do anything about it. I can probably write a mod to provide a real security layer for mod authors. If any concerned modders out there want to help, they are certainly welcome to contribute.
  11. If you want to dynamically change class files at runtime, I recommend using a tool called Javassist. It will make your life much easier. http://www.csg.ci.i.u-tokyo.ac.jp/~chiba/javassist/
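For the flavor of it, here is a minimal Javassist sketch. The target class com.example.TargetClass and its update() method are hypothetical, and you'd need the javassist jar on your classpath:

```java
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;

public class PatchDemo {
    public static void main(String[] args) throws Exception {
        // Load the target class' bytecode into Javassist's pool.
        ClassPool pool = ClassPool.getDefault();
        CtClass cc = pool.get("com.example.TargetClass"); // hypothetical target

        // Inject a statement at the start of an existing method;
        // Javassist compiles the Java source string to bytecode for you,
        // so you never touch raw bytecode yourself.
        CtMethod m = cc.getDeclaredMethod("update"); // hypothetical method
        m.insertBefore("{ System.out.println(\"update() called\"); }");

        // Write the patched .class file back out (or call cc.toClass()
        // to load the modified class directly into the running JVM).
        cc.writeFile("patched-classes");
    }
}
```

The big win over raw bytecode libraries is exactly that insertBefore() call: you hand it a snippet of plain Java source and Javassist does the compilation and stack bookkeeping for you.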
  12. Here is a demonstration of the exploit. This link points to a zip file with three mods in it. https://bitbucket.org/cuchaz/power-tools/downloads/exploitDemo.zip

     The first mod, testMod.zip, is the original mod. It's actually my Power Tools mod v1.2 on PMC. I compiled it from source and signed it in the usual way using a private key. It handles the FMLFingerprintViolationEvent with the following code:

        @EventHandler
        public void onSignatureFail( FMLFingerprintViolationEvent event ) {
            // ignore the development environment
            if( event.isDirectory ) {
                return;
            }
            System.out.println( "\n\n" );
            System.out.println( "===============================================" );
            System.out.println( "    Hack Report!" );
            System.out.println( "===============================================" );
            System.out.println( String.format( "Mod %s failed fingerprint check!", event.source.getAbsolutePath() ) );
            System.out.println( String.format( "\tExpected fingerprint: %s", event.expectedFingerprint ) );
            System.out.println( String.format( "\tObserved %d fingerprints:", event.fingerprints.size() ) );
            for( String fingerprint : event.fingerprints ) {
                System.out.println( "\t\t" + fingerprint );
            }
            System.out.println( "\n\n" );
        }

     If an invalid signature is detected for this mod, you'll see a "Hack Report" in Forge's log file. However, since testMod.zip is completely benign, there won't be anything interesting in the log when you load it with Forge.

     The second mod, hackedMod.detectable.zip, is a hacked version of testMod.zip. Instead of actually loading the original mod, this hacked version is merely programmed to spit out a message to the console. If you load it using Forge, the invalid signature event will be thrown and you'll see the "Hack Report" in the Forge log. In this case, everything is working as intended. At least, that's how I assume you want this system to work.

     The third mod, hackedMod.undetectable.zip, is another hacked version of testMod.zip. Instead of actually loading the original mod, this hacked version is merely programmed to spit out a different message to the console. If you load it using Forge, the invalid signature event will NOT be thrown and you WON'T see the "Hack Report" in the Forge log.

     Crucially, all three mods identify as id="cuchaz.powerTools" name="Power Tools", so an end user could not tell the difference from the Forge mods screen. The two hacked mods were NOT compiled from the original source and they were NOT signed using the original key. I used tools to modify the class files of testMod.zip to install the hack. You could send me any mod zip file and I could repeat the same hack, and your signature system can't detect it.

     Hopefully this demonstration is sufficient to convince you that there is a flaw in Forge's signature system.
  13. You touted your signature system as a way for mod authors to tell if their mod has been tampered with. Whether or not you want to call it a "security" system is irrelevant. I asserted that it doesn't work and disclosed the exploit. You haven't yet convinced me that it actually does work. It's clear to me now that you have no interest in fixing the flaw in your tamper-evident system. I'm pretty sure you don't even understand what's wrong with it. Statements like "we have to trust the mod" lead me to believe that I'll never be able to convince you that something is actually very, very wrong. That's fine. I'll spend my attention elsewhere.
  14. The point I'm trying to make is that the system can't meet this guarantee. The system is broken. It doesn't work. From reading how the system works, it seems quite trivial to hack one of these "signed" jars, and the mod author can't do anything to keep someone from running the hacked code. I'll explain why.

     You're relying on the mod jar (which should not be trusted) for two things: 1) the correct value of the jar digest (i.e., the signature), and 2) what to do when a violation event occurs. Because of the way the system is implemented, there's no way for me as a mod author to prevent hacked versions of my mod from being loaded and running arbitrary code. The tamper-evident part of the system doesn't even work correctly. All an attacker needs to do to subvert a mod is delete 1 and 2 from the jar, and all the "security" is completely disabled. If the attacker wants to be fancy, the attacker could re-sign the jar and overwrite 1 without even needing to change 2.

     Signatures can be an extremely secure system. You just have to understand how crypto systems work and then implement them properly. Also, try to be nice. I'm not just complaining. I'm offering to help you fix it.
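To make the "delete 1 and 2" step concrete: stripping the signature entries out of a jar takes only a few lines of stock Java. A sketch using an in-memory zip (the entry names are illustrative; real signed jars keep their digests and signature block in META-INF/*.SF and META-INF/*.RSA):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class StripSignatureDemo {

    // Build a toy "signed" jar in memory: one class file plus the
    // signature entries a real jarsigner would add.
    static byte[] makeSignedJar() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream out = new ZipOutputStream(bytes)) {
            for (String name : new String[] {
                    "META-INF/MANIFEST.MF", "META-INF/SIGNER.SF",
                    "META-INF/SIGNER.RSA", "com/example/Mod.class" }) {
                out.putNextEntry(new ZipEntry(name));
                out.write(("contents of " + name).getBytes("UTF-8"));
                out.closeEntry();
            }
        }
        return bytes.toByteArray();
    }

    // Copy the jar, simply omitting the signature files. Every entry
    // that survives loads exactly as before -- nothing checks it anymore.
    static List<String> stripAndList(byte[] jar) throws IOException {
        List<String> kept = new ArrayList<>();
        try (ZipInputStream in = new ZipInputStream(new ByteArrayInputStream(jar))) {
            ZipEntry entry;
            while ((entry = in.getNextEntry()) != null) {
                String name = entry.getName();
                if (name.endsWith(".SF") || name.endsWith(".RSA")) {
                    continue; // the "security" lived only in these entries
                }
                kept.add(name);
            }
        }
        return kept;
    }

    public static void main(String[] args) throws IOException {
        List<String> kept = stripAndList(makeSignedJar());
        System.out.println(kept);
        // prints: [META-INF/MANIFEST.MF, com/example/Mod.class]
    }
}
```

The point of the sketch: because the digest and the violation handler both ride along inside the untrusted jar, nothing outside the jar notices when they're gone.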
  15. It wasn't originally a suggestion, but sure. Moderate away.
  16. That's a great question. If you want this list to be a list of Modders Who Probably Won't Hack You, that's a lot of work for the Forge team. It's definitely not worth the effort to vet every piece of mod code. But all the list really needs to be is a list of Modders Who Are Allowed To Write Code for Mod X. That completely moves the burden of "Which modders do I trust?" onto the end user. Then Forge can just worry about, "Did this code really come from modder Y?" In the second case, the list would never need moderation as long as we only add entries to the list. Now, if someone wants to change an existing entry, that would need some kind of approval, I think. But that should rarely happen, and if you wanted to, you could completely disallow changes to existing entries by policy. Cuchaz
  17. Nah, none of that's really necessary. All we need to do is keep a list of trusted modders (who opt into this list), their public keys, and their mods. As long as they sign all their updated jars with the same private key, we don't need to update the list of public keys every time a jar is updated. We only need to update the list when a new mod author wants to opt in or when a mod author releases a new mod. Now, if redistributing the list every time a new mod author opts in or releases a mod is still too frequent, then we can start thinking about ways to get the FML to automatically download updates of the list. For that, we would just need an HTTP server. If you want to host that, awesome! Cuchaz
  18. If there are private keys baked into FML, we should fix that first thing. The only keys that should see the light of day are public keys. Private keys should be kept in the deepest darkest hole possible. Sorry, I didn't mean "you" individually. I meant the unspecified "you." I'm sure we can improve on whatever security system is currently built into Forge. If we actually want to do that, we don't need to host any authentication servers. We just need to write a little code for FML (which I'll offer to do if you want) and maintain the opt in list of public keys for mod authors. The list probably needs to be distributed with Forge. We'll have to decide on encryption/digest/signature standards and such as well. Cuchaz
  19. If signing the jar with another key will pass the security check, then that further illustrates the ineffectiveness of the system. If anyone is allowed to sign the jar, then the signatures are pretty meaningless. My mod is a client and a server mod, so if the server mod was downloaded from the same infected source as the client mod, then server-side-only checks are also useless. That also ignores the fact that a user doesn't need to log into a server for an infected client mod to wreak its havoc on the user's computer. But at least we're thinking about how to fix the problem now.

     I think in order to make this system work correctly, we'd need to make some changes to Forge itself. Typically, signature systems are based on some idea of trust. Someone trusted keeps a list of trusted public keys. Then new code is considered untrusted until it can be verified that the code was signed with the private counterpart to a trusted public key. Trusted code is allowed to execute. Untrusted code is not.

     Now, that system exactly as described probably wouldn't work for Forge unless you want everyone to sign their code and keep a copy of every possible mod author's public key. To avoid that kind of hassle, I'd suggest an opt-in system: mod authors who want the security of an actually functioning signature system could opt in to this list. Alternatively, other systems deal with executing untrusted code by using a sandbox. That's probably too much trouble to implement in this environment, though, unless you can find a nice seamless Java library.

     But the worst possible thing you can do is implement a non-functioning security system and advertise that it actually works. This only gives people a false sense of security and can actually cause more harm than good.
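The trusted-keys scheme described above needs nothing beyond the JDK's own java.security. A minimal sketch (the mod id, the key size, and the in-memory "jar bytes" are all illustrative; the essential point is that the trusted list ships with the loader, not inside the mod jar):

```java
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;
import java.security.SignatureException;
import java.util.HashMap;
import java.util.Map;

public class TrustDemo {

    static byte[] sign(PrivateKey key, byte[] data) throws GeneralSecurityException {
        Signature sig = Signature.getInstance("SHA256withRSA");
        sig.initSign(key);
        sig.update(data);
        return sig.sign();
    }

    static boolean verify(PublicKey key, byte[] data, byte[] sigBytes)
            throws GeneralSecurityException {
        Signature sig = Signature.getInstance("SHA256withRSA");
        sig.initVerify(key);
        sig.update(data);
        try {
            return sig.verify(sigBytes);
        } catch (SignatureException badSignature) {
            return false; // a malformed or foreign signature just fails
        }
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair author = gen.generateKeyPair();   // the mod author's keys
        KeyPair attacker = gen.generateKeyPair(); // somebody else's keys

        // The trusted list ships with the loader, NOT inside the mod jar.
        Map<String, PublicKey> trusted = new HashMap<>();
        trusted.put("cuchaz.powerTools", author.getPublic());

        byte[] jarBytes = "bytes of the mod jar".getBytes(StandardCharsets.UTF_8);
        byte[] goodSig = sign(author.getPrivate(), jarBytes);
        byte[] forgedSig = sign(attacker.getPrivate(), jarBytes);

        PublicKey key = trusted.get("cuchaz.powerTools");
        System.out.println("author-signed:   " + verify(key, jarBytes, goodSig));   // true
        System.out.println("attacker-signed: " + verify(key, jarBytes, forgedSig)); // false
    }
}
```

Re-signing with a different key buys the attacker nothing here, because verification only ever happens against the public key in the loader's list, which the jar can't overwrite.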
  20. Any time you build a system that is designed to execute arbitrary code from the internet with an interactive user's credentials, you're not allowed to blame all your security problems on the operating system. Also, signing java jars with my private key is actually seriously hard to do if you don't have my private key. That's the whole point of cryptography. If you think you can forge my signature, I'd love to see you try. I've already asked my users to only download my mods from trusted sources. But I have no control over who redistributes my jars and who downloads the distributed versions. If, by your own admission, this signature system can't reject malicious modifications, what purpose does it serve? Anyway, would it be possible to get a developer's opinion on this? I'd love to talk to the person who wrote it to learn his/her thinking. Cuchaz
  21. I agree, in part. Forging a signature is probably not within the capabilities of your average (or even gifted) Minecraft hacker. However, we are talking about protecting credit card information here. If someone subverts my mod and inserts a keylogger/trojan, they could possibly steal someone's banking credentials. The security weakness I see in this case is not in SHA-1. It's in how mods with invalid signatures are handled by Forge. It seems that Forge asks the mod (that has failed a signature check and therefore should be untrusted) what to do about the invalid signature. Cuchaz
  22. The only documentation I can find on it is the javadoc here: http://jd.minecraftforge.net/cpw/mods/fml/common/Mod.html
  23. What is it you want me to link? If it's documentation for the signature system in Forge, I don't know where to find it either. Cuchaz