Forge Forums


Leaderboard

Popular Content

Showing content with the highest reputation on 12/09/18 in Posts

  1. Actually it's the reverse: first create the framebuffer, bind it, set up the matrices and the viewport, and render your stuff. Then use the resulting texture in your GUI. Framebuffers are OpenGL's way of doing "offscreen rendering": instead of drawing onto the main draw buffer, you draw onto a custom framebuffer, which produces a texture you can render somewhere else. (A minimal sketch follows after this list.)
  2. Why? You should update to 1.12.2.

     A camera is usually a collection of two things: the view-projection matrix and the frustum. The view-projection matrix is a combination of a view matrix and a projection matrix, as the name suggests. The view matrix governs how vertices are transformed based on the position and rotation of your "camera" (or view entity, or whatever it is), and the projection matrix governs how vertices are transformed based on the FOV, the aspect ratio, and the width and height of the viewport. This matrix is the minimal information that defines a basic camera: you can go without a frustum, but you must have the view-projection matrix. The LWJGL library has helper methods for constructing both the view and the projection matrices, and then you just multiply them together to get the view-projection matrix. Unfortunately I don't know whether the matrices in LWJGL are row-major or column-major, so I can't tell you the order in which to multiply them.

     The frustum is a collection of 6 planes, defined by the view-projection matrix, used for "culling": not rendering anything the camera can't see anyway.

     So much for the theory. In practice, Minecraft still uses old OpenGL for its shaders, so you have to tell OpenGL which matrix to use. First specify the viewport (the rectangle that defines the area of the screen the rendering takes place within) using GlStateManager.viewport. Then switch to the GL_PROJECTION matrix mode with GlStateManager.matrixMode and load your projection matrix. Then switch to the GL_MODELVIEW matrix mode and load your view matrix. You load a matrix with GL11.glLoadMatrix, or set it to identity with GlStateManager.loadIdentity and then multiply it by the respective matrix using GlStateManager.glMultMatrix. (See the camera and culling sketches after this list.)

     This topic of yours will also require a framebuffer. You can ask me more about using one, but it is a fairly advanced topic, even with Minecraft helpfully providing the Framebuffer class.
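For reference, a minimal sketch of the create, bind, render, then reuse flow from post #1, assuming the 1.12.2 MCP names for Minecraft's Framebuffer class (the OffscreenRenderer wrapper and its method names here are made up for illustration):

```java
import net.minecraft.client.Minecraft;
import net.minecraft.client.shader.Framebuffer;

// Hypothetical wrapper showing the offscreen-rendering flow from post #1.
public class OffscreenRenderer {
    private Framebuffer fbo;

    public void renderToTexture(int width, int height) {
        if (fbo == null) {
            // Create the framebuffer once; 'true' requests a depth attachment.
            fbo = new Framebuffer(width, height, true);
        }
        fbo.framebufferClear();
        // Bind it so subsequent draw calls land in our texture instead of the
        // main draw buffer; 'true' also sets the viewport to the FBO's size.
        fbo.bindFramebuffer(true);

        // ... set up the matrices and render your stuff here ...

        // Restore the main draw buffer when done.
        Minecraft.getMinecraft().getFramebuffer().bindFramebuffer(true);
    }

    public void drawInGui() {
        // Bind the color texture we just rendered and draw it like any other
        // texture, e.g. as a textured quad in a GUI.
        fbo.bindFramebufferTexture();
    }
}
```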
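And a sketch of the viewport/matrix setup described in post #2, again assuming 1.12.2 MCP names. Here gluPerspective from LWJGL 2's GLU helpers stands in for building the projection matrix by hand, and the rotate-then-translate calls build the view matrix; the CameraSetup class and its parameters are hypothetical:

```java
import net.minecraft.client.renderer.GlStateManager;
import org.lwjgl.opengl.GL11;
import org.lwjgl.util.glu.Project;

// Hypothetical helper: set up a basic camera with fixed-function OpenGL.
public final class CameraSetup {
    public static void apply(int width, int height, float fovDegrees,
                             float camX, float camY, float camZ,
                             float pitch, float yaw) {
        // 1. The viewport: the rectangle the rendering takes place within.
        GlStateManager.viewport(0, 0, width, height);

        // 2. The projection matrix: FOV, aspect ratio, near and far planes.
        GlStateManager.matrixMode(GL11.GL_PROJECTION);
        GlStateManager.loadIdentity();
        Project.gluPerspective(fovDegrees, (float) width / (float) height,
                               0.05F, 256.0F);

        // 3. The view matrix: the inverse of the camera transform, i.e.
        //    rotate first, then translate by the negated camera position.
        GlStateManager.matrixMode(GL11.GL_MODELVIEW);
        GlStateManager.loadIdentity();
        GlStateManager.rotate(pitch, 1.0F, 0.0F, 0.0F);
        GlStateManager.rotate(yaw, 0.0F, 1.0F, 0.0F);
        GlStateManager.translate(-camX, -camY, -camZ);
    }
}
```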
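For the culling half, 1.12.2 also ships a Frustum class that extracts the six planes from whatever matrices are currently loaded, so a visibility test can look like the sketch below (same mapping assumptions; the CullingCheck wrapper is hypothetical):

```java
import net.minecraft.client.renderer.culling.Frustum;
import net.minecraft.util.math.AxisAlignedBB;

// Hypothetical helper: skip rendering anything the camera can't see anyway.
public final class CullingCheck {
    public static boolean isVisible(double camX, double camY, double camZ,
                                    AxisAlignedBB box) {
        // Frustum() captures the six planes from the current GL matrices, so
        // call this after the projection and view matrices are loaded.
        Frustum frustum = new Frustum();
        frustum.setPosition(camX, camY, camZ);
        return frustum.isBoundingBoxInFrustum(box);
    }
}
```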
