
    Introduction

    Optimizing 3D scenes for mobile devices often requires a range of techniques to reduce the amount of work both the CPU and GPU must do to render a scene. Here are some quick tips to help increase the performance and framerate of your game on Android and iOS devices.

    Reduce Polycount

    Combine Meshes

    There is an overhead in Unity for keeping track of the many transforms of GameObjects in a scene, even if your meshes are quite low in polygons.

    Data for each mesh, including position, rotation, scale, vertex colors, vertex normals and UVs, is sent to the GPU for rendering, and the process of sending this group of data is called a batch. The more batches your scene has, the more work the CPU must do to send this data to the GPU.

    We can reduce the number of batches in a scene by using fewer individual GameObjects with meshes, which we can do by merging meshes. Unity has built-in API calls for this (see the sketch below), but a fair amount of work is still required on top of them, and doing it by hand can be a slow and painful process.
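    As a rough illustration, here is a minimal sketch using Unity's built-in Mesh.CombineMeshes API to merge all child meshes under a parent into one. It assumes the parent sits at the world origin and that every child shares a single material; handling multiple materials, lightmap UVs and meshes over 65k vertices is the extra work mentioned above.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: merge all child MeshFilters into one mesh on this GameObject.
// Assumes a single shared material across children and that this parent sits at
// the world origin; combined meshes over 65k vertices would also need the
// mesh's indexFormat set to UInt32.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class CombineChildMeshes : MonoBehaviour
{
    void Start()
    {
        var combine = new List<CombineInstance>();

        foreach (MeshFilter filter in GetComponentsInChildren<MeshFilter>())
        {
            // Skip the MeshFilter on this parent object itself and empty filters.
            if (filter.gameObject == gameObject || filter.sharedMesh == null)
                continue;

            combine.Add(new CombineInstance
            {
                mesh = filter.sharedMesh,
                // Bake each child's transform into the combined vertices.
                transform = filter.transform.localToWorldMatrix
            });

            filter.gameObject.SetActive(false);
        }

        var combined = new Mesh();
        combined.CombineMeshes(combine.ToArray());
        GetComponent<MeshFilter>().sharedMesh = combined;
    }
}
```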

    Alternatively, you can use the Combine Meshes tool that comes with the PW Toolbox. More information about the PW Toolbox can be found here:

    Polygon budget

    While every game differs in polygon count, and mobile hardware can handle many more polygons than it used to, sticking to a guideline of around 100,000 vertices can help with your game's performance.

    Ideally you would keep a memory budget of around 1-3 MB per 3D mesh, which could range from 300-1,000 vertices.

    Level of Detail

    Level of Detail (LOD) works by swapping higher and lower polygon versions of the same asset based on the screen size percentage that the model takes up.

    This can be useful when your mobile device cannot handle many high-polygon models in a scene at once. However, LODs should only be set up if the number of polygons is actually affecting performance, as keeping lower-poly versions of meshes to swap in and out of view increases the memory load, and deciding when to perform each swap adds more work for the CPU.


    Another use case for LODs in Unity is their ability to hide (also known as culling) the object once it drops below a certain screen percentage. Hiding the mesh at a distance stops it rendering, reducing the total polygon count. This can also be done with the LOD Group component.

    Some assets come with multiple LOD levels, but each extra level adds more work for the game when deciding which LOD to show, and as the number of levels grows you hit diminishing returns, where the LOD system actually becomes more expensive than the extra polygons it saves.

    For low poly assets we would typically only have an LOD 0 level and a Cull level, as in the sketch below.
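    Here is a minimal sketch of that "LOD 0 + Cull" setup built from script with Unity's LOD Group API; the 10% cull threshold is purely an illustrative value.

```csharp
using UnityEngine;

// Minimal sketch: an "LOD 0 + Cull" setup built from script with a LOD Group.
// The 0.10 (10% of screen height) cull threshold is an illustrative value only.
public class SimpleLodSetup : MonoBehaviour
{
    void Start()
    {
        LODGroup lodGroup = gameObject.AddComponent<LODGroup>();
        Renderer[] renderers = GetComponentsInChildren<Renderer>();

        // One LOD level: visible down to 10% screen height, culled below that.
        LOD lod0 = new LOD(0.10f, renderers);

        lodGroup.SetLODs(new LOD[] { lod0 });
        lodGroup.RecalculateBounds();
    }
}
```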


    Occlusion Culling

    Occlusion culling is a method used to hide meshes that are obstructed from view by another mesh.  

    This process works well in small mobile levels where areas are often occluded by large meshes, such as interiors with corridors. However, using this method on large-scale open-world mobile environments can cause more overhead than actual performance gain in some cases, as the occlusion calculations end up taking more time than actually rendering the items in the scene.

    To make use of occlusion culling, an object needs to be marked as either Occluder Static or Occludee Static. Setting the object to Occluder Static means it can be used to test whether other objects are hidden behind it. Setting the object to Occludee Static means it can be hidden when it is behind an Occluder object.
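    In practice these flags are ticked in the Inspector's Static dropdown, but as a sketch, the same flags can also be set in bulk from an editor script (the menu path is an arbitrary choice, and the script would live in an Editor folder):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: mark the selected objects as Occluder Static and
// Occludee Static. Place this script in an Editor folder.
public static class OcclusionStaticFlags
{
    [MenuItem("Tools/Mark Selection Occluder And Occludee Static")]
    static void MarkSelection()
    {
        foreach (GameObject go in Selection.gameObjects)
        {
            StaticEditorFlags flags = GameObjectUtility.GetStaticEditorFlags(go);
            flags |= StaticEditorFlags.OccluderStatic | StaticEditorFlags.OccludeeStatic;
            GameObjectUtility.SetStaticEditorFlags(go, flags);
        }
    }
}
```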

    Change Resolution

    Changing the resolution of the game can increase your performance when the main contributor to rendering time is the number of pixels the scene must render (also called fill rate).

    Settings to change the resolution can be found in Edit->Project Settings->Player->Resolution Scaling. You can set a fixed resolution for mobile devices by setting Resolution Scaling Mode to Fixed DPI (Dots Per Inch) and entering a custom DPI in the Target DPI field. You can enter your mobile device's native DPI there, but you can also set a lower value to render the scene at a lower resolution.


    Another setting relating to Fixed DPI is located under Edit->Project Settings->Quality->Resolution Scaling Fixed DPI Factor. This setting is a multiplier for the previously mentioned Target DPI field.


    A value of 1 results in the same resolution as the Target DPI, while a value of 0.5 scales the Target DPI by half. For example, with the Target DPI set to 400 and the Resolution Scaling Fixed DPI Factor set to 0.5, the resulting resolution would be 200 DPI.
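    As an alternative (or complement) to the Fixed DPI settings above, the resolution can also be dropped at runtime from script. This is only a sketch; the 0.75 scale factor is an arbitrary example, and you should profile on the target device.

```csharp
using UnityEngine;

// Sketch: scale the screen resolution down at runtime. The 0.75 factor is an
// illustrative value, not a recommendation; profile on the target device.
public class RuntimeResolutionScale : MonoBehaviour
{
    [Range(0.25f, 1.0f)]
    public float scale = 0.75f;

    void Start()
    {
        int width = Mathf.RoundToInt(Display.main.systemWidth * scale);
        int height = Mathf.RoundToInt(Display.main.systemHeight * scale);
        Screen.SetResolution(width, height, true);
    }
}
```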

    Disable unused features

    Depth and Opaque Textures

    Disabling the creation of the Depth and Opaque textures rendered from the camera can reduce the time taken to render a frame. However, certain effects require these textures, such as post-processing effects and custom shaders that make use of _CameraDepthTexture and _CameraOpaqueTexture. If you are certain your scene does not use these textures, it may be worth disabling them and measuring the performance gained. They can be disabled in the pipeline asset, under the General heading.
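    They can also be toggled from script if you want to switch them off for a specific quality tier. This sketch assumes the project uses URP and that the active pipeline asset is the one you want to modify:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch: disable the camera Depth and Opaque textures on the active URP asset.
// Assumes the project uses URP; normally these are simply unticked in the
// pipeline asset's General section.
public class DisableCameraTextures : MonoBehaviour
{
    void Start()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset != null)
        {
            urpAsset.supportsCameraDepthTexture = false;
            urpAsset.supportsCameraOpaqueTexture = false;
        }
    }
}
```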


    HDR

    Turning off HDR in the pipeline asset settings can also reduce the time taken to render each frame, as HDR increases VRAM usage and requires a tone mapping pass on top of the rendered image.
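    A sketch of the same toggle from script, again assuming URP; the per-camera override is included because a camera can also enable HDR individually:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch: turn HDR off on the active URP asset and on the main camera.
// Assumes the project uses URP; normally HDR is unticked in the pipeline asset.
public class DisableHdr : MonoBehaviour
{
    void Start()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset != null)
        {
            urpAsset.supportsHDR = false;
        }

        if (Camera.main != null)
        {
            Camera.main.allowHDR = false;
        }
    }
}
```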

     


    Anti-Aliasing  

    Anti-aliasing is used to reduce the jagged edges of objects in the scene. There are many ways it can be implemented, but each comes with its own drawbacks and added computation. Experimenting with each technique on the target mobile device is recommended to see how it affects the look of the scene.

    Unity recommends FXAA, which can be set on the camera component. Also compare how the scene looks without any anti-aliasing at all to see whether you want to include it in your project. Because anti-aliasing is a mostly fill-rate-limited technique, it is best to avoid the more computationally heavy types where possible.


    The MSAA type of anti-aliasing works at the hardware level by rendering the borders of polygon edges multiple times at a subpixel level. This effect has several sample levels (2x, 4x, 8x), each with an increase in rendering cost.


    MSAA can be turned off in the pipeline asset, under the Quality heading. Test how much performance you gain against how the visual quality looks to determine whether it is needed in your project.
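    For reference, both settings can also be driven from script. This sketch assumes URP: FXAA is chosen per camera, and setting the MSAA sample count to 1 effectively disables MSAA on the pipeline asset.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch: pick FXAA on the main camera and disable MSAA on the URP asset.
// Assumes the project uses URP and that Camera.main is the camera in question.
public class MobileAntiAliasing : MonoBehaviour
{
    void Start()
    {
        var cameraData = Camera.main.GetComponent<UniversalAdditionalCameraData>();
        if (cameraData != null)
        {
            cameraData.antialiasing = AntialiasingMode.FastApproximateAntialiasing;
        }

        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset != null)
        {
            // 1 sample per pixel effectively turns MSAA off.
            urpAsset.msaaSampleCount = 1;
        }
    }
}
```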

    Lower Lighting Quality

    Assuming your scene is using forward rendering on a mobile device, reducing the number of real-time lights calculated per pixel can benefit performance by reducing the shading work and the extra rendering passes required each frame. Settings for these changes can be found in the pipeline asset under the Lighting heading, and a small script sketch follows the list below.


    Additional lights can be either Per Vertex or Per Pixel. 

    • Per Vertex will result in fewer computations but will have lower quality lighting due to the data being interpolated across the mesh. 
    • Per Pixel will increase the computation time but will result in higher quality lighting. For additional lights viewed at a far distance, the cheaper Per Vertex option should be considered. 
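    One related knob that can be changed from script is the per-object additional light limit on the URP asset, which caps how many additional lights are evaluated for each object. This is a sketch under the assumption that the project uses URP; the limit of 2 is purely illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch: cap the number of additional lights evaluated per object on the
// active URP asset. Assumes URP; the limit of 2 is an illustrative value.
public class LimitAdditionalLights : MonoBehaviour
{
    void Start()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset != null)
        {
            urpAsset.maxAdditionalLightsCount = 2;
        }
    }
}
```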

    Opting for a baked lighting approach will be more performant and can allow more lights, as Unity does not include baked lights in any further lighting calculations at runtime.

    Reduce texture memory

    Reducing the size of your textures can help speed up rendering on the GPU, as there is less texture data to fetch and sample for each pixel. A quick way to find which textures are taking up the most space in your scene is the free tool Resource Checker (available here: https://assetstore.unity.com/packages/tools/utilities/resource-checker-3224).


    It provides a quick overview of all the textures, meshes and materials in the scene. Looking through each texture and noting the biggest sizes can help you identify textures that may need to be reduced. Anything above 2048x2048 pixels on mobile can be considered too much, and an excessive number of textures in the 2048-4096 range will slow down your scene, so it's best to stick with 512 and 1024 resolution textures.
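    Once you have found the oversized textures, their import size can be clamped either per texture in the Inspector or in bulk with an editor script like the sketch below (the 1024 cap and the menu path are arbitrary choices, and the script would live in an Editor folder):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: clamp the import size of the selected textures to 1024.
// The 1024 cap is an illustrative choice; place this script in an Editor folder.
public static class ClampTextureSize
{
    [MenuItem("Tools/Clamp Selected Textures To 1024")]
    static void Clamp()
    {
        foreach (Object obj in Selection.objects)
        {
            string path = AssetDatabase.GetAssetPath(obj);
            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer != null && importer.maxTextureSize > 1024)
            {
                importer.maxTextureSize = 1024;
                importer.SaveAndReimport();
            }
        }
    }
}
```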

    Where possible, try to reduce the number of textures by creating texture atlases. These images contain more than one texture (at a reduced resolution), and multiple meshes can sample from the one texture. For low poly models, having a color palette texture that all your assets reference for their colors is a common approach to reducing the number of textures present.
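    Atlases are usually authored offline in your 3D or image tool, but as a sketch of the idea, Unity's Texture2D.PackTextures can pack several (readable) textures into one and return the UV rectangle each source texture ended up in:

```csharp
using UnityEngine;

// Sketch: pack several textures into one atlas at runtime. Source textures must
// be readable; in practice atlases are normally built offline instead.
public class RuntimeAtlasExample : MonoBehaviour
{
    public Texture2D[] sourceTextures;

    void Start()
    {
        var atlas = new Texture2D(1024, 1024);
        Rect[] uvRects = atlas.PackTextures(sourceTextures, 2, 1024);

        // Each rect is the sub-region of the atlas holding one source texture,
        // which is what you would remap a mesh's UVs into.
        for (int i = 0; i < uvRects.Length; i++)
        {
            Debug.Log("Texture " + i + " packed at " + uvRects[i]);
        }
    }
}
```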

    [Image: example of a texture atlas, as seen in Minecraft]

    Conclusion

    For a more in-depth article on optimizing a scene, including other areas to consider beyond the mobile environment, see our other articles:

     

     

    And don't forget to check out the PW Toolbox, which includes the Combine Meshes tool, for an easy and customizable way to combine meshes!

     

     


