// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1689 transmissions indexed — page 78 of 85

[ 2018 ]

20 entries
1541|blog.unity.com

Procedural patterns you can use with Tilemaps, part 1

Many creators have used procedural generation to add some diversity to their game. Some notable mentions include the likes of Minecraft or, more recently, Enter the Gungeon and Descenders. This post explains some of the algorithms you can use with Tilemap, introduced as a 2D feature in Unity 2017.2, and RuleTile.

With procedurally created maps, you can make sure that no two plays of your game are the same. You can use various inputs, such as time or the current level of the player, to ensure that the content changes dynamically even after the game has been built. We’ll take a look at some of the most common methods of creating a procedural world, and a couple of custom variations that I have created. Here’s an example of what you may be able to create after reading this article: three algorithms working together to create one map, using a Tilemap and a RuleTile.

When we’re generating a map with any of the algorithms, we will receive an int array which contains all of the new data. We can then take this data and continue to modify it, or render it to a tilemap.

Good to know before you read further:

- We distinguish between what is a tile and what isn’t by using binary: 1 is on and 0 is off.
- We will store all of our maps in a 2D integer array, which is returned to the user at the end of each function (except when we render).
- I use the array method GetUpperBound() to get the height and width of each map, so that we have fewer variables going into each function and cleaner code.
- I often use Mathf.FloorToInt(), because the Tilemap coordinate system starts at the bottom left and Mathf.FloorToInt() lets us round positions down to whole tile coordinates.
- All of the code provided in this blog post is in C#.

GenerateArray creates a new int array of the size given to it. We can also say whether the array should be full or empty (1 or 0). The render function draws our map to the tilemap: we cycle through the width and height of the map, only placing tiles if the array has a 1 at the location we are checking. The update function only updates the map rather than rendering it again; this way we use fewer resources, as we aren’t redrawing every single tile and its tile data. A minimal sketch of these helpers follows below.

Perlin noise can be used in various ways. The first way we can use it is to create a top layer for our map. This is as simple as getting a new point using our current x position and a seed. This is the simplest form of implementing Perlin noise in level generation: we can use Unity’s built-in function for Perlin noise, so there is no fancy programming going into it. We also ensure that we get whole numbers for our tilemap by using Mathf.FloorToInt().
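Here is a minimal sketch of the helpers described above. The class name, method names and exact signatures are assumptions following the post’s conventions, and it expects a Tilemap and a TileBase asset to be supplied by the caller:

    using UnityEngine;
    using UnityEngine.Tilemaps;

    public static class MapFunctions
    {
        // Creates a new map array; 'empty' decides whether cells start as 0 or 1.
        public static int[,] GenerateArray(int width, int height, bool empty)
        {
            int[,] map = new int[width, height];
            for (int x = 0; x <= map.GetUpperBound(0); x++)
                for (int y = 0; y <= map.GetUpperBound(1); y++)
                    map[x, y] = empty ? 0 : 1;
            return map;
        }

        // Renders the map: a tile is placed wherever the array holds a 1.
        public static void RenderMap(int[,] map, Tilemap tilemap, TileBase tile)
        {
            tilemap.ClearAllTiles();
            for (int x = 0; x <= map.GetUpperBound(0); x++)
                for (int y = 0; y <= map.GetUpperBound(1); y++)
                    if (map[x, y] == 1)
                        tilemap.SetTile(new Vector3Int(x, y, 0), tile);
        }

        // Updates existing tiles without redrawing the whole map; a 0 clears a cell.
        public static void UpdateMap(int[,] map, Tilemap tilemap)
        {
            for (int x = 0; x <= map.GetUpperBound(0); x++)
                for (int y = 0; y <= map.GetUpperBound(1); y++)
                    if (map[x, y] == 0)
                        tilemap.SetTile(new Vector3Int(x, y, 0), null);
        }

        // Builds a top layer from Perlin noise, seeded so results are repeatable.
        public static int[,] PerlinNoise(int[,] map, float seed)
        {
            for (int x = 0; x <= map.GetUpperBound(0); x++)
            {
                // Scale the noise sample to the map height and floor it to a whole tile.
                int newPoint = Mathf.FloorToInt(Mathf.PerlinNoise(x, seed) * map.GetUpperBound(1));
                for (int y = newPoint; y >= 0; y--)
                    map[x, y] = 1;
            }
            return map;
        }
    }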
We can also take this function and smooth it out: set intervals at which to record the Perlin height, then smooth between the recorded points. This function ends up being slightly more advanced, as we have to keep Lists of integers for our intervals. For the first part of the function, we check whether the interval is more than one; if it is, we generate the noise at each interval so that we have points to smooth between. The next part is to work through smoothing the points, which happens through the following steps:

- Get the current position and the last position.
- Get the difference between the two positions; the key information we want is the difference on the y-axis.
- Determine how much we should change the height by, by dividing the y difference by the interval variable.
- Set the positions, working our way down to zero. When we hit 0 on the y-axis, add the height change to the current height and repeat the process for the next x position.
- Once we have done every position between the last position and the current position, move on to the next point.

If the interval is less than one, we simply use the previous function to do the work for us.

The next algorithm works by flipping a coin, giving one of two results. If the result is heads, we move up one block; if the result is tails, we move down one block instead. This creates some height in our level by always moving either up or down. The only downside to this algorithm is that it looks very blocky. This generation gives us more of a smooth height compared to the Perlin noise generation.

The smoothed Random Walk variation allows for a much smoother finish than the previous version. We can do this by adding two new variables to our function:

- The first variable determines how long we have held our current height. This is an integer and is reset when we change the height.
- The second variable is an input to the function and is used as our minimum section width for each height. This will make more sense once you have seen the function (sketched at the end of this entry).

The smoothing of the random walk algorithm allows for some nice flat pieces within the level.

I hope this has inspired you to start using some form of procedural generation within your projects. If you want to learn more about procedurally generating maps, check out the Procedural Generation Wiki or Roguebasin.com, which are both great resources. Look out for the next post in the series to see how we can use procedural generation to create cave systems. If you make something cool using procedural generation, feel free to message me on Twitter or leave a comment below!

Want to hear more about it and get a live demo? I’m also talking about Procedural Patterns to use with Tilemaps at Unite Berlin, in the expo hall mini theater on June 20th. I’ll be around after the talk if you’d like to have a chat in person!
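Sketches of the generators discussed above, building on the MapFunctions helpers from earlier in this entry. Names and signatures are assumptions, and the two random-walk variants are folded into one function whose minSectionWidth parameter enables the smoothing:

    using System.Collections.Generic;
    using UnityEngine;

    public static class MapGenerators
    {
        // Smoothed Perlin: record the noise height at fixed intervals, then
        // interpolate the columns in between so the terrain ramps gradually.
        public static int[,] PerlinNoiseSmooth(int[,] map, float seed, int interval)
        {
            if (interval <= 1)
                return MapFunctions.PerlinNoise(map, seed); // nothing to smooth over

            List<int> noiseX = new List<int>();
            List<int> noiseY = new List<int>();

            // Sample the noise once per interval.
            for (int x = 0; x <= map.GetUpperBound(0) + interval; x += interval)
            {
                noiseX.Add(x);
                noiseY.Add(Mathf.FloorToInt(Mathf.PerlinNoise(x, seed) * map.GetUpperBound(1)));
            }

            // Walk between each pair of samples, changing the height a little per column.
            for (int i = 1; i < noiseX.Count; i++)
            {
                float height = noiseY[i - 1];
                float step = (noiseY[i] - noiseY[i - 1]) / (float)interval;

                for (int x = noiseX[i - 1]; x < noiseX[i] && x <= map.GetUpperBound(0); x++)
                {
                    for (int y = Mathf.FloorToInt(height); y >= 0; y--)
                        map[x, y] = 1;
                    height += step;
                }
            }
            return map;
        }

        // Random walk: flip a coin per column and move the surface up or down one tile.
        // minSectionWidth > 1 gives the smoothed variant: the height only changes after
        // it has been held for that many columns.
        public static int[,] RandomWalkTop(int[,] map, float seed, int minSectionWidth = 1)
        {
            System.Random rand = new System.Random(seed.GetHashCode());
            int lastHeight = rand.Next(0, map.GetUpperBound(1));
            int sectionWidth = 0; // how long the current height has been held

            for (int x = 0; x <= map.GetUpperBound(0); x++)
            {
                sectionWidth++;
                if (sectionWidth >= minSectionWidth)
                {
                    // Heads moves up one block, tails moves down one, clamped to the map.
                    if (rand.Next(2) == 0 && lastHeight > 0)
                        lastHeight--;
                    else if (lastHeight < map.GetUpperBound(1))
                        lastHeight++;
                    sectionWidth = 0;
                }

                for (int y = lastHeight; y >= 0; y--)
                    map[x, y] = 1;
            }
            return map;
        }
    }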

>access_file_
1544|blog.unity.com

Stripping scriptable shader variants

Massively reduce Player build time and data size by allowing developers to control which shader variants are handled by the Unity shader compiler and included in the Player data.

Player build time and data size increase along with the complexity of your project because of the rising number of shader variants. With scriptable shader variants stripping, introduced in the 2018.2 beta, you can manage the number of shader variants generated and therefore drastically reduce Player build time and data size. This feature allows you to strip all the shader variants with invalid code paths, strip shader variants for unused features, or create shader build configurations such as “debug” and “release” without affecting iteration time or maintenance complexity.

In this blog post, we first define some of the terms we use. Then we focus on the definition of shader variants to explain why we can generate so many. This is followed by a description of automatic shader variants stripping and how scriptable shader variants stripping is implemented in the Unity shader pipeline architecture. Then, we look at the scriptable shader variants stripping API before discussing results on the Fontainebleau demo and concluding with some tips on writing stripping scripts. Learning scriptable shader variants stripping is not a trivial undertaking, but it can lead to a massive increase in team efficiency!

To understand the scriptable shader variants stripping feature, it is important to have a precise understanding of the different concepts involved:

- Shader asset: The full file source code with properties, sub-shader, passes, and HLSL.
- Shader snippet: The HLSL input code with dependencies for a single shader stage.
- Shader stage: A specific stage in the GPU rendering pipeline, typically a vertex shader stage and a fragment shader stage.
- Shader keyword: A preprocessor identifier for compile-time branches across shaders.
- Shader keyword set: A specific set of shader keywords identifying a particular code path.
- Shader variant: The platform-specific shader code generated by the Unity shader compiler for a single shader stage, for a specific graphics tier, pass, shader keyword set, etc.
- Uber shader: A shader source that can produce many shader variants.

In Unity, uber shaders are managed by ShaderLab sub-shaders, passes, and shader types, as well as the #pragma multi_compile and #pragma shader_feature preprocessor directives.

To use scriptable shader variant stripping, you need a clear understanding of what a shader variant is and how shader variants are generated by the shader build pipeline. The number of shader variants generated is directly proportional to the build time and the Player shader variant data size. A shader variant is one output of the shader build pipeline. Shader keywords are one of the elements that cause the generation of shader variants; careless use of shader keywords can quickly lead to an explosion in the shader variant count, and therefore extremely long build times.

To see how shader variants are generated, let’s count how many variants a simple shader produces. The total number of shader variants in a project is deterministic (a sketch of the counting follows below). The trivial ShaderVariantStripping example brings clarity to this equation: it is a single shader, which removes the sum over shaders, and it has a single sub-shader and a single pass, which removes those sums as well. Keywords in the equation refer to both platform and shader keywords.
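As a rough sketch of that count (a hedged reconstruction from the definitions above, assuming keyword sets multiply independently per pass):

    total variants = Σ (over shaders) Σ (over sub-shaders) Σ (over passes)
                     (shader stages) × (keyword set combinations) × (graphics tiers) × (graphics APIs)

For a single shader with one sub-shader and one pass, this collapses to:

    variants = (shader stages) × (keyword set combinations) × (graphics tiers) × (graphics APIs)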
A graphics tier is a specific platform keyword set combination. The ShaderVariantStripping/Pass has two multi compile directives: the first directive defines 4 keywords (COLOR_ORANGE, COLOR_VIOLET, COLOR_GREEN, COLOR_GRAY) and the second directive defines 3 keywords (OP_ADD, OP_MUL, OP_SUB). Finally, the pass defines 2 shader stages: a vertex shader stage and a fragment shader stage. That yields 4 × 3 × 2 = 24 shader variants.

This shader variant total is given for a single supported graphics API. For each supported graphics API in the project, we need a dedicated set of shader variants. For example, if we build an Android Player that supports both OpenGL ES 3 and Vulkan, we need two sets of shader variants. As a result, the Player build time and shader data size are directly proportional to the number of supported graphics APIs.

The shader compilation pipeline in Unity is a black box where each shader in the project is parsed to extract shader snippets before collecting the variant preprocessing instructions, such as multi_compile and shader_feature. This produces a list of compilation parameters, one per shader variant. These compilation parameters include the shader snippet, the graphics tier, the shader type, the shader keyword set, and the pass type and name. Each set of compilation parameters is used to produce a single shader variant.

To keep this number under control, Unity executes an automatic shader variant stripping pass based on two heuristics. First, stripping is based on the Project Settings; for example, if Virtual Reality Supported is disabled, then VR shader variants are systematically stripped. Second, the automatic stripping is based on the configuration of the Shader Stripping section of the Graphics Settings.

Automatic shader variants stripping is limited by what can be known at build time. Unity can’t automatically select only the necessary shader variants at build time, because those shader variants depend on runtime C# execution. For example, if a C# script adds a point light but there were no point lights at build time, then there is no way for the shader build pipeline to figure out that the Player would need a shader variant that does point light shading.

Here’s a list of shader variants with enabled keywords that get stripped automatically:

- Lightmap modes: LIGHTMAP_ON, DIRLIGHTMAP_COMBINED, DYNAMICLIGHTMAP_ON, LIGHTMAP_SHADOW_MIXING, SHADOWS_SHADOWMASK
- Fog modes: FOG_LINEAR, FOG_EXP, FOG_EXP2
- Instancing variants: INSTANCING_ON

Furthermore, when Virtual Reality support is disabled, the shader variants with the following built-in enabled keywords are stripped: STEREO_INSTANCING_ON, STEREO_MULTIVIEW_ON, STEREO_CUBEMAP_RENDER_ON, UNITY_SINGLE_PASS_STEREO.

When automatic stripping is done, the shader build pipeline uses the remaining compilation parameter sets to schedule shader variant compilation in parallel, launching as many simultaneous compilations as the platform has CPU core threads.

In the Unity 2018.2 beta, the shader pipeline architecture introduces a new stage right before the shader variant compilation scheduling, allowing users to control the shader variant compilation. This new stage is exposed via C# callbacks to user code, and each callback is executed per shader snippet. As an example, the following script enables stripping of all the shader variants that would be associated with a “DEBUG” configuration, identified by a “DEBUG” keyword used in development Player builds.
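A sketch of such a script, built on the IPreprocessShaders callback described below (the class and field names are placeholders):

    using System.Collections.Generic;
    using UnityEditor;
    using UnityEditor.Build;
    using UnityEditor.Rendering;
    using UnityEngine;
    using UnityEngine.Rendering;

    class StripDebugVariants : IPreprocessShaders
    {
        ShaderKeyword m_KeywordDebug = new ShaderKeyword("DEBUG");

        public int callbackOrder { get { return 0; } }

        public void OnProcessShader(Shader shader, ShaderSnippetData snippet,
                                    IList<ShaderCompilerData> data)
        {
            // Keep DEBUG variants in development builds only.
            if (EditorUserBuildSettings.development)
                return;

            // Remove every variant that has the DEBUG keyword enabled.
            for (int i = data.Count - 1; i >= 0; --i)
            {
                if (data[i].shaderKeywordSet.IsEnabled(m_KeywordDebug))
                    data.RemoveAt(i);
            }
        }
    }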
OnProcessShader is called right before the scheduling of the shader variant compilation. Each combination of Shader, ShaderSnippetData and ShaderCompilerData instances identifies a single shader variant that the shader compiler will produce. To strip that shader variant, we only need to remove it from the ShaderCompilerData list. Every single shader variant that the shader compiler should generate will appear in this callback. When scripting shader variants stripping, you first need to figure out which variants can be removed because they are not useful for the project.

One use case for scriptable shader variants stripping is to systematically strip the invalid shader variants of a render pipeline that arise from the various combinations of shader keywords. A shader variants stripping script included in the HD render pipeline systematically reduces the build time and size of a project using the HD render pipeline. Furthermore, the Lightweight render pipeline for Unity 2018.2 has a UI to automatically feed a stripping script that can strip up to 98% of the shader variants, which we expect to be particularly valuable for mobile projects.

Another use case is a script that strips all the rendering features of a render pipeline that are not used by a specific project. Using an internal test demo for the Lightweight render pipeline, we saw significant reductions across the entire project, and with more work on the stripping script we could go even further.

A project may quickly run into a shader variants count explosion, leading to unsustainable compilation times and Player data size. Scriptable shader stripping helps deal with this issue, but you should also reevaluate how you are using shader keywords so that more relevant shader variants are generated in the first place. We can rely on #pragma skip_variants to test unused keywords in the editor.

For example, in the ShaderStripping/Color shader, the preprocessing directives declare every combination of color keywords and operator keywords (see the sketch after this passage). Say we want to render a particular scene. First, we should make sure that every keyword is actually useful: in this scene, COLOR_GRAY and OP_SUB are never used, and if we can guarantee these keywords are never used, then we should remove them. Second, we should combine keywords that effectively produce a single code path. In this example, the ‘add’ operation is always used with the ‘orange’ color exclusively, so we can combine them into a single keyword and refactor the code. Of course, it’s not always possible to refactor keywords; in those cases, scriptable shader variants stripping is a valuable tool!

For each snippet, all the shader variant stripping scripts are executed. We can order the scripts’ execution via the value returned by the callbackOrder member function. The shader build pipeline executes the callbacks in order of increasing callbackOrder, lowest first and highest last.
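A sketch of those directives, reconstructed from the keywords named in this post, together with one possible shape of the refactor (the merged keyword name is hypothetical):

    // Original: 4 color keywords x 3 operator keywords = 12 keyword combinations.
    #pragma multi_compile COLOR_ORANGE COLOR_VIOLET COLOR_GREEN COLOR_GRAY
    #pragma multi_compile OP_ADD OP_MUL OP_SUB

    // After removing the unused COLOR_GRAY and OP_SUB and merging the always-paired
    // orange+add path into one keyword, a single directive remains (the multiply
    // operation becomes the default code path):
    #pragma multi_compile COLOR_ORANGE_OP_ADD COLOR_VIOLET COLOR_GREEN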
A use case for multiple shader stripping scripts is to separate the scripting per purpose. For example:

- Script 1: Systematically strips all the shader variants with invalid code paths.
- Script 2: Strips all the debug shader variants.
- Script 3: Strips all the shader variants in the code base that are not necessary for the current project.
- Script 4: Logs the remaining shader variants and strips them all, for fast iteration on the stripping scripts.

Shader variants stripping is extremely powerful but requires a lot of work to achieve good results. A workflow that works well:

1. In the Project view, filter for all shaders.
2. Select a shader and, in the Inspector, click Show to open the list of keywords/variants of that shader. There will be a list of keywords that are always included in the build.
3. Make sure that you know which specific graphics features the project uses.
4. Check whether the keywords are used in all shader stages. Only one variant is necessary for stages that don’t use these keywords.
5. Strip the shader variants in the script.
6. Verify the visuals in the build.
7. Repeat steps 2-6 for each shader.

Download the sample project

The example project used to illustrate this blog post can be downloaded here. It requires Unity 2018.2.0b1.

Come to Jonas Echterhoff's June 21 talk and learn about all the new tools that give you more control over what ends up in your build!

>access_file_
1545|blog.unity.com

Launching 3D Game Kit: Explore and Learn Unity in 3D

In February, Unity’s Content Team brought you The Explorer 2D Game Kit, a collection of mechanics, tools, systems and assets to hook up gameplay without writing any code. Now we’re bringing you the three-dimensional world of Ellen and her mission to explore the unknown alien landscape where her dropship has crash-landed. The beautiful game example takes you through the towering abandoned ruins of this mysterious planet, standing in the shadows of strange, giant plant life. We’ve created the game example using similar systems to the 2D Game Kit, so if you’ve had fun with that and want to learn more, or want to start from scratch with 3D, check it out!

Meet Ellen, our Principal Engineer. Her ship has crashed on a mysterious planet full of monstrous, brightly colored plant life, slobbering maniacal creatures and fizzing acid-filled pools. The walls and ruins surrounding her are clearly derelict, abandoned many thousands of years ago. What life can she discover? Who is waiting for her in the dark crypts below the dual-moon skies?

The Content Team has crammed this Game Kit project with loads of beautiful galactic 3D art assets. And just like the 2D Game Kit, this includes moving platforms, destructible boxes and switches to open giant alien stone doors. Plus, of course, some excellent evil to defeat.

Let’s just quickly recap what we mean by Game Kit. Every single asset, gameplay element, visual effect and sound is saved in the project, ready for you to use. Create a new scene using the kit’s tools and find Ellen ready to go on a small piece of ground. Then add hills, pillars, giant towering stairs and doorways. With Prefabs-a-plenty to drag in, start making your own Explorer 3D game with a few clicks.

To start making your own 3D platformer, we’ve put together a Quick Start Guide on the Learn site. Keep an eye out for more step-by-step documentation and a reference guide over the next few weeks.

The 3D Game Kit is available now for 2018.1, and you can find it on the Asset Store or through the Learn site. You can also access the Asset Store from within the Unity engine itself and search for ‘3D Game Kit’. For extra help and to connect with other users of the Kit, head to our dedicated forum thread.

On May 23, Matt Schell will be hosting an Online User Group where the development leads of the Kit will discuss what’s included and how to use it. Tune in at 7pm BST / 11am PT.

>access_file_
1548|blog.unity.com

2018.1 is now available

Unity 2018.1 marks the start of a new cycle that introduces a major upgrade to our core technology, which gives artists, developers and engineers the power to express their talents and collaborate more efficiently to make their AAA dreams a reality.

Let’s start with a few of the highlights, and then you can dig into the details of all the features. The first two highlights described below, the Scriptable Render Pipeline and the C# Job System, represent the first versions of two major features, which will continue to evolve to help you unlock beautiful graphics and increase the runtime performance of Unity. While you go through the list of new features, you can download Unity 2018.1 here.

Scriptable Render Pipeline (SRP)

Available in Preview with Unity 2018.1, the new Scriptable Render Pipeline (SRP) places the power of modern hardware and GPUs directly into the hands of developers and technical artists, without having to digest millions of lines of C++ engine code. SRP makes it easy to customize the rendering pipeline via C# code and material shaders, giving you maximum control without all the complexity and challenges of writing or modifying a complete C++ rendering pipeline.

We are also introducing two out-of-the-box render pipelines to fit your needs. The High-Definition Render Pipeline (HD RP) is for developers with AAA aspirations, and the Lightweight Render Pipeline (LW RP) is for those looking for a combination of beauty and speed; it also optimizes battery life for mobile devices and similar platforms.

The C# Job System & Entity Component System (ECS)

Combined with a new programming model (Entity Component System), the new runtime system enables you to take full advantage of multicore processors without the programming headache. You can use that extra horsepower, for example, to add more effects and complexity to your games, or to add AI that makes your creations richer and more immersive.

Level design and shaders

Unity 2017.x introduced new features that help teams of artists, designers and developers build experiences together. We added powerful visual tools like Timeline, Cinemachine and a new FBX Exporter, which enables smooth round-tripping with Digital Content Creation tools like 3ds Max and Maya. With Unity 2018.1, we are continuing our efforts to help artists, designers and developers collaborate more efficiently by making it possible to create levels, cinematic content and gameplay sequences without coding. For example, new tools like ProBuilder/Polybrush and the new visual Shader Graph offer intuitive ways to design levels and create shaders without programming skills.

Packages

Unity 2017.2 introduced the Package Manager, an underlying core modular system and API that enables dynamic loading and updating of new Unity features in your projects. Unity 2018.1 builds on that with the newly released Package Manager User Interface, the Hub and Project Templates, all of which help you get new projects started faster and more efficiently. Several of the features are available in packages. The idea is to make Unity more modular so that it’s easier for us to release features on an ongoing basis. We use the "Preview" label for these new features to indicate that they are not recommended for production nor fully supported just yet.
Previews offer you an opportunity to update, modify and experiment with features at an early stage, as separate modularized packages that you may want to use later in production.

Unity’s built-in rendering modes offer a compelling pipeline for creating a wide range of games. With the evolution and growing diversity of platforms (performance, architecture, form factors), however, we wanted to provide a more powerful and flexible rendering pipeline. Available in Preview with Unity 2018.1, the new Scriptable Render Pipeline (SRP) places the power of modern hardware and GPUs directly into the hands of developers and technical artists, without having to digest millions of lines of C++ engine code. SRP allows you to easily customize the rendering pipeline via C# code and material shaders. This gives you maximum control without all the complexity and challenges of writing or modifying a complete C++ rendering pipeline.

Unity provides several built-in rendering modes, which are sufficient for the majority of smaller games. However, SRP allows you to go beyond what comes out of the box to tailor the rendering process to your specific needs and to optimize performance for the specific hardware of your target platform. SRP offers a new way of rendering in Unity: we’re going from a black-box model to one where most things are in C#, a more open system where users can write their own pipelines or customize templates for their needs. We’re releasing two initial pipelines in 2018.1 in addition to the built-in rendering engine. You can learn more about the Scriptable Render Pipeline (SRP) and how to get started in our recent blog post.

For high-end visuals on PCs and consoles

The HD RP is a modern renderer that will support a limited set of platforms (PC DX11+, PS4, Xbox One, Metal, Vulkan — no XR support yet). The HD RP targets high-end PCs and consoles and prioritizes stunning, high-definition visuals. The tradeoff is that the HD RP will not work on less powerful platforms, and there will be some learning and re-tooling required. The renderer is a hybrid Tile/Cluster Forward/Deferred renderer with feature parity between Forward and Deferred. Its features include volumetric lighting (in progress), unified lighting (the same lighting for opaque/transparent/volumetric), new light shapes (point lights now have line and rectangle options, spotlights now have box and pyramid options), and decals.

Fewer draw calls

The LW RP is a single-pass forward renderer that uses fewer draw calls: using the LW RP will decrease the draw call count of your project compared to the built-in rendering pipeline. While it supports all platforms, it is an ideal solution for mobile and for performance-hungry applications like XR. The trade-off is that, as with the HD RP, switching to the new SRP workflow requires a learning curve, and it’s worth keeping in mind that some third-party tools are not yet compatible with it.

The LW RP has its own process for rendering and therefore requires shaders written with it in mind. We have developed a new set of Standard Shaders that are located under the Lightweight Pipeline group in the material’s shader selection dropdown. These include a Standard PBR shader, a non-PBR Standard shader with a simplified lighting model, a Standard Terrain shader and a Standard Unlit shader. It’s worth noting that all of Unity’s unlit stock shaders already work with the LW RP.
This includes legacy particle, UI, skybox, and sprite shaders. Download the LW RP using the Package Manager, and learn more about the LW RP and how to get started by reading our recent blog post.

Templates provide pre-selected settings based on common best practices for projects, depending on whether they are 2D, 3D, targeting high-end platforms such as PC/consoles, or lightweight platforms such as mobile. That way, you don’t have to worry about setting up the basics, and you get a better out-of-the-box experience. Templates ship with optimized Unity project settings, as well as some prefabs and assets to get you started. You don’t have to worry about changing many of the default settings in Unity when you start a new project, because they are already pre-set for a target game type or level of visual fidelity. Not only does this make it faster to get started, it also introduces you to settings which you otherwise might not have discovered, and to new features like the Scriptable Render Pipeline, Shader Graph, and the Post-Processing Stack.

Below is the list of templates you can choose from:

- 2D: For 2D projects that use Unity’s built-in rendering pipeline. Configures project settings for 2D, including Image Import, Sprite Packer, Scene View, Lighting and Orthographic Camera.
- 3D: For 3D projects that use Unity’s built-in rendering pipeline. Configures project settings for 3D, including updates such as setting the default color space to Linear and the default Lightmapper to Progressive.
- 3D with Extras (Preview): Similar to the 3D template, but with the added benefits of post-processing, presets and example content.
- High End (Preview): For high-end graphics on platforms that support Shader Model 5.0 (DX11 and above). Uses the HD RP, a modern rendering pipeline that includes advanced material types and a configurable hybrid Tile/Cluster Deferred/Forward lighting architecture.
- Lightweight (Preview): For a focus on performance, and projects that primarily use a baked lighting solution. Uses the LW RP, a single-pass forward renderer with per-object light culling, in which all lights are shaded in a single pass rather than in additional passes per pixel light. Using the LW RP decreases the draw call count of your project, making it an ideal solution for lower-end hardware.
- Lightweight VR (Preview): Focuses on performance when developing VR projects that primarily use a baked lighting solution. Uses the LW RP and requires a VR device to run.

Authoring shaders in Unity has traditionally been the realm of people with some programming ability. In 2018 we are changing this! Shader Graph enables you to build your shaders visually using a designer tool — without writing one line of code. Instead, you create and connect nodes in a graph network, with easy drag-and-drop usability.
You can see the results immediately and then iterate, making it simple for new users to become involved in shader creation.

The Shader Graph system:

- Is designed to work with the LW render pipeline (HD render pipeline support is coming soon)
- Can be extended to work with any custom render pipeline
- Has an open architecture that allows custom nodes to be written

If you want to learn more about how to get started with Shader Graph, we highly recommend that you check out the Shader Graph Example Library on GitHub and our Shader Graph Tutorial videos.

The Progressive Lightmapper offers great results for baked lights and improves the workflow for lighting artists, enabling them to iterate quickly and predictably by providing progressive updates in the Unity Editor. Originally released as a “preview” feature in version 5.6, it has been improved with more features in each subsequent release. In 2018.1, it comes out of preview mode and includes memory optimizations for baking large scenes. As of 2018.1, the Progressive Lightmapper also helps power users via the Custom Bakes API. This enables access to data within the baking solution for the development of new lighting tools, such as the custom Occlusion Probes system used to create the first-person interactive experience in Book of the Dead.

The Post-Processing Stack enables you to apply realistic filters to scenes using professional-grade controls. The artist-friendly interface makes it easy to create and fine-tune high-quality visuals for dramatic and realistic effect. Coming out of beta in 2018.1, we’ve added the most requested features and have fixed as many bugs as possible. We’re also improving our XR support by adding mobile-specific paths, volume blending, and a complete framework for custom user effects. This version of the Post-Processing Stack ships as the first of many upcoming packages, which will give users the plug-in flexibility of an Asset Store pack but with the update-ability of a core Unity feature. In 2018.1, the Post-Processing Stack has been improved to feature higher-quality effects, automatic volume blending with a powerful override stack, and a flexible framework to write and distribute your own custom effects. It’s compatible with the LW RP, HD RP, and built-in rendering pipelines.

Dynamic Resolution was first introduced in Unity for Xbox One in 2017.3. Now we are bringing the same functionality to PS4. The feature helps users dynamically manage their GPU budget: for example, it may be desirable for a game to hit high resolutions (such as 4K) in some scenarios, while at other times it is preferable for the resolution to drop in order to allow GPU performance to increase. Users can select which render textures and cameras will participate in Dynamic Resolution from the Unity Editor, and then scale the resolution of those items at runtime from a single script call. Combined with the information provided by the FrameTimingManager, this allows users to produce scripts that automatically balance their GPU load by changing the resolution of their chosen render targets. Internally, the system uses no more memory for render targets than would be allocated if Dynamic Resolution were not in use, and no significant CPU performance overhead is incurred when changing the resolution. Note: users should check that their titles can become GPU-bound before adopting Dynamic Resolution, as it will be of no benefit in CPU-bound scenarios.
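As a rough illustration of that single script call (a sketch; the scaling policy is a placeholder, and a real implementation would drive it from FrameTimingManager data):

    using UnityEngine;

    public class DynamicResolutionController : MonoBehaviour
    {
        float m_Scale = 1.0f; // fraction of native resolution; 1.0 is full resolution

        void Start()
        {
            // Cameras must opt in for their render targets to be scaled.
            GetComponent<Camera>().allowDynamicResolution = true;
        }

        void Update()
        {
            // Placeholder policy: drop to 70% resolution while the GPU is over budget.
            float targetScale = GpuOverBudget() ? 0.7f : 1.0f;
            if (!Mathf.Approximately(targetScale, m_Scale))
            {
                m_Scale = targetScale;
                // One call rescales every render target participating in Dynamic Resolution.
                ScalableBufferManager.ResizeBuffers(m_Scale, m_Scale);
            }
        }

        bool GpuOverBudget()
        {
            // Placeholder heuristic; substitute FrameTimingManager readings here.
            return false;
        }
    }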
GPU Instancing now supports fetching Global Illumination data for each instance. This can be achieved either by allowing Unity render loops to automatically batch LightProbe-lit or lightmap-lit objects, or by manually calling the new APIs to extract the LightProbe data baked with the scene into a MaterialPropertyBlock object used later for instanced rendering.

The new UV Overlap Visualization feature addresses undesired artefacts that occur when a lightmap is divided into a number of charts and sampled, and the texel values from one chart bleed into another because the charts are too close together. The feature lets you immediately see which charts/texels are affected: it automatically identifies overlaps and enables you to make more informed decisions when solving such issues (e.g. by increasing chart margins).

Tessellation for Metal is a way to increase visual fidelity while using lower-quality meshes. It follows the DX11 hardware tessellation convention of using hull/domain shader stages. Existing HLSL shaders using this feature are cross-compiled and transparently transformed into Metal compute shaders, aiming to make the transition between platforms seamless. (Target graphics APIs have different underlying native approaches to implementing tessellation.)

Other graphics improvements and new features

Sky Occlusion ships as an experimental feature in 2018.1. It improves graphical fidelity and realism by including the contribution of lighting from the skybox as part of ambient occlusion calculations. We also added a new experimental C# interface to pass light information to the GI-baking backends.

We now provide the ability to use all the CPU cores on a device to perform the 2D rigidbody simulation. This includes executing the discovery of new contacts, both discrete and continuous island solvers, and broad-phase synchronization, all using the native Job System. To get started, simply go to the 2D Physics settings and, under “Job Options (Experimental)”, check “Use Multithreading”. We have also provided a couple of test projects for job-based physics (used during development of this feature), which are available on GitHub.

SpriteShape is a sprite layout and world-building tool that provides the ability to tile sprites along the path of a shape based on given angle ranges. Additionally, the shape can be filled with a tiling texture. The main advantage of the SpriteShape feature is the powerful combination of a bezier spline path with the ability to tile sprites adaptively or continuously. When tiling continuously, sprites assigned to given angles are automatically switched. The feature is now available as a preview package, and you can get our sample project and documentation here. Head over to our forum and let us know about your experiences. How does this fit into your workflow or enhance your project? What worked smoothly? And what didn’t?

We’re working on a new 2D animation system, which will be released over multiple phases and is now available as a preview package. In its first release, we have focused on developing tooling for rigging a sprite for skeletal animation (although similar, it is not an integration of Anima2D into Unity). The tools include bind-pose editing, manual mesh tessellation and skin-weights painting. A runtime component ties it all together to drive sprite deformation. Our goal is to allow users to create simple animated characters that consist of a single sprite.

An ongoing project

We will continue to build new features and workflows on top of this.
This may include tooling to make the creation of multi-sprite characters more efficient and to support workflows for larger productions. This will allow you to create complex multi-sprite characters and potentially share rigs and animation clips across multiple characters. The feature is now available as a preview package, and you can get our sample project and documentation here. We're eager to hear what you think. Let us know which workflows we supported correctly, what we missed, ideas for improvement, and more in our dedicated forum.

The Particle System now supports GPU Instancing, which enables many more particle meshes to be rendered with a much smaller CPU performance cost. The Particle System uses Procedural Instancing, which is explained in more detail here. Instancing support has been added to the Particle Standard Shaders and will be enabled by default on all new content. In Unity 2018.1, you can manually enable it on content from older Unity versions simply by clicking the checkbox in the Renderer Module. It’s also possible to add particle-instancing support to your own shaders. As an illustration: 10,000 sphere meshes using the old non-instanced technique render at 8.6 fps, while 100,000 sphere meshes using the new instanced technique render at 85 fps.

Unity 2018.1 adds some new options to the Velocity over Lifetime module, allowing you to make particles travel relative to a defined center point. By default, the center is aligned with the Transform, but it can be overridden within the module. Particles can be made to travel around the center point using the Orbital parameters, and away from or towards the center point using the Radial parameters.

All shape types in this module now support a texture. The texture can be used for:

- Controlling particle colors
- Controlling particle alphas
- Discarding particles based on a threshold and a texture channel of your choice

There are two new ways to spawn sub-emitters in Unity 2018.1. The first is via the Trigger Module, which works in a similar way to how sub-emitters are spawned from the Collision Module. Simply choose Trigger as the sub-emitter type in the Sub-Emitter Module, and then, when conditions are met inside the Trigger Module (i.e. particles have entered the collision volume), the corresponding sub-emitters will be triggered.

The second new way to trigger sub-emitters is via script. We have added a new script API called TriggerSubEmitter, which can be used to trigger a sub-emitter for a single particle, a list of particles, or all particles. In the Sub-Emitter Module, you can choose Manual as the spawn type, which tells the Particle System that this emitter will only be triggered via a call in script. It is also possible to use the existing types (Collision or Death) and add additional triggers for these sub-emitters via script.
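A minimal sketch of the manual trigger path (index 0 is assumed to be a sub-emitter whose spawn type is set to Manual):

    using UnityEngine;

    public class ManualSubEmitterTrigger : MonoBehaviour
    {
        ParticleSystem m_System;

        void Start()
        {
            m_System = GetComponent<ParticleSystem>();
        }

        void Update()
        {
            // Fire the first sub-emitter for all particles when space is pressed;
            // overloads exist for a single particle or a list of particles.
            if (Input.GetKeyDown(KeyCode.Space))
                m_System.TriggerSubEmitter(0);
        }
    }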
The Legacy Particle System continues to be a development burden for each Unity version where it is supported. New engine features, such as VR and multithreaded rendering, require time spent on ensuring compatibility as Unity evolves, and there will, of course, always be new engine features requiring maintenance of the Legacy Particle System code. This has prompted us to take the next logical step and retire the Legacy Particle System. We have therefore decided to remove its script bindings in Unity 2018.1. It has been fully deprecated since Unity 5.4, and our analytics show almost non-existent usage. Our target is to fully remove the Legacy Particle System in Unity 2018.3.

If any of this affects you, you have the following options:

- Migrate your Legacy Particle Systems to Shuriken.
- Use our auto-updater script to attempt automatic conversion (https://forum.unity.com/threads/release-legacy-particle-system-updater.510879/).
- Reach out to us for help at https://forum.unity.com/threads/legacy-particle-system-deprecation.420351/.

While we can’t promise to solve every problem this causes, we will definitely listen to your concerns and do our best to mitigate any pain it may cause.

Weighted tangents allow animators to create animation curves with fewer keys and smoother curves by letting them control the weight of the tangents. Once a tangent is set to Weighted, you can stretch it to affect the curve interpolation without adding any additional keys, for a smoother, more precise result. The Animation team has added support for weighted tangents to all curve editing in Unity, which means you can use this new feature with the Particle System as well. We also added a simple, but oh-so-convenient, Zoom Control feature in the Animator Controller window!

As we announced in February, ProBuilder and its creators have joined Unity. With the ProBuilder, ProGrids and Polybrush tools, we are now offering integrated advanced level design in the Unity Editor at no additional cost. The package, which consists of ProBuilder, ProGrids and Polybrush, is included with all Unity subscription plans (Personal, Plus, Pro and Enterprise).

ProBuilder is a unique hybrid of 3D-modeling and level-design tools, optimized for building simple geometry but capable of detailed editing and UV unwrapping as needed. You can use ProBuilder to quickly prototype structures, levels, complex terrain features, vehicles and weapons, or to make custom collision geometry, trigger zones, or nav meshes. ProBuilder also includes tools for exporting your models, editing imported meshes, and a runtime-ready API for accessing the ProBuilder toolset from your own code.

Also available, ProGrids gives you both a visual and functional grid, which snaps on all 3 axes. Working on a grid facilitates speed and quality, making level construction incredibly fast, easy, and precise. It is especially handy for modular or tile-based environments, but when combined with ProBuilder, it enables faster and more precise geometry construction for all types of work. Check the ProGrids Introduction and Tutorial to learn more, and scroll to the bottom of this blog post for information on the integration and availability of ProGrids.

Polybrush enables you to blend textures and colors, sculpt meshes and scatter objects directly in the Unity Editor. Combined with ProBuilder, you get a complete in-editor level-design solution. Finally, enjoy Unity’s seamless round-tripping with Digital Content Creation tools (like Maya) to further detail and polish your models.

Below are a few examples of how ProBuilder enables you to quickly prototype structures, complex terrain features, vehicles and weapons, or to make custom collision geometry, trigger zones or nav meshes. ProBuilder has been used by many made-with-Unity games.
Check out the highlight reel video for a quick overview of the many features in ProBuilder.

Polybrush is in beta, and just got a new feature in its latest iteration: it now allows you to scatter objects using highly customizable brushes. Check the Polybrush Introduction and Tutorial to learn more. As a package, ProBuilder, Polybrush and ProGrids give you a complete in-editor level-design solution that enables you to construct precise geometry faster.

ProBuilder is available via the new Unity Package Manager. We plan to integrate the other two parts of the package directly into Unity at some point in 2018, but right now you can download Polybrush (Beta) and ProGrids for free on the Asset Store.

How to get started: in the Unity Editor, go to the Window menu > Package Manager, click All, select ProBuilder and click Install. For more information, check the documentation and the Getting Started with ProBuilder tutorial series. If you already purchased one of the packages, have a look at the “How to get the tools, what’s the roadmap” section of this blog post for information.

With our new high-performance multithreaded system, we’re rebuilding the very core foundation of Unity. The new system will enable your games to take full advantage of the multicore processors currently available, without the programming headache. This is possible thanks to the new C# Job System, which gives you a safe and easy sandbox in which to write parallel code. We are also introducing a new model to write performant code by default with the Entity Component System, and the Burst compiler to produce highly optimized native code. With performance by default, not only will you be able to run your games on a wider variety of hardware, you’ll also be able to create richer game worlds with more units and more complex simulations.

Write very fast, parallelized code in C# to take full advantage of multicore processors

The trend in modern hardware architecture has been towards multiple cores to increase processing power, rather than the more traditional solution of increasing core speed. The introduction of the C# Job System will help you fully leverage this increase in processing power. It enables you to write fast, jobified code in C# scripts, and it is safe: it provides protection from the pitfalls of multithreading, such as race conditions and deadlocks. A minimal sketch of a jobified script follows below.

Better performance across the board

The C# Job System enables better overall performance, especially as new Unity features like the Entity Component System (18.1 preview) and our new Burst compiler (18.1 preview) technology become available. The goal of all of these systems is to increase what is fundamentally possible in Unity in terms of performance, while still supporting existing workflows and allowing for a smooth technical transition.

What you’ll need to do

In order to achieve these performance gains, there are a few key changes you need to make to the way you write code in Unity. First, providing the CPU with clean, linear arrays of data to read from, rather than pulling from multiple locations in memory during calculation, allows for much faster performance. By taking an active role in memory management, we ensure that memory is managed in a way that optimizes performance.
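The sketch mentioned above: a trivial job operating on a linear NativeArray (the job itself and the per-frame scheduling are illustrative only; real systems schedule many jobs and complete them later in the frame):

    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;

    public class JobExample : MonoBehaviour
    {
        // A job reads and writes plain, linear data; no references to scene objects.
        struct AddJob : IJob
        {
            public NativeArray<float> values;
            public float offset;

            public void Execute()
            {
                for (int i = 0; i < values.Length; i++)
                    values[i] += offset;
            }
        }

        void Update()
        {
            var values = new NativeArray<float>(1024, Allocator.TempJob);
            var job = new AddJob { values = values, offset = Time.deltaTime };

            // Schedule the job for a worker thread, then wait before using the results.
            JobHandle handle = job.Schedule();
            handle.Complete();

            values.Dispose();
        }
    }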
A new set of tools added to Unity’s API allows you to manage your data layout and the way memory is managed in an explicit and detailed way.

The Entity Component System is a way of writing code that focuses on the actual problems you are solving: the data and behavior that make up your game. In addition to being a better way of approaching game programming for design reasons, using the Entity Component System puts you in an ideal position to leverage Unity's Job System and Burst compiler, letting you take full advantage of today's multicore processors. With the Entity Component System, we are moving from an object-oriented approach to a data-oriented design, which means it will be easier to reuse the code and easier for others to understand and work on it as well. The Entity Component System ships as a preview package in 2018.1, and we will continue to develop and release new versions of it in the 2018.x cycle.

Burst is our LLVM-based compiler, which takes .NET IL and produces machine code using a new math-aware backend compiler technology. Burst takes C# jobs and produces highly optimized code that takes advantage of the particular capabilities of the platform you’re compiling for. So you get many of the benefits of hand-tuned assembler code across multiple platforms, without all the hard work. The Burst compiler ships as a preview package in 2018.1, and we will continue to develop and release new versions of it in the 2018.x cycle.

To help you get started, we’ve also provided a repository of examples that demonstrate using the C# Job System to write systems at scale, for reference and sharing: the C# Job System Cookbook. For more information on how to build using the Entity Component System, have a look at the Entity Component System samples.

Last fall at Unite Austin 2017, Unity and Autodesk announced a collaborative partnership to build more connected workflows between Autodesk 3D tools and the Unity engine. Since then we have improved the interoperability to benefit any game developer or artist who works on either side of a 3ds Max/Maya/Unity workflow. Our goal is to provide an artist-friendly interface and workflow that allows you to safely merge your changes back into those assets to continue your work. When round-tripping, assets are often edited and renamed, potentially changing their very nature. Now Unity will make sure that modifications made to the FBX by an external application can be remapped to the original with no loss of information. Other improvements to the workflow and integration include lights round-tripping, animations round-tripping (including custom properties) and blendshapes (experimental).

Version 6.0 of the remote Cache Server, the culmination of a six-month focused effort to elevate quality and performance, is now available. Cache Server makes creating with Unity faster by optimizing the asset-import process, either on your local machine or on a dedicated Local Area Network server. These improvements save time and speed up the development process for individuals and teams. Download the remote Cache Server now on GitHub.

Package Manager

In 2017.2, we introduced the first pillar of the new Package Manager, a more flexible and modular approach to managing the Unity-developed features and assets that make up your projects. Initially only exposed as an API in previous releases, in Unity 2018.1 we’re introducing a new Package Manager User Interface.
The new Package Manager UI will help you start projects more efficiently, and make it smoother and easier to install, update and enable new Unity features. The Unity Package Manager UI improves the following aspects of your project-management workflow:

- Quick access to newly released features: browse the list of available features for your version of the editor, and download them from the cloud. Packages are instantly and dynamically included in your project.
- Get the latest fixes, instantly: quickly check, download, and update the packages you have enabled in your project.
- Access to Preview features: many new Unity features will be available to download as “Previews”, giving you access to the latest technology.
- Easily share lightweight projects: Unity packages are kept in a global cache on your machine and referenced by your project. There’s no need to transfer packages when sharing your project; the editor will fetch and download the needed packages from the cloud repository.

You can find the Package Manager in the Window menu, and use it to install features such as Shader Graph, Post Processing, ProBuilder and the Lightweight and High Definition Render Pipelines.

Unity automatically defines how scripts compile to managed assemblies, and compilation times in the Unity Editor for iterative script changes increase as you add more scripts to the project. In 2017.3, we introduced the ability to define your own managed assemblies based on scripts inside a folder. By splitting your project’s scripts into multiple assemblies, script compilation times in the editor can be greatly reduced for large projects. You can think of each managed assembly as a single library within the Unity project. (A minimal assembly definition file is sketched below.)

In 2018.1, Assembly Definition File (asmdef) assemblies are now compiled on startup before any other scripts (Assembly-CSharp.dll and friends), and compilation does not stop at the first compile error. All asmdef assemblies that successfully compile, along with all their references, are loaded before compiling the remaining scripts (Assembly-CSharp.dll and friends). This ensures that Unity packages are always built and loaded regardless of other compile errors in the project.

It also makes it possible for packages to have play-mode test assemblies without modifying user project settings. In the past, when adding play-mode tests, users needed to enable them in the settings. This resulted in registering the assemblies in the build, and also meant that there was none of the separation that C# developers normally employ in projects designated for tests. In 2018.1, you can mark an assembly as referencing the test assemblies, and no other assemblies will reference it unless the old settings are used (backwards compatible); predefined assemblies will not auto-reference these assemblies either. We also added a new BuildOption: by default, a normal build will not build assemblies with these settings, and only the TestRunner will include and build test assemblies. All this makes it possible to have tests in projects without having the setting activated.
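For reference, a minimal assembly definition file (a sketch; the assembly names are placeholders, and the optionalUnityReferences field for test assemblies is my assumption about the 2018.1 schema):

    {
        "name": "MyGame.Gameplay",
        "references": [ "MyGame.Core" ]
    }

A test assembly would additionally opt in to the test framework, roughly:

    {
        "name": "MyGame.Tests",
        "references": [ "MyGame.Gameplay" ],
        "optionalUnityReferences": [ "TestAssemblies" ]
    }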
Presets are assets containing the type of asset to which they apply and a list of property modifications (name/value pairs). Presets can be easily applied to, or created from, any serializable object via the UI on the object inspector or from a public API method. Objects do not keep a link to an applied preset, so modifying a preset after it has been applied has no side effects. Each object type may have a single preset registered as its default via the new Preset Manager. Any time an object is created in the Unity Editor, the default preset for that object is automatically applied; changing the default preset does not affect existing objects. Presets only exist in the editor, and there are no runtime API changes. Presets expose API hooks, which allow them to be set immediately before import, and a new object creation API, which allows editor code to create objects with default presets.

As announced in December 2017, we are deprecating built-in support for the import of Substance Designer materials in the Unity 2018.1 Editor. You can find more info on how to include Substance materials here.

In 2018.1, we are adding a number of improvements to Timeline. With Timeline keyboard navigation, you can now use tab and the arrow keys to speed up your workflow by easily toggling tracks between collapsed and expanded. We also added the new Timeline zoom bar, which makes it easy to zoom in and out to get an overview of your Timeline tracks.

We are also introducing editing modes in Timeline. When manipulating clips, zoom-based snap behavior keeps clips together with neighboring clips. To create a blend effect, you can release (relax) the edge-magnet behavior through a clutch key (Ctrl); this allows moved or trimmed clips to blend into neighboring clips without being impeded by the edge magnet. There are three modes:

- Mix mode allows two clips to blend together in order to create a transition. This is how Timeline has worked up until now, and it is the default editing mode.
- Rippling mode allows clips to be inserted on a track that already contains clips by making room for the new material.
- Replace mode allows clips to overwrite clips already present on a track in order to update the material while preserving timing.

Cinemachine comes packed with improvements in 2018.1, including the Cinemachine Storyboard, which enables you to set up the timing and basic animation of storyboards in Timeline as a function of a Cinemachine clip. This helps people use Unity from the very start of the storytelling process and keeps them there throughout the creative process. You can pace out your boards, and do cross-fades, basic zooms and ‘Ken Burns’-style movement. Get your story blocked in and your timing and shots working the way you want them to. Add audio to create a realistic feel and pull your story together. Once you're ready, you can use just one button to toggle between the storyboard and a Cinemachine camera, all the while keeping your editing intact. There is no need to do your storyboard and previs edit in another tool; you can do it all inside Unity. Other improvements include Package Manager integration, a camera-shake system, support for custom camera blend curves, and numerous others.

It’s important to us that we provide you with a great C# IDE experience to accompany the new C# features. So, to support the latest C# features and C# debugging on the new .NET 4.6 scripting runtime on macOS, we are replacing MonoDevelop-Unity 5.9.6 with Visual Studio for Mac starting from Unity 2018.1 (as announced in January). This improves support for many of the exciting new C# features available in C# 6.0 and beyond as we move to the (currently experimental) .NET 4.6 scripting runtime upgrade in Unity. On Windows, we will continue to ship Visual Studio 2017 Community with Unity.
Visual Studio 2017 Community already supports the latest C# features and C# debugging on the new .NET 4.6 scripting runtime. MonoDevelop-Unity 5.9.6 will be removed from the Unity 2018.1 Windows installer, as it does not support these features. Visual Studio Code also supports Unity, with extensions for IntelliSense and the Unity Debugger Extension to debug your Unity C# projects. See Unity Development with VS Code for details, and learn more about the implications and alternative IDEs in our blog post. As always, we are looking for feedback, and if you experience issues with Visual Studio or its integration in Unity, please report them at https://developercommunity.visualstudio.com/

In 2018.1, we have added support for the IL2CPP scripting backend for Windows standalone and macOS standalone. This brings the CPU speed improvements of IL2CPP to the macOS and Windows standalone players, as well as enabling third-party DRM tech to help with application code security.

The new scripting runtime is no longer experimental in Unity 2018.1. In Unity 2017.1, we shipped the first experimental preview of the new scripting runtime. Throughout the 2017.2 and 2017.3 release cycles, many Unity users worked with this experimental scripting runtime and provided invaluable feedback. (Thanks, everyone!) We’ve also worked closely with excellent developers from Microsoft, both on the Mono and Visual Studio teams. As we’ve sorted out issues and corrected bugs, the modern scripting runtime has become more and more stable and is now ready for widespread use. Upgraded compilers and runtimes provide improved stability, richer debugging, and better parity with a modern .NET architecture across all platforms. For example, C# 6 and new .NET APIs make Unity compatible with modern .NET libraries and tools.

Two profiles are available with the new scripting runtime: .NET 4.x and .NET Standard 2.0. Both profiles fully support .NET Standard 2.0, enabling the use of the latest cross-platform libraries. The .NET Standard 2.0 profile is optimized for small build size, cross-platform support, and ahead-of-time compilation. The .NET 4.x profile exposes a larger API surface and exists primarily for backward compatibility.

Google’s spatial audio SDK, Resonance Audio, is fully integrated in Unity 2018.1, enabling developers to create more realistic VR and AR experiences on mobile and desktop. With Resonance Audio in the Unity Editor, you can now render hundreds of simultaneous 3D sound sources in the highest fidelity for your XR, 3D, and 360 video projects on Android, iOS, Windows, macOS, and Linux. Some recent examples of made-with-Unity titles are Pixar’s Coco VR for Gear VR, Disney’s Star Wars™: Jedi Challenges AR app for Android and iOS, and Runaway’s Flutter VR for Daydream.

A new, native, multiplatform XR API has been added in this release, laying the groundwork for a more extensible framework that initially targets abstraction over handheld AR SDKs. The API is designed to enable developers to build handheld AR apps once and deploy across multiple device types. As functionality is added to this API, we will eventually deprecate the ARKit plugin on Bitbucket as well as the experimental ARInterface project on GitHub.
Support for handheld AR SDKs will ship as preview packages in the editor.

For all creators looking to start exploring Magic Leap, we’re thrilled to make a Unity for Magic Leap Technical Preview and the Lumin SDK available to you on the Magic Leap Creator Portal. The Magic Leap Technical Preview is meant for anyone looking to get a glimpse of this exciting new platform. In addition to 2018.1 features, the Technical Preview includes a new platform under the Build Window targeting Magic Leap’s Lumin OS. The preview, coupled with the Lumin SDK, will also give you access to Magic Leap Zero and Magic Leap Remote, which allows for simulation of the hardware platform. Learn more.

Unity 2018.1 brings you support for Google’s Daydream standalone VR headset with WorldSense technology, enabling inside-out, six degrees of freedom (6DoF) tracking support for Daydream apps. Use it to get building now, and be ready when the hardware is released later this year!

With ARCore out of developer preview, you can now create high-quality AR apps for more than 100 million Android-enabled devices on Google Play. ARCore 1.1 for Unity also enhances the environmental understanding of your scene with oriented feature points, a new capability that allows you to place virtual content on surfaces near detected feature points, such as cans, boxes, and books. It also allows you to test and iterate your AR app in near real time with ARCore Instant Preview. Learn more about all that ARCore brings to Unity 2018.1.

Whether you’re a VR developer who wants to make a 360 trailer to show off your skills, or a director who wants to make an engaging cinematic short film, your workflow just got easier. Unity’s new technology for capturing stereoscopic 360 images and video empowers you to create and share immersive experiences with an audience of millions on platforms such as YouTube, Within, Jaunt, Facebook 360, or Steam 360 Video. Our device-independent stereo 360 capture technique is based on Google’s Omni-Directional Stereo (ODS) technology, which uses stereo cubemap rendering. We support rendering to stereo cubemaps natively in Unity’s graphics pipeline, both in the Unity Editor and in PC standalone players. After the stereo cubemaps are generated, they are converted to stereo equirectangular maps, the projection format used by 360 video players. Stereo 360 capture works in the forward and deferred lighting pipelines, with screen-space and cubemap shadows, skybox, MSAA, HDR, and the new Post-Processing Stack. You can find more info in this blog post.
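A rough sketch of that capture path, assuming the 2018.1 stereo overloads of Camera.RenderToCubemap and RenderTexture.ConvertToEquirect; the resolutions and eye separation below are illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class Stereo360Capture : MonoBehaviour
{
    public Camera captureCamera;
    RenderTexture cubemapLeft, cubemapRight, equirect;

    void Start()
    {
        cubemapLeft  = new RenderTexture(1024, 1024, 24) { dimension = TextureDimension.Cube };
        cubemapRight = new RenderTexture(1024, 1024, 24) { dimension = TextureDimension.Cube };
        equirect     = new RenderTexture(2048, 2048, 24); // receives a top-bottom stereo layout
    }

    void LateUpdate()
    {
        captureCamera.stereoSeparation = 0.064f; // interpupillary distance in meters

        // 63 is the face mask that renders all six cubemap faces.
        captureCamera.RenderToCubemap(cubemapLeft, 63, Camera.MonoOrStereoscopicEye.Left);
        captureCamera.RenderToCubemap(cubemapRight, 63, Camera.MonoOrStereoscopicEye.Right);

        // Convert both eyes into the equirectangular projection 360 players expect.
        cubemapLeft.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Left);
        cubemapRight.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Right);
    }
}
```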
In 2018.1, we added 64-bit ARM runtime support for Android based on IL2CPP technology. Previously, only 32-bit ARM or x86 builds for Android could be produced using Unity. Almost all current chipsets ship as 64-bit, but we have not been leveraging the potential advantages. The new support for running 64-bit Android apps brings performance benefits, and games will be able to address more than 4 GB of memory space.

Android Sustained Performance Mode sets a predictable, consistent level of device performance over longer periods of time without thermal throttling. It offers improved battery life and a smooth experience at the cost of some reduced performance. It’s based on the Sustained Performance API from Google. This setting has already been available for VR applications, and now you can enable it for non-VR applications as well.

Performance Reporting (included with Unity Plus and Unity Pro) now supports Windows desktop, joining support for macOS and mobile platforms (iOS and Android). Native crashes on Windows will now be reported for you to view and debug from the Unity Developer Dashboard. The dashboard is also being updated with new features for managing and viewing reports, as well as better performance for highly active projects.

As always, refer to the release notes for the full list of new features, improvements, and fixes. You can also provide us with feedback on our forums.

At GDC, we announced our new plans for Unity releases, which include what will be known as the TECH stream and the Long-Term Support (LTS) stream. The TECH stream will consist of three major releases a year with new features and functionality. The LTS stream will be the last TECH stream release of each year and will roll over to the following year. The TECH and LTS streams represent a major shift from the current approach of supporting each release for a year. From now on, support for each TECH release will end when the following one goes live, while LTS releases will be supported for two years.

Other aspects of our approach to releases will also change:

- From four to three releases: Instead of four feature releases, we are going to ship three TECH stream releases per year.
- First release in spring: Each year, a TECH stream will begin in the spring, this year starting with 2018.1, followed by summer and fall releases.
- Frequency of bug fixes: Whereas the TECH stream will receive a weekly release with bug fixes, the LTS stream will receive regular bug fixes every other week.
- From patches to updates: We are dropping the .p# suffix for our weekly patches because, due to our improved testing of these releases, we believe they will be suitable for everyone.

The first LTS release will be 2017.4, which is simply the latest 2017.3 updated release. The change in version number signifies that it is the beginning of the new LTS cycle. So, xxxx.1, xxxx.2, and xxxx.3 are TECH releases, and xxxx.4 is the LTS release. The regular updates on both TECH and LTS streams will have continuous version numbering; for example, 2017.4.0 will be followed by 2017.4.1, 2017.4.2, 2017.4.3, and so on. The chart below shows an example of how the streams will work, with the blue boxes representing the TECH streams and the green boxes representing the beginning of the LTS streams. To help you learn more about LTS, we created a blog post which answers frequently asked questions.

We also want to send a big thanks to everyone who helped beta test, making it possible to release 2018.1 today. Thanks for wanting to be among the first to try all the new features, and for all the great feedback.

Info on the 2018.1 sweepstakes winners: We have found and contacted all the 2018.1 beta sweepstakes winners and will send out the prizes to the lucky winners in the weeks to come.

Be part of the 2018.2 beta: If you aren't already a beta tester, perhaps you’d like to consider becoming one. You’ll get early access to the latest new features, and you can test whether your project is compatible with the new beta.
By becoming a beta tester, you can:

- Get early access to all the latest features before they are officially released
- Test your compatibility to make sure your projects are ready to be upgraded
- Join the experts and share insights with experienced members of the Unity community
- Win cool prizes: beta testers are automatically entered into our sweepstakes
- Influence the future of Unity with surveys, feedback, and the chance to be invited to roundtables
- Be part of an elite group that gets special benefits, such as discounts and invites to special events

We've been listening to your feedback from our beta survey, and as a result, we're planning to launch a number of new initiatives. We're going to expand these open-beta initiatives into a more formalized program, which will result in a more efficient and speedy QA process and, ultimately, a more polished end product. You can get access simply by downloading our latest beta version. Not only will you get access to all the new features, you’ll also help us find bugs, ensuring the highest quality software. As a starting point, have a look at this guide to being an effective beta tester to get an overview. If you would like to receive occasional emails with beta news, updates, tips, and tricks, please sign up.

>access_file_
1551|blog.unity.com

Lifetime value: How app developers can maximize LTV

LTV, or lifetime value, is a critical metric that app developers use to measure overall app success. It quantifies the value of a user cohort (a group of users segmented by geo, app version, install date, etc.) throughout their lifetime within an app. A core measurement of app health, it’s used on both sides of the business, by monetization and user acquisition managers alike.

How to calculate user lifetime value: There are many ways to calculate lifetime value, and each depends on how app developers define ‘value.’ However, no matter how app developers choose to calculate LTV, one thing remains certain: ‘high LTV’ users always have high retention rates and contribute generously to the overall revenue of the app. That’s why app developers typically calculate LTV in relation to monetization and/or retention.

Monetization and lifetime value: In relation to app monetization, lifetime value is calculated in order to quantify the revenue-generating capabilities of an app’s user base; in other words, how much revenue do users generate? Of course, LTV as a factor of monetization depends on the app type and the forms of monetization it employs (in-app purchases, in-app advertising, sales, subscriptions, etc.), and LTV might also differ from category to category. For example, a ride-sharing app might calculate LTV in terms of frequency of use or money spent on rides, while a mobile game might calculate LTV in terms of in-app purchases and rewarded videos watched. Typically, it’s relatively easy for app developers to calculate the IAP-based LTV of their users, i.e. the value of the in-app purchases a given user makes over their lifetime. But ad-based LTV can be harder to calculate, since it’s based on data from various different ad networks. There are tools to simplify the calculation of ad LTV; you can check out our tool here.

Why is LTV so important? LTV is particularly important for app developers running user acquisition campaigns, since LTV helps app marketers define what price makes sense to pay for an install. To ensure their campaigns are ROI-positive, marketers need to ensure that the price they pay for a new user is below the projected LTV for that user. In this sense, the nexus between user retention and lifetime value becomes important. In terms of user retention, or an app’s ability to keep users coming back, lifetime value is calculated to learn the point in a user’s lifetime at which the user becomes profitable, i.e. when the price paid for that user is recouped. Here, app developers calculate LTV as a function of their app’s retention curve; for example, what is LTV 7 days after install? This way, app developers can home in on points in the app lifecycle at which LTV is low, and direct resources to increase LTV there. If the cost per acquired user in a user acquisition campaign is higher than the revenue that user will eventually generate in-app, i.e. if the cost of acquiring a user is higher than their projected LTV, app marketers lose money. That’s why it’s important to make sure that the cost per install (CPI) is less than LTV, so app advertisers can stay ROI-positive and turn a profit. In concrete terms, this means it’s better to pay $5 for a user who is predicted to bring in an average of $6 than to pay $0.10 for an install that won’t yield revenue.
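As a back-of-the-envelope illustration of that CPI < LTV check, here is one simple way to project LTV from ARPDAU and an estimated user lifetime; every number and name below is hypothetical:

```csharp
using System;

public static class LtvExample
{
    // One common simplification: LTV ≈ ARPDAU × expected lifetime in days,
    // where the expected lifetime is read off the cohort's retention curve.
    static double ProjectLtv(double arpdau, double expectedLifetimeDays)
        => arpdau * expectedLifetimeDays;

    public static void Main()
    {
        double arpdau = 0.12;              // average revenue per daily active user
        double expectedLifetimeDays = 50;  // estimated from retention data
        double cpi = 5.00;                 // cost per install paid in the UA campaign

        double ltv = ProjectLtv(arpdau, expectedLifetimeDays); // 6.00
        Console.WriteLine(cpi < ltv
            ? $"ROI-positive: paying ${cpi} for a user projected to bring ${ltv}"
            : $"ROI-negative: ${cpi} exceeds the projected LTV of ${ltv}");
    }
}
```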
How can app publishers increase lifetime value?

Optimize onboarding: The first few minutes of a user’s in-app experience are critical. Your onboarding process needs to be enticing, simple, sticky, and engaging to ensure users stick around for the long haul and end up generating revenue throughout their lifetime.

Integrate rewarded video ads: That brings us to the next point. Some ad units, like rewarded videos, are actually known to increase both retention and lifetime value within an app. This is because they not only drive ad revenue, but also drive further engagement with the app and, down the line, can even increase IAPs. That means better retention and more revenue from in-app purchases. Giving users who aren’t ready to pay a way to keep playing your game or engaging with your app is key to building long-term loyalty, and rewarded videos are a win-win way of doing that. What are they? Rewarded videos are ads users watch in exchange for in-app content like game currency or extra lives. In addition to generating high eCPMs and revenue for app developers, rewarded video ads also increase retention rates, since they let users continue playing without requiring the in-app purchases they’d otherwise need to progress. More importantly, rewarded videos don’t cannibalize IAPs, and can actually increase them. As non-paying users get used to accessing in-app currency through rewarded videos, they often reach a stage where they’re ready to pay for that currency, since its value is clear to them. This is why users who frequently watch rewarded videos have higher LTVs than others.

Keep users engaged: A core component of good LTV is active, engaged users, i.e. users who keep coming back to your app or game to spend time (and money). Increasing the number of engaged users you have will therefore also increase overall LTV. Some key tools for increasing in-app engagement are deep linking and push notifications. App re-engagement campaigns with deep links direct lapsed or unengaged users to specific content in your app that is relevant to them. By removing ‘unnecessary’ steps like opening the app and navigating to the page they want, deep linking makes it easy for users to re-engage with an app they already have installed. Similarly, push notifications, when implemented thoughtfully, can help remind users of in-app benefits or highlight special deals that could entice them back to the app, for example, a limited-time promotion that doubles the value of in-app currency.

Wrapping up: LTV is one of the most important metrics for determining an app’s success, and there are dozens of strategies for increasing lifetime value. If you’re interested in learning more about improving your app’s lifetime value, contact the team of experts at ironSource here.

>access_file_
1554|blog.unity.com

Photogrammetry in Unity: Making real-world objects into digital assets

Photogrammetry is the process of using multiple photos of real-world objects to author game-ready assets. It’s best suited to objects that are time-consuming to produce in 3D sculpting software. This post explains how new Unity features assist you in working with photogrammetry. We’re also sharing our Fontainebleau photogrammetry demo project, including all meshes, textures, and materials.

Last year at Siggraph 2017, we revealed a highly detailed field guide for professional artists. It takes you through the entire process, step by step, for producing high-quality, reusable, game-ready digital assets from photos and video. As part of this workflow, the team released the De-Lighting tool on the Asset Store. It enables artists and developers to remove lighting information from photogrammetry textures, so that the final assets can be used under any lighting condition. We also gave a talk at Siggraph with an overview of the photogrammetry workflow and technical details about the De-Lighting tool.

This year, we created a step-by-step guide that shows how to use a layered shader to achieve the same level of quality as shown in the photogrammetry workflow guide, while optimizing the texture memory budget to suit the needs of games. Check out the complete guide: Layering materials in Unity with the High-Definition (HD) Render Pipeline.

Photogrammetry gives you a high-quality result, but it requires a very high texture resolution to conserve details, as shown here. This is impractical for game authoring due to memory budgets, and it doesn’t allow you to add any variation to the object. Unity 2018.1 beta introduced a preview of the Scriptable Render Pipeline, and we will release two built-in render pipelines with Unity 2018.1: the Lightweight (LW) Render Pipeline and the High-Definition (HD) Render Pipeline. The HD Render Pipeline provides a shader dedicated to photogrammetry material authoring, named LayeredLit.

A layered shader defines the visual with a combination of individual materials. You can see that the main material in the picture above is mixed with other materials like stone, ground elements, and moss. These other materials are tileable: they can wrap around objects, and you can reuse them on different objects. Using a combination of materials enables you to achieve a visual quality similar to a high-resolution texture (that is, a similar texel density on screen), but with low-resolution textures, which saves memory. A layered shader also allows you to share textures between assets and to combine tileable materials to add variation. This makes it easy for you to populate a large world at low cost.
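As a rough sketch of what driving such a layered material from script could look like. The shader path and property names here are assumptions based on early HDRP LayeredLit conventions and may differ between package versions:

```csharp
using UnityEngine;

public class LayeredRockSetup : MonoBehaviour
{
    // Hypothetical textures: a photogrammetry scan plus tileable detail layers.
    public Texture2D rockScanAlbedo;
    public Texture2D mossTileAlbedo;
    public Texture2D layerMask;       // per-channel blend weights between layers

    void Start()
    {
        // Shader path as shipped in early HDRP packages; verify against your version.
        var mat = new Material(Shader.Find("HDRenderPipeline/LayeredLit"));
        mat.SetFloat("_LayerCount", 2);                   // number of active layers (assumed name)
        mat.SetTexture("_BaseColorMap0", rockScanAlbedo); // base photogrammetry layer
        mat.SetTexture("_BaseColorMap1", mossTileAlbedo); // tileable variation layer
        mat.SetTexture("_LayerMaskMap", layerMask);       // controls where layers blend
        GetComponent<Renderer>().material = mat;
    }
}
```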
We created the Fontainebleau demo to illustrate the photogrammetry workflow and the use of the LayeredLit shader. This technical demo was authored with game development conditions in mind: it’s a representative game level and targets the standard PlayStation 4 platform at 1080p @ 30fps. Targeting consoles like Xbox One or PlayStation 4 requires consideration of how to get the most from these platforms. The level represents a part of the Fontainebleau forest and uses a limited set of meshes and textures that are reused with different variations with the help of the LayeredLit shader.

The demo also supports three different lighting conditions, to illustrate that correctly authored and de-lighted assets work fine under any lighting: day lighting, sunset lighting, night lighting with lights off, and night lighting with lights on.

Finally, we included three modes to explore the demo. In Cinematic mode, you select your lighting program, then sit back, relax, and enjoy the show. The First-Person and Third-Person modes are very rudimentary exploration modes that let you discover the environment on your own and walk around inside the forest, with bonuses in First-Person mode.

Fontainebleau is the name of a forest close to the Unity Paris office. The forest is a good subject for photogrammetry: natural assets are often complex and hard to reproduce realistically, and for our artists it was important to have the subject close by so they could run all the tests needed to work out the best possible workflow for games.

The project is intended to work with 2018.3 and uses the 4.8.0 HDRP package. You can download the pre-built PC standalone player here. The project is available on GitHub for cloning only; downloading a zip file using the green button on GitHub will not work, and you must use a version of git that has LFS in order to get the project. Take a look at it and inspect the photogrammetry assets: meshes, textures, and materials. You’re also allowed to reuse the assets in your own Unity project with no restriction.

At the start, the photogrammetry process can seem hard to manage, but the result is worth it. Hopefully this demo will convince some of you to use photogrammetry to improve the quality of your assets and help you speed up your projects. We're looking forward to seeing what you make, and to talking with you about this demo and the photogrammetry workflow in Unity in general in this forum thread.

>access_file_
1555|blog.unity.com

Spotlight Team best practices: Making believable visuals in Unity

Being part of the Spotlight Team, I am fortunate to be involved in some very interesting projects. The Spotlight Team at Unity works on games together with our customers, and a significant part of my role is to help developers achieve the desired look and quality for their projects. I get to hear many stories from all across the industry and can identify common issues that content creators are facing. Several of the projects I have worked on aimed for fairly realistic visuals. Given a project’s art content, how do we make a scene in Unity that will look believable?

There are a multitude of topics that need to be covered in order to make believable visuals. In this post I’m going to discuss lighting and render settings. Further down the post I’ll also share our Spotlight Tunnel Sample Scene and explain how you can use it to learn and experiment. Understanding how Unity’s rendering features can be used to realistically mimic the real world will help you to achieve your project’s visual goal.

Linear rendering mode. In simple terms, this sets Unity to do lighting and shading calculations using physically accurate math before transforming the final output into the format that works best for monitors. To specify a gamma or linear workflow, go to Edit > Project Settings > Player and open Player Settings, then go to Other Settings > Rendering and change the Color Space to Linear. Defining your color space should be one of the earliest decisions in your project, because of its drastic impact on the final shading and lighting results.

Rendering mode. In the Spotlight Tunnel Sample Scene, we use the deferred rendering path. This allows content creators to work with multiple dynamic lights efficiently, combine multiple reflection cubemaps, and use the existing Screen Space Reflection features in Unity 2017+. To set this, go to Graphics Settings > Rendering Path or Camera > Rendering Path. You can find more information about rendering paths in this part of the Unity documentation.

High Dynamic Range (HDR) Camera. When rendering believable lighting, much like in real life, content creators will be dealing with lighting values and emissive surfaces whose brightness is higher than 1 (high dynamic range). These values then need to be remapped to the proper screen range (this is called tonemapping). This setting is crucial to allow the Unity camera to process these high values without clipping them. To enable it, select the main camera in the scene and ensure that HDR is checked in the Inspector tab for the selected camera.

HDR lightmap encoding (optional). The Spotlight Tunnel Sample Scene doesn’t use baked lighting; however, if you’re planning to work with high-intensity (HDR) baked lighting, we recommend that you set the lightmap encoding to HDR lightmaps to make sure the baked light result is consistent. The option can be found under Edit > Project > Player settings > Other settings > Lightmap encoding (Unity 2017.3+ only). Detailed information about lightmap encoding can be found in the manual.

Tonemapper for your Scene (part of the Post-Processing Stack). To display HDR lighting properly, a tonemapper needs to be enabled in the project. Make sure to install the Unity Post-Processing Stack (version 1) from the Asset Store first. Create a Post Process Profile asset in your project and configure it as follows: enable Color Grading > Tonemapper > ACES (Academy Color Encoding Standards), then enable Dithering. Dithering allows the Scene to alleviate the banding artifacts introduced by 8-bit/channel output from an HDR Scene; modern engines use this technique to circumvent the limitation of 16M-color output. Leave the rest of the tonemapper settings alone for now. Select the Main Camera, add the Post Processing Behaviour component, and assign the previously created Post Process Profile to the profile slot. If you want to use Post-Processing Stack version 2, please refer to the readme of the package, as it is currently in beta. Finally, enable Image Effects for the viewport so you can see the tonemapper at all times while working in the Scene view. Notice the highlight rendition and the improved value separation in the dark tunnel in the tonemapped Scene; in the non-tonemapped Scene, you can see how the highlights don’t converge to a unified color (the yellowish burning sun in this case).

This setup essentially tries to replicate how a digital camera captures a scene with a fixed exposure (without exposure adaptation / eye adaptation features enabled). At this point, content creators have achieved a proper foundational scene rendering setup that should give believable results with a wide range of content.
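For reference, here's a minimal editor-side sketch of that same baseline. The menu paths above are the manual route; this simply shows plausible equivalent scripting calls, and the menu item name is made up:

```csharp
using UnityEditor;
using UnityEngine;

public static class BelievableVisualsSetup
{
    [MenuItem("Tools/Apply Believable Visuals Baseline")]
    static void Apply()
    {
        // Linear rendering mode: lighting math in linear space.
        PlayerSettings.colorSpace = ColorSpace.Linear;

        var camera = Camera.main;
        camera.allowHDR = true;                               // keep lighting values above 1 unclipped
        camera.renderingPath = RenderingPath.DeferredShading; // per-camera deferred override
    }
}
```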
Unity caters to lots of different lighting strategies and project scenarios, and we recommend that you check out our extensive documentation on lighting modes and setup to understand all the different options. However, for fast iteration and simplicity, responsive visual feedback is necessary. For this reason, the Spotlight Tunnel Sample Scene uses real-time lighting with Realtime Global Illumination (GI). This gives us a nice range of specular response and good bounce lighting, and lets us change our lights on the fly.

Realtime lighting with Realtime GI + Light Probes. A typical daytime Scene with outdoor areas can be broken down into three lighting components: 1) hemisphere (sky contribution), 2) direct lights (sun + local lights), and 3) indirect illumination (GI lighting). At this stage, content creators are assumed to have meshes that are properly textured and an assembled Scene.

Initial hemisphere lighting. The first component of outdoor lighting is hemisphere lighting, called Environment Lighting in Unity. This is a fancy word for skylight: the night sky has minimal contribution, while the daytime sky has a very bright contribution. Hemisphere settings can be found under the Lighting tab (Window > Lighting > Settings > Environment). To start, a procedural skybox material is preferable to an HDRI cubemap. Create a new material in the project, name it SkyMaterial, and set it to Skybox / Procedural. Assign it to Environment Skybox Material in the Lighting tab > Scene. At this point the Scene is somewhat lit: there is ambient, but not exactly proper hemisphere lighting. We’ll leave this alone for now.

Directional light. Typical sunlight or moonlight is a light source at near-infinite distance, with parallel light direction and shadows. It is usually represented by a directional light.

Indirect illumination / Global Illumination. Directional light + ambient alone won’t create believable lighting. Proper hemisphere lighting requires occlusion of the skylight, and we also need to simulate sunlight bouncing off subjects in the scene. The sky currently contributes a single color value to the Scene, making it flat. This is where Realtime Global Illumination or baked lighting is required, to calculate occlusion and indirect bounce lighting.
In order to achieve that, follow these steps:

- Make sure all meshes that need to contribute to Realtime GI or baking are flagged with Lightmap Static and Reflection Probe Static. These would typically be large static meshes.
- Next, enable Realtime Global Illumination (leave it at the default Medium settings) in the Lighting tab > Scene > Realtime Lighting.
- Hit Generate Lighting, or check Auto Generate.

Whoa, the Scene is now dark after light generation has finished. To make matters worse, some elements of the Scene are out of place; notice the tram and the door in the background. The static objects in the Scene now have proper occlusion for the hemisphere and an indirect bounce response from the directional light, but the rest of the objects lack a proper lighting setup.

Light Probes and Reflection Probes. For dynamic objects or non-lightmapped objects to receive Realtime/Baked Global Illumination, light probes need to be distributed in the Scene. Make sure to distribute light probe groups efficiently, near the areas where dynamically lit objects are located or will pass (such as the player). Learn more about Light Probe Groups in the manual. Hit Generate Lighting again, or wait for the precomputation to finish if Auto Generate is checked. The tram and the background door are now grounded better, but reflections look out of place: sky reflection shows up everywhere, even inside the tunnel. This is where reflection probes come in. Place reflection probes efficiently, with proper coverage in the Scene as needed. In the Scene above, one reflection probe for the main room is sufficient, plus two for each tunnel interior. A 128-pixel cubemap resolution using box projection is usually a good baseline for typical cases and will keep memory use and reflection bake times happy. Here’s more information about Reflection Probes. The Scene now looks properly grounded and cohesive, an important part of a believable Scene. But everything is even darker than before, and nowhere near believable quality.

HDR lighting values. Many content creators don’t realize that, in reality, hemisphere lighting and sunlight are very bright light sources, much brighter than the value 1. This is where HDR lighting comes into play. For now, turn off the directional light and set the SkyMaterial Exposure to 16. This gives you a good idea of what proper hemisphere lighting does to a Scene. Things start to look believable; think of this state as a cloudy day, where sunlight is completely diffused in the sky and there’s no directional light. At this point, you can reintroduce sunlight into the Scene at a much higher value: try Intensity 5 for a start. Despite the sun looking nearly white, it’s important that the directional light color is chosen properly, as the indirect color from the strong sun can dramatically change the look of the Scene. Now the sun (directional light) reads as a high-energy light, as expected from real life, and the Scene looks quite believable at this point.
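A minimal runtime sketch of those HDR values, assuming the built-in Skybox/Procedural shader's _Exposure property and a scene light named "Directional Light" (both illustrative):

```csharp
using UnityEngine;

public class HdrLightingValues : MonoBehaviour
{
    void Start()
    {
        // Procedural sky as the hemisphere light source, pushed well above 1.
        var sky = new Material(Shader.Find("Skybox/Procedural"));
        sky.SetFloat("_Exposure", 16f);   // bright daytime sky contribution
        RenderSettings.skybox = sky;
        DynamicGI.UpdateEnvironment();    // let Realtime GI pick up the new sky

        // The sun: an HDR directional light, far brighter than 1.
        var sun = GameObject.Find("Directional Light").GetComponent<Light>();
        sun.intensity = 5f;
    }
}
```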
Screen Space Ambient Occlusion and Screen Space Reflection. While the Scene lighting looks pretty good at this point, there are additional details you can add to push it further. Baking detailed occlusion usually isn’t possible because of the limited resolution set in the Realtime GI for reasonable performance. This is where Screen Space Ambient Occlusion can help. Enable SSAO in the Post Process Profile under Ambient Occlusion; the settings for this example are Intensity 0.5, Radius 1, Medium sample count, with Downsampling and Ambient Only checked, for a start.

While SSAO takes care of extra occlusion of ambient lighting, reflections could use some accuracy improvements beyond the simple reflection probes. Screen Space Raytraced Reflection can help improve this situation: enable Screen Space Reflection in the Post Process Profile. Notice that the left side of the wet track no longer renders bright reflections, as SSR gives the Scene more accurate reflections for on-screen objects. Both of these post-process effects incur performance costs at runtime, so enable them wisely and set the quality settings to fit within your runtime performance constraints.

Fog. At this stage, content creators have achieved a somewhat believable separation between outdoor and indoor values at a fixed exposure. Reflection is visible in the dark indoor areas as strong highlights, rather than dim, muddy values. However, the Scene’s foreground and background elements don’t read well, despite strong perspective elements. A subtle fog in the Scene can make a massive difference, giving the Scene additional dimension; notice that the foreground railing has better definition compared to the zero-fog Scene. Fog is enabled in Lighting tab > Scene > Other Settings; here, fog color #6D6B4EFF with Exponential mode at 0.025 density is used. In Unity 2017 deferred rendering, you might also need to enable fog in the Post Process Profile if it’s not activated automatically.

Spotlights / point lights. The staples of real-time local lighting are spotlights and point lights. Area lights can only be used when baking lighting, unless you’re using the HD Scriptable Render Pipeline (SRP), introduced in the 2018.1 beta, which adds new area lights that can be rendered in real time. Fundamentally, both of these light types emit light from one point in space and are limited by range, with the spotlight limited additionally by angle. More information about lighting is in the relevant section of the Unity docs. The big differences between the two lights have to do with the way they cast shadows and interact with cookies. Shadowing with a point light costs six shadow maps, compared to a spotlight’s single shadow map; for this reason, shadow-casting point lights are much more expensive and should be used very sparingly. (Baked lights don’t need to worry about this issue.) Another difference is that a cookie texture on a spotlight is a simple, straightforward 2D texture, while a point light requires a cubemap, usually authored in 3D software.

Color and intensity of light. Choosing the proper color and intensity for your lights needs to follow some loose guidelines to give plausible results. When selecting the intensity for indoor lights, try to make sure no indoor light has a greater intensity than the sun’s, as this can create an unbalanced look depending on the Scene. Given this sample Scene’s setting, it’s very unlikely that high-intensity lights would shine from the ceiling and exceed the brightness of daylight. When selecting color, try not to remove any one of the color channels completely; that creates a light that has problems converging to the white point. While it’s technically a valid light color, the light color in the left image removes all blue from the final output. Having a limited final color palette in the Scene as a baseline isn’t a great idea, especially if you want to do color grading later on.
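A short sketch collecting a couple of the scene-dressing settings from this section: the exponential fog values quoted above, plus a local spotlight whose color keeps all three channels above zero. The object name and exact light values are illustrative:

```csharp
using UnityEngine;

public class SceneDressing : MonoBehaviour
{
    void Start()
    {
        // Exponential fog, matching the values quoted above (#6D6B4EFF, density 0.025).
        RenderSettings.fog = true;
        RenderSettings.fogMode = FogMode.Exponential;
        RenderSettings.fogDensity = 0.025f;
        RenderSettings.fogColor = new Color32(0x6D, 0x6B, 0x4E, 0xFF);

        // A local spotlight: warm, but with every color channel above zero,
        // and an intensity kept below the sun's so the scene stays balanced.
        var go = new GameObject("Ceiling Spot");
        var spot = go.AddComponent<Light>();
        spot.type = LightType.Spot;
        spot.range = 10f;
        spot.spotAngle = 60f;
        spot.color = new Color(1f, 0.85f, 0.6f);  // no channel fully removed
        spot.intensity = 2f;                      // below the sun's intensity of 5
    }
}
```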
Emissive surfaces. In Unity, emissive surfaces can contribute to lighting if Realtime GI or baked GI is enabled, giving the effect of area lighting. This is especially useful with Realtime GI enabled: content creators can modify the intensity and color of an emissive surface and get feedback immediately, assuming the precompute has been done ahead of time. The image above showcases the subtle diffuse lighting from meshes on the ceiling.

The Unity San Francisco Spotlight Team created the Spotlight Tunnel Sample Scene to help content creators do hands-on learning and experimentation. Get the Spotlight Tunnel Sample project file here. Simply extract the project to a folder and open it in Unity. The Spotlight Tunnel project was made with Unity 2017.1.0f3; opening it in a newer version of Unity will require a lighting rebuild, as there might be lighting-data format incompatibilities between versions. All assets provided in this project may only be used in a project developed with the Unity Engine.

As mentioned before, there is more that you need to know about making believable visuals. You can learn more about this topic in this tutorial, and we’re also going to add a full Best Practices Guide to the Unity Docs. Stay tuned! Hopefully this blog helps content creators stay on track in achieving believable visuals in Unity. We can’t wait to be dazzled by all the Unity content creators out there.
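To close with one concrete example from the emissive-surfaces note above, here's a hedged sketch using the Standard shader's emission properties; the intensity value is arbitrary:

```csharp
using UnityEngine;

public class CeilingEmissive : MonoBehaviour
{
    void Start()
    {
        var renderer = GetComponent<Renderer>();
        var mat = renderer.material;   // assumes a Standard shader material

        // Turn on emission and give it an HDR color (intensity above 1).
        mat.EnableKeyword("_EMISSION");
        mat.SetColor("_EmissionColor", Color.white * 2f);

        // Let the emissive surface feed Realtime GI as soft area lighting.
        mat.globalIlluminationFlags = MaterialGlobalIlluminationFlags.RealtimeEmissive;
        DynamicGI.SetEmissive(renderer, Color.white * 2f);
    }
}
```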

>access_file_
1558|blog.unity.com

Unity Partners with PiXYZ Software to unlock CAD data for real-time development

Unity today is used across many industries beyond gaming, including aerospace, architecture, automotive, construction, gambling, transportation, manufacturing, medical, and more. The benefits of using real-time in these industries include accelerating innovation through better design collaboration, developing VR and AR training to improve training outcomes, and creating immersive experiences that can increase engagement and drive higher sales. Industry professionals are increasingly seeing the value of bringing real-time into their workflows, recognizing its potential to completely transform the way products and experiences are conceived and built. But even as forward-thinking individuals in these industries have made great strides to bring real-time into their creation processes, they have encountered persistent challenges, particularly when it comes to preparing or importing large CAD assemblies for real-time development.

That’s why Unity and PiXYZ Software have joined forces to provide large enterprises and individual professionals alike with best-in-class solutions to easily import and optimize CAD data for creating real-time experiences in Unity.

“PiXYZ was founded by CAD professionals, for CAD professionals, with the goal of addressing the difficult challenges presented by working with large-scale CAD assemblies,” said Stéphane Imbert, Chairman and Founder of PiXYZ Software. “Unity’s real-time development platform is widely recognized as the best real-time workflow for customers who use CAD and want to build high-quality experiences. Our strategic partnership with Unity will allow us to better serve our mutual customers together - today, and in the future - by delivering the best, most responsive, and most painless solution on the market.”

With PiXYZ, we’re bringing you a way to work with large-scale CAD assemblies in Unity, whether you are an enterprise managing large volumes of data or an individual professional developing projects in Unity. Let’s take a look at how PiXYZ’s products fit these different needs. One thing all PiXYZ products share is a high-performance core that combines fast processing speed, the ability to handle massive assemblies, and the ability to use data from virtually any common design system. Whether you’re working with a complete automotive powertrain assembly in NX, a large architectural curtain wall in CATIA, or manufacturing equipment in Inventor, Unity with PiXYZ has you covered.

PiXYZ Plugin adds a menu option in the Unity editor that allows you to directly import CAD models into your project. The Plugin provides a set of dialog controls to scale, orient, repair (stitch), and tessellate your CAD model, and includes options to map UVs, control how design hierarchy is imported, and generate LODs. In addition, the Plugin has a secret superpower: the ability to directly import CAD data into a Windows runtime! Using the Plugin, you can publish a runtime executable from your project, and the end user can trigger the import of new CAD data. This is especially valuable when working with constantly changing CAD data, providing immediate updates without the need to rebuild a project or create asset bundles.

PiXYZ Review and Review Premium provide an alternative way to interact with CAD data, without the need to build a project at all.
With Review, you can quickly open a native CAD file and interact with it using intuitive methods such as dynamic cutting planes, assembly exploding, interactive 3D measurement, and access to PMI (Product Manufacturing Information) data. Review also enables remote collaboration among multiple participants using different device types in different locations. With Review Premium, you can also use VR and AR devices for immersive collaboration.

PiXYZ Studio and Studio Editor provide powerful and customizable CAD import and data preparation for the desktop. These products provide additional controls beyond those available in the Plugin, and add tools for model simplification, hidden-geometry removal, fixing flipped normals, and interactive modification. With Studio Editor, you can create Python scripts that allow powerful automated processing based upon geometry characteristics or design metadata.

PiXYZ Pipeline and Connect enable you to move your process into the datacenter. Pipeline provides all the power of Studio, with the ability to run automated batch processes on your on-premise servers or private cloud instances. If you need to process large volumes of data, you can add additional servers to scale to the capacity you need. With PiXYZ Connect, you can harness the power of multiple Pipeline servers to provide CAD import and data prep as a private web service within your company’s security perimeter.

We also recognize that you need more than just industry-leading products: you also need support, training, and the ability to scale your development efforts dynamically. Unity already provides value to customers with a range of enterprise support offerings, and we’re extending that to further meet the needs of enterprise customers in industries beyond gaming by offering the Unity Industry Bundle, a customizable combination of PiXYZ products, support, and training, built on the foundation of Unity. We offer custom training, as well as new training offerings focused on the needs of industry professionals, and Unity also offers assistance to help your developers use Unity to its full potential. And when you need to scale up your development efforts, you can look to the massive community of Unity developers, more than 1.5 million active developers per month, to find talent to accelerate your projects.

The potential applications for real-time are seemingly endless, and with the ease of working with CAD that PiXYZ and Unity unlock, developers can create experiences for over 20 different platforms, including many of the leading virtual reality, augmented reality, PC, and mobile devices. In particular, companies and developers in these industries are seeing immense value in using real-time for product design and visualization, training applications, and customer-facing interactive or immersive experiences.

Virtual Design Review – Accelerate Innovation: Unity’s real-time development platform and augmented and virtual reality capabilities can help you create better designs faster and more efficiently. Designers and engineers can collaborate on common ground with a shared, at-scale visualization of the product, where the impact of design changes across different departments and disciplines can easily be visualized. Virtual design review can even be extended to include suppliers and business partners.

Training – Transfer Knowledge: VR and AR experiences help deliver training in a true-to-life setting that engages learners through high-fidelity, realistic scenarios, resulting in improved training outcomes.
With virtual and augmented reality training, you can train workers to operate a factory that is not yet built, or train people on the operation of dangerous equipment without risking physical safety, significantly reducing the risk of an accident happening on the job.

Marketing and Sales – Deliver High-Impact Experiences: Unity allows you to create real-time, interactive, and immersive customer experiences that help drive higher customer engagement and investment in the product, ultimately driving more sales. With PiXYZ, product design data can also be leveraged directly to create an online configurator, or a VR experience that allows customers to experience a product in true-to-life fidelity.

If you would like to know more about the Unity Industry Bundle, or how Unity and PiXYZ can help you accelerate innovation, deliver high-impact experiences, or transform your business, our dedicated team of experts is standing by and ready to answer your questions!

>access_file_